News & Analysis

PARC hosts summit on content-centric nets

8/12/2011 04:46 AM EDT
4 comments
re: PARC hosts summit on content-centric nets
digitalshaman   8/22/2011 5:32:01 PM
Packet watermarks were introduced to address these issues in the early 2000s, but Cisco and Juniper generally promoted the notion of "free" bandwidth and bandwidth-allocation schemes limited to coarse estimates (MPLS, etc.). Packet watermarks also enable differential QoS with far lower computational overhead and better management of "flow". The other push comes from search and poor notions of where the network (*your* network / infrastructure) starts. Innovation suffers when legacy models are not subject to a "restart"!
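As a loose illustration of per-packet marking for differentiated QoS (a stand-in for, not a description of, the watermarking schemes the comment alludes to), here is a minimal Python sketch that sets a DSCP value on outbound datagrams; the EF class choice, destination address, and port are placeholder assumptions.

```python
# Illustrative sketch only: mark outbound packets with a DSCP value so routers
# can apply differentiated per-packet QoS. This is a stand-in for the "packet
# watermark" idea referenced in the comment, not that scheme itself.
import socket

DSCP_EF = 46              # Expedited Forwarding class (RFC 3246)
TOS_VALUE = DSCP_EF << 2  # DSCP occupies the upper 6 bits of the IP TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Every datagram sent on this socket now carries the EF marking, letting
# per-hop queues prioritize the flow without inspecting its payload.
sock.sendto(b"latency-sensitive payload", ("192.0.2.10", 5004))  # placeholder peer
```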

re: PARC hosts summit on content-centric nets
LarryM99   8/16/2011 5:42:22 PM
The problem with the current network address approach is that it takes each packet at face value (This one says it came from google.com? Good enough for me!). The idea of integrating authentication is a real step forward, whether it is done in a content-centric approach or an overlay onto the current structure. This seems to be a clean-sheet-of-paper approach, which has technical advantages and huge logistical challenges. Step one: throw away all of your infrastructure. Step two: doesn't matter until you get people to do step one. Larry M.
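A minimal sketch of what "integrating authentication" into the data itself could look like: the consumer verifies a signed, named content object regardless of which host delivered it. This assumes an HMAC over a pre-shared key as a stand-in for the public-key signatures real content-centric designs (e.g. CCNx/NDN) use, and the publish()/verify() names are purely illustrative.

```python
# Minimal sketch of per-object authentication. An HMAC with a pre-shared key
# stands in for the public-key signatures an actual content-centric network
# would carry; publish() and verify() are illustrative names, not a real API.
import hashlib
import hmac

KEY = b"publisher-signing-key"   # hypothetical key material

def publish(name: str, payload: bytes) -> dict:
    """Bundle a named content object with a signature over name + payload."""
    tag = hmac.new(KEY, name.encode() + payload, hashlib.sha256).hexdigest()
    return {"name": name, "payload": payload, "signature": tag}

def verify(obj: dict) -> bool:
    """Check the object itself; which host delivered it does not matter."""
    expected = hmac.new(KEY, obj["name"].encode() + obj["payload"],
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, obj["signature"])

obj = publish("/parc/ccn/summit/press-release", b"content bytes")
assert verify(obj)   # authenticity rides with the data, not the source address
```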

re: PARC hosts summit on content-centric nets
DrQuine   8/16/2011 2:16:43 PM
One challenge of content-centric networks is that low-quality information may be more difficult to avoid. The current network-address-based approach helps identify trusted sources and avoid spam.

re: PARC hosts summit on content-centric nets
grouts   8/16/2011 1:46:39 AM
Content-centric networks should be a natural fit for any company to use for all of its product design activities. The PARC announcement also strongly suggests the rediscovery of hashed binary trees, which are the strongest data structures (applicable, of course, to files, networks, clouds, etc.) for organizing and providing immense amounts of information, especially hierarchical product and library design information. I believe you will find that the NSA is also using this basic data and network paradigm for its huge collections of information. My own background here is that this is the approach we took at Honeywell Large Information Systems for EDA hierarchical design data, using what Honeywell called indexed-sequential files. We implemented an EDA database we called MUSER, which was actively used by Honeywell / Bull from 1969 through 1997 and was only shut down because the French Bull folks had to adopt so many off-the-shelf design tools. --Steve Grout
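For readers unfamiliar with hashed binary trees, a minimal Merkle-tree sketch (in Python, assuming SHA-256 and illustrative helper names) shows how hashing interior nodes over their children yields a single root that changes if any block anywhere in a large hierarchical collection changes.

```python
# Minimal sketch of a hashed binary tree (Merkle tree), assuming SHA-256 and
# an in-memory list of leaf blocks; helper names are illustrative only.
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks: list) -> bytes:
    """Hash the leaves, then repeatedly hash pairs of children up to one root."""
    if not blocks:
        return _h(b"")
    level = [_h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:               # duplicate the last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Two design hierarchies differing in a single block yield different roots,
# so a change anywhere in a large collection is detectable from one hash.
a = merkle_root([b"cell-A", b"cell-B", b"netlist", b"layout"])
b = merkle_root([b"cell-A", b"cell-B", b"netlist", b"layout-v2"])
assert a != b
```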
