
Gnutella and Freenet Represent True Technological Innovation
Other unique features of Freenet

Freenet is more restrained than Gnutella in the traffic it generates, perhaps because it expects to transfer a complete file of data for each successful request. When a Freenet client receives a request it cannot satisfy, it sends the request on to a single peer; it does not multicast to all peers as Gnutella does. If the client receives a failure notice because no further systems are known down the line, or if it gets no response because the time-to-live expired, it tries another of its peers. In brief, searching is done depth-first rather than in parallel. Nevertheless, Clarke says searches are reasonably fast; each takes a couple of seconds, as with a traditional search engine. The simple caching system used in Freenet also seems to produce results just as good as the more deliberate caching ISPs use for Web pages.
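The forwarding behavior described above can be sketched as a recursive depth-first walk. This is a minimal illustration, not Freenet's actual implementation: the `Node` class, its peer list, and the simple cache-on-return step are all assumptions made for the sake of the example.

```python
class Node:
    """Illustrative stand-in for a Freenet node: a name, a list of
    peers, and a local data store that doubles as a cache."""
    def __init__(self, name, peers=None):
        self.name = name
        self.peers = peers or []
        self.store = {}

def depth_first_request(node, key, ttl, visited=None):
    """Forward a request to one peer at a time, backtracking to the
    next peer on failure -- depth-first, never in parallel."""
    if visited is None:
        visited = set()
    visited.add(node.name)
    if key in node.store:              # local hit: return the data
        return node.store[key]
    if ttl == 0:                       # time-to-live exhausted
        return None
    for peer in node.peers:            # try peers one by one
        if peer.name in visited:
            continue                   # don't revisit a node
        result = depth_first_request(peer, key, ttl - 1, visited)
        if result is not None:
            node.store[key] = result   # cache on the way back
            return result
    return None                        # no further systems known
```

Caching the data at every node on the return path is what gradually moves popular material closer to the clients that request it.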



Freenet is being developed in Java and requires the Java Runtime Environment to run. It uses its own port and protocol, rather than running over HTTP as Gnutella does.

Limitations and risks of Freenet

Freenet seems more scalable than Gnutella. One would imagine that it could be impaired by flooding it with irrelevant material (a script that dumped the contents of your 8-gigabyte disk into it every hour, for instance), but that kind of attack actually has little impact. So long as nobody asks for the material, it doesn't go anywhere.

Furthermore, once someone puts up material, no one can overwrite it with a bogus replacement. Each item is identified by a unique identifier. If a malicious censor tries to put up his own material under the same identifier, the system checks for an existing version and says, "We already have the material!" The only effect is to make the original material stay up longer, because the would-be censor's attempt counts as a request for it.
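The first-writer-wins rule is simple enough to capture in a few lines. This is an illustrative sketch of the behavior described above, not Freenet's actual insert code; the function and key names are invented for the example.

```python
def insert(store, key, data):
    """Refuse to overwrite an existing entry under the same key;
    return whatever version the store already holds."""
    if key in store:
        return store[key]   # "We already have the material!"
    store[key] = data       # first writer wins
    return data
```

A censor's attempted replacement simply echoes the original back, which is why the attack backfires: the attempt acts as a request that keeps the original alive.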


The searching problem

The unique identifier is Freenet's current weak point. Although someone posting material can assign any string as an identifier, Freenet hashes the string for security reasons. Two search strings that differ by a single character (like "HumanRights" and "Human-Rights") will hash to very different values, as with any hashing algorithm. This means that a prosecuting agency trying to locate offending material will have great difficulty identifying it from a casual scan of each site.
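The divergence is easy to demonstrate with any cryptographic hash. SHA-1 is used here as a stand-in; the point is only that near-identical identifiers produce unrelated keys, so the stored keys reveal nothing about the human-readable strings behind them.

```python
import hashlib

def hashed_key(identifier: str) -> str:
    """Hash an identifier string into a hex key, as a stand-in for
    the hashing Freenet applies to posted identifiers."""
    return hashlib.sha1(identifier.encode("utf-8")).hexdigest()

k1 = hashed_key("HumanRights")
k2 = hashed_key("Human-Rights")
# k1 and k2 share essentially no structure, despite the one-character
# difference between the identifiers that produced them.
```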

But the hashing also renders Freenet unusable for random searches. If you know exactly what you're looking for on Freenet -- because someone has used another channel to say, for instance, "Search for HumanRights on Freenet" -- you can achieve success. But you can't do free text searches.

One intriguing use of Freenet is to offer sites with hyperlinks. Take people interested in bird-watching as an example. Just as an avid birder can offer a Web page with links to all kinds of Web sites, she can offer links that generate Freenet requests using known strings, retrieving data about birds over Freenet. Already, for people who want to try out Freenet without installing the client, a gateway to the Web exists under the name fproxy.

Another area for research is a client that accepts a string and varies it slightly in the hope of producing a string that matches an existing key, then passes the result on. The most important task in the Freenet project currently, according to Clarke, is to resolve the search problem.
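Such a client might look something like the following sketch. Everything here is hypothetical: the set of variations tried (stripping hyphens and spaces, changing case) is invented for illustration, and a real client would forward each candidate as a Freenet request rather than probe a local dictionary.

```python
def variations(s):
    """Yield the query string plus a few simple normalizations of it,
    in the hope that one matches the key the poster actually used."""
    seen = set()
    for candidate in (s,
                      s.replace("-", ""),
                      s.replace(" ", ""),
                      s.replace("_", ""),
                      s.lower(),
                      s.title().replace(" ", "")):
        if candidate not in seen:
            seen.add(candidate)
            yield candidate

def fuzzy_lookup(store, query):
    """Try each variation as an exact key; return the first hit."""
    for candidate in variations(query):
        if candidate in store:
            return store[candidate]
    return None
```

This recovers a match for "Human-Rights" or "human rights" when the material was posted under "HumanRights", but it is clearly no substitute for free-text search, which is why Clarke calls the search problem the project's most important open task.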

Letting go

Once again, I refer readers to The Value of Gnutella and Freenet for a discussion of these systems' policy and social implications. I'll end this technical article by suggesting that Gnutella and Freenet continue to loosen the virtual from the physical, a theme that characterizes network evolution. DNS decoupled names from physical systems; URNs will allow users to retrieve documents without domain names; virtual hosting and replicated servers change the one-to-one relationship of names to systems. Perhaps it is time for another major conceptual leap, where we let go of the notion of location. Welcome to the Heisenberg Principle, as applied to the Internet. Information just became free.

Gnutella and Freenet, in different ways, make the location of documents irrelevant; the search string becomes the location. To achieve this goal, they add a new layer of routing on top of the familiar routing done at the IP level. The new layer may appear at first to introduce numerous problems in efficiency and scaling, but in practice these turn out to be negligible or at least tolerable. I think readers should take a close look at these systems; even if Gnutella and Freenet themselves turn out not to be good enough solutions for a new Internet era, they'll teach us some lessons when it's time for yet another leap.

Andy Oram is an editor for O'Reilly Media, specializing in Linux and free software books, and a member of Computer Professionals for Social Responsibility. His web site is www.praxagora.com/andyo.

