

by Clay Shirky

The peer-to-peer backlash has begun. On the same day, the Wall St. Journal ran an article by Lee Gomes entitled "Is P2P plunging off the deep end?", while Slashdot's resident commentator, Jon Katz, ran a review of O'Reilly's Peer to Peer book under the title "Does peer-to-peer suck?"

It's tempting to write this off as part of the Great Wheel of Hype we've been living with for years:

New Thing happens; someone thinks up catchy label for New Thing; press picks up on New Thing story; pundits line up to declare New Thing "Greatest Since Sliced Bread." Whole world not transformed in matter of months; press investigates further; New Thing turns out to be only best thing since soda in cans; pundits (often the same ones) line up to say they never believed it anyway.

This quick reversal is certainly part of the story here. The Journal quoted entrepreneurs and investors recently associated with peer-to-peer who are now distancing themselves from the phrase in order to avoid getting caught in the backlash. There is more to these critiques than business people simply repositioning themselves when the story crescendos, however, because each of the articles captures something important and true about peer-to-peer.


Where's the money?

The Wall St. Journal's take on peer-to-peer is simple and direct: it's not making investors any money right now. Mr. Gomes notes that many of the companies set up to take advantage of file sharing in the wake of Napster's successes have fallen on hard times, and that Napster's serious legal difficulties have taken the bloom off the file-sharing rose. Meanwhile, the distributed computing companies have found it hard to attract either customers or investors, as the closing of Popular Power and the rest of the field's difficulty in finding customers have highlighted.

Furthermore, Gomes notes that P2P as a label has been taken on by many companies eager to seem cutting edge, even those whose technologies have architectures that differ scarcely at all from traditional client-server models. The principal critiques Gomes makes -- that P2P is neither a well-defined business sector nor a well-defined technology -- are both sensible. From a venture capitalist's point of view, P2P is too broad a category to be a real investment sector.

Is P2P even relevant?

Jon Katz's complaints about peer-to-peer are somewhat more discursive, but seem to center on its lack of a coherent definition. Like Gomes, he laments the hype surrounding peer-to-peer, riffing off a book jacket blurb that overstates peer-to-peer's importance, and goes on to note that the applications grouped together under the label peer-to-peer differ from one another in architecture and effect, often quite radically.

Katz goes on to suggest that interest in P2P is restricted to a kind of techno-elite, and is unlikely to affect the lives of "Harry and Martha in Dubuque." While Katz's writing is not as focused as Gomes', he touches on the same points: there is no simple definition for what makes something peer-to-peer, and its application in people's lives is unclear.

The unspoken premise of both articles is this: if peer-to-peer is neither a technology nor a business model, then it must just be hot air. There is, however, a third possibility besides "technology" and "business." The third way is simply this: peer-to-peer is an idea.

Revolution convergence

Peer-to-Peer: Harnessing the Power of Disruptive Technologies
Edited by Andy Oram
O'Reilly, March 2001
ISBN 0-596-00110-X, 448 pages, $29.95

As Jon Orwant noted recently in these pages, "Peer-to-peer is not a technology, it's a mindset." Put another way, peer-to-peer is a related group of ideas about network architecture, ideas about how to achieve better integration between the Internet and the personal computer -- the two computing revolutions of the last 15 years.

The history of the Internet has been told often -- from the late '60s to the mid-'80s, DARPA, an agency of the Department of Defense, commissioned work on a distributed computer network that used packet switching to preserve the fabric of the network even if any given node failed.

The history of the PC has likewise often been told, with the rise of DIY kits and early manufacturers of computers for home use -- Osborne, Sinclair, the famous Z-80 -- and then the familiar IBM PC and, with it, Microsoft's DOS.

In an accident of history, both of those movements were transformed in January 1984, and began having parallel but increasingly important effects on the world. That month, a new plan for handling DARPA net addresses was launched. Dreamed up by Vint Cerf and Robert Kahn, this plan was called the Internet Protocol, and it required every node on the network to change over to a new IP address: a unique, global, numerical address. This was the birth of the Internet we have today.


Meanwhile, over at Apple Computer, January 1984 saw the launch of the first Macintosh, the computer that popularized the graphical user interface (GUI), with its now familiar point-and-click interactions and desktop metaphor. The GUI revolutionized the personal computer and made it accessible to the masses.

For the next decade, roughly 1984 to 1994, both the Internet and the PC grew by leaps and bounds, the Internet as a highly connected but very exclusive technology, and the PC as a highly dispersed but very inclusive technology, with the two hardly intersecting at all. One revolution for the engineers, another for the masses.

The thing that changed all of this was the Web. The invention of the image tag, as part of the Mosaic browser (ancestor of Netscape), brought a GUI to the previously text-only Internet in exactly the same way that, a decade earlier, Apple brought a GUI to the previously text-only operating system. The browser made the Internet point-and-click easy, and with that in place, there was suddenly pressure to fuse the parallel revolutions, to connect PCs to the Internet.

Which is how we got the mess we have today.

First and second-class citizens

In 1994, the browser created sudden pressure to wire the world's PCs, in order to take advantage of the browser's ability to make the network easy to use. The way the wiring happened, though -- slow modems, intermittent connections, dynamic or even dummy IP addresses -- meant that the world's PCs weren't really being connected to the Internet so much as they were being hung off its edges, with the PC acting as no more than a life-support system for the browser. Locked behind their slow modems and impermanent addresses, the world's PC owners have for the last half-dozen years been the second-class citizens of the Internet.

Anyone who wanted to share anything with the world had to find space on a "real" computer, which is to say a server. Servers are the Net's first-class citizens, with real connectivity and a real address. This is how the Geocities and Tripods of the world made their name, arbitraging the distinction between the PCs that were (barely) attached to the network's edge and the servers that were fully woven into the fabric of the Internet.

Big, sloppy ideas

Rejection of this gap between client and server is the heart of P2P. As both Gomes and Katz noted, P2P means many things to many people. PC users don't have to be second-class citizens. Personal computers can be woven directly into the Internet. Content can be provided from the edges of the network just as surely as from the center. Millions of small computers, with overlapping bits of content, can be more reliable than one giant server. Millions of small CPUs, loosely coupled, can do the work of a supercomputer.

These are sloppy ideas. It's not clear when something stops being "file sharing" and starts being "groupware." It's not clear where the border between client-server and peer-to-peer is, since the two-way Web moves power to the edges of the network while Napster and ICQ bootstrap connections from a big server farm. It's not clear how ICQ and SETI@Home are related, other than deriving their power from the network's edge.

No matter. These may be sloppy ideas, ideas that don't describe a technology or a business model, but they are also big ideas, and they are good ideas. The world's Net-connected PCs hold, both individually and in aggregate, an astonishing amount of power -- computing power, collaborative power, communicative power.

Our first shot at wiring PCs to the Internet was a half-measure -- second-class citizenship wasn't good enough. Peer-to-peer is an attempt to rectify that situation, to really integrate personal devices into the Internet. Someday we will not need a blanket phrase like peer-to-peer, because we will have a clearer picture of what is really possible, in the same way the arrival of the Palm dispensed with any need to talk about "pen-based computing."

In the meantime, something important is happening, and peer-to-peer is the phrase we've got to describe it. The challenge now is to take all these big sloppy ideas and actually do something with them, or, as Michael Tanne of XDegrees put it at the end of the Journal article:

"P2P is going to be used very broadly, but by itself, it's not going to create new companies. ...[T]he companies that will become successful are those that solve a problem."

Clay Shirky writes about the Internet and teaches at NYU's Interactive Telecommunications Program. He publishes a mailing list on Networks, Economics, and Culture.