The piece begins with an acknowledgment of how much junk the Internet contains, the “endless streams of mediocrity, eroding cultural norms about quality and acceptability.” But then Shirky contains the concession, stating “that’s what always happens.” Every time a technology brings an “increase in freedom to create or consume media,” he argues, the rules of communication shift and we pass through a period of apparent chaos and decline. Hence, the digital setting “alarms people accustomed to the restrictions of the old system, convincing them that the new media will make young people stupid.”
That term “alarm” is a suggestive one. No doubt there are Luddites and technophobes and other alarmists out there who simply react against change. But what about people who approve of some elements of the technology and disapprove of others? The term “alarm” doesn’t cover them, for alarm disallows any ambivalence. It does, however, allow one to cast the other side as suffering from a pathological condition. Shirky diminishes that side further in the next sentence by claiming that we’ve seen its fear many times before: “This fear dates back to at least the invention of movable type.”
The translation, publication, and distribution of the Bible is his example. In effect, he says, the printing press fostered the Reformation, “which did indeed destroy the Church’s pan-European hold on intellectual life.” It also brought about other kinds of writing, popular and technical, which “had the effect of increasing, rather than decreasing, the intellectual range and output of society.”
No doubt, yes. But what makes the situation back then parallel to the situation now? Shirky claims that “we are living through a similar explosion of publishing capability today,” but his more specific example from the past gainsays the similarity. The example is the scientific revolution, whose “essential insight . . . was peer review, the idea that science was a collaborative effort that included the feedback and participation of others. Peer review was a cultural institution that took the printing press for granted as a means of distributing research quickly and widely, but added the kind of cultural constraints that made it valuable.”
This is a misleading characterization. First of all, peer review wasn’t the “essential insight,” but was rather one crucial element among many others in scientific method (including gathering and handling of evidence, objectivity, transparency, etc.). More importantly, peer review requires that participants have the status of “peers.” That is, they have to subscribe to binding principles of inquiry, and as time passed they had to be trained and accredited. This is hardly the best analogy to the wide-open spaces of Web 2.0.
Nevertheless, Shirky believes that the “explosion of publishing capability today, where digital media link over a billion people into the same network,” will produce a great leap forward precisely by harnessing some of the free time all these people have and directing it toward intellectual matters. Shirky calls this free time “our cognitive surplus,” fully 1,000,000,000,000 hours of leisure per year. If they turned “even a tiny fraction” of their time from, say, watching TV to Web “participation,” we’d see “enormous positive effects.” He then cites a few scattered examples of the process (crisis-mapping tools in Kenya, Wikipedia).
Shirky acknowledges that “not everything people care about is a high-minded project,” conceding that when media proliferate, “average quality falls quickly.” But, once again, history shows a pattern. Edgar Allan Poe complained about too many books, and so did Martin Luther. Once the norms of writing shook out and people learned how to filter the good books from the bad, the complaints stopped.
The same thing will happen with the Internet, he continues. We will “integrate digital freedoms into society as well as we integrated literacy.” Why, though?
Shirky doesn’t answer directly. Instead, he attacks the pessimists once again. He notes that “the rosy past of the pessimists was not, on closer examination, so rosy.” Back in the 1980s, before the Digital Age, people watched bad sitcoms more than they read Proust. But, one may reply, one can criticize various elements of the Internet without doing so comparatively, that is, without holding up a Golden Age or “rosy past.” Indeed, he chides the assumption that “the recent past was a glorious and irreplaceable high-water mark of intellectual attainment.” But who says that the last years of the pre-digital age were so wondrous?
In the entire piece, then, Shirky has only one firm hypothesis: the transfer of cognitive surplus from TV and other passive consumption of media to active participation in Web media. He rejects the pessimists’ assumption that “this generation of young people will fail to invent cultural norms that do for the Internet’s abundance what the intellectuals of the 17th century did for print culture”; apparently they will invent those norms, but only if their “cognitive surplus” is put to better use.