I’m not sure to what extent I still believe a single word of what I wrote in yesterday’s State of Webcomics Address. As it was coming down to the wire, I started realizing I was basically being a lapdog for Bengo and wasn’t really thinking about the situation clearly, right as my brain was burning out as well. I may revisit the situation more soberly in a month or two.
But if there’s one part of it I do still believe in, it’s the opening section about the idealism of the youth, and that may be my starting point for a rewrite of the Address. I can stand by it because it appears everywhere else I look on the Internet. A constant theme in the Internet’s developments is that people started a revolution first and asked questions later: how to pay for it, and what its impact would be on the money flowing to the institutions being replaced. That held true even for the most prominent and popular of those revolutions.
I was interested in Farhad Manjoo’s book True Enough last year, and now I may be adding his technology column for Slate to my RSS reader. And today, we’re going to go from Stuff From July week to Stuff From April week! Because it was in April that Manjoo wrote a column on the double-edged sword of user-generated content. It seems even YouTube, one of the most popular sites on the Internet, has been bleeding money and mostly been propped up by the parts of Google that have been actually making money.
Turns out that “user-generated content” can end up meaning “crap”. A lot of the stuff most people leer at on YouTube – copyright violations, groin shots and other dumb, vaguely voyeuristic things like that – is the stuff that advertisers don’t want to be associated with. The content that makes the most money is still content made by the pros. YouTube has attempted to make up for it by signing content deals with the pros, but that only gets advertisers to pay for the pro content, and still leaves YouTube holding the bag for the cost of storing the crap. YouTube may eventually have to impose restrictions or a limited paywall. User-generated content may have changed the world, but no one’s quite willing to sponsor it yet.
The result: I strongly suspect YouTube as we know it will die within a year.
And that’s the best thing that could possibly happen to it – and to the Web.
I say that because of Manjoo’s June column on the release of Firefox 3.5, which emphasizes its integration of the HTML 5 standard, especially the way it allows video to be called up using an HTML <video> tag without invoking Flash. Manjoo emphasizes how this could result in interactive video; I look at how it could obviate the need for centralized video repositories like YouTube. User-generated video can now conceivably be hosted, rather easily, on the web site of the person who produced it, without needing a third party like YouTube.
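To get a sense of how little is involved, here’s a minimal sketch of what self-hosting a video might look like under HTML 5 – the file name is hypothetical, and I’m assuming an Ogg Theora file, the format Firefox 3.5 actually supports natively:

```html
<!-- A video served from your own site, played natively by the browser -->
<!-- with no Flash plugin. "my-video.ogv" is a made-up example file name. -->
<video src="my-video.ogv" controls width="480">
  <!-- Fallback text for browsers that don't understand the video tag -->
  Your browser does not support HTML 5 video.
</video>
```

That’s the whole thing: one tag pointing at a file on your own server, and the browser does the rest – no YouTube embed code, no third-party player.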
I could see YouTube imposing a survival-of-the-fittest system, where videos that fail to meet a certain support threshold – failing to earn their keep – receive a warning and are eventually removed automatically. This would keep YouTube high-quality and fairly self-sustaining. This one-two punch – encouraging people to take on the costs of hosting their stuff themselves, and making it easier to do so – would theoretically maintain the user-generated content revolution while dispersing its costs and making them more manageable.
What about other services facing the same problem? What about Flickr and Facebook? There’s not going to be some HTML white knight to save them, is there? Maybe not, but it’s telling that so many WordPress users have clamored for an image gallery in core, despite its gimmickiness, that one appears to be coming. Facebook may be harder to deal with, since it effectively is the place where a lot of this “decentralized” stuff would go as is. Twitter may be starting to take some load off Facebook, but interconnectedness is its core; is it even possible to decentralize social networking?
This brings me to the vision of the CWI’s Steven Pemberton for what Web 3.0 might be like. Web 2.0 was based in specific Web sites like Flickr, Facebook and Wikipedia (which seems to be doing well as the public television of the Web, running primarily on donations with zero ads), but Web 3.0, in Pemberton’s eyes, would be based on millions of personal Web sites. Pemberton is concerned about the effect that getting “locked in” to a specific site might have if you decide to change sites, or if the site (or your account) gets shut down, and about the redundancy of being on both MySpace and Facebook; he suggests instead that semantic standards be instituted for such things. For example, you could put your contacts on a page on your site, and an aggregator (of sorts) would compare that with other people’s contacts. Such an “aggregator” might actually be part of the browser itself. Imagine if Twitter were to shut down without a replacement, but left a standard for people to send “tweets” from their own web sites that could then be read from within the browser. (Microsoft, of all people, may be getting a head start on this with the “accelerators” in IE8.)
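We don’t even have to imagine a brand-new standard for that last scenario; something like it already exists in RSS. Here’s a sketch of self-hosted “tweets” as an ordinary RSS 2.0 feed – purely my illustration, not Pemberton’s actual proposal, with a made-up example site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>My self-hosted status updates</title>
    <link>http://example.com/</link>
    <description>Short "tweets" served from my own site</description>
    <!-- Each short update is just a feed item; a browser-level
         "aggregator" could merge these from everyone you follow -->
    <item>
      <description>Just posted a new comic page!</description>
      <pubDate>Mon, 20 Jul 2009 12:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

A browser (or any aggregator) subscribing to feeds like this from everyone you follow would give you a Twitter-like stream with no central company holding everything – which is exactly the kind of decentralization Pemberton is describing.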
Pemberton might seem to be a lone dreamer with his own wild vision, and at first glance it may seem incompatible with corporate America’s demands to centralize everything in one place under one company that can rake in the dough. But because the Internet is free, they haven’t been raking in the dough – and because of that, money may actually encourage the creation of this new decentralized vision of the Internet, just to spread the costs out so they aren’t borne by a few companies.