[Cartoon: Mike Reed, "Big Dog and Me Too" © Mike Reed]



By Matt Robson

NEW YORK, 6 SEPTEMBER 2007—The Internet is approaching a metamorphosis overlooked by the press. The current Renaissance in communications isn't a product of super-geeks or patent factories, but springs from our collective immersion in the ever-expanding digital world. As we converse, bookmark, and search the web with success or failure, we leave behind a new layer of meaning for others to learn from.

Who, now, is content to merely surf the web? We seek to swallow it whole, to process it, filter it, repackage it, redirect it, and spit it out with our name on it. 

Web 2.0 is a phrase used to describe how the Internet is increasingly becoming a community-driven medium, built on social networks and Wikipedia-style dynamic resources. Web 2.0 also relies on new distribution mechanisms that let even a small-time blogger broadcast a message across thousands of channels to reach the fragmented, decentralized web audience.

Recently, new-media experts gathered at the June 14th iBreakfast Web 2.0 & Madison Ave. 2.0 Summit in New York. Alongside the new-media panels were entrepreneur pitches for the New York Angels; forecasts of technology trends were clearly the common currency of the event. Keynote speaker Esther Dyson, Queen Mother of the Attention Economy, declared Facebook the most revolutionary web technology to date. She sees the platform's openness to extension by outside companies as its most promising aspect, one that could shape the development of a veritable web operating system.

[Cartoon: Mike Reed, "Issues" © Mike Reed]

What Today's Web Destinations Refuse to Face About Tomorrow's Media Reality

Large entrenched web interests, including Google, MySpace, and eBay, currently have distribution models inconsistent with the next-generation Web 2.0 openness and pervasiveness essential to creating self-expanding, adaptive resources. 
Web 1.0 destinations lack the ability, or the willingness, to share content across sites, because they developed before there was an environment of peer contribution or any agreement on how to share information coherently. The legacy resources understandably want to protect content they created or assembled; unfortunately, they are every bit as paranoid about cataloging that content. (It is as if Sears did not want its retail product catalog circulated to consumers.)  
How else do Web 1.0 sites fall short of the ideal? They don't view their user communities as a vital part of their feedback loop. Major search engines all operate as mysterious black boxes. Web communities like MySpace are self-contained by design. The new Web 2.0 publishing and distribution model borrows heavily from the breakthrough resource that is shared among many sites: the blogosphere. What the blogosphere did for blogs, Web 2.0 promises to do for everything else. 
The benefits of Web 2.0 are immense. Having trusted experts and die-hard users sort crucial pages into meaningful categories saves time for searchers and facilitates entirely new methods of retrieving information. However, for users to contribute to the system's knowledge, there must be an open system, similar to Wikipedia, that allows the expert community to expand the resource with cross-references and footnotes.

Systems that permit public, liberal interaction with and reorganization of their databases can adapt their content to the needs of diverse communities. In essence, the rigidity and territorialism exhibited by leading web properties reflect the most critical limitation in their ambitious mandate to catalog the world's information coherently.

[Cartoon: Mike Reed, "Xenophobe" © Mike Reed]

The key to the construction of the most exhaustive and intelligent resource is the indexing. Creation of this index is not ideally managed by one company, with its attendant limited abilities and narrow interests. The best resources are so vast and complex that they require an inter-networked system of listings, akin to the Multiple Listing Services for home sales. A system that allowed databases to be hosted in a distributed yet standardized fashion would offer the best chance to flourish and to adapt to special purposes and foreign environments.

The best data resources spread like viruses and are fundamentally more dynamic than the sites they flow through. The vector of transmission for such infections today is the modern 'web service'. The originating 'hosts' (today's web 'sites') are reduced to venues in which the virus can multiply. The viral propagation of web services is self-adaptive and ultimately pervasive, while their stagnant hosts remain immobile and non-evolutionary.

New web file types and ways of annotating information, collectively known as microformats, are giving momentum to cross-site 'mashups' which process content and features from many sources. Mashups allow web users to interface with data from many web sites at once in a coherent way. One very straightforward mashup takes housing listings from Craigslist and lets you view them on a Google map. The maker of this mashup created neither the Google Maps application nor Craigslist's database; he only combined the two resources for a previously unrealized benefit. Other mashups draw from more than one data source and combine more than one application.
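The combining step of such a mashup can be sketched in a few lines. This is an illustrative sketch only: the sample listings below are made-up data standing in for entries parsed from a Craigslist feed, and the map links follow the public Google Maps query-URL format.

```python
def to_map_link(listing):
    """Combine a housing listing with a Google Maps query URL."""
    query = listing["address"].replace(" ", "+")
    return f'{listing["title"]}: https://maps.google.com/?q={query}'

# Hypothetical sample data, standing in for a parsed Craigslist feed.
listings = [
    {"title": "Sunny 1BR", "address": "123 Bedford Ave Brooklyn NY"},
    {"title": "Loft w/ view", "address": "456 W 23rd St New York NY"},
]

links = [to_map_link(l) for l in listings]
for link in links:
    print(link)
```

Neither "application" here is built by the mashup author; the value comes purely from joining one site's data to another site's interface.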

More recently, a universal twist on social networking is developing, enabled by social profile aggregators such as profilefly and profilelinker. With centralized profile management, users can post their information to one site and then broadcast their profile to dozens of hubs. This creates an environment where many site creators, or even consumers, can receive and present the same information in their own specialized way. It enables users to enjoy a cross-platform, cross-site experience with regard to their profiles, friends, bookmarks, and other media. This breakthrough in media distribution first hit the mainstream in the form of feed burning via RSS.
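The post-once, broadcast-everywhere idea can be sketched as one canonical profile rendered in whatever format each hub expects. The profile data and the two output formats below (plain JSON and hCard-style microformat markup) are assumptions for illustration, not the API of any actual aggregator.

```python
import json

# One canonical profile, maintained in a single place (hypothetical data).
profile = {"name": "Ada Lovelace", "url": "http://example.com/ada"}

def as_json(p):
    """Machine-readable export, e.g. for a hub that accepts JSON."""
    return json.dumps(p)

def as_hcard(p):
    """hCard-style microformat markup for embedding in a hub's page."""
    return (f'<div class="vcard">'
            f'<a class="fn url" href="{p["url"]}">{p["name"]}</a></div>')

print(as_json(profile))
print(as_hcard(profile))
```

Each hub consumes the same underlying record, so an update in one place propagates everywhere the profile is syndicated.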

[Cartoon: Mike Reed, "Perv" © Mike Reed]

One company riding the wave of personal content syndication to the blog world was FeedBurner, recently acquired by Google for around $100 million. As these new kinds of services flourish, they will continue to incubate startups that bite at the media-network and search-engine pie. This evolution makes a Web 2.0 reckoning inevitable.

Surely no company can organize the world's information alone. Categorizing the all-encompassing, ever-changing mass of pages, people, relationships, concepts, and emerging formats is so vast and complex a task that no company can arrive at a complete structure by itself. Profits from successful brands won't vanish overnight, but they already show reduced growth. Their hesitation to innovate paves the way for a new generation of Web 2.0 markets and media.

[Cartoon: Mike Reed, "Archivist" © Mike Reed]

Like the blogosphere, other infospheres need not rely on web crawling to come into being. Instead, they arise from feeds published by one resource in well-known standard formats and imported into another. Most Web 2.0 publishing platforms come with syndication and subscription support built in, allowing information to be easily published and subscribed to.

Aggregated Services Span Web Sites, Enable Integration:

It is the aggregation of many sources into one universal, yet dispersed, catalog that is the defining characteristic of Web 2.0 and of its best-known manifestation, the blogosphere. The adoption of syndication feeds in lingua franca formats like RSS and Atom by Web 2.0 publishing platforms allowed the blogosphere to emerge, enabling companies like Technorati to monitor, aggregate, classify, and make intelligible the hubbub of chatter on millions of individual web sites. For structured and authenticated data, RSS feeds make the very process of crawling obsolete. Crawling the web without any organization is like loading a dump truck full of books and calling that a "resource". Aggregating through structured syndication is the equivalent of putting each book on its appropriate shelf and allowing anyone to read, copy, and annotate the information. 
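The aggregation step itself is simple once feeds are structured: merge items from many independent sources into one chronological stream, the way a Technorati-style service unifies millions of blogs. The per-site feeds below are hypothetical stand-ins for parsed RSS items.

```python
from datetime import datetime

# Hypothetical items from two independent per-site feeds.
feed_a = [{"site": "blog-a", "title": "Hello", "date": "2007-06-01"}]
feed_b = [{"site": "blog-b", "title": "World", "date": "2007-05-28"},
          {"site": "blog-b", "title": "Again", "date": "2007-06-03"}]

def aggregate(*feeds):
    """Merge items from any number of feeds into one newest-first stream."""
    items = [item for feed in feeds for item in feed]
    return sorted(items,
                  key=lambda i: datetime.strptime(i["date"], "%Y-%m-%d"),
                  reverse=True)

stream = aggregate(feed_a, feed_b)
print([i["title"] for i in stream])
```

No crawler visits any of the source sites; the catalog is assembled entirely from what the publishers already syndicate.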
Tomorrow's threat to web and media network stakeholders does not come in the form of rival web sites, but rather as pervasive web services spanning and unifying resources no longer confined to a single site. Leading web players are threatened by such a permanent shift in database pervasiveness and propriety. Just as the walled gardens of the pre-web era, such as AOL and CompuServe, gave way to the open web, so too will quarantined social networks, search engines, and job hubs lose ground to web services enabled by Web 2.0 technology.

To overcome the reliance on central crawling, Wikipedia founder Jimmy Wales has purchased a peer-based web crawler and donated it to the open-source community. This project is just one aspect of his broader Wikia search effort, which aims to create a new set of standards beyond HTML to support social search.  
The effects of this revolution in content distribution also ripple through print and show business, as the power shift away from networks and back to content owners and users affects entertainment and journalism, online and offline.

Ubiquitous web services also have a vastly different ownership profile than traditional, fully proprietary, closed resources. While Web 1.0 destinations all have their price, Web 2.0 infospheres cannot be acquired: ( $630M. YouTube: $1.65B. The blogosphere: priceless.)


Copyright © 2005 Euromedia Group, Ltd. All Rights Reserved.