In my university classes, I tell my students there are only two things they need to know about the widely misunderstood and overhyped terms “Web 1.0,” “Web 2.0,” and “Web 3.0.”
First, there are no decimal points. After all, there is no Web 1.7 or Web 2.3. The real terms should be “Web 1,” “Web 2,” and “Web 3.” Second, those terms roughly describe slightly overlapping periods in Web history, similar to how Cambrian, Triassic, and Jurassic are geologic periods.
Web 1 refers to the initial period, starting from Tim Berners-Lee’s invention of the Web in 1989. During Web 1, the technologies needed to either publish or broadcast on the Web were relatively expensive and required a modicum of technical training to use. Almost all Web publishing and broadcasting were done by large organizations, such as governments, universities, corporations, and media companies. People merely read, listened to, or viewed what those organizations put online.
Web 1 never really ended, nor will it. However, its online primacy gradually gave way, thanks to Moore's Law. As the average personal computer's processing power increased and its cost decreased, the technologies used to publish or broadcast on the Web became inexpensive and easy enough for anyone to use. Large organizations no longer had an online monopoly on publishing and broadcasting. Behold the Web 2 period.
Anyone and everyone, including spontaneously formed groups, can publish or broadcast anything instantly online to the entire world. Much as Web 1 spawned new forms of media, such as search engines and Web pages, Web 2 spawned new forms, such as social media, blogging, and mashups.
Web 1 created what has become a hyperlinked multimedia magazine of universal and global scale: the Web itself, which people browse via Web pages. It let organizations instantly publish or broadcast anything to everywhere without the constraints of press times or transmitter ranges, thus radically reshaping the media landscape. Web 2 then engendered nebulae of networked people whose complexity of gravitational currents and flows rivals some actual galaxies. By letting everyone who wants to publish and broadcast do so, Web 2 reshaped the media landscape even more fundamentally than Web 1 did.
Yet progress doesn’t stop there. Moore’s Law marches on. In hindsight, Web 1 and 2 were primitive developments; people who use those technologies have to be hunter-gatherers. You may be able to find what or who you want on the Web, but you have to manually hunt through search engines and visit many Web sites to gather it (even if you’re among the tiny group of Internet users who use RSS to slightly automate this process, not everything RSS delivers is what you want that day). It would be ridiculous to stay with that situation, particularly when the average personal computer’s processing power has geometrically increased several times over (aided by dual-, quad-, and octa-processor machines) just in the several years since Web 2 arose.
Thus, Sir Tim Berners-Lee and the World Wide Web Consortium since about 2000 have been spearheading development of Web 3.
A simple way of understanding Web 3 is that if you’re planning to travel somewhere, you needn’t visit a travel site, a map site, restaurant review sites, shopping sites, a social-media itinerary site, and so on to find out how to get there, where to stay, where your appointments are located, and where to eat and shop, or to learn whether any of your friends will coincidentally be traveling there, too. Instead, you’d type or tell your computer or handheld device where you’re going, and it would do all those things for you, then display the results. Imagine something similar to your Facebook, MySpace, iGoogle, or My Yahoo pages that delivers to you everything your friends are doing, the news about the topics you care about, competitive information about products or services you want to buy, and everything else you might want from the Internet without the constraints, control, adware, or spyware of Facebook, MySpace, Google, or Yahoo. Rather than you surfing the Web (which you could still choose to do), the Web serves you.
Web 3 requires technologies such as XML, metadata markup, and Web ontology languages, things as arcane to mere mortals as HTML and CSS were a decade ago, but nothing insurmountable. Some IT people dismiss Web 3 as too complex ever to be implemented — which is, of course, what they said about Web 1 between 1989 and 1992.
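To make the idea of machine-readable metadata markup concrete, here is a hedged, illustrative sketch: a few lines of RDF-style XML that a restaurant site might publish, queried with Python's standard library. The namespace, property names, and URLs are simplified stand-ins for this example only, not an actual published ontology.

```python
import xml.etree.ElementTree as ET

# Illustrative RDF-style metadata a restaurant site might publish.
# The "ex:" vocabulary and URLs are invented stand-ins, not a real ontology.
METADATA = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:ex="http://example.org/travel#">
  <rdf:Description rdf:about="http://example.org/cafe-luna">
    <ex:name>Cafe Luna</ex:name>
    <ex:city>Lisbon</ex:city>
    <ex:cuisine>Portuguese</ex:cuisine>
  </rdf:Description>
</rdf:RDF>"""

NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "ex": "http://example.org/travel#",
}

def find_restaurants(xml_text, city):
    """Return the names of described restaurants located in the given city."""
    root = ET.fromstring(xml_text)
    names = []
    for desc in root.findall("rdf:Description", NS):
        if desc.findtext("ex:city", namespaces=NS) == city:
            names.append(desc.findtext("ex:name", namespaces=NS))
    return names

print(find_restaurants(METADATA, "Lisbon"))  # ['Cafe Luna']
```

The point of the sketch is the division of labor: the site describes its content once in structured metadata, and any agent, not just a human with a browser, can then answer questions against it.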
Webs 1, 2, and 3 have a stratal relationship in the media landscape; they don’t replace each other but are built atop each other. Moreover, each of these historical periods added a new dimension. For marketers, the first period gave the Web height and the second gave it length (the long tail); the third will give it depth, extraordinarily increasing marketers’ potential in new media.
New media marketing’s holy grail has long been that each consumer will receive information (including advertisements) only for products and services she is interested enough in to buy. Likewise, marketers will spend to reach only those consumers and not bother or expend effort on consumers who aren’t qualified or interested. (Brand awareness marketing doesn’t easily fit into the Internet.) Web 1 let marketers reach billions of people online. Web 2 gave those billions capabilities equal to publishers, broadcasters, and marketers — including the ability to share their own reviews of products and services free of control or constraints by publishers, broadcasters, and marketers. However, Web 1 and 2 still rely on consumers searching and hunting for what they want and on publishers, broadcasters, and marketers hoping that the consumers will come to Web pages.
Searching, hunting, and hoping aren’t foundations of good business, particularly when the pace of technology will now allow buyer and seller, consumer and marketer, to be matched automatically and on equal footing. The changes that Web 3 will bring to the advertising and marketing industries in the next decade or two will be as great as those computerization brought to the financial industry during the past three decades. We’ll still need people to craft the creative (to craft the “deal”), but Web 3 will likely revolutionize how product and service information is delivered and consumed. Media buying will most likely be done on an individual rather than demographic basis.
Web 3 is a large subject to which I’ll devote more columns later this year.