Adam Ash

Your daily entertainment scout. Whatever is happening out there, you'll find the best writing about it in here.

Friday, August 11, 2006

What became of the www?

The worldwide whatever – by DAVID OLIVE

"Legend has it that every new technology is first used for something related to sex or pornography. That seems to be the way of humankind." — Tim Berners-Lee, inventor of the World Wide Web

Last week, Playboy Enterprises, Inc. teamed with a Toronto-based specialist in gambling software to announce the launch later this year of the first Playboy-branded online poker website.

CryptoLogic Inc.'s president, Lewis Rose, lauded Playboy as "one of the world's premier entertainment brands." And Christie Hefner, CEO of Playboy, blew an air kiss Rose's way, heaping praise on his firm's "technical and industry strength."

In the United States, where online gambling is illegal but thriving, federal legislators now attempting a crackdown on the illicit industry might see the deal differently — as the partnership of a porn purveyor with a firm exploiting another of man's primordial weaknesses.

Fifteen years ago today, Tim Berners-Lee launched the World Wide Web with a noble vision for it as "a force for individual, regional and global understanding."

It's a safe guess that the great majority of the more than 1.1 billion people who now use the Web are more in tune with Playboy's vision of it than Berners-Lee's. For them it's about searching for a job or a new-home listing. Making a last-minute mortgage payment online. Comparison shopping for Advil at Walmart.com and Walgreens.com. Making arrangements for a trip to Disney World, including a stop at Lavalife.com to check out potential travelling companions. Downloading a screensaver of the cutest Coldplay band member. Filling out a collection of limited-edition Hummels. Or ogling porn sites and visiting chat rooms — the "killer application" that drove Web surfers to the first "cybercafés" in the early 1990s.

The Web has shrunk Marshall McLuhan's global village into a town square. Its most intimate manifestation, perhaps, is MySpace, a hangout for 100 million people, many of them teens who share photos with friends and exchange history-class horror stories.

The Web phenomenon of "social networking" has yielded scores of websites where people with common interests congregate daily, including Connect.ee for the Estonian diaspora; Reunion.com, where 25 million users re-connect with friends and family; and VampireFreaks.com, a seven-year-old Web community of 550,000 mostly teen-aged "cybergoths." Long-time members are granted access to The Dungeon, where they can view each other's buttocks. Possibly this is what George W. Bush had in mind when he warned of "children living, you know, in the dark dungeons of the Internet."

The mostly prosaic uses of the Web can only be a disappointment for Berners-Lee, a self-effacing man nonetheless infected with his share of that geek high-priesthood conviction that he's on a mission to refine our civilization, in ways we technophobes can't quite understand.

Just the same, Sir Tim (he was knighted in 2004) and his collaborators on the nascent Web project back in 1991 succeeded in creating a means by which their aspirations for a world less plagued by conflict, disease, and ignorance might yet be achieved.

In Boston, the dogged Sir Tim, now 51, is at work at the Massachusetts Institute of Technology (MIT) on another potential triumph. Dubbed the Semantic Web, it aims to develop a "thinking" computer network that gathers, with a minimum of instruction, copious amounts of vaguely related data worldwide.

Cleverly interpreted, such a data accumulation could demystify the most stubbornly misunderstood diseases, or "connect the dots" of terrorist-group activities — a task beyond the abilities of the intelligence agencies that so abysmally failed before the terrorist attacks on New York, Washington, Bali, Madrid and London.

Computers can retrieve and display the billions of bytes of information out there, but they can't make sense of it. With "global semantic data," says Berners-Lee, "you'll be able to combine the data you know about with other data that you didn't know about.

"The big challenges such as cancer, AIDS, and the drug discovery for new viruses require the interplay of vast amounts of data from many fields that overlap — genomics, proteomics, epidemiology, and so on. Some of this data is public, some very proprietary to drug companies, and some very private to a patient."

The challenge for Berners-Lee and his MIT colleagues is to develop computer programs that protect proprietary and personal information while gathering and manipulating it all the same, making trailblazing connections across hundreds of fields, encompassing everything from regional weather patterns to racial origin to household finances and occupations, in order to determine predictors of disease incidence or avoidance.

"Our lives will be enriched by this data," Berners-Lee explained in a 2004 interview, "which we didn't have access to before, and we'll be able to write programs that will actually help because [the computer will] be able to understand the data out there rather than just presenting it to us on the screen."

Many of us would settle for changing the world just once in a lifetime, as Berners-Lee did on Aug. 6, 1991. The Web came into being that day when, at age 36, the soft-spoken English computer programmer at a scientific think tank in Geneva put his first "Web" site onto the Internet, thus creating the World Wide Web, a term he coined.

That first website explained the set of Web protocols, of which Berners-Lee was the principal author, and by which millions of pages of data stored in far-flung computers worldwide could be retrieved with relative ease for the first time.

Among the most important acronyms of all time, the standards HTTP, HTML, and URL transformed the three-decade-old Internet from a repository of arcane technical data into the most powerful tool of today's Information Age.
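
For readers who have wondered what those three abbreviations actually do, here is a minimal sketch in Python (my illustration, not part of the original article): the URL names a resource, HTTP is the protocol that requests and delivers it, and HTML is the markup the server sends back.

    # A minimal fetch of one Web page, showing all three standards at work.
    from urllib.request import urlopen   # HTTP client in Python's standard library

    url = "http://example.org/"          # URL: the address that names the resource
    with urlopen(url) as response:       # HTTP: the request/response protocol
        print(response.status)                   # e.g. 200, HTTP's "OK" code
        html = response.read().decode("utf-8")   # HTML: the markup that comes back

    print(html[:80])   # the opening tags of the page, e.g. "<!doctype html>..."

Nearly every page view since 1991, however elaborate the site, is at bottom this same three-part exchange.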

Like running water, the Web is now a ubiquitous utility as familiar as the phone and cable companies that provide access to its congeries of 98 million websites (up from 50 in 1992). Websites are maintained by the Prime Minister's Office and comedy-club booking agents, by blue-chip corporations and local scrap yards, by advocacy groups like Amnesty International and the Sierra Club, and possibly by your neighbourhood church and pub, eager to post their calendars of choir practices and darts tournaments.

Not to overlook the estimated 30 million sites operated by "bloggers" — online diarists, or "Web loggers," who carry on about the perfidy of the Bush administration or its detractors, fly-fishing techniques, or updates of the Henderson family's progress, complete with holiday snaps, as it retraces the 18th-century trade route of the coureurs de bois.

Last year, Canadians purchased close to $40 billion worth of merchandise online. The average Canadian household viewed 1,373 Web pages and spent 28 hours online each month. And that doesn't include Web surfing in the workplace.

Web searches now number 5.7 billion per day worldwide. For many of us, the Web is the first place to look for weather, travel and financial information and, thanks to Google Maps, the exact location of the dinner party you're to attend tonight.

'The information superhighway will... open up a huge opportunity to waste your time'

Harold Geneen

1970s management guru

The world's embrace of the Web is invariably described as unusually rapid. The widespread acceptance of the telephone took some 70 years; of cars, 60 years; of passenger aviation, 70 years; of radio and television, 30 years each; of microwave ovens, 28 years; and of VCRs, 35 years.

But the first "Internet cafés" appeared in London, Toronto and Helsinki just three years after Berners-Lee posted his 1991 manifesto proclaiming the newborn Web to be "an easy but powerful global information system."

And in the comparatively brief decade or so since the Web became a commercial medium, with the emergence of Amazon.com, eBay Inc. and other Web vendors in the mid-1990s, two-thirds of North Americans, or 227 million of us, became regular Web users.

Similarly startling were the early predictions made about the Web's certain impact.

Medical and other scientific discoveries would occur at a breakneck pace. Abroad, tyrannical regimes would succumb to new leaders empowered by information disseminated from freedom-loving societies, while at home political and corporate governance would be sanitized by the scrutiny of independent watchdogs (as political bloggers style themselves) and the proliferating revelations of wrongdoing they dug up.

Traditional stores would be killed off by e-commerce, as computer-mouse "clicks" triumphed over "bricks." The mainstream media would be supplanted by Web broadcasts and electronic publishing of books, newspapers, and magazines.

"Commerce in the next decade will change more than it's changed in the last hundred years," Jack Welch, CEO of General Electric Co. and one of the globe's most respected management gurus, declared in the late 1990s.

Even earlier, Hugh McColl, CEO of NationsBank (now Bank of America), said the Web "is like a tidal wave. If you fail in the game, you're going to be dead."

Bill Gates, chagrined that he had initially underestimated the Web's impact, said, "The Internet is only surpassed by mental telepathy."

The Web's usefulness is, of course, undeniable. But seldom has the significance of a major innovation been so exaggerated.

The industrialized world was primed for the Web well before it was launched. The Internet, of which the Web is just one application, was already 20 years old and largely built, and personal computers were widespread in homes and offices by the late 1980s.

But, if anything, the uptake of the Web was halting in the beginning, given the conflicting strategies of the new Internet service providers (ISPs) and the lame content, crude graphic design, and poor navigation of mid-1990s websites.

A hesitant business world wavered between thinking the Web was a fad and fearing that its success would only cannibalize existing sales. E-commerce has since manifested itself as just another distribution channel of companies from Loblaws to the Royal Bank, which still operate plenty of bricks-and-mortar supermarkets and bank branches.

Amazon.com, still only marginally profitable 12 years after its founding in Seattle by Jeff Bezos, and eBay, which is not a retailer at all but an auction house, are among the few major survivors of the dotcom bust of 2000.

The traditional book trade has thrived since 1991. Stephen King's experiment in 2000 with writing a book exclusively for Web readers has not been repeated, by King or anyone else. A decline in newspaper readership that began long before the Web's appearance was reversed by the papers' online editions, which have given major papers the biggest audiences they've ever commanded. Not one paper or magazine has perished directly because of the Web.

And, sadly, not one scientific breakthrough can be attributed to Web-accelerated exchanges of research data. Nor have peace and greater mutual understanding broken out in the Middle East, Darfur, or among the combatants in the fledgling war between Ethiopia and Somalia.

Bloggers, mostly one-person operations with no reporting staff, scalp much of their content from the mainstream publications and network broadcasts they mock for their slow-footedness. Bloggers have not set the agenda in politics, science or any other realm.

"The blogosphere is not a hothouse where brilliant new ideas are generated by the self-described iconoclasts who populate it," says Rick Salutin, media critic at The Globe and Mail . "The main qualification for blogging is that you failed to get a mainstream media job. Writers on the Web tend to be in touch only with other bloggers, not people in the street. It still takes a grassroots movement to force a fundamental change in social conditions."

Then there's the downside. There is a Center for Online Addiction in Pennsylvania. Porn continues to be the Web's top attraction. Sexual predators are an increasing worry for parents of kids who frequent social-networking websites. And scam artists and identity thieves have found a new playground.

Web 1.0 has pretty much turned out as Harold Geneen, the most celebrated management guru of the 1970s, said it would in his 1997 book, The Synergy Myth.

"Let's not get carried away," Geneen said. "The information superhighway is a tool, perhaps as revolutionary an innovation as the printing press or the telephone, but a tool nonetheless. A lot of it will be a guy in New Jersey sitting in his room talking to a guy in Iceland about the weather. It will also open up a huge opportunity to waste your time. You ought to go back to the beginning of the television era and read some of the claims people made for that new technological wonder."

Still, there is reason to expect better from Web 2.0. In 1961, decades before the arrival of Newsworld, the Discovery Channel and Masterpiece Theatre, the head of the U.S. Federal Communications Commission could rightly describe TV as a "vast wasteland."

Berners-Lee, who was put off by the greed of the dotcom era (he forfeited his own chance at riches by donating his protocols in order to standardize the Web), insists the Web will evolve substantially from what he regards as its current infancy.

"It will be many decades before we will be able to say we have really implemented the Web idea in full, if ever we can," says Berners-Lee. But "once you start with the basic Web idea, so much stuff becomes possible."

Berners-Lee, who now makes his home near MIT, returned to religion after a long absence and is active in the Unitarian church. As he described on his own website, he sees a parallel between the Unitarians and the Web.

They are, he believes, both decentralized entities with a higher purpose.
