Monday 21 April 2008

Web 2.0 and Beyond


The Internet is an ever-expanding monument to human nature in the 21st century. There are around 1.3 billion users worldwide, each able, at the click of a button, to access information on pretty much any subject they desire. The consumer public can now consume at greater speeds than ever before, whether purchasing goods online, socialising via the various online channels, or simply entertaining themselves with all manner of digital media. Its ability to connect, to unify, and to render distance unimportant in communication lies at the foundation of its success.

In the beginning

In 1964, Marshall McLuhan, commonly acknowledged as "the leading prophet of the electronic age", spoke of a single consciousness of man, a "global village", which he foresaw twenty years before the first personal computer was available. McLuhan had anticipated the emergence of the Internet.

The Internet, or ARPANET as it was initially known, was conceived out of technological fear during the Cold War. In 1957 the Soviet Union launched the first Sputnik satellite into space, and this worried the US camp. Believing they were falling behind, and in a competitive and perhaps slightly paranoid spirit, they began to develop a decentralised and, in theory, failure-resistant communications network, with Joseph Licklider as a leading figure.

ARPANET brought several technological innovations: electronic mail (email), remote connections between computers, and the File Transfer Protocol (FTP), which allowed information to be sent from one computer to another in bulk. As non-military use of the network grew, it eventually became unsuitable for secure military traffic. By 1990 ARPANET had been shut down in the wake of in-house Local Area Networks (LANs) and new, faster networks using Internet protocols, initially set up for universities and research groups. Then, in 1989, Tim Berners-Lee invented the World Wide Web and the Internet began to open up to the public.

Web 2.0

The World Wide Web spent much of its early years as a mass of information resources and commercial entities. In the '90s we experienced the dot-com boom and its subsequent crash. Many had taken to setting up Internet-based companies in the hope of achieving success in this new arena; most met with failure, a few with success. But in 2004, Dale Dougherty, feeling a need to reassess where the Internet was headed, came up with the concept of Web 2.0. Conferences were held by Dougherty's O'Reilly Media group, detailing the idea of an Internet that promoted sharing and collaboration.

The web is now predominantly about communities, social networking, and content users can edit themselves and share with others around the world. With Web 2.0, we the users are in control. As it becomes easier for anyone with basic computer literacy to contribute, and as the number of Internet users, all wishing to express themselves in one way or another, continues to grow, the amount of content will only increase. Communication is a fundamental social process and need: it gives a person a sense of belonging and provides the interpersonal attachments we desire, and the Internet supplies this in abundance, becoming ever more accessible and offering the public greater freedom of speech than ever before.

Commercially, the Internet still provides financial success for certain businesses. For some industries, though, such as film and music, it has caused upset; you need only look at the recent writers' strike in the US to see an example of its impact. It seems, however, that these industries are slowly coming round to the idea of embracing the technology and using it to their advantage.


Web 3.0?

So what about the Internet beyond Web 2.0? It can be argued that by giving everyone the freedom to publish content without scrutiny, standards and accuracy of information, amongst other things, are lessened. One definition of "Web 3.0" I encountered seems to address this problem. It is perhaps an idealistic view in some respects, but one that would cure the Internet's penchant for misleading information, bringing quality and expertise to the forefront of the net experience.

If control is one element of the net that could change in the future, it must be considered that, in what has so far been a relatively lawless area, restrictions and rules could begin to surface whose breach would be punishable by law. Governments could intervene, regulating Internet output and access, just as China's government does already.

Another prospect for Web 3.0 is the Semantic Web, a vision of Tim Berners-Lee, the director of the W3C and inventor of the World Wide Web. His idea comprises a web in which the semantics of information and services are explicitly defined, allowing software as well as people to understand, query and combine web content in order to satisfy requests. These technologies do not yet fully exist, but they are a likely and exciting prospect.
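To make the idea a little more concrete, here is a minimal sketch of how machine-readable meaning can be expressed as subject-predicate-object "triples", written in Python with the rdflib library; the example.org names are invented purely for illustration, and only the standard FOAF vocabulary is a real, shared one.

    # Sketch: Semantic Web-style data as triples that software can query.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import FOAF, RDF

    g = Graph()
    EX = Namespace("http://example.org/")   # hypothetical namespace for this example

    # "Alice is a person, her name is Alice, and she knows Bob."
    g.add((EX.alice, RDF.type, FOAF.Person))
    g.add((EX.alice, FOAF.name, Literal("Alice")))
    g.add((EX.alice, FOAF.knows, EX.bob))
    g.add((EX.bob, FOAF.name, Literal("Bob")))

    # Because each term has a defined meaning, a program can answer a
    # structured question: "what are the names of the people Alice knows?"
    results = g.query(
        """
        SELECT ?name WHERE {
            ?person foaf:name "Alice" ;
                    foaf:knows ?friend .
            ?friend foaf:name ?name .
        }
        """,
        initNs={"foaf": FOAF},
    )
    for row in results:
        print(row.name)   # prints: Bob

The point is simply that the data carries its own meaning: any program that understands the shared vocabulary can draw the same conclusion, without scraping pages designed for human eyes.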

There are many differing opinions on what lies ahead for the net, but no one can be entirely sure. As long as the capabilities for richer content on the web continue to grow, so too will the possibilities.

Tuesday 1 April 2008

Apple's Nineteen Eighty-Four

It was a bleak picture that Apple, now one of the biggest players in today's computing market, painted with their infamous advertisement of 1984. The TV advert, which borrowed from Orwell's dystopian vision of the same year, aired only twice, but had an impact far outweighing its brief exposure. Apple weren't messing about: they had recruited big-shot director Ridley Scott of Blade Runner fame to put it together, and at only a minute long it cost them around $1.5 million to produce. It was shown once in a late-night slot in late 1983, and then once more during one of the biggest annual events in American broadcasting - the Super Bowl - where it would have had the attention of a possible 90 million viewers. So now take yourselves back (if possible) to January 22nd, 1984: the Los Angeles Raiders are taking on the Washington Redskins, there's a third-quarter time-out (if that means anything to you), and the Super Bowl, as a platform for media advertisement, is about to change...




Apple were days from revolutionising the world's computing market with their Macintosh computer, and soon to free the consumer public of the oppressive "Big Brother" figure recognised as Apple's main rival, IBM. And how had they planned on doing this? And why had they appeared so confident in their claims? Because Apple were about to make the first graphical user interface (GUI) widely available to public and commercial computing (not long after one failed attempt with their Lisa model). This meant that anyone, with a little training, could make use of a computer without having to be versed in code and command lines. The interface provided users with windows, pull-down menus, clickable buttons, scroll bars, icons, images and, most notably, ease of use. And so, as a result, came the era of Desktop Publishing.

Before the advent of Desktop Publishing there were typesetters producing commercial print: compositors working by hand with inked presses and cast metal sorts, and later with machinery. In the second half of the 1970s and the early '80s there were mini-computers which used text markup languages, the descendants of which are still in use today, most commonly on the Internet. These methods, however, were costly and required specialists. But with the birth of the Macintosh, and the array of software that was to be produced for it, such as Macromedia FreeHand, QuarkXPress and Adobe's Photoshop and Illustrator packages, people could publish designs and arrangements from the comfort of their homes or offices. The Macintosh allowed users to design, preview and print their own layouts for the first time, and at a crisp 300 dots per inch: an amazing development for communication design and graphic imaging.

But it was not without its downfalls. During these early years, desktop publishing acquired a bad reputation from untrained users who created and unleashed poorly organised layouts with ransom-note effects. A noticeable gap existed between the skilled designers and the amateurs, but things were to progress. Now, with the rise of new media in our current culture and the ongoing expansion of software capabilities, the future of desktop publishing, commercially or otherwise, does seem an exciting one, especially as the reach and possibilities of this multi-faceted discipline continue to extend and grow.




As for the legacy of a certain aforementioned advertisement, I came across this whilst channel-hopping one evening...



Very nearly a quarter of a century on, and still it seems to maintain a firm place in advertisers' minds!

Just as fascinating, though, is the ad campaign Apple followed the 1984 commercial with in '85, for their Macintosh Office package. Obviously an attempt to recreate the same kind of success and tone they had achieved with the previous campaign, it comes off as overly grim, and even insulting to potential customers! Unlike the 1984 ad, it was a failure, and it's not hard to see why...