Thursday 11 December 2008

Rise of the Machines



As man races through technological advancements, moving further and further toward the kind of existence only envisioned in science fiction films, we, as a captivated audience, can only speculate on what this future may hold. With man creating machines that are ever more able to imitate and appear to take on human life, how long will it be before they no longer need our input and are 'living' and acting creatively for themselves? Or is this out of our reach? Surely man must truly understand every facet of his own mind before successfully engineering that of another?

Man has now reached a level in the creation of artificial intelligence at which he can consider the applications of, say, a robot that thinks and acts for itself, even one that can program itself and possibly others. But it is still only consideration; the level of technology required for this to be a reality remains far from our reach. For now we have imitation: robots acting through human programming and built to resemble us.



The capability of this female robot in imitating human movement and our outward characteristics may not quite stand up against the real thing, but in some cases it can be enough to convince people that these robots possess an independent intelligence; arguably a prerequisite for creativity. Obviously this is not the case; it is rather an example of complex animatronics and programming.

So what does a machine need in order to be creative of its own free will, in other words to program itself? Not speaking in any technical terms, it could be said that a machine requires self-awareness, an ability for observation and curiosity, perhaps a certain element of personality. Alan Turing (1912-1954), the founder of computer science, devised a simple but ingenious test in order to determine whether or not a machine held these abilities, and therefore the ability to think for itself. In order for the machine to be successful in proving its intelligence and pass the test, it must be deceitful under interrogation; it must be creative! I have only touched on Turing's theories, but you can find out more here at Alan Turing's Homepage
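For the technically curious, here is a toy sketch of how the imitation game is structured: an interrogator exchanges questions with two hidden respondents, one human and one machine, and must work out which is which. This is purely my own illustration in Python, not anything Turing wrote; the machine's canned replies and the questions are invented for the example.

import random

def machine_reply(question: str) -> str:
    # A stand-in 'machine' with a few canned, evasive replies.
    canned = {
        "are you a machine?": "Of course not. Are you?",
        "what is 12345 * 6789?": "Sorry, I'm hopeless at mental arithmetic.",
    }
    return canned.get(question.lower(), "That's an interesting question.")

def human_reply(question: str) -> str:
    # In a real test this would be a person typing; here it is typed at the console.
    return input(f"(hidden human) {question}\n> ")

def imitation_game(questions):
    # Randomly assign the machine and the human to the labels A and B.
    respondents = {"A": machine_reply, "B": human_reply}
    if random.random() < 0.5:
        respondents = {"A": human_reply, "B": machine_reply}
    for question in questions:
        for label, respond in respondents.items():
            print(f"{label}: {respond(question)}")
    guess = input("Which respondent is the machine, A or B? ").strip().upper()
    actual = "A" if respondents["A"] is machine_reply else "B"
    print("Correct!" if guess == actual else "Fooled - the machine passes this round.")

imitation_game(["Are you a machine?", "What is 12345 * 6789?"])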

No machine has yet passed the Turing test, so we can only wonder what great things will be achieved by computer scientists in the future. Will machines ever reach a level where they are effectively re-creating themselves, even bettering themselves for their own ends? In popular culture, namely science fiction, we have enjoyed this notion in films such as The Terminator and in literature such as I, Robot by Isaac Asimov. I say enjoyed, but it seems we embrace these thoughts with a certain sense of trepidation, fearful that our own creations will turn against us! But I guess this is a possible eventuality that must be factored into the equation; would intelligent machines, as a collective 'species', resort to such atrocities as war?

Monday 1 December 2008

Creativity in the Animal Kingdom



It can be said that we, as humans, are the only truly intelligent species on the planet, in that we are the only species that can intelligently question the nature of its own being. Is it then possible that we are also the only species capable of creativity? Can an animal use its intuition and knowledge to act in a creative manner, or is it simply an act of imitation?

The term 'imitation' is often clouded by the misconception that it implies a kind of thoughtless 'copying', but in this case it is a far more complex subject area: there must be a conscious connection between the action and its consequence. There are countless examples of imitation in the animal kingdom in scientific experiments, ranging from budgerigars and rats to chimps and orangutans.

Take, for example, this video of an elephant, one of the most intelligent animals on the planet, painting what appears to be a graphic representation of another elephant:



I'm sure there are some who would like to think that the elephant has a conscious awareness of what it is doing, that it is aware it is producing a representation of its own image through free will - behaviour that would have to fall under the definition of creative. But rationally speaking it appears that, through a system of repetition and reward (such as you might employ when teaching a domestic animal), the elephant has been taught to imitate and carry out an action knowing it will be rewarded, most likely with food. The elephant is not acting creatively, merely following a path it knows to have a desired end result.

But what of animals that act independently of human interference, those that appear to act creatively as a means of survival? The following clip shows, amongst other things, a chimpanzee utilising stones as a hammer and anvil in order to break the hard shell of a nut it wishes to eat.



A fascinating clip which serves to highlight the evolutionary link between humans and chimpanzees. The ability to be creative is what separates us as humans from other living organisms; but what we see here is arguably an example of a species outside our own using creative thought in a purposeful way. Is this example merely some form of imitation being put to use after a chance encounter or experience? I'm not sure. What I do know is that it would only have taken the actions of one chimpanzee to make this 'creative' behaviour commonplace within its culture.

Saturday 22 November 2008

Creative Thinking



Creativity: what exactly is it? A common dictionary definition tells us:

Creativity is marked by the ability or power to create, to bring into existence, to invest with a new form, to produce through imaginative skill, to make or bring into existence something new.

But this cannot be held as definitive, because 'creativity' itself is a very personal experience, one that is unique to each person. The definition fails to explain exactly where this creativity comes from, and why. It is a definition many have struggled with, one that even psychologists and professors have speculated over.

In my quest for this fabled definitive answer I came across many ideas on what qualifies as creativity, or being creative, and one word that cropped up again and again was 'originality'. So if creativity, by definition, must involve the creation of something never before in existence, how exactly do these creative ideas come to be in the first place?

Darwin wrote in his theories on evolution that nature creates many possibilities through blind "trial and error" and then lets the process of natural selection decide which species survive. In nature, 95% of new species fail and die within a short period of time. Genius is analogous to biological evolution in that it requires the unpredictable generation of a large quantity of alternatives and conjectures. From this quantity of alternatives and conjectures, the genius retains the best ideas for further development and communication.

Scholars have attempted to apply this idea to our understanding of creativity and genius: fundamentally, to argue that creativity is hard-wired into the brain as a result of necessity and natural selection, and that real creative ideas come to fruition through, to use Darwin's terms, alternatives and conjectures - or, more simply, trial and error. To discover a good idea you must generate many ideas.
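As a playful illustration of this 'generate many, keep the best' principle, here is a small Python sketch in the spirit of Richard Dawkins' well-known weasel program: random variations are produced in bulk and only the fittest is retained each generation. The target phrase, batch size and mutation rate are arbitrary choices of mine, not anything taken from Darwin or the scholars mentioned above.

import random
import string

TARGET = "CREATIVITY IS TRIAL AND ERROR"
ALPHABET = string.ascii_uppercase + " "

def score(candidate: str) -> int:
    # Count how many characters already match the target phrase.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(parent: str, rate: float = 0.05) -> str:
    # Copy the parent, occasionally swapping in a random character.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

best = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
generation = 0
while score(best) < len(TARGET):
    generation += 1
    # Generate a large batch of alternatives, then retain only the best.
    offspring = [mutate(best) for _ in range(200)]
    best = max(offspring + [best], key=score)

print(f"Reached the target after {generation} generations: {best}")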

A good example of this would be Thomas Edison, who reportedly conducted some 50,000 experiments before inventing the alkaline storage battery and around 9,000 to perfect the light bulb. His creative genius was achieved through gathered knowledge and experience. This is a more rational approach to the idea of creativity, and one I would prefer to subscribe to.

The example above considers only creativity in a functional sense, but the same can be true of artistic creativity working in conjunction with its scientific counterpart. The artist and sculptor James Turrell has achieved something worthy of creative merit through much the same process. He has endeavoured to create astonishing spaces within Roden Crater, an extinct volcano in the Arizona desert near Flagstaff. These spaces include one in which he seeks to gather light older than our solar system, and another in which a viewer's shadow can be cast by the light of Venus alone. Turrell approaches light in a very physical sense and seeks to have people question their perceptions through this highly creative means. More information on James Turrell and his projects can be found at http://www.pbs.org/art21/artists/turrell/

There is an alternative view, though: that creativity is God-given, and that great creative ideas are simply gifted to us. It can be said that some are 'gifted' with greater creative abilities than others, but the rational person in me fails to believe that this is a product of any kind of divine intervention. It could also be argued that, by being gifted the means to be creative by that which created all that has and will ever exist, we cannot be truly creative at all, not in the true sense of the word. But in saying that, we must be aware that words are nothing more than the tools with which we try to better understand things!


On a more light-hearted note, here is an interesting, if unusual, take on creativity I found on YouTube.

Thursday 1 May 2008

The Multimedia Effect


Ray Bradbury, in his 1953 novel “Fahrenheit 451”, depicts a world where the printed word has been outlawed by a government in fear of an increasingly free-thinking public. It is an oppressive society in which the story’s protagonist exists, but one which may not be too far from the truth when we think about our fast-approaching future. OK, so perhaps the idea of a government-orchestrated oppression isn’t such an accurate foresight, but the idea of a world in which the printed word is in mass decline could soon become a reality.

In other words, what we could soon come to live in, in the Western world at least, is a ‘post-literate’ society in which reading and writing as our primary means of communicating ideas become almost obsolete, or simply undesired.

The growth of multimedia, and of new media outlets such as the Internet, has allowed the public to experience information, educative or otherwise, in a variety of forms. Image and sound, through film, television and the web, form much of today’s content, and the interaction between this content and its users is becoming increasingly important. As an example, websites are gradually replacing newspapers and magazines as a more dynamic and efficient way in which to present world news and information.

Although this is true, I cannot envisage, in our lifetime at least, our Western civilization reaching a point where we completely disregard things such as books; if only because people like to have something they can hold, something with a more physical presence. (The same can be said of the compact disc.) And with a book a person often seeks to sit and relax: terms I, and probably many others, don’t always associate with the use of a PC, for example. That’s not to completely rule out, though, the possibility of the printed word being replaced in an educative and informative context.

On the other hand, should we come to live in a society where the majority of the population is technologically adept and accepting, can we then expect to see multimedia replacing the printed word? It seems a more viable outcome in those circumstances.


And finally, coming to issues arising from, or perhaps affected by, this shift in media: wouldn’t the environment benefit somewhat from the possible redundancy of paper?

Monday 21 April 2008

Web 2.0 and Beyond


The Internet is an ever-expanding monument to human nature in the 21st century. There are thought to be around 1.3 billion users worldwide, each able, at the click of a button, to access information on pretty much any subject they desire. The consuming public can now consume at greater speeds than ever before, whether it be purchasing goods online, socialising via the various online means or simply entertaining themselves with all manner of available digital media. Its ability to connect, to unify, and to render distance an unimportant factor in communication lies at the foundation of its success.

In the beginning

In 1964, Marshall McLuhan, a man commonly acknowledged as "the leading prophet of the electronic age", spoke of a single consciousness of man, a 'global village', which he foresaw more than a decade before the first personal computer was available. McLuhan had anticipated the emergence of the Internet.

The Internet, or ARPANET as it was initially known, was conceived as a result of technological fear during the Cold War. In 1957 the Soviet Union had just launched the first Sputnik satellite into space, and this brought worry to the US camp. The Americans believed they were falling behind and so, in a competitive and perhaps slightly paranoid spirit, began to develop a decentralized and resilient communications network, with J.C.R. Licklider as their leading man.

ARPANET brought about several technological innovations: electronic mail (email), remote connections between computers, and the File Transfer Protocol (FTP), which allowed information to be sent from one computer to another in bulk. As a result, non-military uses for the network increased and it eventually became unsuitable for secure military use, with military traffic moving onto a separate network. By 1990 ARPANET had been shut down in the wake of in-house Local Area Networks (LANs) and new, faster networks using Internet protocols, initially set up for universities and research groups. Then, in 1989, Tim Berners-Lee invented the World Wide Web, and the Internet soon became publicly available.

Web 2.0

The World Wide Web spent much of its early years as a mass of information resources and commercial entities. In the 90s we experienced the dot-com boom and its subsequent crash: many had taken to setting up Internet-based companies in the hope of achieving success in this new arena; most met failure, a few found success. But in 2004, Dale Dougherty, feeling a need to reassess where the Internet was headed, came up with the concept of Web 2.0. Conferences were held by Dougherty's O'Reilly Media group, detailing the idea of an Internet that promoted sharing and collaboration.

The web is now predominantly about communities, social networking, and content users can edit themselves and share with others around the world. With Web 2.0, we the users are in control. As it becomes easier for anyone with basic computer literacy to contribute, and as the number of Internet users, all wishing to express themselves in one way or another, continues to grow, the amount of content will only increase. Communication is a fundamental social process and need: it gives a person a sense of belonging and provides us with the interpersonal attachments we desire, and the Internet provides this in abundance; an Internet which is becoming ever more accessible and which affords the public a greater freedom of speech than ever before.

Commercially, the Internet still provides financial success for certain businesses. For some industries, though, such as the film and music industries, it has caused upset; you need only refer to the recent writers' strike in the US to see an example of its impact. It seems, however, that industries are slowly coming round to the idea of embracing the technology and using it to their advantage.


Web 3.0?

So what about the Internet beyond Web 2.0? Currently it can be said that, by giving everyone the freedom to publish content without scrutiny, standards and the accuracy of information, amongst other things, can suffer. One definition of "Web 3.0" I encountered seems to counteract this problem. It is perhaps an idealistic view in some respects, but one that would cure the Internet's penchant for misleading information, bringing quality and expertise to the forefront of the net experience.

If control is an element of the net that could be subject to change in the future, it must be considered that, in what has so far been a relatively lawless area, restrictions and rules could begin to surface, with breaches punishable by law. Governments could intervene, regulating Internet output and access, just as China's government does already.

Another prospect for Web 3.0 is the Semantic Web, a vision of Tim Berners-Lee, the director of the W3C and inventor of the World Wide Web. His idea is of a web where the semantics of online information and services are explicitly defined, allowing machines as well as people to understand, request and make use of web content. These technologies are not yet fully in existence, but they are a likely and exciting prospect.
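To give a rough flavour of what 'defining the semantics of information' might mean in practice, here is a tiny hand-rolled Python sketch of the subject-predicate-object 'triples' the Semantic Web is built upon; the resources and relations named below are invented for the example and are not real Semantic Web vocabularies.

# Facts published as machine-readable triples rather than free-flowing prose.
triples = {
    ("RodenCrater", "is_a", "Artwork"),
    ("RodenCrater", "created_by", "JamesTurrell"),
    ("JamesTurrell", "is_a", "Artist"),
    ("Fahrenheit451", "written_by", "RayBradbury"),
}

def query(subject=None, predicate=None, obj=None):
    # Return every triple matching the pattern; None acts as a wildcard.
    return [(s, p, o) for (s, p, o) in triples
            if subject in (None, s) and predicate in (None, p) and obj in (None, o)]

# "What did James Turrell create?" - answerable by a machine, with no natural language involved.
print(query(predicate="created_by", obj="JamesTurrell"))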

There are many differing opinions on what lies ahead for the net, but no one can be entirely sure. As long as the capabilities for richer content on the web continue to grow, so too will the possibilities.

Tuesday 1 April 2008

Apple's Nineteen Eighty-Four

It was a bleak picture that Apple, one of the biggest players in today's computing market, painted with their infamous advertisement of 1984. The TV advert, which borrowed from Orwell's dystopian vision of the same year, aired only twice, but had an impact far outweighing its brief exposure. Apple weren't messing about: they had recruited big-shot director Ridley Scott of Blade Runner fame to put it together, and at only a minute long it cost them around $1.5 million to produce. It was shown once during a late-night slot in late 1983, and then once more during one of the biggest annual events in American broadcasting - the Super Bowl - where it would have had the attention of a possible 90 million viewers. So now take yourselves back (if possible) to January 22nd, 1984: the Los Angeles Raiders are taking on the Washington Redskins, there's a third-quarter time-out (if that means anything to you), and the Super Bowl, as a platform for media advertisement, is about to change...




Apple were days from revolutionising the world's computing market with their Macintosh computer, and soon to free the consuming public of the oppressive "Big Brother" figure recognised as being Apple's main rival, IBM. And how had they planned on doing this? And why had they appeared so confident in their claims? Because Apple were soon to make the first widely available graphical user interface (GUI) for public and commercial computing (not long after one failed attempt with their Lisa model). This meant that anyone, with a little training, could make use of a computer without having to be versed in coding and command lines. The interface provided users with windows, pull-down menus, clickable buttons, scroll bars, icons, images and, most notably, ease of use. And so, as a result, came the era of desktop publishing.

Before the advent of desktop publishing there were typesetters producing commercial prints: compositors working by hand with their inked presses and cast metal sorts, and later with machinery. In the second half of the 1970s and the early 80s there were machines - minicomputers - which used text markup languages, the descendants of which are still in use today, most commonly on the Internet. These methods, however, were costly and required specialists. But, with the birth of the Macintosh, and the array of software that was to be produced for use with it, such as Macromedia FreeHand, QuarkXPress and Adobe's Photoshop and Illustrator packages, people could publish designs and arrangements from the comfort of their homes or offices. The Macintosh allowed users to design, preview and print their own layouts for the first time, and at a crisp 300 dots per inch; an amazing development for communication design and graphic imaging.

But it was not without its downfalls. During these early years, desktop publishing acquired a bad reputation from untrained users who created and unleashed poorly organized layouts with ransom-note effects. A noticeable gap existed between the skilled designers and the amateurs, but things were to progress. Now, with the rise of new media in our current culture and the ongoing expansion of software capabilities, the future of desktop publishing, commercial or otherwise, does seem an exciting one; especially as the reach and possibilities of this multi-faceted discipline continue to extend and grow.




As for the legacy of a certain aforementioned advertisement, I came across this whilst channel-hopping one evening...



Very nearly a quarter of a century on, and still it seems to maintain a firm place in advertisers' minds!

Just as fascinating, though, is the ad campaign Apple followed the 1984 commercial with in '85, for their Macintosh Office package. Obviously an attempt to recreate the same kind of success and tone they had achieved with the previous campaign, it comes off as overly grim, and even insulting to potential customers! Unlike the 1984 ad, it was a failure, and it's not hard to see why...





Monday 3 March 2008

Grayson Perry and Lars Tharp: In Conversation...?

When I first encountered the Turner Prize-winning artist and potter Grayson Perry, as a college student of around sixteen years of age, the first thing that struck me was his eccentricity and seemingly bizarre personality; not least because of his tendency to make appearances in public dressed as ‘Claire’, the cross-dressing alter ego he regularly adopts. So, five years on, the opportunity arose for me to witness the aforementioned in a live conversation with Lars Tharp, a ceramics historian and broadcaster from the BBC’s Antiques Roadshow. On first impressions it appeared as if what had been planned here was a meeting of two very different minds, a ground for debate and conflicting ideas. Perry, I assumed, was to bring the more broad-minded and fantastical views; Tharp, a more traditionalist and conservative presence. Shortly after I took my seat in the audience, Tharp, you could say, confirmed this... in very few words at least. Then came Perry’s turn to make an entrance; I won’t pretend I didn’t expect him to come out donning one of his trademark ‘fit for a ball’ dresses and handbags, but to my surprise this was not the case. The bright pink V-neck would have to suffice.

So, what did I learn? I learnt within minutes that Lars Tharp isn’t one for confrontation. With the smallest display of disagreement from Perry came my realisation that this wasn’t going to be the head-on clash I’d previously hoped for. But all was not lost: for the rest of the hour-long period we were treated to some deliciously funny quips from Perry. His comparisons between the work process and ‘tossing one off’, or even ‘shooting your load’, were particularly pointed, but not quite as amusing as his thoughts on the Arts Council and its members. Just as Perry labelled the women of the Council as the dangly-earring-wearing type, I noticed these very beings sitting directly in front of me; in fact, it became apparent that this was the case all around the lecture hall. That’s when I suspected that the majority of the audience weren’t here to see Grayson Perry at all; I suspected they were here to see Lars Tharp. If that was the case then I expect some were disappointed that the conversation became what I’d describe as an interview, conducted by Tharp.

With that settled, we gathered a little insight into Perry’s creative world. He ventured into his beginnings as an artist, and into how his interest in pottery began, which seemed to be because no-one else held such an interest at the time. He also expressed his opinion on original ideas in art, suggesting he didn’t much care for them. Strange, because, by my reckoning, it was his original take on pottery as a canvas and narrative tool that went some way towards gaining him the Turner Prize back in 2003. It seems that the ‘punk’ and rebellious attitude of his youth, one Perry had described to us as having, still resonated somewhat in his current life. Or maybe he simply wished to be different from everyone else, to do and say things that no-one else wanted to see or hear. Contradiction, it seemed to me, took a recurring presence in most of what he said. Despite this, I have to say I find his work intriguing and brilliantly bizarre. His use of visual metaphor and his regressions to childhood are particularly striking. Admirable, too, are his methods, his processes and the fact that his work relies quite heavily on an emotional and physical investment.

The conversation came to an end with Tharp unable to refrain any longer from mentioning his affiliation with The Antiques Roadshow, or ‘the Roadshow’ as he’d so proudly put it. I must commend him, though; he’d tried his best. Not to fault his interviewing skills, he did a fine job, even if he had failed to point out Perry’s contradictory manner. Besides, that had proved unnecessary when Perry pointed it out by his own admission not far from the end. The audience gave quite a rapturous round of applause and left happily, as did I. The experience was entertaining and insightful for the most part: an often rare combination.