
Living & Working Through The Digital Revolution – What Can We Expect Next?

Having lived through a large part of the life cycle of computer technology and graphics, we found that taking a look back was a remarkable reminder of just how far we have come in such a short timeframe. In this blog, we take a look back at the key developments of the last 50 years and ask ourselves where we can expect to head next – what will be our next milestone?

There is a generation of us out there who began our working lives, or even studied at university or school, just before the internet became an essential part of daily life and before digital technology had such power to influence our day-to-day decisions and dictate our behaviour. We have therefore experienced the dramatic changes that have taken place in society over the past few decades and grown up with the digital revolution.

Without the computer technology now available to us through constant innovation and a strong air of competition amongst entrepreneurs, we would be living in a very different world, and one without half the visual stimulation we see all around us on a daily basis. And it is quite startling to see how quickly things have evolved. If we travelled back in time to the 1960s, we would struggle to get used to a life without our gadgets and visual media. How would we cope?

Considering the quality and realism of the computer graphics we see today, it’s hard to imagine that the field didn’t even exist 50 years ago. Yet even now the technology continues to evolve at a rapid pace. And while companies have come and gone over the years, the people haven’t: most of the early pioneers are still active in the industry and just as enthusiastic about the technology as when they first started. Companies are now forced to keep up with developments in order to stay in business, as the consumer world changes day by day and behaviours are shaped by new technology, forming new expectations.

What started as a monochrome command-line prompt has grown into a multicoloured, multitasking, multimedia masterpiece!

So when did it all begin?

Computing hardware evolved from machines that needed separate manual action to perform each arithmetic operation, to punched-card machines, and then to stored-program computers. Before the development of general-purpose computers, most calculations were done by humans. Aside from written numerals, the first aids to computation were purely mechanical devices that required the operator to set up the initial values of an elementary arithmetic operation and then manipulate the device to obtain a result. A sophisticated example is the slide rule, in which numbers are represented as lengths on a logarithmic scale and computation is performed by setting a cursor and aligning sliding scales, thus adding those lengths (a principle we sketch in code below).

Numbers could also be represented in a continuous “analog” form, in which a voltage or some other physical property was set to be proportional to the number. Analog computers, like those designed and built by Vannevar Bush before World War II, were of this type. Alternatively, numbers could be represented as digits, automatically manipulated by a mechanism. Although this last approach often required more complex machinery, it made for greater precision of results. In the United States, the development of the computer was underpinned by massive government investment in the technology for military applications during WWII and then the Cold War.
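For the curious, the slide-rule principle can be captured in a few lines of Python. This is purely an illustrative sketch of the idea (the numbers and function name are our own, not tied to any historical device):

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply two positive numbers the way a slide rule does:
    represent each as a length proportional to its logarithm,
    add the two lengths, then read the product back off the scale."""
    length_a = math.log10(a)            # position of a on the fixed scale
    length_b = math.log10(b)            # length added by the sliding scale
    return 10 ** (length_a + length_b)  # log(a) + log(b) = log(a * b)

print(round(slide_rule_multiply(2.0, 8.0), 6))  # 16.0
```

Adding lengths on a logarithmic scale is exactly multiplication, which is why a slide rule needs no moving gears at all.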

Colossus

Colossus was the world’s first electronic programmable computing device, designed by engineer Tommy Flowers and operational in 1944. It used a large number of valves (vacuum tubes). It had paper-tape input and could be configured to perform a variety of Boolean logical operations on its data, but it was not Turing-complete. The Colossus computers were used by British codebreakers during World War II to help in the cryptanalysis of the Lorenz cipher. Details of their existence, design and use were kept secret well into the 1970s. Two of the machines were transferred to the newly formed GCHQ and the others were destroyed; as a result, the machines were not included in many histories of computing. A reconstructed working copy of one of the Colossus machines is now on display at Bletchley Park.

Colossus was not a general-purpose machine, being designed for a specific cryptanalytic task involving counting and Boolean operations. Not being widely known, it had little direct influence on the development of later computers; EDVAC was the early design with the most influence on subsequent computer architecture. However, the technology of Colossus, and the knowledge that reliable high-speed electronic digital computing devices were feasible, had a significant influence on the development of early computers in the United Kingdom and probably in the US. A number of people who were associated with the project and knew all about Colossus played significant roles in early computer work in the UK.

Altair

It was in the 1950s that we saw the emergence of the first commercial computers. They bore little resemblance to the computers we see today, being enormous machines. It wasn’t until the 1970s that the microprocessor emerged, facilitating the rise of the desktop PC. The MITS Altair 8800, built around Intel’s 8080 microprocessor, was the first to reach the commercial market, but it came in kit form and could only be given instructions through switches and lights. There were no keyboards or mice. The computer also required a programming language to make it useful, the first of which was created by Bill Gates and Paul Allen.

The Altair created a buzz amongst computer hobbyists, who were looking to move away from a kit to a pre-assembled computer available out of the box. No longer would you need the knowledge to build a kit yourself. And this is where the founders of the Apple Computer company, Steve Wozniak and Steve Jobs, first came into the picture.

The first IBM Desktop Computer

They had the genius to put all of the necessary chips onto one computer board and the vision to build it. The Apple II was released as a result and was extremely successful commercially. Then, in 1980, IBM appeared. Before IBM, microcomputers were largely ignored in the business world. IBM developed the hardware required and enlisted a Microsoft team led by Bill Gates to create the programming language and operating system. Unfortunately for IBM, off the back of its success Compaq figured out how to clone IBM’s proprietary BIOS firmware and developed a computer that helped erode IBM’s dominance of the PC industry.

In 1984, after Steve Jobs’ famous visit to Xerox PARC (a research facility in Palo Alto, California, set up by the copier company Xerox, out of which came the mouse, the graphical user interface, Ethernet networking and WYSIWYG printing), Apple introduced the Macintosh and sparked the graphical user interface revolution. It would become the largest non-IBM-compatible personal computer series ever introduced. The original Macintosh had almost no software and was not well accepted in business. The first successful software written for it came from Microsoft, who developed Word and Excel for the Mac.

Apple Mac

John Warnock later co-founded Adobe Systems and created a revolution in the publishing world with his PostScript page description language. Adobe PostScript helped make the Macintosh a success and started a new industry: desktop publishing. But PostScript alone was not enough. To be useful, the Mac needed a software program capable of driving PostScript to lay out type and pages. This came from a small start-up named Aldus; the program was PageMaker, and it revolutionised the industry. Adobe soon released Illustrator, another program designed to work hand in hand with PostScript to produce a new type of graphics called vector graphics. With the arrival of colour computers in the 1980s came Photoshop, created by the Knoll brothers, which completed the circle and made desktop publishing an industry of its own.

3D Graphics and Animation

The developments in computer technology and computer graphics also paved the way for the use of 3D animation in film. The technology has since been used in many other industries too.

Though there are many contributors to computer animation, 3D animation is often attributed to William Fetter. William Fetter worked for Boeing during the 1960s using computers to animate and design certain models. One of his projects involved making what came to be known as “The Boeing Man.” It was a three-dimensional representation of the human body. It was then that Fetter coined the term “computer graphics.”

William Fetter’s Boeing Man

Other key names in the development of 3D animation were Ivan Sutherland, who developed Sketchpad and the first computer-controlled head-mounted display (HMD), allowing the viewer to see a computer scene in stereoscopic 3D, and Ed Catmull, who originally created an animation of his hand opening and closing and went on to introduce parametric patch rendering, the z-buffer algorithm and texture mapping (a method of taking a flat 2D image of what an object’s surface looks like and applying that flat image to a 3D computer-generated object; a simple sketch follows below).
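To make the texture-mapping idea concrete, here is a minimal, hypothetical Python sketch. The tiny checkerboard “image” and the nearest-neighbour lookup are our own illustrative choices, not Catmull’s original method:

```python
# A toy 2x2 checkerboard standing in for the flat 2D image of a surface.
texture = [
    [(255, 255, 255), (0, 0, 0)],  # row 0: white, black
    [(0, 0, 0), (255, 255, 255)],  # row 1: black, white
]

def sample_texture(u: float, v: float):
    """Map texture coordinates (u, v) in [0, 1] to a colour in the image.

    Each point on a 3D surface carries (u, v) coordinates assigned by the
    modeller; shading that point is then just a lookup into the flat image.
    Nearest-neighbour sampling is used here for simplicity; real renderers
    filter between neighbouring texels.
    """
    height, width = len(texture), len(texture[0])
    x = min(int(u * width), width - 1)    # column on the flat image
    y = min(int(v * height), height - 1)  # row on the flat image
    return texture[y][x]

print(sample_texture(0.1, 0.1))  # (255, 255, 255): lands on a white square
print(sample_texture(0.9, 0.1))  # (0, 0, 0): lands on a black square
```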

Pong arcade game

The 1970s saw the emergence of the gaming industry (Atari created Pong, and Namco’s Pac-Man followed in 1980) and the formation of a new computer graphics division at Lucasfilm that would create computer imagery for motion pictures, viewed by many as a major milestone in the history of computer graphics.

CGI was first used in movies in 1973, in the science fiction film Westworld. Its sequel, Futureworld (1976), featured the first use of 3D wire-frame imagery. The third film ever to use this technology was Star Wars (1977), for the Death Star plans and the targeting computers in the X-wings and the Millennium Falcon, Han Solo’s ship. Later, The Black Hole (1979) used raster wire-frame model rendering to create a black hole onscreen. That same year, Ridley Scott’s Alien used the same raster wire-frame technique to render the navigation monitors in the scene where the spaceship follows a beacon for landing guidance.

Autodesk’s Animator

In the early 1980s, AutoCAD was born with the formation of Autodesk Inc. and became a commercial success, helping move computer graphics into the world of personal computers. Later, Autodesk released a new PC-based animation package called Autodesk Animator. As a full-featured 2D animation and painting package, Animator was Autodesk’s first step into the multimedia tools realm.

Another major milestone of the 1980s for computer graphics was the founding of Silicon Graphics Inc. (SGI) by Jim Clark in 1982. SGI focused its resources on creating the highest-performance graphics computers available. These systems offered built-in 3D graphics capabilities, high-speed RISC (Reduced Instruction Set Computer) processors and symmetric multiprocessor architectures. The following year, in 1983, SGI rolled out its first system, the IRIS 1000 graphics terminal. In the same year, Tom Brigham created “morphing”, destined to become a required tool for anyone producing computer graphics or special effects in the film or television industry.

Around 1985, multimedia started to make its big entrance. The International Organization for Standardization (ISO) created the first standard for the Compact Disc Read-Only Memory (CD-ROM). Today multimedia is a major marketplace for personal computer 3D animation.

Tin Toy (Image credit: Pixar)

In the late 80s, Disney formed its Computer Generated Imagery (CGI) department, which went on to work on films such as “The Little Mermaid,” “The Rescuers Down Under,” “Beauty and the Beast” and “Aladdin,” and grew in size off the back of their success. Pixar developed RenderMan in 1988, a standard for describing 3D scenes. The RenderMan standard describes everything the computer needs to know before rendering a 3D scene: the objects, light sources, cameras, atmospheric effects, and so on. In 1988, Pixar made history when it created Tin Toy, a film made entirely with 3D computer graphics using Pixar’s RenderMan; it went on to win an Academy Award.

Then, in the 1990s, Microsoft shipped Windows 3.0, which followed a GUI structure similar to the Apple Macintosh’s and laid the foundation for future growth in multimedia.

Star Wars: Empire at War (publisher: LucasArts)

NewTek, a company founded in 1985, also released the Video Toaster, a video production card for Amiga personal computers, and Autodesk shipped its first 3D computer animation product, 3D Studio, which would later become the leading PC-based 3D computer animation software.

The rest of the decade saw the release of the first full-length, fully 3D computer-animated and rendered motion picture. It came from Pixar and was called Toy Story (1995). That year also saw another graphics revolution: Sony released its PlayStation games console worldwide. Until then, video game consoles had only managed to display 2D graphics.

It took a Star Wars movie to impress audiences again. The long-awaited prequel to the earlier Star Wars movies, The Phantom Menace, was released in May 1999. As expected, it was extremely successful at the box office, surpassed only by Titanic and the original Star Wars movie. What amazed most was not the quality of the CGI but the sheer amount of it: some 95% of the imagery was digitally manipulated in one way or another.

PlayStation 3 graphics

As we entered the 21st century, graphics software reached new heights in quality and user accessibility; PC displays supported real-time texture mapping; flatbed scanners, laser printers and digital video cameras became commonplace; programming languages moved toward Java and C++; and 3D modelling captured facial expressions, hair, water and other elements formerly difficult to render. The next generation of video consoles was also released: Sony’s PlayStation 3, Nintendo’s Wii (the successor to the GameCube) and Microsoft’s Xbox 360.

Mobile, Marketing and Advertising

iPad mini

In the last decade we have seen the rise of mobile platforms, with devices such as the smartphone and the iPad coming to market that allow us to view multimedia on the move. As computer graphics hardware has evolved, its cost has gradually fallen to the point where it is no longer prohibitively expensive. It’s now much more accessible to a wider market, meaning that companies can access the plethora of benefits this technology can bring to all areas of business. It is no longer just the realm of the film and gaming industries.

Animations and 3D graphics are now also commonplace in advertising, promotional material, corporate video, concept design, product development and educational material. Digital has not only brought down the cost of producing and maintaining this collateral, but has also allowed for easy updates without the great expense of reprinting, and has given companies the ability to visualise concepts and ideas in a real-time environment prior to their development and implementation.

A new generation of sales and marketing techniques has emerged. Born out of our constant desire to innovate have come trade shows such as TFMA (Technology for Marketing and Advertising) and Internet World, which focus solely on online marketing and the use of digital channels and technology within the industry. These developments over the last decade or so have changed the consumer environment, and we have witnessed the rise of marketing automation, which in turn has changed consumer expectations. Buyers now expect to see highly advanced graphics displayed on digital devices. They also expect a more personalised experience tailored to their interests, or the ability to create or customise their own products on a digital device. The rise of database technologies, alongside computer graphics and coding languages, has allowed us to deliver this to our consumers, with more powerful marketing and sales tools at our disposal. The recent mobile revolution has added to the reliance we had already developed on digital technology and computer graphics in our daily lives, and to the obsession we have with constant innovation to create the next ‘revolution’.

What next?

So where do you think we are heading next? What’s the next step in the digital revolution, and what will our consumers expect? Are we heading towards an autonomous society where our actions are led entirely by virtual minds and not our own? We are already seeing computers drive robots that can control our vehicles without human intervention and help educate our children. Could the human mind end up becoming too clever for its own good and make itself redundant? And will the mobile revolution mean that we can never stop? Constantly on the move, could society burn itself out?

A long way off, but food for thought.

Let us know what you think in the comments.

__________________________________________

PAULEY can help you create innovative and engaging interactive sales and marketing collateral and work with you to develop your digital strategy, including offering advice on DAM (Digital Asset Management). If you would like to chat to us about any ideas or concerns you may have, please call us on 01908 522532 or email info@pauley.co.uk. We would be more than happy to offer a free consultation. To find out more about us and what we do, visit our website: http://www.pauley.co.uk
