Tag Archives: 3D graphics


PAULEY Showcases 3D E-Learning at Space Situational Awareness Conference

  • PAULEY sponsors first Space Situational Awareness Conference
  • Collective efforts to ‘clean up’ low Earth orbit are essential
  • Virtual 3D environments could help tackle the problem

PAULEY were delighted to sponsor the inaugural Space Situational Awareness Conference 2013. 

We were invited to showcase our virtual reality visualisation of space using our Oculus Rift developer kit. The 80 international delegates, from research laboratories to government departments and private companies, were queuing up at our stand throughout the two days to take the immersive trip into space.

We garnered some great feedback over the two days of the conference. But why might accurate visualisation of space be such an important asset in the years to come? And how could we help?

Space situational awareness gains urgency


As Alfonso Cuarón’s Gravity wins plaudits for its portrayal of astronauts fighting for survival after a devastating mid-space collision, we’re becoming increasingly aware of the complex machinery orbiting beyond our atmosphere.

The central plot of the film – in which debris from a destroyed satellite sweeps catastrophically around Earth – isn’t that preposterous. While the movie portrays spacecraft as being far closer together than they are in reality, we are launching new objects into orbit all the time.

And collisions do happen. In 2009, the satellite Iridium 33 collided with an out-of-service Russian satellite, creating thousands of pieces of debris. While most of that debris is now thought to have burnt up in the atmosphere, the ISS had to perform an avoidance manoeuvre two years after the event.

“Situational space awareness can no longer afford to be ignored,” says our founder, Phil Pauley. “It’s essential that R&D, industry and military organisations continue to join forces.”

Out of sight, out of mind?


In this modern, interconnected world, so much of what we do depends upon space satellites, from communications to weather forecasting, navigation and defence. There are around 1,000 active satellites in orbit today, with a combined value of some €100 billion. They must be protected.

But there are threats to this status quo, in the form of naturally occurring space weather (predominantly solar flares and cosmic rays), asteroids and comets, and man-made space debris.

The debris issue is a growing problem. It is at its worst in low Earth orbit, where the majority of satellites used for observation, communications and military surveillance operate.

Some 20,000 items of ‘space junk’ larger than a mobile phone are being tracked, and half a million smaller fragments are circling our planet. Travelling at speeds of around 5 miles per second, they can do far more harm than you might think.

Just because we can’t see space debris from Earth doesn’t mean it isn’t there. Tools such as ours could help visualise the true extent of the situation, make it real, and open up ways to address the problem and find solutions.

This month’s conference was recognition of the fact that something must be done. Despite political, military and international boundaries, it seems that those invested in space must start working together to take collective responsibility for debris.

Moving towards collective responsibility

Suggested approaches include launching ‘clean-up’ missions to collect large, disused and hazardous objects. Rockets armed with harpoons, robotic arms or nets could collect space junk and then either launch it out into a less crowded orbit or swing it back into the Earth’s atmosphere to burn up.

How PAULEY could help

The UK and international space industry is growing rapidly, and our reliance on the information gathered and distributed by spacecraft and satellites is booming. There are plenty of challenges and opportunities ahead.

Much discussion at the conference centred on finding ways of incentivising a consistent clean-up process. Might we implement a kind of global space traffic control, perhaps using technology like ours to see what’s happening remotely?

Industry, government and business representatives, some of whom we met at the SSA 2013 Conference, are keen to find new ways of training those involved in the industry, to visualise craft in space, and to begin astronaut preparation in immersive e-learning environments on Earth.


Should Next-Generation Education Use Oculus Rift?

“This technology is going to revolutionize the way we live, learn, work, and play.”
– Palmer Luckey, founder of Oculus VR

It might be time to move virtual reality out of that ‘cool things that never came to fruition’ box. The Oculus Rift is coming to town, and it’s bringing with it not only a revolutionary approach to gaming, but applications that stretch into education and e-training.

Oculus VR founder Palmer Luckey certainly views his device in a broader context: “Virtual reality provides more freedom for content creators than any other form, and allows us to simulate other art forms like movies, books, or traditional games. In that sense, it is the ultimate medium.”

Here’s four ways in which we think VR could become the ultimate educational tool.

1. Escaping the classroom
Research has shown that game-based learning exploits the natural competitive instinct in order to motivate, encourage and reward productive behaviour in the classroom. So surely VR could make this an even more powerful learning experience?

Naysayers insist that few educational games based on standard computers have made it into schools. Some suggest that too many adults associate video games with the propagation of violence, sleepless nights and an unhealthy obsession with artificial worlds and avatars.

But World of Warcraft has been successfully translated to the classroom, ‘gamifying’ the school day to increase productivity and pupil satisfaction. Similarly, a modification to the sandbox game Minecraft is being used in over 1,000 schools to create hypothetical scenarios and reconstruct history.

The opportunities are boundless: PublicVR, for example, have created a virtual forest in which students can record measurements and make observations on tree species, canopy closure and tree biomass. It’s just one excellent example of VR-based experiential learning.

2. Learning from a distance
Imagine being able to join a lecture by one of the world’s top scientists from thousands of miles away. You’d be wearing a headset, with headphones in your ears, and you’d see each other only as avatars. But would it be as engaging as being there in person?

Research suggests so. This type of digital teacher-student interaction could be even more valuable than the real thing by utilising ‘augmented gaze’. This involves digitally manipulating the avatar of the presenter to make constant direct eye contact with every participant separately.

Behavioural studies show that this simple strategy increases attention, naturally regulates conversation, and heightens physiological responses. The presenter or teacher becomes more influential and more persuasive as a result. We read this as better education for all.

3. Skills-based training
Practice makes perfect, but it’s not always practical. At PAULEY, we’ve created interactive and cost-saving e-training tools for companies that can’t always access the ‘real thing’, whether it’s checking the safety of train engines or training police officers to use new hardware.

VR really comes into its own in this arena, and hardware such as the Oculus Rift provides the closest thing to reality we can currently achieve in digital terms.

It’s no surprise that gory operation game Surgeon Simulator 2013 has already been adapted for Oculus Rift. Could something similar be used to train surgeons and health care professionals in complex surgical procedures?

4. Meditation & reassurance
Primary school teacher Mathieu Marunczyn has been using the Oculus Rift to help manage students with disabilities such as Autism Spectrum Disorder (ASD) and Sensory Processing Disorders (SPD).

He’s found software such as BlueMarble has a remarkable ability to calm down students – something he’s dubbed ‘digital meditation’: “The student was immediately engaged and was calmly yet actively exploring the world he became immersed in. He was no longer physically ‘acting-out’ and I noticed that his whole body became more relaxed.”

The technology could also be used to help children and adults with learning difficulties, or disorders such as those on the autism spectrum, to practise social exchanges and real-life situations in safe, controlled environments. Access to a fine-tuned 3D environment would allow people to repeat certain processes – perhaps recognising emotional cues, or the correct way to interact with a sales assistant – until appropriate behaviours are achieved.

Here at PAULEY, we think that experiential learning through VR and digital technologies is justifiably on the up. The benefits are numerous for streamlining efficiencies across multiple sectors. Why shouldn’t we imagine a not-too-distant future in which a Ray Mears-esque avatar leads students on virtual school trips through jungles and across mountains?

Let us know what you think.


Living & Working Through The Digital Revolution – What Can We Expect Next?

Having lived through a large part of the life cycle of computer technology and graphics, we found that looking back was a remarkable reminder of just how far we have come in such a short timeframe. In this blog, we look back at the key developments of the last 50 years and ask where we can expect to head next – what will be our next milestone?

There is a generation of us out there who began our working lives, or even studied at university or school, just before the internet became an essential part of daily life and before digital technology had such power to influence our day-to-day decisions and dictate our behaviour. We have therefore experienced the dramatic changes that have taken place in society over the past few decades and grown with the digital revolution.

Without the computer technology now available to us through constant innovation and a strong air of competition amongst entrepreneurs, we would be living in a very different world – one with a fraction of the visual stimulation we see all around us on a daily basis. And it is quite startling when you start to look at how quickly things have evolved. If we had to travel back in time to the 1960s, we would struggle to get used to life without our gadgets and visual media. How would we cope?

Considering the quality and realism we see in computer graphics today, it’s hard to imagine that the field didn’t even exist 50 years ago. Yet even today, the technology continues to evolve at a rapid pace. And while companies have come and gone over the years, the people haven’t: most of the early pioneers are still active in the industry and just as enthusiastic about the technology as when they first started. Companies are now forced to keep up with developments in order to stay in business, as the consumer world changes day by day and new technology shapes behaviours and expectations.

What started as a monochrome command-line prompt has grown into a multicoloured, multitasking, multimedia masterpiece!

So when did it all begin?

Computing hardware evolved from machines that needed separate manual action to perform each arithmetic operation, to punched-card machines, and then to stored-program computers. Before the development of general-purpose computers, most calculations were done by humans.

Aside from written numerals, the first aids to computation were purely mechanical devices that required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain a result. A sophisticated example is the slide rule, in which numbers are represented as lengths on a logarithmic scale and computation is performed by setting a cursor and aligning sliding scales, thus adding those lengths.

Numbers could also be represented in a continuous “analog” form, where a voltage or some other physical property was set to be proportional to the number. Analog computers, like those designed and built by Vannevar Bush before World War II, were of this type. Alternatively, numbers could be represented as digits, automatically manipulated by a mechanical mechanism. Although this last approach often required more complex mechanisms, it made for greater precision of results. In the United States, the development of the computer was underpinned by massive government investment in the technology for military applications during WWII and then the Cold War.
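To make the slide-rule principle concrete, here is a minimal Python sketch of our own (purely illustrative, not period code) showing multiplication performed by adding logarithmic ‘lengths’:

import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply the way a slide rule does: represent each number as a
    length proportional to its logarithm, add the two lengths, then
    read the product back off the logarithmic scale."""
    length_a = math.log10(a)  # position of 'a' on the fixed scale
    length_b = math.log10(b)  # extra length added by the sliding scale
    return 10 ** (length_a + length_b)

print(slide_rule_multiply(3.0, 4.0))  # ~12.0, to floating-point precision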

Colossus

Colossus was the world’s first electronic programmable computing device, designed by engineer Tommy Flowers and operational in 1944. It used a large number of valves (vacuum tubes) and had paper-tape input, and it could be configured to perform a variety of Boolean logical operations on its data, but it was not Turing-complete. The Colossus computers were used by British codebreakers during World War II to help in the cryptanalysis of the Lorenz cipher. Details of their existence, design and use were kept secret well into the 1970s: two of the machines were transferred to the newly formed GCHQ and the others were destroyed. As a result, the machines were not included in many histories of computing. A reconstructed working copy of one of the Colossus machines is now on display at Bletchley Park.

Colossus was not a general-purpose machine, being designed for a specific cryptanalytic task involving counting and Boolean operations. Because it was not widely known, it had little direct influence on the development of later computers; EDVAC was the early design with the most influence on subsequent computer architecture. However, the technology of Colossus, and the knowledge that reliable high-speed electronic digital computing devices were feasible, had a significant influence on the development of early computers in the United Kingdom and probably in the US. A number of people who were associated with the project and knew all about Colossus went on to play significant roles in early computer work in the UK.

Altair

It was in the 1950s that we saw the emergence of the first commercial computers. They bore little resemblance to the computers we see today, filling entire rooms. It wasn’t until the 1970s that the microprocessor emerged, facilitating the rise of the desktop PC. The MITS Altair 8800, built around Intel’s 8080 microprocessor, was the first to reach the commercial market, but it came in kit form and could only be given instructions through switches and lights – there were no keyboards or mice. The computer also required a programming language to make it useful, the first of which was created by Bill Gates and Paul Allen.

The Altair created a buzz amongst computer hobbyists, who were soon looking to move from a kit to a pre-assembled computer available out of the box – no longer would you need the knowledge to build one yourself. And this is where the founders of the Apple Computer company, Steve Wozniak and Steve Jobs, first came into the picture.

The first IBM Desktop Computer

They had the genius to put all of the necessary chips onto one computer board and the vision to build it. The Apple II was released as a result and was extremely successful commercially. Then, in 1980, IBM appeared. Before IBM, microcomputers were largely ignored in the business world. IBM developed the hardware required and enlisted a Microsoft team led by Bill Gates to create the programming language and operating system. Unfortunately for IBM, off the back of its success Compaq figured out how to clone IBM’s proprietary BIOS and developed a computer that would eventually help push IBM out of the PC industry.

In 1984, following Steve Jobs’s earlier visit to Xerox PARC (a research facility in Palo Alto, California, set up by copier company Xerox, out of which came the mouse, the graphical user interface, the first computer network – Ethernet – and WYSIWYG printing), Apple introduced the Macintosh and sparked the graphical user interface revolution. It would become the largest non-IBM-compatible personal computer series ever introduced. The original Macintosh had almost no software and was not well accepted in business. The first successful software written for it came from Microsoft, which wrote Word and Excel originally for the Mac.

Apple Mac

John Warnock later co-founded Adobe Systems and created a revolution in the publishing world with his PostScript page description language. Adobe PostScript helped make the Macintosh a success and started a new industry called desktop publishing. But PostScript was not enough: to be useful, the Mac needed a software program capable of manipulating PostScript to lay out type and pages. This came from a small start-up company named Aldus; the program was named PageMaker, and it revolutionised the industry. Adobe shortly released Illustrator, another program designed to work hand in hand with PostScript to produce a new type of graphics called vector graphics. With the arrival of colour computers in the 1980s came Photoshop, invented by the Knoll brothers, which completed the circle and made desktop publishing an industry of its own.

3D Graphics and Animation

The developments in computer technology and computer graphics also made way for the use of 3D animation in film. The technology has since been used in many other industries too.

Though there are many contributors to computer animation, 3D animation is often attributed to William Fetter, who worked for Boeing during the 1960s using computers to animate and design certain models. One of his projects involved making what came to be known as “The Boeing Man”, a three-dimensional representation of the human body. It was then that Fetter coined the term “computer graphics”.

William Fetter’s Boeing Man

Other key names in the development of 3D animation were Ivan Sutherland, who developed Sketchpad and the first computer-controlled head-mounted display (HMD), allowing the viewer to see a computer scene in stereoscopic 3D, and Ed Catmull, who created an early animation of his hand opening and closing and went on to introduce parametric patch rendering, the z-buffer algorithm and texture mapping (a method of taking a flat 2D image of what an object’s surface looks like and applying that flat image to a 3D computer-generated object).
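For the curious, here is a minimal Python sketch of our own (not Catmull’s original code) combining the two ideas just mentioned – a z-buffer depth test and a texture lookup on a toy 2x2 image:

WIDTH, HEIGHT = 4, 4
INF = float("inf")

# One depth value and one colour per pixel; every pixel starts "infinitely far away".
zbuffer = [[INF] * WIDTH for _ in range(HEIGHT)]
framebuffer = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)]

# A tiny 2x2 "texture": the flat 2D image that gets applied to a surface.
texture = [[(255, 0, 0), (0, 255, 0)],
           [(0, 0, 255), (255, 255, 0)]]

def sample_texture(u, v):
    """Texture mapping: map coordinates (u, v) in [0, 1] to a texel."""
    return texture[min(int(v * 2), 1)][min(int(u * 2), 1)]

def draw_fragment(x, y, depth, u, v):
    """Z-buffer test: keep this fragment only if it is nearer than
    whatever has already been drawn at the same pixel."""
    if depth < zbuffer[y][x]:
        zbuffer[y][x] = depth
        framebuffer[y][x] = sample_texture(u, v)

draw_fragment(1, 1, depth=5.0, u=0.2, v=0.8)  # drawn
draw_fragment(1, 1, depth=9.0, u=0.9, v=0.1)  # hidden behind the first
print(framebuffer[1][1])  # colour of the nearer fragment: (0, 0, 255)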

Pong arcade game

The 1970s saw the emergence of the gaming industry (Atari’s Pong arrived in 1972, with Namco’s Pac-Man following in 1980) and the formation of a new computer graphics division at Lucasfilm that would create computer imagery for motion pictures – viewed by many as a major milestone in the history of computer graphics.

CGI was first used in movies in 1973, in the science fiction film Westworld. Its sequel, Futureworld (1976), featured the first use of 3D wireframe imagery. The third film to use the technology was Star Wars (1977), for the Death Star plans and the targeting computers in the X-wings and the Millennium Falcon, Han Solo’s ship. Later, The Black Hole (1979) used raster wireframe rendering to create a black hole onscreen. That same year, Ridley Scott’s Alien used raster wireframe models to render the navigation monitors in the scene where the spaceship follows a beacon for landing guidance.

Autodesk’s Animator

In the early 1980s, AutoCAD was born with the formation of Autodesk Inc and became a commercial success, helping move computer graphics into the world of personal computers. Later, Autodesk released a new PC-based animation package called Autodesk Animator. A full-featured 2D animation and painting package, Animator was Autodesk’s first step into the multimedia tools realm.

Another major milestone of the 1980s was the founding of Silicon Graphics Inc. (SGI) by Jim Clark in 1982. SGI focused its resources on creating the highest-performance graphics computers available. These systems offered built-in 3D graphics capabilities, high-speed RISC (Reduced Instruction Set Computer) processors and symmetric multi-processor architectures. The following year, 1983, SGI rolled out its first system, the IRIS 1000 graphics terminal. It was around the same time that Tom Brigham created “morphing”, destined to become a required tool for anyone producing computer graphics or special effects in the film or television industry.
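As a rough illustration, here is a hedged Python/NumPy sketch of our own showing the cross-dissolve half of a morph; a full morph, as Brigham demonstrated it, also warps the geometry of both images toward shared correspondence points before blending:

import numpy as np

def cross_dissolve(img_a: np.ndarray, img_b: np.ndarray, t: float) -> np.ndarray:
    """Blend two same-sized images; t runs from 0 (all A) to 1 (all B).
    A full morph also warps the geometry of both images toward a shared
    set of correspondence points before blending - omitted here."""
    blended = (1.0 - t) * img_a.astype(float) + t * img_b.astype(float)
    return blended.astype(np.uint8)

# Two 2x2 greyscale "frames": one black, one white.
frame_a = np.zeros((2, 2), dtype=np.uint8)
frame_b = np.full((2, 2), 255, dtype=np.uint8)
print(cross_dissolve(frame_a, frame_b, 0.5))  # mid-grey, halfway through the morph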

Around 1985, multimedia started to make its big entrance. The International Organization for Standardization (ISO) created the first standard for Compact Discs with Read-Only Memory (CD-ROM). Today multimedia is a major marketplace for personal computer 3D animation.

Tin Toy (Image credit: Pixar)

In the late 80s, Disney formed its Computer Generated Imagery (CGI) department, which went on to work on films such as “The Little Mermaid,” “The Rescuers Down Under,” “Beauty and the Beast” and “Aladdin,” and grew in size off the back of their success. Pixar developed RenderMan in 1988, a standard for describing 3D scenes: it specifies everything the computer needs to know before rendering a 3D scene, such as the objects, light sources, cameras and atmospheric effects. In 1988, Pixar made history with Tin Toy, a short film created entirely with 3D computer graphics using RenderMan – it went on to become the first computer-animated film to win an Academy Award.
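To give a flavour of what such a scene description must capture, here is a toy example of our own in plain Python – illustrative only, as the real RenderMan interface uses its own RIB file format rather than anything like this:

# A toy scene description (illustrative only; the real RenderMan
# interface uses its own RIB file format, not Python).
scene = {
    "camera": {"position": (0, 2, -10), "look_at": (0, 0, 0), "fov": 45},
    "lights": [{"type": "point", "position": (5, 8, -3), "intensity": 1.0}],
    "objects": [{"shape": "sphere", "radius": 1.0, "centre": (0, 1, 0),
                 "surface": "plastic"}],
    "atmosphere": {"fog_density": 0.02},
    "output": {"resolution": (640, 480), "file": "frame.tiff"},
}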

Then, in the 1990s, Microsoft shipped Windows 3.0, which followed a GUI structure similar to the Apple Macintosh and laid the foundation for future growth in multimedia.

Star Wars: Empire at War (publisher: LucasArts)

NewTek, a company founded in 1985, also released the Video Toaster, a video production card for Amiga personal computers. Meanwhile, Autodesk shipped its first 3D computer animation product, 3D Studio, which would later become the leading PC-based 3D computer animation software.

The rest of the decade saw the release of the first full-length, fully computer-animated and rendered motion picture: Pixar’s Toy Story (1995). The same year brought another graphics revolution when Sony released the PlayStation games console worldwide; until then, video game consoles had only managed to display 2D graphics.

It took a Star Wars movie to impress audiences again. The long-awaited prequel to the earlier Star Wars movies was released in May 1999 and, as expected, was extremely successful at the box office, surpassed only by Titanic and the original Star Wars movie. What amazed most was not the quality of the CGI but the sheer amount of it: some 95% of the imagery was digitally manipulated in one way or another.

Playstation 3 Graphics

As we entered the 21st century, graphics software reached new heights of quality and user accessibility. PC displays supported real-time texture mapping; flatbed scanners, laser printers and digital video cameras became commonplace; programming languages moved towards Java and C++; and 3D modelling captured facial expressions, hair, water and other elements formerly difficult to render. The next generation of video consoles was also released: the PlayStation 3, the successor to Nintendo’s GameCube, and Microsoft’s Xbox 360.

 

Mobile, Marketing and Advertising

iPad mini

In the last decade we have seen the rise of mobile platforms, with devices such as the smartphone and iPad coming to market that allow us to view multimedia on the move. During the evolution of computer graphics hardware, costs have gradually fallen to the point where the technology is no longer prohibitively expensive. It’s now accessible to a much wider market, meaning that companies can tap into the plethora of benefits this technology can bring to all areas of business. It is no longer just the realm of the film and gaming industries.

Animations and 3D graphics are now also commonplace in advertising, promotional material, corporate video, concept design, product development and educational material. Digital has not only brought down the cost of producing and maintaining this collateral; it has also allowed for easy updates without the great expense of reprinting, and has given companies the ability to visualise concepts and ideas in a real-time environment prior to development and implementation.

A new generation of sales and marketing techniques has emerged. Born of our constant desire to innovate, trade shows such as TFMA (Technology for Marketing and Advertising) and Internet World now focus solely on online marketing and the use of digital channels and technology within the industry. These developments over the last decade or so have changed the consumer environment, and we have witnessed a rise in marketing automation which, in turn, has changed consumer expectations.

Buyers now expect to see highly advanced graphics displayed on digital devices. They also expect a more personalised experience tailored to their interests, or the ability to create or customise their own products on a digital device. The rise of database technologies alongside computer graphics and coding languages has allowed us to deliver this, putting more powerful marketing and sales tools at our disposal. The recent mobile revolution has added to the ‘reliance’ we had already developed on digital technology and computer graphics in our daily lives, and to our obsession with constant innovation to create the next ‘revolution’.

What next?

So where do you think we are heading next? What’s the next step in the digital revolution, and what will our consumers expect? Are we heading towards an autonomous society where our actions will be led entirely by virtual minds and not our own? We are already seeing computers drive robots that can control our vehicles without human intervention, and educate our children. Could the human mind end up becoming too clever for its own good, and ultimately redundant? And if the mobile revolution means we can never stop, constantly on the move, could society burn itself out?

A long way off, but food for thought.

Let us know what you think in the comments.

__________________________________________

PAULEY can help you create innovative and engaging interactive sales and marketing collateral and work with you to develop your digital strategy, including offering advice on DAM (Digital Asset Management). If you would like to chat to us about any ideas or concerns you may have, please call us on 01908 522532 or email [email protected]. We would be more than happy to offer a free consultation. To find out more about us and what we do, visit our website at http://www.pauley.co.uk
