SIGGRAPH 2006 Report
Emerging Technologies at SIGGRAPH 2006
The Emerging Technologies venue at SIGGRAPH 2006 held some real treasures. I had a chance to walk through the venue with Tom Craven, the SIGGRAPH 2006 Emerging Technologies Chair. The venue held a wide variety of displays, including a special area known as the "Fusion Midway," where art and technology were used together.
Front and center in the Emerging Technologies venue was a large array of white balloons connected to computer controlled motors that moved the balloons up and down to create interesting and intriguing patterns. This display was created by Masahiro Nakamura and a team at the University of Tsukuba in Japan. The display was pure eye candy enhanced with lights that changed the colors of the balloons. It provided a fitting welcome to the venue that featured the future of interactive technologies.
The Virtual Humanoid display featured a human-shaped robot draped with green cloth onto which a character's image could be projected. It was intended to enhance VR simulations by providing a physical object that you can interact with. When you shake the robot's hand, the computer senses your actions and transfers them to the person at the other end. Providing a physical model to interact with gives the system the tactile feedback essential to making the interactions feel real.
The Forehead Retina System display was developed to help visually impaired people navigate the real world. The system includes a camera that captures an image of the objects directly in front of the wearer. The captured image is then analyzed and relayed to a set of 512 electrodes mounted within a headband. By learning to interpret these impulses, a visually impaired person can navigate a room of obstacles unaided.
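As a rough sketch of how a camera frame might be reduced to a 512-electrode pattern: the block-averaging approach, the 16-by-32 grid layout, and the on/off threshold below are all assumptions for illustration, not details reported from the exhibit.

```python
# Hypothetical sketch: downsample a grayscale camera frame onto a
# 16x32 (= 512) electrode grid by block averaging, then threshold
# each block to an on/off electrode state. Layout is assumed.

def frame_to_electrodes(frame, rows=16, cols=32, threshold=128):
    """frame: list of rows of 0-255 pixel values.
    Returns a rows x cols grid of 0/1 electrode states."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // rows, w // cols
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [frame[r * bh + i][c * bw + j]
                     for i in range(bh) for j in range(bw)]
            avg = sum(block) / len(block)
            row.append(1 if avg >= threshold else 0)
        grid.append(row)
    return grid

# Example: a 64x64 frame whose right half is bright.
frame = [[0] * 32 + [255] * 32 for _ in range(64)]
grid = frame_to_electrodes(frame)
```

A real system would also need edge detection and a mapping from electrode state to stimulation intensity, but the spatial reduction step would look something like this.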
The Virtual Open Heart Surgery display featured work that is currently in use in Denmark. It allows doctors to practice open-heart surgery in a non-invasive manner. It also allows MRI data from surgery patients to be loaded into the simulator, so a doctor can rehearse on the exact conditions they will face in an upcoming surgery.
The noisiest display in the venue was the 3D Laser Display. By crossing two laser beams within a controlled area, a bright dot of light (and its accompanying crackle) appears. Using computers to control the precise position of each laser, the team created a series of simple shapes that moved about the area. This could easily be a precursor to true 3D holographic displays.
The Touch Screen Wall display featured a large 16-ft. screen on which a series of projected images was displayed. The wall allowed multiple users to interact with the images simultaneously, providing an excellent interface for collaborative work.
Mitsubishi Electric Research Laboratories (MERL) featured three unique demonstrations that used water. The water harp let users create music by interrupting continuous streams of water descending between two sensors. The second demonstration placed a display screen under a large square dish of water; when the water was disturbed, the display showed fish fleeing from the disturbed area. The third demonstration featured a fountain of water whose height could be altered by moving one's hands closer to or farther from the fountain. This worked by changing the capacitance of the electrical system.
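The capacitive fountain idea can be sketched in a few lines: a nearby hand raises the measured capacitance above a baseline, and that increase is scaled into a pump setting. The baseline, scale factor, and units below are purely illustrative; the exhibit's actual sensing electronics were not described.

```python
# Hypothetical sketch of capacitance-to-fountain-height mapping.
# All constants (baseline, scale, max height) are assumptions.

def fountain_height(capacitance_pf, baseline_pf=10.0, max_height_cm=50.0):
    """A hand near the sensor raises the measured capacitance above
    the baseline; scale that increase into a pump height setting."""
    delta = max(0.0, capacitance_pf - baseline_pf)
    # Assume ~1 cm of height per 0.1 pF of change, clamped to the
    # pump's maximum output.
    return min(max_height_cm, delta * 10.0)
```

With no hand present the reading sits at the baseline and the fountain stays low; as a hand approaches, the capacitance (and thus the fountain) rises until the pump saturates.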
The Powered Shoes display let users of a VR system move through a virtual world by walking on motorized, wheeled shoes that keep the wearer in the same place. By reproducing the actual motion of walking, the shoes make the simulation more convincing.
Of the several different haptics displays, two were notable. A team from the Tokyo Institute of Technology developed a system they called the "Powder Screen," which included a pool filled with polystyrene beads. The system was demonstrated with a haptic fishing device that allowed users to fish in a virtual stream and experience the thrill of struggling to land a fish.
The second notable haptic display was titled Perceptual Attraction Force. It used a force-feedback system to simulate accelerations and an increase in weight. The device was designed to be mobile and could be used outdoors.
In the Fusion Midway area, the Digiwall display featured a rock-climbing wall with several handgrips. Each grip could be lighted, and sensors indicated when it was grabbed. Using these sensors, the team created several games, such as one in which a player climbs the wall to touch a lighted grip, at which point another grip lights up.
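The "chase the light" game described above reduces to a simple control loop: light a grip, wait for its sensor to fire, then pick another. The sketch below is a guess at that logic; the function names, polling model, and round structure are all assumptions, not Digiwall's actual software.

```python
# Hypothetical sketch of the Digiwall-style game loop: light one grip,
# wait until its sensor reports a grab, then light another at random.

import random

def play_round(grips, is_grabbed, set_light, num_targets=5, rng=random):
    """grips: list of grip ids.
    is_grabbed(grip): polls that grip's sensor (True when grabbed).
    set_light(grip, on): turns that grip's lamp on or off.
    Returns the sequence of targets the climber reached."""
    sequence = []
    for _ in range(num_targets):
        target = rng.choice(grips)
        set_light(target, True)
        while not is_grabbed(target):   # busy-poll the grip sensor
            pass
        set_light(target, False)
        sequence.append(target)
    return sequence
```

In a real installation the loop would also debounce the sensors and time out idle climbers, but the game itself is just this light-then-wait cycle.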