As infants begin to investigate their world, the visual and touch channels play a crucial role in world modeling. Studies have shown that when stimuli such as vision and touch do not coincide temporally, i.e., there is a latency in one of the modalities, the child often becomes disengaged or confused. Coherent stimulation of the visual and touch senses presents the child with a sense of cause and effect, known as causality. This sense of causality engages the investigator, is the basis of cognitive engagement, and remains present throughout life. Simulation technology that cannot present a sense of causality disconcerts users and subsequently disengages them; such simulations, when they seek to be didactic, are thus self-defeating.
The Instrumented Gloved Interface and Head Mount Display (IGI/HMD) project will provide the means to interact with a virtual environment as simple or complex as you choose.
The Virtual Technologies CyberGlove is an instrument for measuring the movements of your fingers and hand. The CyberGlove is a glove with 22 sensors that monitor the motions of your hand and fingers. The sensors are located over or near the joints of the hand and wrist. There are three bend sensors on each of the fingers that measure the joint angles. Horseshoe-shaped abduction sensors measure the amount that each finger moves laterally in the plane of the palm. The thumb has an additional sensor that measures how much the thumb rotates toward the pinkie finger. The pinkie has a similar sensor to measure how much it rotates toward the thumb. Finally, there are two wrist sensors, one to measure wrist pitch and one to measure wrist yaw. A Velcro position-sensor mount sits on the top of the wrist.
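One way to picture the sensor layout described above is as a single data record per glove sample. The struct below is only a sketch: the field names are illustrative (not the vendor's API), and the size of the abduction array is an assumption chosen so that the groups described sum to the glove's 22 sensors.

```c
/* Illustrative layout of one CyberGlove sample, mirroring the sensor
 * groups described above. Names and the abduction count are
 * assumptions, not the vendor's actual data format. */
typedef struct {
    float bend[5][3];    /* three bend (joint-angle) sensors per finger */
    float abduction[3];  /* lateral spread in the plane of the palm     */
    float thumb_rotate;  /* thumb rotation toward the pinkie            */
    float pinkie_rotate; /* pinkie rotation toward the thumb            */
    float wrist_pitch;   /* wrist pitch                                 */
    float wrist_yaw;     /* wrist yaw                                   */
} GloveSample;           /* 15 + 3 + 1 + 1 + 2 = 22 sensor values       */
```

Thinking of a sample this way is convenient when you later map raw readings to joint angles for the graphic hand.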
VirtualHand software is used with the CyberGlove. When executed, the VirtualHand program loads a pre-stored default hand-geometry and hand-calibration file (called default.cal for the SGI version). Using the calibration file, the VirtualHand software maps the movement of your physical fingers to the displayed graphic fingers in the correct directions, and by roughly the correct amount. Note: the y-axis of the finger-joint coordinates points along the middle finger, and the z-axis is normal to the back of the hand. The hand-calibration information in default.cal provides the VirtualHand program with the parameters to calculate the mapping from digitized sensor values to the appropriate joint angle in radians. The position sensors provide x, y, z, azimuth (or yaw), elevation (or pitch), and roll position and orientation tracking.
In addition, the CyberTouch system allows the user to program vibratory tactile feedback of varying degrees to individual fingers and the palm. The vibratory actuators are small black cylinders (similar to pager vibrators) attached to the back of each finger and to the palm of the CyberGlove. The actuators are capable of producing short pulses and sustained vibration.
The Polhemus Isotrak II 3-D position sensor provides six-degree-of-freedom tracking, using electromagnetic fields to determine the position and orientation of, in this case, the CyberGlove. The Polhemus system has a transmitter (source) and two smaller receivers (sensors). The sensor (a 3/4" cube) is mounted on the top of the CyberGlove wristband, and the source/transmitter (a 2" cube) is placed on a table in front of you and to the right. The second receiver may be used to track the position of another object. If placed on the HeadMountDisplay, it will track head movement (which can be programmed to correspond to a translation and/or rotation of the scene on the screen). It is the orientation and position of the sensor(s) relative to the source that produces the signal used by the VirtualHand program to move the virtual hand on the screen.
You will be using a Silicon Graphics Infinite Reality Onyx workstation with four R10000 processors and 256 MB of RAM to render the virtual environments.
This project requires knowledge of the C programming language, which those of you who choose this project will be taught throughout the first week of the Summer Institute. However, it may be useful if some or all of the group members have previous experience in C/C++. NOTE: this is NOT a requirement. In addition to C, you will be using the OpenGL graphics system to create your virtual world. OpenGL is a software interface (basically a library of commands) that allows you to create 3-D objects and add color, lighting, material properties, and texture to your scene.
Project ideas: You will be given the freedom to take this project in the direction that you find appropriate and challenging! You will design a scene that goes along with your focus. The "scene" is a cube that the hand moves within. You can import 3-D objects and add color or texture maps to the inner walls of the cube to create this scene. The CyberGlove will provide a method to interact with the virtual environment you create. Some uses of CyberTouch feedback include simulating tactile sensations, such as when the virtual hand touches or squeezes a virtual object. You will also be given access to the HeadMountDisplay; however, the time constraints and complexity of accurately configuring head tracking favor the use of the CyberGlove with a stationary scene. For example, you can add tactile feedback to the scene so that any time a wall or object is touched the actuators fire, simulating a sense of touch.
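The wall-touch idea above can be sketched with a simple bounds test: if the tracked hand position reaches a face of the scene cube, trigger an actuator. The cube size and the fire_actuator() call are stand-ins for whatever your scene and the CyberTouch software actually provide.

```c
#include <stdbool.h>

#define CUBE_HALF 0.5f /* assumed half-width of the scene cube */

/* True when the hand position lies on or outside a wall of the cube
 * centered at the origin. */
bool touching_wall(float x, float y, float z)
{
    return x <= -CUBE_HALF || x >= CUBE_HALF ||
           y <= -CUBE_HALF || y >= CUBE_HALF ||
           z <= -CUBE_HALF || z >= CUBE_HALF;
}

/* In the render loop you might then write something like:
 *
 *     if (touching_wall(hx, hy, hz))
 *         fire_actuator(PALM);   // hypothetical CyberTouch call
 *
 * where fire_actuator() stands in for the vendor's actuator API. */
```

The same test generalizes to objects inside the cube by checking against each object's bounding volume instead of the cube walls.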
Don Stredney, Anna Cherubim and Mireille Robitalle are the OSC coordinators for the Interactive Virtual Environments project. Don's office is in 420D. Please contact one of the coordinators to set up appointment(s) for consultation.