
VIRTUAL REALITY

ABSTRACT: Virtual reality is attracting a great deal of attention. Its usefulness depends on the virtual environment responding to user interaction with extremely low latency. Applying virtual reality to finite element analysis (FEA) and executing FEA interactively provides a comprehensive and intuitive interface to the analysis. The Java 3D API is chosen as the software tool for developing interactive FEA. Virtual reality also makes it possible to build a virtual environment display for the research and rehabilitation of balance disorders, called the Balance NAVE (BNAVE). The BNAVE is a promising tool for rehabilitation; the system uses four PCs, three stereoscopic projectors, and three rear-projected screens. Many fields of daily life use the concept of virtual reality, such as entertainment, education, science, medicine, and defense. Virtual reality simulators are used for training in medical science as well as in the military and in many other fields.
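Since the abstract names the Java 3D API as the tool for the interactive FEA work, the following minimal sketch shows the kind of scene-graph setup such a viewer would start from. It is an illustrative example only: the classes are the standard Java 3D ones, but the ColorCube is merely a placeholder standing in for FEA mesh geometry, and Java 3D is assumed to be available on the classpath.

    import java.awt.BorderLayout;
    import javax.swing.JFrame;
    import javax.media.j3d.BranchGroup;
    import javax.media.j3d.Canvas3D;
    import com.sun.j3d.utils.geometry.ColorCube;
    import com.sun.j3d.utils.universe.SimpleUniverse;

    public class FeaViewerSketch {
        public static void main(String[] args) {
            // Rendering surface plus a ready-made view branch.
            Canvas3D canvas = new Canvas3D(SimpleUniverse.getPreferredConfiguration());
            SimpleUniverse universe = new SimpleUniverse(canvas);

            // Content branch: the ColorCube is a placeholder for the FEA mesh.
            BranchGroup scene = new BranchGroup();
            scene.addChild(new ColorCube(0.3));
            scene.compile();

            // Back the viewer away from the origin so the content is visible.
            universe.getViewingPlatform().setNominalViewingTransform();
            universe.addBranchGraph(scene);

            // Ordinary Swing window hosting the 3-D canvas.
            JFrame frame = new JFrame("Interactive FEA viewer (sketch)");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.add(canvas, BorderLayout.CENTER);
            frame.setSize(640, 480);
            frame.setVisible(true);
        }
    }

An interactive FEA application would replace the placeholder cube with the meshed model and attach behaviors that respond to tracked user input.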

Virtual Reality (VR) is popular name for an absorbing, interactive, Computer-mediated experience in which person perceives a synthetic(simulated) environment by means of special human-computer interface Equipment. It

interacts with simulated objects in that environment as If they were real. Several persons can see one another and interact in shared Synthetic environment such as battlefield. Virtual Reality is a term used to describe a computer generated virtual

Environment that may be moved through and manipulated by a user in real time. A virtual environment may be displayed on a head-mounted display, a computer monitor, or a large projection screen. Head and hand tracking systems are employed to enable the user to observe, move around, and manipulate the virtual environment.

What is VR? Virtual reality has been addressed by a large number of authors in the literature for decades, many of them introducing slightly different meanings to the term. In 1997, Keppell proposed that VR should be looked upon as a situation where a person is immersed into a computer-generated environment that bears strong similarities to reality. Other authors tend to define VR from the point of view of what technological tools are being used, i.e. VR happens when head-mounted visual display units and motion-tracking gloves are present. One could also define VR from a psychological perspective, where it becomes not so much a technology as a state produced in the users' minds that can occupy their awareness in a way similar to that of real environments [Keppell et al., 1997]. The problems involved in finding a definition of VR that can be agreed upon have produced a host of competing terms that some authors prefer, e.g. synthetic environments, cyberspace, artificial reality, simulator technology [Isdale, 1993]. A different way of defining VR, and perhaps the best so far, is to center on the user and look at the style of interaction that takes place between the user and the computer-generated environment. The users manipulate what they perceive to be real objects in the same manner as they would manipulate them in the real world, as opposed to the typing, pointing and clicking you traditionally use to manipulate objects when you interact in other computer environments.

Virtual Environments: Foundations of Virtual Reality:

Virtual Reality (VR) refers to a technology capable of shifting a subject into a different environment without physically moving him or her. To this end, the inputs into the subject's sensory organs are manipulated in such a way that the perceived environment is associated with the desired Virtual Environment (VE) and not with the physical one. The manipulation process is controlled by a computer model that is based on the physical description of the VE. Consequently, the technology is able to create almost arbitrary perceived environments.

Immersion is a key issue in VR systems, as it is central to the paradigm where the user becomes part of the simulated world, rather than the simulated world being a feature of the user's own world. The first immersive VR systems were the flight simulators, where immersion is achieved by a subtle mixture of real hardware and virtual imagery. The term "immersion" describes a property of the technology, which can be achieved to varying degrees. A necessary condition is Ellis's notion of a VE, maintained in at least one sensory modality (typically the visual); for example, a head-mounted display with a wide field of view and at least head tracking would be essential. The degree of immersion is increased by adding additional, consistent modalities, a greater degree of body tracking, richer body representations, decreased lag between body movements and the resulting changes in sensory data, and so on. Astheimer defines immersion as the VR user's feeling that the virtual environment is real. Analogously to Turing's definition of artificial intelligence: if the user cannot tell which reality is "real" and which one is "virtual", then the computer-generated one is immersive.
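The role of tracking lag described above can be sketched as a simple update loop. This is not taken from any particular VR system; Tracker, HeadPose and Renderer below are hypothetical placeholders introduced only to make the idea concrete.

    interface Tracker { HeadPose poll(); }   // returns the most recent head position and orientation
    record HeadPose(double x, double y, double z, double yaw, double pitch, double roll) { }
    interface Renderer { void drawFrame(HeadPose viewpoint); }

    public class ImmersiveLoop {
        public static void run(Tracker tracker, Renderer renderer) {
            while (true) {
                // Sample the tracker as late as possible before drawing, so the image on the
                // display reflects the user's most recent head movement. The shorter the
                // interval between this poll and the frame appearing on screen, the lower the
                // perceived lag and, following the argument above, the stronger the immersion.
                HeadPose latest = tracker.poll();
                renderer.drawFrame(latest);
            }
        }
    }

In a full system the same principle applies to every tracked modality: each sensory channel is refreshed from the newest tracking data available before it is presented.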

DIFFERENT KINDS OF VIRTUAL REALITY: There is more than one type of virtual reality, and there are different schemes for classifying the various types. Jacobson (1993a) suggests that there are four types of virtual reality: (1) immersive virtual reality; (2) desktop virtual reality (i.e., low-cost, homebrew virtual reality); (3) projection virtual reality; and (4) simulation virtual reality. Thurman and Mattoon (1994) present a model for differentiating between types of VR based on several "dimensions." They identify a "verity dimension" that helps to differentiate between types of virtual reality based on how closely the application corresponds to physical reality, and they propose a scale showing the verity dimension of virtual realities (see Fig. 15-1). According to Thurman and Mattoon (1994, p. 57), "The two end points of this dimension - physical and abstract - describe the degree that a VR and entities within the virtual environment have the characteristics of reality." On the left end of the scale, VRs simulate or mimic real-world counterparts which correspond to natural laws. On the right side of the scale, VRs represent abstract ideas that are completely novel and may not even resemble the real world. Thurman and Mattoon (1994) also identify an "integration dimension" that focuses on how humans are integrated into the computer system. This dimension includes a scale featuring three categories: batch processing, shared control, and total inclusion. These categories are based on three broad eras of human-computer integration, culminating with VR: total inclusion. A third dimension of this model is interface, on a scale ranging between natural and artificial. These three dimensions are combined to form a three-dimensional classification scheme for virtual realities. This model provides a valuable tool for understanding and comparing different virtual realities.

Figure 15-1. Thurman and Mattoon's verity scale for virtual reality. (Adapted from Thurman & Mattoon, 1994.)

Another classification scheme has been delineated by Brill (1993; 1994b), and this model will be discussed in detail here. Brill's model features seven different types of virtual reality: (1) immersive first-person; (2) through the window; (3) mirror world; (4) Waldo World; (5) chamber world; (6) cab simulator environment; and (7) cyberspace. Some of Brill's categories of virtual reality are physically immersive and some are not. The key feature of all virtual reality systems is that they provide an environment created by the computer or other media where the user feels present, that is, immersed physically, perceptually, and psychologically. Virtual reality systems enable users to become participants in artificial spaces created by the computer. It is important to note that not all virtual worlds are three-dimensional; this is not necessary to provide an enriching experience. And to explore a virtual world, the user doesn't have to be completely immersed in it: first-person (direct) interaction, as well as second-person and third-person interaction with the virtual world, are all possible (Laurel, 1991; Norman, 1993), as the following discussion indicates.

1. Immersive First-Person: Usually when we think of virtual reality, we think of immersive systems involving computer interface devices such as a head-mounted display (HMD), fiber-optic wired gloves, position-tracking devices, and audio systems providing 3-D (binaural) sound. Immersive virtual reality provides an immediate, first-person experience. With some applications there is a treadmill interface to simulate the experience of walking through virtual space. And in place of the head-mounted display there is the BOOM viewer from Fake Space Labs, which hangs suspended in front of the viewer's face, not on it, so it is not as heavy and tiring to wear as the head-mounted display.

2. Augmented Reality: A variation of immersive virtual reality is augmented reality, where a see-through layer of computer graphics is superimposed over the real world to highlight certain features and enhance understanding. One application of augmented reality is in aviation, where certain controls can be highlighted, for example the controls needed to land an airplane.

3. Through the Window: With this kind of system, also known as "desktop VR," the user sees the 3-D world through the 'window' of the computer screen and navigates through the space with a control device such as a mouse. Like immersive virtual reality, this provides a first-person experience. One low-cost example of a through-the-window virtual reality system is the 3-D architectural design planning tool Virtus WalkThrough, which makes it possible to explore virtual reality on a Macintosh or IBM computer.

4. Mirror World: In contrast to the first-person systems described above, mirror worlds (projected realities) provide a second-person experience in which the viewer stands outside the imaginary world but communicates with characters or objects inside it. Mirror-world systems use a video camera as an input device. Users see their images superimposed on or merged with a virtual world presented on a large video monitor or video-projected image.

5. Waldo World: This type of virtual reality application is a form of digital puppetry involving real-time computer animation. The name "Waldo" is drawn from a science fiction story by Robert Heinlein (1965). Wearing an electronic mask or body armor equipped with sensors that detect motion, a puppeteer controls, in real time, a computer-animated figure on a screen or a robot.

6. Chamber World: A chamber world is a small virtual reality projection theater controlled by several computers that gives users a sense of freer movement within a virtual world than immersive VR systems, and thus a feeling of greater immersion. Images are projected on all of the walls and can be viewed in 3-D with a head-mounted display showing a seamless virtual environment.

7. Cab Simulator Environment: This is another type of "first-person" virtual reality technology that is essentially an extension of the traditional simulator (see 17.4). Hamit (1993) defines the cab simulator environment as: "Usually an entertainment or experience simulation form of virtual reality, which can be used by a small group or by a single individual. The illusion of presence in the virtual environment is created by the use of visual elements greater than the field of view, three-dimensional sound inputs, computer-controlled motion bases and more than a bit of theatre" (p. 428).

The term "cyberspace" was coined by William Gibson in the science fiction novel Neuromancer (1986), which describes a future dominated by vast computer networks and databases. Cyberspace is a global artificial reality that can be visited simultaneously by many people via networked computers. Cyberspace is where you are when you're hooked up to a computer network or electronic database --- or talking on the telephone. However, there are more specialized applications of cyberspace where users hook up to a virtual world that exists only electronically; these applications include text-based MUDs (Multi-User Dungeons or Multi-User Domains) and MUSEs (Multi-User Simulated Environments).

9. Telepresence/Teleoperation: The concept of cyberspace is linked to the notion of telepresence, the feeling of being in a location other than where you actually are. Related to this, teleoperation means that you can control a robot or another device at a distance. In the Jason Project, children at different sites across the U.S. have the opportunity to teleoperate the unmanned submarine Jason, the namesake of this innovative science education project directed by Robert Ballard, a scientist at the Woods Hole Oceanographic Institute (EDS, 1991; Ulman, 1993; McLellan, 1995). An extensive set of curriculum materials is developed by the National Science Teachers Association to support each Jason expedition. A new site is chosen each year.

Background: Terminology and Concepts: The term "artificial reality", coined by Myron Krueger, has been in use since the 1970s; however, the origin of the term "virtual reality" can be traced back to the French playwright, poet, actor, and director Antonin Artaud. In his seminal book The Theatre and Its Double (1938), Artaud described theatre as "la réalité virtuelle", a virtual reality in which, in Erik Davis's words, "characters, objects, and images take on the phantasmagoric force of alchemy's visionary internal dramas". Artaud claimed that the "perpetual allusion to the materials and the principle of the theater found in almost all alchemical books should be understood as the expression of an identity [...] existing between the world in which the characters, images, and in a general way all that constitutes the virtual reality of the theater develops, and the purely fictitious and illusory world in which the symbols of alchemy are evolved". The term has also been used in The Judas Mandala, a 1982 science-fiction novel by Damien Broderick, where the context of use is somewhat different from that defined above. The earliest use cited by the Oxford English Dictionary is in a 1987 article titled "Virtual reality", but the article is not about VR technology.

The concept of virtual reality was popularized in mass media by movies such as Brainstorm and The Lawnmower Man. The VR research boom of the 1990s was accompanied by the non-fiction book Virtual Reality (1991) by Howard Rheingold. The book served to demystify the subject, making it more accessible to less technical researchers and enthusiasts, with an impact similar to that which his book The Virtual Community had on virtual-community research lines closely related to VR. Multimedia: From Wagner to Virtual Reality, edited by Randall Packer and Ken Jordan and first published in 2001, explores the term and its history from an avant-garde perspective. Philosophical implications of the concept of VR are systematically discussed in the book Get Real: A Philosophical Adventure in Virtual Reality (1998) by Philip Zhai, wherein the idea of VR is pushed to its logical extreme and ultimate possibility. According to Zhai, virtual reality could be made to have an ontological status equal to that of actual reality. Digital Sensations: Space, Identity and Embodiment in Virtual Reality (1999), written by Ken Hillis, offers a more critical and theoretical academic assessment of the complex set of cultural and political desires and practices culminating in the development of the technology.

Timeline: Virtual reality can trace its roots to the 1860s, when 360-degree art in the form of panoramic murals began to appear. An example of this is Baldassare Peruzzi's piece titled Sala delle Prospettive. In the 1920s, vehicle simulators were introduced.

Morton Heilig wrote in the 1950s of an "Experience Theatre" that could encompass all the senses in an effective manner, thus drawing the viewer into the onscreen activity. He built a prototype of his vision, dubbed the Sensorama, in 1962, along with five short films to be displayed in it while engaging multiple senses (sight, sound, smell, and touch). Predating digital computing, the Sensorama was a mechanical device, which reportedly still functions today. Around this time, Douglas Engelbart used computer screens as both input and output devices. In 1966, Thomas A. Furness III introduced a visual flight simulator for the Air Force. In 1968, Ivan Sutherland, with the help of his student Bob Sproull, created what is widely considered to be the first virtual reality and augmented reality (AR) head-mounted display (HMD) system. It was primitive both in terms of user interface and realism, and the HMD to be worn by the user was so heavy it had to be suspended from the ceiling. The graphics comprising the virtual environment were simple wireframe model rooms. The formidable appearance of the device inspired its name, the Sword of Damocles. Also notable among the earlier hypermedia and virtual reality systems was the Aspen Movie Map, created at MIT in 1977. The program was a crude virtual simulation of Aspen, Colorado, in which users could wander the streets in one of three modes: summer, winter, and polygons. The first two were based on photographs (the researchers actually photographed every possible movement through the city's street grid in both seasons) and the third was a basic 3-D model of the city. In the late 1980s, the term "virtual reality" was popularized by Jaron Lanier, one of the modern pioneers of the field. Lanier had founded the company VPL Research in 1985, which developed and built some of the seminal "goggles and gloves" systems of that decade. In 1991, Antonio Medina, an MIT graduate and NASA scientist, designed a virtual reality system to "drive" Mars rovers from Earth in apparent real time despite the substantial delay of Mars-Earth-Mars signals. The system, termed "Computer-Simulated Teleoperation" as published by Rand, is an extension of virtual reality.

Some Applications of Virtual Reality: Imagine the following academic fiction: eighteen professors from five departments decide to work together and submit a request for a virtual reality system. Suppose further that the administration actually believes that this is a wonderful idea and approves the proposal, provided that the virtual reality system is put to use in the classroom. The faculty eagerly agree to this condition, and to their amazement they acquire the funds to purchase an SGI Onyx 2 Reality Engine and 10 SGI Indigos. The above scenario is not the introduction to a John Grisham suspense novel, but a real story at Clemson University. Recently Steve (D. E.) Stevenson from the Department of Computer Science at Clemson University came to the Geometry Center and talked about applications of geometry with computers. Steve mentioned briefly how various departments had been using the virtual reality system they acquired, and showed specific examples of what they had done with it. The departments using the system range from those which traditionally might use virtual reality, such as the Computer Science, Mechanical Engineering, and Architecture departments, to fields not generally associated with the technology, such as the Biomedical Engineering and Performing Arts departments.
All these disciplines' projects use the technology in ways that create images and objects that would otherwise take a long time to construct, or not be feasible to construct at all. In particular, software is currently under development for Mechanical Engineering students that extends CAD/CAE software to virtual reality. Instead of typing keystrokes to alter perspective views, a user wearing a helmet can view an object as if it were before them simply by moving their head. Moreover, one is able to look through the different layers of an object to see how the device operates internally. Although these are all things that CAD/CAE software allows, the virtual reality system gives the user a more natural way to view an object, which makes it easier to ask the question, "what if?" Some of the other engineering projects are simulation-based design, multipurpose design optimization, and visualization of High Performance Computing-Computer Formulated Design structures. Lastly, one professor dreams of creating a simulation of the famous Tacoma Narrows bridge collapse so that civil and mechanical engineers can fully appreciate the consequences of their errors.

In the Biomedical Engineering department, some of the projects mentioned are the use of virtual reality for viewing X-rays and MRIs, using stereolithography to make prototypes of joints, and even having students perform test surgery. In the Computer Science department, projects include creating a toolkit for non-computer-science designers, rendering and 3-D lighting, viewing non-Euclidean geometries, and modeling for resource management. Projects in the Architecture department include creating a virtual reality model of the campus and a laboratory on building design. People in the Performing Arts department use virtual reality for stage lighting and stage design courses.

Of the above projects, two of the more interesting applications, common to both Mechanical Engineering and Biomedical Engineering, involve stereolithography, or 3-D printing. One is able to design an object or input data describing it, and actually create a polymer prototype of the object viewed in the virtual environment. One interesting example is an image of a pelvis taken from an MRI, piped into the virtual reality software so that one is able to view it; a model of the bone is then manufactured using the polymer machine. The following figure is a virtual reality image of this pelvis.
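As a rough illustration of the hand-off from the virtual model to the polymer machine, the sketch below writes a triangulated surface out in ASCII STL, the de facto input format for stereolithography equipment. This is not the actual Clemson pipeline; the class name, the flat float[][] triangle layout, and the zero facet normals are assumptions made only for the example.

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class StlExport {
        /** Each triangle is nine floats: x1,y1,z1, x2,y2,z2, x3,y3,z3 (assumed layout). */
        public static void write(Path file, String name, float[][] triangles) throws IOException {
            try (PrintWriter out = new PrintWriter(Files.newBufferedWriter(file))) {
                out.printf("solid %s%n", name);
                for (float[] t : triangles) {
                    // Zero normals are written here; most stereolithography software recomputes them.
                    out.println("  facet normal 0 0 0");
                    out.println("    outer loop");
                    for (int v = 0; v < 3; v++) {
                        out.printf("      vertex %f %f %f%n", t[3 * v], t[3 * v + 1], t[3 * v + 2]);
                    }
                    out.println("    endloop");
                    out.println("  endfacet");
                }
                out.printf("endsolid %s%n", name);
            }
        }
    }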

Similarly, a model of a "ship in a bottle" was created using CAD/CAE software, viewed through the virtual reality system, and then fabricated.

The virtual reality machines nicely complement the polymer machine: one is able to thoroughly view an object before making a prototype, thus saving on production costs. The Computer Science department has also created some interesting programs. Two of these are titled Steve's Room and Oliver's Room. Steve's Room is a program which allows the user, via the helmet, to look around a room, turn on lights, and place objects by voice or mouse commands. Oliver's Room is also a room rendered in high resolution; in it one can see an Impressionist painting on the wall, a tiled floor, and a window with a view of mountains. The following picture is a view of Oliver's Room.

As with Steve's Room, the user is able to move about the room via voice commands. The next picture is an image of what one might see through the helmet after a request to move has been made.

The visual results from these projects are amazing, both in a practical sense and in a purely aesthetic sense. The images created are useful in understanding the structure of an object, as well as being suitable for framing. However, what is equally impressive is that various departments were able to get together and pool their resources so that this system could be acquired. By doing this, they have provided themselves and, more importantly, their students with an opportunity to use computer systems today that will no doubt be commonplace in the future.

Conclusion: Affordable, PC-driven, projection-based virtual reality systems are a popular topic of investigation right now, and will probably soon become widespread. Our particular hope for such systems is that they will help expand VR out of the research and corporate labs and into public and educational venues. Our prototype display has now been functional and in use for most of a year. The entire system cost roughly $20,000 to construct; we estimate that a new one could currently be built for about half that amount.

In basic performance tests, as well as day-to-day use, the low-cost PC system is comparable to one using an SGI Onyx2. The LCD projectors and black screen provide a bright display with better contrast than older systems using CRT projectors. The lightweight passive stereo glasses are less encumbering, and less fragile, than active glasses. The system as a whole can be maintained by a group of students who have only recently started learning about VR.

We would like to thank Chris Galbraith, Dan Neveu and Paul Costa for their enthusiasm and ingenuity in helping us put together the Media Study low-cost VR system. The virtual reality research, collaborations, and outreach programs at the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago are made possible by major funding from the National Science Foundation (NSF), awards EIA-9802090, EIA-9871058, ANI-9980480, and ANI-9730202, as well as the NSF Partnerships for Advanced Computational Infrastructure (PACI) cooperative agreement ACI-9619019 to the National Computational Science Alliance. EVL also receives major funding from the US Department of Energy (DOE) Science Grid program, awards 99ER25388 and 99ER25405, and the DOE ASCI VIEWS program, award B347714. In addition, EVL receives funding from Pacific Interface on behalf of NTT Optical Network Systems Laboratory in Japan.

CAVE and ImmersaDesk are trademarks of the Board of Trustees of the University of Illinois. DLP is a trademark of Texas Instruments. IRIX, InfiniteReality, Onyx2 and OpenGL Performer are trademarks of SGI. Linux is a trademark of Linus Torvalds. Matrox is a trademark of Matrox Graphics Inc. Pentium is a trademark of Intel Corporation. Quadro2MXR and GeForce2MX are trademarks of NVIDIA Corporation. Radeon is a trademark of ATI Technologies Inc.
