British Success in VR

twinIsles.dev

Contents

1. Introduction
2. BT and VR
3. BBC Virtual Studio
4. Use of VR by the military
5. The Virtual Environment Graphics and Applications Group at Hull
6. Manchester Metropolitan University Virtual Museum research
7. Further reading

1. Introduction

The term virtual reality (VR) is becoming increasingly common in everyday life, but it is difficult to provide a single authoritative definition. Encyclopaedia Britannica describes it as "the use of computer modeling and simulation to enable a person to interact with an artificial three-dimensional visual or other sensory environment". In practice VR can take many forms, ranging from a simple computer screen displaying, say, a VRML application, to assorted hardware such as head-mounted displays, gloves and exoskeletons equipped with sensors and force-feedback mechanisms, offering the user varying degrees of immersion in the virtual environment. Essential features of VR include telepresence, the illusion of actually being in the virtual environment, and interaction, the ability not only to experience the virtual world but also to modify it in some way.

The applications of VR are wide-ranging. The property of telepresence facilitates the handling of dangerous substances and working in hazardous environments (e.g. bomb disposal) without risk to the human operator. Flight simulators are the most obvious use of VR in training, permitting pilots to carry out manoeuvres and deal with emergencies that would be dangerous, expensive or extremely difficult to attempt or create in real life. VR also plays an invaluable role in the training of doctors, simulating a wide range of medical procedures without risk to human patients. In the world of entertainment and leisure VR experiences are already commonplace, and in architecture VR is being used to provide potential clients with the opportunity to walk through proposed new buildings, a considerable improvement upon gazing at two-dimensional drawings.

The aim of this document is to give an overview of the British contribution to the field of VR by describing some British successes in the industry as well as discussing research currently taking place.

2. BT and VR

One of Britain's most active researchers into VR is telecommunications giant BT. Four BT applications are considered here: CamNet, AvatarBT, VisionDome and Smartspace.

CamNet is a VR product developed by BT that provides augmented reality to a user via a headset linked audio-visually to an expert at a remote location. The headset consists of a microphone, earphones, a miniature video camera and a head-mounted display. CamNet allows the expert to see exactly what the wearer is seeing and to communicate not only by voice but also by showing diagrams, reference manuals and so on.

Possible applications include a paramedic at the scene of an accident connected to a specialist at a distant hospital, or an on-site computer technician at a client's premises linked to a highly skilled engineer at the supplier's HQ. The telepresence provided by the system vastly reduces the probability of error due to verbal misunderstanding that may result from voice-only communication. It also allows the remote expert to provide on-the-scene assistance more quickly, regardless of geographic distance. CamNet is now produced by the British company InfoDisp.

AvatarBT, which utilizes software from London based AvatarMe, is BT's system for generating avatars. An avatar is a photo-realistic three-dimensional animated representation of a person.

The subject poses in four positions in a special booth. For each position two digital photographs are taken simultaneously. The first is a texture photograph, capturing textural qualities of the subject, i.e. a "normal" shot. The second is a silhouette, capturing the subject's outline shape.

The system holds a representation of a standardized human being, a generic avatar. This is essentially a wire-mesh model consisting of about 1500 polygons. The generic avatar is then deformed to match the silhouette photographs of the subject.

The system then begins the process of texture sampling, in which samples are collected from the texture photographs. The software is intelligent enough to know exactly which areas to sample and where to place them on the subject's avatar. To allow the avatar to move, a computer-generated skeleton is placed inside it. As the skeleton moves so do the limbs and torso, thereby animating the avatar. AvatarBT incorporates a phase called touch-up, which allows final adjustments (e.g. to nose or mouth positioning) to be made manually.
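
A rough sketch of the silhouette-deformation and texture-sampling stages may make the pipeline clearer. The Python fragment below is purely illustrative: the function names, the crude bounding-box deformation and the orthographic texture lookup are assumptions for the sake of a self-contained example, not the actual AvatarBT algorithms.

    import numpy as np

    def deform_generic_avatar(vertices, front_silhouette, side_silhouette):
        """Crudely deform a generic avatar mesh so that its bounding box matches
        the extents of the subject's front and side silhouettes (toy version)."""
        verts = vertices.copy()
        # Width and height come from the front silhouette, depth from the side one.
        width, height = front_silhouette.max(0) - front_silhouette.min(0)
        depth, _ = side_silhouette.max(0) - side_silhouette.min(0)
        current = verts.max(0) - verts.min(0)              # current (x, y, z) extents
        return verts * (np.array([width, height, depth]) / current)

    def sample_textures(vertices, texture_photo):
        """Colour each vertex by projecting it orthographically onto the front
        texture photograph and sampling the nearest pixel (front view only)."""
        h, w, _ = texture_photo.shape
        xy = vertices[:, :2]
        uv = (xy - xy.min(0)) / (xy.max(0) - xy.min(0))    # normalise to [0, 1]
        px = (uv * [w - 1, h - 1]).astype(int)
        return texture_photo[h - 1 - px[:, 1], px[:, 0]]   # image rows grow downwards

    # Toy usage with stand-in data: ~1500-polygon generic mesh, silhouette outlines
    # in metres, and a blank "texture photograph".
    generic = np.random.rand(1500, 3)
    front = np.random.rand(400, 2) * [0.6, 1.8]
    side = np.random.rand(400, 2) * [0.3, 1.8]
    photo = np.zeros((480, 640, 3), dtype=np.uint8)
    avatar = deform_generic_avatar(generic, front, side)
    colours = sample_textures(avatar, photo)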

The system has proven to be a popular attraction at the Millennium Dome, allowing visitors to see their avatars riding a bike across the dome skyline with ET in the front basket.

The commercial applications of avatar technology include computer games and on-line clothing stores. The next generation of computer games could allow players themselves to become a part of the action, possibly with players from around the world competing against or alongside one another. Avatars would allow on-line clothing shoppers to "try on" the latest fashions before buying, observing the results on themselves from all possible angles and in all poses. BT further believes the technology has applications in teleconferencing and virtual communities.

VisionDome represents BT’s concept of a fully immersive virtual environment. It was developed by the American company Alternate Realities Corporation (ARC), with which BT has been collaborating to explore the potential of the technology.

VisionDome consists of a walk-in dome capable of housing up to 15 people. Computer-generated images are shown on the 5-metre hemispherical screen. The image is produced by a central projector unit fitted with a hemispherical lens matching the curvature of the dome, which creates a 360 by 180-degree image. The system can project real-time computer-generated graphics, HDTV video images and live video camera images. User immersion is achieved by projecting an image larger than the viewer’s field of view, combined with high-quality audio. Unlike 3D cinema or amusement park simulations, the audience is able to interact with, and manipulate, the images and information displayed on the screen.

BT’s suggested applications include:

  • Design reviews, e.g. architectural walk-throughs, safety reviews and training, or molecular visualization for the pharmaceutical industry;
  • Telepresence, e.g. remote real-time projection of sporting events and concerts, immersive video conferencing, and recreation of inaccessible locations;
  • “Edutainment” (education + entertainment), e.g. the recreation of buildings, extinct creatures and historic events;
  • Synthetic environments, i.e. the connecting of simulators to provide more realistic training environments, e.g. an air traffic control simulation could be linked to an aircraft simulation;
  • Oil exploration, e.g. to review oil platforms before they are built and to assist in the understanding of seismic data showing where to drill.

Smartspace is BT’s VR workstation of the future. It combines video conferencing, Internet capability, full cinematic reproduction of audiovisual material and 3D sound with a user interface designed to reduce effort and improve efficiency.

The Smartspace concept grew from a project to evaluate ways of improving people’s working methods by incorporating a range of technologies into an innovative, new working environment. Usability was the overriding priority from the beginning. BT wanted to combine technologies such as speech recognition, 3D sound, iris recognition and those used in the VisionDome fully immersive environment into a semi-immersive desktop solution enabling users to experience the full benefits of the system’s virtuality without being isolated from their real surroundings.

Smartspace comprises a main display screen filling the user’s peripheral vision, fed by two projectors mounted on the back of the chair. A 3D sound system envelops the user in a “bubble of sound”, allowing them to be aware of objects not currently displayed. Smartspace offers several alternative input methods, the main one being a 15” touch-sensitive LCD screen located horizontally in front of the user, operated by finger, corded pen or a chair-arm-mounted rollerball. A “spacemouse” is provided for 3D applications, and rotating the workstation itself invokes an additional user interface. Voice recognition is also available.

Security may be maintained by biometrics (i.e. recognition of the user’s physical characteristics). Two such options are iris recognition (each individual's irises are different) and fingerprint recognition. Both methods are far more secure than a password system.

BT suggests the following potential applications:

  • situations involving remote expertise, e.g. banking, control rooms and medical scenarios;
  • education, e.g. providing access to a virtual classroom or lecture;
  • virtual estate agency, taking a potential buyer on a tour of a house;
  • entertainment, e.g. Smartspace can show the latest DVD movies with full cinematic-style reproduction and Dolby Digital AC-3 sound.

Smartspace is likely to be increasingly used by the growing number of home workers, allowing them to enjoy the advantages of working at home without the feeling of isolation that could arise from not being in the office.

Smartspace was licensed for production by British company Incorporated Technologies Ltd in 1999.

3. BBC Virtual Studio

BBC Research and Development (R&D) has been working on the incorporation of VR into television production. A virtual studio allows the placement of real actors and props in a virtual environment, which may be totally computer generated in real-time, pre-rendered or normal video. The virtual scenery is moved in unison with the real action using positional information obtained from the studio cameras.

The technology allows considerable production cost savings as well as providing the opportunity to make productions that would otherwise be impossible.

The BBC decided to develop a 2D virtual set system because of weaknesses in existing 3D virtual studio technology: it was extremely expensive and produced images that were limited in complexity and quality when rendered in real time. Since the majority of studio camera movements are pans, tilts or zooms, which do not produce perspective changes, a full 3D model was deemed unnecessary: under such movements its output is indistinguishable from that of a 2D system. The BBC system allows high-quality, photo-realistic television images to be produced in real time at low cost.
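
To see why a 2D approach works for pans, tilts and zooms, consider a large pre-rendered backdrop: as long as the camera does not translate, the virtual background only needs to be shifted and rescaled in the image plane. The sketch below is a simplified illustration of that idea (small-angle approximation, no lens distortion, invented parameter values); it is not the BBC's image manipulator.

    import numpy as np

    def virtual_backdrop_view(backdrop, pan_deg, tilt_deg, zoom,
                              out_w=720, out_h=576, pixels_per_degree=20.0):
        """Return the part of a large pre-rendered 2D backdrop seen by a camera
        that has panned, tilted or zoomed but not moved (translation needs 3D).
        Small-angle approximation: pan/tilt become horizontal/vertical shifts."""
        bh, bw, _ = backdrop.shape
        win_w, win_h = int(out_w / zoom), int(out_h / zoom)  # window shrinks as we zoom in
        cx = bw // 2 + int(pan_deg * pixels_per_degree)      # pan shifts the window sideways
        cy = bh // 2 - int(tilt_deg * pixels_per_degree)     # tilt shifts it vertically
        x0 = int(np.clip(cx - win_w // 2, 0, bw - win_w))
        y0 = int(np.clip(cy - win_h // 2, 0, bh - win_h))
        window = backdrop[y0:y0 + win_h, x0:x0 + win_w]
        # Nearest-neighbour resample of the window up to the output resolution.
        ys = np.linspace(0, win_h - 1, out_h).astype(int)
        xs = np.linspace(0, win_w - 1, out_w).astype(int)
        return window[ys][:, xs]

    # Example: a 4000x3000 photographic backdrop, camera panned 5 degrees, zoomed 2x.
    backdrop = np.zeros((3000, 4000, 3), dtype=np.uint8)
    frame = virtual_backdrop_view(backdrop, pan_deg=5.0, tilt_deg=-2.0, zoom=2.0)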

Chromakey is a well-established technique in television production in which actors and props are filmed against a background of uniform colour (often blue), called the key colour, which is replaced in the broadcast picture with a virtual background. To be effective the coloured background must be illuminated brightly and evenly. This gives rise to various problems: scenes must be lit for technical rather than dramatic effect; shadows, which are often unavoidable, do not produce clear key signals; and coloured light from the brightly lit background can spill onto actors and props, giving them an unnatural hue.

BBC R&D has developed a keying system which overcomes these problems. It uses a background made of a special retro-reflective material, illuminated by a light of the chosen key colour mounted on the camera. This material reflects nearly all incident light back in the direction from which it came, ensuring the background appears bright regardless of the studio lighting configuration. The material appears dark grey under normal illumination, which eliminates the colour spill problem referred to above.
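
Whichever way the key colour is generated, the keying step itself replaces pixels that match that colour with the virtual background. The following sketch shows the basic principle only; real broadcast keyers use far more sophisticated soft keys, and the threshold here is an arbitrary illustrative value.

    import numpy as np

    def chroma_key(foreground, background, blue_thresh=1.3):
        """Replace pixels that are predominantly blue with the virtual background.
        Both images are uint8 RGB arrays of identical shape."""
        fg = foreground.astype(float)
        r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
        key = b > blue_thresh * np.maximum(r, g)   # blue clearly dominates red and green
        out = foreground.copy()
        out[key] = background[key]
        return out

    # Example with placeholder frames (studio camera frame and virtual set frame).
    studio = np.zeros((576, 720, 3), dtype=np.uint8)
    virtual_set = np.zeros((576, 720, 3), dtype=np.uint8)
    composite = chroma_key(studio, virtual_set)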

For the virtual set illusion to work effectively, camera movement must affect both the virtual scenery and the real action equally. A virtual set camera position measurement system must be able to measure both camera position and orientation to a high degree of accuracy, allow unconstrained movement of multiple cameras, work with a wide variety of camera mountings, measure the camera parameters with minimal delay and place no significant constraints either on scene content or studio environment.

The BBC has developed a system meeting all the above requirements. It makes use of a number of markers placed out of shot, e.g. on the ceiling, which are viewed by a small auxiliary camera mounted on the side of each studio camera. Each marker bears a marking representing a unique code number. The signal from the auxiliary camera is analyzed by a purpose-built hardware unit to calculate the camera’s precise position and orientation. These parameters are passed to the virtual set system along with the camera’s focus and zoom settings, allowing the virtual scenery to be manipulated in real time. Further detail of the camera tracking system may be found in the paper “A versatile camera position measurement system for virtual reality TV production” by G. A. Thomas, J. Jin, T. Niblett and C. Urquhart.
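
The underlying calculation in marker-based tracking of this kind is a standard pose-estimation problem: given the known studio coordinates of several identified markers and the pixel positions at which the auxiliary camera sees them, the camera's position and orientation can be recovered. The snippet below illustrates the idea with OpenCV's general-purpose solvePnP routine and invented marker coordinates; it is not the BBC's purpose-built hardware algorithm.

    import numpy as np
    import cv2

    # Known 3D positions (metres, studio coordinates) of four coded ceiling markers
    # and the 2D pixel positions at which the auxiliary camera observed them.
    marker_positions = np.array([[0.0, 0.0, 4.0],
                                 [2.0, 0.0, 4.0],
                                 [2.0, 2.0, 4.0],
                                 [0.0, 2.0, 4.0]])
    observed_pixels = np.array([[310.0, 242.0],
                                [505.0, 238.0],
                                [512.0, 430.0],
                                [305.0, 435.0]])

    # Intrinsics of the auxiliary camera (focal length and principal point, in pixels).
    camera_matrix = np.array([[800.0, 0.0, 360.0],
                              [0.0, 800.0, 288.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(5)   # assume lens distortion has already been corrected

    ok, rvec, tvec = cv2.solvePnP(marker_positions, observed_pixels,
                                  camera_matrix, dist_coeffs)
    rotation, _ = cv2.Rodrigues(rvec)                 # 3x3 orientation matrix
    camera_position = (-rotation.T @ tvec).ravel()    # camera position in studio coordinates
    print("Estimated studio camera position (m):", camera_position)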

Focus pulling (also known as selective focusing) is a technique often employed in photography and film to draw the viewer’s attention to the main point of interest by throwing the surrounding foreground and background out of focus. BBC R&D has developed a hardware unit that allows selective de-focusing of both 2D and 3D virtual scenery, mimicking the behaviour of a normal camera viewing real scenery. The cameraman can now pull focus and the virtual scenery will follow suit.
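
One simple way to approximate this in software is to blur the rendered scenery by an amount that grows with each pixel's distance from the chosen focal plane. The sketch below (a coarse slice-by-slice Gaussian blur with invented parameters) illustrates the effect; it is not the algorithm used in the BBC's defocusing hardware.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def defocus(scenery, depth, focus_distance, aperture=2.0, n_slices=8):
        """Blur each depth slice of the virtual scenery in proportion to its
        distance from the focal plane, then reassemble the image.
        scenery: float RGB image; depth: per-pixel depth map of the same size."""
        out = np.zeros_like(scenery)
        edges = np.linspace(depth.min(), depth.max(), n_slices + 1)
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (depth >= lo) & (depth <= hi)
            sigma = aperture * abs(0.5 * (lo + hi) - focus_distance)  # more blur away from focus
            blurred = gaussian_filter(scenery, sigma=(sigma, sigma, 0))
            out[mask] = blurred[mask]
        return out

    # Example: pull focus from the background (8 m) to the foreground (2 m).
    scenery = np.random.rand(240, 320, 3)
    depth = np.tile(np.linspace(2.0, 8.0, 320), (240, 1))
    far_focus = defocus(scenery, depth, focus_distance=8.0)
    near_focus = defocus(scenery, depth, focus_distance=2.0)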

The technology used in the BBC’s Virtual Studio is licensed to Radamec Broadcast Systems and sold under the following names: ‘Virtual Scenario’ (2D image manipulator), ‘free-d’ (camera tracking system) and ‘D•Focus’ (defocusing system).

4. Use of VR by the military

VR has been used by the military in the design of hardware, training of personnel and planning of operations.

By the 1980s aircraft system technologies had advanced to the point at which pilots were in danger of being overwhelmed by the volume of information which they had to interpret and react to. The existing 2D displays were not particularly intuitive and not as suited to human perceptual processes as a 3D spatial representation would be. Kalawsky [The Science of Virtual Reality and Virtual Environments; Kalawsky, Roy S. 1993] describes how the concept of a virtual cockpit arose in response to this situation.

The British Aerospace Virtual Cockpit Research Facility was formed as part of a programme to integrate various cockpit technologies to provide a low-workload man-machine interface. The facility's aim was to evaluate a range of cockpit technologies. To this end a high-resolution helmet-mounted display, capable of emulating other similar displays of the time, was developed and combined with technologies such as speech recognition and 3D auditory localization.

The British Aerospace Virtual Environment Configurable Training Aids (VECTA) system began as a spin-off from the company's virtual cockpit programme. Its aim was to provide a facility for training pilots in a wide range of cockpit procedures. The initial VECTA system was well received when exhibited at the 1991 Paris International Air Show; however, it was still inadequate as a comprehensive training environment. Drawbacks included limited resolution and the inability to display the operator's hands or to give tactile feedback when a virtual control was operated. Feedback on the first VECTA led to the development of an improved version, exhibited in 1992. The later VECTA employed a higher-performance graphics platform capable of high update rates and real-time textured displays. A software modelling package was used to create realistic-looking complex objects at high speed. The system was also capable of being networked, allowing two remotely located pilots to participate in the same virtual mission.

Kalawsky also describes the development of the Real and Virtual Environment Configurable Training Aid (RAVECTA). As the name suggests, this system provides an augmented reality in which, for example, the operator is able to see his own hands moving in front of the virtual world. It incorporates two helmet-mounted TV cameras and uses the chromakey method described in section 3 to display the operator's actual environment, with the virtual world overlaid only on those areas showing the key colour.

Tacisys, the British Army’s Tactical Information System, has been used by peace-keeping forces in war-torn Bosnia to plan missions into hostile territory [Online article from TechWeb: British Soldiers Plug Into Virtual Reality].

Tacisys, produced by Ultra Electronics, allows missions to be planned using 3D terrain analysis. A commander can obtain a 3D view from any point, including that of the enemy, allowing him to predict likely ambush points and to know at which points his view will be obscured.
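
At the heart of such terrain analysis is a line-of-sight test against a digital elevation model: a target is visible from an observer only if no terrain along the sight line rises above the straight line joining the two points. The sketch below is a simple illustrative implementation over a gridded heightmap with invented data; it is not drawn from the Tacisys software.

    import numpy as np

    def line_of_sight(heightmap, observer, target, eye_height=1.7):
        """Return True if 'target' is visible from 'observer' on a gridded terrain.
        heightmap: 2D array of ground elevations (m); observer/target: (row, col)."""
        r0, c0 = observer
        r1, c1 = target
        z0 = heightmap[r0, c0] + eye_height
        z1 = heightmap[r1, c1] + eye_height
        steps = int(max(abs(r1 - r0), abs(c1 - c0)))
        for i in range(1, steps):
            t = i / steps
            r = int(round(r0 + t * (r1 - r0)))
            c = int(round(c0 + t * (c1 - c0)))
            if heightmap[r, c] > z0 + t * (z1 - z0):   # terrain blocks the sight line
                return False
        return True

    # Example: a 30 m ridge running across a synthetic 100x100 terrain.
    terrain = np.zeros((100, 100))
    terrain[:, 50] = 30.0
    print(line_of_sight(terrain, (10, 10), (10, 90)))  # False: the ridge is in the way
    print(line_of_sight(terrain, (10, 10), (10, 40)))  # True: same side of the ridge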

Tacisys is housed in a shock and vibration protected container approximately 4.27m long by 2.44m wide by 2.44m high. It is designed to be carried by a standard 4-tonne military vehicle or military aircraft. The IT hardware consists of Silicon Graphics and Sun Microsystems workstations running a range of (US developed) software applications.

Tacisys is expensive: each of the eleven systems in use in 1997 cost at least $600,000.

The effectiveness of Tacisys in Bosnia has been enhanced by connecting it to the Army’s latest mobile map-making machines and by the use of digitized satellite and reconnaissance aircraft photography, making its terrain data not only highly realistic but also extremely up-to-date.

5. The Virtual Environment Graphics and Applications Group at Hull

The Virtual Environment Graphics and Applications Group (VEGA) was founded in 1992 by the University of Hull with support from British Aerospace. Its general aims are to “promote fundamental research, scientific and technological development and industrial exploitation in the expanding area of virtual environments”.

Medical training is a discipline that can benefit enormously from the application of VR. The Virtual Environment Knee Arthroscopy Training System (VE-KATS) was developed as a joint project between orthopaedic surgeons, computer scientists and psychologists at the University of Hull with a view to providing a comprehensive training environment for knee arthroscopy. In this form of surgery the surgeon operates on the knee joint using a miniature camera (arthroscope) and specialized instruments inserted through a small incision made just below the knee cap.

VE-KATS consists of a pair of mock instruments (arthroscope and surgical probe), which are used on a hollow, articulated model of the knee. The position and orientation of the instruments is continuously tracked by the computer, which produces a simulated view from the arthroscope. The mock instruments, which have been designed to resemble the real thing as closely as possible, incorporate internal magnetic tracking devices. The same tracking system is used to measure movement of the artificial knee.
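
The essential step in producing the simulated view is to treat the tracked arthroscope as a virtual camera: its reported position and orientation define the viewpoint from which the knee model is rendered. The minimal pinhole-projection sketch below uses invented names and parameters and is not taken from the VE-KATS software.

    import numpy as np

    def arthroscope_view(anatomy_points, scope_position, scope_rotation,
                         focal_px=600.0, centre=(320.0, 240.0)):
        """Project 3D knee-model points (tracker coordinates) into the simulated
        arthroscope image using a simple pinhole camera model."""
        # Express the points in the arthroscope's own coordinate frame.
        cam = (anatomy_points - scope_position) @ scope_rotation
        cam = cam[cam[:, 2] > 0]                      # keep points in front of the lens
        u = focal_px * cam[:, 0] / cam[:, 2] + centre[0]
        v = focal_px * cam[:, 1] / cam[:, 2] + centre[1]
        return np.stack([u, v], axis=1)

    # Example: arthroscope 5 cm in front of a small cloud of cartilage points.
    points = np.random.rand(200, 3) * 0.02            # 2 cm cube of anatomy (metres)
    scope_pos = np.array([0.01, 0.01, -0.05])         # tracker-reported position
    scope_rot = np.eye(3)                             # tracker-reported orientation
    pixels = arthroscope_view(points, scope_pos, scope_rot)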

A real-time simulation of the view from the arthroscope is displayed on the computer monitor, which can also show an anatomical diagram indicating the position of the instruments relative to the knee; this is useful during the early stages of training. The system is also capable of displaying video clips of real arthroscopies to demonstrate technique and allow comparison with real anatomy.

VE-KATS incorporates simulation of the barrel image distortion produced by arthroscopes and detects and warns the surgeon of collisions with solid objects such as bone. Currently the collision warning is non-haptic. A limited simulation of deformation is incorporated but will be replaced with a more sophisticated model based on a technique known as Modal Analysis.
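
Barrel distortion of this kind can be modelled by pulling image coordinates in towards the optical centre by an amount that grows with radius. The short sketch below uses a common division-style radial model with an illustrative coefficient; it is not the distortion model used in VE-KATS.

    import numpy as np

    def barrel_distort(points, centre=(320.0, 240.0), k1=4e-7):
        """Apply a simple radial (barrel) distortion to undistorted pixel
        coordinates; larger k1 gives a more pronounced fish-eye effect."""
        c = np.asarray(centre)
        offset = points - c
        r2 = np.sum(offset ** 2, axis=1, keepdims=True)   # squared radius from centre
        return c + offset / (1.0 + k1 * r2)               # compress more at larger radii

    # The corners of a 640x480 image move inwards; straight edges bow towards the centre.
    corners = np.array([[0.0, 0.0], [639.0, 0.0], [639.0, 479.0], [0.0, 479.0]])
    print(barrel_distort(corners))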

Current research is concerned with the measurement of forces applied during real surgery with a view to incorporating force feedback into the simulator. A prototype force feedback device has been developed to this end. An objective scoring system to assess the performance of trainees is also being developed.

Other research being carried out by VEGA includes a study into improving the man-machine interface in the cockpits of military aircraft by means of both head-down and helmet-mounted displays.

6. Manchester Metropolitan University Virtual Museum research

Manchester Metropolitan University (MMU) is engaged in research into how computer technology (e.g. the Internet, multimedia and virtual environments) can be exploited in the development of educational resources by museums.

The paper "The Internet and Virtual Environments in Heritage Education: more than just a technical problem" by Dr. William L. Mitchell and Daphne Economou argues that successful virtual museums depend not only on technical excellence but also on a thorough understanding of the needs of the intended audience and suggests that projects should be developed according to some design methodology. The need for effective evaluation is also stressed. It describes two projects carried out at the university.

The objective of the Tomb of Menna project, which went live in April 1996, was to investigate how the Internet could be used to provide access to archival material from the Griffith Institute at the Ashmolean Museum, Oxford. An Egyptian tomb was chosen because of the simplicity of its geometry, and the tomb of Menna because it had already been well documented by Robert Mond. Between 1914 and 1916 Mond measured and created a plan of the tomb. He also photographed the interior ceiling, wall and floor coverings, capturing 90% of the tomb’s surfaces.

A web site was built using VRML (Virtual Reality Modelling Language). A VRML world consists of a text file describing the geometry of the world, together with texture files that are pasted onto that geometry. The more detailed the world, the larger the files required, which has implications both for download times and for the processing power needed to render the world. Given that the site was aimed at individual users, the developers chose to present a lower-resolution model combined with a series of high-resolution still images.
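
To give a flavour of what such a world file contains, the sketch below writes a minimal VRML 2.0 scene from Python, with a single textured box standing in for one chamber; the dimensions and texture filename are invented placeholders rather than part of the Tomb of Menna model.

    # Write a minimal VRML 2.0 world: one textured box as a stand-in for a chamber.
    vrml_world = """#VRML V2.0 utf8
    Shape {
      appearance Appearance {
        texture ImageTexture { url "wall_painting.jpg" }
      }
      geometry Box { size 6 3 8 }   # width, height, depth in metres
    }
    """

    with open("tomb_chamber.wrl", "w") as f:
        f.write(vrml_world)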

Evaluation of the Tomb of Menna project has taken the form of e-mail feedback from users combined with observations of children using the system at Manchester Museum.

The initial aim of the Kahun project was to investigate how an Internet-based resource could be used to support the work of the Education Service at Manchester Museum. It was decided that Soft Systems Methodology (Checkland 1990) should be used as a starting point for both requirements gathering and opportunity identification. Stakeholders in the museum’s resources were identified and the National Curriculum for history was consulted. A typical school visit was analyzed and interviews carried out with teachers and students.

From this work it was found that showing everyday objects in context and illustrating how they were used were two areas in which VR could contribute to the educational process, given the Virtual Environment medium’s strengths in contextualisation and animation. It was decided to use objects recovered from the site of Kahun, a pyramid builders’ town.

The medium being explored in the Kahun project is the Collaborative Virtual Environment (CVE). A CVE is a Virtual Environment (VE) in which multiple users can be present and participate simultaneously. Each user is represented as a Virtual Actor (avatar). The five artefacts selected for inclusion in the VE are: senet (a board game for two players); a shaduf (a device used for raising water from the river); a brick mould; weights and measures; and a mirror.

Initially a prototype senet environment was constructed using established 2D multimedia technology (Macromedia Director). The prototype was evaluated during an activity week at Manchester Museum and with school parties visiting the museum. The next phase combined 2D multimedia with groupware, which allowed interaction between remote users in semi- and fully-populated environments. The third phase, which is ongoing, makes use of Deva, a system designed for managing large-scale, three-dimensional VEs. In this implementation users are represented by avatars, which are able to move within the VE and select objects by picking them up or pointing to them with handheld lasers. Further details may be found in the paper "CVE Technology Development Based on Real World Application and User Needs" by Daphne Economou, William L. Mitchell, Steve R. Pettifer and Adrian J. West.

Rather than exploiting state-of-the-art technology, Manchester has chosen instead to explore VR for the masses. Virtual museums and recreations of sites of archaeological or historical significance give viewers the opportunity to experience sites that would otherwise be inaccessible because of location, visitor restrictions or simply because they no longer exist. As home computing power continues to increase and VR hardware and software become cheaper, many more virtual museum sites will surely become available.

7. Further reading

As illustrated by the examples above there is much activity in the field of VR within Britain both in terms of pure research and commercial exploitation. The government is keen to encourage the awareness and use of information technology throughout the economy and the UK Virtual Reality Forum has been founded as part of the DTI - Information Society Initiative: Programme for Business. This forum exists to promote the commercial use of Virtual Reality in the UK. Its web site includes case studies of British companies using VR as well as advice for those companies wishing to make a start in the field.

There are numerous other British companies and research centres working with, and researching into, VR. The following resources provide a starting point for further reading:



All information correct and links valid - November 2000

© twinIsles.dev (http://www.twinisles.com/dev/index.htm) 2001