Our Projective Immersive Simulated Controlled Environments System (PISCES) uses state-of-the-art computer vision techniques and novel interactive technologies to allow participants to view a life-size projection of themselves interacting with photorealistic virtual environments, rendered in real time on a screen in front of them.
The system comprises a camera linked to a computer, which feeds a large-screen display fitted inside a 2m x 2m booth.
The participant enters the booth and is filmed at a half-profile angle as they face the screen. Real-time video effects technology extracts the participant's image from the background and superimposes it onto the interactive virtual environments.
The combined image is then displayed on the screen in front of the participant, offering a unique 'out-of-body' perspective from which to simultaneously experience and observe their interactions with the characters in the virtual environments. These interactions are also recorded for later review and discussion.
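The extract-and-superimpose step can be sketched as a simple chroma-key composite. This is an illustrative assumption — the source does not specify which keying or segmentation technique PISCES actually uses — and the frame sizes, key colour and tolerance below are hypothetical:

```python
import numpy as np

def composite(camera_frame, scene_frame, key=(0, 255, 0), tol=80):
    """Replace key-coloured pixels in camera_frame with scene_frame.

    camera_frame, scene_frame: (H, W, 3) uint8 RGB arrays of equal shape.
    key: chroma-key colour to remove; tol: per-pixel colour-distance threshold.
    """
    # Distance of each pixel from the key colour; small distance = backdrop.
    diff = camera_frame.astype(np.int32) - np.array(key, dtype=np.int32)
    is_backdrop = np.linalg.norm(diff, axis=-1) < tol
    out = camera_frame.copy()
    # Superimpose: keep the participant, fill the backdrop with the virtual scene.
    out[is_backdrop] = scene_frame[is_backdrop]
    return out

# Toy frames: one white "participant" pixel on a green backdrop; blue virtual scene.
camera = np.zeros((2, 2, 3), dtype=np.uint8)
camera[:] = (0, 255, 0)          # green backdrop everywhere...
camera[0, 0] = (255, 255, 255)   # ...except the participant pixel
scene = np.full((2, 2, 3), (0, 0, 255), dtype=np.uint8)

result = composite(camera, scene)
```

A real implementation would run this per video frame against live camera input; the principle — a mask separating participant from backdrop, then a per-pixel blend with the virtual scene — is the same.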
Read more about our technologies by clicking on the link below:
PISCES differs from other virtual reality (VR) technologies, which typically use head-mounted displays or 'cave' projection systems to provide a first-person perspective: it instead creates a sense of presence in the virtual environments through a novel third-person perspective.
PISCES offers the participant a “self-observational” view of a 2D virtual scene featuring their own image, displayed in real time on a screen in front of them. Rather than offering navigation through a 3D world of computer-generated environments and avatars, it uses filmed environments and actors to capture the subtle nuances of scene ambience, body language and communication.
This approach aims to provide a more natural and versatile way of experiencing immersive media whilst enabling learning through review and repetition of the participant's interactions within the virtual environments under precisely defined and controlled conditions.
The “self-observation” view offered by PISCES has been identified in research studies as being similar to an "out-of-body experience" and, as an innovative approach to VR, it is attracting particular interest within the health and learning sectors for its potential to assist cognitive behavioural therapy for anxiety disorders and learning disabilities:
"The feeling of an ‘‘out-of-body experience’’ or depersonalization while using this novel VE system deserves further study as it differs from the first-person perspective of conventional VR systems (watching the environment through goggles) and from the vicarious experience of computer games (identifying with a small avatar on the screen)". (Gega L. et al. 2013)
To read the latest research please click on the following links:
The "Rubber Hand Illusion"
In 1998, neuroscientists Botvinick and Cohen were conducting research into how sight, touch and 'proprioception' (the sense of body position) combine to create a convincing feeling of body ownership, one of the foundations of self-consciousness.
Through this research they discovered an amazing illusion which could convince people that a rubber hand was their own. The now-famous “rubber hand illusion” was hugely important in understanding how bodily self-consciousness is anchored in both visual and tactile senses to create a sense of self-location and self-identification.
A group of neuroscientists at the Swiss Federal Institute of Technology (EPFL) have since extended the “rubber hand illusion” to create a "virtual out-of-body experience", using cameras to fool people into thinking they are standing somewhere else in a room.
PISCES builds on this approach by adding interactive virtual environments to the “self-observational” view, which, like the "rubber hand illusion", participants frequently describe as ‘‘weird’’ or ‘‘surreal’’. This offers a unique opportunity for experiential learning and therapy.
EPFL Experimental Setups - Video used courtesy of
Swiss Federal Institute of Technology (EPFL) lnco.epfl.ch
Download a diagram of our 'out-of-body' perspective setup by clicking on the following link.