Tuesday, July 28, 2009

A Toolkit for the EMIR Laboratory

The EMIR Laboratory (Exploration of Media Immersion for Rehabilitation) is well on its way to becoming a reality. We have a space (still temporary, as we shall eventually move to a completely refurbished space a few doors down the corridor), several computers, and we are in the process of acquiring our first major piece of equipment, a floor projection system. In collaboration with Bloorview Kids Rehab, we will be working with the full range of human sensory perception - visual and auditory of course, but also tactile, movement, physiological (heart rate, skin conductance, breathing, etc.), olfactory and even taste - as well as using a brain-computer interface. The goal is to generate immersive experiences - creative, game-like, artistic, etc. - that challenge rehab patients, clinicians and/or researchers to view themselves in new ways.

However, few people have any understanding of what can be achieved or how to go about it. In addition, even our team, which has been exploring multisensory immersive environments for some time, needs good intermediate tools to support its ongoing research, and we are not always aware of what is possible either. With a view both to helping ourselves and to encouraging collaboration and participation in the new laboratory, we have embarked on developing a "toolkit" for delivering multisensory immersive experiences with a minimum of technical expertise.

Called an Affordance Toolkit (because each tool affords a different set of activities - we are drawing on Gibson's affordance theory here), the framework matches a set of controller interfaces to a set of viewer modules as a function of the task at hand. Controllers include cameras able to read and interpret gestures, tactile screens and pressure carpets able to register different forms of body contact, microphones for recording and interpreting sounds, and sensors for recording physiological or neurological signals. Viewers include 1-, 2- or 4-wall projection, ceiling and floor projection, surround spatialized sound, motor-driven devices both large and small, scent diffusers, and so on.
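
To make the matching idea concrete, the sketch below shows one way the pairing could be represented in Python. Every name here, and the simple subset-matching rule itself, is an illustrative assumption rather than a description of the actual toolkit:

```python
# A minimal sketch of the affordance-matching idea: each tool pairs the
# controllers it reads with the viewers it drives, and a task is served by
# any tool that covers the task's required inputs and outputs.
# All names are illustrative assumptions, not our actual modules.

TOOLS = {
    # tool name: (controllers it reads, viewers it drives)
    "mirror_space": ({"gesture_camera"}, {"wall_projection"}),
    "master_at_work": ({"data_glove", "microphone"},
                       {"spatialized_sound", "scent_diffuser"}),
    "social_atlas": ({"gps", "rfid_reader"}, {"floor_projection"}),
}

def tools_affording(inputs, outputs):
    """Return the tools whose controller and viewer sets cover the
    requested inputs and outputs (simple subset matching)."""
    return [name for name, (controllers, viewers) in TOOLS.items()
            if inputs <= controllers and outputs <= viewers]

print(tools_affording({"gesture_camera"}, {"wall_projection"}))
# ['mirror_space']
```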

Tools under development that bridge these two sets of functionalities include the following:

1) Mirror Space - Using webcams and full-wall projections, the real-time video image is flipped horizontally to generate a pseudo-mirror (occupying 1, 2 or all four walls). Combined with digital enhancements, virtual objects and annotations added to the projected image, this delivers an environment that supports a variety of tasks, including physical games (tug of war, zone avoidance, tag, etc.), cognitive games or tasks (draw in the outlines of objects, paint by numbers, etc.), and controlled exercise and/or balance tasks (raise your feet until they hit a gong, move along a virtual line, etc.) - see the sketch after this list;

2) Master at Work - Using data gloves, or alternate controllers for those unable to use their hands, gestures and manipulation create and modify sounds, visual objects, odors, etc. to make a "multisensory composition" akin to a musical composition. This might be done in a darkened room, avoiding the use of vision altogether;

3) Room of Presence - Similar to the previous tool, this will allow for the materialization of virtual characters that then interact with the user. The user will be able to draw on a bank of virtual characters with a range of pre-determined behaviors, or to create very simple "characters" with new behaviors;

4) Multisensory Logbook - In order to record, annotate, archive and play back the experiences created in the EMIR Laboratory, we are working on a multisensory logbook system involving video cameras and microphones as well as a computerized logbook of programmed functions;

5) Social Atlas - Using GPS for outdoor environments, and RFID tags combined with other location technologies for interiors, we will provide the ability both to track volunteers or friends and to represent their movements within the EMIR Laboratory;

6) Experiensorium - Using geographical database structures, we shall provide the ability to navigate large and complex virtual environments filled with a multitude of sensory experiences. This will be particularly effective with non-realistic visuals or no visuals at all: for example, walking through a sketched farmyard while hearing and smelling the animals, feeling their presence through air currents and the occasional sense of touch. Within the Experiensorium, it will be possible to play out games or narrative experiences.
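
As promised above, here is a minimal sketch of the Mirror Space mechanism in Python using OpenCV - an illustrative choice rather than our actual lab software. It captures a webcam frame, flips it horizontally into a pseudo-mirror, and overlays a virtual object (a target circle a player might be asked to touch; its position is an assumption):

```python
import cv2

# Minimal Mirror Space sketch: flip the live webcam feed horizontally to
# produce a pseudo-mirror, then overlay a virtual object on the image.
# OpenCV and the target position are illustrative assumptions.

TARGET = (320, 120)  # hypothetical screen position of the virtual object

cap = cv2.VideoCapture(0)              # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mirror = cv2.flip(frame, 1)        # flipCode 1 = horizontal (mirror) flip
    cv2.circle(mirror, TARGET, 30, (0, 255, 0), 3)  # virtual annotation
    cv2.imshow("Mirror Space", mirror)
    if cv2.waitKey(1) & 0xFF == 27:    # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```

The interesting work, of course, lies in making the overlay respond to the mirrored body - for instance, testing whether a silhouette extracted from the frame overlaps the target.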

In addition to these macro-tools, we will also be developing and using a range of micro-tools, such as the ability to call up a pop-up menu on the wall-screens using gestures, to partition the visual, audio or tactile spaces, to inject text into these different spaces (e.g. written, audio or braille), and so on.
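
To give a sense of the scale of a micro-tool, here is a purely illustrative sketch of one of them: partitioning a wall-screen into zones and routing a detected gesture position to the zone it falls in, say to pop up a menu there. The resolution, grid size and handler are all assumptions:

```python
# Illustrative micro-tool: partition a wall-screen into a grid of zones and
# route a detected gesture position to its zone, e.g. to pop up a menu there.
# The resolution and grid size are arbitrary assumptions.

WIDTH, HEIGHT = 1024, 768   # assumed projected wall resolution
COLS, ROWS = 4, 3           # partition into a 4x3 grid of zones

def zone_of(x, y):
    """Map a screen coordinate to its (column, row) zone."""
    return (min(x * COLS // WIDTH, COLS - 1),
            min(y * ROWS // HEIGHT, ROWS - 1))

def on_gesture(x, y):
    """Hypothetical handler called when a gesture is detected at (x, y)."""
    col, row = zone_of(x, y)
    print(f"pop-up menu in zone ({col}, {row})")

on_gesture(900, 400)  # prints: pop-up menu in zone (3, 1)
```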

Each of the proposed tools presents significant research and development challenges, but working on them is both satisfying and engaging. We look forward to reporting on the development of the toolkit over the coming months.
