Virtual world, tangible interface and dementia

Dementia care is becoming a key agenda in health care research. A report published by the Health Economics Research Centre estimated that the cost of dementia in the UK, in care and lost productivity, could be as high as £23 billion. Dementia affects not only those living with it but also carries a personal cost for those who care for them. There is therefore strong interest in finding ways to support the care of people with dementia.

This project (funded by EPSRC) aimed to understand how 3D virtual world (VW) technology can support life engagement for people with dementia in long-term care. We iteratively created several versions of virtual world prototypes and tested them in three care homes, as well as with Age UK and the Alzheimer's Society in Kent, England.


Version 1: Reminiscence Room Prototype

The first prototype presented the user with an avatar placed in a virtual room containing objects from the past (old posters, magazines, TVs, books). A radio in the room played music from that period. Two modes of interaction were available. In the first mode (walking mode), Kinect was used to map the user's movements onto the avatar: more than ten joints (head, torso, waist, etc.) were detected and their movements mapped onto the avatar. In the second mode (seated mode), the avatar sat at a table with virtual objects (a book, a magazine, a radio and a lamp) placed in front of it. In this mode, Kinect only detected movements of the upper body (arms and upper torso), so the user could remain seated in the physical world. The user could pick up items on the table by moving their hands to touch an object. Each mode offered a first-person and a third-person view.
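For illustration, the sketch below shows how tracked Kinect joints could be forwarded onto an avatar, with a seated mode that only passes through upper-body joints. This is a minimal Python sketch under our own assumptions, not the project's actual implementation; the joint names and the Avatar class are placeholders.

```python
# Minimal sketch (illustrative, not the project's code) of mapping Kinect
# joint data onto an avatar, with a seated mode that keeps only the
# upper-body joints so the user can stay seated in the physical world.

UPPER_BODY = {"head", "neck", "torso", "left_shoulder", "left_elbow",
              "left_hand", "right_shoulder", "right_elbow", "right_hand"}

class Avatar:
    """Stand-in for the virtual-world avatar; stores joint positions."""
    def __init__(self):
        self.joints = {}

    def set_joint(self, name, position):
        self.joints[name] = position   # (x, y, z) in world space


def map_skeleton_to_avatar(skeleton, avatar, seated=False):
    """Copy tracked Kinect joints onto the avatar.

    skeleton: dict of joint name -> (x, y, z) reported by the sensor.
    seated:   if True, ignore lower-body joints (seated mode).
    """
    for name, position in skeleton.items():
        if seated and name not in UPPER_BODY:
            continue
        avatar.set_joint(name, position)


if __name__ == "__main__":
    frame = {"head": (0.0, 1.6, 2.0), "left_hand": (-0.3, 1.0, 1.8),
             "left_knee": (-0.2, 0.5, 2.0)}
    avatar = Avatar()
    map_skeleton_to_avatar(frame, avatar, seated=True)
    print(avatar.joints)   # the knee joint is filtered out in seated mode
```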

Version 2: Virtual Tour Prototype

The second prototype took users on a virtual tour along a pre-determined path. Two virtual worlds were created: the river tour and the park tour. In the river tour (see video), the participant was taken on a virtual river trip. The environment was designed to resemble a tropical forest, with vegetation, plants and animated animals (such as elephants and rhinoceroses). The user would travel along the river by making a "rowing" motion with their arms, which propelled the boat forward along the path. The park tour depicted the experience of walking through a park: users would move their arms in a "jogging" motion to move around the virtual park. In this prototype, various animals were programmed to respond to the user as they got close. For instance, rabbits would hop away as the user "walked" closer.
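The rowing interaction can be pictured as a simple stroke counter that drives the boat along the fixed path. The Python sketch below is a hedged illustration only; the stroke-detection rule, the 0.25 m threshold and the distance gained per stroke are assumptions, not the prototype's actual values.

```python
# Illustrative sketch: counting "rowing" strokes from Kinect hand/torso
# depth and advancing the boat along a fixed path. Parameter values and
# the detection rule are assumptions.

STROKE_ADVANCE = 0.5   # assumed path distance gained per detected stroke

class RowingController:
    def __init__(self, path_length):
        self.path_length = path_length
        self.progress = 0.0            # distance travelled along the path
        self.hand_was_forward = False

    def update(self, hand_z, torso_z, threshold=0.25):
        """Feed one Kinect frame (hand and torso depth, in metres).

        A stroke is counted each time the hand, having been pushed well
        in front of the torso, is pulled back past it again."""
        hand_in_front = (torso_z - hand_z) > threshold
        if self.hand_was_forward and not hand_in_front:
            self.progress = min(self.progress + STROKE_ADVANCE,
                                self.path_length)
        self.hand_was_forward = hand_in_front
        return self.progress


if __name__ == "__main__":
    controller = RowingController(path_length=100.0)
    controller.update(hand_z=1.4, torso_z=1.8)          # hand pushed forward
    print(controller.update(hand_z=1.9, torso_z=1.8))   # pulled back -> 0.5
```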


Version 3: Gardening Prototype

In this prototype, the participant worked together with a caregiver to design a virtual garden. Twenty objects were available for selection, including nine types of flowers, four types of trees and seven types of vegetables. An Android-based tablet computer allowed the caregiver and resident to select the vegetation to plant in the virtual garden (see figure). The participant would "walk" to an empty plot using Kinect in the same manner as in version 2. As they came close to an empty plot, text would appear on the screen asking them to select a plant. The user would then make the selection on the tablet, and the chosen plant would slowly pop up from the ground, followed by text feedback with a cheering sound effect and visual effects.
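As a rough sketch of this interaction flow, one update step might look like the Python below. The structure is assumed; the 2 m prompt radius, the class names, the example plant name and the feedback text are all illustrative rather than taken from the prototype.

```python
# Illustrative sketch of the gardening flow: approaching an empty plot
# triggers a prompt, and the tablet selection is planted with feedback.
import math

PROMPT_DISTANCE = 2.0   # assumed radius around an empty plot that triggers the prompt

class Plot:
    """One plot of land in the virtual garden."""
    def __init__(self, position):
        self.position = position   # (x, y, z) in the virtual world
        self.plant = None

    def is_empty(self):
        return self.plant is None


def show_text(message):
    """Stand-in for the on-screen text feedback (plus sound/visual effects)."""
    print(message)


def update_garden(avatar_pos, plots, tablet_selection):
    """One interaction step: if the avatar stands near an empty plot,
    show the prompt and plant whatever was chosen on the tablet."""
    for plot in plots:
        if plot.is_empty() and math.dist(avatar_pos, plot.position) < PROMPT_DISTANCE:
            show_text("Please choose a plant on the tablet")
            if tablet_selection is not None:
                plot.plant = tablet_selection
                show_text(f"You planted a {tablet_selection}!")
            return plot
    return None


if __name__ == "__main__":
    plots = [Plot((5.0, 0.0, 3.0))]
    update_garden(avatar_pos=(4.5, 0.0, 2.5), plots=plots,
                  tablet_selection="sunflower")
    print(plots[0].plant)   # 'sunflower'
```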


Version 4: Tangible Interface Prototype

This version extends version 3 but removes the tablet. Although the tablet worked well with many service users, those with more severe conditions still found it impossible to use. Here we used NFC tags and an NFC-enabled phone to create a tangible interface based on plastic flowers (see video). Users could pick up a plastic flower and place it on the "planting platform", and a virtual version of that flower would pop up in the virtual garden. Using physical objects that users are already familiar with worked very well.
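Conceptually, the tangible interface reduces to a lookup from NFC tag IDs to virtual flowers. The Python sketch below illustrates that idea only; the tag UIDs, flower names and callback are invented for illustration and do not reflect the project's actual phone-side code or any particular NFC library.

```python
# Illustrative sketch: each plastic flower carries an NFC tag whose ID
# maps to a virtual flower; reading a tag "plants" the matching flower.

TAG_TO_FLOWER = {
    "04:a2:5f:1c": "rose",        # illustrative tag UIDs and flower names
    "04:b7:33:9d": "sunflower",
    "04:c1:08:2e": "tulip",
}

def on_tag_detected(tag_uid, garden):
    """Called whenever the NFC-enabled phone on the planting platform
    reads a tag; plants the matching virtual flower if one is known."""
    flower = TAG_TO_FLOWER.get(tag_uid)
    if flower is None:
        return None
    garden.append(flower)   # the virtual flower pops up in the garden
    return flower


if __name__ == "__main__":
    garden = []
    on_tag_detected("04:b7:33:9d", garden)
    print(garden)   # ['sunflower']
```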

A paper on this work has been published at the ACM SIGCHI 2014 conference in Toronto.
