Published in Vol 1, No 1 (2015): October

Assistive Dressing System: A Capabilities Study for Personalized Support of Dressing Activities for People Living with Dementia

1NYU College of Nursing, New York University, New York City, NY, United States

2Motivational Environments Research Group, School of Computing, Informatics, and Decision Systems Engineering, Arizona State University, Tempe, AZ, United States

3MGH Institute of Health Professions, School of Nursing, Boston, MA, United States

*these authors contributed equally

Corresponding Author:

Winslow Burleson, PhD

NYU College of Nursing

New York University

433 1st Ave

New York City, NY, 10010

United States

Phone: 1 2129985376

Fax: 1 2129985376

Email: wb50@nyu.edu


Abstract

People living with advanced stages of dementia (PWDs) or other cognitive disorders often cannot remember how to perform basic day-to-day activities, making them increasingly dependent on the assistance of caregivers. Dressing is one of the most common forms of assistance caregivers provide, and one of the most stressful for both parties because of its complexity and the privacy challenges it poses. In this paper, we present a first-of-its-kind system (DRESS) that aims to restore much-needed independence and privacy to PWDs and afford additional freedom to their caregivers. The DRESS system is designed to deliver continuous, automated, personally tailored feedback to support PWDs during the process of dressing. The core of DRESS is a computer vision-based detection system that continuously monitors the dressing state of the user, identifies correct and incorrect dressing states, and provides corresponding cues to help complete the dressing process with minimal, or ideally no, caregiver intervention. The DRESS system detects clothing location, orientation, and status with respect to the dressing process by identifying and tracking fiducial markers (visual icons) attached to clothes. In preparation for in-home trials with PWDs, we evaluated the system's ability to detect dressing events by asking 11 healthy participants to simulate common correct and incorrect dressing scenarios in a laboratory setting, such as donning a shirt or pants inside out or back to front, and partial dressing. We found that although the fiducial tracking system missed a few expected detections, it was generally capable of detecting dressing phases for both pants and shirts. Our study suggests that fiducial tracking applied to the dressing process has the potential to automatically recognize dressing states and generate prompts and feedback that help PWDs and people with related cognitive disorders dress themselves correctly with little or, ideally, no assistance from their caregivers.
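
To make the fiducial tracking approach concrete, the sketch below shows one plausible way to detect garment markers and infer a coarse dressing state using OpenCV's ArUco module. Everything here, including the library choice, the marker IDs, the garment layout, and the state rules, is an illustrative assumption; the paper does not specify the DRESS system's actual implementation.

    # Minimal sketch of fiducial-based dressing-state detection, assuming an
    # ArUco-style marker pipeline (OpenCV >= 4.7 with the contrib aruco module).
    # The marker IDs, garment layout, and classification rules below are
    # illustrative assumptions, not the DRESS system's actual design.
    import math

    import cv2

    # Hypothetical layout: each garment region carries a unique marker ID,
    # printed so that it reads upright when the garment is worn correctly.
    SHIRT_FRONT, SHIRT_BACK, PANTS_FRONT, PANTS_BACK = 0, 1, 2, 3

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

    def marker_rotation_deg(corners):
        """Rotation of the marker's top edge relative to the image x-axis."""
        top_left, top_right = corners.reshape(4, 2)[:2]
        return math.degrees(math.atan2(top_right[1] - top_left[1],
                                       top_right[0] - top_left[0]))

    def classify_dressing_state(frame):
        """Return a coarse dressing state from the markers visible in one frame."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _rejected = detector.detectMarkers(gray)
        visible = {} if ids is None else {
            int(marker_id): marker_rotation_deg(c)
            for marker_id, c in zip(ids.flatten(), corners)
        }
        # A camera facing the user should see only "front" markers: a visible
        # back marker suggests the garment is on back to front, and a marker
        # rotated roughly 180 degrees suggests it is upside down.
        if SHIRT_BACK in visible or PANTS_BACK in visible:
            return "garment_backwards"
        if any(abs(abs(angle) - 180) < 30 for angle in visible.values()):
            return "garment_upside_down"
        if SHIRT_FRONT in visible and PANTS_FRONT in visible:
            return "dressing_complete"
        return "in_progress"

A real deployment would presumably smooth these per-frame states over time before issuing a prompt, given the occasional missed detections noted in the evaluation above.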

iProc 2015;1(1):e13

doi:10.2196/iproc.4700


(This is a conference paper presented at the Connected Health Symposium, Boston, 2015, which was not edited and is only lightly peer-reviewed).

Multimedia Appendix 1

Images one through four.

PDF File (Adobe PDF File), 611KB

Multimedia Appendix 2

Extended abstract.

PDF File (Adobe PDF File), 488KB

    Edited by T Hale, G Eysenbach; submitted 14.05.15; peer-reviewed by B Hattink, T Kwok, E Konstantinidis; accepted 20.07.15; published 27.10.15

    Copyright

©Winslow Burleson, Cecil Lozano, Vijay Ravishankar, Jeremy Rowe, Edward Mahoney, Diane Mahoney. Originally published in Iproceedings (http://www.iproc.org), 27.10.2015.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in Iproceedings, is properly cited. The complete bibliographic information, a link to the original publication on http://www.iproc.org/, as well as this copyright and license information must be included.