Abstract
We describe case studies of clinically significant changes in the sedentary behavior of older adults, captured with a novel computer vision algorithm for depth data. An unobtrusive Microsoft Kinect sensor continuously recorded older adults’ activity in the primary living spaces of TigerPlace apartments. Using depth data from a period of ten months, we developed a context-aware algorithm to detect person-specific postural changes (sit-to-stand and stand-to-sit events) that define sedentary behavior. The robustness of our algorithm was validated over 33,120 minutes of data from 5 residents against manual analysis of the raw depth data as the ground truth, showing a strong correlation.
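The abstract does not specify how postural transitions are detected from the depth stream. As a purely illustrative sketch (not the authors' method), one common approach is to track a per-frame estimate of the person's vertical extent and flag sit-to-stand and stand-to-sit events with two-threshold hysteresis; the threshold values and function name below are assumptions for illustration only.

```python
def detect_postural_events(heights, stand_thresh=1.3, sit_thresh=1.0):
    """Return (frame_index, event) pairs using two-threshold hysteresis.

    heights      -- per-frame height estimates in meters (illustrative input)
    stand_thresh -- height above which the person is treated as standing
    sit_thresh   -- height below which the person is treated as sitting
    """
    events = []
    state = None  # "sitting" or "standing"; unknown until the first crossing
    for i, h in enumerate(heights):
        if h >= stand_thresh and state != "standing":
            if state == "sitting":
                events.append((i, "sit-to-stand"))
            state = "standing"
        elif h <= sit_thresh and state != "sitting":
            if state == "standing":
                events.append((i, "stand-to-sit"))
            state = "sitting"
    return events

# Example: a person rises at frame 2 and sits back down at frame 4.
print(detect_postural_events([0.9, 0.95, 1.4, 1.5, 0.8]))
# → [(2, 'sit-to-stand'), (4, 'stand-to-sit')]
```

Using two thresholds rather than one prevents sensor noise near a single cutoff from generating spurious transition events.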