Abstract
Advances in surface scanning technology have made possible the large-scale collection of body shape data. This paper describes methods for creating whole-body human figure models from 3D scan data using a semi-automated process. Body surface landmark data are used to calculate joint locations and to segment the point cloud. A surface polygon mesh is fit to the point-cloud data for each segment. The model accuracy with respect to the original scan data can be made arbitrarily high by increasing the polygon mesh resolution. NURBS surfaces are then fit to the polygon mesh for rendering in CAD systems. The result of this process, which typically requires about five minutes, is a whole-body, articulated model that represents the body shape captured in a whole-body scan. The model posture can be adjusted dynamically in a CAD environment, enabling a wide variety of ergonomic analyses.
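The early steps of the pipeline summarized above (surface landmarks → estimated joint centers → point-cloud segmentation) could be sketched roughly as follows. This is an illustrative assumption, not the paper's actual method: `estimate_joint_center`, the fixed interpolation ratio, and nearest-center segmentation stand in for the anthropometry-specific regression equations and landmark-based segmentation the authors describe.

```python
import numpy as np

def estimate_joint_center(landmark_a, landmark_b, ratio=0.5):
    """Estimate a joint center by interpolating between two surface
    landmarks (hypothetical rule; real joint-center estimates use
    landmark-specific regression equations)."""
    a = np.asarray(landmark_a, dtype=float)
    b = np.asarray(landmark_b, dtype=float)
    return a + ratio * (b - a)

def segment_points(points, joint_centers):
    """Label each scan point with the index of the nearest joint
    center -- a simple stand-in for landmark-based segmentation of
    the point cloud into body segments."""
    pts = np.asarray(points, dtype=float)            # (N, 3) scan points
    centers = np.asarray(joint_centers, dtype=float)  # (K, 3) joint centers
    # Pairwise distances between every point and every center: (N, K)
    d = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
    return d.argmin(axis=1)                           # (N,) segment labels
```

In practice each segment's labeled points would then be meshed independently, with mesh resolution chosen to meet the desired accuracy relative to the raw scan.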
