Abstract
The modern manufacturing industry, especially in developed economies, is pushing towards automation of skill-intensive manual manufacturing operations to remain globally competitive. However, automation of such operations is not trivial because it is difficult to replace a highly skilled human with an equivalent machine. Therefore, it is necessary to capture and extract human skills from skill-intensive manufacturing operations in order to enable the automation of such operations. To achieve this goal, the first step is to capture, digitise and segment the human-workpiece interactions during a simplified manual manufacturing operation in preparation for skill extraction. This paper presents one such attempt using a gaming interface device (Microsoft Kinect™) to simultaneously capture human actions and the resulting workpiece motions non-obtrusively in three dimensions in real time. The captured human-workpiece interaction data is automatically segmented into human action states, represented as therbligs, and workpiece states, from which the subsequent skill extraction can be performed. The use of therbligs also enables this work to be applied to human motion analysis for the effective and ergonomic design of assembly workstations.