GART: The Gesture and Activity Recognition Toolkit

Kent Lyons, Helene Brashear, Tracy Westeyn, Jung Soo Kim, and Thad Starner

College of Computing and GVU Center
Georgia Institute of Technology
Atlanta, GA 30332-0280 USA
{kent,brashear,turtle,jszzang,thad}@cc.gatech.edu

Abstract. The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction to machine learning algorithms suitable for modeling and recognizing different types of gestures. The toolkit also provides support for data collection and the training process. In this paper, we present GART and its machine learning abstractions. Furthermore, we detail the components of the toolkit and present two example gesture recognition applications.

Keywords: Gesture recognition, user interface toolkit.

1 Introduction

Gestures are a natural part of our everyday life. As we move about and interact with the world, we use body language and gestures to help us communicate, and we perform gestures with the physical artifacts around us. Using similar motions to provide input to a computer is an interesting area for exploration. Gesture systems allow a user to employ movements of her hand, arm, or other parts of her body to control computational objects. While potentially a rich area for novel and natural interaction techniques, building gesture recognition systems can be very difficult. In particular, a programmer must be a good application developer, understand
