Aaron Bobick
Chair, Interactive Computing
- Office: TSRB 211B
Biography
Dr. Bobick's research spans a variety of aspects of computer vision. His primary work has focused on video sequences where the imagery varies over time, either because of a change in camera viewpoint or a change in the scene itself. He has published papers addressing many levels of the problem, from validating low-level optic flow algorithms, to constructing multi-representational systems for an autonomous vehicle, to the representation and recognition of high-level human activities. The current emphasis of his work is on action understanding, where the imagery is of a dynamic scene and the goal is to describe the action or behavior. Three examples are the basic recognition of human movements, natural gesture understanding, and the classification of football plays. Each of these examples requires describing human activity in a manner appropriate for the domain and developing recognition techniques suitable for those representations.
Recently, Dr. Bobick has also explored the development of interactive environments where advanced sensing modalities provide input based upon the users' actions and, hopefully, intentions. The intriguing element of interactive environments is that the context of the situation can be exploited in interpreting the user's behavior. An example of such an environment is the KidsRoom, the world's first interactive narrative play-space for children. The room employed large-scale video and sound to take the children through a fantasy story; all the sensing was accomplished using computer vision. A more current and ambitious project is the Aware Home Research Initiative. The goal of that effort is to impart sufficient perception and interface capabilities to a house such that it can enhance the quality of life of its inhabitants. A domestic setting provides a wealth of contextual information that will be needed to assist in understanding the activities of the people within.