This article discusses a method for incorporating the user's whole body into the control of a large display. The authors discuss ways of determining whether users are collaborating or working alone based on their body language. The end goal is to make the entire process of interaction more intuitive.



The article examines the psychology of how people interact with one another and begins to adapt those findings to human-computer interaction.

The approach of storing tools "on" the body is unique. It builds on the user's natural body awareness instead of requiring them to hunt for a tool somewhere on the display.


The interface seems limited by its reliance on the Wiimote. Future versions might do better if they simply tracked all of the user's body movements.

It seems that analyzing intent could go much further than simply deciding whether or not two users are collaborating.