


Gesture Analysis for Multi-Touch Screen Interfaces


The way we use computers today will soon change. Future technology will allow us to interact with computers on an entirely different level from what we are used to. The tools we currently use to communicate with a computer, such as the mouse and the keyboard, will slowly disappear, replaced by tools that are more comfortable and more natural for people to use. That future is already here.

Adoption of touch-screen hardware and applications is growing rapidly and will break new ground in the years to come. This technology requires new ways of detecting input from the user — input made up of on-screen gestures rather than button presses or mouse-wheel scrolls.

This report describes the gestures we define, the methods used to detect them, and how they are passed on to an application.