Magic Finger: the future of touch interaction

Autodesk Australia
Wednesday, 24 October, 2012

You may like the touch input offered by your tablet or mobile device, but imagine if your entire surrounding environment were touch sensitive. This would allow any physical object, including your own body, to serve as a peripheral input surface for digital devices such as mobile phones and tablets.

A team of research scientists at Autodesk Research has now made this possible. In collaboration with research intern Xing-Dong Yang from the University of Alberta and Professor Daniel Wigdor from the University of Toronto, research scientists Tovi Grossman and George Fitzmaurice have created a proof-of-concept device called Magic Finger, which allows touch interactions to be carried out on any physical surface.

Magic Finger is a thimble-like device worn on the user’s finger. It combines one of the world’s smallest RGB micro cameras with an optical motion sensor. Together, these sensors allow Magic Finger not only to sense finger input but also to determine what the user is touching, such as a shirt, a table or human skin. This means that you could perform different actions depending on what object you are touching.

The team has explored a variety of interactions and applications that Magic Finger could support. For example, if you ever receive an unwanted call when your smartphone is in your handbag, you could simply tap the handbag to mute the notification. Magic Finger could also be used as an input proxy for other wearable devices. For example, tapping on your wrist could bring up your calendar on a head-mounted display such as Google Glass.
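To make the idea concrete, the following is a hypothetical sketch of how a recognised surface could be mapped to a device action, in the spirit of the examples above. The surface labels, action names and dispatch function are illustrative assumptions only, not part of the published Magic Finger system.

```python
# Hypothetical surface-to-action dispatch, illustrating the kind of
# contextual behaviour described in the article. All names are invented
# for this sketch and are not taken from the Magic Finger prototype.
SURFACE_ACTIONS = {
    "handbag": "mute_notification",       # tap the handbag to silence a call
    "wrist": "show_calendar_on_hmd",      # tap the wrist to open the calendar
    "table": "no_op",                     # unmapped surfaces do nothing
}

def on_tap(surface_label: str) -> str:
    """Return the action to perform for the surface the user tapped."""
    return SURFACE_ACTIONS.get(surface_label, "no_op")

if __name__ == "__main__":
    print(on_tap("handbag"))  # -> "mute_notification"
```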

Magic Finger combines one of the world’s smallest RGB micro cameras with an optical motion sensor to allow it to not only sense finger input, but also to determine what the user is touching.

The team performed a controlled evaluation of Magic Finger’s capabilities, collecting 22 different textures from a wide variety of everyday objects, such as tables, clothing, skin and phones. They found that Magic Finger could distinguish the tested textures with an impressive accuracy of 99.1%.
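The article does not detail the recognition pipeline. One common approach to this kind of camera-based texture classification is to compute local binary pattern (LBP) histograms from greyscale frames and feed them to a support vector machine; the sketch below, using scikit-image and scikit-learn, illustrates that general technique under those assumptions and should not be read as the authors’ exact method.

```python
# Minimal sketch of camera-based texture classification: LBP features + SVM.
# This is one plausible pipeline, not necessarily the one used in the paper.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(patch, points=8, radius=1):
    """Normalised LBP histogram for one greyscale image patch (2D array)."""
    lbp = local_binary_pattern(patch, points, radius, method="uniform")
    n_bins = points + 2  # number of distinct 'uniform' LBP codes
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

def train_texture_classifier(patches, labels):
    """patches: list of 2D arrays; labels: surface names such as 'table' or 'skin'."""
    features = np.array([lbp_histogram(p) for p in patches])
    clf = SVC(kernel="linear")
    clf.fit(features, labels)
    return clf

def classify_touch(clf, patch):
    """Predict which surface a newly captured camera patch belongs to."""
    return clf.predict([lbp_histogram(patch)])[0]
```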

Autodesk has shown its continued interest in what the future may hold for how we interact with technology. Before Magic Finger becomes a commercial reality, further work will be needed to miniaturise the device and to resolve practical issues such as the ‘Midas touch’ problem (unintended input from incidental contact), power and communication.

The work will be published at the ACM Symposium on User Interface Software and Technology (UIST 2012). The full paper and additional details can be obtained from the Autodesk Research website.
