Waiter, you can fill his glass now!

Researchers at the University of St Andrews have created a revolutionary piece of technology that can identify an object simply by having it placed on a small radar sensor.

RadarCat (Radar Categorisation for Input and Interaction) can recognise different objects by placing them on a tiny device attached to a computer – which then scans their physical properties using radio waves.

The radar sensor, which is based on Project Soli developed by Google’s Advanced Technologies and Projects (ATAP) lab, can be trained to recognise materials in real-time.

Using sophisticated machine learning algorithms, it can distinguish between a plastic tray, a piece of wood, the make and model of a smartphone, and even body parts. And, helpfully for diners in busy restaurants, it can alert waiters when a glass needs refilling, assessing whether a vessel is empty or full. With enough objects added to the system, the technology could eventually build up a ‘physical objects dictionary’.
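As a rough sketch of how such a pipeline might work (the features, model, and labels below are illustrative assumptions, not RadarCat's published implementation), a material classifier could be trained on fixed-length feature vectors summarising each radar frame:

```python
# A minimal sketch, assuming radar frames have already been reduced to
# fixed-length feature vectors. The synthetic data generator below is a
# stand-in for real Soli signal features, which are not described in
# this article.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
MATERIALS = ["plastic_tray", "wood", "glass_empty", "glass_full"]

def synthetic_radar_features(label_idx, n_samples, n_features=64):
    """Hypothetical stand-in for radar data: each material gets a
    distinct mean signature plus noise."""
    base = rng.normal(size=n_features) * (label_idx + 1)
    return base + rng.normal(scale=0.5, size=(n_samples, n_features))

# Build a labelled training set: 200 example frames per material.
X = np.vstack([synthetic_radar_features(i, 200) for i in range(len(MATERIALS))])
y = np.repeat(np.arange(len(MATERIALS)), 200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")

# Classify a new frame, e.g. to decide whether a glass needs refilling.
pred = clf.predict(X_test[:1])[0]
print("Detected:", MATERIALS[pred])
```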

The system, which employs a radar signal, has a range of potential applications: helping blind people tell apart the contents of two otherwise identical bottles, replacing bar codes at a checkout, helping recycling plants sort their rubbish into the appropriate categories, or even aiding foreign language learning.

If the chip, which has no moving parts and uses very little energy, is integrated into the electronic circuitry of future smartphones, it could also scan body parts to enable useful applications; for example, placed over the stomach it might launch a recipe app, while placed on the leg or foot it could launch a map. It can also distinguish between a gloved hand and a bare hand, making an enabled device easier to operate in certain jobs or climates.
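As an illustration of how such detections could drive device behaviour (the labels and the launch_app helper here are hypothetical, not part of any published RadarCat API), a simple dispatch table might map a recognised body part to an action:

```python
# Hypothetical mapping from a recognised label to a device action;
# launch_app is a placeholder for whatever the host OS provides.
from typing import Callable, Dict

def launch_app(name: str) -> None:
    print(f"Launching {name}...")  # stand-in for a real OS intent

ACTIONS: Dict[str, Callable[[], None]] = {
    "stomach":     lambda: launch_app("recipes"),
    "leg":         lambda: launch_app("maps"),
    "foot":        lambda: launch_app("maps"),
    "gloved_hand": lambda: launch_app("large_touch_targets_mode"),
}

def on_detection(label: str) -> None:
    """Run the action bound to the classifier's output, if any."""
    action = ACTIONS.get(label)
    if action is not None:
        action()
    else:
        print(f"No action bound to '{label}'")

on_detection("stomach")  # -> Launching recipes...
```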

“The Soli miniature radar opens up a wide range of new forms of touchless interaction. Once Soli is deployed in products, our RadarCat solution can revolutionise how people interact with a computer, using everyday objects that can be found in the office or home, for new applications and novel types of interaction,” said Professor Aaron Quigley, Chair of Human Computer Interaction at the University.

“Our future work will explore object and wearable interaction, new features and fewer sample points to explore the limits of object discrimination.”

The sensor was provided by Google ATAP as part of its Project Soli alpha developer kit programme. Soli was originally conceived to sense micro and subtle motions of human fingers (think Nintendo Wii, but radio waves are far more sensitive and responsive to movement than cameras), but the team at St Andrews discovered it could be used for much more.

“Beyond human computer interaction, we can also envisage a wide range of potential applications ranging from navigation and world knowledge to industrial or laboratory process control,” Professor Quigley added.

A team of undergraduate and postgraduate students from the University’s School of Computer Science was selected to demonstrate the project to Google in Mountain View in the United States earlier this year. A snippet of the video was also shown on stage at Google’s annual I/O conference.

Professor Quigley is travelling to the 29th ACM User Interface Software and Technology Symposium in Tokyo next week to present the technology.