Sunday, October 9, 2016
SACHI uses Soli in RadarCat to Identify Real-World Objects
RadarCat Doesn't Purr, But It Can Identify Real-World Objects | Digital Trends
Researchers at the University of St Andrews in Scotland recently figured out a way for a computer to recognize different types of materials and objects, ranging from glass bottles to computer keyboards to human body parts. They call the resulting device RadarCat, which is short for Radar Categorization for Input and Interaction. As the name implies, this device uses radar to identify objects.
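For a rough intuition of how radar-based object recognition can work (the article gives no implementation details, so everything in this sketch is an illustrative assumption, not the RadarCat implementation): an object resting on the sensor reflects the radar signal in a way that depends on its material and internal structure, and a standard supervised classifier can then map features of each signal frame to an object label. A minimal Python sketch, assuming a multi-channel frame format, a hand-rolled feature set, and a random-forest classifier:

```python
# Illustrative sketch only: classify objects from radar signal frames.
# The frame shape, feature set, and classifier choice are assumptions,
# not the actual RadarCat pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Reduce one multi-channel radar frame (channels x samples) to a
    small feature vector: per-channel mean, std, and peak amplitude."""
    return np.concatenate([
        frame.mean(axis=1),
        frame.std(axis=1),
        np.abs(frame).max(axis=1),
    ])

# Training data would be frames recorded while known objects rest on
# the sensor; random placeholder data stands in for it here.
rng = np.random.default_rng(0)
X_frames = [rng.normal(size=(8, 64)) for _ in range(200)]
y = rng.choice(["glass bottle", "keyboard", "hand"], size=200)

X = np.stack([extract_features(f) for f in X_frames])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# At run time, each new frame is featurized the same way and classified.
new_frame = rng.normal(size=(8, 64))
print(clf.predict(extract_features(new_frame).reshape(1, -1)))
```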
RadarCat was created within the university's Computer Human Interaction research group. The radar-based sensor used in RadarCat stems from the Project Soli alpha developer kit provided by the Google Advanced Technology and Projects (ATAP) program. This sensor was originally created to detect the slightest of finger movements, but the RadarCat team saw even greater potential.
New technology using radar can identify materials and objects in real time.
Professor Aaron Quigley, Chair of Human Computer Interaction at the University, explained, "The Soli miniature radar opens up a wide range of new forms of touchless interaction. Once Soli is deployed in products, our RadarCat solution can revolutionise how people interact with a computer, using everyday objects that can be found in the office or home, for new applications and novel types of interaction."
The system could be used in conjunction with a mobile phone: for example, it could be trained to open a recipe app when you hold the phone to your stomach, or to change its settings when operated with a gloved hand.
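In application terms, that amounts to dispatching on the recognized label. A hypothetical sketch (the labels and actions below are invented for illustration, not features described in the article):

```python
# Hypothetical dispatch table mapping a recognized object/context label
# to a phone action. All labels and actions are invented for illustration.
ACTIONS = {
    "stomach": lambda: print("Opening recipe app..."),
    "gloved hand": lambda: print("Switching to glove-friendly settings..."),
    "glass bottle": lambda: print("Logging a drink..."),
}

def on_object_detected(label: str) -> None:
    action = ACTIONS.get(label)
    if action:
        action()
    else:
        print(f"No action configured for '{label}'")

on_object_detected("gloved hand")
```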
A team of undergraduate and postgraduate students at the University's School of Computer Science was selected to show the project to Google in Mountain View in the United States earlier this year. A snippet of the video was also shown on stage during Google's annual I/O conference.
Professor Quigley continued, "Our future work will explore object and wearable interaction, new features and fewer sample points to explore the limits of object discrimination.
"Beyond human computer interaction, we can also envisage a wide range of potential applications ranging from navigation and world knowledge to industrial or laboratory process control."
Related/Background:
Yeo, H.-S., Flamich, G., Schrempf, P., Harris-Birtill, D., and Quigley, A. (2016). RadarCat: Radar Categorization for Input & Interaction. In Proceedings of the 29th Annual ACM Symposium on User Interface Software and Technology (UIST '16). ACM, New York, NY, USA.
We will update with a link to this paper when it becomes live in the ACM Digital Library.