Sunday, October 25, 2015

Project Soli uses radar for hand interface

60 GHz (5 mm wavelength), 0.05 m to 5 m range, 180° field of view.


Video: Welcome to Project Soli (via OneTech2)

Project Soli is developing a new interaction sensor using radar technology. The sensor can track sub-millimeter motions at high speed and accuracy. It fits onto a chip, can be produced at scale and built into small devices and everyday objects.
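
A quick back-of-the-envelope check on those numbers (my own notes, not from Google): at 60 GHz the wavelength is c/f = 5 mm, and because radar Doppler is two-way (f_d = 2v/λ), even a fingertip moving at a few centimetres per second shifts the return by tens of hertz, which is part of why sub-millimetre motion is detectable. A minimal Python sketch; the fingertip speed is an assumed illustration value:

    # Back-of-the-envelope numbers for a 60 GHz radar (my own figures, not official Soli specs)
    C = 3.0e8            # speed of light, m/s
    F_CARRIER = 60e9     # carrier frequency, Hz

    wavelength = C / F_CARRIER                 # 0.005 m = 5 mm, matching the spec above
    v_finger = 0.05                            # assumed fingertip speed, m/s
    doppler_shift = 2 * v_finger / wavelength  # two-way Doppler: f_d = 2*v/lambda -> 20 Hz

    print("wavelength        : %.1f mm" % (wavelength * 1e3))
    print("Doppler at 5 cm/s : %.0f Hz" % doppler_shift)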

Related/Background:

  • Google unveils Project Soli, a radar-based wearable to control anything
  • Google's 'Project Soli' Radar Hand Tracking and How VR and AR Might Use it - Road to VR
  • Multi-sensor System for Driver's Hand-Gesture Recognition (FG 2015, PDF)
  • Short-Range FMCW Monopulse Radar for Hand-Gesture Sensing
  • Understanding Project Soli and Jacquard: Wearable Control Breakthroughs
  • Molchanov, P.; Gupta, S.; Kim, K.; Pulli, K., "Short-range FMCW monopulse radar for hand-gesture sensing," in 2015 IEEE Radar Conference (RadarCon), pp. 1491-1496, 10-15 May 2015. (The FMCW waveform relations this paper tunes are sketched briefly after this list.)
    • doi: 10.1109/RADAR.2015.7131232
    • Abstract: Intelligent driver assistance systems have become important in the automotive industry. One key element of such systems is a smart user interface that tracks and recognizes drivers' hand gestures. Hand gesture sensing using traditional computer vision techniques is challenging because of wide variations in lighting conditions, e.g. inside a car. A short-range radar device can provide additional information, including the location and instantaneous radial velocity of moving objects. We describe a novel end-to-end (hardware, interface, and software) short-range FMCW radar-based system designed to effectively sense dynamic hand gestures. We provide an effective method for selecting the parameters of the FMCW waveform and for jointly calibrating the radar system with a depth sensor. Finally, we demonstrate that our system guarantees reliable and robust performance.
    • keywords: {CW radar;FM radar;computer vision;driver information systems;gesture recognition;radar imaging;automotive industry;computer vision technique;depth sensor;hand-gesture sensing;intelligent driver assistance system;short-range FMCW monopulse radar;smart user interface;Doppler radar;Frequency modulation;Radar measurements;Radar tracking;Receivers;Sensors},
    • URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7131232&isnumber=7130933
  • Tan, B.; Woodbridge, K.; Chetty, K., "A real-time high resolution passive WiFi Doppler-radar and its applications," in 2014 International Radar Conference, pp. 1-6, 13-17 Oct. 2014
    • doi: 10.1109/RADAR.2014.7060359
    • Abstract: The design and implementation of a real-time passive high Doppler resolution radar system is described in this paper. Batch processing and pipelined processing flow are introduced for reducing the processing time to enable real-time display. The proposed method is implemented on a software defined radio (SDR) platform. Two experiments using this system are described: one based on small human body motions and another one on hand gesture detection. The results from these experiments show that the proposed system can be used in a range of application scenarios such as eHealth, human-machine interaction and high accuracy indoor target tracking.
    • keywords: {Doppler radar;passive radar;software radio;wireless LAN;SDR platform;batch processing;eHealth;hand gesture detection;human-machine interaction;indoor target tracking;pipelined processing flow;real-time high resolution passive WiFi doppler-radar;real-time passive high Doppler resolution radar system;software defined radio;Doppler radar;Doppler shift;IEEE 802.11 Standards;Passive radar;Real-time systems;Micro Doppler;Passive Radar;RealTtime;Wi-Fi},
    • URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7060359&isnumber=7060235
  • Ohn-Bar, E.; Tawari, A.; Martin, S.; Trivedi, M.M., "Predicting driver maneuvers by learning holistic features," in 2014 IEEE Intelligent Vehicles Symposium Proceedings, pp. 719-724, 8-11 June 2014
    • doi: 10.1109/IVS.2014.6856612
    • Abstract: In this work, we propose a framework for the recognition and prediction of driver maneuvers by considering holistic cues. With an array of sensors, driver's head, hand, and foot gestures are being captured in a synchronized manner together with lane, surrounding agents, and vehicle parameters. An emphasis is put on real-time algorithms. The cues are processed and fused using a latent-dynamic discriminative framework. As a case study, driver activity recognition and prediction in overtaking situations is performed using a naturalistic, on-road dataset. A consequence of this work would be in development of more effective driver analysis and assistance systems.
    • keywords: {driver information systems;gesture recognition;learning (artificial intelligence);sensor arrays;traffic engineering computing;driver activity prediction;driver activity recognition;driver analysis and assistance systems;driver foot gestures;driver hand;driver head;driver maneuver prediction;driver maneuver recognition;holistic feature learning;latent-dynamic discriminative framework;naturalistic on-road dataset;sensor array;Cameras;Foot;Radar tracking;Sensors;Trajectory;Vehicle dynamics;Vehicles},
    • URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6856612&isnumber=6856370
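
The Molchanov et al. entry above is largely about choosing the FMCW waveform parameters; the underlying trade-offs come from two textbook relations, range resolution ΔR = c/(2B) and sawtooth beat frequency f_b = 2BR/(cT). The sketch below is my own summary of those standard relations; the bandwidth and chirp duration are assumed illustration values, not taken from the paper or from Soli:

    # Textbook FMCW relations (my own illustration; parameters are assumed, not from the papers above)
    C = 3.0e8          # speed of light, m/s
    BANDWIDTH = 7e9    # swept bandwidth, Hz (roughly the 57-64 GHz band)
    T_CHIRP = 1e-3     # chirp duration, s

    def range_resolution(bandwidth_hz):
        # Two targets closer than c/(2B) merge into one beat-frequency bin
        return C / (2.0 * bandwidth_hz)

    def beat_frequency(target_range_m, bandwidth_hz, chirp_s):
        # Sawtooth FMCW: round-trip delay 2R/c multiplied by the sweep slope B/T
        return 2.0 * bandwidth_hz * target_range_m / (C * chirp_s)

    print("range resolution        : %.1f cm" % (range_resolution(BANDWIDTH) * 100))
    print("beat frequency at 0.5 m : %.1f kHz" % (beat_frequency(0.5, BANDWIDTH, T_CHIRP) / 1e3))

With these assumed numbers, a 7 GHz sweep gives roughly 2 cm range resolution, which is why fine gesture sensing leans on Doppler and phase rather than range bins alone.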
