
Imagine being able to control your smartphone without using your hands or voice.
Google's Project Gameface makes this possible with its hands-free navigation system. Initially introduced as an open-source gaming mouse, Project Gameface has now expanded to Android, offering a virtual cursor controlled by facial expressions and head movements.
The technology aims to make devices more accessible to people with physical disabilities.
Using the device's front-facing camera, Project Gameface tracks facial expressions and head movements and translates them into intuitive, personalised controls. The technology recognises a total of 52 facial gestures, including raising an eyebrow, opening the mouth, and moving the lips, which can be mapped to a wide range of functions on an Android device.
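In rough terms, each recognised gesture can be treated as a confidence score and mapped to a device action once it crosses a threshold. The sketch below illustrates this idea; the gesture names, `GESTURE_ACTIONS` table, and `detect_actions` helper are hypothetical and not taken from the Project Gameface codebase.

```python
# Hypothetical sketch: mapping facial-gesture confidence scores to actions.
# Assumes a recogniser emits a score in [0, 1] for each tracked gesture,
# as face-landmark models such as MediaPipe's commonly do.

GESTURE_ACTIONS = {
    "browUpLeft": "scroll_up",   # raise left eyebrow -> scroll
    "jawOpen": "tap",            # open mouth -> tap
    "mouthLeft": "go_back",      # move lips left -> back gesture
}

def detect_actions(scores, threshold=0.6):
    """Return the actions whose gesture score meets the threshold."""
    return [action for gesture, action in GESTURE_ACTIONS.items()
            if scores.get(gesture, 0.0) >= threshold]
```

A per-gesture threshold like this is what lets users tune sensitivity so that an ordinary resting expression does not trigger an action.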
Users can also drag and swipe on the home screen using combinations of head movements and facial expressions. The Project Gameface team collaborated with Incluzza, an Indian organisation supporting people with disabilities. This partnership helped expand the technology's use cases, such as typing messages and searching for jobs, and showed the team how its technology could be adapted to different needs. The project uses MediaPipe's Face Landmarks Detection API and Android's Accessibility Service to create a virtual cursor, which moves according to the user's head movements as tracked by the front-facing camera.
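Conceptually, the cursor logic converts a change in head orientation into a change in cursor position on screen. The following sketch is illustrative only: the `move_cursor` function, its gain value, and the screen size are assumptions, and the real cursor is driven through Android's Accessibility Service rather than plain coordinates.

```python
# Hypothetical sketch of driving a virtual cursor from tracked head pose.
# yaw/pitch are small head rotations (in radians); gain and screen size
# are illustrative placeholders.

def move_cursor(cursor, yaw, pitch, gain=400, screen=(1080, 2400)):
    """Map head yaw/pitch to a new (x, y) cursor position,
    clamped to the screen bounds."""
    x = min(max(cursor[0] + yaw * gain, 0), screen[0] - 1)
    y = min(max(cursor[1] - pitch * gain, 0), screen[1] - 1)
    return (x, y)
```

Clamping to the screen bounds keeps the cursor on screen even for large head movements, and the gain controls how far the cursor travels per degree of rotation.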
This technology has been made available on GitHub, empowering developers to create apps that make devices more accessible, as reported by Gadgets 360.