Android apps will soon let you use your face to control your cursor

Developers can now integrate the accessibility feature into their apps, allowing users to control the cursor with facial gestures or by moving their heads. For example, they can open their mouth to move the cursor or raise their eyebrows to click and drag.

First announced for desktop at last year’s Google I/O, Project Gameface uses the device’s camera and a database of facial expressions from MediaPipe’s Face Landmarks Detection API to manipulate the cursor.
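For context, the MediaPipe face landmarking tooling the article refers to is publicly available. The sketch below is illustrative only, not the Gameface implementation: it assumes the MediaPipe Tasks Python package and a locally downloaded `face_landmarker.task` model bundle, and shows how an app could read per-expression scores (such as an open jaw or raised eyebrows) from a single camera frame.

```python
# Minimal sketch: read facial-expression ("blendshape") scores from one frame
# using MediaPipe's Face Landmarker task. The model path and input image are
# assumptions for illustration; a real app would feed live camera frames.
import mediapipe as mp
from mediapipe.tasks import python
from mediapipe.tasks.python import vision

options = vision.FaceLandmarkerOptions(
    base_options=python.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,  # per-expression scores, e.g. jawOpen, browInnerUp
    num_faces=1,
)
detector = vision.FaceLandmarker.create_from_options(options)

frame = mp.Image.create_from_file("frame.jpg")  # stand-in for a camera frame
result = detector.detect(frame)

if result.face_blendshapes:
    # Each category pairs an expression name with a 0-1 confidence score.
    for category in result.face_blendshapes[0]:
        print(category.category_name, round(category.score, 3))
```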

“Through the device’s camera, it seamlessly tracks facial expressions and head movements, translating them into intuitive and personalized control,” Google explained in its announcement. “Developers can now build applications where their users can configure their experience by customizing facial expressions, gesture sizes, cursor speed, and more.”
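To make that configurability concrete, here is a hypothetical sketch of how expression scores might be mapped to cursor actions with user-adjustable thresholds ("gesture size") and cursor speed. The names `GestureBinding` and `triggered_actions`, and the specific bindings, are invented for illustration and do not come from Google's API.

```python
# Hypothetical per-user configuration: which expression triggers which action,
# how strong the expression must be before it counts, and how fast the cursor moves.
from dataclasses import dataclass

@dataclass
class GestureBinding:
    blendshape: str   # expression name from the landmarker output, e.g. "jawOpen"
    threshold: float  # minimum score before the gesture counts as performed
    action: str       # cursor action to trigger

CONFIG = {
    "cursor_speed": 12.0,  # user-tunable pixels per frame of head movement
    "bindings": [
        GestureBinding("jawOpen", 0.4, "start_cursor_move"),
        GestureBinding("browInnerUp", 0.5, "click_and_drag"),
    ],
}

def triggered_actions(blendshape_scores: dict[str, float]) -> list[str]:
    """Return the actions whose bound expression exceeds its threshold."""
    return [
        b.action
        for b in CONFIG["bindings"]
        if blendshape_scores.get(b.blendshape, 0.0) >= b.threshold
    ]

# Example frame where the user's mouth is clearly open but eyebrows are relaxed.
print(triggered_actions({"jawOpen": 0.62, "browInnerUp": 0.1}))  # ['start_cursor_move']
```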

While Gameface was initially made for gamers, Google says it has also partnered with Incluzza, a social enterprise in India focused on accessibility, to explore how the feature can be extended to other settings like work, school, and social situations.

Project Gameface was inspired by quadriplegic video game streamer Lance Carr, who has muscular dystrophy. Carr collaborated with Google on the project, with the aim of creating a more affordable and accessible alternative to expensive head-tracking systems.

https://www.theverge.com/2024/5/14/24156810/google-android-project-gameface-accessibility-io