Whether we touch, point, speak, look, or simply think, the interface should handle it. Here, gaze is used as direct input, but mainly as a "micro-intent" signal that provides additional context to the system.

SwiftUI + ARKit

https://x.com/yakuzeg/status/2026691159305896068

https://preview.redd.it/kstlk0jhqxng1.png?width=1920&format=png&auto=webp&s=d327f3135f0dc1817fcbe1a450d1c15ed275faa4
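The post doesn't share code, but a gaze-as-context signal on an ARKit face-tracking device could be sketched roughly as below. This is a minimal, hypothetical sketch: the `GazeIntent` class name, the smoothing constants, and the off-axis threshold are all illustrative assumptions, not the author's implementation.

```swift
import ARKit
import Combine

/// Hypothetical sketch: gaze as a "micro-intent" context signal rather than a pointer.
/// Assumes a TrueDepth-capable device (ARFaceTrackingConfiguration).
final class GazeIntent: NSObject, ObservableObject, ARSessionDelegate {
    /// 0...1 estimate of how strongly the user is attending straight ahead.
    @Published var attention: Double = 0

    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint is in face-anchor space; a small x/y offset ≈ looking straight ahead.
        let p = face.lookAtPoint
        let offAxis = Double(simd_length(simd_float2(p.x, p.y)))
        // 5 is an arbitrary sensitivity factor chosen for illustration.
        let target = max(0, 1 - offAxis * 5)
        DispatchQueue.main.async {
            // Smooth heavily so a glance nudges context instead of triggering actions.
            self.attention = self.attention * 0.9 + target * 0.1
        }
    }
}
```

A SwiftUI view could then observe `attention` and use it to bias which suggestions are surfaced or how eagerly other inputs (touch, voice) are interpreted, which matches the idea of gaze as supporting context rather than a primary trigger.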
Originally posted by u/sakrouseek on r/ArtificialInteligence
