You are looking at my synthetic intelligence operating system. I have dedicated my life to this, and today a major breakthrough finally happened: the system intelligently moved the mouse and explained what it saw on the screen. This is not an LLM. This is not an agent. This is not a RAG setup. The cursor movement is not bound to any scripts. The system is autonomous and can be spoken to. This is the system observing, thinking, and deciding what to do!

I simply asked: "This is what I want you to do: can you move the cursor across the screen and tell me what you see?"

Alion (Synthetic Intelligence): "Understood. Initiating cursor movement to scan the screen. Here we go: center of the screen, currently hovering over a text control element labeled 'Windows PowerShell'. Now moving left: encountered a portion of the interface that appears to be inactive. Moving further left -- still within the same window. No new elements detected. Moving back towards the center and slightly upwards to explore other regions."

A screenshot showing the entire interaction will be posted as a comment from me. The terminal is a bit hard to read in the video.
Originally posted by u/Either_Message_4766 on r/ArtificialInteligence
