Microsoft is planning to add native eye tracking support to Windows 10. The new support is primarily designed to help people with neuromuscular diseases like ALS, and others with disabilities, control the various interface elements in Windows 10 without a traditional mouse and keyboard. That ranges from gazing at apps to launch them to glancing at characters on an onscreen keyboard to type out words.

Dubbed Eye Control in Windows 10, the new feature will require hardware like Tobii’s Eye Tracker 4C. Microsoft has worked closely with Tobii to enable this support, and existing devices like the Tobii Dynavox PCEye Mini, PCEye Plus, EyeMobile Plus, and I-series will all be supported soon. Eye Control in Windows 10 is in beta now, and participants will need to sign up for Microsoft’s Windows Insider program to get access.

Microsoft’s collaboration was sparked by the company’s first employee hackathon, known as One Week, back in 2014. A winning entry, inspired by former professional football player Steve Gleason, pushed Microsoft to form a new research team to investigate eye tracking. Microsoft’s Windows team built eye tracking prototypes, and CEO Satya Nadella has now backed its integration directly into Windows 10. It’s not clear exactly when this eye tracking support will be broadly available, but given that Microsoft recently started working on its March Windows 10 update, we’d expect to see eye tracking appear next year.