This document discusses eye gaze interfaces for people with disabilities from a human-computer interaction perspective. It examines how users visually scan unfamiliar digital interfaces to discover their interactive possibilities, and presents several emerging interaction modes, including gesture, head movement, touch, and eye tracking for control. It concludes that interface paradigms need to change: rather than simply adding features to graphical user interfaces, designers should incorporate tangible and natural user interfaces. Further research is needed on multi-sensory feedback that engages eyes, gestures, voice, touch, emotion, and cognition.
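To make the idea of eye tracking as a control channel concrete, the sketch below shows dwell-time selection, a common gaze-interaction technique in which a target is activated when the gaze rests on it long enough. This is a minimal, hypothetical illustration; the `Target` class, the sample rate, and the 800 ms dwell threshold are assumptions for the example, not details from the document.

```python
from dataclasses import dataclass

@dataclass
class Target:
    """A circular on-screen target (hypothetical UI element)."""
    name: str
    x: float
    y: float
    radius: float

    def contains(self, gx: float, gy: float) -> bool:
        # True if the gaze point (gx, gy) falls inside the target circle.
        return (gx - self.x) ** 2 + (gy - self.y) ** 2 <= self.radius ** 2

def dwell_select(gaze_samples, targets, dwell_ms=800, sample_interval_ms=20):
    """Return the first target the gaze dwells on continuously for
    dwell_ms milliseconds, or None if no target is selected.

    gaze_samples is an iterable of (x, y) points at a fixed sample rate
    (an assumption; real trackers provide timestamped samples)."""
    needed = dwell_ms // sample_interval_ms  # consecutive samples required
    current, count = None, 0
    for gx, gy in gaze_samples:
        hit = next((t for t in targets if t.contains(gx, gy)), None)
        if hit is not None and hit is current:
            count += 1
            if count >= needed:
                return hit  # dwell threshold reached: select this target
        else:
            # Gaze moved to a different target (or off all targets): reset.
            current, count = hit, (1 if hit is not None else 0)
    return None
```

A dwell threshold trades speed against the "Midas touch" problem: too short, and every glance triggers a selection; too long, and the interface feels sluggish for users who rely on gaze as their primary input.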