This document presents a research study aimed at developing a real-time hand-gesture recognition model for Arabic Sign Language (ARSL) using the Microsoft Kinect and Leap Motion controllers. The model employs supervised machine learning to detect and classify gestures corresponding to the letters of the Arabic alphabet, achieving a 100% recognition rate for 22 of the 28 letters. The research highlights the potential of combining both devices to improve gesture recognition and support communication for the Arabic hearing-impaired community.
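
The study's exact pipeline is not detailed here, but the supervised classification step it describes can be illustrated with a minimal sketch: a classifier trained on fused Kinect and Leap Motion feature vectors, one class per Arabic letter. Everything below is an assumption for illustration, not the authors' implementation: the SVM choice, the feature dimensions, the sample counts, and the synthetic data standing in for real sensor captures.

```python
# Minimal sketch (not the authors' method): supervised classification of
# 28 Arabic-letter gesture classes from fused sensor feature vectors.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

N_LETTERS = 28           # one class per Arabic letter
SAMPLES_PER_LETTER = 50  # labeled captures per letter (assumed)
KINECT_DIM = 20          # e.g., skeletal joint coordinates (assumed)
LEAP_DIM = 15            # e.g., fingertip/bone features (assumed)

# Synthetic stand-in for real captures: each letter class clusters
# around its own mean in the fused feature space.
X = np.vstack([
    rng.normal(loc=c, scale=0.5,
               size=(SAMPLES_PER_LETTER, KINECT_DIM + LEAP_DIM))
    for c in range(N_LETTERS)
])
y = np.repeat(np.arange(N_LETTERS), SAMPLES_PER_LETTER)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

# Scale features from the two sensors to a common range, then classify.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_train, y_train)

print(f"test accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```

In a real deployment the feature vectors would come from the devices' SDKs rather than synthetic draws, and per-letter accuracy would be reported to identify which letters (here, 6 of the 28) fall short of perfect recognition.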