This document describes a glove-based sign language interpretation system built around flex sensors and an Arduino Uno microcontroller. The system is intended to help people with speech impairments communicate by translating sign language gestures into text and speech output. Flex sensors mounted on the glove detect finger and hand movements and send their readings to the Arduino, which interprets the gestures using machine learning algorithms and outputs the translation. The system aims to reduce communication barriers for the deaf and hard of hearing.
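
To make the sensor-to-text data flow concrete, the sketch below shows one way such a glove might read its flex sensors and map the readings to a gesture label on an Arduino Uno. It is a minimal illustration only: the pin assignments (A0-A4), template values, and the simple nearest-neighbour matching rule are assumptions for demonstration, not the classifier the system actually uses.

```cpp
// Illustrative Arduino sketch: read five flex sensors wired as voltage
// dividers on analog pins A0-A4 and match the readings against a small
// table of stored gesture templates using a nearest-neighbour rule.
// Pin assignments, template values, and gesture labels are hypothetical.

const int NUM_SENSORS = 5;
const int sensorPins[NUM_SENSORS] = {A0, A1, A2, A3, A4};

// Example gesture templates: expected ADC readings per finger (0-1023).
// In a real system these would come from a calibration or training step.
struct Gesture {
  const char *label;
  int expected[NUM_SENSORS];
};

const Gesture gestures[] = {
  {"HELLO",     {200, 210, 205, 215, 220}},
  {"THANK_YOU", {800, 790, 805, 810, 795}},
};
const int NUM_GESTURES = sizeof(gestures) / sizeof(gestures[0]);

void setup() {
  Serial.begin(9600);  // recognised text is sent over the serial port
}

void loop() {
  int reading[NUM_SENSORS];
  for (int i = 0; i < NUM_SENSORS; i++) {
    reading[i] = analogRead(sensorPins[i]);
  }

  // Nearest-neighbour match: pick the template with the smallest
  // summed absolute difference across all fingers.
  long bestDist = -1;
  const char *bestLabel = "UNKNOWN";
  for (int g = 0; g < NUM_GESTURES; g++) {
    long dist = 0;
    for (int i = 0; i < NUM_SENSORS; i++) {
      dist += abs(reading[i] - gestures[g].expected[i]);
    }
    if (bestDist < 0 || dist < bestDist) {
      bestDist = dist;
      bestLabel = gestures[g].label;
    }
  }

  Serial.println(bestLabel);  // a downstream module could render this as speech
  delay(500);                 // sample roughly twice per second
}
```

In this sketch the recognised label is simply written to the serial port; in a complete system that output would feed a display or text-to-speech stage to produce the text and speech described above.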