This study presents a system for real-time identification of human emotions from facial images, intended for assistive robots and aimed at enhancing human-robot interaction. The proposed solution employs a two-stage processing pipeline: a histogram of oriented gradients (HOG) descriptor with a support vector machine (SVM) classifier for face detection, followed by a convolutional neural network (CNN) for emotion recognition, achieving a 92% success rate in tests. The findings highlight the importance of real-time adaptability of robotic responses to detected emotional states, while also indicating room for improvement in the training datasets.
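The abstract does not give implementation details, so the following is only a minimal NumPy sketch of the overall two-stage shape it describes: a simplified HOG descriptor feeding a linear SVM decision for face detection, then a tiny single-layer CNN forward pass for emotion classification. All function names, the filter count, the 48x48 input size, and the seven-emotion output (a common choice in facial-expression datasets) are illustrative assumptions, not the authors' actual architecture, and the weights here are random rather than trained.

```python
import numpy as np

def hog_features(gray, cell=8, bins=9):
    """Simplified HOG: per-cell orientation histograms weighted by gradient magnitude."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientation in [0, 180)
    h, w = gray.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            a = ang[i:i + cell, j:j + cell].ravel()
            m = mag[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))  # per-cell L2 norm
    return np.concatenate(feats)

def svm_is_face(feats, weights, bias):
    """Stage 1: linear SVM decision function; positive score means 'face'."""
    return float(feats @ weights + bias) > 0.0

def cnn_emotion_probs(face, kernels, w_fc):
    """Stage 2: toy CNN forward pass (valid conv -> ReLU -> global avg pool -> softmax)."""
    n_filters, k, _ = kernels.shape
    h, w = face.shape
    maps = np.zeros((n_filters, h - k + 1, w - k + 1))
    for f in range(n_filters):
        for i in range(h - k + 1):
            for j in range(w - k + 1):
                maps[f, i, j] = np.sum(face[i:i + k, j:j + k] * kernels[f])
    pooled = np.maximum(maps, 0).mean(axis=(1, 2))   # ReLU + global average pooling
    logits = w_fc @ pooled
    e = np.exp(logits - logits.max())                # numerically stable softmax
    return e / e.sum()

# Illustrative usage with random weights (untrained, for shape-checking only)
rng = np.random.default_rng(0)
frame = rng.random((48, 48))                         # assumed grayscale crop size
feats = hog_features(frame)                          # 36 cells x 9 bins = 324 dims
detected = svm_is_face(feats, rng.normal(size=feats.shape), 0.0)
if detected:
    kernels = rng.normal(size=(4, 3, 3))             # 4 filters, 3x3, assumed
    w_fc = rng.normal(size=(7, 4))                   # 7 emotion classes, assumed
    probs = cnn_emotion_probs(frame, kernels, w_fc)  # probabilities sum to 1
```

In a real system the SVM weights would come from training on HOG features of face/non-face windows (as in dlib's HOG detector) and the CNN would be a trained multi-layer network; this sketch only makes the data flow between the two stages concrete.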