This document presents a distributed framework for collecting and analyzing multimodal data from multiple sensors. The framework uses a publish/subscribe architecture to synchronize data collection across sensor nodes, and the collected data is streamed from the sensor nodes to processing nodes for analysis. To validate the framework, the researchers built a multimodal learning system that captured audio, video, and motion data from presentations to provide feedback. Fifty-four students tested the system and gave positive feedback on its usability and the learning experience. The distributed framework thus enables scalable and efficient multimodal data collection and analysis.
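
The publish/subscribe flow described above can be illustrated with a short sketch. The following is a minimal in-process example, not the framework's actual implementation: the `Broker` class, the `sensor_node` function, and the `sensors/<modality>` topic scheme are hypothetical names chosen for illustration. Each sensor node publishes timestamped samples to a modality-specific topic, and a processing node subscribes to those topics so it can align the modalities by timestamp.

```python
import queue
import threading
import time
from collections import defaultdict
from typing import Callable, Dict, List


class Broker:
    """Routes published messages to every subscriber of a topic."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)
        self._lock = threading.Lock()

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        with self._lock:
            self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        with self._lock:
            handlers = list(self._subscribers[topic])
        for handler in handlers:
            handler(message)


def sensor_node(broker: Broker, modality: str, samples: int) -> None:
    """Publishes timestamped samples; a shared clock lets processing nodes
    synchronize the modalities downstream."""
    for i in range(samples):
        broker.publish(f"sensors/{modality}", {
            "modality": modality,
            "timestamp": time.time(),
            "payload": f"{modality}-frame-{i}",
        })
        time.sleep(0.01)


if __name__ == "__main__":
    broker = Broker()
    received: "queue.Queue[dict]" = queue.Queue()

    # A processing node subscribes to all three modalities used in the study.
    for modality in ("audio", "video", "motion"):
        broker.subscribe(f"sensors/{modality}", received.put)

    # Each sensor node runs concurrently, mimicking independent devices.
    threads = [
        threading.Thread(target=sensor_node, args=(broker, m, 3))
        for m in ("audio", "video", "motion")
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # The processing node can now order samples across modalities by timestamp.
    while not received.empty():
        msg = received.get()
        print(f"{msg['timestamp']:.3f}  {msg['payload']}")
```

In a deployed system the in-process broker would be replaced by a networked message broker so that sensor and processing nodes can run on separate machines, which is what makes the architecture scale to additional sensors.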