AI Audience Pulse: Resonance Radar is a challenge focused on building a system that captures and interprets audience reactions during live speeches, panels, or presentations. Using camera and microphone input, the system analyzes spoken content and detects emotional or behavioral responses such as laughter, applause, nodding, or fidgeting. The solution integrates affective computing, natural language processing, and multimodal AI to offer new ways of evaluating public speaking and audience dynamics. Core components include visual reaction detection, audio emotion recognition, speech segmentation, an impact summary engine, and engagement heatmaps that visualize emotional intensity over time. Optional features include a real-time feedback dashboard for moderators and personalized insights that help speakers refine future presentations. This challenge is ideal for anyone passionate about human-centered technology, storytelling, and real-time analytics.
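
To make the engagement-heatmap component concrete, the sketch below shows one possible way to aggregate timestamped reaction events (for example, laughter or applause detections from the visual and audio modules) into per-interval engagement scores that could back a heatmap over the course of a talk. The names (ReactionEvent, REACTION_WEIGHTS, engagement_heatmap), the bin size, and the weight values are illustrative assumptions, not part of the challenge specification.

```python
from dataclasses import dataclass
from collections import defaultdict

# Hypothetical reaction event emitted by the visual/audio detectors:
# seconds into the talk, a label such as "laughter" or "applause",
# and a detector confidence in [0, 1].
@dataclass
class ReactionEvent:
    timestamp: float
    label: str
    confidence: float

# Illustrative weights for how strongly each reaction type counts toward
# "engagement"; these values are assumptions chosen for the example.
REACTION_WEIGHTS = {
    "laughter": 1.0,
    "applause": 1.0,
    "nodding": 0.6,
    "fidgeting": -0.4,
}

def engagement_heatmap(events, bin_seconds=10.0):
    """Aggregate reaction events into per-interval engagement scores,
    i.e. the raw timeline behind an engagement heatmap."""
    bins = defaultdict(float)
    for e in events:
        weight = REACTION_WEIGHTS.get(e.label, 0.0)
        bins[int(e.timestamp // bin_seconds)] += weight * e.confidence
    if not bins:
        return []
    # Return a dense timeline so intervals with no reactions show up as 0.0.
    last = max(bins)
    return [(i * bin_seconds, bins[i]) for i in range(last + 1)]

if __name__ == "__main__":
    demo = [
        ReactionEvent(3.2, "laughter", 0.9),
        ReactionEvent(4.1, "nodding", 0.7),
        ReactionEvent(27.5, "fidgeting", 0.8),
        ReactionEvent(31.0, "applause", 0.95),
    ]
    for start, score in engagement_heatmap(demo):
        print(f"{start:>5.0f}s  engagement={score:+.2f}")
```

In a full solution, the same timeline could be aligned with the speech-segmentation output so the impact summary engine can attribute engagement peaks to specific segments of the talk.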