We present and evaluate a new mixed-reality tool called SharedPhys, which tightly integrates real-time physiological sensing, whole-body interaction, and responsive large-screen visualizations to support new forms of embodied interaction and collaborative learning. While our primary content area is the human body—specifically, the respiratory and circulatory systems—we use the body and physical activity as a pathway to other STEM areas such as biology, health, and mathematics. We describe our participatory design process with 20 elementary school teachers, the development of three contrasting SharedPhys prototypes, and results from six exploratory evaluations in two after-school programs. Our findings suggest that the tight coupling between physical interaction, sensing, and visualization in a multi-user environment helps promote engagement, allows children to easily explore cause-and-effect relationships, supports and shapes social interactions, and fosters playful experiences.