November 27th, 2019

Activity summary

In this activity, we used the "Body Synth" experiment, which turns body movements into sound. The activity was held in the school gym to give students a larger space in which to explore different body movements that trigger different sounds. Two Promethean Boards were set up with the experiment. Each student had several opportunities to stand in front of the camera and use body movements to trigger an instrument sound (Piano, Drums, or Strings). Only facilitators had access to the controls; they could pick instruments and chords and adjust volume levels.

Goals

  • Learn more about immersive interfaces, similar to AR
  • Learn more about students' responses to visual and audio feedback
  • Learn more about interactions initiated by body movements

Notes from the session

  • Learners were not able to select their own colors
  • They were touching the screen, expecting an outcome
  • The camera would shift focus from one person to another, so music could not be played concurrently by several users
  • The older Promethean Board had lower contrast; some learners were not able to recognize themselves on it
  • Most learners preferred the drums and guitars; the synth was not popular, perhaps because it sounded distorted coming out of the whiteboard's speaker
  • We had to switch instruments for learners when they requested a guitar or drums
  • For the first time, compared with previous sessions, most learners found the screen appealing
  • Cause and effect was easier to discern compared to previous sessions where they had to divide their attention between a computer and something happening in their physical environment
  • One student was turning her body away from the camera and waving her hands along her back while watching the display
  • Body Synth seemed resource-intensive when several people were in front of the camera. Chrome crashed on a MacBook, requiring a reboot, and the experiment was unusable on older Dell Chromebooks

Notes for C2LC design

  • Some mechanism allowing students to influence the pitch, or the music in general, would be beneficial, and that interaction should be made clear
  • Users should be able to select their instruments at any point in the activity
  • A more immersive interface, similar to AR, seems to resonate more with Beverley students
  • Letting users collaborate, pick instruments (different colors), possibly show musical notes emanating from their bodies
    • If a student is engaged in the activity and another enters the room, there should be some interaction that lets the new user pick an instrument and makes it clear that both users are now playing music together