

Winter '19

Spotblinds explores physical computing in the built environment, prototyping a blind system that uses computer vision to let an occupant passively control a room's lighting dynamics. A Microsoft Kinect, which derives depth from infrared disparity sensing, streams tracking data live to an Arduino that controls the rotation of the blinds. Each blind slat is driven by its own servo motor, and each servo is assigned a rotation value based on the body position the Kinect detects. In this case, the Kinect tracks the user's right hand: when the hand enters the frame, the blinds open widest at the horizontal (x-axis) position closest to the hand. The result is a dynamically controlled system that opens up plenty of further possibilities for responsive blind design.
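The core mapping described above — hand x-position in, per-slat servo angles out — can be sketched roughly as follows. This is a hypothetical reconstruction, not the project's actual code: the function name, the normalized 0-to-1 coordinate frame, and the linear falloff are all assumptions made for illustration.

```python
def blind_angles(hand_x, num_slats, max_angle=90.0, falloff=0.5):
    """Map a normalized hand x-position (0..1) to one servo angle per slat.

    The slat nearest the hand opens fully (max_angle); the opening
    decays linearly with horizontal distance, so the blinds open
    widest where the user's hand is.
    """
    angles = []
    for i in range(num_slats):
        # Center of slat i, normalized to the same 0..1 frame as hand_x.
        slat_x = (i + 0.5) / num_slats
        dist = abs(hand_x - slat_x)
        # Linear falloff: nearby slats open wide, distant slats stay closed.
        openness = max(0.0, 1.0 - dist / falloff)
        angles.append(round(max_angle * openness, 1))
    return angles

# Hand centered in the frame: the middle slat opens fully,
# neighbors open partially, edges barely move.
print(blind_angles(0.5, 5))  # → [18.0, 54.0, 90.0, 54.0, 18.0]
```

In the physical prototype, each returned angle would be written to its servo (e.g. over serial to the Arduino) every time the Kinect reports a new hand position.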
