FaceOSC
Exploring open-source face-control software
Hypothesis
I hypothesised that open-source face-tracking software must exist that I could route into creative coding environments and use for interaction.
Process
I explored multiple creative coding environments:

- Arduino
- Processing
- Max/MSP
- Pure Data
I then researched extensively online to find software that could interface with these environments.
Outcome
I discovered FaceOSC developed by Kyle McDonald.
"FaceOSC will track a face and send its pose and gesture data over OSC, as well as the raw tracked points (when selected in the GUI). It will also stream the entire image over Syphon (Mac only) as "FaceOSC Camera" when selected" - https://github.com/kylemcdonald/ofxFaceTracker/releases
I knew that OSC data could be interpreted by Max for Live in Ableton, which gave me an idea for my first hack.
Experimentation
I realised that, using FaceOSC and Max for Live, I could control any parameter within Max with any measurement FaceOSC produces. Those are:
Pose
center position: /pose/position
scale: /pose/scale
orientation (which direction you're facing): /pose/orientation
Gestures
mouth width: /gesture/mouth/width
mouth height: /gesture/mouth/height
left eyebrow height: /gesture/eyebrow/left
right eyebrow height: /gesture/eyebrow/right
left eye openness: /gesture/eye/left
right eye openness: /gesture/eye/right
jaw openness: /gesture/jaw
nostril flare: /gesture/nostrils
These could be mapped via OSC, and then MIDI, to any parameter within Max. For example, you could control the master volume with the height of your mouth: the wider open your mouth, the louder the music.
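The core of any such mapping is rescaling an incoming OSC value to the 0–127 MIDI controller range. A minimal Python sketch of that step (the input range of 0–10 for mouth height is an assumption for illustration; FaceOSC's actual values depend on your face and camera, so you would calibrate it):

```python
def scale_to_midi(value, in_min, in_max):
    """Linearly map an incoming OSC value to the 0-127 MIDI range,
    clamping out-of-range input to the endpoints."""
    if in_max == in_min:
        return 0
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return round(t * 127)

# Assumed calibration: mouth height spans roughly 0-10 units.
print(scale_to_midi(5.0, 0.0, 10.0))   # half-open mouth -> 64
print(scale_to_midi(12.0, 0.0, 10.0))  # beyond the range -> clamped to 127
```

In a Max for Live patch the equivalent is a `scale` object between the OSC input and the mapped parameter; the clamping matters because gesture values routinely overshoot whatever range you calibrated.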
Hack
I decided to take it further and code a hack that let me control Ableton not by direct linkage to the position of my face, but by using a facial movement as a trigger.
I created a hack in which the raise of my eyebrow triggered a 'bang' in Max for Live. The bang was sent to a tap-tempo circuit that averaged the time between raises to produce a tempo in beats per minute, which was then sent to the Ableton project's master tempo. This lets the user control the speed of the track they are producing or performing live with their eyebrows.
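The two pieces of that hack can be sketched in Python: a rising-edge detector that turns a continuous eyebrow-height stream into discrete 'bangs', and a tap-tempo average over the bang timestamps. The threshold value and units are assumptions for illustration, not FaceOSC constants:

```python
class EyebrowTempo:
    """Sketch of the eyebrow tap-tempo: a threshold crossing on
    /gesture/eyebrow/left acts as the 'bang', and the average interval
    between bangs becomes a BPM value."""

    def __init__(self, threshold=8.0, max_taps=4):
        self.threshold = threshold  # height that counts as 'raised' (assumed units)
        self.max_taps = max_taps    # how many recent taps to average over
        self.raised = False         # edge-detector state
        self.taps = []              # timestamps of recent bangs

    def update(self, eyebrow_height, now):
        """Feed each incoming eyebrow-height value with a timestamp in
        seconds. Returns the current BPM, or None before two taps exist."""
        if eyebrow_height >= self.threshold and not self.raised:
            self.raised = True      # rising edge -> one bang
            self.taps.append(now)
            self.taps = self.taps[-self.max_taps:]
        elif eyebrow_height < self.threshold:
            self.raised = False     # eyebrow lowered; re-arm the trigger
        if len(self.taps) < 2:
            return None
        intervals = [b - a for a, b in zip(self.taps, self.taps[1:])]
        avg = sum(intervals) / len(intervals)
        return 60.0 / avg           # seconds per beat -> beats per minute

tempo = EyebrowTempo()
tempo.update(9.0, 0.0)             # first raise: no tempo yet
tempo.update(2.0, 0.2)             # eyebrow drops, trigger re-arms
print(tempo.update(9.0, 0.5))      # second raise 0.5 s later -> 120.0 BPM
```

The edge detection is the important part: without the `raised` flag, every frame above the threshold would count as a new tap and the tempo would explode. In Max for Live the same job is done by a threshold object gating a `bang` into the tap-tempo logic.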