
FaceShift 2 Blender

Introduction
FaceShift 2 Blender is a Blender addon to edit the facial expression of a MakeHuman character using a Kinect-like device and the output of the FaceShift expression recognition software. On this page we describe the software we use to perform facial animation. The following picture shows the face detected by the Kinect color camera (left), the FaceShift 3D model (center), and the piloted MakeHuman face in Blender (right).

You can download the addon directly from our website. A demo scene, containing the script and a default MakeHuman character, can also be downloaded for a quick test. To install the addon, use the standard Blender addon installation procedure, or copy and paste the script text into a script buffer and execute it.

Usage
First, configure FaceShift to broadcast the captured data as UDP packets to localhost, port 33433. WARNING! Live streaming can be performed ONLY with the FaceShift Studio version; the (cheaper) Freelance version allows data streaming only for recorded performances, not for live tracking.
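As a rough idea of what the Blender side of this setup has to do (a minimal sketch, not the addon's actual code), the following Python fragment opens a non-blocking UDP socket on the same port and drains any pending packets. Decoding of the FaceShift binary stream is deliberately left out, and the helper name poll_packets is only illustrative.

import socket

FACESHIFT_PORT = 33433  # must match the port configured in FaceShift

# Non-blocking UDP socket bound to localhost; FaceShift streams its
# tracking data here as small binary datagrams.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setblocking(False)
sock.bind(("127.0.0.1", FACESHIFT_PORT))

def poll_packets():
    """Drain all pending datagrams and return them as raw bytes."""
    packets = []
    while True:
        try:
            data, _addr = sock.recvfrom(4096)
            packets.append(data)
        except BlockingIOError:
            break
    return packets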

Open Blender and load any MakeHuman character. Enable the FaceShift2Blender plugin, or copy the code into a script buffer and run it. A control panel will appear in the Tool box of the 3D View; select the character on which you want to operate.
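Purely as an illustration of how such a Tool-shelf panel can be declared (the class name, labels, and the faceshift_rec property are assumptions, not the addon's real identifiers; the operator and the property are sketched after the usage steps below):

import bpy

class FaceShiftPanel(bpy.types.Panel):
    """Illustrative Tool-shelf panel exposing the listener controls."""
    bl_label = "FaceShift 2 Blender"
    bl_space_type = 'VIEW_3D'
    bl_region_type = 'TOOLS'  # Tool shelf of the 3D View (Blender 2.7x layout)

    def draw(self, context):
        layout = self.layout
        if hasattr(context.scene, "faceshift_rec"):
            # "faceshift_rec" is an illustrative property, declared in the
            # modal-operator sketch further down the page.
            layout.prop(context.scene, "faceshift_rec", text="REC")
        layout.operator("object.faceshift_listener", text="Start Net Listener")

bpy.utils.register_class(FaceShiftPanel)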

Click “Start Net Listener” to enter a modal mode where the character face will be piloted by the incoming FaceShift data. Press ESC to interrupt the reception and leave the face in its current position. If REC is enabled, keyframes will be added to the timeline to record the current live performance; the timeline will advance even though the time cursor is not visible during recording. Recording can also be enabled or disabled during message reception by hitting the SPACE key. The list of controls that are actually recorded is reported later.
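To make the modal behaviour described above more concrete, here is a simplified sketch of how such a listener can be structured as a Blender modal operator: it polls the UDP socket on a timer, applies the received weights to the shape keys of the active mesh, and keyframes them when recording is enabled. It assumes poll_packets from the earlier fragment, a face driven by shape keys whose names match the incoming controls, and illustrative names (FaceShiftListener, faceshift_rec, parse_blendshapes); none of this is the addon's actual implementation, and the decoding of the FaceShift binary protocol is left as a stub.

import bpy

def parse_blendshapes(packet):
    """Stub decoder: the real FaceShift binary protocol is not decoded here."""
    return {}  # would return {"control name": weight, ...}

class FaceShiftListener(bpy.types.Operator):
    """Illustrative modal operator piloting shape keys with streamed weights."""
    bl_idname = "object.faceshift_listener"
    bl_label = "Start Net Listener (sketch)"

    _timer = None

    def modal(self, context, event):
        if event.type == 'ESC':
            # Stop the reception and leave the face in its current position.
            context.window_manager.event_timer_remove(self._timer)
            return {'FINISHED'}

        if event.type == 'SPACE' and event.value == 'PRESS':
            # Toggle recording while messages are being received.
            context.scene.faceshift_rec = not context.scene.faceshift_rec
            return {'RUNNING_MODAL'}

        if event.type == 'TIMER':
            obj = context.active_object
            shape_keys = getattr(obj.data, "shape_keys", None) if obj else None
            key_blocks = shape_keys.key_blocks if shape_keys else None
            for packet in poll_packets():  # helper from the earlier sketch
                for name, value in parse_blendshapes(packet).items():
                    if key_blocks and name in key_blocks:
                        key_blocks[name].value = value
                        if context.scene.faceshift_rec:
                            # REC enabled: keyframe the control on the timeline.
                            key_blocks[name].keyframe_insert("value")

        return {'PASS_THROUGH'}

    def execute(self, context):
        wm = context.window_manager
        self._timer = wm.event_timer_add(1 / 30, window=context.window)
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}

def register():
    bpy.types.Scene.faceshift_rec = bpy.props.BoolProperty(name="REC", default=False)
    bpy.utils.register_class(FaceShiftListener)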
