Motion capture with TensorFlow.js/PoseNet + MaxMSP + Blender

The key steps of the video above: use TensorFlow.js's PoseNet model (web-based machine learning) for motion capture; connect the PoseNet page to MaxMSP with the Node for Max module that ships with MaxMSP; the human motion data captured by PoseNet is sent back to MaxMSP through Socket.IO; MaxMSP forwards the received data to Blender via OSC; Blender uses the data to drive the deformation animation in real time.
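For reference, a minimal sketch of what the Blender-side receiver could look like, assuming the python-osc package is available in Blender's Python and that MaxMSP sends a normalized keypoint value on an illustrative address such as /pose/nose/y; the object name, address, and port are assumptions, and the original patch may differ:

```python
# Hypothetical sketch: receive PoseNet keypoint data over OSC inside Blender and
# use it to drive a shape key. Object name, address, and port are assumptions.
import bpy
import queue
import threading
from pythonosc import dispatcher, osc_server

incoming = queue.Queue()

def on_pose(address, value):
    # Called on the OSC server thread; just hand the value to the main thread.
    incoming.put(value)

def apply_pose():
    # Runs on Blender's main thread via bpy.app.timers, where bpy data is safe to touch.
    obj = bpy.data.objects.get("Character")
    while not incoming.empty():
        v = incoming.get()
        if obj and obj.data.shape_keys:
            key = obj.data.shape_keys.key_blocks.get("Deform")
            if key:
                key.value = max(0.0, min(1.0, v))  # clamp the normalized keypoint to 0..1
    return 0.03  # poll roughly every 30 ms for real-time response

disp = dispatcher.Dispatcher()
disp.map("/pose/nose/y", on_pose)

server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 9001), disp)
threading.Thread(target=server.serve_forever, daemon=True).start()
bpy.app.timers.register(apply_pose)
```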

Speech recognition, then visualizing the 3D text in Blender Eevee in real time

Voice/speech recognition with my OSC controller BugOSC; the recognized text is sent to Blender and rendered by Eevee, all in real time. YouTube: https://youtu.be/KVT-y5a963Y Bilibili (Chinese): https://www.bilibili.com/video/BV1A54y1d7Sh/ OSC controller: BugOSC, an OSC controller I developed on the WeChat mini program platform (微信小程序). Install the WeChat app first, then search for “BugOSC” inside it. “BugOSC” is NOT a native app; it runs inside WeChat. BugOSC now (v0.4) supports speech/voice recognition! Or you can use
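As a rough illustration (not the original script), the Blender side can be as small as mapping an OSC string onto a text object's body. The address, port, and object name below are assumptions, and heavier updates would better be deferred to the main thread as in the sketch above:

```python
# Hypothetical sketch: write recognized speech arriving over OSC into a Blender
# Font (text) object named "SpeechText" so Eevee shows it in the viewport.
import bpy
import threading
from pythonosc import dispatcher, osc_server

def on_text(address, text):
    # A single string assignment; kept direct for brevity in this demo sketch.
    obj = bpy.data.objects.get("SpeechText")
    if obj and obj.type == 'FONT':
        obj.data.body = text

disp = dispatcher.Dispatcher()
disp.map("/speech/text", on_text)  # address is an assumption, not BugOSC's documented one
server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 9002), disp)
threading.Thread(target=server.serve_forever, daemon=True).start()
```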

Tutorial: voice-controlling an HTML GIF animation with a mobile phone using MaxMSP and Node.js

A demo and tutorial on voice-controlling an HTML GIF animation with a mobile phone using OSC + MaxMSP + Node.js + Socket.IO. Source code: https://gum.co/whmyz https://www.patreon.com/posts/35323161 The data flow: OSC controller -> MaxMSP -> Node for Max -> animation (HTML web GIF). OSC controller: BugOSC, an OSC controller I developed on the WeChat mini program platform (微信小程序). Install WeChat first, then search for “BugOSC” inside it. Or you can use
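The tutorial itself uses MaxMSP with Node for Max and Socket.IO; purely as an illustration of the same data flow, here is a hedged Python stand-in that bridges OSC from the phone to a web page over WebSockets. The ports, the catch-all handling, and the JSON format are all assumptions:

```python
# Hypothetical OSC-to-WebSocket bridge (an alternative to the MaxMSP + Node for Max
# link, not the tutorial's code). Requires the python-osc and websockets packages.
import asyncio
import json
import websockets
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import AsyncIOOSCUDPServer

clients = set()  # currently connected browser pages

def on_osc(address, *args):
    # Forward every incoming OSC message to all connected pages as JSON.
    message = json.dumps({"address": address, "args": list(args)})
    for ws in list(clients):
        asyncio.create_task(ws.send(message))

async def ws_handler(websocket, path=None):
    clients.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        clients.discard(websocket)

async def main():
    dispatcher = Dispatcher()
    dispatcher.set_default_handler(on_osc)  # catch every address sent by the phone controller
    osc = AsyncIOOSCUDPServer(("0.0.0.0", 9000), dispatcher, asyncio.get_running_loop())
    await osc.create_serve_endpoint()
    async with websockets.serve(ws_handler, "0.0.0.0", 8765):
        await asyncio.Future()  # run until interrupted

asyncio.run(main())
```

On the page, a few lines of JavaScript listening on the WebSocket would then swap the GIF frame, which is the part the tutorial does with Socket.IO.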

Interaction between a mobile phone and Blender animation through OSC

Just another demo of interaction between a mobile phone and a Blender animation through OSC. Blender Eevee animation: BLUE FOX Creation https://youtu.be/TYkPvFLDBNI NodeOSC addon for Blender: maybites https://github.com/maybites/blender.NodeOSC OSC controller: BugOSC, an OSC controller I developed on the WeChat mini program platform (微信小程序). Install WeChat first, then search for “BugOSC” inside it. Detailed how-to video: “How to make an interactive audiovisual effect in 5 minutes using Blender and MaxMSP” https://youtu.be/ssVcU8xsRT8 Bgm: “Limit
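If you want to test the Blender end without the phone, a couple of lines of python-osc can stand in for BugOSC; the IP, port, and address below are placeholders that must match whatever the NodeOSC addon is configured to listen on:

```python
# Hypothetical test sender standing in for the BugOSC controller on the phone.
import math
import time
from pythonosc.udp_client import SimpleUDPClient

# Blender machine's IP and NodeOSC's listening port are assumptions.
client = SimpleUDPClient("192.168.1.10", 9001)

for i in range(300):
    # Sweep a value the way dragging a fader on the phone would.
    client.send_message("/fader1", 0.5 + 0.5 * math.sin(i * 0.1))
    time.sleep(1 / 30)
```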

How to make an interactive audiovisual effect in 5 minutes using Blender and MaxMSP

Blender is now a new force in 3D art. Although a newcomer, it is not young, at about twenty years old. In [Experimental Programming], I generally use Blender as an out-of-the-box Python runtime environment, as in: Using Blender to run Python and visualizing the Fourier Series. And this time, it is a simple and crude VJ / music visualization / audio-visual interaction tool: Fingers hurt a little, change a prop: One more: Controlled by mobile
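As a tiny example of the “Blender as a Python runtime” idea mentioned above, the following hedged sketch keyframes the partial sums of a square-wave Fourier series onto the default cube, so playing the timeline shows the waveform; the object name, number of harmonics, and frame range are assumptions:

```python
# Hypothetical sketch: evaluate the square-wave Fourier series
#   f(t) = (4/pi) * sum over odd k of sin(k*t)/k
# and keyframe it onto the default cube's Z location, one frame per sample.
import bpy
import math

obj = bpy.data.objects["Cube"]
terms = 7      # number of odd harmonics to sum
frames = 250   # one period spread over the timeline

for frame in range(1, frames + 1):
    t = 2 * math.pi * frame / frames
    z = (4 / math.pi) * sum(math.sin((2 * n + 1) * t) / (2 * n + 1) for n in range(terms))
    obj.location.z = z
    obj.keyframe_insert(data_path="location", index=2, frame=frame)
```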