Or: How I Learned to Stop Worrying and Love .MHX2

So I just had a major breakthrough (I’m hoping!) with my NPC interactions. I had glanced once or twice at the work Thomas@MakeHuman had been doing, but in my earlier, more sleep-deprived days of DaddyDev, I never quite grasped the concepts he was working on.

Ye gods, I’m so freakin’ thrilled I went back to take a second look.

I took one of my MakeHuman characters (the subway token booth clerk) and, after installing the .mhx2 plugins in both Blender and MakeHuman, re-exported him and brought him into Blender. Granted, I had to discover there was an advanced options switch on the Blender import side – but once I activated the face shapes, I was blown away.

I had visemes (phonemes) already plugged in. I could export expressions from MH. Once I added the MakeWalk addon, I could import mo-cap .bvh files and apply them to my rig. Delving into the online manual (and a profound thanks to the dev who still writes written manuals – I’m sure I will rant yet again about how much I hate video tutorials), I saw there were even more options: smoothing out the mo-cap motions, editing the location/rotation of bones, and, once I figure it out, appending other .bvh files. What really blew me away was the ability to import MOHO (.dat) lipsync files from Papagayo. And I loaded visemes ON TOP of the already loaded .bvh!
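For the curious, the MOHO .dat files Papagayo spits out are plain text, which makes them easy to poke at outside Blender. Here's a minimal sketch of reading one – assuming the usual layout of a `MohoSwitch1` header followed by one `frame phoneme` pair per line; the helper names are mine, not part of any addon:

```python
def parse_moho(text):
    """Return a list of (frame, viseme) pairs from MOHO .dat text.

    Assumes the Papagayo export layout: a "MohoSwitch1" header line,
    then one "frame phoneme" pair per line, in frame order.
    """
    pairs = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("MohoSwitch"):
            continue  # skip the header and any blank lines
        frame, viseme = line.split(None, 1)
        pairs.append((int(frame), viseme))
    return pairs


def viseme_at(pairs, frame):
    """Viseme active at a given frame: the last one keyed at or before it."""
    current = "rest"
    for f, v in pairs:
        if f > frame:
            break
        current = v
    return current
```

So a file keying `rest` at frame 1, `E` at 10, and `O` at 15 would report `E` as the active viseme anywhere from frame 10 through 14 – which is exactly the hold-until-next-key behavior you want when stacking lipsync on top of a body .bvh.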

With the ability to make a character walk, talk and emote, suddenly a whole new facet of storytelling opened up for me. Since I’m not big into programming, I’ve been using Playmaker to handle some of the basic interactivity; and after seeing that the dev who made the wonderful Rift demo ‘Coffee Without Words’ released an asset that mimics human eye movement – my NPCs came alive.

Be forewarned, my demo video is NSFW: I made an MH woman with some extended clothing assets, and I didn’t realize the bathrobe was translucent, so you see some breasts.

Lastly, after having tinkered with this for a bit, I got an email saying that Adobe had released Fuse, a character-creation tool that ties in with Mixamo and the Mecanim animation system Unity now favors. I tried Fuse, made a character, got a walk animation added and plopped it into Unity (and Blender too, just to see what it would do, in case I needed to tweak it).

The character worked: it did its walk cycle and looked OK after I changed its skin shader from transparent to opaque. I did notice that it can be lip-synced, but only with a $35 additional purchase – and a new piece of software to learn and try to integrate. When I pulled the Fuse character into Blender, it was huge, distorted, and had no face rig that I could discern.

So Fuse wasn’t all that tempting: I’d have to shell out $$ for SALSA, and I suspect Adobe will quickly add a subscription fee like all its other products. I’d like to keep my dev budget as free and OSS as possible. Perhaps I’ll just create and export a bunch of random Fuse characters to populate my subway scene with – since MakeHuman doesn’t have the widest range of clothing options, the Fuse NPCs will shake things up a bit visually.

Now I have to sit and watch about 2,000 .bvh animations to find what actions I think will go well with my actors…
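Actually, one way to avoid watching all 2,000 of them blind: .bvh files are plain text, and the MOTION section carries a `Frames:` count and a `Frame Time:` value, so a tiny script can report each clip’s length and let me skip the two-second stumbles. A sketch, assuming standard BVH headers (the function names are mine):

```python
import os


def bvh_duration(path):
    """Return (frames, seconds) for a BVH file, or None if not parseable.

    Reads only the MOTION header lines ("Frames:" and "Frame Time:"),
    so it never has to load the actual per-frame channel data.
    """
    frames = frame_time = None
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("Frames:"):
                frames = int(line.split(":", 1)[1])
            elif line.startswith("Frame Time:"):
                frame_time = float(line.split(":", 1)[1])
                break  # frame data follows; nothing more to read
    if frames is None or frame_time is None:
        return None
    return frames, frames * frame_time


def list_clips(folder):
    """Print every .bvh in a folder with its length, shortest first."""
    clips = []
    for name in sorted(os.listdir(folder)):
        if name.lower().endswith(".bvh"):
            info = bvh_duration(os.path.join(folder, name))
            if info:
                clips.append((info[1], name, info[0]))
    for seconds, name, frames in sorted(clips):
        print(f"{name}: {frames} frames, {seconds:.1f}s")
```

Point `list_clips()` at the mo-cap folder and at least the triage – which clips are long enough to be a walk cycle versus a quick gesture – is done before any eyeballs get involved.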