Thursday, March 26, 2015

Make Human to Blender to UE4

I'm prototyping an FPS in UE4. I need/want a non-epic-blue-dude. I have a good enough handle on Blender to perform basic asset tasks, but nothing actually artistic.

So what do I do?

I created a pipeline from Make Human, to Blender, to UE4. This lets me generate a great (albeit generic) human model that is rigged and all set up for UE4 animations. I then take it into Blender for any mesh adjustments I'd like to make. Finally, I bring it into UE4 and make use of animation retargeting, which lets me copy Unreal's free animation pack onto my own model.

Essentially, I'm able to generate a model with ~60 quality FPS animations w/o actually creating any art assets! That's re-diq-you-lass.

Step-By-Step

Make Human
You need to download their latest "unstable" version. As of this writing, it's the only version that supports the UE4 Epic rig.
  • Tweak your human.
    • UE4 normal person height = about 183cm. 
Make Human Exporting
  • Pose/Animate > Skeleton > Tag Filter > Game: Yes.
  • Pose/Animate > Skeleton > Rig Presets > Uengine.
  • Pose/Animate > Pose > T-Pose.
  • Export > Mesh Format > Filmbox(fbx)
  • Export > Options > Scale units > centimeter.
  • Export to UE4.
    •  Export > Options > Binary FBX: No.
  • Export to Blender.
    •  Export > Options > Binary FBX: Yes.
Importing TO Blender
  •  Export from Make Human as an "FBX Binary".
    • Blender needs it as a binary. UE4 needs it as ASCII. Remember that difference.
  • Set Scene > Units to Metric / Degrees, with a unit Scale of 0.01.
    • It's in the Scene tab of the Properties panel (right side by default). Google up "Blender Set Metric Degrees" if you can't find it.
  •  File>Import>FBX
Adjusting IN Blender

Blender will fudge things up a bit. So here's how you un-fudge it.
  •  Make sure you are in "Object" mode.
  • Select the model.
    • In the scene explorer it should be the top level of the model's hierarchy.
  • Press "N" with your mouse in the big scene view in the center of your screen (default).
    • This brings up the transforms.
  • The scale will probably be something dumb like "0.001" or w/e. Set all three scale values to "1.0".
  • Apply the transformations as the default to the model.
    • Ctrl+A > Apply Rotation and Scale.
    • Important! Apply the Rotation AND Scale.
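If you want to see what that Apply step actually does under the hood, here's a plain-Python sketch (no bpy; the function name is mine): Blender stores an object-level scale separate from the mesh's vertex data, and "Apply Scale" bakes that scale into the vertices so the object transform becomes a clean 1.0 while world positions stay identical.

```python
# Illustrative sketch of Blender's "Apply Scale" (Ctrl+A).
# The object-level scale gets baked into the vertex coordinates,
# and the object scale resets to 1.0 -- world positions unchanged,
# but exporters now see clean transforms.

def apply_scale(vertices, scale):
    """Bake an object-level uniform scale into the vertex coordinates."""
    baked = [(x * scale, y * scale, z * scale) for (x, y, z) in vertices]
    return baked, 1.0  # new vertices, new object-level scale

# A mesh imported from Make Human might carry a dumb 0.01 object scale.
verts = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0), (0.0, 0.0, 18300.0)]
obj_scale = 0.01

world_before = [(x * obj_scale, y * obj_scale, z * obj_scale) for (x, y, z) in verts]
verts, obj_scale = apply_scale(verts, obj_scale)
world_after = [(x * obj_scale, y * obj_scale, z * obj_scale) for (x, y, z) in verts]

assert world_before == world_after  # same model in the world
print(obj_scale)  # 1.0
```

In Blender itself, the equivalent one-liner from the Python console should be bpy.ops.object.transform_apply(rotation=True, scale=True).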
Exporting FROM Blender
  • File>Export>FBX.
  • Select ONLY Mesh + Armature.
  • Scale = 1.0.
  • Leave Forward/Up alone.
  • Uncheck "Default Take".
  • Toggle the pull-down to ASCII, NOT Binary.
  • Everything else should be fine as is.
Importing to UE4
Only a few things to handle here...
  • Open your UE4 project.
  •  Drag the FBX file you exported INTO UE4's Content Browser.
  • A prompt will pop-up.
  • Mesh > Import Mesh: Yes
  • Mesh > Import as Skeletal: Yes
  • Mesh > Skeleton: None
  • Mesh > Use TPose Ref: Yes
  • Mesh > Normal Import Method: Import Normals and Tangents
  • Mesh > Create Physics Asset: No
  • Animation > Import Animations: Yes
  • Press Import All
  • Check out your imported mesh to see if it's looking right.
Re-Targeting Animations
Make sure you have the free animations pack. If you don't, download it from the Epic browser.
  • Add the animations to your project.
  • Part 1.
    • Open the HeroTPP model.
    • Select its skeleton.
    • Open "Retarget Manager".
    • Manage Retarget Source > Current Skeleton > HeroTPP
    • Set up Rig > Select Rig > Humanoid
    • Save Pose.
  • Part 2.
    • Open your imported model.
    • Select its skeleton.
    • Open "Retarget Manager".
    • Manage Retarget Source > Current Skeleton > None.
    • Set up Rig > Select Rig > Humanoid
    • Save Pose.
  • Part 3.
    • Find/Select the HeroTPP animation you want to steal.
    •  Right Click > Retarget Anim Assets > Duplicate Anim Assets and Retarget.
    • A pop-up will appear.
    • You should see the HeroTPP in the left window.
    • Select the name of your model.
    • You should see your skeleton in the right window.
    • "Select".
    • Check out the new animation applied to your model!
Thoughts

The feet and hands can end up looking pretty f'd up. Also, the neck can get quite stretched out, with the head way in front of the chest and the shoulders pulled back. So lol, yea, be aware of that.

Tuesday, March 24, 2015

Upcoming topics for 2015!

It's been forever!!!

But, I do have some entries I'm planning to write up. They'll be tutorials and how-to guides on a few topics in Unity and UE4.

They will include (at least):
  • Unity C#
    • Creating Event Triggers
    • Creating a hex tile system
    • Dynamically "stamping" to a terrain splat texture
    • Manually positioning a particle system's individual sprites
    • Maybe.... Implementing HTML Logging with screenshot support
    • Infinite Scrolling Parallax Backgrounds using 1-D noise 
    • Using Unity with GIT
  • UE4 C++
    • Melee Combat
    • Dynamic textures affected by gameplay
    • Procedural Mazes
    • Maybe.... Implementing HTML Logging with screenshot support

Monday, March 24, 2014

Another way to post code snippets!

Bam.

A great way to host/post code snippets is to use Github+Gist.

In short, you create a file with some code formatting, and then copy the embed link to post on your blog. It's really simple and really fast.

Here's my steps for getting from A to Z:

  1. Sign up or sign in to github.
  2. Go to the uber top of the page and click "gist".
  3. Write your code into the file.
  4. Select the file type to match your code.
  5. Save.
  6. On the left of the page will be the embed link.
  7. Copy that link and paste it as HTML in your blog post.
That's it.
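For reference, the embed link Gist gives you is a one-line script tag shaped like this (USERNAME and GIST_ID here are placeholders for your own):

```html
<script src="https://gist.github.com/USERNAME/GIST_ID.js"></script>
```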

In Blogger, I don't see the Gist until I "Preview", so keep that in mind.

Wednesday, May 1, 2013

Spidermonkey Nightly [March 1st, 2013]: Windows7 VS2012

I built and packaged the dependencies for playing with spidermonkey (I think it's ionmonkey at the moment) within VS2012 on Windows 7.

I'm not sure if it's a 32-bit or 64-bit build, but I basically built the Firefox nightly from source and then pulled out the Spidermonkey stuff for use.

Hopefully it runs for you right away! Took me a while to get it right:
https://bitbucket.org/brownw1/spidermonkey_vs2012

Thursday, April 18, 2013

Why singletons are "Bad Patterns"

I made it a new year's resolution for 2013 to use more design patterns. I wanted to start making more conscious and judicious design decisions and to stop "slinging code".

So I started using the singleton pattern for things like caches and input in my projects.

Of course, I then found out that these convenient things are "bad practice" and that I should stop using them. I didn't really get why they're bad, though, until I crawled around some Stack Overflow posts on the topic.

What it boils down to is...

  1. Singletons hide dependencies
  2. Singletons make it hard to test
  3. Singletons aren't automatically cleaned up and are thus liabilities
I really do enjoy the concept of keeping things scoped so that they can just get destructed and neatly go away.

That being said, a decent "solution" to the "problem" seems to be to create your potential singleton object high up in the design hierarchy as a scoped object. Then, just have every object below it in the hierarchy that needs it receive a reference to it in their construction.

Doing that, you have a clear dependency, a scoped object, and defined initialization order (I believe that's the right way to say it?).

Maybe you could even define a base class whose constructor takes references to potential singleton resources, like a cache manager or input manager. Then the dependency would just be a normal part of your design.
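The "create it high up and pass it down" idea can be sketched like this (a minimal illustration of constructor injection; the class names are mine, not from any particular engine):

```python
# Sketch: replace a singleton with a scoped object passed down the
# hierarchy. CacheManager, Player, and Game are illustrative names.

class CacheManager:
    """The would-be singleton: just a plain object now."""
    def __init__(self):
        self._store = {}

    def put(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)

class Player:
    # The dependency is explicit: anyone reading this constructor
    # knows Player needs a cache -- nothing hides behind a global.
    def __init__(self, cache):
        self.cache = cache

class Game:
    # The cache lives high in the hierarchy as a scoped object, so it
    # is constructed before its users and goes away when Game does.
    def __init__(self):
        self.cache = CacheManager()
        self.player = Player(self.cache)

game = Game()
game.cache.put("hp", 100)
print(game.player.cache.get("hp"))  # 100
```

Testing gets easy too: hand Player a fake cache in your tests instead of the real one, no global state to reset between runs.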

Monday, March 4, 2013

GLFX Build with VS2012

If you get an error about version 1600 not being 1700 when you use MSVC2012 to build a project with GLFX (those are the _MSC_VER values for VS2010 and VS2012), it means you need to compile GLFX for yourself.

So, here is how I did it (like 1 minute ago).


  1. Grab the glfx source.
  2. Open the vs2010 project in vs2012.
  3. In the solution, right click on the bold project "glfx".
  4. Go to "References".
  5. Note the project referenced, "glew_static".
  6. You'll need to get the glew source and add its project to the glfx solution, then add it as a reference.
  7. So, remove the old glew static reference inside of glfx references.
  8. Add your own, "my_glu\build\vc10\glew_static.vcxproj".
  9. In glfx's references, add the glew project you just added to the solution.
  10. Your build should now work.
  11. Go to glfx/Debug (or Release, if you built that), and grab a copy of the .lib to use in your project.
Whamo, that should do it.

Wednesday, February 20, 2013

iOS Note: Panoramic Hotspots

This is just a note on some of the solutions I've implemented for features on a project we are doing at my job.

So we have a panoramic viewer on the iPad. There is an image mapped to a sphere and a camera in the center of the sphere. You can rotate around and zoom in on things.

The designers wanted the functionality to click on specifically designated areas to trigger events. For instance, touch this painting and you get info on the piece.

In the legacy code I inherited for the project (it was already roughed out as a prototype), these triggers were detected via point-radius calculations: here is an XYZ position, here is its radius, do BLAH when a touch hits within that radius. This works fine to some level of accuracy, but they quickly found that they needed not only more accuracy but also a more flexible way for other designers on the team to author these areas without reaching into the compiled code.

So they asked me, and I said to move the problem from model space to image space. Drop the positions and distances and just work with the pixels since that's what you are interacting with anyway.

My solution was basically to modify the rendering pipeline a bit, and to add a message queue.

For the pipeline, set up framebuffer rendering so that you aren't only writing to the screen but also to an offscreen buffer. Note that this will be a framebuffer, not a renderbuffer. What do we draw to this offscreen buffer? We have the artists/designers copy the panoramic texture and paint, in solid flat colors (no gradients or whatnot), the areas they would like to respond to touch. With 8 bits per RGB channel you get 256³ = 16,777,216 colors. That's over 16 million unique touch areas. Plenty.
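The color-to-area bookkeeping is just integer packing. A sketch (function names are mine): assign each hotspot an integer ID, paint the area with the color that encodes that ID, and decode whatever pixel you sample back into the ID.

```python
# Sketch: mapping hotspot IDs <-> flat RGB colors. With 8 bits per
# channel there are 256**3 = 16,777,216 distinct colors available,
# hence ~16.7 million possible touch areas.

def id_to_rgb(hotspot_id):
    """Pack an integer ID into an (r, g, b) triple for the paint map."""
    assert 0 <= hotspot_id < 256 ** 3
    return ((hotspot_id >> 16) & 0xFF,
            (hotspot_id >> 8) & 0xFF,
            hotspot_id & 0xFF)

def rgb_to_id(r, g, b):
    """Recover the ID from a color sampled out of the offscreen buffer."""
    return (r << 16) | (g << 8) | b

painting_id = 42
r, g, b = id_to_rgb(painting_id)
assert rgb_to_id(r, g, b) == painting_id
print(256 ** 3)  # 16777216
```

In practice the designers just pick distinct flat colors in their paint tool; the table mapping color to event lives in data, not code.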

Now you can just glReadPixels a single pixel at the touch position from the offscreen buffer to get the color they touched.

What I do is, on a touch, I push a "request" for a sampling into queue-A. Queue-A is the "request mailbox", the renderer's "inbox". The renderer checks this while it's submitting render calls, and if it sees any requests in its inbox it calls glReadPixels and puts the result into queue-B, the "response mailbox", the renderer's "outgoing box". The rest of the application checks this outgoing box regularly. It takes any mail from it, which will be an RGB color, and checks a lookup table to see which event to trigger for that color. And that's it.
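The two-mailbox handoff described above can be sketched like this (queue names, the event table, and the read_pixel stand-in are all illustrative; the real thing calls glReadPixels on the offscreen buffer during the render pass):

```python
# Sketch of the request/response mailboxes between the app and renderer.
from queue import Queue

request_mailbox = Queue()   # queue-A: app -> renderer ("sample this point")
response_mailbox = Queue()  # queue-B: renderer -> app (sampled color)

# Data-driven table: flat paint color -> event to trigger.
EVENT_TABLE = {(255, 0, 0): "show_painting_info"}

def on_touch(x, y):
    """App thread: drop a sampling request into the renderer's inbox."""
    request_mailbox.put((x, y))

def renderer_frame(read_pixel):
    """Renderer: while submitting draw calls, service any requests.
    read_pixel stands in for glReadPixels on the offscreen ID buffer."""
    while not request_mailbox.empty():
        x, y = request_mailbox.get()
        response_mailbox.put(read_pixel(x, y))

def app_update():
    """App thread: poll the outgoing box and fire matching events."""
    while not response_mailbox.empty():
        color = response_mailbox.get()
        event = EVENT_TABLE.get(color)
        if event:
            print(event)

on_touch(10, 20)
renderer_frame(lambda x, y: (255, 0, 0))  # pretend the touch hit the painting
app_update()  # prints "show_painting_info"
```

The point of the mailboxes is that glReadPixels only ever happens on the renderer's side of the fence, at a moment it chooses, while the rest of the app stays decoupled from GL.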

The users don't see this rainbow collision map because it's drawn offscreen. You don't have to transform any coordinates because all of that is done in the shaders. The transforms are exactly the same as those applied to the on-screen image, so the collisions match up 1:1 regardless of zoom or whatever features you add. The designers get per-pixel accuracy. It's a solid solution (has been so far, at least).

Caveats include not working in the simulator. I believe this is because the technique is hardware accelerated and thus dependent on, well, hardware, and not just simulated architecture.