This character was created to learn Maya (modelling, UV mapping, rigging and skinning, blend shapes, animation) and Substance Painter + Designer, and to establish the workflow between Unreal Engine and all of the aforementioned tools.
I have been using 3DS Max for a few years, and I don't think I will be using it any more. Maya is really a great tool, and all processes from start to finish are much more streamlined and well beaten down. Not to mention MEL/Python integration, which is years ahead of MAXScript, which is truly awful and should never have existed IMHO. But then again, millions of people use 3DS Max and it obviously has a soft spot in the game development community. In all honesty though, Max is the past and Maya is the future, the way I see it. In any event, I am going to detail the process I went through to create this guy. He was modelled out of my head, following along with a very cool tutorial. I did not use reference images, I simply modelled by eye, as I did not require a specific character, I just wanted to learn how to model one.
A huge Maya bonus is the ability to insert edge loops (yes, Max also has that... but try Maya's... you'll see...), quickly create shelves and custom hotkeys, and you're on your way.
I always start with a cube, but that's just me. Some people like to start with cylinders, and those guys are awesome, but me... I like cubes... they make for good low-poly models. From a gaming perspective, I prefer modelling the low res first and worrying about the high res and details later, as I get a quicker workflow and faster feedback between the game engine and my geometry mistakes. If you have been spending weeks modelling, those are no joke to redo, and if you went ahead and UV mapped, rigged and skinned it, you can basically start from scratch, because making changes to the geometry after you have done all of that is, well... do it and you'll see... NOT FUN.
Anyhooooo... here are some screenshots from the character modelling process.
3DS Max has an awesome stack concept, which works like layers: you stack modifiers on top of your geometry and then apply various processes, like Symmetry, MeshSmooth, Bend, etc. Maya uses history to the same effect. Maya keeps track of most changes you make throughout the modelling process and uses this history as a stack, so you can apply a Bend to a shape, modify the shape some more, apply another modifier, change the geometry, then go back in the history and change the properties of the Bend, and the shape will adjust accordingly. This is very useful.
Maya's symmetry works a lot differently than Max's, but I found some neat tricks via MEL scripting... Maya shows you everything it does when you apply changes. For modelling in symmetry, you 'Duplicate Special' -> Instance, and then scale the copy around itself, so your pivot must always be set correctly, just like in Max. Then you simply combine the meshes when you are done, and if you want to separate them again and work in symmetry, use a simple script to select all the geometry that lies on the other side of the pivot and delete it, Duplicate Special again, and you're on your way. Making scripts for these is the most fun, and makes Maya a real pleasure to use. You do not need to compile the scripts, and you can easily add a script to your shelf for single-click action. This is the process I use to model, because even though symmetry is bad for art, it is great for game models, and really helps you quickly get your character or piece of equipment up and running.
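The delete-half-then-mirror trick is really just plane math. Here's a minimal plain-Python sketch of the idea behind such a shelf script (no Maya API, hypothetical names; a real script would operate on vertex components and weld the duplicated seam):

```python
# Sketch of the symmetry workflow: keep one half of the mesh, then
# regenerate the other half by mirroring across the pivot's YZ plane.
# Plain Python stand-in for a Maya MEL/Python shelf script.

def delete_negative_half(verts, pivot_x=0.0, eps=1e-6):
    """Keep only vertices on or right of the pivot plane (x >= pivot_x)."""
    return [v for v in verts if v[0] >= pivot_x - eps]

def mirror_half(verts, pivot_x=0.0):
    """Mirror vertices across the plane x = pivot_x
    (like Duplicate Special with scale X = -1 around the pivot)."""
    return [(2.0 * pivot_x - x, y, z) for (x, y, z) in verts]

verts = [(1.0, 2.0, 0.0), (-1.0, 2.0, 0.0), (0.0, 5.0, 1.0)]
half = delete_negative_half(verts)   # drop the x < 0 side
other = mirror_half(half)            # rebuild it as a mirror copy
full = half + other                  # combine (seam verts would be welded)
```

In Maya the same two steps map onto deleting the mirrored components and running Duplicate Special again, which is why a one-click shelf button for it pays off so quickly.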
The next step was UV mapping. For this process in Max, with characters, I usually use pelt mapping... and cry a lot. Maya has a similar concept, but they call it 'Unfold' and 'Smooth' UVs. This process is a pleasure and will not give you nightmares... literally... As long as your seams make sense, and logically they will if you think about how you would sew and weld something that should stretch a little, it's not so hard. You also do not have to worry about the ring stretcher that Max uses for pelt mapping, as Maya knows where stuff should stretch to based on your seams, and if all else fails, you simply tell Maya to 'figure it out for yourself' and Auto Unwrap, which usually works fine for most non-organic objects, and even some organic ones.
Here are some of the screenshots from the character UV mapping process, also done in symmetry; the UVs are then simply mirrored in Maya's UV tool, which is like Max's UV tool, only it makes sense, and does not make you cry and suck your thumb in the shower, rocking back and forth slowly while wondering 'why... why... why are those UVs so messed up'.
Side NOTE: His eyes and teeth (barely visible) are attached to the character mesh and share the SAME UV SPACE, but obviously have different materials assigned to them. In Substance Painter and Unreal this works perfectly fine, and you do not need to lose sleep over keeping separate UV mappings for your head and EYES etc.
In Max I would usually use a MultiMaterial and set the polygons' material IDs, and that would work fine, but it's a hell of a schlep and seriously wastes time.
In Maya, you select your stuff, place it where you want it, and you're done. Literally that simple.
My HEAD, EYES and TEETH are all separate UV shells, nicely packed and packaged for UV space consumption, but they SHARE the same UV space as the body. So when you select the mesh and view the UVs, everything is a mess... but when you select the polygons and fill by UV seam, it all makes sense. Thanx Maya.
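The UV mirroring mentioned above works on the same principle as mirroring geometry, just in 2D. A plain-Python sketch of the idea (hypothetical names, seam assumed at u = 0.5; Maya's UV editor does this when you flip the duplicated half's shells):

```python
def mirror_uvs(uvs, seam_u=0.5):
    """Flip a UV shell across a vertical seam line (u = seam_u),
    producing the mirrored half's UVs from the finished half."""
    return [(2.0 * seam_u - u, v) for (u, v) in uvs]

left_shell = [(0.1, 0.2), (0.4, 0.9)]
right_shell = mirror_uvs(left_shell)   # the mirrored side lands at u = 0.9 and 0.6
```

The u coordinates reflect around the seam while v stays put, which is why mirrored halves can either stack on top of each other (shared texel space) or sit side by side, depending on where you place them afterwards.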
The character below looks really smooth because Maya has a cool view called 'Smooth Preview', which, unlike Max's MeshSmooth modifier, does not alter the actual geometry of your mesh, and allows you to work on the low-poly level while getting a pretty accurate (but not 100%) idea of what the mesh will look like if you subdivided it now with Maya's Smooth Mesh tool.
Once the UV mapping was completed for the base character, I decided he was going to need some clothes. I do not like to model clothes and equipment as part of a single mesh, as that is too rigid and binding; characters must be able to equip stuff and have different hair, clothes, etc. More work, yes... but in the long run, much more fun and pleasant when he needs to change his shorts.
Before I went modelling his clothes, I thought I would create some clothing using Maya's cloth tool, integrate it with Apex Clothing, and import it into Unreal Engine... BOY was I WRONG... Even just Apex Clothing on its own, the NVidia plugin, was proving to be (if you watch cricket) a Test match, and not a one-day game. So no thank you... until that process evolves so that normal people can use it, people who have actual day jobs, drink water and breathe air... hell no... waaaayyy too much work, as if what we are doing here is not enough work already...
So I modelled and UV mapped (as separate meshes) some clothing: a shirt, pants and boots. Once again, these are modelled as low poly, and then a 'Smooth Mesh Preview' is applied, so the geometry is not affected in any way and the models are easier to work with.
The next step was to add a jacket, and I wanted to have a wind breaker type jacket, so I modelled and UVMapped one.
Now for the hair... This process took a while, because I was under the impression I could use Maya's hair tool, integrate it with NVidia's HairStudio, and then go into Unreal Engine... BOY was I WRONG... go Google that... enjoy... worst week of my life... So what normal people do for game character hair is model it. So I did.
Then I thought, OK, I will use the polygon strand technique, and holy crap, what a ball drag... The polygons eventually intersect, because you want your guy not to look like a vulture with 5 strands of hair and his head showing everywhere... The intersecting polygons are a HUGE problem, even in Max or Maya, as they don't know which polygon to render first, so you get see-through polygons, where you see the back face of the hair in between the front faces, and then it looks like a vulture crapped out last night's dinner on your beautiful head of hair. Export that to Unreal, and there is no difference, as it is a mathematical/shader sorting problem, not a 'quickly Google it and find a solution' problem. Second worst week of my life.
Then I had a brain fart and just decided to go with what I know... CUBES.
Also, I created his brows. His brows are not connected to the hair mesh; they are all separate, so he can have different brows if required.
I started with a simple cube, I love cubes... and then simply used Maya's deformation tools to deform the hair, using my Wacom Bamboo tablet. Not a Cintiq... a cheap $100 tool that works perfectly fine. It has made and moved mountains in Unreal and Max, so why the hell not hair?
So he is not going to star in Timotei's Natural Bouncy Hair commercial... big whoop... he might blow your ass away with a bubble gum gun... Timotei... so beware... getting bubble gum out of natural bouncy hair is like figuring out how Apex Clothing and Hair Studio work...
Aaaaaand here are the results.
The next step was giving him a hat. I modelled my best cowboy hat by eye; who needs reference materials? I don't really care, but I do care that it must at least look like a cowboy hat, and it does pass, somewhat... Once again, this is only for me, and for working out how all of this will go into Substance Painter/Designer for texturing and Unreal Engine for actual use.
Once his clothing and attire were modelled and UV mapped to a respectable degree, I started the texturing process. For this I used Allegorithmic's (wow, spelt it right the first time... that's a first... silly syllables) Substance Painter (first-time user), and then I knew I wanted to import the result into Substance Designer, as Unreal Engine supports .sbsar files via their plugin (long story). That way I do not have to export and import tons of base, normal, roughness and metallic PNGs (or whatever the hell the format must be now) into Unreal and set them up for PBR manually using the Material editor... That would just take forever, and programmers are lazy. But game developers establish workflow, package and compress, publish, import and export their way through a development process, with the emphasis on PACKAGE, and if there is one thing .sbsar does well, it is package and compress.
Substance Painter is like strolling through a breezy field on a hot summer day. You open it up, you import something, and you paint... You paint some more, and it doesn't crash, like MudBox, only it works... sorry MudBox, you're a dipshit, like the cousin nobody talks about coz he drools when he talks (sorry if that's your cousin, or you... less sorry if you love MudBox). But first, like I always do, I spent a week watching tutorials from their site and learning as much as I could. Truth be told, the product is a breeze if you've used MudBox, and if you haven't, don't... Sorry MudBox... daddy still loves you... NOT... OK OK OK...
The only thing that irked me a little was the quick masks in Painter. They did not work so well in geometry mode, and that's fine, I got a good workaround by simply selecting the geometry and painting on it... yes, you heard me... SELECT geometry, then FILL it... So you don't have to spend another lifetime on your UV mapping process creating separate materials, separate UVs and God knows what else; you can paint by geometry, and very very easily as well... without quick masks. I also found the whole internal material ID thing that Painter and Designer use to be very confusing, and it made me cry... so I did not use it. I think that process needs more time in the shop... anyhooooo.
Here are my UV Maps. Nothing special.
I quickly had to make sure the meshes looked OK in Unreal Engine, so I applied the mesh smooth and imported a LOW and HIGH poly version of the same meshes next to each other, combined in the FBX export, so I could view everything in one go. No need to remove any clothing as yet, as he is not skinned yet.
And now for the actual Texturing Process, first time using Painter and Designer.. Time of my life…
Here are some screen shots of Substance Painter in action.
Substance Painter then exports these as PNG images, or whatever format you choose. I use the BaseColor, Normal, Metallic, Roughness PBR workflow, as that is what Unreal Engine uses for its physically based rendering, and more importantly, because that's what the documents say.
Here you can see Substance Designer in action. I am not even using a fraction of Substance Designer's potential, as it can be used for degradation and wear effects over time, and all sorts of fancy things. In my case, all I want is to export a single file, import a single file, and have Unreal show me the material. Which it does, brilliantly. Just use the binary version of Unreal, as otherwise you need to clone Allegorithmic's GitHub source, and I don't like that, so now I am using Unreal's binary engine, and no longer compiling from source, to get the plugin to go with it. Maya apparently also supports sbsar natively, but I could not get it working, so I don't care. It only has to work in UE4, nowhere else.
It works and looks almost exactly like UE4's material editor, because that is exactly what it is: you are visually creating/editing/compositing HLSL shader code in the background, without touching a single shred of C++/HLSL mix-and-match quadrollox... works here but not there, works on this card but not on that one, works everywhere except here, on Sundays... I don't miss those days of coding HLSL, I can tell you that.
First, I tested the materials in Maya. Maya did not support sbsar for me (I tried the plugin, blah blah...), so the textures are simply base colour and look like crap, but it really does not matter, as the textures are only there for deformation testing in Maya. They have to look the way I painted them in UE4, which is the most important part.
Here are the materials in Maya, simply using only the base color images. Nothing special
The next logical step was to pull everything I had done into UE4. I obviously tested the meshes before texturing, but I will include those screenshots as well. This was simply a static mesh that does not move, but has PBR materials assigned to it that look exactly like I painted them, and that is the MOST important part of this whole process. If it looks different to when you painted it, like MUDBOX... sorry MudBox... then it sucks... So Painter and Designer DO NOT SUCK.
The next step in this process was to create a skeleton, bind/rig the mesh to it, and then 'skin' the mesh by weighting the vertices to the bones/joints, so that as the bones/joints move, the vertices use these weights as influences to determine how much to transform along with each bone or joint, i.e. move along with the bone.
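The weights-drive-transforms idea above is linear blend skinning: the skinned position is the weighted sum of the vertex transformed by each influencing joint. A minimal plain-Python sketch (2D rotations only, hypothetical names; real skinning uses full 4x4 joint matrices):

```python
import math

def skin_vertex(vert, influences):
    """Linear blend skinning for one vertex.
    influences: list of (weight, angle_radians, joint_pivot) tuples,
    each joint modelled here as a simple 2D rotation about its pivot."""
    x_out = y_out = 0.0
    for weight, angle, (px, py) in influences:
        # transform the vertex by this joint (rotate around its pivot)
        dx, dy = vert[0] - px, vert[1] - py
        rx = px + dx * math.cos(angle) - dy * math.sin(angle)
        ry = py + dx * math.sin(angle) + dy * math.cos(angle)
        # accumulate, scaled by this joint's influence weight
        x_out += weight * rx
        y_out += weight * ry
    return (x_out, y_out)

# A vertex weighted 50/50 between a joint bent 90 degrees and one at rest:
# it blends the fully-bent position (0, 2) with the rest position (2, 0).
v = skin_vertex((2.0, 0.0), [(0.5, math.pi / 2, (0.0, 0.0)),
                             (0.5, 0.0,         (0.0, 0.0))])
```

This is also why weights per vertex must sum to 1: anything else scales the blended position and the mesh collapses or explodes around the joint.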
3DS Max has a very nice feature for rigging: by clicking a single button you can get a Biped skeleton and simply adjust the bones by moving and scaling them to where you need them in your mesh. Maya does not have this natively (maybe via a plugin or script), so creating joints manually is a bit of a process, but it does give you a lot more control over your bone orientations. I absolutely did not do any IK/FK handles, as I am a human being, I have a day job, and this is only for a simple animation test in UE4; I do not need IK/FK for hitting the floor and moving this way or that way. In real life I would create those, but not today...
Once rigged, I always create a test animation using only the bones, moving them in all directions wherever possible, usually around 400 frames, which includes moving the hips, legs, feet, arms, wrists, fingers, toes, head, neck and jaw. This is the animation that I export along with the mesh so I can see the results in Unreal Engine.
Now for weighting. This is by far the most difficult and tedious process of all, as it involves a lot of eye strain, fidgeting crap-loads with your tablet, and scribbling till your fingers hurt; sometimes there is simply no sensitivity getting picked up. I remembered that in Max I always used the Weight Table and edited the verts manually. Sounds crazy? It is actually a lot quicker, and more mathematical than anything else, so you can establish some sort of workflow where you can continue where you left off, and you do not have to rely on weird colour schematics for whether this vert is red/yellow/green/black or blue. You can also hide the joints; you don't ever need to select one while doing this.
You simply select the vert and edit the table; any influences are shown in the table. The trick is, if you want a vert to have an influence it does not have, you simply select another vert that is under that bone's influence, and tadaaaa... A week has passed... This is only done on the low-poly mesh; you still have to duplicate this mesh, copy the weights over, LOD group them, and then fix up any errors in the high-res mesh. Still less work than skinning the high-res mesh directly, as there you would be editing hundreds of thousands of verts versus only a few thousand, plus tweaking... mathematically speaking, of course...
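The weight-table editing described above is just arithmetic on rows that must sum to 1, which is exactly why it feels more "mathematical" than painting. A plain-Python sketch of one such edit (hypothetical names; Maya's table renormalizes the other influences for you in much the same way):

```python
def set_influence(weights, joint, value):
    """Weight-table style edit: set one joint's weight on a vertex, then
    rescale the remaining joints so the row still sums to 1."""
    old_rest = sum(w for j, w in weights.items() if j != joint)
    scale = (1.0 - value) / old_rest if old_rest > 0 else 0.0
    new_weights = {j: (value if j == joint else w * scale)
                   for j, w in weights.items()}
    new_weights[joint] = value   # also handles adding a brand-new influence
    return new_weights

# Give the head joint some pull on a vertex it previously didn't influence;
# the remaining 0.5 is split 0.3 / 0.2, preserving the spine:neck ratio.
row = {"spine": 0.6, "neck": 0.4, "head": 0.0}
row = set_influence(row, "head", 0.5)
```

Because the other influences are rescaled proportionally, you can tweak one number per vertex and trust the rest of the row to stay sane, which is what makes the table workflow resumable.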
And then finally it comes together. The mesh itself will need some tweaking, the skinning too... but for this demo I am happy with the result, as it told me everything I needed to know about working between Maya, Painter/Designer and UE4.
Here are some screenshots of the animation. The weighting was done using symmetry, so after I am happy with the left side of my model, I copy the weights over to the right, and make sure to have animations on both sides of the mesh to check for any funnies... usually there aren't any. The middle vertices are the easiest to skin, as they are usually skinned to centred bones, like the neck, pelvis and spine joints.
The next step in the process is blend shapes. Usually you should do your blend shapes first, because the order in which these deformations are processed matters. Blend shape deltas are authored in the rest pose, so they must be applied before the skin cluster; if the skin transforms the vert first and the blend shape delta is added on top afterwards, the delta ignores the joint's transformation matrix and the blend shape is distorted... the matrix will get you. The fix is to ensure your skeletal mesh skin is the first item in the input connection hierarchy (Maya evaluates the inputs list from the bottom up, so the skin sits at the top). You do this by selecting the mesh, opening the input list, and middle-mouse dragging the little fokker to where he needs to be... Then the vert is first moved along the blend shape and then transformed by the matrix of the joint, so the shape respects its orientation. Anyhoooo...
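The order problem above can be shown in a few lines. This is a minimal plain-Python sketch (2D, one joint as a rotation about the origin, hypothetical names) contrasting the two evaluation orders:

```python
import math

def rotate_z(p, angle):
    """Rotate a 2D point about the origin (stand-in for a joint transform)."""
    x, y = p
    return (x * math.cos(angle) - y * math.sin(angle),
            x * math.sin(angle) + y * math.cos(angle))

rest_vert = (1.0, 0.0)
delta = (0.0, 0.25)          # blend-shape delta, authored in the rest pose
bend = math.pi / 2           # the joint is rotated 90 degrees

# Correct chain: blend shape first, then skinning.
# The delta gets carried along by the joint, so the shape follows the pose.
morphed = (rest_vert[0] + delta[0], rest_vert[1] + delta[1])
good = rotate_z(morphed, bend)                 # lands at (-0.25, 1.0)

# Wrong chain: skin first, then add the rest-pose delta on top.
# The delta ignores the joint rotation, so the shape distorts.
bad = rotate_z(rest_vert, bend)
bad = (bad[0] + delta[0], bad[1] + delta[1])   # lands at (0.0, 1.25)
```

Same delta, same joint, different results: in the wrong order the delta still points "up" in world space even though the joint has rotated, which is exactly the distortion you see on a posed face.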
I created a few simple blend shapes, and like a noob, deleted the target meshes. After watching a great tutorial, I learnt to keep the target meshes, as you can easily reapply them and even see the results in real time. I created some eye blink shapes and simple mouth shapes, just for fun, and a smile... of course... I also morphed the brows, and figured out a neat way to drive the separate brow mesh's morph using the 'Set Driven Key' method: as the brows of the character mesh morph, they also drive the key for the separate brow mesh. The net effect is that you only need to morph the character mesh's brows, and the separate brow mesh will morph along. This also applies to animation, where the keys are stored at specific time frames and interpolated in between. So I altered the animation I had and created keyframes for the blend shapes, so I can see his mouth, eyes and eyebrows move while the character itself is animating in his silly little dance.
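Under the hood, Set Driven Key is just "driven value = a curve sampled at the driver's value". A minimal plain-Python sketch with linear interpolation (hypothetical names; Maya also offers spline tangents between keys):

```python
def driven_value(driver, keys):
    """Set Driven Key in miniature: given (driver, driven) key pairs,
    interpolate the driven attribute from the driver's current value."""
    keys = sorted(keys)
    if driver <= keys[0][0]:     # clamp below the first key
        return keys[0][1]
    if driver >= keys[-1][0]:    # clamp above the last key
        return keys[-1][1]
    for (d0, v0), (d1, v1) in zip(keys, keys[1:]):
        if d0 <= driver <= d1:
            t = (driver - d0) / (d1 - d0)
            return v0 + t * (v1 - v0)

# The character mesh's brow morph weight (0..1) drives the separate
# brow mesh's morph: animate one attribute, both meshes follow.
brow_keys = [(0.0, 0.0), (1.0, 1.0)]
half = driven_value(0.5, brow_keys)   # separate brows follow halfway
```

Timeline animation works the same way, with time as the driver, which is why the driven brows also follow during the keyframed dance.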
Maya also gives you a nice little UI to control your character's blend shapes. Some artists love to create their own UIs and handles for that in Maya, but I did not require a UI for this demo, as it needs to go to Unreal and teach me the workflow between these systems.
Here are the results in Maya. Then onto the final section, and the whole reason for this page… Unreal Engine.
I wanted the FBX to store the LOD information, rather than having to export 2 meshes for every single component, so I duplicated the low-poly mesh, rigged and skinned it to the same skeleton, and copied the weights from the low poly to the high poly using Maya's copy weights tool. It worked great for the most part; I had some internal overlapping verts in the mouth, so I had to fix those manually using the Weight Table tool. All in all it could use more touch-up. For a production asset I would spend a lot more time on the mesh and the various processes, but this is a demo, for myself, and that is the most important part: to learn how to do it, not to bend it like Beckham.
Here are some shots of the LOD setup, look at the other viewports, and you will see the geometry change to different scales.
The above meshes were exported to FBX and imported into Unreal Engine, and materials assigned. I also created a low-res material, which only provides colour, and assigned that to the LOD1 mesh. Basic, simple, easy stuff.
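The LOD group setup above boils down to a simple rule the engine evaluates per frame: past a given distance (or below a given screen size), swap to the next mesh. A plain-Python sketch of that selection logic (hypothetical names and thresholds; UE4 actually uses screen size rather than raw distance):

```python
def pick_lod(distance, thresholds):
    """Distance-based LOD pick: thresholds[i] is the distance at which
    LOD i+1 takes over from LOD i."""
    lod = 0
    for t in thresholds:
        if distance >= t:
            lod += 1
    return lod

# LOD0 (high poly, full PBR material) up close,
# LOD1 (low poly, colour-only material) beyond 1500 units.
near = pick_lod(300.0, [1500.0])
far = pick_lod(4000.0, [1500.0])
```

With the high and low poly meshes in one FBX, the engine only needs these thresholds to do the swapping; nothing per-frame is authored by hand.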
To get the setup I wanted in Unreal Engine, I created a C++ Actor and provided him with an inventory, which is merely a Blueprint-editable TArray of type USkeletalMesh. Meaning, I can put stuff in his inventory from the Blueprint editor, and in-game.
The next step was to create USkeletalMeshComponents in C++ and give them meaningful variable names, like shirt, boots, etc. These contain no mesh at all by default; they are meant to be equipped from the inventory by simply calling a C++ method, exposed to Blueprint, that sets the skeletal mesh on that component. The only trick was to be sure to call SetMasterPoseComponent on each and every USkeletalMeshComponent from the constructor, so that all of his equipment follows the transformations of the main mesh, which is exactly what equipment should do.
Here is a shot of the header file.
And the CPP
Then I created a Blueprint instance inside the editor based on this actor, and made sure SetMasterPoseComponent is called for each of the equippable slots.
For the mesh component itself, I edited the Blueprint, assigned the MiniMan mesh to the SkeletalMesh component, and set his animation to play my imported animation of him moving all his joints, which includes the morph target animation as well.
Then I created a simple loop inside the Blueprint BeginPlay function that randomly calls a C++ function, exposed to Blueprint, to equip a random item from the inventory. The Blueprint can also decide to play and stop the currently playing animation. This was the whole reason for this demo: to have my character animate and equip stuff, with his equipment moving along with him, which also includes morph targets and hair...
And you can also see the LOD working in UE4.
Some in game screen shots, of him doing his thing.
And if you think this was a lot of work, you will never be a 3D game developer, as the above requires a lot more work, before it will be considered “Production Ready”.
If you got this far… May the Force be With You.