We used Blender exclusively for 3D modeling of individual scenes and props, as well as UV unwrapping. The actual materials were created with the awesome Substance Painter software. Rather than attaching materials directly in Blender, we had a custom configuration-based system that allowed us to write material data in .ini files, letting us change properties such as textures, tints, and brightness on the fly without having to reopen Blender. To get things into our game, we used ASSIMP for model loading and SOIL2 for texture loading.
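As an illustration, a material loader in this style can be as simple as the sketch below; the keys and file layout shown are hypothetical, not our exact format:

    // Hypothetical material .ini loader; ignores [sections] and comments for brevity.
    // Example file contents:
    //   diffuse=bridge_diffuse.png
    //   tint=1.0,0.8,0.8
    //   brightness=1.2
    #include <fstream>
    #include <map>
    #include <sstream>
    #include <string>

    std::map<std::string, std::string> LoadMaterialIni(const std::string& path) {
        std::map<std::string, std::string> props;
        std::ifstream file(path);
        std::string line;
        while (std::getline(file, line)) {
            std::istringstream ss(line);
            std::string key, value;
            if (std::getline(ss, key, '=') && std::getline(ss, value))
                props[key] = value;  // re-read the file at runtime to hot-reload
        }
        return props;
    }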
We also used Blender to stitch together the individual scenes and props, and to give them the metadata that our ActivatorRegistrator system (described above) took advantage of.
[Animation, Matt] Models were rigged and animated in Maya, then exported as FBX files. In our code we loaded the animation files with Assimp, but we had to compile it ourselves rather than using NuGet's Assimp package, which could not load FBX files. I think we got the latest version of Assimp from their repo (https://github.com/assimp/assimp), using CMake to generate our Visual Studio project files. For some reason CMake didn't work the first time I tried, but it did another day; I don't know why. Afterward I parsed the Assimp scene into my own AnimatedModel/AnimationPlayer/AnimationMesh classes to handle multiple animations, because I don't think you can put multiple animations in one FBX without some manual delineation of which seconds belong to which animation.
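For reference, a minimal sketch of the Assimp side of this (the AnimatedModel/AnimationPlayer/AnimationMesh classes above are ours, not Assimp's; this just shows how the animation data is exposed):

    // Sketch: enumerating the animation data Assimp exposes from an FBX.
    #include <assimp/Importer.hpp>
    #include <assimp/postprocess.h>
    #include <assimp/scene.h>
    #include <cstdio>

    void PrintAnimations(const char* path) {
        Assimp::Importer importer;
        const aiScene* scene = importer.ReadFile(path,
            aiProcess_Triangulate | aiProcess_LimitBoneWeights);
        if (!scene) {
            printf("Assimp error: %s\n", importer.GetErrorString());
            return;
        }
        // With one clip per FBX, scene->mAnimations[0] is the whole animation;
        // each aiAnimation carries its own duration and tick rate.
        for (unsigned int i = 0; i < scene->mNumAnimations; ++i) {
            const aiAnimation* anim = scene->mAnimations[i];
            printf("%s: %.1f ticks at %.1f ticks/sec\n",
                   anim->mName.C_Str(), anim->mDuration, anim->mTicksPerSecond);
        }
    }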
For modeling and animations, everything was done in Maya 2015 and the student version of Maya 2016, and exported as OBJ and FBX files. Everything was first tested in Open 3D Model Viewer to make sure models and animations looked good before going to the graphics team. Ultimately, all model assets used were in FBX format. To get the textures for more complex and custom designs, such as the character model and the deerstalker, we created automatic UV maps and took UV snapshots in Maya. The UV snapshots were saved as PNG files and imported into Photoshop for painting before being imported back into Maya as color texture files. We learned that transparency in the texture file might make the model transparent and awkward, so future artists, be careful! To model the character from 2D images, Tiffany drew orthographic views of the character (front and side) and imported them into Maya on intersecting planes. The intersecting planes act as guides for 3D modeling. Animations were done with keyframes, with the first and last frames the same for looping animations such as running and punching. Maya has a great rigging tool called HumanIK, which automatically generates a skeleton and an advanced rig for you; the rig makes the animating process smoother. However, we had problems getting the HumanIK rig to load in Assimp, so it's safer to manually create a skeleton with the Joint Tool first. We were able to get animations with just the skeleton and no IK handles, though it does require a bit more attention to detail (i.e., how would the rest of the body move with a lowered arm). Before animating, ensure the skin is bound to the right joints by checking the Paint Skin Weights Tool. Maya is a very advanced tool with a steep learning curve that caused many frustrations, but because it's so widely used in the industry, the suffering was worth it! I expect a similar experience with 3ds Max and Blender, so there's no escape either way! [Tiffany]
[Lauren] All the models were created using Maya. Once the models were finished, they were exported as .obj files and sent to Thomas to be imported into the game. Keeping the files small was important because if they were too big, the initial startup of the game would slow significantly. Because of this, the models couldn't be very smooth, so we took a more angular design approach.
[Thomas] We did not use a library to load meshes, instead opting to code that manually. Textures were loaded using the SDL2_image library, version 2.0.1, so that we could support any image file format Lauren wanted to use without any difficulty. It's worth noting, however, that the textures loaded in this manner are flipped vertically when sent to OpenGL; in order to work around this, our shaders used 1-y for all texture lookups. While meshes and material data were loaded from the Wavefront OBJ format, we only supported use of a single material in a given model. In addition, we only supported meshes made entirely of triangles; we could not find any setting in Maya which would automatically triangulate meshes, so this required the additional step of manually triangulating each mesh before exporting.
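A rough sketch of that texture path (function name and details here are illustrative; error handling is trimmed):

    // Sketch: loading any image format via SDL2_image and handing it to OpenGL.
    // Assumes tightly packed RGB/RGBA pixels; the surface arrives top-to-bottom,
    // which is why the shaders sample with vec2(uv.x, 1.0 - uv.y).
    #include <GL/glew.h>
    #include <SDL_image.h>

    GLuint LoadTexture(const char* path) {
        SDL_Surface* surface = IMG_Load(path);
        if (!surface) return 0;

        GLenum format = (surface->format->BytesPerPixel == 4) ? GL_RGBA : GL_RGB;
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // RGB rows may not be 4-byte aligned
        glTexImage2D(GL_TEXTURE_2D, 0, format, surface->w, surface->h,
                     0, format, GL_UNSIGNED_BYTE, surface->pixels);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        SDL_FreeSurface(surface);
        return tex;
    }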
For our game, we didn't have to support animation, since all of our animation comes from the physics library as pieces fall apart. Therefore, we only needed to worry about loading models and applying the correct textures to them. We used an open-source loader called tinyobjloader to load our .obj models, and SOIL to load all the image assets such as texture maps, normal maps, and metallic and gloss maps. Then we used shaders to apply the effects. One problem we faced was using the metallic and gloss maps correctly; due to time constraints and problems with the shader and the maps themselves, we ended up not using them at all.
I would suggest using .mtl files first. The properties contained in .mtl files are really useful: things like the object's diffuse, specular, and ambient colors make writing shaders much easier.
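As a minimal sketch of that, assuming tinyobjloader's v1.x API (the paths here are placeholders), the .mtl properties come back ready to hand to a shader:

    // Sketch: tinyobjloader (v1.x API) parses the .mtl next to the .obj for free.
    #define TINYOBJLOADER_IMPLEMENTATION  // in exactly one .cpp file
    #include "tiny_obj_loader.h"
    #include <cstdio>
    #include <string>
    #include <vector>

    void LoadModel() {
        tinyobj::attrib_t attrib;
        std::vector<tinyobj::shape_t> shapes;
        std::vector<tinyobj::material_t> materials;
        std::string warn, err;

        if (!tinyobj::LoadObj(&attrib, &shapes, &materials, &warn, &err,
                              "assets/piece.obj", "assets/")) {
            printf("%s\n", err.c_str());
            return;
        }
        for (const auto& m : materials) {
            // Ka/Kd/Ks from the .mtl, ready to upload as shader uniforms.
            printf("%s: diffuse (%f, %f, %f)\n", m.name.c_str(),
                   m.diffuse[0], m.diffuse[1], m.diffuse[2]);
        }
    }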
[Alex] Modeling was done with Blender 2.66, though Maya or 3ds Max would've worked just as well. When the model was done and textured, Blender was used to export the .obj file and the .mtl file. In the final version, we used Blender to export the .obj file and four .pngs corresponding to the model's diffuse, specular, metallic, and normal maps. The diffuse images were taken from the internet, and CrazyBump was used to generate the specular, metallic, and normal .pngs.
We used 3ds Max for modeling and Assimp to load 3D models and textures. To load animations, we found an MD2 loader through Google: http://www.flipcode.com/archives/MD2_Model_Loader.shtml. This loader has a very simple interface containing only three functions: load, setframe, and draw. Load reads the animation file, setframe selects a specific frame, and draw renders it. One problem with the MD2 file format is that it doesn't give us the normal of each vertex, which means we had to use the normal of the triangle instead; this loader also doesn't support textures. Although we ultimately used this MD2 loader, we also found another helpful library for animation loading that does support textures and vertex buffers: https://tutorialsplay.com/opengl/2014/09/17/lesson-11-loading-quake-ii-md2-models/
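The interface described above amounts to something like this (the declarations are illustrative, not the loader's exact code):

    // Illustrative shape of the loader's three-function interface:
    class MD2Model {
    public:
        bool Load(const char* path);  // parse the .md2 keyframe data
        void SetFrame(int frame);     // choose which keyframe to render
        void Draw();                  // submit the geometry for the current frame
    };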
http://ogldev.atspace.co.uk/www/tutorial38/tutorial38.html. This tutorial teaches how to load an MD5 (skeletal) animation using Assimp.
David Wang: Unless you have someone as good as Wes at importing FBX files, I would probably stay away from that format if you're working in Maya. I found a way to export MD5 files from FBX files in Blender with a Python script, but that takes a little searching. I remember MD5 being a much easier alternative, though by the time we figured that out, Wes had already solved the FBX issue. .dae files cause bone issues in Maya, and .obj files are for static meshes only. If you do plan to export FBX files, animate in Maya LT and use its Game Exporter. From there you're able to export multiple "takes" for one model in one FBX file, which saves memory and load time at the beginning of the game. Running "select -hi" in Maya LT's script editor selects all of the bones at once for keyframing. You're welcome, future animators.
Wesley Lau: For graphics, animation was a pain to figure out. I first had a model loader that got static models loading fine, but extending it to animation was the frustrating part. There is not a lot of documentation out there, but perhaps I will post the code for our animation loader online if any future 125 students want to see how our pipeline (code-side) worked. Here are the steps I took after receiving the FBX model file from David:
1. Using ASSIMP, load all the mesh, bone, and animation data.
2. Start by drawing the initial mesh (the bind pose).
3. Take the bone matrices of each mesh and apply the necessary calculations to animate them while running an animation timer.
4. Draw onto the screen using a VAO or vertex3 calls.
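Step 3 is the tricky part. A simplified sketch of the per-bone matrix pass, following the approach from the ogldev tutorial linked earlier (keyframe interpolation omitted, and assuming node names match bone names):

    // Simplified per-bone matrix pass over the Assimp node hierarchy.
    #include <assimp/scene.h>
    #include <map>
    #include <string>

    struct BoneInfo {
        aiMatrix4x4 offset;          // from aiBone::mOffsetMatrix (mesh -> bone space)
        aiMatrix4x4 finalTransform;  // what gets uploaded to the vertex shader
    };

    void ReadNodeHierarchy(const aiNode* node, const aiMatrix4x4& parent,
                           const aiMatrix4x4& globalInverse,
                           std::map<std::string, BoneInfo>& bones) {
        // A real player swaps node->mTransformation for the transform
        // interpolated from the matching aiNodeAnim channel at the current time.
        aiMatrix4x4 global = parent * node->mTransformation;

        auto it = bones.find(node->mName.C_Str());
        if (it != bones.end())
            it->second.finalTransform = globalInverse * global * it->second.offset;

        for (unsigned int i = 0; i < node->mNumChildren; ++i)
            ReadNodeHierarchy(node->mChildren[i], global, globalInverse, bones);
    }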
The characters were modeled using Maya (1,000-8,000 triangles was the range for the characters, but most averaged around 2,000 polys). A combination of reducing triangles and smoothing normals in Maya allowed us to have high-res-looking models with a low poly count. The models were then exported to ZBrush for texturing and UV mapping. ZBrush UV-maps a model at the press of one button; from ZBrush it was exported back into Maya just to make sure everything mapped correctly, then exported in .obj or .fbx format. ZBrush is amazing, but it is usually used for movie modeling where polycount doesn't matter, so that was a huge learning process; still, it's awesome to paint your models with. The 2D graphics were all done in Photoshop. Other textures were found online with corresponding bump maps.
For loading meshes and models, we used both an OBJ loader and an FBX loader. The OBJ loader simply amounts to code that counts up the number of vertices, normals, and texture coordinates, then fills in arrays with their data while reading down the file (based on whether the line starts with v, vn, or vt). The last part of the OBJ file is the faces, each of which has three points. Each point is in the format v/vt/vn, where v is the index of the vertex, vt the index of the texture coordinate, and vn the index of the normal. Since these indices start at 1, you should subtract 1 from each value when referencing your arrays (see the sketch after this paragraph). The FBX loader came from code we found for the FBX format. The logic is similar to an OBJ loader's, except it has functions for getting the index of a polygon and then the vertex and vertex normal for that polygon. We added code to calculate the center of the model and adjust the points so that they were centered at 0, but other than that, no real troubleshooting was done. While the OBJ loader was really straightforward, we wouldn't recommend the FBX file type, since it contains quite a lot of extra information, and we had one file that for an unknown reason took longer to load even though it was smaller and simpler than the other models.
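Here is a minimal sketch of that OBJ-parsing approach, assuming fully triangulated faces with all three indices present (v/vt/vn):

    // Minimal OBJ parser in the style described above.
    #include <cstdio>
    #include <cstring>
    #include <vector>

    struct Vec3 { float x, y, z; };
    struct Vec2 { float u, v; };

    bool LoadObj(const char* path, std::vector<Vec3>& pos,
                 std::vector<Vec2>& uv, std::vector<Vec3>& nrm) {
        FILE* f = fopen(path, "r");
        if (!f) return false;

        std::vector<Vec3> vs, vns;
        std::vector<Vec2> vts;
        char line[512];
        while (fgets(line, sizeof(line), f)) {
            if (strncmp(line, "v ", 2) == 0) {
                Vec3 p; sscanf(line + 2, "%f %f %f", &p.x, &p.y, &p.z);
                vs.push_back(p);
            } else if (strncmp(line, "vt", 2) == 0) {
                Vec2 t; sscanf(line + 2, "%f %f", &t.u, &t.v);
                vts.push_back(t);
            } else if (strncmp(line, "vn", 2) == 0) {
                Vec3 n; sscanf(line + 2, "%f %f %f", &n.x, &n.y, &n.z);
                vns.push_back(n);
            } else if (strncmp(line, "f ", 2) == 0) {
                int v[3], t[3], n[3];
                if (sscanf(line + 2, "%d/%d/%d %d/%d/%d %d/%d/%d",
                           &v[0], &t[0], &n[0], &v[1], &t[1], &n[1],
                           &v[2], &t[2], &n[2]) == 9) {
                    for (int i = 0; i < 3; ++i) {
                        // OBJ indices are 1-based, hence the -1
                        pos.push_back(vs[v[i] - 1]);
                        uv.push_back(vts[t[i] - 1]);
                        nrm.push_back(vns[n[i] - 1]);
                    }
                }
            }
        }
        fclose(f);
        return true;
    }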
The other piece of code we needed was the DDSTextureLoader from Microsoft. This was needed because Microsoft deprecated the older D3DX functions used to load textures, so this loader took their place. The textures had to be in DDS format for this to work. We found that converting the textures to DDS in Visual Studio worked best and caused no problems. Other than that, it simply involved using the DirectX::CreateDDSTextureFromFile() function, which Microsoft provides details for. Although it did require adding outside code, the DDSTextureLoader was easy to use and is recommended if you are using the latest version of DirectX.
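For reference, the call itself is close to a one-liner. This sketch assumes the DirectXTK header and an existing ID3D11Device, with error handling trimmed:

    // Sketch: loading a .dds with DirectXTK's DDSTextureLoader (D3D11).
    #include <d3d11.h>
    #include <wrl/client.h>
    #include "DDSTextureLoader.h"

    Microsoft::WRL::ComPtr<ID3D11ShaderResourceView>
    LoadDDS(ID3D11Device* device, const wchar_t* path) {
        Microsoft::WRL::ComPtr<ID3D11ShaderResourceView> srv;
        // Passing nullptr for the resource out-param; we only need the SRV.
        HRESULT hr = DirectX::CreateDDSTextureFromFile(
            device, path, nullptr, srv.GetAddressOf());
        if (FAILED(hr))
            return nullptr;  // most common failure: the file isn't real DDS
        return srv;
    }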
For the art, we relied on 3ds Max to texture, skin, and animate the models. The model meshes themselves were created in a free program called Sculptris. While Sculptris is very helpful for making detailed models that look good, it does not allow much control over the size or vertices of the model or the number of triangles, and the cross-compatibility of Sculptris models with other 3D modeling programs is dubious at best. If given the option to work on this again, I would try creating, skinning, and rigging the model all in one program, even though that gives a lot less ease and flexibility. It probably would have resulted in a simpler-looking model, but it would be easier to use a model built within the program (Maya or 3ds Max) to take advantage of all of that program's features.
We used Assimp to load Blender models. One problem with Assimp is that it doesn't allow bones with the same name; the artist needs to be aware of this and make sure every bone is named differently. Animations also need to start at frame 0. We had an incident where the animation was off by one or two frames and Assimp failed to work. One more thing for graphics programmers to notice: artists usually don't know the maximum number of bone attachments per vertex in your engine, because modeling software is pretty high level. This can result in fewer bone attachments inside the program than the artist designed, which will warp the model. The solution is to make the maximum number of attachments larger than what is usually used/needed.
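A sketch of what that looks like on the code side (the names and the limit of 8 are illustrative; 4 influences per vertex is the common default):

    // Per-vertex skinning data; a higher cap than the usual 4 avoids silently
    // dropping influences the artist painted.
    constexpr int MAX_BONE_INFLUENCES = 8;  // illustrative, generous on purpose

    struct SkinnedVertex {
        float position[3];
        float normal[3];
        float uv[2];
        int   boneIds[MAX_BONE_INFLUENCES];
        float boneWeights[MAX_BONE_INFLUENCES];
    };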
Assimp is a pretty decent library and so far I haven't found any other library that’s easier to use. So I'll definitely use it again.
[Justina] I used 3ds Max to create all the player/monster models for the game. 3ds Max is very handy in a couple of ways compared to Blender, but completely frustrating in others.
To make sure your models come out well, especially if you're using the Panda DirectX exporter to export them as .X files for animations -
First, make the model - make sure everything is closed up and there are no open parts in your mesh.
Then create the UVW map for texturing - texturing makes up 60% of the model's look, so it's important. I turned off being able to see peel lines and created my own UV seams using the point-to-point seam tool to give the mesh good places to peel. Then select all the polygons within a seamed-off area and use "Pelt Map" to get a good initial peel (search around for tutorials for a better explanation of pelt :| ). Make sure no polygons are overlapping and that the shape is spread out well enough that there won't be any strange stretching of textures; then you can render the UVW map and save it as a .psd or .png file.
MAKE SURE, THEN, THAT AFTER THAT YOU COLLAPSE THE STACK by converting the mesh to an editable poly, so that the UVW map sticks. You can re-attach the UVW modifier and it will still be there, but collapse initially so that the UVW map is saved.
ALSO, AFTER THE UVW MAP, ADD A RESET XFORM MODIFIER to reset all the transformations on the mesh. This will ensure that the mesh doesn't totally crumple when converted to .X and brought into your game. Seriously. :| You can then recollapse the stack to get rid of it if you so wish.
Then, if you are doing animation, set up bones (see tutorials for more). Just make sure that, whether you are using bones, bipeds, or CAT, you pay attention to envelopes and weights to see how much the vertices get pulled by a particular bone. Using the Skin modifier + Paint Weights is the best way to make sure your model will animate the way you want it to.
If you are animating with the biped, note that it uses some totally different animation key system - you have to look under the Motion tab and then Key Info; the keys there are what the biped uses (although it seems you can still use Auto Key with parts of the biped - it's confusing and silly).
Now that I've taken the time to use 3ds Max, I would still use it again, because it does have its benefits, but I can't help but wonder whether using Maya would have been easier/about the same amount of frustration. Learning to model for a game on the fly is a little difficult.
[Haro] If you want to use xAnimator, realize that it blends different animations together when you switch. So if you have intermediate animations to do the transitions, they won't work well because it will try to blend with the start of your previous animation anyway. Also, it doesn't give you full control over the models, and it's all client side, so the server will not be able to know about (and therefore follow) the animations. This might bring about problems with collisions (as it did with us). Maybe looking at different libraries would have been good.
[Bryan] The animations are loaded in the reverse of the order they are created, so whatever animation you define first will be loaded last by DirectX. Plan around this when setting your export options so you don't have to delete everything when you add a new animation.
The workflow we finally used to get animated models into the game was: 1. animate in 3ds Max 2013; 2. export as COLLADA (.dae) with baked animations; 3. import into 3ds Max 2012; 4. export as .X with full matrix keyframes; 5. fudge the ground position in the game. We would not recommend this workflow to anyone else. We never figured out how to solve the final positioning problems, and several of our animations exported with bone weights set incorrectly. If we were to continue development on the game, we would probably look for an alternative to 3ds Max for modeling and animating, and then test several file types and export options before continuing. Also, having Y-up rather than Z-up sometimes makes things easier.