Gameplay footage of Booyo Park.
Because of the nature of the Raymarching Toolkit we started the project with, the team decided early on that I would be animating the Booyos in Unity. I didn’t get too in-depth with this topic in the previous posts, but I will now! This time, I will be discussing the reasons behind this decision, my workflow, the pros and cons, and the compromises I made to make way for the tech side of the project.
Why decide to model and animate in Unity?
We have a proficient 3D modeller on our team, but we decided to build the model directly in Unity instead. The first reason: Raymarching Toolkit only applies its effects to its own geometry. Simply put, the toolkit does not work on actual meshes, like models imported from Maya. Since our character had a simple design, this was not a huge sacrifice for us, except that our 3D modeller had to find other tasks and I had to take over animation, since I was more familiar with animating in Unity.
The second reason: it immediately eliminated the need for rigging and cut down on asset import time in our workflow. The only complex shapes we have are the faces, which are also modelled in Unity. Other than that, we never had to spend time importing a model from Maya and checking whether it broke; what we see in Unity is what we get.
Animation workflow
A video of the Booyos’ dancing animation, which signals when they’re about to merge together!
1. Since I was animating in Unity, it was very important that the structure of the Booyo GameObject worked for both me and the programmers. This meant a lot of back and forth, continually improving the model from both disciplines.
2. After we had settled on the structure of our Booyo, I got to animating. Since the model was in Unity, all I needed to do was to push the .anim files to the repo, and it would update automatically. There was a minimal chance that anything would break since everything was done within Unity.
3. Meanwhile, the programmers set up the animation system and code for me. While they waited for me to finish my animations, they used test ones they had created in Unity. The downside to animating in Unity is that not all properties can be animated directly. So the programmers set up functions using Animation Events that let me change those properties anyway (see the sketch after this list)!
4. If any of the programmers needed an animation tweaked, all they had to do was call me over to their station, and we could look at it and edit it together.
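I don’t have our exact scripts on hand, but here is a rough sketch of how an Animation Event receiver like that can look. The names here (BooyoAnimationEvents, _WobbleAmount, OnDanceFinished) are placeholders, not our actual code; the general idea is that an Animation Event on a keyframe calls a public method on a component sitting next to the Animator.

```csharp
using UnityEngine;

// Receiver for Animation Events. Public methods on this component can be
// called from keyframes in the Animation window, which lets an animation
// drive properties the window can't keyframe directly.
public class BooyoAnimationEvents : MonoBehaviour
{
    [SerializeField] private Renderer bodyRenderer;

    // Called from an Animation Event with a float parameter,
    // e.g. to push a value into a shader property.
    public void SetWobbleAmount(float amount)
    {
        bodyRenderer.material.SetFloat("_WobbleAmount", amount);
    }

    // Called from an Animation Event with no parameters,
    // e.g. to tell gameplay code that the dance loop finished.
    public void OnDanceFinished()
    {
        // Placeholder; the real project would notify a manager or raise an event here.
        Debug.Log("Booyo dance finished");
    }
}
```

From the Animation window, I could then add an event on any keyframe, pick one of these methods, and pass a value along when needed.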
All in all, the workflow was pretty straightforward!
Pros & cons
Overall, I liked animating in Unity for this project. I found that:
It was really easy for anyone to edit the animations when needed, even when I wasn’t there. If anyone else on the team found a minor issue, they could simply open the Unity Animation window and tweak it.
I never needed to import anything from outside Unity. This cut a lot of time out of production and helped the whole team in the long run.
Since the model only had 3 major animated parts (the body and two arms), animating was simple.
However, I found that animating in Unity had some drawbacks:
I could not animate complex models. We were lucky in this project that we only had a simple model made of 3 spheres, but what would have happened if we had a more complex character? Scaling the project up using this method is not feasible.
I could not animate the model in complex ways. The Booyos have an animation where they spin in a circle. This would have been simple to do in Maya, where I could have animated the Booyo along a circular motion path. Unity has no such thing! It became a tiring process, even though it was only one animation (see the sketch below for one possible workaround).
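For anyone hitting the same missing-motion-path problem, one possible workaround (not what we shipped, just a sketch) is to keyframe a single angle value in the Animation window and let a small script convert it into a position on a circle:

```csharp
using UnityEngine;

// Motion-path workaround: the Animation window keyframes a single angle
// value, and this script converts it into a position on a circle each frame.
public class CircularPath : MonoBehaviour
{
    [SerializeField] private Transform center;   // point to circle around
    [SerializeField] private float radius = 1f;

    // Keyframe this field in the Animation window (0 to 360 for one full lap).
    public float angleDegrees;

    private void LateUpdate()
    {
        float rad = angleDegrees * Mathf.Deg2Rad;
        Vector3 offset = new Vector3(Mathf.Cos(rad), 0f, Mathf.Sin(rad)) * radius;
        transform.position = center.position + offset;
    }
}
```

Keyframing one float is much less tedious than hand-keying positions around the whole circle, and the radius and centre can still be tweaked without touching the animation itself.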
Contradicting our original decision
Before I end this post, let’s go back to one of the main reasons we chose to animate our Booyo in Unity. We originally did it to work within the constraints of the Raymarching Toolkit. However, it is important to note that we actually stopped using the Raymarching Toolkit midway through production. We did this after my devlog on compromise and optimization, where I simplified the original Booyo model even further in favour of a better framerate. It turned out that the Raymarching Toolkit did not work for our project as planned, simply because it was heavier on processing than we expected. Prioritizing our players’ comfort, we chose a better framerate over the blending jelly effect.
At first glance, it seems like we scrapped a lot of our work. When we think about it, though, the pros of cutting the Raymarching Toolkit out of our project outweigh the cons. After removing it from the scene, the game ran much better. We also found that players were not bothered by the missing blending effect, and they clearly appreciated the smoother framerate!
Going forward…
Since we are nearing the end of production, we are now at the final stages! Going forward, I will be focusing on the showcase experience outside of the game itself.