
POST-MORTEM

 

Link to Itch.io:

https://camerono98.itch.io/breath-of-life

The production cycle of Breath of Life was, for the most part, hitch-free, with the main sources of concern being early programming challenges and lag on different Android devices. This post-mortem is broken into four sections detailing what went right and wrong on the project, as well as lessons learnt to reflect on in the future.

 

What went right?

From a project management side, Breath of Life was quite successful thanks to early planning and in-depth preparation. This extended beyond documentation alone to arranging support for both programming and audio early on. Securing that support happened relatively quickly, largely by sourcing audio students through contacts from past units, who then posted publicly within their internal groups. For the Android side of development, someone with prior experience was desirable, and the programmer who ended up working on the project had previously experimented with Unity Android builds.

 

The project moved along rapidly in the early weeks of production, with ample progress made on the core mechanics within a two-week time frame. This freed up time for the team to work towards cracking the desired control scheme: using a phone or tablet's gyroscope to aim the player pawn, and the device's microphone, which the player would blow into so they could 'be the wind.' Unfortunately this proved problematic and was removed from the project; what went wrong and what was learnt are discussed below.

Visually, the game is very pleasing to look at. The graphic designers who worked on the project worked hard to capture the overall feel of the game's themes. They truly made the work their own, and after ample discussion surrounding the initial ideas, they hit the ground running early. The designers stayed in tune with the reference materials and the overall colour style planned from the start, and their work complemented the hand-drawn art nicely.

 

Particle systems were further explored both to a) complement the visuals and b) deepen knowledge of how Unity works. The particles were successful and added extra life to the levels, from general cosmetic use to providing the player with additional feedback. Examples include the air bubble pickup sending particles to the player's HUD to visually draw attention to how the item affects their air level; a particle system that replaced the typical drag arrow common to touch-and-drag style games; and simple motion particles that sparkled when the player moved.

Visual feedback was one of the game's strong points, with almost every action having some causal effect in the game space. As mentioned above, the player pawn was complemented by a host of particle effects, ranging from the obvious, such as the wind and golden sparkles when the player moved, to slight dust particles that appeared on collision with a wall or floor tile. Beyond particles, visual and aural elements helped the player recognise damage and motion: the player flashed red upon receiving damage, accompanied by a pained audio clip, and the level's end goal, the dying body, featured a prominent white glow to draw the player's attention to it.
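A damage flash like the one described is commonly implemented as a short coroutine that tints the sprite and then restores it. Below is a minimal, hypothetical Unity C# sketch (component and field names are assumptions, not the project's actual code):

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch of the red damage flash and pained audio cue.
public class DamageFlash : MonoBehaviour
{
    public SpriteRenderer sprite;     // the player pawn's renderer
    public AudioSource painAudio;     // the pained audio clip
    public float flashDuration = 0.15f;

    public void OnDamaged()
    {
        if (painAudio != null) painAudio.Play();
        StopAllCoroutines();          // restart the flash if hit again mid-flash
        StartCoroutine(Flash());
    }

    private IEnumerator Flash()
    {
        sprite.color = Color.red;
        yield return new WaitForSeconds(flashDuration);
        sprite.color = Color.white;   // default (untinted) sprite colour
    }
}
```

Keeping the effect in a coroutine means the flash cannot block the frame, and stopping any in-flight coroutine before restarting avoids the sprite getting stuck red under rapid hits.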

 

Audio was another strong point of the game, with the audio engineer working and experimenting at length to get the feel requested at the game's conception. Each level featured two audio clips that would slide between one another depending on the player's position relative to the level goal, the dying body. This served as an additional guide for players, especially if, hypothetically, there were longer maze-like levels requiring lengthy exploration. Again, it was another point of feedback for the player.
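The distance-based slide between the two clips can be sketched as a simple crossfade between two looping AudioSources. This is a hypothetical Unity C# illustration, assuming the post-mortem's described behaviour rather than the project's actual implementation:

```csharp
using UnityEngine;

// Hypothetical sketch: two looping AudioSources whose volumes slide
// between one another as the player nears the dying body (level goal).
public class ProximityCrossfade : MonoBehaviour
{
    public Transform player;
    public Transform levelGoal;       // the dying body
    public AudioSource farClip;       // dominant at the level start
    public AudioSource nearClip;      // dominant at the goal
    public float maxDistance = 30f;   // assumed distance at which farClip is at full volume

    void Update()
    {
        float d = Vector3.Distance(player.position, levelGoal.position);
        float t = Mathf.Clamp01(d / maxDistance);  // 0 at the goal, 1 far away
        nearClip.volume = 1f - t;
        farClip.volume = t;
    }
}
```

Because the two volumes always sum to one, the overall loudness stays roughly constant while the mix itself communicates proximity to the goal.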

 

Testing was extremely helpful for this game, especially when taking it to one of IGDA Melbourne's monthly meetings. It was good to receive feedback beyond the casual 'it's good/coming along nicely.' Valuable feedback from peers who have been in the industry longer than the team allowed specifics to be discussed, and thankfully it aligned with the testing that took place on campus. Common complaints concerned how the controls felt, gravity, players not noticing the HUD and air bubble pickups, uncertainty surrounding hazards, and level length versus the timer, all of which was taken into consideration. Balancing the game's mechanics was tricky due to technical issues on multiple devices, which will be discussed later. Testing on campus was done to some extent most weeks; however, due to time constraints from the second project, the last three levels were not tested as fully or for as long as planned.

 

What went wrong?

As discussed above, the tilt/microphone mechanics sadly had to be dropped around midway through production. They were dropped at just the right moment, as any more time spent on them would have been detrimental to refining the existing mechanics and would have hindered testing. It was discovered that microphones in mobile and tablet devices deliberately filter out wind and blowing sounds, which entirely defeated the mechanic's purpose. However, aspects of the process were still valuable for potential future use in another title.

 

Tech-related issues were a problem from early in the development cycle: not major software failures that destroyed the project, but support issues across different machines. The machine originally intended as the primary development device was, for some reason, unable to recognise and build with the Android SDK, even after completely removing all traces of it and reinstalling. In addition, the early stages of development relied on Unity 5 Remote on mobiles and tablets to enable live on-device testing on the fly. It was later discovered, however, that a proper Unity build behaved completely differently on the device to the Editor runtime. From that point forward, we had to create a full build each time we needed to test something new.

 

Lag was a consistent issue throughout the life cycle of the game, and it came from numerous sources. Initially the project used Sprite Lamp to create normal maps for each sprite, along with Sprite Lamp shaders in Unity itself. This was the source of much of the lag, as the shaders and program were not made with mobile platforms in mind. The effort was not wasted, however, as the normal maps created with Sprite Lamp were reused in simpler shaders. Lag continued to be an issue even after this, and it wasn't until the day of the exhibition that we realised the level editor our programmer had made was, by the way it worked, placing more than one tile at a time.

 

Originally FMOD was planned for the game to allow for more dynamic audio. However, for some unknown reason, the same simple FMOD integration of an audio source with a built-in attenuation radius, which had worked in a previous project, was not working here. As audio was added towards the end of the project, the decision was made to drop FMOD, since the team was not overly familiar with the specifics of the middleware and how it behaves with Unity and C#/MonoDevelop. Relatedly, the enemy static sound effect came out slightly harsher than intended; because Unity's built-in audio system was used, some of the flexibility FMOD could have provided was lost.

 

Due to time constraints, largely caused by the second project, additional art assets and levels could not be implemented. Originally at least 2-3 different tilesets were planned to add variety to the levels, along with 5-10 levels. Unfortunately this did not come to fruition. However, the levels never appeared to become visually dull or repetitive, likely thanks to the depth added through the parallax effects, the art and the particles.

 

What was learnt?

Despite the failures that came with the tilt/microphone functionality, we still learnt more about how Unity handles both inputs, as well as some of the obstacles to overcome should you wish to use them. For example, Unity's Microphone class only exposes raw audio capture, with no built-in blow or breath detection, so that analysis has to be coded by hand. Similarly with the gyroscope, although we only scratched the surface, basic groundwork and knowledge were gained from attempting the implementation; it simply didn't work in this instance. That said, given how the gameplay turned out, it may be a good thing that the secondary controls were not implemented, as they could have made the levels impossibly complicated.
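To make the obstacle concrete: Unity's Microphone class records into an AudioClip buffer, and detecting a "blow" means manually reading recent samples and checking their amplitude against a tuned threshold. The sketch below is a hypothetical illustration of that hand-coded approach (the threshold, window size and class name are assumptions), alongside the explicit gyroscope enabling Unity requires:

```csharp
using UnityEngine;

// Hypothetical sketch: Microphone.Start records into an AudioClip, but
// "blow detection" (amplitude analysis) has to be implemented manually.
public class BlowDetector : MonoBehaviour
{
    public float blowThreshold = 0.2f;   // assumed value; tuned per device
    private AudioClip micClip;
    private const int SampleWindow = 128;

    void Start()
    {
        // null = default microphone; loop over a 1-second buffer at 44.1 kHz
        micClip = Microphone.Start(null, true, 1, 44100);
        Input.gyro.enabled = true;       // the gyroscope must be enabled explicitly
    }

    void Update()
    {
        if (GetMicLevel() > blowThreshold)
            Debug.Log("Blow detected");

        // The device attitude could then drive the player pawn's aim.
        Quaternion aim = Input.gyro.attitude;
    }

    private float GetMicLevel()
    {
        // Peak amplitude over the most recent window of recorded samples.
        int pos = Microphone.GetPosition(null) - SampleWindow;
        if (pos < 0) return 0f;
        float[] samples = new float[SampleWindow];
        micClip.GetData(samples, pos);
        float max = 0f;
        foreach (float s in samples)
            max = Mathf.Max(max, Mathf.Abs(s));
        return max;
    }
}
```

Even with this in place, the hardware-level wind filtering described above would suppress exactly the signal this code listens for, which is why the mechanic ultimately had to go.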

 

Similarly with FMOD: despite it not working as intended, I still discovered more about how FMOD works as a whole, and I overcame some issues that were previously present, such as audio cues and markers. Despite resorting to Unity's default audio system, knowledge was gained about how 3D sounds work, which are handled in a similar fashion to FMOD, just at a less sophisticated level.

 

Learning the hard way about shader compatibility with mobile devices was a valuable lesson. We assumed the mobile devices would cope because they can run fast 3D games, but memory and performance management is a major concern that needs to be considered across every aspect of a project. Unity's Profiler revealed where most of the lag was coming from (the renderers), which turned out to be the Sprite Lamp shaders. Despite losing some visual fidelity, a deeper understanding of how shaders work was developed; in future, a custom shader handling only what is required should potentially be written.

 

From a project management side, a deeper understanding came from preparation and early planning. Having well-defined documentation was extremely beneficial, especially when conveying knowledge to other team members. Understandably, the documentation initially confused some team members, as we each work differently and manage different aspects of the project; this was remedied by making it easier for them to access. From a timeline perspective, it was good that work was done early, leaving room for plenty of polish. Although some of the extra tilesets and levels were lost, the original asset list was designed in such a way that the design could remain flexible in the event of any major issues.

What would we do differently?

When going into technology we haven't really explored previously, we would do deeper research before deciding on key features. This would eliminate hours wasted on a feature that ends up being dropped, which could otherwise be spent on more important features.

 

More testing. Despite testing most weeks, half of the final levels were unfortunately not tested as much as planned. Going forward, more testing needs to be done earlier, regardless of how bare-bones the build is. To complement this, builds should be made more often; had this been done regularly, we would have discovered the vast difference between Unity Remote testing in the Editor and a proper build much sooner.
