Build Agents: A look into Tactile’s internal build pipeline

Game development is a complex process. In 2023 alone we created more than 100,000 game builds, which is a huge number! Supporting our games team throughout the development process are the brilliant minds of our Core team: developers responsible for constructing and maintaining our internal infrastructure – all the tools, platforms and systems, built entirely in-house and customised to our specific needs, that enable us to run live game operations.

One of these tools is the Build Server, our internal game-building pipeline, which has been under close scrutiny from Juan and Vlad (our Backend Engineers extraordinaire) for the past six months. We sat down with them for a chat about how we build our games, why we bother developing our own tools, and what improvements the team has been working on this year – and why.

Juan & Vlad, Backend Engineers on the LiveOps Team within our Core Team

How do we build our games?

Our games are built using the Unity engine, and every time a game project is updated – with new asset bundles, product features or in-game events – we create a new game build. For a game developer, this process takes a lot of time and resources on their machine. Even though developers’ machines today are more efficient and powerful, it can still take hours to compile all the assets into a new build. This is why, several years ago, we decided to build a collection of tools that enable us to build games faster, in an automated and more efficient way.

Whilst there are companies that provide this service, having our own personalised, internal build pipeline gives us the flexibility to build in a way that suits our specific needs. It’s not uncommon for companies to have a dedicated Build Pipeline Engineer, or to use an out-of-the-box solution – a build pipeline is a necessity for making games. Ultimately, the product that a gaming company delivers is something that gets installed on some sort of device (whether that’s a mobile phone, console or PC). Creating these game builds is the part that our Build Server tools automate.

Think of it like this – there are two main components to the build pipeline: the Controller and the Agents. The Controller allows game programmers, level designers and other users to order a build of whatever they are working on within a specific game. They can do so through our internal build dashboard. The build request then goes to a Build Agent, which runs through a sequence of steps, resulting in an installable build. Besides these on-demand builds, we also have a lot of scheduled builds, which normally run overnight or on a specific schedule. All the ordered and scheduled builds, build agents and their respective statuses can be seen on the build dashboard.
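In code, the flow described above – a build request arriving at an agent, which runs a fixed sequence of steps – might be sketched like this. All names (`BuildRequest`, `BuildStep`, `runBuild`) are illustrative placeholders, not Tactile’s actual API:

```typescript
// Hypothetical sketch of the Controller-to-Agent flow.
interface BuildRequest {
  game: string;
  branch: string;
  runTests: boolean;
}

type BuildStep = (req: BuildRequest, log: string[]) => void;

// An agent executes a sequence of steps for each request,
// ending with an installable build.
const steps: BuildStep[] = [
  (req, log) => log.push(`checkout ${req.game}@${req.branch}`),
  (req, log) => log.push(`compile assets for ${req.game}`),
  (req, log) => { if (req.runTests) log.push("run tests and validations"); },
  (req, log) => log.push("package installable build"),
];

function runBuild(req: BuildRequest): string[] {
  const log: string[] = [];
  for (const step of steps) step(req, log);
  return log;
}

const log = runBuild({ game: "puzzle-game", branch: "main", runTests: true });
console.log(log.join("\n"));
```

Scheduled builds would simply feed the same `runBuild` function from a timer instead of a dashboard click – the agent code does not care where the request came from.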

Build Controller checking in on the Build Agents’ statuses

Vlad elaborates: “The internal build pipeline gives our teams whatever flexibility and customisation they might need – whether someone wants to order a build of a game for a specific branch or a specific revision, whether they want to run tests and validations or not, whether they want to run it on Monday at 2pm or every Tuesday overnight… you get the drift. The Build Server notifies them when the build is done, and they can track everything relating to the building process on a user-friendly, consistently available interface.” These tools therefore give us a complete overview of, and context for, each build, and save us a lot of time that was previously spent looking into individual problems.

‘If you want it done right, do it yourself.’

This is where building our own tools gives us an advantage over commercial tools. It lets us make the game-building process super flexible and bend it to our specific needs. We make a lot of custom integrations in our builds, and we have made it very easy to report on everything happening in the build pipeline.

The Build Agent is only one of the many moving parts of our build tools. However, it is THE part that actually executes ‘the final recipe’ composed for every game. We actually have several Build Agents (148, to be precise), all running on around 30 Mac Minis. Juan shares a fun fact about this: “We use Mac Minis because we compile new game builds using Xcode, which requires macOS. Having our own internal build tools enables us to build things in Xcode for iOS, which is not something that many commercial tools offered nearly a decade ago when we started looking for an appropriate solution.”

Individual agents are set up in the same way, so we can reliably know that if the setup worked for one project, it is also going to work for the others. Another good feature of our build pipeline is that it enables us to upload specific game builds to an app store, instead of our Producers having to do it manually. As Vlad explains: “It’s basically a big collection of tools that contribute to the overall ease of development, in particular by reducing the time that people spend doing repetitive tasks. There are a lot of things that seem incremental in themselves, but end up adding up to a lot.”

Juan & Vlad taking a peek at the data

Changing things for the better

Juan and Vlad recently rewrote the Build Agents’ code from an older version of Python into TypeScript. The old code was difficult to maintain, so everything that needed to be added to, fixed or modified in the codebase took a lot of time and effort from our development team. It was basically like constantly readjusting a big Jenga tower.

We therefore decided to rewrite the Build Agents in a language that was comfortable for all our developers. We also took the opportunity to test our new coding patterns and practices. It was a small project to start with (compared to others), but complex enough to be a suitable testing ground.

The new coding style is centred around dependency injection (DI). DI is a design pattern which promotes loose coupling between components by externalising the dependencies of a class. Instead of a class creating its own dependencies, they are provided, or injected, from the outside.
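In TypeScript, constructor injection is the most common way to apply this pattern. Here is a minimal sketch (the names `Clock` and `BuildReport` are illustrative, not Tactile’s actual code): the class declares what it needs as an interface, and the caller supplies it from the outside.

```typescript
// The dependency is described by an interface...
interface Clock {
  now(): Date;
}

// ...and the class receives it instead of creating it itself.
class BuildReport {
  constructor(private readonly clock: Clock) {}

  stamp(buildId: string): string {
    return `${buildId} finished at ${this.clock.now().toISOString()}`;
  }
}

// Production wiring injects the real clock:
const report = new BuildReport({ now: () => new Date() });

// A test can inject a fixed clock and get deterministic output:
const fixed = new BuildReport({ now: () => new Date("2024-01-01T00:00:00Z") });
console.log(fixed.stamp("build-42"));
// → "build-42 finished at 2024-01-01T00:00:00.000Z"
```

Because `BuildReport` never constructs its own `Clock`, swapping the real implementation for a test double requires no changes to the class itself – that is the loose coupling the pattern buys you.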

Applying DI therefore gives us full control over the business logic behind the code. To give an example, we have some business logic that says when a build finishes, we need to notify the user. To do that, we call a third-party library. However, we do not have control over that library and have no way of testing that it actually does what we expect. With DI, we can abstract the library behind a separate component that we fully control. Juan elaborates: “We now feel very comfortable making structural and behavioural changes in our code, because we are more confident that it will behave as it should. We have a big testing infrastructure, more control and a complete overview of its behaviour.”
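The notification example might look something like this sketch, assuming a `Notifier` interface wraps whatever third-party library is actually used (all names here are hypothetical): in production an adapter calls the real library, and in tests a fake records the calls so the business logic becomes fully verifiable.

```typescript
// Our own abstraction over the third-party notification library.
interface Notifier {
  notify(user: string, message: string): void;
}

class BuildPipeline {
  constructor(private readonly notifier: Notifier) {}

  finishBuild(user: string, buildId: string): void {
    // ...final build steps would run here...
    this.notifier.notify(user, `Build ${buildId} is done`);
  }
}

// In tests, a fake implementation records calls instead of
// touching the uncontrollable third-party library:
class FakeNotifier implements Notifier {
  sent: string[] = [];
  notify(user: string, message: string): void {
    this.sent.push(`${user}: ${message}`);
  }
}

const fake = new FakeNotifier();
new BuildPipeline(fake).finishBuild("vlad", "1234");
console.log(fake.sent); // → ["vlad: Build 1234 is done"]
```

The test never exercises the real library, yet it proves the pipeline asked for the right notification at the right time – which is exactly the part of the behaviour we own.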

Vlad goes on to illustrate this with another example: “There are many ways to boil an egg, right? You can boil it in hot water, you can boil it in a microwave, or you can set the house on fire and eventually, this will also make the egg boil.” 🤣

But if you’d rather avoid burning the house down, here’s another metaphor provided by the all-knowing ChatGPT: Imagine you’re building a house. In traditional programming, each room (or class) in your house knows how to make its own furniture (dependencies). With DI, it’s like hiring an interior decorator. You tell the decorator what furniture each room needs, and they bring it in. This way, if you want to change the furniture later, you don’t have to rebuild the whole room; the decorator can just swap it out.

So, applying DI is like having someone else handle the connections between different parts of your program, making it easier to change or upgrade things without messing up the entire system (and burning the house down).

The importance of continuously iterating on our coding style

“Do you solemnly swear to decouple your components, honour the single responsibility principle, and inject dependencies responsibly, forsaking tight coupling? May your code be flexible, and your tests evergreen.”

Within our team, we always strive to make our code more maintainable and to improve our quality of life. That is our long-term vision, and using DI is just one of the target conditions we want to reach along the way. It is difficult to code consistently at scale in a big team, and to stay cohesive and coherent together. This is why it is extremely useful to have a codebase that all of our developers can modify comfortably. No matter your role or your location, you can change something in the codebase and get fast feedback on whether it breaks something or not.

“This is why we need to keep iterating on our coding style – to keep up to date with standards and good practices,” says Juan. “As long as technology, languages, customer needs and the gaming market evolve, our style will have to follow. We need to be more agile and quicker to respond to things. If we keep doing things the same way, we will not be able to stay competitive.”
