New Web (API) Foundations (Part Seven, Finale)

So at this point we’ve covered:

  • Why we chose the language we did.
  • Why we went with GraphQL instead of REST.
  • What the queries looked like.
  • Hitting home that this couldn’t go out until authentication and authorization were done.

So what exactly is left?

There were two things left that were fairly key to getting it all ready for prime time:

  • A Lua interface for a GMod server / client to use
  • Caching

While it wasn’t exactly a massive issue if we couldn’t get a Lua-based interface working, it definitely would have been a knock on the entire process. At the end of the day, though, a GraphQL query is really just a POST request made in a JSON-like form; the only downside is that you have to properly escape strings yourself because… well, that’s really a poor design decision on GraphQL’s side.
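To illustrate the escaping point, here’s a small Go sketch (function and query names are hypothetical, not the actual codebase): if you hand the raw query string to a JSON encoder, the fiddly escaping is done for you, which is exactly the part that has to be done by hand in Lua.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// buildGraphQLBody wraps a raw query string in the standard
// {"query": "..."} envelope. Letting encoding/json do the
// serialisation means quotes and newlines inside the query get
// escaped automatically.
func buildGraphQLBody(query string, variables map[string]any) ([]byte, error) {
	payload := map[string]any{"query": query}
	if variables != nil {
		payload["variables"] = variables
	}
	return json.Marshal(payload)
}

func main() {
	// Hypothetical query; the real schema's field names may differ.
	body, err := buildGraphQLBody(`query { player(id: "STEAM_0:1:234") { name } }`, nil)
	if err != nil {
		panic(err)
	}
	// The body would normally go into an http.Post call; printed here
	// instead. Note the inner quotes come out escaped as \".
	fmt.Println(string(body))
}
```

The same shape (a POST whose body is `{"query": "..."}`) is what the Lua side ultimately has to produce as well.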

To make it as effortless as could be, the client and server both use the same signature for querying, BB.MakeAPIRequest, with all of the exact same logic; one could even call it shared (because it is)!
Even the way the client and server handle the authentication function BB.GetWebAPIToken() uses the same signature, though the underlying logic is different. Because we also ultimately get a JSON blob back (an actual JSON blob!) we’re able to use standard GMod functions to convert it to a table and pass the data to our callback function. Nice and easy!


At this point the web API had been made publicly available to anyone that wanted to use it, and /portal in-game had been updated to use it as a small test, which worked incredibly well. I was pretty satisfied with the state and effectiveness of the API; those that had been using it directly as an interface seemed pretty happy, and even to this day it seems to get some usage, which suggests adding public access wasn’t for naught!

The final thing needed was caching. While we didn’t need this for the go-live (and it wasn’t there), it was certainly going to help response times and reduce load on both the webserver and the database. We have a fair bit of data that rarely changes, or changes in a very controlled way. For example, we calculate rank data hourly and seasonal scores once per day, so why force the database to run those queries every time when we can store the output in a serialized format and just present that data? It also makes sense because some of those queries can be somewhat computationally expensive to run.

There was a slight problem with implementing caching, though. In most languages you can wrap your cache layer around your database layer fairly seamlessly; Go (at the time of writing) is the odd one out and doesn’t easily support this.

On and off for around three months I toyed with different ways of going about it, getting it as tightly connected as I could, before realising that, in true Go fashion, I’d be writing boilerplate for this again. Once generics are released with Go 1.18 (assuming the underlying libraries also support them), around 60% of the caching code added can likely be condensed into more generic functions, but for the time being it is how it is. It doesn’t impact performance in this case; it just requires lots of keyboard tapping!
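To show what that condensing might look like, here’s a hedged sketch using Go 1.18 type parameters (names are hypothetical): pre-generics, every cached type needed its own near-identical fetch-or-load wrapper, whereas one parameterised type can cover them all.

```go
package main

import (
	"fmt"
	"sync"
)

// Store is a generic fetch-or-load cache. Before Go 1.18 this would
// be copy-pasted once per concrete type.
type Store[T any] struct {
	mu    sync.Mutex
	items map[string]T
}

func NewStore[T any]() *Store[T] {
	return &Store[T]{items: make(map[string]T)}
}

// GetOrLoad returns the cached value for key, calling load only on a
// miss and remembering its result.
func (s *Store[T]) GetOrLoad(key string, load func() (T, error)) (T, error) {
	s.mu.Lock()
	defer s.mu.Unlock()
	if v, ok := s.items[key]; ok {
		return v, nil
	}
	v, err := load()
	if err != nil {
		var zero T
		return zero, err
	}
	s.items[key] = v
	return v, nil
}

func main() {
	ranks := NewStore[[]string]()
	calls := 0
	load := func() ([]string, error) { calls++; return []string{"gold", "silver"}, nil }
	ranks.GetOrLoad("season-1", load)
	v, _ := ranks.GetOrLoad("season-1", load)
	// The loader ran only once; the second call was a cache hit.
	fmt.Println(calls, v)
}
```

One `Store[T]` instead of a hand-written variant per data type is roughly the boilerplate reduction described above.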

In the vast majority of tests, caching lowered response times by about 70–85%. There are some additional optimizations that could be done here, but some of that is outside the scope of the application and more into hardware and CDN optimization. Database querying by the game server also dropped fairly significantly, and we were actually able to offload a number of data requests and in-game data syncing to the web API instead of streaming it down in-game, which means users see less of a “freeze” when they first join. Another great win!


All in all, this edition of the web API has been a huge success. It’s not even a year old and yet it’s far more capable than the v1 API ever was. It’s also significantly cheaper to run in terms of resources; in fact, I’m not sure I’ve ever seen something this resource-light with the exception of compiled applications.

On average the v2 API uses around 6MB – 10MB of RAM. Yep, you’re reading that correctly. It uses quite literally nothing, with the actual API binary being a grand total of… 15MB in size. CPU-wise it may as well be invisible, because it uses nothing! Even under load it’s stupidly efficient, and I’m actually pretty impressed at how efficient Go has become. I really couldn’t be happier with the end result of the API and what it’s capable of.
For the interim I plan to keep it as a read-only API, mostly because I don’t yet see a strong need to actually manipulate the data through a common API like that.

There are one or two smaller things that could potentially benefit from write access, but equally, whether those actions would even be part of the API is another question entirely. Given that these would be actions we wouldn’t want a typical person to even be made aware of (e.g. adding a donation), it’s likely they’d end up with their own bespoke controls, unless we could control what introspection on the GraphQL API returns.

But that’s the new web API for you! Hopefully it’s been an interesting read and a peek behind the curtain, and it’d be fantastic if any feedback could be given via the forums or Discord.

Until next time!
