
Video Game Production Sprint 1

  After a long time spent away from Unreal Engine, I’m getting my hands dirty once again as a programmer for the Kill Everything in Sight team! KEIS is an endless action FPS roguelite where you play as a killer robot tasked with exterminating the remains of the human race. Despite your incredible movement and combat abilities, you are severely limited by a timer ticking down during the entirety of your run. Killing enemies will add precious grains of sand to your cruel hourglass, so the player must focus on speed just as much as precision!


I was brought onto the team as a programmer, and I started off nervous since it had been a while since I last programmed in Unreal. On top of that, my first task was to create the foundation of our enemy AI, probably the most critical gameplay element after the player and level design, and I hadn't touched AI in Unreal beyond the absolute basics!


However, I was determined to make my team proud and confident in my ability to learn quickly, so I immediately dove into the rabbit hole of Unreal AI tutorials on YouTube. The first thing I learned about was Unreal's Behavior Trees (BTs). BTs let a programmer easily create a sort of decision tree, in which premade or customized Task nodes are executed from left to right, or selected for execution based on conditions called "Decorators."



Pictured is a small section of my final behavior tree at the end of this sprint. The purple nodes are Tasks, which contain typical Blueprint code inside. The blue nodes sitting on top are Decorators, which perform conditional checks and decide whether or not the Tasks they adorn will be executed. The grey nodes are Sequences (execute their children from left to right until one fails) or Selectors (execute their children from left to right until one succeeds).
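All of my Task nodes for this sprint were built in Blueprints, but for anyone curious, here's a minimal sketch of what a custom Task looks like in C++ (the class name and its logic are made up for illustration):

```cpp
// BTTask_Roam.h -- a hypothetical custom Behavior Tree Task, sketched in C++.
// My actual Tasks were Blueprint assets, but the contract is the same: do some
// work, then report Succeeded or Failed back up to the tree.
#pragma once

#include "CoreMinimal.h"
#include "BehaviorTree/BTTaskNode.h"
#include "BTTask_Roam.generated.h"

UCLASS()
class UBTTask_Roam : public UBTTaskNode
{
    GENERATED_BODY()

public:
    virtual EBTNodeResult::Type ExecuteTask(UBehaviorTreeComponent& OwnerComp,
                                            uint8* NodeMemory) override
    {
        // The Task's work would go here. Succeeded lets a parent Sequence move
        // on to its next child; Failed makes a parent Selector try another branch.
        return EBTNodeResult::Succeeded;
    }
};
```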


Unreal's Behavior Trees also depend on a Blackboard, which holds Keys. Keys are essentially variables for our use, and they can be plugged into Decorators or Tasks as whatever value you need.
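I read and write my keys through Blueprint nodes, but the same Blackboard is reachable from code. A minimal C++ sketch, assuming an AI Controller with a running behavior tree and example key names:

```cpp
// Writing Blackboard keys from an AI Controller (C++ sketch).
// "TargetActor" and "LastKnownLocation" are example key names, not our real ones.
#include "AIController.h"
#include "BehaviorTree/BlackboardComponent.h"

void WriteExampleKeys(AAIController* Controller, AActor* Target)
{
    if (UBlackboardComponent* Blackboard = Controller->GetBlackboardComponent())
    {
        // An actor-reference key: what the enemy is currently focused on.
        Blackboard->SetValueAsObject(TEXT("TargetActor"), Target);

        // A vector key: for example, where that actor was last seen.
        Blackboard->SetValueAsVector(TEXT("LastKnownLocation"),
                                     Target->GetActorLocation());
    }
}
```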


From top to bottom: two actor references, an enumerator, a Vector3D, and four floats


However, the Blackboard and BT were just two pieces of the AI puzzle. For even the most basic AI navigation, a navmesh was necessary to tell the AI which parts of the level it could move to. Thankfully, Unreal will handle the generation of the navmesh for us using a Navmesh Volume. The output looks like this!


“Everything the green touches is our kingdom.”


Now with this, I could call Unreal's built-in Move To task inside the BT, and the AI could navigate there on its own! It's like magic! One limitation of the navmesh, however, is that the AI cannot jump on or off of ledges, so if the player were to jump up to the raised area in the middle, the AI would have to take the ramp to reach them, and vice versa if they jumped down. This limitation was one my designer decided to accept for now, but in the future I could use something called a nav link to set up points where the AI can jump up and down. For now though, I had enough to create a basic idle state for my enemy, where it walks to a random point nearby, waits for a bit, and repeats!
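My idle wander is just a few Blueprint nodes (find a random reachable point, Move To, Wait), but the rough C++ equivalent looks like this; the 800-unit radius is an arbitrary example:

```cpp
// Wander-to-a-nearby-point, sketched in C++ (my Blueprint version is equivalent).
#include "AIController.h"
#include "NavigationSystem.h"

void WanderNearby(AAIController* Controller)
{
    APawn* Pawn = Controller->GetPawn();
    if (!Pawn)
    {
        return;
    }

    // Ask the navigation system for a random point on the navmesh that the
    // pawn can actually reach from where it stands.
    FVector Destination;
    const bool bFound = UNavigationSystemV1::K2_GetRandomReachablePointInRadius(
        Pawn, Pawn->GetActorLocation(), Destination, 800.f);

    if (bFound)
    {
        // Pathfinds along the navmesh -- no hopping off ledges unless nav links
        // are added later.
        Controller->MoveToLocation(Destination);
    }
}
```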


However, this was far from enough. We're not making a people-watching sim; these enemies need to be dangerous! And the first step to being dangerous is to identify the thing you are a danger to! To that end, my research led me to Unreal's AIPerception component!



This component, when attached to an AI Controller (think of it as a middleman between a pawn and a BT), lets me configure a vast array of senses. For my purposes, I only needed sight, hearing, and damage, but Unreal also offers senses like touch and prediction!
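I set all of this up in the AI Controller's details panel, but the equivalent C++ setup would look roughly like this (AEnemyAIController is a hypothetical AAIController subclass, and the radii are placeholder values):

```cpp
// Configuring sight on the AIPerception component (C++ sketch of my editor setup).
#include "AIController.h"
#include "Perception/AIPerceptionComponent.h"
#include "Perception/AISenseConfig_Sight.h"

AEnemyAIController::AEnemyAIController()
{
    // The component that actually does the sensing.
    PerceptionComponent = CreateDefaultSubobject<UAIPerceptionComponent>(TEXT("Perception"));

    // One config object per sense; hearing and damage would get their own.
    UAISenseConfig_Sight* SightConfig =
        CreateDefaultSubobject<UAISenseConfig_Sight>(TEXT("SightConfig"));
    SightConfig->SightRadius = 1500.f;               // the green radius
    SightConfig->LoseSightRadius = 2000.f;           // the red radius
    SightConfig->PeripheralVisionAngleDegrees = 70.f;
    SightConfig->DetectionByAffiliation.bDetectNeutrals = true;

    PerceptionComponent->ConfigureSense(*SightConfig);
    PerceptionComponent->SetDominantSense(SightConfig->GetSenseImplementation());
}
```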



With this set up, my enemy was now blessed with eyes and ears. The green radius is his vision, the yellow his hearing, and the red is where he will lose sight of things! The green sphere indicates where he last saw me, so cool! Now, upon sensing me, I could tell the AI to navigate to me and play a melee animation!
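The "upon sensing me" part is just an event bound to the perception component; in Blueprints it's the On Target Perception Updated event. A hedged C++ sketch of the same idea (the controller class and key names are illustrative):

```cpp
// Handling a perception update (C++ sketch of my Blueprint event).
// Bound to the perception component's OnTargetPerceptionUpdated delegate.
#include "AIController.h"
#include "BehaviorTree/BlackboardComponent.h"
#include "Perception/AIPerceptionTypes.h"

void AEnemyAIController::HandlePerceptionUpdated(AActor* Actor, FAIStimulus Stimulus)
{
    UBlackboardComponent* Blackboard = GetBlackboardComponent();
    if (!Blackboard)
    {
        return;
    }

    if (Stimulus.WasSuccessfullySensed())
    {
        // Saw or heard something: hand it to the behavior tree to chase and attack.
        Blackboard->SetValueAsObject(TEXT("TargetActor"), Actor);
    }
    else
    {
        // Lost it: remember where it was last sensed (the green debug sphere).
        Blackboard->SetValueAsVector(TEXT("LastKnownLocation"),
                                     Stimulus.StimulusLocation);
    }
}
```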


Still not much of a threat without a damage system, but he does his best!


After this, I took a slight detour from the current needs of my project, but I believe that further along in development this will come in handy. Throughout my research, I kept seeing really awesome enemy AI examples capable of advanced behavior, like strafing in a circle around the player or hiding from them behind cover. These were both made possible with the Environment Query System, or EQS. Basically, you can create Environment Queries (EQs) that generate points in a defined area of the environment and assign each one a score based on any number of tests.


An EQ for playing hide and seek!


For example, this EQ I made scores points on a grid based on two tests: one, whether the AI can navigate to that point, and two, whether or not the player can see that point. The blue points score 1 for pathfinding, but the green ones score 2 for pathfinding and visibility! From here, inside the behavior tree, I can run an environment query and then move to the highest-scoring point, or to a random point scoring in the top X percent, which lets me make some really awesome behaviors! This is only scratching the surface of what EQS can do, as this is what I made after just one hour of working with it. Our current enemy AI needs are pretty basic, but this will let me make some far more advanced AI for future enemies!
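To make the scoring concrete, here's a toy illustration of the idea (this is not the real EQS API, just the concept): every candidate point runs through each test, passing tests add to its score, and the query hands back the best point.

```cpp
// Toy illustration of EQS-style scoring -- not Unreal's actual EQS code.
#include <algorithm>
#include <functional>
#include <vector>

struct CandidatePoint
{
    float X = 0.f, Y = 0.f, Z = 0.f;
    float Score = 0.f;
};

// The two "tests" are passed in as callbacks standing in for EQS's
// pathfinding and visibility tests.
CandidatePoint PickBestPoint(
    std::vector<CandidatePoint> Points,
    const std::function<bool(const CandidatePoint&)>& CanNavigateTo,
    const std::function<bool(const CandidatePoint&)>& PlayerCanSee)
{
    if (Points.empty())
    {
        return {};
    }

    for (CandidatePoint& P : Points)
    {
        if (CanNavigateTo(P)) P.Score += 1.f; // the "blue" points stop here
        if (PlayerCanSee(P))  P.Score += 1.f; // the "green" points pass both
    }

    // Move-to target: the highest score (EQS can also pick a random point
    // from the top X percent instead).
    return *std::max_element(Points.begin(), Points.end(),
        [](const CandidatePoint& A, const CandidatePoint& B)
        {
            return A.Score < B.Score;
        });
}
```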


Back to my assignments for this sprint, though. Now that I had AI navigation and attacking set up, it was time to implement the first challenge for the player to face: the light enemy. Currently there are three basic enemies documented in the GDD: the light, medium, and heavy enemies. They largely do the same thing, but with different stats. Now, I could just create a bunch of variables inside my base enemy and edit them for each child, but Unreal offers a much more efficient way to get and store a large number of values that don't need to change at runtime: Data Tables!


It looks more like a matrix than a table to me


The other programmer on my team has far more experience than me, and this has been a really awesome opportunity to work with and learn from them. They showed me how to easily take a .csv file (which I can easily get from Google Sheets or Excel) and convert it into a Data Table in Unreal! This is awesome because my lead designer can create a spreadsheet and embed it into our design document, and then I can download that spreadsheet and plug it straight into Unreal whenever we want to update or add new enemy stats!


With this, I'm able to get a row from this data table anywhere in Blueprints and plug the relevant data in wherever I need it. This way, with one string of nodes in the base class, I can easily make new child classes and just tell them which row to fetch instead of manually assigning values!


It’s not limited to numbers either, you can store whatever type of Unreal Asset you want in a data table!
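On the C++ side, the same idea is a row struct derived from FTableRowBase plus a FindRow call; the column names below are illustrations, not our actual GDD stats:

```cpp
// A data-table row struct and a row lookup (C++ sketch; my real table comes
// from the designer's CSV, and these column names are only examples).
#include "Engine/DataTable.h"
#include "Engine/SkeletalMesh.h"
#include "EnemyStats.generated.h"

USTRUCT(BlueprintType)
struct FEnemyStatsRow : public FTableRowBase
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    float MaxHealth = 100.f;

    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    float MoveSpeed = 600.f;

    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    float AttackDamage = 10.f;

    // Not just numbers: rows can hold references to Unreal assets too.
    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    TSoftObjectPtr<USkeletalMesh> Mesh;
};

// For example, in the base enemy's BeginPlay: fetch whichever row the child
// class points at and plug the values in.
void ApplyStats(const UDataTable* EnemyTable, FName RowName)
{
    if (const FEnemyStatsRow* Row =
            EnemyTable->FindRow<FEnemyStatsRow>(RowName, TEXT("EnemyStats")))
    {
        // ...assign Row->MaxHealth, Row->MoveSpeed, etc. to the enemy here.
    }
}
```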


With this final puzzle piece, our first proper enemy was born: the light enemy! It's a small, agile little guy who will make a beeline towards the player and try to knock their lights out; it's very funny! Not only that, but with all this setup I can quickly make new enemies too. (I'm writing this during the next sprint, and a couple days in I've already added two new shooting enemies!)


Take your kid to work day!
(Left - Light enemy)
(Right - Normal-sized Mannequin)

