Multi-Surface Pathfinding in Alien: Rogue Incursion
Building reactive, fast-moving xenomorphs for a VR title.
AI and Games is made possible thanks to our premium subscribers who support our work. Subscribe now to have your name in video credits, contribute to future episode topics, watch content in early access, and enjoy exclusive supporters-only content.
If you’d like to work with us on your own games projects, please check out our consulting services provided at AI and Games. To sponsor our content, please visit our dedicated sponsorship page.
Alien: Rogue Incursion is the first VR game based on the popular horror franchise. It was first released as a VR title for Steam and PSVR2 in December 2024, followed by the Meta Quest 3 in February 2025 and a non-VR ‘Evolved’ Edition on PC, PlayStation 5 and Xbox Series S and X the following September. Given it’s one of the latest examples of the xenomorph in games, it’s a great opportunity to discuss an interesting technical challenge: how to get non-player characters pathfinding across multiple surfaces.
Follow AI and Games on: BlueSky | YouTube | LinkedIn | TikTok
Editorial Note: As I was finalising the written version of this episode for the website, I became aware that many of the developers behind this game have been laid off at Survios. I wish everyone affected the very best, and I hope you find new roles, be they in the industry or elsewhere, soon.
The Perfect Organism
Alien: Rogue Incursion has players take control of Zula Hendricks in a claustrophobic first-person shooter where you have to balance your resources and use all the tools at your disposal in order to survive the latest encounter with the perfect organism.
Trapped in the Castor’s Cradle research facility on LV-354, owned by Gemini Exoplanet Solutions, Hendricks fights to uncover the truth of what the corporation has really been up to as she arrives at the now-empty and damaged facility, and yeah... you can see where this is going. Lots of xenomorphs, lots of close encounters, and hugging that motion tracker very tightly.
Developed over a period of around three years, the goal of Rogue Incursion was to deliver the tension of the Alien franchise within the VR viewport. Players should feel that danger lurks not just around every corner, but in the vents, be they in the walls or the ceilings. To maximise the VR experience, the goal was to have xenomorphs be able to spawn anywhere in the game world and then use the environment to maximum effect: sneaking up on the player, catching you off guard, and ensuring no two playthroughs felt the same.
A big part of this is the navigation system, whereby the aliens climb along the walls and ceilings and through the vents to sneak up on the player, but also jump down onto nearby geometry and use different surfaces to close the distance as swiftly as possible. It’s the one thing Alien: Isolation did not do, given it wasn’t part of the original movies’ lore, and when we have seen it in other Alien games - ranging from Aliens: Fireteam Elite to Aliens vs Predator (1999) and... Aliens vs Predator (2010) - it’s often been far less complex in its implementation.
But what made this all the more interesting was that the game was originally built for VR headsets. While we’ve seen significant performance gains over the past 10 years in the likes of the Quest, there are still significant hardware constraints. VR games carry a heavier graphical overhead, given the need to render two separate images at higher frame rates than usual, and these devices still have limitations in areas such as memory bandwidth. Sure, this is mitigated if you’re playing the game on Steam or PSVR2, given your PC or console is running the game, but those limitations still exist - and are exacerbated when you play on a Meta Quest 3, which is a completely standalone device. This meant not just that there is a limit to the number of aliens the game can have actively fighting the player at once, but also a need to ensure that pathfinding is fast and efficient.
Walking on Walls
Now you might be thinking “Tommy, what’s so special about xenos crawling on the walls? Surely you just use a navigation mesh?”. First of all, brownie points for you, what a clever cookie, and yes, nav meshes are the standard for movement in 3D game engines, given they allow us to take the irregular geometry of 3D game worlds and compile it down to a 2D data structure we can pathfind on. But there are a lot of situations where navmeshes simply don’t work well, if at all. They’re not great for objects that move through volumes of 3D space - hence the Horizon games use custom pathfinding tools for enemies that fly or swim. Plus navmeshes often struggle with pathfinding on moving surfaces - which we talked about in part 3 of our series on the AI of Sea of Thieves.
But given that walls and other surfaces don’t move, and we simply move along those surfaces, surely what we could do instead is simply slap nav-meshes on all of the surfaces in the game world? And then use links between them so that an alien can shift from one surface to the other?
Well, as explained by Eugene Elkin and Curtis Perry in their talk at the 2025 GDC Game AI Summit, that’s exactly what Survios did. But it had a lot of problems. For one, it required rewriting a lot of Recast - the navigation mesh library developed by Mikko Mononen that is integrated into Unreal Engine - because navmeshes as a concept are designed to work on floors only. On top of this, it limited the ways in which an alien could traverse from one surface to another. It worked great in an empty square room, but as soon as level designers wanted to use more complex shapes for walls and ceilings, or filled the room with props and clutter, the available space - and the options for connecting traversable locations - became increasingly limited. After six months of having one developer work on this, it was abandoned for a different approach.
But it wasn’t just the issue of building the navigation mesh; it was also the resulting paths the aliens were moving on. The dev team wanted the xenomorphs to move in ways that felt unpredictable, playful even - sort of like a giant extraterrestrial cat from hell - and the team worked on a pre-vis animatic to help visualise what they wanted the aliens to do, rather than what they currently did. This highlighted that the default navmesh actor in Unreal would prove insufficient, given it aims to take the optimal route to the player. You could try attaching a navmesh modifier to the player, meaning the costs might shift, but that won’t guarantee the aliens prioritise using the ceiling, or sneaking through an air vent that leads to a more engaging experience. It really needed a different approach.


So the solution was to write a completely custom graph-based navigation system that worked alongside the navigation mesh on the floor. Each graph point dictates where an alien can crawl to leave the floor, and the paths it can take while on that surface, be it the ceiling or a wall. But these graph layouts required significant work by level designers in each area, so that the aliens only attempt to crawl on surfaces that make sense for gameplay. Designers also had to work with the AI programmers to build custom waypoint types so aliens could interact with gameplay features and objects like railings and vents, as well as custom jump points, ranging from precise positioning to allowing NPCs to land within a radius of a geometric feature.
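To make the idea concrete, here’s a minimal sketch of what such a designer-authored waypoint graph might look like as a data structure. All the names, types, and values here are my own illustration - the talk doesn’t detail Survios’ actual classes - but it captures the described pieces: typed waypoints (surfaces, vents, precise and radius-based jump points) with designer-set traversal costs on the links.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

# Hypothetical waypoint types, inferred from the description above.
class WaypointType(Enum):
    SURFACE = auto()       # a crawlable point on a wall or ceiling
    VENT = auto()          # entry/exit to a vent
    JUMP_PRECISE = auto()  # jump landing at an exact position
    JUMP_RADIUS = auto()   # jump landing anywhere within a radius

@dataclass
class Waypoint:
    position: tuple                 # (x, y, z) world position
    kind: WaypointType
    landing_radius: float = 0.0     # only meaningful for JUMP_RADIUS points
    neighbours: dict = field(default_factory=dict)  # waypoint id -> base cost

@dataclass
class NavGraph:
    waypoints: dict = field(default_factory=dict)   # id -> Waypoint

    def connect(self, a: int, b: int, cost: float):
        # Designer-authored link between two waypoints; bidirectional here,
        # though one-way links (e.g. drops) would also make sense.
        self.waypoints[a].neighbours[b] = cost
        self.waypoints[b].neighbours[a] = cost
```

The key design point is that the costs live on the links and are authored per-area, which is what lets designers steer aliens onto walls and ceilings later on.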
The next step was to figure out how to navigate, and address all the weird edge cases that arise as aliens move between rooms, jump onto and off of the navmesh, and even path near other aliens. Using the A* search algorithm, the idea was for each alien to figure out, when it spawns, its closest navigation position - be it on the navmesh or in the graph network - and then search for a target location that is generated either on the navmesh near the player, or at a position on the graph network close enough that it can jump from that surface and attack you.
As a reminder for anyone not familiar, with A* we evaluate each node in the graph based on the equation f(s) = g(s) + h(s) where:
g(s): The cost to reach the node along the currently built path
h(s): The estimated distance to the goal from that point - which in this case would just be a straight line from the location to the player.
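For anyone who wants to see those two terms in action, here’s a minimal A* sketch in Python. It isn’t Survios’ implementation - just the textbook algorithm over a dictionary-based graph, with a straight-line heuristic for h(s) and an `edge_cost` callable so that the cost of an edge can differ from its base value, mirroring the dynamic cost changes described in the talk.

```python
import heapq
import math

def a_star(graph, start, goal, positions, edge_cost):
    """Minimal A* search.
    graph:     node -> {neighbour: base_cost}
    positions: node -> (x, y, z), used for the straight-line heuristic h(s)
    edge_cost: callable (a, b, base_cost) -> float, allowing dynamic costs
    Returns the path as a list of nodes, or None if unreachable.
    """
    def h(n):
        return math.dist(positions[n], positions[goal])

    open_set = [(h(start), start)]   # priority queue ordered by f(s)
    g = {start: 0.0}                 # g(s): cost to reach each node so far
    came_from = {}
    while open_set:
        _, current = heapq.heappop(open_set)
        if current == goal:
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for nbr, base in graph[current].items():
            tentative = g[current] + edge_cost(current, nbr, base)
            if tentative < g.get(nbr, float("inf")):
                g[nbr] = tentative
                came_from[nbr] = current
                heapq.heappush(open_set, (tentative + h(nbr), nbr))
    return None
```

Passing `lambda a, b, base: base` gives you vanilla shortest-path A*; swapping in a different callable is all it takes to make the “cheapest” path something other than the shortest.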
This is a fairly rudimentary AI algorithm, and anyone who knows their AI basics will tell you that A* is designed to always find the cheapest path - and typically when we say cheapest in pathfinding, we mean the shortest. However, what the devs did for Rogue Incursion was allow the paths to dynamically change their cost values, so the cheapest path wasn’t guaranteed to be the shortest, but it would at least be interesting.
So after establishing a bunch of possible pairs of start and end points, the game runs a worker thread in the background that evaluates them against a set of criteria. First it checks whether they’re valid, because the alien might have trouble reaching the start point on the navmesh. It also checks whether existing xenos are using that part of the graph network in their own paths, and if so makes it more expensive, so as to dissuade another alien from following the exact same route.
Aliens are prioritised, so those that are already moving or close to the player are always guaranteed valid paths, while those farther away wait for something to free up. Plus, there’s an effort to tweak the cost values to diversify their strategies. Level designers set the cost of parts of the graph network as they build it, often encouraging aliens to traverse the walls or the ceiling by making it cheaper than simply moving across the floor. Meanwhile, a custom modifier kicks in that makes paths occluded from the player’s view less expensive, encouraging the aliens to flank and move along routes that are harder for the player to spot.
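The two modifiers described above - penalising edges other aliens are already using, and discounting edges hidden from the player - slot naturally into the `edge_cost` idea from A*. Here’s a sketch; the penalty and discount values are invented for illustration, not taken from the game.

```python
def modified_cost(a, b, base_cost, edges_in_use, occluded_edges,
                  occupancy_penalty=5.0, occlusion_discount=0.5):
    """Dynamic edge cost for pathfinding.
    edges_in_use:   set of (node, node) edges claimed by other aliens' paths
    occluded_edges: set of (node, node) edges not visible to the player
    The specific numbers are illustrative assumptions.
    """
    cost = base_cost
    if (a, b) in edges_in_use or (b, a) in edges_in_use:
        cost += occupancy_penalty     # dissuade aliens from sharing a route
    if (a, b) in occluded_edges or (b, a) in occluded_edges:
        cost *= occlusion_discount    # favour paths hidden from the player
    return cost
```

Because A* only ever sees the modified value, the “cheapest” route naturally shifts towards unoccupied, unseen surfaces without any extra search logic.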
This then still has to handle a variety of edge cases, like aliens accidentally bunching up during execution, or reacting to the player moving or doing something like closing a door. If xenos get bunched up, they run a deconflict algorithm where they’ll often either try to jump onto a nearby wall or ceiling, or even back off briefly to let their ally push up on you. Meanwhile, repathing occurs any time the player does something that blocks them, or if your movement causes their target destination to diverge from the player’s location by more than one metre. All of this, combined with a bunch of smoothing behaviours for the movement and efforts to vary the approach vectors as they attack the player, helps ensure that the aliens can approach from a variety of angles, keep you on edge, and catch you off guard.
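The repathing trigger itself is simple enough to express directly. A sketch, assuming the one-metre threshold from the talk and a boolean for “something blocked the path”:

```python
import math

REPATH_DISTANCE = 1.0  # metres; the divergence threshold described above

def needs_repath(target_destination, player_position, path_blocked):
    """Repath if the player blocked the route (e.g. closed a door), or if
    the player has drifted more than a metre from the alien's current
    target destination. Both positions are (x, y, z) tuples."""
    if path_blocked:
        return True
    return math.dist(target_destination, player_position) > REPATH_DISTANCE
```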
Designing Encounters
So with the aliens able to move around the world, the next step is to handle encounter design. Bearing in mind the game was built to run on the Meta Quest 3, and with each alien running a combination of a Behaviour Tree and Finite State Machines for its logic, there’s a risk that what little CPU resource is available per frame would get choked up by having too many NPCs active at once. But conversely, if there’s only one single active alien in the game, then we’re back playing Alien: Isolation - only now I have lots of guns.
So there’s a trade-off built into the game, whereby a director system manages the active xenomorphs in the map and how they respond to the player, and keeps their overall CPU costs low. One of the biggest tricks that helps with this is the tension players feel courtesy of the motion tracker: it’s very rare to see several aliens on the motion tracker while they’re also standing in front of you. In truth, most of the time when you see an alien on the motion tracker, it isn’t actually in the level as a fully-fledged xenomorph. Rather, it’s what is known as a Sim Entity: a lightweight, invisible NPC that floats around the map. When it decides to start hunting you - either because it heard a noise, you entered its proximity, or through scripted events - it moves to a spawn point in the map, typically somewhere in a vent, where it then turns into a fully-fledged, Behaviour-Tree-executing xenomorph.
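The decision for a Sim Entity to start hunting can be sketched as a cheap per-tick check - which is precisely why these entities are so much lighter than a full xenomorph. The class name, field names, and radius value below are all my own invention for illustration.

```python
from dataclasses import dataclass
import math

@dataclass
class SimEntity:
    """Lightweight stand-in for a xenomorph that only pings the motion
    tracker; illustrative names, not the game's actual classes."""
    position: tuple                 # (x, y, z) current roaming position
    proximity_radius: float = 15.0  # invented trigger distance, in metres
    hunting: bool = False

    def update(self, player_position, heard_noise=False, scripted=False):
        # Begin hunting if any of the three described triggers fire:
        # a noise, a scripted event, or the player entering proximity.
        if heard_noise or scripted or \
           math.dist(self.position, player_position) <= self.proximity_radius:
            self.hunting = True
        return self.hunting
```

Once `hunting` flips, the entity would head to the nearest spawn point (typically a vent) and be replaced by a full Behaviour-Tree-driven alien.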
Rather than manually placing aliens throughout the game, designers place what are known as Sim Regions: places from which a number of Sim Entities can spawn. These regions dictate the number of entities they can spawn, how long to wait between spawns, and their range. Since there are no loading screens in Rogue Incursion, the game is always paying attention to where you are, turning these Sim Regions on and off as you walk through the various buildings of Castor’s Cradle. These are then customised by designers, who can change their configuration at various points throughout the game to dictate the pacing. Hence you can go through an area that’s desolate early on, only for it to be teeming with xenomorphs towards the end of the game. There’s even a condition for the number of resurrections allowed per Sim Region, so even if you wipe out all of the aliens spawning from it, it can keep throwing more at you should the current story beat of the campaign require it.
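A Sim Region, as described, is essentially a designer-configured spawner with a cap, a cadence, and a resurrection budget. A minimal sketch, with all names and the exact budgeting rules assumed rather than taken from the game:

```python
from dataclasses import dataclass

@dataclass
class SimRegion:
    max_entities: int      # how many Sim Entities can be fielded at once
    spawn_interval: float  # seconds between spawns (unused in this sketch)
    spawn_range: float     # how far entities roam from the region
    resurrections: int     # extra respawns allowed after the initial batch
    active: bool = False   # toggled as the player moves through the level
    spawned: int = 0
    alive: int = 0

    def try_spawn(self) -> bool:
        # Spawn if active and under the cap; once the initial batch is
        # spent, draw from the resurrection budget instead.
        if not self.active or self.alive >= self.max_entities:
            return False
        if self.spawned < self.max_entities:
            self.spawned += 1
        elif self.resurrections > 0:
            self.resurrections -= 1
        else:
            return False
        self.alive += 1
        return True

    def on_entity_killed(self):
        self.alive = max(0, self.alive - 1)
```

Changing `max_entities` or `resurrections` at runtime is then all a designer needs to turn a desolate early-game area into a late-game hive.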
Closing
Alien: Rogue Incursion touches on many of the same ideas we’ve seen in other Alien games, and while constrained by the technical budget available to it, it still creates something unique that feels all the more tense when played through a VR headset. Though, all that said, I kind of like having a bit of separation between me and a facehugger as it tries to get intimate with my airways.
References
“Simulating the Perfect Organism in Alien: Rogue Incursion”, Eugene Elkin and Curtis Perry, GDC Game AI Summit 2025.









