School Raider

DISCLAIMER: This page is not complete; we have not finished this project yet.

TEMP PICTURE, BETTER COMING SOON

“With a slingshot and a bright mind, you can pass every test.”

Our pitch

A third-person puzzle/stealth game made over 15 weeks, part-time, by a team of 14.

No trailer yet.

My contributions

I mainly worked with the AI and the node editor for this project. I touched every single behaviour the AI used, but the ones written about below are the behaviours I did by myself. I also worked a bit with animations and sound. You can read more about some of the things I did down below!

Node Editor

One of our courses at TGA is about scripting with a node editor. We got the node editor from our school, but we had to create all the nodes ourselves, as well as some functionality such as copy & paste. In this project I took that node editor and implemented it in our engine. I also changed some things, like making it possible to save and load multiple node systems instead of only one, and I added buttons that let you simulate a script: if a script was triggered by a “Start Node”, you could click the start button in ImGui and then watch the flow in the editor. Another feature I added was the ability to click on a transition between two nodes and see more info about what happened, such as which value was currently being sent and from which type of node; this was used for debugging. The node editor was used by the level designers for scripting, but also by the animators, because we created an animation-tree system for them that they could use with nodes.
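To give an idea of what the transition debugger could show, here is a minimal sketch of a per-link debug record. All names (LinkDebugInfo, RecordTransition) are illustrative assumptions, not the project's actual code:

```cpp
#include <cassert>
#include <string>
#include <variant>

// Hypothetical per-link debug record: which node type sent the last value
// across this link, and what that value was.
struct LinkDebugInfo
{
    std::string fromNodeType; // e.g. "Start Node"
    std::variant<int, float, bool, std::string> lastValue;
    bool hasFired = false;
};

// Record a value as it travels across a link during a simulated run,
// so the editor can display it when the link is clicked.
template <typename T>
void RecordTransition(LinkDebugInfo& info, const std::string& fromType, const T& value)
{
    info.fromNodeType = fromType;
    info.lastValue = value;
    info.hasFired = true;
}
```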

Showing the node editor in our engine
Showing the link/transition debugger I made

General AI

The first thing I did with the AI was set up a structure and research what type of decision-making system we should use. I ended up choosing a behaviour tree. I had never worked with a behaviour tree before, but I had read about it in one of our courses. I did not write my own behaviour tree; instead I implemented an existing framework called BrainTree. I made one change to BrainTree: I replaced its blackboard with an opaque dictionary. The blackboard that BrainTree used only supported ints, floats, bools, strings and doubles, while the OpaqueDictionary can take any class, since it is templated and stores the data as void pointers. At the beginning of the project I created some empty interfaces for the different behaviours we had decided upon and experimented with the different types of node you can use in a behaviour tree. The behaviour tree was owned by the class AIController, which later on held most of the variables and handled the editing, saving and loading of the AIController component.
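A minimal sketch of the OpaqueDictionary idea described above: a templated blackboard that can store any type by erasing it behind a void pointer. The class and member names here are assumptions for illustration; a shared_ptr<void> is used so the stored object is still destroyed with the correct type:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <unordered_map>

// Sketch of a type-erased blackboard. Any copyable type can be stored;
// the value lives behind a void pointer, but the shared_ptr remembers
// the original type's destructor.
class OpaqueDictionary
{
public:
    template <typename T>
    void Set(const std::string& key, const T& value)
    {
        // make_shared<T> converts to shared_ptr<void> while keeping a typed deleter.
        myData[key] = std::make_shared<T>(value);
    }

    template <typename T>
    T& Get(const std::string& key)
    {
        // Caller must ask for the same type it stored.
        return *static_cast<T*>(myData.at(key).get());
    }

private:
    std::unordered_map<std::string, std::shared_ptr<void>> myData;
};
```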

I also created a class called AIFunctions with functions that I thought could be helpful for every behaviour. Some examples were the Move function that moved the AI towards a point, the Rotate function that rotated the AI towards a point, and HasReachedPoint, which checked whether the AI had reached a point. In the beginning these functions were very simple; the Move function, for example, only moved directly towards a position and didn't care if there were objects in the way. Having them in functions made it very easy to change functionality: when we got our pathfinding up and running I could just change the Move function a bit, and every behaviour would still work, but with better movement.
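A sketch of what the early, pre-pathfinding versions of two of these helpers could look like. The Vec3 struct and exact signatures are assumptions, not the project's real code:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x = 0, y = 0, z = 0; };

static float Distance(const Vec3& a, const Vec3& b)
{
    const float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Early Move: step straight towards the target, ignoring obstacles.
// Swapping this body for a pathfinding query later would not change callers.
Vec3 Move(const Vec3& position, const Vec3& target, float speed, float deltaTime)
{
    const float dist = Distance(position, target);
    if (dist < 1e-5f)
        return position;
    const float step = std::fmin(speed * deltaTime, dist); // don't overshoot
    const float t = step / dist;
    return { position.x + (target.x - position.x) * t,
             position.y + (target.y - position.y) * t,
             position.z + (target.z - position.z) * t };
}

// Reached when within a small tolerance radius of the point.
bool HasReachedPoint(const Vec3& position, const Vec3& point, float tolerance = 0.1f)
{
    return Distance(position, point) <= tolerance;
}
```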

I worked a lot with debug lines, trying to visually show what the AI was doing, both because the level designers had specifically asked for it and because it makes debugging so much easier.

FollowPlayer Behaviour

The FollowPlayer behaviour was pretty simple: if the player was in vision, the AI would follow him. The player was in vision if the AI could see him with the help of raycasts, or if the player was very close to the AI (we called this spider sense). For debugging purposes it was also possible to force the janitor to see the player without having line of sight to him. When the AI spotted the player, he would switch to a shader showing that he had noticed the player for a short amount of time, and then to a shader showing that the AI was angry. The FollowPlayer behaviour had a unique speed, so the AI would be faster when chasing the player, and he would also play another animation.
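The vision test described above could be sketched like this. The raycast is passed in as a callable so the sketch stays self-contained; the function name, the spider-sense radius parameter and the debug override flag are all assumptions:

```cpp
#include <cassert>
#include <cmath>
#include <functional>

struct Vec3 { float x = 0, y = 0, z = 0; };

static float Distance(const Vec3& a, const Vec3& b)
{
    const float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// The player is "in vision" if forced by the debug override, if he is within
// the spider-sense radius, or if a raycast gives line of sight.
bool IsPlayerInVision(const Vec3& aiPos, const Vec3& playerPos,
                      const std::function<bool(const Vec3&, const Vec3&)>& hasLineOfSight,
                      float spiderSenseRadius, bool debugForceSeePlayer = false)
{
    if (debugForceSeePlayer)
        return true; // debug: force the janitor to see the player
    if (Distance(aiPos, playerPos) <= spiderSenseRadius)
        return true; // spider sense: very close counts as seen
    return hasLineOfSight(aiPos, playerPos); // raycast check
}
```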

Patrol Behaviour

This behaviour was my favourite. The level designers could place waypoints in Unity that the AI would use while patrolling. You chose a name for the waypoint in Unity and also the name of the next waypoint. I later added some additional features that let us choose how long the AI should stay at a waypoint before going to the next one, and whether he should play a special kind of animation when he reached it.
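The waypoint data authored in Unity could look roughly like this: each waypoint names its successor, and carries the wait time and arrival animation added later. The struct and field names are assumptions based on the description above:

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Waypoints link to each other by name, forming the patrol route.
struct Waypoint
{
    std::string name;
    std::string nextWaypoint;     // name of the waypoint to go to afterwards
    float waitTime = 0.f;         // seconds to stay before moving on
    std::string arrivalAnimation; // optional special animation on arrival
};

// Follow the name link from the current waypoint to the next one.
const Waypoint* NextWaypoint(const std::unordered_map<std::string, Waypoint>& waypoints,
                             const std::string& currentName)
{
    auto current = waypoints.find(currentName);
    if (current == waypoints.end())
        return nullptr;
    auto next = waypoints.find(current->second.nextWaypoint);
    return next != waypoints.end() ? &next->second : nullptr;
}
```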

Distract Behaviour

There were radios in our levels that you could use to distract the AI. If the AI was not currently chasing you, he would get distracted by radios that were on: he would walk over to the radio, turn it off and then go back to whatever he was doing. A problem I had with this behaviour was the movement towards the radio. The radio was always outside the navmesh the AI was allowed to walk on, because it sat on a table or some other piece of furniture that wasn't walkable. I solved this by finding a point on the navmesh close to the radio and telling the AI to go there instead of to the radio's actual position. I could have looped through all the vertices in the navmesh and compared their distance to the radio's position, but that felt too expensive performance-wise: the bigger the navmesh, the longer it would take to find a point. Instead I chose to just check some points around the radio with different offsets. I talked with the level designers, and they agreed to always place the radios close to the navmesh, so I never had to check more than 3 meters around a radio.
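The offset search could be sketched as below: instead of scanning every navmesh vertex, probe a handful of points around the radio, out to the agreed 3 meters, and take the first one that lies on the navmesh. The navmesh query is passed in as a callable, and the probe pattern (cardinal directions, 0.5 m steps) is an assumption for illustration:

```cpp
#include <cassert>
#include <cmath>
#include <functional>
#include <optional>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

// Probe points around the radio at increasing distances, returning the
// first one the navmesh accepts, or nothing if none is found within 3 m.
std::optional<Vec3> FindWalkablePointNear(
    const Vec3& radioPos,
    const std::function<bool(const Vec3&)>& isOnNavmesh)
{
    const std::vector<Vec3> directions = { {1, 0, 0}, {-1, 0, 0}, {0, 0, 1}, {0, 0, -1} };
    for (float radius = 0.5f; radius <= 3.f; radius += 0.5f)
    {
        for (const Vec3& dir : directions)
        {
            const Vec3 candidate{ radioPos.x + dir.x * radius,
                                  radioPos.y,
                                  radioPos.z + dir.z * radius };
            if (isOnNavmesh(candidate))
                return candidate;
        }
    }
    return std::nullopt; // no walkable point within 3 m of the radio
}
```

The cost of this search is constant per radio (a few dozen probes at most), regardless of how large the navmesh grows, which is the trade-off described above.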

Stunned Behaviour

If the AI was shot by the player, he would stop his current behaviour and play a get-hit animation. When the animation was over he would get angry and run towards the player.

Waypoint changer

I created an easy way for the level designers to change the waypoint the AI was currently heading towards. When the player entered a trigger collider with the waypoint-changer script on it, the AI would change his target waypoint to whatever was set on the script. This gave the level designers more control over where the AI would be at certain points in a level. In the video below, the AI goes into this room when the player has dropped down onto the table, because there is a trigger with a waypoint changer there.
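A waypoint-changer component along these lines could be as small as the sketch below. The struct names and the trigger callback are assumptions; the point is simply that entering the trigger overwrites the AI's target waypoint:

```cpp
#include <cassert>
#include <string>

// The slice of AI state the changer touches: which waypoint it is heading for.
struct AIState
{
    std::string targetWaypoint;
};

// Placed on a trigger volume by a level designer.
struct WaypointChanger
{
    std::string newTargetWaypoint; // set on the script in the editor

    // Called when something enters the trigger collider; only the player
    // redirects the AI.
    void OnTriggerEnter(bool isPlayer, AIState& ai) const
    {
        if (isPlayer)
            ai.targetWaypoint = newTargetWaypoint;
    }
};
```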