Transcript:

Hey everybody and welcome back to another Gravity Ace Devlog!

This week I want to talk about AI programming. I’ve seen a few posts from people asking questions about basic AI recently. It seems overwhelming but I’m here to tell you that it can be simple. Complicated behavior is often just a side-effect of some fairly simple rules.

First, let’s look at drones. These little guys are ships that fly around and have one job: shoot the player. To do that, they need to be able to avoid obstacles, move towards the player, and aim their guns. As you can see, their movement pattern is interesting and somewhat unpredictable. It looks like complex behavior. As I move towards them they move back. As I pull away they give chase. They aim their cannons and shoot. They avoid the cavern walls and take bounces and hits from debris in stride. And they can even seem to navigate arbitrary pathways within the cave system.

Another thing you’ll notice is that they can’t hit each other with their own bullets. That simplifies things a lot. And in this clip I’ve given my shield 5000 health otherwise the video would’ve been a lot harder to make.
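The video doesn’t show how the friendly-fire immunity is set up, but in Godot this kind of thing usually falls out of collision layers and masks. Here’s a hypothetical sketch (the layer numbers and the `bullet.tscn` path are assumptions, not taken from the project): enemy bullets live on their own layer, and their mask simply leaves the enemy bit unset, so the physics engine never reports enemy/bullet contacts.

```gdscript
# Hypothetical layer assignment (not confirmed by the video):
#   bit 0 = world/player, bit 1 = enemies, bit 2 = enemy bullets
var bullet = preload("res://bullet.tscn").instance()
bullet.collision_layer = 1 << 2  # this bullet lives on the enemy-bullet layer
bullet.collision_mask = 1 << 0   # ...and only collides with the world/player layer
add_child(bullet)
```

Because the mask never includes the enemy layer, drones can fire through each other without any special-case code.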

Node setup

Now let’s look at the node setup. Drones and other moving objects in the game are RigidBody2Ds. Rigid bodies are great because you get a lot of physics interactions basically for free – the engine just makes it work in its simulation. Give a RigidBody2D a push and it’ll bounce, fall, and collide without you writing any code. Pretty cool.

There’s a collision shape of course. And there are two sprites, one for the base of the drone and one for the turret. This allows the turret to rotate independently of the body.

Some of these nodes are cosmetic so I’ll skip over them but let me show you some of the animations. We’ve got animations for the drones appearing from a warp, for when they’ve been destroyed, a shooting animation, and a flying animation.

Next there are some timers for handling cooldowns and I’ll talk about those more when we get into the code.

There are sound effect nodes, UI nodes for showing health bars…
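Pulling the pieces above together, the drone scene might look roughly like this. The names `rays`, `turret`, `Tween`, `shootCooldown`, `VisibilityEnabler2D`, and `AnimationPlayer` appear in the code later in this post; everything else is a guess at the structure being described:

```
drone (RigidBody2D)
├── CollisionShape2D
├── base (Sprite)          # drone body
├── turret (Sprite)        # rotates independently of the body
├── AnimationPlayer        # warp-in, destroyed, firing, flying
├── shootCooldown (Timer)
├── rays (Node2D)          # rotating "whisker" container
│   └── RayCast2D × 4
├── Tween
├── VisibilityEnabler2D
└── sound / UI nodes (health bar, etc.)
```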

These RayCast2D nodes are important. These are how the drones navigate and move without colliding with things. Think of them like whiskers. I have four of them, each one is configured to collide with certain important objects in the world like structures and walls and the player. I’ll show you how they work in a minute.
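In the real project these rays are presumably configured in the editor, but the same setup can be sketched in code. The reach, the spacing, and the mask value here are assumptions for illustration; only the four-ray arrangement and the `$rays` container come from the video:

```gdscript
func _ready():
	for i in range(4):
		var ray = RayCast2D.new()
		# Four whiskers spaced 90 degrees apart; 120 px reach is assumed.
		ray.cast_to = Vector2(120, 0).rotated(i * PI / 2)
		# Collide only with the layers that matter (walls, structures, player);
		# the actual mask value is a guess.
		ray.collision_mask = 0b101
		ray.enabled = true  # RayCast2D nodes are disabled by default
		$rays.add_child(ray)
```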

Let’s dig into some code.

There’s a lot going on here but I’ll give you the highlights that are relevant to the AI system. First of all, every enemy has health and a flag to say whether the enemy is alive or dead. This one also has a flag to tell if it’s “asleep”. I’ve made the drones go dormant when they’re off screen or when they first appear and then take a second or two to wake up.
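The flags described above might be set up something like this. This is a hypothetical sketch, not the project’s actual code; only the `alive` and `asleep` names come from the video, and the wake-up delay is a guess:

```gdscript
var health := 3
var alive := true
var asleep := true  # drones spawn dormant

func _ready():
	# Wake up after a short random delay so a freshly warped-in
	# drone doesn't react instantly (delay range assumed).
	yield(get_tree().create_timer(rand_range(1.0, 2.0)), "timeout")
	asleep = false
```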

_integrate_forces() is where most of the magic happens. This is a built-in function called by the engine that lets you read and modify the physics state of the RigidBody2D directly. I’m using it to program AI behavior into the drones.

```gdscript
func _integrate_forces(state):
	if Game.is_edit_mode():
		linear_velocity = Vector2()
		angular_velocity = 0
		if _editor_transform:
			state.transform = _editor_transform
			_editor_transform = null
		return
	if not alive:
		return
	if asleep:
		return
	if not Game.player:
		return
	if not Game.player.alive:
		return

	var delta = state.get_step()

	# Check nearby objects with raycast
	var closest_collision = null
	$rays.rotation += delta * 11 * PI
	for ray in $rays.get_children():
		if ray.is_colliding():
			var collision_point = ray.get_collision_point() - global_position
			if closest_collision == null:
				closest_collision = collision_point
			if collision_point.length() < closest_collision.length():
				closest_collision = collision_point

	# Dodge
	if closest_collision:
		var normal = -closest_collision.normalized()
		var dodge_direction = 1
		if randf() < 0.5:
			dodge_direction = -1
		linear_velocity += normal * MAX_THRUST * 2 * delta
		linear_velocity += normal.rotated(PI / 2 * dodge_direction) * MAX_THRUST * delta

	# Steer towards player
	var distance_to_player = global_position.distance_to(Game.player.global_position)
	var vector_to_player = (Game.player.global_position - global_position).normalized()

	# Rotate turret
	var start = $turret.rotation
	var angle_to_target = Vector2(1, 0).rotated(start).angle_to(vector_to_player)
	var end = start + angle_to_target
	$Tween.interpolate_property($turret, 'rotation',
			start, end, 0.1,
			Tween.TRANS_QUAD, Tween.EASE_OUT)
	$Tween.start()

	if distance_to_player > 150:
		# Move towards player
		linear_velocity += vector_to_player * MAX_THRUST * delta
		hovering = false
	else:
		# Move away from player
		hovering = false
		linear_velocity += -vector_to_player * MAX_THRUST * delta

	# Clamp max speed
	if linear_velocity.length() > MAX_SPEED:
		linear_velocity = linear_velocity.normalized() * MAX_SPEED
```

First, I just check if the game is in editor mode. When in the editor the drones’ AI is deactivated and they don’t move.

Next I do some checks to see if the drone is asleep or dead, if the player exists, and if the player is alive.

Then I get the delta time for this physics step. In Godot, physics runs at a fixed tick rate – 60fps by default, but configurable – so I grab the time delta here so that everything runs at a constant rate regardless of hardware or config settings.

OK, that’s setup done. Now let’s look at the actual algorithm.

First, remember those RayCast2Ds? This block of code rotates them as a group at 11π radians per second (about five and a half full rotations per second). It checks each ray for a collision and keeps the closest collision point.

Next, if a closest collision point was found, the drone tries to dodge it: it thrusts directly away from the collision (at double strength) and also adds a perpendicular component, randomly choosing left or right, both scaled by MAX_THRUST and the time delta.

Next we calculate a distance and normalized vector to the player. Those values are used to rotate the turret towards the player. A Tween node is used to smoothly animate the turret rotation.

Then we check our distance to the player. If it’s greater than 150 pixels then move towards the player. If we’re too close then reverse away from the player.

Finally, we’ll clamp the max velocity of the drone so that it doesn’t keep speeding up without limit.

That’s it for movement. Conceptually pretty simple. It’s like a blind animal that can smell the player and uses its whiskers to detect walls and other obstacles nearby to avoid collisions. It rotates its whiskers looking for collisions, tries to dodge perpendicular to anything it’s about to hit, aims the turret at the player position, and tries to maintain a distance from the player of about 150 pixels.

Shooting is the only other major thing it does. The shootCooldown node is a Timer. It calls shoot() whenever it times out, and the shoot() function randomizes the timer and restarts it. That keeps the drone firing regularly (with a little random variation) until it goes off screen or the drone or player dies. If everyone is still alive, awake, and on screen, the drone actually fires a bullet via the AnimationPlayer.

```gdscript
func shoot():
	$shootCooldown.wait_time = COOLDOWN * (1 + rand_range(-0.25, 0.25))
	$shootCooldown.start()
	if Game.is_edit_mode():
		return
	if not alive:
		return
	if not Game.player:
		return
	if not Game.player.alive:
		return
	if asleep:
		return
	if not $VisibilityEnabler2D.is_on_screen():
		return
	shoot_sfx()
	# Start firing animation
	$AnimationPlayer.play('firing')
```

Firing animation

So that’s the major components of the drone AI. Feel free to ask any questions you have in the comments. I’ll also be posting this code over on GRAVITY ACE DOT COM and I’ll include a link in the description.

Thanks for watching and see you next time!