AI System Overview

Web Engine provides a comprehensive AI system for creating intelligent game agents with behavior trees, state machines, perception, navigation, and steering behaviors.

AI Architecture

The AI system is built on a modular ECS architecture with several interconnected subsystems:

Core Components

  • AIAgent — Core component that links entities to AI decision-making systems
  • Perception — Vision and hearing detection for environmental awareness
  • Perceivable — Tag component marking entities as detectable by AI

Decision-Making Systems

Behavior Trees

Hierarchical decision-making with composite nodes (Selector, Sequence, Parallel), decorator nodes (Inverter, Repeater, Timeout), and action/condition nodes.

State Machines

Simple FSM and Hierarchical FSM with nested states, transition guards, event-driven transitions, and history states (shallow and deep).

Utility AI

Score-based decision making with considerations (health, distance, enemy count) and response curves for dynamic action selection.
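The scoring idea can be illustrated in isolation: each action multiplies together its considerations after passing them through a response curve, and the highest score wins. All names below are hypothetical, not the engine's UtilityAI API:

```typescript
// Sketch of score-based action selection with a response curve.
type Consideration = () => number; // Returns a normalized value in [0, 1]

// Quadratic response curve: small inputs are suppressed, large ones emphasized
const quadratic = (x: number): number => x * x;

interface UtilityAction {
  name: string;
  considerations: Consideration[];
}

// Score each action as the product of its curved considerations; highest wins
function selectAction(actions: UtilityAction[]): string {
  let bestName = actions[0].name;
  let bestScore = -Infinity;
  for (const action of actions) {
    let score = 1;
    for (const consider of action.considerations) {
      score *= quadratic(consider());
    }
    if (score > bestScore) {
      bestScore = score;
      bestName = action.name;
    }
  }
  return bestName;
}

const health = 0.2;   // 20% health remaining
const distance = 0.8; // Normalized distance to the nearest enemy

// Low health and a distant enemy make fleeing outscore attacking
const chosen = selectAction([
  { name: 'attack', considerations: [() => health, () => 1 - distance] },
  { name: 'flee', considerations: [() => 1 - health, () => 1 - distance] },
]);
```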

Quick Start

Basic AI Setup

setup-ai.ts

```typescript
import { addComponent } from 'bitecs'; // ECS helpers come from bitECS
import {
  AIAgent,
  Perception,
  Perceivable,
  initAIAgent,
  initPerception,
  initPerceivable,
  AIControllerType,
  AIAffiliation,
} from '@web-engine-dev/core/engine/ai';

// Create an AI agent with perception
addComponent(world, AIAgent, enemyEid);
addComponent(world, Perception, enemyEid);

// Initialize with a behavior tree controller
initAIAgent(enemyEid, AIControllerType.BehaviorTree, AIAffiliation.Hostile);
initPerception(enemyEid);

// Make the player perceivable to AI agents
addComponent(world, Perceivable, playerEid);
initPerceivable(playerEid, AIAffiliation.Friendly);
```

Creating a Behavior Tree

behavior-tree.ts

```typescript
import {
  registerBehaviorTree,
  assignBehaviorTree,
  BTSelector,
  BTSequence,
  BTCondition,
  BTAction,
  BTStatus,
} from '@web-engine-dev/core/engine/ai';

// Register a behavior tree
registerBehaviorTree('GuardAI', () => {
  return new BTSelector([
    // Combat behavior
    new BTSequence([
      new BTCondition('hasTarget', (ctx) => ctx.blackboard.has('targetEid')),
      new BTCondition('inRange', (ctx) => {
        const dist = ctx.blackboard.get('distanceToTarget') ?? Infinity;
        return dist < 10;
      }),
      new BTAction('attack', (ctx) => {
        // Attack logic
        return BTStatus.SUCCESS;
      }),
    ]),
    // Patrol behavior
    new BTSequence([
      new BTCondition('hasPatrol', (ctx) => ctx.blackboard.has('patrolWaypoints')),
      new BTAction('patrol', (ctx) => {
        // Patrol logic
        return BTStatus.RUNNING;
      }),
    ]),
    // Idle fallback
    new BTAction('idle', (ctx) => {
      return BTStatus.SUCCESS;
    }),
  ]);
});

// Assign the tree to an entity
assignBehaviorTree(enemyEid, 'GuardAI');
```
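The tree above uses only composite and leaf nodes. The decorator nodes mentioned earlier (Inverter, Repeater, Timeout) wrap a single child; the following self-contained sketch illustrates two of them conceptually, with hypothetical class and enum names rather than the engine's BT node types:

```typescript
// Conceptual decorator-node sketch; names are hypothetical, not engine exports.
enum NodeStatus { SUCCESS, FAILURE, RUNNING }

interface BTNode {
  tick(): NodeStatus;
}

// Inverter: flips SUCCESS/FAILURE of its single child; RUNNING passes through
class Inverter implements BTNode {
  constructor(private child: BTNode) {}
  tick(): NodeStatus {
    const s = this.child.tick();
    if (s === NodeStatus.RUNNING) return NodeStatus.RUNNING;
    return s === NodeStatus.SUCCESS ? NodeStatus.FAILURE : NodeStatus.SUCCESS;
  }
}

// Repeater: re-runs its child `times` times, reporting RUNNING until done
class Repeater implements BTNode {
  private completed = 0;
  constructor(private child: BTNode, private times: number) {}
  tick(): NodeStatus {
    if (this.child.tick() !== NodeStatus.RUNNING) this.completed++;
    if (this.completed >= this.times) {
      this.completed = 0; // Reset so the decorator can run again
      return NodeStatus.SUCCESS;
    }
    return NodeStatus.RUNNING;
  }
}
```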

AI Agent States

AI agents have high-level operational states that control behavior execution:

| State | Description |
| --- | --- |
| Disabled | Agent is inactive/disabled |
| Idle | Agent is idle, not executing behavior |
| Active | Agent is actively executing behavior |
| Paused | Agent behavior is paused (can be resumed) |
| Alert | Agent is in alert state (detected something) |
| Combat | Agent is in combat/hostile state |
| Fleeing | Agent is fleeing/retreating |
| Dead | Agent is dead/incapacitated |
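As an illustration only, these states could be modeled as a numeric enum with a simple escalation rule. Both the `AIAgentState` name and the `escalate` helper below are hypothetical, not engine exports:

```typescript
// Hypothetical enum mirroring the state table above; check the engine's
// exports for the real type and values.
enum AIAgentState {
  Disabled, Idle, Active, Paused, Alert, Combat, Fleeing, Dead,
}

// Example escalation rule: an Idle or Active agent that perceives a hostile
// goes to Alert; Paused and Dead agents never change state automatically.
function escalate(state: AIAgentState, sawHostile: boolean): AIAgentState {
  if (state === AIAgentState.Paused || state === AIAgentState.Dead) {
    return state;
  }
  if (sawHostile && (state === AIAgentState.Idle || state === AIAgentState.Active)) {
    return AIAgentState.Alert;
  }
  return state;
}
```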

Blackboard System

The blackboard is a shared memory system for AI decision-making. Each agent has its own blackboard for storing and sharing data between behavior nodes:

blackboard.ts

```typescript
import {
  getBlackboard,
  setBlackboardValue,
  getBlackboardValue,
} from '@web-engine-dev/core/engine/ai';

// Get the agent's blackboard ID
const blackboardId = AIAgent.blackboardId[eid];

// Set values
setBlackboardValue(blackboardId, 'targetEid', enemyEntity);
setBlackboardValue(blackboardId, 'health', 100);
setBlackboardValue(blackboardId, 'alertLevel', 0.5);

// Get values
const target = getBlackboardValue<number>(blackboardId, 'targetEid');
const health = getBlackboardValue<number>(blackboardId, 'health');

// In behavior tree nodes, access the blackboard via the context
const hasTarget = (ctx) => ctx.blackboard.has('targetEid');
const isHealthLow = (ctx) => (ctx.blackboard.get('health') ?? 100) < 20;
```

Perception System

AI agents can perceive their environment through vision and hearing:

Vision Configuration

  • Range — Maximum view distance (default: 30m)
  • FOV Angle — Field of view angle in degrees (default: 110°)
  • Peripheral Range — Peripheral vision distance (default: 15m)
  • Peripheral FOV — Peripheral field of view (default: 180°)
  • Eye Height — Height offset for vision rays (default: 1.6m)
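Conceptually, these parameters combine into a two-cone test: a long, narrow primary cone plus a shorter, wider peripheral cone. The following self-contained 2D sketch shows that math using the listed defaults (eye height and occlusion raycasts omitted); it illustrates the idea, not the engine's implementation:

```typescript
// 2D vision-cone test using the documented defaults: primary cone 30 m / 110°,
// peripheral cone 15 m / 180°. A conceptual sketch only.
const VISION_RANGE = 30;
const FOV_DEG = 110;
const PERIPHERAL_RANGE = 15;
const PERIPHERAL_FOV_DEG = 180;

function inCone(
  dx: number, dy: number, // Vector from eye to target
  fx: number, fy: number, // Agent's normalized facing direction
  range: number, fovDeg: number,
): boolean {
  const dist = Math.hypot(dx, dy);
  if (dist > range || dist === 0) return false;
  // Compare the angle between facing and direction-to-target to the half-FOV
  const cosAngle = (dx * fx + dy * fy) / dist;
  return cosAngle >= Math.cos((fovDeg / 2) * (Math.PI / 180));
}

function canSee(dx: number, dy: number, fx: number, fy: number): boolean {
  return (
    inCone(dx, dy, fx, fy, VISION_RANGE, FOV_DEG) ||
    inCone(dx, dy, fx, fy, PERIPHERAL_RANGE, PERIPHERAL_FOV_DEG)
  );
}
```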

Hearing Configuration

  • Range — Maximum hearing distance (default: 20m)
  • Threshold — Minimum sound intensity to detect (default: 0.1)
  • Sensitivity — Hearing acuity multiplier (default: 1.0)
  • Occlusion — Whether sounds through walls are muffled
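A sound is typically heard when its attenuated intensity, scaled by the listener's sensitivity, clears the threshold. Here is a self-contained sketch of that check, assuming linear falloff over the sound's radius (an assumption; the engine's falloff curve and occlusion handling may differ):

```typescript
// Hearing-check sketch. Parameter defaults mirror the list above; this is
// illustrative, not the engine's implementation.
function canHear(
  distance: number,
  soundIntensity: number, // Loudness at the source (0-1)
  soundRadius: number,    // Falloff radius of the sound
  hearingRange = 20,      // Listener's maximum hearing distance
  threshold = 0.1,        // Minimum perceived intensity to detect
  sensitivity = 1.0,      // Hearing acuity multiplier
): boolean {
  if (distance > hearingRange || distance > soundRadius) return false;
  const falloff = 1 - distance / soundRadius; // Linear attenuation (assumed)
  return soundIntensity * falloff * sensitivity >= threshold;
}
```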

Emitting Sounds

emit-sound.ts

```typescript
import { emitSound, StimulusType } from '@web-engine-dev/core/engine/ai';

// Emit a sound stimulus that AI agents can hear
emitSound(
  sourceEid, // Entity making the sound
  position,  // World position [x, y, z]
  0.8,       // Intensity: loudness (0-1)
  15.0,      // Radius: hearing radius
  2.0        // Duration: how long the stimulus persists
);
```

AI Systems

The AI module includes several ECS systems that should be added to your game loop:

  • PerceptionSystem — Handles vision and hearing detection, updates tracked entities
  • AIBehaviorTreeSystem — Executes behavior trees for agents with BehaviorTree controller
  • AIStateMachineSystem — Updates state machines for agents with StateMachine controller
  • UtilityAISystem — Evaluates utility AI for agents with UtilityAI controller
  • SteeringSystem — Applies steering behaviors for autonomous movement
  • FormationSystem — Manages group AI with formation patterns

Performance & Optimization

Zero-GC Design

The AI system is designed for zero garbage collection in hot paths:

  • Pre-allocated perception storage with circular buffers for stimuli
  • Object pooling for blackboards and behavior tree instances
  • Typed arrays for component data (bitECS integration)
  • Reused vectors in steering behaviors and pathfinding

LOD System

Perception LOD (Level of Detail) reduces AI update frequency based on distance from a focus point:

perception-lod.ts

```typescript
import {
  setPerceptionLODEnabled,
  setLODFocusPosition,
  setPerceptionLODConfig,
} from '@web-engine-dev/core/engine/ai';

// Enable perception LOD
setPerceptionLODEnabled(true);

// Set the focus position (usually the camera or player)
setLODFocusPosition(cameraPosition);

// Configure LOD distances
setPerceptionLODConfig({
  lodDistances: [20, 50, 100],            // Distance thresholds
  updateIntervals: [0.1, 0.25, 0.5, 1.0], // Update rate per LOD band
  maxTrackedPerLOD: [32, 16, 8, 4],       // Entity tracking limit per LOD band
});
```
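Note that three `lodDistances` thresholds define four LOD bands, which is why `updateIntervals` and `maxTrackedPerLOD` have four entries each. A sketch of the band lookup, assuming this interpretation of the config:

```typescript
// Map a distance from the focus point to a LOD band index. Three thresholds
// yield four bands (0 = closest, most frequent updates). This interprets the
// config shape above; the engine's exact banding may differ.
function lodIndex(distance: number, lodDistances: number[]): number {
  for (let i = 0; i < lodDistances.length; i++) {
    if (distance < lodDistances[i]) return i;
  }
  return lodDistances.length; // Beyond the last threshold: coarsest band
}

const lodDistances = [20, 50, 100];
const updateIntervals = [0.1, 0.25, 0.5, 1.0];

// An agent 60 m from the camera lands in band 2 and updates every 0.5 s
const interval = updateIntervals[lodIndex(60, lodDistances)];
```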

Spatial Indexing

The AI system uses a spatial index for efficient neighbor queries:

spatial-index.ts

```typescript
import { getAISpatialIndex } from '@web-engine-dev/core/engine/ai';

const spatialIndex = getAISpatialIndex();

// Query nearby entities
const neighbors = spatialIndex.queryRadius(
  position,   // Query center
  radius,     // Search radius
  maxResults  // Maximum number of results
);
```

Debugging & Visualization

The AI system includes debug rendering capabilities:

debug-renderer.ts

```typescript
import { getAIDebugRenderer } from '@web-engine-dev/core/engine/ai';

const debugRenderer = getAIDebugRenderer();

// Configure debug options
debugRenderer.setOptions({
  showVisionCones: true,
  showHearingRadii: true,
  showTrackedEntities: true,
  showPaths: true,
  showSteeringForces: true,
  showFormations: true,
  showBehaviorTreeStatus: true,
  showStateNames: true,
});

// Render the debug visualization
debugRenderer.render(world, scene, camera);
```

Memory Management

Properly clean up AI resources when entities are destroyed:

cleanup.ts

```typescript
import {
  cleanupAIAgent,
  cleanupPerception,
  cleanupAIEntity,
  resetAllAISystems,
} from '@web-engine-dev/core/engine/ai';

// Clean up an individual entity
cleanupAIAgent(eid);    // Releases the blackboard
cleanupPerception(eid); // Releases perception storage
cleanupAIEntity(eid);   // Comprehensive cleanup

// Reset all AI systems (e.g. on scene unload)
resetAllAISystems();
```