AI System Overview
Web Engine provides a comprehensive AI system for creating intelligent game agents with behavior trees, state machines, perception, navigation, and steering behaviors.
AI Architecture
The AI system is built on a modular ECS architecture with several interconnected subsystems:
Core Components
- AIAgent — Core component that links entities to AI decision-making systems
- Perception — Vision and hearing detection for environmental awareness
- Perceivable — Tag component marking entities as detectable by AI
Decision-Making Systems
Behavior Trees
Hierarchical decision-making with composite nodes (Selector, Sequence, Parallel), decorator nodes (Inverter, Repeater, Timeout), and action/condition nodes.
State Machines
Simple FSM and Hierarchical FSM with nested states, transition guards, event-driven transitions, and history states (shallow and deep).
Utility AI
Score-based decision making with considerations (health, distance, enemy count) and response curves for dynamic action selection.
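The scoring idea can be sketched in a few lines. This is a standalone illustration of considerations and response curves, not the engine's UtilityAI API: each consideration maps a normalized input through a curve to [0, 1], and an action's score is the product of its considerations.

```typescript
// Standalone utility-scoring sketch (not the engine API).
type Curve = (x: number) => number;

// Example response curves over normalized inputs
const linear: Curve = (x) => Math.min(Math.max(x, 0), 1);
const inverse: Curve = (x) => 1 - linear(x);
const quadratic: Curve = (x) => linear(x) ** 2;

interface Consideration {
  input: () => number; // Normalized raw input in [0, 1]
  curve: Curve;
}

// Multiply consideration scores: any score near zero vetoes the action
function scoreAction(considerations: Consideration[]): number {
  return considerations.reduce((acc, c) => acc * c.curve(c.input()), 1);
}

// Agent context: low health, enemy fairly close
const health = 0.25;   // 25% health
const distance = 0.3;  // 30% of max engagement range

const attackScore = scoreAction([
  { input: () => health, curve: linear },    // Prefer attacking when healthy
  { input: () => distance, curve: inverse }, // Prefer attacking when close
]);

const fleeScore = scoreAction([
  { input: () => health, curve: inverse },   // Prefer fleeing when hurt
]);

const best = fleeScore > attackScore ? 'flee' : 'attack';
```

With these inputs the hurt agent flees: `attackScore` is 0.25 × 0.7 = 0.175, while `fleeScore` is 0.75.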
Quick Start
Basic AI Setup
```typescript
// `world` and `addComponent` come from your bitECS world setup
import {
  AIAgent,
  Perception,
  Perceivable,
  initAIAgent,
  initPerception,
  initPerceivable,
  AIControllerType,
  AIAffiliation,
} from '@web-engine-dev/core/engine/ai';

// Create an AI agent with perception
addComponent(world, AIAgent, enemyEid);
addComponent(world, Perception, enemyEid);

// Initialize with behavior tree controller
initAIAgent(enemyEid, AIControllerType.BehaviorTree, AIAffiliation.Hostile);
initPerception(enemyEid);

// Make player perceivable
addComponent(world, Perceivable, playerEid);
initPerceivable(playerEid, AIAffiliation.Friendly);
```

Creating a Behavior Tree
```typescript
import {
  registerBehaviorTree,
  assignBehaviorTree,
  BTSelector,
  BTSequence,
  BTCondition,
  BTAction,
  BTStatus,
} from '@web-engine-dev/core/engine/ai';

// Register a behavior tree
registerBehaviorTree('GuardAI', () => {
  return new BTSelector([
    // Combat behavior
    new BTSequence([
      new BTCondition('hasTarget', (ctx) =>
        ctx.blackboard.has('targetEid')
      ),
      new BTCondition('inRange', (ctx) => {
        const dist = ctx.blackboard.get('distanceToTarget') ?? Infinity;
        return dist < 10;
      }),
      new BTAction('attack', (ctx) => {
        // Attack logic
        return BTStatus.SUCCESS;
      }),
    ]),
    // Patrol behavior
    new BTSequence([
      new BTCondition('hasPatrol', (ctx) =>
        ctx.blackboard.has('patrolWaypoints')
      ),
      new BTAction('patrol', (ctx) => {
        // Patrol logic
        return BTStatus.RUNNING;
      }),
    ]),
    // Idle fallback
    new BTAction('idle', (ctx) => {
      return BTStatus.SUCCESS;
    }),
  ]);
});

// Assign to entity
assignBehaviorTree(enemyEid, 'GuardAI');
```

AI Agent States
AI agents have high-level operational states that control behavior execution:
| State | Description |
|---|---|
| Disabled | Agent is inactive/disabled |
| Idle | Agent is idle, not executing behavior |
| Active | Agent is actively executing behavior |
| Paused | Agent behavior is paused (can be resumed) |
| Alert | Agent is in alert state (detected something) |
| Combat | Agent is in combat/hostile state |
| Fleeing | Agent is fleeing/retreating |
| Dead | Agent is dead/incapacitated |
Blackboard System
The blackboard is a shared memory system for AI decision-making. Each agent has its own blackboard for storing and sharing data between behavior nodes:
```typescript
import { AIAgent, getBlackboard, setBlackboardValue, getBlackboardValue } from '@web-engine-dev/core/engine/ai';

// Get agent's blackboard ID
const blackboardId = AIAgent.blackboardId[eid];

// Set values
setBlackboardValue(blackboardId, 'targetEid', enemyEntity);
setBlackboardValue(blackboardId, 'health', 100);
setBlackboardValue(blackboardId, 'alertLevel', 0.5);

// Get values
const target = getBlackboardValue<number>(blackboardId, 'targetEid');
const health = getBlackboardValue<number>(blackboardId, 'health');

// In behavior tree nodes, access via context
const hasTarget = (ctx) => ctx.blackboard.has('targetEid');
const isHealthLow = (ctx) => (ctx.blackboard.get('health') ?? 100) < 20;
```

Perception System
AI agents can perceive their environment through vision and hearing:
Vision Configuration
- Range — Maximum view distance (default: 30m)
- FOV Angle — Field of view angle in degrees (default: 110°)
- Peripheral Range — Peripheral vision distance (default: 15m)
- Peripheral FOV — Peripheral field of view (default: 180°)
- Eye Height — Height offset for vision rays (default: 1.6m)
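The geometry behind these parameters can be sketched as a cone test. This is a standalone 2D illustration of the check the vision parameters imply, not the engine's perception code; occlusion raycasts and eye-height offsets are omitted:

```typescript
// Standalone sketch: a target is visible if it lies inside either the main
// vision cone (range + FOV) or the wider, shorter peripheral cone.
type Vec2 = [number, number];

function canSee(
  agentPos: Vec2,
  facing: Vec2,    // Unit vector the agent is facing
  targetPos: Vec2,
  range: number,
  fovDeg: number
): boolean {
  const dx = targetPos[0] - agentPos[0];
  const dy = targetPos[1] - agentPos[1];
  const dist = Math.hypot(dx, dy);
  if (dist > range) return false; // Out of range
  if (dist === 0) return true;    // On top of the agent
  // Cosine of the angle between facing direction and direction to target
  const cosAngle = (facing[0] * dx + facing[1] * dy) / dist;
  const halfFovRad = (fovDeg / 2) * (Math.PI / 180);
  return cosAngle >= Math.cos(halfFovRad); // Inside the cone's half-angle
}

// Combine the main cone (30m, 110°) and peripheral cone (15m, 180°),
// matching the defaults listed above
function perceives(agentPos: Vec2, facing: Vec2, targetPos: Vec2): boolean {
  return (
    canSee(agentPos, facing, targetPos, 30, 110) ||
    canSee(agentPos, facing, targetPos, 15, 180)
  );
}

const seen = perceives([0, 0], [1, 0], [20, 0]);    // Dead ahead at 20m
const behind = perceives([0, 0], [1, 0], [-20, 0]); // Directly behind at 20m
```

A target dead ahead at 20m falls inside the main cone; a target directly behind is outside both cones, so it goes undetected until it makes noise.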
Hearing Configuration
- Range — Maximum hearing distance (default: 20m)
- Threshold — Minimum sound intensity to detect (default: 0.1)
- Sensitivity — Hearing acuity multiplier (default: 1.0)
- Occlusion — Whether sounds through walls are muffled
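A hearing check built from these parameters can be sketched as attenuation versus threshold. This is a standalone illustration, not the engine's code — the linear falloff curve is an assumption (the engine may attenuate differently), and wall occlusion is omitted:

```typescript
// Standalone sketch: perceived intensity falls off with distance, is scaled
// by the listener's sensitivity, and must clear the detection threshold.
function hears(
  distance: number,  // Listener's distance from the sound source
  intensity: number, // Source loudness (0-1)
  radius: number,    // Distance at which the sound fades to zero
  sensitivity = 1.0, // Hearing acuity multiplier (default above)
  threshold = 0.1    // Minimum perceived intensity to detect (default above)
): boolean {
  if (distance >= radius) return false;
  const falloff = 1 - distance / radius; // Linear attenuation (assumed curve)
  const perceived = intensity * falloff * sensitivity;
  return perceived >= threshold;
}

const close = hears(3, 0.8, 15);  // Loud and close: clearly detected
const faint = hears(14, 0.8, 15); // Near the edge of the radius: too faint
```

At 3m the perceived intensity is 0.8 × 0.8 = 0.64, well above the 0.1 threshold; at 14m it attenuates to roughly 0.05 and goes unheard.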
Emitting Sounds
```typescript
import { emitSound, StimulusType } from '@web-engine-dev/core/engine/ai';

// Emit a sound stimulus that AI agents can hear
emitSound(
  sourceEid, // Entity making the sound
  position,  // World position [x, y, z]
  0.8,       // Intensity: loudness (0-1)
  15.0,      // Radius: hearing radius
  2.0        // Duration: how long it persists
);
```

AI Systems
The AI module includes several ECS systems that should be added to your game loop:
- PerceptionSystem — Handles vision and hearing detection, updates tracked entities
- AIBehaviorTreeSystem — Executes behavior trees for agents with BehaviorTree controller
- AIStateMachineSystem — Updates state machines for agents with StateMachine controller
- UtilityAISystem — Evaluates utility AI for agents with UtilityAI controller
- SteeringSystem — Applies steering behaviors for autonomous movement
- FormationSystem — Manages group AI with formation patterns
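One reasonable way to wire the systems above into a loop is sketched below. The pipeline helper and the run order are illustrative assumptions, not an engine API: perception is run before the decision-making systems so they see fresh data, and steering/formations run last so they act on the decisions made this frame.

```typescript
// Standalone sketch of an AI update pipeline (illustrative, not an engine API).
type System = (world: object, dt: number) => void;

const executed: string[] = [];

// Stand-in factory: a real system would iterate entity queries here
const makeSystem = (name: string): System => (_world, _dt) => {
  executed.push(name);
};

// Assumed ordering: sense first, decide next, act last
const aiPipeline: System[] = [
  makeSystem('PerceptionSystem'),
  makeSystem('AIBehaviorTreeSystem'),
  makeSystem('AIStateMachineSystem'),
  makeSystem('UtilityAISystem'),
  makeSystem('SteeringSystem'),
  makeSystem('FormationSystem'),
];

function tick(world: object, dt: number): void {
  for (const system of aiPipeline) system(world, dt);
}

tick({}, 1 / 60); // One 60 Hz frame
```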
Performance & Optimization
Zero-GC Design
The AI system is designed for zero garbage collection in hot paths:
- Pre-allocated perception storage with circular buffers for stimuli
- Object pooling for blackboards and behavior tree instances
- Typed arrays for component data (bitECS integration)
- Reused vectors in steering behaviors and pathfinding
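The first item — pre-allocated circular buffers for stimuli — can be sketched as follows. This is a simplified standalone illustration of the pattern, not the engine's actual storage: all memory is allocated once in the constructor, and the oldest entry is overwritten when the buffer is full, so steady-state operation never allocates.

```typescript
// Standalone ring-buffer sketch: fixed capacity, zero allocation after
// construction, oldest entries overwritten when full.
class StimulusRingBuffer {
  private readonly intensity: Float32Array;
  private head = 0;  // Next write position
  private count = 0; // Number of valid entries

  constructor(private readonly capacity: number) {
    this.intensity = new Float32Array(capacity); // Allocated once, reused
  }

  push(value: number): void {
    this.intensity[this.head] = value;
    this.head = (this.head + 1) % this.capacity;          // Wrap around
    this.count = Math.min(this.count + 1, this.capacity); // Saturate at capacity
  }

  get size(): number {
    return this.count;
  }

  // Read the i-th oldest entry still held in the buffer
  at(i: number): number {
    const start = (this.head - this.count + this.capacity) % this.capacity;
    return this.intensity[(start + i) % this.capacity];
  }
}

// Five pushes into a capacity-4 buffer: the first value is overwritten
const buf = new StimulusRingBuffer(4);
for (const v of [0.1, 0.2, 0.3, 0.4, 0.5]) buf.push(v);
```

The `Float32Array` backing store mirrors the typed-array approach noted above: values live in one contiguous allocation instead of per-stimulus objects.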
LOD System
Perception LOD (Level of Detail) reduces AI update frequency based on distance from focus point:
```typescript
import {
  setPerceptionLODEnabled,
  setLODFocusPosition,
  setPerceptionLODConfig,
} from '@web-engine-dev/core/engine/ai';

// Enable perception LOD
setPerceptionLODEnabled(true);

// Set focus position (usually camera/player)
setLODFocusPosition(cameraPosition);

// Configure LOD distances
setPerceptionLODConfig({
  lodDistances: [20, 50, 100],            // Distance thresholds
  updateIntervals: [0.1, 0.25, 0.5, 1.0], // Update rates per LOD
  maxTrackedPerLOD: [32, 16, 8, 4],       // Entity tracking limits
});
```

Spatial Indexing
The AI system uses a spatial index for efficient neighbor queries:
```typescript
import { getAISpatialIndex } from '@web-engine-dev/core/engine/ai';

const spatialIndex = getAISpatialIndex();

// Query nearby entities
const neighbors = spatialIndex.queryRadius(
  position,  // Query center
  radius,    // Search radius
  maxResults // Maximum results
);
```

Debugging & Visualization
The AI system includes debug rendering capabilities:
```typescript
import { getAIDebugRenderer } from '@web-engine-dev/core/engine/ai';

const debugRenderer = getAIDebugRenderer();

// Configure debug options
debugRenderer.setOptions({
  showVisionCones: true,
  showHearingRadii: true,
  showTrackedEntities: true,
  showPaths: true,
  showSteeringForces: true,
  showFormations: true,
  showBehaviorTreeStatus: true,
  showStateNames: true,
});

// Render debug visualization
debugRenderer.render(world, scene, camera);
```

Memory Management
Properly clean up AI resources when entities are destroyed:
```typescript
import {
  cleanupAIAgent,
  cleanupPerception,
  cleanupAIEntity,
  resetAllAISystems,
} from '@web-engine-dev/core/engine/ai';

// Clean up individual entity
cleanupAIAgent(eid);    // Releases blackboard
cleanupPerception(eid); // Releases perception storage
cleanupAIEntity(eid);   // Comprehensive cleanup

// Reset all AI systems (scene unload)
resetAllAISystems();
```