Web Engine Docs
Create responsive multiplayer experiences with client-side prediction, server reconciliation, and lag compensation techniques.
Network latency is inevitable in online multiplayer games. Web Engine provides advanced techniques to hide latency and create smooth, responsive gameplay even with 100-200ms ping.
Client-side prediction: apply player inputs immediately on the client, before server confirmation, for instant feedback.
Server reconciliation: correct prediction errors when the server's authoritative state differs from the client's prediction.
Lag compensation: rewind game state for hit detection to account for each player's latency.
Adaptive quality: adjust quality, update rate, and compression based on connection conditions.
Client-side prediction allows players to see the immediate result of their actions without waiting for server confirmation, eliminating the feel of input lag:
import { ClientPrediction } from '@web-engine-dev/core/network';

const prediction = new ClientPrediction();

// Simulation function (same on client and server)
function simulateMovement(state: GameState, input: InputFrame): GameState {
  const player = state.entities.get(localPlayerId);
  if (!player) return state;

  // Apply movement
  const speed = 5;
  player.pos[0] += input.inputs.move[0] * speed * 0.016;
  player.pos[2] += input.inputs.move[1] * speed * 0.016;

  // Apply gravity
  player.vel[1] -= 9.8 * 0.016;
  player.pos[1] += player.vel[1] * 0.016;

  return state;
}

// Process local input immediately
function onInput(input: InputFrame) {
  // Predict movement on client
  const predictedState = prediction.processInput(input, simulateMovement);

  // Apply predicted state to rendering
  updateVisuals(predictedState);

  // Send input to server for validation
  network.sendInput(input);
}

// Reconcile with server state
function onServerState(serverState: GameState, lastProcessedSeq: number) {
  const result = prediction.reconcile(
    serverState,
    lastProcessedSeq,
    simulateMovement
  );

  console.log('Prediction error:', result.predictionError);
  console.log('Rollback needed:', result.needsRollback);

  if (result.needsRollback) {
    // Re-simulate pending inputs from confirmed state
    console.log('Re-simulated', result.inputsReconciled, 'inputs');
  }

  // Apply reconciled state
  updateVisuals(prediction.getPredictedState());
}
Prediction Requirements
For prediction to work correctly, the movement simulation must be deterministic and use the same code on both client and server. Any randomness or divergence between the two will cause prediction errors and jittery corrections.
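A minimal sketch of what this can look like in practice, assuming a shared movement module and a fixed timestep (the file name, FIXED_DT constant, and stepPlayer function below are illustrative, not part of the engine API): keep the step function pure, drive it with a fixed timestep, and avoid Math.random() or wall-clock reads inside it.

// shared/movement.ts - hypothetical module imported by BOTH client and server
export const FIXED_DT = 1 / 60; // fixed timestep; never use variable frame time here

export interface MoveInput {
  move: [number, number];
}

export interface PlayerState {
  pos: [number, number, number];
  vel: [number, number, number];
}

// Pure, deterministic step: the same state + input always yields the same result.
// No Math.random(), no Date.now(), no reads of client-only or server-only state.
export function stepPlayer(p: PlayerState, input: MoveInput): PlayerState {
  const speed = 5;
  const vy = p.vel[1] - 9.8 * FIXED_DT; // gravity
  return {
    pos: [
      p.pos[0] + input.move[0] * speed * FIXED_DT,
      p.pos[1] + vy * FIXED_DT,
      p.pos[2] + input.move[1] * speed * FIXED_DT,
    ],
    vel: [p.vel[0], vy, p.vel[2]],
  };
}

Both the client's prediction loop and the server's authoritative simulation would then import and call the same stepPlayer, so replayed inputs produce identical results on each side.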
When the server's authoritative state differs from the client's prediction, reconciliation corrects the error by rewinding and re-simulating:
// Server processes input and returns authoritative state
// with the last input sequence it processed
const stateUpdate = {
  timestamp: 1000,
  lastProcessedSequence: 42,
  entities: [{
    netId: 1,
    pos: [10.5, 0.2, 5.3], // Server's authoritative position
    rot: [0, 0, 0, 1],
    lastProcessedSequence: 42
  }]
};

// Client reconciles prediction with server state
const result = prediction.reconcile(
  stateUpdate,
  stateUpdate.lastProcessedSequence,
  simulateMovement
);

// If prediction error is large, we had misprediction
if (result.predictionError > 0.5) {
  console.warn('Large prediction error detected!');
  // Possible causes:
  // 1. Different simulation code between client/server
  // 2. Packet loss causing missed inputs on server
  // 3. Server applied different physics (collision, terrain)
  // 4. Lag spike causing out-of-order processing
}

// Reconciliation automatically:
// 1. Removes acknowledged inputs (seq <= 42)
// 2. Rewinds to server state
// 3. Re-applies remaining pending inputs (seq > 42)
// 4. Updates predicted state
For small prediction errors, smooth the correction over multiple frames to avoid jarring teleports:
class PredictionSmoother {
  private errorOffset = { x: 0, y: 0, z: 0 };
  private smoothingRate = 0.2; // Correct 20% per frame

  reconcile(serverPos: Vec3, predictedPos: Vec3) {
    // Calculate error
    const error = {
      x: serverPos.x - predictedPos.x,
      y: serverPos.y - predictedPos.y,
      z: serverPos.z - predictedPos.z,
    };

    // If error is small, smooth it out
    if (Math.abs(error.x) < 1 && Math.abs(error.y) < 1 && Math.abs(error.z) < 1) {
      this.errorOffset.x += error.x;
      this.errorOffset.y += error.y;
      this.errorOffset.z += error.z;
    } else {
      // Large error - teleport immediately
      this.errorOffset = { x: 0, y: 0, z: 0 };
      return serverPos;
    }

    // Return predicted position with partial correction
    return {
      x: predictedPos.x + this.errorOffset.x * this.smoothingRate,
      y: predictedPos.y + this.errorOffset.y * this.smoothingRate,
      z: predictedPos.z + this.errorOffset.z * this.smoothingRate,
    };
  }

  update() {
    // Decay error offset
    this.errorOffset.x *= (1 - this.smoothingRate);
    this.errorOffset.y *= (1 - this.smoothingRate);
    this.errorOffset.z *= (1 - this.smoothingRate);
  }
}
Lag compensation (server-side rewind) ensures fair hit detection by rewinding the game state to what the shooter saw when they fired:
import { LagCompensator } from '@web-engine-dev/core/network';

// Server-side lag compensation
const compensator = new LagCompensator(1000); // 1 second history

// Record state every frame
function serverUpdate(deltaTime: number) {
  const timestamp = Date.now();

  // Update game state
  updateGameLogic(deltaTime);

  // Record state for potential rollback
  compensator.recordState(timestamp, getCurrentGameState());
}

// Handle hitscan weapon fire
function handleHitscan(shooterId: number, timestamp: number, raycast: Ray) {
  // Get player's ping
  const player = getPlayer(shooterId);
  const ping = player.getAveragePing();

  // Calculate when the player actually saw the target
  const compensatedTime = timestamp - ping;

  // Rewind to that time
  const pastState = compensator.getStateAtTime(compensatedTime);
  if (!pastState) {
    console.warn('Cannot compensate - state too old');
    return false;
  }

  // Perform hit detection against rewound state
  const hit = performRaycast(raycast, pastState);
  if (hit) {
    console.log(`Hit! Compensated ${ping}ms lag`);
    applyDamage(hit.target, 25);
    return true;
  }

  return false;
}
Lag Compensation Limits
Only compensate for reasonable latencies (typically < 200ms). Higher compensation can create situations where players get hit after taking cover, degrading the experience. Consider setting a maximum compensation threshold.
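A sketch of one way to enforce that threshold on the server (MAX_COMPENSATION_MS and getCompensatedTime are illustrative helpers, not part of LagCompensator):

// Illustrative cap on how far back hit detection will rewind
const MAX_COMPENSATION_MS = 200;

function getCompensatedTime(fireTimestamp: number, pingMs: number): number {
  // Clamp the rewind window so very high-ping players cannot land hits
  // on targets that reached cover long ago on everyone else's screen.
  const compensation = Math.min(pingMs, MAX_COMPENSATION_MS);
  return fireTimestamp - compensation;
}

In the handleHitscan example above, this clamped value would replace the raw timestamp - ping calculation.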
Interpolate remote entities between server snapshots for smooth movement:
import { SnapshotBuffer } from '@web-engine-dev/core/network';

// Create buffer for each remote entity
const buffers = new Map<number, SnapshotBuffer>();

// Add snapshots from server
function onStateUpdate(entities: NetworkEntityState[]) {
  const timestamp = Date.now();

  for (const entity of entities) {
    let buffer = buffers.get(entity.netId);
    if (!buffer) {
      buffer = new SnapshotBuffer(64, 100); // 100ms interpolation delay
      buffers.set(entity.netId, buffer);
    }
    buffer.addSnapshot(timestamp, entity);
  }
}

// Render with interpolation
function render() {
  const renderTime = Date.now();

  for (const [netId, buffer] of buffers) {
    const interpolated = buffer.getInterpolated(renderTime);
    if (interpolated) {
      // Render entity at interpolated position
      renderEntity(netId, interpolated.state);
    }
  }
}

// Tune interpolation delay based on network conditions
function adjustInterpolationDelay(avgJitter: number) {
  // Higher jitter = more delay needed for smooth interpolation
  const delay = Math.max(50, Math.min(200, avgJitter * 2));
  for (const buffer of buffers.values()) {
    buffer.setInterpolationDelay(delay);
  }
}
Adapt quality and update rate based on connection conditions:
import {
  NetworkMonitor,
  BandwidthThrottle,
  getNetworkMonitor
} from '@web-engine-dev/core/network';

const monitor = getNetworkMonitor();
const throttle = new BandwidthThrottle();

// Monitor connection quality
setInterval(() => {
  const metrics = monitor.getMetrics();

  console.log('Connection Quality:', {
    quality: metrics.connectionQuality,
    rtt: metrics.averageRtt.toFixed(1) + 'ms',
    packetLoss: (metrics.packetLoss * 100).toFixed(1) + '%',
    jitter: metrics.jitter.toFixed(1) + 'ms'
  });

  // Adapt based on quality
  if (metrics.connectionQuality < 0.5) {
    // Poor connection - reduce quality
    setUpdateRate(10);          // 10 Hz instead of 20 Hz
    setPositionPrecision(8);    // Lower precision
    setNetworkLODDistance(50);  // More aggressive LOD
  } else if (metrics.connectionQuality > 0.8) {
    // Good connection - increase quality
    setUpdateRate(30);           // 30 Hz
    setPositionPrecision(16);    // Higher precision
    setNetworkLODDistance(100);  // Less aggressive LOD
  }
}, 1000);

// Bandwidth throttling
function sendUpdate(packet: NetworkPacket) {
  const packetSize = estimatePacketSize(packet);

  if (throttle.canSend(packetSize)) {
    network.sendPacket(packet);
    throttle.recordSent(packetSize);
  } else {
    // Bandwidth limit reached - queue for later or drop
    console.log('Bandwidth throttled');
  }
}
Synchronize clocks between client and server for accurate timestamp-based logic:
import { NetworkManager } from '@web-engine-dev/core/network';

const network = NetworkManager.getInstance();

// Server sends its timestamp with each state update
function onStateUpdate(packet: StateUpdatePacket) {
  // Update client's clock offset
  network.updateServerTime(packet.timestamp);
}

// Get synchronized server time
function performTimedAction() {
  const serverTime = network.getServerTime();

  // Use server time for consistency
  const action = {
    type: 'ability',
    timestamp: serverTime,
    playerId: network.getClientId()
  };
  network.sendPacket(action);
}

// Check time offset
const offset = network.getDiagnostics().serverTimeOffset;
console.log(`Server time offset: ${offset}ms`);

// Offset is smoothed over time using exponential moving average
// This prevents jitter from affecting time-sensitive actions
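The last comment above refers to smoothing that NetworkManager applies internally. If you need to track an offset yourself (for custom diagnostics, for example), an exponential moving average is a common approach; the ClockSync class and alpha value below are an illustrative sketch, not engine API:

// Illustrative standalone clock-offset smoother using an exponential moving average
class ClockSync {
  private offsetMs = 0;
  private initialized = false;
  private readonly alpha = 0.1; // smoothing factor: lower = smoother, slower to react

  onServerTimestamp(serverTime: number, oneWayLatencyMs: number) {
    // Estimate the offset for this sample, accounting for transit time
    const sample = serverTime + oneWayLatencyMs - Date.now();

    if (!this.initialized) {
      this.offsetMs = sample;
      this.initialized = true;
    } else {
      // Blend the new sample in gradually so jitter doesn't shift the clock abruptly
      this.offsetMs += this.alpha * (sample - this.offsetMs);
    }
  }

  now(): number {
    return Date.now() + this.offsetMs;
  }
}

Here oneWayLatencyMs would typically be approximated as half the measured round-trip time.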
Buffer inputs to handle temporary connection issues without disrupting gameplay:
class InputBuffer {
  private buffer: InputFrame[] = [];
  private maxBuffer = 60; // 1 second at 60fps

  addInput(input: InputFrame) {
    this.buffer.push(input);

    // Limit buffer size
    if (this.buffer.length > this.maxBuffer) {
      this.buffer.shift();
    }
  }

  // Flush buffered inputs when connection recovers
  flush(network: NetworkManager) {
    console.log(`Flushing ${this.buffer.length} buffered inputs`);
    for (const input of this.buffer) {
      network.sendInput(input);
    }
    this.buffer = [];
  }

  clear() {
    this.buffer = [];
  }
}

// Usage
const inputBuffer = new InputBuffer();

network.on('disconnect', () => {
  console.log('Disconnected - buffering inputs');
});

network.on('reconnect', () => {
  console.log('Reconnected - flushing buffer');
  inputBuffer.flush(network);
});

function onInput(input: InputFrame) {
  if (network.isActive()) {
    network.sendInput(input);
  } else {
    inputBuffer.addInput(input);
  }
}
Test latency handling under realistic conditions by simulating poor connections during development:

import { NetworkManager } from '@web-engine-dev/core/network';

const network = NetworkManager.getInstance();

// Simulate poor network conditions for testing
network.configureSimulation({
  latencyMs: 150,        // 150ms round-trip latency
  jitterMs: 30,          // ±30ms jitter
  packetLoss: 0.05,      // 5% packet loss
  duplicateChance: 0.01  // 1% packet duplication
});

// Simulate different connection types
const profiles = {
  perfect:   { latencyMs: 0,   jitterMs: 0,   packetLoss: 0 },
  lan:       { latencyMs: 10,  jitterMs: 2,   packetLoss: 0.001 },
  broadband: { latencyMs: 50,  jitterMs: 10,  packetLoss: 0.01 },
  mobile4g:  { latencyMs: 100, jitterMs: 30,  packetLoss: 0.02 },
  mobile3g:  { latencyMs: 200, jitterMs: 50,  packetLoss: 0.05 },
  satellite: { latencyMs: 600, jitterMs: 100, packetLoss: 0.02 },
};

// Test with mobile 4G simulation
network.configureSimulation(profiles.mobile4g);
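If you want to exercise every profile in a single playtest, one simple approach (a sketch, not engine API) is to rotate through the profiles defined above on a timer:

// Sketch: cycle through each simulated profile during a playtest,
// switching every 60 seconds so differences can be felt in-game.
const profileNames = Object.keys(profiles) as (keyof typeof profiles)[];
let profileIndex = 0;

setInterval(() => {
  const name = profileNames[profileIndex];
  network.configureSimulation(profiles[name]);
  console.log(`Network simulation switched to profile: ${name}`);
  profileIndex = (profileIndex + 1) % profileNames.length;
}, 60_000);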