Latency Handling
Create responsive multiplayer experiences with client-side prediction, server reconciliation, and lag compensation techniques.
Network latency is inevitable in online multiplayer games. Web Engine provides advanced techniques to hide latency and create smooth, responsive gameplay even with 100-200ms ping.
Latency Mitigation Techniques#
Client Prediction
Apply player inputs immediately on the client before server confirmation for instant feedback.
Server Reconciliation
Correct prediction errors when server state differs from client prediction.
Lag Compensation
Rewind game state for hit detection to account for player latency.
Bandwidth Optimization
Adaptive quality and compression based on connection conditions.
Client-Side Prediction#
Client-side prediction allows players to see the immediate result of their actions without waiting for server confirmation, eliminating the feel of input lag:
```typescript
import { ClientPrediction } from '@web-engine/core/network';

const prediction = new ClientPrediction();

// Simulation function (same on client and server)
function simulateMovement(state: GameState, input: InputFrame): GameState {
  const player = state.entities.get(localPlayerId);
  if (!player) return state;

  // Apply movement
  const speed = 5;
  player.pos[0] += input.inputs.move[0] * speed * 0.016;
  player.pos[2] += input.inputs.move[1] * speed * 0.016;

  // Apply gravity
  player.vel[1] -= 9.8 * 0.016;
  player.pos[1] += player.vel[1] * 0.016;

  return state;
}

// Process local input immediately
function onInput(input: InputFrame) {
  // Predict movement on client
  const predictedState = prediction.processInput(input, simulateMovement);

  // Apply predicted state to rendering
  updateVisuals(predictedState);

  // Send input to server for validation
  network.sendInput(input);
}

// Reconcile with server state
function onServerState(serverState: GameState, lastProcessedSeq: number) {
  const result = prediction.reconcile(
    serverState,
    lastProcessedSeq,
    simulateMovement
  );

  console.log('Prediction error:', result.predictionError);
  console.log('Rollback needed:', result.needsRollback);

  if (result.needsRollback) {
    // Re-simulate pending inputs from confirmed state
    console.log('Re-simulated', result.inputsReconciled, 'inputs');
  }

  // Apply reconciled state
  updateVisuals(prediction.getPredictedState());
}
```
Prediction Requirements
For prediction to work correctly, the movement simulation must be deterministic and use the same code on both client and server. Any randomness or divergence will cause prediction errors and jittery corrections.
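A common way to satisfy this is to keep the step function in a single shared module with a fixed timestep and import it from both the client and server builds. The sketch below uses hypothetical names (`FIXED_DT`, `PlayerState`, `stepPlayer`); it is not an engine API, just one way to structure shared, deterministic movement code:

```typescript
// shared/simulation.ts (hypothetical path): one shared, deterministic step function.
// FIXED_DT, PlayerState, MoveInput, and stepPlayer are illustrative names, not engine APIs.
export const FIXED_DT = 1 / 60; // fixed timestep; identical on client and server

export interface PlayerState {
  pos: [number, number, number];
  vel: [number, number, number];
}

export interface MoveInput {
  move: [number, number];
}

export function stepPlayer(state: PlayerState, input: MoveInput): PlayerState {
  const speed = 5;
  const vy = state.vel[1] - 9.8 * FIXED_DT; // gravity applied with the fixed step

  return {
    pos: [
      state.pos[0] + input.move[0] * speed * FIXED_DT,
      state.pos[1] + vy * FIXED_DT,
      state.pos[2] + input.move[1] * speed * FIXED_DT,
    ],
    vel: [state.vel[0], vy, state.vel[2]],
  };
}
```

Because both sides call the same function with the same fixed timestep, replaying the same inputs from the same state produces the same result, which is exactly what reconciliation relies on.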
Server Reconciliation#
When the server's authoritative state differs from the client's prediction, reconciliation corrects the error by rewinding and re-simulating:
```typescript
// Server processes input and returns authoritative state
// with the last input sequence it processed
const stateUpdate = {
  timestamp: 1000,
  lastProcessedSequence: 42,
  entities: [
    {
      netId: 1,
      pos: [10.5, 0.2, 5.3], // Server's authoritative position
      rot: [0, 0, 0, 1],
      lastProcessedSequence: 42
    }
  ]
};

// Client reconciles prediction with server state
const result = prediction.reconcile(
  stateUpdate,
  stateUpdate.lastProcessedSequence,
  simulateMovement
);

// If prediction error is large, we had misprediction
if (result.predictionError > 0.5) {
  console.warn('Large prediction error detected!');
  // Possible causes:
  // 1. Different simulation code between client/server
  // 2. Packet loss causing missed inputs on server
  // 3. Server applied different physics (collision, terrain)
  // 4. Lag spike causing out-of-order processing
}

// Reconciliation automatically:
// 1. Removes acknowledged inputs (seq <= 42)
// 2. Rewinds to server state
// 3. Re-applies remaining pending inputs (seq > 42)
// 4. Updates predicted state
```
Smoothing Corrections#
For small prediction errors, smooth the correction over multiple frames to avoid jarring teleports:
```typescript
class PredictionSmoother {
  private errorOffset = { x: 0, y: 0, z: 0 };
  private smoothingRate = 0.2; // Correct 20% per frame

  reconcile(serverPos: Vec3, predictedPos: Vec3) {
    // Calculate error
    const error = {
      x: serverPos.x - predictedPos.x,
      y: serverPos.y - predictedPos.y,
      z: serverPos.z - predictedPos.z,
    };

    // If error is small, smooth it out
    if (Math.abs(error.x) < 1 && Math.abs(error.y) < 1 && Math.abs(error.z) < 1) {
      this.errorOffset.x += error.x;
      this.errorOffset.y += error.y;
      this.errorOffset.z += error.z;
    } else {
      // Large error - teleport immediately
      this.errorOffset = { x: 0, y: 0, z: 0 };
      return serverPos;
    }

    // Return predicted position with partial correction
    return {
      x: predictedPos.x + this.errorOffset.x * this.smoothingRate,
      y: predictedPos.y + this.errorOffset.y * this.smoothingRate,
      z: predictedPos.z + this.errorOffset.z * this.smoothingRate,
    };
  }

  update() {
    // Decay error offset
    this.errorOffset.x *= (1 - this.smoothingRate);
    this.errorOffset.y *= (1 - this.smoothingRate);
    this.errorOffset.z *= (1 - this.smoothingRate);
  }
}
```
Lag Compensation#
Lag compensation (server-side rewind) ensures fair hit detection by rewinding the game state to what the shooter saw when they fired:
```typescript
import { LagCompensator } from '@web-engine/core/network';

// Server-side lag compensation
const compensator = new LagCompensator(1000); // 1 second history

// Record state every frame
function serverUpdate(deltaTime: number) {
  const timestamp = Date.now();

  // Update game state
  updateGameLogic(deltaTime);

  // Record state for potential rollback
  compensator.recordState(timestamp, getCurrentGameState());
}

// Handle hitscan weapon fire
function handleHitscan(shooterId: number, timestamp: number, raycast: Ray) {
  // Get player's ping
  const player = getPlayer(shooterId);
  const ping = player.getAveragePing();

  // Calculate when the player actually saw the target
  const compensatedTime = timestamp - ping;

  // Rewind to that time
  const pastState = compensator.getStateAtTime(compensatedTime);
  if (!pastState) {
    console.warn('Cannot compensate - state too old');
    return false;
  }

  // Perform hit detection against rewound state
  const hit = performRaycast(raycast, pastState);

  if (hit) {
    console.log(`Hit! Compensated ${ping}ms lag`);
    applyDamage(hit.target, 25);
    return true;
  }

  return false;
}
```
Lag Compensation Limits
Only compensate for reasonable latencies (typically < 200ms). Higher compensation can create situations where players get hit after taking cover, degrading the experience. Consider setting a maximum compensation threshold.
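One simple way to enforce that threshold is to clamp the rewind amount before querying the state history. The constant and helper below are illustrative, not part of the engine API:

```typescript
// Illustrative sketch: never rewind further than MAX_COMPENSATION_MS,
// even for players with very high ping.
const MAX_COMPENSATION_MS = 200; // assumed tuning value

function getCompensatedTime(fireTimestamp: number, ping: number): number {
  const rewind = Math.min(ping, MAX_COMPENSATION_MS);
  return fireTimestamp - rewind;
}
```

The clamped result can then be passed to `compensator.getStateAtTime()` in place of the raw `compensatedTime` in the example above; players above the threshold simply have to lead their shots slightly.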
Entity Interpolation#
Interpolate remote entities between server snapshots for smooth movement:
```typescript
import { SnapshotBuffer } from '@web-engine/core/network';

// Create buffer for each remote entity
const buffers = new Map<number, SnapshotBuffer>();

// Add snapshots from server
function onStateUpdate(entities: NetworkEntityState[]) {
  const timestamp = Date.now();

  for (const entity of entities) {
    let buffer = buffers.get(entity.netId);
    if (!buffer) {
      buffer = new SnapshotBuffer(64, 100); // 100ms interpolation delay
      buffers.set(entity.netId, buffer);
    }
    buffer.addSnapshot(timestamp, entity);
  }
}

// Render with interpolation
function render() {
  const renderTime = Date.now();

  for (const [netId, buffer] of buffers) {
    const interpolated = buffer.getInterpolated(renderTime);
    if (interpolated) {
      // Render entity at interpolated position
      renderEntity(netId, interpolated.state);
    }
  }
}

// Tune interpolation delay based on network conditions
function adjustInterpolationDelay(avgJitter: number) {
  // Higher jitter = more delay needed for smooth interpolation
  const delay = Math.max(50, Math.min(200, avgJitter * 2));

  for (const buffer of buffers.values()) {
    buffer.setInterpolationDelay(delay);
  }
}
```
Bandwidth Optimization#
Adapt quality and update rate based on connection conditions:
```typescript
import {
  NetworkMonitor,
  BandwidthThrottle,
  getNetworkMonitor
} from '@web-engine/core/network';

const monitor = getNetworkMonitor();
const throttle = new BandwidthThrottle();

// Monitor connection quality
setInterval(() => {
  const metrics = monitor.getMetrics();

  console.log('Connection Quality:', {
    quality: metrics.connectionQuality,
    rtt: metrics.averageRtt.toFixed(1) + 'ms',
    packetLoss: (metrics.packetLoss * 100).toFixed(1) + '%',
    jitter: metrics.jitter.toFixed(1) + 'ms'
  });

  // Adapt based on quality
  if (metrics.connectionQuality < 0.5) {
    // Poor connection - reduce quality
    setUpdateRate(10);          // 10 Hz instead of 20 Hz
    setPositionPrecision(8);    // Lower precision
    setNetworkLODDistance(50);  // More aggressive LOD
  } else if (metrics.connectionQuality > 0.8) {
    // Good connection - increase quality
    setUpdateRate(30);          // 30 Hz
    setPositionPrecision(16);   // Higher precision
    setNetworkLODDistance(100); // Less aggressive LOD
  }
}, 1000);

// Bandwidth throttling
function sendUpdate(packet: NetworkPacket) {
  const packetSize = estimatePacketSize(packet);

  if (throttle.canSend(packetSize)) {
    network.sendPacket(packet);
    throttle.recordSent(packetSize);
  } else {
    // Bandwidth limit reached - queue for later or drop
    console.log('Bandwidth throttled');
  }
}
```
Time Synchronization#
Synchronize clocks between client and server for accurate timestamp-based logic:
```typescript
import { NetworkManager } from '@web-engine/core/network';

const network = NetworkManager.getInstance();

// Server sends its timestamp with each state update
function onStateUpdate(packet: StateUpdatePacket) {
  // Update client's clock offset
  network.updateServerTime(packet.timestamp);
}

// Get synchronized server time
function performTimedAction() {
  const serverTime = network.getServerTime();

  // Use server time for consistency
  const action = {
    type: 'ability',
    timestamp: serverTime,
    playerId: network.getClientId()
  };

  network.sendPacket(action);
}

// Check time offset
const offset = network.getDiagnostics().serverTimeOffset;
console.log(`Server time offset: ${offset}ms`);

// Offset is smoothed over time using an exponential moving average.
// This prevents jitter from affecting time-sensitive actions.
```
Input Buffering#
Buffer inputs to handle temporary connection issues without disrupting gameplay:
```typescript
class InputBuffer {
  private buffer: InputFrame[] = [];
  private maxBuffer = 60; // 1 second at 60fps

  addInput(input: InputFrame) {
    this.buffer.push(input);

    // Limit buffer size
    if (this.buffer.length > this.maxBuffer) {
      this.buffer.shift();
    }
  }

  // Flush buffered inputs when connection recovers
  flush(network: NetworkManager) {
    console.log(`Flushing ${this.buffer.length} buffered inputs`);

    for (const input of this.buffer) {
      network.sendInput(input);
    }
    this.buffer = [];
  }

  clear() {
    this.buffer = [];
  }
}

// Usage
const inputBuffer = new InputBuffer();

network.on('disconnect', () => {
  console.log('Disconnected - buffering inputs');
});

network.on('reconnect', () => {
  console.log('Reconnected - flushing buffer');
  inputBuffer.flush(network);
});

function onInput(input: InputFrame) {
  if (network.isActive()) {
    network.sendInput(input);
  } else {
    inputBuffer.addInput(input);
  }
}
```
Latency Handling Best Practices#
- Predict local player only — Only predict the local player's movement. Remote players should use interpolation.
- Use same simulation code — Client and server must use identical physics/movement code for accurate prediction.
- Smooth small errors — Lerp small prediction errors over multiple frames instead of snapping.
- Limit lag compensation — Cap compensation at ~200ms to prevent "getting shot behind cover" complaints.
- Show network status — Display ping, packet loss, and connection quality to help players understand issues (see the sketch after this list).
- Tune interpolation delay — Balance between smoothness (higher delay) and responsiveness (lower delay).
- Test with realistic latency — Use network simulation tools to test with 50-300ms latency and 2-5% packet loss.
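As an example of showing network status, the sketch below reads the same metrics used in the bandwidth section and maps them to a simple HUD indicator. The element id and the quality thresholds are assumptions, not engine conventions:

```typescript
import { getNetworkMonitor } from '@web-engine/core/network';

// Minimal sketch: update a HUD element once per second with ping, loss, and quality.
// The 'network-status' element id and the 0.8/0.5 thresholds are illustrative choices.
const statusEl = document.getElementById('network-status');

setInterval(() => {
  const metrics = getNetworkMonitor().getMetrics();
  const quality =
    metrics.connectionQuality > 0.8 ? 'good' :
    metrics.connectionQuality > 0.5 ? 'ok' : 'poor';

  if (statusEl) {
    statusEl.textContent =
      `${metrics.averageRtt.toFixed(0)}ms | ` +
      `${(metrics.packetLoss * 100).toFixed(1)}% loss | ${quality}`;
  }
}, 1000);
```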
Testing with Network Simulation#
```typescript
import { NetworkManager } from '@web-engine/core/network';

const network = NetworkManager.getInstance();

// Simulate poor network conditions for testing
network.configureSimulation({
  latencyMs: 150,        // 150ms round-trip latency
  jitterMs: 30,          // ±30ms jitter
  packetLoss: 0.05,      // 5% packet loss
  duplicateChance: 0.01  // 1% packet duplication
});

// Simulate different connection types
const profiles = {
  perfect:   { latencyMs: 0,   jitterMs: 0,   packetLoss: 0 },
  lan:       { latencyMs: 10,  jitterMs: 2,   packetLoss: 0.001 },
  broadband: { latencyMs: 50,  jitterMs: 10,  packetLoss: 0.01 },
  mobile4g:  { latencyMs: 100, jitterMs: 30,  packetLoss: 0.02 },
  mobile3g:  { latencyMs: 200, jitterMs: 50,  packetLoss: 0.05 },
  satellite: { latencyMs: 600, jitterMs: 100, packetLoss: 0.02 },
};

// Test with mobile 4G simulation
network.configureSimulation(profiles.mobile4g);
```