Need a way to fake a latency delay (i.e. artificially increase ping) so we can properly test client side prediction and reconciliation. This should let us implement client side prediction and verify the result under added latency.
A starting point would be the queue in the main server process that handles incoming packets. Adding a configurable delay before sending responses should work in the first instance.
The server should also print to the console that latency testing is enabled.
The following options should be passed into server init:
- Fixed latency (for example 300ms)
- Jitter (random delay on top of the fixed latency)
- Packet loss (drop a percentage of packets altogether)
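The three options above could be wrapped around the server's existing transmit path roughly like this. A minimal sketch, assuming Python; all names here (`LatencySimulator`, `transmit`, the parameter names) are hypothetical and would need to be adapted to the real server's packet queue:

```python
import random
import threading


class LatencySimulator:
    """Delays or drops outgoing packets to emulate network conditions.

    Hypothetical sketch: the real server's response queue would call
    `send` here instead of its normal transmit function.
    """

    def __init__(self, transmit, fixed_ms=300, jitter_ms=50, loss_pct=5):
        self.transmit = transmit      # the real send function
        self.fixed_ms = fixed_ms      # fixed latency in milliseconds
        self.jitter_ms = jitter_ms    # random extra delay, 0..jitter_ms
        self.loss_pct = loss_pct      # percentage of packets to drop
        print(f"latency testing enabled: {fixed_ms}ms "
              f"+{jitter_ms}ms jitter, {loss_pct}% loss")

    def send(self, packet):
        # simulate packet loss: drop the packet outright
        if random.uniform(0, 100) < self.loss_pct:
            return
        # fixed latency plus random jitter, converted to seconds
        delay = (self.fixed_ms + random.uniform(0, self.jitter_ms)) / 1000.0
        # schedule the real transmit after the simulated delay
        threading.Timer(delay, self.transmit, args=(packet,)).start()
```

With `loss_pct=0` and `jitter_ms=0` every packet arrives after exactly the fixed delay, which gives a deterministic baseline before enabling jitter and loss for the full reconciliation test.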