Updated benchmarks: Speed boost!

A quick update to the benchmarks... surprisingly, with an actual login action in place, it's running faster than it was before.

This endpoint hits Redis, loads the user's actual data (parsed from JSON), adds the unit-type data the client will need to handle units correctly, dumps it all back to JSON, and ships it off. There will be other views that are more computation-intensive, I'm sure, but this is... very promising. :)
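For the curious, the view logic amounts to something like the sketch below. This is an illustration, not the actual code: redis-py, the `user:<id>` key layout, the `UNIT_TYPES` table, and the `login_payload` name are all stand-ins.

```python
import json

import redis

# decode_responses=True makes redis-py hand back str instead of bytes.
r = redis.StrictRedis(host="localhost", port=6379, db=0, decode_responses=True)

# Hypothetical unit-type metadata the client needs to render values correctly.
UNIT_TYPES = {
    "distance": {"unit": "km", "precision": 2},
    "mass": {"unit": "kg", "precision": 1},
}


def login_payload(user_id):
    """Load the user's stored data, attach unit-type info, and return JSON."""
    raw = r.get("user:%s" % user_id)      # user data stored as a JSON blob
    if raw is None:
        return json.dumps({"error": "unknown user"})
    user = json.loads(raw)                # parse the stored JSON
    user["unit_types"] = UNIT_TYPES       # add unit data the client needs
    return json.dumps(user)               # dump it back to JSON and ship it off
```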

The same "this benchmark is pretty much crap" disclaimer applies, but considering this is using a real endpoint and doing actual work with Redis, I'm pretty confident that a single reasonably beefy server should be able to handle a good deal of traffic with this setup.
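For reference, the benchmark client has roughly this shape: split the total request count across N processes, time the whole batch, and divide. The sketch below is an assumption-heavy illustration (Python's multiprocessing, a stdlib HTTP client, a placeholder URL), not the actual script.

```python
import multiprocessing
import time

try:
    from urllib.request import urlopen   # Python 3
except ImportError:
    from urllib2 import urlopen          # Python 2

URL = "http://localhost:8080/login"      # placeholder endpoint


def worker(n):
    """Fire n requests at the endpoint, back to back."""
    for _ in range(n):
        urlopen(URL).read()


def run(total_requests, processes):
    """Split total_requests evenly across processes and time the whole batch."""
    per_proc = total_requests // processes
    procs = [multiprocessing.Process(target=worker, args=(per_proc,))
             for _ in range(processes)]
    start = time.time()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    elapsed = time.time() - start
    print("%d requests / %d processes: %s s, %d req/s"
          % (total_requests, processes, elapsed, total_requests / elapsed))


if __name__ == "__main__":
    run(10000, 10)
```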

Total requests   Processes   Elapsed time (s)   req/s
10000            10          0.0944209098816    105861
50000            10          0.53000497818      94334
100000           10          5.44690394402      18358
200000           10          47.9682750702      4169
20000            20          0.155431985855     128652
100000           20          0.848218202591     117890
200000           20          13.7452850342      14550
400000           20          107.0227952        3737
30000            30          0.237970113754     126053
150000           30          5.37003397942      27932
300000           30          19.7377228737      15199
