Because I tried running it on a non-3D-capable card?
Also, IIRC the server code specifically calls a Direct3D function which queries the 3D caps; I remember having to deal with that in my NULL renderer.
In fact, I'd have to look it up in Olly again, but there's a check that specifically bails out with an error message if the video device doesn't support 3D caps, even if it's running as a dedicated server.
My NULL renderer does the usual thing, reports itself as being capable of everything that's needed, but actually does none of it.
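The idea is simple enough to sketch. Note the types and names below are illustrative stand-ins, not the real DirectDraw/Direct3D headers (the actual interfaces use DDCAPS / D3DDEVICEDESC structures behind COM vtables): a stub caps query that claims full hardware support, so the server's 3D check passes even though nothing will ever be rendered.

```c
#include <string.h>

/* Illustrative stand-ins for the real DirectX types. */
typedef long HRESULT;
#define DD_OK 0L
#define CAP_3D 0x00000001L   /* hypothetical "supports 3D" flag */

typedef struct {
    unsigned long dwCaps;    /* capability bits the caller inspects */
} FakeCaps;

/* NULL-renderer caps query: report everything as supported. */
static HRESULT Null_GetCaps(FakeCaps *out)
{
    memset(out, 0, sizeof *out);
    out->dwCaps = ~0UL;      /* claim every capability bit */
    return DD_OK;
}
```

The server's startup check then sees whatever 3D bit it tests for and carries on happily.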
Don't know about the normal speed of Blt, but from what (very little) I know, bit block transfer operations were historically fast - much faster than plain MOV loops or even REP MOVS (although the latter comparison may be moot, since bit blit originated on hardware that had nothing like the REP prefix).
The interesting thing is, when I'm accessing the (unmodified) server through a remote connection, the server update rate drops to a crawl (<30 updates/s) whenever the server window is not minimized. That's on modern (and fairly decent) hardware...
In any case, at only two instructions, my NULL implementation of Blt() can't possibly be optimized any further.
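For reference, a stub like that looks something like this in C (the signature below is illustrative, not the exact IDirectDrawSurface::Blt prototype, and the ret immediate depends on the real argument count):

```c
typedef long HRESULT;
#define DD_OK 0L

/* NULL Blt: accept any arguments, do nothing, report success.
   With a callee-cleans-stack convention on x86 this compiles down
   to two instructions, along the lines of:
       xor eax, eax   ; return DD_OK (0)
       ret 0x14       ; immediate depends on the real argument count */
static HRESULT Null_Blt(void *dstRect, void *srcSurf,
                        void *srcRect, unsigned long flags, void *fx)
{
    (void)dstRect; (void)srcSurf; (void)srcRect; (void)flags; (void)fx;
    return DD_OK;
}
```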
At least you can minimize it to make it use less CPU. With the NULL renderer, I can't even measure the CPU usage - it's off-scale low, always shows as 0.0%...
And before you ask - yes, the server is actually running. Obviously I can't see the console, but I can connect clients to it just fine.
So whatever is causing the slowdown and excessive CPU usage... it's evidently something in DirectX. I suspect Blt(), since it gets called the most?