Let's say that your program runs at a fixed frame rate of 60 Hz without vsync, and that the monitor runs exactly at 60 Hz. Since your program is not syncing to vsync, they may be out of phase; for example, the vertical retrace may happen in the middle of your update.
Now, since the time that Update and Draw take can vary from frame to frame, this can happen:
Code:
Monitor:
...--1/60 s---------------------------|-------------------------------1/60 s-------------------------------|
...-screen refresh--------------------|==v.retrace==|--------------------screen refresh--------------------|...
                                      ^                                                                    ^
                                    vsync                                                                vsync

Program (some frame):
|-------------------------------1/60 s-------------------------------|-------------------------------1/60 s-...
|Events|----Update---|--Draw--|*|----------------Sleep---------------|Events|...
                               ^
                            Present

Program (some other frame where Update and Draw take longer):
|-------------------------------1/60 s-------------------------------|-------------------------------1/60 s-...
|-Events-|----------Update---------|---------Draw--------|*|--Sleep--|-Events-|...
                                                          ^
                                                       Present
That won't happen if the sleep is performed before Present(): the sleep pads the frame out to a whole 1/60th of a second, so Present() is called at regular intervals. It does leave the player less time to react to the current frame, but I doubt anyone has reflexes under 17 ms anyway. https://humanbenchmark.com/tests/reactiontime
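To make the idea concrete, here is a minimal sketch of such a loop in Python (names like run_frames and the work callback are hypothetical; time.sleep stands in for the platform sleep, and recording a timestamp stands in for calling Present()). The point is that the sleep runs up to a fixed 1/60 s deadline *before* the present, so presents land on a regular grid even when Update/Draw durations vary:

```python
import time

FRAME = 1.0 / 60.0  # fixed frame period, 60 Hz

def run_frames(n_frames, work):
    """Fixed-rate loop: sleep up to the frame deadline BEFORE presenting,
    so 'Present' happens at regular 1/60 s intervals regardless of how
    long events/update/draw (simulated here by work()) took."""
    present_times = []
    next_deadline = time.perf_counter() + FRAME
    for _ in range(n_frames):
        work()  # stands in for: process events, Update(), Draw()
        # Sleep off whatever is left of this frame...
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        # ...and only then present, right at the frame boundary.
        present_times.append(time.perf_counter())  # Present() would go here
        next_deadline += FRAME  # absolute deadlines avoid drift
    return present_times
```

Because the deadline is advanced in absolute terms (next_deadline += FRAME) rather than relative to when the sleep woke up, small sleep inaccuracies don't accumulate from frame to frame.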
That's for a fixed frame rate. The case of frame-rate limiting (i.e. enforcing a maximum frame rate) is more complicated, unless the video card supports adaptive vsync.
I still don't see why the events processing time should not be included in the total measurement.