How to idle a full-screen app
David Thomas (43) 72 posts
Hello chums, I’m just adding full-screen support to a game which is currently desktop-only. Of course it’s running too fast, so it’s time to make it idle properly. What’s the current thinking on making single-tasking apps idle? Am I best sitting in a loop calling Portable_Idle until my delay period is eaten up? D.
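Roughly this sort of thing is what I have in mind (an untested C sketch; it assumes the Portable module is present, and the SWI number is defined explicitly in case swis.h doesn’t provide it):

```c
#include "swis.h"

#ifndef Portable_Idle
#define Portable_Idle 0x42FC6   /* Portable module SWI: sleep until interrupt */
#endif

/* Sleep for roughly the given number of centiseconds, waking on each
 * interrupt to re-check the monotonic (centisecond) clock. */
static void idle_for(int delay_cs)
{
    int start, now;
    _swix(OS_ReadMonotonicTime, _OUT(0), &start);
    do
    {
        _swix(Portable_Idle, 0);   /* halt the CPU until something happens */
        _swix(OS_ReadMonotonicTime, _OUT(0), &now);
    } while (now - start < delay_cs);
}
```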
Rick Murray (539) 13850 posts
Or set the poll idle time to the delay period? Though, note two complications. Firstly, the polling can return early if something of interest happens, so you’ll need to track time independently and restart the poll idle with the remaining delay in that case. Secondly, and perhaps more importantly, when you are in the desktop world you are interacting with other software. Generally things run smoothly, but it is possible for a task to “take a while” if a user-initiated action causes some sort of delay. So you’ll also need to notice time loss and handle it appropriately. At least we don’t have applications that randomly pop themselves to the top and steal the caret/keyboard input because they think what they’re doing is the most important thing in the world…

Now, as for a single-tasking app, if it is to run completely outside of the desktop environment, then you can do a simple loop like this (sketched in code below):

- Wait for vsync.
- Swap screen buffers.
- Draw the screen.
- Any keyboard input? If yes, handle it.
- Any NPCs to move or whatever?
- Check state (win/lose) for end of game.
- Artificial delay? (see below)
- Go back to the top.

While the Wimp would have handed you key events, out here you’ll need to handle the keyboard yourself. Specifically, remember to disable ESC, and work out what behaviour the cursor keys should have.

As for the artificial delay, you’ll need to keep track of the centisecond tick to allow your game to update at a fixed rate. This is because some monitors vsync at 60Hz, others at 75Hz, some at 85Hz, so you can’t rely upon vsync alone for timing.
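To make that concrete, here is an untested C sketch of the whole loop. OS_Byte 19 waits for vsync, OS_Byte 113 and 112 select the displayed and drawn-to screen banks, and OS_Byte 229 stops ESC acting as Escape; game_over, draw_frame, poll_keyboard and update_world are stand-ins for your own routines:

```c
#include "swis.h"

#define TICKS_PER_UPDATE 2   /* 2cs per game step = 50 updates per second */

/* Stubs standing in for the game's own code. */
static int  game_over(void)     { return 0; }   /* win/lose check      */
static void draw_frame(void)    { }             /* render everything   */
static void poll_keyboard(void) { }             /* read keys yourself  */
static void update_world(void)  { }             /* NPCs, state, etc.   */

int main(void)
{
    int bank = 1, last, now;

    _swix(OS_Byte, _INR(0,2), 229, 1, 0);   /* ESC gives ASCII 27, not Escape */
    _swix(OS_ReadMonotonicTime, _OUT(0), &last);

    while (!game_over())
    {
        _swix(OS_Byte, _IN(0), 19);               /* wait for vsync          */
        _swix(OS_Byte, _INR(0,1), 113, bank);     /* show the finished bank  */
        bank = (bank == 1) ? 2 : 1;
        _swix(OS_Byte, _INR(0,1), 112, bank);     /* draw into the other one */

        draw_frame();
        poll_keyboard();

        /* Vsync only paces the display; step the game by the centisecond
         * clock so it runs at the same speed on 60Hz, 75Hz or 85Hz. */
        _swix(OS_ReadMonotonicTime, _OUT(0), &now);
        while (now - last >= TICKS_PER_UPDATE)
        {
            update_world();
            last += TICKS_PER_UPDATE;
        }
    }
    return 0;
}
```

The point of the inner while loop is that drawing is paced by the monitor but the game state is stepped by elapsed centiseconds, so a fast display just redraws the same state more often rather than speeding the game up.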
André Timmermans (100) 655 posts
Always use triple buffering. The VSync on the Pi is emulated, so it just occurs at the same frequency as the monitor.
Yes, that’s OK since the game is already too fast. If it were a CPU-intensive game with a frame rate lower than the monitor’s refresh rate, I’d recommend instead waiting for the VSync counter to differ from its value at the time of the last screen-buffer rotation; that way, if your processing takes too long, you don’t idle unnecessarily until the next VSync.
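For instance (an untested sketch; vsync_count() is hypothetical, e.g. a counter you increment from a VSync event handler, and rotate_buffers() stands in for your own bank switching):

```c
#include "swis.h"

extern unsigned vsync_count(void);   /* hypothetical: vsyncs since start */
extern void rotate_buffers(void);    /* your screen-bank rotation        */

static unsigned last_rotation;       /* counter value at the last swap */

static void swap_when_ready(void)
{
    /* If a vsync has already happened since the last rotation, the
     * frame overran: swap immediately rather than stalling until the
     * next vsync. Otherwise, wait for one as usual. */
    if (vsync_count() == last_rotation)
        _swix(OS_Byte, _IN(0), 19);   /* still within the frame: wait */

    rotate_buffers();
    last_rotation = vsync_count();
}
```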
Rick Murray (539) 13850 posts
It’s better, yes. But the process is the same; you just juggle three buffers instead of two.
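Something like this, say (untested sketch; it assumes the mode has enough screen memory for three banks, numbered from 1):

```c
#include "swis.h"

static int shown = 1, drawing = 2, spare = 3;   /* bank numbers */

/* Rotate the three banks: the just-finished frame goes on screen, the
 * next frame is rendered into the spare, and the previously shown bank
 * becomes the new spare. */
static void rotate_buffers(void)
{
    int old_shown = shown;
    shown   = drawing;
    drawing = spare;
    spare   = old_shown;

    _swix(OS_Byte, _INR(0,1), 113, shown);    /* hardware displays this */
    _swix(OS_Byte, _INR(0,1), 112, drawing);  /* VDU drivers draw here  */
}
```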
That depends upon how the monitor is handled. If you have the GPU outputting a fixed mode (and rescaling to fit), the VSync will match that mode’s refresh rate. VSync should not be used for game timing, only for knowing when to begin drawing the next frame.
Then it would need to be handled differently. Again, triple buffering is useful here. With something I have written, I use VSync to know when to switch buffers and redraw, and the Ticker to run the thing at around 50fps (two centisecond ticks per update). On slower hardware it doesn’t even manage that, so if it detects that the fps is low it won’t attempt to limit, and if it’s far too low it’ll warn you that the machine is too slow.
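That logic looks something like this (untested sketch; it measures frame times with OS_ReadMonotonicTime rather than a Ticker handler for simplicity, and warn_too_slow() is hypothetical):

```c
#include "swis.h"

#define TARGET_CS   2    /* two ticks per frame: 50fps        */
#define TOO_SLOW_CS 10   /* slower than 10fps: warn the user  */

extern void warn_too_slow(void);   /* hypothetical user warning */

/* Call at the end of a frame with the centisecond time at which the
 * frame started.  Fast machines idle out the rest of the two-tick
 * budget; machines that miss the budget get no limiting at all, and
 * very slow ones trigger a warning. */
static void limit_frame_rate(int frame_start)
{
    int now;
    _swix(OS_ReadMonotonicTime, _OUT(0), &now);

    if (now - frame_start >= TOO_SLOW_CS)
        warn_too_slow();

    while (now - frame_start < TARGET_CS)
    {
        _swix(OS_Byte, _IN(0), 19);   /* doze until the next vsync */
        _swix(OS_ReadMonotonicTime, _OUT(0), &now);
    }
}
```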