Last thing today, I promise (99%). I have spent too long trying to make fullscreen work at a desired resolution. Against all logic, setting width & height didn’t work, and the application went for the native resolution.
Finally I found that the application start parameter --fullscreen-custom does the trick, so I just copy-pasted from ApplyFullScreenCustomParam(...) into GameInitialize.pas.
I’ve tried this a few times already. I’m wondering how it works for you. Not even the ‘screen_resolution_change’ demo changes the resolution: ‘cannot change screen resolution’. (Linux Mint MATE, games on Steam are working).
// auto generated
Application.OnInitialize := @ApplicationInitialize;
Window := TCastleWindow.Create(Application);
Application.MainWindow := Window;
// here starts my code
Window.AntiAliasing := aa8SamplesFaster; // or aa16SamplesNicer
Window.FullScreen := true;
Application.VideoResizeWidth := 1920; // or 800
Application.VideoResizeHeight := 1080; // or 600
Application.VideoResize := true;
Application.VideoChange(true);
// auto generated
Window.ParseParameters;
end.
It worked in a new project based on the FPS template as well, even with the crazy 800x600 on a 4K display. I didn’t bother with the demos though. I’ve tried a few different scenarios with Delphi and FPC, but only on Windows and a single monitor. The requested resolution and refresh rate must both be supported by the display, otherwise it will show an error message similar to yours and try to run at the default.
If other games run at the resolution that you try to apply, then check the refresh rate. It can be set by Application.VideoFrequency := xxx.
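For example (just a sketch, using the same Application.Video* properties as in the snippet above; the width, height and frequency values are placeholders for whatever mode your display actually supports):

Application.VideoResizeWidth := 1920;
Application.VideoResizeHeight := 1080;
Application.VideoResize := true;
Application.VideoFrequency := 60; // must be a refresh rate the display supports
Application.VideoChange(true); // apply the requested video mode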
As a side note.
The ‘Run’ menu in the CGE editor only has the --fullscreen command. Also, --fullscreen-custom doesn’t support a notation like ‘1000x2000@60’ to indicate the desired refresh rate, and I still haven’t found a CGE method to list the device capabilities, but I only started to look inside it yesterday afternoon.
Thank you for your help. It still doesn’t work this way. I already tried 800x600 and 1024x768 in the demo, which also threw an error. Linux, Lazarus, NVIDIA 3060 laptop. But it’s not so important at the moment. Thanks again.
Our Window.FullScreen property indeed only toggles between “our window fills the whole screen” or not. It does not change the screen resolution. Fullscreen window, by default, has the same resolution as the screen.
It is deliberate that you need to make an additional call, using Application.VideoChange, and set up the other Application.Video* properties, to manage this. It’s a different need, and switching the screen resolution may also mean choosing other things (like the refresh rate).
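To make the distinction concrete, a minimal sketch using only the API already shown in this thread:

// Fullscreen window at the native (current) screen resolution:
Window.FullScreen := true;

// Fullscreen window that additionally switches the physical screen resolution:
Window.FullScreen := true;
Application.VideoResizeWidth := 1920;
Application.VideoResizeHeight := 1080;
Application.VideoResize := true;
Application.VideoChange(true);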
We don’t really encourage changing the physical screen resolution:
- Alt+Tab behaves in weird ways: sometimes changing the resolution, sometimes not, sometimes forcing the context to be recreated when going back (and thus the application seems to hang, showing blackness, when alt+tabbing back).
- Users can be left with an undesired resolution after the game closes.
OSes don’t really give us a reliable API to say “I want my application to work with a different screen resolution, do it and handle the rest of the applications / desktop reliably”. In the case of Windows (using WinAPI) the capabilities are a bit better but not perfect (at least the resolution we request is tied to our application and sometimes reverts back OK); on Linux (XF86VMode of Xlib) it’s a bit worse (we have to manually restore the resolution, which requires us to track application changes/closing carefully). In neither case is it really perfect.
At least in the past, I could confirm the above also with games made with other engines, on Steam. Switching resolution in games (that change the physical screen resolution) and alt+tabbing between applications is risky; you’re never sure when the resolution will change/revert.
And on some systems, like mobile or web, it’s of course just impossible; the application cannot change the physical resolution. And we like to put most of our effort into solutions that are cross-platform.
I realize that many games expose the option to change screen resolution anyway. It works well enough for some configurations and for some systems.
That’s also why we have Application.VideoChange. While I discourage this (see AD 2) and encourage a different way (see AD 4 below), I don’t plan to remove Application.VideoChange or anything.
The way I recommend is to always use the native resolution of your monitor. Don’t use Application.VideoChange. Instead, rely on our UI scaling, active by default in new projects, to adjust the size of everything to the user’s screen.
The ideas in this point (UI scaling, rendering off-screen) are completely cross-platform, i.e. they work on all platforms, and even allow working in windowed mode at any size (the latter is IMHO useful for debugging).
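Roughly, it amounts to something like this (a sketch from memory, so double-check the property names; new projects normally configure the same thing through CastleSettings.xml, and the 1920x1080 reference size is just an example):

Window.FullScreen := true; // fullscreen at the native screen resolution
Window.Container.UIScaling := usEncloseReferenceSize;
Window.Container.UIReferenceWidth := 1920; // design the UI as if the screen was 1920x1080;
Window.Container.UIReferenceHeight := 1080; // it gets scaled to the actual screen size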
More details: We actually have a Linux implementation (more generally, suitable for any Unix, like FreeBSD too). But it’s poor and so disabled by default.
It uses the XF86VMode API, and we never managed to make it work well combined with GTK, see the comment here. You can define the 2 symbols near the line linked above and try whether it works for you. A long time ago when I tested it, it caused a weird “virtual screen”. It seemed that GTK was not prepared for us using XF86VMode behind its back, or maybe I wasn’t smart enough to do it correctly.
Alternatively, it works better if you switch the CastleWindow backend from CASTLE_WINDOW_GTK_2 (the default on Linux now) to CASTLE_WINDOW_XLIB. But CASTLE_WINDOW_XLIB has other issues: fullscreen there is realized using an obsolete method (“override_redirect”), and there is no native menu bar or dialogs.
So we just do not have a version of Application.VideoResize that works well on Linux (and FreeBSD and other Unixes, since they share the implementation). Yes, this is despite the fact that I do most of my work on Linux. The default GTK backend simply does not support Application.VideoChange.
PRs to improve this are welcome, of course. As said above in AD 2,3,4 – we don’t recommend Application.VideoChange, so it’s not our focus. But at the same time we don’t plan to remove/deprecate it either, I understand this is a requirement in some cases, and not everyone has to agree with my experience from AD 2.
The reason why we don’t have a function to list available resolutions is similar to above – it makes sense, but it’s not our focus. We don’t want to put work into it, though PRs would be welcome.
But it’s not trivial, as people may have multiple monitors, so you also need to implement a concept of a screen, and know which screen is used (or will be used) by a given window (sometimes it spans multiple screens), and which screen determines the position of the application.
And ideally we want it cross-platform, at least Windows (WinAPI) and Linux/FreeBSD (GTK).
Sorry that I didn’t read the README info. I wouldn’t have thought it would be so complicated. And the demo had such nice resolution change buttons in it. But thank you very much for the detailed information on this topic. I’ll try to read a bit more in the future.
For me, personally, changing the resolution to lower values makes sense mainly for testing purposes. I don’t assume that what works on my screen will work on others. Alpha and pre-alpha testing is easier on just one machine, as it offers immediate insight with just a few clicks, but considering what you wrote I’ll just stick to windowed mode for that. Later stages of testing require separate hardware anyway.
Historically, “physical” fullscreen at a reduced resolution was a way to play games at a decent FPS, and that led me to think that if it’s easy then it’s worth doing. But I do completely agree with your points, @michalis; your answer gave me some more insight and made me re-think some things.
What you described in AD 4, creating the buffer and UI scaling, is actually done for modern 3D games by NVIDIA’s DLSS and AMD’s FidelityFX. They render the image at a lower resolution and scale it up to fit the window size. The advantage is that it’s done by the hardware. The disadvantage is that it’s supported only on more recent cards (from around 2018, I think). I have yet to find a way to utilize it in CGE (maybe you already have it somewhere).
I do love that CGE supports everything from a washing machine to quantum computers, but scaling an off-screen buffer (issue #669) programmatically may, IMHO, offer less benefit than expected. My personal choice is to support desktops & laptops in my game, so it seems the best option will be to focus on higher-end graphics rather than trying to make owners of older hardware happy at any price. And it seems that older machines can only be addressed by custom settings (4K textures, million-polygon cute faces, FX and fancy clouds are easier to control than trying to implement workarounds everywhere).
Nevertheless, thanks for your detailed answer. Appreciated.
I don’t yet have a way to do this, but I’d like to. From what I understand, the AMD solution is open source and could be investigated more easily than NVIDIA’s DLSS.
If you or anyone else wants to experiment with this (whether and how we can connect these solutions with CGE), I would be very happy.
Indeed, it’s not a magic bullet. It only helps if:
- your bottleneck is “GPU work in a pixel shader”, which can easily happen for 3D games with lots of lights and textures; then rendering at lower resolutions will help,
- your users can tolerate the lower resolution.
But if your performance problem is elsewhere, then naturally it will not help.
I’ve actually seen it used by some games (one Diablo-like game whose name I forget now…), and Godot also offers this option, so it does seem like a reasonable option to offer.
I thought that’s actually the reason why you asked for this, i.e. you want to offer users the option to change resolution → to increase performance. For debugging, I would recommend what you said: use windowed mode. It may be easier; you can then easily pick any window size and aspect ratio, independently of what your monitor supports as screen sizes.
It appears that my enthusiasm about the NVIDIA and AMD solutions was a bit premature. Both need Vulkan or DirectX. I’m starting to miss the times when a Hercules graphics card made people happy. But it doesn’t mean nothing can be done.
I’ll look into upscaling with CGE-native buffers. I’m worried about the need for blur / sharpen, and about issues with contrast and LOD, but it’s always better to offer customers some option than none.
Maybe if frames 1,3,5,9… were rendered at full resolution, and the frames in between at half resolution, upscaled 2x with a simple linear colour average and then mixed with the previous (full-resolution) frame? I’m thinking out loud now, but for motion blur we need to mix frames anyway, so it could be used together with the scaling…
Anyway, I’ll look into it. Any performance gain is a good performance gain.
GH issue 669 is just regular image upscaling, with bilinear filtering (so yes, it can make things blurry if one tries to make the rendering area too small) or nearest-neighbour (so it will look pixelated, which may actually be a feature in certain cases, if one wants a “retro” look).
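Just to illustrate the visual difference between the two filters on the CPU side (not the actual approach of the issue, which is about upscaling an off-screen GPU buffer at render time), a quick, untested sketch with CastleImages; the file names are just placeholders:

uses SysUtils, CastleImages;

procedure CompareUpscaleFilters;
var
  Small, Big: TCastleImage;
begin
  // Stand-in for a frame rendered at a lower resolution.
  Small := LoadImage('castle-data:/frame_small.png');
  try
    // Bilinear: smooth, but blurry when the source is much smaller than the target.
    Big := Small.MakeResized(Small.Width * 2, Small.Height * 2, riBilinear);
    try
      SaveImage(Big, 'upscaled_bilinear.png');
    finally FreeAndNil(Big) end;

    // Nearest: sharp pixels, the "retro" look.
    Big := Small.MakeResized(Small.Width * 2, Small.Height * 2, riNearest);
    try
      SaveImage(Big, 'upscaled_nearest.png');
    finally FreeAndNil(Big) end;
  finally FreeAndNil(Small) end;
end;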
All kinds of research are welcome, though my suggestion would be to first implement the simplest “baseline” approach, following the idea of GH issue 669. On top of that code, the experiments you propose can be made.
Combining this with our screen effects, which could introduce a “sharpen” image operation, could make sense. Though I fear that any attempt here could “backfire” – because in the end, this is a technique to save time (unless used for that retro pixelated effect), so any processing of the image must be really fast, or we risk negating the performance gains from rendering at a smaller resolution.
Admittedly, I am a bit skeptical of using previous frames for this, unless a specific game style wants motion blur anyway. But then it’s motion blur, and frames “jump” in quality – which may be more distracting to the human eye than the naive “just downscale all the frames”. Anyhow, I know you were thinking out loud, and I shouldn’t jump into doubting too fast. Experiments are welcome and good.
As for the skeleton app: inside new_render_skeleton.dproj, shouldn’t lines 240 & 260 say Win64? Otherwise the settings for Win32 are duplicated (2x debug, 2x release). I may be wrong; I only took a very quick look at it.
Anyway, supporting Vulkan (and possibly DirectX) is worth considering. It would be a side project for me, though.
For upscaling, I’ve seen many algorithms in the past, some preserving sharp edges based on 3x3 or 3x4 neighbourhoods, and one based on a very simple neural network, but so far I’ve only used them on the CPU. If any of them prove to be fast enough as a GPU shader, device-independent, and easy to maintain, then I’ll post the code here.
I will start from what you propose, based on issue 669. First of all, I want it to be fully compatible with CGE - I don’t want to have to worry with every CGE update whether it’ll still work. I consider it worth experimenting with, and fun, maybe even useful.