Code: Select all
love.window.setMode(config.width, config.height, {
    resizable = true,
    fullscreen = config.fullscreen,
    vsync = config.vsync,
    highdpi = config.highdpi,
    minwidth = 480,
    minheight = 480
})
-- Read back the flags the window was actually created with
local _, _, flags = love.window.getMode()
highdpi = flags.highdpi
vsync = flags.vsync
On Windows, I correctly get a non-highdpi window, and flags.highdpi is false.
On macOS on a highDPI monitor, I correctly get a highdpi window, and flags.highdpi is true.
However, on macOS with a non-highDPI monitor (e.g. a 23" 1080p monitor), it gives me a non-highDPI window of the requested size, but the highdpi flag is still set to true, so my game renders its UI fonts at twice the scale it should.
Is there a better way to detect whether the screen actually supports highdpi? As far as I can tell, on macOS flags.highdpi just echoes whatever was passed into the love.window.setMode call, regardless of whether the window is actually highdpi.
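For example, I've been wondering whether something like this would be more reliable (just a sketch; it assumes LÖVE 0.10's love.window.getPixelScale reports the created window's real pixel density rather than echoing the requested flag):

Code: Select all
-- Sketch: infer the real highdpi state from the window's pixel density.
-- Assumes love.window.getPixelScale returns the actual pixels-per-point
-- ratio of the window (2 on a Retina display with a highdpi window,
-- 1 otherwise).
local function windowIsHighdpi()
    return love.window.getPixelScale() > 1
end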
As a hack I could look at love.window.getFullscreenModes and see whether any of the modes' widths are greater than 2560 (since I don't know of any highdpi screens that are <= 2560 wide), but that seems inelegant. Unfortunately getFullscreenModes also doesn't tell me whether a mode is actually highdpi; as far as I can tell it just gives the physical pixel width and height.
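For concreteness, the hack would look something like this (a sketch; the 2560 cutoff is just my guess at a safe threshold, and the display index argument defaults to 1 in LÖVE 0.9+):

Code: Select all
-- Sketch of the width-based hack: guess that a display is highdpi if any
-- of its fullscreen modes is wider than 2560 physical pixels.
local function screenLooksHighdpi(display)
    for _, mode in ipairs(love.window.getFullscreenModes(display or 1)) do
        if mode.width > 2560 then
            return true
        end
    end
    return false
end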
Thanks!