Scaling + Tiled Images = Gaps

Before I post my .love file and beg you all for help, I'd like to provide a little context. I'm trying to build a rather rudimentary 2D adventure game: something that would let the player make a character walk around in a walkable area, click on things to look at them, pick them up, use them, etc. The game is an exercise for me to learn about Love's capabilities and limitations.
Unfortunately, I'm not even at the point where I have the ego (the player character) moving around the screen. I hit a bit of a snag scaling the background image to make my game resolution-independent.
Thinking about games I'd played recently with smooth, beautiful, high-resolution background artwork (like Lost Horizon), I decided to use high-resolution background images and scale them down (maintaining aspect ratio) to the user's resolution in full-screen mode. Before I even got started, I ran into issues that were (thankfully!) well documented and fixed in builds > 0.8.0.
After sorting out full-screen mode, my first attempt at drawing the background was to do things as simply as possible: load the image, then scale, translate and draw it every frame. This worked fine on my Windows 7 desktop with its 8800 GTX, which gobbles up and spits out large textures at lightning speed. When I fired it up on a test Windows XP machine in Virtual PC, however, I got this:

[screenshot: the background renders as solid white]
I promptly checked out a few of the community articles on the power-of-2 (PO2) problem and deduced that this might be the culprit. Indeed, if I limited the source image size to 1024x1024, my test background magically appeared in Virtual PC. Thinking I'd solved the problem, I went about building a function that took a really high-res background image and split it into 512x512 (one PO2 smaller, for safety) textures.
Note: I call the 512x512 blocks that Love processes 'textures' because that's what I think they are. I'm not sure if that's the correct term, but it makes the most sense to me. As I understand it, they're stored in VRAM and their size is constrained by video card limitations, not Love ones.
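In rough terms, the splitting function works like this (a simplified sketch with made-up helper names, not the exact code from the attached file):

Code:
-- Simplified sketch: split a large background into 512x512 blocks.
-- Helper names here are made up; the real code is in the attached .love file.
local TILE = 512

local function newTiledImage(path)
    local source = love.image.newImageData(path)
    local cols = math.ceil(source:getWidth() / TILE)
    local rows = math.ceil(source:getHeight() / TILE)
    local tiles = { width = cols, height = rows }
    for row = 0, rows - 1 do
        for col = 0, cols - 1 do
            local w = math.min(TILE, source:getWidth() - col * TILE)
            local h = math.min(TILE, source:getHeight() - row * TILE)
            local block = love.image.newImageData(w, h)
            -- Copy one block out of the big source ImageData.
            block:paste(source, 0, 0, col * TILE, row * TILE, w, h)
            tiles[row * cols + col + 1] = love.graphics.newImage(block)
        end
    end
    return tiles
end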
Anyway, I built a similar function to piece the background back together from the smaller textures. So now I had a draw function (sketched after the list below) that:
1. Scaled everything down to fit the full screen resolution
2. Translated everything down or right so that the letterboxing/pillarboxing (if any) around the content was equal
3. Drew the background in 512x512 blocks
4. Drew the cursor
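In rough pseudo-LÖVE, that draw function looks like this (again a sketch with made-up variable names; the real thing is romance.drawTiledImage in the attachment):

Code:
-- Rough sketch of the draw order above; variable names are made up.
function love.draw()
    love.graphics.push()
    love.graphics.translate(viewport.x, viewport.y) -- step 2: centre the letterbox/pillarbox
    love.graphics.scale(viewport.scale)             -- step 1: fit the screen resolution
    for i, tile in ipairs(background) do            -- step 3: reassemble the 512x512 blocks
        local col = (i - 1) % background.width
        local row = math.floor((i - 1) / background.width)
        love.graphics.draw(tile, col * 512, row * 512)
    end
    love.graphics.pop()
    love.graphics.draw(cursor, love.mouse.getX(), love.mouse.getY()) -- step 4: cursor on top
end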
Whew! I thought everything was complete. My 3200x2000 (16:10) source image was still being scaled down nicely on my 1600x1200 monitor. But (shock, horror!) on the Virtual PC Windows XP machine at 1280x1024, I noticed this:

[screenshot: thin gaps between the background blocks]
Basically, I'm seeing little gaps between the 512x512 textures. At different resolutions they can be more or less pronounced and show up in different areas. I thought about the problem a bit:
- My original resolution was 3200x2000
- I was trying to scale to 1280x1024
- This would mean each 512x512 block would actually be (0.4 x 512) x (0.4 x 512) ... or 204.8x204.8
I thought this could be the problem. Love might be drawing the first texture at (0, 0), then the second one at (204.8, 204.8) and so on, causing the weird gaps between textures. I tried forcing a scale of 0.3984375 -- that is, 204x204 textures -- but this didn't solve the problem! The gaps were, if anything, more pronounced. I even tried snapping my tiled x/y coordinates to integers with math.floor, because I thought there might be an IEEE float rounding issue, but still no dice.
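Concretely, the forced scale came from snapping to a whole number of screen pixels per tile, along these lines:

Code:
-- Snap the scale so each 512px tile maps to a whole number of screen pixels:
-- math.floor(0.4 * 512) / 512 = 204 / 512 = 0.3984375
scale = math.floor(scale * 512) / 512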
So... I'm out of ideas. I've searched the forums thoroughly but haven't found any solutions. I'm hoping some kind soul here will point out where my code is broken. The .love file includes a "brony" (lol) background image from DeviantArt, which I think is fair use. Most of the code is in "romance.lua", which is also my namespace for the extra functions I've written, e.g. romance.newTiledImage and romance.drawTiledImage. Pretty straightforward, and love -> romance seems like a logical progression. Any, and I mean ANY, help would be appreciated. Sorry for the long-winded post! I also have a second question about screenshots, but I might save it for another post.
- Attachments
- Game.love: "What the world needs now is love, sweeeeeeeet love." (1.08 MiB) Downloaded 625 times
Re: Scaling + Tiled Images = Gaps
I remember running into this on Windows, but not on Linux, and it baked my noodle. The solution was to make sure you were drawing tiles at integer coordinates.
So, let's say you're keeping track of the scale (and it's some integer like 2, 3, etc.); before drawing the tile, do this:

Code:
x = math.floor(x*scale)/scale
y = math.floor(y*scale)/scale
love.graphics.draw( ... )

This generally works for me, but you may need to tweak it a bunch.
Re: Scaling + Tiled Images = Gaps
Unless Microsoft has updated Virtual PC with fancy graphics card support, the image from the XP VM is drawn with the OpenGL fallback software driver. I have a feeling that Microsoft never cared much about it; the poor driver is still stuck at OpenGL 1.1. Mesa (the software renderer on probably all Linux boxes) doesn't show this problem.
I'm guessing it's a blending issue, but I have to poke it myself for a while to figure out what's going on.
Edit: Oh yeah, about the first image in your post. LÖVE currently fails silently if OpenGL errors on texture loading and it shows up as white. 3200x2000 was too much for the driver.
Edit2: Arglrmlgr! Clamp to edge ((Image):setWrap("clamp", "clamp"), the default) requires OpenGL 1.2. The driver probably wraps the texture and interpolates the color from the other edge.
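For reference, the wrap mode is set per Image, so making it explicit looks like this (it changes nothing on a conforming driver, since clamp is already the default):

Code:
-- "clamp" is already LÖVE's default wrap mode. A GL 1.1-only driver has no
-- true clamp-to-edge, so the sampler can pick up texels from the opposite
-- edge of the tile, which is what produces the seams.
local tile = love.graphics.newImage("tile.png")
tile:setWrap("clamp", "clamp")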
Shallow indentations.
Re: Scaling + Tiled Images = Gaps
Thanks for the suggestion, Inny. I have already tried flooring the coordinates AND ensuring the scale is a number that results in an integer texture size after scaling (512 -> 204). The code I posted is my pre-flooring effort, as I felt flooring the numbers actually made things worse. To floor the values involved, I added this to the end of the love.load function:

Code:
romance.game.viewport.scale = math.floor(romance.game.viewport.scale * romance.texture.width) / romance.texture.width

and replaced the statement inside the loop of romance.drawTiledImage with this:

Code:
love.graphics.draw(image[i], math.floor(((i - 1) % width) * romance.texture.width) + x, math.floor(math.floor((i - 1) / width) * romance.texture.height) + y)

The result was something like this:

[screenshot: the gaps, now more pronounced, with texture bleeding along the edges]

I hope you can see what I meant by the gaps being more pronounced. If you look closely, you can even see some "bleeding" of textures from the top onto the bottom line of the screen. I really don't get it! The top-left texture especially (0, 0), and in fact the whole top row, should be perfectly aligned with the pixel grid!

Boolsheet, it's great to have confirmation of that first thing! Yeah, 3200x2000 was definitely too large a texture. So... software rendering only supports OpenGL 1.1, eh? Does clamp to edge in OpenGL 1.2 guarantee I won't get these weird gaps between textures? If so, maybe I should note that as a requirement: requires a video card supporting OpenGL 1.2 or greater. I would really prefer to solve the problem for all video cards that are compatible with Love, however.
- slime
- Solid Snayke
- Posts: 3170
- Joined: Mon Aug 23, 2010 6:45 am
- Location: Nova Scotia, Canada
Re: Scaling + Tiled Images = Gaps
OpenGL 1.2 was 'released' in 1998, 14 years ago. Every system that will be playing games supports at least OpenGL 1.4, I believe.
Re: Scaling + Tiled Images = Gaps
Hmm... you make a good point, slime. Maybe I am just making a big fuss about nothing. It's really unlikely someone would be running a Windows XP-7 system without drivers. Does the .love file in my original post look alright on everyone else's systems? It looks fine to me, but I'd like some confirmation from someone (maybe with integrated graphics) that there aren't any weird gap problems for them.
I used this tool to check the OpenGL support of Virtual PC:
http://www.realtech-vr.com/glview/download.html
Sure enough, it's using the MS Direct3D wrapper, which supports only version 1.1 of OpenGL.
Next step then: is there any way I can check for OpenGL >= 1.2 support and give users without such video hardware an error message?
I have an nVidia card. Is there any way I can enable/disable support for different OpenGL versions in my drivers (short of uninstalling them)? This is just to test 1.1 vs. 1.2, really, for my own peace of mind.
Sorry if I sound like I'm worried about nothing. I'm just concerned and want to make sure that I do things correctly. Reducing my texture size from 3200x2000 to 512x512 is still a good idea though, right? My video card can support 3200x2000 textures, but I'm likely to find cards out there in the wild that won't. Is that correct?
- dreadkillz
- Party member
- Posts: 223
- Joined: Sun Mar 04, 2012 2:04 pm
- Location: USA
Re: Scaling + Tiled Images = Gaps
There's no seam on my end (Windows 7 x64, Nvidia GT 220). I think this is one of those cases where you're worrying too much about accommodating as many people as possible (dinosaur computers).
Re: Scaling + Tiled Images = Gaps
Just to add that your problem also happens running XP in VMware Fusion (fullscreen). In VirtualBox running Windows 8 (fullscreen), however, the problem doesn't occur.
Re: Scaling + Tiled Images = Gaps
Windows 7 on a netbook; I've got an Intel 945 display adapter (Intel GMA 950), and it works without any problem.
lf = love.filesystem
ls = love.sound
la = love.audio
lp = love.physics
lt = love.thread
li = love.image
lg = love.graphics
Re: Scaling + Tiled Images = Gaps
gurok wrote: If you look closely, you can even see some "bleeding" of textures from the top onto the bottom line of the screen.
That's exactly what gave me the hint to look for the wrapping/clamping thing.
gurok wrote: So... software rendering only supports OpenGL 1.1, eh?
Microsoft's software renderer, anyway. Mesa should support OpenGL 3.0 with the latest version.
gurok wrote: Does clamp to edge in OpenGL 1.2 guarantee I won't get these weird gaps between textures?
In this case, yes. It gets more complicated if you start using Quads and non-opaque pixels.
gurok wrote: Does the .love file in my original post look alright on everyone else's systems?
Looks fine on my netbook with a GMA 950 (Windows XP driver with OpenGL 1.4). Like slime said, it's really unlikely that someone still has OpenGL 1.1 hardware and then tries to run games on it. Hey, Intel's last 1.1 card is apparently from 1997, and that means something!
gurok wrote: I would really prefer to solve the problem for all video cards that are compatible with Love, however.
Mh, just for completeness, here's a possible solution. You could pad the 512x512 ImageData with a 1-pixel border in the color of the neighbouring pixels, create and place a 510x510 Quad in the center, and then draw the tiles like that. Not sure if that actually works, and it's certainly not worth the effort to support the Microsoft fallback driver.
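Something along these lines (an untested sketch; it assumes the source splits evenly into 510px steps and skips the corner pixels for brevity):

Code:
-- Pad a 510x510 region into a 512x512 ImageData with a duplicated 1px border,
-- then draw only the centre through a Quad so filtering never reaches the edge.
local function paddedTile(source, sx, sy)
    local data = love.image.newImageData(512, 512)
    data:paste(source, 1, 1, sx, sy, 510, 510)       -- the 510x510 payload
    data:paste(source, 1, 0, sx, sy, 510, 1)         -- duplicate top row
    data:paste(source, 1, 511, sx, sy + 509, 510, 1) -- duplicate bottom row
    data:paste(source, 0, 1, sx, sy, 1, 510)         -- duplicate left column
    data:paste(source, 511, 1, sx + 509, sy, 1, 510) -- duplicate right column
    local image = love.graphics.newImage(data)
    local quad = love.graphics.newQuad(1, 1, 510, 510, 512, 512)
    return image, quad
end
-- Then draw each tile with love.graphics.drawq(image, quad, x, y).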
gurok wrote: Next step then: is there any way I can check for OpenGL >= 1.2 support and give users without such video hardware an error message?
No, there's no direct way to query this information. There is love.graphics.isSupported, but none of the current options check for OpenGL 1.2 specifically. Perhaps clamp to edge could be added there... then again, it's OpenGL 1.2.
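As a rough proxy (my suggestion, not a real version check), you could test one of the features isSupported does know about and degrade gracefully:

Code:
-- "npot" (non-power-of-two texture support) is a reasonable stand-in for
-- "not an ancient GL driver", though it is not an OpenGL 1.2 check.
if not love.graphics.isSupported("npot") then
    -- Warn the user or fall back to power-of-two-safe rendering here.
end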
gurok wrote: I have an nVidia card. Is there any way I can enable/disable support for different OpenGL versions in my drivers (short of uninstalling them)?
Not that I know of, and a short Google search didn't turn up anything. Maybe there's a way to get Mesa to do that; it already has an environment variable for disabling extensions.
gurok wrote: Reducing my texture size from 3200x2000 to 512x512 is still a good idea though, right? My video card can support 3200x2000 textures, but I'm likely to find cards out there in the wild that won't. Is that correct?
Yes, 512x512 is a good compatibility size. I think current AMD/Nvidia cards can do 8192 to 16384. Older low-end stuff may cap at 512.
Edit: I also wanted to mention that loading the whole texture uses up ~35 MB of VRAM. LÖVE may fail silently or terribly loudly on out-of-memory problems. It's always a surprise.
coffee wrote: In VirtualBox running Windows 8 (fullscreen), however, the problem doesn't occur.
Do you know if you have hardware acceleration enabled, or does Windows 8 actually ship a new OpenGL fallback driver?
Shallow indentations.