
Gamma unaware anti-aliasing in hardware transforms?

Posted: Tue Feb 23, 2010 5:00 pm
by pygy
Hi all,

I've just read this article about how a lot of software doesn't take gamma into account when applying affine transforms to images.

Earlier today, while playing with LuaAV, I noticed that the anti-aliasing on my graphics card has the same problem: when you look at anti-aliased lines from a distance, they look dashed.
[attachment: IxrS7.png]
I've written a test case in LÖVE: a 128x128 px crop of the Dalai Lama picture from the above article, scaled and/or rotated using love.graphics.draw(...).
[attachment: gamma.love (v2)]
Here's the result on my machine (Mac OS X 10.6, Mobile Radeon X1600):
[attachment: Up1Un.png]
The continuous grey bands have a value of 128 on my machine, but they should be higher. With a gamma of 2.2 (the default for computer monitors), the value should be 187.
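
For reference, here's the back-of-the-envelope arithmetic behind that figure, as a plain C sketch (assuming a simple 2.2 power-law transfer curve; the exact sRGB formula lands closer to 187):

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* A 50/50 mix of black and white should be averaged in linear light
       and then re-encoded with the display gamma, not averaged directly
       on the 0-255 encoded values. */
    double linear_avg = (0.0 + 1.0) / 2.0;                /* linear light */
    double correct = pow(linear_avg, 1.0 / 2.2) * 255.0;  /* ~186 */
    double naive   = 0.5 * 255.0;                         /* 128 */
    printf("gamma-aware gray: %.0f\n", correct);
    printf("naive gray:       %.0f\n", naive);
    return 0;
}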

How does it look on your hardware?
Is there an OpenGL option to correct this?

Re: Gamma unaware anti-aliasing in hardware transforms?

Posted: Tue Feb 23, 2010 5:37 pm
by kalle2990
Pretty much the same.. :|

Re: Gamma unaware anti-aliasing in hardware transforms?

Posted: Tue Feb 23, 2010 5:44 pm
by pygy
What are your OS and graphics card?

Re: Gamma unaware anti-aliasing in hardware transforms?

Posted: Tue Feb 23, 2010 5:54 pm
by kalle2990
Windows Vista 32-bits Home Premium SP 2
Nvidia GeForce GT120 1024MB + 1340MB (or something) shared

Re: Gamma unaware anti-aliasing in hardware transforms?

Posted: Tue Feb 23, 2010 6:02 pm
by pygy
Nice :-)

So we know that (some versions of) nVidia and ATI cards are affected, on both Windows and OS X.

We still need Linux and Intel cards to have our bases covered...

Re: Gamma unaware anti-aliasing in hardware transforms?

Posted: Tue Feb 23, 2010 6:06 pm
by bartbes
Same, linux+nvidia.

Re: Gamma unaware anti-aliasing in hardware transforms?

Posted: Tue Feb 23, 2010 6:09 pm
by pygy
bartbes wrote:Same, linux+nvidia.
What card exactly? Do you use the binary driver from nVidia?

Re: Gamma unaware anti-aliasing in hardware transforms?

Posted: Tue Feb 23, 2010 6:13 pm
by bartbes
9800GT, nvidia proprietary driver.

Re: Gamma unaware anti-aliasing in hardware transforms?

Posted: Tue Feb 23, 2010 7:03 pm
by pygy
Thanks.

Actually, it looks like it's the bilinear filter that's responsible for the Dalai Lama bug (since the images appear to be filtered in LÖVE), but the FSAA suffers from the same bug (see the updated example and test case).

BTW, how does the FSAA buffer feature work in LÖVE? I get the best AA quality with the parameter set to 0. With other values, I get "blocky" AA (I've tried 1, 2, 3, 4, 8, 12 and 100 with no success).


Edit: I've done some research; apparently, on recent hardware, you can enable gamma-corrected anti-aliasing and texture filtering. I can't tell you how, though...
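
From what I can tell, the relevant bits are the EXT_framebuffer_sRGB and EXT_texture_sRGB OpenGL extensions. LÖVE doesn't expose them, so this is only an untested C-level sketch of what the engine would have to do, not something you can drop into a .love file:

#include <GL/gl.h>
#include <GL/glext.h>

/* With an sRGB-capable framebuffer (the window must be created with an
   sRGB pixel format), blending and the multisample resolve are done in
   linear light and re-encoded on write. */
void enable_gamma_correct_output(void)
{
    glEnable(GL_FRAMEBUFFER_SRGB_EXT);
}

/* Tagging a texture as sRGB lets the hardware linearize the texels
   before bilinear filtering (on hardware that supports it). */
GLuint upload_srgb_texture(int w, int h, const void *rgba_pixels)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8_EXT,
                 w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, rgba_pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}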

Re: Gamma unaware anti-aliasing in hardware transforms?

Posted: Wed Feb 24, 2010 8:10 am
by pekka
There are specific programs for twiddling with the settings of your graphics card. Since the cards themselves have proprietary innards, these programs are specific to the hardware (and to particular OSes too), and are made by the vendors of the cards. You'll probably find the right program to use by asking at a suitable support forum for your particular OS and hardware, or by visiting your graphics card vendor's website. (They are usually named something Control something, because they Control your Card with a Capital C.)

Anyway, for my OS and card that particular program allowed me to change the settings for anisotropic filtering, mipmapping, anti-aliasing and all kinds of neat things. I'll have to revisit its gamma and color settings now that I've learned what the Dalai Lama has to say about gamma and image scaling :) :)

I think my card supports at most 6 FSAA buffers, by the way. (It's at home and I am not.) Newer cards support a few more, but 100 seems unlikely (though what do I know). It's possible to query how many were actually enabled via the OpenGL API, but I don't think LÖVE has a method to do it directly. If you know C, you can read and try the example OpenGL program that comes with the SDL library; it has an option to enable FSAA buffers. It's also relevant to LÖVE, because LÖVE is built on top of SDL (and other libraries too, but for graphics it uses SDL and OpenGL).
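
If it helps, this is roughly what the request-and-query dance looks like with plain SDL 1.2, in the spirit of that SDL example program (an untested sketch, not LÖVE code):

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    int samples = 0;

    SDL_Init(SDL_INIT_VIDEO);

    /* The multisample attributes must be set before the window is created. */
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 4);

    if (SDL_SetVideoMode(640, 480, 0, SDL_OPENGL) == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* The driver may silently clamp the request (e.g. 100 down to its
       maximum), so ask what it actually gave us. */
    SDL_GL_GetAttribute(SDL_GL_MULTISAMPLESAMPLES, &samples);
    printf("FSAA samples actually enabled: %d\n", samples);

    SDL_Quit();
    return 0;
}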