[Solved] Why doesn't this shader code work properly? (Or alternate methods)
Posted: Sun May 01, 2022 5:47 pm
So in my game I've written a color-reducing shader, a dithering shader I guess, that takes the final image and snaps every color to the closest one in a "palette" of colors. After weeks of looking for code that worked how I needed it to, I found some that uses the distance() function to compare the RGB of the current pixel against the RGB of every entry in a table of RGB values (the palette) and returns the closest color. Works great. It slows down at high resolutions, but I don't care because my game is best played at a low 320x180 resolution.
Thing is, with this method I have to have all the colors hard-coded in a table in the shader code itself. That would be fine if I had just one final palette, but since I'm still in the playing-around stage, I want to be changing palettes all the time, and it's a pain to keep replacing the table every time I launch the game to see the new colors. So I've been looking for alternatives that do the same thing.
This is the default code:
Code:
#define RGB(r, g, b) vec3(float(r)/255.0, float(g)/255.0, float(b)/255.0)
#define NUM_COLORS 32

vec3 palette[NUM_COLORS];

// pre-GLES3 GPUs don't support array constructors, so the array has to be initialized explicitly
void InitPalette()
{
    palette[0] = RGB(255, 255, 255);
    palette[1] = ...
    // etc.
}
vec4 EuclidDist(vec3 c, vec3[NUM_COLORS] pal)
{
    int idx = 0;
    float nd = distance(c, pal[0]);
    for (int i = 1; i < NUM_COLORS; i++)
    {
        float d = distance(c, pal[i]);
        if (d < nd)
        {
            nd = d;
            idx = i;
        }
    }
    return vec4(pal[idx], 1.0);
}

vec4 effect(vec4 color, Image tex, vec2 texcoord, vec2 pixcoord)
{
    <All my rendering code is here>
    InitPalette(); // So annoying that it needs to define the palette colors every time
    fincolor.rgb = EuclidDist(fincolor.rgb, palette).rgb;
    return fincolor;
}
First thing I've tried is changing it so the palette is just an image with all the colors in it, basically the same PNG files you can get from Lospec. I pass the image to the shader, and the following code is meant to do the same thing the original table-based code does. Except it doesn't work... (This code would replace the two lines in the middle of the above effect() function. The 32 is the color count in the palette; in the final code it would be passed to the shader along with the image so I could have variable palette sizes.)
Code:
vec4 pixcolor = Texel(ColorMap, vec2(0.0, 0.0));
int idx = 0;
float nd = distance(fincolor, pixcolor);
for (int i = 0; i < 32; ++i) {
    pixcolor = Texel(ColorMap, vec2(i / 32, 0.0));
    float d = distance(fincolor, pixcolor);
    if (d < nd)
    {
        nd = d;
        idx = i;
    }
}
fincolor = Texel(ColorMap, vec2(idx / 32, 0.0));
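In case it helps anyone searching later, my best guess at the culprit: in GLSL, `i / 32` with two integer operands is integer division, so it evaluates to 0 for every i from 0 to 31. Every Texel() call therefore samples the same leftmost texel, which matches the "idx always acts like 0" behavior exactly. A sketch of the same loop using float division and sampling at texel centers (still assuming a 32x1 palette strip with nearest-neighbor filtering on ColorMap):

```glsl
// Same loop with the integer division fixed. Assumes ColorMap is a 32x1
// palette strip and uses nearest-neighbor filtering.
int idx = 0;
vec4 pixcolor = Texel(ColorMap, vec2(0.5 / 32.0, 0.5));
float nd = distance(fincolor, pixcolor);
for (int i = 1; i < 32; ++i) {
    // float(i) + 0.5 samples the center of texel i; the original i / 32
    // is integer division and is always 0 in this range
    pixcolor = Texel(ColorMap, vec2((float(i) + 0.5) / 32.0, 0.5));
    float d = distance(fincolor, pixcolor);
    if (d < nd) {
        nd = d;
        idx = i;
    }
}
fincolor = Texel(ColorMap, vec2((float(idx) + 0.5) / 32.0, 0.5));
```

The half-texel offset matters too: if the image is left with linear filtering, sampling at texel edges can blend two adjacent palette entries together.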
I know it's overkill; I was just playing around. It samples the texture a lot: once to get the first color in the list, then once per palette entry to compare, then one last time to fetch the final color. But idx doesn't seem to work right, because it only ever returns the first color in the palette. The fact that it returns that color at all shows the sampling itself works, so I assume idx is always ending up as 0, and taking idx as 0 it just returns the first pixel in the image, even when the actual closest color is index #15 or something. Even if I change the starting index to 15, or 0.5 on the texture UV, it still returns the first color. lol
So: is there a way to fix this code to work right? Maybe I'm missing something easy. Or, alternatively, can I send a table of colors from Lua into the shader every frame instead, or would that be super slow? Or am I just stuck with maintaining large hard-coded palette tables, since those do work?
See, the issue is that palettes are going to be pretty big. The bigger the palette, the more colors and the more detail in the output picture. I've found I can comfortably get away with up to 256 colors without slowdown, but I can also go a lot lower; it'll all depend on the final project and how much color the textures have in them. (The project is a g3d-powered retro-styled FPS with low resolution and color depth.)
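On the "send a table of colors from Lua" idea: LÖVE's Shader:send does accept array uniforms, so the palette could live in a plain Lua table and only be re-sent when it actually changes, not every frame. A sketch, assuming the shader declares `uniform vec3 palette[32];` in place of the hard-coded InitPalette() table (the color values below are placeholders):

```lua
-- Lua side: build a palette table and upload it once (or whenever it changes).
-- Each entry becomes one vec3 in "uniform vec3 palette[32];" on the shader side.
local palette = {}
for i = 1, 32 do
    -- placeholder grayscale ramp; in practice these would be loaded
    -- from a Lospec PNG or a palette file at startup
    local v = (i - 1) / 31
    palette[i] = { v, v, v }
end

-- Shader:send takes each array element as a separate argument,
-- so the table has to be unpacked
shader:send("palette", unpack(palette))
```

Sending once on palette change should cost next to nothing compared to re-sending every frame, and it keeps the shader source free of hard-coded color tables.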