Saving/Loading big map table


Saving/Loading big map table

Post by LXDominik »

Hi everyone!
I'm trying to save a generated map to a file and then load it back.
If the map is small, like 128x128, I have no problem saving/loading it using the bitser lib. But if the map is 8192x2048 I can't load it anymore; the application just closes, printing this to the console: '[Finished in 1.8s with exit code 3221225477]'.

Also, I'm doing the map generation in love.load, and the application is 'not responding' while the map generates; after the generation functions finish, it starts responding normally. Do I need to use something like a coroutine for generation, so I can show a progress bar and such? Or is there a better way to do this?

Re: Saving/Loading big map table

Post by zorg »

coroutines are fine, but you'll need to resume it from love.update, since love.load only gets called once at startup.
otherwise you could use actual threads, but that's more complicated.

as for the bitser thing, it may or may not have a bug with large amounts of data...
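To make that concrete, here's a minimal sketch of the pattern (the names startGeneration/resumeGeneration, the tile contents, and the chunk size are all made up for illustration): the generator coroutine yields every few thousand tiles, and you resume it once per frame.

```lua
-- Hypothetical chunked generation: the coroutine yields every `step`
-- tiles so the main loop can draw a progress bar between resumes.
local generator, progress, finishedMap

function startGeneration(w, h, step)
    progress = 0
    generator = coroutine.create(function()
        local map, done = {}, 0
        for y = 1, h do
            map[y] = {}
            for x = 1, w do
                map[y][x] = { type = 0 }  -- whatever your generator produces
                done = done + 1
                if done % step == 0 then
                    progress = done / (w * h)
                    coroutine.yield()     -- hand the frame back
                end
            end
        end
        progress = 1
        return map
    end)
end

-- Call this from love.update; draw `progress` in love.draw.
-- Returns nil while generating, and the finished map afterwards.
function resumeGeneration()
    if not generator or coroutine.status(generator) == "dead" then
        return finishedMap
    end
    local ok, map = coroutine.resume(generator)
    if not ok then error(map) end
    if map then finishedMap = map end  -- the final resume returns the map
    return finishedMap
end
```

With a step of a few thousand tiles per frame, an 8192x2048 map spreads its generation over a few seconds of frames instead of freezing the window.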

Re: Saving/Loading big map table

Post by LXDominik »

zorg wrote: Mon May 21, 2018 10:16 pm coroutines are fine, but you'll need to resume it from love.update, since love.load only gets called once at startup.
otherwise you could use actual threads, but that's more complicated.

as for the bitser thing, it may or may not have a bug with large amounts of data...
I've tried all the serializers listed on the love2d wiki; bitser was the only one that managed to save such a big table.
Do you have any suggestions for what I should try?

Re: Saving/Loading big map table

Post by pgimeno »

Try Smallfolk maybe? https://github.com/gvx/Smallfolk - it's not binary, though, so the size will be larger. But since it doesn't generate Lua source, it will probably be able to handle that table size.

Also consider the possibility that it's LuaJIT running out of memory. If that's the case, allocating through ffi.new instead of using tables would help, but you'll have to come up with your own serializer in that case.
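A minimal sketch of that ffi.new route (LuaJIT only; the getTile/setTile helpers and the uint8_t choice are illustrative assumptions, one byte per block type):

```lua
local ffi = require("ffi")  -- LuaJIT's FFI; not available in plain Lua

local W, H = 8192, 2048
-- One byte per tile, allocated outside the Lua GC heap (zero-filled).
local map = ffi.new("uint8_t[?]", W * H)

-- Flat, 0-based storage; x and y stay 1-based like Lua tables.
local function getTile(x, y) return map[(y - 1) * W + (x - 1)] end
local function setTile(x, y, v) map[(y - 1) * W + (x - 1)] = v end

setTile(10, 3, 7)

-- Serializing is then just copying the raw bytes:
local bytes = ffi.string(map, W * H)  -- pass this to love.filesystem.write
-- and loading back is a single copy:
ffi.copy(map, bytes, #bytes)
```

At one byte per tile the whole map is 16 MB instead of 64, and there is no per-tile table overhead for the GC to track.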

Re: Saving/Loading big map table

Post by KayleMaster »

If bitser happens to silently crash the game with large amounts of data, you probably need to increase its buffer size. That's what fixed it for me. @zorg

Re: Saving/Loading big map table

Post by grump »

What exactly do you store in the map?

Shameless plug for Blob, my binary serialization library. (De-)serialization of 2048 * 8192 ints:

Code:

local blob = Blob(nil, 2048 * 8192 * 4)
for y = 0, 2047 do
	for x = 1, 8192 do
		blob:writeU32(y * 8192 + x)
	end
end

local map = {}
for y = 0, 2047 do
	for x = 1, 8192 do
		map[y * 8192 + x] = blob:readU32()
	end
end
Doesn't crash, is fast enough not to stall (the above code takes ~240 ms in a VM on my laptop), and stores the data in a compact format.

It does also serialize tables, but tables that large would have a hefty overhead: one byte of overhead per table key, one byte per table value (except for booleans), and all numbers are stored as 64-bit values.

Re: Saving/Loading big map table

Post by LXDominik »

grump wrote: Tue May 22, 2018 8:24 am What exactly do you store in the map?

Shameless plug for Blob, my binary serialization library. (De-)serialization of 2048 * 8192 ints:

Code:

local blob = Blob(nil, 2048 * 8192 * 4)
for y = 0, 2047 do
	for x = 1, 8192 do
		blob:writeU32(y * 8192 + x)
	end
end

local map = {}
for y = 0, 2047 do
	for x = 1, 8192 do
		map[y * 8192 + x] = blob:readU32()
	end
end
Doesn't crash, is fast enough not to stall (the above code takes ~240 ms in a VM on my laptop), and stores the data in a compact format.

It does also serialize tables, but tables that large would have a hefty overhead: one byte of overhead per table key, one byte per table value (except for booleans), and all numbers are stored as 64-bit values.
In my map I store the type of a block: 'map[x][y].type = 0'.
But how do I save to / load from a file using this Blob lib?

Re: Saving/Loading big map table

Post by grump »

LXDominik wrote: Tue May 22, 2018 12:24 pm In my map I store the type of a block: 'map[x][y].type = 0'.
For optimal results, you should define a maximum value range for type, choose a matching data type (8 bits, 16 bits, 32 bits, 64 bits are possible) and use the corresponding Blob:write*/read* functions as shown above. I advise against serializing huge arrays with Blob:writeTable(), since it introduces overhead and is considerably slower than doing it manually.
But how do I save to / load from a file using this Blob lib?
Saving and loading is beyond the scope of the lib. You can either use

Code:

-- save
assert(love.filesystem.write("filename", blob:string()))

-- load
local blob = Blob(assert(love.filesystem.read("filename")))
or the Lua io functions (not suitable for .love files):

Code:

-- save
local file = assert(io.open("filename", "wb"))
file:write(blob:string())
file:close()

-- load
local file = assert(io.open("filename", "rb"))
local blob = Blob(file:read("*all"))
file:close()
See also the examples folder in the Blob repo.

Edit: it's designed to work with data in RAM - if your maps don't fit at least twice into memory (once for the raw data, and at least the same amount of RAM for the parsed data), it'll become harder to serialize them, because you have to implement chunk loading yourself. 8192 * 2048 ints need 64 MB of RAM, so that should not be a huge problem for you.

Re: Saving/Loading big map table

Post by LXDominik »

grump wrote: Tue May 22, 2018 12:53 pm
LXDominik wrote: Tue May 22, 2018 12:24 pm In my map I store the type of a block: 'map[x][y].type = 0'.
For optimal results, you should define a maximum value range for type, choose a matching data type (8 bits, 16 bits, 32 bits, 64 bits are possible) and use the corresponding Blob:write*/read* functions as shown above. I advise against serializing huge arrays with Blob:writeTable(), since it introduces overhead and is considerably slower than doing it manually.
So if I understand this correctly, when I do this:

Code:

    for y = 1, 2048 do
        for x = 1, 8192 do
                blob:writeU8(1)
        end
    end
I write a bunch of 1's, in order, like this: 1 1 1 ...
and when I do this:

Code:

    local map = {}
    for y=1,2048 do
        map[y] = {}
        for x=1,8192 do
            map[y][x] = blob:readU8()
        end
    end
I create a table with a 1 in each map[y][x], in the same order I wrote them in the first loop?

Re: Saving/Loading big map table

Post by grump »

Yes. Although using a table of tables is not the most efficient way to do this, it will work fine.

For maximum efficiency use a plain, one-dimensional array and index like this:

Code:

local map = {}
for y = 0, 2047 do
    for x = 1, 8192 do
        map[y * 8192 + x] = blob:readU8()
    end
end
Or even simpler:

Code:

local map = {}
for pos = 1, 2048 * 8192 do
	map[pos] = blob:readU8()
end

-- looking up an x, y coordinate
local type = map[(y - 1) * 8192 + x]
That's not Blob related though, just general performance advice. You'll get better cache hit rates when you do it like this.