pgimeno wrote: ↑Wed Aug 19, 2020 11:08 pm
Sometimes, Löve objects take a long time to be cleaned up by the GC, letting many of them accumulate and take a lot of memory. In these cases, it's best to explicitly release them with Object:release. I wonder if that's what's happening here.

Memory occupied by LÖVE objects does not count towards the LuaJIT memory limits. The object itself is counted, but not the memory allocated by the object, e.g. the memory required to store the pixels of an ImageData.
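For instance, a minimal sketch of the explicit release pgimeno suggests (the 2048x2048 ImageData is just an arbitrary example):
Code:
-- The pixel buffer is allocated by LÖVE, so it doesn't count towards
-- LuaJIT's Lua-heap limit; only the small wrapper object does.
local imageData = love.image.newImageData(2048, 2048)

-- ... use imageData ...

-- Free the pixel memory right away instead of waiting for the GC to get
-- around to collecting the wrapper; the object must not be used afterwards.
imageData:release()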
Memory Profiling?
Forum rules
Before you make a thread asking for help, read this.
Re: Memory Profiling?
Locals are collected automatically like pgimeno mentioned.
To keep your global space clean you can use strict.lua:
http://metalua.luaforge.net/src/lib/strict.lua.html
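Roughly what strict.lua buys you (a sketch; it assumes the file above is saved as strict.lua somewhere on your require path):
Code:
require("strict")

local player = { x = 0, y = 0 }  -- locals are unaffected and get collected as usual

-- With strict.lua loaded, touching an undeclared global raises an error
-- instead of silently creating it / returning nil:
-- score = 0      --> error: assign to undeclared variable 'score'
-- print(scroe)   --> error: variable 'scroe' is not declared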
Also, be careful when dealing with very large tables.
Re: Memory Profiling?
Thanks for the suggestion, I will give it a try.
I use this memory profiler at the moment:
https://gist.github.com/rm-code/383c98a ... 7ae536bcc5
What I do is run the game with 1 entity object, and re-run it with 10,000 entities.
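A rough sketch of that comparison with collectgarbage (Entity.new is a hypothetical stand-in for however your entities are created):
Code:
-- collectgarbage("count") reports the Lua-managed heap in kilobytes.
local function kbPerEntity(n)
    collectgarbage("collect")
    local before = collectgarbage("count")
    local entities = {}
    for i = 1, n do
        entities[i] = Entity.new()  -- hypothetical constructor
    end
    collectgarbage("collect")
    local after = collectgarbage("count")
    return (after - before) / n
end

print(("~%.2f KB per entity"):format(kbPerEntity(10000)))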
- zorg
- Party member
- Posts: 3468
- Joined: Thu Dec 13, 2012 2:55 pm
- Location: Absurdistan, Hungary
- Contact:
Re: Memory Profiling?
Xii wrote: ↑Tue Aug 18, 2020 11:59 pm
If you are using Lua tables to store game information, like npc.health=100 or map[pos]=tile_id or whatever, you can significantly reduce your memory consumption by using FFI. Here is an article detailing how to do that. Using FFI you can define compact memory structures for values that simply do not need the full 64-bit range that regular Lua values have.

FFI is still bound to LuaJIT, so that will still have the same memory limitations... but yes, you can store data more optimally with C types (and it may also help those that would want to use a certain CE-acronym'd app to play the game their own way).
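A minimal sketch of what Xii describes (the field names and widths are made up for illustration):
Code:
local ffi = require("ffi")

-- One entity is 8 bytes here, versus a whole Lua table per entity.
ffi.cdef([[
typedef struct {
    uint16_t health;
    uint16_t tile_id;
    int16_t  x, y;
} Entity;
]])

local MAX_ENTITIES = 10000
local entities = ffi.new("Entity[?]", MAX_ENTITIES)  -- one contiguous allocation

entities[0].health  = 100
entities[0].tile_id = 7
entities[0].x, entities[0].y = 12, 34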
PGimeno explained better than i could, but the gist is:
To summarize, the 1GB limit is a limitation of the Linux kernel combined with the LuaJIT garbage collector. It only applies to objects within the LuaJIT state and can be overcome by using malloc, which allocates outside the lower 32-bit address space. Also, it's possible to use the x86 build on x64 in 32-bit mode and get access to the full 4GB.
Me and my stuff True Neutral Aspirant. Why, yes, i do indeed enjoy sarcastically correcting others when they make the most blatant of spelling mistakes. No bullying or trolling the innocent tho.
Re: Memory Profiling?
Memory allocated with FFI (via ffi.new or something like ffi.C.malloc) is like memory allocated by Löve: it doesn't count towards LuaJIT's garbage collection count (as grump clarified, sorry if I wasn't too clear). It also doesn't count against LuaJIT's memory limit, which applies to luajit-managed objects only. The pointers [edit: and some metadata associated with the types, not with the objects themselves, like the strings for the field names if you're using fields] would be the only thing that counts against that limit, but pointer size is typically negligible in comparison with the structures.
Last edited by pgimeno on Thu Aug 20, 2020 10:59 pm, edited 1 time in total.
- zorg
- Party member
- Posts: 3468
- Joined: Thu Dec 13, 2012 2:55 pm
- Location: Absurdistan, Hungary
- Contact:
Re: Memory Profiling?
Huh, weird; for some reason i was under the impression they did count towards that limit... i guess i'm wrong then.
Me and my stuff True Neutral Aspirant. Why, yes, i do indeed enjoy sarcastically correcting others when they make the most blatant of spelling mistakes. No bullying or trolling the innocent tho.
Re: Memory Profiling?
You are surely mistaken. C structs allocated with ffi.new definitely show up in my collectgarbage("count").
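A quick way to check that yourself (a sketch; the exact numbers will vary):
Code:
local ffi = require("ffi")

collectgarbage("collect")
print("before:", collectgarbage("count"))            -- KB tracked by LuaJIT's GC

local buf = ffi.new("uint8_t[?]", 16 * 1024 * 1024)  -- 16 MB cdata array via ffi.new

print("after: ", collectgarbage("count"))            -- jumps by roughly 16 * 1024 KB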
Re: Memory Profiling?
Yes, memory allocated with ffi.new counts towards the GC limit. Memory allocated with malloc does not.
Code:
local ffi = require("ffi")

ffi.cdef([[
void* malloc(size_t);
void free(void*);
]])

-- Manually allocate memory outside LuaJIT's GC-managed heap and register
-- free() as a finalizer, so the block is released when the pointer is collected.
local size = 64 * 1024 * 1024 -- e.g. 64 MB; this allocation does not count towards the GC limit
local memory = ffi.gc(ffi.C.malloc(size), ffi.C.free)
Another way is love.data.newByteData. A nice advantage of ByteData is that you can share it between threads and pass the object over a Channel, unlike cdata. A pointer to the raw memory can be obtained with :getFFIPointer().
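A small sketch of that route (it needs LÖVE 11.3+ for :getFFIPointer(); the 1 MB size is arbitrary):
Code:
local ffi = require("ffi")

local bytedata = love.data.newByteData(1024 * 1024)          -- 1 MB allocated by LÖVE
local ptr = ffi.cast("uint8_t*", bytedata:getFFIPointer())   -- raw pointer for Lua-side access

ptr[0] = 255   -- read/write like a C array; valid indices are 0 .. size - 1

-- Keep `bytedata` referenced while `ptr` is in use, or the buffer can be
-- collected out from under the pointer. The ByteData object itself (not the
-- pointer) is what you push through a love.thread Channel.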
Re: Memory Profiling?
What if I use love multithreading? Am I correct that every love thread creates a separate Lua environment? So for every thread I could use up to 1 GB (or 2 GB/4 GB, no big difference)? That way I could utilize a decent amount of the memory a modern PC has for plain Lua objects.
Personally, I've faced that problem several times with different tasks in different projects.
For example, I tried to parse some heavy JSON files with Lua in LÖVE (not really that heavy, just several files of about 50 MB each).
BTW, I use gcinfo() to get the current amount of memory. Maybe collectgarbage("count") is better, as the internet says; I haven't tried it myself.
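For what it's worth, they report the same thing; gcinfo() is just the older call:
Code:
-- Both return LuaJIT-managed memory in kilobytes; collectgarbage("count")
-- also includes the fractional part.
print(gcinfo(), collectgarbage("count"))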
LuaJIT 2.1 (not a fork, the official one) supports much more memory, circa 48 bits of address space if my recollection is right. I don't think it's included with Löve yet, though.

Are there any rumors about when it will be included? That would be really nice.
- slime
- Solid Snayke
- Posts: 3166
- Joined: Mon Aug 23, 2010 6:45 am
- Location: Nova Scotia, Canada
- Contact:
Re: Memory Profiling?
LuaJIT's hard memory limit is per process rather than per Lua instance within a process unfortunately, so spreading memory across multiple threads won't raise the limit.
That being said, most memory use in games tends to be things like textures and other large bits of data, most of which doesn't count towards LuaJIT's Lua-allocated memory limit when you use love objects. There are tons of ways to store large collections of data in memory allocated by love rather than by Lua, and doing so often comes with performance benefits as well.
As for when LuaJIT 2.1 will be included: love 12.
If you're blowing past the 1-2GB limit by doing that, I suspect there are useful optimizations you can do to your own code to drastically limit peak memory usage. For example depending on how you structure string concatenation within a loop, it can use a lot more temporary memory than you might expect.
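To illustrate the concatenation point (a generic sketch, not code from the game in question):
Code:
-- Naive: every iteration builds a brand-new intermediate string, so peak
-- temporary memory grows far beyond the size of the final string.
local s = ""
for i = 1, 100000 do
    s = s .. i .. ","
end

-- Cheaper on temporaries: collect the pieces and join once at the end.
local parts = {}
for i = 1, 100000 do
    parts[#parts + 1] = tostring(i)
end
local s2 = table.concat(parts, ",")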