Maximum indexes in an array?
- Zilarrezko
- Party member
- Posts: 345
- Joined: Mon Dec 10, 2012 5:50 am
- Location: Oregon
Maximum indexes in an array?
Pretty straightforward question: does anyone know the maximum number of indices a table/array can hold? Or is it more a maximum amount of data a table can hold, or do they even have a limit?
Re: Maximum indexes in an array?
I don't think they have a maximum. I believe it's infinite.
Re: Maximum indexes in an array?
For most purposes, I think you can assume that a table can have as many indexed items as you want.
I believe there is some sort of hard-limit in terms of memory usage when a table goes above 4Gb, but that would likely take millions and millions of table entries.
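A quick way to see this in practice (a small sketch; `collectgarbage("count")` is standard Lua and reports the heap size in kilobytes):

```lua
-- Sketch: a Lua table grows until memory runs out, not until it hits an index cap.
local t = {}
for i = 1, 1000000 do
  t[i] = i
end
print(#t)                      -- 1000000 (no holes, so # is the element count)
print(collectgarbage("count")) -- current Lua heap size in KB; grows with the table
```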
- Zilarrezko
- Party member
- Posts: 345
- Joined: Mon Dec 10, 2012 5:50 am
- Location: Oregon
Re: Maximum indexes in an array?
Above 4 billion, or about 4,294,967,295, if it's unsigned and all 32 bits go toward the value. Though I'm not too sure about floats; I know that's true for integers. Maybe floats use some of their bits for information like decimal places, I'd have to look more into floats. Then again, I'm not sure what LuaJIT uses for floats, whether they're 32-bit or 64-bit or something.

BOT-Brad wrote:For most purposes, I think you can assume that a table can have as many indexed items as you want.
I believe there is some sort of hard-limit in terms of memory usage when a table goes above 4Gb, but that would likely take millions and millions of table entries.
But I don't want to pretend that I know this low-level stuff; it hurts my head to think in that way.
- Robin
- The Omniscient
- Posts: 6506
- Joined: Fri Feb 20, 2009 4:29 pm
- Location: The Netherlands
- Contact:
Re: Maximum indexes in an array?
Numbers in Lua are 64-bit floating point, so in theory you have ~52 bits, which is about a million times as many entries as Zilarrezko said. In any case, enough that you'll run out of RAM before "using up" a table's indices.
Help us help you: attach a .love.
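You can check the precision edge yourself (a minimal sketch; IEEE 754 doubles have a 53-bit significand, so integers are exact up to 2^53):

```lua
-- Doubles represent every integer exactly up to 2^53; past that, odd integers
-- can no longer be represented and arithmetic silently rounds.
print(string.format("%.0f", 2^53)) -- 9007199254740992
print(2^53 + 1 == 2^53)            -- true: 2^53 + 1 rounds back down to 2^53
print(2^53 + 2 == 2^53)            -- false: 2^53 + 2 is even, so it is representable
```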
- Zilarrezko
- Party member
- Posts: 345
- Joined: Mon Dec 10, 2012 5:50 am
- Location: Oregon
Re: Maximum indexes in an array?
Wowy, so with exactly 52 bits of storage... that's 4,503,599,627,370,495 indices? That's intense. Thanks for the insight, guys and/or girls.

Robin wrote:Numbers in Lua are 64-bit floating point, so in theory you have ~52 bits, which is about a million times as many entries as Zilarrezko said. In any case, enough that you'll run out of RAM before "using up" a table's indices.
- Jasoco
- Inner party member
- Posts: 3727
- Joined: Mon Jun 22, 2009 9:35 am
- Location: Pennsylvania, USA
- Contact:
Re: Maximum indexes in an array?
The thing you really have to watch out for is memory usage and garbage collection, like Robin said. Tables can have a lot of entries, but you'd probably hit a memory cap before you filled one up. And if the garbage piles up too high, Löve will just crash out. Of course, I only encounter this when working with 3D, which creates a lot of garbage.
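If garbage is the worry, standard Lua's `collectgarbage` can be nudged manually. A sketch of the kind of thing you might put in an update callback (the 64 MB threshold and the `update` function name are just example assumptions, not anything LÖVE mandates):

```lua
-- Sketch: manually stepping the GC when the heap gets large.
-- collectgarbage("count") and collectgarbage("step") are standard Lua;
-- the 64 MB threshold is an arbitrary example value.
local function update(dt)
  -- ... per-frame work that may produce garbage ...
  if collectgarbage("count") > 64 * 1024 then -- heap above ~64 MB ("count" is in KB)
    collectgarbage("step", 100)               -- run a bounded incremental GC step
  end
end
```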
Re: Maximum indexes in an array?
As mentioned, the theoretical limit is very high.

The practical limit is going to be lower, of course. Huge tables containing millions of elements are probably not the best for performance, because the GC has to traverse them (atomically!) in order to find any GCable objects they might be holding. Since the GC is incremental, it only needs to do this occasionally, but on the other hand it could happen in the middle of performance-sensitive code. Also, internally, tables are split into a hash part and an array part. The array part has a fairly low limit (IIRC around 2^24 entries) due to the bytecode format; any more than that and additional items start getting put into the hash part, which is a lot slower. The array optimization is also fairly conservative and easy to defeat if you aren't careful, so there's that. You probably won't need this much data anyway, but...
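To illustrate the dense-vs-sparse point (a sketch; the array/hash split is a LuaJIT/PUC-Lua internal you can't inspect from plain Lua, only feel through performance):

```lua
-- Dense, in-order integer keys (1..n with no holes) let the table keep them
-- in its fast internal array part.
local dense = {}
for i = 1, 1000 do
  dense[i] = i * i
end

-- A lone faraway key goes to the slower hash part instead, and the resulting
-- hole also makes the # operator's result undefined for this table.
local sparse = {}
sparse[1] = 1
sparse[1000000] = 2
```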
LuaJIT (since you mentioned it) FFI arrays are going to fare better for various reasons:
- the GC does not traverse them
- their size is not limited by the bytecode format (although if you don't malloc them, you may hit the annoyingly low memory limit)
- they are much easier for the jit to optimize
However:
- they are not compatible with standard Lua, although with appropriate duck typing (and abstraction) they could be substituted by userdata
- they cannot store Lua objects (directly, at least; numbers and booleans will work, and strings too, but those involve copies)
- there is no built-in bounds checking
- they are 0-based, which is of course inconsistent with Lua, so you may have to correct for that in your indexing, waste element 0 (and ensure you've accounted for the extra element), or just live with the inconsistency
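For completeness, a minimal FFI-array sketch (assumes LuaJIT; the `ffi` module is not part of standard Lua, hence the guarded require):

```lua
-- Sketch assuming LuaJIT: ffi.new with a "double[?]" VLA gives a
-- zero-initialized, 0-based, GC-managed C array.
local ok, ffi = pcall(require, "ffi") -- guard: standard Lua has no ffi module
if ok then
  local n = 1000
  local arr = ffi.new("double[?]", n) -- n doubles, zero-initialized, 0-based

  for i = 0, n - 1 do -- note: 0 to n-1, not 1 to n
    arr[i] = i * 0.5
  end

  print(arr[0], arr[n - 1]) -- no bounds checking: arr[n] would read out of bounds silently
end
```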