Optimization Stuff
Posted: Wed Dec 09, 2020 1:05 am
Hello All,
As my project grows, I wonder more and more how fast stuff is under the hood, and how I should write it to run as fast as possible.
My first question is: do tables get slower as they grow, or does it not matter because they're just C arrays and hash tables under the hood? To phrase it another way, can I trade memory for speed and put whatever methods I want directly on every entity in the game (say, 100 methods per entity), or will that cause a performance hit from searching through these huge tables (mostly associative, not numerically indexed)? I'm not counting constructors/rehashes, which are done during level generation in this 'hypothetical' case. There's a sketch of what I mean just below.
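Here's roughly the trade-off I'm picturing (hypothetical entity code, not my actual project):

-- Option A: copy every method onto each entity table itself.
-- Lookup is a single hash access, but every entity carries ~100 entries of its own.
local function makeEntityA(methods)
    local e = { x = 0, y = 0 }
    for name, fn in pairs(methods) do
        e[name] = fn
    end
    return e
end

-- Option B: keep methods in one shared table and reach them through __index.
-- Entities stay small, but each method call goes through the metatable.
local EntityMethods = {}
function EntityMethods:update(dt) self.x = self.x + dt end

local function makeEntityB()
    return setmetatable({ x = 0, y = 0 }, { __index = EntityMethods })
end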
Question #2: Since I'm using the standard class module for Lua, I think I'm creating a new metatable every time I derive a class from a base class, rather than having one metatable that all the derived classes share. Does searching six or seven metatables deep on a regular basis to find methods and properties drag down the speed? I suspect the answer is "not really, as long as the code doing these lookups is JIT-friendly". I've heard that metatable lookup is one of the fastest ways to do it, but I'm still a little curious, since the Lua book and the Reference Manual don't go into a ton of detail here; they have a lot to cover. A sketch of the kind of chain I mean follows.
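Something like this is what my class setup ends up doing, I think (hypothetical Actor/Monster/Goblin names, not my real classes): each derived class gets its own metatable whose __index points at the base, so a lookup may walk several tables before it finds the method.

local Actor   = {}
Actor.__index = Actor
function Actor:getName() return self.name end

local Monster   = setmetatable({}, { __index = Actor })
Monster.__index = Monster

local Goblin   = setmetatable({}, { __index = Monster })
Goblin.__index = Goblin

local g = setmetatable({ name = "grub" }, Goblin)
print(g:getName())  -- resolved by walking Goblin -> Monster -> Actor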
To rephrase more briefly:
1) Big tables bad?
2) Stacks of metatables: not cool?
Thank you for indulging my ramblings