Ought to be faster than our naive array-based approach,
especially for the glyph cache, which holds a fair number of elements
and sees mostly cache hits.
(There are few things worse for performance in Lua than table.remove and
table.insert at non-tail positions, which this was full of :/).
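For context, here's a minimal sketch of the hash-map + doubly-linked-list LRU idea (illustrative only, not the actual code): promoting or evicting an entry is O(1) pointer surgery, whereas the old array approach paid for table.remove()/table.insert() shifting elements on every hit.

```lua
-- Minimal LRU sketch: a hash map for lookups, a doubly-linked list for recency.
local LRU = {}
LRU.__index = LRU

function LRU.new(max_items)
    return setmetatable({
        max_items = max_items,
        count = 0,
        map = {},   -- key -> node
        head = nil, -- most recently used
        tail = nil, -- least recently used
    }, LRU)
end

local function unlink(self, node)
    if node.prev then node.prev.next = node.next else self.head = node.next end
    if node.next then node.next.prev = node.prev else self.tail = node.prev end
    node.prev, node.next = nil, nil
end

local function push_front(self, node)
    node.next = self.head
    if self.head then self.head.prev = node end
    self.head = node
    if not self.tail then self.tail = node end
end

function LRU:get(key)
    local node = self.map[key]
    if not node then return nil end
    unlink(self, node)   -- O(1): pointer surgery, no array shifting
    push_front(self, node)
    return node.value
end

function LRU:set(key, value)
    local node = self.map[key]
    if node then
        node.value = value
        unlink(self, node)
    else
        node = { key = key, value = value }
        self.map[key] = node
        self.count = self.count + 1
    end
    push_front(self, node)
    if self.count > self.max_items then
        local lru = self.tail
        unlink(self, lru)          -- evict the least recently used entry
        self.map[lru.key] = nil
        self.count = self.count - 1
    end
end

-- e.g.: local cache = LRU.new(3); cache:set("glyph:a", bb); cache:get("glyph:a")
```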
DocCache: New module; it's now an actual Cache instance instead of a
weird hack. Replaces "Cache" (the instance) as previously used across
Document & co.
It's the only Cache instance with on-disk persistence.
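Roughly what the new pattern looks like (the constructor fields below are assumptions for illustration, not the real argument names):

```lua
local Cache = require("cache")
local DataStorage = require("datastorage")

-- DocCache is just a configured Cache instance now, nothing special about it
-- except that it's the one allowed to persist its items to disk.
-- Field names are illustrative.
local DocCache = Cache:new{
    size = 64 * 1024 * 1024,   -- e.g. a 64MB RAM budget
    avg_itemsize = 64 * 1024,  -- rough average item size, to derive a slot count
    disk_cache = true,         -- the only instance with on-disk persistence
    cache_path = DataStorage:getDataDir() .. "/cache/",
}

return DocCache
```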
ImageCache: Update to new Cache.
GlyphCache: Update to new Cache.
Also, actually free glyph bbs on eviction.
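A rough sketch of that eviction-time cleanup (method and field names assumed): the glyph item gets an onFree handler so its BlitBuffer is released as soon as the LRU drops it, rather than whenever the GC gets around to it.

```lua
local CacheItem = require("cacheitem")

-- Hypothetical glyph cache item with an explicit free handler.
local GlyphCacheItem = CacheItem:new{}

function GlyphCacheItem:onFree()
    if self.bb then
        self.bb:free()  -- release the BlitBuffer's backing memory right now
        self.bb = nil
    end
end

return GlyphCacheItem
```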
* Minor updates to the min & max cache sizes (16 & 64MB). Mostly to satisfy my power-of-two OCD.
* Purge broken on-disk cache files
* Optimize free RAM computations
* Start dropping LRU items when running low on memory before pre-rendering (hinting) pages in non-reflowable documents.
* Make serialize dump the most recently *displayed* page, as the actual MRU item is the most recently *hinted* page, not the current one.
* Use more accurate item size estimations across the whole codebase.
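Illustrative example of what "more accurate" means here (field names assumed): derive a tile's footprint from its BlitBuffer geometry instead of a hard-coded guess.

```lua
-- Hypothetical helper: estimate a tile's real memory cost from its bb.
-- stride may be a uint64_t cdata (see the KoptInterface notes below),
-- hence the tonumber().
local function estimated_tile_size(bb)
    local pixel_bytes = tonumber(bb.stride) * bb.h  -- bytes per scanline * scanlines
    return pixel_bytes + 512                        -- small allowance for the Lua wrapper
end
```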
TileCacheItem:
* Drop lua-serialize in favor of Persist.
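Sketch of the swap, with the Persist usage paraphrased from memory (treat the exact constructor fields and codec name as assumptions):

```lua
local Persist = require("persist")

-- One store per on-disk cache file (path and codec are illustrative).
local store = Persist:new{
    path = "/tmp/koreader-tile.cache",  -- hypothetical location
    codec = "zstd",                     -- assumed codec name
}

-- Write a tile-like table out, then read it back on the next run.
store:save({ excerpt = { x = 0, y = 0, w = 600, h = 800 }, size = 480000 })
local restored = store:load()
```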
KoptInterface:
* Drop lua-serialize in favor of Persist.
* Make KOPTContext caching actually work by ensuring its hash is stable.
* stride is now a size_t
On some platforms, that's 64 bits, which means LuaJIT no longer automatically
converts it to a Lua number (it stays a boxed cdata, to avoid precision loss).
Do that conversion ourselves, because lua-serialize doesn't know how to handle
a uint64_t cdata ;).
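To illustrate that last point (a toy FFI struct, not the real KOPTContext):

```lua
local ffi = require("ffi")

-- Toy struct standing in for the C side handing us a size_t stride.
ffi.cdef[[
typedef struct { size_t stride; } demo_t;
]]

local d = ffi.new("demo_t")
d.stride = 4096

-- Where size_t is 64 bits, reading the field yields a boxed uint64_t cdata
-- instead of a plain Lua number; where it's 32 bits, it's already a number.
print(type(d.stride))  -- "cdata" on 64-bit platforms, "number" on 32-bit

-- Convert it ourselves before serializing; real strides are nowhere near
-- 2^53, so tonumber() is lossless here.
local stride = tonumber(d.stride)
```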