External data (and in bad cases our own) can contain invalid byte
sequences in UTF-8 strings. A prominent example is file names.
There was an off-by-one bug in calculating the allowed length of
multibyte chars, and the iterator was a bit too greedy when stumbling
upon invalid sequences: it returned a single "invalid" char covering
the whole sequence up to the point where the calculation detected the
error. Now we present one invalid char for the first byte of such a
sequence and then resume checking for a valid char at the next byte.
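
A minimal sketch of the new behavior (standalone Lua, not the actual
iterator code; the validation is simplified and skips edge cases such
as overlong encodings):

    -- Decode one UTF-8 char starting at byte position i.
    -- On an invalid sequence we consume ONLY the first byte and flag it,
    -- so decoding can resync at the very next byte.
    local function next_char(s, i)
        local c = s:byte(i)
        if not c then return nil end
        local len
        if c < 0x80 then len = 1
        elseif c >= 0xC2 and c <= 0xDF then len = 2
        elseif c >= 0xE0 and c <= 0xEF then len = 3
        elseif c >= 0xF0 and c <= 0xF4 then len = 4
        else
            return i + 1, s:sub(i, i), false -- invalid lead byte
        end
        for k = i + 1, i + len - 1 do
            local b = s:byte(k)
            if not b or b < 0x80 or b > 0xBF then
                return i + 1, s:sub(i, i), false -- truncated/invalid sequence
            end
        end
        return i + len, s:sub(i, i + len - 1), true
    end
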
This is a major overhaul of the hardware abstraction layer.
A few notes:
- General platform distinction happens in frontend/device.lua,
  which delegates everything else to
  frontend/device/<platform_name>/device.lua, which in turn should
  extend frontend/device/generic/device.lua (see the sketch after
  these notes).
- Screen handling is implemented in frontend/device/screen.lua,
  which includes the *functionality* to support device specifics.
  Actually setting up the device-specific functionality, however,
  is done in the device-specific setup code in the relevant
  device.lua file.
- The same goes for input handling.
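
A rough sketch of that delegation pattern (the platform probe below
is hypothetical; the real detection logic is more involved):

    -- frontend/device.lua (sketch): probe the platform once, then hand
    -- everything off to the platform module, which in turn extends
    -- frontend/device/generic/device.lua.
    local function probe_platform()
        if os.getenv("PRODUCT") then -- hypothetical check for Kobo
            return "kobo"
        end
        return "generic"
    end

    return require("device/" .. probe_platform() .. "/device")
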
Colors were a mixture of 4bpp integers (0=white, 15=black) and
fractional blackness levels (0=white, 1.0=black) before. This is
now unified to use the color specification of the Blitbuffer API.
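
For illustration, a unification shim might look like the following.
Blitbuffer.gray() taking a 0..1 lightness value is an assumption here;
check the Blitbuffer code for the actual constructor:

    local Blitbuffer = require("ffi/blitbuffer")

    -- Old style A: 4bpp integer blackness, 0 = white .. 15 = black.
    -- ASSUMPTION: Blitbuffer.gray() takes 0..1 lightness (1 = white).
    local function color_from_4bpp(c4)
        return Blitbuffer.gray(1 - c4 / 15)
    end

    -- Old style B: fractional blackness, 0 = white .. 1.0 = black.
    local function color_from_blackness(b)
        return Blitbuffer.gray(1 - b)
    end
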
rendertext.lua used addblitFrom() for rendering text - i.e. blitting
the letters to a BlitBuffer. However, it used intensity=1.0, which is
the same as doing a (faster, more efficient) blitFrom(). So use that
instead.
What was probably intended here is a different kind of blitting: using
the blitbuffer of the glyph as a mask.
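
The difference in calls, sketched with abbreviated arguments (the
masked variant at the end is hypothetical; the exact method name and
signature would come from the Blitbuffer API):

    -- addblitFrom() blends the source into the destination with the
    -- given intensity; at intensity=1.0 that is just a copy, so
    -- blitFrom() gives the same result with less work:
    bb:addblitFrom(glyph.bb, x, y, 0, 0, w, h, 1.0)
    bb:blitFrom(glyph.bb, x, y, 0, 0, w, h)

    -- What text rendering presumably wants is to treat the glyph's
    -- blitbuffer as an alpha mask for a foreground color, e.g.
    -- (hypothetical method name):
    --bb:colorblitFrom(glyph.bb, x, y, 0, 0, w, h, fgcolor)
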