programs/wineconsole: Correctly display chars 00..1F and 7F
Kirill K. Smirnov
lich at math.spbu.ru
Wed Aug 8 05:32:44 CDT 2007
> > Is there something wrong with my patch? Did I miss something important?
> I'm not at all convinced that the conversion has to happen at that
> point. Do you have a test app that demonstrates this?
I've written a test application which writes control chars and performs a
readback for comparison.
Its behaviour heavily depends on the console font:
1) True Type (lucida). Control chars are displayed as square boxes,
ReadConsoleOutputCharacters[A|W] return codes [0..31].
2) Raster (fixedsys). Control chars are correctly displayed as glyphs,
ReadConsoleOutputCharacterA returns codes [0..31], ReadConsoleOutputCharacterW
returns Unicode glyphs.
It seems I stumbled upon the same strange, insane console behaviour I observed
while working on console codepages.
Since Wine renders fonts in its own way (via FreeType, AFAIK), I propose the
following:
1) Display control characters as glyphs (like fixedsys does).
2) ReadConsoleOutputCharacter[A|W] should both return characters without any
corruption - if I wrote [0..31] I expect [0..31] back, and if I wrote Unicode
glyphs, I expect Unicode glyphs to be returned (like TrueType does).
My patch assumes this behaviour.
Of course, I wanted to write a proper Wine test case, but that is impossible:
the default console font differs across Windows versions, so the test would fail.
-------------- next part --------------
A non-text attachment was scrubbed...
Size: 978 bytes
Desc: not available
Url : http://www.winehq.org/pipermail/wine-devel/attachments/20070808/cb8645ee/chars2.c