[PATCH v2 2/4] gdi32: Avoid best fit chars when mapping unicode character to a glyph index.

Huw Davies huw at codeweavers.com
Mon Dec 21 06:07:15 CST 2020


On Tue, Dec 15, 2020 at 03:04:46PM +0300, Dmitry Timoshkov wrote:
> This part fixes GetGlyphIndices.
> 
> Signed-off-by: Dmitry Timoshkov <dmitry at baikal.ru>
> ---
>  dlls/gdi32/font.c       | 2 +-
>  dlls/gdi32/tests/font.c | 2 --
>  2 files changed, 1 insertion(+), 3 deletions(-)
> 
> diff --git a/dlls/gdi32/font.c b/dlls/gdi32/font.c
> index 00bbeb0a11..9380d3efbe 100644
> --- a/dlls/gdi32/font.c
> +++ b/dlls/gdi32/font.c
> @@ -3223,7 +3223,7 @@ static DWORD CDECL font_GetGlyphIndices( PHYSDEV dev, const WCHAR *str, INT coun
>                  if (str[i] >= 0xf020 && str[i] <= 0xf100) glyph = str[i] - 0xf000;
>                  else if (str[i] < 0x100) glyph = str[i];
>              }
> -            else if (WideCharToMultiByte( physdev->font->codepage, 0, &str[i], 1,
> +            else if (WideCharToMultiByte( physdev->font->codepage, WC_NO_BEST_FIT_CHARS, &str[i], 1,
>                                            &ch, 1, NULL, &used ) && !used)
>                  glyph = (unsigned char)ch;
>          }

Hi Dmitry,

I suspect this should wait until after 6.0.  What app does this fix?

Huw.
