[v2 PATCH] d2d1: Partially implement StrokeContainsPoint() for rectangles

Henri Verbeet hverbeet at gmail.com
Wed Sep 6 07:21:40 CDT 2017


On 6 September 2017 at 12:35, Nikolay Sivov <nsivov at codeweavers.com> wrote:
> -static void d2d_point_normalise(D2D1_POINT_2F *p)
> +static float d2d_point_normalise(D2D1_POINT_2F *p)
>  {
>      float l;
>
>      if ((l = sqrtf(d2d_point_dot(p, p))) != 0.0f)
>          d2d_point_scale(p, 1.0f / l);
> +
> +    return l;
>  }
Maybe. I'm not entirely convinced you need this; see below.

> +static BOOL d2d_rect_point_on(const D2D1_POINT_2F *point, float tolerance, const D2D1_POINT_2F *points)
> +{
> +    unsigned int i;
> +    float dot;
> +
> +    for (i = 0; i < 4; i++)
> +    {
It may improve readability to pull that loop out into the caller, and
make this function just about testing whether a point is on a line
segment.
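Something roughly like the following is what I have in mind;
d2d_point_on_line_segment() is just a name I made up, and the clamped
projection below is only one way of doing the distance test:

static BOOL d2d_point_on_line_segment(const D2D1_POINT_2F *q,
        const D2D1_POINT_2F *p0, const D2D1_POINT_2F *p1, float tolerance)
{
    D2D1_POINT_2F v_p, v_q;
    float len_sq, t;

    /* v_p is the segment direction, v_q points from p0 to the probe point. */
    v_p.x = p1->x - p0->x;
    v_p.y = p1->y - p0->y;
    v_q.x = q->x - p0->x;
    v_q.y = q->y - p0->y;

    /* Clamp the projection of v_q onto the segment to [0, 1], so the
     * end-points are handled without a separate vertex test. */
    len_sq = d2d_point_dot(&v_p, &v_p);
    t = len_sq == 0.0f ? 0.0f : d2d_point_dot(&v_q, &v_p) / len_sq;
    if (t < 0.0f)
        t = 0.0f;
    else if (t > 1.0f)
        t = 1.0f;

    /* Squared distance from q to the closest point on the segment. */
    v_q.x -= t * v_p.x;
    v_q.y -= t * v_p.y;

    return d2d_point_dot(&v_q, &v_q) <= tolerance * tolerance;
}

with the loop over the four edges living in the caller:

    for (i = 0; i < 4; ++i)
    {
        if (d2d_point_on_line_segment(point, &points[i], &points[(i + 1) % 4], tolerance))
            return TRUE;
    }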

> +        float edge_length, distance;
> +        D2D1_POINT_2F v, edge;
> +
> +        v.x = point->x - points[i].x;
> +        v.y = point->y - points[i].y;
d2d_point_subtract(&v, point, &points[i]);
Or, alternatively, d2d_point_subtract(&v_q, q, p0);
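(Assuming a d2d_point_subtract() helper; if geometry.c doesn't have one
yet, it would be trivial to add:

static void d2d_point_subtract(D2D1_POINT_2F *out,
        const D2D1_POINT_2F *a, const D2D1_POINT_2F *b)
{
    out->x = a->x - b->x;
    out->y = a->y - b->y;
}
)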

> +        /* Point to vertex distance. */
> +        if (d2d_point_dot(&v, &v) < tolerance * tolerance)
> +            return TRUE;
Maybe. An argument could probably be made for testing joins separately.

> +        /* Point to edge distance. */
> +        edge.x = points[(i + 1) % 4].x - points[i].x;
> +        edge.y = points[(i + 1) % 4].y - points[i].y;
d2d_point_subtract(&edge, &points[(i + 1) % 4], &points[i]);
Or, d2d_point_subtract(&v_p, p1, p0);

> +        if ((edge_length = sqrtf(d2d_point_dot(&edge, &edge))) == 0.0f)
> +            continue;
I guess we'd want to introduce d2d_point_length().
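I.e., something along the lines of:

static float d2d_point_length(const D2D1_POINT_2F *p)
{
    return sqrtf(d2d_point_dot(p, p));
}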

> +        distance = fabsf(edge.x * v.y - edge.y * v.x) / edge_length;
n.x = -v_p.y;
n.y = v_p.x;
distance = fabsf(d2d_point_dot(&v_q, &n)) / edge_length;

> +        d2d_point_normalise(&edge);
> +        dot = d2d_point_dot(&edge, &v);
> +        if (dot >= 0.0f && dot <= edge_length)
> +            return TRUE;
If you compare against "edge_length * edge_length", you can drop the
normalisation.
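Without the normalisation the projection simply picks up a factor of
edge_length, i.e. something like:

        dot = d2d_point_dot(&edge, &v);
        if (dot >= 0.0f && dot <= edge_length * edge_length)
            return TRUE;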

> +static BOOL d2d_rect_point_inside(const D2D1_POINT_2F *point, const D2D1_POINT_2F *points)
> +{
> +    D2D1_POINT_2F vec[2];
> +    unsigned int i;
> +    float dot;
> +
> +    for (i = 0; i < 2; i++)
> +    {
> +        float l;
> +
> +        vec[0].x = points[i + 1].x - points[i].x;
> +        vec[0].y = points[i + 1].y - points[i].y;
> +        vec[1].x = point->x - points[i].x;
> +        vec[1].y = point->y - points[i].y;
> +
> +        if ((l = d2d_point_normalise(&vec[0])) == 0.0f)
> +            return FALSE;
> +
> +        if ((dot = d2d_point_dot(&vec[0], &vec[1])) < 0.0f || dot > l)
> +            return FALSE;
> +    }
> +
> +    return TRUE;
> +}
Some of the comments on d2d_rect_point_on() apply here as well.
Perhaps more importantly, does this work with shear transformations?
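For reference, for a convex quad an orientation/half-plane test along
these lines would work for arbitrary affine transforms;
d2d_quad_contains_point() is just an illustrative name, not something
from the patch:

static BOOL d2d_quad_contains_point(const D2D1_POINT_2F *points, const D2D1_POINT_2F *q)
{
    BOOL pos = FALSE, neg = FALSE;
    unsigned int i;

    /* For a convex quad, q is inside iff it lies on the same side of all
     * four edges; allowing either sign keeps this independent of the
     * winding, which a transform may flip. */
    for (i = 0; i < 4; ++i)
    {
        const D2D1_POINT_2F *p0 = &points[i], *p1 = &points[(i + 1) % 4];
        float cross = (p1->x - p0->x) * (q->y - p0->y) - (p1->y - p0->y) * (q->x - p0->x);

        if (cross > 0.0f)
            pos = TRUE;
        if (cross < 0.0f)
            neg = TRUE;
        if (pos && neg)
            return FALSE;
    }

    return TRUE;
}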

> +    if (transform)
> +    {
> +        D2D1_POINT_2F outer[4], inner[4];
> +
> +        stroke_width *= 0.5f;
> +
> +        d2d_point_transform(&inner[0], transform, rect->left + stroke_width, rect->top + stroke_width);
> +        d2d_point_transform(&inner[1], transform, rect->left + stroke_width, rect->bottom - stroke_width);
> +        d2d_point_transform(&inner[2], transform, rect->right - stroke_width, rect->bottom - stroke_width);
> +        d2d_point_transform(&inner[3], transform, rect->right - stroke_width, rect->top + stroke_width);
> +
> +        d2d_point_transform(&outer[0], transform, rect->left - stroke_width, rect->top - stroke_width);
> +        d2d_point_transform(&outer[1], transform, rect->left - stroke_width, rect->bottom + stroke_width);
> +        d2d_point_transform(&outer[2], transform, rect->right + stroke_width, rect->bottom + stroke_width);
> +        d2d_point_transform(&outer[3], transform, rect->right + stroke_width, rect->top - stroke_width);
This seems suspicious. In general, the stroke width isn't affected by
transformations on the geometry. Test coverage for that would be nice
in either case.
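E.g., something along these lines in dlls/d2d1/tests/d2d1.c; the specific
rectangle, transform and probe point are made up, and the trace() would
become an ok() once we know what native returns:

static void test_rectangle_geometry_stroke_transform(void)
{
    static const D2D1_MATRIX_3X2_F scale_10 = {10.0f, 0.0f, 0.0f, 10.0f, 0.0f, 0.0f};
    static const D2D1_RECT_F rect = {0.0f, 0.0f, 10.0f, 10.0f};
    static const D2D1_POINT_2F point = {103.0f, 50.0f};
    ID2D1RectangleGeometry *geometry;
    ID2D1Factory *factory;
    BOOL contains;
    HRESULT hr;

    hr = D2D1CreateFactory(D2D1_FACTORY_TYPE_SINGLE_THREADED, &IID_ID2D1Factory,
            NULL, (void **)&factory);
    ok(SUCCEEDED(hr), "Failed to create factory, hr %#x.\n", hr);
    hr = ID2D1Factory_CreateRectangleGeometry(factory, &rect, &geometry);
    ok(SUCCEEDED(hr), "Failed to create geometry, hr %#x.\n", hr);

    /* The right edge of the rectangle ends up at x = 100.0f after the
     * transform. With a stroke width of 1.0f, x = 103.0f is only inside
     * the stroke if the stroke width scales along with the geometry. */
    hr = ID2D1RectangleGeometry_StrokeContainsPoint(geometry, point, 1.0f, NULL,
            &scale_10, 0.0f, &contains);
    ok(SUCCEEDED(hr), "Unexpected hr %#x.\n", hr);
    trace("contains %#x.\n", contains);

    ID2D1RectangleGeometry_Release(geometry);
    ID2D1Factory_Release(factory);
}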


