Discussion:
Adaptive Subdivision of Bezier Curves
McSeem
2005-07-01 06:17:30 UTC
Permalink
Hi,

I'm the author of the Anti-Grain Geometry library, http://antigrain.com
and I would like to share my new article with you:
http://antigrain.com/research/adaptive_bezier/index.html
It may look kind of naive, but I have never seen a fast and perfect
method of Bezier approximation. Even Adobe's engine fails in some cases.
This is an attempt to achieve a perfect result. I would appreciate your
comments, suggestions, and criticism.

McSeem
http://antigrain.com
Just d' FAQs
2005-07-01 13:52:55 UTC
Post by McSeem
I'm the author of the Anti-Grain Geometry library, http://antigrain.com
http://antigrain.com/research/adaptive_bezier/index.html
It may look kind of naive, but I have never seen a fast and perfect
method of Bezier approximation. Even Adobe's engine fails in some cases.
This is an attempt to achieve a perfect result. I would appreciate your
comments, suggestions, and criticism.
Looks like you've been working hard, but without adequate literature.
Many of the issues you tackle have been discussed here, and certainly
in the professional literature.

Let me first correct a claim you make:

But this situation is solved by forcing the subdivision the
first time. There is a proof that after one division by half
a cubic curve cannot have two inflection points, neither a
loop, so, this approach is safe.

A cubic curve can never have two inflection points, so that claim is
vacuously safe; but a midpoint subdivision is *not* a safe way to
eliminate a loop. What is true is that we can choose a subdivision
point that cuts the loop, just not at the midpoint. We can also cut an
"S" shape (which has no inflection, only extremes) to separate the two
changes in direction (the two bends of the "S").
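For concreteness, the midpoint split itself (de Casteljau at t = 1/2) takes only additions and halvings. A minimal sketch, with illustrative names rather than code from any of the libraries under discussion:

```cpp
#include <cassert>

struct Pt { double x, y; };

static Pt mid(Pt a, Pt b) { return { (a.x + b.x) / 2, (a.y + b.y) / 2 }; }

// Split cubic (p1,p2,p3,p4) at t = 1/2 into left[4] and right[4].
// The shared point left[3] == right[0] lies on the curve.
void split_half(Pt p1, Pt p2, Pt p3, Pt p4, Pt left[4], Pt right[4])
{
    Pt p12 = mid(p1, p2), p23 = mid(p2, p3), p34 = mid(p3, p4);
    Pt p123 = mid(p12, p23), p234 = mid(p23, p34);
    Pt p1234 = mid(p123, p234);           // curve point at t = 1/2
    left[0]  = p1;    left[1]  = p12;  left[2]  = p123; left[3] = p1234;
    right[0] = p1234; right[1] = p234; right[2] = p34;  right[3] = p4;
}
```

Whether such a split simplifies the geometry is exactly the question in dispute; the split itself is cheap either way.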

Paul Bourke's code is needlessly expensive for direct evaluation; a
form of Horner's rule can be used for better speed. Even so, other
methods are faster.
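To illustrate the Horner point: convert the Bezier once to power-basis coefficients, after which each evaluation costs three multiply-adds per coordinate. A sketch (the coefficient formulas are the standard Bernstein-to-power conversion; the names are illustrative, not Bourke's code):

```cpp
#include <cassert>

// One coordinate of a cubic Bezier with controls p1..p4,
// rewritten in the power basis: a t^3 + b t^2 + c t + d.
struct Cubic { double a, b, c, d; };

Cubic to_power(double p1, double p2, double p3, double p4)
{
    return {
        -p1 + 3*p2 - 3*p3 + p4,   // a
        3*p1 - 6*p2 + 3*p3,       // b
        -3*p1 + 3*p2,             // c
        p1                        // d
    };
}

// Horner's rule: ((a t + b) t + c) t + d  --  three mul-adds.
double horner(const Cubic& q, double t)
{
    return ((q.a * t + q.b) * t + q.c) * t + q.d;
}
```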

The professional literature includes a careful study of adaptive
forward differencing, which combines some of the best features of both
an incremental approach and an adaptive approach.

Lien, Shantz, Pratt. Adaptive forward differencing for rendering
curves and surfaces. SIGGRAPH '87. ISBN 0-89791-227-6.
<http://portal.acm.org/citation.cfm?id=37416>

When using a flatness criterion, notice that a fat curve can amplify
the visibility of any error, so the tolerance might better be adjusted
based on width, not just thin-curve pixel errors.

A well-known flatness test is both cheaper and more reliable than the
ones you have tried. The essential observation is that when the curve
is a uniform speed straight line from end to end, the control points
are evenly spaced from beginning to end. Therefore, our measure of how
far we deviate from that ideal uses distance of the middle controls,
not from the line itself, but from their ideal *arrangement*. Point 2
should be half-way between points 1 and 3; point 3 should be half-way
between points 2 and 4.

This, too, can be improved. Yes, we can eliminate the square roots in
the distance tests, retaining the Euclidean metric; but the taxicab
metric is faster, and also safe. The length of displacement (x,y) in
this metric is |x|+|y|.

Detecting loops and cusps is a classical problem in algebraic geometry
that is also addressed specifically in computer graphics literature:

Stone, DeRose. A geometric characterization of parametric cubic
curves. ACM TOG v8 n3, 1989, pp. 147-163. ISSN 0730-0301.
<http://portal.acm.org/citation.cfm?id=77056>



But if we skip straight to the bottom line, past all this prior work
on the bits and pieces, the crucial paper to read is:

Fabris, Forrest. High Quality Rendering of Two-Dimensional
Continuous Curves. SIBGRAPI '97.
<http://mirror.impa.br/sibgrapi97/anais/ART18/>

Also follow the reference trail connected with that paper, works it
cites and any works citing it.
Don Lancaster
2005-07-01 16:03:59 UTC
Post by Just d' FAQs
[snip]
http://www.tinaja.com/cubic01.asp
--
Many thanks,

Don Lancaster
Synergetics 3860 West First Street Box 809 Thatcher, AZ 85552
voice: (928)428-4073 email: ***@tinaja.com

Please visit my GURU's LAIR web site at http://www.tinaja.com
McSeem
2005-07-01 17:43:48 UTC
Thank you for your comments.

First of all, thank you for your correction; you are right here. My
lack of math knowledge leads to such misbeliefs.

Yes, I have seen all the papers you refer to; the most interesting of
them is this one:
Post by Just d' FAQs
Lien, Shantz, Pratt. Adaptive forward differencing for rendering
curves and surfaces. SIGGRAPH '87. ISBN 0-89791-227-6.
<http://portal.acm.org/citation.cfm?id=37416>
However, I found out that the incremental forward differencing approach
can't be improved considerably. I agree it can handle the deviation
error perfectly by the distance criterion, but there's no way to draw a
good-looking fat stroke.
I might be mistaken, but as far as I understand you can only use some
constant coefficient that defines the maximal deviation. So, for a fat
stroke you have to increase the number of points along the *entire
curve*. You can't do that locally, at sharp turns. So, a distance
criterion alone is not enough, and we simply have to add an *angle*
criterion as soon as we want to draw a fat curve. You are absolutely
right that a wide stroke amplifies the visibility of the errors, so the
deviation from the ideal curve must be much smaller at sharp turns.
This is the whole idea: to decrease the error considerably at sharp
turns.

Initially I used only the angle criterion, but it doesn't work well on
flat parts, so I also use a more traditional criterion.
Post by Just d' FAQs
A well-known flatness test is both cheaper and more reliable than the
ones you have tried. The essential observation is that when the curve
is a uniform speed straight line from end to end, the control points
are evenly spaced from beginning to end. Therefore, our measure
of how far we deviate from that ideal uses distance of the middle
controls, not from the line itself, but from their ideal *arrangement*.
Point 2 should be half-way between points 1 and 3; point 3 should
be half-way between points 2 and 4.
First of all, it doesn't seem to be cheaper or more reliable. The
curve can be very flat, like this one:
[image]
It requires only 9 points, with a maximal deviation of 0.12 of a
pixel. That's quite enough. Your criterion (the well-known flatness
test) will produce more points.
Also, it's not obvious how you define the value of "being half-way".
In terms of Manhattan distance? That works only if points 1, 2, 3 are
about collinear, and in the common case they are not.
It's very interesting, so could you explain it more and provide some
pseudo-code?

I claim the following.
1. The line-point distance criterion calculated in the Euclidean metric
is the best way to estimate the deviation from the ideal curve, simply
because it's always proportional to the deviation. I agree it isn't the
cheapest way, but on modern processors the difference between my method
and the most efficient one is not that considerable. A floating-point
multiplication is about as fast as an FP addition.
2. At sharp turns a wide stroke amplifies visible errors, so the
approach with only a distance criterion isn't enough. My method allows
you to increase the accuracy locally, so that the distance error at
sharp turns is smaller by a factor of tens, hundreds, or even thousands.
3. My approach is actually a combination of different conditions, not
just one method. It isn't mathematically beautiful, but it's the kind
of engineering approach that works well in practice.

McSeem
http://antigrain.com
Just d' FAQs
2005-07-02 00:59:15 UTC
Post by McSeem
And it's not obvious how you define the value of "being half-way". In
terms of Manhattan distance? But it works only if points 1,2,3 are
about collinear. In the common case they are not.
It's very interesting, so could you explain it more and provide some
pseudo-code?
Really, it's as simple and reliable as stated. With points P1...P4,
the midpoint between P1 and P3 is (P1+P3)/2, so we compute a vector

dev2 = (P1+P3)/2 - P2

Better still, double it.

dd2 = P1 + P3 - 2 P2

This has an x component and a y component. Take their absolute values
and sum them. Likewise, compute the deviation of P3:

dd3 = P2 + P4 - 2 P3

Compute its Manhattan length as well, and sum the two. That's what we
use for our flatness test.

A variation, not quite so cheap, is to expect P2 to be 1/3 of the way
from P1 to P4, and P3 to be 2/3.

dev2 = (2 P1 + P4)/3 - P2
dev3 = (P1 + 2 P4)/3 - P3

Tripled, these give

td2 = 2 P1 + P4 - 3 P2
td3 = P1 + 2 P4 - 3 P3

As you discovered, collinearity is not a safe and reliable test; this
is. We could still use a Euclidean norm, but the Manhattan value will
be within a constant factor of it, which makes it safe.
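Putting the dd2/dd3 version together as one self-contained routine (a sketch under the description above; the names are illustrative):

```cpp
#include <cassert>
#include <cmath>

struct Pt { double x, y; };

// Doubled deviation of the middle controls from their "flat" positions
// (dd2 = P1 + P3 - 2 P2, dd3 = P2 + P4 - 2 P3), each measured in the
// taxicab metric |x| + |y| and summed.  The result is zero exactly
// when the controls are evenly spaced along a straight line.
double flatness_dd(Pt p1, Pt p2, Pt p3, Pt p4)
{
    double dd2x = std::fabs(p1.x + p3.x - 2*p2.x);
    double dd2y = std::fabs(p1.y + p3.y - 2*p2.y);
    double dd3x = std::fabs(p2.x + p4.x - 2*p3.x);
    double dd3y = std::fabs(p2.y + p4.y - 2*p3.y);
    return dd2x + dd2y + dd3x + dd3y;
}
```

A subdivision loop would recurse while this value exceeds the (suitably scaled) tolerance.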


Again, however, the SIBGRAPI paper and those it cites (such as the
doctoral dissertation) study the issues in far greater detail. It is
hard to believe you could have absorbed those contents and still have
the difficulties against which you fought. Perhaps you should entice
Fabris to take an interest in the Anti-Grain Geometry project.
McSeem
2005-07-02 03:51:53 UTC
Thanks for the explanations!
Post by Just d' FAQs
Really, it's as simple and reliable as stated. With points P1...P4,
the midpoint between P1 and P3 is (P1+P3)/2, so we compute a vector...
I have tried your methods (those very "well-known" criteria), and here
are the results.
Maybe I did something wrong, but my code is as follows (for the sake
of clarity I use many variables).
dd2+dd3:

    double dd2x = fabs(x1 + x3 - 2 * x2);
    double dd2y = fabs(y1 + y3 - 2 * y2);
    double dd3x = fabs(x2 + x4 - 2 * x3);
    double dd3y = fabs(y2 + y4 - 2 * y3);
    double dist = dd2x + dd2y + dd3x + dd3y;
    if(dist <= m_distance_tolerance)
    {
        m_points.add(point_type(x23, y23));
        return;
    }

td2+td3:

    double td2x = fabs(2 * x1 + x4 - 3 * x2);
    double td2y = fabs(2 * y1 + y4 - 3 * y2);
    double td3x = fabs(x1 + 2 * x4 - 3 * x3);
    double td3y = fabs(y1 + 2 * y4 - 3 * y3);
    double dist = td2x + td2y + td3x + td3y;
    if(dist <= m_distance_tolerance)
    {
        m_points.add(point_type(x23, y23));
        return;
    }
Please correct me if there are mistakes.

The results:
Rather flat curves:
1. dd2+dd3: [image]
2. td2+td3: [image]
3. Incremental: [image]
4. AGG Subdivision: [image]

You see, keeping the same maximal deviation, the number of points is
very different.
I consider the 4th picture close to ideal. It gives about the same
error at different scales while keeping a minimal number of points. I
totally agree my method is much more expensive, and besides, we have
to take special care of the collinear cases. But your estimation works
about the same as the simple incremental method, as I suspected.
The main point of my method is that it can be combined with the
*angle* error estimation. The trick here is to handle cusps, where the
angle condition will theoretically never occur. As practice shows, the
collinearity check protects us from infinite recursion quite well. In
my method there is a ready collinearity estimation, while in yours
it's not obvious how to get one. Since we can't get the collinearity
value, we can't easily use the angle estimation, because it will
result in infinite recursion on degenerate cases.

There's another case, a curve with a loop:
1. dd2+dd3: [image]
2. td2+td3: [image]
3. Incremental: [image]
4. AGG Subdivision: [image]

I turned off my *angle* condition to make the chances equal.
Post by Just d' FAQs
Again, however, the SIBGRAPI paper and those it cites (such as the
doctoral dissertation) study the issues in far greater detail. It is
hard to believe you could have absorbed those contents and still have
the difficulties against which you fought. Perhaps you should entice
Fabris to take an interest in the Anti-Grain Geometry project.
Thanks for the suggestion. However, my work differs considerably from
all those papers. Once again, it's mostly a matter of how to draw a
relatively thick equidistant outline (a stroke) while keeping it
looking nice, and how NOT to increase the total number of points
dramatically.

I suspect you blame me for some ignorance in math, and I agree. What I
did is just engineering research that works quite well in practice
within the defined conditions.

Thanks again, I really appreciate your comments!

McSeem
http://antigrain.com
Just d' FAQs
2005-07-02 14:33:46 UTC
Post by McSeem
I have tried your methods (that very "well known" criteria) and here's
the results.
The methods are not mine; I'm merely passing them on. The most direct
ancestor is GhostScript, but the observations have a long history
before that.
Post by McSeem
Please correct me if there are mistakes.
Looks correct. One fine point we cannot judge from the snippet is that
it may be suboptimal to use the same numeric distance tolerance for
all the methods. Consider a "circle" in Manhattan versus Euclidean:
the former crosses a diagonal at (0.5,0.5); the latter, at (0.7,0.7).
But both cross the x axis at (1,0). This may be worth tuning.
Post by McSeem
You see, keeping the same maximal deviation, the number of points is
very different.
I consider the 4th picture close to ideal. It gives about the same
error at different scales while keeping a minimal number of points. I
totally agree my method is much more expensive, and besides, we have
to take special care of the collinear cases. But your estimation works
about the same as the simple incremental method, as I suspected.
The main point of my method is that it can be combined with the
*angle* error estimation. The trick here is to handle cusps, where the
angle condition will theoretically never occur. As practice shows, the
collinearity check protects us from infinite recursion quite well. In
my method there is a ready collinearity estimation, while in yours
it's not obvious how to get one. Since we can't get the collinearity
value, we can't easily use the angle estimation, because it will
result in infinite recursion on degenerate cases.
There is a subtle logic behind the difference. Consider a case where
the first three control points are identical, and the last elsewhere.
This is a straight line, for which two points (the ends) suffice. Or
is it? The *parameterization* is not uniform, and for certain uses
(maybe not yours), that becomes visible. For example, if the curve is
not a solid stroke but a dash pattern, or if we animate motion along
the curve, we may expect to see the parameterization.

The tests I describe include collinearity, but are more stringent. The
central controls are tested against the line at specific positions. If
they are close to those positions, they are nearly collinear. But yes,
even far from their ideal positions they can still be collinear.
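The degenerate case is easy to check numerically: with the first three controls coincident at P1, the cubic collapses to C(t) = P1 + t^3 (P4 - P1), so at t = 1/2 the point is only 1/8 of the way along the "line". A small sketch (illustrative names, one coordinate is enough):

```cpp
#include <cassert>

// One coordinate of a cubic Bezier, evaluated by de Casteljau.
double bezier1d(double p1, double p2, double p3, double p4, double t)
{
    double p12  = p1  + t*(p2 - p1);
    double p23  = p2  + t*(p3 - p2);
    double p34  = p3  + t*(p4 - p3);
    double p123 = p12 + t*(p23 - p12);
    double p234 = p23 + t*(p34 - p23);
    return p123 + t*(p234 - p123);
}
```

With controls 0, 0, 0, 8 the halfway parameter lands at 1 (one eighth of the span), not at 4, which is the non-uniformity a dash pattern or animation would make visible.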
Post by McSeem
Post by Just d' FAQs
Again, however, the SIBGRAPI paper and those it cites (such as the
doctoral dissertation) study the issues in far greater detail. It is
hard to believe you could have absorbed those contents and still have
the difficulties against which you fought. Perhaps you should entice
Fabris to take an interest in the Anti-Grain Geometry project.
Thanks for the suggestion. However, my work differs considerably from
all those papers. Once again, it's mostly a matter of how to draw a
relatively thick equidistant outline (a stroke) keeping it looking
nice. And how NOT to increase the total number of points dramatically.
The focus of the Fabris and Forrest paper is antialiasing, which you
do not consider at all (though perhaps you will). But they cite a
paper by Klassen, "Drawing antialiased cubic spline curves", that is
more closely related to your stated intent. And a web search for who
cites that turns up another paper which may be of interest.

<http://dev.gentoo.org/~spyderous/tutorials/cairo/>
Post by McSeem
I suspect you blame me for some ignorance in math, and I agree. What I
did is just engineering research that works quite well in practice
within the defined conditions.
Not at all. I'm suggesting that you might spare yourself some effort
and also improve your results with a deeper use of the literature.
Sometimes a single paper provides a "silver bullet" solution to the
problem we are tackling. More often we have to dig, understand, and
synthesize new solutions from the ideas we find there.

One of the peculiar tensions in finding novel solutions is between the
benefits of doing everything ourselves (ab initio) versus studying
prior art. Each has benefits and risks. We better understand a topic
if we discover the facts for ourselves, and are more likely to form a
novel view. Yet we also benefit from the insights of our predecessors
and can build rather than repeat. (Also, academia and patent offices
insist that we not duplicate prior work.)

In reading your page, I saw more evidence for ab initio work than for
use of the literature. Perhaps that's an artifact of presentation, but
it seemed an obvious gap to highlight.
McSeem
2005-07-02 17:09:05 UTC
First of all, thank you for continuing the discussion. It's very
interesting and I really do appreciate your comments.
Post by Just d' FAQs
Looks correct. One fine point we cannot judge from the snippet is that
it may be suboptimal to use the same numeric distance tolerance for
the former crosses a diagonal at (0.5,0.5); the latter, at (0.7,0.7).
But both cross the x axis at (1,0). This may be worth tuning.
Well, I refer to the estimation of the *actual* deviation of the
resulting polyline from the ideal curve. This is namely the Euclidean
distance calculated along the local normal to the curve. This
estimation minimizes the number of points without exceeding the error.
In the Manhattan metric it's much less accurate, although I agree it's
very fast, simple, and 100% robust.
Post by Just d' FAQs
There is a subtle logic behind the difference. Consider a case where
the first three control points are identical, and the last elsewhere.
This is a straight line, for which two points (the ends) suffice. Or
is it?
In all *strictly* collinear cases my method behaves about the same as
yours; that is, it produces more points than necessary. But in my
opinion that's not so important, because strictly collinear cases are
very rare. As soon as a control point deviates from the straight line
even by 0.001 pixel, the method works well. My point is that strictly
collinear cases are extremely rare in practice, while *almost*
collinear ones happen quite often. Actually, we only need to handle
the following collinear cases:
2---1---4---3, 1---2---4---3, and 2---1---3---4
Like in this picture:
[image]
I just simplified the whole thing, not detecting these cases and
assuming that strictly collinear cases are rare. In the case of
1---2---3---4 it will do the same as in the above cases, although it
would be enough to draw just the line 1---4.
Post by Just d' FAQs
The *parameterization* is not uniform, and for certain uses
(maybe not yours), that becomes visible. For example, if the curve is
not a solid stroke but a dash pattern, or if we animate motion along
the curve, we may expect to see the parameterization.
There's no problem with dashed lines in AGG:
[image]
As you can see, uniformity isn't important at all. What is important
is that the *tangent* estimation becomes valuable in cases of very
thick dashed lines; otherwise the calculated local normals will have a
considerable (and unpredictable) error.
Post by Just d' FAQs
The focus of the Fabris and Forrest paper is antialiasing, which you
do not consider at all (though perhaps you will). But they cite a
paper by Klassen, "Drawing antialiased cubic spline curves", that is
more closely related to your stated intent. And a web search for who
cites that turns up another paper which may be of interest.
<http://dev.gentoo.org/~spyderous/tutorials/cairo/>
Yes, I know Cairo, but we are not competitors. It only *seems* that
it's all about the same; essentially AGG and Cairo are quite
different.

As for anti-aliasing, I do that differently. I didn't carefully study
the papers you refer to, but I read them quickly and found that
drawing anti-aliased curves directly is too restrictive. I use a
scanline rasterizer approach that calculates exact pixel coverage
values. As far as I know it's one of the fastest algorithms, and it
produces the best possible result. First it was used in libArt by Raph
Levien; then David Turner (FreeType) improved it considerably by
switching to integer fixed-point calculations; then I rewrote it in
C++, and David gave me permission to release it under a separate
license with a reference.
This algorithm works perfectly with subpixel accuracy (1/256), but it
works with straight line segments only.
First of all, I have my doubts whether it's cheap and accurate to
calculate the coverage of pixel 2 in this picture:
[image]
David's algorithm handles it perfectly if we approximate the curve
with line segments (with subpixel accuracy, of course), no matter how
short the line segments are. And it works amazingly accurately and
fast. That's especially important when rendering glyphs.

Once again, even if we can draw anti-aliased curves (lines and fills)
directly, it's still too restrictive. You can't easily calculate an
equidistant to the curve, because in general the exact equidistant of
a curve is a curve of a *higher degree*! With the approach where you
rasterize curves directly, you can't even draw text with a "false
bold" font:
[image]
As you can see, it's perfectly anti-aliased and smooth. I do that in a
very simple way: approximating the curves with line segments and then
calculating the contour. It's also quite easy to apply non-linear
transformations this way:
[image]
When drawing Bezier curves directly, it's not obvious how to apply
non-linear transformations to the curve.

In other words, I want to be able to use the result of my curve
approximation in many different ways, not only to render the curves
directly. This is the whole point of the generalization.

McSeem
http://antigrain.com
Tony
2005-07-01 18:30:24 UTC
Post by Just d' FAQs
A cubic curve can never have two inflection points, so that claim is
vacuously safe
Detecting loops and cusps is a classical problem in algebraic geometry
Stone, DeRose. A geometric characterization of parametric cubic
Interestingly enough, the very same paper talks about and shows an
example of a cubic curve with 2 inflection points.
Just d' FAQs
2005-07-02 00:29:11 UTC
Post by Tony
Interestingly enough, the very same paper talks about and shows an
example of a cubic curve with 2 inflection points.
My mistake. I was thinking of a function cubic,

f(t) = a t^3 + b t^2 + c t + d

For an inflection, the second derivative must be zero.

f''(t) = 6a t + 2b

This being a linear function, it can cross zero only once.

But the major point still holds: subdivision at the midpoint is not a
reliable way to simplify the curve.
Tony
2005-07-02 19:20:58 UTC
Post by Just d' FAQs
But the major point still holds: subdivision at the midpoint is not a
reliable way to simplify the curve.
Let me be a bit extra strict here: the major point does not hold,
since you haven't proved anything. You just stated something to be
true.

In the same way, the opposite statement doesn't hold either. Without a
proof, that is.

I would tend to agree with you; just playing with the curve and the
curve equation, I could not spot anything that would guarantee that a
loop would always be removed by one subdivision at parameter value
t=1/2. But I am not that good at math, so this doesn't prove anything.

However, reading the Stone-DeRose article again, I managed to (almost)
convince myself that, at least in one case (a loop with the
intersection point at the t=0 endpoint), it is not possible that the
loop won't be cut after the first subdivision.

Proving it for me (so far) involves looking at their diagram of the
loop region for a cubic curve in 'canonical' form, so this claim can
be regarded as unsubstantiated and as vacuous as any other. Until some
mathematician jumps into the discussion, or until somebody provides a
reference, I can only comment that it is also possible that McSeem's
statement is true.
Przemyslaw Koprowski
2005-07-03 08:14:20 UTC
Post by Tony
I would tend to agree with you - just playing with the curve and curve
equation I could not spot anything that would guarantee that loop would
always be removed by one subdivision at parameter value t=1/2. But I am
not that good at math so this doesn't prove anything.
Proving it for me (so far) involves looking at their diagram of the
loop region for a cubic curve in 'canonical' form, so this claim can
be regarded as unsubstantiated and as vacuous as any other. Until some
mathematician jumps into the discussion, or until somebody provides a
reference, I can only comment that it is also possible that McSeem's
statement is true.
OK, here goes a (sketch of a) proof. Take a cubic Bezier C with a
loop. We don't assume anything about the loop's location. By a simple
linear reparameterization (t |--> t/3) we can "extend" the Bezier
segment to a segment C1 such that C is its parametric first third. The
control points of the new segment are:
P_0, 3*P_1-2*P_0, 9*P_2-12*P_1+4*P_0, 27*P_3-54*P_2+36*P_1-8*P_0
(where P_0, P_1, P_2, P_3 are the control points of C).
Splitting C1 at the parametric half obviously won't cut the loop,
because the whole of C (and hence the loop) lies inside the first half
of C1.
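The substitution can be checked numerically: with those control points, C1(t/3) should reproduce C(t). A quick sketch using a single scalar coordinate (illustrative names):

```cpp
#include <cassert>
#include <cmath>

// Bernstein evaluation of a scalar cubic Bezier.
double bez(double q0, double q1, double q2, double q3, double t)
{
    double s = 1.0 - t;
    return s*s*s*q0 + 3*t*s*s*q1 + 3*t*t*s*q2 + t*t*t*q3;
}

// Extend controls P0..P3 so that the new segment C1 contains the
// old one as its parametric first third: C(t) == C1(t/3).
void extend_by_3(double p0, double p1, double p2, double p3, double q[4])
{
    q[0] = p0;
    q[1] = 3*p1 - 2*p0;
    q[2] = 9*p2 - 12*p1 + 4*p0;
    q[3] = 27*p3 - 54*p2 + 36*p1 - 8*p0;
}
```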

Przemek
--
"Beauty is the first test: there is no permanent place
in the world for ugly mathematics." G.H. Hardy
s***@ca.inter.net
2005-07-03 17:25:59 UTC
If you are restricting yourself to the 2D cubic bezier case, here's a
reference to a paper I wrote describing a method to quickly determine
the geometric form of the curve. Source code is available from that
link.

http://www.acm.org/jgt/papers/Vincent02/

HTH

Stephen Vincent
Just d' FAQs
2005-07-03 17:24:26 UTC
Post by Tony
Let me be a bit extra strict here - the major point does not hold since
you haven't proved anything. You just stated something to be true.
This is a bit silly. If any statement needed proof, it would be the
one asserting that a bisection *does* work. No proof was given, and
none can be given, because the claim is false, which is what I said.

I didn't offer a proof since my statement was not disputed; in fact,
it's fairly obvious when raised. A single counterexample is enough to
show bisection is unreliable, or consider the following construction.

Take a curve with whatever geometry you like, cusp or inflections or
whatever. Instead of subdividing it, double it. That is, let the
parameter range from 0 to 2 instead of 0 to 1, then create the control
points for that curve. Now when we halve the doubled curve, one half
is the original, which retains all the geometry in question. Therefore
midpoint bisection is an unreliable way to simplify the geometry, and
the original assertion on the AGG page is discredited. (A more careful
statement, allowing splitting at a *well chosen point*, is a different
matter.)

We can use a de Casteljau blossom algorithm to compute the new control
points, in much the same way as we would evaluate. Let points 1,...,4
be the given controls, blossom values 000, 001, 011, 111. Our new
points will be blossom values 000, 002, 022, 222. The de Casteljau
scheme is

000 001 011 111
\ / \ / \ /
002 012 112
\ / \ /
022 122
\ /
222

More explicitly, the de Casteljau weights will be -1 and 2, like so:

p002 = 2 p001 - p000
p012 = 2 p011 - p001
p112 = 2 p111 - p011

p022 = 2 p012 - p002
p122 = 2 p112 - p012

p222 = 2 p122 - p022



EXAMPLE
It's pretty easy to make a symmetric curve with a loop, a bit trickier
to produce a double inflection. Since the loop is easier to confirm
visually, let's try one. Take as control points

(-1,0) (3,2) (-3,2) (1,0)

Doubled, this becomes

(-1,0) (7,4) (-25,0) (63,-12)


COMMENT
For most readers, probably the most interesting part of this inquiry
is the idea that a de Casteljau evaluation can be used to *extend* a
Bezier curve, not just subdivide it.
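The doubling recipe above can be checked directly against the example (a sketch with illustrative names):

```cpp
#include <cassert>

struct Pt { double x, y; };

static Pt ext(Pt a, Pt b) { return { 2*b.x - a.x, 2*b.y - a.y }; } // 2b - a

// Double a cubic Bezier: blossom values 000,002,022,222 computed from
// 000,001,011,111 with weights (-1, 2) at each de Casteljau step.
void double_curve(const Pt p[4], Pt q[4])
{
    Pt p002 = ext(p[0], p[1]);   // 2 p001 - p000
    Pt p012 = ext(p[1], p[2]);   // 2 p011 - p001
    Pt p112 = ext(p[2], p[3]);   // 2 p111 - p011
    Pt p022 = ext(p002, p012);   // 2 p012 - p002
    Pt p122 = ext(p012, p112);   // 2 p112 - p012
    Pt p222 = ext(p022, p122);   // 2 p122 - p022
    q[0] = p[0]; q[1] = p002; q[2] = p022; q[3] = p222;
}
```

Halving the doubled curve then returns the original controls, which is the counterexample's point.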
Tony
2005-07-03 20:49:11 UTC
Eye-opening explanation indeed!
Do you know if there is anything preventing cubic curves from having
multiple loops, for example? IOW, if one half has a certain geometry,
does it restrict the geometry of the other half, and how?
Jerzy Karczmarczuk
2005-07-03 22:40:47 UTC
Tony
Post by Tony
Do you know if there is anything preventing cubic curves from having
multiple loops, for example? IOW, if one half has certain geometry,
does it and how does it restrict the other half of the geometry?
Look for yourself.
Each coordinate x(t), y(t), z(t) (if you need the third dimension) is
a cubic polynomial in t.

A cubic polynomial may have at most two extrema.

How could you produce multiple loops?

Jerzy Karczmarczuk
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Tony
2005-07-04 01:07:57 UTC
Permalink
Ah! Right! It is easy to visualize for a planar curve. One would need
three extrema for 2 loops. But I admit I am unable to extend the
argument to 3D so as to make it as obvious as in the 2D case.
Przemyslaw Koprowski
2005-07-04 07:45:28 UTC
Permalink
Post by Tony
Do you know if there is anything preventing cubic curves from having
multiple loops, for example? IOW, if one half has certain geometry,
does it and how does it restrict the other half of the geometry?
Yes: the genus (http://mathworld.wolfram.com/CurveGenus.html) of the
(implicitization of the) curve.

Przemek
--
"Beauty is the first test: there is no permanent place
in the world for ugly mathematics." G.H. Hardy
Tony
2005-07-04 13:42:51 UTC
Permalink
So a cubic, having genus 0, cannot have both a loop and a cusp either.
I had read about that and wondered how one proves it. Thanks!
Tony
2005-07-04 15:15:49 UTC
Permalink
Do you know how one would figure out the value of the symbol 'm',
called the 'curve class', for a cubic curve? I couldn't find anything
obvious on MathWorld, and a Google search just confused me further
with more definitions based on more undefined symbols/concepts. In
fact, I'm not even sure that the genus is 0, but it seemed to follow
from one definition on some Web page.

Tony
"Math would be less confusing and difficult if only those people who
understand it would be the ones teaching it and explaining it"
Przemyslaw Koprowski
2005-07-04 19:30:52 UTC
Permalink
Post by Tony
Do you know how would one figure out what is the value of symbol 'm'
called 'curve class' for a cubic curve?
Degree of a dual curve.
Post by Tony
So a cubic, having genus 0
I'm not even sure that genus is 0 but it seemed to follow from
one definition on some Web page.
Sort of. The key point here is that we have a *parametric* cubic.

However, if you wish to continue this discussion I would suggest
moving it to news://sci.math, where you will probably find more
interested posters.

If you wish to learn more by yourself I would suggest reading
something on algebraic geometry. Miles Reid "Undergraduate
Algebraic Geometry" (http://isbn.nu/0521356628)
is a good starting point. For more in-depth treatment
Robin Hartshorne "Algebraic geometry" (http://isbn.nu/0387902449)
is a must-read.

Przemek
--
"Beauty is the first test: there is no permanent place
in the world for ugly mathematics." G.H. Hardy
Just d' FAQs
2005-07-04 15:19:42 UTC
Permalink
Post by Tony
Do you know if there is anything preventing cubic curves from having
multiple loops, for example? IOW, if one half has certain geometry,
does it and how does it restrict the other half of the geometry?
A cubic Bezier curve is a piece of a cubic polynomial curve, and as
such, even a small portion determines the entire infinite curve. With
a properly chosen set of four control points we can delimit as large
or as small a piece as we wish.

Specifically with respect to multiple loops, the question can now be
asked with respect to a piece, or for any cubic. The curve cannot loop
unless the control polyline loops, and it is clearly impossible for a
four-point polyline to loop twice. Bezier curves are called variation
diminishing, because the curve cannot cross any line more times than
the control polyline; we could employ this for a more formal proof.
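That variation-diminishing bound is easy to test numerically: count the sign
changes of the control points' signed distances to the line. A small
illustrative sketch (the helper name is mine, not from any library):

```python
def max_line_crossings(ctrl, a, b):
    """Upper bound on curve/line crossings for a Bezier curve, from the
    variation-diminishing property: the curve cannot cross the line
    through a and b more often than its control polyline does."""
    def side(p):  # signed area; > 0 left of a->b, < 0 right of it
        return (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0])
    signs = [side(p) for p in ctrl]
    return sum(1 for s, t in zip(signs, signs[1:]) if s * t < 0)

# The looped curve from the earlier example, against the line y = 1:
print(max_line_crossings([(-1, 0), (3, 2), (-3, 2), (1, 0)], (0, 1), (1, 1)))
# → 2
```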

Also, we can implicitize a parametric cubic curve to produce a cubic
polynomial incorporating that curve in implicit form. A classic result
from algebraic geometry says that a line cannot contact an implicit
curve more times than its degree. These contacts include multiplicity,
so that a tangent contact counts twice, as does a double point. If a
curve has two loops, then it has two double points where it crosses
itself. A line through those double points contacts the curve four
times, which is impossible for a cubic.

Implicit cubics are a strictly larger class of curves than parametric
cubics, even when we allow rational parametrics. In fact, if a curve
in implicit form has a parametric form, then somewhere on it (or more
precisely, on its complex projective version) we must be able to make
a double contact with any line. Sweeping the line while maintaining
the double contact produces a moving third point of intersection, and
thus parameterizes the implicit curve.
Tony
2005-07-04 16:50:37 UTC
Permalink
Post by Just d' FAQs
Also, we can implicitize a parametric cubic curve to produce a cubic
polynomial incorporating that curve in implicit form.
Thanks for making me discover the very interesting topic of
implicitization. That was a stumbling block for me: connecting the
argument about extrema in parametric form to the implicit form.

Tony
McSeem
2005-07-03 01:33:08 UTC
Permalink
Post by Just d' FAQs
But the major point still holds: subdivision at the midpoint is not a
reliable way to simplify the curve.
You are right. Here's the case:
378, 439, 378, 497, 487, 432, 14, 338
Tony
2005-07-03 05:03:03 UTC
Permalink
And here is my attempt to prove it:
When a curve has a loop it intersects itself, so two points on the
curve coincide. Say one point has parameter value t1 and the other t2.
The loop can be made arbitrarily small. Since the parameter t always
either increases or decreases along the curve, the difference t2-t1
can also be made arbitrarily small. Take the endpoint t1=0 to be the
first point. It follows that t2 can be made arbitrarily small, in
particular smaller than 1/2. Thus the midpoint will not cut the loop.
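For anyone who wants to experiment with such cases, midpoint subdivision
itself is a few lines of de Casteljau. A generic sketch (not McSeem's AGG
implementation), applied to the failing case quoted above:

```python
def split_cubic(p0, p1, p2, p3, t=0.5):
    """de Casteljau subdivision of a cubic Bezier at parameter t,
    returning the control points of the two halves."""
    lerp = lambda a, b: tuple((1 - t) * ai + t * bi for ai, bi in zip(a, b))
    p01, p12, p23 = lerp(p0, p1), lerp(p1, p2), lerp(p2, p3)
    p012, p123 = lerp(p01, p12), lerp(p12, p23)
    m = lerp(p012, p123)  # the point on the curve at parameter t
    return (p0, p01, p012, m), (m, p123, p23, p3)

# McSeem's problem case: 378,439  378,497  487,432  14,338
left, right = split_cubic((378, 439), (378, 497), (487, 432), (14, 338))
```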
Turbo
2005-07-01 14:09:41 UTC
Permalink
Post by McSeem
Hi,
I'm the author of the Anti-Grain Geometry library, http://antigrain.com
http://antigrain.com/research/adaptive_bezier/index.html
It can look kind of naive, but I have never seen a fast and perfect
method of Bezier approximation. Even Adobe engine fails in some cases.
This is an attempt to achieve perfect result. I would appreciate your
comments, suggestions, and criticism.
McSeem
http://antigrain.com
In my recent framework (www.amanith.org) I've implemented a
non-recursive method for cubic Bézier curves, based on parabolic
approximation; it takes care of cusps and flexes.
It does adaptive flattening, using chordal distance as the "maximum
permitted variation" threshold.
Maybe you can have a look at it.

Regards.
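For comparison, the simplest common form of that idea is recursive
subdivision with a chordal-distance stopping test. The sketch below is my own
generic version (Amanith's actual method is non-recursive and parabolic-based,
so this only illustrates the stopping criterion):

```python
def flatten_cubic(p0, p1, p2, p3, tol=0.25, depth=0, out=None):
    """Adaptive flattening by recursive midpoint subdivision: stop when
    both inner control points are within tol of the chord p0-p3."""
    if out is None:
        out = [p0]
    dx, dy = p3[0] - p0[0], p3[1] - p0[1]
    chord = (dx * dx + dy * dy) ** 0.5
    def dist(p):  # distance from p to the chord (or to p0 if degenerate)
        if chord < 1e-12:
            return ((p[0]-p0[0])**2 + (p[1]-p0[1])**2) ** 0.5
        return abs((p[0]-p0[0]) * dy - (p[1]-p0[1]) * dx) / chord
    if depth >= 16 or (dist(p1) <= tol and dist(p2) <= tol):
        out.append(p3)
        return out
    lerp = lambda a, b: (0.5 * (a[0]+b[0]), 0.5 * (a[1]+b[1]))
    p01, p12, p23 = lerp(p0, p1), lerp(p1, p2), lerp(p2, p3)
    p012, p123 = lerp(p01, p12), lerp(p12, p23)
    m = lerp(p012, p123)
    flatten_cubic(p0, p01, p012, m, tol, depth + 1, out)
    flatten_cubic(m, p123, p23, p3, tol, depth + 1, out)
    return out

pts = flatten_cubic((0, 0), (0, 100), (100, 100), (100, 0), tol=1)
```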
h***@fho-emden.de
2005-07-01 17:12:20 UTC
Permalink
Maxim,

thanks for your investigation.
It will take some time to check & honor your achievements.

In order to stabilize a printable version I would recommend
to convert the HTML pages into PDF.

Best regards --Gernot Hoffmann
Andrey Kuznetsov
2005-07-01 19:26:43 UTC
Permalink
Hi McSeem,

I like your method.
It looks very good and is understandable to me.
And it comes at the right moment ;-)
(just now I am implementing Bezier drawing in Java;
many thanks to Gernot Hoffmann,
without his help I would be lost in the math).

Keep up the good work!

Andrey
--
http://uio.imagero.com Unified I/O for Java
http://reader.imagero.com Java image reader
http://jgui.imagero.com Java GUI components and utilities
new - http://forum.imagero.com
Richard J Kinch
2005-07-06 18:25:31 UTC
Permalink
Post by McSeem
I would appreciate your
comments, suggestions, and criticism.
I've discussed here in the past how to locate inflection points, cusps, and
horizontal/vertical tangents. This lets you first minimally divide a curve
into normalized pieces, and then attack the subdivision without any special
cases.
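One of those normalization steps, locating the inflection points, reduces to
a quadratic, since the cubic terms of the curvature numerator cancel. A
hedged sketch of that computation (my own derivation and helper name, not
Kinch's posted code):

```python
def inflection_params(p0, p1, p2, p3, eps=1e-12):
    """Parameters t in (0,1) where the curvature numerator
    x'(t)y''(t) - y'(t)x''(t) of a cubic Bezier vanishes (candidate
    inflection points).  With a = p1-p0, b = p2-p1, c = p3-p2, the
    cubic terms cancel, leaving a quadratic A t^2 + B t + C."""
    cross = lambda u, v: u[0]*v[1] - u[1]*v[0]
    a = (p1[0]-p0[0], p1[1]-p0[1])
    b = (p2[0]-p1[0], p2[1]-p1[1])
    c = (p3[0]-p2[0], p3[1]-p2[1])
    ab, ac, bc = cross(a, b), cross(a, c), cross(b, c)
    A = ab - ac + bc
    B = ac - 2 * ab
    C = ab
    if abs(A) < eps:                       # quadratic degenerates to linear
        roots = [-C / B] if abs(B) > eps else []
    else:
        disc = B * B - 4 * A * C
        if disc < 0:
            roots = []
        else:
            s = disc ** 0.5
            roots = [(-B - s) / (2 * A), (-B + s) / (2 * A)]
    return sorted(t for t in roots if 0 < t < 1)

# A symmetric S-curve inflects at t = 0.5; a one-sided arch has none:
print(inflection_params((0, 0), (1, 1), (2, -1), (3, 0)))  # → [0.5]
print(inflection_params((0, 0), (0, 1), (1, 1), (1, 0)))   # → []
```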
McSeem
2005-07-07 13:35:28 UTC
Permalink
Could you give me the link or just the name of the topic?
Andrey Kuznetsov
2005-07-08 09:18:41 UTC
Permalink
Post by McSeem
Could you give me the link or just the name of the topic?
I suppose it is here:
http://groups.google.de/group/comp.graphics.algorithms/browse_thread/thread/4be5105dd0f4e9d7/0ba341a981576ee4?q=%22Richard+J+Kinch%22+inflection+points+cusps&rnum=1&hl=de#0ba341a981576ee4
--
Andrey Kuznetsov
http://uio.imagero.com Unified I/O for Java
http://reader.imagero.com Java image reader
http://jgui.imagero.com Java GUI components and utilities
t***@gmail.com
2013-02-03 19:10:41 UTC
Permalink
The idea for reducing point count and still maintaining the fidelity is excellent in
http://antigrain.com/research/adaptive_bezier/. And the idea to use angles to adjust the point count in sharp turns is good, but using atan2 is not the fastest way.

I have tested many ways to make angle comparisons in a way that increases speed without losing accuracy, but without success, until now, when I found a way using taxicab geometry. Taxicab geometry uses so-called diamond angles, which have range 0-4, while radians have range 0-2*PI. You cannot compare between them directly, but if you first convert the radian tolerance to a diamond angle, then you can make comparisons using diamond angles. The speedup is 1.5 - 4.7 in the main browsers, and I assume that in C++ and similar the speedup should be comparable (not tested).

The desired functions are here:
http://stackoverflow.com/questions/1427422/cheap-algorithm-to-find-measure-of-angle-between-vectors/14675998#14675998

And I assume (not tested yet) that taxicab distances can be used for distance comparisons, because the bigger distance in euclidean is bigger also in taxicab (not sure about this).
t***@gmail.com
2013-02-04 02:56:08 UTC
Permalink
The time has gone, but the internet is up.

The idea for reducing point count and still maintaining the fidelity is excellent in http://antigrain.com/research/adaptive_bezier/. And the idea to use angles to adjust the point count in sharp turns is good, but using atan2 is not the fastest way.

I have tested many ways to make angle comparisons in a way that increases speed without losing accuracy, but without success, until now, when I found a way using taxicab geometry. Taxicab geometry uses so-called diamond angles, which have range 0-4, while radians have range 0-2*PI. You cannot compare between them directly, but if you first convert the radian tolerance to a diamond angle, then you can make comparisons using diamond angles. The speedup compared to Math.atan2 is 1.44x - 15.2x (average 5.02x) in the main browsers' JavaScript engines, and I assume that in C++ and similar the speedup may be comparable (not tested).

The desired functions are here:
http://stackoverflow.com/questions/1427422/cheap-algorithm-to-find-measure-of-angle-between-vectors/14675998#14675998

The comparison between diamond angles is as reliable as comparing euclidean angles and the conversion between radians and diamond angles is accurate.

I assumed at first that taxicab distances could also be used for accurate and fast distance comparisons, because a bigger distance in euclidean would also be bigger in taxicab. I then realized that, contrary to euclidean distances, the angle between the start and end point affects the taxicab distance. Only the lengths of vertical and horizontal vectors can be converted easily and quickly between euclidean and taxicab; in every other case you have to take the angle into account, and then the process is too slow.

So, in conclusion, I think angles are faster to compare in taxicab space, while distances should be compared, as they already are, in euclidean (squared) space.
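For reference, the diamond-angle mapping described above can be sketched as
follows (a Python sketch of the idea; the origin y = x = 0 is deliberately
left unhandled). The value is in [0, 4) and increases monotonically with the
true angle, so comparing against a converted tolerance is valid:

```python
def diamond_angle(y, x):
    """Taxicab 'diamond angle' in [0, 4): monotonic in atan2(y, x), so
    it can replace atan2 wherever only comparisons are needed."""
    if y >= 0:
        return y / (x + y) if x >= 0 else 1 - x / (-x + y)
    else:
        return 2 - y / (-x - y) if x < 0 else 3 + x / (x - y)

# The eight compass directions map to 0, 0.5, 1, ..., 3.5:
print([diamond_angle(0, 1), diamond_angle(1, 1), diamond_angle(1, 0),
       diamond_angle(0, -1), diamond_angle(-1, 0), diamond_angle(-1, 1)])
```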
Timo
2013-02-04 11:34:46 UTC
Permalink
I assume that the Antigrain adaptive subdivision cannot handle all collinear cases, or else I have made a mistake.

I ported the code to JavaScript and made a testbed:
http://jsbin.com/ivomiq/1

The code generates two versions of every curve: the red one is the reference curve, flattened using a brute-force method, and the green one is flattened using Antigrain's adaptive subdivision. They should be identical, but they are not.

There is a dataset of 256 collinear cases (there are duplicates, but all cases are covered). Of these 256 cases 8 fail, regardless of angle tolerance and approximation scale. The only way to make it work was to first split all curves at their local extremes (first derivative roots) and then apply the recursive subdivision to those parts. But it is rather slow to find the local extremes. There must be a way to get all of the collinear cases to work by modifying the code in case 0 (where all points are collinear or p1 == p4). Any thoughts?
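Splitting at the local extremes mentioned above amounts to solving one
quadratic per coordinate, since the derivative of a cubic Bezier component is
quadratic. A hedged sketch of that step (my own helper, assuming the standard
Bernstein form):

```python
def component_extrema(a, b, c, d, eps=1e-12):
    """Parameters t in (0, 1) where the derivative of the 1-D cubic
    Bezier component B(t) = a(1-t)^3 + 3b(1-t)^2 t + 3c(1-t)t^2 + dt^3
    vanishes.  B'(t)/3 expands to A t^2 + B t + C with the coefficients
    below."""
    A = -a + 3*b - 3*c + d
    B = 2 * (a - 2*b + c)
    C = b - a
    if abs(A) < eps:                       # derivative is linear
        roots = [-C / B] if abs(B) > eps else []
    else:
        disc = B * B - 4 * A * C
        if disc < 0:
            roots = []
        else:
            s = disc ** 0.5
            roots = [(-B - s) / (2 * A), (-B + s) / (2 * A)]
    return sorted(t for t in roots if 0 < t < 1)

# A symmetric bump 0, 2, 2, 0 peaks at t = 0.5:
print(component_extrema(0, 2, 2, 0))  # → [0.5]
```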

If McSeem or someone else has made a functional C++ version, could you please test the following data set and report the resulting mpoints array here?

// start of data
[10, 15, 10, 15, 10, 15, 10, 15],
[10, 20, 10, 20, 10, 20, 110, 20],
[10, 25, 10, 25, 10, 25, 210, 25],
[10, 30, 10, 30, 10, 30, 310, 30],
[10, 35, 10, 35, 110, 35, 10, 35],
[10, 40, 10, 40, 110, 40, 110, 40],
[10, 45, 10, 45, 110, 45, 210, 45],
[10, 50, 10, 50, 110, 50, 310, 50],
[10, 55, 10, 55, 210, 55, 10, 55],
[10, 60, 10, 60, 210, 60, 110, 60],
[10, 65, 10, 65, 210, 65, 210, 65],
[10, 70, 10, 70, 210, 70, 310, 70],
[10, 75, 10, 75, 310, 75, 10, 75],
[10, 80, 10, 80, 310, 80, 110, 80],
[10, 85, 10, 85, 310, 85, 210, 85],
[10, 90, 10, 90, 310, 90, 310, 90],
[10, 95, 110, 95, 10, 95, 10, 95],
[10, 100, 110, 100, 10, 100, 110, 100],
[10, 105, 110, 105, 10, 105, 210, 105],
[10, 110, 110, 110, 10, 110, 310, 110],
[10, 115, 110, 115, 110, 115, 10, 115],
[10, 120, 110, 120, 110, 120, 110, 120],
[10, 125, 110, 125, 110, 125, 210, 125],
[10, 130, 110, 130, 110, 130, 310, 130],
[10, 135, 110, 135, 210, 135, 10, 135],
[10, 140, 110, 140, 210, 140, 110, 140],
[10, 145, 110, 145, 210, 145, 210, 145],
[10, 150, 110, 150, 210, 150, 310, 150],
[10, 155, 110, 155, 310, 155, 10, 155],
[10, 160, 110, 160, 310, 160, 110, 160],
[10, 165, 110, 165, 310, 165, 210, 165],
[10, 170, 110, 170, 310, 170, 310, 170],
[10, 175, 210, 175, 10, 175, 10, 175],
[10, 180, 210, 180, 10, 180, 110, 180],
[10, 185, 210, 185, 10, 185, 210, 185],
[10, 190, 210, 190, 10, 190, 310, 190],
[10, 195, 210, 195, 110, 195, 10, 195],
[10, 200, 210, 200, 110, 200, 110, 200],
[10, 205, 210, 205, 110, 205, 210, 205],
[10, 210, 210, 210, 110, 210, 310, 210],
[10, 215, 210, 215, 210, 215, 10, 215],
[10, 220, 210, 220, 210, 220, 110, 220],
[10, 225, 210, 225, 210, 225, 210, 225],
[10, 230, 210, 230, 210, 230, 310, 230],
[10, 235, 210, 235, 310, 235, 10, 235],
[10, 240, 210, 240, 310, 240, 110, 240],
[10, 245, 210, 245, 310, 245, 210, 245],
[10, 250, 210, 250, 310, 250, 310, 250],
[10, 255, 310, 255, 10, 255, 10, 255],
[10, 260, 310, 260, 10, 260, 110, 260],
[10, 265, 310, 265, 10, 265, 210, 265],
[10, 270, 310, 270, 10, 270, 310, 270],
[10, 275, 310, 275, 110, 275, 10, 275],
[10, 280, 310, 280, 110, 280, 110, 280],
[10, 285, 310, 285, 110, 285, 210, 285],
[10, 290, 310, 290, 110, 290, 310, 290],
[10, 295, 310, 295, 210, 295, 10, 295],
[10, 300, 310, 300, 210, 300, 110, 300],
[10, 305, 310, 305, 210, 305, 210, 305],
[10, 310, 310, 310, 210, 310, 310, 310],
[10, 315, 310, 315, 310, 315, 10, 315],
[10, 320, 310, 320, 310, 320, 110, 320],
[10, 325, 310, 325, 310, 325, 210, 325],
[10, 330, 310, 330, 310, 330, 310, 330],
[110, 335, 10, 335, 10, 335, 10, 335],
[110, 340, 10, 340, 10, 340, 110, 340],
[110, 345, 10, 345, 10, 345, 210, 345],
[110, 350, 10, 350, 10, 350, 310, 350],
[110, 355, 10, 355, 110, 355, 10, 355],
[110, 360, 10, 360, 110, 360, 110, 360],
[110, 365, 10, 365, 110, 365, 210, 365],
[110, 370, 10, 370, 110, 370, 310, 370],
[110, 375, 10, 375, 210, 375, 10, 375],
[110, 380, 10, 380, 210, 380, 110, 380],
[110, 385, 10, 385, 210, 385, 210, 385],
[110, 390, 10, 390, 210, 390, 310, 390],
[110, 395, 10, 395, 310, 395, 10, 395],
[110, 400, 10, 400, 310, 400, 110, 400],
[110, 405, 10, 405, 310, 405, 210, 405],
[110, 410, 10, 410, 310, 410, 310, 410],
[110, 415, 110, 415, 10, 415, 10, 415],
[110, 420, 110, 420, 10, 420, 110, 420],
[110, 425, 110, 425, 10, 425, 210, 425],
[110, 430, 110, 430, 10, 430, 310, 430],
[110, 435, 110, 435, 110, 435, 10, 435],
[110, 440, 110, 440, 110, 440, 110, 440],
[110, 445, 110, 445, 110, 445, 210, 445],
[110, 450, 110, 450, 110, 450, 310, 450],
[110, 455, 110, 455, 210, 455, 10, 455],
[110, 460, 110, 460, 210, 460, 110, 460],
[110, 465, 110, 465, 210, 465, 210, 465],
[110, 470, 110, 470, 210, 470, 310, 470],
[110, 475, 110, 475, 310, 475, 10, 475],
[110, 480, 110, 480, 310, 480, 110, 480],
[110, 485, 110, 485, 310, 485, 210, 485],
[110, 490, 110, 490, 310, 490, 310, 490],
[110, 495, 210, 495, 10, 495, 10, 495],
[110, 500, 210, 500, 10, 500, 110, 500],
[110, 505, 210, 505, 10, 505, 210, 505],
[110, 510, 210, 510, 10, 510, 310, 510],
[110, 515, 210, 515, 110, 515, 10, 515],
[110, 520, 210, 520, 110, 520, 110, 520],
[110, 525, 210, 525, 110, 525, 210, 525],
[110, 530, 210, 530, 110, 530, 310, 530],
[110, 535, 210, 535, 210, 535, 10, 535],
[110, 540, 210, 540, 210, 540, 110, 540],
[110, 545, 210, 545, 210, 545, 210, 545],
[110, 550, 210, 550, 210, 550, 310, 550],
[110, 555, 210, 555, 310, 555, 10, 555],
[110, 560, 210, 560, 310, 560, 110, 560],
[110, 565, 210, 565, 310, 565, 210, 565],
[110, 570, 210, 570, 310, 570, 310, 570],
[110, 575, 310, 575, 10, 575, 10, 575],
[110, 580, 310, 580, 10, 580, 110, 580],
[110, 585, 310, 585, 10, 585, 210, 585],
[110, 590, 310, 590, 10, 590, 310, 590],
[110, 595, 310, 595, 110, 595, 10, 595],
[110, 600, 310, 600, 110, 600, 110, 600],
[110, 605, 310, 605, 110, 605, 210, 605],
[110, 610, 310, 610, 110, 610, 310, 610],
[110, 615, 310, 615, 210, 615, 10, 615],
[110, 620, 310, 620, 210, 620, 110, 620],
[110, 625, 310, 625, 210, 625, 210, 625],
[110, 630, 310, 630, 210, 630, 310, 630],
[110, 635, 310, 635, 310, 635, 10, 635],
[110, 640, 310, 640, 310, 640, 110, 640],
[110, 645, 310, 645, 310, 645, 210, 645],
[110, 650, 310, 650, 310, 650, 310, 650],
[210, 655, 10, 655, 10, 655, 10, 655],
[210, 660, 10, 660, 10, 660, 110, 660],
[210, 665, 10, 665, 10, 665, 210, 665],
[210, 670, 10, 670, 10, 670, 310, 670],
[210, 675, 10, 675, 110, 675, 10, 675],
[210, 680, 10, 680, 110, 680, 110, 680],
[210, 685, 10, 685, 110, 685, 210, 685],
[210, 690, 10, 690, 110, 690, 310, 690],
[210, 695, 10, 695, 210, 695, 10, 695],
[210, 700, 10, 700, 210, 700, 110, 700],
[210, 705, 10, 705, 210, 705, 210, 705],
[210, 710, 10, 710, 210, 710, 310, 710],
[210, 715, 10, 715, 310, 715, 10, 715],
[210, 720, 10, 720, 310, 720, 110, 720],
[210, 725, 10, 725, 310, 725, 210, 725],
[210, 730, 10, 730, 310, 730, 310, 730],
[210, 735, 110, 735, 10, 735, 10, 735],
[210, 740, 110, 740, 10, 740, 110, 740],
[210, 745, 110, 745, 10, 745, 210, 745],
[210, 750, 110, 750, 10, 750, 310, 750],
[210, 755, 110, 755, 110, 755, 10, 755],
[210, 760, 110, 760, 110, 760, 110, 760],
[210, 765, 110, 765, 110, 765, 210, 765],
[210, 770, 110, 770, 110, 770, 310, 770],
[210, 775, 110, 775, 210, 775, 10, 775],
[210, 780, 110, 780, 210, 780, 110, 780],
[210, 785, 110, 785, 210, 785, 210, 785],
[210, 790, 110, 790, 210, 790, 310, 790],
[210, 795, 110, 795, 310, 795, 10, 795],
[210, 800, 110, 800, 310, 800, 110, 800],
[210, 805, 110, 805, 310, 805, 210, 805],
[210, 810, 110, 810, 310, 810, 310, 810],
[210, 815, 210, 815, 10, 815, 10, 815],
[210, 820, 210, 820, 10, 820, 110, 820],
[210, 825, 210, 825, 10, 825, 210, 825],
[210, 830, 210, 830, 10, 830, 310, 830],
[210, 835, 210, 835, 110, 835, 10, 835],
[210, 840, 210, 840, 110, 840, 110, 840],
[210, 845, 210, 845, 110, 845, 210, 845],
[210, 850, 210, 850, 110, 850, 310, 850],
[210, 855, 210, 855, 210, 855, 10, 855],
[210, 860, 210, 860, 210, 860, 110, 860],
[210, 865, 210, 865, 210, 865, 210, 865],
[210, 870, 210, 870, 210, 870, 310, 870],
[210, 875, 210, 875, 310, 875, 10, 875],
[210, 880, 210, 880, 310, 880, 110, 880],
[210, 885, 210, 885, 310, 885, 210, 885],
[210, 890, 210, 890, 310, 890, 310, 890],
[210, 895, 310, 895, 10, 895, 10, 895],
[210, 900, 310, 900, 10, 900, 110, 900],
[210, 905, 310, 905, 10, 905, 210, 905],
[210, 910, 310, 910, 10, 910, 310, 910],
[210, 915, 310, 915, 110, 915, 10, 915],
[210, 920, 310, 920, 110, 920, 110, 920],
[210, 925, 310, 925, 110, 925, 210, 925],
[210, 930, 310, 930, 110, 930, 310, 930],
[210, 935, 310, 935, 210, 935, 10, 935],
[210, 940, 310, 940, 210, 940, 110, 940],
[210, 945, 310, 945, 210, 945, 210, 945],
[210, 950, 310, 950, 210, 950, 310, 950],
[210, 955, 310, 955, 310, 955, 10, 955],
[210, 960, 310, 960, 310, 960, 110, 960],
[210, 965, 310, 965, 310, 965, 210, 965],
[210, 970, 310, 970, 310, 970, 310, 970],
[310, 975, 10, 975, 10, 975, 10, 975],
[310, 980, 10, 980, 10, 980, 110, 980],
[310, 985, 10, 985, 10, 985, 210, 985],
[310, 990, 10, 990, 10, 990, 310, 990],
[310, 995, 10, 995, 110, 995, 10, 995],
[310, 1000, 10, 1000, 110, 1000, 110, 1000],
[310, 1005, 10, 1005, 110, 1005, 210, 1005],
[310, 1010, 10, 1010, 110, 1010, 310, 1010],
[310, 1015, 10, 1015, 210, 1015, 10, 1015],
[310, 1020, 10, 1020, 210, 1020, 110, 1020],
[310, 1025, 10, 1025, 210, 1025, 210, 1025],
[310, 1030, 10, 1030, 210, 1030, 310, 1030],
[310, 1035, 10, 1035, 310, 1035, 10, 1035],
[310, 1040, 10, 1040, 310, 1040, 110, 1040],
[310, 1045, 10, 1045, 310, 1045, 210, 1045],
[310, 1050, 10, 1050, 310, 1050, 310, 1050],
[310, 1055, 110, 1055, 10, 1055, 10, 1055],
[310, 1060, 110, 1060, 10, 1060, 110, 1060],
[310, 1065, 110, 1065, 10, 1065, 210, 1065],
[310, 1070, 110, 1070, 10, 1070, 310, 1070],
[310, 1075, 110, 1075, 110, 1075, 10, 1075],
[310, 1080, 110, 1080, 110, 1080, 110, 1080],
[310, 1085, 110, 1085, 110, 1085, 210, 1085],
[310, 1090, 110, 1090, 110, 1090, 310, 1090],
[310, 1095, 110, 1095, 210, 1095, 10, 1095],
[310, 1100, 110, 1100, 210, 1100, 110, 1100],
[310, 1105, 110, 1105, 210, 1105, 210, 1105],
[310, 1110, 110, 1110, 210, 1110, 310, 1110],
[310, 1115, 110, 1115, 310, 1115, 10, 1115],
[310, 1120, 110, 1120, 310, 1120, 110, 1120],
[310, 1125, 110, 1125, 310, 1125, 210, 1125],
[310, 1130, 110, 1130, 310, 1130, 310, 1130],
[310, 1135, 210, 1135, 10, 1135, 10, 1135],
[310, 1140, 210, 1140, 10, 1140, 110, 1140],
[310, 1145, 210, 1145, 10, 1145, 210, 1145],
[310, 1150, 210, 1150, 10, 1150, 310, 1150],
[310, 1155, 210, 1155, 110, 1155, 10, 1155],
[310, 1160, 210, 1160, 110, 1160, 110, 1160],
[310, 1165, 210, 1165, 110, 1165, 210, 1165],
[310, 1170, 210, 1170, 110, 1170, 310, 1170],
[310, 1175, 210, 1175, 210, 1175, 10, 1175],
[310, 1180, 210, 1180, 210, 1180, 110, 1180],
[310, 1185, 210, 1185, 210, 1185, 210, 1185],
[310, 1190, 210, 1190, 210, 1190, 310, 1190],
[310, 1195, 210, 1195, 310, 1195, 10, 1195],
[310, 1200, 210, 1200, 310, 1200, 110, 1200],
[310, 1205, 210, 1205, 310, 1205, 210, 1205],
[310, 1210, 210, 1210, 310, 1210, 310, 1210],
[310, 1215, 310, 1215, 10, 1215, 10, 1215],
[310, 1220, 310, 1220, 10, 1220, 110, 1220],
[310, 1225, 310, 1225, 10, 1225, 210, 1225],
[310, 1230, 310, 1230, 10, 1230, 310, 1230],
[310, 1235, 310, 1235, 110, 1235, 10, 1235],
[310, 1240, 310, 1240, 110, 1240, 110, 1240],
[310, 1245, 310, 1245, 110, 1245, 210, 1245],
[310, 1250, 310, 1250, 110, 1250, 310, 1250],
[310, 1255, 310, 1255, 210, 1255, 10, 1255],
[310, 1260, 310, 1260, 210, 1260, 110, 1260],
[310, 1265, 310, 1265, 210, 1265, 210, 1265],
[310, 1270, 310, 1270, 210, 1270, 310, 1270],
[310, 1275, 310, 1275, 310, 1275, 10, 1275],
[310, 1280, 310, 1280, 310, 1280, 110, 1280],
[310, 1285, 310, 1285, 310, 1285, 210, 1285],
[310, 1290, 310, 1290, 310, 1290, 310, 1290]
// end of data
Timo
2013-02-04 16:03:09 UTC
Permalink
The problematic indexes of the array in question are 13, 20, 37, 40, 65, 130, 133 and 242. All are collinear, so all fall into category 0 in the recursive function.
Post by Timo
// start of data
[10, 15, 10, 15, 10, 15, 10, 15],
[310, 1290, 310, 1290, 310, 1290, 310, 1290]
// end of data
n***@gmail.com
2013-06-02 10:34:42 UTC
Permalink
Hi Timo,

don't mean to fire up a war here, but Javascript is not the right tool for math computations,
at least not at this level of accuracy.

N
Post by Timo
Problematic indexes of the array in question are 13,20,37,40,65,130,133 and 242. All are collinear, so all falls in category 0 in recursive function.
Post by Timo
// start of data
[10, 15, 10, 15, 10, 15, 10, 15],
[310, 1290, 310, 1290, 310, 1290, 310, 1290]
// end of data