I was asked to fill in the blanks left at the end of the previous Curve Fitting article, so, just quickly, here goes. Last time we assumed a linear mathematical model, y = mx + c. This time we start off with a more general one, y = p(x), where the function p is assumed to have one or more adjustable parameters that we can tweak to find the best approximation.
As before, we first plot all n data points (x,y) on a graph, and conceptually draw the curve represented by our function p(x) somewhere through the middle of them. Then the sum of the squares of the error terms is,
∑(∆y)² = ∑(p(x) - y)².
For each parameter t in model p, we partially differentiate this sum and look for turning points:
∂/∂t ∑(∆y)² = 2∑(p(x) - y)(∂p/∂t), which equals zero when
∑(∂p/∂t)p(x) = ∑(∂p/∂t)y.
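When p is linear in its parameters, p(x) = ∑ tⱼφⱼ(x) for some basis functions φⱼ, each partial derivative ∂p/∂tⱼ is just φⱼ(x), and the formula above collapses into a linear system in the tⱼ. Here's a minimal sketch of that general recipe (my own illustration, assuming NumPy; the names are mine, not from the article):

```python
import numpy as np

def fit_linear_model(basis, xs, ys):
    """Least-squares fit of p(x) = sum_j t_j * basis[j](x).

    Builds A[j][k] = sum over data of phi_j(x)*phi_k(x) and
    b[j]   = sum over data of phi_j(x)*y, then solves A t = b --
    exactly the formula sum(dp/dt)p(x) = sum(dp/dt)y, one row
    per parameter.
    """
    # One row per data point, one column per basis function.
    phi = np.array([[f(x) for f in basis] for x in xs], dtype=float)
    A = phi.T @ phi                          # A[j][k] = sum phi_j(x) phi_k(x)
    b = phi.T @ np.asarray(ys, dtype=float)  # b[j]    = sum phi_j(x) y
    return np.linalg.solve(A, b)
```

With basis = [lambda x: x, lambda x: 1.0] this reproduces last time's straight-line fit, m∑x² + c∑x = ∑xy and m∑x + cn = ∑y.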
Now let's plug in a concrete example; in fact, let's make a beeline for that bloody quintic. This has six independent parameters, which we'll label a through f:
p(x) = ax⁵ + bx⁴ + cx³ + dx² + ex + f
We simply read off the six partial derivatives, and substitute each of these in turn into the above formula.
∂p/∂a = x⁵; ∂p/∂b = x⁴; ∂p/∂c = x³;
∂p/∂d = x²; ∂p/∂e = x; ∂p/∂f = 1.
a∑x¹⁰ + b∑x⁹ + c∑x⁸ + d∑x⁷ + e∑x⁶ + f∑x⁵ = ∑x⁵y
a∑x⁹ + b∑x⁸ + c∑x⁷ + d∑x⁶ + e∑x⁵ + f∑x⁴ = ∑x⁴y
a∑x⁸ + b∑x⁷ + c∑x⁶ + d∑x⁵ + e∑x⁴ + f∑x³ = ∑x³y
a∑x⁷ + b∑x⁶ + c∑x⁵ + d∑x⁴ + e∑x³ + f∑x² = ∑x²y
a∑x⁶ + b∑x⁵ + c∑x⁴ + d∑x³ + e∑x² + f∑x = ∑xy
a∑x⁵ + b∑x⁴ + c∑x³ + d∑x² + e∑x + fn = ∑y
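In code, the whole system reduces to accumulating power sums S[k] = ∑xᵏ and right-hand sides T[k] = ∑xᵏy, then solving. A sketch, assuming NumPy (function name and layout are mine):

```python
import numpy as np

def fit_quintic(xs, ys):
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    S = [np.sum(xs**k) for k in range(11)]      # S[0] = n, ..., S[10] = sum x^10
    T = [np.sum(xs**k * ys) for k in range(6)]  # T[0] = sum y, ..., T[5] = sum x^5 y
    # One row per parameter a..f: the parameter whose derivative is x^r
    # gets coefficients S[r+5], S[r+4], ..., S[r] and right-hand side T[r].
    A = np.array([[S[r + c] for c in range(5, -1, -1)] for r in range(5, -1, -1)])
    b = np.array([T[r] for r in range(5, -1, -1)])
    return np.linalg.solve(A, b)                # [a, b, c, d, e, f]
```

One practical caveat: these raw power sums get badly conditioned fast, so keep the x values small (or rescale them) before trusting the answer.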
There we have our six explicit simultaneous equations in six parameters; solve 'em! Use any decent math library. Or, if there's a cow peering into your Portakabin™, and all you have is Commodore Pet BASIC, take advantage of that 30KB of RAM to knock up a quick Gaussian elimination.
From memory, of course.
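For the record, the from-memory version goes something like this — Python standing in for Pet BASIC, but the algorithm is the one you'd key in while the cow watches:

```python
def gaussian_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.

    Good enough for a small dense system like the 6x6 above.
    """
    n = len(b)
    A = [row[:] for row in A]  # work on copies, leave the caller's data alone
    b = b[:]
    for col in range(n):
        # Partial pivoting: bring up the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        # Eliminate everything below the pivot.
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    # Back-substitution up the triangle.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / A[r][r]
    return x
```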