Figure: top, raw data and model; bottom, evolution of the normalised sum of the squares of the errors.

Curve fitting is the process of constructing a curve, or mathematical function, that has the best fit to a series of data points, possibly subject to constraints.

Curve fitting can involve either interpolation, where an exact fit to the data is required, or smoothing, in which a “smooth” function is constructed that approximately fits the data. A related topic is regression analysis, which focuses more on questions of statistical inference such as how much uncertainty is present in a curve that is fit to data observed with random errors. Fitted curves can be used as an aid for data visualization, to infer values of a function where no data are available, and to summarize the relationships among two or more variables. Extrapolation refers to the use of a fitted curve beyond the range of the observed data, and is subject to a degree of uncertainty since it may reflect the method used to construct the curve as much as it reflects the observed data.

Figure: polynomial curves fitting points generated with a sine function.

A first degree polynomial equation, y = ax + b, is a line with slope a. A line will connect any two points, so a first degree polynomial equation is an exact fit through any two points with distinct x coordinates.
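As a minimal sketch (the point values are arbitrary examples, not from the text), the exact first degree fit through two points can be computed directly:

```python
def line_through(p1, p2):
    """Exact first degree fit y = a*x + b through two points
    with distinct x coordinates."""
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2:
        raise ValueError("x coordinates must be distinct")
    a = (y2 - y1) / (x2 - x1)  # slope
    b = y1 - a * x1            # intercept
    return a, b

# The line through (1, 2) and (3, 6) has slope 2 and intercept 0.
a, b = line_through((1.0, 2.0), (3.0, 6.0))
```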

Increasing the order to a second degree polynomial, y = ax^2 + bx + c, will exactly fit a simple curve to three points. A third degree polynomial, y = ax^3 + bx^2 + cx + d, will exactly fit four points. A more general statement would be to say it will exactly fit four constraints, where each constraint can be a point, an angle, or a curvature. Angle and curvature constraints are most often added to the ends of a curve, and in such cases are called end conditions.
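One standard way to realise the exact fit of n points by a degree-(n-1) polynomial is the Lagrange form (the Lagrange construction and the sample points here are an illustration, not something named in the text):

```python
def lagrange_eval(points, x):
    """Evaluate, at x, the unique degree-(n-1) polynomial passing
    through n points with distinct x coordinates (Lagrange form)."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# A cubic is pinned down by four points; the interpolant
# reproduces each of them exactly.
pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 2.0), (3.0, 5.0)]
errors = [abs(lagrange_eval(pts, xi) - yi) for xi, yi in pts]
```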

Identical end conditions are frequently used to ensure a smooth transition between polynomial curves contained within a single spline. Higher-order constraints, such as “the change in the rate of curvature”, could also be added. The first degree polynomial equation could also be an exact fit for a single point and an angle while the third degree polynomial equation could also be an exact fit for two points, an angle constraint, and a curvature constraint. Many other combinations of constraints are possible for these and for higher order polynomial equations.
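To make the "two points, an angle constraint, and a curvature constraint" case concrete, here is a sketch under the assumption that the value, slope, and curvature are prescribed at x = 0 and a second point value at x = 1; the placement of the constraints is illustrative, and with this placement the four coefficients fall out in closed form:

```python
def cubic_from_constraints(y0, y1, slope0, curv0):
    """Coefficients (a, b, c, d) of y = a*x**3 + b*x**2 + c*x + d
    satisfying y(0) = y0, y(1) = y1, y'(0) = slope0, y''(0) = curv0."""
    d = y0              # y(0)  = d
    c = slope0          # y'(0) = c
    b = curv0 / 2.0     # y''(0) = 2b
    a = y1 - b - c - d  # remaining constraint: y(1) = a + b + c + d
    return a, b, c, d

# Four constraints determine the cubic uniquely.
coeffs = cubic_from_constraints(2.0, 5.0, 1.0, 4.0)
```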

When an exact fit to every point is not required, some method is needed to evaluate how well each approximation fits the data. The least squares method is one way to compare the deviations. There are several reasons to settle for an approximate fit when it is possible to simply increase the degree of the polynomial equation and get an exact match. Even if an exact match exists, it does not necessarily follow that it can be readily discovered.
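For a straight line, the least squares fit has a simple closed form. The following sketch (with made-up sample points) minimises the sum of squared vertical deviations:

```python
def least_squares_line(points):
    """Ordinary least squares fit of y = a*x + b, minimising the
    sum of the squares of the vertical deviations."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b = (sy - a * sx) / n                          # intercept
    return a, b

# Noisy points scattered around y = 2x + 1; the fit averages
# out the deviations instead of passing through every point.
pts = [(0.0, 1.1), (1.0, 2.9), (2.0, 5.2), (3.0, 6.8)]
a, b = least_squares_line(pts)
```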

Depending on the algorithm used, there may be a divergent case where the exact fit cannot be calculated, or it might take too much computer time to find the solution; either situation might require an approximate solution instead. Averaging out questionable data points in a sample, rather than distorting the curve to fit them exactly, may also be desirable. Finally, high order polynomials can be highly oscillatory, a behaviour known as Runge's phenomenon, so an exact polynomial fit through many points can misbehave badly between them.
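The oscillation can be demonstrated numerically by interpolating Runge's function 1/(1 + 25x^2) at equally spaced nodes on [-1, 1]: raising the degree makes the maximum error grow rather than shrink. This is a standard illustration, not an example from the text:

```python
def runge(x):
    return 1.0 / (1.0 + 25.0 * x * x)

def lagrange_eval(xs, ys, x):
    """Evaluate the interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def max_error(degree, samples=201):
    """Max deviation on [-1, 1] of the equispaced interpolant
    of the given degree from Runge's function."""
    xs = [-1.0 + 2.0 * i / degree for i in range(degree + 1)]
    ys = [runge(x) for x in xs]
    grid = [-1.0 + 2.0 * k / (samples - 1) for k in range(samples)]
    return max(abs(lagrange_eval(xs, ys, x) - runge(x)) for x in grid)

# The degree-14 "exact" fit is far worse between the nodes
# than the degree-4 one.
low, high = max_error(4), max_error(14)
```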