
Curve fitting. Least squares regression

09/04/2026

The online curve fitting simulations on this page help you understand how to fit a point cloud and demonstrate some of the most common techniques, such as least squares regression.

What is curve fitting

Curve fitting is a technique used in data analysis to find a mathematical function that fits a set of points. The goal is to find a smooth curve that passes close to all the points and accurately represents the relationship between the variables of interest.

Curve fitting applications

Curve fitting is used in a wide range of fields, such as physics, statistics, economics and engineering. It is especially useful when you have a set of data and you want to obtain a function that describes them adequately. This can be useful for predicting future values, interpolating between known points, or better understanding the relationship between variables.

Curve fitting methods

There are several methods for curve fitting, the most common being linear fitting and polynomial fitting. Linear fitting is used when data are expected to follow a linear relationship, while polynomial fitting allows more complex relationships to be represented by higher-degree polynomials. These methods are based on minimizing the difference between the values predicted by the fitted function and the actual data values.
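As a minimal sketch of what linear and polynomial least squares fitting can look like in practice (using NumPy and synthetic data; the values below are illustrative assumptions, not taken from the simulations on this page):

```python
import numpy as np

# Synthetic, illustrative data: roughly y = 2x + 1 with Gaussian noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 2 * x + 1 + rng.normal(0, 0.5, size=x.size)

# Least-squares fits of degree 1 (linear) and degree 3 (cubic)
linear = np.polyfit(x, y, deg=1)   # returns [slope, intercept]
cubic = np.polyfit(x, y, deg=3)    # returns 4 coefficients, highest power first

# Evaluate the fitted line at the sampled points
y_line = np.polyval(linear, x)
print(f"slope ~ {linear[0]:.2f}, intercept ~ {linear[1]:.2f}")
```

Both calls minimize the same squared-error criterion; only the degree of the polynomial changes.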

Another popular approach is curve fitting using exponential, logarithmic or trigonometric functions, depending on the shape of the data and the expected relationship between the variables. These models can capture nonlinear patterns and provide a better approximation in some cases.
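One common trick for exponential models, sketched here with made-up NumPy data, is to linearize them by taking logarithms so that ordinary linear least squares can recover the parameters:

```python
import numpy as np

# Synthetic data following y = 3 * e^(0.8x), with small multiplicative noise
rng = np.random.default_rng(1)
x = np.linspace(0, 2, 30)
y = 3.0 * np.exp(0.8 * x) * (1 + rng.normal(0, 0.01, size=x.size))

# ln(y) = ln(a) + b*x, so a straight-line fit on (x, ln y) recovers a and b
b, ln_a = np.polyfit(x, np.log(y), deg=1)
a = np.exp(ln_a)
print(f"a ~ {a:.2f}, b ~ {b:.2f}")
```

The same idea works for power laws (log both axes); trigonometric or more general nonlinear models usually require an iterative fitter instead.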

Least squares regression method

The least squares regression method is a mathematical technique used to find the most accurate relationship between a set of data and a function by minimizing the sum of the squares of the differences between the observed values and those predicted by the model. This method is fundamental in curve fitting because it determines the curve that best fits the experimental data, optimizing its representation and reducing prediction errors. It is widely used in statistics, physics, engineering and other disciplines to analyze trends and make predictions from observed data.
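For a straight line, minimizing the sum of squared differences has a closed-form solution. The sketch below (with made-up numbers) applies the textbook formulas directly:

```python
import numpy as np

# Closed-form least-squares line: minimize sum((y - (m*x + c))**2).
# Setting the partial derivatives with respect to m and c to zero
# yields the classic formulas used below.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.2, 5.9, 8.1, 9.8])

x_mean, y_mean = x.mean(), y.mean()
m = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
c = y_mean - m * x_mean

residuals = y - (m * x + c)
sse = np.sum(residuals ** 2)
print(f"slope ~ {m:.3f}, intercept ~ {c:.3f}, SSE ~ {sse:.4f}")
```

The SSE printed at the end is exactly the quantity the method minimizes: any other slope or intercept would give a larger value on this data.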

Limitations of curve fitting

It is important to note that curve fitting is not always appropriate, as it can lead to overfitting or underfitting. Overfitting occurs when the fitted function follows the training data too closely and performs poorly on new data, while underfitting occurs when the function fails to adequately capture the relationship between variables.
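A quick way to see overfitting (a sketch with synthetic NumPy data, not taken from the simulations) is to compare a low-degree and a high-degree polynomial fit: the complex model always matches the training points more closely, but that is no guarantee it generalizes:

```python
import numpy as np

# Ten noisy samples from a straight line (synthetic data for illustration)
rng = np.random.default_rng(2)
x = np.linspace(0, 1, 10)
y = 2 * x + rng.normal(0, 0.2, size=x.size)

def train_sse(deg):
    """Sum of squared residuals of a degree-`deg` fit on the training data."""
    coeffs = np.polyfit(x, y, deg)
    return np.sum((y - np.polyval(coeffs, x)) ** 2)

# A degree-7 polynomial always fits the training points at least as well
# as a straight line, because the line is a special case of it...
print(train_sse(1), train_sse(7))

# ...but its predictions outside the sampled range can diverge wildly
# from the true trend, which is the hallmark of overfitting.
print(np.polyval(np.polyfit(x, y, 7), 1.5))
```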

Explore the exciting STEM world with our free online simulations and accompanying companion courses! With them you’ll be able to experience and learn hands-on. Take this opportunity to immerse yourself in virtual experiences while advancing your education: awaken your scientific curiosity and discover all that the STEM world has to offer!

Curve fitting simulations

Fitting the curve


With the mouse, drag the data points and their error bars, and see the best fit of the polynomial curve that instantly updates. You can choose the type of fit: linear, quadratic, or cubic. The reduced chi-square statistic shows you when the fit is good. Or you can try to find the best fit manually by adjusting the parameters.
Creative Commons license
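The reduced chi-square statistic used by the simulation can also be computed by hand. This sketch assumes per-point uncertainties of 0.2 on made-up measurements; values near 1 suggest a fit consistent with the error bars:

```python
import numpy as np

# Reduced chi-square: chi2_red = sum(((y - model) / sigma)**2) / (N - p),
# where sigma are the per-point uncertainties and p the number of
# fitted parameters. Values near 1 indicate a good fit.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
sigma = np.full(x.size, 0.2)  # assumed measurement uncertainties

coeffs = np.polyfit(x, y, deg=1)   # linear fit, p = 2 parameters
model = np.polyval(coeffs, x)
chi2 = np.sum(((y - model) / sigma) ** 2)
chi2_red = chi2 / (x.size - coeffs.size)
print(f"reduced chi-square ~ {chi2_red:.2f}")
```

A value much larger than 1 means the curve misses the error bars; much smaller than 1 often means the uncertainties were overestimated.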

Least squares regression


Create your own scatter plot or use real-world data and try to create a line of best fit. Explore how individual data points affect the correlation coefficient and trend line.
Creative Commons license

“If I have seen further, it is by standing on the shoulders of giants”

Isaac Newton

Your path to becoming a giant of knowledge begins with these top free courses

Test your knowledge

Curve fitting is a mathematical technique used to find a function that best represents the relationship between variables in a dataset. Its purpose is to obtain a smooth curve that approximates the general trend of the data, even when the points show noise or dispersion. This technique is widely used in physics, engineering, statistics, economics and any field that requires modeling real‑world phenomena. Curve fitting can be linear, polynomial or based on more complex functions such as exponential, logarithmic or trigonometric models. The most common method for determining the best curve is least‑squares regression, which minimizes the sum of the errors between observed and predicted values. Curve fitting enables interpolation, prediction, pattern identification and a deeper understanding of variable relationships.
Curve fitting methods include linear fitting, polynomial fitting and the use of specific functions such as exponential, logarithmic or trigonometric models. Linear fitting is used when data follow a straight‑line trend, while polynomial fitting captures more complex behaviors. Other models are chosen when data suggest exponential growth, logarithmic relationships or periodic patterns. The least‑squares method provides the mathematical foundation for determining the optimal parameters of the chosen model. It minimizes the sum of the squared differences between observed values and model predictions, producing the curve that best represents the dataset. However, overly complex models may lead to overfitting, while overly simple ones may cause underfitting. Choosing the right method is essential for reliable results.
Curve fitting is used to find a mathematical function that describes how two variables are related. When we plot experimental data, the points rarely form a perfect line or curve, so curve fitting helps us obtain an equation that approximates them. This is useful for predicting future values, understanding trends, interpolating between known points or analyzing how a phenomenon behaves. For example, if we measure how a plant grows each day, curve fitting allows us to create a curve that represents that growth and estimate its height in the future.
Linear fitting uses a straight line to represent the data, so it only works when the relationship between variables is roughly proportional. Polynomial fitting uses higher‑degree polynomials, which can represent curved patterns. If the data follow a straight trend, linear fitting is enough; if they form a curve, a polynomial may describe them better. However, using very high‑degree polynomials can cause overfitting, where the model adapts too closely to the data and performs poorly on new values.
The least‑squares method finds the curve that best fits the data by minimizing the sum of squared errors. Each data point has a difference between its real value and the value predicted by the curve; the method squares these differences and adds them up. The best curve is the one that makes this sum as small as possible. It is widely used because it is simple, efficient and works well with many types of models. It also allows easy comparison between different fits to choose the most accurate one.
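Comparing candidate fits by their sum of squared errors, as described above, can be sketched like this (made-up data that grows roughly exponentially):

```python
import numpy as np

# Comparing two candidate fits by their sum of squared errors (SSE):
# the fit with the smaller SSE represents the data more closely.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.7, 7.4, 20.1, 54.6])  # curves upward, not a straight line

sse_linear = np.sum((y - np.polyval(np.polyfit(x, y, 1), x)) ** 2)
sse_quadratic = np.sum((y - np.polyval(np.polyfit(x, y, 2), x)) ** 2)
print(sse_linear, sse_quadratic)
```

Here the quadratic captures the curvature better, so its SSE is smaller; on genuinely linear data the two would be nearly equal and the simpler model would be preferred.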

You may also be interested

