Data fitting example
[Figure: the result of fitting a set of data points with a quadratic function]
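A minimal sketch of how such a quadratic fit could be produced, assuming NumPy is available; the data points below are invented for illustration and are not the ones plotted on the slide.

import numpy as np

# Hypothetical sample points roughly following a parabola (illustration only)
x = np.array([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0])
y = np.array([ 3.1,  1.9,  0.8,  0.3, 0.1, 0.2, 0.9, 2.1, 3.0])

# Least-squares fit of a degree-2 polynomial: y ~ c2*x^2 + c1*x + c0
c2, c1, c0 = np.polyfit(x, y, deg=2)
print(f"fitted quadratic: y = {c2:.3f} x^2 + {c1:.3f} x + {c0:.3f}")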
Data fitting example
[Figure: conic fitting of a set of points using least-squares approximation]
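One common way to pose a conic fit as a least-squares problem (a sketch under that assumption, not necessarily the method behind the slide's figure) is to minimize the algebraic residual of a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 subject to a unit-norm constraint on the coefficients, which reduces to a singular value decomposition of the design matrix. The points below are synthetic.

import numpy as np

def fit_conic(x, y):
    """Algebraic least-squares fit of a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0.

    Minimizes ||D @ p|| subject to ||p|| = 1; the minimizer is the right
    singular vector of D associated with the smallest singular value.
    """
    D = np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]          # coefficients (a, b, c, d, e, f)

# Noisy points on an ellipse (made-up data for illustration)
t = np.linspace(0, 2*np.pi, 50)
x = 15*np.cos(t) + np.random.normal(scale=0.5, size=t.size)
y = 10*np.sin(t) + np.random.normal(scale=0.5, size=t.size)
print(fit_conic(x, y))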
1 The least squares method (LSM)
A standard approach in regression analysis to the approximate solution of overdetermined systems;
Widely used to find or estimate the numerical values of parameters that fit a function to a set of data, and to characterize the statistical properties of the estimates;
The overall solution minimizes the sum of the squares of the errors made in the results of every single equation (see the numerical sketch below);
Its most important application is in data fitting.
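As a small numerical illustration of the criterion (made-up numbers, not data from the slides): the least-squares line attains a smaller sum of squared errors than any other candidate line.

import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.1, 0.9, 2.2, 2.8])

def sse(a, b):
    """Sum of squared errors for the candidate line y = a + b*x."""
    return np.sum((y - (a + b * x)) ** 2)

# Least-squares coefficients via a degree-1 polynomial fit
b_ls, a_ls = np.polyfit(x, y, deg=1)
print(sse(a_ls, b_ls))   # smallest attainable value
print(sse(0.0, 1.0))     # some other candidate line: larger SSE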
2 Variations of LSM
Ordinary least squares (OLS)
Weighted least squares (WLS) (sketched below)
Alternating least squares (ALS)
Partial least squares (PLS)
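To give a flavor of one variation beyond OLS, here is a minimal weighted least squares sketch (synthetic data, assumed weights): observations with larger weights pull the fitted line toward themselves through the weighted normal equations (X^T W X) beta = X^T W y.

import numpy as np

# Synthetic data for illustration
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.2, 1.1, 1.9, 3.2, 3.9])
w = np.array([1.0, 1.0, 1.0, 5.0, 5.0])   # assumed weights: trust the last points more

X = np.column_stack([np.ones_like(x), x])  # design matrix for y = a + b*x
W = np.diag(w)

# OLS: solve (X^T X) beta = X^T y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
# WLS: solve (X^T W X) beta = X^T W y
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print("OLS (a, b):", beta_ols)
print("WLS (a, b):", beta_wls)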
3 The principle of LSM (1)
Given a set of N pairs of observations {Y_i, X_i}, relating the value of the dependent variable (Y) to the values of an independent variable (X), the prediction is given by the following equation:
\hat{Y} = a + bX  (1)
This amounts to minimizing the expression:
\mathcal{E} = \sum_i (Y_i - \hat{Y}_i)^2 = \sum_i \left[ Y_i - (a + bX_i) \right]^2  (2)
(where \mathcal{E} stands for "error", which is the quantity to be minimized)
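For this simple linear model, the minimizer of equation (2) has the well-known closed form b = sum((X_i - Xbar)(Y_i - Ybar)) / sum((X_i - Xbar)^2) and a = Ybar - b*Xbar. A minimal sketch with made-up data, cross-checked against NumPy's least-squares polynomial fit:

import numpy as np

# Made-up observations {Y_i, X_i} for illustration
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

# Closed-form least-squares estimates for Y_hat = a + b*X
b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()

# Residual sum of squares, i.e. the quantity E minimized in equation (2)
E = np.sum((Y - (a + b * X)) ** 2)
print(a, b, E)

# Cross-check against NumPy's least-squares polynomial fit (returns b, a)
print(np.polyfit(X, Y, deg=1))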