Complete Formulae of Least Square Method for Regression Model
Our goal is to find the conditions that minimize $(Y-\hat{Y})^2$.
Suppose $S=(Y-\hat{Y})^2$.
Consider $\hat{Y}=f(x_i)$, $i=1,2,3,\dots,n$, and $\hat{Y}=f(x_i)=ax_i+b$; then $S=\sum_{i=1}^{n}[y_i-f(x_i)]^2=\sum_{i=1}^{n}(y_i-ax_i-b)^2$.
Using the notion of a limit (极限), our task can be redefined as:
$\min S=\lim_{n\to\infty}\sum_{i=1}^{n}(y_i-ax_i-b)^2$
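As a quick illustration, $S$ can be computed directly for any candidate line. The following Python sketch uses made-up data points (they are not from the text, and are chosen so that one candidate line fits exactly):

```python
# Sum of squared residuals S for a candidate line y = a*x + b.
# The data points below are hypothetical, for illustration only.

def sum_squared_residuals(xs, ys, a, b):
    """S = sum_i (y_i - a*x_i - b)**2."""
    return sum((y - a * x - b) ** 2 for x, y in zip(xs, ys))

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]   # lies exactly on y = 2x

print(sum_squared_residuals(xs, ys, 2, 0))  # 0: the true line leaves no residual
print(sum_squared_residuals(xs, ys, 1, 1))  # 14: a worse line gives a larger S
```

Least squares amounts to searching over $(a, b)$ for the pair that makes this quantity smallest.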
According to the rules of partial differentiation (偏导数算法), these two conditions must be met; that is, the partial derivatives of $S$ with respect to $a$ and $b$ must both equal $0$:
$\frac{\partial S}{\partial a}=0,$
$\frac{\partial S}{\partial b}=0$
Thus, we get these two formulae:
$\frac{\partial S}{\partial a}=\dots=-2\sum_{i=1}^{n}(y_i-ax_i-b)x_i=0$
(The intermediate steps are omitted in many textbooks; they can only be worked out with the derivative and partial-derivative rules of Advanced Mathematics (《高等数学》).)
The omitted process: how and why?
Let $i=1$; then $\sum_{i=1}^{n}(y_i-ax_i-b)^2=(y_1-ax_1-b)^2=[y_1-(ax_1+b)]^2=y_1^2+(ax_1+b)^2-2y_1(ax_1+b)=y_1^2+a^2x_1^2+b^2+2abx_1-2ay_1x_1-2by_1$
(Since we differentiate only with respect to $a$, the other quantities $x$, $y$ and $b$ are treated as constants, and the derivative of a constant is $0$. By the power rule, the derivative of $y=x^n$ is $y'=nx^{n-1}$, so the derivative of $a^2$ is $2a^{2-1}=2a$; likewise, the derivative of $a$ is $1\cdot a^{1-1}=a^0=1$.)
According to the derivative rules (导数算法), $\frac{\partial S}{\partial a}=0+2ax_1^2+0+2bx_1-2y_1x_1-0=2x_1(ax_1+b-y_1)=-2(y_1-ax_1-b)x_1$
Then, considering $i=1,2,3,\dots,n$, we get $\frac{\partial S}{\partial a}=-2\sum_{i=1}^{n}(y_i-ax_i-b)x_i$
$\frac{\partial S}{\partial b}=\dots=-2\sum_{i=1}^{n}(y_i-ax_i-b)=0$
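The two analytic partial derivatives above can be sanity-checked numerically against central finite differences. A Python sketch, where the data and the point $(a, b)$ are arbitrary choices for illustration:

```python
# Check dS/da = -2*sum((y - a*x - b)*x) and dS/db = -2*sum(y - a*x - b)
# against central finite differences (S is quadratic, so the match is tight).

def S(xs, ys, a, b):
    return sum((y - a * x - b) ** 2 for x, y in zip(xs, ys))

def dS_da(xs, ys, a, b):
    return -2 * sum((y - a * x - b) * x for x, y in zip(xs, ys))

def dS_db(xs, ys, a, b):
    return -2 * sum(y - a * x - b for x, y in zip(xs, ys))

xs, ys = [1.0, 2.0, 3.0], [1.0, 3.0, 4.0]  # hypothetical data
a, b, h = 0.5, 0.2, 1e-6                    # arbitrary evaluation point

num_da = (S(xs, ys, a + h, b) - S(xs, ys, a - h, b)) / (2 * h)
num_db = (S(xs, ys, a, b + h) - S(xs, ys, a, b - h)) / (2 * h)

print(abs(num_da - dS_da(xs, ys, a, b)) < 1e-4)  # True
print(abs(num_db - dS_db(xs, ys, a, b)) < 1e-4)  # True
```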
By the same token, the omitted steps should be:
Let $i=1$; then $\sum_{i=1}^{n}(y_i-ax_i-b)^2=(y_1-ax_1-b)^2=[y_1-(ax_1+b)]^2=y_1^2+(ax_1+b)^2-2y_1(ax_1+b)=y_1^2+a^2x_1^2+b^2+2abx_1-2ay_1x_1-2by_1$
(The partial-derivative rules are the same as before; the only difference is that here $y$, $x$ and $a$ are the constants.)
According to the derivative rules (导数算法), $\frac{\partial S}{\partial b}=0+0+2b+2ax_1-0-2y_1=2b+2ax_1-2y_1=-2(y_1-ax_1-b)$
Then, considering $i=1,2,3,\dots,n$, we get $\frac{\partial S}{\partial b}=-2\sum_{i=1}^{n}(y_i-ax_i-b)$
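Setting both partial derivatives to zero gives two linear equations in $a$ and $b$ (the normal equations): $a\sum x_i^2+b\sum x_i=\sum x_iy_i$ and $a\sum x_i+bn=\sum y_i$. A Python sketch that solves them directly, with hypothetical data chosen to lie exactly on a line:

```python
# Solve the two normal equations obtained from dS/da = 0 and dS/db = 0:
#   a*sum(x^2) + b*sum(x) = sum(x*y)
#   a*sum(x)   + b*n      = sum(y)
# Data are hypothetical, for illustration only.

def least_squares_line(xs, ys):
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # exactly y = 2x + 1
a, b = least_squares_line(xs, ys)
print(a, b)  # 2.0 1.0
```

Because the data fall exactly on $y=2x+1$, the fitted slope and intercept recover those values and $S=0$ at the solution.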
(Introduction to Tourism English Courseware) S#4 Formulae for Least Square (1)