Testing the numerical derivative Jacobian
I updated the code and ran test_nonlinearls.py, which compares the results of NonlinearLS fitting against WLS fitting results. All the tests gave good results for all test models with the Jacobian calculated explicitly by numerical differentiation.
For the linear model y = a + b*x0 + c*x1 + e, where a, b, c are the parameters to be estimated and e is the noise, the output is given below (.params and .bse were used for the following outputs):
leastsq Parameters [ 0.85542169 -0.72289157 1.80722892]
leastsq Standard Errors [ 0.69147062 0.85276594 2.04464611]
WLS Parameters [ 0.85542169 -0.72289157 1.80722892]
WLS Standard Errors [ 0.69147062 0.85276594 2.04464611]
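The actual checks live in test_nonlinearls.py, but the kind of comparison being made can be sketched outside the test machinery. The following is a minimal, self-contained illustration, not the project's test code: the data are made up, and it fits the linear model once with scipy.optimize.leastsq (letting leastsq approximate the Jacobian by finite differences) and once with WLS from the current statsmodels.api namespace, then prints params and bse from both.

import numpy as np
from scipy.optimize import leastsq
import statsmodels.api as sm

# Made-up data for the linear model y = a + b*x0 + c*x1 + e
rng = np.random.RandomState(12345)
nobs = 50
x0 = rng.uniform(size=nobs)
x1 = rng.uniform(size=nobs)
y = 1.0 + 2.0 * x0 - 0.5 * x1 + 0.1 * rng.standard_normal(nobs)

def resid(params):
    # residuals of the linear model for a given (a, b, c)
    a, b, c = params
    return y - (a + b * x0 + c * x1)

# leastsq with its default finite-difference (numerical derivative) Jacobian
start = np.zeros(3)
params_lsq, cov_x, infodict, mesg, ier = leastsq(resid, start, full_output=True)

# scale cov_x by the residual variance to obtain the parameter covariance
dof = nobs - len(params_lsq)
scale = (resid(params_lsq) ** 2).sum() / dof
bse_lsq = np.sqrt(np.diag(cov_x * scale))

# WLS with unit weights on the same design matrix
exog = np.column_stack([np.ones(nobs), x0, x1])
wls_res = sm.WLS(y, exog).fit()

print("leastsq Parameters     ", params_lsq)
print("leastsq Standard Errors", bse_lsq)
print("WLS Parameters         ", wls_res.params)
print("WLS Standard Errors    ", wls_res.bse)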
Providing the Exact Jacobian
As mentioned in the previous post, I introduced a function 'jacobian' that lets the user provide the analytical formula for calculating the Jacobian.
I tested it using the linear regression model y = a + b*x0 + c*x1 + e. The output is given below (.params and .bse were used for the following outputs):
leastsq Parameters [ 0.72754286 -0.81228571 2.15571429]
leastsq Standard Errors [ 0.69737916 0.86005273 2.06211739]
WLS Parameters [ 0.85542169 -0.72289157 1.80722892]
WLS Standard Errors [ 0.69147062 0.85276594 2.04464611]
The output values clearly don't match.
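For the sketch above, the analytical Jacobian is easy to write down: the residual is y - (a + b*x0 + c*x1), so its derivative with respect to (a, b, c) is the constant matrix -[1, x0, x1]. At the scipy level this can be passed through leastsq's Dfun argument; the 'jacobian' function in NonlinearLS is the user-facing way to supply the same information, though its exact interface is not reproduced here. Reusing nobs, x0, x1, resid and start from the sketch above:

def jac(params):
    # d(resid)/d(a, b, c): constant for a linear model, -[1, x0, x1]
    return -np.column_stack([np.ones(nobs), x0, x1])

params_exact, cov_x, infodict, mesg, ier = leastsq(
    resid, start, Dfun=jac, full_output=True)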
However, when the WLS estimates are provided as starting values for the leastsq method, the output is as follows:
WLS Parameters [ 0.85542169 -0.72289157 1.80722892]
WLS Standard Errors [ 0.69147062 0.85276594 2.04464611]
leastsq Parameters [ 0.85542169 -0.72289157 1.80722892]
leastsq Standard Errors [ 0.69147062 0.85276594 2.04464611]
The output values do match now.
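Continuing the same sketch (reusing y, nobs, x0, x1, resid and jac from above), the second run amounts to fitting WLS first and handing its estimates to leastsq as the starting vector:

wls_res = sm.WLS(y, np.column_stack([np.ones(nobs), x0, x1])).fit()

# use the WLS estimates as starting values for leastsq with the analytical Jacobian
params_lsq, cov_x, infodict, mesg, ier = leastsq(
    resid, wls_res.params, Dfun=jac, full_output=True)

print("WLS Parameters    ", wls_res.params)
print("leastsq Parameters", params_lsq)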
The following points can be inferred:
- leastsq using the LM algorithm gives quite different parameter values than WLS, although the standard errors from the two methods are close. This is the case when the Jacobian is calculated analytically for leastsq; with the numerical derivative Jacobian, the results from WLS and leastsq match exactly. This point needs to be tested against some other results to understand this behaviour.
- The WLS estimates can be provided as start values for fitting with NonlinearLS. This gives a good result, at least in the case of the linear regression model.
Next, we will test the code written so far with some 'real' nonlinear models and compare the results with other statistical packages.