A New Method to Estimate the Parameters of Quadratic Regression

Keywords: Autocorrelation


Introduction
A quadratic regression fits a second-degree polynomial in a single explanatory variable; it can be treated as a linear regression with the derived predictors x and x^2, and more generally as a multiple linear regression with predictors (x_1, x_2, ..., x_n). When more than one predictor is used, the fitted relationship can no longer be visualized as a line in two-dimensional space. However, the coefficients can still be computed simply by expanding the equation for single-predictor linear regression to include a parameter for each of the predictors.
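For concreteness, extending the single-predictor equation to several predictors gives the familiar multiple linear regression form, of which the quadratic case is obtained by taking x_1 = x and x_2 = x^2 (the symbols below are generic notation, not the paper's own numbering):
\[
  y \;=\; b_0 + b_1 x_1 + b_2 x_2 + \cdots + b_n x_n ,
  \qquad\text{quadratic case: } x_1 = x,\; x_2 = x^{2}.
\]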
Although linearity is still the primary model in applications, there has been an increasing number of examples of data that are nonlinear and, in particular, for which a quadratic fit may be more appropriate. The method of Theil [9] can readily be modified for the quadratic case.
Necessary and sufficient conditions for noninferiority due to Kuhn and Tucker are analogous to the classic Kuhn-Tucker conditions for optimality of a scalar optimization problem. The Kuhn-Tucker conditions for noninferiority (KTCN) will be defined in the same spirit as in Cohon and Marks [3] and Cohon [2].
Quadratic regression models play an important role in many fields. The object of this paper is to estimate the parameters of a quadratic regression model by using the Kuhn-Tucker conditions, which minimize the error of the estimated parameters.
The Durbin-Watson statistic is a statistical test used to detect the presence of autocorrelation in the residuals (prediction errors) from a regression analysis [9,10]. Durbin and Watson (1950, 1951) applied this statistic to the residuals from the regression line and developed bounds tests for the null hypothesis that the errors are serially uncorrelated against the alternative that they follow a first-order autoregressive process.
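For reference, with regression residuals e_1, ..., e_T the Durbin-Watson statistic is
\[
  d \;=\; \frac{\sum_{t=2}^{T}\left(e_t - e_{t-1}\right)^{2}}{\sum_{t=1}^{T} e_t^{2}},
\]
where values of d near 2 indicate no first-order autocorrelation and values well below 2 indicate positive autocorrelation.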
A quadratic regression model studied by means of the Kuhn-Tucker conditions can be taken in the form of equation (1.1). The purpose of this article is to develop a new procedure that can always produce regression curve estimators for the quadratic model (1.1) by using the Kuhn-Tucker conditions.
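A standard quadratic specification of this kind, written here with generic coefficients b_0, b_1, b_2 and error term ε_i (the paper's own display (1.1) may differ slightly in notation), is
\[
  y_i \;=\; b_0 + b_1 x_i + b_2 x_i^{2} + \varepsilon_i, \qquad i = 1,\dots,n .
\]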

Problem Formulation
Equation (1.1) can be written in the form of problem (2.1). A feasible solution x ∈ S is said to satisfy the KTCN for a vector optimization problem if:
1. all f_i and g_i are differentiable and S ≠ ∅; and
2. there exist u_i ≥ 0, i = 1, ..., n, with strict inequality holding for at least one i, and v_i ≥ 0, i = 1, ..., n, such that the weighted-gradient (stationarity) and complementary-slackness conditions hold (as in [2,3]).
The Kuhn-Tucker conditions for this problem take the form given in [6,7]. The determination of b_1, b_2 depends on the obtained values of u_i, which can be determined from the solution of problem (2.1) (see [6,7]).
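As a rough numerical illustration only (not the paper's procedure, whose constraint set and multipliers u_i are specified by (2.1)), the sketch below fits the quadratic model by least squares and checks the Kuhn-Tucker stationarity condition in the simplest case, where no constraint is active, all multipliers vanish, and stationarity reduces to the normal equations. The function name fit_quadratic and the synthetic data are illustrative assumptions.

# Minimal sketch (not the paper's procedure): fit y = b0 + b1*x + b2*x^2 by
# least squares and verify Kuhn-Tucker stationarity in the unconstrained case,
# where all multipliers u_i vanish and stationarity means the gradient of the
# residual sum of squares is zero.
import numpy as np

def fit_quadratic(x, y):
    """Return (b0, b1, b2) minimizing sum((y - b0 - b1*x - b2*x**2)**2)."""
    X = np.column_stack([np.ones_like(x), x, x ** 2])  # design matrix [1, x, x^2]
    b, *_ = np.linalg.lstsq(X, y, rcond=None)          # least-squares solution
    grad = -2.0 * X.T @ (y - X @ b)                    # gradient of the RSS at b
    assert np.allclose(grad, 0.0, atol=1e-6)           # stationarity (no active constraints)
    return b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(-3.0, 3.0, 50)
    y = 1.0 + 2.0 * x - 0.5 * x ** 2 + rng.normal(scale=0.1, size=x.size)
    print(fit_quadratic(x, y))   # approximately [1.0, 2.0, -0.5]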

Stability Set of This Problem
Then, it is clear that:

Testing Disturbances
In this study we examine whether the disturbances in the regression model (2.1) are well behaved. As is known, this can also be viewed as a test of whether the higher moments of the dependent variable conform to the assumptions of the model.

Example
In this example we examine whether the disturbances in the regression model (2.1) are well behaved. We apply the Durbin-Watson test of the null hypothesis of no serial correlation (H0: ρ = 0) against the alternative (H1: ρ > 0).
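A direct way to compute the Durbin-Watson statistic from a vector of residuals is sketched below; the function name durbin_watson is illustrative (statsmodels provides an equivalent statsmodels.stats.stattools.durbin_watson).

import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: sum of squared successive differences of the
    residuals divided by their sum of squares."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

Values of d below 2 point toward positive first-order autocorrelation, which is the alternative H1: ρ > 0 considered here.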
The data set of X and Y is given in Table 1.

Comparison of Kuhn-Tucker Estimation and Least Squares Estimation on Quadratic Regression
The data in Table 1 have also been analyzed using the least-squares method.
Using the SPSS program for this example, we obtain the following analysis. The results show that the Durbin-Watson test statistic under the Kuhn-Tucker estimation is smaller than its value under the least-squares estimation.

Conclusion
In this paper, we introduce a new method to estimate the parameters of a quadratic regression model by using the Kuhn-Tucker conditions. According to the Durbin-Watson test, there is positive autocorrelation among the errors of the regression curve, and the comparison indicates that the proposed estimators are suitable for fitting the data.