A new stochastic restricted two-parameter estimator in multiple linear regression model

Abstract: In this paper, we propose a biased estimator, a new stochastic restricted two-parameter estimator (NSRTPE), for the multiple linear regression model to tackle the multicollinearity problem when stochastic restrictions are available. Necessary and sufficient conditions for the superiority of the proposed estimator over the ordinary least squares estimator (OLSE), ridge estimator (RE), Liu estimator (LE), almost unbiased Liu estimator (AULE), modified new two-parameter estimator (MNTPE), mixed estimator (ME), and stochastic restricted Liu estimator (SRLE) are derived in the mean square error matrix (MSEM) criterion. Finally, we demonstrate the superiority of the proposed estimator through a simulation study and a real-world example in the scalar mean square error (SMSE) criterion.


Introduction
We consider the multiple linear regression model

y = Xβ + ϵ, (1.1)

where y is an n × 1 observable random vector, X is an n × p known design matrix of rank p, β is a p × 1 vector of unknown parameters, and ϵ is an n × 1 vector of disturbances with E(ϵ) = 0 and Cov(ϵ) = σ²I. The ordinary least squares estimator (OLSE) for model (1.1) is given by

β̂_OLSE = S^{-1}X′y,

where S = X′X. In the presence of multicollinearity, the ordinary least squares method produces estimates with inflated variances, wide confidence intervals, unreliable tests, and coefficients with incorrect signs. Several researchers have proposed alternative estimators to the OLSE to confront the multicollinearity problem. Hoerl and Kennard (1970) first proposed the ridge estimator (RE); following them, Liu (1993) proposed the Liu estimator (LE), and Akdeniz and Kaçiranlar (1995) the almost unbiased Liu estimator (AULE). Recently, Ahmad and Aslam (2020) proposed the modified new two-parameter estimator (MNTPE).
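As a quick numerical illustration (not part of the original analysis), the following Python/NumPy sketch computes the OLSE and the ridge estimator for a small synthetic dataset with two nearly collinear regressors; all data and parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: two nearly collinear explanatory variables.
n, p = 50, 2
x1 = rng.normal(size=n)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=n)])
beta = np.array([1.0, 1.0])
y = X @ beta + 0.1 * rng.normal(size=n)

S = X.T @ X  # Gram matrix S = X'X

# OLSE: beta_hat = S^{-1} X'y
beta_olse = np.linalg.solve(S, X.T @ y)

# Ridge estimator (Hoerl and Kennard, 1970): beta_hat(k) = (S + kI)^{-1} X'y
k = 0.1
beta_re = np.linalg.solve(S + k * np.eye(p), X.T @ y)

print("OLSE :", beta_olse)
print("RE   :", beta_re)
```

Because the ridge path shrinks toward the origin, the RE always has a Euclidean norm no larger than the OLSE, while the OLSE coefficients can explode under near-collinearity.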
The RE is defined as β̂_RE(k) = (S + kI)^{-1}X′y, k > 0. The bias vector and mean square error matrix (MSEM) of the RE can be obtained as

Bias(β̂_RE(k)) = −k(S + kI)^{-1}β and
MSEM(β̂_RE(k)) = σ²(S + kI)^{-1}S(S + kI)^{-1} + k²(S + kI)^{-1}ββ′(S + kI)^{-1},

respectively.
The LE is defined as β̂_LE(d) = F_d β̂_OLSE with F_d = (S + I)^{-1}(S + dI), 0 < d < 1. Its bias vector and MSEM can be obtained as

Bias(β̂_LE(d)) = (d − 1)(S + I)^{-1}β and
MSEM(β̂_LE(d)) = σ²F_d S^{-1}F_d′ + (d − 1)²(S + I)^{-1}ββ′(S + I)^{-1}.

The AULE proposed by Akdeniz and Kaçiranlar (1995) is defined as β̂_AULE(d) = (I − (1 − d)²(S + I)^{-2})β̂_OLSE. The MNTPE proposed by Ahmad and Aslam (2020) is defined as β̂_MNTPE(k, d) = L_{k,d}β̂_OLSE, where L_{k,d} = (S + I)^{-1}(S + dI)(S + kdI)^{-1}S. The bias vector and MSEM of the MNTPE can be obtained as

Bias(β̂_MNTPE(k, d)) = (L_{k,d} − I)β and
MSEM(β̂_MNTPE(k, d)) = σ²L_{k,d}S^{-1}L_{k,d}′ + (L_{k,d} − I)ββ′(L_{k,d} − I)′. (1.13)

Another method to deal with multicollinearity is to consider parameter estimation with some restrictions on the unknown parameters, which may be exact or stochastic (Rao and Toutenburg, 1995).
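For concreteness, the matrix L_{k,d} can be formed directly. The sketch below (with an illustrative 2 × 2 Gram matrix S) also checks two limiting cases implied by the definition: at k = 0 it reduces to the Liu matrix F_d = (S + I)^{-1}(S + dI), and at d = 1 to the ridge-type matrix (S + kI)^{-1}S.

```python
import numpy as np

def L_kd(S, k, d):
    """MNTPE matrix L_{k,d} = (S + I)^{-1} (S + dI) (S + kdI)^{-1} S."""
    p = S.shape[0]
    I = np.eye(p)
    return np.linalg.solve(S + I, S + d * I) @ np.linalg.solve(S + k * d * I, S)

# Illustrative, nearly singular Gram matrix (strong multicollinearity).
S = np.array([[2.0, 1.9],
              [1.9, 2.0]])
k, d = 0.1, 0.9
L = L_kd(S, k, d)

# Limiting cases:
F_d = np.linalg.solve(S + np.eye(2), S + d * np.eye(2))  # Liu matrix
ridge = np.linalg.solve(S + k * np.eye(2), S)            # ridge-type matrix
```

Since L_{k,d} is a rational function of the symmetric matrix S, its eigenvalues are λ(λ + d) / ((λ + 1)(λ + kd)) for each eigenvalue λ of S, which lie in (0, 1) when 0 < d < 1 and k > 0; this is the shrinkage that tames the multicollinearity.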
In addition to the sample model (1.1), suppose we are given prior information about β in the form of a set of q independent stochastic linear restrictions:

h = Hβ + v, (1.15)
where h is a known stochastic vector, H is a q × p matrix of full row rank q ≤ p with known elements, v is a q × 1 random vector of disturbances with E(v) = 0 and Cov(v) = σ²Ω, and Ω is assumed to be known and positive definite. Moreover, v is assumed to be stochastically independent of ϵ, i.e., E(ϵv′) = 0. Then, by combining the sample model (1.1) and the stochastic restriction (1.15), Theil and Goldberger (1961) proposed the mixed estimator (ME)

β̂_ME = (S + H′Ω^{-1}H)^{-1}(X′y + H′Ω^{-1}h).
The MSEM of the ME can be obtained as MSEM(β̂_ME) = σ²A, where A = (S + H′Ω^{-1}H)^{-1}.
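A minimal sketch of the mixed estimator follows; the design matrix, the restriction matrices H and Ω, and all numerical values are illustrative, not taken from the paper. As a sanity check, a nearly uninformative restriction (very large Ω) makes the ME collapse to the OLSE.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sample model y = X beta + eps.
n, p, q = 100, 3, 1
X = rng.normal(size=(n, p))
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.normal(size=n)

# Hypothetical stochastic restriction h = H beta + v.
H = np.array([[1.0, 1.0, 1.0]])   # q x p, full row rank
Omega = np.eye(q)                 # known positive definite
h = H @ beta + rng.normal(size=q)

def mixed_estimator(X, y, H, h, Omega):
    """Theil-Goldberger ME: (S + H' Omega^{-1} H)^{-1} (X'y + H' Omega^{-1} h)."""
    S = X.T @ X
    Oi = np.linalg.inv(Omega)
    return np.linalg.solve(S + H.T @ Oi @ H, X.T @ y + H.T @ Oi @ h)

beta_me = mixed_estimator(X, y, H, h, Omega)
print("ME:", beta_me)
```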
By replacing the OLSE with the ME in the LE, Hubert and Wijekoon (2006) proposed the stochastic restricted Liu estimator (SRLE)

β̂_SRLE(d) = F_d β̂_ME, where F_d = (S + I)^{-1}(S + dI).
The bias vector and MSEM of the SRLE are given by

Bias(β̂_SRLE(d)) = (d − 1)(S + I)^{-1}β and
MSEM(β̂_SRLE(d)) = σ²F_d A F_d′ + (d − 1)²(S + I)^{-1}ββ′(S + I)^{-1},

respectively.
The hope is that a combination of two different estimators might inherit the advantages of both. Therefore, in this research, we propose a new estimator by combining the ME and the MNTPE.

The Proposed Estimator and Its Stochastic Properties
Following Hubert and Wijekoon (2006), we define the new stochastic restricted two-parameter estimator as

β̂_NSRTPE(k, d) = L_{k,d}β̂_ME.

The bias vector, dispersion matrix, and MSEM of β̂_NSRTPE(k, d) can be obtained as

Bias(β̂_NSRTPE(k, d)) = (L_{k,d} − I)β,
D(β̂_NSRTPE(k, d)) = σ²L_{k,d}AL_{k,d}′, and
MSEM(β̂_NSRTPE(k, d)) = σ²L_{k,d}AL_{k,d}′ + (L_{k,d} − I)ββ′(L_{k,d} − I)′.
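Putting the pieces together, the proposed estimator applies the MNTPE matrix L_{k,d} to the mixed estimator. The sketch below is self-contained, with hypothetical data and restriction throughout; note that at k = 0 and d = 1 the matrix L_{k,d} reduces to the identity, so the NSRTPE coincides with the ME.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data and stochastic restriction.
n, p, q = 80, 3, 1
X = rng.normal(size=(n, p))
beta = np.array([0.5, -1.0, 1.5])
y = X @ beta + rng.normal(size=n)
H = np.array([[1.0, 0.0, 1.0]])
Omega = np.eye(q)
h = H @ beta + rng.normal(size=q)

def nsrtpe(X, y, H, h, Omega, k, d):
    """Proposed NSRTPE(k, d) = L_{k,d} beta_ME (sketch)."""
    p = X.shape[1]
    I = np.eye(p)
    S = X.T @ X
    Oi = np.linalg.inv(Omega)
    # Mixed estimator (Theil and Goldberger, 1961).
    beta_me = np.linalg.solve(S + H.T @ Oi @ H, X.T @ y + H.T @ Oi @ h)
    # MNTPE matrix L_{k,d} = (S + I)^{-1}(S + dI)(S + kdI)^{-1}S.
    L = np.linalg.solve(S + I, S + d * I) @ np.linalg.solve(S + k * d * I, S)
    return L @ beta_me

beta_new = nsrtpe(X, y, H, h, Omega, k=0.05, d=0.9)
print("NSRTPE:", beta_new)
```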

Mean Square Error Matrix Comparison
This section compares the performance of the proposed estimator with OLSE, ME, RE, LE, AULE, SRLE, and MNTPE.

Vavuniya Journal of Science
Arumairajan and Kayathiri, 2022

MSEM comparison between RE and NSRTPE
In this section, the RE and the NSRTPE are compared. The MSEM difference is

Δ₁ = MSEM(β̂_RE(k)) − MSEM(β̂_NSRTPE(k, d)) = D₁ + b₁b₁′ − b₂b₂′,

where D₁ = σ²[(S + kI)^{-1}S(S + kI)^{-1} − L_{k,d}AL_{k,d}′], b₁ = −k(S + kI)^{-1}β, and b₂ = (L_{k,d} − I)β. Now the following theorem can be stated.

Theorem 3.1: If λ_max(L_{k,d}AL_{k,d}′[(S + kI)^{-1}S(S + kI)^{-1}]^{-1}) < 1, the estimator NSRTPE is superior to the RE in the mean squared error matrix sense if and only if b₂′(D₁ + b₁b₁′)^{-1}b₂ ≤ 1.

Proof: One can clearly see that the matrix σ²(S + kI)^{-1}S(S + kI)^{-1} is positive definite. According to Lemma 1 (Appendix), the matrix D₁ is positive definite if and only if λ_max(L_{k,d}AL_{k,d}′[(S + kI)^{-1}S(S + kI)^{-1}]^{-1}) < 1. Then, based on Lemma 2 (Appendix), Δ₁ ≥ 0 if and only if b₂′(D₁ + b₁b₁′)^{-1}b₂ ≤ 1. This completes the proof.

MSEM comparison between LE and NSRTPE
In order to compare the LE and the NSRTPE in terms of the MSEM criterion, we investigate the difference Δ₂ = MSEM(β̂_LE(d)) − MSEM(β̂_NSRTPE(k, d)).
Now, one can state the following theorem.

Theorem 3.2
The NSRTPE is superior to the LE in the mean squared error matrix sense if and only if b₂′(D₂ + b₁b₁′)^{-1}b₂ ≤ 1, where D₂ = σ²[F_d S^{-1}F_d′ − L_{k,d}AL_{k,d}′], b₁ = (d − 1)(S + I)^{-1}β, and b₂ = (L_{k,d} − I)β.

Proof: Let us consider the difference Δ₂ = MSEM(β̂_LE(d)) − MSEM(β̂_NSRTPE(k, d)) = D₂ + b₁b₁′ − b₂b₂′. Since D₂ + b₁b₁′ is positive definite, by Lemma 2 (Appendix), Δ₂ ≥ 0 if and only if b₂′(D₂ + b₁b₁′)^{-1}b₂ ≤ 1. This completes the proof.

MSEM comparison between AULE and NSRTPE
In this section, the AULE and the NSRTPE are compared. The MSEM difference is

Δ₃ = MSEM(β̂_AULE(d)) − MSEM(β̂_NSRTPE(k, d)) = D₃ + b₁b₁′ − b₂b₂′,

where D₃ = σ²[T_d S^{-1}T_d′ − L_{k,d}AL_{k,d}′] with T_d = I − (1 − d)²(S + I)^{-2}, b₁ = −(1 − d)²(S + I)^{-2}β, and b₂ = (L_{k,d} − I)β. Now the following theorem can be stated.

Theorem 3.3: If λ_max(L_{k,d}AL_{k,d}′[T_d S^{-1}T_d′]^{-1}) < 1, the estimator NSRTPE is superior to the AULE in the mean squared error matrix sense if and only if b₂′(D₃ + b₁b₁′)^{-1}b₂ ≤ 1.

Proof: One can see that the matrix σ²L_{k,d}AL_{k,d}′ is positive definite. According to Lemma 1, the matrix D₃ is positive definite if and only if the stated eigenvalue condition holds. The result then follows from Lemma 2. This completes the proof.

MSEM comparison between SRLE and NSRTPE
We consider the MSEM difference between the SRLE and the NSRTPE as Δ₄ = MSEM(β̂_SRLE(d)) − MSEM(β̂_NSRTPE(k, d)). Now, the following theorem can be stated.

Theorem 3.4:
The NSRTPE is superior to the SRLE in the mean squared error matrix sense if and only if b₂′(D₄ + b₁b₁′)^{-1}b₂ ≤ 1, where D₄ = σ²[F_d A F_d′ − L_{k,d}AL_{k,d}′], b₁ = (d − 1)(S + I)^{-1}β, and b₂ = (L_{k,d} − I)β.

Proof: Since D₄ + b₁b₁′ is positive definite, according to Lemma 2, Δ₄ ≥ 0 if and only if b₂′(D₄ + b₁b₁′)^{-1}b₂ ≤ 1. This completes the proof.

MSEM comparison between MNTPE and NSRTPE
We consider the MSEM difference between the MNTPE and the NSRTPE. Since both estimators have the same bias vector (L_{k,d} − I)β, the difference reduces to

MSEM(β̂_MNTPE(k, d)) − MSEM(β̂_NSRTPE(k, d)) = σ²L_{k,d}(S^{-1} − A)L_{k,d}′ ≥ 0,

because S^{-1} − A = S^{-1} − (S + H′Ω^{-1}H)^{-1} is nonnegative definite. This means that the estimator NSRTPE is always superior to the MNTPE in the mean squared error matrix sense.

MSEM comparison between OLSE and NSRTPE
In this subsection, the estimator NSRTPE will be compared with the OLSE. We consider the MSEM difference Δ₅ = MSEM(β̂_OLSE) − MSEM(β̂_NSRTPE(k, d)). Now, the following theorem can be stated.

Theorem 3.5
The OLSE is superior to the NSRTPE in the mean square error matrix sense if and only if b₂′D₅^{-1}b₂ ≥ 1, where D₅ = σ²(S^{-1} − L_{k,d}AL_{k,d}′) and b₂ = (L_{k,d} − I)β.

Proof: To prove this theorem, we first show that the matrix D₅ is positive definite. Since the matrix S is positive definite, there exists an orthogonal matrix P and a positive definite diagonal matrix Λ such that S = PΛP′. Using this decomposition, one can show that D₅ is positive definite. Now, according to Lemma 3, the OLSE is superior to the NSRTPE in the mean square error matrix sense if and only if b₂′D₅^{-1}b₂ ≥ 1. This completes the proof.

MSEM comparison between ME and NSRTPE
We consider the MSEM difference between the ME and the NSRTPE as Δ₆ = MSEM(β̂_ME) − MSEM(β̂_NSRTPE(k, d)). Now, one can state the following theorem.
Theorem 3.6: The ME is superior to the NSRTPE in the mean square error matrix sense if and only if b₂′D₆^{-1}b₂ ≥ 1, where D₆ = σ²(A − L_{k,d}AL_{k,d}′) and b₂ = (L_{k,d} − I)β.

Proof: Put B = P′AP, with A as defined in Section 1 and P the orthogonal matrix diagonalizing S. The matrix A is nonnegative definite, and hence so is B; the diagonal elements of B are therefore all nonnegative. Using this, one can show that the matrix D₆ is positive definite. The result then follows from Lemma 3. This completes the proof.

Real-World Example
We consider the following stochastic restriction according to Yildiz (2019). The scalar mean square error (SMSE) of the OLSE was obtained as 0.0808. From Tables 1-4, we can see that the NSRTPE outperforms the other estimators when the parameter k is relatively small and the parameter d is large. However, the NSRTPE is worse than some existing estimators when d is small.

Simulation Study
To further illustrate the behavior of the proposed estimator, we perform a Monte Carlo simulation study by considering different levels of multicollinearity. Following McDonald and Galarneau (1975), we generate the explanatory variables as

x_ij = (1 − γ²)^{1/2} z_ij + γ z_{i(p+1)}, i = 1, 2, ..., n, j = 1, 2, ..., p,

where the z_ij are independent standard normal pseudo-random numbers and γ is specified so that the theoretical correlation between any two explanatory variables is γ².
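The generation scheme above can be sketched as follows (a minimal illustration, not the paper's code); with a large n, the sample correlation between any two columns should be close to γ².

```python
import numpy as np

def generate_X(n, p, gamma, rng):
    """McDonald-Galarneau (1975) design:
    x_ij = sqrt(1 - gamma^2) * z_ij + gamma * z_{i,p+1},
    giving theoretical pairwise correlation gamma^2."""
    Z = rng.normal(size=(n, p + 1))
    return np.sqrt(1.0 - gamma**2) * Z[:, :p] + gamma * Z[:, [p]]

rng = np.random.default_rng(3)
X = generate_X(200_000, 4, 0.9, rng)
print(np.corrcoef(X, rowvar=False).round(3))
```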
The dependent variable is then generated as

y_i = β₁x_i1 + β₂x_i2 + ... + β_p x_ip + ϵ_i, i = 1, 2, ..., n,

where the ϵ_i are independent normal pseudo-random numbers with mean zero and variance σ².
Newhouse and Oman (1971) have noted that if the MSEM is a function of σ² and β, and if the explanatory variables are fixed, then, subject to the constraint β′β = 1, the MSEM is minimized when β is the normalized eigenvector corresponding to the largest eigenvalue of the X′X matrix. In this study, we choose that eigenvector as the coefficient vector, with n = 100, p = 4, and σ² = 1. Three different levels of correlation are considered by selecting γ = 0.7, 0.8, and 0.9. The estimated SMSE of the OLSE at γ = 0.7, 0.8, and 0.9 is 0.0622, 0.0853, and 0.1572, respectively. In the simulation study, we used the same stochastic restriction as in Section 4.1.
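The estimated SMSE values reported in the tables can be reproduced in outline as follows. This sketch uses an illustrative random design and fewer replications than a full study; it estimates the SMSE of the OLSE by Monte Carlo and checks it against the theoretical value σ² tr(S^{-1}).

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative fixed design, n = 100, p = 4.
n, p, sigma = 100, 4, 1.0
X = rng.normal(size=(n, p))
S = X.T @ X

# Newhouse-Oman choice: normalized eigenvector of X'X for the largest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(S)
beta = eigvecs[:, -1]  # unit norm by construction

def estimated_smse(estimator, reps=2000):
    """Monte Carlo estimate of the scalar MSE, E ||beta_hat - beta||^2."""
    total = 0.0
    for _ in range(reps):
        y = X @ beta + sigma * rng.normal(size=n)
        total += np.sum((estimator(X, y) - beta) ** 2)
    return total / reps

olse = lambda X, y: np.linalg.solve(X.T @ X, X.T @ y)
smse_olse = estimated_smse(olse)
print("estimated SMSE:", smse_olse)
print("theoretical   :", sigma**2 * np.trace(np.linalg.inv(S)))
```

The same `estimated_smse` loop can be run with any of the competing estimators in place of `olse` to fill a column of the comparison tables.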
Tables 5-16 (see Appendix) report the estimated scalar mean square error (SMSE) values of the RE, LE, AULE, MNTPE, ME, SRLE, and NSRTPE for different d and k values at the three levels of correlation, γ = 0.7, 0.8, and 0.9. According to Table 7, the proposed estimator has a smaller SMSE than the other estimators for k = 0.01 and 0.005 when d = 0.9 and γ = 0.7. Also, from Table 8, the NSRTPE has the smallest SMSE for k ≤ 0.03 when d = 0.99 and γ = 0.7. From Table 11, the NSRTPE has the smallest SMSE for k = 0.01 and 0.005 when d = 0.9 and γ = 0.8. Moreover, Table 12 shows that the NSRTPE performs better for k ≤ 0.04 when d = 0.99 and γ = 0.8. Based on Table 15, the proposed estimator has lower SMSE values than the OLSE, RE, LE, AULE, MNTPE, ME, and SRLE for k = 0.01 and k = 0.05 when d = 0.9 and γ = 0.9. Moreover, Table 16 shows that the proposed estimator has the smallest SMSE for k ≤ 0.05 when d = 0.99 and γ = 0.9. Furthermore, it can be observed in all tables that the NSRTPE is always superior to the MNTPE. Nevertheless, the NSRTPE is worse than some existing estimators when d is small.

Conclusion
A new biased estimator has been proposed for estimating the parameters of the multiple linear regression model under multicollinearity when stochastic restrictions are available. Moreover, necessary and sufficient conditions for the superiority of the proposed estimator over the OLSE, RE, LE, AULE, MNTPE, ME, and SRLE in the MSEM sense have been discussed. Finally, we illustrated our findings with a real-world example and a Monte Carlo simulation study. From these results, it can be concluded that the proposed estimator performs well compared with the others when the shrinkage parameters k and d are relatively small and large, respectively.

Table 4: The estimated SMSE of RE, LE, AULE, MNTPE, ME, SRLE and NSRTPE when d = 0.99.

References
McDonald, G.C. and Galarneau, D.I., 1975. A Monte Carlo evaluation of some ridge-type estimators. Journal of the American Statistical Association, 70(350), 407-416.