Rao score test

From Scholarpedia
Calyampudi Radhakrishna Rao (2009), Scholarpedia, 4(10):8220. doi:10.4249/scholarpedia.8220, revision #121946

Curator: Calyampudi Radhakrishna Rao

Rao's score test is an alternative to the likelihood ratio and Wald tests. These three tests are referred to in the statistical literature on testing of hypotheses as the Holy Trinity. All three are equivalent to the first order of asymptotics, but differ to some extent in second-order properties. None is uniformly superior to the others. Some related tests are Neyman's C(α) test and the Neyman–Rao test.

Definition

Let \(X = (x_1,\ldots, x_n)\) be an iid sample from a probability density function \(p(x,\theta)\), where \(\theta\) is an \(r\)-vector parameter. Let

\[ P(X,\theta) = p(x_1,\theta) \ldots p(x_n,\theta) \]


The score vector of Fisher is \[ S(\theta) = \left[ s_1(\theta), \ldots, s_r(\theta) \right]', \quad s_j(\theta) = \frac{1}{P}\frac{\partial P}{\partial \theta_j}, \quad j = 1, \ldots, r. \]

The Fisher information matrix of order \(r \times r\) is defined by

\[ I(\theta) =( i_{jk} (\theta)),\quad i_{jk}(\theta) = E ( s_j(\theta) s_k(\theta)). \]

The Rao score (RS) test for a simple hypothesis \(H_0: \theta = \theta_0\), introduced in Rao (1948), is \[\tag{1} RSS = S (\theta_0)^\prime \left[I (\theta_0)\right]^{-1} S(\theta_0) \]


which has an asymptotic chi-square distribution on \(r\) degrees of freedom. Test (1) uses only \(\theta_0\), the null value of \(\theta\), unlike the Wald test, which requires the unrestricted maximum likelihood estimate of \(\theta\).
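As an illustration of statistic (1), the following Python sketch computes the score test for a one-parameter Poisson model with \(H_0: \theta = \theta_0\). The Poisson model, the simulated data and the helper name rao_score_statistic_poisson are assumptions made here for illustration; they are not part of the original development.

    # Score test (1) for a Poisson(theta) model with H0: theta = theta_0.
    # Illustrative sketch; the Poisson choice is an assumption, not from the article.
    import numpy as np
    from scipy import stats

    def rao_score_statistic_poisson(x, theta0):
        """Score statistic (1) for H0: theta = theta0 under a Poisson(theta) model."""
        n = len(x)
        # Score S(theta0): sum_i d/dtheta log p(x_i, theta) at theta0 = (sum x_i - n*theta0)/theta0
        score = (x.sum() - n * theta0) / theta0
        # Fisher information of the whole sample at theta0: n / theta0
        info = n / theta0
        return score**2 / info            # statistic (1) with r = 1

    rng = np.random.default_rng(0)
    x = rng.poisson(lam=2.3, size=200)    # data simulated away from the null value
    rss = rao_score_statistic_poisson(x, theta0=2.0)
    p_value = stats.chi2.sf(rss, df=1)    # asymptotic chi-square with r = 1 degree of freedom
    print(f"RSS = {rss:.3f}, p-value = {p_value:.4f}")

Note that, as remarked above, only the null value \(\theta_0\) enters the computation; no maximum likelihood estimate is required.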

Consider the composite hypothesis \[ H_0 : H(\theta) = C, \] where \[\tag{2} H(\theta)^\prime = (h_1(\theta),\ldots, h_t(\theta) ), \quad C^\prime=(c_1, \ldots, c_t), \quad t \leq r, \]


\( h_1,\ldots,h_t \) are given functions and \( c_1, \ldots,c_t \) are given constants. Let \(\hat{\theta}\) be the maximum likelihood estimate (mle) of \(\theta\) under the restriction (2). The RS test for the composite hypothesis (2) is \[ RSC = S (\hat{\theta})^\prime [I (\hat{\theta})]^{-1} S (\hat{\theta}), \] which has an asymptotic chi-square distribution on \(t\) degrees of freedom.

An alternative way of expressing the RSC is as follows. Note that the restricted mle \(\hat{\theta}\) is a solution of \[\tag{3} S (\theta) + [ G(\theta)]^\prime \lambda = 0, \quad H (\theta) = C \]


where \( G(\theta)=((\partial h_i/\partial \theta_j )) \) is the \(t \times r\) matrix of derivatives and \(\lambda\) is a \(t\)-vector of Lagrangian multipliers, so that \( [S (\hat{\theta}) ]^\prime = -\lambda^\prime G(\hat{\theta})\). Substituting this relation into the expression for RSC, we have \[\tag{4} RSC =\lambda^\prime[ A(\hat{\theta} )]\lambda \]

where \[ A(\theta) = G (\theta) [I (\theta)]^{-1}[ G (\theta)]^\prime \]
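The agreement between RSC and the Lagrangian multiplier form (4) can be checked numerically. The sketch below is a minimal illustration assuming a \(N(\mu, \sigma^2)\) model with the single restriction \(h(\theta) = \mu - \mu_0 = 0\) (so \(r = 2\), \(t = 1\)); the normal model and the helper name rsc_and_lm_forms are illustrative assumptions, not part of the original article.

    # RSC and the Lagrangian multiplier form (4) for H0: mu = mu0 in N(mu, sigma^2),
    # with sigma^2 unrestricted. Illustrative sketch; the normal model is an assumption.
    import numpy as np
    from scipy import stats

    def rsc_and_lm_forms(x, mu0):
        """Return (RSC, lambda' A lambda) for the restriction h(theta) = mu - mu0 = 0."""
        n = len(x)
        sigma2 = np.mean((x - mu0)**2)            # restricted mle of sigma^2 under H0
        # Score vector S(theta_hat) = (dl/dmu, dl/dsigma^2) at the restricted mle
        s_mu = n * (x.mean() - mu0) / sigma2
        s_sig = -n / (2 * sigma2) + np.sum((x - mu0)**2) / (2 * sigma2**2)   # = 0 at the restricted mle
        S = np.array([s_mu, s_sig])
        I = np.diag([n / sigma2, n / (2 * sigma2**2)])   # Fisher information of the sample
        rsc = S @ np.linalg.inv(I) @ S                   # RSC = S' I^{-1} S
        # Form (4): G = dh/dtheta = [1, 0]; lambda solves S + G' lambda = 0
        G = np.array([[1.0, 0.0]])
        lam = np.array([-s_mu])
        A = G @ np.linalg.inv(I) @ G.T                   # A(theta) = G I^{-1} G'
        lm = float(lam @ A @ lam)
        return rsc, lm

    rng = np.random.default_rng(1)
    x = rng.normal(loc=0.4, scale=1.0, size=100)
    rsc, lm = rsc_and_lm_forms(x, mu0=0.0)
    print(f"RSC = {rsc:.4f}  lambda'A lambda = {lm:.4f}")    # the two forms coincide
    print(f"p-value = {stats.chi2.sf(rsc, df=1):.4f}")       # chi-square with t = 1 df

Since the restriction fixes one coordinate of \(\theta\) and leaves the other free, this example is also an instance of the special case (5) considered below.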

Silvey (1959) expressed RSC in the form (4) and called it the Lagrangian Multiplier (LM) test. Neyman (1979) considered the special case of a composite hypothesis \[\tag{5} H : \theta_1 = \theta_{10}, \quad \theta_2, \ldots, \theta_r. \]

where \(\theta_{10}\) is given and the rest are arbitrary.

The RSC for (5) is known as Neyman's \(C(\alpha)\) test in the statistical literature. Hall and Mathiason (1990) considered a more general composite hypothesis of the form \[\tag{6} H : \theta_1 = \theta_{10},\ldots, \theta_q = \theta_{q0}, \quad \theta_{q+1}, \ldots, \theta_r. \]

where \(\theta_{10},\ldots, \theta_{q0} \) are all given and the rest are arbitrary.

The RSC for (6) is termed the Neyman–Rao test by them.
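A minimal sketch of the RSC for the partitioned hypothesis (6) is given below, assuming, purely for illustration, a trivariate normal model with identity covariance, \(\theta\) the mean vector, and \(q = 2\) of the \(r = 3\) components fixed under the null; the helper name neyman_rao_statistic is hypothetical.

    # Sketch of the RSC for hypothesis (6): theta_1 = theta_10, theta_2 = theta_20,
    # theta_3 free, in a N(theta, I_3) model. Illustrative assumption, not from the article.
    import numpy as np
    from scipy import stats

    def neyman_rao_statistic(x, theta10, theta20):
        """RSC for H0: theta_1 = theta10, theta_2 = theta20 with theta_3 unrestricted."""
        n = x.shape[0]
        # Restricted mle: fixed components at their null values, free component at its sample mean
        theta_hat = np.array([theta10, theta20, x[:, 2].mean()])
        S = n * (x.mean(axis=0) - theta_hat)      # score vector; its third component is zero
        I = n * np.eye(3)                         # Fisher information for unit variances
        return S @ np.linalg.inv(I) @ S

    rng = np.random.default_rng(2)
    x = rng.normal(loc=[0.3, -0.1, 1.0], scale=1.0, size=(150, 3))
    rsc = neyman_rao_statistic(x, theta10=0.0, theta20=0.0)
    print(f"RSC = {rsc:.3f}, p-value = {stats.chi2.sf(rsc, df=2):.4f}")   # q = 2 degrees of freedom

Only the scores of the \(q\) restricted components contribute, since the scores of the free components vanish at the restricted mle.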

History

In the early years of my appointment at the Indian Statistical Institute, I had the opportunity of interacting with the staff and research scholars and discussing with them new problems in statistics arising out of consultation work. One of the scholars, S.J. Poti, asked me about testing a simple hypothesis \( H : \theta = \theta_0 \) concerning a single parameter \(\theta\) when there is some prior information about the alternative, such as \(\theta > \theta_0\). I suggested a procedure by which local power on the right side of \(\theta_0\) is maximized, leading to a test of the form \(P^\prime(\theta)/ P (\theta) > \lambda \), where \(P^\prime/ P \) is Fisher's score. The result was published in Rao and Poti (1946). They also proposed a general test of the form \((P^\prime/ P)^2 > \lambda \), which is likely to have good local power on either side of \(\theta_0\). Two years later, I was working on a problem at Cambridge University, UK, which involved testing of simple and composite hypotheses concerning multiple parameters when there is information that alternatives are close to those specified by the null hypothesis. This led to combining the individual test criteria based on Fisher's scores \(s_1(\theta), \ldots, s_r(\theta) \) into a single criterion. Following methods used in multivariate analysis, I arrived at statistics of the form (1) and (2), which were approved by R.A. Fisher, my thesis advisor while I was at Cambridge University during 1946-48. The paper (Rao, 1948), containing the general discussion of score tests, was published in the Proceedings of the Cambridge Philosophical Society.

Applications

A comprehensive account of the applications of RS tests in econometrics is given in Godfrey (1988). Some applications to statistical inference are given in Bera and Jarque (1981), Breusch and Pagan (1979), Breusch (1978), Godfrey (1978a,b), Byron (1968), Bera and Ullah (1991), and in a series of papers in Vol. 97, pp. 1-200, 2001, of the Journal of Statistical Planning and Inference. Comparisons with the likelihood ratio and Wald tests in terms of power and other properties are summarized in Rao (2005), which also gives references to some key papers and recent developments on RS tests, suggesting modifications in particular applications.

References

  • Bera, A.K. and Jarque, C.M. (1981), Working Papers in Economics and Econometrics 40, Australian National University.
  • Bera, A.K. and Ullah, A. (1991), J. Quantitative Economics 7, 189-220.
  • Bera, A.K. and Bilias, Y. (2001), J. Statistical Planning and Inference 97, 9-44.
  • Breusch, T.S. (1978), Australian Economic Papers 17, 334-355.
  • Breusch, T.S. and Pagan, A.R. (1979), Econometrica 47, 1287-1294.
  • Byron, R.P. (1968), Australian Economic Papers 7, 227-248.
  • Godfrey, L.G. (1978a), Econometrica 46, 227-236.
  • Godfrey, L.G. (1978b), Econometrica 46, 1303-1310.
  • Godfrey, L.G. (1988), Misspecification Tests in Econometrics, Cambridge University Press.
  • Hall, H.J. and Mathiason, D.J. (1990), Int. Statistical Review 58, 77-99.
  • Neyman, J. (1979), Sankhya A 41, 1-21.
  • Rao, C.R. (1948), Proc. Camb. Philos. Soc. 44, 50-57.
  • Rao, C.R. (2005), in Advances in Ranking and Selection, Multiple Comparisons and Reliability, Birkhauser, 3-20.

Recommended reading

  • Bing Li (2001), J. Statistical Planning and Inference 97, 57-66.
  • Chandra, T.K. and Joshi, S.N. (1983), Sankhya A 45, 226-246.
  • Ghosh, J.K. and Mukherjee, R. (2001), J. Statistical Planning and Inference 97, 45-55.
  • Taniguchi, M. (2001), J. Statistical Planning and Inference 97, 191-200.

See also

Cramér-Rao bound, Rao-Blackwell theorem, Fisher-Rao Metric, Second Order Efficiency.
