Tests of Independence in Parametric Models With Applications and Illustrations

A. Colin Cameron, Pravin K. Trivedi
1993, Journal of Business & Economic Statistics
Tests of independence between variables in a wide variety of discrete and continuous bivariate and multivariate regression equations are derived using results from the theory of series expansions of joint distributions in terms of marginal distributions and their related orthonormal polynomials. The tests are conditional moment tests based on covariances between pairs of orthonormal polynomials. Examples include tests of serial independence against bilinear and/or ARCH alternatives, tests of dependence in the multivariate normal regression model, and tests of dependence in count data models. A Monte Carlo simulation based on bivariate count models is used to evaluate the size and power properties of the proposed tests. A multivariate count data model for Australian health care utilization data is used to empirically illustrate the tests.

Some Key Words: SERIES EXPANSIONS; ORTHOGONAL POLYNOMIALS; SCORE TEST; DYNAMIC INFORMATION MATRIX TEST; ARCH AND BILINEAR MODELS; COUNT DATA.

see Brock et al. (1991), Hsieh (1989), and Robinson (1991). But for cross-sectional work there is a relative dearth of tests. A general framework that considers both time-series and cross-section data is desirable.

This paper develops score-type tests of independence based on a series expansion of the unknown joint pdf of the observations. This is simpler than the alternative approach of writing down the joint density explicitly and deriving score tests of independence, because in some non-Gaussian situations a flexible specification of the joint density is often not readily available. This also makes the construction of Wald and likelihood ratio tests difficult and partly explains the relative infrequency with which such tests are developed or used. By contrast, the approach of this paper requires the specification of the univariate marginals, which are then used to form an approximation to the joint distribution. Given correct specification of the marginals, the validity of the resulting independence tests does not depend on the adequacy of this approximation, though the power of the tests will.

A general framework for testing dependence must address the following problem: except in special cases, tests of independence involve, in principle, an infinite number of restrictions. So an approach is required that will either test a smaller subset of these restrictions, or test the restrictions through one or more parameters in the joint distribution.
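The series-expansion idea can be written compactly. In a standard rendering of Lancaster's characterization (the notation below is illustrative, not copied from the paper), a bivariate density with marginals f_X and f_Y and associated orthonormal polynomial systems {P_j} and {Q_k} admits the expansion

```latex
\[
f(x, y) \;=\; f_X(x)\, f_Y(y) \Bigl[ 1 + \sum_{j \ge 1} \sum_{k \ge 1} \rho_{jk}\, P_j(x)\, Q_k(y) \Bigr],
\qquad \rho_{jk} = \mathbb{E}\bigl[ P_j(X)\, Q_k(Y) \bigr].
\]
```

Independence corresponds to ρ_jk = 0 for all j and k, so a test of independence can be based on the sample analogues of the low-order covariances ρ_jk, which is the conditional moment strategy described above.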
How to derive and justify such restrictions is an important issue that is addressed by the general method of testing for independence between random variables considered here. It is based on a characterization of bivariate and multivariate distributions introduced by Lancaster (1958) and subsequently elaborated and extended in Lancaster (1963, 1969) and Eagleson (1964). Infinite series expansions for the bivariate or multivariate joint distributions are constructed using the univariate marginal distributions and their associated orthonormal polynomials. The tests are conditional moment tests based on low-order terms in the series expansion.

A brief comparison of the approach of this paper with other approaches in the literature may provide an improved perspective. In econometrics, tests of dependence are most highly developed in the context of time series. Serial correlation tests are the most common, but the literature also considers nonlinear dependence of other types; for example, bilinear and ARCH dependence (Granger and Andersen (1978), Engle (1982), Weiss (1986)), and ARCH-M dependence (Engle et al. (1987)). Much of this work is restricted by the assumption of conditionally Gaussian or symmetrically distributed errors. More recently, a nonparametric approach to testing for nonlinear time-series dependence using the correlation integral has been investigated and applied by Brock with a number of co-authors; see Brock, Hsieh and LeBaron (1991) and Brock and Potter (1991). Robinson (1991) has also proposed a nonparametric test of independence of y_t and y_{t-1} for a stationary process {y_t}, based on the Kullback-Leibler entropy measure of the difference between the joint distribution and the product of the two marginals. By contrast, tests of
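To make the conditional moment idea concrete, the following is a minimal sketch (not the paper's implementation) for the special case of two standard normal marginals, whose associated orthonormal polynomials are the normalized Hermite polynomials. Under independence, every product P_r(X)Q_s(Y) of polynomials of order at least one has mean zero, so a Wald-type statistic built from a few low-order sample covariances is asymptotically chi-squared with one degree of freedom per moment tested. The function names and the choice of tested pairs are illustrative assumptions.

```python
import numpy as np
from scipy import stats


def hermite_orthonormal(x, order):
    """Low-order orthonormal Hermite polynomials for a standard normal marginal.

    He_1(x) = x, He_2(x) = (x^2 - 1)/sqrt(2), He_3(x) = (x^3 - 3x)/sqrt(6).
    """
    polys = {
        1: x,
        2: (x**2 - 1) / np.sqrt(2.0),
        3: (x**3 - 3.0 * x) / np.sqrt(6.0),
    }
    return polys[order]


def independence_test(x, y, orders=((1, 1), (2, 2))):
    """Conditional moment test of independence.

    Under independence, E[P_r(X) Q_s(Y)] = 0 for every pair (r, s) with
    r, s >= 1; here we test the low-order pairs listed in `orders`.
    Returns the chi-squared statistic and its asymptotic p-value.
    """
    n = len(x)
    # One column per tested moment: P_r(x_i) * Q_s(y_i).
    m = np.column_stack(
        [hermite_orthonormal(x, r) * hermite_orthonormal(y, s) for r, s in orders]
    )
    mbar = m.mean(axis=0)                      # sample moment vector
    V = np.cov(m, rowvar=False) / n            # estimated covariance of mbar
    stat = float(mbar @ np.linalg.solve(V, mbar))
    pval = float(stats.chi2.sf(stat, df=len(orders)))
    return stat, pval
```

The degrees of freedom equal the number of polynomial pairs tested, which is exactly the "smaller subset of restrictions" strategy described above: the full null involves infinitely many zero covariances, and the test examines only the leading terms of the expansion.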
doi:10.1080/07350015.1993.10509931