Paper title
Efficient closed-form estimation of large spatial autoregressions
Paper authors
Paper abstract
Newton-step approximations to pseudo maximum likelihood estimates of spatial autoregressive models with a large number of parameters are examined, in the sense that the parameter space grows slowly as a function of sample size. These have the same asymptotic efficiency properties as maximum likelihood under Gaussianity but are of closed form. Hence they are computationally simple and free from compactness assumptions, thereby avoiding two notorious pitfalls of implicitly defined estimates of large spatial autoregressions. For an initial least squares estimate, the Newton step can also lead to weaker regularity conditions for a central limit theorem than those extant in the literature. A simulation study demonstrates excellent finite sample gains from Newton iterations, especially in large multiparameter models for which grid search is costly. A small empirical illustration shows improvements in estimation precision with real data.
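As a minimal illustrative sketch (the notation below is my own and is not taken from the paper): writing $\ell(\theta)$ for the Gaussian pseudo log-likelihood and $\hat\theta_0$ for an initial closed-form estimate such as least squares, a one-step Newton estimator of the kind described above has the form

\[
\hat\theta_1 \;=\; \hat\theta_0 \;-\; \left[\frac{\partial^2 \ell(\hat\theta_0)}{\partial\theta\,\partial\theta^{\prime}}\right]^{-1} \frac{\partial \ell(\hat\theta_0)}{\partial\theta} .
\]

This update is of closed form given $\hat\theta_0$, and by the standard one-step argument it inherits the first-order asymptotic properties of the pseudo maximum likelihood estimate whenever $\hat\theta_0$ is consistent at a suitable rate.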