### Thesis 93225023: Record Details


Statistical Inference in Time Series Models


This dissertation consists of two articles that address two important topics in time series. The first article concerns the problem of forecasting a non-negative first-order autoregressive (AR(1)) process. In both the stationary and unit root cases, we establish moment bounds and limiting distributions for an extreme value estimator of the AR coefficient. These results enable us to derive an asymptotic expression for the mean squared prediction error (MSPE) of the minimum ratio predictor,
which is constructed from the extreme value estimator. Based on this expression, we compare the minimum ratio predictor with the least squares predictor in terms of MSPE. The comparison reveals that which of the two predictors is better depends not only on whether a unit root exists, but also on the behavior of the underlying error distribution near the origin, and hence is difficult to identify in practice. To circumvent this difficulty, we suggest choosing the predictor with the smaller accumulated prediction error (APE) and show that the predictor chosen in this way is asymptotically equivalent to the better one.
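As an illustrative sketch only (the exact predictor forms and the APE definition in the thesis may differ), the extreme value estimator, a least squares fit, and an APE-based selection rule for a non-negative AR(1) process can be prototyped as follows; the simulation setup (exponential innovations, rho = 0.6) is an assumption chosen for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a non-negative AR(1): x_t = rho * x_{t-1} + e_t with e_t >= 0.
# Exponential innovations and rho = 0.6 are illustrative assumptions.
rho, n = 0.6, 500
e = rng.exponential(scale=1.0, size=n + 1)
x = np.empty(n + 1)
x[0] = e[0]
for t in range(1, n + 1):
    x[t] = rho * x[t - 1] + e[t]

def rho_ev(x):
    """Extreme value estimator: the minimum of successive ratios.

    Since x_t / x_{t-1} = rho + e_t / x_{t-1} >= rho, the minimum
    ratio never falls below the true coefficient."""
    return np.min(x[1:] / x[:-1])

def ls_fit(x):
    """Least squares fit of x_t on x_{t-1} (with an intercept, since
    e_t has positive mean); returns (intercept, slope)."""
    slope, intercept = np.polyfit(x[:-1], x[1:], 1)
    return intercept, slope

def pred_mr(hist):
    """Minimum ratio predictor of the next observation."""
    return rho_ev(hist) * hist[-1]

def pred_ls(hist):
    """Least squares predictor of the next observation."""
    a, b = ls_fit(hist)
    return a + b * hist[-1]

def ape(x, predict, burn=20):
    """Accumulated (squared) one-step prediction error, built sequentially:
    at each time t the predictor is refit on the data observed so far."""
    return sum((x[t + 1] - predict(x[: t + 1])) ** 2
               for t in range(burn, len(x) - 1))

# Selection rule: keep whichever predictor has the smaller APE.
chosen = pred_mr if ape(x, pred_mr) < ape(x, pred_ls) else pred_ls
print(rho_ev(x), ls_fit(x)[1], chosen.__name__)
```

The APE rule lets the data pick the predictor sequentially, mirroring the idea that the better predictor depends on unobservable features of the error distribution.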

The second article provides a method for estimating the coefficients of a linear regression model with serially correlated errors. The main aim is to propose a generalized least squares (GLS) estimator that is asymptotically as efficient as the best linear unbiased estimator (BLUE). To this end, a consistent estimator of the inverse of the autocovariance matrix of the errors is required. To ensure positive definiteness, we estimate the inverse of the autocovariance matrix via the modified Cholesky decomposition instead of estimating the autocovariance matrix directly. The resulting matrix estimator converges to its population counterpart at a suitable rate. Moreover, the asymptotic optimality of the corresponding GLS estimator is established.
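A minimal sketch of the idea, under simplifying assumptions (a single AR(k) fit to OLS residuals and a constant innovation variance; the thesis's exact construction may differ): assemble the modified Cholesky factors T and D so that the estimated inverse autocovariance matrix T'D^{-1}T is positive definite by construction, then plug it into feasible GLS:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate y = X @ beta + e with AR(1) errors; phi and beta are
# illustrative assumptions, not values taken from the thesis.
n, phi, beta = 300, 0.7, np.array([1.0, 2.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
u = rng.normal(size=n)
e = np.empty(n)
e[0] = u[0] / np.sqrt(1 - phi**2)
for t in range(1, n):
    e[t] = phi * e[t - 1] + u[t]
y = X @ beta + e

# Step 1: OLS residuals serve as proxies for the unobserved errors.
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
r = y - X @ b_ols

def inv_autocov(r, k):
    """Estimate Sigma^{-1} via a modified Cholesky decomposition.

    Fits AR(k) coefficients by Yule-Walker, then builds a unit lower
    triangular T (rows hold the negated AR coefficients) and a scalar
    innovation variance sigma2 (so D = sigma2 * I), giving
    T' D^{-1} T, which is positive definite by construction.  The
    first k rows reuse truncated coefficients, a simplification
    relative to exact finite-sample predictors."""
    m = len(r)
    gamma = np.array([r[: m - h] @ r[h:] / m for h in range(k + 1)])
    R = np.array([[gamma[abs(i - j)] for j in range(k)] for i in range(k)])
    phi_hat = np.linalg.solve(R, gamma[1:])
    sigma2 = gamma[0] - phi_hat @ gamma[1:]
    T = np.eye(m)
    for t in range(1, m):
        p = min(t, k)
        T[t, t - p : t] = -phi_hat[:p][::-1]  # column t-j holds -phi_j
    return T.T @ T / sigma2

# Step 2: feasible GLS with the estimated inverse autocovariance matrix.
Si = inv_autocov(r, k=5)
b_gls = np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)
print(b_gls)
```

Because T is unit lower triangular and D is positive, T'D^{-1}T cannot fail to be positive definite, which is exactly the advantage over estimating the autocovariance matrix directly and inverting it.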

In both articles, simulation studies are provided to confirm the theoretical findings, and real data analyses are included to illustrate the applicability of the proposed methods.


★ extreme value estimator
★ mean squared prediction error
★ accumulated prediction error
★ generalized least squares estimator
★ covariance matrix
★ Cholesky decomposition

1 Introduction
1.1 Background and Literature Review
1.2 Motivation and Objective
1.3 Overview
2 Asymptotic Properties of Estimators for the AR Coefficient
2.1 The Extreme Value Estimator
2.2 The Least Squares Estimator
3 Mean Squared Prediction Error
3.1 The MSPE of the Minimum Ratio Predictor
3.2 The MSPE of the Least Squares Predictor
4 Predictor Selection Rules Based on the APE
5 Numerical Studies
5.1 Finite Sample Performance of the Two Candidate Predictors
5.2 Finite Sample Performance of Predictor Selection Rules
5.3 Real Data Analysis
6 Conclusion and Remarks
II Estimation of Linear Regression Models with Serially Correlated Errors
7 Introduction
7.1 Background and Literature Review
7.2 Motivation and Objective
7.3 Overview
8 The GLS Estimator of Regression Coefficients
8.1 Model and Notation
8.2 An Estimator of the Inverse of the Autocovariance Matrix
9 Assumptions and Some Asymptotic Properties
9.1 Technical Conditions
9.2 The Asymptotic Optimality of the Proposed GLS Estimator
10 Numerical Studies
10.1 Finite Sample Performance of the Autocovariance Matrix Estimator
10.2 Finite Sample Performance of the Proposed GLS Estimator
10.3 Real Data Analysis
11 Conclusion and Remarks
References
Appendices
A: Proofs of Theorems 2.1 and 2.2
B: Proofs of Theorems 3.1 and 3.2
C: Proofs of Theorems 4.1 and 4.2
D: Proofs of Chapter 9
