Statistical Mathematics Seminar

Organizers: Nakahiro Yoshida, Teppei Ogihara, Yuta Koike
Seminar URL: http://www.sigmath.es.osaka-u.ac.jp/~kamatani/statseminar/
Purpose: Research presentations and introductions of research on probability, statistics, and related fields.

Thursday, January 19, 2017

13:00-15:30   Room 052, Graduate School of Mathematical Sciences Building (Komaba)
Feng Chen (University of New South Wales)
Talk 1: Likelihood inference for a continuous time GARCH model
Talk 2: Nonparametric Estimation for Self-Exciting Point Processes: A Parsimonious Approach
[ Abstract ]
Talk 1: The continuous-time GARCH (COGARCH) model of Klüppelberg, Lindner and Maller (2004) is a natural extension of the discrete-time GARCH(1,1) model which preserves important features of the GARCH model in the discrete-time setting. For example, like the discrete-time GARCH model, the COGARCH model is driven by a single source of noise, which in the COGARCH case is a Lévy process, and both models can produce heavy-tailed marginal returns even when the driving noise is light-tailed. However, calibrating the COGARCH model to data is a challenge, especially when observations of the COGARCH process are obtained at irregularly spaced time points. The method of moments has had some success in the case with regularly spaced data, yet it is not clear how to make it work in the more interesting case with irregularly spaced data. Although maximum likelihood is a well-known method of estimation, it has not been developed for the COGARCH model, even in the quite simple case where the driving Lévy process is compound Poisson, though a quasi-maximum likelihood (QML) method has been proposed. The challenge with the maximum likelihood method in this context is mainly due to the lack of a tractable form for the likelihood. In this talk, we propose a Monte Carlo method to approximate the likelihood of the compound-Poisson-driven COGARCH model. We evaluate the performance of the resulting maximum likelihood (ML) estimator using simulated data, and illustrate its application with high-frequency exchange rate data. (Joint work with Damien Wee and William Dunsmuir.)
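
As background for the talk, the following is a minimal simulation sketch of a compound-Poisson-driven COGARCH(1,1) path, assuming the usual (beta, eta, phi) parametrization of Klüppelberg, Lindner and Maller (2004) and standard normal jump sizes; the function name and parameter values are illustrative only, and the sketch does not reproduce the Monte Carlo likelihood approximation proposed in the talk.

    import numpy as np

    def simulate_cogarch_cp(T, beta, eta, phi, jump_rate, sigma2_0, rng=None):
        """Simulate a compound-Poisson-driven COGARCH(1,1) path on [0, T].

        Between jump times the variance decays towards beta/eta; at a jump of
        size Y the log-price moves by sigma_{tau-} * Y and the variance is
        then scaled by (1 + phi * Y**2).
        """
        rng = np.random.default_rng() if rng is None else rng
        t, sigma2, G = 0.0, sigma2_0, 0.0
        times, prices, variances = [0.0], [0.0], [sigma2_0]
        while True:
            wait = rng.exponential(1.0 / jump_rate)    # exponential inter-jump time
            if t + wait > T:
                break
            t += wait
            # mean-reverting decay of the variance between jumps
            sigma2 = beta / eta + (sigma2 - beta / eta) * np.exp(-eta * wait)
            Y = rng.standard_normal()                  # jump size of the driving noise
            G += np.sqrt(sigma2) * Y                   # price jump uses sigma_{tau-}
            sigma2 *= 1.0 + phi * Y**2                 # GARCH-type variance update
            times.append(t); prices.append(G); variances.append(sigma2)
        return np.array(times), np.array(prices), np.array(variances)

    # irregularly spaced observations for estimation would then be obtained by
    # recording the simulated path at arbitrary observation times
    times, G, sig2 = simulate_cogarch_cp(T=100.0, beta=1.0, eta=0.05, phi=0.03,
                                         jump_rate=2.0, sigma2_0=20.0)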

Talk 2: There is ample evidence that in applications of self-exciting point process (SEPP) models, the intensity of background events is often far from constant. If a constant background is imposed, that assumption can significantly reduce the quality of statistical analysis, in problems as diverse as modelling the after-shocks of earthquakes and the study of ultra-high-frequency financial data. Parametric models can be used to alleviate this problem, but they run the risk of distorting inference by misspecifying the nature of the background intensity function. On the other hand, a purely nonparametric approach to analysis leads to problems of identifiability: when a nonparametric approach is taken, not every aspect of the model can be identified from data recorded along a single observed sample path. In this paper we suggest overcoming this difficulty by using an approach based on the principle of parsimony, or Occam's razor. In particular, we suggest taking the point-process intensity to be either a constant or to have maximum differential entropy. Although seldom used for nonparametric function estimation in other settings, this approach is appropriate in the context of SEPP models. (Joint work with the late Peter Hall.)
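
For illustration, the sketch below evaluates the log-likelihood of an SEPP with a nonconstant background intensity, assuming an exponential triggering kernel and a parametric (sinusoidal) background; these choices are assumptions made for the example only and do not implement the nonparametric, maximum-entropy approach described in the talk.

    import numpy as np

    def sepp_loglik(event_times, T, mu_fn, mu_integral, alpha, beta):
        """Log-likelihood of an SEPP on [0, T] with intensity
        lambda(t) = mu(t) + alpha * sum_{t_i < t} beta * exp(-beta * (t - t_i)),
        where mu_fn is the background intensity and mu_integral its integral
        over [0, T]; the exponential kernel is an assumed example.
        """
        t = np.asarray(event_times, dtype=float)
        loglik = 0.0
        for j, tj in enumerate(t):
            past = tj - t[:j]                                   # gaps to earlier events
            excitation = alpha * beta * np.exp(-beta * past).sum()
            loglik += np.log(mu_fn(tj) + excitation)
        # compensator: integral of the intensity over the observation window
        compensator = mu_integral + alpha * (1.0 - np.exp(-beta * (T - t))).sum()
        return loglik - compensator

    # dummy event times and a (possibly misspecified) sinusoidal background,
    # purely to exercise the function
    T = 100.0
    events = np.sort(np.random.default_rng(0).uniform(0.0, T, size=200))
    mu = lambda s: 1.0 + 0.5 * np.sin(2.0 * np.pi * s / 24.0)
    mu_int = T + 0.5 * (24.0 / (2.0 * np.pi)) * (1.0 - np.cos(2.0 * np.pi * T / 24.0))
    print(sepp_loglik(events, T, mu, mu_int, alpha=0.3, beta=1.5))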