Gibbs posterior inference on a Lévy density under discrete sampling

Abstract

In mathematical finance, Lévy processes are widely used for their ability to model both continuous variation and abrupt, discontinuous jumps. Because these jumps are practically relevant, reliable inference on the feature that controls their frequencies and magnitudes, namely the Lévy density, is of critical importance. A specific obstacle to carrying out model-based (e.g., Bayesian) inference in such problems is that, for general Lévy processes, the likelihood is intractable. To overcome this obstacle, here we adopt a Gibbs posterior framework that updates a prior distribution using a suitable loss function instead of a likelihood. We establish asymptotic concentration rates for the proposed Gibbs posterior. In particular, in the most interesting and practically relevant case, we give conditions under which the Gibbs posterior concentrates at (nearly) the minimax optimal rate, adaptive to the unknown smoothness of the true Lévy density.
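
As context for the framework described in the abstract, the generic Gibbs posterior update can be sketched as follows; the loss \ell, learning rate \omega > 0, and prior \Pi below are illustrative placeholders, not the specific choices made in the paper. Given discretely sampled observations X_1, \dots, X_n and the empirical risk

    R_n(\theta) = \frac{1}{n} \sum_{i=1}^{n} \ell(\theta; X_i),

the prior \Pi is updated to the Gibbs posterior

    \Pi_n(d\theta) \propto \exp\{-\omega\, n\, R_n(\theta)\}\, \Pi(d\theta),

so the data enter through the loss function rather than through an intractable likelihood.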

Versions

Version 1 (2021-09-14)

Citations

Zhe Wang and Ryan Martin (2021). Gibbs posterior inference on a Lévy density under discrete sampling. Researchers.One. https://researchers.one/articles/21.09.00013v1

