Mathematical Statistics Seminar

Low-rank volatility estimation for high-dimensional Lévy processes and low frequency observations

Speaker(s): 
Mathias Trabs (University Paris-Dauphine)
Date: 
Wednesday, July 1, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

Whenever the modelling of random processes in biology, finance or physics requires the incorporation of jumps, Lévy processes are one of the building blocks under consideration. Consequently, their statistical analysis has attracted much attention in recent decades. We first review some results on the by now well understood nonparametric estimation of the characteristic triplet of a univariate Lévy process based on low-frequency observations. The underlying inverse problem is ill-posed, and the degree of ill-posedness is determined by the characteristic triplet itself.
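
As a concrete and deliberately simplified illustration of how the characteristic triplet can be accessed from low-frequency data, the following Python sketch estimates only the diffusion coefficient of a simulated jump-diffusion through the empirical characteristic function; the cutoff frequency U and all model parameters are illustrative choices, not those of the talk.

import numpy as np

rng = np.random.default_rng(0)

# Simulate n low-frequency increments of a jump-diffusion: Brownian part with
# volatility sigma plus compound Poisson jumps (a Levy process with finite
# Levy measure).
n, delta, sigma, lam = 20000, 1.0, 0.5, 0.2
gauss = sigma * np.sqrt(delta) * rng.standard_normal(n)
n_jumps = rng.poisson(lam * delta, size=n)
jumps = np.array([rng.normal(0.0, 0.3, k).sum() for k in n_jumps])
increments = gauss + jumps

# Empirical characteristic function phi_n(u) = mean(exp(i u X)).
def ecf(u, x):
    return np.mean(np.exp(1j * u * x))

# For a finite Levy measure, log|phi_delta(u)| = -delta*sigma^2*u^2/2 + O(1)
# as |u| grows, so sigma^2 can be read off at a large cutoff frequency U.
U = 5.0
sigma2_hat = -2.0 * np.log(np.abs(ecf(U, increments))) / (delta * U ** 2)
print(f"true sigma^2 = {sigma ** 2:.3f}, spectral estimate = {sigma2_hat:.3f}")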

An Adaptive Functional Autoregressive Forecasting Model to Predict Electricity Price Curves

Speaker(s): 
Ying Chen (National University Singapore)
Date: 
Wednesday, June 24, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

Electricity price forecasting is becoming increasingly relevant in competitive energy markets. We provide an approach to predicting whole electricity price curves based on the adaptive functional autoregressive (AFAR) methodology. The AFAR model has time-varying operators that allow it to be used safely in both stationary and non-stationary situations. Under stationarity, we develop a consistent maximum likelihood (ML) estimator in closed form, where the likelihood function is defined on a parameter subspace, or sieve.
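
To make the idea of functional autoregressive forecasting concrete, here is a minimal Python sketch of a plain, non-adaptive FAR(1) forecast for daily price curves; the Fourier basis, its size and the simulated data are illustrative assumptions, and none of the adaptive, time-varying machinery of AFAR is included.

import numpy as np

rng = np.random.default_rng(1)
hours = np.linspace(0, 1, 24)            # 24 hourly prices per day
K = 5                                    # basis dimension

# Fourier-type basis evaluated on the hourly grid (columns = basis functions).
B = np.column_stack([np.ones_like(hours)] +
                    [f(2 * np.pi * (j + 1) * hours)
                     for j in range(2) for f in (np.sin, np.cos)])

# Simulate 200 days of curves from a FAR(1) in coefficient space.
A_true = 0.5 * np.eye(K)
coefs = np.zeros((200, K))
for t in range(1, 200):
    coefs[t] = coefs[t - 1] @ A_true.T + 0.3 * rng.standard_normal(K)
curves = coefs @ B.T                     # observed daily curves (200 x 24)

# Estimation: project curves on the basis, then regress c_t on c_{t-1}.
C = curves @ B @ np.linalg.inv(B.T @ B)  # least-squares basis coefficients
X, Y = C[:-1], C[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T   # autoregressive operator estimate

# One-day-ahead forecast of the next price curve.
next_curve = (A_hat @ C[-1]) @ B.T
print(np.round(next_curve[:5], 3))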

Inference problems in high dimensional linear models

Speaker(s): 
Alexandra Carpentier (Cambridge)
Date: 
Wednesday, June 17, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

In this talk I will consider a general noisy linear regression setting Y = X\theta + \epsilon, which simultaneously describes the usual "vector" linear regression setting and the "matrix" linear regression setting. I will consider the problem of inference in this model, i.e. estimation of the underlying parameter \theta and the associated uncertainty quantification.
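
For orientation, the sketch below shows only the classical low-dimensional version of this inference problem: ordinary least squares for Y = X\theta + \epsilon with per-coordinate 95% confidence intervals. The high-dimensional and matrix settings of the talk require different tools; the data and dimensions here are illustrative.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, p = 200, 5
X = rng.standard_normal((n, p))
theta = np.array([1.0, -0.5, 0.0, 2.0, 0.0])
y = X @ theta + rng.standard_normal(n)

theta_hat = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS estimate of theta
resid = y - X @ theta_hat
sigma2_hat = resid @ resid / (n - p)                  # noise variance estimate
se = np.sqrt(sigma2_hat * np.diag(np.linalg.inv(X.T @ X)))
q = stats.t.ppf(0.975, df=n - p)
for j in range(p):
    print(f"theta_{j}: {theta_hat[j]: .3f}   95% CI "
          f"[{theta_hat[j] - q * se[j]: .3f}, {theta_hat[j] + q * se[j]: .3f}]")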

On optimization aspects of finding the Wasserstein(-Kantorovich) barycenter

Speaker(s): 
Aleksey Gasnikov (MIPT, Moscow)
Date: 
Wednesday, June 10, 2015 - 10:00am
Location: 
WIAS, Raum 4.13, Hausvogteiplatz 11, 10117 Berlin

In the talk we will discuss recent work by Marco Cuturi (Kyoto Univ.) et al. devoted to fast algorithms for computing the Wasserstein barycenter (Wb). In our approach we try to reduce the problem to another high-dimensional convex optimization problem. The idea is to freeze the support of the measures while allowing the cardinalities of the support sets to be large enough. Then we have to solve a saddle-point convex-concave optimization problem (Cuturi et al. considered this problem as a nonsmooth convex optimization problem). We propose new numerical approaches to solve this problem.
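
The fixed-support idea can be illustrated with the standard entropically regularised barycenter iteration (iterative Bregman projections in the spirit of Cuturi and co-authors). This is a sketch under illustrative choices of grid, cost and regularisation, not the saddle-point scheme proposed in the talk.

import numpy as np

def entropic_barycenter(P, C, weights, eps=0.02, iters=500):
    """P: (m, n) array of m probability vectors on a common n-point support."""
    K = np.exp(-C / eps)                      # Gibbs kernel
    m, n = P.shape
    V = np.ones((m, n))
    for _ in range(iters):
        U = P / (K @ V.T).T                   # scalings enforcing the marginals
        KtU = (K.T @ U.T).T                   # shape (m, n)
        b = np.exp(weights @ np.log(KtU))     # weighted geometric mean = barycenter
        V = b / KtU
    return b

# Two discrete measures on a 1-D grid and their midpoint barycenter.
x = np.linspace(0, 1, 60)
C = (x[:, None] - x[None, :]) ** 2            # squared-distance cost
p1 = np.exp(-((x - 0.25) ** 2) / 0.005); p1 /= p1.sum()
p2 = np.exp(-((x - 0.75) ** 2) / 0.005); p2 /= p2.sum()
bary = entropic_barycenter(np.vstack([p1, p2]), C, np.array([0.5, 0.5]))
print("barycenter mass:", bary.sum(), "mean location:", (x * bary).sum())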

Adaptive Testing on a Regression Function at a Point

Speaker(s): 
Timothy B. Armstrong (Yale University)
Date: 
Wednesday, May 27, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

We consider the problem of inference on a regression function at a point when the entire function satisfies a sign or shape restriction under the null. We propose a test that achieves the optimal minimax rate adaptively over a range of Hölder classes, up to a log log n term, which we show to be necessary for adaptation. We apply the results to adaptive one-sided tests for the regression discontinuity parameter under a monotonicity restriction, the value of a monotone regression function at the boundary, and the proportion of true null hypotheses in a multiple testing problem.
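
The following sketch illustrates the multiscale logic behind such tests with a deliberately simplified procedure: under the monotonicity restriction, local averages to the right of x0 give valid one-sided statistics at every bandwidth, and their maximum over a bandwidth grid is compared with a Bonferroni critical value. It is not the optimal adaptive test of the paper; the data, known noise level and bandwidth grid are illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, x0, sigma = 5000, 0.5, 1.0
x = rng.uniform(0, 1, n)

def f(t):
    return np.maximum(0.7 - t, 0.0)          # non-increasing; f(x0) = 0.2 > 0

y = f(x) + sigma * rng.standard_normal(n)

bandwidths = [0.02, 0.05, 0.1, 0.2]
alpha = 0.05
crit = stats.norm.ppf(1 - alpha / len(bandwidths))   # Bonferroni over the scales

# Under H0: f(x0) <= 0 and monotonicity, E[Y | x0 <= X <= x0 + h] <= 0 for all h,
# so each normalised local mean is a valid one-sided statistic (sigma known here).
t_stats = []
for h in bandwidths:
    window = (x >= x0) & (x <= x0 + h)
    m = window.sum()
    if m > 1:
        t_stats.append(y[window].mean() / (sigma / np.sqrt(m)))
print("reject H0:", max(t_stats) > crit)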

Network Analysis of Big Data

Speaker(s): 
Henry Horng-Shing Lu
Date: 
Wednesday, May 20, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

One great challenge of big data research is to efficiently and accurately identify the inherent complex network structure. We will discuss possible approaches to reconstructing Boolean networks. Specifically, we will prove that O(log n) state transition pairs are sufficient and necessary to reconstruct the time-delay Boolean network of n nodes with high accuracy if the number of input genes to each gene is bounded. Future developments of methodologies and computational systems for big data research will also be discussed.
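
A toy version of the combinatorial reconstruction idea is sketched below: with in-degree bounded by K, each node's parents can be sought by checking which size-K candidate sets admit a truth table consistent with the observed state-transition pairs. Network size, K and the number of samples are illustrative, and this brute-force search is not the speaker's algorithm.

import itertools
import numpy as np

rng = np.random.default_rng(4)
n_nodes, K, n_samples = 8, 2, 40

# Ground truth: each node has K parents and a random truth table on their values.
parents = [tuple(sorted(rng.choice(n_nodes, K, replace=False))) for _ in range(n_nodes)]
tables = [rng.integers(0, 2, 2 ** K) for _ in range(n_nodes)]

def step(state):
    """One synchronous update of the Boolean network."""
    return np.array([tables[i][int("".join(map(str, state[list(parents[i])])), 2)]
                     for i in range(n_nodes)])

# Observed random state-transition pairs (s, step(s)).
states = rng.integers(0, 2, (n_samples, n_nodes))
nexts = np.array([step(s) for s in states])

def infer(node):
    """Return the first size-K parent set admitting a consistent truth table."""
    for cand in itertools.combinations(range(n_nodes), K):
        table, ok = {}, True
        for s, t in zip(states, nexts):
            key = tuple(s[list(cand)])
            if table.setdefault(key, t[node]) != t[node]:
                ok = False
                break
        if ok:
            return cand
    return None

# Degenerate truth tables may admit several consistent parent sets, so exact
# recovery of every parent set is not guaranteed on a single run.
recovered = [infer(i) for i in range(n_nodes)]
hits = sum(recovered[i] == parents[i] for i in range(n_nodes))
print(f"{hits} of {n_nodes} parent sets recovered from {n_samples} transitions")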

FASTEC: FActorisable Sparse Tail Event Curves

Speaker(s): 
Shih-Kang Chao (Humboldt-Universität zu Berlin)
Date: 
Wednesday, May 13, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

High-dimensional multivariate quantile analysis is crucial for many applications, such as risk management and weather analysis. In these applications, the quantile function q_Y(τ) of a random variable Y, defined by P{Y ≤ q_Y(τ)} = τ, is of great interest at the "tail" of the distribution, namely for τ close to 0 or 1, such as τ = 1%, 5% or τ = 95%, 99%.
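
As background for the check-loss objective behind q_Y(τ), the sketch below fits a plain linear quantile regression at a tail level τ through its linear-programming formulation. The factorisable (nuclear-norm penalised) multivariate machinery of FASTEC itself is not reproduced, and the data are illustrative.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)
n, p, tau = 300, 3, 0.05
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.standard_normal(n)

# Variables: beta (p, free), u (n, >= 0), v (n, >= 0) with y - X beta = u - v,
# so the check loss sum rho_tau(y - X beta) equals tau*sum(u) + (1-tau)*sum(v).
c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
bounds = [(None, None)] * p + [(0, None)] * (2 * n)
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
beta_tau = res.x[:p]
print("tau =", tau, "coefficients:", np.round(beta_tau, 3))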

On the statistical properties of $\ell_p$-Norm multiple kernel learning

Speaker(s): 
Marius Kloft (HU Berlin)
Date: 
Wednesday, May 6, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

Reproducing kernel Hilbert space methods have become a popular and versatile tool with many application areas in statistics and machine learning, the flagship method being the support vector machine. Nevertheless, a displeasing stumbling block on the way to the complete automation of this method remains: automatic kernel selection. In the seminal work of Lanckriet et al. (2004), it was shown that it is computationally feasible to simultaneously learn a support vector machine and a linear combination of kernels; this approach is dubbed "multiple kernel learning".
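
A minimal sketch of the alternating scheme behind ℓ_p-norm multiple kernel learning is given below, with kernel ridge regression standing in for the SVM and the usual closed-form ℓ_p-normalised weight update; the kernels, p, ridge parameter and data are illustrative assumptions rather than the setting analysed in the talk.

import numpy as np

rng = np.random.default_rng(6)
n = 120
x = rng.uniform(-3, 3, (n, 1))
y = np.sin(x[:, 0]) + 0.1 * rng.standard_normal(n)

def rbf(X, gamma):
    """Gaussian kernel matrix on the rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

kernels = np.array([rbf(x, g) for g in (0.1, 1.0, 10.0)])   # M base kernels
M, p, lam = len(kernels), 2.0, 1e-2
theta = np.full(M, M ** (-1.0 / p))                         # start with ||theta||_p = 1

for _ in range(20):
    K = np.tensordot(theta, kernels, axes=1)                # combined kernel
    alpha = np.linalg.solve(K + lam * np.eye(n), y)         # kernel ridge (SVM stand-in)
    w_norms2 = np.array([theta[m] ** 2 * (alpha @ kernels[m] @ alpha) for m in range(M)])
    theta = w_norms2 ** (1.0 / (p + 1))                     # closed-form l_p weight update ...
    theta /= np.linalg.norm(theta, ord=p)                   # ... renormalised to ||theta||_p = 1
print("learned kernel weights:", np.round(theta, 3))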

L_∞ estimation in regression

Speaker(s): 
Keith Knight (University of Toronto, Canada)
Date: 
Wednesday, April 29, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

L_∞ estimation is not part of the traditional canon of applied regression analysis. And for good reason - it is highly non-robust and potentially very unstable. Nonetheless, in some situations, minimizing the maximum absolute residual is a worthwhile objective. In this talk, we will discuss the properties (both asymptotic and non-asymptotic) of L_∞ estimation in linear regression and describe an approach for "rescuing" L_∞ estimation that can also be applied to non-parametric regression problems.
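
For concreteness, minimizing the maximum absolute residual is a linear program; the sketch below solves it with scipy's linprog on illustrative data with bounded noise (the "rescued" variants discussed in the talk are not shown).

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(7)
n, p = 100, 2
X = np.column_stack([np.ones(n), rng.uniform(0, 1, n)])
y = X @ np.array([0.5, 1.5]) + rng.uniform(-0.2, 0.2, n)   # bounded noise

# Variables: beta (p, free) and t >= 0 with |y_i - x_i beta| <= t for all i;
# minimizing t gives the Chebyshev (L_inf) fit.
c = np.concatenate([np.zeros(p), [1.0]])
A_ub = np.vstack([np.hstack([-X, -np.ones((n, 1))]),
                  np.hstack([ X, -np.ones((n, 1))])])
b_ub = np.concatenate([-y, y])
bounds = [(None, None)] * p + [(0, None)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
beta_inf, max_resid = res.x[:p], res.x[p]
print("L_inf fit:", np.round(beta_inf, 3), "max |residual| =", round(max_resid, 3))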

Variational regularization of statistical inverse problems

Speaker(s): 
Torsten Hohage (Universität Göttingen)
Date: 
Wednesday, April 22, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

We consider variational regularization methods for ill-posed inverse problems described by operator equations $F(x)=y$ in Banach spaces. One focus of this talk will be on data noise models: we will present a general framework that makes it possible to treat many noise models and data fidelity terms in a unified setting, including Gaussian and Poisson processes, continuous and discrete models, and impulsive noise models. Rates of convergence are determined by abstract smoothness conditions called source conditions.
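
In the simplest Hilbert-space case with Gaussian noise and a quadratic penalty, variational regularization reduces to classical Tikhonov regularization. The sketch below illustrates this special case for a discretised integration operator, with an illustrative noise level and regularization parameter; it does not cover the general Banach-space fidelities and penalties of the talk.

import numpy as np

rng = np.random.default_rng(8)
n = 100
t = np.linspace(0, 1, n)
F = np.tril(np.ones((n, n))) / n          # discretised integration operator (ill-conditioned)
x_true = np.sin(2 * np.pi * t)            # unknown we want to recover
y = F @ x_true + 0.01 * rng.standard_normal(n)

# Tikhonov: minimise ||F x - y||^2 + alpha ||x||^2, solved in closed form.
alpha = 1e-4
x_alpha = np.linalg.solve(F.T @ F + alpha * np.eye(n), F.T @ y)

naive = np.linalg.solve(F, y)             # unregularised inversion amplifies the noise
print("Tikhonov RMS error:", round(np.linalg.norm(x_alpha - x_true) / np.sqrt(n), 4))
print("naive inversion RMS error:", round(np.linalg.norm(naive - x_true) / np.sqrt(n), 4))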
