Analysis II
(2000)

We compare different notions of differentiability of a measure along a vector field on a locally convex space. We consider in the L2-space of a differentiable measure the analogues of the classical concepts of gradient, divergence and Laplacian (which coincides with the Ornstein-Uhlenbeck operator in the Gaussian case). We use these operators to extend the basic results of Malliavin and Stroock on the smoothness of finite-dimensional image measures under certain non-smooth mappings to the case of non-Gaussian measures. The proof of this extension is quite direct and does not use any chaos decomposition. Finally, the role of this Laplacian in the quantization procedure for anharmonic oscillators is discussed.
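In standard notation (a textbook formulation for orientation, not quoted from the paper), differentiability of a measure along a vector and the Gaussian special case read:

```latex
% Fomin derivative of a measure \mu along a vector h: for all measurable A,
\[
  d_h\mu(A) \;=\; \lim_{t\to 0}\frac{\mu(A+th)-\mu(A)}{t}.
\]
% If d_h\mu \ll \mu, the density
\[
  \beta^{\mu}_{h} \;=\; \frac{d\,(d_h\mu)}{d\mu}
\]
% is called the logarithmic derivative of \mu along h. For the standard
% Gaussian measure \gamma on \mathbb{R}^n one has
% \beta^{\gamma}_{h}(x) = -\langle x, h\rangle, and the Laplacian built from
% the corresponding gradient and divergence is the Ornstein--Uhlenbeck operator
\[
  Lf(x) \;=\; \Delta f(x) \;-\; \langle x, \nabla f(x)\rangle .
\]
```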

Starting from the uniqueness question for mixtures of distributions, this review centers on the question under which formally weaker assumptions one can prove the existence of SPLIFs, in other words of perfect statistics and tests. We mention a number of positive and negative results which complement the basic contribution of David Blackwell from 1980. Typically the answers depend on the choice of set-theoretic axioms and on the particular concept of measurability.

As is well known, there is no satisfactory infinite-dimensional substitute for the Lebesgue measure. On the other hand, many techniques of classical analysis carry over to infinite-dimensional situations. One way to do this is provided by the theory of differentiable measures: one defines directional derivatives for measures much as for functions. One of the central examples is the Wiener measure. Stochastic integration with respect to Brownian motion, in particular the Skorokhod integral, arises naturally from this approach, and the basic ideas of the Malliavin calculus can also be explained simply in this framework. The lectures give most of the proofs.

The paper studies differential and related properties of functions of a real variable with values in the space of signed measures. In particular, the connections between different definitions of differentiability are described, corresponding to different topologies on the measures. Some conditions are given for the equivalence of the measures in the range of such a function. These conditions are in terms of so-called logarithmic derivatives and yield a generalization of the Cameron-Martin-Maruyama-Girsanov formula. Questions of this kind appear both in the theory of differentiable measures on infinite-dimensional spaces and in the theory of statistical experiments.
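The archetype behind such logarithmic-derivative density formulas is the classical Cameron-Martin formula for Gaussian measures (a standard fact, stated here for orientation; the paper's generalization concerns one-parameter families of measures):

```latex
% For a Gaussian measure \gamma with Cameron--Martin space H and h \in H,
% the shifted measure \gamma_h = \gamma(\,\cdot - h) is equivalent to \gamma, with
\[
  \frac{d\gamma_h}{d\gamma}(x)
  \;=\; \exp\Big( \langle h, x\rangle_H \;-\; \tfrac{1}{2}\,\|h\|_H^2 \Big).
\]
% The exponent is an integrated logarithmic derivative of the family
% t \mapsto \gamma_{th}: differentiating its logarithm in t gives
% \langle h, x\rangle_H - t\,\|h\|_H^2, and integrating over t \in [0,1]
% recovers the formula above.
```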

Sudakov's typical marginals, random linear functionals and a conditional central limit theorem
(1997)

V.N. Sudakov [Sud78] proved that the one-dimensional marginals of a high-dimensional second order measure are close to each other in most directions. Extending this and a related result of P. Diaconis and D. Freedman [Dia84] in the context of projection pursuit, we give, for a probability measure P and a random (a.s.) linear functional F on a Hilbert space, simple sufficient conditions under which most of the one-dimensional images of P under F are close to their canonical mixture, which turns out to be almost a mixed normal distribution. Using the concept of approximate conditioning, we deduce a conditional central limit theorem (Theorem 3) for random averages of triangular arrays of random variables which satisfy only fairly weak asymptotic orthogonality conditions.
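The phenomenon behind Sudakov's result is easy to see numerically (an illustrative sketch, not code from the paper; the product measure and dimensions are my own choices): project a sample from a non-Gaussian high-dimensional distribution onto a random direction and the image looks nearly Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 200, 20000
# A non-Gaussian product measure: uniform on the cube [-1, 1]^d.
X = rng.uniform(-1.0, 1.0, size=(n, d))

# A random unit direction ("most directions" in Sudakov's sense).
theta = rng.normal(size=d)
theta /= np.linalg.norm(theta)

# One-dimensional image of the sample under the linear functional <. , theta>,
# standardized (each coordinate has variance 1/3, so Var(proj) is about 1/3).
proj = X @ theta
z = proj / proj.std()

# Two distributional fingerprints of N(0,1): P(Z < 0) and P(|Z| < 1).
frac_below_0 = np.mean(z < 0)           # close to 0.5
frac_within_1 = np.mean(np.abs(z) < 1)  # close to 0.683
print(frac_below_0, frac_within_1)
```

With d = 200 coordinates entering the projection, the deviation of the image from the standard normal is already far below sampling noise.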

In this note, answering a question of N. Maslova, we give an elementary two-dimensional example of the phenomenon indicated in the title. Perhaps this simple example may serve as a point of comparison for more refined models, as in the theory of kinetic differential equations, where similar questions still seem to be unsettled.

The observation of an ergodic Markov chain asymptotically allows perfect identification of the transition matrix. In this paper we determine the rate of the information contained in the first n observations, provided the unknown transition matrix belongs to a known finite set. As an essential tool we prove new refinements of the large deviation theory of the empirical pair measure of finite Markov chains.

Keywords: Markov chain, entropy, Bayes risk, large deviations.
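The empirical pair measure mentioned above is a simple object; the following sketch (my own illustration, with a hypothetical two-state transition matrix, not the paper's setup) computes it for a simulated chain and shows how its row-conditionals recover the transition matrix, which is the consistency behind asymptotically perfect identification.

```python
import numpy as np

def empirical_pair_measure(path, k):
    """Empirical distribution of consecutive pairs (x_t, x_{t+1})
    of a path through the states {0, ..., k-1}."""
    counts = np.zeros((k, k))
    for a, b in zip(path[:-1], path[1:]):
        counts[a, b] += 1
    return counts / counts.sum()

# Simulate an ergodic two-state chain with a known transition matrix P.
rng = np.random.default_rng(1)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
n = 50000
path = [0]
for _ in range(n - 1):
    path.append(rng.choice(2, p=P[path[-1]]))

pm = empirical_pair_measure(path, 2)
# Row-conditionals of the pair measure estimate the transition matrix.
P_hat = pm / pm.sum(axis=1, keepdims=True)
print(np.round(P_hat, 2))
```

Large deviation refinements as in the paper quantify how unlikely it is that this pair measure stays far from its limit, which in turn controls the Bayes risk of identifying P within a finite set.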