Calculus without limits

# Updates to the Cauchy Central Limit

There were two updates on the Cauchy central limit theorem, stating that if the Cauchy mean and the risk $\lim_{m \to \infty} \frac{1}{m} \int_{-m}^m x^2 f(x) \; dx$ of a random variable $X$ with PDF $f$ are finite and the risk is non-zero, then any IID random process $X_n$ with that distribution has normalized sums $S_n$ which converge in distribution to the Cauchy distribution. There are results of Paul Lévy describing convergence in terms of the characteristic function. This is based on a rather simple fixed point result: the logarithm $g(t) = \log \phi(t)$ of the characteristic function $\phi(t)$ satisfies the fixed point equation $g(t/2) = g(t)/2$, so that if $g$ is even and has one-sided derivatives at $0$ different from $0$ and infinity, then $g(t) = -c |t|$ must be the fixed point. Indeed, the renormalization map $T(g)(t) = 2 g(t/2)$ on even functions converges. By a general Zeckendorf argument, the convergence of $S_{2^n}$ is equivalent to the convergence of $S_n$.

As for working with distribution functions, one can find in the book of Gnedenko and Kolmogorov a theorem of Gnedenko which describes, in terms of the CDF, whether we are in the attractor of the Cauchy distribution. What I want to do is to have the risk condition alone decide whether we are in the attractor. We only need to show that a non-zero, finite risk implies that $\phi$ (or $g$) has one-sided derivatives at zero. This is not so obvious: while the function $\phi$ is uniformly continuous, the one-sided derivatives do not necessarily have to exist. There is a beautiful theorem of Pólya which assures that $\phi$ comes from a probability distribution; sufficient is that $\phi$ is convex and symmetric with $\phi(0)=1$ and $-1 \leq \phi \leq 1$. Under this Pólya condition the one-sided derivatives exist at every point! But most distributions do not satisfy the Pólya condition.
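To illustrate the theorem numerically, here is a small sketch of my own (the choice of test distribution is an assumption, not from the post): take $X$ to be a standard Cauchy plus an independent Gaussian perturbation, which is not itself Cauchy distributed but should have the same finite, non-zero risk since the tails are unchanged. The normalized sums $S_n/n$ should then have quartiles close to the standard Cauchy quartiles $-1, 0, 1$.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(shape):
    # X = C + N: standard Cauchy (via inverse CDF) plus an independent
    # standard normal; not Cauchy distributed, but with Cauchy tails
    u = rng.random(shape)
    return np.tan(np.pi * (u - 0.5)) + rng.normal(size=shape)

n, reps = 1000, 5000
S = sample((reps, n)).sum(axis=1) / n   # normalized sums S_n / n
print(np.quantile(S, [0.25, 0.5, 0.75]))  # close to the Cauchy quartiles -1, 0, 1
```

Since the Cauchy distribution is stable, the Gaussian part of $S_n/n$ shrinks like $1/\sqrt{n}$ and the quartiles settle near $\pm 1$ already for moderate $n$.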
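The renormalization picture can also be checked directly: $T^n(g)(t) = 2^n g(t/2^n)$, which converges to $-c|t|$ whenever the even function $g$ has one-sided derivatives $\mp c$ at $0$. A minimal numerical sketch (the perturbed test function is my own choice):

```python
import numpy as np

def T(g):
    # renormalization map (T g)(t) = 2 g(t/2) on log characteristic functions
    return lambda t, g=g: 2.0 * g(t / 2.0)

# even test function with one-sided slopes -1 at 0, plus a higher-order perturbation
g = lambda t: -np.abs(t) - t**2

for _ in range(30):
    g = T(g)

t = np.linspace(-3.0, 3.0, 7)
print(g(t))   # T^30 g (t) = -|t| - t^2 / 2^30, essentially the fixed point -|t|
```

The quadratic perturbation decays like $2^{-n}$ under iteration, while the linear part $-|t|$ is left invariant, which is exactly the fixed point statement.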
One can construct densities $f$ for which the derivative $\phi'(t)$ has a devil-comb oscillatory singularity at some points $t$, and I believe one can have this even on a dense set of points, but I do not think this can happen at $0$. For the Cauchy case, one only needs to show that the finite risk condition rules this out at $0$. By subtracting and adding a suitable Cauchy distribution function, one now only has to show that if the risk functional of a function $f$ is zero, then its Fourier transform $\phi$ has a derivative at $t=0$. Classically, if the mean is zero and the variance is finite, then $\phi$ is even twice differentiable and $t=0$ is a maximum. A Fourier computation similar to the one done in the video should allow one to conclude that in the zero risk case, we still must have $\phi'(0+) = \phi'(0-) = 0$. I had been thinking about this problem and the videos helped a bit. But it is not finished.
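As a sanity check on the quantities involved, here is a small numerical sketch of my own for the standard Cauchy density $f(x) = 1/(\pi(1+x^2))$ with characteristic function $\phi(t) = e^{-|t|}$: the risk functional evaluates to $2/\pi$, and the one-sided derivatives of $\phi$ at $0$ are $-1$ and $+1$, so $\phi$ itself is a concrete example where the derivative at $0$ fails to exist while the risk is finite and non-zero.

```python
import numpy as np

# standard Cauchy density and its characteristic function
f = lambda x: 1.0 / (np.pi * (1.0 + x**2))
phi = lambda t: np.exp(-np.abs(t))

def risk(m, n=400000):
    # (1/m) * integral of x^2 f(x) over [-m, m], by a Riemann sum
    x = np.linspace(-m, m, n)
    dx = x[1] - x[0]
    return np.sum(x**2 * f(x)) * dx / m

print(risk(1000.0), 2 / np.pi)       # the risk functional converges to 2/pi

h = 1e-6
print((phi(h) - phi(0)) / h)         # one-sided derivative at 0+: about -1
print((phi(-h) - phi(0)) / (-h))     # one-sided derivative at 0-: about +1
```

More generally, for $\phi(t) = e^{-c|t|}$ the risk comes out as $2c/\pi$, so the risk functional determines the slope $c$ of the fixed point $g(t) = -c|t|$.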