Abstract

We study Davis-type theorems on the optimal rate of convergence of moderate deviation probabilities. In the case of martingale difference sequences, under the finite $p$th moment hypothesis ($1 \le p < \infty$), and depending on the normalization factor, our results show that Davis' theorems either hold if and only if $p > 2$ or fail for all $p \ge 1$. This is in sharp contrast with the classical case of i.i.d. centered sequences, where both of Davis' theorems hold under the finite second moment hypothesis (or less).