Kerry W. Fendick, Ward Whitt, "Verifying cell loss requirements in high-speed communication networks", International Journal of Stochastic Analysis, vol. 11, Article ID 859753, 20 pages, 1998. https://doi.org/10.1155/S1048953398000276
Verifying cell loss requirements in high-speed communication networks
In high-speed communication networks it is common to have requirements of very small cell loss probabilities due to buffer overflow. Losses are measured to verify that the cell loss requirements are being met, but it is not clear how to interpret such measurements. We propose methods for determining whether or not cell loss requirements are being met. A key idea is to look at the stream of losses as successive clusters of losses. Often clusters of losses, rather than individual losses, should be regarded as the important loss events. Thus we propose modeling the cell loss process by a batch Poisson stochastic process. Successive clusters of losses are assumed to arrive according to a Poisson process. Within each cluster, cell losses do not occur at a single time, but the spacing between losses within a cluster should be negligible compared to the spacing between clusters. Thus, for the purpose of estimating the cell loss probability, we ignore the spaces between successive cell losses in a cluster of losses. Asymptotic theory suggests that the counting process of losses initiating clusters often should be approximately a Poisson process even though the cell arrival process is not nearly Poisson. The batch Poisson model is relatively easy to fit and test statistically; e.g., the batch-size distribution and the batch arrival rate can readily be estimated from cell loss data. Since batch (cluster) sizes may be highly variable, it may be useful to focus on the number of batches instead of the number of cells in a measurement interval. We also propose a method for approximately determining the parameters of a special batch Poisson cell loss process with a geometric batch-size distribution from a queueing model of the buffer content. For this step, we use a reflected Brownian motion (RBM) approximation of a queueing model. We also use the RBM model to estimate the input burstiness given the cell loss rate.
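The fitting step described above can be sketched in a few lines: group loss timestamps into clusters separated by gaps larger than some threshold, take the number of clusters per unit time as the batch arrival rate, and match the geometric batch-size parameter to the mean cluster size (a geometric distribution on {1, 2, ...} with parameter p has mean 1/p). This is a minimal illustration, not the paper's procedure; the gap threshold and the function names here are our own assumptions.

```python
def cluster_losses(times, gap):
    """Group sorted loss timestamps into clusters: a new cluster
    starts whenever the spacing to the previous loss exceeds `gap`
    (assumed negligible within a cluster, large between clusters)."""
    clusters = []
    for t in times:
        if clusters and t - clusters[-1][-1] <= gap:
            clusters[-1].append(t)
        else:
            clusters.append([t])
    return clusters

def fit_batch_poisson(times, horizon, gap):
    """Estimate the batch arrival rate and the geometric batch-size
    parameter p (mean batch size 1/p) from loss timestamps observed
    over an interval of length `horizon`."""
    clusters = cluster_losses(sorted(times), gap)
    rate = len(clusters) / horizon                       # batches per unit time
    mean_size = sum(len(c) for c in clusters) / len(clusters)
    p = 1.0 / mean_size                                  # geometric on {1, 2, ...}
    return rate, p

# Synthetic check: three clusters (sizes 2, 3, 1) in a horizon of 300.
times = [0.0, 0.1, 100.0, 100.1, 100.2, 200.0]
rate, p = fit_batch_poisson(times, horizon=300.0, gap=1.0)
print(rate, p)   # 3 clusters / 300 -> 0.01; mean cluster size 2 -> p = 0.5
```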
In addition, we use the RBM model to determine whether the presence of losses should significantly affect the estimation of server utilization when both losses and utilizations are estimated from data. Overall, our analysis can serve as a basis for determining the observation intervals required to reach statistically sound conclusions. Thus our analysis can help plan simulations as well as computer system measurements.
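The observation-interval question can be made concrete with standard Poisson counting statistics: if batches of losses arrive as a Poisson process with rate λ, the estimator N/T from a count N over an interval of length T has relative standard error 1/√(λT), so achieving a target relative error ε requires T ≈ 1/(λε²). The sketch below illustrates this textbook calculation; it is our own back-of-the-envelope companion to the abstract, not the paper's RBM-based analysis.

```python
import math

def required_interval(batch_rate, rel_err):
    """Observation length T so that the Poisson-count estimator N/T of
    `batch_rate` has relative standard error <= `rel_err`:
    SE(N/T) / rate = 1 / sqrt(rate * T)  =>  T = 1 / (rate * rel_err**2)."""
    return 1.0 / (batch_rate * rel_err ** 2)

def achieved_rel_err(batch_rate, interval):
    """Relative standard error obtained from counting over `interval`."""
    return 1.0 / math.sqrt(batch_rate * interval)

# e.g. a batch loss rate of 1e-6 per cell slot, targeting 10% relative error:
T = required_interval(1e-6, 0.10)
print(T)                            # 1e8 cell slots
print(achieved_rel_err(1e-6, T))    # 0.1
```

Counting batches rather than individual cells is what makes this estimate reliable: highly variable batch sizes inflate the variance of the cell count but not of the Poisson batch count.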
Copyright © 1998 Hindawi Publishing Corporation. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.