Comparative and Functional Genomics
Volume 3 (2002), Issue 4, Pages 375-379
Conference Review

The Curse of Normalization

1Department of Biomolecular Sciences, Control Systems Centre, UMIST, Manchester M60 1QD, UK
2Department of Electrical Engineering, Control Systems Centre, UMIST, Manchester M60 1QD, UK

Received 31 May 2002; Accepted 12 June 2002

Copyright © 2002 Hindawi Publishing Corporation. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Despite its enormous promise to further our understanding of the cellular processes involved in the regulation of gene expression, microarray technology generates data for which statistical pre-processing has become a necessity before any interpretation can begin. The process by which we distinguish (and remove) non-biological variation from biological variation is called normalization. With a multitude of experimental designs, techniques and technologies influencing the acquisition of data, numerous approaches to normalization have been proposed in the literature. The purpose of this short review is not to add to the many suggestions that have been made, but to discuss some of the difficulties we encounter when analysing microarray data.
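To make the idea of removing non-biological variation concrete, the following sketch shows one common baseline approach, per-array median centring of log ratios. This is an illustrative assumption, not a method advocated in this review: the function name and the data are invented for demonstration, and the procedure rests on the usual assumption that most genes are unchanged between conditions.

```python
from statistics import median

def median_centre(log_ratios):
    """Subtract the array-wide median so that the bulk of genes,
    assumed to be unchanged, centre on a log ratio of zero.

    This removes a constant multiplicative bias (e.g. dye or
    labelling efficiency) from the whole array.
    """
    m = median(log_ratios)
    return [x - m for x in log_ratios]

# Hypothetical log2 ratios from one array, uniformly shifted by a
# non-biological bias of about +0.5; one gene is genuinely changed.
array_ratios = [0.4, 0.5, 0.6, 2.1, 0.5]
normalized = median_centre(array_ratios)
```

After centring, the unchanged genes sit near zero while the genuinely differentially expressed gene retains its large ratio; more elaborate schemes (e.g. intensity-dependent normalization) replace the single constant with a fitted curve.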