Advances in Multimedia
Volume 2011 (2011), Article ID 914062, 13 pages
http://dx.doi.org/10.1155/2011/914062
Research Article

Real-Time Adaptive Content-Based Synchronization of Multimedia Streams

1Department of Electrical and Computer Engineering, American University of Beirut, P.O. Box 11-0236, Riad El Solh, Beirut 1107 2020, Lebanon
2Department of Electrical and Computer Engineering, Michigan State University, East Lansing, MI 48824-1226, USA

Received 21 January 2011; Revised 29 April 2011; Accepted 27 June 2011

Academic Editor: Chong Wah Ngo

Copyright © 2011 Imad H. Elhajj et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Traditional synchronization schemes for multimedia applications are based on temporal relationships within and between streams (intra- and interstream). Such schemes do not provide good synchronization in the presence of random delay. As a solution, this paper proposes an adaptive content-based synchronization scheme that synchronizes multimedia streams by accounting for content in addition to time. This approach is motivated by the fact that two streams sampled close in time are not necessarily close in content. The primary contribution of the proposed scheme is the synchronization of audio and video streams based on content; the secondary contribution is adapting the frame rate based on content decisions. Testing the adaptive content-based and adaptive time-based synchronization algorithms remotely between the American University of Beirut and Michigan State University showed that the proposed method outperforms the traditional synchronization method. Objective and subjective assessment of the received video and audio quality demonstrated that the content-based scheme provides better synchronization and better overall quality of the multimedia streams. Although demonstrated using a videoconferencing application, the method can be applied to any multimedia streams, including nontraditional ones referred to as supermedia, such as control signals, haptic data, and other sensory measurements. In addition, the method can be applied to synchronize more than two streams simultaneously.
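To make the core idea concrete, the following is a minimal illustrative sketch, not the authors' implementation: it assumes a receiver that has fallen behind by a known number of frames and must decide which frames to drop, preferring those whose content changed least so that synchronization is recovered at the smallest perceptual cost. The function names, the pixel-difference measure, and the drop policy are all assumptions introduced here for illustration.

    # Illustrative sketch only (assumed names and policy, not the paper's code):
    # recover synchronization by dropping the frames that add the least content.
    import numpy as np

    def content_change(prev_frame, frame):
        # Mean absolute pixel difference as a crude content-change measure.
        return float(np.mean(np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))))

    def frames_to_drop(frames, lag_frames):
        # frames: list of numpy arrays in display order.
        # Rank frames 1..N-1 by how little they change the content, and drop
        # the lag_frames least-changing ones to catch up with the audio.
        changes = [content_change(frames[i - 1], frames[i]) for i in range(1, len(frames))]
        order = sorted(range(1, len(frames)), key=lambda i: changes[i - 1])
        return sorted(order[:lag_frames])

A purely time-based scheme would instead drop frames solely by timestamp, which can discard frames carrying large content changes; the content-aware choice above reflects the abstract's observation that temporal closeness does not imply closeness in content.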