
Activity Monitoring Using Multimodal Data

Call for Papers

With the availability of affordable smart sensors, multimodal sensor data is expected to play an increasingly central role in healthcare by leveraging sophisticated data analysis technologies, particularly for behavioral disease monitoring and diagnosis. Sensors exist for a wide range of modalities and environments, from wearable physiological devices to pressure and contact sensors to ambient, audiovisual, and other environmental sensors. Accurate and effective remote monitoring to facilitate independent living of the elderly is becoming increasingly available as smart home technology matures. As such, the semantic interpretation of multimodal data, as well as issues surrounding content-based access to such data, is fast becoming a relevant research area associated with smart homes, remote monitoring, and elderly care technologies.

Multimodal data analysis, information retrieval, and indexing are central to this research agenda in that the semantic processing of the large-scale, heterogeneous data produced by smart sensors is fundamental to providing diagnostic assistance and feedback to people with medical conditions or in need of monitoring. Research challenges include the processing of dynamic, real-time, heterogeneous data, which must take into account the spatiotemporal context of each sensor and the individual characteristics of the person being monitored. Fusion and higher-level interpretation techniques are needed to provide concrete and useful information to clinicians through appropriate user interfaces. Potential topics include, but are not limited to:

  • Multimedia indexing and retrieval methods for healthcare monitoring
  • Recognition of activities of daily living
  • Detection of abnormal behaviors/event detection for health monitoring and remote care
  • Video- and voice-based analytics for monitoring and assessment
  • Semantic complex event processing, reasoning, knowledge structures of high-level concepts, and data fusion
  • Personalized data analysis, event detection, and profile building from multimodal data
  • Wearable and pervasive computing
  • Sensor networks, sensor correlation, and fusion
  • Context modeling and contextual reasoning in ambient intelligence

Before submission, authors should carefully read the journal’s Author Guidelines, which are located at http://www.hindawi.com/journals/jece/guidelines/. Prospective authors should submit an electronic copy of their complete manuscript through the journal Manuscript Tracking System at http://mts.hindawi.com/submit/journals/jece/signal.processing/mum/ according to the following timetable:

Manuscript Due: Friday, 6 June 2014
First Round of Reviews: Friday, 29 August 2014
Publication Date: Friday, 24 October 2014

Lead Guest Editor

  • Aytul Ercil, Faculty of Engineering and Natural Sciences, Sabancı University, Orhanlı, Istanbul, Turkey

Guest Editors

  • Eamonn Newman, Insight Centre for Data Analytics, Dublin City University, Dublin 9, Ireland
  • Ceyhun Burak Akgul, Vistek ISRA Vision A.S., ITU Ayazaga Kampusu, Teknokent Arı 1 Binası, No. 24/3, Maslak, Istanbul, Turkey
  • Yiannis Kompatsiaris, Information Technologies Institute, Centre for Research and Technology Hellas, Thermi, Thessaloniki, Greece