Computational and Mathematical Methods in Medicine The latest articles from Hindawi Publishing Corporation © 2015, Hindawi Publishing Corporation. All rights reserved.

The Use of Virtual Reality in Psychology: A Case Study in Visual Perception Mon, 03 Aug 2015 14:32:23 +0000 The recent proliferation of available virtual reality (VR) tools has led to their increased use in psychological research. This is due to a number of advantages afforded over traditional experimental apparatus, such as tighter control of the environment and the possibility of creating more ecologically valid stimulus presentation and response protocols. At the same time, the higher levels of immersion and visual fidelity afforded by VR do not necessarily evoke presence or elicit a “realistic” psychological response. The current paper reviews some current uses for VR environments in psychological research and discusses some ongoing questions for researchers. Finally, we focus on the area of visual perception, where both the advantages and challenges of VR are particularly salient. Christopher J. Wilson and Alessandro Soranzo Copyright © 2015 Christopher J. Wilson and Alessandro Soranzo. All rights reserved.

Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders Mon, 03 Aug 2015 14:23:18 +0000 Augmented Reality is a new technological system that introduces virtual content into the real world, merging the two in a single representation in real time and enhancing the user’s sensory perception of reality. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology.
In the treatment of psychological disorders, Augmented Reality has shown preliminary evidence of being a useful tool, owing to its adaptability to patient needs and therapeutic purposes and to its interactivity. Another relevant factor is the quality of the user’s experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that make Augmented Reality a new technique useful for psychology. Irene Alice Chicchi Giglioli, Federica Pallavicini, Elisa Pedroli, Silvia Serino, and Giuseppe Riva Copyright © 2015 Irene Alice Chicchi Giglioli et al. All rights reserved.

Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia Mon, 03 Aug 2015 13:34:19 +0000 Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI), such as usability and utility, through a large number of analytic, usability-oriented approaches based on cognitive models, in order to provide users with experiences fitting their specific needs. However, there is demand for more specific modules, embodied in cognitive architectures, that can detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort when enhancing the accessibility of ICT products for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied in cognitive models and defined by estimations of cognitive parameters.
Well-established MCI detection tests were used to assess users’ cognition, evaluate their ability to multitask, and monitor their performance on infotainment-related tasks. This provides more accurate simulation results within existing conceptual frameworks and enhances predictive validity in interface design, with increased task complexity used to capture a more detailed profile of users’ capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data, to be used for more reliable interface evaluation through simulation on the basis of virtual models of MCI users. Sofia Segkouli, Ioannis Paliokas, Dimitrios Tzovaras, Thanos Tsakiris, Magda Tsolaki, and Charalampos Karagiannidis Copyright © 2015 Sofia Segkouli et al. All rights reserved.

Thermal Infrared Imaging-Based Computational Psychophysiology for Psychometrics Mon, 03 Aug 2015 12:48:44 +0000 Thermal infrared imaging has been proposed as a potential system for the computational assessment of human autonomic nervous activity and psychophysiological states in a contactless and noninvasive way. Through bioheat modeling of facial thermal imagery, several vital signs can be extracted, including localized blood perfusion, cardiac pulse, breath rate, and sudomotor response, since all these parameters affect the cutaneous temperature. The obtained physiological information can then be used to draw inferences about a variety of psychophysiological or affective states, as shown by the increasing number of psychophysiological studies using thermal infrared imaging. This paper therefore presents a review of the principal achievements of thermal infrared imaging in computational physiology with regard to its capability of monitoring psychophysiological activity. Daniela Cardone, Paola Pinti, and Arcangelo Merla Copyright © 2015 Daniela Cardone et al. All rights reserved.
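As a minimal illustration of how a vital sign such as breath rate can be recovered from a facial temperature time series (a sketch only, not the bioheat models used in the thermal imaging review above), the snippet below finds the dominant periodic component of a synthetic nostril-region temperature trace with a naive DFT. The function name, sampling rate, and trace are hypothetical.

```python
import math

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) of the largest nonzero DFT component.

    Naive O(n^2) DFT; adequate for short physiological recordings.
    """
    n = len(signal)
    mean = sum(signal) / n
    centred = [s - mean for s in signal]  # remove DC (baseline temperature)
    best_k, best_mag = 0, -1.0
    for k in range(1, n // 2):
        re = sum(centred[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(centred[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

# Synthetic nostril-region trace: 0.25 Hz breathing on a 34 °C baseline,
# sampled at 10 frames per second for 20 seconds.
fs = 10.0
trace = [34.0 + 0.3 * math.sin(2 * math.pi * 0.25 * t / fs) for t in range(200)]
breath_hz = dominant_frequency(trace, fs)
```

In practice the trace would come from a tracked region of interest in the thermal imagery, and a windowed FFT would replace the naive DFT.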
Computational Psychometrics in Communication and Implications in Decision Making Mon, 03 Aug 2015 12:46:46 +0000 Recent investigations have emphasized the role of communication features on behavioral trust and reciprocity in economic decision making, but no studies have focused on the effect of communication on affective states in such a context. Thanks to advanced methods of computational psychometrics, in this study affective states were examined in depth using simultaneous, synchronized recordings of gaze and psychophysiological signals in 28 female students during an investment game. Results showed that participants experienced different affective states according to the type of communication (personal versus impersonal). In particular, participants involved in personal communication felt more relaxed than participants involved in impersonal communication. Moreover, personal communication influenced reciprocity and participants’ perceptions of trust and reciprocity. Findings were interpreted in the light of the Arousal/Valence Model and the self-disclosure process. Pietro Cipresso, Daniela Villani, Claudia Repetto, Lucia Bosone, Anna Balgera, Maurizio Mauri, Marco Villamira, Alessandro Antonietti, and Giuseppe Riva Copyright © 2015 Pietro Cipresso et al. All rights reserved.

An Efficient Optimization Method for Solving Unsupervised Data Classification Problems Wed, 29 Jul 2015 16:02:05 +0000 Unsupervised data classification (or clustering) is one of the most useful descriptive tasks in data mining; it seeks to group homogeneous objects based on similarity and is used in many medical disciplines and various applications. In general, there is no single algorithm that is suitable for all types of data, conditions, and applications. Each algorithm has its own advantages, limitations, and deficiencies. Hence, research into novel and effective approaches for unsupervised data classification remains active.
In this paper, the Biogeography-Based Optimization (BBO) algorithm, a heuristic inspired by the natural distribution of species across habitats, was adapted to data clustering problems by modifying its main operators. Like other population-based algorithms, the BBO algorithm starts with an initial population of candidate solutions to an optimization problem, for which an objective function is calculated. To evaluate the performance of the proposed algorithm, an assessment was carried out on six medical and real-life datasets, comparing it with eight well-known and recent unsupervised data classification algorithms. Numerical results demonstrate that the proposed evolutionary optimization algorithm is efficient for unsupervised data classification. Parvaneh Shabanzadeh and Rubiyah Yusof Copyright © 2015 Parvaneh Shabanzadeh and Rubiyah Yusof. All rights reserved.

Dual Energy Method for Breast Imaging: A Simulation Study Mon, 13 Jul 2015 11:22:59 +0000 Dual energy methods can suppress the contrast between adipose and glandular tissues in the breast and therefore enhance the visibility of calcifications. In this study, a dual energy method based on analytical modeling was developed for the detection of the minimum microcalcification thickness. To this aim, a modified radiographic X-ray unit was considered, in order to overcome the limited kVp range of the mammographic units used in previous DE studies, combined with a high-resolution CMOS sensor (pixel size of 22.5 μm) for improved resolution. Various filter materials were examined based on their K-absorption edge. Hydroxyapatite (HAp) was used to simulate microcalcifications. The contrast-to-noise ratio (CNR) of the subtracted images was calculated for both monoenergetic and polyenergetic X-ray beams. The optimum monoenergetic pair was 23/58 keV for the low and high energy, respectively, resulting in a minimum detectable microcalcification thickness of 100 μm.
In the polyenergetic X-ray study, the optimal spectral combination was 40/70 kVp, filtered with 100 μm of cadmium and 1000 μm of copper, respectively. In this case, the minimum detectable microcalcification thickness was 150 μm. The proposed dual energy method provides improved microcalcification detectability in breast imaging, with mean glandular dose values within acceptable levels. V. Koukou, N. Martini, C. Michail, P. Sotiropoulou, C. Fountzoula, N. Kalyvas, I. Kandarakis, G. Nikiforidis, and G. Fountos Copyright © 2015 V. Koukou et al. All rights reserved.

Modelling Optimal Control of Cholera in Communities Linked by Migration Mon, 13 Jul 2015 06:40:12 +0000 A mathematical model for the dynamics of cholera transmission, with permissible controls, between two connected communities is developed and analysed. The dynamics of the disease in the adjacent communities are assumed to be similar, with the main differences reflected only in the transmission and disease-related parameters. This assumption is based on the fact that adjacent communities often have different living conditions, and movement is inclined toward the community with better living conditions. Community-specific reproduction numbers are given, assuming movement of susceptible, infected, and recovered individuals between the communities. We carry out sensitivity analysis of the model parameters using the Latin Hypercube Sampling scheme to ascertain the degree of effect the parameters and controls have on progression of the infection. Using principles from optimal control theory, a temporal relationship between the distribution of controls and the severity of the infection is ascertained. Our results indicate that implementation of controls such as proper hygiene, sanitation, and vaccination across both affected communities is likely to eliminate the infection within half the time it would take through self-limitation.
In addition, although an infection may still break out in the presence of controls, it may be up to 8 times less devastating than when no controls are in place. J. B. H. Njagarah and F. Nyabadza Copyright © 2015 J. B. H. Njagarah and F. Nyabadza. All rights reserved.

Automated Delineation of Vessel Wall and Thrombus Boundaries of Abdominal Aortic Aneurysms Using Multispectral MR Images Sun, 05 Jul 2015 07:31:50 +0000 A correct patient-specific identification of the abdominal aortic aneurysm is useful at both the diagnosis and treatment stages, as it locates the disease and represents its geometry. The actual thickness and shape of the arterial wall and the intraluminal thrombus are of great importance when predicting the rupture of abdominal aortic aneurysms. The authors describe a novel method for delineating both the internal and external contours of the aortic wall, which allows distinguishing between the vessel wall and the intraluminal thrombus. The method is based on an active shape model and statistical texture information, and it was validated on eight MR patient studies. The resulting segmented images presented a mean Dice coefficient with respect to manual segmentations of 0.88 and a mean modified Hausdorff distance of 1.14 mm for the internal face of the arterial wall, and 0.86 and 1.33 mm for the external face. These preliminary results show high correspondence between automatic and manual measurements for the vessel wall and thrombus areas. However, since the dataset is small, the conclusions cannot be generalized. B. Rodriguez-Vila, J. Tarjuelo-Gutierrez, P. Sánchez-González, P. Verbrugghe, I. Fourneau, G. Maleux, P. Herijgers, and E. J. Gomez Copyright © 2015 B. Rodriguez-Vila et al. All rights reserved.
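The two validation metrics reported in the segmentation abstract above are standard and compact to implement. The sketch below is a generic implementation of the Dice coefficient on binary masks and the modified Hausdorff distance (the Dubuisson-Jain variant, which is the usual meaning of that term) on contour point sets; the toy masks are made up for illustration, not the authors' data.

```python
def dice_coefficient(mask_a, mask_b):
    """Dice overlap between two binary masks given as sets of pixel coordinates."""
    inter = len(mask_a & mask_b)
    return 2.0 * inter / (len(mask_a) + len(mask_b))

def modified_hausdorff(points_a, points_b):
    """Modified Hausdorff distance (Dubuisson & Jain): the larger of the two
    mean nearest-neighbour distances between the point sets."""
    def mean_nn(src, dst):
        total = 0.0
        for (x1, y1) in src:
            total += min(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
                         for (x2, y2) in dst)
        return total / len(src)
    return max(mean_nn(points_a, points_b), mean_nn(points_b, points_a))

# Toy example: an "automatic" 10x10 contour shifted one pixel right
# of the "manual" one.
manual = {(x, y) for x in range(10) for y in range(10)}
auto = {(x + 1, y) for x in range(10) for y in range(10)}
dice = dice_coefficient(manual, auto)
mhd = modified_hausdorff(list(manual), list(auto))
```

For real masks stored as 2D arrays, the coordinate sets are simply the indices of nonzero pixels.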
Dynamical Analysis of an SEIT Epidemic Model with Application to Ebola Virus Transmission in Guinea Thu, 02 Jul 2015 09:14:26 +0000 In order to investigate the transmission mechanism of individuals infected with Ebola virus, we establish an SEIT (susceptible, exposed in the latent period, infectious, and treated/recovered) epidemic model. The basic reproduction number is defined, and a mathematical analysis of the existence and stability of the disease-free and endemic equilibria is given. As an application of the model, we use the reported infection and death cases in Guinea to estimate the model parameters by the least squares method. With suitable parameter values, we obtain the estimated value of the basic reproduction number and analyze sensitivity and uncertainty using partial rank correlation coefficients. Zhiming Li, Zhidong Teng, Xiaomei Feng, Yingke Li, and Huiguo Zhang Copyright © 2015 Zhiming Li et al. All rights reserved.

Undersampled MR Image Reconstruction with Data-Driven Tight Frame Wed, 24 Jun 2015 12:09:26 +0000 Undersampled magnetic resonance image reconstruction employing sparsity regularization has attracted many researchers in recent years, supported by compressed sensing theory. Nevertheless, most existing sparsity-regularized reconstruction methods either lack the adaptability to capture structural information or suffer from a high computational load. With the aim of further improving image reconstruction accuracy without introducing too much computation, this paper proposes a data-driven tight frame magnetic resonance image reconstruction (DDTF-MRI) method. By taking advantage of the efficiency and effectiveness of data-driven tight frames, DDTF-MRI trains an adaptive tight frame to sparsify the to-be-reconstructed MR image. Furthermore, a two-level Bregman iteration algorithm has been developed to solve the proposed model.
The proposed method has been compared with two state-of-the-art methods on four datasets, and encouraging performance has been achieved by DDTF-MRI. Jianbo Liu, Shanshan Wang, Xi Peng, and Dong Liang Copyright © 2015 Jianbo Liu et al. All rights reserved.

Advances in Computational Methods for Genetic Diseases Thu, 18 Jun 2015 12:39:23 +0000 Francesco Camastra, Roberto Amato, Maria Donata Di Taranto, and Antonino Staiano Copyright © 2015 Francesco Camastra et al. All rights reserved.

Accelerated Compressed Sensing Based CT Image Reconstruction Thu, 18 Jun 2015 08:16:22 +0000 In X-ray computed tomography (CT), an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and by rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated from the statistical characteristics of the reconstruction process, formulated in terms of the measurement noise and the rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%.
Moreover, computation times of less than 30 s were obtained using a standard desktop computer without numerical optimization. SayedMasoud Hashemi, Soosan Beheshti, Patrick R. Gill, Narinder S. Paul, and Richard S. C. Cobbold Copyright © 2015 SayedMasoud Hashemi et al. All rights reserved.

Towards Automated Three-Dimensional Tracking of Nephrons through Stacked Histological Image Sets Mon, 15 Jun 2015 14:03:06 +0000 An automated approach for tracking individual nephrons through three-dimensional histological image sets of mouse and rat kidneys is presented. In a previous study, the available images were tracked manually through the image sets in order to explore renal microarchitecture. The purpose of the current research is to reduce the time and effort required to manually trace nephrons by creating an automated, intelligent system as a standard tool for such datasets. The algorithm is robust enough to isolate closely packed nephrons and track their convoluted paths despite a number of nonideal, interfering conditions, such as local image distortions, artefacts, and interstitial tissue interference. The system comprises image preprocessing, feature extraction, and a custom graph-based tracking algorithm, which is validated by a rule base and a machine learning algorithm. A study of a selection of automatically tracked nephrons, when compared with manual tracking, yields a 95% tracking accuracy for structures in the cortex, while those in the medulla have lower accuracy due to their narrower diameter and higher density. Limited manual intervention is introduced to improve tracking, enabling full nephron paths to be obtained with an average of 17 manual corrections per mouse nephron and 58 manual corrections per rat nephron. Charita Bhikha, Arne Andreasen, Erik I. Christensen, Robyn F. R. Letts, Adam Pantanowitz, David M. Rubin, Jesper S. Thomsen, and Xiao-Yue Zhai Copyright © 2015 Charita Bhikha et al. All rights reserved.
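The core of slice-to-slice tracking, as in the nephron study above, is linking detected cross-sections in one slice to those in the next. The toy sketch below shows only the simplest form of that idea, greedy nearest-neighbour linking with a distance gate; it is a hypothetical stand-in, not the authors' validated graph-based system with its rule base and machine learning components.

```python
def link_sections(slices, max_jump=5.0):
    """Greedily link cross-section centroids across consecutive slices.

    `slices` is a list of lists of (x, y) centroids, one list per histological
    slice. Returns tracks as lists of (slice_index, centroid); a track ends
    when no centroid in the next slice lies within `max_jump` pixels.
    """
    tracks = [[(0, c)] for c in slices[0]]
    open_tracks = list(tracks)
    for z in range(1, len(slices)):
        remaining = list(slices[z])
        still_open = []
        for track in open_tracks:
            _, (px, py) = track[-1]
            best, best_d = None, max_jump
            for c in remaining:
                d = ((c[0] - px) ** 2 + (c[1] - py) ** 2) ** 0.5
                if d <= best_d:
                    best, best_d = c, d
            if best is not None:
                remaining.remove(best)  # each centroid joins at most one track
                track.append((z, best))
                still_open.append(track)
        open_tracks = still_open
    return tracks

# Two nephron cross-sections drifting gently through four slices.
slices = [[(0.0, 0.0), (20.0, 0.0)],
          [(1.0, 0.5), (20.5, 1.0)],
          [(2.0, 1.0), (21.0, 2.0)],
          [(3.0, 1.5), (21.5, 3.0)]]
tracks = link_sections(slices)
```

A real system would replace the greedy assignment with globally optimal matching and add the kind of rule-based validation the abstract describes, precisely because convoluted tubules in the medulla defeat purely local linking.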
The Prioritization of Clinical Risk Factors of Obstructive Sleep Apnea Severity Using Fuzzy Analytic Hierarchy Process Mon, 15 Jun 2015 08:56:26 +0000 Recently, there has been a shortage of sleep laboratories able to accommodate patients in a timely manner. Delayed diagnosis and treatment may lead to worse outcomes, particularly in patients with severe obstructive sleep apnea (OSA). For this reason, prioritization of the polysomnography (PSG) queue should be based on disease severity. To date, there have been conflicting data on whether clinical information can predict OSA severity. A total of 1,042 suspected OSA patients underwent diagnostic PSG studies at the Siriraj Sleep Center during 2010-2011, and 113 variables were obtained from sleep questionnaires and anthropometric measurements. Nineteen groups of clinical risk factors, consisting of 42 variables, were categorized by OSA severity. This study aimed to rank these factors by employing the Fuzzy Analytic Hierarchy Process approach, based on a normalized weight vector. The results revealed that the first-ranked clinical risk factor for the Severe, Moderate, Mild, and No OSA groups was nighttime symptoms. The overall sensitivity/specificity of the approach for these groups was 92.32%/91.76%, 89.52%/88.18%, 91.08%/84.58%, and 96.49%/81.23%, respectively. We propose that urgent PSG appointments should be assigned on the basis of the clinical risk factors of the Severe OSA group. In addition, screening to separate Mild from No OSA patients in a sleep center setting, using symptoms during sleep, is also recommended (sensitivity = 87.12% and specificity = 72.22%). Thaya Maranate, Adisak Pongpullponsak, and Pimon Ruttanaumpawan Copyright © 2015 Thaya Maranate et al. All rights reserved.

A Forward Dynamic Modelling Investigation of Cause-and-Effect Relationships in Single Support Phase of Human Walking Sun, 14 Jun 2015 09:46:52 +0000 Mathematical gait models often fall into one of two categories: simple and complex.
There is a large leap in complexity between the two model types, meaning the effects of individual gait mechanisms can be overlooked. This study investigated the cause-and-effect relationships between gait mechanisms and the resulting kinematics and kinetics, using a sequence of mathematical models of increasing complexity. The focus was on the sagittal plane and single support only. Starting from an inverted pendulum (IP) model, extended to include a HAT (head-arms-trunk) segment and an actuated hip moment, further complexities were added one by one: a knee joint, an ankle joint with a static foot, heel rise, and finally a swing leg. The presence of a knee joint and an ankle moment (during foot flat) were shown to largely influence the initial peak in the vertical GRF curve. The second peak in this curve was achieved through a combination of heel rise and the presence of a swing leg. Heel rise was also shown to reduce errors in the horizontal GRF prediction in the second half of single support. The swing leg is important for centre-of-mass (CM) deceleration in late single support. These findings provide evidence for the specific effects of each gait mechanism. Michael McGrath, David Howard, and Richard Baker Copyright © 2015 Michael McGrath et al. All rights reserved.

Reconstruction Accuracy Assessment of Surface and Underwater 3D Motion Analysis: A New Approach Sun, 14 Jun 2015 06:29:29 +0000 This study assessed the accuracy of surface and underwater 3D reconstruction of a calibration volume with and without homography. A calibration volume (6000 × 2000 × 2500 mm) with 236 markers (64 surface and 88 underwater control points, including 8 common points at the water surface, and 92 validation points) was positioned in a 25 m swimming pool and recorded with two surface and four underwater cameras. Planar homography estimation for each calibration plane was computed to perform image rectification.
The direct linear transformation algorithm for 3D reconstruction was applied, using 1,600,000 different combinations of 32 of the 64 surface control points and 44 of the 88 underwater control points. The Root Mean Square (RMS) error of control and validation points was lower with homography than without it for both surface and underwater cameras. With homography, the RMS errors of control and validation points were similar between surface and underwater cameras. Without homography, the RMS error of the control points was greater for underwater than for surface cameras, and the opposite was observed for the validation points. It is recommended that future studies using 3D reconstruction include homography to improve the accuracy of swimming movement analysis. Kelly de Jesus, Karla de Jesus, Pedro Figueiredo, João Paulo Vilas-Boas, Ricardo Jorge Fernandes, and Leandro José Machado Copyright © 2015 Kelly de Jesus et al. All rights reserved.

Genetic Consequences of Antiviral Therapy on HIV-1 Wed, 10 Jun 2015 07:41:06 +0000 A variety of enzyme inhibitors have been developed to combat HIV-1; however, the fast evolutionary rate of this virus commonly leads to the emergence of resistance mutations that finally allow the mutant virus to survive. This review explores the main genetic consequences of HIV-1 molecular evolution during antiviral therapies, including viral genetic diversity and molecular adaptation. The role of recombination in the generation of drug resistance is also analyzed. Besides the investigation and discussion of published works, an evolutionary analysis of protease-coding genes collected from patients before and after treatment with different protease inhibitors is included to validate previous studies. Finally, the review discusses the importance of considering the genetic consequences of antiviral therapies in models of HIV-1 evolution, which could improve current genotypic resistance testing and treatment design.
Miguel Arenas Copyright © 2015 Miguel Arenas. All rights reserved.

Nanodosimetry-Based Plan Optimization for Particle Therapy Mon, 08 Jun 2015 06:00:50 +0000 Treatment planning for particle therapy is currently an active field of research due to uncertainty in how to modify the physical dose in order to create a uniform biological dose response in the target. A novel treatment plan optimization strategy, based on measurable nanodosimetric quantities rather than biophysical models, is proposed in this work. Simplified proton and carbon treatment plans were simulated in a water phantom to investigate the feasibility of the optimization. Track structures of the mixed radiation field produced at different depths in the target volume were simulated with Geant4-DNA, and nanodosimetric descriptors were calculated. The fluences of the treatment field pencil beams were optimized in order to create a mixed field with equal nanodosimetric descriptors at each of multiple positions in spread-out particle Bragg peaks. For both proton and carbon ion plans, a uniform spatial distribution of nanodosimetric descriptors could be obtained by optimizing opposing-field but not single-field plans. The results obtained indicate that uniform nanodosimetrically weighted plans, which may also be radiobiologically uniform, can be obtained with this approach. Future investigations need to demonstrate that this approach is also feasible for more complicated beam arrangements and that it leads to a biologically uniform response in tumor cells and tissues. Margherita Casiraghi and Reinhard W. Schulte Copyright © 2015 Margherita Casiraghi and Reinhard W. Schulte. All rights reserved.

Revisiting Warfarin Dosing Using Machine Learning Techniques Thu, 04 Jun 2015 15:19:15 +0000 Determining the appropriate dosage of warfarin is an important yet challenging task. Several prediction models have been proposed to estimate a therapeutic dose for patients.
These are either clinical models, which contain clinical and demographic variables, or pharmacogenetic models, which additionally contain genetic variables. In this paper, a new methodology for warfarin dosing is proposed. The patients are initially classified into two classes: the first contains patients who require doses of >30 mg/wk and the second contains patients who require doses of ≤30 mg/wk. This phase is performed using relevance vector machines. In the second phase, the optimal dose for each patient is predicted by two clinical regression models, each customized for one class of patients. The prediction accuracy of the model was 11.6 in terms of root mean squared error (RMSE) and 8.4 in terms of mean absolute error (MAE). In terms of RMSE, this was 15% and 5% lower than the IWPC and Gage models, respectively, which are the most widely used models in practice. In addition, the proposed model was compared with the fixed-dose approach of 35 mg/wk and with the model proposed by Sharabiani et al., and it outperformed both in terms of MAE and RMSE. Ashkan Sharabiani, Adam Bress, Elnaz Douzali, and Houshang Darabi Copyright © 2015 Ashkan Sharabiani et al. All rights reserved.

Medical Image Fusion Based on Rolling Guidance Filter and Spiking Cortical Model Wed, 03 Jun 2015 11:55:59 +0000 Medical image fusion plays an important role in the diagnosis and treatment of diseases, for example in image-guided radiotherapy and surgery. Although numerous medical image fusion methods have been proposed, most of these approaches are sensitive to noise and usually lead to distortion and information loss in the fused image. Furthermore, they lack universality when dealing with different kinds of medical images. In this paper, we propose a new medical image fusion method that overcomes the aforementioned issues of existing methods by combining a rolling guidance filter (RGF) with a spiking cortical model (SCM).
First, the saliency of the medical images is captured by the RGF. Second, a self-adaptive threshold for the SCM is obtained from the mean and variance of the source images. Finally, the fused image is generated by the SCM, driven by the RGF coefficients. Experimental results show that the proposed method is superior to other currently popular methods in both subjective visual performance and objective criteria. Liu Shuaiqi, Zhao Jie, and Shi Mingzhu Copyright © 2015 Liu Shuaiqi et al. All rights reserved.

Enhancing the Lasso Approach for Developing a Survival Prediction Model Based on Gene Expression Data Wed, 03 Jun 2015 07:57:39 +0000 In the past decade, researchers in oncology have sought to develop survival prediction models using gene expression data. The least absolute shrinkage and selection operator (lasso) has been widely used to select genes truly correlated with a patient’s survival. The lasso selects genes for prediction by shrinking a large number of coefficients of the candidate genes towards zero, based on a tuning parameter that is often determined by cross-validation (CV). However, this method can pass over true positive genes (i.e., it yields false negatives) in certain instances, because the lasso tends to favor a simple prediction model. Here, we attempt to monitor the occurrence of false negatives by developing a method for estimating the number of true positive (TP) genes over a series of values of the tuning parameter; the method assumes a mixture distribution for the lasso estimates. Using the developed method, we performed a simulation study to examine its precision in estimating the number of TP genes. Additionally, we applied our method to a real gene expression dataset and found that it was able to identify genes correlated with survival that the CV method was unable to detect. Shuhei Kaneko, Akihiro Hirakawa, and Chikuma Hamada Copyright © 2015 Shuhei Kaneko et al. All rights reserved.
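The false-negative behaviour of the lasso described above is easiest to see in the special case of an orthonormal design, where the lasso solution reduces to soft-thresholding the least-squares estimates: as the tuning parameter grows, genes with moderate true effects are shrunk exactly to zero. A minimal sketch with made-up effect estimates (this is the textbook special case, not the mixture-distribution method the paper proposes):

```python
def soft_threshold(value, lam):
    """Lasso coefficient under an orthonormal design: the least-squares
    estimate shrunk toward zero by lam, snapped to zero inside [-lam, lam]."""
    if value > lam:
        return value - lam
    if value < -lam:
        return value + lam
    return 0.0

# Hypothetical least-squares effect estimates for six candidate genes.
ols_estimates = [2.5, -1.8, 0.4, 0.1, -0.3, 1.2]

kept = {}
for lam in (0.2, 0.5, 1.0):
    coefs = [soft_threshold(b, lam) for b in ols_estimates]
    kept[lam] = sum(1 for c in coefs if c != 0.0)
# Larger tuning parameters select fewer genes: weaker true effects such as
# 0.4 and -0.3 are zeroed out and become false negatives.
```

A CV-chosen tuning parameter that favors a simple model corresponds to a large `lam` here, which is exactly the regime in which such moderate but genuine effects disappear.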
The Technological Growth in eHealth Services Wed, 03 Jun 2015 07:56:59 +0000 The infusion of information and communication technology (ICT) into health services is emerging as an active area of research. It has several advantages, but perhaps the most important is the cost-effective, timely provision of medical services, global expertise, and holistic care to everyone, irrespective of geographic boundaries. This paper provides a systematic review of technological growth in eHealth services. The present study reviews and analyzes the role of four important technologies, namely, satellite, internet, mobile, and cloud, in providing health services. Shilpa Srivastava, Millie Pant, Ajith Abraham, and Namrata Agrawal Copyright © 2015 Shilpa Srivastava et al. All rights reserved.

Monte Carlo Calculation of Radioimmunotherapy with 90Y-, 177Lu-, 131I-, 124I-, and 188Re-Nanoobjects: Choice of the Best Radionuclide for Solid Tumour Treatment by Using TCP and NTCP Concepts Tue, 02 Jun 2015 07:32:44 +0000 Radioimmunotherapy using monoclonal antibodies combined with a radioisotope such as 131I or 90Y has so far remained ineffective for the treatment of solid and radioresistant tumours. Previous simulations have revealed that an increase in the number of 90Y atoms labelled to each antibody or nanoobject could be a solution to improve treatment output. It now seems important to assess the treatment output and toxicity when radionuclides such as 90Y, 177Lu, 131I, 124I, and 188Re are used. Tumour control probability (TCP) and normal tissue complication probability (NTCP) curves versus the number of radionuclides per nanoobject were computed with MCNPX to evaluate treatment efficacy for solid tumours and to predict the incidence of surrounding side effects.
Analyses were carried out for two solid tumour sizes, of 0.5 and 1.0 cm radius, and for nanoobjects (i.e., radiolabelled antibodies) distributed uniformly or nonuniformly throughout a solid tumour (e.g., non-small-cell lung cancer (NSCLC)). 90Y and 188Re are the best candidates for solid tumour treatment when only one radionuclide is coupled to one carrier. Furthermore, regardless of the radionuclide properties, high values of TCP can be reached without toxicity if the number of radionuclides per nanoobject increases. S. Lucas, O. Feron, B. Gallez, B. Masereel, C. Michiels, and T. Vander Borght Copyright © 2015 S. Lucas et al. All rights reserved. Automatic Evaluation of Voice Quality Using Text-Based Laryngograph Measurements and Prosodic Analysis Tue, 02 Jun 2015 06:48:46 +0000 Due to low intra- and interrater reliability, perceptual voice evaluation should be supported by objective, automatic methods. In this study, text-based, computer-aided prosodic analysis and measurements of connected speech were combined in order to model perceptual evaluation according to the German Roughness-Breathiness-Hoarseness (RBH) scheme. 58 connected speech samples (43 women and 15 men) containing the German version of the text “The North Wind and the Sun” were evaluated perceptually by 19 speech and voice therapy students according to the RBH scale. For the human-machine correlation, Support Vector Regression with measurements of the vocal fold cycle irregularities (CFx) and the closed phases of vocal fold vibration (CQx) from the Laryngograph, together with 33 features from a prosodic analysis module, was used to model the listeners’ ratings. The best human-machine results for roughness were obtained from a combination of six prosodic features and CFx. These correlations were approximately the same as the interrater agreement among the human raters. CQx was one of the substantial features of the hoarseness model.
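The human-machine correlation in this study can be sketched in miniature (entirely synthetic feature values and ratings, and ordinary least squares as a simple stand-in for the Support Vector Regression actually used): a model is fitted to the perceptual ratings from acoustic features, and the Pearson correlation between predicted and rated scores is the human-machine agreement.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 58  # speech samples, matching the study; the feature values below are synthetic

# Hypothetical feature matrix: six prosodic features plus CFx
X = rng.standard_normal((n, 7))
true_w = np.array([0.8, -0.5, 0.3, 0.0, 0.4, -0.2, 0.6])
ratings = X @ true_w + 0.5 * rng.standard_normal(n)  # simulated listener ratings

# Least-squares fit as a stand-in for Support Vector Regression
w, *_ = np.linalg.lstsq(X, ratings, rcond=None)
pred = X @ w

# Pearson correlation between machine predictions and perceptual scores
r = np.corrcoef(pred, ratings)[0, 1]
print(f"human-machine correlation r = {r:.2f}")
```

A human-machine correlation on the order of the interrater agreement, as reported for roughness, is the benchmark such a model aims for.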
For hoarseness and breathiness, the human-machine agreement was substantially lower. Nevertheless, the automatic analysis method can serve as the basis for meaningful objective support of perceptual analysis. Tino Haderlein, Cornelia Schwemmle, Michael Döllinger, Václav Matoušek, Martin Ptok, and Elmar Nöth Copyright © 2015 Tino Haderlein et al. All rights reserved. Preliminary Investigation of Microdosimetric Track Structure Physics Models in Geant4-DNA and RITRACKS Mon, 01 Jun 2015 13:12:32 +0000 The major differences between the physics models in the Geant4-DNA and RITRACKS Monte Carlo packages are investigated. Proton and electron ionisation interactions and electron excitation interactions in water are considered in the current work. While these packages use similar semiempirical physics models for inelastic cross-sections, the implementation of these models is shown to differ significantly. This is demonstrated in a simple Monte Carlo simulation designed to identify differences in interaction cross-sections. Michael Douglass, Scott Penfold, and Eva Bezak Copyright © 2015 Michael Douglass et al. All rights reserved. The Influence of DNA Configuration on the Direct Strand Break Yield Mon, 01 Jun 2015 12:37:51 +0000 Purpose. To study the influence of DNA configuration on the direct damage yield; no indirect effects were accounted for. Methods. The GEANT4-DNA code was used to simulate the interactions of protons and alpha particles with geometrical models of the A-, B-, and Z-DNA configurations. The direct total, single, and double strand break yields and site-hit probabilities were determined. Certain features of the energy deposition process were also studied. Results. A slight increase of the site-hit probability as a function of the incident particle linear energy transfer was found for each DNA configuration. Nevertheless, each DNA form presents a well-defined site-hit probability, largely independent of the particle linear energy transfer.
Approximately 70% of the inelastic collisions and ~60% of the absorbed dose are due to secondary electrons. These fractions are slightly higher for protons than for alpha particles at the same incident energy. Conclusions. The total direct strand break yield depends only weakly on the DNA conformation; it is practically determined by the target volume of the DNA configuration. However, the double strand break yield increases with the packing ratio of the DNA double helix and thus does depend on the DNA conformation. M. A. Bernal, C. E. deAlmeida, S. Incerti, C. Champion, V. Ivanchenko, and Z. Francis Copyright © 2015 M. A. Bernal et al. All rights reserved. Electrical Neuroimaging with Irrotational Sources Sun, 31 May 2015 07:58:39 +0000 This paper discusses theoretical aspects of modeling the sources of the EEG (i.e., the bioelectromagnetic inverse problem, or source localization problem). Using the Helmholtz decomposition (HD) of the current density vector (CDV) of the primary current into an irrotational (I) and a solenoidal (S) part, we show that only the irrotational part can contribute to the EEG measurements. In particular, we present for the first time the HD of a dipole and of a pure irrotational source. We show that, for both kinds of sources, I extends over all of space, independently of whether the source is spatially concentrated (as the dipole is) or not. However, the divergence remains confined to a region coinciding with the expected location of the sources, confirming that it is the divergence, rather than the CDV, that really defines the spatial extension of the generators. It follows that an irrotational source model (ELECTRA) is always physiologically meaningful as long as the divergence remains confined to the brain.
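The decomposition underlying this argument can be written out explicitly. The Helmholtz decomposition splits the primary current density vector into an irrotational part (the gradient of a scalar potential) and a solenoidal part (the curl of a vector potential):

```latex
\mathbf{J}_p \;=\; \underbrace{-\,\nabla\varphi}_{\text{irrotational }(\mathbf{I})}
\;+\; \underbrace{\nabla\times\mathbf{A}}_{\text{solenoidal }(\mathbf{S})},
\qquad
\nabla\times(\nabla\varphi)=\mathbf{0},
\quad
\nabla\cdot(\nabla\times\mathbf{A})=0 .
```

Because the solenoidal part is divergence-free, the divergence of the primary current reduces to \(\nabla\cdot\mathbf{J}_p = -\nabla^{2}\varphi\), which is why only the irrotational part can act as a source of the scalar potentials measured by the EEG.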
Finally, we show that the irrotational source model remains valid for the most general electrodynamic model of the EEG, in inhomogeneous, anisotropic, dispersive media, and thus far beyond the (quasi-)static approximation. Rolando Grave de Peralta Menendez and Sara Gonzalez Andino Copyright © 2015 Rolando Grave de Peralta Menendez and Sara Gonzalez Andino. All rights reserved. A New Approach for Mining Order-Preserving Submatrices Based on All Common Subsequences Thu, 28 May 2015 12:36:32 +0000 Order-preserving submatrices (OPSMs) have been applied as an important unsupervised learning model in many fields, such as DNA microarray data analysis, automatic recommendation systems, and target marketing systems. Unfortunately, because the problem is NP-complete, most existing methods are heuristic algorithms that are unable to reveal all OPSMs. In particular, deep OPSMs, corresponding to long patterns with few supporting sequences, incur explosive computational costs and are completely pruned away by most popular methods. In this paper, we propose an exact method to discover all OPSMs based on frequent sequential pattern mining. First, an existing algorithm was adapted to disclose all common subsequences (ACS) between every two row sequences, so that no deep OPSMs are missed. Then, an improved prefix-tree data structure was used to store and traverse the ACS, and the Apriori principle was employed to mine the frequent sequential patterns efficiently. Finally, experiments were conducted on gene and synthetic datasets. The results demonstrate the effectiveness and efficiency of this method. Yun Xue, Zhengling Liao, Meihang Li, Jie Luo, Qiuhua Kuang, Xiaohui Hu, and Tiechen Li Copyright © 2015 Yun Xue et al. All rights reserved. Statistical and Computational Methods for Genetic Diseases: An Overview Thu, 28 May 2015 11:29:41 +0000 The identification of the causes of genetic diseases has been pursued by several approaches of increasing complexity.
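The ACS step in the OPSM abstract above can be illustrated with a minimal brute-force sketch (enumeration over all index subsets, so only feasible for short row sequences; the paper's adjusted algorithm and prefix tree are needed at scale): it enumerates every common subsequence of two row sequences, including the long, weakly supported ones that correspond to deep OPSMs.

```python
from itertools import combinations

def all_subsequences(seq):
    """All non-empty subsequences of a sequence, as tuples."""
    return {tuple(seq[i] for i in idx)
            for r in range(1, len(seq) + 1)
            for idx in combinations(range(len(seq)), r)}

def all_common_subsequences(a, b):
    """All common subsequences (ACS) of two row sequences, by set intersection."""
    return all_subsequences(a) & all_subsequences(b)

# Two rows of an expression matrix, each written as its column ordering
row1 = "abdc"
row2 = "badc"
acs = all_common_subsequences(row1, row2)
print(sorted(acs, key=lambda s: (len(s), s)))
```

An OPSM pattern supported by both rows must appear in this set, which is why computing the ACS exactly guarantees that no deep OPSM is missed.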
Innovations in genetic methodologies lead to the production of large amounts of data that require the support of statistical and computational methods to be processed correctly. The aim of this paper is to provide an overview of statistical and computational methods, paying particular attention to methods for sequence analysis and for complex diseases. Francesco Camastra, Maria Donata Di Taranto, and Antonino Staiano Copyright © 2015 Francesco Camastra et al. All rights reserved.
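As a concrete instance of the statistical methods surveyed for complex diseases, a single-variant case-control association test can be sketched (the allele counts below are hypothetical): a 2×2 allele-count table is scored with Pearson's chi-square statistic, large values suggesting association between allele and disease.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]]
    (rows: cases/controls; columns: risk allele / other allele)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: risk allele on 30/100 case vs 10/100 control chromosomes
chi2 = chi_square_2x2(30, 70, 10, 90)
print(f"chi-square = {chi2:.2f}")  # compare against the 1-d.o.f. chi-square distribution
```

With one degree of freedom, a statistic this far above the 3.84 critical value (p = 0.05) would indicate a significant case-control difference in allele frequency.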