Computational and Mathematical Methods in Medicine The latest articles from Hindawi Publishing Corporation © 2015, Hindawi Publishing Corporation. All rights reserved. Fast and Accurate Semiautomatic Segmentation of Individual Teeth from Dental CT Images Thu, 27 Aug 2015 06:31:15 +0000 In this paper, we propose a fast and accurate semiautomatic method to effectively distinguish individual teeth from the sockets of teeth in dental CT images. Parameter values of thresholding and shapes of the teeth are propagated to the neighboring slice, based on the separated teeth from reference images. After the propagation of threshold values and shapes of the teeth, the histogram of the current slice is analyzed. The individual teeth are automatically separated and segmented by using seeded region growing. Then, the newly generated separation information is iteratively propagated to the neighboring slice. Our method was validated on ten sets of dental CT scans, and the results were compared with manually segmented results and conventional methods. The average error of absolute value of volume measurement was , which was more accurate than conventional methods. With multicore processing, the method ran 2.4 times faster than on a single-core processor. The proposed method identified individual teeth accurately, demonstrating that it can give dentists substantial assistance during dental surgery. Ho Chul Kang, Chankyu Choi, Juneseuk Shin, Jeongjin Lee, and Yeong-Gil Shin Copyright © 2015 Ho Chul Kang et al. All rights reserved. Estimation of the Basic Reproductive Ratio for Dengue Fever at the Take-Off Period of Dengue Infection Tue, 25 Aug 2015 09:02:20 +0000 Estimating the basic reproductive ratio of dengue fever has continued to be an ever-increasing challenge among epidemiologists. In this paper we propose two different constructions to estimate this ratio, each derived from a dynamical system of a host-vector dengue transmission model.
The constructions are based on the original assumption that in the early stages of an epidemic the infected human compartment increases exponentially at the same rate as the infected mosquito compartment (previous work). In the first proposed construction, we modify previous works by assuming that the rates of infection for the mosquito and human compartments might be different. In the second construction, we add an improvement by including more realistic conditions, in which the dynamics of the infected human compartment are influenced by the dynamics of the infected mosquito compartment, and vice versa. We apply our constructions to real dengue epidemic data from SB Hospital, Bandung, Indonesia, during the outbreak period of Nov. 25, 2008–Dec. 2012. We also propose two scenarios for determining the take-off rate of infection at the beginning of a dengue epidemic when constructing the estimates: scenario I uses the equation of new dengue cases with respect to time (daily), and scenario II uses the equation of new dengue cases with respect to the cumulative number of new dengue cases. The results show that our first construction accommodates the take-off rate differences between mosquitoes and humans. Our second construction takes into account the presence of infective mosquitoes in the early growth rate of infective humans, and vice versa. We conclude that the second approach is more realistic, compared with our first approach and the previous work. Jafaruddin, Sapto W. Indratno, Nuning Nuraini, Asep K. Supriatna, and Edy Soewono Copyright © 2015 Jafaruddin et al. All rights reserved. Risk Prediction of One-Year Mortality in Patients with Cardiac Arrhythmias Using Random Survival Forest Tue, 25 Aug 2015 08:21:54 +0000 Existing models for predicting mortality based on the traditional Cox proportional hazard approach (CPH) often have low prediction accuracy.
This paper aims to develop a clinical risk model with good accuracy for predicting 1-year mortality in patients with cardiac arrhythmias using random survival forest (RSF), a robust approach for survival analysis. A total of 10,488 patients with cardiac arrhythmias in the public MIMIC II clinical database were investigated, with 3,452 deaths occurring within the 1-year follow-up. Forty risk factors, including demographics, clinical and laboratory information, and antiarrhythmic agents, were analyzed as potential predictors of all-cause mortality. RSF was adopted to build a comprehensive survival model and a simplified risk model composed of the 14 top risk factors. The comprehensive model achieved a prediction accuracy of 0.81, measured by the c-statistic with 10-fold cross-validation. The simplified risk model also achieved a good accuracy of 0.799. Both results outperformed traditional CPH (which achieved a c-statistic of 0.733 for the comprehensive model and 0.718 for the simplified model). Moreover, various factors were observed to have a nonlinear impact on cardiac arrhythmia prognosis. As a result, the RSF-based model, which takes nonlinearity into account, significantly outperformed the traditional Cox proportional hazard model and has great potential to be a more effective approach for survival analysis. Fen Miao, Yun-Peng Cai, Yu-Xiao Zhang, Ye Li, and Yuan-Ting Zhang Copyright © 2015 Fen Miao et al. All rights reserved. Dependence of Shape-Based Descriptors and Mass Segmentation Areas on Initial Contour Placement Using the Chan-Vese Method on Digital Mammograms Mon, 24 Aug 2015 06:26:18 +0000 Variation in signal intensity within mass lesions and missing boundary information are intensity inhomogeneities inherent in digital mammograms. These inhomogeneities render the performance of a deformable contour susceptible to the location of its initial position and may lead to poor segmentation results for these images.
We investigate the dependence of shape-based descriptors and mass segmentation areas on initial contour placement with the Chan-Vese segmentation method and compare these results to the active contours with selective local or global segmentation model. For each mass lesion, final contours were obtained by propagation of a proposed initial level set contour and by propagation of a manually drawn contour enclosing the region of interest. Differences in shape-based descriptors were quantified using absolute percentage differences, Euclidean distances, and Bland-Altman analysis. Segmented areas were evaluated with the area overlap measure. Differences were dependent upon the characteristics of the mass margins. Boundary moments presented large percentage differences. Pearson correlation analysis showed statistically significant correlations between shape-based descriptors from both initial locations. In conclusion, boundary moments of digital mass lesions are sensitive to the placement of initial level set contours while shape-based descriptors such as Fourier descriptors, shape convexity, and shape rectangularity exhibit a certain degree of robustness to changes in the location of the initial level set contours for both segmentation algorithms. S. N. Acho and W. I. D. Rae Copyright © 2015 S. N. Acho and W. I. D. Rae. All rights reserved. Efficient Noninferiority Testing Procedures for Simultaneously Assessing Sensitivity and Specificity of Two Diagnostic Tests Thu, 20 Aug 2015 12:42:10 +0000 Sensitivity and specificity are often used to assess the performance of a diagnostic test with binary outcomes. Wald-type test statistics have been proposed for testing sensitivity and specificity individually. In the presence of a gold standard, simultaneous comparison between two diagnostic tests for noninferiority of sensitivity and specificity based on an asymptotic approach has been studied by Chen et al. (2003). 
However, the asymptotic approach may suffer from unsatisfactory type I error control, as observed in many studies, especially in small to medium sample settings. In this paper, we compare three unconditional approaches for simultaneously testing sensitivity and specificity: approaches based on estimation, on maximization, and on a combination of estimation and maximization. Although the estimation approach does not guarantee control of the type I error rate, its type I error performance is satisfactory in practice. The other two unconditional approaches are exact. The approach based on estimation and maximization is generally more powerful than the approach based on maximization. Guogen Shan, Amei Amei, and Daniel Young Copyright © 2015 Guogen Shan et al. All rights reserved. Corrigendum to “Optimization and Corroboration of the Regulatory Pathway of p42.3 Protein in the Pathogenesis of Gastric Carcinoma” Mon, 17 Aug 2015 12:19:02 +0000 Yibin Hao, Tianli Fan, and Kejun Nan Copyright © 2015 Yibin Hao et al. All rights reserved. Modeling of the Bacillus subtilis Bacterial Biofilm Growing on an Agar Substrate Mon, 17 Aug 2015 07:44:41 +0000 Bacterial biofilms are organized communities composed of millions of microorganisms that accumulate on almost any kind of surface. In this paper, a biofilm growth model on an agar substrate is developed based on mass conservation principles, Fick’s first law, and Monod’s kinetic reaction, considering nutrient diffusion between the biofilm and the agar substrate. Our results show biofilm growth evolution characteristics such as biofilm thickness, active biomass, and nutrient concentration in the agar substrate. We quantitatively obtain the dependence of biofilm growth on different parameters. We provide an alternative mathematical method to describe other kinds of biofilm growth, such as multispecies bacterial biofilms, as well as biofilm growth on various complex substrates.
Xiaoling Wang, Guoqing Wang, and Mudong Hao Copyright © 2015 Xiaoling Wang et al. All rights reserved. The Structural Characterization of Tumor Fusion Genes and Proteins Mon, 10 Aug 2015 14:07:54 +0000 Chromosomal translocation, which generates fusion proteins in blood tumors or solid tumors, is considered one of the major causes of cancer. Recent studies suggested that the disordered fragments in a fusion protein might contribute to its carcinogenicity. Here, we investigated the sequence features near the breakpoints in the fusion partner genes, the structural features of breakpoints in fusion proteins, and the posttranslational modification preferences in the fusion proteins. Results show that the breakpoints in the fusion partner genes have both sequence preference and structural preference. At the sequence level, the nucleotide combination AG is preferred before the breakpoint and GG is preferred at the breakpoint. At the structural level, the breakpoints in the fusion proteins prefer to be located in disordered regions. Further analysis suggests that phosphorylation sites at serine and threonine and methylation sites at arginine are enriched in disordered regions of the fusion proteins. Using EML4-ALK as an example, we further explain how the fusion protein leads to protein disorder and contributes to its carcinogenicity. The sequence and structural features of the fusion proteins may help the scientific community to predict novel breakpoints in fusion genes and better understand the structure and function of fusion proteins. Dandan Wang, Daixi Li, Guangrong Qin, Wen Zhang, Jian Ouyang, Menghuan Zhang, and Lu Xie Copyright © 2015 Dandan Wang et al. All rights reserved. A Relation Extraction Framework for Biomedical Text Using Hybrid Feature Set Mon, 10 Aug 2015 09:45:26 +0000 Information extraction from unstructured text segments is a complex task.
Although manual information extraction often produces the best results, it is hard to manage biomedical data extraction manually because of the exponential increase in data size. Thus, there is a need for automatic tools and techniques for information extraction in biomedical text mining. Relation extraction is a significant area of biomedical information extraction that has gained much importance in the last two decades. A lot of work has been done on biomedical relation extraction focusing on rule-based and machine learning techniques. In the last decade, the focus has shifted to hybrid approaches, which show better results. This research presents a hybrid feature set for classification of relations between biomedical entities. The main contribution of this research lies in the semantic feature set, where verb phrases are ranked using the Unified Medical Language System (UMLS) and a ranking algorithm. Support Vector Machine and Naïve Bayes, two effective machine learning techniques, are used to classify these relations. Our approach has been validated on the standard biomedical text corpus obtained from MEDLINE 2001. In conclusion, our framework outperforms all state-of-the-art approaches used for relation extraction on the same corpus. Abdul Wahab Muzaffar, Farooque Azam, and Usman Qamar Copyright © 2015 Abdul Wahab Muzaffar et al. All rights reserved. Stability and Hopf Bifurcation in a Delayed HIV Infection Model with General Incidence Rate and Immune Impairment Tue, 04 Aug 2015 13:07:45 +0000 We investigate the dynamical behavior of a delayed HIV infection model with general incidence rate and immune impairment. We derive two threshold parameters, the basic reproduction number and the immune response reproduction number. By using Lyapunov functionals and the LaSalle invariance principle, we prove the global stability of the infection-free equilibrium and the infected equilibrium without immunity.
Furthermore, the existence of Hopf bifurcations at the infected equilibrium with CTL response is also studied. By theoretical analysis and numerical simulations, the effect of the immune impairment rate on the stability of the infected equilibrium with CTL response is examined. Fuxiang Li, Wanbiao Ma, Zhichao Jiang, and Dan Li Copyright © 2015 Fuxiang Li et al. All rights reserved. Advances in Computational Psychometrics Tue, 04 Aug 2015 12:46:21 +0000 Pietro Cipresso, Aleksandar Matic, Dimitris Giakoumis, and Yuri Ostrovsky Copyright © 2015 Pietro Cipresso et al. All rights reserved. The Use of Virtual Reality in Psychology: A Case Study in Visual Perception Mon, 03 Aug 2015 14:32:23 +0000 The recent proliferation of available virtual reality (VR) tools has led to their increased use in psychological research. This is due to a number of advantages afforded over traditional experimental apparatus, such as tighter control of the environment and the possibility of creating more ecologically valid stimulus presentation and response protocols. At the same time, the higher levels of immersion and visual fidelity afforded by VR do not necessarily evoke presence or elicit a “realistic” psychological response. The current paper reviews some current uses for VR environments in psychological research and discusses some ongoing questions for researchers. Finally, we focus on the area of visual perception, where both the advantages and challenges of VR are particularly salient. Christopher J. Wilson and Alessandro Soranzo Copyright © 2015 Christopher J. Wilson and Alessandro Soranzo. All rights reserved. Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders Mon, 03 Aug 2015 14:23:18 +0000 Augmented Reality is a new technological system that introduces virtual content into the real world so that virtual and real elements run in the same representation, enhancing the user’s sensory perception of reality in real time.
From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to the physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has shown preliminary evidence of being a useful tool, owing to its interactivity and its adaptability to patient needs and therapeutic purposes. Another relevant factor is the quality of the user’s experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews the recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that make Augmented Reality a new technique useful for psychology. Irene Alice Chicchi Giglioli, Federica Pallavicini, Elisa Pedroli, Silvia Serino, and Giuseppe Riva Copyright © 2015 Irene Alice Chicchi Giglioli et al. All rights reserved. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia Mon, 03 Aug 2015 13:34:19 +0000 Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI), such as usability and utility, through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting their specific needs. However, there is demand for more specific modules embodied in cognitive architectures that can detect abnormal cognitive decline across new synthetic task environments.
Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort to enhance the accessibility of ICT products for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied in cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users’ cognition, evaluated their ability to multitask, and monitored performance on infotainment-related tasks, providing more accurate simulation results within existing conceptual frameworks and enhanced predictive validity for interface design; increased task complexity captured a more detailed profile of users’ capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data, to be used for more reliable interface evaluation through simulation on the basis of virtual models of MCI users. Sofia Segkouli, Ioannis Paliokas, Dimitrios Tzovaras, Thanos Tsakiris, Magda Tsolaki, and Charalampos Karagiannidis Copyright © 2015 Sofia Segkouli et al. All rights reserved. Thermal Infrared Imaging-Based Computational Psychophysiology for Psychometrics Mon, 03 Aug 2015 12:48:44 +0000 Thermal infrared imaging has been proposed as a potential system for the computational assessment of human autonomic nervous activity and psychophysiological states in a contactless and noninvasive way. Through bioheat modeling of facial thermal imagery, several vital signs can be extracted, including localized blood perfusion, cardiac pulse, breath rate, and sudomotor response, since all these parameters impact the cutaneous temperature. The obtained physiological information could then be used to draw inferences about a variety of psychophysiological or affective states, as proved by the increasing number of psychophysiological studies using thermal infrared imaging.
This paper therefore reviews the principal achievements of thermal infrared imaging in computational physiology with regard to its capability to monitor psychophysiological activity. Daniela Cardone, Paola Pinti, and Arcangelo Merla Copyright © 2015 Daniela Cardone et al. All rights reserved. Computational Psychometrics in Communication and Implications in Decision Making Mon, 03 Aug 2015 12:46:46 +0000 Recent investigations have emphasized the role of communication features in behavioral trust and reciprocity in economic decision making, but no studies have focused on the effect of communication on affective states in such a context. Thanks to advanced methods of computational psychometrics, in this study affective states were examined in depth using simultaneous and synchronized recordings of gazes and psychophysiological signals in 28 female students during an investment game. Results showed that participants experienced different affective states according to the type of communication (personal versus impersonal). In particular, participants involved in personal communication felt more relaxed than participants involved in impersonal communication. Moreover, personal communication influenced reciprocity and participants’ perceptions about trust and reciprocity. Findings were interpreted in the light of the Arousal/Valence Model and the self-disclosure process. Pietro Cipresso, Daniela Villani, Claudia Repetto, Lucia Bosone, Anna Balgera, Maurizio Mauri, Marco Villamira, Alessandro Antonietti, and Giuseppe Riva Copyright © 2015 Pietro Cipresso et al. All rights reserved. An Efficient Optimization Method for Solving Unsupervised Data Classification Problems Wed, 29 Jul 2015 16:02:05 +0000 Unsupervised data classification (or clustering) analysis is one of the most useful tools and a descriptive task in data mining that seeks to classify homogeneous groups of objects based on similarity; it is used in many medical disciplines and various applications.
In general, there is no single algorithm that is suitable for all types of data, conditions, and applications. Each algorithm has its own advantages, limitations, and deficiencies. Hence, research into novel and effective approaches for unsupervised data classification is still active. In this paper a heuristic method, the Biogeography-Based Optimization (BBO) algorithm, inspired by the natural biogeographic distribution of different species, was adapted for data clustering problems by modifying its main operators. Similar to other population-based algorithms, the BBO algorithm starts with an initial population of candidate solutions to an optimization problem and an objective function that is calculated for them. To evaluate the performance of the proposed algorithm, an assessment was carried out on six medical and real-life datasets, comparing it with eight well-known and recent unsupervised data classification algorithms. Numerical results demonstrate that the proposed evolutionary optimization algorithm is efficient for unsupervised data classification. Parvaneh Shabanzadeh and Rubiyah Yusof Copyright © 2015 Parvaneh Shabanzadeh and Rubiyah Yusof. All rights reserved. Dual Energy Method for Breast Imaging: A Simulation Study Mon, 13 Jul 2015 11:22:59 +0000 Dual energy methods can suppress the contrast between adipose and glandular tissues in the breast and therefore enhance the visibility of calcifications. In this study, a dual energy method based on analytical modeling was developed for the detection of the minimum microcalcification thickness. To this aim, a modified radiographic X-ray unit was considered, in order to overcome the limited kVp range of mammographic units used in previous DE studies, combined with a high resolution CMOS sensor (pixel size of 22.5 μm) for improved resolution. Various filter materials were examined based on their K-absorption edge. Hydroxyapatite (HAp) was used to simulate microcalcifications.
The contrast-to-noise ratio (CNR) of the subtracted images was calculated for both monoenergetic and polyenergetic X-ray beams. The optimum monoenergetic pair was 23/58 keV for the low and high energy, respectively, resulting in a minimum detectable microcalcification thickness of 100 μm. In the polyenergetic X-ray study, the optimal spectral combination was 40/70 kVp, filtered with 100 μm cadmium and 1000 μm copper, respectively. In this case, the minimum detectable microcalcification thickness was 150 μm. The proposed dual energy method provides improved microcalcification detectability in breast imaging, with mean glandular dose values within acceptable levels. V. Koukou, N. Martini, C. Michail, P. Sotiropoulou, C. Fountzoula, N. Kalyvas, I. Kandarakis, G. Nikiforidis, and G. Fountos Copyright © 2015 V. Koukou et al. All rights reserved. Modelling Optimal Control of Cholera in Communities Linked by Migration Mon, 13 Jul 2015 06:40:12 +0000 A mathematical model for the dynamics of cholera transmission with permissible controls between two connected communities is developed and analysed. The dynamics of the disease in the adjacent communities are assumed to be similar, with the main differences reflected only in the transmission and disease related parameters. This assumption is based on the fact that adjacent communities often have different living conditions, and movement is inclined toward the community with better living conditions. Community-specific reproduction numbers are given assuming movement of susceptible, infected, and recovered individuals between communities. We carry out sensitivity analysis of the model parameters using the Latin Hypercube Sampling scheme to ascertain the degree of effect the parameters and controls have on the progression of the infection. Using principles from optimal control theory, a temporal relationship between the distribution of controls and the severity of the infection is ascertained.
Our results indicate that implementation of controls such as proper hygiene, sanitation, and vaccination across both affected communities is likely to annihilate the infection within half the time it would take through self-limitation. In addition, although an infection may still break out in the presence of controls, it may be up to 8 times less devastating than when no controls are in place. J. B. H. Njagarah and F. Nyabadza Copyright © 2015 J. B. H. Njagarah and F. Nyabadza. All rights reserved. Automated Delineation of Vessel Wall and Thrombus Boundaries of Abdominal Aortic Aneurysms Using Multispectral MR Images Sun, 05 Jul 2015 07:31:50 +0000 A correct patient-specific identification of the abdominal aortic aneurysm is useful for both the diagnosis and treatment stages, as it locates the disease and represents its geometry. The actual thickness and shape of the arterial wall and the intraluminal thrombus are of great importance when predicting the rupture of abdominal aortic aneurysms. The authors describe a novel method for delineating both the internal and external contours of the aortic wall, which allows distinguishing between vessel wall and intraluminal thrombus. The method is based on an active shape model and texture statistical information. The method was validated with eight MR patient studies. There was high correspondence between automatic and manual measurements for the vessel wall area. The resulting segmented images presented a mean Dice coefficient with respect to manual segmentations of 0.88 and a mean modified Hausdorff distance of 1.14 mm for the internal face and 0.86 and 1.33 mm for the external face of the arterial wall. Preliminary results of the segmentation show high correspondence between automatic and manual measurements for the vessel wall and thrombus areas. However, since the dataset is small, the conclusions cannot be generalized. B. Rodriguez-Vila, J. Tarjuelo-Gutierrez, P. Sánchez-González, P. Verbrugghe, I.
Fourneau, G. Maleux, P. Herijgers, and E. J. Gomez Copyright © 2015 B. Rodriguez-Vila et al. All rights reserved. Dynamical Analysis of an SEIT Epidemic Model with Application to Ebola Virus Transmission in Guinea Thu, 02 Jul 2015 09:14:26 +0000 In order to investigate the transmission mechanism of individuals infected with Ebola virus, we establish an SEIT (susceptible, exposed in the latent period, infectious, and treated/recovery) epidemic model. The basic reproduction number is defined. The mathematical analysis of the existence and stability of the disease-free equilibrium and the endemic equilibrium is given. As an application of the model, we use the recognized infection and death cases in Guinea to estimate the parameters of the model by the least squares method. With suitable parameter values, we obtain the estimated value of the basic reproduction number and analyze its sensitivity and uncertainty by partial rank correlation coefficients. Zhiming Li, Zhidong Teng, Xiaomei Feng, Yingke Li, and Huiguo Zhang Copyright © 2015 Zhiming Li et al. All rights reserved. Undersampled MR Image Reconstruction with Data-Driven Tight Frame Wed, 24 Jun 2015 12:09:26 +0000 Undersampled magnetic resonance image reconstruction employing sparsity regularization has fascinated many researchers in recent years, supported by compressed sensing theory. Nevertheless, most existing sparsity-regularized reconstruction methods either lack the adaptability to capture structure information or suffer from a high computational load. With the aim of further improving image reconstruction accuracy without introducing too much computation, this paper proposes a data-driven tight frame magnetic resonance image reconstruction (DDTF-MRI) method. By taking advantage of the efficiency and effectiveness of data-driven tight frames, DDTF-MRI trains an adaptive tight frame to sparsify the to-be-reconstructed MR image.
Furthermore, a two-level Bregman iteration algorithm has been developed to solve the proposed model. The proposed method has been compared to two state-of-the-art methods on four datasets, and encouraging performance has been achieved by DDTF-MRI. Jianbo Liu, Shanshan Wang, Xi Peng, and Dong Liang Copyright © 2015 Jianbo Liu et al. All rights reserved. Advances in Computational Methods for Genetic Diseases Thu, 18 Jun 2015 12:39:23 +0000 Francesco Camastra, Roberto Amato, Maria Donata Di Taranto, and Antonino Staiano Copyright © 2015 Francesco Camastra et al. All rights reserved. Accelerated Compressed Sensing Based CT Image Reconstruction Thu, 18 Jun 2015 08:16:22 +0000 In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and the rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom, when reconstructed from 128 rebinned projections using a conventional CS method, had 10% error, whereas with the proposed method the reconstruction error was less than 1%.
Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization. SayedMasoud Hashemi, Soosan Beheshti, Patrick R. Gill, Narinder S. Paul, and Richard S. C. Cobbold Copyright © 2015 SayedMasoud Hashemi et al. All rights reserved. Towards Automated Three-Dimensional Tracking of Nephrons through Stacked Histological Image Sets Mon, 15 Jun 2015 14:03:06 +0000 An automated approach for tracking individual nephrons through three-dimensional histological image sets of mouse and rat kidneys is presented. In a previous study, nephrons were tracked manually through the available image sets in order to explore renal microarchitecture. The purpose of the current research is to reduce the time and effort required to manually trace nephrons by creating an automated, intelligent system as a standard tool for such datasets. The algorithm is robust enough to isolate closely packed nephrons and track their convoluted paths despite a number of nonideal, interfering conditions such as local image distortions, artefacts, and interstitial tissue interference. The system comprises image preprocessing, feature extraction, and a custom graph-based tracking algorithm, which is validated by a rule base and a machine learning algorithm. A study of a selection of automatically tracked nephrons, when compared with manual tracking, yields a 95% tracking accuracy for structures in the cortex, while those in the medulla have lower accuracy due to their narrower diameters and higher density. Limited manual intervention is introduced to improve tracking, enabling full nephron paths to be obtained with an average of 17 manual corrections per mouse nephron and 58 manual corrections per rat nephron. Charita Bhikha, Arne Andreasen, Erik I. Christensen, Robyn F. R. Letts, Adam Pantanowitz, David M. Rubin, Jesper S. Thomsen, and Xiao-Yue Zhai Copyright © 2015 Charita Bhikha et al. All rights reserved.
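The nephron-tracking abstract above describes a custom graph-based algorithm that links each nephron cross-section to its continuation in the next slice. The paper's actual rule base and machine-learning validation are not given in the abstract, but the core slice-to-slice linking step can be sketched as a greedy nearest-centroid search; the function name, the data layout, and the max_jump threshold below are illustrative assumptions, not the authors' implementation.

```python
import math

def track_through_slices(slices, start, max_jump=15.0):
    """Greedily link a structure's cross-section centroids through a
    stack of slices: in each slice, pick the candidate centroid nearest
    to the previous position, stopping if the jump exceeds max_jump.

    `slices` is a list of per-slice lists of (x, y) centroids, and
    `start` is the seed centroid in slice 0.  Returns the tracked path,
    one point per successfully linked slice.
    """
    path = [start]
    for candidates in slices[1:]:
        prev = path[-1]
        best = min(candidates, key=lambda c: math.dist(prev, c), default=None)
        if best is None or math.dist(prev, best) > max_jump:
            break  # track lost: this is where manual correction would step in
        path.append(best)
    return path
```

When the nearest candidate lies farther than the threshold, the track is abandoned rather than mislinked, mirroring the limited manual intervention the abstract describes for recovering full nephron paths.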
The Prioritization of Clinical Risk Factors of Obstructive Sleep Apnea Severity Using Fuzzy Analytic Hierarchy Process Mon, 15 Jun 2015 08:56:26 +0000 Recently, there has been a shortage of sleep laboratories able to accommodate patients in a timely manner. Delayed diagnosis and treatment may lead to worse outcomes, particularly in patients with severe obstructive sleep apnea (OSA). For this reason, polysomnography (PSG) queueing should be prioritized according to disease severity. To date, there have been conflicting data on whether clinical information can predict OSA severity. A total of 1,042 suspected OSA patients underwent a diagnostic PSG study at Siriraj Sleep Center during 2010-2011. A total of 113 variables were obtained from sleep questionnaires and anthropometric measurements. Nineteen groups of clinical risk factors, comprising 42 variables, were categorized by OSA severity. This study aimed to rank these factors by employing a Fuzzy Analytic Hierarchy Process approach based on the normalized weight vector. The results revealed that the top-ranked clinical risk factor in the Severe, Moderate, Mild, and No OSA groups was nighttime symptoms. The overall sensitivity/specificity of the approach for these groups was 92.32%/91.76%, 89.52%/88.18%, 91.08%/84.58%, and 96.49%/81.23%, respectively. We propose that urgent PSG appointments should be based on the clinical risk factors of the Severe OSA group. In addition, screening Mild from No OSA patients in a sleep center setting using symptoms during sleep is also recommended (sensitivity = 87.12% and specificity = 72.22%). Thaya Maranate, Adisak Pongpullponsak, and Pimon Ruttanaumpawan Copyright © 2015 Thaya Maranate et al. All rights reserved. A Forward Dynamic Modelling Investigation of Cause-and-Effect Relationships in Single Support Phase of Human Walking Sun, 14 Jun 2015 09:46:52 +0000 Mathematical gait models often fall into one of two categories: simple and complex.
There is a large leap in complexity between the two model types, meaning that the effects of individual gait mechanisms are overlooked. This study investigated the cause-and-effect relationships between gait mechanisms and the resulting kinematics and kinetics, using a sequence of mathematical models of increasing complexity. The focus was on the sagittal plane and single support only. Starting from an inverted pendulum (IP) model, extended to include a head-arms-trunk (HAT) segment and an actuated hip moment, further complexities were added one by one. These were a knee joint, an ankle joint with a static foot, heel rise, and finally a swing leg. The presence of a knee joint and an ankle moment (during foot flat) was shown to largely influence the initial peak in the vertical ground reaction force (GRF) curve. The second peak in this curve was achieved through a combination of heel rise and the presence of a swing leg. Heel rise was also shown to reduce errors in the horizontal GRF prediction in the second half of single support. The swing leg is important for centre-of-mass (CM) deceleration in late single support. These findings provide evidence for the specific effects of each gait mechanism. Michael McGrath, David Howard, and Richard Baker Copyright © 2015 Michael McGrath et al. All rights reserved. Reconstruction Accuracy Assessment of Surface and Underwater 3D Motion Analysis: A New Approach Sun, 14 Jun 2015 06:29:29 +0000 This study assessed the accuracy of surface and underwater 3D reconstruction of a calibration volume with and without homography. A calibration volume (6000 × 2000 × 2500 mm) with 236 markers (64 above-water and 88 underwater control points, sharing 8 common points at the water surface, and 92 validation points) was positioned in a 25 m swimming pool and recorded with two surface and four underwater cameras. Planar homography estimation for each calibration plane was computed to perform image rectification.
The direct linear transformation algorithm for 3D reconstruction was applied, using 1,600,000 different combinations of 32 and 44 points out of the 64 and 88 control points for surface and underwater markers, respectively. With homography, the Root Mean Square (RMS) error of control and validation points was lower than without it for both surface and underwater cameras (). With homography, RMS errors of control and validation points were similar between surface and underwater cameras (). Without homography, the RMS error of control points was greater for underwater than for surface cameras (), and the opposite was observed for validation points (). It is recommended that future studies using 3D reconstruction include homography to improve the accuracy of swimming movement analysis. Kelly de Jesus, Karla de Jesus, Pedro Figueiredo, João Paulo Vilas-Boas, Ricardo Jorge Fernandes, and Leandro José Machado Copyright © 2015 Kelly de Jesus et al. All rights reserved. Genetic Consequences of Antiviral Therapy on HIV-1 Wed, 10 Jun 2015 07:41:06 +0000 A variety of enzyme inhibitors have been developed to combat HIV-1; however, the fast evolutionary rate of this virus commonly leads to the emergence of resistance mutations that ultimately allow the mutant virus to survive. This review explores the main genetic consequences of HIV-1 molecular evolution during antiviral therapies, including viral genetic diversity and molecular adaptation. The role of recombination in the generation of drug resistance is also analyzed. In addition to the investigation and discussion of published works, an evolutionary analysis of protease-coding genes collected from patients before and after treatment with different protease inhibitors is included to validate previous studies. Finally, the review discusses the importance of considering the genetic consequences of antiviral therapies in models of HIV-1 evolution, which could improve current genotypic resistance testing and treatment design.
Miguel Arenas Copyright © 2015 Miguel Arenas. All rights reserved. Nanodosimetry-Based Plan Optimization for Particle Therapy Mon, 08 Jun 2015 06:00:50 +0000 Treatment planning for particle therapy is currently an active field of research due to uncertainty in how to modify the physical dose in order to create a uniform biological dose response in the target. A novel treatment plan optimization strategy based on measurable nanodosimetric quantities rather than biophysical models is proposed in this work. Simplified proton and carbon treatment plans were simulated in a water phantom to investigate the feasibility of the optimization. Track structures of the mixed radiation field produced at different depths in the target volume were simulated with Geant4-DNA, and nanodosimetric descriptors were calculated. The fluences of the treatment field pencil beams were optimized in order to create a mixed field with equal nanodosimetric descriptors at each of multiple positions in spread-out particle Bragg peaks. For both proton and carbon ion plans, a uniform spatial distribution of nanodosimetric descriptors could be obtained by optimizing opposing-field but not single-field plans. The results obtained indicate that uniform nanodosimetrically weighted plans, which may also be radiobiologically uniform, can be obtained with this approach. Future investigations need to demonstrate that this approach is also feasible for more complicated beam arrangements and that it leads to a biologically uniform response in tumor cells and tissues. Margherita Casiraghi and Reinhard W. Schulte Copyright © 2015 Margherita Casiraghi and Reinhard W. Schulte. All rights reserved.
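The fluence optimization described in the nanodosimetry abstract can be sketched as a small nonnegative least-squares problem: given a matrix of per-beam descriptor contributions at several depth positions (the numbers below are made up for illustration, not Geant4-DNA output), beam weights are adjusted by projected gradient descent until the summed descriptor is uniform across the target. This is a toy illustration of the idea, not the authors' optimizer.

```python
def optimize_fluences(contrib, target, steps=5000, lr=0.01):
    """Projected gradient descent for min ||A w - t||^2 subject to w >= 0.
    contrib: rows = depth positions, cols = pencil beams
             (descriptor contribution per unit fluence).
    target: desired nanodosimetric descriptor value at each depth."""
    n_pos, n_beam = len(contrib), len(contrib[0])
    w = [1.0] * n_beam
    for _ in range(steps):
        # residual r = A w - t at every depth position
        r = [sum(contrib[p][b] * w[b] for b in range(n_beam)) - target[p]
             for p in range(n_pos)]
        for b in range(n_beam):
            grad = 2.0 * sum(r[p] * contrib[p][b] for p in range(n_pos))
            w[b] = max(0.0, w[b] - lr * grad)  # project onto w >= 0
    return w

# Two opposing-field pencil beams, three depths in the spread-out Bragg peak.
# Contributions are mirrored, mimicking an opposing-field geometry.
A = [[1.0, 0.2],
     [0.6, 0.6],
     [0.2, 1.0]]
t = [1.0, 1.0, 1.0]  # uniform descriptor target across the target volume
w = optimize_fluences(A, t)
descriptors = [sum(A[p][b] * w[b] for b in range(2)) for p in range(3)]
print([round(d, 2) for d in descriptors])  # → [1.0, 1.0, 1.0]
```

The mirrored columns are what make a uniform solution reachable here; with a single column (one field), no nonnegative weight can flatten a monotonic depth profile, which echoes the abstract's finding that opposing-field but not single-field plans could be made uniform.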