The Scientific World Journal: Computer Science The latest articles from Hindawi Publishing Corporation © 2015, Hindawi Publishing Corporation. All rights reserved. Game Theory Based Trust Model for Cloud Environment Tue, 25 Aug 2015 12:44:30 +0000 The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theoretic approach for achieving trust at the bootload level from the perspectives of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restrains service providers and users from violating the service level agreement (SLA). Significantly, the cold start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of cloud users' applications to cloud service providers for segregating trust levels is achieved as part of the mapping process. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics, such as execution time, accuracy, error identification, and undecidability of the resources, were considered. K. Gokulnath and Rhymend Uthariaraj Copyright © 2015 K. Gokulnath and Rhymend Uthariaraj. All rights reserved. Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud Tue, 25 Aug 2015 08:53:37 +0000 Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet and facilitates third-party infrastructure and applications. While customers have no visibility into how their data is stored on the service provider's premises, cloud computing offers great benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data.
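To make the Nash-equilibrium idea in the trust model above concrete, here is a minimal sketch of pure-strategy equilibrium search in a provider-versus-user SLA game; the actions and payoff values are illustrative assumptions, not taken from the paper.

```python
# Hypothetical 2x2 game between a cloud provider and a user, each of whom
# can either honor or violate the SLA. Payoff values are assumed.
# payoffs[(provider_action, user_action)] = (provider_payoff, user_payoff)
payoffs = {
    ("honor",   "honor"):   (3, 3),   # mutual trust: best joint outcome
    ("honor",   "violate"): (0, 1),   # one side free-rides on the other
    ("violate", "honor"):   (1, 0),
    ("violate", "violate"): (1, 1),   # mutual distrust
}

def pure_nash_equilibria(payoffs):
    """Enumerate action pairs where neither side gains by deviating alone."""
    actions = ["honor", "violate"]
    equilibria = []
    for p in actions:
        for u in actions:
            p_pay, u_pay = payoffs[(p, u)]
            p_best = all(p_pay >= payoffs[(alt, u)][0] for alt in actions)
            u_best = all(u_pay >= payoffs[(p, alt)][1] for alt in actions)
            if p_best and u_best:
                equilibria.append((p, u))
    return equilibria

print(pure_nash_equilibria(payoffs))
# → [('honor', 'honor'), ('violate', 'violate')]
```

With these payoffs both mutual honoring and mutual violation are equilibria, which is exactly why a bootstrapping mechanism that steers first-time participants toward the trustworthy equilibrium is useful.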
The opportunity to use cloud services on a pay-per-use basis provides comfort for private data owners in managing costs and data. With the pervasive usage of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security. In the pursuit of increasing data utilization on public cloud storage, the key is to enable effective data access through several fuzzy searching techniques. In this paper, we discuss the existing fuzzy searching techniques and focus on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for multiple-keyword requests, which greatly reduces the searching time by learning the search path pattern for all the keywords in the fuzzy keyword set. The objective of using a BTree fuzzy searchable index is to resolve typos and representation inconsistencies and also to facilitate effective data utilization. Shyamala Devi Munisamy and Arun Chokkalingam Copyright © 2015 Shyamala Devi Munisamy and Arun Chokkalingam. All rights reserved. Recent Advances in General Game Playing Mon, 24 Aug 2015 11:47:34 +0000 The goal of General Game Playing (GGP) has been to develop computer programs that can perform well across various game types. It is natural for human game players to transfer knowledge from games they already know how to play to other similar games. GGP research attempts to design systems that work well across different game types, including unknown new games. In this review, we present a survey of recent advances (2011 to 2014) in GGP for both traditional games and video games. It is notable that research on GGP has been expanding into modern video games. Monte-Carlo Tree Search and its enhancements have been the most influential techniques in GGP for both research domains.
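Monte-Carlo Tree Search, mentioned above as the most influential GGP technique, typically uses the UCT rule in its selection phase: pick the child maximizing average reward plus an exploration bonus. A minimal sketch (the exploration constant `c` and the toy statistics are illustrative):

```python
import math

def uct_select(children, c=1.4):
    """Pick the child maximizing the UCT score (exploitation + exploration).
    `children` maps action -> (wins, visits); unvisited actions win outright."""
    total = sum(visits for _, visits in children.values())
    def score(stats):
        wins, visits = stats
        if visits == 0:
            return float("inf")          # always try unexplored moves first
        return wins / visits + c * math.sqrt(math.log(total) / visits)
    return max(children, key=lambda action: score(children[action]))

# A toy node: "a" is well explored and strong, "b" barely explored,
# "c" not yet visited.
node = {"a": (6, 10), "b": (1, 2), "c": (0, 0)}
print(uct_select(node))   # "c": unvisited moves are selected first
```

Once every move has been visited, the rule trades off win rate against visit count: in `{"a": (6, 10), "b": (1, 2)}` the under-explored `"b"` is selected despite its lower win rate.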
Additionally, international competitions have become important events that promote and increase GGP research. Recently, a video GGP competition was launched. In this survey, we review recent progress in the most challenging research areas of Artificial Intelligence (AI) related to universal game playing. Maciej Świechowski, HyunSoo Park, Jacek Mańdziuk, and Kyung-Joong Kim Copyright © 2015 Maciej Świechowski et al. All rights reserved. A Dynamic Intrusion Detection System Based on Multivariate Hotelling’s T2 Statistics Approach for Network Environments Tue, 18 Aug 2015 10:05:13 +0000 The ever-expanding communication requirements in today’s world demand extensive and efficient network systems with equally efficient and reliable security features integrated for safe, confident, and secure communication and data transfer. Providing effective security protocols for any network environment therefore assumes paramount importance. Attempts are continuously being made to design more efficient and dynamic network intrusion detection models. In this work, an approach based on Hotelling’s T2 method, a multivariate statistical analysis technique, has been employed for intrusion detection, especially in network environments. Components such as preprocessing, multivariate statistical analysis, and attack detection have been incorporated in developing the multivariate Hotelling’s T2 statistical model, and the necessary profiles have been generated based on the T-square distance metric. With a threshold range obtained using the central limit theorem, observed traffic profiles have been classified as either normal or attack types. The performance of the model, as evaluated through validation and testing using the KDD Cup’99 dataset, has shown very high detection rates for all classes with low false alarm rates. The accuracy of the model presented in this work has been found to be much better than that of the existing models.
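The Hotelling's T² scoring above can be sketched for two traffic features; the baseline data, feature choice, and fixed cutoff below are assumptions for illustration (the paper derives a threshold range from the central limit theorem instead).

```python
# Minimal 2-variable Hotelling's T^2 anomaly score against a baseline
# (normal-traffic) profile: T^2 = (x - mu)^T S^{-1} (x - mu).

def mean(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def covariance2(rows, mu):
    """Sample covariance of 2-D rows as a 2x2 matrix."""
    n = len(rows)
    sxx = sum((r[0] - mu[0]) ** 2 for r in rows) / (n - 1)
    syy = sum((r[1] - mu[1]) ** 2 for r in rows) / (n - 1)
    sxy = sum((r[0] - mu[0]) * (r[1] - mu[1]) for r in rows) / (n - 1)
    return [[sxx, sxy], [sxy, syy]]

def t_squared(x, mu, S):
    """T^2 distance of observation x from the baseline profile (mu, S)."""
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    inv = [[S[1][1] / det, -S[0][1] / det],
           [-S[1][0] / det, S[0][0] / det]]
    d = [x[0] - mu[0], x[1] - mu[1]]
    return (d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1])
            + d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))

# Baseline profile from "normal" traffic (e.g. packets/s, bytes/s — assumed).
normal = [[10, 100], [12, 110], [11, 105], [9, 95], [10, 102]]
mu = mean(normal)
S = covariance2(normal, mu)
print(t_squared([11, 104], mu, S))   # small score: looks normal
print(t_squared([50, 500], mu, S))   # huge score: flagged as attack
```

A profile is classified as an attack when its T² score exceeds the chosen threshold.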
Aneetha Avalappampatty Sivasamy and Bose Sundan Copyright © 2015 Aneetha Avalappampatty Sivasamy and Bose Sundan. All rights reserved. Prediction of Domain Behavior through Dynamic Well-Being Domain Model Analysis Mon, 17 Aug 2015 12:00:57 +0000 As the concept of context-awareness becomes more popular, the demand for improved quality of context-aware systems increases too. Due to the inherent challenges posed by context-awareness, it is harder to predict what the behavior of such systems and their context will be once provided to the end user than it is for non-context-aware systems. A domain where such upfront knowledge is highly important is that of well-being. In this paper, we introduce a method to model the well-being domain and to predict the effects a system will have on its context when implemented. This analysis can be performed at design time. Using these predictions, the design can be fine-tuned to increase the chance that systems will have the desired effect. The method has been tested using three existing well-being applications. For these applications, domain models were created in the Dynamic Well-being Domain Model language. This language allows for causal reasoning over the application domain. The models created were used to perform the analysis and behavior prediction. The analysis results were compared to existing application end-user evaluation studies. The results showed that our analysis could accurately predict success and possible problems in the focus of the systems, although certain limitations regarding the predictions should be taken into consideration. Steven Bosems and Marten van Sinderen Copyright © 2015 Steven Bosems and Marten van Sinderen. All rights reserved.
Improved Secret Image Sharing Scheme in Embedding Capacity without Underflow and Overflow Mon, 17 Aug 2015 06:59:40 +0000 Computational secret image sharing (CSIS) is an effective way to protect a secret image during its transmission and storage, and thus it has attracted considerable attention since its appearance. Nowadays, improving the embedding capacity and eliminating the underflow and overflow situations, which are troublesome and difficult to deal with, have become hot topics for researchers. The scheme with the highest embedding capacity among the existing schemes suffers from the underflow and overflow problems. Although the underflow and overflow situations have been dealt with by different methods, the embedding capacities of these methods are more or less reduced. Motivated by these concerns, we propose a novel scheme in which differential coding, Huffman coding, and data conversion are used to compress the secret image before embedding it, to further improve the embedding capacity, and the pixel mapping matrix embedding method with a newly designed matrix is used to embed the secret image data into the cover image to avoid the underflow and overflow situations. Experimental results show that our scheme improves the embedding capacity and eliminates the underflow and overflow situations at the same time. Liaojun Pang, Deyu Miao, Huixian Li, and Qiong Wang Copyright © 2015 Liaojun Pang et al. All rights reserved. A Customizable Quantum-Dot Cellular Automata Building Block for the Synthesis of Classical and Reversible Circuits Sun, 09 Aug 2015 13:48:28 +0000 Quantum-dot cellular automata (QCA) are nanoscale digital logic constructs that use electrons in arrays of quantum dots to carry out binary operations. In this paper, a basic building block for QCA is proposed.
The proposed basic building block can be customized to implement classical gates, such as XOR and XNOR gates, and reversible gates, such as CNOT and Toffoli gates, with a lower cell count and/or better latency than other proposed designs. Ahmed Moustafa, Ahmed Younes, and Yasser F. Hassan Copyright © 2015 Ahmed Moustafa et al. All rights reserved. Computer Intelligence in Modeling, Prediction, and Analysis of Complex Dynamical Systems Wed, 05 Aug 2015 08:23:29 +0000 Ivan Zelinka, Ajith Abraham, Otto Rossler, Mohammed Chadli, and Rene Lozi Copyright © 2015 Ivan Zelinka et al. All rights reserved. Pattern Recognition Methods and Features Selection for Speech Emotion Recognition System Tue, 04 Aug 2015 11:32:17 +0000 The impact of the classification method and feature selection on speech emotion recognition accuracy is discussed in this paper. Selecting the correct parameters in combination with the classifier is an important part of reducing the computational complexity of the system. This step is especially necessary for systems that will be deployed in real-time applications. The motivation for the development and improvement of speech emotion recognition systems is their wide applicability in today’s automatic voice-controlled systems. The Berlin database of emotional recordings was used in this experiment. The classification accuracies of artificial neural networks, k-nearest neighbours, and Gaussian mixture models are measured considering the selection of prosodic, spectral, and voice quality features. The purpose was to find an optimal combination of methods and group of features for stress detection in human speech. The research contribution lies in the design of an accurate and efficient speech emotion recognition system. Pavol Partila, Miroslav Voznak, and Jaromir Tovarek Copyright © 2015 Pavol Partila et al. All rights reserved.
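Of the three classifiers compared in the speech emotion study above, k-nearest neighbours is the simplest to sketch; the 2-D feature vectors and labels below are hypothetical stand-ins for real prosodic and spectral features.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify `query` by majority label among its k nearest training points.
    `train` is a list of (feature_vector, label) pairs."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D features (imagine normalized mean pitch and energy) with assumed
# emotion labels — not real Berlin-database values.
train = [([1.0, 0.9], "stress"), ([1.1, 1.0], "stress"), ([0.9, 1.1], "stress"),
         ([0.1, 0.2], "neutral"), ([0.2, 0.1], "neutral"), ([0.0, 0.3], "neutral")]
print(knn_classify(train, [1.0, 1.0]))   # "stress"
print(knn_classify(train, [0.1, 0.1]))   # "neutral"
```

The feature-selection step the abstract emphasises decides which coordinates go into those vectors, which matters far more for accuracy than the classifier's own parameters.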
Time Evolution of Initial Errors in Lorenz’s 05 Chaotic Model Tue, 04 Aug 2015 11:24:10 +0000 Initial errors in weather prediction grow in time and, as they become larger, their growth slows down and then stops at an asymptotic value. The time of reaching this saturation point represents the limit of predictability. This paper studies the asymptotic values and time limits in a chaotic atmospheric model for five initial errors, using the ensemble prediction method (model data) as well as error approximation by the quadratic and logarithmic hypotheses and their modifications. We show that the modified hypotheses approximate the model’s time limits better, but not without serious disadvantages. We demonstrate how the hypotheses can be further improved to achieve a better match of time limits with the model. We also show that the quadratic hypothesis approximates the model’s asymptotic value best and that, after improvement, it also approximates the model’s time limits better for almost all initial errors and time lengths. Hynek Bednář, Aleš Raidl, and Jiří Mikšovský Copyright © 2015 Hynek Bednář et al. All rights reserved. Advanced Approach of Multiagent Based Buoy Communication Tue, 04 Aug 2015 11:23:35 +0000 Usually, a hydrometeorological information system is faced with great data flows, but the data levels are often excessive, depending on the observed region of the water. The paper presents advanced buoy communication technologies based on multiagent interaction and data exchange between several monitoring system nodes. The proposed management of buoy communication is based on a clustering algorithm, which enables the performance of the hydrometeorological information system to be enhanced. The experiment is based on the design and analysis of an inexpensive but reliable Baltic Sea autonomous monitoring network (buoys), which would be able to continuously monitor and collect temperature, waviness, and other required data.
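The saturation behaviour described in the Lorenz error-growth study further above (growth that slows and stops at an asymptotic value) is what the quadratic hypothesis dE/dt = aE − bE² captures: the solution levels off at E = a/b. A sketch with illustrative coefficients, not fitted values from the paper:

```python
# Quadratic error-growth hypothesis: dE/dt = a*E - b*E**2.
# Small errors grow roughly exponentially, then saturate at a/b,
# marking the limit of predictability.

def grow(e0, a, b, dt=0.01, steps=20000):
    """Euler-integrate the quadratic growth law from initial error e0."""
    e = e0
    history = [e]
    for _ in range(steps):
        e += dt * (a * e - b * e * e)
        history.append(e)
    return history

a, b = 0.4, 0.1           # illustrative: asymptotic (saturation) error a/b = 4.0
traj = grow(e0=0.001, a=a, b=b)
print(traj[-1])            # ≈ 4.0: the error has saturated
```

The "time limit" studied in the paper corresponds to how long the trajectory takes to approach that plateau, which depends strongly on the initial error `e0`.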
The proposed approach of multiagent based buoy communication enables all the data from the coastal-based station to be monitored with limited transmission speed by setting different tasks for the agent-based buoy system according to the clustering information. Gediminas Gricius, Darius Drungilas, Arunas Andziulis, Dale Dzemydiene, Miroslav Voznak, Mindaugas Kurmis, and Sergej Jakovlev Copyright © 2015 Gediminas Gricius et al. All rights reserved. Nonlinear versus Ordinary Adaptive Control of Continuous Stirred-Tank Reactor Tue, 04 Aug 2015 11:13:21 +0000 Unfortunately, the majority of systems in industry exhibit nonlinear behavior, and control of such processes with conventional fixed-parameter control approaches causes problems and leads to suboptimal or unstable control results. Adaptive control is one way to cope with the nonlinearity of a system. This contribution compares classic adaptive control and its modification with a Wiener system. This configuration divides the nonlinear controller into a dynamic linear part and a static nonlinear part. The dynamic linear part is constructed with the use of polynomial synthesis together with the pole-placement method and spectral factorization. The static nonlinear part uses static analysis of the controlled plant to introduce a mathematical nonlinear description of the relation between the controlled output and the change of the control input. The proposed controller is tested by simulations on the mathematical model of a continuous stirred-tank reactor with cooling in the jacket as a typical nonlinear system. Jiri Vojtesek and Petr Dostal Copyright © 2015 Jiri Vojtesek and Petr Dostal. All rights reserved. ASM Based Synthesis of Handwritten Arabic Text Pages Thu, 30 Jul 2015 15:57:49 +0000 Document analysis tasks, such as text recognition, word spotting, or segmentation, are highly dependent on comprehensive and suitable databases for training and validation.
However, their generation is expensive in terms of labor and time. As a matter of fact, there is a lack of such databases, which complicates research and development. This is especially true for Arabic handwriting recognition, which involves different preprocessing, segmentation, and recognition methods that have individual demands on samples and ground truth. To bypass this problem, we present an efficient system that automatically turns Arabic Unicode text into synthetic images of handwritten documents with detailed ground truth. Active Shape Models (ASMs) based on 28046 online samples were used for character synthesis, and statistical properties were extracted from the IESK-arDB database to simulate baselines and word slant or skew. In the synthesis step, ASM-based representations are composed into words and text pages, smoothed by B-Spline interpolation, and rendered considering writing speed and pen characteristics. Finally, we use the synthetic data to validate a segmentation method. An experimental comparison with the IESK-arDB database encourages training and testing document analysis methods on synthetic samples whenever insufficient natural ground-truthed data is available. Laslo Dinges, Ayoub Al-Hamadi, Moftah Elzobi, Sherif El-etriby, and Ahmed Ghoneim Copyright © 2015 Laslo Dinges et al. All rights reserved. Fusion of Heterogeneous Intrusion Detection Systems for Network Attack Detection Wed, 29 Jul 2015 16:08:59 +0000 An intrusion detection system (IDS) helps to identify different types of attacks in general, and the detection rate will be higher for some specific categories of attacks. This paper is designed on the idea that each IDS is efficient in detecting a specific type of attack. In the proposed Multiple IDS Unit (MIU), there are five IDS units, and each IDS follows a unique algorithm to detect attacks. Feature selection is done with the help of a genetic algorithm.
The selected features of the input traffic are passed on to the MIU for processing. The decision from each IDS is termed a local decision. The fusion unit inside the MIU processes all the local decisions with the help of the majority voting rule and makes the final decision. The proposed system shows a very good improvement in detection rate and reduces the false alarm rate. Jayakumar Kaliappan, Revathi Thiagarajan, and Karpagam Sundararajan Copyright © 2015 Jayakumar Kaliappan et al. All rights reserved. An Adaptive Multilevel Security Framework for the Data Stored in Cloud Environment Thu, 16 Jul 2015 10:35:10 +0000 Cloud computing is renowned for delivering information technology services over the internet. Nowadays, organizations are interested in moving their massive data and computations into the cloud to reap the significant benefits of on-demand service, resource pooling, and rapid elasticity that help to satisfy dynamically changing infrastructure demands without the burden of owning, managing, and maintaining the infrastructure. Since the data needs to be secured throughout its life cycle, security of the data in the cloud is a major challenge to concentrate on because the data is on a third party’s premises. Any uniform simple or high-level security method for all the data either compromises the sensitive data or proves to be too costly with increased overhead. Any common multiple-method scheme for all data becomes vulnerable once the common security pattern is identified in the event of a successful attack on any piece of information, and it also encourages more attacks on all the other data. This paper suggests an adaptive multilevel security framework based on cryptographic techniques that provides adequate security for classified data stored in the cloud. The proposed security system adapts well to the cloud environment and is also customizable and more resilient, meeting the required level of security for data of different sensitivities, which changes with business needs and commercial conditions.
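The majority-voting fusion used by the MIU above can be sketched as follows; the five IDS names and the conservative tie-break are assumptions for illustration, not details from the paper.

```python
def fuse(local_decisions):
    """Final verdict by majority vote over local IDS decisions (True = attack).
    Ties are resolved conservatively as 'attack' — an assumed policy."""
    attacks = sum(1 for d in local_decisions if d)
    return attacks * 2 >= len(local_decisions)

# Five IDS units, each specialised for one attack category (labels assumed).
decisions = {"dos_ids": True, "probe_ids": True, "r2l_ids": False,
             "u2r_ids": True, "anomaly_ids": False}
print("attack" if fuse(decisions.values()) else "normal")   # "attack" (3 of 5)
```

Because each unit is tuned to one attack category, the vote lets a strong signal from the relevant specialist outvote the indifferent units, which is where the detection-rate improvement comes from.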
Sudha Devi Dorairaj and Thilagavathy Kaliannan Copyright © 2015 Sudha Devi Dorairaj and Thilagavathy Kaliannan. All rights reserved. Theory and Application on Rough Set, Fuzzy Logic, and Granular Computing Thu, 16 Jul 2015 07:49:58 +0000 Xibei Yang, Weihua Xu, and Yanhong She Copyright © 2015 Xibei Yang et al. All rights reserved. Detecting and Preventing Sybil Attacks in Wireless Sensor Networks Using Message Authentication and Passing Method Sun, 05 Jul 2015 10:26:14 +0000 Security is highly indispensable for protecting wireless sensor networks. Highly critical attacks of various kinds have been documented in wireless sensor networks by many researchers. The Sybil attack is a massively destructive attack against the sensor network in which numerous forged identities alongside genuine ones are used to gain illegal entry into the network. Discerning the Sybil, sinkhole, and wormhole attacks while multicasting is a tremendous task in a wireless sensor network. Basically, a Sybil attack means a node which fakes its identity to other nodes. Communication with an illegal node results in data loss and becomes dangerous to the network. The existing Random Password Comparison method merely verifies node identities by analyzing the neighbors. A survey was done on the Sybil attack with the objective of resolving this problem. The survey proposed a combined CAM-PVM (compare and match-position verification method) with MAP (message authentication and passing) for detecting, eliminating, and eventually preventing the entry of Sybil nodes into the network. We propose a scheme of assuring security for wireless sensor networks, to deal with attacks of these kinds in unicasting and multicasting. Udaya Suriya Raj Kumar Dhamodharan and Rajamani Vayanaperumal Copyright © 2015 Udaya Suriya Raj Kumar Dhamodharan and Rajamani Vayanaperumal. All rights reserved.
A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence Wed, 01 Jul 2015 12:30:42 +0000 Nowadays, to enrich e-business, websites are personalized for each user by understanding their interests and behavior. The main challenges of online usage data are information overload and its dynamic nature. In this paper, to address these issues, a WebBluegillRecom-annealing dynamic recommender system is proposed that uses web usage mining techniques in tandem with software agents to provide dynamic recommendations to users that can be used for customizing a website. The proposed WebBluegillRecom-annealing dynamic recommender uses swarm intelligence derived from the foraging behavior of the bluegill fish. It overcomes information overload by handling the dynamic behaviors of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, measure, and scalability than the traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in the recommendations. Anna Alphy and S. Prabakaran Copyright © 2015 Anna Alphy and S. Prabakaran. All rights reserved. ECG Prediction Based on Classification via Neural Networks and Linguistic Fuzzy Logic Forecaster Mon, 29 Jun 2015 08:46:57 +0000 The paper deals with ECG prediction based on neural network classification of different types of time courses of ECG signals. The main objective is to recognise normal cycles and arrhythmias and perform further diagnosis. We propose two detection systems created with the use of neural networks. The experimental part makes it possible to load ECG signals, preprocess them, and classify them into given classes. The outputs from the classifiers carry a predictive character.
All experimental results from both of the proposed classifiers are mutually compared in the conclusion. We also experimented with a new method of transparent time series prediction based on the fuzzy transform with linguistic IF-THEN rules. Preliminary experiments show interesting results based on the unique capability of this approach to bring a natural-language interpretation of a particular prediction, that is, of the properties of the time series. Eva Volna, Martin Kotyrba, and Hashim Habiballa Copyright © 2015 Eva Volna et al. All rights reserved. H.264 SVC Complexity Reduction Based on Likelihood Mode Decision Sun, 28 Jun 2015 07:11:03 +0000 H.264 Advanced Video Coding (AVC) was extended to Scalable Video Coding (SVC). SVC runs on different electronic devices such as personal computers, HDTV, SDTV, IPTV, and full-HDTV, on which users demand various scalings of the same content. The scaling dimensions are resolution, frame rate, quality, heterogeneous networks, bandwidth, and so forth. Scaling consumes more encoding time and computational complexity during mode selection. In this paper, to reduce encoding time and computational complexity, a fast mode decision algorithm based on likelihood mode decision (LMD) is proposed. LMD is evaluated in both temporal and spatial scaling. From the results, we conclude that LMD performs well when compared to previous fast mode decision algorithms. The comparison parameters are time, PSNR, and bit rate. LMD achieves a time saving of 66.65% with a 0.05% detriment in PSNR and a 0.17% increment in bit rate compared with the full search method. L. Balaji and K. K. Thyagharajan Copyright © 2015 L. Balaji and K. K. Thyagharajan. All rights reserved. Fuzzy Number Addition with the Application of Horizontal Membership Functions Tue, 23 Jun 2015 12:27:33 +0000 The paper presents addition of fuzzy numbers realised with the application of multidimensional RDM arithmetic and horizontal membership functions (MFs).
Fuzzy arithmetic (FA) is a very difficult task because operations should be performed here on multidimensional information granules. Instead, many FA methods use α-cuts in connection with 1-dimensional classical interval arithmetic, which operates not on multidimensional granules but on 1-dimensional intervals. Such an approach causes difficulties in calculations and is a source of arithmetical paradoxes. The multidimensional approach allows for removing the drawbacks and weaknesses of FA. This is possible thanks to the application of horizontal membership functions, which considerably facilitate calculations because uncertain values can now be inserted directly into equations without using the extension principle. The paper shows how the addition operation can be realised on independent fuzzy numbers and on partly or fully dependent fuzzy numbers, taking into account the order relation, and how to solve equations, which can be a difficult task for 1-dimensional FAs. Andrzej Piegat and Marcin Pluciński Copyright © 2015 Andrzej Piegat and Marcin Pluciński. All rights reserved. A Hybrid Swarm Intelligence Algorithm for Intrusion Detection Using Significant Features Mon, 22 Jun 2015 08:47:53 +0000 Intrusion detection has become a main part of network security due to the huge number of attacks affecting computers. This is due to the extensive growth of internet connectivity and accessibility to information systems worldwide. To deal with this problem, in this paper a hybrid algorithm is proposed that integrates a Modified Artificial Bee Colony (MABC) with Enhanced Particle Swarm Optimization (EPSO) for the intrusion detection problem. The algorithms are combined to obtain better optimization results, and the classification accuracies are obtained by the 10-fold cross-validation method.
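The horizontal-MF idea in the fuzzy-addition paper further above can be sketched for triangular fuzzy numbers: each α-cut is parameterised by an RDM variable λ ∈ [0, 1], so uncertain values enter equations directly instead of via the extension principle. The grid sampling below is a crude numerical stand-in for symbolic RDM analysis:

```python
def horizontal(tri, alpha, lam):
    """Horizontal MF of a triangular number tri = (a, b, c):
    x(alpha, lam) = left(alpha) + lam * (right(alpha) - left(alpha)),
    where lam in [0, 1] is the RDM variable."""
    a, b, c = tri
    left = a + (b - a) * alpha
    right = c - (c - b) * alpha
    return left + lam * (right - left)

def add_alpha_cut(x, y, alpha, dependent=False):
    """alpha-cut endpoints of x + y. Independent numbers get separate RDM
    variables; fully dependent ones share a single lam."""
    grid = [i / 100 for i in range(101)]
    if dependent:
        vals = [horizontal(x, alpha, l) + horizontal(y, alpha, l) for l in grid]
    else:
        vals = [horizontal(x, alpha, l1) + horizontal(y, alpha, l2)
                for l1 in grid for l2 in grid]
    return min(vals), max(vals)

x, y = (1, 2, 3), (2, 3, 4)
print(add_alpha_cut(x, y, alpha=0.0))   # (3.0, 7.0): widest cut
print(add_alpha_cut(x, y, alpha=1.0))   # (5.0, 5.0): the cores add up
```

The `dependent` flag is where the multidimensional view pays off: for operations like x − x, tying the RDM variables together yields exactly 0 instead of the paradoxically wide interval that 1-dimensional α-cut arithmetic produces.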
The purpose of this paper is to select the most relevant features that can represent the pattern of the network traffic and to test their effect on the success of the proposed hybrid classification algorithm. To investigate the performance of the proposed method, the KDDCup’99 intrusion detection benchmark dataset from the UCI Machine Learning repository is used. The performance of the proposed method is compared with that of other machine learning algorithms and is found to be significantly different. P. Amudha, S. Karthik, and S. Sivakumari Copyright © 2015 P. Amudha et al. All rights reserved. Optimization of Processing Parameters in ECM of Die Tool Steel Using Nanofluid by Multiobjective Genetic Algorithm Thu, 18 Jun 2015 11:29:37 +0000 The formation of spikes prevents achieving a better material removal rate (MRR) and surface finish when using plain NaNO3 aqueous electrolyte in the electrochemical machining (ECM) of die tool steel. Hence this research work attempts to minimize the formation of spikes in the selected workpiece of high-carbon high-chromium die tool steel using copper nanoparticles suspended in NaNO3 aqueous electrolyte, that is, a nanofluid. The selected influencing parameters are applied voltage and electrolyte discharge rate, with three levels each, and tool feed rate, with four levels. Thirty-six experiments were designed using Design Expert 7.0 software, and optimization was done using a multiobjective genetic algorithm (MOGA). This tool identified the best possible combination for achieving better MRR and surface roughness. The results reveal that a voltage of 18 V, a tool feed rate of 0.54 mm/min, and a nanofluid discharge rate of 12 lit/min would be the optimum values in the ECM of HCHCr die tool steel.
For checking the optimality obtained from the MOGA in MATLAB software, the maximum MRR of 375.78277 mm3/min and the respective surface roughness Ra of 2.339779 μm were predicted at an applied voltage of 17.688986 V, a tool feed rate of 0.5399705 mm/min, and a nanofluid discharge rate of 11.998816 lit/min. Confirmatory tests showed that the actual performance at the optimum conditions was 361.214 mm3/min and 2.41 μm; the deviation from the predicted performance is less than 4%, which proves the composite desirability of the developed models. V. Sathiyamoorthy, T. Sekar, and N. Elango Copyright © 2015 V. Sathiyamoorthy et al. All rights reserved. Erratum to “N-Screen Aware Multicriteria Hybrid Recommender System Using Weight Based Subspace Clustering” Wed, 17 Jun 2015 07:20:09 +0000 Farman Ullah, Ghulam Sarwar, and Sungchang Lee Copyright © 2015 Farman Ullah et al. All rights reserved. Usage of Probabilistic and General Regression Neural Network for Early Detection and Prevention of Oral Cancer Mon, 15 Jun 2015 13:47:42 +0000 In India, oral cancers usually present at an advanced stage of malignancy. It is critical to ascertain the diagnosis in order to initiate the most advantageous treatment of the suspicious lesions. The main hurdle in the appropriate treatment and control of oral cancer is the identification and risk assessment of early disease in the community in a cost-effective fashion. The objective of this research is to design a data mining model using probabilistic and general regression neural networks (PNN/GRNN) for early detection and prevention of oral malignancy. The model is built using an oral cancer database which has 35 attributes and 1025 records. All the attributes pertaining to clinical symptoms and history are considered to classify malignant and nonmalignant cases. Subsequently, the model attempts to predict the particular type of cancer and its stage and extent with the help of attributes pertaining to symptoms, gross examination, and investigations.
Also, the model anticipates the survivability of a patient on the basis of treatment and follow-up details. Finally, the performance of the PNN/GRNN model is compared with that of other classification models. The classification accuracy of the PNN/GRNN model is 80%, and hence it is better suited for early detection and prevention of oral cancer. Neha Sharma and Hari Om Copyright © 2015 Neha Sharma and Hari Om. All rights reserved. FoodWiki: Ontology-Driven Mobile Safe Food Consumption System Mon, 15 Jun 2015 13:44:43 +0000 An ontology-driven safe food consumption mobile system is considered. Over 3,000 compounds are being added to processed food, with numerous effects on the food: to add color, stabilize, texturize, preserve, sweeten, thicken, add flavor, soften, emulsify, and so forth. According to the World Health Organization, governments have lately focused on legislation to reduce such ingredients or compounds in manufactured foods, as they may have side effects causing health risks such as heart disease, cancer, diabetes, allergies, and obesity. By supervising what and how much to eat, as well as what not to eat, we can maximize a patient’s quality of life through avoidance of unhealthy ingredients. Smart e-health systems with powerful knowledge bases can provide suggestions of appropriate foods to individuals. Next-generation smart knowledge-base systems will not only include traditional syntax-based search, which limits the utility of the search results, but will also provide semantics for rich searching. In this paper, the concept matching of food ingredients is semantic-based, meaning that the system runs its own semantic rule set to infer meaningful results through the proposed Ontology-Driven Mobile Safe Food Consumption System (FoodWiki). Duygu Çelik Copyright © 2015 Duygu Çelik. All rights reserved.
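A GRNN, one half of the PNN/GRNN model in the oral cancer study above, is essentially a kernel-weighted average of training targets. A minimal sketch with toy data — the features, survivability targets, and smoothing parameter are illustrative, not values from the 1025-record database:

```python
import math

def grnn_predict(samples, x, sigma=0.3):
    """General regression neural network estimate: a Gaussian-kernel-weighted
    average of training targets, weight = exp(-d^2 / (2 * sigma^2))."""
    num = den = 0.0
    for xi, yi in samples:
        d2 = sum((a - b) ** 2 for a, b in zip(xi, x))
        w = math.exp(-d2 / (2 * sigma ** 2))
        num += w * yi
        den += w
    return num / den

# Toy training pairs: 2-D features -> assumed survivability score in [0, 1].
samples = [([0.0, 0.0], 0.0), ([1.0, 1.0], 1.0), ([1.0, 0.0], 0.5)]
print(grnn_predict(samples, [0.95, 0.95]))   # close to 1.0: nearest sample dominates
```

Unlike an iteratively trained network, a GRNN has no weight-fitting phase — every training record simply becomes a kernel centre, with `sigma` the only parameter to tune.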
Hybrid Modified K-Means with C4.5 for Intrusion Detection Systems in Multiagent Systems Mon, 15 Jun 2015 11:54:40 +0000 Presently, the processing time and performance of intrusion detection systems are of great importance due to the increased speed of traffic data networks and a growing number of attacks on networks and computers. Several approaches have been proposed to address this issue, including hybridization of several algorithms. This paper proposes a hybrid of modified k-means with C4.5 for an intrusion detection system in a multiagent system (MAS-IDS). The MAS-IDS consists of three agents, namely, the coordinator, analysis, and communication agents. The basic concept underpinning the utilized MAS is dividing the large captured network dataset into a number of subsets and distributing these to a number of agents depending on the data network size and core CPU availability. The KDD Cup 1999 dataset is used for evaluation. The proposed hybrid modified k-means with C4.5 classification in MAS is developed on the JADE platform. The results show that, compared to current methods, the MAS-IDS reduces the IDS processing time by up to 70% while improving the detection accuracy. Wathiq Laftah Al-Yaseen, Zulaiha Ali Othman, and Mohd Zakree Ahmad Nazri Copyright © 2015 Wathiq Laftah Al-Yaseen et al. All rights reserved. Personal Authentication Using Multifeatures Multispectral Palm Print Traits Sun, 14 Jun 2015 12:48:39 +0000 Biometric authentication is an effective method for automatically recognizing a person’s identity with high confidence. The multispectral palm print biometric system is a relatively new biometric technology that is being continually refined and developed. It is a promising technology for use in various applications including banking solutions, access control, hospitals, construction, and forensic applications.
This paper proposes a multispectral palm print recognition method that extracts multiple features using kernel principal component analysis and a modified finite Radon transform. Finally, the images are classified using the Local Mean K-Nearest Centroid Neighbor algorithm. The proposed method efficiently accommodates rotational and translational changes as well as potential deformations by encoding orientation-conserving features. The proposed system analyses hand vascular authentication using two databases acquired with touch-based and contactless imaging setups: the multispectral PolyU palm print database and the CASIA database. The experimental results clearly demonstrate that the proposed multispectral palm print authentication obtains better results than the other methods discussed in the literature. Gayathri Rajagopal and Senthil Kumar Manoharan Copyright © 2015 Gayathri Rajagopal and Senthil Kumar Manoharan. All rights reserved. Effective Filtering of Query Results on Updated User Behavioral Profiles in Web Mining Wed, 10 Jun 2015 13:38:13 +0000 The web, with its tremendous volume of information, retrieves results for user-related queries. Despite the rapid growth of web page recommendation, results retrieved using data mining techniques have not offered a high filtering rate, because the relationships between user profiles and queries were not analyzed extensively. At the same time, existing user-profile-based prediction in web data mining is not exhaustive in producing personalized results. To improve the query result rate under the dynamics of user behavior over time, the Hamilton Filtered Regime Switching User Query Probability (HFRS-UQP) framework is proposed. The HFRS-UQP framework is split into two processes, in which filtering and switching are carried out.
The data mining based filtering in this work uses the Hamilton filtering framework to filter user results based on personalized information from automatically updated profiles through the search engine. A maximized result set is fetched, that is, filtered with respect to user behavior profiles. The switching process accurately filters updated profiles using regime switching. The regime updating on profile changes (i.e., switches) in the HFRS-UQP framework identifies the second- and higher-order associations of query results with the updated profiles. Experiments are conducted on factors such as personalized information search retrieval rate, filtering efficiency, and precision ratio. S. Sadesh and R. C. Suganthe Copyright © 2015 S. Sadesh and R. C. Suganthe. All rights reserved. Brain Computer Interface on Track to Home Mon, 08 Jun 2015 09:08:54 +0000 The novel BackHome system offers individuals with disabilities a range of useful services available via brain-computer interfaces (BCIs) to help restore their independence. This is the first time such technology is ready to be deployed in the real world, that is, at the target end users’ home. This has been achieved through the development of practical electrodes, easy-to-use software, and telemonitoring and home support capabilities, which have been conceived, implemented, and tested within a user-centred design approach. The final BackHome system is the result of a 3-year-long process involving extensive user engagement to maximize the effectiveness, reliability, robustness, and ease of use of a home-based BCI system. The system comprises ergonomic and hassle-free BCI equipment; one-click software services for Smart Home control, cognitive stimulation, and web browsing; and remote telemonitoring and home support tools to enable independent home use by nonexpert caregivers and users.
BackHome aims to successfully bring BCIs to the homes of people with limited mobility to restore their independence and ultimately improve their quality of life. Felip Miralles, Eloisa Vargiu, Stefan Dauwalder, Marc Solà, Gernot Müller-Putz, Selina C. Wriessnegger, Andreas Pinegger, Andrea Kübler, Sebastian Halder, Ivo Käthner, Suzanne Martin, Jean Daly, Elaine Armstrong, Christoph Guger, Christoph Hintermüller, and Hannah Lowish Copyright © 2015 Felip Miralles et al. All rights reserved. Energy Efficient Link Aware Routing with Power Control in Wireless Ad Hoc Networks Mon, 08 Jun 2015 08:34:33 +0000 In wireless ad hoc networks, traditional routing protocols make route selections based on the minimum distance between nodes and the minimum hop count. Most routing decisions do not consider the condition of the network, such as link quality and the residual energy of the nodes. Also, when a link failure occurs, a route discovery mechanism is initiated, which incurs high routing overhead. If the broadcast nature and the spatial diversity of wireless communication are utilized efficiently, the performance of wireless networks can be improved. In contrast to traditional routing schemes, which use a predetermined route for packet transmission, an opportunistic routing scheme defines a forwarding candidate list formed using a single network metric. In this paper, a protocol is proposed which uses multiple metrics, namely residual energy and link quality, for route selection and also includes a monitoring mechanism which initiates a route discovery for a poor link, thereby reducing the overhead involved and improving the throughput of the network while maintaining network connectivity. Power control is also implemented, not only to save energy but also to improve the network performance.
Using simulations, we show the performance improvement attained in the network in terms of packet delivery ratio, routing overhead, and residual energy of the network. Jeevaa Katiravan, D. Sylvia, and D. Srinivasa Rao Copyright © 2015 Jeevaa Katiravan et al. All rights reserved. Development of a Comprehensive Database System for Safety Analyst Mon, 08 Jun 2015 07:44:12 +0000 This study addressed barriers associated with the use of Safety Analyst, a state-of-the-art tool that has been developed to assist during the entire Traffic Safety Management process but that is not widely used due to a number of challenges as described in this paper. As part of this study, a comprehensive database system and tools to provide data to multiple traffic safety applications, with a focus on Safety Analyst, were developed. A number of data management tools were developed to extract, collect, transform, integrate, and load the data. The system includes consistency-checking capabilities to ensure the adequate insertion and update of data into the database. This system focused on data from roadways, ramps, intersections, and traffic characteristics for Safety Analyst. To test the proposed system and tools, data from Clark County, which is the largest county in Nevada and includes the cities of Las Vegas, Henderson, Boulder City, and North Las Vegas, was used. The database and Safety Analyst together help identify the sites with the potential for safety improvements. Specifically, this study examined the results from two case studies. The first case study, which identified sites having a potential for safety improvements with respect to fatal and all injury crashes, included all roadway elements and used default and calibrated Safety Performance Functions (SPFs). The second case study identified sites having a potential for safety improvements with respect to fatal and all injury crashes, specifically regarding intersections; it used default and calibrated SPFs as well. 
Conclusions were developed for the calibration of safety performance functions and the classification of site subtypes. Guidelines were provided about the selection of a particular network screening type or performance measure for network screening. Alexander Paz, Naveen Veeramisti, Indira Khanal, Justin Baker, and Hanns de la Fuente-Mella Copyright © 2015 Alexander Paz et al. All rights reserved. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling Sun, 07 Jun 2015 14:19:48 +0000 Web crawling has acquired tremendous significance in recent times, and it is aptly associated with the substantial development of the World Wide Web. Web search engines face new challenges due to the availability of vast amounts of web documents, which makes the retrieved results less applicable to the analysers. Recently, however, web crawling has focused solely on obtaining the links of the corresponding documents. Today, there exist various algorithms and software tools for crawling links from the web, which then have to be processed further for future use, thereby increasing the analyser’s workload. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. First, the links are crawled from the specified uniform resource locator (URL) using a modified version of the Depth First Search algorithm, which allows complete hierarchical scanning of corresponding web links. The links are then accessed via the source code, and metadata such as the title, keywords, and description are extracted. This content is essential for any type of analyser work to be carried out on the Big Data obtained as a result of web crawling. R. Suganya Devi, D. Manjula, and R. K. Siddharth Copyright © 2015 R. Suganya Devi et al. All rights reserved.
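The web-indexing abstract above crawls links depth-first and extracts each page's title, keywords, and description. A minimal sketch of that pipeline follows; a real crawler would fetch pages over HTTP, but here an in-memory dict of URL → HTML (with invented URLs and pages) stands in for the network.

```python
# Sketch of DFS link crawling with metadata extraction.
from html.parser import HTMLParser

PAGES = {  # toy "web": URL -> HTML source
    "http://a.example/": "<html><head><title>A</title>"
        "<meta name='keywords' content='alpha,beta'>"
        "<meta name='description' content='Page A'></head>"
        "<body><a href='http://b.example/'>b</a></body></html>",
    "http://b.example/": "<html><head><title>B</title></head>"
        "<body><a href='http://a.example/'>back</a></body></html>",
}

class PageParser(HTMLParser):
    """Collects outgoing links plus title/keywords/description."""
    def __init__(self):
        super().__init__()
        self.links, self.meta, self._in_title = [], {}, False
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "meta" and attrs.get("name") in ("keywords", "description"):
            self.meta[attrs["name"]] = attrs.get("content", "")
        elif tag == "title":
            self._in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
    def handle_data(self, data):
        if self._in_title:
            self.meta["title"] = self.meta.get("title", "") + data

def crawl(start, pages, index=None, visited=None):
    """Depth-first crawl: record a page's metadata, then recurse
    into each unvisited outgoing link."""
    index = {} if index is None else index
    visited = set() if visited is None else visited
    if start in visited or start not in pages:
        return index
    visited.add(start)
    parser = PageParser()
    parser.feed(pages[start])
    index[start] = parser.meta
    for link in parser.links:
        crawl(link, pages, index, visited)
    return index

index = crawl("http://a.example/", PAGES)
print(index["http://a.example/"])
```

The `visited` set is what keeps the depth-first traversal from looping on the A → B → A cycle; the resulting `index` maps each URL to its extracted metadata, ready for downstream indexing.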
A Strategic Study about Quality Characteristics in e-Health Systems Based on a Systematic Literature Review Thu, 04 Jun 2015 10:08:14 +0000 e-Health Systems quality management is an expensive and hard process that entails performing several tasks such as analysis, evaluation, and quality control. Furthermore, the development of an e-Health System involves great responsibility, since people’s health and quality of life depend on the system and services offered. The focus of the following study is to identify the gap in Quality Characteristics for e-Health Systems by detecting not only which are the most studied but also which are the most used Quality Characteristics in these Systems. A strategic study is driven in this paper by a Systematic Literature Review so as to identify Quality Characteristics in e-Health. Such a study makes information and communication technology organizations reflect and act strategically to manage quality in e-Health Systems efficiently and effectively. As a result, this paper proposes the bases of a Quality Model and focuses on a set of Quality Characteristics to enable e-Health Systems quality management. Thus, we can conclude that this paper contributes knowledge with regard to the mission and view of e-Health (Systems) quality management and helps in understanding how current research evaluates quality in e-Health Systems. F. J. Domínguez-Mayo, M. J. Escalona, M. Mejías, G. Aragón, J. A. García-García, J. Torres, and J. G. Enríquez Copyright © 2015 F. J. Domínguez-Mayo et al. All rights reserved. A Multiconstrained Grid Scheduling Algorithm with Load Balancing and Fault Tolerance Wed, 03 Jun 2015 07:41:42 +0000 A grid environment consists of millions of dynamic and heterogeneous resources. A grid environment that deals with computing resources is a computational grid, meant for applications that involve larger computations.
A scheduling algorithm is said to be efficient if and only if it performs better resource allocation even in the case of resource failure. The allocation of resources is a tedious issue, since it has to consider several requirements such as system load, processing cost and time, the user’s deadline, and resource failure. This work attempts to design a resource allocation algorithm which is budget constrained and also targets load balancing, fault tolerance, and user satisfaction by considering the above requirements. The proposed Multiconstrained Load Balancing Fault Tolerant algorithm (MLFT) reduces the schedule makespan, the schedule cost, and the task failure rate, and improves resource utilization. The proposed MLFT algorithm is evaluated using the GridSim toolkit, and the results are compared with recent algorithms which separately concentrate on all these factors. The comparison results confirm that the proposed algorithm works better than its counterparts. P. Keerthika and P. Suresh Copyright © 2015 P. Keerthika and P. Suresh. All rights reserved. A Framework and Improvements of the Korea Cloud Services Certification System Mon, 01 Jun 2015 11:30:15 +0000 Cloud computing service is an evolving paradigm that affects a large part of the ICT industry and provides new opportunities for ICT service providers, such as the deployment of new business models and the realization of economies of scale through increased efficiency of resource utilization. However, despite the benefits of cloud services, there are some obstacles to their adoption, such as the lack of means to assess and compare the service quality of cloud services regarding availability, security, and reliability. In order to achieve successful adoption of cloud services, it is necessary to establish a cloud service certification system to ensure the service quality and performance of cloud services. This paper proposes a framework and improvements for the Korea certification system of cloud services.
In order to develop it, the critical issues related to the service quality, performance, and certification of cloud services are identified, and the systematic framework for the certification system of cloud services and service provider domains is developed. Improvements of the developed Korea certification system of cloud services are also proposed. Hangoo Jeon and Kwang-Kyu Seo Copyright © 2015 Hangoo Jeon and Kwang-Kyu Seo. All rights reserved. Educational Applications for Blind and Partially Sighted Pupils Based on Speech Technologies for Serbian Mon, 01 Jun 2015 06:39:20 +0000 The inclusion of persons with disabilities has always represented an important issue. Advancements within the field of computer science have enabled the development of different types of aids, which have significantly improved the quality of life of the disabled. However, for some disabilities, such as visual impairment, the purpose of these aids is to establish an alternative communication channel and thus overcome the user’s disability. Speech technologies play a crucial role in this process. This paper presents the ongoing efforts to create a set of educational applications based on speech technologies for Serbian for the early stages of education of blind and partially sighted children. Two educational applications dealing with memory exercises and the comprehension of geometrical shapes are presented, along with the initial test results obtained from research involving visually impaired pupils. Branko Lučić, Stevan Ostrogonac, Nataša Vujnović Sedlar, and Milan Sečujski Copyright © 2015 Branko Lučić et al. All rights reserved. Electronic Voting Protocol Using Identity-Based Cryptography Sun, 24 May 2015 14:30:56 +0000 Electronic voting protocols proposed to date meet their properties based on Public Key Cryptography (PKC), which offers high flexibility through key agreement protocols and authentication mechanisms.
However, when PKC is used, it is necessary to implement a Certification Authority (CA) to provide certificates which bind public keys to entities and enable verification of such public key bindings. Consequently, the number of components in the protocol increases notably. An alternative is to use Identity-Based Encryption (IBE). With this kind of cryptography, it is possible to have all the benefits offered by PKC without the need for certificates or all the core components of a Public Key Infrastructure (PKI). Considering the aforementioned, in this paper we propose an electronic voting protocol which meets the privacy and robustness properties by using bilinear maps. Gina Gallegos-Garcia and Horacio Tapia-Recillas Copyright © 2015 Gina Gallegos-Garcia and Horacio Tapia-Recillas. All rights reserved. An Accurate FFPA-PSR Estimator Algorithm and Tool for Software Effort Estimation Wed, 20 May 2015 12:25:26 +0000 Software companies are now keen to provide secure software with respect to the accuracy and reliability of their products, especially in relation to software effort estimation. Therefore, there is a need to develop a hybrid tool which provides all the necessary features. This paper proposes a hybrid estimator algorithm and model which incorporates quality metrics, a reliability factor, and a security factor with a fuzzy-based function point analysis. Initially, this method utilizes a fuzzy-based estimate to control the uncertainty in the software size with the help of a triangular fuzzy set at the early development stage. Secondly, the function point analysis is extended by including the security and reliability factors in the calculation. Finally, the performance metrics are added to the effort estimation for accuracy. The experimentation is done with different project data sets on the hybrid tool, and the results are compared with those of existing models.
The results show that the proposed method not only improves the accuracy but also increases the reliability, as well as the security, of the product. Senthil Kumar Murugesan and Chidhambara Rajan Balasubramanian Copyright © 2015 Senthil Kumar Murugesan and Chidhambara Rajan Balasubramanian. All rights reserved. Efficient Scheduling of Scientific Workflows with Energy Reduction Using Novel Discrete Particle Swarm Optimization and Dynamic Voltage Scaling for Computational Grids Thu, 14 May 2015 07:20:48 +0000 One of the most significant parameters in the real-world computing environment is energy. Minimizing energy brings benefits such as a reduction in power consumption, a decrease in the cooling rates of the computing processors, the provision of a green environment, and so forth. In fact, computation time and energy are directly proportional to each other, and the minimization of computation time may yield cost-effective energy consumption. Proficient scheduling of Bag-of-Tasks in the grid environment results in minimum computation time. In this paper, a novel discrete particle swarm optimization (DPSO) algorithm based on the particle’s best position (pbDPSO) and global best position (gbDPSO) is adopted to find the global optimal solution for higher dimensions. This novel DPSO yields a better schedule with minimum computation time compared to the Earliest Deadline First (EDF) and First Come First Serve (FCFS) algorithms, which comparably reduces energy. Other scheduling parameters, such as job completion ratio and lateness, are also calculated and compared with EDF and FCFS. An energy improvement of up to 28% was obtained when Makespan Conservative Energy Reduction (MCER) and Dynamic Voltage Scaling (DVS) were used in the proposed DPSO algorithm. M. Christobel, S. Tamil Selvi, and Shajulin Benedict Copyright © 2015 M. Christobel et al. All rights reserved.
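The DPSO scheduling abstract above guides each particle toward its personal best (pbest) and the global best (gbest) position. The following toy sketch applies that idea to Bag-of-Tasks scheduling (minimizing makespan); the task sizes, machine count, and per-dimension copy probabilities are invented, and the update rule is a simplified stand-in for the paper's pbDPSO/gbDPSO operators, not their actual algorithm.

```python
# Toy discrete-PSO sketch for Bag-of-Tasks scheduling.
import random

random.seed(7)
TASKS = [8, 3, 12, 5, 9, 2, 7, 4]   # invented task run times
MACHINES = 3

def makespan(assign):
    """Fitness: finish time of the most loaded machine."""
    load = [0.0] * MACHINES
    for task, m in zip(TASKS, assign):
        load[m] += task
    return max(load)

def update(pos, pbest, gbest, w=0.3, c1=0.25, c2=0.25):
    """Per-dimension discrete update: copy from gbest, copy from
    pbest, keep the current value, or mutate randomly. The copy
    probabilities play the role of a 'velocity'."""
    new = []
    for i in range(len(pos)):
        r = random.random()
        if r < c2:
            new.append(gbest[i])            # attraction to global best
        elif r < c1 + c2:
            new.append(pbest[i])            # attraction to personal best
        elif r < w + c1 + c2:
            new.append(pos[i])              # inertia: keep current choice
        else:
            new.append(random.randrange(MACHINES))  # random exploration
    return new

swarm = [[random.randrange(MACHINES) for _ in TASKS] for _ in range(20)]
pbest = [p[:] for p in swarm]
gbest = min(pbest, key=makespan)
for _ in range(100):
    for j in range(len(swarm)):
        swarm[j] = update(swarm[j], pbest[j], gbest)
        if makespan(swarm[j]) < makespan(pbest[j]):
            pbest[j] = swarm[j][:]
    gbest = min(pbest, key=makespan)

print(makespan(gbest))
```

With total work 50 over 3 machines, no schedule can finish before 50/3 ≈ 16.7, so any integer-load makespan of 17 found by the swarm is optimal for this instance.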
Emergency Situation Prediction Mechanism: A Novel Approach for Intelligent Transportation System Using Vehicular Ad Hoc Networks Sun, 10 May 2015 11:16:24 +0000 On Indian four-lane express highways, millions of vehicles travel every day. Accidents occur frequently on these highways, causing deaths and damage to infrastructure. A mechanism is required to avoid such road accidents as far as possible and reduce the death toll. The Emergency Situation Prediction Mechanism (ESPM), a novel and proactive approach, is proposed in this paper for achieving the best of an Intelligent Transportation System using a Vehicular Ad Hoc Network. ESPM intends to predict the possibility of the occurrence of an accident on an Indian four-lane express highway. In ESPM, the emergency situation prediction is done by the Road Side Unit (RSU) based on (i) the status reports sent by the vehicles in the range of the RSU and (ii) the road traffic flow analysis done by the RSU. Once the emergency situation or accident is predicted in advance, an Emergency Warning Message is constructed and disseminated to all vehicles in the area of the RSU to alert the vehicles and prevent accidents. ESPM performs well in predicting an emergency situation in advance of the occurrence of an accident. ESPM predicts the emergency situation within 0.20 seconds, which is comparatively less than the statistical value. The prediction accuracy of ESPM against vehicle density is found to be better in different traffic scenarios. P. Ganeshkumar and P. Gokulakrishnan Copyright © 2015 P. Ganeshkumar and P. Gokulakrishnan. All rights reserved. Online Pedagogical Tutorial Tactics Optimization Using Genetic-Based Reinforcement Learning Thu, 07 May 2015 11:48:51 +0000 Tutorial tactics are policies for an Intelligent Tutoring System (ITS) to decide the next action when there are multiple actions available.
Recent research has demonstrated that, when the learning contents are controlled so as to be the same, different tutorial tactics make a difference in students’ learning gains. However, the Reinforcement Learning (RL) techniques that were used in previous studies to induce tutorial tactics are insufficient for large problems and were hence used in an offline manner. Therefore, we introduce a Genetic-Based Reinforcement Learning (GBML) approach to induce tutorial tactics in an online-learning manner without relying on any preexisting dataset. The introduced method can learn a set of rules from the environment in a manner similar to RL. It includes a genetic-based optimizer for the rule discovery task, generating new rules from old ones. This increases the scalability of an RL learner for larger problems. The results support our hypothesis about the capability of the GBML method to induce tutorial tactics. This suggests that the GBML method should be favorable for developing real-world ITS applications in the domain of tutorial tactics induction. Hsuan-Ta Lin, Po-Ming Lee, and Tzu-Chien Hsiao Copyright © 2015 Hsuan-Ta Lin et al. All rights reserved. QRFXFreeze: Queryable Compressor for RFX Wed, 06 May 2015 13:31:39 +0000 The verbose nature of XML has been mulled over again and again, and many compression techniques for XML data have been excogitated over the years. Some of the techniques incorporate support for querying the XML database in its compressed format, while others have to be decompressed before they can be queried. XML compression in which querying is supported directly and instantaneously, with no compromise over time, is forced to compromise over space. In this paper, we propose the compressor QRFXFreeze, which not only reduces the space of storage but also supports efficient querying. The compressor does this without decompressing the compressed XML file.
The compressor supports all kinds of XML documents along with insert, update, and delete operations. The forte of QRFXFreeze is that the textual data are semantically compressed and are indexed to reduce the querying time. Experimental results show that the proposed compressor performs much better than other well-known compressors. Radha Senthilkumar, Gomathi Nandagopal, and Daphne Ronald Copyright © 2015 Radha Senthilkumar et al. All rights reserved. Security of Information and Networks Mon, 04 May 2015 13:05:51 +0000 Iftikhar Ahmad, Aneel Rahim, Adeel Javed, and Hafiz Malik Copyright © 2015 Iftikhar Ahmad et al. All rights reserved. Method for Detecting Manipulated Compilation of Sensing Reports in Wireless Sensor Networks Sun, 03 May 2015 13:50:58 +0000 In cluster-based wireless sensor networks (WSNs), a few sensor nodes, including cluster heads (CHs), can be physically compromised by a malicious adversary. By using compromised CHs, the adversary can intentionally attach false message authentication codes into legitimate sensing reports in order to interrupt reporting of the real events. The existing solutions are vulnerable to such a type of security attacks, called manipulated compilation attacks (MCAs), since they assume that CHs are uncompromised. Thus, the reports manipulated by compromised CHs will be discarded by forwarding nodes or rejected at base stations, so that real events on the fields cannot be properly reported to the users. In this paper, the author proposes a method for the detection of MCAs in cluster-based WSNs. In the proposed method, every sensing report is collaboratively generated and verified by cluster nodes based on very loose synchronization. Once a cluster node has detected an MCA for a real event, it can reforward a legitimate report immediately. Therefore, the event can be properly reported to the users. The performance of the proposed method is shown with analytical and experimental results at the end of the paper. 
Hae Young Lee Copyright © 2015 Hae Young Lee. All rights reserved. Using a Prediction Model to Manage Cyber Security Threats Sun, 03 May 2015 10:30:00 +0000 Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization. Venkatesh Jaganathan, Priyesh Cherurveettil, and Premapriya Muthu Sivashanmugam Copyright © 2015 Venkatesh Jaganathan et al. All rights reserved. A Novel Protective Framework for Defeating HTTP-Based Denial of Service and Distributed Denial of Service Attacks Sun, 03 May 2015 10:18:06 +0000 The growth of web technology has brought convenience to our lives, since the web has become the most important communication channel. However, this merit is now threatened by complicated network-based attacks, such as denial of service (DoS) and distributed denial of service (DDoS) attacks. Despite many researchers’ efforts, no optimal solution that addresses all sorts of HTTP DoS/DDoS attacks is on offer. Therefore, this research aims to fill this gap by designing an alternative solution called the flexible, collaborative, multilayer, DDoS prevention framework (FCMDPF). The innovative design of the FCMDPF framework handles all aspects of HTTP-based DoS/DDoS attacks through the following three sequential schemes (layers). Firstly, an outer blocking (OB) scheme blocks an attacking IP source if it is listed in the blacklist table.
Secondly, the service traceback oriented architecture (STBOA) scheme validates whether the incoming request is launched by a human or by an automated tool and then traces back the true attacking IP source. Thirdly, the flexible advanced entropy based (FAEB) scheme eliminates high-rate DDoS (HR-DDoS) and flash crowd (FC) attacks. Compared to previous research, our framework’s design provides efficient protection for web applications against all sorts of DoS/DDoS attacks. Mohammed A. Saleh and Azizah Abdul Manaf Copyright © 2015 Mohammed A. Saleh and Azizah Abdul Manaf. All rights reserved. Using Fuzzy Logic Techniques for Assertion-Based Software Testing Metrics Tue, 28 Apr 2015 07:38:50 +0000 Software testing is a very labor-intensive and costly task. Therefore, many techniques to automate the process of software testing have been reported in the literature. Assertion-based automated software testing has been shown to be effective in detecting program faults compared to traditional black-box and white-box software testing methods. However, applying this approach in the presence of large numbers of assertions may be very costly. Therefore, software developers need assistance when deciding whether to apply assertion-based testing in order to get the benefits of this approach at an acceptable level of cost. In this paper, we present an assertion-based testing metrics technique that is based on fuzzy logic. The main goal of the proposed technique is to enhance the performance of assertion-based software testing in the presence of large numbers of assertions. To evaluate the proposed technique, an experimental study was performed in which the proposed technique was applied to programs with assertions. The results of this experiment show that the effectiveness and performance of assertion-based software testing improve when the proposed testing metrics technique is applied. Ali M.
Alakeel Copyright © 2015 Ali M. Alakeel. All rights reserved. Generating Personalized Web Search Using Semantic Context Mon, 27 Apr 2015 09:25:17 +0000 The “one size fits all” criticism of search engines is that, when queries are submitted, the same results are returned to different users. In order to solve this problem, personalized search has been proposed, since it can provide different search results based upon the preferences of users. However, existing methods concentrate more on the long-term and independent user profile, and thus reduce the effectiveness of personalized search. In this paper, the proposed method captures the user context to provide accurate preferences of users for effective personalized search. First, the short-term query context is generated to identify related concepts of the query. Second, the user context is generated based on the click-through data of users. Finally, a forgetting factor is introduced to merge the independent user contexts in a user session, which maintains the evolution of user preferences. Experimental results fully confirm that our approach can successfully represent user context according to individual user information needs. Zheng Xu, Hai-Yan Chen, and Jie Yu Copyright © 2015 Zheng Xu et al. All rights reserved. A Novel Way to Relate Ontology Classes Tue, 21 Apr 2015 06:42:42 +0000 The existing ontologies in the semantic web typically have anonymous union and intersection classes. The anonymous classes are limited in scope and may not be part of the whole inference process. The tools, namely, Pellet, Jena, and Protégé, interpret collection classes as (a) equivalent classes/subclasses of a union class and (b) superclasses of an intersection class. As a result, there is a possibility that the tools will produce error-prone inference results for relations, namely, sub-, union, intersection, and equivalent relations, and those dependent on these relations, namely, complement.
Verifying whether one class is the complement of another involves the sub- and equivalent relations. Motivated by this, we (i) refine the test data set of the conference ontology by adding named union and intersection classes and (ii) propose a match algorithm to (a) calculate a corrected subclass list, (b) correctly relate intersection and union classes with their collection classes, and (c) match union, intersection, sub-, complement, and equivalent classes in a proper sequence to avoid error-prone match results. We compare the results of our algorithms with those of a candidate reasoner, namely, the Pellet reasoner. To the best of our knowledge, ours is a unique attempt at establishing a novel way to relate ontology classes. Ami T. Choksi and Devesh C. Jinwala Copyright © 2015 Ami T. Choksi and Devesh C. Jinwala. All rights reserved. Smart TV-Smartphone Multiscreen Interactive Middleware for Public Displays Thu, 09 Apr 2015 12:18:25 +0000 A new generation of public displays demands highly interactive and multiscreen features to enrich people's experience in new pervasive environments. Traditionally, research on public display interaction has placed mobile devices in the leading role, relying on personal area network technologies such as Bluetooth or NFC. However, the emerging Smart TV model is an interesting alternative for implementing a new generation of public displays, owing to its intrinsic ability to connect with surrounding devices such as smartphones or tablets. Nonetheless, the approaches proposed by the most important vendors still lack adequate support for the multiscreen and interaction capabilities modern public displays require, because most of them are intended for domestic environments.
This research proposes multiscreen interactive middleware for public displays, developed from the principles of a loosely coupled interaction model, simplicity, stability, concurrency, low latency, and the use of open standards and technologies. Moreover, a validation prototype is presented for one of the most interesting public display scenarios: advertising. Francisco Martinez-Pabon, Jaime Caicedo-Guerrero, Jhon Jairo Ibarra-Samboni, Gustavo Ramirez-Gonzalez, and Davinia Hernández-Leo Copyright © 2015 Francisco Martinez-Pabon et al. All rights reserved. Machine Learning in Intelligent Video and Automated Monitoring Thu, 09 Apr 2015 08:58:18 +0000 Yu-Bo Yuan, Gao Yang David, and Shan Zhao Copyright © 2015 Yu-Bo Yuan et al. All rights reserved. Corrigendum to “A Preliminary Investigation of User Perception and Behavioral Intention for Different Review Types: Customers and Designers Perspective” Tue, 07 Apr 2015 13:04:12 +0000 Atika Qazi, Ram Gopal Raj, Muhammad Tahir, Mehwish Waheed, Saif Ur Rehman Khan, and Ajith Abraham Copyright © 2015 Atika Qazi et al. All rights reserved. A Novel Rules Based Approach for Estimating Software Birthmark Mon, 06 Apr 2015 11:22:33 +0000 A software birthmark is a unique quality of software that can be used to detect software theft. Comparing the birthmarks of two pieces of software can tell us whether one is a copy of the other. Software theft and piracy are rapidly growing problems of copying, stealing, and misusing software without the permission specified in the license agreement. Estimating a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, that is, credibility and resilience.
For this purpose, soft computing concepts such as probabilistic and fuzzy computing have been taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, namely, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. Shah Nazir, Sara Shahzad, Sher Afzal Khan, Norma Binti Alias, and Sajid Anwar Copyright © 2015 Shah Nazir et al. All rights reserved. Hybrid Ontology for Semantic Information Retrieval Model Using Keyword Matching Indexing System Wed, 01 Apr 2015 08:36:56 +0000 Ontology development is the process of growing and elucidating the concepts of an information domain that are common to a group of users. Incorporating an ontology into information retrieval is a natural way to improve the retrieval of the relevant information users require. Matching keywords against historical or domain information is significant in recent systems for finding the best match for specific input queries. This research presents an improved querying mechanism for information retrieval that integrates ontology queries with keyword search. The ontology-based query is converted into a first-order predicate logic form, which is used for routing the query to the appropriate servers. Matching algorithms represent an active area of research in computer science and artificial intelligence. In text matching, it is more reliable to study the semantic model and the query under semantic matching conditions. This research develops semantic matching between input queries and information in the ontology field. The contributed algorithm is a hybrid method based on matching instances extracted from the queries and the information field.
Semantic matching is applied to the queries and the information domain to discover the best match and to speed up the execution process. In conclusion, the hybrid ontology in the semantic web retrieves documents more effectively than a standard ontology. K. R. Uthayan and G. S. Anandha Mala Copyright © 2015 K. R. Uthayan and G. S. Anandha Mala. All rights reserved. Strategic Management Advanced Service for Sustainable Computing Environment Mon, 30 Mar 2015 12:34:54 +0000 Sang-Soo Yeo, Qun Jin, Vincenzo Loia, and Hangbae Chang Copyright © 2015 Sang-Soo Yeo et al. All rights reserved. ACOustic: A Nature-Inspired Exploration Indicator for Ant Colony Optimization Mon, 30 Mar 2015 11:53:06 +0000 A statistical machine learning indicator, ACOustic, is proposed to evaluate the exploration behavior in the iterations of ant colony optimization algorithms. The idea is inspired by the behavior of some parasites that mimic the acoustics of the queens of their ant hosts; the parasites' reaction reflects their ability to indicate the state of penetration. The proposed indicator solves the robustness problem that results from differences of magnitude in the distance matrix, especially when combinatorial optimization problems with rugged fitness landscapes are tackled. The performance of the proposed indicator is evaluated against existing indicators in six variants of ant colony optimization algorithms. Instances of the travelling salesman problem and the quadratic assignment problem are used in the experimental evaluation. The analytical results show that the proposed indicator is more informative and more robust. Rafid Sagban, Ku Ruhana Ku-Mahamud, and Muhamad Shahbani Abu Bakar Copyright © 2015 Rafid Sagban et al. All rights reserved. Energy Aware Swarm Optimization with Intercluster Search for Wireless Sensor Network Mon, 30 Mar 2015 08:29:24 +0000 Wireless sensor networks (WSNs) are emerging as a low cost, popular solution for many real-world challenges.
The low cost enables deployment of large sensor arrays for military and civilian tasks. WSNs are generally power constrained because their deployment method makes replacement of the battery source difficult. A key challenge in WSNs is a well-organized communication platform for the network with negligible power utilization. In this work, an improved binary particle swarm optimization (PSO) algorithm with a modified connected dominating set (CDS) based on residual energy is proposed for discovering the optimal number of clusters and cluster heads (CHs). Simulations show that the proposed BPSO-T and BPSO-EADS perform better than LEACH- and PSO-based systems in terms of energy savings and QoS. Shanmugasundaram Thilagavathi and Bhavani Gnanasambandan Geetha Copyright © 2015 Shanmugasundaram Thilagavathi and Bhavani Gnanasambandan Geetha. All rights reserved. Fast Image Search with Locality-Sensitive Hashing and Homogeneous Kernels Map Sun, 29 Mar 2015 13:50:51 +0000 Fast image search with efficient additive kernels and kernel locality-sensitive hashing is proposed. To accommodate kernel functions, recent work has explored ways of creating locality-sensitive hashing (LSH) schemes that guarantee linear query time; however, existing methods sacrifice accuracy of the search results in order to allow fast queries. To improve search accuracy, we show how to apply explicit feature maps of homogeneous kernels, which help in feature transformation, and combine them with kernel locality-sensitive hashing. We evaluate our method on several large datasets and show that it improves accuracy relative to commonly used methods and makes object classification and content-based retrieval faster and more accurate. Jun-yi Li and Jian-hua Li Copyright © 2015 Jun-yi Li and Jian-hua Li. All rights reserved.
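As background for the LSH abstract above, the basic idea can be shown with the classic random-hyperplane construction; this is a generic sketch for intuition only, not the paper's kernelized LSH or its homogeneous kernel maps:

```python
import random

def make_hash(dim, n_bits, seed=0):
    # One random hyperplane per bit; the sign of the dot product gives the bit.
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]
    def h(v):
        return tuple(int(sum(p_i * v_i for p_i, v_i in zip(p, v)) >= 0)
                     for p in planes)
    return h

# Vectors pointing in similar directions tend to land in the same bucket,
# so a query is compared only against its bucket, not the whole database.
h = make_hash(dim=3, n_bits=8)
buckets = {}
data = [(1.0, 0.9, 1.1), (1.0, 1.0, 1.0), (-1.0, -1.0, 2.0)]
for v in data:
    buckets.setdefault(h(v), []).append(v)
```

Because each bit depends only on the sign of a dot product, scaling a vector by a positive constant never changes its hash code.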
Declarative Programming with Temporal Constraints, in the Language CG Sun, 29 Mar 2015 07:35:24 +0000 Specifying and interpreting temporal constraints are key elements of knowledge representation and reasoning, with applications in temporal databases, agent programming, and ambient intelligence. We present and formally characterize the language CG, which tackles this issue. In CG, users can develop time-dependent programs in a flexible and straightforward manner. Such programs can, in turn, be coupled with evolving environments, thus empowering users to control the environment's evolution. CG relies on a structure for storing temporal information, together with a dedicated query mechanism; hence, we explore the computational complexity of our query satisfaction problem. We discuss previous implementation attempts of CG and introduce a novel prototype which relies on logic programming. Finally, we address the issue of consistency and correctness of CG program execution, using the Event-B modeling approach. Lorina Negreanu Copyright © 2015 Lorina Negreanu. All rights reserved. Benchmarking RCGAu on the Noiseless BBOB Testbed Sun, 29 Mar 2015 07:19:15 +0000 RCGAu is a hybrid real-coded genetic algorithm with a “uniform random direction” search mechanism that enhances the local search capability of RCGA. In this paper, RCGAu was tested on the BBOB-2013 noiseless testbed using restarts until a maximum number of function evaluations (#FEs) of 10^5 × D was reached, where D is the dimension of the function search space. RCGAu was able to solve several test functions in the low search dimensions of 2 and 3 to the desired accuracy of 10^-8. Although RCGAu found it difficult to obtain a solution with the desired accuracy of 10^-8 for high-conditioning and multimodal functions within the specified maximum #FEs, it was able to solve most of the test functions with dimensions up to 40 at lower precisions. Babatunde A.
Sawyerr, Aderemi O. Adewumi, and M. Montaz Ali Copyright © 2015 Babatunde A. Sawyerr et al. All rights reserved. Developing R&D Portfolio Business Validity Simulation Model and System Sun, 29 Mar 2015 07:11:03 +0000 R&D has been recognized as a critical means of gaining competitiveness, by nations as well as companies, through value creation such as patents and new products. R&D is therefore a burden on decision makers, in that it is hard to decide how much money to invest, how much time to spend, and which technology to develop, and each decision consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors are not considered enough, because almost all previous studies are technology-oriented evaluations based on a single R&D technology. We therefore previously proposed an R&D business-aspect evaluation model consisting of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or an industry's R&D portfolio from a business model point of view, and we clarify default and control parameters, integrated into one screen, to facilitate the evaluator's business validity work in each evaluation module. Hyun Jin Yeo and Kwang Hyuk Im Copyright © 2015 Hyun Jin Yeo and Kwang Hyuk Im. All rights reserved. Research and Development of Advanced Computing Technologies Thu, 26 Mar 2015 07:59:57 +0000 Shifei Ding, Zhongzhi Shi, and Ahmad Taher Azar Copyright © 2015 Shifei Ding et al. All rights reserved. Research and Application of Knowledge Resources Network for Product Innovation Wed, 25 Mar 2015 14:03:07 +0000 To enhance the knowledge service capabilities of a product innovation design service platform, a method is proposed for acquiring knowledge resources that support product innovation from the Internet and for actively pushing knowledge to users.
Through ontology-based knowledge modeling for product innovation, an integrated architecture for the knowledge resources network is put forward. The technology for acquiring network knowledge resources based on a focused crawler and web services is studied. Active knowledge push is provided to users through user behavior analysis and knowledge evaluation, in order to improve users' enthusiasm for participating in the platform. Finally, an application example illustrates the effectiveness of the method. Chuan Li, Wen-qiang Li, Yan Li, Hui-zhen Na, and Qian Shi Copyright © 2015 Chuan Li et al. All rights reserved. From Determinism and Probability to Chaos: Chaotic Evolution towards Philosophy and Methodology of Chaotic Optimization Tue, 24 Mar 2015 11:14:05 +0000 We present and discuss the philosophy and methodology of chaotic evolution, which is theoretically supported by chaos theory. We introduce four chaotic systems, namely, the logistic map, tent map, Gaussian map, and Hénon map, into a well-designed chaotic evolution algorithm framework to implement several chaotic evolution (CE) algorithms. By comparing our previously proposed CE algorithm using the logistic map with two canonical differential evolution (DE) algorithms, we analyse and discuss the optimization performance of the CE algorithm. An investigation of the relationship between the optimization capability of the CE algorithm and the distribution characteristics of the chaotic system is conducted and analysed. From the evaluation results, we find that the distribution of the chaotic system is an essential factor influencing the optimization performance of the CE algorithm. We also propose a new interactive EC (IEC) algorithm, interactive chaotic evolution (ICE), which replaces the fitness function with a real human in the CE algorithm framework; the CE search scheme is by nature based on paired comparisons. A simulation experiment with a pseudo-IEC user is conducted to evaluate our proposed ICE algorithm.
The evaluation results indicate that the ICE algorithm can obtain significantly better performance than, or the same performance as, interactive DE. Some open topics on CE, ICE, the fusion of these optimization techniques, algorithmic notation, and others are presented and discussed. Yan Pei Copyright © 2015 Yan Pei. All rights reserved. Unbiased Feature Selection in Learning Random Forests for High-Dimensional Data Tue, 24 Mar 2015 08:52:59 +0000 Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting. This gives RFs poor accuracy when working with high-dimensional data. In addition, RFs have a bias in the feature selection process that favors multivalued features. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features when learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and a subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets, and a feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach enables one to generate more accurate trees while reducing dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets, including image datasets. The experimental results show that RFs with the proposed approach outperform existing random forests in terms of both accuracy and AUC. Thanh-Tung Nguyen, Joshua Zhexue Huang, and Thuy Thi Nguyen Copyright © 2015 Thanh-Tung Nguyen et al. All rights reserved.
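For reference, the chaotic maps named in the chaotic-evolution abstract above are one-line recurrences; this sketch merely iterates two of them (the logistic and tent maps) as a randomness source, and is not a reconstruction of the authors' CE algorithm:

```python
def logistic_map(x, r=4.0):
    # Logistic recurrence x_{n+1} = r * x_n * (1 - x_n); chaotic at r = 4.
    return r * x * (1.0 - x)

def tent_map(x, mu=2.0):
    # Tent recurrence x_{n+1} = mu * min(x_n, 1 - x_n).
    return mu * min(x, 1.0 - x)

def orbit(step, x0, n):
    # Generate n iterates of a map starting from x0.
    xs = [x0]
    for _ in range(n - 1):
        xs.append(step(xs[-1]))
    return xs

# Both maps keep the sequence in [0, 1]; in a chaotic evolution framework
# such sequences replace pseudorandom draws when sampling candidate solutions.
seq = orbit(logistic_map, 0.123, 100)
```

The distribution of these iterates differs from map to map (the logistic map at r = 4 concentrates mass near 0 and 1), which is exactly the distribution characteristic the abstract identifies as influencing optimization performance.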
Reverse Engineering of Free-Form Surface Based on the Closed-Loop Theory Tue, 24 Mar 2015 06:30:15 +0000 Seeking better measurement methods and more accurate reconstruction models has been a focus of researchers in the field of reverse engineering. Accordingly, a new method for adaptive measurement, real-time reconstruction, and online evaluation of free-form surfaces is presented in this paper. The coordinates and vectors of the prediction points are calculated according to a Bézier curve fitted to the measured points, so that the final measured point cloud distribution agrees with the geometric characteristics of the free-form surface. The point cloud is fitted to a surface model by the nonuniform B-spline method; check points are extracted from the surface model based on grids and surface features; the locations of these check points on the surface are verified with a CMM to evaluate the model; and the surface model is then updated to meet the required accuracy. Integrating measurement, reconstruction, and evaluation in a closed-loop reverse process establishes an accurate model. The results of an example show that the measuring points are distributed over the surface according to curvature and that the reconstructed model can be expressed completely at the micron level. Meanwhile, measurement, reconstruction, and evaluation are integrated in the form of a closed-loop reverse system. Xue Ming He, Jun Fei He, Mei Ping Wu, Rong Zhang, and Xiao Gang Ji Copyright © 2015 Xue Ming He et al. All rights reserved. State of the Art of Fuzzy Methods for Gene Regulatory Networks Inference Mon, 23 Mar 2015 12:34:14 +0000 To address one of the most challenging issues at the cellular level, this paper surveys the fuzzy methods used in gene regulatory network (GRN) inference.
GRNs represent causal relationships between genes that have a direct influence, through protein production, on the life and development of living organisms, and they contribute usefully to the understanding of cellular functions as well as the mechanisms of diseases. Fuzzy systems are based on handling imprecise knowledge, such as biological information, and provide viable computational tools for inferring GRNs from gene expression data, thus contributing to the discovery of the gene interactions responsible for specific diseases and/or of ad hoc corrective therapies. Increasing computational power and high-throughput technologies have provided powerful means to manage these challenging digital ecosystems at different levels, from the cell to society globally. The main aim of this paper is to report, present, and discuss the main contributions of this multidisciplinary field in a coherent and structured framework. Tuqyah Abdullah Al Qazlan, Aboubekeur Hamdi-Cherif, and Chafia Kara-Mohamed Copyright © 2015 Tuqyah Abdullah Al Qazlan et al. All rights reserved. Enhancing the Selection of Backoff Interval Using Fuzzy Logic over Wireless Ad Hoc Networks Mon, 23 Mar 2015 12:09:16 +0000 IEEE 802.11 is the de facto standard for medium access over wireless ad hoc networks. The collision avoidance mechanism (i.e., random binary exponential backoff, BEB) of the IEEE 802.11 DCF (distributed coordination function) is inefficient and unfair, especially under heavy load. In the literature, many algorithms have been proposed to tune the contention window (CW) size. However, these algorithms make every node select its backoff interval from [0, CW] uniformly at random. This randomness is incorporated to avoid collisions among the nodes, but a random backoff interval can change the optimal order and frequency of channel access among competing nodes, which results in unfairness and increased delay.
In this paper, we propose an algorithm that schedules medium access in a fair and effective manner. The algorithm enhances IEEE 802.11 DCF with an additional level of contention resolution that prioritizes contending nodes according to their queue length and waiting time. Each node computes its own backoff interval using fuzzy logic, based on input parameters collected from contending nodes through overhearing. We evaluate our algorithm against the IEEE 802.11 and GDCF (gentle distributed coordination function) protocols using the ns-2.35 simulator and show that it achieves good performance. Radha Ranganathan and Kathiravan Kannan Copyright © 2015 Radha Ranganathan and Kathiravan Kannan. All rights reserved. Kernel Method Based Human Model for Enhancing Interactive Evolutionary Optimization Mon, 23 Mar 2015 09:58:08 +0000 A fitness landscape presents the relationship between an individual and its reproductive success in evolutionary computation (EC). However, a discrete and approximate landscape in the original search space may not supply sufficient and accurate information for EC search, especially in interactive EC (IEC). The fitness landscape of human subjective evaluation in IEC is very difficult, if not impossible, to model, even with a hypothesis of what its definition might be. In this paper, we propose a method to establish a human model in a projected high-dimensional search space by kernel classification for enhancing IEC search. Because bivalent logic is the simplest perceptual paradigm, the human model is established on this principle. In the feature space, we design a linear classifier as a human model to obtain user preference knowledge that cannot be captured linearly in the original discrete search space. The human model established in this way predicts potential human perceptual knowledge. With the human model, we design an evolution control method to enhance IEC search.
Experimental evaluation with a pseudo-IEC user shows that our proposed model and method can enhance IEC search significantly. Yan Pei, Qiangfu Zhao, and Yong Liu Copyright © 2015 Yan Pei et al. All rights reserved. A Double Herd Krill Based Algorithm for Location Area Optimization in Mobile Wireless Cellular Network Mon, 23 Mar 2015 09:47:19 +0000 In wireless communication systems, mobility tracking deals with locating a mobile subscriber (MS) within the area serviced by the wireless network. Tracking a mobile subscriber is governed by two fundamental components, location updating (LU) and paging. This paper presents a novel hybrid method using a krill herd algorithm designed to optimize the location area (LA) within the available spectrum such that the total network cost, comprising the location update (LU) cost and the paging cost, is minimized without compromise. Based on various mobility patterns of users and the network architecture, the design of the LA is formulated as a combinatorial optimization problem. Numerical results indicate that the proposed model provides a more accurate update boundary in a real environment than that derived from a hexagonal cell configuration with a random walk movement pattern. The proposed model allows the network to maintain a better balance between the processing incurred by location updates and the radio bandwidth utilized for paging between call arrivals. F. Vincylloyd and B. Anand Copyright © 2015 F. Vincylloyd and B. Anand. All rights reserved. A Heuristic Ranking Approach on Capacity Benefit Margin Determination Using Pareto-Based Evolutionary Programming Technique Mon, 23 Mar 2015 09:43:02 +0000 This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment that takes into account the tie-line reliability of interconnected systems.
CBM is essential information used as a reference by load-serving entities (LSEs) to estimate a certain margin of transfer capability so that reliable access to generation through the interconnected system can be attained. A new Pareto-based evolutionary programming (EP) technique is used to determine CBM simultaneously for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is performed by referring to a heuristic ranking index that takes into account the system loss of load expectation (LOLE) under various conditions. Finally, the power-transfer-based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies is conducted on the modified IEEE-RTS79, and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is the flexibility offered to an independent system operator in selecting an appropriate CBM solution simultaneously for all areas. Muhammad Murtadha Othman, Nurulazmi Abd Rahman, Ismail Musirin, Mahmud Fotuhi-Firuzabad, and Abbas Rajabi-Ghahnavieh Copyright © 2015 Muhammad Murtadha Othman et al. All rights reserved. New Enhanced Artificial Bee Colony (JA-ABC5) Algorithm with Application for Reactive Power Optimization Mon, 23 Mar 2015 09:42:16 +0000 The standard artificial bee colony (ABC) algorithm involves exploration and exploitation processes that need to be balanced for enhanced performance. This paper proposes a new modified ABC algorithm, JA-ABC5, that enhances convergence speed and improves the ability to reach the global optimum by balancing exploration and exploitation. New steps have been introduced at the early stages of the algorithm to increase exploitation.
In addition, modified mutation equations have been introduced in the employed- and onlooker-bee phases to balance the two processes. The performance of JA-ABC5 has been analyzed on 27 commonly used benchmark functions and tested on the reactive power optimization problem. The results clearly show that the newly proposed algorithm outperforms the compared algorithms in terms of convergence speed and reaching the global optimum. Noorazliza Sulaiman, Junita Mohamad-Saleh, and Abdul Ghani Abro Copyright © 2015 Noorazliza Sulaiman et al. All rights reserved. Fast Adapting Ensemble: A New Algorithm for Mining Data Streams with Concept Drift Mon, 23 Mar 2015 07:28:03 +0000 The treatment of large data streams in the presence of concept drift is one of the main challenges in the field of data mining, particularly when the algorithms have to deal with concepts that disappear and then reappear. This paper presents a new algorithm, called Fast Adapting Ensemble (FAE), which adapts very quickly to both abrupt and gradual concept drifts and has been specifically designed to deal with recurring concepts. FAE processes the learning examples in blocks of the same size, but it does not have to wait for a batch to be complete in order to adapt its base classification mechanism. FAE incorporates a drift detector to improve the handling of abrupt concept drifts and stores a set of inactive classifiers that represent old concepts, which are activated very quickly when these concepts reappear. We compare our new algorithm with various well-known learning algorithms on common benchmark datasets. The experiments show promising results for the proposed algorithm (regarding accuracy and runtime) in handling different types of concept drift.
Agustín Ortíz Díaz, José del Campo-Ávila, Gonzalo Ramos-Jiménez, Isvani Frías Blanco, Yailé Caballero Mota, Antonio Mustelier Hechavarría, and Rafael Morales-Bueno Copyright © 2015 Agustín Ortíz Díaz et al. All rights reserved. Negative Correlation Learning for Customer Churn Prediction: A Comparison Study Mon, 23 Mar 2015 07:12:52 +0000 Recently, telecommunication companies have been paying more attention to the problem of identifying customer churn behavior. In business, it is well known among service providers that attracting new customers is much more expensive than retaining existing ones. Therefore, adopting accurate models that can predict customer churn effectively helps customer retention campaigns and maximizes profit. In this paper we utilize an ensemble of multilayer perceptrons (MLPs) trained with negative correlation learning (NCL) to predict customer churn in a telecommunication company. Experimental results confirm that the NCL-based MLP ensemble can achieve better generalization performance (higher churn detection rate) than an ensemble of MLPs without NCL (a flat ensemble) and other common data mining techniques used for churn analysis. Ali Rodan, Ayham Fayyoumi, Hossam Faris, Jamal Alsakran, and Omar Al-Kadi Copyright © 2015 Ali Rodan et al. All rights reserved. Unsupervised Spectral-Spatial Feature Selection-Based Camouflaged Object Detection Using VNIR Hyperspectral Camera Mon, 23 Mar 2015 06:18:05 +0000 The detection of camouflaged objects is important for industrial inspection, medical diagnoses, and military applications. Conventional supervised learning methods for hyperspectral images can be a feasible solution; such approaches, however, require a priori information about the camouflaged object and background. This letter proposes a fully autonomous feature selection and camouflaged object detection method based on the online analysis of spectral and spatial features.
A statistical distance metric generates candidate feature bands, and further analysis of the entropy-based spatial grouping property trims the useless feature bands. Camouflaged objects can be detected better, with less computational complexity, by optical spectral-spatial feature analysis. Sungho Kim Copyright © 2015 Sungho Kim. All rights reserved. A Novel Clustering Algorithm Inspired by Membrane Computing Sun, 22 Mar 2015 13:09:19 +0000 P systems are a class of distributed parallel computing models. This paper presents a novel clustering algorithm, inspired by the mechanism of a tissue-like P system with a loop structure of cells, called the membrane clustering algorithm. The objects of the cells express the candidate cluster centers and are evolved by the evolution rules. Based on the loop membrane structure, the communication rules realize a local neighborhood topology, which helps the coevolution of the objects and improves the diversity of objects in the system. The tissue-like P system can effectively search for the optimal partitioning with the help of its parallel computing advantage. The proposed clustering algorithm is evaluated on four artificial data sets and six real-life data sets. Experimental results show that the proposed clustering algorithm is superior or competitive to the k-means algorithm and several evolutionary clustering algorithms recently reported in the literature. Hong Peng, Xiaohui Luo, Zhisheng Gao, Jun Wang, and Zheng Pei Copyright © 2015 Hong Peng et al. All rights reserved. A Novel Psychovisual Threshold on Large DCT for Image Compression Sun, 22 Mar 2015 12:42:46 +0000 A psychovisual experiment prescribes the quantization values in image compression. The quantization process is used as a threshold of the human visual system's tolerance to reduce the number of encoded transform coefficients.
It is very challenging to generate an optimal quantization value based on the contribution of the transform coefficient at each frequency order. The psychovisual threshold represents the sensitivity of human visual perception at each frequency order to the image reconstruction. An ideal contribution of the transform at each frequency order forms the basis of the psychovisual threshold in image compression. This research study proposes a psychovisual threshold on the large discrete cosine transform (DCT) image block, which is used to automatically generate the much-needed quantization tables. The proposed psychovisual threshold prescribes the quantization values at each frequency order. The psychovisual threshold on the large image block provides a significant improvement in the quality of output images. The experimental results show that the large quantization tables derived from the psychovisual threshold produce output images that are largely free of artifacts. Moreover, the experimental results show that the psychovisual threshold concept produces better image quality at higher compression rates than JPEG image compression. Nur Azman Abu and Ferda Ernawan Copyright © 2015 Nur Azman Abu and Ferda Ernawan. All rights reserved. Chaos Time Series Prediction Based on Membrane Optimization Algorithms Sun, 22 Mar 2015 12:39:39 +0000 This paper puts forward a prediction model for chaos time series based on a membrane computing optimization algorithm; the model simultaneously optimizes the parameters of phase space reconstruction and of the least squares support vector machine (LS-SVM) using the membrane computing optimization algorithm. Accurately predicting the change trend of parameters in the electromagnetic environment is an important basis for spectrum management and can help decision makers adopt an optimal action.
The model presented in this paper is then used to forecast the band occupancy rate of the frequency modulation (FM) broadcasting band and the interphone band. To show the applicability and superiority of the proposed model, this paper compares it with conventional similar models. The experimental results show that, for both single-step and multistep prediction, the proposed model performs best on three error measures, namely, normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE). Meng Li, Liangzhong Yi, Zheng Pei, Zhisheng Gao, and Hong Peng Copyright © 2015 Meng Li et al. All rights reserved. Primary Path Reservation Using Enhanced Slot Assignment in TDMA for Session Admission Sun, 22 Mar 2015 12:39:33 +0000 A mobile ad hoc network (MANET) is a self-organized collection of nodes that communicate without any infrastructure. Providing quality of service (QoS) in such networks is a challenging task due to unreliable wireless links, mobility, lack of centralized coordination, and channel contention. The success of many real-time applications is based purely on QoS, which can be achieved by quality-aware routing (QAR) and admission control (AC). Recently proposed QoS mechanisms focus entirely on either reservation or admission control, which is not good enough. In MANETs, high mobility causes frequent path breaks, and each time a path breaks the source node must find a new route; in such cases the QoS session is affected. To admit a QoS session, admission control protocols must ensure the bandwidth of the relaying path before transmission starts; reserving such bandwidth noticeably improves admission control performance. Many TDMA-based reservation mechanisms have been proposed but need some improvement in their slot reservation procedures.
In order to overcome this specific issue, we propose a framework, PRAC (primary path reservation admission control protocol), which achieves improved QoS by making use of a backup route combined with resource reservation. A network topology has been simulated, and our approach proves to be a mechanism that admits sessions effectively. Suresh Koneri Chandrasekaran, Prakash Savarimuthu, Priya Andi Elumalai, and Kathirvel Ayyaswamy Copyright © 2015 Suresh Koneri Chandrasekaran et al. All rights reserved. A Novel Multiobjective Evolutionary Algorithm Based on Regression Analysis Sun, 22 Mar 2015 12:39:04 +0000 As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m − 1)-dimensional manifold in the decision space under some mild conditions. However, how to utilize this regularity to design multiobjective optimization algorithms has become a research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the optimization problem is modelled as a promising area in the decision space by a probability distribution, and the centroid of the probability distribution is an (m − 1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on nondominated sorting is used to choose the individuals for the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The results show that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA have also been identified and discussed in this paper.
Zhiming Song, Maocai Wang, Guangming Dai, and Massimiliano Vasile Copyright © 2015 Zhiming Song et al. All rights reserved. Integrating Reconfigurable Hardware-Based Grid for High Performance Computing Sun, 22 Mar 2015 12:34:54 +0000 FPGAs have shown several characteristics that make them very attractive for high performance computing (HPC). The impressive speed-up factors they are able to achieve, the reduced power consumption, and the ease and flexibility of the design process, with fast iterations between consecutive versions, are examples of the benefits obtained with their use. However, there are still some difficulties when using reconfigurable platforms as accelerators that need to be addressed: the need for an in-depth application study to identify potential acceleration, the lack of tools for the deployment of computational problems on distributed hardware platforms, and the low portability of components, among others. This work proposes a complete grid infrastructure for distributed high performance computing based on dynamically reconfigurable FPGAs. In addition, a set of services designed to facilitate application deployment is described. An example application and a comparison with other hardware and software implementations are shown. Experimental results show that the proposed architecture offers encouraging advantages for the deployment of high performance distributed applications, simplifying the development process. Julio Dondo Gazzano, Francisco Sanchez Molina, Fernando Rincon, and Juan Carlos López Copyright © 2015 Julio Dondo Gazzano et al. All rights reserved. An Approach to Model Based Testing of Multiagent Systems Sun, 22 Mar 2015 12:33:25 +0000 Autonomous agents perform on behalf of the user to achieve defined goals or objectives. They are situated in dynamic environments and are able to operate autonomously to achieve their goals. In a multiagent system, agents cooperate with each other to achieve a common goal.
Testing of multiagent systems is a challenging task due to the autonomous and proactive behavior of agents. However, testing is required to build confidence in the working of a multiagent system. The Prometheus methodology is a commonly used approach to design multiagent systems. Systematic and thorough testing of each interaction is necessary. This paper proposes a novel approach to testing of multiagent systems based on Prometheus design artifacts. In the proposed approach, different interactions between the agents and actors are considered to test the multiagent system. These interactions include percepts and actions, along with messages between the agents, which can be modeled in a protocol diagram. The protocol diagram is converted into a protocol graph, on which different coverage criteria are applied to generate test paths that cover interactions between the agents. A prototype tool has been developed to generate test paths from the protocol graph according to the specified coverage criterion. Shafiq Ur Rehman and Aamer Nadeem Copyright © 2015 Shafiq Ur Rehman and Aamer Nadeem. All rights reserved. Composition of Web Services Using Markov Decision Processes and Dynamic Programming Sun, 22 Mar 2015 12:28:44 +0000 We propose a Markov decision process model for solving the Web service composition (WSC) problem. Iterative policy evaluation, value iteration, and policy iteration algorithms are used to experimentally validate our approach, with artificial and real data. The experimental results show the reliability of the model and the methods employed, with policy iteration being the best one in terms of the minimum number of iterations needed to estimate an optimal policy, with the highest Quality of Service attributes.
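The policy iteration procedure named in the abstract above can be sketched on a toy MDP. The two-state example in the test is invented for illustration and is not the authors' WSC formulation.

```python
import numpy as np

def policy_iteration(P, R, gamma=0.9):
    """Policy iteration for a finite MDP.

    P[a]: (S, S) transition probability matrix for action a
    R[a]: (S,) expected immediate reward for action a in each state
    Returns an optimal deterministic policy and its state values.
    """
    n_actions, n_states = len(P), P[0].shape[0]
    policy = np.zeros(n_states, dtype=int)
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) V = R_pi exactly
        P_pi = np.array([P[policy[s]][s] for s in range(n_states)])
        R_pi = np.array([R[policy[s]][s] for s in range(n_states)])
        V = np.linalg.solve(np.eye(n_states) - gamma * P_pi, R_pi)
        # Policy improvement: greedy one-step lookahead
        Q = np.array([R[a] + gamma * P[a] @ V for a in range(n_actions)])
        new_policy = Q.argmax(axis=0)
        if np.array_equal(new_policy, policy):
            return policy, V
        policy = new_policy
```

In a WSC setting, states would encode partial compositions, actions the candidate services, and R their QoS attributes; the loop structure is unchanged.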
Our experimental work shows that a WSC problem involving a set of 100,000 individual Web services, where a valid composition requires the selection of 1,000 services from the available set, can be solved in the worst case in less than 200 seconds, using an Intel Core i5 computer with 6 GB of RAM. Moreover, a real WSC problem involving only 7 individual Web services requires less than 0.08 seconds, using the same computational power. Finally, a comparison with two popular reinforcement learning algorithms, Sarsa and Q-learning, shows that these algorithms require one to two orders of magnitude more time than policy iteration, iterative policy evaluation, and value iteration to handle WSC problems of the same complexity. Víctor Uc-Cetina, Francisco Moo-Mena, and Rafael Hernandez-Ucan Copyright © 2015 Víctor Uc-Cetina et al. All rights reserved. Ubiquitous Systems towards Green, Sustainable, and Secured Smart Environment Thu, 19 Mar 2015 08:09:36 +0000 Jong-Hyuk Park, Yi Pan, Han-Chieh Chao, and Neil Y. Yen Copyright © 2015 Jong-Hyuk Park et al. All rights reserved. Intelligent Topical Sentiment Analysis for the Classification of E-Learners and Their Topics of Interest Wed, 18 Mar 2015 10:22:36 +0000 Every day, huge numbers of instant tweets (messages) are published on Twitter, as it is one of the major social media platforms for e-learners' interactions. Various topics of interest to be studied are discussed among the learners and teachers through the capture of ideal sources on Twitter. The common sentiment behavior towards these topics is obtained through the massive number of instant messages about them. In this paper, rather than using the opinion polarity of each message relevant to the topic, the authors focus on sentence-level opinion classification using the unsupervised algorithm named bigram item response theory (BIRT). It differs from traditional classification and document-level classification algorithms.
The investigation illustrated in this paper is threefold: lexicon-based sentiment polarity of tweet messages; the bigram cooccurrence relationship using naïve Bayes; and the bigram item response theory (BIRT) on various topics. A model based on item response theory is constructed for topical classification inference. The performance improves remarkably with this bigram item response theory when compared with other supervised algorithms. The experiment has been conducted on a real-life dataset containing different sets of tweets and topics. M. Ravichandran, G. Kulanthaivel, and T. Chellatamilan Copyright © 2015 M. Ravichandran et al. All rights reserved. Conceptual Framework for the Mapping of Management Process with Information Technology in a Business Process Thu, 12 Mar 2015 12:24:04 +0000 This study on a component framework reveals the importance of management process and technology mapping in a business environment. We define ERP as a software tool that has to provide a business solution but not necessarily an integration of all the departments. Any business process can be classified as a management process, an operational process, or a supportive process. We went through the entire management process and were able to identify the influencing components to be mapped with a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need to map these components with the ERP is clearly explained. We also suggest that implementation of this framework might reduce ERP failures and, in particular, rectify ERP misfit. Vetrickarthick Rajarathinam, Swarnalatha Chellappa, and Asha Nagarajan Copyright © 2015 Vetrickarthick Rajarathinam et al. All rights reserved.
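The bigram naïve Bayes step mentioned in the sentiment-analysis abstract above can be sketched as follows. The tiny corpus, labels, and smoothing choice are invented for illustration and do not come from the paper.

```python
from collections import Counter
import math

def bigrams(text):
    toks = text.lower().split()
    return list(zip(toks, toks[1:]))

class BigramNB:
    """Naïve Bayes over bigram features with Laplace smoothing."""

    def fit(self, docs, labels):
        self.classes = set(labels)
        self.prior = Counter(labels)                      # class frequencies
        self.counts = {c: Counter() for c in self.classes}
        for doc, c in zip(docs, labels):
            self.counts[c].update(bigrams(doc))
        self.vocab = {b for c in self.classes for b in self.counts[c]}
        return self

    def predict(self, doc):
        def log_post(c):
            total = sum(self.counts[c].values())
            lp = math.log(self.prior[c] / sum(self.prior.values()))
            for b in bigrams(doc):
                # add-one smoothing over the bigram vocabulary
                lp += math.log((self.counts[c][b] + 1) /
                               (total + len(self.vocab) + 1))
            return lp
        return max(self.classes, key=log_post)
```

In the pipeline the abstract describes, such class posteriors would feed the BIRT stage rather than being the final output.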
Moving Object Detection for Video Surveillance Wed, 11 Mar 2015 08:57:08 +0000 Video surveillance is a promising solution for people living independently in their homes, and several contributions to video surveillance have recently been proposed. However, a robust video surveillance algorithm is still a challenging task because of illumination changes, rapid variations in target appearance, similar nontarget objects in the background, and occlusions. In this paper, a novel approach to object detection for video surveillance is presented. The proposed algorithm consists of various steps, including video compression, object detection, and object localization. In video compression, the input video frames are compressed with the help of the two-dimensional discrete cosine transform (2D DCT) to reduce storage requirements. In object detection, key feature points are detected by computing the statistical correlation, and the matching feature points are classified into foreground and background based on the Bayesian rule. Finally, the foreground feature points are localized in successive video frames by embedding the maximum likelihood feature points over the input video frames. Various frame-based surveillance metrics are employed to evaluate the proposed approach. Experimental results and a comparative study clearly depict the effectiveness of the proposed approach. K. Kalirajan and M. Sudha Copyright © 2015 K. Kalirajan and M. Sudha. All rights reserved. Mobility Based Key Management Technique for Multicast Security in Mobile Ad Hoc Networks Thu, 05 Mar 2015 09:28:11 +0000 In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques suffer from frequent mobility and disconnections.
So there is a need to design a multicast key management technique that overcomes these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANETs. Initially, the nodes are categorized according to their stability index, which is estimated based on link availability and mobility. A multicast tree is constructed such that for every weak node there is a strong parent node. A session-key-based encryption technique is utilized to transmit multicast data. The rekeying process is performed periodically by the initiator node. The rekeying interval is fixed depending on the node category, so this technique greatly minimizes the rekeying overhead. Simulation results show that our proposed approach reduces the packet drop rate and improves data confidentiality. B. Madhusudhanan, S. Chitra, and C. Rajan Copyright © 2015 B. Madhusudhanan et al. All rights reserved. Abstract Computation in Schizophrenia Detection through Artificial Neural Network Based Systems Thu, 05 Mar 2015 08:17:41 +0000 Schizophrenia stands for a long-lasting state of mental uncertainty that may break the relation among behavior, thought, and emotion; that is, it may lead to unreliable perception, unsuitable actions and feelings, and a sense of mental fragmentation. Indeed, its diagnosis is made over a long period of time; continuous signs of the disturbance must persist for at least six months. Once detected, the psychiatric diagnosis is made through a clinical interview and a series of psychic tests, aimed mainly at ruling out other mental states or diseases. Undeniably, the main problem in identifying schizophrenia is the difficulty of distinguishing its symptoms from those associated with other disorders or conditions.
Therefore, this work focuses on the development of a diagnostic support system, in terms of its knowledge representation and reasoning procedures, based on a blend of Logic Programming and Artificial Neural Network approaches to computing, taking advantage of a novel approach to knowledge representation and reasoning, which aims to solve the problems associated with the handling (i.e., representation and reasoning) of defective information. L. Cardoso, F. Marins, R. Magalhães, N. Marins, T. Oliveira, H. Vicente, A. Abelha, J. Machado, and J. Neves Copyright © 2015 L. Cardoso et al. All rights reserved. HT-Paxos: High Throughput State-Machine Replication Protocol for Large Clustered Data Centers Wed, 04 Mar 2015 13:47:51 +0000 Paxos is a prominent approach to state-machine replication. Recent data-intensive systems that implement state-machine replication generally require high throughput. Earlier versions of Paxos, a few of which are classical Paxos, fast Paxos, and generalized Paxos, focus mainly on fault tolerance and latency but are lacking in terms of throughput and scalability. A major reason for this is the heavyweight leader. By offloading the leader, we can further increase the throughput of the system. Ring Paxos, Multiring Paxos, and S-Paxos are a few prominent attempts in this direction for clustered data centers. In this paper, we propose HT-Paxos, a variant of Paxos that is best suited for any large clustered data center. HT-Paxos offloads the leader very significantly and hence increases the throughput and scalability of the system, while at the same time, among high throughput state-machine replication protocols, it provides reasonably low latency and response time. Vinit Kumar and Ajay Agarwal Copyright © 2015 Vinit Kumar and Ajay Agarwal. All rights reserved.
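For readers unfamiliar with the protocol family discussed in the HT-Paxos abstract above, a toy single-decree (classical) Paxos round can be sketched as follows. This illustrates only the baseline that HT-Paxos builds on, not HT-Paxos itself, and it omits networking and failures.

```python
class Acceptor:
    def __init__(self):
        self.promised = -1      # highest ballot number promised
        self.accepted = None    # (ballot, value) of last accepted proposal

    def prepare(self, n):
        if n > self.promised:
            self.promised = n
            return True, self.accepted   # promise, plus any prior accept
        return False, None

    def accept(self, n, v):
        if n >= self.promised:
            self.promised = n
            self.accepted = (n, v)
            return True
        return False

def propose(acceptors, n, value):
    """One proposer round with ballot n; returns the chosen value or None."""
    # Phase 1: gather promises from a majority
    promises = [a.prepare(n) for a in acceptors]
    granted = [acc for ok, acc in promises if ok]
    if len(granted) <= len(acceptors) // 2:
        return None
    # Adopt the value of the highest-ballot prior accept, if any
    prior = [acc for acc in granted if acc is not None]
    if prior:
        value = max(prior)[1]
    # Phase 2: ask acceptors to accept (n, value)
    votes = sum(a.accept(n, value) for a in acceptors)
    return value if votes > len(acceptors) // 2 else None
```

The key safety property visible even in this sketch is that a later proposer must adopt any value a majority may already have accepted, so a chosen value is never overwritten.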
A Novel Ontology Approach to Support Design for Reliability considering Environmental Effects Wed, 04 Mar 2015 13:35:10 +0000 Environmental effects are not sufficiently considered in product design, and reliability problems caused by environmental effects are very prominent. This paper proposes a method to apply an ontology approach in product design so that, during product reliability design and analysis, environmental-effects knowledge can be reused. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology is designed to describe environmental effects domain knowledge, and related concepts of environmental effects are formally defined using the ontology approach. This model can be applied to organize environmental effects knowledge in different environments. Finally, rubber seals used in a subhumid acid rain environment are taken as an example to illustrate the application of the ontological model to reliability design and analysis. Bo Sun, Yu Li, Tianyuan Ye, and Yi Ren Copyright © 2015 Bo Sun et al. All rights reserved. A Seed-Based Plant Propagation Algorithm: The Feeding Station Model Mon, 02 Mar 2015 09:24:06 +0000 The seasonal production of fruit and seeds is akin to opening a feeding station, such as a restaurant. Agents coming to feed on the fruit are like customers attending the restaurant; they arrive at a certain rate and get served at a certain rate following some appropriate processes. The same applies to birds and animals visiting and feeding on ripe fruit produced by plants such as the strawberry plant. This phenomenon underpins the seed dispersion of the plants. Modelling it as a queuing process results in a seed-based search/optimisation algorithm. This variant of the Plant Propagation Algorithm is described, analysed, tested on nontrivial problems, and compared with well-established algorithms. The results are included.
Muhammad Sulaiman and Abdellah Salhi Copyright © 2015 Muhammad Sulaiman and Abdellah Salhi. All rights reserved. Using Shadow Page Cache to Improve Isolated Drivers Performance Sat, 28 Feb 2015 10:47:59 +0000 Taking advantage of the reusability property of virtualization technology, users can reuse various types and versions of existing operating systems and drivers in a virtual machine so as to customize their application environment. In order to prevent users’ virtualization environments from being impacted by driver faults in the virtual machine, Chariot examines the correctness of a driver’s write operations by combining capture of the driver’s write operations with a driver-private access control table. However, this method needs to keep the write permission of the shadow page table as read-only, so as to capture an isolated driver’s write operations through page faults, which adversely affects the performance of the driver. Based on delaying the setting of frequently used shadow pages’ write permissions to read-only, this paper proposes an algorithm that uses a shadow page cache to improve the performance of isolated drivers and carefully studies the relationship between driver performance and the size of the shadow page cache. Experimental results show that, through the shadow page cache, the performance of isolated drivers can be greatly improved without impacting Chariot’s reliability too much. Hao Zheng, Xiaoshe Dong, Endong Wang, Baoke Chen, Zhengdong Zhu, and Chengzhe Liu Copyright © 2015 Hao Zheng et al. All rights reserved. An Incremental High-Utility Mining Algorithm with Transaction Insertion Wed, 25 Feb 2015 09:07:56 +0000 Association-rule mining is commonly used to discover useful and meaningful patterns from very large databases. It only considers the occurrence frequencies of items to reveal the relationships among itemsets.
Traditional association-rule mining is, however, not suitable for real-world applications, since the items purchased by a customer carry other factors, such as profit or quantity. High-utility mining was designed to overcome the limitations of association-rule mining by considering both the quantity and profit measures. Most high-utility mining algorithms are designed to handle static databases; few studies handle dynamic high-utility mining with transaction insertion, which otherwise requires database rescans and suffers from the combinatorial explosion of the pattern-growth mechanism. In this paper, an efficient incremental algorithm with transaction insertion is designed to reduce computation, without candidate generation, based on utility-list structures. The enumeration tree and the relationships between 2-itemsets are also adopted in the proposed algorithm to speed up the computations. Several experiments are conducted to show the performance of the proposed algorithm in terms of runtime, memory consumption, and number of generated patterns. Jerry Chun-Wei Lin, Wensheng Gan, Tzung-Pei Hong, and Binbin Zhang Copyright © 2015 Jerry Chun-Wei Lin et al. All rights reserved. An Energy-Efficient Cluster-Based Vehicle Detection on Road Network Using Intention Numeration Method Sun, 22 Feb 2015 06:51:46 +0000 Traffic in road networks is progressively increasing to a great extent. Good knowledge of network traffic can minimize congestion using information pertaining to the road network obtained with the aid of communal callers, pavement detectors, and so on. Using these methods, low-featured information is generated with respect to the user in the road network. Although the existing schemes obtain urban traffic information, they fail to calculate the energy drain rate of nodes and to find an equilibrium between the overhead and the quality of the routing protocol, which poses a great challenge.
Thus, an energy-efficient cluster-based vehicle detection method for road networks using the intention numeration method (CVDRN-IN) is developed. Initially, sensor nodes that detect a vehicle are grouped into separate clusters. Further, we approximate the node drain rate for a cluster using a polynomial regression function. In addition, the total node energy is estimated by taking the integral over the area. Finally, enhanced data aggregation is performed to reduce the amount of data transmission using a digital signature tree. The experimental performance is evaluated with the Dodgers loop sensor data set from the UCI repository, and the evaluation shows that the proposed approach outperforms existing work on energy consumption, clustering efficiency, and node drain rate. Deepa Devasenapathy and Kathiravan Kannan Copyright © 2015 Deepa Devasenapathy and Kathiravan Kannan. All rights reserved. Power, Control, and Optimization Thu, 19 Feb 2015 08:30:01 +0000 Pandian Vasant, Gerhard-Wilhelm Weber, Nader Barsoum, and Vo Ngoc Dieu Copyright © 2015 Pandian Vasant et al. All rights reserved. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing Thu, 12 Feb 2015 13:43:04 +0000 Cloud computing is a new delivery model for information technology services, and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns about how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage to third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches.
However, key management issues remain largely unsolved with these cryptography-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. S. Balasubramaniam and V. Kavitha Copyright © 2015 S. Balasubramaniam and V. Kavitha. All rights reserved. Ensemble Classifier for Epileptic Seizure Detection for Imperfect EEG Data Wed, 04 Feb 2015 14:33:09 +0000 Brain status information is captured by physiological electroencephalogram (EEG) signals, which are extensively used to study different brain activities. This study investigates the use of a new ensemble classifier to detect an epileptic seizure from compressed and noisy EEG signals. This noise-aware signal combination (NSC) ensemble classifier combines four classification models based on their individual performance. The main objective of the proposed classifier is to enhance the classification accuracy in the presence of noisy and incomplete information while preserving a reasonable amount of complexity. The experimental results show the effectiveness of the NSC technique, which yields higher accuracies of 90% for noiseless data compared with 85%, 85.9%, and 89.5% in other experiments. The accuracy for the proposed method is 80% when  dB, 84% when  dB, and 88% when  dB, while the compression ratio (CR) is 85.35% for all of the datasets mentioned. Khalid Abualsaud, Massudi Mahmuddin, Mohammad Saleh, and Amr Mohamed Copyright © 2015 Khalid Abualsaud et al. All rights reserved. Hybrid Swarm Intelligence Optimization Approach for Optimal Data Storage Position Identification in Wireless Sensor Networks Wed, 04 Feb 2015 11:17:43 +0000 Data storage and its growth have become a strategic concern in the world of networking.
Storage mainly depends on the sensor nodes, called producers, the base stations, and the consumers (users and sensor nodes) that retrieve and use the data. The main concern dealt with here is to find an optimal data storage position in wireless sensor networks. The works carried out earlier did not utilize swarm intelligence based optimization approaches to find the optimal data storage positions. To achieve this goal, an efficient swarm intelligence approach is used to choose suitable positions for a storage node. Thus, a hybrid particle swarm optimization algorithm is used to find suitable positions for storage nodes while minimizing the total energy cost of data transmission. Clustering-based distributed data storage is utilized, with the clustering problem solved using the fuzzy C-means algorithm. This research work also considers the data rates and locations of multiple producers and consumers to find optimal data storage positions. The algorithm is implemented in a network simulator, and the experimental results show that the proposed clustering and swarm intelligence based ODS strategy is more effective than earlier approaches. Ranganathan Mohanasundaram and Pappampalayam Sanmugam Periasamy Copyright © 2015 Ranganathan Mohanasundaram and Pappampalayam Sanmugam Periasamy. All rights reserved. Design and Implementation of Streaming Media Server Cluster Based on FFMpeg Tue, 03 Feb 2015 06:28:49 +0000 Poor performance and network congestion are commonly observed in single-server streaming media systems. This paper proposes a scheme to construct a streaming media server cluster system based on FFMpeg. In this scheme, different users are distributed to different servers according to their locations, and the balance among servers is maintained by a dynamic load-balancing algorithm based on active feedback. Furthermore, a service redirection algorithm is proposed to improve the transmission efficiency of streaming media data.
The experimental results show that the server cluster system significantly alleviates network congestion and improves performance in comparison with a single-server system. Hong Zhao, Chun-long Zhou, and Bao-zhao Jin Copyright © 2015 Hong Zhao et al. All rights reserved. CaLRS: A Critical-Aware Shared LLC Request Scheduling Algorithm on GPGPU Mon, 02 Feb 2015 11:15:59 +0000 Ultra-high thread-level parallelism in modern GPUs usually generates numerous memory requests simultaneously, so there are always many memory requests waiting at each bank of the shared LLC (L2 in this paper) and global memory. For global memory, various schedulers have already been developed to adjust the request sequence, but little work has focused on the service order at the shared LLC. We measured that many GPU applications queue at LLC banks for service, which provides an opportunity to optimize the service order at the LLC. By adjusting the GPU memory request service order, we can improve the schedulability of the SMs. We therefore propose a critical-aware shared LLC request scheduling algorithm (CaLRS) in this paper. The priority representation of a memory request is critical for CaLRS: we use the number of memory requests that originate from the same warp but have not yet been serviced when they arrive at the shared LLC bank to represent the criticality of each warp. Experiments show that the proposed scheme can effectively boost SM schedulability by promoting the scheduling priority of memory requests with high criticality, and thereby indirectly improves GPU performance. Jianliang Ma, Jinglei Meng, Tianzhou Chen, and Minghui Wu Copyright © 2015 Jianliang Ma et al. All rights reserved. Workflow Modelling and Analysis Based on the Construction of Task Models Thu, 29 Jan 2015 12:47:28 +0000 In this paper, we describe the structure of a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions.
To each task an input/output logic operator is associated. Furthermore, we associate a Boolean term with each transition present in the workflow. We then identify the structure of workflows and describe their dynamism through the construction of new task models. This construction is simple and intuitive, since it is based on the analysis of all tasks present in the workflow, which allows us to describe the dynamism of the workflow easily. We also introduce the concept of logical termination of workflows and provide conditions under which this property is valid. Finally, we provide a counterexample which shows that a conjecture presented in a previous article is false. Glória Cravo Copyright © 2015 Glória Cravo. All rights reserved. Constructing RBAC Based Security Model in u-Healthcare Service Platform Tue, 27 Jan 2015 08:57:03 +0000 In today’s aging society, people want to handle their personal health care by themselves in everyday life. In particular, the evolution of medical and IT convergence technology and mobile smart devices has made it possible for people to gather information on their health status anytime and anywhere easily using biometric information acquisition devices. Healthcare information systems can contribute to the improvement of the nation’s healthcare quality and the reduction of related costs. However, there are no perfect security models or mechanisms for healthcare service applications, and privacy information can therefore be leaked. In this paper, we examine security requirements related to privacy protection in u-healthcare services and propose an extended RBAC based security model. We propose and design a u-healthcare service integration platform (u-HCSIP) applying the RBAC security model.
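The workflow model above attaches an input/output logic operator to each task. A minimal sketch of the idea (our own encoding, restricted to AND/OR input operators and omitting the Boolean transition terms) is a fixed-point check of whether the final task eventually fires:

```python
def workflow_terminates(tasks, start, end):
    """tasks: {name: (logic, [predecessor names])}, where logic is 'AND' or 'OR'.
    A task fires once its input logic is satisfied by already-fired predecessors;
    returns True if `end` eventually fires starting from `start`."""
    fired = {start}
    changed = True
    while changed:
        changed = False
        for name, (logic, preds) in tasks.items():
            if name in fired or not preds:
                continue
            satisfied = all if logic == 'AND' else any
            if satisfied(p in fired for p in preds):
                fired.add(name)
                changed = True
    return end in fired
```

A cyclic dependency between tasks that both require each other (AND logic) never fires, which is the loose analogue of a workflow that does not logically terminate.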
The proposed u-HCSIP performs four main functions: storing and exchanging personal health records (PHR), recommending meals and exercise, buying/selling private health information or experience, and managing personal health data using smart devices. Moon Sun Shin, Heung Seok Jeon, Yong Wan Ju, Bum Ju Lee, and Seon-Phil Jeong Copyright © 2015 Moon Sun Shin et al. All rights reserved. Twin-Schnorr: A Security Upgrade for the Schnorr Identity-Based Identification Scheme Tue, 27 Jan 2015 07:58:09 +0000 Most identity-based identification (IBI) schemes proposed in recent literature are built using pairing operations. This decreases efficiency due to the high operation costs of pairings. Furthermore, most of these IBI schemes are proven to be secure against impersonation under active and concurrent attacks using interactive assumptions such as the one-more RSA inversion assumption or the one-more discrete logarithm assumption, translating to weaker security guarantees due to the interactive nature of these assumptions. The Schnorr-IBI scheme was first proposed through the Kurosawa-Heng transformation from the Schnorr signature. It remains one of the fastest yet most secure IBI schemes under impersonation against passive attacks due to its pairing-free design. However, when required to be secure against impersonators under active and concurrent attacks, it deteriorates greatly in terms of efficiency due to the protocol having to be repeated multiple times. In this paper, we upgrade the Schnorr-IBI scheme to be secure against impersonation under active and concurrent attacks using only the classical discrete logarithm assumption. This translates to a higher degree of security guarantee with only some minor increments in operational costs. Furthermore, because the scheme operates without pairings, it still retains its efficiency and superiority when compared to other pairing-based IBI schemes. 
Ji-Jian Chin, Syh-Yuan Tan, Swee-Huay Heng, and Raphael Chung-Wei Phan Copyright © 2015 Ji-Jian Chin et al. All rights reserved. Temporary Redundant Transmission Mechanism for SCTP Multihomed Hosts Sun, 18 Jan 2015 13:06:01 +0000 In SCTP’s Concurrent Multipath Transfer, if data is sent to the destined IP(s) without knowledge of the path conditions, packets may be lost or delayed. This is because of the bursty nature of IP traffic and physical damage to the network. To offset these problems, network path status is examined using our new mechanism, Multipath State Aware Concurrent Multipath Transfer using redundant transmission (MSACMT-RTv2). Here the status of multiple paths is analyzed initially and periodically thereafter. After examination, path priorities are assigned before transmission. One path is temporarily employed as a redundant path for the failure-expected path (FEP); this redundant path is used for transmitting redundant data. At the end of a predefined period, the reliability of the FEP is confirmed. If the FEP is found to be reliable, the temporary path is converted into a normal CMT path. The MSACMT-RTv2 algorithm is simulated using the Delaware University ns-2 SCTP/CMT module (ns-2; V2.29). We present and discuss MSACMT-RTv2 performance under asymmetric path delay and with finite receiver buffer (rbuf) size. We extended our experiments to test the robustness of this algorithm and obtained exhaustive results, which show that our algorithm outperforms the existing system by increasing throughput and reducing latency. D. Mohana Geetha, S. K. Muthusundar, M. Subramaniam, and Kathirvel Ayyaswamy Copyright © 2015 D. Mohana Geetha et al. All rights reserved. A Novel Cost Based Model for Energy Consumption in Cloud Computing Thu, 15 Jan 2015 14:03:22 +0000 Cloud data centers consume enormous amounts of electrical energy.
To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. In the proposed model, the cache interference costs were considered. These costs were based upon the size of the data. The proposed model was implemented in the CloudSim simulator and the related simulation results indicate that the energy consumption may be considerable and that it can vary with different parameters such as the quantum parameter, data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment. A. Horri and Gh. Dastghaibyfard Copyright © 2015 A. Horri and Gh. Dastghaibyfard. All rights reserved. Exploiting Semantic Annotations and Q-Learning for Constructing an Efficient Hierarchy/Graph Texts Organization Thu, 01 Jan 2015 09:34:27 +0000 Tremendous growth in the number of textual documents has produced a daily need for effective methods to explore, analyze, and discover knowledge from these textual documents. Conventional text mining and managing systems mainly use the presence or absence of key words to discover and analyze useful information from textual documents. However, simple word counts and frequency distributions of term appearances do not capture the meaning behind the words, which results in limiting the ability to mine the texts.
This paper proposes an efficient methodology for constructing a hierarchy/graph-based text organization and representation scheme based on semantic annotation and Q-learning. This methodology is based on semantic notions to represent the text in documents, to infer unknown dependencies and relationships among concepts in a text, to measure the relatedness between text documents, and to apply mining processes using the representation and the relatedness measure. The representation scheme reflects the existing relationships among concepts and facilitates accurate relatedness measurements that result in a better mining performance. An extensive experimental evaluation is conducted on real datasets from various domains, indicating the importance of the proposed approach. Asmaa M. El-Said, Ali I. Eldesoky, and Hesham A. Arafat Copyright © 2015 Asmaa M. El-Said et al. All rights reserved. Proposed Framework for the Evaluation of Standalone Corpora Processing Systems: An Application to Arabic Corpora Wed, 31 Dec 2014 07:32:48 +0000 Despite the accessibility of numerous online corpora, students and researchers engaged in the fields of Natural Language Processing (NLP), corpus linguistics, and language learning and teaching may encounter situations in which they need to develop their own corpora. Several commercial and free standalone corpora processing systems are available to process such corpora. In this study, we first propose a framework for the evaluation of standalone corpora processing systems and then use it to evaluate seven freely available systems. The proposed framework considers the usability, functionality, and performance of the evaluated systems while taking into consideration their suitability for Arabic corpora.
While the results show that most of the evaluated systems exhibited comparable usability scores, the scores for functionality and performance were substantially different with respect to support for the Arabic language and N-grams profile generation. The results of our evaluation will help potential users of the evaluated systems to choose the system that best meets their needs. More importantly, the results will help the developers of the evaluated systems to enhance them, and will provide developers of new corpora processing systems with a reference framework. Abdulmohsen Al-Thubaity, Hend Al-Khalifa, Reem Alqifari, and Manal Almazrua Copyright © 2014 Abdulmohsen Al-Thubaity et al. All rights reserved. Computational Intelligence and Metaheuristic Algorithms with Applications Wed, 31 Dec 2014 07:25:02 +0000 Xin-She Yang, Su Fong Chien, and Tiew On Ting Copyright © 2014 Xin-She Yang et al. All rights reserved. Erratum to “A Network and Visual Quality Aware N-Screen Content Recommender System Using Joint Matrix Factorization” Mon, 29 Dec 2014 00:10:37 +0000 Farman Ullah, Ghulam Sarwar, and Sungchang Lee Copyright © 2014 Farman Ullah et al. All rights reserved. Recent Advances on Internet of Things Mon, 22 Dec 2014 10:47:08 +0000 Xiaoxuan Meng, Jaime Lloret, Xudong Zhu, and Zhongmei Zhou Copyright © 2014 Xiaoxuan Meng et al. All rights reserved. Development of Robust Behaviour Recognition for an at-Home Biomonitoring Robot with Assistance of Subject Localization and Enhanced Visual Tracking Sun, 21 Dec 2014 09:48:47 +0000 Our research is focused on the development of an at-home health care biomonitoring mobile robot for people in need. The main task of the robot is to detect and track a designated subject while recognizing his/her activity for analysis and to provide warning in an emergency.
In order to push the system forward towards real application, in this study we tested the robustness of the robot system against several major environment changes, control parameter changes, and subject variation. First, an improved color tracker was analyzed to identify the limitations and constraints of the robot's visual tracking, considering suitable illumination values and tracking distance intervals. Then, regarding subject safety and continuous robot based subject tracking, various control parameters were tested on different layouts in a room. Finally, the main objective of the system is to recognize different walking patterns for further analysis. Therefore, we propose a fast, simple, and person-specific new activity recognition model by making full use of localization information, which is robust to partial occlusion. The proposed activity recognition algorithm was tested on different walking patterns with different subjects, and the results showed high recognition accuracy. Nevrez Imamoglu, Enrique Dorronzoro, Zhixuan Wei, Huangjun Shi, Masashi Sekine, José González, Dongyun Gu, Weidong Chen, and Wenwei Yu Copyright © 2014 Nevrez Imamoglu et al. All rights reserved. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT Sun, 21 Dec 2014 08:17:18 +0000 Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing.
In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, the Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. Mohamed M. Ibrahim, Neamat S. Abdel Kader, and M. Zorkany Copyright © 2014 Mohamed M. Ibrahim et al. All rights reserved. Robot Trajectories Comparison: A Statistical Approach Tue, 25 Nov 2014 13:02:43 +0000 The task of planning a collision-free trajectory from a start to a goal position is fundamental for an autonomous mobile robot. Although path planning has been extensively investigated since the beginning of robotics, there is no agreement on how to measure the performance of a motion algorithm. This paper presents a new approach to robot trajectory comparison that can be applied to any kind of trajectory and in both simulated and real environments. Given an initial set of features, it automatically selects the most significant ones and performs a statistical comparison using them. Additionally, a graphical data visualization named polygraph, which helps to better understand the obtained results, is provided. The proposed method has been applied, as an example, to compare two different motion planners, and WaveFront, using different environments, robots, and local planners. A. Ansuategui, A. Arruti, L. Susperregi, Y. Yurramendi, E. Jauregi, E. Lazkano, and B. Sierra Copyright © 2014 A. Ansuategui et al. All rights reserved.
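The video multiple watermarking technique above scrambles the watermark with the Arnold transform before embedding it in the DWT domain. A minimal sketch of the classical Arnold cat map on a square image (the map's period depends on the image size N; for the 4×4 test below it is 3, which is a property of the map, not a claim about the paper's parameters):

```python
import numpy as np

def arnold(img, n=1):
    """Apply the Arnold cat map n times to a square image:
    pixel (x, y) moves to ((x + y) mod N, (x + 2y) mod N)."""
    N = img.shape[0]
    out = img.copy()
    for _ in range(n):
        nxt = np.empty_like(out)
        for x in range(N):
            for y in range(N):
                nxt[(x + y) % N, (x + 2 * y) % N] = out[x, y]
        out = nxt
    return out
```

Because the map is a permutation with a finite period, decryption is simply applying the remaining iterations of the period (or the inverse map).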
Critical Product Features’ Identification Using an Opinion Analyzer Mon, 24 Nov 2014 00:00:00 +0000 The increasing use and ubiquity of the Internet facilitate dissemination of word-of-mouth through blogs, online forums, newsgroups, and consumers’ reviews. Online consumers’ reviews present tremendous opportunities and challenges for consumers and marketers. One of the challenges is to develop interactive marketing practices for making connections with target consumers that capitalize on consumer-to-consumer communications for generating product adoption. Opinion mining is employed in marketing to help consumers and enterprises in the analysis of online consumers’ reviews by highlighting the strengths and weaknesses of the products. This paper describes an opinion mining system based on novel review and feature ranking methods to empower consumers and enterprises to identify critical product features from enormous numbers of consumers’ reviews. The main target group for the proposed system is consumers and business analysts who want to explore consumers’ feedback to inform purchase decisions and enterprise strategies. We evaluate the proposed system on a real dataset. Results show that integration of review and feature-ranking methods improves the decision-making process significantly. Azra Shamim, Vimala Balakrishnan, Muhammad Tahir, and Muhammad Shiraz Copyright © 2014 Azra Shamim et al. All rights reserved. Development and Application of New Quality Model for Software Projects Sun, 16 Nov 2014 06:46:01 +0000 The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM).
The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. K. Karnavel and R. Dillibabu Copyright © 2014 K. Karnavel and R. Dillibabu. All rights reserved. A New Pixels Flipping Method for Huge Watermarking Capacity of the Invoice Font Image Wed, 12 Nov 2014 09:38:17 +0000 Invoice printing uses only two colors, so an invoice font image can be treated as a binary image. To embed watermarks into an invoice image, pixels need to be flipped; the larger the watermark, the more pixels must be flipped. We propose a new pixel flipping method for invoice images with large watermarking capacity. The pixel flipping method includes one novel interpolation method for binary images, one flippable pixels evaluation mechanism, and one denoising method based on gravity center and chaos degree. The proposed interpolation method ensures that the invoice image keeps its features well after scaling. The flippable pixels evaluation mechanism ensures that the pixels keep better connectivity and smoothness and that the pattern has the highest structural similarity after flipping. The proposed denoising method makes the invoice font image smoother and better suited to human vision. Experiments show that the proposed flipping method not only keeps the invoice font structure well but also improves watermarking capacity. Li Li, Qingzheng Hou, Jianfeng Lu, Qishuai Xu, Junping Dai, Xiaoyang Mao, and Chin-Chen Chang Copyright © 2014 Li Li et al. All rights reserved.
A Green Strategy for Federated and Heterogeneous Clouds with Communicating Workloads Tue, 11 Nov 2014 09:23:10 +0000 Providers of cloud environments must tackle the challenge of configuring their system to provide maximal performance while minimizing the cost of resources used. However, at the same time, they must guarantee an SLA (service-level agreement) to the users. The SLA is usually associated with a certain level of QoS (quality of service). As response time is perhaps the most widely used QoS metric, it was also the one chosen in this work. This paper presents a green strategy (GS) model for heterogeneous cloud systems. We provide a solution for heterogeneous job-communicating tasks and heterogeneous VMs that make up the nodes of the cloud. In addition to guaranteeing the SLA, the main goal is to optimize energy savings. The solution results in an equation that must be solved by a solver with nonlinear capabilities. The results obtained from modelling the policies to be executed by a solver demonstrate the applicability of our proposal for saving energy and guaranteeing the SLA. Jordi Mateo, Jordi Vilaplana, Lluis M. Plà, Josep Ll. Lérida, and Francesc Solsona Copyright © 2014 Jordi Mateo et al. All rights reserved. The Approach for Action Recognition Based on the Reconstructed Phase Spaces Mon, 10 Nov 2014 06:28:53 +0000 This paper presents a novel method of human action recognition, which is based on the reconstructed phase space. Firstly, the human body is divided into 15 key points, whose trajectories represent the human body behavior, and a modified particle filter is used to track these key points under self-occlusion. Secondly, we reconstruct the phase spaces for extracting more useful information from human action trajectories. Finally, we apply the semisupervised probability model and a Bayes classification method for classification.
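Phase-space reconstruction of a trajectory, as used by the action recognition method above, is commonly done by time-delay embedding; a minimal sketch (the dimension and delay values are illustrative, not the paper's):

```python
import numpy as np

def reconstruct_phase_space(series, dim=3, tau=2):
    """Time-delay embedding of a scalar series: row i of the result is
    [x_i, x_{i+tau}, ..., x_{i+(dim-1)*tau}], a point in the reconstructed space."""
    n = len(series) - (dim - 1) * tau
    return np.array([[series[i + j * tau] for j in range(dim)] for i in range(n)])
```

Each key-point coordinate series then becomes a cloud of points in a dim-dimensional space, from which geometric features of the motion can be extracted.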
Experiments are performed on the Weizmann, KTH, UCF sports, and our action dataset to test and evaluate the proposed method. The comparative experimental results show that the proposed method is more effective than the compared methods. Hong-bin Tu and Li-min Xia Copyright © 2014 Hong-bin Tu and Li-min Xia. All rights reserved. Integrating SOMs and a Bayesian Classifier for Segmenting Diseased Plants in Uncontrolled Environments Tue, 04 Nov 2014 13:43:28 +0000 This work presents a methodology that integrates a nonsupervised learning approach (self-organizing map (SOM)) and a supervised one (a Bayesian classifier) for segmenting diseased plants that grow in uncontrolled environments such as greenhouses, wherein the lack of control of illumination and presence of background bring about serious drawbacks. During the training phase two SOMs are used: one that creates color groups of images, which are classified into two groups using k-means and labeled as vegetation and nonvegetation by using rules, and a second SOM that corrects classification errors made by the first SOM. Two color histograms are generated from the two color classes and used to estimate the conditional probabilities of the Bayesian classifier. During the testing phase an input image is segmented by the Bayesian classifier and then it is converted into a binary image, wherein contours are extracted and analyzed to recover diseased areas that were incorrectly classified as nonvegetation. The experimental results using the proposed methodology showed better performance than two of the most used color index methods. Deny Lizbeth Hernández-Rabadán, Fernando Ramos-Quintana, and Julian Guerrero Juk Copyright © 2014 Deny Lizbeth Hernández-Rabadán et al. All rights reserved.
Collaborative and Multilingual Approach to Learn Database Topics Using Concept Maps Mon, 03 Nov 2014 09:04:40 +0000 The authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the learning of multilingual technical terminology by means of the collaborative drawing of a concept map is also pursued in this experiment. The main characteristics of a study carried out in the database subject at the University of the Basque Country during the 2011/2012 academic year are described. This study contributes to the field of concept mapping, as these kinds of cognitive tools have proved to be valid to support learning in computer engineering education. It contributes to the field of computer engineering education, providing a technique that can be incorporated with several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above-mentioned objectives. Ana Arruarte, Iñaki Calvo, Jon A. Elorriaga, Mikel Larrañaga, and Angel Conde Copyright © 2014 Ana Arruarte et al. All rights reserved. An Evolved Wavelet Library Based on Genetic Algorithm Mon, 27 Oct 2014 11:55:06 +0000 As the size of the images being captured increases, there is a need for a robust image compression algorithm that satisfies the bandwidth limitations of the transmission channels and preserves the image resolution without considerable loss in the image quality. Many conventional image compression algorithms use the wavelet transform, which can significantly reduce the number of bits needed to represent a pixel; the process of quantization and thresholding further increases the compression. In this paper the authors evolve two sets of wavelet filter coefficients using a genetic algorithm (GA), one for the whole image portion except the edge areas and the other for the portions near the edges in the image (i.e., global and local filters).
Images are initially separated into several groups based on their frequency content, edges, and textures, and the wavelet filter coefficients are evolved separately for each group. As there is a possibility of the GA settling in a local maximum, we introduce a new shuffling operator to prevent the GA from this effect. The GA used to evolve filter coefficients primarily focuses on maximizing the peak signal to noise ratio (PSNR). The filter coefficients evolved by the proposed method outperform the existing methods by a 0.31 dB improvement in the average PSNR and a 0.39 dB improvement in the maximum PSNR. D. Vaithiyanathan, R. Seshasayanan, K. Kunaraj, and J. Keerthiga Copyright © 2014 D. Vaithiyanathan et al. All rights reserved. Cognitive Inference Device for Activity Supervision in the Elderly Mon, 27 Oct 2014 11:16:37 +0000 Human activity, life span, and quality of life are enhanced by innovations in science and technology. Aging individuals need to take advantage of these developments to lead a self-regulated life. However, maintaining a self-regulated life at old age involves a high degree of risk, and the elderly often fail at this goal. Thus, the objective of our study is to investigate the feasibility of implementing a cognitive inference device (CI-device) for effective activity supervision in the elderly. To frame the CI-device, we propose a device design framework along with an inference algorithm and implement the designs through an artificial neural model with different configurations, mapping the CI-device’s functions to minimise the device’s prediction error. An analysis and discussion are then provided to validate the feasibility of CI-device implementation for activity supervision in the elderly. Nilamadhab Mishra, Chung-Chih Lin, and Hsien-Tsung Chang Copyright © 2014 Nilamadhab Mishra et al. All rights reserved.
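The GA in the wavelet-library work above uses PSNR as its fitness function. A standard PSNR helper (not the authors' code) suitable for such a fitness evaluation:

```python
import numpy as np

def psnr(original, processed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images of equal shape."""
    mse = np.mean((np.asarray(original, float) - np.asarray(processed, float)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```

In a GA loop, each candidate filter set would be scored by compressing and reconstructing a training image and taking psnr(original, reconstruction) as the fitness to maximize.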
Effects of Corporate Social Responsibility and Governance on Its Credit Ratings Mon, 27 Oct 2014 07:17:21 +0000 This study reviews the impact of corporate social responsibility (CSR) and corporate governance on corporate credit ratings. Regression analysis of credit ratings against the relevant primary independent variables shows that both factors have significant effects. As we predicted, both regression coefficients are positive, indicating that corporations with excellent CSR and governance index (CGI) scores have higher credit ratings and vice versa. The results show that nonfinancial information may also affect corporate credit ratings. Investment in personal data protection could be an example of CSR/CGI activities which have positive effects on corporate credit ratings. Dong-young Kim and JeongYeon Kim Copyright © 2014 Dong-young Kim and JeongYeon Kim. All rights reserved. Based on Regular Expression Matching of Evaluation of the Task Performance in WSN: A Queue Theory Approach Thu, 23 Oct 2014 13:16:45 +0000 Due to the limited resources of wireless sensor networks, the low efficiency of real-time communication scheduling, poor security, and so forth, a queuing performance evaluation approach based on regular expression matching is proposed, which consists of a matching preprocessing phase, a validation phase, and a queuing-model performance evaluation phase. Firstly, the subset of related sequences is generated in the preprocessing phase to guide distributed matching in the validation phase. Secondly, in the validation phase, feature subsets are clustered and a compressed matching table is built, making distributed parallel matching more convenient. Finally, based on the queuing model, the dynamic task scheduling performance of the sensor network is evaluated.
Experiments show that our approach ensures accurate matching and computational efficiency of more than 70%; it not only effectively detects data packets and access control, but also uses a queuing method to determine the parameters of task scheduling in wireless sensor networks. The method has good applicability to medium- and large-scale distributed wireless nodes. Jie Wang, Kai Cui, Kuanjiu Zhou, and Yanshuo Yu Copyright © 2014 Jie Wang et al. All rights reserved. A Novel N-Input Voting Algorithm for X-by-Wire Fault-Tolerant Systems Sun, 19 Oct 2014 00:00:00 +0000 Voting is an important operation in the multichannel computation paradigm and in the realization of ultrareliable and real-time control systems, arbitrating among the results of N redundant variants. These systems include N-modular redundant (NMR) hardware systems and diversely designed software systems based on N-version programming (NVP). Depending on the characteristics of the application and the type of selected voter, the voting algorithms can be implemented for either hardware or software systems. In this paper, a novel voting algorithm is introduced for real-time fault-tolerant control systems, appropriate for applications in which N is large. Then, its behavior was implemented in software under different scenarios of error injection on the system inputs. The results of evaluations through plots and statistical computations demonstrate that this novel algorithm does not have the limitations of some popular voting algorithms such as median and weighted; moreover, it is able to significantly increase the reliability and availability of the system, in the best case by 2489.7% and 626.74%, respectively, and in the worst case by 3.84% and 1.55%, respectively. Abbas Karimi, Faraneh Zarafshan, S. A. R. Al-Haddad, and Abdul Rahman Ramli Copyright © 2014 Abbas Karimi et al. All rights reserved.
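The voting abstract above compares its novel algorithm against popular voters such as the median voter. A minimal reference implementation of that classical baseline (not the paper's proposed algorithm):

```python
def median_voter(values):
    """Classical median voter over N redundant inputs: for odd N it tolerates
    up to (N - 1) // 2 arbitrarily wrong values, since the median stays
    within the range of the correct inputs."""
    s = sorted(values)
    n = len(s)
    # odd N: middle element; even N: mean of the two middle elements
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2.0
```

For example, one wildly faulty channel among three cannot pull the voted output away from the two agreeing channels.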
Proactive Supply Chain Performance Management with Predictive Analytics Wed, 15 Oct 2014 09:53:58 +0000 Today’s business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on a specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment. Nenad Stefanovic Copyright © 2014 Nenad Stefanovic. All rights reserved. Medical Applications of Microwave Imaging Tue, 14 Oct 2014 07:16:04 +0000 Ultrawide band (UWB) microwave imaging is a promising method for the detection of early-stage breast cancer, based on the large contrast in electrical parameters between malignant tumour tissue and the surrounding normal breast tissue.
In this paper, the detection and imaging of a malignant tumour are performed through a tomography-based microwave system and signal processing. Simulations of the proposed system are performed and image post-processing is presented. Signal processing involves the extraction of tumour information from background information and then image reconstruction through the confocal delay-and-sum algorithm. Finally, time-delay refinement and the superposition of additional tumour signals are applied to improve accuracy. Zhao Wang, Eng Gee Lim, Yujun Tang, and Mark Leach Copyright © 2014 Zhao Wang et al. All rights reserved. Trust-Based Access Control Model from Sociological Approach in Dynamic Online Social Network Environment Mon, 13 Oct 2014 14:06:05 +0000 There has been an explosive increase in the population of the OSN (online social network) in recent years. The OSN provides users with many opportunities to communicate among friends and family. Further, it facilitates developing new relationships with previously unknown people having similar beliefs or interests. However, the OSN can expose users to adverse effects such as privacy breaches, the disclosing of uncontrolled material, and the disseminating of false information. Traditional access control models such as MAC, DAC, and RBAC are applied to the OSN to address these problems. However, these models are not suitable for the dynamic OSN environment because user behavior in the OSN is unpredictable and static access control imposes a burden on the users to change the access control rules individually. We propose a dynamic trust-based access control for the OSN to address the problems of traditional static access control. Moreover, we provide novel criteria, based on a sociological approach, to evaluate trust factors, together with a method to calculate dynamic trust values.
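The microwave imaging abstract above reconstructs images with confocal delay-and-sum. A toy sketch of the core idea (computing the per-channel round-trip delays from the antenna geometry is omitted; here delays are simply given in samples):

```python
import numpy as np

def delay_and_sum(signals, delays):
    """Confocal delay-and-sum focusing: shift each received channel by its
    round-trip delay (in samples) for a candidate focal point, then sum.
    Echoes from that focal point align and add coherently; clutter does not."""
    length = min(len(s) - d for s, d in zip(signals, delays))
    out = np.zeros(length)
    for s, d in zip(signals, delays):
        out += np.asarray(s, dtype=float)[d:d + length]
    return out
```

Scanning the candidate focal point over the breast volume and recording the summed energy at each point yields the tumour image.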
The proposed method can monitor negative behavior and modify access permission levels dynamically to prevent the indiscriminate disclosure of information. Seungsoo Baek and Seungjoo Kim Copyright © 2014 Seungsoo Baek and Seungjoo Kim. All rights reserved. Cooperation-Controlled Learning for Explicit Class Structure in Self-Organizing Maps Thu, 18 Sep 2014 00:00:00 +0000 We attempt to demonstrate the effectiveness of multiple points of view toward neural networks. By restricting ourselves to two points of view of a neuron, we propose a new type of information-theoretic method called “cooperation-controlled learning.” In this method, individual and collective neurons are distinguished from one another, and we suppose that the characteristics of individual and collective neurons are different. To implement individual and collective neurons, we prepare two networks, namely, cooperative and uncooperative networks. The roles of these networks and the roles of individual and collective neurons are controlled by the cooperation parameter. As the parameter is increased, the role of cooperative networks becomes more important in learning, and the characteristics of collective neurons become more dominant. On the other hand, when the parameter is small, individual neurons play a more important role. We applied the method to the automobile and housing data from the machine learning database and examined whether explicit class boundaries could be obtained. Experimental results showed that cooperation-controlled learning, in particular when taking into account information on input units, could be used to produce a clearer class structure than conventional self-organizing maps. Ryotaro Kamimura Copyright © 2014 Ryotaro Kamimura. All rights reserved. Intelligent Bar Chart Plagiarism Detection in Documents Wed, 17 Sep 2014 12:13:27 +0000 This paper presents a novel feature-mining approach for documents that cannot be mined via optical character recognition (OCR).
By identifying the intimate relationship between the text and graphical components, the proposed technique pulls out the Start, End, and Exact values for each bar. Furthermore, the word 2-gram and Euclidean distance methods are used to accurately detect and determine plagiarism in bar charts. Mohammed Mumtaz Al-Dabbagh, Naomie Salim, Amjad Rehman, Mohammed Hazim Alkawaz, Tanzila Saba, Mznah Al-Rodhaan, and Abdullah Al-Dhelaan Copyright © 2014 Mohammed Mumtaz Al-Dabbagh et al. All rights reserved. A Three-Step Approach with Adaptive Additive Magnitude Selection for the Sharpening of Images Tue, 16 Sep 2014 08:45:26 +0000 Aiming to find the additive magnitude automatically and adaptively, we propose a three-step, model-based approach for the sharpening of images in this paper. In the first pass, a Grey prediction model is applied to find a global maximal additive magnitude so that oversharpening of the images to be sharpened can be avoided. During the second pass, edge pixels are picked out with our previously proposed edge detection mechanism. In this pass, a low-pass filter is also applied so that isolated pixels will not be regarded as being near an edge. In the final pass, those pixels detected as being near an edge are adjusted adaptively based on the local statistics, and nonedge pixels are kept unaltered. Extensive experiments on natural as well as medical images, with subjective and objective evaluations, are given to demonstrate the usefulness of the proposed approach. Lih-Jen Kau and Tien-Lin Lee Copyright © 2014 Lih-Jen Kau and Tien-Lin Lee. All rights reserved. Adaptive Cuckoo Search Algorithm for Unconstrained Optimization Sun, 14 Sep 2014 06:00:44 +0000 Modification of the intensification and diversification approaches in the recently developed cuckoo search algorithm (CSA) is performed.
The alteration involves the implementation of an adaptive step-size adjustment strategy, thus enabling faster convergence to the global optimal solutions. The feasibility of the proposed algorithm is validated against benchmark optimization functions, where the obtained results demonstrate a marked improvement over the standard CSA in all cases. Pauline Ong Copyright © 2014 Pauline Ong. All rights reserved. A New Sensors-Based Covert Channel on Android Sun, 14 Sep 2014 00:00:00 +0000 Covert channels are not new in computing systems and have been studied since their first definition four decades ago. New platforms invoke thorough investigations to assess their security. Now is the time for the Android platform to analyze its security model, in particular the two key principles: process isolation and the permissions system. Aside from all sorts of malware, one threat has proved intractable by current protection solutions, namely, collusion attacks involving two applications communicating over covert channels. Still, no universal solution can counter this sort of attack unless the covert channels are known. This paper is an attempt to reveal a new covert channel, not only being specific to smartphones, but also exploiting an unusual resource as a vehicle to carry covert information: sensor data. Accelerometers generate signals that reflect user motions, and malware applications can apparently only read their data. However, if the vibration motor on the device is used properly, programmatically produced vibration patterns can encode stolen data, and hence an application can cause discernible effects on acceleration data to be received and decoded by another application. Our evaluations confirmed a real threat where strings of tens of characters could be transmitted without error if the throughput is reduced to around 2.5–5 bps.
The proposed covert channel is very stealthy, as no unusual permissions are required and there is no explicit communication between the colluding applications. Ahmed Al-Haiqi, Mahamod Ismail, and Rosdiadee Nordin Copyright © 2014 Ahmed Al-Haiqi et al. All rights reserved. Improving RLRN Image Splicing Detection with the Use of PCA and Kernel PCA Sun, 14 Sep 2014 00:00:00 +0000 Digital image forgery is becoming easier to perform because of the rapid development of various manipulation tools. Image splicing is one of the most prevalent techniques. Digital images have lost their trustworthiness, and researchers have exerted considerable effort to regain it, focusing mostly on algorithms. However, most of the proposed algorithms are incapable of handling high dimensionality and redundancy in the extracted features. Moreover, existing algorithms are limited by high computational time. This study focuses on improving one of the image splicing detection algorithms, the run length run number (RLRN) algorithm, by applying two dimension reduction methods, namely, principal component analysis (PCA) and kernel PCA. A support vector machine is used to distinguish between authentic and spliced images. Results show that kernel PCA, a nonlinear dimension reduction method, has the best effect on the R, G, B, and Y channels and on gray-scale images. Zahra Moghaddasi, Hamid A. Jalab, Rafidah Md Noor, and Saeed Aghabozorgi Copyright © 2014 Zahra Moghaddasi et al. All rights reserved. Heuristic Evaluation on Mobile Interfaces: A New Checklist Thu, 11 Sep 2014 12:08:19 +0000 The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met.
Heuristic evaluation (HE), an inspection method in which experts evaluate a real system or prototype, relies on checklists that are desktop-centred and do not adequately detect mobile-specific usability issues. In this paper, we propose a compilation of heuristic evaluation checklists taken from the existing bibliography but readapted to new mobile interfaces. Selecting and rearranging these heuristic guidelines offers a tool which works well not just for evaluation but also as a best-practices checklist. The result is a comprehensive checklist which is experimentally evaluated as a design tool. This experimental evaluation involved two software engineers without any specific knowledge about usability and a group of ten users, who compared the usability of a first prototype designed without our heuristics with that of a second one designed after applying the proposed checklist. The results of this experiment show the usefulness of the proposed checklist for avoiding usability gaps, even with untrained developers. Rosa Yáñez Gómez, Daniel Cascado Caballero, and José-Luis Sevillano Copyright © 2014 Rosa Yáñez Gómez et al. All rights reserved. A Model Independent S/W Framework for Search-Based Software Testing Thu, 11 Sep 2014 11:50:34 +0000 In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model is changed from one to another, all functions of a search technique must be reimplemented, because the types of models are different even if the same search technique is applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce such redundant work. The framework provides a reusable common software platform to reduce time and effort.
The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model. Jungsup Oh, Jongmoon Baik, and Sung-Hwa Lim Copyright © 2014 Jungsup Oh et al. All rights reserved. Generalized Synchronization with Uncertain Parameters of Nonlinear Dynamic System via Adaptive Control Thu, 11 Sep 2014 11:00:39 +0000 An adaptive control scheme is developed to study the generalized adaptive chaos synchronization behavior, with uncertain chaotic parameters, between two identical chaotic dynamic systems. The generalized adaptive chaos synchronization controller is designed based on Lyapunov stability theory, and an analytic expression of the adaptive controller with its update laws for the uncertain chaotic parameters is given. Three examples of generalized adaptive synchronization with uncertain parameters between two identical new Lorenz-Stenflo systems are used to show the effectiveness of the proposed method. Numerical simulations are shown to verify the results. Cheng-Hsiung Yang and Cheng-Lin Wu Copyright © 2014 Cheng-Hsiung Yang and Cheng-Lin Wu. All rights reserved. Improving Vision-Based Motor Rehabilitation Interactive Systems for Users with Disabilities Using Mirror Feedback Thu, 11 Sep 2014 09:05:37 +0000 Observation is recommended in motor rehabilitation. For this reason, the aim of this study was to experimentally test the feasibility and benefit of including mirror feedback in vision-based rehabilitation systems: we projected the user onto the screen. We conducted a user study by using a previously evaluated system that improved the balance and postural control of adults with cerebral palsy.
We used a within-subjects design with the two defined feedback conditions (mirror and no-mirror) and two different groups of users (8 with disabilities and 32 without disabilities), using usability measures (time-to-start and time-to-complete). A two-tailed paired-samples t-test confirmed that, in the case of disabilities, mirror feedback facilitated the interaction in vision-based systems for rehabilitation. The measured times were significantly worse in the absence of the user’s own visual feedback. In vision-based interaction systems, the input device is the user’s own body; therefore, it makes sense that feedback should be related to the body of the user. In the case of disabilities, the mirror feedback mechanisms facilitated the interaction in vision-based systems for rehabilitation. The results recommend that developers and researchers use this improvement in vision-based motor rehabilitation interactive systems. Antoni Jaume-i-Capó, Pau Martínez-Bueso, Biel Moyà-Alcover, and Javier Varona Copyright © 2014 Antoni Jaume-i-Capó et al. All rights reserved. Performance Evaluation of the Machine Learning Algorithms Used in Inference Mechanism of a Medical Decision Support System Thu, 11 Sep 2014 07:16:52 +0000 Decision support systems increasingly support the decision-making process in cases of uncertainty and lack of information, and they are widely used in various fields such as engineering, finance, and medicine. Medical decision support systems help healthcare personnel to select the optimal method during the treatment of patients. Decision support systems are intelligent software systems that support decision makers in their decisions. The design of decision support systems consists of four main subjects, called the inference mechanism, knowledge base, explanation module, and active memory. The inference mechanism constitutes the basis of decision support systems.
Various methods can be used in these inference mechanisms, including decision trees, artificial neural networks, statistical methods, and rule-based methods. In decision support systems, those methods can be used separately or combined into a hybrid system. In this study, synthetic data with 10, 100, 1000, and 2000 records have been produced to reflect the probabilities on the ALARM network. The accuracy of 11 machine learning methods for the inference mechanism of a medical decision support system is compared on the various data sets. Mert Bal, M. Fatih Amasyali, Hayri Sever, Guven Kose, and Ayse Demirhan Copyright © 2014 Mert Bal et al. All rights reserved. Secure Cooperative Spectrum Sensing for the Cognitive Radio Network Using Nonuniform Reliability Thu, 11 Sep 2014 06:02:00 +0000 Both reliable detection of the primary signal in a noisy and fading environment and nullifying the effect of unauthorized users are important tasks in cognitive radio networks. To address these issues, we consider a cooperative spectrum sensing approach where each user is assigned a nonuniform reliability based on its sensing performance. Users with a poor channel or a faulty sensor are assigned low reliability. The nonuniform reliabilities serve as identification tags and are used to isolate users with malicious behavior. We consider a link layer attack similar to the Byzantine attack, which falsifies the spectrum sensing data. Three different strategies are presented in this paper to ignore unreliable and malicious users in the network. Considering only reliable users for the global decision improves sensing time and decreases collisions in the control channel. The fusion center uses the degree of reliability as a weighting factor to determine the global decision in scheme I. Schemes II and III consider the unreliability of users, which makes the computations even simpler.
The proposed schemes reduce the number of sensing reports and increase the inference accuracy. The advantages of our proposed schemes over conventional cooperative spectrum sensing and the Chair-Varshney optimum rule are demonstrated through simulations. Muhammad Usman and Insoo Koo Copyright © 2014 Muhammad Usman and Insoo Koo. All rights reserved. A Hybrid Approach of Stepwise Regression, Logistic Regression, Support Vector Machine, and Decision Tree for Forecasting Fraudulent Financial Statements Thu, 11 Sep 2014 05:47:34 +0000 As fraudulent financial statements of enterprises become more serious with each passing day, establishing a valid model for forecasting fraudulent financial statements has become an important question for academic research and financial practice. After screening the important variables using stepwise regression, the study applies logistic regression, support vector machine, and decision tree methods to construct classification models for comparison. The study adopts financial and nonfinancial variables to assist in the establishment of the forecasting model. The research objects are companies that issued fraudulent or nonfraudulent financial statements between 1998 and 2012. The findings are that financial and nonfinancial information can be used effectively to distinguish fraudulent financial statements, and the C5.0 decision tree has the best classification accuracy, 85.71%. Suduan Chen, Yeong-Jia James Goo, and Zone-De Shen Copyright © 2014 Suduan Chen et al. All rights reserved. FraudMiner: A Novel Credit Card Fraud Detection Model Based on Frequent Itemset Mining Thu, 11 Sep 2014 05:47:10 +0000 This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets.
The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) an incoming transaction of a particular customer is closer, and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes, and each attribute is considered equally in finding the patterns. The performance evaluation of the proposed model is done on the UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced), and it is found that the proposed model has a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate, compared with other state-of-the-art classifiers. K. R. Seeja and Masoumeh Zareapoor Copyright © 2014 K. R. Seeja and Masoumeh Zareapoor. All rights reserved. The Assignment of Scores Procedure for Ordinal Categorical Data Thu, 11 Sep 2014 00:00:00 +0000 Ordinal data are the most frequently encountered type of data in the social sciences. Many statistical methods can be used to process such data. One common method is to assign scores to the data, convert them into interval data, and then perform further statistical analysis. Several authors have recently developed methods for assigning scores to ordered categorical data. This paper proposes an approach that defines an assigned-score system for an ordinal categorical variable based on an underlying continuous latent distribution, with the interpretation illustrated by three case study examples. The results show that the proposed score system works well for skewed ordinal categorical data. Han-Ching Chen and Nae-Sheng Wang Copyright © 2014 Han-Ching Chen and Nae-Sheng Wang. All rights reserved.
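To make the latent-distribution scoring idea in the preceding abstract concrete, here is a minimal sketch of one common scheme (our own illustration, not the authors' procedure; the function name and the choice of a standard normal latent distribution are assumptions): each ordered category is mapped to the standard-normal quantile at the midpoint of its cumulative-proportion interval.

```python
from statistics import NormalDist

def latent_normal_scores(counts):
    """Assign scores to ordered categories as standard-normal quantiles
    taken at the midpoint of each category's cumulative-probability
    interval. A common latent-variable scoring scheme, used here only
    as an illustration of the general idea."""
    n = sum(counts)
    nd = NormalDist()
    scores, cum = [], 0
    for c in counts:
        mid = (cum + c / 2) / n   # midpoint of this category's interval
        scores.append(nd.inv_cdf(mid))
        cum += c
    return scores

# Example: a right-skewed 4-category variable
print(latent_normal_scores([50, 30, 15, 5]))
```

For skewed data such as the example above, the resulting scores are unevenly spaced, which is exactly what distinguishes this approach from naive equally spaced integer scoring.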
PhysioDroid: Combining Wearable Health Sensors and Mobile Devices for a Ubiquitous, Continuous, and Personal Monitoring Wed, 10 Sep 2014 17:15:25 +0000 Technological advances in the development of mobile devices, medical sensors, and wireless communication systems support a new generation of unobtrusive, portable, and ubiquitous health monitoring systems for continuous patient assessment and more personalized health care. There is a growing number of mobile apps in the health domain; however, little work has so far been specifically devoted to operating this kind of app with wearable physiological sensors. PhysioDroid, presented in this paper, provides a personalized means to remotely monitor and evaluate users’ conditions. The PhysioDroid system provides ubiquitous and continuous vital-sign analysis, including the electrocardiogram, heart rate, respiration rate, skin temperature, and body motion, intended to help empower patients and improve clinical understanding. PhysioDroid is composed of a wearable monitoring device and an Android app providing gathering, storage, and processing features for the physiological sensor data. The versatility of the developed app allows its use by both average users and specialists, and the reduced cost of PhysioDroid puts it within the reach of most people. Two exemplary use cases for health assessment and sports training are presented to illustrate the capabilities of PhysioDroid. Next technical steps include generalization to other mobile platforms and health monitoring devices. Oresti Banos, Claudia Villalonga, Miguel Damas, Peter Gloesekoetter, Hector Pomares, and Ignacio Rojas Copyright © 2014 Oresti Banos et al. All rights reserved.
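As a flavor of the kind of vital-sign processing an app like the one above performs, here is a deliberately simplified sketch of deriving heart rate from electrocardiogram R-peak timestamps (our own toy example; the function name is assumed, and the real pipeline, with peak detection and filtering, is not shown in the abstract):

```python
def heart_rate_bpm(r_peak_times):
    """Estimate heart rate (beats per minute) from R-peak timestamps in
    seconds, via the mean RR interval. Peak detection and signal
    filtering, which a real monitoring app must do first, are omitted."""
    rr = [b - a for a, b in zip(r_peak_times, r_peak_times[1:])]
    if not rr:
        raise ValueError("need at least two R peaks")
    return 60.0 / (sum(rr) / len(rr))

print(heart_rate_bpm([0.0, 0.8, 1.6, 2.4]))  # mean RR 0.8 s, i.e. 75 bpm
```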
SVM-RFE Based Feature Selection and Taguchi Parameters Optimization for Multiclass SVM Classifier Wed, 10 Sep 2014 00:00:00 +0000 Recently, the support vector machine (SVM) has shown excellent performance in classification and prediction and is widely used in disease diagnosis and medical assistance. However, SVM functions well only on two-group classification problems. This study combines feature selection and SVM recursive feature elimination (SVM-RFE) to investigate the classification accuracy of multiclass problems for the Dermatology and Zoo databases. The Dermatology dataset contains 33 feature variables, 1 class variable, and 366 testing instances, and the Zoo dataset contains 16 feature variables, 1 class variable, and 101 testing instances. The feature variables in the two datasets were sorted in descending order of explanatory power, and different feature sets were selected by SVM-RFE to explore classification accuracy. Meanwhile, the Taguchi method was combined with the SVM classifier in order to optimize parameters and increase classification accuracy for multiclass classification. The experimental results show that the classification accuracy can exceed 95% after SVM-RFE feature selection and Taguchi parameter optimization for the Dermatology and Zoo databases. Mei-Ling Huang, Yung-Hsiang Hung, W. M. Lee, R. K. Li, and Bo-Ru Jiang Copyright © 2014 Mei-Ling Huang et al. All rights reserved. Comparative Study of Human Age Estimation with or without Preclassification of Gender and Facial Expression Tue, 09 Sep 2014 13:20:33 +0000 Age estimation has many useful applications, such as age-based face classification, finding lost children, surveillance monitoring, and face recognition invariant to age progression. Among the many factors affecting age estimation accuracy, gender and facial expression can have negative effects. In our research, the effects of gender and facial expression on age estimation using the support vector regression (SVR) method are investigated.
Our research is novel in the following four ways. First, the accuracies of age estimation using a single-level local binary pattern (LBP) and a multilevel LBP (MLBP) are compared, and MLBP shows better performance as a global extractor of texture features. Second, we compare the accuracies of age estimation using global features extracted by MLBP, local features extracted by Gabor filtering, and the combination of the two methods. Results show that the third approach is the most accurate. Third, the accuracies of age estimation with and without preclassification of facial expression are compared and analyzed. Fourth, those with and without preclassification of gender are compared and analyzed. The experimental results show the effectiveness of gender preclassification in age estimation. Dat Tien Nguyen, So Ra Cho, Kwang Yong Shin, Jae Won Bang, and Kang Ryoung Park Copyright © 2014 Dat Tien Nguyen et al. All rights reserved. Robust Framework to Combine Diverse Classifiers Assigning Distributed Confidence to Individual Classifiers at Class Level Mon, 08 Sep 2014 11:33:27 +0000 We present a classification framework that combines multiple heterogeneous classifiers in the presence of class label noise. An extension of m-Mediods based modeling is presented that generates models of various classes whilst identifying and filtering noisy training data. This noise-free data is further used to learn models for other classifiers, such as GMM and SVM. A weight learning method is then introduced to learn weights on each class for different classifiers to construct an ensemble. For this purpose, we applied a genetic algorithm to search for an optimal weight vector with which the classifier ensemble is expected to give the best accuracy. The proposed approach is evaluated on a variety of real-life datasets. It is also compared with existing standard ensemble techniques such as AdaBoost, Bagging, and Random Subspace Methods.
Experimental results show the superiority of the proposed ensemble method over its competitors, especially in the presence of class label noise and imbalanced classes. Shehzad Khalid, Sannia Arshad, Sohail Jabbar, and Seungmin Rho Copyright © 2014 Shehzad Khalid et al. All rights reserved. Gene Network Biological Validity Based on Gene-Gene Interaction Relevance Mon, 08 Sep 2014 08:41:38 +0000 In recent years, gene networks have become one of the most useful tools for modeling biological processes. Many gene network inference algorithms have been developed as techniques for extracting knowledge from gene expression data. Ensuring the reliability of the inferred gene relationships is a crucial task in any study in order to prove that the algorithms used are precise. Usually, this validation process can be carried out using prior biological knowledge. The metabolic pathways stored in KEGG are one of the most widely used knowledge sources for analyzing relationships between genes. This paper introduces a new methodology, GeneNetVal, to assess the biological validity of gene networks based on the relevance of the gene-gene interactions stored in KEGG metabolic pathways. Hence, a complete conversion of KEGG pathways into a gene association network and a new matching distance based on gene-gene interaction relevance are proposed. The performance of GeneNetVal was established with three different experiments. Firstly, our proposal is tested in a comparative ROC analysis. Secondly, a randomness study is presented to show the behavior of GeneNetVal when the noise in the input network is increased. Finally, the ability of GeneNetVal to detect the biological functionality of the network is shown. Francisco Gómez-Vela and Norberto Díaz-Díaz Copyright © 2014 Francisco Gómez-Vela and Norberto Díaz-Díaz. All rights reserved.
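The core idea in the GeneNetVal abstract above, scoring an inferred network against a KEGG-derived reference network with interaction-relevance weights, can be sketched minimally as follows (our own simplified stand-in, not the paper's actual matching distance; the function name, the undirected-edge convention, and the default weight of 1.0 are assumptions):

```python
def weighted_edge_match(inferred, reference, relevance):
    """Score an inferred gene network against a reference network
    (e.g. one derived from KEGG pathways): the sum of relevance weights
    of matched edges divided by the total relevance of the reference.
    Edges are treated as undirected; unknown edges get weight 1.0."""
    norm = lambda e: frozenset(e)                    # undirected edges
    ref = {norm(e) for e in reference}
    matched = sum(relevance.get(norm(e), 1.0)
                  for e in inferred if norm(e) in ref)
    total = sum(relevance.get(e, 1.0) for e in ref)
    return matched / total if total else 0.0

# Example: one highly relevant interaction (a-b) and two default-weight ones
reference = [("a", "b"), ("b", "c"), ("c", "d")]
relevance = {frozenset({"a", "b"}): 2.0}
print(weighted_edge_match([("b", "a"), ("c", "b")], reference, relevance))
```

The point of the relevance weights is visible in the example: recovering the highly relevant interaction contributes more to the score than recovering an ordinary one.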
Insights into the Prevalence of Software Project Defects Sun, 07 Sep 2014 09:43:42 +0000 This paper analyses the effect of the effort distribution along the software development lifecycle on the prevalence of software defects. This analysis is based on data collected by the International Software Benchmarking Standards Group (ISBSG) on the development of 4,106 software projects. Data mining techniques have been applied to gain a better understanding of the behaviour of the project activities and to identify a link between the effort distribution and the prevalence of software defects. This analysis has been complemented with the use of a hierarchical clustering algorithm with a dissimilarity based on the likelihood ratio statistic, for exploratory purposes. As a result, different behaviours have been identified for this collection of software development projects, allowing for the definition of risk control strategies to diminish the number and impact of the software defects. It is expected that the use of similar estimations might greatly improve project managers’ awareness of the risks at hand. Javier Alfonso-Cendón, Manuel Castejón Limas, Joaquín B. Ordieres Meré, and Juan Pavón Copyright © 2014 Javier Alfonso-Cendón et al. All rights reserved. Comparative Study on Interaction of Form and Motion Processing Streams by Applying Two Different Classifiers in Mechanism for Recognition of Biological Movement Wed, 03 Sep 2014 07:31:50 +0000 Research in psychophysics, neurophysiology, and functional imaging shows that the representation of biological movements involves two pathways. The visual perception of biological movements is formed through the visual system by the so-called dorsal and ventral processing streams. The ventral processing stream is associated with the extraction of form information; the dorsal processing stream, on the other hand, provides motion information.
The active basic model (ABM), as a hierarchical representation of the human object, introduces novelty into the form pathway by applying a Gabor-based supervised object recognition method, creating more biological plausibility along with similarity to the original model. A fuzzy inference system is used for motion pattern information in the motion pathway, creating more robustness in the recognition process. The interaction of these pathways is intriguing and has been considered in many studies across various fields; here, this interaction is investigated to obtain more appropriate results. An extreme learning machine (ELM) is employed as the classification unit of this model: it retains the main properties of artificial neural networks, while the difficulty of long training times is substantially diminished. Two different configurations, interaction using a synergetic neural network and interaction using ELM, are compared in terms of accuracy and compatibility. Bardia Yousefi and Chu Kiong Loo Copyright © 2014 Bardia Yousefi and Chu Kiong Loo. All rights reserved. LPTA: Location Predictive and Time Adaptive Data Gathering Scheme with Mobile Sink for Wireless Sensor Networks Wed, 03 Sep 2014 06:57:08 +0000 This paper exploits sink mobility to prolong the lifetime of sensor networks while keeping the data transmission delay relatively low. A location predictive and time adaptive data gathering scheme is proposed. In this paper, we introduce a sink location prediction principle based on loose time synchronization and deduce the time-location formulas of the mobile sink. According to their local clocks and the time-location formulas of the mobile sink, nodes in the network are able to calculate the current location of the mobile sink accurately and route data packets in a timely manner toward the mobile sink by multihop relay.
Considering that the data packets generated in different areas may differ greatly, an adaptive dwelling time adjustment method is also proposed to balance energy consumption among nodes in the network. Simulation results show that our data gathering scheme enables data routing with less data transmission delay and balances energy consumption among nodes. Chuan Zhu, Yao Wang, Guangjie Han, Joel J. P. C. Rodrigues, and Jaime Lloret Copyright © 2014 Chuan Zhu et al. All rights reserved. Integer-Linear-Programing Optimization in Scalable Video Multicast with Adaptive Modulation and Coding in Wireless Networks Wed, 03 Sep 2014 00:00:00 +0000 Advances in wideband wireless networks support real-time services such as IPTV and live video streaming. However, because of the shared nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and also basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC to the wireless multicast service is how to assign MCSs and time resources to each SVC layer under heterogeneous channel conditions. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance in an 802.16m environment. The results show that our methodology enhances the overall system throughput compared to an existing algorithm. Dongyul Lee and Chaewoo Lee Copyright © 2014 Dongyul Lee and Chaewoo Lee. All rights reserved.
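The layer-to-MCS assignment problem described in the SVC multicast abstract above can be illustrated, at toy scale, by exhaustive enumeration instead of a real ILP solver (our own sketch, not the paper's formulation: the function name, the "rate = MCS index + 1" model, and the omission of time-resource constraints are all assumptions):

```python
from itertools import product

def best_mcs_assignment(user_caps, n_layers, n_mcs):
    """Exhaustively assign an MCS (0 = most robust, n_mcs-1 = highest
    rate) to each SVC layer, maximizing total decoded rate across users.
    A user decodes a layer only if its channel supports that layer's MCS
    AND it decoded every lower layer (SVC layering). A brute-force toy
    stand-in for the ILP in the paper; time resources are ignored."""
    def utility(assign):
        total = 0
        for cap in user_caps:          # cap = highest MCS the user can decode
            for mcs in assign:         # layers in order, lowest first
                if mcs > cap:
                    break              # layer lost, higher layers useless
                total += mcs + 1       # toy rate: higher MCS index, higher rate
        return total
    return max(product(range(n_mcs), repeat=n_layers), key=utility)

# Three users supporting MCS up to 0, 1, 2; two layers; three MCSs
print(best_mcs_assignment([0, 1, 2], 2, 3))
```

In the example, assigning the most robust MCS everywhere reaches all users but wastes rate, while the highest-rate MCS serves only one user; the enumeration lands on the intermediate trade-off, which is exactly the tension the ILP formulation resolves at realistic scale.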
A Novel Latin Hypercube Algorithm via Translational Propagation Tue, 02 Sep 2014 12:05:52 +0000 Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is directly related to the experimental designs used. Optimal Latin hypercube designs are frequently used and have been shown to have good space-filling and projective properties. However, the high cost of constructing them limits their use. In this paper, a methodology for creating novel Latin hypercube designs via a translational propagation and successive local enumeration (TPSLE) algorithm is developed without using formal optimization. The TPSLE algorithm is based on the insight that a near-optimal Latin hypercube design can be constructed from a small initial block of points, generated by the SLE algorithm, used as a building block. The TPSLE algorithm offers a balanced trade-off between efficiency and sampling performance. The proposed algorithm is compared to two existing algorithms and is found to be much more efficient in terms of computation time while retaining acceptable space-filling and projective properties. Guang Pan, Pengcheng Ye, and Peng Wang Copyright © 2014 Guang Pan et al. All rights reserved. Security Considerations and Recommendations in Computer-Based Testing Mon, 01 Sep 2014 13:32:43 +0000 Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams, and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security.
Security aspects may include but are not limited to the identification and authentication of the examinee, the risks associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper mainly investigates the security considerations associated with CBT and provides some recommendations for the security of these kinds of tests. We also propose a palm-based biometric authentication system incorporated with a basic authentication system (username/password) in order to check the identity and authenticity of the examinee. Saleh M. Al-Saleem and Hanif Ullah Copyright © 2014 Saleh M. Al-Saleem and Hanif Ullah. All rights reserved. A Synthesized Heuristic Task Scheduling Algorithm Mon, 01 Sep 2014 12:15:14 +0000 Aiming at static task scheduling problems in heterogeneous environments, a heuristic task scheduling algorithm named HCPPEFT is proposed. In the task prioritizing phase, the algorithm uses three levels of priority to choose tasks: critical tasks have the highest priority, then tasks with a longer path to the exit task are selected, and finally tasks with fewer predecessors are chosen for scheduling. In the resource selection phase, the algorithm uses task duplication to reduce the interresource communication cost; in addition, forecasting the impact of an assignment on all children of the current task permits better decisions to be made in selecting resources. The proposed algorithm is compared with the STDH, PEFT, and HEFT algorithms through randomly generated graphs and sets of task graphs. The experimental results show that the new algorithm can achieve better scheduling performance. Yanyan Dai and Xiangli Zhang Copyright © 2014 Yanyan Dai and Xiangli Zhang. All rights reserved.
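The three-level priority rule in the scheduling abstract above (critical tasks first, then longer path to the exit task, then fewer predecessors) can be sketched on a toy DAG. The task graph, costs, and the simplified notion of "critical" below are hypothetical; this is an illustration of the ordering idea, not the authors' exact HCPPEFT ranking function.

```python
from functools import lru_cache

# A toy task DAG: task -> (computation cost, successors). Numbers are made up.
tasks = {
    "A": (2, ["B", "C"]),
    "B": (3, ["D"]),
    "C": (1, ["D"]),
    "D": (2, []),
}

@lru_cache(maxsize=None)
def exit_path(t):
    """Longest path (sum of costs) from task t down to the exit task."""
    cost, succs = tasks[t]
    return cost + max((exit_path(s) for s in succs), default=0)

# Count predecessors of each task from the successor lists.
preds = {t: 0 for t in tasks}
for _, succs in tasks.values():
    for s in succs:
        preds[s] += 1

# Simplification: a task counts as "critical" if its exit-path length equals
# the critical-path length of the whole graph.
critical_len = max(exit_path(t) for t in tasks if preds[t] == 0)

def priority(t):
    # 1) critical tasks first, 2) longer path to the exit task, 3) fewer predecessors
    return (exit_path(t) == critical_len, exit_path(t), -preds[t])

order = sorted(tasks, key=priority, reverse=True)
print(order)
```

On this graph the entry task A (on the critical path) is scheduled first, then B beats C because its path to the exit task is longer.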
Method for User Interface of Large Displays Using Arm Pointing and Finger Counting Gesture Recognition Mon, 01 Sep 2014 08:08:37 +0000 Although many three-dimensional pointing gesture recognition methods have been proposed, the problem of self-occlusion has not been considered. Furthermore, because almost all pointing gesture recognition methods use a wide-angle camera, additional sensors or cameras are required to concurrently perform finger gesture recognition. In this paper, we propose a method for performing both pointing gesture and finger gesture recognition for large display environments, using a single Kinect device and a skeleton tracking model. By considering self-occlusion, a compensation technique can be performed on the user’s detected shoulder position when a hand occludes the shoulder. In addition, we propose a technique to facilitate finger counting gesture recognition, based on the depth image of the hand position. In this technique, the depth image is extracted from the end of the pointing vector. With exception handling for self-occlusions, experimental results indicate that the pointing accuracy for a specific reference position was significantly improved. The average root mean square error was approximately 13 pixels at a 1920 × 1080 pixel screen resolution. Moreover, the finger counting gesture recognition accuracy was 98.3%. Hansol Kim, Yoonkyung Kim, and Eui Chul Lee Copyright © 2014 Hansol Kim et al. All rights reserved. Efficiently Hiding Sensitive Itemsets with Transaction Deletion Based on Genetic Algorithms Mon, 01 Sep 2014 07:26:42 +0000 Data mining is used to extract meaningful and useful information or knowledge from very large databases. Some secure or private information can be discovered by data mining techniques, thus resulting in an inherent risk of threats to privacy.
Privacy-preserving data mining (PPDM) has thus arisen in recent years to sanitize the original database for hiding sensitive information, which can be considered an NP-hard problem in the sanitization process. In this paper, a compact prelarge GA-based algorithm (cpGA2DT) that deletes transactions to hide sensitive itemsets is proposed. It overcomes the limitations of the evolutionary process by adopting both the compact GA-based (cGA) mechanism and the prelarge concept. A flexible fitness function with three adjustable weights is designed to find the appropriate transactions to be deleted in order to hide sensitive itemsets with minimal side effects of hiding failure, missing cost, and artificial cost. Experiments are conducted to show the performance of the proposed cpGA2DT algorithm compared to the simple GA-based (sGA2DT) algorithm and the greedy approach in terms of execution time and the three side effects. Chun-Wei Lin, Binbin Zhang, Kuo-Tung Yang, and Tzung-Pei Hong Copyright © 2014 Chun-Wei Lin et al. All rights reserved. Nonuniform Video Size Reduction for Moving Objects Sun, 31 Aug 2014 14:44:31 +0000 Moving objects of interest (MOOIs) in surveillance videos are detected and encapsulated by bounding boxes. Since moving objects are defined by temporal activities through consecutive video frames, it is necessary to examine a group of frames (GoF) to detect them. To do so, the traces of moving objects in the GoF are quantified by forming a spatiotemporal gradient map (STGM) through the GoF. Each pixel value in the STGM corresponds to the maximum temporal gradient of the spatial gradients at the same pixel location for all frames in the GoF. Therefore, the STGM highlights boundaries of the MOOI in the GoF, and the optimal bounding box encapsulating the MOOI can be determined as the local areas with the peak average STGM energy.
Once an MOOI and its bounding box are identified, the inside and outside of it can be treated differently for object-aware size reduction. Our optimal encapsulation method for MOOIs in surveillance videos makes it possible to recognize the moving objects even after low-bitrate video compression. Anh Vu Le, Seung-Won Jung, and Chee Sun Won Copyright © 2014 Anh Vu Le et al. All rights reserved. Network Anomaly Detection System with Optimized DS Evidence Theory Sun, 31 Aug 2014 14:39:40 +0000 Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection, but with low performance and without considering the complicated and varied nature of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor’s regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model achieves a better detection rate and that the RBPA and ODS optimization methods can improve system performance significantly. Yuan Liu, Xiaofeng Wang, and Kaiyu Liu Copyright © 2014 Yuan Liu et al. All rights reserved. Color Image Segmentation Based on Different Color Space Models Using Automatic GrabCut Sun, 31 Aug 2014 13:28:38 +0000 This paper presents a comparative study using different color spaces to evaluate the performance of color image segmentation using the automatic GrabCut technique. GrabCut is considered one of the semiautomatic image segmentation techniques, since it requires user interaction to initialize the segmentation process.
The automation of the GrabCut technique is proposed as a modification of the original semiautomatic one in order to eliminate the user interaction. The automatic GrabCut utilizes the unsupervised Orchard and Bouman clustering technique for the initialization phase. Comparisons with the original GrabCut show the efficiency of the proposed automatic technique in terms of segmentation quality and accuracy. As no explicit color space is recommended for every segmentation problem, automatic GrabCut is applied with , , , , and color spaces. The comparative study and experimental results using different color images show that color space is the best color space representation for the set of the images used. Dina Khattab, Hala Mousher Ebied, Ashraf Saad Hussein, and Mohamed Fahmy Tolba Copyright © 2014 Dina Khattab et al. All rights reserved. The Framework for Simulation of Bioinspired Security Mechanisms against Network Infrastructure Attacks Sun, 31 Aug 2014 11:57:57 +0000 The paper outlines a bioinspired approach named “network nervous system” and methods of simulation of infrastructure attacks and protection mechanisms based on this approach. The protection mechanisms consist of distributed procedures of information collection and processing, which coordinate the activities of the main devices of a computer network, identify attacks, and determine necessary countermeasures. Attacks and protection mechanisms are specified as structural models using a set-theoretic approach. An environment for simulation of protection mechanisms based on the biological metaphor is considered; experiments demonstrating the effectiveness of the protection mechanisms are described. Andrey Shorov and Igor Kotenko Copyright © 2014 Andrey Shorov and Igor Kotenko. All rights reserved.
Autogenerator-Based Modelling Framework for Development of Strategic Games Simulations: Rational Pigs Game Extended Sun, 31 Aug 2014 11:57:12 +0000 When considering strategic games from the conceptual perspective that focuses on the questions of participants’ decision-making rationality, the very issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been relatively intensively analyzed in terms of reassessment of the logic of two players involved in asymmetric situations as gluttons that differ significantly by their attributes. This paper presents a successful attempt at using an autogenerator to create the framework of the game, including predefined scenarios and corresponding payoffs. The autogenerator offers flexibility concerning the specification of game parameters, which consist of variations in the number of simultaneous players and their features, game objects and their attributes, and some general game characteristics. In the proposed approach, the autogenerator model was upgraded to enable program specification updates. To treat more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to the particular attributes of the new player, “the tramp,” one equilibrium point from the original game is destabilized, which influences the decision-making of rational players. Robert Fabac, Danijel Radošević, and Ivan Magdalenić Copyright © 2014 Robert Fabac et al. All rights reserved. SPONGY (SPam ONtoloGY): Email Classification Using Two-Level Dynamic Ontology Sun, 31 Aug 2014 10:51:36 +0000 Email is one of the most common communication methods between people on the Internet. However, the increase in email misuse/abuse has resulted in an increasing volume of spam emails in recent years.
An experimental system was designed and implemented with the hypothesis that this method would outperform existing techniques, and the experimental results showed that the proposed ontology-based approach indeed improves spam filtering accuracy significantly. In this paper, two levels of ontology spam filters were implemented: a first-level global ontology filter and a second-level user-customized ontology filter. The global ontology filter filtered about 91% of spam, which is comparable with other methods. The user-customized ontology filter was created based on the specific user’s background as well as the filtering mechanism used in the global ontology filter creation. The main contributions of the paper are (1) to introduce an ontology-based multilevel filtering technique that uses both a global ontology and an individual filter for each user to increase spam filtering accuracy and (2) to create a spam filter in the form of an ontology, which is user-customized, scalable, and modularized, so that it can be embedded in many other systems for better performance. Seongwook Youn Copyright © 2014 Seongwook Youn. All rights reserved. Comparing Evolutionary Strategies on a Biobjective Cultural Algorithm Sun, 31 Aug 2014 06:35:51 +0000 Evolutionary algorithms have been widely used to solve large and complex optimisation problems. Cultural algorithms (CAs) are evolutionary algorithms that have been used to solve both single and, to a lesser extent, multiobjective optimisation problems. In order to solve these optimisation problems, CAs make use of different strategies such as normative knowledge, historical knowledge, and circumstantial knowledge, among others. In this paper we present a comparison among CAs that make use of different evolutionary strategies; the first implements historical knowledge, the second circumstantial knowledge, and the third normative knowledge.
These CAs are applied to a biobjective uncapacitated facility location problem (BOUFLP), the biobjective version of the well-known uncapacitated facility location problem. To the best of our knowledge, only a few articles have applied evolutionary multiobjective algorithms to the BOUFLP, and none of them has focused on the impact of the evolutionary strategy on algorithm performance. Our biobjective cultural algorithm, called BOCA, obtains important improvements when compared to other well-known evolutionary biobjective optimisation algorithms such as PAES and NSGA-II. The conflicting objective functions considered in this study are cost minimisation and coverage maximisation. Solutions obtained by each algorithm are compared using the hypervolume S metric. Carolina Lagos, Broderick Crawford, Enrique Cabrera, Ricardo Soto, José-Miguel Rubio, and Fernando Paredes Copyright © 2014 Carolina Lagos et al. All rights reserved. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents Thu, 28 Aug 2014 11:30:41 +0000 This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues have been largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goals.
While many such complex schemes with resource redundancies are sufficient for offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. Omar Tayan, Muhammad N. Kabir, and Yasser M. Alginahi Copyright © 2014 Omar Tayan et al. All rights reserved. Fault Detection and Diagnosis for Gas Turbines Based on a Kernelized Information Entropy Model Thu, 28 Aug 2014 11:24:08 +0000 Gas turbines are considered one of the most important devices in power engineering and have been widely used in power generation, airplanes, naval ships, and oil drilling platforms. However, in most cases they are monitored without personnel on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflects the overall state of the gas paths of the gas turbine.
In addition, we extend the entropy to compute the information quantity of features in kernel spaces, which helps to select informative features for a given recognition task. Finally, we introduce an information-entropy-based decision tree algorithm to extract rules from fault samples. Experiments on real-world data show the effectiveness of the proposed algorithms. Weiying Wang, Zhiqiang Xu, Rui Tang, Shuying Li, and Wei Wu Copyright © 2014 Weiying Wang et al. All rights reserved. IoT-Based Smart Garbage System for Efficient Food Waste Management Thu, 28 Aug 2014 07:07:26 +0000 Owing to a paradigm shift toward the Internet of Things (IoT), research into IoT services has been conducted in a wide range of fields. Waste management has become one major application field of IoT. The absence of efficient waste management has caused serious environmental problems and cost issues. Therefore, in this paper, an IoT-based smart garbage system (SGS) is proposed to reduce the amount of food waste. In an SGS, battery-based smart garbage bins (SGBs) exchange information with each other using wireless mesh networks, and a router and server collect and analyze the information for service provisioning. Furthermore, the SGS includes various IoT techniques for user convenience and increases battery lifetime through two types of energy-efficient operation of the SGBs: stand-alone operation and cooperation-based operation. The proposed SGS was operated as a pilot project in the Gangnam district of Seoul, Republic of Korea, for a one-year period. The experiment showed that the average amount of food waste could be reduced by 33%. Insung Hong, Sunghoi Park, Beomseok Lee, Jaekeun Lee, Daebeom Jeong, and Sehyun Park Copyright © 2014 Insung Hong et al. All rights reserved.
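The Shannon-entropy uniformity measure mentioned in the gas-turbine abstract above can be sketched in a few lines. The temperature readings below are hypothetical; the paper's kernelized generalization is not reproduced here, only the basic idea that uniform exhaust temperatures maximize normalized entropy.

```python
import math

def temperature_uniformity(temps):
    """Normalized Shannon entropy of the exhaust-temperature distribution:
    1.0 for perfectly uniform readings, lower when the gas path is skewed."""
    total = sum(temps)
    probs = [t / total for t in temps]
    h = -sum(p * math.log(p) for p in probs if p > 0)
    return h / math.log(len(temps))  # divide by the maximum possible entropy

healthy = [610, 605, 612, 608]  # evenly distributed exhaust temperatures (made up)
faulty = [610, 605, 612, 350]   # one thermocouple reads far colder

print(temperature_uniformity(healthy))  # close to 1.0
print(temperature_uniformity(faulty))   # lower, flagging a skewed gas path
```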
A Chaotic Cryptosystem for Images Based on Henon and Arnold Cat Map Thu, 28 Aug 2014 07:01:06 +0000 The rapid evolution of imaging and communication technologies has transformed images into a widespread data type. Different types of data, such as personal medical information, official correspondence, or governmental and military documents, are saved and transmitted in the form of images over public networks. Hence, a fast and secure cryptosystem is needed for high-resolution images. In this paper, a novel encryption scheme is presented for securing images based on the Arnold cat and Henon chaotic maps. The scheme uses the Arnold cat map for bit- and pixel-level permutations on plain and secret images, while the Henon map creates secret images and specific parameters for the permutations. Both the encryption and decryption processes are explained, formulated, and graphically presented. The results of security analysis of five different images demonstrate the strength of the proposed cryptosystem against statistical, brute-force, and differential attacks. The evaluated running times for both encryption and decryption processes guarantee that the cryptosystem can work effectively in real-time applications. Ali Soleymani, Md Jan Nordin, and Elankovan Sundararajan Copyright © 2014 Ali Soleymani et al. All rights reserved. On Distribution Reduction and Algorithm Implementation in Inconsistent Ordered Information Systems Thu, 28 Aug 2014 06:33:42 +0000 As one part of our work on ordered information systems, distribution reduction is studied in inconsistent ordered information systems (OISs). Some important properties of distribution reduction are studied and discussed. The dominance matrix is restated for reduction acquisition in dominance-relation-based information systems. A matrix algorithm for distribution reduction acquisition is presented step by step, and a program is implemented based on the algorithm.
The approach provides an effective tool for theoretical research on and practical applications of ordered information systems. For more detailed and valid illustration, cases are employed to explain and verify the algorithm and the program, which shows the effectiveness of the algorithm in complicated information systems. Yanqin Zhang Copyright © 2014 Yanqin Zhang. All rights reserved. Analysis and Simulation of the Dynamic Spectrum Allocation Based on Parallel Immune Optimization in Cognitive Wireless Networks Thu, 28 Aug 2014 06:31:43 +0000 Spectrum allocation is one of the key issues in improving spectrum efficiency and has become a hot topic in research on cognitive wireless networks. This paper discusses the real-time performance and efficiency of dynamic spectrum allocation and presents a new spectrum allocation algorithm based on a master-slave parallel immune optimization model. The algorithm designs a new encoding scheme for the antibody based on the requirements for convergence rate and population diversity. To improve calculation efficiency, the antibody affinities in the population are calculated on multiple computing nodes at the same time. Simulation results show that the algorithm reduces the total spectrum allocation time and can achieve higher network profits. Compared with traditional serial algorithms, the proposed algorithm has a better speedup ratio and parallel efficiency. Wu Huixin, Mo Duo, and Li He Copyright © 2014 Wu Huixin et al. All rights reserved. Low Complexity Mode Decision for 3D-HEVC Thu, 28 Aug 2014 06:29:34 +0000 High efficiency video coding- (HEVC-) based 3D video coding (3D-HEVC), developed by the joint collaborative team on 3D video coding (JCT-3V) for multiview video and depth maps, is an extension of the HEVC standard.
In the test model of 3D-HEVC, variable coding unit (CU) size decision and disparity estimation (DE) are introduced to achieve the highest coding efficiency, at the cost of very high computational complexity. In this paper, a fast mode decision algorithm based on variable-size CU and DE is proposed to reduce 3D-HEVC computational complexity. The basic idea of the method is to utilize the correlations between the depth map and motion activity to identify prediction modes where variable-size CU and DE are needed, and to enable variable-size CU and DE only in those regions. Experimental results show that the proposed algorithm can save about 43% of the average computational complexity of 3D-HEVC while maintaining almost the same rate-distortion (RD) performance. Qiuwen Zhang, Nana Li, and Yong Gan Copyright © 2014 Qiuwen Zhang et al. All rights reserved. A Multianalyzer Machine Learning Model for Marine Heterogeneous Data Schema Mapping Thu, 28 Aug 2014 00:00:00 +0000 The main challenge that marine heterogeneous data integration faces is accurate schema mapping between heterogeneous data sources. In order to improve schema mapping efficiency and obtain more accurate learning results, this paper proposes a heterogeneous data schema mapping method based on a multianalyzer machine learning model. The multianalyzer analyzes the learning results comprehensively, and a fuzzy comprehensive evaluation system is introduced for evaluating the output results and making multifactor quantitative judgments. Finally, a data mapping comparison experiment on East China Sea observation data confirms the effectiveness of the model and shows that the multianalyzer clearly reduces the mapping error rate. Wang Yan, Le Jiajin, and Zhang Yun Copyright © 2014 Wang Yan et al. All rights reserved.
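The fuzzy comprehensive evaluation step mentioned in the schema-mapping abstract above can be sketched as a weighted composition of a factor-weight vector with a membership matrix. The factors, grades, weights, and membership values below are hypothetical; the paper does not publish its exact evaluation matrix.

```python
# Hypothetical evaluation setup: three factors scored against three grades.
weights = [0.5, 0.3, 0.2]          # factor weights: accuracy, coverage, speed
grades = ["good", "fair", "poor"]
R = [                               # membership of each factor in each grade
    [0.7, 0.2, 0.1],                # accuracy
    [0.4, 0.5, 0.1],                # coverage
    [0.2, 0.3, 0.5],                # speed
]

# Weighted-average composition B = w . R, then pick the grade with the
# highest resulting membership degree as the overall verdict.
B = [sum(w * row[j] for w, row in zip(weights, R)) for j in range(len(grades))]
verdict = grades[max(range(len(B)), key=B.__getitem__)]
print(B, verdict)
```

Because each factor's memberships sum to 1 and the weights sum to 1, the composed vector B is itself a distribution over grades, which makes the quantitative judgment easy to compare across candidate mappings.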
Smoothing Strategies Combined with ARIMA and Neural Networks to Improve the Forecasting of Traffic Accidents Thu, 28 Aug 2014 00:00:00 +0000 Two smoothing strategies combined with autoregressive integrated moving average (ARIMA) and autoregressive neural network (ANN) models to improve the forecasting of time series are presented. Forecasting is implemented in two stages. In the first stage the time series is smoothed using either 3-point moving average smoothing or singular value decomposition of the Hankel matrix (HSVD). In the second stage, an ARIMA model and two ANNs for one-step-ahead time series forecasting are used. The coefficients of the first ANN are estimated through the particle swarm optimization (PSO) learning algorithm, while the coefficients of the second ANN are estimated with the resilient backpropagation (RPROP) learning algorithm. The proposed models are evaluated using a weekly time series of traffic accidents in the Valparaíso region of Chile from 2003 to 2012. The best result is given by the combination HSVD-ARIMA, with a MAPE of 0.26%, followed by MA-ARIMA with a MAPE of 1.12%; the worst result is given by the MA-ANN based on PSO, with a MAPE of 15.51%. Lida Barba, Nibaldo Rodríguez, and Cecilia Montt Copyright © 2014 Lida Barba et al. All rights reserved. Ephedrine QoS: An Antidote to Slow, Congested, Bufferless NoCs Thu, 28 Aug 2014 00:00:00 +0000 Datacenters consolidate diverse applications to improve utilization. However, when multiple applications are colocated on such platforms, contention for shared resources like networks-on-chip (NoCs) can degrade the performance of latency-critical online services (high-priority applications). Recently proposed bufferless NoCs (Nychis et al.) have the advantages of requiring less area and power, but they pose challenges for quality-of-service (QoS) support, which usually relies on buffer-based virtual channels (VCs).
We propose QBLESS, a QoS-aware bufferless NoC scheme for datacenters. QBLESS consists of two components: a routing mechanism (QBLESS-R) that can substantially reduce flit deflection for high-priority applications, and a congestion-control mechanism (QBLESS-CC) that guarantees performance for high-priority applications and improves overall system throughput. We use trace-driven simulation to model a 64-core system, finding that, compared to BLESS, a previous state-of-the-art bufferless NoC design, QBLESS improves performance of high-priority applications by an average of 33.2% and reduces network hops by an average of 42.8%. Juan Fang, Zhicheng Yao, Xiufeng Sui, and Yungang Bao Copyright © 2014 Juan Fang et al. All rights reserved. The Deployment of Routing Protocols in Distributed Control Plane of SDN Thu, 28 Aug 2014 00:00:00 +0000 Software defined networking (SDN) provides a programmable network through decoupling the data plane, control plane, and application plane from the original closed system, thus revolutionizing the existing network architecture to improve performance and scalability. In this paper, we examine the distributed characteristics of the Kandoo architecture and improve and optimize Kandoo’s two levels of controllers, drawing on ideas from the RCP (routing control platform). Finally, we analyze the deployment strategies of the BGP and OSPF protocols in a distributed control plane of SDN. The simulation results show that our deployment strategies are superior to traditional routing strategies. Zhou Jingjing, Cheng Di, Wang Weiming, Jin Rong, and Wu Xiaochun Copyright © 2014 Zhou Jingjing et al. All rights reserved. Discrete Bat Algorithm for Optimal Problem of Permutation Flow Shop Scheduling Wed, 27 Aug 2014 12:56:28 +0000 A discrete bat algorithm (DBA) is proposed for the optimal permutation flow shop scheduling problem (PFSP).
Firstly, the discrete bat algorithm is constructed based on the idea of the basic bat algorithm: the whole scheduling problem is divided into many subscheduling problems, and the NEH heuristic is then introduced to solve them. Secondly, some subsequences are operated on with a certain probability in the pulse emission and loudness phases. An intensive virtual population neighborhood search is integrated into the discrete bat algorithm to further improve the performance. Finally, the experimental results show the suitability and efficiency of the presented discrete bat algorithm for the optimal permutation flow shop scheduling problem. Qifang Luo, Yongquan Zhou, Jian Xie, Mingzhi Ma, and Liangliang Li Copyright © 2014 Qifang Luo et al. All rights reserved. Surface Evaluation by Estimation of Fractal Dimension and Statistical Tools Wed, 27 Aug 2014 08:51:20 +0000 Structured and complex data can be found in many applications in research and development, and also in industrial practice. We developed a methodology for describing the complexity of structured data and applied it in development and industrial practice. The methodology uses fractal dimension together with statistical tools and, with software modification, is able to analyse data in the form of sequences (signals, surface roughness), 2D images, and dividing lines. The methodology had not previously been tested on a relatively large collection of data. For this reason, samples with structured surfaces produced with different technologies and properties were measured and evaluated with many types of parameters. This paper analyses data measured by a surface roughness tester. The methodology compares standard and nonstandard parameters, searches for the optimal parameters for a complete analysis, and specifies the sensitivity to the directionality of samples for these types of surfaces.
The text presents the application of fractal geometry (fractal dimension) for complex surface analysis in combination with standard roughness parameters (a statistical tool). Vlastimil Hotar and Petr Salac Copyright © 2014 Vlastimil Hotar and Petr Salac. All rights reserved. DS-ARP: A New Detection Scheme for ARP Spoofing Attacks Based on Routing Trace for Ubiquitous Environments Wed, 27 Aug 2014 08:50:22 +0000 Despite its convenience, ubiquitous computing suffers from many threats and security risks. Security considerations in the ubiquitous network are required to create enriched and more secure ubiquitous environments. The address resolution protocol (ARP) is a protocol used to identify the IP address and the physical address of the associated network card. ARP is designed to work without problems in general environments. However, since its design does not include security measures against malicious attacks, an attacker can impersonate another host using ARP spoofing or access important information. In this paper, we propose a new detection scheme for ARP spoofing attacks using a routing trace, which can be used to protect the internal network. Tracing the route can detect changes in the network path. The proposed scheme provides high consistency and compatibility because it does not alter the ARP protocol. In addition, it is simple and stable, as it does not use a complex algorithm or impose an extra load on the computer system. Min Su Song, Jae Dong Lee, Young-Sik Jeong, Hwa-Young Jeong, and Jong Hyuk Park Copyright © 2014 Min Su Song et al. All rights reserved. An Efficient Algorithm for Recognition of Human Actions Wed, 27 Aug 2014 06:24:18 +0000 Recognition of human actions is an emerging need. Various researchers have endeavored to provide a solution to this problem. Some of the current state-of-the-art solutions are either inaccurate or computationally intensive, while others require human intervention.
In this paper, a sufficiently accurate yet computationally inexpensive solution is provided for the same problem. Image moments, which are translation, rotation, and scale invariant, are computed for a frame. A dynamic neural network is used to identify the patterns within the stream of image moments and hence recognize actions. Experiments show that the proposed model performs better than other competitive models. Yaser Daanial Khan, Nabeel Sabir Khan, Shoaib Farooq, Adnan Abid, Sher Afzal Khan, Farooq Ahmad, and M. Khalid Mahmood Copyright © 2014 Yaser Daanial Khan et al. All rights reserved. Self-Organized Service Negotiation for Collaborative Decision Making Wed, 27 Aug 2014 06:21:30 +0000 This paper proposes a self-organized service negotiation method for collaborative decision making (CDM) in an intelligent and automatic manner. It mainly includes three phases: semantic-based capacity evaluation for the CDM sponsor, trust computation of the CDM organization, and negotiation selection of the decision-making service provider (DMSP). In the first phase, the CDM sponsor produces the formal semantic description of the complex decision task for the DMSP and computes the capacity evaluation values according to participator instructions from different DMSPs. In the second phase, a novel trust computation approach is presented to compute the subjective belief value, the objective reputation value, and the recommended trust value. In the third phase, based on the capacity evaluation and trust computation, a negotiation mechanism is given to efficiently implement the service selection. The simulation experiment results show that our self-organized service negotiation method is feasible and effective for CDM. Bo Zhang, Zhenhua Huang, and Ziming Zheng Copyright © 2014 Bo Zhang et al. All rights reserved.
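As an illustration of the trust-computation phase described in the abstract above, the sketch below combines the three trust components (subjective belief, objective reputation, recommended trust) into a single score and uses it to rank DMSP candidates. The fixed linear weighting, the weight values, and the function names are illustrative assumptions; the abstract does not give the paper's actual formulas.

```python
# Minimal sketch of combining three trust components into one score.
# The linear combination and the weights (0.5, 0.3, 0.2) are assumptions
# for illustration, not the paper's actual trust model.

def combined_trust(belief, reputation, recommended,
                   weights=(0.5, 0.3, 0.2)):
    """Aggregate the three trust components into one score in [0, 1]."""
    w_b, w_r, w_c = weights
    return w_b * belief + w_r * reputation + w_c * recommended

def select_provider(candidates):
    """Pick the DMSP candidate with the highest combined trust score."""
    return max(candidates, key=lambda c: combined_trust(
        c["belief"], c["reputation"], c["recommended"]))

providers = [
    {"name": "A", "belief": 0.9, "reputation": 0.4, "recommended": 0.5},
    {"name": "B", "belief": 0.6, "reputation": 0.9, "recommended": 0.8},
]
winner = select_provider(providers)
```

With these weights, candidate B's stronger reputation and recommendation outweigh A's higher subjective belief, so the negotiation step would select B.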
A Ranking Procedure by Incomplete Pairwise Comparisons Using Information Entropy and Dempster-Shafer Evidence Theory Wed, 27 Aug 2014 06:15:48 +0000 Decision-making, as a way to discover ranking preferences, has been used in various fields. However, owing to the uncertainty in group decision-making, how to rank alternatives from incomplete pairwise comparisons has become an open issue. In this paper, an improved method is proposed for ranking alternatives by incomplete pairwise comparisons using Dempster-Shafer evidence theory and information entropy. Firstly, taking the probability assignment of the chosen preference into consideration, the comparison of alternatives within each group is addressed. Experiments verified that the information entropy of the data itself can objectively determine the different weights of each group's choices. Numerical examples in group decision-making environments are used to test the effectiveness of the proposed method. Moreover, the divergence of the ranking mechanism is analyzed briefly in the conclusion section. Dongbo Pan, Xi Lu, Juan Liu, and Yong Deng Copyright © 2014 Dongbo Pan et al. All rights reserved. Research on the Trajectory Model for ZY-3 Wed, 27 Aug 2014 05:46:53 +0000 The new-generation Chinese high-resolution three-line stereo-mapping satellite Ziyuan 3 (ZY-3) is equipped with three sensors (nadir, backward, and forward views). Its objective is to produce the 1 : 50000 topographic map and to revise and update the 1 : 25000 topographic map. For a push-broom satellite, the interpolation accuracy of orbit and attitude directly determines the satellite's stereo-mapping accuracy and its positioning accuracy without ground control points. In this paper, a new trajectory model is proposed for ZY-3, based on research into and analysis of the orbit and attitude of ZY-3.
Using the trajectory data set, the correctness and accuracy of the newly proposed trajectory model are validated and compared with other models: the polynomial model (LPM), the piecewise polynomial model (PPM), and the Lagrange cubic polynomial model (LCPM). Meanwhile, the differential equation is derived for the bundle block adjustment. Finally, the correctness and practicability of the piece-point with weight polynomial model for the ZY-3 satellite are validated in a geometric correction experiment using ZY-3 imagery together with orbit and attitude data. Yifu Chen and Zhong Xie Copyright © 2014 Yifu Chen and Zhong Xie. All rights reserved. A Method of Extracting Ontology Module Using Concept Relations for Sharing Knowledge in Mobile Cloud Computing Environment Wed, 27 Aug 2014 00:00:00 +0000 In the mobile cloud computing environment, the cooperation of distributed computing objects is one of the most important requirements for providing successful cloud services. To satisfy this requirement, all the members employed in the cooperation group need to share knowledge for mutual understanding. Even though ontology can be the right tool for this goal, there are several issues in building the right ontology. As the cost and complexity of managing knowledge increase with the scale of the knowledge, reducing the size of the ontology is one of the critical issues. In this paper, we propose a method of extracting an ontology module to increase the utility of knowledge. For a given signature, this method extracts the ontology module, which is semantically self-contained to fulfill the needs of the service, by considering the syntactic structure and semantic relations of concepts. By employing this module, instead of the original ontology, the cooperation of computing objects can be performed with less computing load and complexity.
In particular, when multiple external ontologies need to be combined for more complex services, this method can be used to optimize the size of the shared knowledge. Keonsoo Lee, Seungmin Rho, and Seok-Won Lee Copyright © 2014 Keonsoo Lee et al. All rights reserved. Chaos Enhanced Differential Evolution in the Task of Evolutionary Control of Selected Set of Discrete Chaotic Systems Tue, 26 Aug 2014 13:28:58 +0000 The evolutionary technique of differential evolution (DE) is used for the evolutionary tuning of controller parameters for the stabilization of a set of different chaotic systems. The novelty of the approach is that the selected controlled discrete dissipative chaotic system is also used as the chaotic pseudorandom number generator to drive the mutation and crossover processes in DE. The idea was to utilize the hidden chaotic dynamics in pseudorandom sequences given by the chaotic map to help the differential evolution algorithm search for the best controller settings for the very same chaotic system. The optimizations were performed for three different chaotic systems, two types of case studies, and the developed cost functions. Roman Senkerik, Ivan Zelinka, Michal Pluhacek, Donald Davendra, and Zuzana Oplatková Kominkova Copyright © 2014 Roman Senkerik et al. All rights reserved. An Opportunistic Routing Mechanism Combined with Long-Term and Short-Term Metrics for WMN Tue, 26 Aug 2014 11:48:23 +0000 The WMN (wireless mesh network) is a useful wireless multihop network of tremendous research value. The routing strategy decides the performance of the network and the quality of transmission. A good routing algorithm will use the whole bandwidth of the network and assure the quality of service of traffic.
Since the routing metric ETX (expected transmission count) does not assure good quality of wireless links, an opportunistic routing mechanism combined with long-term and short-term metrics for WMN, based on OLSR (optimized link state routing) and ETX, is proposed in this paper to improve routing performance. This mechanism always chooses the highest-throughput links to improve the performance of routing over the WMN and thus reduces the energy consumption of mesh routers. The simulations and analyses show that the opportunistic routing mechanism is better than the mechanism based on the ETX metric alone. Weifeng Sun, Haotian Wang, Xianglan Piao, and Tie Qiu Copyright © 2014 Weifeng Sun et al. All rights reserved. Mobile Recommendation Based on Link Community Detection Tue, 26 Aug 2014 11:01:15 +0000 Since traditional mobile recommendation systems have difficulty in acquiring complete and accurate user information in mobile networks, their recommendation accuracy is not high. In order to solve this problem, this paper proposes a novel mobile recommendation algorithm based on link community detection (MRLD). MRLD executes a link label diffusion algorithm and a greedy search maximizing the extended modularity (EQ) to obtain the link community structure, and overlapping-node belonging analysis (ONBA) is adopted to adjust the overlapping nodes in order to obtain a more accurate community structure. MRLD is tested on both synthetic and real-world networks, and the experimental results show that our approach is valid and feasible. Kun Deng, Jianpei Zhang, and Jing Yang Copyright © 2014 Kun Deng et al. All rights reserved. A Prerecognition Model for Hot Topic Discovery Based on Microblogging Data Tue, 26 Aug 2014 09:21:14 +0000 Microblogging is prevalent owing to its easy and anonymous information sharing on the Internet, which also brings the issue of spreading negative topics, or even rumors. Many researchers have focused on how to find and trace emerging topics for analysis.
When adopting topic detection and tracking techniques to find hot topics in streamed microblogging data, one meets obstacles such as streamed microblogging data clustering, topic hotness definition, and emerging hot topic discovery. This paper proposes a novel prerecognition model for hot topic discovery. In this model, the concepts of the topic life cycle, the hot velocity, and the hot acceleration are introduced to calculate the change of topic hotness, which aims to discover emerging hot topics before they break out. Our experiments show that this new model helps to discover potential hot topics efficiently and achieves considerable performance. Tongyu Zhu and Jianjun Yu Copyright © 2014 Tongyu Zhu and Jianjun Yu. All rights reserved. Feature Selection and Classifier Parameters Estimation for EEG Signals Peak Detection Using Particle Swarm Optimization Tue, 19 Aug 2014 06:50:03 +0000 Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on various peak features from several models. However, there is no study that establishes the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection on EEG signals in time-domain analysis. Two versions of PSO are used in the study: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework tries to find the best combination of all the available features that offers good peak detection and a high classification rate in the conducted experiments.
The evaluation results indicate that the accuracy of the peak detection can be improved up to 99.90% and 98.59% for training and testing, respectively, as compared to the framework without feature selection adaptation. Additionally, the proposed framework based on RA-PSO offers a better and more reliable classification rate than standard PSO, as it produces a low-variance model. Asrul Adam, Mohd Ibrahim Shapiai, Mohd Zaidi Mohd Tumari, Mohd Saberi Mohamad, and Marizan Mubin Copyright © 2014 Asrul Adam et al. All rights reserved. A Novel BA Complex Network Model on Color Template Matching Tue, 19 Aug 2014 06:15:44 +0000 A novel BA complex network model of color space is proposed based on the two fundamental rules of the BA scale-free network model: growth and preferential attachment. The scale-free characteristic of color space is discovered by analyzing the evolving process of a template's color distribution. The template's BA complex network model can then be used to select important color pixels, which have a much larger effect than other color pixels in the matching process. The proposed BA complex network model of color space can be easily integrated into many traditional template matching algorithms, such as SSD-based and SAD-based matching. Experiments show that the performance of color template matching can be improved with the proposed algorithm. To the best of our knowledge, this is the first study of how to model the color space of images using a proper complex network model and apply the complex network model to template matching. Risheng Han, Shigen Shen, Guangxue Yue, and Hui Ding Copyright © 2014 Risheng Han et al. All rights reserved. Towards Enhancement of Performance of K-Means Clustering Using Nature-Inspired Optimization Algorithms Mon, 18 Aug 2014 06:55:25 +0000 Traditional K-means clustering algorithms have the drawback of getting stuck at local optima that depend on the random values of the initial centroids.
Optimization algorithms have the advantage of guiding iterative computation to search for global optima while avoiding local optima. These algorithms help speed up the clustering process by converging to a global optimum early, with multiple search agents in action. Inspired by nature, some contemporary optimization algorithms, including the Ant, Bat, Cuckoo, Firefly, and Wolf search algorithms, mimic swarming behavior, allowing them to cooperatively steer towards an optimal objective within a reasonable time. It is known that these so-called nature-inspired optimization algorithms have their own characteristics as well as pros and cons in different applications. When these algorithms are combined with the K-means clustering mechanism to enhance its clustering quality by avoiding local optima and finding global optima, the new hybrids are anticipated to produce unprecedented performance. In this paper, we report the results of our evaluation experiments on the integration of nature-inspired optimization methods into K-means algorithms. In addition to the standard evaluation metrics for clustering quality, the extended K-means algorithms empowered by nature-inspired optimization methods are applied to image segmentation as a case study of an application scenario. Simon Fong, Suash Deb, Xin-She Yang, and Yan Zhuang Copyright © 2014 Simon Fong et al. All rights reserved. Further Study of Multigranulation -Fuzzy Rough Sets Sun, 17 Aug 2014 12:48:35 +0000 The optimistic multigranulation -fuzzy rough set model was established based on multiple granulations under a -fuzzy approximation space by Xu et al., 2012. From that work, a natural idea is to consider a pessimistic multigranulation model in the -fuzzy approximation space. So, in this paper, the main objective is to make further studies following Xu et al., 2012. The optimistic multigranulation -fuzzy rough set model is developed further by investigating additional properties.
A complete multigranulation -fuzzy rough set model is then constituted by addressing the pessimistic multigranulation -fuzzy rough set. The important properties of the multigranulation -fuzzy lower and upper approximation operators are also presented. Moreover, the relationships between multigranulation and classical -fuzzy rough sets have been studied carefully. From these relationships, we can find that the -fuzzy rough set model is a special instance of the two new types of models. In order to interpret and illustrate the optimistic and pessimistic multigranulation -fuzzy rough set models, a case is considered, which is helpful for applying these theories to practical issues. Wentao Li, Xiaoyan Zhang, and Wenxin Sun Copyright © 2014 Wentao Li et al. All rights reserved. A Novel Adaptive Cuckoo Search for Optimal Query Plan Generation Thu, 14 Aug 2014 15:40:24 +0000 The emergence of ever more web pages day by day has led to the development of semantic web technology. A World Wide Web Consortium (W3C) standard for storing semantic web data is the resource description framework (RDF). To enhance the efficiency of the execution time for querying large RDF graphs, evolving metaheuristic algorithms have become an alternative to the traditional query optimization methods. This paper focuses on the problem of query optimization of semantic web data. An efficient algorithm called adaptive Cuckoo search (ACS) for querying and generating optimal query plans for large RDF graphs is designed in this research. Experiments were conducted on different datasets with varying numbers of predicates. The experimental results show that the proposed approach provides significant results in terms of query execution time. The extent to which the algorithm is efficient is tested, and the results are documented. Ramalingam Gomathi and Dhandapani Sharmila Copyright © 2014 Ramalingam Gomathi and Dhandapani Sharmila. All rights reserved.
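The cuckoo search family of metaheuristics used in the abstract above can be illustrated with a generic minimal sketch: heavy-tailed random steps generate candidate solutions, and a fraction of the worst "nests" is abandoned and rebuilt each generation. This is a plain cuckoo-search illustration on a toy cost function, not the paper's adaptive (ACS) variant or its RDF query-plan cost model; all names and parameters here are assumptions.

```python
import math
import random

def cuckoo_search(cost, dim, n_nests=15, iters=200, pa=0.25, seed=1):
    """Generic cuckoo-search sketch: random-walk candidate generation
    plus abandonment of the worst pa fraction of nests each generation."""
    rng = random.Random(seed)
    nests = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_nests)]
    best = min(nests, key=cost)
    for _ in range(iters):
        for i, nest in enumerate(nests):
            # Heavy-tailed step: approximate a Levy flight with a
            # Cauchy-distributed step per dimension.
            step = [math.tan(math.pi * (rng.random() - 0.5)) * 0.1
                    for _ in range(dim)]
            cand = [x + s for x, s in zip(nest, step)]
            if cost(cand) < cost(nest):
                nests[i] = cand
        # Abandon the worst pa fraction of nests and rebuild them randomly.
        nests.sort(key=cost)
        for i in range(int(n_nests * (1 - pa)), n_nests):
            nests[i] = [rng.uniform(-5, 5) for _ in range(dim)]
        best = min(nests + [best], key=cost)
    return best

# Toy stand-in for a query-plan cost: the sphere function.
sol = cuckoo_search(lambda v: sum(x * x for x in v), dim=3)
```

In the paper's setting, the cost function would instead score candidate RDF query plans by estimated execution time, and the "adaptive" element would tune the step and abandonment parameters during the search.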
Resource Management Scheme Based on Ubiquitous Data Analysis Wed, 13 Aug 2014 11:55:23 +0000 Resource management of the main memory and the process handler is critical to enhancing the performance of a web server system. Owing to the transaction delay time that affects incoming requests from web clients, web server systems utilize several web processes to anticipate future requests. This procedure decreases the web page generation time because there are enough processes to handle the incoming requests from web browsers. However, inefficient process management results in low service quality for the web server system. Proper pregenerated process mechanisms are required for dealing with clients' requests. Unfortunately, it is difficult to predict how many requests a web server system is going to receive. If a web server system builds too many web processes, it wastes a considerable amount of memory space, and thus performance is reduced. We propose an adaptive web process manager scheme based on the analysis of web log mining. In the proposed scheme, the number of web processes is controlled through prediction of incoming requests, and accordingly, the web process management scheme consumes the fewest possible web transaction resources. In experiments, real web trace data were used to demonstrate the improved performance of the proposed scheme. Heung Ki Lee, Jaehee Jung, and Gangman Yi Copyright © 2014 Heung Ki Lee et al. All rights reserved. Novel Real-Time Facial Wound Recovery Synthesis Using Subsurface Scattering Tue, 12 Aug 2014 13:17:28 +0000 We propose a wound recovery synthesis model that illustrates the appearance of a wound healing on a 3-dimensional (3D) face. The H3 model is used to determine the size of the recovering wound. Furthermore, we present our subsurface scattering model, which is designed to take the multilayered skin structure of the wound into consideration to represent its color transformation.
We also propose a novel real-time rendering method based on the results of an analysis of the characteristics of translucent materials. Finally, we validate the proposed methods with 3D wound-simulation experiments using shading models. Taeyoung Choi and Seongah Chin Copyright © 2014 Taeyoung Choi and Seongah Chin. All rights reserved. Satellite Fault Diagnosis Using Support Vector Machines Based on a Hybrid Voting Mechanism Tue, 12 Aug 2014 10:29:04 +0000 Satellite fault diagnosis has an important role in enhancing the safety, reliability, and availability of the satellite system. However, the problem of enormous parameters and multiple faults poses a challenge to satellite fault diagnosis. The interactions between parameters and misclassifications from multiple faults increase the false alarm rate and the false negative rate. On the other hand, for each satellite fault, there is not enough fault data for training, which degrades the performance of most classification algorithms. In this paper, we propose an improved SVM based on a hybrid voting mechanism (HVM-SVM) to deal with the problems of enormous parameters, multiple faults, and small samples. Experimental results show that the accuracy of fault diagnosis using HVM-SVM is improved. Hong Yin, Shuqiang Yang, Xiaoqian Zhu, Songchang Jin, and Xiang Wang Copyright © 2014 Hong Yin et al. All rights reserved. Group Search Optimizer for the Mobile Location Management Problem Mon, 11 Aug 2014 12:11:32 +0000 We propose a diversity-guided group search optimizer-based approach for solving the location management problem in mobile computing. The location management problem, which is to find the optimal network configurations of management under the mobile computing environment, is considered here as an optimization problem.
The proposed diversity-guided group search optimizer algorithm is realized with the aid of a diversity operator, which helps alleviate the premature convergence problem of the group search optimizer algorithm, a successful optimization algorithm inspired by animal behavior. To address the location management problem, the diversity-guided group search optimizer algorithm is exploited to optimize network management configurations by minimizing the sum of the location update cost and the location paging cost. Experimental results illustrate the effectiveness of the proposed approach. Dan Wang, Congcong Xiong, and Wei Huang Copyright © 2014 Dan Wang et al. All rights reserved. Tracking Pedestrians across Multiple Microcells Based on Successive Bayesian Estimations Mon, 11 Aug 2014 11:44:43 +0000 We propose a method for tracking multiple pedestrians using a binary sensor network. In our proposed method, sensor nodes are composed of pairs of binary sensors and placed at specific points, such as doors, stairs, and elevators, referred to as gates, where pedestrians temporarily change their movement characteristics, to detect pedestrian arrival and departure events. Tracking pedestrians in each subregion divided by gates, referred to as microcells, is conducted by matching the pedestrian gate arrival and gate departure events using a Bayesian estimation-based method. To improve the accuracy of pedestrian tracking, the estimated pedestrian velocity and its reliability in a microcell are used for trajectory estimation in the succeeding microcell. Through simulation experiments, we show that the accuracy of pedestrian tracking using our proposed method is improved by up to 35% compared to the conventional method. Yoshiaki Taniguchi, Masahiro Sasabe, Takafumi Watanabe, and Hirotaka Nakano Copyright © 2014 Yoshiaki Taniguchi et al. All rights reserved.
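The Bayesian event-matching step in the pedestrian-tracking abstract above can be sketched as follows: given an arrival event at a gate, compute a posterior over which earlier departure event it corresponds to, using a Gaussian model of walking speed. The function name, the Gaussian speed model, and a uniform prior over candidate departures are illustrative assumptions; the paper's actual estimator (including the velocity-reliability propagation between microcells) is not given in the abstract.

```python
import math

def match_probabilities(arrival_time, departures, speed_mean, speed_sd, distance):
    """Posterior over which departure event produced this arrival, assuming
    a uniform prior and a Gaussian likelihood on the implied walking speed."""
    def likelihood(dep_time):
        dt = arrival_time - dep_time
        if dt <= 0:
            return 0.0  # an arrival cannot precede its departure
        speed = distance / dt  # walking speed implied by this pairing
        z = (speed - speed_mean) / speed_sd
        return math.exp(-0.5 * z * z)
    weights = [likelihood(t) for t in departures]
    total = sum(weights)
    return [w / total for w in weights] if total else weights
```

For example, with a 10 m microcell, mean speed 1.4 m/s (sd 0.2), departures at t = 0 s and t = 5 s, and an arrival at t = 7 s, the first departure implies a plausible 1.43 m/s while the second implies an implausible 5 m/s, so the posterior mass concentrates on the first departure.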
A Novel Algorithm for Imbalance Data Classification Based on Neighborhood Hypergraph Mon, 11 Aug 2014 08:26:54 +0000 The classification problem for imbalanced data has received increasing attention. So far, many significant methods have been proposed and applied in many fields, but more efficient methods are still needed. Although the hypergraph is an efficient tool for knowledge discovery, it may not be powerful enough to deal with data in the boundary region. In this paper, the neighborhood hypergraph is presented, combining rough set theory and the hypergraph. Then, a novel classification algorithm for imbalanced data based on the neighborhood hypergraph is developed, which is composed of three steps: initialization of hyperedges, classification of the training data set, and substitution of hyperedges. In an experiment of 10-fold cross validation on 18 data sets, the proposed algorithm achieved higher average accuracy than the others. Feng Hu, Xiao Liu, Jin Dai, and Hong Yu Copyright © 2014 Feng Hu et al. All rights reserved.
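The hyperedge-initialization step of the neighborhood hypergraph described in the abstract above can be sketched as a delta-neighborhood construction: each sample induces a hyperedge containing every sample within distance delta of it. The threshold delta and the Euclidean distance are illustrative choices; the paper combines such neighborhoods with rough-set approximations, which this sketch omits.

```python
import math

def build_neighborhood_hyperedges(samples, delta):
    """For each sample, build the hyperedge of indices of all samples
    lying within distance delta of it (its delta-neighborhood)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return [
        {j for j, other in enumerate(samples) if dist(center, other) <= delta}
        for center in samples
    ]
```

With points (0, 0), (0.5, 0), and (5, 5) and delta = 1.0, the first two points fall in each other's hyperedges while the third forms a singleton hyperedge, mirroring how sparse minority-class samples end up in small hyperedges.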