
Abductive Network Ensembles for Improved Prediction of Future Change-Prone Classes in Object-Oriented Software

Mojeeb Al-Khiaty1, Radwan Abdel-Aal2, and Mahmoud Elish1,3

1Information and Computer Science Department, King Fahd University of Petroleum and Minerals, Saudi Arabia

2Computer Engineering Department, King Fahd University of Petroleum and Minerals, Saudi Arabia

3Computer Science Department, Gulf University for Science and Technology, Kuwait

Abstract: Software systems are subject to a series of changes due to a variety of maintenance goals. Some parts of the software system are more prone to changes than others. These change-prone parts need to be identified so that maintenance resources can be allocated effectively. This paper proposes the use of Group Method of Data Handling (GMDH)-based abductive networks for modeling and predicting change proneness of classes in object-oriented software using both software structural properties (quantified by the C&K metrics) and software change history (quantified by a set of evolution-based metrics) as predictors. The empirical results derived from an experiment conducted on a case study of an open-source system show that the proposed approach improves the prediction accuracy as compared to statistical-based prediction models.

Keywords: Change-proneness, software metrics, abductive networks, ensemble classifiers.

Received June 2, 2015; accepted September 20, 2015
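For illustration, a minimal sketch of one GMDH-style layer of abductive units, assuming candidate units are quadratic polynomials over pairs of predictors selected by error on a held-out set; the metric names, toy data, and network depth below are hypothetical and not taken from the paper:

```python
# Hedged sketch: one GMDH-style layer. Each candidate unit is a quadratic
# polynomial of a pair of predictors (e.g., two class-level metrics); units
# are ranked by error on a held-out selection set. Data and names are toy.
import numpy as np
from itertools import combinations

def fit_quadratic_unit(x1, x2, y):
    """Least-squares fit of y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def unit_output(coef, x1, x2):
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    return A @ coef

def gmdh_layer(X_train, y_train, X_sel, y_sel, keep=3):
    """Build all pairwise quadratic units, keep the best `keep` by selection error."""
    units = []
    for i, j in combinations(range(X_train.shape[1]), 2):
        coef = fit_quadratic_unit(X_train[:, i], X_train[:, j], y_train)
        err = np.mean((unit_output(coef, X_sel[:, i], X_sel[:, j]) - y_sel) ** 2)
        units.append((err, i, j, coef))
    units.sort(key=lambda u: u[0])
    return units[:keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((200, 5))                                      # 5 hypothetical class metrics
    y = (X[:, 0] * X[:, 1] + 0.3 * X[:, 2] > 0.5).astype(float)   # toy change-proneness label
    for err, i, j, _ in gmdh_layer(X[:150], y[:150], X[150:], y[150:]):
        print(f"unit on metrics ({i},{j}): selection MSE = {err:.4f}")
```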

 


Chaotic Encryption Scheme Based on a Fast Permutation and Diffusion Structure

Jean De Dieu Nkapkop1,2, Joseph Effa1, Monica Borda2, Laurent Bitjoka3, and Alidou Mohamadou4

1Department of Physics, University of Ngaoundéré, Cameroon

2Department of Communications, Technical University of Cluj-Napoca, Romania

3Department of Electrical Engineering, Energetics and Automatics, University of Ngaoundéré, Cameroon

4Department of Physics, University of Maroua, Cameroon

Abstract: The image encryption architecture presented in this paper employs a novel permutation and diffusion strategy based on sorting chaotic solutions of the Linear Diophantine Equation (LDE), which aims to reduce the computational time observed in Chong's permutation structure. In this scheme, the sequence generated by combining the Piece Wise Linear Chaotic Map (PWLCM) with solutions of the LDE is first used as a permutation key to shuffle the sub-image. Secondly, the shuffled sub-image is masked using a diffusion scheme based on the Chebyshev map. Finally, in order to improve the resistance of the encrypted image to statistical attacks, the recombined image is shuffled again using the same permutation strategy applied in the first step. The design of the proposed algorithm is simple and efficient, and is based on three phases that provide the properties required of a secure image encryption algorithm. According to the NIST randomness tests, the image sequence encrypted by the proposed algorithm passes all the statistical tests with high P-values. Extensive cryptanalysis has also been performed, and the results indicate that the scheme offers superior security and higher speed compared to existing algorithms.

Keywords: Fast and secure encryption, chaotic sequence, linear diophantine equation, NIST test.

Received May 16, 2015; accepted September 7, 2015
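For illustration, a minimal sketch of the permutation-by-sorting and Chebyshev-based diffusion ideas, assuming a PWLCM keystream whose sort order gives the permutation and a simple chained XOR diffusion; the LDE-based key combination, sub-image blocking, and second shuffling pass of the actual scheme are not reproduced:

```python
# Hedged sketch (not the authors' exact scheme): permutation obtained by sorting
# a PWLCM chaotic sequence, followed by a diffusion pass keyed by the Chebyshev map.
import numpy as np

def pwlcm_sequence(x0, p, n):
    """Piecewise linear chaotic map on [0,1) with control parameter p in (0, 0.5)."""
    seq, x = np.empty(n), x0
    for i in range(n):
        if x < p:
            x = x / p
        elif x < 0.5:
            x = (x - p) / (0.5 - p)
        else:                                   # symmetric branch F(x) = F(1 - x)
            x = 1.0 - x
            x = x / p if x < p else (x - p) / (0.5 - p)
        seq[i] = x
    return seq

def chebyshev_keystream(x0, k, n):
    """Chebyshev map x_{n+1} = cos(k * arccos(x_n)) on [-1,1], quantized to bytes."""
    ks, x = np.empty(n, dtype=np.uint8), x0
    for i in range(n):
        x = np.cos(k * np.arccos(x))
        ks[i] = int((x + 1.0) * 0.5 * 255) & 0xFF
    return ks

def encrypt(pixels, x0_perm=0.37, p=0.29, x0_diff=0.61, k=5):
    perm = np.argsort(pwlcm_sequence(x0_perm, p, pixels.size))    # permutation key
    shuffled = pixels.flatten()[perm]
    ks = chebyshev_keystream(x0_diff, k, pixels.size)
    cipher, prev = np.empty_like(shuffled), 0
    for i, b in enumerate(shuffled):                              # simple chained diffusion
        cipher[i] = (int(b) ^ int(ks[i]) ^ int(prev)) & 0xFF
        prev = cipher[i]
    return cipher.reshape(pixels.shape), perm

if __name__ == "__main__":
    img = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
    enc, _ = encrypt(img)
    print(enc)
```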

 


Constructing a Lexicon of Arabic-English Named Entity using SMT and Semantic Linked Data

Emna Hkiri, Souheyl Mallat, Mounir Zrigui and Mourad Mars

Faculty of Sciences of Monastir, University of Monastir, Tunisia

Abstract: Named Entity Recognition (NER) is the problem of locating and categorizing atomic entities in a given text. In this work, we used DBpedia linked datasets and combined existing open source tools to generate a bilingual lexicon of Named Entities (NE) from a parallel corpus. To annotate NEs in the monolingual English corpus, we used linked data entities by mapping them to GATE gazetteers. In order to translate the entities identified by the GATE tool from the English corpus, we used Moses, a Statistical Machine Translation (SMT) system. The construction of the Arabic-English NE lexicon is based on the results of the Moses translation. Our method is fully automatic and aims to support Natural Language Processing (NLP) tasks such as Machine Translation (MT), information retrieval, text mining, and question answering. Our lexicon contains 48753 pairs of Arabic-English NEs and is freely available for use by other researchers.

Keywords: NER, named entity translation, parallel Arabic-English lexicon, DBpedia, linked data entities, parallel corpus, SMT.

Received April 1, 2015; accepted October 7, 2015
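For illustration, a minimal sketch of the lexicon-construction step, assuming English NE spans are already annotated (e.g., by gazetteer lookup) and that word alignments from the SMT output are available; the sentences, alignments, and spans below are toy placeholders rather than the paper's data or tool interfaces:

```python
# Hedged sketch: project English NE spans onto the Arabic side through word
# alignments and collect the resulting (English NE, Arabic NE) pairs.
def project_entities(en_tokens, ar_tokens, alignments, en_spans):
    """alignments: set of (en_index, ar_index) pairs; en_spans: list of (start, end) NE spans."""
    lexicon = set()
    for start, end in en_spans:
        ar_indices = sorted({a for e, a in alignments if start <= e < end})
        if ar_indices:
            en_ne = " ".join(en_tokens[start:end])
            ar_ne = " ".join(ar_tokens[i] for i in ar_indices)
            lexicon.add((en_ne, ar_ne))
    return lexicon

if __name__ == "__main__":
    en = ["The", "president", "visited", "New", "York", "yesterday"]
    ar = ["زار", "الرئيس", "نيويورك", "أمس"]
    align = {(0, 1), (1, 1), (2, 0), (3, 2), (4, 2), (5, 3)}
    spans = [(3, 5)]                       # "New York" annotated as a location NE
    print(project_entities(en, ar, align, spans))
```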

 


Forecasting of Chaotic Time Series Using RBF Neural Networks Optimized By Genetic Algorithms

Mohammed Awad

Faculty of Engineering and Information Technology, Arab American University, Palestine

Abstract: Time series forecasting is an important tool used to support planning for both individual and organizational decisions. The problem consists of forecasting future data based on past and/or present data. This paper deals with the problem of time series forecasting from a given set of input/output data. We present a hybrid approach for time series forecasting using Radial Basis Function Neural Networks (RBFNs) and Genetic Algorithms (GAs). The GA is used to optimize the centers c and widths r of the RBFN, while the weights w of the RBFN are optimized using a traditional algorithm. This method applies an adaptive, GA-driven process for optimizing the RBFN parameters, which improves homogeneity during the process. The proposed hybrid approach improves the forecasting performance of the time series. The performance of the proposed method is evaluated on examples of the short-term Mackey-Glass time series. The results show that RBFNs whose parameters are optimized using GAs achieve a better root mean square error than RBFNs whose parameters are found by traditional algorithms.

Keywords: Time series forecasting, RBF neural networks, genetic algorithms, hybrid approach.

Received March 17, 2015; accepted October 7, 2015
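For illustration, a minimal sketch of the hybrid idea, assuming a real-coded GA over RBF centers and widths with the output weights fitted by ordinary least squares as the traditional step; the GA operators, fitness function, and toy data are simplifications, not the paper's configuration:

```python
# Hedged sketch: a GA evolves RBF centers/widths; linear output weights are
# fitted by least squares; fitness = RMSE on the training data.
import numpy as np

def rbf_design(X, centers, widths):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths[None, :] ** 2))

def fitness(X, y, centers, widths):
    Phi = rbf_design(X, centers, widths)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)      # weights by a traditional (LS) step
    return np.sqrt(np.mean((Phi @ w - y) ** 2)), w

def run_ga(X, y, n_rbf=6, pop=30, gens=40, seed=0):
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    def random_ind():                                 # centers (n_rbf*dim) + widths (n_rbf)
        return np.concatenate([rng.uniform(X.min(), X.max(), n_rbf * dim),
                               rng.uniform(0.05, 1.0, n_rbf)])
    def decode(ind):
        return ind[:n_rbf * dim].reshape(n_rbf, dim), np.abs(ind[n_rbf * dim:]) + 1e-6
    population = [random_ind() for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=lambda ind: fitness(X, y, *decode(ind))[0])
        parents = scored[:pop // 2]                   # truncation selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = rng.choice(len(parents), 2, replace=False)
            alpha = rng.random()
            child = alpha * parents[a] + (1 - alpha) * parents[b]   # arithmetic crossover
            child += rng.normal(0, 0.05, child.size)                # Gaussian mutation
            children.append(child)
        population = parents + children
    best = min(population, key=lambda ind: fitness(X, y, *decode(ind))[0])
    return fitness(X, y, *decode(best))[0]

if __name__ == "__main__":
    t = np.linspace(0, 4 * np.pi, 300)
    X, y = t.reshape(-1, 1), np.sin(t)                # toy series, not Mackey-Glass
    print(f"best training RMSE: {run_ga(X, y):.4f}")
```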

 


Contextual Text Categorization: An Improved Stemming Algorithm to Increase the Quality of Categorization in Arabic Text

Said Gadri and Abdelouahab Moussaoui

 Department of Computer Science, University Ferhat Abbas of Setif, Algeria

Abstract: One of the methods used to reduce the size of the term vocabulary in Arabic text categorization is to replace the different variants (forms) of words with their common root. This process is called root-based stemming. Root extraction in Arabic is more difficult than in other languages because the Arabic language has a very different and difficult structure: it is a very rich language with complex morphology. Many algorithms have been proposed in this field. Some of them are based on morphological rules and grammatical patterns, and are therefore quite difficult and require deep linguistic knowledge. Others are statistical, so they are less difficult and based only on some calculations. In this paper, we propose an improved stemming algorithm, based on root extraction and the n-gram technique, that returns the stems of Arabic words without using any morphological rules or grammatical patterns.

Keywords: Root extraction, information retrieval, bigrams, stemming, Arabic morphological rules, feature selection.

Received February 22, 2015; accepted August 12, 2015
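For illustration, a minimal sketch of one common n-gram technique, assuming candidate roots are ranked against a word by bigram overlap (Dice coefficient); the word list and candidate roots are toy examples, not the paper's algorithm or lexicon:

```python
# Hedged sketch: rank candidate roots for an Arabic word by bigram (2-gram)
# similarity using the Dice coefficient. Toy data; not the paper's exact method.
def bigrams(word):
    return {word[i:i + 2] for i in range(len(word) - 1)}

def dice(a, b):
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2.0 * len(ba & bb) / (len(ba) + len(bb))

def best_root(word, candidate_roots):
    return max(candidate_roots, key=lambda r: dice(word, r))

if __name__ == "__main__":
    roots = ["كتب", "درس", "علم"]            # hypothetical candidate roots
    for w in ["مكتبة", "الكاتب", "مدرسة"]:    # word forms to stem
        print(w, "->", best_root(w, roots))
```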

 


An Architecture of Thin Client-Edge Computing Collaboration for Data Distribution and Resource Allocation in Cloud

Aymen Alsaffar, Pham Hung, and Eui-Nam Huh

Department of Computer Science and Engineering, Kyung Hee University, South Korea

Abstract: These days, Thin-client devices continuously access the Internet to perform or receive a diversity of services in the cloud. However, these devices may lack either the capacity (e.g., processing, CPU, memory, storage, battery, resource allocation, etc.) or the network resources needed to satisfy users of Thin-client services. Furthermore, transferring large volumes of Big Data over the network to a centralized server can burden the network, degrade quality of service, cause long response delays, and use network resources inefficiently. To solve this issue, Thin-client devices such as smart mobile devices should be connected to edge computing, which is localized near the user and better able to provide computing and network resources. In this paper, we introduce a new method that builds its architecture on Thin-client/edge computing collaboration. Furthermore, we present our new strategy for optimizing Big Data distribution in cloud computing. Moreover, we propose an algorithm to allocate resources to meet Service Level Agreement (SLA) and Quality of Service (QoS) requirements. Our simulation results show that the proposed approach can allocate resources efficiently and performs better than other existing methods.

Keywords: Cloud computing, data distribution, edge computing, resource allocation, and thin client.

Received January 19, 2015; accepted August 12, 2015



TDMCS: An Efficient Method for Mining Closed Frequent Patterns over Data Streams Based on Time Decay Model

Meng Han, Jian Ding, and Juan Li

 School of Computer Science and Engineering, North Minzu University, China

Abstract: In some data stream applications, the information embedded in the data arriving in the most recent time period is more important than that in historical transactions. Because data streams change over time, the concept drift problem may appear in data stream mining. Frequent pattern mining methods tend to generate useless and redundant patterns; in order to obtain a losslessly compressed result set, closed patterns are needed. A novel method for efficiently mining closed frequent patterns over data streams is proposed in this paper. The main contributions are: distinguishing the importance of recent transactions from historical transactions based on a time decay model and a sliding window model; designing the minimum support count-maximal support error rate-decay factor (θ-ε-f) framework to avoid concept drift; using a closure operator to improve the efficiency of the algorithm; and designing a novel way to set the decay factor, the average decay factor f_average, in order to balance the high recall and high precision of the algorithm. The performance of the proposed method is evaluated via experiments, and the results show that the proposed method is efficient and stable. It is suited to mining data streams with high density and long patterns, works with sliding windows of different sizes, and is superior to other analogous algorithms.

Keywords: Data stream mining, frequent pattern mining, closed pattern mining, time decay model, sliding window, concept drift.

Received January 15, 2015; accepted August 12, 2015
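For illustration, a minimal sketch of time-decayed support counting, assuming existing pattern counts are multiplied by a decay factor before each new batch is added so that recent transactions weigh more; the closure operator and the θ-ε-f thresholds of the actual algorithm are not reproduced:

```python
# Hedged sketch: decayed support counting over batches of transactions.
# Older batches are discounted by the decay factor f each time a new batch arrives.
from collections import defaultdict
from itertools import combinations

def update_decayed_supports(supports, batch, decay=0.9, max_len=3):
    for pattern in list(supports):                 # age existing counts
        supports[pattern] *= decay
    for transaction in batch:                      # add new evidence
        items = sorted(set(transaction))
        for k in range(1, min(max_len, len(items)) + 1):
            for pattern in combinations(items, k):
                supports[pattern] += 1.0
    return supports

if __name__ == "__main__":
    supports = defaultdict(float)
    batches = [
        [["a", "b", "c"], ["a", "b"]],             # older batch
        [["b", "c"], ["a", "c"], ["c"]],           # newer batch
    ]
    for batch in batches:
        update_decayed_supports(supports, batch)
    theta = 1.0                                    # hypothetical decayed-support threshold
    frequent = {p: s for p, s in supports.items() if s >= theta}
    for p, s in sorted(frequent.items(), key=lambda x: -x[1]):
        print(p, round(s, 2))
```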


Internal Model Control to Characterize Human Handwriting Motion

Ines Chihi, Afef Abdelkrim, and Mohamed Benrejeb

National School of Engineers of Tunisia, Tunis El Manar University, Tunisia

Abstract: The main purpose of this paper is to consider the human handwriting process as an Internal Model Control (IMC) structure. The proposed approach characterizes this biological process from the activity of two forearm muscles, recorded as ElectroMyoGraphy (EMG) signals. To this end, an experimental setup was used to record the coordinates of a pen tip moving in the (x,y) plane, together with the EMG signals, during the handwriting act. In this sense, direct and inverse handwriting models are proposed to establish the relationship between the forearm muscle activities and the velocity of the pen tip. The Recursive Least Squares (RLS) algorithm is used to estimate the parameters of both models (direct and inverse). Simulations show good agreement between the results of the proposed approach and the recorded data.

Keywords: Human handwriting process, IMC, muscular activities, direct and inverse handwriting models, pen-tip velocity, RLS algorithm.

Received January 6, 2015; accepted September 22, 2015
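For illustration, a minimal sketch of a standard RLS update, assuming a linear-in-parameters model whose regressors stand in for EMG-derived features; the regressor construction, forgetting factor, and data are hypothetical:

```python
# Hedged sketch: standard Recursive Least Squares update, applied to a toy
# linear model y_t = phi_t . theta + noise (regressors stand in for EMG features).
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One RLS step with forgetting factor lam (phi: regressor vector)."""
    phi = phi.reshape(-1, 1)
    den = lam + (phi.T @ P @ phi).item()
    K = (P @ phi) / den                            # gain vector
    err = y - (phi.T @ theta).item()               # prediction error
    theta = theta + K * err
    P = (P - K @ phi.T @ P) / lam                  # covariance update
    return theta, P

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_theta = np.array([[0.8], [-0.3], [0.5]])  # hypothetical model parameters
    theta, P = np.zeros((3, 1)), np.eye(3) * 1000.0
    for _ in range(500):
        phi = rng.normal(size=3)                   # stand-in for EMG-based regressors
        y = (phi @ true_theta).item() + rng.normal(scale=0.01)
        theta, P = rls_update(theta, P, phi, y)
    print("estimated parameters:", theta.ravel())
```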

 


Efficient Segmentation of Arabic Handwritten Characters Using Structural Features

Mazen Bahashwan, Syed Abu-Bakar, and Usman Sheikh

Department of Electronics and Computer Engineering, Universiti Teknologi Malaysia, Malaysia

Abstract: Handwriting recognition is an important field with many practical applications, such as bank cheque processing, post office address processing, and zip code recognition. Most applications are developed exclusively for Latin characters. However, despite tremendous effort by researchers over the past three decades, Arabic handwriting recognition accuracy remains low because of low efficiency in determining the correct segmentation points. This paper presents an approach for character segmentation of unconstrained handwritten Arabic words. First, we seek all possible character segmentation points based on structural features. Next, we develop a novel technique that creates several paths for each possible segmentation point; these paths are used to differentiate between the different types of segmentation points. Finally, we use heuristic rules and neural networks, utilizing the information related to the segmentation points, to select the correct segmentation points. For comparison, we applied our method to the IESK-arDB and IFN/ENIT databases, achieving success rates of 91.6% and 90.5%, respectively.

Keywords: Arabic handwriting, character segmentation and structural features.

Received December 23, 2014; accepted August 26, 2015



A Novel Swarm Intelligence Algorithm for the Evacuation Routing Optimization Problem

Jin-long Zhu1, Wenhui Li2, Huiying Li2, Qiong Wu2, and Liang Zhang2

1Department of Computer Science and Technology, ChangChun Normal University, China

2Department of Computer Science and Technology, Jilin University, China

Abstract: This paper presents a novel swarm intelligence optimization algorithm that combines the evolutionary method of Particle Swarm Optimization (PSO) with the filled function method in order to solve the evacuation routing optimization problem. The proposed algorithm proceeds in three stages. In the first stage, we use the global optimization capability of the filled function method to obtain an optimal solution that sets the destinations of all particles. In the second stage, we use the randomness and rapidity of PSO to simulate the crowd evacuation. In the third stage, we propose three methods to manage the competitive behaviors among the particles. The algorithm builds an evacuation plan using the dynamic wayfinding of particles from both a macroscopic and a microscopic perspective simultaneously. Three types of experimental scenes are used to verify the effectiveness and efficiency of the proposed algorithm: a single room, a 4-room/1-corridor layout, and a multi-room multi-floor building layout. The simulation examples demonstrate that the proposed algorithm can greatly improve evacuation clearance and congestion times, and the experimental results show that the method takes full advantage of multiple exits to maximize evacuation efficiency.

Keywords: PSO, filled function, global optimum, local optimum.

Received November 17, 2014; accepted September 10, 2015
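For illustration, a minimal sketch of a plain PSO position/velocity update driving particles toward a fixed exit, as in the second stage; the filled-function destination selection and the competition-handling rules are not reproduced, and the room geometry is hypothetical:

```python
# Hedged sketch: plain PSO velocity/position update moving particles (evacuees)
# toward an exit; the filled-function stage and collision handling are omitted.
import numpy as np

def pso_evacuate(n_particles=20, steps=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    exit_pos = np.array([10.0, 10.0])                   # hypothetical exit location
    pos = rng.uniform(0, 5, size=(n_particles, 2))      # initial positions in a room
    vel = np.zeros_like(pos)
    cost = lambda p: np.linalg.norm(p - exit_pos, axis=-1)
    pbest, pbest_cost = pos.copy(), cost(pos)
    for _ in range(steps):
        gbest = pbest[np.argmin(pbest_cost)]
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        c = cost(pos)
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    return pos

if __name__ == "__main__":
    final = pso_evacuate()
    mean_dist = np.linalg.norm(final - [10.0, 10.0], axis=1).mean()
    print("mean distance to exit after simulation:", round(float(mean_dist), 3))
```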



The Veracious Counting Bloom Filter

Brindha Palanisamy1 and Senthilkumar Athappan2

1Research Scholar, Anna University, India

2Department of Electrical and Electronics Engineering, Anna University, India

Abstract: Counting Bloom Filters (CBFs) are widely employed in many applications for fast membership queries. A CBF works on dynamic sets rather than a static set, supporting item insertions and deletions. A CBF allows false positives, but not false negatives. The Bh-Counting Bloom Filter (Bh-CBF) and the Variable Increment Counting Bloom Filter (VI-CBF) were introduced to reduce the False Positive Probability (FPP), but they suffer from memory overhead and hardware complexity. In this paper, we propose a multilevel optimization approach, named the Veracious Bh-Counting Bloom Filter (VBh-CBF) and the Veracious Variable Increment Counting Bloom Filter (VVI-CBF), in which the counter vector is partitioned into multiple levels to reduce the FPP and limit the memory requirement. The experimental results show that, compared to the basic Bh-CBF and VI-CBF, the FPP is reduced by 65.4% and 67.74% and the total memory size by 20.26% and 41.29%, respectively.

Keywords: Bloom filter, false positive, counting Bloom filter, intrusion detection system.

Received August 3, 2014; accepted November 25, 2015
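For illustration, a minimal sketch of a basic Counting Bloom Filter (not the Bh, VI, or Veracious variants) to make the insert/delete/query behaviour and the false-positive trade-off concrete; the hash construction and sizes are arbitrary:

```python
# Hedged sketch: a plain Counting Bloom Filter supporting insert, delete and
# membership query. The Bh-CBF / VI-CBF / multilevel variants are not shown.
import hashlib

class CountingBloomFilter:
    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.counters = [0] * m

    def _indexes(self, item):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def insert(self, item):
        for idx in self._indexes(item):
            self.counters[idx] += 1

    def delete(self, item):
        if item in self:                      # only decrement if possibly present
            for idx in self._indexes(item):
                self.counters[idx] -= 1

    def __contains__(self, item):
        return all(self.counters[idx] > 0 for idx in self._indexes(item))

if __name__ == "__main__":
    cbf = CountingBloomFilter()
    cbf.insert("10.0.0.1")
    print("10.0.0.1" in cbf)   # True
    print("10.0.0.2" in cbf)   # False (with high probability; false positives possible)
    cbf.delete("10.0.0.1")
    print("10.0.0.1" in cbf)   # False
```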


A MMDBM Classifier with CPU and CUDA GPU Computing in Various Sorting Procedures

 Sivakumar Selvarasu1, Ganesan Periyanagounder1, and Sundar Subbiah2

1Department of Mathematics, Anna University, India

2Department of Mathematics, Indian Institute of Technology, India

Abstract: A decision tree classifier called the Mixed Mode Database Miner (MMDBM), which can classify large numbers of datasets with large numbers of attributes, is implemented with different sorting techniques (quick sort and radix sort) in both Central Processing Unit (CPU) computing and General-Purpose computing on Graphics Processing Units (GPGPU), and the results are discussed. This classifier is suitable for handling large numbers of both numerical and categorical attributes. The MMDBM classifier has been implemented on CUDA GPUs and the code is provided. We parallelized the two sorting techniques on the GPU using the Compute Unified Device Architecture (CUDA) parallel programming platform developed by NVIDIA Corporation. In this paper, we discuss efficient parallel sorting procedures (quick sort and radix sort) in GPGPU computing and compare the GPU results with CPU computing. The main MMDBM results are used to compare the classifier's existing CPU computing results with the GPU computing results. The GPU sorting algorithms provide quick and exact results with less processing time and offer sufficient support for real-time applications.

Keywords: Classification, data mining, CUDA, GPUs, decision tree, quick sort, radix sort.

Received July 29, 2014; accepted April 12, 2015
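For illustration, a minimal CPU-side sketch of LSD radix sort, one of the two sorting procedures, written in plain Python for clarity rather than CUDA; the GPU parallelization of the counting and scatter steps is not reproduced:

```python
# Hedged sketch: least-significant-digit (LSD) radix sort for non-negative
# integers, base 10. The CUDA version parallelizes the counting/scatter steps.
def radix_sort(values):
    if not values:
        return values
    exp, max_val = 1, max(values)
    while max_val // exp > 0:
        buckets = [[] for _ in range(10)]
        for v in values:                      # stable distribution by current digit
            buckets[(v // exp) % 10].append(v)
        values = [v for bucket in buckets for v in bucket]
        exp *= 10
    return values

if __name__ == "__main__":
    data = [170, 45, 75, 90, 802, 24, 2, 66]
    print(radix_sort(data))    # [2, 24, 45, 66, 75, 90, 170, 802]
```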

 


Inter-Path OOS Packets Differentiation Based Congestion Control for Simultaneous Multipath Transmission

Samiullah Khan and Muhammad Abdul Qadir

Department of Computer Science, Capital University of Science and Technology, Pakistan

Abstract: The increasing popularity and usage of multimode devices for ubiquitous network access creates a demand for utilizing simultaneous network connections. Unfortunately, the standard transport layer protocols use a single-homed congestion control mechanism for multipath transmission. One major challenge in such multipath transmission is Receiver Buffer (RBuf) blocking, which hinders a high aggregation ratio across multiple paths. This study proposes a Simultaneous Multipath Transmission (SMT) scheme to avoid the RBuf blocking problem. Realistic simulation scenarios, involving intermediate nodes, cross traffic, scalability, or a mix of them, were designed to thoroughly analyze SMT performance. The results reveal that SMT overcomes RBuf blocking and improves aggregate throughput to up to 95.3% of the total bandwidth.

Keywords: Multipath transmission, RBuf blocking, out-of-sequence arrival, throughput, congestion window.

Received June 14, 2014; accepted September 16, 2015

  


Method-level Code Clone Detection for Java through Hybrid Approach

Egambaram Kodhai1 and Selvadurai Kanmani2

1Department of Computer Science and Engineering, Sri MankulaVinayagar Engineering College, India

2Department of Information Technology, Pondicherry Engineering College, India

Abstract: Software clone detection is an active research area in which several researchers have investigated techniques to automatically detect duplicated code in programs. However, these techniques have limitations in finding either structural or functional clones, and they detect only the first three types of clones. In this paper, we propose a hybrid approach that combines a metric-based approach with textual analysis of the source code for the detection of both syntactical and functional clones in a given Java source code. The proposed approach detects all four types of clones. The detection process makes use of a set of metrics calculated for each type of clone. A tool named CloneManager was developed in Java, based on this method, for high portability and platform independence. The various types of clones detected by the tool are classified and clustered as clone clusters. The tool was tested on seven existing open source projects developed in Java and compared with existing approaches.

Keywords: Clone detection, functional clones, source code metrics, string-matching.

Received October 21, 2013; accepted June 24, 2014
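For illustration, a minimal sketch of a hybrid metric-plus-text check, assuming methods with identical metric vectors are candidate clones and candidates with identical normalized text are textual clones; the metric set, normalization, and clone-type classification of CloneManager are not reproduced:

```python
# Hedged sketch: group methods by a simple metric vector (candidate clones),
# then confirm textual clones by comparing whitespace/token-normalized code.
import re
from collections import defaultdict

def metrics(body):
    lines = [l for l in body.splitlines() if l.strip()]
    tokens = re.findall(r"\w+", body)
    return (len(lines), len(tokens), body.count("("), body.count("if"))

def normalize(body):
    body = re.sub(r"\s+", " ", body)                  # collapse whitespace
    return re.sub(r"\b[a-zA-Z_]\w*\b", "ID", body)    # replace word tokens with a placeholder

def find_clone_pairs(methods):
    by_metrics = defaultdict(list)
    for name, body in methods.items():
        by_metrics[metrics(body)].append(name)
    pairs = []
    for group in by_metrics.values():                 # candidates share a metric vector
        for i in range(len(group)):
            for j in range(i + 1, len(group)):
                a, b = group[i], group[j]
                textual = normalize(methods[a]) == normalize(methods[b])
                pairs.append((a, b, "textual" if textual else "metric-only"))
    return pairs

if __name__ == "__main__":
    methods = {
        "sumA": "int sum(int a, int b) {\n  return a + b;\n}",
        "sumB": "int add(int x, int y) {\n  return x + y;\n}",
        "max1": "int max(int a, int b) {\n  if (a > b) return a;\n  return b;\n}",
    }
    for pair in find_clone_pairs(methods):
        print(pair)
```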


Enhancing Cloud Security Based On Group Signature

Arumugam Sakthivel

Department of Computer Science and Engineering, Kalasalingam University, India

Abstract: Owing to its low maintenance requirements, cloud computing provides an economical and efficient solution for distributing group resources among cloud users. Unfortunately, distributing data in a multi-user fashion while preserving data and identity privacy from an untrusted cloud is still a challenging issue, because of the frequent change of membership. The proposed system presents a secure multi-user data distribution method for dynamic groups in the cloud. Using group signature and active broadcast encryption methods, any cloud user can anonymously distribute data to others. In addition, the storage overhead and the encryption computation cost of the proposed method are independent of the number of revoked users. The security and performance analysis of the proposed method shows that it is more efficient and secure than existing methods.

Keywords: Active broadcast encryption, cloud, data distribution, group signature.

Received September 12, 2014; accepted June 18, 2015
