Sunday, 29 March 2015 07:33

Speed up of Reindexing in Adaptive Particle Swarm Optimization

 

Niraimathi Ponnusamy and Bhoopathy Krishnaswamy
Department of Electronics Engineering, Anna University, India

 

Abstract: Palette re-ordering is a class of pre-processing methods whose objective is to manipulate the palette indices so that adjacent symbols are assigned close indices in the symbol space, thus enhancing the compressibility of the image under many lossless compressors. Finding an exact reordered palette is exhaustive and computationally complex. A solution to this NP-hard problem is presented using Adaptive Particle Swarm Optimization (APSO) to achieve fast global convergence by maximizing the co-occurrences. A new algorithm with an improved inertia factor is presented here to accelerate the convergence speed of the reindexing scheme. In this algorithm, the key parameter, the inertia weight, is formulated as a factor of a gradient-based rate of particle convergence. Experimental results assert that the proposed modification improves APSO performance in terms of solution quality and convergence to the global optimum.
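The abstract does not spell out the exact update rule; purely as an illustration, a minimal sketch of a PSO velocity update with an adaptive inertia weight driven by a hypothetical convergence-rate factor k (the paper derives k from a gradient-based measure not reproduced here) might look like the following.

```python
import numpy as np

def apso_step(positions, velocities, pbest, gbest, k, w_max=0.9, w_min=0.4,
              c1=2.0, c2=2.0, rng=np.random.default_rng()):
    """One velocity/position update with an adaptive inertia weight.

    `k` is a hypothetical per-swarm rate-of-convergence factor in [0, 1];
    how the paper computes it from gradients is not reproduced here.
    """
    w = w_min + (w_max - w_min) * (1.0 - k)   # assumption: faster convergence -> smaller inertia
    r1, r2 = rng.random(positions.shape), rng.random(positions.shape)
    velocities = (w * velocities
                  + c1 * r1 * (pbest - positions)
                  + c2 * r2 * (gbest - positions))
    return positions + velocities, velocities
```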

 

Key words: Reindexing, palette-indexed image, Cross Entropy (CE), rate of particle convergence (k), improved Inertia Weight Adaptive Particle Swarm Optimization (IWAPSO).

Received April 3, 2013; accepted November 10, 2014

Full Text

 

 

 

Sunday, 17 August 2014 08:03

Two Layer Defending Mechanism against DDoS Attacks

Kiruthika Subramain, Preetha Gunasekaran, and Mercy Selvaraj

Department of Computer Science and Engineering, Thiagarajar College of Engineering,

Affiliated to Anna University, India

Abstract: Distributed Denial of Service (DDoS) attackers make a service unavailable to its intended users. Attackers use IP spoofing as a weapon to disguise their identity. The spoofed traffic follows the same principles as normal traffic, so detection and filtering are essential. The Hop-Count Filtering (HCF) scheme identifies packets whose source IP address is spoofed. The information about a source IP address and its corresponding hop count from the server (victim) is recorded in a table at the victim, and each incoming packet is checked against this table for authenticity. The design of the IP2HC table reduces the required storage space through IP address clustering. The proposed work filters the majority of the spoofed traffic with an HCF-SVM algorithm at the network layer. DDoS attackers using genuine IP addresses are subjected to a traffic limit at the application layer. The two-layer defense approach protects legitimate traffic from being denied, thereby mitigating DDoS effectively. The HCF-SVM model yields 98.99% accuracy with a reduced false positive rate, and the rate limiter punishes aggressive flows while providing sufficient bandwidth for legitimate users without any denial of service. The proposed work is implemented on an experimental testbed.
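For illustration, the core hop-count check behind HCF-style filters can be sketched as follows; the IP2HC clustering and the SVM classification stage described in the paper are not reproduced, and `ip2hc` is a hypothetical lookup table.

```python
# A minimal sketch of the hop-count consistency check used by HCF-style filters.
COMMON_INITIAL_TTLS = (30, 32, 60, 64, 128, 255)

def hop_count(observed_ttl: int) -> int:
    """Infer hop count as (initial TTL - observed TTL), guessing the initial
    TTL as the smallest common OS default not below the observed value."""
    initial = next(t for t in COMMON_INITIAL_TTLS if t >= observed_ttl)
    return initial - observed_ttl

def looks_spoofed(src_ip: str, observed_ttl: int, ip2hc: dict, tolerance: int = 1) -> bool:
    expected = ip2hc.get(src_ip)
    if expected is None:
        return False          # unknown source: defer to the SVM / rate-limiting stage
    return abs(hop_count(observed_ttl) - expected) > tolerance

# Example: a packet claiming 10.0.0.1 with TTL 119 (9 hops from a 128 default)
print(looks_spoofed("10.0.0.1", 119, {"10.0.0.1": 9}))   # False
```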

 Keywords: DDoS, hop-count, IP2HC table, clustering, IP spoofing, testbed.

Received November 9, 2012; accepted April 29, 2013

Full Text

 

 

Sunday, 17 August 2014 08:00

An Improved Iterative Segmentation Algorithm using Canny Edge Detector for Skin Lesion Border Detection

J. H. Jaseema Yasmin1 and M. Mohamed Sathik2

1Associate Professor, National College of Engineering, India

2Principal, Sadakathullah Appa College, India

Abstract: Boundary detection is one of the difficult problems in image processing and pattern analysis, in particular in medical imaging applications. Detecting skin lesion boundaries accurately allows skin cancer detection. There is no unified approach to this problem, which has been found to be application dependent. Early diagnosis of melanoma is a challenge, especially for general practitioners, as melanomas are hard to distinguish from common moles, even for experienced dermatologists. Melanoma can be cured by simple excision when diagnosed at an early stage. Our proposed improved iterative segmentation algorithm using the Canny edge detector, a simple and effective method to find the border of real skin lesions, is presented; it helps in the early detection of malignant melanoma, and its performance is compared with the segmentation algorithm using the Canny detector [16] that we developed previously for border detection of real skin lesions. The experimental results demonstrate successful border detection of noisy real skin lesions by the proposed improved iterative segmentation algorithm. We conclude that the proposed algorithm segments the lesion from the image even in the presence of noise, for a variety of lesions and skin types, and that its performance is more reliable than our previous Canny-based segmentation algorithm [16] for border detection of noisy real skin lesions.
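As an illustration of the kind of Canny-based pipeline involved (not the authors' exact algorithm, whose iterative refinement and threshold tuning are not reproduced), a single-pass OpenCV sketch might look like this.

```python
import cv2
import numpy as np

def lesion_border(gray: np.ndarray, low: int = 50, high: int = 150) -> np.ndarray:
    """Return the largest closed contour found from Canny edges (single pass only)."""
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)                 # suppress hair/noise before edge detection
    edges = cv2.Canny(blurred, low, high)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)   # bridge gaps in the edge map
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else np.empty((0, 1, 2), dtype=np.int32)
```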

Keywords: Melanoma, canny edge detector, border detection, segmentation, skin lesion

Received April 15, 2012; accepted February 13, 2013

Full Text

 

 

Sunday, 17 August 2014 07:56

VLSI Implementations of Compressive Image Acquisition Using Block Based Compression Algorithm

Muthukumaran N and Ravi Ramraj

Francis Xavier Engineering College, Tirunelveli-627003, India.

Abstract: This research paper deals with compressing images at the pixel level before the storage process, so that the required memory size is reduced. This is done by the proposed block based compression algorithm, which uses a differential coding scheme. Here differential values are captured and then quantized. The differential coding scheme selects the brightest pixel as the reference pixel; the difference between the brightest pixel and each subsequent pixel is calculated and quantized. Hence, the dynamic range is compressed and the spatial redundancy can be removed using the block based compression algorithm. Thus, the proposed scheme reduces the accumulation of error and also reduces the memory requirement, improving the Peak Signal to Noise Ratio (PSNR) value and reducing the Bits Per Pixel (BPP) value. The future scope of this work is to further improve image quality, with a higher peak signal to noise ratio, using other compression techniques.
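A minimal sketch of the differential coding idea described above, assuming a simple uniform quantization step (the paper's actual quantizer and block handling are not specified here), could be:

```python
import numpy as np

def encode_block(block: np.ndarray, step: int = 8):
    """Differential coding of one image block: the brightest pixel is the
    reference, the remaining pixels are stored as quantized differences.
    The uniform quantization step is an assumption."""
    ref = int(block.max())
    diff = ref - block.astype(np.int16)        # non-negative differences
    return ref, diff // step

def decode_block(ref: int, quantized: np.ndarray, step: int = 8) -> np.ndarray:
    """Approximate reconstruction (lossy because of quantization)."""
    return (ref - quantized * step).clip(0, 255).astype(np.uint8)

block = np.array([[200, 196], [190, 180]], dtype=np.uint8)
ref, q = encode_block(block)
print(decode_block(ref, q))
```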

 Key words: Image capture, image store, image compression, JPEG, PSNR, compression ratio, CMOS image sensor, VLSI implementation.

Received May 28, 2013; accepted September 26, 2013

Full Text

Sunday, 17 August 2014 07:38

Vulnerability Analysis of Two Ultralightweight RFID Authentication Protocols

Yousof Farzaneh1, Mahdi Azizi2, Masoud Dehkordi1, and Abdolrasoul Mirghadri2

 1School of Mathematics, Iran University of Science and Technology, Iran

2Faculty of Communication and Information Technology, IHU University, Iran

Abstract: Ultralightweight Radio Frequency Identification (RFID) authentication protocols are suitable for low-cost RFID tags with restricted computational power and memory space. Recently, Lee proposed two ultralightweight authentication protocols for low-cost RFID tags, namely the DIDRFID and SIDRFID protocols. The first protocol is based on a dynamic identity and the second on a static identity. Lee claimed that his protocols can resist tracking, replay, impersonation, and DoS attacks. In this paper, we show that Lee's protocols are not secure: they are vulnerable to tracking, impersonation, and full disclosure attacks. In particular, an adversary can accomplish an effective full disclosure attack on the DIDRFID protocol by eavesdropping on two consecutive sessions, thereby obtaining all the secret information stored on a tag. We also demonstrate that an adversary able to obtain the secret information of a single compromised tag in the SIDRFID protocol can get the secret information of other tags and completely control the whole RFID system.

 Keywords: Low-cost RFID, cryptography, protocol, vulnerability.

Received August 8, 2012; accepted July 28, 2013

Full Text

 

 

Sunday, 17 August 2014 07:35

Efficient Multimodal Biometric Database Construction and Protection Schemes

Kebbeb Asma1, Mostefai Messaoud1, Benmerzoug Fateh1, and Youssef Chahir2

1MSE Laboratory, University of Bordj Bou Arreridj, Algeria

2GREYC Laboratory, University of Caen, France

Abstract: This work proposes an efficient approach for the construction and protection of a dynamic and evolutionary multimodal biometric database. The database is dedicated to a biometric authentication system operating on a set of connected sites. For better protection of the acquired data, a topological watermarking module is developed to conceal the links between the enrolled persons' files.

 Keywords: Biometric databases, multimodal authentication, digital watermarking, cross-section topology.

 Received November 30, 2012; accepted February 21, 2013        

Full Text

 

 

Sunday, 17 August 2014 07:29

Pairwise Sequence Alignment using Bio-Database Compression by Improved Fine Tuned Enhanced Suffix Array

Kunthavai A1, Vasantharathna S2, and  Thirumurugan S3

1,3Department of Computer Science & Engineering / IT, Coimbatore Institute of Technology, India.

2Department of Electrical & Electronics Engineering, Coimbatore Institute of Technology, India.

Abstract: Sequence alignment is a bioinformatics application that determines the degree of similarity between nucleotide sequences which are assumed to share ancestral relationships. A sequence alignment method reads a query sequence from the user, aligns it against large genomic sequence data sets, and locates targets that are similar to the input query sequence. Existing accurate algorithms, such as Smith-Waterman and FASTA, are computationally very expensive, which limits their use in practice. Existing search tools, such as BLAST and WU-BLAST, employ heuristics to improve the speed of such searches; however, these heuristics can sometimes miss targets, which in many cases is undesirable. Considering the rapid growth of database sizes, this problem demands ever-growing computational resources and remains a computational challenge. Most common sequence alignment tools, such as BLAST, WU-BLAST, and SCT, search a given query sequence against a set of database sequences. In this paper, the BioDBMPHF tool has been developed to find pairwise local sequence alignments by preprocessing the database. Preprocessing is done by finding the Longest Common Substring (LCS) from the database sequences that have the highest local similarity with a given query sequence, and the size of the database is reduced based on frequent common subsequences. In the BioDBMPHF tool, a fine-tuned enhanced suffix array is constructed and used to find the LCS. Experimental results show that the HashIndex algorithm reduces the time and space complexity of accessing the LCS. The time complexity of finding the LCS with the HashIndex algorithm is O(2 + γ), where γ is the time taken to access the pattern. The space complexity of the fine-tuned enhanced suffix array is 5n bytes (5 bytes per character) for the reduced enhanced Lcp table, and 32 bytes are required to store the bucket table. A data mining technique is used to cross-validate the results. It is shown that the developed BioDBMPHF tool effectively compresses the database and obtains the same results as the traditional algorithms in approximately half the time, thereby reducing the time complexity.
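For context, the LCS (longest common substring) step can be illustrated with a simple dynamic-programming baseline; the paper instead builds a fine-tuned enhanced suffix array with a minimal perfect hash index, which scales far better on genomic data and is not reproduced here.

```python
def longest_common_substring(a: str, b: str) -> str:
    """O(len(a)*len(b)) dynamic-programming baseline for the longest common substring."""
    best_len, best_end = 0, 0
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1      # extend the common run ending at (i, j)
                if cur[j] > best_len:
                    best_len, best_end = cur[j], i
        prev = cur
    return a[best_end - best_len:best_end]

print(longest_common_substring("ACGTACGGT", "GTACGA"))   # "GTACG"
```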

 Keywords: Sequence alignment, enhanced suffix array, compression, minimum perfect hash function, data mining

Received October 25, 2012; accepted January 1, 2013

 

 

Sunday, 17 August 2014 07:26

Mining Closed and Multi-Supports-Based Sequential Pattern in High-Dimensional Dataset

Meng Han1,2, Zhihai Wang1, and Jidong Yuan1

1School of Computer and Information Technology, Beijing Jiaotong University, China

2School of Computer Science and Engineering, Beifang University of Nationalities, China

Abstract: Previous mining algorithms for high dimensional datasets, such as biological datasets, produce very large pattern sets that include short and discontinuous sequential patterns. Such patterns carry little useful information. Mining sequential patterns in such sequences needs to consider different forms of patterns, such as contiguous patterns, local patterns that appear more than once in a particular sequence, and so on. Mining closed patterns leads to a more compact result set and also to better efficiency. In this paper, a novel algorithm based on bi-directional extension and multiple supports is provided specifically for mining contiguous closed patterns in high dimensional datasets. Three kinds of contiguous closed sequential patterns are mined: sequential patterns, local sequential patterns, and total sequential patterns. Thorough experiments on biological sequences demonstrate that the proposed algorithm reduces memory consumption and generates compact patterns. A detailed analysis of the multi-supports-based results is provided in this paper.
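To make the notion of contiguous closed patterns concrete, here is a naive enumeration sketch (not the paper's bi-directional extension algorithm, and using a single support threshold rather than multiple supports): a pattern is kept only if no longer contiguous pattern has the same support.

```python
from collections import Counter

def contiguous_closed_patterns(sequences, min_support=2, max_len=6):
    """Enumerate contiguous patterns and keep only the closed frequent ones."""
    support = Counter()
    for seq in sequences:
        seen = {tuple(seq[i:j])
                for i in range(len(seq))
                for j in range(i + 1, min(i + max_len, len(seq)) + 1)}
        support.update(seen)                         # sequence-level support
    frequent = {p: c for p, c in support.items() if c >= min_support}
    return {p: c for p, c in frequent.items()
            if not any(len(q) > len(p) and c == cq and _contains(q, p)
                       for q, cq in frequent.items())}

def _contains(whole, part):
    return any(whole[i:i + len(part)] == part for i in range(len(whole) - len(part) + 1))

print(contiguous_closed_patterns(["ABCAB", "ABCD"]))   # {('A', 'B', 'C'): 2}
```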

 Keywords: High-dimensional dataset, closed pattern, contiguous pattern, multi-supports, biological sequences.

 

Received January 11, 2012; accepted April 29, 2013

Full Text

 

 

Sunday, 17 August 2014 07:21

Solving QBF with Heuristic Small-World Optimization Search Algorithm

  Tao Li1 and Nanfeng Xiao2

  1Modern Education and Technology Center, South China Agricultural University, China

2School of Computer Science and Engineering, South China University of Technology, China

Abstract: In this paper, we use the Gaifman graph to describe the topological structure of Quantified Boolean Formulae (QBF), and we mainly study the formula family with a small-world network topology. We analyze the traditional Davis-Putnam-Logemann-Loveland (DPLL) solving algorithm for QBF, improve it, and propose a solving algorithm framework based on a small-world optimization search algorithm, which is applied to determine the order of the DPLL branch variables. Our results show that the small-world optimization search algorithm is effective, to a certain degree, in improving solving efficiency. It is valuable as an incomplete solution algorithm for search-based solvers.
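For orientation only, the following is a compact DPLL skeleton for plain propositional SAT (not full QBF) showing where a branching-order heuristic plugs in; the paper replaces such a heuristic with a small-world optimization search over the formula's Gaifman graph, which is not reproduced here.

```python
def dpll(clauses, assignment, order_heuristic):
    """DPLL search; `order_heuristic` picks the next branching variable."""
    clauses = simplify(clauses, assignment)
    if clauses is None:
        return None                      # conflict
    if not clauses:
        return assignment                # all clauses satisfied
    var = order_heuristic(clauses, assignment)
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value}, order_heuristic)
        if result is not None:
            return result
    return None

def simplify(clauses, assignment):
    """Drop satisfied clauses and assigned literals; None signals a conflict."""
    out = []
    for clause in clauses:
        lits, satisfied = [], False
        for lit in clause:
            var, want = abs(lit), lit > 0
            if var in assignment:
                satisfied |= assignment[var] == want
            else:
                lits.append(lit)
        if satisfied:
            continue
        if not lits:
            return None                  # empty clause -> conflict
        out.append(lits)
    return out

def most_frequent(clauses, assignment):
    """Stand-in heuristic: branch on the most frequent unassigned variable."""
    counts = {}
    for clause in clauses:
        for lit in clause:
            counts[abs(lit)] = counts.get(abs(lit), 0) + 1
    return max(counts, key=counts.get)

print(dpll([[1, -2], [-1, 2], [2, 3]], {}, most_frequent))
```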

 Keywords: QBF, small-world, search algorithm, optimization algorithm.

Received July 26, 2012; accepted February 11, 2013

Full Text

 

 

Sunday, 17 August 2014 07:17

Balanced Workload Clusters for Distributed Object Oriented Software

 
Hebat-Allah M. Ragab1, Amany Sarhan1, Al Sayed A. H. Sallam1, and Reda A. Ammar2

1 Computer and Control Engineering Department, Faculty of Engineering, Tanta University, Egypt

2 Computer Science and Engineering Department, School of Engineering, University of Connecticut, USA

 Abstract: When clustering objects to be allocated to a number of nodes, most research focuses only on either the communication cost between clusters or the balancing of the workload on the nodes. Load balancing is a technique to distribute workload evenly across two or more computers, network links, CPUs, hard drives, or other resources, in order to obtain optimal resource utilization, maximize throughput, minimize response time, and avoid overload. In this paper, we introduce three clustering algorithms that obtain balanced clusters for homogeneous clustered systems with minimized communication cost.
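As a point of reference, the balance objective alone can be sketched with a greedy longest-processing-time heuristic; the paper's three algorithms additionally minimize inter-cluster communication cost, which this sketch does not model.

```python
import heapq

def balance_clusters(workloads, k):
    """Greedy LPT assignment: each object goes to the currently lightest cluster."""
    heap = [(0.0, i, []) for i in range(k)]          # (total load, cluster id, members)
    heapq.heapify(heap)
    for obj, load in sorted(workloads.items(), key=lambda kv: -kv[1]):
        total, cid, members = heapq.heappop(heap)
        members.append(obj)
        heapq.heappush(heap, (total + load, cid, members))
    return {cid: (total, members) for total, cid, members in heap}

print(balance_clusters({"o1": 5, "o2": 4, "o3": 3, "o4": 3, "o5": 2}, 2))
```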

Keywords: Load balancing, distributed systems, software restructuring, clustering algorithms.

 Full Text

 


 

Sunday, 17 August 2014 07:09

An Accelerometer-Based Approach to Evaluate 3D Unistroke Gestures

Tahir Madani1,2,*, Muhammad Tahir3, Sheikh Ziauddin1,  Syed Raza1, Mirza Ahmed1, Majid  Khan1, and Mahmood Ashraf 4,5

1Department of Computer Science, COMSATS Institute of Information Technology, Pakistan

2Department of Information Technology, University Technology Petronas, Malaysia

3Faculty of Computing and Information Technology, King Abdulaziz University, Kingdom of Saudi Arabia

4Department of Computer Science, Science and Technology, Pakistan

5Department of Software Engineering, Universiti Teknologi, Malaysia

 

Abstract: This paper presents an evaluation of Three Dimensional (3D) unistroke human arm gestures. Our scheme employs an accelerometer-based approach by using Nintendo™ Wiimote as a gesture device. The system takes acceleration signals from Wiimote in order to classify different gestures. It deals with numeric gestures, i.e., digits from 0 to 9 and simple mathematical operator gestures for addition, subtraction, multiplication and division. Two techniques, Dynamic Time Warping (DTW) and 2D trajectories are used to recognize and classify gestures. Successful recognition rates indicate that performing 3D gestures using accelerometer-based devices is intuitive and provides an effective means of interaction with computers.
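To illustrate the DTW component, a standard dynamic time warping distance with a nearest-template classifier is sketched below; the paper's feature extraction and its complementary 2D-trajectory technique are not reproduced, and the random templates are placeholders.

```python
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping between two acceleration sequences (frames x axes)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])      # local distance between frames
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])

def classify(sample, templates):
    """Return the label of the nearest template under DTW distance."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))

templates = {"0": np.random.randn(40, 3), "1": np.random.randn(35, 3)}
print(classify(np.random.randn(38, 3), templates))
```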

 Keywords: Human computer interaction, 3D gestures, accelerometer, 3D calculator, dynamic time warping, 2D trajectories, wiimote.

Received May 29, 2012; accepted September 26, 2013

Full text

 


 

Sunday, 17 August 2014 07:03

An Artificial Neural Network Approach for Sentence Boundary Disambiguation in Urdu Language Text

Shazia Raj, Zobia Rehman, Sonia Rauf, Rehana Siddique, and  Waqas Anwar

Department of Computer Science, COMSATS Institute of Information Technology, Pakistan

Abstract: Sentence boundary identification is an important step for text processing tasks such as machine translation, POS tagging, and text summarization. In this paper, we present an approach comprising a feed-forward neural network along with part-of-speech information of the words in a corpus. The proposed adaptive system has been tested after training it with varying data sizes and threshold values. The best results our system produced are 93.05% precision, 99.53% recall, and 96.18% F-measure.
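As a quick sanity check, the reported F-measure is consistent with the balanced F-measure formula 2PR/(P+R):

```python
# Recompute the F-measure from the reported precision and recall.
precision, recall = 0.9305, 0.9953
f_measure = 2 * precision * recall / (precision + recall)
print(round(100 * f_measure, 2))   # 96.18, matching the abstract
```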

Keywords: Sentence boundary identification, feed-forward neural network, back propagation learning algorithm.

Received April 22, 2013; accepted September 19, 2013

Full text


 

Sunday, 17 August 2014 06:49

Distinguishing Attack on CSA

Kai Zhang and Jie Guan

Zhengzhou Information Science and Technology Institute, China

Abstract: The Common Scrambling Algorithm (CSA) has been used to encrypt European cable digital television signals since 1994. Although the key size of CSA is small, no effective cryptanalytic results that break the algorithm completely have been published so far. Based on the idea of the slide resynchronization attack, a distinguishing attack is proposed that can distinguish the keystream of the stream cipher from a purely random sequence with a computational complexity of O(2^15). Building on the distinguishing attack, the 64-bit initial key can be recovered with a computational complexity of O(2^55).

 

Keywords: DVB-CSA, distinguishing attack, slide resynchronization attack, hybrid cipher.

 

Received August 31, 2012; accepted February 23, 2014

 

 
