Tuesday, 29 December 2009 19:00

DDoS Incidents and their Impact: A Review

Monika Sachdeva1, Gurvinder Singh2, Krishan Kumar1, and Kuldip Singh3
1Department of Computer Science and Engineering, SBS College of Engineering and Technology, India
2Department of Computer Science and Engineering, Guru Nanak Dev University, India
3Department of Electronics and Computer Engineering, Indian Institute of Technology, India

Abstract: The phenomenal growth and success of the Internet has changed the way traditional essential services such as banking, transportation, medicine, education and defence are operated. They are being progressively replaced by cheaper and more efficient Internet-based applications. In the present era, the world is highly dependent on the Internet, which is considered the main infrastructure of the global information society. The availability of the Internet is therefore critical for the socio-economic growth of society. However, the inherent vulnerabilities of the Internet architecture provide opportunities for numerous attacks on its infrastructure and services. The distributed denial-of-service attack is one such attack, which poses an immense threat to the availability of the Internet. One of the biggest challenges facing researchers is obtaining details of these attacks, because most commercial sites, to avoid defamation, do not even reveal that they were attacked. In this paper, an overview of the distributed denial-of-service problem and the inherent vulnerabilities in the Internet architecture is provided. Real distributed denial-of-service incidents and their financial impact are critically analyzed, and finally the need for a comprehensive distributed denial-of-service solution is highlighted.

Keywords: Availability, botnet, DoS, DDoS, incident, vulnerability.

Received April 18, 2008; accepted June 8, 2008


Input Variable Selection Using Parallel Processing of RBF Neural Networks

Mohammed Awad
Faculty of Engineering and Information Technology, Arab American University, Palestine

Abstract: In this paper we propose a new technique focused on the selection of the important input variables for modelling complex systems in function approximation problems, in order to avoid the exponential increase in system complexity that is usual when dealing with many input variables. The proposed parallel processing approach is composed of complete radial basis function neural networks, each in charge of a reduced set of input variables depending on the general behaviour of the problem. For the optimization of the parameters of each radial basis function neural network in the system, we propose a new method that selects the most important input variables and is capable of deciding which of the chosen variables go alone or together to each radial basis function neural network to build the parallel structure, thus reducing the dimension of the input variable space for each network. We also provide an algorithm that automatically finds the most suitable topology of the proposed parallel processing structure and selects the most important input variables for it. Our goal, therefore, is to find the most suitable of the proposed families of parallel processing architectures in order to approximate a system from which a set of input/output data has been observed. The proposed parallel processing structure outperforms other algorithms not only with respect to the final approximation error but also with respect to the number of computation parameters of the system.
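The parallel decomposition is easiest to see for an additive target function, where each input variable can be assigned to its own one-dimensional sub-network. The sketch below is illustrative only: the target function, centres, and widths are assumptions, and each sub-network is fitted by ordinary least squares rather than by the paper's optimization method.

```python
import math

def gaussian_rbf(x, c, w):
    # Gaussian radial basis function centred at c with width w.
    return math.exp(-((x - c) ** 2) / (2 * w ** 2))

def solve(A, b):
    # Gaussian elimination with partial pivoting for a small dense system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fit_rbf_1d(xs, ys, centers, width):
    # Least-squares weights via the normal equations (Phi^T Phi) w = Phi^T y.
    Phi = [[gaussian_rbf(x, c, width) for c in centers] for x in xs]
    n = len(centers)
    A = [[sum(Phi[k][i] * Phi[k][j] for k in range(len(xs))) for j in range(n)]
         for i in range(n)]
    b = [sum(Phi[k][i] * ys[k] for k in range(len(xs))) for i in range(n)]
    w = solve(A, b)
    return lambda x: sum(wi * gaussian_rbf(x, c, width) for wi, c in zip(w, centers))

# Hypothetical additive target: each variable can go to its own sub-network.
target = lambda x, y: math.sin(x) + 0.5 * y * y
xs = [i / 10 for i in range(0, 31)]          # x samples in [0, 3]
ys = [j / 10 for j in range(0, 31)]          # y samples in [0, 3]
centers = [0.0, 0.75, 1.5, 2.25, 3.0]

# Sub-network 1 sees only x, sub-network 2 only y; their sum is the
# parallel structure's output.
net1 = fit_rbf_1d(xs, [math.sin(x) for x in xs], centers, 0.6)
net2 = fit_rbf_1d(ys, [0.5 * y * y for y in ys], centers, 0.6)
parallel = lambda x, y: net1(x) + net2(y)

err = max(abs(parallel(x, y) - target(x, y)) for x in xs for y in ys)
```

Two five-centre sub-networks here replace a single two-dimensional network that would need a full grid of centres, which is the dimensionality saving the abstract describes.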

Keywords: Parallel processing, input variable selection, radial basis function neural networks.

Received November 30, 2007; accepted May 12, 2008


An Enhancement of Major Sorting Algorithms

Jehad Alnihoud and Rami Mansi
Department of Computer Science, Al al-Bayt University, Jordan

Abstract: One of the fundamental issues in computer science is ordering a list of items. Although a huge number of sorting algorithms exist, the sorting problem has attracted a great deal of research, because efficient sorting is important for optimizing the use of other algorithms. This paper presents two new sorting algorithms: enhanced selection sort and enhanced bubble sort. Enhanced selection sort is an enhancement of selection sort that makes it slightly faster and stable. Enhanced bubble sort is an enhancement of both bubble sort and selection sort, with O(n lg n) complexity instead of the O(n^2) of bubble sort and selection sort. The two new algorithms are analyzed, implemented, tested, and compared, and the results are promising.
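For reference, the classical baselines that the two enhanced algorithms build on can be sketched with swap counters; the enhanced variants themselves are not reproduced here.

```python
def selection_sort(a):
    # Classic selection sort: O(n^2) comparisons, at most n-1 swaps,
    # and unstable in general (the paper's enhanced variant adds stability).
    a, swaps = list(a), 0
    for i in range(len(a) - 1):
        m = min(range(i, len(a)), key=a.__getitem__)
        if m != i:
            a[i], a[m] = a[m], a[i]
            swaps += 1
    return a, swaps

def bubble_sort(a):
    # Classic bubble sort with early exit when a pass makes no swap.
    a, swaps = list(a), 0
    for n in range(len(a), 1, -1):
        swapped = False
        for j in range(n - 1):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swaps += 1
                swapped = True
        if not swapped:
            break
    return a, swaps
```

Counting swaps alongside comparisons is what lets the paper compare its variants against these baselines.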

Keywords: Enhanced selection sort, enhanced bubble sort, selection sort, bubble sort, number of swaps, time complexity.  


Received May 27, 2008; accepted September 1, 2008

 


Performance of OCDMA Systems Using Random Diagonal Code for Different Decoders Architecture Schemes

Hilal Fadhil, Syed Aljunid, and Badlished Ahmed
School of Computer and Communication, University Malaysia Perlis, Malaysia


Abstract: In this paper, a new code family, called random diagonal code, is constructed for spectral-amplitude coding optical code division multiple access networks. The random diagonal code is constructed using a code level and a data level. One of the important properties of this code is that the cross-correlation at the data level is always zero, which means that phase intensity induced noise is reduced. We find that the performance of the random diagonal code is better than that of the modified frequency hopping and Hadamard codes. It has been observed through simulation and theoretical calculation that the bit-error rate of the random diagonal code is significantly better than that of the other codes. We analyze the performance of the proposed codes and examine how the code size and correlation properties are related. Three different decoding schemes are used for implementing the system: thin film filter, arrayed waveguide grating (AWG), and fiber Bragg grating. Simulation results show that for a low number of channels (three users), the thin film and AWG filters perform well, but fiber Bragg grating filters have higher dispersion than the others, which could compromise the goal of a 10 Gbit/s channel.
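The zero cross-correlation property at the data level can be illustrated directly: in-phase cross-correlation of two binary code words is simply the number of chip positions where both hold a '1'. The code words below are hypothetical, not the actual random diagonal construction.

```python
def cross_correlation(c1, c2):
    # In-phase cross-correlation of two binary (0/1) code words:
    # the number of chip positions where both codes have a '1'.
    return sum(a & b for a, b in zip(c1, c2))

# Hypothetical data-level segments with disjoint '1' positions,
# giving the zero cross-correlation the abstract describes.
user1_data = [1, 0, 0, 1, 0, 0]
user2_data = [0, 1, 0, 0, 1, 0]
user3_data = [0, 0, 1, 0, 0, 1]
```

Zero cross-correlation between users means no user's data chips overlap another's, which is why the phase intensity induced noise contribution is suppressed.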

Keywords: Optical code division multiple access, bit-error rate, phase intensity induced noise, SNR, and modified frequency hopping.

Received April 21, 2008; accepted June 8, 2008


Semiotics Explorations on Designing the Information Intensive Web Interfaces

Muhammad Islam1, Mohsin Ali1, Ali Al-Mamun2, and Motaharul Islam3
1Department of Computer Science and Engineering, Khulna University of Engineering & Technology, Bangladesh
2Department of Computer Science and Information Technology, Islamic University of Technology, Bangladesh
3Training and Instrumentation Division, University Grants Commission, Bangladesh

Abstract: The growth of technological innovations, Internet developments, and their web applications has raised the issue of keeping web interfaces understandable. Moreover, a need is also felt for suitable and coherent guidelines for designing interfaces that improve users' interpretation of web signs. Such design principles are semiotic by nature, and semiotics is the science of signs, that is, of the meanings of representations. New and important perspectives for interface design can therefore be discovered through semiotic analysis of interface signs. This research mainly focuses on the valuable insights that semiotic analysis can offer into the fundamental concepts for creating understandable signs. Its principal role is to provide web designers with a semiotic background, presenting complete semiotic explanations for a particular web domain together with semiotic golden rules that will help them design web interface signs that are comprehensible and usable to work with.

Keywords: Usability, human computer interaction, web interface, empirical study.

Received May 4, 2008; accepted June 8, 2008


Low Latency, High Throughput, and Less Complex VLSI Architecture for 2D-DFT

Sohil Shah, Preethi Venkatesan, Deepa Sundar, and Muniandi Kannan
 Department of Electronics Engineering, Anna University, India

Abstract: This paper proposes a highly concurrent, pipelined, systolic architecture for two-dimensional discrete Fourier transform computation. The architecture consists of two one-dimensional discrete Fourier transform blocks connected via an intermediate buffer. The proposed architecture offers low latency as well as high throughput and can perform both one- and two-dimensional discrete Fourier transforms. The architecture supports transform lengths that are not powers of two and not based on products of co-prime numbers. The simulation and synthesis were carried out using the Cadence tools NcSim and RTL Compiler, respectively, with 180 nm libraries.
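The row-column structure of the architecture, two 1-D DFT blocks with an intermediate buffer, can be sketched in software. A direct 1-D DFT is used, which, like the architecture, accepts lengths that are not powers of two.

```python
import cmath

def dft_1d(x):
    # Direct 1-D DFT; works for any length N, not just powers of two.
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def dft_2d(img):
    # Row-column decomposition: 1-D DFT of every row, buffer the result,
    # then 1-D DFT of every column -- mirroring two 1-D blocks connected
    # by an intermediate buffer.
    rows = [dft_1d(r) for r in img]                      # first 1-D block
    cols = [dft_1d([rows[i][j] for i in range(len(rows))])
            for j in range(len(rows[0]))]                # second 1-D block
    return [[cols[j][i] for j in range(len(cols))] for i in range(len(cols[0]))]

# Length-3 transform: a size the abstract's claim explicitly covers.
F = dft_2d([[1, 1, 1], [1, 1, 1], [1, 1, 1]])
```

A constant image concentrates all energy in the DC bin, a quick sanity check on the decomposition.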

Keywords: Digital signal processing chip, discrete Fourier transforms, systolic array, and very-large-scale integration circuit.

Received February 22, 2008; accepted June 8, 2008


Specification and Prototyping of Reactive Distributed Systems with Algebraic Term Nets

Faiza Bouchoul and Mohammed Mostefai
Laboratory of Automatics of Sétif, Ferhat Abbas University, Algeria

Abstract: The specification of the dynamic behaviour of reactive distributed systems must exhibit the structures of control and must make explicit the relevant aspects of distribution, such as concurrency, reactivity and the interaction between entities. Among the most common reactive distributed systems we can cite industrial ones; distributed networks occur, for example, in telecommunications, the Internet, power and energy, transportation and manufacturing. Distributed computing will play an increasingly critical role in the global industrial infrastructure, and the need for trustworthy systems has received tremendous attention from researchers. The use of formal tools for simulation and prototyping, designed to facilitate the modelling of such systems, is therefore of great interest, and improved methods are needed to ensure the reliability, security and robustness of industrial distributed systems. This paper proposes the fundamentals of a formal approach, based on the object-oriented paradigm, for the specification of reactive distributed systems. Object behaviour is modelled with REACTNets, which enhance ECATNets, a kind of high-level algebraic Petri net, with explicit distribution and reactivity. We associate MAUDE rules with the classic ECATNets to handle interactions between objects. The two formalisms have a common semantics in terms of rewriting logic, so interesting prospects are opened for their integration.

Keywords: Reactive distributed systems, object oriented paradigm, rewriting logic, ECATNets, Maude, rapid prototyping.

 Received February 5, 2008; accepted September 1, 2008


Variable Rate Steganography in Gray Scale Digital Images Using Neighborhood Pixel Information

Moazzam Hossain, Sadia Al Haque, and Farhana Sharmin
Department of Computer Science and Engineering, International Islamic University, Bangladesh

Abstract: Steganography is the art of hiding the fact that communication is taking place, by hiding information in other information. Security has been a major concern since time immemorial: in the past, people used hidden tattoos or invisible ink to convey steganographic content, while today digital technology and the Internet provide easy-to-use cover media for steganography. In order to improve security by giving the stego image imperceptible quality, three different steganographic methods for gray level images are presented in this paper. Four-neighbors, diagonal-neighbors and eight-neighbors methods are employed in our scheme. These methods utilize a pixel's dependency on its neighborhood and psychovisual redundancy to distinguish the smooth areas from the complicated areas in the image. In smooth areas we embed three bits of secret information; in the complicated areas, a variable number of bits is embedded. The experimental results show that the proposed methods achieve much higher visual quality, as indicated by the high peak signal-to-noise ratio, in spite of hiding a larger number of secret bits in the image. In addition, to embed this large amount of secret information, at most half of the total number of pixels in an image is used. Moreover, extraction of the secret information is independent of the original cover image.
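The four-neighbors idea can be sketched as follows; the smoothness threshold and the per-pixel capacities are illustrative assumptions, not the paper's exact values.

```python
def embed_capacity(img, r, c, threshold=8):
    # Four-neighbours method (illustrative): a pixel whose neighbours are
    # close in value lies in a smooth area and carries 3 bits; a pixel in
    # a complicated area can carry more, since changes there are less
    # perceptible to the eye.
    neighbours = [img[r - 1][c], img[r + 1][c], img[r][c - 1], img[r][c + 1]]
    avg = sum(neighbours) / 4.0
    diff = max(abs(n - avg) for n in neighbours)
    return 3 if diff < threshold else 4   # variable embedding rate

def embed_bits(pixel, bits):
    # Replace the k least-significant bits of the pixel with secret bits.
    k = len(bits)
    return (pixel & ~((1 << k) - 1)) | int(bits, 2)

smooth = [[100] * 3 for _ in range(3)]            # uniform neighbourhood
busy = [[0, 0, 0], [0, 100, 255], [0, 255, 0]]    # high-contrast neighbourhood
```

Because the capacity decision depends only on the neighbours, not on the embedded pixel itself, the extractor can recompute it from the stego image, which is how extraction stays independent of the original cover image.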

Keywords: Information security, steganography, data hiding, stego image, cover image.

 Received October 27, 2007; accepted February 11, 2008


Ontology-Based Intelligent Mobile Search Oriented to Global e-Commerce

Abdelkader Dekdouk
 Département d’Informatique, Université Es-Sénia, Algeria

Abstract: In this paper we propose a novel approach for searching eCommerce products using a mobile phone, illustrated by a prototype, eCoMobile. This approach aims to globalize mobile search by integrating the concept of user multilingualism; to illustrate it, we deal in particular with the English and Arabic languages. The mobile user can formulate a query on a commercial product in either language (English/Arabic). The description of the information need relies on an ontology that represents the conceptualization of the product catalogue knowledge domain, defined in both English and Arabic. A query expressed on the mobile device client gives the concept that corresponds to the name of the product, followed by a set of (property, value) pairs specifying the characteristics of the product. Once a query is submitted, it is communicated to the server side, which analyses it and in turn performs an HTTP request to an eCommerce application server (such as Amazon.com). The latter responds by returning an XML file representing a set of elements, where each element defines an item of the searched product with its specific characteristics. The XML file is analyzed on the server side, and the items are then displayed on the mobile device client, along with their relevant characteristics, in the chosen language.
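The server-side XML analysis step can be sketched with the standard library. The element names below are hypothetical, since the actual response schema of a commercial server is not given in the abstract.

```python
import xml.etree.ElementTree as ET

# Hypothetical response format; a real eCommerce server's schema
# (e.g., Amazon's) differs -- this only illustrates the parsing step.
response = """<items>
  <item><name>Laptop</name><price>899</price><brand>Acme</brand></item>
  <item><name>Laptop</name><price>1099</price><brand>Foo</brand></item>
</items>"""

def parse_items(xml_text):
    # Turn each <item> element into a dict of (property, value) pairs,
    # ready to be rendered on the mobile client in the chosen language.
    root = ET.fromstring(xml_text)
    return [{child.tag: child.text for child in item} for item in root]

items = parse_items(response)
```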

Keywords: Mobile computing, search engine, multilingual global eCommerce, ontology, XML.

Received March 10, 2008; accepted June 08, 2008


A Quantum-Inspired Differential Evolution Algorithm for Solving the N-Queens Problem

Amer Draa1, Souham Meshoul2, Hichem Talbi3, and Mohamed Batouche2
1Computer Science Department, Mentouri University, Algeria
2Computer Science Department, King Saud University, Saudi Arabia
3OE Faculty, Emir Abdelkader University, Algeria

Abstract: In this paper, a quantum-inspired differential evolution algorithm for solving the N-queens problem is presented. The N-queens problem aims at placing N queens on an NxN chessboard in such a way that no queen can capture any of the others. The proposed algorithm is a novel hybridization of differential evolution algorithms and quantum computing principles. Accordingly, differential evolution has been enhanced by the adoption of quantum concepts such as quantum bits and the superposition of states. The use of quantum interference has allowed this hybrid approach to achieve remarkable efficiency and good results.
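The objective that such evolutionary approaches minimize is the number of attacking queen pairs. A minimal random-restart baseline over a permutation encoding is sketched below; it stands in for, and is much weaker than, the paper's quantum-inspired differential evolution operators.

```python
import random

def conflicts(board):
    # Number of attacking queen pairs. With a permutation encoding
    # (one queen per row and per column) only diagonals can clash.
    n = len(board)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(board[i] - board[j]) == j - i)

def random_restart_search(n, tries=20000, seed=1):
    # Baseline stochastic search over permutations; evolutionary and
    # quantum-inspired methods replace this loop with guided variation.
    rng = random.Random(seed)
    best = list(range(n))
    for _ in range(tries):
        cand = rng.sample(range(n), n)
        if conflicts(cand) < conflicts(best):
            best = cand
        if conflicts(best) == 0:
            break
    return best
```

The fitness landscape is what any such metaheuristic navigates; a solution is any board with zero conflicts.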

Keywords: Quantum computing, differential evolution, N-queens problem, combinatorial optimization.

Received March 20, 2008; accepted July 8, 2008


Modelling of Updating Moving Object Database Using Timed Petri Net Model

Hatem Abdul-Kader and Warda El-Kholy
Faculty of Computers and Information, Menoufya University, Egypt

Abstract: Tracking moving objects is one of the most common requirements of location-based applications. The location of a moving object changes continuously, but the database location of the moving object cannot be updated continuously. Modelling of such a moving object database should be considered to facilitate study of the performance and design parameters of this database feature; such study is essential for selecting the optimal solution that minimizes the implementation overhead cost. The location updating strategy for this type of database is the most important criterion. This paper proposes a timed Petri net model based on one of the most common updating strategies, namely the distance updating strategy. In addition, a method for estimating the time needed to update a moving object database, using the concept of the minimum cycle time in timed Petri nets, is presented. This time is the main criterion for studying the communication overhead cost of a moving object database. A typical numerical example is given to demonstrate the advantages of the proposed modelling technique.
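For a deterministic timed event graph (a timed Petri net in which every place has exactly one input and one output transition), the minimum cycle time is the maximum over all circuits of total firing delay divided by token count. The circuits and delays below are hypothetical stand-ins for the paper's distance-update model.

```python
def minimum_cycle_time(circuits):
    # For a deterministic timed event graph, the minimum cycle time is
    # max over circuits of (total firing delay) / (tokens in the circuit).
    return max(delay / tokens for delay, tokens in circuits)

# Hypothetical (total delay, tokens) pairs for the circuits of one update
# round, e.g., locate object -> compare with stored position -> write update.
circuits = [(12.0, 2), (9.0, 1), (20.0, 4)]
cycle_time = minimum_cycle_time(circuits)
```

The bottleneck circuit (here the single-token one with delay 9.0) dictates how often a database update can complete, which is the overhead-cost criterion the abstract refers to.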

Keywords: Updating moving object database, deterministic timed Petri net, deviation update policy, tracking moving object database.

Received June 15, 2008; accepted September 3, 2009


Identification of Promoter Region in Genomic DNA Using Cellular Automata Based Text Clustering

Kiran Sree1 and Ramesh Babu2
1Department of Computer Science, Jawaharlal Nehru Technological University, India
2Department of Computer Science, Acharya Nagarjuna University, India

Abstract: Identifying promoter regions plays a vital role in understanding human genes. This paper presents a new cellular automata based text clustering algorithm for identifying these promoter regions in genomic DNA. Experimental results confirm the applicability of the algorithm for identifying these regions, and we note an increase of 12 percent in the accuracy of finding promoter regions for DNA sequences of shorter length. The algorithm was also trained to identify promoter regions in mixed and overlapping DNA sequences. However, it fails to identify promoter regions of length greater than 54. The algorithm can also be used to predict RNA structure.

Keywords: Text clustering, DNA sequence, cellular automata.

Received May 10, 2008; accepted October 1, 2008


HW/SW Design-Based Implementation of Vector Median Rational Hybrid Filter

Anis Boudabous1, Ahmed Ben Atitallah1,3, Lazhar Khriji2, Patrice Kadionik3, and Nouri Masmoudi1
1Laboratory of Electronics and Information Technology, Tunisia
2Department of Electrical and Computer Engineering, Sultan Qaboos University, Oman
3IMS Laboratory, University Bordeaux I, France

Abstract: A new co-design implementation of the vector median rational hybrid filter, based on an efficient hardware/software partitioning, is introduced and applied to colour image filtering problems. This filter is used essentially to remove impulsive and Gaussian noise from colour images. In our design, we start by implementing the software solution in a system-on-programmable-chip context, using the NIOS-II softcore processor and µClinux as the operating system, and we evaluate the execution time of the whole filtering process. Then we add a hardware accelerator, implemented with a fast parallel architecture. Compared to the software-only results, the use of the hardware accelerator clearly improves the filtering speed while maintaining the good filtering quality, as shown by simulations.
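The core operation of the filter is the vector median: the pixel of a window whose summed distance to all other pixels is minimal. Only this core is sketched; the rational hybrid combination and the hardware mapping are not reproduced.

```python
def vector_median(window):
    # Vector median of a list of RGB pixels: the pixel whose summed
    # Euclidean distance to all other pixels in the window is minimal.
    # An impulsive outlier is never chosen, which is why the filter
    # suppresses impulse noise in colour images.
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    return min(window, key=lambda p: sum(dist(p, q) for q in window))

# A window of similar pixels plus one red impulse: the impulse is rejected.
window = [(10, 10, 10)] * 4 + [(255, 0, 0)]
```

In the accelerated design, it is this distance-accumulation loop, identical for every pixel pair, that maps naturally onto a fast parallel architecture.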

Keywords: Filtering, co-design, FPGA implementation, SoPC, NIOS-II processor.

Received January 1, 2008; accepted September 11, 2008


Optimal DSP Based Integer Motion Estimation Implementation for H.264/AVC Baseline Encoder

Imen Werda1, Haithem Chaouch1, Amine Samet2, Mohamed Ben Ayed2, and Nouri Masmoudi1
1 National School of Engineering, University of Sfax, Tunisia
2 High Institute of Electronics and Communication, University of Sfax, Tunisia

Abstract: The coding gain of the H.264/AVC video encoder comes mainly from its newly incorporated prediction tools; the penalties, however, are their enormous computation and ultra-high memory bandwidth. In this paper we present an approach supporting an efficient data reuse process, to avoid unnecessary memory accesses and redundant motion estimation computations, combined with a novel fast algorithm. A merging procedure joining search origin, search pattern and a new variable block size motion estimation for H.264/AVC is detailed. These approaches yield good trade-offs between motion estimation distortion and number of computations, since they exploit the centre-biased characteristics of real-world video sequences: a reliable predictor determines the search origin, localizing the search process; an efficient search pattern exploits structural constraints within the motion field; and a new fast block size selection DSP-based algorithm allows both fidelity of the video quality and reduction of the computational cost. Extensive experimental work has been done, the results of which show that our approach gives an average speed-up of 1.14 times over recent fast algorithms and 10 times over the spiral search algorithm, with a negligible degradation in peak signal-to-noise ratio. In addition, considerable memory bandwidth is saved with the proposed data reuse techniques at the architecture level.
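The idea of a predictor-centred, localized search can be sketched with sum-of-absolute-differences block matching. The search below is a plain full search in a small window around the predicted origin, not the paper's actual pattern, and the frames are synthetic.

```python
def sad(cur, ref, bx, by, dx, dy, bs=4):
    # Sum of absolute differences between a bs x bs block of the current
    # frame at (bx, by) and the block displaced by (dx, dy) in the reference.
    return sum(abs(cur[by + i][bx + j] - ref[by + dy + i][bx + dx + j])
               for i in range(bs) for j in range(bs))

def block_match(cur, ref, bx, by, origin=(0, 0), radius=3, bs=4):
    # Localized full search: examine every displacement within `radius`
    # of the predicted search origin and keep the lowest-SAD candidate.
    # A good predictor keeps `radius` small, which is the source of the
    # computation savings in centre-biased schemes.
    ox, oy = origin
    best_mv, best_cost = None, None
    for dy in range(oy - radius, oy + radius + 1):
        for dx in range(ox - radius, ox + radius + 1):
            cost = sad(cur, ref, bx, by, dx, dy, bs)
            if best_cost is None or cost < best_cost:
                best_mv, best_cost = (dx, dy), cost
    return best_mv, best_cost

# Reference frame with a smooth gradient; the current frame is the same
# scene shifted, so the true motion vector of the block is (2, 1).
ref = [[x * x + y * y for x in range(12)] for y in range(12)]
cur = [[(x + 2) ** 2 + (y + 1) ** 2 for x in range(12)] for y in range(12)]
mv, cost = block_match(cur, ref, 4, 4)
```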

Keywords: H.264/AVC, search centre, block matching algorithm, pattern search, variable block size, complexity, video quality, PSNR, SSIM.

 Received March 7, 2008; accepted September 1, 2008


Representing Uncertainty in Medical Knowledge: An Interval Based Approach for Binary Fuzzy Relations

Bassam Haddad and Ahmad Awwad
Faculty of Information Technology, Petra University, Jordan

Abstract: This paper addresses issues involved in the representation of causal relationships between medical categories. An interval based approach for medical binary fuzzy relations is proposed to represent the ignorance about uncertainty and imprecision. A major advancement offered by this model lies in formalizing some novel medical measures that enhance insight into the causality relationship between medical entities. This view is expressed as an extension of the classical fuzzy implication relationship in terms of an interval valued fuzzy inclusion relationship in the context of fuzzy binary relationships. The focus of attention of this model is on utilizing interval based fuzzy inclusion relationships as causality measures expressing the strength of the degree of inclusion between fuzzy sets. In addition, depending on the direction of an inclusion degree, an interval based causal relationship can be medically interpreted as the necessity or the sufficiency of the occurrence of a medical entity, such as a symptom or disease, with another one. Furthermore, for simplification of computations and defuzzification of dependent intervals, a method for the transformation of these relations into point-valued relations is proposed.
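A common point-valued fuzzy inclusion measure, deg(A ⊆ B) = |A ∩ B| / |A| with min-intersection, illustrates the directionality the abstract describes; the paper's interval-valued construction is not reproduced, and the membership grades below are hypothetical.

```python
def inclusion_degree(A, B):
    # A common fuzzy inclusion measure: deg(A subset-of B) = |A n B| / |A|,
    # with the intersection taken as the pointwise minimum.
    return sum(min(a, b) for a, b in zip(A, B)) / sum(A)

# Hypothetical membership grades of a symptom and a disease over five cases.
symptom = [0.9, 0.7, 0.4, 0.0, 0.8]
disease = [1.0, 0.6, 0.5, 0.2, 0.9]

# Reading the direction of inclusion: a high deg(symptom in disease) says
# the symptom is (close to) sufficient for the disease; deg(disease in
# symptom) measures how necessary the symptom is for the disease.
sufficiency = inclusion_degree(symptom, disease)
necessity = inclusion_degree(disease, symptom)
```

The two directions of the same min-intersection give different degrees, which is exactly the necessity/sufficiency asymmetry the model exploits.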

Keywords: Binary fuzzy relation, interval valued representation, medical knowledge representation, fuzzy inclusion measure, uncertainty, fuzzy logic.

 Received April 23, 2008; accepted September 1, 2009
