
Entropy Improvement for Fractal Image Coder

Jyh-Horng Jeng, Shuo-Li Hsu, and Yukon Chang
 Department of Information Engineering, I-Shou University, Taiwan
 
Abstract: Fractal Image Coder (FIC) makes use of the self-similarity inside a natural image to achieve a high compression ratio while maintaining good image quality. In FIC, the most important factor affecting the compression ratio and the image quality is the quantization of the contrast scaling and brightness offset coefficients. Most quantization methods treat the two coefficients independently and quantize them separately. However, the two coefficients are highly correlated and scatter around a line. In this paper, a joint coefficient quantization method is proposed that considers the two coefficients together and thereby achieves a better compression ratio and image quality. The proposed method is especially effective under parsimonious conditions. For example, using only 3 bits each to represent the contrast and brightness coefficients of Lena, the proposed method yields an image quality of 27.04 dB, which is significantly better than the 22.87 dB obtained with the traditional linear quantization method.
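The joint quantization idea admits a short sketch: rather than quantizing the contrast s and offset o independently, quantize s first, then quantize only the residual of o around a line o ≈ a·s + b fitted to training pairs, which is cheap to code because the residual spread is small. This is a minimal sketch of the general idea, not the authors' exact scheme; the least-squares line fit, bit budgets, and coefficient ranges are assumptions.

```python
import numpy as np

def fit_line(s, o):
    """Least-squares fit o ~ a*s + b over training (s, o) coefficient pairs."""
    a, b = np.polyfit(s, o, 1)
    return a, b

def joint_quantize(s, o, a, b, s_bits=3, o_bits=3,
                   s_range=(-1.0, 1.0), r_range=(-32.0, 32.0)):
    """Uniformly quantize s, then quantize the residual of o around the
    fitted line instead of o itself (assumed ranges, for illustration)."""
    s_levels, o_levels = 2 ** s_bits, 2 ** o_bits
    qs = np.clip(np.round((s - s_range[0]) / (s_range[1] - s_range[0])
                          * (s_levels - 1)), 0, s_levels - 1).astype(int)
    s_hat = s_range[0] + qs * (s_range[1] - s_range[0]) / (s_levels - 1)
    r = o - (a * s_hat + b)  # residual around the correlation line
    qr = np.clip(np.round((r - r_range[0]) / (r_range[1] - r_range[0])
                          * (o_levels - 1)), 0, o_levels - 1).astype(int)
    return qs, qr

def joint_dequantize(qs, qr, a, b, s_bits=3, o_bits=3,
                     s_range=(-1.0, 1.0), r_range=(-32.0, 32.0)):
    """Reconstruct (s, o) from the quantizer indices."""
    s_hat = s_range[0] + qs * (s_range[1] - s_range[0]) / (2 ** s_bits - 1)
    r_hat = r_range[0] + qr * (r_range[1] - r_range[0]) / (2 ** o_bits - 1)
    return s_hat, a * s_hat + b + r_hat
```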

Keywords: Fractal image coder, dihedral transformation, entropy, quantization, contrast adjustment, and brightness offset.


Received March 09, 2010; accepted October 24, 2010


Chromaticity Based Waste Paper Grade Identification

Mohammad Osiur Rahman, Aini Hussain, Noor Ezlin Ahmad Basri, Edgar Scavino,
Hassan Basri, and Mahammad Abdul Hannan
Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, Malaysia

 
Abstract: In recycling, waste papers are segregated into various grades as they are subjected to different recycling processes. Highly sorted paper streams facilitate high quality end products and save processing chemicals and energy. Automated paper sorting systems offer significant advantages over human inspection in terms of worker fatigue, throughput, speed, and accuracy. As a consequence, many automated mechanical and optical paper sorting methods were developed between 1932 and 2009 to meet paper sorting demand. Because of inadequate throughput and some major drawbacks of mechanical paper sorting systems, the popularity of optical paper sorting systems has increased. The implementation of the previous methods, while a step forward in large-volume automated sorting technology, is still complex, expensive and sometimes offers limited reliability. This research attempts to develop a smart vision sensing system that is able to separate the different grades of paper using chromaticity. For constructing the template database, the hue and saturation of the paper object image in a selected area are considered. The paper grade is identified based on the maximum occurrence of a specific template in the paper object image. The classification success rates for white paper, old newsprint paper and old corrugated cardboard are 95%, 92% and 90%, respectively. Finally, the best result of the proposed method is compared with the results published in the literature for waste paper grade identification systems developed using other methods. The remarkable achievement of the method is the accurate identification and dynamic sorting of all grades of paper using chromaticity, which is the best among the prevailing optical or electronic image-based techniques.
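A minimal sketch of the chromaticity step, under assumed details: a hue-saturation histogram serves as each grade's template and a probe image is assigned to the grade whose template it overlaps most. The bin counts, OpenCV-style value ranges (H in [0, 180), S in [0, 256)), and histogram intersection as the match score are illustrative assumptions, not the paper's exact template-occurrence procedure.

```python
import numpy as np

def hs_template(hsv_img, h_bins=18, s_bins=10):
    """Normalized 2-D hue-saturation histogram of a paper object image
    (H assumed in [0, 180), S in [0, 256), as in OpenCV)."""
    h = hsv_img[..., 0].ravel()
    s = hsv_img[..., 1].ravel()
    hist, _, _ = np.histogram2d(h, s, bins=[h_bins, s_bins],
                                range=[[0, 180], [0, 256]])
    return hist / hist.sum()

def classify(hsv_img, templates):
    """Assign the grade whose stored template best matches the probe,
    scored by histogram intersection (higher = more overlap)."""
    probe = hs_template(hsv_img)
    scores = {grade: np.minimum(probe, t).sum()
              for grade, t in templates.items()}
    return max(scores, key=scores.get)

# Usage sketch: templates = {"white": hs_template(w), "ONP": hs_template(n),
# "OCC": hs_template(c)}; grade = classify(probe_img, templates)
```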


Keywords: Waste paper sorting, grades of paper, and template matching.


Received April 6, 2010; accepted August 10, 2010


Location and Non-Location Based Ad-Hoc Routing Protocols under Various Mobility Models:
A Comparative Study

Misbah Jadoon1, Sajjad Madani1, Khizar Hayat1, and Stefan Mahlknecht2
1 Department of Computer Science, COMSATS Institute of Information Technology, Pakistan
2 Institute of Computer Technology, Vienna University of Technology, Austria


 
Abstract: In this paper we present a performance evaluation study of three fundamentally different ad-hoc routing protocols under different mobility patterns, with special focus on three well-known performance metrics, namely throughput, end-to-end delay, and packet loss. The simulation study is carried out in a standard simulator that provides a scalable simulation environment for wireless network systems. The comparative study entails three different protocols, namely the Dynamic Source Routing (DSR), the Location-Aided Routing (LAR), and the Wireless Routing Protocol (WRP). The mobility models employed in this study include the Random Way Point (RWP) mobility model, the Reference Point Group (RPG) mobility model, the Manhattan Grid (MG) mobility model, and the Gauss-Markov (GM) mobility model. The results show that the performance metrics of ad-hoc routing protocols vary significantly with the node mobility pattern. It is confirmed that the speed with which a node changes its position considerably affects network performance. Furthermore, it has been observed that a location-based routing protocol performs quite well across various mobility patterns.
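For reference, the three metrics can be computed from simulation traces roughly as follows; the trace representation (per-packet send and receive timestamps plus packet sizes) is an assumption made for illustration, not any particular simulator's log format.

```python
def compute_metrics(sent, recv, sizes, duration_s):
    """sent, recv: dicts mapping packet id -> timestamp in seconds;
    sizes: packet id -> size in bytes; duration_s: simulated time.
    Returns (throughput_bps, average_end_to_end_delay_s, loss_ratio)."""
    delivered = [pid for pid in sent if pid in recv]
    throughput = sum(sizes[pid] * 8 for pid in delivered) / duration_s
    avg_delay = (sum(recv[pid] - sent[pid] for pid in delivered)
                 / len(delivered)) if delivered else float('nan')
    loss = 1.0 - len(delivered) / len(sent) if sent else 0.0
    return throughput, avg_delay, loss
```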


Keywords: MANET, routing protocol, and mobility model.


Received April 14, 2010; accepted May 17, 2010


On Handling Real-Time Communications in MAC Protocols

Sonia Mammeri and Rachid Beghdad
Faculty of Sciences, University of Bejaia, Algeria

 
Abstract: In this paper, we present a critical study of two real-time medium access control protocols: the reservation protocol of IEEE 802.5, and the timed token protocol. First of all, the reservation protocol of IEEE 802.5 suffers from starvation of lower-priority messages, caused by large numbers of enqueued higher-priority packets. To solve this, we propose a protocol that allows the transmission of a low-priority message after each high-priority message. Secondly, our improved timed token protocol [1] suffers from the high number of enqueued non-real-time messages. To address this problem, we propose a new version that allows the transmission of non-real-time messages without waiting for the next token arrival. Simulation results show the robustness of our proposed solutions.
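The proposed fix to the reservation protocol can be pictured as a simple queueing policy: after every high-priority transmission, one waiting low-priority message is also released, so low-priority traffic cannot starve. The sketch below illustrates only this stated policy, not the IEEE 802.5 token and reservation machinery.

```python
from collections import deque

class StarvationFreeQueue:
    """Two-level priority queue that releases one low-priority message
    after each high-priority one (a sketch of the stated policy)."""

    def __init__(self):
        self.high = deque()
        self.low = deque()

    def enqueue(self, msg, high_priority):
        (self.high if high_priority else self.low).append(msg)

    def next_to_send(self):
        """Return the messages granted in the next transmission slot."""
        if self.high:
            granted = [self.high.popleft()]
            if self.low:                      # piggy-back one low-priority msg
                granted.append(self.low.popleft())
            return granted
        return [self.low.popleft()] if self.low else []
```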

Keywords: Real-time constraints, reservation protocol, timed-token protocol, scheduling messages, and MAC protocols.


Received April 17, 2010; accepted January 03, 2011


Improving Exposure of Intrusion Deception System through Implementation of Hybrid Honeypot

Masood Mansoori1, Omar Zakaria2, and Abdullah Gani3
1,3Faculty of Computer Science and Information Technology, University of Malaya, Malaysia
2Department of Computer Science, Faculty of Defence Science and Technology, National Defence University of Malaysia, Malaysia
 
Abstract: This paper presents a new hybrid honeypot design to improve the exposure aspect of intrusion deception systems and, in particular, research server honeypots. A major attribute in the design of a server honeypot is its passiveness, which allows the honeypot to expose its services and passively wait to be attacked. Although the passiveness of a server honeypot simplifies the analysis process by classifying all incoming traffic as malicious, it also lessens its ability to lure attackers through exposure of vulnerable services. As a result, it captures a smaller amount of attack data for analysis. Client honeypot designs, on the other hand, contain modules that actively interact with outside networks, expose vulnerabilities in client-side software, and identify malicious content hosted on webservers. The proposed hybrid system integrates the active module concept of a client honeypot into a server honeypot. The active module interacts with webservers utilising a custom crawler and browser, publicises the honeypot’s IP address, and therefore improves the exposure of the server honeypot's vulnerable services. The findings presented in this paper show that interaction with webservers improves exposure and results in a significantly higher number of attacks, which in turn increases the probability of discovering new threats. The findings also characterise most attacks as worm-based and directed at Windows-based hosts and services.
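A minimal sketch of the active module's publicising step, assuming a plain HTTP fetch loop: merely visiting webservers from the honeypot leaves its IP address in remote logs, inviting scans and attacks back at the honeypot. The seed-URL list and user-agent string are placeholders; the paper's custom crawler and browser are more elaborate.

```python
import urllib.request

def publicise(seed_urls, user_agent="Mozilla/5.0"):
    """Fetch each seed URL from the honeypot host so its source IP
    appears in the remote webservers' access logs."""
    for url in seed_urls:
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                resp.read(4096)  # touch the page; the content is not needed
        except OSError:
            pass  # unreachable hosts are expected; keep crawling
```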


Keywords: IDS, server honeypot, client honeypot, and hybrid honeypot.


Received April 27, 2010; accepted October 24, 2010


Lossless Data Hiding Based on Histogram Modification

Rajkumar Ramaswamy1 and Vasuki Arumugam2
1Research Scholar, Department of Electronics and Communication Engineering, Kumaraguru College of Technology, India
2Assistant Professor, Department of Electronics and Communication Engineering, Kumaraguru College of Technology, India
 
 
Abstract: Lossless data hiding is the technique of embedding data in an image such that the data can be retrieved with lossless reconstruction of the original image. In this paper, we present a novel lossless data hiding scheme based on histogram modification. The technique embeds data in the differences of adjacent pixels and offers more hiding capacity than existing methods. The number of message bits that can be embedded into an image equals the number of pixels associated with the peak point. A histogram shifting technique is applied in order to prevent overflow and underflow of pixel values. For color images the hiding capacity is three times that of grayscale images. The performance of the algorithm has been evaluated on eight grayscale images and eight color images, with hiding capacity (bits) and peak signal to noise ratio (dB) of the reconstruction as the parameters.
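The classic peak/zero-point histogram-shifting scheme that this family of methods builds on can be sketched as follows. The sketch works on pixel values directly, whereas the paper's variant uses adjacent-pixel differences; it also assumes an empty bin exists to the right of the peak and that the payload length is communicated to the decoder.

```python
import numpy as np

def embed(img, bits):
    """Shift the histogram to open a gap beside the peak bin, then encode
    one bit at every peak-valued pixel (simplified sketch)."""
    hist = np.bincount(img.ravel(), minlength=256)
    peak = int(hist.argmax())
    zero = peak + 1 + int(hist[peak + 1:].argmin())  # assumed-empty bin
    out = img.astype(np.int16)
    out[(out > peak) & (out < zero)] += 1            # open gap at peak + 1
    k, it = 0, np.nditer(out, flags=['multi_index'])
    for v in it:
        if k < len(bits) and v == peak:
            out[it.multi_index] += bits[k]           # 1 -> peak+1, 0 -> peak
            k += 1
    assert k == len(bits), "payload exceeds peak-point capacity"
    return out.astype(np.uint8), peak, zero

def extract(stego, peak, zero):
    """Recover the bits and restore the original image exactly."""
    bits, out = [], stego.astype(np.int16)
    it = np.nditer(out, flags=['multi_index'])
    for v in it:
        if v == peak:
            bits.append(0)
        elif v == peak + 1:
            bits.append(1)
            out[it.multi_index] -= 1
    out[(out >= peak + 2) & (out <= zero)] -= 1      # undo the shift
    return bits, out.astype(np.uint8)
```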


Keywords: Data hiding, histogram modification, histogram shifting, and lossless reconstruction.


Received May 6, 2010; accepted August 10, 2010


Content-Based Image Retrieval System Based on Self Organizing Map, Fuzzy Color Histogram and Subtractive Fuzzy Clustering

Jehad Alnihoud
Department of Computer Science, Al al-Bayt University, Jordan  

 
Abstract: A novel system with a high level of retrieval accuracy is presented in this paper. Color, as one of the most important discriminators in content-based image retrieval (CBIR), is utilized through calculating some primitive color features. The indexing of the image database is performed with a self-organizing map (SOM), which identifies the best matching units (BMUs). Subsequently, Fuzzy Color Histogram (FCH) and subtractive fuzzy clustering algorithms are utilized to identify the cluster to which the query image belongs. Furthermore, the paper presents an enhanced edge detection algorithm to remove unwanted pixels and to solidify objects within images, which eases similarity measurement based on extracted shape features. The proposed approach overcomes the computational complexity of applying bin-to-bin comparison of multi-dimensional feature vectors in the original color histogram approach and improves the retrieval accuracy based on shape as compared with the most dominant approaches in this field of study.
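The fuzzy color histogram component can be illustrated with a short sketch: each pixel spreads its vote over the nearest bin centres with triangular memberships rather than falling into a single hard bin, which smooths quantization boundaries. The bin count and the triangular membership function are illustrative assumptions.

```python
import numpy as np

def fuzzy_color_histogram(channel, n_bins=16):
    """Fuzzy histogram of one color channel (values 0-255): every pixel
    contributes to the two nearest bins with triangular memberships."""
    centres = np.linspace(0, 255, n_bins)
    width = centres[1] - centres[0]
    hist = np.zeros(n_bins)
    for v in channel.ravel().astype(float):
        mu = np.clip(1.0 - np.abs(v - centres) / width, 0.0, 1.0)
        hist += mu                      # memberships sum to 1 per pixel
    return hist / hist.sum()
```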


Keywords: CBIR, FCH, SOM, and subtractive fuzzy clustering.

 

Received May 13, 2010; accepted August 10, 2010

 


PLA Data Reduction for Speeding Up Time Series Comparison

Bachir Boucheham
 Department of Informatics, University of Skikda 20 Aout 1955, Algeria

 
Abstract: We consider the comparison of two Piecewise Linear Approximation (PLA) data reduction methods, a recursive PLA-segmentation technique (the Douglas-Peucker algorithm) and a sequential PLA-segmentation technique (FAN), when applied prior to our previously developed time series alignment technique SEA, which was established as a very effective method. The outcomes of these two combinations are two new time series alignment methods: RecSEA and SeqSEA. The study shows that both RecSEA and SeqSEA perform alignments as good as those of SEA with important reductions in data (RecSEA: up to 60%, SeqSEA: up to 80% sample reduction) and in processing time (RecSEA: up to 85%, SeqSEA: up to 95% time reduction) with respect to the SEA method. This makes both new methods more suitable for time series database querying, searching and retrieval. In particular, SeqSEA is significantly faster than RecSEA for long time series.
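The recursive PLA technique (Douglas-Peucker) can be sketched as follows; vertical deviation from the chord is used here, a common simplification for time series of the algorithm's original perpendicular-distance criterion.

```python
import numpy as np

def douglas_peucker(t, y, eps):
    """Return the indices of the samples kept by recursive PLA reduction:
    keep the point deviating most from the chord, recurse on both halves,
    and stop once every deviation is within eps."""
    if len(t) <= 2:
        return list(range(len(t)))
    chord = y[0] + (y[-1] - y[0]) * (t - t[0]) / (t[-1] - t[0])
    dev = np.abs(y - chord)
    i = int(dev.argmax())
    if dev[i] <= eps:
        return [0, len(t) - 1]
    left = douglas_peucker(t[:i + 1], y[:i + 1], eps)
    right = douglas_peucker(t[i:], y[i:], eps)
    return left + [i + j for j in right[1:]]  # drop duplicated split point

# Usage sketch: keep = douglas_peucker(t, y, eps=0.1); the reduced series
# (t[keep], y[keep]) is what would be handed to the alignment stage.
```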


Keywords: Pattern matching, data reduction, time series comparison, time series alignment, data mining, and data retrieval.


Received May 17, 2010; accepted October 24, 2010


Arabic Text Categorization: A Comparative Study of Different Representation Modes

Zakaria Elberrichi and Karima Abidi 
EEDIS Laboratory, Department of Computer Science, Algeria
 
 
Abstract: The quantity of accessible information on the Internet is phenomenal, and its categorization remains one of the most important problems. A lot of current work rightly focuses on English, since it is the dominant language of the Web. However, a need arises for the other languages, because the Web becomes more multilingual every day. The need is much more pressing for the Arabic language. Our research is on the categorization of Arabic texts; its originality lies in the use of a conceptual representation of the text. To that end, we use Arabic WordNet (AWN) as a lexical and semantic resource. To understand its effect, we incorporate it in a comparative study with the other usual modes of representation (bag of words and n-grams), and we use the K-NN learning scheme with different similarity measures. The results show the benefits and advantages of this representation compared to the more conventional methods, and demonstrate that the addition of the semantic dimension is one of the most promising ways for the automatic categorization of Arabic texts.
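The K-NN learning scheme over any of the three representations can be sketched as follows, here with cosine similarity; the vectorization step (bag of words, n-grams, or AWN concept vectors) is assumed to have been performed upstream.

```python
import numpy as np
from collections import Counter

def cosine(u, v):
    """Cosine similarity between two term (or concept) vectors."""
    nu, nv = np.linalg.norm(u), np.linalg.norm(v)
    return float(u @ v) / (nu * nv) if nu and nv else 0.0

def knn_classify(query_vec, train_vecs, train_labels, k=5):
    """Assign the majority category among the k most similar documents."""
    sims = [cosine(query_vec, v) for v in train_vecs]
    top = np.argsort(sims)[::-1][:k]
    votes = Counter(train_labels[i] for i in top)
    return votes.most_common(1)[0][0]
```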




Keywords: Categorisation, Arabic texts, Arabic WordNet, bag of words, n-grams, and concepts.


Received May 27, 2010; accepted August 10, 2010


Lossless Image Cryptography Algorithm Based on Discrete Cosine Transform

Sara Tedmori1 and Nijad Al-Najdawi2
  1Department of Computer Science, Princess Sumaya University for Technology, Jordan
2Department of Information Technology, Al-Balqa Applied University, Jordan

 
Abstract: The science of cryptography has recently attracted significant attention, as progressively more information is stored and transmitted in electronic form. Cryptography is the discipline of using codes to encrypt data into an unreadable format that only the targeted recipients can decrypt and read. Encryption methods can be divided into two categories: lossy and lossless. In lossy encryption methods, the decrypted image details are vulnerable to distortion. Lossless encryption methods are more relevant when even marginal distortion is not tolerable. In this research, the authors propose a novel lossless encryption/decryption technique. In the proposed algorithm, the image is transformed into the frequency domain, where low and high frequencies are processed in a way that guarantees a secure, reliable, and unbreakable form. The encryption algorithm uses the discrete cosine transform to convert the target image into the frequency domain, after which the encryption involves scattering the distinguishable DC value using a reversible weighting factor amongst the rest of the frequencies. The algorithm is designed to shuffle and reverse the sign of each frequency in the transformed block before the image blocks are transformed back to the pixel domain. The results show a total change in the encrypted image pixel values, concealing the image details. The decryption algorithm reverses the encryption steps and returns the image to its original form without any loss in the pixel values. Based on the application’s requirements, the decryption algorithm can perform with or without a decryption key. The encryption algorithm is suitable for applications that require secure transport of high quality data.
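The per-block pipeline described above can be sketched as follows. This is a hedged floating-point illustration: the particular weighting factor, the keyed pseudo-random shuffle, and the sign flips are assumptions, and a truly lossless implementation must also control coefficient rounding, which the sketch glosses over.

```python
import numpy as np
from scipy.fft import dctn, idctn

def encrypt_block(block, key):
    """DCT the block, scatter the DC value over the other frequencies with
    a reversible weight, shuffle and sign-flip all coefficients (keyed),
    then return to the pixel domain."""
    rng = np.random.default_rng(key)
    c = dctn(block.astype(float), norm='ortho').ravel()
    w = 1.0 / (c.size - 1)               # assumed reversible weighting factor
    c[1:] += w * c[0]                    # scatter DC amongst the AC terms
    perm = rng.permutation(c.size)
    signs = rng.choice([-1.0, 1.0], c.size)
    return idctn((c[perm] * signs).reshape(block.shape), norm='ortho')

def decrypt_block(enc, key):
    """Regenerate the keyed permutation and signs, then undo every step."""
    rng = np.random.default_rng(key)
    c = dctn(enc, norm='ortho').ravel()
    perm = rng.permutation(c.size)
    signs = rng.choice([-1.0, 1.0], c.size)
    orig = np.empty_like(c)
    orig[perm] = c * signs               # signs are +/-1, so * undoes them
    orig[1:] -= (1.0 / (c.size - 1)) * orig[0]
    return idctn(orig.reshape(enc.shape), norm='ortho')
```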


Keywords: Image cryptography, image encryption and decryption, and frequency domain coefficients.

Received June 14, 2010; accepted January 3, 2011


Literature Review of Interactive Cross Language Information Retrieval Tools

Farag Ahmed and Andreas Nurnberger
Data and Knowledge Engineering Group, Faculty of Computer Science, Otto-von-Guericke-University of Magdeburg, 39106 Magdeburg, Germany
 
Abstract: The unprecedented rise of multilingual resources afforded by the exponential growth of the web demands the development of communication technologies that eliminate the barriers between languages. More comprehensive tools to overcome such barriers, such as machine translation and cross-lingual information retrieval applications, are nowadays in strong demand. In this paper, we present an overview of the literature on interactive Cross Language Information Retrieval (CLIR) tools and discuss their limitations. In addition, possible future research directions for interactive CLIR tools are discussed.



Keywords: Cross Language Information Retrieval (CLIR), CLIR interaction tools, and query disambiguation. 


Received July 06, 2010; accepted October 24, 2010


A New Image Segmentation Method Based on Particle Swarm Optimization

Fahd Mohsen1, Mohiy Hadhoud2, Kamel Mostafa3, and Khalid Amin2
1Department of Computer and Mathematics, Faculty of Science, Ibb University, Yemen
2Faculty of Computers and Information, Minufiya University, Egypt
3Faculty of Computers and Information, Banha University, Egypt

 
Abstract: In this paper, a new segmentation method for images based on particle swarm optimization (PSO) is proposed. The new method is produced by combining the PSO algorithm with one of the region-based image segmentation methods, namely Seeded Region Growing (SRG). The SRG method performs a segmentation of an image with respect to a set of points known as seeds. Two problems are associated with the SRG method: the first is the choice of the similarity criterion for pixels in regions, and the second is how to select the seeds. In the proposed method, the PSO algorithm tries to solve both problems. The similarity criterion to be optimized is the best similarity difference between the pixel intensity and the region mean value. The proposed algorithm randomly initialises each particle in the swarm to contain K seed points (each seed point contains its location and similarity difference value) and then applies the SRG algorithm to each particle. The PSO technique is then applied to refine the locations and similarity difference values of the K seed points. Finally, region merging is applied to remove small regions from the segmented image.
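A plain global-best PSO loop of the kind that could refine the K seed points is sketched below; the particle encoding (K seed locations plus similarity-difference values flattened into one vector, normalised to [0, 1]) and the fitness function scoring the resulting SRG segmentation are left abstract and are assumptions of the sketch.

```python
import numpy as np

def pso(fitness, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 1.0), seed=0):
    """Minimise `fitness` over [bounds]^dim with global-best PSO.
    Each particle would encode K seed points; `fitness` would run SRG
    and score the segmentation it produces."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))      # positions
    v = np.zeros((n_particles, dim))                 # velocities
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()               # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g
```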


Keywords: Image segmentation, particle swarm optimization, region-based segmentation, and seeded region growing.


Received July 12, 2010; accepted October 24, 2010
