
2008 Index
The International Arab Journal of Information Technology
Vol. 5

This index covers all papers that appeared in IAJIT during 2008. The author index contains the primary entry for each item, listed under the first author's name. The primary entry includes the coauthors' names, the title of the paper, the journal abbreviation, month, year, and inclusive pages. The subject index contains entries describing the item under all appropriate subject headings based on the keywords specified by the authors. Each entry includes the title of the paper, the authors' names, the journal abbreviation, month, year, and inclusive pages.

Author Index

Subject Index

 


Scatter Search and Graph Heuristics for the Examination Timetabling Problem

Drifa Hadjidj1 and Habiba Drias2
1Department of Computer Science, Université M'hamed Bouguerra, Algeria
2Institut National d'Informatique (INI), Algeria

Abstract: The examination timetabling problem is an optimization problem that consists of assigning a set of exams to a set of contiguous time slots while satisfying a set of constraints. The problem falls into the category of NP-complete problems and is usually tackled using heuristic methods. In this paper we describe a solution algorithm and its implementation based on graph heuristics and the evolutionary meta-heuristic called scatter search, which operates on a set of solutions by combining two or more elements. New solutions are improved before replacing others according to their quality and diversity. The implementation of the algorithm has been evaluated on the popular Carter benchmarks and compared with the best recent results.
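
As a rough companion to this description, here is a minimal Python sketch (not the authors' implementation) of the two ingredients named in the abstract: a largest-degree graph heuristic that builds an initial timetable and a simple scatter-search style combination of two parent solutions. The four-exam conflict graph and the penalty count are invented for illustration.

import random

def largest_degree_first(conflicts, n_slots):
    """Assign each exam to the first conflict-free slot, taking exams in
    decreasing order of conflict degree (a classic graph heuristic)."""
    order = sorted(conflicts, key=lambda e: len(conflicts[e]), reverse=True)
    timetable = {}
    for exam in order:
        used = {timetable[c] for c in conflicts[exam] if c in timetable}
        free = [s for s in range(n_slots) if s not in used]
        timetable[exam] = free[0] if free else random.randrange(n_slots)
    return timetable

def conflict_count(timetable, conflicts):
    """Number of conflicting exam pairs scheduled in the same slot."""
    return sum(1 for e, s in timetable.items()
               for c in conflicts[e] if timetable.get(c) == s) // 2

def combine(parent_a, parent_b, conflicts):
    """Scatter-search style combination: for each exam, inherit the parent's
    slot that clashes least with the exams already placed in the child."""
    child = {}
    for exam in parent_a:
        child[exam] = min((parent_a[exam], parent_b[exam]),
                          key=lambda s: sum(1 for c in conflicts[exam]
                                            if child.get(c) == s))
    return child

# Tiny illustrative instance: four exams, edges are student conflicts.
conflicts = {"E1": {"E2", "E3"}, "E2": {"E1"}, "E3": {"E1", "E4"}, "E4": {"E3"}}
t1 = largest_degree_first(conflicts, n_slots=3)
t2 = {e: random.randrange(3) for e in conflicts}
print(conflict_count(combine(t1, t2, conflicts), conflicts))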

Keywords: Examination timetabling, scatter search, evolutionary, meta-heuristic, graph heuristics.

Received December 4, 2006; accepted May 3, 2007

 

Full Text


Generating Exact Approximations to Model Check Concurrent Systems

Mustapha Bourahla
 Computer Science Department, University of Biskra, Algeria

Abstract: In this paper, we present a method to generate abstractions for model checking concurrent systems. The concurrent system, which may be infinite, is first described by a program with a defined syntax and semantics. This program is abstracted within the framework of abstract interpretation, for which an abstraction function is given. The resulting abstract program is shown to be an accurate approximation of the original program, but it may contain spurious behaviours. These spurious behaviours are identified and removed using a newly defined abstraction framework based on restrictions. The abstract program produced in this way is an exact approximation of the original program.
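
As a rough illustration of the abstraction step, the following Python sketch lifts a concrete transition relation to an abstract one through an abstraction function and explores the result. The toy system and the parity abstraction are invented, and the paper's restriction-based removal of spurious behaviours is not reproduced.

def abstract_system(concrete_trans, alpha):
    """Lift a concrete transition relation to the abstract domain: there is an
    abstract edge a1 -> a2 whenever some concrete edge s1 -> s2 maps onto it."""
    return {(alpha(s1), alpha(s2)) for s1, s2 in concrete_trans}

def reachable(trans, init):
    """States reachable from init in a transition relation given as an edge set."""
    seen, frontier = {init}, [init]
    while frontier:
        s = frontier.pop()
        for a, b in trans:
            if a == s and b not in seen:
                seen.add(b)
                frontier.append(b)
    return seen

# Toy concrete system: states 0..5, counting upwards; the abstraction keeps parity only.
concrete = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)}
alpha = lambda s: "even" if s % 2 == 0 else "odd"
abstract = abstract_system(concrete, alpha)
# The over-approximation admits behaviours absent from the concrete system
# (spurious behaviours), e.g. an unbounded even/odd alternation.
print(abstract, reachable(abstract, "even"))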

 

Keywords: Model checking, abstractions, concurrent systems.

Received February 23, 2007; accepted June 6, 2007

Full Text



Incompatibility Dimensions and Integration of Atomic Commit Protocols

Yousef Al-Houmaily
Department of Computer and Information Programs, Institute of Public Administration, Saudi Arabia

Abstract: Advanced software application systems contain transactions that tend to traverse incompatible database sites belonging to different human organizations. One key requirement of these application systems is universal transactional support and, in particular, guaranteeing the atomicity property of transactions in the presence of incompatible atomic commit protocols. Detailed analysis shows that incompatibilities among atomic commit protocols could be due to the semantics of coordination messages or the presumptions about the outcome of terminated transactions. This leads to the definition of “operational correctness”, a criterion that captures the practical integration of incompatible atomic commit protocols. It also leads to the definition of “safe state”, a notion that determines the conditions under which all information pertaining to distributed transactions can be discarded without sacrificing their consistent termination across all participating sites. The significance of the analytical results is demonstrated through the development of a new atomic commit protocol called “integrated two-phase commit” that integrates the most commonly known atomic commit protocols, with respect to applicability and performance, in a practical manner and in spite of their incompatibilities.
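
For orientation, here is a minimal Python sketch of the basic two-phase commit decision rule that the analysed protocols build on. The participant class and its vote/commit/abort methods are hypothetical, and the paper's integrated two-phase commit and its treatment of incompatible presumptions are not reproduced.

class DemoParticipant:                       # hypothetical participant site
    def __init__(self, will_vote_yes):
        self.will_vote_yes = will_vote_yes
        self.state = "active"
    def vote(self):
        return "yes" if self.will_vote_yes else "no"
    def commit(self):
        self.state = "committed"
    def abort(self):
        self.state = "aborted"

def two_phase_commit(participants):
    """Phase 1: collect votes. Phase 2: commit only if every participant voted yes."""
    votes = [p.vote() for p in participants]
    decision = "commit" if all(v == "yes" for v in votes) else "abort"
    for p in participants:
        p.commit() if decision == "commit" else p.abort()
    return decision

print(two_phase_commit([DemoParticipant(True), DemoParticipant(True)]))   # commit
print(two_phase_commit([DemoParticipant(True), DemoParticipant(False)]))  # abort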

 

Keywords: Two-phase commit, voting protocols, distributed transaction processing, integrated database systems, internet transactions, electronic services and electronic commerce.

Received March 26, 2007; accepted June 9, 2007

Full Text



Analysis of Epileptic Events Using Wavelet Packets

Nisrine Sinno and Kifah Tout
Department of Computer Science, Lebanese University, Lebanon

Abstract: Many studies have focused on the nonlinear analysis of electroencephalography (EEG), mainly for the characterization of epileptic brain states. The spatial and temporal dynamics of the epileptogenic process are still not completely clear, especially for the most challenging aspect of epileptology, the anticipation of the seizure. Despite all efforts, we still do not know how, when, and why a seizure occurs. However, current studies provide strong evidence that the interictal-ictal state transition is not an abrupt phenomenon. Findings also indicate that it is possible to detect a preseizure phase. We study patients admitted to the epilepsy monitoring unit for the purpose of recording their seizures. These patients have their EEG signal recorded 24 hours a day for several days, until enough seizures have been captured to determine eligibility for seizure surgery. Thus, preictal, ictal, and postictal EEG recordings are available for such patients for analysis. We propose to use wavelet analysis to investigate a case study of the EEG signal and determine the localization of the seizure and its characteristics.
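
To give a concrete flavour of a wavelet packet analysis of a one-dimensional signal, here is a minimal sketch using the PyWavelets library on a synthetic signal; the sampling rate, wavelet, decomposition level and data are placeholders rather than the EEG recordings or settings of the study.

import numpy as np
import pywt

fs = 256                                             # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
# Synthetic "EEG-like" signal: a background rhythm plus a brief high-frequency burst.
sig = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
sig[2 * fs:2 * fs + fs // 4] += np.sin(2 * np.pi * 60 * t[:fs // 4])

wp = pywt.WaveletPacket(data=sig, wavelet="db4", mode="symmetric", maxlevel=4)
for node in wp.get_level(4, order="freq"):           # leaf nodes ordered by frequency band
    energy = float(np.sum(node.data ** 2))           # energy concentrated in that sub-band
    print(f"band {node.path}: energy = {energy:.1f}")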

 

Keywords: Epilepsy analysis, wavelet, spike detection.

Received June 12, 2006; accepted June 13, 2007

Full Text



 

Enforcing User-Defined Constraints During the Run-Time in OODB

Belal Zaqaibeh1, Hamidah Ibrahim2, Ali Mamat2, and Nasir bin Sulaiman1
1Faculty of Science and Information Technology, Zarqa Private University, Jordan
2Universiti Putra Malaysia, Malaysia

 

Abstract: In this paper a run-time model (RTM) is proposed. The run-time model enforces integrity constraints for attributes that are derived from composition and inheritance hierarchies, and is designed to enforce logical integrity constraints in object-oriented databases at run-time. A new technique called the detection method is designed to check the object metadata in order to detect and catch violations in the object-oriented database before they occur. Furthermore, we have implemented the RTM together with a supporting set of definitions for checking attribute value validity and object-oriented database consistency, as well as a method for verifying attribute values when inserting, deleting, and updating objects.
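
In the spirit of the detection method described above, the sketch below validates attribute values before an object is modified, so a violation is caught before the update takes effect. The Employee class and its constraints are invented, and the paper's RTM and object metadata checks are not reproduced.

class ConstraintViolation(Exception):
    pass

class Employee:                                   # invented example class
    # User-defined constraints on attribute values, checked before any update.
    constraints = {
        "age":    lambda v: 18 <= v <= 70,
        "salary": lambda v: v >= 0,
    }

    def __init__(self, **attrs):
        for name, value in attrs.items():
            setattr(self, name, value)

    def __setattr__(self, name, value):
        # Detection step: validate the new value before the object changes,
        # so the violation is caught before it reaches the database state.
        check = self.constraints.get(name)
        if check is not None and not check(value):
            raise ConstraintViolation(f"{name}={value!r} violates its constraint")
        super().__setattr__(name, value)

e = Employee(age=30, salary=2500)
try:
    e.age = 15                                    # rejected before the update takes effect
except ConstraintViolation as err:
    print(err)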

Keywords: Object-oriented databases, integrity constraints, constraints violation.
 Received December 12, 2006; accepted May 13, 2007 

Full Text                            


Optimal Fuzzy Clustering in Overlapping Clusters

Ouafa Ammor1, Abdelmonaime Lachkar2, Khadija Slaoui3, and Noureddine Rais1
1 Department of Mathematics, Faculty of Sciences and Technology of Fes, Morocco
2 ESTM, Moulay Ismail University, Morocco
3 Department of Physics, Faculty of Sciences Dhar Mehraz of Fes, Morocco

 

Abstract: The fuzzy c-means clustering algorithm has been widely used to obtain fuzzy k-partitions. This algorithm requires the user to give the number of clusters k. To find the "right" number of clusters k automatically for a given data set, many validity index algorithms have been proposed in the literature. Most of these indexes do not work well for clusters with different degrees of overlap; they usually tend to fail to select the correct optimal number of clusters when dealing with data sets containing overlapping clusters. To overcome this limitation, we propose in this paper a new and efficient cluster validity measure for determining the optimal number of clusters, which deals successfully with both overlapping and non-overlapping situations. This measure is based on the maximum entropy principle. Our approach does not require any parameter adjustment and is therefore completely automatic. Many simulated and real examples are presented, showing the superiority of our measure over existing ones.
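
As a rough illustration of sweeping a validity measure over candidate numbers of clusters, the sketch below runs a small hand-rolled fuzzy c-means and reports Bezdek's classical partition entropy for each k. The paper's maximum-entropy-based measure is not reproduced, and the two-dimensional synthetic data are invented.

import numpy as np

def fuzzy_c_means(X, k, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(k), size=len(X))          # memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2 / (m - 1))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

def partition_entropy(U):
    """Bezdek's partition entropy; lower values indicate a crisper partition."""
    return float(-np.mean(np.sum(U * np.log(U + 1e-12), axis=1)))

rng = np.random.default_rng(1)                          # synthetic 2-D data, three blobs
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in ((0, 0), (3, 0), (0, 3))])
for k in range(2, 6):
    _, U = fuzzy_c_means(X, k)
    print(k, round(partition_entropy(U), 3))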

Keywords: Unsupervised clustering, cluster validity index, optimal clusters number, overlapping clusters, maximum entropy principle.
Received November 30, 2006; accepted June 12, 2007
   

Full Text

                                                                                                                           

Shadow Casting with Stencil Buffer for Real-Time Rendering

Lee Weng, Daut Daman, and Mohd Rahim              
Faculty of Computer Science and Information System, Universiti Teknologi Malaysia, Malaysia

 

Abstract: We present a new method for real-time rendering of soft shadows in dynamic scenes. Our approach is based on the shadow volume algorithm, which provides fast, accurate and high-quality shadows. The shadow volume algorithm is used to generate hard shadows before fake soft shadows are added onto them. Although the generated soft shadows are physically inaccurate, this method provides soft shadows that are smooth and perceptually convincing. The proposed hybrid method adds more realism to a dynamic scene, which is an important factor in computer graphics.
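
For context, the following PyOpenGL sketch shows the classic depth-pass stencil shadow volume passes that produce the hard shadows this method starts from. It assumes an active OpenGL context and three user-supplied drawing callables (hypothetical names); the soft-shadow extension described in the abstract is not shown.

from OpenGL.GL import *

def render_with_stencil_shadows(draw_scene_ambient, draw_shadow_volumes, draw_scene_lit):
    # Pass 1: ambient pass fills the depth buffer.
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT)
    draw_scene_ambient()

    # Pass 2: rasterise the shadow volumes into the stencil buffer only.
    glEnable(GL_STENCIL_TEST)
    glEnable(GL_CULL_FACE)
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE)
    glDepthMask(GL_FALSE)
    glStencilFunc(GL_ALWAYS, 0, 0xFF)

    glCullFace(GL_BACK)                       # front faces increment on depth pass
    glStencilOp(GL_KEEP, GL_KEEP, GL_INCR)
    draw_shadow_volumes()

    glCullFace(GL_FRONT)                      # back faces decrement on depth pass
    glStencilOp(GL_KEEP, GL_KEEP, GL_DECR)
    draw_shadow_volumes()

    # Pass 3: add full lighting only where the stencil count is zero (lit areas).
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE)
    glDepthMask(GL_TRUE)
    glDepthFunc(GL_EQUAL)
    glStencilFunc(GL_EQUAL, 0, 0xFF)
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP)
    draw_scene_lit()

    glDepthFunc(GL_LESS)
    glDisable(GL_STENCIL_TEST)
    glDisable(GL_CULL_FACE)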

Keywords: Shadow volume, silhouette detection, rendering, depth-pass, soft shadow.

 Received November 14, 2006; accepted May 31, 2007


Design and Implementation of G/G/1 Queuing Model Algorithm for its Applicability in Internet Gateway Server

Laxmi Singh1 and Riktesh Srivastava2
1Department of Solid State Physics and Electronics, Dr. RML Avadh University, India
2Project Head-Internet Technologies, Academecia Software Solution, India

 

Abstract: How to efficiently allocate system resources to process client requests at gateway servers is a difficult problem. In this paper, we propose an improved approach for the autonomous performance of gateway servers under highly dynamic traffic loads. We have developed a G/G/1 queuing model algorithm and analysed its complexity, so that there is lossless information retrieval at each node of the gateway server. This helps to reduce the response time variance in the presence of bursty traffic. The most widespread consideration is performance, because gateway servers must offer cost-effective and highly available services over the long term, and thus have to be scaled to meet the expected load. Performance measurements can be the basis for performance modeling and prediction. With the help of performance models, performance metrics (such as buffer estimation and waiting time) can be determined during the development process, so that there is lossless retrieval of data at every node of the internet gateway servers. The paper presents the estimation of buffer size using the G/G/1 queuing model to determine the final value of the memory size, and then examines its implementation at the gateway servers. The results of simulation and experimental studies using synthesized workloads, together with an analysis of real-world gateway servers, demonstrate the effectiveness of the proposed system.
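
As a worked illustration of the kind of G/G/1 estimate involved, the sketch below applies Kingman's textbook approximation for the mean waiting time and, via Little's law, the implied mean queue length. This is a standard approximation rather than the paper's algorithm, and the traffic parameters are invented.

def gg1_estimates(arrival_rate, service_rate, ca2, cs2):
    """Kingman's approximation: ca2 and cs2 are the squared coefficients of
    variation of inter-arrival and service times; rates are requests per second."""
    rho = arrival_rate / service_rate                     # server utilisation (< 1)
    if rho >= 1:
        raise ValueError("unstable queue: utilisation must stay below 1")
    mean_service = 1.0 / service_rate
    wq = (rho / (1 - rho)) * ((ca2 + cs2) / 2.0) * mean_service   # mean waiting time
    lq = arrival_rate * wq                                # mean queue length (Little's law)
    return rho, wq, lq

# Invented traffic parameters, purely for illustration.
rho, wq, lq = gg1_estimates(arrival_rate=800, service_rate=1000, ca2=1.5, cs2=2.0)
print(f"utilisation={rho:.2f}, mean wait={wq * 1000:.1f} ms, mean queue length={lq:.1f}")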

 

Keywords: M/M/1, G/G/1, internet gateway server, queuing process.

 Received February 20, 2007; accepted June 8, 2007


A Customized Particle Swarm Optimization for Classification of Multispectral Imagery Based on Feature Fusion

Venkatalakshmi Krishnan1, Anisha Praisy1, Maragathavalli R.1, and Mercy Shalinie2
1Department of IT, Thiagarajar College of Engineering, India
2Department of CSE, Thiagarajar College of Engineering, India

 

Abstract: An attempt has been made in this paper to classify multispectral images using customized particle swarm optimization. To reduce the time consumption caused by the high dimensionality of multispectral imagery, preprocessing is performed using feature extraction based on decision boundaries. The customized particle swarm optimization then works on the reduced multispectral imagery to find globally optimal cluster centers. Here, particle swarm optimization is tailored for the classification of multispectral images as customized particle swarm optimization. The modifications are made to the velocity function, so that the velocity in each iteration is updated as a factor of the g-best (global best) alone, and the particle structure is made to incorporate all the cluster centers of the reduced imagery. The initialization of particles is accomplished using modified k-means in order to retain simplicity. AVIRIS images are used as the test site, and it was found that the customized particle swarm optimization finds the globally optimal clusters with 98.56% accuracy.
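
A simplified sketch of the customization described above follows: each particle encodes a complete set of cluster centres and the velocity update is driven by the global best alone. The modified k-means initialization, decision-boundary feature extraction and AVIRIS data are not reproduced; the synthetic data are invented.

import numpy as np

def sse(centers, X):
    """Within-cluster sum of squared distances for a candidate set of centres."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return float((d.min(axis=1) ** 2).sum())

def pso_cluster(X, k, n_particles=10, iters=50, w=0.7, c_g=1.5, seed=0):
    rng = np.random.default_rng(seed)
    # Each particle holds a full (k, n_features) set of candidate cluster centres,
    # seeded from randomly chosen samples (a stand-in for the modified k-means step).
    P = np.stack([X[rng.choice(len(X), k, replace=False)] for _ in range(n_particles)])
    V = np.zeros_like(P)
    g_best = min(P, key=lambda p: sse(p, X)).copy()
    for _ in range(iters):
        r = rng.random(P.shape)
        V = w * V + c_g * r * (g_best - P)                # velocity driven by g-best only
        P = P + V
        for p in P:
            if sse(p, X) < sse(g_best, X):
                g_best = p.copy()
    return g_best

rng = np.random.default_rng(1)                            # synthetic 4-band "pixels"
X = np.vstack([rng.normal(c, 0.2, size=(60, 4)) for c in (0.0, 1.0, 2.0)])
print(pso_cluster(X, k=3))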

 

Keywords: Multispectral image, decision-based feature extraction, particle swarm optimization, global optima, g-best.

 Received January 8, 2007; accepted May 3, 2007


Microcontroller Based Heart Rate Monitor

Mohamed Fezari, Mounir Bousbia-Salah, and Mouldi Bedda
Department of Electronics, University of Badji Mokhtar, Annaba, Algeria

 

Abstract: This paper describes the development of a heart rate monitor system based on a microcontroller. It offers the advantage of portability over tape-based recording systems. The paper explains how a single-chip microcontroller can be used to analyse heart beat rate signals in real time. In addition, it allows doctors to receive the patient's heart beat rate file by e-mail every twenty-four hours. It can also be used to monitor patients or athletes over a long period. The system reads, stores and analyses the heart beat rate signals repetitively in real time. The hardware and software design are oriented towards a single-chip microcontroller-based system, hence minimizing the size. An important feature of this paper is the use of a zero crossing algorithm to compute the heart rate. The information is then processed in real time to detect some heart diseases.
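
As a small illustration of the zero crossing idea, the sketch below estimates beats per minute from the zero crossings of a mean-centred pulse signal. The synthetic signal and sampling rate are invented, and the microcontroller implementation and disease detection are not reproduced.

import numpy as np

def heart_rate_bpm(signal, fs):
    """Estimate beats per minute: each oscillation of the centred signal
    produces two zero crossings, so beats ~= crossings / 2 over the window."""
    x = signal - np.mean(signal)                       # centre the signal around zero
    crossings = np.count_nonzero(np.signbit(x[:-1]) != np.signbit(x[1:]))
    duration_s = len(signal) / fs
    return (crossings / 2.0) / duration_s * 60.0

fs = 100                                               # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
pulse = np.sin(2 * np.pi * 1.2 * t) + 0.01 * np.random.randn(t.size)   # ~72 bpm
print(round(heart_rate_bpm(pulse, fs)))                # close to 72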

Keywords: Microsystems, microcontroller, real-time, heart rate monitoring, zero crossing algorithm.

 Received November 30, 2006; accepted June 12, 2007


A Dynamic Traffic Shaping Technique for a Scalable QoS in ATM Networks

Francis Ogwu1, Mohammad Talib1, Ganiyu Aderounmu2, and Olufade Onifade3
1Department of Computer Science, University of Botswana, Gaborone, Botswana
2Department of Computer Science and Engineering, Obafemi Awolowo University, Nigeria
3Department of Computer Science, University of Ibadan, Nigeria

 

Abstract: A traffic shaping function becomes imperative for the new broadband services being deployed, in order to avoid information loss, to provide end users multiple traffic options in terms of bandwidth, and to ensure optimal use of the communication channels. To simultaneously manage the amount of cell loss and delay experienced by two or more classes of service categories (constant bit rate and variable bit rate), we developed a new buffer partitioning scheme called complete sharing with gradual release. The proposed model was combined with a scheduling method known as weighted round robin with absolute increment. An analytical model was developed for the proposed buffer partition to dynamically monitor and determine the output mean rate of the classes of service present, as well as the individual mean rate of each class of service. The model was simulated and a performance evaluation carried out. The results obtained show a better performance as a traffic shaping method for multi-quality-of-service traffic over asynchronous transfer mode networks.
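
For orientation only, here is a sketch of a plain weighted round-robin scheduler draining two per-class queues (e.g., CBR and VBR cells). The paper's complete sharing with gradual release partitioning and its weighted round robin with absolute increment variant are not reproduced, and the queue contents are invented.

from collections import deque

def weighted_round_robin(queues, weights, n_cells):
    """Serve up to weights[c] cells from each class per round until n_cells
    have been transmitted or every queue is empty."""
    served = []
    while len(served) < n_cells and any(queues.values()):
        for cls, w in weights.items():
            for _ in range(w):
                if queues[cls] and len(served) < n_cells:
                    served.append(queues[cls].popleft())
    return served

# Invented per-class queues of cells.
queues = {"CBR": deque(f"cbr{i}" for i in range(5)),
          "VBR": deque(f"vbr{i}" for i in range(5))}
weights = {"CBR": 2, "VBR": 1}                    # CBR receives twice the service share
print(weighted_round_robin(queues, weights, n_cells=6))
# ['cbr0', 'cbr1', 'vbr0', 'cbr2', 'cbr3', 'vbr1']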

 Keywords: Asynchronous transfer mode, quality of service, complete sharing/ gradual release, traffic shaper, constant bit rate, variable bit rate.

Received July 25, 2006; accepted May 31, 2007

                                                                                                              

Modelling Concurrent Mobile Transactions Execution in Broadcasting Environments

Ahmad Al-Qerem1 and Walter Hussak2
1Computer Science Department, Zarqa Private University, Jordan
2Department of Computer Science, Loughborough University, UK

 

Abstract: Broadcast is an efficient and scalable method for overcoming the bandwidth limitation in a wireless environment. There is a trade-off between clients' access time and the throughput of update mobile transactions in on-demand data dissemination environments. Data scheduling at the fixed server can allow more transactions to commit while retaining the access time for each transaction. In this paper, we present a data scheduling scheme for both read-only and update mobile transactions in pull-based broadcasting environments. Rather than considering access time, which is well studied elsewhere [1, 2, 3], our concern is to examine the probability that a mobile transaction is able to avoid conflict and commit. Specifically, a set of formulas giving an analysis of this probability is examined. Furthermore, a report of a simulation study validating these formulas is also provided.
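
As a toy illustration of the kind of probability analysed here, the sketch below compares a Monte Carlo estimate with a simple closed form for the chance that a read-only transaction's read set escapes all server updates. This is not the paper's workload model or formulas; all parameters are invented.

import random

def simulated_commit_probability(N, r, u, trials=100_000, seed=0):
    """A transaction reads r of N items; the server applies u uniform updates.
    In this toy model it commits only if none of its read items were updated."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        read_set = set(rng.sample(range(N), r))
        updates = {rng.randrange(N) for _ in range(u)}
        ok += not (read_set & updates)
    return ok / trials

def analytic_commit_probability(N, r, u):
    """Each of the u independent updates misses the read set with probability (N - r) / N."""
    return ((N - r) / N) ** u

print(simulated_commit_probability(N=500, r=5, u=20))     # invented parameters
print(analytic_commit_probability(N=500, r=5, u=20))      # roughly 0.82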

 Keywords: Wireless broadcast, data organization, mobile transactions.

Received February 13, 2007; accepted June 8, 2007

 

 Full Text

                                                                                                             

Flexible Database Querying Based on Ordered Lattice Theory Extension

Minyar Sassi1, Amel Touzi1, and Habib Ounelli2
1Ecole Nationale d’Ingénieurs de Tunis, Tunisia
2Faculté des Sciences de Tunis, Campus Universitaire-1060 Tunis, Tunisia

 

Abstract: This research reports on the synthesis of a flexible database querying approach, based on an ordered lattice theory extension, to deal with imprecise and structured data. This approach allows us to construct a multi-attribute type abstraction hierarchy structure for the case of decomposition according to several attributes. This structure is defined from an ordered lattice theory extension. Our approach consists of two steps: the first organizes the data, and the second, in order to query them, seeks the relevant data sources for a given query. The contributions of this approach are a) the interdependence of the query search criteria, b) the search for the relevant data sources for a given query, and c) the ordering of the results.
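
As background for the lattice-based organization, the sketch below shows the formal concept analysis derivation operators and a brute-force enumeration of concepts over a toy object/attribute context. The paper's ordered lattice extension, fuzzy clustering and query evaluation are not reproduced.

from itertools import combinations

context = {                          # invented object -> attribute context
    "src1": {"cheap", "fast"},
    "src2": {"cheap", "reliable"},
    "src3": {"fast", "reliable"},
    "src4": {"cheap", "fast", "reliable"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects possessing all the given attributes."""
    return {o for o, a in context.items() if attrs <= a}

def intent(objs):
    """Attributes shared by all the given objects."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

# Every formal concept is a pair (extent, intent) closed under the two operators;
# brute force over attribute subsets is enough for a context this small.
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        ext = extent(set(attrs))
        concepts.add((frozenset(ext), frozenset(intent(ext))))

for ext, itt in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(ext), sorted(itt))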

Keywords: Fuzzy cluster analysis, formal concept analysis, flexible database querying, concept query, relieving query.

Received March 3, 2007; accepted June 13, 2007

 Full Text



IP Over WDM Network Control: Network Addressing and Restoration

Refat Kibria1 and Syed Reza2
1Department of Computer Science and Engineering, Shah Jalal University of Science & Tech., Bangladesh
2HUAWEI Technologies, Bangladesh

 

Abstract: There are several common network control issues in IP over WDM (IP/WDM), including network addressing, neighborhood discovery, routing behavior, connection setup and tear-down, signaling mechanisms, network access control, and IP/WDM protection and restoration. Among them, network addressing and restoration are two major research areas. In this paper, these two issues are discussed. The paper also contains two different case studies based on lightpath provisioning and segment restoration.
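
To illustrate the restoration idea in isolation, the sketch below uses networkx to compute a primary lightpath and a link-disjoint backup on a toy topology. The topology is invented, and the lightpath provisioning and signalling details discussed in the paper are not modelled.

import networkx as nx

# Invented five-link optical topology with link costs.
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 1), ("B", "C", 1), ("A", "D", 2),
    ("D", "C", 2), ("B", "D", 1),
])

primary = nx.shortest_path(G, "A", "C", weight="weight")            # e.g. A-B-C

backup_graph = G.copy()
backup_graph.remove_edges_from(zip(primary, primary[1:]))           # drop the primary's links
backup = nx.shortest_path(backup_graph, "A", "C", weight="weight")  # e.g. A-D-C

print("primary:", primary)
print("backup :", backup)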

 Keywords: Optical cross-connector, shared-risk link group, open shortest path first, link state advertisement, label switched path, resource reservation protocol.

Received June 15, 2006; accepted June 13, 2007

 Full Text

                                                                                                              