
A Modified 2D Chain Code Algorithm for
Object Segmentation and Contour Tracing

Walid Shahab, Hazem Al-Otum, and Farouq Al-Ghoul
EE Department, Jordan University of Science & Technology, Jordan


Abstract: In this paper, a modified algorithm for object segmentation in binary images is presented, denoted the 2D modified chain code algorithm. The 2D modified chain code algorithm can also be applied to color images after binarization. The segmented object is used to derive the chain code of the image. The definition of the 2D modified chain code algorithm is valid for shapes composed of triangular, rectangular, and hexagonal cells. The 2D modified chain code preserves information and allows geometric dimensions to be computed. The results demonstrate that the 2D modified chain code algorithm can extract the coordinates of shapes at lower computational cost than the classical chain code. A considerable improvement in accuracy (20.1-57.2%) over what is possible with the classical chain code is achieved at the expense of a slight increase in computational cost (10-20%).
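
As a point of reference for the comparison above, the following Python sketch encodes a traced contour with the classical 8-direction Freeman chain code. It illustrates only the baseline scheme, not the authors' modified 2D algorithm, and the names used are illustrative.

# Classical 8-connectivity Freeman chain code for a traced contour
# (baseline only; not the modified 2D algorithm of the paper).
FREEMAN_8 = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
             (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(boundary):
    # boundary: consecutive (x, y) pixel coordinates along the object contour
    return [FREEMAN_8[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(boundary, boundary[1:])]

# A unit square traced counter-clockwise (y increasing upward):
print(chain_code([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]))   # [0, 2, 4, 6]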

Keywords: Image segmentation, contour detection, chain codes.

Received September 11, 2007; accepted December 13, 2007


Wavelet Based Video Encoder Using KCDS

Sudhakar Radhakrishnan, Guruprasad Subbarayan, and Karthick Vikram
Department of Electronics and Communication Engineering, PSG College of Technology, India


Abstract: Video coding schemes for low bit-rates are of high importance, and traditional coding schemes that use block transforms suffer from blocking artefacts. Here we propose a video codec based on the wavelet transform, whose performance is therefore superior to other block-transform-based codecs. The wavelet coefficients are coded using the computationally simpler no-list SPIHT (NLS), whose performance is similar to that of set partitioning in hierarchical trees. Motion estimation is done using the recently proposed kite-cross-diamond search algorithm, which is the fastest among the block matching algorithms. The codec is ideally suited for sequences with smooth and gentle motion of the video-conferencing kind. Simulation results are provided to evaluate the performance of the codec at various bit-rates. The codec is scalable in terms of bandwidth requirement, which means only one compressed bit stream is produced for different bit-rates. The use of NLS makes the codec scalable since it has the embedded coding property. However, for resolution scalability different compressed files are required.
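
To make the role of the block-matching stage concrete, here is a minimal Python/NumPy sketch of the sum-of-absolute-differences criterion that search algorithms such as the kite-cross-diamond search minimize. A brute-force search is shown purely for clarity, whereas KCDS visits far fewer candidate positions; function and parameter names are illustrative only.

import numpy as np

def best_motion_vector(cur, ref, bx, by, bsize=16, srange=7):
    # Find the displacement of a bsize x bsize block of the current frame
    # that minimizes the sum of absolute differences (SAD) in the reference frame.
    block = cur[by:by + bsize, bx:bx + bsize].astype(np.int32)
    best_sad, best_mv = None, (0, 0)
    for dy in range(-srange, srange + 1):
        for dx in range(-srange, srange + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + bsize > ref.shape[1] or y + bsize > ref.shape[0]:
                continue
            sad = np.abs(block - ref[y:y + bsize, x:x + bsize].astype(np.int32)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dx, dy)
    return best_mv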

Keywords: Codec, motion estimation, SPIHT, video encoder, wavelets.

Received October 6, 2007; accepted January 6, 2008 


Building an Efficient Indexing for Crawling
the Web Site with an Efficient Spider

Ghaleb Al-Gaphari
Faculty of Computer Science, University of Sana’a, Yemen


Abstract: Constructing a high-performance web search engine requires an efficient indexing mechanism as well as a high-performance web spider. In the present effort, we investigate the results of applying both the Right-Truncated Index-Based Web Search Engine and the high-performance web spider, in order to determine their usefulness for storing and retrieving Arabic documents on one hand and their effectiveness in finding and analyzing data to be indexed on the other. The Right-Truncated Index-Based Web Search Engine is a program that reads any set of Arabic documents, accepts a query, and then processes both the documents and the query; it then selects (predicts) those documents most relevant to the query that was entered. The program encompasses both a morphological component and a mathematical one. The morphological component allows the researcher to run either a stemming algorithm or a right-truncation algorithm. The chief advantage of the stemming algorithm is that it uses the least possible amount of storage for indexing, by mapping inflected and derived terms onto a single indexed stem word. The right-truncation algorithm, on the other hand, reduces the amount of storage to a lesser degree, but increases the probability of retrieving relevant (user-favorable) documents compared to the stemming algorithm. One purpose of our investigation is to compare the efficiency of these two indexing mechanisms. The mathematical component computes the TF-IDF (term-weighting scheme) by multiplying the inverse document frequency array with the term frequency array for each term contained in every document. It then computes the cosine similarity between the query vector and each individual document vector in the collection. The greater the cosine similarity between the query vector and the document vector, the greater the relevance of the document to the query. Expressed differently, the greater the cosine similarity between the terms of the query and the document containing those terms, the higher the probability that the document corresponds to user interest, thereby improving the query's retrieval power. This paper also describes building a simple search engine based on a crawler, or spider. The crawler is an algorithm that crawls the file system starting from a specified folder and indexes different types of documents. A basic design and object model was developed to support single-word as well as multiple-word search results. It is capable of finding data to index by following (tracing) web links rather than searching directory listings in the file system. In this process, files are downloaded through HTTP and HTML pages are parsed in order to obtain more links without getting into a recursive loop. This paper also discusses how to improve indexing efficiency using a right-truncated stemmer for Arabic document processing.
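
The weighting and ranking steps described above can be summarized in a short Python sketch: TF-IDF weights are computed per term and document, and documents are ranked by the cosine similarity between their vectors and the query vector. Tokenization, stemming and right truncation are omitted, and all names here are illustrative rather than taken from the system itself.

import math
from collections import Counter

def idf_table(docs):
    # Inverse document frequency for every term occurring in the collection.
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))
    return {t: math.log(n / df[t]) for t in df}

def weight(terms, idf):
    # TF-IDF vector: term frequency multiplied by inverse document frequency.
    return {t: tf * idf[t] for t, tf in Counter(terms).items() if t in idf}

def cosine(u, v):
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    norm = lambda vec: math.sqrt(sum(w * w for w in vec.values()))
    return dot / (norm(u) * norm(v)) if u and v else 0.0

# Documents and the query are pre-tokenized lists of (stemmed) terms.
docs = [["arabic", "search", "engine"], ["arabic", "stemming"], ["web", "spider"]]
idf = idf_table(docs)
doc_vecs = [weight(d, idf) for d in docs]
query_vec = weight(["arabic", "stemming"], idf)
ranking = sorted(range(len(docs)), key=lambda i: cosine(query_vec, doc_vecs[i]), reverse=True)
print(ranking)   # [1, 0, 2]: document indices in decreasing order of relevance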

Keywords: Web search engine, truncation, indexing efficiency, spider, crawler.

Received March 18, 2007; accepted December 31, 2007 


A Novel Distance Based Relocation Mechanism
to Enhance the Performance of Proxy
 Cache in a Cellular Network

Mala Chelliah1, Govindaram N.1, and Nagamaputhur Gopalan2
1Department of Computer Science and Engineering, National Institute of Technology, India
2Department of Computer Applications, National Institute of Technology, India


Abstract: Accessing the World Wide Web with wireless devices has been a promising technology for the past few years. Interruption of World Wide Web access during handover in a cellular network can be avoided by relocating the proxy cache to the target base station in advance of the handover, using a path prediction algorithm. To reduce the cache overhead during handover, this paper proposes a "distance based relocation" mechanism, in which the cache of the base station is relocated once the mobile unit reaches the relocation point in the cell. This mechanism estimates the time at which the relocation has to be done by keeping track of the distance between the mobile terminal and the base station.
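
A minimal Python sketch of the trigger this implies is given below, under an assumed geometry in which the relocation point is taken as a fixed fraction of the cell radius; the fraction, coordinates and names are purely illustrative and not taken from the paper.

import math

def should_relocate(mobile_xy, base_xy, cell_radius, relocation_fraction=0.8):
    # Relocate the proxy cache once the mobile terminal's distance from the
    # serving base station crosses the relocation point in the cell.
    return math.dist(mobile_xy, base_xy) >= relocation_fraction * cell_radius

if should_relocate((850.0, 0.0), (0.0, 0.0), cell_radius=1000.0):
    print("begin relocating the proxy cache to the predicted target base station")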

Keywords: WWW, proxy cache, relocation point, path prediction, distance based relocation.

Received March 14, 2007; accepted October 10, 2007


Internet Banking in Jordan: An Arabic
Instrument Validation Process

Emad Abu-Shanab1 and Michael Pearson2
1Management Information System Department, Yarmouk University, Jordan
2Southern Illinois University, Carbondale, USA


Abstract: Internet banking is booming in Jordan and it is time for banks and customers to reap the benefits from such technology. Bank customers’ propensity to use internet banking is dependent on their attitudes towards such technology. This work validates an Arabic technology acceptance instrument through a rigorous process so that banks can better understand the factors that affect the customer’s intention to use the internet banking technology. The work utilized the backward translation method and developed an Arabic instrument for eleven constructs that yielded an acceptable level of reliability. Conclusions, implications and future work are provided at the end of the paper.

Keywords: Technology acceptance, factor analysis, Internet banking, Arabic instrument, UTAUT.

Received March 18, 2007; accepted December 31, 2007


Robust Approach of Address Block Localization
in Business Mail by Graph Coloring

Djamel Gaceb, Véronique Eglin, Frank Lebourgeois, and Hubert Emptoz
 University of Lyon, France


Abstract: An efficient mail sorting system is mainly based on accurate optical recognition of the addresses on the envelopes. However, the address block must be localized before the OCR recognition process. This localization step is crucial, as it has a great impact on the global performance of the system: a good localization step leads to a better recognition rate. The limitation of current methods lies in the modular linear architectures used for address block localization (ABL), whose performance depends on the performance of each independent module. In this paper we present a new approach to ABL based on hierarchical graph coloring and on a pyramidal data organization. This new approach has the advantage of guaranteeing good coherence between the different modules, and it reduces both the computation time and the rejection rate. The proposed method gives a very satisfactory rate of 98% correct localizations on a set of 750 envelope images.

Keywords: Text localization, physical segmentation, real time processing, business documents processing, graph coloring.

Received August 17, 2007; accepted December 13, 2007 


Performance Evaluation of Location
Update Schemes for MANET

Khaled Omer1 and Daya Lobiyal2
1Faculty of Engineering, School of Computer and Systems Sciences, University of Aden, Yemen
2Faculty of Computer and System Science, Nehru University, India


Abstract: In this paper, we have developed an analytical model to evaluate the performance of the home agent, quorum based, and grid location service location update schemes using a Markov chain. The model evaluates performance in terms of the cost of updates and queries. The cost of updates is computed in terms of the hops used in updating a location. The model also considers selective queries for destination search to compute the cost of queries, such that the cost of queries is computed in terms of the hops used in searching for the destination. Finally, the average total cost, which includes the update cost and the query cost, is determined. In the model, a moving node initiates a location update using a distance based triggering strategy. The average total cost is determined for different threshold distances. The analytical results show that the home agent location update scheme outperforms the quorum based and grid location service location update schemes.
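
In generic notation (the paper's exact expressions may differ), the quantity being compared is simply the sum of the two hop-count costs:

\[
\bar{C}_{total} = \bar{C}_{update} + \bar{C}_{query},
\]

where the update cost is the average number of hops spent on location updates and the query cost is the average number of hops spent searching for a destination; the scheme with the smallest average total cost at a given threshold distance is preferred.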

Keywords: Markov chain, location update, home agent, quorum, grid location service.

Received January 16, 2007; accepted February 23, 2008


An Empirical Performance Study of Connection Oriented Time Warp Parallel Simulation

Ali Al-Humaimidi and Hussam Ramadan
Information Systems Department, King Saud University, Saudi Arabia


Abstract: Time warp is a well-known optimistic mechanism for parallel execution of simulation programs. Implementing time warp using a connection-oriented communication approach is proposed in the literature as a way to improve time warp performance because it allows for the use of more efficient event queue implementations. However, no empirical performance studies have been reported for connection-oriented time warp. In this paper, we present an enhanced version of the connection-oriented time warp algorithm along with its associated data structures. An empirical performance study of the connection-oriented time warp is conducted on a network of workstations using a standard synthetic benchmark simulation model. Experimental results show that this algorithm is capable of achieving better performance than that of traditional connectionless time warp for several performance measures.

Keywords: Parallel simulation, time warp, connection-oriented, connectionless.

Received March 18, 2007; accepted December 13, 2007


Energy Efficient Data Compression in Wireless Sensor Networks

Ranganathan Vidhyapriya1 and Ponnusamy Thangapandian Vanathi2
1Department of Information Technology, PSG College of Technology, India
2Department of Electronics and Communication Engineering, PSG College of Technology, India


Abstract: In order for wireless sensor networks to exploit the sensed signal, signal data must be collected at a multitude of sensors and shared among them. This extensive sharing of data among the sensors conflicts with the requirements (energy efficiency, low latency and high accuracy) of wireless networked sensors. This paper describes our design and implementation of two lossless data compression algorithms integrated with a shortest path routing technique to reduce the raw data size and to accomplish an optimal trade-off between rate, energy, and accuracy in a sensor network. To validate and evaluate our work, we apply it to different types of datasets from different real-world deployments and show, based on simulations, that our approaches can reduce energy consumption compared with other data compression schemes.

Keywords: Wireless sensor networks, compression, routing, energy efficiency, lifetime, shortest path.

Received December 14, 2007; accepted March 12, 2008


Handwriting Arabic Character Recognition
Using the LeNet Neural Network

Rashad Al-Jawfi
Department of Mathematics and Computer Science, Ibb University, Yemen


Abstract: Character recognition has served as one of the principal proving grounds for neural network methods and has emerged as one of the most successful applications of this technology. In this paper, a new network is designed to recognize a set of handwritten Arabic characters. This new network consists of two stages: the first recognizes the main shape of the character, and the second recognizes the dots. The characteristics, structure, and training algorithm of the network are also presented.

Keywords: Arabic character recognition, neural network, LeNet.

Received October 11, 2007; accepted February 11, 2008 


New Architecture of Fuzzy Database
Management Systems

Amel Grissa Touzi and Mohamed Ali Ben Hassine
Faculty of Sciences of Tunis, Tunisia University, Tunisia


Abstract: Fuzzy relational databases have been extensively studied at a theoretical level. Unfortunately, the repercussions of this work in practice are negligible. Medina et al. have developed a server named FSQL, supporting flexible queries and based on a theoretical model called GEFRED. This server was programmed in the PL/SQL language under the Oracle database management system. To model flexible queries and the concept of fuzzy attributes, an extension of the SQL language named FSQL has been defined. The FSQL language extends SQL to support flexible queries with many fuzzy concepts. The fuzzy relational database (FRDB) is assumed to have already been defined by the user. In this paper, we extend the work of Medina et al. to present a new architecture for a fuzzy DBMS based on the GEFRED model. This architecture is based on the concept of weak coupling with the Oracle DBMS. It permits, in particular, the description, manipulation and interrogation of an FRDB in the FSQL language.
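
To illustrate what a flexible query evaluates, here is a small Python sketch that keeps only the rows whose membership degree in a fuzzy label (a trapezoidal "young" label over age) reaches a user-given threshold. It conveys only the idea behind fuzzy attributes and thresholds; it is neither FSQL syntax nor the GEFRED model itself, and all values are illustrative.

def trapezoid(x, a, b, c, d):
    # Membership degree of x in a trapezoidal fuzzy set defined by a <= b <= c <= d.
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

employees = [("Ali", 24), ("Mona", 31), ("Sami", 45)]
young = lambda age: trapezoid(age, 18, 22, 28, 35)
threshold = 0.7
print([(name, round(young(age), 2)) for name, age in employees if young(age) >= threshold])
# [('Ali', 1.0)] -- Mona's degree (about 0.57) falls below the 0.7 threshold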

Keywords: Fuzzy DB, FSQL, FIRST, FSQL server, GEFRED.

Received March 18, 2007; accepted May 30, 2007


Performance of Adaptive Beamforming Algorithm for LMS-MCCDMA MIMO Smart Antennas

Sidi Bahri1 and Fethi Bendimerad2
1Department of Electronics, Chlef University, Algeria
2Department of Electronics, Tlemcen University, Algeria


Abstract: We propose a downlink multiple-input multiple-output (MIMO) multi-carrier code division multiple access (MC-CDMA) system with an adaptive beamforming algorithm for smart antennas. The algorithm used in this paper is based on the least mean square (LMS) algorithm, with pilot channel estimation and a zero forcing equalizer in the receiver, requiring a reference signal and no channel knowledge. MC-CDMA is studied in a multiple antenna context in order to efficiently exploit the robustness against multipath effects and the multi-user flexibility of MC-CDMA, together with the channel diversity offered by MIMO systems, for radio mobile channels. Computer simulations, considering a multi-path Rayleigh fading channel, inter-symbol interference and interference, are presented to verify the performance. Simulation results demonstrate a significant performance improvement using our proposed receiver structure for a MIMO system in the presence of strong interference. The BER performance of the proposed system is therefore much better than that of an STBC MC-CDMA system with the RMSE algorithm. On the other hand, it can be seen that, as the number of antennas at the transmitter and receiver increases, the performance also improves, while an increasing number of interferers decreases the performance of the system in the same Rayleigh fading environment.
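
For reference, the standard complex LMS weight update that such a pilot-aided beamformer relies on can be written as follows (generic notation; the paper's exact formulation may differ):

\[
e(n) = d(n) - \mathbf{w}^{H}(n)\,\mathbf{x}(n), \qquad
\mathbf{w}(n+1) = \mathbf{w}(n) + \mu\, e^{*}(n)\,\mathbf{x}(n),
\]

where x(n) is the received array snapshot, d(n) the pilot (reference) symbol, w(n) the beamforming weight vector, and mu the step size controlling convergence speed and misadjustment.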

Keywords: Adaptive beamforming, LMS algorithm, MC-CDMA, MIMO system, smart antenna.

Received September 23, 2007; accepted February 11, 2008 


Implementation of Adaptive Buffer in Video Receivers Using Network Processor IXP 2400

Kandasamy Anusuya, Karupagouder Thirunavukkarasu, and Subha Rani Sundaresan
Faculty of Electronics and Communication Engineering, PSG College of Technology, India


Abstract: New services such as non-interactive video streaming, which demand higher bandwidth, have become popular with the introduction of broadband networks. However, the quality of streamed video is impaired by factors such as packet loss, congestion, delay and jitter in the network. Hence, to improve the video quality, an adaptive data-rate-based playout buffer management scheme is proposed. It mainly considers the playback time and the playout buffer size of the video player used at the receiver. Moreover, in recent designs, network processors are used in a wide range of networking embedded systems, including multi-service switches, routers and so on. Network processors are fully programmable processors which perform a number of simultaneous operations; this ensures full network performance and also accommodates complex services on a per-packet basis. Hence, the proposed scheme is implemented on the IXP 2400 network processor and its performance is evaluated for different data rates. It is observed that the packet loss is reduced and the buffer utilization is improved.
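
A hypothetical sketch of how data-rate adaptation can be driven by playout-buffer occupancy is given below; the thresholds, scaling factors and names are purely illustrative of the general idea and are not the paper's scheme, which is realized on the IXP 2400.

def adapt_rate(current_rate_kbps, occupancy, low=0.25, high=0.75):
    # occupancy: fraction of the playout buffer currently filled (0.0 .. 1.0)
    if occupancy > high:          # buffer close to overflow: slow the sender down
        return current_rate_kbps * 0.8
    if occupancy < low:           # buffer close to underflow: request more data
        return current_rate_kbps * 1.2
    return current_rate_kbps      # occupancy in the comfortable range: keep the rate

print(adapt_rate(512, 0.9))       # 409.6 -- reduced rate requested from the sender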

Keywords: Buffer utilization, network processor, playout buffer, playback time, video streaming.

Received February 6, 2007; accepted November 6, 2008 


A Rule-Based Approach for Tagging
Non-Vocalized Arabic Words

Ahmad Al-Taani and Salah Abu Al-Rub
Department of Computer Sciences, Yarmouk University, Jordan


Abstract: In this work, we present a tagging system which classifies the words in a non-vocalized Arabic text into their tags. The proposed tagging system passes through three levels of analysis. The first level is a lexical analyzer composed of a lexicon containing all fixed words and particles, such as prepositions and pronouns. The second level is a morphological analyzer which relies on word structure, using patterns and affixes to determine the word class. The third level is a syntax analyzer, or grammatical tagger, which assigns grammatical tags to words based on their context, that is, the position of the word in the sentence. The syntax analyzer consists of two stages: the first depends on specific keywords that indicate the tag of the following word, and the second is a reversed parsing technique which scans the available grammars of the Arabic language to determine the class of a single ambiguous word in the sentence. We have tested the proposed system on a corpus consisting of 2355 words. Experimental results show that the proposed system achieved a success rate approaching 94% of the total number of words in the sample used in the study.
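
The three-level flow can be pictured with a toy Python sketch: a lexicon lookup for fixed words and particles, then an affix-based morphological check, then a keyword-driven contextual rule. The miniature lexicon, suffix patterns and cue rules below are illustrative stand-ins for the paper's much larger rule set.

LEXICON = {"في": "PREP", "هو": "PRON"}            # level 1: fixed words and particles
SUFFIX_RULES = [("ون", "NOUN"), ("ات", "NOUN")]    # level 2: affix-based patterns
KEYWORD_CUES = {"PREP": "NOUN"}                    # level 3: word after a preposition

def tag(words):
    tags = []
    for i, w in enumerate(words):
        if w in LEXICON:                                   # lexical analyzer
            tags.append(LEXICON[w])
        elif any(w.endswith(s) for s, _ in SUFFIX_RULES):  # morphological analyzer
            tags.append(next(t for s, t in SUFFIX_RULES if w.endswith(s)))
        elif i > 0 and tags[i - 1] in KEYWORD_CUES:        # syntax analyzer (keyword cue)
            tags.append(KEYWORD_CUES[tags[i - 1]])
        else:
            tags.append("UNK")
    return tags

print(tag(["هو", "في", "البيت"]))   # ['PRON', 'PREP', 'NOUN']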

Keywords: Part-of-speech tagging, lexical analyzer, morphological analyzer, Arabic language processing.

Received July 3, 2008; accepted September 3, 2008


Local Predecimation with Range Index Communication Parallelization Strategy for Fractal Image Compression on a Cluster of Workstations

Syed Hussain1, Kalim Qureshi2, Mohammad Al-Mullah2, and Haroon Rashid1
1Department of Computer Science, Comsats Institute of IT, Pakistan
2Faculty of Math and Computer Science, Kuwait University, Kuwait


Abstract: In this paper, we have implemented and evaluated the performance of the local predecimation with range index communication (LPRI) parallelization strategy for fractal image compression on a Beowulf cluster of workstations. The strategy effectively balances the load among the workstations. We have evaluated the execution time of LPRI while varying the number of workstations and the user-specified root mean square error. We have also reported the measured speedup and worker idle time of the LPRI parallelization.
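
The reported measurements follow the usual definitions (generic notation): with T1 the compression time on a single workstation and Tp the time on p workstations,

\[
S(p) = \frac{T_{1}}{T_{p}}, \qquad E(p) = \frac{S(p)}{p},
\]

so that a speedup close to p (efficiency close to 1) indicates that the load balancing keeps worker idle time low.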

Keywords: Load balancing, task partitioning, parallelization, fractal image compression.

Received October 25, 2007; accepted March 5, 2008 


An Empirical Examination of Maturity Model as Measurement of Information Technology
 Governance Implementation

Husam Abu Khadra1, Majdy Zuriekat2, and Nidal Alramhi3
1Department of AIS, Arab Academy for Banking and Financial Sciences, Jordan
2Department of Accounting, German-Jordanian University, Jordan
3Department of AIS, Zarqa Private University, Jordan


Abstract: The aim of this study is to evaluate information technology governance implementation in Jordanian domestic banks using the maturity model. An empirical survey using a self-administered questionnaire was carried out to achieve the study objectives. The results reveal that Jordanian domestic banks mainly apply the awareness and communications, responsibility and accountability, and skills and expertise dimensions effectively, while they do not do enough with regard to the other dimensions (tools and automation, goal setting and measurement, and policies, plans and procedures). The study's main recommendation for domestic banks is to give more attention to information technology governance. Professionals in Jordanian domestic banks should work to strengthen information technology governance across all of its dimensions.

Keywords: Accounting information system, control, security, information technology governance, maturity model.

Received March 31, 2007; accepted June 24, 2008 
