IJCTA-Volume 3 Issue 5/ September-October 2012
S.No
Title/Author Name
Page No
1
A Comparative Evaluation of Cryptographic Algorithms
-Er. Kumar Saurabh, Er. Ravinder Singh Mann
Abstract
Security of information has become an important issue in data communication. Encryption has emerged as a solution and plays a vital role in information security systems. This security mechanism uses algorithms to scramble data into unreadable text that can be decoded or decrypted only by a party that possesses the associated key. These algorithms consume a significant amount of computing resources such as CPU time, memory and battery power. This paper performs a comparative analysis of three algorithms, ECC, AES and RSA, considering parameters such as computation time and algorithmic complexity. A cryptographic tool is used for conducting the experiments, and experimental results are given to analyse the effectiveness of each algorithm.
Keywords: Encryption, secret key encryption, public key encryption, ECC, RSA, AES
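Since ECC, AES and RSA implementations are not part of the Python standard library, the timing methodology the abstract describes can only be sketched generically; the `xor_cipher` stand-in below is hypothetical and simply illustrates how per-algorithm computation time would be measured:

```python
import time

def benchmark(encrypt, data, runs=100):
    """Time an encryption callable over several runs; return mean seconds per call."""
    start = time.perf_counter()
    for _ in range(runs):
        encrypt(data)
    return (time.perf_counter() - start) / runs

# Toy XOR "cipher" as a stand-in for a real AES/RSA/ECC implementation.
def xor_cipher(data, key=0x5A):
    return bytes(b ^ key for b in data)

payload = b"sample plaintext" * 64
mean_t = benchmark(xor_cipher, payload)
print(f"mean encryption time: {mean_t:.6f} s")
```

The same harness would be pointed at each real algorithm in turn to produce the comparison table.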
1653-1657


2
The Web Page Prediction by "TRP" Machine
-Sourabh Mayria
Abstract
In the era of computer science, some phenomena have become very prominent, and the use of the Internet is one of them. Today the Internet is popular not only in developed countries but has also spread through developing countries. Through it, computers are used not only in technical fields but also in artistic and domestic ones. Given the overall spread of the Internet, the commerce of any country is inevitably affected by it, and Internet use is now studied from market, psychological and philosophical points of view. From the market point of view, it is necessary for an advertising company, for example, to observe keenly which websites are opened and used most by users; in technical terms this is called WEB SITE PREDICTION or WEB PAGE PREDICTION (depending on whether a full site or one or more pages is predicted). This prediction is normally made by an algorithm or a set of algorithms; hundreds or thousands of approaches are in use today, and new approaches appear daily in the field. It is worth noticing, however, that these approaches are very technical: used by a less skilled person they give improper results, because writing a proper algorithm is not child's play. As described above, web page prediction is of interest not only to technical people but also to the market, so an approach is needed that gives better results in the field. My new approach is based on collaboration of an online computer with the TRP machine used today by television channels, and it gives good results.
1658-1661


3
Design of Fuzzy Logic Controller to Enhance the Operation of Cricket Bowling Machine
-Manas Yetirajam, Manas Ranjan Nayak, Dr. Subhagata Chattopadhyay
Abstract
A Fuzzy Logic Controller (FLC) can be used in a cricket bowling machine to increase the variations of a throw within a fuzzy range of parameters. A bowling machine controlled by an FLC enables a coach to specify the range of parameters that replicates a particular manner of bowling. Because the FLC generates a different crisp value out of the specified fuzzy range each time, it gives the batsman the feeling of a real match. To study the operation of the controller, four standard parameters are considered: speed, line, length and deviation. Experimental results show that each time the controller generates a new crisp value within the specified range of parameters. The FLC also generates the probability of playing or not playing a particular type of throw, and the generated probability values virtually take over a function of the coach. Thus, the paper suggests applying an FLC in a bowling machine to help the coach create a real match scenario and to train the batsman with less effort by suggesting whether or not to attempt a throw.
Keywords: Fuzzy logic controller; Cricket bowling machines; Virtual play; Decision making
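The FLC behaviour described above can be sketched minimally: a triangular membership function over a hypothetical "medium pace" speed range, centroid defuzzification, and a random crisp draw restricted to the fuzzy set. The 110-140 km/h range and the 0.5 acceptance threshold are assumptions for illustration, not values from the paper:

```python
import random

def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def centroid(a, b, c, steps=200):
    """Centroid defuzzification of the triangular set, sampled over [a, c]."""
    xs = [a + (c - a) * i / steps for i in range(steps + 1)]
    num = sum(x * tri(x, a, b, c) for x in xs)
    den = sum(tri(x, a, b, c) for x in xs)
    return num / den

# Hypothetical "medium pace" speed set in km/h fixed by the coach.
a, b, c = 110.0, 125.0, 140.0
print("defuzzified speed:", round(centroid(a, b, c), 1))

# Each delivery draws a crisp speed whose membership exceeds a threshold,
# so consecutive throws differ while staying inside the fuzzy range.
random.seed(1)
while True:
    s = random.uniform(a, c)
    if tri(s, a, b, c) >= 0.5:
        break
print("next delivery speed:", round(s, 1))
```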
1662-1666



4
Wireless Sensor Networks Issues and Applications
-Rajkumar, Vani B A, Kiran Jadhav, Vidya S
Abstract
Wireless Sensor Networks (WSNs) have recently come to the forefront of the scientific community. Current WSNs typically communicate directly with a centralized controller or satellite. A smart WSN, on the other hand, consists of a number of sensors spread across a geographical area; each sensor has wireless communication capability and sufficient intelligence for signal processing and networking of the data. The structures of WSNs are tightly application-dependent, and many services also depend on application semantics. Thus there is no single typical WSN application, and dependency on applications is higher than in traditional distributed applications. The application/middleware layer must provide functions that create effective new capabilities for efficient extraction, manipulation, transport and representation of information derived from sensor data. This paper provides a survey of wireless sensor network issues and applications where the use of such sensor networks has been proposed.
Keywords: Wireless Sensor Networks, Issues and Applications
1667-1673


5
Avi Sorting Network
-Avinash Bansal, Kamal Gupta
Abstract
A sorting network is an abstract mathematical model that can be used as a multiple-input, multiple-output switching network to sort data in ascending or descending order [1]. Sorting has been one of the most critical applications on parallel computing machines. Many classic textbooks on algorithms, such as that of Thomas H. Cormen, therefore consider this problem in great detail and list many sorting networks for the purpose [2]. There are many sorting networks: the bubble/insertion sorter and the odd-even sorter sort the data in O((log2 n)^2) time, while some other sorters have O(n^2) time complexity, where n is the number of elements. In this paper we propose a sorting network called the "Avi Sorter", with time complexity O(n log2 n), which is based on an approach similar to the bubble sort algorithm. This sorting network provides an easy way to understand and manipulate the concept of a sorting network.
Keywords: Sorting, Parallel, Network, Avi and Comparator
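The Avi Sorter itself is not given in the abstract; the classic bubble-sort-like comparator network it builds on is odd-even transposition, sketched here for illustration. Each round applies a fixed layer of compare-exchange elements, which is what makes the scheme a network rather than a sequential sort:

```python
def odd_even_transposition_sort(data):
    """Sort via a fixed comparator network: n rounds of compare-exchange
    on alternating even/odd adjacent pairs (a bubble-sort-like network)."""
    a = list(data)
    n = len(a)
    for rnd in range(n):
        start = rnd % 2          # even round: pairs (0,1),(2,3)...; odd round: (1,2),(3,4)...
        for i in range(start, n - 1, 2):
            if a[i] > a[i + 1]:  # comparator: exchange if out of order
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(odd_even_transposition_sort([5, 1, 4, 2, 8, 0, 2]))  # [0, 1, 2, 2, 4, 5, 8]
```

Within each round all comparators are independent, so on a parallel machine a round runs in O(1) time and the network finishes in n rounds.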

1674-1676


6
A UPFC Damping Control Scheme using Lead-Lag and ANN based Adaptive Controllers
-D. Ramesh, L.V. Mahesh Babu
Abstract
Low Frequency Oscillations (LFO) occur in power systems because of a lack of damping torque to counteract power system disturbances such as changes in mechanical input power. In the past, the Power System Stabilizer (PSS) was used to damp LFO. FACTS devices, such as the Unified Power Flow Controller (UPFC), can control power flow and increase transient stability, so a UPFC may be used to damp LFO instead of a PSS; the UPFC damps LFO through direct control of voltage and power. In this research the linearized (Heffron-Philips) model of a synchronous machine connected to an infinite bus (Single Machine-Infinite Bus: SMIB) with a UPFC is used, and an adaptive ANN damping controller for the UPFC is designed and simulated to damp LFO. Simulation is performed for various types of loads and for different disturbances. Simulation results demonstrate that the developed ANN damping controller is more effective in damping electromechanical oscillations than the conventional lead-lag controller.
Keywords- Low Frequency Oscillations (LFO), Unified Power Flow Controller (UPFC), Single Machine-Infinite Bus (SMIB) power system, Artificial Neural Network (ANN) damping controller
1677-1681



7
Realising State Model for Air Separation Unit using Subspace Identification Method
-P A Nagewswaro Rao, P Mallikarujuna Rao, B Rajesh Kumar
Abstract
In this paper the formation of a state model using the subspace identification method is discussed and implemented for a real-time distillation column used in an air separation unit. The algorithm used is the numerical algorithm for subspace state space system identification (N4SID); the data is processed for modelling and the results are discussed.
Keywords: Subspace Identification, N4SID, MIMO, ASU.
1682-1686



8
Event Sequence Analysis using Self Organizing Map
-Dharmendra Kaushik, Babita Kubde
Abstract
In today's world we have an abundance of data and a scarcity of knowledge, and the field of data mining emerged as the fit of the tool to the problem. With the advent of Internet technology and the exponential growth of the World Wide Web, the concept of web mining found a place for itself and emerged as a separate field of research. Web mining involves a wide range of applications that aim at discovering and extracting hidden information in data stored on the Web. Web log analysis is an innovative and unique field constantly formed and changed by the convergence of various emerging Web technologies. Due to its interdisciplinary character, the diversity of issues it addresses, and the variety and number of Web applications, it is the subject of many distinctive and diverse research methodologies. This paper examines the research methodologies used in this area, summarizes research results, and proposes new directions for future research.
Keywords: Data Mining, Web Mining, Preprocessing web log, weblog mining, web usage mining
1687-1693


9
Making Method of the Data Processing System in the Experiment of the Planck Constant Measuring with Photoelectric Effect
-JIANG Xing-fang, LI Yan-ji, JIANG Hong
Abstract
The Planck constant is an important constant in modern physics. Its experimental verification is generally performed through the photoelectric effect experiment, whose data processing is very complex. To simplify this, the intelligent system "Measuring of the Planck constant with photoelectric effect" was developed with the multimedia authoring tool Multimedia ToolBook. The experimental data obtained by the user, after the leakage current between electrodes and the anode current had been compensated, was large in number. Once the data had been entered into the corresponding fields, the intelligent system could display the volt-ampere curve dynamically, and the display speed could be adjusted. In particular, when the user completed the experiments for only 3 or 4 of the filters, so that the number of clamp voltages was 3 or 4, the intelligent data processing system could still display the slope of the fitted straight line and the Planck constant; the laboratory provides five filters. The intelligent system used techniques such as the "non zero plus one" statistical method, the least squares method, rounding off, and keeping one digit of uncertainty aligned with the last digit of the significant figures. After arbitrarily 3 or 4, or all 5, values of the clamp voltage had been filled into the fields, the slope of the fitted straight line could be calculated and drawn, and the Planck constant and its uncertainty could be calculated and displayed. The fitted straight line is the relationship between the clamp voltages and the incident light frequencies, and the Planck constant is obtained by multiplying the slope of the fitted line by the electronic charge. These methods can be applied to develop various intelligent experimental data processing systems.
Keywords: physics experiment; intelligent system; data processing; Multimedia ToolBook
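The fitting step described above (clamp voltage versus frequency, with h obtained from the slope times the electronic charge) can be sketched with a least-squares fit. The filter frequencies and the 2.0 eV work function below are hypothetical sample values, not the paper's data:

```python
E = 1.602e-19   # elementary charge (C)
H = 6.626e-34   # "true" Planck constant used to generate the hypothetical data

# Hypothetical stopping ("clamp") voltages for 5 filter frequencies,
# generated from eU = hf - W with assumed work function W = 2.0 eV.
freqs = [5.19e14, 5.49e14, 6.88e14, 7.41e14, 8.20e14]   # Hz
W = 2.0 * E
volts = [(H * f - W) / E for f in freqs]

def fit_slope(xs, ys):
    """Least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

h_est = fit_slope(freqs, volts) * E   # h = slope * e
print(f"estimated Planck constant: {h_est:.3e} J s")
```

With noise-free synthetic data the fit recovers the input constant exactly; real measurements would also carry the uncertainty the system reports.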
1694-1698



10
An Efficient Reversible Design of BCD Adder
-T.S.R. Krishna Prasad, Y. Satyadev
Abstract
Nowadays, reversible logic plays an important role in VLSI design, with voluminous applications in quantum computing, optical computing, quantum-dot cellular automata and digital signal processing. Adders are key components of many computational units, so an efficient binary coded decimal (BCD) adder designed with reversible gates is needed. It is not possible to calculate quantum cost without an implementation in reversible logic. This paper proposes a new design for a BCD adder that is optimized in terms of quantum cost, memory usage and number of reversible gates. The important gates used for reversible logic synthesis are the NOT gate, CNOT gate, Toffoli gate, Peres gate, TR gate and MTSG gate.
Keywords: Reversible logic, quantum cost, MTSG gate
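The defining property of such gates can be checked directly: a gate is reversible iff it is a bijection on its input bits. A minimal sketch for the Peres and Toffoli gates, using their standard textbook definitions (the paper's MTSG and TR gates are not reproduced here):

```python
from itertools import product

def peres(a, b, c):
    """Peres gate: (A, B, C) -> (A, A xor B, (A and B) xor C)."""
    return (a, a ^ b, (a & b) ^ c)

def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: target C flips when both controls are 1."""
    return (a, b, c ^ (a & b))

for gate in (peres, toffoli):
    # A 3-bit gate is reversible exactly when all 8 inputs map to 8 distinct outputs.
    outputs = {gate(*bits) for bits in product((0, 1), repeat=3)}
    assert len(outputs) == 8, f"{gate.__name__} is not reversible"
    print(gate.__name__, "is a bijection on 3 bits -> reversible")
```

The same exhaustive check scales to wider gates and is a useful sanity test when composing them into an adder.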
1699-1703


11
A Survey on Security Issues and Threat Models in the Cloud
-S. Neelima, Y. Lakshmi Prasanna, M. Padmavathi
Abstract
Cloud computing sees a technical and cultural shift of computing service provision from being provided locally to being provided remotely, en masse, by third-party service providers. Data that was once housed under the security domain of the service user has now been placed under the protection of the service provider. Users have lost control over the protection of their data: no longer is our data kept under our own watchful eyes. It is argued that obfuscation of one's data is not enough when seeking to protect it; control over how one's data is used, and the trust afforded to service providers, are equally important. In this paper, cloud computing security issues found within the cloud are discussed from both a technical and a socio-technical perspective. The origin of threats towards data within the cloud is described, together with two threat models based upon the data lifecycle: the first model represents a user-centric view, the other a cloud service provider's point of view.
1704-1709


12
Enabling a High Performance Result Verification Mechanism in Cloud Architecture by LP Computations
-Sanchari Saha
Abstract
Cloud computing is a comprehensive Internet-based computing solution whose flexibility comes from the allocation of resources on demand. While a traditional computer setup requires you to be in the same location as your data storage device, the cloud takes that step away: it makes it possible for us to access our information from anywhere at any time. But the information housed on the cloud is often seen as valuable to individuals with malicious intent, and the main concerns voiced by those moving to the cloud are security and privacy. The companies supplying cloud computing services know this and understand that without reliable security their businesses will collapse, so security and privacy are high priorities for all cloud computing entities. The main focus of this paper is not only to protect confidential data from malicious modifications but also to give a proof that the computed result is correct as requested. For this, linear programming computations are decomposed into public LP solvers. The original LP problem is converted into an arbitrary problem, which helps to protect confidential information stored in the cloud and also provides users with an efficient result verification mechanism.
Keywords - Cloud computing, Confidential information, homomorphic encryption, result verification, LP computation.
1710-1714



13
Strengthening Hash Functions using Block Symmetric Key Encryption Algorithm
-Richa Purohit, Yogendra Singh, Dr. Upendra Mishra, Dr. Abhay Bansal
Abstract
With the requirement of high speed network protocols, fast message integrity and authentication services are a top priority. Many integrity techniques that involve hashing are currently in use or under development, but almost every technique faces one attack or another, or some other security or performance issue. The chief problem is forgery of the hash value transferred with the message, as the adversary may create a fraudulent hash value for the changed message. This paper introduces a DES-encryption-based hashing technique for message integrity that provides authentication that the message was sent by the original sender only, and analyzes its strength. The security of the proposed method is based on the strength of the underlying one-way hash function.
Keywords: MD5, DES, message integrity, message authentication, hash function
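DES is not available in the Python standard library, so the keyed-hash idea, where only the key holder can produce a digest that verifies, can be sketched with stdlib HMAC as a structurally similar stand-in. Note this substitutes HMAC-MD5 for the paper's DES-based construction:

```python
import hashlib
import hmac

def keyed_digest(key: bytes, message: bytes) -> str:
    """Keyed hash: only holders of `key` can produce or verify the digest,
    so an adversary cannot forge a matching value for an altered message."""
    return hmac.new(key, message, hashlib.md5).hexdigest()

key = b"shared-secret"
msg = b"transfer 100 to alice"
tag = keyed_digest(key, msg)

# Receiver recomputes the digest and compares in constant time.
assert hmac.compare_digest(tag, keyed_digest(key, msg))
# A tampered message no longer matches the transmitted tag.
assert not hmac.compare_digest(tag, keyed_digest(key, b"transfer 900 to alice"))
print("integrity check passed")
```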
1715-1719



14
Web Document Clustering Using KEA-Means Algorithm
-Swapnali Ware, N.A. Dhawas
Abstract
In most traditional techniques of document clustering, the total number of clusters is not known in advance, and the cluster containing the target information, or the precise information associated with each cluster, cannot be determined. The k-means algorithm solves this problem when the number of clusters k is provided; however, if the value of k is modified, the precision of the result also changes. To address this, this paper proposes a new clustering algorithm, the KEA-Means algorithm, which combines the k-means algorithm with KEA (the key phrase extraction algorithm), which returns several key phrases from the source documents by using machine learning to build a model containing rules for generating the number of clusters of web documents in the dataset. The algorithm automatically generates the number of clusters at run time. The KEA-Means clustering algorithm provides an easy and efficient way to extract text documents from massive quantities of resources.
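The k-means half of the scheme can be sketched on scalars. The "document scores" and seed centroids below are hypothetical; in KEA-Means the number of seeds would come from the key-phrase extraction model rather than being fixed by hand:

```python
def kmeans_1d(points, centroids, iters=20):
    """Plain k-means on scalars: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    cents = list(centroids)
    for _ in range(iters):
        groups = [[] for _ in cents]
        for p in points:
            idx = min(range(len(cents)), key=lambda i: abs(p - cents[i]))
            groups[idx].append(p)
        cents = [sum(g) / len(g) if g else c for g, c in zip(groups, cents)]
    return cents, groups

# Hypothetical per-document scores with two obvious groups.
scores = [1.0, 1.2, 0.8, 9.0, 9.5, 8.7]
cents, groups = kmeans_1d(scores, centroids=[0.0, 10.0])
print("centroids:", [round(c, 2) for c in cents])   # [1.0, 9.07]
```

Real document clustering would use term-vector distances instead of scalar differences, but the assign/update loop is identical.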
1720-1725


15
A Survey on Congestion Control and Maximization of Throughput in Wireless Networks
-K. Ravindra Babu, J. Ranga Rao
Abstract
In multihop wireless networks, designing distributed scheduling algorithms to achieve the maximal throughput is a challenging problem because of the complex interference constraints among different links. Traditional maximal weight scheduling (MWS), although throughput-optimal, is difficult to implement in distributed networks; on the other hand, a distributed greedy protocol similar to IEEE 802.11 does not guarantee the maximal throughput. The proposed system introduces an adaptive carrier sense multiple access (CSMA) scheduling algorithm that can achieve the maximal throughput distributively. Major advantages of the algorithm are that it applies to a very general interference model and that it is simple, distributed, and asynchronous. We also include congestion control algorithms to reduce packet loss in the network, which improves the packet delivery ratio substantially.
Keywords- Multihop wireless networks, CSMA, Distributed network, Congestion control, Conflict graph, MAC Layer.
1726-1730



16
A Novel Approach to Conquer Inference Problems and Inference Risks in Secured Database
-K. Karthikeyan, T. Ravichandran
Abstract
Many applications, such as those of defense, commercial and marketing departments, need a strongly secured database. Database security is needed to protect our identity and the authentication process of users. We propose a novel security mechanism to overcome inference problems and risks in securing the database. Our approach performs violation inference detection for both single and multiple users: an agent is located between the user's input query and the database. This process achieves high authorization, communication accuracy and trust in communication, and prevents data leakage by inference.
Keywords: Inference engines, deduction, knowledge processing
1731-1735



17
Human Computer Interaction – A Modern Overview
-Rachit Gupta
Abstract
Human Computer Interaction (HCI) is a discipline which aims at an established understanding and design of interfaces between humans and computers such that the resulting systems are enjoyable to use, engaging and accessible. In the 1970s, development in HCI was inclined mainly towards the "usability" of interaction systems. Since then, HCI has seen positive growth in developing designs as well as evaluation methods to ensure that technologies are easy to learn and use. While HCI leads to efficient user handling, questions about increasing human dependency on computers, and the changes it brings, such as no longer being able to respond fully to beauty, are ignored amid this development. Hence, with a novel touch to our future, HCI practitioners in the coming years should strive for a conflict-free world where technology and quotidian life exist in harmony.
1736-1740


18
An Efficient CBIR System for Gray Scale Image Based on Local Row and Column Mean
-Madhavi Kshatri, Prof. Yogesh Ratore
Abstract
Spatial features of an image, such as the row mean and column mean, can be used to design a CBIR system, but the efficiency of such a system is very poor because two different images may have the same row mean and column mean features. If we divide the image into different parts and then compute the row mean and column mean for each part, the efficiency and accuracy can be improved a lot without extra computational power. This paper presents a local row mean and local column mean based image retrieval system for gray scale images. Various performance parameters were calculated, which show the accuracy of the proposed method.
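The local row/column mean feature described above can be sketched directly. The 4x4 images and the 2x2 block split below are hypothetical illustrations, not the paper's configuration:

```python
def block_row_col_means(img, splits=2):
    """Split a grayscale image (list of rows) into splits x splits blocks and
    return each block's row means and column means as one feature vector."""
    h, w = len(img), len(img[0])
    bh, bw = h // splits, w // splits
    feats = []
    for bi in range(splits):
        for bj in range(splits):
            block = [row[bj * bw:(bj + 1) * bw] for row in img[bi * bh:(bi + 1) * bh]]
            row_means = [sum(r) / bw for r in block]
            col_means = [sum(r[j] for r in block) / bh for j in range(bw)]
            feats.extend(row_means + col_means)
    return feats

def distance(f1, f2):
    """Euclidean distance between two feature vectors (smaller = more similar)."""
    return sum((a - b) ** 2 for a, b in zip(f1, f2)) ** 0.5

# Tiny hypothetical 4x4 grayscale images: similar content, slight variation.
img_a = [[10, 10, 200, 200]] * 4
img_b = [[12, 11, 198, 205]] * 4
print("distance:", round(distance(block_row_col_means(img_a), block_row_col_means(img_b)), 2))
```

Retrieval then ranks the database by this distance to the query image's feature vector; localising the means per block is what keeps two globally similar but spatially different images apart.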
1741-1746



19
A Systematic way of Hybrid model design and comparative analysis of EBGM and eigen values for biometric face recognition using neural network
-Jagmeet Singh Brar, Sonika Jindal
Abstract
Face recognition plays an essential role in human-machine interfaces, and naturally an automatic face recognition system is an application of great interest. Although the roots of automatic face recognition trace back to the 1960s, a complete system that gives satisfactory results for video streams remains an open problem. Research in the field has intensified over the last decade due to an increasing number of applications that can apply recognition techniques, such as security systems, ATM machines, "smart rooms" and other human-machine interfaces. Elastic Bunch Graph Matching (EBGM) [3] is a feature-based face identification method. The algorithm assumes that the positions of certain fiducial points on the faces are known, and stores information about the faces by convolving the images around the fiducial points with 2D Gabor wavelets of varying size. The results of all convolutions form the Gabor jet for that fiducial point. EBGM treats all images as graphs (called Face Graphs), with each jet forming a node. The training images are all stacked in a structure called the Face Bunch Graph (FBG), which is the model used for identification. For each test image, the first step is to estimate the positions of the fiducial points on the face based on their known positions in the FBG. Eigenfaces are a set of eigenvectors used in the computer vision problem of human face recognition. The approach of using eigenfaces for recognition was developed by Sirovich and Kirby (1987) and used by Turk and Alex Pentland in face classification; it is considered the first successful example of facial recognition technology. The purpose of this paper is the implementation of various methods from two different families of face recognition algorithms, namely EBGM and eigenvalues, for biometric face recognition.
1747-1751




20
Fault-Tolerant Identification in Wireless Sensor Networks for Maximizing System Lifetime
-Middela Shailaja, AnandaRaj S.P, Poornima S
Abstract
Wireless Sensor Networks (WSNs) are used by many applications such as security, command and control, and surveillance monitoring. In all such applications, the main use of a WSN is sensing and retrieving data. Many WSN systems are query based: they give responses within a stipulated time based on the user's query. However, a WSN is prone to sensor faults, for it is not reliable, and thus the network energy level goes down, reducing the lifetime of the network. To overcome this, fault tolerance mechanisms can be used to improve the reliability of finding failed nodes, which are then recovered by cluster heads. This paper presents an algorithm that can effectively increase the lifetime of a WSN while satisfying the QoS requirements of the application. The algorithm is adaptive as well as fault-tolerant; it uses path and source redundancy and is based on hop-by-hop data delivery. Empirical simulation results revealed that the proposed system is feasible. The system also proposes authentication of all kinds of identified faults and provides the services in a quality manner; it increases the data flow and reduces the faults.
Index Terms: WSN, QoS, query processing, energy efficiency, network lifetime, fault tolerant, data aggregation, data flow
1752-1757



IJCTA © Copyrights 2010| All Rights Reserved.

This work is licensed under a Creative Commons Attribution 2.5 India License.