IJCTA-Volume 4 Issue 6/ November-December 2013
S.No
Title/Author Name
Page No
1
Encryption Quality Analysis in MPEG Video Format
-Priyanka Sharma,Dinesh Goyal
Abstract
In this work, we analyze four fast MPEG video encryption algorithms. These algorithms use a secret key to randomly change the sign bits of DCT coefficients and/or the sign bits of motion vectors; the encryption effect is realized by the IDCT during MPEG video decompression. The algorithms add very little overhead to the MPEG codec, and software implementations are fast enough to meet the real-time requirements of MPEG video applications, so satisfactory results can be obtained with these video encryption algorithms. The experiments conducted address two points: first, the encryption results and the overhead added to the MPEG codec; second, the encoding time for varying key lengths. These experiments help establish the efficiency and real-world applicability of such algorithms, and also expose drawbacks in the existing algorithms so that modifications can be suggested for faster and better security.
Keywords: MPEG video encryption, MPEG codec, VEA algorithms
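The sign-bit scheme the abstract describes can be sketched in a few lines: a key-seeded pseudorandom bit stream decides which nonzero DCT coefficients (or motion-vector sign bits) get flipped. This is an illustrative Python sketch, not the authors' implementation; the function names and the use of Python's `random` generator are assumptions.

```python
import random

def sign_flip_encrypt(dct_coeffs, key):
    """Flip the sign of nonzero DCT coefficients according to a
    key-seeded pseudorandom bit stream (illustrative sketch)."""
    rng = random.Random(key)          # the secret key seeds the bit stream
    out = []
    for c in dct_coeffs:
        # flip the sign only when the key bit is 1 and the coefficient is nonzero
        if c != 0 and rng.getrandbits(1):
            out.append(-c)
        else:
            out.append(c)
    return out

def sign_flip_decrypt(enc_coeffs, key):
    # the transform is an involution: applying it again restores the block
    return sign_flip_encrypt(enc_coeffs, key)
```

Because flipping is its own inverse, running the same keyed transform twice restores the original block, which is why decryption costs no more than encryption.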
869-879


2
A Simple Algorithm for Eye Detection and Cursor Control
-Surashree Kulkarni,Sagar Gala
Abstract
This paper presents an effective albeit simple technique for moving the mouse cursor by first detecting the user’s eyes and then calculating the position on screen at which the user is looking. Our approach applies a series of image-processing steps and then an algorithm that converts screen coordinates to world coordinates. For users without spectacles, the formula works quite well.
Keywords: Eye Detection, Image Processing, Connected Component Labelling, Cursor Movement
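The coordinate-conversion step might look like the following linear calibration mapping from a detected pupil position to a screen position. This is a hedged sketch only: the paper's actual formula is not reproduced here, so the `cal` dictionary and its calibration constants are illustrative assumptions.

```python
def pupil_to_screen(pupil_x, pupil_y, cal):
    """Map a detected pupil position to a screen coordinate using a
    linear calibration fit (constants in `cal` are assumed, not the paper's)."""
    sx = cal["ax"] * pupil_x + cal["bx"]
    sy = cal["ay"] * pupil_y + cal["by"]
    # clamp so measurement noise never maps the cursor off-screen
    return (min(max(sx, 0), cal["width"] - 1),
            min(max(sy, 0), cal["height"] - 1))
```

The calibration constants would be fitted once per user, e.g. by asking them to look at known screen corners.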
880-882


3
An Effective Algorithm of Encryption and Decryption of Images Using Random Number Generation Technique and Huffman coding
-Dr. T. Bhaskara Reddy,Hema Suresh Yaragunti,T. Sri Harish Reddy,Dr. S. Kiran
Abstract
Data security has become a most important aspect of data transmission and storage, and the transmission and exchange of images also needs high security; cryptography is used to maintain it. In this paper, we implement security for images. We read an image's pixels and convert them into a pixel matrix whose order is the height and width of the image. The pixels are replaced with fixed numbers, and a key is generated using a random-number-generation technique. The image is encrypted using this key, a random transposition is performed on the encrypted image, the result is converted into a one-dimensional encrypted array, and finally Huffman coding is applied to that array; this reduces the size of the encrypted image and encrypts it again. Decryption is the reverse of encryption. The proposed method thus provides high security for an image with minimum memory usage.
Keywords: Encryption, Decryption, Random generation technique, Exception law, Random row transposition
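The final compression step, Huffman coding of the one-dimensional encrypted array, can be sketched as follows. This is the generic textbook construction of a Huffman code table, not the authors' code.

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a Huffman code table for the symbols in `data`
    (a sketch of the final compression step, not the paper's exact code)."""
    freq = Counter(data)
    # heap entries: (frequency, tiebreak, tree); a tree is a symbol or (left, right)
    heap = [(f, i, s) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:                      # degenerate case: one distinct symbol
        return {heap[0][2]: "0"}
    while len(heap) > 1:                # merge the two least frequent trees
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):             # read codes off the finished tree
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

Frequent pixel values receive shorter codes, which is what shrinks the encrypted array.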
883-891


4
Design and Development of Fixture with Cantilever Beam for Intermediate Gearbox
-K.Prabhakar,M.Ashok kumar,E.V.Subbareddy,S.Kasim Vali
Abstract
The steel industry involves various production methods and processes for manufacturing steel and its end products, carried out with the help of heavy-duty industrial machines and equipment. Heavy-duty, powerful industrial machines including converters, continuous casting machines, crane slew drives, conveyors and winches are used in the steel industry, and various components, including powerful gearboxes, are used in these machines to ensure smooth power transmission. Our team successfully reduced repair time by cutting the time needed to dismantle the two halves of the gearbox casing, and in turn reduced man-hour consumption. This was achieved by designing and manufacturing a fixture that can separate both halves of the gearbox casing within minutes using a 100 T hydraulic jack.
Keywords: fixture, hydraulic jack, intermediate gearbox, development
892-896


5
Generic Architecture for Mobile Check System
-Karima MaaZouz,Habib.Benlahmer,Naceur.Achtaich
Abstract
The explosion of the smartphone market has rapidly changed the way m-commerce transactions are performed, especially m-payment systems, which are gaining wide acceptance due to their diversity and to new mobile technologies. In this work we introduce the m-check system as a mobile payment system, and we present a generic architecture for it together with the different protocols for its implementation.
897-901


6
Balanced Reliable Shortest Route for AOMDV (BRSR-AOMDV) Using TF Mechanism in MANET
-D.Maheshwari,R.Nedunchezhian
Abstract
A mobile ad hoc network is a self-configuring, infrastructure-less network in which nodes are connected via wireless links and move independently in any direction, so each node's links change frequently. This paper proposes a new mechanism, Balanced Reliable Shortest Routing (BRSR), for the Ad Hoc On-Demand Multipath Distance Vector (AOMDV) routing protocol with a Three-way Filter (TF) mechanism; it increases the reliability of data transmission with fault tolerance and also provides load balancing. The BRSR route between source and destination is selected based on energy, link quality and interference noise in order to improve data transmission and load balancing. The protocol's performance is verified through simulation using the NS-2 simulator. The experimental results show that the BRSR-AOMDV protocol with the TF mechanism decreases the packet loss rate, increases throughput and packet delivery ratio, and reduces the average delay in an effective manner.
Keywords: MANET, BRSR-AOMDV, Fault tolerance, Load Balancing
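The route-selection idea above (scoring candidate AOMDV paths by energy, link quality and interference noise) might be sketched as a weighted score over each path's bottleneck hop. The weights, field names and bottleneck-style aggregation below are assumptions for illustration, not the paper's actual metric.

```python
def route_score(path_links, w_energy=0.4, w_quality=0.4, w_noise=0.2):
    """Score one candidate path; a path is only as good as its weakest hop,
    so take per-metric minima/maxima (weights are illustrative assumptions)."""
    energy = min(l["residual_energy"] for l in path_links)
    quality = min(l["link_quality"] for l in path_links)
    noise = max(l["interference"] for l in path_links)
    return w_energy * energy + w_quality * quality - w_noise * noise

def select_route(paths):
    # pick the best of several node-disjoint paths discovered by AOMDV
    return max(paths, key=route_score)
```

A real protocol would recompute these scores as residual energy and link quality change, which is where the load-balancing effect comes from.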
902-909


7
Subspace Clustering for High Dimensional Data using Density Notion
-Swati Harkanth,Prof. B. D. Phulpagar
Abstract
Clustering is a crucial data mining task which helps in abstracting huge amounts of data. Instead of searching for clusters in the full feature space, researchers are developing novel approaches for finding clusters in subspaces; as a result, subspace clustering has gained ample attention. In this paper we propose SUBDENCLU (SUBspace DENsity CLUstering), an algorithm for subspace clustering using a density-based approach. The proposed algorithm is an extension of the DBSCAN and SUBCLU algorithms. It aims to minimize the subspace search by effectively pruning irrelevant subspaces, and it minimizes the expensive region queries associated with density-based clustering. Experimental evaluation shows that we succeed in improving the quality of the clusters produced and that the clustering result contains meaningful clusters.
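The subspace pruning inherited from SUBCLU rests on a monotonicity property: a subspace can contain a density-based cluster only if all of its lower-dimensional projections do. A minimal sketch of candidate generation under that rule (illustrative only, not the SUBDENCLU implementation):

```python
from itertools import combinations

def candidate_subspaces(dense_subspaces, k):
    """SUBCLU-style pruning: a (k+1)-dim subspace is a candidate only if
    every one of its k-dim sub-subspaces contained a cluster (monotonicity)."""
    dense = set(dense_subspaces)        # frozensets of dimension indices
    dims = sorted({d for s in dense for d in s})
    candidates = []
    for combo in combinations(dims, k + 1):
        # keep the (k+1)-dim subspace only if all k-dim projections were dense
        if all(frozenset(sub) in dense for sub in combinations(combo, k)):
            candidates.append(frozenset(combo))
    return candidates
```

Each surviving candidate would then be checked with actual density-based clustering; everything pruned here never incurs a region query.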
910-916


8
Study on Improving Web Security using SAML Token
-Venkadesh.M,Dr.A.Chandra Sekar
Abstract
Web services are among the technologies most commonly used by computer professionals. They are mainly used for communication between different platforms within a network connection, and are created based on the SOA architecture, a platform popularly used in distributed systems. When web services are used in distributed systems, security becomes prominent. This study examines improving web security using SAML tokens; SOA principles and SAML are used throughout the study.
Keywords: Web Services, SAML, XML, HTTP, SOAP.
917-921


9
A Novel Hybrid Feature Selection and Intrusion Detection Based On PCNN and Support Vector Machine
-Aditya Shrivastava,Mukesh Baghel,Hitesh Gupta
Abstract
In this paper we propose a hybrid model for feature selection and intrusion detection. Feature selection is an important issue in intrusion detection: selecting features from attack-traffic and normal-traffic attributes is a challenging task, and the classification of known versus unknown attacks also poses a problem. PCNN is a dynamic network used for feature selection in classification; its dynamic nature selects attributes based on entropy. When an attribute's entropy is high, the PCNN network selects the feature, and when it is low, the PCNN feature selector reduces the feature's selection value. After feature selection, the Gaussian kernel of a support vector machine is applied for classification. Our detection rate is very high in comparison with other neural network models such as the RBF neural network and the SOM network. For the empirical evaluation we used the KDDCUP99 dataset and measured the detection rate, precision and recall of the proposed model.
Keywords: IDS, PCNN, SVM, feature selection, KDDCUP99
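The classification stage, a support vector machine with a Gaussian (RBF) kernel, has a decision function that can be sketched directly. The support vectors, alphas, labels and bias below would normally come from training on the selected features; the values used here are purely illustrative.

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    """Gaussian (RBF) kernel used inside the SVM decision function."""
    sq = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq)

def decision(x, support_vectors, alphas, labels, bias=0.0, gamma=1.0):
    """SVM decision value: sum_i alpha_i * y_i * K(x_i, x) + b.
    A negative value classifies x as one class, positive as the other;
    the parameters here stand in for trained values (illustrative)."""
    s = sum(a * y * rbf_kernel(sv, x, gamma)
            for a, y, sv in zip(alphas, labels, support_vectors))
    return s + bias
```

In the paper's setting the input vectors would be the PCNN-selected KDDCUP99 attributes, with labels distinguishing normal traffic from attacks.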
922-927


10
Novel Vantage-Scalable Cache Compression Scheme
-Poonam P.Aswani,Prof. B.Padmavathi
Abstract
In today’s world, speed is one of the important factors considered when selecting any electronic component in the market. The speed of a microprocessor-based system depends mainly on the speed of the microprocessor, which in turn depends on the memory access time. Cache compression is one way to increase the speed of a microprocessor-based system, since it increases cache capacity and off-chip bandwidth. Storing compressed lines in the cache increases the effective cache capacity; for example, in a compressed L1 cache design, each set can store either one uncompressed line or two compressed lines. Increasing the effective cache size can eliminate misses and thereby reduce the time lost to long off-chip miss penalties. However, compression increases the cache hit time, since the decompression overhead lies on the critical access path. Depending on the balance between hits and misses, cache compression therefore has the potential to either greatly help or greatly hurt performance.
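The set organization described above (each set holds one uncompressed line or two compressed lines) can be sketched as a toy model. The 64-byte set capacity and the FIFO eviction policy are assumptions for illustration; a real design would track compressed sizes in hardware and typically use LRU.

```python
class CompressedSet:
    """Toy model of one cache set that can hold either one uncompressed
    line or two compressed lines (capacity and eviction are assumed)."""
    CAPACITY = 64                      # bytes of data storage per set (assumed)

    def __init__(self):
        self.lines = []                # list of (tag, stored_size_in_bytes)

    def can_insert(self, size):
        return sum(s for _, s in self.lines) + size <= self.CAPACITY

    def insert(self, tag, size):
        # evict oldest-first until the new line fits (FIFO for simplicity)
        while not self.can_insert(size):
            self.lines.pop(0)
        self.lines.append((tag, size))

s = CompressedSet()
s.insert("A", 32)       # one compressed line
s.insert("B", 32)       # a second compressed line shares the same set
s.insert("C", 64)       # an uncompressed line needs the whole set
```

After the third insert, both compressed lines have been evicted and the set holds only the uncompressed line, which is exactly the capacity trade-off the abstract describes.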
928-932


11
Cross Layer Reliability Maximization in Layered Network With Random Link Failures
-K. Dastagiri,Dr K. Madhavi
Abstract
A MAC-layer congestion detection scheme is proposed. The model aims to provide a system that measures the amount of congestion at the victim node with maximal precision. This congestion detection mechanism is integrated with a two-step cross-layer congestion control routing topology. The proposed model controls congestion in two steps, combining energy-efficient congestion detection with optimal routing cost, and explores a cross-layer model of congestion detection and control. The Multicast Group Level Congestion Evaluation and Handling algorithm [MGLCEH] and the Multicast Group Level Load Balancing algorithm [MGLLBA] together form a hierarchical cross-layer congestion detection and avoidance model, which in short can be referred to as QoS optimization by cross-layer congestion handling (MGLCEH). The paper presents experimental and simulation results which show that better resource consumption and energy efficiency in congestion detection and control are possible with the proposed topology.
Index Terms: Ad-hoc networks, MANETs, congestion, cross-layer design, optimization, random access, wireless networks
933-939


12
QoS Centric Functional Adaptation for Composite Services in SOA
-P.Gangadhara
Abstract
Service-oriented architecture (SOA) is used today in many globally distributed, large software systems. In such systems, networks form collaborative environments that are in constant fluctuation; concepts like interactions and delegated task executions need continuous readjustment, requiring flexible, context-based interaction models. Software engineering has to face the challenge of controlling such systems powerfully and successfully, and sophisticated adaptation techniques are required to improve collaboration in them. To overcome problems not solved in self-adaptive, managed SOA-based systems, dynamic adaptation methods are required to deal with composite services along with peer-to-peer, distributed architectures. We suggest an approach for systems with dynamically adaptive composite services based on a widespread protocol; it builds on a three-layer reference model for such systems and uses a dialogue protocol for information distribution and decision making. The objective of the proposed system is to offer dynamic adaptation in composite web services with asynchronous execution, to achieve structural and global QoS requirements.
940-944


13
Towards Minimizing Computational Cost of Range Aggregates against Uncertain Location-Based Queries
-Y.Rajasekhar
Abstract
Location-based queries are uncertain in nature. Such queries have become common in many real-time applications where location-based services are offered, and such applications make effective use of a multi-dimensional search space. Processing uncertain location-based queries accurately is a tough task. This paper presents a new algorithm to handle such queries: it makes use of range aggregates such as count, avg and sum for query processing, and calculates the range aggregates efficiently so as to support correct results. We built a prototype application for testing the proposed algorithm. The experimental results showed that the application is capable of processing uncertain location-based queries efficiently, consuming less processing power and computation cost.
Index Terms – Aggregates, Uncertain location based queries
945-950


14
Effective Information Search and Retrieval for Answering Time Sensitive Queries
-V.Bharath Kumar
Abstract
The World Wide Web accumulates large volumes of documents every day, and search engines are used to retrieve required information from it. Search is generally carried out based on the similarity of the documents being searched for, with results presented according to the ranking of the items. However, similarity alone is not adequate for the best ranking mechanism: for some classes of queries, time can be an important dimension for searching in addition to content similarity. Such queries are known as time-sensitive queries; they are processed and ranked based on publication time as well as similarity. Existing research has focused on retrieving recent documents, and recently Dakka et al. presented a general framework for handling time-sensitive queries. In this paper we propose a framework that extends their work by considering more time-related dimensions, such as republication date and time and later-dated review articles of the documents. This improves the robustness of the system with respect to answering time-sensitive queries, as it can make use of review articles and summarize events in the temporal domain. The system is thus made capable of analyzing the contents of web documents along dimensions beyond their publication date. We built a prototype application to demonstrate the proof of concept; the empirical results revealed that the proposed framework for multi-dimensional time-sensitive queries is effective.
Index Terms – Time sensitive queries, information retrieval, multi-dimensions
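Ranking by publication time as well as similarity is often modelled by combining a content-similarity score with a recency decay. A minimal sketch, assuming an exponential decay with an illustrative 30-day half-life; this is not the paper's or Dakka et al.'s actual model.

```python
import math
from datetime import datetime

def time_aware_score(similarity, pub_date, query_time, half_life_days=30.0):
    """Combine content similarity with exponential recency decay.
    The decay form and the 30-day half-life are illustrative assumptions."""
    age_days = (query_time - pub_date).total_seconds() / 86400.0
    recency = math.exp(-math.log(2) * max(age_days, 0.0) / half_life_days)
    return similarity * recency
```

Extending such a score with further time dimensions (republication date, dates of review articles) amounts to taking the decay over the most relevant of several timestamps rather than the publication date alone.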
951-955


15
A Personalized Multimedia Recording System Using Memory Package on Mobile Device
-Cheng-Chieh Chiang
Abstract
Mobile devices such as smartphones and PDAs have become more popular in our lives. In this paper, we design a personalized multimedia system that can record human activities anytime and anywhere using a smartphone. First, we propose a structure called the memory package to integrate the multimedia information in the smartphone: users can create a memory package over a time period to record their activities using multimedia contents such as text, image, video, audio, and GPS records. Next, we design scene modes that predefine different preferences of our life, providing appropriate templates for generating memory packages. The system has been implemented on the Android platform to help users employ the multimedia devices mounted in a smartphone to record their life. This paper presents the details of the proposed multimedia system.
Keywords: Mobile Device; Memory Package; Scene Mode; Android System
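The memory-package structure might be modelled roughly as below. All field names, types and the scene-mode default are assumptions for illustration; the paper's actual Android data model is not reproduced here.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class MemoryPackage:
    """Sketch of a 'memory package': multimedia items grouped over one
    time period (field names are assumptions, not the paper's API)."""
    start_time: str
    end_time: str
    scene_mode: str = "daily"                    # predefined preference template
    texts: List[str] = field(default_factory=list)
    photos: List[str] = field(default_factory=list)        # media file paths
    gps_track: List[Tuple[float, float]] = field(default_factory=list)

    def add_location(self, lat, lon):
        # append one GPS fix to the package's track
        self.gps_track.append((lat, lon))
```

A scene mode would then act as a template deciding which of these fields the recording UI pre-populates.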
956-961


16
Assuring Reliability of Localization Accuracy in Anchor-free Mobile Localization
-Zulfazli Hussin,Yukikazu Nakamoto
Abstract
Anchor-free localization algorithms do not depend on the existence of anchor objects. Since no anchor objects are required, such algorithms can still be applied in settings where distributing anchor objects is difficult. In the anchor-free localization problem, the majority of solutions require specific conditions to be met in order to achieve high localization accuracy, and in a real environment it is difficult to maintain such conditions. In this paper, we propose the Distance Sequence for Mobile Localization (DSML) algorithm to estimate the positions of sensor nodes based on Received Signal Strength (RSS) measurements. We exploit the relation between measured distance sequences and the mobile beacon's route to assure the reliability of the localization. We observed that about 89% of the selected sensor nodes reduced their localization error by about 59% on average.
Keywords: Wireless sensor networks, mobile localization, received signal strength
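Estimating distance from RSS is commonly done by inverting the log-distance path-loss model, and ordering beacon readings by estimated distance gives the kind of distance sequence the abstract refers to. A sketch under assumed parameters (the reference power and path-loss exponent below are illustrative, not DSML's calibration):

```python
def rss_to_distance(rss_dbm, ref_power_dbm=-40.0, path_loss_exp=2.0):
    """Invert the log-distance path-loss model to estimate range (metres)
    from RSS; ref_power_dbm is the assumed RSS at 1 m."""
    return 10 ** ((ref_power_dbm - rss_dbm) / (10.0 * path_loss_exp))

def distance_sequence(rss_readings):
    # order beacon readings from nearest to farthest estimated distance,
    # the DSML-style sequence compared against the mobile beacon's route
    return sorted(rss_readings, key=lambda r: rss_to_distance(r["rss"]))
```

Comparing this measured sequence against the sequence implied by the beacon's known route is what lets the algorithm judge whether a position estimate is reliable.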
962-968


17
Survey of Filtering System For OSN (Online Social Networks)
-Rashmi R. Atkare,Prof. P.D.Soni
Abstract
In recent years, Online Social Networks (OSNs) have become an important part of daily life. Users build explicit networks to represent their social relationships, and can upload and share information related to their personal lives; the potential privacy risks of such behaviour are often ignored. A fundamental issue in today's OSNs is giving users the ability to control the messages posted on their own private space, so that unwanted content is not displayed; today's OSNs provide very little support for preventing unwanted messages on user walls. For that purpose, we propose a new system allowing OSN users direct control over the messages posted on their walls. This is achieved through a flexible rule-based system that allows users to customize the filtering criteria applied to their walls, and a Machine Learning (ML) based soft classifier that automatically labels messages in support of content-based filtering. The system exploits the ML soft classifier to enforce customizable content-dependent filtering rules, and its flexibility in terms of filtering options is enhanced through the management of blacklists. The proposed system thus brings security to Online Social Networks.
Keywords: Online Social Networks, Machine Learning, Filtering Rules, Content-based filtering, Filtering system.
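The pipeline the survey outlines (blacklist check first, then ML-produced label scores tested against the wall owner's filtering rules) can be sketched as below. The message fields, rule format and the `classify` callable standing in for the ML soft classifier are all illustrative assumptions.

```python
def should_display(message, wall_owner_rules, blacklist, classify):
    """Decide whether a wall message is displayed: blacklist first, then
    the owner's content-based filtering rules over soft-classifier scores."""
    if message["author"] in blacklist:
        return False                          # blacklisted authors are blocked
    scores = classify(message["text"])        # e.g. {"vulgar": 0.9, ...}
    for rule in wall_owner_rules:
        # a rule fires when the score for its label crosses the threshold
        if scores.get(rule["label"], 0.0) >= rule["threshold"]:
            return False
    return True
```

Per-user thresholds are what make the rules customizable: a stricter owner simply lowers the threshold for a label.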
969-972


18
Privacy-Preserving Data Analysis Techniques by using different modules
-Payal P. Wasankar,Prof. Arvind S. Kapse
Abstract
The competing parties who have private data may collaboratively conduct privacy preserving distributed data analysis (PPDA) tasks to learn beneficial data models or analysis results. For example, different credit card companies may try to build better models for credit card fraud detection through PPDA tasks. Similarly, competing companies in the same industry may try to combine their sales data to build models that may predict the future sales. In many of these cases, the competing parties have different incentives. Although certain PPDA techniques guarantee that nothing other than the final analysis result is revealed, it is impossible to verify whether or not participating parties are truthful about their private input data.
Keywords- Privacy, security, Secure multi-party computation, Non-cooperative computation.
973-975


19
Data Encapsulation To Prevent Jamming Attacks In Wireless Networks
-Israa Tawfik Aziz,S.K. Yadav
Abstract
The open nature of the wireless medium leaves it vulnerable to intentional interference attacks, typically referred to as jamming. This intentional interference with wireless transmissions can be used as a launch pad for mounting Denial-of-Service attacks on wireless networks. Typically, jamming has been addressed under an external threat model. However, adversaries with internal knowledge of protocol specifications and network secrets can launch low-effort jamming attacks that are difficult to detect and counter. In this work, we address the problem of selective jamming attacks in wireless networks. In these attacks, the adversary is active only for a short period of time, selectively targeting messages of high importance. We illustrate the advantages of selective jamming in terms of network performance degradation and adversary effort by presenting two case studies: a selective attack on TCP and one on routing. We show that selective jamming attacks can be launched by performing real-time packet classification at the physical layer, and we examine schemes to mitigate these attacks.
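The core observation, that selective jamming requires real-time packet classification, can be illustrated with a toy model: once the header fields a jammer matches on are encapsulated and encrypted, selective targeting collapses into all-or-nothing jamming. The byte-prefix "classifier" and the protocol-type byte below are purely illustrative, not a real packet format.

```python
def classify_and_jam(packet_bytes, is_encrypted, target_prefix=b"\x06"):
    """Toy selective jammer: jam only packets whose plaintext header starts
    with a high-value type byte (assumed format). Once the header is
    encapsulated/encrypted, the prefix test reveals nothing, so the
    jammer cannot selectively target the packet."""
    if is_encrypted:
        return False        # cannot classify, hence cannot jam selectively
    return packet_bytes.startswith(target_prefix)
```

This is why header/payload encapsulation raises the adversary's cost: without classification, jamming every packet is far more detectable and energy-expensive than hitting a few high-value ones.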
976-985


20
A Review on an Anonymization Approach to Preserve Privacy of Published Data through Record Elimination
-Isha K. Gayki,Prof.Arvind S.Kapse
Abstract
Data mining is the process of analyzing data, and data privacy concerns the collection and dissemination of data. Privacy issues arise in different areas such as health care, intellectual property, biological data, and financial transactions. It is very difficult to protect data when it is transferred, and sensitive information must be protected. There are two major kinds of attacks against privacy, namely record linkage and attribute linkage attacks. Researchers have proposed methods such as k-anonymity, ℓ-diversity and t-closeness for data privacy. The k-anonymity method preserves privacy against record linkage attacks alone; it is unable to prevent attribute linkage attacks. The ℓ-diversity method overcomes this drawback of k-anonymity, but fails to prevent identity disclosure and attribute disclosure attacks. The t-closeness method preserves privacy against attribute linkage attacks but not identity disclosure attacks. The proposed method preserves the privacy of individuals' sensitive data against both record and attribute linkage attacks; privacy preservation is achieved through generalization, by setting range values, and through record elimination.
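The two mechanisms named in the abstract, generalization to range values and the k-anonymity property itself, can be sketched as follows. The bin width and field names are illustrative assumptions, not the reviewed paper's algorithm.

```python
from collections import Counter

def generalize_age(age, bin_width=10):
    """Replace an exact age with a range value (one form of the
    generalization step; the 10-year bin width is an assumption)."""
    low = (age // bin_width) * bin_width
    return f"{low}-{low + bin_width - 1}"

def is_k_anonymous(records, quasi_identifiers, k):
    """Check that every quasi-identifier combination occurs at least k times,
    i.e. each record hides in a group of at least k look-alikes."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())
```

Record elimination then handles the leftover groups: any quasi-identifier group smaller than k that generalization cannot merge is simply dropped before publication.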
986-989

IJCTA © Copyright 2010 | All Rights Reserved.

This work is licensed under a Creative Commons Attribution 2.5 India License.