IJCTA-Volume 2 Issue 5/ September-October 2011
S.No
Title/Author Name
Page No
1
A Review of Retinal Vessel Segmentation Techniques and Algorithms
-Mohd. Imran Khan, Heena Shaikh, Anwar Mohd. Mansuri,Pradhumn Soni
Abstract
Retinal vessel segmentation algorithms are critical components of circulatory blood vessel analysis systems. We present a survey of vessel segmentation techniques and algorithms, and put the various approaches in perspective by means of a classification of the existing research. While we mainly target the segmentation of blood vessels, we focus on the neurovascular structure in particular. We have divided vessel segmentation algorithms and techniques into six main categories: (1) parallel multiscale feature extraction and region growing, (2) hybrid filtering, (3) ridge-based vessel segmentation, (4) artificial intelligence-based approaches, (5) neural network-based approaches, and (6) miscellaneous tube-like object detection approaches. Some of these categories are further divided into subcategories.
Keywords: Vessel segmentation, retinal image, Parallel Multiscale Feature Extraction
1140-1144


2
Multiple Sequence Alignment using Boolean Algebra and Fuzzy Logic:A Comparative Study
-Nivit Gill, Shailendra Singh
Abstract
Multiple sequence alignment is the most fundamental and essential task of computational biology, and forms the base for other tasks of bioinformatics. In this paper, two different approaches to sequence alignment are discussed and compared. The first method employs Boolean algebra, a two-valued logic, whereas the second is based on fuzzy logic, a multi-valued logic. Both methods perform sequence matching by direct comparison using the operations of Boolean algebra and fuzzy logic, respectively. To ensure optimal alignment, dynamic programming is employed to align multiple sequences progressively. Both methods are implemented and tested on various sets of real genome sequences taken from the NCBI databank. The processing times of both methods on these data sets have been computed and compared.
Keywords: Bioinformatics, multiple sequence alignment, Boolean algebra, fuzzy logic
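The progressive dynamic-programming alignment this abstract relies on can be sketched for the pairwise case. The snippet below is an illustrative Needleman-Wunsch-style sketch, not the authors' Boolean or fuzzy implementation; the 0/1 match score merely mimics a two-valued (Boolean) comparison, and the gap penalty of -1 is an assumed parameter.

```python
# Minimal global alignment by dynamic programming (Needleman-Wunsch style).
# Match score mimics a Boolean comparison: 1 if symbols are equal, 0 otherwise.

def align(a, b, gap=-1):
    n, m = len(a), len(b)
    # dp[i][j] = best score aligning a[:i] with b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = 1 if a[i - 1] == b[j - 1] else 0
            dp[i][j] = max(dp[i - 1][j - 1] + match,   # align both symbols
                           dp[i - 1][j] + gap,          # gap in b
                           dp[i][j - 1] + gap)          # gap in a
    return dp[n][m]

score = align("GATTACA", "GATACA")   # 6 matches, 1 gap -> 5
```

Aligning multiple sequences progressively then amounts to repeating such pairwise steps against a growing consensus.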
1145-1152



3
Spatial Intrusion of Mining
-SarathChand P.V,Bhukya Shankar Nayak,Rambabu pemula,Nagamani K,Bhukya Ravindranaik
Abstract
The spatial intrusion of mining concerns the space or location component of data: mining can be viewed as operating on objects that are located in physical space. Location may be represented explicitly, by attributes such as latitude and longitude, or more implicitly, by partitioning the database based on location. In spatial databases and warehouses, data is accessed using queries containing spatial predicates such as direction, adjacency, and containment. Generally, a spatial database stores both spatial data and non-spatial data about the objects, and warehouses and databases store this data in spatial data structures that capture topological information. Spatial intrusion deals with both spatial and non-spatial attributes; it mainly concentrates on location-type attributes, which must be included and must identify a precise point or a logical address such as a street number or zip code. Since objects are located at different locations, some sort of translation between one attribute and the other is needed to perform spatial operations between the objects. This paper describes an algorithm for the retrieval of objects where non-spatial objects are stored in relational databases and spatial objects are stored in spatial data structures. In general, a tuple represents a spatial object, and a link from the tuple represents the corresponding position in the non-spatial tuple.
Keywords: Intrusion, partitions, topological forms, spatial objects, query forms, tuple relations
1153-1159



4
On Union and Intersection of Fuzzy Soft Set
-Tridiv Jyoti Neog, Dusmanta Kumar Sut
Abstract
Molodtsov introduced the theory of soft sets, which can be seen as a new mathematical approach to vagueness. Maji et al. further initiated several basic notions of soft set theory. They also introduced the concept of the fuzzy soft set, a more generalized concept, which is a combination of a fuzzy set and a soft set, and established some properties regarding fuzzy soft union, intersection, the complement of a fuzzy soft set, De Morgan's laws, etc. These results were further revised and improved by Ahmad and Kharal, who defined arbitrary fuzzy soft union and intersection and proved De Morgan inclusions and De Morgan laws in fuzzy soft set theory. In this paper, we give some propositions on fuzzy soft union and intersection with proofs and examples. Using the definition of arbitrary fuzzy soft union and intersection proposed by Ahmad and Kharal, we give two more propositions with proofs and examples. We further prove De Morgan's laws for a family of fuzzy soft sets in a fuzzy soft class, as proposed by Ahmad and Kharal, and verify these laws with examples.
Key words: Soft Set, Fuzzy Soft Set, Fuzzy Soft Class.
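The fuzzy soft operations discussed above are easy to state concretely. The sketch below is a minimal illustration, assuming a fixed universe and a common parameter set for both fuzzy soft sets (the general definitions allow different parameter sets); it checks one De Morgan law numerically.

```python
# A fuzzy soft set over universe U maps each parameter to a fuzzy set on U,
# represented here as: parameter -> {element: membership in [0, 1]}.

U = ["x1", "x2", "x3"]
PARAMS = ["e1", "e2"]

def union(F, G):
    # fuzzy soft union: pointwise max of memberships
    return {e: {x: max(F[e][x], G[e][x]) for x in U} for e in PARAMS}

def intersection(F, G):
    # fuzzy soft intersection: pointwise min of memberships
    return {e: {x: min(F[e][x], G[e][x]) for x in U} for e in PARAMS}

def complement(F):
    # standard fuzzy complement: 1 - membership (rounded to dodge float noise)
    return {e: {x: round(1 - F[e][x], 10) for x in U} for e in PARAMS}

F = {"e1": {"x1": 0.2, "x2": 0.7, "x3": 1.0},
     "e2": {"x1": 0.5, "x2": 0.1, "x3": 0.4}}
G = {"e1": {"x1": 0.6, "x2": 0.3, "x3": 0.8},
     "e2": {"x1": 0.9, "x2": 0.2, "x3": 0.0}}

# De Morgan law: (F union G)^c == F^c intersection G^c
demorgan_holds = complement(union(F, G)) == intersection(complement(F), complement(G))
```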
1160-1176


5
Spatial-Temporal Analysis of Residential Burglary Repeat Victimization: Case Study of Chennai City Promoters Apartments, INDIA
-M. Vijaya Kumar, Dr. C. Chandrasekar
Abstract
One of the most important roles of government is to protect its citizens from crime and unsafe situations. The use of Geographic Information Systems (GIS) to understand spatial and temporal patterns of crime offences has become more prevalent in recent years; GIS helps to optimize effectiveness in the reduction of crime and to increase the safety of residents. An important capability offered through GIS is the identification of hot spots, or locations with high crime rates. Identifying hot spots in time can be equally important: it helps in better understanding crime patterns in order to create a crime reduction plan, and it allows for the strategic deployment of resources at the times and places where they can make the greatest difference.
Spatial-temporal information analysis plays a central role in many security-related applications. This study was carried out to investigate and evaluate the effectiveness of associating spatial and temporal factors for repeated events in residential housebreaking, and to demonstrate that spatial statistics can be a viable analysis alternative in security informatics. This paper uses Chennai City Promoters Apartments in India as a case study.
Key Words: Crime hot spots, repeat victimisation, Geographic Information System (GIS), Spatial-Temporal Analysis of Crime, Residential Burglary.
1177-1191



6
A Least Square Approach to Analyze Usage Data for Effective Web Personalization
-S. S. Patil
Abstract
Web server logs contain abundant information about the nature of the users accessing a site. Web usage mining, in conjunction with standard approaches to personalization, helps to address some of the shortcomings of these techniques, including reliance on subjective user ratings, lack of scalability, poor performance, and sparse data. However, it is not sufficient to discover patterns from usage data to perform personalization tasks; it is necessary to derive good-quality aggregate usage profiles, which in turn help to devise efficient recommendations for web personalization [11, 12, 13].
This paper presents and experimentally evaluates a technique for finely tuning user clusters, based on similar web access patterns in their usage profiles, by approximation through a least squares approach. Each cluster contains users with similar browsing patterns. These clusters are useful in web personalization, enabling a site to communicate better with its users. Experimental results indicate that using the generated aggregate usage profiles, with clusters approximated through the least squares approach, effectively personalizes at early stages of user visits to a site, without deeper knowledge about the users.
Index Terms—Aggregate Usage Profile, Least Square Approach, Web Personalization, Recommendation Systems, Expectation Maximization
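As a rough sketch of the clustering step described above: the centroid of a cluster is exactly the least squares minimizer of within-cluster squared distances, so each aggregate usage profile can be taken as its cluster's centroid. The snippet below is an illustrative toy (deterministic seeding, binary session vectors, an assumed 0.5 recommendation threshold), not the paper's implementation.

```python
# Toy usage-profile clustering: sessions are binary page-visit vectors,
# clusters are formed k-means style, and the aggregate usage profile of a
# cluster is its centroid (the least squares fit to its members).

def centroid(rows):
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def nearest(v, cents):
    # index of the centroid with smallest squared Euclidean distance
    d = [sum((a - b) ** 2 for a, b in zip(v, c)) for c in cents]
    return d.index(min(d))

def kmeans(rows, k, iters=10):
    cents = rows[:k]                          # simple deterministic seeding
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in rows:
            groups[nearest(v, cents)].append(v)
        cents = [centroid(g) if g else c for g, c in zip(groups, cents)]
    return cents

# sessions: 1 = page visited (pages A..D)
sessions = [[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1], [0, 1, 1, 1]]
profiles = kmeans(sessions, k=2)

# recommend pages whose aggregate weight clears an assumed threshold
recommend = [i for i, w in enumerate(profiles[0]) if w >= 0.5]
```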
1192-1196



7
Data Clustering Method for Very Large Databases Using an Entropy-Based Algorithm
-S.Karunakar, K.Rajesh, Ashraf Ali, K.Nageswara Rao, CH.Srinivas Rao
Abstract
Finding useful patterns in large datasets has attracted considerable interest recently, and one of the most widely studied problems in this area is the identification of clusters, or densely populated regions, in a multi-dimensional dataset. Prior work does not adequately address the problem of large datasets and the minimization of I/O costs. Clustering of categorical attributes is a difficult problem that has not received as much attention as its numerical counterpart. In this paper we explore the connection between clustering and entropy: clusters of similar points have lower entropy than those of dissimilar ones. We use this connection to design a heuristic algorithm which is capable of efficiently clustering large data sets of records with categorical attributes. In contrast with other categorical clustering algorithms published in the past, the clustering results are very stable for different sample sizes and parameter settings. Also, the criterion for clustering is very intuitive, since it is deeply rooted in the well-known notion of entropy.
Keywords – Data mining, categorical clustering, data labeling.
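The entropy criterion can be made concrete with a small sketch: each record is assigned to the cluster that keeps the expected entropy of the clustering lowest, in the spirit of greedy entropy-based categorical clustering. The seeding scheme and tiny dataset below are simplifying assumptions, not the paper's algorithm.

```python
from math import log2

# Greedy entropy-based categorical clustering sketch: a record joins the
# cluster whose addition yields the lowest expected entropy overall.

def entropy(cluster):
    # sum of attribute-wise entropies of the records in the cluster
    if not cluster:
        return 0.0
    total = 0.0
    for col in zip(*cluster):
        n = len(col)
        for v in set(col):
            p = col.count(v) / n
            total -= p * log2(p)
    return total

def expected_entropy(clusters):
    n_total = sum(len(c) for c in clusters)
    return sum(len(c) / n_total * entropy(c) for c in clusters)

def cluster(records, k):
    clusters = [[r] for r in records[:k]]     # seed with the first k records
    for r in records[k:]:
        costs = []
        for i in range(k):
            trial = [c + [r] if j == i else c for j, c in enumerate(clusters)]
            costs.append(expected_entropy(trial))
        clusters[costs.index(min(costs))].append(r)
    return clusters

data = [("red", "apple"), ("red", "cherry"), ("yellow", "banana"),
        ("yellow", "lemon"), ("red", "apple")]
groups = cluster(data, k=2)
```

Clusters of identical records have zero entropy, which is exactly the intuition the abstract appeals to.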
1197-1200



8

 

Automatic Identification Used in Audio-Visual Indexing and Analysis
-A. Satish Chowdary, N.Tirupathi, K. Nageswara Rao, K. Nagamani
Abstract
Locating a video clip in large collections is very important for retrieval applications, especially for digital rights management. We attempt to provide a comprehensive, high-level review of audiovisual features that can be extracted from standard compressed domains, such as MPEG-1 and MPEG-2. This paper presents a graph transformation and matching approach to identify occurrences of a clip that may differ in ordering or length due to content editing. With a novel batch query algorithm to retrieve similar frames, the mapping relationship between the query and database video is first represented by a bipartite graph. The densely matched parts along the long sequence are then extracted, followed by a filter-and-refine search strategy to prune irrelevant subsequences. During the filtering stage, Maximum Size Matching is deployed for each subgraph constructed from the query and a candidate subsequence to obtain a smaller set of candidates. During the refinement stage, Sub-Maximum Similarity Matching is devised to identify the subsequence with the highest aggregate score among all candidates, according to a robust video similarity model that incorporates visual content, temporal order, and frame alignment information. This algorithm is based on dynamic programming and fully uses the temporal dimension to measure the similarity between two video sequences. A normalized chromaticity histogram, which is illumination invariant, is used as a feature. Dynamic programming is applied at the shot level to find the optimal nonlinear mapping between video sequences. Two new normalized distance measures are presented for video sequence matching: one is based on the normalization of the optimal path found by dynamic programming, while the other combines both the visual features and the temporal information. The proposed distance measures are suitable for variable-length comparisons.
Keywords – Audio and Video, Multimedia, Speech Processing, Topic Segmentation, Topic Identification.
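The dynamic-programming similarity idea can be sketched as follows: frames are compared by histogram intersection of (assumed pre-computed, L1-normalized) chromaticity histograms, and DP finds the best order-preserving frame mapping. This is an assumption-level illustration, not the paper's exact similarity model.

```python
# DP video-sequence similarity sketch: frame similarity is histogram
# intersection; DP finds the best temporal alignment of the two sequences.

def hist_intersection(h1, h2):
    # similarity in [0, 1] for L1-normalized histograms
    return sum(min(a, b) for a, b in zip(h1, h2))

def sequence_similarity(seq_a, seq_b):
    n, m = len(seq_a), len(seq_b)
    dp = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sim = hist_intersection(seq_a[i - 1], seq_b[j - 1])
            dp[i][j] = max(dp[i - 1][j - 1] + sim,   # match frames i and j
                           dp[i - 1][j],              # skip a frame in A
                           dp[i][j - 1])              # skip a frame in B
    # normalize by the shorter sequence so the score lies in [0, 1]
    return dp[n][m] / min(n, m)

a = [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5]]
b = [[0.5, 0.5, 0.0], [0.2, 0.4, 0.4], [0.0, 0.5, 0.5]]
score = sequence_similarity(a, b)   # b contains a with one extra frame
```

The skip transitions are what make the measure tolerant of edits that change length or insert frames, matching the variable-length comparisons mentioned above.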
1201-1205


9
Test Case Effectiveness of Higher Order Mutation Testing
-Shalini Kapoor
Abstract
Effectiveness measures how good a test case is at finding faults. Traditional mutation testing considers First Order Mutants (FOMs), created by the injection of a single fault. We focus on Higher Order Mutants (HOMs), which contain more than one fault, and in particular on subsuming HOMs. We report in this paper that a strongly subsuming HOM is more effective, as it kills all the FOMs from which it is constructed, thereby reducing testing effort without loss of effectiveness.
Keywords—Mutation Testing; First order mutants; Higher order mutants
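The FOM/HOM kill-set bookkeeping can be illustrated on a toy program. The grade function, the two relational-operator faults, and the test inputs below are hypothetical examples chosen for illustration; a mutant is "killed" by a test input on which its output differs from the original's.

```python
# Toy FOM/HOM illustration: two single-fault mutants and their combination.

def grade(score):                 # original program
    if score >= 90: return "A"
    if score >= 80: return "B"
    return "C"

def fom1(score):                  # FOM: first '>=' mutated to '>'
    if score > 90: return "A"
    if score >= 80: return "B"
    return "C"

def fom2(score):                  # FOM: second '>=' mutated to '>'
    if score >= 90: return "A"
    if score > 80: return "B"
    return "C"

def hom(score):                   # HOM: both faults combined
    if score > 90: return "A"
    if score > 80: return "B"
    return "C"

tests = [95, 90, 85, 80, 75]

def kills(mutant):
    # the set of test inputs that kill this mutant
    return {t for t in tests if mutant(t) != grade(t)}
```

Here `kills(hom)` is the union of the two FOM kill sets; a strongly subsuming HOM is the stricter case where every test that kills the HOM kills all of its constituent FOMs, which is what allows testing effort to shrink without losing effectiveness.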
1206-1211


10
Data Access by Data Intrusion
-SarathChand P.V,VenuMadhav K,Mahendra Arya Bhanu,Rajalakshmi Selvaraj,Laxmaiah.M
Abstract
Mining is a key technique for databases, as it requires several elements and methods for retrieval operations. These techniques and their corresponding technologies play an important role in databases where data access is required in large amounts. The primary method is to retrieve the relevant data, and the secondary is retrieval of the matched data. Generally, a data warehouse is much more than an information technology project for companies embracing the concept of mass customization. A data warehouse may be filled with quality information, but one has to use mass customization techniques and scientific testing methods to expand the customer base, for example from one million to ten million customers over a ten-year period. Mass customization is the ultimate use of data warehousing and mining techniques, making them an integral part of the business process. Just as arms gain strength with exercise, a data warehouse and data mining grow stronger with active use: with every new test and new product, valuable information is added, and at the same time information is retrieved and modified, which allows analysts to learn from the successes and failures of the past. This paper is mainly concerned with retrieval operations that exploit these failures and successes in the form of data intrusion. Data intrusion helps in recognizing data in mass storage so that the relevant data can be retrieved. It is a simple technique that can cope with the business market.
Keywords: intrusion, operational systems, legacy systems, homogeneous records, forthcoming
1212-1217



11
A Client Level Tool for Concluding Router Packet Forwarding Priority
-T.Chandrasekhar, Pallamreddy VenkataSubbaReddy, Sk.Faiz Ahamed
Abstract
Packet forwarding prioritization (PFP) in routers is one of the mechanisms commonly available to network operators. PFP can have a significant impact on the accuracy of network measurements, the performance of applications, and the effectiveness of network troubleshooting procedures. Despite its potential impact, no information on PFP settings is readily available to end users. In this paper, we present an end-to-end approach for PFP inference and its associated tool, POPI. This is the first attempt to infer router packet forwarding priority through end-to-end measurement. POPI enables users to discover such network policies through measurements of packet losses of different packet types. We evaluated our approach via statistical analysis, simulation, and wide-area experimentation in PlanetLab. We employed POPI to analyze 156 paths among 162 PlanetLab sites. POPI flagged 15 paths with multiple priorities, 13 of which were further validated through hop-by-hop loss rate measurements. In addition, we surveyed all related network operators and received responses for about half of them, all confirming our inferences. We also compared POPI with inference mechanisms based on other metrics, such as packet reordering (out-of-order, OOO). OOO is unable to find many priority paths, such as those implemented via traffic policing. On the other hand, interestingly, we found that it can detect mechanisms that induce delay differences among packet types, such as a slow processing path in the router or port-based load sharing.
Keywords—Network inference, network neutrality, packet forwarding priority
1218-1226


12
Dynamic telecast Routing with Security Intensification
-BeerthiSahadev,Panuganti.Srinivas,Ganta.Raju
Abstract
Security has become one of the major issues for data communication over wired and wireless networks. Departing from past work on the design of network security algorithms and system infrastructures, we propose a dynamic broadcast routing algorithm that randomizes delivery paths for data transmission. The algorithm is easy to implement and compatible with popular routing protocols, such as the Routing Information Protocol in wired networks and the Destination-Sequenced Distance Vector protocol in wireless networks, without introducing extra control messages. Simulation results verify the proposed algorithm and demonstrate its capability.
Keywords: Security intensification of data transmission, dynamic telecast routing, RIP, DSDV
1227-1234


13
Studies on Fuzzy Logic and Dispositions for Medical Diagnosis
-Prof. Sripati Mukhopadhyay,Jyotirmoy Ghosh
Abstract
For designing and developing a knowledge-based system, we need to store expert knowledge in a suitable form, known as a knowledge base, and then apply a suitable reasoning process to arrive at a decision. Formal two-valued logic, known as predicate logic, is suitable for developing and inferring in systems like mechanical theorem proving. But if we want to develop a serious real-life knowledge-based system, such as medical diagnosis, formal logic fails to describe the knowledge base; fuzzy logic and its extension, known as dispositions as proposed by Zadeh, come to the rescue and ultimately enable us to use linguistic variables. In this article an attempt has been made to show how fuzzy logic and its extension, particularly dispositions, can be used for modeling a medical diagnosis system.
Key-words: Formal Grammar, Fuzzy logic, Dispositions, Medical Diagnosis.
1235-1240


14
Framework for Suggesting Popular Items to Users by Analyzing Randomized Algorithms
-Y.Maanasa,V.Kumar,P.Satish Babu
Abstract
Nowadays, interactive computational systems help people leverage social information; technically, these systems are called social navigation systems. They help individuals in guiding behavior and making decisions when selecting data. Based on individual feedback, popular items are ranked and suggested. Individual feedback can be obtained by displaying a group of suggested items, where the selection of items is based on the preference of the individual or drawn from the suggested items. The objective is to suggest truly popular items by quickly learning the true popularity ranking of items. The complexity in suggesting items to users can emphasize the reputation of some items but may distort the resulting ranking of other items, so the problem of ranking and suggesting items affects many applications, including tag suggestions and search query suggestions for social tagging systems. In this paper we propose and study algorithms, namely the naive, PROP, M2S, and FM2S algorithms, for ranking and suggesting popular items.
Keywords: Item sets, tagging, suggestion, search query, ranking rules
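A PROP-style rule can be sketched as a small simulation: suggestions are sampled with probability proportional to current selection counts, while the naive rule always shows the current top-c items. The item set, true popularity values, and count smoothing below are illustrative assumptions, not the paper's experimental setup.

```python
import random

# Toy simulation contrasting a naive top-c suggestion rule with a PROP-like
# rule. Users pick their truly preferred item when it appears in the
# suggestion list; selections feed back into the popularity counts.

random.seed(7)
ITEMS = ["a", "b", "c", "d"]
TRUE_POP = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}   # assumed preferences

def simulate(rule, rounds=2000, c=2):
    counts = {i: 1 for i in ITEMS}                    # smoothed counts
    for _ in range(rounds):
        if rule == "naive":
            shown = sorted(ITEMS, key=counts.get, reverse=True)[:c]
        else:  # PROP: sample suggestions proportionally to counts
            shown = random.choices(ITEMS,
                                   weights=[counts[i] for i in ITEMS], k=c)
        pick = random.choices(ITEMS,
                              weights=[TRUE_POP[i] for i in ITEMS])[0]
        if pick in shown:
            counts[pick] += 1
    return sorted(ITEMS, key=counts.get, reverse=True)

prop_ranking = simulate("prop")
```

With enough rounds the PROP-style counts tend to track the true popularity, whereas the naive rule risks locking in whichever items lead early, which is the distortion the abstract refers to.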
1241-1246



15
Study the Effect of Variable Viscosity and Thermal Conductivity of Micropolar Fluid in a Porous Channel
-Gitima Patowary, Dusmanta Kumar Sut
Abstract
A numerical model has been developed to study the effect of thermal radiation on unsteady boundary layer flow with variable viscosity and thermal conductivity due to a stretching sheet in a porous channel in the presence of a magnetic field. The Rosseland diffusion approximation is used to describe the radiative heat flux in the energy equation. The governing equations are reduced to similarity boundary layer equations using suitable transformations and then solved using the shooting method. A parametric study is presented, illustrating the influence of the radiation parameter R, variable viscosity parameter ε, Darcy number Da, porous media inertia coefficient γ, thermal conductivity κ, unsteadiness parameter A, and magnetic field parameter M on the skin friction and Nusselt number.
Key words: unsteady flow, radiation, stretching sheet, variable viscosity, variable thermal conductivity, porous channel.
1247-1255



16
Hardware Enhancement Association Rule with Privacy Preservation
-Phani Ratna Sri Redipalli,G.Srinivasa Rao
Abstract
In recent times, data mining techniques have been widely used in various applications. One of the most important tasks in data mining is association rule mining. For hardware implementation of Apriori-based association rule mining, we have to load candidate itemsets and a database into the hardware. Since the hardware architecture capacity is fixed, when the number of items or the number of candidate itemsets is larger than the hardware capacity, the items are loaded into the hardware in batches. This increases the time complexity of the steps that load candidate itemsets or database items into the hardware, which is proportional to the number of candidate itemsets multiplied by the number of items in the database. With many candidate itemsets and a large database, this growing time complexity becomes the performance bottleneck. In this paper, we propose a HAsh-based and PiPelIned (abbreviated as HAPPI) architecture to enhance the implementation of association rule mining in hardware. With it, we can effectively decrease the frequency of loading the database into the hardware; HAPPI solves the bottleneck problem in Apriori-based hardware schemes. Along with this hashing, we include privacy preservation for the sensitive data processed during mining, a problem commonly faced by all data mining techniques.
Keywords:Apriori-based, Privacy, HAPPI, Items, Trimming, Information
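The hash-based trimming idea behind hash-assisted Apriori schemes can be sketched in software (HAPPI itself is a hardware architecture, so this is only an analogy): pair occurrences are hashed into buckets in the first pass, and only pairs that fall into frequent buckets are counted as candidates in the second pass. The transactions and support threshold below are illustrative.

```python
from collections import Counter
from itertools import combinations

# PCY-style hash-based candidate filtering for frequent pairs.

NUM_BUCKETS = 7
MIN_SUPPORT = 2

def bucket(pair):
    return hash(pair) % NUM_BUCKETS

transactions = [("a", "b", "c"), ("a", "b"), ("a", "c"),
                ("b", "c"), ("a", "b", "c")]

# pass 1: item counts and hashed pair-bucket counts
item_counts = Counter()
bucket_counts = Counter()
for t in transactions:
    item_counts.update(t)
    for pair in combinations(sorted(t), 2):
        bucket_counts[bucket(pair)] += 1

frequent_items = {i for i, c in item_counts.items() if c >= MIN_SUPPORT}

# pass 2: count only pairs that survive both the item and bucket filters
pair_counts = Counter()
for t in transactions:
    for pair in combinations(sorted(set(t) & frequent_items), 2):
        if bucket_counts[bucket(pair)] >= MIN_SUPPORT:
            pair_counts[pair] += 1

frequent_pairs = {p for p, c in pair_counts.items() if c >= MIN_SUPPORT}
```

The bucket filter cheaply discards many infrequent pairs before the expensive counting pass, which mirrors how hash-based trimming reduces the candidate load described above.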
1256-1262


17
Performance and Analysis of Directional Edge Detectors on 3-Planar Images Corrupted with Image Noise
-Shilpa Narula, Ashish Oberoi, Sumit Kaushik,Dr.D.S.Rao
Abstract
Edge detection is a research field within image processing and computer vision, in particular within the area of feature extraction. It is extensively used in image segmentation when we want to divide an image into areas corresponding to different objects. Representing an image by its edges has the further advantage that the amount of data is reduced significantly while most of the image information is retained. In this paper, edge detection by applying various 1-directional as well as 8-directional masks (edge detection operators) to images corrupted with different levels of impulsive noise is presented. Further, the 1-directional operators Kirsch, Prewitt, Sobel, and Robinson are applied to RGB (3-planar) images. Subjective and objective methods are used to evaluate the different edge operators. Results show that 8-directional operators perform better than 1-directional operators in the presence of impulsive noise, which implies that as the number of orientations increases we get better results and the effect of noise decreases.
Keywords: Edge detection, image processing, impulsive noise, 3-planar image
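An 8-directional operator of the kind evaluated above can be sketched with the Kirsch compass masks: the north mask is rotated through eight orientations and the per-pixel edge magnitude is the maximum response. This is a generic illustration of the technique, not the paper's experimental code.

```python
# 8-directional edge detection with Kirsch compass masks: convolve with all
# eight rotations and keep the maximum response per pixel.

KIRSCH_N = [[5, 5, 5],
            [-3, 0, -3],
            [-3, -3, -3]]

def rotate45(m):
    # one 45-degree compass rotation: cyclic shift of the 3x3 outer ring
    ring = [m[0][0], m[0][1], m[0][2], m[1][2],
            m[2][2], m[2][1], m[2][0], m[1][0]]
    ring = ring[-1:] + ring[:-1]
    return [[ring[0], ring[1], ring[2]],
            [ring[7], m[1][1], ring[3]],
            [ring[6], ring[5], ring[4]]]

MASKS = [KIRSCH_N]
for _ in range(7):
    MASKS.append(rotate45(MASKS[-1]))

def edge_magnitude(img, y, x):
    # maximum response over the 8 directional masks at an interior pixel
    best = 0
    for mask in MASKS:
        r = sum(mask[i][j] * img[y - 1 + i][x - 1 + j]
                for i in range(3) for j in range(3))
        best = max(best, r)
    return best

# vertical step edge: dark left half, bright right half
img = [[0, 0, 255, 255]] * 4
mag = edge_magnitude(img, 1, 1)
```

On a grayscale plane the step edge above fires hardest on the east-pointing mask; for RGB (3-planar) images the same operator is applied per plane and the responses combined.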
1263-1268



18
Application of MPI for Efficient Detection and Extraction of Features in Video Surveillance
-Debmalya Sinha, Gautam Sanyal
Abstract
The recent past has witnessed rapid growth in the usage of CCTV and video cameras to beef up security in almost all aspects of life. This has resulted in tremendous growth of video content and its processing time. One of the challenges faced in this area is fast and efficient processing of this huge volume of video images. Therefore, integrating the principles of parallelism with video image processing techniques has almost become mandatory for extracting the desired information and achieving better performance. In this paper, a parallel algorithm is proposed for the extraction of person features from video frames that can execute on a cluster of workstations. Person behaviors are classified with a minimum distance classifier applied to the extracted feature vector. The algorithm has been implemented using MPI to estimate its time complexity and efficiency.
Keywords: MPI, Feature extraction, video processing, classification and repeated position.
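The minimum distance classifier mentioned above assigns a feature vector to the class with the nearest prototype. The behavior labels and feature values below are hypothetical placeholders for illustration; in the paper's setting the vectors would come from the parallel feature-extraction stage.

```python
# Minimal minimum-distance classifier: each behavior class is a prototype
# vector; a sample gets the label of the nearest prototype (Euclidean).

def distance2(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

def classify(sample, prototypes):
    # prototypes: {class_label: prototype_vector}
    return min(prototypes, key=lambda lbl: distance2(sample, prototypes[lbl]))

prototypes = {
    "walking": [1.0, 0.2],    # hypothetical (speed, posture-change) features
    "running": [3.0, 0.3],
    "standing": [0.0, 0.0],
}
label = classify([2.6, 0.25], prototypes)
```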
1269-1274



19
High Density Four Transistor SRAM Cell with low Power Consumption
-Sushil Bhushan, Shishir Rastogi, Mayank Shastri, Assoc. Prof. Shyam Akashe, Dr. Sanjay Sharma
Abstract
This paper presents a CMOS four-transistor SRAM cell for very high density and low power embedded SRAM applications as well as for stand-alone SRAM applications. The new cell size is 35.45% smaller than a conventional six-transistor cell using the same design rules. The proposed cell uses two word-lines and one pair of bit-lines. Read operations are performed from one side of the cell and write operations from the other side, and the swing voltage on the word-lines is reduced, so power during read/write operations is reduced. Cadence Virtuoso simulation in standard 45 nm CMOS technology confirms all results reported in this paper.
Keywords: SRAM, read operation, write operation, power consumption.
1275-1282



20
Privacy Enhancing and Breach Reduction
-Sushma rani.N,Thulasi prasad,Prasad kaviti
Abstract
Data integration in distributed data systems is introduced to solve the problems that the data model has, so that data integration in distributed systems can be supported effectively. Data conversion is still a challenge in distributed system integration. A community-based system is used for distributed data integration; it comprises three elements: community, data model, and communication protocol. The integration system solves the data heterogeneity problem in production management, letting users access data more transparently and conveniently. The construction of the central database is a comprehensive, large-scale engineering effort that directly serves the demands of various application subsystem developments.
1283-1289


IJCTA © Copyright 2010 | All Rights Reserved.

This work is licensed under a Creative Commons Attribution 2.5 India License.