IJCTA-Volume 5 Issue 5/ September-October 2014
An Effective Personalized E-Learning System Using Agent Technology in Semantic Web
As the amount of accessible information today is overwhelming, intelligent and personalized filtering of the available information is a central challenge. A personal information agent is needed that delivers the right information at the right time by accessing, filtering, and presenting information in a situation-aware manner. Applying agent technology is promising because the inherent capabilities of agents, such as autonomy and reactiveness, offer an adequate approach. We developed an agent-based personal information system for collecting, filtering, and integrating information from a specific domain.
Personalization of web search means carrying out retrieval for each user according to his or her interests. User profiles are used to improve retrieval effectiveness in web search. A user profile is learned from the user's search history and a domain hierarchy, which are combined to map a user query into a set of categories; these categories represent the user's search intention and serve as a context to disambiguate the words in the query. The system is designed on the basis of agent and semantic web technologies. We propose agent-based personalized semantic web information retrieval to cope with the existing challenges of information retrieval over the web.
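As an illustration of the query-to-category mapping this abstract describes, the sketch below scores profile categories against a query using cosine similarity over term frequencies. The profile contents, category names, and similarity measure are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter
import math

def cosine(a, b):
    # cosine similarity between two term-frequency vectors
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def categorize_query(query, profile):
    """Rank profile categories by similarity to the query terms."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(terms)), cat) for cat, terms in profile.items()]
    return [cat for score, cat in sorted(scored, reverse=True) if score > 0]

# Hypothetical user profile learned from a search history:
profile = {
    "programming": ["python", "java", "code", "compiler"],
    "biology": ["python", "snake", "species", "habitat"],
}
print(categorize_query("python compiler", profile))
```

The ambiguous term "python" alone matches both categories; the context term "compiler" ranks "programming" first, which is exactly the disambiguation role the abstract assigns to categories.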
An Approach to Detect Node Replication in Mobile Sensor Networks: A Survey
-P. Edith Linda, R. Sangeetha
In a mobile sensor network (MSN), the many unattended nodes make it easy for an adversary to attack and compromise sensor nodes and extract their private keys. In this paper we focus on the detection of replicated nodes in mobile sensor networks. Several algorithms have been developed to detect replica attacks in both static and mobile WSNs, and every technique has its own advantages and disadvantages. In recent years, detecting node replication has become an important task in the wireless sensor network area. In our survey, we analyze prior research and the contributions of the existing techniques: Random Key Pre-distribution, SET, Deterministic Multicast, Randomized Multicast, Line-Selected Multicast, Randomized Efficient and Distributed Mechanism, Localized Multicast, Sequential Probability Ratio Test, eXtremely Efficient Detection, and Efficient and Distributed Detection.
Keywords: mobile sensor network, clone attack, witness node
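The common idea behind the witness-based schemes this survey covers (e.g. Randomized Multicast) is that a node ID observed at two conflicting locations reveals a clone. The toy sketch below performs that conflict check in a centralized way; real schemes distribute the claims among randomly chosen witness nodes, and the IDs and coordinates here are made up.

```python
def detect_clones(claims):
    """Given (node_id, location) claims collected from the network,
    flag IDs that appear at more than one distinct location."""
    seen = {}
    clones = set()
    for node_id, loc in claims:
        if node_id in seen and seen[node_id] != loc:
            clones.add(node_id)  # same ID, conflicting location: replica
        seen.setdefault(node_id, loc)
    return clones

claims = [("n1", (2, 3)), ("n2", (5, 1)), ("n1", (9, 9))]
print(detect_clones(claims))  # {'n1'}
```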
Selection, Evaluation, Testing, Integration and Implementation of Commercial-Off-The-Shelf (COTS) Components in Software
-N. Gnanasankaran, K. Iyakutti, S. Natarajan, K. Alagarsamy
Commercial-Off-The-Shelf (COTS) technology is widely used in many software companies and in scientific computing. The definition of COTS is furnished, and its virtues and the skills needed in using COTS software are detailed. The methodology used for the selection and evaluation of COTS is given, and the testing of COTS components is also dealt with. The importance of integration and composition of the components is stressed, and the technique of two-layer wrapping for integration is touched upon. The methodology for a successful COTS implementation is given in detail; the "six-step methodology" of Minkiewicz is suggested for successful COTS implementation.
Keywords: COTS, Definition, Selection, Evaluation, Testing, Integration, Composition, Implementation.
The Characteristic Objects Method: a new approach to identify a multi-criteria group decision-making model
-Wojciech SAŁABUN
Although multi-criteria group decision-making methods have been commonly used in numerous decision problems, they have several significant drawbacks. One of the most significant and commonly observed shortcomings is the rank reversal phenomenon. This paper presents a new multi-criteria decision-making technique: the Characteristic Objects method. In this approach, the preference of each alternative is obtained on the basis of its distance from the nearest characteristic objects and their values; the method is therefore free of rank reversal. Subsequently, the paper extends the Characteristic Objects method to group decision-making. For this purpose, a simple example is used to illustrate how to identify the group model. This modification uses simple data fusion in which no data is rejected.
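The core of the method described here, preference derived from the nearest characteristic objects, can be sketched minimally as follows. This is a simplification: the full method interpolates between neighbouring characteristic objects, and the grid and preference values below are invented for illustration.

```python
import math

def preference(alternative, characteristic_objects):
    """Preference of an alternative = the preference value of its
    nearest characteristic object. Because the characteristic objects
    are fixed in advance, adding or removing alternatives cannot change
    any alternative's score, which is why rank reversal cannot occur."""
    nearest = min(characteristic_objects,
                  key=lambda co: math.dist(alternative, co[0]))
    return nearest[1]

# Hypothetical characteristic objects on a 2-criteria grid, with
# expert-assigned preference values:
cos_grid = [((0, 0), 0.0), ((0, 1), 0.4), ((1, 0), 0.6), ((1, 1), 1.0)]
print(preference((0.9, 0.8), cos_grid))  # 1.0
```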
A Comparison Between Two Types of Structured Overlay Networks
-Iyas ALODAT, Mohammad ALODAT
In large-scale distributed P2P research environments, algorithms are often studied using crawlers; the crawler experiment is built by visiting nodes and collecting the data obtained from other visited nodes. This paper compares two types of structured overlay network, Kademlia and Chord, with respect to their key-based routing methods, using the OverSim simulator.
Keywords—P2P network, overlay network.
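The key-based routing difference between the two overlays compared here comes down to their distance metrics, which can be shown in a few lines. This is a minimal sketch of the metrics only, not of the full lookup protocols; the 8-bit identifier space is an assumption for illustration.

```python
def kademlia_distance(a, b):
    """Kademlia: XOR metric. Symmetric, so d(a, b) == d(b, a)."""
    return a ^ b

def chord_distance(a, b, bits=8):
    """Chord: clockwise distance on the identifier ring.
    Asymmetric, so d(a, b) != d(b, a) in general."""
    return (b - a) % (2 ** bits)

a, b = 0b1010, 0b0110
print(kademlia_distance(a, b))  # 12
print(chord_distance(a, b))     # 252
print(chord_distance(b, a))     # 4
```

The symmetry of XOR is what lets Kademlia nodes learn useful routing entries from incoming queries, a property Chord's ring distance lacks.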
Various Techniques for Brain Tumor Identification and Segmentation Approach in MRI Images
Brain tumors are often detected in Magnetic Resonance (MR) images, which is difficult and time-consuming. In this paper, we propose a novel approach for automated brain tumor detection employing segmentation, feature extraction, and classification to distinguish the tumor region from the surrounding tissues in MRI images. Our proposed approach comprises four stages: preprocessing, segmentation, feature extraction, and classification. The application of our proposed approach to tumor detection is presented to enhance the accuracy and efficiency of clinical practice. The experimental results are numerically evaluated by a human expert in terms of the average overlap, average accuracy, and average retrieval of the results found using our proposed scheme. Accuracies of 99%, 98%, 97%, and 91% have been obtained by ANN, k-NN, and SVM.
Keywords: MRI images, brain tumor, segmentation, feature extraction, median filtering, classification, GLCM, HOG, SVM and ANN.
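Two of the pipeline stages named in this abstract, median-filter preprocessing and segmentation, can be sketched on a tiny synthetic image. This is a toy illustration only: the intensities are invented, the segmentation here is plain thresholding, and the real pipeline would continue with GLCM/HOG feature extraction and an ANN/k-NN/SVM classifier.

```python
def median_filter(img):
    """3x3 median filter: the denoising step of the preprocessing stage."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = sorted(img[i + di][j + dj]
                            for di in (-1, 0, 1) for dj in (-1, 0, 1))
            out[i][j] = window[4]  # median of the 9 neighbours
    return out

def threshold_segment(img, t):
    """Binary segmentation: 1 marks a candidate tumour pixel."""
    return [[1 if p >= t else 0 for p in row] for row in img]

# Tiny synthetic "scan" with a bright region in the middle:
scan = [[10, 11, 10, 12, 11],
        [11, 90, 92, 95, 10],
        [10, 93, 94, 91, 12],
        [12, 96, 90, 92, 11],
        [11, 10, 12, 11, 10]]
mask = threshold_segment(median_filter(scan), 50)
```

After filtering, the interior of the bright region survives the threshold while isolated noise would not, which is why denoising precedes segmentation.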
Netlist Level Hardware IP Protection by Obfuscation Technique using Key-FSM
-Pranjali Akkawar, P. V. Sriniwas Shastry
Nowadays, the complexity and size of electronic devices are increasing, which leads to design reuse. Re-use of a design is beneficial, but it increases the risk of security threats such as IP cloning, reverse engineering, and tampering. In this paper, an obfuscation method is proposed to overcome such security threats at the netlist level. The steps to implement this method are simple and common, but each design is uniquely obfuscated, which makes the method efficient in securing the IP core. The method is implemented on five ISCAS-89 benchmark circuits and its efficiency is calculated. IP security is achieved at the cost of approximately 10% area and power overhead per circuit; for small circuits with a gate count below 100, the overhead is larger.
A Review of Ad Hoc Wireless Networks
-Ashish Chaturvedi, Sashi Kant Gupta, Aamer Mohammad
This paper presents a coherent survey of ad hoc wireless networks, with the intent of serving as a quick reference to current research issues in ad hoc networking. The paper discusses a broad range of research issues, such as routing, medium access, multicasting, quality of service, TCP performance, energy, and security, outlining the major challenges which have to be solved before widespread deployment of the technology is possible. The survey shows that ad hoc networking presents an interesting research area inheriting the problems of wireless and mobile communications in their most difficult form.
Keywords: ad hoc network, routing, MAC, multicasting, quality of service, TCP, energy, security
Adaptive Boost less Magnitude based on Segmented Evaluation Resolution System
-H. Salomi Hemachitra, M. Sumathi, G. B. Govinda Prabhu
Image processing is a technique for analyzing and manipulating an image, performing operations on it in order to obtain a better image or to extract valuable information from it. Image resolution measures how clearly a sensor can observe the smallest object with distinct boundaries. It is generally expressed through two kinds of measure: spatial resolution and PPI (pixels per inch). Spatial resolution is a measure of how closely lines can be resolved in an image, while PPI measures the pixel density of a device in various contexts. Existing schemes have problems that give irrelevant results: when the image resolution is increased, the magnitude also increases, and when the resolution is reduced, the image magnitude decreases and the appearance loses its worth; there is no specific mechanism for resolving these issues. The proposed system introduces a new conversion method, the Segment Evaluated Resolution System (SERS). This technique segments the low-resolution image and performs renovation and pixelation of the image. Moreover, whereas older methods take several low-resolution images as input, our approach works in a one-to-one manner.
Keywords— Image Processing, Boost less magnitude system, Segmentation, Boost less Segmentation, Segmented Evaluation System, Adaptive boost less magnitude
A Brief Survey on Text Mining and its Applications
-Meenambigai Krishnamoorthy, Menaka Mani
Text mining, also known as knowledge discovery in text (KDT), is an emerging technology that focuses on discovering knowledge in unstructured text. Text mining matters because the amount of stored information is increasing day by day, so discovering knowledge in unstructured or semi-structured data is important. Several techniques and algorithms are therefore required for extracting useful information. In our work, we discuss the text mining approach, various techniques, and applications.
Keywords— KDD, Stemming, Semi structured data, Accuracy, Filtering.
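The basic text mining pipeline this survey refers to (tokenization, stopword removal, stemming, term counting) can be sketched briefly. The stopword list and the crude suffix-stripping stemmer below are illustrative stand-ins; a real system would use a full stopword list and Porter's stemmer or similar.

```python
import re
from collections import Counter

STOPWORDS = {"the", "is", "a", "of", "and", "from", "in"}

def stem(word):
    # Crude suffix stripping, for illustration only.
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def extract_terms(text, top=3):
    """Tokenize, drop stopwords, stem, and rank terms by frequency."""
    tokens = re.findall(r"[a-z]+", text.lower())
    terms = [stem(t) for t in tokens if t not in STOPWORDS]
    return Counter(terms).most_common(top)

doc = "Mining texts is the discovery of knowledge from unstructured texts"
print(extract_terms(doc))
```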
Algorithm for Determining Most Qualified Nodes for Improvement in Testability
-Rupali Aher, Sejal Badgujar, Swarada Deodhar, P. V. Sriniwas Shastry
This paper proposes an algorithm for testability measurement of combinational circuits and determination of the Most Qualified Nodes for Improvement in testability (MQNI). The algorithm is verified on the ISCAS'89 benchmark combinational circuits. The Verilog netlist file is used directly as input to the algorithm, and a list of nodes for insertion of effective Design for Testability (DFT) is generated. A table of the Testability Improvement Factor (TIF) after insertion of DFT at the shortlisted nodes is also generated, without realization of the actual schematic circuit. The algorithm is based on SCOAP [1] to measure the testability of the circuit. The testability improvement of every node is considered individually and analyzed on the basis of the TIF thus computed. The algorithm uses the testability and fan-out of a node to identify MQNI. The improvement in testability and the area overhead for various benchmark circuits are obtained by inserting DFT at the MQNI determined by this algorithm.
Keywords—DFT, SCOAP, Testability, Controllability, Observability
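The SCOAP measure this paper builds on assigns each net a 0-controllability and 1-controllability (higher = harder to control). A minimal sketch for a gate-level netlist, using the standard SCOAP rules for AND, OR, and NOT gates (primary inputs get CC0 = CC1 = 1); the example netlist is invented for illustration.

```python
def scoap_controllability(netlist, inputs):
    """Compute SCOAP combinational controllability (CC0, CC1) per net.
    netlist: list of (gate_type, output_net, [input_nets]) in
    topological order."""
    cc = {net: (1, 1) for net in inputs}  # primary inputs: CC0 = CC1 = 1
    for gate, out, ins in netlist:
        cc0s = [cc[n][0] for n in ins]
        cc1s = [cc[n][1] for n in ins]
        if gate == "AND":
            # 0 on any input forces 0; 1 needs all inputs at 1
            cc[out] = (min(cc0s) + 1, sum(cc1s) + 1)
        elif gate == "OR":
            # 0 needs all inputs at 0; 1 on any input forces 1
            cc[out] = (sum(cc0s) + 1, min(cc1s) + 1)
        elif gate == "NOT":
            cc[out] = (cc1s[0] + 1, cc0s[0] + 1)
    return cc

# y = (a AND b) OR c
netlist = [("AND", "n1", ["a", "b"]), ("OR", "y", ["n1", "c"])]
cc = scoap_controllability(netlist, ["a", "b", "c"])
print(cc["y"])  # (4, 2): y is harder to drive to 0 than to 1
```

Nodes whose CC values stay high even after such analysis are the natural candidates the paper's algorithm shortlists for DFT insertion.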
Data Extraction from Web Database Search Result Using Automatic Annotation
Data extraction has become the need of the hour, with so much data being populated on web pages. More specifically, users expect data extraction to be structured and dynamic. Data extraction from HTML-based search interfaces is usually performed by wrappers, but manual wrapping results in poor scalability. Querying through a search interface retrieves data composed of multiple data units, each corresponding to one semantic, and the result set is referred to as the search result records (SRRs). Data then has to be extracted from these SRRs and assigned meaningful labels, a process referred to as annotation. Annotated data units with the same semantic meaning are grouped, and an annotation wrapper can be used to annotate new result records from the same web database. In this paper we present an automatic annotation approach that involves three phases to annotate and display the result.
Keywords: Result record, web database, data units, annotation wrapper
An Integrated Framework for Cloud Data Management in Educational Institutes
-Indu Arora, Dr. Anu Gupta
Information and Communication Technology (ICT) has transformed the whole world into a global village. Acquiring and maintaining essential ICT infrastructure has become a great challenge, especially in the education sector. Being a human resource development sector, it needs to use expensive infrastructure more effectively, not only for providing education but also for its transactional applications. With the evolution of Cloud Computing, scalable IT-enabled services are delivered online to users on a pay-per-usage basis, from anywhere and through a variety of devices. The present paper emphasizes the use of Cloud Computing and Cloud databases for the transactional applications of educational institutes. It highlights the issues involved in managing data in the Cloud, such as conformity to ACID (Atomicity, Consistency, Isolation, Durability) guarantees for transactional data. The paper then proposes an Integrated Framework for managing the transactional applications of educational institutes. The proposed framework provides an efficient and effective technique to manage transactional data, and it brings uniformity to the way transactional applications and data are stored and accessed by various educational institutes in the Cloud.
Polyhedrons in a Spherical Coordinate System
The paper presents a new way of describing a polyhedron based on a spherical coordinate system. Originally the method was developed as part of an algorithm that will be used to extend the method of mini-models onto multidimensional space. The polyhedron is described by its faces, not by its vertices. The algorithm gives an easy-to-understand and relatively simple way of manipulating a polyhedron and testing point inclusion within the figure's volume. It can be used in a wide range of computational geometry applications and is especially handy in tasks in which the polyhedron's size and location are constantly changing. The first part of the article briefly describes the polar and spherical coordinate systems. The second part presents polyhedrons in the spherical coordinate system in a comprehensive manner, along with their future use.
Keywords: computational geometry, spherical coordinate system, polyhedron, point inclusion
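A face-based description makes point-inclusion testing straightforward. The sketch below shows the Cartesian half-space version of that idea for a convex polyhedron (the paper itself works in spherical coordinates, so this is an analogue of the approach, not its implementation): each face is a (normal, offset) pair, and a point is inside iff it satisfies every face inequality.

```python
def inside(point, faces):
    """Point-inclusion test for a convex polyhedron described by its
    faces rather than its vertices. Each face is (normal, offset); the
    interior satisfies dot(normal, point) <= offset for every face."""
    return all(
        sum(n_i * p_i for n_i, p_i in zip(normal, point)) <= d
        for normal, d in faces
    )

# The unit cube [0,1]^3 described by its six faces:
cube = [
    (( 1, 0, 0), 1), ((-1, 0, 0), 0),
    (( 0, 1, 0), 1), (( 0, -1, 0), 0),
    (( 0, 0, 1), 1), (( 0, 0, -1), 0),
]
print(inside((0.5, 0.5, 0.5), cube))  # True
print(inside((1.5, 0.5, 0.5), cube))  # False
```

Scaling or translating the polyhedron only rescales the offsets or shifts them, which is why a face-based description suits figures whose size and location change constantly.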
Classification of Genome Data using Random Forest Algorithm: Review
-Mohammed Zakariah
Random Forest is a popular machine learning tool for classification of large datasets. The datasets classified with the Random Forest (RF) algorithm are correlated, and the interaction between features leads to the study of genome interactions. This review considers RF with respect to its variable selection property, which reduces large datasets to the relevant samples and predicts the accuracy for the selected variables. Variables are selected from the huge dataset and their error rates are calculated with accuracy-prediction methods; when these two properties are applied, the classification of huge data becomes easy. Various variable selection and accuracy prediction methods are discussed in this review.
Keywords: Random Forest Algorithm, Genome datasets, Classification, Data mining, Variable Selection, Accuracy prediction.
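The variable-selection idea reviewed here is commonly realized via permutation importance: shuffle one feature's values and measure how much accuracy drops. The sketch below applies it to a deliberately trivial one-feature "model" rather than a real forest, so the data and model are illustrative assumptions, not the review's method.

```python
import random

def stump_predict(x):
    # Trivial stand-in model: class 1 iff feature 0 exceeds 0.5.
    return 1 if x[0] > 0.5 else 0

def accuracy(X, y, predict):
    return sum(predict(x) == t for x, t in zip(X, y)) / len(y)

def permutation_importance(X, y, predict, seed=0):
    """Importance of feature j = accuracy drop after shuffling column j."""
    rng = random.Random(seed)
    base = accuracy(X, y, predict)
    importances = []
    for j in range(len(X[0])):
        col = [x[j] for x in X]
        rng.shuffle(col)
        Xp = [x[:j] + [v] + x[j + 1:] for x, v in zip(X, col)]
        importances.append(base - accuracy(Xp, y, predict))
    return importances

# Feature 0 determines the label; feature 1 is pure noise:
X = [[0.1, 0.9], [0.9, 0.2], [0.2, 0.8], [0.8, 0.1], [0.3, 0.7], [0.7, 0.3]]
y = [stump_predict(x) for x in X]
imp = permutation_importance(X, y, stump_predict)
```

The noise feature scores exactly zero importance, which is how irrelevant variables get filtered out of a huge genome dataset before classification.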
Impact and Challenges of Information and Communication Technology, Cloud Computing and Open Source Software in e-Governance in India
-Jovi D'Silva, Amrita Mukherjee
E-Governance is an emerging concept in India. Though there are multiple challenges in implementing E-Governance in a developing country like India, the government has taken up the challenge and is making efforts to reach out to the stakeholders. This paper gives a detailed account of the present state of E-Governance in India, as well as the changes to the framework that are necessary in terms of the new and evolving technology of E-Governance, so that it reaches the masses who need it the most.
Analysing Large Web Log Files in a Hadoop Distributed Cluster Environment
-S Saravanan, B Uma Maheswari
Analysing web log files has become an important task for e-commerce companies seeking to predict customer behaviour and improve their business. Each click on an e-commerce web page creates about 100 bytes of data. Large e-commerce websites like flipkart.com, amazon.in, and ebay.in are visited by millions of customers simultaneously; as a result, these customers generate petabytes of data in their web log files. Because the web log files are so large, parallel processing and a reliable data storage system are required to process them, and both requirements are provided by the Hadoop framework. Hadoop offers the Hadoop Distributed File System (HDFS) and the MapReduce programming model for processing huge datasets efficiently and effectively. In this paper, a NASA web log file is analysed: the total number of hits received by each web page and the total number of hits received by the web site in each hour are calculated using the Hadoop framework, and it is shown that Hadoop takes less response time to produce accurate results.
Keywords - Hadoop, MapReduce, Log Files, Parallel Processing, Hadoop Distributed File System, E-Commerce
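The two counts computed in this paper (hits per page, hits per hour) can be sketched over Common Log Format lines like those in the NASA dataset. This single-process sketch is not the authors' Hadoop job: in MapReduce, the loop body would be the mapper, emitting (page, 1) and (hour, 1) pairs, and the Counter additions would be the reducers, running in parallel across the cluster. The sample lines are in the NASA log's format but abbreviated here.

```python
import re
from collections import Counter

# Extract date, hour, and requested page from a Common Log Format line.
LOG_RE = re.compile(r'\[(\d+/\w+/\d+):(\d+):\d+:\d+ [^\]]*\] "(?:GET|POST) (\S+)')

def analyse(lines):
    """Count hits per page and hits per hour of day."""
    page_hits, hour_hits = Counter(), Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m:
            _, hour, page = m.groups()
            page_hits[page] += 1
            hour_hits[hour] += 1
    return page_hits, hour_hits

log = [
    '199.72.81.55 - - [01/Jul/1995:00:00:01 -0400] "GET /history/apollo/ HTTP/1.0" 200 6245',
    'unicomp6.unicomp.net - - [01/Jul/1995:00:00:06 -0400] "GET /shuttle/countdown/ HTTP/1.0" 200 3985',
    '199.120.110.21 - - [01/Jul/1995:13:00:09 -0400] "GET /history/apollo/ HTTP/1.0" 200 6245',
]
pages, hours = analyse(log)
print(pages["/history/apollo/"], hours["00"])  # 2 2
```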
Enhanced Analysis on Route Summarization and Route Redistribution with OSPF vs. EIGRP Protocols Using GNS-3 Simulation
-Haresh N. Patel, Prof. Rashmi Pandey
Routing protocols play an important role in today's Internet era. A routing protocol determines how routers communicate with one another to forward packets along the best path from a source node to a destination node. In this paper we explore two eminent protocols, the Enhanced Interior Gateway Routing Protocol (EIGRP) and Open Shortest Path First (OSPF), with respect to route redistribution and route summarization, using different techniques to reduce the number of routes, filter LSA types, and reduce the size of the LSA database and the traffic of the network. In any case, a multi-protocol environment makes redistribution a necessity. Variations in routing protocol characteristics, such as metrics, administrative distance, and classful versus classless capabilities, affect route redistribution; these variations must be reconciled for redistribution to succeed, otherwise network traffic increases. The analysis simulates networks using route summarization, stub areas, totally stubby areas, NSSA areas, NSSA stub areas, and NSSA totally stubby areas, which greatly reduce processor workload, memory, and bandwidth demands.
Index Terms:OSPF, EIGRP, Route Summarization, Route Redistribution, Stub Area, Totally Stub area, NSSA area, NSSA- Stub Area, NSSA-Totally Stub Area
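Route summarization, one of the two techniques analysed here, collapses contiguous prefixes into a single covering advertisement. Python's standard `ipaddress` module can demonstrate the arithmetic a router performs; the prefixes below are example addresses, not the paper's topology.

```python
import ipaddress

def summarize(prefixes):
    """Collapse contiguous prefixes into the fewest covering routes,
    the same aggregation a router applies when advertising a summary
    at an area or autonomous-system boundary."""
    nets = [ipaddress.ip_network(p) for p in prefixes]
    return [str(n) for n in ipaddress.collapse_addresses(nets)]

print(summarize(["10.0.0.0/24", "10.0.1.0/24", "10.0.2.0/24", "10.0.3.0/24"]))
# ['10.0.0.0/22']
```

Advertising one /22 instead of four /24s is exactly how summarization shrinks the LSA database and the routing table, reducing memory and bandwidth demands.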
Technology, Systems and Implementation of a Smart Home Automation System: A Review
-Suraj Bhatia, Jatin Bajaj, M. Mani Roja
The development of smart home automation systems has great potential in today's age of technology. According to Smart Home Energy, a smart home, or smart house, is a home that incorporates advanced automation systems to provide the inhabitants with sophisticated monitoring and control over the building's functions. Such systems make use of different types of sensors to examine the environment and maintain control over home appliances using the latest communication and networking methods. The purpose of this paper is to provide information about the implementation and design of existing smart home technologies. The paper also discusses our wireless, voice-controlled smart home system that allows people to control their home devices by voice command at home.
Keywords: Smart Home, Automation, Wireless
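The voice-command control loop described in this abstract can be sketched as a small command parser and device dispatcher. The device names, command grammar, and state table below are invented for illustration; in the real system a speech recognizer would produce the text and a wireless link would carry the resulting action.

```python
import re

# Hypothetical device state table, keyed by (room, device):
DEVICES = {("living room", "light"): "off", ("bedroom", "fan"): "off"}

def handle_command(text):
    """Tiny rule-based parser for commands like
    'turn on the living room light'."""
    m = re.match(r"turn (on|off) (?:the )?(.+) (\w+)$", text.lower())
    if not m:
        return None
    action, room, device = m.groups()
    if (room, device) in DEVICES:
        DEVICES[(room, device)] = action  # actuate the device
        return f"{device} in {room} turned {action}"
    return None

print(handle_command("turn on the living room light"))
# light in living room turned on
```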
Semi-Automated Construction Mechanism of Heterogeneous Artifacts
-Mounir ZEKKAOUI, Abdelhadi FENNAN
An artifact is a general term for any kind of created information produced, modified, or used by developers in the implementation of software systems. These artifacts include source code, analysis and design models, unit tests, XML deployment descriptors, user guides, and properties files. We consider that an application is described by a set of heterogeneous and distributed software artifacts. All artifacts can evolve over time: artifacts can be removed, others can be added, and each artifact may change. This may be a source of degradation in functional, qualitative, or behavioral terms of the modified software. Hence the need for a unified approach for the extraction and representation of the different heterogeneous artifacts, in order to ensure a unified and detailed description of them, exploitable by several software tools and enabling those responsible for software evolution to reason about the changes concerned.
Keywords: Heterogeneous software artifacts, Software evolution control, Unified approach, Meta Model, Software Architecture
IJCTA © Copyrights 2010| All Rights Reserved.

This work is licensed under a Creative Commons Attribution 2.5 India License.