
International Conference on Innovative Trends in Computing and Technology (ICITCT'13), March 2013

LOW COMPLEXITY FAULT DETECTION SCHEME FOR ADVANCED ENCRYPTION STANDARD


JISHNUVIMAL, PG, A.SARAVANAN, MahaBarathi Engineering College

Abstract The Advanced Encryption Standard (AES) is the accepted symmetric cryptographic standard for transferring blocks of data securely. To prevent AES from succumbing to fault attacks, error detection techniques can be adopted to detect errors during encryption or decryption. In this paper, low-complexity fault detection schemes for achieving a reliable AES architecture are presented. A parity-based mechanism is implemented in place of the look-up-table method for SubBytes, and thereby we propose low-complexity fault detection schemes for AES encryption and decryption. Our simulation results show error coverage greater than 99 percent for the proposed schemes.
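The parity mechanism can be illustrated with a toy sketch. The 4-bit S-box below is hypothetical (the real AES SubBytes table has 256 entries), and the fault model, a single injected bit flip, is an assumption for illustration; the idea is that one precomputed parity bit per table entry detects any odd number of flipped output bits.

```python
def parity(b):
    # 1 if the value has an odd number of set bits, else 0
    return bin(b).count("1") & 1

# Hypothetical 4-bit S-box standing in for the 256-entry AES SubBytes table.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

# Predicted parity of each S-box output: one extra stored bit per entry.
PRED_PARITY = [parity(v) for v in SBOX]

def sub_byte_checked(x, fault_mask=0):
    """Substitute x; fault_mask models an injected transient fault."""
    y = SBOX[x] ^ fault_mask
    ok = parity(y) == PRED_PARITY[x]  # mismatch flags an odd-bit fault
    return y, ok
```

A fault-free lookup passes the check, while any single-bit fault is flagged; even-bit faults escape one parity bit, which is why such schemes report error coverage below 100 percent.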

COLLABORATIVE INFORMATION RETRIEVAL WITH N-GRAM EXTRACTION AND COLLABORATIVE USER RANKING
S.RENUGADEVI, S.AFSAR SALEEMA, Anna University, Chennai

Abstract There is too much information available on the web, and users are often not patient enough to scan the long list of results returned by search engines to find relevant information. Web search can be made more useful, effective, and less burdensome by inferring what would be relevant for the current user for a given query, considering the individual user's interests, and placing those results at the top, so that the user need not scroll through a long list of results. Collaboration among many users makes it possible to search for and recommend results in an effective manner. When working with multiple-word queries, especially phrase queries, an N-gram approach is applied to search within documents; here the n-gram approach is used for searching and retrieving the best-matched key phrase. For multiple-word queries, we internally obtain a result set for each word in the query, but better matches combined with a good UsersRank are more likely to occur in the first spots. Collaborative information retrieval in the ranking phase provides the more relevant pages preferred by the users in the collaboration.
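As a minimal sketch of the word-level n-gram extraction and phrase matching described above (the whitespace tokenization and overlap scoring are simplifying assumptions, not the paper's exact method):

```python
def ngrams(text, n=2):
    # split the query/document into words and emit every run of n words
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

def best_match(query, phrases, n=2):
    # score each candidate phrase by how many query n-grams it shares
    q = set(ngrams(query, n))
    return max(phrases, key=lambda p: len(q & set(ngrams(p, n))))
```

A ranking stage could then combine this overlap score with the collaborative UsersRank signal.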
Organized by: Department of Computer Science and Engineering & Information Technology, The Rajaas Engineering College, Vadakangulam

RECENT TREND IN IT: HAPTIC TECHNOLOGY IN SURGICAL SIMULATION AND MEDICAL TRAINING (A TOUCH REVOLUTION)

ALAGESWARAN.A, OM PRAKKASH T.S, Vickram College of Engineering

Abstract Engineering finds a wide range of applications in every field, and the medical field is no exception. One of the technologies that aids surgeons in performing even the most complicated surgeries successfully is virtual reality (VR). Even though virtual reality is employed to carry out operations, the surgeon's attention remains one of the most important parameters, and any mistake may lead to a dangerous end. One may therefore think of a technology that reduces the burden on the surgeon by providing more efficient interaction than VR. This dream has come to reality by means of a technology called haptic technology. Haptics is the science of applying tactile sensation to human interaction with computers. In our paper we discuss the basic concepts behind haptics, along with haptic devices and how these devices interact to produce the sense of touch and force-feedback mechanisms. The implementation of this mechanism by means of haptic rendering and contact detection is also discussed. We mainly focus on the application of haptic technology in surgical simulation and medical training. Further, we explain the storage and retrieval of haptic data while working with haptic devices, and illustrate the necessity of haptic data compression.

i-TREESEARCH USING TOP-K APPROXIMATE SUBTREE MATCHING


INDU R NETHAJI, Mahendra Institute of Engineering & Technology

Abstract This research paper implements i-TreeSearch using Top-k Approximate Subtree Matching (TASM): the problem of finding the k best matches of a small query tree within a large document tree, using the canonical tree edit distance as a similarity measure between subtrees.

Evaluating the tree edit distance for large XML trees is difficult: the best known algorithms have cubic runtime and quadratic space complexity and thus do not scale. TASM-postorder is a memory-efficient and scalable TASM algorithm. This paper proves an upper bound on the maximum subtree size for which the tree edit distance needs to be evaluated; the bound depends on the query and is independent of the document size and structure. A core problem is to efficiently prune subtrees that are above this size threshold. We develop an algorithm based on a prefix ring buffer that prunes all subtrees above the threshold in a single postorder scan of the document.
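The size-threshold pruning idea can be sketched as follows. This is not the paper's prefix-ring-buffer algorithm, just an illustration of the principle on trees written as (label, children) pairs:

```python
def candidate_subtrees(root, threshold):
    # One postorder pass computes each subtree's size; subtrees larger
    # than the threshold cannot be top-k answers and are pruned, while
    # their small-enough descendants remain candidates.
    candidates = []

    def postorder(node):
        label, children = node
        size = 1 + sum(postorder(c) for c in children)
        if size <= threshold:
            candidates.append((label, size))
        return size

    postorder(root)
    return candidates
```

Only the surviving small subtrees would then be compared to the query with the tree edit distance.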

SEMANTIC WEB SERVICES - A SURVEY

GAYATHIRI

Abstract Semantic Web Services are the technology by which the meaning of information and of web services is defined, enabling the web to understand and satisfy the requests of people. The idea is to have data on the web defined and linked in a way that can be used by machines, not just for display purposes but for the automation, integration, and reuse of data across various applications. Semantics is introduced to overcome the limitations of Web services: average WWW searches examine only about 25% of potentially relevant sites and return a lot of unwanted information, information on the web is not suitable for software agents, and the web keeps doubling in size. Semantic Web Services are built on top of Web services, extended with rich semantic representations and capabilities for automatic reasoning developed in the field of artificial intelligence. This work attempts to give an overview of the underlying concepts and technologies, along with the categorization, selection, and discovery of services based on semantics.

DELIVERING SCALABLE HIGH BANDWIDTH STORAGE FOR HIGH SPEED DATA TRANSFER
L.M. GLADIS BEULA, MRS. N. SARAVANAN, VelTech MultiTech Dr.Rangarajan Dr.Sakunthala Engineering College.


Abstract A number of high-bandwidth networks have been constructed, yet existing high-speed protocols cannot fully utilize their bandwidth because their fixed-size application-level receiving buffers suffer from a buffer bottleneck. In this paper, we analyze the buffer bottleneck problem and propose Rada, which periodically detects the data arrival rate and consumption rate in the buffer using an exponential moving average scheme and adapts the buffer size dynamically. Rada increases or decreases the buffer when the data arrival rate is consistently faster or slower than the data consumption rate; the extent of each increase or decrease operation is based on a linear aggressive-increase, conservative-decrease scheme, and memory utilization is weighed with a weighted mean function. To achieve high-speed data transfer as well as easy deployment, UDP-based high-speed protocols running at the application level have recently been proposed and deployed, but they still cannot fully utilize these high-bandwidth networks.
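The adaptation loop can be roughly sketched as below. The smoothing factor and step sizes are invented for illustration; Rada's actual aggressive-increase/conservative-decrease parameters are not given here.

```python
def ema(prev, sample, alpha=0.2):
    # exponential moving average used to smooth the measured
    # arrival and consumption rates
    return alpha * sample + (1 - alpha) * prev

def adapt_buffer(size, arrival_rate, consumption_rate,
                 inc=4096, dec=1024):
    # aggressive linear increase when data arrives faster than it is
    # consumed; conservative (smaller) linear decrease otherwise
    if arrival_rate > consumption_rate:
        return size + inc
    if arrival_rate < consumption_rate:
        return max(size - dec, dec)
    return size
```

The asymmetric steps favor avoiding buffer overflow over reclaiming memory quickly.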

DEFENCE TO UNSAFE COMPONENT LOADINGS


GREESHMA BANERJI, HEMALATHA.B., Mahendra College of Engineering for Women

Abstract Dynamic loading is an important mechanism for software development: it gives an application the flexibility to dynamically link a component and use its exported functionality. In general, an operating system or a runtime environment resolves the loading of a specifically named component by searching for its first occurrence in a sequence of directories determined at runtime. Correct component resolution is critical for reliable and secure software, since dynamic loading can be hijacked by placing an arbitrary file with the specified name in a directory searched before the target component is resolved. A key step in dynamic loading is component resolution, i.e., how to locate the correct component for use at runtime; operating systems generally provide two resolution methods, specifying either the full path or the file name of the target component. It is therefore important to detect and fix these vulnerabilities. This is the first automated technique to detect vulnerable and unsafe dynamic component loadings. We classify two types of unsafe dynamic loadings, resolution failure and resolution hijacking, and develop an effective dynamic program analysis to detect and avoid both types of unsafe loadings. It can detect more than 1,700 unsafe DLL loadings and discover new serious attack vectors for remote code execution. Detected malicious DLL files are prevented from loading while a file is being opened or copied, and the user is asked whether or not to continue; if the user chooses to continue, the system proceeds with opening or copying, otherwise the user can select the stop option.
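Filename-based resolution and its hijacking risk can be modeled abstractly as below (the directory names and the `dir_contents` mapping are hypothetical; real detection instruments the OS loader rather than a dictionary):

```python
def resolve_component(name, search_dirs, dir_contents):
    # mimic filename-based resolution: the first directory in the
    # search order that contains `name` wins
    for d in search_dirs:
        if name in dir_contents.get(d, set()):
            return d
    return None

def is_hijacked(name, search_dirs, dir_contents, trusted_dir):
    # resolution hijacking: a file with the same name sits in a
    # directory searched before the trusted one
    resolved = resolve_component(name, search_dirs, dir_contents)
    return resolved is not None and resolved != trusted_dir
```

A `None` result models the other unsafe case, resolution failure, where an attacker can later supply the missing file.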

JAMMING ATTACK PREVENTION IN WIRELESS NETWORK USING PACKET HIDING METHODS


E.GOPINATHDHINAKARAN, S.V.MANIKANTHAN M.E., Dr.Pauls Engineering College

Abstract Modern society is heavily dependent on wireless networks for data transmission, and jamming attacks can occur while data are transmitted over the wireless medium. Selective jamming attacks can be launched by performing real-time packet classification at the physical layer. All-Or-Nothing Transformation methods introduce a modest communication and computation overhead; in this method, a block encryption algorithm is used to hide the messages, but it does not consider timing limits and parameter lengths. To overcome this problem, a smart code generator algorithm is used. This technique provides a strong security level in the wireless medium.

PREVENTION OF BLACK HOLE ATTACK AND CO-OPERATIVE BLACK HOLE ATTACK IN MANET

JEEVA

Abstract Advancement in the research field has witnessed rapid development in Mobile Ad-hoc Networks. Their distributed nature and infrastructure-less structure make them an easy prey to security-related threats. A black hole is a malicious node which replies to route requests claiming it has a fresh route to the destination and then drops all the packets it receives. The damage is more serious when such nodes work as a group; this type of attack is called a co-operative black hole attack. In this work, we have designed a routing solution called Trust Based DSR (TBDSR) that enables the Dynamic Source Routing protocol (DSR) to find a secure end-to-end route free of black hole nodes with cooperation from the neighbors. Our solution can also protect the network in the presence of colluding attackers without the need for promiscuous monitoring of the neighbor nodes. The extended defense routing protocol worked efficiently for malicious node detection and removal in the case of a co-operative black hole attack, resulting in increased network performance. Keywords: Black Hole Attack, Cooperative Black Hole Attack, Ad Hoc Networks, DSR.

AUTHENTICATION ON KEY MANAGEMENT FRAMEWORK WITH HYBRID MULTICASTING NETWORK.


JEYABHARATHI, P.S.R. Engineering College.

Abstract A wireless ad hoc network is a collection of wireless hosts that can be rapidly deployed as a multi-hop packet radio network without the aid of any established infrastructure or centralized administration. The cost reduction and fast evolution experienced by wireless communication technologies have made them suitable for a wide spectrum of applications, one of which is multicasting networks. Multicasting systems aim at providing a platform for various applications that can improve safety and efficient group communication. The proposed asynchronous key verification scheme, as a part of the protocol, achieves a significant reduction in message delay.

EMPOWERED SERVICE DELEGATION WITH ATTRIBUTE ENCRYPTION FOR DISTRIBUTED CLOUD COMPUTING

M.JOTHIMANI

Nandha Engineering College,

Abstract Cloud computing has emerged as one of the most influential paradigms in the IT industry. Because this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. Several schemes employing attribute-based encryption (ABE) have been proposed for access control of outsourced data in cloud computing, but most of them suffer from inflexibility in implementing complex access control policies. Allowing cloud service providers (CSPs), which are not in the same trusted domains as enterprise users, to take care of confidential data may raise potential security and privacy issues. To keep sensitive user data confidential against untrusted CSPs, a natural approach is to apply cryptographic methods, disclosing decryption keys only to authorized users, while also providing high performance, full delegation, and scalability, so as to best serve the needs of accessing data anytime and anywhere, delegating within enterprises, and achieving a dynamic set of users. HASBE employs multiple value assignments for access expiration time to deal with user revocation more efficiently than existing schemes, and it provides fine-grained access control and full delegation. Based on the HASBE model, we finally propose a scalable revocation scheme that delegates most of the computing tasks in revocation to the CSP, to achieve a dynamic set of users efficiently.

A SIMPLE METHOD FOR LYMPHOMA CLASSIFICATION USING PRINCIPAL COMPONENT ANALYSIS


METILDA.D, KALAIVANI.I, Dr. Sivanthi Aditanar College of Engineering

Abstract Lymphoma is a cancer of the lymphocytes. The proposed approach classifies three types of malignant lymphoma: chronic lymphocytic leukemia, follicular lymphoma, and mantle cell lymphoma. Initially, raw pixels are transformed with a set of transforms into spectral planes; both simple and compound transforms are computed. The raw pixels and spectral planes are then routed to the second stage, where the set of features is computed. A single feature vector is formed by fusing all computed features, and the classification mechanism is carried out to classify the malignancies by type.
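The fusion-and-classification stage might look like the sketch below. The feature values and class centroids are invented, and a nearest-centroid rule stands in for whatever classifier the authors actually used after the PCA step:

```python
def fuse_features(*feature_sets):
    # concatenate the features computed on raw pixels and on each
    # spectral plane into a single feature vector
    vec = []
    for fs in feature_sets:
        vec.extend(fs)
    return vec

def classify(vec, centroids):
    # assign the fused vector to the nearest class centroid
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda c: sqdist(vec, centroids[c]))
```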

EFFICIENT RESOURCES PROVISIONING IN CLOUD SYSTEMS FOR COST BENEFITS


M.KARTHI, S.NACHIYAPPAN, Velammal College of Engineering and Technology

Abstract Cloud providers can offer consumers two provisioning plans for computing resources, namely a reservation plan and an on-demand plan. In general, the cost of utilizing computing resources provisioned under the reservation plan is cheaper than under the on-demand plan. Many kinds of resource provisioning options are available in a cloud environment to reduce the total cost and better utilize cloud resources. However, the best advance reservation of resources is difficult to achieve due to the uncertainty of consumers' future demand and providers' resource prices. To address this problem, a probabilistic cloud resource provisioning (PCRP) algorithm is proposed by formulating a probabilistic model. In this paper, the solution of the PCRP algorithm considers a state-based machine, the probability of utilization, and estimated future demand.

DERIVING CAPACITY LIMITS OF DATA COLLECTION FOR RANDOM WIRELESS SENSOR NETWORKS
MR.K.A.RAJA, M.E, K.LAVANYA, Ranipettai Engineering College.

Abstract A wireless sensor network (WSN) consists of spatially distributed sensors that monitor environmental conditions and cooperatively collect and pass their data through the network to a main location, or sink. Data collection is thus a fundamental function provided by wireless sensor networks. The performance of data collection in sensor networks can be characterized by the rate at which sensed data can be collected and transmitted to the sink node; data collection capacity reflects how fast the sink can collect sensing data from all sensors under the interference constraint. In this project, in order to obtain optimal performance for arbitrary sensor networks, we use a simple BFS-tree method for data collection and a greedy scheduling algorithm for deriving capacity bounds of data collection under a general graph model, where two nearby nodes may be unable to communicate due to barriers or path fading, and where sensor nodes can be deployed with Gaussian distribution in any network topology.
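The BFS collection tree is a standard construction; a minimal version over an adjacency list (node names are illustrative) is:

```python
from collections import deque

def bfs_tree(adj, sink):
    # Build a breadth-first tree rooted at the sink; each sensor
    # forwards its data to its parent, so data drains toward the sink
    # along shortest hop-count paths.
    parent = {sink: None}
    queue = deque([sink])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                queue.append(v)
    return parent
```

A greedy scheduler would then pick non-interfering tree links to activate in each time slot.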

DETECTING COMPROMISED MACHINERY BY MONITORING SOCIABLE COMMUNICATION


MRS. K.ARUNA, N.LAKSHMI, V.GAYATHRI, G.ABIRAMI, A.V.C College of Engineering

Abstract This paper focuses on the detection of compromised machines in a network that are used for sending spam messages, commonly referred to as spam zombies. The nature of sequentially observing outgoing messages gives rise to a sequential detection problem. In this project, we develop a spam zombie detection system, named SPOT, that monitors the outgoing messages of a network. SPOT is designed based on a powerful statistical method called the Sequential Probability Ratio Test (SPRT), developed by Wald, which can be used to test between two hypotheses (in our case, that a machine is compromised versus not compromised) as the events (in our case, outgoing messages) occur sequentially. Our evaluation studies show that SPOT is an effective and efficient system for automatically detecting compromised machines in a network. After identifying a spam message, the SPOT system detects the compromised machine and restricts outgoing messages from it. This system can be used in online applications, and with it we aim to reduce the large number of compromised machines in a network.
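Wald's SPRT over a stream of outgoing messages can be sketched as below; the spam probabilities p0/p1 (for uncompromised versus compromised machines) and the error bounds are illustrative parameters, not SPOT's published settings:

```python
import math

def sprt(messages, p0=0.2, p1=0.8, alpha=0.01, beta=0.01):
    # messages: 1 = spam, 0 = non-spam. H1: machine is compromised
    # (spam probability p1); H0: not compromised (probability p0).
    low = math.log(beta / (1 - alpha))    # accept H0 at or below this
    high = math.log((1 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0                             # running log-likelihood ratio
    for x in messages:
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= high:
            return "compromised"
        if llr <= low:
            return "not compromised"
    return "undecided"
```

The test keeps observing until the evidence crosses either threshold, which bounds both false-positive and false-negative rates.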

AUTOMATED DETECTION OF CYBER SECURITY ATTACKS


SABARINATHAN P., KAVIYARASI S, PABCET.

Abstract Internet services and applications have become an inextricable part of daily life, enabling communication and the management of personal information from anywhere. To accommodate this increase in application and data complexity, web services have moved to a multi-tiered design wherein the web server runs the application front-end logic and data are outsourced to a database or file server; this is the main reason attackers try to attack the database. Cyber security attacks can be detected by using Double Guard, which differs from approaches that correlate alerts from independent IDSs. It uses a container-based, session-separated web server architecture that enhances security and provides isolation between the information flows, which are separated into each container session. Virtualization is used to isolate objects and enhance security performance, and lightweight containers can have considerable performance advantages over full virtualization.

SECURE AUTHENTICATION USING BIOMETRIC CRYPTOSYSTEM
MS. N.MADHU SUGANYA, MS.T.MEKALA, M.Kumarasamy College of Engineering

Abstract Cryptography is a means of protecting data during transmission over a wireless network. It is used in information security to protect information from unauthorized or accidental disclosure while the information is in transit (either electronically or physically) and while it is in storage, since the information could otherwise be accessed by an unauthorized user for malicious purposes. It is therefore necessary to apply effective encryption/decryption methods to enhance data security. The existing system only limits the total number of users from unknown remote hosts to as low as that from known remote hosts. It uses white-list values for tracking legitimate users, but the cookie value expires after a certain time period, so attackers may use different browsers, try on another machine, or retry after some time; and if a malicious attack occurs, the authenticated user does not know about it. The proposed system uses two algorithms, known as the Bio-Metric Encryption Algorithm (BEA) and the Minutiae Extraction Algorithm (MEA). It uses multi-biometric features for authentication, and it dynamically generates a new session key for each transaction. After completion of each transaction, the authenticated user must change their PIN number to improve security. The proposed system will thus protect the data confidentiality, data integrity, authentication, availability, and access control of information over the network.

DISTRIBUTED OPPORTUNISTIC ROUTING WITH CONGESTION DIVERSITY


MR.M.ISLABUDEEN, M.E., (PH.D), P.NAGARAJAN, Syed Ammal Engineering College

Abstract The main challenge in the design of minimum-delay routing policies is balancing the trade-off between routing packets along the shortest paths to the destination and distributing traffic according to the maximum backpressure. Combining important aspects of shortest-path and backpressure routing, this paper provides a systematic development of a distributed opportunistic routing policy with congestion diversity (D-ORCD) for wireless ad-hoc networks. D-ORCD uses a measure of draining time to opportunistically identify and route packets along paths with an expected low overall congestion. D-ORCD is proved to ensure a bounded expected delay for all networks and under any admissible traffic. Realistic Qualnet simulations for 802.11-based networks demonstrate a significant improvement in average delay over comparable solutions in the literature.

ENHANCED MEASURES FOR PERSONALIZATION OF WEB SEARCH RESULT


MR.P.PRABU, MR.T.GOPALAKRISHNAN, Bannari Amman Institute of Technology

Abstract Web search results should be reordered according to the user's profile so that the results are more relevant to that user; this is called personalization. The profile is created from input given directly by the user in any form (keywords, instructions, etc.) as well as from the user's browsing patterns, and the profile data may be maintained at the client or server level. The reordering of results should maintain a standard, or degree of relevance, while the user retrieves them. Hyperlinks are formed over the collected data based on the reordered search results, and the hyperlink data are ranked using the PageRank algorithm and the Apriori algorithm.
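A minimal power-iteration PageRank over a link map can illustrate the ranking step (the combination with Apriori and the personalization signals are not sketched here):

```python
def pagerank(links, d=0.85, iters=50):
    # links: page -> list of pages it links to
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = d * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                for q in pages:          # dangling page: spread evenly
                    new[q] += d * rank[p] / n
        rank = new
    return rank
```

A personalized variant would bias the (1 - d) teleport mass toward pages matching the user's profile.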

EVOLUTION OF ONTOLOGY BASED ON FREE TEXT DESCRIPTOR FOR WEB SERVICE


RAGAVENDIREN

Abstract Ontologies have become the de-facto modeling tool of choice, employed in a variety of applications and prominently in the Semantic Web. Nevertheless, ontology construction remains a daunting task. Ontological bootstrapping, which aims at automatically generating concepts and their relations in a given domain, is a promising technique for ontology construction. Bootstrapping an ontology based on a set of predefined textual sources, such as Web services, must address the problem of multiple concepts that are largely unrelated. This paper exploits the fact that Web services usually consist of both WSDL and free-text descriptors. The WSDL descriptor is evaluated using two methods, namely Term Frequency/Inverse Document Frequency (TF/IDF) and Web context generation. We propose an ontology bootstrapping process that integrates the results of both methods and validates the concepts using the free-text descriptors, which offer the more accurate definition of the ontologies. We extensively validated the resulting ontology.
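The TF/IDF scoring of WSDL tokens can be sketched as follows (the tokenization is assumed, and the Web-context-generation method is not shown):

```python
import math

def tf_idf(docs):
    # docs: list of token lists; returns, per document, a mapping
    # from term to its tf-idf weight
    n = len(docs)
    df = {}                       # document frequency of each term
    for doc in docs:
        for t in set(doc):
            df[t] = df.get(t, 0) + 1
    weights = []
    for doc in docs:
        w = {t: (doc.count(t) / len(doc)) * math.log(n / df[t])
             for t in set(doc)}
        weights.append(w)
    return weights
```

Terms appearing in every descriptor get weight zero, so only discriminative tokens survive as concept candidates.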

AUTOMATION OF PUBLIC DISTRIBUTION SYSTEM USING RFID CARD AND BIOMETRIC FOR FASTER AND SAFER ACCESS

K.RAJA, MR.N.ANANDA KUMAR

Arunai College of Engineering.

Abstract Our paper focuses on the design and implementation of the computerization of the Public Distribution System (ration shop) throughout the state. In the current scenario, all public and private sectors are computerizing their processes to simplify them and reduce errors. The Civil Supplies Corporation is the major public sector body that manages and distributes essential commodities to all citizens; various products such as rice, sugar, dhal, and kerosene are distributed through the conventional ration shop system. Among the limitations of the conventional system are that, due to manual measurements, the user cannot get the accurate quantity of material, and there is a chance of illegal usage of the products. We have therefore proposed the computerization of the ration shop and, to enhance security, introduced a fingerprint check for opening the billing interface, so as to avoid illegal entries without the knowledge of the ration card holder. Users also get the accurate quantity of supplies at the correct price. In this automated system, the conventional ration card is replaced by an RFID smart card storing all details about the user, and the card holder's fingerprint is used for user authentication. Monitoring the Public Distribution System is one of the big issues in the public sector, so we have eased the monitoring of the whole system, and public complaints are sent directly to the higher authority without any intermediary.

DATA MINING APPROACH TO DETECT SPAM ON FACEBOOK


A.RAJASEKAR, G.SUTHAKAR, Jayaraj Annapackiam C.S.I. College of Engineering

Abstract In this work, we present a text-based social network spam detection application, tested in particular on Facebook spam. We develop an application to test the prototype of Facebook spam detection. The features for checking spam are the number of keywords, the average number of words, the text length, and the number of links. The data mining model using the J48 decision tree is created using Weka [1]. The methodology can be extended to include other attributes. The prototype application demonstrates the real use of the Facebook application.
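Extracting the listed features from a post could look like this sketch. The keyword list is hypothetical, "average number of words" is interpreted here as a plain word count, and the trained J48 model itself is not reproduced:

```python
import re

SPAM_KEYWORDS = {"free", "win", "click", "offer"}  # illustrative list

def extract_features(post):
    # compute the attributes fed to the decision tree classifier
    words = post.lower().split()
    return {
        "n_keywords": sum(w.strip(".,!?") in SPAM_KEYWORDS for w in words),
        "n_words": len(words),          # stand-in for "average number of words"
        "text_length": len(post),
        "n_links": len(re.findall(r"https?://\S+", post)),
    }
```

The resulting feature dictionaries would be exported as a Weka ARFF file and fed to J48 for training.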

AUTHENTICATION BASED CLOUD STORAGE AND SECURE DATA FORWARDING


RAJASEKARAN.S., KALIFULLA.Y., MURUGESAN.S, Veltech Multitech Dr.Rangarajan Dr.Sakunthala Engineering College

Abstract A cloud storage system, consisting of a collection of storage servers, provides long-term storage services over the Internet. Storing data in a third party's cloud system causes serious concern over data confidentiality. General encryption schemes protect data confidentiality but also limit the functionality of the storage system, because only a few operations are supported over encrypted data. Constructing a secure storage system that supports multiple functions is challenging when the storage system is distributed and has no central authority. We propose a threshold proxy re-encryption scheme and integrate it with a decentralized erasure code to formulate a secure distributed storage system. The distributed storage system not only supports secure and robust data storage and retrieval, but also lets a user forward his data in the storage servers to another user without retrieving the data back. The main technical contribution is that the proxy re-encryption scheme supports encoding operations over encrypted messages as well as forwarding operations over encoded and encrypted messages; our method fully integrates encrypting, encoding, and forwarding. We analyze and suggest suitable parameters for the number of copies of a message dispatched to storage servers and the number of storage servers queried by a key server. These parameters allow more flexible adjustment between the number of storage servers and robustness.

NORMALIZED MEAN MEDIAN FILTER FOR HIGHLY CORRUPTED IMPULSE NOISE IMAGE
P.RAJESWARI Anna University, Regional Centre,

Abstract A novel normalized mean-median filter is presented for the removal of salt-and-pepper noise from highly corrupted noisy images. Noisy pixels are replaced with either the computed mean value or the computed median value; the proposed method replaces only the noisy pixels. Experimental results show the superiority of the proposed algorithm compared with state-of-the-art methods such as the standard median filter and the progressive switching median filter, especially when the image is corrupted with high-density impulse noise.
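The "replace only the noisy pixel" rule can be sketched as below. Only the median branch is shown (the paper also uses a computed mean), and the 3x3 window and extreme-value noise test are assumptions:

```python
def remove_salt_pepper(img, lo=0, hi=255):
    # img: 2-D list of gray levels; pixels at the extremes are treated
    # as impulse noise and replaced by the median of their clean 3x3
    # neighbours; all other pixels are left untouched
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if img[i][j] not in (lo, hi):
                continue                    # not noisy: keep as-is
            clean = [img[x][y]
                     for x in range(max(0, i - 1), min(h, i + 2))
                     for y in range(max(0, j - 1), min(w, j + 2))
                     if img[x][y] not in (lo, hi)]
            if clean:
                clean.sort()
                out[i][j] = clean[len(clean) // 2]
    return out
```

Because uncorrupted pixels are never rewritten, edges and fine detail survive even at high noise densities.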

EFFECTIVE OPTIMIZATION OF VIDEO TRANSMISSION IN WLAN


RAMYASREE.R.S., MRS.A.CYNTHIA, Dhanalakshmi Srinivasan College of Engg. and Tech.

Abstract The prevalence of high-definition (HD) cameras, televisions, Blu-Ray players, and DVD recorders means that almost all video content is now captured and recorded digitally, much of it in HD. MPEG-2, H.264/AVC, and VC-1 are the most popular codecs in use today; these rely on decorrelating transforms, motion estimation, intra prediction, and variable-length entropy coding (VLC) to achieve good picture quality at high compression ratios. Alongside the need for efficient video compression, there is a critical requirement for error resilience, particularly in association with wireless networks, which are characterized by highly dynamic variations in error rate and bandwidth. Compression techniques based on prediction and variable-length coding render an encoded bit stream highly sensitive to channel errors. In this paper, techniques such as pyramid vector quantization (PVQ) are implemented to prevent error propagation through the use of fixed-length codewords in the wireless environment, and the frame performance of the video is observed in the pyramid vector section, which offers greater compression performance across various techniques.

CLOUD COMPUTING
M.RIZWANA BARVEEN R.V.R.PRIYANKA KLN College of information technology

Abstract
The term cloud computing has become one of the latest buzzwords in the IT industry. Cloud computing is an innovative approach that leverages existing IT infrastructure to optimize compute resources and manage data and computing workloads. It promises to increase the velocity with which applications are deployed, spur innovation, and lower costs, all while increasing business agility. A complete cloud computing stack supports every facet of the infrastructure, including servers, storage, networking, and virtualization.

A GLOBAL THRESHOLD BASED APPROACH FOR DENDRITIC SPINE DETECTION


MR.S.ATHINARAYANAN, K. SAM ELIEZER

Abstract Neuron reconstruction and dendritic spine identification on a large data set of microscopy images is essential for understanding the relationship between the morphology and functions of dendritic spines. Dendrites are the tree-like structures of neuronal cells, and spines are small protrusions on the surface of dendrites. Spines have various visual shapes (e.g., mushroom, thin, and stubby) and can appear or disappear over time. Existing neurobiology literature shows that the morphological changes of spines and the dendritic spine structures are highly correlated with their underlying cognitive functions. Accurately and automatically extracting meaningful structural information from a large microscopy image data set is a difficult task. One challenge in spine detection and segmentation is how to automatically separate touching spines. In this paper, touching spines are detected based on various global and local geometric features of the dendrite structure, and a breaking-down and stitching-up algorithm is used to segment them.

SYMPATHETIC NODE LOCALIZABILITY OF WIRELESS AD HOC AND SENSOR NETWORKS


G.BASKARAN, M.SARANYA Srinivasan Engineering College,

Abstract Location awareness is highly challenging for wireless sensor networks. When localizing nodes using GPS, it is observed that the network is not entirely localized, and the number of nodes that can be located within the network cannot be determined; node localizability testing cannot be achieved. A new scheme, based on Euclidean distance ranging techniques and a polynomial-time localizability-testing algorithm, is proposed for node localizability. It can identify the number of nodes that can be located in a connected network. When a node is localizable, it can be uniquely localized, and the path can be identified using vertex-disjoint paths. Node localizability provides useful guidelines for network deployment and other location-based services.

A LOSSLESS COMPRESSION SCHEME FOR BAYER COLOR FILTER ARRAY IMAGES USING GENETIC ALGORITHM
S.SARANYA Mr.G.MOHANBAABU Dr.G.ATHISHA PSNA College of Engg & Tech

Abstract A portable device such as a digital camera with a single sensor and a Bayer color filter array (CFA) requires demosaicing to reconstruct a full color image. In most digital cameras, Bayer CFA images are captured and demosaicing is generally carried out before compression. Recently, it was found that compression-first schemes outperform the conventional demosaicing-first schemes in terms of output image quality. A genetic-algorithm-based lossless compression scheme for Bayer CFA images is proposed in this paper. Simulation results show that the proposed compression scheme can achieve better compression performance than conventional lossless CFA image coding schemes.

HOMOMORPHIC AUTHENTICATION WITH DYNAMIC AUDIT FOR CATCHING THE MODIFICATIONS OF DATA IN MULTI CLOUD STORAGE

V.B.VINITHA Prathyusha Institute of Technology and Management

Abstract A multi cloud is a cloud computing environment in which an organization provides and manages some internal and external resources. Provable data possession (PDP) is an auditing technique for ensuring the integrity of data in storage outsourcing. However, early remote data audit schemes focused on static data, and the fact that users no longer have physical possession of the possibly large outsourced data makes data integrity protection a very challenging task. This paper proposes a homomorphic authentication with dynamic audit mechanism in multi clouds to support scalable service and data migration, in which multiple cloud service providers collaboratively store and maintain the clients' data. Security in the cloud is achieved by signing each data block before sending it to the cloud using the Boneh-Lynn-Shacham (BLS) algorithm, which is more secure compared to other algorithms. To ensure the correctness of data, we consider an external auditor, called a third party auditor (TPA), to verify, on behalf of the cloud user, the integrity of the data stored in the cloud. The audit service construction is based on fragment structure, random sampling and index-hash table techniques, supporting provable updates to outsourced data and timely anomaly detection. The security of this scheme is based on a multi-prover zero-knowledge proof system, which satisfies completeness, knowledge soundness, and zero-knowledge properties. The technique of bilinear aggregate signatures is used to achieve batch auditing, which reduces the computation overhead. Extensive security and performance analysis shows the proposed schemes are provably secure and highly efficient.

A SHORT-WAVE INFRARED NANOINJECTION IMAGER WITH 2500 A/W RESPONSIVITY AND LOW EXCESS NOISE
K.SRIJA V.SINDHU

Abstract We report on a novel nanoinjection-based short-wave infrared imager, which consists of InGaAs/GaAsSb/InAlAs/InP-based nanoinjection detectors with internal gain. The imager is 320×256 pixels with a 30 μm pixel pitch. The test pixels show responsivity values in excess of 2500 A/W, indicating generation of more than 2000 electrons/photon with high quantum
efficiency. This amplification is achieved at a complementary metal-oxide-semiconductor (CMOS) compatible, sub-volt bias. The measured excess noise factor F of the hybridized imager pixels is around 1.5 over the responsivity range of 1500 to 2000 A/W. The temperature behavior of the internal dark current of the imager pixels is also studied from 300 to 77 K. The presented results show, for the first time, that the nanoinjection mechanism can be implemented in imagers to provide detector-level internal amplification, while maintaining low noise levels and CMOS compatibility.

DATA HIDING IN MPEG VIDEO FILES USING BLOCK SHUFFLING APPROACH


GOPU

Abstract Data hiding involves two sets of data: the cover medium and the embedded data, which is called the message. Early video data hiding approaches applied still-image watermarking techniques, extended to video by hiding the message in each frame independently. This work deals with two approaches for data hiding. The first approach uses the quantization scale of constant-bit-rate video and second-order multivariate regression; however, the message payload is restricted to one bit per macroblock. The second approach uses Flexible Macroblock Ordering to allocate macroblocks to slice groups according to the content of the message. In existing work on compressed video, packets may be lost if the channel is unreliable. To improve robustness, this work proposes a block shuffling scheme to isolate erroneous blocks caused by packet loss, and applies data hiding to add additional protection for motion vectors. The existing solutions cause compression overhead, whereas the proposed solution reduces the effect of packet loss during transmission.
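The block shuffling idea can be sketched as a key-seeded permutation: a burst of lost packets then corrupts blocks scattered across the frame rather than a contiguous region, which is easier to conceal. The function names and the use of a PRNG seed as the shared key are assumptions for illustration, not the paper's actual scheme:

```python
import random

def shuffle_blocks(blocks, key):
    """Permute macroblocks with a key-seeded PRNG so that a burst of
    lost packets maps to isolated, scattered blocks at the receiver."""
    order = list(range(len(blocks)))
    random.Random(key).shuffle(order)   # deterministic for a given key
    return [blocks[i] for i in order], order

def unshuffle_blocks(shuffled, order):
    """Invert the permutation (the receiver can regenerate `order`
    from the shared key instead of transmitting it)."""
    out = [None] * len(shuffled)
    for pos, src in enumerate(order):
        out[src] = shuffled[pos]
    return out
```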

PERCOLATION THEORY BASED 2-D NETWORK CONNECTIVITY IN VANETS


SAKTHI LAKSHMI PRIYA C Anna University, Regional centre

Abstract
Vehicular Ad Hoc Networks (VANETs) are a special category of ad hoc network in which vehicles act as nodes. Due to their fast-moving nature, network connectivity is an important factor, because it can greatly affect the performance of VANETs. Percolation theory [7] can be used to analyze the connectivity of VANETs through theoretical deduction and to discover the quantitative relationship among network connectivity, vehicle density and transmission range. When vehicle density or transmission range becomes large enough, there is a jump in network connectivity. By knowing the vehicle density it is possible to calculate the minimum transmission range needed to achieve good network connectivity. As a large transmission range can cause serious collisions in wireless links, choosing a proper transmission range is a trade-off. Proper analysis of the transmission range can be useful in the real-world deployment of VANETs.
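The density/range relationship can be illustrated with a simplified one-dimensional highway model in which inter-vehicle gaps are exponentially distributed (Poisson traffic). This model, and the function names, are assumptions for illustration; the paper's percolation analysis is more general:

```python
import math

def connectivity(density, n_vehicles, r):
    """Probability a 1-D chain of n_vehicles is fully connected at
    transmission range r: each of the n-1 exponential gaps is <= r
    independently with probability 1 - exp(-density * r)."""
    return (1.0 - math.exp(-density * r)) ** (n_vehicles - 1)

def min_range(density, n_vehicles, p_connected):
    """Smallest r achieving connectivity probability p_connected,
    obtained by inverting the expression above per gap."""
    per_gap = p_connected ** (1.0 / (n_vehicles - 1))
    return -math.log(1.0 - per_gap) / density
```

The jump the abstract mentions shows up here too: `connectivity` rises steeply once `density * r` passes a critical region.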

INFORMATION LOSS REVELATION USING FAKE OBJECTS


SREEKUMAR K N Mahendra Institute of Engineering & Technology

Abstract A data distributor has given sensitive data to a set of supposedly trusted agents (third parties). Some of the data are leaked and found in an unauthorized place (e.g., on the web or on somebody's laptop). The distributor must assess the likelihood that the leaked data came from one or more agents, as opposed to having been independently gathered by other means. This paper proposes data allocation strategies (across the agents) that improve the probability of identifying leakages. These methods do not rely on alterations of the released data (e.g., watermarks). In some cases, we can also inject realistic but fake data records to further improve our chances of detecting leakage and identifying the guilty party.
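A grossly simplified guilt assessment, scoring each agent by how much of the leaked set appears in its allocation; fake objects given to exactly one agent are decisive, since no one else could have leaked them. The scoring rule and names are illustrative assumptions, not the paper's probabilistic model:

```python
def guilt_scores(allocations, leaked):
    """allocations: {agent: iterable of objects}; leaked: iterable.
    Returns, per agent, the fraction of leaked objects it was given."""
    leaked = set(leaked)
    return {agent: (len(leaked & set(objs)) / len(leaked) if leaked else 0.0)
            for agent, objs in allocations.items()}
```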


ADAPTIVE DEFENSE AGAINST VARIOUS ATTACKS IN DOS LIMITING NETWORK ARCHITECTURE


A.ANTON STENY, MRS.A.MANIAMMAL Kurinji College of Engineering & Technology

Abstract DoS attacks aim to exhaust scarce resources by generating illegitimate requests from one or more hosts, which affects the reliability of the Internet and threatens both routers and hosts. To avoid this, a new mechanism named Adaptive Selective Verification (ASV) is proposed: a distributed adaptive mechanism, based on selective verification, for thwarting attackers' efforts to deny service to legitimate clients. An empirical evaluation of the ASV protocol is performed with the aim of understanding its effectiveness.

A CRYPTOGRAPHIC APPROACH FOR EFFICIENT KEYWORD SEARCH SCHEME IN CLOUD COMPUTING


K.FATHIMA BUSHRA N. MARTINA P. USHADEVI Dr.Sivanthi Aditanar College of Engineering

Abstract A user stores his personal files in a cloud, and retrieves them wherever and whenever he wants. For the sake of protecting the user's data privacy and query privacy, a user should store his personal files in an encrypted form in a cloud, and then send queries in the form of encrypted keywords. However, a simple encryption scheme may not work well when a user wants to retrieve only files containing certain keywords using a thin client. First, the user needs to encrypt and decrypt files frequently, which depletes too much CPU capability and memory of the client. Second, the service provider couldn't determine which files contain the keywords specified by a user if the encryption is not searchable; therefore, it can only return all the encrypted files. A thin client generally has limited bandwidth, CPU and memory, and this may not be a feasible
solution under the circumstances. In this paper, we investigate the characteristics of cloud computing and propose an efficient privacy-preserving keyword search scheme for cloud computing. It allows a service provider to participate in partial decipherment to reduce the client's computational overhead, and enables the service provider to search the keywords on encrypted files, protecting the user's data privacy and query privacy efficiently. We prove that our scheme is semantically secure.
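The searchable-encryption idea can be reduced to a toy keyword index: the client derives a keyed token (trapdoor) per keyword, and the server matches tokens without ever seeing plaintext keywords. This HMAC-based sketch is a deliberate simplification and omits the paper's partial-decipherment protocol entirely:

```python
import hmac
import hashlib

def trapdoor(key, keyword):
    """Deterministic keyword token: the server can match tokens
    without learning the keyword itself."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).hexdigest()

def build_index(key, files):
    """files: {file_id: [keywords]} -> {token: [file_ids]}.
    Built client-side, then handed to the server."""
    index = {}
    for fid, words in files.items():
        for w in words:
            index.setdefault(trapdoor(key, w), []).append(fid)
    return index

def search(index, key, keyword):
    """Server-side lookup on tokens only; returns matching file ids."""
    return index.get(trapdoor(key, keyword), [])
```

Note the trade-off this exposes: deterministic tokens leak search patterns (repeated queries are visible), which is one reason real schemes are considerably more involved.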

DATA DREDGING FOR STREAMING DATA NUGGETS USING XQUERIES


A.T.SUMITHA, R. VAISHNAVI Sri Sairam Engineering College

Abstract The ever-growing Internet holds a very large amount of digital information in the form of semi-structured documents, and retrieving interesting data according to a user query is a Herculean task. Indeed, documents are often so large that the dataset returned as an answer to a query may be too huge to convey interpretable knowledge. In this work an approach is described based on Tree-based Association Rules (TARs), which provide approximate, intensional information on both the structure and the contents of XML documents, and can be stored in XML format as well. This mined knowledge is later used to provide: (i) a concise idea of both the structure and the content of the XML document and (ii) quick, approximate answers to queries. In this work we focus on the second feature. A prototype system and experimental results demonstrate the effectiveness of the approach.

DIGITAL IMPLEMENTATION OF MULTILAYER PERCEPTRON NETWORK FOR PATTERN RECOGNITION


A.THILAGAVATHY K.VIJAYA KANTH Srinivasan engineering college, Perambalur.

Abstract Artificial Neural Networks support their processing capabilities in a parallel architecture. It is
widely used in pattern recognition, system identification and control problems. A Multilayer Perceptron is an artificial neural network with one or more hidden layers. This paper presents the digital implementation of a multilayer perceptron network using an FPGA (Field Programmable Gate Array) for image recognition. The network was implemented using three types of nonlinear activation function: hardlims, satlins and tansig. The neural network was implemented using VHDL hardware description language code and an XC3S250E-PQ208 Xilinx FPGA device. The results obtained with the Xilinx Foundation 9.2i software are presented and analyzed in terms of device utilization and time delay.
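The arithmetic that the FPGA design implements (a weighted sum per neuron followed by one of the three named activations) can be sketched in software. This is a behavioral reference model only, not the VHDL implementation, and the function names are assumptions:

```python
import math

# The three activation functions named in the abstract
def hardlims(x):
    return 1.0 if x >= 0 else -1.0          # symmetric hard limit

def satlins(x):
    return max(-1.0, min(1.0, x))            # symmetric saturating linear

def tansig(x):
    return math.tanh(x)                      # hyperbolic tangent sigmoid

def layer(inputs, weights, biases, act):
    """One fully connected layer: y_j = act(sum_i w[j][i]*x[i] + b[j])."""
    return [act(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def mlp(inputs, layers):
    """layers: list of (weights, biases, activation) tuples."""
    for weights, biases, act in layers:
        inputs = layer(inputs, weights, biases, act)
    return inputs
```

A model like this is useful for generating golden vectors against which the fixed-point hardware results can be compared.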

COMPUTER AIDED DIAGNOSIS OF CANCER WITH MAMMOGRAM USING FUZZY CASE BASED REASONING
A.ATHIRAJA, Dr.P.VENKATA KRISHNAN, Dr.A.ASKARUNISHA Vickram College of Engineering,

Abstract This project diagnoses cancer based on microcalcifications (MCs) in mammography images. A pre-processing technique is applied to the mammogram, which is then converted into a training dataset. A perceptron algorithm is used to classify cells as MC-present or MC-absent; the collection of MC-present cells forms the region of interest (ROI). A Case Based Reasoning (CBR) classifier is used to classify MC-present cells into the following classes: initial, small, medium, high and very high. A prediction is then made as to whether the patient is affected by cancer; if so, antibiotics or an operation is suggested. The decision making uses a fuzzy set approach. This method provides better performance compared to previous diagnosis techniques.
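The fuzzy decision step can be illustrated with triangular membership functions mapping a severity measure onto the five named classes. The membership shapes, break points, and the choice of "number of MC clusters" as the input are invented for the sketch; the paper does not specify them:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership peaking at b over (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical severity classes over the number of detected MC clusters
CLASSES = {
    "initial":   lambda n: tri(n, -1, 0, 3),
    "small":     lambda n: tri(n, 1, 3, 6),
    "medium":    lambda n: tri(n, 4, 7, 10),
    "high":      lambda n: tri(n, 8, 12, 16),
    "very high": lambda n: tri(n, 14, 20, 1e9),
}

def classify(n_mc):
    """Defuzzify by picking the class with the highest membership."""
    return max(CLASSES, key=lambda c: CLASSES[c](n_mc))
```

Overlapping memberships are the point of the fuzzy approach: a case near a boundary contributes to two classes, and the CBR stage can use both degrees rather than a hard cut-off.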

OFFLINE HANDWRITTEN TAMIL CHARACTER RECOGNITION USING HMM MODEL



M.AYSHWARIYA, M.ANTONY ROBERT RAJ Alpha College of Engineering

Abstract Character recognition is an important area in the image processing and pattern recognition fields. Handwritten character recognition refers to the process of converting handwritten characters into Unicode characters. The recognition system can be either on-line or off-line. Offline Tamil handwritten character recognition is a difficult problem because of the high variability and ambiguity in the character shapes written by individuals. Many approaches have been proposed to solve this complex problem, but problems encountered by researchers still include long network training time, long recognition time and low recognition accuracy. The performance of a character recognition system depends on proper feature extraction and correct classifier selection. This paper proposes an approach for offline recognition of Tamil characters using their structural features. Structural features are based on topological and geometrical properties of the character, such as aspect ratio, cross points, loops, branch points, inflection between two points, horizontal curves at top or bottom, etc. These features are fed to a Hidden Markov Model (HMM) classifier for recognizing offline Tamil handwritten characters. A high degree of accuracy has been obtained with the implementation of this approach on a comprehensive database, and the precision of the results demonstrates its suitability for commercial usage. The proposed approach is crafted to enhance computational efficiency and improve recognition accuracy.
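Two of the structural features listed above, aspect ratio and cross points, can be computed directly from a binary character grid. The 4-neighbour definition of a cross/branch point is one common convention assumed here; the paper may use a different one:

```python
def aspect_ratio(grid):
    """Width/height of the bounding box of the set pixels."""
    rows = [r for r, line in enumerate(grid) if any(line)]
    cols = [c for c in range(len(grid[0])) if any(line[c] for line in grid)]
    return (cols[-1] - cols[0] + 1) / (rows[-1] - rows[0] + 1)

def cross_points(grid):
    """Set pixels with 3+ set 4-neighbours: candidate branch/cross points."""
    h, w = len(grid), len(grid[0])
    pts = []
    for y in range(h):
        for x in range(w):
            if not grid[y][x]:
                continue
            n = sum(grid[j][i]
                    for j, i in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= j < h and 0 <= i < w)
            if n >= 3:
                pts.append((y, x))
    return pts
```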

SLOW FEATURE ANALYSIS: A NOVEL APPROACH TO HUMAN ACTION RECOGNITION


L. BERWIN RUBIA K. MANIMALA Dr. Sivanthi Aditanar College of Engineering

Abstract Slow Feature Analysis (SFA) extracts slowly varying features from a quickly varying input signal. The SFA framework is introduced to the problem of human action recognition by incorporating supervised information into the original unsupervised SFA learning. Firstly, interest points are detected in local spatial and temporal regions, and local features are described with the SFA method. Each action sequence is represented by the Accumulated Squared Derivative (ASD), which is a statistical distribution of the slow features in an action sequence [1]. Descriptive statistical features are then extracted in order to reduce the dimension of the ASD feature. Finally, a one-against-all support vector machine (SVM) is trained to classify actions represented by the statistical features.
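SFA's notion of "slowness" is the mean squared temporal derivative of a normalized signal (often written Δ). A minimal sketch of that measure, without the whitening and eigen-decomposition of full SFA, and with assumed function names:

```python
def slowness(signal):
    """Delta value: mean squared temporal derivative of the signal
    after normalising it to zero mean and unit variance."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((v - mean) ** 2 for v in signal) / n
    z = [(v - mean) / var ** 0.5 for v in signal]
    return sum((z[t + 1] - z[t]) ** 2 for t in range(n - 1)) / (n - 1)

def slowest(features):
    """Pick the feature (by name) that varies most slowly over time."""
    return min(features, key=lambda k: slowness(features[k]))
```

A smooth ramp scores low (slow) while an alternating signal scores high (fast), which is exactly the ordering SFA optimizes for.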

ANALYSIS, FUTURE AND COMPARISON OF 4G TECHNOLOGY

DIVYA PRIYADHARSHINI M PREETHI S Anna University, Regional Centre

Abstract Until recently, the only means of communication were voice and text, and voice and SMS services were given top priority by telecom networks. But the Internet has provided many other services, like electronic file sharing, online gaming, e-commerce and access to any information by just googling, which appeal to people because these services are cost-effective and reduce the burden on the user. Making these services available on mobile devices has far more benefits and interesting possibilities. However, today's Internet, through cables and wireless, limits connectivity to a small region: a Local Area Network (LAN) or a Wireless Local Area Network (WLAN) hot spot, respectively. Adding advanced service support to today's voice-dominated telecom mobile networks is not an easy task either. Globally there is a perception that IP is the protocol that will enable new possibilities for the telecom sector in the future. This article discusses the features of 4G, the edge it provides once operational, its impact on India, barriers to the implementation of 4G, and recommendations to overcome these barriers.

A HYBRID NETWORK FOR AUTOMATIC GREENHOUSE MANAGEMENT


R.NARMATHA, C.K.NITHYA, G.RANJITHA, M.KALAIYARASI P.S.R.Rengasamy College of Engineering for women

Abstract A greenhouse is a building in which plants are grown in a closed environment, and greenhouse management is the control of several greenhouses. The wireless section is located in the indoor environment where great flexibility is needed, particularly in the production area of the greenhouse. The wired section, instead, is mainly used in the outside area as a control backbone to interconnect the greenhouses with the control room. An integrated wired/wireless solution uses the advantages of both technologies to improve performance. In the wired section, a controller area network (CAN) type network has been chosen on account of its simplicity, robustness, low cost, and good performance. For the wireless part, a ZigBee type network has been chosen. The SCADA (supervisory control and data acquisition) system monitors and controls data in a simple way; to maintain optimal environmental conditions, greenhouse management requires data acquisition using SCADA.

AUTOMATIC DETECTION OF FACIAL FEATURES USING HAAR CLASSIFIER


K.JEYASREE, V.RAJALEKSHMI Lord Jegannath College of Engineering & Technology

Abstract This paper proposes automatic detection of facial features in an image, which can be an important stage for various facial image manipulation tasks, such as face recognition, facial expression recognition, 3D face modeling and facial feature tracking. Region detection of facial features like the eyes, pupils, mouth, nose, nostrils, lip corners and eye corners across different facial images, neutral region selection and illumination is a challenging task. In this paper, we present different methods for fully automatic region detection of facial features. An object detector is used along with Haar-like cascaded features in order to detect the face, eyes and nose. Novel techniques using basic concepts of facial geometry are proposed to locate the mouth, nose and eye positions. Estimating the detection region for features like the eyes, nose and mouth effectively enhances detection accuracy. An algorithm using the H-plane of the HSV color space is proposed for detecting the eye pupil from the detected eye region. The proposed algorithm is tested on 100 frontal face images with two different facial expressions (neutral and smiling).

KEY MANAGEMENT SCHEME FOR VANET BASED ON VECTOR GROUP


K.JEYASREE, V.RAJALEKSHMI Lord Jegannath College of Engineering & Technology

Abstract Vehicular Ad Hoc Networks can offer various services to the user. In this paper, a key management scheme based on vector groups is proposed for VANETs to overcome the high memory overhead and reduce the high computational time of the existing system. We propose a vector-based cryptosystem to achieve security in terms of privacy and authentication. Keywords: Key Management, Privacy, Authentication, Energy Consumption, Pairwise Key Establishment

PERFORMANCE ANALYSIS AND DESIGN OF ENERGY EFFICIENT ARITHMETIC ADDERS BY PTL TECHNOLOGY AND ITS APPLICATION
L.KRISHNAKUMARI K.MURUGAN National College of Engineering

Abstract Energy efficiency is one of the most required features for modern electronic systems designed for high-performance and portable applications. Based on this, a high-speed and low-power full-adder cell is designed with DPL and SR-CPL internal logic to reduce the power-delay product. The full adder is implemented with an alternative internal logic structure based on multiplexing in order to reduce power consumption and delay, and the designed DPL and SR-CPL full adders show a reduction in both. Post-layout simulations show that the proposed full adders give better energy efficiency and are more efficient than other logic implementations. The full-adder cells are implemented with an alternative internal logic structure based on the multiplexing of the Boolean functions XOR/XNOR and AND/OR, to obtain balanced delays in the SUM and CARRY outputs, respectively, and pass-transistor powerless/groundless logic styles, in order to reduce power consumption.
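The multiplexing structure described above (carry-in selecting between precomputed XOR/XNOR for SUM and AND/OR for CARRY) can be checked at the logic level with a behavioral model; this is the Boolean identity the cell exploits, not the transistor-level design:

```python
def full_adder(a, b, cin):
    """SUM and CARRY built as two 2:1 multiplexers selected by cin:
    SUM = cin ? XNOR(a,b) : XOR(a,b); CARRY = cin ? OR(a,b) : AND(a,b)."""
    xor_, xnor = a ^ b, 1 - (a ^ b)
    and_, or_ = a & b, a | b
    s = xnor if cin else xor_
    cout = or_ if cin else and_
    return s, cout
```

Precomputing both candidate functions and selecting with the late-arriving carry is what balances the SUM and CARRY delays.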

A DESIGN OF VIRUS DETECTION PROCESSOR FOR EMBEDDED NETWORK SECURITY


S.SHAMILI AND B.KARTHIGA

Abstract An Intrusion Detection System (IDS) has emerged as one of the most effective ways of furnishing security to those connected to the network, and the heart of a modern intrusion detection system is a pattern matching algorithm. A network security application needs the ability to perform pattern matching to protect against attacks like viruses and spam. Firewall solutions are not scalable and do not address the difficulty of antivirus scanning. The main aim of this work is to furnish a powerful systematic virus detection hardware solution with minimal memory for network security. Instead of placing the entire pattern set on a chip, a two-phase antivirus processor works by condensing as much of the important filtering information as possible onto the chip. The proposed system thus concentrates on reducing the memory gap in on-chip memory.
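The two-phase idea can be sketched as a compact first-stage filter (small hash tags of pattern prefixes, standing in for the on-chip memory) followed by exact matching only at flagged positions (the off-chip phase). The prefix length, tag size, and class shape are invented for illustration:

```python
import hashlib

class TwoPhaseScanner:
    """Phase 1 keeps only a compact filter (16-bit tags of 4-byte
    pattern prefixes); phase 2 runs exact matching, and only for
    positions the filter flags."""
    PREFIX = 4

    def __init__(self, patterns):
        self.patterns = [p for p in patterns if len(p) >= self.PREFIX]
        self.filter = {self._tag(p[:self.PREFIX]) for p in self.patterns}

    @staticmethod
    def _tag(s):
        return hashlib.sha256(s.encode()).digest()[:2]  # 16-bit tag

    def scan(self, data):
        hits = []
        for i in range(len(data) - self.PREFIX + 1):
            if self._tag(data[i:i + self.PREFIX]) in self.filter:  # cheap test
                for p in self.patterns:                            # exact phase
                    if data.startswith(p, i):
                        hits.append((i, p))
        return hits
```

Tag collisions only cost an extra exact check, never a missed or false detection, which is why the on-chip structure can be so small.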

A NOVEL HIERARCHICAL ACCESS CONTROL APPROACH IN CLOUD COMPUTING


MR. V. VENKATESA KUMAR, M. MUTHULAKSHMI Anna University Regional Centre,

Abstract Cloud computing users store their data in storage areas provided by service providers. To achieve security in storage devices, HASBE (Hierarchical Attribute Set Based Encryption), derived from CP-ABE, uses a hierarchical structure of cloud users. The data owner can concurrently obtain encrypted data and decryption keys, and users cannot access files without authorization. When a user is revoked, the data must be re-encrypted and re-uploaded to the cloud, and this has to be done by the data owner itself, increasing the computation and bandwidth cost. The HASBE scheme proposes a new scalable hierarchical attribute access control by introducing a key server, which provides both efficient access control for outsourced data and encryption/decryption keys. Keys must be generated for users when revocation takes place, but a generated key may be useless if the user is revoked after the key generation process, increasing key generation time. The HASBE scheme therefore also proposes a new scheme for balancing key generation and user revocation, providing an effective way of generating keys and a highly secure and effective way of accessing data in a cloud environment.

IMPLEMENTATION OF TURBO CODED WIRELESS SYSTEM AND STBC BASED SPATIAL DIVERSITY FOR WSN
N.NAVEEN KUMAR MR.P.SAMPATH KUMAR Varuvan Vadivelan institute of technology

Abstract Uncoded systems have been discussed and their energy efficiency calculated in previous works. Here, an energy-efficient virtual multiple-input multiple-output (MIMO) communication architecture based on a turbo coder is proposed for energy-constrained, distributed wireless sensor networks. As sensor nodes are generally battery-powered devices, the critical issue is how to reduce the energy consumption of nodes so that the network lifetime can be extended to reasonable times. The efficiency of space-time block code (STBC) encoded cooperative transmission is studied. Energy consumption differs for coded and uncoded systems; although STBC is discussed, a channel encoding scheme consumes more power while the system is operating. Energy efficiency is analyzed as a trade-off between the reduced transmission energy consumption and the increased electronic and overhead energy consumption. Simulations are expected to show that, with proper design, cooperative transmission can enhance energy efficiency and prolong sensor network lifetime. The BER performance is also analyzed under various SNR conditions, and simulation results are included. Since a turbo coder and decoder are used for this coded system, the BER is expected to approach zero at an SNR as low as 3 dB.

CLOUD COMPUTING
S.SANTHOS KUMAR NIRMAL KUMAR N Gnanamani College of Technology

Abstract Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. The concept of cloud computing fills a perpetual need of IT: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing new software. Cloud computing encompasses any subscription-based or pay-per-use service that, in real time over the Internet, extends IT's existing capabilities. Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. Parallels to this concept can be drawn with the electricity grid, wherein end-users consume power without needing to understand the component devices or infrastructure required to provide the service.

THEFT IDENTIFICATION DURING DATA TRANSFER IN IT SECTOR
PREETHI

Abstract Theft identification during data transfer arises when a data distributor has given sensitive data to trusted agents and some of the data is leaked and found in an unauthorized place. The system can use data allocation strategies, or can inject "realistic but fake" data records, to improve identification of the leaked data and of who leaked it. The fake objects look exactly like original data, so the agents cannot distinguish them. Much of an organization's data is leaked through e-mails, and data leaked in this way can be detected and identified through the fake objects. The e-random and s-random algorithms are used to minimize overlap as well as to detect the guilty agent. The leaked data from the organization can be sent to the third parties in the form of cipher text.

A DIFFERENTIATED QUALITY OF SERVICE BASED SCHEDULING ALGORITHM FOR REAL TIME TRAFFIC

AFSAN FAZIHA R., ABINAYA DEEPIKA R., DEEPALAKSHMI R. Velammal College of Engineering

Abstract Allocating resources to increase QoS (quality of service) is a very important factor in any network carrying various types of real-time traffic, and real-time applications benefit most from QoS adaptation. Various scheduling disciplines are employed at the router to guarantee the QoS of the network. DiffServ (Differentiated Services) is an IP-based QoS support framework that differentiates between different classes of traffic. The function of the core router of the network is to forward packets according to the per-hop behavior associated with the DSCP (Differentiated Services Code Point) value. We therefore propose the Quality of Service Model Scheduling Algorithm (QSMA) and a random walk protocol as an effective scheme to maintain QoS parameters such as packet loss, packet delay and bandwidth, providing absolute differentiated services for real-time applications.
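Per-hop behavior keyed on the DSCP class can be illustrated with a strict-priority scheduler; the class-to-priority mapping below uses standard DiffServ class names but the ordering and the scheduler shape are illustrative assumptions, not the QSMA algorithm itself:

```python
import heapq
import itertools

# Hypothetical mapping from DSCP class to priority (lower = served first)
PRIORITY = {"EF": 0, "AF41": 1, "AF21": 2, "BE": 3}

class DiffServScheduler:
    """Strict-priority queue keyed on the packet's DSCP class;
    a running counter preserves FIFO order within each class."""
    def __init__(self):
        self._q = []
        self._seq = itertools.count()

    def enqueue(self, dscp, packet):
        heapq.heappush(self._q, (PRIORITY[dscp], next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._q)[2]
```

Strict priority is the simplest per-hop behavior; production schedulers usually add weighted sharing so best-effort traffic is not starved.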

AN EFFICIENT WAY OF SIMILARITY SEARCH DATA PRESERVATION USING ASCENT PLUNGE METHOD
S.SELVABHARATHI, S.SARANYA, K.SELVASHEELA, N.P.RAJESWARI Veerammal Engineering College

Abstract Our project presents a preliminary formulation of the ascent plunge method with data seclusion (privacy) preservation. Two approaches are considered: a stochastic approach and a least-squares approach. Both methods securely compute blocks for horizontally partitioned and vertically partitioned data; in both partitionings, the parties hold the same objects over the same set of attributes. The mined attribute is confined securely and can be accessed with a key generated by the DSA algorithm. Data access uses the linear regression method: secure matrix multiplication is used to compute the linear regression and classification, performing the operation securely and finding a user's data set without accessing the private data. This method is used for securely performing the ascent plunge method over vertically partitioned data.

DUAL PROTECTING MECHANISM FOR MULTITIER WEB APPLICATION

SHANOFER SHAJAHAN

Abstract A web application is an application that is accessed over a network such as the Internet. Web applications are increasingly used for critical services. To cope with increasing demand and data complexity, web applications have moved to a multitier design; they have thus become a popular and valuable target for security attacks. These attacks have recently become more diverse, and attackers' attention has shifted from attacking the front end to exploiting vulnerabilities of web applications in order to corrupt the back-end database system. To penetrate their targets, attackers may exploit well-known service vulnerabilities. To protect multitier web applications, several intrusion detection systems (IDSs) have been proposed. By monitoring both web requests and the subsequent database requests, we are able to ferret out attacks that an independent IDS would not be able to identify. An IDS is used to detect potential violations of database security; in every database, some attributes are considered more sensitive to malicious modification than others.

A NOVEL CHANNEL ADAPTIVE ROUTING WITH HANDOVER IN MANETs


A.SIVAGAMI, M.RENUKADEVI P.S.R.Rengasamy College of Engineering For Women

Abstract Radio link fluctuation makes packet transmission difficult in mobile ad hoc networks. To overcome this, we propose a new protocol, called the novel channel adaptive routing protocol, which mitigates the effects of channel fading. The proposed protocol selects stable links for route discovery using the average non-fading duration technique, while a handoff strategy maintains reliable connections. The protocol provides a dual attack for avoiding unnecessary route discoveries: predicting path failure leading to handoff, and then bringing paths back into play when they are again available, rather than simply discarding them at the first sign of a fade.
Keywords- Mobile ad hoc networks, Average non-fading duration, Routing protocols, Channel adaptive routing.

A RANKING BASED APPROACH FOR HANDLING SENSITIVE DATA


TANIA VERONICA SEBASTIAN J. RAJA Annai Mathammal Sheela Engineering College

Abstract An organization undoubtedly wants to preserve and retain the data stored in its computers. On the other hand, this data is necessary for daily work processes. Users within the organization's perimeter (e.g., employees, subcontractors, or partners) perform various actions on this data (e.g., query, report, and search) and may be exposed to sensitive information embodied within the data they access. In an effort to determine the extent of damage that a user could cause to an organization using the information obtained, a ranking-based approach to security alerts for handling sensitive organizational data is introduced. The score measure is tailored for tabular data sets (e.g., result sets of relational database queries) and cannot be applied to non-tabular data such as intellectual property, business plans, etc. By assigning a score that represents the sensitivity level of the data a user is exposed to, the weight can determine the extent of damage to the organization if the data is misused. Using this information, the organization can then take appropriate steps to prevent or minimize the damage.
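A score of this kind can be illustrated with a small sketch: each exposed column carries a sensitivity weight, and the score grows with the number of result rows. The attribute names, weights, and the additive formula are illustrative assumptions, not the paper's exact measure.

```python
# Sketch: a misuseability-style sensitivity score for a tabular query
# result set. Column weights and the additive formula are hypothetical.
SENSITIVITY = {"name": 0.1, "salary": 0.8, "diagnosis": 1.0}

def score(result_rows, columns):
    """Score grows with the sensitivity of the exposed columns and the
    number of rows the user was exposed to."""
    per_row = sum(SENSITIVITY.get(c, 0.0) for c in columns)
    return per_row * len(result_rows)

rows = [("alice", 50000), ("bob", 62000)]
print(score(rows, ["name", "salary"]))
```

A real measure would also account for data distinguishability and the user's role, but the shape — per-attribute weights aggregated over the result set — is the same.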

A NOVEL ON-DEMAND CO-OPERATIVE OPPORTUNISTIC ROUTING SCHEME FOR CLUSTER BASED MANET
ABINAYA DEEPIKA.R, AFSAN FAZIHA.R, DEEPALAKSHMI.R.Velammal College of Engg and Tech

Abstract Mobile networks have received a great deal of attention during the last few decades due to their potential applications, large scale, improved flexibility, and reduced costs. Variation in link quality and routing assignment are the major problems in communication networks. The proposed work addresses two problems associated with mobile networks: reducing overhead between the nodes, and energy-balanced routing of packets through co-operative opportunistic routing for cluster-based communication. We propose a modified algorithm that uses On-Demand Opportunistic Group Mobility Based Clustering (ODOGMBC) for forming clusters and predicting cluster mobility with a neighborhood update algorithm. Cluster formation involves the election of a mobile node as cluster head; each cluster comprises a cluster head and non-cluster-head nodes that form the cluster dynamically. Each node in the network continuously discovers its neighbours by communicating with them, and nodes keep consistent, updated routing information in the route cache via the neighborhood update algorithm. During routing, a packet forwarded by the source node is updated by intermediate forwarders if the topology changes. This opportunistic routing scheme provides responsive data transport and effective node management, even in heavily loaded environments. Thus, the proposed routing technique reduces overhead, increases efficiency, and gives better control over path selection.

ANDROID MOBILES TO STOP THE PASSWORD STEALING ATTACKS


P. PREETHY JEMIMA , AND MR. S. MUTHUKUMARASAMY S.A.Engineering College

Abstract Text passwords have been adopted as the primary means of user authentication on websites. Humans are not experts in memorizing them, and therefore tend to rely on weak passwords. Because these passwords are static, an adversary can launch attacks to steal them, and they suffer from several security drawbacks: phishing, keyloggers and malware. This problem can be overcome by a protocol named oPass, which leverages a user's cellphone and SMS messages to thwart password stealing. oPass largely avoids man-in-the-middle attacks. Even if users lose their cellphones, oPass still works once the SIM card and long-term password are reissued. It is an efficient user authentication protocol available at affordable cost.
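The core idea — deriving a fresh, unguessable code from a long-term secret on the phone — can be sketched with an HMAC-based one-time password of the kind a phone-assisted scheme could deliver over SMS. This is an illustrative sketch only; oPass's actual protocol messages are not reproduced here.

```python
# Sketch: an HMAC-based one-time password (HOTP-style, cf. RFC 4226) of the
# kind a phone-assisted scheme like oPass could send via SMS. Illustrative;
# not the oPass protocol itself.
import hmac
import hashlib

def one_time_password(long_term_secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a short numeric code from a long-term secret and a counter."""
    msg = counter.to_bytes(8, "big")
    digest = hmac.new(long_term_secret, msg, hashlib.sha256).digest()
    code = int.from_bytes(digest[:4], "big") % 10 ** digits
    return f"{code:0{digits}d}"

otp = one_time_password(b"shared-secret", counter=1)
print(len(otp))  # 6 — a fresh code per counter, so a stolen code cannot be replayed
```

Because each code depends on an incrementing counter, a keylogger that captures one code gains nothing for the next login.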

RIHT: A NOVEL HYBRID IP TRACEBACK SCHEME


ATHIRA JAYAN , MALU FATHIMA AFSAR National College Of Engineering

Abstract Because the Internet has been widely applied in various fields, more and more network security issues emerge and catch people's attention. However, adversaries often hide themselves by spoofing their own IP addresses before launching attacks. For this reason, researchers have proposed many traceback schemes to trace the source of such attacks. Some use only one packet in their packet logging schemes to achieve IP tracking; others combine packet marking with packet logging and thereby create hybrid IP traceback schemes that demand less storage but require a longer search. In this paper, we propose a new hybrid IP traceback scheme with efficient packet logging, aiming at a fixed storage requirement for each router (under 320 KB, according to CAIDA's Skitter data set) without the need to refresh the logged tracking information, and achieving zero false positive and false negative rates in attack-path reconstruction. In addition, we use a packet's marking field to censor attack traffic on its upstream routers. Lastly, we simulate and analyze our scheme in comparison with other related research in the following aspects: storage requirement, computation, and accuracy.

EFFECTIVE ADAPTIVE PREDICTION SCHEME FOR WORKLOADS IN GRID ENVIRONMENT
MOHAMED AFFIR. A, VIJAYA KARTHIK.P, VASUDEVAN. Kalasalingam University

Abstract It is easy to predict workload when a task is not complex, but it is difficult to predict grid performance for a complex task, because heterogeneous resource nodes are involved in a distributed environment. The time-consuming execution of a workload on a grid is even harder to predict due to heavy load fluctuations. In this paper, we use a polynomial fitting method for CPU workload prediction. Errors that occur while predicting the grid workload are denoted prediction errors; they are minimized using a technique called EBAF (Estimation Based Adaptive Filter). A resource window is generated and a mean-square-error analysis is performed, where the error is the difference between the true value and the predicted value. Finally, benchmark techniques are applied to evaluate the performance of the grid.
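Polynomial fitting over a sliding window of recent load samples can be sketched as follows. For brevity a first-order (linear-trend) fit is shown; the paper's method may use a higher degree, and the EBAF error filter is not modelled here.

```python
# Sketch: workload prediction by fitting a polynomial trend to a sliding
# window of recent CPU-load samples and extrapolating one step ahead.
# First-order fit shown for brevity; EBAF error filtering not included.

def predict_next(window):
    """Fit load = a*t + b over t = 0..n-1, then extrapolate to t = n."""
    n = len(window)
    ts = range(n)
    mt = sum(ts) / n
    ml = sum(window) / n
    a = sum((t - mt) * (l - ml) for t, l in zip(ts, window)) \
        / sum((t - mt) ** 2 for t in ts)
    b = ml - a * mt
    return a * n + b

loads = [0.40, 0.45, 0.50, 0.55]        # steadily rising CPU load
print(round(predict_next(loads), 2))    # 0.6
```

The prediction error (true minus predicted value) from each step is what an adaptive filter like EBAF would feed back to correct the next estimate.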

ASSURING DATA AVAILABILITY ALL TIME IN CLOUD USING ERASURE CODE TECHNIQUE
A. FLORENCE, M. DHANALAKSHMI, V.VENU MOHAN KUMAR Saveetha School of Engineering

Abstract Cloud computing is the long-dreamed vision of computing as a utility, where data owners can remotely store their data in the cloud and enjoy on-demand high-quality applications and services from a shared pool of configurable computing resources. In this paper, we focus on the security of cloud data storage through an effective and flexible distributed storage verification scheme that ensures the correctness and availability of users' data in the cloud at all times. The proposed design lets the data owner encrypt the data before it reaches the cloud server, and guarantees the simultaneous identification of misbehaving servers under Byzantine failure, malicious data modification attacks, and even server colluding attacks. By implementing an erasure code technique, data can be recovered from the above failures, achieving data availability in the cloud at all times. Storing data in a third party's cloud system raises serious concern about data confidentiality, so this project also lets users safely delegate integrity checking tasks to third-party auditors (TPAs). The proposed design further supports secure and efficient dynamic operations on outsourced data, including block modification, deletion, and append. This project thus ensures proper double-time data security.

AN EFFECTIVE METHOD TO COMPOSE RULES USING RULE ONTOLOGY IN REPEATED RULE ACQUISITION FROM SIMILAR WEB SITES
A.L.ANUSHYA M.U.ABDUL BASITH S.ARUL K.SANGEETHA SNS College of Technology

Abstract The semantic content of web pages is used to extract rules from similar pages of the same domain. Rule acquisition is used to acquire rules from web pages that contain unstructured text, reusing similar rules from other sites in the same domain rather than extracting rules from each page from scratch. We propose an automatic rule acquisition procedure using a rule ontology, RuleToOnto, which represents information about rule components and their structures. The procedure consists of a rule component identification step and a rule composition step. With the rule component identification complete, we use a genetic algorithm for the rule composition, and our experiments demonstrate that this ontology-based rule acquisition approach works on a real-world application.

FUZZY LOGIC BASED SCALABLE PACKET CLASSIFICATION ON FPGA


SILPA.L., MRS.NAGESWARI., The Rajaas Engineering College

Abstract Multi-field packet classification has evolved from traditional fixed 5-tuple matching to flexible matching with arbitrary combinations of numerous packet header fields. For example, the recently proposed OpenFlow switching requires classifying each packet using up to 12 packet header fields. It has become a great challenge to develop scalable solutions for next-generation packet classification that support higher throughput, larger rule sets and more packet header fields. The general packet classification problem has received a great deal of attention over the last decade: the ability to classify packets into flows based on their headers is important for security, virtual private networks and packet filtering applications. In this project we propose a new approach to packet classification based on fuzzy-logic decision trees, focusing only on the problem of identifying the class to which a packet belongs. We present a fuzzy-decision-tree-based linear multi-pipeline architecture on FPGAs for wire-speed multi-field packet classification. A new form of fuzzy decision tree, called the soft decision tree, is used; this method combines tree growing and pruning, to determine the structure of the soft decision tree, with refitting and back-fitting to improve its generalization capabilities. We consider next-generation packet classification problems where more than the 5-tuple of packet header fields is classified, and propose several optimization techniques to reduce the memory requirement of the state-of-the-art decision-tree-based packet classification algorithm. When matching multiple fields simultaneously, it is difficult to achieve both a high classification rate and modest storage in the worst case. Our soft-decision-tree-based scheme can be considered among the algorithms with the highest throughput and efficiency.
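The distinguishing feature of a soft decision tree is that a node does not apply a hard threshold test: a sample is routed to both children with smooth weights, and the node output blends the two branches. A minimal single-node sketch, with an illustrative sigmoid gate (the threshold and temperature values are assumptions, not from the paper):

```python
# Sketch: one node of a "soft" decision tree. The sample is routed to both
# children with sigmoid weights; the output blends the children's values.
import math

def soft_split(x, threshold, temperature, left_value, right_value):
    """Weight of the right branch rises smoothly as x exceeds the threshold."""
    w_right = 1.0 / (1.0 + math.exp(-(x - threshold) / temperature))
    return (1.0 - w_right) * left_value + w_right * right_value

# Far from the threshold the node behaves like a hard test; near it, the
# output is a blend — the smoothness that pruning and refitting exploit.
print(round(soft_split(0.1, 0.5, 0.05, 0.0, 1.0), 3))  # 0.0
print(round(soft_split(0.9, 0.5, 0.05, 0.0, 1.0), 3))  # 1.0
```

Because the split is differentiable, refitting and back-fitting can adjust leaf values and thresholds jointly, which is impractical with hard splits.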

ENHANCED CLOUD STORAGE SECURITY WITH AVP MECHANISM


ANANDH.A Saveetha Engineering college

Abstract Cloud storage is online storage where data is stored in virtualized pools hosted by third parties. However, such storage may not be fully trustworthy, which poses many security challenges. Access control, version control, and public auditing are taken into account to secure the data stored in the cloud. The proposed secured overlay cloud storage system provides fine-grained access using hierarchy-based access control: access to the cloud storage system is granted based on the user's group type. Version control is the framework that eliminates data redundancy and provides version backup in the cloud storage. On top of the version control design, a layered approach of cryptographic protection is added to enhance data security, and version control is employed in the cloud storage through an appropriate storage mechanism. Finally, public auditing is enforced to maintain the activity log and analyse the data accessed by users; thus the data stored in the cloud is audited, and any modification to the data is reported to the administrator.

DETECTING SESSION HIJACKS IN WIRELESS NETWORKS
BANU PRIYA.E , MOHAMMAD MALIK MUBEEN.S. National College of Engineering

Abstract Among the variety of threats and risks that wireless LANs face, session hijacking attacks are common and serious. In a session hijacking attack, an attacker forces a normal user to terminate its connection to an access point (AP) by first masquerading as the AP's MAC address. The attacker then associates with the AP by masquerading as the user's MAC address and takes over its session. Current techniques for detecting session hijacking attacks are mainly based on spoofable and predictable parameters, such as sequence numbers, which can be guessed by attackers. To enhance the reliability of intrusion detection systems, mechanisms that utilize unspoofable PHY-layer characteristics are needed. We show that, using a wavelet transform (WT), the coloured noise with complex power spectral density (PSD) in our case can be approximately whitened. Since a larger signal-to-noise ratio (SNR) increases the detection rate and decreases the false alarm rate, the SNR is maximized by analyzing the signal at specific frequency ranges.
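The wavelet decomposition underlying the whitening step can be illustrated with its simplest instance, a single level of the Haar transform, which splits a signal into a slowly varying approximation band and a detail band. A real detector would use deeper decompositions and a wavelet matched to the noise PSD; this is only a sketch of the mechanism.

```python
# Sketch: one level of a Haar wavelet transform — the simplest wavelet
# decomposition, splitting a signal into approximation and detail bands.
import math

def haar_level(signal):
    """Split an even-length signal into (approximation, detail) bands."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

approx, detail = haar_level([4.0, 4.0, 2.0, 0.0])
print(approx, detail)  # trend per band vs. local fluctuation per band
```

Analyzing (and thresholding) each band separately is what lets the detector concentrate on the frequency ranges where the SNR is largest.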

ENERGY CONSUMPTION AND LOCALITY OF SENSOR NETWORKS


ARAVINTH.S Mr. RAMALINGAM SAKTHIVELAN N.M.K. Shri Krishna Engineering College

Abstract Wireless sensor networks (WSNs) are used in many areas for critical infrastructure monitoring and information collection. In WSNs, source-location privacy (SLP) service is further complicated by the fact that sensor nodes generally consist of low-cost, low-power radio devices, so computationally intensive cryptographic algorithms (such as public-key cryptosystems) and large-scale broadcasting-based protocols may not be suitable. We propose criteria to quantitatively measure source-location information leakage in routing-based SLP protection schemes for WSNs, and through this model identify the vulnerabilities of existing SLP protection schemes. We then propose a scheme that provides SLP through routing to a randomly selected intermediate node (RSIN) and a network mixing ring (NMR). The security analysis, based on the proposed criteria, shows that the proposed scheme can provide excellent SLP: messages are sent securely, and adversaries can neither identify the source location nor interfere with the messages, owing to the secure algorithms. Comprehensive simulation results demonstrate that the proposed scheme is very efficient, achieves a high message delivery ratio, and can be used in many practical applications.

PACKET DATA REDUNDANCY ELIMINATION IN DATA AGGREGATION
C.HANNAH JASMINE S.SIVARANJANI Kalasalingam University

Abstract A wireless sensor network consists of a sizable number of sensor nodes and a base station. Many nodes transmit similar information to the base station, so energy is wasted and the network lifetime is drained faster. In this paper we propose an energy-efficient heterogeneous cluster protocol with a Support Vector Machine (SVM) supporting data aggregation. It collects and compresses the data from the various end points, so minimal energy is spent, the network lifetime is prolonged, and the number of transmissions is minimized. The performance of the proposed technique is compared with the LEACH protocol; simulation results show that the proposed mechanism can efficiently remove data redundancy in a wireless sensor network.

EFFICIENT AND EFFECTIVE DATA MINING WITH BLOOMCAST AND RANKING OF DATA BY STEMMING ALGORITHM IN UNSTRUCTURED P2P NETWORKS
P.ARUNA .P.RANJAN School of Computing Sciences, Hindustan University.

Abstract Efficient and effective full-text retrieval and ranking in unstructured peer-to-peer networks remains a challenge in the research community, because it is difficult, if not impossible, for unstructured P2P systems to effectively locate items with guaranteed recall, and existing schemes that improve the search success rate often rely on replicating a large number of item replicas across the wide-area network, incurring large communication and storage costs. Due to the exact-match problem of DHTs and the federated search problem, such schemes provide poor full-text search capacity. We propose replication of Bloom filters (BFs) for efficient and effective data retrieval and ranking in unstructured P2P networks. Users' needs vary, so what is interesting for one may be completely irrelevant for another; the role of the ranking process is thus crucial: select the pages most likely to satisfy the user's needs and bring them into the top positions. Ranking of data is performed based on term frequency and keywords. By replicating the encoded term sets using BFs and stemmed words, instead of raw documents, among peers, the communication and storage costs are greatly reduced while full-text multi-keyword searching is still supported.

BEHAVIOURAL BASED SECURED USER AUTHENTICATION USING IMAGE CAPTCHA
S.DEEPAN K.SURESH KUMAR Saveetha Engineering College,

Abstract In recent days, web access has become more popular, and with more users come many threats. Web access is controlled by providing secured authentication (username and password). Remembering the password is the main challenge for a user accessing a webpage or email account. A major problem in security is that internet users have online accounts on many websites, systems, and devices that require them to remember passwords for identification. Because users can only remember a limited number of passwords, many simply forget them. To re-access their accounts, security questions provide a solution; but since remembering the password is the main challenge, it is also difficult for the user to give the appropriate answer. To overcome this problem, a new technique called the image CAPTCHA is proposed.

SUPERVISED CLUSTERING ALGORITHM FOR GENE DATA CLASSIFICATION


DR. S.SAKTHIVEL D.GOMATHI Anna University of Chennai

Abstract A microarray is an array of gene data; in turn, a gene datum corresponds to a cell. Microarray technology is an important biotechnological means that allows us to record the expression levels of thousands of genes simultaneously across a number of different samples. An important application of microarray gene expression data in functional genomics is classifying samples according to their gene expression profiles. The proposed work applies the mutual information criterion to evaluate a set of attributes and to select an informative subset to be used as input data for microarray classification. Within this large set of genes, only a few are effective for performing diagnostics optimally. To find this effective group of genes, we propose a Supervised Clustering Algorithm (SCA). Existing unsupervised clustering algorithms group genes according to mean and standard deviation measures and do not consider parameters such as mutual information or correlation. The proposed algorithm computes the similarity between attributes; this similarity measure is useful for reducing redundancy among the attributes. The original gene set is partitioned into subsets, or clusters, with respect to the similarity measure, and the single gene from each cluster with the highest gene-class relevance value is selected as the representative gene. The proposed supervised attribute clustering algorithm yields biologically significant gene clusters, and its performance compares favourably with existing algorithms both qualitatively and quantitatively.

INDIAN LICENSE PLATE RECOGNITION BASED ON OPTICAL CHARACTER RECOGNITION
R.DENNIS,DR.R.K.SELVAKUMAR Cape Institute of Technology

Abstract Indian license plate recognition based on optical character recognition (ILPROCR) plays an important role in numerous applications, and a number of techniques have been proposed for it. The approach comprises stages of pre-processing, license plate detection, extraction of characters and numbers from the detected plate, license plate segmentation, and character recognition. In the experiments, license plates of all types, captured at different times of day and under various weather conditions, were used. This paper provides a character recognizer for identifying the characters in the license plate.

ANALYSIS AND COMPENSATION FOR NONLINEAR INTERFERENCE OF MULTICARRIER MODULATION OVER SATELLITE LINK
A.IDA BERYL J.BEMINA Anand Institute of Higher Technology

Abstract We present an analytical characterization of the nonlinear interference that results when more than one high-order-modulation carrier passes through the same nonlinear transponder high-power amplifier (HPA). A Volterra filter is proposed that is novel in its implementation of this analytical characterization and in its modeling of intersymbol interference (ISI) and adjacent channel interference (ACI). The focus is on adaptive algorithms with pilot-based training, so that the solutions are completely blind to unknown transponder HPA characteristics and can rapidly respond to varying operating back-off levels. Furthermore, two families of adaptive solutions are provided to compensate for nonlinear ISI and ACI: the first performs adaptive channel inversion and then applies equalization; the second performs adaptive channel identification and then applies cancellation. The effectiveness of the proposed analysis and techniques is demonstrated via extensive simulations for high-order QAM and APSK modulations. Coded performance with selected LDPC codes designed for the DVB-S2 standard is also included. Finally, computational complexity is assessed, and the performance impact is quantified when complexity is reduced by decreasing the number of Volterra coefficients.

CLOUD INFORMATION ACCOUNTABILITY FRAMEWORK


SELVAMANJU.E,. MS.S.AGNES JOSHY, Francis Xavier Engineering College,

Abstract Cloud computing is a large-scale distributed storage system offering end users resources and highly scalable services. In cloud services, users' data are usually stored in remote locations that users do not operate directly, so users fear losing their own data. A highly decentralized information accountability framework is therefore used to continuously monitor the users' data in the cloud and to provide both logging and auditing mechanisms for users' data and users' control. Access to the users' data triggers authentication and automated logging through the programmable capabilities of JARs (Java ARchives): JAR files automatically log the usage of the users' data by any entity in the cloud. In addition, the approach can handle personally identifiable information and provides a security and reliability analysis. It is essential to provide such an effective mechanism.

RISK FACTOR ASSESSMENT FOR HEART DISEASE USING DATA MINING TECHNIQUES
I.S.JENZI, P.PRIYANKA, DR.P.ALLI Velammal College of Engineering and Technology

Abstract Cardiovascular disease is a major threat to half of the world's population. The term heart disease covers all the diverse diseases affecting the heart. The healthcare industry generates large amounts of data that are too difficult to analyze by traditional methods, which shows the significance of computer-assisted methods for making correct decisions. The objective of this paper is to develop a heart disease prediction system using data mining techniques, helping to identify useful patterns in the medical data for quality decision making. Association rules and classification techniques such as decision trees, naive Bayes and neural networks are used in the literature for disease prediction. This work concentrates on building a classifier model using a decision tree for predicting heart disease. The system is implemented on the .NET platform, and the popular data mining tool WEKA is also used. The results obtained from the classifier establish significant patterns and relationships between the medical factors relating to heart disease.
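Decision-tree induction of the kind used here chooses, at each node, the medical attribute with the greatest entropy reduction (information gain). A minimal sketch with hypothetical toy records (the attribute names and values are illustrative, not from the paper's data set):

```python
# Sketch: the entropy-based attribute scoring underlying decision-tree
# induction for a disease classifier. Records below are hypothetical.
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr_index, labels):
    """Entropy reduction from splitting the rows on the given attribute."""
    total = entropy(labels)
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(sub) / len(labels) * entropy(sub)
                    for sub in by_value.values())
    return total - remainder

# Toy records: (chest_pain, high_bp) -> heart-disease label
rows = [("yes", "yes"), ("yes", "no"), ("no", "yes"), ("no", "no")]
labels = ["sick", "sick", "healthy", "healthy"]
print(information_gain(rows, 0, labels))  # 1.0 — chest_pain splits perfectly
```

Tools such as WEKA's decision-tree learners apply exactly this kind of scoring (or the closely related gain ratio) recursively to build the classifier.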

A NOVEL CORRELATION PRESERVING INDEXING METHOD FOR DOCUMENT CLUSTERING IN CORRELATION SIMILARITY MEASURE SPACE
S.GEOFFRIN, MRS.J.C KANCHANA, KLN College of Engineering,

Abstract This paper presents a new spectral clustering method called correlation preserving indexing (CPI), which is performed in the correlation similarity measure space. In this framework, the documents are projected into a low-dimensional semantic space in which the correlations between the documents in the local patches are maximized while the correlations between the documents outside these patches are simultaneously minimized. Since the intrinsic geometrical structure of the document space is often embedded in the similarities between the documents, correlation as a similarity measure is more suitable than Euclidean distance for detecting this intrinsic geometrical structure. Consequently, the proposed CPI method can effectively discover the intrinsic structures embedded in a high-dimensional document space. The effectiveness of the new method is demonstrated by extensive experiments conducted on various data sets and by comparison with existing document clustering methods.
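The contrast between correlation and Euclidean distance can be made concrete with two toy term-frequency vectors: one document is a scaled version of the other, so correlation sees them as identical in "shape" even though their Euclidean distance is large. The vectors below are illustrative.

```python
# Sketch: Pearson correlation as the similarity measure CPI operates in,
# contrasted with Euclidean distance, for two toy term-frequency vectors.
import math

def correlation(u, v):
    """Pearson correlation between two equal-length vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = math.sqrt(sum((a - mu) ** 2 for a in u)
                    * sum((b - mv) ** 2 for b in v))
    return num / den

d1 = [1.0, 2.0, 3.0]     # short document
d2 = [10.0, 20.0, 30.0]  # same topic profile, ten times the length
print(correlation(d1, d2))  # 1.0 — identical "shape" despite the scale gap
```

This scale invariance is why correlation captures the intrinsic geometry of document space better than Euclidean distance, which would place d1 and d2 far apart.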

RFID AND ZIGBEE BASED MANUFACTURE MONITORING SYSTEM


R.JEGATHEESWARI , R.KANITHA Kalasalingam Institute of Technology

Abstract Automatic Identification Technologies (AIT) have revolutionized the way the world conducts commerce, but many people do not really understand what these technologies do or how AIT is changing our lives. Automatic identification technology comprises numerous technologies such as RFID, OCR, 2D barcodes, magnetic strips, smart cards, voice recognition, and biometrics; it holds the promise of collecting data about a movable asset in the physical world with 100 percent accuracy in real time. Traditional methods of monitoring production in enterprises, with humans on site, cannot meet expectations for efficiency, accuracy and cost as product lifecycles shorten continuously. Setting up an RFID- and ZigBee-based manufacturing monitoring system is a good approach to improving monitoring efficiency, and thereby management efficiency, in enterprises. Although some problems remain to be solved for RFID and ZigBee technologies, their unique features still make a monitoring system based on them promising for manufacturing enterprises. The architecture of the RFID- and ZigBee-based monitoring system is presented in this paper.

A MACHINE DOCTOR THAT DIAGNOSES OPHTHALMOLOGY PROBLEMS USING NEURAL NETWORKS
A.JENEFA., S RAJI., Francis Xavier Engineering college

Abstract Ophthalmology is the branch of medicine that deals with the eye and its problems. An expert system contains knowledge about particular diseases. A machine doctor is a system through which, without any human doctor, a machine can treat ophthalmology problems using an expert system. We are mainly dealing with ophthalmology problems such as myopia, hypermetropia, astigmatism, presbyopia, retinopathy and glaucoma. Our machine doctor provides both advice about diseases and information about diseases using the expert system and auto-refraction. The machine doctor takes input as queries, data, voice, etc., and provides output that can be taken either as a printed statement or on any electronic device. The neural network concept is used to process the input, using the back-propagation algorithm. The main aim is to give advice to rural people and to make our machine doctor user friendly and cost effective.
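The back-propagation algorithm mentioned above can be sketched in a few lines. This is a generic illustration, not the system's actual network: the two-element inputs and targets below are invented stand-ins for encoded auto-refraction readings.

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Invented stand-ins for encoded auto-refraction readings -> diagnosis (0/1)
data = [([0.9, 0.1], 1.0), ([0.8, 0.2], 1.0), ([0.1, 0.9], 0.0), ([0.2, 0.8], 0.0)]

random.seed(0)
w_h = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(2)]  # input->hidden
w_o = [random.uniform(-0.5, 0.5) for _ in range(2)]                      # hidden->output

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_h]
    return h, sigmoid(sum(w * hi for w, hi in zip(w_o, h)))

def loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

initial, lr = loss(), 0.5
for _ in range(2000):                 # stochastic gradient descent
    for x, t in data:
        h, y = forward(x)
        d_out = (y - t) * y * (1 - y)                 # output-layer delta
        for j in range(2):
            d_h = d_out * w_o[j] * h[j] * (1 - h[j])  # hidden delta (chain rule)
            for i in range(2):
                w_h[j][i] -= lr * d_h * x[i]
            w_o[j] -= lr * d_out * h[j]
final = loss()                         # squared error after training
```

Gradient descent drives the squared-error loss down on this toy separable dataset; a real diagnostic network would use far more inputs, hidden units and bias terms.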

EMPOWERED SERVICE DELEGATION WITH ATTRIBUTE ENCRYPTION FOR DISTRIBUTED CLOUD COMPUTING
M.JOTHIMANI, Nandha Engineering College

Abstract Cloud computing has emerged as one of the most influential paradigms in the IT industry. Because this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns over outsourced data. Several schemes employing attribute-based encryption (ABE) have been proposed for access control of outsourced data in cloud computing; however, most of them suffer from inflexibility in implementing complex access control policies. Allowing cloud service providers (CSPs), which are not in the same trusted domains as enterprise users, to take care of confidential data may raise potential security and privacy issues. To keep sensitive user data confidential against untrusted CSPs, a natural way is to apply cryptographic approaches, disclosing decryption keys only to authorized users. A scheme should also provide high performance, full delegation, and scalability, so as to best serve the needs of accessing data anytime and anywhere, delegating within enterprises, and achieving a dynamic set of users. HASBE employs multiple value assignments for the access expiration time to deal with user revocation more efficiently than existing schemes, and it can provide fine-grained access control and full delegation. Finally, based on the HASBE model, we propose a scalable revocation scheme that delegates most of the computing tasks in revocation to the CSP, in order to handle a dynamic set of users efficiently.

A NOVEL CORRELATION PRESERVING INDEXING METHOD FOR DOCUMENT CLUSTERING IN CORRELATION SIMILARITY MEASURE SPACE
S.GEOFFRIN, MRS.J.C.KANCHANA, KLN College of Engineering

Abstract This paper presents a new spectral clustering method called correlation preserving indexing (CPI), which is performed in the correlation similarity measure space. In this framework, the documents are projected into a low-dimensional semantic space in which the correlations between the documents in the local patches are maximized while the correlations between the documents outside these patches are minimized simultaneously. Since the intrinsic geometrical structure of the document space is often embedded in the similarities between the documents, correlation as a similarity measure is more suitable for detecting the intrinsic geometrical structure of the document space than Euclidean distance. Consequently, the proposed CPI method can effectively discover the intrinsic structures embedded in high-dimensional document space. The effectiveness of the new method is demonstrated by extensive experiments conducted on various data sets and by comparison with existing document clustering methods.

ROBUST FUZZY SCHEDULING IN OFDMA NETWORKS FOR RESOURCE ALLOCATION


S.BRENIJA STANLEY, DR.M.IRULAPPAN, Francis Xavier Engineering College

Abstract This paper deals with an opportunistic resource scheduling problem for the relay-based Orthogonal Frequency Division Multiple Access (OFDMA) cellular network, where relay stations (RSs) perform opportunistic network coding with downlink and uplink sessions of a mobile station (MS). To this end, we consider time-division duplexing (TDD), where each time-slot is divided into three phases according to the type of transmitter node, i.e., the base station (BS), MSs, and RSs. Moreover, to improve the flexibility of resource allocation, a dynamic TDD scheme is applied, in which the time duration of each phase in each time-slot can be adjusted. For opportunistic network coding, we introduce a novel model for network-coding-aware RSs with which an opportunistic network coding problem can be reduced to an opportunistic subchannel scheduling problem. Scheduling is provided by a fuzzy algorithm. This paper formulates an optimization problem that aims at maximizing the average weighted-sum rate for both downlink and uplink sessions of all MSs, in order to satisfy the quality-of-service (QoS) requirements of each MS. It develops a resource scheduling algorithm that optimally and opportunistically schedules subchannels, transmission power, network coding, power consumption and the time duration of each phase in each time-slot. Through numerical results, we measure how network coding and dynamic TDD each affect network performance in various network environments.

RTOS BASED MONITORING OF THE INDUSTRIAL ENVIRONMENT WITH AN EMBEDDED SYSTEM INTEGRATED IN A WSN
R.JAYAKUMAR, MRS.R.THENMOZHI, Ganadipathy Tulsi's Jain Engineering College

Abstract The system proposed in this paper aims to reduce the switching time delay and increase the number of motors monitored for torque and efficiency in an automated industrial environment, in real time, by employing wireless sensor networks (WSNs). An embedded system is employed for acquiring electrical signals from the motor in a noninvasive and invasive manner, and then performing local processing for torque and efficiency estimation. The values calculated by the embedded system are transmitted to a monitoring unit through an IEEE 802.15.4-based WSN. At the base unit, various motors can be monitored in real time. The RTOS VxWorks reduces the switching delay time between two tasks according to assigned priority; it is theoretically zero, which increases the monitoring time. IEEE 802.15.4 ZigBee works based on MAC addresses. When the speed and temperature exceed the threshold level, the ARM controller controls the motor and enables the buzzer. VxWorks provides a high-performance, scalable, reliable and high-throughput monitoring system. Keywords: VxWorks, efficiency estimation, embedded systems, induction motors, torque measurement, wireless sensor networks (WSNs).

TOP-K RESPONSES USING KEYWORD SEARCH OVER RELATIONAL DATABASES THROUGH TUPLE UNITS
MRS.G.JEYASRI, MRS.K.UMAMAHESWARI, University College of Engineering

Abstract Existing keyword search methods on databases usually find Steiner trees composed of connected database tuples as answers. They identify Steiner trees on the fly by discovering rich structural relationships between database tuples, without considering the fact that such structural relationships can be precomputed and indexed. Tuple units have been proposed to improve search efficiency by indexing structural relationships between tuples, and existing methods identify a single tuple unit to answer keyword queries. In many cases, however, multiple tuple units should be combined together to answer a keyword query; hence these methods produce false negatives. To handle this problem, we study how to integrate multiple related tuple units to effectively answer keyword queries. To achieve high performance, two novel indexes are used: a single-keyword-based structure-aware index and a keyword-pair-based structure-aware index. Structural relationships between different tuple units are incorporated into the indexes. Using the indexes, we can effectively identify the answers as integrated tuple units. New ranking techniques and algorithms are implemented to progressively find the top-k answers.
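The single-keyword index described above can be pictured with a toy inverted index over pre-joined tuple units. The unit texts, the query, and the scoring rule below are illustrative assumptions, not the paper's actual index structures:

```python
from collections import defaultdict

# Hypothetical tuple units: pre-joined groups of related tuples, keyed by id.
tuple_units = {
    1: "jim smith teaches databases",
    2: "ann lee studies databases",
    3: "jim smith advises ann lee",
}

# Single-keyword structure-aware index (simplified): keyword -> unit ids.
index = defaultdict(set)
for uid, text in tuple_units.items():
    for word in text.split():
        index[word].add(uid)

def answer(query, k=2):
    """Rank units by how many query keywords they contain; return top-k ids."""
    scores = {}
    for w in query.split():
        for uid in index.get(w, ()):
            scores[uid] = scores.get(uid, 0) + 1
    return sorted(scores, key=lambda u: (-scores[u], u))[:k]

top = answer("jim databases")  # unit 1 matches both keywords
```

A keyword-pair index would additionally precompute which unit combinations connect two keywords; here a simple count of matched keywords stands in for the paper's ranking techniques.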

IMAGE STEGANOGRAPHY USING ADAPTIVE LSB ENCODING


ANAND.G, ABINAYA.S, KARTHIGA, RPSRR College of Engineering for Women

Abstract Steganography is the art and science of hiding secret data within innocent cover data so that it can be securely transmitted over a network; it hides the very fact that communication takes place by secretly embedding the information. Many different carrier file formats are used, but digital images are popular because of their frequency on the internet. Hiding information within an image can be done with various steganography methods; some are more complex than others, and all of them have respective strong and weak points. This paper intends to give an overview of image steganography, its uses and techniques. We propose an edge-adaptive scheme that selects the sharper pixels along the edges for hiding the message; the advantage is that smooth regions are much less affected. The new technique adaptively selects the pixels at the edges (sharper regions) for hiding the secret data rather than selecting them randomly using a PRNG. For lower embedding rates, only sharper edge regions are used. When the embedding rate increases, more edge regions can be released adaptively for data hiding by adjusting a few parameters.

A TREE BASED MINING APPROACH FOR DETECTING INTERACTION PATTERNS
LINCY JANET.F, KARTHIKA.N, M.JANAKI MEENA, Velammal College of Engineering and Technology

Abstract Human interactions are defined as the social behavior or communicative actions taken by meeting participants related to the current topic. The various kinds of human interactions are proposing an idea, giving comments, acknowledgement, asking opinions, and giving positive or negative opinions. These interactions are essential to predict a user's role, attitude and intention towards the topic. This project focuses only on the task-oriented interactions that address task-related aspects. Mining human interactions is important for accessing and understanding meeting content quickly. This project proposes a mining method to extract frequent human interaction patterns. The interaction flow is represented as a tree; hence, tree-based mining algorithms, namely frequent interaction tree pattern mining and frequent interaction subtree pattern mining, are designed to analyze and extract frequent patterns from the constructed trees.
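The tree-based mining idea above can be hinted at with a much-simplified sketch: instead of full frequent-subtree pattern mining, it counts frequent parent-child interaction pairs over a set of hypothetical meeting interaction trees (the interaction types and trees below are invented examples):

```python
from collections import Counter

# Invented interaction trees: each node is (interaction_type, [children]).
meetings = [
    ("propose", [("comment", []), ("positive", [])]),
    ("propose", [("comment", []), ("negative", [])]),
    ("askOpinion", [("positive", [])]),
]

def parent_child_pairs(node):
    """Yield (parent, child) interaction pairs from one tree."""
    kind, children = node
    for child in children:
        yield (kind, child[0])
        yield from parent_child_pairs(child)

counts = Counter(p for tree in meetings for p in parent_child_pairs(tree))
frequent = {p for p, c in counts.items() if c >= 2}  # minimum support = 2
```

Full subtree pattern mining would enumerate and count entire subtrees rather than single edges, but the support-counting skeleton is the same.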

EMBEDDED SYSTEM BASED OIL WELL HEALTH MONITORING AND CONTROLLING USING SENSOR NETWORK
T.KATHIRAVAN, MR.M.SUDHAKARAN, Ganadipathy Tulsi's Jain Engineering College

Abstract A security management system is described in this project. The structure of the wireless security system adopts two levels. The first level consists of some remote controllers and a launcher, which includes a wireless burglar alarm, fault alarm, power-off alarm, self-checking alarm and some wireless night patrol points. The second level consists of a wireless receiver and a wireless alarm controller. A gas sensor detects the flammable gas which generally evolves from oil wells; if any detection occurs, an exhaust fan switches on automatically to remove the particles. If the temperature becomes too high, a cooling fan is triggered to reduce or maintain the temperature in the wells. With the help of current and potential transformers we can find the fluctuations in the pumping section. If the level of oil varies from the indicated level, the system gives an alert message via a voice indicator; it also measures humidity and uses a pH sensor. All data are transmitted to and monitored on a PC in the control room.

A WEIGHTED PERIODICAL PATTERN MINING AND PREDICTION FOR PERSONAL MOBILE COMMERCE
P.LAKSHMIPRIYA, M.KALIDASS, Maharaja Engineering College

Abstract The development of wireless and web technologies has allowed mobile users not only to request various kinds of services from mobile devices at anytime and anywhere, but also to use their mobile devices to make business transactions easily, e.g., via a digital wallet. The location of the mobile phone user is an important piece of information used during mobile commerce, or m-commerce, transactions. Mining and prediction of users' mobile commerce behaviors, such as their movements and purchase transactions, has been studied in data mining research. Most of the previous studies adopt a pattern mining approach; however, pattern mining needs more time to mine the frequent patterns in transaction databases when the data size increases. In this study, we propose a Weighted Sequential Pattern (WSP) and periodical pattern method for mining and prediction of the purchase behavior of mobile users, which reduces the time complexity and mines accurate results for each itemset. A performance study shows that WSP and periodical pattern mining is efficient and accurate for predicting both large and small frequent patterns, and is about an order of magnitude faster than some recently reported frequent-pattern mining methods.

A JOINT SENTIMENT TOPIC DETECTION FROM TEXT USING SEMI-SUPERVISED TACTIC
KARTHIKA.N, LINCY JANET.F, JANAKI MEENA.M, Velammal College of Engineering and Technology

Abstract Sentiment analysis, or opinion mining, is the process of detecting the subjective information in a given text. Text may include subjective information such as opinions, attitudes and feelings. Sentiment analysis also has an important potential role as an enabling technology for other systems. This paper employs two semi-supervised probabilistic approaches, the JST model and the Reverse-JST model, to detect sentiment topics. The system designed in this paper classifies the positive and negative labels of an online review. In JST, document-level sentiment classification is based on topic detection and topic sentiment analysis: sentiment labels are associated with documents, topics are generated dependent on the sentiment distribution, and words are generated conditioned on the sentiment-topic pair. In Reverse-JST, the sentiment label is dependent on the topics: topics are associated with documents, sentiment labels are associated with topics, and words are associated with both topics and sentiment labels. In LDA, topics are associated with documents and words are associated with the topic distribution. JST and Reverse-JST are evaluated on four different domains using the Gibbs sampling algorithm. The nature of JST makes it highly portable to other domains. We compare JST and Reverse-JST with latent Dirichlet allocation, and observe that the topics and topic sentiments detected by JST are indeed coherent and informative.

A SURVEY ON TRANSACTION MAPPING ALGORITHM FOR MINING FREQUENTLY OCCURRING DATASETS
S.SURIYA, R.M.MEENAKSHI, Velammal College of Engineering and Technology

Abstract An algorithm for mining complete frequent itemsets, referred to as the transaction mapping algorithm, is surveyed. In this algorithm, the transaction ids of each itemset are mapped and compressed to continuous transaction intervals in a different space, and the counting of itemsets is performed by intersecting these interval lists in depth-first order along the lexicographic tree. When the compression coefficient becomes smaller than the average number of comparisons for interval intersection at a particular level, the algorithm switches to transaction id intersection.

DIGITAL IMAGE FORGERY DETECTION AND ESTIMATION THROUGH EXPLORING IMAGE MANIPULATIONS
MARY METILDA.J, Roever Engineering College

Abstract In the modern age in which we are living, digital images play a vital role in many application areas. At the same time, image retouching techniques have also advanced, which poses a serious threat to the security of digital images. To cope with this problem, the field of digital forensics and investigation has emerged and provided some trust in digital images. We propose a technique for image authentication that detects the manipulations done to digital images. Most image forgeries, such as copy-and-paste forgery, region duplication forgery and image splicing forgery, involve basic image operations or manipulations; if there is evidence of basic image alterations in a digital image, we can say that the image has been altered. This paper aims at detecting the basic image operations, such as re-sampling (rotation, rescaling), contrast enhancement and histogram equalization, which are often applied in forged images. The available interpolation-related spectral signature method is used for detecting rotation and rescaling and for estimating parameters such as the rotation angle and rescale factor. This rotation/rescaling detection method flags some unaltered images as altered when the images are JPEG compressed; we overcome that problem by adding noise to the input images. We also use an existing fingerprint detection technique for detecting contrast enhancement and histogram equalization.

OPASS: A USER AUTHENTICATION PROTOCOL TO PREVENT ATTACKS AND DETECT PHISHING WEBSITES
M.MINU SUNITHA MARY, MS.E.SALOME, Holy Cross Engineering College

Abstract Today security concerns are on the rise in all areas such as banks, governmental applications, the healthcare industry, military organizations, educational institutions, etc. Government organizations are setting standards, passing laws and forcing organizations and agencies to comply with these standards, with non-compliance being met with wide-ranging consequences. There are several issues when it comes to security in these numerous and varying industries, with one common weak link being passwords. Most systems today rely on static passwords to verify the user's identity. However, such passwords come with major management and security concerns: users tend to use easy-to-guess passwords, use the same password for multiple accounts, or write the passwords down or store them on their machines. Furthermore, hackers have many techniques for stealing passwords, such as shoulder surfing, snooping, sniffing and guessing. Several proper strategies for using passwords have been proposed, but they did not meet companies' security concerns. Two-factor authentication using devices such as tokens and ATM cards has been proposed to solve the password problem and has been shown to be difficult to hack. Two-factor authentication is a mechanism which implements two factors and is therefore considered stronger and more secure than the traditionally implemented one-factor authentication. Withdrawing money from an ATM machine utilizes two-factor authentication: the user must possess the ATM card, i.e., what you have, and must know a unique personal identification number (PIN), i.e., what you know.

ENERGY EFFICIENT SENSORY DATA COLLECTION AND RECONCILING FROM DAMAGED NETWORK
MS.SRIE VIDHYA JANANI.E, NANDHINI.B, Anna University, Regional Centre

Abstract Wireless sensor networks have a wide range of applications in the field of networking. The sink nodes need to communicate effectively with the other sensor nodes. Factors such as cluster size, energy and lifetime of the nodes should be considered to make the communication effective. While transmitting, the nodes are grouped into clusters with one head per cluster. The cluster nearest to the sink node may run out of energy due to continuous utilization, so an intermediate node, called the AGM node, is used for communication. The sensor nodes in a cluster first send their information to their cluster head (chosen on the basis of higher residual energy); the cluster head in turn sends the information to the AGM node, and the AGM node transmits it to the respective sensor node, and vice versa. Selection of the AGM among many nodes and the entire process are carried out on the basis of the Maneuver algorithm, which has six phases: compact clustering, AGM selection, inter-clustering, load balancing and data distribution, communication and replenishment, and the Reconcile algorithm. In this process the cluster head transmits data after eliminating redundancy in it. In case of massive damage, the Reconcile algorithm is used for the efficient usage of available AGM nodes to recover from the relapsed network.

FEATURES EXTRACTION AND VERIFICATION OF SIGNATURE IMAGE
A.VAIRAMUTHU, NAVIAJOSEPH, A.RUBIYA, S.RAMYA, P.S.R.Rengasamy College of Engineering for Women

Abstract Communication leads to the development of languages. Writing is an art which varies from person to person, and a signature is one of the best ways to identify a person. The signature of the same individual may vary with time and situation. Signature verification is very important in fields requiring personal authentication such as the military, banking, etc. In order to identify and control forgeries we go for signature verification. There are three types of forgeries: random forgery, simple forgery and skilled forgery. In this paper we present a suitable and efficient method for offline signature verification with good reliability and accuracy. This method is very useful for identifying forgeries, including skilled ones. The signature verification process includes a preprocessing stage, a feature extraction stage and a signature verification stage. The method is reliable and inexpensive.

AIRBORNE INTERNET
NITHIYA.A, MANIMEKALAI.V, National Engineering College

Abstract The word on just about every Internet user's lips these days is "broadband." We have so much more data to send and download today, including audio files, video files and photos, that it is clogging our wimpy modems. Many Internet users are switching to cable modems and digital subscriber lines (DSLs) to increase their bandwidth. There is also a new type of service being developed that will take broadband into the air. Our paper explains some of the drawbacks that exist in satellite Internet and introduces the airborne Internet, called High Altitude Long Operation (HALO), which would use lightweight planes circling overhead to provide data delivery faster than a T1 line for businesses; consumers would get a connection comparable to DSL. The HALO Network will serve tens of thousands of subscribers within a super-metropolitan area by offering ubiquitous access throughout the network's signal "footprint." The HALO aircraft will carry the "hub" of a wireless network with a star topology. The initial HALO Network is expected to provide a raw bit capacity exceeding 16 Gbps. The concept of basic network connectivity could be used to connect mobile vehicles, including automobiles, trucks, trains, and even aircraft; network connectivity could be obtained between vehicles and a ground network infrastructure.

BRAIN TUMOR DETECTION AND AUTOMATIC SEGMENTATION
K.MUTHUREGA, A.NIVETHA, H.PADMAPRIYA, DR.K.RAMASAMY, P.S.R.Rengasamy College of Engineering for Women

Abstract Brain tumor detection and segmentation is a complicated task in MRI. The MRI indicates the regular and irregular tissues, to differentiate the overlapping tissues in the brain. The automatic seed selection method has a problem when there is no tumor growth but even a small white region is present; moreover, since the edges of the tumor are not sharp, the result is not accurate. This happens at the initial stage of the tumor. So texture-based detection is used, and it segments automatically to separate the regular and irregular tissue, obtaining the tumor area from the unaffected area. The method used here is the seeded region growing method, and the version used is MATLAB 7.8.0.347.

CLASSIFICATION MODEL FOR EARLY DISEASE DIAGNOSIS USING DATA MINING
P.PRIYANKA, I.S.JENZI, DR.P.ALLI, Velammal College of Engineering and Technology

Abstract Cerebrovascular disease is a disease that threatens human health seriously; it is ranked as the second leading cause of death after ischemic heart disease. Discovering and preventing cerebrovascular disease as early as possible has become critical. In clinical practice its occurrence is so abrupt and fierce that it is hard to make an early and accurate diagnosis and prediction beforehand. To overcome this, a cerebrovascular disease predictive model is constructed using classification algorithms. This work obtains data on patients including their physical exam results, blood test results and diagnosis data, with the purpose of constructing an optimum cerebrovascular disease predictive model. Three classification models are constructed using classification techniques: a Bayesian classifier, a decision tree, and a back-propagation neural network. This work provides a pre-processed dataset in which missing and unwanted values are removed. The pre-processed data are classified according to the age attribute and features are extracted using feature extraction. The mean and standard deviation of each attribute are calculated. The attributes matching a threshold value based on their importance are extracted employing the SVM algorithm. The extracted attributes are split into T1, T2 and T3 and are used for constructing the classification models. The efficiency of the models is compared, the model with the best efficiency is taken, and rules are predicted.

WIDE RANGE REPUTATION BASED ANNOUNCEMENT SCHEME FOR VANETS
P.RAJASEKAR, MRS.A.H.RAGAMATHUNISA BEGUM, National Engineering College

Abstract Using mobile ad hoc networks in an automotive environment (VANET) opens up a new set of applications, such as passing information about local traffic or road conditions. This can increase traffic safety and improve mobility. One of the main advantages is forwarding event-related messages. Vehicular ad hoc networks (VANETs) allow vehicles to generate and broadcast messages to inform nearby vehicles about road conditions, such as traffic congestion and accidents. Neighboring vehicles can use this information, which may improve road safety and traffic efficiency, but messages generated by vehicles may not be reliable. Existing systems use an announcement scheme for VANETs based on a reputation system, which allows evaluation of message reliability and improves secure and efficient reputation broadcast in VANETs. Our proposed system extends the current scheme in such a way that a message can be utilized by vehicles in a greater area.

AGENT TRUST FOR EFFECTIVE COMMUNICATION IN INTELLIGENT TRANSPORTATION SYSTEM
S.RAMAPRIYA, S.PADMADEVI, Velammal College of Engineering and Technology

Abstract An increasingly large number of cars are being equipped with global positioning system and Wi-Fi devices, enabling vehicle-to-vehicle (V2V) communication with the aim of providing road safety and increased passenger comfort. This technology creates the need for agents that assist users by intelligently processing the received information. Some of these mobile agents try to maximize their car owners' utility by sending out erroneous information; the consequence of acting on erroneous information implies the serious need to establish trust among mobile agents. The main aim of this work is to develop a model of the trustworthiness of agents in other vehicles, in order to receive the most effective information. The challenge is to design intelligent agents that enable the sharing of information between vehicles in mobile ad hoc vehicular networks (VANETs). This work develops a multifaceted trust modeling approach that incorporates role-based trust, priority-based trust, experience-based trust and majority-based trust, and is thereby able to restrict the number of reports received. It includes an algorithm that proposes how to integrate the various dimensions of trust, with experiments to validate the benefit of the agent approach, stressing the importance of each of the different facets. The result provides an important methodology to enable effective V2V communication via intelligent mobile agents.

CRITICAL EVENT DETECTION AND MONITORING USING NOVEL SLEEP SCHEDULING IN WSN
K.RAMYA, MR. V.SENTHIL MURUGAN, Srinivasan Engineering College

Abstract In wireless sensor networks, only a small number of packets have to be transmitted during critical event monitoring. If any critical event is detected, the alarm packet should be broadcast to the entire network as early as possible; therefore, broadcasting delay is an important problem for critical event monitoring applications. To prolong the network lifetime, sleep scheduling methods are usually employed in WSNs, but they result in a significant broadcasting delay. A novel sleep scheduling method is proposed, based on a level-by-level offset schedule, to achieve a low broadcasting delay in wireless sensor networks (WSNs). There are two phases in the alarm broadcasting: first, if a node detects a critical event, it creates an alarm message and quickly transmits it to a center node along a predetermined path in a node-by-node offset way; then the center node broadcasts the alarm message to the other nodes along another predetermined path without collision. An on-demand distance vector routing protocol is established in one of the traffic directions for alarm transmission. The proposed system can be used in military and forest fire applications.
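The node-by-node offset idea can be illustrated with a toy schedule: each node on the pre-determined alarm path wakes one slot after its upstream neighbour, so an alarm is relayed hop by hop instead of waiting a full sleep period at every hop. The node names and slot length below are illustrative assumptions:

```python
SLOT = 10  # ms per wake-up slot (illustrative)
path = ["sensor", "relayA", "relayB", "center"]  # pre-determined alarm path

# Each node wakes one slot after its upstream neighbour, so a detected event
# is forwarded hop by hop within a single scheduling period.
offsets = {node: i * SLOT for i, node in enumerate(path)}

def forwarding_delay(path, offsets):
    """Worst-case delay from detection at the first node to the center node."""
    return offsets[path[-1]] - offsets[path[0]]

delay = forwarding_delay(path, offsets)  # 30 ms over three hops
```

Without the offsets, a node that woke at a fixed common slot would hold the alarm for a whole period per hop, which is exactly the broadcasting delay the scheme tries to avoid.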

CONVOLUTION PARTIAL TRANSMIT SEQUENCE SCHEME FOR IMPROVING THE ENERGY EFFICIENCY OF A TRANSMITTED SIGNAL
L.NIRMALADEVI, D.RATHIMEENA, A.SHIFANA YASMIN, P.SURESH PANDIYARAJAN, P.S.R.Rengasamy College of Engineering for Women

Abstract We reduce the Peak-to-Average Power Ratio (PAPR) in an OFDM system by using Convolution Partial Transmit Sequence (C-PTS). C-PTS requires several Inverse Fast Fourier Transform (IFFT) operations, which increase its computational complexity. In our method, the number of IFFT operations is reduced at the cost of a slight PAPR loss. Simulations are performed with QPSK-modulated OFDM signals and a Saleh-model power amplifier (PA). The linearity and efficiency of the Saleh-model PA are increased by the effect of digital predistortion (DPD).
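The PTS principle behind C-PTS can be sketched independently of the paper's convolution-based variant: the subcarriers are split into disjoint sub-blocks, each sub-block is transformed separately, and the phase-factor combination with the lowest PAPR is kept. The 8-subcarrier QPSK vector and the ±1 phase set below are illustrative assumptions:

```python
import cmath, itertools

def ifft(X):
    """Naive inverse DFT (O(N^2)); stands in for an IFFT block."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def papr(x):
    """Peak-to-average power ratio of a sampled time-domain signal."""
    powers = [abs(v) ** 2 for v in x]
    return max(powers) / (sum(powers) / len(powers))

# Illustrative QPSK data on 8 subcarriers, split into two disjoint sub-blocks.
X = [1+1j, 1-1j, -1+1j, 1+1j, -1-1j, 1+1j, 1-1j, -1+1j]
blocks = [X[:4] + [0] * 4, [0] * 4 + X[4:]]

plain = papr(ifft(X))                                # PAPR without PTS
candidates = []
for p, q in itertools.product([1, -1], repeat=2):    # phase factor per sub-block
    t1 = ifft([p * v for v in blocks[0]])
    t2 = ifft([q * v for v in blocks[1]])
    candidates.append(papr([a + b for a, b in zip(t1, t2)]))
best = min(candidates)               # PTS keeps the lowest-PAPR combination
```

Because the all-ones phase combination is among the candidates, the selected PAPR can never exceed that of the unmodified signal; the extra IFFT per sub-block is exactly the complexity cost that the abstract says is being reduced.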

SECURE TRANSACTION IN ONLINE BANKING SYSTEM USING IB-MRSA


S.RENUGA DEVI, S.CHIDAMBARAM, V.MANIMARAN, National Engineering College

Abstract Nowadays, as more clients use online banking, online banking systems are becoming more desirable, and banks must keep client information and product data secure from attackers. To maintain clients' trust, the confidentiality of their online banking services (purchasing items, checking account information, etc.) must be preserved; we study how attackers compromise accounts and develop methods to protect them. Towards this purpose, we present a modified model to authenticate clients for online banking transactions by utilizing the Identity-Based mediated RSA (IB-mRSA) technique in conjunction with the one-time ID concept, to increase security in online banking. The introduced system exploits a method for splitting private keys between the client and the Certification Authority (CA) server: the key is split into two parts, one for the SEM (SEcurity Mediator) and another for the client. The client uses its key share to encrypt messages, and the SEM uses its key share to decrypt the client's requests.
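The key-splitting step of mediated RSA can be shown with a toy example: the private exponent is split into two additive shares so that neither the client nor the SEM can decrypt alone. The tiny primes and share value below are illustrative only and offer no security:

```python
# Toy mediated RSA with tiny primes -- illustrative only, NOT secure.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)   # n = 3233, phi = 3120
e = 17
d = pow(e, -1, phi)                 # full private exponent (Python 3.8+)

d_client = 1234                     # share held by the client (arbitrary)
d_sem = (d - d_client) % phi        # complementary share held by the SEM

m = 65                              # message
c = pow(m, e, n)                    # encrypt with the public key (n, e)

# Decryption needs BOTH parties: each raises c to its own share only.
partial_client = pow(c, d_client, n)
partial_sem = pow(c, d_sem, n)
recovered = (partial_client * partial_sem) % n   # == m
```

Revoking a client then amounts to instructing the SEM to stop producing its partial result, which is what makes the mediated variant attractive for online banking.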

THE EYE MOUSE: THE EQUIVALENT OF THE CONVENTIONAL COMPUTER MOUSE


RAJA SARATHA

Abstract: The Eye Mouse is the equivalent of the conventional computer mouse, but it is entirely controlled by eye and nose movements. This offers interesting possibilities for the study of eye movements during drawing, as well as providing a unique device that allows disabled users to operate computers. In this paper we introduce the design of a system for controlling a PC by eye movements. During the last ten years computers have become common tools of work; it is nearly impossible to exist without them in everyday life. We are witnessing the time of a revolutionary introduction of computers and information technologies into daily practice. Healthy people use a keyboard, mouse, trackball, or touchpad for controlling the PC, but these peripherals are usually not suitable for disabled people. They may have problems using these standard peripherals, for example when they suffer from myopathy or cannot move their hands after an injury. Therefore we propose a way to make it easier for disabled people to control the PC.

THREE-PORT SERIES-RESONANT DCDC CONVERTER TO INTERFACE RENEWABLE ENERGY SOURCES WITH BIDIRECTIONAL LOAD AND ENERGY STORAGE PORTS
RESMI.S.P AND LINDA PHILIP Udaya School of Engineering

Abstract Future renewable energy systems will need to interface several energy sources, such as fuel cells and photovoltaic (PV) arrays, with the load along with battery backup. A three-port converter finds applications in such systems since it offers the advantages of reduced conversion stages, a high-frequency ac link, a multi-winding transformer on a single core, and centralized control. Applications include fuel-cell systems, automobiles, and stand-alone self-sufficient residential buildings.

MOBILE DATA GATHERING USING PPS IN WIRELESS SENSOR NETWORK


MR. R.SARAVANAN, V.REVATHI, Anna University, Chennai

Abstract Energy consumption is a primary concern in a wireless sensor network. To pursue maximum energy saving at the sensor nodes, a mobile collector could traverse the transmission range of every sensor in the field so that each data packet is transmitted directly to the mobile collector without any relay; however, this approach significantly increases data gathering latency due to the low moving velocity of the mobile collector. This paper studies the tradeoff between energy saving and data gathering latency in mobile data gathering by exploring a balance between the relay hop count of local data aggregation and the length of the mobile collector's tour. It proposes a polling-based mobile gathering approach and formulates it as an optimization problem, named bounded relay hop mobile data gathering. Specifically, a subset of sensors is selected as polling points that buffer locally aggregated data and upload it to the mobile collector when it arrives. Meanwhile, when sensors are affiliated with these polling points, any packet relay is guaranteed to be bounded within a given number of hops. Two efficient algorithms are then given for selecting polling points among sensors.
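As a hedged illustration of the polling-point idea (a greedy set-cover heuristic of our own, not necessarily either of the paper's two algorithms), polling points can be chosen so that every sensor is within `d` relay hops of one of them:

```python
# Greedy polling-point selection sketch: repeatedly pick the node whose
# d-hop neighborhood covers the most still-uncovered sensors.
from collections import deque

def hops_within(adj, src, d):
    """Return the set of nodes reachable from src in at most d hops (BFS)."""
    seen, frontier = {src}, deque([(src, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if dist == d:
            continue
        for nb in adj[node]:
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, dist + 1))
    return seen

def select_polling_points(adj, d):
    uncovered = set(adj)
    points = []
    while uncovered:
        # pick the node covering the most still-uncovered sensors
        best = max(adj, key=lambda v: len(hops_within(adj, v, d) & uncovered))
        points.append(best)
        uncovered -= hops_within(adj, best, d)
    return points

# chain of 5 sensors: 0-1-2-3-4, relay bound d = 1
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(select_polling_points(adj, 1))   # → [1, 3]
```

Each selected node would then buffer its neighbors' aggregated data and upload it when the mobile collector's tour reaches it.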

BRAIN TUMOR DETECTION AND IDENTIFICATION USING IMAGE PROCESSING AND SOFT COMPUTING
A.SAHAYASUJI G.ATHILAKSHMIVINOTHINI P.NIVETHA

Abstract In this paper, modified image segmentation techniques are applied to MRI scan images in order to detect brain tumors. For the segmentation step, we analyzed existing methods and propose an improved algorithm for brain tumor detection. For the subsequent identification step, we use a probabilistic neural network classifier, which separates tumor tissue from normal tissue.
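The identification step can be sketched with a minimal probabilistic neural network (a Parzen-window classifier); the two-dimensional features and values below are purely illustrative, not real MRI statistics:

```python
# Minimal PNN sketch: each class's score is a Parzen-window (Gaussian kernel)
# density estimate over its training vectors; the sample joins the densest class.
import math

def pnn_classify(sample, classes, sigma=1.0):
    def kernel(a, b):
        d2 = sum((x - y) ** 2 for x, y in zip(a, b))
        return math.exp(-d2 / (2 * sigma ** 2))
    scores = {label: sum(kernel(sample, t) for t in train) / len(train)
              for label, train in classes.items()}
    return max(scores, key=scores.get)

classes = {
    "normal": [(0.2, 0.3), (0.25, 0.35), (0.3, 0.3)],   # toy intensity/texture pairs
    "tumor":  [(0.8, 0.7), (0.85, 0.75), (0.9, 0.8)],
}
print(pnn_classify((0.82, 0.72), classes))   # → tumor
```

In a real pipeline the training vectors would be features extracted from segmented MRI regions, and `sigma` would be tuned on validation data.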

A SECURE DATA FORWARDING IN THE CLOUD STORAGE SYSTEM BASED ON PROXY RE-ENCRYPTION
J.SHYAMALA, B.VINISHA CATHRINE ANTONUS, R.SIVASUBRA NARAYANAN HolyCross Engineering College

Abstract: Cloud computing enables highly scalable services to be easily consumed over the Internet on an as-needed basis. Cloud storage is a model of networked online storage where data is stored in virtualized pools of storage, generally hosted by third parties: hosting companies operate large data centers, and people who need their data hosted buy or lease storage capacity from them. Data robustness is a major requirement for storage systems, and there have been many proposals for storing data over storage servers. One way to provide data robustness is to replicate a message so that each storage server stores a copy; a decentralized erasure code, however, is better suited to a distributed storage system. We construct a secure cloud storage system that supports secure data forwarding by using a proxy re-encryption scheme. The encryption scheme supports decentralized erasure codes over encrypted messages and forwarding operations over encrypted and encoded messages. Our system is highly distributed: storage servers independently encode and forward messages, and key servers independently perform partial decryption. We analyze and suggest suitable parameters for the number of copies of a message dispatched to storage servers and the number of storage servers queried by a key server. These parameters allow flexible adjustment between the number of storage servers and robustness.
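The decentralized erasure code underlying such systems can be sketched as a random linear code over a small prime field (toy field size and block count of our choosing; the encryption and proxy re-encryption layers are omitted):

```python
# Random linear erasure code sketch: each storage server independently keeps
# a random linear combination of the k message blocks; any k independent
# combinations recover the message by Gaussian elimination over GF(P).
import random

P = 257  # toy prime field

def encode(message, n):
    """Each of n servers independently stores (coefficients, combination)."""
    k = len(message)
    shares = []
    for _ in range(n):
        coeffs = [random.randrange(P) for _ in range(k)]
        value = sum(c * m for c, m in zip(coeffs, message)) % P
        shares.append((coeffs, value))
    return shares

def decode(shares, k):
    """Recover the k blocks by Gaussian elimination over GF(P)."""
    rows = [list(c) + [v] for c, v in shares]
    for col in range(k):
        pivot = next(r for r in range(col, len(rows)) if rows[r][col])
        rows[col], rows[pivot] = rows[pivot], rows[col]
        inv = pow(rows[col][col], P - 2, P)          # Fermat modular inverse
        rows[col] = [x * inv % P for x in rows[col]]
        for r in range(len(rows)):
            if r != col and rows[r][col]:
                f = rows[r][col]
                rows[r] = [(a - f * b) % P for a, b in zip(rows[r], rows[col])]
    return [rows[i][k] for i in range(k)]

random.seed(7)                       # fixed seed so the sketch is repeatable
msg = [42, 7, 199]                   # k = 3 message blocks (values < P)
shares = encode(msg, 6)              # dispatched to 6 storage servers
print(decode(shares, 3))             # any 3+ independent shares suffice
```

The "decentralized" property is visible here: no server needs to coordinate with any other to choose its coefficients, which is what lets storage servers encode and forward independently.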

IMPROVING PERFORMANCE OF VIDEO TRANSMISSION QUALITY USING ENCRYPTION AND RESOURCE ALLOCATION METHOD
M.SARANYA, R. KAVITHA Velammal College of Engineering and Technology

Abstract Visual Sensor Networks (VSNs) are recent communication infrastructures used to capture and transmit dense visual information from an application context. This paper addresses two of the most relevant challenges posed by VSNs, namely energy efficiency and security. Energy efficiency is one of the most challenging issues for multimedia communication in VSNs due to resource constraints and the requirements for high bandwidth and low transmission delay. When nodes send video data, transmission consumes considerable time because video files are much larger than text files; therefore, the video data is compressed before being sent to the destination. Another important factor during data transfer is security. This paper proposes joint compression and encryption to enable faster and secured transmission of video data. The joint compression and encryption algorithms resolve the two major issues, energy efficiency and security, when confidential video data is sent over a VSN.

ADAPTIVE COUNTERMEASURE TO PREVENT DOS ATTACKS USING AN ADAPTIVE SENSITIVE AUTHORIZATION
N.SELVAGANAPATHY G.VINOTHCHAKKARAVARTHY Velammal College of Engineering and Technology

Abstract DoS attacks aim to exhaust scarce resources by generating illegitimate requests from one or many hosts, damaging the system. To counter this, we propose Adaptive Selective Verification (ASV), a distributed adaptive mechanism for thwarting attackers' efforts to deny service to legitimate clients, based on selective verification with auction-based payment. The bandwidth of each user's network path is limited: an adaptive bandwidth limit is enforced, with server state whose size remains small and constant regardless of the attackers' actions, and the limit changes dynamically depending on client usage. We perform an empirical evaluation of the ASV protocol with the aim of understanding its performance against attackers in practice, and we enhance the system by adding multiple client properties for estimating the attack rate.
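The selective-verification core can be sketched as follows (an illustrative simplification of our own, not the authors' full protocol, which also involves client retransmission schedules and auction-based payment):

```python
# Selective verification sketch: the server samples incoming requests with
# probability capacity/load, so its verification work stays bounded under a
# flood, while a legitimate client compensates by retransmitting its request.
import random

def selective_verify(requests, capacity, rng):
    p = min(1.0, capacity / len(requests))   # sampling rate adapts to load
    return [r for r in requests if rng.random() < p]

rng = random.Random(42)
flood = ["attack"] * 1000 + ["client"] * 60  # client retransmits 60 times
rng.shuffle(flood)
served = selective_verify(flood, capacity=100, rng=rng)
print(len(served), served.count("client"))
```

The server verifies roughly `capacity` requests regardless of flood size, and the legitimate client's retransmissions make it overwhelmingly likely that at least one of its copies is sampled.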

AN EFFICIENT CLUSTERING ALGORITHM FOR SPAM FILTERING


P.SHARMILA, J.SHANTHALAKSHMI REVATHY Velammal College of Engineering & Technology

Abstract Clustering high-dimensional data results in overlapping clusters and the loss of some data. This paper extends k-means clustering with a weight function for clustering high-dimensional data. The weight function is determined by a vector space model that converts the high-dimensional data into a vector matrix. The proposed algorithm performs fuzzy projective clustering, which is used to find the overlapping boundaries in various subspaces. The objective function finds the relevant dimensions by discarding the irrelevant dimensions for cluster formation. This is demonstrated on document clustering: email documents are taken as sample datasets to illustrate fuzzy projective clustering.

FAST PERCEPTUAL VIDEO ENCRYPTION USING RANDOM PERMUTATION ON MODIFIED DCT CO-EFFICIENTS
M.SHENBAGAVALLI, S.RAJAGOPAL, L.JERART JULUS National Engineering College

Abstract Videos are generally of large volume. Video encryption, also known as video scrambling, is a powerful technique for preventing unwanted interception. In this paper a robust perceptual video encryption technique is applied by selecting one out of multiple unitary transforms at the transformation stage, according to an encryption key generated by a random permutation method. The encryption is done by splitting each frame into its RGB components; by altering the phase angle of the encryption key, the separated components of each frame undergo the unitary transform. The transformed frame contains coefficients that include both high-frequency and low-frequency components. In the first stage, the IDCT is applied to the encrypted frames, and the frames are then combined to obtain the encrypted video. In the second stage, the encrypted frames are quantized and encoded; to overcome the drawbacks of Huffman coding, an adaptive arithmetic encoder is used at the coding stage, yielding the encrypted bit stream. In the third stage, decryption is performed to recover the original video. The performance is also analyzed under various parameters. This methodology will be useful for video-based services over networks.
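The key-driven permutation stage can be sketched as follows (the DCT itself, the RGB split, and the transform selection are omitted; `coeffs` stands in for one block's coefficients, and the key name is hypothetical):

```python
# Keyed permutation sketch: DCT coefficients of a block are shuffled with a
# permutation derived deterministically from the encryption key, and the
# inverse permutation restores them exactly at the decoder.
import random

def key_permutation(n, key):
    rng = random.Random(key)          # the key seeds the PRNG deterministically
    perm = list(range(n))
    rng.shuffle(perm)
    return perm

def apply_perm(coeffs, perm):
    return [coeffs[p] for p in perm]

def invert_perm(perm):
    inv = [0] * len(perm)
    for i, p in enumerate(perm):
        inv[p] = i
    return inv

coeffs = [130, -42, 17, 5, -3, 2, 1, 0]   # toy 1x8 DCT coefficient block
perm = key_permutation(len(coeffs), key="shared-secret")
scrambled = apply_perm(coeffs, perm)
restored = apply_perm(scrambled, invert_perm(perm))
assert restored == coeffs
```

Because only coefficient positions change, the scrambled block still compresses well, which is what makes permutation-based perceptual encryption cheap enough to combine with quantization and entropy coding.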

A NOVEL CHANNEL ADAPTIVE ROUTING WITH HANDOVER IN MANETS


A.SIVAGAMI, L.MARISELVI, M.RENUKADEVI P.S.R.Rengasamy College of Engineering For Women

Abstract Radio link fluctuation makes packet transmission in mobile ad hoc networks difficult. To overcome this we propose a new protocol, a novel channel-adaptive routing protocol, which reduces the impact of channel fading. The proposed protocol selects stable links for route discovery using an average non-fading duration technique, while a handoff strategy maintains reliable connections. The protocol provides a dual attack on the problem: it avoids unnecessary route discoveries, predicts path failures so as to trigger handoff, and brings paths back into play when they are again available rather than simply discarding them at the first sign of a fade.

SENSING ENVIRONMENTAL AND DISASTER USING CLOUD COMPUTING INFRASTRUCTURE


SUDHAGAR.V, MUTHU PATTAN.V Lord Jegannath College of Engineering and Technology

Abstract Remote monitoring systems are growing very rapidly thanks to the growth of their supporting technologies. Problems that arise in remote monitoring include the number of objects to be monitored and how fast and how much data must be transmitted to the data center to be processed properly. This study proposes using a cloud computing infrastructure as the processing center for remote sensing data, focusing on sensing environmental conditions and early disaster detection, both of which have become important issues, especially in big cities with many residents. The study builds a conceptual model and a prototype in a comprehensive manner, from the remote terminal unit through the data retrieval method. We also propose using the FTR-HTTP method to guarantee delivery from the remote client to the server.

A MULTI-RESOLUTION FAST FILTER BANK USING CYCLOSTATIONARY FEATURE DETECTION FOR SPECTRUM SENSING IN MILITARY RADIO RECEIVER
SUNDARESAN.V REVATHI.B

Abstract Nowadays scarcity of spectrum is a major issue in the field of wireless communication, so efficient usage of spectrum is needed. This can be achieved by using cognitive radio, for which the major problem is spectrum sensing. A multi-resolution fast filter bank using cyclostationary feature detection is proposed to sense various ranges of spectrum in military radio receivers, overcoming the constraint of fixed spectrum sensing. With cyclostationary feature detection, small sub-bands can also be sensed and used for LAN communications in military applications. By means of cyclostationary feature detection we can classify and identify the primary signal as either a Digital Video Broadcasting-Terrestrial (DVB-T) signal or a wireless microphone signal. This knowledge of the primary signal helps the cognitive radio use a fraction of the TV band when only a wireless microphone signal is present in the channel. The detector can also extract features of the primary signal such as double sideband, data rate, and modulation type.
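The detection principle can be sketched by estimating a cyclic autocorrelation (all parameters below are illustrative, and a noisy carrier stands in for a modulated primary signal):

```python
# Cyclostationary feature detection sketch: the cyclic autocorrelation of a
# signal peaks at its cycle frequencies (e.g. twice the carrier frequency),
# letting a sensor distinguish a modulated primary from flat noise.
import cmath, math, random

def cyclic_autocorr(x, alpha, tau=0):
    """Estimate R_x^alpha(tau) over the whole record (real-valued input)."""
    n = len(x)
    return sum(x[i] * x[(i + tau) % n] * cmath.exp(-2j * math.pi * alpha * i)
               for i in range(n)) / n

random.seed(3)
N, fc = 4096, 0.1                      # record length, normalized carrier freq
carrier = [math.cos(2 * math.pi * fc * i) + random.gauss(0, 0.5)
           for i in range(N)]
noise = [random.gauss(0, 0.5) for _ in range(N)]

# a carrier exhibits a cyclic feature at alpha = 2*fc (~0.25); noise does not
feat_signal = abs(cyclic_autocorr(carrier, 2 * fc))
feat_noise = abs(cyclic_autocorr(noise, 2 * fc))
print(feat_signal > feat_noise)
```

Comparing the feature magnitude against a threshold set from the noise floor gives the sensing decision; scanning `alpha` over candidate cycle frequencies is what distinguishes, say, a DVB-T signal from a wireless microphone.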

IMPROVING EFFICIENCY IN MULTICASTING PROTOCOL FOR AD HOC NETWORKS


S.SURYA, SANGEETHASENTHILKUMAR Oxford Engineering College

Abstract Multicasting protocols support group communication in ad hoc networks; receiver-based protocols are one such class, proposed as a means of allowing communication when nodes do not maintain any state information. In receiver-based protocols, receivers contend to be the next-hop router of a packet. For multicast communication, the RB Multicast protocol is used, which simply embeds a list of the multicast members' addresses in packet headers to enable receivers to decide the best way to forward the multicast traffic. A new retransmission scheme is proposed to enhance the performance of RB Multicast, using an effective duty-cycle assignment technique based on distance that minimizes the expected energy dissipation for a given node's distance to the sink. This method achieves energy efficiency and a high packet delivery ratio even under heavy network traffic, without significantly sacrificing latency or throughput.

RESILIENT AUTHENTICATED EXECUTION OF CRITICAL APPLICATIONS IN CORRUPTED ENVIRONMENT USING VMM


S.VALARMATHI, Mr. S.SATHISHKUMAR Srinivasan Engineering College

Abstract A Virtual Machine Monitor (VMM) is used to develop a resilient execution environment for a critical application even in the presence of a corrupted OS kernel. An attacker may try to capture application content by corrupting the OS while the application is executing. Previously, when an attacker corrupted the OS by injecting code into the system, the application terminated immediately without completing. In the present system, even in the presence of corruption the application executes without interruption, providing resilient authenticated execution of critical applications in an untrusted environment by using the VMM. The VMM monitors all activities during execution and provides an online recovery scheme to identify any such corruption; it repairs the memory corruption and allows the process to continue normal execution. VMM solutions generally fall into two categories: memory authentication, which checks the integrity of an application, and memory duplication, which rectifies the corruption. The system can be applied to military applications, hospitals, and other critical applications.

IMPLEMENTATION OF EFFICIENT LIFTING BASED MULTI LEVEL 2-D DWT
R.VIJAYAMOHANARENGAN Indra Ganesan College of Engineering,

Abstract We present a modular, pipelined architecture for the lifting-based multilevel 2-D DWT. A VHDL model was described and synthesized to implement the architecture, which was optimized with an efficient pipelined and parallel design to increase speed and achieve higher hardware utilization. The two-dimensional discrete wavelet transform (2-D DWT) is widely used in many image compression techniques because it can decompose signals into different sub-bands with both time and frequency information, facilitating a high compression ratio. Designing an efficient VLSI architecture for real-time DWT computation is therefore a challenging problem. Owing to its regular and flexible structure, the design can easily be extended to different resolution levels, and its area is independent of the length of the 1-D input sequence. Compared with other known architectures, the proposed design requires the least computing time for the 1-D lifting DWT.
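The lifting idea itself can be sketched with integer Haar lifting (chosen for brevity; a hardware design of this kind would typically use the 5/3 filter), showing the predict/update steps built from adds and shifts that make lifting so hardware-friendly:

```python
# Integer Haar lifting sketch: predict each odd sample from its even
# neighbor, then update the even sample -- both steps are exactly invertible
# using only integer additions and arithmetic shifts.
def haar_lift(x):
    """One level of forward integer Haar lifting; len(x) must be even."""
    detail = [x[2*i + 1] - x[2*i] for i in range(len(x) // 2)]      # predict
    approx = [x[2*i] + (d >> 1) for i, d in enumerate(detail)]      # update
    return approx, detail

def haar_unlift(approx, detail):
    """Inverse lifting: undo the update, then the predict step."""
    x = []
    for s, d in zip(approx, detail):
        even = s - (d >> 1)
        x += [even, even + d]
    return x

sig = [10, 12, 9, 7, 20, 22, 5, 1]
a, d = haar_lift(sig)
a2, d2 = haar_lift(a)                 # second decomposition level on approx
assert haar_unlift(haar_unlift(a2, d2), d) == sig
```

A 2-D transform applies the same 1-D lifting along rows and then columns, and the multilevel decomposition recurses on the approximation sub-band, which is why the pipelined 1-D datapath extends so naturally to multiple resolution levels.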

EVALUATION OF DATA TRANSFERRING IN MULTICORE SYSTEM


B.VINISHA CATHRINE ANTONUS, J.SHYAMALA, A.JEYAMURUGAN HolyCross Engineering College,

Abstract Receive side scaling (RSS) is an NIC technology that provides the benefits of parallel receive processing in multiprocessing environments. However, RSS lacks a critical data steering mechanism that would automatically steer incoming network data to the core on which its application thread resides. This absence causes inefficient cache usage if an application thread is not running on the core on which RSS has scheduled the received traffic to be processed, and results in degraded performance. To remedy this RSS limitation, Intel's Ethernet Flow Director technology was introduced; however, our analysis shows that Flow Director can cause significant packet reordering, which has various negative impacts in high-speed networks. We propose an NIC data steering mechanism, mainly targeted at TCP, to remedy the limitations of both RSS and Flow Director. We term an NIC with such a data steering mechanism A Transport-Friendly NIC (A-TFN). Experimental results have proven the effectiveness of A-TFN in accelerating TCP/IP performance.

MEMBERSHIP MANAGEMENT IN LARGE SCALE RELIABLE STORAGE SYSTEM


K.VINOTHINI, Ms. B.AMUTHA Srinivasan Engineering College

Abstract Current systems have limitations in handling reconfigurations of a replica set, and lifetime membership is also difficult to maintain. Therefore, the dynamically changing system membership in a large-scale reliable storage system is maintained by a membership service that performs automatic reconfiguration, carried out together with dBQS (database Query Service). dBQS is interesting in its own right because its storage algorithms extend existing Byzantine quorum protocols to handle changes in the replica set, and it differs from previous DHTs by providing Byzantine fault tolerance and offering strong semantics. We develop two heuristic algorithms for these problems. Experimental studies show that the heuristic algorithms perform well in reducing communication cost and are close to optimal solutions.

ADAPTIVE IMAGE SEGMENTATION BASED ON HUMAN VISUAL ATTENTION


C.UMAMAHESWARI, S.ROSLIN MARY Anand Institute of Higher Technology.

Abstract One of the major high-level tasks in computer vision is object detection and recognition. The human visual system observes and understands a scene or image by making a series of fixations. Every fixation point lies inside a particular region of arbitrary shape and size in the scene, which can be either an object or just a part of one. Using the fixation point as an identification marker on the object, we present a method to segment the object of interest by finding the optimal closed contour around the fixation point in polar space. The proposed segmentation framework combines visual cues in a cue-independent manner. Although the algorithm is best suited to an active observer capable of fixating at different locations in the scene, it also applies to a single image. Once the optimal closed contour around a given fixation point is found, the framework establishes a simple feedback loop between the mid-level cues (regions) and the low-level cues (edges), and the segmentation refinement process is based on this feedback. Our algorithm is parameter-free, computationally efficient, and robust.
