Accepted Papers

  • Low Power Transmission of Images over Wireless Channel
    Sarala Shirabadagi, VTU University, India
    ABSTRACT
    Wireless transmission of still images and video streams over fading and noisy channels is a challenging task which has seen enormous development in recent years. Fading, interference, shadowing, path loss and multipath are sources of disturbance in wireless channels, which introduce errors into a transmitted data bit-stream. The challenge in transmission of a scalable bit-stream is to ensure high fidelity of the received signal while maintaining a high data rate during transmission. Hence it is required to develop a system for transmission of video with high data rate and with QoS. Transmission of JPEG2000 images using an unequal power allocation (UPA) scheme and orthogonal frequency division multiplexing (OFDM) is presented. It relies on using a wavelet transform that allows for an efficient unequal distribution of the transmission power among JPEG2000 coding units according to their contribution to the decoded image quality. In the proposed system, the JPEG2000 stream is divided into a certain number of packet groups and each group is transmitted through a separate sub-channel at a different rate and power. By using the UPA scheme and the OFDM technique, transmission of JPEG2000 images over a block-fading, frequency-selective channel is presented. Power is assigned to each bit in the JPEG2000 bit stream using instantaneous and average channel state information, based on its contribution to the decoded image quality. Moreover, the total power and the consumed power for transmission are measured and compared.
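    The unequal power allocation idea lends itself to a small illustration. The sketch below is not the authors' implementation; the number of packet groups and the quality weights are hypothetical placeholders. It simply splits a normalized power budget across packet groups in proportion to their assumed contribution to decoded image quality:

        import numpy as np

        # Hypothetical distortion-reduction weights: earlier JPEG2000 packet
        # groups matter more for decoded image quality than later ones.
        weights = np.array([0.45, 0.25, 0.15, 0.10, 0.05])
        total_power = 1.0  # normalized transmit power budget

        # Unequal power allocation: share the budget in proportion to each
        # group's contribution to decoded image quality.
        power_per_group = total_power * weights / weights.sum()

        for i, p in enumerate(power_per_group):
            print(f"packet group {i}: power = {p:.3f}")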
  • Performance Analysis of Self-Organizing Neural Network-Based Clustering
    Kadijath Thahira, Jovita Vani Sequeira, Sameema, Zahid Ansari, P.A. College of Engineering, India
    ABSTRACT
    Data mining and knowledge discovery in databases have been attracting a significant amount of research, industry, and media attention. Data mining is the process of “extracting” or “mining” knowledge from large amounts of data. We can perform data analysis, classification, clustering, etc. of huge data by using different algorithms. It is important to evaluate the performance of various clustering techniques because the application of different clustering techniques generally results in different sets of cluster formation. The performance can be evaluated in terms of accuracy and validity of the clusters, and also the time required to generate them, using appropriate performance measures. In this paper, we have analysed the performance of Self-Organizing neural network-based clustering and k-Means clustering using the Matrix Laboratory tool, MATLAB. These techniques are tested against various datasets. Finally, their performance results are compared and presented.
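    A minimal sketch of this kind of comparison, in Python rather than the authors' MATLAB code, assuming a synthetic dataset; the SOM here is a deliberately tiny hand-rolled one-dimensional map, not a full SOM toolbox:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.datasets import make_blobs

        X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

        # k-means baseline
        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

        # Minimal 1-D self-organizing map: 3 units, neighbourhood updates.
        rng = np.random.default_rng(0)
        som = rng.normal(size=(3, X.shape[1]))
        for t in range(2000):
            x = X[rng.integers(len(X))]
            winner = np.argmin(np.linalg.norm(som - x, axis=1))
            lr = 0.5 * (1 - t / 2000)          # decaying learning rate
            for j in range(3):                 # influence falls off with distance
                h = np.exp(-((j - winner) ** 2) / 2.0)
                som[j] += lr * h * (x - som[j])

        som_labels = np.argmin(np.linalg.norm(X[:, None] - som, axis=2), axis=1)
        print("k-means inertia:", km.inertia_)
        print("SOM cluster sizes:", np.bincount(som_labels))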
  • A Systematic Hybrid Approach in building an Effective Answer Retrieval System
    Kiran B. Malagi, Jaya Pujar, Karuna C. Gull and Akshata B. Angadi, K.L.E. Institute of Technology, India
    ABSTRACT
    Google is one of the top-ranked search engines; for a query asked by a user, it returns results as a list of URLs/websites, using the PageRank algorithm. Finding answers from those websites is time consuming, as n pages and n website listings are displayed based on the factors the engine considers. The question is: can we trust the content displayed by a website based on its number of hits/occurrences alone? No. Taking this as our initial question, we propose a framework that helps the user get the answer quickly and trust it without a second thought. The paper presents an algorithm for rating the trustworthiness of a website based on ‘n’ factors. A survey in the paper details the work done by other scholars. The novel approach in the paper encourages people to think in a different way and to implement a system that gives a better experience to users. What does a user need? A quick answer and a trustworthy result. Concentrating on these two requirements, the paper frames a model that supports relevant access to content. It is a novel retrieval technique in which the system interacts with the user to rephrase a question in a better way if the system is unable to understand it. The system needs to be trained before the query is posed. The approach combines the concepts of framing the question and finding trustworthiness; this hybrid approach builds an effective and healthy move towards the development of search engines.
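    As a toy illustration of factor-based trust scoring (the factor names and weights below are hypothetical, not the paper's actual ‘n’ factors), a weighted combination clipped to [0, 1] might look like this:

        # Hypothetical trust-scoring sketch: combine site-level factors.
        FACTOR_WEIGHTS = {
            "domain_age": 0.2,
            "https": 0.15,
            "citation_count": 0.3,
            "user_ratings": 0.25,
            "spam_reports": -0.2,   # negative evidence lowers trust
        }

        def trust_score(site_factors: dict) -> float:
            """Weighted sum of normalized factors, clipped to [0, 1]."""
            score = sum(FACTOR_WEIGHTS[k] * v for k, v in site_factors.items()
                        if k in FACTOR_WEIGHTS)
            return max(0.0, min(1.0, score))

        print(trust_score({"domain_age": 0.9, "https": 1.0,
                           "citation_count": 0.7, "user_ratings": 0.8,
                           "spam_reports": 0.1}))   # prints 0.72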
  • Efficient Extraction and Alignment from Web Data Bases
    Saminathan. P, Raghuveera. T, CEG Anna University, India
    ABSTRACT
    Data extraction is the act or process of retrieving data out of data sources. Typical unstructured data sources include web pages, emails, documents, PDFs, scanned text, mainframe reports, spool files, etc. Extracting data from these unstructured sources has grown into a considerable technical challenge. This project proposes a new method called Combining Tag and Value Similarity (CTVS) that combines both tag and value similarity. CTVS automatically extracts data from query result pages by first identifying and segmenting the query result records (QRRs) in the query result pages and then aligning the segmented QRRs into a table, in which the data values from the same attribute are put into the same column. The CTVS method is specifically proposed to handle the case where QRRs are non-contiguous and to handle the nested structure that may be present in the QRRs. CTVS also includes a new record alignment algorithm that aligns the attributes in a record, first pairwise and then holistically, by combining the tag and data value similarity information.
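    A rough sketch of the tag-plus-value similarity idea (the similarity measures and the 0.5 blending weight are illustrative stand-ins, not the paper's exact definitions):

        from difflib import SequenceMatcher

        def tag_similarity(tags_a, tags_b):
            """Fraction of positions where the HTML tags match."""
            matches = sum(a == b for a, b in zip(tags_a, tags_b))
            return matches / max(len(tags_a), len(tags_b))

        def value_similarity(val_a, val_b):
            """String similarity of the data values themselves."""
            return SequenceMatcher(None, val_a, val_b).ratio()

        def combined_similarity(rec_a, rec_b, alpha=0.5):
            """Blend of tag and value similarity, in the spirit of CTVS."""
            t = tag_similarity([f[0] for f in rec_a], [f[0] for f in rec_b])
            v = sum(value_similarity(a[1], b[1]) for a, b in zip(rec_a, rec_b)) \
                / max(len(rec_a), len(rec_b))
            return alpha * t + (1 - alpha) * v

        # Records as (tag, text_value) pairs extracted from a result page.
        r1 = [("td", "Data Mining Concepts"), ("td", "$45.00")]
        r2 = [("td", "Data Mining Techniques"), ("td", "$39.99")]
        print(combined_similarity(r1, r2))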
  • Fuzzy Technique for Software Development Test Effort Estimation
    Vishal Choudhary, Jaipur National University, India
    ABSTRACT
    Software has become more complex and its size has increased many-fold with additional functionality enhancements; in the same way, the complexity of testing as well as the types of testing have changed, and new testing tools have become available in the software industry. As a result, there is a major shift in the estimation of testing effort. In software engineering research, various models have been devised, and work is still in progress, to enhance the accuracy of testing tools. In this paper I use a fuzzy model for effort estimation in software development. I take approximate data and devise a model for effort estimation in software development on various platforms. I also use fuzzy test cases for login on different browsers.
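    The core fuzzy machinery can be sketched in a few lines. The membership ranges, rule set and effort centroids below are hypothetical placeholders, not the paper's calibrated model:

        def trimf(x, a, b, c):
            """Triangular membership function over [a, c], peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # Fuzzify a hypothetical input: software size in KLOC.
        size = 32.0
        size_small  = trimf(size, 0, 5, 25)
        size_medium = trimf(size, 15, 40, 65)
        size_large  = trimf(size, 50, 75, 100)

        # One illustrative Mamdani-style rule set mapping size to effort
        # (person-months); the centroids are placeholders.
        effort_centroids = {"low": 10, "medium": 40, "high": 90}
        weights = {"low": size_small, "medium": size_medium, "high": size_large}
        num = sum(w * effort_centroids[k] for k, w in weights.items())
        den = sum(weights.values()) or 1.0
        print(f"defuzzified effort estimate: {num / den:.1f} person-months")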
  • Design & Evaluation of Hybrid Decision Process of Reengineering for Software Maintenance
    Shashank Sharma, Sumit Srivastava, Manipal University, India
    ABSTRACT
    In present times, software reengineering has become an important domain of research to increase the shelf life of legacy systems. The major objective of reengineering revolves around reducing the cost of investment in Information Technology (IT) infrastructure by reducing the maintenance cost and capitalizing on the existing IT infrastructure. This can be achieved by making the system more adaptable to changing requirements. The decision to reengineer a system is quite challenging, as one has to choose between investing in a new system or in the legacy system. Further, the cost of reengineering is not by itself a decisive parameter for taking up reengineering. A better approach is finding the return on investment (ROI). The ROI of reengineering a system is difficult to calculate, as one has to assume the cost of the project, and the project also depends on the requirements of the reengineering. This paper gives a generalized approach towards decision making for reengineering a legacy system. It presents a requirement-specific approach to cost estimation and proposes an ROI computation for the assessment of reengineering.
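    For orientation, the basic ROI arithmetic the decision rests on can be written directly (the figures below are invented for illustration; the paper's model additionally ties costs to specific reengineering requirements):

        def reengineering_roi(maintenance_saving_per_year: float,
                              years: float,
                              reengineering_cost: float) -> float:
            """Classic ROI: (total benefit - cost) / cost."""
            benefit = maintenance_saving_per_year * years
            return (benefit - reengineering_cost) / reengineering_cost

        # Hypothetical figures: save 120k/year in maintenance for 5 years
        # after a 400k reengineering investment.
        print(f"ROI = {reengineering_roi(120_000, 5, 400_000):.2f}")  # 0.50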
  • Frequency Measurements in Numeric Relay
    Prathibha and Sharath Kumar M D, SJCE, India
    ABSTRACT
    Frequency is one of the most important parameters in power system operation because it reflects the dynamic energy balance between load and generating power. So frequency is always regarded as an index of operating practices, and utilities can assess the system energy balance by observing frequency variations. Frequency may vary very fast during transient events, such that it is difficult to track accurately. Modern power systems are prone to harmonics and noise, hence developing a reliable method that can measure frequency in the presence of harmonics and noise is essential. With the advent of the microprocessor, more and more microprocessor-based equipment has been extensively used in frequency relays. Using such equipment is known to provide accurate, fast-responding, economic and flexible solutions to measurement problems. Finding the best algorithm for the implementation is a challenging job. Therefore, in this work, a precise digital algorithm based on the recursive Discrete Fourier Transform (DFT) is proposed to measure the frequency of a sinusoidal signal in the presence of harmonics, and it is compared with the Zero Crossing Detection (ZCD) technique to validate the claimed benefits of the DFT. The proposed algorithm smartly detects the errors that arise when the frequency deviates from the applied frequency. The simulation is done using MATLAB, and the simulated program for the proposed algorithm is used for hardware implementation on a TMS DSP processor.
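    To make the recursive-DFT idea concrete, here is a minimal numeric sketch (in Python rather than the authors' MATLAB/DSP code, with the window size, harmonic content and frequencies chosen arbitrarily): the sliding-window DFT phasor of the fundamental rotates at a rate proportional to the deviation from the nominal frequency, so its phase advance over one nominal cycle yields a frequency estimate.

        import numpy as np

        f0, N = 50.0, 32          # nominal frequency (Hz), samples per nominal cycle
        fs = f0 * N               # sampling rate locked to the nominal frequency
        f_true = 50.5             # actual system frequency to be measured

        n = np.arange(4 * N)
        x = np.cos(2 * np.pi * f_true * n / fs) \
            + 0.2 * np.cos(2 * np.pi * 3 * f_true * n / fs)  # add a 3rd harmonic

        # Recursive sliding-window DFT of the fundamental bin:
        #   X[k] = X[k-1] + (x[k] - x[k-N]) * exp(-j*2*pi*k/N)
        X = np.zeros(len(n), dtype=complex)
        for k in range(len(n)):
            x_old = x[k - N] if k >= N else 0.0
            prev = X[k - 1] if k > 0 else 0.0
            X[k] = prev + (x[k] - x_old) * np.exp(-2j * np.pi * k / N)

        # Once the window is full, the phasor rotates at (w - w0) rad/sample, so
        # the phase advance over one nominal cycle reveals the deviation.
        dphi = np.angle(X[-1] / X[-1 - N])
        f_est = f0 + dphi * fs / (2 * np.pi * N)
        print(f"estimated frequency: {f_est:.3f} Hz")  # close to 50.5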
  • Cloud Computing and Related Security Issues
    Dhruwajita Devi, Shri Madhwa Vadiraja Institute of Technology and Management, India
    ABSTRACT
    Cloud computing has become part and parcel of modern computing environments, and it is an increasingly popular paradigm. Modern society has become reliant on the cloud because of the services and facilities it offers. Surely, in the near future, it will be the prime technology for accessing the Internet because of all its advantages. However, despite all these features, there have always been security issues with evolving technologies. In this paper, the author focuses on some of them in light of previous related work. We must fix the security flaws in order to protect our confidential and sensitive data from unauthorized access.
  • Automated Model in the Loop for Embedded Systems Testing
    Samaa A. Abdel Samie, Ain Shams University, Egypt
    ABSTRACT
    At present there is a new trend in the embedded systems industry towards model-driven engineering. Software components are no longer handwritten in C or Assembler code but modelled with MATLAB/Simulink™, Statemate, or similar tools. However, quality assurance of model-driven engineering, especially testing, is still poorly supported. Many development projects require the creation of expensive proprietary testing solutions. Due to the complex context of embedded systems, defects can cause life-threatening situations. Delays can create huge costs, and insufficient productivity can impact the entire industry. The rapid evolution of software engineering technologies will be a key factor in the successful future development of even more complex embedded systems. Nowadays, Model-Driven Engineering has become a promising approach. Instead of directly coding the software using programming languages, developers model software systems using expressive, graphical notations, which provide a higher abstraction level than programming languages. Model-based testing will likewise help identify problems early and thus reduce rework cost. Applying tests based on the designed models not only enables early detection of defects, but also continuous quality assurance: testing can start in the first iteration of the development process. In this paper, we present an approach to functional black-box testing based on the system’s model. It provides a test model that is executed on the system model itself. The idea is to validate the software model itself, as it is the base that will be used for code generation later on. This contrasts with current approaches, which extract the test model from the software model, so that defects introduced into the software model during development continue to appear in the test cases, and later in the code, because the model itself was not validated. The proposed approach is called Automated Model-in-the-Loop for Embedded Systems Testing (AMiLEST).
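    The "test model executed on the system model" idea can be illustrated with a toy Python stand-in for what would in practice be a Simulink-style model; the plant, stimulus and thresholds here are invented:

        # Toy "model in the loop": a system model (saturating accumulator)
        # exercised directly by a test model of stimuli and expected properties.

        def system_model(inputs, limit=10.0):
            """Simplistic system model: accumulator with output saturation."""
            state, outputs = 0.0, []
            for u in inputs:
                state = max(-limit, min(limit, state + u))
                outputs.append(state)
            return outputs

        def test_model():
            """Test model executed against the system model itself."""
            stimulus = [1.0] * 15          # step input, long enough to saturate
            response = system_model(stimulus)
            assert all(y <= 10.0 for y in response), "saturation limit violated"
            assert response[-1] == 10.0, "output should settle at the limit"
            print("model-in-the-loop checks passed")

        test_model()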
  • No SQL Data Base and Comparison with Relational Data Base
    E. Bangoli, F. Karimi, Islamic Azad University, Iran
    ABSTRACT
    Generally, relational databases are used in most applications in order to store and retrieve data and information. These databases function efficiently when dealing with confined collections of data. However, investigations have shown that rapid access to very large volumes of data, such as on the Internet, would be inefficient and of little profit with a relational database. So, in order to remove this problem, the NoSQL, or Not Only SQL, database has been developed. The aim of this paper is therefore to survey the problems of relational databases and to present the NoSQL database and its structure. It also investigates all kinds of NoSQL databases, their advantages, disadvantages and differences with respect to relational databases, and several techniques used in them.
  • Network Forensic Tool: Concept and Architecture
    Mrunal H. Mate and Smita Kapse, Y.C.C.E, India
    ABSTRACT
    Network forensics is a branch of digital forensics concerned with the monitoring and analysis of computer network traffic for the purposes of information gathering, legal evidence against illegal activity, or intrusion detection in the network. Unlike other areas of digital forensics, network investigations deal with volatile and dynamic data: network traffic is transmitted and then lost, so network forensics is often a proactive investigation. This project is intended to deliver a tool designed from the perspective of network investigation. It also helps in law enforcement investigations. The Network Forensic Tool has a dedicated analysis infrastructure that permits monitoring and analysis for investigative purposes. The tool captures packets within the network in order to recognize hosts, open ports, sessions, etc. without putting traffic on the network. It parses PCAP files for offline analysis, and also regenerates and reassembles transmitted files and web pages.
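    A minimal offline triage sketch in this spirit (assuming the third-party scapy package and a placeholder capture.pcap file; this is not the tool's actual code):

        from collections import Counter
        from scapy.all import rdpcap, IP, TCP

        packets = rdpcap("capture.pcap")   # placeholder capture file
        hosts, ports = Counter(), Counter()

        for pkt in packets:
            if IP in pkt:
                hosts[pkt[IP].src] += 1        # who is talking
            if TCP in pkt:
                ports[pkt[TCP].dport] += 1     # which services are hit

        print("top hosts:", hosts.most_common(5))
        print("top destination ports:", ports.most_common(5))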
  • Design and Implementation of a Cloud Based Computer Forensic Tool
    Monali P. Mohite and S. B. Ardhapurkar, Y.C.C.E, India
    ABSTRACT
    Nowadays, cloud computing is receiving more and more attention from the information and communication technology industry. Driven by the demands of cloud users, digital forensics in cloud computing is a new area of study linked to the increasing use of computers, the internet and digital storage devices in numerous criminal activities, both traditional and hi-tech. Digital forensics involves handling, examining, analysing and documenting digital evidence for a court of law. A digital forensic tool for the cloud computing environment is in strong demand from forensic investigators. In the process of digital forensics, it is necessary to create an image of the original digital data without damage and to show that the computer evidence existed at the specific time. The evidence is then analyzed by the forensic investigator. After the evidence is examined, a report must be prepared so that it can be admitted as legally effective evidence in the court of law. To provide a digital forensics service in a cloud environment, a cloud-based computer forensic tool is proposed in this paper. To probe the evidence, multiple features are provided in this tool, such as data recovery, sorting, indexing, a hex viewer and data bookmarking.
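    One small but central step named above, imaging evidence and proving it unchanged, can be sketched as follows (SHA-256 hashing of a placeholder image file; not the proposed tool's implementation):

        import hashlib
        from datetime import datetime, timezone

        def hash_image(path, chunk_size=1 << 20):
            """Hash a disk image in chunks so later analysis can prove
            the evidence is unchanged."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    h.update(chunk)
            return h.hexdigest()

        digest = hash_image("evidence.dd")   # placeholder image path
        print(f"{datetime.now(timezone.utc).isoformat()}  SHA-256: {digest}")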
  • NLP Based Encryption Technique Using Artificial Intelligence
    Rajat Doshi and Prathamesh Durgude, University of Pune, India
    ABSTRACT
    Language is the most common mode of information exchange for all humans. But to keep secrets, people use different types of cipher techniques, which always make a third party more curious to decrypt the encrypted message. This is probably because the ciphertext is not directly understandable. This paper presents an approach to encrypting English statements into other meaningful English statements by changing the context of the text using Artificial Intelligence. Natural Language Processing based cryptography can be used to significantly enhance security for a variety of purposes. An experimental approach is proposed in the paper which aims to make natural language to natural language text encryption a grand success.
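    To see why ciphertext that reads as ordinary English is harder to spot, consider this deliberately crude keyed word-substitution toy; the lexicon is a placeholder, a real NLP-based scheme would preserve grammar and context, and the receiver regenerates the same keyed choices to invert the mapping:

        import random

        LEXICON = {
            "meeting": ["picnic", "rehearsal", "lecture"],
            "tonight": ["tomorrow", "sunday", "noon"],
            "cancel":  ["attend", "enjoy", "record"],
        }

        def encrypt(sentence: str, key: int) -> str:
            rng = random.Random(key)       # shared key seeds the substitutions
            out = []
            for word in sentence.lower().split():
                choices = LEXICON.get(word)
                out.append(rng.choice(choices) if choices else word)
            return " ".join(out)

        # The output still reads as a plausible English sentence.
        print(encrypt("cancel the meeting tonight", key=42))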
  • “m-ACO”: A novel Test Case Prioritization Approach
    Kamna Solanki, Yudhvir Singh, University Institute of Engineering and Technology (UIET), India
    ABSTRACT
    Software testing is a time- and cost-intensive process. A careful scrutiny of the code and rigorous testing are required to identify and rectify the putative bugs. The process of bug identification and its consequent correction is continuous in nature, and often some of the bugs are removed after the software has been launched in the market. This process of code validation of the altered software during the maintenance phase is termed regression testing. Regression testing ubiquitously involves resource constraints; therefore the deduction of an appropriate set of test cases (from the ensemble of the entire gamut of test cases) is a critical issue for regression test planning. This paper presents a novel method for designing a suitable prioritization process to optimize the fault detection rate and performance of the regression test with respect to predefined constraints. The proposed method for test case prioritization, "m-ACO", alters the food source selection criteria of natural ants and is basically a modified version of Ant Colony Optimization (ACO). The proposed m-ACO approach has been coded in the "Perl" language and the results are validated by computation of the Average Percentage of Faults Detected (APFD) metric.
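    The APFD metric used for validation has a standard closed form: APFD = 1 - (TF1 + ... + TFm)/(n*m) + 1/(2n), where n is the number of test cases, m the number of faults, and TFi the position of the first test that reveals fault i. A small sketch, with a made-up fault matrix, shows how prioritization raises it:

        def apfd(order, fault_matrix):
            """Average Percentage of Faults Detected for a test-case order.

            order: test-case indices in execution order
            fault_matrix: fault_matrix[t][f] is True if test t reveals fault f
            """
            n, m = len(order), len(fault_matrix[0])
            tf = []
            for f in range(m):
                # 1-based position of the first test in the order finding f
                tf.append(next(i + 1 for i, t in enumerate(order)
                               if fault_matrix[t][f]))
            return 1 - sum(tf) / (n * m) + 1 / (2 * n)

        # 3 tests x 2 faults; the prioritized order finds both faults sooner.
        faults = [[True, False], [False, True], [True, True]]
        print(apfd([0, 1, 2], faults))   # original order:    ~0.667
        print(apfd([2, 0, 1], faults))   # prioritized order: ~0.833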
  • Study of Various Test Suite Prioritization Techniques
    Sandeep Dalal, M.D. University, India
    ABSTRACT
    Testing is a means of making sure that the product meets the needs of the customer. Software testing is the process of exercising the software product in predefined ways to check whether its behavior matches the expected behavior. The purpose of testing is not to prove that the product has no defects; it is to find defects in the software product. The main objective of testing is to provide quality products to customers. Defects are like pests: testing is like designing the right pesticides to catch and kill them, and the test cases that are written are the pesticides. The purpose of regression testing of a system is to discover the entry of any new faults introduced while fixing previously existing faults. In software regression testing, one is often interested in judging how well a series of test inputs tests a piece of code. Test suite prioritization is one of the popular regression testing techniques. This paper systematically describes the various test case prioritization techniques developed so far.
  • A Hybrid Approach for VM Load Balancing in Cloud Using Cloud Analyst
    R. Gnanasundaram and S. Suresh, Adhiyamaan College of Engineering, India
    ABSTRACT
    In this work a hybrid approach is proposed for virtual machine-level load balancing using concepts from two classical load balancing algorithms: the Round Robin algorithm and the Throttled algorithm. It has been implemented for an IaaS framework in a simulated cloud computing environment and the results obtained were analyzed. The hybrid approach gave better results in terms of response time, data center request servicing time and data center processing time when compared with the results of the Round Robin algorithm, the Throttled algorithm and the Equally Spread Current Execution algorithm separately. The proposed hybrid algorithm was found to be efficient both when every request has the same data size and when data sizes differ per request.
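    A minimal sketch of how such a hybrid policy might combine the two (the names and the fallback rule are this sketch's assumptions, not the paper's exact algorithm): try the Throttled rule first, i.e. hand the request to an idle VM, and fall back to Round Robin when all VMs are busy:

        from itertools import cycle

        class HybridBalancer:
            def __init__(self, vm_ids):
                self.busy = {vm: False for vm in vm_ids}
                self.rr = cycle(vm_ids)

            def allocate(self):
                for vm, is_busy in self.busy.items():  # Throttled: first idle VM
                    if not is_busy:
                        self.busy[vm] = True
                        return vm
                return next(self.rr)                   # Round Robin fallback

            def release(self, vm):
                self.busy[vm] = False

        lb = HybridBalancer(["vm0", "vm1"])
        print([lb.allocate() for _ in range(4)])  # ['vm0', 'vm1', 'vm0', 'vm1']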