

Background Verification Service

A Dissertation submitted to

Jawaharlal Nehru Technological University, Hyderabad

In partial fulfillment of the requirements for the award of the degree of

BACHELOR OF TECHNOLOGY

In

COMPUTER SCIENCE AND ENGINEERING

BY

Mohd Faiyaz Ali

11N31A05B3

Under the esteemed guidance of

G Manoj Kumar

Assistant Professor

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

MALLA REDDY COLLEGE OF ENGINEERING AND

TECHNOLOGY

2011-2015


MALLA REDDY COLLEGE OF ENGINEERING AND TECHNOLOGY

(Sponsored by CMR Educational Society)

Affiliated to JNTU, Hyderabad

MAISAMMAGUDA, DHULAPALLY

SECUNDERABAD-500014

Phone: 040-23792146

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

CERTIFICATE

This is to certify that the dissertation entitled "Background Verification Service" is a bona fide work done by Mohd Faiyaz Ali bearing Roll No. 11N31A05B3 under my guidance and supervision, and is submitted to Jawaharlal Nehru Technological University, Hyderabad in partial fulfillment of the requirements for the award of Bachelor of Technology in Computer Science and Engineering during the academic year 2014-2015.

G Manoj Kumar                          M Saidhi Reddy
Assistant Professor                    Professor
(Internal Guide)                       Head of Department

External Examiner


DECLARATION

I hereby declare that the project entitled "Background Verification Service", submitted to Malla Reddy College of Engineering and Technology, affiliated to Jawaharlal Nehru Technological University, Hyderabad, for the award of the Degree of Bachelor of Technology in Computer Science and Engineering, is a result of original research work done by me. It is further declared that the project report, or any part thereof, has not been previously submitted to any University for the award of any degree or diploma.

With regards and gratitude

Mohd Faiyaz Ali


ACKNOWLEDGEMENT

It gives me great pleasure to express my heartfelt gratitude to all those who helped, encouraged and supported me in the successful completion of this project. I express sincere thanks and gratitude to my internal guide, G Manoj Kumar, Assistant Professor, Computer Science and Engineering Department, Malla Reddy College of Engineering and Technology, for his encouragement, support and guidance in carrying out the project.

I wish to convey my sincere thanks to Dr. V.S.K Reddy, Principal of Malla Reddy College of Engineering and Technology, and M Saidhi Reddy, HOD, CSE Dept., Malla Reddy College of Engineering and Technology, who directly or indirectly contributed their assistance in finishing this project successfully.

I take pride in forwarding my gratitude and wishes to my parents and friends, who have given me moral support and helped me towards the successful completion of the project.

With Regards and Gratitude,

Mohd Faiyaz Ali


ABSTRACT

To provide a solution integrating the employer and the immigration department for the purpose of locating an employee and verifying his background before recruitment. The application proposes to isolate the problems of fake certificates and employment details furnished by candidates. In the present scenario, the concerned employer authorizes legal consultants to provide information about the candidate and verify his credibility based on the certificates submitted. The consultant then makes calls, and may additionally send officers to the specified previous employer to verify the authenticity of the candidate. The status of the investigation is then reported to the organization for it to take a decision on appointment or refusal. The proposed system ensures that the above tasks can be easily performed by developing web applications for the organization and the immigration department, integrating them to generate the candidate's previous employment details. The system requires every organization to be compulsorily registered with the immigration dept. The immigration department then creates the database for the company dynamically. Whenever the company recruits an employee, the data is also updated to the immigration department for verification. The status of the verification is generated against the candidate's name, and a detailed report on his employment and activities related to all the companies is made available to the recruiting organization. If the candidate is recruited, his new information is updated to the immigration dept. A powerful search engine is designed to locate the employment details of an employee. Additionally, company information can also be tracked. The site also assists in identifying foreign nationals employed in the country.


INDEX

Topic Pg.no

Title page

College certificate

Declaration………………………………………………………………i

Acknowledgement………………………………………………………ii

Abstract…………………………………………………………………iii

CONTENTS

Chapter 1: Introduction

1.1 Existing System………………………………………………………………...1

1.2 Proposed System………………………………………………………………..2

1.3 Software Requirements………………………………………………………...2

1.4 Hardware Requirements……………………………………………………….2

1.5 Modules……………………………………………………………………….…3

1.6 Process Diagram………………………………………………………………...3

Chapter 2: System Analysis

2.1 Feasibility Study……………………………………………………………..4

2.1.1 Technical feasibility…………………………………………………...5

2.1.2 Economical Feasibility………………………………………………...5

Page 7: Background Verification Service

2.1.3 Operational Feasibility…………………………………………………6

2.2 System Analysis……………………………………………………………..6

Chapter 3: System Requirement Specification

3.1 Introduction…………………………………………………………………..10

3.2 Functional Requirements…………………………………………………....11

3.3 Non-Functional Requirements…………………………………………………12

Chapter 4: System Design

4.1 Introduction……………………………………………………………………..13

4.2 High-level Design……………………………………………………………….13

All UML Diagrams

Use case Diagram……………………………………………………………………16

Sequence Diagram …………………………………………………………...…….17

Class Diagram …………………………………………………………………...…18

Object Diagram....…………………………………………………………………..19

State Chart Diagram..……………………………………………………………....20

Activity Diagram …………………………………………………………………...21

Component Diagram ……………………………………………………………….22

Deployment Diagram …………………………………………………..…………..22

Chapter 5: Coding

5.1 Software’s Description (Technology Description)……………………….…....23

5.2 Coding…………………………………………………………………………....30


Chapter 6: Testing………………………………………………………………….....38

Chapter 7: Screen Shots……………………………………………………………....43

Chapter 8: Conclusion……………………………………………………………......50

Chapter 9: Bibliography………………………………………………………….......51


Chapter 1: Introduction

The application proposes to isolate the problems of fake certificates and employment details furnished by candidates. In the present scenario, the concerned employer authorizes legal consultants to provide information about the candidate and verify his credibility based on the certificates submitted. The consultant then makes calls, and may additionally send officers to the specified previous employer to verify the authenticity of the candidate. The status of the investigation is then reported to the organization for it to take a decision on appointment or refusal.

The proposed system ensures that the above tasks can be easily performed by developing web applications for the organization and the immigration department, integrating them to generate the candidate's previous employment details.

The system requires every organization to be compulsorily registered with the immigration dept. The immigration department then creates the database for the company dynamically. Whenever the company recruits an employee, the data is also updated to the immigration department for verification. The status of the verification is generated against the candidate's name, and a detailed report on his employment and activities related to all the companies is made available to the recruiting organization. If the candidate is recruited, his new information is updated to the immigration dept.

A powerful search engine is designed to locate employment details of an employee.

Additionally, company information can also be tracked. The site also assists in identifying foreign nationals employed in the country.

1.1 Existing System:

In the present scenario, though the companies are registered with the registrar of companies, their functionality is confined to that company only. The companies are always recruiting candidates, both freshers and experienced. The candidates are recruited, and then the enquiry is made by third-party services. If a candidate is found to be fake, the company has to decide whether to remove or keep him.

Problems with the existing system:

The system performance depends upon manual efficiency.

Enquiry about the information of experienced candidates is obtained late.

The achievements of a company cannot be known to anyone, or only to a very few. The candidates generally don't know the working nature of the company.

A lot of money is spent on enquiring about candidates' experience.

No proper control over the candidate or employee working for the company.

Current System Disadvantages:

Manual system that takes time to report.

Third-party organizations are involved, and hence payment is involved.

Difficult to trace non-existent or unregistered companies.


Communication for verification must be either in person, or by telephone, fax or email.

Candidates can fake certificates and contact points.

1.2 Proposed System:

A fully automated online system that concentrates on the proper maintenance of the companies and the employees working in them. The application is split into two modules and a search engine, by using which coordination can be established between the companies and the integrated services to maintain information about the employees. Before being recruited, employees are enquired about their experience through the privilege given to the company. Each employee is assigned a unique SSN (globally accessible), by using which the required information about the specified candidate can be known. On verification of the candidate's experience, he can be recruited into the company. The achievements made by the company are to be specified from time to time to show its working standards in the market, which can be viewed by the integrated services and by the candidates using the search engine.

The integrated services also maintain information about the companies by registering them with proper information. Their working nature and achievements can be viewed through this service.

Proposed System Advantages:

A system that is automated, fully functional and web supportive.

It necessitates the requirement of an immigration agency.

Companies, whether already established or new, have to register with the immigration dept.

Each registered company is provided with an ID.

A search engine to facilitate report generation on verification input.

Recruitment and termination information with precision and accuracy.

Isolates fake certifications and establishes genuineness.
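The registration, recruitment-update and SSN-keyed verification flow described above can be sketched as follows. This is an illustrative Python sketch with hypothetical names (the project itself is built on ASP.NET with C#), not the project's actual code:

```python
# Illustrative sketch (hypothetical names; the project itself is an ASP.NET/C#
# web application): an in-memory model of the proposed flow -- company
# registration, recruitment/termination events mirrored to the immigration
# dept., and SSN-keyed verification lookups.
from dataclasses import dataclass, field

@dataclass
class ImmigrationDept:
    companies: dict = field(default_factory=dict)  # company_id -> company name
    records: dict = field(default_factory=dict)    # ssn -> [(company_id, status)]

    def register_company(self, name):
        # Every organization must register; each gets an ID.
        company_id = f"C{len(self.companies) + 1:03d}"
        self.companies[company_id] = name
        return company_id

    def record(self, ssn, company_id, status):
        # Recruitment/termination updates are mirrored here for verification.
        self.records.setdefault(ssn, []).append((company_id, status))

    def verify(self, ssn):
        # A recruiting organization retrieves a candidate's history by SSN.
        return [(self.companies[cid], status)
                for cid, status in self.records.get(ssn, [])]

dept = ImmigrationDept()
acme = dept.register_company("Acme Corp")
dept.record("123-45-6789", acme, "recruited")
dept.record("123-45-6789", acme, "terminated")
print(dept.verify("123-45-6789"))
# [('Acme Corp', 'recruited'), ('Acme Corp', 'terminated')]
```

The point of the sketch is only the data flow: every recruitment or termination event is mirrored to the immigration department, so a later verification query needs nothing from the individual companies.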

1.3 Software Requirements:

Content       Description
OS            Windows XP with SP2 or Windows Vista
Database      MS-SQL Server 2008
Technologies  ASP.NET with C#.NET
IDE           Visual Studio .NET 2008
Browser       Internet Explorer

1.4 Hardware Requirements:

Content          Description
Processor        Pentium IV
Hard Disk Drive  20 GB minimum, 40 GB recommended
RAM              1 GB minimum, 2 GB recommended


1.5 Modules:

Login & Security:

The module ensures that the application is used only by authenticated users. Validations are performed against the database before the user can sign in.
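As a hedged illustration of the sign-in validation this module describes (the real application validates against MS-SQL via ASP.NET/C#), the sketch below checks credentials against an in-memory table; all names are hypothetical, and storing salted password hashes rather than plain text is an assumption beyond the original text, though standard practice:

```python
# Illustrative sketch (hypothetical names, not the project's actual code):
# the Login & Security check, with passwords stored as salted hashes.
import hashlib
import os

users = {}  # username -> (salt, password_hash); stands in for the user table

def add_user(username, password):
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    users[username] = (salt, digest)

def sign_in(username, password):
    if username not in users:      # unknown user: reject
        return False
    salt, digest = users[username]
    attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return attempt == digest       # authenticated only on an exact match

add_user("acme_hr", "s3cret")
print(sign_in("acme_hr", "s3cret"))  # True
print(sign_in("acme_hr", "wrong"))   # False
```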

Company Registration:

The module allows companies to register in order to use the services of the immigration dept.

The registered companies are provided with login and password to use the services.

Employee Recruitment & Termination:

The module tracks the employment and termination of employees of the company. Whenever an

employee is recruited or terminated it is reflected on both the companies and the immigration

dept.

Search Engine:

The module ensures that an experienced employee's previous employment details are verified with the immigration department for authenticity. The engine reports similarity and dissimilarity if found. Useful before the employee is recruited.
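The similarity/dissimilarity report can be sketched as a set comparison between the details a candidate claims and the immigration department's records. This is an illustrative Python sketch with hypothetical data shapes, not the project's C# implementation:

```python
# Illustrative sketch (hypothetical data shapes): comparing the employment
# details a candidate claims against the immigration department's records,
# and reporting similarity and dissimilarity as the module describes.
def compare_claims(claimed, recorded):
    """Both arguments are sets of (company, years_of_experience) tuples."""
    return {
        "similar": sorted(claimed & recorded),     # claims confirmed by records
        "dissimilar": sorted(claimed - recorded),  # claims with no matching record
    }

claimed = {("Acme Corp", 3), ("Globex", 2)}
recorded = {("Acme Corp", 3)}
report = compare_claims(claimed, recorded)
print(report["similar"])     # [('Acme Corp', 3)]
print(report["dissimilar"])  # [('Globex', 2)]
```

A non-empty "dissimilar" list is exactly the case where the recruiting organization should investigate before appointment.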

Immigration Maintenance:

The module deals with the immigration department's services. The maintained data can be modified, deleted or viewed whenever necessary.

Report Generation:

The module generates reports across the immigration, employee and verification procedures.

1.6 Process Diagrams:

[Figure: Admin process - the Admin views company info, views employee info, views achievements and sets suggestions.]

[Figure: Customer process - the Company maintains employee info and the enquiry form, sets achievements, handles promotion/demotion and views suggestions.]


Chapter 2: System Analysis

Background:

After analyzing the requirements of the task to be performed, the next step is to analyze the problem and understand its context. The first activity in this phase is studying the existing system; the other is understanding the requirements and domain of the new system. Both activities are equally important, but the first serves as a basis for giving the functional specifications and then the successful design of the proposed system. Understanding the properties and requirements of a new system is difficult and requires creative thinking; understanding the existing running system is also difficult, and improper understanding of the present system can lead to diversion from the solution.

System Details:


For the flexibility of the user, the interface has been developed with a graphical concept in mind, accessed through a browser interface. The GUIs at the top level have been categorized as:

The administrative user interface

The operational or generic user interface

The administrative user interface concentrates on the consistent information that is practically part of the organizational activities and which needs proper authentication for data collection. The interfaces help the administrators with all the transactional states like data insertion, data deletion and data updating, along with extensive data search capabilities.

The operational or generic user interface helps the users of the system in transactions through the existing data and required services. The operational user interface also helps the ordinary users in managing their own information in a customized manner, as per the assisted flexibilities.

2.1 Feasibility Study:

A feasibility study is a high-level capsule version of the entire process, intended to answer a number of questions like: What is the problem? Is there any feasible solution to the given problem? Is the problem even worth solving? The feasibility study is conducted once the problem is clearly understood. It is necessary to determine that the proposed system is feasible by considering the technical, operational and economical factors. By having a detailed feasibility study, the management will have a clear-cut view of the proposed system.


The following feasibilities are considered for the project in order to ensure that the project is viable and does not have any major obstructions. The feasibility study encompasses the following:

Technical Feasibility

Economical Feasibility

Operational Feasibility

In this phase, we study the feasibility of all proposed systems, and pick the best feasible solution

for the problem. The feasibility is studied based on three main factors as follows.

2.1.1 Technical Feasibility:

In this step, we verify whether the proposed systems are technically feasible or not, i.e., whether all the technologies required to develop the system are readily available. Technical feasibility determines whether the organization has the technology and skills necessary to carry out the project, and how these should be obtained. The system is feasible on the following grounds:

All necessary technology exists to develop the system.

The system is highly flexible and can be expanded further.

The system can guarantee accuracy, ease of use, reliability and data security.

The system can give instant responses to inquiries.

Our project is technically feasible because all the technology needed for it is readily available.

Front End : ASP.NET with C#

Back End : MS SQL Server 2008

Web-Server : IIS 5.0

Host : Windows-XP

2.1.2 Economical Feasibility:

In this step, we verify which proposal is more economical. We compare the financial benefits of the new system with the investment. The new system is economically feasible only when the financial benefits exceed the investment and expenditure. Economic feasibility determines whether the project goal can be achieved within the resource limits allocated to it. It must determine whether it is worthwhile to proceed with the entire project, or whether the benefits obtained from the new system are not worth the costs. Financial benefits must equal or exceed the costs. In this issue, we should consider:

The cost of conducting a full system investigation.

The cost of hardware and software for the class of application being considered.

The development tools.

The cost of maintenance, etc.


Our project is economically feasible because the cost of development is minimal when compared to the financial benefits of the application.

2.1.3 Operational Feasibility:

In this step, we verify different operational factors of the proposed systems, like manpower, time, etc.; whichever solution uses fewer operational resources is the best operationally feasible solution. The solution should also be operationally possible to implement. Operational feasibility determines whether the proposed system satisfies user objectives and can be fitted into the current system operation. The present system can be justified as operationally feasible on the following grounds:

The methods of processing and presentation are completely accepted by the clients, since they meet all user requirements.

The clients have been involved in the planning and development of the system.

The proposed system will not cause any problem under any circumstances.

Our project is operationally feasible because the time and personnel requirements are satisfied. We are a team of four members and we worked on this project for three working months.

2.2 System Analysis:

Analysis Model:

The model being followed is the WATERFALL MODEL, which states that the phases are organized in a linear order. First of all, the feasibility study is done. Once that part is over, the requirement analysis and project planning begin. If a system already exists and modification or addition of new modules is needed, analysis of the present system can be used as a basic model.

The design starts after the requirement analysis is complete, and the coding begins after the design is complete. Once the programming is completed, the testing is done. In this model, the sequence of activities performed in a software development project is: requirement analysis, project planning, system design, detailed design, coding, unit testing, and system integration and testing.

Here the linear ordering of these activities is critical: the output of one phase is the input of the next, and the output of each phase is to be consistent with the overall requirements of the system. Some qualities of the spiral model are also incorporated, in that the people concerned with the project review the work done at the completion of each phase. The Waterfall Model was chosen because all requirements were known beforehand, and the objective of our software development is the computerization/automation of an already existing manual working system.


[Figure: Waterfall Model]

Architectural Flow:

N-Tier Applications:

N-Tier Applications can easily implement the concepts of Distributed Application Design

and Architecture. The N-Tier Applications provide strategic benefits to Enterprise Solutions.

While 2-tier client-server applications can help us create quick and easy solutions and may be used for rapid prototyping, they can easily become a maintenance and security nightmare. N-tier applications provide specific advantages that are vital to the business continuity of the enterprise. Typical features of a real-life n-tier application may include the following:

Security

Availability and Scalability

Manageability

Easy Maintenance

Data Abstraction

The above mentioned points are some of the key design goals of a successful n-tier application

that intends to provide a good Business Solution.

Definition:

Simply stated, an n-tier application helps us distribute the overall functionality into various

tiers or layers:

Presentation Layer

Business Rules Layer

Data Access Layer

Each layer can be developed independently of the others, provided that it adheres to the standards and communicates with the other layers as per the specifications. This is one of the biggest advantages of the n-tier application. Each layer can potentially treat the other layers as a 'black box'. In other words, each layer does not care how another layer processes the data, as long as it sends the right data in the correct format.
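The 'black box' separation of the three layers can be sketched as follows. This is an illustrative Python sketch with hypothetical classes (the project's layers are built in ASP.NET/C# over MS-SQL), not the project's code:

```python
# Illustrative sketch (hypothetical classes): three layers that communicate
# only through narrow interfaces, so each can change internally without
# impacting the others.

class DataAccessLayer:
    """Hides storage details; a dict stands in for the database tables."""
    def __init__(self):
        self._rows = {}
    def save(self, ssn, record):
        self._rows[ssn] = record
    def load(self, ssn):
        return self._rows.get(ssn)

class BusinessRulesLayer:
    """Encapsulates business rules; knows nothing about storage or display."""
    def __init__(self, dal):
        self._dal = dal
    def register_employee(self, ssn, company):
        if self._dal.load(ssn) is not None:
            raise ValueError("SSN already registered")  # a business rule
        self._dal.save(ssn, {"company": company})
    def lookup(self, ssn):
        return self._dal.load(ssn)

class PresentationLayer:
    """Formats results for the user; knows nothing about rules or storage."""
    def __init__(self, bll):
        self._bll = bll
    def show(self, ssn):
        rec = self._bll.lookup(ssn)
        return f"{ssn}: {rec['company']}" if rec else f"{ssn}: not found"

dal = DataAccessLayer()
bll = BusinessRulesLayer(dal)
ui = PresentationLayer(bll)
bll.register_employee("123-45-6789", "Acme Corp")
print(ui.show("123-45-6789"))  # 123-45-6789: Acme Corp
print(ui.show("999-99-9999"))  # 999-99-9999: not found
```

Swapping the dict in DataAccessLayer for real database calls would leave the other two layers untouched, which is exactly the maintainability argument made above.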


The Presentation Layer:

Also called the client layer, this comprises the components dedicated to presenting the data to the user. For example: Windows/Web Forms and buttons, edit boxes, text boxes, labels, grids, etc.

The Business Rules Layer:

This layer encapsulates the business rules or business logic of the application. Having a separate layer for business logic is a great advantage, because any changes in business rules can be easily handled in this layer. As long as the interface between the layers remains the same, any changes to the functionality/processing logic in this layer can be made without impacting the others. Many client-server applications failed because changing the business logic was a painful process.

The Data Access Layer:

This layer comprises the components that help in accessing the database. Used in the right way, this layer provides a level of abstraction over the database structures. Simply put, changes made to the database, tables, etc. do not affect the rest of the application, because of the data access layer. The other application layers send data requests to this layer and receive the responses from it.

The current application is being developed by taking the 3-tier architecture as a prototype. The 3-

tier architecture is the most common approach used for web applications today. In the typical

example of this model, the web browser acts as the client, IIS handles the business logic, and a

separate tier MS-SQL Server handles database functions. Although the 3-tier approach increases

scalability and introduces a separation of business logic from the display and database layers, it

does not truly separate the application into specialized, functional layers. For prototype or simple

web applications, the 3-tier architecture may be sufficient. However, with complex demands placed on web applications, a 3-tiered approach falls short in several key areas, including flexibility and scalability. These shortcomings occur mainly because the business logic tier is still too broad: it has too many functions grouped into one tier that could be separated out into a finer-grained model.


The proposed system can be designed perfectly with the three-tier model, as all layers fit naturally into the project. In the future, while expanding the system, the n-tier architecture can be used in order to implement integration touch points and to provide enhanced user interfaces.


Chapter 3: System Requirements Specification

3.1 Introduction:

Software Requirements Specification plays an important role in creating quality software

solutions. Specification is basically a representation process. Requirements are represented in a

manner that ultimately leads to successful software implementation.

Requirements may be specified in a variety of ways. However, there are some guidelines worth following:

• Representation format and content should be relevant to the problem

• Information contained within the specification should be nested

• Diagrams and other notational forms should be restricted in number and consistent in use.

• Representations should be revisable.

The software requirements specification is produced at the culmination of the analysis task. The function and performance allocated to the software as part of system engineering are refined by establishing a complete information description, a detailed functional and behavioral description, an indication of performance requirements and design constraints, appropriate validation criteria, and other data pertinent to the requirements.


3.2 Functional Requirements:

This section contains specification of all the functional requirements needed to develop this

module or sub module.

Requirement ID  Requirement Specification                                                  Priority (A/B/C)

SC_R_01         System should provide a provision for authenticated user login.
SC_R_02         System should provide a provision for admin to add a company.
SC_R_03         System should provide a provision for admin to alter a company.
SC_R_04         System should provide a provision for admin to view the company details.
SC_R_05         System should provide a provision for admin to view the achievements.
SC_R_06         System should provide a provision for admin to give suggestions.
SC_R_07         System should provide a provision to view employee details.
SC_R_08         System should provide a provision for the company to add an employee.
SC_R_09         System should provide a provision for the company to alter an employee.
SC_R_10         System should provide a provision for the company to release an employee.
SC_R_11         System should provide a provision for the company to view an employee.
SC_R_12         System should provide a provision for the company to view set achievements.
SC_R_13         System should provide a provision for the company to give promotion to an employee.
SC_R_14         System should provide a provision for the company to give demotion to an employee.


3.3 Non-Functional Requirements:

Performance Requirements:

Good bandwidth and less congestion on the network. Identifying the shortest route to reach the destination will also improve performance.

Safety Requirements:

No harm is expected from the use of the product either to the OS or any data.

Product Security Requirements:

The product is protected from unauthorized users. The system allows only authenticated users to work on the application. The users of this system are the organization and the ISP administrator.

Software Quality Attributes:

The product is user friendly and is accessible from the client. The application is reliable and ensures its functioning by keeping the ISP web service accessible to the various organizations. As it is developed in .NET, it is highly interoperable with operating systems that provide support for the MSIL (server side). The system requires less maintenance, as it is not installed on the client but hosted on the ISP. Firewall, antivirus protection, etc. are provided by the ISP.


Chapter 4: System Design

4.1 Introduction:

Software design sits at the technical kernel of the software engineering process and is applied regardless of the development paradigm and area of application. Design is the first step in the development phase for any engineered product or system. The designer's goal is to produce a model or representation of an entity that will later be built. Once system requirements have been specified and analyzed, system design is the first of the three technical activities - design, code and test - required to build and verify software.

The importance can be stated with a single word: "Quality". Design is the place where quality is fostered in software development. Design provides us with representations of software that can be assessed for quality. Design is the only way we can accurately translate a customer's view into a finished software product or system. Software design serves as a foundation for all the software engineering steps that follow. Without a strong design we risk building an unstable system - one that will be difficult to test, and one whose quality cannot be assessed until the last stage.

During design, progressive refinements of data structure, program structure and procedural details are developed, reviewed and documented. System design can be viewed from either a technical or a project-management perspective. From the technical point of view, design comprises four activities: architectural design, data structure design, interface design and procedural design.

4.2 High-Level Design:

System Design:

Understanding the bigger application with its external interfaces is called system design. The "Background Verification System (BVS)" is the application used to verify a candidate's certificates before recruitment by a company; the application is also used to verify an employee's previous working details (experience gained before), etc.

Subsystem Design:

Understanding the bigger system as smaller independent working systems is called subsystem design.

[Figure: Subsystem design showing the Web User, Web User Interface, Background Verification System, and Database System.]


Block Design:

Web User Interface:

Web user Interface will provide the interface to the user to communicate with the system.

DB Storage:

This block helps us to store the data or retrieve the data from database.

Admin Handler:

This block will handle the information about companies.

Company Handler:

This block helps us to add, delete and alter employee details.

Database Manager:

This block will manage the data form the database.

Login manager:

This block will manage the login operations.
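
The login manager block can be sketched as a small class. This is an illustrative sketch, not the project's actual code: the Company table and its CName/PWord columns are taken from the queries in the coding chapter, the connection string is a placeholder, and the IsComplete helper is a hypothetical pre-check.

```csharp
using System;
using System.Data.SqlClient;

// Sketch of the Login Manager block (an assumed design, not the
// project's actual class). Table and column names follow the queries
// shown later in the coding chapter.
public class LoginManager
{
    private readonly string connectionString;

    public LoginManager(string connectionString)
    {
        this.connectionString = connectionString;
    }

    // Hypothetical pre-check: both fields must be filled in.
    public static bool IsComplete(string name, string password)
    {
        return name != null && name.Trim() != ""
            && password != null && password.Trim() != "";
    }

    // True when a company with the given name and password exists.
    // Values are bound as parameters rather than concatenated into
    // the SQL string, so quotes in the input cannot alter the query.
    public bool Validate(string name, string password)
    {
        if (!IsComplete(name, password))
            return false;
        using (SqlConnection db = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "select count(*) from Company where CName=@n and PWord=@p", db))
        {
            cmd.Parameters.AddWithValue("@n", name.Trim());
            cmd.Parameters.AddWithValue("@p", password.Trim());
            db.Open();
            return (int)cmd.ExecuteScalar() > 0;
        }
    }
}
```

A caller would construct this with the site's connection string and invoke Validate before creating the session entry that pages such as AdminForm later check.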

UML Diagrams:

Every complex system is best approached through a small set of nearly independent views of a model; no single view is sufficient. Every model may be expressed at different levels of fidelity. The best models are connected to reality. The UML defines nine graphical diagrams; the diagrams prepared for this project are listed below.

Class diagram

Object diagram

Use-case diagram

Sequence diagram

[Figure: Block design showing the Web User Interface, Admin Handler, Company Handler, Login Manager, DB Manager, and DB Storage.]

Collaboration diagram

Activity diagram

ER diagram

State Chart diagram

Component diagram

Deployment diagram

Dataflow diagrams

Data Flow Diagrams:


Use case Diagrams:


Sequence Diagrams:


Class Diagram:


Object diagram:


State chart:


Activity diagrams:


Component Diagrams:

Deployment Diagrams:


Chapter 5: Coding

5.1 Software Description (Technology Description):

Microsoft.NET Framework:

The .NET Framework is a new computing platform that simplifies application development

in the highly distributed environment of the Internet. The .NET Framework is designed to fulfill

the following objectives:

To provide a consistent object-oriented programming environment whether object code is

stored and executed locally, executed locally but Internet-distributed, or executed

remotely.

To provide a code-execution environment that minimizes software deployment and

versioning conflicts.

To provide a code-execution environment that guarantees safe execution of code,

including code created by an unknown or semi-trusted third party.

To provide a code-execution environment that eliminates the performance problems of

scripted or interpreted environments.

To make the developer experience consistent across widely varying types of applications,

such as Windows-based applications and Web-based applications.

To build all communication on industry standards to ensure that code based on the .NET

Framework can integrate with any other code.

The .NET Framework has two main components: the common language runtime and the

.NET Framework class library. The common language runtime is the foundation of the .NET

Framework. You can think of the runtime as an agent that manages code at execution time,

providing core services such as memory management, thread management and remoting, while

also enforcing strict type safety and other forms of code accuracy that ensure security and

robustness. In fact, the concept of code management is a fundamental principle of the runtime.

Code that targets the runtime is known as managed code, while code that does not target the

runtime is known as unmanaged code. The class library, the other main component of the .NET

Framework, is a comprehensive, object-oriented collection of reusable types that you can use to

develop applications ranging from traditional command-line or graphical user interface (GUI)

applications to applications based on the latest innovations provided by ASP.NET, such as Web

Forms and XML Web services.

The .NET Framework can be hosted by unmanaged components that load the common

language runtime into their processes and initiate the execution of managed code, thereby

creating a software environment that can exploit both managed and unmanaged features. The

.NET Framework not only provides several runtime hosts, but also supports the development of

third-party runtime hosts.

For example, ASP.NET hosts the runtime to provide a scalable, server-side environment for

managed code. ASP.NET works directly with the runtime to enable Web Forms applications and

XML Web services, both of which are discussed later in this topic.


Internet Explorer is an example of an unmanaged application that hosts the runtime (in the

form of a MIME type extension). Using Internet Explorer to host the runtime enables you to

embed managed components or Windows Forms controls in HTML documents. Hosting the

runtime in this way makes managed mobile code (similar to Microsoft® ActiveX® controls)

possible, but with significant improvements that only managed code can offer, such as semi-trusted execution and secure isolated file storage.

The following illustration shows the relationship of the common language runtime and the

class library to your applications and to the overall system. The illustration also shows how

managed code operates within a larger architecture.

Features of the Common Language Runtime:

The common language runtime manages memory, thread execution, code execution, code

safety verification, compilation, and other system services. These features are intrinsic to the

managed code that runs on the common language runtime.

With regards to security, managed components are awarded varying degrees of trust,

depending on a number of factors that include their origin (such as the Internet, enterprise

network, or local computer). This means that a managed component might or might not be able

to perform file-access operations, registry-access operations, or other sensitive functions, even if

it is being used in the same active application.

The runtime enforces code access security. For example, users can trust that an executable

embedded in a Web page can play an animation on screen or sing a song, but cannot access their

personal data, file system, or network. The security features of the runtime thus enable legitimate

Internet-deployed software to be exceptionally feature-rich. The runtime also enforces code

robustness by implementing a strict type- and code-verification infrastructure called the common

type system (CTS). The CTS ensures that all managed code is self-describing. The various

Microsoft and third-party language compilers generate managed code that conforms to the CTS.

This means that managed code can consume other managed types and instances, while strictly

enforcing type fidelity and type safety. In addition, the managed environment of the runtime

eliminates many common software issues. For example, the runtime automatically handles object

layout and manages references to objects, releasing them when they are no longer being used.

This automatic memory management resolves the two most common application errors, memory

leaks and invalid memory references.

The runtime also accelerates developer productivity. For example, programmers can write

applications in their development language of choice, yet take full advantage of the runtime, the

class library, and components written in other languages by other developers. Any compiler

vendor who chooses to target the runtime can do so. Language compilers that target the .NET

Framework make the features of the .NET Framework available to existing code written in that

language, greatly easing the migration process for existing applications.

While the runtime is designed for the software of the future, it also supports software of

today and yesterday. Interoperability between managed and unmanaged code enables developers


to continue to use necessary COM components and DLLs. The runtime is designed to enhance

performance. Although the common language runtime provides many standard runtime services,

managed code is never interpreted. A feature called just-in-time (JIT) compiling enables all

managed code to run in the native machine language of the system on which it is executing.

Meanwhile, the memory manager removes the possibilities of fragmented memory and increases

memory locality-of-reference to further increase performance.

Finally, the runtime can be hosted by high-performance, server-side applications, such as

Microsoft® SQL Server™ and Internet Information Services (IIS). This infrastructure enables

you to use managed code to write your business logic, while still enjoying the superior

performance of the industry's best enterprise servers that support runtime hosting.

.NET Framework Class Library

The .NET Framework class library is a collection of reusable types that tightly integrate with

the common language runtime. The class library is object oriented, providing types from

which your own managed code can derive functionality. This not only makes the .NET

Framework types easy to use, but also reduces the time associated with learning new features of

the .NET Framework. In addition, third-party components can integrate seamlessly with classes

in the .NET Framework.

For example, the .NET Framework collection classes implement a set of interfaces that you

can use to develop your own collection classes. Your collection classes will blend seamlessly

with the classes in the .NET Framework.
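
As a small illustration of the point above, the sketch below (a hypothetical type, not part of the project) derives a collection from the framework's IEnumerable&lt;T&gt; interface so that it blends with the built-in classes and works with foreach and any API that accepts a sequence.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

// A minimal custom collection that integrates with the framework by
// implementing IEnumerable<T>; it can be enumerated exactly like the
// built-in collection classes.
public class FixedBag<T> : IEnumerable<T>
{
    private readonly T[] items;

    public FixedBag(params T[] items)
    {
        // Copy the input so later changes to the caller's array
        // cannot alter the bag's contents.
        this.items = (T[])items.Clone();
    }

    public int Count { get { return items.Length; } }

    public IEnumerator<T> GetEnumerator()
    {
        foreach (T item in items)
            yield return item;
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }
}
```

Because FixedBag&lt;T&gt; implements IEnumerable&lt;T&gt;, it can be passed to any constructor or method that accepts a sequence, just like the framework's own collections.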

As you would expect from an object-oriented class library, the .NET Framework types enable

you to accomplish a range of common programming tasks, including tasks such as string

management, data collection, database connectivity, and file access. In addition to these common

tasks, the class library includes types that support a variety of specialized development scenarios.

For example, you can use the .NET Framework to develop the following types of applications

and services:

Console applications.

Scripted or hosted applications.

Windows GUI applications (Windows Forms).

ASP.NET applications.

XML Web services.

Windows services.

For example, the Windows Forms classes are a comprehensive set of reusable types that vastly

simplify Windows GUI development. If you write an ASP.NET Web Form application, you can

use the Web Forms classes.

SQL SERVER DATABASE:

A database management system, or DBMS, gives the user access to their data and helps them transform the data into information. Such database management systems include dBase, Paradox, IMS, and SQL Server. These systems allow users to create, update and extract information from their database.

A database is a structured collection of data. Data refers to the characteristics of people,

things and events. SQL Server stores each data item in its own fields. In SQL Server, the fields

relating to a particular person, thing or event are bundled together to form a single complete unit

of data, called a record (it can also be referred to as a row or an occurrence). Each record is made

up of a number of fields. No two fields in a record can have the same field name.

During an SQL Server Database design project, the analysis of your business needs identifies all

the fields or attributes of interest. If your business needs change over time, you define any

additional fields or change the definition of existing fields.

SQL Server Tables:

SQL Server stores records relating to each other in a table. Different tables are created for

the various groups of information. Related tables are grouped together to form a database.

Primary Key:

Every table in SQL Server has a field or a combination of fields that uniquely identifies each

record in the table. The Unique identifier is called the Primary Key, or simply the Key. The

primary key provides the means to distinguish one record from all others in a table. It allows the

user and the database system to identify, locate and refer to one particular record in the database.

Relational Database:

Sometimes all the information of interest to a business operation can be stored in one table.

SQL Server makes it very easy to link the data in multiple tables. Matching an employee to the

department in which they work is one example. This is what makes SQL Server a relational

database management system, or RDBMS. It stores data in two or more tables and enables you to define relationships between the tables.

Foreign Key:

When a field in one table matches the primary key of another table, that field is referred to as a foreign key. A foreign key is a field or a group of fields in one table whose values match those of the primary key of another table.

Referential Integrity:

Not only does SQL Server allow you to link multiple tables, it also maintains consistency

between them. Ensuring that the data among related tables is correctly matched is referred to as

maintaining referential integrity.
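
The primary key, foreign key, and referential integrity ideas above can be shown in one small schema sketch. The table and column names are illustrative (modelled loosely on the project's Company table), not the project's exact definitions:

```sql
-- Company.CID is the primary key: it uniquely identifies each record.
CREATE TABLE Company (
    CID   numeric    PRIMARY KEY,
    CName nchar(20)  NOT NULL
);

-- Employee.CID is a foreign key: its values must match an existing
-- Company.CID, and SQL Server maintains this referential integrity by
-- rejecting inserts or deletes that would leave an unmatched value.
CREATE TABLE Employee (
    EId numeric  PRIMARY KEY,
    CID numeric  NOT NULL REFERENCES Company(CID)
);
```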

Data Abstraction:


A major purpose of a database system is to provide users with an abstract view of the data.

This system hides certain details of how the data is stored and maintained. Data abstraction is

divided into three levels.

Physical level:

This is the lowest level of abstraction at which one describes how the data are actually stored.

Conceptual Level:

At this level of database abstraction, what data are actually stored is described, together with the entities and the relationships among them.

View level:

This is the highest level of abstraction at which one describes only part of the database.

Advantages of RDBMS:

Redundancy can be avoided

Inconsistency can be eliminated

Data can be Shared

Standards can be enforced

Security restrictions can be applied

Integrity can be maintained

Conflicting requirements can be balanced

Data independence can be achieved.

Disadvantages of DBMS:

A significant disadvantage of the DBMS system is cost. In addition to the cost of

purchasing or developing the software, the hardware has to be upgraded to allow for the

extensive programs and the workspace required for their execution and storage. While

centralization reduces duplication, the lack of duplication requires that the database be

adequately backed up so that in case of failure the data can be recovered.

Features of SQL Server (RDBMS):

SQL SERVER is one of the leading database management systems (DBMS) because it is the only database that meets the uncompromising requirements of today's most demanding information systems. From complex decision support systems (DSS) to the most rigorous online transaction processing (OLTP) applications, even applications that require simultaneous DSS and OLTP access to the same critical data, SQL Server leads the industry in both performance and capability.

SQL SERVER is a truly portable, distributed, and open DBMS that delivers unmatched

performance, continuous operation and support for every database.


The SQL SERVER RDBMS is a high-performance, fault-tolerant DBMS which is specially designed for online transaction processing and for handling large database applications.

SQL SERVER with the transaction processing option offers features that contribute to a very high level of transaction processing throughput, among them the row-level lock manager.

Enterprise wide Data Sharing:

The unrivaled portability and connectivity of the SQL SERVER DBMS enables all the

systems in the organization to be linked into a singular, integrated computing resource.

Portability:

SQL SERVER is fully portable to more than 80 distinct hardware and operating systems

platforms, including UNIX, MSDOS, OS/2, Macintosh and dozens of proprietary platforms.

This portability gives complete freedom to choose the database server platform that meets the

system requirements.

Open Systems:

SQL SERVER offers a leading implementation of industry-standard SQL. SQL Server's open architecture integrates SQL SERVER and non-SQL SERVER DBMSs with the industry's most comprehensive collection of tools, applications, and third-party software products. SQL Server's open architecture provides transparent access to data from other relational databases and even non-relational databases.

Distributed Data Sharing:

SQL Server's networking and distributed database capabilities allow access to data stored on remote servers with the same ease as if the information were stored on a single local computer. A single SQL statement can access data at multiple sites. You can store data where system requirements such as performance, security or availability dictate.

Unmatched Performance:

The most advanced architecture in the industry allows the SQL SERVER DBMS to deliver

unmatched performance.

Sophisticated Concurrency Control:

Real-world applications demand concurrent access to critical data. With most database systems, applications become "contention bound", where performance is limited not by CPU power or disk I/O but by users waiting on one another for data access. SQL Server employs full, unrestricted row-level locking and contention-free queries to minimize, and in many cases entirely eliminate, contention wait times.

No I/O Bottlenecks:


SQL Server's fast commit, group commit, and deferred write technologies dramatically reduce disk I/O bottlenecks. While some databases write whole data blocks to disk at commit time, SQL Server commits transactions with at most a sequential write to the log file on disk. On high-throughput systems, a single sequential write typically commits a group of multiple transactions. Data read by a transaction remains in shared memory so that other transactions may access that data without reading it again from disk. Since fast commit writes all data necessary for recovery to the log file, modified blocks are written back to the database independently of the transaction commit, when written from memory to disk.

Normalization:

Normalization is the process of converting a relation to a standard form. It is used to handle the problems that can arise due to data redundancy (i.e. repetition of data in the database), to maintain data integrity, and to handle problems that can arise from insertion, update, and deletion anomalies.

Decomposing is the process of splitting relations into multiple relations to eliminate anomalies and maintain data integrity. To do this we use normal forms, or rules for structuring relations.

Insertion anomaly: Inability to add data to the database due to absence of other data.

Deletion anomaly: Unintended loss of data due to deletion of other data.

Update anomaly: Data inconsistency resulting from data redundancy and partial update.

Normal Forms: These are the rules for structuring relations that eliminate anomalies.

First Normal Form:

A relation is said to be in first normal form if the values in the relation are atomic for

every attribute in the relation. By this we mean simply that no attribute value can be a set of

values or, as it is sometimes expressed, a repeating group.

Second Normal Form:

A relation is said to be in second normal form if it is in first normal form and it satisfies any one of the following rules.

The primary key is not a composite primary key

No non-key attributes are present

Every non-key attribute is fully functionally dependent on the full set of the primary key.

Third Normal Form:

A relation is said to be in third normal form if there exist no transitive dependencies.

Transitive dependency: If two non-key attributes depend on each other as well as on the primary key then they are said to be transitively dependent.


The above normalization principles were applied to decompose the data into multiple tables, thereby allowing the data to be maintained in a consistent state.
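
As a concrete sketch of the decomposition described above (illustrative names, not the project's actual tables): if an employee relation also carried its department's name and location, those attributes would depend on DeptNo rather than on the key EId, which is a transitive dependency. Third normal form splits them into their own table:

```sql
-- Before: DName and DLocation depend on DeptNo, not on the key EId,
-- so they are transitively dependent and would repeat per employee.
--   Employee(EId, EName, DeptNo, DName, DLocation)

-- After decomposition to third normal form:
CREATE TABLE Department (
    DeptNo    numeric    PRIMARY KEY,
    DName     nchar(30),
    DLocation nchar(30)
);

CREATE TABLE Employee (
    EId    numeric   PRIMARY KEY,
    EName  nchar(20),
    DeptNo numeric   REFERENCES Department(DeptNo)
);
```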

5.2 Coding:

//AdminForm.aspx.cs
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;

public partial class AdminForm : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Session["usr"] is null when no user has logged in, so test for
        // null before calling ToString() to avoid a NullReferenceException.
        if (Session["usr"] != null && Session["usr"].ToString() != "")
        {
            // Build the admin frameset: heading on top, menu on the left,
            // content on the right.
            Response.Write("<frameset rows='20%,80%' border='1'>");
            Response.Write("<frame name='f1' noresize src='Heading.aspx'>");
            Response.Write("<frameset cols='20%,80%'>");
            Response.Write("<frame name='f2' noresize src='AdminList.aspx'>");
            Response.Write("<frame name='f3' noresize src='Welcome.aspx'>");
            Response.Write("</frameset></frameset>");
        }
        else
        {
            Response.Write("<center><h1>Error in Loading the Page...</h1></center>");
        }
    }
}

//Company.aspx.cs
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class Add_Company : System.Web.UI.Page
{
    SqlCommand RS;
    MyConnection mc = new MyConnection();

    protected void Page_Load(object sender, EventArgs e)
    {
        mc.DB.Open();
        loadnumber();
    }

    // Generates the next company id (Max(CID) + 1) and fills in today's date.
    void loadnumber()
    {
        String str = "select Max(CID) from Company";
        RS = new SqlCommand(str, mc.DB);
        SqlDataReader dr = RS.ExecuteReader(CommandBehavior.SingleRow);
        int no = 0;
        try
        {
            if (dr.Read())
            {
                no = int.Parse(dr.GetValue(0).ToString()) + 1;
                TextBox1.Text = no.ToString();
            }
            dr.Close();
        }
        catch (Exception e1)
        {
            // No rows yet (Max(CID) was null): start numbering at 10001.
            dr.Close();
            TextBox1.Text = "10001";
        }
        String[] mon = { "Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec" };
        str = DateTime.Today.Day + "-" + mon[DateTime.Today.Month - 1] + "-" + DateTime.Today.Year;
        TextBox5.Text = str;
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        try
        {
            if (TextBox2.Text.Trim() != "" && TextBox3.Text.Trim() != "" && TextBox4.Text.Trim() != ""
                && TextBox6.Text.Trim() != "" && TextBox7.Text.Trim() != "" && TextBox8.Text.Trim() != "")
            {
                if (TextBox7.Text.Trim() == TextBox8.Text.Trim())
                {
                    // Reject duplicate company names and passwords.
                    String str = "select * from Company where CName='" + TextBox2.Text + "' or PWord='" + TextBox7.Text + "'";
                    RS = new SqlCommand(str, mc.DB);
                    SqlDataReader dr = RS.ExecuteReader(CommandBehavior.SequentialAccess);
                    int c = 0;
                    while (dr.Read())
                        c++;
                    dr.Close();
                    if (c == 0)
                    {
                        str = "insert into Company values(" + int.Parse(TextBox1.Text) + ",'" +
                              TextBox2.Text + "','" + TextBox3.Text + "','" + TextBox4.Text + "','" +
                              DateTime.Parse(TextBox5.Text).ToString("yyyyMMdd") + "','" + TextBox6.Text + "','" +
                              TextBox7.Text + "','Working')";
                        RS = new SqlCommand(str, mc.DB);
                        RS.ExecuteNonQuery();
                        // Each registered company gets its own employee table C_<CID>.
                        str = "create table C_" + TextBox1.Text + "(EId numeric,Ename nchar(20)," +
                              "DOB DateTime,Address nchar(30),Qualification nchar(50),Designation nchar(30)," +
                              "DOJ DateTime,Initial_Salary numeric,DOR DateTime,Release_Salary numeric," +
                              "Status nchar(20),SSN numeric)";
                        RS = new SqlCommand(str, mc.DB);
                        RS.ExecuteNonQuery();
                        Label10.Text = "Company Registration is Successfull...";
                        TextBox1.Text = "";
                        TextBox2.Text = "";
                        TextBox3.Text = "";
                        TextBox4.Text = "";
                        TextBox5.Text = "";
                        TextBox6.Text = "";
                        TextBox7.Text = "";
                        TextBox8.Text = "";
                        loadnumber();
                    }
                    else
                    {
                        Label10.Text = "Company Name/PassWord Already exists...";
                    }
                }
                else
                {
                    Label10.Text = "PassWord/Confirm PassWord Does Not Match...";
                }
            }
            else
            {
                Label10.Text = "InComplete Company Details to Register...";
            }
        }
        catch (Exception e1)
        {
            Label10.Text = "Error : " + e1.ToString();
        }
    }
}
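
The queries above build their SQL by concatenating text-box values, which fails on inputs containing quotes and is open to SQL injection. The sketch below shows the same Company insert rewritten with parameters, plus a format-string replacement for the hand-built month-name array in loadnumber(); it is an illustrative variant, not the project's code (the SqlConnection is supplied by the caller, and the zero-padded day differs slightly from the original output).

```csharp
using System;
using System.Data.SqlClient;
using System.Globalization;

// Illustrative, safer variants of two routines from the code above;
// assumed helpers, not the project's actual classes.
public static class CompanyInsert
{
    // The Company insert from Button1_Click with parameter binding:
    // each value travels as data, so quotes cannot change the SQL text.
    public static void Insert(SqlConnection db, int cid, string name,
        string address, string workMode, DateTime regDate,
        string contacts, string password)
    {
        const string sql =
            "insert into Company values(@cid,@name,@addr,@mode," +
            "@reg,@contacts,@pword,'Working')";
        using (SqlCommand cmd = new SqlCommand(sql, db))
        {
            cmd.Parameters.AddWithValue("@cid", cid);
            cmd.Parameters.AddWithValue("@name", name);
            cmd.Parameters.AddWithValue("@addr", address);
            cmd.Parameters.AddWithValue("@mode", workMode);
            // Stored as 'yyyyMMdd' text to match the original insert.
            cmd.Parameters.AddWithValue("@reg",
                regDate.ToString("yyyyMMdd", CultureInfo.InvariantCulture));
            cmd.Parameters.AddWithValue("@contacts", contacts);
            cmd.Parameters.AddWithValue("@pword", password);
            cmd.ExecuteNonQuery();
        }
    }

    // Replaces loadnumber()'s manual month-name array; the invariant
    // culture keeps the English month abbreviations. Note the day is
    // zero-padded ("05"), unlike the original's unpadded digit.
    public static string FormatDate(DateTime d)
    {
        return d.ToString("dd-MMM-yyyy", CultureInfo.InvariantCulture);
    }
}
```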

//Alter_Company.aspx.cs
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class Alter_Company : System.Web.UI.Page
{
    MyConnection mc = new MyConnection();
    SqlCommand RS;

    protected void Page_Load(object sender, EventArgs e)
    {
        mc.DB.Open();
    }

    // Fills the form with the details of the company selected in the list.
    protected void DropDownList1_SelectedIndexChanged(object sender, EventArgs e)
    {
        try
        {
            if (DropDownList1.SelectedItem.ToString() != "--Select--")
            {
                String str = "Select CID,CName,CAddress,Work_Mode,Reg_Date,Contacts from Company where CID=" +
                             int.Parse(DropDownList1.SelectedItem.ToString());
                RS = new SqlCommand(str, mc.DB);
                SqlDataReader dr = RS.ExecuteReader(CommandBehavior.SingleRow);
                if (dr.Read())
                {
                    TextBox1.Text = dr.GetValue(0).ToString().Trim();
                    TextBox2.Text = dr.GetValue(1).ToString().Trim();
                    TextBox3.Text = dr.GetValue(2).ToString().Trim();
                    TextBox4.Text = dr.GetValue(3).ToString().Trim();
                    DateTime dt = DateTime.Parse(dr.GetValue(4).ToString());
                    String[] mon = { "Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec" };
                    str = dt.Day + "-" + mon[dt.Month - 1] + "-" + dt.Year;
                    TextBox5.Text = str;
                    TextBox6.Text = dr.GetValue(5).ToString().Trim();
                    Label9.Text = "";
                }
                dr.Close();
            }
            else
            {
                TextBox1.Text = "";
                TextBox2.Text = "";
                TextBox3.Text = "";
                TextBox4.Text = "";
                TextBox5.Text = "";
                TextBox6.Text = "";
                Label9.Text = "";
            }
        }
        catch (Exception e1)
        {
            // Ignore lookup errors and leave the form unchanged.
        }
    }

    // Loads the ids of all working companies into the drop-down list.
    protected void Button1_Click(object sender, EventArgs e)
    {
        try
        {
            String str = "select CID from Company where Status='Working'";
            RS = new SqlCommand(str, mc.DB);
            SqlDataReader dr = RS.ExecuteReader(CommandBehavior.SequentialAccess);
            DropDownList1.Items.Clear();
            DropDownList1.Items.Add("--Select--");
            while (dr.Read())
            {
                DropDownList1.Items.Add(dr.GetValue(0).ToString());
            }
            dr.Close();
        }
        catch (Exception e1)
        {
            // Ignore load errors; the list simply stays unpopulated.
        }
    }

    // Saves the edited company details.
    protected void Button2_Click(object sender, EventArgs e)
    {
        try
        {
            if (TextBox1.Text.Trim() != "" && TextBox2.Text.Trim() != "" && TextBox3.Text.Trim() != ""
                && TextBox4.Text.Trim() != "" && TextBox5.Text.Trim() != "" && TextBox6.Text.Trim() != "")
            {
                String str = "update Company set CName='" + TextBox2.Text + "',CAddress='" +
                             TextBox3.Text + "',Work_Mode='" + TextBox4.Text + "',Contacts='" + TextBox6.Text +
                             "' where CID=" + int.Parse(DropDownList1.SelectedItem.ToString());
                RS = new SqlCommand(str, mc.DB);
                RS.ExecuteNonQuery();
                TextBox1.Text = "";
                TextBox2.Text = "";
                TextBox3.Text = "";
                TextBox4.Text = "";
                TextBox5.Text = "";
                TextBox6.Text = "";
                Label9.Text = "Company Information updated Successfully...";
                DropDownList1.Items.Clear();
            }
            else
            {
                Label9.Text = "InComplete Company Details to Update...";
            }
        }
        catch (Exception e1)
        {
            // Ignore update errors; Label9 keeps its previous text.
        }
    }

    // Marks a company as deleted, but only when none of its employees
    // are still in the 'Working' state.
    protected void Button3_Click(object sender, EventArgs e)
    {
        try
        {
            if (TextBox1.Text.Trim() != "")
            {
                String str = "Select * from C_" + DropDownList1.SelectedItem.ToString() + " where Status='Working'";
                RS = new SqlCommand(str, mc.DB);
                SqlDataReader dr = RS.ExecuteReader(CommandBehavior.SequentialAccess);
                int c = 0;
                while (dr.Read())
                    c++;
                dr.Close();
                if (c == 0)
                {
                    str = "update Company set Status='Deleted' where CID=" +
                          int.Parse(DropDownList1.SelectedItem.ToString());
                    RS = new SqlCommand(str, mc.DB);
                    RS.ExecuteNonQuery();
                    TextBox1.Text = "";
                    TextBox2.Text = "";
                    TextBox3.Text = "";
                    TextBox4.Text = "";
                    TextBox5.Text = "";
                    TextBox6.Text = "";
                    Label9.Text = "Company Information Deleted Successfully...";
                    DropDownList1.Items.Clear();
                }
                else
                {
                    Label9.Text = "There are Still Some Employees working in the Company...CANNOT DELETE";
                }
            }
            else
            {
                Label9.Text = "No Company is Specified to Delete...";
            }
        }
        catch (Exception e1)
        {
            Label9.Text = "Error : " + e1.ToString();
        }
    }
}


Chapter 6: Testing

Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding. The increasing visibility of software as a system element and the attendant costs associated with a software failure are motivating forces for well-planned, thorough testing. Testing is the process of executing a program with the intent of finding an error. The design of tests for software and other engineered products can be as challenging as the initial design of the product itself. There are basically two types of testing approaches.

One is Black-Box testing: knowing the specified functions that a product has been designed to perform, tests can be conducted that demonstrate each function is fully operational.

The other is White-Box testing: knowing the internal workings of the product, tests can be conducted to ensure that the internal operation of the product performs according to specifications and all internal components have been adequately exercised.

White-box and black-box testing methods have been used to test this package. All the loop constructs have been tested for their boundary and intermediate conditions. The test data was designed with a view to checking all the conditions and logical decisions. Error handling has been taken care of by the use of exception handlers.

Testing Strategies:

Testing is a set of activities that can be planned in advance and conducted systematically. A strategy for software testing must accommodate low-level tests that are necessary to verify that a small source code segment has been correctly implemented, as well as high-level tests that validate major system functions against customer requirements.

Software testing is one element of verification and validation. Verification refers to the set

of activities that ensure that software correctly implements a specific function. Validation refers

to a different set of activities that ensure that the software that has been built is traceable to

customer requirements.

The main objective of software testing is to uncover errors. To fulfill this objective, a series of test steps (unit, integration, validation and system tests) are planned and executed. Each test step is accomplished through a series of systematic test techniques that assist in the design of test cases. With each testing step, the level of abstraction with which software is considered is broadened.

Testing is the only way to assure the quality of software and it is an umbrella activity rather

than a separate phase. This is an activity to be performed in parallel with the software effort and

one that consists of its own phases of analysis, design, implementation, execution and

maintenance.


Unit Testing:

This testing method considers a module as a single unit and checks the unit at its interfaces and in its communication with other modules, rather than getting into details at the statement level. Here the module is treated as a black box which takes some input and generates output. Outputs for a given set of input combinations are pre-calculated and compared with those generated by the module.
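A unit test of this kind can be sketched as follows (illustrative Python; the function name and ID format are hypothetical, not taken from this project). The module is treated as a black box, and outputs for each input combination are pre-calculated:

```python
# Hypothetical module function: validates an employee ID of the form
# two digits, one letter, two digits.
def is_valid_employee_id(emp_id):
    return (len(emp_id) == 5 and emp_id[:2].isdigit()
            and emp_id[2].isalpha() and emp_id[3:].isdigit())

# Pre-calculated outputs for given input combinations; the unit test
# only compares the module's output against these expected values.
expected = {"11N31": True, "ABCDE": False, "1": False}
for emp_id, result in expected.items():
    assert is_valid_employee_id(emp_id) == result
```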

System Testing:

Here all the pre-tested individual modules are assembled to create the larger system, and tests are carried out at the system level to make sure that all modules work in synchronisation with each other. This testing methodology helps to make sure that all modules which run perfectly when checked individually also run in cohesion with the other modules. For this testing we create test cases to check all modules once, and then generate combinations of test paths throughout the system to make sure that no path leads into chaos.

Integration Testing:

Testing is a major quality-control measure employed during software development. Its basic function is to detect errors. Sub-functions, when combined, may not produce what is desired, and global data structures can present problems. Integration testing is a systematic technique for constructing the program structure while conducting tests to uncover errors associated with interfacing; the objective is to take unit-tested modules and build the program structure that has been dictated by the design. In non-incremental integration, all the modules are combined in advance and the program is tested as a whole, and errors that appear are difficult to isolate. In incremental testing, the program is constructed and tested in small segments, where errors are more easily isolated and corrected.

The different incremental integration strategies are top-down integration, bottom-up integration and regression testing.

Top-Down Integration Test:

Modules are integrated by moving downward through the control hierarchy, beginning with the main program. The subordinate modules are incorporated into the structure in either a breadth-first or a depth-first manner. This process is done in five steps:

1. The main control module is used as a test driver, and stubs are substituted for all modules directly subordinate to it.
2. Depending on the integration approach selected, subordinate stubs are replaced one at a time with actual modules.
3. Tests are conducted as each module is integrated.
4. On completion of each set of tests, another stub is replaced with the real module.
5. Regression testing may be conducted to ensure that new errors have not been introduced.

The process continues from step 2 until the entire program structure is built. In the top-down integration strategy, decision making occurs at the upper levels of the hierarchy and is therefore encountered first; if major control problems exist, early recognition is possible. If depth-first integration is selected, a complete function of the software may be implemented and demonstrated early.

Problems occur when processing at low levels in the hierarchy is required to adequately test the upper levels: stubs replace the low-level modules at the beginning of top-down testing, so no significant data flows upward in the program structure.
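The stub-then-replace cycle above can be sketched as follows (illustrative Python; all module and function names are hypothetical, not from this project). The main control module is tested first against a stub, which is later replaced by the real subordinate module and the same test is re-run:

```python
def stub_fetch_record(record_id):
    # Stub: returns canned data instead of calling the real subordinate
    # module (e.g. a database query).
    return {"id": record_id, "status": "VERIFIED"}

def main_control(record_id, fetch=stub_fetch_record):
    # Main control module under test; `fetch` is replaced, one stub at
    # a time, by the actual module as integration proceeds.
    record = fetch(record_id)
    return "accepted" if record["status"] == "VERIFIED" else "rejected"

# Step 1-3: test the main module against the stub.
assert main_control(7) == "accepted"

# Step 4-5: replace the stub with a "real" module and re-test.
def real_fetch_record(record_id):
    return {"id": record_id, "status": "PENDING"}

assert main_control(7, fetch=real_fetch_record) == "rejected"
```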

Bottom-Up Integration Test:

Construction and testing begin with the atomic modules. As modules are integrated from the bottom up, the processing required by modules subordinate to a given level is always available, and the need for stubs is eliminated. The following steps implement this strategy:

Low-level modules are combined into clusters that perform a specific software sub-function. A driver is written to coordinate test-case input and output, and the cluster is tested. Drivers are then removed, and clusters are combined moving upward in the program structure. As integration moves upward, the need for separate test drivers lessens; if the top levels of the program structure are integrated top-down, the number of drivers can be reduced substantially and the integration of clusters is greatly simplified.
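A cluster and its throwaway driver can be sketched as follows (illustrative Python; the module names are hypothetical). Two atomic modules are combined into a cluster, and the driver feeds test-case input to the cluster and checks the output:

```python
def normalize_name(name):
    # Atomic module 1: collapse whitespace and title-case a name.
    return " ".join(name.split()).title()

def format_record(name, company):
    # Atomic module 2: builds on module 1 to format a display record.
    return f"{normalize_name(name)} @ {company}"

def cluster_driver():
    # Driver: coordinates test-case input and output for the cluster;
    # it is removed once the cluster is integrated upward.
    cases = [(("  mohd  faiyaz ", "MRCET"), "Mohd Faiyaz @ MRCET")]
    return all(format_record(*args) == expected for args, expected in cases)

assert cluster_driver()
```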


Regression Testing:

Each time a new module is added as part of integration, the software changes. Regression testing is an activity that helps to ensure that these changes do not introduce unintended behavior or additional errors.

Regression testing may be conducted manually, by re-executing a subset of all test cases, or by using automated capture/playback tools, which enable the software engineer to capture test cases and results for subsequent playback and comparison. The regression suite contains different classes of test cases: a representative sample of tests that exercise all software functions, and additional tests that focus on the software functions most likely to be affected by the change.
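The two classes of regression test cases can be sketched as follows (illustrative Python; the function under test is hypothetical). Re-running this subset after each change is the manual form of regression testing described above:

```python
# Hypothetical function under regression test.
def apply_discount(price, percent):
    return round(price * (1 - percent / 100), 2)

# Class 1: a representative sample that exercises all functions.
representative_sample = [
    ((100, 10), 90.0),
    ((250, 50), 125.0),
]

# Class 2: additional tests focused on the behaviour touched by the
# latest change (here, the zero-discount edge case).
change_focused = [
    ((100, 0), 100.0),
]

def run_regression(suite):
    # Re-execute every case and compare against the recorded result.
    return all(apply_discount(*args) == expected for args, expected in suite)

assert run_regression(representative_sample + change_focused)
```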

Validation Testing:

Validation testing demonstrates that the software is traceable to its requirements. This can be achieved through a series of black-box tests.

Test cases:

Module Name: Admin Login
File Name: index.aspx

| Test | Inputs | Actual Output | Obtained Output | Description |
| Valid Login | Uid & pwd | Success | Success | Test passed. Passes control to the other module menus. |
| Invalid Login | Uid & pwd | Failed | Failed | Test passed. Passes control to the error page with an appropriate message. |
| Add Company | Company details | Company added successfully | Enter valid details | Test passed. Passes control to the error page with an appropriate message. |
| Alter Company | Modify company details | Company details updated successfully | Failed: exception message will be displayed | Test passed. Passes control to the error page with an appropriate message. |


Module Name: Company Login
File Name: index.aspx

| Test | Inputs | Actual Output | Obtained Output | Description |
| Valid Login | Uid & pwd | Success | Success | Test passed. Passes control to the other module menus. |
| Invalid Login | Uid & pwd | Failed | Failed | Test passed. Passes control to the error page with an appropriate message. |
| Add Employee | Employee details | Employee added successfully | Enter valid details | Test passed. Passes control to the error page with an appropriate message. |
| Alter Employee | Modify employee details | Employee details updated successfully | Failed: exception message will be displayed | Test passed. Passes control to the error page with an appropriate message. |
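The valid/invalid login rows in the tables above can be sketched as an automated black-box check (illustrative Python; the actual project pages are ASP.NET, and the credential store here is a hypothetical dictionary):

```python
# Hypothetical credential store standing in for the real user table.
USERS = {"admin": "secret"}

def login(uid, pwd):
    # Mirrors the test-case tables: valid credentials pass control to
    # the module menus, invalid credentials route to the error page.
    if USERS.get(uid) == pwd:
        return "module-menu"
    return "error-page"

assert login("admin", "secret") == "module-menu"  # Valid Login row
assert login("admin", "wrong") == "error-page"    # Invalid Login row
```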


Chapter 7: Screen Shots

(Screenshots of the application appear on these pages of the original document.)

Chapter 8: Conclusion

The entire project has been developed and deployed as per the requirements stated by the user, and it has been found to be bug-free as per the testing standards that were applied. Any errors not traced against the specification will be addressed in coming versions, which are planned for development in the near future.


Chapter 9: Bibliography

FOR .NET INSTALLATION

www.support.microsoft.com

FOR DEPLOYMENT AND PACKING ON SERVER

www.developer.com

www.16seconds.com

FOR SQL

www.msdn.microsoft.com

FOR ASP.NET

www.msdn.microsoft.com/net/quickstart/aspplus/default.com

www.asp.net

www.fmexpense.com/quickstart/aspplus/default.com

www.asptoday.com

www.aspfree.com

Reference Books:

Evangelos Petroutsos, “C#.NET Black Book”.

Roger Pressman, “Software Engineering”.

Jain, “SQL FOR PROFESSIONALS”.

Wrox, “Professional ASP.NET”.