


Time Journals of Engineering and Physical Sciences Vol. 2(1): 28-62 June 2014 www.timejournals.org/tjeps © 2013 Time Journals ISSN: 2360-7335

Product quality improvement and project assessment using test maturity model integrated (TMMi)

Vishwa Nath Maurya1*, Diwinder Kaur Arora2, Avadhesh Kumar Maurya3, Dibyanshu Singh4, Anurag Priyadarshi4, Gaurav Anand4 and Nida Ateeq4

1Department of Applied Mathematics and Statistics, School of Science and Technology, The University of Fiji, Fiji Islands.
2Group Centre, Central Reserve Police Force, Lucknow, Ministry of Home Affairs, Govt. of India.
3Department of Electronics and Communication Engineering, Lucknow Institute of Technology, Gautam Buddha Technical University, Lucknow, U.P., India.
4Department of Computer Science and Engineering, Sapthagiri College of Engineering, Bangalore, Visvesvaraya Technological University, Belgaum, India.

Accepted Date: 16 June, 2014

The present paper demonstrates an application of the test maturity model integrated (TMMi) technique for product quality improvement and project assessment in industrial organizations. More specifically, the TMMi technique has been used in the product quality improvement process, keeping in view the fact that many modern industrial and software organizations find value in benchmarking their progress in test process improvement, not only for internal purposes but also for external customers and suppliers. Related literature on testing quality is reviewed, and it is suggested that the TMMi provides an excellent reference model to be used during such assessments. Assessment teams use the TMMi to guide their identification and prioritization of findings. These findings, along with the guidance of TMMi practices, are used to plan improvements for the organization. This application helps in evaluating projects under various companies using TMMi levels and standards, generating reports in the form of graphs that show the areas needing improvement. Key words: Project assessment, capability maturity model, TMMi, CMMi, test process improvement, Windows, MySQL, PHP, Perl or Python, hypertext preprocessor, hypertext markup language, extensible markup language, cascading style sheets, quality assurance.

INTRODUCTION

For the past decade, the software industry has invested substantial effort in improving the quality of its products. This has been a difficult job, since the size and complexity of software increase rapidly while customers and users are becoming more and more demanding.

*Corresponding author. E-mail: [email protected], [email protected].

Despite encouraging results with various quality improvement approaches, the software industry is still far from zero defects. To improve product quality, the software industry has often focused on improving its development processes. A guideline that has been widely used to improve development processes is the Capability Maturity Model. The Capability Maturity Model (CMM) and its successor, the Capability Maturity Model Integration (CMMI), are often regarded as the industry standard for software process improvement. Despite the fact that testing often accounts for at least 30-40% of total project costs, only limited attention is given to testing in the various software process improvement models such as the CMM and the CMMI. As an answer, the testing community has created its own improvement models. The TMMi is a detailed model for test process improvement and is positioned as complementary to the CMMI.

Background

The TMMi framework has been developed by the TMMi Foundation as a guideline and reference framework for test process improvement and is positioned as a complementary model to the CMMI, addressing those issues important to test managers, test engineers and software quality professionals; for more details, we refer to Veenendaal [2009, 2012]. Several noteworthy researchers, e.g., Koomen and Martin [1999]; Koroorian and Kajko-Mattsson [2008]; Maurya and Bathla [2012]; Maurya and Rajender [2012]; Maurya et al. [2013]; Persson and Yilmaztürk [2004]; and Veenendaal and Cannegieter [2012], and references therein, have confined their attention to this direction. Testing as defined in the TMMi is applied in its broadest sense to encompass all software product quality-related activities. Testing is the process consisting of all lifecycle activities, static and dynamic, concerned with planning, preparation and evaluation of software products and related work products to determine that they satisfy specified requirements, to demonstrate that they are fit for purpose and to detect defects. Just like the CMMI staged representation, the TMMi uses the concept of maturity levels for process evaluation and improvement. Furthermore, process areas, goals and practices are identified. Applying the TMMi maturity criteria will improve the test process and have a positive impact on product quality, test engineering productivity, and cycle-time effort. The TMMi has been developed to support organizations in evaluating and improving their test process. Within the TMMi, testing evolves from a chaotic, ill-defined process with a lack of resources, tools and well-educated testers to a mature and controlled process that has defect prevention as its main objective. Practical experiences are positive and show that the TMMi supports the process of establishing a more effective and efficient test process. Testing becomes a profession and a fully integrated part of the development process. As stated, the focus of testing changes from defect detection to defect prevention.

Sources

The development of the TMMi has used the TMM framework as developed by the Illinois Institute of Technology. In addition to the TMM, it was largely guided by the work done on the Capability Maturity Model Integration (CMMI), a process improvement model that has widespread support in the IT industry. The CMMI has both a staged and a continuous representation. Within the staged representation, the CMMI architecture prescribes the stages that an organization must proceed through in an orderly fashion to improve its development process. Within the continuous representation there is no fixed set of levels or stages to proceed through; an organization applying the continuous representation can select areas for improvement from many different categories. The TMMi has been developed as a staged model. The staged model uses predefined sets of process areas to define an improvement path for an organization. This improvement path is described by a model component called a maturity level. A maturity level is a well-defined evolutionary plateau towards achieving improved organizational processes.

Test levels

Whereas some models for test process improvement focus mainly on higher test levels, e.g., Test Process Improvement (TPI), or address only one aspect of structured testing, e.g., the test organization, the TMMi addresses all test levels (including static testing) and all aspects of structured testing. With respect to dynamic testing, both lower test levels (e.g., component test, integration test) and higher test levels (e.g., system test, acceptance test) are within the scope of the TMMi. Studying the model in more detail, one will learn that it addresses all four cornerstones of structured testing (lifecycle, techniques, infrastructure and organization).

Description of the CMMI and TMMi

Capability Maturity Model (CMM) broadly refers to a process improvement approach that is based on a process model.
CMM also refers specifically to the first such model, developed by the Software Engineering Institute (SEI) in the mid-1980s, as well as to the family of process models that followed. A process model is a structured collection of practices that describe the characteristics of effective processes; the practices included are those proven by experience to be effective. CMM can be used to assess an organization against a scale of five process maturity levels. Each level ranks the organization according to its standardization of processes in the subject area being assessed. The subject areas can be as diverse as software engineering, systems engineering, project management, risk management, system acquisition, information technology (IT) services and personnel management. CMM was developed by the SEI at Carnegie Mellon University in Pittsburgh. It has been used extensively for avionics software and government projects in North America, Europe, Asia, Australia, South America, and Africa. Currently, some government departments require software development contractor organizations to achieve and operate at level 3 standards. The Capability Maturity Model was initially funded by military research. The United States Air Force funded a study at the Carnegie-Mellon Software Engineering Institute to create a model (abstract) for the military to use as an objective evaluation of software subcontractors. The result was the Capability Maturity Model, published as Managing the Software Process in 1989. The CMM is no longer supported by the SEI and has been superseded by the more comprehensive Capability Maturity Model Integration (CMMI).

The Capability Maturity Model (CMM) is a way to develop and refine an organization's processes. The first CMM was for the purpose of developing and refining software development processes. A maturity model is a structured collection of elements that describe characteristics of effective processes. A maturity model provides:

a place to start
the benefit of a community's prior experiences
a common language and a shared vision
a framework for prioritizing actions
a way to define what improvement means for your organization.

A maturity model can be used as a benchmark for assessing different organizations for equivalent comparison. It describes the maturity of the company based upon the projects the company is dealing with and its clients. In the 1970s, technological improvements made computers more widespread, flexible, and inexpensive. Organizations began to adopt more and more computerized information systems, and the field of software development grew significantly. This led to an increased demand for developers and managers, which was satisfied with less experienced professionals. Unfortunately, the influx of growth caused growing pains: project failure became more commonplace not only because the field of computer science was still in its infancy, but also because projects became more ambitious in scale and complexity. In response, individuals such as Edward Yourdon, Larry Constantine, Gerald Weinberg, Tom DeMarco, and David Parnas published articles and books with research results in an attempt to professionalize the software development process. Watts Humphrey's Capability Maturity Model (CMM) was described in the book "Managing the Software Process" (1989). The CMM as conceived by Watts Humphrey was based on the earlier work of Phil Crosby. Active development of the model by the SEI began in 1986. The CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project. Though it comes from the area of software development, it can be, has been, and continues to be widely applied as a general model of the maturity of processes in IS/IT (and other) organizations.

It is also important to note that the TMMi is positioned as a complementary model to the CMMI. In many cases a given TMMi level needs specific support from process areas at its corresponding CMMI level or from lower CMMI levels. In exceptional cases there is even a relationship to higher CMMI levels. Process areas and practices that are elaborated within the CMMI are mostly not repeated within the TMMi; they are only referenced. For example, the process area Configuration Management, which is also applicable to test (work) products/testware, is not elaborated upon in detail within the TMMi; the practices from the CMMI are referenced and implicitly re-used. Many organizations find value in benchmarking their progress in test process improvement, both for internal purposes and for external customers and suppliers. Test process assessments focus on identifying improvement opportunities and understanding the organization's position relative to the selected model or standard. The TMMi provides an excellent reference model to be used during such assessments. Assessment teams use the TMMi to guide their identification and prioritization of findings. These findings, along with the guidance of TMMi practices, are used to plan improvements for the organization. For more details, we refer to some noteworthy researchers like Runeson [2006]; Sebesta [2008]; Stefan et al. [2005] and Veenendaal and Cannegieter [2012].

Overview of the present work

The TMMi has a staged architecture for process improvement, as shown in Figure 1. Recently, Maurya et al. [2013, 2014] focused their attention on a few architectural features of the TMMi criteria. It contains stages or levels through which an organization passes as its testing process evolves from one that is ad hoc and unmanaged to one that is managed, defined, measured, and optimized. Achieving each stage ensures that an adequate foundation has been laid for the next stage.
The internal structure of the TMMi is rich in testing practices that can be learned and applied in a systematic way to support a quality testing process that improves in incremental steps. There are five levels in the TMMi that prescribe a maturity hierarchy and an evolutionary path to test process improvement. Each level has a set of process areas that an organization needs to implement to achieve maturity at that level.

Figure 1. TMMi maturity levels and process areas.

Experience has shown that organizations do their best when they focus their test process improvement efforts on a manageable number of process areas at a time, and that those areas require increasing sophistication as the organization improves. Because each maturity level forms a necessary foundation for the next level, trying to skip a maturity level is usually counter-productive. At the same time, one must recognize that test process improvement efforts should focus on the needs of the organization in the context of its business environment, and the process areas at higher maturity levels may address the current needs of an organization or project.

Level 1: Initial

At TMMi level 1, testing is a chaotic, undefined process and is often considered a part of debugging. The organization usually does not provide a stable environment to support the processes. Success in these organizations depends on the competence and heroics of the people in the organization, not on the use of proven processes. Tests are developed in an ad hoc way after coding is completed. Testing and debugging are interleaved to get the bugs out of the system. The objective of testing at this level is to show that the software runs without major failures. Products are released without adequate visibility regarding quality and risks. In the field, the product often does not fulfill its needs, is not stable, and/or is too slow. Within testing there is a lack of resources, tools and well-educated staff. At TMMi level 1 there are no defined process areas. Maturity level 1 organizations are characterized by a tendency to over-commit, abandonment of processes in times of crisis, and an inability to repeat their successes. In addition, products tend not to be released on time, budgets are overrun and delivered quality is not according to expectations.

Level 2: Managed

At TMMi level 2, testing becomes a managed process and is clearly separated from debugging. The process discipline reflected by maturity level 2 helps to ensure that existing practices are retained during times of stress. However, testing is still perceived by many stakeholders as a project phase that follows coding. In the context of improving the test process, a company-wide or program-wide test strategy is established. Test plans are also developed. Within the test plan a test approach is defined, whereby the approach is based on the result of a product risk assessment. Risk management techniques are used to identify the product risks based on documented requirements. The test plan defines what testing is required, when, how and by whom. Commitments are established with stakeholders and revised as needed. Testing is monitored and controlled to ensure it is going according to plan, and actions can be taken if deviations occur. The status of the work products and the delivery of testing services are visible to management. Test design techniques are applied for deriving and selecting test cases from specifications. However, testing may still start relatively late in the development lifecycle, e.g., during the design or even the coding phase. At TMMi level 2, testing is multi-leveled: there are component, integration, system and acceptance test levels. For each identified test level there are specific testing objectives defined in the organization-wide or program-wide test strategy. The processes of testing and debugging are differentiated. The main objective of testing in a TMMi level 2 organization is to verify that the product satisfies the specified requirements. Many quality problems at this TMMi level occur because testing occurs late in the development lifecycle. Defects are propagated from the requirements and design into code. There are no formal review programs as yet to address this important issue. Post-code, execution-based testing is still considered by many stakeholders to be the primary testing activity. The process areas at TMMi level 2 are:

Test policy and strategy
Test planning
Test monitoring and control
Test design and execution
Test environment

Level 3: Defined

At TMMi level 3, testing is no longer confined to a phase that follows coding. It is fully integrated into the development lifecycle and the associated milestones. Test planning is done at an early project stage, e.g., during the requirements phase, and is documented in a master test plan. The development of a master test plan builds on the test planning skills and commitments acquired at TMMi level 2. The organization's set of standard test processes, which is the basis for maturity level 3, is established and improved over time. A test organization and a specific test training program exist, and testing is perceived as a profession. Test process improvement is fully institutionalized as part of the test organization's accepted practices. Organizations at level 3 understand the importance of reviews in quality control; a formal review program is implemented, although not yet fully linked to the dynamic testing process. Reviews take place across the lifecycle, and test professionals are involved in reviews of requirements specifications. Whereas test designs at TMMi level 2 focus mainly on functionality testing, test designs and test techniques are expanded at level 3 to include non-functional testing, e.g., usability and/or reliability, depending on the business objectives. A critical distinction between TMMi maturity levels 2 and 3 is the scope of the standards, process descriptions, and procedures. At maturity level 2 these may be quite different in each specific instance, e.g., on a particular project. At maturity level 3 these are tailored from the organization's set of standard processes to suit a particular project or organizational unit and are therefore more consistent, except for the differences allowed by the tailoring guidelines. Another critical distinction is that at maturity level 3, processes are typically described more rigorously than at maturity level 2. As a consequence, at maturity level 3 the organization must revisit the maturity level 2 process areas. The process areas at TMMi level 3 are:

Test organization
Test training program
Test lifecycle and integration
Non-functional testing
Peer reviews

Level 4: Measured

Achieving the goals of TMMi levels 2 and 3 has the benefit of putting into place a technical, managerial, and staffing infrastructure capable of thorough testing and providing support for test process improvement. With this infrastructure in place, testing can become a measured process to encourage further growth and accomplishment. In TMMi level 4 organizations, testing is a thoroughly defined, well-founded and measurable process. Testing is perceived as evaluation; it consists of all lifecycle activities concerned with checking products and related work products. An organization-wide test measurement program is put into place that can be used to evaluate the quality of the testing process, to assess productivity, and to monitor improvements. Measures are incorporated into the organization's measurement repository to support fact-based decision making. A test measurement program also supports predictions relating to test performance and cost. With respect to product quality, the presence of a measurement program allows an organization to implement a product quality evaluation process by defining quality needs, quality attributes and quality metrics. (Work) products are evaluated using quantitative criteria for quality attributes such as reliability, usability and maintainability. Product quality is understood in quantitative terms and is managed to the defined objectives throughout the lifecycle. Reviews and inspections are considered to be part of the test process and are used to measure product quality early in the lifecycle and to formally control quality gates. Peer reviews as a defect detection technique are transformed into a product quality measurement technique, in line with the process area Product Quality Evaluation. TMMi level 4 also covers establishing a coordinated test approach between peer reviews (static testing) and dynamic testing, and the usage of peer review results and data to optimize the test approach, with both aiming at making testing more effective and more efficient. Peer reviews are now fully integrated with the dynamic testing process, e.g., as part of the test strategy, test plan and test approach. The process areas at TMMi level 4 are:

Test Measurement
Product Quality Evaluation
Advanced Peer Reviews

Level 5: Optimization

The achievement of all previous test improvement goals at levels 1 through 4 of the TMMi has created an organizational infrastructure for testing that supports a completely defined and measured process. At TMMi maturity level 5, an organization is capable of continually improving its processes based on a quantitative understanding of statistically controlled processes.

Improving test process performance is carried out through incremental and innovative process and technological improvements. The testing methods and techniques are optimized and there is a continuous focus on fine-tuning and process improvement. Level 5, optimization, is the most important from an economic point of view. Several researchers have contributed economic perspectives on different frameworks of the software testing process. In 2006, Ramler and Wolfmaier presented economic perspectives on test automation, balancing automated and manual testing with opportunity cost. Subsequently, tremendous research work in this direction can be found in the literature by other worth-mentioning researchers, e.g., Koomen and Martin [1999]; Maurya and Maurya [2013]; Maurya and Bathla [2012]; Maurya et al. [2013, 2014] and Rajender et al. [2012]. An optimized test process as defined by the TMMi is one that has the following characteristics:

managed, defined, measured, efficient and effective
statistically controlled and predictable
focused on defect prevention
supported by automation as much as is deemed an effective use of resources
able to support technology transfer from the industry to the organization
able to support re-use of test assets
focused on process change to achieve continuous improvement.

To support the continuous improvement of the test process infrastructure, and to identify, plan and implement test improvements, a permanent test process improvement group is formally established and is staffed by members who have received specialized training to increase the level of their skills and knowledge required for the success of the group. In many organizations this group is called a Test Process Group. Support for a Test Process Group formally begins at TMMi level 3, when the test organization is introduced. At TMMi levels 4 and 5, the responsibilities grow as more high-level practices are introduced. At TMMi level 5, the Test Process Optimization process area introduces mechanisms to fine-tune and continuously improve testing. There is an established procedure to identify process enhancements as well as to select and evaluate new testing technologies. Tools support the test process as much as is effective during test design, test execution, regression testing, test case management, defect collection and analysis, etc. Process and testware reuse across the organization is also common practice and is supported by a test (process) asset library. The three TMMi level 5 process areas, Defect Prevention, Quality Control and Test Process Optimization, all provide support for continuous process improvement. In fact, the three process areas are highly interrelated. At TMMi level 5, testing is a process with the objective of preventing defects. The process areas at TMMi level 5 are:

Defect Prevention
Quality Control
Test Process Optimization

Components of the TMMi

Maturity levels

A maturity level within the TMMi can be regarded as a degree of organizational test process quality. It is defined as an evolutionary plateau of test process improvement. Each level progressively develops an important part of the organization's test processes. There are five maturity levels within the TMMi, and each maturity level tells what to implement in order to achieve that level. The higher the maturity level an organization achieves, the more mature its test process is. To reach a particular maturity level, an organization must satisfy all of the appropriate goals (both specific and generic) of the process areas at that level and also those at earlier maturity levels. Note that all organizations possess a minimum of TMMi level 1, as this level does not contain any goals that must be satisfied.
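The achievement rule above — a maturity level is reached only when the goals of its process areas and of every lower level's process areas are all satisfied — can be sketched as a small function. This is a hypothetical Python illustration only (the application described in this paper was built with PHP and MySQL); the level-to-process-area mapping follows the lists given in this paper, and the set of satisfied process areas is assumed to come from an assessment:

```python
# Process areas per TMMi maturity level, following the lists in this paper.
# Level 1 (Initial) defines no process areas, so every organization
# possesses at least level 1.
TMMI_PROCESS_AREAS = {
    2: {"Test Policy and Strategy", "Test Planning",
        "Test Monitoring and Control", "Test Design and Execution",
        "Test Environment"},
    3: {"Test Organization", "Test Training Program",
        "Test Lifecycle and Integration", "Non-functional Testing",
        "Peer Reviews"},
    4: {"Test Measurement", "Product Quality Evaluation",
        "Advanced Peer Reviews"},
    5: {"Defect Prevention", "Quality Control",
        "Test Process Optimization"},
}

def achieved_level(satisfied):
    """Return the highest TMMi level whose process areas, and those of
    all lower levels, are all contained in the `satisfied` set."""
    level = 1
    for lvl in sorted(TMMI_PROCESS_AREAS):
        if TMMI_PROCESS_AREAS[lvl] <= satisfied:  # subset test
            level = lvl
        else:
            break  # maturity levels cannot be skipped
    return level
```

Under this rule, an organization satisfying every level 2 and level 3 process area but missing one level 4 area is assessed at level 3, and one missing a single level 2 area remains at level 1, regardless of how many higher-level areas it satisfies.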

Process areas

As stated, with the exception of level 1, each maturity level consists of several process areas that indicate where an organization should focus to improve its test process. Process areas identify the issues that must be addressed to achieve a maturity level. Each process area identifies a cluster of test-related activities. When the practices are all performed, a significant improvement in activities related to that area will be made. In the TMMi, only those process areas that are considered to be key determinants of test process capability are identified. Runeson [2006] and Sebesta [2008] proposed in their previous research studies that all process areas of the maturity level and of the lower maturity levels must be satisfied for a maturity level to be considered achieved.

Problem statement

The aims are to take measurable steps that score an organization's existing maturity using TMMi Foundation standards, to benchmark the test maturity of any organization against industry standards, and to help organizations meet the TMMi standards.

Objectives of present study

More and more organizations are trying to improve their software development processes. The reference improvement model that is most often used is the Capability Maturity Model Integration (CMMI). In this model, process areas related to testing, verification and validation, are described.

However, the level of detail is limited, especially from the viewpoint of the test professional. To fill this gap the Test Maturity Model (TMM) was developed, which has since been succeeded by the Test Maturity Model Integration (TMMi).

This application helps the organization meet the requirements needed to achieve TMMi standards. It evaluates the projects under given companies, that is, which TMMi level a project belongs to, and reports are generated stating whether a certain project needs improvement. If so, the areas in which a project can improve are specified.

LITERATURE SURVEY

Companies in the business of software production and development face many serious problems, one of which concerns project assessment and keeping continuous track of the projects under a certain company. Most organizations don't have a standard process for defining, managing and reporting on quality activities. Without organizational buy-in, where everyone understands and accepts that they all 'own' quality, it is impossible to create the right processes, metrics, continuous improvement plans and foundations to support quality.

The number of organizations that still view testing and quality as an afterthought to the development process, and implement an approach half-way through the project that is both chaotic and expensive, is surprising. In these organizations QA and testing become a bottleneck between the development and production teams, and because the end date is near, not enough time is available to ensure a high-quality product; see Runeson [2006].

To solve this, a bullet-proof approach to managing software quality is needed, one that addresses even the most elementary business goals and objectives.

The concept of an independent industry test maturity model (TMM) is used to make project development sound. Surveys in connection with the enhancement of the software testing process can be found in research works carried out by earlier researchers: Fewster and Graham [1]; Kaner [1997]; Karlsson et al. [2000]; Martin et al. [2002]; Maurya et al. [2013, 2014] and Veenendaal and Cannegieter [2012]. The relevant research works and textbooks on the subject authored by Martin et al. [2002]; Runeson [2006] and Stefan et al. [2005] are also worth mentioning.


Summary of the prior work

Enhancement and optimization of performance play a vital role in emerging areas of science and technology, particularly in improving test process quality in multinational software organizations. Due to the increasing demand for enhanced models of test process quality, a large number of noteworthy researchers have contributed research works in this connection. For the sequential development of the software testing process, we refer to previous researchers for their recent research works in different frameworks of software testing, e.g. Maurya and Bathla [2012]; Maurya and Rajender [2012]; Maurya et al. [2013, 2014]; Veenendaal and Cannegieter [2012] and Veenendaal [2012]. In 2012, Rajender et al. proposed a cost-benefit model for evaluating regression testing; subsequently, Maurya et al. [2014] examined software testing development by applying a new fuzzy logic model. Although the concept of an independent industry test maturity model (TMM) had been around for years, originally presented in 1996 by the Illinois Institute of Technology, it wasn't until 2005 that testing and QA experts across the world decided to look at the model in more detail and consider what could be done to better the industry. The TMMi framework has been developed by the TMMi Foundation as a guideline and reference framework for test process improvement and is positioned as a complementary model to CMMI Version 1.2 [CMMI], addressing those issues important to test managers, test engineers and software quality professionals. Testing as defined in the TMMi is applied in its broadest sense to encompass all software product quality-related activities. Practical experiences are positive and show that TMMi supports the process of establishing a more effective and efficient test process. Testing becomes a profession and a fully integrated part of the development process. As stated, the focus of testing changes from defect detection to defect prevention.

The development of the TMMi has used the TMM framework, as developed by the Illinois Institute of Technology, as one of its major sources. In addition to the TMM, it was largely guided by the work done on the Capability Maturity Model Integration (CMMI), a process improvement model that has widespread support in the IT industry. Other sources for the TMMi development include Gelperin and Hetzel's Evolution of Testing Model, which describes the evolution of the testing process over a 40-year period; Beizer's testing model, which describes the evolution of the individual tester's thinking; research on the TMM carried out in the EU-funded MB-TMM project; and international testing standards, e.g. IEEE 829, the Standard for Software Test Documentation. As stated, for defining the maturity levels, the evolutionary testing model of Gelperin and Hetzel has served as a foundation for historical-level differentiation in the TMMi. The Gelperin and Hetzel model describes phases and

test goals for the 1950s through the 1990s. The initial period is described as "debugging oriented", during which most software development organizations had not clearly differentiated between testing and debugging. Testing was an ad hoc activity associated with debugging to remove bugs from programs. Testing has, according to Gelperin and Hetzel, since progressed to a "prevention-oriented" period, which is associated with current best practices and reflects the highest maturity level of the TMMi. For more details, we refer to Maurya et al. [2013, 2014]; Runeson [2006]; Stefan et al. [2005] and Veenendaal and Cannegieter [2012]. Furthermore, various industry best practices, practical experience using the TMM and testing surveys have contributed to the TMMi development, providing it with its necessary empirical foundation and required level of practicality. These illustrate the current best and worst testing practices in the IT industry, and have allowed the developers of the TMMi framework to extract realistic benchmarks by which to evaluate and improve testing practices; we refer to Persson and Yilmaztürk [2004] and Runeson [2006].

Outcome of the review: problems identified

The existing system has the problem that it is not developed to a broad level and has too high a complexity. It is also tedious to convert it into an application, so increasing size and complexity are a big challenge. Deadlines, competition and outsourcing are three other main challenges that must be faced. If a deadline is crossed, the project assessment gets delayed, due to which the requirement lists get delayed, and this affects the overall TMMi approach. The following things are needed:

The need to know the weaknesses, in order to improve the efficiency and effectiveness of the test function, with ensuing financial and other savings; the need to know where the project stands (often compared against industry-sector benchmarks); and the need to know what must be done to gain a certain level of capability/maturity.

Hence, with the rise of complexities and the large amount of paperwork involved, evaluating projects becomes difficult. This application aims at reducing the paperwork and helping to reduce the rising complexities; for further details we refer to Martin et al. [2002]; Maurya and Bathla [2012]; Maurya et al. [2013, 2014] and Ramler and Wolfmaier [2006]. Furthermore, the recent research work of Maurya et al. [2014] in this direction is also worth mentioning.

Proposed work

The proposed model is to seek the test maturity level that



Figure 2. The ratings provided during assessment and their meanings.

is clearly going to identify which project of which company is at which TMMi level. We need to monitor and examine the results and requirements to make the test process improve, and the ratings provided for the TMMi standard questions will tell us the areas in which we are lagging.

The model has an admin who can add any number of companies for assessment and view the progress of the various projects within a company. Each company has a CEO who can add Employees and Managers for the different levels of designation they belong to, and who can also see a project's progress. Each project is self-assessed by the Employee and also by his assessor, who can mark the TMMi standard questions on a rating of 0-5, as shown in Figure 2. The TMMi questions are segregated into four levels, with the initial level 1 being the ad hoc level. The four assessed levels are as follows: level 2 is Managed, level 3 is Defined, level 4 is Measured and level 5 is Optimization. A Manager cannot rate a project's questions for a level without the project having full credit at the previous level; that is, one level has to be cleared before assessment of the next level takes place. The Manager can rate the project for a certain TMMi level only after the Employee has finished rating the project up to that level. The Manager's ratings for the questions are what is considered when assessing a project, and a Manager can handle many projects. Based on the ratings provided for each project, reports are generated in the form of graphs. The project generates three reports in the form of graphs, with the following details:

Level graph: This graph takes the average of the TMMi-based questions for each level. This graph takes: X-axis: all the TMMi levels 2, 3, 4 and 5.

Y-axis: the average of the ratings for all the questions in one particular TMMi level.

Detailed graph: This graph shows the rating provided for each question in any selected TMMi level. A pull-down menu allows the user to select the TMMi level for which they want to see the graph. This graph takes: X-axis: question numbers. Y-axis: the rating for that particular question.

Project vis-a-vis levels: This graph shows which project is at which TMMi level. This graph takes: X-axis: project ID. Y-axis: TMMi levels.

The three graph representations are as follows. Bar graph: a graph with rectangular bars whose lengths are proportional to the values that they represent; it is a graphical display of data using bars of different heights. Line graph: displays information as a series of data points connected by straight line segments. Scatter plot: the data is displayed as a collection of points, each having the value of one variable determining its position on the horizontal axis and the value of the other variable determining its position on the vertical axis.

Hence, each project is assessed several times before it can clear the five TMMi levels. The criteria to be fulfilled for each question asked can also be viewed, and the project details can be exported to Microsoft Excel sheets. Hence, a project can be assessed during the testing phase and improvements can be made to it.

SYSTEM REQUIREMENTS

Functional requirements

Functional requirements capture the intended behaviour


of the system. This behaviour may be expressed as services, tasks or functions the system is required to perform. Functional requirements cover very important system requirements such as technical specifications, system design parameters and guidelines, data manipulation, data processing and calculation modules, etc. The system shall allow the admin to add a company. The system shall allow the admin to view the projects of any company, their reports generated in the form of graphs, and the areas they need to improve upon. The system shall give the admin control over the database. The system shall allow the CEO of a company to view the projects of his company and see the graph of every project. The system shall allow the CEO to add the Employees and Managers of his company. The system shall allow an Employee to add a project and to select his Manager. The system shall allow the Employee to self-assess the project. The Manager shall assess the project, and the TMMi level of the project shall be decided by the Manager's rating. The system shall allow the admin, CEO, Manager and Employee to view their respective generated graphs. The system shall allow details of the graphs to be exported to Excel files.

Non-functional requirements

Non-functional requirements are the constraints on the services or functions offered by the system. They include timing constraints on the development process, standards, etc.

The following are the various components of the non-functional requirements for our proposed system. The system should be simple and easy to use, and it should improve the project assessment process during the testing phase of an organization.

Hardware requirements

Computer systems with 512 MB RAM.

Software requirements

For the proposed work, the following software is required: WAMP server version 2.2. Notepad++ editor version 5.9.

Web browser such as Google Chrome. Microsoft Excel spreadsheet. Operating system: Windows 7.

SYSTEM DESIGN

Design is one of the most important phases of software development. Design is a creative process in which a system organization is established that will satisfy the functional and non-functional system requirements. Large systems are always decomposed into subsystems that provide some related set of services. The output of the design process is a description of the system architecture.

Application architecture

The architectural design process is concerned with establishing a basic structural framework for a system. It involves identifying the major components of the system and the communication between these components.

The system has four logins for four different users, each having a separate function. The four users and the other modules shown in Figure 3 are as follows. Admin: the admin handles the whole tool and the database; the admin can add companies and view the different projects under all the companies and their generated reports. CEO: the CEO can add Employees and Managers and also see the reports generated in the form of graphs for each project in that company. Employee: the Employee can add projects, select his manager, self-assess the project and check the areas to be improved. Manager: the Manager can assess projects and view the reports generated in the form of graphs. The MySQL database stores and retrieves the values. The WAMP server connects PHP, HTML and MySQL and delivers the output to the web browser.

UML diagrams

The Unified Modeling Language (UML) is a standardized general-purpose modeling language in the field of object-oriented software engineering. The UML notation is rich and full-bodied.

There is a notation for modeling the static elements of a design such as classes, attributes, and relationships. There is also a notation for modeling the dynamic elements of a design such as objects, messages and finite state machines.



Figure 3. Application architecture.

Sequence diagram

A sequence diagram in the Unified Modeling Language (UML) is a kind of interaction diagram that shows how processes operate with one another and in what order. It is a construct of a message sequence chart. A sequence diagram shows object interactions arranged in time sequence; it depicts the sequence of messages exchanged between the objects needed to carry out the functionality of the scenario. Sequence diagrams are typically associated with use case realizations in the logical view of the system under development. A sequence diagram shows, as parallel vertical lines (lifelines), the different processes or objects that live simultaneously and, as horizontal arrows, the messages exchanged between them in the order in which they occur. This allows the specification of simple runtime scenarios in a graphical manner. Figures 4, 5, 6 and 7 show the sequence diagrams for the admin, CEO, Employee and Manager.

Use case diagram

A use case diagram at its simplest is a representation of a user's interaction with the system, depicting the specification of a use case. A use case diagram can portray the different types of users of a system and the various ways in which they interact with the system. Figure 8 shows the actors (admin, CEO, Manager and Employee) and each of their functionalities in the project assessment application.

ER diagram

In software engineering, an entity-relationship model (ER model) is a data model for describing a database in an abstract way. In the case of a relational database, which stores data in tables, some of the data in these tables point to data in other tables. Diagrams created to design these entities and relationships are called entity-relationship diagrams or ER diagrams. Figure 9 shows the tables of the project assessment database and their relationships.

Figure 4. Sequence diagram for admin user.

Figure 5. Sequence diagram for CEO of a particular company.

Figure 6. Sequence diagram for manager of a particular company.

Figure 7. Sequence diagram for employees of a particular company.

Figure 8. Use case diagram for project assessment application.

Figure 9. ER diagram of the database.

Activity diagram

Activity diagrams are graphical representations of workflows of stepwise activities and actions, with support for choice, iteration and concurrency. In the Unified Modeling Language, activity diagrams can be used to

describe the business and operational step-by-step workflows of components in a system. An activity diagram shows the overall flow of control. Figure 10 shows the activities and their dependencies.

IMPLEMENTATION

Implementation of any software is always preceded by important decisions regarding the selection of the platform and language used. WAMP server software as platform is



Figure 10. Activity diagram of functionalities.

used, and the languages used are PHP, JavaScript and HTML.

Modules

This section contains a detailed description of modules

and sub-modules of the project. The four modules are as follows: Admin, Employee, CEO and Manager.


Module I: Admin

This module contains all the preliminary details about the companies that are going to be added to the assessment tool. Essentially, this is the administrator of the software. The admin has the ability to add a company and its details, such as the company name and address, and to view reports in the form of graphs for each project in any company. This module consists of the following PHP files:

Admin_view_projects.php: opens content_dashboard_admin.php, which shows the project name, the employee and the employee's levels, and the manager and the manager's levels.

Check_results.php: opens content_check_result.php, which compares the ratings given by the employee with the ratings given by the manager; if they are the same, the project is moved up one level, and if not, the employee is informed that the level was not cleared.

Register_company.php: opens content_register_company.php, in which the server asks for information about a company, such as its name and address, and the company gets registered.

Content_show_graph_admin.php: shows the overall graph displaying the level of each project, that is, all the projects with their levels and descriptions.

Nav_superadmin.php: shows the content of the admin dashboard, which consists of the home page link and the add companies, view projects, show project graphs and logout PHP links.

Company_details.php: directs to content_company_details.php, where the details of a company are displayed.

Module II: Employee

This module is for an employee belonging to a particular company. All the details about an employee are shown here, and this module can add a project into the assessment tool, which can then be self-assessed. The Employee can also view reports in the form of graphs and see the areas in which the project is lacking. This module consists of the following PHP files:

Create_project.php: opens content_create_project.php, which is used to create a new project inside the assessment tool; this is done according to the employee_id.

Edit_project.php: opens content_edit_project.php, which is basically used to add or update any details of a project which has been added by

the employee.

Check_results.php: opens content_check_result.php, which compares the ratings provided by the employee with the ratings given by the manager; if they are the same, the project is moved up one level, and if not, the employee is informed that the level was not cleared.

Nav_employee.php: shows the content of the employee dashboard, which consists of the home page link and the add project, view projects, show project graphs and logout PHP links.

Content_show_graph_employee.php: shows the detailed and level graphs. The detailed graph gives the rating provided for each question rated by the employee and the manager. The level graph gives the average of the ratings provided in each level. Both graphs show the employee's and the manager's ratings in comparison.
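The level graph described above averages the 0-5 ratings of all questions within a TMMi level. A minimal sketch of that computation follows; the data layout is an assumption for illustration, not the application's actual PHP code.

```python
# Average the question ratings within each TMMi level, producing the
# (level, average) points that the level graph plots.
def level_graph_points(ratings_by_level):
    # ratings_by_level: {tmmi_level: [rating for each question in the level]}
    return {level: sum(r) / len(r) for level, r in sorted(ratings_by_level.items())}

sample = {2: [4, 5, 3], 3: [2, 2, 2], 4: [0, 1], 5: [0]}
print(level_graph_points(sample))  # {2: 4.0, 3: 2.0, 4: 0.5, 5: 0.0}
```

The detailed graph needs no aggregation: it plots each question number against its own rating.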

Module III: CEO

This module presents the functionality of the CEO of an organization. Every company has its own CEO (Chief Executive Officer), which means that for the many companies added by the admin there is an individual CEO for each company. The CEO can view every project and its details belonging to his company and can see a full overview of all the project reports. The PHP files are as follows:

Add_employee.php: opens content_add_employee.php, which is used to add an employee or a manager.

Check_results.php: opens content_check_result.php, which compares the ratings given by the employee with the ratings given by the manager; if they are the same, the project is moved up one level, and if not, the employee is informed that the level was not cleared.

Company_projects.php: opens content_company_projects.php, which shows the project name, the employee and the employee's level, and the manager and the manager's level.

Dashboard_admin.php: opens content_dashboard_superadmin.php, which consists of the home page link and the add employee, view projects, show project graphs and logout PHP links.

Content_show_graph_admin.php: shows the overall graph displaying the level of each project, that is, all the projects with their levels and descriptions.

Module IV: Manager

This module is the most important of all the modules. This



Table 1. Unit test case of storing values in database.

Sl. no. of test case: Unit test 1
Name of the test case: Receive input
Feature to be tested: Input to be saved in the database
Sample input: Input from users
Expected output: Stores the data
Remarks: Pass

Table 2. Unit test case of retrieving values from database.

Sl. no. of test case: Unit test 2
Name of the test case: Retrieving from database
Feature to be tested: Data to be retrieved from the database and displayed
Sample input: Request to display information
Expected output: Data is retrieved and shown
Remarks: Pass

uses the manager designation to handle the computational and comparative study of projects, producing results and suggestions based on these studies. The manager rates a project which has already been rated by the employee, so as to make the project more efficient and successful. The manager's ratings will show us the flaws and errors which the employee failed to address; this is the main task done by this module. The PHP files under this module are as follows:

Manager_rate.php: opens content_manager_rate.php, which shows how a manager rates a project and how the project clears the TMMi levels from 2 to 5.

Check_results.php: opens content_check_result.php, which compares the rating given by the employee with the rating given by the manager; if they are the same, the project is moved up one level, and if not, the employee is informed that the level was not cleared.

Dashboard_manager.php: opens content_dashboard_manager.php, which consists of the home page link and the add manager rating, view projects, show project graphs and logout PHP links.

Content_show_graph_manager.php: shows the detailed and level graphs. The detailed graph gives the rating provided for each question rated by the employee and the manager. The level graph gives the average of the ratings provided in each level. Both graphs show the employee's and the manager's ratings in comparison.

These are the important modules used in the implementation of this project assessment tool.
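The rating workflow repeated across the module descriptions — the manager may rate a level only after the employee has rated up to it and the level below is cleared, and a level is cleared only when the two sets of ratings match — can be sketched as follows. Function and field names are hypothetical; the PHP files above hold the real logic.

```python
# Sketch of the level-gating rule and the check_results comparison.
def manager_may_rate(level, employee_rated_up_to, current_project_level):
    # the employee must have self-assessed up to this level, and the
    # project must already have cleared the level below
    return employee_rated_up_to >= level and current_project_level >= level - 1

def check_results(project_level, employee_ratings, manager_ratings):
    # matching ratings promote the project one level; otherwise the
    # employee is informed that the level was not cleared
    if employee_ratings == manager_ratings:
        return project_level + 1, "level cleared"
    return project_level, "level not cleared"

print(check_results(2, [5, 4], [5, 4]))  # (3, 'level cleared')
print(check_results(2, [5, 4], [5, 3]))  # (2, 'level not cleared')
```

Because the manager's ratings are the ones that count in assessment, a mismatch leaves the project at its current level and notifies the employee, exactly as check_results.php is described to do.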

RESULTS AND TESTING

Testing

Testing is the set of activities that can be planned in advance and conducted systematically.

Unit testing

Unit test plans cover the basic functionalities of each module. At the end of this testing phase, each unit was found to be working satisfactorily with regard to the expected output from the module.

Unit testing of storing in the database: storing values in the database is tested independently and the test case is tabulated in Table 1.

Unit testing of retrieving values from the database: retrieving values from the database is tested independently and the test case is tabulated in Table 2.

Unit testing of adding a company: a company is added by the admin; this is tested independently and the test case is tabulated in Table 3.

Unit testing of adding an employee: an employee is added by the CEO; this is tested independently and the test case is tabulated in Table 4.

Unit testing of adding a project: a project is added by the employee; this is tested independently and the test case is tabulated in Table 5.
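Unit tests 1 and 2 — storing a value and retrieving it — can be reproduced against an in-memory database. This sketch uses SQLite as a stand-in for the application's MySQL backend; the table and column names are assumptions for illustration.

```python
import sqlite3

# Reproduce unit tests 1 and 2: store a value in the database and
# retrieve it again.  SQLite stands in for the MySQL backend here.
def run_store_retrieve_test():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE company (name TEXT, address TEXT)")
    # unit test 1: input from the user is saved in the database
    conn.execute("INSERT INTO company VALUES (?, ?)", ("Acme", "Bangalore"))
    # unit test 2: the stored data is retrieved and displayed
    row = conn.execute("SELECT name, address FROM company").fetchone()
    conn.close()
    return row

print(run_store_retrieve_test())  # ('Acme', 'Bangalore')
```

The test passes when the retrieved row equals the stored input, matching the expected outputs of Tables 1 and 2.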



Table 3. Unit test case of adding companies.

Table 4. Unit test case of adding an employee.

Sl. no. of test case: Unit test 4
Name of the test case: Adding an employee
Feature to be tested: Employee to be added to the company
Sample input: Name, email, type, level
Expected output: Employee added to the company
Remarks: Pass

Table 5. Unit test case of adding a project.

Sl. no. of test case: Unit test 5
Name of the test case: Adding a project
Feature to be tested: Project added to a company
Sample input: Project name, owner, manager level, status, start date, end date, industry, manager
Expected output: Project added to the database
Remarks: Pass

Table 6. Unit test case of editing project details.

Sl. no. of test case: Unit test 6
Name of the test case: Editing project details
Feature to be tested: Editing of a project's details
Sample input: Project name, owner, manager level, status, start date, end date, industry, manager
Expected output: Details changed in the database
Remarks: Pass

Unit testing of editing project details: a project's details are edited by the employee; this is tested independently and the test case is tabulated in Table 6.

Integration testing

Using the integration test plans, the graphs displayed, the ratings provided for questions and the reports generated are tested. At the end of this testing phase, each subsystem was found to be working satisfactorily with regard to the

expected output from the subsystems. Integration testing tests the interactions and links between modules.

Integration testing of the graph: here the admin, CEO, manager and employee modules are combined into one subsystem. Making the various graphs requires the ratings of the employee and the manager, and the test case is tabulated in Table 7.

Integration testing of manager notification for a new project: here the manager and employee modules are

Sl. no. of test case: Unit test 3
Name of the test case: Adding a company
Feature to be tested: New company to be added by the admin
Sample input: Company name, address, email, password
Expected output: Company added
Remarks: Pass



Table 7. Integration test case of viewing a graph.

Sl. no. of test case: Integration test 1
Name of the test case: Viewing the graph
Feature to be tested: The ratings of the employee and the manager are taken and the graph is made from them
Sample input: Employee rating, manager rating
Expected output: The different types of graphs
Remarks: Pass

Table 8. Integration test case of manager notification of new project.

Sl. no. of test case: Integration test 2
Name of the test case: Manager is notified
Feature to be tested: Manager should be notified about the new project which he has to assess
Sample input: Project created by employee
Expected output: Project appears in the list of projects the manager is assessing
Remarks: Pass

Table 9. Integration test case of employee rating reflecting on manager rating.

Sl. no. of test case: Integration test 3
Name of the test case: Employee rating reflected in manager rating
Feature to be tested: Manager can rate once the employee has rated
Sample input: Employee rates the project
Expected output: Manager is allowed to rate
Remarks: Pass

Table 10. Integration test of criteria improvement.
Sl. No. of test case: Integration Test 4
Name of the test case: Criteria improvement for employee
Feature to be tested: Employee gets to know the questions on which he is rated low
Sample input: Manager rating
Expected output: Employee can see the questions with low ratings and view their criteria
Remarks: Pass
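The workflow exercised in Tables 9 and 10 — the manager may rate only after the employee has rated, and the employee is then shown the questions his assessor rated low — can be sketched as two small checks. This is an illustrative Python sketch, not the application's PHP code; the function names and data layout are assumptions:

```python
def manager_may_rate(employee_ratings):
    """Manager rating is unlocked only once the employee has rated (Table 9)."""
    return len(employee_ratings) > 0

def low_rated_questions(manager_ratings, threshold=5):
    """Questions the employee should improve on (Table 10):
    those the manager rated below the threshold, in sorted order."""
    return sorted(q for q, r in manager_ratings.items() if r < threshold)
```

For example, with manager ratings {"Q1": 3, "Q2": 5, "Q3": 4} and the threshold of 5 used in the application, the employee would be shown Q1 and Q3.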

combined into one subsystem and the test cases are tabulated in Table 8.

Integration testing of employee rating reflected on manager rating: Here the manager and employee modules are combined into one subsystem and the test cases are tabulated in Table 9.

Integration testing of criteria improvement for employee: Here the manager and employee modules are combined into one subsystem and the test cases are tabulated in Table 10.

System testing: Using the system test plans, all the modules are combined into a single system and tested to uncover errors at the boundaries between modules. At the end of this testing phase, the system was found to be working satisfactorily with regard to the expected output from the system.

System testing of all the modules combined together: Here all four modules are combined into a single system and the test cases are tabulated in Table 11.


Table 11. System testing of user interface.
Sl. No. of test case: System test 1
Name of the test case: User interface
Feature to be tested: Whether the user interface has working forms for input
Sample input: Entering data into the form provided
Expected output: Output according to the form
Actual output: As expected
Remarks: Pass

Table 12. System testing of login page.
Sl. No. of test case: System test 2
Name of the test case: Login
Feature to be tested: Various users log in with their respective username and password
Sample input: Username, password
Expected output: Login successful
Actual output: As expected
Remarks: Pass

Table 13. System testing of compatibility.
Sl. No. of test case: System test 3
Name of the test case: Compatibility
Feature to be tested: Compatibility with all web browsers
Sample input: None
Expected output: Works on Chrome and Firefox
Actual output: As expected
Remarks: Pass

Table 14. System testing of export to Excel.
Sl. No. of test case: System test 4
Name of the test case: Export to Excel button
Feature to be tested: Whether the graph details are exported to Excel
Sample input: None
Expected output: The values are exported to the Excel sheet
Actual output: Compatibility error
Remarks: Failure

System testing of login page: Here all four modules are combined into a single system and the test cases are tabulated in Table 12.

System testing of application compatibility: Here the application is tested on different web browsers such as Google Chrome and Firefox. The test cases are tabulated in the

Table 13.

System testing of export to Excel button: Here the export to Excel button is tested as shown in Table 14. The export to Excel button showed an error because the data variable in export_excel.php was incorrect. The


Table 15. System testing of export to Excel: pass case.
Sl. No. of test case: System test 4
Name of the test case: Export to Excel button
Feature to be tested: Whether the graph details are exported to Excel
Sample input: None
Expected output: The values are exported to the Excel sheet
Actual output: Values are exported
Remarks: Pass
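The export step behind Tables 14 and 15 writes the graph's rating values to an Excel-readable file. The real code lives in export_excel.php; the Python sketch below is an illustrative equivalent that emits CSV (which Excel opens directly), with an assumed data layout of question to (employee rating, manager rating) pairs:

```python
import csv
import io

def export_graph_values(ratings):
    """Render (question, employee rating, manager rating) rows as CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Question", "Employee rating", "Manager rating"])
    for question, (emp, mgr) in ratings.items():
        writer.writerow([question, emp, mgr])
    return buf.getvalue()
```

A malformed data variable — the defect reported in Table 14 — would surface here as rows that Excel cannot parse; correcting it yields the pass case of Table 15.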

Table 16. System testing of edit project details.
Sl. No. of test case: System test 5
Name of the test case: Edit project details page
Feature to be tested: Whether the changed fields are updated in the database
Sample input: Changed project details
Expected output: Changes reflected in the database
Actual output: Changes were not updated
Remarks: Failure

Table 17. System testing of edit project details: pass case.
Sl. No. of test case: System test 5
Name of the test case: Edit project details page
Feature to be tested: Whether the changed fields are updated in the database
Sample input: Changed project details
Expected output: Changes reflected in the database
Actual output: Changes were updated in the database
Remarks: Pass
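The defect behind Tables 16 and 17 — the submit button in content_edit_projects.php was named submit_editor instead of the expected submit_edit — can be reconstructed in miniature. In this hypothetical Python sketch, a handler keyed on the wrong submit field silently skips the database update, reproducing the Table 16 failure; the handler and field names other than submit_edit/submit_editor are stand-ins:

```python
def handle_edit(form, database):
    """Apply edited project details only when the expected submit key is present."""
    if "submit_edit" not in form:  # mirrors a PHP isset($_POST['submit_edit']) guard
        return False               # update silently skipped: the Table 16 failure
    database[form["project_id"]] = form["details"]
    return True
```

A form posting submit_editor leaves the database untouched; renaming the button to submit_edit makes the update go through, matching the pass case of Table 17.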

data variable was corrected and export to Excel was tested again, as shown in Table 15.

System testing of editing project details: The edit project details page is tested to check that the edited details are updated in the database, as shown in Table 16. The content_edit_projects.php file had the submit button named submit_editor instead of the expected submit_edit; after the change was made, the test passed as shown in Table 17.

Results of project assessment

The results of the project assessment application are presented as screenshots in this section. Figure 11 is a screenshot showing the login page for

all the users. Figure 12 shows the homepage for the admin, where company details are listed. Figure 13 shows that the admin gets the company's information on clicking its name. Figure 14 lists the projects of the company when the user clicks on the view projects button. Figure 15 shows the project details when the user clicks on the name of a project, which is a link. Figure 16 shows the webpage where the admin registers a company; the admin reaches this page on clicking the register company tab. Figure 17 shows the webpage where the admin and CEO can see the project-level graph. Figure 18 shows the webpage where all users can see the detail graph of a project, that is, the ratings provided for each question. Figure 19 shows the webpage where all users can see the level graph of a project, that is, the average rating for a particular level. Figure 20 shows that graph details can be seen on clicking the Graph Details button. Figure 21 shows that a graph can be downloaded or printed. Figure 22 shows that the data can be exported to Excel. Figure 23 shows the Excel


Figure 11. Login page.

Figure 12. Homepage of the admin.


Figure 13. Company information displayed.

Figure 14. List of projects in the company.


Figure 15. Project details.

Figure 16. Company registration form for admin.

sheet to which the data are exported. Figure 24 shows that the user can see the question appendix, which lists the questions and the TMMi levels they belong to. Figure 25 shows the admin selecting a company to see its project details. Figure 26 shows the project details and the questions rated below 5 for the employee. Figure 27 shows that on clicking any

question we get its criteria. Figure 28 shows the homepage of the CEO of a company, where projects are listed. Figure 29 shows the webpage where the CEO can add an employee. Figure 30 shows the homepage of the employee, where the projects to be rated are listed. Figure 31 shows the webpage where the employee rates his project and


Figure 17. Project-level graph of company.

Figure 18. Detail graph of project.


Figure 19. Level graph for project.
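The level graph of Figure 19 plots the average rating for each TMMi level. Assuming each question is tagged with the TMMi level it belongs to, as in the question appendix, the averages could be computed as in this illustrative Python sketch (the application itself does this in PHP; the data layout here is an assumption):

```python
from collections import defaultdict

def level_averages(question_levels, ratings):
    """Return {TMMi level: average rating of that level's questions}.

    question_levels maps each question to its TMMi level;
    ratings maps each question to the rating it received."""
    totals = defaultdict(lambda: [0, 0])  # level -> [sum of ratings, count]
    for question, rating in ratings.items():
        level = question_levels[question]
        totals[level][0] += rating
        totals[level][1] += 1
    return {level: s / n for level, (s, n) in totals.items()}
```

For instance, if Q1 and Q2 belong to level 2 and are rated 4 and 2 while Q3 belongs to level 3 and is rated 5, level 2 averages 3.0 and level 3 averages 5.0 — the per-level bars of the graph.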

Figure 20. Showing graph details.


Figure 21. Download or print graph option.

Figure 22. Export to excel.


Figure 23. Excel sheet with graph details.

Figure 24. Question appendix.


Figure 25. List box to see project details.

Figure 26. Project details and questions rated below 5.


Figure 27. Criteria for a question.

Figure 28. Homepage of CEO.

where the employee can view and rate the questions which have been rated low by his assessor, on clicking the improve link. Figure 32 shows the webpage where the employee creates a project on clicking the create projects tab and can edit

projects. Figure 33 shows the webpage where the employee can view the assessment of his project by his manager on clicking the rate projects tab. Figure 34 shows the homepage for the manager. Figure 35 shows the webpage


Figure 29. CEO adding an employee.

Figure 30. Homepage of employee.

Figure 31. Employee rating his project.


Figure 32. Employee creating a project.

Figure 33. Employee viewing the manager’s assessment.

Figure 34. Homepage for manager.


Figure 35. Manager selecting project to rate.

Figure 36. Manager rating the project.

where the manager selects the projects to rate. Figure 36 shows the webpage where the manager rates the particular project and can see the employee's rating for that particular question.

CONCLUSIVE OBSERVATIONS AND ENHANCEMENT FOR FUTURE SCOPE

The TMMi maturity criteria have been demonstrated comprehensively in this paper for enhancement of the testing process in multinational software companies and industrial organizations. The present study shows that the test maturity model integrated (TMMi) technique has a positive impact on product quality, test engineering productivity, and cycle-time effort in software companies

and industrial organizations. As also described herein, the test maturity model integrated (TMMi) is positioned as a model complementary to the capability maturity model integrated (CMMi). As a result, TMMi is more useful as a software testing model for achieving optimum accuracy in product quality improvement, and hence provides sound criteria for project assessment. This application helps in evaluating the given projects in a company against TMMi standards and finding areas that still need improvement. The reports generated in the form of graphs help in understanding and evaluating a project in its testing phase. In addition, a few possibilities for future enhancement are suggested: features to grade companies against the TMMi


standards and giving a company a TMMi level based on its projects' assessments can be added; more than one assessor and more than one employee can be supported for a project; and the application can be deployed on a cloud so that it is available to various organizations and companies.

ACKNOWLEDGEMENTS

The present paper is an integral part of the project work entitled "Project Assessment Using Test Maturity Model Integration (TMMi)", submitted in partial fulfillment for the award of Bachelor of Engineering in Computer Science and Engineering of Visvesvaraya Technological University, Belgaum, India during the academic year 2012-13. The last four authors of this paper would like to express their immense gratitude to Dr. Puttamadappa C., Principal, Sapthagiri College of Engineering, Bangalore, and Dr. C.M. Prashanth, Professor & Head, Department of Computer Science and Engineering, Sapthagiri College of Engineering, Bangalore, for their help and inspiration during the tenure of the B.Tech. program. These authors would also like to express their heartiest gratitude to their guide Prof. (Dr.) V.N. Maurya, School of Science & Technology, The University of Fiji, Saweni, Fiji and Ex-Director, Vision Institute of Technology, Aligarh (G.B. Technical University, Lucknow, India) and co-guides Mrs. Poornima G.J., Assistant Professor, Department of Computer Science & Engineering, Sapthagiri College of Engineering, Bangalore, and Prof. (Er.) Avadhesh Kumar Maurya, Assistant Professor & Head, Department of Electronics & Communication Engineering, Lucknow Institute of Technology (G.B. Technical University, Lucknow) for their valuable advice, sincere guidance and regular assistance throughout the project work.