SUGI 24

Internet, Intranets, and The Web

Web-based Reporting and Tools used in the QA process for the SAS System® Software

Jim McNealy, SAS Institute Inc., Cary, NC
Dawn Amos, SAS Institute Inc., Cary, NC

Abstract

SAS Institute Quality Assurance faced many challenges with the software validation cycle for Version 7 of the SAS System® Software. Just a few of those challenges were metric collection, reporting, dissemination, and thin client technology for lightweight tool development. To meet these challenges QA turned to the emerging Web enablement in the SAS Software. The QA website evolved into a rich set of interactive reports and tools that increased the quantity and quality of information available to the Quality Analyst, Management, and R&D interested parties.

Introduction

The Intranet has become an integral part of the Quality Assurance process for testing SAS Software. Using SAS/IntrNet® software, QA began creating HTML-formatted reports, then moved to dynamic, interactive reports with live data as Web enablement became more tightly integrated into the SAS Software. In turn, the reports have now become lightweight applications that provide drill-down and update abilities that in the past would have required multiple applications. These lightweight applications provide complete use of the data: the user can drill down for more detail in specific areas, and the data can be updated through the report. In the instance of problem tracking this concept is especially significant. The user can now research a problem in more detail from the problem tracking system, and when appropriate, the quality analyst can confirm fixes by adding verification records to the data, completing a common and time-consuming activity.

The move to Web-based architecture frees the user from specific platform requirements. The lightweight application is now available on most of the same platforms as the systems under test. This positions the quality analyst, the target test system, and the lightweight application in an environment that increases productivity.

Adding value and usability to the content of our Intranet has increased the quality and quantity of the information. More divisions used the information for tracking the Version 7 software validation cycle. Research & Development interested parties were asked to critique the site and comment on the information regarding their respective areas. What resulted was content directed at the divisions instead of only at quality analysts and management.

Where We Started

Quality Assurance started with the flat file reports common in all businesses in the early 1990s, with a different physical file for each query at greater levels of detail. As access to a web server became more common, the interest in providing and consolidating more information grew as well. The simplest answer was to provide the graphics found in the QAInfo application on a Web page. These graphics were produced in nightly or weekly batch processes and made easily accessible from the Web page. This stage of Web use was where we stayed for some time. We explored the ability to link graphs in sequence to produce the effect of drill down and data exploration until Web enablement became an emerging feature of the SAS Software.

Taking Small Steps

With the release of SAS/IntrNet® and the Web Formatting Tools® in Release 6.12, the ability to easily change a flat file report into a Web page became a reality. Literally overnight, QA was able to produce HTML-formatted reports to the web server. Initially the static text listing reports were generated as preformatted text in HTML using the Output Formatter. Soon after, we began using the Data Set Formatter to generate Web pages with sophisticated tables. These pages were still static, being generated typically overnight and aging during the day.
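As a concrete illustration of the two formatters, a minimal batch job might have looked something like the sketch below. The macro names (%out2htm, %ds2htm) are the Release 6.12 HTML formatting macros; the library, file paths, and exact parameter spellings shown here are illustrative and should be checked against the shipped documentation.

```sas
/* Illustrative sketch only: qa.results and the file paths are   */
/* hypothetical; parameter spellings should be verified against  */
/* the Release 6.12 Web Formatting Tools documentation.          */

/* Output Formatter: capture procedure output and write it out   */
/* as preformatted text in an HTML page.                         */
%out2htm(capture=on);
proc print data=qa.results;
run;
%out2htm(capture=off,
         htmlfile=/web/docroot/qa/results.html,
         openmode=replace);

/* Data Set Formatter: render the same data set as a true HTML   */
/* table rather than preformatted text.                          */
%ds2htm(data=qa.results,
        htmlfile=/web/docroot/qa/results_table.html,
        openmode=replace);
```

A job of this shape, run overnight, is what produced the static pages described above.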

To provide more real-time data, we began using the Application Dispatcher to query data sets and stream the output directly to the browser. This provided up-to-date information, but still restricted the user to queries built for them by the programmer. As the technology built momentum, data source definitions for common data sets were provided for SAS/IntrNet htmSQL-based query forms. This allowed fast and easy dynamic reporting that let the user create the query.
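An htmSQL page of the kind described above embeds query directives directly in the HTML. The sketch below uses the {query}/{sql}/{eachrow} directive syntax; the data source name, library, and column names are invented for illustration.

```html
<!-- Hypothetical htmSQL page: the datasrc "qadata" and the
     table/column names are illustrative. -->
<html><body>
<h2>Open QA Tasks</h2>
{query datasrc="qadata"}
  {sql}
    select taskid, status, owner
      from qaplan.tasks
     where status = 'OPEN'
  {/sql}
  <table border="1">
  <tr><th>Task</th><th>Status</th><th>Owner</th></tr>
  {eachrow}
  <tr><td>{&taskid}</td><td>{&status}</td><td>{&owner}</td></tr>
  {/eachrow}
  </table>
  {norows}<p>No open tasks found.</p>{/norows}
{/query}
</body></html>
```

Because the SQL lives in the page rather than in a compiled program, a data source definition plus a page like this was enough to let users build their own views of the data.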

To be effective, the reports need to provide the data the user needs to answer a question. In the daily life of the quality analyst the question is often "What do I need to focus on today?" To answer this question, QA produced HTML-formatted reports from QAPlan, our project tracking application built with SAS full-screen products on the UNIX network. The next step was prompted by many of the analysts asking, "Can I have a Web-based report that allows me to update the items I've finished?" To provide this feature, the HTML-formatted reports were combined with links via SAS/IntrNet htmSQL software to update the data set as items were finished. Then the reporting code was added to the growing library of Application Dispatcher software to allow dynamic regeneration of the report. This resulted in the QAPlan report in Figure 1.
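A Dispatcher report program of the kind just described might be sketched as follows. The library, data set, and program names are invented; the _WEBOUT fileref for streaming HTML back to the browser, and form fields arriving as macro variables (&analyst here), are the Application Dispatcher conventions.

```sas
/* Hypothetical sketch of a Dispatcher report program: qaplan.tasks  */
/* and qaweb.taskinfo.sas are invented names. HTML is written to the */
/* reserved _WEBOUT fileref and streamed back through the broker.    */
data _null_;
   set qaplan.tasks end=last;
   where owner = "&analyst";      /* &analyst arrives from the URL */
   file _webout;
   if _n_ = 1 then do;
      put '<html><body><h2>QAPlan Items</h2>';
      put '<table border="1"><tr><th>TASKID</th><th>Status</th></tr>';
   end;
   /* each TASKID becomes a link that drills to the detail form */
   put '<tr><td><a href="/cgi-bin/broker?_service=default'
       '&_program=qaweb.taskinfo.sas&taskid=' taskid +(-1) '">'
       taskid +(-1) '</a></td><td>' status +(-1) '</td></tr>';
   if last then put '</table></body></html>';
run;
```

Regenerating the report is then just a matter of requesting the program again, so the page never ages the way the overnight static reports did.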

Figure 1

The analyst would click on the TASKID of the item to get more information. When the item was complete, the analyst would drill to the update form to enter changes in status, shown in Figure 2.

Figure 2

This model worked well for a number of reasons. The users were familiar with the report from the text-based format. The interface traveled to most platforms where the tasks needed to be performed. The activity was consolidated into a single interface, the browser. It was common to have either a hardcopy or a browser version of the report at hand, and it no longer required a separate UNIX application to update the items.

Replicating Our Success

This simple model of consolidating multiple applications into a single interface was successful for the query and update of QAPlan items. Building on this success, we attempted the same type of conversion for the activity of reporting the verification of fixes in target systems. This was another very common task that often required a report on hand, a SAS application on UNIX to provide detail on the problem, and another such application to update the data. Unlike the QAPlan interface, which required a number of data sets to be updated, the verification required a single data set, making it a good choice for a SAS/IntrNet Application Dispatcher software program.


Figure 3

The report in Figure 3 allows the analyst to see the detailed information from the problem tracking system and to use the verification form to add history records to the data set. A column was added to the report for each item. The column is an HTML link to the verification form. The form provides some customization by the user through list boxes for verification status, comments, and platform. The resulting URL includes the pertinent details for the verification program to produce a proper verification record on the problem, shown in Figure 4.
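The program behind the verification form can be sketched as a short Dispatcher program that appends one history record from the URL parameters. The data set and column names here are invented; &defectid, &vstatus, &platform, and &comments stand for the form selections arriving as macro variables. Note that the PROC SQL VALUES clause takes constants, so the date is supplied as a resolved %SYSFUNC value.

```sas
/* Hypothetical sketch: qadata.verify and its columns are invented. */
/* The form fields carried on the URL surface as macro variables.   */
proc sql;
   insert into qadata.verify
          (defectid, vstatus, platform, vcomment, verdate)
   values ("&defectid", "&vstatus", "&platform", "&comments",
           %sysfunc(today()));
quit;

/* acknowledge the new record back to the browser */
data _null_;
   file _webout;
   put "<html><body>Verification recorded for defect &defectid..</body></html>";
run;
```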

Figure 4

This attempt was also a success, eliminating the need for the analyst to navigate up to three applications. The application provided an interface available on any platform with a browser. Because a growing minority in QA used a PC as their primary workstation, this interface migrated with the user, and it could be used on the same platform where the analyst was verifying code changes.

Broadening the Approach

The next step was to take these lessons and apply them to other reports and data sources.

Figure 5

The Build and Test Status Page

The build and test status page, Figure 5, is used by developers and testers to check the current image and testware status of any host. The page provides this necessary information in a clear and concise format. The folks who handle the ports and builds are responsible for updating their information on the page. Knowing that there is a main repository for this information, we have created a tool (image) that allows a user to query the data outside of a Web browser, and a tool (image_updater) that simplifies page updates. Most host groups have host-specific build and test status pages, which are linked off the main build and test status page. Without a single source of information regarding build and test status, developers and testers would have to hunt in various locations to determine what image and what version of the testware are available on each host.

Defects Web Client

The main problem tracking system used by Research & Development received another integration milestone during the Version 7 development cycle. The Defects Web client was announced. This Web-based application uses SAS/IntrNet software. The application is designed to provide the ability to add and update problem reports through the Web browser.


Figure 6

The Defects Web client, in Figure 6, is successful for the same reasons that the previous Web-based applications succeeded. Using a familiar user interface significantly reduced the learning curve. The analysts could have the Web-based application available on any platform with a browser.

Using the Defects Web Client with Other Reports

Including links to the Defects Web client in existing defect reports helped expedite the review and approval process. Toward the end of the Version 7 validation cycle, the Defects Web client proved an expedient method of dealing with the class of problems still to be reviewed. The link in the DEFECTID field of defect reports was changed to bring up the Defects Web client. This produced a lightweight application for assessing and approving those changes in the SAS Software that were truly necessary.

Managing Software Coding Projects Using the Problem Tracking System

Figure 7

In the past, new features in the SAS System were typically documented by each group writing the feature, scattering the information across multiple locations and in inconsistent formats. Using the new Defects Web Client, we began tracking these consistently at the highest level in the problem tracking system, one DEFECTID per project; the problem description would supply goal date information and a pointer to the location of additional details, as seen in Figure 7 above.

Our methods for producing reports from the problem tracking system then showed the way to present the software projects to management in a versatile fashion. Using SAS/IntrNet htmSQL and the Data Set Formatter, we provide an initial screen that sorts all software projects by product. Additionally, using frames we also offer list boxes allowing the user to limit the display by product, as well as specify alternate sort orders, such as by goal date or platform. The contents of the list boxes are created dynamically using a separate SQL query, so that the list of products offered is always limited to those that actually have software projects in the works. Finally, we provide the drill-down effect by turning the DEFECTID field into a link to the Defects Web Client page that offers the details of the project. In turn, any URL specified in the detail section of the Defects Web Client page would be drillable by the user, allowing navigation to the complete project documentation. Since this front end could be linked into any page on the Intranet, for the first time we had a central point for managing the content of the next version of the software, including evaluating which projects were feasible to include in the release given the development and testing schedule. The major advantage over the past method of managing this information is that the pointers to each level of detail tend to remain live and updated, rather than aging and becoming outdated or misplaced altogether.
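The dynamically built list box described above can be sketched as an htmSQL fragment in which a separate query populates the form control. The data source, library, and program names are invented for illustration.

```html
<!-- Hypothetical htmSQL fragment: the datasrc, library, and
     Dispatcher program names are illustrative. -->
<form action="/cgi-bin/broker" method="get">
  <input type="hidden" name="_service" value="default">
  <input type="hidden" name="_program" value="qaweb.projlist.sas">
  <select name="product">
  {query datasrc="defects"}
    {sql}
      select distinct product
        from track.projects
       order by product
    {/sql}
    {eachrow}
    <option value="{&product}">{&product}</option>
    {/eachrow}
  {/query}
  </select>
  <input type="submit" value="Show projects">
</form>
```

Because the option list is the result of a live query, a product appears in the list box only while it actually has projects in the works, with no manual maintenance of the form.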

Tracking Daily Code Changes

The problem tracking system obviously is also used for identifying defects and specifying their fixes. Developers use the application to send an email signal to quality analysts indicating that a fix is available; the quality analysts in turn determine the required retesting effort and, via the application, send an email signal indicating they are ready to accept and test the fix. The process can be susceptible to network or email delivery glitches, and is disrupted if one of the signaled personnel happens to be out of the office for an extended period.

To give management a way to remain current with requests for FIXIDs, we developed a live query, again using the Defects Web Client along with the SAS/IntrNet Application Dispatcher and SAS/IntrNet htmSQL software.

Figure 8

Figure 8 shows the outstanding requests in real time. Again the link to the Defects Web Client is built into the page, allowing drill down to the details of the code fix and, additionally, to the link allowing the manager or analyst to add the approval record over the Web. These interfaces proved to be quite a bit faster and easier to use when the user's browser was already running but the actual problem tracking system was not.

QATrack Web Reports

The QATrack tool is used to analyze and resolve batch testing performed on all releases, products, and systems in the QA cycle. The underlying data are stored in SAS Data Sets that are ordinarily accessed via a SAS/AF® FRAME-based application. During the testing cycle associated with the Version 7 release, QA decided to leverage the new ODS and SAS/IntrNet functionality to produce a Web-based system that accessed and updated the same data.

Users of the new system must first define subsetting parameters to limit the batch testing data to the products of interest. For example, a QA analyst might only care to see the results for the SAS BASE® software across all systems in Version 7. These parameters are entered via a SAS/AF® menu and the SAS ODS-based Web report is generated. Once the parameters have been established, this report will be refreshed automatically on a nightly basis via a background job.
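The nightly refresh job can be sketched as an ODS HTML batch program. The saved parameters from the SAS/AF menu are represented here as macro variables; the library, path, and variable names are invented for illustration.

```sas
/* Hypothetical sketch of the nightly refresh job. In practice    */
/* the macro variables would be read from the parameters saved    */
/* by the SAS/AF menu; qatrack.results is an invented data set.   */
%let product = base;
%let release = v7;

ods html body="/web/docroot/qatrack/&product._&release..html";
title "Batch Test Status: &product (&release)";
proc freq data=qatrack.results;
   where product = "&product" and release = "&release";
   tables host*status / norow nocol nopercent;
run;
ods html close;
```

Each user's parameter set drives one such job, so the Web report is rebuilt against the latest batch results every night without any analyst intervention.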

Figure 9

The top-level report produced is a vertical bar chart showing frequency counts for the various test result statuses (clean runs, abends, bad return codes, differences from known benchmarks, etc.). An example of such a chart is in Figure 9. The user might also want to view the same results while ignoring those tests that ran without problems; Figure 10 shows such a chart. Multiple products could also be displayed on the same report, thus supplying a snapshot of the testing status for large portions of the SAS Software.
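An interactive bar chart of this kind can be sketched with PROC GCHART: a link is built per row in the data and attached to the bars through the HTML= option when the chart is rendered via ODS HTML. The data set and Dispatcher program names here are invented.

```sas
/* Hypothetical sketch: qatrack.summary (one row per status with a */
/* COUNT variable) and qaweb.testlist.sas are invented names. The  */
/* DRILL variable holds the link attached to each bar via HTML=.   */
data chart;
   set qatrack.summary;
   length drill $200;
   drill = 'href="/cgi-bin/broker?_service=default'
        || '&_program=qaweb.testlist.sas&status=' || trim(status) || '"';
run;

ods html body='/web/docroot/qatrack/status.html';
proc gchart data=chart;
   vbar status / sumvar=count type=sum html=drill;
run;
quit;
ods html close;
```

Clicking a bar then requests the Dispatcher program with the status value on the URL, producing the drill-down table described below.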

Internet, Intranets, and The Web

Figure 10

All vertical bar charts are interactive, allowing the user to drill down to a greater level of detail. Clicking on the bars will take the user to a SAS ODS table showing the specific tests run, along with other pertinent information. Figure 11 shows a portion of such a table.

Figure 11

The user can drill one level further by clicking on any of the cells showing test status. This brings up the SAS/IntrNet generated form in Figure 12, showing details like when and where the test was run, what output was produced, and which SAS System image was used. Through SAS/IntrNet htmSQL software queries, the form lets the user update key values in the data. Previously, such functionality was restricted to the QATrack SAS/AF software based interface.
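The update half of the form can be sketched as an htmSQL update section that applies the submitted field values to the data. The directive syntax here is given from memory, and the data source, table, and field names are invented; the shipped htmSQL documentation should be consulted for the exact form.

```html
<!-- Hypothetical htmSQL update section: datasrc "qatrack" and the
     results table with its testid/status columns are invented. -->
{update datasrc="qatrack"}
  {sql}
    update results
       set status = '{&newstatus}'
     where testid = '{&testid}'
  {/sql}
{/update}
<p>Test {&testid} has been updated.</p>
```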

Figure 12

The Web-based approach to the batch testing data allows users to summarize the latest test runs from a variety of hosts on the network, not just the one on which the SAS/AF based application is hosted. Furthermore, for quick editing tasks, the Web-based approach has frequently proven more convenient, since users typically have Web browsers running constantly, eliminating the need to launch a separate SAS/AF software based application.

Conclusions

Selective application of Web-based technology has produced gains in analyst productivity and information access for Quality Assurance. Beginning by converting static reports to Web documents, we first improved the visual presentation of the information, then provided access to increasingly live, up-to-the-minute data, and finally allowed users to create their own queries of the live data, as well as offering point-and-click tools for tailoring the presentation for their own purposes. We learned that managing information over the Intranet in this way short-circuited the tendency of static documents to become obsolete or misplaced. We also obtained unprecedented productivity gains by using the browser as the swift focal point of information delivery, with the power of SAS Software managing the information behind it, especially since the browser is so universally available and typically more streamlined and responsive than multiple separate applications. Finally, we discovered that these techniques unified the organization by providing many separate departments a centralized view of the development process, which in turn encouraged enterprise-wide teamwork in shipping quality software.

Acknowledgements

The authors would like to express their appreciation for the details and application images provided by Meg Arnette, Ed Overton, Mark Schneider, and Jeff Simpson, all of the Quality Assurance Information Systems group.

SAS, SAS/AF, and SAS/IntrNet are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration.

Other brand and product names are registered trademarks or trademarks of their respective companies.
