Produced by Araxis Merge on 7/13/2017 3:41:28 PM Eastern Daylight Time. See www.araxis.com for information about Merge.
| # | Location | File | Last Modified |
|---|---|---|---|
| 1 | Genisis_v3.zip\Release 3_Docs | Genisis2_Release 3_Master_Test_Plan_06162017.docx | Wed Jun 28 20:38:26 2017 UTC |
| 2 | Genisis_v3.zip\Release 3_Docs | Genisis2_Release 3_Master_Test_Plan_06162017.docx | Wed Jul 12 17:31:56 2017 UTC |
Between Files 1 and 2:

| Description | Text Blocks | Lines |
|---|---|---|
| Unchanged | 12 | 2072 |
| Changed | 11 | 22 |
| Inserted | 0 | 0 |
| Removed | 0 | 0 |
| Option | Setting |
|---|---|
| Whitespace | |
| Character case | Differences in character case are significant |
| Line endings | Differences in line endings (CR and LF characters) are ignored |
| CR/LF characters | Not shown in the comparison detail |
No regular expressions were active.
| 1 | Master Test Plan | |
| 2 | Genomic Information System for Integrated Science 2 | |
| 3 | (Genisis2) Technical Services | |
| 4 | Release 3 | |
| 5 | ||
| 6 | June 2017 | |
| 7 | Document Version 2.0 | |
| 8 | Department of Veterans Affairs | |
| 9 | ||
| 10 | ||
| 11 | ||
| 12 | ||
| 13 | ||
| 14 | ||
| 15 | Document Revision History | |
| 16 | Date | |
| 17 | Revision | |
| 18 | Description | |
| 19 | Author | |
| 20 | 11/23/2016 | |
| 21 | Draft | |
| 22 | Initial draft of Master Test Plan | |
| 23 | Booz Allen Hamilton | |
| 24 | 11/25/2016 | |
| 25 | 1.0 | |
| 26 | Updates based on review by RTA | |
| 27 | Booz Allen Hamilton | |
| 28 | 11/27/2016 | |
| 29 | 1.1 | |
| 30 | Updates based on review by NK | |
| 31 | Booz Allen Hamilton | |
| 32 | 11/30/2016 | |
| 33 | 1.2 | |
| 34 | Updates to Section 8: Test Environments, ESE Testing and removal of footer from title page. | |
| 35 | Booz Allen Hamilton | |
| 36 | 01/25/2017 | |
| 37 | 1.3 | |
| 38 | Updates to Section 3: Test Approach and deletion of Section 3.7: Initial Operating Capability Evaluation | |
| 39 | Booz Allen Hamilton | |
| 40 | 04/10/2017 | |
| 41 | 1.4 Draft | |
| 42 | Updated the following sections for Release 2: | |
| 43 | 2.1 Test Inclusions | |
| 44 | 3.0 Test Approach (Release 2) | |
| 45 | 8.3 AITC (Test Environment Configuration) | |
| 46 | Draft version until Release 2 is approved by the Business Owners | |
| 47 | Booz Allen Hamilton | |
| 48 | 04/17/2017 | |
| 49 | 1.4 Draft | |
| 50 | Final updates for Release 2 that include updates to the following sections: | |
| 51 | 2.1 Test Inclusions | |
| 52 | 3.0 Test Approach (Release 2) | |
| 53 | 8.3 AITC (Test Environment Configuration) | |
| 54 | Booz Allen Hamilton | |
| 55 | 05/12/2017 | |
| 56 | 2.0 | |
| 57 | Updated Release 2 Functionality (pg. 9) | |
| 58 | Booz Allen Hamilton | |
| 59 | 06/16/2017 | |
| 60 | 3.0 | |
| 61 | Updated the following sections for Release 3 functionality: | |
| 62 | 2.1 Overview of Test Inclusions | |
| 63 | 3.0 Test Approach (Release 3 Features and Functionality) | |
| 64 | 4.2 Enterprise Testing | |
| 65 | 7.0 Test Schedule | |
| 66 | 8. Staffing and Training Needs | |
| 67 | 10. Risks and Constraints | |
| 68 | ||
| 69 | ||
| 70 | Booz Allen Hamilton | |
| 71 | ||
| 72 | ||
| 73 | ||
| 74 | Table of Contents | |
| 75 | 1. Introduction 1 | |
| 76 | 1.1. Purpose 1 | |
| 77 | 1.2. Test Objectives 2 | |
| 78 | 1.3. Roles and Responsibilities 3 | |
| 79 | 1.4. Processes and References 5 | |
| 80 | 2. Items to Be Tested 6 | |
| 81 | 2.1. Overview of Test Inclusions 6 | |
| 82 | 2.2. Overview of Test Exclusions 7 | |
| 83 | 3. Test Approach 7 | |
| 84 | 3.1. Product Component Test 11 | |
| 85 | 3.2. Component Integration Test 11 | |
| 86 | 3.3. System Tests 12 | |
| 87 | 3.4. User Acceptance Testing 12 | |
| 88 | 3.5. Enterprise System Engineering Testing 12 | |
| 89 | 3.6. Performance Testing 12 | |
| 90 | 4. Testing Techniques 12 | |
| 91 | 4.1. Risk-based Testing 13 | |
| 92 | 4.2. Enterprise Testing 14 | |
| 93 | 4.2.1. Security Testing 14 | |
| 94 | 4.2.2. Privacy Testing 14 | |
| 95 | 4.2.3. Section 508 Compliance Testing 14 | |
| 96 | 4.2.4. Multi-Divisional Testing 15 | |
| 97 | 4.3. Performance and Capacity Testing 15 | |
| 98 | 4.4. Test Types 15 | |
| 99 | 4.5. Productivity and Support Tools 17 | |
| 100 | 5. Test Criteria 17 | |
| 101 | 5.1. Process Reviews 17 | |
| 102 | 5.2. Pass/Fail Criteria 18 | |
| 103 | 5.3. Suspension and Resumption Criteria 18 | |
| 104 | 6. Test Deliverables 18 | |
| 105 | 7. Test Schedule 20 | |
| 106 | 8. Test Environments 21 | |
| 107 | 8.1. Test Environment Configurations 22 | |
| 108 | 8.2. AWS Test Environment Configuration 22 | |
| 109 | 8.3. VA (AITC) Test Environment Configurations 23 | |
| 110 | 8.4. Base Software Elements in the Test Environments 24 | |
| 111 | 9. Staffing and Training Needs 24 | |
| 112 | 10. Risks and Constraints 25 | |
| 113 | 11. Test Metrics 26 | |
| 114 | Attachment A: Approval Signatures 27 | |
| 115 | Appendix A: Test Type Definitions 28 | |
| 116 | ||
| 117 | Introduction | |
| 118 | The Department of Veterans Affairs (VA) has established leadership in genomic medicine through the undertaking of a ground-breaking program called the Million Veteran Program (MVP). Launched in 2011, MVP invites users of the VA healthcare system nationwide to participate in a longitudinal study with the aim of better understanding the interrelation of genetic characteristics, behaviors and environmental factors, and Veterans’ health. Today, with over 560,000 participants and recruiting at 50 VA sites nationally, MVP is the largest genomic research database in the world. The MVP data consists of blood samples from consenting Veterans that are used to generate genomic data, data from questionnaires, and electronic health record data. This resource is made available to VA researchers and VA-approved affiliates to pursue genomic discoveries and validation studies that can lead to personalized healthcare for Veterans. Ultimately, over the long term, validated scientific findings will be returned to the Veteran and incorporated into their medical record to fulfill the potential of personalized medicine – delivery of optimal interventions to patients based on their biological characteristics. | |
| 119 | The primary component of the project is the Genomic Information System for Integrated Science 2 (Genisis2) application. Genisis 1.0, implemented over a four-year period (2011-2015), features a series of modular applications to facilitate recruitment and enrollment of MVP participants, automating most study-related logistics including study enrollment, clinical study data capture, consent, blood sample tracking, and genomic data storage. Genisis 1.0 also provides the secure analytical infrastructure necessary to conduct robust genomic and bioinformatics-related data management and data analysis. This remotely accessible analysis environment features a high-performance computing cluster with significant storage capacity and tools for scientific analysis of combined genotypic and phenotypic data. | |
| 120 | Genisis2 will provide additional functionality to the Genisis 1.0 platform by automating data processing, data request transactions, and data request tracking for integrating VA Informatics and Computing Infrastructure (VINCI) clinical data into Genisis2, as well as enhanced systems administration capabilities. | |
| 121 | The Master Test Plan is a living document that will be re-evaluated for each release of the Genisis2 application to ensure all aspects are adequately tested and implemented successfully. This document describes the test scope, environment, approach, and test resources involved in testing Genisis2. | |
| 122 | Purpose | |
| 123 | The purpose of this document is to provide VA with a plan for the planning, documentation, and execution of testing for the Genisis2 releases in compliance with VA’s Office of Information and Technology (OI&T) Veteran-focused Integration Process (VIP). VIP is a Lean-Agile framework that enables frequent releases of functional capabilities that serve the interest of Veterans through the efficient streamlining of activities that occur within the enterprise. | |
| 124 | The Master Test Plan creates a functional roadmap to test each software build and planned release throughout the period of performance. It outlines the approach and tools that the Genisis2 Test Team will use to plan and test Genisis2, ensuring the system meets the functional, operational and compliance requirements as identified by the system and OI&T Compliance Epics. | |
| 125 | The components of this Master Test Plan, along with the test deliverables, are maintained within Rational Quality Manager (RQM) in compliance with the VA’s VIP Methodology. This document will mirror the data elements housed within the Genisis Rational Instance and serve as a reference for those who do not have access. | |
| 126 | This Master Test Plan will also support the following: | |
| 127 | Provide a central artifact to govern the planning and control of the test effort for Genisis2. | |
| 128 | Define the general testing approach that Genisis2 will follow. | |
| 129 | Demonstrate to Genisis2 stakeholders that various aspects governing the testing effort have been adequately considered; and where appropriate, have those stakeholders approve the plan. | |
| 130 | This document will also support the following objectives: | |
| 131 | Identify sources of the functional requirements and user stories utilized for testing. | |
| 132 | Identify the assumptions, risks, and constraints that affect this testing process. | |
| 133 | Outline the testing schedule. | |
| 134 | Describe the testing strategy, types of tests, activities, and tools. | |
| 135 | Include the roles and responsibilities of the resources participating in the testing. | |
| 136 | Document the pass/fail performance of the software design parameters. | |
| 137 | Test Objectives | |
| 138 | The Master Test Plan supports the following objectives: | |
| 139 | Outline the framework of testing at the program level and create a central artifact to govern the planning and control of the Genisis2 testing effort. | |
| 140 | Define stakeholders for testing across the project lifecycle. | |
| 141 | Define the processes, sequence, and test schedule with regard to high-level milestones and allocation of resources for management of the test phases across the project lifecycle that support: | |
| 142 | Providing a comprehensive and consistent view of the test activities, work products, resources; and execution, environment, and integration efforts. | |
| 143 | Identifying required test artifacts for each of the test phases. | |
| 144 | Generating the test artifacts for the test phases. | |
| 145 | Identifying and describing overarching processes critical to test execution, including entrance and exit criteria for test phases and milestones, and quality assurance. | |
| 146 | Identifying and generating required reports for project status of the test phases. | |
| 147 | Defining metrics for gauging progress in test execution and completion of verification of product development, product integration, product release, and site or field deployment of products. | |
| 148 | Identifying risks inherent in the test execution and description of mitigation plans. | |
| 149 | Identify, create, maintain, and control the test environments. | |
| 150 | Execute 100% of the test cases during Integration, System, and Functionality Testing. | |
| 151 | Identify defect reporting processes. | |
| 152 | Identify and support additional needs for the project to meet VIP requirements or other OI&T needs (e.g., support for OI&T Compliance). | |
| 153 | Roles and Responsibilities | |
| 154 | Table 1 lists the key roles and their responsibilities for the Master Test Plan. | |
| 155 | Table 1: Roles and Descriptions | |
| 156 | Role | |
| 157 | Description of Responsibilities | |
| 158 | Program Project Manager | |
| 159 | Overall responsibility for the successful planning and execution of a project. | |
| 160 | Development Project Manager | |
| 161 | Provide general program/project coordination and oversight. | |
| 162 | Provide sign-offs or identify resources for signing off on deliverables. | |
| 163 | Review status of development and test efforts. | |
| 164 | Provide input to the Master Test Plan. | |
| 165 | Development Scrum Master | |
| 166 | Coordinates all development activities related to a project. | |
| 167 | Review status of development and test efforts. | |
| 168 | Provide input to the Master Test Plan. | |
| 169 | Business Functional Analyst | |
| 170 | Create Requirements and Requirement Collections within RM for each Release. | |
| 171 | Create Stories within Rational Team Concert (RTC) that contain the functional design. | |
| 172 | Review, define, and evaluate the solution’s requirements, change requests, and new development requests. | |
| 173 | Work with the Scrum Master to update, groom, and maintain the prioritized Product Backlog. | |
| 174 | Develop, enhance, and manage acceptance criteria for product features. | |
| 175 | Evaluate the complexity and scope of application improvement and enhancement requests. | |
| 176 | System Administrator (DEV, TEST, and SQA) | |
| 177 | Install and maintain the hardware, system and application for the software development and test environments. | |
| 178 | Configure user accounts for the appropriate testing environment. | |
| 179 | Provide the build for the various environments. | |
| 180 | Developer | |
| 181 | Construct and maintain the technology and development code within RTC. | |
| 182 | Construct and maintain middleware technology. | |
| 183 | Provide support during the different phases of testing. | |
| 184 | Ensure that the solution is in compliance with the technical framework and architecture. | |
| 185 | Test the development environment to verify proper configuration and operation. | |
| 186 | Review and provide input to the Master Test Plan. | |
| 187 | Execute unit testing and functional testing (as required), test results, and issue resolution. | |
| 188 | Ensure that the solution meets the defined requirements. | |
| 189 | Review and provide input to the Test Evaluation Summary and performance testing. | |
| 190 | Test Lead | |
| 191 | Develop and maintain the Master Test Plan. | |
| 192 | Develop and maintain the data elements within RQM. | |
| 193 | Manage the formal test case design, formal system testing, and performance testing. | |
| 194 | Identify and report testing related risks and recommend risk mitigation strategies. | |
| 195 | Report and track defects/issues as they are discovered. | |
| 196 | Perform retest on issues resolved in individual components, as required. | |
| 197 | Provide input into the Test Evaluation Summary. | |
| 198 | Enter data elements into RQM and update supporting test documentation. | |
| 199 | Assist with the creation of test cases and test criteria for each applicable testing component. | |
| 200 | Provide support to resources for System Integration and User Acceptance Testing (UAT). | |
| 201 | Participate in test related activities. | |
| 202 | Perform sign-off of verification checklists and transition documents. | |
| 203 | Ensure Test Tools are up-to-date and accurate with the latest increment/sprint information, if applicable. | |
| 204 | Test Analysts | |
| 205 | Create and execute test cases and scripts within RQM. | |
| 206 | Prepare test data as needed. | |
| 207 | Perform end-to-end testing of all test cases executed to ensure complete system/integration testing. | |
| 208 | Report all defects/issues in RTC as they are discovered. | |
| 209 | Provide status of testing to Test Lead. | |
| 210 | Continue to increase knowledge of business rules and requirements. | |
| 211 | Provide input to the Master Test Plan. | |
| 212 | Provide input to the Test Evaluation Summary Report. | |
| 213 | Facilitate and provide guidance during UAT and/or User Functional Testing (UFT). | |
| 214 | Stakeholders / Users | |
| 215 | Provide support for requirements and business workflow in which they may affect or be affected by the outcome. This includes pilot site testers and users. | |
| 216 | Participate in UAT/UFT. | |
| 217 | Validate/approve requirements. | |
| 218 | Identify data sourcing for testing purposes. | |
| 219 | Processes and References | |
| 220 | The processes that guide the implementation of this Master Test Plan are: | |
| 221 | Agile Process Framework (Build Planning) | |
| 222 | VIP Methodology | |
| 223 | Test Planning | |
| 224 | Test Preparation | |
| 225 | Product Build | |
| 226 | Software Quality Assurance (SQA) Testing | |
| 227 | The references that support the implementation of this Master Test Plan are: | |
| 228 | ProPath | |
| 229 | Section 508 Office Web Page | |
| 230 | Privacy Impact Assessment - Privacy Service | |
| 231 | VA Release Readiness Office | |
| 232 | VIP Methodology | |
| 233 | ProPath Templates | |
| 234 | Rational Tools Team Site | |
| 235 | The references that support the development of this Master Test Plan are: | |
| 236 | System Design Document (SDD): Version 6, Dated: 6/2017 | |
| 237 | Requirements Traceability Matrix (RTM), Dated: 6/2017 | |
| 238 | Risk Log: Version 2, Dated: 6/2017 | |
| 239 | Genisis2 Build Plan: Version 1.1, Dated: 6/2017 | |
| 240 | Items to Be Tested | |
| 241 | This section identifies the functions and features that are within scope of the Genisis2 test effort. | |
| 242 | Overview of Test Inclusions | |
| 243 | The components and features, and combinations of components and features, that will be tested in Genisis2 Releases 1-4 have been identified and are listed below. Please note that names of the roles have changed for Release 2: Researcher is now Requestor, and Data Manager is now called Data Destination Manager. | |
| 244 | Role Based Testing | |
| 245 | Requestor - Create, Modify, Track, Copy and Cancel Request | |
| 246 | Data Destination Manager - Create, Modify, Track, Copy, Cancel, Approve Request | |
| 247 | Data Source Manager – Accept or Deny Data Requests and Deliver Data Results | |
| 248 | Genisis2 System Administrator – User Management, Create, Modify, Track, Copy, Cancel, Tracking and Management Reports (Release 4) | |
| 249 | E-mail Notification | |
| 250 | Backend Data Operations (Data-File Assessments, COPY Table function, Secure File Transfer) | |
| 251 | OI&T Compliance (Security, Design, Engineering & Architecture and Section 508) | |
| 252 | Further details on the Genisis2 functionality and when it will be tested are in Tables 2-5. This document will be updated prior to each release to include more details on the features and functions that will be tested. Test cases will be written to thoroughly test each role and category for the new functionality that will be added within Genisis2. All data elements will also be added to RQM. | |
| 253 | For information on project functionality, refer to the Genisis BRD, RSD, and SDD located on the Genisis SharePoint Site and within the Genisis Rational Instance. | |
| 254 | Overview of Test Exclusions | |
| 255 | The following components and features, and combinations thereof, will not be tested: | |
| 256 | Genisis 1.0 functions | |
| 257 | Network Capacity Testing | |
| 258 | Requirements that fall outside the Genisis2 project scope | |
| 259 | Regression testing of existing applications/utilities | |
| 260 | Honest Broker | |
| 261 | VINCI | |
| 262 | ||
| 263 | Test Approach | |
| 264 | The Agile Methodology is being followed by the Genisis2 Development and Test teams. Each release will be comprised of agile sprints, each having a specific focus and objective in driving toward the release. This section addresses the agile approach, compliance with VIP, and how the Rational Tool Suite supports the full lifecycle effort for Genisis2. In addition, this section explains how the team will approach sprint planning, test planning and execution. RTC will be utilized by the Development Team for task management, and both the Development and Test teams will utilize RTC for defect tracking. | |
| 265 | Note: The test approach may be revised based on requirement clarifications and potential modifications to the scope as it may expand to include additional items in the backlog. | |
| 266 | Genisis2 Iterations – Agile Approach | |
| 267 | The Test Team will receive the build immediately after development is complete and unit tested. The application updates will be installed in the test environment for test execution of the new functionality. Depending on the build integrity, there may be interim builds to resolve configuration issues, defects, or if a build is deemed to be unacceptable for testing. Figure 1 illustrates our iteration approach: | |
| 268 | ||
| 269 | Figure 1: Genisis2 Iterative Agile Testing Approach | |
| 270 | Rational Tools Suite | |
| 271 | The initial set of requirements contained in the RSD was decomposed into detailed requirements and user stories and entered into the Genisis2 Rational Requirements DOORS-NG or RM Repository. The user stories entered in RM were then manually created within RTC. Each user story created within RTC was linked to all of the associated requirements housed in RM, which created the original link between RM and RTC. Each user story and associated requirement(s) were assigned to a Release Plan within RTC, and each Release Plan was decomposed into sprints to support the delineation of tasks and work items. | |
| 272 | The contents of the Release Plan, along with the SDD, serve as the basis for this Master Test Plan. Subsequently, this Master Test Plan is the basis for the test planning, preparation, and execution data elements that are included within the Rational Quality Manager (RQM) Repository. RQM will be the repository of all test artifacts. Each test case within RQM will be linked to the applicable requirement within RM. This final link, in addition to the links between RM and RTC, demonstrates and supports full traceability throughout the project lifecycle. This approach applies to all general system requirements, as well as the OI&T (DE&A, Section 508, Security, and Release Compliance) requirements. | |
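The RM-to-RTC-to-RQM linkage described above can be pictured as a simple chain of linked artifacts. The following is a minimal illustrative sketch only: the class names, the `untested_requirements` helper, and the IDs are invented here for clarity, since the actual links live inside the Rational tool suite rather than in project code.

```python
# Illustrative model of the traceability chain: RM requirements -> RTC user
# stories -> RQM test cases. All names below are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """A requirement housed in RM (DOORS-NG)."""
    req_id: str

@dataclass
class UserStory:
    """A story created in RTC, linked back to its RM requirements."""
    story_id: str
    requirements: list = field(default_factory=list)

@dataclass
class TestCase:
    """A test case maintained in RQM, linked to RM requirements."""
    case_id: str
    requirements: list = field(default_factory=list)

def untested_requirements(requirements, test_cases):
    """Return requirement IDs with no linked test case (a traceability gap)."""
    covered = {req_id for tc in test_cases for req_id in tc.requirements}
    return [r.req_id for r in requirements if r.req_id not in covered]

reqs = [Requirement("REQ-1"), Requirement("REQ-2")]
cases = [TestCase("TC-1", requirements=["REQ-1"])]
print(untested_requirements(reqs, cases))  # -> ['REQ-2']
```

A check like this mirrors what the RM-to-RQM links are meant to guarantee: every requirement traces forward to at least one test case.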
| 273 | Genisis2 Releases 1-4 | |
| 274 | In accordance with VIP, the Genisis2 Team will build, test, and deliver a release within 90 days. Most releases are comprised of 13-week cycles that include four 2-week development sprints including testing, one 2-week stabilization sprint with SQA, one 2-week UAT sprint, and a 1-week release readiness sprint. During the release readiness sprint, and prior to VIP Critical Decision 2 (CD2), the Release Agent will review all required documentation and tasks to ensure and approve the readiness of the release. The review will also include the areas of OI&T Compliance (DE&A, Section 508, and Release Compliance). All test artifacts will be loaded and included within RQM and supported by this Master Test Plan. | |
| 275 | The breakdown of features/functionality that will be tested and delivered in each Genisis2 release is identified in Tables 2-5. These tables will be updated at the start of each release based on any modifications that may occur during backlog grooming. | |
| 276 | Table 2: Release 1 Features and Functionality | |
| 277 | Role / Work stream | |
| 278 | Features / Functionality | |
| 279 | Researcher (UI) | |
| 280 | Create Data Request | |
| 281 | Modify Data Request | |
| 282 | Track Data Request | |
| 283 | E-mail notification for status changes | |
| 284 | E-mail notification Genisis2 Landing zone (manual) | |
| 285 | Data Manager (UI) | |
| 286 | Create Data Request | |
| 287 | Modify Data Request | |
| 288 | Approve Request | |
| 289 | Track Data Request | |
| 290 | E-mail notification for status changes (automatic) | |
| 291 | E-mail notification VINCI Landing zone (manual) | |
| 292 | Back end | |
| 293 | Data Operations Framework | |
| 294 | Basic Data File Assessment (counts for number of rows and missing data) | |
| 295 | Security | |
| 296 | Basic login (no PIV) | |
| 297 | Role based access | |
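The "Basic Data File Assessment (counts for number of rows and missing data)" in Table 2 could, for example, be realized along the lines of the sketch below. This is a hedged illustration under an assumed CSV input format, not the project's actual implementation; the function name `assess_data_file` is hypothetical.

```python
# Sketch of a basic data-file assessment: count data rows and empty
# (missing) cells per column in a delimited file. Illustrative only.
import csv
import io

def assess_data_file(handle):
    """Return (row_count, missing_per_column) for a CSV stream with a header."""
    reader = csv.DictReader(handle)
    row_count = 0
    missing = {name: 0 for name in reader.fieldnames}
    for row in reader:
        row_count += 1
        for name, value in row.items():
            if value is None or value.strip() == "":
                missing[name] += 1
    return row_count, missing

# Small in-memory example (sample data is invented):
sample = io.StringIO("id,specimen,result\n1,A100,\n2,,pos\n3,C300,neg\n")
rows, gaps = assess_data_file(sample)
print(rows, gaps)  # -> 3 {'id': 0, 'specimen': 1, 'result': 1}
```

An assessment of this shape gives the testers a quick sanity check on a delivered data file before deeper verification begins.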
| 298 | Table 3: Release 2 Features and Functionality | |
| 299 | Role / Work stream | |
| 300 | Features / Functionality | |
| 301 | Requestor (UI) | |
| 302 | Regression Create/Modify/Track Data Request | |
| 303 | Automatic e-mail notification for status changes | |
| 304 | Add Comments | |
| 305 | Data Destination Manager (UI) | |
| 306 | Regression Create, Modify, Track, Approve, Deny, Return Data Request | |
| 307 | Automatic e-mail notification for status changes | |
| 308 | Add Comments | |
| 309 | History (UI) | |
| 310 | Capture high level and detailed history information for each request. | |
| 311 | Comments (UI) | |
| 312 | Provide the ability to add comments throughout the lifecycle of a data request | |
| 313 | Back end | |
| 314 | Secure transfer of data from VINCI | |
| 315 | Data File Assessment (file checking) | |
| 316 | COPY Table Function (VINCI to Genisis) | |
| 317 | Security | |
| 318 | Integration with VA’s Active Directory | |
| 319 | Role-based access with two-factor authentication | |
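As a rough illustration of the "Role-based access with two-factor authentication" tested from Release 2 onward, an access check might take the following shape. Only the role names are taken from this plan; the `User` type, the `authorize()` helper, the action names, and the permission sets are assumptions made for the sketch.

```python
# Illustrative role-based access check gated on a completed second factor.
# Role names come from the plan; everything else is hypothetical.
from dataclasses import dataclass

ROLE_PERMISSIONS = {
    "Requestor": {"create", "modify", "track"},
    "Data Destination Manager": {"create", "modify", "track",
                                 "approve", "deny", "return"},
}

@dataclass
class User:
    name: str
    role: str
    second_factor_verified: bool  # e.g., an Active Directory / PIV challenge

def authorize(user, action):
    """Allow an action only for a 2FA-verified user whose role permits it."""
    if not user.second_factor_verified:
        return False
    return action in ROLE_PERMISSIONS.get(user.role, set())

alice = User("alice", "Requestor", second_factor_verified=True)
print(authorize(alice, "track"))    # -> True
print(authorize(alice, "approve"))  # -> False
```

Test cases for this feature would exercise both axes independently: a permitted action denied when the second factor is missing, and a forbidden action denied even when authentication succeeds.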
| 320 | Table 4: Release 3 Features and Functionality | |
| 321 | Role / Work Stream | |
| 322 | Features / Functionality | |
| 323 | Requestor (UI) | |
| 324 | Regression Create, Modify, Track, Copy, Cancel Data Request | |
| 325 | Automatic e-mail notification | |
| 326 | VINCI documentation view | |
| 327 | Data Destination Manager (UI) | |
| 328 | Regression Create, Modify, Track, Copy, Cancel, Approve Data Request | |
| 329 | Automatic e-mail notification | |
| 330 | Data Management utility library | |
| 331 | Data Source Manager (UI) | |
| 332 | Accept Data Request | |
| 333 | Deny Data Request (Request Cannot be Fulfilled) | |
| 334 | Deliver Results | |
| 335 | History (UI) | |
| 336 | Capture high level and detailed history information for each request. | |
| 337 | Comments (UI) | |
| 338 | Provide the ability to add comments throughout the lifecycle of a data request | |
| 339 | Back end | |
| 340 | Regression basic data file assessment for number of rows and missing data | |
| 341 | Automatic detection (notification) of data from VINCI | |
| 342 | Security | |
| 343 | Regression Role-based access with Two-Factor Authentication | |
| 344 | ||
| 345 | ||
| 346 | Table 5: Proposed Release 4 Features and Functionality | |
| 347 | Role / Work Stream | |
| 348 | Features / Functionality | |
| 349 | Requestor (UI) | |
| 350 | Regression Create, Modify, Track, Copy, Cancel Data Request, Automatic e-mail notification | |
| 351 | VINCI documentation view | |
| 352 | Usability Compliance | |
| 353 | Data Destination Manager (UI) | |
| 354 | Regression Create, Modify, Track, Approve, Copy, Cancel Data Request, Automatic e-mail notification. Data Management utility library | |
| 355 | Usability Compliance | |
| 356 | Genisis2 System Administrator (UI) | |
| 357 | Regression Test Tracking Reports | |
| 358 | User management | |
| 359 | Usability Compliance | |
| 360 | Additional Reports | |
| 361 | Back end | |
| 362 | Regression basic data file assessment for number of rows and missing data | |
| 363 | Automatic detection (notification) of data from VINCI | |
| 364 | Secure transfer of data from VINCI | |
| 365 | Enhanced Data Operations (file checking) | |
| 366 | Enhanced Performance | |
| 367 | Security | |
| 368 | Regression Role-based access with Two-Factor Authentication | |
| 369 | Additional Security Enhancements | |
| 370 | ||
| 371 | Product Co mponent Te st | |
| 372 | Prior to d elivery of the sprin t build to the teste rs, the de velopers w ill perfor m Product Component Testing (a lso known as Unit Te sting) on code they have devel oped. Thes e activiti es will be performed in the DE V environm ent. Once the stabil ity of the new funct ionality t hat suppor ts the use r story an d associat ed require ments is c onfirmed, the result s will be communicat ed to the Scrum Mast er and Dev elopment P roject Man ager. Once approvals to move f orward hav e been giv en, the re sults and source cod e will be uploaded i nto RTC an d a new bu ild will b e readied for the Te st Team. | |
| 373 | Component Integratio n Test | |
| 374 | After the Developmen t Team has completed Product C omponent T esting for a sprint, the Devel opment Tea m will del iver the b uild to th e Test Tea m to perfo rm Compone nt Integra tion Testi ng. This w ill take t he initial form of S moke Testi ng to conf irm the st ability an d viabilit y of the d elivered b uild befor e proceedi ng to test execution of the de livered co mponent. T his testin g will be done withi n the TEST environme nt. Test r esults wil l be docum ented with in RQM and the teste rs will ma ke an asse ssment bas ed on thos e test res ults wheth er to cont inue with system tes ting or re turn the b uild to de velopment for additi onal work. | |
| 375 | System Tes ts | |
| 376 | As new bui lds become available and stabl e enough t o promote to the AWS -Test envi ronment, S ystem Test ing will b e executed by the Te st Team in the AWS-T est enviro nment. Hig h- level t est script s will be created ba sed on ava ilable fun ctionality delivered from the Developmen t Team for each comp onent and sprint. Te sting will take the form of ex ercising t he high-le vel script s when app ropriate, and suppor ted by “ad hoc” test ing due to the frequ ent develo pment cycl es and pot entially r apidly res ponsive de sign chang es. Test r esults wil l be avail able after each comp onent has been teste d within R QM and via the Genis is2 Releas e Excel Sp readsheet. | |
| 377 | User Acce ptance Tes ting | |
| 378 | UAT will b e performe d upon com pletion of System Te sting and SQA. Testi ng will be performed for the r elease by stakeholde rs / users within th e VA-SQA e nvironment . The Test Team will provide a ccess to t he test ca ses within RQM if re quested, b ut the use r’s scenar ios should be develo ped by the users. Th e users wi ll provide results f or reporti ng progres s and comp letion of UAT testin g using a method tha t will be establishe d for each release. The Test T eam will a lso provid e support to assist in documen tation of the test r esults, if required, and will capture th is informa tion with RQM or a s preadsheet uploaded to Rationa l for test results a nd RTC for any defec ts identif ied. | |
| 379 | Enterprise System En gineering Testing | |
| 380 | Enterprise System En gineering (ESE) Test ing is typ ically con ducted by the ESE Te sting Serv ices Team and suppor ted by the Genisis2 Developmen t Team. Th e Genisis2 Test Lead will sche dule a kic k off meet ing with E SE Testing Services to plan an d coordina te any nee ded suppor t for rele ase readin ess in acc ordance wi th the VIP Methodolo gy. This w ill includ e the prov ision of d ocumentati on, access to enviro nments, an d any addi tional tas ks as requ ired to su pport thei r Statisti c and Dyna mic Testin g requirem ents. This section o f the Mast er Test Pl an will be updated t o include any issues and/or ri sks identi fied by ES E once the y have com pleted the ir review and/or tes ting for e ach releas e. | |
Performance Testing
This section will be updated in a future release to identify the approach to performance testing based on system requirements. The update will include details about tools (if applicable), team POCs, and any special requirements and dependencies.
Testing Techniques
Testing techniques include both static and dynamic testing. Static analysis focuses on appropriate methods that are used to determine or estimate software quality without reference to actual executions.
Static testing techniques include the following:
- Review of business requirements (RSD)
- Review of functional specifications and design documents
- Review of user stories
- Preparation of the test plan
- Preparation of test scenarios and test cases
- Execution of walkthroughs and inspections
- Identification and documentation of software defects (e.g., in RTC)
Dynamic analysis deals with specific methods for ascertaining software quality through actual executions (e.g., with real data and under real circumstances). Dynamic testing techniques include:
- Product Component Testing
- System Testing
- Regression Testing
- Product Integration Testing
- Usability Testing
- End-to-End Testing
- Performance Testing
- User Acceptance Testing
- Risk-based Testing
The Genisis2 Test Team will develop and execute test cases and/or scripts in accordance with the following functional priorities:
- Specific testing recommended by the Genisis2 Development Team, or testing that requires additional/alternative testing resources.
- Specific functionality necessitated by issues that the Genisis2 Development Team encounters.
- Specific testing requirements recommended by the developers, SQA Analysts, and/or users that were not initially identified; these will result in an update to the User Story documentation to ensure traceability from the user stories to the test cases.
- Specific recommendations documented within the Enterprise Systems Engineering Risk Analysis and Testing Scope Report.
Enterprise Testing
The Genisis2 Project Team will work with the Design & Architecture Compliance Group (DE&A) to identify the OI&T DE&A Epics required to ensure Genisis2 is in compliance with VA's architecture standards. The applicable DE&A User Stories will be captured and moved into the Genisis Rational Instance. Resources from DE&A will work with members of the Genisis2 Development, Test, and Configuration Management Teams to provide information in the form of the Genisis2 System Design Document (SDD) that will validate the OI&T DE&A compliance requirements and user stories. This will ensure the system meets VA Enterprise standards and supports the required traceability from RM through RQM for Release Readiness.
Security Testing
The Genisis2 Team contacted a member of the Cyber Security Policy and Compliance Group (CSPC) to discuss the OI&T Security Epics. CSPC is in the process of reviewing and rewriting the OI&T Security Epics, Sub-Epics, and User Stories, so the team was advised to wait until early January 2017. The Genisis2 Team will meet at that time to review and identify the Security User Stories applicable to Genisis2. This information will be used to develop test cases that validate the system and OI&T security requirements and ensure Release Readiness.
Privacy Testing
The Genisis2 Team contacted a member of the CSPC to discuss the OI&T Security Epics related to the Privacy and Security Rule provisions of the Health Insurance Portability and Accountability Act (HIPAA). This information will be used to support tests to ensure that (1) veteran and employee data are adequately protected, and (2) systems and applications comply with the Privacy and Security Rule provisions of HIPAA to secure an Authority to Operate (ATO). Genisis2 is currently covered under the Region 4 ATO; tests will be limited to WASA and Fortify scans executed by the Development Team.
Section 508 Compliance Testing
The Development and Test Teams are responsible for ensuring that Genisis2 functionality is usable from the keyboard, while the Section 508 Program Office is responsible for performing independent compliance testing with assistive technology.
The project must obtain sign-off from the Section 508 Program Office that compliance testing was performed. For more information, contact the Section 508 Program Office at DNS.
The Genisis2 Project Team will meet with representatives of the Section 508 Compliance Office to determine how Genisis2 will be certified and to identify the approved tools for use. The Genisis2 Team will perform internal 508 testing to determine readiness for the 508 audit. Once completed, the Genisis2 Team will submit a request to the VA-508 Team to perform independent testing and provide 508 approval.
The following tests must be performed to be in compliance with Section 508:
- Can you use the keyboard instead of the mouse?
- Does the cursor move in a logical order or flow?
- Do the elements do what they are supposed to do?
- Is there alternate text for all non-text elements?
- Does the link text explain what the link does?
- Are there captions for audio and visual elements, or transcripts for audio only?
- Is color the only means of identification of elements on a page?
- Are documents organized so they are readable without requiring an associated style sheet?
- Are there server-side image maps or client-side image maps?
- Are tables coded properly?
- Does your website have frames?
- Does the screen flicker with a frequency greater than 2 Hz and lower than 55 Hz?
- Are there text-only pages for information that cannot be made compliant in any other way?
- Is the script language readable by assistive technology users?
- Is there a link for software downloads?
- Are there electronic forms?
- Is there a way for the user to skip navigation functions/sidebar and go straight to the content?
- If a timed response is used, is the user prompted to request more time?
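Most of the checklist above requires manual evaluation with assistive technology, but a few items (alternate text for non-text elements, a skip-navigation link) can be spot-checked automatically during the internal 508 testing described above. The following Python sketch is illustrative only and is not part of the approved 508 toolset; the class and function names are invented for this example.

```python
from html.parser import HTMLParser

class Section508SpotChecker(HTMLParser):
    """Spot-checks two checklist items: alternate text on images and the
    presence of an in-page "skip navigation" style link."""

    def __init__(self):
        super().__init__()
        self.images_missing_alt = []
        self.has_skip_link = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            # Checklist item: "Is there alternate text for all non-text elements?"
            self.images_missing_alt.append(attrs.get("src", "<unknown>"))
        if tag == "a" and (attrs.get("href") or "").startswith("#"):
            # Checklist item: "Is there a way for the user to skip navigation?"
            # Heuristic only: any in-page anchor is treated as a candidate skip link.
            self.has_skip_link = True

def spot_check_508(html: str) -> dict:
    """Run the spot checks over one page of HTML and return the findings."""
    checker = Section508SpotChecker()
    checker.feed(html)
    return {
        "images_missing_alt": checker.images_missing_alt,
        "has_skip_link": checker.has_skip_link,
    }
```

A check like this can flag obvious omissions before the independent VA-508 audit, but it does not replace testing with assistive technology.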
Multi-Divisional Testing
Since Genisis2 will be deployed within PITT at one site, this project does not need to address any specific multi-divisional requirements; therefore, multi-divisional testing is not required.
Performance and Capacity Testing
The Genisis2 Development Team will perform quantitative tests that measure the response time at which the system functions, to ensure that the application meets the specifications defined within the non-functional requirements.
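The plan does not yet name a performance-testing tool (that choice is deferred to the Performance Testing section), but the quantitative check described above amounts to timing repeated operations and comparing a summary statistic against a non-functional requirement. A minimal sketch, with invented function names and a placeholder threshold:

```python
import statistics
import time

def measure_response_times(operation, samples: int = 50) -> dict:
    """Time repeated calls to `operation` and summarize results in milliseconds."""
    timings_ms = []
    for _ in range(samples):
        start = time.perf_counter()
        operation()
        timings_ms.append((time.perf_counter() - start) * 1000.0)
    timings_ms.sort()
    return {
        "mean_ms": statistics.mean(timings_ms),
        "p95_ms": timings_ms[int(0.95 * (len(timings_ms) - 1))],
        "max_ms": timings_ms[-1],
    }

def meets_requirement(summary: dict, limit_ms: float) -> bool:
    """Pass only if the 95th-percentile response time is within the limit.
    The limit would come from the non-functional requirements; any concrete
    value used with this sketch is a placeholder, not a Genisis2 requirement."""
    return summary["p95_ms"] <= limit_ms
```

Using a percentile rather than the mean keeps a single slow outlier from masking generally poor response times, which is why the sketch reports both.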
Test Types
Table 6 lists the types of tests to be performed on the Genisis2 application, as appropriate.
Table 6: Test Types

| Test Type | Party Responsible |
|---|---|
| Access control testing | Development / Test Teams |
| Build verification testing | Development Team |
| Compliance testing | Test Team |
| Component integration testing | Development Team |
| Configuration testing | Development Team |
| Data and database integrity testing | Development / Test Teams |
| Documentation testing | Development / Test Teams |
| Error analysis testing | Development / Test Teams |
| Exploratory testing | Test Team |
| Failover testing | N/A |
| Installation testing | Development / Test Teams |
| Integration testing | Development / Test Teams |
| Migration testing | Development Team |
| Multi-divisional testing | N/A |
| Parallel testing | Development / Test Teams |
| Performance monitoring testing | Test Team |
| Performance testing | Development Team |
| Performance - Benchmark testing | Development Team |
| Performance - Contention testing | Development Team |
| Performance - Endurance testing | Development Team |
| Performance - Load testing | Development Team |
| Performance - Profiling testing | Development Team |
| Performance - Spike testing | Development Team |
| Performance - Stress testing | Development Team |
| Privacy testing | Development / Test Teams |
| Product component testing | Development Team |
| Recovery testing | Development / Test Teams |
| Regression testing | Test Team |
| Risk-based testing | Test Team |
| Section 508 compliance testing | Test Team / 508 Compliance Group |
| Security testing | Test Team |
| Smoke testing | Development / Test Teams |
| System testing | Test Team |
| Usability testing | Test Team |
| User functionality testing | Test / User Teams |
| User interface testing | Development / Test / User Teams |
Productivity and Support Tools
The Genisis2 Test Team will utilize the tools listed in Table 7 to support test management, test execution, defect tracking, and test reporting activities.
Table 7: Tool Categories or Types

| Tool Category or Type | Tool Brand Name | Vendor or In-house | Version |
|---|---|---|---|
| Test Management | RQM | In-house | Version 6 |
| Defect Tracking | RTC | In-house | Version 6 |
| Test Coverage Monitor or Profiler | Rational DOORS-NG & RQM | In-house | Version 6 |
| Project Management | Microsoft Project | In-house | 2013 |
| Performance Testing | TBD | TBD | TBD |
| Configuration Management | RTC | In-house | Version 6 |
| Functional Test Automation | Rational Functional Tester | In-house | TBD |
Test Criteria
Process Reviews
The Master Test Plan undergoes two reviews:
- Peer Review – performed upon completion of the Master Test Plan; the plan is updated by both the Development and Test Teams.
- Formal Review – presentation to the VA PM after the Development and Test Leads approve the Master Test Plan.
The Master Test Plan serves as input for data elements entered into RQM. Both the Master Test Plan and the data elements contained in RQM support the VIP processes for traceability of requirements throughout the project's lifecycle, and the data required to support the decisions required for Critical Decision 2 (CD2) and Release Management.
For more background information on the reviews associated with testing, see the Product Build, Test Preparation, and Independent Test and Evaluation processes. Even though ProPath processes are no longer required, they are still used as best-practice guidelines.
Pass/Fail Criteria
Incidents identified during the execution of this test plan will be evaluated to determine their severity. All incidents will be recorded in RTC:
- Has a reasonable workaround to maintain functionality
- Impacts a small group of users, but has a workaround
- Functionality works, but not to requirements, specifications, or standards, and workflow is not hampered
1) Low Impact Test Incident: an error or lack of functionality that may cause operator/user inconvenience and minimally affects operational processing, for example:
- Spelling errors
- Minor GUI graphical/formatting errors that do not affect functionality or visibility
2) Enhancement Test Incident: something that would be "nice" to have in the integration piece but was not included in the specifications for this release.
All High and Medium defects shall be addressed or negotiated prior to release. Any limitation or outstanding test incident shall have an approved contingency process (workaround) in place prior to release.
Suspension and Resumption Criteria
- Testing will cease on a test item when a high-impact test incident is logged. Testing will resume when the incident is addressed.
- Testing will cease on the entire release when three high-impact test incidents are logged. Testing will resume when the incidents are addressed.
- Testing will cease if any element of the test system is unavailable, such as the VA Network Servers or CAG.
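The suspension and resumption rules above are simple counting rules, so the decision logic can be expressed directly. A sketch (the class and method names are illustrative; the three-incident threshold comes from the criteria above):

```python
from dataclasses import dataclass, field

RELEASE_HALT_THRESHOLD = 3  # three open high-impact incidents halt the whole release

@dataclass
class SuspensionTracker:
    """Tracks open high-impact test incidents and answers the two suspension
    questions posed by the criteria above."""
    open_high_impact: set = field(default_factory=set)

    def log_high_impact(self, test_item: str) -> None:
        self.open_high_impact.add(test_item)

    def resolve(self, test_item: str) -> None:
        # Resumption: testing resumes once the incident is addressed.
        self.open_high_impact.discard(test_item)

    def item_suspended(self, test_item: str) -> bool:
        # Testing ceases on a test item while it has an open high-impact incident.
        return test_item in self.open_high_impact

    def release_suspended(self) -> bool:
        # Testing ceases on the entire release at three open high-impact incidents.
        return len(self.open_high_impact) >= RELEASE_HALT_THRESHOLD
```

In practice the incident counts would be driven from RTC queries; this sketch only shows how the stated thresholds compose.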
Test Deliverables
Table 8 lists the test deliverables for the Genisis2 project.
Table 8: Test Deliverables

| Test Deliverable | Responsible Party |
|---|---|
| Master Test Plan (MTP) | Test Lead |
| Master Test Plan (Rational RQM) | Test Lead |
| Master Test Plan Checklist | Test Lead |
| Iteration Test Plans (Rational RQM) | Test Lead & Test Team |
| Test Schedule (included within RTC) | Test Lead |
| Test Cases/Test Scripts (Rational RQM) | Test Team |
| Test Data | TBD |
| Test Environment | Genisis2 System Administrator & Test Team |
| Integrated Test Environment | Genisis2 System Administrator, Development & Test Teams |
| Traceability Matrix (Rational DOORS-NG & RQM) | Test Team |
| Test Defect Logs (Rational RTC) | Test Team |
| Test Execution Logs (Rational RQM) | Test Team |
| Test Evaluation Summaries | Test Lead & Test Team |
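The Traceability Matrix deliverable above exists to show that every requirement (user story) is linked to at least one test case. The real matrix is maintained in Rational DOORS-NG and RQM; the underlying coverage check can be sketched as follows, with hypothetical IDs and an invented function name:

```python
def traceability_gaps(requirements: list[str], links: dict[str, list[str]]) -> dict:
    """Given requirement (user story) IDs and a mapping of requirement ID to
    linked test case IDs, report coverage and any unlinked requirements."""
    uncovered = [req for req in requirements if not links.get(req)]
    covered = len(requirements) - len(uncovered)
    return {
        "covered": covered,
        "uncovered": uncovered,
        "pct_covered": 100.0 * covered / len(requirements) if requirements else 0.0,
    }
```

A report of uncovered requirements is exactly what a release-readiness review would look for before sign-off.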
| 610 | Test Sched ule | |
| 611 | The overal l project schedule i s being ma naged with in the Gen isis2 Rati onal Insta nce; and s pecificall y, within RTC and RQ M. | |
| 612 | The follow ing is a s napshot in the Genis is Rationa l Instance and inclu des the pr ojected sc hedule for Releases 1-4. | |
| 613 | ||
| 614 | ||
| 615 | ||
| 616 | ||
| 617 | ||
| 618 | ||
| 619 | Genisis2 w ill not be released nationally . The Rele ase Site f or Genisis 2 is PITC. | |
| 620 | Table 9 li sts the mi lestones f or testing Genisis2. | |
| 621 | Table 9: T esting Mil estones | |
| 622 | Testing Mi lestones | |
| 623 | Responsibl e Party | |
| 624 | Complete M aster Test Plan | |
| 625 | Test Lead / Test Eng ineer | |
| 626 | Complete R equirement s Traceabi lity Matri x | |
| 627 | Requiremen ts Manager & | |
| 628 | Test Lead / Engineer | |
| 629 | Unit Testi ng | |
| 630 | Developmen t Team | |
| 631 | Functional Testing | |
| 632 | Test Team | |
| 633 | DE&A Testi ng | |
| 634 | Developmen t & Test T eams | |
| 635 | Security T esting | |
| 636 | Developmen t & Test T eams | |
| 637 | Performanc e Testing | |
| 638 | Developmen t Team | |
| 639 | Section 50 8 Testing | |
| 640 | Test Team | |
| 641 | User Accep tance Test ing | |
| 642 | Users with Developme nt & Test Support | |
| 643 | Test Evalu ation Summ ary | |
| 644 | Test Team | |
| 645 | ||
Test Environments
The Configuration Manager (CM) is responsible for managing the releases for all Genisis2 project test environments. The DBA is responsible for controlling and maintaining all Genisis2 project test environments. Unplanned changes to the test environments may introduce new test incidents, alter the expected test results, and thus invalidate the results of the test cases. The CM is responsible for builds, national patch installs, set-up, and configuration change support. The Internet Explorer (IE) Team is responsible for configuration and maintenance of the interface engines.
Test Environment Quality Gate: the test system is stood up and source code is integrated into a change-controlled environment by following the Configuration Management Plan procedures prior to testing.
Test Environment Configurations
Successful testing requires control of the test environment. Unplanned changes to the test environment may introduce new test incidents, alter the expected test results, and invalidate the test cases. Successful testing requires controlled access to the test environment, an environment that replicates the field environment as closely as possible.
Development testing will be conducted in the development environment. System and integration test environments should simulate the production environment where the software will be executed. One dedicated test environment will be set up and made ready for the Test Team. This test environment will also be used by the SQA Team for system and integration testing. All promotions to the test environment will be managed through a change management process, and access to this environment will be managed by the Genisis2 Development and Test Teams.
A Genisis2 test environment will be utilized in order to perform integration testing for modules and components when they are ready to be considered for promotion and release. The Test Team will maintain this account.
The Test Environment Configurations need to be provided and supported for the Genisis2 project. Table 10 provides the test environment information.
Table 10: Necessary Genisis2 Test Environment Configurations

| Configuration Name | Description |
|---|---|
| Server Name | DNS |
| Database Names | jbpmdb (JBPM), Genisisdb (app database) |
| Server Location | Austin, TX |
| Database Tier | MS SQL Server 2012 |
AWS Test Environment Configuration
The Test Team will work with development and the system administrators to set up the users and data required for the SQA environment. Specifics are identified in Sections 8.3 and 8.4. The test system will not contain "live" patient data. Table 11 provides information regarding the temporary test environment.
Table 11: Amazon Web Services (AWS) Genisis2 Temporary Test Environment

| Resource | Quantity | Name and Type |
|---|---|---|
| Database Server | 1 | IP |
| Network or Subnet | 1 | IP |
| Database Name | 2 | jbpmdb (JBPM), Genisisdb (app database) |
| Test Repository | 1 | IP |
| Test and Development GFEs | 8 | (6) i7 processor (Development); (2) i5 processor (Test) |
VA (AITC) Test Environment Configurations
Functionally exact instances of the Genisis2 environment will be created and maintained. The Test Team will use one instance for unit and integration testing, while one instance will be reserved for system testing by the SQA Team.
The test system will not contain "live" patient data.
The specific elements of the test system may not be fully understood in early iterations, so this section may be completed and updated over time. The test system should simulate the production environment as closely as possible, scaling down concurrent access, database size, and so forth, if and where appropriate. Table 12 sets forth the system resources for the test effort presented in this Master Test Plan, and will be tailored as needed.
Table 12: System Hardware Resources

| Resource | Quantity | Name and Type |
|---|---|---|
| Database Server | 1 | DNS |
| Network or Subnet | TBD | IP |
| Server Name | 1 | DNS |
| Database Name | 1 | IP |
| Test Repository | 1 | DNS |
| Address | 1 | IP |
| Test Development PCs | TBD | Using a combination of GFEs and CAG access to VA |
Base Software Elements in the Test Environments
Table 13 describes the base software elements that are required in the test environment for this Master Test Plan. Software elements will be adjusted as appropriate. If necessary, software patches will be provided or referenced in the table.
Table 13: Software Elements

| Software Element Name | Version | Type and Other Notes |
|---|---|---|
| Linux | RHEL | Operating System |
| Windows Server | 2008 R2 | Operating System |
| Internet Explorer 11 | Version 11 | Internet Browser |
| MS Outlook 2010 | Version 14 | Email Client software |
Staffing and Training Needs
Table 14 indicates the number of personnel resources needed to plan, prepare, and execute the testing tasks.
Table 14: Staffing Resources

| Testing Task | Quantity of Personnel Needed | Test Process | Duration (Days) |
|---|---|---|---|
| Develop the Master Test Plan | 1 FTE | Test Planning | 5 days |
| Set up Genisis in the RQM Instance | 1 FTE | Test Planning | 2 days |
| Create Test Plans for each Release within Rational | 1.0 FTE | Test Planning | 2 days |
| Establish the TEST Environment | 1.5 FTE | Test Preparation | 2 days |
| Create test cases and/or scripts for each Sprint | 3 FTE | Test Preparation | 5 days |
| Enter data elements (test cases/scripts) into RQM, create test records, and link scenarios to requirements in RM for traceability (RTM); performed after the final development sprint | 1 FTE | Test Preparation & Test Execution | 10 days |
| Execute tests for each module per sprint | 3 FTE | Test Execution | 4 days |
| Document Test Results | 3 FTE | Test Reporting | 1 day |
Table 15 identifies the training needs required to execute the activities outlined in the Master Test Plan.
Table 15: Training Needs

| Name | Training Need | Training Option | Estimated Training Hours |
|---|---|---|---|
| Stakeholders (PM, COR) | IBM Rational (RM, RQM) | VA-TMS training | 8 hours |
| Architect & Developers | IBM Rational (RTC) | VA-TMS training | 8 hours |
| Scrum Master | IBM Rational (RTC) | VA-TMS training | 8 hours |
| Business Analyst | IBM Rational (RM) | VA-TMS training | 8 hours |
| Subject Matter Experts (SME) | IBM Rational (RM, RQM) | VA-TMS training | 8 hours |
| Test Engineers | IBM Rational (RQM) | VA-TMS training | 8 hours |
| UAT/UFT Testers | IBM Rational (RQM) | VA-TMS training | 8 hours |
| Newly on-boarded Testers/Developers | IBM Rational (RTC & RQM) | VA-TMS training & hands-on | 8 hours |
| Functional Testers (2) | IBM Rational Functional Tester | VA Sponsored Training | 16 hours |
Risks and Constraints
Risks associated with testing are potential problems or events that may cause damage to the software, system, operating systems, schedule, scope, budget, or resources. The risk log was taken into consideration in the development of this test plan. The risks outlined may impact the scope and the schedule, necessitating a deviation from this test plan. Table 16 lists the risks identified for this test plan; they are recorded and tracked in the risk log in RTC.
Table 16: Risk List

| Risk Description | Potential Impact | Mitigation / Avoidance |
|---|---|---|
| Requirement specification updates late in the development cycle | High | Establish and enforce requirement completion dates; provide effective change management; update the test plan and test cases as requirements are known; encourage the project team to address high-risk, high-need requirements early. |
| Lack of availability of UAT personnel causes schedule delays | Medium | Communicate with the users about the project schedule; work with users to ensure adequate support; identify backup support staff, if applicable. |
| Lack of availability of SQA personnel causes schedule delays | Medium | Communicate with the SQA personnel about the project schedule; work with SQA to ensure adequate support; identify backup support staff, if applicable. |
| Critical defect discovered during SQA or UAT requiring remediation | High | Test software and devices thoroughly prior to SQA and UAT; create reports and provide escalation when issues are not addressed in a timely manner. |
| Lack of communication between the teams | High | Identify key personnel on each team who can coordinate communication and efforts between the teams (e.g., Genisis, VINCI). |
Test Metrics
Metrics are a system of parameters or methods for the quantitative and periodic assessment of a process that is to be measured.
Test metrics may include, but are not limited to:
- Number of test cases (pass/fail)
- Percentage of test cases executed
- Number of requirements and percentage tested
- Percentage of test cases resulting in defect detection
- Number of defects attributed to test case/test script creation
- Percentage of defects identified, listed by cause and severity
- Time to re-test
The Final Test Evaluation Summary Report completed by the Test Analyst will capture some of the measures specified above.
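As one illustration of how the bulleted measures could be derived from raw execution records (the record format and function name here are invented for the example; the actual results live in RQM):

```python
def summarize_test_metrics(results: list[dict]) -> dict:
    """Compute pass/fail counts, execution percentage, and defect-detection rate
    from per-test-case records of the form
    {"executed": bool, "passed": bool, "defects_found": int}."""
    total = len(results)
    executed = [r for r in results if r["executed"]]
    passed = sum(1 for r in executed if r["passed"])
    with_defects = sum(1 for r in executed if r["defects_found"] > 0)
    return {
        "total_cases": total,
        "passed": passed,
        "failed": len(executed) - passed,
        "pct_executed": 100.0 * len(executed) / total if total else 0.0,
        "pct_detecting_defects": 100.0 * with_defects / len(executed) if executed else 0.0,
    }
```

Percentages are computed against executed cases (not total cases) for the defect-detection rate, since an unexecuted case cannot detect anything.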
Attachment A: Approval Signatures

REVIEW DATE: June 16, 2017

Signed: _______________________ Date: ___________
Katie A. Thomas – Project Manager

Signed: _______________________ Date: ___________
Saiju Pyarajan – Business Sponsor
| 871 | Appendix A : Test Typ e Definiti ons | |
| 872 | Test Type | |
| 873 | Definition | |
| 874 | Access Con trol Testi ng | |
| 875 | A type of testing th at attests that the target-of- test data (or system s) are acc essible on ly to thos e actors f or which t hey are in tended, as defined b y use case s. Access Control Te sting veri fies that access to the system is contro lled and t hat unwant ed or unau thorized a ccess is p rohibited. This test is implem ented and executed o n various targets-of -test. | |
| 876 | Benchmark Testing: | |
| 877 | A type of performanc e testing that compa res the pe rformance of new or unknown fu nctionalit y to a kno wn referen ce standar d (e.g., e xisting so ftware or measuremen ts). For e xample, be nchmark te sting may compare th e performa nce of cur rent syste ms with th e performa nce of the Linux/Ora cle system . | |
| 878 | Build Veri fication T esting | |
| 879 | (Prerequis ite: Smoke Test) | |
| 880 | A type of testing pe rformed fo r each new build, co mparing th e baseline with the actual obj ect proper ties in th e current build. The output fr om this te st indicat es what ob ject prope rties have changed o r don’t me et the req uirements. Together with the S moke test, the Build Verificat ion test m ay be util ized by pr ojects to determine if additio nal functi onal testi ng is appr opriate fo r a given build or i f a build is ready f or product ion. | |
| 881 | Business C ycle Testi ng | |
| 882 | A type of testing that focuses upon activities and transactions performed end to end over time. This test type executes the functionality associated with a period of time (e.g., one week, month, or year). These tests include all daily, weekly, and monthly cycles, and events that are date-sensitive (e.g., end of the month management reports, monthly reports, quarterly reports, and year-end reports). | |
| 883 | Capacity Testing | |
| 884 | Capacity testing occurs when you simulate the number of users in order to stress an application's hardware and/or network infrastructure. Capacity testing is done to determine the capacity (CPU, data storage, LAN, WAN, etc.) of the system and/or network under test. | |
| 885 | Compliance Testing | |
| 886 | A type of testing that verifies that a collection of software and hardware fulfills given specifications. For example, these tests will minimally include: “core specifications for re-hosting – ver.1.5-draft 3.doc”, Section 508 of The Rehabilitation Act Amendments of 1998, Race and Ethnicity Test, and VA Directive 6102 Compliance. It does not exclude any other tests that may also arise. | |
| 887 | Component Integration Testing | |
| 888 | Testing performed to expose defects in the interfaces and interaction between integrated components, as well as verifying installation instructions. | |
| 889 | Configuration Testing | |
| 890 | A type of testing concerned with checking the program's compatibility with as many possible configurations of hardware and system software. In most production environments, the particular hardware specifications for the client workstations, network connections, and database servers vary. Client workstations may have different software loaded (for example, applications, drivers, and so on), and, at any one time, many different combinations may be active using different resources. The goal of the configuration test is finding a hardware combination that should be, but is not, compatible with the program. | |
| 891 | Contention Testing | |
| 892 | A type of performance testing that executes tests that cause the application to fail with regard to actual or simulated concurrency. Contention testing identifies failures associated with locking, deadlock, livelock, starvation, race conditions, priority inversion, data loss, loss of memory, and lack of thread safety in shared software components or data. | |
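The lost-update failure mode named above can be illustrated with a minimal, hypothetical contention check; the function names, thread counts, and iteration counts below are illustrative and not part of this plan:

```python
import threading

def increment_many(counter, lock, times):
    """Increment a shared counter; the lock guards the read-modify-write."""
    for _ in range(times):
        with lock:
            counter["value"] += 1

def run_contended(num_threads=8, times=10_000):
    """Drive concurrent increments and return the final count for checking."""
    counter = {"value": 0}
    lock = threading.Lock()
    threads = [
        threading.Thread(target=increment_many, args=(counter, lock, times))
        for _ in range(num_threads)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter["value"]
```

A contention test would assert that the final count equals `num_threads * times`; removing the lock is one way to expose lost updates under concurrent load.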
| 893 | Data and Database Integrity Testing | |
| 894 | A type of testing that verifies that data is being stored by the system in a manner where the data is not compromised by the initial storage, updating, restoration, or retrieval processing. This type of testing is intended to uncover design flaws that may result in data corruption, unauthorized data access, lack of data integrity across multiple tables, and lack of adequate transaction performance. The databases, data files, and the database or data file processes should be tested as a subsystem within the application. | |
| 895 | Documentation Testing | |
| 896 | Documentation testing is a type of testing that should validate the information contained within the software documentation set for the following qualities: compliance to accepted standards and conventions, accuracy, completeness, and usability. The documentation testing should verify that all of the required information is provided in order for the appropriate user to be able to properly install, implement, operate, and maintain the software application. The current VistA documentation set can consist of any of the following manual types: | |
| 897 | Release Notes, Installation Guide, User Manuals, Technical Manual, and Security Guide. | |
| 898 | Error Analysis Testing | |
| 899 | This type of testing verifies that the application checks for input, detects invalid data, and prevents invalid data from being entered into the application. This type of testing also includes the verification of error logs and error messages that are displayed to the user. | |
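A minimal sketch of the input-checking half of this definition; the validator name, field, and accepted range are hypothetical:

```python
def validate_age(raw):
    """Detect invalid data and prevent it from entering the application,
    raising an error message suitable for display and for the error log."""
    try:
        age = int(raw)
    except (TypeError, ValueError):
        raise ValueError(f"invalid age input: {raw!r}")
    if not 0 <= age <= 130:
        raise ValueError(f"age out of range: {age}")
    return age
```

An error analysis test would feed both valid and invalid values and verify that invalid data is rejected with the expected error message rather than being stored.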
| 900 | Exploratory Testing | |
| 901 | A technique for testing computer software that requires minimal planning and tolerates limited documentation for the target-of-test in advance of test execution, relying on the skill and knowledge of the tester and feedback from test results to guide the ongoing test effort. Exploratory testing is often conducted in short sessions in which feedback gained from one session is used to dynamically plan subsequent sessions. | |
| 902 | Failover Testing | |
| 903 | A type of testing that ensures an alternate or backup system properly “takes over” (i.e., a backup system functions when the primary system fails). Failover Testing also tests that a system continually runs when the failover occurs, and that the failover happens without any loss of data or transactions. Failover Testing should be combined with Recovery Testing. | |
| 904 | Installation Testing | |
| 905 | A type of testing that verifies that the application or system installs as intended on different hardware and software configurations, and under different conditions (e.g., a new installation, an upgrade, and a complete or custom installation). Installation testing may also measure the ease with which an application or system can be successfully installed, typically measured in terms of the average number of person-hours required for a trained operator or hardware engineer to perform the installation. Part of this installation test is to perform an uninstall. As a result of this uninstall, the system, application, and database should return to the state prior to the install. | |
| 906 | Integration Testing | |
| 907 | An incremental series of tests of combinations or sub-assemblies of selected components in an overall system. Integration testing proceeds incrementally: successively larger and more complex combinations of components are tested in sequence, proceeding from the unit level (0% integration) to eventually the full system test (100% integration). | |
| 908 | Load Testing | |
| 909 | A performance test that subjects the system to varying workloads in order to measure and evaluate the performance behaviors and abilities of the system to continue to function properly under these different workloads. Load testing determines and ensures that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates the performance characteristics (e.g., response times, transaction rates, and other time-sensitive issues). | |
| 910 | Migration Testing | |
| 911 | A type of testing that follows standard VistA and HealtheVet (HeV)-VistA operating procedures and loads the latest .jar version onto a live copy of VistA and HeV-VistA. The following are examples of the types of tests that can be performed as part of migration testing: | |
| 912 | Data conversion has been completed | |
| 913 | Data tables are successfully created | |
| 914 | Parallel test for confirmation of data integrity | |
| 915 | Review output reports before and after migration to confirm data integrity | |
| 916 | Run equivalent processes before and after migration. | |
| 917 | Multi-Divisional Testing | |
| 918 | A type of testing that ensures that all applications will operate in a multi-division or multi-site environment, maintaining an enterprise perspective while fully supporting local health care delivery. | |
| 919 | Parallel Testing | |
| 920 | The same internal processes are run on the existing system and the new system. The existing system is considered the “gold standard”, unless proven otherwise. The feedback (expected results, defined time limits, data extracts, etc.) from processes on the new system is compared to the existing system. Parallel testing is performed before the new system is put into a production environment. | |
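The gold-standard comparison described here can be sketched as a small harness; the two lambda stand-ins for the existing and new systems are purely hypothetical:

```python
def run_parallel_comparison(inputs, legacy_process, new_process):
    """Feed identical inputs to both systems; the legacy (existing) system's
    output is treated as the gold standard unless proven otherwise."""
    mismatches = []
    for item in inputs:
        expected = legacy_process(item)  # existing system: gold standard
        actual = new_process(item)       # new system under test
        if actual != expected:
            mismatches.append((item, expected, actual))
    return mismatches
```

With hypothetical stand-ins such as `legacy = lambda x: x * 2` and `new = lambda x: x + x`, an empty mismatch list indicates parity between the two systems for the inputs exercised.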
| 921 | Performance Monitoring Testing | |
| 922 | Performance profiling assesses how a system is spending its time and consuming resources. This type of performance testing optimizes the performance of a system by measuring how much time and resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. The goal of performance profiling is to optimize the feature and application performance. | |
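As a minimal sketch of per-function time measurement using Python's standard-library profiler (the target function `busy_sum` is a deliberately naive, hypothetical example):

```python
import cProfile
import io
import pstats

def busy_sum(n):
    """A deliberately naive function used as the profiling target."""
    total = 0
    for i in range(n):
        total += i
    return total

def profile_call(func, *args, top=5):
    """Profile one call and return (result, report); the report shows which
    functions consumed the most time and would benefit from optimization."""
    profiler = cProfile.Profile()
    profiler.enable()
    result = func(*args)
    profiler.disable()
    buf = io.StringIO()
    pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(top)
    return result, buf.getvalue()
```

The returned report lists call counts and cumulative times per function, which is the raw material for deciding where optimization work pays off.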
| 923 | Performance Testing | |
| 924 | Performance testing assesses how a system is spending its time and consuming resources. Performance testing optimizes a system by measuring how much time and resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. Performance testing may be further refined by the use of specific types of performance tests, such as benchmark, load, stress, performance monitoring, and contention tests. | |
| 925 | Performance – Benchmark Testing | |
| 926 | A type of performance testing that compares the performance of new or unknown functionality to a known reference standard (e.g., existing software or measurements). For example, benchmark testing may compare the performance of current systems with the performance of the Linux/Oracle system. | |
| 927 | Performance – Contention Testing | |
| 928 | A type of performance testing that executes tests that cause the application to fail with regard to actual or simulated concurrency. Contention testing identifies failures associated with locking, deadlock, livelock, starvation, race conditions, priority inversion, data loss, loss of memory, and lack of thread safety in shared software components or data. | |
| 929 | Performance – Endurance Testing | |
| 930 | Endurance testing, also known as soak testing, is usually done to determine if the system can sustain the continuous expected load. During soak tests, memory utilization is monitored to detect potential leaks. | |
| 931 | Performance – Load Testing | |
| 932 | A performance test that subjects the system to varying workloads in order to measure and evaluate the performance behaviors and abilities of the system to continue to function properly under these different workloads. Load testing determines and ensures that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates the performance characteristics (e.g., response times, transaction rates, and other time-sensitive issues). | |
| 933 | Performance – Profiling Testing | |
| 934 | Performance profiling assesses how a system is spending its time and consuming resources. This type of performance testing optimizes the performance of a system by measuring how much time and resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. The goal of performance profiling is to optimize the feature and application performance. | |
| 935 | Performance – Spike Testing | |
| 936 | A performance test in which an application is tested with sudden increments and decrements in the load. The focus is on system behavior during dramatic changes in load. | |
| 937 | Privacy Testing | |
| 938 | A type of testing that ensures that (1) veteran and employee data are adequately protected, and (2) systems and applications comply with the Privacy and Security Rule provisions of HIPAA. | |
| 939 | Product Component Testing | |
| 940 | Product Component Testing (also known as Unit Testing) is the internal technical and functional testing of a module/component of code. Product Component Testing verifies that the requirements defined in the detail design specification have been successfully applied to the module/component under test. | |
| 941 | Recovery Testing | |
| 942 | A type of testing that causes an application or system to fail in a controlled environment. Recovery processes are invoked while an application or system is monitored. Recovery testing verifies that recovery of the application or system, and of its data, is achieved. Recovery Testing should be combined with Failover Testing. | |
| 943 | Regression Test | |
| 944 | A type of testing that validates that existing functionality still performs as expected when new functionality is introduced into the system under test. | |
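A minimal regression suite sketch; the `sales_tax` function and its expected values are hypothetical stand-ins for existing functionality whose behavior must not change when new code is introduced:

```python
import unittest

def sales_tax(amount, rate=0.07):
    """Existing functionality under regression protection."""
    return round(amount * rate, 2)

class SalesTaxRegressionTest(unittest.TestCase):
    """Re-run this suite whenever new functionality enters the system;
    a failure signals that existing behavior has changed."""

    def test_baseline_rate(self):
        self.assertEqual(sales_tax(100.0), 7.0)

    def test_zero_amount(self):
        self.assertEqual(sales_tax(0.0), 0.0)

    def test_explicit_rate(self):
        self.assertEqual(sales_tax(50.0, rate=0.10), 5.0)
```

The suite encodes today's expected behavior so that tomorrow's changes are checked against it automatically.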
| 945 | Risk Based Testing | |
| 946 | A type of testing based on a defined list of project risks. It is designed to explore and/or uncover potential system failures by using the list of risks to select and prioritize testing. | |
| 947 | Section 508 Compliance Testing | |
| 948 | A type of test that (1) ensures that persons with disabilities have access to and are able to interact with graphical user interfaces, and (2) verifies that the application or system meets the specified Section 508 Compliance standards. | |
| 949 | Security Testing | |
| 950 | A type of test that validates the security requirements and ensures readiness for the independent testing performed by the Security Assessment Team, as used by the Assessment and Authorization Process. | |
| 951 | Smoke Test | |
| 952 | A type of testing that ensures that an application or system is stable enough to enter testing in the currently active test phase. It is usually a subset of the overall set of tests, preferably automated, that touches parts of the system in at least a cursory way. | |
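The "cursory subset" idea can be sketched as a tiny automated harness; the check names and their bodies below are hypothetical placeholders for real build-verification checks:

```python
def run_smoke_suite(checks):
    """Run a small, cursory set of checks; any failure means the build is
    not stable enough to enter the active test phase."""
    failures = []
    for name, check in checks:
        try:
            if not check():
                failures.append(name)
        except Exception:
            failures.append(name)
    return failures

# Hypothetical cursory checks touching major parts of a system.
SMOKE_CHECKS = [
    ("configuration loads", lambda: isinstance({"db": "up"}, dict)),
    ("core arithmetic sane", lambda: 2 + 2 == 4),
]
```

An empty failure list admits the build into the current test phase; any entry blocks it, which is why smoke suites are kept fast and automated.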
| 953 | Stress Testing | |
| 954 | A performance test implemented and executed to understand how a system fails due to conditions at the boundary of, or outside of, the expected tolerances. This failure typically involves low resources or competition for resources. Low-resource conditions reveal failures of the target-of-test that are not apparent under normal conditions. Other defects might result from competition for shared resources (e.g., database locks or network bandwidth), although some of these tests are usually addressed under functional and load testing. Stress testing verifies the acceptability of the system's performance behavior when abnormal or extreme conditions are encountered (e.g., diminished resources or an extremely high number of users). | |
| 955 | System Testing | |
| 956 | System testing is the testing of all parts of an integrated system, including interfaces to external systems. Both functional and structural types of testing are performed to verify that the system performance, operation, and functionality are sound. End-to-end testing with all interfacing systems is the most complete form. | |
| 957 | Usability Testing | |
| 958 | Usability testing identifies problems in the ease-of-use and ease-of-learning of a product. Usability tests may focus upon, but are not limited to: human factors, aesthetics, consistency in the user interface, online and context-sensitive help, wizards and agents, and user documentation. | |
| 959 | User Functionality Test | |
| 960 | User functionality testing, or User Acceptance Testing (UAT), is a type of acceptance test that involves end users testing the functionality of the application using test data in a controlled test environment. | |
| 961 | User Interface Testing | |
| 962 | User-interface (UI) testing exercises the user interfaces to ensure that the interfaces follow accepted standards and meet requirements. User-interface testing is often referred to as GUI testing. UI testing provides tools and services for driving the user interface of an application from a test. | |
| 963 | ||
| 964 | ||
| 965 | ||
| 966 | Template Revision History | |
| 967 | Date | |
| 968 | Version | |
| 969 | Description | |
| 970 | Author | |
| 971 | November 2015 | |
| 972 | 1.18 | |
| 973 | Expanded Section 4.3 to better describe responsibilities for 508 compliance. | |
| 974 | Channing Jonker | |
| 975 | October 2015 | |
| 976 | 1.17 | |
| 977 | Corrected broken link to 508 URL. | |
| 978 | Channing Jonker | |
| 979 | June 2015 | |
| 980 | 1.16 | |
| 981 | Updated metadata to show record retention information as required by PMAS, VHA Release Management, Enterprise Operations, and VistA Intake Program | |
| 982 | Process Management | |
| 983 | May 2015 | |
| 984 | 1.15 | |
| 985 | Reordered cover sheet to enhance SharePoint search results | |
| 986 | Process Management | |
| 987 | March 2015 | |
| 988 | 1.14 | |
| 989 | Miscellaneous updates, including the addition of Performance testing. | |
| 990 | Channing Jonker | |
| 991 | November 2014 | |
| 992 | 1.13 | |
| 993 | Updated to latest Section 508 conformance guidelines and remediated with Common Look Office Tool | |
| 994 | Process Management | |
| 995 | August 2014 | |
| 996 | 1.12 | |
| 997 | Removed requirements for ESE Approval Signature | |
| 998 | Process Management | |
| 999 | October 2013 | |
| 1000 | 1.11 | |
| 1001 | Converted to Microsoft Office 2007-2010 format | |
| 1002 | Process Management | |
| 1003 | July 09, 2012 | |
| 1004 | 1.10 | |
| 1005 | Added System Design Document to Section 1.2 - Test Objectives as an example | |
| 1006 | Process Management | |
| 1007 | January 03, 2012 | |
| 1008 | 1.9 | |
| 1009 | Updated Approval Signatures for Master Test Plan in Appendix A | |
| 1010 | Process Management | |
| 1011 | October 13, 2011 | |
| 1012 | 1.8 | |
| 1013 | Replaced references to Test and Certification with Independent Test and Evaluation. Replaced references to Certification and Accreditation with Assessment and Authorization. | |
| 1014 | Process Management | |
| 1015 | October 4, 2011 | |
| 1016 | 1.7 | |
| 1017 | Repaired link to Privacy Impact Assessment | |
| 1018 | Process Management | |
| 1019 | August 23, 2011 | |
| 1020 | 1.6 | |
| 1021 | Changed Operational Readiness Testing (ORT) to Operational Readiness Review (ORR) | |
| 1022 | Process Management | |
| 1023 | April 12, 2011 | |
| 1024 | 1.5 | |
| 1025 | Updated the Signatory Authorities in Appendix A in light of organizational changes | |
| 1026 | Process Management | |
| 1027 | February 2011 | |
| 1028 | 1.4 | |
| 1029 | Removed Testing Service Testing and Operational Readiness Testing; added Enterprise System Engineering Testing. | |
| 1030 | Changed Initial Operating Capability Testing to Initial Operating Capability Evaluation | |
| 1031 | Process Management | |
| 1032 | January 2011 | |
| 1033 | 1.3 | |
| 1034 | Repaired broken link in section 1.4 | |
| 1035 | Process Management Service | |
| 1036 | August 2010 | |
| 1037 | 1.2 | |
| 1038 | Removed OED from template | |
| 1039 | Process Management Service | |
| 1040 | December 2009 | |
| 1041 | 1.1 | |
| 1042 | Removed “This Page Intentionally Left Blank” pages. | |
| 1043 | OED Process Management Service | |
| 1044 | July 2009 | |
| 1045 | 1.0 | |
| 1046 | Initial ProPath release | |
| 1047 | OED Process Management Service | |