Produced by Araxis Merge on 10/18/2017 11:37:33 AM Eastern Daylight Time. See www.araxis.com for information about Merge. This report uses XHTML and CSS2, and is best viewed with a modern standards-compliant browser. For optimum results when printing this report, use landscape orientation and enable printing of background images and colours in your browser.
| # | Location | File | Last Modified |
|---|---|---|---|
| 1 | OSCIF_CPRS v32 Phase 2 Build 2OR3.0405_August_2017.zip | CPRS v32 Master Test Plan v1.29 (July 2017).docx | Thu Aug 17 21:51:46 2017 UTC |
| 2 | OSCIF_CPRS v32 Phase 2 Build 2OR3.0405_August_2017.zip | CPRS v32 Master Test Plan v1.29 (July 2017).docx | Tue Oct 17 20:14:00 2017 UTC |
| Description | Between Files 1 and 2 | |
|---|---|---|
| | Text Blocks | Lines |
| Unchanged | 37 | 1740 |
| Changed | 36 | 78 |
| Inserted | 0 | 0 |
| Removed | 0 | 0 |
| Whitespace | |
|---|---|
| Character case | Differences in character case are significant |
| Line endings | Differences in line endings (CR and LF characters) are ignored |
| CR/LF characters | Not shown in the comparison detail |
No regular expressions were active.
| 1 | Computerized Patient Record System (CPRS) v32 | |
| 2 | Master Test Plan | |
| 3 | Version 1.29 | |
| 4 | ||
| 5 | July 2017 | |
| 6 | Department of Veterans Affairs | |
| 7 | Revision History | |
| 8 | Date | |
| 9 | Version | |
| 10 | Description | |
| 11 | Author | |
| 12 | 10/06/2014 | |
| 13 | 1.0 | |
| 14 | Creation | |
| 15 | PII | |
| 16 | 11/25/2014 | |
| 17 | 1.1 | |
| 18 | Placed Roles in Responsible Party, Updated Formatting. | |
| 19 | PII | |
| 20 | 12/24/2014 | |
| 21 | 1.2 | |
| 22 | Monthly Update | |
| 23 | PII | |
| 24 | 01/29/2015 | |
| 25 | 1.3 | |
| 26 | Updated IOC testing to reflect joint testing with Audiocare. Updated Roles and Responsibilities. | |
| 27 | PII | |
| 28 | 3/3/2015 | |
| 29 | 1.4 | |
| 30 | Updated with Test Environment | |
| 31 | PII | |
| 32 | 4/3/2015 | |
| 33 | 1.5 | |
| 34 | Updated Roles/Responsibilities | |
| 35 | PII | |
| 36 | 5/3/2015 | |
| 37 | 1.6 | |
| 38 | Updated Training | |
| 39 | PII | |
| 40 | 6/3/2015 | |
| 41 | 1.7 | |
| 42 | Monthly Update/Corrected date formats in revision. | |
| 43 | PII | |
| 44 | 7/1/2015 | |
| 45 | 1.8 | |
| 46 | Monthly Update | |
| 47 | PII | |
| 48 | 7/31/2015 | |
| 49 | 1.9 | |
| 50 | Changed Test Deliverables Responsibilities to specify HP SQA Analysts where necessary. | |
| 51 | Changed Test Deliverables Test Environment Responsibility to John Service. | |
| 52 | PII | |
| 53 | 8/31/2015 | |
| 54 | 1.10 | |
| 55 | Monthly Update, Updated Responsibilities; Updated Training Needs | |
| 56 | PII, PII | |
| 57 | 9/30/2015 | |
| 58 | 1.11 | |
| 59 | Monthly Update, Updated Test Team and Test Analysts | |
| 60 | PII | |
| 61 | 10/30/2015 | |
| 62 | 1.12 | |
| 63 | Monthly Update | |
| 64 | PII | |
| 65 | 11/30/2015 | |
| 66 | 1.13 | |
| 67 | Monthly Update, Updated Staffing | |
| 68 | PII, PII | |
| 69 | 12/30/2015 | |
| 70 | 1.14 | |
| 71 | Monthly Updates | |
| 72 | PII | |
| 73 | 01/31/2016 | |
| 74 | 1.15 | |
| 75 | Monthly Updates; convert to most recent template | |
| 76 | PII | |
| 77 | PII | |
| 78 | PII | |
| 79 | 03/02/2016 | |
| 80 | 1.16 | |
| 81 | Monthly Updates | |
| 82 | PII | |
| 83 | 04/04/2016 | |
| 84 | 1.17 | |
| 85 | Monthly Updates; Updated for 508 Compliance | |
| 86 | PII | |
| 87 | PII | |
| 88 | 04/27/2016 | |
| 89 | 1.18 | |
| 90 | Monthly Updates | |
| 91 | PII | |
| 92 | 05/31/2016 | |
| 93 | 1.19 | |
| 94 | Updated Staffing | |
| 95 | PII | |
| 96 | 6/30/2016 | |
| 97 | 1.20 | |
| 98 | Monthly Updates | |
| 99 | PII | |
| 100 | 7/29/2016 | |
| 101 | 1.21 | |
| 102 | Monthly Updates | |
| 103 | PII | |
| 104 | 8/31/2016 | |
| 105 | 1.22 | |
| 106 | Monthly Updates | |
| 107 | PII | |
| 108 | 1/3/2017 | |
| 109 | 1.23 | |
| 110 | Monthly Updates | |
| 111 | PII | |
| 112 | 2/28/2017 | |
| 113 | 1.24 | |
| 114 | Monthly Updates | |
| 115 | PII, PII | |
| 116 | 3/31/2017 | |
| 117 | 1.25 | |
| 118 | Monthly Updates | |
| 119 | PII | |
| 120 | 4/28/2017 | |
| 121 | 1.26 | |
| 122 | Monthly Updates | |
| 123 | PII | |
| 124 | 5/30/2017 | |
| 125 | 1.27 | |
| 126 | Monthly Updates | |
| 127 | PII | |
| 128 | 6/29/2017 | |
| 129 | 1.28 | |
| 130 | Monthly Updates | |
| 131 | ||
| 132 | 7/31/2017 | |
| 133 | 1.29 | |
| 134 | Monthly Updates | |
| 135 | PII | |
| 136 | ||
| 137 | ||
| 138 | ||
| 139 | Table of Contents | |
| 140 | 1. Introduction ... 1 | |
| 141 | 1.1. Purpose ... 1 | |
| 142 | 1.2. Test Objectives ... 1 | |
| 143 | 1.3. Roles and Responsibilities ... 1 | |
| 144 | 1.4. Processes and References ... 3 | |
| 145 | 2. Items To Be Tested ... 3 | |
| 146 | 2.1. Overview of Test Inclusions ... 3 | |
| 147 | 2.2. Overview of Test Exclusions ... 4 | |
| 148 | 3. Test Approach ... 4 | |
| 149 | 3.1. Product Component Test ... 4 | |
| 150 | 3.2. Component Integration Test ... 4 | |
| 151 | 3.3. System Tests ... 5 | |
| 152 | 3.4. User Functionality Test ... 5 | |
| 153 | 3.5. Enterprise System Engineering Testing ... 6 | |
| 154 | 3.6. Initial Operating Capability Evaluation ... 6 | |
| 155 | 4. Testing Techniques ... 6 | |
| 156 | 4.1. Risk-based Testing ... 6 | |
| 157 | 4.2. Enterprise Testing ... 6 | |
| 158 | 4.2.1. Security Testing ... 7 | |
| 159 | 4.2.2. Privacy Testing ... 7 | |
| 160 | 4.2.3. Section 508 Compliance Testing ... 7 | |
| 161 | 4.2.4. Multi-Divisional Testing ... 7 | |
| 162 | 4.3. Performance and Capacity Testing ... 7 | |
| 163 | 4.4. Test Types ... 8 | |
| 164 | 4.5. Productivity and Support Tools ... 10 | |
| 165 | 5. Test Criteria ... 10 | |
| 166 | 5.1. Process Reviews ... 10 | |
| 167 | 5.2. Pass/Fail Criteria ... 11 | |
| 168 | 5.3. Suspension and Resumption Criteria ... 11 | |
| 169 | 6. Test Deliverables ... 11 | |
| 170 | 7. Test Schedule ... 13 | |
| 171 | 8. Test Environments ... 13 | |
| 172 | 8.1. Test Environment Configurations ... 13 | |
| 173 | 8.2. Base System Hardware ... 14 | |
| 174 | 8.3. Base Software Elements in the Test Environments ... 15 | |
| 175 | 9. Staffing and Training Needs ... 15 | |
| 176 | 10. Risks and Constraints ... 16 | |
| 177 | 11. Test Metrics ... 16 | |
| 178 | Attachment A – Approval Signatures ... 17 | |
| 179 | Appendix A - Test Type Definitions ... 18 | |
| 180 | ||
| 181 | Introduction | |
| 182 | Purpose | |
| 183 | The purpose of this Master Test Plan for the Computerized Patient Record System v32 Development Project is to document the overall approach to validate and verify the functionality delivered in version 32 of the Computerized Patient Record System (CPRS) Graphical User Interface (GUI). CPRS v32 encompasses both new functionality and enhancements to existing functionality. In addition to modifying CPRS, this project/plan encompasses modifications to Text Integration Utility, Inpatient Medications, Outpatient Pharmacy, Pharmacy Data Management, Barcode Medication Administration, Adverse Reaction Tracking, Clinical Reminders, and Laboratory. | |
| 184 | Test Objectives | |
| 185 | This Master Test Plan supports the following objectives: | |
| 186 | To provide test coverage for 100% of the documented requirements | |
| 187 | To provide coverage for System/Software Design Document elements | |
| 188 | To execute 100% of the test cases during User Functionality Testing | |
| 189 | To create, maintain, and control the test environment | |
| 190 | Roles and Responsibilities | |
| 191 | Table 1 lists the key roles and their responsibilities for this Master Test Plan. | |
| 192 | Table 1: Roles and Descriptions | |
| 193 | Role | |
| 194 | Description | |
| 195 | Team Members | |
| 196 | Development Team | |
| 197 | Persons that build or construct the product/product component. | |
| 198 | Jamie Crumley, Ty Phelps, Andrey Andriyevskiy, Andrea Freeman, Nick Costanzo, Jeff Swesky | |
| 199 | ||
| 200 | [Previous Developers: Robert Lauro, Kim Hovorka, Mike Jenkins] | |
| 201 | ||
| 202 | Development Manager | |
| 203 | Person responsible for assisting with the creation and implementation of the Master Test Plan. | |
| 204 | PII, PII | |
| 205 | Program Manager | |
| 206 | Person that has overall responsibility for the successful planning and execution of a project; person responsible for creating the Master Test Plan in collaboration with the Development Manager. | |
| 207 | Mike Braithwaite, Kenny Condie, Michael Keener, April Scott | |
| 208 | Stakeholders | |
| 209 | Persons that hold a stake in a situation in which they may affect or be affected by the outcome. | |
| 210 | End users - health care providers | |
| 211 | Test Analyst | |
| 212 | Person responsible for ensuring full execution of the test process, to include the verification of technical requirements and the validation of business requirements. | |
| 213 | PII, PII, Rebecca Russell, Susan Scorzato | |
| 214 | Test Lead | |
| 215 | An experienced Test Analyst or member of the Test Team that leads and coordinates activities related to all aspects of testing based on an approved Master Test Plan and schedule. | |
| 216 | PII, PII | |
| 217 | Test Team/Testers | |
| 218 | Persons that execute tests and ensure the test environment will adequately support planned test activities. | |
| 219 | PII, PII, Rebecca Russell, Susan Scorzato | |
| 220 | Test Environment Team | |
| 221 | Persons that establish, maintain, and control test environments. | |
| 222 | John Service | |
| 223 | Processes and References | |
| 224 | The processes that guide the implementation of this Master Test Plan are: | |
| 225 | Test Preparation | |
| 226 | Product Build | |
| 227 | Independent Test and Evaluation | |
| 228 | The references that support the implementation of this Master Test Plan are: | |
| 229 | ProPath | |
| 230 | Section 508 Office Web Page | |
| 231 | Privacy Impact Assessment - Privacy Service | |
| 232 | The references that support the implementation of this Master Test Plan are: | |
| 233 | Business Requirement Document (BRD) Version <#.#>, Date <Month, Year> | |
| 234 | Requirements Specification Document (RSD) Version 1.30, Date – March 2017 | |
| 235 | System Design Document (SDD) Version 1.16, Date – January 2016 | |
| 236 | Requirements Traceability Matrix (RTM) Version 1.30, Date – March 2017 | |
| 237 | Risk Log Version <#.#>, Date <Month, Year> | |
| 238 | Items To Be Tested | |
| 239 | Items to be tested include the following: | |
| 240 | The CPRS GUI | |
| 241 | VistA Patches for the applications being developed | |
| 242 | CPRS through VistA | |
| 243 | Installation Guide | |
| 244 | User Guide | |
| 245 | Interfaces between affected patches | |
| 246 | Overview of Test Inclusions | |
| 247 | The following components, features, and combinations of components and features will be tested: | |
| 248 | The CPRS GUI and installation tools and distribution methods | |
| 249 | VistA Patches and installation tools and distribution methods | |
| 250 | Installation Guide as well as document distribution methods | |
| 251 | User Guide as well as document distribution methods | |
| 252 | Interfaces between affected packages (for example, Inpatient Medications and CPRS, Audiocare) | |
| 253 | Overview of Test Exclusions | |
| 254 | The following components, features, and combinations of components and features will not be tested: | |
| 255 | Applications not included in the CPRS v32 effort, especially applications that do not interact with CPRS. | |
| 256 | Test Approach | |
| 257 | Product Component Test | |
| 258 | The Developer performs Product Component Testing (aka Unit Testing), which includes the internal technical and functional testing of a module/component of code and verifies that the requirements defined in the detailed design specification have been successfully applied to the module/component under test. Steps include: | |
| 259 | Analyze requirements to understand the application functionality and dependencies | |
| 260 | Identify all the routines affected by the module or object | |
| 261 | Specify all the routines that are called from various locations | |
| 262 | Execute tests on prioritized options | |
| 263 | Execute tests with different combinations of options and data. For example, test with minimal data entered and test with maximal data entered | |
| 264 | Perform exploratory testing, i.e., randomly exercise the module, object, and options based upon domain knowledge, past performance, and expertise | |
| 265 | Record the actual test results | |
| 266 | Perform static analysis of module/component source code | |
| 267 | ||
| 268 | If a defect is identified and it is related to the code being developed, that code will continue to be developed until it successfully passes the unit test. | |
| 269 | If the defect is not related to the code being developed, it will be addressed by the following methods: | |
| 270 | If it is a software defect that is within the scope of this project, it will be added to the project backlog. | |
| 271 | If it is a software defect that is outside of the scope of this project, it will be referred to the existing maintenance structure. | |
| 272 | If the software defect is not truly a defect and requires a change in functionality, it will be reported with a suggestion to enter a new service request. | |
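The defect disposition rules above amount to a small decision procedure. The sketch below is a minimal illustration of that routing in Python; the function and flag names are hypothetical and not part of the plan.

```python
def route_defect(in_developed_code: bool, in_scope: bool,
                 is_true_defect: bool = True) -> str:
    """Illustrative routing of a defect found during Product Component
    Testing, following the disposition rules stated in this plan."""
    if in_developed_code:
        # Defect in the code under development: development continues
        # until the module successfully passes its unit test.
        return "continue development until unit test passes"
    if not is_true_defect:
        # Not truly a defect; a change in functionality is required.
        return "report with suggestion to enter a new service request"
    if in_scope:
        # In-scope software defect outside the code being developed.
        return "add to project backlog"
    # Out-of-scope software defect.
    return "refer to existing maintenance structure"
```

Note that later phases (Component Integration and System Tests) route in-development defects to the project backlog instead of continued unit-level development.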
| 273 | Component Integration Test | |
| 274 | The Test Analyst installs the Product Component and performs component integration testing. Product Component Integration testing is performed to expose defects in the interfaces and interactions between integrated components, as well as to verify installation instructions. Component integration testing includes testing of Identity and Access Management Integration Service Pattern changes. The Software Quality Assurance Review Checklist is started during this activity. | |
| 275 | If a defect is identified and it is related to the code being developed, the defect will be added to the project backlog. | |
| 276 | If the defect is not related to the code being developed, it will be addressed by the following methods: | |
| 277 | If it is a software defect that is within the scope of this project, it will be added to the project backlog. | |
| 278 | If it is a software defect that is outside of the scope of this project, it will be referred to the existing maintenance structure. | |
| 279 | If the software defect is not truly a defect and requires a change in functionality, it will be reported with a suggestion to enter a new service request. | |
| 280 | ||
| 281 | System Tests | |
| 282 | The Test Analyst performs System Tests employing a variety of test types (i.e., compliance, regression, access control, interoperability, usability (including 508 compliance), etc.). System Tests exercise all parts of an integrated system, including interfaces to external systems. | |
| 283 | If a defect is identified and it is related to the code being developed, the defect will be added to the project backlog. | |
| 284 | If the defect is not related to the code being developed, it will be addressed by the following methods: | |
| 285 | If it is a software defect that is within the scope of this project, it will be added to the project backlog. | |
| 286 | If it is a software defect that is outside of the scope of this project, it will be referred to the existing maintenance structure. | |
| 287 | If the software defect is not truly a defect and requires a change in functionality, it will be reported with a suggestion to enter a new service request. | |
| 288 | User Functionality Test | |
| 289 | The VA Project Manager ensures the User Functionality Test (UFT) is conducted. UFT is a formal test conducted by the end-users to determine whether a system satisfies its acceptance criteria and to enable the customer to determine whether to accept the system. The purpose of the User Functionality Test is to (1) exercise the functionality of the application using test data in a controlled test environment in order to validate functionality and (2) evaluate the usability of a component or system. Additionally, during User Functionality Testing, Enterprise Shared Service functionality, such as Identity and Access Management, is tested. | |
| 290 | Enterprise System Engineering Testing | |
| 291 | The VA Project Manager reviews all testing intake assessment results, including the Risk Analysis and Testing Scope Report (RATSR) or Testing Intake Analysis (TIA) closure email. The VA Project Manager then incorporates the ESE Enterprise Testing Services (ETS) Independent Testing and/or Systems Quality Assurance Service (SQAS) independent testing services required in the RATSR into project plans and schedules. | |
| 292 | At this point a determination will be made as to whether it is necessary to conduct independent testing, whether SQAS/IV&V Testing or ESE Testing. | |
| 293 | Initial Operating Capability Evaluation | |
| 294 | The Initial Operating Capability (IOC) Implementation Manager coordinates the performance of the IOC evaluation. IOC evaluation (formerly known as field testing) is when a product/system that has been modified/enhanced is placed into a limited number of production (live) environments in order to evaluate the new features and functionality of the product/system, and to ascertain whether the features and functionality perform as expected and do not adversely affect the existing functionality of the product/system. Activities include: | |
| 295 | Distribute the product and product documentation to the Evaluation Sites | |
| 296 | Facilitate the timely installations at the Evaluation Sites | |
| 297 | Conduct formal or bi-weekly Evaluation Site calls | |
| 298 | Track defects identified during Initial Operating Capability Evaluation | |
| 299 | Address issues and questions identified during evaluation | |
| 300 | Obtain Site Concurrence Statements | |
| 301 | CPRS v32 makes modifications that impact the Audiocare application. IOC testing for CPRS v32 will be conducted in tandem with IOC testing for the Audiocare application. | |
| 302 | Testing Techniques | |
| 303 | Risk-based Testing | |
| 304 | The following table will be updated as risks are identified: | |
| 305 | Table 2: Risks and Priorities | |
| 306 | Risk | |
| 307 | Priority | |
| 308 | Test Type/Test Case | |
| 309 | Configuration Management based Integration Testing | |
| 310 | High | |
| 311 | Details TBD; continuous integration testing will be performed to verify that modules integrate appropriately and do not cause adverse interactions with existing or previously developed software. | |
| 312 | Integration of modifications from other software projects such as MOCHA (Medication Order Check Healthcare Application) | |
| 313 | High | |
| 314 | Details TBD; integration testing will be performed regularly and test scripts will be updated to reflect | |
| 315 | Enterprise Testing | |
| 316 | Cite how the project testing covers the enterprise requirements. Enterprise requirements include security, privacy, Section 508 Compliance requirements, and multi-divisional requirements. | |
| 317 | ||
| 318 | Security Testing | |
| 319 | Security Testing will be performed by the testing services group. Basic testing, such as boundary testing, sign-in procedures, etc., will be performed by the Test Analysts. | |
| 320 | Privacy Testing | |
| 321 | Privacy Testing will be performed by the testing services group. CPRS is a provider-facing application, and as such the application makes sensitive information, such as Protected Health Information (PHI) and Personally Identifiable Information (PII), visible to application users. Test Analysts will perform basic testing to validate that privacy guidelines are being followed. | |
| 322 | Section 508 Compliance Testing | |
| 323 | Section 508 Compliance Testing will be performed by the Test Analysts on the team. They will follow guidelines set forth by the program office to validate that the application is Section 508 compliant. Tests include utilizing JAWS (Job Access With Speech) screen reader software to validate that the screen reader and the visual functionality are in alignment. | |
| 324 | Multi-Divisional Testing | |
| 325 | Multi-divisional testing will be conducted during the Initial Operating Capability (IOC) testing phase by an Integrated Test Site. In addition, the test environment will be multi-divisional. | |
| 326 | Performance and Capacity Testing | |
| 327 | TBD | |
| 328 | Develop tests to ensure the application will perform as expected under anticipated user loads, and typical business transactions respond in a timely manner. During test execution, the System Under Test (SUT) is actively monitored for any issues that could affect application performance, and to verify the hardware environment is adequately sized. | |
| 329 | This type of testing covers the requirements specified in the “Performance Specifications” section of the Requirements Specification Document, found in the Requirements process in ProPath. | |
| 330 | Test Types | |
| 331 | Table 2: Test Types | |
| 332 | Test Types | |
| 333 | Party Responsible | |
| 334 | Access control testing | |
| 335 | HP Test Analysts, Developers | |
| 336 | Build verification testing | |
| 337 | HP Test Analysts, Developers | |
| 338 | Business cycle testing | |
| 339 | HP Test Analysts | |
| 340 | Compliance testing | |
| 341 | HP Test Analysts, Developers | |
| 342 | Component integration testing | |
| 343 | HP Test Analysts, Developers | |
| 344 | Configuration testing | |
| 345 | HP Test Analysts | |
| 346 | Data and database integrity testing | |
| 347 | HP Test Analysts, Developers | |
| 348 | Documentation testing | |
| 349 | HP Test Analysts, Developers | |
| 350 | Error analysis testing | |
| 351 | Developers | |
| 352 | Exploratory testing | |
| 353 | HP Test Analysts | |
| 354 | Failover testing | |
| 355 | HP Test Analysts, Developers | |
| 356 | Installation testing | |
| 357 | HP Test Analysts, Developers | |
| 358 | Integration testing | |
| 359 | Developers | |
| 360 | Migration testing | |
| 361 | HP Test Analysts | |
| 362 | Multi-divisional testing | |
| 363 | HP Test Analysts | |
| 364 | Parallel testing | |
| 365 | HP Test Analysts, Developers | |
| 366 | Performance monitoring testing | |
| 367 | TBD | |
| 368 | Performance testing | |
| 369 | TBD | |
| 370 | Performance - Benchmark testing | |
| 371 | TBD | |
| 372 | Performance - Contention testing | |
| 373 | TBD | |
| 374 | Performance - Endurance testing | |
| 375 | TBD | |
| 376 | Performance - Load testing | |
| 377 | TBD | |
| 378 | Performance - Profiling testing | |
| 379 | TBD | |
| 380 | Performance - Spike testing | |
| 381 | TBD | |
| 382 | Performance - Stress testing | |
| 383 | TBD | |
| 384 | Privacy testing | |
| 385 | HP Test Analysts, Developers | |
| 386 | Product component testing | |
| 387 | Developers | |
| 388 | Recovery testing | |
| 389 | TBD | |
| 390 | Regression testing | |
| 391 | HP Test Analysts, Developers | |
| 392 | Risk based testing | |
| 393 | HP Test Analysts, Developers | |
| 394 | Section 508 compliance testing | |
| 395 | HP Test Analysts, Developers | |
| 396 | Security testing | |
| 397 | HP Test Analysts | |
| 398 | Smoke testing | |
| 399 | HP Test Analysts, Developers | |
| 400 | System testing | |
| 401 | HP Test Analysts, Developers | |
| 402 | Usability testing | |
| 403 | HP Test Analysts, Developers | |
| 404 | User Functionality Testing | |
| 405 | HP Test Analysts, Developers | |
| 406 | User interface testing | |
| 407 | HP Test Analysts, Developers | |
| 408 | Productivity and Support Tools | |
| 409 | Add or delete tools as appropriate. | |
| 410 | Table 3 describes the tools that will be employed to support this Master Test Plan. | |
| 411 | Table 3: Tool Category or Types | |
| 412 | Tool Category or Type | |
| 413 | Tool Brand Name | |
| 414 | Vendor or In-house | |
| 415 | Version | |
| 416 | Test Management | |
| 417 | TBD | |
| 418 | ||
| 419 | ||
| 420 | Defect Tracking | |
| 421 | Rational Jazz Tool | |
| 422 | IBM | |
| 423 | ||
| 424 | Test Coverage Monitor or Profiler | |
| 425 | TBD | |
| 426 | ||
| 427 | ||
| 428 | Project Management | |
| 429 | Project | |
| 430 | Microsoft | |
| 431 | ||
| 432 | Performance Testing | |
| 433 | TBD | |
| 434 | ||
| 435 | ||
| 436 | Configuration Management | |
| 437 | Rational Jazz Tool | |
| 438 | IBM | |
| 439 | ||
| 440 | DBMS tools | |
| 441 | Reflection for UNIX and OpenVMS | |
| 442 | Attachmate | |
| 443 | ||
| 444 | Document Repository | |
| 445 | Microsoft SharePoint | |
| 446 | ||
| 447 | ||
| 448 | Shared Drive | |
| 449 | Microsoft | |
| 450 | ||
| 451 | ||
| 452 | Test Criteria | |
| 453 | Process Reviews | |
| 454 | The Master Test Plan undergoes two reviews: | |
| 455 | Peer Review – upon completion of the Master Test Plan | |
| 456 | Formal Review – after the Development Manager approves the Master Test Plan | |
| 457 | ||
| 458 | The Master Test Plan serves as an input or Artifact Used for the Process Quality Gate Review for Product Build, as well as for the Go/No-Go Review (Milestone) for Independent Testing. | |
| 459 | For more information on the reviews associated with testing, see the Product Build, Test Preparation, and Independent Test and Evaluation processes. | |
| 460 | ||
| 461 | Pass/Fail Criteria | |
| 462 | Incidents identified during the execution of this test plan will be evaluated to determine their severity. This impact will be recorded in the severity section of the Jazz defect. | |
| 463 | A High Impact Test Incident is an error or lack of functionality that: | |
| 464 | Jeopardizes patient or personnel safety through corrupt or incorrect data | |
| 465 | Has no workaround to provide similar functionality, and this functionality is required to move to system, integration, or user acceptance | |
| 466 | Adversely affects all users or key user functionality | |
| 467 | A Medium Impact Test Incident is an error or lack of functionality that: | |
| 468 | Has a reasonable workaround to maintain functionality | |
| 469 | Impacts a small group of users, but has a workaround | |
| 470 | Functionality works, but not to requirements, specifications, or standards, and workflow is not hampered | |
| 471 | A Low Impact Test Incident is an error or lack of functionality that may cause operator/user inconvenience and minimally affects operational processing. Examples include: | |
| 472 | Spelling errors | |
| 473 | Minor GUI graphical/formatting errors that do not affect functionality/visibility | |
| 474 | An Enhancement Test Incident is something that would be “nice” to have in the integration piece but was not included in the specifications for this release. | |
| 475 | ||
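The impact criteria above can be read as a classification rule set. The following is a minimal illustrative sketch in Python; the function name and boolean flags are hypothetical simplifications of the plan's wording, not part of the plan itself.

```python
def classify_incident(jeopardizes_safety: bool = False,
                      affects_all_users: bool = False,
                      has_workaround: bool = True,
                      required_for_phase_exit: bool = False,
                      cosmetic_only: bool = False,
                      in_release_specs: bool = True) -> str:
    """Illustrative mapping of the plan's impact criteria to the severity
    label recorded in the severity section of the Jazz defect."""
    if not in_release_specs:
        # "Nice to have" but not in this release's specifications.
        return "Enhancement"
    if (jeopardizes_safety or affects_all_users
            or (not has_workaround and required_for_phase_exit)):
        # Safety risk, all users affected, or no workaround for
        # functionality required to move to the next test phase.
        return "High"
    if cosmetic_only:
        # Spelling or minor GUI formatting issues; minimal impact.
        return "Low"
    # A reasonable workaround exists, or workflow is not hampered.
    return "Medium"
```

A triage like this makes the High/Medium boundary explicit: the presence of a reasonable workaround is what keeps an otherwise blocking incident at Medium.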
| 476 | All High and Medium defects shall be addressed or negotiated prior to release. Any limitation or outstanding test incident shall have an approved contingency process (workaround) in place. | |
| 477 | Suspension and Resumption Criteria | |
| 478 | Testing will cease on a test item when a high impact test incident is logged. Testing will resume when the incident is resolved. | |
| 479 | Testing will cease on the entire release when three high impact test incidents are logged. Testing will resume when the incidents are addressed. | |
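The suspension and resumption rules above are threshold-based. A minimal sketch of that bookkeeping, assuming incidents are tracked by identifier (the class and method names are hypothetical, not part of the plan):

```python
class SuspensionTracker:
    """Illustrative tracker for the plan's suspension/resumption rules:
    one open high impact incident suspends its test item; three open
    high impact incidents suspend the entire release."""

    RELEASE_THRESHOLD = 3  # per the plan: three high impact incidents

    def __init__(self) -> None:
        self.open_high_incidents: set[str] = set()

    def log_high_incident(self, incident_id: str) -> None:
        self.open_high_incidents.add(incident_id)

    def resolve(self, incident_id: str) -> None:
        # Resolving an incident allows its test item to resume.
        self.open_high_incidents.discard(incident_id)

    def item_suspended(self, incident_id: str) -> bool:
        # Testing on a test item ceases while its incident is open.
        return incident_id in self.open_high_incidents

    def release_suspended(self) -> bool:
        # Testing on the entire release ceases at the threshold.
        return len(self.open_high_incidents) >= self.RELEASE_THRESHOLD
```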
| 480 | Acceptance Criteria | |
| 481 | All High and Medium defects shall be addressed or negotiated prior to release. Any limitation or outstanding test incident shall have an approved contingency process (workaround) in place. | |
| 482 | ||
| 483 | Test Deliverables | |
| 484 | The Test Deliverables listed below represent some possible deliverables for a testing project. The Test Deliverables table may be tailored to meet project needs. Delete any listed test deliverable that is not used by the Product Build, Test Management, and Independent Test and Evaluation processes. | |
| 485 | Table 4 lists the test deliverables for the CPRS v32 project. | |
| 486 | Table 4: Test Deliverables | |
| 487 | Test Deliverables | |
| 488 | Responsible Party | |
| 489 | Master Test Plan | |
| 490 | HP SQA Analyst | |
| 491 | Iteration Test Plans (when appropriate) | |
| 492 | HP SQA Analyst | |
| 493 | Test Execution Risks | |
| 494 | VA/HP PM | |
| 495 | Test Schedule | |
| 496 | VA/HP PM | |
| 497 | Test Cases/Test Scripts | |
| 498 | HP SQA Analyst | |
| 499 | Test Data | |
| 500 | HP SQA Analyst | |
| 501 | Test Environment | |
| 502 | John Service | |
| 503 | Test Evaluation Summaries | |
| 504 | HP SQA Analyst | |
| 505 | Traceability Report or Matrix | |
| 506 | HP SQA Analyst | |
| 507 | Master Test Plan | |
| 508 | HP SQA Analyst | |
| 509 | Test Schedule | |
| 510 | List the major testing milestones. When appropriate, reference other workflow documentation or tools, such as the Project Management Plan or Work Breakdown Structure (WBS). Put a minimum amount of process and planning information within the Master Test Plan in order to facilitate ongoing maintenance of the test schedule. | |
| 511 | Table 5: Testing Milestones | |
| 512 | Testing Milestones | |
| 513 | Responsible Party | |
| 514 | Approved Master Test Plan | |
| 515 | HP SQA Analyst | |
| 516 | Approved generic test cases (high level list) | |
| 517 | HP SQA Analyst | |
| 518 | Complete and stable requirements (SRS or CRs) | |
| 519 | HP SQA Analyst | |
| 520 | Creation of Test Environment(s) | |
| 521 | HP SQA Analyst | |
| 522 | Submit and manage request for Testing Services | |
| 523 | HP SQA Analyst | |
| 524 | Test Cases selected for release and entered using MS Excel Spreadsheet on SQA SharePoint | |
| 525 | HP SQA Analyst | |
| 526 | Completion of Patch verification | |
| 527 | HP SQA Analyst | |
| 528 | SQA Testing conducted (execute the selected Test Cases) in Test environment(s) | |
| 529 | HP SQA Analyst | |
| 530 | Remedy Tickets | |
| 531 | HP SQA Analyst | |
| 532 | Defects identified and entered into CQ | |
| 533 | HP SQA Analyst | |
| 534 | Test Environments | |
| 535 | A test environment is an environment containing hardware, instrumentation, simulators, software tools, and other support elements needed to conduct a test. | |
| 536 | Test Environment Configurations | |
| 537 | The parties responsible for configuring and maintaining the test environments are John Service and the Bay Pines Test Lab. | |
| 538 | The test environment will be hosted at the Bay Pines Test Lab, DAYT79. | |
| 539 | ||
| 540 | Base System Hardware | |
| 541 | Table 6 sets forth the system resources for the test effort presented in this Master Test Plan. | |
| 542 | The specific elements of the test system may not be fully understood in early iterations, so this section may be completed over time. The test system should simulate the production environment as closely as possible, scaling down the concurrent access and database size, and so forth, if and where appropriate. Tailor the System Hardware Resources table as required. | |
| 543 | Table 6: System Hardware Resources | |
| 544 | Resource | |
| 545 | Quantity | |
| 546 | Name and Type | |
| 547 | Database Server | |
| 548 | ||
| 549 | ||
| 550 | Network or Subnet | |
| 551 | ||
| 552 | TBD | |
| 553 | Server Name | |
| 554 | ||
| 555 | TBD | |
| 556 | Database Name | |
| 557 | ||
| 558 | TBD | |
| 559 | Client Test PCs | |
| 560 | ||
| 561 | ||
| 562 | Include special configuration requirements | |
| 563 | ||
| 564 | TBD | |
| 565 | Test Repository | |
| 566 | ||
| 567 | ||
| 568 | Network or Subnet | |
| 569 | ||
| 570 | TBD | |
| 571 | Server Name | |
| 572 | ||
| 573 | TBD | |
| 574 | Test Development PCs | |
| 575 | ||
| 576 | TBD | |
| 577 | ||
| 578 | Base Software Elements in the Test Environments | |
| 579 | Add or delete Software Elements as appropriate. If necessary, specify software patches referenced and/or required here. | |
| 580 | Table 7 describes the base software elements that are required in the test environment for this Master Test Plan. | |
| 581 | Table 7: Software Elements | |
| 582 | Software Element Name | |
| 583 | Version | |
| 584 | Type and Other Notes | |
| 585 | Windows | |
| 586 | 7 | |
| 587 | Operating System | |
| 588 | InterSystems Caché | |
| 589 | 2014 | |
| 590 | MUMPS environment | |
| 591 | Delphi | |
| 592 | XE3 | |
| 593 | GUI source code | |
| 594 | Staffing and Training Needs | |
| 595 | Table 8 describes the personnel resources needed to plan, prepare, and execute this Master Test Plan. | |
| 596 | Table 8: Staffing Resources | |
| 597 | Testing Task | |
| 598 | Quantity of Personnel Needed | |
| 599 | Test Process | |
| 600 | Duration/Days | |
| 601 | Create the Master Test Plan | |
| 602 | ||
| 603 | Test Preparation | |
| 604 | xxx days | |
| 605 | Establish the Test Environment | |
| 606 | ||
| 607 | Test Preparation | |
| 608 | xxx days | |
| 609 | Perform System Tests | |
| 610 | ||
| 611 | Product Build | |
| 612 | xxx days | |
| 613 | Etc. | |
| 614 | ||
| 615 | ||
| 616 | ||
| 617 | Identify training options for providing necessary skills and the estimated number of hours necessary to complete the training. | |
| 618 | Table 9 lists the personnel who require training. | |
| 619 | Table 9: Training Needs | |
| 620 | Name | |
| 621 | Training Need | |
| 622 | Training Option | |
| 623 | Estimated Training Hours | |
| 624 | Andrey Andriyevskiy | |
| 625 | IBM Rational Jazz® | |
| 626 | Obtain IBM Rational Jazz® training | |
| 627 | 6 hours | |
| 628 | Christopher Bell | |
| 629 | IBM Rational Jazz® | |
| 630 | Obtain IBM Rational Jazz® training | |
| 631 | 6 hours | |
| 632 | PII | |
| 633 | IBM Rational Jazz® | |
| 634 | Obtain IBM Rational Jazz® training | |
| 635 | 6 hours | |
| 636 | PII | |
| 637 | IBM Rational Jazz® | |
| 638 | Obtain IBM Rational Jazz® training | |
| 639 | 6 hours | |
| 640 | Nicholas Costanzo | |
| 641 | IBM Rational Jazz® | |
| 642 | Obtain IBM Rational Jazz® training | |
| 643 | 6 hours | |
| 644 | Jamie Crumley | |
| 645 | IBM Rational Jazz® | |
| 646 | Obtain IBM Rational Jazz® training | |
| 647 | 6 hours | |
| 648 | Andrea Freeman | |
| 649 | IBM Rational Jazz® | |
| 650 | Obtain IBM Rational Jazz® training | |
| 651 | 6 hours | |
| 652 | Craig O. Hinton | |
| 653 | IBM Rational Jazz® | |
| 654 | Obtain IBM Rational Jazz® training | |
| 655 | 6 hours | |
| 656 | Kim C. Hovorka | |
| 657 | IBM Rational Jazz® | |
| 658 | Obtain IBM Rational Jazz® training | |
| 659 | 6 hours | |
| 660 | Robert Lauro | |
| 661 | IBM Rational Jazz® | |
| 662 | Obtain IBM Rational Jazz® training | |
| 663 | 6 hours | |
| 664 | Joe Niksich | |
| 665 | IBM Rational Jazz® | |
| 666 | Obtain IBM Rational Jazz® training | |
| 667 | 6 hours | |
| 668 | Ty Phelps | |
| 669 | IBM Rational Jazz® | |
| 670 | Obtain IBM Rational Jazz® training | |
| 671 | 6 hours | |
| 672 | Blair Sanders | |
| 673 | IBM Rational Jazz® | |
| 674 | Obtain IBM Rational Jazz® training | |
| 675 | 6 hours | |
| 676 | Susan Scorzato | |
| 677 | IBM Rational Jazz® | |
| 678 | Obtain IBM Rational Jazz® training | |
| 679 | 6 hours | |
| 680 | April Scott | |
| 681 | IBM Rational Jazz® | |
| 682 | Obtain IBM Rational Jazz® training | |
| 683 | 6 hours | |
| 684 | PII | |
| 685 | IBM Rational Jazz® | |
| 686 | Obtain IBM Rational Jazz® training | |
| 687 | 6 hours | |
| 688 | Risks and Constraints | |
| 689 | The risk log was taken into consideration in the development of this test plan. | |
| 690 | The risks identified in this Master Test Plan can be found in the risk log and may be recorded and tracked in an automated tool such as IBM Rational Jazz®. | |
| 691 | ||
| 692 | Test Metrics | |
| 693 | Metrics are a system of parameters or methods for the quantitative and periodic assessment of a process that is to be measured. | |
| 694 | Test metrics may include, but are not limited to: | |
| 695 | Number of test cases (pass/fail) | |
| 696 | Percentage of test cases executed | |
| 697 | Number of requirements and percentage tested | |
| 698 | Percentage of test cases resulting in defect detection | |
| 699 | Number of defects attributed to test case/test script creation | |
| 700 | Percentage of defects identified, listed by cause and severity | |
| 701 | Time to re-test | |
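As an illustration only (not part of the Master Test Plan), several of the metrics listed above can be computed mechanically from per-test-case results. The record fields ("status", "found_defect") and the function name below are hypothetical assumptions, not the project's actual reporting format:

```python
def summarize(results):
    """Compute pass/fail counts, percentage executed, and percentage of
    executed test cases that detected a defect.

    results: list of dicts like
        {"status": "pass" | "fail" | "not_run", "found_defect": bool}
    (illustrative field names, not from the Master Test Plan)
    """
    executed = [r for r in results if r["status"] in ("pass", "fail")]
    passed = sum(1 for r in executed if r["status"] == "pass")
    failed = len(executed) - passed
    pct_executed = 100.0 * len(executed) / len(results) if results else 0.0
    pct_defect = (100.0 * sum(1 for r in executed if r["found_defect"])
                  / len(executed)) if executed else 0.0
    return {
        "passed": passed,
        "failed": failed,
        "pct_executed": round(pct_executed, 1),
        "pct_defect_detecting": round(pct_defect, 1),
    }

cases = [
    {"status": "pass", "found_defect": False},
    {"status": "fail", "found_defect": True},
    {"status": "pass", "found_defect": False},
    {"status": "not_run", "found_defect": False},
]
print(summarize(cases))
```

In practice these figures would come from the tracking tool (e.g., IBM Rational Jazz®) rather than a script; the sketch only shows how the listed percentages relate to the raw pass/fail data.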
| 702 | ||
| 703 | Attachment A – Approval Signatures | |
| 704 | The Master Test Plan documents the project’s overall approach to testing and includes: | |
| 705 | Items to be tested | |
| 706 | Test strategy | |
| 707 | Test criteria | |
| 708 | Test deliverables | |
| 709 | Test schedule | |
| 710 | Test environments | |
| 711 | Staffing and training needs | |
| 712 | Risks and constraints | |
| 713 | Test Metrics | |
| 714 | This section is used to document the approval of the Master Test Plan during the Formal Review. The review should ideally be conducted face to face, where signatures can be obtained ‘live’ during the review; however, the following forms of approval are acceptable: | |
| 715 | Physical signatures obtained face to face or via fax | |
| 716 | Digital signatures tied cryptographically to the signer | |
| 717 | /es/ in the signature block, provided that a separate digitally signed e-mail indicating the signer’s approval is provided and kept with the document. | |
| 718 | NOTE: Delete the entire section above prior to final submission. | |
| 719 | ||
| 720 | REVIEW DATE: <Date> | |
| 721 | ||
| 722 | ||
| 723 | Signed: Date: | |
| 724 | <Program/Project Manager> | |
| 725 | ||
| 726 | ||
| 727 | Signed: Date: | |
| 728 | <Business Sponsor Representative> | |
| 729 | ||
| 730 | ||
| 731 | Signed: Date: | |
| 732 | <Project Team Test Manager> | |
| 733 | Appendix A - Test Type Definitions | |
| 734 | Test Type | |
| 735 | Definition | |
| 736 | Access Control Testing | |
| 737 | A type of testing that attests that the target-of-test data (or systems) are accessible only to those actors for which they are intended, as defined by use cases. Access Control Testing verifies that access to the system is controlled and that unwanted or unauthorized access is prohibited. This test is implemented and executed on various targets-of-test. | |
| 738 | Benchmark Testing | |
| 739 | A type of performance testing that compares the performance of new or unknown functionality to a known reference standard (e.g., existing software or measurements). For example, benchmark testing may compare the performance of current systems with the performance of the Linux/Oracle system. | |
| 740 | Build Verification Testing | |
| 741 | (Prerequisite: Smoke Test) | |
| 742 | A type of testing performed for each new build, comparing the baseline with the actual object properties in the current build. The output from this test indicates which object properties have changed or don’t meet the requirements. Together with the Smoke test, the Build Verification test may be utilized by projects to determine if additional functional testing is appropriate for a given build or if a build is ready for production. | |
| 743 | Business Cycle Testing | |
| 744 | A type of testing that focuses upon activities and transactions performed end to end over time. This test type executes the functionality associated with a period of time (e.g., one week, month, or year). These tests include all daily, weekly, and monthly cycles, and events that are date-sensitive (e.g., end-of-the-month management reports, monthly reports, quarterly reports, and year-end reports). | |
| 745 | Capacity Testing | |
| 746 | Capacity testing occurs when you simulate the number of users in order to stress an application's hardware and/or network infrastructure. Capacity testing is done to determine the capacity (CPU, data storage, LAN, WAN, etc.) of the system and/or network under test. | |
| 747 | Compliance Testing | |
| 748 | A type of testing that verifies that a collection of software and hardware fulfills given specifications. For example, these tests will minimally include: “core specifications for rehosting – ver.1.5-draft 3.doc”, Section 508 of The Rehabilitation Act Amendments of 1998, Race and Ethnicity Test, and VA Directive 6102 Compliance. Other applicable tests are not excluded. | |
| 749 | Component Integration Testing | |
| 750 | Testing performed to expose defects in the interfaces and interaction between integrated components, as well as verifying installation instructions. | |
| 751 | Configuration Testing | |
| 752 | A type of testing concerned with checking the program’s compatibility with as many configurations of hardware and system software as possible. In most production environments, the particular hardware specifications for the client workstations, network connections, and database servers vary. Client workstations may have different software loaded (for example, applications, drivers, and so on) at any one time, and many different combinations may be active using different resources. The goal of the configuration test is finding a hardware combination that should be, but is not, compatible with the program. | |
| 753 | Contention Testing | |
| 754 | A type of performance testing that executes tests that cause the application to fail with regard to actual or simulated concurrency. Contention testing identifies failures associated with locking, deadlock, livelock, starvation, race conditions, priority inversion, data loss, loss of memory, and lack of thread safety in shared software components or data. | |
| 755 | Data and Database Integrity Testing | |
| 756 | A type of testing that verifies that data is being stored by the system in a manner where the data is not compromised by the initial storage, updating, restoration, or retrieval processing. This type of testing is intended to uncover design flaws that may result in data corruption, unauthorized data access, lack of data integrity across multiple tables, and lack of adequate transaction performance. The databases, data files, and the database or data file processes should be tested as a subsystem within the application. | |
| 757 | Documentation Testing | |
| 758 | Documentation testing is a type of testing that should validate the information contained within the software documentation set for the following qualities: compliance with accepted standards and conventions, accuracy, completeness, and usability. The documentation testing should verify that all of the required information is provided in order for the appropriate user to be able to properly install, implement, operate, and maintain the software application. The current VistA documentation set can consist of any of the following manual types: | |
| 759 | Release Notes, Installation Guide, User Manuals, Technical Manual, and Security Guide. | |
| 760 | Error Analysis Testing | |
| 761 | This type of testing verifies that the application checks for input, detects invalid data, and prevents invalid data from being entered into the application. This type of testing also includes the verification of error logs and error messages that are displayed to the user. | |
| 762 | Exploratory Testing | |
| 763 | A technique for testing computer software that requires minimal planning and tolerates limited documentation for the target-of-test in advance of test execution, relying on the skill and knowledge of the tester and feedback from test results to guide the ongoing test effort. Exploratory testing is often conducted in short sessions in which feedback gained from one session is used to dynamically plan subsequent sessions. | |
| 764 | Failover Testing | |
| 765 | A type of testing that ensures an alternate or backup system properly “takes over” (i.e., a backup system functions when the primary system fails). Failover Testing also tests that a system continues to run when the failover occurs, and that the failover happens without any loss of data or transactions. Failover Testing should be combined with Recovery Testing. | |
| 766 | Installation Testing | |
| 767 | A type of testing that verifies that the application or system installs as intended on different hardware and software configurations, and under different conditions (e.g., a new installation, an upgrade, and a complete or custom installation). Installation testing may also measure the ease with which an application or system can be successfully installed, typically measured in terms of the average amount of person-hours required for a trained operator or hardware engineer to perform the installation. Part of this installation test is to perform an uninstall. As a result of this uninstall, the system, application, and database should return to the state prior to the install. | |
| 768 | Integration Testing | |
| 769 | An incremental series of tests of combinations or sub-assemblies of selected components in an overall system. Integration testing is incremental in that successively larger and more complex combinations of components are tested in sequence, proceeding from the unit level (0% integration) to eventually the full system test (100% integration). | |
| 770 | Load Testing | |
| 771 | A performance test that subjects the system to varying workloads in order to measure and evaluate the performance behaviors and abilities of the system to continue to function properly under these different workloads. Load testing determines and ensures that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates the performance characteristics (e.g., response times, transaction rates, and other time-sensitive issues). | |
| 772 | Migration Testing | |
| 773 | A type of testing that follows standard VistA and HealtheVet (HeV)-VistA operating procedures and loads the latest .jar version onto a live copy of VistA and HeV-VistA. The following are examples of the types of tests that can be performed as part of migration testing: | |
| 774 | Data conversion has been completed | |
| 775 | Data tables are successfully created | |
| 776 | Parallel test for confirmation of data integrity | |
| 777 | Review output report, before and after migration, to confirm data integrity | |
| 778 | Run equivalent process, before and after migration | |
| 779 | Multi-Divisional Testing | |
| 780 | A type of testing that ensures that all applications will operate in a multi-division or multi-site environment, maintaining an enterprise perspective while fully supporting local health care delivery. | |
| 781 | Parallel Testing | |
| 782 | The same internal processes are run on the existing system and the new system. The existing system is considered the “gold standard”, unless proven otherwise. The feedback (expected results, defined time limits, data extracts, etc.) from processes on the new system is compared to the existing system. Parallel testing is performed before the new system is put into a production environment. | |
| 783 | Performance Monitoring Testing | |
| 784 | Performance profiling assesses how a system is spending its time and consuming resources. This type of performance testing optimizes the performance of a system by measuring how much time and resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. The goal of performance profiling is to optimize the feature and application performance. | |
| 785 | Performance Testing | |
| 786 | Performance Testing assesses how a system is spending its time and consuming resources. Performance testing optimizes a system by measuring how much time and resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. Performance testing may be further refined by the use of specific types of performance tests, such as benchmark tests, load tests, stress tests, performance monitoring tests, and contention tests. | |
| 787 | Performance – Benchmark Testing | |
| 788 | A type of performance testing that compares the performance of new or unknown functionality to a known reference standard (e.g., existing software or measurements). For example, benchmark testing may compare the performance of current systems with the performance of the Linux/Oracle system. | |
| 789 | Performance – Contention Testing | |
| 790 | A type of performance testing that executes tests that cause the application to fail with regard to actual or simulated concurrency. Contention testing identifies failures associated with locking, deadlock, livelock, starvation, race conditions, priority inversion, data loss, loss of memory, and lack of thread safety in shared software components or data. | |
| 791 | Performance – Endurance Testing | |
| 792 | Endurance testing, also known as Soak testing, is usually done to determine if the system can sustain the continuous expected load. During soak tests, memory utilization is monitored to detect potential leaks. | |
| 793 | Performance – Load Testing | |
| 794 | A performance test that subjects the system to varying workloads in order to measure and evaluate the performance behaviors and abilities of the system to continue to function properly under these different workloads. Load testing determines and ensures that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates the performance characteristics (e.g., response times, transaction rates, and other time-sensitive issues). | |
| 795 | Performance – Profiling Testing | |
| 796 | Performance profiling assesses how a system is spending its time and consuming resources. This type of performance testing optimizes the performance of a system by measuring how much time and resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. The goal of performance profiling is to optimize the feature and application performance. | |
| 797 | Performance – Spike Testing | |
| 798 | A performance test in which an application is tested with sudden increases and decreases in the load. The focus is on system behavior during dramatic changes in load. | |
| 799 | Privacy Testing | |
| 800 | A type of testing that ensures that (1) veteran and employee data are adequately protected and (2) systems and applications comply with the Privacy and Security Rule provisions of the Health Insurance Portability and Accountability Act (HIPAA). | |
| 801 | Product Component Testing | |
| 802 | Product Component Testing (aka Unit Testing) is the internal technical and functional testing of a module/component of code. Product Component Testing verifies that the requirements defined in the detail design specification have been successfully applied to the module/component under test. | |
| 803 | Recovery Testing | |
| 804 | A type of testing that causes an application or system to fail in a controlled environment. Recovery processes are invoked while an application or system is monitored. Recovery testing verifies that application, system, and data recovery is achieved. Recovery Testing should be combined with Failover Testing. | |
| 805 | Regression Test | |
| 806 | A type of testing that validates that existing functionality still performs as expected when new functionality is introduced into the system under test. | |
| 807 | Risk Based Testing | |
| 808 | A type of testing based on a defined list of project risks. It is designed to explore and/or uncover potential system failures by using the list of risks to select and prioritize testing. | |
| 809 | Section 508 Compliance Testing | |
| 810 | A type of test that (1) ensures that persons with disabilities have access to and are able to interact with graphical user interfaces and (2) verifies that the application or system meets the specified Section 508 Compliance standards. | |
| 811 | Security Testing | |
| 812 | A type of test that validates the security requirements and ensures readiness for the independent testing performed by the Security Assessment Team as used by the Assessment and Authorization Process. | |
| 813 | Smoke Test | |
| 814 | A type of testing that ensures that an application or system is stable enough to enter testing in the currently active test phase. It is usually a subset of the overall set of tests, preferably automated, that touches parts of the system in at least a cursory way. | |
| 815 | Stress Testing | |
| 816 | A performance test implemented and executed to understand how a system fails due to conditions at the boundary of, or outside of, the expected tolerances. This failure typically involves low resources or competition for resources. Low-resource conditions reveal how the target-of-test fails in ways that are not apparent under normal conditions. Other defects might result from competition for shared resources (e.g., database locks or network bandwidth), although some of these tests are usually addressed under functional and load testing. Stress Testing verifies the acceptability of the system’s performance behavior when abnormal or extreme conditions are encountered (e.g., diminished resources or an extremely high number of users). | |
| 817 | System Testing | |
| 818 | System testing is the testing of all parts of an integrated system, including interfaces to external systems. Both functional and structural types of testing are performed to verify that the system’s performance, operation, and functionality are sound. End-to-end testing with all interfacing systems is the ultimate version. | |
| 819 | Usability Testing | |
| 820 | Usability testing identifies problems in the ease-of-use and ease-of-learning of a product. Usability tests may focus upon, but are not limited to: human factors, aesthetics, consistency in the user interface, online and context-sensitive help, wizards and agents, and user documentation. | |
| 821 | User Functionality Test | |
| 822 | User Functionality Test (UAT) is a type of Acceptance Test that involves end-users testing the functionality of the application using test data in a controlled test environment. | |
| 823 | User Interface Testing | |
| 824 | User-interface (UI) testing exercises the user interfaces to ensure that the interfaces follow accepted standards and meet requirements. User-interface testing is often referred to as GUI testing. UI testing provides tools and services for driving the user interface of an application from a test. | |
| 825 | ||
| 826 | ||
| 827 | ||
| 828 | Template Revision History | |
| 829 | Date | |
| 830 | Version | |
| 831 | Description | |
| 832 | Author | |
| 833 | November 2015 | |
| 834 | 1.18 | |
| 835 | Expanded Section 4.3 to better describe responsibilities for 508 compliance. | |
| 836 | Channing Jonker | |
| 837 | October 2015 | |
| 838 | 1.17 | |
| 839 | Corrected broken link to 508 URL. | |
| 840 | Channing Jonker | |
| 841 | June 2015 | |
| 842 | 1.16 | |
| 843 | Updated metadata to show record retention information as required by PMAS, VHA Release Management, Enterprise Operations, and VistA Intake Program | |
| 844 | Process Management | |
| 845 | May 2015 | |
| 846 | 1.15 | |
| 847 | Reordered cover sheet to enhance SharePoint search results | |
| 848 | Process Management | |
| 849 | March 2015 | |
| 850 | 1.14 | |
| 851 | Miscellaneous updates including the addition of Performance testing. | |
| 852 | Channing Jonker | |
| 853 | November 2014 | |
| 854 | 1.13 | |
| 855 | Updated to latest Section 508 conformance guidelines and remediated with Common Look Office Tool | |
| 856 | Process Management | |
| 857 | August 2014 | |
| 858 | 1.12 | |
| 859 | Removed requirements for ESE Approval Signature | |
| 860 | Process Management | |
| 861 | October 2013 | |
| 862 | 1.11 | |
| 863 | Converted to Microsoft Office 2007-2010 format | |
| 864 | Process Management | |
| 865 | July 09, 2012 | |
| 866 | 1.10 | |
| 867 | Added System Design Document to Section 1.2 - Test Objectives as an example | |
| 868 | Process Management | |
| 869 | January 03, 2012 | |
| 870 | 1.9 | |
| 871 | Updated Approval Signatures for Master Test Plan in Appendix A | |
| 872 | Process Management | |
| 873 | October 13, 2011 | |
| 874 | 1.8 | |
| 875 | Replaced references to Test and Certification with Independent Test and Evaluation. Replaced references to Certification and Accreditation with Assessment and Authorization. | |
| 876 | Process Management | |
| 877 | October 4, 2011 | |
| 878 | 1.7 | |
| 879 | Repaired link to Privacy Impact Assessment | |
| 880 | Process Management | |
| 881 | August 23, 2011 | |
| 882 | 1.6 | |
| 883 | Changed Operational Readiness Testing (ORT) to Operational Readiness Review (ORR) | |
| 884 | Process Management | |
| 885 | April 12, 2011 | |
| 886 | 1.5 | |
| 887 | Updated the Signatory Authorities in Appendix A in light of organizational changes | |
| 888 | Process Management | |
| 889 | February 2011 | |
| 890 | 1.4 | |
| 891 | Removed Testing Service Testing and Operational Readiness Testing; added Enterprise System Engineering Testing. | |
| 892 | Changed Initial Operating Capability Testing to Initial Operating Capability Evaluation | |
| 893 | Process Management | |
| 894 | January 2011 | |
| 895 | 1.3 | |
| 896 | Repaired broken link in section 1.4 | |
| 897 | Process Management Service | |
| 898 | August 2010 | |
| 899 | 1.2 | |
| 900 | Removed OED from template | |
| 901 | Process Management Service | |
| 902 | December 2009 | |
| 903 | 1.1 | |
| 904 | Removed “This Page Intentionally Left Blank” pages. | |
| 905 | OED Process Management Service | |
| 906 | July 2009 | |
| 907 | 1.0 | |
| 908 | Initial ProPath release | |
| 909 | OED Process Management Service | |