Produced by Araxis Merge on 5/15/2018 2:43:24 PM Central Daylight Time. See www.araxis.com for information about Merge. This report uses XHTML and CSS2, and is best viewed with a modern standards-compliant browser. For optimum results when printing this report, use landscape orientation and enable printing of background images and colours in your browser.
| # | Location | File | Last Modified |
|---|---|---|---|
| 1 | CPRS_V32_COMBINED_BUILD_V45.zip\CPRSv32Build45_Documentation | CPRS v32 Master Test Plan v1.36 (Feb 2018).doc | Thu May 10 19:56:02 2018 UTC |
| 2 | CPRS_V32_COMBINED_BUILD_V45.zip\CPRSv32Build45_Documentation | CPRS v32 Master Test Plan v1.36 (Feb 2018).doc | Mon May 14 19:20:36 2018 UTC |
Between Files 1 and 2:

| Description | Text Blocks | Lines |
|---|---|---|
| Unchanged | 1 | 780 |
| Changed | 0 | 0 |
| Inserted | 0 | 0 |
| Removed | 0 | 0 |
| Whitespace Setting | Treatment |
|---|---|
| Character case | Differences in character case are significant |
| Line endings | Differences in line endings (CR and LF characters) are ignored |
| CR/LF characters | Not shown in the comparison detail |
No regular expressions were active.
Computerized Patient Record System (CPRS) v32
Master Test Plan
Version 1.32

October 2017
Department of Veterans Affairs
Revision History
| Date | Version | Description | Author |
|---|---|---|---|
| 10/06/2014 | 1.0 | Creation | Rishan Chandarana |
| 11/25/2014 | 1.1 | Placed Roles in Responsible Party, Updated Formatting. | Rishan Chandarana |
| 12/24/2014 | 1.2 | Monthly Update | Rishan Chandarana |
| 01/29/2015 | 1.3 | Updated IOC testing to reflect joint testing with Audiocare. Updated Roles and Responsibilities. | Rishan Chandarana |
| 3/3/2015 | 1.4 | Updated with Test Environment | Rishan Chandarana |
| 4/3/2015 | 1.5 | Updated Roles/Responsibilities | Rishan Chandarana |
| 5/3/2015 | 1.6 | Updated Training | Rishan Chandarana |
| 6/3/2015 | 1.7 | Monthly Update/Corrected date formats in revision. | Rishan Chandarana |
| 7/1/2015 | 1.8 | Monthly Update | Rishan Chandarana |
| 7/31/2015 | 1.9 | Changed Test Deliverables Responsibilities to specify HP SQA Analysts where necessary. Changed Test Deliverables Test Environment Responsibility to John Service. | Brian Watt |
| 8/31/2015 | 1.10 | Monthly Update, Updated Responsibilities; Updated Training Needs | Rishan Chandarana, Brian Watt |
| 9/30/2015 | 1.11 | Monthly Update, Updated Test Team and Test Analysts | Brian Watt |
| 10/30/2015 | 1.12 | Monthly Update | Brian Watt |
| 11/30/2015 | 1.13 | Monthly Update, Updated Staffing | Brian Watt, Rishan Chandarana |
| 12/30/2015 | 1.14 | Monthly Updates | Brian Watt |
| 01/31/2016 | 1.15 | Monthly Updates; convert to most recent template | Rishan Chandarana, Brian Watt, Juico Bowley |
| 03/02/2016 | 1.16 | Monthly Updates | Rishan Chandarana |
| 04/04/2016 | 1.17 | Monthly Updates; Updated for 508 Compliance | Brian Watt, Rishan Chandarana |
| 04/27/2016 | 1.18 | Monthly Updates | Brian Watt |
| 05/31/2016 | 1.19 | Updated Staffing | Rishan Chandarana |
| 6/30/2016 | 1.20 | Monthly Updates | Brian Watt |
| 7/29/2016 | 1.21 | Monthly Updates | Rishan Chandarana |
| 8/31/2016 | 1.22 | Monthly Updates | Rishan Chandarana |
| 1/3/2017 | 1.23 | Monthly Updates | Brian Watt |
| 2/28/2017 | 1.24 | Monthly Updates | Brian Watt, Craig Hinton |
| 3/31/2017 | 1.25 | Monthly Updates | Brian Watt |
| 4/28/2017 | 1.26 | Monthly Updates | Brian Watt |
| 5/30/17 | 1.27 | Monthly Updates | Brian Watt |
| 6/29/17 | 1.28 | Monthly Updates | |
| 7/31/2017 | 1.29 | Monthly Updates | Brian Watt |
| 8/30/2017 | 1.30 | Monthly Update | Brian Watt |
| 9/29/2017 | 1.31 | Monthly Update | Brian Watt |
| 10/31/2017 | 1.32 | Monthly Update | Brian Watt |

Table of Contents
1. Introduction
1.1. Purpose
1.2. Test Objectives
1.3. Roles and Responsibilities
1.4. Processes and References
2. Items To Be Tested
2.1. Overview of Test Inclusions
2.2. Overview of Test Exclusions
3. Test Approach
3.1. Product Component Test
3.2. Component Integration Test
3.3. System Tests
3.4. User Functionality Test
3.5. Enterprise System Engineering Testing
3.6. Initial Operating Capability Evaluation
4. Testing Techniques
4.1. Risk-based Testing
4.2. Enterprise Testing
4.2.1. Security Testing
4.2.2. Privacy Testing
4.2.3. Section 508 Compliance Testing
4.2.4. Multi-Divisional Testing
4.3. Performance and Capacity Testing
4.4. Test Types
4.5. Productivity and Support Tools
5. Test Criteria
5.1. Process Reviews
5.2. Pass/Fail Criteria
5.3. Suspension and Resumption Criteria
6. Test Deliverables
7. Test Schedule
8. Test Environments
8.1. Test Environment Configurations
8.2. Base System Hardware
8.3. Base Software Elements in the Test Environments
9. Staffing and Training Needs
10. Risks and Constraints
11. Test Metrics
Attachment A – Approval Signatures
Appendix A - Test Type Definitions
Introduction
Purpose
The purpose of this Master Test Plan for the Computerized Patient Record System v32 Development Project is to document the overall approach to validating and verifying the functionality delivered in version 32 of the Computerized Patient Record System (CPRS) Graphical User Interface (GUI). CPRS v32 encompasses both new functionality and enhancements to existing functionality. In addition to modifying CPRS, this project/plan encompasses modifications to Text Integration Utility, Inpatient Medications, Outpatient Pharmacy, Pharmacy Data Management, Barcode Medication Administration, Adverse Reaction Tracking, Clinical Reminders, and Laboratory.
Test Objectives
This Master Test Plan supports the following objectives:
To provide test coverage for 100% of the documented requirements
To provide coverage for System/Software Design Document elements
To execute 100% of the test cases during User Functionality Testing
To create, maintain, and control the test environment
Roles and Responsibilities
Table 1 lists the key roles and their responsibilities for this Master Test Plan.
Table 1: Roles and Descriptions

| Role | Description | Team Members |
|---|---|---|
| Development Team | Persons that build or construct the product/product component. | Jamie Crumley, Ty Phelps, Andrey Andriyevskiy, Andrea Freeman, Nick Costanzo, Jeff Swesky [Previous developers: Robert Lauro, Kim Hovorka, Mike Jenkins] |
| Development Manager | Person responsible for assisting with the creation and implementation of the Master Test Plan. | Craig Hinton, Rishan Chandarana |
| Program Manager | Person that has overall responsibility for the successful planning and execution of a project; person responsible for creating the Master Test Plan in collaboration with the Development Manager. | Mike Braithwaite, Kenny Condie, Michael Keener, April Scott |
| Stakeholders | Persons that hold a stake in a situation in which they may affect or be affected by the outcome. | End users (health care providers) |
| Test Analyst | Person responsible for ensuring full execution of the test process, including the verification of technical requirements and the validation of business requirements. | Brian Watt, Juico Bowley, Rebecca Russell, Susan Scorzato |
| Test Lead | An experienced Test Analyst or member of the Test Team that leads and coordinates activities related to all aspects of testing, based on an approved Master Test Plan and schedule. | Brian Watt, Juico Bowley |
| Test Team/Testers | Persons that execute tests and ensure the test environment will adequately support planned test activities. | Brian Watt, Juico Bowley, Rebecca Russell, Susan Scorzato |
| Test Environment Team | Persons that establish, maintain, and control test environments. | John Service |

Processes and References
The processes that guide the implementation of this Master Test Plan are:
Test Preparation
Product Build
Independent Test and Evaluation
The references that support the implementation of this Master Test Plan are:
ProPath
Section 508 Office Web Page
Privacy Impact Assessment - Privacy Service
The project documents that support the implementation of this Master Test Plan are:
Business Requirement Document (BRD) Version <#.#>, Date <Month, Year>
Requirements Specification Document (RSD) Version 1.30, Date – March 2017
System Design Document (SDD) Version 1.16, Date – January 2016
Requirements Traceability Matrix (RTM) Version 1.30, Date – March 2017
Risk Log Version <#.#>, Date <Month, Year>
Items To Be Tested
Items to be tested include the following:
The CPRS GUI
VistA patches for the applications being developed
CPRS through VistA
Installation Guide
User Guide
Interfaces between affected patches
| 210 | Overview o f Test Inc lusions | |
| 211 | The follow ing compon ents and f eatures an d combinat ions of co mponents a nd feature s will be tested: | |
| 212 | The CPRS G UI and ins tallation tools and distributi on methods . | |
| 213 | VistA Patc hes and in stallation tools and distribut ion method s. | |
| 214 | Installati on Guide a s well as document d istributio n methods. | |
| 215 | User Guide as well a s document distribut ion method s. | |
| 216 | Interfaces between a ffected pa ckages (fo r example, Inpatient Medicatio ns and CPR S, Audioca re). | |
Overview of Test Exclusions
The following components and features, and combinations of components and features, will not be tested:
Applications not included in the CPRS v32 effort, especially applications that do not interact with CPRS
Test Approach
Product Component Test
The Developer performs Product Component Testing (also known as Unit Testing), which includes the internal technical and functional testing of a module/component of code and verifies that the requirements defined in the detailed design specification have been successfully applied to the module/component under test. Steps include:
Analyze requirements to understand the application functionality and dependencies
Identify all the routines affected by the module or object
Specify all the routines that are called from various locations
Execute tests on prioritized options
Execute tests with different combinations of options and data; for example, test with minimal data entered and test with maximal data entered
Perform exploratory testing, i.e., randomly exercise the module, object, and options based upon domain knowledge, past performance, and expertise
Record the actual test results
Perform static analysis of module/component source code
If a defect is identified and it is related to the code being developed, that code will continue to be developed until it successfully passes the unit test.
If the defect is not related to the code being developed, it will be addressed by the following methods:
If it is a software defect that is within the scope of this project, it will be added to the project backlog.
If it is a software defect that is outside the scope of this project, it will be referred to the existing maintenance structure.
If the software defect is not truly a defect and requires a change in functionality, it will be reported with a suggestion to enter a new service request.
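The disposition rules for Product Component Testing above can be summarized as a small decision function. This is an illustrative sketch only; the parameter names and disposition strings are not part of the plan:

```python
def route_defect(related_to_code_under_development, in_scope, is_true_defect):
    """Sketch of the unit-test defect-disposition rules (names illustrative)."""
    if related_to_code_under_development:
        # The code under development is reworked until it passes its unit test.
        return "continue development until the unit test passes"
    if not is_true_defect:
        # A desired behavior change rather than a defect: suggest a service request.
        return "report with a suggestion to enter a new service request"
    if in_scope:
        return "add to the project backlog"
    return "refer to the existing maintenance structure"
```

Note that in the later Component Integration and System Test phases, the first branch instead routes the defect to the project backlog.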
Component Integration Test
The Test Analyst installs the product component and performs component integration testing. Product Component Integration testing is performed to expose defects in the interfaces and interactions between integrated components, as well as to verify installation instructions. Component integration testing includes testing of Identity and Access Management Integration Service Pattern changes. The Software Quality Assurance Review Checklist is started during this activity.
If a defect is identified and it is related to the code being developed, the defect will be added to the project backlog.
If the defect is not related to the code being developed, it will be addressed by the following methods:
If it is a software defect that is within the scope of this project, it will be added to the project backlog.
If it is a software defect that is outside the scope of this project, it will be referred to the existing maintenance structure.
If the software defect is not truly a defect and requires a change in functionality, it will be reported with a suggestion to enter a new service request.
System Tests
The Test Analyst performs System Tests employing a variety of test types (e.g., compliance, regression, access control, interoperability, and usability, including 508 compliance). System Tests exercise all parts of an integrated system, including interfaces to external systems.
If a defect is identified and it is related to the code being developed, the defect will be added to the project backlog.
If the defect is not related to the code being developed, it will be addressed by the following methods:
If it is a software defect that is within the scope of this project, it will be added to the project backlog.
If it is a software defect that is outside the scope of this project, it will be referred to the existing maintenance structure.
If the software defect is not truly a defect and requires a change in functionality, it will be reported with a suggestion to enter a new service request.
User Functionality Test
The VA Project Manager ensures the User Functionality Test (UFT) is conducted. UFT is a formal test conducted by the end users to determine whether a system satisfies its acceptance criteria, enabling the customer to determine whether to accept the system. The purpose of the User Functionality Test is to (1) exercise the functionality of the application using test data in a controlled test environment in order to validate functionality, and (2) evaluate the usability of a component or system. Additionally, during User Functionality Testing, Enterprise Shared Service functionality, such as Identity and Access Management, is tested.
Enterprise System Engineering Testing
The VA Project Manager reviews all testing intake assessment results, including the Risk Analysis and Testing Scope Report (RATSR) or Testing Intake Analysis (TIA) closure email. The VA Project Manager then incorporates the ESE Enterprise Testing Services (ETS) independent testing and/or Systems Quality Assurance Service (SQAS) independent testing services required in the RATSR into project plans and schedules.
At this point a determination will be made whether it is necessary to conduct independent testing, whether SQAS/IV&V Testing or ESE Testing.
Initial Operating Capability Evaluation
The Initial Operating Capability (IOC) Implementation Manager coordinates the performance of the IOC evaluation. IOC evaluation (formerly known as field testing) is when a product/system that has been modified/enhanced is placed into a limited number of production (live) environments in order to evaluate the new features and functionality of the product/system and to ascertain whether the features and functionality perform as expected and do not adversely affect the existing functionality of the product/system. Activities include:
Distribute the product and product documentation to the Evaluation Sites
Facilitate timely installations at the Evaluation Sites
Conduct formal or bi-weekly Evaluation Site calls
Track defects identified during Initial Operating Capability Evaluation
Address issues and questions identified during evaluation
Obtain Site Concurrence Statements
CPRS v32 makes modifications that impact the Audiocare application. IOC testing for CPRS v32 will be conducted in tandem with IOC testing for the Audiocare application.
Testing Techniques
Risk-based Testing
The following table will be updated as risks are identified:
Table 2: Risks and Priorities

| Risk | Priority | Test Type/Test Case |
|---|---|---|
| Configuration Management based Integration Testing | High | Details TBD; continuous integration testing will be performed to verify that modules integrate appropriately and do not cause adverse interactions with existing or previously developed software. |
| Integration of modifications from other software projects, such as MOCHA (Medication Order Check Healthcare Application) | High | Details TBD; integration testing will be performed regularly and test scripts will be updated to reflect |

Enterprise Testing
Cite how the project testing covers the enterprise requirements. Enterprise requirements include security, privacy, Section 508 Compliance requirements, and multi-divisional requirements.
Security Testing
Security Testing will be performed by the testing services group. Basic testing, such as boundary testing and sign-in procedures, will be performed by the Test Analysts.
Privacy Testing
Privacy Testing will be performed by the testing services group. CPRS is a provider-facing application and as such makes sensitive information, such as Protected Health Information (PHI) and Personally Identifiable Information (PII), visible to application users. Test Analysts will perform basic testing to validate that privacy guidelines are being followed.
Section 508 Compliance Testing
Section 508 Compliance Testing will be performed by the Test Analysts on the team. They will follow guidelines set forth by the program office to validate that the application is Section 508 compliant. Tests include utilizing JAWS (Job Access With Speech) screen reader software to validate that the screen reader and the visual functionality are in alignment.
Multi-Divisional Testing
Multi-divisional testing will be conducted during the Initial Operating Capability (IOC) testing phase by an Integrated Test Site. In addition, the test environment will be multi-divisional.
Performance and Capacity Testing
TBD
Develop tests to ensure the application will perform as expected under anticipated user loads and that typical business transactions respond in a timely manner. During test execution, the System Under Test (SUT) is actively monitored for any issues that could affect application performance, and to verify that the hardware environment is adequately sized.
This type of testing covers the requirements specified in the “Performance Specifications” in the Requirements Specification Document, found in the Requirements process in ProPath.
Test Types
Table 2: Test Types

| Test Type | Party Responsible |
|---|---|
| Access control testing | HP Test Analysts, Developers |
| Build verification testing | HP Test Analysts, Developers |
| Business cycle testing | HP Test Analysts |
| Compliance testing | HP Test Analysts, Developers |
| Component integration testing | HP Test Analysts, Developers |
| Configuration testing | HP Test Analysts |
| Data and database integrity testing | HP Test Analysts, Developers |
| Documentation testing | HP Test Analysts, Developers |
| Error analysis testing | Developers |
| Exploratory testing | HP Test Analysts |
| Failover testing | HP Test Analysts, Developers |
| Installation testing | HP Test Analysts, Developers |
| Integration testing | Developers |
| Migration testing | HP Test Analysts |
| Multi-divisional testing | HP Test Analysts |
| Parallel testing | HP Test Analysts, Developers |
| Performance monitoring testing | TBD |
| Performance testing | TBD |
| Performance - Benchmark testing | TBD |
| Performance - Contention testing | TBD |
| Performance - Endurance testing | TBD |
| Performance - Load testing | TBD |
| Performance - Profiling testing | TBD |
| Performance - Spike testing | TBD |
| Performance - Stress testing | TBD |
| Privacy testing | HP Test Analysts, Developers |
| Product component testing | Developers |
| Recovery testing | TBD |
| Regression testing | HP Test Analysts, Developers |
| Risk based testing | HP Test Analysts, Developers |
| Section 508 compliance testing | HP Test Analysts, Developers |
| Security testing | HP Test Analysts |
| Smoke testing | HP Test Analysts, Developers |
| System testing | HP Test Analysts, Developers |
| Usability testing | HP Test Analysts, Developers |
| User Functionality Testing | HP Test Analysts, Developers |
| User interface testing | HP Test Analysts, Developers |

Productivity and Support Tools
Add or delete tools as appropriate.
Table 3 describes the tools that will be employed to support this Master Test Plan.
Table 3: Tool Category or Types

| Tool Category or Type | Tool Brand Name | Vendor or In-house | Version |
|---|---|---|---|
| Test Management | TBD | | |
| Defect Tracking | Rational Jazz Tool | IBM | |
| Test Coverage Monitor or Profiler | TBD | | |
| Project Management | Project | Microsoft | |
| Performance Testing | TBD | | |
| Configuration Management | Rational Jazz Tool | IBM | |
| DBMS tools | Reflection for UNIX and OpenVMS | Attachmate | |
| Document Repository | Microsoft SharePoint; Shared Drive | Microsoft | |

Test Criteria
Process Reviews
The Master Test Plan undergoes two reviews:
Peer Review – upon completion of the Master Test Plan
Formal Review – after the Development Manager approves the Master Test Plan
The Master Test Plan serves as an input or Artifact Used for the Process Quality Gate Review for Product Build, as well as for the Go/No-Go Review (Milestone) for Independent Testing.
For more information on the reviews associated with testing, see the Product Build, Test Preparation, and Independent Test and Evaluation processes.
Pass/Fail Criteria
Incidents identified during the execution of this test plan will be evaluated to determine their severity. This impact will be recorded in the severity section of the Jazz defect.
A High Impact Test Incident is an error or lack of functionality that:
Jeopardizes patient or personnel safety through corrupt or incorrect data
Has no workaround to provide similar functionality, and this functionality is required to move to system, integration, or user acceptance testing
Adversely affects all users or key user functionality
A Medium Impact Test Incident is an error or lack of functionality that:
Has a reasonable workaround to maintain functionality
Impacts a small group of users, but has a workaround
Functionality works, but not to requirements, specifications, or standards, and workflow is not hampered
A Low Impact Test Incident is an error or lack of functionality that may cause operator/user inconvenience and minimally affects operational processing. Examples:
Spelling errors
Minor GUI graphical/formatting errors that do not affect functionality/visibility
An Enhancement Test Incident is something that would be “nice” to have in the integration piece but was not included in the specifications for this release.
All High and Medium defects shall be addressed or negotiated prior to release. Any limitation or outstanding test incident shall have an approved contingency process (workaround) in place.
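As an illustrative sketch of the release gate above (the record fields `severity`, `addressed`, and `workaround_approved` are hypothetical, not taken from the Jazz defect schema):

```python
def ready_for_release(defects):
    """Return (ok, blockers). A High or Medium impact defect blocks release
    unless it has been addressed or negotiated with an approved workaround."""
    blockers = [d for d in defects
                if d["severity"] in ("High", "Medium")
                and not (d["addressed"] or d["workaround_approved"])]
    return (len(blockers) == 0, blockers)
```

Low and Enhancement incidents never block release under this rule; they are tracked but excluded from the gate.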
Suspension and Resumption Criteria
Testing will cease on a test item when a high impact test incident is logged. Testing will resume when the incident is resolved.
Testing will cease on the entire release when three high impact test incidents are logged. Testing will resume when the incidents are addressed.
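The two suspension rules above can be sketched as follows; the per-item incident counts are an assumed input shape, not a tool described in this plan:

```python
def suspension_state(open_high_incidents_by_item):
    """Given a map of test item -> count of unresolved high impact incidents,
    return (release_suspended, suspended_items) per the criteria above."""
    # Any item with at least one open high impact incident is suspended.
    suspended_items = {item for item, count in open_high_incidents_by_item.items()
                       if count >= 1}
    # Three open high impact incidents suspend testing on the entire release.
    release_suspended = sum(open_high_incidents_by_item.values()) >= 3
    return release_suspended, suspended_items
```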
Acceptance Criteria
All High and Medium defects shall be addressed or negotiated prior to release. Any limitation or outstanding test incident shall have an approved contingency process (workaround) in place.
Test Deliverables
The Test Deliverables listed below represent some possible deliverables for a testing project. The Test Deliverables table may be tailored to meet project needs. Delete any listed test deliverable that is not used by the Product Build, Test Management, and Independent Test and Evaluation processes.
Table 4 lists the test deliverables for the CPRS v32 project.
Table 4: Test Deliverables

| Test Deliverables | Responsible Party |
|---|---|
| Master Test Plan | HP SQA Analyst |
| Iteration Test Plans (when appropriate) | HP SQA Analyst |
| Test Execution Risks | VA/HP PM |
| Test Schedule | VA/HP PM |
| Test Cases/Test Scripts | HP SQA Analyst |
| Test Data | HP SQA Analyst |
| Test Environment | John Service |
| Test Evaluation Summaries | HP SQA Analyst |
| Traceability Report or Matrix | HP SQA Analyst |

Test Schedule
List the major testing milestones. When appropriate, reference other workflow documentation or tools, such as the Project Management Plan or Work Breakdown Structure (WBS). Put a minimum amount of process and planning information within the Master Test Plan in order to facilitate ongoing maintenance of the test schedule.
Table 5: Testing Milestones

| Testing Milestones | Responsible Party |
|---|---|
| Approved Master Test Plan | HP SQA Analyst |
| Approved generic test cases (high level list) | HP SQA Analyst |
| Complete and stable requirements (SRS or CRs) | HP SQA Analyst |
| Creation of Test Environment(s) | HP SQA Analyst |
| Submit and manage request for Testing Services | HP SQA Analyst |
| Test Cases selected for release and entered using MS Excel Spreadsheet on SQA SharePoint | HP SQA Analyst |
| Completion of Patch verification | HP SQA Analyst |
| SQA Testing conducted (execute the selected Test Cases) in Test environment(s) | HP SQA Analyst |
| Remedy Tickets | HP SQA Analyst |
| Defects identified and entered into CQ | HP SQA Analyst |

Test Environments
A test environment is an environment containing hardware, instrumentation, simulators, software tools, and other support elements needed to conduct a test.
Test Environment Configurations
The parties responsible for configuring and maintaining the test environments are John Service and the Bay Pines Test Lab.
The test environment will be hosted at the Bay Pines Test Lab, DAYT79.
Base System Hardware
Table 6 sets forth the system resources for the test effort presented in this Master Test Plan.
The specific elements of the test system may not be fully understood in early iterations, so this section may be completed over time. The test system should simulate the production environment as closely as possible, scaling down the concurrent access and database size, and so forth, if and where appropriate. Tailor the System Hardware Resources table as required.
Table 6: System Hardware Resources

| Resource | Quantity | Name and Type |
|---|---|---|
| Database Server | | Network or Subnet: TBD; Server Name: TBD; Database Name: TBD |
| Client Test PCs (include special configuration requirements) | | TBD |
| Test Repository | | Network or Subnet: TBD; Server Name: TBD |
| Test Development PCs | | TBD |

Base Software Elements in the Test Environments
Add or delete Software Elements as appropriate. If necessary, specify software patches referenced and/or required here.
Table 7 describes the base software elements that are required in the test environment for this Master Test Plan.
Table 7: Software Elements

| Software Element Name | Version | Type and Other Notes |
|---|---|---|
| Windows | 7 | Operating System |
| Intersystems Cache | 2014 | MUMPS environment |
| Delphi | XE3 | GUI source code |

Staffing and Training Needs
Table 8 describes the personnel resources needed to plan, prepare, and execute this Master Test Plan.
Table 8: Staffing Resources

| Testing Task | Quantity of Personnel Needed | Test Process | Duration/Days |
|---|---|---|---|
| Create the Master Test Plan | | Test Preparation | xxx days |
| Establish the Test Environment | | Test Preparation | xxx days |
| Perform System Tests | | Product Build | xxx days |
| Etc. | | | |

Identify training options for providing necessary skills and the estimated number of hours necessary to complete the training.
Table 9 lists the personnel that require training.
Table 9: Training Needs

| Name | Training Need | Training Option | Estimated Training Hours |
|---|---|---|---|
| Andrey Andriyevskiy | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Christopher Bell | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Juico Bowley | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Rishan Chandarana | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Nicholas Costanzo | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Jamie Crumley | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Andrea Freeman | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Craig O. Hinton | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Kim C. Hovorka | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Robert Lauro | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Joe Niksich | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Ty Phelps | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Blair Sanders | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Susan Scorzato | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| April Scott | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |
| Brian Watt | IBM Rational Jazz® | Obtain IBM Rational Jazz® training | 6 hours |

Risks and Constraints
The risk log was taken into consideration in the development of this test plan.

The risks identified in this Master Test Plan can be found in the risk log and may be recorded and tracked in an automated tool such as IBM Rational Jazz®.
Test Metrics

Metrics are a system of parameters or methods for the quantitative and periodic assessment of a process that is to be measured.

Test metrics may include, but are not limited to:
- Number of test cases (pass/fail)
- Percentage of test cases executed
- Number of requirements and percentage tested
- Percentage of test cases resulting in defect detection
- Number of defects attributed to test case/test script creation
- Percentage of defects identified, listed by cause and severity
- Time to re-test
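The counts and percentages above are straightforward to derive from raw execution results. The following is a minimal illustrative sketch, not part of this plan; the record structure, field names, and sample data are hypothetical:

```python
# Minimal sketch of computing a few of the test metrics listed above.
# The TestCase structure and the sample data are illustrative only.
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    executed: bool
    passed: bool

def metrics(cases):
    """Return pass/fail counts and execution/defect-detection percentages."""
    total = len(cases)
    executed = [c for c in cases if c.executed]
    passed = sum(1 for c in executed if c.passed)
    failed = len(executed) - passed
    return {
        "pass": passed,
        "fail": failed,
        "pct_executed": 100.0 * len(executed) / total if total else 0.0,
        "pct_defect_detecting": 100.0 * failed / len(executed) if executed else 0.0,
    }

# Hypothetical sample data
cases = [
    TestCase("order entry", executed=True, passed=True),
    TestCase("medication review", executed=True, passed=False),
    TestCase("note signing", executed=False, passed=False),
]
print(metrics(cases))
```

In practice these figures would come from the test management tool (e.g., exported IBM Rational Jazz® results) rather than hand-entered records.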
Attachment A – Approval Signatures

The Master Test Plan documents the project's overall approach to testing and includes:
- Items to be tested
- Test strategy
- Test criteria
- Test deliverables
- Test schedule
- Test environments
- Staffing and training needs
- Risks and constraints
- Test metrics
This section is used to document the approval of the Master Test Plan during the Formal Review. The review should ideally be conducted face to face, where signatures can be obtained "live" during the review; however, the following forms of approval are acceptable:
- Physical signatures obtained face to face or via fax
- Digital signatures tied cryptographically to the signer
- /es/ in the signature block, provided that a separate digitally signed e-mail indicating the signer's approval is provided and kept with the document

NOTE: Delete the entire section above prior to final submission.
REVIEW DATE: <Date>

Signed:                                          Date:
<Program/Project Manager>

Signed:                                          Date:
<Business Sponsor Representative>

Signed:                                          Date:
<Project Team Test Manager>
Appendix A - Test Type Definitions
Access Control Testing: A type of testing that attests that the target-of-test data (or systems) are accessible only to those actors for which they are intended, as defined by use cases. Access Control Testing verifies that access to the system is controlled and that unwanted or unauthorized access is prohibited. This test is implemented and executed on various targets-of-test.

Benchmark Testing: A type of performance testing that compares the performance of new or unknown functionality to a known reference standard (e.g., existing software or measurements). For example, benchmark testing may compare the performance of current systems with the performance of the Linux/Oracle system.

Build Verification Testing (Prerequisite: Smoke Test): A type of testing performed for each new build, comparing the baseline with the actual object properties in the current build. The output from this test indicates which object properties have changed or do not meet the requirements. Together with the Smoke Test, the Build Verification Test may be used by projects to determine whether additional functional testing is appropriate for a given build or whether a build is ready for production.

Business Cycle Testing: A type of testing that focuses on activities and transactions performed end to end over time. This test type exercises the functionality associated with a period of time (e.g., one week, one month, or one year). These tests include all daily, weekly, and monthly cycles, as well as events that are date-sensitive (e.g., end-of-month management reports, monthly reports, quarterly reports, and year-end reports).

Capacity Testing: Capacity testing simulates the number of users in order to stress an application's hardware and/or network infrastructure. Capacity testing is done to determine the capacity (CPU, data storage, LAN, WAN, etc.) of the system and/or network under test.

Compliance Testing: A type of testing that verifies that a collection of software and hardware fulfills given specifications. For example, these tests will minimally include: "core specifications for rehosting – ver.1.5-draft 3.doc", Section 508 of the Rehabilitation Act Amendments of 1998, the Race and Ethnicity Test, and VA Directive 6102 Compliance. This does not exclude any other tests that may also arise.

Component Integration Testing: Testing performed to expose defects in the interfaces and interactions between integrated components, as well as to verify installation instructions.

Configuration Testing: A type of testing concerned with checking the program's compatibility with as many configurations of hardware and system software as possible. In most production environments, the particular hardware specifications for client workstations, network connections, and database servers vary. Client workstations may have different software loaded (for example, applications and drivers), and, at any one time, many different combinations may be active using different resources. The goal of the configuration test is to find a hardware combination that should be, but is not, compatible with the program.

Contention Testing: A type of performance testing that executes tests that cause the application to fail with regard to actual or simulated concurrency. Contention testing identifies failures associated with locking, deadlock, livelock, starvation, race conditions, priority inversion, data loss, loss of memory, and lack of thread safety in shared software components or data.

Data and Database Integrity Testing: A type of testing that verifies that data is stored by the system in a manner in which it is not compromised by initial storage, updating, restoration, or retrieval processing. This type of testing is intended to uncover design flaws that may result in data corruption, unauthorized data access, lack of data integrity across multiple tables, and lack of adequate transaction performance. The databases, data files, and the database or data file processes should be tested as a subsystem within the application.

Documentation Testing: A type of testing that validates the information contained within the software documentation set for the following qualities: compliance with accepted standards and conventions, accuracy, completeness, and usability. Documentation testing should verify that all of the required information is provided so that the appropriate user is able to properly install, implement, operate, and maintain the software application. The current VistA documentation set can consist of any of the following manual types: Release Notes, Installation Guide, User Manuals, Technical Manual, and Security Guide.

Error Analysis Testing: This type of testing verifies that the application checks input, detects invalid data, and prevents invalid data from being entered into the application. It also includes the verification of error logs and error messages that are displayed to the user.

Exploratory Testing: A technique for testing computer software that requires minimal planning and tolerates limited documentation for the target-of-test in advance of test execution, relying on the skill and knowledge of the tester and feedback from test results to guide the ongoing test effort. Exploratory testing is often conducted in short sessions in which feedback gained from one session is used to dynamically plan subsequent sessions.

Failover Testing: A type of testing that ensures an alternate or backup system properly "takes over" (i.e., a backup system functions when the primary system fails). Failover Testing also verifies that the system continues to run when the failover occurs, and that the failover happens without any loss of data or transactions. Failover Testing should be combined with Recovery Testing.

Installation Testing: A type of testing that verifies that the application or system installs as intended on different hardware and software configurations and under different conditions (e.g., a new installation, an upgrade, and a complete or custom installation). Installation testing may also measure the ease with which an application or system can be successfully installed, typically measured in terms of the average number of person-hours required for a trained operator or hardware engineer to perform the installation. Part of this installation test is to perform an uninstall; as a result of the uninstall, the system, application, and database should return to the state prior to the install.

Integration Testing: An incremental series of tests of combinations or sub-assemblies of selected components in an overall system. Integration testing is incremental in that successively larger and more complex combinations of components are tested in sequence, proceeding from the unit level (0% integration) to, eventually, the full system test (100% integration).

Load Testing: A performance test that subjects the system to varying workloads in order to measure and evaluate the performance behaviors and the ability of the system to continue to function properly under these different workloads. Load testing determines and ensures that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates performance characteristics (e.g., response times, transaction rates, and other time-sensitive issues).

Migration Testing: A type of testing that follows standard VistA and HealtheVet (HeV)-VistA operating procedures and loads the latest .jar version onto a live copy of VistA and HeV-VistA. The following are examples of the types of tests that can be performed as part of migration testing:
- Data conversion has been completed
- Data tables are successfully created
- Parallel test for confirmation of data integrity
- Review output report, before and after migration, to confirm data integrity
- Run equivalent process, before and after migration

Multi-Divisional Testing: A type of testing that ensures that all applications will operate in a multi-division or multi-site environment, recognizing an enterprise perspective while fully supporting local health care delivery.

Parallel Testing: The same internal processes are run on the existing system and the new system. The existing system is considered the "gold standard" unless proven otherwise. The feedback (expected results, defined time limits, data extracts, etc.) from processes on the new system is compared to the existing system. Parallel testing is performed before the new system is put into a production environment.

Performance Monitoring Testing: Performance profiling assesses how a system is spending its time and consuming resources. This type of performance testing optimizes the performance of a system by measuring how much time and how many resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. The goal of performance profiling is to optimize feature and application performance.

Performance Testing: Performance testing assesses how a system is spending its time and consuming resources. Performance testing optimizes a system by measuring how much time and how many resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. Performance testing may be further refined by the use of specific types of performance tests, such as the benchmark test, load test, stress test, performance monitoring test, and contention test.
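Benchmark testing of the kind described here compares a candidate implementation against a known reference standard. A minimal sketch using only the Python standard library follows; the two functions compared are hypothetical stand-ins, not part of this plan:

```python
# Illustrative micro-benchmark: time a candidate implementation against
# a known reference, as in benchmark testing. Both functions are
# hypothetical examples only.
import timeit

def build_report_naive(parts):
    # Reference implementation: repeated string concatenation.
    out = ""
    for p in parts:
        out += p
    return out

def build_report_join(parts):
    # Candidate implementation: built-in join.
    return "".join(parts)

parts = ["record\n"] * 1000

# Both implementations must produce identical output before timing them.
assert build_report_naive(parts) == build_report_join(parts)

t_naive = timeit.timeit(lambda: build_report_naive(parts), number=200)
t_join = timeit.timeit(lambda: build_report_join(parts), number=200)
print(f"naive: {t_naive:.4f}s  join: {t_join:.4f}s")
```

Verifying functional equivalence before timing matters: a faster implementation that produces different output is a defect, not an optimization.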
Performance – Benchmark Testing: A type of performance testing that compares the performance of new or unknown functionality to a known reference standard (e.g., existing software or measurements). For example, benchmark testing may compare the performance of current systems with the performance of the Linux/Oracle system.

Performance – Contention Testing: A type of performance testing that executes tests that cause the application to fail with regard to actual or simulated concurrency. Contention testing identifies failures associated with locking, deadlock, livelock, starvation, race conditions, priority inversion, data loss, loss of memory, and lack of thread safety in shared software components or data.

Performance – Endurance Testing: Endurance testing, also known as soak testing, is usually done to determine whether the system can sustain the continuous expected load. During soak tests, memory utilization is monitored to detect potential leaks.

Performance – Load Testing: A performance test that subjects the system to varying workloads in order to measure and evaluate the performance behaviors and the ability of the system to continue to function properly under these different workloads. Load testing determines and ensures that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates performance characteristics (e.g., response times, transaction rates, and other time-sensitive issues).

Performance – Profiling Testing: Performance profiling assesses how a system is spending its time and consuming resources. This type of performance testing optimizes the performance of a system by measuring how much time and how many resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. The goal of performance profiling is to optimize feature and application performance.

Performance – Spike Testing: A performance test in which an application is tested with sudden increments and decrements in the load. The focus is on system behavior during dramatic changes in load.

Privacy Testing: A type of testing that ensures that (1) veteran and employee data are adequately protected and (2) systems and applications comply with the Privacy and Security Rule provisions of the Health Insurance Portability and Accountability Act (HIPAA).

Product Component Testing: Product Component Testing (also known as Unit Testing) is the internal technical and functional testing of a module/component of code. Product Component Testing verifies that the requirements defined in the detail design specification have been successfully applied to the module/component under test.

Recovery Testing: A type of testing that causes an application or system to fail in a controlled environment. Recovery processes are invoked while the application or system is monitored. Recovery testing verifies that application, system, and data recovery is achieved. Recovery Testing should be combined with Failover Testing.

Regression Test: A type of testing that validates that existing functionality still performs as expected when new functionality is introduced into the system under test.

Risk Based Testing: A type of testing based on a defined list of project risks. It is designed to explore and/or uncover potential system failures by using the list of risks to select and prioritize testing.

Section 508 Compliance Testing: A type of test that (1) ensures that persons with disabilities have access to and are able to interact with graphical user interfaces and (2) verifies that the application or system meets the specified Section 508 compliance standards.

Security Testing: A type of test that validates the security requirements and ensures readiness for the independent testing performed by the Security Assessment Team as used by the Assessment and Authorization Process.

Smoke Test: A type of testing that ensures that an application or system is stable enough to enter testing in the currently active test phase. It is usually a subset of the overall set of tests, preferably automated, that touches parts of the system in at least a cursory way.

Stress Testing: A performance test implemented and executed to understand how a system fails due to conditions at the boundary of, or outside of, the expected tolerances. This failure typically involves low resources or competition for resources. Low-resource conditions reveal failure modes of the target-of-test that are not apparent under normal conditions. Other defects might result from competition for shared resources (e.g., database locks or network bandwidth), although some of these tests are usually addressed under functional and load testing. Stress Testing verifies the acceptability of the system's performance behavior when abnormal or extreme conditions are encountered (e.g., diminished resources or an extremely high number of users).

System Testing: System testing is the testing of all parts of an integrated system, including interfaces to external systems. Both functional and structural types of testing are performed to verify that system performance, operation, and functionality are sound. End-to-end testing with all interfacing systems is the ultimate version.

Usability Testing: Usability testing identifies problems in the ease of use and ease of learning of a product. Usability tests may focus on, but are not limited to: human factors, aesthetics, consistency in the user interface, online and context-sensitive help, wizards and agents, and user documentation.

User Functionality Test: The User Functionality Test (UAT) is a type of Acceptance Test that involves end users testing the functionality of the application using test data in a controlled test environment.

User Interface Testing: User-interface (UI) testing exercises the user interfaces to ensure that the interfaces follow accepted standards and meet requirements. User-interface testing is often referred to as GUI testing. UI testing provides tools and services for driving the user interface of an application from a test.

Template Revision History
| Date | Version | Description | Author |
|---|---|---|---|
| November 2015 | 1.18 | Expanded Section 4.3 to better describe responsibilities for 508 compliance. | Channing Jonker |
| October 2015 | 1.17 | Corrected broken link to 508 URL. | Channing Jonker |
| June 2015 | 1.16 | Updated metadata to show record retention information as required by PMAS, VHA Release Management, Enterprise Operations, and the VistA Intake Program | Process Management |
| May 2015 | 1.15 | Reordered cover sheet to enhance SharePoint search results | Process Management |
| March 2015 | 1.14 | Miscellaneous updates, including the addition of Performance testing. | Channing Jonker |
| November 2014 | 1.13 | Updated to latest Section 508 conformance guidelines and remediated with Common Look Office Tool | Process Management |
| August 2014 | 1.12 | Removed requirements for ESE Approval Signature | Process Management |
| October 2013 | 1.11 | Converted to Microsoft Office 2007-2010 format | Process Management |
| July 09, 2012 | 1.10 | Added System Design Document to Section 1.2 - Test Objectives as an example | Process Management |
| January 03, 2012 | 1.9 | Updated Approval Signatures for Master Test Plan in Appendix A | Process Management |
| October 13, 2011 | 1.8 | Replaced references to Test and Certification with Independent Test and Evaluation. Replaced references to Certification and Accreditation with Assessment and Authorization. | Process Management |
| October 4, 2011 | 1.7 | Repaired link to Privacy Impact Assessment | Process Management |
| August 23, 2011 | 1.6 | Changed Operational Readiness Testing (ORT) to Operational Readiness Review (ORR) | Process Management |
| April 12, 2011 | 1.5 | Updated the Signatory Authorities in Appendix A in light of organizational changes | Process Management |
| February 2011 | 1.4 | Removed Testing Service Testing and Operational Readiness Testing; added Enterprise System Engineering Testing. Changed Initial Operating Capability Testing to Initial Operating Capability Evaluation | Process Management |
| January 2011 | 1.3 | Repaired broken link in Section 1.4 | Process Management Service |
| August 2010 | 1.2 | Removed OED from template | Process Management Service |
| December 2009 | 1.1 | Removed "This Page Intentionally Left Blank" pages. | OED Process Management Service |
| July 2009 | 1.0 | Initial ProPath release | OED Process Management Service |