Produced by Araxis Merge on 10/29/2017 8:44:36 PM Eastern Daylight Time. See www.araxis.com for information about Merge. This report uses XHTML and CSS2, and is best viewed with a modern standards-compliant browser. For optimum results when printing this report, use landscape orientation and enable printing of background images and colours in your browser.
| # | Location | File | Last Modified |
|---|---|---|---|
| 1 | MOCHA 2.1b Pre IOC Patch CiF Submission.zip\DOCS | MOCHA 2.1_master_test_plan_August_2017.docx | Thu Oct 26 16:01:42 2017 UTC |
| 2 | MOCHA 2.1b Pre IOC Patch CiF Submission.zip\DOCS | MOCHA 2.1_master_test_plan_August_2017.docx | Mon Oct 30 00:26:51 2017 UTC |
Between Files 1 and 2:

| Description | Text Blocks | Lines |
|---|---|---|
| Unchanged | 9 | 1910 |
| Changed | 7 | 16 |
| Inserted | 1 | 2 |
| Removed | 0 | 0 |
| Comparison option | Setting |
|---|---|
| Whitespace | |
| Character case | Differences in character case are significant |
| Line endings | Differences in line endings (CR and LF characters) are ignored |
| CR/LF characters | Not shown in the comparison detail |
No regular expressions were active.
Pharmacy Reengineering (PRE)

Medication Order Check Healthcare Application (MOCHA 2.1)

Version 2.1

Master Test Plan

August 2017

Department of Veterans Affairs
Revision History

| Date | Version | Description | Author |
|---|---|---|---|
| 8/31/2017 | 2.3 | Updated MTP sections 1.3 and 3.6 | Kanika Sharma |
| 12/1/2016 | 2.2 | Updated the MTP with comments from David Skahn | Kanika Sharma |
| 10/13/2016 | 2.1 | Updated the MTP to distinguish between MOCHA 2.1a and MOCHA 2.1b | Kanika Sharma |
| 7/28/2016 | 2.0 | Addressed comments from reviewers (Heidi Cross and Vickey Elijah) | Kanika Sharma |
| 7/25/2016 | 1.9 | Updated sections 1.3 and 7; added the automation test plan to section 3 and the Component Software test plan to section 3.1 | Kanika Sharma |
| 7/30/2015 | 1.8 | Addressed comments from Vickey Elijah in section 7 | Kanika Sharma |
| 6/30/2015 | 1.7 | Updated sections 1.3 and 7, with Vickey Elijah replacing Arti Iyer | Kanika Sharma |
| 4/30/2015 | 1.6 | Updated sections 1.3, 7, and 8 | Kanika Sharma |
| 11/25/2014 | 1.5 | Addressed comments from Arti and updated section 1.3 | Kanika Sharma |
| 10/27/2014 | 1.4 | Addressed comments from Arti and updated section 1.3 | Kanika Sharma |
| 7/30/2014 | 1.3 | Addressed comments from Arti | Kanika Sharma |
| 6/13/2014 | 1.2 | Addressed comments from Arti | Kanika Sharma |
| 5/28/2014 | 1.1 | Updated the whole document | Kanika Sharma |
| 4/17/2013 | 1.0 | Created initial version | Kanika Sharma |
Table of Contents

1. Introduction
1.1. Purpose
1.2. Test Objectives
1.3. Roles and Responsibilities
1.4. Processes and References
2. Items To Be Tested
2.1. Overview of Test Inclusions
2.2. Overview of Test Exclusions
3. Test Approach
3.1. Product Component Test
3.2. Component Integration Test
3.3. System Tests
3.4. User Functionality Test
3.5. Enterprise System Engineering Testing
3.6. Initial Operating Capability Evaluation
4. Testing Techniques
4.1. Risk-based Testing
4.2. Enterprise Testing
4.2.1. Security Testing
4.2.2. Privacy Testing
4.2.3. Section 508 Compliance Testing
4.2.4. Multi-Divisional Testing
4.3. Test Types
4.4. Productivity and Support Tools
5. Test Criteria
5.1. Process Reviews
5.2. Pass/Fail Criteria
5.3. Suspension and Resumption Criteria
5.4. Acceptance Criteria
6. Test Deliverables
7. Test Schedule
8. Test Environments
8.1. Test Environment Configurations
8.2. Base System Hardware
8.3. Base Software Elements in the Test Environments
9. Staffing and Training Needs
10. Risks and Constraints
11. Test Metrics
Attachment A - Approval Signatures
A. Test Type Definitions
Introduction

The Medication Order Check Healthcare Application (MOCHA 2.1) intends to implement Dose Range Checking with a Max Daily Dose limit for simple medication orders. The changes will be made for the Outpatient Pharmacy (OP), Inpatient Medications (IP), and Pharmacy Data Management (PDM) applications. The development team will work closely with the Computerized Patient Record System (CPRS) team to make sure any corresponding changes in CPRS are also developed in this increment.

The MOCHA 2.1 increment delivers the second of four dosing increments. Dose Range Checking will be implemented using the Max Daily limit in OP, IP, PDM, and CPRS for simple medication orders.

A decision was made by VA management in September 2016 to split MOCHA 2.1 into MOCHA 2.1a and MOCHA 2.1b. The reasons for breaking out the release were:

- To get some of the functionality out to the field more quickly
- To be able to implement the Veteran-focused Integration Process (VIP), with its 3-month delivery of functionality
Below is the functionality that will be released in MOCHA 2.1a and MOCHA 2.1b.

MOCHA 2.1a will provide the following enhancements:

- Add new fields to both the ADMINISTRATION SCHEDULE file (#51.1) and the MEDICATION INSTRUCTION file (#51) to define a frequency for a schedule or medication instruction used within a medication order, for specific dispense drug(s) or for all drugs, in order to perform a Max Daily Dose Order Check.
- Add new fields to both the ADMINISTRATION SCHEDULE file (#51.1) and the MEDICATION INSTRUCTION file (#51) to be able to derive a frequency value to perform a Max Daily Dose Order Check when the name of a schedule or medication instruction has been changed.
- Modify the Standard Schedule Edit [PSS SCHEDULE EDIT] option to allow editing of the new frequency fields.
- Modify the Administration Schedule File Report [PSS SCHEDULE REPORT] option to display data entered in the frequency fields.
- Modify the Medication Instruction File Add/Edit [PSSJU MI] option to allow editing of the new frequency fields.
- Modify the Medication Instruction File Report [PSS MED INSTRUCTION REPORT] option to display data entered in the new frequency fields.
- Modify entries in the DOSE UNITS file (#51.24); see section 4 for details.
- Create a new file called DOSE UNIT CONVERSION (#51.25) to convert one dose unit to another using a conversion factor, so that a comparison can be made between two dose units when they are not equivalent.
- Add two new entries to the APSP INTERVENTION TYPE file (#9009032.3): MAX DAILY DOSE, and MAX SINGLE DOSE & MAX DAILY DOSE.
- Invoke a CPRS Quick Order Notification when a Pharmacy Orderable Item name is edited, so that corresponding changes can be made to the quick order name and the Dosing Order Checks can be performed successfully.
- Enhance the free text dosage logic for dosing ranges for medication orders entered through Pharmacy and the Computerized Patient Record System (CPRS).
- Enhance the free text dosage logic for multi-ingredient product medication orders entered through CPRS.
- Enhance the free text logic to screen out informational data placed in parentheses in the dosage ordered field for an order.
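The DOSE UNIT CONVERSION idea above (normalize two dose amounts to a common unit via a conversion factor, then compare) can be sketched as follows. This is a minimal illustrative sketch, not the actual VistA/MUMPS implementation; the function names and the conversion factors in the table are assumptions, though the factors shown are standard metric equivalences.

```python
# Illustrative stand-in for DOSE UNIT CONVERSION (#51.25) entries:
# each pair maps (from-unit, to-unit) to a conversion factor.
CONVERSION_FACTORS = {
    ("MG", "G"): 0.001,     # 1 mg = 0.001 g
    ("G", "MG"): 1000.0,
    ("MG", "MCG"): 1000.0,
    ("MCG", "MG"): 0.001,
}

def to_unit(amount, unit, target_unit):
    """Convert `amount` expressed in `unit` into `target_unit`."""
    if unit == target_unit:
        return amount
    factor = CONVERSION_FACTORS.get((unit, target_unit))
    if factor is None:
        raise ValueError(f"no conversion from {unit} to {target_unit}")
    return amount * factor

def exceeds_max_daily(ordered_amount, ordered_unit, max_amount, max_unit):
    """Compare an ordered daily dose against a Max Daily Dose limit,
    converting units first when they differ."""
    return to_unit(ordered_amount, ordered_unit, max_unit) > max_amount

# 2500 mg/day checked against a 2 g/day limit exceeds the limit.
print(exceeds_max_daily(2500, "MG", 2, "G"))  # -> True
```

Without such a conversion table, a 2500 mg order and a 2 g limit could not be compared at all, which is exactly the non-equivalent-units case the new file addresses.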
MOCHA 2.1b will provide the following enhancements:

- Implement Dose Range Checking with a Max Daily Dose limit for simple medication orders entered through the Outpatient Pharmacy and Inpatient Medications applications and CPRS.
- Display an error message when the Max Daily Dose Order Check cannot be performed in the CPRS, Outpatient Pharmacy, and Inpatient Medications applications.
- Apply the Daily Dose Check exclusion for schedule to medication orders entered through Outpatient Pharmacy, Inpatient Medications, and CPRS.
- Apply an advisory note to the Max Daily Dose warning and General Dosing Guidelines for medication administered through the eye, ear, or nose.
- Create a customized frequency message.
- Add First Databank (FDB) data elements from the Dosing Order Check call to the VistA side of the interface.
- Display one warning if the Maximum Single Dose and Max Daily Dose Order Check warning texts are identical.
- Exclude expired Outpatient orders from Drug Interaction Order Checks for CPRS.
- Modify the 'Available Dosage(s)' list when a screen break occurs during order entry, and the accompanying dialog during order entry through the Outpatient Pharmacy application.
- Display the most recent Serum Creatinine value and date resulted, if available, on the pharmacy patient demographic header, even if the creatinine clearance (CrCL) cannot be calculated.
- Add body surface area (BSA) and CrCL information to the headers on all Outpatient pharmacy medication order detail screens and all Inpatient and Outpatient pharmacy patient information screens that are currently missing this information.
Purpose

The purpose of this Master Test Plan is to document the overall testing approach for the MOCHA 2.1 project and to validate the MOCHA 2.1 requirements using the guidance specified in VIP.

The Test Plan defines the testing objectives, environments, roles and responsibilities, types, and methodology. The scope of testing will also be defined by identifying the functional components, identifying the areas of high risk, and validating a representative set of data. Testing identifies the high-level business risks of the software systems involved and develops testing based on those risks.

MOCHA 2.1 implements Dose Range Checking with a Max Daily Dose limit for simple medication orders. The changes will be made for the Outpatient Pharmacy, Inpatient Medications, Pharmacy Data Management (PDM), and Computerized Patient Record System (CPRS) applications.
Test Objectives

This Master Test Plan supports the following objectives:

- Create a central artifact to govern the planning and control of the test effort.
- Identify, create, maintain, and control the test environment.
- Provide test coverage for 100% of the documented requirements.
- Identify defects introduced by test patches and resolve them.
- Identify the motivation for and ideas behind the test areas to be covered.
- Outline the testing approach that will be used.
- List the deliverable elements of the test project.
- Identify and document the tools, techniques, standards, and methodologies to be deployed for testing.
- Identify and document the criteria for success and the benchmarks to be used.
Roles and Responsibilities

Table 1 lists the key roles and their responsibilities for this Master Test Plan.

Table 1: Roles and Descriptions

| Role | Description | POC |
|---|---|---|
| Business Analysts | Persons who write the requirements for the defined scope of the project. | Lina Bertuzis |
| Development Team | Persons who build or construct the product/product component. | Ron Ruzbacki, Mai L Vo, Alberto Vargas, Hal Whitley, Chris Flegal, Asli Goncer |
| Development Manager | Person responsible for assisting with the creation and implementation of the Master Test Plan. | Heidi Cross (COR), Scott Solden (VA PM) |
| Project Manager | Person who has overall responsibility for the successful planning and execution of the project. | Heidi Cross, Scott Solden |
| Stakeholders | Persons who hold a stake in a situation in which they may affect or be affected by the outcome. | Amy Colon (PBM group) |
| Test Lead | An experienced Test Analyst or member of the Test team who leads and coordinates activities related to all aspects of testing, based on an approved Master Test Plan and schedule. | Vickey Elijah (PRE VA SQA Lead), Kanika Sharma (SQA Contract PM), Holly Pearson (Test Lead) |
| SQA Analyst / Test Team | Persons who execute tests and ensure the test environment will adequately support planned test activities. | Kanika Sharma, Holly Pearson, Stephen Quinn, Mehdi Balighian, Arti Sharma |
| SQA Automation Lead | Person responsible for automating parts of the MOCHA application. | Farzin Navidi |
| SQA Database Patching | Persons responsible for patching the SQA test accounts with nationally released patches. | Mehdi Balighian, Holly Pearson, Stephen Quinn, Arti Sharma, Vickey Elijah |
| Configuration Manager | Person who establishes, maintains, and controls test environments. | Rene Kaur |
Processes and References

The processes that guide the implementation of this Master Test Plan are:

- Test Preparation
- Product Build
- Independent Test and Evaluation

The references that support the implementation of this Master Test Plan are:

- http:// DNS /process/home.aspx
- http:// DNS /index.asp
- Privacy Impact Assessment - Privacy Service
Items To Be Tested

The requirements that will be tested for MOCHA 2.1 are Dosing (including the Inpatient Medications, Outpatient Pharmacy, Pharmacy Data Management, and Computerized Patient Record System applications), MOCHA Server 3.0, and Order Check History and Event Log; they can be located at the Rational Requirements Manager (RM) link below.

MOCHA 2.1 Requirements To Be Tested

There are some pieces of functionality (Order Check History and Event Log) that may or may not be added as part of the agile process. Time permitting, they will be added to the product backlog and developed/tested.
Overview of Test Inclusions

The following components and features, and combinations of components and features, will be tested:

- MOCHA 2.1 Dose Range Checking with a Max Daily Dose limit for simple medication orders.
- Implement Dose Range Checking with a Max Daily Dose limit for simple medication orders entered through the Outpatient Pharmacy, Inpatient Medications, and CPRS applications.
- Display a generic error message when the Max Daily Dose Order Check cannot be performed in CPRS.
- Display an error message with a detailed reason when the Max Daily Dose Order Check cannot be performed in Pharmacy.
- Correct all range dose errors due to frequency failure.
- Apply the Daily Dose Check exclusion for schedule to medication orders entered through Outpatient Pharmacy, Inpatient Medications, and CPRS.
- Apply a note to the Max Daily Dose warning and General Dosing Guidelines for medication administered through the eye, ear, or nose.
Overview of Test Exclusions

Although this section covers test exclusions, the following components and features, and combinations of components and features, will be tested indirectly. These components are present in our test environments, and as we run the tests defined in the test inclusion section we could come across an issue that needs to be corrected in one of the components listed below:

- COTS Drug Database – will be tested indirectly while conducting system testing.
- Web Services – will be tested indirectly while conducting system testing.
- XML messages – will be tested indirectly while conducting system testing.
Test Approach

The Test Approach section includes the tests that will be implemented for an increment; refer to the detailed test scripts/test cases in Rational Team Concert (RTC) as needed.

MOCHA 2.1 IP test cases can be found at the link below. (The links will need to be copied and pasted into a browser.)

https:// DNS /qm/web/console/PHARM%20%28QM%29#action=com.ibm.rqm.planning.home.actionDispatcher&subAction=viewTestPlan&id=2106

MOCHA 2.1 OP test cases can be found at the link below.

https:// DNS /qm/web/console/PHARM%20%28QM%29#action=com.ibm.rqm.planning.home.actionDispatcher&subAction=viewTestPlan&id=2107

MOCHA 2.1 PDM test cases can be found at the link below.

https:// DNS /qm/web/console/PHARM%20%28QM%29#action=com.ibm.rqm.planning.home.actionDispatcher&subAction=viewTestPlan&id=2108

Once the developers send the build to software quality assurance (SQA) for testing, it will be installed in the required test environments, pointed to a specific MOCHA Server instance, and tested following the test scripts/test cases. The results will be recorded and stored in SharePoint. If any bugs/defects are found, they will be reported to the developers by creating defects in Rational Change and Configuration Management.
Automation Test Plan

The SQA Team recognizes that successful creation, maintenance, and execution of automated test scripts are the result of a flow of interdependent stages within the automation process. These stages comprise analysis, breaking scripts down into distinct executable steps, and identification of data that produce expected results. SQA analysts have been working closely with the VA business analysts to break down each of the existing test scripts into step-by-step executable line items. Expected input values are identified that would trigger a specific system behavior and produce expected results. For automated test scripts to be robust and portable to different SQA accounts and environments, SQA has gone through the time-consuming process of identifying exact data input values for each expected prompt and storing this data in input files that are accessed by the test scripts during test execution. This process makes the test scripts data-driven and account-independent. As a result, test scripts can be run in multiple accounts while only requiring account-specific information to be stored in corresponding data files for each account. This approach saves time and effort by preventing customization and maintenance of test scripts for multiple SQA accounts. The following data-driven files are used during the execution of automated test scripts:

- Automated Test Cases/Scripts: This entails the Automation Test Plan, Automation Test Strategy, and Automation Test Suite details related to each script. The file contains script-specific information such as the script name, script type (smoke test, regression, etc.), requirements cross-reference, and script completion status, along with a detailed description of the test script. A flag can be turned on or off for each script depending on whether or not the script needs to be run for a particular test or account.
- Automated Test Data: Contains test script and account-specific data to allow each script to be run in multiple accounts. Not all VistA accounts include identical drug, patient, or clinical information. By specifying account-specific information in this file, scripts have the flexibility of running in different accounts depending on where the patch is installed.
- Automated Test Environment Configuration Details: This includes the Automation Test Interface Specifications and Automation Test Log. It contains account, environment, and interface-specific data, including the account name, IP address, and user login information needed for logging into the account. Other information can be stored in this file as needed, such as the wait time required between each VistA prompt, since some accounts may have slower response times than others due to their configuration or the amount of data in the account causing prolonged query time.

Following discussions with VA PRE management, Team SMS has initiated creation of a baseline of automated test scripts for the existing MOCHA 2.1 functionality. Even though it is time-consuming, this effort is necessary to establish a baseline that can be used to validate expected MOCHA 2.1 functionality against future enhancements. Once the baseline is established, future automation efforts will be incorporated as part of maintenance of the existing application and in conjunction with new functional requirements and development efforts. The new automated scripts will in turn become the new baseline for any future requirements and validation of new functionality. There are 66 existing MOCHA 2.1 Regression Test Cases that have been identified as candidates for automation. Of the 66 Regression Test Cases, 17 have been fully automated. In addition to the Regression Test Cases, 50 Smoke Test Cases have also been fully automated, for a total of 67 fully automated test scripts.
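The data-driven layout described above (a script registry with a run flag, plus per-account data files) can be sketched as follows. This is a minimal illustrative sketch: the file contents, account names, and field names are assumptions, not the team's actual automation files.

```python
# Stand-in for the Automated Test Cases/Scripts file: each entry
# carries a run flag so a script can be switched on or off per run.
SCRIPTS = [
    {"name": "SMOKE_ORDER_ENTRY", "type": "smoke", "run": True},
    {"name": "REG_MAX_DAILY_DOSE", "type": "regression", "run": False},
]

# Stand-in for the Automated Test Data / Environment Configuration
# files: account-specific values (IP, test patient, prompt wait time)
# live here, so the scripts themselves stay account-independent.
ACCOUNT_DATA = {
    "SQA1": {"ip": "10.0.0.1", "patient": "TESTPATIENT,ONE", "wait": 2},
    "SQA2": {"ip": "10.0.0.2", "patient": "TESTPATIENT,TWO", "wait": 5},
}

def scripts_to_run(account):
    """Pair each enabled script with the data for the chosen account."""
    data = ACCOUNT_DATA[account]
    return [(s["name"], data) for s in SCRIPTS if s["run"]]

for name, data in scripts_to_run("SQA2"):
    print(name, data["ip"], data["wait"])
```

The same registry drives execution in any account; only the `ACCOUNT_DATA` entry changes, which is the account-independence property the section describes.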
Product Component Test

The PRE MOCHA development team will develop and execute unit tests.

SQA is planning on using SoapUI to test the changes to MOCHA Server 3.0 using HTML test scripts. We will issue input commands using HTML inputs based on actual data from VistA and capture the responses from First Databank. These test scripts are run only against the server, so input from and to VistA will not be needed. This isolates the server from any other possible issue caused by an outside system. All scripts shall first be checked to run without error in a test account pointed to MOCHA Server 2.0. Then the same scripts shall be run with the test account pointed to MOCHA Server 3.0; the changes should not create errors during the running of the scripts. These scripts shall be run each time a new full or partial build is delivered for testing.

SQA will run the end-to-end test using VistA, and the developers will have logging turned on in the MOCHA server to capture the data. SQA and the developers will turn the captured data into the HTML input files, which will be used for testing. If a script fails, the failing script and the resulting data from the SoapUI application shall be placed in a defect and sent to the developers. This will allow the developer to run the same data on their system, uncover the code causing the issue, and create a resolution for the issue.
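The replay idea above (run the same captured inputs against the old and new server versions and flag any divergence) can be sketched as follows. This is a hypothetical sketch, not the team's SoapUI setup: `send` is a stand-in for the real HTTP/SoapUI call, and the endpoints here are toy handlers rather than MOCHA Server instances.

```python
def send(endpoint, payload):
    # Stand-in transport: a real version would POST `payload` to the
    # endpoint's URL and return the response body.
    return endpoint["handler"](payload)

def compare_servers(old, new, captured_inputs):
    """Replay every captured input against both servers and collect
    any input whose responses differ (including new errors)."""
    mismatches = []
    for payload in captured_inputs:
        r_old, r_new = send(old, payload), send(new, payload)
        if r_old != r_new:
            mismatches.append((payload, r_old, r_new))
    return mismatches

# Toy endpoints: the "new" server misbehaves on one captured input.
old = {"handler": lambda p: f"check({p})"}
new = {"handler": lambda p: f"check({p})" if p != "bad" else "ERROR"}
print(compare_servers(old, new, ["a", "b", "bad"]))
```

An empty mismatch list is the pass condition the section describes: the 3.0 server handled every captured input the same way the 2.0 server did.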
Component Integration Test

The PRE MOCHA SQA team will conduct integration testing with the PRE Development team. Testing completion will be documented in a spreadsheet for the test version. Anomalies and defects found will be logged in Rational Change and Configuration Management; severity and priority will be discussed on the daily huddle calls. As required by VHA Release Management for IOC exit, a Software Quality Assurance Patch Checklist will be completed for each version of the patch using the automated patch checklist utility created by SMS SQA. The automated checklist review will be generated using Microsoft Access and will be submitted to the developer if any discrepancies are found.
System Tests

During the System Test phase, manual testing will be conducted by the PRE SQA Team to verify that the software requirements have been met and the software is functioning appropriately. VistA, CPRS, and FDB (which is indirectly tested) will be used to verify enhanced order checking functionality with Outpatient Pharmacy, Inpatient Medications, and Pharmacy Data Management (PDM).

The PRE test cases will be developed using the PRE Software Requirements Specification (RSD) document. Test cases will be created, managed, and executed by the MOCHA 2.1 SQA Team. The test cases will be managed in Rational Quality Manager (RQM). MOCHA 2.1 SQA will conduct a smoke test prior to full entry into the System Test cycle.

The CPRS application will be tested indirectly, by allowing the user to enter all necessary orders for a patient in different packages from a single application. All pending orders that appear in the Inpatient Unit Dose and IV modules, as well as the Outpatient Pharmacy prescriptions, are initially entered through the CPRS package and then finished via the Pharmacy backdoor.

SQA will use IBM Rational Change and Configuration Management to track any issues discovered during testing. The same process flows currently used in release management will be employed. Incidents determined to be outside the PRE domain will be logged into Rational Change and Configuration Management for resolution by the appropriate team.
User Functionality Test

The MOCHA 2.1 SQA team will engage test sites to validate this release as part of normal procedures.

User Functionality Test (UAT) is a type of Acceptance Test that involves end users testing the functionality of the application using test data in a controlled test environment.

Technique Objective:

To verify that each requirement and business rule has been implemented as documented.

Technique:

Functional testing of each section of the requirements should be completed before beginning any serious testing of those requirements in combination, as a user would use the software.

Test cases are written for each individual business rule and supplementary specification. These cases are designed to test the acceptable results (positive testing) and to exceed the possible limits (negative testing) of each rule.

Functional tests are typically executed manually. Functional testing may include testing of some integration points, but integration is not the focus of this level of testing.

Functional testing lays the groundwork for System Testing, which will be done later in the process.

Method/Process:

Each test scenario has multiple execution steps and verification points. If any one verification point fails to elicit the expected response, the entire test case is said to have failed.

Each requirement or business rule may have one or many test cases associated with it. If any one test case fails, the requirement is not validated.
Required Tools:

Microsoft Excel and Rational ClearQuest. Full coverage will be demonstrated by traceability to requirements.
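Demonstrating full coverage by traceability amounts to mapping each requirement to its test cases and flagging any requirement with none. A minimal sketch of that check; the requirement IDs and case names are made up for illustration, not taken from the project's traceability matrix:

```python
# Illustrative traceability matrix: requirement -> test cases.
TRACEABILITY = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-201"],
    "REQ-003": [],  # gap: no test case traces to this requirement
}

def uncovered(matrix):
    """Return requirement IDs with no associated test case."""
    return [req for req, cases in matrix.items() if not cases]

def coverage_pct(matrix):
    """Percentage of requirements with at least one test case."""
    covered = sum(1 for cases in matrix.values() if cases)
    return 100.0 * covered / len(matrix)

print(uncovered(TRACEABILITY))   # any gaps block the 100% coverage goal
print(coverage_pct(TRACEABILITY))
```

The 100% coverage objective from section 1.2 is met exactly when `uncovered` returns an empty list.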
Success Criteria:

When all test cases associated with a specific requirement have passed, the requirement will have been validated.

Special Considerations:

Each tester must have adequate knowledge of the Pharmacy application and of the business rules he/she is trying to validate.
Enterprise System Engineering Testing

The MOCHA 2.1 SQA Team will share test scripts and results with the ESE testing team, which will review the test results and provide feedback.

Initial Operating Capability Evaluation

The MOCHA 2.1 SQA Team will engage test sites to validate the MOCHA 2.1 release as part of normal procedures. The test sites will be supported through weekly test site calls.

Subject matter experts from each site run through the test scenarios provided by the SQA team to validate each verification point. Findings are reported to the SQA team for analysis. Upon analysis, tickets are created as necessary to address any defects or issues identified. Fixes for identified issues are then included in subsequent builds that are installed at the test sites for re-testing. This process goes through multiple iterations until the test sites are satisfied with the fixes and no further issues are reported. Once the test sites complete testing and approve the enhancements and all included fixes, they will then move forward with installing the build into their Production accounts.
| 315 | Test Sites for MOCHA 2.1b are a s follows. | |||||
| 316 | Tennessee Valley | |||||
| 317 | Denver | |||||
| 318 | Charleston | |||||
| 319 | West Palm Beach | |||||
| 320 | Kansas Cit y | |||||

Testing Techniques
The testing techniques consist of Risk-Based Testing, Enterprise Testing, Test Types, and Productivity and Support Tools.

PRE SQA will verify System Requirements in a test environment that mirrors the production accounts as closely as possible. During the execution of system testing, Rational ClearQuest will be used to log all incidents found. It will be the responsibility of the PMs or their designees to ensure that all issues are dealt with in a timely manner.
A High Impact Test Incident is an error or lack of functionality that:
Jeopardizes patient or personnel safety through corrupt or incorrect data
Has no workaround to provide similar functionality, and this functionality is required to move to system, integration, or user acceptance
Adversely affects all users or key user functionality
Represents a significant loss of life, money, or time
Is governed by Congressional mandate
Affects a Veterans Integrated Services Network (VISN) with regard to providing consistent and safe healthcare
Is sponsored by the National Program Office
Negatively impacts essential operating or business processing
Shuts the system down such that the product will not operate and cannot be kept “alive”
A Medium Impact Test Incident is an error or lack of functionality that:
Has a reasonable workaround to maintain functionality
Impacts a small group of users, but has a workaround
Provides functionality that works, but not to requirements, specifications, or standards, while workflow is not hampered
A Low Impact Test Incident is an error or lack of functionality that may cause operator/user inconvenience and minimally affects operational processing:
Spelling errors
Minor formatting errors that do not affect functionality/visibility
An Enhancement Test Incident is something that would be “nice” to have in the integration piece but was not included in the specifications.
Risk-based Testing
The greatest risk to the VA community is the loss of connectivity between the CPRS Graphical User Interface (GUI) and the PRE database, which could put numerous patients at risk. Error handling for system outages therefore has the highest testing priority, followed by error handling for drug errors, then order-based errors. Information presented for these errors must be concise, to lead to quick resolution of the service interruption. PRE error handling testing will be conducted at the Drug, Order, and System level to minimize this risk.
In addition, all requirements which are identified as Patient Safety Issues have the highest test priority and are verified first in the test cycle. The regression testing selected for execution is chosen to cover all the functionality areas impacted by the changes.
Regression testing was completed for high-risk functionality as defined by the business owner within iterations of the builds, due to schedule limitations.
Enterprise Testing
The Enterprise Testing Services (ETS) organization plans to review the test results provided by the MOCHA 2.1 SQA team.
Artifacts that will be provided to the Enterprise testing team are:
MOCHA 2.1 Test Evaluation Summary
IP, OP, and PDM Requirements Traceability Matrices (RTM)
Test cases and Test results, if required

Security Testing
VistA MOCHA is a Legacy Application which does not have any built-in Security features in the application. (The users of the Pharmacy application are authenticated by the Security keys of the Local VA Medical Center.) VA currently does not require MUMPS code to be tested with Fortify; however, MOCHA Server code is scanned by the VA developer for security issues, and any Critical or High findings must be resolved prior to product release.

Privacy Testing
Testing of the new functionalities is done in SQA VistA accounts at Bay Pines that only include test data and are de-identified. As a result, patient privacy is not jeopardized.

Section 508 Compliance Testing
It has been determined by the MOCHA 2.1 Project Manager that Section 508 testing will be conducted during MOCHA 2.1 testing.
During product development, the MOCHA 2.1 Development Team will conduct a self-evaluation of the MOCHA 2.1 application in order to detect potential 508 compliance violations.
During functionality testing, the MOCHA 2.1 Development Team and SQA team will work with the Section 508 Office to ensure that compliance testing is properly performed and the application meets the requirements as specified in the “Usability Specifications” section of the MOCHA 2.1 Requirements Specification Document.
The 508 compliance group has several checklists that they will run against the MOCHA 2.1 application to ensure that the application meets the items in the checklist. The checklists are located at:
http:// DNS /SECTION508/Standards_Checklists.asp
The MOCHA 2.1 Project Manager will obtain a Conformance Validation Statement (CVS) from the Section 508 Program Office attesting that compliance testing was performed and validated.

Multi-Divisional Testing
The PRE SQA team will not be able to validate this release for multiple divisions as part of normal release testing. Multi-Divisional testing will be conducted only when testing is done by an integrated site during site testing.
Kansas City is the integrated site for MOCHA 2.1a
Tennessee Valley and Denver are integrated sites for MOCHA 2.1b

Test Types
Table 2: Test Types

Test Type | Party Responsible
Build verification testing | PRE SQA Team
Component integration testing | PRE SQA Team
Documentation testing | PRE SQA Team
Error analysis testing | PRE SQA Team
Exploratory testing | PRE SQA Team
Failover testing | PRE SQA Team
Installation testing | PRE SQA Team
Integration testing | PRE SQA Team
Product component testing | PRE SQA Team
Regression test | PRE SQA Team
Risk based testing | PRE SQA Team
Smoke testing | PRE SQA Team
System testing | PRE SQA Team
Usability testing | Test Sites and End-User workgroup
User Functionality Testing | Test Sites with support from PRE team
Productivity and Support Tools

Table 3 describes the tools that will be employed to support this Master Test Plan.
Table 3: Tool Category or Types
Tool Category or Type | Tool Brand Name | Vendor or In-house | Version
Test Management | Rational Quality Manager | IBM Rational | 4.0.5
Defect Tracking | Rational Team Concert | IBM Rational | 4.0.4
Project Management | Project, Primavera | Microsoft | 5.0
MOCHA Server Testing | SOAP UI (Simple Object Access Protocol) | SmartBear Software | 5.2.1
508 Compliance Testing | Job Access With Speech (JAWS) | Freedom Scientific | 16.0.4468
Automation Testing | Rational Functional Tester (RFT) | IBM Rational | 8.5
Performance Testing | Reflections | |


Test Criteria
The Test Criteria consist of Process Reviews, Pass/Fail Criteria, Suspension and Resumption Criteria, and Acceptance Criteria.
Process Reviews
The Master Test Plan undergoes two reviews:
Peer Review – upon completion of the Master Test Plan
Formal Review – after the Development Manager approves the Master Test Plan

The Master Test Plan serves as an input or Artifact Used for the Process Quality Gate Review for Product Build, as well as for the Go/No-Go Review (Milestone) for Independent Testing.
For more information on the reviews associated with testing, see the Product Build, Test Preparation, and Independent Test and Evaluation processes.

Pass/Fail Criteria
Incidents identified during the execution of this test plan will be evaluated to determine their severity. This impact will be recorded in the severity section of the Rational CM Defect.

Severity 1 – Critical
[IEEE definition: The defect results in the failure of the complete software system, of a subsystem, or of a software unit (program or module) within the system.]
Any defect that compromises patient safety or system security (examples of system security defects include breach of confidentiality requirements of the Privacy Act, HIPAA, or Federal Tax Information guidelines)
Loss of system functionality critical to user operations with no suitable workaround (i.e., there is no way to achieve the expected results using the application)
System crash or hang that prevents further testing or operation of the complete application or a section of the application
Any defect that causes corruption of data as a result of the system (as opposed to user error)
Any defect in which inappropriate transmissions are consistently generated, or appropriate transmissions of HL7 messages fail to be generated
Loss of functionality resulting in erroneous eligibility/enrollment determinations or communications not being sent

Severity 2 – High
[IEEE definition: The defect results in the failure of the complete software system, of a subsystem, or of a software unit (program or module) within the system.]
A major defect in the functionality which does not result in corruption of data
A major defect in the functionality resulting in a failure of all or part of the application, where the expected results can temporarily be achieved by alternate means. The customer indicates the workaround is acceptable for the short term.
Any defect that does not conform to Section 508 standards
Any defect that results in inaccurate or missing requirements
Any defect that results in invalid authentication, or authentication of an invalid end user

Severity 3 – Medium
[IEEE definition: The defect does not result in a failure, but causes the system to produce incorrect, incomplete, or inconsistent results, or the defect impairs the system’s usability.]
Minor functionality is not working as intended and a workaround exists but is not suitable for long-term use
The inability of a valid user to access the system consistent with granted privileges
Typographical or grammatical errors in the application, including installation guides, user guides, training manuals, design documents, etc.
Any defect producing cryptic, incorrect, or inappropriate error messages
Any defect that results from the use of non-standard data terminology in the application or documentation, as defined by the Department of Veterans Affairs
Cosmetic issues that are important to the integrity of the product, but do not result in data entry and/or data quality problems

All Severity 1 and 2 defects shall be addressed or negotiated prior to release. Any limitation or outstanding test incident shall have an approved contingency process (workaround) in place.
Once the defects have been fixed by the developers, they will create unit test documents and forward them to MOCHA 2.1 SQA for validation.

Each tester must have an adequate knowledge of the Pharmacy application and of the business rules in order to validate the changes to the software. When testing is completed and defects have been validated by PRE SQA, any specific ClearQuest ticket related to this testing effort will be updated with the test results.


Suspension and Resumption Criteria
Testing will cease on a test item when an application high impact test incident is logged. Testing will resume when the incident is addressed.

Acceptance Criteria
All Severity 1 and 2 defects shall be addressed or negotiated prior to release. Any limitation or outstanding test incident shall have an approved contingency process (workaround) in place.

Test Deliverables
Table 4 lists the test deliverables for the MOCHA 2.1 project.

Table 4: Test Deliverables
Test Deliverable | Responsible Role
Master Test Plan | SQA team lead
Test Execution Risks | Project Manager
Test Schedule | Project Manager
Test Cases/Test Scripts | PRE SQA Test team
Test Environment | System Administrator
Patch Verification (SQA Checklists) – MOCHA 2.1 | PRE SQA Test team
Test Evaluation Summaries | PRE SQA team lead
Traceability Matrix | SQA analyst, PRE SQA Test Team
Test Schedule
The Test Schedule can be found at the following link:
http:// DNS /projects/pre/PRE%20Schedule/Forms/AllItems.aspx?RootFolder=%2Fprojects%2Fpre%2FPRE%20Schedule%2FMOCHA%20Schedules&FolderCTID=0x012000EE60491C0AC8AF479CF3BDF4C570B869&View=%7bA0BD70BE-5A49-4402-9B83-A48F00FD1DF6%7d
MOCHA 2.1 UFT1 ended on May 8th, 2015, after completing testing on MOCHA 2.1 Combined Buildv7. Development of MOCHA 2.1 will continue after ME2 is through IOC. ME2 IOC completed February 29th, 2016. Per VA management, MOCHA 2.1 UFT2 will start in September 2016.
There will be a UFT test of MOCHA 2.1 Combined Buildv8, which will be a combination of MOCHA 2.1 Combined Buildv7 and MOCHA Enhancement 2 Nationally released Patches. This build will be ready for SQA before the MOCHA 2.1 UFT2 development starts in September 2016.



Table 5: Testing Milestones
Testing Milestone | Responsible Party
Approved Master Test Plan | PRE SQA lead
Approved generic test cases (high level list) | PRE SQA lead
Complete and stable requirements (SRS or CRs) | PRE SQA lead
Creation of Test Environment(s) | PRE SQA Team
Submit and manage request for Testing Services | Project Managers
Test Cases selected for release and copied into appropriate directory in Test Manager | PRE SQA Test Team
Completion of Patch verification | PRE SQA Test Team
SQA Testing conducted (execute the selected Test Cases) in Test environment(s) | PRE SQA Test Team
Defects identified and entered into CQ | PRE SQA Test Team

Test Environments
- CHEYL36 – (Gold Account)
- CHEYL72 (Linux) – End-to-end/regression testing
- CHEYL112 – MOCHA 2.1 UFT1 (Linux) – End-to-end/regression testing
- MARTSQA – (Loaned to SQA by the HDR group) – VMS - Remote data testing
- CLE13 – (Loaned to SQA by the CPRS group) – VMS - Remote data testing

PRE has a GOLD legacy account to test legacy code and validate whether defects found during testing are related to the current legacy code in Production.
Test Environment Configurations
The roles responsible for configuring and maintaining the test environments are the Configuration Manager, Test Environment team, and Software Architects.
The following Test Environment Configurations need to be provided and supported for this project:
Configuration Name | Description
Birmingham Test Center | VISTA User Class Identification (UCI); VISTA Test and Development accounts; T1 line connection
Bay Pines Test Center | VISTA User Class Identification (UCI); Web Services; XML Messaging; Message Validation Server; J2EE Servers
PRE Program Team | MOCHA Server; WebLogic; Java Configuration; AITC


MOCHA 2.1
PRE has 2 test accounts for testing the PRE changes for MOCHA 2.1. PRE also has 1 GOLD legacy account to test legacy code and validate whether defects found during testing are related to the current legacy code in Production. PRE also shares 1 account with the CPRS team to test PRE MOCHA functionality changes related to REMOTE account testing. The spreadsheet with all the accounts is kept up to date with any changes by David Savkovic and stored on the PRE Test Team SharePoint at the following link, in the document named “Test_Environment_Build_Status_MM_DD_YYYY.doc”:
http:// DNS /projects/pre/PRE_TestTeam/Testing%20Documents/Forms/AllItems.aspx?RootFolder=%2fprojects%2fpre%2fPRE%5fTestTeam%2fTesting%20Documents%2fVDD%20Components%2fTest%20Environment%20Build%20Status&FolderCTID=0x01200048706F77829BDE4AA8AAF0103A07591D

Example of the Spreadsheet:

Base System Hardware
Table 6 sets forth the system resources for the test effort presented in this Master Test Plan.
The specific elements of the test system may not be fully understood in early iterations, so this section may be completed over time. The test system should simulate the production environment as closely as possible, scaling down the concurrent access and database size, and so forth, if and where appropriate. Tailor the System Hardware Resources table as required.
Table 6: System Hardware Resources
Resource | Quantity | Name and Type
Network or Subnet | 1 | VA network
Database Name | 1 | Cache Database
Client Test PCs | 6 | SQA test standard GFE machines
Test Repository | 1 | SharePoint
Test Development PCs | 5 | Developer team standard GFE machines

Base Software Elements in the Test Environments
Table 7: Software Elements

Tool Category or Type | Tool Brand Name | Vendor or In-house | Version | Use
Test Management / Testing Repository | Rational Quality Manager | IBM Rational | 4.0.5 | Create Test cases and Suites
MOCHA Server 3.0 | SOAP UI | SmartBear Software | 5.2.1 | SOAP UI is used to directly test the server responses to specific data input. Outside systems are not required, and eliminating the outside sources will ensure that any defects found are within the server code itself.
508 Testing | JAWS | Freedom Scientific | 16.0.4468 | Test the software to make sure it is accessible by all.
Automation Testing | RFT | IBM Rational | 8.5 | To run the automated test scripts for MOCHA 2.1 regression testing
Test Incident Tracking | Rational Team Concert | IBM Rational | 4.0.5 | Report, track, and close test incidents; generate test reports
Project Management | Primavera | Primavera | 5.0 SP2 | To record daily testing time

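The direct server-response testing described for SOAP UI above can be illustrated in miniature: build or capture a SOAP envelope and assert on fields of the response, with no outside systems involved. This is an illustrative sketch only; the operation name, namespace, and response shape are assumptions, not the actual MOCHA Server interface.

```python
# Minimal sketch of direct server-response verification in the spirit of
# SOAP UI testing. The operation name and response shape are assumptions,
# not the actual MOCHA Server interface.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# A canned response standing in for what the server would return.
canned_response = f"""<soap:Envelope xmlns:soap="{SOAP_NS}">
  <soap:Body>
    <orderCheckResponse>
      <status>OK</status>
      <interactionCount>2</interactionCount>
    </orderCheckResponse>
  </soap:Body>
</soap:Envelope>"""

def extract_result(xml_text):
    """Pull the fields a verification point would assert on."""
    root = ET.fromstring(xml_text)
    body = root.find(f"{{{SOAP_NS}}}Body")
    resp = body.find("orderCheckResponse")
    return {
        "status": resp.findtext("status"),
        "interactionCount": int(resp.findtext("interactionCount")),
    }

result = extract_result(canned_response)
print(result)  # {'status': 'OK', 'interactionCount': 2}
```

Because the response is examined directly, a failed assertion points at the server code itself rather than at any intermediate system.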
Staffing and Training Needs
Table 8 describes the personnel resources needed to plan, prepare, and execute this Master Test Plan.
Table 8: Staffing Resources
Test Team | Description | Quantity
Business Analysts | Persons who write the requirements for the defined scope of the project. | 3
Software Architects | Persons who are responsible for the overall architectural design of the application. | 1
Development Team | Persons who build or construct the product/product component. | 4
Test Lead | An experienced Test Analyst or member of the Test Team who leads and coordinates activities related to all aspects of testing, based on an approved Master Test Plan and schedule. | 2
SQA Analyst / Test Team | Persons who execute tests and ensure the test environment will adequately support planned test activities. | 6
Test Environment Team | Persons who establish, maintain, and control test environments. | 2

Risks and Constraints
The risks identified in this Master Test Plan may be recorded and tracked in an automated tool, such as IBM Rational Team Concert.
Risks associated with testing are potential problems/events that may cause damage to the software, system, patient, personnel, operating systems, schedule, scope, budget, and/or resources. The risks outlined here may impact scope and schedule, necessitating a deviation from this test plan. Risk impact, probability, and severity classifications are defined in the Risk Management Plan.
Identified risks are entered into RTC and reviewed at monthly Risk Management meetings.
Test Metrics
Metrics are a system of parameters or methods for quantitative and periodic assessment of a process that is to be measured.
Test metrics may include, but are not limited to:
Number of test cases (pass/fail)
Percentage of test cases executed
Number of requirements and percentage tested
Percentage of test cases resulting in defect detection
Number of defects attributed to test case/test script creation
Percentage of defects identified, listed by cause and severity
Time to re-test
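Several of the metrics listed above reduce to simple ratios over test-execution records. A minimal sketch of that arithmetic follows, assuming a flat list of per-test-case result records; the field names are illustrative, not project data structures.

```python
# Illustrative computation of a few of the test metrics listed above.
# The record fields are assumptions, not actual project data structures.

def summarize(results):
    """results: list of dicts with 'executed' (bool), 'passed' (bool or
    None if not executed), and 'found_defect' (bool) per test case."""
    total = len(results)
    executed = [r for r in results if r["executed"]]
    passed = sum(1 for r in executed if r["passed"])
    failed = len(executed) - passed
    defects = sum(1 for r in executed if r["found_defect"])
    return {
        "pass": passed,
        "fail": failed,
        "pct_executed": round(100.0 * len(executed) / total, 1) if total else 0.0,
        "pct_defect_detecting": round(100.0 * defects / len(executed), 1) if executed else 0.0,
    }

sample = [
    {"executed": True,  "passed": True,  "found_defect": False},
    {"executed": True,  "passed": False, "found_defect": True},
    {"executed": True,  "passed": True,  "found_defect": False},
    {"executed": False, "passed": None,  "found_defect": False},
]
print(summarize(sample))
# {'pass': 2, 'fail': 1, 'pct_executed': 75.0, 'pct_defect_detecting': 33.3}
```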


Attachment A - Approval Signatures
The Master Test Plan documents the project’s overall approach to testing and includes:
Items to be tested
Test strategy
Test criteria
Test deliverables
Test schedule
Test environments
Staffing and training needs
Risks and constraints
Test Metrics

This section is used to document the approval of the Master Test Plan during the Formal Review. The review should ideally be conducted face to face, where signatures can be obtained ‘live’ during the review; however, the following forms of approval are acceptable: 1. Physical signatures obtained face to face or via fax. 2. Digital signatures tied cryptographically to the signer. 3. /es/ in the signature block, provided that a separate digitally signed e-mail indicating the signer’s approval is provided and kept with the document.

NOTE: Delete the entire section above prior to final submission.

REVIEW DATE: <date>
< Program/Project Manager >


__________________________________________________________
Signed: Date:
< Business Sponsor Representative >



__________________________________________________________
Signed: Date:
< Integrated Project Team (IPT) chair >


__________________________________________________________
Signed: Date:
< Enterprise Systems Engineering (ESE) Representative >
A. Test Type Definitions
Test analysts use “test types” to validate the system or application under test. Simply put, test types are test techniques used to exercise the system or application. This table presents a listing of possible test types that may be utilized during the Product Build, Independent Testing, Operational Readiness Review (ORR), and Initial Operating Capability (IOC) Testing. The test analyst, in consultation with the Development Manager, selects the test types best suited to the system or application being tested. A minimum set of test types is suggested here. More tests may be added at the discretion of the Development Team.
| 744 | ||||||
| 745 | Product Bu ild Testin g | |||||
| 746 | Independen t Testing | |||||
| 747 | IOC Testin g | |||||
| 748 | Types of T est | |||||
| 749 | ||||||
| 750 | ||||||
| 751 | ||||||
| 752 | Access Con trol Testi ng | |||||
| 753 | X | |||||
| 754 | ||||||
| 755 | ||||||
| 756 | Benchmark Testing | |||||
| 757 | ||||||
| 758 | ||||||
| 759 | ||||||
| 760 | Build Veri fication T esting | |||||
| 761 | X | |||||
| 762 | ||||||
| 763 | ||||||
| 764 | Business C ycle Testi ng | |||||
| 765 | ||||||
| 766 | ||||||
| 767 | ||||||
| 768 | Compliance Testing | |||||
| 769 | X | |||||
| 770 | ||||||
| 771 | ||||||
| 772 | Component Integratio n Testing | |||||
| 773 | ||||||
| 774 | ||||||
| 775 | ||||||
| 776 | Configurat ion Testin g | |||||
| 777 | ||||||
| 778 | ||||||
| 779 | ||||||
| 780 | Contention Testing | |||||
| 781 | ||||||
| 782 | ||||||
| 783 | ||||||
| 784 | Data and D atabase In tegrity Te sting | |||||
| 785 | ||||||
| 786 | ||||||
| 787 | ||||||
| 788 | Documentat ion Testin g | |||||
| 789 | X | |||||
| 790 | ||||||
| 791 | ||||||
| 792 | Error Anal ysis Testi ng | |||||
| 793 | ||||||
| 794 | ||||||
| 795 | ||||||
| 796 | Explorator y Testing | |||||
| 797 | X | |||||
| 798 | ||||||
| 799 | ||||||
| 800 | Failover T esting | |||||
| 801 | ||||||
| 802 | ||||||
| 803 | ||||||
| 804 | Installati on Testing | |||||
| 805 | X | |||||
| 806 | ||||||
| 807 | ||||||
| 808 | Integratio n Testing | |||||
| 809 | X | |||||
| 810 | X | |||||
| 811 | ||||||
| 812 | Load Testi ng | |||||
| 813 | ||||||
| 814 | X | |||||
| 815 | ||||||
| 816 | Migration Testing | |||||
| 817 | ||||||
| 818 | ||||||
| 819 | ||||||
| 820 | Multi-Divi sional Tes ting | |||||
| 821 | ||||||
| 822 | ||||||
| 823 | ||||||
| 824 | Parallel T esting | |||||
| 825 | ||||||
| 826 | ||||||
| 827 | ||||||
| 828 | Performanc e Monitori ng Testing | |||||
| 829 | ||||||
| 830 | ||||||
| 831 | ||||||
| 832 | Performanc e Testing | |||||
| 833 | X | |||||
| 834 | X | |||||
| 835 | ||||||
| 836 | Privacy Te sting | |||||
| 837 | X | |||||
| 838 | ||||||
| 839 | ||||||
| 840 | Product Co mponent Te sting | |||||
| 841 | X | |||||
| 842 | ||||||
| 843 | ||||||
| 844 | Recovery T esting | |||||
| 845 | ||||||
| 846 | ||||||
| 847 | ||||||
| 848 | Regression Test | |||||
| 849 | X | |||||
| 850 | ||||||
| 851 | ||||||
| 852 | Risk Based Testing | |||||
| 853 | X | |||||
| 854 | ||||||
| 855 | X | |||||
| 856 | Section 50 8 Complian ce Testing | |||||
| 857 | X | |||||
| 858 | ||||||
| 859 | ||||||
| 860 | Security T esting | |||||
| 861 | ||||||
| 862 | ||||||
| 863 | ||||||
| 864 | Smoke Test ing | |||||
| 865 | X | |||||
| 866 | X | |||||
| 867 | ||||||
| 868 | Stress Tes ting | |||||
| 869 | ||||||
| 870 | X | |||||
| 871 | ||||||
| 872 | System Tes ting | |||||
| 873 | X | |||||
| 874 | X | |||||
| 875 | ||||||
| 876 | Usability Testing | |||||
| 877 | X | |||||
| 878 | ||||||
| 879 | X | |||||
| 880 | User Funct ionality T esting | |||||
| 881 | X | |||||
| 882 | ||||||
| 883 | ||||||
| 884 | User Interface Testing | |||||
| 885 | ||||||
| 886 | ||||||
| 887 | ||||||
| 888 | Test Type Definitions | |||||
| 889 | Test Type | |||||
| 890 | Definition | |||||
| 891 | Access Control Testing | |||||
| 892 | A type of testing that attests that the target-of-test data (or systems) are accessible only to those actors for which they are intended, as defined by use cases. Access Control Testing verifies that access to the system is controlled and that unwanted or unauthorized access is prohibited. This test is implemented and executed on various targets-of-test. | |||||
| 893 | Benchmark Testing | |||||
| 894 | A type of performance testing that compares the performance of new or unknown functionality to a known reference standard (e.g., existing software or measurements). For example, benchmark testing may compare the performance of current systems with the performance of the Linux/Oracle system. | |||||
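As an illustrative sketch only (not part of this test plan), a benchmark test compares a candidate implementation against a known reference under an identical workload. The function names and workload below are hypothetical.

```python
import timeit

def linear_lookup(items, target):
    # Reference standard: O(n) membership scan over a list.
    return target in items

def set_lookup(item_set, target):
    # Candidate functionality: O(1) average-case hash lookup.
    return target in item_set

data = list(range(10_000))
data_set = set(data)

# Run both against the same workload and compare timings to the reference.
t_linear = timeit.timeit(lambda: linear_lookup(data, 9_999), number=200)
t_set = timeit.timeit(lambda: set_lookup(data_set, 9_999), number=200)
print(f"reference: {t_linear:.4f}s  candidate: {t_set:.4f}s")
```

The essential property is that both measurements use the same inputs and iteration count, so only the implementation under comparison varies.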
| 895 | Build Verification Testing | |||||
| 896 | (Prerequisite: Smoke Test) | |||||
| 897 | A type of testing performed for each new build, comparing the baseline with the actual object properties in the current build. The output from this test indicates which object properties have changed or do not meet the requirements. Together with the Smoke Test, the Build Verification Test may be utilized by projects to determine whether additional functional testing is appropriate for a given build or whether a build is ready for production. | |||||
| 898 | Business Cycle Testing | |||||
| 899 | A type of testing that focuses on activities and transactions performed end to end over time. This test type executes the functionality associated with a period of time (e.g., one week, month, or year). These tests include all daily, weekly, and monthly cycles, and events that are date-sensitive (e.g., end-of-month management reports, monthly reports, quarterly reports, and year-end reports). | |||||
| 900 | Compliance Testing | |||||
| 901 | A type of testing th at verifie s that a c ollection of softwar e and hard ware fulfi lls given specificat ions. For example, t hese tests will mini mally incl ude: "core specifica tions for rehosting - ver.1.5- draft 3.do c", Sectio n 508 of T he Rehabil itation Ac t Amendmen ts of 1998 , Race and Ethnicity Test, and VA Direct ive 6102 C ompliance. It does n ot exclude any other tests tha t may also come up. | |||||
| 902 | Component Integratio n Testing | |||||
| 903 | Testing pe rformed to expose de fects in t he interfa ces and in teraction between in tegrated c omponents as well as verifying installat ion instru ctions. | |||||
| 904 | Configurat ion Testin g | |||||
| 905 | A type of testing co ncerned wi th checkin g the prog rams compa tibility w ith as man y possible configura tions of h ardware an d system s oftware. I n most pro duction en vironments , the part icular har dware spec ifications for the c lient work stations, network co nnections, and datab ase server s vary. Cl ient works tations ma y have dif ferent sof tware load ed, for ex ample, app lications, drivers, and so on hand, at a ny one tim e; many di fferent co mbinations may be ac tive using different resources . The goal of the co nfiguratio n test is finding a hardware c ombination that shou ld be, but is not, c ompatible with the p rogram. | |||||
| 906 | Contention Testing | |||||
| 907 | A type of performanc e testing that execu tes tests that cause s the appl ication to fail with regard to actual or simulated concurren cy. Conten tion testi ng identif ies failur es associa ted with l ocking, de adlock, li velock, st arvation, race condi tions, pri ority inve rsion, dat a loss, lo ss of memo ry, and la ck of thre ad safety in shared software c omponents or data. | |||||
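As an illustrative sketch only (not part of this test plan), a contention test hammers a shared component from many threads and checks for lost updates. The `Counter` class and thread counts below are hypothetical.

```python
import threading

class Counter:
    """Hypothetical shared component; the lock guards against lost updates."""
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self):
        # Removing this lock makes `value += 1` a non-atomic read-modify-write,
        # which a contention test can catch as a lost-update failure.
        with self._lock:
            self.value += 1

def hammer(counter, n):
    for _ in range(n):
        counter.increment()

counter = Counter()
threads = [threading.Thread(target=hammer, args=(counter, 10_000)) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # 80000 when the lock is in place
```

The test passes only if every one of the 8 × 10,000 increments is preserved under real concurrency.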
| 908 | Data and D atabase In tegrity Te sting | |||||
| 909 | A type of testing th at verifie s that dat a is being stored by the syste m in a man ner where the data i s not comp romised by the initi al storage , updating , restorat ion, or re trieval pr ocessing. This type of testing is intend ed to unco ver design flaws tha t may resu lt in data corruptio n, unautho rized data access, l ack of dat a integrit y across m ultiple ta bles, and lack of ad equate tra nsaction p erformance . The data bases, dat a files, a nd the dat abase or d ata file p rocesses s hould be t ested as a subsystem within th e applicat ion. | |||||
| 910 | Documentation Testing | |||||
| 911 | Documentation testing is a type of testing that should validate the information contained within the software documentation set for the following qualities: compliance with accepted standards and conventions, accuracy, completeness, and usability. Documentation testing should verify that all of the required information is provided in order for the appropriate user to be able to properly install, implement, operate, and maintain the software application. The current VistA documentation set can consist of any of the following manual types: | |||||
| 912 | Release Notes, Installation Guide, User Manuals, Technical Manual, and Security Guide. | |||||
| 913 | Error Anal ysis Testi ng | |||||
| 914 | This type of testing verifies that the a pplication checks fo r input, d etects inv alid data, and preve nts invali d data fro m being en tered into the appli cation. Th is type of testing a lso includ es the ver ification of error l ogs and er ror messag es that ar e displaye d to the u ser. | |||||
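As an illustrative sketch only (not part of this test plan), an error analysis test feeds invalid data to an input validator and verifies it is rejected. The `parse_dose` function, its field, and its accepted range are all hypothetical.

```python
def parse_dose(text):
    """Validate a hypothetical free-text dose entry; reject invalid data."""
    try:
        value = float(text)
    except (TypeError, ValueError):
        raise ValueError(f"invalid dose entry: {text!r}")
    if not 0 < value <= 1000:  # assumed range, for illustration only
        raise ValueError(f"dose out of range: {value}")
    return value

# Error-analysis checks: each invalid entry must raise, valid entries must pass.
for bad in ("abc", "", "-5", "9999"):
    try:
        parse_dose(bad)
        print(f"FAIL: {bad!r} was accepted")
    except ValueError:
        pass  # rejected as expected
print(parse_dose("250"))  # 250.0
```

A fuller version would also assert on the wording of the error message shown to the user and on the corresponding error-log entry.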
| 915 | Explorator y Testing | |||||
| 916 | A techniqu e for test ing comput er softwar e that req uires mini mal planni ng and tol erates lim ited docum entation f or the tar get-of-tes t in advan ce of test execution , relying on the ski ll and kno wledge of the tester and feedb ack from t est result s to guide the ongoi ng test ef fort. Expl oratory te sting is o ften condu cted in sh ort sessio ns in whic h feedback gained fr om one ses sion is us ed to dyna mically pl an subsequ ent sessio ns. | |||||
| 917 | Failover T esting | |||||
| 918 | A type of testing te st that en sures an a lternate o r backup s ystem prop erly "take s over" (i .e., a bac kup system functions when the primary sy stem fails ). Failove r Testing also tests that a sy stem conti nually run s when the failover occurs, an d that the failover happens wi thout any loss of da ta or tran sactions. Failover T esting sho uld be com bined with Recovery Testing. | |||||
| 919 | Installati on Testing | |||||
| 920 | A type of testing th at verifie s that the applicati on or syst em install s as inten ded on dif ferent har dware and software c onfigurati ons, and u nder diffe rent condi tions (e.g ., a new i nstallatio n, an upgr ade, and a complete or custom installati on). Insta llation te sting may also measu re the eas e with whi ch an appl ication or system ca n be succe ssfully in stalled, t ypically m easured in terms of the averag e amount o f person-h ours requi red for a trained op erator or hardware e ngineer to perform t he install ation. Par t of this installati on test is to perfor m an unins tall. As a result of this unin stall, the system, a pplication and datab ase should return to the state prior to the instal l. | |||||
| 921 | Integratio n Testing | |||||
| 922 | An increme ntal serie s of tests of combin ations or sub-assemb lies of se lected com ponents in an overal l system. Integratio n testing is increme ntal in a successive ly larger and more c omplex com binations of compone nts tested in sequen ce, procee ding from the unit l evel (0% i ntegration ) to event ually the full syste m test (10 0% integra tion). | |||||
| 923 | Load Testi ng | |||||
| 924 | A performa nce test t hat subjec ts the sys tem to var ying workl oads in or der to mea sure and e valuate th e performa nce behavi ors and ab ilities of the syste m to conti nue to fun ction prop erly under these dif ferent wor kloads. Lo ad testing determine s and ensu res that t he system functions properly b eyond the expected m aximum wor kload. Add itionally, load test ing evalua tes the pe rformance characteri stics (e.g ., respons e times, t ransaction rates, an d other ti me-sensiti ve issues) . | |||||
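As an illustrative sketch only (not part of this test plan), a load test drives a system at several workload levels and records throughput at each. The `handle_request` stub and the workload numbers below are hypothetical stand-ins for the real system under test.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(payload):
    # Hypothetical stand-in for the system under test.
    time.sleep(0.001)  # simulate 1 ms of service time
    return len(payload)

def run_load(workers, requests):
    """Issue `requests` concurrent requests and measure elapsed time and rate."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(handle_request, ["x"] * requests))
    elapsed = time.perf_counter() - start
    assert len(results) == requests  # every request must complete
    return elapsed, requests / elapsed  # total time, throughput in req/s

# Varying workloads: the same measurement at increasing request volumes.
for load in (10, 50, 100):
    elapsed, rate = run_load(workers=10, requests=load)
    print(f"{load:4d} requests: {elapsed:.3f}s, {rate:.0f} req/s")
```

Against a real system, `handle_request` would issue an actual transaction, and the pass criteria would compare response times and transaction rates at each level against the plan's thresholds.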
| 925 | Migration Testing | |||||
| 926 | A type of testing th at follows standard VistA and HeV-VistA operating procedures and loads the lates t .jar ver sion onto a live cop y of VistA and HeV-V istA. The following are exampl es of the types of t ests that can be per formed as part of mi gration te sting: | |||||
| 927 | Data conve rsion has been compl eted | |||||
| 928 | Data table s are succ essfully c reated | |||||
| 929 | Parallel t est for co nfirmation of data i ntegrity | |||||
| 930 | Review out put report , before a nd after m igration, to confirm data inte grity | |||||
| 931 | Run equiva lent proce ss, before and after migration | |||||
| 932 | Multi-Divi sional Tes ting | |||||
| 933 | A type of testing th at ensures that all applicatio ns will op erate in a multi-div ision or m ulti-site environmen t recogniz ing that a n enterpri se perspec tive while fully sup porting lo cal health care deli very. | |||||
| 934 | Parallel T esting | |||||
| 935 | The same i nternal pr ocesses ar e run on t he existin g system a nd the new system. T he existin g system i s consider ed the “go ld standar d”, unless proven ot herwise. T he feedbac k (expecte d results, defined t ime limits , data ext racts, etc ) from pro cesses fro m the new system are compared to the exi sting syst em. Parall el testing is perfor med before the new s ystem is p ut into a production environme nt. | |||||
| 936 | Performance Monitoring Testing | |||||
| 937 | Performance profiling assesses how a system is spending its time and consuming resources. This type of performance testing optimizes the performance of a system by measuring how much time and how many resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. The goal of performance profiling is to optimize feature and application performance. | |||||
| 938 | Performance Testing | |||||
| 939 | Performance Testing assesses how a system is spending its time and consuming resources. Performance testing optimizes a system by measuring how much time and how many resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. Performance testing may be further refined by the use of specific types of performance tests, such as benchmark tests, load tests, stress tests, performance monitoring tests, and contention tests. | |||||
| 940 | Privacy Te sting | |||||
| 941 | A type of testing th at ensures that (1) veteran an d employee data are adequately protected and (2) s ystems and applicati ons comply with the Privacy an d Security Rule prov isions of the Health Insurance Portabili ty and Acc ountabilit y Act (HIP AA). | |||||
| 942 | Product Component Testing | |||||
| 943 | Product Component Testing (aka Unit Testing) is the internal technical and functional testing of a module/component of code. Product Component Testing verifies that the requirements defined in the detail design specification have been successfully applied to the module/component under test. | |||||
| 944 | Recovery T esting | |||||
| 945 | A type of testing th at causes an applica tion or sy stem to fa il in a co ntrolled e nvironment . Recovery processes are invok ed while a n applicat ion or sys tem is mon itored. Re covery tes ting verif ies that a pplication or system , and data recovery is achieve d. Recover y Testing should be combined w ith Failov er Testing . | |||||
| 946 | Regression Test | |||||
| 947 | A type of testing th at validat es existin g function ality stil l performs as expect ed when ne w function ality is i ntroduced into the s ystem unde r test. | |||||
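As an illustrative sketch only (not part of this test plan), a regression test replays inputs whose expected outputs were captured before a change and flags any divergence. The `format_patient_id` function and its baseline cases are hypothetical.

```python
def format_patient_id(site, number):
    # Hypothetical existing functionality: zero-padded, site-prefixed ID.
    return f"{site:03d}-{number:06d}"

# Baseline: expected outputs captured before the new functionality shipped.
BASELINE = {
    (500, 42): "500-000042",
    (7, 1): "007-000001",
}

def run_regression():
    """Return the inputs whose current output no longer matches the baseline."""
    return [args for args, expected in BASELINE.items()
            if format_patient_id(*args) != expected]

print(run_regression())  # an empty list means no regressions were introduced
```

New functionality extends `BASELINE` with new cases; the old entries stay in place so every later build is re-checked against them.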
| 948 | Risk Based Testing | |||||
| 949 | A type of testing ba sed on a d efined lis t of proje ct risks. It is desi gned to ex plore and/ or uncover potential system fa ilures by using the list of ri sks to sel ect and pr ioritize t esting. | |||||
| 950 | Section 50 8 Complian ce Testing | |||||
| 951 | A type of test that (1) ensure s that per sons with disabiliti es have ac cess to an d are able to intera ct with gr aphical us er interfa ces and (2 ) verifies that the applicatio n or syste m meets th e specifie d Section 508 Compli ance stand ards. | |||||
| 952 | Security T esting | |||||
| 953 | A type of test that validates the securi ty require ments and to ensure readiness for the in dependent testing pe rformed by the Secur ity Assess ment Team as require d by the A ssessment and Author ization Pr ocess. | |||||
| 954 | Smoke Test | |||||
| 955 | A type of testing th at ensures that an a pplication or system is stable enough to enter tes ting in th e currentl y active t est phase. It is usu ally a sub set of the overall s et of test s, prefera bly automa ted, that touches pa rts of the system in at least a cursory way. | |||||
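As an illustrative sketch only (not part of this test plan), a smoke test runs a small set of cursory checks, one per major part of the system, and blocks deeper testing if any fails. The check names and their bodies below are hypothetical.

```python
def smoke_test(checks):
    """Run cursory checks; any exception or falsy result marks a failure."""
    results = {}
    for name, check in checks.items():
        try:
            results[name] = bool(check())
        except Exception:
            results[name] = False
    return results

# Hypothetical cursory checks touching each major part of the system.
checks = {
    "config loads": lambda: isinstance({"db": "vista"}, dict),
    "core module imports": lambda: __import__("json") is not None,
}

results = smoke_test(checks)
print(results)
print("stable enough to enter test phase:", all(results.values()))
```

Because every check is cursory, the whole suite runs in seconds, which is what lets it gate each build before the heavier functional tests begin.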
| 956 | Stress Tes ting | |||||
| 957 | A performa nce test i mplemented and execu ted to und erstand ho w a system fails due to condit ions at th e boundary , or outsi de of, the expected tolerances . This fai lure typic ally invol ves low re sources or competiti on for res ources. Lo w resource condition s reveal h ow the tar get-of-tes t fails th at is not apparent u nder norma l conditio ns. Other defects mi ght result from comp etition fo r shared r esources ( e.g., data base locks or networ k bandwidt h), althou gh some of these tes ts are usu ally addre ssed under functiona l and load testing. Stress Tes ting verif ies the ac ceptabilit y of the s ystems per formance b ehavior wh en abnorma l or extre me conditi ons are en countered (e.g., dim inished re sources or extremely high numb er of user s). | |||||
| 958 | System Tes ting | |||||
| 959 | System tes ting is th e testing of all par ts of an i ntegrated system, in cluding in terfaces t o external systems. Both funct ional and structural types of testing ar e performe d to verif y that the system pe rformance, operation and funct ionality a re sound. End to end testing w ith all in terfacing systems is the ultim ate versio n. | |||||
| 960 | Usability Testing | |||||
| 961 | Usability testing identifies problems in the ease of use and ease of learning of a product. Usability tests may focus on, but are not limited to: human factors, aesthetics, consistency in the user interface, online and context-sensitive help, wizards and agents, and user documentation. | |||||
| 962 | User Functionality Test | |||||
| 963 | The User Functionality Test, a form of User Acceptance Testing (UAT), is a type of Acceptance Test that involves end users testing the functionality of the application using test data in a controlled test environment. | |||||
| 964 | User Inter face Testi ng | |||||
| 965 | User-inter face (UI) testing ex ercises th e user int erfaces to ensure th at the int erfaces fo llow accep ted standa rds and me et require ments. Use r-interfac e testing is often r eferred to as GUI te sting. UI testing pr ovides too ls and ser vices for driving th e user int erface of an applica tion from a test. |