Produced by Araxis Merge on 8/29/2018 2:55:47 PM Central Daylight Time. See www.araxis.com for information about Merge.
# | Location | File | Last Modified
---|---|---|---
1 | EPIP.zip\PSS_1.0_225_PSB_3.0_103_Aug_2018 | EPIP_Test_Evaluation_(PSS_1.0_225,_PSB_3.0_103).docx | Tue Aug 28 12:14:54 2018 UTC
2 | EPIP.zip\PSS_1.0_225_PSB_3.0_103_Aug_2018 | EPIP_Test_Evaluation_(PSS_1.0_225,_PSB_3.0_103).docx | Tue Aug 28 12:54:16 2018 UTC
Between Files 1 and 2:

Description | Text Blocks | Lines
---|---|---
Unchanged | 7 | 462
Changed | 6 | 12
Inserted | 0 | 0
Removed | 0 | 0
Comparison option | Setting
---|---
Whitespace | |
Character case | Differences in character case are significant
Line endings | Differences in line endings (CR and LF characters) are ignored
CR/LF characters | Not shown in the comparison detail
No regular expressions were active.
Existing Product Intake Program (EPIP)
PSS*1.0*225 and PSB*3.0*103
Test Evaluation

Department of Veterans Affairs
August 2018
Version 1.0

Revision History
Note: The revision history cycle begins once changes or enhancements are requested after the Communications Plan has been baselined.

Date | Version | Description | Author
---|---|---|---
08/09/2018 | 1.0 | Initial document. | EPIP Project Team

Artifact Rationale
The test evaluation document is the primary output of the test and evaluation process, an integral part of the systems engineering process, which identifies levels of performance and assists the developer in correcting deficiencies.
Table of Contents
1. Test Evaluation Introduction
1.1. Test Evaluation Scope
1.2. Test Architecture
1.3. Test Environment/Configuration
1.4. Installation Process
2. Test Data
3. Issues
4. Test Execution Log
5. Test Defect Log
6. Test Results Summary
6.1. Defect Severity and Priority Levels
6.2. Total Defects by Severity Level
6.3. Breakdown of Test Results
6.4. Performance Testing
7. Test Coverage
7.1. Requirements Covered
7.2. Section 508 Compliance Coverage
8. Suggested Actions
9. Defect Severity and Priority Definitions
9.1. Defect Severity Level
9.1.1. Severity Level 1 – Critical
9.1.2. Severity Level 2 - High
9.1.3. Severity Level 3 - Medium
9.1.4. Severity Level 4 - Low
9.2. Priority Classifications
9.2.1. Priority 1 - Resolve Immediately
9.2.2. Priority 2 - Give High Attention
9.2.3. Priority 3 - Normal Queue
9.2.4. Priority 4 - Low Priority
10. Optional Tables, Charts, and Graphs
11. Document Approval Signatures
Appendix A - Test Execution Log
Appendix B – Defect Log

Test Evaluation Introduction
The purpose of this Test Evaluation is to:
Identify the testing approach used.
Present a summary analysis of the key test results from the remediation of this intake for review and assessment by designated stakeholders.
Provide a general statement of the quality of the system under test.
Make recommendations for future testing efforts.
Test Evaluation Scope
The scope of this Test Evaluation is to verify the functionality of the code modification for patches PSS*1.0*225 and PSB*3.0*103, as determined by Functional, Component Integration/System, and Regression testing. Testing activities followed the specifications outlined in the following Master Test Plan: PSS*1.0*225 and PSB*3.0*103 Master Test Plan (included in Appendix A).
Test Architecture
Following are the EPIP test accounts used by the Leidos Development and SQA Testing teams to test PSS*1.0*225 and PSB*3.0*103.
Development Test Accounts (For Unit Testing) | SQA Test Accounts (For Functional, Regression, and Component Integration and System Testing)
---|---
VistAS1 (alternate name: D1S1) | VistAG1 (alternate name: D1G1)
VistAS2 (alternate name: D1S2) – for CPRS GUI testing only | VistAG2 (alternate name: D1G2) – for CPRS GUI testing only
Test Environment/Configuration
The EPIP test accounts are maintained by the EPIP System Administrator, who installs all VA-released patches as soon as they are nationally released. All EPIP test accounts are cloned from existing VA Enterprise Testing Services (ETS) test accounts. The Computerized Patient Record System (CPRS) Graphical User Interface (GUI) executable is configured for each VistA instance utilizing a unique Internet Protocol (IP) address to connect to the VistA applications. Any updates to the CPRS GUI executable are handled by the EPIP System Administrator.
All EPIP Test Engineers and Developers who have the proper credentials can access the test accounts. The VA Austin Information Technology Center (AITC) support team resets passwords and sets up new access credentials on an as-needed basis.
Installation Process
As soon as the remediation process is complete and the patch is available for testing, a KIDS build is created in the Development account and then sent to FORUM for final packaging. The patch is then submitted to the VA SQA Lead’s Mailman account for installation.
An EPIP Developer or Test Engineer utilizes the KIDS Installation process to extract the build from the patch and install the build into a test account. The individual who installs the patch verifies the routine checksums and also checks for errors during the installation process. If the patch is successfully installed without any errors, then the EPIP Test team proceeds with Functional, Regression, and Component Integration and System testing. If defects are found, then the Development team works to find a resolution and creates new versions of the patch until all defects are resolved.
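As an illustration only (not part of the documented EPIP procedure), the checksum-verification step described above can be thought of as comparing the checksum reported for each installed routine against the value published in the patch description. The sketch below assumes hypothetical text files of routine/checksum pairs exported from the install log and the patch description; the file names and format are invented for the example.

```python
# Illustrative sketch only: compare routine checksums from a KIDS install log
# against the values published in the patch description. The file names and
# format are assumptions made for this example, not part of the EPIP process.

def load_checksums(path):
    """Read 'ROUTINE CHECKSUM' pairs, one per line, into a dictionary."""
    checksums = {}
    with open(path) as handle:
        for line in handle:
            parts = line.split()
            if len(parts) == 2:
                routine, value = parts
                checksums[routine] = value
    return checksums

def compare(expected_path, actual_path):
    """Report routines whose installed checksum differs from the published one."""
    expected = load_checksums(expected_path)   # from the patch description
    actual = load_checksums(actual_path)       # from the KIDS install log
    return {
        routine: (value, actual.get(routine))
        for routine, value in expected.items()
        if actual.get(routine) != value
    }

if __name__ == "__main__":
    diffs = compare("patch_description_checksums.txt", "install_log_checksums.txt")
    if diffs:
        for routine, (want, got) in sorted(diffs.items()):
            print(f"{routine}: expected {want}, found {got}")
    else:
        print("All routine checksums match the patch description.")
```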
Test Data
The SQA Testing team utilizes the test data in the designated test accounts (D1G1, D1G2).
The test data is encrypted following the standards set forth by the VA Office of Information & Technology (OIT). All Personally Identifiable Information (PII) and Protected Health Information (PHI) is scrubbed and is not available to the Test Engineers.
All testing is executed using encrypted test patients available from any of the EPIP test accounts. Examples of encrypted test patients:
AAAHURMMX,XPHY
BADHB, HAADXS
FDHUX, YHIJ
All tests were executed manually by EPIP Test Engineers.
Issues
No issues were encountered during testing of PSS*1.0*225 and PSB*3.0*103.

Title | Issue Description | Type | Severity
---|---|---|---
N/A | N/A | N/A | N/A
Test Execution Log
The Test Execution Log records the execution of test scripts and documents the test results for each test script.
The SQA Testing team utilizes the Rational Quality Management (QM) tool for all testing activities. All test documents are stored in the EPIP repository, including the Master Test Plan, Test Suites, Test Cases, and Test Scripts. Test execution is performed, and test results recorded, in Rational QM. The Test Engineer adds the test results to the Test Execution records to indicate whether testing achieved Pass or Fail status.
The Test Execution records for PSS*1.0*225 and PSB*3.0*103 are included in the EPIP PSS*1.0*225 and PSB*3.0*103 Master Test Plan. The Master Test Plan is available in Appendix A.
Test Defect Log
The Test Defect Log is a tool for recording, analyzing, tracking, and documenting the closure of defects. It specifies the screen, field, behavior or result that occurred, and the IEEE-defined Severity Level. It includes enough information for the developer to find and re-create the defect. The Defect Log is available in Appendix B.
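To make the structure of a defect log entry concrete, the sketch below models the fields just described (defect ID, screen, field, observed behavior or result, severity, and description) as a small Python record. The class and field names are illustrative assumptions for this example only; EPIP records this information in Rational QM, not in code.

```python
# Illustrative sketch of a defect log entry with the fields described above.
# The names are assumptions for the example; EPIP tracks this data in the
# Rational toolset rather than in a Python structure.
from dataclasses import dataclass

@dataclass
class DefectLogEntry:
    defect_id: str          # SQA Defect ID
    affected_screen: str
    affected_field: str
    observed_behavior: str  # behavior or result that occurred
    severity: int           # IEEE-defined Severity Level, 1 (Critical) to 4 (Low)
    description: str

# Hypothetical example entry; no defects were actually found for this patch.
entry = DefectLogEntry(
    defect_id="SQA-001",
    affected_screen="Medication Order Entry",
    affected_field="Dosage",
    observed_behavior="Value not saved",
    severity=3,
    description="Example only.",
)
print(entry)
```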
Test Results Summary
SQA testing for this intake started in the Dev1 Gold1 test environment on August 2, 2018 and ended on August 3, 2018. Test version 1 was installed in the test environment after Unit testing in Dev1 Silver1 was completed. Upon completion of Integration testing (Component Integration and System Testing, Functional Testing, and Regression Testing), zero (0) defects were found and reported.
Defect Severity and Priority Levels
A defect is defined as a flaw in a component or system that can cause the component or system to fail to perform its required function, e.g., an incorrect statement or data definition. A defect, if encountered during execution, may cause a failure of the component or system.
Defects are categorized according to severity and priority levels. The test analyst assigns the severity, while the development manager assigns the priority for repair. For more information, see Defect Severity and Priority Definitions in this Test Evaluation.
Total Defects by Severity Level
The Defect Log in Appendix B displays the defects encountered while testing this patch, and the severity level of each.
Breakdown of Test Results
Testing was completed on August 3, 2018. All test results were recorded in Rational QM. Detailed results are available in the EPIP Patch PSS*1.0*225 and PSB*3.0*103 Master Test Plan (see Appendix A).
Performance Testing
Performance testing was not conducted.
Test Coverage
The EPIP PSS*1.0*225 and PSB*3.0*103 Master Test Plan contains details on test coverage (see Appendix A).
Requirements Covered
The requirements for PSS*1.0*225 and PSB*3.0*103 are stored in the Rational Requirements Management (RM) application. The test cases stored in Rational Quality Management (QM) are used to validate that the requirements have been addressed, providing full traceability. The user stories stored in Rational Configuration Management (CM) are linked to the requirements in RM and test cases in QM.
The following links provide access to the various Pharmacy Data Management (PDM) repositories related to the PSS*1.0*225 patch in the Rational toolkit. If link translation issues prevent direct access, copy and paste the URLs into your browser.
PDM (RM) – Go to Artifacts. Locate the EPIP folder on the left side of the page and expand it to display patch folders. Each patch folder contains the requirements for the patch number shown in the folder name.
https://clm.rational.oit. DNS /rm/web#action=com.ibm.rdm.web.pages.showFoundationProjectDashboard&componentURI=https://clm.rational.oit. DNS /rm/rm-projects/_Y7C2AchrEeanUdQ87CKe9A/components/_Zo8d0MhrEeanUdQ87CKe9A
PDM (QM) – Go to Planning, then Browse Test Plans, and then search for the Master Test Plan you need. The Master Test Plan and test cases are linked to requirements.
https://clm.rational.oit. DNS /qm/web/console/PDM%20(QM)#action=com.ibm.rqm.planning.home.actionDispatcher&subAction=viewUserHome
PDM (CM) – Go to Plans, then All Plans, and then search for the Sprint Plan you need. The user stories in each Plan are linked to requirements and test cases.
https://clm.rational.oit. DNS /ccm/web/projects/PDM%20(CM)#action=com.ibm.team.dashboard.viewDashboard
The following links provide access to the various Bar Code Medication Administration (BCMA) repositories related to the PSB*3.0*103 patch in the Rational toolkit. If link translation issues prevent direct access, copy and paste the URLs into your browser.
BCMA (RM) – Go to Artifacts. Locate the EPIP folder on the left side of the page and expand it to display patch folders. Each patch folder contains the requirements for the patch number shown in the folder name.
https://clm.rational.oit. DNS /rm/web#action=com.ibm.rdm.web.pages.showFoundationProjectDashboard&componentURI=https://clm.rational.oit. DNS /rm/rm-projects/_bw5xkH9pEeaGzLAkkVCH9g/components/_cW6BoH9pEeaGzLAkkVCH9g
BCMA (QM) – Go to Planning, then Browse Test Plans, and then search for the Master Test Plan you need. The Master Test Plan and test cases are linked to requirements.
https://clm.rational.oit. DNS /qm/web/console/BCMA%20(QM)#action=com.ibm.rqm.planning.home.actionDispatcher&subAction=viewUserHome
BCMA (CM) – Go to Plans, then All Plans, and then search for the Sprint Plan you need. The user stories in each Plan are linked to requirements and test cases.
https://clm.rational.oit. DNS /ccm/web/projects/BCMA%20(CM)#action=com.ibm.team.dashboard.viewDashboard
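As a rough illustration of the RM-QM-CM traceability described above (not an actual Rational API integration), the sketch below checks that every requirement has at least one linked test case and at least one linked user story, given link data exported into plain Python structures. The requirement, test case, and user story identifiers are hypothetical.

```python
# Illustrative sketch only: a generic traceability check over exported link
# data. It does not call any Rational (RM/QM/CM) API; the dictionaries below
# stand in for requirement-to-test-case and requirement-to-user-story links.

def untraced_requirements(requirements, req_to_tests, req_to_stories):
    """Return requirements missing a linked test case or user story."""
    gaps = {}
    for req in requirements:
        missing = []
        if not req_to_tests.get(req):
            missing.append("test case")
        if not req_to_stories.get(req):
            missing.append("user story")
        if missing:
            gaps[req] = missing
    return gaps

# Hypothetical example data:
requirements = ["REQ-001", "REQ-002"]
req_to_tests = {"REQ-001": ["TC-101"], "REQ-002": ["TC-102"]}
req_to_stories = {"REQ-001": ["US-11"]}  # REQ-002 has no linked user story

print(untraced_requirements(requirements, req_to_tests, req_to_stories))
# {'REQ-002': ['user story']}
```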
Section 508 Compliance Coverage
Section 508 test results will be reported to VA in the following documents:
EPIP_VASection508_Compliance_Test_Results_(PSS_1.0_225,_PSB_3.0_103)
EPIP_VASection508_Intake_Document_(PSS_1.0_225,_PSB_3.0_103)
EPIP_VASection508_Verifiable_Objective_Evidence_(PSS_1.0_225,_PSB_3.0_103)
Suggested Actions
Leidos recommends moving this patch to IOC testing.
Defect Severity and Priority Definitions
The classification of defects within a system examines both the severity and priority of the defect.
Severity is a measure of how great the impact is on the user’s ability to complete the documented actions within the system.
Priority determines the speed with which a given defect must be repaired.
Defect classification may be determined either because testing is delayed by a failure in the system or because a cumbersome workaround prevents a user from completing the assigned tasks. Both severity and priority measures must be recorded when scheduling defect resolution tasks.
Defect Severity Level
The following subsections identify the defect severity levels.
Severity Level 1 – Critical
Institute of Electrical and Electronics Engineers (IEEE) definition: The defect results in the failure of the complete software system, of a subsystem, or of a software unit (program or module) within the system.
Any defect that compromises patient safety or system security. Examples of system security defects include breach of confidentiality requirements of the Privacy Act, the Health Insurance Portability and Accountability Act (HIPAA), or Federal Tax Information guidelines.
Loss of system functionality critical to user operations with no suitable workaround, i.e., there is no way to achieve the expected results using the application.
System crash or hang that prevents further testing or operation of the complete application or a section of the application.
Any defect that causes corruption of data as a result of the system (as opposed to user error).
Any defect in which inappropriate transmissions are consistently generated or appropriate transmissions of HL7 messages fail to be generated.
Loss of functionality resulting in erroneous eligibility/enrollment determinations or communications not being sent.
Severity Level 2 - High
IEEE definition: The defect results in the failure of the complete software system, of a subsystem, or of a software unit (program or module) within the system. There is no way to make the failed component(s) function. However, there are acceptable processing alternatives which will yield the desired result.
A major defect in the functionality that does not result in corruption of data.
A major defect in the functionality resulting in a failure of all or part of the application, where:
The expected results can temporarily be achieved by alternate means. The customer indicates the workaround is acceptable for the short term.
Any defect that does not conform to Section 508 standards.
Any defect that results in inaccurate or missing requirements.
Any defect that results in invalid authentication or authentication of an invalid end user.
Severity Level 3 - Medium
IEEE definition: The defect does not result in a failure, but causes the system to produce incorrect, incomplete, or inconsistent results, or the defect impairs the system’s usability.
Minor functionality is not working as intended and a workaround exists but is not suitable for long-term use.
The inability of a valid user to access the system consistent with granted privileges.
Typographical or grammatical errors in the application, including installation guides, user guides, training manuals, and design documents.
Any defect producing cryptic, incorrect, or inappropriate error messages.
Any defect that results from the use of non-standard data terminology in the application or documentation, as defined by the Department of Veterans Affairs.
Cosmetic issues that are important to the integrity of the product, but do not result in data entry and/or data quality problems.
Severity Level 4 - Low
IEEE definition: The defect does not cause a failure, does not impair usability, and the desired processing results are easily obtained by working around the defect.
Minor loss of, or defect in, the functionality where a workaround suitable for long-term use exists.
Low-level cosmetic issues.
Priority Classifications
The following subsections identify the appropriate actions for defects at each priority level, per IEEE definitions.
Priority 1 - Resolve Immediately
Further development and/or testing cannot occur until the defect has been repaired. The system cannot be used until the repair has been effected.
Priority 2 - Give High Attention
The defect must be resolved as soon as possible because it is impairing development and/or testing activities. System use will be severely affected until the defect is fixed.
Priority 3 - Normal Queue
The defect should be resolved in the normal course of development activities. It can wait until a new build or version is created.
Priority 4 - Low Priority
The defect is an irritant that should be repaired, but can be repaired after more serious defects have been fixed.
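Purely as an illustration of the classification scheme defined above (the project records these values in Rational QM, not in code), the sketch below encodes the four severity levels and four priority classifications as Python enums so that a defect record can carry both measures, as required when scheduling defect resolution tasks. The names are assumptions chosen for readability.

```python
# Illustrative sketch only: the severity levels and priority classifications
# defined above, encoded as enums. EPIP tracks these values in Rational QM;
# this code exists purely to make the two independent measures concrete.
from enum import IntEnum

class Severity(IntEnum):
    CRITICAL = 1  # Severity Level 1 - Critical
    HIGH = 2      # Severity Level 2 - High
    MEDIUM = 3    # Severity Level 3 - Medium
    LOW = 4       # Severity Level 4 - Low

class Priority(IntEnum):
    RESOLVE_IMMEDIATELY = 1  # Priority 1 - Resolve Immediately
    GIVE_HIGH_ATTENTION = 2  # Priority 2 - Give High Attention
    NORMAL_QUEUE = 3         # Priority 3 - Normal Queue
    LOW_PRIORITY = 4         # Priority 4 - Low Priority

# The test analyst assigns severity; the development manager assigns priority.
# Both measures are recorded together when scheduling defect resolution.
example = {"severity": Severity.MEDIUM, "priority": Priority.NORMAL_QUEUE}
print(example)
```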
Optional Tables, Charts, and Graphs
None.
Document Approval Signatures

Signed: _______________________________________________________________________
Program/Project Manager                                                    Date

Signed: _______________________________________________________________________
Business Sponsor Representative                                            Date

Signed: _______________________________________________________________________
Test Lead                                                                  Date

Appendix A - Test Execution Log
The Test Execution Records for PSS*1.0*225 and PSB*3.0*103 are included in the EPIP PSS*1.0*225 and PSB*3.0*103 Master Test Plan.

Appendix B – Defect Log
No defects were found during testing of PSS*1.0*225 and PSB*3.0*103.
SQA Defect ID | Affected Screen | Affected Field | Observed Behavior | Severity | Description
---|---|---|---|---|---
N/A | N/A | N/A | N/A | N/A | No defects were found during Unit Testing of version 1.0.
N/A | N/A | N/A | N/A | N/A | No defects were found during Component Integration/System Testing of version 1.0.
N/A | N/A | N/A | N/A | N/A | No defects were found during Regression Testing of version 1.0.
N/A | N/A | N/A | N/A | N/A | No defects were found during Functional Testing of version 1.0.