Produced by Araxis Merge on 5/8/2017 10:03:14 PM Eastern Daylight Time. See www.araxis.com for information about Merge.
| # | Location | File | Last Modified |
|---|---|---|---|
| 1 | var-utility-web-Release-1.0.0-Branch.zip\MHED_VAR_DOCS | Utility+Tool+-+Master+Test+Plan.docx | Mon May 8 15:44:48 2017 UTC |
| 2 | var-utility-web-Release-1.0.0-Branch.zip\MHED_VAR_DOCS | Utility+Tool+-+Master+Test+Plan.docx | Tue May 9 01:10:21 2017 UTC |
Differences between Files 1 and 2:

| Description | Text Blocks | Lines |
|---|---|---|
| Unchanged | 5 | 1096 |
| Changed | 3 | 6 |
| Inserted | 0 | 0 |
| Removed | 1 | 1 |
| Option | Setting |
|---|---|
| Whitespace | |
| Character case | Differences in character case are significant |
| Line endings | Differences in line endings (CR and LF characters) are ignored |
| CR/LF characters | Not shown in the comparison detail |
No regular expressions were active.
| 1 | Utility Tool - Master Test Plan | ||
| 2 | Please Note: Utility is a Tool, not an application, and may not need full testing as required by usual applications. The master test plan below is based on requirements for full applications and may change for this Tool. | ||
| 3 | 1. Introduction | ||
| 4 | This Master Test Plan details a systematic approach to testing the Utility Tool. The document outlines the test objectives, the roles and responsibilities of individuals involved with testing activities, test inclusions and exclusions, and the overall testing approach. It outlines the different types of testing that will be performed to test the product. The document also describes the tools that will be used to create automated test scripts and metrics, test deliverables, the schedule, and test environments. | ||
| 5 | 1.1. Purpose | ||
| 6 | The purpose of this Master Test Plan is to define a common understanding of the test approach and process that will be followed by the Teams developing this project. | ||
| 7 | 1.2. Test Objectives | ||
| 8 | This Master Test Plan supports the following objectives: | ||
| 9 | Identify the types of tests to be performed to validate product quality and production readiness | ||
| 10 | Identify the testing roles and responsibilities | ||
| 11 | Identify the tools to automate the majority of the test cases | ||
| 12 | Identify the test process to deliver a bug-free product for User Acceptance testing at the end of each Release Increment | ||
| 13 | Identify the test process to deliver a bug-free demonstrable product to Pre-Prod and Production environments at the end of each Release Increment (PSI) | ||
| 14 | Identify and define the rules to maintain a consistent test environment to execute the different types of testing | ||
| 15 | Define and manage Test Data that supports both the manual and automated test runs | ||
| 16 | Identify any potential risks that might impede testing of the App | ||
| 17 | Identify the process for creating and maintaining the Requirements Traceability Matrix (RTM) | ||
| 18 | Provide links to artifacts related to testing. | ||
| 19 | |||
| 20 | |||
| 21 | 1.3. Roles and Responsibilities | ||
| 22 | Table 1: Roles and Descriptions | ||
| 23 | Role | ||
| 24 | Description | ||
| 25 | Delivery (development) Team | ||
| 26 | Persons that build, unit test, and integration test the product/product component. | ||
| 27 | Program Manager | ||
| 28 | Person who has overall responsibility for the successful planning and execution of a project; person responsible for creating the Master Test Plan in collaboration with the Development Manager. | ||
| 29 | System Team | ||
| 30 | Persons responsible for creating and maintaining the Continuous Integration (CI) environments, including assisting developers in creating the necessary development workspace on their local machines. Also responsible for setting up the needed deployment planning and scripting for both integrated testing and production environments, with the necessary security and certification requirements addressed and regularly maintained. Targets product delivery, quality testing, feature development, and maintenance releases in order to improve reliability and security and enable faster development and deployment cycles. | ||
| 31 | Stakeholder | ||
| 32 | Person who holds a stake in a situation in which they may affect or be affected by the outcome. They are responsible for mitigating or fixing any impediments that may arise throughout the project life cycle. (Examples – identifying V&V team members, UAT team members, access for contractors to Pre-Prod or UAT environments, approval of requirements, etc.) | ||
| 33 | Test Lead | ||
| 34 | An experienced Test Analyst or member of the System Team who leads and coordinates activities related to all aspects of testing based on an approved Master Test Plan and schedule. Responsible for collecting the necessary test metrics and for making sure a quality product is delivered to the Customer. | ||
| 35 | Test team | ||
| 36 | Persons that are part of the Delivery or System Team responsible for creating and executing tests according to the acceptance criteria of a user story. They are responsible for creating the test infrastructure and providing the requirements to build the necessary test environments. In addition, they also create the test data that is required to validate the requirements. At times they might be responsible for assisting the UAT and V&V teams in UAT and V&V testing. | ||
| 37 | User Acceptance Test team (UAT) | ||
| 38 | Persons from the end user community who help in testing the product in a pre-prod/UAT environment (in MAE the current environment is SQA). | ||
| 39 | Verification and Validation (V&V) Testers | ||
| 40 | The V&V team sees applications from the transition from Software Quality Assurance (SQA) and development completion to the V&V Test Intake process, which includes a Functional Findings Review to verify that the application is ready to test; through Test Execution and Analysis, which includes Functional, 508, and Security testing; to Test Summary Reporting, which delivers all documentation of test analysis and results so that the application may proceed into final compliance review, Pilot/Initial Operating Capability (IOC), and National Release, or go back for a development remediation cycle. | ||
| 41 | Compliance Testers | ||
| 42 | The purpose of the VA compliance reviews throughout the mobile app lifecycle is to ensure that applications meet the standards of the VA. This includes accessibility, safety, privacy and security standards, and more. There are a total of 10 compliance bodies that review apps. | ||
| 43 | Product Owner | ||
| 44 | The Product Owner represents the stakeholders and is the voice of the customer. He/she is accountable for ensuring that the team delivers value to the business. The Product Owner writes (or has the team write) customer-centric items (typically user stories), ranks and prioritizes them, and adds them to the product backlog. | ||
| 45 | 1.4. Processes and References | ||
| 46 | The processes that guide the implementation of this Master Test Plan are: | ||
| 47 | Requirements Traceability Matrix (RTM) | ||
| 48 | Defect Metrics | ||
| 49 | Definition of Done | ||
| 50 | Test Preparation | ||
| 51 | Product Build | ||
| 52 | Independent Test and Evaluation | ||
| 53 | The references that support the implementation of this Master Test Plan are: | ||
| 54 | Section 508 Office Web Page | ||
| 55 | Privacy Impact Assessment - Privacy Service | ||
| 56 | Compliance SOP | ||
| 57 | 2. Items To Be Tested | ||
| 58 | The following is what will be tested and validated. | ||
| 59 | |||
| 60 | Test Item | ||
| 61 | Type | ||
| 62 | Version | ||
| 63 | Utility Tool | ||
| 64 | Code | ||
| 65 | 1.0.0 | ||
| 66 | |||
| 67 | 2.1. Overview of Test Inclusions | ||
| 68 | The following components and features, and combinations of components and features, will be tested: | ||
| 69 | Utility Tool | ||
| 70 | SA can authenticate with VAMF Admin credentials | ||
| 71 | SA can select the VistA instances they are assigned to for updating, including multi-divisional VistA instances, to edit/update | ||
| 72 | Ability to update information in the system by VistA instance | ||
| 73 | Customized messages if the user is not registered | ||
| 74 | Customized messages if direct appointment scheduling is not available | ||
| 75 | Parameters for the length of time to search historical appointments | ||
| 76 | Clinic contact information | ||
| 77 | 2.2. Overview of Test Exclusions | ||
| 78 | The following components and features, and combinations of components and features, will not be tested: | ||
| 79 | All components, features, and combinations of components/features that are delivered will be tested at this time. | ||
| 80 | 3. Test Approach | ||
| 81 | There are three levels of testing that comprise the Testing Approach: Unit Testing, Integration Testing, and System Testing. All three levels of testing include test scripts that are executed in an automated fashion. Specific tools are used to automate the tests at each level. All three levels of testing will be performed based on the requirements of the User Story and the defined acceptance criteria. In order for a User Story to meet the acceptance criteria, every User Story must have completed at least one level of test coverage. The only exception to this is stories that are not testable (for example, a User Story that deals with documentation). | ||
| 82 | 3.1. Product Component Test | ||
| 83 | All components of the applications will be thoroughly unit tested by the Development Team. Developers are responsible for creating unit tests for the components that they build. A User Story includes tasks for unit test creation and unit testing activities. Developers are responsible for monitoring the unit test build after the code and related unit tests are checked in. If a developer's code check-in breaks a build, the developer must notify the delivery team(s) that a fix is underway to address the build server breakage. If the build server failure cannot be resolved within 15 minutes, the developer must take action to revert the code and get the build server running green again. Care must be taken not to check in on a broken build. | ||
| 84 | JUnit will be used to develop and execute unit tests to validate the backend. | ||
| 85 | Jasmine will be used to develop and execute unit tests to validate the frontend. | ||
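For illustration only, a minimal backend unit test in JUnit might follow the shape below. The message-building method is a hypothetical stand-in for a Utility Tool component, defined inline so the example is self-contained; it is not the project's actual code.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

/**
 * Minimal sketch of a backend unit test using JUnit 4.
 * The logic under test is a hypothetical stand-in, defined inline,
 * for a component that builds the "user not registered" message.
 */
public class UnregisteredUserMessageTest {

    /** Hypothetical backend logic: build the customized message. */
    static String unregisteredMessage(String facilityId) {
        return "You are not registered at facility " + facilityId + ".";
    }

    @Test
    public void buildsCustomizedMessageForUnregisteredUser() {
        // The expected text would come from the user story's acceptance criteria.
        assertEquals("You are not registered at facility 500.",
                unregisteredMessage("500"));
    }
}
```

An equivalent Jasmine spec would cover the same behavior on the frontend.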
| 86 | 3.2. Component Integration Test | ||
| 87 | The purpose of the Product Integration Testing is to expose the defects in the interfaces and interaction between integrated components, as well as to verify installation instructions. | ||
| 88 | Developers are responsible for creating the integration tests for a user story, as appropriate, in support of the user story acceptance criteria. The User Story must include tasks for creating and executing Integration tests. As part of the Code Review, integration tests must also be reviewed for proper coverage. Developers are responsible for monitoring the integration test builds after the code and related integration tests are checked in. If a check-in breaks a build, the responsible developer must notify the development team that a fix is on the way, or action must be taken to revert the code if a quick fix cannot be checked in. It is advisable not to check into a red build. | ||
| 89 | Integration Test tool – JUnit will be used to write the integration tests. | ||
| 90 | Below are some of the scenarios that will be validated as part of Integration testing (an illustrative sketch follows the list): | ||
| 91 | |||
| 92 | - Test data saved correctly in MongoDB | ||
| 93 | - Test data displays correctly in the UI | ||
| 94 | - Retrieving data from the REST calls | ||
| 95 | - Updating Mock services without interaction with the front end | ||
| 96 | - Resource integration to test endpoints | ||
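The sketch below illustrates the first scenario (data round-tripping through MongoDB) as a JUnit integration test. The connection string, database name ("utility_test"), collection name ("clinic_contacts"), and field names are assumptions made for illustration, not the project's actual configuration.

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import static com.mongodb.client.model.Filters.eq;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNotNull;

/**
 * Sketch of a JUnit integration test verifying that test data written to
 * MongoDB can be read back unchanged. Names and connection details are
 * illustrative assumptions.
 */
public class ClinicContactMongoIT {

    private MongoClient client;
    private MongoCollection<Document> contacts;

    @Before
    public void connect() {
        // Assumes a MongoDB instance is reachable in the local dev environment.
        client = MongoClients.create("mongodb://localhost:27017");
        contacts = client.getDatabase("utility_test").getCollection("clinic_contacts");
    }

    @After
    public void cleanUp() {
        // Remove the test data and close the connection so runs stay repeatable.
        contacts.deleteMany(eq("vistaInstance", "TEST500"));
        client.close();
    }

    @Test
    public void savedClinicContactCanBeReadBack() {
        // Save a clinic contact record, as the app would for a VistA instance.
        contacts.insertOne(new Document("vistaInstance", "TEST500")
                .append("phone", "555-0100"));

        // Read it back and verify the stored value is unchanged.
        Document found = contacts.find(eq("vistaInstance", "TEST500")).first();
        assertNotNull(found);
        assertEquals("555-0100", found.getString("phone"));
    }
}
```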
| 97 | 3.3. System Tests | ||
| 98 | Delivery Team Developers and Testers are responsible for creating the system tests. System tests exercise the entire system as one entity across all functional feature components. End-to-end testing is validated by system tests. Testers work with developers, architects, UAT Testers, and V&V personnel to define the tests executed under system testing. System tests include functional, non-functional, performance, and security testing. System testing includes end-to-end integration testing of the interfaces between the front-end and backend calls. | ||
| 99 | UI Testing verifies that the user interface is working as expected and meeting the criteria defined in the requirement documents. | ||
| 100 | Smoke testing is performed to ensure that the system is stable enough and ready for the test analyst to perform system testing. | ||
| 101 | Automated Regression Testing is performed with each enhancement/bug fix to verify that existing functionality is working as expected. | ||
| 102 | Complete Regression testing is performed at the completion of increment development. | ||
| 103 | Accessibility Testing is verification testing for users with disabilities. | ||
| 104 | The components and functionality listed in Section 2.1 will be tested during system testing. | ||
| 105 | All defects found during System Testing are reported and documented as a JIRA Bug Issue Type in the project related to the app. | ||
| 106 | |||
| 107 | 3.4. Functional Tests | ||
| 108 | Testers from each scrum team perform functional testing. Testers will work with Analysts and Product Owners to understand the acceptance criteria defined for each user story. Testers will write test steps to validate the acceptance criteria. User Stories will have tasks defined for Quality Assurance (QA) activities. All tests are checked in and must be passing on the build server at the end of a Sprint (meeting the Definition of Done). A User Story must not be closed if the acceptance tests are not passing. | ||
| 109 | Test Data – The testers identify the test data required for testing the acceptance criteria, and with help from the development team the test data are created and staged both in the local dev environments and in other environments, such as the demo and CI environments, that execute the acceptance tests. | ||
| 110 | Clear documentation of the test data used is maintained on the wiki. | ||
| 111 | |||
| 112 | Test Environment – Acceptance test creation and validation are performed in the local dev environment. Acceptance tests are tagged to be run | ||
| 113 | |||
| 114 | Test Tools – | ||
| 115 | Functional Test Tool – Selenium/Watir will be used to develop and execute functional tests. | ||
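A minimal Selenium WebDriver sketch in Java is shown below for illustration. The URL and element locators are placeholders, not the Utility Tool's actual pages; the real acceptance tests could equally be written with Watir, as the plan allows.

```java
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

import static org.junit.Assert.assertTrue;

/**
 * Sketch of a Selenium WebDriver functional test. The target URL and
 * element IDs are hypothetical placeholders used only to show the pattern.
 */
public class LoginFunctionalTest {

    private WebDriver driver;

    @Before
    public void openBrowser() {
        driver = new FirefoxDriver();
    }

    @After
    public void closeBrowser() {
        driver.quit();
    }

    @Test
    public void adminCanAuthenticateAndSeeAssignedVistaInstances() {
        // Navigate to the (placeholder) login page.
        driver.get("https://example.test/utility-tool/login");

        // Fill in credentials and submit; the locator IDs are assumptions.
        driver.findElement(By.id("username")).sendKeys("va-admin");
        driver.findElement(By.id("password")).sendKeys("test-password");
        driver.findElement(By.id("login-button")).click();

        // Verify the page lists the VistA instances assigned to the SA.
        assertTrue(driver.findElement(By.id("vista-instance-list")).isDisplayed());
    }
}
```

A Watir equivalent would drive the same flow from Ruby.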
| 116 | |||
| 117 | Test Data – Please refer to this wiki page for the test data requirements | ||
| 118 | |||
| 119 | Testing Tools | ||
| 120 | Tool | ||
| 121 | Description | ||
| 122 | Selenium/Watir | ||
| 123 | Automated Test Tool | ||
| 124 | Devices for Manual Testing | ||
| 125 | iPhone, Android (small device) | ||
| 126 | Windows 7 laptop (larger device) | ||
| 127 | Voice Over | ||
| 128 | Section 508 Testing | ||
| 129 | |||
| 130 | 3.4.2. User Functionality Test | ||
| 131 | The VA Stakeholder will conduct User Functionality Testing in the MAE Demo environment or SQA environment. All issues reported by the stakeholders will be entered in Jira. | ||
| 132 | 3.5. Enterprise System Engineering Testing | ||
| 133 | The VA Project Team will assist in triaging any issues found during Enterprise System Engineering (ESE) Testing of apps to the development team. The ESE compliance body will be conducting the testing. Issues that are determined to be valid will be added to the JIRA backlog to be fixed. | ||
| 134 | 3.6. Initial Operating Capability Evaluation | ||
| 135 | The Development Team will conduct the Initial Operating Capability (IOC) Evaluation per the contract requirements with support from the VA Project Team. At the conclusion of the IOC testing, the Development Team will correct any defects identified as required for National Release. | ||
| 136 | 4. Testing Techniques | ||
| 137 | 4.1. Build Verification Testing | ||
| 138 | Also known as a Build Acceptance Test, this is a set of tests run on each new build of a product to verify that the build is stable before it is made available to the rest of the development team to deploy to their local environments, and also to deploy to external environments such as the demo environment. | ||
| 139 | This testing is performed automatically by various build jobs that are triggered when code is committed. Some of the build jobs that validate the checked-in code are the unit-test build, the integration-test build, the smoke test, and the acceptance-test build. | ||
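One way the smoke suite could be kept separate from the slower jobs is to mark smoke tests with a JUnit 4 category, as in the sketch below. The marker interface, the trivial test body, and the job wiring are illustrative assumptions, not the project's actual configuration.

```java
import org.junit.Test;
import org.junit.experimental.categories.Category;
import static org.junit.Assert.assertTrue;

/**
 * Sketch of tagging a smoke test with a JUnit 4 category so a dedicated
 * build job can run only the smoke suite. The marker interface and test
 * body are hypothetical.
 */
public class ApplicationSmokeTest {

    /** Marker interface used to group smoke tests. */
    public interface SmokeTest { }

    @Category(SmokeTest.class)
    @Test
    public void applicationStartsAndRespondsToHealthCheck() {
        // A real smoke test would hit a health-check endpoint of the deployed
        // build; a trivial assertion stands in for that call here.
        assertTrue(true);
    }
}
```

A CI job can then select only tests in this category (for example, through the build tool's JUnit category/group filtering) so the smoke run stays fast.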
| 140 | 4.2. Integration Testing | ||
| 141 | Integration tests are created by Developers to test the backend code. The tests also validate the backend REST calls without accessing the front-end code. Integration tests are written to validate that certain functionality of the app correctly updated the database. | ||
| 142 | Tools – JUnit is used to create the integration tests. | ||
| 143 | 4.3. Enterprise Testing | ||
| 144 | The VA Compliance groups will complete testing and reviews to cover the Enterprise requirements. This will be conducted in parallel with, or after, V&V testing. | ||
| 145 | 4.3.1. Security Testing | ||
| 146 | The Security Assessment Team will verify that all security requirements have been met as part of their review of the Utility Tool. | ||
| 147 | 4.3.2. Privacy Testing | ||
| 148 | The Privacy Office will complete a review of the application to ensure that veteran and employee data are adequately protected and the apps comply with the Privacy and Security Rule provisions of the Health Insurance Portability and Accountability Act (HIPAA). | ||
| 149 | 4.3.3. Section 508 Compliance Testing | ||
| 150 | The Development Team will ensure that the product functionality is usable from the keyboard by completing 508 testing as part of the manual testing process. The Section 508 Program Office is responsible for performing independent compliance testing with assistive technology. 508 testing is also performed on devices like Android and iPhone using voice over. | ||
| 151 | 4.3.4. Multi-Divisional Testing | ||
| 152 | Multi-divisional testing is not applicable for the Utility Tool. | ||
| 153 | 4.4. Risk-based Testing | ||
| 154 | Risk-based testing is a technique for prioritizing testing based on testing the highest-risk items first and continuing to test down the risk prioritization ladder as the testing schedule permits. | ||
| 155 | The following link logs the risk items for the Utility Tool: Risk Log | ||
| 156 | 4.5. Test Types | ||
| 157 | Test types are groups of test activities aimed at testing a component or system with regard to one or more interrelated quality attributes. A test type is focused on a specific test objective (e.g., reliability test, usability test, regression test) and may take place on one or more test levels or test phases. Please see the test types to be performed and the party responsible for performing each test. | ||
| 158 | Table 2: Test Types | ||
| 159 | Test Types | ||
| 160 | Party Responsible | ||
| 161 | Access control testing | ||
| 162 | Development Team | ||
| 163 | VA SMEs | ||
| 164 | V&V | ||
| 165 | Compliance testing | ||
| 166 | VA Compliance Groups | ||
| 167 | Component integration testing | ||
| 168 | Development Team | ||
| 169 | Installation testing | ||
| 170 | Development Team | ||
| 171 | Integration testing | ||
| 172 | Development Team | ||
| 173 | Privacy testing | ||
| 174 | VA Compliance Groups | ||
| 175 | Product component testing | ||
| 176 | Development Team | ||
| 177 | Regression test | ||
| 178 | Development Team | ||
| 179 | Risk based testing | ||
| 180 | Development Team | ||
| 181 | V&V | ||
| 182 | VA SMEs | ||
| 183 | Section 508 compliance testing | ||
| 184 | Development Team | ||
| 185 | VA Section 508 Team | ||
| 186 | Security testing | ||
| 187 | VA Compliance Groups | ||
| 188 | Smoke testing | ||
| 189 | Development Team (Integration Environment Install) | ||
| 190 | Usability testing | ||
| 191 | VA Human Factors Team | ||
| 192 | User Functionality Testing | ||
| 193 | Development Team | ||
| 194 | VA SMEs | ||
| 195 | V&V | ||
| 196 | User interface testing | ||
| 197 | VA Human Factors Team | ||
| 198 | |||
| 199 | 4.6. Productivity and Support Tools | ||
| 200 | Table 3 describes the tools that will be employed to support this Master Test Plan. | ||
| 201 | Table 3: Tool Category or Types | ||
| 202 | Tool Category or Type | ||
| 203 | Tool Brand Name | ||
| 204 | Vendor or In-house | ||
| 205 | Version | ||
| 206 | Test Management | ||
| 207 | Atlassian JIRA | ||
| 208 | Vendor | ||
| 209 | |||
| 210 | Defect Tracking | ||
| 211 | Atlassian JIRA | ||
| 212 | Vendor | ||
| 213 | |||
| 214 | Test Coverage Monitor or Profiler | ||
| 215 | N/A | ||
| 216 | N/A | ||
| 217 | N/A | ||
| 218 | Project Management | ||
| 219 | Atlassian/Wiki | ||
| 220 | Vendor | ||
| 221 | |||
| 222 | Performance Testing | ||
| 223 | JMeter | ||
| 224 | In-House | ||
| 225 | N/A | ||
| 226 | Configuration Management | ||
| 227 | Atlassian/Wiki | ||
| 228 | Stash | ||
| 229 | Vendor | ||
| 230 | |||
| 231 | DBMS tools | ||
| 232 | MongoDB | ||
| 233 | Oracle | ||
| 234 | Vendor | ||
| 235 | |||
| 236 | Functional Test Automation | ||
| 237 | Selenium/Watir | ||
| 238 | Jenkins | ||
| 239 | Vendor | ||
| 240 | |||
| 241 | Unit test/Integration test | ||
| 242 | Jasmine/JUnit | ||
| 243 | In-House | ||
| 244 | N/A | ||
| 245 | |||
| 246 | 5. Test Criteria | ||
| 247 | 5.1. Process Reviews | ||
| 248 | The Master Test Plan undergoes two reviews: | ||
| 249 | Peer Review – upon completion of the Master Test Plan | ||
| 250 | Formal Review – after the Development Manager approves the Master Test Plan | ||
| 251 | 5.2. Pass/Fail Criteria | ||
| 252 | Pass/Fail criteria are decision rules used to determine whether a test item (function) or feature has passed or failed a test. | ||
| 253 | Team | ||
| 254 | Test Item Pass/Fail Criteria | ||
| 255 | Development Team | ||
| 256 | Meets User Story Acceptance Criteria | ||
| 257 | Manual Test Script for User Story - All Steps Pass | ||
| 258 | VA SMEs | ||
| 259 | Customer Requirements Met | ||
| 260 | 5.3. Suspension and Resumption Criteria | ||
| 261 | Suspension Criteria are the criteria used to (temporarily) stop all or a portion of the testing activities on the test items. Resumption Criteria are the testing activities that must be repeated when testing is re-started after a suspension. This will apply to the IOC testing that occurs in the production environment. | ||
| 262 | Suspension Criteria | ||
| 263 | 1. Allowing patients to create requests or appointments even if they are not registered in a facility. | ||
| 264 | 2. System allowing overbooking of appointments. | ||
| 265 | 3. Booking data not retrieved from the external source for some reason. | ||
| 266 | 4. Retrieving data from VistA takes longer than the accepted time frame. | ||
| 267 | |||
| 268 | Resumption Criteria | ||
| 269 | A defect is logged for the issue, and after the defect has been resolved and verified, all tests or acceptance criteria referenced in the defect must be rerun to make sure no new issues have been introduced. Rerun tests that are related to the fix or to features that might be affected by the fix. | ||
| 270 | 5.4. Acceptance Criteria | ||
| 271 | Acceptance criteria are decision rules that a component or system must satisfy in order to be accepted by a user, customer, or other authorized entity. | ||
| 272 | All tests are running green on the CI server | ||
| 273 | All tests meet the acceptance criteria of the story/feature | ||
| 274 | All 508 testing is complete and results documented and accepted by VA | ||
| 275 | Performance testing has been completed and results documented and accepted by VA | ||
| 276 | All critical bugs have been resolved and validated | ||
| 277 | A feature has been validated by the V&V and UAT teams | ||
| 278 | Sign-off by the end user on the feature delivered | ||
| 279 | The criteria below determine if the application satisfies the acceptable level of quality: | ||
| 280 | Testing Activity | ||
| 281 | Acceptance Criteria | ||
| 282 | V&V/Compliance Reviews | ||
| 283 | Approval to move forward by V&V/Compliance Teams to UAT or IOC | ||
| 284 | Severity 1 and 2 Defects (agreed upon with VA SMEs) fixed in Remediation Cycle 1 | ||
| 285 | User Acceptance Testing (UAT) | ||
| 286 | Approval to move forward by VA SMEs to IOC | ||
| 287 | Severity 1 and 2 Defects (agreed upon with VA SMEs) fixed in Remediation Cycle 1 | ||
| 288 | Limited Field Testing (IOC) | ||
| 289 | Approval by VA to roll out the app to additional sites for full production release | ||
| 290 | Severity 1 and 2 Defects identified during IOC (agreed upon with VA SMEs) fixed before full production release | ||
| 291 | 6. Test Deliverables | ||
| 292 | Table 4 lists the test deliverables for the CMS project. | ||
| 293 | Table 4: Test Deliverables | ||
| 294 | Test Deliverables | ||
| 295 | Responsible Party | ||
| 296 | Master Test Plan | ||
| 297 | Development Team | ||
| 298 | Test Execution Risks (Part of Master Test Plan) | ||
| 299 | Development Team | ||
| 300 | Test Schedule | ||
| 301 | Project Manager(s) | ||
| 302 | V&V/Compliance Teams | ||
| 303 | Test Cases/Test Scripts | ||
| 304 | Development Team | ||
| 305 | V&V/Compliance Teams | ||
| 306 | Test Data | ||
| 307 | Test Data Team | ||
| 308 | Test Environment | ||
| 309 | MIS Team | ||
| 310 | Test Evaluation Summaries | ||
| 311 | Development Team | ||
| 312 | Traceability Report or Matrix | ||
| 313 | Development Team | ||
| 314 | 7. Test Schedule | ||
| 315 | The test schedule is set and maintained by the VA and other groups performing testing activities. List the major testing milestones. When appropriate, reference other workflow documentation or tools, such as the Project Management Plan or Work Breakdown Structure (WBS). Put a minimum amount of process and planning information within the Master Test Plan in order to facilitate ongoing maintenance of the test schedule. | ||
| 316 | Test Schedule: TBD | ||
| 317 | Table 5: Testing Milestones | ||
| 318 | Testing Milestones | ||
| 319 | Responsible Party | ||
| 320 | Dates | ||
| 321 | Development/Pre-V&V | ||
| 322 | Development Team | ||
| 323 | MIS Team | ||
| 324 | V&V/Compliance Teams/Stakeholders | ||
| 325 | TBD | ||
| 326 | V&V | ||
| 327 | V&V | ||
| 328 | TBD | ||
| 329 | UAT | ||
| 330 | Development Teams | ||
| 331 | UAT | ||
| 332 | TBD | ||
| 333 | Final Compliance | ||
| 334 | V&V/Compliance Teams | ||
| 335 | TBD | ||
| 336 | IOC | ||
| 337 | Development Team | ||
| 338 | Evaluation Sites | ||
| 339 | TBD | ||
| 340 | 8. Test Environments | ||
| 341 | A test environment is an environment containing hardware, instrumentation, simulators, software tools, and other support elements needed to conduct a test. | ||
| 342 | Stakeholders will perform testing in the Demo environment during development. | ||
| 343 | V&V and Compliance groups will perform testing in the SQA environment for their testing activities. | ||
| 344 | VA SMEs, with assistance from the Development Team, will conduct UAT in the Demo or SQA Environment. | ||
| 345 | IOC testing will be conducted in the Production Environment at a limited number of sites. | ||
| 346 | 8.1. Test Environment Configurations | ||
| 347 | Successful testing requires control of the test environment. Unplanned changes to the test environment may introduce new defects, alter the expected test results, and thus invalidate the test cases. Successful testing requires controlled access to the test environment, an environment that replicates the production environment as closely as possible. | ||
| 348 | Any changes to the SQA, Pre-production, and Production environments are controlled by the CCB. The installation of, and any updates to, the Utility Tool in these environments will require tickets opened in JIRA and approval from the CCB. | ||
| 349 | 8.2. Base System Hardware | ||
| 350 | The test system (SQA) will simulate the production environment as closely as possible, scaling down the concurrent access, database size, and so forth, if and where appropriate. The SQA environment is maintained and managed by the MIS team. | ||
| 351 | 8.3. Base Software Elements in the Test Environments | ||
| 352 | |||
| 353 | Table 7 describes the base software elements that are required in the test environment for this Master Test Plan. | ||
| 354 | Table 7: Software Elements | ||
| 355 | Software Element Name | ||
| 356 | Version | ||
| 357 | Type and Other Notes | ||
| 358 | Windows 7 | ||
| 359 | 7 | ||
| 360 | Operating System | ||
| 361 | Internet Explorer | ||
| 362 | 11 | ||
| 363 | Internet Browser | ||
| 364 | Firefox | ||
| 365 | 31 | ||
| 366 | Internet Browser | ||
| 367 | Safari | ||
| 368 | 7 | ||
| 369 | Internet Browser | ||
| 370 | Android | ||
| 371 | |||
| 372 | OS version | ||
| 373 | iPhone | ||
| 374 | 8+ | ||
| 375 | iOS version | ||
| 376 | NVDA | ||
| 377 | Latest | ||
| 378 | VO on Windows | ||
| 379 | VO | ||
| 380 | Built in | ||
| 381 | VO for iPhone | ||
| 382 | Talkback | ||
| 383 | Built in | ||
| 384 | VO for Android | ||
| 385 | |||
| 386 | 9. Staffing and Training Needs | ||
| 387 | Table 8 describes the personnel resources needed to plan, prepare, and execute this Master Test Plan. | ||
| 388 | Table 8: Staffing Resources | ||
| 389 | Testing Task | ||
| 390 | Quantity of Personnel Needed | ||
| 391 | Test Process | ||
| 392 | Duration/Days | ||
| 393 | Create the Master Test Plan | ||
| 394 | 2 | ||
| 395 | Test Preparation | ||
| 396 | 3 days | ||
| 397 | Establish the Development Test Environment | ||
| 398 | 1 | ||
| 399 | Test Preparation | ||
| 400 | 2 days | ||
| 401 | Perform System Tests | ||
| 402 | 1-2 | ||
| 403 | Product Build | ||
| 404 | 1 day | ||
| 405 | UAT | ||
| 406 | 10 | ||
| 407 | UAT | ||
| 408 | 10 days | ||
| 409 | IOC | ||
| 410 | TBD | ||
| 411 | IOC | ||
| 412 | 60 days | ||
| 413 | |||
| 414 | There are no training options required at this time for providing necessary skills to execute the test plan. | ||
| 415 | Training Needs | ||
| 416 | Name | ||
| 417 | Training Need | ||
| 418 | Training Option | ||
| 419 | Estimated Training Hours | ||
| 420 | N/A | ||
| 421 | N/A | ||
| 422 | N/A | ||
| 423 | N/A | ||
| 424 | |||
| 425 | 10. Risks and Constraints | ||
| 426 | Risks are documented and managed on the wiki Risk Log. | ||
| 427 | 11. Test Metrics | ||
| 428 | Metrics are a system of parameters or methods for the quantitative and periodic assessment of a process that is to be measured. | ||
| 429 | Test metrics may include, but are not limited to: | ||
| 430 | Number of test cases (pass/fail) | ||
| 431 | Percentage of test cases executed | ||
| 432 | Number of requirements and percentage tested | ||
| 433 | Percentage of test cases resulting in defect detection | ||
| 434 | Number of defects attributed to test case/test script creation | ||
| 435 | Percentage of defects identified, listed by cause and severity | ||
| 436 | The Development Team will work with the VA to determine which test metrics will be applicable for UAT Testing; these will be documented as part of the Test Summary. | ||
| 437 | Attachment A - Approval Signatures | ||
| 438 | The Master Test Plan documents the project's overall approach to testing and includes: | ||
| 439 | Items to be tested | ||
| 440 | Test strategy | ||
| 441 | Test criteria | ||
| 442 | Test deliverables | ||
| 443 | Test schedule | ||
| 444 | Test environments | ||
| 445 | Staffing and training needs | ||
| 446 | Risks and constraints | ||
| 447 | Test Metrics | ||
| 448 | |||
| 449 | This section is used to document the approval of the Master Test Plan during the Formal Review. The review should ideally be conducted face to face, where signatures can be obtained 'live' during the review; however, the following forms of approval are acceptable: (1) physical signatures obtained face to face or via fax; (2) digital signatures tied cryptographically to the signer; (3) /es/ in the signature block, provided that a separate digitally signed e-mail indicating the signer's approval is provided and kept with the document. | ||
| 450 | |||
| 451 | NOTE: Delete the entire section above prior to final submission. | ||
| 452 | |||
| 453 | REVIEW DATE: <date> | ||
| 454 | < Program/Project Manager > | ||
| 455 | |||
| 456 | ______________________________________________________________________________ | ||
| 457 | Signed: Date: | ||
| 458 | < Business Sponsor Representative > | ||
| 459 | |||
| 460 | |||
| 461 | ______________________________________________________________________________ | ||
| 462 | Signed: Date: | ||
| 463 | < Integrated Project Team (IPT) chair > | ||
| 464 | |||
| 465 | ______________________________________________________________________________ | ||
| 466 | Signed: Date: | ||
| 467 | < Enterprise Systems Engineering (ESE) Representative > | ||
| 468 | |||
| 469 | A. Test Type Definitions | ||
| 470 | Test analysts use "test types" to validate the system or application under test. This table presents a listing of possible test types and their definitions that may be utilized during the Product Build, Independent Testing, Operational Readiness Review (ORR), and Initial Operating Capability (IOC) Testing. | ||
| 471 | Test Type | ||
| 472 | Definition | ||
| 473 | Access Control Testing | ||
| 474 | A type of testing that attests that the target-of-test data (or systems) are accessible only to those actors for which they are intended, as defined by use cases. Access Control Testing verifies that access to the system is controlled and that unwanted or unauthorized access is prohibited. This test is implemented and executed on various targets-of-test. | ||
| 475 | Benchmark Testing | ||
| 476 | A type of performance testing that compares the performance of new or unknown functionality to a known reference standard (e.g., existing software or measurements). For example, benchmark testing may compare the performance of current systems with the performance of the Linux/Oracle system. | ||
| 477 | Build Verification Testing | ||
| 478 | (Prerequisite: Smoke Test) | ||
| 479 | A type of testing performed for each new build, comparing the baseline with the actual object properties in the current build. The output from this test indicates what object properties have changed or don't meet the requirements. Together with the Smoke test, the Build Verification test may be utilized by projects to determine if additional functional testing is appropriate for a given build or if a build is ready for production. | ||
| 480 | Business Cycle Testing | ||
| 481 | A type of testing that focuses upon activities and transactions performed end to end over time. This test type executes the functionality associated with a period of time (e.g., one week, month, or year). These tests include all daily, weekly, and monthly cycles, and events that are date-sensitive (e.g., end-of-the-month management reports, monthly reports, quarterly reports, and year-end reports). | ||
| 482 | Compliance Testing | ||
| 483 | A type of testing that verifies that a collection of software and hardware fulfills given specifications. For example, these tests will minimally include: "core specifications for rehosting - ver.1.5-draft 3.doc", Section 508 of The Rehabilitation Act Amendments of 1998, the Race and Ethnicity Test, and VA Directive 6102 Compliance. It does not exclude any other tests that may also come up. | ||
| 484 | Component Integration Testing | ||
| 485 | Testing performed to expose defects in the interfaces and interaction between integrated components, as well as verifying installation instructions. | ||
| 486 | Configuration Testing | ||
| 487 | A type of testing concerned with checking the program's compatibility with as many possible configurations of hardware and system software. In most production environments, the particular hardware specifications for the client workstations, network connections, and database servers vary. Client workstations may have different software loaded (for example, applications, drivers, and so on) and, at any one time, many different combinations may be active using different resources. The goal of the configuration test is finding a hardware combination that should be, but is not, compatible with the program. | ||
| 488 | Contention Testing | ||
| 489 | A type of performance testing that executes a test | ||
| 490 | Data and Database Integrity Testing | ||
| 491 | A type of testing that verifies that data is being stored by the system in a manner where the initial storage, updating, restoration, or retrieval processing does not compromise the data. This type of testing is intended to uncover design flaws that may result in data corruption, unauthorized data access, lack of data integrity across multiple tables, and lack of adequate transaction performance. The databases, data files, and the database or data file processes should be tested as a subsystem within the application. | ||
| 492 | Documentation Testing | ||
| 493 | Documentation testing is a type of testing that should validate the information contained within the software documentation set for the following qualities: compliance to accepted standards and conventions, accuracy, completeness, and usability. The documentation testing should verify that all of the required information is provided in order for the appropriate user to be able to properly install, implement, operate, and maintain the software application. The current VistA documentation set can consist of any of the following manual types: | ||
| 494 | Release Notes, Installation Guide, User Manuals, Technical Manual, and Security Guide. | ||
| 495 | Error Analysis Testing | ||
| 496 | This type of testing verifies that the application checks for input, detects invalid data, and prevents invalid data from being entered into the application. This type of testing also includes the verification of error logs and error messages that are displayed to the user. | ||
| 497 | Exploratory Testing | ||
| 498 | A technique for testing computer software that requires minimal planning and tolerates limited documentation for the target-of-test in advance of test execution, relying on the skill and knowledge of the tester and feedback from test results to guide the ongoing test effort. Exploratory testing is often conducted in short sessions in which feedback gained from one session is used to dynamically plan subsequent sessions. | ||
| 499 | Failover Testing | ||
| 500 | A type of testing that ensures an alternate or backup system properly "takes over" (i.e., a backup system functions when the primary system fails). Failover Testing also tests that a system continually runs when the failover occurs, and that the failover happens without any loss of data or transactions. Failover Testing should be combined with Recovery Testing. | ||
| 501 | Installation Testing | ||
| 502 | A type of testing that verifies that the application or system installs as intended on different hardware and software configurations, and under different conditions (e.g., a new installation, an upgrade, and a complete or custom installation). Installation testing may also measure the ease with which an application or system can be successfully installed, typically measured in terms of the average amount of person-hours required for a trained operator or hardware engineer to perform the installation. Part of this installation test is to perform an uninstall. As a result of this uninstall, the system, application, and database should return to the state prior to the install. | ||
| 503 | Integration Testing | ||
| 504 | An incremental series of tests of combinations or sub-assemblies of selected components in an overall system. Integration testing is incremental, with successively larger and more complex combinations of components tested in sequence, proceeding from the unit level (0% integration) to eventually the full system test (100% integration). | ||
| 505 | Load Testing | ||
| 506 | A performance test that subjects the system to varying workloads in order to measure and evaluate the performance behaviors and the ability of the system to continue to function properly under these different workloads. Load testing determines and ensures that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates the performance characteristics (e.g., response times, transaction rates, and other time-sensitive issues). | ||
| 507 | Migration Testing | ||
| 508 | A type of testing that follows standard VistA and HeV-VistA operating procedures and loads the latest .jar version onto a live copy of VistA and HeV-VistA. The following are examples of the types of tests that can be performed as part of migration testing: | ||
| 509 | Data conversion has been completed | ||
| 510 | Data tables are successfully created | ||
| 511 | Parallel test for confirmation of data integrity | ||
| 512 | Review output report, before and after migration, to confirm data integrity | ||
| 513 | Run equivalent process, before and after migration | ||
| 514 | Multi-Divisional Testing | ||
| 515 | A type of testing that ensures that all applications will operate in a multi-division or multi-site environment, recognizing an enterprise perspective while fully supporting local health care delivery. | ||
| 516 | Parallel Testing | ||
| 517 | The same internal processes are run on the existing system and the new system. The existing system is considered the "gold standard", unless proven otherwise. The feedback (expected results, defined time limits, data extracts, etc.) from processes on the new system is compared to the existing system. Parallel testing is performed before the new system is put into a production environment. | ||
| 518 | Performance Monitoring Testing | ||
| 519 | Performance profiling assesses how a system is spending its time and consuming resources. This type of performance testing optimizes the performance of a system by measuring how much time and resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. The goal of performance profiling is to optimize the feature and application performance. | ||
| 520 | Performance Testing | ||
| 521 | Performance Testing assesses how a system is spending its time and consuming resources. Performance testing optimizes a system by measuring how much time and resources the system is spending in each function. These tests identify performance limitations in the code and specify which sections of the code would benefit most from optimization work. Performance testing may be further refined by the use of specific types of performance tests, such as the benchmark test, load test, stress test, performance monitoring test, and contention test. | ||
| 522 | Privacy Testing | ||
| 523 | A type of testing that ensures that (1) veteran and employee data are adequately protected and (2) systems and applications comply with the Privacy and Security Rule provisions of the Health Insurance Portability and Accountability Act (HIPAA). | ||
| 524 | Product Component Testing | ||
| 525 | Product Component Testing (aka Unit Testing) is the internal technical and functional testing of a module/component of code. Product Component Testing verifies that the requirements defined in the detail design specification have been successfully applied to the module/component under test. | ||
| 526 | Recovery Testing | ||
| 527 | A type of testing that causes an application or system to fail in a controlled environment. Recovery processes are invoked while an application or system is monitored. Recovery testing verifies that application or system recovery, and data recovery, is achieved. Recovery Testing should be combined with Failover Testing. | ||
| 528 | Regression Test | ||
| 529 | A type of testing that validates that existing functionality still performs as expected when new functionality is introduced into the system under test. | ||
| 530 | Risk Based Testing | ||
| 531 | A type of testing based on a defined list of project risks. It is designed to explore and/or uncover potential system failures by using the list of risks to select and prioritize testing. | ||
| 532 | Section 508 Compliance Testing | ||
| 533 | A type of test that (1) ensures that persons with disabilities have access to and are able to interact with graphical user interfaces and (2) verifies that the application or system meets the specified Section 508 Compliance standards. | ||
| 534 | Security Testing | ||
| 535 | A type of test that validates the security requirements and ensures readiness for the independent testing performed by the Security Assessment Team as required by the Assessment and Authorization Process. | ||
| 536 | Smoke Test | ||
| 537 | A type of testing that ensures that an application or system is stable enough to enter testing in the currently active test phase. It is usually a subset of the overall set of tests, preferably automated, that touches parts of the system in at least a cursory way. | ||
| 538 | Stress Testing | ||
| 539 | A performance test implemented and executed to understand how a system fails due to conditions at the boundary of, or outside of, the expected tolerances. This failure typically involves low resources or competition for resources. Low-resource conditions reveal how the target-of-test fails in ways that are not apparent under normal conditions. Other defects might result from competition for shared resources (e.g., database locks or network bandwidth), although some of these tests are usually addressed under functional and load testing. Stress Testing verifies the acceptability of the system's performance behavior when abnormal or extreme conditions are encountered (e.g., diminished resources or an extremely high number of users). | ||
| 540 | System Testing | ||
| 541 | System testing is the testing of all parts of an integrated system, including interfaces to external systems. Both functional and structural types of testing are performed to verify that the system performance, operation, and functionality are sound. End-to-end testing with all interfacing systems is the ultimate version. | ||
| 542 | Usability Testing | ||
| 543 | Usability testing identifies problems in the ease-of-use and ease-of-learning of a product. Usability tests may focus upon, but are not limited to: human factors, aesthetics, consistency in the user interface, online and context-sensitive help, wizards and agents, and user documentation. | ||
| 544 | User Functionality Test | ||
| 545 | User Functionality Test (UAT) is a type of Acceptance Test that involves end users testing the functionality of the application using test data in a controlled test environment. | ||
| 546 | User Interface Testing | ||
| 547 | User-interface (UI) testing exercises the user interfaces to ensure that the interfaces follow accepted standards and meet requirements. User-interface testing is often referred to as GUI testing. UI testing provides tools and services for driving the user interface of an application from a test. | ||
| 548 | |||
| 549 | |||
| 550 | |||
| 551 | |||
Araxis Merge (but not the data content of this report) is Copyright © 1993-2016 Araxis Ltd (www.araxis.com). All rights reserved.