9. EPMO Open Source Coordination Office Redaction File Detail Report

Produced by Araxis Merge on 2/15/2018 2:12:13 PM Eastern Standard Time. See www.araxis.com for information about Merge.

9.1 Files compared

File 1
    Location:       CPEE_Build_6_Sprint_9 and 11.zip
    File:           CPE005-115, 116 Create New Reason Code Claims Reversal Notification Revert CPE Peer Review Form for backing off user story 115 and 116.docx
    Last Modified:  Tue Feb 6 17:50:58 2018 UTC

File 2
    Location:       CPEE_Build_6_Sprint_9 and 11.zip
    File:           CPE005-115, 116 Create New Reason Code Claims Reversal Notification Revert CPE Peer Review Form for backing off user story 115 and 116.docx
    Last Modified:  Wed Feb 14 22:12:58 2018 UTC

9.2 Comparison summary

Between Files 1 and 2:

Description   Text Blocks   Lines
Unchanged     1             76
Changed       0             0
Inserted      0             0
Removed       0             0

9.3 Comparison options

Whitespace
Character case      Differences in character case are significant
Line endings        Differences in line endings (CR and LF characters) are ignored
CR/LF characters    Not shown in the comparison detail

9.4 Active regular expressions

No regular expressions were active.

9.5 Comparison detail

 1   This peer review checklist is to be used by the reviewing developer when performing a peer review. This is also known as a secondary review. Below is a list of questions to answer along with a step-by-step execution process.
 2
 3   How to Peer Review:
 4   When performing a peer review, we will be capturing the results into the peer review Rational work item task assigned to the user story that is being reviewed.
 5   When starting the review
 6   Move the Rational work item for peer review into “in progress”
 7   Assign yourself as the owner
 8   When performing the peer review
 9   Utilize the checklist below and perform a review of the dev work done
10   Once the peer review is complete
11   On success
12   Record peer review results into the resolution description
13   Make sure it is assigned to you
14   Burn down hours spent doing the peer review
15   Move the task to ‘done’
16   On failure
17   Record peer review results into the discussion (not the resolution description)
18   Make sure it is assigned to you
19   Burn down hours spent so far
20   Notify the primary developer of findings for corrections
21   Rinse and repeat until success is reached.
22
23   Checklist:
24   It is assumed that this checklist will evolve and grow. Please feel free to suggest additions.
25   User story included? (Y / N): Yes
26   Routine code before and after changes? (Y / N / NA): Yes
27   Documentation for new globals / nodes / files / data pieces? (Y / N / NA): No
28   Coding standards and best practices are met? (Y / N / NA): Yes
29   (we do not currently have an official coding standard)
30   Code comment? (Y / N / NA): No
31   (who changed, what story and date, identify code changed at top of routine, maintenance history)
32   Unit Test document included (Test plan and results)? (Y / N / NA): Yes
33   Passed XINDEX? (Y / N): Not applicable
34   Error Handling? (Y / N / NA): No
35   Locking? (Y / N / NA): No
36   (locking of a patient record, user situation)
37   Peer Review Passed? (Y / N): Yes
38