Laying the foundation for a robust faculty data reporting infrastructure at a medical college
2015-08-20T19:20:59Z (GMT)
Background: In 2014, Weill Cornell Medical College's (WCMC) Information Technologies Services was charged with producing high-quality, on-demand reports that would empower administrators to make key decisions and fulfill external reporting requirements. One dean is particularly keen on being able to perform analyses in a disintermediated way - that is, without depending on intermediaries. Information systems originally designed for a single purpose can be serviceable for occasional ad hoc, system-specific reports, but the need for reliable, on-demand, and sophisticated reporting across systems highlighted our key unmet needs. These include: business process improvements for maintaining the systems; open channels for feedback from end users and other stakeholders; improved processes for supplying data to downstream systems; and documenting and communicating the meaning and context of data. Such considerations cannot be an afterthought. <br><br>Analysis: Weill Cornell maintains 10 systems especially relevant to faculty, including VIVO as well as those that capture faculty reviews, board certifications, hospital credentials, and appointments. To lay the groundwork for a faculty data reporting infrastructure, we scored all 10 systems against this custom set of 11 criteria: <br> - Authoritative data is accurate <br> - Secondary data is accurate <br> - Data is well-structured <br> - End user can view <br> - End user can update, or at least provide feedback <br> - Accurate assignment of institutional identifier <br> - Avoids duplicate records <br> - Well-connected to other systems <br> - Relevant information is collected <br> - Technically easy to output reports <br> - Transparent reporting process <br><br>In total, the 110 scores (10 systems x 11 criteria) can be grouped as follows: <br> - 30 needs improvement <br> - 42 okay <br> - 28 good <br> - 10 unknown <br><br>Conclusion: At first blush, producing reports on faculty seems like it should be a straightforward proposition. 
Similarly, some VIVO implementation sites might assume that the majority of the work of standing up a new VIVO will be devoted to figuring out the technical mechanics of moving data between systems and tools. But this analysis clarified, for our key stakeholders and for ourselves, certain prerequisites for creating a nimble and reliable faculty reporting infrastructure. When these prerequisites are met, we expect our efforts will also improve the quality of the data in our VIVO.
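The scoring exercise described above lends itself to a simple tally. The sketch below is illustrative only: the criteria list comes from the abstract, but the system names and the individual ratings are randomly generated placeholders, not WCMC's actual scores.

```python
from collections import Counter
import random

# The four rating buckets used in the abstract's summary.
RATINGS = ("needs improvement", "okay", "good", "unknown")

# The 11 evaluation criteria listed in the Analysis section.
CRITERIA = [
    "Authoritative data is accurate",
    "Secondary data is accurate",
    "Data is well-structured",
    "End user can view",
    "End user can update, or at least provide feedback",
    "Accurate assignment of institutional identifier",
    "Avoids duplicate records",
    "Well-connected to other systems",
    "Relevant information is collected",
    "Technically easy to output reports",
    "Transparent reporting process",
]

# Hypothetical stand-ins for the 10 faculty-relevant systems.
systems = [f"system_{i}" for i in range(1, 11)]

# Placeholder scores: one rating per (system, criterion) pair.
rng = random.Random(0)  # seeded so the sketch is reproducible
scores = {
    (system, criterion): rng.choice(RATINGS)
    for system in systems
    for criterion in CRITERIA
}

# Group the 110 scores by rating, as in the abstract's summary.
tally = Counter(scores.values())
total = sum(tally.values())  # 10 systems x 11 criteria = 110
```

With real ratings in place of the random placeholders, `tally` would reproduce the abstract's breakdown (30 needs improvement, 42 okay, 28 good, 10 unknown).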