Educational Relative Value Units as a Measure of Academic Productivity: A Systematic Review

Introduction: Academic Health Centers (AHCs) have complex, often competing missions. Many have developed mission-based management (MBM) systems to support their clinical and non-clinical missions. There are limited data on MBM use for their educational missions. Our scoping review explored how AHCs employed such systems. Materials and methods: Arksey and O’Malley’s six-stage framework guided our review. Based on pre-defined criteria, English language articles from PubMed, EMBASE, SCOPUS, and the Healthcare Administration Database published between 2010 and 2020 were loaded into a reference manager. The search included all health professions education schools. Articles were excluded if they were review articles, commentaries, or clearly did not involve funding for education. From the final list of selected articles, data were extracted using a data extraction sheet we developed. Two researchers reviewed each article again to ensure extracted data were reported consistently and with sufficient detail. Results: Of the 1729 manuscripts identified, 35 met inclusion criteria. Sixteen (46%) contained data in some form but did not have a formal methods section describing the specific approach to data collection and analysis. Moreover, there was marked variability in how educational effort was quantified, what counted as educational effort (educational scholarship versus teaching) and the impacts of such quantification (departmental funding versus individual faculty incentives). None of the studies reported on the impact on faculty promotion. Faculty satisfaction with the system was reported in seven studies (20%) and was generally positive. Conclusions: A systematic description of how systems were developed to support the educational mission was lacking. Clear goals, methods of development, uniform data on educational productivity and quality, and program evaluation were not defined by most articles. 
This lack of process clarity presents a challenge, but more importantly an opportunity for academic health centers to unify efforts and continue to further their educational mission.

SCOPUS: TITLE-ABS-KEY ( "medical education" OR "academic medical center*" OR "university hospital*" OR "teaching hospital*" OR ( ( education* OR faculty OR teach* OR academic ) AND ( medical OR hospital ) ) ) W/3 ( "mission based" OR "relative value" OR "value based" OR "value unit" OR "value units" OR rvu OR rvus OR eVU OR eVUs OR rbrv OR rbrvs OR arvu OR arvus OR "funds flow" OR incentive* ) AND ( LIMIT-TO ( DOCTYPE , "ar" ) OR LIMIT-TO ( DOCTYPE , "re" ) )
Healthcare Administration Database (536 results): (("medical education" OR "academic medical center*" OR "university hospital*" OR "teaching hospital*" OR ((education* OR faculty OR teach* OR academic) AND (medical OR hospital))) AND ("mission based" OR "relative value" OR "value based" OR "value unit" OR "value units" OR rvu OR rvus OR eVU OR eVUs OR rbrv OR rbrvs OR arvu OR arvus OR "funds flow" OR incentive*)) (filtered to 2010 to present and scholarly journals)

TABLE 1: Database Search Strategies
The databases used for this search included: PubMed, EMBASE, SCOPUS, and Healthcare Administration Database. The search included English-language articles published between 2010 and 2020. The search results were loaded into a systematic review reference manager (Covidence®, Melbourne, Australia) to screen out duplicate articles. During the full-text screen and data extraction phase, reference lists were also inspected to identify additional articles for inclusion in the study, and citations of the extracted papers were also searched. The search included studies from all schools/systems involved in any health professions education such as dentistry, nursing and pharmacy.

Stage 3: selecting studies
Each abstract was reviewed by two members of the research team for inclusion or exclusion. If there were disagreements, a third researcher reviewed the abstract to make a final determination. Abstracts were included if they indicated the article involved a discussion about eVUs or a similar metric. Abstracts were excluded if they were from review articles, commentaries, or clearly did not involve funding for education.
After the abstract review process, all selected papers underwent full-text reviews. Each article was reviewed by two members of the research team. Any disagreements were resolved by a third member of the team reviewing the article and making a final determination. Once this process was completed, data were extracted.

Stage 4: extracting data
One of the researchers (BZM) developed a draft data extraction form. The group reviewed, discussed, and edited the information we hoped to glean from the articles. Upon final approval, a finalized extraction form was set up in Covidence. We did not include faculty perception of the development of the eVU rubrics among the data to be abstracted. We also chose to distinguish educational scholarship (work aimed at dissemination of a peer-reviewed educational product) from all other educational activities such as teaching, mentoring, assessing learners, and curriculum development. This distinction was informed by the models of Boyer [6] and Glassick [7], and was driven partly by the fact that some of the selected papers included "scholarship" with educational activity in their eVU rubrics, whereas other papers separated the two. We divided the final list of articles such that each team member was responsible for extracting data from 10-13 articles.
Once data had been extracted from each article, the information was exported to an Excel spreadsheet (Microsoft, Redmond, WA, USA) for further summarization and analysis. The final step in the extraction process was for two of the researchers (BZM, GBD) to review each article again to ensure extracted data were reported consistently and with sufficient detail to summarize. The complete table for data extraction is shown in the appendix.

Results
A total of 1729 papers were identified using the search criteria and subjected to the first level of review (titles and abstracts). Of the 50 studies selected for full-text review, 23 were excluded due to the absence of data or the lack of a specific eVU plan/implementation; these articles were largely commentaries on the need for such a measurement. This resulted in 27 papers initially chosen for review. An additional nine articles were identified by mining the references of these 27 manuscripts. Finally, after final review and abstraction, one more article was removed due to the lack of sufficient data, resulting in a total of 35 papers for inclusion (Figure 1). None of the studies involving schools/systems from non-physician professions met our pre-set inclusion criteria.

FIGURE 1: Article selection flow diagram
Of note, 16 of the 35 papers (46%) contained data in some form but did not have a formal methods section describing the specific approach to data collection and analysis. The abstracted data are presented in Table 2.
There was wide variation in the designs of the reported eVU systems.
(Table 2 excerpt: credited activities included small-group teaching, procedure/lab instruction, advising, organizing conferences, exam writing, and teaching awards.)

Settings and Size of Studies/Reports
Three of the studies (9%) reported on a (medical) school-wide rollout of their system. Two reports (6%) were focused on a hospital setting, in both instances a free-standing children's hospital. The remaining 85% were limited to a single department, although some of the departments were quite large and included specialty divisions (Table 3). The numbers of faculty ranged from 11 to 893, although specific numbers were not reported in eight reports (23%).

Educational Productivity Change
As noted earlier, we chose to separate teaching and educational productivity from educational scholarship in our data abstraction of the papers. Eighteen of 35 (49%) studies included in the review reported data on educational productivity as we defined it (Table 4). An increase in productivity was reported in 12, no change was reported in three, a decrease was noted in one, and in two reports productivity increased for some faculty and decreased for others. Scholarly activities, including those related to education, were often not separated from educational activities as we have defined them; the rubrics included scholarship as an eVU-generating activity. In the papers that reported increases in educational productivity, the increase was, using our construct, often in the realm of scholarship [6,10,17,29,30]. The papers do not report whether the increased scholarly output was in the domain of clinical practice, educational scholarship, or scientific research.

No statistical analyses were applied to eVU data; ranges suggest these numbers did not statistically differ.
"Scholarship" (grants, peer-reviewed papers, presentations) was quantified separately; scholarship of teaching and learning was not separated out.
Increased for some, reduced for others: increases over baseline in Cardiology, GI, ID, Immunology, Nephrology, and Pulmonary; decreases in Endocrinology and Hematology/Oncology. Funds to General Internal Medicine remained stable, coming directly from the medical school; the impact in that division was unclear.
Khan [27]: Increased. From year 1 to year 3, mean group educational productivity increased from 73% to 88% of expected, and mean individual productivity increased from 54% to 82% of expected.
Carmody [29]: Administrative measures improved. Conference attendance increased 21%; the number of resident assessments completed increased by 30%. A total of 1240 academic activities were logged in the new system, with no baseline data for comparison.
Carmody [29]: Increased hours. Teaching hours increased by 8% over three years (not statistically different). Total publications did increase statistically significantly. Incentive dollars increased from a mean of $3,191 to $11,153.
Leverence [30]: Median academic bonus fairly constant over 10 years. Scholarship of teaching and learning was not separated from scholarship points. Total academic bonus rose linearly among faculty in the bottom three quartiles of academic productivity and increased exponentially for those in the 75th to 100th percentile.
Ma [31]: Administrative measures.
House [32]: Increased. Preliminary report; no specific data. "There has been evidence of increased academic productivity at both the department level and the individual faculty level."
Ridley [6]: Increased. Not quantified: faculty participation in resident teaching and attendance at conferences "dramatically increased." Scheduling faculty for lectures became easier; faculty attendance at resident morning report improved.
Stites [34]: Academic productivity units remained stable. Two-year follow-up. Those more clinically productive were also more academically productive. No difference between junior and senior faculty.
Williams [38]: Scores remained stable. Mean overall scores, as well as scores in the domains of clinical practice, education, scholarship, and administration, did not change significantly. Overall scores for assistant professors did increase; the domain was not reported.
Filler [36]: Scores remained stable. Mean overall scores, as well as scores in the domains of clinical practice, education, scholarship, and administration, did not change significantly. Overall scores for assistant professors did increase; the domain was not reported.

TABLE 4: Impact of an eVU/Mission-based tracking system on "educational productivity"
UPMC=University of Pittsburgh Medical Center, eVU=educational relative value unit

Department Funds Flow
Six of 35 papers (17%) reported data on change in flow of funds to departments and/or divisions. In two studies [24,30], department funds either increased or decreased, depending on performance of the overall department/division. For the other four [23,29,31,32], funds flow to departments/divisions increased.
Departments (and hospitals) reported that they had the support of the leadership to begin the process. Traditional budgets had to evolve to accommodate the new systems. Revenues typically were a mix of allocations from the school, the practice plan, and other sources of income. Increased funds as a result of eVU systems were typically the result of reallocation of university dollars in support of mission-necessary faculty efforts (e.g., "teaching activities"), as well as increased clinical revenues and grant support.

Provider Funds Flow
Twelve of 35 papers (34%) reported data on changes in the flow of funds to individual faculty. In nine (75%) of these 12, faculty were eligible to receive incentive dollars. In two (17%) of these 12, individual salaries went up or down [19,34]. When salary dropped, it was due to a decrement in university support based on less-than-anticipated/expected educational effort; decrements were typically offset by increased clinical revenue. In one paper [10], eligible faculty collected incentive dollars, but the data demonstrate the bonus was for "scholarship" (e.g., papers and grants) as opposed to "teaching." It is not clear whether any of these scholarship increments were based on productivity in the scholarship of teaching and learning.
In five additional papers (14%), it is not clear how bonus dollars were distributed, even if faculty met or exceeded their personal targets. These five papers are discussed individually below. Rouan et al. reported that redistribution of teaching dollars among divisions increased 11.4% [21]. Based on the description of their plan, readers are left to presume funding was distributed to individuals if they met or exceeded thresholds. However, the Division of General Internal Medicine (GIM) was compensated using a rubric that differed from the other divisions, and its faculty did receive incentive monies. How other divisions in the department distributed funds was not clearly articulated.
In the second of the five papers, Pugh et al. reported that providers who achieved both pre-set benchmarks (evaluation completion and attendance) were to be monetarily incentivized each quarter [22]. Faculty significantly exceeded expectations for percent completion of resident evaluations and did so in a shorter time frame. There was no significant increase in average faculty attendance at educational sessions. The paper does not explicitly state whether distribution of incentives occurred and, if so, how many faculty received the incentive.
In year three of the rollout of a third program [27], 12 of 22 providers met or exceeded quotas and, per protocol, should have received incentive payments, although this is not explicitly stated. In the fourth program [29], there were providers who met criteria for incentives, but the amounts had not yet been determined at the time of publication. Finally, in the fifth paper, Filler et al. reported that assistant professors had greater improvement in a self-scored "scholarship score" (composed of components for scholarship in the domains of clinical practice, education, research, and administration) than did associate or full professors [36]. This translated into an incentive payment, although neither individual nor mean data are reported.

Scholarly Activity
Twenty of the studies (57%) reported on how scholarly activity was accounted for as part of the mission-based process (Table 5). Typically, the reports identified high-level categories such as grant revenue, peer-reviewed publications, and services such as journal editorship or membership on a study section. As noted earlier in the section on educational productivity results, the papers rarely delineated clearly whether the scholarship centered on discovery research (clinical, basic, and/or translational), clinical work (e.g., quality improvement), or the science of teaching and learning.

Promotion and Tenure
None of the studies reported specific data related to the impact of eVU systems on faculty promotions. One report noted that meeting eVUs was considered as part of the promotion application [27]. Reports on comprehensive mission-based programs, those addressing all three or four components of a clinical faculty member's job (clinical care, education, scholarship, and service), did demonstrate an increase in grants and papers, which presumably correlates with successful professional advancement and promotion [18].

Clinical Productivity
Six of the 35 papers (17%) reported on the impact of implementation on clinical productivity. Clinical productivity increased, with two of these six papers reporting in more detail; in one [36], clinical productivity only increased for assistant professors and was stable for the others. In the other paper [18], clinical productivity increased for some faculty and decreased for others.

Faculty Perspectives
Seven reports (20%) contained data, generally positive, on the satisfaction of the faculty after the program was implemented.

Impact on Learners/Learner Perspectives on Faculty Performance After Roll Out of an eVU System
Three of the papers (9%) included a pre-post implementation evaluation of learner perspectives on teaching by faculty participating in the eVU program. Two of these were from the University of Pittsburgh Medical Center (UPMC). In the first [14], the eVU system used teaching ratings by learners as one method to determine part of faculty eVUs; no change in ratings was evident. In the second [9], the eVU rubric differed from that of the first UPMC report. On average, learners scored teaching sessions better, and learners more often requested that presenters return for subsequent sessions.
The third report containing learners' perspectives was from the University of Queensland/Ochsner (UQ-OCS) general practice clerkship (GPC) [12]. The authors simply report, without further elaboration, that the GPC was "the top-rated third-year clerkship at the UQ-OCS for the first three years of clinical rotations at the school." The authors did not report whether this ranking was a result of support through their eVU system.

Discussion
Around 25 years ago, MBM proponents began to develop the concept of the eVU [4,25] as one way to level the playing field: highlighting the educational mission of an AHC and quantifying it in a manner similar to the measurement of the clinical mission product via wRVUs. We undertook this scoping review to explore the published literature on MBM/eVU systems adopted by AHCs that include specific metrics to acknowledge the educational mission and to quantify educational effort. The 35 papers reviewed and abstracted for this scoping review were heterogeneous in scope and purpose. Many, it seemed, were published as proof-of-concept papers demonstrating that an MBM approach to the education mission is possible, albeit complex. Others demonstrated improvements in specific aspects of an educational program (e.g., completion of learner evaluations [22]) associated with efforts to incentivize the desired behaviors. Often there was no clear methods section, so we had to infer that certain measures were made even when they were not described. For example, authors may have stated that faculty were "accepting of" or "satisfied with" an eVU program that had been initiated, although how the authors arrived at those conclusions was not clear. As we abstracted the manuscripts, we accepted that, for the program described in the paper, faculty acceptance/satisfaction was measured.

Recommendations for Future eVU Studies/Reports
Having reviewed and summarized the extant eVU data at a granular level, we now step back and reframe the discussion with the goal of identifying paths to make future studies of eVU system implementation more consistent and useful. What is clear from this review is that an eVU-based MBM approach to measuring educational activity is possible, although the components of such systems are complex to develop and, in some cases, to measure. Accordingly, we have identified aspects that we feel should be addressed in future studies of such systems to facilitate adoption by other institutions.
Comprehensive approach: Ideally, a study of an eVU program should be as comprehensive as possible, addressing all aspects of the educational program. Studies should have specific aims identified in the manuscripts, especially if only one or two aspects of the eVU program are to be investigated. Methods specific to those aims need to be clearly described.

Conflicts of interest:
In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: All authors have declared that they have no financial relationships at present or within the previous three years with any organizations that might have an interest in the submitted work. Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work.