The mission of any academic orthopaedic training program can be divided into 3 general areas of focus: clinical care, academic performance, and research. Clinical care is evaluated on clinical volume, patient outcomes, and patient satisfaction, with an increasing emphasis on data-driven quality metrics. A department's academic performance can be used to motivate individual surgeons, but objective measures are what define a residency program: annual in-service examinations serve as a marker of resident knowledge base, and board pass rates are closely scrutinized. Research productivity, however, has proven harder to quantify objectively.

In an effort to improve transparency and better account for conflicts of interest, bias, and self-citation, multiple bibliometric measures have been developed. Rather than using individuals' research productivity as a surrogate for departmental research, we sought to establish an objective methodology to better assess a residency program's ability to conduct meaningful research. In this study, we describe a process to assess the number and quality of publications produced by an orthopaedic residency department, allowing chairmen and program directors to benchmark their current production and set measurable goals for future research investment.

The main goal of the benchmarking system is to create an "h-index" for residency programs. To do this, we first needed a list of relevant articles in the orthopaedic literature. We used Journal Citation Reports, which assigns impact factor ratings to all listed orthopaedic journals each year; at the time we accessed the database, its orthopaedic literature section included 72 journals. To ensure that only relevant, impactful journals were included, we selected those with an impact factor greater than 0.95 and an Eigenfactor Score greater than 0.00095, which left 45 journals. We then searched Scopus for articles published in these journals over a 10-year period and built a database of articles and their affiliated institutions, running several iterations of the search to maximize the capture of articles attributed to institutions listed under multiple names. From this database, we were able to analyze the quality research productivity of all allopathic US residency programs. We believe this is a novel methodology that allows residency program chairmen and directors to assess progress over time and make accurate comparisons with other programs.
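To make the benchmarking steps concrete, the sketch below illustrates the three operations the methodology describes: filtering journals by the impact factor and Eigenfactor Score thresholds, consolidating institution name variants, and computing a program-level h-index from per-article citation counts. This is a minimal illustration under stated assumptions, not the authors' actual code; every journal name, institution alias, and citation count in it is a hypothetical placeholder, where real inputs would come from Journal Citation Reports and a Scopus export.

```python
"""Minimal sketch of the residency-program benchmarking pipeline.

Illustration only: all journal names, institution aliases, and citation
counts are hypothetical; in practice the journal metrics would come from
Journal Citation Reports and the article records from a Scopus export.
"""
from collections import defaultdict

# Hypothetical journal metrics: name -> (impact factor, Eigenfactor Score).
journals = {
    "Journal A": (2.10, 0.01200),
    "Journal B": (0.80, 0.00400),  # excluded: impact factor <= 0.95
    "Journal C": (1.40, 0.00050),  # excluded: Eigenfactor Score <= 0.00095
}

# Step 1: keep only journals above both thresholds used in the study.
included = {
    name
    for name, (impact_factor, eigenfactor) in journals.items()
    if impact_factor > 0.95 and eigenfactor > 0.00095
}

# Step 2: consolidate institution name variants so articles credited to
# different names of the same program are counted together.
aliases = {
    "Univ. Hospital Dept. of Orthopaedics": "University Program",
    "University Hospital Orthopaedic Surgery": "University Program",
}

def canonical(institution: str) -> str:
    return aliases.get(institution, institution)

# Step 3: a program's h-index is the largest h such that h of its
# articles each have at least h citations.
def h_index(citations: list[int]) -> int:
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical Scopus records: (journal, affiliated institution, citations).
records = [
    ("Journal A", "Univ. Hospital Dept. of Orthopaedics", 25),
    ("Journal A", "University Hospital Orthopaedic Surgery", 12),
    ("Journal C", "University Hospital Orthopaedic Surgery", 40),  # dropped
    ("Journal A", "University Program", 3),
]

program_citations = defaultdict(list)
for journal, institution, cites in records:
    if journal in included:
        program_citations[canonical(institution)].append(cites)

for program, cites in sorted(program_citations.items()):
    print(program, h_index(cites))  # -> University Program 3
```

The same three steps scale directly from this toy input to the full 45-journal, 10-year dataset, with the alias table standing in for the iterative searches used to capture institutions listed under multiple names.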