Research Article: Pediatrics

Peer Learning Program Metrics: A Pediatric Neuroradiology Example

N. Kadom, K.M. Reddy, G. Khanna, S.F. Simoneaux, J.W. Allen and M.E. Heilbrun
American Journal of Neuroradiology November 2022, 43 (11) 1680-1684; DOI: https://doi.org/10.3174/ajnr.A7673
Author affiliations: From the Department of Radiology and Imaging Sciences (N.K., K.M.R., G.K., S.F.S., J.W.A., M.E.H.), Emory University School of Medicine, Atlanta, Georgia; and the Department of Radiology (N.K., K.M.R., G.K., S.F.S.), Children’s Healthcare of Atlanta, Atlanta, Georgia.

Abstract

BACKGROUND AND PURPOSE: The American College of Radiology is now offering an accreditation pathway for programs that use peer learning. Here, we share feasibility and outcome data from a pilot peer learning program in a pediatric neuroradiology section that, in its design, follows the American College of Radiology peer learning accreditation pathway criteria.

MATERIALS AND METHODS: We retrospectively reviewed metrics from a peer learning program with 5 participating full-time pediatric neuroradiologists during 1 year: 1) number of cases submitted, 2) percentage of radiologists meeting targets, 3) monthly attendance, 4) number of cases reviewed, 5) learning points, and 6) improvement actions. In addition, a faculty survey was conducted and is reported here.

RESULTS: Three hundred twenty-four cases were submitted (mean, 7 cases/faculty/month). The faculty never met the monthly submission target. Peer learning meeting attendance was 100%. One hundred seventy-nine cases were reviewed during the peer learning meetings. There were 22 learning points throughout the year and 30 documented improvement actions. The faculty survey yielded the highest ratings (4.8 of 5) for ease of meeting the 100% attendance requirement and for the learning value of the peer learning sessions. The lowest rating (4.2 of 5) was given for the effectiveness of improvements as a result of peer learning discussions.

CONCLUSIONS: Implementing a peer learning program that follows the American College of Radiology peer learning accreditation pathway criteria is feasible. Program metric documentation can be time-consuming. Participant feedback led to meaningful program improvement, such as improving trust, expanding case submission categories, and delegating tasks to administrative staff. Effort to make peer learning operations more efficient and more effective is underway.

ABBREVIATIONS:
ACR = American College of Radiology
CME = Continuing Medical Education
PL = peer learning

The American College of Radiology (ACR) is now offering an accreditation pathway for programs that use peer learning (PL).1 To qualify, a PL program should have a PL policy, explicit program targets, and annual documentation of program metrics. Specifically, the annual report should include the total number of case submissions to the PL program, the number and percentage of radiologists meeting targets as defined in the facility practice policy, a determination of whether PL activities met the minimum standard as defined by the facility practice policy, and a summary of related quality-improvement effort and accomplishments.1

Many radiology practices in the United States are adopting PL in lieu of or in addition to traditional score-based peer review.2-4 PL is an approach to performance improvement that is based on quality and safety concepts found in high-reliability organizations.5 PL builds a safety culture by creating a safe environment for error disclosure, facilitates joint learning from mistakes, and creates opportunities for improvement through group discussions that elucidate sources of errors.6-12 Higher case submission rates have been observed after switching from score-based peer review to PL, indicating higher engagement of radiologists.6,7

Here, we share feasibility and outcomes data from a pilot PL program in a pediatric neuroradiology section that, in its design, follows the ACR PL accreditation pathway criteria.1 Our program uses several PL metrics, including radiologist participation rates, number of cases submitted, number of cases reviewed, tangible lessons learned, and improvement projects completed.

MATERIALS AND METHODS

This quality-assurance study was exempt from institutional review board approval. The data were collected at Children's Healthcare of Atlanta (CHOA), a freestanding academic pediatric hospital with nearly 300,000 examinations annually. Five full-time pediatric neuroradiologists participated in the PL program during the 1-year study period, January 1, 2021, through December 31, 2021. A total of 24,724 neuroradiology examination reports were issued during this time.

PL Program

In December 2020, we incorporated an additional pediatric site into our practice and added substantially to our pediatric neuroradiology faculty, resulting in a separation of pediatric from adult neuroradiology service lines. This created an opportunity for implementing a pilot PL program for the pediatric neuroradiologists who previously participated in score-based peer review.

Our PL program is informed by a written policy that incorporates the elements recommended by the ACR accreditation checklist for PL.1 Our section chief defined the program targets as follows: PL conferences to occur monthly, 100% faculty attendance, and 5 PL cases submitted each month per pediatric neuroradiologist. The annual documentation of our PL program metrics includes the following: a statement of commitment to sequestering PL from performance evaluations, the total number of case submissions to the PL program, the number and percentage of radiologists meeting targets as defined in the facility practice policy, a determination of whether PL activities met the minimum standard as defined by the facility practice policy, and a summary of related quality-improvement effort and accomplishments.1

PL conferences occur monthly throughout the calendar year and are recorded for asynchronous viewing. The meetings last 1 hour, from 12:00 to 1:00 pm, when, in most instances, there is service coverage by a fellow. Two dedicated faculty members alternate monthly in selecting and presenting cases. During the study period, we reviewed not only discrepancies of perception, interpretation, or communication but also interesting cases. Each month, cases submitted during the previous month were reviewed. Cases were presented as anonymized PowerPoint slides (Microsoft). The discussion of each case was documented on a case-review form, along with any learning points and improvement actions. Each session was recorded (Teams; Microsoft) and saved in an online location outside the institution’s health records system, where it is protected under state peer review law. Recordings are shared only with faculty and PL staff and can be accessed for remote viewing by those who could not attend the in-person session. During the study period, any improvement action was immediately assigned to a faculty volunteer who set a deadline; the action was then followed to its conclusion at the beginning of subsequent PL meetings.

Data Collection

We analyzed the following items that were collected monthly: 1) the number of cases submitted per faculty per month, 2) the percentage of radiologists meeting PL program targets for case submissions (5 per month per faculty), 3) monthly faculty PL attendance (target of 100% live attendance or asynchronous viewing of session recordings), 4) the number of cases reviewed during the PL session, 5) the number and nature of learning points, and 6) the number and nature of improvement actions with assigned faculty volunteer and documented completion.

Faculty Survey

An 11-item survey (Online Supplemental Data) was developed and face-validated by the radiology quality director (N.K.). Responses were collected anonymously in January 2022. The survey comprised 2 yes/no questions, 3 open-comment items, and 6 Likert-type items rated on a star scale with a maximum rating of 5 stars.

Data Analysis

Descriptive statistics were performed in Excel (Microsoft).
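The spreadsheet itself is not part of the article; as an illustration only, the following minimal sketch (in Python rather than Excel, with hypothetical variable names and counts that are not the study data) shows how the monthly descriptive statistics, such as the mean submissions per faculty and the percentage of faculty meeting the 5-case target, could be computed.

```python
from statistics import mean

# Hypothetical submission counts for one month, keyed by faculty (A-E).
# These illustrative numbers are NOT the study data.
submissions = {"A": 6, "B": 3, "C": 9, "D": 0, "E": 5}
TARGET = 5  # program target: 5 case submissions per faculty per month

monthly_mean = mean(submissions.values())
share_meeting_target = sum(n >= TARGET for n in submissions.values()) / len(submissions)

print(f"Mean submissions per faculty this month: {monthly_mean:.1f}")
print(f"Faculty meeting the 5-case target: {share_meeting_target:.0%}")
```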

RESULTS

PL Program Metrics

The number of monthly case submissions varied widely. During the year, 324 cases were submitted for the PL meetings, with an average of 7 case submissions per faculty per month, and monthly submissions ranging from 0 to 26 cases for a single faculty member (Online Supplemental Data and Fig 1).

FIG 1. Monthly cases submitted by faculty. Five faculty members (A–E) were observed during this study period; 2 of the faculty joined in August.

There was no month during which >80% of the faculty met the monthly submission target of 5 cases (Fig 2). The low case-submission rate in January could be due to the program being new (it was started in December 2020), and the low submission rates in April correspond to high case volumes and diminished staffing in that month (data not shown).

FIG 2. Percentage of faculty meeting the monthly case-submission target. Five faculty members (A–E) were observed during this study period; 2 of the faculty joined in August.

PL meeting attendance was 100% for each faculty member.

A total of 179 cases were reviewed throughout the year, about 55% of all case submissions (179/324). On average, we reviewed 14 cases per PL meeting, ranging from 6 to 24 case reviews per session.

The session moderator documented any learning points and improvement actions during each PL meeting. There were 22 learning points throughout the year, an average of approximately 2 learning points per session. Lessons learned included recognizing the importance of accurate use of overnight agree/disagree statements, identifying potential pitfalls in image interpretation, recognizing the importance of report proofreading, identifying instances when it is appropriate to reference normative values for measurements, and identifying imaging signs of rare diagnoses.

There were 30 documented improvement actions throughout the year, an average of 2.5 improvements identified per session. Improvements that resulted from the PL program thus far included changes to CT and MR imaging protocols, education of radiologists and technologists, changes to reporting templates, changes to EPIC workflows, and modifications of team communications.
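The per-session and per-faculty averages reported in this section follow directly from the annual totals; the short sketch below reproduces that arithmetic. The faculty-month count is our inference from the figure legends (3 faculty from January through July, 5 from August through December) and is an assumption rather than a number reported in the text.

```python
# Reproducing the summary metrics from the reported annual totals.
SESSIONS = 12                  # monthly PL conferences over the calendar year
cases_submitted = 324
cases_reviewed = 179
learning_points = 22
improvement_actions = 30

# Faculty-months inferred from the figure legends (assumption):
# 3 faculty for 7 months (Jan-Jul) plus 5 faculty for 5 months (Aug-Dec).
faculty_months = 3 * 7 + 5 * 5  # 46

print(f"Mean submissions/faculty/month:  {cases_submitted / faculty_months:.1f}")  # ~7.0
print(f"Share of submissions reviewed:   {cases_reviewed / cases_submitted:.0%}")  # ~55%
print(f"Learning points per session:     {learning_points / SESSIONS:.1f}")        # ~1.8
print(f"Improvement actions per session: {improvement_actions / SESSIONS:.1f}")    # 2.5
```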

PL Faculty Survey

All faculty members responded to the survey (response rate, 100%) (Online Supplemental Data). All section members had previously participated in randomized score-based peer review, and only 2 faculty members had experienced PL previously.

When asked to list any differences between random score-based review and PL that favor random score-based review, respondents listed the following: faster, more objective, simple, easy metric, and mixed agree/disagree (versus only reporting disagreements) as giving a sense of accuracy. Respondents listed differences that favor PL as the following: more fun, learning, discovering improvement opportunities, group discussion, interactive and constructive feedback, and better experience overall.

The highest ratings (4.8 of 5) were given for ease of meeting the 100% attendance requirement and for the learning value of the PL sessions. A slightly lower rating (4.6 of 5) was given for feeling safe during case discussions, for the ease of submitting cases, and for the ability to gain Continuing Medical Education (CME) credit for session participation. The lowest rating (4.2 of 5) was given for effectiveness of improvements as a result of PL discussions.

Additional general comments included lowering the participation target to 80%, including good calls and not just discrepancies, having too many case-submission tools, and improvement actions being rushed and seeming reactive.

DISCUSSION

We were able to set up a PL program in pediatric neuroradiology that incorporates the checklist items for the new ACR accreditation pathway for PL, demonstrating feasibility in program design and implementation. However, we have not yet sought ACR accreditation through this pathway.

Most interesting, generating the data required for ACR reporting adds to the overall time commitment for running a PL program. While we did not measure this issue, we estimate that the annual time commitment for the physician leaders is 56 hours, which includes 4 hours/month to collate, select, prepare, and discuss cases for the monthly PL conference, 0.5 hour/month for transcribing PL program data and submitting CME materials, and 2 hours for writing the annual report. We have now trained an administrative assistant who reviews the PL session recording to track attendance, fill out the case-review forms, and handle any activities related to CME credit. While obtaining CME credit for PL was rated less important in our survey, we will continue to offer it because our administrative staff is now managing this aspect of the program. The more time-intensive effort for PL programs compared with score-based peer review has been acknowledged by others.13

The monthly PL meeting attendance target was easily met when allowing our faculty who could not attend the live session to attest to viewing session recordings. Faculty rated the ease of compliance with this target very favorably. Similar to others, we used the virtual format due to coronavirus disease 2019 (COVID-19) conditions14 but realize now that it remains the best option for participating from various sites and practice locations within our system. Most interesting, we are not using any incentives or penalties to drive up our faculty participation rate.15

There was not a single month when our entire faculty met the target for case submissions. Two faculty members (Fig 1, faculty D and E) disclosed not entirely trusting the separation of learning from performance assessment and, therefore, avoiding case submissions, which was also reflected in the survey by low ratings for perceived safety during PL meetings. Another faculty member struggled with the multitude of reporting tools to be used, ie, RADPEER (https://www.acr.org/Clinical-Resources/RADPEER), EPIC, and e-mails. In response to these concerns, we have uninvited an external PL session participant who represented the system Peer Practice Evaluation Committee. We also informed our faculty about the educational nature of the PL program and its protection under state peer review legislation, and a reminder slide is now included in the introductory portion of the PL meeting slides. As another change in response to these concerns, we are now keeping case discussions completely anonymous, meaning that we no longer allow faculty to self-identify in any way during a case discussion. Regarding the case-submission tools, we are currently still required to use the ACR RADPEER tool for ongoing professional practice evaluation. Unfortunately, our RADPEER system is not set up to allow reviews of past faculty readers, nor can we submit cases when the current and past reader is the same person. In those instances, we have configured a quality reporting tool in EPIC, but it can be used only as long as the report has not been finalized. For all other cases, we notify the PL leaders by e-mail so that cases can be included. We are currently developing an alternate performance review system for ongoing professional practice evaluation16,17 so that we can abandon RADPEER and replace all current submission options with a single tool.

At the start of the PL program, we set the target for monthly case submissions per faculty arbitrarily, on the basis of what seemed "reasonable." Because our faculty never met that target, we propose several changes. In our program, we reviewed a maximum of 24 cases in a single PL conference, which can help calibrate the number of case submissions to request per faculty per month. For example, our general pediatric radiology section, which currently has 18 faculty, decided to maintain a minimum submission of 2 cases per month per faculty. This still yields a surplus of cases that allows the PL program leads to select the cases with the highest yield for discussion and omit redundant or repetitive cases. If we continually fail to meet our monthly case-submission target in pediatric neuroradiology, we may lower the monthly target below 5 cases or set the target at the section rather than the individual level. Another option to consider, especially for smaller radiology subspecialties, could be to expand PL programs across multiple institutions to broaden the shared learning experience and variety of cases.18,19

On the basis of the collected data on learning points and the survey responses highly rating the learning value, our program performs similarly to those of others, who reported higher rates of satisfaction and learning.6,20,21 On the basis of the feedback submitted in the survey, we have expanded the submission categories from only discrepancies to also include good calls,22 interesting cases, and cases for any type of group discussion (communication, protocols, imaging technique, and so forth). Sources of PL cases in our program include the routine workflow, clinical conferences, consultations, and a provider feedback submission system. In the future, we may be able to integrate artificial intelligence applications that can identify cases with radiology-pathology correlations.23

The lowest survey ratings from our faculty were issued for the improvement effectiveness of the PL program. On further inquiry, faculty members were concerned that improvement actions were decided too quickly without deeper reflection on root causes and balancing measures. We are now documenting any improvement ideas that are mentioned during PL conferences, but we hold off on initiating improvements until a subsequent discussion with the section director has occurred.

Of note, our PL process eliminates faculty “voting” on discrepancies of perception, interpretation, and communication. In our system, the radiologist who identifies a discrepancy is in charge of immediately addressing any patient care issues and notifying the original interpreting radiologist of the discrepancy. He or she can suggest that the original radiologist should act, eg, by issuing an addendum to a report. Whether the recommended action is implemented by the original radiologist, however, is left to that radiologist’s professional decision. Any concerns regarding a radiologist’s clinical practice or behaviors are to be submitted to our system’s Peer Practice Evaluation Committees, which review any physician practice or behavior concerns and determine possible actions.

This study has several limitations. While we assume that PL is more effective than score-based peer review when it comes to improved practice, we do not have any data to show this to be true. Some programs use addendum rates as a proxy for improvement effects and show higher addendum rates with PL compared with score-based peer review.8,15 Our survey supports the notion that PL is a valued activity for our faculty, and that at a minimum, it creates opportunities for teambuilding and collaboration.24 Some of the submitted discrepancies may be unproven, disputed, or clinically insignificant. We have not yet needed a system to address disputes. We currently have the person identifying a discrepancy notify the original reader and indicate that either no further action is needed on the basis of an existing follow-up report or an action would be helpful on the basis of a clinician request or patient care impact. It is then up to the radiologist receiving this feedback to act appropriately and responsibly.

CONCLUSIONS

We show the feasibility of a PL program in a pediatric neuroradiology section that follows the ACR PL accreditation pathway criteria. At our academic institution, PL is currently being piloted in the pediatric radiology sections. Soliciting feedback from PL program participants has been helpful in making changes to certain aspects of the program, such as improving trust in the PL program, expanding the meaningful case-submission types, and making improvement actions more thoughtful. While radiologists favor PL over score-based review, the lack of tools and support to run PL meetings efficiently and effectively may present a barrier to a widespread replacement of score-based review with PL. We are currently developing a submission and data-collection tool that supports semiautomated reporting for the ACR accreditation pathway, and we are exploring aspects of the PL process that can be handed off to administrative staff.

Acknowledgments

Special thanks go to Jennifer Broder, MD, who provided insight and support throughout the process of designing and implementing our peer learning program.

Footnotes

  • Disclosure forms provided by the authors are available with the full text and PDF of this article at www.ajnr.org.

References

1. American College of Radiology (ACR). Peer Learning Program Checklist for ACR Accredited Facilities. https://www.acr.org/-/media/ACR/Files/Peer-Learning-Summit/Peer-Learning-Program-Checklist-for-ACR-accredited-facilities.pdf. Accessed September 30, 2022
2. Lee CS, Neumann C, Jha P, et al. Current status and future wish list of peer review: a national questionnaire of U.S. radiologists. AJR Am J Roentgenol 2020;214:493–97. doi:10.2214/AJR.19.22194 pmid:31939700
3. Larson DB, Broder JC, Bhargavan-Chatfield M, et al. Transitioning from peer review to peer learning: report of the 2020 peer learning summit. J Am Coll Radiol 2020;17:1499–1508. doi:10.1016/j.jacr.2020.07.016 pmid:32771491
4. RADPEER. https://www.acr.org/Clinical-Resources/RADPEER. Accessed September 30, 2022
5. Larson DB, Nance JJ. Rethinking peer review: what aviation can teach radiology about performance improvement. Radiology 2011;259:626–32. doi:10.1148/radiol.11102222 pmid:21602501
6. Sharpe RE Jr, Huffman RI, Congdon RG, et al. Implementation of a peer learning program replacing score-based peer review in a multispecialty integrated practice. AJR Am J Roentgenol 2018;211:949–56. doi:10.2214/AJR.18.19891 pmid:30207788
7. Donnelly LF, Dorfman SR, Jones J 3rd, et al. Transition from peer review to peer learning: experience in a radiology department. J Am Coll Radiol 2018;15:1143–49. doi:10.1016/j.jacr.2017.08.023 pmid:29055610
8. Trinh TW, Boland GW, Khorasani R. Improving radiology peer learning: comparing a novel electronic peer learning tool and a traditional score-based peer review system. AJR Am J Roentgenol 2019;212:135–41. doi:10.2214/AJR.18.19958 pmid:30403533
9. Brook OR, Romero J, Brook A, et al. The complementary nature of peer review and quality assurance data collection. Radiology 2015;274:221–29. doi:10.1148/radiol.14132931 pmid:25188432
10. Itri JN, Donithan A, Patel SH. Random versus nonrandom peer review: a case for more meaningful peer review. J Am Coll Radiol 2018;15:1045–52. doi:10.1016/j.jacr.2018.03.054 pmid:29807816
11. Harvey HB, Alkasab TK, Prabhakar AM, et al. Radiologist peer review by group consensus. J Am Coll Radiol 2016;13:656–62. doi:10.1016/j.jacr.2015.11.013 pmid:26908200
12. Alkasab TK, Harvey HB, Gowda V, et al. Consensus-oriented group peer review: a new process to review radiologist work output. J Am Coll Radiol 2014;11:131–38. doi:10.1016/j.jacr.2013.04.013 pmid:24139321
13. Sayyouh MM, Sella EC, Shankar PR, et al. Lessons learned from peer learning conference in cardiothoracic radiology. Radiographics 2022;42:579–93. doi:10.1148/rg.210125 pmid:35148241
14. Virarkar M, Morani AC, Bhosale P, et al. Peer learning and operationalizing during COVID-19 pandemic and beyond. Cureus 2021;13:e16568. doi:10.7759/cureus.16568 pmid:34430170
15. Zhao AH, Burk KS, Enamandram SS, et al. Peer learning in radiology: effect of a pay-for-performance initiative on clinical impact and usage. AJR Am J Roentgenol 2021;216:1659–67. doi:10.2214/AJR.20.23253 pmid:33787297
16. Donnelly LF, Larson DB, Heller RE III, et al. Practical suggestions on how to move from peer review to peer learning. AJR Am J Roentgenol 2018;210:578–82. doi:10.2214/AJR.17.18660 pmid:29323555
17. The Joint Commission. What are the key elements needed to meet the ongoing professional practice evaluation (OPPE) requirements? 2021. https://www.jointcommission.org/standards/standard-faqs/critical-access-hospital/medical-staff-ms/000001500/. Accessed September 14, 2021
18. Bowman AW, Tan N, Adamo DA, et al. Implementation of peer learning conferences throughout a multi-site abdominal radiology practice. Abdom Radiol (NY) 2021;46:5489–99. doi:10.1007/s00261-021-03114-8 pmid:33999282
19. Chow RA, Tan N, Henry TS, et al. Peer learning through multi-institutional case conferences: abdominal and cardiothoracic radiology experience. Acad Radiol 2021;28:255–60. doi:10.1016/j.acra.2020.01.015 pmid:32061469
20. Broder JC, Scheirey CD, Wald C. Step by step: a structured approach for proposing, developing and implementing a radiology peer learning program. Curr Probl Diagn Radiol 2021;50:457–60. doi:10.1067/j.cpradiol.2021.02.007 pmid:33663894
21. Haas BM, Mogel GT, Attaya HN. Peer learning on a shoe string: success of a distributive model for peer learning in a community radiology practice. Clin Imaging 2020;59:114–18. doi:10.1016/j.clinimag.2019.10.012 pmid:31816537
22. Lee RK, Cohen M, David N, et al. Transitioning to peer learning: lessons learned. J Am Coll Radiol 2021;18:499–506. doi:10.1016/j.jacr.2020.09.058 pmid:33096087
23. Filice RW. Radiology-pathology correlation to facilitate peer learning: an overview including recent artificial intelligence methods. J Am Coll Radiol 2019;16:1279–85. doi:10.1016/j.jacr.2019.05.010 pmid:31492406
24. Chetlen AL, Petscavage-Thomas J, Cherian RA, et al. Collaborative learning in radiology: from peer review to peer learning and peer coaching. Acad Radiol 2020;27:1261–67. doi:10.1016/j.acra.2019.09.021 pmid:31636005
  • Received June 27, 2022.
  • Accepted after revision September 12, 2022.
  • © 2022 by American Journal of Neuroradiology