The ASNR-ACR-RSNA Common Data Elements Project: What Will It Do for the House of Neuroradiology?
================================================================================================

* A.E. Flanders
* J.E. Jordan

## Abstract

**SUMMARY:** The American Society of Neuroradiology has teamed up with the American College of Radiology and the Radiological Society of North America to create a catalog of neuroradiology common data elements that addresses specific clinical use cases. Fundamentally, a common data element is a question, concept, measurement, or feature with a set of controlled responses. This could be a measurement, subjective assessment, or ordinal value. Common data elements can be both machine- and human-generated. Rather than redesigning neuroradiology reporting, the goal is to establish the minimum number of “essential” concepts that should be in a report to address a clinical question. As medicine shifts toward value-based service compensation methodologies, there will be an even greater need to benchmark quality care and allow peer-to-peer comparisons in all specialties. Many government programs now focus on these measures, most recently the Merit-Based Incentive Payment System and the Medicare Access and Children's Health Insurance Program Reauthorization Act of 2015. Standardized or structured reporting is advocated as one method of assessing radiology report quality, and common data elements are a means for expressing these concepts. Incorporating common data elements into clinical practice fosters a number of very useful downstream processes, including establishing benchmarks for quality-assurance programs, ensuring more accurate billing, improving communication to providers and patients, participating in public health initiatives, enabling comparative effectiveness research, and providing classifiers for machine learning.
Generalized adoption of the recommended common data elements in clinical practice will provide the means to collect and compare imaging report data from multiple institutions locally, regionally, and even nationally, to establish quality benchmarks.

## ABBREVIATIONS:

* ACR: American College of Radiology
* AI: artificial intelligence
* ASNR: American Society of Neuroradiology
* BI-RADS: Breast Imaging Reporting and Data System
* CDE: common data element
* EHR: Electronic Health Record
* IT: information technology
* LGG: low-grade glioma
* PQRS: Physician Quality Reporting System
* RSNA: Radiological Society of North America
* TCGA: The Cancer Genome Atlas
* TCIA: The Cancer Imaging Archive
* VASARI: Visually AcceSsible Rembrandt Images

The ASNR-ACR-RSNA Common Data Elements (CDEs) Project represents a collaboration between the American Society of Neuroradiology (ASNR), the American College of Radiology (ACR), and the Radiological Society of North America (RSNA). The function of this workgroup is to develop a catalog of CDEs for neuroradiology that are both practical and useful for clinical practice, with the goals of unifying practice standards by improving consistency in reporting and developing human- and machine-interpretable features that can be used to measure quality in our specialty. There are numerous secondary benefits in comparative effectiveness research, precision medicine, radiomics, registry participation, machine learning, communication, and public health. This joint committee was formed to catalog, unify, and codify the findings, observations, and concepts commonly used in neuroradiology. The following is a brief overview of the rationale and processes behind this collaborative effort and the potential benefits to our profession and patient care.
Despite the advances in information technology that are ubiquitous in our profession, the process for composing the radiology report has changed little in the past 100 years; reports are largely prose descriptions of findings.1 The consumer of the prose report is obligated to extract its concepts to drive clinical decision-making. There is no author obligation to use consistent terminology when generating a report. This situation creates myriad problems, including challenges in comparing studies or aggregating reports of the same type produced by different authors. Re-review of the original images is often the only practical solution in either case. While there have been several notable efforts to create consistency in reporting styles through structured or standardized reporting (RSNA Informatics Reporting: RadReport; radreport.org) and by the creation of a vendor-neutral standard for a report template data schema and exchange (Management of Radiology Report Templates; [https://docs.imphub.org/display/ITMT/MRRT+Documentation](https://docs.imphub.org/display/ITMT/MRRT+Documentation)),2,3 little attention has been paid to developing consistent representation of the intrinsic concepts contained in the report that drive clinical decisions. This deficiency is at the core of the common data elements effort.

### What Is a Common Data Element?

Fundamentally, a CDE is a question/concept combined with a set of expected responses. A CDE is the most granular statement or observation that one can provide in a report. It is a single accepted concept with a response. The important characteristic is that both the concept and the response are consistent whenever the CDE is used.
CDEs can record properties of imaging findings such as anatomic location, shape, image number, image coordinates, and dimensions and can store computed values such as texture metrics.4 Machine-generated values (eg, from a sonography device or postprocessing workstation) that are subsequently inserted into a radiology report could also represent CDEs. The response could be Boolean (eg, yes or no), quantitative (eg, 1, 2.3, 5.01), ordinal (eg, A, B, C1, D6), or a list of consistent terms/phrases. A report might contain many CDEs or sets of CDEs that are relevant to a specific disease. A brain MR imaging evaluation for multiple sclerosis, for example, might include a CDE set related to specific plaque characteristics (eg, number, location, features, size, enhancement). Sets of CDEs could be incorporated into a report, with the radiologist's approval, based on specific circumstances. For example, reporting an incidental laryngeal mass on a neck CTA could be improved by automatically importing a laryngeal mass CDE subset into a CTA report template. CDE sets can be used once or reused in other clinical contexts. Use of a controlled response creates uniformity in activities such as clinical reporting for the human consumer, but it also creates an environment that facilitates computable consumption of concepts that can drive downstream actionable processes.4 Additional benefits include diminished ambiguity, increased acceptance by clinicians, modular authoring, and modification of report templates.
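The definition above, a single concept paired with a controlled response set, can be sketched as a small data structure. The following Python sketch is illustrative only (the class and element names are hypothetical, not part of any CDE standard); it shows how pairing a concept with its allowed responses lets software reject any value outside the controlled set.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class CommonDataElement:
    """A single question/concept with a controlled set of responses."""
    name: str          # the concept, eg "Foraminal stenosis"
    responses: tuple   # the controlled response set

    def validate(self, value):
        """Return the value only if it belongs to the controlled response set."""
        if value not in self.responses:
            raise ValueError(f"{value!r} is not a valid response for {self.name}")
        return value


# Hypothetical CDE definitions mirroring examples mentioned in the text
foraminal_stenosis = CommonDataElement(
    name="Foraminal stenosis",
    responses=("normal", "mild", "moderate", "severe"),
)
aspects = CommonDataElement(
    name="ASPECTS",
    responses=tuple(range(0, 11)),  # integer range 0-10
)

print(foraminal_stenosis.validate("moderate"))  # accepted: moderate
print(aspects.validate(8))                      # accepted: 8
```

Because both the concept and its responses are fixed, every report that uses the element emits one of the same few values, which is what makes downstream aggregation possible.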
Examples might include an ASPECTS for stroke ([http://www.aspectsinstroke.com/](http://www.aspectsinstroke.com/)) (integer range: 0–10), Pfirrmann grade for disc degeneration ([https://www.researchgate.net/publication/5840284_Modified_Pfirrmann_Grading_System_for_Lumbar_Intervertebral_Disc_Degeneration](https://www.researchgate.net/publication/5840284_Modified_Pfirrmann_Grading_System_for_Lumbar_Intervertebral_Disc_Degeneration)) (integer range: 1–8), or foraminal stenosis (text: normal, mild, moderate, severe). There are many examples of CDEs in our literature that correlate with outcomes, therapeutic response, and disease state. In most instances, CDEs are concepts that are already familiar to the practicing radiologist and clinician; they need not be obscure, complex, or uncommon. CDEs can also be used in both prose reporting and structured reporting. The concept of CDEs should sound familiar because radiologists have been using them in various forms for years. The Breast Imaging Reporting and Data System (ACR BI-RADS Atlas 5th Edition; [https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/Bi-Rads](https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/Bi-Rads)) is the first clinical progenitor of CDEs. BI-RADS assigns a global assessment category (0–6) reflecting the probability of malignancy, derived from component features that are part of a controlled terminology (eg, shape, margin, density of masses, calcifications, and so forth). Paramount to the generation of a BI-RADS global assessment score is its dependency on these component features/observations.
The “RADS” construct has increased in popularity in recent years in other areas such as LI-RADS (Liver Reporting and Data System; [https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/LI-RADS](https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/LI-RADS)), PI-RADS (Prostate Imaging Reporting and Data System; [https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/PI-RADS](https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/PI-RADS)), TI-RADS (Thyroid Imaging Reporting and Data System; [https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/TI-RADS](https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/TI-RADS)), NI-RADS (Neck Imaging Reporting and Data System; [https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/NI-RADs](https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/NI-RADs)), and HI-RADS (Head Injury Imaging Reporting and Data System; [https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/HI-RADS](https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/HI-RADS)).4 Compliance with a single terminology facilitates aggregation of data from multiple facilities and increases the value of our reports, including at points of care.4 Related initiatives that are tied to compliance and payment include the Physician Quality Reporting System ([https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/PQRS/index.html](https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/PQRS/index.html)) reporting measures for the Centers for Medicare and Medicaid Services. Many published CDE sets originated through clinical trials research and have reasonable interrater agreement. The National Institute of Neurological Disorders and Stroke, for example, maintains a catalog of imaging-based CDEs for research purposes ([https://www.commondataelements.ninds.nih.gov](https://www.commondataelements.ninds.nih.gov)). 
The National Institute of Neurological Disorders and Stroke collection has imaging CDEs relevant to traumatic brain injury, stroke, multiple sclerosis, spinal cord injury, Parkinson disease, and others. The Visually AcceSsible Rembrandt Images (VASARI; [https://radiopaedia.org/articles/vasari-mri-feature-set](https://radiopaedia.org/articles/vasari-mri-feature-set)) collection of The Cancer Imaging Archive ([https://www.cancerimagingarchive.net/](https://www.cancerimagingarchive.net/)) is the most comprehensive set of visual features that has been used to describe human gliomas on baseline MR imaging studies. The multicenter, federated Cancer Genome Atlas (TCGA; [https://wiki.cancerimagingarchive.net/display/Public/TCGA+Glioma+Phenotype+Research+Group](https://wiki.cancerimagingarchive.net/display/Public/TCGA+Glioma+Phenotype+Research+Group)) Glioma Phenotype Research Group collected and validated the most useful imaging features culled from the known literature on gliomas. The group developed the VASARI feature set using a large set of baseline glioblastoma and low-grade glioma (LGG) imaging studies stored in The Cancer Imaging Archive (TCIA). These phenotypic imaging data were successfully correlated with gene-expression data derived from the tumors in TCGA. The 25 features contained in the VASARI collection are all concepts familiar to neuroradiologists (eg, cyst, necrosis, enhancement, and so forth).5 A subset of the VASARI features that demonstrates value in predicting tumor genomics or survival could be incorporated into a CDE module for clinical reporting. While substantial effort by domain experts has gone into cataloging and validating these collections for research, there has been hardly any adoption of these very valuable observations into clinical reporting until very recently. Moreover, most existing CDEs took initial form as part of research initiatives and were never used again once the research was completed.
It is now well-recognized that there is substantial value in resurrecting many of these visually derived imaging features that were originally applied to address a specific research question and adapting them for clinical use. Some practices have taken on the task of incorporating CDEs to enhance the quality of local reporting practices. Mamlouk et al6 reported on their very successful collaborative effort to disseminate consistent contextual reporting templates for neuroradiology examinations in a large multicenter practice. Over 50 specific use-case neuroradiology reporting templates were created. They describe a formal process in which templates are proposed and adjudicated by a panel that includes clinical input before dissemination to all radiologists.6

### Why Do Common Data Elements Matter Now?

US health care is at a crossroads in which each specialty is being asked to define its inherent value in the patient care continuum. Health care organizations and subspecialty provider organizations are being asked to develop, benchmark, and comply with specific quality standards. Pay-for-performance initiatives, meaningful use, and Physician Quality Reporting System programs are now being wrapped up into the new value-based programs under the Merit-Based Incentive Payment System and the Medicare Access and Children's Health Insurance Program Reauthorization Act of 2015 (MIPS/MACRA; [https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Value-Based-Programs/MACRA-MIPS-and-APMs/MACRA-MIPS-and-APMs.html](https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Value-Based-Programs/MACRA-MIPS-and-APMs/MACRA-MIPS-and-APMs.html)). The radiology value chain considers the importance of clear and accurate reporting and report communications to physicians, patients, and other stakeholders. The ACR has also identified these areas as potential value-based payment metrics.
In addition to clarity of reporting, structured reporting and a standard lexicon and language are key elements of the value-based payment metrics proposal.7 Because referring physicians have shown a preference for structured reporting over conventional radiology reports, CDEs will likely play a key role in service to the goals of value-based reporting.8 The dissemination of electronic medical records spurred by the American Recovery and Reinvestment Act of 2009 stimulus package has facilitated the collection, sharing, and dissemination of data.9 Thus, various clinical subspecialties (eg, cardiology, gastroenterology, pathology, ophthalmology, oncology) have been very active in defining clinical concepts and reporting elements for the electronic medical record that can be readily mined to establish quality parameters and benchmark quality, thereby demonstrating the value of the services delivered. Clinical use of CDEs fosters participation in data registries, which, in turn, are being used to benchmark practice performance. The field of radiology has led the way in health care information technology (IT), interoperability, and information exchange, yet our field remains behind in standardizing quality measures for radiology reporting. With the exception of mammography, most of our quality metrics have focused on service delivery, workflow metrics, and payment policy and less on the content of our reports.
Nevertheless, accuracy of reporting is a quintessential value metric of what radiologists offer, and CDEs should be viewed as a powerful tool to enhance the quality of our reports and actionable information.10

### How the Currently Available CDEs Were Created

At an Imaging Informatics summit of the Radiological Society of North America Radiology Informatics Committee and the American College of Radiology Imaging Technology and Informatics Committee, discussions focused on the relative absence of codified observations/findings in radiology and a structure for representing them in our IT platforms. A collaboration was started to help fill this void. The initial objectives of this collaboration were to generate a common data model and syntax for representing reporting concepts that could interoperate with existing reporting technologies and extend their capabilities. An on-line repository (RadElement.org; [http://www.radelement.org/](http://www.radelement.org/)) was built to house some of the limited existing content (eg, LI-RADS, PI-RADS). Each concept and controlled response stored in the repository is assigned a unique identifier used in downstream IT systems to retain the fidelity of the concept and response. Having set the stage, groups of domain experts could begin to create content to populate the repository, validate the concepts, and develop modifications. Similar to the related efforts in terminology (RadLex; [http://www.rsna.org/RadLex.aspx](http://www.rsna.org/RadLex.aspx)) and reporting (RadReport), cataloging CDEs requires enlisting the knowledge of domain experts to ensure that relevant content is being included. For both the RadLex and RadReport efforts, the American Society of Neuroradiology was the first subspecialty organization to volunteer to help the RSNA create repositories of neuroradiology-/ear, nose, and throat–specific terminology and report templates, respectively.
The ASNR has again volunteered to be the first subspecialty society to lend its expertise to this new CDE effort under the auspices of the ASNR Standards and Guidelines Committee. The neuroradiology effort is taking a pragmatic approach by developing CDEs for common clinical use cases that a neuroradiologist encounters every day rather than attempting to encompass all diseases in our specialty. This will help inform how we design a process for authoring, vetting, editing, and publishing content in an efficient manner. The workgroup's charge is to compile only the most essential concepts for each clinical use case and to avoid making the lists comprehensive or exhaustive. By limiting the sets to the most essential concepts, the CDE sets become more practical, modular, and easier to use in practice and to incorporate into a report. The intent is to replicate what is taught in the training environment, whereby a neuroradiology attending physician might recite to a trainee the few key concepts that must be conveyed in a clinically useful report. Ultimately, the goal of this initiative is to empower the domain experts in the ASNR to develop the criteria on the basis of experience, evidence, and clinical consultation. Twenty-five neuroradiologists and staff from the ASNR, ACR, and RSNA participate in the workgroup activities. There is neuroradiology subspecialty representation from the American Society of Spine Radiology, American Society of Functional Neuroradiology, American Society of Pediatric Neuroradiology, and American Society of Head and Neck Radiology. A group e-mail account and a collaborative workspace were set up to support asynchronous communication and to give all members access to all work products and artifacts. The group meets monthly by teleconference with a preplanned agenda, action items, and minutes. Ideas for new CDE nominations are brought to the entire group.
A single subspecialty volunteer then takes ownership of the first draft of the CDE set, which is authored directly on a spreadsheet visible to all workgroup members. The workgroup is free to revise or comment on the draft. Corrections or modifications are made on the basis of exchange through group e-mails or via discussion at the monthly teleconference. The final version of the CDE is then handed off to the ACR-RSNA CDE subcommittee to catalog and number in the RadElement.org CDE repository. A Neuro-CDE master list is used to track CDE progress from proposal to final draft. Twelve of the initial CDE sets or modules were converted into PowerScribe 360 macros ([https://www.nuance.com/healthcare/medical-imaging/powerscribe-360-reporting.html](https://www.nuance.com/healthcare/medical-imaging/powerscribe-360-reporting.html)) and posted on the ASNR Web site for public view/download ([https://www.asnr.org/resources/cde/](https://www.asnr.org/resources/cde/)) and were featured in a public demonstration at the ASNR 2018 Annual Meeting (Vancouver, British Columbia, Canada).

### What Are the Potential Benefits of CDEs?

A number of other potential tangible benefits and incentives for radiologists to embrace CDE models and reporting exist, and there is growing evidence that inclusion of CDEs in clinical reporting can be performed efficiently, will augment communication, and is preferred by clinicians.11,12 While current vendor offerings of reporting products are limited in their ability to fully support CDEs, there is a movement underway to address these limitations for the next generation of reporting tools that will include CDEs and radiology decision support content (computer-assisted reporting and decision support [CAR/DS]).13 Artificial intelligence (AI) and natural language processing cannot ultimately solve the problem of converting heterogeneous prose reports into homogeneous concepts.
A combination of solutions that includes new reporting tools that aid radiologists in image interpretation and dictation will ultimately provide the ideal balance between quality and efficiency. While vendors can encode picklists and insertion fields into templates, the current commercial offerings lack the ability to incorporate triggers and logic into the reporting workflow that enhance efficiency, mitigate reporting errors, and augment quality. There is the capability today, however, to dynamically modify a report on the basis of the content that has already been created. For example, mention of a mass on brain MR imaging might invoke a CDE set that cues the radiologist with a list of ASNR-recommended brain mass features. The ACR-Assist ([https://www.acr.org/Practice-Management-Quality-Informatics/Informatics/Structured-Content](https://www.acr.org/Practice-Management-Quality-Informatics/Informatics/Structured-Content)) technology is a radiologist decision support framework that uses the spoken or transcribed concepts in a report as a “trigger” to instantaneously provide consistent and useful supplemental recommendations in a report.4 The software behind radiology decision support has “awareness” of key concepts/findings/observations (eg, CDEs) and can use this knowledge base to automatically suggest other supplemental features that should be included or to provide recommendations based on the individual features of a finding. A TI-RADS score could automatically be calculated and inserted into a report on the basis of feature descriptions of a thyroid nodule. The automatic insertion of a macro containing the essential imaging features of laryngeal cancer could follow after describing an incidental laryngeal mass on a neck CT angiogram. Information collected from that macro could generate staging information for the electronic medical record that would be valuable to the oncologist/otolaryngologist. 
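The automatic scoring described above reduces to simple arithmetic over the reported feature CDEs. The sketch below is illustrative only: the point values and feature names are placeholders, not the actual ACR TI-RADS tables, and the report-insertion step is a stand-in for what a reporting system would do.

```python
# Placeholder point tables: these values are hypothetical, NOT the real
# ACR TI-RADS assignments. They only illustrate the mechanism.
FEATURE_POINTS = {
    "composition": {"cystic": 0, "spongiform": 0, "mixed": 1, "solid": 2},
    "echogenicity": {"anechoic": 0, "hyperechoic": 1, "hypoechoic": 2},
    "shape": {"wider-than-tall": 0, "taller-than-wide": 3},
}


def score_nodule(features: dict) -> int:
    """Sum the points contributed by each reported feature CDE."""
    return sum(FEATURE_POINTS[name][value] for name, value in features.items())


def insert_score(report: str, features: dict) -> str:
    """Append a computed assessment line to the dictated report text."""
    return f"{report}\nComputed nodule score: {score_nodule(features)} points."


print(insert_score(
    "1.2 cm right thyroid nodule.",
    {"composition": "solid", "shape": "taller-than-wide"},
))
```

Because the features arrive as controlled CDE responses rather than free prose, the score can be computed and inserted without any language parsing, which is the efficiency the text describes.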
Paramount to the development and deployment of these new software tools is expert review of the inherent concepts and potential enhancements (eg, calculations, assessment scores). Inclusion of CDEs in reports creates a multitude of opportunities for the concepts in the report to improve other downstream processes. The unique identifier associated with each CDE concept/response can be encoded and transmitted with the text report by a reporting system and can be used to trigger downstream events in other disparate IT systems that have been programmed to comprehend and respond to specific concepts and values. New events could include automatic notification of care team members for critical results communication while the report is still in process. Automated generation of quality-assurance data for a number of clinical use cases such as acute stroke turnaround times and notification could be more accurately collected. Payment denials could be mitigated at the time of report generation by checking for appropriate terminology and concepts in reports that are critical for approval. In the electronic medical record, encoded CDEs could be used to supplement the problem list, progress notes, recommendations, and discharge summaries of the patient. Patient-centric versions of radiology reports could be generated for consumption on patient portals. The concepts from CDEs could be used to collect vast quantities of data for quality assurance and benchmarking in registries. Local, regional, and national registries containing imaging features for specific clinical use cases could be created and could be used for large-scale imaging-based comparative effectiveness research for population health and high-profile health care initiatives. These all have an additive effect of augmenting the value of every radiology report (Figure). 
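The downstream triggering described above can be pictured as a registry that maps encoded CDE identifiers to actions. This is a hedged sketch: the identifier "RDE236.3" is borrowed from the figure caption as an example, and the handler behavior (a critical-results notification) is hypothetical, not a description of any actual interface.

```python
from typing import Callable

# Registry mapping CDE identifiers to downstream actions.
HANDLERS: dict[str, list[Callable[[str], None]]] = {}


def on_cde(identifier: str):
    """Register a downstream action for a specific CDE concept/response."""
    def register(fn):
        HANDLERS.setdefault(identifier, []).append(fn)
        return fn
    return register


@on_cde("RDE236.3")  # hypothetical: a response flagged as a critical finding
def notify_care_team(report_id: str) -> None:
    print(f"Critical-results notification sent for report {report_id}")


def dispatch(report_id: str, encoded_cdes: list[str]) -> None:
    """Fire every handler registered for the CDE identifiers in a report."""
    for identifier in encoded_cdes:
        for handler in HANDLERS.get(identifier, []):
            handler(report_id)


# A report carrying this encoded identifier triggers the notification
# while the report is still in process.
dispatch("RPT-001", ["RDE236.3"])
```

The same dispatch pattern could feed the other consumers listed in the text (billing checks, quality-assurance logs, registry submissions) simply by registering more handlers on the relevant identifiers.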
[FIGURE.](http://www.ajnr.org/content/40/1/14/F1) FIGURE. Concepts/responses captured through report CDEs are used in downstream IT systems. Concepts, features, and measurements from CDEs are encoded with unique identifiers (eg, RDE236.3) by the reporting system, which are passed across interfaces and received by various systems programmed to act on specific values. The unique identifiers can trigger other events or be recoded/translated to provide discrete data that drive additional value-based health care processes. PQRS indicates Physician Quality Reporting System; EHR, Electronic Health Record.

Medical imaging AI research and development could also be accelerated by the inclusion of CDEs in clinical reporting. While close to one-half billion unique imaging studies are generated annually in the United States, only a small portion of these examinations are “AI ready” for training and validation of AI algorithms. The lack of relevant annotations is often cited as the principal reason for shortages of suitable AI training datasets. Investigators have attempted to mobilize natural language–processing applications to retrospectively extract the needed concepts from prose reports with varied success. Additional expert resources are usually required to re-review the original imaging data to create the annotations for a specific disease entity (eg, stroke, glioma, fracture). The annotations and anatomic locations of the features on the images are used to create AI classifiers of disease. Imaging concepts encoded in CDEs make it simpler to create the annotations and subsequently the AI classifiers. The inclusion of CDEs in reports makes it easier to prospectively generate needed annotations. Moreover, universal adoption of CDEs for stroke, cerebral neoplasia, multiple sclerosis, and so forth makes it easier to aggregate data from multiple sites for AI development.
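The annotation advantage described above can be made concrete: when a report carries encoded CDEs, training labels are a direct lookup rather than an NLP extraction problem. The sketch below is hypothetical (the study identifiers, field names, and the "mass_present" concept are invented for illustration).

```python
# Hypothetical CDE-coded reports; field names are illustrative only.
reports = [
    {"study": "MR-1001", "cdes": {"mass_present": "yes", "enhancement": "ring"}},
    {"study": "MR-1002", "cdes": {"mass_present": "no"}},
]


def training_labels(reports, concept, positive="yes"):
    """Map study IDs to binary labels read directly from encoded CDEs."""
    return {r["study"]: int(r["cdes"].get(concept) == positive) for r in reports}


print(training_labels(reports, "mass_present"))
# With prose reports, the same step would require natural language
# processing plus expert re-review of the original images.
```

Because every site that adopts the same CDE set emits the same concept/response pairs, the label-extraction function above works unchanged on multi-site data, which is the aggregation benefit the text notes.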
## Conclusions

There are clearly a large number of benefits to be derived from adopting the general practice of using a singular set of concepts, observations, and features in radiology reporting. Codifying the content with neuroradiology domain experts is critical to the success of the process. The joint collaboration among the ASNR, ACR, and RSNA aims to develop a continual process by which CDE content is proposed, authored, reviewed, approved, and validated for all of neuroradiology. The effort can provide a single clearinghouse of neuroradiology CDEs that can be directly used by the commercial and research sectors to improve product offerings. There is a “symbiosis” between product development and content creation for CDEs, with each relying on the deliverables of the other. The hope is that the processes being set forth by the ASNR in collaboration with the RSNA and ACR will serve as a pilot for content creation by the other radiology subspecialties. We encourage all practitioners in the “House of Neuroradiology” to contribute and provide guidance for the construction of this collection.

## References

1. Langlotz CP. The Radiology Report: A Guide to Thoughtful Communication for Radiologists and Other Medical Professionals. CreateSpace Independent Publishing Platform; 2015
2. Chen JY, Sippel-Schmidt TM, Carr CD, et al. Enabling the next-generation radiology report: description of two new system standards. Radiographics 2017;37:2106–12 doi:10.1148/rg.2017160106 pmid:28968194
3. Kahn CE Jr, Langlotz CP, Burnside ES, et al. Toward best practices in radiology reporting. Radiology 2009;252:852–56 doi:10.1148/radiol.2523081992 pmid:19717755
4. Rubin DL, Kahn CE. Common data elements in radiology. Radiology 2017;283:837–44 doi:10.1148/radiol.2016161553 pmid:27831831
5. The TCGA Glioma Phenotype Research Group. [https://wiki.cancerimagingarchive.net/display/Public/TCGA+Glioma+Phenotype+Research+Group](https://wiki.cancerimagingarchive.net/display/Public/TCGA+Glioma+Phenotype+Research+Group). Accessed May 15, 2018
6. Mamlouk MD, Chang PC, Saket RR. Contextual radiology reporting: a new approach to neuroradiology structured templates. AJNR Am J Neuroradiol 2018 Jun 14. [Epub ahead of print] doi:10.3174/ajnr.A5697 pmid:29903922
7. US Department of Health and Human Services, Centers for Medicare and Medicaid Services. 42 CFR Parts 414 and 495. Medicare program; Merit-Based Incentive Payment System (MIPS) and Alternative Payment Model (APM) incentive under the Physician Fee Schedule, and criteria for physician-focused payment models. [https://www.gpo.gov/fdsys/pkg/FR-2016-05-09/pdf/2016-10032.pdf](https://www.gpo.gov/fdsys/pkg/FR-2016-05-09/pdf/2016-10032.pdf). Accessed August 20, 2016
8. Schwartz LH, Panicek DM, Berk AR, et al. Improving communication of diagnostic radiology findings through structured reporting. Radiology 2011;260:174–81 doi:10.1148/radiol.11101913 pmid:21518775
9. H.R.1, American Recovery and Reinvestment Act of 2009. [https://www.congress.gov/bill/111th-congress/house-bill/1/text](https://www.congress.gov/bill/111th-congress/house-bill/1/text). Accessed August 8, 2018
10. Wibmer A, Vargas HA, Sosa R, et al. Value of a standardized lexicon for reporting levels of diagnostic certainty in prostate MRI. AJR Am J Roentgenol 2014;203:W651–57 doi:10.2214/AJR.14.12654 pmid:25415731
11. Weinberg BD, Gore A, Shu HG, et al. Management-based structured reporting of posttreatment glioma response with the brain tumor reporting and data system. J Am Coll Radiol 2018;15:767–71 doi:10.1016/j.jacr.2018.01.022 pmid:29503151
12. Bink A, Benner J, Reinhardt J, et al. Structured reporting in neuroradiology: intracranial tumors. Front Neurol 2018;9:32 doi:10.3389/fneur.2018.00032 pmid:29467712
13. Alkasab TK, Bizzo BC, Berland LL, et al. Creation of an open framework for point-of-care computer-assisted reporting and decision support tools for radiologists. J Am Coll Radiol 2017;14:1184–89

* Received May 24, 2018.
* Accepted after revision June 28, 2018.
* © 2019 by American Journal of Neuroradiology