University Hospitals of Leicester NHS Trust
Dermatology

    Performance of ChatGPT on dermatology Specialty Certificate Examination multiple choice questions

Author: Wernham, Aaron
Date: 2023-06-02
DOI: 10.1093/ced/llad197
Publisher's URL: https://academic.oup.com/ced/advance-article-abstract/doi/10.1093/ced/llad197/7188526?redirectedFrom=fulltext&login=false
Abstract:
ChatGPT is a large language model trained by OpenAI on increasingly large datasets to perform language-based tasks. It is capable of answering multiple-choice questions, such as those posed by the dermatology Specialty Certificate Examination (SCE). We asked two iterations of ChatGPT, ChatGPT-3.5 and ChatGPT-4, 84 multiple-choice questions from the sample dermatology SCE question bank. ChatGPT-3.5 achieved an overall score of 63.1%, and ChatGPT-4 scored 90.5%, a statistically significant improvement in performance (p < 0.001). The typical pass mark for the dermatology SCE is 70-72%; ChatGPT-4 is therefore capable of answering clinical questions and achieving a passing grade on these sample questions. There are many possible educational and clinical implications of increasingly advanced artificial intelligence (AI) and its use in medicine, including in the diagnosis of dermatological conditions. Such advances should be embraced provided that patient safety remains a core tenet and the limitations of AI in the nuances of complex clinical cases are recognised.
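The headline numbers in the abstract can be checked with simple arithmetic. The paper's exact statistical test is not stated in this record, so the sketch below uses a two-proportion z-test purely as one plausible illustration of how a p < 0.001 result arises from scores of 63.1% and 90.5% on 84 questions; the rounding of correct-answer counts is an assumption.

```python
# Hedged sketch: reconstruct the headline arithmetic from the abstract.
# The original test statistic is unknown; a two-proportion z-test is
# assumed here for illustration only.
import math

N = 84                          # sample SCE questions (from the abstract)
correct_35 = round(0.631 * N)   # ChatGPT-3.5: 63.1% of 84 -> ~53 correct
correct_4 = round(0.905 * N)    # ChatGPT-4:   90.5% of 84 -> ~76 correct

p1, p2 = correct_35 / N, correct_4 / N
pooled = (correct_35 + correct_4) / (2 * N)
se = math.sqrt(pooled * (1 - pooled) * (2 / N))
z = (p2 - p1) / se
# two-sided p-value from the standard normal CDF (via the error function)
p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
print(f"z = {z:.2f}, p = {p_value:.2e}")
```

Under these assumptions the z statistic comes out above 4, giving a two-sided p-value well below 0.001, consistent with the significance level reported in the abstract.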
Citation:
Passby, L., Jenko, N., & Wernham, A. (2023). Performance of ChatGPT on dermatology Specialty Certificate Examination multiple choice questions. Clinical and Experimental Dermatology, llad197. Advance online publication. https://doi.org/10.1093/ced/llad197
Type: Article
URI: http://hdl.handle.net/20.500.12904/17407
Collections: Dermatology
