Medical education instructional videos are more popular and easier to create than ever before. Standard quality measures for this medium do not exist, leaving educators, learners, and content creators without a way to assess these videos.

Drawing from the literature on video quality and popularity, reusable learning objects, and multimedia and curriculum development principles, we developed a 26-item instructional video quality checklist (IVQC) to capture aspects of educational design (six items), source reliability (four items), multimedia principle adherence (10 items), and accessibility (six items). Two raters applied the IVQC to 206 videos from five producers, covering topics from two organ systems (cardiology and pulmonology) and encompassing four disciplines (anatomy, physiology, pathology, and pharmacology).

Inter-rater reliability was strong. Based on two-rater means, eight multimedia items were present in over 80% of videos. A minority of videos included learning objectives (46%), alternative language translations (41%), the date the video was last updated (40%), analogies (37%), or references (9%). Ratings differed significantly across producers (p < .001) for 17 of 26 items. There were no significant differences by video topic.

The IVQC detected differences in elements of instructional video quality. Future work can apply this instrument to a broader array of videos and in authentic educational settings.