Many colleges and professional organizations employ rubrics or other measures to assess the quality of online courses, but assessment can take many different forms. Before choosing or constructing any instrument, we must first decide who (or what) we'll be assessing.
<img style="float:right; border: 1px solid black; margin:5px 5px 5px 5px;" src="http://wou.edu/~jkepka15/peopleinschoolmeeting.jpg">
Choose One:
[[We are assessing the Learning Management System]]
[[We are assessing the course design and/or accessibility]]
[[We are assessing course delivery/faculty]]
[[We are assessing student performance or readiness]]
[[We are assessing some other aspect of online learning]]
#This page is under construction
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for Assessing THING</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/firstdayschool.jpg">OVERVIEW Wholistic consideration of student readiness is generally assessed for large blocks of students. It may be assessed in concert with assessments of student performance or separately. Readiness can refer to preparation to navigate and/or obtain necessary technology; experience with the LMS or other course delivery methods; experience with online learning systems; or experience with college learning in general.
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for Assessing THING</td></tr></table>
<ul><h4>**Motivation:**</h4>
<ul><li>Is the assessment meant to be strictly informative/internal?
<li>Will the assessment be administered to students or analyze existing student data?
<li>Will it contribute to research being published (interally or externally)?
<li>Will this review motivate changes for course content or delivery?
<li>Will the assessment results motivate changes in requirements for access to online courses?</ul>
<li><h4>**Tools and Information:**</h4>
<ul><li>What instrument or information will be used to assess readiness? Examples: Student survey; LMS analytics; Faculty reports
<li>What statistics are available for comparison?
<li>What statistics need to be gathered or generated? Who has access to this information?</ul>
<li><h4>**Information Management:**</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for Assessing THING</td></tr></table>
<ul><li>Students: from these classes? from similar classes? from other classes?
<li>Instructors: from the reviewed classes, similar classes, the same department, external departments, or external institutions
<li>Administrators: Institutional Research, Academic Technology, or Distance Learning department leaders</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Existing Assessment Tools for THING</td></tr></table>
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li><a href="http://www.unc.edu/tlim/ser/" target="_blank">UNC Online Learning Readiness Questionnaire</a>
#This page is under construction
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for Assessing THING</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/firstdayschool.jpg">OVERVIEW Wholistic consideration of student readiness is generally assessed for large blocks of students. It may be assessed in concert with assessments of student performance or separately. Readiness can refer to preparation to navigate and/or obtain necessary technology; experience with the LMS or other course delivery methods; experience with online learning systems; or experience with college learning in general.
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for Assessing THING</td></tr></table>
<ul><h4>**Motivation:**</h4>
<ul><li>Is the assessment meant to be strictly informative/internal?
<li>Will the assessment be administered to students or analyze existing student data?
<li>Will it contribute to research being published (interally or externally)?
<li>Will this review motivate changes for course content or delivery?
<li>Will the assessment results motivate changes in requirements for access to online courses?</ul>
<li><h4>**Tools and Information:**</h4>
<ul><li>What instrument or information will be used to assess readiness? Examples: Student survey; LMS analytics; Faculty reports
<li>What statistics are available for comparison?
<li>What statistics need to be gathered or generated? Who has access to this information?</ul>
<li><h4>**Information Management:**</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for Assessing THING</td></tr></table>
<ul><li>Students: from these classes? from similar classes? from other classes?
<li>Instructors: from the reviewed classes, similar classes, the same department, external departments, or external institutions
<li>Administrators: Institutional Research, Academic Technology, or Distance Learning department leaders</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Existing Assessment Tools for THING</td></tr></table>
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li><a href="http://www.unc.edu/tlim/ser/" target="_blank">UNC Online Learning Readiness Questionnaire</a>
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for LMS Assessment</h4></td></tr></table>
Assessment of the Learning Management System usually occurs as part of a process to decide whether it is the right fit for an institution. Sometimes the existing LMS is assessed to find out where key stakeholders find fault (or fortune!) within it.
<img style="float:right; border:1px; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/cardcatalog.jpg">Both external and internal reviews of an LMS may already exist: departments sometimes have informal reviews that an entire college can use as part of its process, and questions about the LMS are sometimes included in student course evaluations at the end of the term. Both are valuable sources of information.
Outside of the college, there may also be many existing reviews of the LMS available. Some reviews may be able to substitute for hands-on experience, particularly in terms of long-term benefits and costs for any single system. However, as most LMS companies are commercial entities, it may be difficult to find honest and unvarnished views openly available online; this may take more research and networking.
The developers and designers who work for LMS companies may also be able to provide information about the intent and development of specific features.
Rubrics should take these different inputs into consideration for a full picture.
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for LMS Assessment:</td></tr></table>
<ul><li>Who/what is driving this review?
<ul><li>Deciding whether to implement a new LMS?
<li>Deciding whether to stay with the existing LMS?
<li>Deciding whether to implement certain features (or to remove some)?
<li>Cost/benefit analysis?
<li>Obsolescence?</ul>
<li>What most warrants assessment or review? (A weighted-scoring sketch follows this list.)
<ul><li>Current features
<li>Desired features
<li>Existing problems/criticisms
<li>Student access
<li>Faculty attitude toward the LMS
<li>Student attitude toward LMS
<li>Cost (in time, money, training)
<li>Benefits (what does the system afford users)
<li>Future use and possibility for the LMS
<li>External links (nearby institutions/partners using other LMSs)</ul></ul>
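One way to organize answers to the questions above is a weighted decision matrix: stakeholders agree on a weight for each criterion, score each candidate LMS, and compare totals. Below is a minimal sketch in Python; the criteria, weights, and scores are illustrative assumptions, not evaluations of any real product.

```python
# Hypothetical weighted decision matrix for comparing LMS candidates.
# All criteria, weights, and scores are invented for illustration.

criteria_weights = {
    "current_features": 0.25,
    "cost": 0.20,
    "faculty_attitude": 0.20,
    "student_attitude": 0.20,
    "future_roadmap": 0.15,
}

# Stakeholder scores on a 1-5 scale, e.g., gathered from surveys or demos.
candidate_scores = {
    "LMS A": {"current_features": 4, "cost": 2, "faculty_attitude": 3,
              "student_attitude": 4, "future_roadmap": 3},
    "LMS B": {"current_features": 3, "cost": 4, "faculty_attitude": 4,
              "student_attitude": 3, "future_roadmap": 4},
}

for name, scores in candidate_scores.items():
    total = sum(criteria_weights[c] * scores[c] for c in criteria_weights)
    print(f"{name}: weighted score {total:.2f} of 5")
```

Agreeing on the weights before any vendor demos begin forces the review group to state its priorities explicitly, which keeps later discussion grounded.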
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for LMS Assessment:</td></tr></table>
Discussion of an in-use Learning Management System will likely involve those who use it most. However, choosing a new system may be part of a process to bring reluctant users on board -- in which case, they should be a part of the process, too. These users may have wildly different opinions on the system, its purpose, and its usefulness.
<ul><li>Systems Administrators/Instructional Technology Staff
<li>Instructional Designers
<li>Faculty (for online, hybrid, and face-to-face classes; full-time and contingent; with LMS experience and without)
<li>Librarians (and researchers, multimedia specialists, archivists)
<li>Students (in online, hybrid, and face-to-face classes; undergraduate and graduate; part- and full-time; etc.)
<li>Outside users (i.e., continuing education enrollees, public observers, visiting faculty, college assessment or accreditation staff)
<li>Other users as defined by review initiator</ul>
<table style="background-color:#000000;color:#ffffff;"><tr>
<td>Existing Rubrics for LMS Assessment:</td></tr></table>
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li><a href="https://online.ucf.edu/about/lms-migration/lms-evaluation-checklist/" target="_blank">University of Central Florida Center for Distributed Learning LMS Checklist</a>
<li><a href="http://www.academia.edu/4313606/A_Rubric_to_Evaluate_Learning_Management_Systems" target="_blank">A Rubric to Evaluate Learning Management Systems (paper)</a> via Academia.Edu (login required)
<li><a href="http://it.umd.edu/elmseval" target="_blank">University of Maryland ELMS Committee (with a Request for Information that includes criteria for review).</a>
<li>University System of Georgia report on LMS selection (2011) available <a href="http://www.usg.edu/assets/learning_management_system/documents/USG_LMS_Task_Force_Final_Report.pdf" target="_blank">in PDF</a>. This lengthy report documents the exact steps that the sprawling USG went through to choose a new LMS after its old one was no longer supported. It includes timetables for selection, meeting minutes, descriptions of how advice was solicited, and the text and results of surveys delivered to students and faculty about LMS use.
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td>Popular Learning Management Systems:</td></tr></table>
Market shares for the major Learning Management Systems have shifted substantially in the last five years. Blackboard remains the most used system in colleges with enrollment of at least 250 students, as shown in this chart from the Fourth Annual LMS Data Review at <a href="http://edutechnica.com" target="_blank">EduTechnica</a>:
<img style="border: 1px solid black" src="http://www.wou.edu/~jkepka15/lmsupdatechart.jpg">
<ol><li><a href="http://www.blackboard.com/higher-education/index.aspx" target="_blank">Blackboard</a> is a K-12, higher ed, and other training-centered company that also provides teleconference software, hosting, mobile learning systems and apps, and a number of other technological services aimed at education. It is perhaps the oldest and richest player in the American LMS field, and has bought several competitors (for instance: ANGEL, WebCT, Prometheus). The company held approximately 75 percent of the higher education market in 2014. It has undergone significant design change in the past 2-3 years, though many campuses continue to run older versions of Blackboard.
<li>Instructure's Canvas system has seen dramatic growth in the past 5 years and has now become the second-most popular LMS in the United States for higher education.
<li>Moodle, the open-source LMS that originated in Australia, is in third place and has held relatively steady over the past few years. Of note: the hosting companies that add features to Moodle are in stiff competition, with Blackboard-owned Moodlerooms reportedly falling behind Baltimore-based eThink (citation needed).
<li>Desire2Learn (D2L) has seen small growth in the past few years.
<li>Sakai is primarily in use at small schools.
<li>ANGEL is owned by Blackboard and support has been eliminated as of 2016. (citation needed)</li></ol>
Here are the totals in the United States as of Spring 2016:
<p align="center"><img src="http://www.wou.edu/~jkepka15/lmsupdatetable.jpg"></p>
Blackboard Learn still has a substantial advantage in scale, which means that there's a slightly higher chance that new faculty may have used it. However, regional differences, department/division/subject matter preferences, and the expansion of new quasi-LMS or partial LMS like Google Apps for Education/Google Classroom may continue to change these demographics.
[[Demo Page]]
Any discussion of course design should consider the accessibility of course materials. However, some discussions are more focused on accessibility than others. The level of focus will decide the type or manner of review that's necessary:
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/mansittingonbenchatosu.jpg">
[[We are interested in an overall course design review]]
[[We are interested in a specific accessibility review]]
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for Assessing Student Performance or Readiness</h4></td></tr></table>
<img style="float:right; border:1px; margin: 5px 5px 5px 5px;" src="http://wou.edu/~jkepka15/studentsaroundadesk.jpg">Institutions or departments may want to assess the readiness or performance of students who use online courses or online course systems. Let's first define what we're discussing:
**Student Performance** here will be discussed holistically, in terms of completion or achievement of outcomes; assessments within a course are part of its design and/or delivery.
**Student Readiness** will refer to the preparation a student has before taking the course related to the online learning environment (not the content or anything that would be covered by prerequisites).
Assessment of Student Performance would typically take place after or during the course deployment, while student readiness would typically be assessed before the course/sequence begins.
[[We want to assess Student Performance]]
[[We want to assess Student Readiness]]
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h4>Key Considerations for Assessing Course Design</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/philomathschool.jpg">OVERVIEW Wholistic consideration of student readiness is generally assessed for large blocks of students. It may be assessed in concert with assessments of student performance or separately. Readiness can refer to preparation to navigate and/or obtain necessary technology; experience with the LMS or other course delivery methods; experience with online learning systems; or experience with college learning in general.
<table style="background-color:#000000;color:#ffffff; width:100%;"><tr><td>Key Questions for Assessing THING</td></tr></table>
<ul><h4>**Motivation:**</h4>
<ul><li>Is the assessment meant to be strictly informative/internal?
<li>Will the assessment be administered to students or analyze existing student data?
<li>Will it contribute to research being published (interally or externally)?
<li>Will this review motivate changes for course content or delivery?
<li>Will the assessment results motivate changes in requirements for access to online courses?</ul>
<li><h4>**Tools and Information:**</h4>
<ul><li>What instrument or information will be used to assess readiness? Examples: Student survey; LMS analytics; Faculty reports
<li>What statistics are available for comparison?
<li>What statistics need to be gathered or generated? Who has access to this information?</ul>
<li><h4>**Information Management:**</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
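If a rubric is the chosen instrument, ratings from several reviewers can be averaged per criterion to flag where a course design most needs revision. A minimal sketch, assuming a hypothetical 1-4 rating scale and invented criteria:

```python
# Hypothetical design-rubric ratings: criterion -> one rating per reviewer
# (1 = absent, 4 = exemplary). Criteria and numbers are invented.
ratings = {
    "navigation_is_consistent":  [4, 3, 4],
    "objectives_stated_clearly": [2, 3, 2],
    "materials_are_accessible":  [3, 3, 4],
}

for criterion, scores in ratings.items():
    mean = sum(scores) / len(scores)
    flag = "  <- revise" if mean < 3 else ""
    print(f"{criterion}: mean {mean:.1f}{flag}")
```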
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td>Key Stakeholders for Assessing Course Design</td></tr></table>
<ul><li>Instructional Designers: as content creators/designers, as reviewers, as subject matter experts
<li>Instructors: from the reviewed classes, similar classes, the same department, external departments, or external institutions; as content creators or subject matter experts
<li>Administrators: Institutional Research, Academic Technology, or Distance Learning department leaders
<li>Students: as users or potential users</ul>
<table style="background-color:#000000;color:#ffffff; width:100%;"><tr><td>Existing Assessment Tools for THING</td></tr></table>
The most well-known instrument for online course assessment might be the proprietary Quality Matters Rubric. This tool is worth its [[own page->QM Rubric]].
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li>California State - Chico <a href="http://www.csuchico.edu/eoi/the_rubric.shtml" target="_blank">Rubric of Online Instruction</a>, also in <a href="http://www.csuchico.edu/eoi/documents/rubricpdf">PDF</a>
<ul><li>This rubric was developed by the college after viewing several external rubrics, including Quality Matters</li></ul>
<li><a href="http://www.jalc.edu/tlc/teaching-online" target="_blank">John A. Logan College Assessment Videos</a>.
<ul><li>This series of 9 videos walks through the assessment tool JALC uses for its courses; the instrument was developed to meet accreditation standards. The Introduction is below:
<iframe width="560" height="315" src="https://www.youtube.com/embed/BYkjshPd7sQ" frameborder="0" allowfullscreen></iframe></li></ul>
#This page is under construction
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for Assessing THING</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/firstdayschool.jpg">OVERVIEW Wholistic consideration of student readiness is generally assessed for large blocks of students. It may be assessed in concert with assessments of student performance or separately. Readiness can refer to preparation to navigate and/or obtain necessary technology; experience with the LMS or other course delivery methods; experience with online learning systems; or experience with college learning in general.
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for Assessing THING</td></tr></table>
<ul><h4>**Motivation:**</h4>
<ul><li>Is the assessment meant to be strictly informative/internal?
<li>Will the assessment be administered to students or analyze existing student data?
<li>Will it contribute to research being published (interally or externally)?
<li>Will this review motivate changes for course content or delivery?
<li>Will the assessment results motivate changes in requirements for access to online courses?</ul>
<li><h4>**Tools and Information:**</h4>
<ul><li>What instrument or information will be used to assess readiness? Examples: Student survey; LMS analytics; Faculty reports
<li>What statistics are available for comparison?
<li>What statistics need to be gathered or generated? Who has access to this information?</ul>
<li><h4>**Information Management:**</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for Assessing THING</td></tr></table>
<ul><li>Students: from these classes? from similar classes? from other classes?
<li>Instructors: from the reviewed classes, similar classes, the same department, external departments, or external institutions
<li>Administrators: Institutional Research, Academic Technology, or Distance Learning department leaders</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Existing Assessment Tools for THING</td></tr></table>
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li><a href="http://www.unc.edu/tlim/ser/" target="_blank">UNC Online Learning Readiness Questionnaire</a>
<h1>Why might we assess online courses?</h1>
<img style="float:right; border:1px solid black; margin:10px 10px 10px 10px;" src="http://www.wou.edu/~jkepka15/whatarewelookingat.jpg">There are many motivations that spur the need for assessment, whether formal or informal. The lens or window through which we are evaluating is as important as what we are looking at.
This presentation will attempt to narrow the field by dividing the types of assessment we might undertake, discussing possible motivations for each type of assessment, identifying potential key stakeholders in each assessment, and collecting existing resources to help create a rubric or measurement for the review.
An honest discussion of **why we're assessing or looking at a course/system/structure** will help define the type of tool that's used.
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h3>Motivations For Review:</h3></td></tr></table>
Here are three possible motivations for a <strong>formal</strong> review (one that is institutionally sponsored or initiated):
<ol><li>**Curiosity:** Faculty, Administrators, or even students may start a review in order to satisfy curiosity about how the course (or online courses in general) functions or fits within the discipline, the pantheon of online courses, or simply within one student's day-to-day experience. Perhaps a faculty member wants to know whether her course aligns with what others are doing; perhaps a department chair wants to compare an online section with a hybrid delivery of the same course. <br>
<li>**Research:** The desire to research or to generate studied results can motivate assessment of online courses in a number of ways. For example, though numbers may be readily available for who passes or fails (completes/doesn't complete) an online course, assessment may allow more specific observation of the situation that causes these numbers. In addition, assessment may allow for statistical comparisons or qualitative inquiries. Finally, assessment may help align courses so that they can be included in experimental research designs.<br>
<li>**Administrative or Department-Level Expectations:** Online courses may be assessed to determine if and how they meet certain expectations at the department or division level. In addition, reviews may be generated by a desire to prove (or disprove) the general quality of online learning versus "traditional" learning methods.</li>
</ol>
<p>Once we have determined why we are assessing, we can then move on to deciding [[What/Who Are We Assessing]].</p>
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for Assessing Student Performance</h4></td></tr></table>
<img style="float:right; border:1px; margin: 5px 5px 5px 5px;" src="http://wou.edu/~jkepka15/graduationdayphoto.jpg">Wholistic consideration of student performance may be assessed by individual class, by class type (i.e., online vs. face-to-face), or by an online unit overall.
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for Assessing Student Performance</td></tr></table>
<ul><li><h4>**Motivation:**</h4>
<ul><li>Is the assessment meant to be strictly informative/internal?
<li>Will it contribute to research being published (internally or externally)?
<li>Will this review motivate changes for course content or delivery?</ul>
<li><h4>**Tools and Information:**</h4>
<ul><li>What scale will be used to assess performance? Examples: Complete/Not Complete; grades; GPA; outcome-based scales. (A comparison sketch follows this list.)
<li>What statistics are available for comparison?
<li>What statistics need to be gathered or generated? Who has access to this information?</ul>
<li><h4>**Information Management:**</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
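As one concrete example of the statistics questions above, completion counts for an online section and a face-to-face section can be compared with a chi-square test. This is a minimal sketch with invented counts, assuming SciPy is available:

```python
# Comparing completion rates across delivery formats (counts are invented).
from scipy.stats import chi2_contingency

#            completed  did not complete
observed = [[78, 22],   # online section(s)
            [85, 15]]   # face-to-face section(s)

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
# A small p-value suggests completion rates differ by format; it says
# nothing about why, which is what the rest of the review explores.
```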
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for Assessing Student Performance</td></tr></table>
<ul><li>**Students:** from these classes? from similar classes? from other classes?
<li>**Instructors:** from the reviewed classes, similar classes, the same department, external departments, or external institutions
<li>**Administrators:** Institutional Research, Academic Technology, or Distance Learning department leaders</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>**Key Considerations for Assessing Student Readiness**</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/firstdayschool.jpg">Student readiness is generally assessed for large blocks of students. It may be assessed in concert with assessments of student performance or separately. Readiness can refer to preparation to navigate and/or obtain necessary technology; experience with the LMS or other course delivery methods; experience with online learning systems; or experience with college learning in general.
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for Assessing Student Readiness</td></tr></table>
<ul><li><h4>**Motivation:**</h4>
<ul><li>Is the assessment meant to be strictly informative/internal?
<li>Will the assessment be administered to students, or will it analyze existing student data?
<li>Will it contribute to research being published (internally or externally)?
<li>Will this review motivate changes for course content or delivery?
<li>Will the assessment results motivate changes in requirements for access to online courses?</ul>
<li><h4>**Tools and Information:**</h4>
<ul><li>What instrument or information will be used to assess readiness? Examples: Student survey; voluntary questionnaire; pre-course assessment tool (quiz, lesson, self-check); LMS analytics; faculty reports
<li>What information is already available for comparison?
<li>What information needs to be gathered or generated? From whom?
<li>Who has access to this information?
<li>What scale will be employed to judge student readiness? Examples: 1 (not prepared) to 5 (perfectly prepared). (A scoring sketch follows this list.)</ul>
<li><h4>**Information Management:**</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
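To make the scale question above concrete, here is a minimal sketch of scoring a self-report readiness questionnaire on the 1-to-5 scale; the items and the cutoff are assumptions for illustration only:

```python
# Hypothetical readiness questionnaire: each item rated 1 (not prepared)
# to 5 (perfectly prepared). Items and the 3.5 cutoff are invented.
responses = {
    "comfort_with_technology": 4,
    "experience_with_lms": 2,
    "time_management": 3,
    "prior_online_courses": 1,
}

average = sum(responses.values()) / len(responses)
if average >= 3.5:
    print(f"Average {average:.1f}: likely ready for online coursework")
else:
    print(f"Average {average:.1f}: consider an orientation module first")
```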
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for Assessing Student Performance</td></tr></table>
<ul><li>Students: from these classes? from similar classes? from other classes?
<li>Instructors: from the reviewed classes, similar classes, the same department, external departments, or external institutions
<li>Administrators: Institutional Research, Academic Technology, or Distance Learning department leaders</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Existing Assessment Tools for Student Readiness</td></tr></table>
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li><a href="http://www.unc.edu/tlim/ser/" target="_blank">UNC Online Learning Readiness Questionnaire</a>
<ul><li><a href="http://www.cheyney.edu/InstructionalDesign/Online-Readiness-Asessment2.cfm" target="_blank">Cheyney University Student Readiness Questionnaire
<span style="font-size:2.75em;">↶</span> Return to [[We are assessing student performance or readiness]]
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h4>Assessing with the Quality Matters Rubric</h4></td></tr></table>
<p>The Quality Matters (QM) review system is popular and complicated. This page attempts an explanation of the history of QM, how it is used, and its challenges. You may navigate to any section:
<a href="#qmis">What QM Is</a>
<a href="#qmnot">What QM is Not</a>
<a href="#anchor1">History</a>
<a href="#anchor2">QM Certification</a>
<a href="#anchor3">The Review Process</a>
<a href="#anchor4">QM Benefits</a>
<a href="#anchor4">QM Challenges/Issues</a>
<table style="background-color:#000000;color:#ffffff;width:50%;"><tr><td><a name="qmis">What QM Is:</a></td></tr></table>
<p>QM is first and foremost focused on reviewing course design, not delivery. The eight General Standards focus on (via <a href="http://www.cheyney.edu/InstructionalDesign/Review-the-Quality-Matters-Rubric.cfm" target="_blank">Cheyney University</a>):
<ol>
<li><strong>Course Overview and Introduction</strong> – The overall design of the course is made clear to the student at the beginning of the course.</li>
<li><strong>Learning Objectives (Competencies)</strong> – Learning objectives are measurable and are clearly stated.</li>
<li><strong>Assessment and Measurement</strong> – Assessment strategies are designed to evaluate student progress by reference to stated learning objectives, to measure the effectiveness of student learning, and to be integral to the learning process.</li>
<li><strong>Instructional Materials </strong>– Instructional materials are sufficiently comprehensive to achieve stated course objectives and learning outcomes.</li>
<li><strong>Learner Interaction and Engagement</strong> – Forms of interaction incorporated in the course motivate students and promote learning.</li>
<li><strong>Course Technology </strong>– Course navigation and technology support student engagement and ensure access to course components.</li>
<li><strong>Learner Support</strong> – The course facilitates student access to institutional support services essential to student success.</li>
<li><strong>Accessibility</strong> – The course demonstrates a commitment to accessibility for all students.</li>
</ol>
<p>Because the alignment of outcomes throughout the course is one focus of a review, a course without measurable, specific outcomes cannot be reviewed by a Quality Matters team.
<p>As such, the review and the rubric are alignment tools in addition to a method of certifying course adherence to design principles.
<table style="background-color:#000000;color:#ffffff;width:50%;"><tr><td><a name="qmisnot">What QM Is Not:</a></td></tr></table>
<p> QM reviews do not engage in faculty criticism or student evaluation. Reviewers do not directly interact with a live course and therefore have little information on which to base critiques of engagement. In addition, reviewers do not critique the quality of a course's content beyond a subject matter expert's review of the currency and relevance of the tools and materials presented and the appropriateness of the workload for the course level.
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h1><a name="anchor1">History</a></h1></td></tr></table>
<p>The Quality Matters Rubric was developed through a federal grant by MarylandOnline, a consortium of Maryland colleges and universities. It has since spun off into an independent organization with 900 subscribing institutions, and it has developed rubrics for assessing online course design in K-12 education, higher education, educational publishing, and continuing and professional education.
<p>The Cheyney University web site provides a QM Higher Ed rubric overview, which can also be downloaded as a <a href="http://www.cheyney.edu/InstructionalDesign/documents/QM_Rubric.pdf" target="_blank">PDF</a>.
<iframe width="700" height="300" src="http://www.cheyney.edu/InstructionalDesign/documents/QM_Rubric.pdf">
<p>Your browser does not support iframes. Visit qualitymatters.org/rubric for more information</p>
</iframe>
<p>The rubric itself is available for non-subscribers, but an account must be created to access it. The rubric also has annotations for each General and Specific Standard which provide further guidance and examples for reviewers. These are not accessible without registration.
<p><a href="http://qualitymatters.org" target="_blank">QualityMatters.org</a>, the home of Quality Matters online, offers this 9-minute introduction video:
<p><iframe width="560" height="315" src="https://www.youtube.com/embed/yQm_WbRxOGU" frameborder="1" allowfullscreen></iframe>
<table style="background-color:#000000;color:#ffffff;width:50%;"><tr><td><h2><a name="anchor2">QM Certification</a></h2></td></tr></table>
<p>Once an institution or an individual pays for a subscription, the rubric can be used for informal review.
<p>QM also provides a Quality Matters Certification for higher education courses that have passed an official review. This QM seal of approval indicates that a course has met 85 percent or more of the expectations for course design outlined in the rubric's 8 General Standards. Each standard has multiple Specific Standards; a course is judged on 42 different aspects of its design during a review.
<p>Certain Specific Standards are considered "essential," meaning a course cannot pass review unless it meets each of them, regardless of its overall score. Most essential standards deal with "alignment," which is the relationship between course elements and the stated course and module outcomes.</p>
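As a worked illustration of the pass logic just described (every essential standard met, and at least 85 percent of all points earned), here is a minimal sketch; the standards, point values, and results are hypothetical, not actual QM data:

```python
# Sketch of the QM pass logic: all essential standards met AND >= 85% of
# total points earned. Standards, points, and outcomes are invented.
standards = [
    # (standard, points, essential, met)
    ("Getting-started instructions are clear", 3, True,  True),
    ("Course objectives are measurable",       3, True,  True),
    ("Activities support interaction",         2, False, False),
    ("Navigation is accessible",               2, False, True),
]

earned   = sum(pts for _, pts, _, met in standards if met)
possible = sum(pts for _, pts, _, _ in standards)
essentials_met = all(met for _, _, ess, met in standards if ess)

passes = essentials_met and earned / possible >= 0.85
print(f"Earned {earned}/{possible} ({earned / possible:.0%}); passes: {passes}")
```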
<table style="background-color:#000000;color:#ffffff;width:50%;"><tr><td><h2><a name="anchor3">Review Process</a></h2></td></tr></table>
<p>Reviews are conducted by teams of three faculty members who have successfully completed the Applying the Quality Matters Rubric and Peer Reviewer Courses. The three team members must, in some combination, include a Subject Matter Expert, a Master Reviewer (who coordinates the review and has completed additional QM training), and an External Reviewer who participates from an institution not involved in the review. It is possible for one reviewer to hold all three roles.
<p>Official reviews may be completed internally or externally. During an internal review, the campus QM Coordinator recruits all three reviewers, using the QM reviewer database. During an external review, a coordinator with QM (the organization) chooses the master reviewer, who then selects two additional team members from the QM database. Reviewers are paid.
<p>Reviews take place over three weeks (at most). If the course does not meet at the 85% expectation, the Course Representative (the instructor, designer, or other campus staff who are able to alter and speak to the design of the reviewed course) has up to 17 additional weeks to revise the design and resubmit for re-review. </p>
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h2><a name="anchor4">QM Benefits</a></h2></td></tr></table>
<p>The material benefits of QM subscriptions include copies of the QM rubric; access to QM online resources, including the Course Review Management System, professional development webinars, the Instructional Designer network, and online faculty courses; and access to QM's pool of trained reviewers.</p>
<p>Less tangible benefits include the credibility of the QM name, which stems from the construction and quality of the rubric. Northland Pioneer College outlines this case:</p>
<iframe src="http://eresource.npc.edu/distance/evaluation/WhyQM/" width="700" height="500">Visit <a href="http://eresource.npc.edu/distance/evaluation/WhyQM/" target="_blank">NPC's web site</a> if your browser doesn't support iFrames.</iframe></p>
<p>QM can assist colleges in starting and maintaining assessment procedures. Because it has a large pool of trained, ready-to-go reviewers and a developed rubric, it is seen as a stable and credible source for review. The QM Certification is also used to advertise course quality to students (the <a href="http://www.galencollege.edu/blog/article/63/why-quality-matters-matters/" target="_blank">Galen College of Nursing</a> is one example).</p>
<p>Because the rubric was developed and tested over multiple universities, and because <a href="https://www.insidehighered.com/news/2014/05/09/ideas-take-shape-new-accreditors-aimed-emerging-online-providers" target="_blank">more than 5,000 courses have now been QM reviewed</a>, it is also a certification that is useful for professionals, both as reviewers and as recipients. Professors and/or Instructional Designers can list QM certification on their CV and expect that it will be understood. This may help recruit reluctant faculty or contingent faculty into course reviews.</p>
<p>QM Certification is also useful externally, as in, for example, college accreditation reporting. Demonstrated participation in QM can be useful for showing an institution's commitment to alignment and online course assessment.</p>
<p>In other words, the major benefits of a Quality Matters subscription include:</p>
<ul>
<li>Access to the QM rubric and online Course Review System</li>
<li>Recognition of the QM name/process by internal and external authorities</li>
<li>Access to QM-led workshops, webinars, and online courses (though some have additional charges)</li>
<li>Professional development opportunities for faculty and instructional designers through QM conferences, research, PD courses, and certifications</li>
</ul>
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h2><a name="anchor5">QM Challenges/Concerns</a></h2></td></tr></table>
<p>The focus of QM Reviews is relentlessly alignment-based, which has led some to criticize QM for inspiring cookie-cutter course templates.
<p>In addition, the subscription can be expensive. Institutions must pay an annual fee based on size. Individuals may also register with QM independent of an organization; however, the costs can still be substantial.
<p>Quality Matters training and buy-in also take time. <a href="http://cop.hlcommission.org/Teaching-and-Learning/kramer2015.html" target="_blank">An initiative</a> at Allen College to increase QM certification and attention found that the time required of faculty was also a major factor in participation.
Blackboard Demo of new design/features (Dec. 2014):
<iframe width="560" height="315" src="https://www.youtube.com/embed/tLxKxHa34WU" frameborder="0" allowfullscreen></iframe>