Many colleges and professional organizations employ rubrics or other measures to assess the quality of online courses, but many different types of assessment can be used. Before choosing or constructing any instrument, we must first decide who (or what) we'll be assessing.
<img style="float:right; border: 1px solid black; margin:5px 5px 5px 5px;" src="http://wou.edu/~jkepka15/peopleinschoolmeeting.jpg">
Choose One:
[[We are assessing the Learning Management System]]
[[We are assessing the course design and/or accessibility]]
[[We are assessing course delivery/faculty]]
[[We are assessing student performance or readiness]]
[[We are assessing some other aspect of online learning]]
<table width="100%" style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for Assessing Course Delivery or Online Faculty</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/peerreviewcats.jpg">Assessing delivery of online courses may be a discussion of the method (e.g., online versus face-to-face) or technology (LMS), but here, delivery and faculty performance will be considered together. This definition is borrowed from the Quality Matters division between Design and Delivery, which is echoed in many online rubrics.
Some delicacy and tact will likely be required when constructing and enacting reviews that touch on faculty performance. Before reviews can be undertaken, the motivations and projected outcomes should be available to all participants.
In general, faculty performance has been assessed in the same way online as it has been in the classroom: namely, through student evaluations or professional/peer observation and evaluation. Online courses, however, offer an opportunity for self-review that a face-to-face class does not: the instructor can observe, review, and even alter a live class while it's in session without having to videotape themselves or otherwise break the flow of class proceedings.
Most of the tools here focus on self-assessment and peer review. Student evaluation of online instruction should, of course, be part of any true review of online course delivery, but tools for that purpose already exist in abundance.
<table width="100%" style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for Assessing Course Delivery</td></tr></table>
<h4>Motivation:</h4>
<ul><li>Is the assessment meant to be strictly informative/internal? If so, will permanent records be maintained? By whom? Who will have access?
<li>Will the assessment have a bearing on faculty assignments, pay, status, tenure?
<ul><li>If so, consider whether there may be legal issues; consult with an HR professional before beginning the review or assigning review committee personnel.
<li>If so, also consider how faculty knowledge of these stakes may change their level of involvement with the assessment.</ul>
<li>Will it contribute to research being published (internally or externally)?
<li>Will this review motivate changes for faculty training (either immediately or in the future)?
<li>Will reviews be conducted by trained faculty or faculty peers? Will training be provided in how to assess the course?
<li>Will the assessment results motivate changes in requirements for access to online courses?</ul>
<h4>Tools and Information:</h4>
<ul><li>What instrument or information will be used to assess course delivery? Examples: Student survey; LMS analytics; Faculty reports
<li>What statistics are available for comparison?
<li>What statistics need to be gathered or generated? Who has access to this information?</ul>
<h4>Information Management:</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
<table width="100%" style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for Assessing Course Delivery</td></tr></table>
<ul><li>Students: from these classes? from similar classes? from other classes? Using existing evaluation methods or others? Help desk workers who may have received complaints?
<li>Instructors: from the reviewed classes, similar classes, the same department, external departments, or external institutions; peers or different ranks? same or different subject matters? Volunteers or as service requirement?
<li>Administrators: department or division chairs, Institutional Research staff, Academic Technology, or Distance Learning department leaders; teaching and learning center staff</ul>
<table width="100%" style="background-color:#000000;color:#ffffff;"><tr><td>Existing Assessment Tools for Course Delivery</td></tr></table>
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li><a href="http://facdev.e-education.psu.edu/evaluate-revise/peerreviewonline" target="_blank">Penn State</a> has a peer review process for online faculty that includes self-reflection as well as peer observation. It is available under a CC-BY-NC-SA license for adaptation.
<ul><li>They also have a CC licensed <a href="https://weblearning.psu.edu/FacultySelfAssessment/#" target="_blank">faculty self-assessment tool</a> to determine whether one is ready to teach online. </ul>
<li>Eskey, M.T. and Roehrich, H. (n.d.). "<a href="http://www.nyu.edu/classes/keefer/waoe/eskeyr.pdf" target="_blank">A Faculty Observation Model for Online Instructors: Observing Faculty Members in the Online Classroom</a> [PDF]" from Park University.
<ul> <li>The <a href="http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.395.8698&rep=rep1&type=pdf" target="_blank">Online Instructor Evaluation System</a> is available in this document, along with results from some reviews.</li></ul>
<li>Presentation (1 hour): O'Malley and Clinefelter, "Assessing the Quality of Online Instruction Using Observations and Student Surveys," The Learning House Connect 2012. A <a href="http://www.learninghouse.com/blog/publishing/planning-for-success-the-importance-of-course-delivery-assessment-tools" target="_blank">text summary</a> is also available.<iframe width="560" height="315" src="https://www.youtube.com/embed/y3RgITWelRs" frameborder="0" allowfullscreen></iframe></ul>
[[Return to Main Navigation->What/Who Are We Assessing]]
<table width="100%" style="background-color:#000000;color:#ffffff;"><tr><td><h4>Other Possible Assessments for Online Learning</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://www.wou.edu/~jkepka15/linotyperoom.jpg">
Online learning systems and content can be assessed in many different ways. Here are a few other types or systems of assessment that may be useful:
<a href="#oope">Overall online program evaluations</a>
<a href="#oer">Open Educational Resource assessments</a>
<a href="#tech">Technology and Security Assessment</a>
<table style="background-color:#000000;color:#ffffff;"><tr>
<td><h4 id="oope">Overall Program Evaluation</h4></td></tr></table>
Institutions may want to consider the overall quality, outcomes, student satisfaction, or support associated with their online programs. Some institutions may want to consider whether it is worthwhile to start an online division or to consolidate existing efforts. These evaluation tools may be of use as part of this consideration:
<ul><li><a href="http://onlinelearningconsortium.org/consult/quality-scorecard/" target="_blank">The Online Learning Consortium Quality Scorecard (2016)</a>.
<ul><li>"By using the OLC Scorecard an administrator – regardless of size or type of institution – can determine strengths and weaknesses of their program, and initiate planning efforts towards areas of improvement. It can also be used to demonstrate elements of quality within the program, as well as an overall level of quality, to higher education accrediting bodies." The scorecard can be downloaded for free.</li></ul>
<li><a href="http://library.gwu.edu/sites/default/files/tlc/Middle%20States%20Guidelines%20for%20Evaluation%20of%20Distance%20Education.pdf" target="_blank">Guidelines for the Evaluation of Distance Education (Online Learning) from Middle States Commission on Higher Education</a>. (2009)
<li>Lockee, B., Moore, M., and Burton, J. (2002). <a href="http://library.gwu.edu/sites/default/files/tlc/eqm0213.pdf" target="_blank">Summative Evaluation of Distance Education [PDF]</a>. <em>EDUCAUSE</em>.</ul>
<table style="background-color:#000000;color:#ffffff;"><tr>
<td><h4 id="oer">Open Educational Resource assessment</h4></td></tr></table>
With the growing enthusiasm for the creation and remixing of open educational resources around the world, programs, faculty, students, and administrators may be interested in assessing both how and what OER are being used within their programs. These tools may be helpful in that assessment:
<ul><li><a href="http://info.merlot.org/merlothelp/merlot_peer_reviewer_report_form_09-2016.pdf" target="_blank">The MERLOT Peer Reviewer form in PDF</a>
<ul><li>This form is used by faculty reviewers as they assess new entries to the MERLOT database of OER. It would be useful both for self-review, peer review, and committee review of a created or in-use resource. </ul>
<li>Atenas, J. and Havemann, L. (2014). Questions of quality in repositories of open educational resources: a literature review. <em>Research in Learning Technology</em>, 22: <a href="https://www.openeducationeuropa.eu/sites/default/files/legacy_files/asset/questions-quality-repositories-OER-article.pdf" target="_blank">http://dx.doi.org/10.3402/rlt.v22.20</a>
<ul><li>This article provides "10 quality indicators" for assessing whether a repository of OER is effective.</ul>
<li>OER Assessment Training from Open Oregon (14 April 2016):
<iframe src="http://cdnapi.kaltura.com/p/601682/sp/60168200/embedIframeJs/uiconf_id/28127352/partner_id/601682?iframeembed=true&playerId=kaltura_player_1465275006&entry_id=0_pmuma8pd&flashvars[streamerType]=auto" width="560" height="395" allowfullscreen webkitallowfullscreen mozAllowFullScreen frameborder="0"></iframe></ul>
<table style="background-color:#000000;color:#ffffff;"><tr>
<td><h4 id="tech">Technology Assessment</h4></td></tr></table>
The "distance" of distance learning imposes new challenges in sending and receiving private information. Institutions may want to audit the technology being used by students, faculty, and the institution as a whole, from individual apps to the system-wide infrastructure that supports Information Technology. These articles present good starting places for this type of assessment and review:
<ul><li>Grama, J. (2015). <a href="http://er.educause.edu/articles/2015/2/~/link.aspx?_id=67D92C0F4A204081B9E47540E548FA98&_z=z" target="_blank">Understanding IT Governance, Risk, and Compliance in Higher Education: IT Governance</a>. EDUCAUSE Review.
<ul><li>This series of three articles defines GRC and explains why IT boards in higher education should consider all three: "By providing more informed decisions, better investment decisions, and more stakeholder support, IT governance programs help ensure that information technology aligns with the institutional mission."</ul>
<li><a href="http://www.nctp.com/tech_plan_links.cfm" target="_blank">National Center for Technology Planning articles</a>
<ul><li>This site provides a number of planning documents related to implementing technology in the classroom with an emphasis on considering how it may be used. It is the home of the Technology Audit Survivor's Guide, which is a lengthy but useful resource when planning an audit of this type.</ul></ul>
[[Return to Main Navigation->What/Who Are We Assessing]]
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4>Key Considerations for LMS Assessment</h4></td></tr></table>
<table style="background-color:#000000;color:#ffffff;" width="50%"><tr><td><h5>Contents:</h5></td></tr></table>
<a href="#keyqs">Key Questions</a>
<a href="#keystakes">Key Stakeholders</a>
<a href="#rubrics">Existing Rubrics</a>
<a href="#poplms">Popular LMS: Trends, Description, and History</a>
Assessment of the Learning Management System usually occurs as part of a process to decide whether it is the right fit for an institution. Sometimes, an existing LMS is assessed to find out where key stakeholders find fault (or fortune!) within.
<img style="float:right; border:1px; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/cardcatalog.jpg">Both external and internal reviews of an LMS may already be available: departments sometimes have informal reviews of an LMS that an entire college can use as part of its process, and questions about the LMS are sometimes included in student course evaluations at the end of the term. Both are valuable sources of information.
Outside of the college, there may also be many existing reviews of the LMS available. Such reviews can partly substitute for hands-on experience, particularly regarding the long-term benefits and costs of any single system. However, because most LMS companies are commercial entities, it may be difficult to find honest and unvarnished views openly available online; finding them may take more research and networking.
The designers and developers who work for Learning Management System companies may also be able to provide information about the intent and development of particular features.
Rubrics should take these different inputs into consideration for a full picture.
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4 id="keyqs">Key Questions for LMS Assessment:</h4></td></tr></table>
<ul><li>Who/what is driving this review?
<ul><li>Deciding whether to implement a single LMS for a campus or system?
<li>Deciding whether an LMS is needed?
<li>Deciding between homegrown, open, or commercial LMS?
<li>Deciding whether to implement a new LMS?
<li>Deciding whether to stay with the existing LMS?
<li>Deciding on whether to implement certain features (or to take some away)?
<li>Cost/benefit analysis?
<li>Obsolescence?</ul>
<li>What most warrants assessment or review?
<ul><li>Current features
<li>Desired features
<li>Existing problems/criticisms
<li>Student access
<li>Faculty attitude toward the LMS
<li>Student attitude toward LMS
<li>Cost (in time, money, training, support, hosting)
<li>Benefits (what does the system afford users)
<li>Future use and possibility for the LMS
<li>External alignment/connections (nearby institutions/partners using other LMS)</ul>
<li>What are the core features that any LMS must have to work at our institution?
<li>Will use be mandatory, decided among divisions, or completely optional?
<ul><li>If optional, can others use different LMS?</ul>
<li>What external tools must the LMS work with or within?
<ul><li>Examples: Student Information Systems (Banner, Oracle/Peoplesoft, etc.); online or digital learning packages (Pearson, etc.)</ul></ul>
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4 id="keystakes">Key Stakeholders for LMS Assessment:</h4></td></tr></table>
Discussion of an in-use Learning Management System will likely involve those who use it most (whether in person or through usage statistics). However, choosing a new system may be part of a process to bring reluctant users on board, which requires an entirely different recruitment process. Users may have wildly different opinions on the system, its purpose, and its usefulness.
To facilitate an open conversation, it may be useful to remember that even the definition of the learning management system's role will not be obvious to many users and non-users. Targeting information about the review to appeal to both novice and expert audiences may help in recruiting a well-rounded and lively team.
####Key Stakeholders include:
<ul><li>Systems Administrators
<li>Instructional Technology Staff
<li>Instructional Designers
<li>Faculty (for online, hybrid, and face-to-face classes; full-time and contingent; with LMS experience and without)
<li>Librarians (and researchers, multimedia specialists, archivists)
<li>Students (in online, hybrid, and face-to-face classes; undergraduate and graduate; part- and full-time; etc.)
<li>Outside users (i.e., continuing education enrollees, public observers, visiting faculty, college assessment or accreditation staff)
<li>Other users as defined by review initiator (accreditors, outside reviewers)</ul>
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4 id="rubrics">Existing Rubrics for LMS Assessment:</h4></td></tr></table>
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li><a href="https://online.ucf.edu/about/lms-migration/lms-evaluation-checklist/" target="_blank">University of Central Florida Center for Distributed Learning LMS Checklist</a>
<li><a href="http://www.academia.edu/4313606/A_Rubric_to_Evaluate_Learning_Management_Systems" target="_blank">A Rubric to Evaluate Learning Management Systems (paper)</a> via Academia.Edu (login required)
<li>Tri-C (Cuyahoga Community College) <a href="https://wcetfrontiers.org/2016/02/04/the-great-lms-review-adventure/" target="_blank">LMS Review process</a>, which includes:
<ul><li>A <a href="https://drive.google.com/file/d/0B0lwpuBcseSvNnpPRC1Uc0pqU0k/view" target="_blank">robust LMS rubric</a>
<li>The <a href="https://drive.google.com/file/d/0B0lwpuBcseSvUmVBNXFnX2VGa0E/view" target="_blank">Request For Proposals</a> that the committee developed
<li>And write-ups and videos of the entire process (which is continuing with Blackboard, Brightspace, and Canvas) at <a href="https://elearningandinnovation.com/lms-review/" target="_blank">the Tri-C eLearning blog</a></ul>
<li><a href="http://it.umd.edu/elmseval" target="_blank">University of Maryland ELMS Committee (with a Request for Information that includes criteria for review).</a>
<li>University System of Georgia report on LMS selection (2011) available <a href="http://www.usg.edu/assets/learning_management_system/documents/USG_LMS_Task_Force_Final_Report.pdf" target="_blank">in PDF</a>. This lengthy report documents the exact steps that the sprawling USG went through to choose a new LMS after its old one was no longer supported. This includes timetables for selection, meeting minutes, descriptions of how advice was solicited, and the text and results of surveys delivered to students and faculty about LMS use.
<li>The University of North Carolina Charlotte published their final report on a decision to migrate to Canvas. This <a href="http://teaching.uncc.edu/sites/teaching.uncc.edu/files/media/files/LMSEval/FinalReportoftheLearningManagementSystemEvaluationCommitteeMarch2016.pdf" target="_blank">PDF</a> includes the survey questions administered to faculty about Moodle/Canvas satisfaction and the report of the accessibility office.</ul>
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4 id="poplms">Popular Learning Management Systems:</h4></td></tr></table>
Market shares for the major Learning Management Systems have shifted substantially in the last five years. Blackboard remains the most used system in colleges with enrollment of at least 250 students, as shown in this chart from the Fourth Annual LMS Data Review at <a href="http://edutechnica.com" target="_blank">EduTechnica</a>:
<img style="border: 1px solid black" src="http://www.wou.edu/~jkepka15/lmsupdatechart.jpg" alt="Chart of LMS market share among U.S. colleges with enrollment of at least 250 students">
Here are the totals in the United States as of Spring 2016:
<img src="http://www.wou.edu/~jkepka15/lmsupdatetable.jpg" alt="Table of LMS installation totals in the United States as of Spring 2016">
----
<table style="background-color:#000000;color:#ffffff;" width="50%"><tr><td><h4 id="lmshx">Third Generation LMS & History</h4></td></tr></table>
This <a href="https://www.insidehighered.com/blogs/digital-tweed/looking-beyond-sakai" target="_blank">Campus Computing Project article</a> provides a brief history of LMS development over the last two decades:
<iframe width="100%" height="500" src="https://www.insidehighered.com/blogs/digital-tweed/looking-beyond-sakai" frameborder="0" allowfullscreen>Digital Tweed Article from Inside Higher Ed</iframe>
<table style="background-color:#000000;color:#ffffff;" width="50%"><tr><td><h4 id="top5lms">Top 5 U.S. LMS</h4></td></tr></table>
<ol><li><img style="float:right; border:1px; margin: 5px 5px 0px 0px;" src="http://wou.edu/~jkepka15/bbdlogo.png"><a href="http://www.blackboard.com/higher-education/index.aspx" target="_blank">Blackboard</a> is a K-12, higher ed, and other training-centered company that also provides teleconference software, hosting, mobile learning systems and apps, and a number of other technological services aimed at education (public, private, and corporate). It is perhaps the oldest and richest surviving player in the American LMS field, and has bought several competitors (for instance: ANGEL, WebCT, Prometheus). The company held approximately 75 percent of the higher education market in 2014. It has undergone significant design changes in the past 2-3 years, though many campuses continue to run older versions of Blackboard.
<li><a href="https://www.canvaslms.com/higher-education/" target="_blank">Instructure's Canvas</a> system has seen dramatic growth in the past 5 years and has now become the second-most popular LMS in the United States for higher education. It is known for its simple design, powerful features, and positive user reviews.
<li><a href="https://moodle.com/" target="_blank">Moodle</a>, the open-source LMS that originated in Australia, is in third place and has held relatively steady over the past few years. Of note: EduTechnica notes that the hosting companies that add features to Moodle have quite a competition going, with Blackboard-owned <a href="https://www.moodlerooms.com/higher-education/" target="_blank">Moodlerooms</a> falling behind Baltimore's <a href="http://www.ethinkeducation.com/?moodlead=ethink.general" target="_blank">eThinkEducation</a>.
<li><A href="https://www.d2l.com/solutions/higher-education/" target="_blank">Desire2Learn</a> (D2L), which runs the Brightspace LMS, has seen some small growth in the past few years. Like Blackboard, it is a proprietary system licensed to institutions and supported by a community and formal technical support.
<li><a href="https://sakaiproject.org/" target="_blank">Sakai</a> is an open-source learning management system developed by and maintained through higher education institutions and currently used at about 300 higher education institutions worldwide. Like Moodle, users can download the source code to host their own implementation or pay an affiliate to host and maintain the site for them. Fun fact: Sakai is <a href="https://www.sakaiproject.org/sakai-history" target="_blank">named for Iron Chef Hiroyuki Sakai</a>.</ol>
####Trends:
Thomas Cavanagh and Kelvin Thompson at the University of Central Florida present a 27-minute podcast on "where the LMS has been, where it is, and where it is going" on <a href="https://online.ucf.edu/topcast-s01e12/" target="_blank">Episode 12 of TOPcast</a> from April 2016 (<a href="https://online.ucf.edu/files/2016/04/TOPcast-Episode-12.pdf" target="_blank">PDF</a> Transcript Also Available).
Blackboard Learn still has a substantial advantage in scale, which means there's a slightly higher chance that new faculty will have used it. As EduTechnica notes, however, Moodle now has the most running installations of an obsolete LMS, which might mean many schools are on the verge of choosing a new LMS.
Regional differences, department/division/subject-matter preferences, and the expansion of new quasi-LMS or partial LMS like Google Apps for Education/Google Classroom and Wordpress may continue to reshape this landscape.
[[Check out Demo Videos of the Top 5 LMS->Demo Page]]
[[Return to Main Navigation->What/Who Are We Assessing]]
Any discussion of course design should consider the accessibility of course materials. However, some discussions are more focused on accessibility than others. The level of focus will decide the type or manner of review that's necessary:
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/mansittingonbenchatosu.jpg">
[[We are interested in an overall course design review]]
[[We are interested in a specific accessibility review]]
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for Assessing Student Performance or Readiness</h4></td></tr></table>
<img style="float:right; border:1px; margin: 5px 5px 5px 5px;" src="http://wou.edu/~jkepka15/studentsaroundadesk.jpg">Institutions or departments may want to assess the readiness or performance of students who use online courses or online course systems. Let's first define what we're discussing:
**Student Performance** here will be discussed holistically, in terms of completion or achievement of outcomes; assessments within a course are part of its design and/or delivery.
**Student Readiness** will refer to the preparation a student has before taking the course related to the online learning environment (not the content or anything that would be covered by prerequisites).
Assessment of Student Performance would typically take place after or during the course deployment, while student readiness would typically be assessed before the course/sequence begins.
[[We want to assess Student Performance]]
[[We want to assess Student Readiness]]
<table width="100%" style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h4>Key Considerations for Assessing Course Design</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/blueprint.jpg">Course design review is perhaps the most formally organized type of online class review that currently exists. In a design review, a course (minus the instructor) is assessed against a list of best practices or a rubric. Included in a design review may be considerations of:
<ul><li>Course navigation
<li>Accessibility
<li>Use of tools, activities, and other online affordances
<li>Communication (between users, with instructor, etc.)
<li>Methods of content presentation</ul>
Design reviews generally avoid commenting on course content and delivery. For example, a course might be reviewed on how it presents lab work and judged to have a design flaw because students can only access the material in written form when they are expected to later demonstrate visual knowledge. However, a design review would likely not criticize a course for not including a specific type of lab or term within the content. When content will be part of a design review, the review team should include stakeholders with expertise in the content and best practices for presenting it.
Some design reviews will consider whether materials are current or taking full advantage of the online system. This may be important to consider if the result of this review would be the adoption of new tools or a learning management system.
How and whether to assess online course design -- and whether the reviews themselves may stifle creativity -- is a topic of significant discussion. In <a href="https://online.ucf.edu/topcast-s02e16/" target="_blank">Episode 16</a> of the Teaching Online Podcast from the University of Central Florida Center for Distributed Learning (1 August 2016), Kelvin Thompson and Thomas Cavanagh discuss various methods and issues in course quality review (30 minutes | <a href="https://online.ucf.edu/files/2016/08/TOPcast-Episode-16.pdf" target="_blank">PDF Transcript</a>).
Thompson's dissertation and <a href="http://onlinecoursecriticism.blogspot.com/2005/08/overview-of-online-course-criticism.html" target="_blank">blog</a> describe a model for online course criticism that may serve as an interesting theoretical background for further discussion (available in <a href="http://pegasus.cc.ucf.edu/~kthompso/criticism/chapter6.pdf" target="_blank">PDF</a>).
<table style="background-color:#000000;color:#ffffff; width:100%;"><tr><td>Key Questions for Assessing Course Design</td></tr></table>
<h4>Motivation:</h4>
<ul><li>Is the assessment meant to be strictly informative/internal?
<li>Will the assessment be administered to students or analyze existing student data?
<li>Will it contribute to research being published (internally or externally)?
<li>Will this review motivate changes for course content or delivery?
<li>Will the assessment results motivate changes in requirements for access to online courses?</ul>
<h4>Tools and Information:</h4>
<ul><li>What instrument or information will be used to assess the design? Examples: Student survey; LMS analytics; Faculty reports
<li>What statistics are available for comparison?
<li>What statistics need to be gathered or generated? Who has access to this information?</ul>
<h4>Information Management:</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td>Key Stakeholders for Assessing Course Design</td></tr></table>
<ul><li>Instructional Designers: as content creators/designers, as reviewers, as subject matter experts
<li>Instructors: from the reviewed classes, similar classes, the same department, external departments, or external institutions; as content creators or subject matter experts
<li>Administrators: Institutional Research, Academic Technology, or Distance Learning department leaders
<li>Students: as users or potential users</ul>
<table style="background-color:#000000;color:#ffffff; width:100%;"><tr><td>Existing Assessment Tools for Course Design</td></tr></table>
The most well-known instrument for online course design assessment might be the proprietary Quality Matters Rubric. This tool is complex and discussion of it is worth its [[own page->QM Rubric]].
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li>California State - Chico <a href="http://www.csuchico.edu/eoi/the_rubric.shtml" target="_blank">Rubric of Online Instruction</a>, also in <a href="http://www.csuchico.edu/eoi/documents/rubricpdf" target="_blank">PDF</a>
<ul><li>This rubric was developed by the college after viewing several external rubrics, including Quality Matters</li></ul>
<li><a href="http://commons.suny.edu/cote/course-supports/oscqr-rubric/" target="_blank">The Open SUNY Course Quality Review Rubric (OSCQR)</a>
<ul><li>This rubric is available under a creative commons BY-NC-SA license, making it free to adapt as needed. It is available in multiple formats (PDF, Google Doc) and includes a self-assessment tool. </ul>
<li>The International Association for K-12 Online Learning (iNACOL) provides <a href="http://www.inacol.org/resource/inacol-national-standards-for-quality-online-courses-v2/" target="_blank">National Standards for Quality Online Courses</a> that could be adapted to higher education.
<li><a href="http://online.ucf.edu/wp-content/blogs.dir/21/files/2016/01/Course-Quick-Check-4_15_13.docx" target="_blank">University of Central Florida's Online Course Quick Check (in Word format)</a>
<ul><li>Further resources are <a href="https://online.ucf.edu/faculty-seminar05/" target="_blank">also available</a>, including a webinar and slides</li></ul>
<li><a href="http://www.jalc.edu/tlc/teaching-online" target="_blank">John A. Logan College Assessment Videos</a>.
<ul><li>This series of 9 videos walks through the assessment tool JALC uses for its courses; the instrument was developed to meet accreditation standards. The Introduction is below:
<iframe width="560" height="315" src="https://www.youtube.com/embed/BYkjshPd7sQ" frameborder="0" allowfullscreen></iframe></li></ul>
[[Return to Main Navigation->What/Who Are We Assessing]]
<table width="100%" style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for Assessing Accessibility</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/xkcd.png" alt="comic showing someone trying to train a computer to talk to a TV">
The accessibility of a course should be a first-level consideration during its creation. However, because courses are sometimes constructed without the luxury of time to research best practices, they may lack universal design principles.
At other times, courses may be designed on a need-only basis: that is, instructors or designers plan to include accessible resources only if a specific need arises. This limits the usability of the course for all users.
As Carl Straumsheim explained in this <em>Inside Higher Ed</em> <a href="https://www.insidehighered.com/news/2016/11/07/disability-rights-advocates-shift-strategies-ensure-equal-rights-digital-age" target="_blank">article</a>, we have entered a "New Era for Disability Rights" that requires consideration at all levels when posting content online:
<iframe width="100%" height="500" src="https://www.insidehighered.com/news/2016/11/07/disability-rights-advocates-shift-strategies-ensure-equal-rights-digital-age" scrolling="yes" frameborder="yes">Visit https://www.insidehighered.com/news/2016/11/07/disability-rights-advocates-shift-strategies-ensure-equal-rights-digital-age to read online</iframe>
<iframe style="float:right" width="50%" height="100" scrolling="no" frameborder="no" src="https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/153595259&color=ff5500"></iframe>Though the need to bring online content up to the standards of the Americans with Disabilities Act could be one reason to pursue an accessibility review, the need to provide all students with multiple methods of access to course content should be another. Just UX Design explains how "accessibility begins with design" here (<a href="http://justuxdesign.com/blog/designing-for-accessibility" target="_blank">transcript available</a>).
Most institutions will have an office focused on accessibility, staffed by professionals experienced in designing for and accommodating a variety of classroom needs. However, these offices may face the same challenges with online delivery that burden designers and instructors: namely, a lack of time and funding to research and update policies to match an ever-expanding catalog of tools. An accessibility review should therefore consider, as a major starting question, whether institutional support for the review would extend to funding the staffing and training needed to make revisions.
<table width="100%" style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for Assessing Accessibility</td></tr></table>
<h4>**Motivation:**</h4>
<ul><li>Is the assessment motivated by a learner/student/user issue, complaint, or difficulty?
<ul><li>Accessibility reviews and their results may have legal ramifications, particularly if they are motivated by documented challenges that users have faced.
<li>If the assessment is in response to a specific concern or complaint, will it extend beyond this issue or narrowly focus only on a specific class, a specific tool, a specific delivery method, or a specific design?</ul>
<li>Is the review motivated by a desire to improve or alter an existing system?
<li>Is the review meant to establish a baseline for accessibility among a group of courses? If so, will a template be developed as an outcome? A list of best practices? A getting-started guide?
<li>Will the review contribute to research being published (internally or externally)?
<li>Will this review motivate changes for course content or delivery?
<li>Will the assessment results motivate changes in requirements for who can teach or design online courses?
<li>Will this review be part of a broader review of the Learning Management System or other tools?</ul>
<h4>**Tools and Information:**</h4>
<ul><li>What instrument or information will be used to assess accessibility? Examples: Existing rubric, campus accessibility policies or statements, student complaints, trends noticed by course designers or technologists, other models?
<li>What statistics are available for comparison? Examples: Number of online courses designed by or reviewed by someone with accessibility expertise; number of students who request alternative accommodations to materials; costs (in time and money) of current procedures to upgrade/update courses to make them accessible
<li>What statistics need to be gathered or generated? Who has access to this information?
<li>Will external tools (i.e., publisher created websites or interactive suites, the learning management system) need to be reviewed as part of the review?
<li>Do reviewers have access to a shared definition of accessibility? Should one be created or used from an existing campus document?</ul>
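One kind of instrument a review team might fold into the questions above is an automated spot-check of course materials. As a purely illustrative sketch (not a substitute for the dedicated checkers or human review discussed elsewhere on this page), a short script using Python's standard-library HTML parser can flag images that lack alt text, one of the most common accessibility gaps in course pages:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the src of every <img> tag missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # Treat an absent or empty alt attribute as a flag for review.
            if not attr_map.get("alt"):
                self.missing.append(attr_map.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<img src="chart.png"><img src="logo.png" alt="College logo">')
print(checker.missing)  # → ['chart.png']
```

A real review would pair a check like this with human judgment (an empty `alt=""` is correct for purely decorative images) and with full tools such as the Section 508 checklists linked below.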
<h4>**Information Management:**</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Is the information being discussed or reviewed part of any ongoing legal challenge or question?</ul>
<table width="100%" style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for Assessing Accessibility</td></tr></table>
<ul><li>Students: from these classes? from similar classes? from other classes? who have faced challenges in using online courses?
<li>Instructors: from the reviewed classes, similar classes, the same department, external departments, or external institutions
<li>Legal counsel
<li>Accessibility Resources staff, faculty, or experienced administrators with deep knowledge of campus policy, climate, student need, and local law
<li>Learning Management System administrators
<li>Technology help desk staff who may be aware of current issues students have in accessing information
<li>Course designers, instructional designers, and/or faculty trainers
<li>Administrators: Institutional Research, Academic Technology, or Distance Learning department leaders</ul>
<table width="100%" style="background-color:#000000;color:#ffffff;"><tr><td>Existing Assessment Tools for Accessibility</td></tr></table>
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li><a href="http://www.pcc.edu/resources/instructional-support/access/Standard8rubric.html" target="_blank">Portland Community College: How a course is evaluated for accessibility</a>
<ul><li>This guide explains how PCC uses the Quality Matters rubric to guide accessibility reviews for online courses, with a helpful breakdown of how each standard applies</ul>
<li><a href="http://commons.suny.edu/fact2onlineaccessibility/" target="_blank">SUNY Faculty Advisory Council on Teaching and Technology (FACT2) Online Accessibility Site</a>
<ul><li>This site includes specific pages on <a href="http://commons.suny.edu/fact2onlineaccessibility/how-do-i-verify-accessibility-of-materials/" target="_blank">verifying the accessibility of materials and tools</a> and <a href="http://www.hhs.gov/web/section-508/making-files-accessible/checklist/index.html" target="_blank">links</a> to the Health and Human Services Section 508 Accessibility Checklists.</ul>
<li>The Open SUNY Course Quality Rubric also includes a <a href="https://docs.google.com/spreadsheets/d/1ICwNCdRQTw7thRKCMLonrVaoyIin584gjUcu96YGzS8/edit#gid=637842929" target="_blank">specific accessibility section</a> with 37 criteria.
<li><a href="http://www.washington.edu/doit/real-connections-making-distance-learning-accessible-everyone" target="_blank">Equal Access: Universal Design of Instruction</a> from Disabilities, Opportunities, Internetworking, and Technology (DO-IT)
<ul><li>This document "presents design considerations for ensuring that a course is accessible," and includes a lengthy list of resources about online course accessibility.
<li>They also have a 12-minute video, "Making Distance Learning Available to Everyone":
<iframe width="560" height="315" src="https://www.youtube.com/embed/zhw4UmTBlAk" frameborder="0" allowfullscreen></iframe></ul>
<li>Webinar: <a href="https://ucf.adobeconnect.com/_a826512158/p95ckwp8gbz/?launcher=false&fcsContent=true&pbMode=normal" target="_blank">Creating Accessible Online Content in Webcourses@UCF</a> provides a one-hour quick course for faculty on improving accessibility within a Canvas course. It would be a useful review for committee members unfamiliar with the steps necessary for online accessibility.</ul>
###Resources:
<ul><li>3Play Media provides <a href="https://soundcloud.com/3play_media" target="_blank">captions</a>:
<iframe width="100%" height="166" scrolling="no" frameborder="no" src="https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/233351852"></iframe></li></ul>
[[Return to Design Assessment->We are assessing the course design and/or accessibility]]
[[Return to Main Navigation->What/Who Are We Assessing]]
<h1>Why might we assess online courses?</h1>
<img style="float:right; border:1px solid black; margin:10px 10px 10px 10px;" height="400" src="http://www.wou.edu/~jkepka15/whatarewelookingat.jpg">There are many motivations that spur the need for higher education professionals to undertake online course assessment, whether formal or informal. The lens or window through which we are evaluating is as important as what we are looking at.
This presentation will attempt to narrow the field by dividing the types of assessment we might undertake, discussing possible motivations for each type of assessment, identifying potential key stakeholders in each assessment, and collecting existing resources to help create a rubric or measurement for the review.
An honest discussion of **why we're assessing or looking at a course/system/structure** will help define the type of tool that's used.
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h3>Motivations For Review:</h3></td></tr></table>
Here are three possible motivations for a <strong>formal</strong> review, meaning one that is institutionally sponsored or initiated:
<ol><li>**Curiosity:** Faculty, Administrators, or even students may start a review in order to satisfy curiosity about how the course (or online courses in general) functions or fits within the discipline, the pantheon of online courses, or simply within one student's day-to-day experience. Perhaps a faculty member wants to know whether her course aligns with what others are doing; perhaps a department chair wants to compare an online section with a hybrid delivery of the same course. <br>
<li>**Research:** The desire to research or to generate studied results can motivate assessment of online courses in a number of ways. For example, though numbers may be readily available for who passes or fails (completes/doesn't complete) an online course, assessment may allow more specific observation of the situation that causes these numbers. In addition, assessment may allow for statistical comparisons or qualitative inquiries. Finally, assessment may help align courses so that they can be included in experimental research designs.<br>
<li>**Administrative or Department-Level Expectations:** Online courses may be assessed to determine if and how they meet certain expectations at the department or division level. In addition, reviews may be generated by a desire to prove (or disprove) the general quality of online learning versus "traditional" learning methods.</li>
</ol>
<p>Once we have determined why we are assessing, we can then move on to deciding [[What/Who Are We Assessing]].
---
#####Notes:
You can access this entire site as <a href="https://docs.google.com/document/d/1Wt4V-3HuprAKwb5qmx2Z3-Z_mhmuNtHOx0PoQJOjTJw/edit?usp=sharing" target="_blank">a set of Google Docs</a>.
All images are taken from <a href="https://www.flickr.com/commons" target="_blank">the public domain</a> and have no known restrictions on their use.
<p align="center"><a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a> by Jenn Kepka.
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>Key Considerations for Assessing Student Performance</h4></td></tr></table>
<img style="float:right; border:1px; margin: 5px 5px 5px 5px;" src="http://wou.edu/~jkepka15/graduationdayphoto.jpg">Measurement of student performance is usually conducted within the courses themselves through activities and assessments. However, there may at times be value in assessing the performance of students in large groups in order to predict or alter existing systems.
Most higher education institutions are already set up to report on the standard measures of student performance, i.e., grades or outcomes. When applying these as measures of the success of an online course system, other measures may also be used, depending on what drives the review process. As Professor Anthony Picciano <a href="http://www.anitacrawley.net/Articles/Picciano2002.pdf" target="_blank">wrote [PDF]</a> in the Journal of Asynchronous Learning Networks (2002):
<blockquote>Student performance is open to many definitions. Successful completion of a course, course withdrawals, grades, added knowledge, and skill building are some of the ways that performance is measured, depending upon the content of the course and the nature of the students... Many studies of student performance in face-to-face and online courses rely on student perceptions of their learning experiences including "how well" or "how much" they have learned. Ultimately, student perceptions of their learning may be as good as other measures because these perceptions may be the catalysts for continuing to pursue coursework and other learning opportunities. Student performance is well understood to be a multivariable phenomenon
effected by study habits, prior knowledge, communications skills, time available for study, teacher effectiveness, etc. The purpose of this study is to examine performance in an online course in terms of student interaction and sense of presence. Data on multiple independent (measures of interaction and presence) and dependent (measures of performance) variables were collected and subjected to analysis. </blockquote>
Holistic assessment of student performance may be conducted by individual class, by class type (i.e., online vs. face-to-face), by academic department or division, or by an online unit overall.
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for Assessing Student Performance</td></tr></table>
<h4>**Motivation:**</h4>
<ul><li>Is the assessment meant to be strictly informative/internal?
<li>Will it contribute to research being published (internally or externally)?
<li>Will this review motivate changes for course content or delivery?</ul>
<h4>**Tools and Information:**</h4>
<ul><li>What scale will be used to assess performance? Examples: Complete/Not Complete, grades, GPA, or outcome-based scales
<li>What statistics are available for comparison?
<li>What statistics need to be gathered or generated? Who has access to this information?</ul>
<h4>**Information Management:**</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for Assessing Student Performance</td></tr></table>
<ul><li>**Students:** from these classes? from similar classes? from other classes?
<li>**Instructors:** from the reviewed classes, similar classes, the same department, external departments, or external institutions
<li>**Administrators:** Institutional Research, Academic Technology, or Distance Learning department leaders</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Existing Rubrics or Measurement Tools for Assessing Student Performance</td></tr></table>
A significant body of scholarly research has focused on comparing online and traditional classroom learning. The methods used in these studies may be useful when constructing a way to assess student performance:
Wagner, S.C., Garippo, S.J., and Lovaas, P. (2011). A Longitudinal Comparison of Online Versus Traditional Instruction. Journal of Online Learning and Teaching, 7(1). <a href="http://jolt.merlot.org/vol7no1/wagner_0311.pdf" target="_blank">http://jolt.merlot.org/vol7no1/wagner_0311.pdf</a>
Ya Ni, A. (2013). Comparing the Effectiveness of Classroom and Online Learning: Teaching Research Methods. Journal of Public Affairs Education 19(2). pp. 199-215. <a href="http://www.naspaa.org/jpaemessenger/Article/VOL19-2/03_Ni.pdf" target="_blank">http://www.naspaa.org/jpaemessenger/Article/VOL19-2/03_Ni.pdf</a>
[[Return to Student Assessment->We are assessing student performance or readiness]]
[[Return to Main Navigation->What/Who Are We Assessing]]
<table style="background-color:#000000;color:#ffffff;"><tr><td><h4>**Key Considerations for Assessing Student Readiness**</h4></td></tr></table>
<img style="float:right; border: 1px solid black; margin: 10px 10px 10px 10px;" src="http://wou.edu/~jkepka15/firstdayschool.jpg">Student readiness is generally assessed for large blocks of students. It may be assessed in concert with assessments of student performance or separately. Readiness can refer to preparation to navigate and/or obtain necessary technology; experience with the LMS or other course delivery methods; experience with online learning systems; or experience with college learning in general.
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Questions for Assessing Student Readiness</td></tr></table>
<h4>**Motivation:**</h4>
<ul><li>Is the assessment meant to be strictly informative/internal?
<li>Will the assessment be administered to students or analyze existing student data?
<li>Will it contribute to research being published (internally or externally)?
<li>Will this review motivate changes for course content or delivery?
<li>Will the assessment results motivate changes in requirements for access to online courses?</ul>
<h4>**Tools and Information:**</h4>
<ul><li>What instrument or information will be used to assess readiness? Examples: Student survey; voluntary questionnaire; pre-course assessment tool (quiz, lesson, self-check); LMS analytics; faculty reports
<li>What information is already available for comparison?
<li>What information needs to be gathered or generated? From whom?
<li>Who has access to this information?
<li>What scale will be employed to judge student readiness? Examples: 1 (not prepared) to 5 (perfectly prepared)</ul>
<h4>**Information Management:**</h4>
<ul><li>Is the information being compared confidential or covered by FERPA? Does this have implications for the results or for who can participate in the review?</li>
<li>Are there adequate statistical analysis applications available?</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Key Stakeholders for Assessing Student Readiness</td></tr></table>
<ul><li>Students: from these classes? from similar classes? from other classes?
<li>Instructors: from the reviewed classes, similar classes, the same department, external departments, or external institutions
<li>Administrators: Institutional Research, Academic Technology, or Distance Learning department leaders</ul>
<table style="background-color:#000000;color:#ffffff;"><tr><td>Existing Assessment Tools for Student Readiness</td></tr></table>
<em>Please assume that these rubrics are copyrighted by the departments/colleges that have created them, and use them as examples only. All links open in a new window/tab.</em>
<ul><li>The <a href="http://www.ion.uillinois.edu/resources/tutorials/pedagogy/StudentProfile.asp" target="_blank">Illinois Online Network</a> provides a list of qualities for online students that could be used to build an assessment model.
<li><a href="http://www.unc.edu/tlim/ser/" target="_blank">UNC Online Learning Readiness Questionnaire</a>
<li><a href="http://www.cheyney.edu/InstructionalDesign/Online-Readiness-Asessment2.cfm" target="_blank">Cheyney University Student Readiness Questionnaire</a>
<li><a href="https://pennstate.qualtrics.com/jfe/form/SV_7QCNUPsyH9f012B?s=246aa3a5c4b64bb386543eab834f8e75" target="_blank">Online Readiness Assessment by Vicki Williams and the Pennsylvania State University</a>. This assessment is available under a CC-BY-NC-SA license for adaptation.
<li>Glenn Pillsbury at California State University-Stanislaus provides <a href="https://www.csustan.edu/teach-online/online-readiness-self-assessment-0" target="_blank">a 17-Question self-assessment</a> for students to judge online readiness (CC-BY-NC-SA).
<li>The <a href="http://apps.3cmediasolutions.org/oei/" target="_blank">California Community Colleges Online Education Initiative</a> provides a series of editable, CC-BY modules that can be adapted to help students and faculty assess readiness. This includes an assessment on computer skills.</ul>
[[Return to Student Assessment->We are assessing student performance or readiness]]
[[Return to Main Navigation->What/Who Are We Assessing]]
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h4>Assessing with the Quality Matters Rubric</h4></td></tr></table>
<p>The Quality Matters review system is popular and complicated. This page attempts an explanation of the history of QM, how it is used, and its challenges. You may navigate to any section:
<a href="#qmis">What QM Is</a>
<a href="#qmnot">What QM is Not</a>
<a href="#anchor1">History</a>
<a href="#anchor2">QM Certification</a>
<a href="#anchor3">The Review Process</a>
<a href="#anchor4">QM Benefits</a>
<a href="#anchor4">QM Challenges/Issues</a>
<table style="background-color:#000000;color:#ffffff;width:50%;"><tr><td><a name="qmis">What QM Is:</a></td></tr></table>
<p>QM is first and foremost focused on reviewing course design, not delivery. The eight General Standards focus on (via <a href="http://www.cheyney.edu/InstructionalDesign/Review-the-Quality-Matters-Rubric.cfm" target="_blank">Cheyney University</a>):
<ol>
<li><strong>Course Overview and Introduction</strong> – The overall design of the course is made clear to the student at the beginning of the course.</li>
<li><strong>Learning Objectives (Competencies)</strong> – Learning objectives are measurable and are clearly stated.</li>
<li><strong>Assessment and Measurement</strong> – Assessment strategies are designed to evaluate student progress by reference to stated learning objectives, to measure the effectiveness of student learning; and to be integral to the learning process.</li>
<li><strong>Instructional Materials </strong>– Instructional materials are sufficiently comprehensive to achieve stated course objectives and learning outcomes.</li>
<li><strong>Learner Interaction and Engagement</strong> – Forms of interaction incorporated in the course motivate students and promote learning.</li>
<li><strong>Course Technology </strong>– Course navigation and technology support student engagement and ensure access to course components.</li>
<li><strong>Learner Support</strong> – The course facilitates student access to institutional support services essential to student success.</li>
<li><strong>Accessibility</strong> – The course demonstrates a commitment to accessibility for all students.</li>
</ol>
<p>Because the alignment of outcomes throughout the course is one focus of a review, a course without measurable, specific outcomes cannot be reviewed by a Quality Matters team.
<p>As such, the review and the rubric are alignment tools in addition to a method of certifying course adherence to design principles.
<table style="background-color:#000000;color:#ffffff;width:50%;"><tr><td><a name="qmisnot">What QM Is Not:</a></td></tr></table>
<p> QM reviews do not engage in faculty criticism or student evaluation. Reviewers do not directly interact with a live course and therefore have little information on which to base critiques of engagement. In addition, reviewers do not critique the quality of a course's content beyond a subject matter expert's review of the currency and relevance of the tools and materials presented and the appropriateness of the workload for the course level.
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h1><a name="anchor1">History</a></h1></td></tr></table>
<p>The Quality Matters Rubric was developed through a federal grant by MarylandOnline, a collection of colleges and universities. It has now spun off into an independent organization with 900 subscribing institutions, and it has developed rubrics for assessing online course delivery in K-12 education, higher education, educational publishing, and continuing and professional education.
<p>The Cheyney University web site provides a QM Higher Ed rubric overview, which can also be downloaded as a <a href="http://www.cheyney.edu/InstructionalDesign/documents/QM_Rubric.pdf" target="_blank">PDF</a>.
<iframe width="700" height="300" src="http://www.cheyney.edu/InstructionalDesign/documents/QM_Rubric.pdf">
<p>Your browser does not support iframes. Visit qualitymatters.org/rubric for more information</p>
</iframe>
<p>The rubric itself is available for non-subscribers, but an account must be created to access it. The rubric also has annotations for each General and Specific Standard which provide further guidance and examples for reviewers. These are not accessible without registration.
<p><a href="http://qualitymatters.org" target="_blank">QualityMatters.org</a>, the home of Quality Matters online, offers this <a href="https://www.youtube.com/embed/yQm_WbRxOGU" target="_blank">9-minute introduction video</a>:
<p><iframe width="560" height="315" src="https://www.youtube.com/embed/yQm_WbRxOGU" frameborder="1" allowfullscreen></iframe>
<table style="background-color:#000000;color:#ffffff;width:50%;"><tr><td><h2><a name="anchor2">QM Certification</a></h2></td></tr></table>
<p>Once an institution or an individual pays for a subscription, the rubric can be used for informal review.
<p>QM also provides a Quality Matters Certification for higher education courses that have passed an official review. This QM seal of approval represents that a course has met 85 percent or more of the expectations for course design outlined in the rubric's 8 General Standards. Each standard has multiple Specific Standards; a course is judged on 42 different aspects in its design during a review.
<p>Certain Specific Standards are considered "essential," meaning a course cannot pass unless it meets each of them, regardless of its overall score. Most essential standards deal with "alignment," which is the relationship between course elements and the stated course and module outcomes.</p>
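The pass/fail logic described above can be sketched as a small calculation. This is a simplified illustration, not the actual QM scoring instrument: the point values, the flat 85% threshold, and the all-essentials rule are assumptions drawn from the description on this page.

```python
def passes_review(standards, threshold=0.85):
    """standards: list of (points_possible, points_earned, is_essential) tuples.

    A course passes only if every essential standard is fully met AND the
    total points earned reach the threshold share of points possible.
    """
    total = sum(possible for possible, _, _ in standards)
    earned = sum(points for _, points, _ in standards)
    essentials_met = all(points == possible
                         for possible, points, essential in standards
                         if essential)
    return essentials_met and earned / total >= threshold

# Hypothetical mini-course with one essential standard worth 3 points:
course = [(3, 3, True), (2, 2, False), (1, 0, False)]
print(passes_review(course))  # 5/6 ≈ 83% of points, so → False
```

Note the design consequence: because an essential standard acts as a gate rather than a weight, a course scoring well above 85% overall still fails if a single essential standard is unmet.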
<table style="background-color:#000000;color:#ffffff;width:50%;"><tr><td><h2><a name="anchor3">Review Process</a></h2></td></tr></table>
<p>Reviews are conducted by teams of three faculty members who have successfully completed the Applying the Quality Matters Rubric and Peer Reviewer Courses. The three team members must, in some combination, include a Subject Matter Expert, a Master Reviewer (who coordinates the review and has completed additional QM training), and an External Reviewer who participates from an institution not involved in the review. It is possible for one reviewer to hold all three roles.
<p>Official reviews may be completed internally or externally. During an internal review, the campus QM Coordinator recruits all three reviewers, using the QM reviewer database. During an external review, a coordinator with QM (the organization) chooses the master reviewer, who then selects two additional team members from the QM database. Reviewers are paid.
<p>Reviews take place over three weeks (at most). If the course does not meet the 85% expectation, the Course Representative (the instructor, designer, or other campus staff member who can alter and speak to the design of the reviewed course) has up to 17 additional weeks to revise the design and resubmit for re-review.</p>
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h2><a name="anchor4">QM Benefits</a></h2></td></tr></table>
<p>The material benefits of QM subscriptions include copies of the QM rubric, access to QM online resources including their Course Review Management System; their professional development webinars, the Instructional Designer network, and online faculty course; and their pool of trained reviewers. </p>
<p>Less tangible benefits include the credibility of the QM name, which stems from the construction and quality of the rubric. Northland Pioneer College <a href="http://eresource.npc.edu/distance/evaluation/WhyQM/" target="_blank">outlines this case:</a></p>
<iframe src="http://eresource.npc.edu/distance/evaluation/WhyQM/" width="700" height="500">Visit <a href="http://eresource.npc.edu/distance/evaluation/WhyQM/" target="_blank">NPC's web site</a> if your browser doesn't support iFrames.</iframe>
<p>QM can assist colleges in starting and maintaining assessment procedures. Because it has a large pool of trained, ready-to-go reviewers and a developed rubric, it is seen as a stable and credible source for review. The QM Certification is also used to advertise course quality to students (the <a href="http://www.galencollege.edu/blog/article/63/why-quality-matters-matters/" target="_blank">Galen College of Nursing</a> is one example).</p>
<p>Because the rubric was developed and tested over multiple universities, and because <a href="https://www.insidehighered.com/news/2014/05/09/ideas-take-shape-new-accreditors-aimed-emerging-online-providers" target="_blank">more than 5,000 courses have now been QM reviewed</a>, it is also a certification that is useful for professionals, both as reviewers and as recipients. Professors and/or Instructional Designers can list QM certification on their CV and expect that it will be understood. This may help recruit reluctant faculty or contingent faculty into course reviews.</p>
<p>QM Certification is also useful externally, as in, for example, college accreditation reporting. Demonstrated participation in QM can be useful for showing an institution's commitment to alignment and online course assessment.</p>
<p>In other words, the major benefits of a Quality Matters subscription include:</p>
<ul>
<li>Access to the QM rubric and online Course Review System</li>
<li>Recognition by internal and external authorities of the QM name/process
<li>Access to QM-led workshops, webinars, and online courses (though some have additional charges)</li>
<li>Professional development opportunities for faculty and instructional designers through QM conferences, research, PD courses, and certifications</li>
</ul>
<table style="background-color:#000000;color:#ffffff;width:100%;"><tr><td><h2><a name="anchor5">QM Challenges/Concerns</a></h2></td></tr></table>
<p>The focus of QM reviews is relentlessly alignment-based, which has led some to criticize the organization for inspiring cookie-cutter course templates.
<p>In addition, the subscription can be expensive. Institutions pay an annual fee based on their size. Individuals may also register with QM independent of an organization; however, the costs may be prohibitive for those not affiliated with an institution, particularly contingent faculty. This cost may be offset by income earned from conducting reviews.</p>
<p>Quality Matters training and buy-in also take time. <a href="http://cop.hlcommission.org/Teaching-and-Learning/kramer2015.html" target="_blank">An initiative</a> at Allen College to increase QM certification and attention found that the time commitment required of faculty was also a major factor in participation.</p>
[[Return to the Course Design Assessment Page->We are interested in an overall course design review]]
[[Return to Main Navigation->What/Who Are We Assessing]]
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4 id="bbd">Blackboard Learn</h4></td></tr></table>
Blackboard webinar (1 hour) with demo of new Bb Learn design/features, Dec. 2014:
<iframe width="560" height="315" src="https://www.youtube.com/embed/tLxKxHa34WU" frameborder="0" allowfullscreen></iframe>
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4 id="canvas">Instructure Canvas</h4></td></tr></table>
Instructure Canvas, "Cool Things You Can Do With Canvas," 21 Sept. 2015:
<iframe width="560" height="315" src="https://www.youtube.com/embed/TdDS6gVdI10?rel=0" frameborder="0" allowfullscreen></iframe>
Instructure Canvas, Fast Track Series, connecting SAMR and Canvas; a <a href="https://www.youtube.com/playlist?list=PLKAGO__0NI1B8-aRZmXQBR0oqac854lnK" target="_blank">playlist</a> designed to explain the features of the LMS.
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4 id="moodle">Moodle</h4></td></tr></table>
Moodle, "Moodle 3.1 Overview," a <a href="https://www.youtube.com/playlist?list=PLxcO_MFWQBDfWYsGAI-SXY7JW6T_3t0yM" target="_blank">playlist</a> of 13 videos explaining new and updated features in the latest version of Moodle, posted 24 May 2016:
<iframe width="560" height="315" src="https://www.youtube.com/embed/AOqFpLHi4S4?rel=0" frameborder="0" allowfullscreen></iframe>
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4 id="d2l">D2L Brightspace</h4></td></tr></table>
D2L Brightspace, Live Demo (1 hour 20 minutes), 28 April 2016:
<iframe width="560" height="315" src="https://www.youtube.com/embed/eRz0pYnErr4?rel=0" frameborder="0" allowfullscreen></iframe>
D2L Brightspace, <a href="https://www.youtube.com/channel/UCLSxTdOzKAFOCZjXav1aCRQ" target="_blank">Brightspace Tutorials Channel</a>: a YouTube channel with official videos and playlists that help both learners and instructors navigate and work within the Brightspace LMS.
<table style="background-color:#000000;color:#ffffff;" width="100%"><tr><td><h4 id="sakai">Sakai</h4></td></tr></table>
Sakai, "Introducing Sakai 11," 12 May 2016:
<iframe width="560" height="315" src="https://www.youtube.com/embed/53RZQVynmMQ?rel=0" frameborder="0" allowfullscreen></iframe>
Closing Keynote - NYU Demo and Discussion of their switch to Sakai, webinar (1 hour), 11 Nov. 2016:
<iframe width="560" height="315" src="https://www.youtube.com/embed/sgTEMqkK-es?rel=0" frameborder="0" allowfullscreen></iframe>
[[Return to LMS Page->We are assessing the Learning Management System]]
[[Return to Main Navigation->What/Who Are We Assessing]]