

In an effort to ensure access to UMassOnline's community of users as well as others who might provide insight as we undertake the LPR, all content, comments and attachments are accessible by the public.


The Learning Platform Review (LPR) process is intended to help ensure a smooth transition between learning management environments. It is through campus participation in the process that we will be able to determine the steps needed to transition from Blackboard Vista to other learning management environments, and to better understand the impact of the various options on each individual campus, on the campuses collectively, and on UMassOnline. Making technology decisions before the discovery process is executed increases the risk profile for the University of Massachusetts as well as for individual campuses. Although we are not in a position to responsibly ask questions about the relationship between UMassOnline and specific platforms, we do assert that campus participation in the LPR is an important part of membership in the UMassOnline consortium. It is critical for each campus to participate because the decision of one campus may affect the other campuses and the University of Massachusetts. Participation in the LPR includes activities such as:

  • Documenting current use of the learning management system
  • Describing what online learning is on the campus
  • Defining resources and operations currently supporting the LMS and online learning practices on the campus

The Problem with LMS Reviews

Ambiguity: What is online education anyway?

EduTools has long been a resource for colleges and universities assessing options for deploying a Learning Management System (LMS). The site offers institutions "feature-by-feature product comparisons and decision-making support" by cross-referencing current LMS platforms against fifteen learner tools (communication, productivity and student involvement), seventeen support tools (administration, course delivery and content development) and eight technical specifications (hardware/software and product details). The goal of EduTools is to provide an apples-to-apples comparison of various systems based on specific features (discussion forums, grade books, file sharing, etc.) so that institutions can compare the functionality associated with each feature to assess compatibility with current online teaching and learning practices among faculty, activities required by academic programs and administrative requirements for the institution.

However, while the goals of feature comparisons like EduTools are admirable, they cannot deliver the assessment many wish for, due to the ambiguity surrounding the tool sets associated with Learning Management Systems. What, after all, is a discussion forum, a grade book or file sharing? That is, what are the minimal features, functionality and capabilities required for a tool to be labeled (and recognized as), for example, a "discussion forum?" NetLingo broadly explains a discussion forum as "An online community where users read and post topics of common interest." While this definition seems generally acceptable to most in explaining what a discussion forum is, it provides little insight for defining and assessing an organization's requirements for a discussion forum, that is, the specific tools needed, and techniques used, to "read and post topics" in ways that satisfy operational and business needs.

Looking at Blackboard, Desire2Learn, Moodle and Sakai illustrates this point. Each of these systems allows users to "read and post topics," so their functionality is quite similar. The value of a discussion forum arises from the application of the technology within the course, i.e. technique, not technology. Trying to identify, assess and value a discussion forum's functionality becomes a fool's errand, as each forum offered within the competing LMSs provides essentially the same set of features, with any differentiators (those that can be derived) so subtle that they, in and of themselves, offer few, if any, advantages in teaching or learning. For example: "A discussion forum can be sorted by topic, name, date, and group affiliation;" "A discussion forum can be sorted by topic, name, date, and group affiliation and allows search;" "A discussion forum can be sorted by topic, name, date, and group affiliation, allows search and has a rich text editor;" "A discussion forum can be sorted by topic, name, date, and group affiliation, allows search, has a rich text editor and has customizable skins;" "A discussion forum can be sorted by topic, name, date, and group affiliation, allows search, has a rich text editor, has customizable skins and... ad nauseam."

Indeed, as many of the tools now offered have arguably become standard, it is the application of these tools and technologies by faculty and campuses (both the Luddites and the early adopters) that is significant to measure.

Subjectivity: My grandma's cookies are better than your grandma's cookies...

A significant problem with many technology assessments is subjectivity. Decision-making for selecting a new service or system is often based on surveys (collecting information about use through faculty and student questionnaires), polls (asking users and stakeholders to vote on choices) and pilots (opening up candidate technologies to end users for assessment). These approaches are subjective, resulting in popularity contests where personal bias directs choice or influential stakeholders determine direction. For example, a survey sent to users to solicit features for a new system might yield: "A user-friendly and easy to navigate interface." "User friendly" and "easy to navigate" cannot be objectively defined, so any assessment can only be based on the personal preferences of those reviewing the system. A better approach is to determine specific assessment criteria and metrics for assessing those criteria. The requirement "a user-friendly and easy to navigate interface" might become "Faculty can navigate to any tool within two clicks." This provides a simple way of determining whether a system meets a stakeholder's requirements, as it can be tested empirically and objectively.
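To illustrate, a criterion like "any tool within two clicks" can be checked mechanically once a system's navigation has been mapped. The sketch below is illustrative only: the navigation map is invented (no real LMS is modeled), and a breadth-first search counts the clicks needed to reach each page from the home screen.

```python
from collections import deque

# Hypothetical navigation map for an LMS under review: each page lists the
# pages reachable from it in one click. (Invented for illustration.)
NAV = {
    "home": ["course_list", "calendar", "inbox"],
    "course_list": ["gradebook", "discussions", "assignments"],
    "calendar": [],
    "inbox": [],
    "gradebook": [],
    "discussions": [],
    "assignments": ["file_upload"],
    "file_upload": [],
}

def clicks_from_home(nav, start="home"):
    """Breadth-first search: minimum clicks to reach each page from `start`."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in nav[page]:
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# The criterion "any tool within two clicks" becomes a pass/fail check.
violations = [p for p, d in clicks_from_home(NAV).items() if d > 2]
print(violations)  # -> ['file_upload']: three clicks deep, so the criterion fails
```

The same map-and-measure approach works for any navigation-depth requirement; only the map changes per candidate system.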

An Evidence-based Approach

UMassOnline currently provides a variety of technologies and support services in the delivery of a Learning Management System beyond simply hosting. These include integration, e.g. with student information systems (PeopleSoft, Banner, etc.) and synchronous communications (Wimba Classroom, Wimba Voice, etc.); vendor management with Blackboard, Perceptis, UITS, etc.; tiered end-user and technical support; and training. In addition to these technical services, UMassOnline supports a variety of institutions engaged in an even broader set of academic initiatives, all of which require stability and scalability. This comprehensive (and complex) environment is defined as a "platform." In order to understand potential options (both opportunities and risks), it is prudent to understand the entire platform that supports online education, not just the Learning Management System.

The "Platform" is composed of the following attributes, and includes the practices, processes and policies associated with each:

  • Business Processes
    What institutional work-flows and operations should be considered to ensure business continuity?
    What considerations should be included to assure that campuses' current business models and values remain?
  • Costs
    What considerations should be included to assess Total Cost of Ownership and Return on Investment?
  • Migration
    What considerations should be made for: ensuring the fidelity of content (learning objects) and courses (learning activities); integration integrity (e.g. connections to Student Information Systems, scripts and scheduled jobs); and developing and transferring knowledge (training and end-user support)?
  • Application Delivery Model
    How well do the current technology infrastructure and skill set of UMassOnline and our partners (University Information Technology Services) align with potential providers? How well do options align with current development initiatives underway across the University?
  • End User & Technical Support
    How will various options affect support requirements (help desk) and service levels (issue resolution)?
  • Teaching and Learning
    What considerations should be addressed to ensure current teaching and learning practices remain available while addressing gaps faculty and students experience?

Please note, the above Platform attributes are listed alphabetically, not in order of relevance, priority or value.


In order to identify the next iteration of UMassOnline, the LPR will undertake the following processes.

1. Describe Current Practices, Processes and Policies Associated with the "Learning Platform"

While UMassOnline, and the campuses we support, have been very successful over the past ten years, The Platform which supports online learning has continued to evolve. Many of the services and systems originally developed to support the campuses and online courses are no longer applicable, while new ones have become mission critical. It is therefore crucial that the LPR catalog UMassOnline's currently supported portfolio of services, as well as those employed by our campus partners. The purpose of this phase of the LPR is to understand how the implementation of services and systems supported by UMassOnline contributes to, or detracts from, the development and delivery of online education.

2. Identify Current/Desired Requirements

Online learning (the business processes, budgets, technologies, support services and teaching that make up The Platform) has grown substantially in both breadth and depth since UMassOnline's inception in 2001, and even since UMassOnline's last significant technology migration, from Vista 4 to Vista 5, in 2005. As the diversity of online academic programs increases, so too does the variety of educational content and delivery methods that enable them. While we understand that technologies will evolve and practices will mature, in order to ensure our next platform provides the same affordances, UMassOnline must understand current functionality (as opposed to simply features). While specifics will vary across campuses, the LPR should provide a framework for the collection of functional requirements related to The Platform (i.e. business processes, costs, migration, technology, support, and teaching and learning).

User Stories

"A user story is one or more sentences in the everyday or business language of the user that captures what the user wants to achieve... The user stories should be written by the customers for a software project and are their main instrument to influence the development of the software" - Wikipedia

The Learning Platform Review (LPR) process will ensure that the interests of the University of Massachusetts and all of the campuses are well preserved. User stories, based on the current practice and needs of the campuses participating on the LPR Committee, as well as information received from vendors, will be created not only to capture the functional and technical requirements, but also to provide a mechanism to test and assess the capabilities of potential LMS providers. This will give us an objective perspective on which to base our assessment, and thus our decisions.

Here's an example:
"As a faculty member, I want to create groups from the students enrolled in my course, so that students can collaborate on assignments as a team."

For a more detailed example see the User Stories page.
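As a sketch of how such stories might be captured consistently across campuses, the Python record below splits a story into its role/goal/benefit parts. The field names, and the idea of tagging each story with a Platform attribute, are our own assumptions, not a prescribed LPR format.

```python
from dataclasses import dataclass

# A sketch of a user story as a structured record. The field names and the
# `category` tag are illustrative assumptions, not part of the LPR template.
@dataclass
class UserStory:
    role: str      # "As a <role>..."
    goal: str      # "...I want to <goal>..."
    benefit: str   # "...so that <benefit>."
    category: str  # Platform attribute, e.g. "Teaching and Learning"

    def as_sentence(self) -> str:
        return f"As a {self.role}, I want to {self.goal}, so that {self.benefit}."

story = UserStory(
    role="faculty member",
    goal="create groups from the students enrolled in my course",
    benefit="students can collaborate on assignments as a team",
    category="Teaching and Learning",
)
print(story.as_sentence())
```

Keeping the three parts separate makes it easy to group stories by role or by Platform attribute when the catalog is assembled for vendors.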

3. Assess Findings

After an understanding of both the current and desired platform has been developed through user stories, this institutional assessment will be published to potential service providers.

Functional Requirements

Gather testing scripts from potential service providers.

Once the user stories are complete, they will be aggregated and presented to the various organizations offering LMS support. These providers will be asked to reply to the user stories with testing scripts. An example of a testing script developed by UMassOnline for an LMS user story can be found on the LPR page, 4.3. Requirements Gathering - User Stories; for the purposes of the LPR, a testing script is essentially a set of instructions that an end user can follow to achieve the desired objective (functionality) described in a user story. A flavor for testing scripts can be derived from many of the instructions posted within the UMassOnline Technology Knowledge Base, particularly the step-by-step instructions for "Backing up Sections."

Undertake User Acceptance Testing

"User Acceptance Testing (UAT) is a process to obtain confirmation by a Subject Matter Expert (SME), preferably the owner or client of the object under test, through trial or review, that a system meets mutually agreed-upon requirements." -Wikipedia

Each campus represented on the LPR committee will be asked to undertake "user acceptance testing" in order to assess whether the functionality defined in the user story is satisfied by the testing script offered by the LPR respondent.

The LPR documentation, published to vendors as an RFI/RFP, will include a single, comprehensive catalog of the user stories independently created and contributed by each of the LPR committee members. The LPR RFP respondents will be required to create "testing scripts" for each user story; however, the testing campuses will only be required to test user stories of interest, not all of the scripts included in the LPR RFP. Testing campuses may select a sub-set of the LPR's user stories and testing scripts that match the individual needs, interests, disciplines, etc. of their local institution.

Campuses assessing user stories during UAT will be given access to each of the learning management systems submitted for review by each LPR respondent. Any campus undertaking User Acceptance Testing should test each of the user stories they selected against all of the Learning Management Systems under consideration. For example, if a campus chose 25 user stories to test, it must test all 25 stories and scripts against each LMS under review, not just its leading candidate(s).

The campuses are asked to identify stakeholders who share domain expertise similar to that of the actors in the user stories. For example, if a user story states, "As a faculty member..." the campus should identify actual faculty members to conduct the user acceptance testing for that story. Other stakeholders might include instructional designers, system administrators, students, etc.

Employing UAT to assess user stories is similar to "task-based usability tests," which require users to attempt to actually complete an objective in the system under review. Their failure or success demonstrates the weak or strong qualities of the system. Harvard Business School, Tufts University and DePaul University have undertaken such an approach in their LMS review processes (see DePaul's "Extended Task-based Rubric"). While the formatting of the user stories is a bit different, they provide the same assessment opportunity. In addition, user stories and acceptance testing eliminate any bias testers might have based on previous systems' workflows, naming conventions, etc.
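Once campuses return their ratings, the per-candidate results can be tallied mechanically. The fragment below is illustrative only; both the candidates and the ratings are invented.

```python
from collections import Counter

# Invented UAT results: one rating per tested user story, per candidate LMS.
RATINGS = ("Does Not Meet", "Partially Meets", "Meets")
results = {
    "LMS A": ["Meets", "Meets", "Partially Meets", "Does Not Meet"],
    "LMS B": ["Meets", "Partially Meets", "Partially Meets", "Meets"],
}

for lms, ratings in results.items():
    tally = Counter(ratings)
    # Report a count for every rating level, including zeros, so the
    # candidates can be compared side by side.
    print(lms, {level: tally.get(level, 0) for level in RATINGS})
```

Because each campus tests only its selected stories, a roll-up like this would be produced per campus and then aggregated across the consortium.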

Simple Example of User Acceptance Testing


Each candidate's testing script is rated against the user story on a three-point scale: Does Not Meet Expectation(s), Partially Meets Expectation(s), or Meets Expectation(s).

LMS Candidate "A"

User Story: As a faculty member, I want to upload files, so that they appear with individual assignments.

Testing Script:
1. Log in to the system with your Faculty credentials (username and Password)
2. Navigate to the assignment of interest.
3. Click the "Edit Activity" button.
4. Click on the "Add Resources" button
5. In the dialog box that appears, click the "Browse Desktop" link to find local files stored on your computer, or the "Browse Course Content" button to link to a document already loaded in the course.
6. If uploading a document from your desktop, find the document of interest and click "Upload"; if linking to an existing document already uploaded, find your document and click "Save."
7. The document should now appear in the "View Resources" pop-up menu of the assignment.




LMS Candidate "B"

User Story: As a faculty member, I want to upload files, so that they appear with individual assignments.

Testing Script:
1. Log into the system with your faculty username and password.
2. Go to the course/section you would like to add a file to.
3. Go to the assignment you would like to add a file to.
4. To add a file from your computer, find it on your computer, then click and drag it from your desktop (or a window) to the assignment in your browser, and drop it (release the mouse button).
5. To add a file already in your course, click "Add Content" on the main menu bar along the top of any page (a new window appears with all of your course's files), then drag and drop it from the window to the assignment.
6. A new icon will appear in the assignment window.




LMS Candidate "C"

User Story: As a faculty member, I want to upload files, so that they appear with individual assignments.

Testing Script:
1. From your desktop computer, start up a FTP client and log into the secure FTP site for your course.
2. Find the document you would like to add to your assignment and upload it to the folder labeled, "Document Repository."
3. Repeat this for as many files you wish to upload.
4. Close the FTP client on your desktop and open a web browser (IE 8 or later only).
5. Log in to your course as "Course Administrator" (logging into the course as "Instructor" will not allow you to link files).
6. Go to the "Course Administration" tab.
7. Click on "Link Files," the browser will refresh to display all of the files owned by the user.
8. Find and click on the file you uploaded via FTP, the browser will update to display "File Properties."
9. Locate the line labeled "URL," highlight the entire line, beginning with the "http://", and then copy this line (CTRL+C).
10. Close the "File Properties" and the "Document Repository" windows.
11. Go to the "Course Assignments" tab
12. The browser updates to list all of your assignments in the course.
13. Find your assignment and click the "Edit Assignment Instructions" link
14. In the Rich Text Editor, place your cursor where you want the link to your file to appear.
15. Click the "Add Link" button in the menu bar.
16. Paste (CTRL+V) the URL ("http://...") into the dialog box and click "Save."
17. The link will now appear in your Assignment's instructions.




LMS Candidate "D"

User Story: As a faculty member, I want to upload files, so that they appear with individual assignments.

Testing Script:
Files cannot be associated with specific assignments; students will retrieve all files from the "Course Resources" folder located in the "MyFiles" tab.




User Acceptance Testing, Beyond Functional Analysis

The final decision for implementation might not be for the application with the highest "score" from UAT; i.e. a system that meets 85% of users' expectations might offer other benefits (e.g. cost savings) over another system that meets 90% of users' expectations. Undertaking UAT will provide UMassOnline and the campuses with the gaps in desired functionality, which can then be used to identify alternative functionality, workarounds, training, or alternate processes/practices, allowing all parties to identify and address issues before implementation, not when they are encountered during implementation or production.
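One simple way to make such a trade-off explicit is a weighted score that blends the UAT pass rate with cost. The figures and weights below are entirely hypothetical; they only illustrate how a lower-scoring but cheaper system can come out ahead.

```python
# Hypothetical scores and costs only, illustrating why the highest UAT score
# is not automatically the winning system once cost enters the picture.
candidates = {
    "LMS A": {"uat_score": 0.90, "annual_cost": 500_000},
    "LMS B": {"uat_score": 0.85, "annual_cost": 350_000},
}

def weighted_value(c, weight_uat=0.7, weight_cost=0.3, max_cost=500_000):
    # Normalize cost so that cheaper systems score higher, then blend the
    # two factors. The weights are policy choices, not fixed constants.
    cost_score = 1 - (c["annual_cost"] / max_cost)
    return weight_uat * c["uat_score"] + weight_cost * cost_score

for name, c in candidates.items():
    print(name, round(weighted_value(c), 3))
# LMS B edges out LMS A here despite its lower UAT score.
```

In practice the weights themselves would be negotiated by the stakeholders, which keeps the value judgment visible rather than hidden inside the final ranking.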

Technical Requirements

Technical/Operational Compatibility

UMassOnline delivers services to five University of Massachusetts campuses and ten other private and public colleges and universities across the Commonwealth. As such, understanding our current technology infrastructure and delivery model is critical.

Business Requirements


Migration includes not only the transport of content from one system to another and the transformation of the course's learning sequence and activities, but also the processes required to translate the course/content.

Total Cost of Ownership

Calculating total cost of ownership should include not only the hard/direct costs associated with the LMS and the underlying infrastructure, e.g. licensing, hardware and staffing, but also soft costs such as migration, training, and course/content development.
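A minimal sketch of that roll-up, using entirely hypothetical figures, keeps the two categories separate: recurring hard costs accrue each year, while soft costs (migration, training, content development) are counted once.

```python
# Entirely hypothetical figures: a sketch of a multi-year TCO roll-up that
# keeps recurring hard costs separate from one-time soft costs.
hard_costs = {"licensing": 200_000, "hardware": 80_000, "staffing": 150_000}            # per year
soft_costs = {"migration": 120_000, "training": 60_000, "content_development": 90_000}  # one-time

def total_cost_of_ownership(hard, soft, years):
    """Recurring hard costs accrue each year; soft costs are counted once."""
    return sum(hard.values()) * years + sum(soft.values())

print(total_cost_of_ownership(hard_costs, soft_costs, years=3))  # -> 1560000
```

Comparing candidates over the same multi-year horizon prevents a system with low licensing fees but expensive migration from appearing artificially cheap.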

4. Analyze Results

  1. Campuses report outcomes of UAT (functional requirements)
  2. UMOL assesses technical and business requirements as reported by vendors through the RFI
  3. LPR Evaluation, Recommendation and Rationale

5. Identify a Direction

  1. Provide recommendation and rationale to ELAC and hosted campuses sitting on the LPR Committee for feedback
  2. Announce results to UMassOnline community and University

Continuous Development: Continuous improvement!

In order to increase awareness (and, ideally, participation) throughout UMassOnline and the broader education and technology communities, all of the activities undertaken through the LPR, as well as the information obtained, will be available for public critique. The goal is that continuous feedback will ensure relevance and accuracy.

