
  1. The evaluation process for software on college and university campuses is evolving from a focus on features (the tools and technologies native to a system, e.g. calendars, file uploads, discussion forums) toward an assessment of affordances (what the system enables, e.g. collaboration and communication).
  2. For obvious reasons, not the least of which is monetary incentive, vendors of academic technologies readily respond to Requests for Proposals with product descriptions tailored to the requesting institution's requirements. Many products do not enjoy direct vendor support (open source tools such as Bedework, DuraSpace, Drupal, Fedora, Mahara, Moodle, OSP, Sakai, uPortal, etc.), or if they do, the vendors may lack the capacity to compete with larger rivals (e.g. Atlassian versus Microsoft, Canonical versus Apple). Without a vendor to promote a product, and thus respond to formal requests from institutions of higher education, many quality and potentially useful academic technologies go unrecognized. Even where third-party affiliates offer support services for open source options, these providers may not be interested in a university's evaluation stage. Because such organizations specialize in support and hosting services, they have little incentive to engage with an institution until it has already identified (and thus evaluated) a specific application or technology.
  3. As explained by WCET, "As product features became more common across systems within a product category, a focus on 'essential questions' provides more value to educators deciding among competing products." Campus assessment is usually based on what a system has (its tools) rather than what it enables (its functionality). As different products reach feature parity, a feature-to-feature comparison becomes less valuable.

In an effort to overcome these barriers to evaluation and informed decision-making, UMassOnline is volunteering to create, manage, and fund a web space where functional requirements (descriptions of teaching and learning activities and objectives) can be collected and featured: think "next-generation EduTools." To do this, we at UMassOnline propose developing 'user stories' that describe what a system can do, not merely what it has. A user story is one or more sentences in the everyday language of faculty, students, technologists, administrators, etc. that captures what the user wants to achieve (i.e. a stakeholder, an activity, and an outcome). This represents a radically different, but immensely valuable and reliable, approach to evaluating and selecting academic technologies. Through open dialogue among current adopters (campus faculty and staff), commercial affiliates, and developers, a reference library of user stories describing activities will emerge, along with 'testing scripts' (i.e. user instructions) to assess whether the desired outcomes can be achieved (i.e. the functional affordances of the tool under assessment).
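To make the user-story structure concrete, here is a minimal sketch in Python of how a story (stakeholder, activity, outcome) might be paired with a testing script. All class and field names are hypothetical illustrations, not part of any proposed system:

```python
from dataclasses import dataclass, field

@dataclass
class UserStory:
    """One or more sentences in everyday language: who wants to do what, and why."""
    stakeholder: str  # e.g. faculty member, student, technologist, administrator
    activity: str     # what the user wants to do
    outcome: str      # the result the user wants to achieve

    def sentence(self) -> str:
        return f"As a {self.stakeholder}, I want to {self.activity} so that {self.outcome}."

@dataclass
class TestingScript:
    """Step-by-step user instructions to check whether a tool affords the story."""
    story: UserStory
    steps: list = field(default_factory=list)

# Hypothetical example entry for the reference library
story = UserStory(
    stakeholder="faculty member",
    activity="post a discussion prompt visible only to my section",
    outcome="students can respond before the next class meeting",
)
script = TestingScript(story, steps=[
    "Log in as an instructor",
    "Create a discussion topic",
    "Restrict visibility to one section",
    "Verify a student in another section cannot see it",
])
print(story.sentence())
```

An evaluator would run the testing script against a candidate tool and record whether the outcome was achieved, building the library's evidence base one story at a time.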