Wednesday, March 23, 2011

Enhancing the quality of student learning - systems to monitor, evaluate, review

ACODE 55, Enhancing the quality of student learning: systems in place to monitor, evaluate and review, kicked off on a wet Sydney morning in March. The irony – there is always irony with ICT – is that the chosen technology for the opening keynote wasn’t quite up to the task.

Keynote

The keynote speaker was Atlanta-based Regional Education Board Director Bruce Chaloux, whose engaging talk on Closing the gap in online quality: emerging models in our continuing challenge was interrupted at slide 3. Poor lip sync had already raised eyebrows and sighs, then the Skype connection broke. Question: was it the choice of technology or the technology itself that let the side down?

Bruce's message still came through loud and clear though: the growth of online learning in the US is faster than the growth of participation in higher education overall, now serving 6 million out of a total of 20 million students. A national benchmarking study revealed general satisfaction with quality, with some respondents rating online learning the same as face-to-face teaching and others rating it somewhat superior. Next (rhetorical) question: what is the opportunity cost of conducting this kind of survey?

The Sloan Consortium Framework for Quality Assurance got good press – Bruce is a past President of the Board of Directors at Sloan-C, and knows it well. The six-volume series Elements of Quality in Online Education includes:

6: Engaging communities

5: Into the mainstream

4: Practice and direction

3: Elements of quality online education

2: Online education: Learning effectiveness, faculty satisfaction, and cost effectiveness

1: Online education: Learning effectiveness and faculty satisfaction

Summaries are available as PDFs from the website, but the books come with a price tag and the order facility requires a login (free to set up).

Last mentioned was the Quality Matters Program Rubric, which addresses eight broad standards and provides a useful framework for looking at different dimensions of quality for online courses. http://www.qmprogram.org/rubric

Local quality enhancement initiatives

The day outside brightened up as some local (Australasian) initiatives showcased practical ways to monitor, evaluate and review programs and courses. Prof Stephen Towers of Queensland University of Technology (QUT) gave an impressive presentation of their annual Courses Performance Report, produced by a system that consolidates voluminous data into a simple dashboard. Some institutions don’t allow such public displays of information, but the value outweighs any concerns in this self-defined ‘university for the real world’. A description of the process is on QUT's website.

The University of Southern Queensland was up next, with its Course and Program Management System. This is another impressive quality assurance system that follows through from initial accreditation to the dissemination of student outcomes aligned with learning journeys. The system mediates and facilitates key administrative functions, and provides key data to staff and students. A search of the institution’s website returns no details, but Dr Michael Sankey, who presented, may be able to assist with any enquiries.

Both these systems looked practical, comprehensive, superbly fit for purpose and able to present very disparate pieces of information in a useful format for the host institutions. My question here is, will systems like these become standard one day? I certainly hope so.

Case studies

After lunch, the meeting moved on to case studies and group discussions. A/Prof Maree Gosper, one of our hosts at Macquarie University, looked at how the results of a study of students’ experience with, and aspirations for, a range of technologies are being used by participating institutions to support planning and development.

Rhonda Leece from the University of New England gave a stand-out show featuring an ‘automated wellness engine’. The name didn’t make it immediately obvious (to me) what this engine did, but the performance was innovative and most impressive. UNE’s Engagement and Retention Project is designed to identify students at risk, and does so quite effectively by trawling nightly through a number of online systems for triggers that feed an overall student ‘wellness’ index. A student support task force responds with triage and/or referral for students at risk. The project has already hit the headlines in a Computerworld article.
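To make the idea concrete, here is a minimal sketch of how a nightly trigger-based scan like UNE’s might work. The trigger names, weights and threshold below are all invented for illustration – the actual engine and its index are UNE’s own.

```python
# Hypothetical sketch of a nightly 'wellness' scan, loosely modelled on the
# UNE approach described above. All trigger names, weights and the threshold
# are invented for illustration.

# Weighted triggers drawn from different online systems (all invented).
TRIGGER_WEIGHTS = {
    "no_lms_login_14_days": 3,
    "missed_assignment_deadline": 2,
    "failed_quiz": 1,
    "library_account_inactive": 1,
}

AT_RISK_THRESHOLD = 4  # invented cut-off for referral to support staff


def wellness_score(triggers):
    """Sum the weights of the triggers raised for one student."""
    return sum(TRIGGER_WEIGHTS.get(t, 0) for t in triggers)


def nightly_scan(students):
    """Return the IDs of students whose trigger score crosses the threshold."""
    return [
        student_id
        for student_id, triggers in students.items()
        if wellness_score(triggers) >= AT_RISK_THRESHOLD
    ]


# Example run over a toy snapshot of overnight data.
snapshot = {
    "s001": ["no_lms_login_14_days", "missed_assignment_deadline"],  # score 5
    "s002": ["failed_quiz"],                                         # score 1
}
print(nightly_scan(snapshot))  # ['s001']
```

The appeal of this pattern is that each source system only has to emit simple triggers; the weighting and the follow-up (triage and referral) stay in one place.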

Derek White of the University of Waikato is leading a project to integrate all grade-item results and system activity into a single repository. The aim here is also to identify at-risk students, and to generate comprehensive views of student progress. Although the project didn’t follow an ‘ideal’ path, it is reaching its intended goals: building infrastructure to support this reporting, tackling curriculum change and data entry practices, and building reporting functionality – something I think most of our institutions would like to do (better).
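The core of such a repository is a simple merge of per-student records from separate systems. Here is a minimal sketch under my own assumptions – the field names and sample data are invented, not Waikato’s schema.

```python
# Hypothetical sketch of consolidating grade items and system activity into a
# single per-student record, in the spirit of the project described above.
# The field names and sample data are invented for illustration.
from collections import defaultdict


def build_repository(grade_rows, activity_rows):
    """Merge grade items and login activity into one record per student."""
    repo = defaultdict(lambda: {"grades": {}, "logins": 0})
    for student_id, item, mark in grade_rows:
        repo[student_id]["grades"][item] = mark
    for student_id, logins in activity_rows:
        repo[student_id]["logins"] += logins
    return dict(repo)


# Toy feeds, one from a grade book and one from system activity logs.
grades = [("s001", "Assignment 1", 72), ("s001", "Test 1", 55)]
activity = [("s001", 14)]
print(build_repository(grades, activity))
# {'s001': {'grades': {'Assignment 1': 72, 'Test 1': 55}, 'logins': 14}}
```

Once everything sits in one record per student, generating progress views (or at-risk flags, as in the UNE case) becomes a reporting problem rather than an integration problem.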

Wrap up

Discussion and wiki posting of relevant experience and practices at participating institutions rounded off another long but productive meeting of ACODE Reps. The parting question is: how can we disseminate these discussions more widely within and across our institutions? That may be on the agenda next time. Here is one small piece of my humble attempt. Thanks for reading.