Sunday, October 16, 2011

A digital literacy related site from Australia

To complement the JISC site featured in the previous post, these resources from the University of New South Wales focus on How to Teach Online, and come from a project funded by the Australian Learning and Teaching Council (ALTC).

Developing digital literacies

It's a relief to see that, despite heavy cuts to UK higher education budgets, some priority areas are still getting the investment they need to move the sector forward. JISC has announced a two-year Developing Digital Literacies project that aims to ensure graduates are well equipped for the 90% of new jobs that are expected to require excellent digital skills. An informative and useful website presents research, recommendations, best practice examples, workshop materials, an organizational audit process and case studies. This is another five-star resource from a world-class organization that leads developments in educational technology, and this post is a note of appreciation for the benefits this offers to practitioners and tertiary sectors within and beyond their borders.

Thursday, June 23, 2011

A-head in the cloud

I often used to envy the governance, funding and resources available to the UK higher education sector - less so in the past year or so, admittedly - but today's announcement of 12.5M funding to develop cloud services for education and research brought the green-eyed monster roaring back into my life. This initiative is not only an exciting pointer to the technology of the future. It could also go a long way to solving some of the sector's financial challenges by gaining efficiencies at a national level, without squeezing more life - or quality - out of a public sector that is already careening rapidly towards the lean end of the spectrum.

When I first came to NZ in 1995, I - foolishly, as it turned out - thought the collaborative model pursued by the UK higher education sector could work even better in a country with just seven universities (at the time). There is potential here, and some great initiatives, but we fall far short of the scale of leadership and cooperation happening in the UK. Maybe our crisis needs to get worse so it becomes the only way forward :-(

The monster that thought conjures up is anything but green!

Monday, May 30, 2011

End of an ERA - journal ranking scrapped by Australian politician

Minister for Innovation, Industry, Science and Research Kim Carr has announced the scrapping of journal rankings as part of a shake-up of the Excellence in Research for Australia initiative. Carr said he wanted to address the contested nature and improper use of these quality rankings. His action in scrapping the system could be read to imply a belief that, in the absence of surefooted people or an effective fence at the top of a cliff, nothing at the bottom is a better option than an ambulance.

The value of journal rankings and impact factors has been the subject of debate for many years, and such systems are admittedly less than perfect. Chastising the research community - as The Australian newspaper reported Carr doing - for learning to 'play a system' that is imposed without necessarily making sense is one way to address the problem. The scenario is not unfamiliar.

I hope Minister Carr's move has cleared the way for broad consultation to devise new measures of quality through a transparent, collaborative process. I hope any new measures make obvious sense to the people whose work they relate to, and reflect the value of publishing in up-and-coming, national, and specialized journals as well as the ones that were top of the class ten years ago. Last time I looked into how rankings were determined, I found none of these factors in place.

Quotes such as 'these reforms will strengthen the role of the ERA Research Evaluation Committee members in using their own, discipline-specific expertise to make judgments about the journal publication patterns...' and 'the change empowered committee members to use their expert judgment to take account of nuances in publishing behavior' do little to instill confidence.

If I believed that experts on committees knew everything there is to know about the realities of life on the research 'factory floor', I might have more faith in the prospects. I really hope what happens next will expose me as a change-resistant bore!

Tuesday, May 24, 2011

Teaching and Learning Vision Conference

I haven't attended one of these conferences, but understand they come highly recommended.

Head to Australia's Gold Coast in November for two days of expert opinion, networking and sharing ideas about teaching and learning with vision. The Teaching & Learning with Vision 2011 conference will bring together practitioners and experts in the use of learning technologies for education and training. Keynote and featured speakers will challenge and extend your thinking. Our case study speakers are real-life educators who are using these new technologies to engage and extend their learners in ways that are not possible with traditional methods. A conference exhibition will allow delegates to investigate the latest technological innovations. Visit the conference website for further information.

Wednesday, April 20, 2011

More on OER Debate

Sir John Daniel, President and CEO of the Commonwealth of Learning, has now posted a statement to open the debate on whether the Open Educational Resources movement is flawed because it is based on the unsupported assumption that academics are willing to share their materials. All readers are invited to comment.

Thursday, April 14, 2011

eLearning Africa debate on OER

The program for the 6th International Conference on ICT for Development, Education and Training, in Tanzania in May, is available online. Among other things on a varied and engaging session list is a debate on the motion:

This house believes that the OER movement is fundamentally flawed because it is based on the false assumption that education institutions are willing to share resources freely and openly.

I'd love to be there!

Wednesday, March 23, 2011

Enhancing the quality of student learning - systems to monitor, evaluate, review

ACODE 55 Enhancing the quality of student learning: systems in place to monitor, evaluate and review kicked off on a wet Sydney morning in March. The irony – there is always irony with ICT – is that the chosen technology for the opening keynote wasn’t quite up to the task.

Keynote

The keynote by Atlanta-based Regional Education Board Director Bruce Chaloux, an engaging talk on Closing the gap in online quality: emerging models in our continuing challenge, was interrupted on slide 3. Poor lip sync had already raised eyebrows and sighs, then the Skype connection broke. Question: did the choice or the technology let the side down?

Bruce's message still came through loud and clear, though - the growth of online learning (in the US) is faster than the growth of participation in HE, now serving 6 million out of a total of 20 million students. A national benchmarking study revealed perceptions of general satisfaction around quality - although some respondents still believe it is the same as face-to-face, or somewhat superior. Next (rhetorical) question - what is the opportunity cost of conducting this kind of survey?

The Sloan Consortium Framework for Quality Assurance got good press - Bruce is a past President of the Board of Directors at Sloan-C, and knows it well. A six-volume series, Elements of Quality in Online Education, includes:

6: Engaging communities

5: Into the mainstream

4: Practice and direction

3: Elements of quality online education

2: Online education: Learning effectiveness, faculty satisfaction, and cost effectiveness

1: Online education: Learning effectiveness and faculty satisfaction

Summaries are available as PDFs from the website, but the books come with a price tag and the order facility requires a log-in (free to set up).

Last mentioned was the Quality Matters Program Rubric, which addresses eight broad standards and provides a useful framework for looking at different dimensions of quality for online courses. http://www.qmprogram.org/rubric

Local quality enhancement initiatives

The day outside brightened up as some local (Australasian) initiatives showcased practical ways to monitor, evaluate and review programs and courses. Prof Stephen Towers, Queensland University of Technology (QUT), gave an impressive presentation of their annual Courses Performance Report, produced by a system that consolidates voluminous data into a simple dashboard. Some institutions don't allow such public shows of information, but the value outweighs any concerns in this self-defined 'university for the real world'. A description of the process is on QUT's website.

The University of Southern Queensland was up next, with its Course and Program Management System. This is another impressive quality assurance system that follows through from initial accreditation to dissemination of student outcomes aligned with learning journeys. The system mediates and facilitates key administrative functions, and provides key data to staff and students. No details are returned from a search of the institution's website, but Dr Michael Sankey, who presented, may be able to assist with any enquiries.

Both these systems looked practical, comprehensive, superbly fit for purpose and able to present very disparate pieces of information in a useful format for the host institutions. My question here is, will systems like these become standard one day? I certainly hope so.

Case studies

After lunch, the meeting moved on to case studies and group discussions. A/Prof Maree Gosper, one of our hosts at Macquarie University, looked at how the results of a study of student experience of, and aspirations for, a range of technologies are being used in participating institutions to support planning and development.

Rhonda Leece from the University of New England gave a stand-out show featuring an 'automated wellness engine'. The name didn't make it immediately obvious (to me) what this engine did, but the performance was innovative and most impressive. UNE's Engagement and Retention Project is designed to identify students at risk, and does so quite effectively by trawling nightly through a number of online systems for triggers associated with an overall student 'wellness' index. A student support task force responds with triage and/or referral for students at risk. The project has already hit the headlines in a Computerworld article.
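
The presentation didn't go into implementation detail, so the following is only a minimal sketch of the general pattern described: a nightly batch job scans activity data for triggers and rolls them into a per-student wellness score, above which a referral is raised. The trigger names, weights and threshold below are illustrative assumptions, not details of UNE's engine.

```python
# Illustrative sketch only: a nightly job combining 'trigger' flags from
# several campus systems into a simple wellness index per student.
# Trigger names, weights and the referral threshold are invented for the example.
from dataclasses import dataclass

TRIGGER_WEIGHTS = {
    "no_lms_login_14_days": 3,   # hypothetical trigger weights
    "missed_assessment": 4,
    "library_inactive": 1,
    "helpdesk_distress_ticket": 2,
}

REFERRAL_THRESHOLD = 5  # assumed cut-off for task force follow-up


@dataclass
class StudentActivity:
    student_id: str
    triggers: list  # trigger names observed in the nightly trawl


def wellness_index(activity: StudentActivity) -> int:
    """Sum the weights of the triggers raised for one student."""
    return sum(TRIGGER_WEIGHTS.get(t, 0) for t in activity.triggers)


def nightly_run(activities: list) -> list:
    """Return the student IDs whose index crosses the referral threshold."""
    return [a.student_id for a in activities
            if wellness_index(a) >= REFERRAL_THRESHOLD]


if __name__ == "__main__":
    sample = [
        StudentActivity("s001", ["no_lms_login_14_days", "missed_assessment"]),
        StudentActivity("s002", ["library_inactive"]),
    ]
    print(nightly_run(sample))  # -> ['s001']
```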

Derek White of the University of Waikato is leading a project to integrate all grade item results and system activity into a single repository. The aim here is also to identify at-risk students, and to generate comprehensive views of student progress. Although the project didn't follow an 'ideal' path, it is reaching its intended goals of building infrastructure to support this reporting, tackling the issues of curriculum change and data entry practices, and building reporting functionality - something I think most of our institutions would like to do (better).
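
Again, no implementation details were shared, but the basic idea - pulling grade results and activity data from separate systems into one repository and generating an at-risk view - can be sketched in a few lines. The field names and the crude flagging rule below are assumptions for illustration only, not Waikato's design.

```python
# Illustrative sketch only: merging grade results and LMS activity counts
# into one per-student record, then flagging students for follow-up.
from collections import defaultdict


def build_repository(grade_rows, activity_rows):
    """Combine rows from separate systems into one record per student."""
    repo = defaultdict(lambda: {"grades": [], "logins": 0})
    for student_id, item, mark in grade_rows:
        repo[student_id]["grades"].append((item, mark))
    for student_id, login_count in activity_rows:
        repo[student_id]["logins"] += login_count
    return repo


def progress_report(repo, pass_mark=50, min_logins=5):
    """A crude at-risk view: failing marks combined with low activity."""
    report = {}
    for student_id, record in repo.items():
        failing = [item for item, mark in record["grades"] if mark < pass_mark]
        report[student_id] = {
            "failing_items": failing,
            "logins": record["logins"],
            "at_risk": bool(failing) and record["logins"] < min_logins,
        }
    return report


if __name__ == "__main__":
    grades = [("s001", "Test 1", 38), ("s002", "Test 1", 72)]
    activity = [("s001", 2), ("s002", 14)]
    print(progress_report(build_repository(grades, activity)))
```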

Wrap up

Discussion and wiki posting of relevant experience and practices at participating institutions rounded off another long but productive meeting of ACODE Reps. The parting question is: how can we disseminate these discussions more widely within and across our institutions? That may be on the agenda next time. Here is one piece of my humble attempt. Thanks for reading it.

Monday, February 28, 2011

Why 'successful' elearning projects 'fail'

What Tom Franklin calls 'success', and in particular the long-term sustainability of elearning projects, is a constant challenge to me in my role as Head of an eLearning Group in a large university. I absolutely agree with Tom's January 2011 ALT Newsletter item that forward planning, budgeting for ongoing support and change management are important aspects to consider. However, I see tensions here, first of all with funding body requirements (fixed-term funding, the intention to support start-ups / exploratory projects, etc.) but more importantly with the nature of innovation itself.

The kind of project that I want to see supported past the initial funding stage starts out with a great learning design idea and a creative teacher with a problem to solve. They [may] get start-up funding, then use of the 'product' grows beyond their wildest aspirations. Colleagues in other faculties and institutions use it - and even become dependent on it. In some cases, many thousands of students are involved, and no one questions the benefits to teaching, learning and productivity. But no one is ready to support long-term sustainability either. This may be because the product doesn't come with the IT Services seal of approval, or because initial funding was from an external (e.g. government) or one-off source. There are probably many other reasons why such 'successful' systems persist with just one or a few people to support them, and no institutional commitment. I see the lack of institutional response systems and processes as a key one.

The major difficulty of planning for change management and ongoing support with projects such as these is that their impact could not be anticipated at the time of the original proposal. That is the evolutionary nature of innovation, and the education sector - as a whole and in parts - seems woefully ill-equipped to step in and provide the necessary support. I'd welcome any suggestions as to how these issues might be resolved. I have my own ideas, but they involve organizational change rather than change management at practice level. Further details are in my 2010 article Sustainability factors for elearning initiatives, ALT-J (Research in Learning Technology) 18, no. 2: 89-103.

Tuesday, February 1, 2011

Seize the opportunity of online learning - once more unto the breach...

Collaborate to Compete is the latest release (Feb 2011) in a stream of reports on how universities can use technology and online pedagogy to achieve quality and cost-effectiveness in meeting student demands for flexible learning. In a nutshell, how to create the best opportunities, and get the best return on investment in online learning. The terms of reference for HEFCE’s Online Learning Taskforce focused on the UK higher education sector, though the case studies and recommendations have broader relevance.

I know I'm not alone in responding with a degree of scepticism to the announcement of another report on how to exploit the potential of online learning. The fact that we are still talking about potential means that work remains to be done to exploit it. Collaborate to Compete: Seizing the Opportunity of Online Learning for UK Higher Education offers many valuable insights. For me, it also has one key limitation in the range of voices it represents. However, the impact of this latest set of recommendations - as the report notes about estimating the size of the market for online learning - is hard to predict.

A summary of recommendations (with my comments added):

1. Students need greater support to ensure their study and academic literacy skills are fit for the digital age (and staff need greater incentives, support and evidence of benefits to ensure their skills are up to the mark).

2. Investment is needed to build consortia to achieve scale and branding in online learning. (Cases where previous attempts failed are featured in the report. While I support the kind of multiskilled teams described by Edelson (2006), and believe collaboration can harness the strengths of diverse roles and organizations, I wonder whether the consortium is a realistic proposition, and how such consortia will be put together. There are obvious benefits, as risk will be shared and dissemination more effective from the start, but where will the necessary investment come from in these tight financial times?)

3. Better market intelligence is needed about international demand (this can only help to avoid repeating expensive mistakes of the past. Zemsky & Massy (2004) made a strong retrospective case against using inappropriate forecasting methods to anticipate demand and drive investment. There does indeed need to be 'clarity about the markets in which ventures will operate', but the wisdom of hindsight says more about what not to do than what to do. What specific strategies will allow this to be achieved?)

4. Institutions should take a strategic approach to realign structures and processes to embed online learning (this is my strongest wish). Such changes will not happen rapidly enough without effective organizational structures and processes (the mantra is good; articulation seems to be the problem, or perhaps it's different interpretations of what 'effective' means in this context). Institutions need to ensure that staff understand the range of challenges and opportunities provided by online learning, and ensure what they do is cost effective and high quality (and experienced staff need to help their institutions understand what is involved in addressing these challenges and opportunities with cost-effective, high-quality solutions).

5. Realign training and development to enable the academic community to play a leading role in online learning (if the academic community can't, I don't know who can!). Promote understanding of the potential, and put greater priority on partnerships between technologists, learning support specialists and academics (with emphasis on partnership amongst equals).

6. Invest in the development and exploitation of open educational resources to enhance efficiency and quality (open education is not just about sharing resources; initiatives such as the OPAL Open Educational Quality Initiative show that practice is equally important. The conceptual model here is one where producer and recipient practices meet comfortably in the middle). There is no point in duplicating effort to create content that is already available (but there is more than a grain of truth in the saying 'to change something you have to understand it, and to understand it you have to change it'. The 'not invented here' syndrome may have deeper psychological roots than we acknowledge).

Key points I take away from the report are that a) adapting organizational structures and processes may require a significant change in academic and organizational culture; b) institutional promotion criteria (accountability measures), selection criteria for awards etc. would provide incentives for staff; and c) strong leadership and commitment to a clear strategic approach are fundamental to effect change in institutional policies and procedures.

With my academic / professional development hat on, I appreciate the report's acknowledgment that professional development for online learning is not just the responsibility of a small team of people in a central service unit, but the broad and joint responsibility of institutions and national and professional bodies.

Collaborate to Compete ‘highlights many things that have been said before but not widely heeded’. The authors state their belief in the title, and assert that the report has come ‘at a time when technology, internationalism, curricula and the power and nature of the student voice have moved forward, thus making the report timely and important’.

While I agree that the report is both these things, I also believe it is missing one very important voice, and thus not truly reflective of the spirit of the title. Composition of the task force does not quite model the kind of collaboration that some of us ‘chalk face’ workers are such strong advocates of - i.e. collaboration across all levels within organizations. I believe there is much to be learned from the experience of the lead practitioner, the average academic and the early career tutor, all of whose professional practice the online learning strategy seeks to reshape. Consultation with people in teaching and research roles is one thing, collaboration with them is quite another. The taskforce has captured the voices of leaders, directors, chief executives and managers and students through the President of the National Union. While case studies may reflect the practice of lecturers and tutors across institutional levels, I do not hear their voices in the report I have just read. This is the part of organizational culture and process I think will take longest to change - mainly because it doesn't even seem to be on the agenda yet.

References

Edelson, D. C. 2006. Balancing innovation and risk: Assessing design research proposals. In Educational Design Research, eds. J. van den Akker, K. Gravemeijer, S. McKenney and N. Nieveen, 100-106. London and New York: Routledge.

Zemsky, R. and W. Massy. 2004. Thwarted Innovation: What Happened to e-Learning and Why? Final Report of The Weatherstation Project, The Learning Alliance, University of Pennsylvania. http://www.irhe.upenn.edu/Docs/Jun2004/ThwartedInnovation.pdf