Sunday, September 17, 2017

Students and educational technology - it's complicated

A recurring narrative in educational technology is that students are driving the change. They are using the creative and collaborative opportunities offered by today's social media and demanding that universities and faculty do the same. Edtech companies use this as a powerful argument when selling their solutions. The narrative is strongly linked to the idea of students as digital natives and is often the reason institutions rush headfirst into hasty and largely unplanned technology projects. No one wants to appear out of touch with student demands, so major technology projects are launched without first discussing the pedagogical implications or how the technology can enhance teaching and learning. Of course students are using digital tools and arenas in their learning, but not as widely or proficiently as the edtech narrative claims.

But are students really demanding change? My own experience leads me to question this to a certain extent. Furthermore, digital literacy is not simply a generational issue; it depends more on whether or not a person is interested in using technology. Many students are highly proficient at using digital devices and social media for socialising and entertainment but are unaware of how to use them for work and learning. It would be extremely dangerous to assume that all students are proficient at information retrieval, source criticism, collaborative learning and media skills, and most institutions are now integrating these soft skills into all parts of the curriculum. However, this change is not due to student demand but to teachers becoming aware that these skills are missing and taking action to remedy the gaps.

Sometimes students can be more conservative than faculty, an issue raised in an article in EdSurge, What If Students Are the Biggest Barrier to Innovation? Even if many students have used a wide range of digital tools and learning spaces in their high school years, when they arrive on campus they are told that university is different. Students are generally very pragmatic; they want to get their grades and then a degree in the most efficient way possible. They will adapt to whatever the institution demands. They also have the traditional image of university in their heads and are disappointed if their experience doesn't match it.

... the “metaphor” of the old and wise professor pouring knowledge into students through class lectures is what many graduate students expect. When they don’t get that experience because they are forced to do group work and interact with peers (who they do not view as subject-matter experts), they don’t feel like they are learning.

“I have had students tell me, ‘I am not paying to listen to my neighbor's thoughts, I am paying to hear you,’” says Pickett.

This is reinforced by the vocabulary of higher education: if we call teachers lecturers and their lessons lectures, then a lecture is naturally what students expect. Teachers who focus on student-centred learning and concepts such as collaborative learning, the flipped classroom and problem-based learning risk negative evaluations from students who expect to be fed wisdom and knowledge. Expectations, tradition and attitudes are major barriers to change and cannot be overcome by logical argument alone, no matter how much research evidence is available.

I would say that the main drivers of change in terms of educational technology are teachers who are using the technology successfully, often due to a desire to widen their professional skills and a genuine interest in pedagogical innovation. The problem is, however, that many of these teachers have developed their digital skills on their own initiative, usually outside working hours. This ad hoc approach means that teachers' digital skills are not evenly distributed and there is often an alarming digital divide within the teaching staff of most institutions.

This inconsistency is a concern for many students as discussed in an article from JISC in the UK, It’s official - higher education students want staff to be better with digital, not to use more of it. Students in this national survey state clearly that they want all staff to be trained in using digital tools and demand more consistent use of technology rather than more variety.

Don’t allow academic staff to pick their own ways of using digital resources. At the moment each academic uses the virtual learning environment (VLE) in a different way, making it very time consuming to keep switching approaches. It’s also obvious that academic staff have not received adequate training in using these systems.

They are not demanding more technology but a more strategic approach to technology use and fewer bottom-up ad hoc initiatives. Interestingly the survey reveals a concern that online learning lacks a face-to-face element and a fear that more technology means less classroom contact with teachers.

However, when asked what their institution should do and not do, students requested a better use of digital systems, not more, fearing it could be used to replace face-to-face time with staff.

What seems to be missing here is a realisation that online collaboration is an essential skill in many companies and organisations today and that this skill must be developed at university and integrated into all programmes. Sadly online work is still seen as a substitute for "real" contact instead of a valuable skill in itself.

The conclusion is that technology can enhance teaching and learning but to succeed it needs to be explained and introduced gradually. Both teachers and students need to change their perspectives and this takes time. Above all the process requires skilled management and leadership and only when all these conditions are fulfilled will we see successful implementation and integration of technology in education.

Saturday, September 9, 2017

Why free is not always best in education

We seem to think that everything on the net must be free and are very reluctant to pay for any service, no matter how good it may be. In education we use a wide range of services and tools in our daily work that are free to use in their basic form. Most of these are so-called freemium services: the basic version is free, but if you want more interesting features like personalisation, greater storage capacity or extra functionality, then you have to pay. The problem is that the idea of the free internet is so entrenched that few of us ever move on to the commercial version of the tools we use and love so much. We seldom stop to wonder how the people who invent these tools get money to pay their bills. We love free but we dislike all the ads that accompany it. In general, if you pay you lose the ads, or at least the vast majority of them.

So I enjoyed reading Nik Peachey's excellent post this week, Why the culture of ‘free’ is damaging edtech & education, dealing with exactly this question, and I just sat there nodding in agreement all the way through. The logic is pretty simple: if we don't pay for these resources, the companies that offer them will soon go out of business and we'll lose them. The only exceptions to this principle are resources subsidised by advertising (like Google), where you are the product, and open source tools developed by enthusiasts without commercial interest. How many digital tools do you actually pay for? I pay for only a handful, the ones I love most, and the yearly cost of the pro versions is often very low. At the same time there are plenty of tools I only use in their free version.

We need to look beyond the mythology of the free internet and accept that good and reliable tools and services cost money, as in the physical world. Teachers are understandably unwilling to use their own money to subscribe but Peachey proposes giving teachers a small budget for digital tools to spend as they see fit, in the same way as many teachers are able to buy relevant literature for professional development.

A better alternative would be for schools to provide a budget for teachers to purchase licences for the tools they want to use with their students. I know that most schools and colleges already have a technology budget, but this is usually a centralised one with teachers often excluded from the purchase decision making process.

Giving teachers a part of this budget would not only ensure that they were able to access the tools and services that they like and need, but would also empower them to be part of the edtech development process within the school and make them much more likely to adopt and use more digital resources.

Of course, the most important digital resources are provided by the institution, such as the learning management system, file storage, e-mail and so on. But there is such a vast range of attractive digital resources out there that it is impossible to restrict teachers to only a handful of approved ones. Each teacher should be able to choose the resources that are most fit for purpose. Which tools would be on your list for upgrading?

Saturday, September 2, 2017

Choose your own reality

Reality and fact are being rapidly undermined by fake news, manipulated photos and films, and now even voice manipulation, a kind of photoshopping for voice, like Adobe's Voco project, which allows you to make people say things they never actually said. It's getting increasingly difficult to check the validity of a news item, especially when it confirms your own opinions, and this presents an enormous challenge for all educators. What happens when there's more fake news than real news? Whose news do you believe? Instead of creating a global meeting place to promote democracy and freedom, the internet is now allowing us to create many parallel worlds where totally different perceptions and ideologies exist side by side but almost invisible to each other. The real world is complex, often full of contradictions and grey zones, and there are seldom clear-cut answers. It is so much easier to turn your back on all that and retreat into a simplistic ideology full of sweeping generalisations and quick solutions, backed up by mountains of fake evidence.

Source criticism is getting harder every week, and a rather chilling new challenge is presented in an article in Business Insider, Researchers taught AI to write totally believable fake reviews, and the implications are terrifying. Artificial Intelligence (AI) offers a wealth of exciting new opportunities but can equally be used to undermine society if it comes into the wrong hands. New research by Ben Y. Zhao and colleagues at the University of Chicago, Automated Crowdturfing Attacks and Defenses in Online Review Systems, has examined the use of AI to automatically generate fake reviews of hotels and restaurants. As AI develops, these fake reviews become almost impossible to spot, and if produced on a massive scale they could completely undermine the credibility of crowd-sourced guides like Yelp, Amazon or TripAdvisor. If we know that most reviews are manipulated or fake, then they all become worthless. This may not seem so serious if it only concerns comments on discussion threads or review sites, but the risk is that it will quickly spread to other fields. As Zhao claims in the BI article:

"In general, the threat is bigger. I think the threat towards society at large and really disillusioned users and to shake our belief in what is real and what is not, I think that's going to be even more fundamental," Zhao said. "So we're starting with online reviews. Can you trust what so-and-so said about a restaurant or product? But it is going to progress.

"It is going to progress to greater attacks, where entire articles written on a blog may be completely autonomously generated along some theme by a robot, and then you really have to think about where does information come from, how can you verify ... that I think is going to be a much bigger challenge for all of us in the years ahead."

Reality is in danger of becoming completely subjective and the challenge for education is how to equip our pupils and students to deal with this fragmented world.