At Pamoja, we are proactively engaged with the critical issues facing the EdTech sector, including making sure that young people feel safe online and that there are sufficient safeguarding processes in place.

This is especially important now that the coronavirus pandemic has seen a rush of young people move online in every facet of their lives, particularly in education, as schools across the world are closed or facing closure.

In this article, we look at how the UK government’s Online Harms White Paper (2019) will affect EdTech businesses across the sector and how they may need to change their business models accordingly, especially as more schools digitise their offering in the coming years as a consequence of the pandemic.

The dangers of being online

Today, the dangers of being online and using EdTech are manifold, especially for young people. There is online bullying, hate speech, child grooming, extremist or terrorist radicalisation, and even “suggested” posts on social media that influence children to self-harm or take their own lives.

While internet providers and social media companies have taken some steps to protect users, the overall response to online harms has been intermittent and unsatisfactory. This matters all the more at a time of heightened anxiety about the future.

The UK’s Online Harms White Paper

Inevitably, governments are now waking up to the scale of these dangers, and in 2019 the UK government became one of the first to unveil a proposal to tackle them: the Online Harms White Paper.

With the Queen’s Speech in October of that year confirming that legislation is being drafted, the White Paper proposes a new regulator with wide-ranging powers to define how social media and tech companies must manage content on their platforms.

Importantly, it introduces the legal concept that companies have a “duty of care” to their online users – those that do not adequately moderate content may face substantial fines and penalties, or even be blocked in the UK.

Any company hosting user-generated content – from group forums upwards – is likely to fall under the new legislation. That would include many current and future EdTech companies whose products are used in the UK’s schools.

The social nature of EdTech

Given that EdTech products are usually aimed at young people, EdTech companies have a particular responsibility to think about possible harms – in fact, safeguarding is as important in EdTech as it is in any school.

EdTech platforms and products are also increasingly social in nature, connecting learners with teachers and each other – so they can pose risks similar to any social network.

Figures from around the world also illustrate that the number of young people using social EdTech is huge and growing.

In India, the number of EdTech users is anticipated to jump 600% between 2016 and 2021; the global EdTech social networks Edmodo and Brainly have grown to 87m and 150m active users respectively; and the educational app Kahoot!, which is used worldwide and allows user-generated content, has been used by 830m people (with 50m active users per month).

What do EdTech companies need to do?

The question naturally arises of how EdTech companies will moderate content to satisfy the regulations.

While such interventions must be practical, we believe that responsible EdTech businesses should build the appropriate duty of care into their operations from the beginning – and if their business models do not allow for such safeguards, then the models need to be revised.

With so many new EdTech startups appearing each year – often with founders from technology rather than education backgrounds – companies must take care to make safeguarding and content monitoring central concerns.

After all, reputation and trust are everything when it comes to schools choosing EdTech. This is why EdTech companies must also ensure from the beginning that they keep user data private, don’t use it for commercial purposes, and take the necessary measures to protect their platforms from any intrusion.

The US EdTech company inBloom – which had to shut down after a data privacy backlash from parents and school districts – serves as a cautionary tale.

How will regulation work?

The UK government’s proposal also raises other important questions that EdTech companies need to consider and be wary of.

It is in the nature of the Internet to extend across political borders, so will EdTech platforms hosted in Ireland or the Netherlands be covered by the regulations as long as they have UK users?

If not, won’t companies just register or host elsewhere? A regulator that ensures companies’ compliance with future-proofed standards may make the UK an attractive place to start or invest in a digital business – but equally, we should be cautious about how regulation might affect innovation.

And what safeguards will prevent the regulator, once created, from having its scope and remit widened in the future – and exerting more control over the industry than promised?

Nonetheless, we should remember that while an online regulator could be a welcome development, it shouldn’t reduce EdTech companies to passive compliance.

A truly safe online world will only come when companies proactively build these concerns into their business – and think about their users with as much care as they do their venture funding or IPOs.

And schools should make sure the EdTech they use in the future ticks all these boxes. As we have seen over the past few months, this is crucial for schools that are rushing to find EdTech solutions to mitigate the impact of school closures during the pandemic. The race to digitise must also factor in the safety of users.

For more information about what we do at Pamoja Education, visit our website. Article written by John Ingram, CEO of Pamoja Education.