No time to waste in protecting children online


As the government "tweaks" its Online Safety Bill, Emily Carter highlights the importance of maintaining momentum.

Back in 2019, the government published its white paper for what became the Online Safety Bill, billed as world-leading legislation aimed at making the UK the "safest place in the world to be online". In early September, after three long years of consultation, scrutiny and debate, then Prime Minister Liz Truss suspended the bill in the final stages of the legislative process so it could be "tweaked". However, the recent findings of the inquest into the death of 14-year-old Molly Russell show that there is no time for delay. Given the coroner's conclusion that Molly died while "suffering from depression and the negative effects of online content", the government must remain focused on the task at hand.

The regulation of online services has never been easy, and a diverse range of stakeholders is invested in getting the law right, especially when it comes to regulating legal but harmful content that adults can access. However, delaying or abandoning the Online Safety Bill carries risks, with the greatest impact falling on children's safety. Given that there is no perfect compromise, is the proposed regulatory framework for online safety effective? And what can be learned from the existing regulation of children's privacy rights online?

Children’s rights to privacy on the Internet

Data protection legislation has already been successful in protecting children online. Children are clearly recognized as having the same rights as adults, reinforced by the UN Convention on the Rights of the Child, Article 3 of which enshrines the "best interests of the child". Although online platforms are entitled to rely on the consent of a child aged 13 or over, everyone under the age of 18 merits special protection. There are specific obligations around lawful processing and transparency, such as ensuring that privacy information is age-appropriate and presented at the point of disclosure, rather than hidden in illegible small print. The UK General Data Protection Regulation (UK GDPR) also contains specific obligations regarding profiling and automated decision-making that rely on data collected about users.

The processing of children's data has been a regulatory priority for the Information Commissioner's Office (ICO) for several years. Children's rights received a further boost a year ago when the ICO introduced the Age Appropriate Design Code (known as the Children's Code), a statutory code of practice for online services that are "likely to be accessed" by children. It applies to any company that processes the data of UK children, whether based in the UK or abroad.

The Children's Code contains a number of measures to protect children online, including risk assessments, age verification, measures to protect children from any harmful effects of profiling, and switching geolocation services off by default. The Code embodies the principles of "privacy by design" and "privacy by default" by requiring services to build protections for children into their design. In practice, the Code has already prompted a number of platforms to change their default settings. It has served as a template for California's Age-Appropriate Design Code Act (due to take effect in July 2024), and various other jurisdictions are considering similar codes.

Although the Children's Code does not itself have the force of law, it is used as a yardstick for assessing compliance with the principles and obligations of data protection legislation. The ICO has assessed the compliance of more than 50 online services, conducted nine audits, and has four investigations ongoing. On September 26, the ICO published a notice of intent to impose a potential £27 million fine on TikTok for alleged breaches relating to consent, transparency and the processing of "special category" data.

The ICO has laid a strong foundation for further regulation that extends the obligations on online services to implement systems and processes protecting children from harmful material. Importantly, data protection regulation demonstrates how online safety regulation can operate in the real world.

How will the rules work?

Regulation of online content is inherently complex due to the volume of content, the number of users, and the complexity and speed of technical development. When the White Paper was published, there was no precedent for regulating online user-generated content. Stakeholders and their interests are diverse, representing the technology industry, regulators, law enforcement, civil society and charities on the front lines of child protection. The bill should provide certainty and clarity for all stakeholders, as well as flexibility for future development. This is a difficult task.

Basing the regulation on principles governing systems and processes allows the regulatory scheme to develop over time as theory is applied in practice. As the implementation of the GDPR has demonstrated, detailed codes of practice and guidance will be developed in consultation with relevant stakeholders and adjusted over time to meet the challenges facing the industry.

The Online Safety Bill applies to user-to-user services and search services with links to the UK. This includes any services with a significant number of users in the UK or which target users in the UK. Approximately 25,000 online services will be covered, with heavier obligations placed on 30-40 services based on risk of harm. It contains measures to protect all users, but those relating to the protection of children include:

  • the requirement to carry out a risk assessment;
  • regulating the availability of illegal content, particularly child sexual abuse material;
  • imposing special obligations on services that can be accessed by children to implement systems and processes to protect them from age-inappropriate content; and
  • requiring transparency reports outlining the steps services have taken to address online harms.

Instead of moderating specific items of content, which would be inflexible and unworkable, the bill focuses on ensuring that services have appropriate systems and processes in place, taking into account the risks they have identified. This will include assessing the impact of algorithms on the content accessible to children. "Safety by design" should be built into services alongside existing "privacy by design" requirements. Age verification will be an important element in ensuring that children only have access to material that is appropriate for their age, with specific provision for access to online pornography.

Ofcom has been given responsibility for regulating online safety, with a range of investigative and enforcement powers, underpinned by criminal sanctions and individual accountability for senior managers. Ofcom can impose fines of up to £18m or 10% of global annual turnover (whichever is greater) and can seek court orders to restrain non-compliant services. Ofcom is also focusing on improving media literacy in this area, and in March published research into children's and parents' attitudes towards, and use of, the media to inform this work (see bit.ly/3FbnvRZ).
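To illustrate how that cap works, using purely hypothetical figures: a service with a global annual turnover of £500m could face a fine of up to £50m, since 10% of its turnover exceeds the £18m floor, whereas a smaller service turning over £100m would still face a maximum fine of £18m.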

Ofcom is actively preparing to implement the new law from spring next year and intends to recruit more than 300 additional staff. It will work alongside the ICO: the two regulators already cooperate closely on online safety issues, including as key members of the Digital Regulation Cooperation Forum, established two years ago. With such a long lead-in, it is no surprise that Ofcom is keen to get on with the important work required to implement the legislation once it has been passed.

There is no time to waste

Given that our online world is dynamic, regulating online safety is bound to be an iterative process. The government must not allow the heated debate over "legal but harmful" content for adults to delay the bill's child protection provisions from becoming law. A review of the online safety legislation is expected between two and five years after it is passed, giving an opportunity to adjust the regime.

The organizations on which the legislation's success depends need certainty sooner rather than later in order to plan ahead. They will need to develop their internal systems and processes in line with the anticipated legal obligations, particularly the larger social media platforms with existing self-regulatory schemes. The safety tech industry needs clarity, especially on age verification technology, which will be critical to the legislation's success. Organizations that identify and remove harmful content from the internet, such as the Internet Watch Foundation, need to understand their role in the new regulatory framework.

The Online Safety Bill cannot precisely define every detail that the industry will require in the coming years. The regulator must be trusted to continue the important work of filling in the gaps, including on vital issues such as defining the content that is harmful to children. Ofcom is committed to listening to all stakeholders and is subject both to parliamentary scrutiny and to judicial review if it makes unreasonable or unlawful decisions.

Ofcom has planned a full consultation with all stakeholders in spring 2023 and intends to finalize guidance and codes of practice on harm to children and child safety in 2024. There is a huge amount of work to be done once the Online Safety Bill becomes law. Even though the government has a bulging in-tray, no time can be wasted in publishing these "tweaks" and completing the legislation.

This article was first published in the New Law Journal on October 28, 2022.
