The United Kingdom may be preparing to hit several tech companies with enforcement action (and potentially fines) related to its children’s online privacy and safety code, which has been in force for a year.
“The ICO is currently examining how well more than 50 different online services comply with the code, with four ongoing investigations. We have also audited nine organizations and are currently evaluating their results,” the data protection regulator said in a blog post yesterday marking the first anniversary of the Code coming into force.
The Telegraph, which interviewed Information Commissioner John Edwards, the head of the Information Commissioner’s Office (ICO), reports in today’s paper that two of the four social media and tech firms under investigation are well-known names.
Its report says the ICO’s decision on whether to take enforcement action is expected to be announced within weeks.
“This code makes it clear that children are not like adults online and their data needs greater protection,” Edwards told the Telegraph. “We will use our enforcement powers where necessary.”
The companies in question were not named by the newspaper or the ICO, but the watchdog wrote to Apple and Google last November after concerns were raised about how they assess apps in their respective mobile app stores to determine what age ratings to apply.
At the time, the ICO described its outreach as an “evidence-gathering process to establish compliance with the code” – although it remains to be seen whether the two tech giants are among the four companies that could face enforcement, or whether they are part of the wider group whose compliance the watchdog has been monitoring.
“Unfortunately, we are unable to name the companies at this time due to ongoing investigations,” an ICO spokeswoman confirmed when asked if she could share more details.
The ICO first published the Children’s Code back in 2020. It contains 15 standards for what it calls “age-appropriate design” – essentially a set of design guidelines for web services that children are likely to access, including recommendations such as setting privacy to high by default and not using heavy-handed engagement tactics that can foster unhealthy addictions to digital services in children.
The main purpose of the Code is to push platforms to protect children from accessing inappropriate content and to shield them from commercial data mining, although the ICO regulates personal data, not content – the latter responsibility will fall to Ofcom under the incoming Online Safety Bill (assuming another change of UK prime minister does not lead to a rethink of the legislation on that front).
This division of regulatory responsibility has drawn some criticism from child safety campaigners who, while supportive of the Code – none more so than 5Rights founder and peer Baroness Kidron, who was the main driver behind the standards (and continues to press for amendments from her seat in the House of Lords) – have complained of “gaps” as they wait for content-focused safety legislation to make its way through Parliament.
The ICO has therefore faced pressure to also scrutinize adult websites – that is, to require porn sites to comply with the Code too – and not just the kinds of games and social media apps that are apparently most popular with children.
Age verification for porn sites?
The main push by child safety advocates is to force adult websites to implement stricter age checks to prevent children from accessing pornography online – essentially reviving the mandatory age-verification policy for porn sites that UK lawmakers have spent years pushing for, and which was most recently resurrected (earlier this year) as (yet another) addition to the Online Safety Bill, after a separate age-verification scheme was scrapped in 2019 following criticism that it was unworkable.
Campaigners may finally feel a sense of victory on this front thanks to the Online Safety Bill, as the government said in February it would make it mandatory for adult sites to use “age verification technology” to make it harder for children to access or stumble across pornography. But they evidently aren’t sitting back and waiting for that legislation to pass – not when the Children’s Code and the UK’s existing data protection laws are already there for them to use…
And in what appears to be a change of approach announced yesterday, the ICO has bowed to pressure and widened its interpretation of the Code to cover pornographic websites – or at least those deemed “likely” to be accessed by children (whatever that turns out to mean) – writing in its blog post: “We have…revised our position to clarify that adult-only services are subject to the Children’s Code if they can be accessed by children.”
The ICO says this evolution in the application of the Code follows petitions from child safety campaigners and others warning of the risk of “data protection harm” when children visit porn sites.
“We will continue to evolve our approach by listening to others to ensure the code has maximum impact,” it continued. “For example, we are seeing a growing body of research (from the NSPCC, 5Rights, Microsoft and the British Board of Film Classification) indicating that children are likely to access adult-only services, and that this results in data protection harms – children losing control over their data, or being manipulated into providing more data – in addition to content harms.”
This change of tack does not (and cannot) entail an expansion of what the ICO regulates to include content itself. (“We do not regulate content,” its spokeswoman confirmed. “We regulate how children’s personal data is used or processed to serve content to children. This is a step before children see the content.”)
However, it’s clear that the data-gathering habits of porn sites aren’t the primary concern for child safety campaigners – the content is – but if campaigners can use children’s privacy rules to force porn sites to check ages, they don’t seem too fussed.
In a statement welcoming the ICO’s revision to bring adult-only sites within the scope of the Code, child safety campaign group the 5Rights Foundation said:
The UK’s Age Appropriate Design Code applies to all services likely to be accessed by persons under the age of 18, even if those services are not aimed at children. In the course of its investigative work, handed over to the ICO last year, 5Rights found children accessing sites – including gambling, dating and pornography sites – that were not complying with the Code, including by profiling children to serve them harmful material.
“The ICO’s clarification on adult-only sites will provide much-needed clarity for those companies that believed they were outside the rules,” added Duncan McCann, head of policy enforcement at 5Rights, in a further statement of support. “They will no longer have grey areas to hide behind, and we hope this development will further improve young people’s online lives.”
Although the UK Children’s Code is not itself legally binding, it sits atop the country’s wider data protection rules, including the UK Data Protection Act and the UK GDPR, and the ICO’s guidance states that in-scope online services “must meet” its standards to “make sure they comply with their obligations under data protection law to protect children’s data online”.
Under the GDPR, the ICO has broad powers to act against privacy breaches – including the ability to fine violators up to 4% of their global annual turnover (or £17.5m, whichever is greater). So the implication here is basically ‘comply with the code or risk GDPR-level enforcement’ – giving the ICO a big stick to encourage in-scope digital services to apply its gold-plated rules, which could end up covering large swathes of the internet, because who knows what other services kids are “likely” to be able to access?
Asked how adult websites should assess whether children are likely to access their services, an ICO spokeswoman said: “Services should be accountable for their decisions and be able to provide evidence supporting their view on whether children are likely to access them. To determine whether they are covered by the code, adult services will need to understand who their users are and whether children make up a significant number of those users. To do this, an online service could conduct research on its users; review academic research or commission market research; consider the types of content and activities that children are interested in and the appeal of its services to children; or consider whether children are known to enjoy similar services.”
The phrase “understand who their users are and whether children make up a significant number of those users” is doing a lot of work in that sentence, although the ICO has not expressly proposed age verification technology as a way for a service to determine whether it falls under the Code. It went on…
“If children are likely to access an adult-only online service, the service must either take steps to restrict children’s access, such as by implementing age assurance measures, or implement the standards of the Code proportionately, based on a risk assessment, to protect children’s privacy online,” an ICO spokeswoman also told us, adding: “It is vital that we look after children online and do not treat them in the same way as adults. Embedding the Children’s Code is a long-term, transformational process, but we are seeing more and more changes that benefit children while allowing the online industry to innovate – and rightly so.”
The ICO blog post also states that the (privacy) regulator will work with Ofcom (the incoming content regulator) and the Department for Digital, Culture, Media and Sport (DCMS) to “establish how the code works in practice in relation to adult-only services and what they should expect.” So expect further “evolution” in how the code is applied as more pieces of the UK’s digital regulatory strategy are adopted (or, well, dropped).
The ICO already takes credit for a number of policy tweaks major platforms – including Facebook, Instagram, YouTube, Google and Nintendo – have applied to children’s accounts over the past year: for example, Meta-owned platforms limiting ad targeting of under-18s to age, gender and location; and YouTube turning off autoplay by default and turning on take-a-break and bedtime reminders by default for under-18s’ Google accounts, to name two of the actions it flags.
The UK code has also been credited with spurring similar policy moves in other jurisdictions – reportedly inspiring a California bill passed by lawmakers this week (which, if signed into law, would apply a similar set of protections to under-18s in that state) – among a number of other moves by regulators and policymakers elsewhere to protect children online.