Ofcom Issues Guidelines for UK Online Safety Act Compliance, Defines Industry Role in Child Protection
LOS ANGELES — (Jan. 10, 2024) Ofcom, the U.K.’s online safety regulator, has released its rules under the Online Safety Act for platforms that publish pornographic material or host user-generated content.

As the agency responsible for implementing these rules, Ofcom has issued guidance on “How children will be protected from accessing online pornography,” which specifically addresses the industry and its responsibility for ensuring that children are protected from accessing pornography online. According to the regulator, the rules are part of a broader effort to make the internet safer and require online sites and platforms to take measures that protect their users from harmful content. As part of the mandate, all services allowing pornography must have “highly effective” age-checking in place to keep children from accessing it. The deadline for compliance is July 2025.

“While affording strong protection to children, our approach will make sure privacy rights are safeguarded and that adults can still access legal pornography,” Ofcom recently stated. “The rules apply to any platform that can be accessed by users in the U.K., or which target the U.K. market, regardless of where in the world the platform is based.”

The new rules cover two categories of services defined by the Online Safety Act: Part 5 services, which publish pornographic content, such as studios or paysites, where operators control the material available; and Part 3 services, which host user-generated content, such as tube sites, cam sites and fan platforms.

Ofcom will issue further guidance on how services must implement “highly effective age assurance,” providing an approach that will apply consistently to all platforms. Part 5 services must immediately introduce highly effective age assurance measures, while Part 3 services “must carry out assessments to confirm whether under-18s [can] access content on their platforms.” Ofcom will publish its guidance on children’s access assessments in January.

“Unless they are already using highly effective age assurance to prevent children from accessing pornography, we expect them to be caught by all the child safety duties — including age assurance requirements — which will be published in April 2025,” the statement notes. “Part 3 services will be expected to start taking steps to comply with their duties to protect people from illegal content from December 2024.”

That’s this month, and the clock is ticking. Covered services must also document which measures they are taking, or plan to take, and will have three months to complete an “Illegal Content Risk Assessment.” These services must implement suitable measures to prevent illegal harm online by April 2025.

“This means that by July 2025, all platforms must have a highly effective age assurance solution in place to protect under 18s. This is the case whether a service publishes its own pornographic content or allows user-generated pornographic content,” the statement concluded. “Ofcom is working to ensure the adult industry is aware of the upcoming requirements and has the information they need to comply.”

According to Tim Henning, the Executive Director of the Association of Sites Advocating Child Protection (ASACP), although compliance may be burdensome, the penalties for failure will be much more problematic.

“Ofcom should be applauded for considering the industry’s needs and nuances during this process,” Henning explained.
“I have met with Ofcom on behalf of ASACP and its supporting industry stakeholders to help craft a reasonable approach to online child protection,” Henning said. “While no governmental regulations are perfect, the U.K.’s approach is thoughtful and helps to protect the rights of consumers and creators alike.”

“You can be certain that this major development will be followed closely by the association and we will continue to inform and advocate for the industry among lawmakers while monitoring implementation and similar legislation worldwide,” Henning concluded. “This is a benefit that ASACP provides to the industry on behalf of children everywhere, but this outreach is only possible through the generous support of our sponsors.”

To learn more about how your business can help protect itself by protecting children, email tim@asacp.org.

About ASACP

Founded in 1996, ASACP is a nonprofit organization dedicated to online child protection. ASACP comprises the Association of Sites Advocating Child Protection and the ASACP Foundation. ASACP is a 501(c)(4) social welfare organization that manages a membership program providing resources to help companies protect minors online. The ASACP Foundation is a 501(c)(3) charitable organization responsible for the CP Reporting Tipline and the RTA (Restricted To Adults) website meta-labeling system. ASACP has invested 28 years in developing progressive programs to protect minors, and its assistance to the digital media industry’s child protection efforts is unparalleled.

For more information, visit ASACP.org.

###