Monday, February 24, 2025

Other countries struggle to control how children access the internet. What can Australia learn?


The debate continues to rage in Australia over whether children should (or can) be banned from social media. Following recent promises from politicians to ban those under the age of 16 from the platforms, eSafety Commissioner Julie Inman Grant has expressed concern that imposing age restrictions could push children to use social media in secret and cut them off from crucial social support.

A recent analysis in the UK found that a ban on social media would “solve nothing”, citing evidence from an 18-year study in 168 countries that found there is “no causal link” between internet access and the well-being of young people.

The Australian federal government wants to test age assurance technology to restrict children’s access. However, it remains unclear which technical solutions, if any, can effectively limit access based on age.

Other countries have been trying to block children’s access to online content for decades, but have mostly failed. Australia would be wise to heed the lessons learned from these experiences.

What has the United States tried?

The Children’s Online Privacy Protection Rule (COPPA) was introduced in the United States in 1998. This rule continues to impact the way children – worldwide – access information online.

COPPA imposes various requirements on “operators of websites or online services” that collect personal information from children under the age of 13. This includes the need to obtain parental consent.

To comply with this law, many companies (including social media platforms) have banned children under the age of 13 from accessing online services.

However, these bans have been heavily criticized as contributing to online age fraud. They also restrict children’s rights to access information and the right to self-expression, as protected by the United Nations Convention on the Rights of the Child.

Another far-reaching effort to limit children’s access to “obscene or harmful content on the Internet” was introduced in the United States in 2000.

The Children’s Internet Protection Act (CIPA) required schools and libraries to monitor the content children accessed online. This was usually achieved using internet filters that blocked searches for certain words.

However, these blunt instruments often blocked useful information. For example, a blocked search for the word “breast” to restrict access to pornographic content could also block information about breast cancer.

Research has shown for years that internet filtering is not effective in protecting children from bad experiences online.

Failed age bans

Many other countries have banned children’s access to online content, with varying degrees of success.

South Korea imposed a shutdown law in 2011. This was intended to tackle online gaming addiction by blocking young people under the age of 16 from accessing gaming sites after midnight.

However, many children used accounts in their parents’ names to continue accessing gaming sites. The law also faced legal challenges, with parents concerned about restrictions on their rights to raise and educate their children. The law was abolished in 2021.

In 2015, the European Union introduced legislation banning children under the age of 16 from accessing online services (including social media) without parental consent.

The proposed legislation was controversial, drawing strong protest from technology companies and human rights organizations, which argued the rules would violate children’s rights to expression and access to information.

The law was changed to allow individual countries to set a lower age threshold, with Britain choosing to apply the limit only to children under 13. This patchwork approach meant individual countries could set their own limits.

For example, in 2023, France passed a law requiring social media platforms to block access for children under the age of 15 unless they have permission from a parent or guardian.

Today, Europe leads the world in imposing significant online protections for children, with huge consequences for tech companies.

In 2023, a new Digital Services Act was introduced, which prohibits platforms such as Instagram, Facebook, TikTok and Snapchat from targeting children with personalized advertisements.

Rather than banning children from online services, this legislation focuses on controlling the way very large platforms interact with children. It aims to ensure protection is in place to manage harmful content and algorithmic influences on platform usage.

What can Australia learn from these global efforts?

A critical message from the past twenty years is that bans are ineffective. While technological interventions (such as filtering and age assurance technologies) continue to improve, workarounds (such as using other people’s accounts) make it impossible to ban children outright.

One more effective approach has focused on protecting children’s personal information, leading to long-standing compliance requirements for companies. India and Brazil have recently introduced similar data-centric protections for children.

However, for older children, significant restrictions may conflict with UN protections for children’s rights. Australia must carefully consider potential conflicts when attempting to restrict or ban children’s online access.

Even if Australia imposes a ban on children under 16, it is unlikely to reshape the global approach to such bans.

The US and EU are large markets, with significant influence on the actions of technology companies. Similar to COPPA’s influence in restricting social media access for children under 13 worldwide, it is likely that US and European policy innovations will continue to play a primary role in shaping global approaches.

Australia should take the lead in aligning its approach with these international efforts to strengthen appropriate protections for young children. At the same time, we need to help parents educate older children about the appropriate use of social media.

This strikes an appropriate balance between protecting children’s rights to access information and express themselves, while ensuring that guardrails are in place to do this safely.
