Rapid website-blocking power for violent material proposed for eSafety Commissioner
A new Online Safety Bill could see Australia's eSafety Commissioner given powers to implement targeted blocks of terrorist or extreme violent material during an online crisis event and to order the removal of image-based abuse within 24 hours.
The federal government on Wednesday opened consultation on the new Bill [PDF], which would also create a cyber abuse take-down scheme for Australian adults.
Following the eSafety Commissioner's September 2019 direction to the nation's ISPs to continue blocking websites that host video of the Christchurch terrorist attack, and the agreement of new blocking protocols with ISPs in March, the new Bill proposes further action.
It would introduce a specific and targeted power for the eSafety Commissioner to direct ISPs to block certain domains containing terrorist or extreme violent material, for time-limited periods, in the event of an online crisis event.
As flagged at the start of consultation a year ago, the new Bill would also halve the amount of time online platforms have to pull down content after receiving a notice from the Australian eSafety Commissioner.
Take-down notices for image-based abuse, cyber abuse, cyberbullying, and seriously harmful online content would now need to be actioned within 24 hours, instead of 48 hours.
If a website or app systematically ignores take-down notices for class 1 material under the online content scheme, such as child sexual abuse material, the eSafety Commissioner would be able to require search engines and app stores to remove access to that service.
These protections will be backed by civil penalties — up to AU$550,000 for companies and AU$111,000 for individuals.
The Bill would also expand the cyberbullying scheme for children, enabling the eSafety Commissioner to order the removal of material from a wider range of online services, including games, websites, messaging, and hosting services, rather than social media platforms alone.
The cyber abuse take-down scheme would additionally be extended to adults.
According to the legislation, material constitutes cyber abuse in an adult context when "an ordinary reasonable person would conclude that it is likely that the material was intended to have an effect of causing serious harm to a particular Australian adult".
The scheme will empower the eSafety Commissioner to order the removal of seriously harmful online abuse when websites, social media, and other online services do not remove it after a complaint is made.
In addition, the eSafety Commissioner will have the power to require online services to provide contact or identifying information for individuals using anonymous accounts to abuse, bully, or share intimate images without consent.
A set of Basic Online Safety Expectations will also be enshrined in law. The Act will establish mandatory reporting requirements allowing the eSafety Commissioner to require online services to provide specific information about online harms, such as their response to terrorism and abhorrent violent material, or to volumetric attacks where "digital lynch mobs" seek to overwhelm a victim with abuse.
Services will have to report on how they will uphold these expectations and can be penalised if they fail to do so.
The government will also update Australia’s Online Content Scheme to “better reflect the modern digital environment”.
Under this, sections of the tech industry will be tasked with creating new and strengthened industry codes to keep users safe. Industry will be given six months to establish the new codes, with the eSafety Commissioner also having the power to create industry standards within 12 months if industry fails to do so itself.