New Australian Online Safety Act to include take-down of cyber abuse
Under a proposed new Online Safety Act for Australia, online platforms would have the time they are given to pull down content after receiving a notice from the Australian eSafety Commissioner cut to 24 hours.
The Act would also extend cyberbullying provisions from children to the entire population, with a higher threshold for adults; require search engines to "de-rank offending content"; and hand the eSafety Commissioner the power to compel transparency reporting from digital platforms.
The consultation process for the Act kicked off at the National Press Club on Wednesday, with Minister for Communications, Cyber Safety, and the Arts Paul Fletcher stating there was a disconnect between community expectations and digital platforms' approach to online safety.
Fletcher said the eSafety Commissioner currently has no legislative power to investigate cyber abuse of adults, and can only provide advice and rely on platforms to respond to requests.
“To be clear — this will not mean intervening in everyday personal disputes. However, there is a strong case for a take-down scheme targeted at the most seriously harmful online conduct, already criminalised in the Criminal Code,” Fletcher said.
“The government therefore proposes to introduce a new cyber abuse scheme for Australian adults, with its own take-down regime and appropriate civil penalties.”
The federal government would also increase criminal penalties for online abuse and harassment, Fletcher added.
During 2019, the eSafety Commissioner is likely to complete 12,000 investigations into child sexual abuse material, up 50% on last year, the minister said. Since gaining oversight of image-based abuse, the Office of the eSafety Commissioner has achieved a 90% success rate in addressing more than 1,800 reports concerning over 2,500 URLs.
The Act will also give the eSafety Commissioner the power to have child exploitation material, abhorrent violent material, content that incites terrorism or violence, and "other extreme material" removed, no matter where in the world it is hosted.
“The government remains concerned with the ease with which children can access pornography and other types of harmful online content,” he said.
“The codes will require industry to provide their customers with optional products that limit exposure to prohibited online content in their homes.”
Should industry be unable to agree on the codes, or should the government deem them ineffective, the eSafety Commissioner would be able to create a code.
The new Act is a response to the review of online safety legislation begun in the middle of last year by former Australian Public Service Commissioner Lynelle Briggs.
The Department of Communications has published a discussion paper on the Act, with submissions closing on 19 February 2020.
“I have been working in this industry long enough to know that some of what I have just outlined will make some industry representatives uncomfortable,” Fletcher said.
“But what I have outlined is the next phase of the collaboration between government and industry to maintain Australians’ confidence in the online world — confidence that the internet is a remarkable resource for good that they and their children can safely embrace into every part of their life.”
After the Christchurch terrorist attack, the eSafety Commissioner issued a direction to the nation's largest internet service providers to block eight unnamed sites, and keep them blocked for six months. In September that block was renewed.
Australia’s telcos blocked 40 sites of their own accord after the attack.
In August, the government said it would create a content blocking regime for crisis events, with the eSafety Commissioner set to gain the power to force the nation’s telcos to block certain content.
The government also said at the time it would establish a 24/7 Crisis Coordination Centre to inform government agencies of "online crisis events" and aid the eSafety Commissioner in making a "rapid assessment" during such situations.
Related Coverage
Online age verification will have to involve biometrics: Former eSafety chief
Alastair MacGibbon has said any online age verification system would need to involve biometric proof, such as a video, similar to eyeballing a person suspected of being underage in a liquor store.
Australia’s eSafety says age verification not a panacea for protecting kids from porn
The Office of the eSafety Commissioner said other technological solutions need to be leveraged to ensure online harms are addressed in a holistic and multi-faceted way.
Australian eSafety Commissioner directs ISPs to keep Christchurch attack blocks
Eight sites to be blocked for the next six months, with the block to be lifted if the video is removed.
Australia still working with tech giants on what constitutes abhorrent violent material
Following a discussion with the Attorney-General’s Department on Australia’s abhorrent video streaming laws, Google was still unsure of its actual obligations.
Australian Christian Lobby thinks NBN or telcos should do age verification
If the government-owned broadband wholesaler begins to collect identity data on all Australians, heaven help us.