Canberra asks big tech to develop detection capabilities in encrypted communication

Image courtesy: ZDNet

The Australian government has created a set of proposed guidelines that online service providers, including social media firms, must follow if they wish to operate a service in Australia. Failing to comply with the reporting requirements may result in a penalty of AU$555,000, and the draft regulations also include expectations around detecting harmful material in encrypted communications.

The Online Safety Act 2021 will grant Australia’s eSafety Commissioner broad new powers beginning in January. Among these is oversight of a new set of Basic Online Safety Expectations (BOSE), which sets out a number of standards for large technology companies.

These expectations will apply to service providers such as social media platforms, “relevant electronic services of any sort,” such as messaging applications and games, and other designated internet services, such as websites.

Under the proposed Draft Online Safety (Basic Online Safety Expectations) Determination 2021, the provider is expected to take reasonable steps to ensure safe use. This includes the “core” expectation that the provider will take reasonable steps to ensure that end users are able to use the service safely.

The provider is also expected to minimise the availability of cyberbullying material targeted at an Australian child, cyber abuse material targeted at an Australian adult, non-consensual intimate images of a person, class 1 material, and material that promotes, incites, instructs in, or depicts abhorrent violent conduct.

Additional expectations include that the service provider will take reasonable steps to proactively reduce the extent to which material or activity on the service is or may be unlawful or harmful. According to the draft determination, reasonable steps could include developing or implementing mechanisms to identify, moderate, report, and remove material or activity on the service that is or may be unlawful or harmful.