Australia Confronts Trust Issues on AI, Aims to Create Own AI Advisory Body

Jan 17, 2024 - techtimes.com
Australia is planning to establish its own artificial intelligence (AI) advisory body and guidelines to mitigate AI risks, in consultation with industry bodies and experts. The government is considering either new legislation specific to AI or amendments to existing laws to ensure safeguards on AI research and application in high-risk environments. It is also collaborating with industry to create a voluntary AI safety standard and options for watermarking and labeling goods produced by AI.

The government aims to differentiate between "high risk" and "low risk" AI applications, with the former including the production of manipulated information or "deep fakes," and the latter including the screening of spam emails. Safeguards under consideration include testing requirements for products, transparency about model design and the data supporting AI applications, training programs for AI system developers, and potential certification programs. The government is also observing how other nations, including the US, Canada, and the EU, are addressing AI-related issues.

Key takeaways:

  • Australia is planning to create its own artificial intelligence (AI) advisory body and guidelines to mitigate AI risks, in consultation with industry bodies and experts.
  • The government is considering enacting new legislation or amending existing laws to impose safeguards on the research and application of AI in high-risk environments.
  • Immediate actions include collaborating with industry to create a voluntary AI safety standard and options for watermarking and labeling goods produced by AI.
  • Australia is closely monitoring how other nations, including the US, Canada, and the EU, are addressing the issues raised by AI, and is committed to working with them to influence global efforts in this domain.
