Multimodal foundation models (MFMs), which can process and generate various types of data such as images, audio and video, are gaining traction as generative AI continues to evolve.
The paper, Examination of technology–Multimodal Foundation Models, explores how MFMs impact the regulatory roles of DP-REG members.
As generative AI spreads into new fields, communication around the opportunities and risks posed by these technologies is becoming more important. The report addresses concerns such as the rise of deepfake content, a growing issue for both regulators and the public.
DP-REG’s paper supports its 2024-26 strategic priorities, which include understanding and responding to the potential harms and benefits of AI technologies. The exploration of MFMs aims to contribute to ongoing government discussions about AI and its role in society.
This is the third paper in a series on digital platform technologies; earlier reports examined the harms and risks associated with algorithms and the large language models (LLMs) that generate text. DP-REG brings together the Australian Competition and Consumer Commission (ACCC), the Australian Communications and Media Authority (ACMA), the eSafety Commissioner (eSafety) and the Office of the Australian Information Commissioner (OAIC) to address the regulatory challenges posed by these technologies.
As AI continues to transform communication and other industries, DP-REG’s work ensures a coordinated regulatory approach to digital platforms in Australia, promoting safety and transparency in the digital age.