AI Transparency Statement
Screen Australia is committed to safe, responsible, ethical and transparent use of Artificial Intelligence (AI) technologies in accordance with the principles outlined by the Digital Transformation Agency’s (DTA) Policy for the responsible use of AI in government (“the Policy”).
In accordance with the requirements under the Policy, this Transparency Statement outlines Screen Australia’s approach to AI usage and adoption. We acknowledge that AI usage and adoption is rapidly evolving, and are committed to ongoing refinement of our policies and approaches to AI.
Screen Australia’s accountable official responsible for implementing the Policy is our Chief Operating Officer, Grainne Brunsdon.
How Screen Australia uses AI
Screen Australia supports the ethical and legal use of AI technologies and services. We do not currently use AI technologies or services where the public may directly interact with, or be significantly impacted by, AI without a human intermediary or human intervention. Screen Australia currently uses AI in the DTA-defined domains of ‘service delivery’ and ‘corporate and enabling’, and employs the DTA-defined usage patterns of ‘workplace productivity’ and ‘analytics for insights’.
Approach to Staff Usage
Staff may access AI technologies and services in the course of their work and on work devices only if the technology or service has been approved by the agency. In August 2024, we introduced an internal Policy on the use of Artificial Intelligence (AI) by Screen Australia Employees & Contractors, which sets out the terms on which our employees and contractors may use approved AI technologies and services.
This internal policy requires staff to comply with the Australian Government’s Interim guidance on government use of public generative AI tools when using any approved AI tool, and reminds employees and contractors of their current obligations under policy, contract and applicable laws. The interim guidance requires staff above all to:
- be able to explain, justify and take ownership of their advice and decisions
- not input anything into an AI tool that could reveal classified, personal or otherwise sensitive information.
The agency also held an internal mandatory webinar to educate Screen Australia staff about these terms of approval and usage.
Screen Australia follows the direction of Government on the use of specific AI platforms, such as the February 2025 directive regarding DeepSeek. From February 2025, staff cannot access or use DeepSeek on Screen Australia devices, or on private devices with access to agency email.
Managing Stakeholder Usage
Any use of AI by our stakeholders, including applicants for funding, must be consistent with Screen Australia’s Terms of Trade, the relevant program guidelines, and any applicable laws.
In September 2024, Screen Australia published a set of AI Guiding Principles, which publicly articulates our expectations around the proposed use of AI technologies or services by Screen Australia and its stakeholders.
These key principles are:
- Talent, creativity, culture and the individual
- Transparency
- Ethical use of AI
- Diversity, equity and inclusion
- Fairness
- Responsibility and accountability.
Screen Australia continually reviews opportunities to help stakeholders balance the uptake of AI as a tool for innovation and best practice with its safe and responsible use. This includes sharing updates and insights from subject matter experts.
This Transparency Statement was last updated on 24 February 2025. It will be updated as Screen Australia’s approach to AI changes, and at least every 12 months.
For enquiries about Screen Australia’s adoption of AI, please contact [email protected]