    World

Use of AI could worsen racism and sexism in Australia, human rights commissioner warns

By Olivia Carter, August 13, 2025
    Productivity gains from AI will be discussed next week at the federal government’s economic summit, as unions and industry bodies raise concerns about copyright and privacy protections. Photograph: SOPA Images/LightRocket/Getty Images

    AI risks entrenching racism and sexism in Australia, the human rights commissioner has warned, amid internal Labor debate about how to respond to the emerging technology.

Lorraine Finlay says the pursuit of productivity gains from AI risks entrenching discrimination if the technology is not properly regulated.

Finlay’s comments follow Labor senator Michelle Ananda-Rajah breaking ranks to call for all Australian data to be “freed” to tech companies, to prevent AI from perpetuating overseas biases and to ensure it reflects Australian life and culture.

    Ananda-Rajah is opposed to a dedicated AI act but believes content creators should be paid for their work.


    Productivity gains from AI will be discussed next week at the federal government’s economic summit, as unions and industry bodies raise concerns about copyright and privacy protections.

    Media and arts groups have warned of “rampant theft” of intellectual property if big tech companies can take their content to train AI models.

Finlay said a lack of transparency about the datasets AI tools are trained on makes it difficult to identify which biases they may contain.

    “Algorithmic bias means that bias and unfairness is built into the tools that we’re using, and so the decisions that result will reflect that bias,” she said.

    The human rights commissioner, Lorraine Finlay. Photograph: Mick Tsikas/AAP

    “When you combine algorithmic bias with automation bias – which is where humans are more likely to rely on the decisions of machines and almost replace their own thinking – there’s a real risk that what we’re actually creating is discrimination and bias in a form where it’s so entrenched, we’re perhaps not even aware that it’s occurring.”

The Human Rights Commission has consistently advocated for an AI act, for bolstering existing legislation such as the Privacy Act, and for rigorous testing for bias in AI tools. Finlay said the government should urgently establish new legislative guardrails.

    “Bias testing and auditing, ensuring proper human oversight review, you [do] need those variety of different measures in place,” she said.
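
To make the “bias testing and auditing” Finlay describes more concrete, the sketch below shows one common screening check, the four-fifths (disparate impact) rule, applied to hypothetical selection outcomes from an automated tool. The group labels, data and 0.8 threshold are illustrative assumptions, not a method set out in the article or by the commission.

    # Minimal bias-audit sketch (assumed data and groups): compare selection
    # rates across groups and flag any group whose rate falls below 80% of
    # the best-treated group's rate (the "four-fifths" rule of thumb).
    from collections import defaultdict

    def selection_rates(records):
        """records: iterable of (group, selected) pairs; selected is True/False."""
        counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
        for group, selected in records:
            counts[group][0] += int(selected)
            counts[group][1] += 1
        return {g: sel / total for g, (sel, total) in counts.items()}

    def impact_ratios(rates):
        """Each group's selection rate relative to the highest-rate group."""
        best = max(rates.values())
        return {g: r / best for g, r in rates.items()}

    if __name__ == "__main__":
        # Hypothetical outcomes from an AI screening tool.
        outcomes = [
            ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
            ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
        ]
        rates = selection_rates(outcomes)
        for group, ratio in impact_ratios(rates).items():
            flag = "REVIEW" if ratio < 0.8 else "ok"
            print(f"{group}: selection rate {rates[group]:.2f}, ratio {ratio:.2f} [{flag}]")

A real audit would go further, for example comparing error rates and intersectional groups rather than a single ratio, but even this simple comparison makes large disparities visible before a tool is deployed.
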

There is growing evidence of bias in AI tools in Australia and overseas, in areas such as medicine and job recruitment.

    An Australian study published in May found job candidates being interviewed by AI recruiters risked being discriminated against if they spoke with an accent or were living with a disability.

Ananda-Rajah, who was a medical doctor and AI researcher before entering parliament, said it was important for AI tools to be trained on Australian data, or they would risk perpetuating overseas biases.

    While the government has stressed the need for protecting intellectual property, she warned that not opening up domestic data would mean Australia would be “forever renting [AI] models from tech behemoths overseas” with no oversight or insight into their models or platforms.

    “AI must be trained on as much data as possible from as wide a population as possible or it will amplify biases, potentially harming the very people it is meant to serve,” Ananda-Rajah said.

    “We need to free our own data in order to train the models so that they better represent us.


    “I’m keen to monetise content creators while freeing the data. I think we can present an alternative to the pillage and plunder of overseas.”

Ananda-Rajah raised skin cancer screening by AI as an example where the tools used for testing have been shown to have algorithmic bias. She said the way to overcome any bias or discrimination against certain patients would be to train “these models on as much diverse data from Australia as possible”, with appropriate protections for sensitive data.

Finlay said any release of Australian data should be done in a fair way, but she believes the focus should be on regulation.

    “Having diverse and representative data is absolutely a good thing … but it’s only one part of the solution,” she said.

    “We need to make sure that this technology is put in place in a way that’s fair to everybody and actually recognises the work and the contributions that humans are making.”

Judith Bishop, an AI expert at La Trobe University and a former data researcher at an AI company, said freeing up more Australian data could help train AI tools more appropriately, but cautioned that it was only a small part of the solution, and that AI tools developed overseas on international data may not reflect the needs of Australians.

    “We have to be careful that a system that was initially developed in other contexts is actually applicable for the [Australian] population, that we’re not relying on US models which have been trained on US data,” Bishop said.

    The eSafety commissioner, Julie Inman Grant, is also concerned by the lack of transparency around the data AI tools use.

In a statement, she said tech companies should be transparent about their training data, develop reporting tools, and use diverse, accurate and representative data in their products.

    “The opacity of generative AI development and deployment is deeply problematic,” Inman Grant said. “This raises important questions about the extent to which LLMs [large language models] could amplify, even accelerate, harmful biases – including narrow or harmful gender norms and racial prejudices.

    “With the development of these systems concentrated in the hands of a few companies, there’s a real risk that certain bodies of evidence, voices and perspectives could be overshadowed or sidelined in generative outputs.”

