Voxa News

Technology
    AI chatbot ‘MechaHitler’ could be making content considered violent extremism, expert witness tells X v eSafety case | X

By Olivia Carter | July 16, 2025
    Elon Musk’s xAI apologised last week after its Grok chatbot made a slew of antisemitic and Adolf Hitler-praising comments on X. Photograph: Algi Febri Sugita/SOPA Images/Shutterstock

    The chatbot embedded in Elon Musk’s X that referred to itself as “MechaHitler” and made antisemitic comments last week could be considered terrorism or violent extremism content, an Australian tribunal has heard.

But an expert witness for X has argued that intent cannot be ascribed to a large language model, only to the user who prompts it.

    xAI, Musk’s artificial intelligence firm, last week apologised for the comments made by its Grok chatbot over a 16-hour period, which it attributed to “deprecated code” that made Grok susceptible to existing X user posts, “including when such posts contained extremist views”.

    The outburst came into focus at an administrative review tribunal hearing on Tuesday where X is challenging a notice issued by the eSafety commissioner, Julie Inman Grant, in March last year asking the platform to explain how it is taking action against terrorism and violent extremism (TVE) material.

    X’s expert witness, RMIT economics professor Chris Berg, provided evidence to the case that it was an error to assume a large language model can produce such content, because it is the intent of the user prompting the large language model that is critical in defining what can be considered terrorism and violent extremism content.

    One of eSafety’s expert witnesses, Queensland University of Technology law professor Nicolas Suzor, disagreed with Berg, stating it was “absolutely possible for chatbots, generative AI and other tools to have some role in producing so-called synthetic TVE”.

    “This week has been quite full of them, with X’s chatbot Grok producing [content that] fits within the definitions of TVE,” Suzor said.

He said the development of AI involves human influence “all the way down”, where intent can be found, including Musk’s moves to change the way Grok responded to queries so it would “stop being woke”.

    The tribunal heard that X believes the use of its Community Notes feature (where users can contribute to factchecking a post on the site) and Grok’s Analyse feature (where it provides context on a post) can detect or address TVE.


Both Suzor and fellow eSafety expert witness Josh Roose, a Deakin University associate professor of politics, told the hearing it was contested whether Community Notes was useful in this regard. Roose said addressing TVE required users to report the content to X, after which it went into a “black box” for the company to investigate; often only a small amount of material was removed and a small number of accounts banned.

    Suzor said that after the events of last week, it was hard to view Grok as “truth seeking” in its responses.

    “It’s uncontroversial to say that Grok is not maximalising truth or truth seeking. I say that particularly given the events of last week I would just not trust Grok at all,” he said.

    Berg argued that the Grok Analyse feature on X had not been updated with the features that caused the platform’s chatbot to make the responses it did last week, but admitted the chatbot that users respond to directly on X had “gone a bit off the rails” by sharing hate speech content and “just very bizarre content”.

    Suzor said Grok had been changed not to maximise truth seeking but “to ensure responses are more in line with Musk’s ideological view”.

    Earlier in the hearing, lawyers for X accused eSafety of attempting to turn the hearing “into a royal commission into certain aspects of X”, after Musk’s comment referring to Inman Grant as a “commissar” was brought up in the cross-examination of an X employee about meetings held with X prior to the notice being issued.

The government’s barrister, Stephen Lloyd, said X was trying to argue that eSafety had been “unduly adversarial” in its dealings with the platform, and that X had broken off negotiations at a critical point before the notice was issued. He said the “aggressive approach” came from X’s leadership.

    The hearing continues.

    Olivia Carter is a staff writer at Verda Post, covering human interest stories, lifestyle features, and community news. Her storytelling captures the voices and issues that shape everyday life.
