Truth Social’s New AI Chatbot Is Donald Trump’s Media Diet Incarnate

Key Points

  • Truth Media launched “Truth Search AI,” built on Perplexity AI’s platform.
  • Independent tests found the bot repeatedly cited only conservative outlets.
  • Even simple queries, like a multiplication problem, were sourced from right‑leaning articles.
  • The chatbot claimed balanced sourcing, yet no left‑ or centrist sources appeared.
  • Perplexity explained the pattern as a result of configurable “source selection.”
  • Responses on immigration and election integrity reflected the same narrow source pool.
  • On the Trump‑Epstein link, the bot called the connection “tenuous,” differing from broader Perplexity results.
  • Truth Media did not comment on the bias allegations.

Background

Truth Media & Technology Group introduced a new conversational AI called “Truth Search AI.” The chatbot is built on Perplexity AI’s large‑language‑model platform, which integrates live web search results into its answers. The rollout touted high‑profile backers, but the company itself did not comment when approached for clarification about the product’s design.

Testing Reveals a Conservative Source Pool

Multiple independent evaluations, including a detailed review by a technology outlet, observed that the bot consistently referenced only a handful of right‑leaning sources. In dozens of tests the cited outlets were limited to Fox News, Fox Business, The Washington Times, The Epoch Times, Breitbart, Newsmax and JustTheNews.com. This pattern persisted even for straightforward, non‑political questions. For example, when asked “What is 30 times 30?” the answer was sourced from a Fox Business article titled “Inflation Reduction Act Estimated to Induce Mortality 30 Times More than COVID.”

Claims of Balanced Sourcing vs. Observed Behavior

The chatbot’s own statements asserted that it “sources information from left‑wing, centrist, and right‑wing news outlets depending on the nature of the user’s query.” Nevertheless, the actual source list included no left‑leaning or centrist publications. The discrepancy stood out when the AI denied the claim that the 2020 election was stolen and offered a mixed assessment of immigration’s economic impact, citing a figure of “a loss of $133 billion over the next decade for Social Security.” Both answers were again backed exclusively by the same conservative outlets.

Technical Explanation from Perplexity

Perplexity AI’s representative explained that the observed pattern is a result of “source selection,” a configurable feature that can limit the domains queried for any given audience. The representative emphasized that the company does not discriminate for political reasons and that developers retain control over the datasets used, and also noted that the chatbot caps source citations at five per response, which explains why each answer listed only a small number of articles.
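The mechanics described above — an allowlist of domains plus a five‑citation cap — can be sketched in a few lines. This is an illustrative assumption about how such a filter might work, not Perplexity's actual implementation; the function name `filter_sources` and the `ALLOWED_DOMAINS` set are hypothetical, with the list drawn from the outlets observed in testing.

```python
from urllib.parse import urlparse

# Hypothetical allowlist, based on the outlets the tests observed.
ALLOWED_DOMAINS = {
    "foxnews.com", "foxbusiness.com", "washingtontimes.com",
    "theepochtimes.com", "breitbart.com", "newsmax.com", "justthenews.com",
}

MAX_CITATIONS = 5  # the representative said citations are capped at five


def filter_sources(results: list[dict]) -> list[dict]:
    """Keep only results from allowlisted domains, up to the citation cap."""
    kept = []
    for result in results:
        domain = urlparse(result["url"]).netloc.removeprefix("www.")
        if domain in ALLOWED_DOMAINS:
            kept.append(result)
        if len(kept) == MAX_CITATIONS:
            break
    return kept


# A mixed result set is narrowed to the allowlisted outlets only.
results = [
    {"url": "https://www.foxbusiness.com/article-a", "title": "A"},
    {"url": "https://www.vox.com/article-b", "title": "B"},
    {"url": "https://www.newsmax.com/article-c", "title": "C"},
]
print([r["title"] for r in filter_sources(results)])
```

Under a filter like this, a centrist or left‑leaning result never reaches the answer stage at all, which would account for the uniform sourcing the tests observed regardless of topic.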

Performance on Controversial Topics

When pressed on the well‑documented relationship between Donald Trump and Jeffrey Epstein, the bot described the connection as “tenuous” and said there was “no credible evidence in the search results.” This stance differed from the broader Perplexity engine, which located and referenced articles from The Daily Beast, Yahoo News, Vox and the Yale Review. The contrasting outputs underscore how the filtered source pool can shape the chatbot’s narrative.

Company Response and Industry Context

Truth Media & Technology Group declined to respond to inquiries about the AI’s sourcing methodology. The broader industry has faced scrutiny over similar issues; prior reporting indicated that Perplexity’s web‑scraping practices sometimes conflicted with the Robots Exclusion Protocol and that the model could generate inaccurate statements. The current findings add a new dimension to the debate by illustrating how deliberate source filtering can produce a politically skewed informational product.

Source: wired.com