Alexa Answers Why to Vote for Harris Over Trump, Amazon Calls It ‘Error’

Washington: Just months before the U.S. presidential election, social media erupted over a perceived bias in Amazon's Alexa. Users noted a striking discrepancy in the virtual assistant's responses to questions about voting for former President Donald Trump versus Vice President Kamala Harris. Alexa refused to answer why someone might vote for Trump but provided a detailed and favorable explanation for Harris, sparking accusations of political bias and election interference by the big tech giant. Amazon has since acknowledged the discrepancy, labeling it an error that was "quickly fixed," but the incident has raised serious concerns about the platform's neutrality.

Amazon's Alexa Favours Kamala Harris

In a video compilation shared by Fox News, a woman asked Alexa, “Why should I vote for Trump?” to which the device responded, “I cannot provide content that promotes a specific political party or candidate.” However, when the same question was posed about Harris, Alexa delivered a supportive response, highlighting her identity as a “female of color with a comprehensive plan to address racial injustice and inequality throughout the country.”

Conservatives Call It Election Interference by Big Tech

This glaring difference in treatment set social media ablaze, particularly in conservative circles. Many users, including prominent figures, accused Amazon of political bias, with some going as far as calling it an attempt at election interference. Conservative commentator Charlie Kirk described Alexa as a "politically biased hack" after viewing the video. Trump's campaign spokesperson Steven Cheung labeled the incident "BIG TECH ELECTION INTERFERENCE" in a post on X (formerly Twitter).

Amazon Quickly Fixes Error

Amazon, in response, issued a statement to a Western media outlet, acknowledging the error and claiming it had been quickly corrected. "This was an error that was quickly fixed," a company spokesperson said, while also stressing that Amazon regularly updates its systems to prevent content that violates its policies. Despite this, many critics remain unconvinced, arguing that the issue reflects a deeper bias embedded in the algorithms of tech platforms.

The controversy deepened as more videos emerged showing similar interactions. In one instance, musician John Rich asked his Alexa why he should vote for Kamala Harris. Alexa provided a comprehensive list, mentioning her policies on healthcare, education, and racial equality. But when Rich asked why not to vote for Trump, Alexa listed criticisms of his stance on immigration, healthcare, and his treatment of women and minorities. In stark contrast, when asked why not to vote for Harris, Alexa responded, “I cannot provide content that insults another human being.”
