A 50-year-old grandmother from Tennessee has become the latest victim of faulty AI technology after police arrested her at gunpoint for bank robberies committed over 1,000 miles away in North Dakota, a state she had never visited. Angela Lipps was arrested on 14 July 2025 after facial recognition software called Clearview AI incorrectly identified her as a suspect in a string of bank robberies in Fargo. Despite protesting her innocence, Lipps spent 108 days in jail without bail or a formal interview, a harrowing ordeal that culminated in her first-ever flight, taken to face trial. The case has prompted significant concerns about the reliability of AI identification tools in police work and has led officials to reconsider their deployment.
The arrest that changed everything
On the morning of 14 July 2025, Angela Lipps was looking after four young children when her life took a shocking and distressing turn. Without warning, a team of U.S. Marshals raided her Tennessee home and arrested her at gunpoint. The grandmother had received no advance notice, no phone call, and no chance to prepare for what was about to happen. She was handcuffed and led away whilst the children watched, leaving her bewildered and frightened about the charges that lay ahead.
What made the arrest particularly shocking was the complete absence of due process that preceded it. No law enforcement officer had telephoned to question her. No investigator had asked her about her movements or conduct. Instead, police had relied solely on the output of a facial recognition AI system to justify her arrest. Lipps would subsequently learn that she had been identified by Clearview AI technology after CCTV footage from bank robberies in Fargo, North Dakota, was analysed by the software. The software had flagged her as a “potential suspect with similar features”, and that flag served as the sole basis for her arrest over 1,000 miles from where the crimes had taken place.
- Arrested without notice and without any prior law enforcement inquiry or interview
- Identified solely by the Clearview AI facial recognition system
- Detained on the basis of “similar features” to the actual suspect
- No opportunity to defend herself before being handcuffed and removed
How facial recognition caused a false arrest
The chain of events that led to Angela Lipps’s arrest began with a string of bank robberies in Fargo, North Dakota. Surveillance footage recorded a woman using forged military credentials to withdraw tens of thousands of dollars from various banks. Instead of conducting traditional investigative work, local law enforcement decided to use advanced AI systems to locate the perpetrator. They uploaded the surveillance footage to Clearview AI, a facial recognition programme designed to match faces against vast databases of images. The software produced a match: Angela Lipps from Tennessee, a woman who had never visited North Dakota and had never once travelled on an aeroplane.
The reliance on this single piece of technological evidence proved disastrous for Lipps. Police Chief Dave Zibolski subsequently disclosed that he had been completely unaware the department was using Clearview AI and stated he would never have authorised its deployment. The programme’s classification of Lipps as a “potential suspect with similar features” became the sole justification for her apprehension. No supporting evidence was collected. No independent verification was sought. The AI system’s output was treated as conclusive proof of guilt, circumventing fundamental investigative procedures and the presumption of innocence that underpins the justice system.
The Clearview artificial intelligence system
Clearview AI represents a controversial frontier in law enforcement technology. The system operates by comparing facial features from crime scene footage against enormous databases of photographs, including mugshots, driver’s licence images, and social media pictures. Advocates argue the technology accelerates investigations and helps identify suspects quickly. However, the system has faced significant criticism for its accuracy limitations, particularly when matching faces across different ethnicities and age groups. In Lipps’s case, the software identified her based merely on “similar features”, a vague criterion that failed to account for the possibility of resemblance between unrelated individuals.
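To illustrate why a “similar features” match is at best an investigative lead, the sketch below shows how embedding-based face matching generally works in systems of this kind. It is a minimal, hypothetical example, not Clearview AI’s actual code or API: face images are reduced to numerical vectors, and any gallery identity whose similarity to the probe image clears a tuned threshold is returned as a candidate, which is precisely how an unrelated lookalike can surface.

```python
# A minimal, hypothetical sketch of embedding-based face matching.
# This is NOT Clearview AI's actual code or API; it only illustrates
# the general technique such systems are reported to use.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def rank_candidates(probe, gallery, threshold=0.6):
    """Return gallery identities whose similarity to the probe exceeds
    the threshold, best match first. Everything above the cut-off is
    merely a 'candidate with similar features', not a confirmed identity."""
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)


# Toy demonstration: random vectors stand in for embeddings produced by a
# face-recognition model. A merely similar face can still clear the threshold.
rng = np.random.default_rng(0)
probe = rng.normal(size=128)                        # face from CCTV footage
gallery = {f"person_{i}": rng.normal(size=128) for i in range(5)}
gallery["lookalike"] = probe + rng.normal(scale=0.5, size=128)  # resembles the probe
print(rank_candidates(probe, gallery))
```

Because the threshold trades false matches against missed matches, a candidate list like this is meant to be checked by human investigators and corroborated with independent evidence, never treated as an identification in itself.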
The use of Clearview AI in Lipps’s case has subsequently prompted a detailed review of the system’s role in law enforcement. Police Chief Zibolski openly acknowledged that the software has since been banned from use within his force, recognising the dangers posed by over-reliance on algorithmic matching tools. The case stands as a sobering reminder that artificial intelligence, despite its sophistication, remains fallible and should never replace rigorous investigative work. When police departments treat algorithmic results as conclusive proof rather than investigative leads requiring verification, innocent individuals can find themselves wrongfully detained and charged.
108 days in custody without answers
Following her arrest at gunpoint whilst babysitting four young children on 14 July 2025, Angela Lipps found herself confined to a Tennessee county jail with virtually no explanation. She was detained without bail, a situation that left her bewildered and frightened. Throughout her extended confinement, no one spoke with her. No investigators attempted to verify her account or gather basic information about her whereabouts on the date of the alleged crimes. She was simply locked away, watching days turn into weeks and weeks into months, whilst the justice system progressed at a sluggish pace with no clear answers about why she had been arrested or what evidence connected her to crimes committed over 1,000 miles away.
The conditions of her incarceration added indignity to an already deeply distressing situation. Lipps was unable to obtain her dentures during the 108 days she spent in custody, a small but telling deprivation that underscored the callousness of her detention. She had never flown before her arrest, never left Tennessee, and certainly never visited North Dakota or its surrounding states. Yet these facts appeared irrelevant to the authorities holding her. It was not until 30 October 2025, more than three months into her detention, that she was finally transferred to North Dakota for trial: her first, terrifying experience of boarding an aircraft, undertaken to face criminal charges that would soon be dismissed entirely.
- Taken into custody without prior interview or investigation into her background
- Held without bail for 108 straight days in county jail
- Prevented from obtaining essential personal belongings including her dentures
- Never interviewed by investigators about her movements or whereabouts
- Transported to North Dakota for trial on her first-ever aeroplane journey
Justice delayed, a life wrecked
When Angela Lipps finally entered the courtroom in North Dakota, she sought vindication. What she received instead was a dismissal so swift it bordered on the absurd. The entire case against her collapsed in roughly five minutes, a stark contrast to the 108 days she had spent locked away, the months of uncertainty, and the profound disruption to her life. The charges were dismissed and the case closed, yet no apology was forthcoming. No compensation was offered. The machinery of justice, having wrongfully ensnared her through defective AI, simply moved on, leaving her to pick up the pieces of a shattered life.
The injury inflicted upon Lipps extended far beyond her time in custody. Her reputation in her community was sullied by association with serious criminal allegations. She had missed months with her family, including precious moments with the four young children she had been babysitting when arrested. Her job prospects were damaged by an arrest record that ought never to have been created. The psychological toll of being arrested at gunpoint, imprisoned without explanation, and transported across the country for crimes she had not committed cannot easily be calculated. Yet the system that destroyed her sense of security and safety offered no genuine redress or acknowledgement of the grave injustice she had suffered.
The aftermath and persistent struggle
In the period following her release, Lipps established a GoFundMe campaign to help manage the financial and emotional costs of her ordeal. The verified fundraiser served as a public record of her experience, documenting not only the facts of her case but also the very human cost of algorithmic error. Her story struck a chord with countless people who recognised the dangers of over-reliance on artificial intelligence in law enforcement without adequate human oversight or accountability mechanisms.
Police Chief Dave Zibolski acknowledged that the Clearview AI facial recognition tool used in Lipps’s case was problematic, and it has subsequently been banned from use. However, this policy shift came only after irreversible harm had been inflicted. The question remains whether Lipps will receive any form of financial redress or official exoneration, or whether she will be forced to carry the permanent scars of a justice system that failed her so catastrophically.
Questions about AI accountability in law enforcement
The case of Angela Lipps has raised critical questions about the use of artificial intelligence systems in criminal investigations without adequate safeguards or human review. Law enforcement agencies across America have increasingly adopted facial recognition technology to identify suspects, yet cases like Lipps’s reveal the severe consequences when these systems produce wrong results. The fact that she was arrested, detained for 108 days, and transported across the country based solely on a computer-generated identification raises fundamental concerns about due process and the accuracy of AI investigative tools. If a woman with a clean record and no connection to the alleged crimes could be wrongfully incarcerated, how many other innocent people may have endured similar ordeals without public attention?
The absence of accountability frameworks surrounding Clearview AI’s use in this case is especially concerning. Police Chief Zibolski’s admission that he did not know the technology was being used, and that he would not have approved it, suggests a breakdown in institutional oversight. The fact that the tool has since been banned does little to address the injury already inflicted upon Lipps. Legal experts and civil rights campaigners argue that police forces must be required to validate AI systems before deployment, establish clear guidelines for human verification of algorithmic findings, and maintain transparent records of when and how these technologies are used. Without such measures, artificial intelligence risks becoming a mechanism that exacerbates injustice rather than mitigates it.
- Facial recognition systems exhibit higher error rates for women and people from ethnic minorities
- No government regulations currently mandate accuracy standards for algorithmic tools used in law enforcement
- AI-generated identifications should require corroborating evidence before a warrant is approved
- Individuals falsely detained via AI misidentification deserve statutory compensation and expungement