Lamya Robinson was ready to enjoy a Saturday night with friends at her local skating rink, only to be turned away after facial recognition software misidentified her, Fox 2 Detroit reported.

"To me, it's basically racial profiling," Lamya’s mother, Juliea Robinson, said. "You're just saying every young Black, brown girl with glasses fits the profile and that's not right."

The Robinson family plans to take legal action against Riverside Arena in Livonia, Michigan, after the rink kicked Lamya out, believing she was involved in a fight that took place at the establishment back in March.

"I was like, 'That is not me. Who is that?'" the 14-year-old said. "I was so confused because I've never been there."

The skating rink issued a statement explaining that using facial recognition is part of its protocol, and that it ejected Lamya because her face was a 97% match to the suspect in question.

"One of our managers asked Ms. Robinson (Lamya's mother) to call back sometime during the week. He explained to her, this [sic] our usual process, as sometimes the line is quite long and it's a hard look into things when the system is running," a statement from the Livonia skating rink read.

"The software had her daughter at a 97 percent match. This is what we looked at, not the thumbnail photos Ms. Robinson took a picture of. If there was a mistake, we apologize for that," the statement continued.

Extensive research on facial recognition software has found significant discrepancies in how the algorithms identify people across demographics. According to the Journal of Biometrics and Biostatistics, Black women aged 18 to 30 consistently demonstrated the poorest accuracy rates.

"Facial recognition does not accurately recognize darker skin tones," Tawana Petty, national organizing director at Data for Black Lives, said. "So, I don't want to go to Walmart and be tackled by an officer or security guard, because they misidentified me for something I didn't do."

Data for Black Lives is one of 35 organizations calling on retailers such as Lowe's and Macy's to stop using facial recognition software on customers and employees.

In a case similar to Lamya’s, Robert Williams, a Black man whose false identification led to his arrest last year, testified before the U.S. House Judiciary Subcommittee on Crime, Terrorism and Homeland Security on Tuesday. 

"I just don't think it's right, that my picture was used in some type of lineup, and I never been in trouble," Williams said in his testimony.

In Williams' case, police were investigating the theft of five watches, worth an estimated $3,800, from a Shinola retail store in Detroit, and accused him of committing the crime.

"When I look at the picture of the guy, I just see a big Black guy. I don't see a resemblance. I don't think he looks like me at all," Williams said in an interview with NPR. 

A complaint filed by the American Civil Liberties Union last year stated that Williams had nothing in common with the suspect, aside from his physical attributes and skin color. 

"Face recognition technology can't tell Black people apart,” the ACLU complaint read, Newsweek reported. “That includes Robert Williams, whose only thing in common with the suspect caught by the watch shop's surveillance feed is that they are both large-framed Black men."