YouTube was taken to task this week by members of the United Kingdom’s parliament, after it was discovered that four neo-Nazi propaganda videos remained on the site despite orders from the lawmakers to take them down, the Independent reports.

In a 45-minute hearing led by the UK Parliament Home Affairs Committee, lawmakers accused both YouTube and Google of providing a “platform for extremism.”

“You are continuing to host illegal organizations, you are continuing to collude with these organizations by providing a platform for their extremism and your algorithms are promoting radicalization by promoting extreme organizations,” committee chair Yvette Cooper told William McCants, Alphabet's representative.

McCants, who is the global counter-terrorism lead for both Google and YouTube, apologized for the mishap, saying he felt “personal frustration” over the issue and would make it his “personal mission” to fix it.

McCants attributed the failure to a team of reviewers, arguing that they were better trained to flag radical Islamic content than alt-right content. He said that National Action videos will now be sent to a special team of reviewers, that general reviewers will receive extra training and that the company's recognition technology will be “fine-tuned.”

However, lawmakers were unimpressed that McCants couldn’t answer basic questions about the reviewers, such as where they were based or whether they were YouTube employees. Cooper called him out on it, saying it was “frankly shocking you seem to know so little about who they are,” and that she was “extremely disappointed” by the evidence presented.

“The failing is not simply an accident,” Cooper added. “The evidence you have given us today is so weak it looks like a failure to do even the basic checks along the way.”

Cooper also expressed disbelief that this was the first time Google and YouTube had heard concerns about this type of content.

“[I] do not believe this is the first time you have heard this – allegations and concerns that your algorithms are promoting more and more extreme content at people.”

She added that the company has a responsibility to ensure hate speech is not spread via its platforms: “Whatever they search for, what they get back is a whole load more extreme recommendations coming through the algorithms. You are the king of the search engine and yet your search engines are promoting things that further and further radicalize people.”