Breaking: Facebook Admits to Having Censorship Blacklist (Full Tommy Robinson Translated Video)

  • Posted by Amy Mek
  • On October 2, 2019
  • Aia Fog, Ben Shapiro, Butcher of Bosnia, Deadline, Denmark, Ekstra Bladet, Free Press Society, Hate Speech, Jyllands Posten, Lars Boje Mathiesen, Leif Donbeak, Lotte Folke Kaarsholm, Nye Borgerlige, Peter Andreas Münster, Radikale Venstre, Ratko Mladic, Saul Alinsky, Yugoslavia

This past Sunday, RAIR Foundation USA released a short video segment featuring Peter Andreas Münster, Facebook’s Head of Communications for the Nordic Region, who admitted that Facebook is censoring virtually everything connected to UK activist and journalist Tommy Robinson. His stunning admission to interviewer Lotte Folke Kaarsholm of Deadline (one of Denmark’s most popular news programs) was that Facebook only allows slanderous comments about Tommy Robinson. The article also discussed how the Facebook representative compared Robinson to convicted war criminal Ratko Mladic, who is nicknamed the “butcher of Bosnia.”

RAIR Foundation USA is now exclusively bringing you the full translated 19-minute video interview, which reveals that Tommy Robinson and his supporters are not the only ones being banned from Facebook in the Nordic Region. Evidently, there is a blacklist, which is kept hidden from the public. Another revealing aspect of this extended interview is that Facebook is seeking to be regulated. RAIR discusses why such regulation would be a horrific idea and would result in a complete Facebook monopoly.

The Blacklist

The Facebook representative makes it clear that those who support “hate preacher” Tommy Robinson will be banned from the platform. Those who speak derisively of the British firebrand, however, are allowed to stay on Facebook, he said. Further explaining that Facebook has an elusive “list” of other figures whom people are not allowed to admire, Münster says bluntly: “…we don’t allow other people to praise or support them.” Münster deflects when asked who specifically is on this list, saying…

Well… ehhh… As I said this concrete list is not public. That list of those who, besides Tommy Robinson, are completely banned. That is a list of those persons where, in addition to our usual rules about hate speech, we don’t allow other people to praise or support them.

The interviewer asked if one could see the list and also wanted to know how many people were on the list. Here is the exchange:

Interviewer: Can one see this list?

Münster: No, you cannot.

Interviewer: How long is it (the list)?

Münster: I don’t know, I actually haven’t seen it myself.

Interviewer: Is Hitler on there? Is Pol Pot on the list?

Münster: I don’t know. It is, as I said, not my role to be a part of all details around these decisions.

Interviewer: Who decides who gets their name on the list?

Münster: That would be the team that makes our policies. That is some of Facebook’s employees. —Yes. And that is a task they perform in cooperation with external experts when they need to.

Interviewer: What is it that qualifies them in order to make these… very final decisions?

Münster: Well, we hire people with a relevant background. It could be that they are lawyers or something else appropriate.

Clearly, such a list would be ripe for abuse. That Facebook compares a man who has literally never called for or advocated violence in any way, Tommy Robinson, to a convicted war criminal is a clear indicator of the political nature of the platform’s decisions. Consider that when pressed, the Facebook representative could not give any concrete example of why they made the decision to deplatform Tommy Robinson:

Interviewer: Where can I, as a journalist, find documentation for your allegations against Tommy Robinson?

Münster: Yeah, well there isn’t one particular post where we can say that this is THE post. Well, this is, as I said, of course a drastic step, but we made the decision based on a number of different signals. I mean, what are people doing when they are not on Facebook? What are they doing in all sorts of other places? What sort of demonstrations they are helping to arrange, and who are the key persons in that environment, and what do they stand for? So it is… it is nuanced; it is a serious evaluation made on a case-by-case basis.

Interviewer: And where can I, as a journalist, check the documentation for your allegations?

Münster: Well, you can’t… yet.

Münster also states that because a “universal” definition of “hate speech” does not exist, Facebook seeks guidance in the form of legislation. “One of those things we have been promoting,” he said, “is that we have a regulation in place; [we] are actually out there asking to be regulated in this area, in the area of illegal content.”

Begging to be Regulated

Commentator Ben Shapiro discusses Facebook’s desire for regulation in his podcast dated April 5, 2019. He notes that this effort is really about two different issues. First, regulations will empower Facebook to smash all competitors. If regulations dictated social media content, there would be no opportunity for competitors to present a “free” social media platform. Facebook, which already enjoys near-monopoly status, would be free to easily crush any competitor that might rise up as a free-speech alternative.

Second, the left in particular constantly slams social media platforms for allowing “hate speech” (i.e., speech that contradicts a left narrative), so regulation would give Facebook and other media outlets a clear path to proceed without having to figure it out themselves. “Facebook is now begging the government to censor it. I’m not kidding you. Facebook wants censorship,” Shapiro said.

Mark Zuckerberg was quoted as saying:

“There’s a question of what decisions should be left to a private company to make, especially around things like speech and expression for so many people around the world, and where should we have either industry or more government regulation…It’s not clear to me that we want a private company making that kind of fundamental decision [which speech to allow]…”

As Shapiro points out, Facebook’s desire for government intervention has less to do with its dilemma in deciding what speech should be allowed, and more to do with its bottom line.

In the terrifying closing moments of the interview, Münster suggests that Robinson and other dissidents banished to the cyber gulag could potentially be reinstated on Facebook if they essentially repent and reform their views to the correct Leftist ones.

Call to Action: Write Facebook CEO Mark Zuckerberg at zuck@fb.com and demand Facebook:

  1. Release the full list of those individuals and organizations who are currently banned from using the platform;
  2. Release the names and credentials of those individuals comprising the “review team” for decisions to ban specific users; and
  3. Detail the criteria used in decisions as to whether to ban a user from the platform.

We must demand transparency and accountability.

Full Transcript: many thanks to Tonia Groth for the translation!

First Facebook closed down the radical right-wing activist Tommy Robinson’s profile. Then Facebook started deleting profiles that supported Tommy Robinson. And now Facebook is starting to delete posts that merely mention Tommy Robinson’s name.

Is that a way to deal with our common communication? This is what I, in a moment, will be asking Facebook’s Nordic Communication Chief Peter Münster.

Facebook has removed me. I had 1.2 million… I was the most interacted political Facebook page in Britain. I had a reach and an interaction that politicians could only dream of. If I went live, I would have 20,000 people watching me instantly, that would then spread to millions by the next day.

When the radical right-wing activist Tommy Robinson railed against immigration and Islam, his message once had a wide reach on Facebook. That is no longer the case. In February he was banned from Facebook and his page was closed. And not only that, lately his banning has had other consequences.

Lately a number of Danish Facebook users have found that their Facebook posts have been removed, apparently because they have merely mentioned Tommy Robinson. And that has resulted in intense condemnation.

Is our freedom of speech being destroyed? And why are the authorities not reacting, writes New Conservatives’ Lars Boje Mathiesen in his blog in Jyllands Posten.

The same question is asked by the legal practitioner and member of SF, Leif Donbaek, in Ekstra Bladet [newspaper]: whether a privately owned American company should unilaterally be dictating how we handle public debate.

While he, by the way, at the same time calls Robinson a giant idiot.

After sharing his blog on Facebook he too has been penalized, and can neither post nor comment on Facebook for the next 24 hours. And on their web page the Free Press Society writes that Facebook has threatened to close their Facebook page, and just this evening they inform us that they have been banned from posting on their page.

Furthermore, Chairwoman Aia Fog says that she has been blocked for thirty days for writing about Tommy Robinson.

According to Aia Fog the ban doesn’t come directly from Facebook itself, but “from strong political forces that more or less put pressure on social media.”

That explanation sounds very much like Tommy Robinson’s own

That’s why I said it’s good to be seen! Because I have become completely unpersoned from all social media. Even Snapchat, Instagram… and the maddest thing is, I’ve done nothing wrong! They run with these lies, they keep saying that I promote violence against Muslims. Show me where… show me ONE screen-shot that I’ve done.

Peter Münster, Nordic Communications Chief for Facebook. The guy we just saw here, Tommy Robinson, is he so dangerous that merely mentioning his name makes the world a dangerous place?

Well, for starters, the problem isn’t so much mentioning his name, the problem is that he falls under a category that we consider to be hate preacher, that is, people who use their status in the public sphere to organize hate or to encourage violence against a minority.

And we don’t permit that. We don’t permit the sharing of hate speech or racist attacks, but we don’t in general permit people who predominantly have this as a major part of their public work, public profile, to encourage people to something similar, so… it IS a tough sanction.

That is a tough sanction against him, he’s cut off, but what is happening now is that people who just mention Tommy Robinson have their posts removed, get blocked, or threatened to be removed from Facebook. How does that work? 

Well… our rules don’t explicitly forbid talking about Tommy Robinson. You are allowed to write that you don’t like him, or that he’s an idiot as was mentioned in the intro, …and then he was blocked. Yes, and that’s another question, but for starters, what you aren’t allowed to do according to our rules is show support, or positively promote, or in other ways give representation to these hate preachers.

What does that mean, to give representation? —Well, that means that if we have a policy that a man like Tommy Robinson is not allowed to have a profile or a page, then of course it doesn’t work if he has a following who are on Facebook who can say “I have spoken with Tommy Robinson today, and he asked me to say to everybody such and such…” (OK) We also have a policy for that. I would like to dig a little deeper into what you said was another question. I would like to show you a post our editor made earlier today. He wrote on Facebook…

Tonight Deadline will do a segment about Tommy Robinson. What is the best article about Tommy Robinson we should read? After only nine minutes the post had been removed. Does it further understanding about the phenomenon Tommy Robinson, and the possibility of discussing the way of thinking that he represents, when a post like this is removed? No. I think it’s OK to wonder why a post such as this has been removed.

Clearly when you have a platform with 2.3 billion people who share lots of things every single day, then it is a big job to ensure that we not only enforce our rules, but that these rules are also correctly enforced. We go to great lengths to be as good as possible and to be as precise as possible, but sometimes we make mistakes, and with so much content it means that even a very small margin of error can mean a very large number of mistakes. But it looks rather systematic.

I mentioned earlier a couple of other cases where there was critical coverage of him. It doesn’t look like just a few mistakes. It looks like all mention of Tommy Robinson is systematically deleted. It shouldn’t, according to our rules. It shouldn’t be the case that if you problematize a person of his type that it gets removed. One of the things we have done to ensure that this type of mistake doesn’t become too prevalent is that it can be redressed by allowing people to appeal our decisions. And of course that is something that we keep an eye on.

So if there is an instance where we see that there are a large number of appeals, then we go in and look at it again, and if we can see that there are a large number of mistakes, then we have some work to do to make our policies more clear-cut for our employees: what crosses the line and what is within those lines. So when you see this, are you thinking this is a mistake where you are going too far?

Well I can’t make that decision. It is not my role to make these very detailed decisions regarding our platform. OK. What is it… we return to Tommy Robinson… you said yourself that you are subjecting him to drastic methods. Is it that you have been subjected to political pressure, that both he and the Free Press Society say is being used to persecute him? Why on earth do something so drastic as to remove a whole person instead of just problematic posts?

Yeah. Well, it’s not a matter of political pressure with regard to Tommy Robinson. But… but… one has to have lived under a very large rock the last couple of years not to have seen that we have had some challenging years. We have had some years where there has been a big focus on our not having taken sufficient responsibility, not only the technology, but the way people use the technology, and there has been a lag… (What, for example?) Well, there has been a problem with misinformation regarding political elections, which is something we have taken action on.

—But but but… this is just another example of… where we earlier have seen ourselves as custodians of a technological platform that one could use as one wished, and that as long as the technology worked, we could just go home and be happy about that, where that is another thing that is occurring, we have to have another and broader vision of our responsibility for the platform.

That also includes what people are using the platform for. And with that said, then the difficult compromise-filled decisions begin to crop up, because if we  start with a principle — and that is where all of our rules start; that’s the whole idea behind the platform — that is, we give people the ability to say what they think, without first having to ask permission, and regardless of any political opinions they might have, and that is where we start. But then we also have a responsibility to ensure that the platform is safe, and that the environment is not one where minorities can be persecuted, and especially that it doesn’t cause people to get attacked on the street, and that is a conflict, and we can’t get around that.

And that is what you think that Tommy Robinson has caused? —Yes, Tommy Robinson, as far as we are concerned, belongs in the category “hate preacher”. Well, we see him as a person belonging to the same category as Ratko Mladic. Well, well people like him, who is a convicted war criminal, he was actually convicted of genocide. He is… Tommy Robinson is, despite everything, not! No, that doesn’t mean either that everything is similar, but but… it’s people who with their public actions are a part of organizing hatred against minorities, and who encourage violence.

And, according to The Guardian, it is because Tommy Robinson, on Facebook, has encouraged the decapitation of Muslims. He says he hasn’t done anything. Where can I, as a journalist, find documentation for your allegations against Tommy Robinson? Yeah, well there isn’t one particular post where we can say that this is THE post. Well, this is, as I said, of course a drastic step, but we made the decision based on a number of different signals. I mean, what are people doing when they are not on Facebook? What are they doing in all sorts of other places? What sort of demonstrations they are helping to arrange, and who are the key persons in that environment, and what do they stand for? So it is… it is nuanced; it is a serious evaluation made on a case-by-case basis. And where can I, as a journalist, check the documentation for your allegations?

Well, you can’t… yet. And it is not because it is something I can sit here and promise you that you will be able to at some particular point in time. But but if you look at what happened with us in recent years — we’ve become much more transparent. We have become more open about both how our rules have been created and how they look and why they are the way they are. We have had journalists present at our meetings where these things have been decided. And we are also in the process of implementing some measures that will — how shall I put it — ensure greater legitimacy around these decisions. Well, in this case it is a… in this case it is a question of jurisprudence, to the extent that you are the common infrastructure for our common communication. So where can either those who have been banned, or journalists who have to hold those in power accountable, where can we go investigate in these situations?

Well, we aren’t THE platform for the public discourse. We are one platform that supplements the public discourse that has been going on for many years. But for a quarter of 16- to 24-year-olds, you, the social media in any case, are the ONLY platform where they get their news.
Yes. And that is a big responsibility. That is what could be called THE platform. And that is something we take very seriously. —So how is it possible to investigate you? Well… ehhh… As I said this concrete list is not public. That list of those who, besides Tommy Robinson, are completely banned. That is a list of those persons where, in addition to our usual rules about hate speech, we don’t allow other people to praise or support them.

Can one see this list? —No, you cannot. How long is it (the list)?

—I don’t know, I actually haven’t seen it myself. Is Hitler on there? Is Pol Pot on the list? —I don’t know. It is, as I said, not my role to be a part of all details around these decisions. Who decides who gets their name on the list? That would be the team that makes our policies. That is some of Facebook’s employees. —Yes. And that is a task they perform in cooperation with external experts when they need to. What is it that qualifies them in order to make these… very final decisions? Well, we hire people with a relevant background. It could be that they are lawyers or something else appropriate.

But besides that, the question about how they have become qualified is also, in some way, a reflection upon how we are permitted to make these rules. We have gotten that because it was… we have a responsibility for that platform. And somebody has to make these rules. And until such time as we arrive at a situation where there is a political definition, and perhaps even a democratically legitimized definition of what hate speech is, and what is illegal content on the Internet, then we have to make those rules ourselves. Definitions of illegal content do exist, as do definitions of illegal encouragements to violence, for example. But you go much MUCH farther than what is illegal, in your decisions.

There are definitions, yes, there are in fact many, and that is one of the challenges; there isn’t just one. You speak on behalf of a company, and I ask you questions one would ask of a political and society-supporting institution, and that is why one might feel that we may be misunderstanding each other. But can you understand that that is how you are perceived, when for certain age groups the primary access to the news… if we talk about our youth, there are a lot of young people who maybe know he is a well-known figure, that he is highlighting something that he and others in society perceive as being a problem, and then he disappears!

Should we not find it worrisome that this younger generation might lose confidence in institutions such as your own, as they could lose confidence in being able to find truth and reality, if people can just disappear without their understanding why? Well, I think that would be a relevant discussion to have. And I think this is the essence of why these are some complex considerations to be made. Because we can’t get around that we, ultimately, have a conflict between consideration for freedom of speech on the one side, and consideration for people’s safety in contrast with that. And you can’t wrap that up in some nice and pleasant way, that is the brutal truth.

It is difficult to make these decisions — every time. And that is something we feel a big responsibility to do properly. So that discussion that you think would be a relevant discussion, but which you also say would be a brutal discussion. How can we, in some way, democratically ensure that we also gain access to that discussion and are heard, and that it’s not just employees of Facebook who decide who is allowed to participate? Well, this is something that we have discussed at length. How can we have another forum for legitimacy behind the rules that we have, now that we are the ones who have made those rules because it is our platform?

One of those things we have been promoting is that we have a regulation in place, [we] are actually out there asking to be regulated in this area, in the area of illegal content. When there aren’t any definitions, universal democratic definitions of, for example, hate speech, then we have to make those ourselves, because it is our responsibility to ensure the platform doesn’t get used for things it shouldn’t be used for.

— But clearly it would have been better if there had been a political process, and there had been democratic legitimacy, and we end up in a place where the rules that are there are openly legitimate because they are regulated politically instead of just coming from some American company. But isn’t that setting the standards for yourself so high that the world never would be able to live up to it, to say that there should be a global political consensus on what we all accept hate speech is, before the rest of the democratic political society in some way has to be drawn into the discussion about who should be removed from public discourse?

Well, I don’t know if the bar is high, because that IS where the bar is today. Facebook offers a global public arena in which to have a global conversation, and that means that the rules that exist are global, so the bar sits where it sits. And here at the end Peter Münster, Tommy Robinson: will he ever be able to do anything to get a voice on Facebook again? Well, yes, there are lots of examples of people who leave these very violent and hateful environments and simply change their lives, and of course our rules aren’t set in stone, where we stubbornly hold onto a many-years-old evaluation, if we have concrete evidence/documentation that this is a closed chapter, then of course that would be something we would have to re-evaluate.

PLEASE HELP RAIR FOUNDATION USA EXPOSE THE ENEMIES OF THE WEST! REPORT SUBVERSIVE ACTIVITY IN YOUR COMMUNITY BEING IGNORED BY THE ESTABLISHMENT! EMAIL INFO@RAIRFOUNDATION.COM! BECOME A MEMBER, DONATE, AND TELL YOUR FRIENDS!
