On May 6, Facebook announced a 20-member independent oversight board that can overrule the social media giant’s decisions on content moderation, a long-contentious issue. Only one Indian, Sudhir Krishnaswamy, is part of what is being called Facebook’s Supreme Court. The vice-chancellor of the Bengaluru-based National Law School of India University told Moneycontrol’s Pratik Bhakta in a phone interview that the oversight board’s decisions are binding on the Menlo Park-headquartered technology giant. The creation of the board, he said, is a great lesson in private regulation and will balance free speech with individual dignity, arguably the biggest challenge of the social-media age. The social media network is looking to enlarge the domain of freedom of speech but also wants to preserve dignity and equality, according to Krishnaswamy. Edited excerpts:
Q) How did this assignment come about? What kind of discussions did you have before agreeing to be a part of the board?
In the US, the process started way back in 2018, when an independent trust was created and four co-chairs were appointed. These four co-chairs ran the entire process independently of Facebook. I don’t know how they decided on the members, but no one applied. They had their own ways of searching for and sifting through people, and I am sure they spoke to many of us before announcing it. At my level, I have been engaged with the board for about three months.
Q) How will the board function? Do you have to set aside specific hours for it?
This is a very part-time association. The decision-making process will be panel-based. When you are made part of a panel, you will participate. There are 20 members and we are not going to sit together; mostly it will be a few hours a week or something like that.
Once the panel takes a decision, it will be confirmed by the entire board. These will not be individual decisions.
Q) Will you be taking cases related to India or South Asia?
I will not be taking cases about India only, or about any specific geography. We will all be sitting on decisions from across the world. Technically, I am not representing India. At any point in time, the panel can call on other experts for help. These can be linguistic experts, cultural experts, whoever the panel thinks is useful to that case.
Q) Will Facebook be involved in helping conduct meetings, etc.?
We are an independent oversight board, Facebook is not involved. My appointment was also handled by the board. To the extent of managing content on Facebook and Instagram, we will coordinate with them to get those files but they will not be part of the decision-making process.
What is important here is that I don’t report to anybody at Facebook and I don’t engage with anybody there. Resources have been placed in an independent and an irrevocable trust.
Q) How do you balance freedom of expression and defamatory statements on social media? We have seen multiple cases where political leaders have got people arrested for cartoons or even jokes.
Firstly, jurisdictional laws and rules will continue to apply. The way Facebook works with local judicial authorities will continue. We will not get involved there.
Also, on one point we are clear: we want to enlarge and expand the domain of freedom of speech, but at the same time we also want to preserve the dignity of the individual and equality on the platform, and we do not want abusive language or the denigration of people on the internet.
So, there are two ends of the spectrum: dignity and freedom of speech. Since we will be taking up complicated cases, we have to carefully calibrate how these interests add up. That is going to be the tough part of the job. We cannot articulate this in advance. We have to do this on a case-by-case basis.
Q) How do you look at political advertisements on Facebook? Do you think they should also go through a fact-check filter?
This issue around political interference on Facebook is an old problem and, in fact, valid for all internet platforms. Facebook has faced problems of this nature in the past and, in some cases, has not been able to respond in a way that satisfied the entire community. That is exactly why the board has been constituted.
But is there any magic wand? No. But at least we can say this is much better than what existed before. Previously, some decisions were taken opaquely and unilaterally by the company. Now there is a process through which you can raise a grievance and challenge a decision, and you will get a fair, well-articulated and well-argued conclusion to the question. Explaining ourselves in this way makes us accountable and makes Facebook accountable.
Q) Do you see your decisions clashing with the internal decisions of Facebook?
There is no guarantee that the board’s decisions will be in line with those of Facebook. If you call that a clash, then yes, there is good potential for a clash. The answer to that clash is that the board’s mandate is such that our decisions override their internal decisions.
As oversight board members and as a group, we are not experts in business, but that is not our job. Our mandate is to give decisions on content moderation and on content policy. What effects those have on the business is for other people to worry about.
Q) Do you see such an initiative becoming relevant for broader private regulation in India and the world?
Can this become a cross-platform initiative? I think if we succeed, it will clearly show the path. It is an innovative effort and the first of its kind. Now it depends on how we do our job and how the community, the platforms and even the government, for that matter, relate to it. The way the body has been formed, I think it is already a great lesson in private regulation, and even in state regulation. These are lessons that can go well beyond Facebook.
Q) How will cases come in? Will there be ground rules for appealing to the board?
Potentially, there are three channels through which cases can come in. The first is Facebook and Instagram themselves: if they are unsure about a particular case, they can refer it to us. Second, users may complain if they are not happy with what Facebook has done. Third, the board can take up some cases itself.
Right now, the third option is not clearly set up. The first and second options have already been put out by Facebook. Who can approach the board, how they can approach it and what types of cases will be taken up will be contained in a separate public document. That document is not ready yet but will come in the near future.
The body has just been formed and, given the current scenario, no one has met anybody, but it is remarkable that we have come this far. Now the actual work starts.
Q) The recent “Bois Locker Room” chatroom on Instagram has triggered outrage and concerns over the abuse of social media. While police have taken action, what are the ethical dilemmas here and what is the best way to handle such a situation?
Just to make it clear, we will express our opinions on specific cases and not on generic matters. The decisions will be clearly reasoned, translated into several languages and made public. They will be binding on Facebook.
It might come down to the interpretation of a specific content moderation policy or a specific challenging case. With respect to the current issue around Bois Locker Room, Instagram has already responded, so has the government, and the process is underway. So I do not think this is that big an issue from an interpretation perspective. As for social issues, as individuals we will continue to learn and respond to such incidents in better ways, but that will not be the job of the board.
Q) Facebook has a content moderation policy. Will the board make changes to it?
When we start, we will continue to work with the policy that Facebook has. Also, we have publicly stated that we are committed to international human rights. These are our two normative commitments. Policies change in two major ways: first, they change by interpretation, and second, they evolve.
We will publish an annual report about the kinds of cases we have taken up and Facebook’s level of compliance. This will be a public report. Now in that way policy will also evolve.
For instance, if we observe the application of certain tenets of the policy is not happening in the right way consistently over a few cases, then the policy will change. It is not part of our direct responsibility to rewrite their policy but over a period of time, there will be some effects on the policy as well.
Q) Facebook founder Mark Zuckerberg has spoken about AI (artificial intelligence) and its role in moderating content on Facebook. Where does the board stand on it?
There are already two layers of content moderation. The first is the AI layer, where technology is used to moderate content. The second is the human-moderation layer. I think Facebook already has a strong human-moderation setup. We are the third layer, to which things come when either party is not satisfied, but whatever we decide is binding on both the previous layers of moderation.
