When Ramya*, a former Facebook content moderator in India, heard about the $52 million settlement paid to her counterparts in the US, she wondered naively, “How did they get it? Is there a way for us to get it too?”
Of course not. The settlement came on the back of a lawsuit filed in September 2018 by Facebook content moderators in the US, alleging that the social media major did not provide a safe workplace.
Online media platform The Verge, which brought the dire working conditions of the moderators to light, said in its investigative report that these employees suffered psychologically from viewing disturbing content for prolonged periods.
It led to post-traumatic stress disorder (PTSD) and anxiety. Some resorted to drugs to cope with work that was taking an emotional toll on them.
Speaking to Indian content moderators, one realises that their story is not very different.
Unfortunately for moderators like Ramya, no lawsuits have been filed here and the possibility of a settlement is remote. But working conditions for people like her were far from healthy, both physically and mentally.
In a blog, Facebook said that over 30,000 people work in its teams on safety and security, and that about half of them are content reviewers — a mix of full-time employees, contractors and staff at partner companies.
Ramya was on the payroll of Cognizant, a third-party firm that won a content review contract from Facebook. According to the former moderators Moneycontrol spoke to, there were close to 1,000 of them on Cognizant’s Hyderabad campus.
Last year, the company decided to exit the Facebook content moderation business, which employed close to 6,000 people.
Many of them, including Ramya, were laid off then. It is not clear where the business went. Some speculate that it could have gone to Wipro or Genpact.
Recounting her experience, Ramya said, “It was horrible. We were treated worse than dogs, with low pay and more importantly no respect.”
“And that was hurting,” she added.
Ramya is an engineer with five years of cumulative work experience. She worked at Cognizant for close to two years, earning Rs 25,000. When she passed out of a tier-2 college in Andhra Pradesh five years ago, jobs were hard to come by.
So Ramya took up the first job she got, as a content moderator, for Rs 8,700. “I had to repay a Rs 4 lakh loan. So I was desperate and didn’t hesitate to take the job,” she added.
After that, changing careers was not an option and she stuck with it. However, she underestimated the impact prolonged exposure to disturbing content would have on her health, both physical and mental.
A typical day in a content moderator's life
This is how a typical day for a content moderator pans out.
Ramya starts at 6 am and works a nine-hour shift with an hour's break in between. “So this one-hour break includes loo breaks, lunch and breakfast, since I am on the morning shift,” she explained.
There is no room for distractions and mobile phones are not allowed in the workspace.
Her workload changes daily. “On average we do 2,000-3,000 decisions per day, and on other days it could be less or more depending on the size of the content,” she pointed out.
As a content moderator, Ramya’s job is to decide whether a particular piece of content can stay on the site, based on the guidelines Facebook sets. The content could be short videos or images that users report for removal.
These guidelines change frequently and at times lead to wrong decisions.
“So if a particular image is passed today, tomorrow the same image will not be passed due to change in guidelines. It is hard to understand the rationale,” added Mridula*, another former moderator and Ramya's colleague currently working for a different firm.
There is hardly any room for error and the consequences are terrifying.
“For every single error we make, a verbal backlash awaits in the manager’s room,” added Mridula.
"Because there is a huge emphasis on accuracy, our manager, you could say, is not tolerant of mistakes," she said.
His discontent shows in verbal lashings. “So apart from our work, we also had to deal with such managers,” she recounted.
That is only half the problem. The other half is coping with the distress that viewing such disturbing images causes. “We see a lot of images related to porn, rape and violence. After a certain point, you tend to normalise them,” added Pawan*, a former colleague of Ramya and Mridula.
For Pawan, for instance, seeing a gruesome murder video became normal. A few months into reviewing, he didn't bat an eye while passing decisions on such content.
However, after a certain point, seeing all this day after day does get to them.
“When we see some of the videos we feel like puking when we are eating,” Ramya said. Sleeping is harder, as images of blood and gore keep them awake.
It is harder still to share anything with family and friends, given that most moderators come from conservative families.
Most of the moderators Moneycontrol spoke to admitted that their families are not aware of the 'gory' details except that they are working for Cognizant.
“They are proud, and giving them the details would only disappoint them,” added Pawan.
While friends outside the workplace are easier to talk to, moderators pointed out that they can be sympathetic at best; they would still find it hard to understand the impact the work has on them.
“We cannot talk. It is a miserable life. So that is why we are close to our teammates. It is like we are united by our misery,” Mridula said.
Access to mental health support
In a 2018 blog, Ellen Silver, Facebook's VP of Operations, said, “All content reviewers, whether full-time employees, contractors, or those employed by partner companies, have access to mental health resources, including trained professionals onsite for both individual and group counseling. And all reviewers have full health care benefits.”
In a response to Moneycontrol's query, a Facebook spokesperson said, "We require all of our partners globally to provide access to extensive support from licensed mental health counselors to ensure their well-being, including 24/7 on-site support with trained clinicians, an on-call service, and access to private healthcare from the first day of employment. We are also globally employing new technology to limit their exposure to graphic material as much as possible."
However, these former employees questioned the company's efforts, saying it did little to help them, such as providing access to counsellors.
Queries sent to Cognizant did not elicit any response at the time of publishing.
According to research journals, continuous exposure to violent content can lead to issues such as post-traumatic stress disorder, depression, anxiety, drug use and alcohol dependence. A lack of emotional support makes moderators even more vulnerable.
There are other challenges as well. Having continuously skipped meals, Ramya developed an ulcer. Most moderators stopped drinking water too.
"Drinking water meant going to the washroom often. We cannot afford that," added Ramya. All this led to health issues.
Mridula developed anxiety after her loan was cancelled because the office did not respond to verification calls from the bank.
“Mobile phones are not allowed and the landline does not work half the time. I had applied for a personal loan and the bank called the office to verify it. But no one picked up the call and my loan was cancelled,” Mridula explained.
“Since then, I cannot help but imagine a situation where there is a personal crisis and no one is able to reach me,” she said. Now, every time there is a call, she jumps out of her seat.
These three are now glad that they are not working there anymore.
“I don’t ever want to go back to that life,” they say in unison. However, given the loans they shoulder, a job is a necessity.
Ramya and Mridula have joined a different firm and still work in content review.
“It is eons better than what we were doing earlier,” they added.
Pawan has moved on from content moderation and works as an engineer at a tech major.
For the others, the cycle probably continues.
*Names have been changed to protect identities.