Imagine you come across a room on a social audio network discussing a highly polarising topic, with hundreds of people tuned in. The room is live, and on most live audio platforms the only option a user has right now is to report it.
Even then, it takes time for platforms to respond and take appropriate action.
But what if platforms could act immediately, with a feature to shut down content right there if it is deemed objectionable? What if there were an option where club members could continue to debate, but the room could be hidden from the feed in the interest of the larger public?
While this might not check all the boxes when it comes to moderation, these are some of the steps Indian platforms are looking to implement to address the challenge live audio presents.
From porn and bullying to hate speech, social audio platforms such as Clubhouse and Twitter Spaces, which gained popularity over the past year, are struggling with content moderation. But their Indian counterparts, albeit smaller in size, are more confident of tackling the problem with a moderation-first approach and multiple layers of moderation.
This is especially important in light of the challenges big tech players like Facebook face owing to a lack of focus on moderation, and the fast adoption of social audio features across platforms.
Emergence of social audio
One could consider 2021 the year of social audio, popularised by drop-in live audio platform Clubhouse. The platform's downloads peaked in February 2021, when Tesla’s Elon Musk and Facebook founder Mark Zuckerberg made appearances. This prompted multiple platforms, including Twitter, Spotify and Facebook, to join the audio bandwagon.
Twitter launched Spaces for the public in May 2021, and Spotify launched Greenroom, its own social audio network, in June 2021. Facebook, now Meta, started testing its own audio room, Hotline, in April 2021.
Over the past year, India too has seen multiple companies enter the audio space, and existing ones gain traction. These include Leher; FireSide, owned by short-video platform Chingari; Scenes by Avalon; Mentza; and Bakstage. This is apart from other startups in the audio space, such as podcasting platforms Kuku FM and aawaz.com.
Challenges of moderation in audio
Moderation in live audio is hard. Because it is live, there is no way to know in advance how a conversation will flow.
This is the challenge Paul Davison, co-founder and CEO of Clubhouse, pointed out in an earlier interaction with Indian journalists. He said: “Live group audio is really different from other mediums since you don't know what people are going to say and what the content is going to be in advance. That presents a certain set of challenges.”
The second problem is actually taking action against objectionable content. Given that most of the moderation lies with the hosts or moderators of a room, without third-party intervention it is not always easy to control content that could incite violence, bullying or harassment.
Cases in point are rooms like the one on body-part auctions on Clubhouse, or anti-vaccine groups on Twitter, as a Washington Post article pointed out.
Responding to the Washington Post article on moderation in Twitter Spaces, Twitter acknowledged the limitations audio poses when it comes to moderation, since technology that can scan audio in real time does not exist. The company added that it is exploring avenues to moderate audio.
These are the two aspects Indian platforms are focusing on.
Multiple layers of moderation
Take Leher. While the platform was launched in 2018, it wasn’t until 2021 that it saw significant traction. Co-founder and CEO Vikas Malpani quipped in a recent interaction that the live audio platform has not been as lucky as Clubhouse in terms of user numbers. Leher has about 300,000 downloads so far, and the company is confident about its growth. But as important as growing the platform and community is, moderation has been a focus area from the start, he said.
Leher has multiple layers of moderation to ensure that users access only the right content, Malpani said. At the first level, algorithms weed out objectionable content such as pornography. But this alone is not enough, given that technology has not reached a level where it can catch the many dialects of Indian languages.
So the company has another mechanism in place. Malpani explained that when a new user starts a room, not all users see it in their feed immediately. New users are tested randomly to gauge the quality of the content they produce before their rooms are made visible to the entire network.
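The gating mechanism Malpani describes could be sketched roughly as follows. This is an illustrative assumption of how such a flow might work, not Leher's actual code; the sample size, quality threshold and function names are all hypothetical.

```python
import random

SAMPLE_SIZE = 50          # assumed size of the random test audience
QUALITY_THRESHOLD = 0.8   # assumed share of positive ratings needed

def pick_test_audience(all_users, creator_id):
    """Choose a random subset of users who see a new creator's room first."""
    candidates = [u for u in all_users if u != creator_id]
    return random.sample(candidates, min(SAMPLE_SIZE, len(candidates)))

def room_visibility(ratings):
    """Decide feed visibility from the test audience's quality ratings.

    ratings: list of booleans (True = positive rating).
    The room surfaces network-wide only after it clears the quality bar;
    until then it stays restricted to the sampled audience.
    """
    if not ratings:
        return "sample-only"
    score = sum(ratings) / len(ratings)
    return "network-wide" if score >= QUALITY_THRESHOLD else "sample-only"
```

The key design idea is that visibility, not publication, is the control point: a new user can still host a room, but the network-wide feed is earned through demonstrated content quality.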
Unlike the typical options available to users—‘Show me fewer rooms like this’ or ‘Report room’—Leher has added another layer, super moderation access.
Leher gives certain sections of the community super-moderation access. Those with such access have two powers: they can forcibly end a room, and they can hide it from public view. These users are selected based on their behaviour on the platform over a period of time, and are rewarded with additional coins for moderating.
Sumit Ghosh, co-founder and CEO of Chingari, which also owns FireSide, a live audio platform launched last year, is looking at similar access controls. Ghosh told Moneycontrol, “We also have issues with moderation. While currently manual moderation is going on, it (moderation) should be a community-driven thing.”
FireSide will soon be integrated with the parent app Chingari, and the standalone app will be shut down. In the integrated app, each room will have moderators drawn from the GARI community, along with clear-cut guidelines that they will enforce. GARI is the crypto token Chingari launched in November.
“We will have our own community guidelines and if users breach them the GARI community moderators will ban or kick out people, force close the rooms,” Ghosh said. These community moderators will not be part of the active discussion but will be present in the room. If a situation gets out of hand, hosts will be warned about the kind of discussion allowed. If they don’t comply, the rooms will be closed.
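The warn-then-close escalation Ghosh outlines amounts to a small state machine per room. The sketch below is a hypothetical illustration of that two-step flow, not Chingari's or FireSide's actual implementation; the class and method names are assumptions.

```python
class Room:
    """Minimal model of a live room under community moderation."""

    def __init__(self, host):
        self.host = host
        self.warned = False  # has the host already received a warning?
        self.open = True     # is the room still live?

    def report_breach(self):
        """Called by a community moderator when guidelines are breached.

        First breach warns the host about the kind of discussion allowed;
        a repeat breach force-closes the room.
        """
        if not self.warned:
            self.warned = True
            return "host warned"
        self.open = False
        return "room closed"
```

One consequence of this design is that moderators never need to judge a single utterance in isolation: the first action is always a warning, and the irreversible step is reserved for hosts who ignore it.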
“This is something we are planning to do in Chingari as well. Once you incentivise the users, they will come and upload all the shit. The quality of the app will go down. So the only way to go about this is for it to be community-driven like Reddit or Stack Overflow,” Ghosh added.
While anyone can apply to be a moderator and earn GARI tokens on Chingari/FireSide, Ghosh said that there will be weekly reviews or town halls on how to improve moderation.
Why is moderation important?
According to the founders, quality content is important to attract users and businesses, and a lack of it could have a negative impact.
Ajit Narasimhan, chief marketing officer, Sundaram Mutual, explained in an earlier interaction that for brands, it is important to be associated with a platform that projects the right image to identify and attract customers. “Otherwise, it is better to stay away from the platform,” he said.
Apart from brand collaborations, image is important if a platform wants to build a paid subscriber base. This becomes particularly important as tipping and room-ticket features become a source of revenue for creators.
Take Kuku FM, which has over 200,000 active paid users for the audio content generated on the platform. Lal Chand Bisu, co-founder and CEO of the podcasting platform, which also has live interactions, said that to serve subscription-based premium content to customers, one needs to focus on moderation from day one.
Bisu explained that for Kuku FM, moderation was built in from the start rather than as an afterthought, and the company has put in place a mechanism to ensure that its premium users stay with the platform. It follows a hybrid model for moderation: 80 percent automated and 20 percent manual.
“What we do is when someone is making good quality content repeatedly, we mark that creator as a premium creator. After that what happens is that the content goes to the listeners directly. But we also give controls to listeners to report bad content, which is reviewed by our content moderation team,” Bisu said.
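The routing Bisu describes, where proven creators bypass pre-publication screening while listener reports still reach a human team, could be sketched like this. Everything here is an illustrative assumption (the track-record threshold, the function names, the report queue), not Kuku FM's actual system.

```python
PREMIUM_TRACK_RECORD = 5  # assumed count of consistently good uploads

def publish_path(creator_good_uploads):
    """Premium creators' content goes to listeners directly;
    everyone else passes through automated screening first."""
    if creator_good_uploads >= PREMIUM_TRACK_RECORD:
        return "publish directly"
    return "automated screening first"

def handle_report(report_queue, content_id):
    """Listener reports are queued for review by the human
    content moderation team, even for premium creators."""
    report_queue.append(content_id)
    return f"{content_id} sent for human review"
```

The split mirrors the 80/20 hybrid model: automation handles the bulk of screening, while the listener-report path keeps humans in the loop for the cases automation lets through.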
Of its 70 employees, 30 are part of a dedicated content moderation team. According to Bisu, dedicating that many people to moderation is necessary to build a safe platform.