How mental health subreddits are coping with the coronavirus infodemic



Woman using mobile phone. Image credit: peakpx

First Draft examined how mental health communities are dealing with the pandemic as people become more vulnerable to misinformation in times of crisis

“I have had two meltdowns from the fear mongering,” one Reddit user writes on a mental health forum. “I know that some people are so health concern dismissive that they NEED the fear tactics to take reasonable health precautions. But for us it is hell to read over and over again!”

Another adds: “The misinformation is the worst part. You can’t tell what’s misinformation and what isn’t – unqualified people [are] making statements as if they were fact.” 

As the world responds to an unprecedented crisis, the coronavirus pandemic has become a source of great anxiety for many. However, for those with pre-existing mental health conditions such as health anxiety and obsessive-compulsive disorder, the impact of the outbreak, and the ‘infodemic’ that has accompanied it, has been especially strong. Mental health charity Anxiety UK advises that people with these conditions may find the “current heightened focus on contamination and infection particularly distressing”.

These disorders are characterised by an intolerance of uncertainty and excessive worry, psychological states that have moved to the forefront of many people’s lives as we grapple with a crisis. Peter Tyrer, Emeritus Professor in Community Psychiatry at Imperial College London and a health anxiety expert, defines the illness as “an irrational fear of either having or getting a disease”. He stressed that the current situation with coronavirus is markedly different: in a pandemic, fear is not irrational, but some of the “complicated and highly unproductive” measures people may take to avoid infection can be.

First Draft spoke to Dr. Kate Starbird, Associate Professor of Human Centered Design & Engineering at the University of Washington and crisis informatics researcher, who explained why the pandemic is fertile ground for misinformation. 

“We’re really vulnerable to disinformation flows, especially at this time when we have all of this anxiety. We’re trying desperately to find information that can help us make the right decisions for ourselves, our families and our communities, and just resolve that uncertainty. And so [that makes us] acutely vulnerable to the spread of mis- and dis- information.”

Reddit hosts a number of mental health communities, moderated by volunteers like other communities on the platform. The head of the WHO has said that coronavirus misinformation is spreading even faster than the virus itself, and Reddit has proven to be no exception. First Draft found that, in the absence of a hard line from the platform against misinformation, moderators in mental health communities have had to step up to deal with the swell of coronavirus information.

The site has also run banners by health organisations and quarantined communities posting hoaxes or misinformation, like r/Wuhan_flu. But its approach has been markedly less stringent than that taken by some of the other platforms, including Facebook and Pinterest.

In some ways, online mental health communities like r/healthanxiety are well-prepared for the Covid-19 infodemic. The subreddit has a rule in place that feels timely: “No ‘buzz illness’ posts”. The rule warns that posts about coronavirus will be removed to avoid the forum being “overrun” with “media-hyped illness”; they are only allowed in a dedicated megathread. On r/OCD, moderators use the AutoModerator feature to remove posts containing certain keywords.

Matt, a moderator on r/healthanxiety, said this was to “ensure that the conversations remain productive and healthy”. He told First Draft: “There has been no shortage of scams, spam, and misinformation being shared. Thankfully, we have caught most of them before they were publicly visible.”

He said moderators are taking steps to keep the environment productive and healthy, including implementing the AutoModerator feature to filter out coronavirus-related posts and direct posters to the forum’s megathreads, as well as experimenting with manually approving links and filtering them out entirely. Moderators have also been sharing links to expert guidance on the virus.
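Reddit’s AutoModerator is configured through YAML rules stored in a subreddit’s wiki. A minimal sketch of the kind of rule the moderators describe might look like the following; the keywords, the choice of action and the message wording are illustrative assumptions, not the actual configuration of r/healthanxiety or r/OCD:

```yaml
# Hypothetical AutoModerator rule: hold coronavirus-related posts for
# moderator review and point the author to the pinned megathread.
type: submission
title+body (includes): ["coronavirus", "covid-19", "sars-cov-2"]
action: filter          # removed from public view until a moderator approves
action_reason: "Possible coronavirus post - see megathread rule"
comment: |
    Posts about coronavirus are consolidated in the pinned megathread
    to keep the forum from being overrun. Please post there instead.
```

Using `filter` rather than `remove` keeps a human moderator in the loop, which matches the manual-approval experiments described above.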

Reddit says it is closely monitoring the pandemic and providing resources to support volunteer moderators and users.

A spokesperson said: “Moderators have access to crisis management resources on our Mod Help Center, which were also shared through a direct message and post in our dedicated community for moderators, r/modsupport, in light of Covid-19. We have also created a Community Council of our most impacted communities to work closely together, understand the issues they face, and better support them through this time.

“In the event that emergency moderation resources are needed, we will also make our Moderator Reserves program available. Reddit recently launched a partnership with Crisis Text Line as well, which any user or moderator that is struggling to cope may access. We will continue to evaluate and evolve how we can best support our communities.”

Moves to keep the environment free from an influx of unverified information can be crucial during a crisis. Dr. Starbird said that the way people respond to crises like a pandemic can often lead to the spread of rumour and false information online. “People come together to try to make sense of what’s going on: it’s a natural response to the uncertainty and anxiety that are inherent to crisis events like this, where we don’t really know how it’s gonna play out.”

This has always been true, long before the advent of social media. “These behaviours are natural human behaviours: social media becomes a facilitator for them,” says Dr. Starbird. “But there’s new configurations of how we can participate [in this sense-making process]. Because we can participate from all over the world and we have these different network structures, there’s different ways that things are amplified and influences are flowing.

“It qualitatively shapes how the sense-making processes are taking form, as well as just scaling up the number of people that can participate and the distance across which we can [communicate].”

While collective sense-making, and the rumours that often accompany it, are a long-standing human response to crises, our information environments have evolved rapidly over time. Though historically crises have often been characterised by a lack of information, they now increasingly feature an overabundance of it, as we’re seeing with the Covid-19 infodemic. 

Dr. Starbird underlined a similarity between these distinct states: “The effect is the same in that you have uncertainty about what to do. The difference is that before, your sense-making practices were making it up from scratch, and now you have an infinite number of sources to turn to – good, bad and otherwise – to pick from to help you grasp those ideas.”

Our current information abundance, and the natural response of feeling overwhelmed in times of crisis, raise questions of platform responsibility. While volunteer moderators work to maintain an information ecosystem for people who, Professor Tyrer says, often feel isolated in their daily lives, should Reddit be taking a more interventionist approach in filtering out fact from fiction?

Matt thinks so, saying: “I suspect that Reddit could be doing more for source validation, and at the very least giving moderators more pre-made tools for verifying and displaying high quality sources and data.”

Dr. Starbird agreed the platform should give moderators more tools to regulate the site. However, she stressed the complexity of policing misinformation, cautioning that it can have far-reaching implications.

“I think that the focus right now is best placed on when problematic information is getting a wide reach; when information that is likely to cause people to take actions that are detrimental to themselves or society is going viral.”

She emphasised the importance of transparency from official channels in a situation that is changing day-to-day, but also our own need to adapt as information participants.

“We’ve got to adjust our natural tendency to want certainty and really accept the fact that [the situation] is uncertain and changing, and as much as we want to resolve that, it’s only going to resolve with time. And we don’t even know how long that time is.”