Discord has updated its Community Guidelines with a clause prohibiting the sharing of information it deems “false or misleading” and “likely to cause physical or social harm” if acted upon. The rule could apply to a wide range of content, but Covid-19 rhetoric is the prime example: the chat service does not want to be a platform for anti-vaccine content or advice rejected by the medical community, such as the promotion of unproven home remedies.
In short, Discord does not allow users to “post, promote, or organize communities around false or misleading health information that is likely to result in harm,” wrote Alex Anderson, Discord’s senior platform policy specialist, in a blog post explaining the update.
Discord defines false or misleading health information as any health information that “directly and unequivocally contradicts the latest consensus of the medical community,” and offers a surprising amount of detail about what that covers.
The following is a list of topics Discord warns against making “false or misleading” claims about:
- the safety, side effects or effectiveness of vaccines;
- the ingredients, development or approval of vaccines;
- alternative, unapproved treatments for diseases (including claims promoting harmful forms of self-medication and claims advocating rejection of vaccines or alternatives);
- the existence or prevalence of a disease;
- the transmission or symptoms of a disease;
- public health guidance, advice, or mandates (including false claims of preventive measures and actions that could impede the resolution of a public health emergency);
- the availability or eligibility for healthcare services; and,
- content alleging a health conspiracy by malicious forces (including claims that could cause social unrest or lead to the destruction of critical infrastructure).
The list alone could be read as a blanket ban on expressing distrust of local health regulations or even recommending “alternative” traditional medicines. However, Anderson says Discord will weigh context and intent, and will not take action unless it believes messages are “likely to cause some form of harm.”
“This policy is not intended to penalize polarizing or controversial positions,” he writes. “We allow the sharing of personal health experiences; opinions and comments (so long as those views are factual and not harmful); good faith discussions of medical science and research; content designed to condemn or debunk misinformation about health; and satire and humor designed to ridicule patently and intentionally false or misleading health claims.”
People with polarizing or controversial views will likely disagree with the claim that they’re not being targeted, though it’s worth noting that Discord users who mostly stick to smaller groups may not notice any change regardless of what they say on the platform.
When I spoke to Discord about privacy in 2019, it told me that it doesn’t proactively monitor a given server’s text and voice chat — how could it with over 150 million monthly active users? Instead, moderators mostly respond to user reports, most likely coming from large public servers.
I find it unlikely that Discord would search the chat logs of every 20-person server for Covid-19 vaccine and microchipping narratives, although there is some precedent for proactive moderation on Discord. In 2018, after several outlets reported that the relative privacy Discord offers had turned it into a hideout for white supremacists, the company made a public effort to rid itself of hate-group servers. Following that example, Discord could locate and shut down servers that openly advertise themselves as anti-vax hubs, if such servers exist. (If I had to guess, I’d say they do.)
The new Discord guidelines come into force on March 28th. The update also prohibits malicious impersonation, noting that “satire and parody are fine,” and Discord has given itself permission to consider “relevant off-platform behavior” when responding to user reports, such as “membership or association with a hate group, illegal activities, and hateful, sexual, or other types of violent acts.”
Discord also says it will take action against “false, malicious, or spam” reports. “If you are found to be reporting in a malicious manner, we may take action against your account,” the company says.
As someone who doesn’t use Discord as a soapbox for vaccine commentary one way or the other, the changes mostly serve as a reminder that conversations happening on the platform aren’t entirely private, even on so-called private servers. Discord is a moderated social network: if someone submits a report, Discord moderators can see your chat logs and issue warnings, suspensions, or bans. For those who want Discord-like features without joining a social network, companies like TeamSpeak still offer paid private VoIP servers. (Right now, I’m not too worried about encrypting my D&D group’s endless planning calls.)