
Meta rolls back COVID-19 misinformation rules in many countries



Meta is rolling back its COVID-19 misinformation rules for Instagram and Facebook in countries that no longer deem the pandemic to be a national emergency. The policy will no longer apply in the US, along with some other territories.

Last July, Meta asked its Oversight Board for its opinion on the misinformation policy after noting that the pandemic had “evolved.” The board took some time to weigh in, but in April, the group suggested that Meta should keep removing false claims about COVID-19 that are “likely to directly contribute to the risk of imminent and significant physical harm.” The Oversight Board also told the company to “reassess” the types of pandemic claims that it removes under the policy.

In addition, the advisory group suggested that Meta make preparations ahead of the World Health Organization nixing the emergency status of COVID-19 “to protect freedom of expression and other human rights in these new circumstances.” The WHO lifted its COVID-19 emergency designation in May, and Meta has now issued its response to the Oversight Board’s recommendations.

“We will take a more tailored approach to our COVID-19 misinformation rules consistent with the Board’s guidance and our existing policies. In countries that have a COVID-19 public health emergency declaration, we will continue to remove content for violating our COVID-19 misinformation policies given the risk of imminent physical harm,” Meta said. “We are consulting with health experts to understand which claims and categories of misinformation could continue to pose this risk. Our COVID-19 misinformation rules will no longer be in effect globally as the global public health emergency declaration that triggered those rules has been lifted.”

Soon after the onset of the pandemic, social media platforms faced pressure to combat COVID-19 misinformation that people were spreading, such as inaccurate claims about vaccines. Many — including Meta, Twitter and YouTube — established policies to tackle COVID-19 falsehoods.

Those rules have evolved over time. For instance, in May 2021, Meta said it would no longer remove claims that COVID-19 was “man-made.” As the Oversight Board noted last year, Meta removed 27 million Facebook and Instagram posts that contained COVID-19 misinformation between March 2020 and July 2022.

Twitter stopped enforcing its COVID-19 misinformation policy in November, not long after Elon Musk took over the company and made sweeping staff cuts. Meanwhile, YouTube recently updated its misinformation policy to stop removing content containing 2020 election denialism.

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission. All prices are correct at the time of publishing.

