Amid growing calls from health experts to stem the tide of misinformation spread by anti-vaccination groups on social media, Facebook announced on Thursday that it would reduce the visibility of inaccurate posts and reject ads that include such content.
“We are working to tackle vaccine misinformation on Facebook by reducing its distribution and providing people with authoritative information on the topic,” said a blog post by Monika Bickert, Facebook’s vice-president of global policy management.
In recent years, groups trying to promote an anti-vaccine agenda have been able to spread their messages — largely uncensored — on Facebook, YouTube and other social media sites. Public health authorities believe these posts have frightened some parents into “vaccine hesitancy.”
Anti-vaccination groups often post false but alarming content, ranging from claims that vaccines cause autism (an idea that was scientifically debunked long ago) to claims that being infected with measles as a child can protect against heart disease and cancer later in life.
The World Health Organization has identified reluctance or refusal to vaccinate against preventable diseases, such as measles, as one of the top 10 threats to global health.
Health authorities believe vaccine hesitancy is one of the contributing factors behind current measles outbreaks in Canada and the U.S. — a trend they say is particularly disturbing given that the disease was declared eliminated in Canada in 1998.
In her post, Bickert said Facebook will “reduce the ranking of groups and pages that spread misinformation about vaccinations in News Feed and Search” — meaning they won’t pop up as often or as prominently as they do now. They also won’t be included in “recommendations,” she said.
The measures also target advertisers trying to promote paid anti-vaccination content on Facebook.
“When we find ads that include misinformation about vaccinations, we will reject them,” Bickert said. “For ad accounts that continue to violate our policies, we may take further action, such as disabling the ad account.”
The company, which also owns Instagram, “won’t show or recommend content that contains misinformation about vaccinations on Instagram Explore or hashtag pages,” she said.
In addition, Facebook is “exploring ways to share educational information about vaccines when people come across misinformation on this topic.”
The platform will use information from “leading global health organizations, such as the World Health Organization and the U.S. Centers for Disease Control and Prevention” on “verifiable vaccine hoaxes” to take action against them, Bickert said.
These actions take effect immediately, a Facebook spokesperson told CBC News.
Not enough, social media expert says
Fuyuki Kurasawa, director of the global digital citizenship lab at York University in Toronto, welcomed the measures — but said they are “belated” and don’t go far enough.
“[They] do address some of the concerns raised and criticisms directed at [Facebook] around this issue,” he told CBC News in an email.
“[But] rather than reduce their ranking, why not ban or disable private groups and pages that spread misinformation about vaccinations?” Kurasawa asked.
Kurasawa also wondered how Facebook would “deal with the ways in which anti-vaxxers may re-label or programmatically re-code their content so as to become undetected or less detectable via [Facebook’s] algorithms or human moderation?”
CBC News asked Facebook how it would identify organizations with names that don’t reveal their affiliation with anti-vaccination messaging — such as the U.S.-based National Vaccine Information Center, which focuses on claims of “vaccine injury” rather than promoting vaccination for disease prevention.
In an email, a spokesperson responded that any group posting misinformation about vaccines will now have its posts reduced in Facebook users' news feeds, and the group itself will be demoted in search results.
Providing ‘accurate information’
Facebook also plans to provide its users “with additional context” and is “exploring ways to give people more accurate information from expert organizations about vaccines at the top of results for related searches, on Pages discussing the topic, and on invitations to join groups about the topic,” Bickert said in her blog post.
Facebook’s attempts to curb vaccine misinformation come after other online platforms have already taken action.
Pinterest, a site on which users (called “pinners”) save visual content they like from other blogs or websites to their personal pages, started blocking searches containing terms such as “anti-vaccine” last year.
YouTube said it does not allow videos that promote anti-vaccination content to make money by hosting ads. A spokesperson told CBC News in an email last week that the platform is also “surfacing more authoritative sources and reducing recommendations for anti-vaccination videos,” as well as showing “information panels where users can fact check information for themselves.”