(CNN) Facebook will soon take action against misinformation about vaccines, according to a Facebook representative.
Public health experts have pointed fingers at social media platforms, saying that false claims that vaccines cause autism and other health problems have frightened parents into refusing to vaccinate, fueling the current measles outbreak that started in Washington state.
The Facebook representative, who asked not to be named, said the social media giant is working with health experts to decide what changes to make and considering a combination of approaches to handle vaccine misinformation. These approaches wouldn't take misinformation off Facebook but rather make it less prominent.
For example, groups that promote vaccine misinformation wouldn't show up in the list of groups that Facebook recommends users join. Also, Facebook would make sure that posts containing vaccine misinformation would appear farther down in a user's newsfeed.
Public health and technology experts welcomed the planned changes. "This is good news," said Art Caplan, head of the Division of Medical Ethics at the New York University School of Medicine. "They're incremental steps, but they're heading in the right direction."
Darrell West, director of the Center for Technology Innovation at the nonprofit Brookings Institution, agreed. "Facebook is on the side of science," he said. "They're acting in accordance with the scientific consensus."
Facebook is also considering changes to its advertising policy, according to the representative. A CNN search of Facebook's ad archive found that several groups promoting false information about vaccines advertise on the site.
Another change would demote results containing vaccine misinformation when people search for certain terms. That could be a significant shift: according to recent CNN searches on Facebook, anti-vaccine groups currently rank high in the results when the word "vaccine" is searched.
Facebook has been under considerable pressure to do something about the anti-vaccine information on its site; other social media platforms got tough on such content years ago. For example, YouTube doesn't allow ads to appear on videos that promote anti-vaccine content, which means these groups cannot make money through advertising.
Pinterest has the most restrictive rules about vaccine information; users are unable to link to certain sites that have misinformation, among other restrictions.
"We're a place to go to for inspiration, and there's nothing inspiring about harmful content," said Ifeoma Ozoma, Pinterest's policy and social impact manager.
Caplan said that while he welcomes Facebook's changes, he wishes the platform wouldn't stop there. "It may be necessary to go even further in terms of pulling scurrilous and erroneous sites down completely," he said.
A First Amendment rights advocate, however, said she thought Facebook had struck the right balance. "They're avoiding censorship, but they're not necessarily recommending it to their readers, and that's probably the best way to go about it," said Lata Nott, executive director of the First Amendment Center at the nonpartisan Freedom Forum Institute.