Facebook is stepping up efforts to combat misinformation ahead of the European parliament elections in May and is teaming up with the German news agency DPA to expand fact-checking, a senior executive said on Monday.
Facebook has come under pressure around the world since the 2016 US elections to stop the use of fake accounts and other forms of misinformation to influence public opinion.
Last month, the European Union accused Google, Facebook and Twitter of falling short of their pledges to fight fake news ahead of the European elections, after the companies signed a voluntary code of conduct to stave off regulation.
Facebook said on Monday that it was setting up an operations center that would run 24 hours a day, staffed by engineers, data scientists, researchers and policy experts, and coordinate with outside organizations.
Tessa Lyons, Facebook’s head of news feed integrity, told reporters in Berlin: “They will proactively try to identify emerging threats so they can take action as quickly as possible.”
Facebook also announced that it is partnering with DPA, Germany’s largest news agency, to help check the accuracy of posts, alongside Correctiv, a nonprofit collective of investigative journalists that has been fact-checking for Facebook since January 2017.
The company will also seek to curb paid advertising that is misused for political ends, and will train more than 100,000 students in Germany in media literacy.
Germany has been particularly active in trying to police hate speech online, implementing a law last year that forces companies to delete offensive posts or face fines of up to 50m euros ($56.71m).
Misinformation in elections became a prominent issue after US intelligence agencies concluded that Russia had tried to sway the outcome of the 2016 US presidential election in favor of Donald Trump, in part via social media. Moscow has denied any interference.
Lyons said Facebook had made progress in limiting fake news over the last two years and would increase the number of people working on the issue globally to 30,000 by the end of the year, from 20,000 currently.
She said that, in addition to human intervention, Facebook is continually refining its machine learning tools to identify untrustworthy posts and limit their distribution.
“This is a highly adversarial space, and whether the bad actors are financially or ideologically motivated, they will try to find ways around the work we are doing and adapt to it,” she said.
(Source: investing.com)