Facebook pledged real-time monitoring of Election Day misinformation and manipulation efforts as voters began in-person balloting across the United States Tuesday.
Along with other social platforms, the company has promised to stem misinformation around the election, including premature claims of victory, seeking to avoid a repeat of 2016 manipulation efforts.
"Our Election Operations Center will continue monitoring a range of issues in real time -- including reports of voter suppression content," said a Facebook statement posted on Twitter.
"If we see attempts to suppress participation, intimidate voters, or organize to do so, the content will be removed."
Facebook said its election center is also tracking other issues, such as actions by supporters of President Donald Trump who surrounded campaign buses for Democrat Joe Biden.
"We are monitoring closely and will remove content calling for coordinated harm or interference with anyone's ability to vote," Facebook said.
Facebook reiterated that it would place warning labels on any posts that claim victory prematurely.
"If a presidential candidate or party declares premature victory, we will add more specific information in the labels on candidate posts, add more specific information in the top-of-feed notifications and continue showing the latest results in our Voting Information Center," the social giant said.
Twitter meanwhile added a warning label on a tweet from Trump late Monday for spreading misleading information.
The post said a slow vote count in battleground state Pennsylvania could lead to "rampant and unchecked cheating."
"It will also induce violence in the streets. Something must be done!" he tweeted.
Both Facebook and Twitter have taken multiple steps to stem the flow of false and misleading election information but have faced glitches and loopholes in implementing their policies.
Google-owned YouTube has also sought to limit the sharing of videos with election misinformation.