© Reuters. FILE PHOTO: The Instagram application is seen on a phone screen

(Reuters) – Facebook Inc released its fourth report on enforcement against content that violates its policies on Wednesday, adding data on photo-sharing app Instagram and content depicting suicide or self-harm for the first time.

Proactive detection of violating content was generally lower on Instagram than on Facebook’s flagship app, where the company initially implemented many of its detection tools.

For example, the company said it proactively detected content affiliated with terrorist organizations 98.5% of the time on Facebook and 92.2% of the time on Instagram.

Facebook said it had removed about 2.5 million posts in the third quarter that depicted or encouraged suicide or self-injury. The company also removed about 4.4 million pieces of drug sale content during the quarter, it said in a blog post.

2019-11-13