The social network Facebook was launched in 2004 at Harvard on the initiative of Mark Zuckerberg. At first the platform was reserved for students of that university, but that quickly changed, and today anyone can use it. Given this broad adoption, algorithm improvements have become essential …


An update to combat false information

According to the Facebook team, authentic information is essential to the platform, which is why the News Feed has received several improvements aimed at surfacing content likely to interest users. Concretely, new ranking code makes it easier to identify relevant posts and give them more visibility. It also helps identify and penalize pages that publish inauthentic content.


To do this, Facebook detects pages that frequently post unwanted ads, as well as pages that use manipulative tactics to gain visibility, for example by explicitly asking fans to comment on, share or like a “post” (so-called engagement bait). The social network also penalizes pages whose content users frequently hide from their feeds.
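Facebook has not published how this detection actually works. As a purely illustrative sketch, a rule-based filter like the one below could flag posts that explicitly beg for likes, comments or shares, and combine that with how often users hide a page's content; every pattern, threshold and weight here is hypothetical.

```python
import re

# Hypothetical phrases that typically signal engagement bait
# (illustrative only; not Facebook's actual rules).
BAIT_PATTERNS = [
    r"\blike (this|if)\b",
    r"\bshare (this|if)\b",
    r"\bcomment (below|if)\b",
    r"\btag a friend\b",
]

def looks_like_engagement_bait(post_text):
    """Return True if the post explicitly asks for likes, shares or comments."""
    text = post_text.lower()
    return any(re.search(p, text) for p in BAIT_PATTERNS)

def page_quality_penalty(posts, hide_rate):
    """Combine the share of bait posts with how often users hide the page's
    content into a penalty between 0 (clean) and 1 (heavily penalized).
    The 50/50 weighting is an arbitrary illustrative choice."""
    bait_rate = sum(looks_like_engagement_bait(p) for p in posts) / max(len(posts), 1)
    return min(1.0, 0.5 * bait_rate + 0.5 * hide_rate)

print(page_quality_penalty(["Like this if you agree!", "Our new product is out."], hide_rate=0.2))
```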

The algorithm change also helps Facebook keep the News Feed closer to real time by tracking how reactions to a post evolve. In practice, when someone comments on a “post”, that post becomes more likely to appear in other users’ feeds.
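One simple way to picture this “real-time” signal is a score in which each reaction counts for less as it ages, so a post whose comments are arriving right now outranks one whose activity happened hours ago. The sketch below is only an illustration of that idea, with an arbitrary half-life; it is not Facebook's formula.

```python
import math
import time

def freshness_boost(reaction_timestamps, now=None, half_life_s=3600.0):
    """Sum of reaction weights that decay exponentially with age.
    half_life_s is an arbitrary example value (one hour)."""
    now = time.time() if now is None else now
    return sum(math.exp(-(now - t) * math.log(2) / half_life_s)
               for t in reaction_timestamps)

now = time.time()
fresh_post = [now - 60, now - 120, now - 300]        # reactions in the last 5 minutes
stale_post = [now - 7200, now - 10800, now - 14400]  # reactions hours ago
print(freshness_boost(fresh_post, now) > freshness_boost(stale_post, now))  # True
```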

In addition, this update lets Facebook gauge users' interest in posted videos through two criteria (a simplified scoring sketch follows the list):

  • The date: live broadcasts are prioritized over videos replayed later;
  • The duration of the recording: the longer it is, the more likely the video is to appear in users’ news feeds. This probability increases further if the video accumulates significant watch time.
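To make the two criteria concrete, here is a toy scoring function that gives live videos a boost and rewards longer videos that viewers actually watch. The weights, the live bonus and the saturation point are all made up for illustration; Facebook has not disclosed its real formula.

```python
def video_feed_score(is_live, duration_s, watch_time_s):
    """Toy ranking score following the two criteria above: live content gets a
    boost, and longer videos rank higher, especially when viewers watch a
    large share of them. All weights are illustrative."""
    live_bonus = 1.5 if is_live else 1.0
    length_factor = min(duration_s / 600.0, 1.0)                 # saturates at 10 minutes
    completion = min(watch_time_s / max(duration_s, 1.0), 1.0)   # average share watched
    return live_bonus * (0.4 * length_factor + 0.6 * completion)

print(video_feed_score(is_live=True, duration_s=900, watch_time_s=500))
print(video_feed_score(is_live=False, duration_s=60, watch_time_s=10))
```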


Code improvements to prevent suicides

Suicide is one of the leading causes of death among young people aged 15 to 29, and according to published statistics a person attempts to take their own life every 40 seconds. Mark Zuckerberg's company wants to help address this problem, hence the development of a new tool whose objective is to flag any video published via Facebook Live in which the author suggests they want to end their life.

A team immediately analyzes the case. If it believes the risk is real, it sends a notification to the person concerned encouraging them to talk to a mental-health specialist in real time via Messenger. The person who reported the video also receives an invitation to chat with the individual in distress, along with guidance on the steps to take to avoid the worst. Note that the reporter can choose to remain completely anonymous.

Facebook, for its part, gathers all the information obtained through the reporting tool and relies on artificial intelligence to analyze the data. The goal of this approach is to find similarities across situations that have already occurred and to build a model that can identify a person who needs assistance. The algorithm then notifies the team responsible for reviewing this type of situation, and if the concern proves well founded, the team reaches out to offer help.
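The article only describes the idea, not the model. A deliberately tiny illustration of that idea is shown below: learn from posts that were previously reported (label 1) versus ordinary posts (label 0), then score new posts so that high-risk ones can be routed to the human review team. The example posts, model choice and threshold are all hypothetical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "I can't take this anymore, I just want it all to end",
    "Nobody would even notice if I disappeared",
    "Had a great day at the beach with friends",
    "Check out the cake I baked this weekend",
]
labels = [1, 1, 0, 0]  # 1 = previously reported as concerning, 0 = ordinary

# Bag-of-words features plus a linear classifier: a minimal stand-in for the
# kind of model trained on past reports described above.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

new_post = "I feel like there is no point in going on"
risk = model.predict_proba([new_post])[0][1]
print(f"risk score: {risk:.2f}")  # scores above some threshold would go to the review team
```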

Through this process, Facebook can recognize people in distress quickly, even without reports from other users, with the aim of significantly reducing the suicide rate.


A new algorithm to prevent “revenge porn”

Many people have been victims of “revenge porn”: a photo or video published on the social network without the consent of the person concerned, with the aim of publicly humiliating them. This kind of abuse can have serious consequences. According to one study, 93% of victims suffer significant emotional distress, while 82% report serious harm to their social or professional life.

To combat this phenomenon, Facebook lets users report intimate images that may have been shared without the owner's consent. The system sends the file to specially trained reviewers, who analyze it and remove it immediately if they judge it inappropriate. They will also suspend the account of the person who shared it. Note that it is possible to ask Facebook to review the decision if a non-pornographic photo was removed by mistake.

With this update, Facebook's goal is simple: block any further publication of this type of image. To that end, it analyzes content uploaded across three platforms: Facebook, Messenger and Instagram. If a user tries to share a previously reported file, they receive a warning that the photo has already been reported for violating the site's policies. In other words, they cannot make the image public again.
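In practice this kind of blocking relies on matching new uploads against images that reviewers have already taken down. The sketch below shows only the simplest version of that idea, an exact byte-for-byte blocklist keyed by a SHA-256 digest; production photo-matching systems use perceptual hashes that survive resizing and re-encoding, which a plain cryptographic hash does not.

```python
import hashlib

reported_hashes = set()  # digests of images confirmed as policy violations

def file_digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def report_image(data: bytes) -> None:
    """Called after reviewers confirm an image violates the policy."""
    reported_hashes.add(file_digest(data))

def allow_upload(data: bytes) -> bool:
    """Reject any upload whose digest matches a previously reported image."""
    return file_digest(data) not in reported_hashes

original = b"...raw image bytes..."
report_image(original)
print(allow_upload(original))        # False: blocked, already reported
print(allow_upload(b"other image"))  # True: not on the blocklist
```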

Facebook also wants to help victims of “revenge porn” by collaborating with expert organizations in the field, such as the Cyber Civil Rights Initiative and the Revenge Porn Helpline, which provide support to people whose intimate photos have been disclosed.

Given Facebook's growth, these various improvements have become essential. According to the company's figures, the platform has around 1.94 billion active users, and its revenue reached $8.03 billion in early 2017, compared with $5.38 billion over the same period in 2016. We can therefore expect this juggernaut to make further code changes in the near future to attract even more internet users.
