Facebook’s improved AI uses pattern recognition to help prevent suicides

Facebook isn’t just a simple social networking website, as it has shown with previous projects and is showing again with its new suicide prevention AI. Using pattern recognition on posts and live video streams, the AI should be able to recognize when a Facebook user may be having suicidal thoughts.

Guy Rosen, VP of Product Management, explained in a blog post how it is going to work.

“Using pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide, and to help respond to reports faster

Improving how we identify appropriate first responders

Dedicating more reviewers from our Community Operations team to review reports of suicide or self harm”

Pattern recognition isn’t a new feature; Facebook already uses it in its “first responders” program, which has resulted in over 100 wellness checks based on reports surfaced through the company’s proactive detection efforts.

The suicide prevention AI will soon begin rolling out beyond the US, and after that it will be available worldwide, except in the European Union. The pattern recognition looks at comments such as “Are you ok?” and “Do you need help?” These act as signals prompting the AI to flag a particular post or live stream for Facebook staff to investigate.
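Facebook hasn’t published how its model actually works, but as a purely illustrative sketch, a naive keyword-based flagger of the kind described above might look like this (the phrase list and threshold here are made-up assumptions, not Facebook’s):

```python
# Toy illustration only: flag content for human review when several comments
# match known concern phrases. NOT Facebook's actual system; the phrases and
# threshold below are invented for demonstration.
CONCERN_PHRASES = ["are you ok", "are you okay", "do you need help"]

def flag_for_review(comments, threshold=2):
    """Return True when at least `threshold` comments contain a concern phrase."""
    hits = sum(
        1 for comment in comments
        if any(phrase in comment.lower() for phrase in CONCERN_PHRASES)
    )
    return hits >= threshold
```

A real system would use a trained classifier over post text, comments, and video signals rather than a fixed phrase list, but the flow is the same: commenters’ reactions feed a signal, and crossing a threshold routes the content to human reviewers.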

Until the AI rolls out, you can help with suicide prevention yourself: reach out to the person you think is at risk, or report the post to Facebook. Its teams work worldwide and are available 24/7, and reports are sorted by priority, so a possible case you report will be attended to as soon as possible.

Pattern recognition will strengthen Facebook’s existing suicide prevention efforts. This is good news for Facebook and another notable win for AI. As the technology advances, we expect to see more tools like this one helping people worldwide.

What are your thoughts on this tool from Facebook? Let us know in the comments below.


Via: Facebook
