Instagram is taking action against bullying on its platform.
The Facebook-owned photo-sharing network on Tuesday rolled out a machine learning tool to detect bullying in photos and captions. If the AI tool deems a photo unkind or unwelcome, it's sent to Instagram's community operations team for further review, according to a blog post introducing the anti-bullying tools.
Instagram also introduced a bullying comment filter for live videos, which detects and blocks offensive words during a live stream. Instagram launched the filter in May for comments on photos and videos in Feed, Explore and Profile.
Facebook and Twitter have launched similar initiatives to limit bullying on their platforms. Twitter set a specific timeline last October for removing content like nudity and hateful imagery from its platform. Facebook added tools earlier this month that let users hide or delete multiple comments at once and report bullying or harassment on behalf of a friend or family member.
In addition, Instagram added a kindness camera effect with teen author and actor Maddie Ziegler. If you follow Ziegler, you get the effect automatically. In selfie mode, your face is covered in hearts, and you can tag someone you want to support, according to the blog post. If you switch to the rear camera, you see the word "kindness" in different languages.
Instagram didn’t immediately respond to a request for additional comment.