Working for a successful company like Facebook may sound like a dream come true, but Sarah Katz discovered that the reality was far less pleasant.
Sarah Katz, 27, worked as a content moderator for Facebook, reviewing up to 8,000 reports a day, including child pornography, hate speech and more.
She was hired through a contracting company to review content on Facebook and worked there for eight months in 2016. Her job was to identify posts that violated the company's community standards and block them.
Facebook employs more than 4,500 moderators like Sarah, and this year it plans to hire another 3,000 to cope with the dark side of the social network.
Sarah says she constantly came across horrific material and found the work stressful, as it demanded quick decisions.
In the first three months of 2018 alone, Facebook took action on 21 million posts containing nudity and sexual content, 3.4 million posts depicting violence, and millions more involving hate speech, terrorism and spam.
Even when signing their contracts, moderators are warned that they will encounter a great deal of disturbing content, up to and including child pornography.
Sarah says she is still haunted by disturbing images of children, along with some of the violent, bloody scenes she witnessed. She is glad she no longer does the job.