How Facebook is keeping social media safe

The Asian Age With Agency Inputs

Facebook is doubling the number of people working on its safety and security teams this year to 20,000.

More than 1.4 billion people use Facebook every day from all around the world. (AP Photo)

People all around the world use Facebook to connect with friends and family and openly discuss different ideas. But they will only share when they feel safe. That's why the company has devised clear rules about what is acceptable on Facebook and established processes for applying them. However, things don't always go right.

Earlier this week, a TV report on Channel 4 in the UK raised important questions about those policies and processes, including guidance given during training sessions in Dublin. Facebook claims that some of what was discussed in the program does not reflect its policies or values.

Facebook has been investigating exactly what happened in order to prevent these issues from recurring. For example, they immediately required all trainers in Dublin to undergo a re-training session, and are preparing to do the same globally. They also reviewed the policy questions and enforcement actions highlighted in the report and fixed the mistakes that were found.

More than 1.4 billion people use Facebook every day from all around the world. They post in dozens of different languages: everything from photos and status updates to live videos. Deciding what stays up and what comes down involves hard judgment calls on complex issues, from bullying and hate speech to terrorism and war crimes. That's why the Community Standards were developed with input from academics, NGOs and lawyers from around the world. They also took advice from human rights and free speech advocates, as well as counter-terrorism and child safety experts.

To help manage and review content, Facebook works with several companies across the globe. These teams review reports 24 hours a day, seven days a week, across all time zones and in dozens of languages. When needed, they escalate decisions to Facebook staff with deep subject matter and country expertise. For specific, highly problematic types of content such as child abuse, the final decisions are made by Facebook employees.

Reviewing reports quickly and accurately is essential to keeping people safe on Facebook. This is why the company is doubling the number of people working on its safety and security teams this year to 20,000. This includes over 7,500 content reviewers. Facebook also uses technology to assist in sending reports to reviewers with the right expertise, to cut out duplicate reports, and to help detect and remove terrorist propaganda and child sexual abuse images before they have even been reported.

Facebook says they are committed to getting it right for people and their friends. “Creating a safe environment where people from all over the world can share and connect is core to our long-term success,” they added.
