WhatsApp is popular worldwide and likes to advertise its own security through end-to-end encryption. But as research now reveals, there are still cases in which external parties can read messages.

WhatsApp is probably one of the best-known messenger apps. The Facebook subsidiary has around two billion monthly active users worldwide.

The Statista Global Consumer Survey in Germany also shows a clear picture. There, 90 percent of respondents answered “WhatsApp” to the question “Which messenger do you use regularly?”

Facebook Messenger takes second place with just 38 percent. It is followed by Skype, Telegram, Discord and Snapchat, each with less than 20 percent.

How secure is WhatsApp encryption really?

WhatsApp is popular and has created a seemingly secure space for private communication with its end-to-end encryption.

But parent company Facebook’s claim that no one can read WhatsApp messages isn’t entirely accurate, according to research by the U.S. magazine ProPublica.

When you open a chat on WhatsApp, a notice even appears that points out the encryption and is meant to reassure users once again that no one is reading along:

Messages and calls are end-to-end encrypted. No one outside of this chat, not even WhatsApp, can read or listen to them.

It’s true that your chats are encrypted in transit so that no one can intercept and read them. However, that is only half the truth.

External moderators read WhatsApp messages

If a chat is reported for offensive content, such as spam, political hate speech, fake news or child pornography, the reported messages reach WhatsApp in unencrypted form, ProPublica reports. An artificial intelligence system screens the content first, followed by a review by human content moderators.

These moderators are all external contractors, not directly employed by WhatsApp. According to ProPublica, they work from Texas, Dublin or Singapore and are employed by firms including the consultancy Accenture. Their starting pay is said to be around 14 euros an hour.

These external reviewers also see more than just the reported message. “WhatsApp does disclose that reporting releases the last few messages for review. But it does not say exactly how many there are,” writes Peter Elkind at ProPublica.

According to the magazine’s research, about five messages are released for the moderators to read.

WhatsApp discloses more than just messages

Every week, the external moderators review millions of pieces of WhatsApp content. That works out to more than 600 cases per employee every day.

The company declined to disclose how many third-party moderators work for WhatsApp. According to ProPublica, however, there are more than 1,000 at Accenture alone.

But for review, WhatsApp hands over not only the reported message and the four preceding it. It also includes extensive metadata such as name, profile picture, phone number, status message, phone battery level, language and time zone, and Wi-Fi signal strength, among other things.

Decisions about abusive sexual images, for example, can rest on an assessment of whether a naked child in a picture appears adolescent or prepubescent, made by comparing hip bones and pubic hair to a medical index chart.

Both the work instructions and the work itself are at times disturbing for the moderators. One moderator told ProPublica about a video in which a man with a machete held up what appeared to be a severed head.

The moderators had to watch it and decide whether it showed a “real corpse or a fake corpse”.

And all that under a workload that leaves them less than one minute per case.