Signal, the messaging platform that pitches privacy as its unique selling proposition, recently took aim at Facebook with a very interesting idea.
We all know that Facebook collects a treasure trove of data about every user across its many online properties, and it makes money off this data. Facebook has a system that allows paying clients to use this data – so if a client wants to target people interested in ‘parenting’ as a topic, they can select that audience cluster on Facebook’s system and their ad will be beamed to those people.
But can you tell those people that they have been selected because they previously showed interest in that topic? After all, Facebook selected them for precisely this reason, so you’d presume you could inform them.
Apparently, you cannot. Or at least, an advertising brand cannot. Signal figured this out through a subversive campaign idea that got booted off Instagram!
Here are the creatives they presented to Facebook!
Let me try a rudimentary offline equivalent: you have a powerful pair of binoculars. You figure out that the apartment complex in front of your house has a LOT of open windows (the windows are kept open all through the day, voluntarily). Incidentally, the residents’ association of that complex happens to hire you to keep a watch on their complex.
You start observing the people in those flats through your binoculars and make comprehensive notes on what each person is doing, the backdrop in their homes, and so on.
You then start selling that information to the shop owners nearby so that they can sell products from their stores based on it. One such shop wants to inform the residents of Flat No. 102 that it is reaching out to them because your binocular-peeping showed that they seemed very tired after using a conventional vacuum cleaner, and that the shop has a good collection of automatic vacuum cleaners. But you tell the shop that it can only sell the vacuum cleaners; it cannot tell the flat’s residents why and how they were selected 🙂
There is another angle to this theme.
Facebook recently banned a web browser extension called Ad Observer, created by researchers at New York University. The idea behind the extension was that people would voluntarily install it and anonymously share select data with the researchers, including the categories advertisers chose when targeting them! This goes right back to the Signal experiment!
The research project’s objective was to show, with proof, that Facebook is not doing enough to prevent misinformation and disinformation, and is instead siding with paying advertisers – either without any rules or by flouting its own.
Facebook’s logic for blocking access to this tool was to cite a violation of its terms of service and to claim that the research tool was compromising user privacy! Facebook also said that it had no choice but to shut the project down because of an agreement it had with the Federal Trade Commission (FTC).
In fact, Ad Observer collects information only about advertisers, and the FTC has stated that this research does not violate its consent decree with Facebook!
Taken together, the Signal episode and the NYU ban suggest that Facebook believes people’s data can be used only through its own channels. Even when others merely want to communicate back to people how their data is being used, that is something Facebook cannot deal with!
If I talk about vacuum cleaners often and I see ads for the product on Instagram, I’d probably understand the context of why I’m seeing those ads. But if I talk about one political party or leader and I see persistent ads for another political party or leader, I won’t be able to understand the context, since the logic is inverted. Yet that is the logic that benefits the advertiser (the second political party or leader) and is obviously annoying to me as a user.
From a transparency point of view, these actions severely undermine Facebook’s position in its privacy war against Apple.