I suppose you could call it ironic. There was a story from a ‘friend’ on my Facebook news feed today called “Quitting the Like” all about escaping Facebook’s data collection processes by simply not “liking” items or comments you see.
Right below this ostensibly anti-Facebook story were three related links produced by one of the Facebook data-collection bots all about the same thing: breaking free from Facebook’s data mining. I suspect the FB programmers hadn’t planned it that way. But aside from the irony, it caused me to read them all.
The first story, fully titled “I Quit Liking Things On Facebook for Two Weeks. Here’s How It Changed My View of Humanity” is by blogger Elan Morgan who explains:
I no longer wanted to be as active a participant in teaching Facebook how to advertise to me as I had been in the past, but another and much larger issue was my real curiosity: how was my Facebook experience going to change once I stopped feeding its engine with likes?
Her conclusion is that by not feeding the bots through the automatic reflex of clicking “like” below a ‘friend’s’ (real or artificial) post or comment, and instead by writing something positive in a comment, you can increase the human interaction on FB and reduce the generated noise.
Quit the Like and experiment with amplifying a better signal. What will happen to your Facebook without your likes? What will happen to your perception of not only your Facebook friends but the world at large? What will happen to us?
That’s an interesting approach, one I had considered but never taken. It encourages me to try it, but I also wonder whether sharing a post generates the same sort of algorithmic reaction. Is a ‘share’ a formulaic equivalent to a ‘like’, or are they weighted in some manner? And I suspect the algorithms also track comment density and content: is replacing ‘like’ with a comment merely a cosmetic change, or does it have a significant effect?
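Nobody outside Facebook knows how these signals are actually weighted, but the general idea of a weighted engagement score is simple enough to sketch. Here is a purely hypothetical illustration – the weights, the signal names and the function are my invention, not anything Facebook has published:

```python
# Purely hypothetical weights for illustration only. Facebook's real
# formula, its inputs and its weights are not public.
HYPOTHETICAL_WEIGHTS = {
    "like": 1.0,     # a one-click signal
    "comment": 3.0,  # takes more effort, so perhaps weighted higher
    "share": 5.0,    # redistributes the post, so perhaps higher still
}

def engagement_score(signals):
    """Sum the weighted counts of each signal type for a post."""
    return sum(HYPOTHETICAL_WEIGHTS.get(kind, 0.0) * count
               for kind, count in signals.items())

# If comments and shares really are weighted above likes, then swapping
# a like for a comment would *raise* a post's score, not just change its
# tone - which would make the "quit the like" experiment less of an
# opt-out than it appears.
print(engagement_score({"like": 10}))               # 10.0
print(engagement_score({"like": 5, "comment": 5}))  # 20.0
```

If anything like this is going on behind the curtain, replacing likes with comments wouldn’t starve the engine at all; it would just feed it a richer signal.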
Gizmodo reported FB is doing some automatic liking for you, behind the curtain:
You might think clicking “Like” is the only way to stamp that public FB affirmation on something—you’re wrong. Facebook is checking your private messages and automatically liking things you talk about.
What I’d really like to know is how FB’s bots manage all the content, beyond the like button. Do they weigh keywords, track what we post, how we comment and share, and so on? I suspect they do, but how they parse the results is a fascinating, sometimes slightly scary mystery.
The second article, by Wired’s Mat Honan, is about the opposite experiment: clicking “like” on everything on your Facebook feed. It’s called “I Liked Everything I Saw on Facebook for Two Days. Here’s What It Did to Me.” Honan wrote:
…Facebook uses algorithms to decide what shows up in your feed. It isn’t just a parade of sequential updates from your friends and the things you’ve expressed an interest in. In 2014 the News Feed is a highly-curated presentation, delivered to you by a complicated formula based on the actions you take on the site, and across the web. I wanted to see how my Facebook experience would change if I constantly rewarded the robots making these decisions for me, if I continually said, “good job, robot, I like this.” I also decided I’d only do this on Facebook itself—trying to hit every Like button I came across on the open web would just be too daunting. But even when I kept the experiment to the site itself, the results were dramatic.
I hesitate to duplicate Honan’s experiment for the same reason I would not want to duplicate the “Supersize Me” diet or the “10-cans-of-cola-a-day” diet. The results would be unpleasant, unhealthy and possibly permanent. Honan discovered in a very short time that by ‘liking’ everything his newsfeed became dehumanized:
My News Feed took on an entirely new character in a surprisingly short amount of time. After checking in and liking a bunch of stuff over the course of an hour, there were no human beings in my feed anymore. It became about brands and messaging, rather than humans with messages.
Likewise, content mills rose to the top. Nearly my entire feed was given over to Upworthy and the Huffington Post. As I went to bed that first night and scrolled through my News Feed, the updates I saw were (in order): Huffington Post, Upworthy, Huffington Post, Upworthy, a Levi’s ad, Space.com, Huffington Post, Upworthy, The Verge, Huffington Post, Space.com, Upworthy, Space.com.
A response to Honan’s piece was posted by Matt Kruse on Social Fixer. Kruse comments on Honan’s findings and then offers several suggestions to humanize the Facebook experience. He also notes:
Facebook’s primary focus is increasing exposure of ads and paid content. This extreme experiment shows what Facebook does when you give it lots of data points – it finds a lot more paid content to show you! Brands win, not people. Under normal user activity, this is not so blatant and obvious. But it is definitely happening. Every data point you give Facebook is used by them to show you more advertising and sponsored content.
The third piece, by actress Felicia Day on her “Felicia’s Melange” blog, is a response to the latter story that describes her own emotional reaction to Facebook’s algorithms.
Day has a rather different experience from the rest of us. She is part of the celebrity hamster wheel and collects followers like my sweaters collect cat hair. She explains her frustration:
I have almost a million followers on FB. But I reach a very small percentage of people with every post on average. And this isn’t by my fans’ choice. Not a day goes by when I don’t see comments like, ‘Hey, why am I not getting your updates anymore?’ I’m helpless to explain exactly why, but I have a good inkling.
FB and Twitter use similar models to identify success: numbers of ‘friends’ or followers. But as Day has discovered, the sheer volume becomes unmanageable. Feeds move too fast for people to keep up, or for any real human interaction to occur. Having too many ‘friends’ is a dehumanizing experience.
Success by FB standards is failure by human ones. We are basically, genetically and biologically, inclined to encompass small family-tribe units, not big mobs. I am often overwhelmed by a feed from 500-plus ‘friends’ – replete as it is with the FB-generated content – and often miss content they post simply because I can’t watch the feed 24/7. I cannot imagine what it’s like to have a million. Humans cannot effectively manage that volume of content.
Day wrote about how FB took over the management of what she sees:
…the FB algorithm filters my content based on how many likes people make on the things I share. (Because closing people into interest pockets makes it easier to make money on them, I get it.)
Yes, money is at the root of it. Welcome to capitalism. Money’s at the root of everything. Don’t expect that to change within the Age of Mammals.
Because Facebook offers the service to users for free, it needs to make money elsewhere to pay for its servers, its employees, its shareholders. It generates its income by selling your information to advertisers. Some of that information comes from your “liking” content on your news feed. This helps advertisers target you because you’re telling them what matters to you as a consumer. Since this happens behind the scenes, so to speak, we don’t pay as much attention to the data mining as we should.*
But of course, FB isn’t alone in its data mining by a long shot. It’s just very good at it, in part because we let it. We willingly participate. But advertising is everywhere, including targeted ads, even when we’re just surfing. While researching this post I found a piece on “How to Quit Facebook.” Scattered across its page were ads generated by Google’s aggressive data-collection algorithms (your every search identifies you as much as your likes on FB).
If one of the reasons you want to opt out of FB is the advertising, you are going to be disappointed: many, many other sites use similar techniques to draw your attention (and money). I don’t want to quit Facebook, however: just modify my experience. Less noise, more relevant content. De-clutter my feed, replace some of the content-mill posts with human-made stuff.
I am not personally offended by advertising. I understand the economics: if I want the service free or at least subsidized, someone else has to pay for it. From decades of experience in the industry, I know the costs of publishing and printing. I know that magazines and newspapers are paid for by advertising, not by subscriptions. I know that the trade-off for discounted books on Amazon is their tracking my purchasing history and recommending books or products that fit my history as a consumer. We are all cogs in this machine.
Would you subscribe to FB (i.e. pay a monthly fee) if it was ad-free? Would you pay more for books on Amazon if they dropped the targeted recommendations and ads? Would you pay the real cost of a newspaper or a magazine if it was ad-free? I suspect few of us would. We like the economics: we get service, low prices, free stuff in exchange for an intangible asset: our data.
I am not even offended by the targeted advertising: if I must see ads, I would rather see those for things or services that at least come close to my personal interests. One of the reasons we quit cable TV was that the shotgun-approach advertising simply distracted (and bored) us too much. On the other hand, I frequently examine the ‘suggestions’ that pop up on Amazon because I know they will be at least somewhat relevant. And not surprisingly, I often buy from those suggestions.
Blogger George Jenkins also tried the not-liking experiment for a similar period (two weeks) and noticed several changes in his FB feed. He noticed that FB’s bots mined content from his comments instead of his missing likes, and when he stopped that, they delved into his profile:
In a life without “Likes,” it seems that Facebook will dig deeper into your profile and use data from it to display targeted ads. This seems consistent with the targeting options Facebook provides advertisers…
So not ‘liking’ a post or comment produces only a partial qualitative change, but not necessarily a better one. Jenkins concluded:
My experiment reinforced my view that Facebook isn’t really a social networking service. Why? First, there is the 12-percent delivery rate of your friends’ status messages. So, you can’t assume you’ve seen everything by your friends, nor that your friends have seen all of your posts. Not very social. Second, in a life without “Liking” things, as Facebook digs deeper into your profile to target ads, it becomes clear that the service is really a gigantic, worldwide advertising delivery and distribution system.
There’s a comment on Jenkins’ blog from an anonymous reader that asks,
What effect do those artificial realities have on us? How are those false realities changing us? What does that mean for our political culture? What does that mean for how we interact socially? For what we believe to be true? Does this help make us one people with a common set of shared fundamental beliefs? Or does it divide us into factions, where we only listen to and hear what pleases and what the masters of the matrices want us to hear to accomplish their ends? And do these matrices effect a bigotry of thought by obliging us to conform to the orthodoxy of our own insular group on social media? And what of destroying our individuality, freedom, and independence by compiling dossiers of our lives and using that information to subject us to government and/or corporate control and use? And once in the Matrix or matrices, how do you escape? Can you escape?
The fourth story is “How To Get Your Facebook Status To The Top Of All Your Friends’ News Feeds.” Caleb Garling, writing in The Atlantic, wanted to fool Facebook’s algorithms by playing games with them and multiplying the “like” factor to get his content to move up the chain in his friends’ feeds:
I wanted to see if I could trick Facebook into believing I’d had one of those big life updates that always hang out at the top of the feed. People tend to word those things roughly the same way and Facebook does smart things with pattern matching and sentiment analysis. Let’s see if I can fabricate some social love
In the end, while his experiment was something of a success, Garling wasn’t satisfied with the result: although he tricked FB’s bots into elevating his post, he wasn’t able to fully understand the “why” of it:
Deciphering the exact levers and pulleys in Facebook’s algorithms is impossible from the outside. The best you’ll ever get in trying to experiment with successful posts on Facebook is feeble correlation. Cause—the software code—is under lock and key. We don’t know why some posts go up or down other than what we can surmise. Like Google with search, tech companies don’t share their secret sauce.
FB and all social media exploit a very basic human behaviour: sharing. That’s why they are so successful and popular. When we see a ‘like’ button we click it as an emotional response to the poster, to our friends, and our family. To let them know we appreciate their time and effort to post something. To say we like them. We think we’re communicating, connecting with the person, engaging in simple human pleasantries, bonding – but we’re also feeding the data miners.
Sometimes online I feel like a small bit of data krill in the mouth of a very large, very hungry baleen whale.
* There’s a sinister side to “like farming” too, as CNN reported a year ago, as scammers and cons work the system, like remora fish or parasites attached to sharks:
Those waves of saccharin-sweet posts that sometimes fill your news feed may seem harmless. But all too often, they’re being used for nefarious purposes. At best, a complete stranger may be using the photos to stroke their own ego. At worst, experts say, scammers and spammers are using Facebook, often against the site’s rules, to make some easy cash.
And they’re willing to play on the good intentions of Facebook users to do it.
Often… Facebook pages are created with the sole purpose of spreading viral content that will get lots of likes and shares.
Once the page creators have piled up hundreds of thousands of likes and shares, they’ll strip the page and promote something else, like products that they get a commission for selling. Or, they may turn around and sell the page through black-market websites to someone who does the same.
It’s a way to trick Facebook’s algorithm, which is designed to give more value to popular pages than the ones, like scams and spam, that pop up overnight.
Similarly, CNET posted a story in mid-2013 about FB ‘like’ scams:
Begging for Facebook “likes” has become epidemic. “If I get a million likes I’ll be cured of my terminal disease and I’ll be able to implement my sure-fire plan for world peace!”
“If you don’t like this picture you hate your mother, America, and apple pie.”
Scammers prey on Facebook users’ propensity to respond emotionally by clicking “Like” when an image or plea tugs at their heart strings or piques their ire.
Scam sites offer to sell you likes clicked by real-live humans. The buyers intend to convert the clicks into traffic for their Facebook page, which translates to increased ad revenue. Several such sites I visited appear to be owned by the same anonymous party and are registered in Panama.
(To add to the sardonic nature: there’s a FB page called “Stop Liking Everything” – stupidly written in all-caps like irrelevant content always is – that has 1,000 likes…)