An example of the impact of design choices and the evolution of habits around them.
When we post a link on Facebook, it shows us a preview of that link right in the feed, before we even click on it. Can we imagine what the world would be like today if Facebook had never implemented link previews?
This was a design decision Facebook made in 2008, when people went beyond posting just status updates. The goal was to make it easier to see what a link points to, so we could decide whether it was worth clicking.
Then over time we started to trust the headline, image, and description enough that we didn’t see much value in clicking the link itself. We also started to see many more links in our feeds, and we didn’t want to click on each one. Besides, browsing Facebook is always going to be faster if we never click on any links, especially within the apps. Clicking a link and waiting for a page to load slows down our dopamine wagon, and we don’t like to wait.
But when we stopped clicking on links, the producers of the content stopped getting traffic. So they started tweaking headlines, images, and descriptions to stand out more in our Facebook feeds. Facebook initially allowed a lot of customization, e.g. the headline in the feed preview didn’t even have to match the headline on the web page. This accelerated the attention war: distortion and clickbait were needed to counter our reluctance to click on links within the Facebook feed. Facebook also has a standing disincentive to help us get off Facebook, so deep down it never really wants us to click on a link, unless it’s sponsored and earns them revenue.
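The mechanics behind this are worth seeing concretely. A link preview is assembled from Open Graph meta tags that the publisher controls, and the `og:title` tag need not match the page’s actual `<title>`. Here is a minimal sketch, using only Python’s standard library, with a made-up sample page for illustration:

```python
from html.parser import HTMLParser

# Sample HTML is invented for illustration: note the og:title the feed
# would show differs from the sober <title> the page itself displays.
SAMPLE_PAGE = """
<html>
<head>
  <title>Quarterly Report: Modest Growth in Q3</title>
  <meta property="og:title" content="You Won't Believe These Q3 Numbers!">
  <meta property="og:description" content="Click to see the shocking results.">
  <meta property="og:image" content="https://example.com/dramatic.jpg">
</head>
<body>...</body>
</html>
"""

class PreviewExtractor(HTMLParser):
    """Collects the <title> text and the og:* meta tags from a page."""
    def __init__(self):
        super().__init__()
        self.page_title = ""
        self.og = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("property", "").startswith("og:"):
            self.og[attrs["property"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.page_title += data

parser = PreviewExtractor()
parser.feed(SAMPLE_PAGE)
print("Page title:  ", parser.page_title.strip())
print("Feed preview:", parser.og.get("og:title"))
```

Because the preview text lives entirely in metadata, a crawler building the feed card never needs to reconcile it with what a human visitor would actually see on the page.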
Over the last few years, we have stopped going to the sources of information directly. Everything comes curated through our social media feeds. But because of the added friction, the extra layer of representation, and the more time we now spend inside apps, we are engaging with websites in a fundamentally different way than we used to.
What if today, just for a day, the link previewer in our Facebook feed stopped working? Look at your feed. Imagine no website previews, just text links everywhere. Wouldn’t we automatically feel less stressed? Then we realize that’s why people like Instagram. No links ☺️
Facebook, in principle, has a very solid premise. An online social graph where every individual is a real person, subject to the same social norms and consequences as in the real world, is a social graph that is good at regulating itself. But they added non-people into this graph to generate revenue, and ended up with a highly profitable business model where a highly engaged audience of all humans on earth can be targeted and reached by a business.
All this is fine if the only evil is businesses trying to get us to buy things. But it got totally upended when the “business” is a foreign government and what it is trying to “sell” is chaos, polarization, divisiveness, voter self-suppression, and so on. So Facebook finds itself in the very unfortunate position where a tool built for a different purpose is being weaponized in a way that is beginning to destroy the world.
I was doing an experiment and was very happy to discover what they are doing now to address this problem. I am very critical of Facebook but I also want to give them credit where credit is due. And here I am very pleased by the effort and the sincerity behind it!
The ad I was trying to post is this:
Turns out I cannot just go and post a political ad. It is reviewed by a human and then gets flagged in the following way.
When I opt in to confirm my ID, it kicks off quite an onerous process.
I have to upload my driver’s license or passport, answer a bunch of SSN-related identity questions, and then Facebook scrubs my timeline to verify that I do indeed seem to live in the US, based on my entire history of posts and my social graph. Even then, it wants me to share a physical address to which it will send a letter with a code, to confirm I have physical access to that address. Then and only then can I place a political ad!
This step brings back some of the strengths of the original social graph. When identity is tied to a constraint, in this case to real physical people, social norms can be a powerful way to self-moderate. Of course there is always a flip side: we lose anonymity, which is a kind of freedom. These two things, it seems, will always be in tension.
Facebook is investing in features that are bound to reduce “engagement” as it was measured the old way. The old metric was “time spent”. Now the new metric is time spent “with intention, being inspired and feeling positive.”
Short-term investors saw this and decided it was a bad thing: as engagement goes down, so will revenue. And FB stock took a hit. But if we take the long view, we can see how this strategy is likely to pay off.
(FB’s revenue model was similarly questioned several years ago when users were moving from desktop to mobile. The fear then was that Facebook couldn’t show as many ads on mobile as it could on desktop. But what actually happened is that as the number of ads went down, the price of ads went up, because there was limited space to show them on mobile.)
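The supply-and-demand point is easy to see with a toy calculation. The numbers below are entirely made up; the only claim is the shape of the effect, that scarcer inventory can command a high enough auction price to offset the lost volume:

```python
# Invented numbers illustrating the desktop-to-mobile dynamic:
# fewer ad slots per session, but scarcity pushes the price per ad up.
desktop = {"slots_per_session": 8, "price_per_ad": 0.50}
mobile  = {"slots_per_session": 2, "price_per_ad": 2.25}

def revenue_per_session(inventory):
    """Revenue is simply slots shown times price per slot."""
    return inventory["slots_per_session"] * inventory["price_per_ad"]

print(revenue_per_session(desktop))  # 4.0
print(revenue_per_session(mobile))   # 4.5
```

A 4x drop in inventory paired with a 4.5x rise in price leaves revenue per session slightly higher, which is roughly what the mobile transition delivered.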
Facebook launched the Like button 18 months ago, and it has had a huge impact on how people browse and share information and form associations with other entities. Within days, websites had integrated Facebook’s social plugins, which made it super easy to feed content back to Facebook and share it with your friends in a frictionless way. Though Facebook started collecting information about every webpage you visited, as long as that page carried a social plugin, you still had to take the additional step of deciding whether something was worth sharing with your friends, or else they would never see it. Let’s take an example:
I visit the NYT webpage and read a couple of stories, say A and B. I decide that story A is worth sharing, hit the “Recommend” button, and it gets posted to my feed. My friend arrives on NYT and sees the headline for story B. He doesn’t know that I checked it out as well, but he is interested in it and even clicks on it, yet he never shares it either. Then a third friend is on NYT trying to decide what she should read. In this scenario, only story A would be recommended to her. The information about story B, and two friends interacting with it, has been lost.
Maybe it’s lost for a good reason: it probably wasn’t worth sharing. One could argue this keeps the signal-to-noise ratio high. But the best way to deal with information overload is to generate more information, not less. With enough training data, plus meta-information like time spent and other derived engagement metrics, it wouldn’t be too hard to use that lost information to come up with even better suggestions.
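The argument can be sketched as a simple scoring function. This is a hypothetical model with made-up weights, not Facebook’s actual algorithm: it ranks stories for the third friend by combining explicit shares with the implicit signals (clicks, dwell time) that would otherwise be discarded.

```python
from collections import defaultdict

# The NYT example from the text, as (friend, story, action, seconds spent).
# "share" is the explicit signal; "click" plus dwell time is the implicit
# signal that the old scheme would have lost. All weights are invented.
events = [
    ("me",      "A", "share", 120),
    ("me",      "B", "click",  90),
    ("friend1", "B", "click",  60),
]

ACTION_WEIGHT = {"share": 3.0, "click": 1.0}
TIME_WEIGHT = 0.01  # score per second spent on the page

def score_stories(events):
    """Score each story by weighted friend actions plus dwell time."""
    scores = defaultdict(float)
    for _friend, story, action, seconds in events:
        scores[story] += ACTION_WEIGHT[action] + TIME_WEIGHT * seconds
    return dict(scores)

scores = score_stories(events)
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

With shares alone, only story A would ever surface; once the implicit events are kept, story B accumulates evidence from two readers and can be suggested too, which is exactly the recovered information the paragraph describes.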