These days I have been pondering how useless Facebook has become over time, and by a bit of serendipity I came across this article from The Atlantic. Facebook has become more of a cesspool over time, and I can say that in part because I have been using the site since the days when you needed a .edu e-mail address to get an account. Those days are long gone, and odds are newer users think the .edu address requirement is an urban legend, but I assure you it is not. I wanted to jot down some thoughts about using Facebook and my experience, and to comment on some quotes from the article. Basically, a group of researchers investigated Facebook and its users, and they found what some of us have suspected for a while: yes, the place is awful, and it is due in good part to the assholes that Facebook freely enables to benefit its bottom line.
So, for starters, who are those assholes?
"They’re superusers. And because Facebook’s algorithm rewards engagement, these superusers have enormous influence over which posts are seen first in other users’ feeds, and which are never seen at all. Even more shocking is just how nasty most of these hyper-influential users are. The most abusive people on Facebook, it turns out, are given the most power to shape what Facebook is."
A big part of the problem is those superusers, who are basically assholes who get a lot of leeway and license from Facebook because they bring in engagement and eyeballs, which helps Facebook's bottom line. It is clear Facebook has no interest in policing any of them for exactly that reason (that was the researchers' conclusion too, by the way, though they phrase it a lot more nicely than I do; keep reading).
The article also discusses how Facebook ranks content using a metric of its own creation, MSI (Meaningful Social Interactions), which assigns points to certain interactions you make: a like, a comment, an emoji reply, and so on. They tweak how many points an interaction earns based, once again, in part on their bottom line, on what would be favorable to them to keep that engagement going. The article does mention that Zuckerberg, Facebook's founder and boss, at times opposed positive changes because he did not want to "mess" with the MSI.
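Just to make the mechanics concrete, here is a minimal sketch of what an MSI-style score might look like. The interaction types come from the article; the point values are hypothetical placeholders, since Facebook does not publish the real weights and, per the article, tweaks them over time.

```python
# Minimal sketch of MSI-style scoring. Interaction types are from the
# article; the weights below are hypothetical placeholders, NOT
# Facebook's real (unpublished, frequently tweaked) values.

MSI_WEIGHTS = {
    "like": 1,
    "emoji_reaction": 2,
    "comment": 5,
    "reply": 5,
    "reshare": 10,
}

def msi_score(interactions: dict[str, int]) -> int:
    """Sum the weighted interaction counts for a single post."""
    return sum(MSI_WEIGHTS.get(kind, 0) * count
               for kind, count in interactions.items())

# Example: a post with 120 likes, 30 comments, and 4 reshares.
post = {"like": 120, "comment": 30, "reshare": 4}
print(msi_score(post))  # 120*1 + 30*5 + 4*10 = 310
```

The point being: every reaction scores something, so whatever provokes a reaction, including outrage, racks up points.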
Defining those power users:
"So who are these people? To answer that question, we looked at a random sample of 30,000 users, out of the more than 52 million users we observed participating on these pages and public groups. We focused on the most active 300 by total interactions, those in the top percentile in their total likes, shares, reactions, comments, and group posts. Our review of these accounts, based on their public profile information and pictures, shows that these top users skew white, older, and—especially among abusive users—male. Under-30 users are largely absent."
When some young people joke that Facebook has become the hangout of grandpas and grandmas, they are not far off the mark. Not only has Facebook moved to an older demographic, but it has moved to a fairly homogeneous demographic. And to make it worse:
"Of the 219 accounts with at least 25 public comments, 68 percent spread misinformation, reposted in spammy ways, published comments that were racist or sexist or anti-Semitic or anti-gay, wished violence on their perceived enemies, or, in most cases, several of the above. Even with a 6 percent margin of error, it is clear that a supermajority of the most active users are toxic."
Now, I make use of every option to limit and filter my Facebook feed. I have my personal profile set to private, so you cannot see it unless you are logged in, and even then you have to be on my friends list. Beyond that, I further limit who can see my content: I have a custom privacy rule so that certain people do not see what I post. A big reason for that is that I have family and friends all over the political spectrum, from very liberal to toxic right wingers. I have unfriended some of the more toxic folks, but once in a while there is that one aunt or such who would notice being defriended, so filtering them out, so that I do not see their stuff and they just see a mostly blank page and assume I do not post much, is a more peaceful option than going full nuclear.

My point in explaining all this is that I am mostly spared the extreme toxicity many users experience, and that is because I have put in significant effort with filters and other privacy options to avoid it. Facebook does offer some privacy and filtering options, but you have to know where they are and how to use them. That takes effort and work, effort that most users are either unable (they do not know how) or unwilling (it takes too much work) to put in. Additionally, I use third party filter software to customize how the Facebook interface looks and to remove some of the annoyances they have added over time, but that last part is another story.
So how bad are those superusers?
"Top users pushed a dizzying and contradictory flood of misinformation. Many asserted that COVID does not exist, that the pandemic is a form of planned mass murder, or that it is an elaborate plot to microchip the population through “the killer vaccination program” of Bill Gates. Over and over, these users declared that vaccines kill, that masks make you sick, and that hydroxychloroquine and zinc fix everything. The misinformation we encountered wasn’t all about COVID-19—lies about mass voter fraud appeared in more than 1,000 comments.
Racist, sexist, anti-Semitic, and anti-immigrant comments appeared constantly. Female Democratic politicians—Black ones especially—were repeatedly addressed as “bitch” and worse. Name-calling and dehumanizing language about political figures was pervasive, as were QAnon-style beliefs that the world is run by a secret cabal of international child sex traffickers."

Again, this is why I make the effort to filter and restrict certain things. While my measures are not perfect, I get to skip a lot of the above. I will add that users on my friends list who did any of the above quickly got defriended. I do not care who they were or how long I knew them; once they went down the dark path of intentional, willfully ignorant misinformation and disinformation, I cut them off. I do look over my feeds and clean them up every so often, so if someone suddenly decides to go antivax, for example, they are gone and dead to me. I may be a librarian and educator, but even I know some people cannot be saved. I do not engage such people, and I do not try to "correct" them with facts; it is not worth the effort. You cannot help those who refuse to be helped, and you definitely cannot help those who have decided to be your enemies, so out they go. These days I value my mental health more.
On algorithms:
"Recommendation algorithms change over time, and Facebook is notoriously secretive about its inner workings. Our research captures an important but still-limited snapshot of the platform. But so long as user engagement remains the most important ingredient in how Facebook recommends content, it will continue to give its worst users the most influence. And if things are this bad in the United States, where Facebook’s moderation efforts are most active, they are likely much worse everywhere else."
Speaking of their algorithms, this is another reason you may want to check your feeds and weed them once in a while. Things like the groups, organizations, and pages you like get put in one big giant list (you can check this list on your profile), and Facebook decides in part what to put in your general feed based on those things. I recently did some weeding and removed a bunch of stuff, and suddenly the feed got leaner and a lot of the junk disappeared. That is, I suppose, another lesson for users: you need to actively work to filter your feed to get better content and to remove, or at least mute, the toxic stuff. I will add that Facebook does make an effort to make it more difficult to filter and block things you may not want. After all, it boils down to what serves their bottom line, and engagement, even if it is you making an angry comment on some toxic fuckbagel's post, is good for their business. You ignoring it and then blocking it may be better for you, but it is bad business for Facebook.
And in conclusion, the article's author writes:
"Allowing a small set of people who behave horribly to dominate the platform is Facebook’s choice, not an inevitability. If each of Facebook’s 15,000 U.S. moderators aggressively reviewed several dozen of the most active users and permanently removed those guilty of repeated violations, abuse on Facebook would drop drastically within days. But so would overall user engagement."As I said at the beginning, this is the kind of thing some of us suspected, but now we have some tangible proof.
Now, a couple of my readers may be asking why I do not leave Facebook. Believe me, I would like to. However, there are reasons that keep me there. One, I have family and friends who, for whatever reason, would not move to an alternative platform. Facebook is how they stay in touch, check out each other's family photos, the cute kids, and so on. If I left, I would lose that. Two, at least on this campus, the students basically live on Facebook. Our library has a Facebook page, and we get some decent engagement from the campus community because of it. In addition, I am a member of some campus groups, and if I as a librarian need to make a library announcement, or share content from our Facebook feed, our library blog, or other material, posting about it in the campus groups works a lot better than sending a campus-wide email. Let us be blunt here: students are bombarded with so much campus email that they soon learn to ignore it. Some even create email rules to toss certain campus emails into their junk folder. But they do pay attention to their Facebook. Even knowing how bad and toxic Facebook can be, they stick with it, so in my line of work I need to stick with it too. I hate the idea, but the library is not about to shut our page down and leave Facebook. Again, the damn engagement comes into play.
On a side note, it may be interesting to investigate just how much our students on campus know, or do not know, about Facebook's darker side. That would go along with how much attention they pay to the news. But that is a thought for another time.
Citation for the article: Matthew Hindman, "Facebook Has a Superuser-Supremacy Problem," The Atlantic, February 10, 2022.
A small note on the citation. It appears this is an online-only article. I checked the print issue (it would be the January/February 2022 issue), and it is not there. I also checked our online databases, since we do have online access to The Atlantic up to the present (i.e., we get the latest issue), and the article is not in those either. So I am speculating that the vendor's package does not provide articles like this one to the library. This can be an issue for me if I get someone at the reference desk asking for the article: in theory we have the magazine, but in practice we do not (or we do not have all of it). How did I read it, then? Well, the magazine has a limited paywall, i.e., it gives you some free articles before it asks you to pay, and, if need be, I know a trick or two to get around that which I do not discuss in polite company.