106 Comments

Our "elite" universities like Brown are major sources of toxic ideas and dialogue: https://yuribezmenov.substack.com/p/how-to-rank-the-top-npc-universities

Jul 24, 2022 · Liked by Nikita Petrov

When I see Tucker Carlson pushed into my YouTube recommendations I start to worry, “what am I watching?” The algorithms are all about doubling down. You get an exponential effect that leads to more radical views. We need our institutions to show leadership in mixing people of different views and counteracting the polarizing effects of social media. But instead they just pick a side and add to the effect. Someone goes through DEIA training during the day, at their work. It pisses them off. They go home and start watching YouTube videos about why it’s misguided. Those videos just lead them farther and farther down the rabbit hole. Or the opposite happens and they go down the rabbit hole of SJW videos. Either way, it’s not good.


In 2018, Friend A posted a story on Facebook about how the mayor of a city wanted to stop their EMTs from answering calls for people who were overdosing. Apparently people were using in public places because if they ODed, the chances were high that someone would call for help, whereas if they were alone at home they would most likely not make it. The justification was that there were so many calls involving ODs that people having any other medical issue were seeing long EMT response times. The thinking was that ODs should be further down on the response list.

Friend B posted about how horrible that was and how terrible for this mayor to have no sympathy for these drug addicts.

Fast forward to 2020. Friend B is all over FB talking about how if you’re out of the house and getting Covid you are taking a bed in the hospital from someone who really needs it. Selfish, Trump-supporter, killing people, irresponsible, etc. and so on.

That’s what these algorithms are doing.


I think much of today’s social unrest is directly attributable to these algorithms that reinforce rather than educate, that inflame rather than inform. The process drives eyeballs, which drives clicks, which drives advertising revenues. It’s central to the business model of firms like Twitter and Facebook. But people like it because it’s “free.” How many of us are willing to pay to subscribe to a social media site that doesn’t do this? Perhaps the old adage is true: you get what you pay for.


A modest proposal: put down the phone, get off your ass, and actually talk to people you otherwise wouldn’t. Sure, you may have to screw up your courage, but it’s all for the best. I believe if you do this, social media becomes useful, but not primary. As a bonus, you can pick out the race baiters easily!

Jul 24, 2022 · edited Jul 24, 2022 · Liked by Nikita Petrov

Disclosure: I am a software engineer building online advertising systems.

// How can we gauge the true intensity of the culture war—and find a way to dial it down—when algorithms and other forms of AI generate profit by amplifying it?

You are fed what you engage with. There are millions of people whose only experience of [insert media platform] is makeup tutorials, sports, etc. Most platforms expose tools for you to curate your feed. YouTube/Facebook et al. have little buttons that say "I'm not interested in this" - those buttons _work_. You can also uninstall or block the apps, and yes, it is possible; people do it all the time.

In general, talk of this manner surrenders way too much agency to the platforms. The main driver of the content you engage with is _you_.

// Is online interaction giving us a clear picture of who thinks and feels what?

No. Mostly it's you looking at your own reflection.

// does our very participation online leave our thoughts and feelings vulnerable to manipulation?

Yes, but so do most things. If you read the NYT, you are being manipulated. Platforms want to maximize time-on-site, and that is a different flavor of manipulation, but it's not obvious which is worse. This probably varies case-by-case.

EDIT:

// social media companies need to be more transparent about how their algorithms work. We need to understand how they are curating our content and why they are making the choices they are making.

I make algorithms for a living. With a few narrow exceptions, "how" or "why" are not questions that can be answered in a way that people would find satisfying. We have a few options:

1) I can point you to the formal mathematical specification in any number of textbooks. Read them, and you will know one kind of "how".

2) I can show you the ten billion numbers that constitute it, for a second kind of "how".

3) I can say "why" in that X content maximized the probability of you generating money for the platform.

4) Sometimes (but more often not) I can say "why" in that the model considers certain things "important". But in what way it's important I (typically) can't say.

Imagine you have a brain in a jar, and you ask me why the brain did Y instead of Z. I answer by showing you a video of all the neurons activating at the time the decision was made. This is a bad example for various reasons, but it illustrates that "why/how" are not really applicable questions.

The reason we can't answer questions like this is not a conspiracy of silence on the part of tech companies - it's that nobody has an answer humans can comprehend, let alone accept.
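To make option 3 a little more concrete, here is a toy sketch of what "X content maximized the probability of you generating money for the platform" can look like in code. This is entirely my own illustration, not any platform's actual system: the Post fields, the weights, and the predicted_engagement / rank_feed functions are all invented for the example; a real model has billions of learned parameters instead of three hand-picked ones.

```python
# Hypothetical illustration only - not any real platform's ranking code.
from dataclasses import dataclass
import math


@dataclass
class Post:
    post_id: str
    # Invented features; a real system would learn these from your history.
    topic_affinity: float    # how closely the topic matches what you click on
    outrage_score: float     # how emotionally charged the content is
    advertiser_value: float  # expected ad revenue if you keep scrolling


def predicted_engagement(post: Post, user_click_bias: float = 0.5) -> float:
    """Stand-in for the 'ten billion numbers': a logistic score estimating
    the probability that you interact with this post."""
    logit = 2.0 * post.topic_affinity + 1.5 * post.outrage_score + user_click_bias - 2.0
    return 1.0 / (1.0 + math.exp(-logit))


def rank_feed(candidates: list[Post]) -> list[Post]:
    # "Why did I see X?" -> because this product was the largest.
    return sorted(
        candidates,
        key=lambda p: predicted_engagement(p) * p.advertiser_value,
        reverse=True,
    )


if __name__ == "__main__":
    feed = rank_feed([
        Post("makeup-tutorial", 0.9, 0.1, 0.3),
        Post("culture-war-clip", 0.6, 0.9, 0.8),
        Post("local-news", 0.4, 0.2, 0.2),
    ])
    for post in feed:
        print(post.post_id, round(predicted_engagement(post), 3))
```

And that is the whole honest answer to "why did I see this post?": it had the largest score in that sorted() call, which is exactly the kind of answer people don't find satisfying.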

Jul 24, 2022 · Liked by Nikita Petrov

Sea Sentry, SJW is "social justice warrior." Another old adage is that the voters get what they deserve, good and hard. You will not be able to overturn the profit motive, so the algorithms will continue, and the political parties will leverage the anger to motivate their base. But politics ebbs and flows, and the movement to the left has engendered the current political support for the right and pushback in areas like education. Remember that the politicians are arguing about symptoms; the answer we need is to address the root cause, which is the destruction of the family. Addressing that is the discussion we need.


What's worse, algorithms that reinforce existing biases or algorithms that say WE just can't allow you to read or see stuff because we think it will get you to consider things WE don't think you should consider?


I respect the contributor's perspective. (I respect Glenn's intro more, tho'.)

"especially for people of color"

I have a problem with this. As a people, of course "we" are more likely to click on topics that touch on racial issues. But that's been true all my life. I cannot say that social media algorithms are making things worse per se. And I certainly do not think that "our" issues with this technology are particularly more serious or troubling than those of other groups. Frankly, it depends on what groups we are talking about.

Zeroing in on *racial* groups obscures the picture.

All that said, please don't get me wrong. This topic is both critical and fundamental. But we need to delve much deeper to truly understand this data.

People can have a huge interest in something, click on relevant links all day and still not be imprisoned by any kind of echo chamber. (Trust me. It's not hard to do =))

Jul 24, 2022 · edited Jul 24, 2022

Racial hatred (and the stoking and cultivation thereof) is a massive business and revenue stream in America.

As we all know, the MSM treats the latest possible white-cop-on-black-victim incident like a movie studio treats a summer blockbuster: wall-to-wall coverage, flashing lights and colors, everyone dedicated to the same task of pouring as much gasoline as possible onto a social fire. (And any inconvenient facts or context ignored or buried.)

Of course politicians exist to keep their voters addled with so much hatred and fear of the Other over there, the goal being a malleable, gullible populace that gladly gives up its liberty for a spurious safety (which is really just a desire to punch the other team on the nose).

And then more recently, academia has whole enormous bureaucracies dedicated to dividing us all by skin color and attaching moral value based on our official Oppressor/Oppressed points total. (Academia being Patient Zero of our perpetually occurring Race War, the heart of the White Guilt Industrial Complex, constantly pumping out 10,000 theories that all say the same thing: Everything is Oppression, which can only be cured by unlimited Leftist social engineering.)

And lastly, there is an entire cottage industry of websites, blogs, brands, Tweeters, journalists etc that have their customers hooked on the Race War just like a dealer gets his customer hooked on crack: for the Left, they pay to be fed constant tales of the evil bigotry of their blood enemies, the Deplorables, with reminders that they are the Good ones because they know to always "center the marginalized" and never forget to capitalize Black; for the right, they pay to be fed constant tales of black dysfunction and criminality, with reminders that they are the Good ones because they raise patriotic law-abiding children.

We like to believe that Love Conquers All (Love Wins!) but the truth is that no one can really love a stranger (except as an abstraction) but you can easily and happily hate one (or millions). Hate gets the blood flowing and makes us jump out of bed in the morning ready for another day fighting the imaginary enemy—we all love to hate and our corporate hate dealers are more than happy to have us hooked, demanding larger and larger doses.

And hey, in the worst-case scenario, if some kind of race war breaks out, think of how good that will be for business!

Jul 24, 2022 · edited Jul 24, 2022

Dr. Glenn writes, "...because the United States has a long history of race-based discrimination, which means that people of color are more likely to have mistrust and suspicion towards people who don't share their racial background."

Andrew Sullivan analyzed the same issue. I call attention to

https://andrewsullivan.substack.com/p/why-is-wokeness-winning

Why Is Wokeness Winning? Andrew Sullivan Oct 16, 2020.

Sullivan observes, "advocates of what Wes Yang has called “the successor ideology” never debate any serious opponents of their position. This is because debate in a liberal society implies equal standing for both sides, and uses reason to determine who’s right or wrong. But there can be no “both sides” within CRT, no equation of “racists” and “antiracists”, and debates are inherently oppressive. Logic, evidence, and reason are...mere products of white supremacy, forms of violence against the oppressed." [end paste from Sullivan article]

For more than four decades, I have tried a reasoned approach, but against an emotional crowd, there is no contest. Net-net, I think the situation remains hopeless. In substance, I agree with Sullivan. Emotion and tribalism are too difficult to overcome. Still listening to those who disagree with my sad conclusion, but honestly, I see little hope.


Thanks for sharing this insightful letter, Glenn. Some good ideas presented.

My personal strategy emerged from a life-long response to advertising in general. At some point in my early 20s, I decided to ignore as much advertising as possible. Not to watch commercials on TV. Ignore billboards. Ignore where the local McDonalds was because I didn’t like being told I needed a Big Mac. After spending my childhood in the 60s watching a lot of TV, I removed TV from my life for a few decades. I now only watch cable TV when I can forward through commercials. I turn off the car radio when an ad comes on. I find most advertising aesthetically unappealing and ugly, so this “diet” from advertisers was partly about what I enjoy in life. But I also realized at that time that, like all humans, I am not immune to manipulation from people wanting to sell me stuff. We can ALL be conned by a good carney. I wanted to do what I could to safeguard myself from parting with my money for junk that wouldn’t, in fact, make me happier, wealthier, or more beautiful. When the internet emerged, I gave myself the same rules, to ignore everything that was “suggested” to me…now that I had become the product myself. And now that ideas and harmful ideologies are what is being sold to me as much as stuff.

Of course, I cannot help seeing the suggestions in my peripheral vision and sometimes they make me laugh. Occasionally, they get something right about my health that I do find to be useful. But except in those rare instances, I almost NEVER click. So the Algorithm Gods are confused about me! “It seems like she would like Tucker Carlson, but she never clicks on it! It seems like she would like to read the latest tweet from AOC, but she doesn’t click on that either!! But let’s keep trying!” The Algorithm Gods even at times seem to think I’m a man in my interests and try to get me interested in porn. Which really is flabbergasting and annoying. The Algorithm Gods have plenty of info on my interest in women’s rights, but they don’t make helpful suggestions about this genuine interest. Maybe because the Algorithm Gods don’t want anyone to care about women’s rights.

I am not a product. I cannot be neatly categorized, especially since my awakening to the dangers of woke ideologies in the past few years. So I just try to ignore the Algorithm Gods as much as I can.


Facebook is the only social media platform I (reluctantly, occasionally) use, other than YouTube. Other than, I guess, providing a means for an old friend or classmate to contact me, it’s strictly for volunteering and some activism (basic info about when/where/what). Often I’ll only see a post if FB prompts me via an email: “so and so posted _____!” Even though these emails are annoying, I’ll occasionally see an event in time that I wouldn’t otherwise. It’s far better than randomly going on and mindlessly scrolling, which, from the beginning, made me feel spiritually half-sick.

But even the prompts to see what a friend posted can cause trouble, because their online personae and obsessions are not the person I know and like. Facebook, and other platforms I’m sure, take people with certain biases and tendencies and vulnerabilities in real life, characteristics and attitudes which, if they emerge in person, are at worst balanced and grounded by all of the other funny and absurd and compelling things people who actually know and like each other focus on when they get together, and amplify them. Online, in these intense echo chambers the very online inhabit, they post like, by turns, smug or brittle, clout-chasing, posturing, pandering, aggressively toxic fanatics, imbued with such rage-fueled, ignorant, shallow faux-certainty that they will take your head off and hold it up before their followers if you so much as slightly, mildly disagree.

It’s as if you’re interrupting a performance on stage. You can clap and cheer adoringly and praise them for the brave vehemence of pushing even further what are, in their online clique, already the most-approved opinions. But even accidentally introducing a bit of cognitive dissonance is treated like standing up in the audience with a loudspeaker and hijacking the star performer’s show.

I wish I had never seen some of these friends posting online. I had one turn on me viciously for daring to suggest that cooperating with tech and media oligopolies, and with the most historically checkered of federal law enforcement and spy agencies, to quash the speech and even the ability of political opponents to participate more broadly in civil society was not only illiberal in principle but very risky empirically for the actually independent and marginalized and unpopular but important voices they might support. I was algorithmically nudged into seeing another friend post hysterically and credulously in response to the then-supposed racist attack on Jussie Smollett: “no justice, no peace!”

Little of either of their developing SJW identities and increasingly unhinged assumptions and conclusions, which they posted and reposted and shared on and on (especially how outraged they are that they know their black friends can’t so much as walk outside without a realistic chance of being met by a hail of gunfire from racist fascist cops, and you’re as horrible a person as has ever walked the earth if you quibble or question this even slightly), came out in one-on-one interactions, especially in person. No doubt Covid lockdowns (which of course they vehemently supported, angrily judging critics) and even more time at home, online, made this worse.

In person, when I’m around a group of people whom I know share more typically uniform progressive or even identitarian views, comments are always milder, if they even come up. People seem more attuned to the possibility that they might not know everything, or that others might disagree somewhat while still being good people.


This is yet another "Facebook is evil and should be dismantled immediately" post :)

The flip side of this is that even if people are not reading or sharing a lot about race, they know who the people are who already agree with them and make sure to amplify those people's content to show what side they are on.


Interesting perspective. I wonder if this is a bit more complicated, tho. Do algorithms need to change? Probably, but people also need to think through what they are clicking and, perhaps through their own experience, sort out whether what they are reading makes sense or not.

Computers are never going to be here to protect us from ourselves, not in any good way anyway!


This sounds like a discussion I was in yesterday about the evolution of autistic traits in youth who were not autistic, by traits or by diagnosis, in childhood, but who happened to have been born between 1996 and 2016. "The Social Dilemma" (documentary) and "The Coddling of the American Mind" (book) both allude to some of the key underlying mechanisms of the 'algorithmia' that is now infecting an entire generation. We see it in medical patients, in college students, in educators, in institutions captured and capsized by ideologies...
