"We tend to value speed, certainty, and comfort—at a deep, subconscious, and very nearly fixed level."
This feels biologically-correct and biologically-influenced-at-its-core. Historical survival of our species (I think) required our ability to quickly process things to determine what was good-vs-bad or safe-vs-dangerous or unthreatening-vs-threatening, but with a very limited time horizon! Whether something was safe-now-but-dangerous-3-years-from-now, or bad-now-but-good-3-years-from-now, mattered much less than what it was right now. It does feel natural that we'd be hard-wired for finding comfort in things that come to us quickly with a representation of certainty (even if not correct).
To your final section / conclusion: humans have always been fearful of and slow to change. My strong prior has always been that what we're seeing today is no different from what we saw historically. However, it does seem to be that the rate of technological change is starting to creep up against the rate of human capacity to change, and/or the rate of change for the human capacity to collaborate and collectively adapt to the rate of technological change. Maybe? I dunno. I'm less pessimistic / fearful than you on this, it seems.
Yeah, I agree that the core of it is just the speed mismatch between very fast things (technology), medium-fast things (culture), and very slow things (deep human wiring).
While I'm generally a long-term optimist, the specific danger I'm concerned about here is just the market dynamics. Writers/creators can make much easier and stronger incomes by exploiting this mismatch (with varying levels of intent and awareness). Given that AI (at both a generative and algorithmic level) is going to massively favor those most agnostic about the truth-value of their content, I'm just worried that we don't really have any wide-scale immune-system defenses left to hold up against what's coming.
While I'm skeptical that writing was ever wildly truth-seeking or truth-constrained on average, there were at least some safeguards that kept supply at a higher quality than demand alone would have accepted. Editors and agents had fundamentally paternalistic instincts, each with some bar that a piece of content had to pass before it could get distribution. While these standards were always uneven and loosely enforced, at least they were there. With the advent of social feeds, we've seen creators emerge who don't really have any higher aspirations than producing whatever wins attention. Given that attention is finite, AI can help the most ambitious eat up more and more market share.
The insidious thing imo is that *what we consume also reshapes what we demand*. When an Andrew Tate gets in front of a boy at 12, what they seek out for the rest of their teen years shifts. Same for what they allow in. While teachers and parents can theoretically give those kids tools along the way that help them parse content more critically, I think the broad evidence is against this working much. And if the main thing rate-limiting the Andrew Tates of the world was how much content they could produce, reducing the time and cost required towards zero is going to have some unpleasant consequences.
Will we develop new systems to combat this? Maybe! It's certainly something I've been thinking a lot about, and I know many others are too. But it's very hard to fight against natural market forces, and I'm not sure that those consuming the most slop are super open to the idea that their diet needs fixing.
500 years ago Andrew Tate had a massively smaller reach. But, he was also more or less completely unopposed when it came to his ability to shape the hearts & minds of those he could reach (aka his town). So, while the reach was smaller, the impact on each individual reached seems massively larger.
Maybe the difference is that today's Andrew Tate (1) still reaches a small-town-sized cohort of individuals w/ massive impact, but they're geographically spread out much further; and (2) is able to reach orders of magnitude more, but with little-to-no impact?
Still hard for me to shake my prior here, which resolves to "this is the same change we went through for the printing press and the radio and the ..." list of historical media tech changes.
Yeah I think it’s def on the same continuum, just with each epoch having its own quirks:
- The printing press enabled reach, but with some gatekeeping on distribution. It also didn’t speed up actual writing, and didn’t inherently connect people who liked the same text.
- The internet removed the check on distribution and made it much easier to start, find, and join affinity groups.
- AI removes the final check by allowing a near-infinite scale of new content per creator + affinity group.
Of the last two, the internet was imo the much bigger deal. Tate being able to get to teens directly via YouTube and encouraging the formation of semi-private subreddits and discords was a huge shift. But we haven’t survived that super well, and to the degree we have, one big tool has been the diversity of voices. There are other people on YouTube saying other things, etc.
But those other voices only exist (in most cases) because there’s some minimum level of monetizable attention left for them. Even a prolific creator like MrBeast can only put out so much content. But what about when that stops being as true? If winners start taking more share, and winners are primarily truth-neutral (or sometimes hostile) attention hackers, the shrinking leftover audience makes truly original content production less cost-feasible. Maybe some creators stay in it for the love of the game, but how much reduction can we afford? And you generally need some critical mass to really create a community around your work. What if reaching this mass becomes, say, twice as hard for truth-constrained originalists? How many do we lose to the discouragement and bad economics? And how much does their rate of production then go down in the aggregate with the reward drop?
So I guess my thesis is just that this compositional shift may matter a lot. Not as much as the internet, but in compounding its worst qualities at a time when we’re already not coping super well.
"Will the growing explosion of hybrid work in the interim impoverish us or enrich us?"
I imagine that the ambiguity of "impoverish" and "enrich" here is fully intentional, but the extent to which the two go hand in hand, work against each other, or neither, is definitely a question that has been in my thoughts for years now.
Technological progress is always both, I think? As someone who thinks virtually everything is a question of tradeoffs, I'm more for a world with AI than one without it. If nothing else, it gives us new kinds of tools that can be wielded against the downsides it unleashes. But I'm admittedly bummed that so much AI safety focus has been on avoiding Skynet and not "ok, what will a world with soft AGI look like, and how will it be very obviously exploited?" It's cool that projects like Community Notes exist and are gaining traction, and AI in various forms will make them much better. But on net I'm still pretty bearish about the near-term social costs of how much worse the infosphere is gonna get.
Thank you!