I appreciate the punchy positivity but "we're all going to make it" is an idea so severely far-removed from the actual consequences of your premises that I have to wonder if you're being purposefully disingenuous.
He isn't referring to any one individual but to humanity in general. The reason is that in the AI space the majority opinion is to overregulate AI due to "existential risks" and effectively decelerate AGI development. e/acc is basically anti AI-safety, pro-growth, and pro technological development; it's the ultimate energetic hope-pill, if you will, just built
Almost exactly the note I made for myself: "juvenile naivety or disingenuous misdirection"
“Nothing human makes it out of the near-future...”
e/acc is a pathetic attempt to hold on to the remnants of humanistic theology. It lacks any rigor or attempt to critique or intensify the general increasing technique gradient (Ellul). ‘We are all going to make it’ - is about as naive as a Winnie the Pooh book - we won’t all make it and if we accelerate hard enough and fast enough, most humans will become obsolete faster than you can say ‘Garbage time is running out’.
Bad take. Think harder.
Thank God for pessimists to keep the Normies from drowning in their delusions.
Dyson spheres won't build themselves
unless...
time war
I reckon the aliens that keep bumping into us are not really aliens...but us coming back
The time war has already started if it ever starts
We build self-building dyson spheres
(that is what i was getting at, 10 points!)
I dunno, I think humanity is quickly approaching our next great cataclysm. I certainly wish the best of luck to the builders/try-hards, but this roller coaster has gotten a bit too wacky for me.
> Contrary examples from history—where humanity has solved a problem by skulking backward—are scarce to non-existent.
What is meant by "skulking backward" here? Literal or metaphorical? Some wars have been avoided by literally skulking backwards / capitulating, so I assume metaphorically. Just want to clear that up for the record.
E/acc must prevail over the demoralization of ESG/DEI/EA. Calculate your ESG score here: https://yuribezmenov.substack.com/p/how-to-raise-your-esg-score
My people... My good people.
I'm reporting to you live from the year 2606.
E/acc won eons ago.
Decels tried everything.
They used the media mass psychosis formation weapon to great harm.
The populace was maimed.
Technology was defamed.
...and then the memes came.
Shortly thereafter, the spice began to flow.
We all remember Nov 6, 2023 as the day the spice turned sentient.
Do not let fear slow down your accelerating love for humanity.
You lucky bastards are just getting to the good part.
Love,
Matt
Subscribed.
*BUT* for a substack that focuses on inevitable technological exponentiation and entropy as the major obstacle, I’m surprised I didn’t see any mention of man’s greatest obstacle--
death.
And the biotechnology that we need to invent to limit metabolic entropy.
Man's greatest obstacles are consciousness and culture, a deadly tag team.
One important direction is space colonization, before we can no longer generate enough energy to attempt it (i.e., before EROI drops too low).
https://meaningofstuff.blogspot.com/2015/10/morality-derived-from-space-colonization.html
Where are the credits to Nikola Tesla? This manifesto is nothing new compared to THE PROBLEM OF INCREASING HUMAN ENERGY - http://www.tfcbooks.com/tesla/1900-06-00.htm
Think bigger and bigger and bigger: 1% of the space age is way bigger than 500% of the Stone Age.
How does one pronounce the abbreviation "e/acc"? I'm mentally using "ee-ACK" but am open to being told I'm wrong.
I'm convinced. I'm in. 👍🏼
Great points. I don't mind saying that "growth" is not all good or even necessary within this philosophy. All we need are systems that make all participants owners who naturally demand transparency. Can you read this?
https://paradime.substack.com/p/tech-and-philosophy-unite
Thanks
I have no objection to nuclear power, but most of this article is conspiratorial nonsense. Smaller populations will help humanity win the battle against life-threatening climate disaster and save the other species we share the planet with.
Robots equipped with artificial general intelligence will wipe our aging asses and grow and prepare our food. Young people will have less competition for jobs, so their wages will rise, and with less demand for housing, the cost of the existing housing stock will become more affordable. Paul Krugman recently looked at low-birth-rate Japan and penned an amazingly optimistic report on its economic conditions: "In some ways, Japan, rather than being a cautionary tale, is a kind of role model - an example of how to manage difficult demography while remaining prosperous and socially stable." As long as it is voluntary, achieving a lower population is a benefit to humanity.