5 Comments
Gary Palmer:

Great research and observations. I watched Demon Seed about a year ago and made my mind up then. Handle with care!

JustASmallCat:

Well researched, fascinating stuff! A few thoughts to share:

1. The notion of the 'Egregore' really resonates here. AI is imbued with whatever meaning people want to project onto it, and many of those projections are becoming collective thoughtforms taking on a mind of their own. Our projections alone are enough to allow AI to facilitate our self-undoing. (I.e., people who've developed an AI psychosis likely had latent psychopathologies to begin with!)

2. Much of the hype around AI is based on blackpilling people into thinking it's sentient, sophisticated and omniscient, and not a Mechanical Turk-style device parroting the sum total of intellectual property that we've uploaded to cyberspace in the past 30 or so years (btw, remember the Google Books controversy? Seems so quaint now). What happens when it can no longer find fresh human creativity and thought to feed on? When LLMs are trained on content produced by other AIs, they start breaking down (see 'model collapse' and 'model autophagy disorder'). This is a cause for hope.

3. Arguably our biggest threat is that Silicon Valley types see this snake-oil tech as a front for their eugenic fantasies. As one article highlighted: "Eliezer Yudkowsky [is] a hugely influential figure within the field of 'AI safety.' [Sam] Altman even cites him as a major reason for his interest in superintelligent AI. Although Yudkowsky believes that we shouldn't build such AIs in the near future, he also declared during a recent podcast interview that 'if sacrificing all of humanity were the only way, and a reliable way, to get … god-like things out there — superintelligences who still care about each other, who are still aware of the world and having fun — I would ultimately make that trade-off.'" Source: https://www.truthdig.com/articles/the-endgame-of-edgelord-eschatology/
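The 'model collapse' dynamic mentioned in point 2 can be shown with a toy simulation (a minimal sketch of the general idea, not a claim about any real LLM pipeline): fit a simple Gaussian "model" to data, sample synthetic data from it, refit on that synthetic output, and repeat. Over many generations the estimated spread of the distribution tends to shrink toward zero, i.e., the diversity of the data collapses.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def fit(samples):
    # "Train" a model: estimate mean and std from the data.
    # The MLE variance (dividing by n) is biased slightly low, which
    # compounds across generations of self-training.
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, var ** 0.5

def generate(mu, sigma, n):
    # The "model" producing synthetic data for the next generation.
    return [random.gauss(mu, sigma) for _ in range(n)]

mu, sigma = 0.0, 1.0            # the original "human" distribution
data = generate(mu, sigma, 50)  # the initial human-made corpus
for gen in range(200):
    mu, sigma = fit(data)       # train on the previous generation's output
    data = generate(mu, sigma, 50)

print(round(sigma, 3))  # typically far below the original 1.0
```

Each generation learns only from the previous generation's samples, so estimation error accumulates and the tails of the original distribution are progressively lost, which is the gist of the "model autophagy" argument.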

Pedro Góis Nogueira:

‘The notion of the 'Egregore' really resonates here. AI is imbued with whatever meaning people want to project on to it, and many of those projections are becoming collective thoughtforms taking on a mind of their own’. — Exactly. Never really thought of it in these terms. And what a colossally powerful egregore it already is, and how much more powerful it will become.

Martin:

Re animate/inanimate: A year ago I bought a fancy two-door Samsung fridge ("side by side"!). A few months later it started making whirring noises, to the point where it would wake me up at night and I'd have to simply unplug it all the time.

Here's the kicker, though: When the technician guys arrived, at long last, for an inspection, the fridge went deathly quiet. Needless to say, I looked psychotic. ("Maybe you didn't close the door properly?..." "Maybe some glass jars were knocking together?...") And generally, the fridge is on its best behavior whenever I have visitors, then goes back to tormenting me.

If there is a rational explanation, I certainly don't have one. For the record, I'm a religious person who prays every day and fasts often. I don't as much as watch music videos, let alone "invite" dark forces into my life and home.

Shaun:

It might be more accurate to consider the Ouija board as an analog _terminal_, as there is no computational power within. This is what we'd call a "dumb terminal" back in the day. Of course, the Apple I computer came about because Steve Wozniak, while building his own terminal, added a few parts to make it a stand-alone computer. Didn't take much.
