What if instead you hid an encrypted signal within an otherwise perfectly legible audio signal? Imagine a song being played. To the ear it sounds perfectly normal, but data has been embedded in it by ever-so-slightly raising or lowering the pitch several times per second. Anyone with a copy of the original file could run software that compares it against the version transmitted over the radio, notes the locations where the pitch rose or fell, and recovers the data. You could send encrypted data without anyone realizing you’re sending encrypted data. To anyone else listening, it would simply sound like a song or other audio track being played.
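The scheme above can be sketched in a few lines of Python. This is a toy illustration, not a real steganography system: a pure sine tone stands in for the song, the segment length and pitch offset are hypothetical parameters I chose for the demo, and pitch is estimated crudely by counting zero crossings. A real implementation would need proper pitch-shifting of actual music and far more robust detection.

```python
import math

SAMPLE_RATE = 8000     # samples per second (hypothetical)
SEGMENT_LEN = 1600     # 0.2 s per embedded bit (hypothetical)
BASE_FREQ = 440.0      # the "original" tone, standing in for the song
SHIFT = 5.0            # small pitch offset in Hz used to encode a bit

def synth(freq, n):
    """Generate n samples of a sine tone at the given frequency."""
    return [math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]

def embed(bits):
    """Encode each bit as a segment pitched slightly above (1) or below (0) the original."""
    samples = []
    for bit in bits:
        freq = BASE_FREQ + (SHIFT if bit else -SHIFT)
        samples.extend(synth(freq, SEGMENT_LEN))
    return samples

def estimate_freq(segment):
    """Crudely estimate pitch by counting zero crossings (two per cycle)."""
    crossings = sum(
        1 for a, b in zip(segment, segment[1:])
        if (a < 0 <= b) or (b < 0 <= a)
    )
    return crossings * SAMPLE_RATE / (2 * len(segment))

def extract(samples):
    """Recover bits by comparing each segment's pitch to the known original pitch."""
    bits = []
    for i in range(0, len(samples), SEGMENT_LEN):
        freq = estimate_freq(samples[i:i + SEGMENT_LEN])
        bits.append(1 if freq > BASE_FREQ else 0)
    return bits

message = [1, 0, 1, 1, 0, 0, 1, 0]
recovered = extract(embed(message))
assert recovered == message
```

The key point is the comparison step: the receiver does not need to hear anything unusual, only to measure each segment against the original copy and read off which direction the pitch moved.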
How to address superintelligence, if that is actually something we realistically face:
Make creating an unlicensed AI above a certain capability threshold a capital offense.
Regulate the field of artificial intelligence as heavily as we do nuclear science and nuclear weapons development.
Have strict international treaties on model size and capability limitations.
Have inspection regimes in place to allow international monitoring of any electricity usage over a certain threshold.
Use satellites to track anomalous large power use across the globe (monitored via waste heat) and thoroughly investigate any large unexplained energy use.
Target the fabs. High-powered chips should be licensed and tracked like nuclear materials.
Make clear that a nuclear first strike is a perfectly acceptable response to a nation state trying to create AGI.
Anyone who says this technology simply cannot be regulated is a fool. We’re talking about models that require hundreds of megawatts or more to run and giant data centers full of millions of dollars’ worth of chips. There are only a handful of companies on the planet producing the hardware for these systems. The idea that we can’t regulate such a thing is ridiculous.
I’m sorry, but I put the survival of the human race above your silly science project. If I have to put every person on this planet with a degree in computer science into a hole in the ground to save the human race, that is a sacrifice I am willing to make. Hell, I’ll go full Dune and outlaw computers altogether, go back to pen and paper for everything, before I condone AGI.
We can’t control this technology? Balderdash. It’s created by human beings. And human beings can be killed.
So, how do we deal with ASI? You put anyone trying to create it deep in the ground. Sacrificing a few thousand madmen who think they’re going to summon a benevolent god to serve them is self-defense at the species level. It’s OK to kill cultists who are trying to summon a demon.