(To provide some content for this blog’s grand opening, I’ve decided to recycle articles I’ve written previously, and in some cases published elsewhere. This is one such article. It was originally published, with minor differences, in issue #139 of the Glia Society’s journal, Thoth, in December 2019.)
In The Hitchhiker’s Guide to the Galaxy, the supercomputer “Deep Thought” determines that the answer to Life, the Universe, and Everything is 42. Unfortunately, it hasn’t figured out what the question is.
Now suppose that perceptrons, which are commonly called “artificial neurons” in the context of machine learning, could be utilized in a computer program such that there would be a one-to-one mapping between the functions of those perceptrons and the functions of natural neurons. Or, in simpler terms: suppose that “copying” a brain’s neural configuration into perceptrons would result in an artificial neural network that would perfectly mimic the capabilities of that brain. Then the question becomes how much processing power such a simulation would require. Here’s a quick-and-dirty estimate:
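To make the analogy concrete, here’s a minimal sketch of a single perceptron in Python. The sigmoid activation and the specific weights are just one conventional choice for illustration, not anything mandated by the analogy:

```python
import math

def perceptron(inputs, weights, bias):
    """A single perceptron: a weighted sum of inputs plus a bias,
    squashed through a sigmoid activation into the range (0, 1).
    The thought experiment in the text maps each such unit onto
    one biological neuron."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))
```

A whole network is just many of these wired together, each one’s output feeding into others’ inputs.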
The human brain contains about 85,000,000,000 neurons, each of which has about 1,000 synapses, each of which fires about 1,000 times per second. Therefore, our perceptron simulation would require roughly 10^17 floating-point operations per second (FLOPS), assuming that one FLOP is equivalent to one synapse firing, which I’ll be the first to admit I’m not sure is a correct assumption. This is a mere 100 petaFLOPS, which could be reached by a cluster of about 1,000 Nvidia Titan V graphics cards. (This is indubitably a better use for all of that hashing and fossil fuel power than mining Dunning-Krugerrands, a.k.a. Bitcoins.) Even if this estimate needs to be adjusted upward by several orders of magnitude for whatever reason, it becomes apparent that we can already build supercomputers with more than enough processing power to run a software equivalent of a human brain! Only if the estimate is many orders of magnitude below the real value does this train of thought derail.
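For the skeptical, here’s the back-of-the-envelope arithmetic spelled out in Python. The per-card figure assumes the Titan V’s advertised ~110 teraFLOPS Tensor Core peak, which is itself a generous, best-case number:

```python
neurons = 85e9            # ~85 billion neurons in the human brain
synapses_per_neuron = 1e3 # ~1,000 synapses per neuron
firings_per_second = 1e3  # ~1,000 firings per synapse per second

# One synapse firing assumed to cost one FLOP (the shaky assumption
# flagged in the text):
flops_needed = neurons * synapses_per_neuron * firings_per_second
# 8.5e16 FLOPS, i.e. roughly 10^17, or ~100 petaFLOPS

titan_v_peak = 110e12     # ~110 teraFLOPS Tensor Core peak per card
cards_needed = flops_needed / titan_v_peak
# on the order of 1,000 cards
```

Note that peak and sustained throughput are very different things, which is one more reason to treat this as an order-of-magnitude estimate only.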
The problem, then, is figuring out how to configure the perceptrons. It currently appears infeasible to scan a human brain and then encode it into perceptrons, so our only option is to have the neural network “evolve” through machine learning. In order to do this, the artificial neural network would have to experience selection pressure towards general intelligence, but current techniques can only provide selection pressure towards extremely specific abilities, like clustering stolen credit card numbers or facial identification of Muslims in Xinjiang. If we could create a loss function that would force evolution towards general intelligence, then the “invisible hand” of the evolutionary market would probably solve the problem for us within a reasonable time frame.
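For illustration, here’s a toy sketch of what selection pressure towards a single narrow ability looks like: a hill-climbing loop whose fitness function rewards exactly one thing. The target vector, population size, and mutation parameters are all arbitrary, made-up choices:

```python
import random

random.seed(0)
TARGET = [0.3, -0.7, 0.5]  # the one narrow "ability" being selected for

def fitness(genome):
    # Higher is better: negative squared distance to the target.
    # A task-specific loss like this says nothing about general
    # intelligence -- it rewards exactly one skill.
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

# Random initial population of candidate "genomes" (weight vectors).
population = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(20)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:5]  # selection: only the fittest reproduce
    population = [
        [g + random.gauss(0, 0.05) for g in random.choice(survivors)]
        for _ in range(20)
    ]                           # mutation: noisy copies of survivors

best = max(population, key=fitness)
```

The population converges on the target quickly, but the fitness function had to be written down explicitly, which is precisely what we don’t know how to do for general intelligence.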
Unfortunately, general intelligence is far too complicated to be modeled directly by current computer programmers, which is why “autodidactic” programs like perceptron networks were created in the first place! The machine learning engineer simply defines the desired output and then throws matrix multiplications at the problem until it’s resolved. Even though we probably have all of the resources we need to create the answer to the problem of artificial general intelligence, we can’t do it because we don’t know how to phrase the question.
So far, I’ve only thought of one solution (“solution” in the sense of Jeopardy!): simulate a system so complicated that having general intelligence provides agents within that system with more evolutionary fitness than any task-specific mental abilities could. This would probably require simulating real-life physical reality on a cosmic scale, or something similar, which is far beyond the limits of current computing technology. Expanding the limits of computational tractability is therefore probably one of the keys to unlocking artificial general intelligence. Yog-Sothoth is the gate. Yog-Sothoth knows the gate. Yog-Sothoth is the gate. Yog-Sothoth is the key and guardian of the gate. Past, present, future, all are one in Yog-Sothoth.