Originally posted by RJHinds
Computer Code Discovered Concealed In Superstring Equations
"Doubly-even self-dual linear binary error-correcting block code," first invented by Claude Shannon in the 1940's, has been discovered embedded WITHIN the equations of superstring theory!
Why does nature have this? What errors does it need to corre ...[text shortened]... imulation Hypothesis: http://www.simulation-argument.com/simulation.html
I tried reading the superstring theory paper and didn't understand it. He seemed to be classifying graphs that represent collections of supersymmetry operators. He has a way of getting the theory to generate binary N-vectors, and then maps these to error-correcting codes. This happens because the superalgebra enforces symmetries among the vectors, and the rules happen to be the same as those for Hamming codes.
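For anyone who hasn't met these codes before, here's a toy Hamming(7,4) encoder/decoder, just to show what a "binary error-correcting block code" actually does. The matrices are the textbook ones, not anything from Gates's paper (the code he finds is a related doubly-even self-dual one, not this exact code):

```python
# Hamming(7,4): 4 data bits -> 7-bit codeword; corrects any single bit flip.

G = [  # generator matrix rows (systematic: data bits first, then parity)
    [1, 0, 0, 0, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [  # parity-check matrix rows; every column is distinct and nonzero
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

def encode(data):
    """Codeword = data . G (mod 2)."""
    return [sum(d * g for d, g in zip(data, col)) % 2 for col in zip(*G)]

def decode(word):
    """Syndrome decoding: a nonzero syndrome equals the H-column of the
    flipped bit, so we flip it back, then strip the parity bits."""
    s = tuple(sum(h * w for h, w in zip(row, word)) % 2 for row in H)
    if any(s):
        word = word[:]
        word[list(zip(*H)).index(s)] ^= 1
    return word[:4]
```

Flip any one bit of a codeword and `decode` still recovers the original four data bits; that's the "error correction" the article is talking about.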
The philosophy paper is bunk. The argument is related to the ID argument based on Fred Hoyle's work. If his deduction is correct, that either we are in a simulation, or such simulations are impossible and we are therefore headed for extinction, then the super-humans simulating us must also be in a simulation, by his own argument. Therefore one must have an infinite tower of simulations, with no base reality.
If we are in an ancestor simulation then our laws of physics must be identical to those of the advanced humans simulating us (otherwise it isn't an ancestor simulation), so they have the "error correction code" as well, which only reinforces the infinite tower of simulations.
Incidentally, the problem of running one of these is software based, not hardware based. The time elapsed within the simulation is unrelated to the run time on the machine hosting it. A slow machine that takes a million years to simulate one second of a universe would be boring for the programmers, but the people inside the simulation would know no different.
On the other hand, the idea does solve the EPR paradox. When a mind object in the simulation makes an observation of, say, the spin of an electron (by "object", "variable", and "function" I mean in the sense of object-oriented programming), the electron object being observed has a state-vector member variable initialized to some value which may not be an eigenstate of the observe-spin member function. The machine then resets the electron object's state-vector variable to one of the possible results using a random number generator, with probabilities assigned by the projection of the state vector onto the eigenvectors of the quantity being measured. The spooky action at a distance of entanglement is then simply explained: the program insists on conservation laws being enforced, and each entangled particle object holds a pointer to its partner (the pointer is NULL if the particle isn't entangled).
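That measurement-as-collapse scheme can be sketched in a few lines. Everything here is my own toy invention (class names, the singlet set-up, using Python's `None` for the NULL pointer), not anything from the papers:

```python
import math
import random

class Electron:
    """Toy spin-1/2 particle: a pair of amplitudes over the Sz basis (up, down)."""

    def __init__(self, up_amp=1.0, down_amp=0.0):
        norm = math.sqrt(abs(up_amp) ** 2 + abs(down_amp) ** 2)
        self.state = (up_amp / norm, down_amp / norm)
        self.partner = None  # the "pointer" to an entangled partner; None = not entangled

    def measure_spin_z(self):
        """Collapse to an Sz eigenstate with Born-rule probabilities."""
        # Probability = |projection of the state vector on the 'up' eigenvector|^2
        p_up = abs(self.state[0]) ** 2
        result = +1 if random.random() < p_up else -1
        self.state = (1.0, 0.0) if result == +1 else (0.0, 1.0)
        if self.partner is not None:
            # The "program" enforces spin conservation: the partner is forced
            # into the opposite eigenstate, however far away it is.
            self.partner.state = (0.0, 1.0) if result == +1 else (1.0, 0.0)
            self.partner.partner = None
            self.partner = None
        return result

def make_singlet_pair():
    """Two electrons whose anti-correlation lives entirely in the collapse
    rule above, not in a joint state vector (a deliberate simplification)."""
    a = Electron(1.0, 1.0)  # 50/50 up/down before measurement
    b = Electron(1.0, 1.0)
    a.partner, b.partner = b, a
    return a, b
```

Measuring either member of the pair first yields +1 or -1 at random; the other then always reads the opposite value. The "spooky" correlation costs the machine nothing but a pointer dereference and a conservation check.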
The problem with this idea is that you can use it to explain anything. Take the galaxy rotation problem: dark matter becomes evidence of a hack to get around the fact that the laws of physics they invented for us weren't quite consistent.