Although the article touches on a fascinating subject, I disagree with the analogy the author drew between neurons and C++ code, specifically the "lines of code per neuron" figure. Individual neurons do not have code.
The author also advocates buying stock in cryonics companies. The only two cryonics organizations I know of are Alcor and The Cryonics Institute. Neither of these is publicly traded.
Interesting fact: Ted Williams is stored at Alcor. Paris Hilton has signed up with The Cryonics Institute.
I personally will be neuro-suspended at Alcor. I am the #830-something person to sign up there. For those interested in signing up themselves, it is actually rather cheap to fund with term life insurance.
I suspect that a 1 percent loss of neurons would probably still leave a reasonable approximation of who you were. You might forget some childhood memories, etc., but the human brain is extremely redundant (unlike software), so it's not as error-prone as we might think. IMO, the real issue would be avoiding the creation of lots of false connections, lest we wake up with even more false memories.
From what I've read and understood, the brain's redundancy has been estimated to tolerate losses of up to 10%. It takes time to get back to being 100% normal, but with a brain loss of 10% or lower you can survive and in fact return to normality within (I assume, on average) under a year.
If they ever allow the freezing of conscious humans with presently incurable diseases, I believe they should be required to record their daily lives for somewhere between a month and a year, along with daily video journals, so that these recordings can be replayed to prompt the reconnection of neurones.
I think you are likely correct. I kill quite a few neurons with a night of heavy drinking and indeed do "mis-remember" what happened in reality.
Brains are very redundant (as you mentioned) and they are also dynamic. The way I remember an event now will probably be different than the way I remember the same event 10 years from now.
Thanks for the wishes of good luck. I figure it's worth a shot, although I have doubts that it will work. Plus, being burned to ashes or put under dirt does not really appeal to me.
Actually, proteome-related research on synapses has found strong evidence that a significant amount of computation happens in the synapse. Higher mammals have distinct kinds of synapses, and these seem to coincide with specific cognitive abilities.
This is discussed in episode 51 of the Brain Science Podcast.
I think we're talking about the difference between processing and coding. Neurones appear to perform many different processes, but do not appear to have code.
It's like an analogue-to-digital converter, which performs a process as the signal passes through its circuitry. That's very different from feeding an analogue signal into a computer and running it through a program that produces the same result as output.
We're talking about the difference between a cigarette-box converter and a computer. This is likely the difference between the action of a neurone and the action of the visual center of the brain.
There have been many experiments that IMO show parts of the brain (like the visual center) use a lot of processes, but there is obviously some computation in there as well. For instance, they've done experiments where a digital camera converts an image into a pixel map (I think it was something like 16 black-or-white pixels) that is then transmitted through electrodes onto the tongue. After between 5 and 15 minutes, the subjects (all of whom were blind) became able to see the picture. IMO this clearly shows that there is a computation center which, after repeatedly failing to understand the input, begins experimenting until it finds the correct result, which in this case was turning touch into sight.
Analog-to-digital converters often do have code. They can be implemented using an HDL to program FPGAs, or the HDL can be compiled to silicon. ADCs definitely embody an algorithm. (There are a variety of algorithms to choose from, providing different tradeoffs between device cost and accuracy.)
Note that protozoans are capable of significant data processing so that they can navigate and find food and engage in reproduction. It is now thought that the synapse evolved from these data processing mechanisms.
It seems like you are still using the "a neuron is a neuron, a synapse is a synapse, the significance is all in the interconnects" view of the brain. This is outmoded.
Interesting article. However, I would say that five lines of C++ code per neuron is pretty optimistic, considering the amount of work going into modeling a single neuron in the Blue Brain Project: http://seedmagazine.com/designseries/henry-markram.html
Also, I remember reading in Code Complete that the space-shuttle software achieved zero defects per 500,000 lines of code by combining formal development methods, peer reviews, and statistical testing. I'm not really sure how they 'proved' there were zero defects, but I'm sure you could get the number far below 1 per 100 lines.
The basic premise of writing a model for the entire human brain in software is bunk. The general idea, as I understand it, is to write a piece of software that simulates the biochemical processes that happen in the brain, with the neurons themselves being represented by some sort of graph.
Basically, the 'brain-in-a-jar', except the jar is virtualized.
I somehow managed to say the exact opposite of what I wanted.
What I meant to say is that physically coding every bit of what every neuron does in software makes zero sense. As in, having X lines of code for neuron one, X for neuron two, etc. Which is what the original article seems to imply with the 'lines of C++ code' estimation: that you would have to write unique code for every neuron.
What makes sense to me is to build a software model of a brain and let it grow the same way brains grow naturally, just in a virtualized environment.
If someone does not know anything about neural networks or programming they should not be drawing conclusions about either, and if they are doing it anyway, it should certainly not be on the front page of this site.
Yes, frogs use sugars, arctic fish use enzymes but they all only lower the freezing temperature. And as your link points out, they don't lower it enough.