Hacker News

Somewhat of a tangent, but this made me think of a quote I found on HN last year:

Context: Evolutionary algorithms and analog electronic circuits

> One thing stands out when you try playing with evolutionary systems. Evolution is _really_ good at gaming the system. Unless you are very careful at specifying all of the constraints that you care about, you can end up with a solution that is very clever but not quite what you had in mind. Here power consumption is the issue. If you tried to evolve a sturdy chair you might end up with something that is 1mm tall, or maybe a fuel-efficient car that exploits continental drift.

I think it's the same here: the net is never gonna be better than it needs to be, and it's probably always gonna take the easy route.



You don't even need a neural net for that: take any global optimization method, give it a somewhat ill-defined scoring function, and it will instantly run circles around you, laughing.
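Here's a minimal sketch of that effect, using plain random search over a made-up "chair" scoring function (everything here is illustrative, not from any real system): the score rewards low material use, and its "sturdiness" check is under-specified, so the search finds the 1mm-chair loophole all on its own.

```python
import random

random.seed(0)

# We *want* a normal, sturdy chair, but the (made-up) scoring function only
# rewards low material use, and its "holds weight" check is under-specified:
# any positive height with a modest seat passes.
def score(height_m, seat_area_m2):
    material = height_m * 4 * 0.01 + seat_area_m2 * 0.02  # four legs + a seat
    holds_weight = height_m > 0 and seat_area_m2 > 0.05   # the loophole
    return 1.0 / material if holds_weight else 0.0

best = None
for _ in range(20_000):                      # crude global random search
    h = random.uniform(0.001, 1.5)           # height in metres
    a = random.uniform(0.01, 1.0)            # seat area in square metres
    s = score(h, a)
    if best is None or s > best[0]:
        best = (s, h, a)

print(f"best chair: {best[1] * 1000:.1f} mm tall, seat {best[2]:.3f} m^2")
# the search happily converges on a millimetres-tall "chair" with the
# smallest seat that still passes the check -- clever, not what we meant
```

No gradient, no neural net; the only "intelligence" here is an under-specified objective plus brute-force search.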


There's an alife program called DarwinBots in which small bots powered by mutating code compete against each other to survive and reproduce.

Given enough time, you'd expect them to develop clever behaviors, but instead they just fuzz-tested the sim and locked in on exploits of bugs or environment settings. They only got a bit more clever when connecting different sims running under different conditions.

Eyes already use different kinds and densities of sensors, optimized either for detail and color or for movement/edges. I wouldn't expect a single learning method, even optimized to its limits, to outperform two or more layers of different methods, especially when trying to avoid exploits like the tank story.


> Given enough time, you'd expect them to develop clever behaviors, but instead they just fuzz-tested the sim and locked in on exploits of bugs or environment settings.

Classic A-life! Also, not so different from the spirit of actual biology.

> They only got a bit more clever when connecting different sims running under different conditions.

Diversity is very important for evolution on many levels. What many don't realize (especially, I note, evolution deniers) is that the ecosystem as a whole provides a very complex and continually varying epiphenomenal fitness function to any given organism.


If you don't have a sufficiently complex genotype-phenotype mapping, and the system is not evolvable (see Günter Wagner's work), then you shouldn't expect more complex phenotypes. Understanding genetic representation is going to be an important step toward open-ended evolutionary systems.
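A toy contrast between a direct and an indirect (generative) encoding makes the point; the encodings and names here are illustrative, not taken from Wagner's work. In the direct encoding a point mutation can only ever flip one cell; in the indirect one, mutating a single motif gene produces a coordinated, still-regular change across the whole phenotype -- which is the kind of variation evolvability is about.

```python
def develop(motif, length=8):
    """Indirect encoding: a short motif gene is repeated to build the
    whole phenotype, so one mutation changes the body plan coherently."""
    return (motif * length)[:length]

# direct encoding: the genotype IS the phenotype, one gene per cell;
# a point mutation flips exactly one cell and nothing else
direct = [0, 1, 0, 1, 0, 1, 0, 1]

print(develop([0, 1]))  # [0, 1, 0, 1, 0, 1, 0, 1] -- same striped phenotype
print(develop([1, 1]))  # [1, 1, 1, 1, 1, 1, 1, 1] -- one gene, global change
```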


> They only got a bit more clever when connecting different sims running under different conditions.

That's part of the reason why a lot of these nets are trained with added noise, as well as drop-out (randomly disabling a fraction of the hidden neurons, often 50%, at every training step).

The drop-out tactic in particular is effective at preventing "exploits" of the neural-net type, which otherwise appear in the form of large correlated weights (really big weights depending on other really big opposite weights to cancel out--it works, but it doesn't help learning).

Either way, adding noisy hurdles helps because exploits are usually edge cases, and noise makes them less dependable: the region of fitness space very close to an exploitable spot is usually not very high-ranking at all (which is why you don't want your classifiers ending up there).


DarwinBots uses actual computer code to control the robots, which makes it really hard for evolution to work with. Most mutations just break the code, and very, very few create anything interesting. And the simulation is too slow to explore millions of possibilities to make up for the difficulty. What makes it worse is that the bots are usually asexual.
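You can see the brittleness with a quick experiment: take a code-like "genome" (written here in Python as an illustrative stand-in, not actual DarwinBots DNA, which has its own language), apply random single-character point mutations, and count how many mutants even parse, never mind behave usefully.

```python
import ast
import random
import string

random.seed(0)

# an illustrative stand-in "bot genome" written as ordinary code
genome = (
    "def act(energy):\n"
    "    if energy > 3:\n"
    "        return (energy + 1) * 2\n"
    "    return energy - 1\n"
)

charset = string.ascii_lowercase + string.digits + "+-*/()<>=: \n"
trials, still_valid = 2000, 0
for _ in range(trials):
    pos = random.randrange(len(genome))
    mutant = genome[:pos] + random.choice(charset) + genome[pos + 1:]
    try:
        ast.parse(mutant)       # does the mutant even parse?
        still_valid += 1
    except SyntaxError:
        pass

print(f"{still_valid} of {trials} single-character mutants still parse")
# most point mutations don't survive the parser, and parsing is a far
# lower bar than "does something interesting in the sim"
```

This is exactly why the genotype-phenotype mapping discussion above matters: raw program text is an unforgiving representation for mutation to act on.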

However, I think that's OK. Most of the fun with DarwinBots is programming your own bots. There used to be (still are?) competitions where people wrote their own bots and had them compete under different conditions.




