
With regards to GAs, it's good to keep in mind the comments from Russell and Norvig's AI book:

"At present, it is not clear whether the appeal of genetic algorithms arises from their performance or from their aesthetically pleasing origins."

Furthermore:

"Most comparisons of genetic algorithms with other approaches (esp. stochastic hill climbing) have found that the GAs are slower to converge."

The discussion at the page below is very enlightening as well.

http://books.google.ca/books?id=8jZBksh-bUMC&lpg=PR8&...



Yeah, I see them as just one tool in the randomized-optimization toolbox. It's not entirely clear to me why many people see them as the only or the first tool (perhaps the appealing biology metaphor?). In particular, I tend to like a complexity ramp: first try the dumbest thing, like random instances, then move on to randomized hill climbing or a variant like simulated annealing, then add more complex stuff on top of that, whether via GA crossover, or something that learns/exploits a statistical distribution, like MIMIC, if the simpler methods didn't work and you need to exploit the structure of the space. And with purely discrete things, I also like considering whether completely different approaches, like constraint programming, would work instead (instead of optimizing a numerical fitness/cost function, you give a list of things that must be true about any desired solution, and ask it to find one meeting the constraints).
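The complexity ramp described above can be sketched concretely. Below is a minimal, self-contained illustration (the 1-D objective is a made-up toy, not from any specific problem): the same black-box cost function attacked first with random sampling, then randomized hill climbing, then simulated annealing.

```python
import math
import random

def fitness(x):
    # Hypothetical toy objective to minimize; the ramp applies to any
    # black-box cost function.
    return (x - 3.0) ** 2 + math.sin(5 * x)

def random_search(n_iters=2000, lo=-10.0, hi=10.0):
    # Step 1: the dumbest thing -- evaluate random instances, keep the best.
    return min((random.uniform(lo, hi) for _ in range(n_iters)), key=fitness)

def hill_climb(x=0.0, step=0.5, n_iters=2000):
    # Step 2: randomized hill climbing -- only accept improving moves.
    for _ in range(n_iters):
        cand = x + random.gauss(0, step)
        if fitness(cand) < fitness(x):
            x = cand
    return x

def simulated_annealing(x=0.0, step=0.5, t0=5.0, n_iters=2000):
    # Step 3: simulated annealing -- occasionally accept worsening moves,
    # with a probability that decays as the "temperature" cools.
    for i in range(1, n_iters + 1):
        t = t0 / i
        cand = x + random.gauss(0, step)
        delta = fitness(cand) - fitness(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = cand
    return x

random.seed(0)
for solver in (random_search, hill_climb, simulated_annealing):
    print(solver.__name__, round(fitness(solver()), 3))
```

Only if the simpler rungs fail does it make sense to pay for population-based machinery like GA crossover on top of this.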


I find that a lot of people seem to be more comfortable jumping from "I have a problem" to "let's make a GA" than to "let me formulate it as an optimization problem and optimize it". Which is funny, since GAs are essentially odd optimization techniques.

Most naive uses of GAs I've seen tend to mix two essentially separate things: how to formulate the problem as an optimization problem, and how you're going to optimize it (whether with a convex/continuous relaxation, a local approximation, or stochastic hill-climbing). In general, many GA formulations tend to be really naive, choosing parametrizations where (for example) some bits are highly correlated (this is bad because stochastic bit-flipping methods tend to mix very slowly with correlated inputs) in ways that could be fixed with a better parametrization (one that would, however, not necessarily fit the "chromosome" metaphor).
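A classic instance of the correlated-bits problem is encoding an integer in plain binary: moving from 7 to 8 flips every low bit at once, so a single-bit-flip mutation essentially cannot make that small move. A binary-reflected Gray code is one standard better parametrization, since adjacent integers then always differ in exactly one bit:

```python
def to_gray(n):
    # Binary-reflected Gray code: adjacent integers differ in exactly
    # one bit, unlike plain binary.
    return n ^ (n >> 1)

def hamming(a, b):
    # Number of bit positions in which a and b differ.
    return bin(a ^ b).count("1")

# Plain binary: the unit step 7 -> 8 (0111 -> 1000) flips all four bits.
print(hamming(7, 8))  # 4

# Gray code: every unit step is a single bit flip.
print(all(hamming(to_gray(i), to_gray(i + 1)) == 1 for i in range(255)))  # True
```

Under a Gray-coded parametrization, bit-flip mutation moves smoothly through the integer values instead of being trapped behind multi-bit cliffs.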


I think most practitioners should pay more attention to why they choose a GA as the de facto method. It's an interesting observation that in many cases the problem can be reduced to convex optimization (in the continuous case) or a specialized heuristic search (in the discrete case). I currently have a problem on my hands whose original GA implementation could potentially be converted to a continuous (float) search.
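To illustrate the continuous-case reduction: once the objective is convex, plain gradient descent finds the global minimum directly, with no population, mutation, or crossover. A minimal sketch on a made-up convex objective (the function and learning rate are assumptions for illustration):

```python
def grad_descent(grad, x0, lr=0.1, n_iters=500):
    # Generic gradient descent; on a convex objective this converges
    # to the global minimum -- no GA machinery needed.
    x = x0
    for _ in range(n_iters):
        x = [xi - lr * gi for xi, gi in zip(x, grad(x))]
    return x

# Hypothetical convex objective: f(x) = (x0 - 1)^2 + (x1 + 2)^2,
# whose gradient is given in closed form below.
grad = lambda x: [2 * (x[0] - 1), 2 * (x[1] + 2)]
print([round(v, 3) for v in grad_descent(grad, [5.0, 5.0])])  # [1.0, -2.0]
```

When the relaxation is exact (or its rounding is cheap), this kind of float search is typically far faster than evolving a population toward the same point.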

However, let's not diminish the fact that a GA is a damn good tool for getting your hands dirty on most problems.



