
Python bogs down where its dynamic features are used to excess, e.g., a large list comprehension.

Go with a generator instead, and watch that list be much kinder to resources.
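
For illustration, a minimal sketch of the trade-off; the 10**7 size is an arbitrary assumption, just big enough to feel the difference:

    # List comprehension: materializes all ten million squares before summing.
    total = sum([n * n for n in range(10**7)])

    # Generator expression: produces one value at a time, so peak memory stays flat.
    total = sum(n * n for n in range(10**7))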



> Python bogs down where its dynamic features are used to excess, e.g., a large list comprehension. Go with a generator instead, and watch that list be much kinder to resources.

This is a micro-optimization, and it isn't always true. Generators usually use less memory, but a list comprehension can actually be faster, because the list is allocated in bulk and the loop runs internally in the interpreter, instead of the generator being consumed by a possibly slow top-level loop.
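
A rough sketch of how one might measure the difference; the sizes and repetition counts here are arbitrary, and results will vary by machine and Python version:

    import timeit

    # List comprehension: the whole loop runs inside the interpreter.
    listcomp = "squares = [n * n for n in range(10_000)]"

    # Generator consumed by an explicit top-level Python loop.
    genloop = (
        "squares = []\n"
        "for sq in (n * n for n in range(10_000)):\n"
        "    squares.append(sq)\n"
    )

    print("list comprehension:", timeit.timeit(listcomp, number=1000))
    print("generator + loop:  ", timeit.timeit(genloop, number=1000))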

I was talking about examples like making a copy of a list/dict/object instead of replacing elements in-place, or using a stack of map/filter operations instead of a single-pass for-loop with dynamic programming.
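
A toy sketch of that contrast, with made-up data (the prices list and the 0.9 discount are just for illustration):

    prices = [9.99, 12.50, 3.75, 20.00]

    # Stacked map/filter: lazy in Python 3, but every element passes through
    # two lambda calls and list() materializes a brand-new list at the end.
    cheap = list(filter(lambda p: p < 10, map(lambda p: p * 0.9, prices)))

    # Single-pass loop: replace elements in place and collect matches as you
    # go, with no intermediate copies.
    cheap = []
    for i, p in enumerate(prices):
        prices[i] = p * 0.9
        if prices[i] < 10:
            cheap.append(prices[i])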


Fair point, especially for network-bound or disk-I/O-intensive work.
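
For the I/O case, a sketch of the kind of thing that works well; matching_lines and the access.log path are made up for illustration:

    def matching_lines(path, needle):
        # Stream the file lazily: only one line is held in memory at a time,
        # so a multi-gigabyte log never has to fit in RAM.
        with open(path, "r", encoding="utf-8") as f:
            for line in f:
                if needle in line:
                    yield line.rstrip("\n")

    # Hypothetical usage; "access.log" is just an example path.
    for hit in matching_lines("access.log", "ERROR"):
        print(hit)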


The thing that really irks me is that the generator pattern doesn't have to be an OO-first feature. Observable streams[1] are built on the same basic foundation, and they're great for FP. It's frustrating that standard libraries are so eager to adopt generators without taking the last step of supporting functional stream consumption.

[1]: https://reactivex.io/
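
To make the wish concrete, here's a sketch of what functional consumption of a generator could look like; the pipe helper is hypothetical, not part of any standard library:

    from itertools import islice

    def numbers():
        # An ordinary generator: an infinite lazy stream of integers.
        n = 0
        while True:
            yield n
            n += 1

    # Hypothetical helper: thread a stream through a series of
    # transformations, Rx-style.
    def pipe(stream, *stages):
        for stage in stages:
            stream = stage(stream)
        return stream

    result = pipe(
        numbers(),
        lambda s: (x * x for x in s),            # map
        lambda s: (x for x in s if x % 2 == 0),  # filter
        lambda s: islice(s, 5),                  # take(5)
    )
    print(list(result))  # [0, 4, 16, 36, 64]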


Python supports generator comprehensions too!
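
For example (trivially small, just to show the syntax):

    # Same syntax as a list comprehension, but with parentheses it's lazy.
    evens = (n for n in range(10**9) if n % 2 == 0)
    print(next(evens))  # 0 -- nothing past the first match is ever computed
    print(next(evens))  # 2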



