
Poetry doesn't play nice with PyTorch and many other ML libraries.


We use it for both PyTorch and TensorFlow (sometimes in the same package) and it works fine.


How do you manage CUDA versions of PyTorch? At the moment I'm using poethepoet to install them separately, but it feels very suboptimal.
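
A sketch of the workaround being described, assuming the usual poethepoet pattern of a post-install task that pip-installs the CUDA wheel from PyTorch's own index (the task name, torch version, and cu121 index URL are illustrative assumptions, not the commenter's actual config):

```toml
# pyproject.toml -- hypothetical poethepoet task, run manually after `poetry install`.
# Pick the index URL matching your driver from the PyTorch install matrix.
[tool.poe.tasks.install-cuda-torch]
cmd = "pip install torch --index-url https://download.pytorch.org/whl/cu121"
help = "Install the CUDA build of torch into the Poetry virtualenv"
```

This works because the CUDA wheels live on a separate index that Poetry's default resolver doesn't consult; the poe task sidesteps the resolver entirely, which is exactly why it feels suboptimal.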


So I personally didn't do the sysadminning, but it's a Docker container in which CUDA has been loaded, and after that it 'just works'.
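
A minimal sketch of that kind of setup (the base-image tag and package versions are assumptions, not the commenter's actual configuration):

```dockerfile
# Hypothetical sketch: CUDA runtime + cuDNN baked into the image, so the
# torch/tensorflow wheels Poetry installs find the libraries at runtime.
FROM nvidia/cuda:12.1.1-cudnn8-runtime-ubuntu22.04
RUN apt-get update && apt-get install -y python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*
RUN pip3 install poetry
WORKDIR /app
COPY pyproject.toml poetry.lock ./
RUN poetry install --no-root
```

The container still needs the host's NVIDIA driver exposed at run time, e.g. `docker run --gpus all ...` with the NVIDIA container toolkit installed.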


One of the nice things about Conda is that it will handle these kinds of dependencies for you within the environment it sets up.

While it doesn't really matter for web/scripting use cases, it's an absolute godsend for DS/ML workflows.

But the resolver, oh why does the resolver take so long?
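
For reference, a sketch of what "handling these dependencies" looks like in practice; the channel names and CUDA version below are assumptions for illustration:

```shell
# Hypothetical example: conda resolves the CUDA toolkit alongside pytorch
# inside the environment, so no system-wide CUDA install is required.
conda create -n ml -c pytorch -c nvidia python=3.11 pytorch pytorch-cuda=12.1
conda activate ml
```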


we do conda -> mamba for that exact scenario :)
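
mamba is a drop-in reimplementation of the conda CLI with a much faster C++ solver; a sketch of the swap (commands are the standard ones, but your setup may differ):

```shell
# Option 1: call mamba directly as a drop-in conda replacement
mamba create -n ml -c conda-forge python=3.11

# Option 2: keep the conda CLI but switch its solver to libmamba
conda install -n base conda-libmamba-solver
conda config --set solver libmamba
```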



