
Every time you fetch a package from NPM you are fetching that code from a URL (on npmjs.com) and then caching it locally. Deno is just eliminating the middleman. If you still trust npmjs.com for source code delivery you could continue to do that.

What it isn't eliminating is the ability to define "local" caches. Just because you are importing everything from URLs doesn't mean that they can't all be URLs that you directly control. You don't have to use CDN-like URLs in production; you can copy all the modules you need onto a server you control, under a URL scheme you maintain.
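As a sketch of one way to do that (not the only approach): Deno supports import maps, so a `deno.json` entry can remap a public specifier prefix to a host you control. The host name below is hypothetical:

```jsonc
// deno.json — import map sketch.
// "modules.internal.example.com" is a hypothetical server you run
// and populate with vetted copies of the modules you depend on.
{
  "imports": {
    "https://deno.land/std/": "https://modules.internal.example.com/std/"
  }
}
```

With that in place, code written against `https://deno.land/std/...` resolves to your server instead of the public CDN, without editing the import statements themselves.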

There will still possibly be good uses for caching automation in Deno (once people get bored with xcopy/robocopy/cp -r/rsync scripts), but it will likely resemble a return to more bower-like tools rather than necessarily full-blown npm-like package managers. (ETA: Or proxy services like GitHub Artifacts that pull an upstream source for you and rehost it on controlled URLs with some additional security checks.)


