Hacker News

The first "gigabyte" multisampled libraries appeared in the 2000s, when memory was even tighter and spinning disks were the norm, so you're underestimating the technique here - it has always been streaming-intensive, and the software does a lot of work to mask I/O latency. A faster disk goes a long way in this respect, letting you run more instances with smaller buffers.
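A minimal sketch of the preload-and-stream trick those libraries rely on (the class, buffer sizes, and file layout here are illustrative assumptions, not any particular sampler's API): keep only the head of each sample resident in RAM so playback can start instantly, and fetch the tail from disk in chunks while the head is playing.

```python
PRELOAD = 4096  # bytes kept resident in RAM per sample (illustrative)
CHUNK = 1024    # disk read size while streaming the tail (illustrative)

class StreamedSample:
    """Keep only the head of a sample in memory; stream the tail on demand.

    The RAM-resident head covers the disk seek latency, so a note can
    start sounding immediately while the rest is read from disk."""

    def __init__(self, path):
        self.path = path
        with open(path, "rb") as f:
            self.head = f.read(PRELOAD)  # resident preload buffer

    def play(self):
        # Yield the in-memory head first: zero disk latency on note-on.
        yield self.head
        # Then stream the remainder from disk in small chunks; a real
        # sampler would do this on a background I/O thread.
        with open(self.path, "rb") as f:
            f.seek(len(self.head))
            while chunk := f.read(CHUNK):
                yield chunk
```

Shrinking PRELOAD frees RAM for more loaded instruments but demands faster, lower-latency disk reads - which is why an SSD lets you run more instances with smaller buffers.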

Memory does pose a bottleneck for huge arrangements in the studio, but in a live setting you literally don't have enough performers at the keys for the same constraint to apply. Anything they might trigger can be bounced out into multisamples, so the remaining bottleneck is effects processing.


