Hacker News

I agree. I think the best strategy is to start with a normalized database, and only denormalize when you measure a problem that can't be fixed by adding an index.


I'm working on a project where we're in the process of storing denormalized links for a recursive table relationship, so we can query from the top-level table directly to the bottom-level table without worrying about the messy in-between. It's one of the few non-speed reasons I've ever come across where denormalization seems justified.
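A sketch of what that denormalized link storage can look like, using the common "closure table" pattern: alongside the normal parent pointer, a link table stores every ancestor-to-descendant pair, so the top level reaches the bottom level in one join. (The `category`/`category_link` schema and SQLite are my own illustration, not the poster's actual project.)

```python
import sqlite3

# Hypothetical tree of categories plus a denormalized link table that
# records every ancestor->descendant pair with its depth.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE category (id INTEGER PRIMARY KEY, parent_id INTEGER, name TEXT);
    CREATE TABLE category_link (ancestor INTEGER, descendant INTEGER, depth INTEGER);
""")

def add_category(cat_id, parent_id, name):
    """Insert a node and denormalize its links: a self-link at depth 0,
    plus a copy of every link that points at the parent, one level deeper."""
    conn.execute("INSERT INTO category VALUES (?, ?, ?)", (cat_id, parent_id, name))
    conn.execute("INSERT INTO category_link VALUES (?, ?, 0)", (cat_id, cat_id))
    if parent_id is not None:
        conn.execute("""
            INSERT INTO category_link (ancestor, descendant, depth)
            SELECT ancestor, ?, depth + 1 FROM category_link WHERE descendant = ?
        """, (cat_id, parent_id))

add_category(1, None, "top")
add_category(2, 1, "middle")
add_category(3, 2, "bottom")

# Query straight from the top-level node to everything below it --
# no recursion, and no knowledge of the in-between levels required.
rows = conn.execute("""
    SELECT c.name, l.depth FROM category_link l
    JOIN category c ON c.id = l.descendant
    WHERE l.ancestor = 1 AND l.depth > 0 ORDER BY l.depth
""").fetchall()
print(rows)  # [('middle', 1), ('bottom', 2)]
```

The cost is the write path: inserts (and especially moves) must maintain the link table, which is exactly the kind of denormalization bookkeeping the thread is weighing.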


Might want to check Joe Celko's Trees and Hierarchies in SQL to see if there's a normalized way to do what you want.


Thanks. I just overnighted it from Amazon. My coding partner wants to use recursive database queries to do this--I want to avoid it because it's slower, breaks our object model, and ties the application to specific databases. (Currently, we're using NHibernate and can run on most major databases.)
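For contrast, the recursive-query approach being argued against looks roughly like this: a plain adjacency list (just a `parent_id` column) walked with a recursive common table expression. The `node` schema is hypothetical; `WITH RECURSIVE` is standard SQL but not supported identically everywhere, which is the portability objection above.

```python
import sqlite3

# Plain normalized adjacency list: each row only knows its parent.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE node (id INTEGER PRIMARY KEY, parent_id INTEGER, name TEXT);
    INSERT INTO node VALUES (1, NULL, 'top'), (2, 1, 'middle'), (3, 2, 'bottom');
""")

# A recursive CTE walks the tree at query time instead of storing
# denormalized links: start from the root, repeatedly join children.
rows = conn.execute("""
    WITH RECURSIVE subtree(id, name, depth) AS (
        SELECT id, name, 0 FROM node WHERE id = 1
        UNION ALL
        SELECT n.id, n.name, s.depth + 1
        FROM node n JOIN subtree s ON n.parent_id = s.id
    )
    SELECT name, depth FROM subtree WHERE depth > 0 ORDER BY depth
""").fetchall()
print(rows)  # [('middle', 1), ('bottom', 2)]
```

Writes stay trivially cheap here; the trade-off is that every deep read re-walks the tree, and the query lives outside the ORM's object model.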


On the other hand, just denormalizing ahead of time can really save you on code complexity, and it's usually the highest-performance setup.

So I would say: denormalize ahead of time, and if you find you have performance problems, then normalize. That's the most probable way of avoiding premature optimization :)



