"To put it concisely, I think that’s what a lot of critiques of “magic” really boil down to: the notion that just because you can do something doesn’t mean you should. Accepting this can be tough, but I think it’s a necessary part of becoming a good programmer; if you don’t have a certain sense of discipline and respect for the power you’re wielding, well, you might end up cloning dinosaurs and getting eaten by them when the power goes out."
Going beyond Ruby, I do have this criticism for a number of the really dynamic languages. Doing one of the mighty wizardly things might work, but it often has side effects that only a wizard will know. Sure, hacking on the Array class in JavaScript can be cool, but it breaks things like property enumeration in "weird" ways (well, I understand them, but a newbie won't)... are you really, really sure that your awesome new functionality couldn't have just been done with a simple function that takes an array as the first argument? Or did you actually need to extend Array?
Sometimes the answer is yes, you needed to extend Array. OK then. But it frequently isn't. Wizardly code tends to tie your code up in snarls; one minute you're wizarding this, the next that, then the wizardly chunks start interacting, then all hell breaks loose and you get an incomprehensible mess, when it could probably have just been some functions with a bit more verbosity on the calling side. I know; I've been through this pattern at least twice in a big way, including the refactoring back into functions, and I consider this a shameful admission, justified only by "I was young, and we are all young at some point"... and I can't help but notice I rarely lose anything of significance when I do the refactoring back into functions.
The trouble is, many people appear to believe that your argument that there are things which one generally shouldn't do with dynamic languages has the corollary that one should not be able to do them, i.e., that these features should not be available in the first place. The nuance of "Most of the time you shouldn't do this, but sometimes it's necessary or even the correct approach" tends to get lost; only the first clause gets parsed, and we have yet another round of stupid arguments in which one side says "Dynamic languages are bad because you can change the way core objects function", and the other says "Dynamic languages are good because you can change the way core objects function".
There is a possible explanation for this: language fundamentalism is an easy position to hold, while subtlety is demanding. Dogma merely requires one to follow its commands, rather than using one's judgement to negotiate between conflicting demands. I hope that this isn't, in fact, the reason that these arguments perpetuate, but I fear that it is.
There's something to be said for language-enforced limitations supporting encapsulation when creating an ecosystem of reusable components - something Java and C# have been great at.
An analogy is enforced immutability. Unlike Ruby, Python has immutable strings and tuples, and requires hashable (in practice, immutable) objects as dictionary keys - a great choice for library writers, though a minor inconvenience for application programmers who want to say "trust me, I won't change this object later."
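The practical difference shows up immediately with dict keys. A minimal sketch in plain CPython (no libraries assumed, names invented for illustration):

```python
d = {}
d[(1, 2)] = "point"        # tuples are immutable and hashable: fine as a key
try:
    d[[1, 2]] = "point"    # lists are mutable, hence unhashable
except TypeError as exc:
    print(exc)             # "unhashable type: 'list'"

s = "immutable"
try:
    s[0] = "I"             # strings can't be modified in place either
except TypeError as exc:
    print(exc)             # str doesn't support item assignment
```

The mutable objects are rejected up front rather than silently corrupting the hash table later, which is exactly the library-writer-friendly trade-off described above.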
The hash keys recommended and near-universally used in Ruby are immutable: symbols are the most frequent choice, and even if you pass a string as a hash key, it gets automatically copied and frozen. freeze is also a method on Object and can be used anywhere to make an object immutable.
Well, technically the requirement is to implement both __hash__() and one of __eq__()/__cmp__(). But Python fudges this a bit in how it handles user-defined types.
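For the record, in Python 3 the requirement is __eq__ plus __hash__ (__cmp__ is gone). A minimal sketch of a user-defined type that works as a dict key; Point is an invented example, not from any library:

```python
class Point:
    """User-defined type usable as a dict key: defines __eq__ and __hash__."""

    def __init__(self, x, y):
        self.x, self.y = x, y

    def __eq__(self, other):
        return isinstance(other, Point) and (self.x, self.y) == (other.x, other.y)

    def __hash__(self):
        # Equal objects must hash equal, so hash the same data __eq__ compares.
        return hash((self.x, self.y))

locations = {Point(1, 2): "home"}
print(locations[Point(1, 2)])   # a distinct but equal Point finds the entry
```

Defining __eq__ without __hash__ sets __hash__ to None in Python 3, which is the "fudge" for user-defined types: by default they hash by identity, and only become unhashable once you redefine equality.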
I completely agree. I'll step back from my normal state of bashing the Rails community for a moment and give a quick personal example from Python:
I use an incredibly helpful library that basically allows you to create a public API out of your Django models with very little effort. It even generates documentation and validation, etc. The problem is, it uses metaclass "magic" and decorators for everything. I understand both these concepts well enough, but when an entire system is built around these two powerful constructs, unwinding how things happen becomes rather difficult (for someone who doesn't use metaclasses and decorators for everything) -- to the point that if the library can't do something, it isn't even worth my time to properly modify it.
There were certainly other ways to go about creating the provided functionality. Django's inner-class-based approach would have been a good start! And there'd be nothing wrong with supplying helpful decorators, or even using metaclasses here and there, but the library (at least to me) screams "magic for magic's sake".
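I don't know that library's internals, but the flavor of metaclass registration that makes such systems hard to trace can be sketched in a few lines (ApiRegistry, Resource, and Article are all invented names, not the library's actual API):

```python
class ApiRegistry(type):
    """Metaclass that silently registers every subclass of Resource."""
    registry = {}

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if bases:  # skip the abstract base class itself
            ApiRegistry.registry[name.lower()] = cls
        return cls

class Resource(metaclass=ApiRegistry):
    pass

class Article(Resource):
    fields = ("title", "body")

print(sorted(ApiRegistry.registry))   # ['article']
```

Nothing in Article's body says it was registered; the metaclass did it as a side effect of the class statement. That's exactly why reading such code top-to-bottom tells you so little about what actually happens.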
I'll repeat my reply here, because I prefer the discussion participants ;)
After 4+ years with Rails, it was very refreshing to read this insightful post. All the spinning plates are the reason The Rails Way is larger than most of my Java books, and about equal in size to my Spring in Action (2nd ed.) book. Rails has simply baked too much magic in, and keeping track of it is a nightmare. It might be helped somewhat if the docs were better (and the guides are making this much better), but it’s still painful. Combine that with an inconsistent API and a codebase that takes advantage of virtually every magic trick Ruby has to offer, and you have a framework that excites you at first but wears you down over time. I’ve developed very large systems with Rails and it’s served me well, but it’s also made me very weary.
This is the reason both Python in a Nutshell and the Definitive Guide to Django now sit open on my desk. I come seeking clarity, explicitness, and productivity. I’ve shied away from Python for years because of some awful inconsistencies in the language and lack of closures (I <3 closures), but if it keeps things straightforward and clean, I’ll overlook them.
I’ll probably always prefer Ruby as a language in general…as a tool with which to perform magical displays of power and light…but I’m very close to giving up on it for day-to-day real work.
Great post. I will say, while I rarely do things in a "magic" way just for the sake of it, I often do it to understand how something works, which, admittedly, may be just as bad. A good example is using generators in Python: while writing a program I happened to be reading about generators, so I decided to incorporate them into my code, thinking that doing so would help me understand them better. It worked fine and I understood it at the time, but now if I were to go back and muck with that code, I'd probably have to brush up on generators just to make changes that didn't break the whole thing.
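For what it's worth, the laziness that makes generator code easy to forget is easy to demonstrate; a minimal sketch (chunked is an invented helper, not from that program):

```python
def chunked(items, size):
    """Yield consecutive slices of items, size elements at a time."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

gen = chunked([1, 2, 3, 4, 5], 2)
print(next(gen))   # [1, 2] -- nothing past the first slice has been computed yet
print(list(gen))   # [[3, 4], [5]] -- the rest is produced on demand
```

The subtlety that bites you months later is precisely this statefulness: the generator is single-pass, and consuming it somewhere else in the code silently leaves nothing for the next reader of the variable.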
So I guess my point is, another valid reason, beyond "doing it just because you can", is "doing it so you can learn". However, the end result seems about the same to me.
> The right thing, if there is one, consists of a willingness to learn how stuff works even if it’s unfamiliar or seems complicated
The key point is: how hard is this magic to learn, and how much of it is there to learn? For some types of "magic" there is simply no way to learn how they work. They're just arbitrarily baked in, in such a way that even a skilled person would never figure them out. You just have to hope that the documentation is good and put your faith in it. It's true that programming languages ask us to accept things on faith in just this way, but that is why languages themselves are so minimalist - they have to be, so that we can afford to learn them at such an expert level that we can use them competently despite their magical nature. Frameworks like Rails, Grails and Django are anything but minimalist. To ask us to learn everything about them without strong links to the foundational languages they are built on is asking too much - it doesn't scale.
There can be a lot of hate on Rails over "magic", but a lot of it isn't that hard to understand.
From a comment on the article:
The obvious example of ‘magic’ in Rails (beyond all the ActiveRecord stuff) that I always mention is that the controllers (equivalent to Django’s views) don’t take a request parameter in the function definition. They also ‘magically’ render the relevant template without you having to do it explicitly.
Well, the first part of that is that Django uses functions and Rails uses methods of an ActionController subclass. It's just that the things that would go into a request parameter were initialized as instance variables with accessors before the method was run. Now, you don't see the initialization, but it's not as if you ever explicitly pass "request" to a function in Django through urls.py either. You just declare your view functions as taking that parameter, but in urls.py you say:
(r'apple/$', 'project.apple.views.apple_view')
and "magically" the request object gets passed to that view, and magically the view is imported into the current context without your writing an import statement.
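A toy imitation of what the dispatcher does (this is not Django's actual code; myviews, ROUTES, and dispatch are all invented for illustration) shows that the "magic" amounts to a regex match, a string-driven import, and a call:

```python
import importlib
import re
import sys
import types

# Stand-in for a "project.apple.views" module so the sketch is self-contained.
views = types.ModuleType("myviews")
def apple_view(request):
    return "handled " + request
views.apple_view = apple_view
sys.modules["myviews"] = views

# The route table names the view as a dotted string, just like urls.py does.
ROUTES = [(r"^apple/$", "myviews.apple_view")]

def dispatch(path, request):
    """Miniature of the framework's job: match, import, call with request."""
    for pattern, dotted in ROUTES:
        if re.match(pattern, path):
            module_name, func_name = dotted.rsplit(".", 1)
            view = getattr(importlib.import_module(module_name), func_name)
            return view(request)   # the request parameter appears "magically"
    raise LookupError(path)

print(dispatch("apple/", "GET /apple/"))   # handled GET /apple/
```

Whether the framework injects the argument (Django) or pre-loads it into accessor-backed state (Rails), either way there's machinery between your code and the HTTP request that you took on faith.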
Similarly, one could argue that rendering a template without having to specify it explicitly keeps Rails DRY. With Django, there's plenty of opportunity to make things harder to keep track of if you're constantly randomly naming templates.
--
What makes James' piece so good is that he realizes that it's not about magic or not. It's about a balance. No magic means you're constantly repeating yourself and leaving yourself prone to errors. Too much magic and people can't understand what's going on. Clearly managed memory isn't too much magic in most people's opinion. But it is magic. Is assuming that the action "apple" of the controller "Fruit" should want a template fruit/apple too much magic?
In my opinion, that seems like a sensible conclusion. It's overridable. It follows.
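The convention is mechanical enough to sketch in a few lines; template_for is a hypothetical helper for illustration, not actual Rails code:

```python
def template_for(controller, action, override=None):
    """Rails-style convention: action "apple" on controller "Fruit"
    renders "fruit/apple" unless an explicit template is given."""
    if override is not None:
        return override
    return f"{controller.lower()}/{action}"

print(template_for("Fruit", "apple"))                      # fruit/apple
print(template_for("Fruit", "apple", override="special"))  # special
```

That's the whole trick: a pure function from names to a path, with an escape hatch. Magic of this shape is cheap to learn precisely because it follows.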
The problem is that there's a ton of people who don't seem to realize that Django and Rails aren't that different. People continually want to make silly judgements between the two and claim one the victor - people who usually aren't the ones developing those frameworks. Heck, even DHH has talked about being able to use Rails 3's routing to add Django-powered sections so that one could take advantage of GeoDjango's cool spatial stuff if one wanted. The Django core team is never really badmouthing Rails. There just seem to be people who want to fight over whether you have to say "def view(request)" or "def view" when neither is that helpful or that much of a hindrance. Is this how people keep from having to write their applications? Language and framework wars?
Me, I keep from writing my applications on HN ;-).
> Similarly, one could argue that rendering a template without having to specify it explicitly keeps Rails DRY. With Django, there's plenty of opportunity to make things harder to keep track of if you're constantly randomly naming templates.
Oh come on, if you're going to constantly randomly name things, no matter what they are, your application will be a mess no matter what framework you choose.