Rising to the bait again (to keep the fishy metaphor going), there's yet another discussion of how the ability to hack the runtime changes the world. At one level that's true, but not in this case.
Dependency Injection is not a recent invention that a cabal of TDDers forced on the world, and it's absolutely not something that we came up with just so we could crowbar in our Mock Objects.
The relevant design guideline, named the Law of Demeter ("Strong Suggestion" isn't very catchy) was first described by Karl Lieberherr in 1987! The Mock Object pattern came from people applying their substantial O-O experience to TDD in Java, trying to figure out how best to avoid exposing implementation details in unit tests. One of the critical lessons from Demeter is that objects should have explicit dependencies. It helps to keep them focussed on their responsibilities and, as a result, easier to maintain—and a good way to make dependencies explicit is to pass them in.
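To make "pass them in" concrete, here is a minimal sketch of constructor injection (all names invented for illustration): the object's one collaborator appears in its constructor, so the dependency is explicit and a test can substitute a fake without any framework.

```java
// Hypothetical sketch: AuditTrail's dependency on a Clock is passed in
// rather than reached for internally, so it is visible to every caller.
interface Clock {
    long now();
}

class AuditTrail {
    private final Clock clock;                    // explicit dependency
    private final StringBuilder log = new StringBuilder();

    AuditTrail(Clock clock) {                     // injected by the caller
        this.clock = clock;
    }

    void record(String event) {
        log.append(clock.now()).append(' ').append(event).append('\n');
    }

    String entries() {
        return log.toString();
    }
}
```

In a unit test, `new AuditTrail(() -> 42L)` gives a clock frozen at a known time; in production the caller wires in a real one. No container or XML involved, just a coding style.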
Unfortunately, since then the world seems to have filled up with DI frameworks which cloud what should be a coding style discussion. This is not about having to configure every last corner of your application in XML, this is about how objects, or small clusters of them, get to talk to each other.
Roy asks, "[...] do you need DI all over the place, or just in specific places where you know dependencies could be a problem?" Well no—if you have enough foresight to know where those places are. I'm struggling at the moment to test against a framework where everything is helpfully packaged up nice and tight so I can't create an instance of one of its core objects. It's actually well written, but the authors just weren't good enough at prophecy to figure out my particular need. That's why I don't rely just on my intuition, I use the needs of my unit tests to help me figure out where the seams should be. To counter the FUD argument, I have absolutely no problem with saying that I don't want tools that do magic because I need guidance with my code.
As Roy (very politely) concludes, there isn't a high enough proportion of really good code in the world (some of it mine) that we should be in a hurry to cut back on techniques such as DI. Just because something has been around for a while doesn't mean it's been superseded, especially in as conventional an environment as .Net.
11 comments:
To play devil's advocate for a second, Typemock would help you exactly in the situation you posted above.
But I agree that DI is not dead. Perhaps it's time to reconsider whether DI for the sake of DI is a bad thing, when tools allow testability even without it.
JMockit in Java and Typemock in .NET provide this ability, as does Ruby, where you can replace anything and everything.
Do you need DI in a language like Ruby, where everything is replaceable?
Roy
(disclosure: I recently started working at Typemock)
ISerializable.com
As everyone keeps saying, TypeMock is not limited to working around the back. That's true but that's not what I, personally, am looking for.
I'm not sure what "DI for the sake of it" means. I use DI to introduce objects that need to talk to each other, so that each has a coherent set of responsibilities. I think about the relationships first and the dependencies derive from that. This is fundamental to the school of OO that I mostly follow, what Ralph Johnson is pleased to call the mystical view of OO.
It's not a tool issue. Many of my circle spent time in the Smalltalk mines, which is far more flexible than Ruby (wanna change the meaning of inheritance?), and the Demeter ideas applied then. In retrospect, many of us wish we'd used inheritance less and composition more.
The idea behind DI, that of clearly distinguishing between an object's (or component's) provided and required services and separating how peer objects are interrelated from their internal implementation details, goes back further than the Law of Demeter. It was realised in the CONIC system [1] and perhaps in earlier systems too.
The idea has been given several names over the years. Dependency Injection is perhaps the worst, because it is so misleading.
[1] Kramer, J., Magee, J., Sloman, M.S., and Lister, A., CONIC: An Integrated Approach to Distributed Computer Control Systems, IEE Proceedings, 130, Pt. E (1983), 1-10.
Correction: the earliest mention of CONIC in the academic literature is 1981.
"Well no—if you have enough foresight to know where those places are"
A lot of the time you have strong evidence though.
I certainly know that my data access or infrastructure code will change for different reasons and at a different pace from my business code so I decouple.
However I'm not at all worried about a domain class directly referencing a domain rule written for it. Customer referencing CustomerMustHaveAddressRule (directly or indirectly) doesn't bother me at all.
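For illustration, a sketch of the kind of deliberate hard-wiring the comment describes, using the class names from the comment (the implementation is invented):

```java
// Customer and the rule written for it change together, so no seam or
// injection point is introduced between them -- a concrete dependency.
class CustomerMustHaveAddressRule {
    boolean isSatisfiedBy(String address) {
        return address != null && !address.isEmpty();
    }
}

class Customer {
    private final CustomerMustHaveAddressRule rule =
            new CustomerMustHaveAddressRule();    // hard-wired on purpose
    private final String address;

    Customer(String address) { this.address = address; }

    boolean hasValidAddress() {
        return rule.isSatisfiedBy(address);
    }
}
```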
@Nat. You just want a reference to Imperial College in there. :)
@Colin. I never cease to be amazed at the unexpected ways I find code develops. At the gross level, separating layers is pretty obvious (although some people still argue). That's also where an IoC container might be appropriate.
There are finer-grain levels too where these ideas still count, which is more what we're talking about here. Not every little feature, perhaps (that's another failure mode), but a lot more than many coders are used to.
"I never cease to be amazed at the unexpected ways I find code develops. At the gross level, separating layers is pretty obvious (although some people still argue). That's also where an IoC container might be appropriate."
Yeah agreed, between domain module/packages is another obvious example where the decoupling is useful.
"There are finer-grain levels too where these ideas still count, which is more what we're talking about here. Not every little feature, perhaps (that's another failure mode), but a lot more than many coders are used to."
I agree the ideas always count, but you need to find balance. I've certainly gone overboard on the decoupling approach and haven't found that a lot of the loose coupling bought me anything. In a lot of cases a dependency on a concrete class isn't going to cause you any problems and you should just go for it.
Sorry to post on this thread again and I'm a bit off topic here but I noticed Nat Pryce had posted on where/when he uses the LOD (http://c2.com/cgi/wiki?LawOfDemeter).
In particular I'm interested in the idea of having an exception for collections. Am I right in thinking this is to avoid having boring pass-through methods in cases like customer.Orders.Add?
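For what it's worth, here is a sketch of the two styles that question contrasts, translated into Java (all names invented):

```java
import java.util.ArrayList;
import java.util.List;

class Order {
    final String id;
    Order(String id) { this.id = id; }
}

class Customer {
    private final List<Order> orders = new ArrayList<>();

    // Strict Demeter style: a pass-through method, so callers write
    // customer.addOrder(o) and never touch the collection.
    void addOrder(Order order) {
        orders.add(order);
    }

    // The "collections exception": expose the collection and accept
    // customer.getOrders().add(o), skipping the boring delegate.
    List<Order> getOrders() {
        return orders;
    }
}
```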
"One of the critical lessons from Demeter is that objects should have explicit dependencies."
Really? I could not find any mention with similar meaning anywhere I looked:
http://en.wikipedia.org/wiki/Law_of_Demeter
http://www.ccs.neu.edu/home/lieber/LoD.html
http://www.ccs.neu.edu/research/demeter/papers/icse-04-keynote/ICSE2004.pdf
http://www.cmcrossroads.com/bradapp/docs/demeter-intro.html
What did I miss? Could you point out where I can find the source for that statement?
@Anonymous (perhaps you would like to leave a name?)
Nice to see that someone's still reading the old postings
If you look at the "rules" for which methods an object O may invoke, they're all attached to objects that are either internal to O or passed in to O. So, one reading is that O can only call external objects if they're passed in, which makes them explicit.
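A small illustration of that reading (names invented): a method may call methods on objects it was passed or that it holds internally, but reaching through a returned object couples it to another object's insides.

```java
class Engine {
    boolean started;
    void start() { started = true; }
}

class Car {
    private final Engine engine;                  // internal to Car
    Car(Engine engine) { this.engine = engine; }

    void start() { engine.start(); }              // Car talks to its own field
    Engine getEngine() { return engine; }
}

class Driver {
    // Allowed: 'car' is passed in to this method, so the dependency
    // is explicit and Driver may call it directly.
    void drive(Car car) {
        car.start();
    }

    // Not allowed under that reading: Driver was never handed the
    // Engine, so reaching through Car exposes Car's implementation.
    void driveBadly(Car car) {
        car.getEngine().start();
    }
}
```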