MEF and the Personal Spike

February 12, 2009

A week or two ago I attended the Open Space Coding day arranged by Alan Dean, and held at the Conchango offices. The Alt.Net community is very good at getting together and talking about code, software, and how it should all be done, but the focus of this meeting was to get on and write something!

The format was much like an open space conference: the first thing we did was suggest things we’d like to look into, experiment and play with. There were morning and afternoon sessions; in the morning I went to one on static reflection, a very interesting technique of using lambda syntax to analyse code as a traversable tree of expressions. All very good. Unfortunately for me, this was the moment my Windows 2008 Server VM on my MacBook decided to bomb out on me. And bomb out it did: Blue Screen of Death before Windows even got a chance to get its boot on. How embarrassing. There was me thinking I could finally be one of the cool kids, with my shiny white MacBook with aftermarket RAM and HD upgrades just so I could run Windows in a VM, and it all fell apart.

What I took away from the static reflection session is that it made me a whole lot more comfortable with reflection. Up until now I had always treated it like a leper of hackery, unjustly so, but this experience changed that. It did appear to me that there is space here for a good library to make traversal of the expression tree a lot more intuitive. Let me know if you know of any, or if I missed something that makes it all a lot easier than it looks.

The second session I attended was one on MEF. As we were all new to the format (or at least I thought we were) this got off to a bit of a slow start. We did that thing where you go into a room, prepped and briefed not to expect anyone to lead, and then stare at someone who seems to know more than anyone else until they stand up and start presenting. Andrew Clancy, a Conchango employee, admitted to having played with MEF, and so showed us all very basically what it’s all about.

Once we had been briefed we paired up and made our own toy examples. I think it was a good thing that Andy’s example was out of date and the version of MEF we all downloaded was completely different; it forced us all to learn a bit better how it all worked. Mike and I quickly came up with a toy example that involved contract killers.

We quickly cut two DLLs, each containing one type of contract killer. MEF made it unbelievably easy to export these implementations of IContractKiller: just attribute them up and they’re ready for consumption. The contract itself was implemented in its own library, and finally we had a console app that would tie it all together and offer up some victims to be killed (in various ways).

MEF allowed us to simply load all the DLLs in a directory and, in an IoC-like way, made them available to plug into IContractKiller-shaped holes (properties tagged with corresponding attributes). It also rather neatly allowed us to get all the implementations and put them in an IEnumerable for us to use.
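MEF is a .NET library, so none of this is its actual API, but the shape of the idea (export implementations of a contract, then ask a container for all of them as a collection) can be sketched in Ruby. All the names here are made up for illustration:

```ruby
# A hypothetical Ruby analogue of the MEF idea: "exported" implementations
# register themselves against a contract, and we can then enumerate them all.
module ContractKiller
  # Track every class that mixes the contract in. This hook plays roughly the
  # part of MEF's export attribute plus the catalog that scans the DLLs.
  def self.included(klass)
    implementations << klass
  end

  def self.implementations
    @implementations ||= []
  end
end

class Poisoner
  include ContractKiller
  def kill(victim)
    "#{victim} was poisoned"
  end
end

class Strangler
  include ContractKiller
  def kill(victim)
    "#{victim} was strangled"
  end
end

# The "console app": compose everything and offer up a victim.
ContractKiller.implementations.each do |killer_class|
  puts killer_class.new.kill("some poor victim")
end
```

In real MEF the registration is declarative (attributes) and a catalog discovers the types for you; here Ruby's `included` hook stands in for that discovery step.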

Now here’s what I really learnt that day. I knew about MEF when it was released and have seen many a blog about people using it (seen, not read). But only when I actually played with it did it start clicking: how I could leverage it in my work, where it might be appropriate to alleviate some problem we were having, and also where it wouldn’t be so handy. It’s very easy to get carried away with a new toy. Scott Cowan, who also attended the session, described this to me as a personal spike.

Take an hour or two, some time-boxed period, branch your code or knock up some simple harness, and just play with something you’ve not used before. Something you’re interested in learning. With no goal other than to learn a little more about what it is you’re playing with. As Andy Hunt describes in Pragmatic Thinking and Learning, learning by play really does work. It may seem obvious, but reading about something really is totally different from playing with it. Andy explains that it’s because play uses a different part of your brain, and that neglected part seems to have a hidden ability: to percolate and strike you in the downtime. Let your mind wander a little and it often does go and find something interesting and useful!

Woo, my first post written and published in one sitting. I hope it doesn’t show (much).


The object epiphany

February 2, 2009

My friend Mike Wagg recently had what I’m calling the object epiphany that all Ruby devs have at some point, the one that makes them fall in love with the language. Now I can’t claim to have had this great moment yet; I haven’t given myself enough time to play in the language to have achieved Ruby nirvana.

The reason I’ve decided to blog about it is that it tends to lead into some kind of OO rebellion, whereby devs cry from the rooftops that all OO languages before Ruby (or some other dynamic language) weren’t truly OO. That, in fact, class-based languages are of the false gods and we should all come into the light that is duck typing. I may have got a little carried away there. Sorry.

I want to record here my current viewpoint and opinion so that, in time, I can come back and look at what a fool I was. Hopefully that comes soon after I’ve made a million on my latest Rails app. Or indeed you can do just that right now, if you’ve already got there. You lucky so-and-so, you.

So it goes a little something like this. Ruby emphasises the messaging side of objects; it gives you complete freedom to look at the polymorphism side of OO all by itself. This is great. It’s called duck typing. You send a message to (or call a method on) an object, and if it can respond, it does so. Awesome: I’m no longer tied down by the compiler and its evil desire to know all about all before it allows you to ‘compile’, a completely unnecessary step in the world where I’m a rock star programmer. Sheet, if I wanted to I could open up that there object (not class) and add to it. This must be OO, I just referred to something as an object and not a class!
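A minimal sketch of that freedom (class and method names are mine): two completely unrelated classes answer the same message, and you can even open up one particular object and bolt a method onto it:

```ruby
class Duck
  def speak
    "quack"
  end
end

class Robot
  def speak
    "beep"
  end
end

# No shared base class, no interface: if it responds to the message, it works.
def provoke(thing)
  thing.speak
end

provoke(Duck.new)   # => "quack"
provoke(Robot.new)  # => "beep"

# Opening up that there *object* (not class) and adding to it:
rock = Object.new
def rock.speak
  "stony silence"
end
provoke(rock)       # => "stony silence"
```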

You then start to think we can do away completely with inheritance! For me this is where I get a little confused as to why this is such a revelation. I’ve never thought of inheritance as the primary mechanism for polymorphism (I mean, why would they be distinct OO concepts if that were so?). The idea that one would use inheritance in order to achieve some polymorphism, and for that reason alone, seems a little odd. I’ve always thought inheritance was for that old chestnut, code reuse. I’ve even heard that some folk don’t like inheritance when it’s used for just that reason! Yes, I know to prefer composition over inheritance, it’s more flexible and so on. But it’s all too easy to argue away inheritance entirely while overlooking something I think is quite integral to OO.

OO’s primary benefit, in my opinion, has always been that it’s just easier to map a real-world domain problem into computer code with it. My little brain has a better chance of understanding what a computer is doing if it’s expressed to me in groupings of stuff (logic and data) that I have a chance of mapping to something in the real world. Further to this, inheritance just fits this model of thinking. If I go about building a dog, and then I have to build a cat, and I see that they both work in the same way for some task (I don’t know, chewing), I’m going to throw that stuff into something they both are; let’s go with animals. Yeah, sure, I could make an Animal mixin that gives anything the ability to chew. I don’t disagree that that’s a potential course of action, and it may well have its benefits. It is still a bit easier to ‘get’, though, when inheritance links the two. The idea that because OO is for dealing with objects, dealing with object blueprints (classes) isn’t OO, seems a little baby-with-the-bathwater. I still get to think in objects, and alright, I don’t get to monkey patch them, but really, when was the last time a monkey rocked up and gave you the ability to quack like a duck? Not that that wouldn’t be cool if I were somehow trapped in a pond.
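The dog/cat chewing example, both ways, as a hypothetical sketch: inheritance puts the chewing in something both classes are, while a mixin just hands the ability out to anything that asks for it:

```ruby
# Inheritance: the shared behaviour lives in something a Dog and a Cat both ARE.
class Animal
  def chew(thing)
    "chewing #{thing}"
  end
end

class Dog < Animal; end
class Cat < Animal; end

# Mixin: the same behaviour packaged as a capability, given to anything at all.
module Chewing
  def chew(thing)
    "chewing #{thing}"
  end
end

class PaperShredder
  include Chewing  # definitely not an Animal, but it can chew
end

Dog.new.chew("bone")           # => "chewing bone"
PaperShredder.new.chew("docs") # => "chewing docs"
```

Both give you the same message at the call site; the inheritance version also tells the reader that a Dog and a Cat are the same kind of thing, which is the ‘thinking’ part of the argument above.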

So, inheritance is for my thinking, and for code reuse; I don’t think it’s for anything else. You get polymorphism for free, but it’s not there for it. When I hear or read that interfaces are in Java / C# for the purpose of multiple inheritance, I’m a little bit sick in my throat. Every time. Where was the code I inherited, or even the frikin data? There wasn’t any. The only thing you could say I inherited was my public interface contract, and even then just potentially a small part of it. No, interfaces are for polymorphism, and that alone. I appear to have gotten a little confident in my rant. Interfaces allow you to send a message to any object that can handle it; they let you take a heterogeneous collection of objects that share some common interface and play with them as though they were the same, allowing each to specialise how it behaves for that contract. All fairly nifty stuff.
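In Ruby the interface is implicit, but the same trick (a heterogeneous collection treated uniformly through a shared contract) looks like this sketch, with names invented for illustration:

```ruby
class Circle
  def initialize(radius)
    @radius = radius
  end

  def area
    (3.14159 * @radius * @radius).round(2)
  end
end

class Square
  def initialize(side)
    @side = side
  end

  def area
    @side * @side
  end
end

# Different concrete types, one shared contract: each specialises `area`.
shapes = [Circle.new(1), Square.new(3)]
shapes.map(&:area)  # => [3.14, 9]
```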

Why do we have to put up with C-like languages and their need to know everything? Where did it come from? From what I can tell, and this has nothing to do with any research, just a feeling, it’s because these languages were made by hardcore computer scientists. They had to deal with just 1K of RAM, or had fresh memories of punch card pains. I think what it gets us is performance and maybe stability. In the day of 3.2GHz x 4 cores on my personal computer, that probably means not so much, but if Twitter’s uptime is anything to go by, I’d say it’s still got a little bit left in it. (NB: I don’t think that’s all it gives us, I just want to keep to less than 1k words.)

Keep in mind Ruby is a language that throws all languages together to allow anyone to join in. How much of this was achieved by accident I don’t know. I suspect much of it comes from Python, and Ruby has just made it accessible to us foolish C/C++/Java/C# newbies by pretending it likes class-based design. It’s a rockstar language made by a rockstar for rockstars (the dude’s name is Matz with a ‘z’ and everything)*

I can’t wait for my object epiphany.

*The ‘z’ thing may have been made up by me, I’m not sure.