Microwave Typing is made to protect dumb programmers from themselves, using different types of microwaves. You know, if you want to warm up some milk you use the milk microwave, but if you have some leftover pasta from last night you must use the pasta-from-last-night microwave, because the milk microwave's door won't open for it, and neither will the pasta microwave's. Okay, you need four or five rooms full of microwaves to be able to warm up whatever you want, but it's very safe. In particular, it is strictly impossible to put anything metallic in a microwave, which would be very likely to make it explode.
Fortunately, Microwave Typing is here, and with it the programming situation analogous to the metaphor above is impossible: your code will be free of type errors. This is important because professional programmers can't take care of this kind of thing by themselves. Programming languages that don't offer Microwave Typing are clearly far too dangerous to be used in the real world...
Err, wait a moment. In real life, I only have one microwave, and I just know that I shouldn't put anything metallic into it, right? And so does everyone else... Oh. Ah, and wait again. I'm not even a professional warmer-upper! And I have some last-minute information: it seems most people aren't either!
So it seems everyone is able to use a (void*) microwave without messing up... Strange, isn't it?
Update: I received an email about this article and responded to it; I'll copy the conversation here to mitigate the trollishness of this article.
On Thu, Sep 29, 2011 at 6:57 AM, Tyr <*@*> wrote:
Subject: Microwave typing
Now, I'm a Haskeller, so strong static typing is something I've learned to value quite a bit, and I feel that your example doesn't match my experience with static typing at all. Maybe it's because in Haskell you tend to make use of higher-order functions and complicated things, so being able to see which bits fit together nicely, without really debugging, matters. The type signatures are like a very simplified version of your functions, where you can see how things must fit together. Having the correct types is a prerequisite to being semantically correct. And you can write generic code too. Have a typeclass (interface is another name for it) for things where temperature matters. Then a microwave function can take anything that satisfies the interface and add temperature to it.
I agree with that, you know; this blog post was a troll ;-).
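And to be fair, the generic microwave Tyr describes is a real thing. He'd write it with a Haskell typeclass; here is a rough sketch of the same idea in Python, using typing.Protocol as a stand-in for the typeclass (all the names — Heatable, microwave, Milk, Pasta — are mine, made up for illustration):

```python
from typing import Protocol


class Heatable(Protocol):
    """The 'interface': anything with a temperature a microwave can raise."""
    temperature: float


def microwave(item: Heatable, degrees: float) -> None:
    """One generic microwave for anything satisfying the interface."""
    item.temperature += degrees


class Milk:
    def __init__(self) -> None:
        self.temperature = 4.0  # fresh from the fridge


class Pasta:
    def __init__(self) -> None:
        self.temperature = 8.0  # last night's leftovers


milk = Milk()
microwave(milk, 30.0)   # milk.temperature is now 34.0

pasta = Pasta()
microwave(pasta, 50.0)  # pasta.temperature is now 58.0
```

One microwave, many foods: the type checker verifies that whatever you pass in has a temperature, and that's all it needs to know.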
Actually, what annoys me is the "well-typed programs can't go wrong" motto, because it is stupid: "real" bugs are in the algorithms. For instance, a programmer could write an algorithm that works on a DAG and badly write the checking code that makes sure the data he reads at runtime actually corresponds to an _acyclic_ graph. Or he could not check that at all, and at some point something changes in how the program is used which makes reading a cyclic graph possible. This kind of erroneous behavior won't be caught by a type system, as static as it may be. And this is where the "real" bugs are.
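To make the point concrete, here is a sketch of that kind of bug in Python (a hypothetical example of mine, not from any real codebase): a topological sort that silently assumes its input is a DAG. It type-checks perfectly, it never crashes, and on a cyclic graph it quietly returns a nonsense order instead of rejecting the input:

```python
def topo_sort(graph: dict[str, list[str]]) -> list[str]:
    """Topological sort by depth-first search.

    BUG: assumes `graph` is acyclic and never checks. The types are
    fine, so no type system will complain; on a cyclic graph this
    silently returns an order that violates some edges.
    """
    visited: set[str] = set()
    order: list[str] = []

    def visit(node: str) -> None:
        if node in visited:
            return
        visited.add(node)
        for succ in graph.get(node, []):
            visit(succ)
        order.append(node)

    for node in graph:
        visit(node)
    order.reverse()
    return order


# Fine on a real DAG:
print(topo_sort({"a": ["b"], "b": ["c"], "c": []}))  # ['a', 'b', 'c']

# On a cycle, no valid order exists, yet it happily returns one:
print(topo_sort({"a": ["b"], "b": ["a"]}))  # ['a', 'b'] — wrong, silently
```

The missing cycle check is an algorithmic invariant, not a type: exactly the kind of bug that lives below the type system's radar.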
So this is why I wrote this troll: to moan "please stop being static-typing fundamentalists, it won't make your coffee!" at people who do think that "well-typed programs can't go wrong".
Now that I've said that, I have to admit that having a good type system allows the programmer to focus on the "real" bugs, such as the one I described, and leave all the "obvious" ones to static checking. In this sense, static typing is useful, but one should not be fanatical about it, and should remember that a program is not bug-free just because it passes static type checking.