The Ugly American Programmer: Bill Burke

I originally tried to post this as a comment on Bill Burke's Polyglotism is the worst idea I ever heard, but as far as I know, my comment is still awaiting moderation. So I'll go ahead and dissect it here. Burke starts with an allegedly tongue-in-cheek disclaimer:
Let me first start off with some tongue in cheek…I’m an American. I have no need to speak another language because well, I speak English. 99% of the civilized world speaks English so WTF should I ever learn another language? Case in point, I minored in German in college. For two semesters I went over to Munich and worked at Deutsche Aerospace so that I could learn German better. The thing is, besides the fact that my coworkers spoke damn good English to begin with, all the documentation they put out was in English, the units used were feet, pounds, miles. When the French came over to work with us, we also spoke English. So what is the freakin point of learning German? I was pretty damn disappointed that I wasted all this time in college learning German when in reality it was just a freakin useless exercise…
So we've taken the ugly American attitude and are now applying it to software development. Now, if he were trying to make the point that there is a lot of benefit to learning another language, his tongue-in-cheek disclaimer would make sense. But instead the ignorant statement above is followed by more ignorant rambling about using multiple languages in a project. He seems to think tongue-in-cheek means "I can say something ignorant and not be criticized for it."
Which brings me to the point of this blog. Polyglotism in software has to be the worst idea I ever heard. The idea of it is that you use the language that is best fits the job. Some say this is a huge boon for the developer as they will become more productive. In practice though, I think this is just a big excuse so the developer can learn and play with a new language, or for a language zealot/missionary to figure out a way to weasel in his pet language into a company. Plus, you’d probably end up being average or good at many languages but a master of none….But lets pretend that it is a benefit to the developer. Developers need to realize that there are implications to being polyglot.
Notice his focus on the single developer who is out to ruin things through his own selfishness (he must hate America!). No consideration is given to the benefits that a team and organization can derive from using multiple languages (which we will get to). His next point, maintenance:
So, you’ve added a Ruby module to that big flagship Java application or product your company is so proud of. You did it fast. It works. And management loves you for it. They love you so much for it, they’ve promoted you to software lead and now you are running a brand new project. Now that you’ve left your polyglot project, somebody needs to take over your work. Unfortunately, your group is a bunch of Java developers. For any bug that needs to be fixed, these developers need to be retrained in Ruby, a new Ruby developer needs to be hired, and/or a Ruby consultant/contractor needs to be brought it. Multiply this by each language you’ve introduced to your project.
Oh no, a Java programmer has to learn some Ruby! If you have any talented developers, they'll probably jump at the chance to escape Java hell and learn something new, assuming they haven't already learned Ruby on their own. "Multiply this by each language" is meaningless scare language that assumes skills don't transfer between programming languages. It's not at all difficult to be proficient in multiple languages. Many Java/PHP developers are also well versed in JavaScript (if you discount JavaScript at this point, I hope you have a career path that doesn't involve any programming for the web). Knowing multiple languages demonstrates mental flexibility and teaches a developer to think about problems in different ways. I would never hire a programmer who was proficient in only a single language.
The JVM is pretty cool now. We can run Ruby on it, Python on it, and even PHP on it. Your JRuby apps can work with Java APIs. Same with Jython and JPHP. Great. So now your developers can use any one of these language to build out extensions to your Java app. But what happens when you want to refactor one of your re-used Java libraries? OOPS!!!
Groovy and Scala are conspicuously absent from Mr. Burke's list. Could that be because they both compile to JVM bytecode, and the resulting classes can therefore be used by any JVM language? You could use Java, Scala, and Groovy together in the same project, completely interchangeably, right now.
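To make that concrete, here is a minimal sketch (the class names in the comments are illustrative). Once groovyc or scalac has run, the output is ordinary .class files; the JVM neither knows nor cares which compiler produced the bytecode, which is why loading a class purely by name works the same either way:

```java
// Sketch: JVM-language interchangeability. A class compiled from Groovy or
// Scala is loaded exactly like one compiled from Java — here we load
// java.util.ArrayList by name, just as we could load a hypothetical
// com.example.Pricer that groovyc or scalac had compiled.
import java.util.List;

public class InteropSketch {
    public static void main(String[] args) throws Exception {
        // Class.forName sees only bytecode, never the source language.
        Class<?> cls = Class.forName("java.util.ArrayList");
        @SuppressWarnings("unchecked")
        List<String> list = (List<String>) cls.getDeclaredConstructor().newInstance();
        list.add("bytecode is bytecode");
        System.out.println(list.get(0)); // prints "bytecode is bytecode"
    }
}
```

Refactoring a shared Java library is no scarier here than in a pure-Java project: the other languages consume the same class files, so the same compile-time breakage shows up in the same places.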
Ah, so you’ve weathered through the maintenance and refactoring nightmares and you’ve finally shipped your product. Hmm, but you’ve just added the complexity of installing multiple runtimes on your user base. Its not hugely bad if you’ve used the JVM as your base virtual machine. But you still need to package and install all the appropriate Java libraries, Ruby gems, and Python libraries on the appropriate places of your user’s machine. You also need to hope that all the desparate environments don’t conflict with anything the user has already installed on his machine. And that’s just with languages running in the JVM. Imagine if you used vanilla Ruby, Python and Java all as separate silos!
These new languages reduce maintenance by dramatically shrinking the size of the codebase and by simplifying application setup and ramp-up times. Give me a domain model and I can have a functional Grails or Rails application running in less than an hour, with a code base a fraction of the size of the equivalent Java project. As for packaging, Maven handles Groovy and JRuby quite well. I'm sure PHP and Jython will have support soon enough, if they don't already (I don't know).
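For instance, wiring Groovy compilation into a Maven build is a matter of declaring a plugin. A sketch using the GMavenPlus plugin (version numbers omitted; you also need a dependency on the Groovy runtime itself):

```xml
<!-- Sketch: compile Groovy sources alongside Java in one Maven module. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.gmavenplus</groupId>
      <artifactId>gmavenplus-plugin</artifactId>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>compileTests</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

After that, `mvn package` produces one artifact containing the Java and Groovy classes together; there is no separate runtime to install on the user's machine.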
A support call comes in for your product or application. Its obviously a bug, but where is the problem? Is it your application code? Your JVM? Your Ruby VM? Your Java library? Your Ruby gem?
What an absurd statement. Where is the bug? I don't know; how do you normally find the bug in a "pure" application? Let's see, the date is parsed incorrectly and we're using some Ruby code to parse dates... Where could the bug be? Is it in the PHP templates? Only if you're an idiot. You could make the same argument about the MVC pattern and Hibernate. Three layers? How will I know where the bug is? What if the bug is in a library? I'd better write it all myself in C. (Note that this is the correct way to make a tongue-in-cheek statement.)
All and all, let me put it this way. We all work in multi-national environments. What if each developer documented their projects in their own native language, because lets face it, they are most productive in that language. Where would we be? Doing what’s best for oneself isn’t always best for the big picture
This last point about documentation doesn't even make sense. You write for your target audience. For a programming language, the intended audience is the compiler or interpreter: you write Ruby code for the Ruby interpreter, Java for the Java compiler, and so on.

The whole thing is absurd. You might as well say that SQL is a waste of time. The simple fact is that modern programmers regularly use multiple languages in the same project, and the only ones who have trouble are the poorly trained and ignorant (who can hardly be called masters of even the one language they know), and the old fogies, like Burke, who consider learning anything new a waste of time.