I don’t understand why companies are so hesitant to invest in infrastructure. For the rest of this article, I’ll be referring to internal tools and such, not applications meant for public use.
As a developer, the biggest bummer is when you have to deal with legacy code that’s suddenly broken. No one is around who remembers how it works, and it’s usually in some language the current developers don’t know. Often in those cases, there’s some special instance – a database or web server, for example – that’s kept around solely to keep that one application alive. That special instance could be jettisoned for something more up to date, but no one’s willing to invest a few weeks reimplementing the application in a more modern language.
I mean, yes, I understand the business logic: why pay a developer to rebuild this thing when the old one mostly works? The problem is that the developer spends increasingly more time fixing each issue, and at some point the amount you’ve spent on upkeep equals or exceeds what you could have spent building something more efficient. Beyond that, business requirements change, and what you built the application for several years ago just doesn’t make sense anymore – it needs to be expanded to handle new circumstances. In some cases, there are better applications already on the market. With Web 2.0 in full swing, everyone’s building one-off apps that cost a few bucks a month, and they handle your new use case better than your current software does.
It just seems to me that every few years, it would make sense to rebuild necessary internal applications and infrastructure. It would certainly improve efficiency within the organization, could save the company a few bucks in the long run (in terms of human hours and hardware costs), and would increase business value through added functionality. And it’d probably make all the developers’ lives a little easier.