Question

There is nothing like the 43rd day of your life spent tracking down issues due to CR/LF, different slash types, or a Big Endian vs. Little Endian bug. These issues are 20 years old and they make me feel as though humans are still cavemen. Are we simply replacing these old issues with new ones? XML has helped, but aren't these issues costing companies millions in time, money, and effort? Is it a conspiracy to promote vendor lock-in?

Solution

Yes.

However, I don't think it's a conspiracy as such. "Never attribute to malice that which can be explained by incompetence."

OTHER TIPS

I reckon we're stuck with the old problems, and every day we get a slew of new ones too. It's not to do with vendor lock-in; it's more to do with the way we think. Realistically we are still cavemen: our brains haven't changed much in 20,000 years, and we carry on making the same mistakes.

You've touched on a much bigger philosophical observation than just coding, it applies to most aspects of human life.

There's always the upcoming Unix date overflow: a signed 32-bit time_t runs out in January 2038.
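The exact overflow moment is easy to compute: a signed 32-bit counter of seconds since the Unix epoch maxes out at 2**31 - 1. A minimal Python check:

```python
from datetime import datetime, timezone

# A signed 32-bit time_t counts seconds since the Unix epoch
# (1970-01-01 00:00:00 UTC) and overflows past 2**31 - 1 seconds.
overflow = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)
print(overflow)  # 2038-01-19 03:14:07+00:00
```

One second after that instant, a 32-bit time_t wraps to a negative value, i.e. a date in 1901.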

Unlike physical constructs such as Token Ring networks, software and data are intangible. I think data-formatting issues like the CR/LF problem will still persist 20 years in the future (especially considering they are not solved now).
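The usual defensive move against the CR/LF problem is to normalize line endings at the boundary of your program. A minimal sketch (the function name is my own, not from the answer):

```python
def normalize_newlines(text: bytes) -> bytes:
    # Convert Windows (CR LF) and old-Mac (CR) line endings to Unix (LF).
    # Order matters: collapse CR LF pairs first, then any lone CRs.
    return text.replace(b"\r\n", b"\n").replace(b"\r", b"\n")

print(normalize_newlines(b"one\r\ntwo\rthree\n"))  # b'one\ntwo\nthree\n'
```

This is essentially what Python's own "universal newlines" mode does when you open a text file.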

You can make a judgment call on each item. If programs refuse to read data in the wrong endianness, the data will be converted and the problem will eventually die off. But as long as programs follow the Robustness Principle ("be liberal in what you accept"), things like CR/LF, big vs. little endian, and the mishmash of HTML will persist for a very long time.
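The endianness half of that judgment call comes down to declaring byte order explicitly instead of trusting the host. A small Python sketch of why it matters:

```python
import struct

value = 0x12345678

# Pack the same 32-bit integer with an explicit byte order.
big = struct.pack(">I", value)     # big-endian:    12 34 56 78
little = struct.pack("<I", value)  # little-endian: 78 56 34 12
print(big.hex(), little.hex())     # 12345678 78563412

# Reading with the wrong byte order silently yields garbage,
# which is exactly the class of bug the question complains about.
print(hex(struct.unpack("<I", big)[0]))  # 0x78563412
```

File formats and network protocols that pin down a byte order (as TCP/IP does with network byte order) dodge this bug entirely.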

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow