In the good old days (no, this is not grandpa-talk :p) a Megabyte used to be 2^20 Bytes, which equals 1,048,576 Bytes.
It was primarily the hard disk vendors who started using the decimal system to calculate disk space, because that way the numbers on the box grow faster.
Under that system a Megabyte is 10^6 Bytes, which equals 1,000,000 Bytes.
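Just to double-check the arithmetic, here is a tiny Python sketch of the two definitions (the variable names are mine, purely for illustration):

```python
# The two competing definitions of a "Megabyte", in plain numbers.
binary_megabyte = 2 ** 20    # the traditional, binary meaning
decimal_megabyte = 10 ** 6   # the vendors' decimal meaning

print(f"{binary_megabyte:,}")   # 1,048,576
print(f"{decimal_megabyte:,}")  # 1,000,000
```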
From a computer science point of view this is just wrong. It is like saying a kilogram isn't 1,000 grams anymore but roughly 954 grams. How does that sound to you?
Weird, right?
As long as you only think about small files this might not sound like much of a difference: for a Megabyte it is about 4.9%. But by the time we reach, say, a Terabyte the difference is already almost 10%. In other words, you have been cheated out of roughly 100 Gigabytes of disk space. That's quite an amount, isn't it? Some things are just wrong no matter how you look at them. The thought that more and more companies in the IT industry might switch to this kind of maths leaves a sad aftertaste for me.
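If you want to see how the gap grows with each prefix, here is a short Python sketch of the same calculation (again just my own illustration, nothing official):

```python
# How much bigger the binary unit is than its decimal counterpart,
# and how the gap grows with each prefix step.
prefixes = ["Kilo", "Mega", "Giga", "Tera"]

for i, name in enumerate(prefixes, start=1):
    binary = 2 ** (10 * i)    # KiB, MiB, GiB, TiB in bytes
    decimal = 10 ** (3 * i)   # kB, MB, GB, TB in bytes
    gap = (binary - decimal) / decimal * 100
    print(f"{name}byte: binary={binary:,} decimal={decimal:,} gap={gap:.1f}%")

# Output:
# Kilobyte: binary=1,024 decimal=1,000 gap=2.4%
# Megabyte: binary=1,048,576 decimal=1,000,000 gap=4.9%
# Gigabyte: binary=1,073,741,824 decimal=1,000,000,000 gap=7.4%
# Terabyte: binary=1,099,511,627,776 decimal=1,000,000,000,000 gap=10.0%
```

So with every prefix step the discrepancy gets worse, which is exactly why it barely shows on a floppy-sized file but really hurts on a Terabyte drive.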