Computing enthusiasts may have to get used to a whole new set of terms because of a wrangle over semantics.
A binary megabyte is 1,048,576 bytes whereas a decimal megabyte is 1,000,000 bytes, and the mismatch has caused endless confusion and several lawsuits.
Many end users do not know the difference between the two scales of measuring capacity and, as a result, may feel after buying a product that they got less capacity than they expected.
To solve this problem, new terms may be introduced. A tebibyte is a terabyte counted in binary (1,099,511,627,776 bytes), and a gibibyte is a gigabyte counted in binary (1,073,741,824 bytes), leaving the plain terabyte and gigabyte to mean their decimal values.
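The scale of the confusion is easy to see with a little arithmetic. Here is a small illustrative sketch (the prefix tables are just the standard powers of 10 and 2, not from the original article) showing how the gap between the binary and decimal readings grows with each prefix:

```python
# Decimal (SI) prefixes count in powers of 10; binary (IEC) prefixes in powers of 2.
SI = {"kilo": 10**3, "mega": 10**6, "giga": 10**9, "tera": 10**12}
IEC = {"kibi": 2**10, "mebi": 2**20, "gibi": 2**30, "tebi": 2**40}

for (si_name, si), (iec_name, iec) in zip(SI.items(), IEC.items()):
    gap = (iec - si) / si * 100  # how much bigger the binary unit is, in percent
    print(f"1 {iec_name}byte = {iec:,} bytes; 1 {si_name}byte = {si:,} bytes "
          f"({gap:.1f}% larger)")
```

The gap is only 2.4% at the kilobyte level but roughly 10% by the terabyte level, which is why a drive sold as "1 TB" looks noticeably smaller when an operating system reports its size in binary units.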
Both terms are a bit of a mouthful but here's hoping they will go some way to ending the confusion.