It's all binary's fault.
The standard SI definition of kilo-anything is 1000 of that thing, but computers can't deal in powers of ten, only powers of two. So early on it was decided that "near enough was good enough" and that 1024 bytes (2^10, the nearest power of two) would be called a "kilobyte", despite that being 24 more bytes than the SI definition allows for.
Well, back in the days of kilobytes the extra 24 bytes meant SFA, but now that we're in the realm of terabytes it's really starting to add up.
Manufacturers are actually adhering to the SI rule: when you buy a terabyte HDD you are getting 1,000,000,000,000 bytes of storage. But then the computer's wacky idea of a kilobyte comes in to ruin your party.
1,000,000,000,000 / 1024 = 976,562,500 supposed kilobytes
976,562,500 / 1024 = 953,674 supposed megabytes
953,674 / 1024 = 931 supposed gigabytes
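The division chain above is easy to sanity-check yourself. A minimal sketch (the variable names are mine, not anything official):

```python
# What your OS does with the byte count of a "1 TB" drive:
# repeatedly divide by 1024 instead of 1000.
advertised_bytes = 1_000_000_000_000  # the SI terabyte the manufacturer sells

kb = advertised_bytes / 1024  # supposed "kilobytes"
mb = kb / 1024                # supposed "megabytes"
gb = mb / 1024                # supposed "gigabytes"

print(f"{kb:,.0f} 'KB'")  # 976,562,500
print(f"{mb:,.0f} 'MB'")  # 953,674
print(f"{gb:,.0f} 'GB'")  # 931
```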
So the drive should report about 931 "GB" per the maths above, which means any shortfall beyond that, roughly 1.5 "GB", is down to formatting etc. Looks fine to me.
In reality you're getting the full 1,000,000,000,000 bytes you paid for, minus a little bit for formatting.
That "near enough" kilobyte decision from so long ago now means there's about a 7% discrepancy between the bytes you buy and the "gigabytes" your computer tells you that you have.
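The 7% figure falls straight out of the same arithmetic. A quick sketch (again, names here are just for illustration):

```python
# The gap between an SI terabyte and what the OS reports in binary "GB".
advertised = 1_000_000_000_000        # bytes in one SI terabyte
reported_gb = advertised / 1024**3    # binary "gigabytes" the OS shows (~931.3)
discrepancy = 1 - reported_gb / 1000  # fraction that seems to go "missing"

print(f"{discrepancy:.1%}")  # 6.9%
```

Nothing is actually missing, of course; it's purely the units disagreeing.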