Sunday, May 28, 2017

Computer memory uses 1,024 bytes for a kibibyte and computer disks use 1,000 bytes for a kilobyte: now you can stop confusing the binary and decimal definitions of storage units


    Obviously, computer engineering and computer science seem to have nothing in common when it comes to the confusions they present to each other and to the general public.
How?
     One such confusion is the unit of measurement for data, which grew from the bit to the byte to the word and on to higher units that I do not really have time to explain in this post.
It really seems someone is either being too technical or trying to deceive people with a unit definition that could literally mean something totally different, and that someone could be Microsoft, which uses the same binary definition for both memory and disk, which is wrong for disks. For all their highly qualified programmers, Microsoft in the early days of computing did not use the correct units for file sizes, and no one cared to stop them till it spread like WannaCry. Later, manufacturers took advantage of that to intentionally misrepresent the capacity of their disk and flash memory devices by using decimal definitions, in which a megabyte is 1,000,000 bytes.
The memory of a computer system, which is mostly the RAM, is defined in binary terms, following the 2^n rule: n address bits can reach 2^n bytes. So 8 bits used as a memory address can address 256 bytes of memory, 16 bits can address 64 kibibytes (which is where early PCs started), 32 bits can address 4 gibibytes, and 64 bits can address 16 exbibytes.
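To make the 2^n rule concrete, here is a minimal Python sketch (purely illustrative; the helper name addressable is my own) that prints the maximum memory reachable with each common address width:

```python
# Addressable memory for an n-bit address is 2**n bytes.
BINARY_UNITS = ["bytes", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB"]

def addressable(bits):
    """Human-readable size of the 2**bits bytes an n-bit address reaches."""
    size = 2 ** bits
    unit = 0
    while size >= 1024 and unit < len(BINARY_UNITS) - 1:
        size //= 1024          # step down one binary prefix at a time
        unit += 1
    return f"{size} {BINARY_UNITS[unit]}"

for bits in (8, 16, 32, 64):
    print(f"{bits}-bit addresses -> {addressable(bits)}")
# 8-bit addresses  -> 256 bytes
# 16-bit addresses -> 64 KiB
# 32-bit addresses -> 4 GiB
# 64-bit addresses -> 16 EiB
```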
In the early days of the computer industry, "kilo" was used for 1,024 bytes; then the hard drive industry realised they could put bigger numbers on their drives if they used 1,000, so someone came up with "kibibyte" for 1,024 bytes. The binary prefixes go kibibyte, mebibyte, gibibyte, tebibyte, pebibyte, exbibyte, zebibyte and yobibyte, each a factor of 1,024 apart instead of 1,000. While the hard drive people switched to multiples of 1,000, I don't know if the memory people ever did. But you can see why it was convenient to "round" 1,000 up to 1,024 for computers, and then how having more than one definition for the metric prefixes became confusing.
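To see how far apart the two ladders drift, here is a small illustrative Python snippet (the unit values are the standard definitions; the variable names are mine) showing why a drive sold as 1 TB shows up as roughly 931 GiB:

```python
# Decimal (SI) vs binary (IEC) prefixes: steps of 1,000 vs 1,024.
SI  = {"KB": 10**3,  "MB": 10**6,  "GB": 10**9,  "TB": 10**12}
IEC = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30, "TiB": 2**40}

# A "1 TB" drive, as labelled by the manufacturer (decimal).
drive_bytes = 1 * SI["TB"]        # 1,000,000,000,000 bytes

print(drive_bytes / IEC["GiB"])   # ~931.32 GiB, what binary tools report
print(IEC["GiB"] / SI["GB"])      # 1.0737..., and the gap widens at every prefix
```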
There should just be one set of prefixes for everything.
Too bad nobody in the early days of computing thought of making a set of prefixes based on some multiple of a power of 2 (1,024 is 2^10, and 10 is not a power of 2) -- that would have been ideal for computers. But it seems to be too late now, as Microsoft ruined everything by following the wrong trend.
Below are the system properties of my Fedora Linux machine, showing how memory and disk are defined differently: binary and decimal respectively.
[Screenshot: Linux uses gigabytes correctly]
The yottabyte, currently the largest standard metric prefix, remains the biggest unit we can think of for the big data industry. The decimal prefixes go kilo, mega, giga, tera, peta, exa, zetta, yotta, each a factor of 1,000 apart.
After 64 bits, which can address up to 16 exbibytes, 128 bits can address 281,474,976,710,656 yobibytes, and we are far from a 128-bit address bus as of now.
The latest we have heard of is rumours of a 92-bit bus, which would give us addressable data of up to 4,096 yobibytes.
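Those two figures are easy to verify; a couple of lines of Python (plain arithmetic, nothing assumed) confirm that 2^128 bytes is 2^48 yobibytes and 2^92 bytes is exactly 4,096 yobibytes:

```python
YIB = 2 ** 80                # one yobibyte, in bytes

print((2 ** 128) // YIB)     # 281474976710656 -> 2**48 YiB for 128-bit addresses
print((2 ** 92) // YIB)      # 4096            -> 2**12 YiB for 92-bit addresses
```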
GiB vs GB
1 GB  = 1,000,000,000 bytes (decimal: how disks are sold)
1 GiB = 1,073,741,824 bytes (binary: how memory is counted)
Thank you for your time, and help stop the confusion Microsoft brought to the computer world.