GB (Gigabyte)

The gigabyte is a multiple of the unit byte for digital information. The prefix giga means 10^9 in the International System of Units (SI). Therefore, one gigabyte is 1,000,000,000 bytes. The unit symbol for the gigabyte is GB.

This definition is used in all contexts of science, engineering, business, and many areas of computing, including hard drive, solid state drive, and tape capacities, as well as data transmission speeds. However, the term is also used in some fields of computer science and information technology to denote 1,073,741,824 (1024^3 or 2^30) bytes, particularly for sizes of RAM. The use of gigabyte may thus be ambiguous. Hard disk capacities are described and marketed by drive manufacturers using the standard metric definition of the gigabyte, but when a 500-GB drive's capacity is displayed by, for example, Microsoft Windows, it is reported as 465 GB, using a binary interpretation.
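
As a worked check of the 500-GB example above, here is a minimal Python sketch that converts a marketed decimal capacity into the binary units an operating system reports (the variable names are illustrative):

```python
# Convert a marketed (decimal) drive capacity to its binary-prefix equivalent.
marketed_gb = 500                   # capacity as advertised, in decimal gigabytes
total_bytes = marketed_gb * 10**9   # 1 GB = 1,000,000,000 bytes (SI definition)
binary_gib = total_bytes / 2**30    # 1 GiB = 1,073,741,824 bytes

print(f"{marketed_gb} GB = {total_bytes:,} bytes = {binary_gib:.2f} GiB")
# Prints: 500 GB = 500,000,000,000 bytes = 465.66 GiB
```

The roughly 7% gap between 500 and 465.66 is exactly the ratio between 10^9 and 2^30; no storage is "missing", the two numbers simply use different units.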

To address this ambiguity, the International System of Quantities standardizes the binary prefixes, which denote a series of integer powers of 1024. With these prefixes, a memory module that is labeled as having the size 1 GB has one gibibyte (1 GiB) of storage capacity.
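
A short sketch of that prefix ladder, computing each binary prefix as the next integer power of 1024:

```python
# Each binary prefix denotes the next integer power of 1024.
for power, prefix in enumerate(["KiB", "MiB", "GiB", "TiB"], start=1):
    print(f"1 {prefix} = 1024**{power} = {1024**power:,} bytes")
# The third line prints: 1 GiB = 1024**3 = 1,073,741,824 bytes
```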
