If you use a computer or are thinking about buying one, you probably see the term "GB" quite a bit. It is a term used for measuring digital-information storage size.
GB stands for "gigabyte." A gigabyte is 1 billion bytes. A byte is one of the smallest units of measurement of digital-information storage. In some contexts (such as RAM, described below) a gigabyte refers to a larger amount than 1 billion: 2 to the 30th power, or 1,073,741,824, bytes.
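The two interpretations of "gigabyte" above can be checked with simple arithmetic. This is a minimal sketch; the variable names are illustrative, not standard terms.

```python
# Decimal vs. binary interpretations of "gigabyte" (illustrative names).
decimal_gb = 10 ** 9   # 1,000,000,000 bytes: the "1 billion bytes" definition
binary_gb = 2 ** 30    # 1,073,741,824 bytes: the convention often used for RAM

print(decimal_gb)               # 1000000000
print(binary_gb)                # 1073741824
print(binary_gb - decimal_gb)   # 73741824 bytes difference (about 7.4%)
```

The roughly 7 percent gap between the two definitions is why a drive advertised as a given number of gigabytes can appear smaller when an operating system reports its size using the binary convention.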
The term "byte" originally referred to the smallest amount of data that a computer could "bite" or handle at one time. It was deliberately spelled with a y, rather than "bite," to avoid accidental confusion with another computing term: bit. Giga is a prefix denoting 1 billion.
You will most often need to convert between megabytes (1 million bytes) and gigabytes. There are 1,000 megabytes in 1 gigabyte. Occasionally you may encounter TB (terabyte), which is 1,000 gigabytes, or 1 trillion bytes.
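The conversions above can be sketched as a pair of small helper functions. These are hypothetical helpers written for illustration, using the decimal convention (1,000 megabytes per gigabyte, 1,000 gigabytes per terabyte) described in the text.

```python
def mb_to_gb(megabytes):
    """Convert megabytes to gigabytes (1 GB = 1,000 MB)."""
    return megabytes / 1000

def gb_to_tb(gigabytes):
    """Convert gigabytes to terabytes (1 TB = 1,000 GB)."""
    return gigabytes / 1000

print(mb_to_gb(2500))   # 2.5  -> 2,500 MB is 2.5 GB
print(gb_to_tb(640))    # 0.64 -> a 640 GB drive is 0.64 TB
```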
Hard drives and very large files, such as movies, are usually measured in gigabytes. For instance, Apple sells its computers with hard-drive capacities ranging from 120 GB to 640 GB.
GB is also used to measure RAM (Random Access Memory). The amount of RAM a system has affects its ability to multitask, that is, to perform several operations at the same time. For the average user, 1 or 2 GB of RAM is sufficient. For a user with greater demands, such as high-end movie editing or animation effects, a system with 3 or 4 GB of RAM may be preferable.