Main Difference
The main difference between Gigabit and Gigabyte is that the Gigabit is a unit of information and Gigabyte is a multiple of the unit byte.

Gigabit
The gigabit is a multiple of the unit bit for digital information or computer storage. The prefix giga (symbol G) is defined in the International System of Units (SI) as a multiplier of 10⁹ (1 billion, short scale), and therefore
1 gigabit = 10⁹ bits = 1,000,000,000 bits. The gigabit has the unit symbol Gbit or Gb.
Using the common byte size of 8 bits, 1 Gbit is equal to 125 megabytes (MB) or approximately 119 mebibytes (MiB).
The gigabit is closely related to the gibibit, a unit multiple derived from the binary prefix gibi (symbol Gi) of the same order of magnitude, which is equal to 2³⁰ bits = 1,073,741,824 bits, approximately 7% larger than the gigabit.
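The conversions above can be checked with a few lines of arithmetic. This is an illustrative sketch (the constant names are my own, not from any standard library):

```python
# Decimal gigabit (SI prefix giga) versus binary gibibit (prefix gibi).
GIGABIT = 10**9   # bits in one gigabit
GIBIBIT = 2**30   # bits in one gibibit

# 1 Gbit expressed in megabytes (10^6 bytes) and mebibytes (2^20 bytes),
# using the common 8-bit byte.
megabytes = GIGABIT / 8 / 10**6
mebibytes = GIGABIT / 8 / 2**20

# How much larger a gibibit is than a gigabit, as a percentage.
excess_pct = (GIBIBIT / GIGABIT - 1) * 100

print(megabytes)              # 125.0
print(round(mebibytes, 1))    # 119.2
print(round(excess_pct, 1))   # 7.4
```

The ~7% figure quoted in the text is this `excess_pct` value: 2³⁰ / 10⁹ ≈ 1.074.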

Gigabyte
The gigabyte is a multiple of the unit byte for digital information. The prefix giga means 10⁹ in the International System of Units (SI). Therefore, one gigabyte is 1,000,000,000 bytes. The unit symbol for the gigabyte is GB.
This definition is used in all contexts of science, engineering, business, and many areas of computing, including hard drive, solid state drive, and tape capacities, as well as data transmission speeds. However, the term is also used in some fields of computer science and information technology to denote 1,073,741,824 (1024³ or 2³⁰) bytes, particularly for sizes of RAM. The use of gigabyte may thus be ambiguous. Hard disk capacities are described and marketed by drive manufacturers using the standard metric definition of the gigabyte, but when a 500 GB drive's capacity is displayed by, for example, Microsoft Windows, it is reported as 465 GB, using a binary interpretation.
To address this ambiguity, the International System of Quantities standardizes the binary prefixes, which denote a series of integer powers of 1024. With these prefixes, a memory module that is labeled as having the size 1 GB has one gibibyte (1 GiB) of storage capacity.
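The "500 GB drive reported as 465 GB" discrepancy is pure arithmetic, and a minimal sketch makes it concrete (illustrative only; the variable names are mine):

```python
GB = 10**9    # decimal gigabyte (SI), used by drive manufacturers
GiB = 2**30   # gibibyte (binary), the unit Windows actually counts in

drive_bytes = 500 * GB        # a drive marketed as "500 GB"
displayed = drive_bytes / GiB # the same capacity in 2^30-byte units

print(round(displayed, 2))    # 465.66
```

The capacity has not changed; only the size of the unit used to report it has. Windows labels the result "GB", but it is really counting gibibytes.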

Gigabit (noun)
10⁹ bits, a thousand million (1,000,000,000) bits.

Gigabit (noun)
2³⁰ (1,073,741,824) bits.

Gigabyte (noun)
10⁹, one billion (1,000,000,000) bytes. SI symbol: GB

Gigabyte (noun)
a gibibyte or 1024³ (1,073,741,824) bytes.

Gigabit (noun)
a unit of information equal to one thousand million (10⁹) or (strictly) 2³⁰ bits.

Gigabyte (noun)
a unit of information equal to one thousand million (10⁹) or, strictly, 2³⁰ bytes.