Bits - A bit is a binary digit; it takes the value 0 or 1 and is the basic unit of data transmission and storage measurement.
Byte - The byte, kilobyte, megabyte, gigabyte, and terabyte are storage measurements.

A bit is a binary digit, taking a value of either 0 or 1. For example, the number 10010111 is 8 bits long, or in most cases, one modern PC byte. Binary digits are a basic unit of information storage and communication in digital computing and digital information theory. Information theory also often uses the natural digit, called either a nit or a nat. Quantum computing uses the qubit, a quantum bit that can hold a superposition of 0 and 1 rather than a single definite value.
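To make the example above concrete, here is a small Python sketch (an illustration added for this text, not part of any standard) that counts the bits in the pattern 10010111 and computes its decimal value:

```python
# Interpret the 8-bit binary pattern 10010111 as an unsigned integer.
bits = "10010111"

# Each bit contributes bit * 2^position, counting positions from the right.
value = sum(int(b) << (len(bits) - 1 - i) for i, b in enumerate(bits))

print(len(bits))              # 8 bits long, i.e. one modern PC byte
print(value)                  # 151
print(int(bits, 2) == value)  # True: matches Python's built-in base-2 parser
```

The sum 128 + 16 + 4 + 2 + 1 = 151 shows how each 1-bit adds its power of two.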

The bit is also a unit of measurement: the information capacity of one binary digit. It has the symbol bit, or b. The unit is also known as the shannon, with symbol Sh.

In computer science a byte is a unit of measurement of information storage, most often consisting of eight bits. In many computer architectures it is a unit of memory addressing.

Originally, a byte was a small group of bits of a size convenient for data such as a single character from a Western character set. Its size was generally determined by the number of possible characters in the supported character set and was chosen to be a divisor of the computer's word size; historically, bytes have ranged from five to twelve bits. The popularity of IBM's System/360 architecture starting in the 1960s and the explosion of microcomputers based on 8-bit microprocessors in the 1980s have made eight bits by far the most common size for a byte. The term octet is widely used as a more precise synonym where ambiguity is undesirable (for example, in protocol definitions).
nibble - half of a byte (four bits)
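Nibbles line up neatly with hexadecimal notation, since one hex digit encodes exactly four bits. A short Python sketch (illustrative only) splits a byte into its two nibbles:

```python
# A byte holds 8 bits; a nibble is half of that (4 bits).
byte = 0b10010111  # 151

high_nibble = byte >> 4      # top 4 bits:    0b1001 = 9
low_nibble = byte & 0b1111   # bottom 4 bits: 0b0111 = 7

print(high_nibble, low_nibble)  # 9 7
print(hex(byte))                # 0x97 -- one hex digit per nibble
```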

bit - 1/8 of a byte

byte - 1/1024 of a kilobyte

kilobyte - 1/1024 of a megabyte

megabyte - 1/1024 of a gigabyte

gigabyte - 1/1024 of a terabyte
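The ladder of units above can be checked with a short Python sketch, using the traditional binary convention of a factor of 1024 per step (SI decimal prefixes use 1000 instead):

```python
# Traditional binary storage units: each step up is a factor of 1024.
BITS_PER_BYTE = 8
STEP = 1024

byte = 1
kilobyte = STEP * byte      # 1,024 bytes
megabyte = STEP * kilobyte  # 1,048,576 bytes
gigabyte = STEP * megabyte  # 1,073,741,824 bytes
terabyte = STEP * gigabyte  # 1,099,511,627,776 bytes

print(kilobyte)                  # 1024
print(megabyte // kilobyte)      # 1024: a kilobyte is 1/1024 of a megabyte
print(terabyte // gigabyte)      # 1024: a gigabyte is 1/1024 of a terabyte
print(BITS_PER_BYTE * kilobyte)  # 8192 bits in one kilobyte
```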

Made by Christian Tilley