Bit (b) - Unit Information & Conversion
What is a Bit?
Definition
A bit (short for binary digit) is the most basic unit of information in computing and digital communications. A single bit can have only one of two values, typically represented as 0 or 1.
History
The term "bit" was coined by statistician John W. Tukey in a 1947 Bell Labs memo and popularized by Claude Shannon in his seminal 1948 paper "A Mathematical Theory of Communication". Shannon used the bit as the fundamental unit of information entropy. Early computers manipulated bits directly through mechanical relays and vacuum tubes.
Common Uses
- Representing binary states (on/off, true/false).
- Quantifying information entropy.
- Measuring data transfer rates (e.g., kilobits per second - kbps).
- Fundamental building block for all digital data (bytes, kilobytes, etc.).
- Processor architecture specifications (e.g., 32-bit, 64-bit processors).
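Because transfer rates are quoted in bits but file sizes in bytes, converting between the two trips people up. The sketch below (function name and values are illustrative, not from the original) shows the arithmetic:

```python
# Sketch: relating bit-based transfer rates to byte-based file sizes.
BITS_PER_BYTE = 8

def download_time_seconds(file_size_bytes: int, rate_kbps: float) -> float:
    """Seconds to transfer a file at a rate in kilobits per second (1 kbps = 1,000 bits/s)."""
    total_bits = file_size_bytes * BITS_PER_BYTE
    return total_bits / (rate_kbps * 1_000)

# A 1 MB (10^6-byte) file over a 500 kbps link:
print(download_time_seconds(1_000_000, 500))  # → 16.0
```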
Unit FAQs
How many bits are in a byte?
There are 8 bits in 1 byte in virtually all modern computing. Historically, byte sizes varied between architectures, which is why networking standards use the unambiguous term "octet" for an 8-bit group.
What does a bit represent?
A bit represents the smallest possible unit of information, corresponding to a choice between two possibilities. In electronics, this is often represented by the presence or absence of an electrical charge or voltage level.
Why is it called a binary digit?
It's called a binary digit because it is a digit in the binary (base-2) numeral system, which has only two possible values (0 and 1), unlike the decimal (base-10) system, which uses the digits 0 through 9.
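A minimal sketch of the base-2 idea: n bits can distinguish 2^n values, and any decimal number can be rewritten as a string of bits (the helper name below is illustrative):

```python
# Sketch: a bit is a base-2 digit, so n bits distinguish 2**n values.
def representable_values(n_bits: int) -> int:
    return 2 ** n_bits

print(representable_values(1))  # → 2 (a single bit: 0 or 1)
print(representable_values(8))  # → 256 (one byte)

# The decimal number 13 written in base 2 uses four bits:
print(format(13, "b"))  # → 1101
```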
Bit Conversion Formulas
All formulas below start from a value in bits. Decimal (SI) prefixes use powers of 1,000; binary (IEC) prefixes use powers of 1,024. Byte-based units additionally divide by 8, since 1 Byte = 8 bits.
To Byte: divide by 8 (1 Byte = 8 bits)
To Kilobit: divide by 10^3 (1 Kilobit = 1,000 bits)
To Kilobyte: divide by 8 × 10^3 (1 Kilobyte = 8,000 bits)
To Megabit: divide by 10^6 (1 Megabit = 1,000,000 bits)
To Megabyte: divide by 8 × 10^6 (1 Megabyte = 8,000,000 bits)
To Gigabit: divide by 10^9
To Gigabyte: divide by 8 × 10^9
To Terabit: divide by 10^12
To Terabyte: divide by 8 × 10^12
To Petabit: divide by 10^15
To Petabyte: divide by 8 × 10^15
To Exabit: divide by 10^18
To Exabyte: divide by 8 × 10^18
To Kibibit: divide by 2^10 (1 Kibibit = 1,024 bits)
To Kibibyte: divide by 2^13 (1 Kibibyte = 8,192 bits)
To Mebibit: divide by 2^20 (1 Mebibit = 1,048,576 bits)
To Mebibyte: divide by 2^23 (1 Mebibyte = 8,388,608 bits)
To Gibibit: divide by 2^30
To Gibibyte: divide by 2^33
To Tebibit: divide by 2^40
To Tebibyte: divide by 2^43
To Pebibit: divide by 2^50
To Pebibyte: divide by 2^53
To Exbibit: divide by 2^60
To Exbibyte: divide by 2^63
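The conversion rules above can be sketched in a few lines of code. The function and table names here are illustrative, not a real API; decimal (SI) units divide by powers of 1,000, binary (IEC) units by powers of 1,024, and byte variants divide by a further 8:

```python
# Sketch of the conversion table: SI prefixes are powers of 1,000,
# IEC prefixes are powers of 1,024. Names are illustrative, not a real API.
DECIMAL_UNITS = {"kilobit": 10**3, "megabit": 10**6, "gigabit": 10**9,
                 "terabit": 10**12, "petabit": 10**15, "exabit": 10**18}
BINARY_UNITS = {"kibibit": 2**10, "mebibit": 2**20, "gibibit": 2**30,
                "tebibit": 2**40, "pebibit": 2**50, "exbibit": 2**60}

def bits_to(unit: str, bits: int) -> float:
    """Convert a bit count to the named unit; byte variants divide by a further 8."""
    factors = {**DECIMAL_UNITS, **BINARY_UNITS}
    if unit.endswith("byte"):  # e.g. "kilobyte" reuses the "kilobit" factor times 8
        return bits / (factors[unit[:-4] + "bit"] * 8)
    return bits / factors[unit]

print(bits_to("kilobit", 8_000))   # → 8.0
print(bits_to("kilobyte", 8_000))  # → 1.0
print(bits_to("kibibit", 1_024))   # → 1.0
```

Note how 8,000 bits is 8 kilobits but only 1 kilobyte, and how the binary kibibit factor (1,024) differs from the decimal kilobit factor (1,000).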
Convert Bit
Need to convert Bit to other data storage units? Use our conversion tool.