Computers represent all data with binary digits (0s and 1s) because electronic circuits reliably distinguish only two states, such as high and low voltage. A bit is the smallest unit of data, and bits are grouped into bytes of eight. Characters are represented by patterns of bits: the original ASCII coding scheme used 7 bits per character, enough for English and some Western European text. Unicode superseded it; its early design used 16 bits per character, supporting over 65,000 characters, and the modern standard defines more than a million code points covering most of the world's writing systems. When you press a key on a keyboard, the key press is converted to an electronic signal and then to a binary code, which the computer processes and converts back into a recognizable character on the screen.
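To make these representations concrete, the short Python sketch below (using only the standard built-ins `ord`, `format`, and `str.encode`) prints the Unicode code point, binary bit pattern, and UTF-8 byte sequence for a few sample characters; the characters chosen are illustrative, not from the text above.

```python
# Show the code point, bit pattern, and UTF-8 bytes behind each character.
for ch in "Aé€":
    code_point = ord(ch)              # the character's Unicode code point
    bits = format(code_point, "b")    # the same number as a binary pattern
    utf8 = ch.encode("utf-8")         # the bytes actually stored or sent
    print(f"{ch!r}: code point {code_point}, bits {bits}, UTF-8 {utf8.hex()}")
```

Note how 'A' fits in the original 7-bit ASCII range (code point 65), while '€' (code point 8364) lies beyond the 8-bit range and needs three bytes in UTF-8.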