Characters are represented using character sets, which assign a unique binary code to each character. The ASCII character set uses 7 bits to represent 128 characters, while Extended ASCII uses 8 bits to represent 256 characters. Unicode was originally designed as a 16-bit code covering 65,536 characters; it has since been extended to well over a million possible code points, stored using encodings such as UTF-8 and UTF-16, so that international languages and symbols can be supported (see the first sketch below).

Error detection methods such as parity bits are used to detect errors during data transmission. A parity bit is added to each group of data bits so that the total number of 1 bits is even (even parity) or odd (odd parity); the receiver recounts the 1 bits and flags an error if the count no longer matches the agreed parity.
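As a minimal illustration of the character-set idea above, the short Python sketch below (not taken from the source; the sample characters are arbitrary) prints the numeric code point each character is assigned and the bytes that UTF-8 uses to store it. Note that 'A' keeps the same value, 65, in ASCII and Unicode.

```python
# A minimal sketch: characters map to numeric code points,
# and Unicode code points are stored as bytes by an encoding such as UTF-8.
for ch in ["A", "a", "0", "é", "€"]:
    code_point = ord(ch)             # numeric code assigned by the character set
    utf8_bytes = ch.encode("utf-8")  # byte sequence used to store the character
    print(f"{ch!r}: code point {code_point} (U+{code_point:04X}), "
          f"UTF-8 bytes {list(utf8_bytes)}")
```

Running it shows that basic Latin characters fit in a single byte, while characters outside the ASCII range need two or more bytes under UTF-8.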
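The following sketch shows even parity in the same spirit (again a hypothetical example, not code from the source): a parity bit is appended so the total number of 1s is even, and the receiver's check fails if a single bit is flipped in transit.

```python
def add_even_parity(data_bits: str) -> str:
    """Append a parity bit so the total number of 1 bits is even."""
    ones = data_bits.count("1")
    parity_bit = "0" if ones % 2 == 0 else "1"
    return data_bits + parity_bit

def check_even_parity(received: str) -> bool:
    """Accept the data only if the count of 1 bits is still even."""
    return received.count("1") % 2 == 0

sent = add_even_parity("1011001")            # 7 data bits -> 8 bits with parity
print(sent, check_even_parity(sent))         # '10110010' True

# Simulate a single-bit transmission error by flipping the first bit.
corrupted = ("0" if sent[0] == "1" else "1") + sent[1:]
print(corrupted, check_even_parity(corrupted))  # parity check fails -> False
```

Because the check only counts 1 bits, a single flipped bit is always detected, but two flipped bits cancel out and pass unnoticed, which is why parity is a detection method rather than a correction method.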