Quantum computing uses quantum bits (qubits), which can exist in superpositions of states and become entangled. Shor's algorithm shows how a quantum computer could factor large numbers far faster than any known classical algorithm by combining quantum parallelism with the quantum Fourier transform. The algorithm prepares an input register in a superposition, applies modular exponentiation with a randomly chosen base a, measures the output register to partially collapse the input, applies a quantum Fourier transform to reveal the periodicity of the remaining state, and then uses the recovered period r in a classical post-processing step: when r is even and a^(r/2) is not congruent to -1 (mod N), the greatest common divisors gcd(a^(r/2) ± 1, N) yield nontrivial factors of N. This algorithm demonstrates the power of quantum computing for certain problems.
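
To make the recipe concrete, here is a minimal classical sketch of the factoring procedure. The quantum step (period finding) is replaced by a brute-force search, purely to show how the recovered period r is turned into factors; the function names and structure are illustrative, not an actual quantum program or any particular library's API.

```python
from math import gcd
from random import randrange

def find_period(a, N):
    """Classically find the order r of a modulo N (smallest r with a^r = 1 mod N).
    On a quantum computer this is the step performed with superposition,
    modular exponentiation, and the quantum Fourier transform."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def shor_factor(N, max_attempts=20):
    """Return a nontrivial factor of N using the period-finding recipe."""
    for _ in range(max_attempts):
        a = randrange(2, N)
        d = gcd(a, N)
        if d > 1:
            return d                # lucky guess: a already shares a factor with N
        r = find_period(a, N)
        if r % 2 == 1:
            continue                # need an even period
        x = pow(a, r // 2, N)
        if x == N - 1:
            continue                # a^(r/2) = -1 mod N gives only trivial factors
        factor = gcd(x - 1, N)      # otherwise gcd(a^(r/2) - 1, N) is a nontrivial factor
        if 1 < factor < N:
            return factor
    return None

print(shor_factor(15))  # prints 3 or 5, depending on the random base chosen
```

The quantum advantage lies entirely in `find_period`: the brute-force loop above takes exponential time in the number of digits of N, whereas the quantum Fourier transform extracts the period in polynomial time. The classical post-processing around it is unchanged.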