A signal is a pattern of variation that carries information over time. Signals may be continuous-time analog signals, described by a function of a continuous variable, or discrete-time digital signals, represented by a sequence of samples. Analog signals preserve the original waveform without sampling or quantization error, but digital signals are far easier to store, transmit, and process. Converting an analog signal to digital form involves sampling it at discrete time intervals and quantizing each sample's amplitude to one of a finite set of levels. Signals can be analyzed in the time domain, which shows how amplitude varies over time, or in the frequency domain, which shows how the signal's energy is distributed across its constituent frequencies.
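The ideas above can be sketched in a short program: sample an analog sine wave at discrete instants, quantize each sample to a fixed number of bits, and then use a naive discrete Fourier transform to locate the dominant frequency. All the specific values here (a 50 Hz tone, an 800 Hz sample rate, 8-bit quantization) are illustrative choices, not anything prescribed by the text.

```python
import math

def sample_and_quantize(freq_hz, sample_rate_hz, n_samples, n_bits):
    """Sample a unit-amplitude sine wave and quantize its amplitude."""
    levels = 2 ** n_bits
    step = 2.0 / (levels - 1)  # uniform steps spanning the range [-1, 1]
    samples = []
    for n in range(n_samples):
        t = n / sample_rate_hz                    # discrete sampling instant
        x = math.sin(2 * math.pi * freq_hz * t)   # "analog" value at time t
        q = round(x / step) * step                # snap to the nearest level
        samples.append(q)
    return samples

def dft_magnitudes(samples):
    """Naive DFT: bin k measures the strength of frequency k * fs / N."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        re = sum(samples[m] * math.cos(2 * math.pi * k * m / n) for m in range(n))
        im = -sum(samples[m] * math.sin(2 * math.pi * k * m / n) for m in range(n))
        mags.append(math.hypot(re, im))
    return mags

sig = sample_and_quantize(freq_hz=50, sample_rate_hz=800, n_samples=64, n_bits=8)
mags = dft_magnitudes(sig)
peak_bin = max(range(len(mags)), key=mags.__getitem__)
print(peak_bin * 800 / 64)  # dominant frequency in Hz -> 50.0
```

The same sequence of samples is viewed two ways: the `sig` list is the time-domain representation, while `mags` is the frequency-domain view of that identical data, and the spectral peak recovers the original tone's frequency.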