This document discusses the numerical suppression of linear effects in optical CDMA transmission. It presents a model based on Fractional Step Methods that predicts the signal deformation caused by chromatic dispersion and reconstructs the original signal after propagation through a single-mode fiber. Simulation results are shown for Gaussian pulse propagation under varying wavelength, fiber length, dispersion, and absorption, in order to analyze the linear effects. The proposed detection algorithm applies the FFT and IFFT iteratively to recover the initial signal from the dispersed signal observed after propagation.
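As an illustration of the FFT/IFFT-based idea, the sketch below propagates a Gaussian pulse through the linear dispersion operator of a single-mode fiber in the frequency domain and then applies the inverse (conjugate) phase filter to recover the input. This is a minimal, non-iterative sketch of linear dispersion compensation, not the paper's exact algorithm; all parameter values (pulse width `T0`, group-velocity dispersion `beta2`, fiber length `L`) are assumed for demonstration, and absorption is set to zero so the inversion is exact.

```python
import numpy as np

# Time grid (assumed values for illustration)
N = 2048                    # number of samples
T_window = 100e-12          # 100 ps window
t = np.linspace(-T_window / 2, T_window / 2, N, endpoint=False)
dt = t[1] - t[0]
omega = 2 * np.pi * np.fft.fftfreq(N, dt)   # angular-frequency grid

T0 = 5e-12                  # Gaussian pulse width (assumed)
beta2 = -21.7e-27           # GVD near 1550 nm, s^2/m (typical SMF value)
alpha = 0.0                 # absorption neglected so the inverse is exact
L = 50e3                    # fiber length: 50 km (assumed)

A0 = np.exp(-t**2 / (2 * T0**2))            # input Gaussian pulse

# Linear propagation in the frequency domain:
#   A(L, w) = A(0, w) * exp(i*beta2/2 * w^2 * L) * exp(-alpha/2 * L)
H = np.exp(0.5j * beta2 * omega**2 * L - 0.5 * alpha * L)
A_L = np.fft.ifft(np.fft.fft(A0) * H)       # dispersed pulse after length L

# Detection: undo the dispersion with the conjugate phase filter
H_inv = np.exp(-0.5j * beta2 * omega**2 * L)
A_rec = np.fft.ifft(np.fft.fft(A_L) * H_inv)

residual = np.max(np.abs(A_rec - A0))       # reconstruction error
```

With absorption set to zero the dispersion operator is a pure phase filter, so the conjugate filter recovers the input pulse to machine precision; with nonzero absorption or noise, an iterative scheme such as the one described above becomes relevant.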