Spark may eventually replace MapReduce as the primary execution framework for Hadoop, though Hadoop will likely continue to embrace new frameworks. Spark code is easier to write, and Spark performs better on iterative algorithms because intermediate results can be cached in memory rather than written to disk between stages. However, not every workload runs faster on Spark, and Spark has its own limitations. Hadoop, in any case, is about more than MapReduce: it also provides storage, resource management, and a growing ecosystem of tools and frameworks.