This document discusses two approaches to attention in neural machine translation (NMT): global attention and local attention. It presents the results of experiments comparing the two approaches on English-German translation tasks. At each target time step, global attention attends over every source position, while local attention attends only to a small window of source positions centered on a chosen alignment point. The experiments show that both approaches improve over non-attentional NMT, with local attention performing slightly better, reaching a BLEU score of 24.61 on the WMT 2014 test set and 25.9 on the WMT 2015 test set.
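To make the distinction concrete, here is a minimal NumPy sketch contrasting the two mechanisms. The dot-product scoring function, the window half-width D, the Gaussian falloff (as in the predictive "local-p" variant), and all tensor shapes are illustrative assumptions, not the exact configuration used in the experiments.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(h_t, h_s):
    """Attend over ALL source hidden states h_s (S x d) given the
    current target state h_t (d,). Returns a context vector (d,)."""
    scores = h_s @ h_t        # one score per source position (dot scoring)
    weights = softmax(scores) # alignment distribution over the whole sentence
    return weights @ h_s      # weighted average of all source states

def local_attention(h_t, h_s, p_t, D=2):
    """Attend only over the window [p_t - D, p_t + D] around an
    alignment point p_t, with a Gaussian falloff (std D/2) that
    favors positions near p_t."""
    S = h_s.shape[0]
    lo, hi = max(0, int(p_t) - D), min(S, int(p_t) + D + 1)
    window = h_s[lo:hi]                 # only 2D + 1 source states
    weights = softmax(window @ h_t)
    positions = np.arange(lo, hi)
    weights = weights * np.exp(-((positions - p_t) ** 2) / (2 * (D / 2) ** 2))
    return weights @ window

rng = np.random.default_rng(0)
h_s = rng.normal(size=(10, 4))           # 10 source positions, dimension 4
h_t = rng.normal(size=4)                 # current decoder state
print(global_attention(h_t, h_s))        # uses all 10 positions
print(local_attention(h_t, h_s, p_t=5))  # uses only positions 3..7
```

The practical trade-off the sketch illustrates: global attention's cost grows with the full source length at every step, while local attention's cost is bounded by the window size, at the price of having to choose p_t well.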