This document summarizes recent advancements in dependency parsers. It discusses how dependency parsers have been used to parse languages with free word order, such as Hindi, and to analyze source code written in various programming languages. It highlights several studies that have used dependency parsers to extract semantic relationships, identify errors in automatic speech recognition output, incorporate long-distance dependencies, and address feature sparseness. Dependency parsers have been shown to outperform other models on tasks such as topic detection and can parse biomedical text, although both the Link Grammar and Connexor Machinese Syntax parsers were found to have limitations in the biomedical domain.
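
To illustrate the kind of output these applications build on, the sketch below uses a dependency parse to pull simple subject-verb-object relations from a sentence. It is a minimal example, not the method of any surveyed study; it assumes the spaCy library and its small English model "en_core_web_sm", and the triple-extraction rule is an illustrative simplification.

```python
# Minimal sketch (illustrative assumption, not a surveyed system):
# inspect a dependency parse and extract subject-verb-object relations.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumed model name
doc = nlp("The parser identified errors in the transcript.")

# Each token carries a dependency label and a pointer to its syntactic head.
for token in doc:
    print(f"{token.text:12s} {token.dep_:10s} head={token.head.text}")

# Extract (subject, verb, object) triples as crude semantic relations.
for token in doc:
    if token.dep_ == "nsubj" and token.head.pos_ == "VERB":
        verb = token.head
        objects = [c for c in verb.children if c.dep_ in ("dobj", "obj")]
        for obj in objects:
            print((token.text, verb.lemma_, obj.text))
```

Rules of this kind over dependency labels (for example nsubj and dobj) are one common way the surveyed work turns parse trees into semantic relationships, though the exact patterns vary by study and by language.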