M.V. Rama Kumar
Mobile : +91-9014514500
Email : matururamakumar@gmail.com
Professional Summary:
• 3 years of overall IT experience in application development in Java and Big Data
Hadoop.
• 1.6 years of exclusive experience in Hadoop and its components, including HDFS,
MapReduce, Apache Pig, Hive, Sqoop, HBase, and Oozie.
• Extensive experience in setting up Hadoop clusters.
• Capable of processing large sets of structured, semi-structured and unstructured
data and supporting systems application architecture.
• Involved in writing Pig scripts to reduce job execution time.
• Experience in importing and exporting data with Sqoop between HDFS and relational
database systems.
• Extending Hive and Pig core functionality by writing custom UDFs (see the sketch after this list).
• Loading data into Hive tables.
• Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce
programs in Java.
• Knowledge of job workflow scheduling tools such as Oozie.
• Experience in capturing data from existing databases that provide SQL interfaces,
using Sqoop.
• Good working knowledge of Sqoop, Apache Pig, and Apache Oozie.
• Excellent communication, interpersonal, and analytical skills, and a strong ability to
perform as part of a team.
• Exceptional ability to learn new concepts.
• Hardworking and enthusiastic.
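For illustration, a minimal sketch of the kind of custom Hive UDF mentioned above, using Hive's classic simple-UDF API; the class name and logic are hypothetical rather than taken from any specific project:

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Minimal custom Hive UDF (classic simple-UDF API):
// normalizes a string column to upper case, returning NULL for NULL input.
public class UpperCaseUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().toUpperCase());
    }
}
```

Packaged into a jar, such a function is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION and then called like any built-in function; Pig UDFs follow a similar pattern by extending org.apache.pig.EvalFunc.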
Qualifications:
Program of Study : M.C.A (Specialization: CS)
Institute : Prakasam Engineering College
Board/University : JNTUK
Year of Passing : 2012
Percentage : 70%
Technical Skills:
Languages : SQL, Pig Latin, HQL, Core Java, MapReduce
Big Data Technologies : HDFS, MapReduce, Pig, Sqoop, Hive, HBase, Oozie
Frameworks : Hadoop
Java IDEs : Eclipse
Operating Systems : Windows 7, Windows XP, and Linux
Work Experience
• Currently working as a System Analyst with Global InfoVision Pvt Ltd since
September 2012
Responsibilities
• Deploying and configuring the components of the system infrastructure on an
ongoing basis
• Understanding the client's required tools and configuring them in the business
environment, with reusable documentation
• Giving knowledge transfer (KT) sessions to team members on newly configured tools
• Configuring basic Hadoop Big Data setups on the Linux platform.
• Providing application support (administration, configuration, and monitoring) for
both off-the-shelf and home-grown applications
• Supporting 100+ servers across the Dev, IDE, Staging/QA, and Prod environments
• Working closely with the software development, integration, quality assurance, and
production teams from the design phase through deployment
• Troubleshooting client-related issues.
• Designed, configured, and managed backup and recovery for Hadoop data.
• Optimized and tuned the Hadoop environment to meet performance requirements.
• Proficient in installing and configuring Hive, Pig, HBase, and other Hadoop tools.
Project#1:
Project Name : Sales Analytics
Environment : Hadoop, HDFS, Oozie, MapReduce, HBase, Pig, Hive, Sqoop, Java
Duration : Jan 2015 to date
Role : Hadoop Developer
Project Description
• We analyze historical sales information, generate monthly reports, compare sales
between months and across product categories, and analyze growth and decline in
sales, producing weekly, monthly, and yearly reports.
• Forecasting future sales and checking whether targets were achieved on a weekly
and yearly basis.
• When selling products, enterprises usually reserve a certain amount of safety stock
to prevent stock-outs caused by uncertain events.
• The amount of safety stock is directly related to inventory costs and the number of sales.
• We determine the amount of future safety stock from previous sales reports (see the
sketch after this list).
• A cluster of 7 nodes was set up using Cloudera Manager 4.0. Various sources such as
CSV files, XML files, and RDBMSs were used as data sources.
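The safety-stock estimate mentioned above can be sketched with the standard textbook formula SS = z × σd × √L, where z is the service-level factor, σd the standard deviation of daily demand (taken from previous sales reports), and L the replenishment lead time in days. A minimal illustration with hypothetical numbers:

```java
// Illustrative safety-stock calculation; all input numbers are hypothetical.
public class SafetyStock {
    // SS = z * sigmaDaily * sqrt(leadTimeDays)
    static double safetyStock(double z, double sigmaDaily, double leadTimeDays) {
        return z * sigmaDaily * Math.sqrt(leadTimeDays);
    }

    public static void main(String[] args) {
        double z = 1.65;         // service-level factor for roughly 95% service
        double sigmaDaily = 40;  // std. deviation of daily demand, from historical sales
        double leadTime = 9;     // supplier lead time in days
        System.out.printf("Safety stock: %.0f units%n",
                safetyStock(z, sigmaDaily, leadTime)); // 1.65 * 40 * 3 = 198 units
    }
}
```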
Roles & Responsibilities
• Involved in setting up the development environment
• Developed MapReduce jobs in Java for data cleaning and pre-processing
• Developed MapReduce programs to parse the raw data, populate staging tables, and
store the refined data in partitioned tables in the EDW
• Assisted with data capacity planning and node forecasting.
• Created Hive queries that helped market analysts spot emerging trends by
comparing fresh data with EDW reference tables and historical metrics.
• Wrote Hive and Pig queries over different data sets and joined them.
• Used Sqoop for data transfer between RDBMS and HDFS.
• Tuned performance by introducing a combiner and a custom partitioner (see the
sketch after this list)
• Hadoop administration and support: log and node monitoring for performance
issues, using Cloudera Manager as well as system log files.
• Worked on setting up a cluster with a high-availability NameNode configuration
(active and standby modes) through NFS.
• Supported code/design analysis, strategy development and project planning
• Created scripts to periodically delete Hadoop log files
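For illustration, a minimal sketch of how a combiner and a custom partitioner are wired into a MapReduce job driver (Hadoop 2 API); the class names, record format, and aggregation logic are hypothetical stand-ins for the sales jobs described above:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Partitioner;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class SalesJobDriver {

    // Mapper: emits (category, amount) from CSV lines of the form "category,amount".
    public static class SalesMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private final Text category = new Text();
        private final IntWritable amount = new IntWritable();

        @Override
        protected void map(LongWritable key, Text line, Context ctx)
                throws IOException, InterruptedException {
            String[] parts = line.toString().split(",");
            if (parts.length == 2) {
                category.set(parts[0].trim());
                amount.set(Integer.parseInt(parts[1].trim()));
                ctx.write(category, amount);
            }
        }
    }

    // Reducer: sums amounts per category.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            ctx.write(key, new IntWritable(sum));
        }
    }

    // Custom partitioner: keeps all records of one category on one reducer.
    public static class CategoryPartitioner extends Partitioner<Text, IntWritable> {
        @Override
        public int getPartition(Text key, IntWritable value, int numPartitions) {
            return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "sales-aggregation");
        job.setJarByClass(SalesJobDriver.class);
        job.setMapperClass(SalesMapper.class);
        job.setCombinerClass(SumReducer.class);        // local pre-aggregation on the map side
        job.setPartitionerClass(CategoryPartitioner.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Reusing the reducer as the combiner works here because summation is associative and commutative; the partitioner guarantees that every record for a given category reaches the same reducer.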
Project#2:
Project Name : Sentiment Analysis on Customer Evolution in the Banking Domain
Environment : Hadoop, HDFS, Apache Pig, Sqoop, Java, Linux, MySQL
Duration : May 2014 to Sep 2014
Role : Hadoop Developer
Project Description:
The purpose of the project is to store the bank's historical data, extract meaningful
information from it, and, based on that information, predict the customer's category.
The solution is based on the open-source Big Data software Hadoop. The data is
stored in the Hadoop file system and processed using MapReduce jobs for the product
and pricing information.
Roles and Responsibilities:
• Involved in developing the Pig scripts
• Developed Sqoop scripts to integrate Pig with the MySQL database (see the sketch
after this list).
• Completely involved in the requirement analysis phase
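For illustration, a minimal sketch of the kind of Sqoop 1 import that stages a MySQL table into HDFS for the Pig scripts, driven from Java via Sqoop's runTool entry point; the connection string, credentials, and table names are hypothetical:

```java
import org.apache.sqoop.Sqoop;

// Sketch: stage a MySQL table into HDFS so downstream Pig scripts can load it.
// All connection details and names below are hypothetical.
public class FeedbackImport {
    public static void main(String[] args) {
        String[] sqoopArgs = {
            "import",
            "--connect", "jdbc:mysql://dbhost:3306/bank",      // hypothetical host/schema
            "--username", "etl_user",
            "--password", "secret",                            // prefer -P or --password-file in practice
            "--table", "customer_feedback",                    // hypothetical table
            "--target-dir", "/data/staging/customer_feedback",
            "--fields-terminated-by", "\t",                    // matches the Pig loader's delimiter
            "-m", "4"                                          // 4 parallel map tasks
        };
        System.exit(Sqoop.runTool(sqoopArgs));                 // Sqoop 1 client entry point
    }
}
```

A Pig script can then read the staged files with PigStorage('\t') and join or aggregate them as needed.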
Project#3:
Project Name : Preparing a Web Interface for HBase
Environment : Hadoop, HDFS, Apache Pig, Sqoop, MapReduce, MySQL, HBase
Duration : Sep 2014 to Dec 2014
Role : Hadoop Developer
Project Description:
The purpose of the project is to analyze the effectiveness and validity of controls:
terabytes of log information generated by the source providers are stored as part of
the analysis, and meaningful information is extracted from them. The solution is based
on the open-source Big Data software Hadoop. The data is stored in the Hadoop file
system and processed using MapReduce jobs, which in turn involves getting the raw
data, processing it to obtain controls and redesign/change-history information,
extracting various reports from the controls history, and exporting the information for
further processing.
Roles and Responsibilities:
• Fetched data from HDFS into HBase using Pig, Hive, and MapReduce
• Fetched data from an existing MySQL database into HBase using Sqoop and MapReduce.
• Prepared a web UI for the HBase database for CRUD operations such as put, get,
scan, and delete (see the sketch after this list).
• Provided a UI where users enter a query, which is parsed and executed internally,
with the result displayed on the web interface.
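For illustration, a minimal sketch of the HBase client calls such a web UI would issue for put, get, scan, and delete, using the HTable API current in HBase 0.94/0.96; the table, column family, and row keys are hypothetical:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

// CRUD sketch against a hypothetical "users" table with column family "info".
public class HBaseCrud {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "users");
        try {
            // put (create/update a cell)
            Put put = new Put(Bytes.toBytes("user1"));
            put.add(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes("Rama"));
            table.put(put);

            // get (read one row)
            Result row = table.get(new Get(Bytes.toBytes("user1")));
            System.out.println(Bytes.toString(
                    row.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"))));

            // scan (read a range of rows)
            ResultScanner scanner = table.getScanner(new Scan());
            try {
                for (Result r : scanner) {
                    System.out.println(Bytes.toString(r.getRow()));
                }
            } finally {
                scanner.close();
            }

            // delete (remove a row)
            table.delete(new Delete(Bytes.toBytes("user1")));
        } finally {
            table.close();
        }
    }
}
```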
Project#4:
Project Name : Severity Claims System (SCS).
Duration : Sep 2012 to Apr 2014
Role : Developer
Type of Project : Ticket Support and Maintenance.
Project Description:
The SCS (Severity Claims System) application is a core insurance claims-processing
system that caters to the Excess Claims, Environmental, and Toxic Tort LOBs (Lines of
Business). It aims to identify potential severity claims, determine and eliminate
duplicate claims, calculate AIG's potential exposure on claims, and optimize claims
processing.
Roles and Responsibilities:
• Perform root-cause analysis for user requests.
• Performance tuning and query optimization.
• Ensure smooth knowledge transition to team members.
• Attend status meetings.
• Peer-review deliverables and document defects.
• Propose new suggestions for application performance improvement.
Operating Systems : Windows XP, Windows 7, OS/390
Languages : Core Java, HTML, JCL, COBOL, DB2
Special Software : Eclipse, PVCS, Query Tool
PERSONAL DETAILS:
Name : M.V. Rama Kumar
Address : 2-147/A, Chandole (Post), Pittalavani Palem (M.D), Guntur (Dist), A.P - 522311
D.O.B : 12-11-1987
Languages Known : English, Telugu, Hindi
Nationality : Indian
Interests : Playing Cricket