Big Data Technical Solution Architect & Data Analyst
Senior Java/J2EE/SOA/ETL Solution Architect with Agile and SCRUM
Jayaram Parida
Mobile: +44-7478212120
Email: jairamparida@gmail.com
Experience Summary
Total 19+ years of IT experience, including 3 years as a Big Data Technical Solution Architect and
10+ years of hands-on experience as a Senior Java/J2EE, SOA, ETL and DI Technical Architect. Has
worked across various domains (Banking, Telecom, Health Care and Logistics, for both product
development and services) in multiple MNCs (IBM and iGate) in India and abroad. Jayaram has excellent
technical as well as communication skills, covering requirement analysis through design and
development in day-to-day client and team interaction.
Big Data Primary Skills : Hadoop, HDFS, HBase, Hive, MapReduce, KAFKA, STORM, YARN, PIG,
R, Python (Pandas), Talend, Cassandra, Sqoop, HortonWorks, Cloudera, GridGain, Pentaho,
CouchDB, Elastic Search and Lumify.
Secondary Skills: Java/J2EE, EAI/SOA, Application Integrator/Solution Architect/Technology
Evangelist, SDLC system architecture, IBM SOA (WebSphere tools), SOA Suite 11g, BPEL, SOA,
Java, J2EE, ESB, Web Services, STRUTS, SPRING and Hibernate using ECLIPSE, NETBEANS and
JDeveloper with ADF, AIA and Oracle SOA Suite.
3+ years as a Big Data Solution Architect/Data Scientist, focused on data analytics
requirements, real-time analytics platforms, large-scale data storage and analysis, and Business
Intelligence and Analytics services.
10+ years as a Java and J2EE Technical Solution Architect for enterprise applications, delivering
solutions with SOA, Web Services and middleware EA integration for the Banking, Finance and
Energy Utility service sectors. 5+ years' experience as an SOA/EAI Technical Leader with hands-on
IBM SOA, Oracle SOA and Mule ESB design and development. 5+ years as an onsite technical
project coordinator for onsite-offshore projects in Canada, the UK and Europe.
3+ years of experience as an ETL and EAI Middleware Architect in fields such as data
integration, data enrichment and processing with service-oriented architecture (SOA) and Business
Process Modeling (BPM), as well as implementation of SOA on SAP XI/BI, IBM DataStage and IBM
Cognos.
Role Responsibilities:
• Provide technical oversight and mentoring to peers and junior teammates to grow the team’s
overall capability using Agile and SCRUM
• Provide consultative subject matter scoping assistance to Technical Sales Specialists as
needed to produce accurate statements of work as a customer deliverable
• Deliver customer facing consultative white-boarding sessions to help cultivate Big Data
services opportunities in partnership with the Sales team as needed
• Attend and support the Sales team at the Executive Briefing Center by presenting and
explaining the value of proposed Big Data solutions
• Build effective relationships with the Sales community by establishing self and team as the
Subject Matter Experts (SMEs) and “trusted advisors” for the BI&A Services organization
• Present comprehensive analysis and documentation of the customer’s “As Is” state of
operations and recommendations for optimal “To Be” state. This requires proficiency in
creating and operating ROI & TCO models.
• Contribute to IP development, best practices, and knowledge sharing to drive DA
standardization.
As a Data Architect :
• 10+ years of business, software, or consulting experience with an emphasis on Business
Intelligence & Analytics and project management, including professional services experience.
- Exceptional interpersonal, communication and presentation skills at CxO and CTO levels.
- 2+ years of experience with Apache Hadoop stack (e.g., YARN, MapReduce, Pig, Hive,
HBase, STORM)
- 10+ years of experience with related languages and platforms (e.g. Java, Linux, Apache, Python)
- Knowledge of NoSQL platforms (HBase, MongoDB, Cassandra, Elastic Search, CouchDB,
Accumulo)
- 6 years of experience with ETL (Extract-Transform-Load) tools such as Talend.
- 3 years of experience with BI tools and reporting software (e.g. MicroStrategy, Tableau,
Cognos, Pentaho)
- Demonstrated experience in effectively supporting a Big Data practice in a professional
services firm
- Proven ability to generate and build offer sets that drive multi-million dollar services
projects
- Demonstrated experience in generating and closing identified revenue targets
- Demonstrated ability to multitask and work multiple projects concurrently
- Demonstrated ability to provide technical oversight for large complex projects and achieve
desired customer satisfaction from inception to deployment in a consulting environment
- Advanced analytical, problem solving, negotiation and organizational skills with
demonstrated ability to multi-task, organize, prioritize and meet deadlines
- Ability to interface with the client in a pre-sales fashion, and on an on-going basis as
projects progress
- Demonstrated initiative and positive can-do attitude
- High level of professionalism, integrity and commitment to quality
- Ability to work independently and as part of a solution team using agile and scrum
- Demonstrated attentiveness to quality and productivity
- Technical and project management experience with SCRUM & PMP
Hands-on Project Experience :
• Experience setting up, optimizing and sustaining distributed big data environments at scale
(5+ nodes, 1+ TB)
• Strong knowledge of Hadoop and related tools including STORM, Pig, Hive, Sqoop, Flume and
Impala
• Knowledge of one or more NoSQL databases such as Cassandra or Hadoop NoSQL (Hive and
Impala)
• Background with relational OLAP and OLTP databases, including one or more of PostgreSQL,
MySQL and SQL Server
• Experience writing scalable Map-Reduce jobs using general purpose languages such as Java,
Python
• Knowledge of common ETL packages/libraries, common analytical/statistical libraries and
common graphical display packages (SAS, Pentaho BA, Informatica Big Data and Talend)
• Hands-on design and development of visualization tools for dashboards and reporting
(Tableau, MicroStrategy, QlikView and SAS)
• Experience with documentation of designs, system builds, configurations, and support
procedures, plus the ability to clearly articulate strategic business value, outside of
technology, and present ideas to a client concisely.
• Experience with standard documents used in support of the sales cycle (RFI/RFQ responses,
solution proposals, current-state/future-state diagrams, platform descriptions, etc.).
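As an illustration of the scalable Map-Reduce bullet above, here is a minimal sketch of a word-count mapper/reducer pair in plain Python, simulating locally what Hadoop Streaming does between the map and reduce phases (the sort-and-group step). This is a generic teaching example, not code from any of the projects described here:

```python
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    # Map phase: emit a (word, 1) pair for every word, as a Hadoop
    # Streaming mapper would write key/value pairs to stdout.
    for line in lines:
        for word in line.strip().split():
            yield word.lower(), 1

def reducer(pairs):
    # Hadoop sorts mapper output by key before the reduce phase;
    # sorted() + groupby replicates that shuffle locally.
    for word, group in groupby(sorted(pairs, key=itemgetter(0)),
                               key=itemgetter(0)):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    sample = ["the quick brown fox", "the lazy dog"]
    print(dict(reducer(mapper(sample))))
```

On a real cluster the same mapper and reducer bodies would read from stdin and write tab-separated pairs to stdout, with Hadoop handling the distribution and shuffle.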
Big Data Infrastructure Architect:
• Real-life Hadoop deployment and administration engineer with 3+ years of experience and
good technical architecture skills
• Hadoop ecosystem design, configuration and deployment; architecture definition and
documentation for Hadoop-based production environments that can scale to petabytes of data
storage and processing
• Additional knowledge of data and infrastructure architecture, including design of
cluster node configuration, connectivity, capacity, compute architecture, name node/data
node/job tracker deployment layout, server requirements, SAN, RAID configurations, etc.
• Hands on practical working experience with leading big data technologies like HBase, Hive,
Pig, Oozie, Flume, Sqoop, Mahout, Hue, Hadoop, Zookeeper and more.
• Deployed, administered and managed Hadoop software on large cluster implementations
• Well versed in installing and managing Apache, Hortonworks and Cloudera Hadoop data platforms
(CDH3, CDH4, Cloudera Manager, HUE and Hortonworks Ambari)
• Strong experience of deployments and administration in Linux/Unix environments
• Past experience installing and administering web/application servers, plus RDBMS database
design and modeling, in a production data center environment
Professional Experience and Accomplishments
Present company : iGate Global Solutions, London, UK, working as a Big Data Technical
Architect since Dec 2014, and with iGate, India since Dec 2012.
Academic and Professional Credentials
Completed a Master's in Computer Science (MCS) from DIT and DOEACC, with a BSc (Hons) in Zoology
from Utkal University.
Professional Certification
• TOGAF 9 Certification
• SOA Certification from IBM.
• PMI PMP certification course, in preparation for the PMP exam
• Java 2 from Brainbench; Business English from Cambridge University
TRAINING
• Oracle Fusion SOA Suite 11g and Oracle Fusion Middleware, from Oracle Fusion Middleware
School, UK
• IBM WebSphere Message Broker, from IBM
• SAP Netweaver XI and EP from IBM
• Business Process Reengineering/Automation (BPR/BPA).
• GoF Design Patterns, Component Based Development (CBD) from IBM
• Performance Engineering, Software Quality Assurance from IBM.
• Personality improvements and behavioral patterns from IBM.
• Presentation and Communication Skills from IBM.
Big Data Project Details :
At present engaged with Betfair, UK as an onsite Big Data Technical Architect to develop a real-time
auto-trading and monitoring analytics application for online sports and gaming.
1. Mineral Exploration Data Analysis for Chemical Composition : Collect large volumes of lab
ore-sample data from different labs, store it in the Hadoop file system, extract the mineral data, and
analyse the percentage of minerals in specific mines, based on chemical composition.
Data Analysis : 1. Create clusters of ore deposits
2. Classify ores based on mineral content
3. Find the maximum presence of copper
Client : RioTinto Innovation Center, Emerging Technology, Australia.
Tools and Languages : Linux, with 3 data nodes (16 GB / 500 GB), Hadoop 2.1, YARN, MR2, HBase,
Python, Pandas, Matplotlib and SciPy.
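The clustering step of this project could be sketched with the Pandas/SciPy stack listed above. This is an illustrative sketch only: the column names, assay values, and thresholds are hypothetical, not the actual RioTinto data or pipeline:

```python
import numpy as np
import pandas as pd
from scipy.cluster.vq import kmeans2

# Hypothetical assay data: % composition per ore sample.
# Two synthetic deposit types: low-Cu/high-Fe and high-Cu/low-Fe.
np.random.seed(0)
samples = pd.DataFrame({
    "cu_pct": np.concatenate([np.random.normal(1.5, 0.2, 50),
                              np.random.normal(4.0, 0.3, 50)]),
    "fe_pct": np.concatenate([np.random.normal(30.0, 2.0, 50),
                              np.random.normal(12.0, 2.0, 50)]),
})

# 1. Cluster ore deposits by chemical composition (k-means, k=2).
centroids, labels = kmeans2(samples.values, 2, minit="++")
samples["cluster"] = labels

# 3. Find the cluster with the maximum copper presence.
means = samples.groupby("cluster")["cu_pct"].mean()
print(f"cluster {means.idxmax()} has the highest average Cu %")
```

In the real project, the sample extraction ran over HDFS via MapReduce; a local DataFrame like this would correspond to one analysis-ready extract.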
2. RT Drilling Well Data Analytics Platform (iRTAnalytics) :
Project Description: This big data analytics platform showcases how real-time analytics use cases
can be solved using popular open-source technologies (Storm + Hadoop) to process real-time data
in a fault-tolerant and distributed manner. The major challenges: it needs to be always
available to process real-time feeds; it needs to be scalable enough to process hundreds of
thousands of messages per second; and it needs to support a scale-out distributed architecture to
process the stream in parallel. Once the data is collected from the sources, it can be processed and
analyzed for anomalies, with predictive analysis rules for multiple well hazard conditions.
The multi-well real-time dashboard is capable of monitoring multiple well drilling operations in a
single window. The system is also capable of storing real-time data in HDFS, HBase and any DW
system for batch log analysis and reporting.
Data Analysis : Stuck-pipe analysis, dashboard, alerts.
Client : Schlumberger, for real-time drilling monitoring and analysis (stuck-pipe analysis)
Technology : Cloudera Data Platform 4.1, KAFKA, STORM, Redis, HBASE and HIVE, JQuery,
WS and DOJO
The role involved taking complete technical ownership, as a Technical Scrum Master, for architecting,
designing and developing the Hadoop real-time analytics platform as a POC using Scrum.
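The predictive-rule step described above can be sketched as a simple per-reading check. This is an illustrative stand-in, not the actual Storm bolt code: the rule names, field names, and threshold values below are all hypothetical:

```python
# Illustrative stuck-pipe rule set over a stream of sensor readings.
# Field names and thresholds are hypothetical, for demonstration only.
STUCK_PIPE_RULES = {
    "hook_load_klbs": lambda v: v > 350,   # abnormally high hook load
    "torque_kftlb":   lambda v: v > 25,    # torque spike
    "rop_ft_per_hr":  lambda v: v < 5,     # rate of penetration collapses
}

def check_reading(reading):
    """Return the names of rules violated by one real-time reading.

    In the real platform this logic would live inside a Storm bolt,
    with alerts fanned out to the dashboard and persisted to HBase.
    """
    return [name for name, rule in STUCK_PIPE_RULES.items()
            if name in reading and rule(reading[name])]

if __name__ == "__main__":
    stream = [
        {"well": "W-1", "hook_load_klbs": 200, "torque_kftlb": 10, "rop_ft_per_hr": 40},
        {"well": "W-2", "hook_load_klbs": 380, "torque_kftlb": 28, "rop_ft_per_hr": 2},
    ]
    for reading in stream:
        violated = check_reading(reading)
        if violated:
            print(f"ALERT {reading['well']}: {violated}")
```

Keeping the check stateless like this is what makes the scale-out requirement tractable: each Storm executor can evaluate readings independently, in parallel across the stream.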
3. Smart Meter Data Analytics (SMDA) Platform :
Project Description: The objective of this Smart Grid and MDM data analytics platform, developed
using Big Data (Hadoop) and analytics tools and technology, is to showcase iGate's capability: how
different utility customers and service providers can gain better business insight, with low
investment, by developing a high-capacity big data analytics center.
Data Analysis : Customer sentiment, recommendation
Client : COMCAST, USA
Technology : Cloudera Data Platform 4.4, STORM, KAFKA, Mahout and JQuery, D3
4. iHeal BigData Analytics : Log/Event Analysis, Philips Hospital Information System (Healthcare) :
The hospital event log is analyzed using process mining techniques. The log contains events related to
treatment and diagnosis steps for patients diagnosed with cancer. Given the heterogeneous nature of these
cases, we first demonstrate that it is possible to create more homogeneous subsets of cases (e.g. patients
having a particular type of cancer that need to be treated urgently). Such preprocessing is crucial given
the variation and variability found in the event log. The discovered homogeneous subsets are analyzed
using state-of-the-art process mining approaches. In this project, we report on the findings discovered
using enhanced fuzzy mining and trace alignment.
Client : Philips, USA
Technology : Cloudera Data Platform 4.6, SPLUNK/HUNK, PIG, YARN, Mahout and QlikView
Awards / Major Accomplishments
• Received the BRAVO Award in 2006 for executing a large enterprise project (APL Customer
Profile Rewrite) as Subject Matter Expert for the Offshore Team
• Active member of the core architect team for IBM WebSphere Front Office (a new product for
real-time feed management for the financial services industry)
• Successfully developed more than 5 CD-ROM titles on different projects as Project Leader &
Designer
• Designed and hosted many web sites for Private & Govt. Sectors