ETL Testing - Introduction to ETL Testing
Introduction to ETL Testing
The process of updating the data
warehouse.
Designed by: Vibrant Technologies & Computers
Two Data Warehousing Strategies
• Enterprise-wide warehouse, top down, the Inmon
methodology
• Data mart, bottom up, the Kimball methodology
• When properly executed, both result in an
enterprise-wide data warehouse
The Data Mart Strategy
• The most common approach
• Begins with a single data mart; architected marts are added over time for more subject areas
• Relatively inexpensive and easy to implement
• Can be used as a proof of concept for data
warehousing
• Can perpetuate the “silos of information” problem
• Can postpone difficult decisions and activities
• Requires an overall integration plan
The Enterprise-wide Strategy
• A comprehensive warehouse is built initially
• An initial dependent data mart is built using a
subset of the data in the warehouse
• Additional data marts are built using subsets of the
data in the warehouse
• Like all complex projects, it is expensive, time
consuming, and prone to failure
• When successful, it results in an integrated, scalable
warehouse
Data Sources and Types
• Primarily from legacy, operational systems
• Almost exclusively numerical data at the present
time
• External data may be included, often purchased
from third-party sources
• Technology exists for storing unstructured data, and this is expected to become more important over time
Extraction, Transformation, and Loading (ETL) Processes
• The “plumbing” work of data warehousing
• Data are moved from source to target databases
• A very costly, time-consuming part of data warehousing
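To make the three steps concrete, here is a minimal sketch of one ETL pass using only Python's standard library; the table names, columns, and sample rows are assumptions for illustration, not something prescribed by the slides.
```python
import sqlite3

# Illustrative source and warehouse schemas; names are assumptions for the sketch.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders_src (order_id INT, amount_usd TEXT, region TEXT)")
source.executemany("INSERT INTO orders_src VALUES (?, ?, ?)",
                   [(1, "10.50", "east"), (2, "7.25", "WEST"), (2, "7.25", "WEST")])

target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE orders_dw (order_id INT PRIMARY KEY, amount REAL, region TEXT)")

# Extract: pull rows from the source system.
rows = source.execute("SELECT order_id, amount_usd, region FROM orders_src").fetchall()

# Transform: convert types, standardize codes, drop duplicates.
seen, cleaned = set(), []
for order_id, amount, region in rows:
    if order_id in seen:
        continue
    seen.add(order_id)
    cleaned.append((order_id, float(amount), region.upper()))

# Load: move the transformed rows into the warehouse table.
target.executemany("INSERT INTO orders_dw VALUES (?, ?, ?)", cleaned)
print(target.execute("SELECT * FROM orders_dw").fetchall())
```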
Recent Development: More Frequent Updates
• Updates can be done in bulk and trickle modes
• Business requirements, such as trading partner access to a Web site, require current data
• For international firms, there is no good time to load
the warehouse
Recent Development: Clickstream Data
• Results from clicks at web sites
• A dialog manager handles user interactions. An ODS (operational data store in the data staging area) helps to custom-tailor the dialog
• The clickstream data is filtered, parsed, and sent to a data warehouse, where it is analyzed
• Software is available to analyze the clickstream
data
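As an illustration only (the slides do not prescribe a log format), the sketch below filters and parses web-server log lines in the common log format before they would be passed on to the warehouse; the sample lines and field names are assumptions.
```python
import re

# Assumed common-log-format lines; real clickstream feeds vary by web server.
log_lines = [
    '192.0.2.1 - - [10/Oct/2023:13:55:36 +0000] "GET /product/42 HTTP/1.1" 200 2326',
    '192.0.2.1 - - [10/Oct/2023:13:55:37 +0000] "GET /favicon.ico HTTP/1.1" 404 0',
]

pattern = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]+" (?P<status>\d+)'
)

clicks = []
for line in log_lines:
    m = pattern.match(line)
    if not m:
        continue
    # Filter out non-page requests before sending clicks on to staging.
    if m.group("path").endswith((".ico", ".css", ".js")):
        continue
    clicks.append((m.group("ip"), m.group("ts"), m.group("path"), int(m.group("status"))))

print(clicks)  # parsed, filtered click events ready for the warehouse
```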
Data Extraction
• Often performed by COBOL routines
(not recommended because of high program
maintenance and no automatically generated
meta data)
• Sometimes source data is copied to the target database using the replication capabilities of a standard RDBMS (not recommended because of “dirty data” in the source systems)
• Increasingly performed by specialized ETL software
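A minimal sketch of the extraction step, assuming a relational source reachable through Python's built-in sqlite3 module and a last_updated column for pulling only changed rows into a flat file for staging; a commercial ETL tool would do the same while also capturing meta data about the pull.
```python
import csv
import sqlite3

# Assumed source table with a last_updated column; names are illustrative.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id INT, name TEXT, last_updated TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Ann", "2024-01-02"), (2, "Bob", "2023-12-15")])

last_extract = "2024-01-01"  # high-water mark from the previous run

# Extract only rows changed since the last run and land them in a flat file
# for the staging area, rather than replicating the whole (dirty) table.
changed = src.execute(
    "SELECT id, name, last_updated FROM customers WHERE last_updated > ?",
    (last_extract,),
).fetchall()

with open("customers_extract.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name", "last_updated"])
    writer.writerows(changed)
```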
Sample ETL Tools
• Teradata Warehouse Builder from Teradata
• DataStage from Ascential Software
• SAS System from SAS Institute
• Power Mart/Power Center from Informatica
• Sagent Solution from Sagent Software
• Hummingbird Genio Suite from Hummingbird
Communications
Reasons for “Dirty” Data
• Dummy Values
• Absence of Data
• Multipurpose Fields
• Cryptic Data
• Contradicting Data
• Inappropriate Use of Address Lines
• Violation of Business Rules
• Reused Primary Keys
• Non-Unique Identifiers
• Data Integration Problems
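Several of these problems can be surfaced with simple profiling checks during ETL testing. The sketch below uses made-up records and rules to flag dummy values, missing data, and non-unique identifiers; the specific checks are assumptions for illustration.
```python
from collections import Counter

# Made-up customer records illustrating several of the problems above.
records = [
    {"ssn": "999-99-9999", "name": "A. Smith", "zip": "02134"},  # dummy value
    {"ssn": "123-45-6789", "name": "", "zip": "02134"},          # absent data
    {"ssn": "123-45-6789", "name": "B. Jones", "zip": "02134"},  # reused identifier
]

issues = []
ssn_counts = Counter(r["ssn"] for r in records)
for i, r in enumerate(records):
    if r["ssn"] == "999-99-9999":
        issues.append((i, "dummy SSN"))
    if not r["name"]:
        issues.append((i, "missing name"))
    if ssn_counts[r["ssn"]] > 1:
        issues.append((i, "non-unique identifier"))

print(issues)
```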
Data Cleansing
• Source systems contain “dirty data” that must be cleansed
• ETL software contains rudimentary data cleansing capabilities
• Specialized data cleansing software is often used; it is important for performing name and address correction and householding functions
• Leading data cleansing vendors include Vality (Integrity),
Harte-Hanks (Trillium), and Firstlogic (i.d.Centric)
Steps in Data Cleansing
• Parsing
• Correcting
• Standardizing
• Matching
• Consolidating
Parsing
• Parsing locates and identifies individual data
elements in the source files and then isolates these
data elements in the target files.
• Examples include parsing the first, middle, and last
name; street number and street name; and city
and state.
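A hedged sketch of the parsing step: one free-form name-and-address string is broken into atomic data elements. The field layout and the regular expression are assumptions for illustration.
```python
import re

# Assumed free-form source field; real layouts vary by system.
raw = "John Q Public, 123 Main St, Springfield, IL"

name_part, street_part, city, state = [p.strip() for p in raw.split(",")]

first, middle, last = name_part.split()                      # first / middle / last name
street_number, street_name = re.match(r"(\d+)\s+(.*)", street_part).groups()

parsed = {
    "first_name": first, "middle_name": middle, "last_name": last,
    "street_number": street_number, "street_name": street_name,
    "city": city, "state": state,
}
print(parsed)
```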
Correcting
• Correcting fixes parsed individual data components using sophisticated data algorithms and secondary data sources.
• Examples include replacing a vanity address and adding a ZIP code.
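A minimal sketch of correction, where a small in-memory lookup stands in for a secondary data source such as a postal reference file; the city-to-ZIP table is invented for the example.
```python
# Stand-in for a secondary data source (e.g., a postal reference file).
zip_lookup = {("Springfield", "IL"): "62701"}

record = {"city": "Springfield", "state": "IL", "zip": ""}

# Correct the record by filling the missing ZIP code from the reference data.
if not record["zip"]:
    record["zip"] = zip_lookup.get((record["city"], record["state"]), record["zip"])

print(record)  # {'city': 'Springfield', 'state': 'IL', 'zip': '62701'}
```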
Standardizing
• Standardizing applies conversion routines to
transform data into its preferred (and consistent)
format using both standard and custom business
rules.
• Examples include adding a prename, replacing a nickname, and using a preferred street name.
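A sketch of standardization using invented conversion tables; a real implementation would apply the organization's own standard and custom business rules.
```python
# Invented conversion tables; each organization chooses its own standards.
nickname_map = {"Bill": "William", "Bob": "Robert"}
street_map = {"St": "Street", "Ave": "Avenue"}

def standardize(record):
    record = dict(record)
    record["first_name"] = nickname_map.get(record["first_name"], record["first_name"])
    parts = record["street_name"].split()
    parts[-1] = street_map.get(parts[-1], parts[-1])   # expand street suffix
    record["street_name"] = " ".join(parts)
    return record

print(standardize({"first_name": "Bill", "street_name": "Main St"}))
# {'first_name': 'William', 'street_name': 'Main Street'}
```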
Matching
• Matching searches for records within and across the parsed, corrected, and standardized data, based on predefined business rules, in order to eliminate duplicates.
• Examples include identifying similar names and
addresses.
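A hedged sketch of matching using Python's standard difflib; the 0.85 similarity threshold and the sample records are assumptions, and commercial cleansing tools apply far more sophisticated rules.
```python
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "William Smith", "street": "123 Main Street"},
    {"id": 2, "name": "Wiliam Smith",  "street": "123 Main Street"},
    {"id": 3, "name": "Jane Doe",      "street": "9 Oak Avenue"},
]

def similar(a, b, threshold=0.85):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Pair up records whose name and street are similar enough to be duplicates.
matches = []
for i in range(len(records)):
    for j in range(i + 1, len(records)):
        if similar(records[i]["name"], records[j]["name"]) and \
           similar(records[i]["street"], records[j]["street"]):
            matches.append((records[i]["id"], records[j]["id"]))

print(matches)  # [(1, 2)]
```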
Consolidating
• Consolidating analyzes and identifies relationships between matched records and merges them into ONE representation.
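Continuing the matching example, the sketch below merges a matched pair into a single surviving record using a simple "most complete value wins" rule; the survivorship rule itself is an assumption for illustration.
```python
def consolidate(records):
    """Merge matched records into one representation, preferring non-empty,
    longer values as a simple survivorship rule."""
    merged = {}
    for record in records:
        for field, value in record.items():
            if value and len(str(value)) > len(str(merged.get(field, ""))):
                merged[field] = value
    return merged

matched_pair = [
    {"name": "William Smith", "phone": "", "email": "ws@example.com"},
    {"name": "Wiliam Smith", "phone": "555-0100", "email": ""},
]
print(consolidate(matched_pair))
# {'name': 'William Smith', 'phone': '555-0100', 'email': 'ws@example.com'}
```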
Data Staging
• Often used as an interim step between data extraction
and later steps
• Accumulates data from asynchronous sources using
native interfaces, flat files, FTP sessions, or other
processes
• At a predefined cutoff time, data in the staging file is
transformed and loaded to the warehouse
• There is usually no end-user access to the staging file
• An operational data store may be used for data staging
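A minimal sketch of a staging pass, assuming extracts from asynchronous sources have landed as flat files in a staging directory; at the cutoff, whatever has arrived is swept up and handed to the transform and load steps. The directory name, file names, and cutoff logic are assumptions.
```python
import csv
import glob
import os
from datetime import datetime

STAGING_DIR = "staging"                 # assumed landing area for extracts
os.makedirs(STAGING_DIR, exist_ok=True)

# Simulate two asynchronous source feeds landing as flat files.
for name, rows in [("orders_east.csv", [["1", "10.50"]]),
                   ("orders_west.csv", [["2", "7.25"]])]:
    with open(os.path.join(STAGING_DIR, name), "w", newline="") as f:
        csv.writer(f).writerows(rows)

# At the predefined cutoff time, sweep everything currently in staging.
cutoff = datetime.now()
batch = []
for path in glob.glob(os.path.join(STAGING_DIR, "*.csv")):
    with open(path, newline="") as f:
        for row in csv.reader(f):
            batch.append({"order_id": row[0], "amount": row[1], "staged_at": cutoff})

print(len(batch), "rows ready for transformation and load")
```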
Data Transformation
• Transforms the data in accordance with the
business rules and standards that have been
established
• Examples include: format changes, deduplication, splitting up fields, replacement of codes, derived values, and aggregates
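The sketch below applies a few of the listed transformations (a format change, a code replacement, a derived value, and an aggregate) to invented order rows; the business rules shown are assumptions.
```python
from collections import defaultdict
from datetime import datetime

region_codes = {"E": "EAST", "W": "WEST"}          # code replacement table
rows = [
    {"order_id": 1, "order_date": "01/15/2024", "region": "E", "qty": 2, "unit_price": 10.0},
    {"order_id": 2, "order_date": "01/16/2024", "region": "W", "qty": 1, "unit_price": 7.5},
]

transformed, totals = [], defaultdict(float)
for r in rows:
    out = {
        # Format change: US-style date to ISO format.
        "order_date": datetime.strptime(r["order_date"], "%m/%d/%Y").date().isoformat(),
        # Replacement of codes with their standardized values.
        "region": region_codes[r["region"]],
        # Derived value computed from source columns.
        "revenue": r["qty"] * r["unit_price"],
        "order_id": r["order_id"],
    }
    transformed.append(out)
    totals[out["region"]] += out["revenue"]        # precomputed aggregate

print(transformed)
print(dict(totals))
```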
Data Loading
• Data are physically moved to the data warehouse
• The loading takes place within a “load window”
• The trend is toward near-real-time updates of the data warehouse as the warehouse is increasingly used for operational applications
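A hedged sketch of the load step, assuming an in-memory SQLite warehouse and a simple load-window check; a near-real-time design would replace the window test with a continuous (trickle) feed. The window hours and table layout are assumptions.
```python
import sqlite3
from datetime import datetime, time

def within_load_window(now, start=time(1, 0), end=time(5, 0)):
    """Assumed load window of 01:00-05:00; real windows are site-specific."""
    return start <= now.time() <= end

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE sales_fact (order_id INT, region TEXT, revenue REAL)")

batch = [(1, "EAST", 20.0), (2, "WEST", 7.5)]

if within_load_window(datetime.now()):
    with warehouse:  # one transaction per bulk load
        warehouse.executemany("INSERT INTO sales_fact VALUES (?, ?, ?)", batch)
else:
    print("outside load window; deferring batch")
```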
Meta Data
• Data about data
• Needed by both information technology
personnel and users
• IT personnel need to know data sources and
targets; database, table and column names;
refresh schedules; data usage measures; etc.
• Users need to know entity/attribute definitions;
reports/query tools available; report distribution
information; help desk contact information, etc.
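As a small illustration of the two audiences, the sketch below records both technical and business meta data for one warehouse column in a plain Python structure; every field name and value is invented.
```python
from dataclasses import dataclass, field

@dataclass
class ColumnMetaData:
    # Technical meta data used by IT personnel.
    source_system: str
    source_column: str
    target_table: str
    target_column: str
    refresh_schedule: str
    # Business meta data used by end users.
    business_definition: str
    available_reports: list = field(default_factory=list)

revenue_meta = ColumnMetaData(
    source_system="orders_src", source_column="amount_usd",
    target_table="sales_fact", target_column="revenue",
    refresh_schedule="nightly at 02:00",
    business_definition="Order revenue in US dollars after discounts",
    available_reports=["Monthly Sales by Region"],
)
print(revenue_meta.target_column, "-", revenue_meta.business_definition)
```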
Recent Development: Meta Data Integration
• A growing realization that meta data is critical
to data warehousing success
• Progress is being made on getting vendors to
agree on standards and to incorporate the
sharing of meta data among their tools
• Vendors like Microsoft, Computer Associates,
and Oracle have entered the meta data
marketplace with significant product offerings
Thank You !!!
For more information, click the link below:
Follow us on:
http://vibranttechnologies.co.in/etl-testing-classes-in-mu


Editor's Notes

  • #4: There is still debate over which approach is best.
  • #5: The key is to have an overall plan, processes, and technologies for integrating the different marts. The marts may be logically rather than physically separate.
  • #6: Even with the enterprise-wide strategy, the warehouse is developed in phases and each phase should be designed to deliver business value.
  • #7: It is not unusual to extract data from over 100 source systems. While the technology is available to store structured and unstructured data together, the reality is that warehouse data is almost exclusively structured -- numerical with simple textual identifiers.
  • #8: ETL tends to be “pick and shovel” work. Most organizations’ data is even worse than imagined.
  • #9: As data warehousing becomes more critical to decision making and operational processes, the pressure is to have more current data, which leads to trickle updates.
  • #10: The ODS is used to support the web site dialog -- an operational process -- while the data in the warehouse is analyzed -- to better understand customers and their use of the web site.
  • #11: It’s changing, but COBOL extracts are still the most common ETL process. There are multiple reasons for this -- the cost of specialized ETL software, in-house programmers who have a good knowledge of the COBOL-based source systems that will be used, and the peculiarities of the source systems that make the use of ETL software difficult.
  • #12: You might go to the vendors’ web sites to find a good demo to show your students.
  • #13: Here are a couple of examples: Dummy data -- a clerk enters 999-99-9999 as an SSN rather than asking the customer for theirs. Reused primary keys -- a branch bank is closed. Several years later, a new branch is opened, and the old identifier is used again.
  • #14: Data cleansing is critical to customer relationship management initiatives.
  • #15: A good example to use is cleansing customer data. Most students can identify with receiving multiple copies of the same catalog because the company is not doing a good data cleansing job.
  • #16: The record is broken down into atomic data elements.
  • #17: External data, such as census data, is often used in this process.
  • #18: Companies decide on the standards that they want to use.
  • #19: Commercial data cleansing software often uses AI techniques to match records.
  • #20: All of the data are now combined in a standard format.
  • #21: Data staging is used in cleansing, transforming, and integrating the data.
  • #22: Aggregates, such as sales totals, are often precalculated and stored in the warehouse to speed queries that require summary totals.
  • #23: Most loads involve only change data rather than a bulk reloading of all of the data in the warehouse.
  • #24: The importance of meta data is now realized, even though creating it is not glamorous work.
  • #25: Historically, each vendor had their own meta data solution -- which was incompatible with other vendors’ solutions. This is changing.