ETL Testing - Introduction to ETL testing
Introduction to ETL Testing
The process of updating the data
warehouse.
Designed by Vibrant Technologies & Computers
Two Data Warehousing Strategies
• Enterprise-wide warehouse, top down, the Inmon
methodology
• Data mart, bottom up, the Kimball methodology
• When properly executed, both result in an
enterprise-wide data warehouse
The Data Mart Strategy
• The most common approach
• Begins with a single mart and architected marts are
added over time for more subject areas
• Relatively inexpensive and easy to implement
• Can be used as a proof of concept for data
warehousing
• Can perpetuate the “silos of information” problem
• Can postpone difficult decisions and activities
• Requires an overall integration plan
The Enterprise-wide Strategy
• A comprehensive warehouse is built initially
• An initial dependent data mart is built using a
subset of the data in the warehouse
• Additional data marts are built using subsets of the
data in the warehouse
• Like all complex projects, it is expensive, time
consuming, and prone to failure
• When successful, it results in an integrated, scalable
warehouse
Data Sources and Types
• Primarily from legacy, operational systems
• Almost exclusively numerical data at the present
time
• External data may be included, often purchased
from third-party sources
• Technology exists for storing unstructured data, and
this is expected to become more important over time
Extraction, Transformation, and Loading
(ETL) Processes
• The “plumbing” work of data warehousing
• Data are moved from source to target databases
• A very costly, time consuming part of data
warehousing
Recent Development:
More Frequent Updates
• Updates can be done in bulk and trickle modes
• Business requirements, such as trading partner
access to a Web site, require current data
• For international firms, there is no good time to load
the warehouse
Recent Development:
Clickstream Data
• Results from clicks at web sites
• A dialog manager handles user interactions. An
ODS (operational data store in the data staging
area) helps to custom-tailor the dialog
• The clickstream data is filtered and parsed and
sent to a data warehouse where it is analyzed
• Software is available to analyze the clickstream
data
Data Extraction
• Often performed by COBOL routines
(not recommended because of high program
maintenance and no automatically generated
meta data)
• Sometimes source data is copied to the target
database using the replication capabilities of a
standard RDBMS (not recommended because of
“dirty data” in the source systems)
• Increasingly performed by specialized ETL software
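A minimal extraction sketch in Python can illustrate the idea: pull rows from a source system and write them to a flat file for the staging area. The `orders` table and column names here are invented for the demo; an in-memory SQLite database stands in for a legacy source system.

```python
import sqlite3
import csv

def extract_to_flat_file(conn, table, out_path):
    """Dump every row of a source table to a CSV flat file."""
    cur = conn.execute(f"SELECT * FROM {table}")
    columns = [d[0] for d in cur.description]
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(columns)           # header row records the column names
        writer.writerows(cur.fetchall())
    return columns

# Demo: an in-memory database standing in for a legacy operational system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
cols = extract_to_flat_file(conn, "orders", "orders.csv")
print(cols)  # ['id', 'amount']
```

Unlike the hand-written COBOL routines the slide warns about, ETL tools generate this kind of plumbing along with its metadata automatically.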
Sample ETL Tools
• Teradata Warehouse Builder from Teradata
• DataStage from Ascential Software
• SAS System from SAS Institute
• Power Mart/Power Center from Informatica
• Sagent Solution from Sagent Software
• Hummingbird Genio Suite from Hummingbird
Communications
Reasons for “Dirty” Data
• Dummy Values
• Absence of Data
• Multipurpose Fields
• Cryptic Data
• Contradicting Data
• Inappropriate Use of Address Lines
• Violation of Business Rules
• Reused Primary Keys
• Non-Unique Identifiers
• Data Integration Problems
Data Cleansing
• Source systems contain “dirty data” that must be cleansed
• ETL software contains rudimentary data cleansing capabilities
• Specialized data cleansing software is often used; it is
important for performing name and address correction and
householding functions
• Leading data cleansing vendors include Vality (Integrity),
Harte-Hanks (Trillium), and Firstlogic (i.d.Centric)
Steps in Data Cleansing
• Parsing
• Correcting
• Standardizing
• Matching
• Consolidating
Parsing
• Parsing locates and identifies individual data
elements in the source files and then isolates these
data elements in the target files.
• Examples include parsing the first, middle, and last
name; street number and street name; and city
and state.
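The name example above can be sketched as a toy parser that splits a free-form name field into atomic elements. Real cleansing tools handle far more variation (suffixes, compound surnames, and so on); this is only an illustration with an invented helper name.

```python
def parse_name(full_name):
    """Split a free-form name into first, middle, and last elements."""
    parts = full_name.split()
    if len(parts) == 3:
        first, middle, last = parts
    else:
        # Fall back to first and last word; middle name is absent.
        first, last = parts[0], parts[-1]
        middle = ""
    return {"first": first, "middle": middle, "last": last}

print(parse_name("John Q Public"))
# {'first': 'John', 'middle': 'Q', 'last': 'Public'}
```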
Correcting
• Corrects parsed individual data components using
sophisticated data algorithms and secondary data
sources.
• Examples include replacing a vanity address and
adding a zip code.
Standardizing
• Standardizing applies conversion routines to
transform data into its preferred (and consistent)
format using both standard and custom business
rules.
• Examples include adding a name prefix (e.g., Mr. or
Ms.), replacing a nickname, and using a preferred
street name.
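A minimal sketch of standardizing: conversion routines apply lookup tables (the standard rules) plus whatever custom business rules the organization adopts. The two lookup tables and the record fields below are invented for the example.

```python
# Invented lookup tables standing in for standard conversion rules.
NICKNAMES = {"Bob": "Robert", "Liz": "Elizabeth"}
STREET_SUFFIXES = {"St.": "Street", "Ave.": "Avenue"}

def standardize(record):
    """Return a copy of the record converted to its preferred format."""
    record = dict(record)
    # Replace a nickname with the preferred given name.
    record["first"] = NICKNAMES.get(record["first"], record["first"])
    # Expand street suffixes to the preferred street name form.
    words = record["street"].split()
    record["street"] = " ".join(STREET_SUFFIXES.get(w, w) for w in words)
    return record

print(standardize({"first": "Bob", "street": "12 Main St."}))
# {'first': 'Robert', 'street': '12 Main Street'}
```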
Matching
• Matching searches records within and across the
parsed, corrected, and standardized data based on
predefined business rules to eliminate duplicates.
• Examples include identifying similar names and
addresses.
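As a sketch, similar names and addresses can be flagged with a simple edit-distance similarity score; production matching engines use much richer, rule-driven (and often AI-based) comparisons. The threshold of 0.85 here is an arbitrary illustration, not a recommended value.

```python
from difflib import SequenceMatcher

def similar(a, b, threshold=0.85):
    """True when two strings are close enough to be a likely match."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

records = ["Jon Smith, 12 Main Street",
           "John Smith, 12 Main Street",
           "Alice Jones, 9 Oak Avenue"]

# Compare every pair of records and keep the likely duplicates.
pairs = [(i, j)
         for i in range(len(records))
         for j in range(i + 1, len(records))
         if similar(records[i], records[j])]
print(pairs)  # [(0, 1)]
```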
Consolidating
• Consolidating analyzes and identifies relationships
between matched records and merges them into
ONE representation.
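One simple survivorship rule for merging a group of matched records is to keep the most complete value for each field. The rule and the record fields below are invented for illustration; real tools support many configurable survivorship rules.

```python
def consolidate(matched):
    """Merge a group of matched records into one representation."""
    merged = {}
    for record in matched:
        for field, value in record.items():
            # Keep the longest (most complete) non-empty value seen so far.
            if value and len(str(value)) > len(str(merged.get(field, ""))):
                merged[field] = value
    return merged

group = [{"name": "J. Smith", "phone": ""},
         {"name": "John Smith", "phone": "555-0101"}]
print(consolidate(group))
# {'name': 'John Smith', 'phone': '555-0101'}
```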
Data Staging
• Often used as an interim step between data extraction
and later steps
• Accumulates data from asynchronous sources using
native interfaces, flat files, FTP sessions, or other
processes
• At a predefined cutoff time, data in the staging file is
transformed and loaded to the warehouse
• There is usually no end user access to the staging file
• An operational data store may be used for data staging
Data Transformation
• Transforms the data in accordance with the
business rules and standards that have been
established
• Examples include format changes, deduplication,
splitting up fields, replacement of codes, derived
values, and aggregates
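Three of the examples above — a format change, deduplication, and a derived value — can be sketched in a few lines. The field names (`customer_id`, `order_date`, `qty`, `unit_price`) are invented for the demo.

```python
def transform(rows):
    """Apply a format change, deduplication, and a derived value."""
    seen, out = set(), []
    for row in rows:
        key = (row["customer_id"], row["order_date"])
        if key in seen:                    # deduplication rule
            continue
        seen.add(key)
        out.append({
            "customer_id": row["customer_id"],
            "order_date": row["order_date"].replace("/", "-"),  # format change
            "total": row["qty"] * row["unit_price"],            # derived value
        })
    return out

rows = [{"customer_id": 1, "order_date": "2024/01/05", "qty": 2, "unit_price": 4.5},
        {"customer_id": 1, "order_date": "2024/01/05", "qty": 2, "unit_price": 4.5}]
print(transform(rows))
# [{'customer_id': 1, 'order_date': '2024-01-05', 'total': 9.0}]
```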
Data Loading
• Data are physically moved to the data warehouse
• The loading takes place within a “load window”
• The trend is to near real time updates of the data
warehouse as the warehouse is increasingly used for
operational applications
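A loading sketch, with one of the most common ETL test checks bolted on: after the load, reconcile the target row count against the source. The `fact_orders` table is invented; SQLite stands in for the warehouse.

```python
import sqlite3

def load(conn, rows):
    """Insert transformed rows into the warehouse and verify the load."""
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?)",
                     [(r["id"], r["amount"]) for r in rows])
    # Row-count reconciliation: a basic ETL test of load completeness.
    loaded = conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
    assert loaded == len(rows), "source/target row counts disagree"
    return loaded

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL)")
print(load(conn, [{"id": 1, "amount": 9.5}, {"id": 2, "amount": 20.0}]))  # 2
```

In practice the whole load must also finish inside the “load window” the slide mentions, so elapsed time is checked alongside the counts.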
Meta Data
• Data about data
• Needed by both information technology
personnel and users
• IT personnel need to know data sources and
targets; database, table and column names;
refresh schedules; data usage measures; etc.
• Users need to know entity/attribute definitions;
reports/query tools available; report distribution
information; help desk contact information, etc.
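The technical metadata the slide lists can be pictured as a simple catalog entry: source, column, refresh, and definition information in one place. All values below are invented examples.

```python
# A toy metadata catalog; every value here is an invented example.
metadata = {
    "fact_orders": {
        "source": "legacy_orders (COBOL extract)",        # for IT personnel
        "columns": {"id": "INTEGER", "amount": "REAL"},   # table/column names
        "refresh_schedule": "daily 02:00",                # refresh schedule
        "business_definition": "One row per completed customer order",  # for users
    }
}

def describe(table):
    """Summarize a table's metadata for a quick lookup."""
    entry = metadata[table]
    return f"{table}: refreshed {entry['refresh_schedule']}, source {entry['source']}"

print(describe("fact_orders"))
```

Real repositories hold the same kinds of facts, but shared across tools rather than in one script.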
Recent Development:
Meta Data Integration
• A growing realization that meta data is critical
to data warehousing success
• Progress is being made on getting vendors to
agree on standards and to incorporate the
sharing of meta data among their tools
• Vendors like Microsoft, Computer Associates,
and Oracle have entered the meta data
marketplace with significant product offerings
Thank You!!!
For more information, click the link below:
Follow Us on:
http://vibranttechnologies.co.in/etl-testing-classes-in-mu

Editor's Notes

  • #4: There is still debate over which approach is best.
  • #5: The key is to have an overall plan, processes, and technologies for integrating the different marts. The marts may be logically rather than physically separate.
  • #6: Even with the enterprise-wide strategy, the warehouse is developed in phases and each phase should be designed to deliver business value.
  • #7: It is not unusual to extract data from over 100 source systems. While the technology is available to store structured and unstructured data together, the reality is that warehouse data is almost exclusively structured -- numerical with simple textual identifiers.
  • #8: ETL tends to be “pick and shovel” work. Most organizations’ data is even worse than imagined.
  • #9: As data warehousing becomes more critical to decision making and operational processes, the pressure is to have more current data, which leads to trickle updates.
  • #10: The ODS is used to support the web site dialog -- an operational process -- while the data in the warehouse is analyzed -- to better understand customers and their use of the web site.
  • #11: It’s changing, but COBOL extracts are still the most common ETL process. There are multiple reasons for this -- the cost of specialized ETL software, in-house programmers who have a good knowledge of the COBOL based source systems that will be used, and the peculiarities of the source systems that make the use of ETL software difficult.
  • #12: You might go to the vendors’ web sites to find a good demo to show your students.
  • #13: Here are a couple of examples: Dummy data -- a clerk enters 999-99-9999 as a SSN rather than asking the customer for theirs. Reused primary keys -- a branch bank is closed. Several years later, a new branch is opened, and the old identifier is used again.
  • #14: Data cleansing is critical to customer relationship management initiatives.
  • #15: A good example to use is cleansing customer data. Most students can identify with receiving multiple copies of the same catalog because the company is not doing a good data cleansing job.
  • #16: The record is broken down into atomic data elements.
  • #17: External data, such as census data, is often used in this process.
  • #18: Companies decide on the standards that they want to use.
  • #19: Commercial data cleansing software often uses AI techniques to match records.
  • #20: All of the data are now combined in a standard format.
  • #21: Data staging is used in cleansing, transforming, and integrating the data.
  • #22: Aggregates, such as sales totals, are often precalculated and stored in the warehouse to speed queries that require summary totals.
  • #23: Most loads involve only change data rather than a bulk reloading of all of the data in the warehouse.
  • #24: The importance of meta data is now realized, even though creating it is not glamorous work.
  • #25: Historically, each vendor had their own meta data solution -- which was incompatible with other vendors’ solutions. This is changing.