ETL
The process of updating the
data warehouse.
Recent Developments in
Data Warehousing: A Tutorial
Hugh J. Watson
Terry College of Business
University of Georgia
hwatson@terry.uga.edu
http://www.terry.uga.edu/~hwatson/dw_tutorial.ppt
Two Data Warehousing
Strategies
Enterprise-wide warehouse, top
down, the Inmon methodology
Data mart, bottom up, the Kimball
methodology
When properly executed, both result
in an enterprise-wide data warehouse
The Data Mart Strategy
The most common approach
Begins with a single mart and architected marts
are added over time for more subject areas
Relatively inexpensive and easy to implement
Can be used as a proof of concept for data
warehousing
Can perpetuate the “silos of information” problem
Can postpone difficult decisions and activities
Requires an overall integration plan
The Enterprise-wide Strategy
A comprehensive warehouse is built initially
An initial dependent data mart is built using
a subset of the data in the warehouse
Additional data marts are built using
subsets of the data in the warehouse
Like all complex projects, it is expensive,
time-consuming, and prone to failure
When successful, it results in an integrated,
scalable warehouse
Data Sources and Types
Primarily from legacy, operational systems
Almost exclusively numerical data at the
present time
External data may be included, often
purchased from third-party sources
Technology exists for storing unstructured
data, and this is expected to become more
important over time
Extraction, Transformation, and
Loading (ETL) Processes
The “plumbing” work of data
warehousing
Data are moved from source to
target databases
A very costly, time-consuming part of
data warehousing
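A minimal sketch of the three steps in Python (the source file, target table, and cleansing rule here are invented for illustration, not taken from the tutorial):

import csv
import sqlite3

def extract(path):
    # Extract: read records from a source flat file (assumed name,state columns)
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform: apply a simple business rule (standardize the state code)
    for row in rows:
        row["state"] = row["state"].strip().upper()
        yield row

def load(conn, rows):
    # Load: move the cleaned records into the target database
    conn.executemany(
        "INSERT INTO customers (name, state) VALUES (:name, :state)", rows)
    conn.commit()

conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS customers (name TEXT, state TEXT)")
load(conn, transform(extract("source_extract.csv")))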
Recent Development:
More Frequent Updates
Updates can be done in bulk and
trickle modes
Business requirements, such as
trading partner access to a Web site,
require current data
For international firms, there is no
good time to load the warehouse
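A sketch of the two update modes, assuming a simple sales table (the table and columns are illustrative):

def bulk_update(conn, batch):
    # Bulk mode: one large insert per load window, one commit at the end
    conn.executemany("INSERT INTO sales (sku, amount) VALUES (?, ?)", batch)
    conn.commit()

def trickle_update(conn, change):
    # Trickle mode: apply each change as it arrives, trading commit
    # overhead for near-current data
    conn.execute("INSERT INTO sales (sku, amount) VALUES (?, ?)", change)
    conn.commit()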
Recent Development:
Clickstream Data
Results from clicks at web sites
A dialog manager handles user interactions
An ODS (operational data store) in the data
staging area helps tailor the dialog
The clickstream data is filtered, parsed,
and sent to a data warehouse, where it is
analyzed
Software is available to analyze the
clickstream data
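A sketch of the filter-and-parse step, assuming Apache-style log lines (the pattern and the filtering rules are illustrative):

import re

LOG = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "GET (?P<page>\S+)')

def parse_clickstream(lines):
    for line in lines:
        m = LOG.match(line)
        if m is None:
            continue                      # filter: drop malformed lines
        if m.group("page").endswith((".gif", ".css", ".js")):
            continue                      # filter: drop non-page assets
        yield m.groupdict()               # parsed record bound for the warehouse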
Data Extraction
Often performed by COBOL routines
(not recommended because of high program
maintenance and no automatically generated
meta data)
Sometimes source data is copied to the target
database using the replication capabilities of
standard RDBMSs (not recommended because of
“dirty data” in the source systems)
Increasingly performed by specialized ETL
software
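A sketch of an incremental extract driven by a last-modified timestamp; the table and column names are assumptions:

import sqlite3

def extract_changes(source_db, since):
    # Pull only rows changed since the last extract run
    conn = sqlite3.connect(source_db)
    try:
        cur = conn.execute(
            "SELECT order_id, customer, updated_at FROM orders "
            "WHERE updated_at > ?", (since,))
        return cur.fetchall()
    finally:
        conn.close()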
Sample ETL Tools
Teradata Warehouse Builder from Teradata
DataStage from Ascential Software
SAS System from SAS Institute
Power Mart/Power Center from Informatica
Sagent Solution from Sagent Software
Hummingbird Genio Suite from
Hummingbird Communications
Reasons for “Dirty” Data
Dummy Values
Absence of Data
Multipurpose Fields
Cryptic Data
Contradicting Data
Inappropriate Use of Address Lines
Violation of Business Rules
Reused Primary Keys
Non-Unique Identifiers
Data Integration Problems
Data Cleansing
Source systems contain “dirty data” that must
be cleansed
ETL software contains rudimentary data
cleansing capabilities
Specialized data cleansing software is often
used; it is important for name and address
correction and for householding functions
Leading data cleansing vendors include Vality
(Integrity), Harte-Hanks (Trillium), and Firstlogic
(i.d.Centric)
Steps in Data Cleansing
Parsing
Correcting
Standardizing
Matching
Consolidating
Parsing
Parsing locates and identifies
individual data elements in the
source files and then isolates these
data elements in the target files.
Examples include parsing the first,
middle, and last name; street number
and street name; and city and state.
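A deliberately simplified sketch of name parsing (commercial tools use far richer rules):

def parse_name(raw):
    # Break a free-form name field into atomic first/middle/last elements
    parts = raw.strip().split()
    if len(parts) == 1:
        return {"first": parts[0], "middle": "", "last": ""}
    return {"first": parts[0],
            "middle": " ".join(parts[1:-1]),
            "last": parts[-1]}

parse_name("Hugh J. Watson")  # {'first': 'Hugh', 'middle': 'J.', 'last': 'Watson'}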
Correcting
Corrects parsed individual data
components using sophisticated data
algorithms and secondary data
sources.
Examples include replacing a vanity
address and adding a ZIP code.
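A sketch of correction against a secondary source, with a small in-memory table standing in for licensed postal data:

# Stand-in for a secondary data source such as a postal directory
ZIP_BY_CITY = {("Athens", "GA"): "30602", ("Atlanta", "GA"): "30303"}

def correct(record):
    # Fill in a missing ZIP code from the secondary source
    if not record.get("zip"):
        record["zip"] = ZIP_BY_CITY.get((record["city"], record["state"]), "")
    return record

correct({"city": "Athens", "state": "GA", "zip": ""})  # adds '30602'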
Standardizing
Standardizing applies conversion
routines to transform data into its
preferred (and consistent) format
using both standard and custom
business rules.
Examples include adding a prename (a title
such as Mr. or Ms.), replacing a nickname,
and using a preferred street name.
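A sketch with one custom rule (nicknames) and one standard rule (street abbreviations); both tables are illustrative:

NICKNAMES = {"Bill": "William", "Bob": "Robert"}    # custom business rule
STREET_FORMS = {"Street": "St.", "Avenue": "Ave."}  # standard postal form

def standardize(record):
    # Convert values to the preferred, consistent format
    record["first"] = NICKNAMES.get(record["first"], record["first"])
    for long_form, short_form in STREET_FORMS.items():
        record["street"] = record["street"].replace(long_form, short_form)
    return record

standardize({"first": "Bill", "street": "12 Main Street"})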
Matching
Matching searches for and matches records
within and across the parsed, corrected,
and standardized data, based on predefined
business rules, to eliminate duplicates.
Examples include identifying similar
names and addresses.
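A sketch using simple string similarity; commercial cleansing tools use more sophisticated (often AI-based) matching, and the threshold here is an assumed business rule:

from difflib import SequenceMatcher

def is_match(a, b, threshold=0.8):
    # Fuzzy comparison of combined name-and-address keys
    key_a = f"{a['name']} {a['street']}".lower()
    key_b = f"{b['name']} {b['street']}".lower()
    return SequenceMatcher(None, key_a, key_b).ratio() >= threshold

is_match({"name": "William Smith", "street": "12 Main St."},
         {"name": "Wiliam Smith", "street": "12 Main Street"})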
Consolidating
Analyzing and identifying
relationships between matched
records and consolidating/merging
them into ONE representation.
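A sketch of one possible survivorship rule, keeping the most complete (longest) value seen for each field; real consolidation rules are richer:

def consolidate(matched):
    # Merge matched records into one surviving representation
    merged = {}
    for record in matched:
        for field, value in record.items():
            if len(str(value)) > len(str(merged.get(field, ""))):
                merged[field] = value
    return merged

consolidate([{"name": "W. Smith", "zip": "30602"},
             {"name": "William Smith", "zip": ""}])
# {'name': 'William Smith', 'zip': '30602'}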
Data Staging
Often used as an interim step between data
extraction and later steps
Accumulates data from asynchronous sources using
native interfaces, flat files, FTP sessions, or other
processes
At a predefined cutoff time, data in the staging file is
transformed and loaded to the warehouse
There is usually no end user access to the staging file
An operational data store may be used for data
staging
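A sketch of the cutoff-time handoff from the staging area to the load step (the paths and cutoff are assumptions):

import datetime
import glob
import shutil

CUTOFF = datetime.time(2, 0)  # assumed 2:00 a.m. cutoff; site-specific

def release_staged_files():
    # After the cutoff, hand accumulated staging files to transform/load;
    # end users never query these files directly
    if datetime.datetime.now().time() < CUTOFF:
        return []
    return [shutil.move(path, "to_load/")
            for path in glob.glob("staging/*.csv")]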
Data Transformation
Transforms the data in accordance
with the business rules and standards
that have been established
Examples include format changes,
deduplication, splitting up fields,
replacement of codes, derived values,
and aggregates
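A sketch combining several of the listed transformations; the field names and code table are invented:

from collections import defaultdict

STATE_BY_CODE = {"13": "GA", "36": "NY"}  # assumed code-replacement table

def transform(rows):
    seen, totals, out = set(), defaultdict(float), []
    for row in rows:
        if row["order_id"] in seen:
            continue                                     # deduplication
        seen.add(row["order_id"])
        row["state"] = STATE_BY_CODE[row["state_code"]]  # replacement of codes
        row["margin"] = row["price"] - row["cost"]       # derived value
        totals[row["state"]] += row["price"]             # running aggregate
        out.append(row)
    return out, dict(totals)                             # detail rows + aggregates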
Data Loading
Data are physically moved to the data
warehouse
The loading takes place within a “load
window”
The trend is toward near-real-time updates
of the data warehouse as the
warehouse is increasingly used for
operational applications
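A sketch that enforces an assumed load window:

import datetime

LOAD_WINDOW = (datetime.time(1, 0), datetime.time(5, 0))  # assumed 1-5 a.m.

def load(conn, batch):
    # Refuse to load outside the agreed load window
    now = datetime.datetime.now().time()
    if not (LOAD_WINDOW[0] <= now <= LOAD_WINDOW[1]):
        raise RuntimeError("outside the load window; defer this batch")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", batch)
    conn.commit()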
Meta Data
Data about data
Needed by both information technology
personnel and users
IT personnel need to know data sources and
targets; database, table and column names;
refresh schedules; data usage measures; etc.
Users need to know entity/attribute
definitions; reports/query tools available;
report distribution information; help desk
contact information, etc.
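A sketch of the kind of entry a meta data repository might hold for one warehouse table; every value is invented:

TABLE_META = {
    "table": "fact_sales",
    "source": "orders table, order-entry system",   # IT: source and target
    "refresh": "daily at 02:00",                    # IT: refresh schedule
    "definition": "one row per order line item",    # users: entity definition
    "help_desk": "dw-support@example.com",          # users: contact point
}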
Recent Development:
Meta Data Integration
A growing realization that meta data is
critical to data warehousing success
Progress is being made on getting vendors
to agree on standards and to incorporate
the sharing of meta data among their tools
Vendors like Microsoft, Computer
Associates, and Oracle have entered the
meta data marketplace with significant
product offerings
Editor's Notes
  • #2: This presentation is based on a tutorial presented by Hugh Watson at the 2001 AMCIS. It provides material for an introduction to data warehousing. Customize it for your own purposes.
  • #3: There is still debate over which approach is best.
  • #4: The key is to have an overall plan, processes, and technologies for integrating the different marts. The marts may be logically rather than physically separate.
  • #5: Even with the enterprise-wide strategy, the warehouse is developed in phases and each phase should be designed to deliver business value.
  • #6: It is not unusual to extract data from over 100 source systems. While the technology is available to store structured and unstructured data together, the reality is that warehouse data is almost exclusively structured -- numerical with simple textual identifiers.
  • #7: ETL tends to be “pick and shovel” work. Most organizations’ data is even worse than imagined.
  • #8: As data warehousing becomes more critical to decision making and operational processes, the pressure is to have more current data, which leads to trickle updates.
  • #9: The ODS is used to support the web site dialog -- an operational process -- while the data in the warehouse is analyzed -- to better understand customers and their use of the web site.
  • #10: It’s changing, but COBOL extracts are still the most common ETL process. There are multiple reasons for this -- the cost of specialized ETL software, in-house programmers who have a good knowledge of the COBOL based source systems that will be used, and the peculiarities of the source systems that make the use of ETL software difficult.
  • #11: You might go to the vendors’ web sites to find a good demo to show your students.
  • #12: Here are two examples: dummy data -- a clerk enters 999-99-9999 as an SSN rather than asking the customer for it. Reused primary keys -- a branch bank is closed; several years later, a new branch is opened, and the old identifier is used again.
  • #13: Data cleansing is critical to customer relationship management initiatives.
  • #14: A good example to use is cleansing customer data. Most students can identify with receiving multiple copies of the same catalog because the company is not doing a good data cleansing job.
  • #15: The record is broken down into atomic data elements.
  • #16: External data, such as census data, is often used in this process.
  • #17: Companies decide on the standards that they want to use.
  • #18: Commercial data cleansing software often uses AI techniques to match records.
  • #19: All of the data are now combined in a standard format.
  • #20: Data staging is used in cleansing, transforming, and integrating the data.
  • #21: Aggregates, such as sales totals, are often precalculated and stored in the warehouse to speed queries that require summary totals.
  • #22: Most loads involve only change data rather than a bulk reloading of all of the data in the warehouse.
  • #23: The importance of meta data is now realized, even though creating it is not glamorous work.
  • #24: Historically, each vendor had its own meta data solution -- which was incompatible with other vendors’ solutions. This is changing.