The Data Flow Task
Encapsulates the data flow engine
Exists in the context of an overall control flow
Performs traditional ETL in addition to other extended scenarios
Is fast and scalable
Data Flow Components
Extract data from Sources
Load data into Destinations
Modify data with Transformations
Service Paths
Connect data flow components
Create the pipeline
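The pipeline model on this slide — components connected by paths — can be sketched as a chain of Python generators. This is an illustrative analogy only, not SSIS code; the function names are hypothetical:

```python
def extract(rows):
    """Source analog: makes raw rows available to downstream components."""
    for row in rows:
        yield row

def transform(rows):
    """Transformation analog: derive an uppercased name column per row."""
    for row in rows:
        yield {**row, "name": row["name"].upper()}

def load(rows):
    """Destination analog: collect rows into an in-memory 'table'."""
    return list(rows)

# Paths are the chaining itself: extract -> transform -> load.
source_data = [{"name": "bike"}, {"name": "helmet"}]
result = load(transform(extract(source_data)))
print(result)  # [{'name': 'BIKE'}, {'name': 'HELMET'}]
```

The point of the analogy: rows stream from component to component, and each component only needs to know its input path and output path.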
Encapsulates the data flow engine
[Diagram: Extract → Transform → Load]
SSIS 2008 R2 data flow
A Path connects two components in a data flow by connecting the
output of one data flow component to the input of another
component. A path has a source and a destination.
In SSIS, a source is the data flow component that extracts data from different
external data sources and makes it available to the other components in the data
flow.
Sources have one regular output, and many sources also have one
error output.
All the output columns are available as input columns to the next data flow
component in the data flow.
Sources extract data from:
Relational tables and views
Files
Analysis Services databases
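The distinction between a source's regular output and its error output can be pictured like this — a minimal Python sketch, not the SSIS API; `file_source` is a hypothetical name:

```python
def file_source(lines):
    """Parse raw lines: well-formed rows go to the regular output,
    unparseable rows are routed to the error output instead."""
    regular, errors = [], []
    for line in lines:
        parts = line.split(",")
        if len(parts) == 2 and parts[1].isdigit():
            regular.append({"product": parts[0], "qty": int(parts[1])})
        else:
            errors.append(line)  # error output: the offending raw line
    return regular, errors

rows, bad = file_source(["bike,3", "helmet,two", "pump,1"])
```

Here `rows` plays the role of the regular output whose columns feed the next component, while `bad` plays the role of the error output.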
[Diagram: OLE DB / Oracle connection → data source → source adapter]
Destinations are the data flow components that load the data from a data
flow into different types of data sources or create an in-memory dataset.
Destinations have one input, and many also have one error output.
Destinations load data to:
Relational tables and views
Files
Analysis Services databases and objects
DataReaders and Recordsets
Enterprise Edition only
[Diagram: ADO.NET connection → destination adapter → target]
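A destination's two jobs — mapping data flow columns to destination columns, then loading the rows — might be sketched as follows. This is a hypothetical Python stand-in, not a real SSIS adapter:

```python
def destination(rows, column_mapping, table):
    """Rename each row's columns per the mapping, then 'load' by
    appending to an in-memory table (stand-in for a real insert)."""
    for row in rows:
        table.append({dest: row[src] for src, dest in column_mapping.items()})
    return table

target = []
destination([{"name": "bike", "cost": 100}],
            {"name": "ProductName", "cost": "StandardCost"},
            target)
```

The `column_mapping` dict corresponds to the Mappings tab of a destination editor, where input columns are paired with destination columns.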
SSIS Transformations are the components in the data flow of a package
that give you the ability to modify and manipulate data in the data flow.
A transformation performs an operation either on one row of data at a
time or on several rows of data at once.
For example, transformations can aggregate, merge, distribute, and
modify data, and they can also perform lookup operations and generate
sample datasets.
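The row-at-a-time versus several-rows-at-once distinction can be illustrated like this (hypothetical Python analogs, not SSIS components):

```python
# Row-at-a-time: each output row depends only on its own input row.
def derive_margin(row):
    return {**row, "margin": row["price"] - row["cost"]}

# Several rows at once: the output needs the whole rowset.
def aggregate_total(rows):
    return sum(r["price"] for r in rows)

rows = [{"price": 10, "cost": 6}, {"price": 4, "cost": 1}]
per_row = [derive_margin(r) for r in rows]   # row transformation
total = aggregate_total(rows)                # rowset transformation
```

Row-level operations can stream rows through without buffering; rowset operations must hold rows back until enough input has arrived, which is why they tend to cost more.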
[Diagram: Source → Transformation → Destination, each carrying the DimProduct columns ProductKey, Color, Name, Cost]
Best Practices
We can logically group transformations by functionality:
Row Transformations
Rowset Transformations
Split and Join Transformations
Auditing Transformations
Business Intelligence Transformations
Custom Transformations
The most common and easily configured transformations perform operations
on rows without needing other rows from the source. These transformations,
which logically work at the row level, often perform very well.
Update column values or create new columns
Transform each row in the pipeline input
Create new rowsets that can include
Aggregated values
Sorted values
Sample rowsets
Pivoted or unpivoted rowsets
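One rowset operation from this list — sorting with duplicate removal, as the Sort transformation offers — can be sketched in Python (an illustrative analogy; `sort_rows` is a hypothetical name):

```python
def sort_rows(rows, key):
    """Order rows on a sort column and drop duplicates on that column,
    keeping the first row seen for each key value."""
    seen, out = set(), []
    for row in sorted(rows, key=lambda r: r[key]):
        if row[key] not in seen:
            seen.add(row[key])
            out.append(row)
    return out

data = [{"color": "Red"}, {"color": "Blue"}, {"color": "Red"}]
sorted_unique = sort_rows(data, "color")
```

Note that the whole input must be collected before any output row can be emitted — the defining trait of a rowset transformation.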
Distribute rows to different outputs
Create copies of the transformation inputs
Join multiple inputs into one output
Perform lookup operations
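Two of these operations — a conditional split, where each row goes out exactly one output, and a lookup, which adds columns from a reference table — can be sketched as follows (hypothetical Python, not the SSIS components themselves):

```python
def conditional_split(rows, predicate):
    """Route each row to exactly one of two outputs."""
    matched, unmatched = [], []
    for row in rows:
        (matched if predicate(row) else unmatched).append(row)
    return matched, unmatched

def lookup(rows, reference, key):
    """Add columns from a keyed reference table to each matching row."""
    return [{**row, **reference[row[key]]} for row in rows if row[key] in reference]

rows = [{"sku": "A", "qty": 5}, {"sku": "B", "qty": 0}]
in_stock, out_of_stock = conditional_split(rows, lambda r: r["qty"] > 0)
ref = {"A": {"name": "Bike"}, "B": {"name": "Helmet"}}
enriched = lookup(in_stock, ref, "sku")
```

A multicast, by contrast, would send every row out every output rather than exactly one.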
Integration Services includes the following transformations to
add audit information and count rows.
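What the Audit and Row Count transformations do — stamp rows with package-level metadata and count rows as they pass — can be approximated like this (an illustrative Python sketch; the function and column names mirror, but are not, the SSIS ones):

```python
import datetime

def audit(rows, package_name):
    """Stamp each row with package metadata and count rows in passing."""
    start = datetime.datetime.now(datetime.timezone.utc).isoformat()
    counted, out = 0, []
    for row in rows:
        counted += 1
        out.append({**row,
                    "PackageName": package_name,
                    "ExecutionStartTime": start})
    return out, counted

audited, row_count = audit([{"id": 1}, {"id": 2}], "LoadDimProduct")
```

In SSIS the count would land in a package variable after the final row, rather than being returned.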
The final grouping of transformations lets you perform
advanced operations on rows in the data flow pipeline:
Cleaning data
Updating a slowly changing dimension
Looking up values
Mining text
Running data mining prediction queries
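One of these advanced operations — a slowly changing dimension update — can be sketched in its simplest (Type 1, overwrite-in-place) form. This is an illustrative simplification of what the Slowly Changing Dimension transformation automates, with hypothetical names:

```python
def scd_type1(dimension, incoming, key):
    """Type 1 SCD analog: overwrite changed attributes on existing
    dimension rows; insert rows whose business key is new."""
    dim = {row[key]: dict(row) for row in dimension}
    for row in incoming:
        dim.setdefault(row[key], {})
        dim[row[key]].update(row)   # Type 1: history is not preserved
    return list(dim.values())

dim = [{"ProductKey": 1, "Color": "Red"}]
new = [{"ProductKey": 1, "Color": "Blue"}, {"ProductKey": 2, "Color": "Green"}]
updated = scd_type1(dim, new, "ProductKey")
```

A Type 2 (historical) variant would instead expire the old row and insert a new one, which is the "Historical Attributes" change type the SSIS transformation also supports.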
[Diagram: ADO.NET, OLE DB/Oracle, and Excel connections feeding Sources → Transformations → Destinations]

Editor's Notes

  • #2: For beginners, the data flow can look very similar to the control flow. Ensure that students appreciate the difference between the arrows in control flow (precedence constraints) and data flow (service paths). Extended scenarios can consist of, for example, non-ETL data transfers, data cleansing, or text mining. When describing the sample data flow on the slide, use the analogy of a conveyor belt in a factory. Raw material (source data) is placed on the conveyor belt and passes through various processes (transformations). Quality assurance components might reject some material, in which case it can be scrapped (logged) or fixed and blended back in with the quality material. Eventually, finished goods (clean, valid data) arrive at the end of the conveyor belt (data warehouse).
  • #4: Developing Project Data Sources and Package Connections: Because the main purpose of SSIS is to move data from sources to destinations, the next most important step is to add the pointers to these sources and destinations. These pointers are called data sources and connections. Data sources are stored at the project level and are found in Solution Explorer under the logical folder named Data Sources. Connections, on the other hand, are defined within packages and are found in the Connection Managers pane at the bottom of the Control Flow or Data Flow tab. Connections can be based on project data sources or can stand alone within packages. The next sections walk you through the uses and implementation of project data sources and package connections.
  • #5: To work with the Data Flow Task, you can either drag a Data Flow Task from the Control Flow toolbox onto the workspace and then double-click it, or you can click the Data Flow tab within the SSIS Designer. After clicking the Data Flow tab, you see the Data Flow Designer, where you can use the data flow to handle and transform datasets. Lesson 2, "Creating and Editing Control Flow Objects," showed how to use control flow tasks and containers. Notice the difference between the Control Flow toolbox items and the Data Flow toolbox items. When you are working in the data flow, the toolbox shows items related to data flow development, including data flow sources, data flow transformations, and data flow destinations. In this lesson, you will look at the details of the source and destination adapters as well as the transformations.
  • #6: If you run a package in SSIS Designer, you can view the data in a data flow by attaching data viewers to a path. A data viewer can be configured to display data in a grid, histogram, scatter plot, or column chart. A data viewer is a useful debugging tool. For more information, see Debugging Data Flow. For example, if a path connects an OLE DB source and a Sort transformation, the OLE DB source is the source of the path, and the Sort transformation is the destination of the path. The source is the component where the path starts, and the destination is the component where the path ends. The SSIS Designer provides the Data Flow Path Editor dialog box for setting path properties, viewing the metadata of the data columns that pass through the path, and configuring data viewers.
  • #7: ADO.NET Source: Provides connections to tables or queries through an ADO.NET provider. Excel Source: Allows extractions from an Excel worksheet defined in an Excel file. Flat File Source: Connects to a delimited or fixed-width file created with different code pages. OLE DB Source: Connects to installed OLE DB providers, such as SQL Server, Access, SSAS, and Oracle. Raw File Source: Stores native SSIS data in a binary file type useful for data staging. XML Source: Allows raw data to be extracted from an XML file; requires an XML schema to define data associations.
  • #8: Data flow source adapters use package connections, which point to the server instance or file location of the data source. (The only exception is the raw file adapter, which does not use a package connection.) A source adapter extracts data from sources and moves it into the data flow, where it will be modified and sent to a destination. The source for a data flow typically has one regular output. The regular output contains output columns, which are columns the source adds to the data flow.
  • #9: Data flow destinations are similar to sources in that they use package connections. However, destinations are the endpoints in a package, defining the location to which the data should be pushed. For example, if you are sending data to an Excel file from a database table, your destination will be an Excel Destination adapter. All the source adapters (except the DataReader source) have matching destination adapters in the SSIS data flow, and there are other destination adapters that let you send data to even more destinations. ADO.NET Destination: Allows insertion of data by using an ADO.NET managed provider. Data Mining Model Training: Lets you pass data from the data flow into a data mining model in SSAS. DataReader Destination: Lets you put data in an ADO.NET record set that can be programmatically referenced. Dimension Processing: Lets SSAS dimensions be processed directly from data flowing through the data flow. Excel Destination: Used for inserting data into Excel, including Excel 2007. Flat File Destination: Allows insertion of data to a flat file such as a comma-delimited or tab-delimited file. OLE DB Destination: Uses the OLE DB provider to insert rows into a destination system that allows an OLE DB connection. Partition Processing: Allows SSAS partitions to be processed directly from data flowing through the data flow. Raw File Destination: Stores native SSIS data in a binary file type useful for data staging. Recordset Destination: Takes the data flow data and creates a record set in a package variable of type object. SQL Server Compact Destination: Lets you send data to a mobile device running SQL Mobile. SQL Server Destination: Provides a high-speed destination specific to SQL Server 2008 if the package is running on SQL Server.
  • #10: Like the source, the destination adapter requires configuration, both in the connection and the table that the rows should be inserted into, as well as in mapping the data flow columns to the destination table columns.
  • #12: As with the source and destination adapters, you drag transformations from the Data Flow toolbox onto the Data Flow tab of the SSIS Designer, and edit them by right-clicking the transformation you want to change and then clicking Edit. You connect sources, transformations, and destinations through data paths, which you create by dragging the output arrow onto another component in the data flow. The green data path arrows are for rows that are successfully transformed, and the red output path arrows are for rows that failed the transformation because of an error, such as a truncation or conversion error. Figure 1-25 shows a data flow that connects a source to several transformations through data paths and on to a destination.
  • #13: Notice that this OLE DB Destination uses the AdventureWorksDW2008 connection and is configured by default to use the Table Or View—Fast Load option of the Data Access Mode drop-down list. This means that records will be processed with bulk insert statements rather than one row at a time. Figure 1-24 shows the Mappings tab of the same OLE DB Destination Editor. This is where you map columns available from the data flow to the destination columns in the destination adapter. All the destination adapters have a Mappings tab. FIGURE 1-24: Each destination adapter requires you to map data from the data flow input columns to the destination columns. Notice that not all columns are mapped. However, if one of the unmapped destination columns is marked as NOT NULL, the destination fails the package when it is run. In the section titled "Using Transformations" later in this lesson, you see how to use the Slowly Changing Dimension Transformation to handle new records and updates.
  • #15: Transformations perform a wide variety of operations on the underlying data, and the transformation you choose depends on your data processing requirements. Some transformations operate similarly to other transformations; therefore, we can categorize them into natural groupings of like components.
  • #16: Character Map: Performs common text operations, such as Uppercase, and allows advanced linguistic bit conversion operations. Copy Column: Duplicates column values in each row to new named columns. Data Conversion: Creates new columns in each row based on new data types converted from other columns—for example, converting text to numeric. Derived Column: Uses the SSIS Expression language to perform in-place calculations on existing values; alternatively, allows the addition of new columns based on expressions and calculations from other columns and variables. Export Column: Exports binary large object (BLOB) columns, one row at a time, to a file. Import Column: Loads binary files such as images into the pipeline; intended for a BLOB data type destination. OLE DB Command: Performs database operations such as updates and deletes, one row at a time, based on mapped parameters from input rows.
  • #17: Aggregate: Associates records based on defined groupings and generates aggregations such as SUM, MAX, MIN, and COUNT. Percent Sampling: Filters the input rows by allowing only a defined percent to be passed to the output path. Pivot: Takes multiple input rows and pivots the rows to generate an output with more columns based on the original row values. Row Sampling: Outputs a fixed number of rows, sampling the data from the entire input, no matter how much larger than the defined output the input is. Sort: Orders the input based on defined sort columns and sort direction and allows the removal of duplicates across the sort columns. Unpivot: Takes a single row and outputs multiple rows, moving column values to the new row based on defined columns.
  • #18: Cache Transform: Allows data that will be used in a Lookup Transformation to be cached and available for multiple Lookup components. Conditional Split: Routes or filters data based on Boolean expressions to one or more outputs, from which each row can be sent out only one output path. Multicast: Generates one or more identical outputs, from which every row is sent out every output. Union All: Combines one or more similar inputs, stacking rows one on top of another, based on matching columns. Merge Join: Joins the rows of two sorted inputs based on a defined join column(s), adding columns from each source. Lookup: Allows matching between pipeline column values and external database tables; additional columns can be added to the data flow from the external table. Merge: Combines the rows of two similar sorted inputs, one on top of the other, based on a defined sort key.
  • #19: Audit: Adds additional columns to each row based on system package variables such as ExecutionStartTime and PackageName. Row Count: Tracks the number of rows that flow through the transformation and stores the number in a package variable after the final row.
  • #20: Slowly Changing Dimension: Processes dimension changes, including tracking dimension history and updating dimension values. The Slowly Changing Dimension Transformation handles these common dimension change types: Historical Attributes, Fixed Attributes, and Changing Attributes. Data Mining Query: Applies input rows against a data mining model for prediction. Fuzzy Grouping: Associates column values with a set of rows based on similarity, for data cleansing. Fuzzy Lookup: Joins a data flow input to a reference table based on column similarity. The Similarity Threshold setting specifies the closeness of allowed matches—a high setting means that matching values are closer in similarity. Script Component: Provides VB.NET scripting capabilities against rows, columns, inputs, and outputs in the data flow pipeline. Term Extraction: Analyzes text input columns for English nouns and noun phrases. Term Lookup: Analyzes text input columns against a user-defined set of words for association.