Data quality management definition
In this file, you can find useful information about data quality management definition, such as data
quality management definition forms, tools for data quality management definition, and data quality
management definition strategies. If you need more assistance with data quality management
definition, please leave a comment at the end of this file.
Other useful material for data quality management definition:
• qualitymanagement123.com/23-free-ebooks-for-quality-management
• qualitymanagement123.com/185-free-quality-management-forms
• qualitymanagement123.com/free-98-ISO-9001-templates-and-forms
• qualitymanagement123.com/top-84-quality-management-KPIs
• qualitymanagement123.com/top-18-quality-management-job-descriptions
• qualitymanagement123.com/86-quality-management-interview-questions-and-answers
I. Contents of data quality management definition
==================
Definition - What does Data Quality Management (DQM) mean?
Data quality management is an administrative discipline that incorporates role establishment, role
deployment, policies, responsibilities and processes with regard to the acquisition, maintenance,
disposition and distribution of data. For a data quality management initiative to succeed,
a strong partnership between technology groups and the business is required.
Information technology groups are in charge of building and controlling the entire environment,
that is, architecture, systems, technical establishments and databases. This overall environment
acquires, maintains, disseminates and disposes of an organization's electronic data assets.
Techopedia explains Data Quality Management (DQM)
When considering a business intelligence platform, there are various roles associated with data
quality management:
Project leader and program manager: In charge of supervising individual projects or the business
intelligence program as a whole. They also manage day-to-day operations within budget, scope
and schedule constraints.
Organization change agent: Assists the organization in recognizing the impact and value of the
business intelligence environment, and helps the organization to handle any challenges that arise.
Data analyst and business analyst: Communicate business needs, including in-depth data
quality requirements. The data analyst represents these needs in the data model as well as in the
requirements for the data acquisition and delivery procedures. Collectively, these analysts
ensure that the quality requirements are identified and reflected in the design, and that these
requirements are conveyed to the development team.
Data steward: Handles data as a corporate asset.
An effective data quality management approach has both reactive and proactive elements. The
proactive elements include:
• Establishment of overall governance
• Identification of roles and responsibilities
• Creation of quality expectations as well as the supporting business strategies
• Implementation of a technical platform that facilitates these business practices
==================
II. Quality management tools
1. Check sheet
The check sheet is a form (document) used to collect data
in real time at the location where the data is generated.
The data it captures can be quantitative or qualitative.
When the information is quantitative, the check sheet is
sometimes called a tally sheet.
The defining characteristic of a check sheet is that data
are recorded by making marks ("checks") on it. A typical
check sheet is divided into regions, and marks made in
different regions have different significance. Data are
read by observing the location and number of marks on
the sheet.
Check sheets typically employ a heading that answers the
Five Ws:
• Who filled out the check sheet
• What was collected (what each check represents, an identifying batch or lot number)
• Where the collection took place (facility, room, apparatus)
• When the collection took place (hour, shift, day of the week)
• Why the data were collected
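As a sketch, the tallying a check sheet performs can be modeled in a few lines of Python; the defect names and counts below are hypothetical examples, not data from the text:

```python
from collections import Counter

# Hypothetical defect observations marked on a check sheet during one shift.
observations = ["scratch", "dent", "scratch", "misalignment", "scratch", "dent"]

# Each observation adds one "check" to its region of the sheet.
tally = Counter(observations)

def render_tally(tally):
    """Render the tally sheet as rows of marks, most frequent defect first."""
    return [f"{defect}: {'|' * count}" for defect, count in tally.most_common()]

for row in render_tally(tally):
    print(row)
```

Reading the sheet then amounts to observing the location (the defect row) and the number of marks, exactly as described above.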
2. Control chart
Control charts, also known as Shewhart charts
(after Walter A. Shewhart) or process-behavior
charts, in statistical process control are tools used
to determine if a manufacturing or business
process is in a state of statistical control.
If analysis of the control chart indicates that the
process is currently under control (i.e., is stable,
with variation only coming from sources common
to the process), then no corrections or changes to
process control parameters are needed or desired.
In addition, data from the process can be used to
predict the future performance of the process. If
the chart indicates that the monitored process is
not in control, analysis of the chart can help
determine the sources of variation, as this will
result in degraded process performance.[1] A
process that is stable but operating outside of
desired (specification) limits (e.g., scrap rates
may be in statistical control but above desired
limits) needs to be improved through a deliberate
effort to understand the causes of current
performance and fundamentally improve the
process.
The control chart is one of the seven basic tools of
quality control.[3] Typically control charts are
used for time-series data, though they can be used
for data that have logical comparability (i.e. you
want to compare samples that were taken all at
the same time, or the performance of different
individuals), however the type of chart used to do
this requires consideration.
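The "in control" decision described above can be sketched numerically. This is a simplified illustration, not a formal Shewhart individuals chart (which would typically estimate spread from moving ranges): limits here are simply the baseline mean plus or minus three sample standard deviations, and all measurements are hypothetical:

```python
import statistics

def control_limits(baseline, k=3):
    """Center line and control limits estimated from a known-stable baseline.

    A sketch only: limits are mean +/- k sample standard deviations,
    not the moving-range estimate a formal individuals chart would use.
    """
    center = statistics.mean(baseline)
    spread = statistics.stdev(baseline)
    return center - k * spread, center, center + k * spread

def out_of_control(baseline, monitored):
    """Indices of monitored points falling outside the baseline control limits."""
    lcl, _, ucl = control_limits(baseline)
    return [i for i, x in enumerate(monitored) if x < lcl or x > ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9]  # hypothetical stable run
monitored = [10.1, 10.2, 9.9, 14.5]                        # new measurements
flagged = out_of_control(baseline, monitored)
```

Points inside the limits (common-cause variation only) warrant no adjustment; a flagged point is a prompt to investigate special causes, as the text notes.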
3. Pareto chart
A Pareto chart, named after Vilfredo Pareto, is a type
of chart that contains both bars and a line graph, where
individual values are represented in descending order
by bars, and the cumulative total is represented by the
line.
The left vertical axis is the frequency of occurrence,
but it can alternatively represent cost or another
important unit of measure. The right vertical axis is
the cumulative percentage of the total number of
occurrences, total cost, or total of the particular unit of
measure. Because the causes are listed in decreasing order,
the cumulative function is concave. In the classic example of
charting causes of late arrival, solving just the first three
issues would be sufficient to reduce late arrivals by 78%.
The purpose of the Pareto chart is to highlight the
most important among a (typically large) set of
factors. In quality control, it often represents the most
common sources of defects, the highest occurring type
of defect, or the most frequent reasons for customer
complaints, and so on. Wilkinson (2006) devised an
algorithm for producing statistically based acceptance
limits (similar to confidence intervals) for each bar in
the Pareto chart.
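The descending bars and cumulative-percentage line are simple to compute. A minimal sketch, using a hypothetical tally of customer complaints:

```python
def pareto_analysis(counts):
    """Sort causes by frequency and compute the cumulative percentage line."""
    total = sum(counts.values())
    ordered = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    rows, running = [], 0
    for cause, n in ordered:
        running += n
        rows.append((cause, n, round(100 * running / total, 1)))
    return rows

# Hypothetical complaint counts; the bar heights and cumulative line follow.
complaints = {"late delivery": 45, "wrong item": 25, "damaged": 15,
              "billing": 10, "other": 5}
for cause, n, cum_pct in pareto_analysis(complaints):
    print(f"{cause:15s} {n:3d} {cum_pct:5.1f}%")
```

Reading the cumulative column shows at a glance how few causes account for most occurrences, which is the point of the chart.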
4. Scatter plot method
A scatter plot, scatterplot, or scattergraph is a type of
mathematical diagram using Cartesian coordinates to
display values for two variables for a set of data.
The data is displayed as a collection of points, each
having the value of one variable determining the position
on the horizontal axis and the value of the other variable
determining the position on the vertical axis.[2] This kind
of plot is also called a scatter chart, scattergram, scatter
diagram,[3] or scatter graph.
A scatter plot is used when a variable exists that is under
the control of the experimenter. If a parameter is
systematically incremented and/or decremented by the
experimenter, it is called the control parameter or independent
variable and is customarily plotted along the horizontal
axis. The measured or dependent variable is customarily
plotted along the vertical axis. If no dependent variable
exists, either type of variable can be plotted on either axis
and a scatter plot will illustrate only the degree of
correlation (not causation) between two variables.
A scatter plot can suggest various kinds of correlations
between variables with a certain confidence interval. For
example, to plot weight against height, weight would go on
the x-axis and height on the y-axis. Correlations may be
positive (rising), negative (falling), or null (uncorrelated).
If the pattern of dots slopes from lower left to upper right,
it suggests a positive correlation between the variables
being studied. If the pattern of dots slopes from upper left
to lower right, it suggests a negative correlation. A line of
best fit (alternatively called 'trendline') can be drawn in
order to study the correlation between the variables. An
equation for the correlation between the variables can be
determined by established best-fit procedures. For a linear
correlation, the best-fit procedure is known as linear
regression and is guaranteed to generate a correct solution
in a finite time. No universal best-fit procedure is
guaranteed to generate a correct solution for arbitrary
relationships. A scatter plot is also very useful when we
wish to see how two comparable data sets agree with each
other. In this case, an identity line, i.e., a y = x or 1:1 line,
is often drawn as a reference. The more closely the two
data sets agree, the more the points tend to concentrate in
the vicinity of the identity line; if the two data sets are
numerically identical, the points fall on the identity line
exactly.
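The least-squares "line of best fit" mentioned above has a closed-form solution in the linear case. A self-contained sketch, using hypothetical weight/height pairs as in the example:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a + b*x (the line of best fit)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is the ratio of the x-y covariance sum to the x variance sum.
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical weight (kg, x-axis) vs height (cm, y-axis) observations.
weights = [50, 60, 70, 80, 90]
heights = [155, 162, 170, 178, 185]
a, b = linear_fit(weights, heights)
```

As the text notes, this direct procedure is guaranteed to terminate for linear correlation; arbitrary relationships have no such universal best-fit procedure.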
5. Ishikawa diagram
Ishikawa diagrams (also called fishbone diagrams,
herringbone diagrams, cause-and-effect diagrams, or
Fishikawa) are causal diagrams created by Kaoru
Ishikawa (1968) that show the causes of a specific
event.[1][2] Common uses of the Ishikawa diagram are
product design and quality defect prevention, to identify
potential factors causing an overall effect. Each cause or
reason for imperfection is a source of variation. Causes
are usually grouped into major categories to identify these
sources of variation. The categories typically include
• People: Anyone involved with the process
• Methods: How the process is performed and the specific requirements for doing it, such as policies, procedures, rules, regulations and laws
• Machines: Any equipment, computers, tools, etc. required to accomplish the job
• Materials: Raw materials, parts, pens, paper, etc. used to produce the final product
• Measurements: Data generated from the process that are used to evaluate its quality
• Environment: The conditions, such as location, time, temperature, and culture, in which the process operates
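For teams working programmatically, a fishbone diagram is naturally a mapping from major category to candidate causes. The categories follow the list above; the causes are hypothetical examples for a late-shipment problem:

```python
# A fishbone diagram stored as category -> candidate causes.
# Causes are illustrative placeholders, not findings from the text.
fishbone = {
    "People": ["untrained packers"],
    "Methods": ["no picking checklist"],
    "Machines": ["label printer jams"],
    "Materials": ["wrong box sizes in stock"],
    "Measurements": ["shipment times not logged"],
    "Environment": ["warehouse congestion at peak hours"],
}

def all_causes(diagram):
    """Flatten the diagram into (category, cause) pairs for review."""
    return [(cat, cause) for cat, causes in diagram.items() for cause in causes]
```

Grouping candidate causes this way mirrors how the diagram itself organizes sources of variation for investigation.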
6. Histogram method
A histogram is a graphical representation of the
distribution of data. It is an estimate of the probability
distribution of a continuous variable (quantitative
variable) and was first introduced by Karl Pearson.[1] To
construct a histogram, the first step is to "bin" the range of
values, that is, divide the entire range of values into a
series of small intervals, and then count how many
values fall into each interval. A rectangle is drawn with
height proportional to the count and width equal to the bin
size, so that rectangles abut each other. A histogram may
also be normalized to display relative frequencies. It then
shows the proportion of cases that fall into each of several
categories, with the sum of the heights equaling 1. The
bins are usually specified as consecutive, non-overlapping
intervals of a variable. The bins (intervals) must be
adjacent, and usually of equal size.[2] The rectangles of a
histogram are drawn so that they touch each other to
indicate that the original variable is continuous.[3]
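The binning-and-counting procedure described above is straightforward to sketch; the bin count and data values below are illustrative only:

```python
def histogram(values, bin_count):
    """Bin values into bin_count equal-width, adjacent intervals and count each."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bin_count
    counts = [0] * bin_count
    for v in values:
        # The maximum value would index one past the end; clamp it into the last bin.
        idx = min(int((v - lo) / width), bin_count - 1)
        counts[idx] += 1
    return counts

def normalized(counts):
    """Relative frequencies: bar heights summing to 1."""
    total = sum(counts)
    return [c / total for c in counts]

data = [1.2, 1.9, 2.4, 2.5, 3.1, 3.3, 3.8, 4.0, 4.6, 4.9]
counts = histogram(data, 4)
```

Each count becomes the height of one rectangle; dividing by the total gives the normalized form whose heights sum to 1.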
III. Other topics related to Data quality management definition (pdf
download)
quality management systems
quality management courses
quality management tools
iso 9001 quality management system
quality management process
quality management system example
quality system management
quality management techniques
quality management standards
quality management policy
quality management strategy
quality management books