Case study: FeatureBase, Part 1: Fine-tuning metrics for data collection
In this course, you've been thinking about the stages of the business intelligence process. This case study with FeatureBase will focus on the Capture
stage of the BI process, where you examine static, backward-facing data and plan for the next two phases of the project. In two follow-up case studies,
you’ll learn about how FeatureBase addressed the Analyze and Monitor stages of this project. But first, you’ll need to understand the problem,
process, and solutions for this first stage of the project.
As a BI professional, you will add value to the organizations you work with. Your expertise will help organizations access the right data, use data to find
ways to grow and improve, and put those insights into action. Throughout this certificate program, you will have the opportunity to explore how different
businesses handled real challenges they faced using business intelligence. In this reading, you will be introduced to FeatureBase, an Operational AI
company in Austin, Texas. Across the three courses, you will encounter three case studies that follow the FeatureBase team’s approach to an actual
problem they faced. This is a great example of how a real company solved a BI problem and completed an entire project, starting with identifying the problem and preparing to tackle it!
Company background
FeatureBase builds technologies that unlock the value of data as soon as it is created. Based in Austin, Texas, the team and community consist of
database, distributed systems, and cloud engineers, as well as leading researchers on bitmap innovation. FeatureBase’s CEO, H.O. Maycotte, and
founding engineers have worked for nearly 20 years to solve a gap in the database market and develop a new data format that is built specifically to
enable faster computation.
Their core technology, FeatureBase, is the first OLAP database built entirely on bitmaps; it powers real-time analytics and machine learning applications by simultaneously executing low-latency, high-throughput, and highly concurrent workloads.
The challenge
The sales team noticed that a significant portion of potential customers were falling off during the sales cycle. Once they spotted this pattern, they realized that they didn't have the data they needed to determine when customers were falling off. And if they couldn't determine when customers were falling off, they couldn't find out why. Finding out why was the key to creating solutions that address the problem.
The approach
The initial question was, “Why did we fall short on our quarterly revenue target?” To answer that question, the FeatureBase team needed to know why
people dropped off and when drop-off happened. But they didn’t have the metrics built into their database to actually measure that. In order to build
this question into their data collection, they had to experiment with what data was actually useful, add new attributes, and refine their metrics. For this
particular project, the solution was clear: recreate their existing sales funnel with key attributes about each potential customer at every stage of the
project.
To do this, the sales leader, marketing leader, and CEO collaborated to decide on new metrics and how to implement them within the system. It required some experimentation: the team was committed to iterating and fine-tuning their data collection process in order to optimize this solution. Tuning is often a necessary part of creating forward-looking solutions; the first model is usually not the best one. It's a first draft, and you have to revise it to reach the best version of the solution. As a BI professional, the reality is that you might have to iterate a few times to get your model where you need it.
The next step
As a BI professional, there will be times when you are asked a question that you don’t have sufficient data to actually answer. Sometimes, you have to
keep digging, keep researching, and keep thinking about how to provide an insightful answer your team can actually use. In this case, the
FeatureBase team had observed a trend, but they couldn't determine what it was or how to act on it with the data they had. The first step was deciding what metrics they could implement to actually capture useful observations. As a team, they collaborated and fine-tuned their data collection processes. Coming up in the next course, you'll learn more about how they implemented these new processes in their database systems, what tools they used, and how that set them up for success.
If you’re interested in reading more about FeatureBase’s approach to answering this question, you can find more in the FeatureBase part two and part
three readings featured in upcoming courses.
Review technologies and best practices
As you continue through this program, you will be introduced to a variety of business intelligence tools that will help you create systems and
processes and provide stakeholders with insights they can use to guide business decisions. Depending on the organization, you might end up
using different tools over time. Luckily, the skills you are learning now can be transferred between tools. In this reading, you’ll be given some
best practices for creating pipeline tools, data visualizations, and dashboards that you’ll be able to apply no matter what programs or tools your
organization uses.
Optimal pipeline processes
Developing tools to optimize and automate certain data processes is a large part of a BI professional’s job. Being able to automate processes such
as moving and transforming data saves users from having to do that work manually and empowers them with the ability to get answers quickly
for themselves. There are a variety of tools that BI professionals use to create pipelines, and although there are some key differences between them, there are many best practices that apply no matter what tool you use.
Modular design
As you have learned, a data pipeline is a series of processes that transport data from different sources to their final destination for storage and
analysis. A pipeline takes multiple processes and combines them into a system that automatically handles the data. Modular design principles can
enable the development of individual pieces of a pipeline system so they can be treated as unique building blocks. Modular design also makes it
possible to optimize and change individual components of a system without disrupting the rest of the pipeline. In addition, it helps users isolate
and troubleshoot errors quickly.
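To make the idea concrete, here is a minimal Python sketch of a modular pipeline, with each stage written as an independent function; the stage names, fields, and file formats are illustrative assumptions rather than features of any particular tool.

```python
import json

def extract(source_path: str) -> list[dict]:
    """Read raw records from a newline-delimited JSON file."""
    with open(source_path) as f:
        return [json.loads(line) for line in f]

def transform(records: list[dict]) -> list[dict]:
    """Normalize the email field; any business logic could slot in here."""
    return [{**r, "email": r.get("email", "").strip().lower()} for r in records]

def load(records: list[dict], dest_path: str) -> None:
    """Write cleaned records to the destination file."""
    with open(dest_path, "w") as f:
        for r in records:
            f.write(json.dumps(r) + "\n")

def run_pipeline(source_path: str, dest_path: str) -> None:
    # Each stage is a self-contained building block, so you can swap,
    # test, or optimize one stage without disrupting the others.
    load(transform(extract(source_path)), dest_path)
```

Because each stage stands alone, a change to the transformation logic can be reviewed and tested on its own, without touching extraction or loading.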
Other best practices related to modular design include using version control to track changes over time and undo them as needed. BI professionals can also create a separate development environment to test and review changes before implementing them.
Other general software development best practices are also applicable to data pipelines.
Verify data accuracy and integrity
The BI processes that move, transform, and report data findings for analysis are only useful if the data itself is accurate. Stakeholders need to be
able to depend on the data they are accessing in order to make key business decisions. It’s also possible that incomplete or inaccurate data can
cause errors within a pipeline system. Because of this, it’s necessary to ensure the accuracy and integrity of the data, no matter what tools you are
using to construct the system. Some important things to consider about the data in your pipelines are:
• Completeness: Is the data complete?
• Consistency: Are data values consistent across datasets?
• Conformity: Do data values conform to the required format?
• Accuracy: Do data values accurately represent actual values?
• Redundancy: Are data values redundant within the same dataset?
• Integrity: Are data values missing important relationships?
• Timeliness: Is the data current?
Creating checkpoints in your pipeline system to address any of these issues before the data is delivered to the destination will save time and effort
later on in the process! For example, you can add SQL scripts that test each stage for duplicates and send an error alert if any are found.
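As a concrete sketch of that kind of checkpoint, the Python example below uses the standard library's sqlite3 module to run a duplicate-detection query against a staging table; the table and column names are hypothetical.

```python
import sqlite3

def check_for_duplicates(conn: sqlite3.Connection, table: str, key_column: str) -> None:
    """Raise an error if any key value appears more than once in the table."""
    query = (
        f"SELECT {key_column}, COUNT(*) "
        f"FROM {table} GROUP BY {key_column} HAVING COUNT(*) > 1"
    )
    duplicates = conn.execute(query).fetchall()
    if duplicates:
        # A production checkpoint might alert a dashboard or on-call channel
        # instead of raising, but the test itself is the same.
        raise ValueError(f"Found {len(duplicates)} duplicated {key_column} values in {table}")

# Hypothetical usage between two pipeline stages:
# conn = sqlite3.connect("staging.db")
# check_for_duplicates(conn, table="sales_funnel", key_column="prospect_id")
```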
Creating a testing environment
Building the pipeline processes is only one aspect of creating data pipelines; pipeline development is an iterative process that might require updates and changes as technology or business needs change. Because you will want to keep improving the system, you need ways to test any changes before they're implemented, to avoid disrupting users' access to the data. This could mean creating a separate staging environment for data where you can run tests, or keeping a stable dataset that you can apply changes to and compare against current processes without interrupting the current flow.
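One lightweight way to do this is to keep a small, frozen sample dataset with known-good output and run any candidate change against it before deploying. The sketch below reuses the illustrative transform() from the modular-design example; it is an assumption for demonstration, not a prescribed workflow.

```python
def transform(records: list[dict]) -> list[dict]:
    """Same illustrative transform as in the modular-design sketch above."""
    return [{**r, "email": r.get("email", "").strip().lower()} for r in records]

def test_transform_against_baseline() -> None:
    """Run the candidate transform on a frozen sample and compare to expected output."""
    sample = [
        {"prospect_id": 1, "email": "  Alex@Example.COM "},
        {"prospect_id": 2, "email": "kim@example.com"},
    ]
    expected = [
        {"prospect_id": 1, "email": "alex@example.com"},
        {"prospect_id": 2, "email": "kim@example.com"},
    ]
    result = transform(sample)
    # If a change to transform() breaks this comparison, it never reaches users.
    assert result == expected, f"Transform output changed unexpectedly: {result}"

test_transform_against_baseline()
```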
Dynamic dashboards
Dashboards are powerful visual tools that help BI professionals empower stakeholders with data insights they can access and use when they need
them. Dashboards track, analyze, and visualize data in order to answer questions and solve problems. The following list summarizes, for each element of a dashboard, what BI professionals focus on and what stakeholders gain:
• Centralization: BI professionals create a single source of data for all stakeholders; stakeholders get a comprehensive view of data that tracks their initiatives, objectives, projects, processes, and more.
• Visualization: BI professionals show data in near-real time; stakeholders spot changing trends and patterns more quickly.
• Insightfulness: BI professionals determine the relevant information to include; stakeholders understand a more holistic story behind the numbers to keep track of goals and make data-driven decisions.
• Customization: BI professionals create custom views dedicated to a specific team or project; stakeholders drill down to more specific areas of specialized interest or concern.
Note that new data is pulled into dashboards automatically only if the data structure remains the same. If the structure changes, you will have to update the dashboard design before the data refreshes automatically again.
Dashboards are part of a business journey
Just as the dashboard on an airplane shows the pilot their flight path, your dashboard does the same for your stakeholders: it helps them navigate the project's path through the data. If you add clear markers and highlight important points on your dashboard, users will understand
where your data story is headed. Then, you can work together to make sure the business gets where it needs to go. To learn more about designing
dashboards, check out this reading from the Google Data Analytics Certificate: Designing compelling dashboards.
Effective visualizations
Data visualizations are a key part of most dashboards, so you’ll want to ensure that you are creating effective visualizations. This requires
organizing your thoughts using frameworks, incorporating key design principles, and ensuring you are avoiding misleading or inaccurate data
visualizations by following best practices.
Frameworks for organizing your thoughts about visualization
Frameworks can help you organize your thoughts about data visualization and give you a useful checklist to reference. Here are two frameworks
that may be useful for you as you create your own data visualizations:
1. The McCandless Method
2. Kaiser Fung’s Junk Charts Trifecta Checkup
Pre-attentive attributes: marks and channels
Creating effective visuals involves considering how the brain works, then using specific visual elements to communicate the information
effectively. Pre-attentive attributes are the elements of a data visualization that people recognize automatically, without conscious effort. The essential building blocks that make visuals immediately understandable are called marks and channels.
Design principles
Once you understand the pre-attentive attributes of data visualization, you can go on to design principles for creating effective visuals. These
design principles are vital to your work as a data analyst because they help you make sure that you are creating visualizations that convey your
data effectively to your audience. By keeping these rules in mind, you can plan and evaluate your data visualizations to decide if they are
working for you and your goals. And, if they aren’t, you can adjust them!
Avoiding misleading or deceptive charts
As you have been learning, BI provides people with insights and knowledge they can use to make decisions. So, it’s important that the
visualizations you create are communicating your data accurately and truthfully. To learn more about effective visualizations, check out this
reading from the Google Data Analytics Certificate: Effective data visualizations.
Make your visualizations accessible and useful to everyone in your audience by keeping in mind the following:
• Labeling
• Text alternatives
• Text-based format
• Distinguishing
• Simplifying
To learn more about accessible visualizations, check out this video from the Google Data Analytics Certificate: Making Data Visualizations
Accessible.
Conclusion
As a BI professional, you will encounter a variety of tools for creating pipeline systems, developing dashboards to share with stakeholders, and
creating effective visualizations to demonstrate your findings. Those tools require different skills, which take time and effort to learn. But often,
you can apply your knowledge to numerous processes and systems.