Intelligence Density WXGC6307
Intelligence Density: A measure of organizational intelligence and productivity. Dhar & Stein’s Intelligence Density (ID) framework allows us to roughly measure the productivity of knowledge work, in terms of achievable gains in conciseness, profit, or other measures of quality. ID is a heuristic measure of the kinds of intelligence provided by a particular analytic decision tool or system. ID refers to the amount of useful “decision support information” that a decision maker gets from using the output of an analytic system for a given amount of time.
cont. Conceptually, ID can be viewed as the ratio of the number of utiles of decision-making power (quality) to the number of units of analytic time spent by the decision maker. Example: If, over the same time frame, a decision maker consistently makes decisions that are twice as good (by some qualitative or quantitative measure) after examining Source X as after examining Source Y, we can say that Source X has twice the ID of Source Y.
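As a minimal sketch of this ratio, here is how the Source X / Source Y example might be computed; the utile scores and analysis times are hypothetical numbers chosen only to illustrate the arithmetic, not values from the framework.

```python
# Minimal sketch of the ID ratio: utiles of decision quality per unit of analytic time.
# The utile scores and times below are hypothetical, chosen only for illustration.

def intelligence_density(decision_quality_utiles: float, analytic_time_hours: float) -> float:
    """ID = decision-making power (in utiles) / analytic time spent."""
    return decision_quality_utiles / analytic_time_hours

id_x = intelligence_density(decision_quality_utiles=8.0, analytic_time_hours=2.0)  # Source X
id_y = intelligence_density(decision_quality_utiles=4.0, analytic_time_hours=2.0)  # Source Y

print(f"Source X ID = {id_x}, Source Y ID = {id_y}")  # X has twice the ID of Y
```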
Dimensions of Problems and Solutions. The right path to a successful intelligent IS project: First, you need to satisfy model output quality requirements. A solution must meet basic requirements such as accuracy and response time. In general, the quality of the outputs should be adequate to meet your organization’s needs.
cont. Second, you need to consider longer-term cost drivers, such as what it will cost to maintain, extend, or modify the system. These kinds of constraints help determine how useful the system will be in the long run. Thus, the system must be engineered correctly.
cont. Third, you need to ensure that the quality of the organization’s resources is sufficient to undertake the proposed project. These dimensions deal with human resources and infrastructure. Finally, you need to ensure that the organization can support the logistical requirements of the project (development schedules and budgets).
Intelligence Density. Intelligence Density looks at the following four areas: Quality of the Model, Engineering Dimensions, Quality of Available Resources, and Logistical Constraints.
Intelligence Density: Quality of the Model. Model quality can be assessed by: Accuracy: measures how close the outputs of a system are to the correct or best decision. Questions: Are the predictions/prescriptions correct (low errors) and profitable (low cost of errors, high value of correct predictions)? Explainability: the description of the process by which a conclusion was reached. Questions: Are the predictions/prescriptions explainable? e.g., neural nets are hard to interpret, whereas in rule-based systems we can trace the origin and justification of the rules.
cont. Speed/reliability of response time: the time it takes for a system to complete its analysis at the desired level of accuracy, and whether it reliably does so within a specified time frame. Questions: Does the system provide responses within a reasonable amount of time?
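The accuracy and response-time questions above lend themselves to a simple empirical check. Below is a minimal sketch, assuming a placeholder predict() function and a small labelled test set; the 85% accuracy and 5-second thresholds echo the case examples later in the slides, and none of this is a prescribed implementation.

```python
# Minimal sketch: measuring two model-quality dimensions, accuracy and response time.
# predict() and the test data are placeholders; thresholds echo the case examples.
import time

def evaluate_model_quality(predict, test_inputs, test_labels,
                           min_accuracy=0.85, max_seconds=5.0):
    """Return (accuracy, average response time, whether both thresholds are met)."""
    correct, elapsed = 0, 0.0
    for x, label in zip(test_inputs, test_labels):
        start = time.perf_counter()
        prediction = predict(x)
        elapsed += time.perf_counter() - start
        correct += int(prediction == label)
    accuracy = correct / len(test_labels)
    avg_time = elapsed / len(test_labels)
    return accuracy, avg_time, (accuracy >= min_accuracy and avg_time <= max_seconds)

# Purely illustrative "model": a threshold rule on a single numeric feature.
inputs, labels = [0.2, 0.8, 0.6, 0.1], [0, 1, 1, 0]
acc, avg, ok = evaluate_model_quality(lambda x: int(x > 0.5), inputs, labels)
print(f"accuracy={acc:.2f}, avg response={avg * 1000:.3f} ms, meets requirements={ok}")
```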
Intelligence Density: Engineering Dimensions. Engineering Dimensions include: Flexibility: the ease with which the relationships among the variables or their domains can be changed, or the goals of the system modified. Questions: How flexible is the system in allowing the problem specifications to be changed? Scalability: how well the system copes as more variables are added to the problem or the range of values the variables can take increases (computational complexity grows). Questions: Does the algorithm work on large data volumes? (See the timing sketch below.)
cont. Compactness: refers to how small the system can be made (a portable format). Questions: How compact is the system? Can it be installed on a laptop or handheld device? Embeddability: refers to the ease with which a system can be coupled with or incorporated into the infrastructure of an organization. Questions: Can the new system and the knowledge it produces easily be embedded into our existing software applications?
cont. Ease of use: describes how complicated the system is to operate for the person who will use it on a daily basis. Questions: Is the software easy to use?
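The scalability question above (does the algorithm work on large data volumes?) can be probed with a simple benchmark. This is a minimal sketch under assumed conditions: the data sizes are arbitrary, and sorting random numbers stands in for whatever algorithm is actually under evaluation.

```python
# Hypothetical scalability probe: time a stand-in workload on growing data volumes
# and watch how the cost grows. Sorting random numbers is purely illustrative.
import random
import time

def time_workload(n: int) -> float:
    """Time one run of the stand-in algorithm on n synthetic records."""
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    data.sort()                      # replace with the algorithm under evaluation
    return time.perf_counter() - start

for n in (10_000, 100_000, 1_000_000):
    print(f"n={n:>9,d}  elapsed={time_workload(n):.3f} s")
```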
Intelligence Density: Quality of Available Resources. Quality of Available Resources looks at: Tolerance for noise: the degree to which the quality of a system, most notably its accuracy, is affected by noise in the electronic data. Questions: Can the algorithm work with noisy data? Will its accuracy be significantly affected by noisy data? (See the noise-tolerance sketch below.) Tolerance for sparse data: the degree to which the quality of a system is affected by incompleteness or a lack of data. Questions: Can the algorithm work with small volumes of data, and with missing data?
cont. Learning curve: indicates the degree to which the organization needs to experiment in order to become sufficiently competent at solving a problem or using a technique. Questions: Is it easy to learn and implement the algorithm? Tolerance for complexity: the degree to which the quality of a system is affected by interactions among the various components of the process being modeled, or in the knowledge used to model the process. Questions: Can the algorithm cater for complex inter-relationships between variables? e.g., a weather system.
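Tolerance for noise can be probed empirically by corrupting a growing fraction of the data and re-measuring accuracy. The sketch below makes several assumptions: the corruption scheme, the fixed set of predictions, and the tiny label set are all invented for illustration.

```python
# Hypothetical noise-tolerance probe: corrupt an increasing fraction of the labels
# and observe how measured accuracy degrades. All data here is made up.
import random

def corrupt_labels(labels, noise_rate, num_classes=2, seed=0):
    """Randomly re-draw a fraction of labels to simulate noisy data."""
    rng = random.Random(seed)
    return [rng.randrange(num_classes) if rng.random() < noise_rate else y for y in labels]

def accuracy(predictions, labels):
    return sum(p == y for p, y in zip(predictions, labels)) / len(labels)

clean_labels = [0, 1, 1, 0, 1, 0, 1, 1]
predictions  = [0, 1, 1, 0, 1, 1, 1, 1]   # fixed model output, for illustration

for noise in (0.0, 0.1, 0.3):
    noisy = corrupt_labels(clean_labels, noise)
    print(f"noise={noise:.0%}  accuracy={accuracy(predictions, noisy):.2f}")
```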
Intelligence Density: Logistical Constraints. Logistical Constraints include: Independence from experts: the degree to which the system can be designed, built, and tested without experts. Questions: Do we need to take a lot of time from domain experts in order to implement the software? The problem with rule-based expert systems was that humans were required to manually encode the rules, which is difficult and time-consuming. Development speed: the time the organization can afford to spend developing the system. Questions: Will it take a lot of time or cost a lot of money to implement the algorithms in the system?
cont. Computational ease: the degree to which a system can be implemented without requiring special-purpose hardware or software. Questions: Do we have the necessary hardware resources to run the software?
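To show how the four areas above might be pulled together when comparing candidate tools, here is a minimal scorecard sketch. The 1-5 rating scale, the equal weighting within each area, and the example ratings are all assumptions made for illustration; only the dimension names follow the slides.

```python
# Hypothetical ID scorecard: rate a candidate tool on each dimension (1 = poor, 5 = strong)
# and average the ratings within each of the four areas. Scale and ratings are illustrative.
ID_DIMENSIONS = {
    "Quality of the Model": ["Accuracy", "Explainability", "Response time"],
    "Engineering Dimensions": ["Flexibility", "Scalability", "Compactness",
                               "Embeddability", "Ease of use"],
    "Quality of Available Resources": ["Tolerance for noise", "Tolerance for sparse data",
                                       "Learning curve", "Tolerance for complexity"],
    "Logistical Constraints": ["Independence from experts", "Development speed",
                               "Computational ease"],
}

def area_scores(ratings):
    """Average the per-dimension ratings within each ID area."""
    return {area: sum(ratings[d] for d in dims) / len(dims)
            for area, dims in ID_DIMENSIONS.items()}

# Example ratings for a hypothetical rule-based system (values are made up).
ratings = {d: 3 for dims in ID_DIMENSIONS.values() for d in dims}
ratings.update({"Explainability": 5, "Independence from experts": 1})

for area, score in area_scores(ratings).items():
    print(f"{area}: {score:.1f}")
```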
Intelligence Density: Case Examples
ID Case 1 A mortgage application evaluation system must give some indication of what factors it used to determine that a mortgage applicant scored poorly, so that this can be explained to the applicant or be used as the basis of further inquiries by the mortgage officer. ID → Explainability
ID Case 2 A bank needs a back-office system that processes and classifies letters of credit into “acceptable” and “unacceptable” categories; to make business sense, it must classify at least 85% of the letters correctly. ID → Accuracy
ID Case 3 A point-of-purchase credit card fraud-detection system must be able to return the results of its evaluation in under 5 seconds so that using it will not overly inconvenience store owners or cardholders. ID → Response time
ID Case 4 A system that designs shipping routes for a cargo freight firm needs to be able to generate good routes regardless of whether there are 10 or 200 cities being served, or 3 or 30 ships in the fleet. ID → Scalability
ID Case 5 A system designed to rank financial investment alternatives according to risk and return needs to be updated over time to allow for new investment instruments and financial strategies. ID → Flexibility
ID Case 6 A system that determines how much a client should be billed for a particular service, based on information about the client, must be able to share information with the firm’s client information database and its current billing and accounting systems. ID → Embeddability
ID Case 7 A system that aids marketing personnel in interviewing clients and suggesting products needs to be compact enough to be installed on a laptop computer and taken on client calls. ID → Compactness
ID Case 8 A consultant suggests that you develop a system using a genetic learning algorithm for data mining. You have never done this before, which means you’ll need to do a lot of background work and learning first, and implement a small-scale prototype system to understand how the GA would mine the data. ID → Learning curve
ID Case 9 In developing a particular type of stock trading system using neural networks, developers estimate that they will need at least 60 months of accurate historical data, normalized for stock splits, and so on. ID → Tolerance for data sparseness and noise
ID Case 10 If you decide to use a genetic algorithm for data mining, you will have to load hundreds of megabytes of data into memory at one time; this will require access to a very large mainframe or a massively parallel computer. ID → Computational ease
ID Case 11 In developing a stock-picking rule-based expert system, you realize that you will need access to an experienced trader for at least 4 hours a week over the course of several months in order to specify the process by which stocks are selected and to validate the system’s results. ID → Independence from Experts
ID Case 12 Based on initial discussions with experts, in developing a hybrid rule-based system to spot exchange-rate patterns, you estimate that the system will consist of roughly 500 rules, which will probably take 6 to 8 months to extract from experts, validate, and organize into a production version of the system. ID → Development time
