Exploring The Danish
Approach to Evaluating
Public Sector Innovation
Webinar May 23rd 2018
Lene Krogh Jeppesen, Senior Consultant
Mail: lkj@coi.dk Twitter: @lenekrogh
Innovation is a new or
significantly changed way
of improving the workplace’s
activities and results.
COI:
Established 2014
Covers all three layers
of government:
• national
• regional
• local
9.4 FTE
14-12-2017
Minister for Public Sector Innovation,
Sophie Løhde
(since November 28, 2016)
25.06.2018 3
Minister for
Public Sector
Innovation
Agency for the
modernization of the
public sector
The National Center for Public
Sector Innovation
Agency for Digitization
Agency for Governmental
Administration
Agency for Governmental IT-
services
COI strategic framework 2017-
Agenda:
1. WHAT is the state of evaluating public
sector innovation in Denmark?
#Facts
2. WHY did COI decide to work in these
two fields?
#DefiningTheProblem
3. WHAT did COI do?
#ShapingTheSolution
4. NOW WHAT?
#NextSteps
1. WHAT is the state of
evaluating public sector
innovation in Denmark?
#Facts
Fact: How many?
Fact: Who evaluates?
External: consultants or researchers, for example, have run or contributed to the evaluation
Internal: evaluation run by the workplace
Fact: How?
Fact: Purposes
Learning to improve (innovation) efforts
Examining whether the innovation has achieved its goal
Spreading one's experiences with others
Documenting the value of the innovation to decision makers
Better managing the innovation process along the way
Other
2. WHY did COI decide to
work in these two fields?
#DefiningTheProblem
Defining the problem pt. 1:
Based on the InnovationBarometer
1. (not) knowing the value
2. the needs of the internal evaluators
3. (not) documenting results
Defining the problem pt. 2:
Based on field work
1. A lack of a systematic evaluation approach within public sector innovation
2. A wish for:
‒ More evaluation of innovation
‒ Tools to evaluate innovation
‒ Increase in internal evaluation capacity
‒ Network and knowledge sharing on evaluating public sector innovation
3. How do we increase internal evaluation capacity?
‒ New models and methods
‒ Innovators need evaluation skills
‒ Specific guides or tools to evaluate public sector innovation
‒ Network and knowledge sharing on evaluating public sector innovation
Two fields
crossing paths
… or not
crossing paths
3. WHAT did COI do?
#ShapingTheSolution
Our approach:
1. Exploring the problem: What troubles us when evaluating innovation?
2. Community building: Involving experts and practitioners
3. Developing and testing prototypes
Publication
COI’s evaluation activities (2015-16):
Innovators getting evaluation skills
Conference: "HELP! I'm evaluating an innovative activity" – Network and knowledge sharing; Models and methods
Publication: "Guidance on evaluating an innovative activity" by professor Peter Dahler-Larsen – Models and methods
Working community on evaluating innovation – Network and knowledge sharing; Specific guides and tools
Publication: "Hands-on guide" with dialogue tools – Specific guides and tools
Working community?
• Who:
‒Participants: 30 consultants with knowledge of innovation/evaluation
‒Advisory board: 4 experts spanning theory and practice
• Purpose:
‒To enhance evaluation of public sector innovation (more evaluations)
‒To make evaluation of innovation easier for the internal evaluators
(capacity building).
• Objectives:
‒To create networks, knowledge sharing and development of new
approaches between actors working on evaluating public innovation.
Process: Working Community
1. Introducing: theory, framework, definitions
‒ Introducing the suggested framework
‒ Testing it on an innovative activity
2. Developing the framework
‒ Further developing and qualifying the content of a framework for evaluation of innovation
‒ Open space: tools, framework, skills, planning, roles
3. Testing the prototype
‒ Testing a prototype
‒ Evaluating the work
The concept: Four phases when evaluating innovation
Dialogue tool - example
An evaluation artefact
4. NOW WHAT?
#NextSteps
Since 2016
• Distribution, introductory workshops and training
• Article and evaluation conferences:
‒ Combining spreading and evaluation
‒ Organizational evaluation capacity
‒ When political leadership asks for evaluation of innovation
Current challenges:
Attention & playmates…
Thank you
Find our (limited) products and information in English at coi.dk/about-coi
Our tool kit is available for download – in Danish: coi.dk/evaluering.
Lene Krogh Jeppesen: lkj@coi.dk // @lenekrogh

Editor's Notes

  • #2: I’m Lene Krogh Jeppesen and am a senior consultant at the Danish National Centre for Public Sector Innovation. In this talk I will introduce you to our take on public sector innovation, the challenge of evaluating public sector innovation, the approaches COI put together to evaluate Danish innovations, and their results so far.
  • #3: First a few words on what COI is and how we understand public sector innovation. We span all three layers of the public sector and have been around since 2014. It is a rather unique structure, as far as we can tell, in that there is common ownership and financing by the government, by the association of local municipalities and by the parallel association for the regional-level organizations. There are nine of us plus a part-time student. Most of us have a background in doing public sector innovation – we know what it’s like getting our hands dirty and trying to change the public sector from within. To us, innovation is a new or significantly changed way of improving the workplace’s activities and results. In other words: DOING, not just talking and getting ideas.
  • #4: Some of you might have heard that we have what we think is the world’s first minister for public sector innovation: Sophie Løhde. Here she is together with our director. She is one of two ministers within the Ministry of Finance, where COI is now organizationally housed, but with a high degree of independence – thanks to the aforementioned common ownership and financing by the government, by the association of local municipalities and by the parallel association for the regional-level organizations. Within the Ministry of Finance, the Minister for Public Sector Innovation is responsible for the major agencies that constitute the “machine room” of the Danish public sector: the four agencies for Modernization of the Public Sector, Digitization, Governmental Administration and Governmental IT Services – and us at COI. COI’s obligation is to bring more quality and efficiency in and across the public sector by strengthening the valuable outcomes of innovation.
  • #5: Until the end of 2016, COI prioritized building a strong foundation for the continuous work with public sector innovation in Denmark. Our InnovationBarometer – the world’s first statistic on public sector innovation – has shown that 80% of Danish public workplaces are innovative and that three out of four innovations are inspired by or even copied from other workplaces, thus killing myths about the sorry state of public sector innovation. We’ve established an innovation internship that builds relations and connections across the public sector, enabling the spreading of innovation. Our two networks for innovation leaders and innovators have focused on establishing deep relationships, so that the hardships of being innovative within the public sector can be shared openly and solutions can be found. We’ve developed dialogue tools and kits to enable innovators to evaluate and spread innovation. Since 2017 we have worked on using these building blocks in new combinations and on new territories in order to achieve changes in the system and behaviors and to continuously ensure that (new) knowledge is used. We step into the roles of dialogue and leverage partner, capacity builder and knowledge partner. All of this also means that, as we are 9 people supporting 800,000 public sector employees, our focus is very much on enabling the actors in the public sector who make innovation happen on a day-to-day basis – be it the consultant, project manager, middle management or top-level managers and politicians. The work we did on evaluating public sector innovation is one example of this. We knew there were still challenges to doing this, and we worked with the innovators and evaluators to find a path forward – a structure that can help innovators.
  • #6: So, now for today’s actual agenda: evaluating public sector innovation. I’ve structured the talk into four sections: WHAT is the state of evaluating public sector innovation in Denmark? #Facts WHY did COI decide to work in these two fields? #DefiningTheProblem – the two fields being innovation and evaluation WHAT did COI do? #ShapingTheSolution NOW WHAT? #NextSteps
  • #7: I’ll start with some facts from our InnovationBarometer, which is the world’s first statistic on public sector innovation in accordance with the Oslo Manual. We’ve collected data twice – regarding 2013/14 and 2015/16. The following numbers are from the first data collection. We haven’t fully crunched the numbers from the second collection.
  • #8: From the statistic we know that 44% of the public sector innovations have been evaluated
  • #9: We know that a vast majority of those are internal evaluations.
  • #10: We know that nearly a third of the evaluations are based on the workplaces’ own professional assessments, whereas only 11% are based on studies among the citizens and/or companies that the innovations are meant to serve.
  • #11: We know that learning is very high on the purpose agenda, whereas documenting the value to decision makers and using the evaluation to better manage the innovation process along the way only happen in about one in five of the evaluations.
  • #12: So, knowing these facts – how does COI see the problems with public sector innovation?
  • #13: Our definition of innovation is that it is an actual improvement – but how do we know if it actually is an improvement? How do we know if public sector innovations show any value and are an improvement if only 44% are evaluated? Seeing as so many of the evaluations are done internally: how can we establish a basis for better internal evaluation practices? Especially considering how relatively mature the innovation approach is in the Danish public sector: how can we continue to argue that innovation is a proper approach in the public sector when we have so little focus on documenting results? We chose to supplement these defining questions with field work to gather even more knowledge.
  • #14: The field work – amongst some of the places in the public sector where we knew they actively worked with evaluation of innovation – showed us that the numbers were right: there is a lack of a systematic evaluation approach within public sector innovation. We also saw a wish for: more evaluation of innovation; tools to evaluate innovation; an increase in internal evaluation capacity; and network and knowledge sharing on public sector innovation. The evaluators pointed to several ways to increase internal evaluation capacity – several ideas for actions and solutions for building capacity: new models and methods; evaluation skills for innovators; specific guides or tools to evaluate public sector innovation; and network and knowledge sharing on public sector innovation.
  • #15: One of our realizations was that it really was a matter of two different mindsets and paradigms having trouble crossing paths. Whereas the evaluator works retrospectively, the innovator works futurespectively. Whereas the evaluator works in a linear manner, the innovator works in a circular manner. Whereas the evaluator is dedicated to measuring, the innovator is dedicated to the process. In the intersection they unite around a systematic approach to their respective work fields. The intersection is not crowded, and we do want more people to play with in this intersection.
  • #16: So what did we actually do to find answers to this complex of problems?
  • #17: Having explored the problem we started community building, then developed and tested prototypes in order to publish tools and guides to help innovators evaluate more and better.
  • #18: Aside from showing you specifically how we worked, I wanted to give you an idea of the timeline. All of this took place in under a year: the conference was held in late November 2015 and the hands-on guide went to print over the summer of 2016. I want to show you that these four main activities combined were the first part of a solution to the problems with evaluating innovation. We kind of launched the intensive work towards ”a solution” with a conference – this initiated the community building. We had talks on evaluation of innovation in theory and practice and worked on an evaluation case. We published a 50-page booklet written by an esteemed Danish professor of evaluation. It gives an overview of the field of evaluation – the various theoretic approaches – and some overall advice on evaluating innovation. We initiated a working community to help us co-design exactly how we could help innovators evaluate. The output from the working community was published as a hands-on guide with dialogue tools. And now I’ll dig a little deeper into the working community, where we co-designed the hands-on guide with experts and practitioners.
  • #19: So what is a working community? To us it was a group of participants who helped us co-design what they actually needed to help the field of public sector innovation do more and better evaluations. An advisory board of researchers and practitioners kept us on track and was useful for discussing the general direction and implementation of the “products”. During the three sessions when the community met, we switched between different types of work: presentations on evaluation of innovation in theory and practice – so the participants felt they could take something away from the meetings as well as give us something; discussions; and hands-on work with an evaluation case and joint exercises.
  • #20: This was the overall flow of the three meetings in the co-design process. The entire process took place over five weeks: there was a week’s gap between the first two meetings and four weeks between the last two.
  • #21: These were just half-day meetings, so the key in planning was to be structured – structured in the testing during the meetings and in making sure we got all the insights from the practitioners and used them to develop relevant tools. We continued testing and adjusting the prototype after the last meeting.
  • #22: The result was a model consisting of four phases when evaluating innovation. I’ll briefly give you a bit of insight into why we chose these phases. CLARIFY: Why evaluate? Because anchoring an innovative project in the organization is a challenge. The dialogue should start early – and be returned to continually. We should connect innovation and evaluation at the beginning, so they do not run off in different directions. PLAN: How to evaluate? Because evaluation is not something that ”just” happens – neither is innovation, for that matter. Because the evaluation must be structured and systematic in order to enable a valid assessment. DO IT! Create knowledge! Because we should describe and retain the knowledge we create during the innovation and evaluation process. We use this to adjust and steer the way – and to come to a conclusion on our evaluation: does the innovation create value? USE IT! Use the knowledge. We need knowledge from the evaluation of the innovation for further work. How can this knowledge be diffused, used for further learning and reported to relevant users? I wouldn’t have guessed in advance that the clarify phase would be so important to practitioners. This is really where there are still plenty of problems: if you don’t have a clear idea in your organization of what you’re working on, you will never succeed in doing a good evaluation or in getting people to use the knowledge from the evaluations.
  • #23: This is just an example of one of the 10 dialogue tools that have become part of the hands-on guide. Aside from the nice print version, they can be downloaded from our website in PPT or PDF. It’s A3 in size, so you can sit your project group down and have a joint discussion around the questions and thus establish a common ground for the evaluation and innovation process. The guidebook that frames the 10 dialogue tools consists of: an introduction (why evaluate innovation?) providing arguments and definitions; the requirements for your evaluation – evidence, tendencies and experiences, with a focus on structure; important terms: validity, control groups and baseline; and an overview of 8 qualitative and quantitative methods that you might choose for your evaluation: observation, interview, service journeys, RCT, register (count things!), vox pop, survey/questionnaire and register-based analysis. All are described with a focus on pros/cons, how to collect/analyse data, what kind of information the method will give me and what tools/requirements I need. This is not targeted at larger-scale program evaluations, and I don’t expect people to be able to do RCTs based on this. But for the innovation projects run internally in public sector workplaces, this should help provide structure and a basis for doing better evaluations.
  • #24: As with our other work at COI, we put a lot of effort into delivering our products in nice packages. The design and quality make them into artefacts that legitimize innovation work at the public sector workplaces. And when we put effort into making tools and materials that are not only built on a solid foundation of knowledge but also look nice, this encourages people to also put in the effort when they work with this.
  • #25: So where did we and do we go from this point?
  • #26: We launched and distributed the publications. Our materials are free for Danish public sector employees. We ordered 1,000 prints of the guidance booklet written by the professor – and we are pretty much out of those now; it can only be downloaded from our website. We ordered 1,500 prints of the hands-on publication with the dialogue tools – we’ve distributed 600 of those, so for the hard-working innovative evaluators there are still nice supplies to get from us. I’ve done some introductory workshops introducing the evaluation kit and one training session, where I spent three days over half a year taking a unit through working with the kit on specific evaluations they were doing. We haven’t evaluated the kit – and I honestly don’t know the effect it has had on the community. I don’t expect it alone to push the numbers higher than the 44% evaluated, but I do hope over a longer time to push the field in the right direction. I’ve co-written an article on evaluating innovation – this model is from that article – where we take a stab at describing the role of the evaluator. I’ve done a learning session at the major Danish evaluation conference where we focused on how to further integrate spreading and evaluating innovation: what actions can you take in your evaluation if you want to make it easier for yourself to spread the innovation? And this year I’m doing a session at the conference on organizational evaluation capacity, working with a municipality where the local politicians have asked for evaluations of innovations in order for them to make better decisions.
  • #27: This is still a field that lacks attention. I still see innovators giving up. They get overwhelmed by the complexity of evaluating and choose to do nothing. And they still don’t get started early enough. And neither political nor administrative leaders ask for the right kind of evaluation or give it continuous attention. In the intersection between the fields of evaluation and innovation we are still lacking playmates.
  • #28: That was it – thank you for showing an interest in our work. I hope it made some sort of sense to you and I’m happy to answer any questions you might have.