Benchmarking Experiments for
               Criticality Safety and Reactor
                Physics Applications – II –
                           Tutorial

                     John D. Bess and J. Blair Briggs – INL
                          Ian Hill (IDAT) – OECD/NEA
www.inl.gov




                                  2012 ANS Annual Meeting
                                       Chicago, Illinois
                                      June 24-28, 2012


              This paper was prepared at Idaho National Laboratory for the U.S. Department of
                          Energy under Contract Number (DE-AC07-05ID14517)
Outline
I.   Introduction to Benchmarking
     a.   Overview
     b.   ICSBEP/IRPhEP
II. Benchmark Experiment Availability
     a.   DICE Demonstration
     b.   IDAT Demonstration
III. Dissection of a Benchmark Report
     a.   Experimental Data
     b.   Experiment Evaluation
     c.   Benchmark Model
     d.   Sample Calculations
     e.   Benchmark Measurements
IV. Benchmark Participation


                                        2
BENCHMARK PARTICIPATION


                          3
Getting Started
• Criticality Safety Analyses
   – Most of the work has already been performed
   – Typically only uncertainty and bias calculations remain
     (a minimal sketch of such a bias calculation follows this slide)
• Reactor Operations
   – Occur on a daily basis worldwide
   – Require reorganization and evaluation of the activities performed
• Student and Young Generation Involvement
   – Provide education opportunities prior to incorporation into the workforce
   – Expand upon training at facilities/workplace
      • Especially when handbook data are already widely utilized
      • Computational analysis is rapidly being incorporated in the nuclear community
• Just for Fun ☺


                                                                         4
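For a criticality safety analyst, the bias work referenced above typically amounts to comparing calculated keff values against the benchmark-model keff values published in the handbook. The sketch below shows the shape of that calculation only; the case identifiers, calculated keff values, and uncertainties are illustrative placeholders rather than data from any evaluation, and a production validation would follow the applicable standards and site procedures.

```python
# Minimal sketch of a calculational-bias estimate from a suite of benchmarks.
# All numbers below are illustrative placeholders, not values from the
# ICSBEP Handbook; a real analysis uses the published benchmark-model keff
# and uncertainty for each selected case.
import math

# (case identifier, calculated keff, benchmark keff, benchmark 1-sigma)
cases = [
    ("HEU-MET-FAST-001", 0.99972, 1.0000, 0.0010),
    ("HEU-MET-FAST-002", 1.00031, 1.0000, 0.0030),
    ("PU-MET-FAST-001",  0.99985, 1.0000, 0.0020),
]

# C/E ratio (calculated over benchmark) for each case
ce = [calc / bench for _, calc, bench, _ in cases]

n = len(ce)
mean_ce = sum(ce) / n
bias = mean_ce - 1.0                    # negative value -> code under-predicts keff
std_ce = math.sqrt(sum((x - mean_ce) ** 2 for x in ce) / (n - 1))

print(f"mean C/E        = {mean_ce:.5f}")
print(f"bias            = {bias:+.5f}")
print(f"C/E sample s.d. = {std_ce:.5f}")
```

An actual upper subcritical limit would also fold in the benchmark uncertainties and a statistical tolerance factor; the point here is only the overall workflow.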
Getting Started (Continued)
• ICSBEP Handbook
  – Uncertainty Guide
  – Critical/Subcritical Guide
  – Alarm/Shielding Guide
  – Fundamental Physics Guide
  – DICE User’s Manual
  – Reviewer Checklist
• IRPhEP Handbook
  – Evaluation Guide
  – Uncertainty Guide
• Look at available benchmark data
• Talk with current and past participants in the benchmark programs
• What are you interested in benchmarking?
• What if you can’t do a complete benchmark?
                                                            5
Past and Present Student Involvement
• Since 1995, ~30 students
  have participated in the
  ICSBEP and/or IRPhEP
   – 14 directly at INL
• Students have authored
  or coauthored over 60
  benchmark evaluations
   – 28 & 2+ in progress at INL
• They have also
  submitted technical
  papers to various
  conferences and journals


                                       6
Opportunities for Involvement
• Each year the ICSBEP hosts up to two Summer Interns at the Idaho National Laboratory
• INL has also funded benchmark development via the
  CSNR Next Degree Program
• Students have participated in the projects as
  subcontractors through various universities and
  laboratories
• Benchmark development represents excellent work for collaborative Senior Design Projects,
  Master of Engineering projects, or Master of Science thesis topics
• Further information can be obtained by contacting the Chairman of the ICSBEP
   – J. Blair Briggs, J.Briggs@inl.gov

                                                          7
Opportunities for Student Involvement – I
• Internships
   – Traditional approach
   – 10 weeks
   – Benchmark review process completed on “free-time” during university studies
   – Encouraged to publish
• Augmented Education
   – Rigorous structure for Master’s Thesis
   – Mentor and peer-review network
   – Undergraduate thesis
   – Undergraduate team projects
   – Encouraged to publish




                                                          8
Opportunities for Student Involvement – II
• Center for Space Nuclear Research (CSNR)
  – Next Degree Program
     • Work as a part- or full-time subcontractor while completing a
       graduate degree
  – Via local or remote university participation
  – Opportunity for interaction with students participating in
    space nuclear activities
     • Other Next Degree students
     • Summer fellow students (interns)
  – Collaboration opportunities on other projects




                                                                       9
Opportunities for Student Involvement – III
• Nuclear and Criticality Safety Engineers
  – Pilot program collaboration
     • Battelle Energy Alliance (BEA)
     • U.S. Department of Energy – Idaho (DOE-ID)
  – Idaho State University and University of Idaho
  – Graduate Nuclear Engineering Curriculum
  – Part-time employment (full-time summer)
  – Hands-on training, benchmarking, ANS meetings,
    thesis/dissertation work, shadow mentors, DOE and
    ANS standard development, etc.




                                                        10
Investigation Breeds Comprehension
• Benchmark procedures require investigation into
  – History and background
     • Purpose of experiment?
  – Experimental design and methods
  – Analytical capabilities and procedures
  – Experimental results
• Often experiments were performed with the intent to provide data for
  criticality safety assessments
  – Many are utilized to develop criticality safety standards


                                                       11
Culturing Good Engineering Judgment
• Often experimental information is incomplete or
  misleading
  – Contact original experimenters (if available)
  – Interact with professionals from the ICSBEP/IRPhEP
    community
  – Establish a personal network for the young professional
    engineer

   "Do, or do not.
       There is no 'try.'"
            - Jedi Master Yoda


                                                              12
Developing an Analytical Skill and Tool Set
• Evaluators develop analytical and computational
  capabilities throughout the evaluation process
  – Utility of conventional computational codes and
    neutron cross section data libraries
     • Monte Carlo or Diffusion methods
     • MCNP and KENO are the most common in the US
  – Application of perturbation theory and statistical analyses
     • Uncertainty evaluation (a minimal sketch follows this slide)
     • Bias assessment
  – Technical report writing
  – Understanding acceptability of results


                                                         13
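Much of the uncertainty evaluation named above consists of perturbing one measured parameter at a time in the calculational model (a dimension, a density, an enrichment), recording the resulting change in keff, and combining the independent effects into a total benchmark uncertainty, most often in quadrature. The sketch below assumes independent components and uses illustrative magnitudes; it is not drawn from any particular evaluation.

```python
# Minimal sketch of combining independent 1-sigma keff effects in quadrature.
# Component names and magnitudes are illustrative, not from any evaluation;
# each delta-k would normally come from a pair of perturbed Monte Carlo runs.
import math

components = {
    "fuel enrichment":      0.00080,
    "fuel density":         0.00060,
    "clad thickness":       0.00020,
    "critical water level": 0.00040,
    "impurity content":     0.00015,
}

total = math.sqrt(sum(dk ** 2 for dk in components.values()))

for name, dk in sorted(components.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} +/- {dk:.5f}")
print(f"{'total (quadrature)':22s} +/- {total:.5f}")
```

When components are correlated or the keff response is strongly nonlinear, a simple quadrature sum is no longer adequate and a covariance-aware combination is needed instead.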
We Want You…
• Numerous reactors and test facilities worldwide can be benchmarked
   – Availability of data and expertise for various systems dwindles with time
• Visit the websites
   – http://icsbep.inl.gov/
   – http://irphep.inl.gov/
• Contact us
   – icsbep@inl.gov
   – irphep@inl.gov
   – jim.gulliford@oecd.org
http://www.oecd-nea.org/science/wpncs/icsbep/
http://www.oecd-nea.org/science/wprs/irphe/
Reactors Can Be Used for More Than Making Heat




                                                 15
WHAT BENCHMARK DATA ARE
YOU MISSING?

                          16
Example Experiments (Not Currently Benchmarked)
• Fast Test Reactors
  – ZPR-3, ZPR-6, ZPR-9, and ZPPR Assemblies
  – EBR-II
• Gas Cooled Reactors
  – GA HTGR
  – Fort St. Vrain
  – Peach Bottom
  – PNWL HTLTR
  – ROVER/NERVA Program
  – Project Pluto
  – DRAGON
• Small Modular Reactors
  – SCCA-004
  – SORA Critical Experiment Assembly
  – CLEMENTINE
  – GE-710 Tungsten Nuclear Rocket
  – HNPF
  – PBR-CX
  – SNAP & NASA Activities
  – N.S. Savannah
  – B&W Spectral Shift Experiments

                                                        17
Example Experiments (Continued)
• Research Reactors
  – More TRIGAs
  – AGN
  – PULSTAR
  – Argonaut
  – Pool-type such as MSTR
  – MNSR
  – HFIR
  – ACRR
  – Fast Burst Reactors
  – Kodak CFX
• Experiments at ORCEF
  – Mihalczo HEU Sphere
  – USS Sandwich
  – Moderated Jemima
  – Numerous HEU Cases
  – SNAP Water Immersion
  – "Libby" Johnson Cases
• And many, many more
  – Similar experiments validate the benchmark process and further improve nuclear data
  – See the OECD/NEA IRPhEP website for reports with data for benchmarking
                                                              18
General Benchmark Evaluation/Review Process
• ICSBEP and IRPhEP benchmarks are subject to extensive review
    – Evaluator(s) – primary assessment of the benchmark
    – Internal Reviewer(s) – in-house verification of the analysis and
      adherence to procedure
    – Independent Reviewer(s) – external (often foreign) verification of the
      analysis via international experts
    – Technical Review Workgroup Meeting – annual international effort to
      review all benchmarks prior to inclusion in the handbook
        • Sometimes a subgroup is assigned to assess any final workgroup
           comments and revisions prior to publication
    – Benchmarks are determined to be Acceptable or Unacceptable for
      use depending on uncertainty in results
        • All Approved benchmarks are retained in the handbook
        • Unacceptable data are published for documentation purposes, but
           benchmark specifications are not provided

                                                                               19
Quality Assurance
Each experiment evaluation included in the Handbook
undergoes a thorough internal review by the evaluator's
organization. Internal reviewers are expected to verify:
1. The accuracy of the descriptive information given in the
   evaluation by comparison with original documentation
   (published and unpublished).
2. That the benchmark specification can be derived from
   the descriptive information given in the evaluation
3. The completeness of the benchmark specification
4. The results and conclusions
5. Adherence to format.


                                                              20
Quality Assurance (continued)
In addition, each evaluation undergoes an independent
peer review by another Technical Review Group member
at a different facility. Starting with the evaluator's submittal
in the appropriate format, independent peer reviewers are
expected to verify:
 1. That the benchmark specification can be derived from
    the descriptive information given in the evaluation
 2. The completeness of the benchmark specification
 3. The results and conclusions
 4. Adherence to format.



                                                                   21
Quality Assurance (continued)

A third review by the Technical Review Group verifies that
the benchmark specifications and the conclusions were
adequately supported.




                                                             22
Annual Technical Review Meeting
• Typically 30 international participants
• Usually OECD/NEA Headquarters in Paris, France
• The technical review group convenes to discuss each benchmark,
  page by page, prior to approval for entry into the handbook
• A list of action items is recorded; these must be resolved prior
  to publication
  – Final approval given by Internal and Independent
    Reviewers and, if necessary, a Subgroup of additional
    reviewers
• Final benchmark report sent to INL for publication

                                                            23
What to Expect at Tech Review Group Meetings




                                               24
Developing International Camaraderie




                                       25
Annual Important Dates (Guidelines)
Activity                              ICSBEP                      IRPhEP
Benchmark Evaluation                  Year-round                  Year-round
Internal Review                       Year-round                  Year-round
Independent Review                    February – March            August – September
Technical Review Group Distribution   Late March – Early April    Late September – Early October
Technical Review Meeting              Early May                   Late October
Resolution of Action Items            May – July                  November – January
Submission of Final Report            Before Late July            Before Late January
Handbook Preparation                  June – September            November – March
Handbook Publication                  End September               End March

                  Reviews can be coordinated throughout the year.
                  These are the guidelines for annual publications.
                                                                                 26
Questions?




             SCCA-002




                        27
