Validation and Design in a Small Team Environment:
The Development of the High Performance Embedded Computing Process at Northrop Grumman

                  Matthew Clark, Ph.D.
                  February 19th, 2009
Motivation


Validation is:
• Complex
• Long
• Requires expensive tools
• Requires experienced employees.

It is both an Art and Science.

What if your project is:
• Complex
• Lasts 3-6 months,
• Has a limited budget,
    – And new grads.
• Is done in a group with 30 years of SW experience, and you represent the sum total of HW development experience.
First Question

What makes verification so expensive and time consuming?
Functional Verification Issues:
• Code with Branches
• Multiple Clock Rates
• Packet and Clock Rate adaptation
• Configurations
• SAR
• Encapsulation

Costly verification activities these drive (the green arrows on the original slide):
• Coverage Analysis
• Assertions
• 1 TC per config + Random
• Corner Case Analysis
• Special SW modeling (e/Vera)

Identifying and reducing the green arrows is the Art.
Problem Statement


How can you reduce the number of green arrows so that designs can be
completed quickly and reliably, without the “normal” 2.5:1
Verification : Designer ratio?

A: Do what Capt. Kirk did: change the problem.
We can’t change the verification restrictions, but perhaps we can change
the design so that we don’t have some of those verification problems… by design.
Problem Space


• Offload CPU-intensive portions of numerical algorithms.
• Rapid deployment. Standard project is 3-6 months.
• Must be modular:
    – Single module stand-alone.
    – Combined into algorithm.
    – Multiple algorithms in single container.

• Exact duplicates of existing SW applications.
    – The goal is to reduce execution time or improve performance (e.g., replace a 64K-pt
      FFT with a 512K-pt FFT).
    – To SW, there must be no difference between a CPU, FPGA, FPGA+DSP, or DSP
      implementation.
Design and Verification Constraints


• Single Clock Rate for every block.
• No Packet and Clock Rate Adaptation
    – Force it to occur at a single ingress/egress point (a structural sketch follows this list).

• Heavily branched code
    – Algorithms (fortunately) tend to be datapath-centric.

• Configurations
    – Separate algorithm-implementation verification from IO verification.
    – Build a bit/cycle-accurate model of the algorithm; use IO controls to vary data
      tx/rx.
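
To make the single-adaptation-point constraint concrete, below is a minimal structural sketch in VHDL. It assumes a Coregen-style dual-clock FIFO as the only clock-domain crossing; every entity, port, and signal name here is hypothetical and is not taken from the actual design.

-- Hypothetical sketch only: one dual-clock FIFO is the single ingress
-- adaptation point; everything behind it runs on the one functional clock.
-- The component stands in for a generated (Coregen) async FIFO core.
library ieee;
use ieee.std_logic_1164.all;

entity container_ingress is
  port (
    io_clk    : in  std_logic;                         -- bridge/IO-side clock
    funct_clk : in  std_logic;                         -- single 210 MHz algorithm clock
    io_data   : in  std_logic_vector(63 downto 0);
    io_wr     : in  std_logic;
    io_full   : out std_logic;                         -- backpressure toward the bridge
    alg_data  : out std_logic_vector(63 downto 0);
    alg_rd    : in  std_logic;
    alg_empty : out std_logic
  );
end entity container_ingress;

architecture sketch of container_ingress is
  component async_fifo_64 is                           -- stand-in for a generated core
    port (
      wr_clk : in  std_logic; din  : in  std_logic_vector(63 downto 0);
      wr_en  : in  std_logic; full : out std_logic;
      rd_clk : in  std_logic; dout : out std_logic_vector(63 downto 0);
      rd_en  : in  std_logic; empty : out std_logic
    );
  end component;
begin
  -- The only place in the container where two clocks meet.
  u_ingress : async_fifo_64
    port map (
      wr_clk => io_clk,    din  => io_data,  wr_en => io_wr,  full  => io_full,
      rd_clk => funct_clk, dout => alg_data, rd_en => alg_rd, empty => alg_empty
    );
end architecture sketch;

Because the algorithm blocks never see io_clk, the multi-clock and rate-adaptation verification problems collapse onto this one FIFO.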
Design constraints reduce verification space

 Design Constraints
 •   All blocks use small input FIFOs and programmable full/empty indicators.
 •   All blocks execute at 210 MHz (the maximum register-access speed).
 •   The top-level register interface can be pipelined for timing closure.
 •   Coregen modules only (reference designs need not apply).
 •   Each block uses one or more async resets that are externally synchronized.
 •   Each block has a testbench that:
      – Uses golden Matlab code to create all stimulus/checker files.
            • Scenarios are generated from configuration files.
      –   Is nothing but interface BFMs comparing values against the files (a minimal VHDL checker sketch follows this list).
      –   Can slew input/output rates independently.
      –   Can randomly assert the full/empty flags on interface pins.
      –   Interrupt conditions may be checked by hand.
      –   Checks pass/fail conditions in VHDL at run time.
      –   Can be run as a regression.
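
As a rough illustration of the file-compare checking described above, here is a minimal VHDL sketch. The file name, the one-hex-word-per-line format, and the signal names are assumptions for illustration; the real block-level testbenches also include the FIFO BFMs, rate slewing, and random full/empty assertion listed above.

-- Hypothetical sketch of the run-time pass/fail check: the testbench simply
-- compares DUT output words against a Matlab-generated "golden" checker file.
library ieee;
use ieee.std_logic_1164.all;
use ieee.std_logic_textio.all;   -- hread() for hex vectors
use std.textio.all;

entity block_tb is
end entity block_tb;

architecture sketch of block_tb is
  signal clk      : std_logic := '0';
  signal dout     : std_logic_vector(63 downto 0) := (others => '0');
  signal dout_vld : std_logic := '0';
begin
  clk <= not clk after 2.38 ns;  -- roughly a 210 MHz functional clock

  -- (The DUT and the FIFO BFM driving it from a stimulus file would be
  --  instantiated here; only the checker is sketched.)

  checker : process
    file     exp_file : text open read_mode is "expected_out.txt";  -- golden checker file
    variable l        : line;
    variable exp_word : std_logic_vector(63 downto 0);
    variable errors   : natural := 0;
  begin
    while not endfile(exp_file) loop
      wait until rising_edge(clk) and dout_vld = '1';  -- compare only on valid beats
      readline(exp_file, l);
      hread(l, exp_word);                              -- one 64-bit hex word per line
      if dout /= exp_word then
        errors := errors + 1;
        report "Output mismatch against golden file" severity error;
      end if;
    end loop;
    assert errors = 0 report "TEST FAILED" severity failure;
    report "TEST PASSED" severity note;
    wait;
  end process checker;
end architecture sketch;

Because the golden files come from the same Matlab code that generates the stimulus, the pass/fail decision needs no human interpretation and the run drops straight into a regression.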
Design constraints reduce verification space

Functional Verification Issues
• Code with Branches
• Multiple Clock Rates: IO_clock and funct_clock only.
• Packet and Clock Rate adaptation: all IO is FIFO-based; the IO ring handles it.
• Configurations
• SAR: not an issue.
• Encapsulation: not an issue.
Results


• MegaCore functions (FFT, IFFT, Matched Filter, Correlation,
  Polyphase/WOLA), plus a shim for the PCI bridge.
• Designer tasks: architect, document (MAS), block test, synthesize.
    – Forward FFT: BSEE (1 mo. Verilog)
    – Inverse FFT: MSEE (6 mo. Verilog)
    – Packet Aggregation: intern (2 mo. VHDL)

• Verification:
    – MSEE (no HDL).
        • Created the Matlab-based testbench and stimulus/checker files.
        • Created stimulus files for block-level testbenches.
        • Reused FIFO BFMs from a previous project.

• Code + block test: 4 months
• Verification + synthesis: 2 months
Summary


Q: How can you reduce the number of green arrows so that designs can be completed
quickly and reliably, without the “normal” Verification : Designer ratio?


• Use standard (non-optimal) FIFO interfaces for every block.
• Separate algorithm verification from implementation verification.
• Repeatable test cases, at the cost of text-file storage.
• Design each module as if it is going into a reuse container (a minimal interface sketch follows this summary):
    –   Standard IF.
    –   Standard clock and reset.
    –   Standard IO and register interfaces.
    –   All data widths are 64-bit + Vld @ 210 MHz.

 Summary: Trade improved verification schedule and TTM for reduced performance
and complexity.
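
For illustration, here is a minimal sketch of what one such reuse-container block interface could look like in VHDL. Apart from the 64-bit + Vld @ 210 MHz convention stated above, every name and width is a hypothetical choice, not the actual design's.

-- Hypothetical sketch of the standard reuse-container block interface:
-- one clock, an externally synchronized async reset, 64-bit data + valid
-- streams with FIFO flags, and a common register interface.
library ieee;
use ieee.std_logic_1164.all;

entity std_algo_block is
  port (
    clk           : in  std_logic;                      -- single 210 MHz clock
    arst_n        : in  std_logic;                      -- async reset, synchronized externally
    -- standard streaming input (64-bit + Vld), backpressured by the block's small input FIFO
    din           : in  std_logic_vector(63 downto 0);
    din_vld       : in  std_logic;
    din_prog_full : out std_logic;
    -- standard streaming output
    dout          : out std_logic_vector(63 downto 0);
    dout_vld      : out std_logic;
    dout_afull    : in  std_logic;                      -- downstream FIFO almost full
    -- standard register interface (pipelined at the top level for timing closure)
    reg_addr      : in  std_logic_vector(15 downto 0);
    reg_wdata     : in  std_logic_vector(31 downto 0);
    reg_we        : in  std_logic;
    reg_rdata     : out std_logic_vector(31 downto 0)
  );
end entity std_algo_block;

Because every block presents the same face, a block verified stand-alone drops into an algorithm, and an algorithm into a multi-algorithm container, without re-verifying the interfaces.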
