Quality Maturity Server (QMS), Case Study
David Ling, Jeff Freeman, Sunil Maheshwari, Freescale Semiconductor Inc.
Austin, Texas, USA
Abstract
With increasing pressure to produce semiconductor Intellectual Property (IP) quickly for the System on Chip (SoC) marketplace, design teams with limited resources are resorting to higher levels of design reuse of IP owned by other teams. Collaborating in this manner to share IP across teams can be more efficient overall for a company, but it can also create problems because teams may have different design methodologies, tool environments, levels of expertise, etc. A system called Quality Maturity Server (QMS) was developed to organize, record, and maintain metrics on IP so that teams could see the quality and maturity criteria for each IP. QMS is a web-based database system shared across the entire company that replaced the previous method of maintaining manual, ad hoc design checklists on individual computers. It has greatly improved the efficiency of collaboration across teams and has specifically helped to improve the overall quality of IP blocks and thus of SoCs.
Introduction
The need for improved management of semiconductor Intellectual Property (IP) quality in System on Chips (SoCs) increases with chip complexity, limited resources, and ever more aggressive schedules. Typically, in complex SoCs, not all the IP needed to build the complete chip are available from a single design team. IP are either reused intact or leveraged from prior IP work. It is common for SoCs to require a few hundred IP blocks, many of them from other design teams or even external IP providers. The resulting quality of the SoC is heavily dependent on the quality of the individual IP. There may be multiple variants of an IP that provide similar functionality but with differing levels of quality. It can become a maintenance headache for the SoC team to determine which version of each IP to select and integrate into their chip.
Maturity
Each IP is itself a project that must be managed and tracked through its lifecycle. The maturity of an IP can be indicated by its lifecycle, which describes where along the timescale of its development a particular design resides. For example, an IP that is newly designed from scratch and has never been validated on silicon in a SoC would be considered very immature. An IP that has been manufactured on silicon in multiple SoCs and proven to meet all stated requirements through qualification in multiple customer applications is considered very mature.
An analogy can be made between an IP’s maturity and the grade level of a student in a school system. A PhD student at a university is considered much more mature than a 1st grade student in elementary school, and there is more trust in the knowledge a PhD student has than in that of a 1st grade student. Similarly, a SoC integrator has much more trust in an IP block that has been fully validated in other SoCs than in an IP block that has never been used in any SoC.
Figure 1 shows an example lifecycle that indicates the maturity of an IP as it progresses through its development cycle into production. The major phases include Definition, Development, and Manufacturing. The detailed phases include Concept, Feasibility, Planning, Execution, Certification, Production, and End of Life.
Figure 1. Lifecycle Phases
The later an IP is in its lifecycle, the greater its maturity. A risk assessment of a SoC would conclude that the IP with the lowest maturity has the highest likelihood of failures or problems. Maturity is tracked within QMS through the lifecycle of Deliverables for an IP. A Deliverable is simply some work output from one user or team that is given to another user or team as its input.
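As a minimal illustrative sketch of this idea (in Python, with hypothetical names; QMS itself is a web and database system whose schema this paper does not show), maturity can be modeled as an IP’s position along the ordered phases of Figure 1:

```python
# Hypothetical sketch, not the QMS schema: maturity modeled as an IP's
# position along the ordered lifecycle phases of Figure 1.
LIFECYCLE_PHASES = [
    "Concept", "Feasibility", "Planning", "Execution",
    "Certification", "Production", "End of Life",
]

def maturity_index(current_phase: str) -> int:
    """Return how far along the lifecycle an IP is (0 = least mature)."""
    return LIFECYCLE_PHASES.index(current_phase)

# A brand-new IP is far less mature than one proven in production.
assert maturity_index("Concept") < maturity_index("Production")
```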
Quality
Related to maturity is the quality of the SoC and its contained IP. Quality is a measure of how well something meets stated requirements. These requirements may be stated in the form of performance, reliability, stability, repeatability, and other parameters.
An analogy for an IP’s quality may be made to a student’s earned grade or score in a school system. A student who performs well may be given a score of “A” or 100% for meeting all requirements on an exam. A student performing at a lower level may earn a lower score such as “B”, “C”, or “D” (90%, 80%, 70%, and so on). Independent of the grade level, the student earns a score that represents their ability to meet the requirements of an exam or test. This score can then be used to predict the student’s knowledge or aptitude to perform related work at that grade level. Note that the student’s score is not an absolute measure of accomplishment but is relative to their grade level in school.
In a similar manner, an IP block can be evaluated against a list of requirements or tests and given a score to indicate compliance. A higher score indicates greater compliance and ability to meet some criteria.
Figure 2 shows one implementation of measuring IP quality using QMS. The higher the score, the greater the compliance with stated requirements and thus the higher the confidence in quality. 100% means complete compliance with all criteria.
Figure 2. Sample IP Quality Score in QMS
Quality meets Maturity
Maturity alone does not give enough detail about the quality of a particular IP design. For example, the IP may have been tested and have all views, but may not have a robust specification or design. Maturity indicates the completeness of the IP, which can be represented as a workflow composed of a list of deliverables. Quality indicates the excellence of the IP, which can be represented with checklists. Both the workflow and checklist methods of input can be used to calculate maturity and quality scores.
The combination of quality and maturity together present the best picture of an IP’s suitability for usage in a SoC. In the student analogy, a university student will have high maturity and could have a “C” average score indicating low quality, but still will not be the most accomplished student. On the other hand, a 1st grade student will have low maturity and could have an “A” average score indicating high quality, but still will not be the most accomplished student. Ideally, the most accomplished student would be a PhD student with high maturity and an “A” average score indicating high quality.
For IP blocks, the best ones to use would have both high maturity and high quality. If either maturity or quality is low, then there is risk in using the IP for a particular design.
Figure 3 shows a quadrant diagram illustrating the relationship of Maturity to Quality on the predicted outcome of an IP. If given a choice, a SoC team would choose to have all of its IP in the quadrant with both high Maturity and high Quality.
Figure 3. Quadrant Diagram of Quality & Maturity
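As a minimal sketch of how the Figure 3 quadrants could be read programmatically, assuming normalized 0.0 to 1.0 scores and an illustrative 0.5 threshold (neither of which is specified by QMS):

```python
# Hypothetical sketch of the Figure 3 quadrants; the 0.0-1.0 scale and
# the 0.5 threshold are illustrative assumptions, not QMS values.
def risk_quadrant(maturity: float, quality: float) -> str:
    """Map normalized maturity/quality scores to a quadrant label."""
    high_m = maturity >= 0.5
    high_q = quality >= 0.5
    if high_m and high_q:
        return "low risk: proven and compliant"
    if high_m:
        return "risk: mature but low quality"
    if high_q:
        return "risk: high quality but unproven"
    return "high risk: immature and low quality"

print(risk_quadrant(0.9, 0.95))  # -> "low risk: proven and compliant"
```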
Design Reality
In practical terms, IP are being developed at the same time as SoCs. This means that a typical SoC may be a new product that contains a mixture of IP blocks that have varying maturity and quality. Properly assessing each IP for its risks will have a large impact on the overall success of the SoC.
In many design teams, IP maturity is not well tracked; it is determined ad hoc by asking other teams what other products an IP might be used in, along with finding out whether problems existed with the IP’s use. This is hit or miss and not an effective method.
Similarly, IP quality might be determined by interviewing the original design team of a particular IP to learn what criteria the IP was measured against, along with the methodologies, verification strategies, etc. that were used to verify the IP. This is also error-prone.
It is common for many teams to use checklists of criteria on an Excel spreadsheet to measure compliance. This can cause problems because the spreadsheets are not easily searched, and the criteria can vary from team to team or even from individual to individual.
Figure 4 shows a legacy sample checklist in the form of a spreadsheet with criteria that a designer might answer to record compliance to design criteria. This legacy method created problems because locally maintained spreadsheets were difficult to track, find, and communicate across users and teams.
Figure 4. Typical Spreadsheet Questions
Goals
The problems of the legacy spreadsheet method led to a company-wide search for a better solution. Goals for implementing a solution included:
- Assess IP and intelligently select IP during project planning
- Understand the quality and maturity of IP via real-time scoring
- Correlate predicted Quality and Maturity with product metrics (fewer silicon and document defects)
- Use metrics to show trends
- Plan for maturity at a future date
- Transition organizations from being “Reactive” to being “Closed-Loop and Preventative” (Defect -> Effect -> Root Cause -> Process Improvement -> Checklists/Workflows)
- Collect feedback from users to improve checklists and workflows
- Document processes to meet ISO9000/TS16949 certifications and customer audits
- Provide real-time visibility of the status of any IP for Quality and Maturity across all teams
Research into the work done by the VSI Alliance [1] and the SPIRIT Consortium [2] served as the foundation for a company-wide solution to address IP Quality and Maturity. The Reuse Methodology Manual [3], commonly known in the industry as RMM, also provided much background information on the methods used to implement design reuse in SoCs.
Solution
This paper describes the solution implemented to achieve the stated IP maturity and quality goals. First, all design collateral such as RTL code, scripts, and other files that comprise an IP is stored in a Configuration Management (CM) system. This allows all the files needed to produce a given version of an IP to be represented by a simple tag. DesignSync, ClearCase, and CVS are examples of CM systems. The tag is simply a reference that can be communicated between teams to retrieve the required files for a particular design.
Second, a database catalog was deployed to inventory and track IP metadata, version information, and the IP blocks needed to construct a SoC via a Bill of Materials (BOM). This BOM allows each SoC team to know exactly which IP, and which versions of each, are in a particular SoC. The metadata can include dozens of pieces of information such as the owner, support contact, integration notes, description, performance parameters, licensing, etc.
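As a minimal sketch of what such a catalog entry and BOM might look like, with hypothetical field names drawn from the metadata listed above (the actual catalog schema is not described in this paper):

```python
# Hypothetical sketch of a catalog entry and SoC BOM; field names are
# assumptions based on the metadata the paper lists, not the real schema.
from dataclasses import dataclass, field

@dataclass
class IPEntry:
    name: str             # e.g. a hypothetical "usb2_phy"
    version: str          # CM tag identifying the exact file set
    owner: str
    support_contact: str
    metadata: dict = field(default_factory=dict)  # description, licensing, ...

@dataclass
class SoCBOM:
    soc_name: str
    ip_blocks: list = field(default_factory=list)

    def versions(self) -> dict:
        """Exactly which IP, at which versions, are in this SoC."""
        return {ip.name: ip.version for ip in self.ip_blocks}
```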
Third, another database system was installed to collect and manage the design criteria used to measure IP Quality and Maturity. This database is called the Quality Maturity System (QMS) and is the focus of this paper. It is integrated with the two other systems to provide a comprehensive view of the Quality and Maturity of all IP used in a SoC.
Figure 5 shows the relationship of these three systems in providing comprehensive management of versions, metadata, Quality, and Maturity. All of these parameters are inter-related and affect the final product manufactured as a SoC.
Figure 5. Quality Maturity Architecture
QMS (Quality Maturity Server)
QMS uses a web-based front end for the user interface and stores all its data in a database on the backend. Templates are created that store the criteria, which users then access to create individual records. The templates can be updated in a continuous improvement process independently from the records. The individual Checklist or Workflow records are tied to metadata about each IP through an API (Application Programming Interface) to a Catalog server. This Catalog server can then provide summary-level QMS data such as the Workflow Stage, Workflow Maturity, Checklist % Answered, and Checklist Score.
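As a minimal sketch of this template/record split, with hypothetical names, the key idea is that each record pins the template version it was answered against, so templates can evolve independently:

```python
# Hypothetical sketch, not the QMS schema: records pin the template
# version they answered, so templates evolve independently of records.
from dataclasses import dataclass

@dataclass(frozen=True)
class ChecklistTemplate:
    template_id: str
    version: int                  # bumped on every continuous-improvement edit
    questions: tuple

@dataclass
class ChecklistRecord:
    ip_name: str
    template_id: str
    template_version: int         # pins the criteria this record answered
    answers: dict

v1 = ChecklistTemplate("rtl_quality", 1, ("Lint clean?", "CDC checked?"))
rec = ChecklistRecord("usb2_phy", v1.template_id, v1.version,
                      {"Lint clean?": True, "CDC checked?": False})
```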
Some of the features of QMS include:
- Template and Record Versioning allows updates to data while storing all historical changes. Any data change is recorded as a new version in QMS so that there is complete traceability for reporting purposes.
- Template Filtering automatically narrows down the number of applicable templates through the VC Type (Virtual Component) and Functional Category metadata of the IP. For example, this allows a digital IP to use only applicable digital checklists and not see unrelated analog or SoC-related checklists (see the sketch after this list).
- A Feedback Mechanism allows users to send requests directly to administrators to enhance criteria or questions in templates. This facilitates continuous improvement of Checklist and Workflow templates so that all team members benefit.
- A Sign-Off Mechanism records digital approvals by different user roles so that there is traceability and accountability for advancing a project from stage to stage.
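As referenced above, a minimal sketch of Template Filtering, with hypothetical template structures and metadata values; the matching logic is an illustration, not the QMS implementation:

```python
# Hypothetical sketch of template filtering by IP metadata; the VC Type
# and Functional Category values shown are illustrative assumptions.
def applicable_templates(templates, vc_type, functional_category):
    """Narrow the template list to those matching the IP's metadata."""
    return [t for t in templates
            if vc_type in t["vc_types"]
            and functional_category in t["categories"]]

templates = [
    {"name": "digital_rtl_checklist", "vc_types": {"digital"},
     "categories": {"peripheral", "processor"}},
    {"name": "analog_layout_checklist", "vc_types": {"analog"},
     "categories": {"pll", "adc"}},
]
# A digital peripheral IP sees only the digital checklist.
print(applicable_templates(templates, "digital", "peripheral"))
```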
Automatic Calculations allow users to answer or input their data and see real-time summaries of their status. Examples of calculations include a record’s Score, Level, %Answered, and %Passing.
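As a minimal sketch of how such summaries might be tabulated, assuming straightforward definitions for %Answered, %Passing, and Score (the exact QMS formulas and the Level calculation are not detailed in this paper):

```python
# Hypothetical sketch of the real-time record summaries; the formulas
# are simple assumptions, not QMS's exact definitions.
def record_summary(answers):
    """answers: list of (answered: bool, passed: bool), one per question."""
    total = len(answers)
    answered = sum(1 for a, _ in answers if a)
    passing = sum(1 for a, p in answers if a and p)
    return {
        "%Answered": 100.0 * answered / total if total else 0.0,
        "%Passing": 100.0 * passing / answered if answered else 0.0,
        "Score": 100.0 * passing / total if total else 0.0,
    }

print(record_summary([(True, True), (True, False), (False, False)]))
# -> %Answered ~66.7, %Passing 50.0, Score ~33.3
```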
Workflows
A Workflow is defined as a list of deliverables needed to progress a design from the output of one step to the input of the next. Deliverables are typically passed from one user or team to the next in the IP’s development flow. The Workflow is a deliverable-tracking solution for designers and other disciplines that represents the stages of the project lifecycle, and it contains deliverables for each stage. User accountability is achieved through approval sign-offs at each stage.
Checklists
A Checklist is a list of criteria in the form of questions that users answer about the quality of deliverables being completed. A Checklist:
- Controls or qualifies the Deliverables within the Workflow
- Reminds users of the required steps for creating Deliverables
- Stores up-to-date information from project post-mortems or “Lessons Learned” so that mistakes are not repeated
Workflow and Checklist Relationship
Checklists describe a stage within a Workflow. There may be one or more Checklists per Workflow stage. The Workflow stage indicates the Maturity of an IP while the Checklist(s) indicate the Quality of the Deliverables for the particular Workflow stage.
Deliverable Checklists must meet minimal passing criteria before development may continue to the next Deliverable. All prerequisite Deliverable Checklists must be completed before advancing to the next Maturity stage.
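As a minimal sketch of this gating rule, assuming an illustrative 90% passing threshold (the actual passing criteria are configured per checklist and are not specified here):

```python
# Hypothetical sketch of stage gating: every prerequisite checklist must
# meet its passing threshold before the Workflow stage may advance.
# The 90% threshold is an illustrative assumption.
PASSING_THRESHOLD = 90.0

def can_advance(stage_checklist_scores):
    """True only if all checklists for the current stage pass."""
    return all(score >= PASSING_THRESHOLD
               for score in stage_checklist_scores)

print(can_advance([95.0, 100.0]))  # True: advance to the next stage
print(can_advance([95.0, 72.0]))   # False: one checklist blocks advancement
```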
Figure 6 shows a stair step representation of this concept. Checklists may be viewed as the minor building blocks (shown in yellow) that are stacked at each Workflow stage. The Checklists must be completed in succession to reach the next Workflow stage. Each Workflow stage may be viewed as major stair steps that indicate Maturity. When an IP has reached the top stair step, then it has reached full Maturity.
Figure 6. Workflow to Checklist – Stair Step Relationship
Conversion of Spreadsheets to Checklists
A significant effort was expended to convert legacy, manually maintained design checklists from spreadsheet format into an equivalent database format. Each checklist contained criteria that were converted into questions answered by engineers as different phases of the IP’s lifecycle were completed. The answers were recorded in the database, and the results could be manipulated to show summaries.
This conversion from individually stored checklists into a database dramatically increased the visibility of results to all design teams within the company and held engineers accountable for their work. SoC teams no longer needed to manually inquire with each IP’s design team to find out Quality and Maturity. This information was now available in real time, not only for each IP but also in aggregate for an entire SoC. What used to take weeks of manual work could now be viewed in seconds for the hundreds of IP used to build a SoC.
Checklists were now in the form of question templates that could be loaded into the database. The engineer completing a particular milestone would then record their answers to a question template in the database. QMS would then tabulate scores for individual IP as well as for all IP within a SoC BOM.
In the most simplistic form, a criterion would be written as a question with only a Yes or No answer. More complex criteria could use a numerical answer, such as 0 to 100, with weighted scoring. A question could be worded as “What was your completed code coverage percentage?” This could also be converted to a simpler criterion such as “Did you achieve a code coverage completion greater than 99%?” To simplify scoring and maximize adoption, many teams chose to convert their criteria to Yes/No questions.
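As a minimal sketch contrasting the two question styles, using the code-coverage example above; the weights and scoring functions are illustrative assumptions:

```python
# Hypothetical sketch of the two question styles the paper describes:
# weighted numeric answers and simple Yes/No criteria. The weight value
# is an assumption; the 99% threshold follows the text's example.
def numeric_question_score(answer: float, weight: float) -> float:
    """e.g. 'What was your completed code coverage percentage?'"""
    return weight * answer / 100.0

def yes_no_question_score(answer: bool, weight: float) -> float:
    """e.g. 'Did you achieve code coverage greater than 99%?'"""
    return weight if answer else 0.0

coverage = 99.4
print(numeric_question_score(coverage, weight=10.0))        # 9.94
print(yes_no_question_score(coverage > 99.0, weight=10.0))  # 10.0
```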
Figure 7 shows a portion of a typical checklist. There would be several checklists for different milestone deliverables along the lifecycle of the IP. This allows one engineer to deliver a particular intermediate design and communicate to another engineer that certain criteria in the checklist have been met for each respective deliverable.
Figure 7. QMS Sample Checklist
Challenges
One of the biggest hurdles to overcome was not the infrastructure technology, but the change management of human behavior. Training engineers to do yet another task using a different tool in their already busy schedules was justified by demonstrating the power and immediacy of IP Quality and Maturity data at the SoC level. Management support in driving all design teams to adopt the new methodology helped ensure the widespread adoption needed so that all IP for each SoC used QMS to track and score Quality and Maturity.
Implementing the QMS solution involved balancing the overhead of data entry against the system benefits of visibility, tracking, and improved Quality and Maturity. For experienced engineers, a lightweight checklist is desired for minimal overhead while still providing helpful reminders to complete all tasks. For less experienced engineers, a detailed set of checklists is needed to provide specific instructions so that all criteria are completed correctly.
Results
The aggregation of individual IP checklist scores into a SoC view provided an extremely powerful picture of the Quality and Maturity of a SoC. The integrated database allowed any engineer to query and view real-time results without having to manually interview engineers to determine status.
Not only could a SoC integrator see the scores of each IP, but they could also dig deeper and see each individual criterion that was answered. This provided much more transparency across teams and helped SoC teams more accurately assess their overall Quality and Maturity.
Figure 8 shows a representative SoC BOM and the associated QMS scores for its IP. Within this single view, a SoC team can check the Quality and Maturity of all its IP blocks in real time.
Figure 8. Sample SoC BOM w/QMS Scores
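As a minimal sketch of the kind of SoC-level rollup behind such a view, with hypothetical IP names and summary fields:

```python
# Hypothetical sketch of the SoC-level rollup behind Figure 8: per-IP
# QMS summaries aggregated across the BOM. Names are illustrative.
def soc_rollup(bom_scores):
    """bom_scores maps IP name -> its QMS summary for the whole BOM."""
    n = len(bom_scores)
    return {
        "ip_count": n,
        "avg_score": sum(s["Score"] for s in bom_scores.values()) / n,
        "lowest_ip": min(bom_scores, key=lambda ip: bom_scores[ip]["Score"]),
    }

bom = {"usb2_phy": {"Score": 98.0},
       "ddr_ctrl": {"Score": 91.0},
       "adc12":    {"Score": 76.0}}
print(soc_rollup(bom))  # flags adc12 as the riskiest IP in this SoC
```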
At the time of this paper, there is limited quantitative data to show that QMS by itself directly improves IP quality. In a company focused on continuous improvement, numerous quality improvement initiatives take place simultaneously, which makes it difficult to correlate any one variable directly with overall quality. However, preliminary data clearly shows a decrease in the total number of defects submitted on IP. Subjectively, QMS is definitely catching mistakes that would otherwise escape and result in additional defects.
Figure 9 shows the total number of defects on a quarterly basis since implementing QMS for Group 1 (a different business unit than in the next figure). For confidentiality reasons, the data has been normalized to 1.0 to obscure absolute numbers. The relative change can still be seen with this metric, which shows an improvement in the total number of defects from a baseline of 1.0 in Q1 to 0.77 in Q2, and down to 0.65 in Q3, as a result of implementing QMS and other quality initiatives.
Figure 9. Total Defects, Normalized, Group 1
Figure 10 shows a scatter plot of the normalized number of defects on a quarterly basis from 2007 Q3 to 2010 Q4 for Group 2 (a different business unit than in the previous figure). Again, the data is normalized for confidentiality reasons. Multiple variables affect the data points since many Quality and Maturity improvement initiatives were executed in parallel over this three-year span. Although the points show aberrations, there is a clear downward trend in the normalized number of defects over time. An appreciable portion of this trend is attributed to the implementation of QMS Workflows and Checklists.
Figure 10. Total Defects, Normalized, Group 2
Conclusion
Implementation of the Quality Maturity Server (QMS) helped increase the Quality and Maturity of IP blocks as well as of SoC products. It provided transparency into the status of IP Quality and Maturity and improved communication between teams working on very complex SoCs. QMS assists in the risk assessment of projects and is an enabler for decreasing overall defect rates. It also facilitates greater accountability and traceability of deliverable Quality and helps achieve more consistent output through the use of standardized checklists. As more of the company adopts this methodology, the ability to achieve improved Quality and Maturity steadily increases, resulting in better products for our customers.
References
[1] VSI Alliance, legacy documents, http://vsi.org.
[2] The SPIRIT Consortium and Accellera, http://www.accellera.org.
[3] M. Keating and P. Bricaud, Reuse Methodology Manual for System-on-a-Chip Designs, 3rd ed., Springer, 2007.
[4] VSI Alliance, QIP checklists, http://www.vsi.org/docs/VSIA-QIP-v4.0.zip.