Challenge Details
Tracking Number

TW-2023-132

Organization

Joint Federated Assurance Center (JFAC)

Start Date

Apr 14, 2023

End Date

May 8, 2023

Current Status

Closed

Registration

Open

Automated, Enterprise-Scale Assurance Case Framework
Challenge Summary
Description
The Joint Federated Assurance Center (JFAC)’s mission is to provide assurance solutions to the federation of Department of Defense (DoD) customers and program offices for weapon systems, information systems, and national security systems. Due to the unique reach of JFAC across the DoD, consideration of enterprise-wide approaches to addressing Trust and Assurance is at the core of JFAC’s pursuits.
Additional Information
Submission Deadline:
05/08/2023 at 06:00 PM EST
ORGANIZATION:
    The Joint Federated Assurance Center (JFAC)’s mission is to provide assurance solutions to the federation of Department of Defense (DoD) customers and program offices for weapon systems, information systems, and national security systems. Due to the unique reach of JFAC across the DoD, consideration of enterprise-wide approaches to addressing Trust and Assurance is at the core of JFAC’s pursuits (see Figure 1).


Figure 1 Definitions of Trust and Assurance


JFAC’s vision is to build trust through holistic assurance.  Holistic assurance is the comprehensive, traceable, and composable connections across hardware and software components that comprise a system, aggregation across the system of systems, and linked to mission impacts and effects (see Figure 2).


 
 


Figure 2 Holistic Assurance

 
 
STRATEGIC BACKGROUND:
      The Department needs to transform to compete with pacing global adversarial threats. The need for speed, agility, and resilience is significant and should drive institutional changes towards an approach that can adapt to external forces (see Figure 3).

    

Figure 3 Digital Transformation - Continuous Everything

 
 
Visionary Digital Engineering as a Service
   

One critical enabler for this transformation is the ability to leverage a digital infrastructure that provides all of the necessary data and tools, integrated into a seamless end-to-end workflow, via Digital Engineering as a Service (DEaaS – see Figure 4). DEaaS enables the generation of digital evidence across the life cycle for assurance purposes.

 

Figure 4 Visionary Digital Engineering as a Service

 
 
JFAC's Federated Data Enabling Holistic Assurance Approach
   

Current enterprise challenges in the assurance space are often segmented, driven by various factors: operating domains, system types, organizations, governance, authorities, funding sources, infrastructure segmentation, competency areas, technology, and life cycle processes. However, as the Department modernizes in software, cloud technology adoption, data-centricity/Data as a Service (DaaS), DEaaS, federated data exposure through Application Programming Interfaces (APIs) and data mesh, analysis through big data analytics, and data-informed decision-making, assurance and an enterprise assurance case framework become the glue between the Department's various priorities and initiatives. To that end, opportunities exist to aggregate data across a federated ecosystem of repositories containing evidence that supports various assurance activities (see Figure 5).



Figure 5 JFAC's Federated Data Enabling
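
As a purely illustrative sketch of the federated aggregation concept described above, a query could fan out across repositories exposed through a common API and merge the returned evidence records. None of the repository names, fields, or interfaces below come from JFAC or this announcement; they are assumptions for illustration only.

"""Hypothetical sketch: aggregating assurance evidence from a federation of
repositories behind a common API. All names and fields are illustrative."""

from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class EvidenceRecord:
    source_repo: str      # which federated repository produced the record
    system_id: str        # identifier of the system the evidence applies to
    artifact_type: str    # e.g., "SBOM", "test_result", "threat_assessment"
    uri: str              # where the underlying artifact lives


class RepositoryClient:
    """Stand-in for a repository exposed through an API or data mesh node."""

    def __init__(self, name: str, records: List[EvidenceRecord]):
        self.name = name
        self._records = records

    def query(self, system_id: str) -> Iterable[EvidenceRecord]:
        # A real client would call a REST/GraphQL endpoint here.
        return (r for r in self._records if r.system_id == system_id)


def aggregate_evidence(repos: List[RepositoryClient], system_id: str) -> List[EvidenceRecord]:
    """Fan the same query out across the federation and merge the results."""
    merged: List[EvidenceRecord] = []
    for repo in repos:
        merged.extend(repo.query(system_id))
    return merged


if __name__ == "__main__":
    swa_repo = RepositoryClient("software_assurance", [
        EvidenceRecord("software_assurance", "SYS-001", "SBOM", "repo://swa/sys-001/sbom.json"),
    ])
    tne_repo = RepositoryClient("test_and_evaluation", [
        EvidenceRecord("test_and_evaluation", "SYS-001", "test_result", "repo://tne/sys-001/run-7"),
    ])
    for record in aggregate_evidence([swa_repo, tne_repo], "SYS-001"):
        print(record.artifact_type, record.uri)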

 
 
Assurance Case Framework Implementation
   

The JFAC is seeking a technology offering to support implementation of assurance case frameworks across the life cycle of DoD acquisition programs and portfolios. The goal of assurance case framework implementation is to migrate from checklist-based acquisition approaches towards a holistic, comprehensive, and effective risk management capability. In addition, integrating artifacts across a program within the proper context of assurance arguments/claims, with linkages to the supporting evidence and sources, provides a holistic, contextualized perspective to inform decision makers such as Milestone Decision Authorities (MDAs), Acquisition Executives, and Program Managers. A strategic example of how assurance cases can be integrated across various competencies, supported by arguments, claims, and evidence, is shown in Figure 6.


Figure 6 Strategic Example Assurance Case Framework

 
 
Assurance Process
      Data Ingestion: Collecting data and information into a digital medium


Data Curation: Preparing data, including metadata tagging, for various value-generating downstream workflows (e.g., data storage/retrieval, search/query, analysis/analytics, or AI/ML model development)


Data Storage/System of Records: Information systems storing evidentiary data


Evidentiary Discovery: The ability to discover sources and evidence relevant to an assurance case


Assurance Case Development: The generation and presentation of a compelling case in pursuit of resolving critical decision topics (e.g., Is the system safe? Is the system secure? Is the system mission-effective? Is the system survivable?)


Argument Decomposition: The ability to decompose a strategic decision topic for a system into smaller segments traceable from the overall system to the indentured sub-systems and components


Evidence and Source Traceability: Alignment of assurance cases and arguments with the supporting evidence and sources


Veracity/Credibility Determination: Evaluate the strength of the available evidence and sources for veracity/credibility of the argument/claims. 


Certification: The assurance process can support an outcome of achieving a program accreditation/certification, such as a safety certification.         
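
To make the relationship between these process steps concrete, the following is a minimal, hypothetical Python sketch of an assurance case data model covering argument decomposition, evidence/source traceability, and a simple credibility roll-up. The class names, fields, and weakest-link scoring rule are assumptions for illustration only, not a JFAC-defined schema.

"""Hypothetical sketch: a minimal assurance-case data model with argument
decomposition, evidence traceability, and a weakest-link credibility roll-up."""

from dataclasses import dataclass, field
from typing import List


@dataclass
class Evidence:
    description: str
    source: str           # system of record holding the artifact
    credibility: float    # 0.0 (no confidence) .. 1.0 (fully credible)


@dataclass
class Claim:
    statement: str                                            # e.g., "The system is safe to operate"
    evidence: List[Evidence] = field(default_factory=list)    # traceability to sources
    subclaims: List["Claim"] = field(default_factory=list)    # argument decomposition

    def credibility(self) -> float:
        """Weakest-link roll-up: a claim is only as strong as its weakest support."""
        scores = [e.credibility for e in self.evidence]
        scores += [c.credibility() for c in self.subclaims]
        return min(scores) if scores else 0.0


if __name__ == "__main__":
    top = Claim("The system is acceptably safe for the defined mission")
    hazard = Claim("All identified hazards are mitigated",
                   evidence=[Evidence("Hazard analysis report", "safety_repo", 0.9)])
    software = Claim("Software components meet assurance criteria",
                     evidence=[Evidence("Static analysis scan results", "swa_repo", 0.7)])
    top.subclaims += [hazard, software]
    print(f"Top-level claim credibility: {top.credibility():.2f}")   # prints 0.70

In practice, the roll-up rule (here, weakest link) would be defined by whatever assurance methodology and veracity/credibility criteria the program adopts.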

 
 
Evidence Types
    Various types of evidence exist that can be used for the generation and updating of assurance cases.  An example assurance case applied to an autonomous system (see Figure 7) is provided to show how a specific implementation uses various artifacts for different trees (i.e. hazard analysis, mission thread, definitions of acceptable probability, strategies).




Figure 7 Total Assurance Case Framework Implementation Example for Autonomous Systems


Although not a comprehensive list, the following artifacts are potential examples of evidence used for implementing an assurance case across various stages of the life cycle and in support of various certification technical areas: 


Authoritative: Public Laws, Regulations, Policies, and Directives


Community Recommendations: Guidance, Best Practices, Standards


Contractual: Terms, Conditions, Requirements, Deliverables, Schedule, Contract Data Requirements Lists (CDRL)


User/Mission Requirements: War Gaming Simulation/Analysis, Cyber Table-Top, User-Based Requirements, Mission Engineering Artifacts, Mission Models, Design Reference Mission, Concept of Operations (CONOPS), Concept of Employment (CONEMP), Joint/Service Mission-Task Lists, Analyses of Alternatives, Trade Studies, Initial Capabilities Document, Capability Development Document, Capability Production Document, Operational Requirements Document


System Engineering:  System Model, System Specification, System Architecture, Mission Essential Functional List, Performance Specification, Interface Control Document, Systems Engineering Plan, Design Bill of Materials (DBOM), Digital Twins, Engineering Drawings, Material Specifications


Software Engineering: Software Requirement Document (SRD), Software Bill of Materials (SBOM), Software Requirements Specification, Software Development Plan, Software Architecture, Software Defects, Software Metrics, Software Assurance Tool Assessments


Specialty Engineering: System Safety; Reliability, Availability, Maintainability (RAM); Root Cause Corrective Action (RCCA); Failure Mode, Effects, and Criticality Analysis (FMECA); Failure Mode and Effects Analysis (FMEA); Human System Integration (HSI); Failure Reporting, Analysis, and Corrective Action System (FRACAS)


Test & Evaluation:  Test and Evaluation Master Plan (TEMP), Test Plan, Range Resource Schedule Clearinghouse, Unit Test Results, Integration Test Results, Simulation Results, Live Test Results, Live-Virtual Constructive Testing, Test and Evaluation Manuals

Manufacturing: Build Specification, Manufacturing Bill of Materials (MBOM), Production Bill of Materials (PBOM), Production Delivery Schedule, Production Process Flow, Production Value Stream Map, Hardware Condition Assessments, Non-Conformance/Yield Rates, Supply Chain


Intelligence & Security: Program Protection Plan (PPP), Technology Area Protection Plan (TAPP), Security Classification Guide (SCG), Threat Assessment, Risk Management Framework, Cyber Defense Operations, Counter-Intelligence Support Plan (CISP)


Configuration Management: extensible Bills of Materials (xBOM), Configuration Audits (functional/physical)


Logistics & Sustainment: Life Cycle Sustainment Plan (LCSP), Logistics Support Elements, Sustainment Metrics, Technical Data Packages, Training, Support Equipment, Maintenance Task Analysis (MTA), Depot/Intermediate/Operational Maintenance, Logistics Support Analysis (LSA), Level of Repair Analysis (LORA), Performance Based Logistics, Site Surveys, Diminishing Manufacturing Sources and Material Shortages (DMSMS), Reliability Centered Maintenance (RCM), Condition Based Maintenance (CBM), Prognostic Health Management, Supply Chain Management, Supportability Analysis, Fleet Management, Spares Management, Engineering Change Proposal (ECP), Modification Management


Operations: Operation Plan (OPLAN), Operations Order (OPORD), Execute Order (EXORD), Concept Plan (CONPLAN), Simulations, Operational Mission Package, Operational Threat, Sensor Data, Operator Assessments, Commander’s Intent, Operational Bill of Materials (OBOM), Battle Damage Assessments, Mission Effectiveness Evaluations
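
As an illustration of how curated metadata tagging could make evidence in these categories discoverable for a specific assurance case, a minimal sketch follows. The category labels mirror the headings above; everything else (field names, phases, URIs) is a hypothetical assumption, not part of this announcement.

"""Hypothetical sketch: tagging evidence artifacts so they can be discovered
later for a specific assurance case. All names are illustrative."""

from dataclasses import dataclass
from typing import List


@dataclass
class Artifact:
    name: str
    category: str          # e.g., "Software Engineering", "Test & Evaluation"
    lifecycle_phase: str   # e.g., "development", "production", "sustainment"
    uri: str


def discover(catalog: List[Artifact], category: str, phase: str) -> List[Artifact]:
    """Evidentiary discovery over a curated, metadata-tagged catalog."""
    return [a for a in catalog
            if a.category == category and a.lifecycle_phase == phase]


if __name__ == "__main__":
    catalog = [
        Artifact("SBOM for flight software", "Software Engineering", "development", "repo://swe/sbom"),
        Artifact("TEMP", "Test & Evaluation", "development", "repo://tne/temp"),
        Artifact("LCSP", "Logistics & Sustainment", "sustainment", "repo://log/lcsp"),
    ]
    for hit in discover(catalog, "Software Engineering", "development"):
        print(hit.name, hit.uri)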

 
 
Assurance Technology Challenges
   

·        Reusable, Modularized Assurance Arguments/Claims/Evidence – Ability to reuse portions of an assurance case framework across other arguments (e.g., the same evidence supporting both a safety assurance case and a cybersecurity case) or systems (e.g., portable claims and evidence for a microelectronic part that is used in multiple systems). This could include the establishment of an enterprise repository of assurance cases to be leveraged by programs.

·        Automation – Current assurance case development requires significant time and highly specialized skillsets. A technology is needed to rapidly generate assurance cases, providing perhaps an 80% solution that can be reviewed and built upon by a subject matter expert (SME) for consistency, accuracy, and completeness.

·        Maturity – The bulk of assurance case framework implementations to date are heavily academic in nature, examining very specific use cases, with low technology readiness levels (TRL). The state of the art in assurance case framework implementation needs to be advanced and matured.

·        Enterprise/Scalable – There is a need to advance the technology towards scalable solutions that can compile cases/arguments/claims/evidence across various viewpoints, from macro to micro, across the life cycle and across acquisition programs (i.e., the DoD Enterprise).

·        Aggregation – Current data analytics and AI/ML models are often built on specific data fields and functions in the creation of narrow AI; however, data generated across the life cycle consists of a multitude of data types: requirements, SysML/UML models, JIRA records, source code, metric outcomes, software assurance scan results, test and evaluation results, etc. Solutions need to integrate and aggregate the various types of data generated across the life cycle into the assurance framework.

·        Traceability – The ability to link various evidence to overall outcomes eludes current acquisition programs and systems because it crosses multiple competencies. For example, a software vulnerability discovered (Software Assurance) in a software component configuration within a system architecture may impact other interfaced components (System Security Engineering) and result in a change to the desired overall mission effects and outcomes (Mission Engineering – probability of mission success).

·        Composable – The ability to decompose/recompose from a mission/system-level view down to the sub-systems, components, modules, and parts, aligned to the corresponding arguments/claims/evidence/sources.

·        Persistent/Continuous – The ability to have real-time updates to the assurance case as new evidence is generated, old evidence/data becomes stale, or new branches of an argument become needed (e.g., new requirements from an update to a standard).

·        Visualization – The ability to provide human-readable/interpretable views with an intuitive user interface/user experience (UI/UX), particularly for highly complex assurance cases (potentially tens of thousands of nodes).
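
To illustrate the Persistent/Continuous challenge above, the following hypothetical sketch re-evaluates a claim as its supporting evidence ages. The one-year staleness threshold and all field names are assumptions for illustration, not requirements of this announcement.

"""Hypothetical sketch: re-evaluating a claim as evidence becomes stale."""

from dataclasses import dataclass
from datetime import date, timedelta
from typing import List

MAX_EVIDENCE_AGE = timedelta(days=365)   # assumed staleness threshold


@dataclass
class Evidence:
    name: str
    collected_on: date


@dataclass
class Claim:
    statement: str
    evidence: List[Evidence]

    def status(self, today: date) -> str:
        """A claim is 'supported' only if every piece of evidence is fresh."""
        if not self.evidence:
            return "unsupported"
        stale = [e for e in self.evidence if today - e.collected_on > MAX_EVIDENCE_AGE]
        return "needs re-evaluation" if stale else "supported"


if __name__ == "__main__":
    claim = Claim("Software components meet assurance criteria",
                  evidence=[Evidence("SwA scan results", date(2022, 1, 15))])
    # Prints "needs re-evaluation": the scan is more than a year old.
    print(claim.status(date(2023, 4, 14)))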

 
 
USE CASE:
    Support implementing an assurance case framework and address the assurance technology challenges in support of an early development acquisition program, where some data and artifacts have yet to be generated or developed (e.g., using an assurance case to drive the development of criteria and evidence to support satisfaction of claims). Opportunities exist to support large and small acquisition programs as initial pilot projects.
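
As a hypothetical illustration of this use case, an assurance case for an early development program could declare the evidence each claim will eventually need and report the gaps that remain. The names and availability flags below are assumptions for illustration only.

"""Hypothetical sketch: an assurance case driving evidence-gap reporting for an
early development program."""

from dataclasses import dataclass
from typing import List


@dataclass
class PlannedEvidence:
    name: str
    available: bool   # False while the artifact has not yet been produced


@dataclass
class Claim:
    statement: str
    planned_evidence: List[PlannedEvidence]

    def gaps(self) -> List[str]:
        """Evidence still needed before the claim can be argued."""
        return [e.name for e in self.planned_evidence if not e.available]


if __name__ == "__main__":
    claim = Claim("The system is secure against identified threats",
                  planned_evidence=[
                      PlannedEvidence("Program Protection Plan (PPP)", available=True),
                      PlannedEvidence("Penetration test results", available=False),
                  ])
    print("Evidence gaps:", claim.gaps())   # ['Penetration test results']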
 
 
SOLUTIONS SUBMISSIONS:
   

JFAC is requesting prototype proposals to support a FY23 pilot to advance the state of the art in trust and assurance through an automated and scalable solution for implementing an assurance case framework.  The JFAC is seeking solutions from industry that can address portions or all of the relevant challenges associated with implementing assurance case frameworks for DoD systems and programs.  The process of identifying the proper technologies that meet the JFAC’s need will proceed through a multi-phase approach.  Not all proposals will proceed through all phases of the process or obtain a project agreement with the Government.  The Government may skip or combine rounds as necessary to collect and assess information as it relates to proposed solutions. Further, the Government may go back to earlier rounds with clarifications and additional requests as necessary to identify successful solutions that will meet intended pilot project objectives. Finally, the Government may initiate steps toward a prototype project award at any round. 


Proposal Process


·        ROUND 1 – Discovery Paper

If you have a solution that addresses any of the listed problems, we would love to hear about it. Submit a brief discovery paper for us to assess. JFAC and select subject matter experts may invite those with the highest potential to solve the problem to Round 2. For those that have already submitted under the market research, please feel free to update and resubmit. Please see below for the Discovery Paper Guidelines.


·        ROUND 2 – Digital Proving Ground (DPG) (May 2023)

The goal of the DPG is to allow our Industry partners to have the opportunity to pitch in a fast-paced environment with contracting professionals poised to execute rapid pilot project awards. If selected to participate at the DPG, you will receive instructions on whether to prepare for an informal one-on-one technical discussion, demonstration of a solution, “Shark Tank” pitch, or something similar to better understand your solution. Conversations may continue outside of the DPG if additional information is needed to understand detailed solution offerings, determine the feasibility of teaming between separate solution providers, and/or make a Round 3 determination.


·        ROUND 3 – Project Award 

The Government reserves the right to make one, some, or no pilot project awards based on the Round 2 results. Successfully negotiated awards are intended to be Other Transaction Agreements under 10 USC 4022. An award under 10 USC 4022 may result in a subsequent award of a follow-on production agreement without additional competition based on successful prototype completion. 

 
 
NOTICES
   

Special Notice on Advisors  

Non-Government subject matter expert (SME) advisors may be used during any assessment activity. Such advisors will operate at the direction of the Government and under signed non-disclosure agreements (NDAs). The Government understands that information provided in response to this request for solutions is presented in confidence, and it agrees to protect such information from unauthorized disclosure to the extent required by law. Your participation in any round of assessment under this announcement indicates concurrence with the use of Non-Government SME advisors. Companies supporting in the advisory role include Science Applications International Corporation (SAIC).

 
 
GUIDELINES: WHAT DO I SUBMIT IN ROUND 1?
    Submit a 2-page Discovery Paper through our Registration Form by 1400 Eastern Time on May 05, 2023, answering the following questions. We will not dictate format, but ask that you answer the questions in the order provided.

  • PROBLEM ALIGNMENT: How well does your solution map to our published problem(s)? Argue the solution you’re providing is a perfect fit to solve the published problem(s). 
  • VALUE PROPOSITION: Why is your solution the best approach from a technical perspective? If you can convincingly refute alternatives to solving the problem, please briefly do so. If applicable, also consider arguing why the end user/warfighter would prefer your solution. 
  • PROGRAM IMPACT: Looking only at the DoD personnel who will be impacted by your solution, argue that their jobs or lives will be significantly improved if your solution is adopted. What is the impact of your solution vs. today's solutions? 
  • PROGRAM OFFICE DEMAND: Make your best pitch to the program offices directly. Why would the program office want to adopt your solution? To the extent that you can, make the case that program offices -- once they experience your solution -- will ask for it themselves. 
  • QUALITY OF PROSE: Prove you write clearly. Prove you argue convincingly.


 
 
EVALUATION CRITERIA
   

If a paper is favorably rated as green, the Offeror will be invited to Round 2. If an Offeror is rated yellow, an invitation is at the discretion of the Government. Any Offeror rated red will not move forward to Round 2.



Evaluation criteria will be based on the questionnaire above. The responses will be evaluated against the following:



Acceptable: The paper meets the requirements of all five questions. The responses demonstrate an understanding of each question and support a proven approach that could be utilized under JFAC’s Automated Enterprise-Scale Assurance Case Framework.



Unacceptable: The paper does not meet the requirements of all five questions. The responses do not demonstrate an understanding of each question and do not support a proven approach that could be utilized under JFAC’s Automated Enterprise-Scale Assurance Case Framework.

 
 
Point of Contact

Name

Natasha Backman

Email

natasha.a.backman.ctr@mail.mil

Title

Procurement Analyst

Phone

Not Provided