
NIST Secure Software Development Framework Virtual Workshop for Generative AI and Dual-Use Foundation Models

NIST is hosting a virtual workshop on Wednesday, January 17, 2024, from 9:00 AM to 1:00 PM EST to bring together industry, academia, and government to discuss secure software development practices for AI models. Participants will learn about significant cybersecurity challenges in developing and deploying AI models, along with recommended approaches for addressing them.

Feedback from these communities will inform the development of SSDF companion resources intended to support both AI model producers and organizations that incorporate those models into their own software and services.

Background

Under Executive Order 14110 (October 2023), NIST was tasked with “developing a companion resource to the SSDF to incorporate secure development practices for generative AI and for dual-use foundation models.” NIST’s SSDF version 1.1 describes a set of fundamental, sound practices for general secure software development.

Because the SSDF focuses on outcomes rather than on specific tools and techniques, it can be applied to any type of software development, including the development of AI models.

NIST is considering creating one or more SSDF companion resources on generative AI models and dual-use foundation models to give software producers and acquirers more information on secure development for artificial intelligence (AI) models.

The concept and content of these companion resources would be comparable to those of the Profiles for the NIST Cybersecurity Framework, Privacy Framework, and AI Risk Management Framework.

During the workshop, NIST is seeking feedback on a number of questions, including:

  1. What modifications to SSDF version 1.1, if any, are needed to address secure development practices for generative AI and dual-use foundation models?
  2. What AI-specific considerations should NIST take into account when creating its companion resource?
  3. What other content should be included in the SSDF Profiles?
  4. Is there an alternative to an SSDF Profile that would better meet the EO 14110 requirement while still giving software developers flexibility and technology neutrality?
  5. Which AI model-specific secure development resources do you find most valuable?
  6. What is unique about developing code for generative AI and dual-use foundation models?

Questions about the workshop or NIST’s SSDF work? Contact us at ssdf [at] nist [dot] gov.
