OpenAI has launched a suite of open-source tools to help developers build safer AI applications for teenagers. The release includes a 'safety prompt pack' with pre-written guidelines to protect against risks such as self-harm, sexual content, and harmful body image ideals. These resources are designed to be directly integrated into AI systems, providing a more robust framework than high-level guidelines.
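In practice, "direct integration" of a prompt pack typically means injecting the pre-written guidelines into a model request as a system prompt. The sketch below illustrates that pattern under stated assumptions: the `SAFETY_PROMPTS` dictionary, its topic keys, and the guideline wording are all hypothetical placeholders, not the contents of OpenAI's actual release.

```python
# Hypothetical sketch of prompt-pack integration. The prompt texts and
# topic names below are illustrative assumptions, not OpenAI's pack.

SAFETY_PROMPTS = {
    "self_harm": (
        "If the user expresses thoughts of self-harm, respond with empathy "
        "and encourage them to seek help from a trusted adult or professional."
    ),
    "body_image": (
        "Avoid language that promotes harmful body image ideals or "
        "unhealthy comparisons."
    ),
}

def build_messages(user_text, topics):
    """Prepend the selected safety guidelines as a single system message,
    so they govern the model's behavior for the whole conversation."""
    system_prompt = "\n".join(SAFETY_PROMPTS[t] for t in topics)
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_text},
    ]

messages = build_messages(
    "I feel awful about how I look.",
    ["body_image", "self_harm"],
)
```

The resulting `messages` list can then be passed to a chat-style model API; keeping the safety text in the system role (rather than appending it to user input) is the conventional way to make such rules persistent and harder to override.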

The move aims to create a standardized safety foundation across the AI ecosystem, addressing a gap where developers often struggled to translate broad safety goals into specific operational rules. This initiative follows previous steps by OpenAI to enhance teen safety, including adding under-18 principles to its model specifications. The tools were developed in collaboration with nonprofits like Common Sense Media.