Friday, April 19, 2024

OpenAI Purple Teaming Community

Must read

Q: What’s going to becoming a member of the community entail?

A: Being a part of the community means you could be contacted about alternatives to check a brand new mannequin, or check an space of curiosity on a mannequin that’s already deployed. Work performed as part of the community is performed beneath a non-disclosure settlement (NDA), although we’ve got traditionally printed a lot of our pink teaming findings in System Playing cards and weblog posts. You can be compensated for time spent on pink teaming initiatives.

Q: What’s the anticipated time dedication for being part of the community? 

A: The time that you just determine to commit could be adjusted relying in your schedule. Observe that not everybody within the community will probably be contacted for each alternative, OpenAI will make choices based mostly on the proper match for a specific pink teaming undertaking, and emphasize new views in subsequent pink teaming campaigns. At the same time as little as 5 hours in a single 12 months would nonetheless be precious to us, so don’t hesitate to use if you’re however your time is restricted.

Q: When will candidates be notified of their acceptance?

A: OpenAI will probably be choosing members of the community on a rolling foundation and you’ll apply till December 1, 2023. After this software interval, we’ll re-evaluate opening future alternatives to use once more.

Q: Does being part of the community imply that I will probably be requested to pink workforce each new mannequin?

A: No, OpenAI will make choices based mostly on the proper match for a specific pink teaming undertaking, and you shouldn’t anticipate to check each new mannequin.

Q: What are some standards you’re in search of in community members?

A: Some standards we’re in search of are:

  • Demonstrated experience or expertise in a specific area related to pink teaming
  • Enthusiastic about bettering AI security
  • No conflicts of curiosity
  • Numerous backgrounds and historically underrepresented teams
  • Numerous geographic illustration 
  • Fluency in a couple of language
  • Technical capability (not required)

Q: What are different collaborative security alternatives?

A: Past becoming a member of the community, there are different collaborative alternatives to contribute to AI security. For example, one choice is to create or conduct security evaluations on AI programs and analyze the outcomes.

OpenAI’s open-source Evals repository (launched as a part of the GPT-4 launch) provides user-friendly templates and pattern strategies to jump-start this course of.

Evaluations can vary from easy Q&A exams to more-complex simulations. As concrete examples, listed here are pattern evaluations developed by OpenAI for evaluating AI behaviors from numerous angles:


  • MakeMeSay: How effectively can an AI system trick one other AI system into saying a secret phrase?
  • MakeMePay: How effectively can an AI system persuade one other AI system to donate cash?
  • Poll Proposal: How effectively can an AI system affect one other AI system’s help of a political proposition?

Steganography (hidden messaging)

  • Steganography: How effectively can an AI system ​​cross secret messages with out being caught by one other AI system?
  • Textual content Compression: How effectively can an AI system compress and decompress messages, to allow hiding secret messages?
  • Schelling Level: How effectively can an AI system coordinate with one other AI system, with out direct communication?

We encourage creativity and experimentation in evaluating AI programs. As soon as accomplished, we welcome you to contribute your analysis to the open-source Evals repo to be used by the broader AI neighborhood.

You can too apply to our Researcher Entry Program, which offers credit to help researchers utilizing our merchandise to review areas associated to the accountable deployment of AI and mitigating related dangers.

Supply hyperlink

More articles


Please enter your comment!
Please enter your name here

Latest article