According to Wired, OpenAI and Handshake AI, a company that supplies it with training data, are asking third-party contractors to upload real examples of work from their past and current jobs. The practice appears to be part of AI companies' push to secure high-quality training data that could help models automate more white-collar tasks.

Specifically, in a company briefing, OpenAI asked contractors to detail the tasks they performed in other jobs and to upload "actual work examples" they have "completed." These examples can be "specific outputs (not summaries of files, but actual files), such as Word documents, PDFs, PowerPoint presentations, Excel spreadsheets, images, or code repositories," among others.

Before uploading, OpenAI also requires contractors to strip out any proprietary and personally identifiable information, and provides a ChatGPT-based scrubbing tool to help them do so. However, intellectual property lawyer Evan Brown told Wired that any AI lab taking this approach is assuming "significant risk," because it depends heavily on contractors' judgment about what information is confidential.

An OpenAI spokesperson declined to comment further.

Key Points:

- 📄 OpenAI and Handshake AI ask contractors to upload actual work examples to improve the quality of AI training data.

- 🔒 Contractors must remove any proprietary and personal information before uploading, using a specified cleaning tool.

- ⚖️ An intellectual property lawyer warns that the practice may carry significant risks, since it relies on contractors' judgment about what is confidential.