
OpenAI LLC has reportedly hired about 1,000 contractors over the past six months to help hone its artificial intelligence models’ coding capabilities.

Semafor reported the development today, citing people familiar with the matter. OpenAI has built an AI system called Codex that can automatically generate code snippets based on text prompts. Codex is trained on billions of lines of open-source code from GitHub.

According to Semafor’s sources, OpenAI has hired contractors in multiple parts of the world including Latin America and Eastern Europe. About 60% of the 1,000 hires reportedly onboarded in the past six months work on data labeling projects. Data labeling is the task of enriching AI training datasets with contextual information designed to help neural networks learn faster.

The remaining 40% of the contractors are programmers, today’s report elaborated. It’s believed they will write code from which neural networks can learn to generate new software. Interview questions shared with an applicant reportedly suggest OpenAI is seeking not only code examples, but also natural language descriptions of the underlying thought process. 
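To make the idea concrete, here is a minimal sketch of what such a labeled training record might look like: a task description, the contractor's natural-language reasoning, and the code itself. The field names and format are purely illustrative assumptions, not OpenAI's actual data schema.

```python
# Hypothetical labeled example pairing code with a description of the
# underlying thought process. The schema is illustrative, not OpenAI's.
labeled_example = {
    "task": "Return the k largest values in a list",
    "reasoning": (
        "Sorting the whole list costs O(n log n); heapq.nlargest keeps "
        "a bounded heap and is cheaper when k is small relative to n."
    ),
    "code": (
        "import heapq\n"
        "\n"
        "def top_k(values, k):\n"
        "    return heapq.nlargest(k, values)\n"
    ),
}

# Sanity-check that the labeled code runs and matches its description.
namespace = {}
exec(labeled_example["code"], namespace)
print(namespace["top_k"]([3, 1, 4, 1, 5, 9], 2))  # [9, 5]
```

A record like this gives a model both a target program and an explanation of why it is written that way, which matches the reported interview emphasis on thought process as well as code.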

Codex, OpenAI’s flagship code generation AI, debuted in 2021. The same year, Microsoft Corp.’s GitHub unit introduced a Codex-powered coding assistant called Copilot. The service has since been adopted by tens of thousands of developers.

Copilot enables developers to describe a software task in natural language and have the code necessary to carry it out generated automatically. The service uses OpenAI’s Codex neural network to produce up to several lines of code at a time. According to GitHub, it’s particularly adept at helping developers integrate third-party code components and services with their applications. 
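The interaction model can be illustrated with a sketch: the developer writes a natural-language comment, and the assistant proposes the code beneath it. The completion below is written by hand for illustration of the style of suggestion such a tool produces; it is not captured Copilot output.

```python
# Developer-written prompt (a natural-language comment describing the task):
# Parse an ISO 8601 date string and return how many days ago it was.

# The kind of completion a Copilot-style assistant might then suggest:
from datetime import date, datetime

def days_since(iso_date: str) -> int:
    then = datetime.strptime(iso_date, "%Y-%m-%d").date()
    return (date.today() - then).days

print(days_since(date.today().isoformat()))  # 0
```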

Developers can access Codex not only as part of Copilot but also through a standalone application programming interface. According to OpenAI, generating code is only one of several programming tasks that the API supports. Codex is also capable of explaining existing code and producing terminal commands, the instructions that developers enter into a computer’s command line to carry out configuration tasks.
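A minimal sketch of calling that API for one of these tasks, explaining a terminal command, might look like the following. It assumes OpenAI's completions endpoint and the `code-davinci-002` Codex model that OpenAI documented at the time; the request is only sent when an API key is configured, so adjust the model name to whatever your account supports.

```python
import json
import os
import urllib.request

# Request payload for OpenAI's completions endpoint. "code-davinci-002"
# is the Codex model name OpenAI documented; this is a sketch, not a
# guaranteed-current configuration.
payload = {
    "model": "code-davinci-002",
    "prompt": "# Explain what this command does:\n# grep -r 'TODO' src/\n",
    "max_tokens": 64,
    "temperature": 0,
}

api_key = os.environ.get("OPENAI_API_KEY")
if api_key:  # Only send the request when a key is actually configured.
    req = urllib.request.Request(
        "https://api.openai.com/v1/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["text"])
else:
    print("OPENAI_API_KEY not set; request payload:", json.dumps(payload))
```

The same endpoint shape covers code generation, code explanation, and command-line help; only the prompt changes.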

OpenAI earlier this week secured an investment from Microsoft that is reportedly worth $10 billion over several years. Microsoft will provide not only capital, but also cloud infrastructure that the AI group can use to train new neural networks. 

As part of a previous $1 billion investment announced in 2019, Microsoft has built a cloud-based supercomputer to support OpenAI’s research. The system featured 10,000 graphics cards and 285,000 central processing unit cores as of 2020. It ranked among the world’s five fastest supercomputers at the time of its completion. 

Image: OpenAI

