When OpenAI announced GPT-3, one of the most interesting discoveries was that the model had learned to code simply by ingesting text from the Internet and could translate natural language into computer code.
This discovery led to GitHub's Copilot, a tool that helps developers write code faster and more easily.
OpenAI has, however, also been working on a version that users can work with directly, and has now made Codex available to beta users.
Codex is GPT-3 trained on public code on GitHub rather than on written material. It can take phrases such as “bounce the ball off the sides of the screen” or “download this data using the public API and sort by date” and generate working code in one of twelve languages.
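The announcement does not show Codex's actual output, but a minimal sketch of the kind of code a prompt like “bounce the ball off the sides of the screen” might yield could look like the following. The screen size, ball state, and `step` function here are illustrative assumptions, not from the article.

```python
# Hypothetical sketch of code a prompt like
# "bounce the ball off the sides of the screen" might produce.
# Screen dimensions and velocities are assumed values.

WIDTH, HEIGHT = 640, 480

def step(x, y, vx, vy):
    """Advance the ball one frame, reversing velocity at the screen edges."""
    x, y = x + vx, y + vy
    if x < 0 or x > WIDTH:
        vx = -vx                        # bounce off the left/right sides
        x = max(0, min(x, WIDTH))       # clamp back onto the screen
    if y < 0 or y > HEIGHT:
        vy = -vy                        # bounce off the top/bottom
        y = max(0, min(y, HEIGHT))
    return x, y, vx, vy

# Run a few frames to see the ball rebound off the right edge.
x, y, vx, vy = 630, 100, 20, 5
for _ in range(3):
    x, y, vx, vy = step(x, y, vx, vy)
```

In a real game loop the same `step` logic would run once per frame before drawing the ball.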
It understands code elements such as web servers, keyboard commands, and object manipulations and animations, and it responds to natural-language commands. If you say “shrink it and crop it” and then “have its horizontal position controlled by the left and right arrow keys”, it knows that “it” refers to the same object. It also understands that the sky is at the top of the screen when you say “drop the rock from the sky”, and it even makes the rock speed up as a real falling object would.
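That last behavior, a rock that accelerates as it falls, amounts to applying constant gravitational acceleration from the top of the screen. A small sketch of what such generated code might compute (the gravity constant and units are assumptions, not from the article):

```python
# Hypothetical sketch: "drop the rock from the sky" -- the sky is the
# top of the screen (y = 0), and gravity makes the rock speed up as it
# falls, so distance fallen grows with the square of elapsed time.

GRAVITY = 9.8  # assumed units: pixels per second squared

def fall_positions(seconds):
    """Rock's y position at each whole second, starting at the top of the screen."""
    return [0.5 * GRAVITY * t * t for t in range(seconds + 1)]

positions = fall_positions(3)  # y grows faster each second: the rock accelerates
```

Note the gaps between successive positions widen each second, which is exactly the speeding-up effect the demo describes.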
It is also aware of its previous work, so it can maintain naming conventions, variable names, and other patterns across commands.
Despite Codex's grasp of natural language, OpenAI still views it as a tool to help developers.
“Programming is all about having a vision and breaking it down into chunks, and then creating code for those chunks,” said Greg Brockman, OpenAI's CTO, adding that Codex is meant to let developers spend more time on the former and less on the latter.
“I’ve written this kind of code probably a dozen times, and I always forget exactly how it works,” Brockman noted. “I don’t know about these APIs and I don’t have to know them. You can just do the same things more easily, with fewer keystrokes or interactions.”
Learn more about the project on OpenAI's website.