AlphaCode: DeepMind's code generation model

DeepMind has introduced AlphaCode, a code generation system with 41 billion parameters. According to DeepMind, AlphaCode outperforms OpenAI Codex and generates code in 12 programming languages.

According to a Cambridge University study, developers spend more than half of their working time debugging code, which costs the IT industry about $300 billion a year. AI-based code generation and analysis tools can reduce development costs by letting programmers focus on creative rather than routine tasks.

The developers of AlphaCode claim that their system generates code from a description of the algorithm that not only compiles without errors but also actually matches that description.
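One way such a system can verify that generated code matches a description is to sample many candidate programs and keep only those that pass the example tests supplied with the problem. The sketch below illustrates this filtering idea; the `run_candidate` and `filter_candidates` helpers and the toy candidates are illustrative, not AlphaCode's actual implementation.

```python
import subprocess
import sys


def run_candidate(source: str, stdin_text: str, timeout: float = 2.0) -> str:
    """Run a candidate program in a subprocess and capture its stdout."""
    result = subprocess.run(
        [sys.executable, "-c", source],
        input=stdin_text,
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    return result.stdout.strip()


def filter_candidates(candidates, examples):
    """Keep only candidates whose output matches every example test."""
    survivors = []
    for source in candidates:
        try:
            if all(run_candidate(source, inp) == out for inp, out in examples):
                survivors.append(source)
        except Exception:
            continue  # crashes and timeouts also disqualify a candidate
    return survivors


# Toy task: "read integers from one line and print their sum".
candidates = [
    "print(sum(map(int, input().split())))",  # matches the description
    "print(max(map(int, input().split())))",  # compiles, but does the wrong thing
]
examples = [("1 2 3", "6"), ("10 20", "30")]
print(len(filter_candidates(candidates, examples)))  # → 1
```

The second candidate compiles and runs cleanly, yet it is rejected because its output disagrees with the example tests; only the program that truly matches the description survives.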

AlphaCode is a transformer-based language model. It contains 41.4 billion parameters, about four times as many as Codex. The system was trained on public GitHub repositories in C++, C#, Go, Java, JavaScript, Lua, PHP, Python, Ruby, Rust, Scala, and TypeScript. The training dataset amounted to 715.1 GB of code and accompanying descriptions.
