OpenAI Fellows Fall 2018: Final projects
Our second class of OpenAI Fellows has wrapped up, with each Fellow going from a machine learning beginner to core OpenAI contributor in the...
We’ve created MuseNet, a deep neural network that can generate 4-minute musical compositions with 10 different instruments, and can combine ...
We’ve developed the Sparse Transformer, a deep neural network which sets new records at predicting what comes next in a sequence—whether tex...
OpenAI Five is the first AI to beat the world champions in an esports game, having won two back-to-back games versus the world champion Dota...
We’ll be holding our final live event for OpenAI Five at 11:30am PT on April 13....
We’ve made progress towards stable and scalable training of energy-based models (EBMs) resulting in better sample quality and generalization...
Our class of eight scholars (out of 550 applicants) brings together collective expertise in literature, philosophy, cell biology, statistics...
We’ve created OpenAI LP, a new “capped-profit” company that allows us to rapidly increase our investments in compute and talent while includ...
We’ve created activation atlases (in collaboration with Google researchers), a new technique for visualizing what interactions between neuro...
We’re releasing a Neural MMO, a massively multiagent game environment for reinforcement learning agents. Our platform supports a large, vari...
On February 2, we held our first Spinning Up Workshop as part of our new education initiative at OpenAI....
We’ve written a paper arguing that long-term AI safety research needs social scientists to ensure AI alignment algorithms succeed when actua...
We’ve trained a large-scale unsupervised language model which generates coherent paragraphs of text, achieves state-of-the-art performance o...
Our first cohort of OpenAI Fellows has concluded, with each Fellow going from a machine learning beginner to core OpenAI contributor in the ...
We’ve discovered that the gradient noise scale, a simple statistical metric, predicts the parallelizability of neural network training on a ...
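A minimal sketch of the "simple" gradient noise scale described in that work, B_simple = tr(Σ) / |G|², where G is the mean gradient and Σ the per-example gradient covariance. The function name and the per-example-gradient interface below are illustrative assumptions; in practice the quantity is usually estimated from gradient norms measured at two different batch sizes rather than from explicit per-example gradients.

    import numpy as np

    def simple_noise_scale(per_example_grads):
        # per_example_grads: (num_examples, num_params) array of individual-example gradients.
        # Estimates B_simple = tr(Sigma) / |G|^2.
        g_mean = per_example_grads.mean(axis=0)                 # estimate of the mean gradient G
        tr_sigma = per_example_grads.var(axis=0, ddof=1).sum()  # estimate of tr(Sigma)
        return tr_sigma / np.dot(g_mean, g_mean)

Larger values of this ratio suggest that larger batch sizes can be used before data parallelism stops paying off.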
We’re releasing CoinRun, a training environment which provides a metric for an agent’s ability to transfer its experience to novel situation...
We’re releasing Spinning Up in Deep RL, an educational resource designed to let anyone learn to become a skilled practitioner in deep reinfo...
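For context, a hedged sketch of how one of the Spinning Up reference implementations is typically invoked from Python. The `ppo_pytorch` entry point and the hyperparameters shown are assumptions based on later versions of the `spinup` package (the original 2018 release exposed a TensorFlow implementation as `spinup.ppo`); treat this as an illustrative usage pattern, not the definitive interface.

    import gym
    from spinup import ppo_pytorch as ppo  # assumes the spinup package is installed

    # Train PPO on a small Gym task; the environment and settings are illustrative.
    env_fn = lambda: gym.make("CartPole-v1")
    ppo(env_fn=env_fn,
        ac_kwargs=dict(hidden_sizes=(64, 64)),
        steps_per_epoch=4000,
        epochs=50,
        logger_kwargs=dict(output_dir="out/ppo_cartpole", exp_name="ppo_cartpole"))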
We’ve developed an energy-based model that can quickly learn to identify and generate instances of concepts, such as near, above, between, c...