IS YOUR MIND DEEPER THAN DEEPMIND’S?
Google is at the cutting edge of technology at the moment. With its interest in AI and machine learning, its newest AI system, built by DeepMind, has managed to beat the scores of professional gamers in 23 classic Atari 2600 games. Read on to find out more.
If you belong to the generation that grew up on Atari 2600 consoles, you will know the classic game Breakout, which involves bouncing a ball off a paddle at the bottom edge of the screen to clear out bricks clumped in various patterns at the top. If you were a ‘pro’ at the game, you would have figured out pretty quickly that the best strategy is to tunnel through a narrow column of bricks and then watch the ball bounce on its own between the brick layer and the top edge of the screen.
The system, developed by Google’s DeepMind, learned to do just this without being taught.
Many AI programs have played video games before, but most were fed game data straight from the source code. In this case, DeepMind’s system was told none of the rules and was expected to learn how to play just by ‘watching’ the screen, much as a human would.
To perform these learning tasks it uses a deep neural network, which, like the human brain, contains networks of processing units called neurons. The connections between neurons carry weights that strengthen or weaken depending on the reward or punishment that arrives at the end of each attempt. Over a period of learning, certain network paths therefore become stronger than others. This is how learning and pattern matching happen in the human brain as well.
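The reward-and-punishment idea above can be sketched in a few lines of code. This is a deliberately tiny illustration (a one-state, two-action toy problem with a made-up `reward` function and learning rate), not DeepMind’s actual deep network: the values attached to actions that earn reward are strengthened, and those that earn punishment are weakened.

```python
import random

ALPHA = 0.5   # learning rate: how strongly each outcome updates a value
ACTIONS = ["left", "right"]

# One learned value per action; in a deep network these would be weights.
q = {a: 0.0 for a in ACTIONS}

def reward(action):
    # Toy environment (an assumption for illustration): "right" pays off.
    return 1.0 if action == "right" else -1.0

random.seed(0)
for _ in range(100):
    # Try an action at random, then nudge its value toward the outcome:
    # rewarded actions are strengthened, punished ones weakened.
    a = random.choice(ACTIONS)
    q[a] += ALPHA * (reward(a) - q[a])

best = max(q, key=q.get)
print(best)  # prints "right": the rewarded action ends up preferred
```

After enough trials, the path through the rewarded action dominates, which is the same effect the article describes at the scale of millions of network weights.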
The system has been tested on 49 of Atari’s games, so the network is robust enough to handle more than one game; it does not ‘unlearn’ the patterns it picked up in one game when it moves to another, for instance. On 23 of these games, it has managed to trounce the top scores of professional gamers.
The point of this research is believed to be Google’s interest in processing and extracting patterns from the vast amounts of picture and video data stored on its Google and YouTube servers. Learning an Atari game on one’s own requires a processing rate of 2 million pixels per second. The work could also feed into autonomous cars and other long-term projects that Google is interested in.
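The quoted processing rate is easy to sanity-check. Assuming the Atari 2600’s standard frame size of 160 × 210 pixels and a refresh rate of 60 frames per second (both assumptions about the setup, not stated in the article):

```python
# Back-of-the-envelope check on the "2 million pixels per second" figure.
WIDTH, HEIGHT, FPS = 160, 210, 60   # assumed Atari frame size and frame rate

pixels_per_frame = WIDTH * HEIGHT            # 33,600 pixels per frame
pixels_per_second = pixels_per_frame * FPS   # 2,016,000 pixels per second

print(pixels_per_second)  # prints 2016000, i.e. roughly 2 million
```

So watching the raw screen really does mean digesting about two million pixels every second.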
The age of the robots may just be upon us.