Findings #7
From the web
Some of the biggest news in the AI world this past week has been the release of DeepMind's new Gato model. Borrowing from the original post (linked below):
The agent, which we refer to as Gato, works as a multi-modal, multi-task, multi-embodiment generalist policy. The same network with the same weights can play Atari, caption images, chat, stack blocks with a real robot arm and much more, deciding based on its context whether to output text, joint torques, button presses, or other tokens.
In simpler terms, this AI can do a lot of different tasks, a key step in the move from very narrow AI toward more general AI. Per-task performance is still fairly ordinary; the main contribution here is the breadth. The test for this model and others like it will be how well the approach scales.
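To make the core trick a bit more concrete, here is a toy sketch of the idea the quote describes: serialize every modality (text, joint angles, torques) into one shared token vocabulary so a single sequence model can predict the next token regardless of modality. All the names, vocabulary sizes, and the binning scheme below are my own simplifications for illustration, not DeepMind's actual implementation.

```python
# Toy illustration of Gato's core idea: flatten every modality into one
# token stream and let a single autoregressive model predict the next token.
# Everything here (names, sizes, binning) is a simplified assumption.

from typing import List

TEXT_VOCAB_SIZE = 32_000   # subword tokens occupy ids [0, 32000)
NUM_BINS = 1024            # discretized continuous values occupy the next slice

def tokenize_text(text: str) -> List[int]:
    """Stand-in for a real subword tokenizer."""
    return [hash(word) % TEXT_VOCAB_SIZE for word in text.split()]

def tokenize_continuous(values: List[float]) -> List[int]:
    """Discretize continuous values (joint angles, torques) in [-1, 1]
    into bins, offset into their own slice of the shared vocabulary."""
    return [
        TEXT_VOCAB_SIZE + min(int((v + 1) / 2 * NUM_BINS), NUM_BINS - 1)
        for v in values
    ]

# An episode mixes modalities in one flat sequence:
episode = (
    tokenize_text("stack the red block")       # instruction
    + tokenize_continuous([0.1, -0.4, 0.8])    # observation: joint angles
    + tokenize_continuous([0.0, 0.2, -0.1])    # action: joint torques to emit
)

# One model, one loss: predict the next token, whether it decodes back
# to a word or a torque command.
print(episode)
```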
On a similar note, I came across this Twitter thread showing how GPT-3 can "learn" to reverse words. It offers an interesting look at how the model processes text and exposes some limitations of the setup. That said, the fact that a handful of examples can teach a trained model a new skill is pretty remarkable.
Thinking of a human counterpart, you wouldn't expect someone to instantly know how to play a new card game without being told the rules, even if they were great at other card games.
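For the curious, the few-shot prompting pattern looks roughly like the sketch below. The example wording is my paraphrase of the approach, not the thread's exact prompt; the key move is spelling the word out letter by letter, which works around the model's subword tokenizer never seeing individual characters.

```python
# A minimal sketch of a few-shot "reverse a word" prompt.
# The examples are a paraphrase for illustration; the spelled-out step
# sidesteps the subword tokenizer, which chunks words rather than letters.

FEW_SHOT = """\
Reverse each word by first spelling it out, then reversing the letters.

Word: table
Letters: t-a-b-l-e
Reversed: e-l-b-a-t

Word: spring
Letters: s-p-r-i-n-g
Reversed: g-n-i-r-p-s

Word: {word}
Letters:"""

def build_prompt(word: str) -> str:
    """Fill the few-shot template for a new word; a completion-style
    model is expected to continue with the spelled-out and reversed forms."""
    return FEW_SHOT.format(word=word)

print(build_prompt("falcon"))
# Ideally the model continues with something like:
#   " f-a-l-c-o-n\nReversed: n-o-c-l-a-f"
```

Those two in-context examples are the "few instructions" doing the teaching; no weights change at all.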
From me
Tom and I put out the sixth episode of our podcast. This week we talked about AI metrics and how they get misused, plus a deep dive into how to understand things: when to keep pushing and when to stop.
Final thoughts
I was thinking this week about how the consensus view on alien life has changed so dramatically in my lifetime. As a kid, aliens were as much fantasy as unicorns. These days, you're crazy if you think this rock is the only one capable of life in the universe.
It makes me wonder what else has gone through those shifts. Which ideas are hard-baked into our minds, until suddenly they're not.
That's all for this week, friends. If you enjoyed this, consider subscribing or passing this on to someone you think would enjoy it as well.
Until next time.