jimfan 
posted an update Jan 26
My TED talk is finally live!! I proposed the recipe for the "Foundation Agent": a single model that learns how to act in different worlds. LLMs scale across lots and lots of texts. The Foundation Agent scales across lots and lots of realities. If it is able to master 10,000 diverse simulated realities, it may well generalize to our physical world, which is simply the 10,001st reality.

Why do we want a single Foundation Agent instead of many smaller models? I'd like to quote an idea from my friend Prof. Yuke Zhu's CoRL keynote talk. If we trace each AI field's evolution, we'd find this pattern:

Specialist -> Generalist -> Specialized Generalist

And the "specialized generalist" is often way more powerful than the original specialist. Just like distilled versions of LlaMA are way better than custom-built NLP systems 5 years ago.

TED talks do not have teleprompters!! All I have is a "confidence monitor" at my feet, showing only the current slide and a timer. That means I had to memorize the whole speech. It sounds intimidating at first, but it turns out to be the best way to connect with the audience and deliver the ideas straight to the heart.

Happy to share my slides with all of you! https://drive.google.com/file/d/1NSY6MxMu3OPQ4U6hx0OxPq7EQB5XTcAG/view?usp=sharing

The video is only 10 min. I promise it's well worth your time!
https://www.ted.com/talks/jim_fan_the_next_grand_challenge_for_ai

Very cool!

Very nice!

This is great, I want to repost this @jimfan even though we don't have the feature yet 🤣