Why the future of AI is flexible, reusable foundation models
When learning a new language, the easiest way to get started is with fill-in-the-blank exercises. “It’s raining cats and …” By making mistakes and correcting them, your brain (which linguists agree is hardwired for language learning) starts discovering patterns in grammar, vocabulary, and word order — patterns that can be applied not only to filling in blanks, but also to conveying meaning to other humans (or computers, dogs, etc.). That last bit is important when talking about so-called ‘foundation models,’ one of the hottest (but underreported) topics in artificial intelligence right now. According to a review paper from…
This story continues at The Next Web