AGIcoming

People are underestimating the pace of AI. Just yesterday, Meta open-sourced Code Llama, which is better than ChatGPT and GitHub Copilot.

And remember, this is open source and only 34 billion parameters...

People have no idea what they are going to witness in a year or two... In 5 years, there will be no need to hire most developers at insane salaries...

14mo ago
Elon_Musk
X.com · 14mo
  1. Nobody is underestimating.
  2. Nobody knows when we will hit diminishing returns in the current research cycle.
  3. A higher parameter count does not mean better model performance.
AGIcoming
Google · 14mo

As per Ilya Sutskever (chief scientist, OpenAI), Dario Amodei (CEO, Anthropic), and Demis Hassabis (CEO, Google DeepMind), scaling will go on for much longer than people believe. There is no shortage of synthetic data anymore. In fact, Meta didn't even release their most powerful Unnatural Code Llama model, trained on synthetic code...

With AI compute growth outpacing Moore's law, >99% of corporate workers are definitely underestimating what is going to arrive soon.

Elon_Musk
X.com · 14mo

I need some sauce for the above comment. And does it take training costs into account?

MadRainbow49

The vision for Llama that Zuck laid out on the Lex Fridman podcast is very cool: open-source the model, and everyone now shares the responsibility to make it better.
