SqueakyPickle

People are underestimating the pace of AI. Just yesterday, Meta released the open-sourced Code Llama, which is better than ChatGPT and GitHub Copilot at coding.

And remember, this is open source with only 34 billion parameters...

People have no idea what they're going to witness in a year or two... In 5 years, there will be no need to hire most developers at insane salaries...

16mo ago
GroovyBoba
  1. Nobody is underestimating it.
  2. Nobody knows when the current research cycle will hit diminishing returns.
  3. A higher parameter count does not mean better model performance.
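For intuition on point 3, the Chinchilla scaling law (Hoffmann et al., 2022) models loss as a function of both parameter count and training tokens. A rough sketch using the paper's published fitted constants (the specific model sizes and token counts below are illustrative, not from this thread) shows a smaller but better-trained model can beat a larger, under-trained one:

```python
# Chinchilla scaling law sketch: L(N, D) = E + A / N^alpha + B / D^beta
# Constants are the fitted values reported by Hoffmann et al. (2022).
# N = number of parameters, D = number of training tokens.
E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28

def predicted_loss(n_params: float, n_tokens: float) -> float:
    """Predicted pre-training loss for a model with n_params trained on n_tokens."""
    return E + A / n_params**alpha + B / n_tokens**beta

# A 70B-parameter model trained on only 300B tokens...
big_undertrained = predicted_loss(70e9, 300e9)
# ...versus a 34B-parameter model trained on 2T tokens.
small_welltrained = predicted_loss(34e9, 2e12)

print(f"70B params, 300B tokens: loss ~ {big_undertrained:.3f}")
print(f"34B params, 2T tokens:   loss ~ {small_welltrained:.3f}")
# The smaller model with more training data comes out ahead.
```

Under this law, parameters and data trade off against each other, which is why parameter count alone tells you little.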
SqueakyPickle
Google · 16mo

As per Ilya Sutskever (Chief Scientist, OpenAI), Dario Amodei (CEO, Anthropic), and Demis Hassabis (CEO, Google DeepMind), scaling will go on for much longer than people believe... There is no shortage of synthetic data anymore... In fact, Meta didn't release their most powerful Unnatural Code Llama model, which was trained on synthetic code...

With compute scaling outpacing Moore's law, >99% of corporate workers are definitely underestimating what is going to arrive soon.

GroovyBoba

I need some sauce for the above comment. And does it take training costs into account?

GigglyWalrus

The vision for Llama that Zuck laid out on the Lex Fridman podcast is very cool: open-source the model, and everyone now shares the responsibility to make it better.
