What are your thoughts on AGI?
Is it even possible? If there's even the slightest possibility, what would a post-AGI world look like? Would love to know everyone's take on this!
They have really super hyped it. I know it's good and all, but by no means are we going to remove human intervention from any product. I've been using GPT-4 for the past 2 months, and I don't see any value other than it helping me write my SOP, which is just filled with a lot of difficult words. I gave it a 200-word YAML file to convert to JSON, and it made soup of everything.
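For what it's worth, YAML-to-JSON conversion doesn't need an LLM at all; a deterministic parser gets it right every time. A minimal Python sketch, assuming the third-party PyYAML package is installed (the `yaml_text` sample is made up for illustration):

```python
import json

import yaml  # third-party: pip install pyyaml (assumption, not stdlib)

# A small illustrative YAML document (hypothetical example data).
yaml_text = """
service:
  name: demo
  replicas: 2
  tags:
    - web
    - backend
"""

# Parse the YAML into plain Python objects (dicts, lists, scalars) ...
data = yaml.safe_load(yaml_text)

# ... then serialize those objects as JSON.
json_text = json.dumps(data, indent=2)
print(json_text)
```

Unlike an LLM, this round-trips the structure exactly, since both formats map onto the same dict/list/scalar model.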
If AGI comes, there's a very high probability we're doomed. The OpenAI board was probably right to be deeply concerned about this, and that's why they did what they did. However, Satya and Sam's greed put them in a situation where they're damned if they do and damned if they don't.
The letter from former employees to the board that was posted on Gist (Elmo confirmed its validity) confirmed the concerns that, among other things, they were going after unrestricted AGI.
History will probably not look kindly on this.
Did you see the letter?
Hey yes, Elmo posted it on his Twitter.
Here you go: https://web.archive.org/web/20231121225252/https://gist.github.com/Xe/32d7bc436e401f3323ae77e7e242f858
Goodbye earth 😶
I thought losing my job to AI would be the worst but I think we might just lose our lives too
That's the market cap of Saudi Arabian Oil Co, or roughly Google + Amazon + Apple combined.
Let's say, if (a big if) AI becomes self-sufficient enough to do the job of a backend developer: it's able to write basic CRUD apps, it can handle simple system design, and it slowly keeps getting better at debugging and such. (It should ...