Paper with Code: You can now run LLMs without Matrix Multiplications
Saw this paper: https://arxiv.org/pdf/2406.02528. In essence, MatMul operations can be completely eliminated from LLMs while maintaining strong performance at billion-parameter scales, and by using an optimised kernel during inference, the model's memory consumption can be reduced by more than 10× compared with unoptimised models. Source: https://x.com/rohanpaul_ai/status/1799122826114330866
Implementation for the MatMul-free LM:
https://github.com/ridgerchu/matmulfreellm
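To get an intuition for how a matmul can disappear: the paper constrains weights to ternary values {-1, 0, +1}, so a dense layer's output can be computed with only additions and subtractions. Here is a minimal NumPy sketch of that idea; the function name `ternary_matmul` is illustrative and not taken from the repo, which uses fused GPU kernels rather than loops like this.

```python
import numpy as np

def ternary_matmul(x, w_ternary):
    """Compute x @ w_ternary using only add/subtract,
    assuming every weight is in {-1, 0, +1}."""
    out = np.zeros((x.shape[0], w_ternary.shape[1]))
    for j in range(w_ternary.shape[1]):
        col = w_ternary[:, j]
        # Add activations where the weight is +1, subtract where it is -1;
        # zero weights contribute nothing, so no multiplies are needed.
        out[:, j] = x[:, col == 1].sum(axis=1) - x[:, col == -1].sum(axis=1)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8))
w = rng.integers(-1, 2, size=(8, 4)).astype(float)

# Matches an ordinary matmul when the weights really are ternary.
assert np.allclose(ternary_matmul(x, w), x @ w)
```

This is just the arithmetic trick; the memory savings quoted above come from storing ternary weights in far fewer bits and from the optimised inference kernel, neither of which this toy sketch models.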
by Sherlock007 (TCS)
Coding will be dead soon
That's not what I'm saying or presuming. But think about the brains behind those supercomputers, the H100s and all, who are making LLMs train faster. GPT-5 could do miracles, and imagine GPT-8 coupled with 1nm GPUs. I still believe not 100% of roles can be replaced, but 60-70% will become redundant. The top 10% will survive and thrive if they ride the AI wave. What do you think, viners?
But this isn't the first time a tech exec has predicted the death of coding.
https://www.tomshardware.com/tech-industry/artificial-intelligence/jensen-huang-advises-against-learning-to-code-leave-it-up-to-ai