I wrote about the new GLM-4.5 model family yesterday—new open weight (MIT licensed) models from Z.ai in China which, according to their benchmarks, score highly on coding even against models such …
Sure, but it’s still remarkable that a model which outperforms anything that would have needed a data centre to run a year ago can now run on a regular consumer laptop.
I’m just saying a 2.5-year-old laptop is not that old; I’m not sure why we’re specifying that.