The article mentions AI. 16 gigs feels far too little to run an LLM of respectable size, so I wonder what exactly this means. Feels like no one is gonna be happy with a 16-gig LLM (high RAM usage and bad AI features).
No, 16 gigs is OK for an LLM
On your GPU
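Rough napkin math (illustrative only, assuming common parameter counts and quantization levels; real usage also needs headroom for the KV cache and activations):

```python
# Back-of-envelope estimate of LLM weight size at different quantizations.
# Numbers are assumptions for illustration, not a statement about any specific model.

def weights_gib(params_billion: float, bits_per_param: float) -> float:
    """Approximate size of model weights in GiB."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 2**30

for params in (7, 13, 70):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit: ~{weights_gib(params, bits):.1f} GiB")
```

By that estimate a 7B model at 4-bit is a few GiB of weights, which fits comfortably in a 16 GiB card; a 70B model does not.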
Of fucking course it’s AI, why the hell wouldn’t it be AI. For fuck’s sake, it’s like they want their users to switch to Linux.