TheBigBrother@lemmy.world to Selfhosted@lemmy.world · English · edited 4 months ago
What's the bang-for-the-buck, go-to setup for AI image generation and LLM models?
thirdBreakfast@lemmy.world · English · edited 4 months ago
An M1 MacBook with 16 GB cheerfully runs llama3:8b, outputting about 5 words a second. A second-hand MacBook like that probably costs half to a third of a second-hand RTX 3090.
It must suck to be a bargain-hunting gamer. First Bitcoin, and now AI.
edit: a letter
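For anyone wanting to try that setup: the llama3:8b tag is Ollama's naming convention, so a minimal sketch assuming a local Ollama install (default port, model already pulled) and its Python client might look like the following; the prompt and the words-per-second estimate are illustrative only, not part of the original comment.

```python
# Rough sketch: stream llama3:8b from a local Ollama server and estimate
# output speed. Assumes `pip install ollama` and `ollama pull llama3:8b`.
import time

import ollama

prompt = "Explain in two sentences why an 8B-parameter model fits in 16 GB of RAM."

start = time.time()
words = 0

# stream=True yields chunks as they are generated, so you can watch the
# roughly "5 words a second" output described above.
for chunk in ollama.chat(
    model="llama3:8b",
    messages=[{"role": "user", "content": prompt}],
    stream=True,
):
    piece = chunk["message"]["content"]
    print(piece, end="", flush=True)
    words += len(piece.split())  # rough count; chunks can split words

elapsed = time.time() - start
print(f"\n\n~{words / elapsed:.1f} words/second")
```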
Damage@feddit.it · English · 4 months ago
Patient gamers at least have the Steam Deck option now.