lidd1ejimmy@lemmy.ml to Memes@lemmy.ml · English · 2 years ago
Offline version of Chat GPT (lemmy.ml)
neidu2@feddit.nl · 2 years ago:
Technically possible with a small enough model to work from. It’s going to be pretty shit, but “working”. Now, if we were to go further down in scale, I’m curious how/if a 700MB CD version would work, or how many 1.44MB floppies you would need for the actual program and smallest viable model.
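The floppy question can be ballparked. Assuming a small ~1b-parameter model at 4-bit quantization (~640MB, roughly TinyLlama-class) plus a guessed ~30MB for a llama.cpp-style runtime, both figures being my assumptions rather than anything from the thread:

```python
import math

FLOPPY_BYTES = 1_474_560  # formatted capacity of a "1.44 MB" floppy (1440 KiB)
model_mb = 640            # assumed: small ~1b model, 4-bit quantized
program_mb = 30           # assumed: inference binary

total_bytes = (model_mb + program_mb) * 1_000_000
floppies = math.ceil(total_bytes / FLOPPY_BYTES)
print(floppies)  # → 455
```

So even the generous end of "smallest viable" lands in the hundreds of disks.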
Naz@sh.itjust.works · 2 years ago:
*squints* That says “PHILIPS DVD+R”. So we’re looking at a 4.7GB model, or just a hair under: the tiniest, most incredibly optimized implementation of <INSERT_MODEL_NAME_HERE>.
curbstickle@lemmy.dbzer0.com · 2 years ago:
llama 3 8b, phi 3 mini, Mistral, moondream 2, neural chat, starling, code llama, llama 2 uncensored, and llava would fit.
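That list can be sanity-checked against a single-layer DVD's 4.7GB. The sizes below are my rough recollections of Ollama's 4-bit-quantized download sizes at the time, not figures from the thread; treat them as ballpark:

```python
# Assumed approximate download sizes in GB (4-bit quantized, per Ollama's
# model library circa 2024) -- illustrative figures, not exact.
SIZES_GB = {
    "llama3:8b": 4.7, "phi3:mini": 2.2, "mistral": 4.1, "moondream": 1.7,
    "neural-chat": 4.1, "starling-lm": 4.1, "codellama": 3.8,
    "llama2-uncensored": 3.8, "llava": 4.7,
}
DVD_GB = 4.7  # single-layer DVD+R capacity

fits = [name for name, size in SIZES_GB.items() if size <= DVD_GB]
print(fits)  # all nine squeak in, though llama3:8b only just
```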
BudgetBandit@sh.itjust.works · 2 years ago:
Just interested in the topic: did you 🔨 offline privately?
curbstickle@lemmy.dbzer0.com · 2 years ago:
I’m not an expert on them or anything, but feel free.
NoiseColor@lemmy.world (banned from community) · 2 years ago:
[Removed by mod]
Ignotum@lemmy.world · 2 years ago:
A 70b model taking 1.5GB? So about 0.02 bytes (0.17 bits) per parameter? Are you sure you’re not thinking of a heavily quantised and compressed 7b model or something? Ollama’s llama3 70b is 40GB from what I can find; that’s a lot of DVDs.
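The arithmetic behind that objection is quick to check (the 40GB figure is the one quoted in the comment; 4.7GB is a single-layer DVD):

```python
import math

params = 70e9        # 70b parameters
claimed = 1.5e9      # the claimed 1.5 GB file
bits_per_param = claimed * 8 / params
print(round(bits_per_param, 2))  # → 0.17, far below any usable quantization

dvd = 4.7e9                          # single-layer DVD capacity in bytes
dvds_needed = math.ceil(40e9 / dvd)  # the ~40 GB llama3 70b download
print(dvds_needed)                   # → 9
```

Even aggressive real-world quantization bottoms out around 2 bits per parameter, an order of magnitude above 0.17.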
NoiseColor@lemmy.world (banned from community) · 2 years ago:
[Removed by mod]
9point6@lemmy.world · 2 years ago:
Less than half of a BDXL though! The dream still breathes.
Steve@startrek.website · 2 years ago:
For some reason, triple-layer writable Blu-ray exists: 100GB each. https://www.verbatim.com/prod/optical-media/blu-ray/bd-r-xl-tl/bd-r-xl-tl/
errer@lemmy.world · 2 years ago:
It is a DVD; you can faintly see “DVD+R” on the left side.
DannyBoy@sh.itjust.works · 2 years ago:
It does have the label DVD-R.
kindenough@kbin.earth · 2 years ago:
Maybe not all that LLM: https://en.wikipedia.org/wiki/ELIZA
Num10ck@lemmy.world · 2 years ago:
ELIZA was pretty impressive for the 1960s, as a psychotherapy chatbot.
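ELIZA-style chat is just keyword rules plus pronoun reflection, which is why it fits on any medium in this thread. A minimal sketch loosely after Weizenbaum's 1966 program (these two rules are illustrative, not the original script):

```python
import re

# Pronoun reflection table: "I need X" -> reply about "you need X".
REFLECT = {"i": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

# (pattern, response template) pairs, tried in order; last rule is a fallback.
RULES = [
    (r".*\bi need (.*)", "Why do you need {0}?"),
    (r".*\bi am (.*)", "How long have you been {0}?"),
    (r".*", "Please tell me more."),
]

def reflect(text: str) -> str:
    """Swap first/second-person words in the captured fragment."""
    return " ".join(REFLECT.get(w, w) for w in text.lower().split())

def respond(line: str) -> str:
    for pattern, answer in RULES:
        m = re.match(pattern, line, re.IGNORECASE)
        if m:
            return answer.format(*(reflect(g) for g in m.groups()))

print(respond("I need a bigger DVD"))  # → Why do you need a bigger dvd?
```

No model weights at all; the whole "intelligence" is the rule list, which is why it felt so far ahead of its time.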
lidd1ejimmy@lemmy.ml (OP) · 2 years ago:
Yes, I guess it would be a funny experiment for just a local model.
veroxii@aussie.zone · 2 years ago:
pkzip -& a:\chatgpt.zip c:\chatgpt*.*
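PKZIP's `-&` switch spanned an archive across as many floppies as it took. A rough stand-alone equivalent of that spanning step, splitting any file into floppy-sized chunks (the function name and chunk-naming scheme are my own):

```python
import os

FLOPPY_BYTES = 1_474_560  # formatted capacity of a 1.44 MB floppy

def span(path: str, out_prefix: str) -> int:
    """Split `path` into floppy-sized chunk files (like PKZIP's -& spanning).

    Writes out_prefix.001, out_prefix.002, ... and returns the chunk count,
    i.e. the number of floppies you would need.
    """
    count = 0
    with open(path, "rb") as src:
        while chunk := src.read(FLOPPY_BYTES):
            count += 1
            with open(f"{out_prefix}.{count:03d}", "wb") as dst:
                dst.write(chunk)
    return count
```

For a 4.7GB DVD image this returns 3188 chunks, which puts the joke in perspective.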