How much storage on an HDD or an SSD would ChatGPT, Grok, or Claude take up?
>>108409972
https://huggingface.co/xai-org/grok-2
500gb for grok 2, safe to assume later versions go up from there.
>>108410025
how did this get leaked? what is the catch to running it locally? is it just as censored as cloud grok?
>>108410368
>what is the catch to running it locally
My guess is the requirement of 8 GPUs with 40GB of memory each
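back-of-envelope on that requirement (assuming it means 8 separate 40GB cards, and using the thread's rough 1GB-per-billion-params-at-8-bit figure below; numbers are illustrative, not from the model card):

```python
# Total VRAM across the stated 8 x 40GB GPUs (assumption: 40GB per card).
gpus = 8
vram_per_gpu_gb = 40
total_vram_gb = gpus * vram_per_gpu_gb
print(total_vram_gb)  # 320 GB total, i.e. room for roughly a 320B-param model at 8-bit
```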
>>108410368
>leaked
xai "open sources" their old models whenever they release a new one.
>>108410396
that sounds kick ass. but given that you put quotation marks around "open source" i'm guessing there are some kind of strings attached? Can you imagine if OpenAI or Anthropic did this.
>>108410394
>My guess is the requirement of 8 GPUs with 40GB of memory each
yep
>>108410440
https://huggingface.co/xai-org/grok-2/blob/main/LICENSE
>>108410394
if you use a low bit quantization you can get it down to about 68.5gb. you'll lose a lot of quality, but you can actually run it on consumer hardware
https://huggingface.co/bartowski/xai-org_grok-2-GGUF
>>108409972
>1752271170135420.jpg
KEK
>>108410440
>Can you imagine if OpenAI did this.
They also provide open source models.
https://huggingface.co/openai/gpt-oss-120b
>>108409972
https://huggingface.co/xai-org/grok-2
>>108410440
they only release the weights of old deprecated models too big for anybody to actually run
>>108411190
useless safetyslopped garbage
>>108409972
About 1GB per billion parameters at 8-bit quant.
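that rule of thumb is just params * bits / 8 bytes, weights only. a minimal sketch (ignores quantization metadata, KV cache, and runtime overhead, so real files run a bit larger):

```python
def weight_size_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate on-disk size of model weights in GB:
    billions of params * bits per param / 8 bits per byte."""
    return params_billion * bits_per_param / 8

# At 8-bit: about 1 GB per billion parameters, per the rule of thumb.
print(weight_size_gb(100, 8))  # 100.0 (GB for a 100B-param model)
# A ~2-bit quant cuts the same model to a quarter of that.
print(weight_size_gb(100, 2))  # 25.0
```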