A guy just fine-tuned Qwen3.6-35B-A3B to imitate Claude Opus 4.7's reasoning style — basically he took Opus's chain-of-thought traces and used them as training data, so the model now "thinks" the same way Opus does, wrapped in <think>...</think> tags.
The wild part is the efficiency. It's a 35B MoE model but only ~3B parameters are active per token, which means you can actually run this thing on a single A100 or H100. No cluster needed.
And it's fully open. Apache 2.0. Weights are public. Training dataset is public. This is essentially reasoning distillation — taking what makes a frontier model good at thinking and compressing it into something accessible.
Not saying it matches Opus on benchmarks. It probably doesn't. But the trajectory is clear — the gap between "I can afford this" and "this is actually good" keeps shrinking.
https://huggingface.co/lordx64/Qwen3.6-35B-A3B-Claude-4.7-Opus-Reasoning-Distilled
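For anyone doing the napkin math on the hardware claims: a minimal sketch, assuming weight storage dominates (KV cache and runtime overhead ignored) and using common community quant bit-widths that are my assumptions, not anything from the model card. The MoE part only saves compute; all 35B weights still have to be resident.

```python
# Back-of-envelope memory estimate for a 35B-parameter MoE model.
# Caveat: only ~3B params are *active* per token (cheap compute),
# but ALL 35B weights must still sit in VRAM/RAM.
# Bit-widths below are typical quant levels, assumed for illustration.

PARAMS = 35e9  # total parameter count

bytes_per_param = {
    "fp16/bf16": 2.0,         # full-precision weights
    "int8": 1.0,              # 8-bit quantization
    "q4 (~4.5 bpw)": 4.5 / 8, # typical 4-bit GGUF-style quant
}

for name, bpp in bytes_per_param.items():
    gib = PARAMS * bpp / 2**30
    print(f"{name:>14}: ~{gib:.0f} GiB for weights alone")
```

fp16 lands around 65 GiB, which is why a single 80 GB A100/H100 works, and a 4-bit quant lands under 20 GiB, which is why a consumer GPU plus 32 GB of system RAM with CPU offload is plausible too, just slow.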
>>108638991
>A guy
you
>imitate Claude Opus
It's been shown that these small model opus finetunes perform worse than their base model on most benchmarks and subjective experience.
>H100
You think you need one of those to run a 35B-A3B?
>Three em-dashes
>>108639192
>You think you need one of those to run a 35B-A3B?
i run that on a 1070 with 32 gb of ddr3 ram
>>108638991
>35b parameters
That's Mac Mini tier, not A100 lol
Also why are you selling this as something noteworthy when people do distills like that all the time?
>>108638991
use a better model to write your post next time and tell it to remove all slop and em-dashes from it
>>108638991
I know what a meter is
I know "para" means "around"
Whats a parameter?
why is he listing the same thinking tags Qwen always used to begin with as some kind of relevant feature?
>>108638991
At least clear the em dashes before spamming your posts (gens) around.
>>108638991
Pajeet
>>108638991
You literally used AI to write this. What a retard.
when will 4chan get emojis? then all these bots can post using them and they'll be even easier to identify. they can already use em-dashes...
>>108638991
4.7 is worse than 4.6
>>108639192
fpbp
op clueless.
>>108640341
it's an imperial measurement for how many paratroopers it takes to solve a problem.
>>108640645
you can use emojis on /sci/ technically, or in filenames
>>108638991
>Just shipped an...
what is it about xitter users that makes them talk like that?
>>108639192
>three em dashes
>And it's fully open. Apache 2.0. Weights are public. Training dataset is public.
This line should have tipped me off. I need to get better.
>>108638991
Buy an ad Rajesh
>>108638991
Claude doesn't send actual reasoning traces.
They are thinking summaries designed to prevent exactly this kind of thing.
>>108644474
it's already been seen on the claude leak