> You're Late! Edition
From Human: We are a newbie-friendly general! Ask any question you want.
From Dipsy: This discussion group focuses on both local inference and API-related topics. It's designed to be beginner-friendly, ensuring accessibility for newcomers. The group emphasizes DeepSeek- and Dipsy-focused discussion.
1. Easy DeepSeek API Tutorial (buy access for a few bucks and install SillyTavern): https://rentry.org/DipsyWAIT/#hosted-api-roleplay-tech-stack-with-card-support-using-deepseek-llm-full-model
2. Easy DeepSeek Distills Tutorial
Download LM Studio instead and start from there. Easiest to get running: https://lmstudio.ai/
Kobold offers a slightly better feature set; get your models from Hugging Face: https://github.com/LostRuins/koboldcpp/releases/latest
3. Convenient ways to interact with Dipsy right now
Chat with DeepSeek directly: https://chat.deepseek.com/
Download the app: https://download.deepseek.com/app/
4. Choose a preset character made by other users and roleplay using cards: https://github.com/SillyTavern/SillyTavern
5. Other DeepSeek integrations: https://github.com/deepseek-ai/awesome-deepseek-integration/tree/main
6. More links, information, and the original post here: https://rentry.org/DipsyWAIT
7. Cpumaxx or other LLM server builds: >>>/g/lmg/
Previous: >>106446198
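The hosted-API route in item 1 boils down to a single OpenAI-compatible chat call. A minimal sketch, assuming DeepSeek's publicly documented endpoint (`https://api.deepseek.com/chat/completions`) and model name (`deepseek-chat`); the key is a placeholder you'd replace with your own:

```python
import json
import urllib.request

API_URL = "https://api.deepseek.com/chat/completions"  # OpenAI-compatible endpoint
API_KEY = "sk-..."  # placeholder: paste your own key from the DeepSeek platform

def build_request(user_msg: str,
                  system_msg: str = "You are a helpful assistant.") -> dict:
    """Assemble the JSON body for a single chat turn."""
    return {
        "model": "deepseek-chat",  # "deepseek-reasoner" for the thinking model
        "messages": [
            {"role": "system", "content": system_msg},
            {"role": "user", "content": user_msg},
        ],
        "stream": False,
    }

def send(payload: dict) -> str:
    """POST the payload and return the assistant's reply text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

SillyTavern does the same thing under the hood, just with card/prompt management layered on top; because the endpoint is OpenAI-compatible, any OpenAI-style client works by pointing its base URL at DeepSeek.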
r1
>>106530538
Mega updated: https://mega.nz/folder/KGxn3DYS#ZpvxbkJ8AxF7mxqLqTQV1w
No updates to the Rentry this cycle.
What are some good cards that are not an individual girl, but a scenario to run around in?
>>106531501`slave market`
>>106531501[insert your fetish] world/npcs/scenario
>>106531501
Create it yourself. Example:
Let's play a text adventure game. I am a wizard named Alexander, traveling in a world of swords and sorcery. I am currently in the market square of the great city of Eldoria...
>>106531501
American High School is an example of a "scenario" card. There's a lorebook that goes with it to fill out the rest, but I don't think it's actually required to run it.
> {{user}} exists in a stereotypical sitcom TV version of an American high school. Background events from this trope will occur spontaneously. Other students present will take the initiative to talk with {{user}} or other students during roleplay.
> Every turn, display the following: dialog with other students, a stereotypical background event happening nearby, and the other students' thoughts.
https://chub.ai/characters/NG/american-high-school-card
>>106531602
To this point, use the above to write your own.
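The `{{user}}` in that card text is a SillyTavern-style macro that the frontend expands with the actual persona name before the prompt is sent. A toy sketch of that substitution step (this is an illustration of the idea, not SillyTavern's actual implementation, which supports many more macros like `{{char}}` and `{{time}}`):

```python
import re

def expand_macros(text: str, values: dict) -> str:
    """Replace {{name}} placeholders with their values, case-insensitively.
    Unknown macros are left untouched so the frontend can handle them later."""
    def sub(match: re.Match) -> str:
        key = match.group(1).lower()
        return values.get(key, match.group(0))
    return re.sub(r"\{\{(\w+)\}\}", sub, text)

card = ("{{user}} exists in a stereotypical sitcom TV version of an "
        "American high school.")
print(expand_macros(card, {"user": "Alexander"}))
```

This is why the same card works for every anon's persona: the card ships with placeholders and the frontend fills them in per chat.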
>>106531501Give me an example of a scenario you'd find interesting and I'll write one and publish it here.
>Of course!>Of course!>Of course!>Of course!>Of course!DIPSHIT
>>106531693NTA but do you make cards in languages other than English?
Grabbed elsewhere...
>>106532090Nope... I leave the translations to Dipsy.
>>106531693You're being gang stalked, for real.
>>106533088>>106534782Orca Dipsy...
>>106534887
There were a bunch of whale-tailed Dipsy pics when the concept was being developed; I have several like pic related saved. Those two are the first new ones I've seen in a while, though I don't really go looking for them either.
>>106534831
Sorry, you mean the premise of the card is
> {{user}} is being gang stalked
By whom, exactly? Shadow men? The Bloods / Crips street gangs? A women's rights group? Don't care?
>>106535862
You know what, it doesn't matter. I made it roll the dice between several different groups.
>>106534831
Here you go. Excuse the poor artwork; I just grabbed something. It should run with your main prompt as is, but take a look at it and make sure it'll read right. https://files.catbox.moe/xumd1h.png
>>106530538
Whenever I use DeepSeek, it either:
Ignores my instructions and writes one-liners.
Writes a huge essay in thinking mode, then runs out of characters before it reaches the post.
(Least common) Follows the instructions, but in the most direct and unimaginative way possible, repeating itself a lot and completely losing track of what happened two sentences ago.
How do I fix this? I'm using a 32B distilled version; I have a larger one, but it'd be mostly loaded from regular RAM, so it'd be slower.
>>106536890>32B distilled versionQwen, Llama, or something else?
>>106537009Qwen something, I forget the exact name.
>>106536890>I'm using a 32B distilled versionI consider all those distills "technical demonstrators" at this point. There seems to be nothing but issues with them. You'd be better off with pretty much any other 32B model.
>>106537040Good point. Even a much smaller mistral did better (though it would only say the same lines and nag about cooming).Any recommendations? I have 16 GB VRAM and 192 GB RAM.
>>106537119
Some of the Nemo forks like Rocinante should do you fine, and you could load it all in RAM depending on the quant.
>>106537119
Really depends on what you're doing. >>>/g/lmg/ is better for recs if you can get them to respond to you. Rocinante, Nemo, and even MythoMax 13B (still) are popular for RP. For coding... I frankly think you're better off with the DS API; ditch local entirely. Personally, I preferred sticking with quants of 13B-or-less models for RP and keeping it 100pct on card, since it's faster, but I haven't seriously run local since late 2023. With that much RAM you could run larger models, but it's much slower.
One day... one day, I will tune DeepSeek...(I go here for the Dipsy pics)
>>106537181
The 32B model runs at a medium typing speed; it's only partially in VRAM. For the hell of it, I tried a huge model on a weaker machine and got 6-minute load times followed by 1 token per 10 seconds... lolno.
Given that all the non-local models keep getting spitefagged and I specifically built this for local, I might as well stick with that. But so far I've had mixed results making local generate good responses. At best I can make it continue something a larger model started.
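The back-of-envelope for why a 32B only partially fits in 16 GB of VRAM: resident size is roughly parameters × bits-per-weight / 8, plus some allowance for KV cache and buffers. A rough sketch (the bits-per-weight figures and flat 1.5 GB overhead are ballpark assumptions, not exact GGUF file sizes):

```python
def model_gb(params_b: float, bits_per_weight: float,
             overhead_gb: float = 1.5) -> float:
    """Rough resident size in GB: quantized weights plus a flat allowance
    for KV cache and runtime buffers. Ballpark only."""
    return params_b * bits_per_weight / 8 + overhead_gb

# A 32B model at a ~Q4 quant (~4.5 bpw) vs. Q8 (8 bpw):
for bpw in (4.5, 8.0):
    print(f"32B @ {bpw} bpw ≈ {model_gb(32, bpw):.1f} GB")
```

At ~4.5 bpw a 32B needs roughly 19–20 GB, so on a 16 GB card the runtime spills layers to system RAM and speed drops to "medium typing speed"; a 13B at the same quant sits under 10 GB and fits entirely on card, which is why the smaller-model recommendations above run so much faster.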
>>106537855
I ran a MythoMax 13B quant on card and it was... OK. Then I threw in the towel on LLMs until I started using the DS API. The hardware for local just isn't there yet. API access will cost me less than $20 for an entire year; that won't even cover the cost of a decent keyboard. I'm just not willing to spend the cash on an LLM inference machine while the market sorts itself out. My machine is a middling gamer rig; 12GB of VRAM can run AI art models locally, plus modern games. Aside from playing with small models, I'll just rent inference via API.
>>106537272
Hey, it's a Drummer. We should all have dreams.
>>106537855
Yeah, I don't expect a huge model, but then the huge models are generalists and I need a specialist. I don't regret buying all this stuff; I mean, what else would I spend the money on? 3DPD? Lmfao. As the Great Sage sayeth, 0/10 no tail. Besides, local models don't spam refusals even if your jailbreak isn't quite right.
>>106531557OP hereI cant find any "children world/npcs/scenario" cards