slot update_slots: id 0 | task 18657 | new prompt, n_ctx_slot = 100096, n_keep = 0, n_prompt_tokens = 17468
slot update_slots: id 0 | task 18657 | n_past = 4, memory_seq_rm [4, end)
slot update_slots: id 0 | task 18657 | prompt processing progress, n_past = 2052, n_tokens = 2048, progress = 0.117243
slot update_slots: id 0 | task 18657 | n_past = 2052, memory_seq_rm [2052, end)
slot update_slots: id 0 | task 18657 | prompt processing progress, n_past = 4100, n_tokens = 2048, progress = 0.234486
srv params_from_: Chat format: Hermes 2 Pro
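Side note on the log: the progress number is the fraction of this request's prompt actually processed, not n_past / n_ctx_slot. A quick check in Python, assuming (my reading of llama.cpp's server accounting, not confirmed from the source here) that the 4 tokens already in the cache when processing started (n_past = 4) don't count as processed:

# Reproduce the "progress" values from the log above.
n_prompt_tokens = 17468  # from the log
n_cached = 4             # n_past before prompt processing started
for n_past in (2052, 4100):
    processed = n_past - n_cached
    print(f"n_past = {n_past}: progress = {processed / n_prompt_tokens:.6f}")
# prints 0.117243 and 0.234486, matching the log lines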
Is there any way to stop llama.cpp from generating once it's been sent a message from Roo Code?
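My understanding so far is that llama-server cancels a streaming request when the client drops the connection, so in principle stopping comes down to aborting the HTTP request. A minimal sketch of what I mean, against the OpenAI-compatible endpoint; the host, port, and payload are placeholders for whatever Roo Code actually sends, and I'm assuming it streams:

import requests  # assumes: pip install requests

# Hypothetical local llama-server; adjust host/port to your setup.
resp = requests.post(
    "http://127.0.0.1:8080/v1/chat/completions",
    json={
        "messages": [{"role": "user", "content": "write me a long story"}],
        "stream": True,  # streaming is what makes early cancellation possible
    },
    stream=True,
)
for i, chunk in enumerate(resp.iter_lines()):
    if i >= 5:  # pretend the user hit "stop" here
        break
resp.close()  # dropping the connection should make the server cancel the slot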
Does the SillyTavern stop button work with llama-server?
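(From what I've seen, the stop button just aborts the streaming fetch, which is the same disconnect mechanism as in the sketch above, so it should work with llama-server. Not certain, though.)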
Does /g/ still just use llama-server nowadays?