>>106686926
>i wanted to train some chroma loras but if it's as faggy as you say it is then there's no point.
That's what I'm experimenting with right now since it lets me use 1024 res with batch 4. Might be worth testing the automatic settings it gives too, e.g. cosine with actual restarts. Right now I'm testing:
{
"engine": "kohya",
"unetLR": 0.0003,
"clipSkip": 1,
"loraType": "lora",
"keepTokens": 0,
"networkDim": 32,
"numRepeats": 1,
"resolution": 1024,
"lrScheduler": "constant",
"minSnrGamma": 5,
"noiseOffset": 0.1,
"targetSteps": 2415,
"enableBucket": true,
"networkAlpha": 64,
"optimizerType": "AdamW8Bit",
"textEncoderLR": 0,
"maxTrainEpochs": 30,
"shuffleCaption": false,
"trainBatchSize": 4,
"flipAugmentation": false,
"lrSchedulerNumCycles": 1
}
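If the restarts end up mattering, a minimal variant of the above would just swap the scheduler keys, something like this (assuming the trainer accepts "cosine_with_restarts" as the scheduler name like kohya's sd-scripts does; the cycle count of 3 is just a guess to test against):
{
"lrScheduler": "cosine_with_restarts",
"lrSchedulerNumCycles": 3
}
Everything else stays the same, so any difference in the bakes should come down to the scheduler alone.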