>his AI models aren't boosted, bagged, folded, quantized, distilled, pruned, in their lane, reduced, democratized, minimum sample setted, moisturized, regularized, blended and stacked
Do you guys even want a tech job? How long until AI can write all the other AIs anyway? I only used AI for like 100% of this, but I had to keep copying and pasting different stuff or something, I don't remember. Fuckers made me install conda for Python 3.10 and add it to my PATH variable (didn't fucking work). Anyhoo, number go up. Or down.
=== Initial Model ===
RMSE: 0.1297
R^2: 0.9630
=== Float32 Quantized Model ===
RMSE: 0.1296
R^2: 0.9630
Memory usage before: 266.73 KB
Memory usage after: 137.82 KB
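The post doesn't show the code, so here's a minimal sketch of what float32 quantization buys you. It uses a plain NumPy array as a hypothetical stand-in for the model's float64 parameters (tree ensembles don't expose a one-line dtype switch, but the storage math is the same): halving the bytes per value cuts serialized size roughly in half while the values barely move.

```python
import pickle

import numpy as np

# Hypothetical stand-in for a trained model's float64 parameters.
rng = np.random.default_rng(42)
weights = rng.normal(size=(16_000,))  # float64 by default

kb_before = len(pickle.dumps(weights)) / 1024
weights_q = weights.astype(np.float32)  # 8 bytes/value -> 4 bytes/value
kb_after = len(pickle.dumps(weights_q)) / 1024

# Quantization error: how far each value moved when rounded to float32.
max_err = float(np.max(np.abs(weights - weights_q.astype(np.float64))))
print(f"before: {kb_before:.2f} KB, after: {kb_after:.2f} KB, max error: {max_err:.2e}")
```

That near-zero error is why the quantized RMSE above is almost identical to the initial model's.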
=== Knowledge Distillation ===
RMSE: 0.1684
R^2: 0.9375
Teacher model size: 136.59 KB
Student model size: 55.73 KB
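A hedged sketch of the distillation step, assuming scikit-learn with a `make_regression` stand-in dataset and guessed hyperparameters (the post names neither): a big random forest teacher is trained on the labels, then a single small tree student is trained on the teacher's predictions instead of the labels, trading some accuracy for a much smaller model.

```python
import pickle

from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Hypothetical dataset; the post doesn't say what was actually used.
X, y = make_regression(n_samples=1500, n_features=8, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

teacher = RandomForestRegressor(n_estimators=40, random_state=0).fit(X_tr, y_tr)

# Distillation: the student fits the teacher's *predictions*, not the raw
# labels, so it inherits the teacher's smoothed view of the target.
student = DecisionTreeRegressor(max_depth=8, random_state=0)
student.fit(X_tr, teacher.predict(X_tr))

teacher_kb = len(pickle.dumps(teacher)) / 1024
student_kb = len(pickle.dumps(student)) / 1024
rmse = mean_squared_error(y_te, student.predict(X_te)) ** 0.5
r2 = r2_score(y_te, student.predict(X_te))
```

The size drop with a modest R^2 hit matches the shape of the numbers above.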
=== Minimum Samples per Leaf and Split ===
RMSE: 0.1495
R^2: 0.9508
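For this step, presumably the forest's `min_samples_leaf` / `min_samples_split` knobs; a sketch under the same assumed scikit-learn setup and made-up values (5 and 10 are illustrative, not from the post). Forcing leaves and splits to cover more rows stops trees from memorizing single points, so the trees end up with fewer nodes.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical stand-in dataset.
X, y = make_regression(n_samples=1500, n_features=8, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

default = RandomForestRegressor(n_estimators=40, random_state=0).fit(X_tr, y_tr)

# Larger minimums -> fewer nodes -> smaller, more regularized trees.
pruned = RandomForestRegressor(
    n_estimators=40,
    min_samples_leaf=5,    # each leaf must hold at least 5 training rows
    min_samples_split=10,  # a node needs at least 10 rows before it may split
    random_state=0,
).fit(X_tr, y_tr)

nodes_default = sum(est.tree_.node_count for est in default.estimators_)
nodes_pruned = sum(est.tree_.node_count for est in pruned.estimators_)
r2 = r2_score(y_te, pruned.predict(X_te))
```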
=== Reduced Feature Set ===
RMSE: 0.1492
R^2: 0.9509
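One plausible way to get a reduced feature set, again assuming scikit-learn and stand-in data (the post doesn't say which features were dropped or how): rank features by the full model's `feature_importances_`, keep the top few, and retrain on just those columns.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical data: 12 features, only 4 of which actually matter.
X, y = make_regression(n_samples=1500, n_features=12, n_informative=4,
                       noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

full = RandomForestRegressor(n_estimators=40, random_state=0).fit(X_tr, y_tr)

# Keep only the features the full model actually leaned on.
top = np.argsort(full.feature_importances_)[-4:]
reduced = RandomForestRegressor(n_estimators=40, random_state=0)
reduced.fit(X_tr[:, top], y_tr)
r2 = r2_score(y_te, reduced.predict(X_te[:, top]))
```

Fewer columns means smaller trees and faster inference, and if the dropped features were mostly noise, accuracy barely moves (as in the numbers above).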
=== Stacking Ensemble Teacher ===
RMSE: 0.1457
R^2: 0.9532
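A sketch of a stacking teacher, assuming scikit-learn's `StackingRegressor` with illustrative base models (the post doesn't list which estimators went into the stack): the base models' cross-validated predictions become the input features of a final meta-model, which learns how to weight them.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Hypothetical stand-in dataset.
X, y = make_regression(n_samples=1200, n_features=8, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base models' out-of-fold predictions become features for the final model.
stack = StackingRegressor(
    estimators=[
        ("forest", RandomForestRegressor(n_estimators=30, random_state=0)),
        ("tree", DecisionTreeRegressor(max_depth=6, random_state=0)),
    ],
    final_estimator=Ridge(),
)
stack.fit(X_tr, y_tr)
r2 = r2_score(y_te, stack.predict(X_te))
```

A stronger teacher like this gives the distilled student better targets to imitate, which is presumably why it's labeled a "teacher" here.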
=== Stacking Ensemble Teacher With Adaptive Boosting ===
RMSE: 0.1440
R^2: 0.9543
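Same stack, with an AdaBoost base learner added; again a hedged scikit-learn sketch with made-up estimators and depths. Adaptive boosting refits shallow trees on reweighted data so each new tree concentrates on the rows the ensemble currently gets most wrong, which gives the stack a differently-biased base model to blend in.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import (AdaBoostRegressor, RandomForestRegressor,
                              StackingRegressor)
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Hypothetical stand-in dataset.
X, y = make_regression(n_samples=1200, n_features=8, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# AdaBoost: each shallow tree is fit on data reweighted toward the rows
# the previous trees predicted worst.
boosted_stack = StackingRegressor(
    estimators=[
        ("forest", RandomForestRegressor(n_estimators=30, random_state=0)),
        ("ada", AdaBoostRegressor(DecisionTreeRegressor(max_depth=4),
                                  n_estimators=30, random_state=0)),
    ],
    final_estimator=Ridge(),
)
boosted_stack.fit(X_tr, y_tr)
r2 = r2_score(y_te, boosted_stack.predict(X_te))
```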