Reproduce the Adam, AdamW, and Adafactor optimizers in PyTorch, and introduce popular optimizers used in LLM training; a minimal Adam update sketch follows below.
Updated Mar 24, 2025 - Python
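As an illustration of what "reproducing" such an optimizer involves, here is a minimal sketch of the Adam update rule written against the standard PyTorch tensor API. This is an assumed, simplified example for the topic, not the repository's actual code; class and variable names (MinimalAdam, m, v) are hypothetical.

```python
# Minimal Adam sketch (illustrative only, not the repo's implementation).
import torch

class MinimalAdam:
    """Bare-bones Adam: exponential moving averages of gradients and squared gradients."""

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
        self.params = list(params)
        self.lr, self.betas, self.eps = lr, betas, eps
        # Per-parameter state: step count, first moment m, second moment v.
        self.state = {p: {"step": 0,
                          "m": torch.zeros_like(p),
                          "v": torch.zeros_like(p)} for p in self.params}

    @torch.no_grad()
    def step(self):
        b1, b2 = self.betas
        for p in self.params:
            if p.grad is None:
                continue
            s = self.state[p]
            s["step"] += 1
            # Update biased first and second moment estimates.
            s["m"].mul_(b1).add_(p.grad, alpha=1 - b1)
            s["v"].mul_(b2).addcmul_(p.grad, p.grad, value=1 - b2)
            # Bias-correct, then apply the parameter update.
            m_hat = s["m"] / (1 - b1 ** s["step"])
            v_hat = s["v"] / (1 - b2 ** s["step"])
            p.add_(m_hat / (v_hat.sqrt() + self.eps), alpha=-self.lr)

    def zero_grad(self):
        for p in self.params:
            p.grad = None
```

AdamW differs mainly in applying weight decay directly to the parameters (decoupled from the gradient), while Adafactor replaces the full second-moment matrix with factored row/column statistics to reduce memory, which is why it is popular for large-model training.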