✨ feat: add mask tensor for attention
AAAkater pushed 1 commit to main • 72e9c0a…459466a • 6 days ago
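The commit above adds a mask tensor for attention. The repo's code isn't shown here, so this is only a minimal NumPy sketch of the usual technique: a lower-triangular (causal) boolean mask, applied by setting disallowed score positions to negative infinity before softmax. The function names `causal_mask` and `masked_scores` are illustrative, not taken from the repo.

```python
import numpy as np

def causal_mask(seq_len: int) -> np.ndarray:
    # Lower-triangular boolean mask: position i may attend only to j <= i.
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def masked_scores(scores: np.ndarray, mask: np.ndarray) -> np.ndarray:
    # Masked-out positions get -inf so that softmax assigns them zero weight.
    return np.where(mask, scores, -np.inf)
```

Applying softmax row-wise to the masked scores then gives each position a distribution over only its visible (past and current) positions.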
✨ feat: refactor Transformer model to integrate Encoder and Decoder, …
AAAkater pushed 2 commits to main • 2e16ad0…72e9c0a • 11 days ago
🐞 fix: set residual connections before layer norm
AAAkater pushed 1 commit to main • b058edb…2e16ad0 • 11 days ago
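This fix (and the similar one two pushes earlier) concerns the ordering of the residual connection and layer normalization inside a Transformer sublayer. The feed doesn't show which variant the repo settled on, so here is a hedged NumPy sketch of both standard orderings: Post-LN (add the residual, then normalize, as in the original Transformer paper) and Pre-LN (normalize first, add the residual after the sublayer). All names are illustrative.

```python
import numpy as np

def layer_norm(x: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    # Normalize each row to zero mean and unit variance over the feature axis.
    mu = x.mean(-1, keepdims=True)
    var = x.var(-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def post_ln(x, sublayer):
    # Post-LN: residual add happens first, then layer norm.
    return layer_norm(x + sublayer(x))

def pre_ln(x, sublayer):
    # Pre-LN: layer norm happens first, residual add comes after the sublayer.
    return x + sublayer(layer_norm(x))
```

Pre-LN is often preferred for training stability in deep stacks, while Post-LN matches the original architecture; the commit messages suggest the repo switched between the two orderings.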
AAAkater pushed 2 commits to main • 594ad99…b058edb • 13 days ago
🐞 fix: use the correct order of layer norm and residual connection
AAAkater pushed 2 commits to main • 03aeb75…594ad99 • 13 days ago
✨ feat: add dataset and custom data
AAAkater pushed 1 commit to main • c946494…03aeb75 • 18 days ago
✨ feat: add transformer config
AAAkater pushed 1 commit to main • 5421706…c946494 • 20 days ago
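The commit above adds a transformer config. Its actual fields aren't visible in the feed, so the dataclass below is purely a hypothetical sketch of the kind of hyperparameters such a config typically carries; every field name and default here is an assumption, except the 0.1 dropout rate, which matches the rate set in the dropout commit further down the history.

```python
from dataclasses import dataclass

@dataclass
class TransformerConfig:
    # Hypothetical field names and defaults; not taken from the repo.
    d_model: int = 512      # embedding / hidden size
    n_heads: int = 8        # attention heads (must divide d_model)
    n_layers: int = 6       # encoder/decoder depth
    d_ff: int = 2048        # feed-forward inner dimension
    dropout: float = 0.1    # matches the rate set in the dropout commit below
    max_seq_len: int = 512  # maximum sequence length
```

Keeping hyperparameters in one dataclass lets every block, embedding, and layer be constructed from a single object instead of loose arguments.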
AAAkater pushed 1 commit to main • ea5db98…5421706 • on Mar 7
✨ feat: add block,embedding and layers
AAAkater pushed 1 commit to main • 37f2a4b…ea5db98 • on Mar 7
AAAkater pushed 1 commit to main • 09ba441…37f2a4b • on Mar 6
AAAkater pushed 1 commit to main • e9318a2…09ba441 • on Mar 4
🦄 refactor: move to utils
AAAkater pushed 1 commit to main • 2057bae…e9318a2 • on Jan 4
🦄 refactor: set dropout rate 0.1
AAAkater pushed 5 commits to main • a62ceff…2057bae • on Jan 4
AAAkater pushed 1 commit to main • 95ff739…a62ceff • on Dec 28, 2024
✨ feat: add self attention
AAAkater pushed 1 commit to main • 72ee4d0…95ff739 • on Dec 16, 2024
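The earliest commit in the feed adds self-attention. As with the other entries, the repo's implementation isn't shown, so this is a minimal single-head, scaled dot-product sketch in NumPy under assumed names: project the input to queries, keys, and values, scale the dot-product scores by √d_k, softmax, and mix the values.

```python
import numpy as np

def softmax(x: np.ndarray, axis: int = -1) -> np.ndarray:
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x: np.ndarray, w_q: np.ndarray,
                   w_k: np.ndarray, w_v: np.ndarray) -> np.ndarray:
    # Single-head scaled dot-product self-attention on a (seq, d_model) input.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d_k)   # (seq, seq) attention logits
    return softmax(scores) @ v          # weighted mix of value vectors
```

A real model would add the mask from the later commit (masking the scores before softmax), multiple heads, and an output projection.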