The demo in #121 is awesome and shows strong results that can be improved, but it:

- is using a substantially obsolete version of our NLP model;
- could be extended with larger input sizes to demonstrate the O(n) complexity timing advantage, especially with a side-by-side comparison against a conventional multi-head self-attention transformer (see the sketch below).
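Something like the following could serve as a starting point for that comparison. It is only a rough sketch: `LinearAttention` here is a generic kernelized linear-attention stand-in (not our model; it should be swapped for the actual implementation from the demo), and the dimensions, sequence lengths, and PyTorch usage are assumptions for illustration.

```python
# Hypothetical timing sketch: wall-clock time of standard multi-head
# self-attention vs. a generic linear-attention stand-in as sequence
# length grows. Swap `LinearAttention` for the project's real model.
import time
from typing import Callable

import torch
import torch.nn as nn


class LinearAttention(nn.Module):
    """Generic kernelized linear attention (placeholder, not the project's model)."""

    def __init__(self, dim: int, heads: int = 8):
        super().__init__()
        self.heads = heads
        self.to_qkv = nn.Linear(dim, dim * 3, bias=False)
        self.out = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, n, d = x.shape
        h = self.heads
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        q, k, v = (t.reshape(b, n, h, d // h).transpose(1, 2) for t in (q, k, v))
        q, k = q.softmax(dim=-1), k.softmax(dim=-2)
        # O(n) trick: aggregate keys/values first, then apply to queries.
        context = torch.einsum("bhnd,bhne->bhde", k, v)
        out = torch.einsum("bhnd,bhde->bhne", q, context)
        return self.out(out.transpose(1, 2).reshape(b, n, d))


def time_forward(fn: Callable[[torch.Tensor], torch.Tensor], x: torch.Tensor, reps: int = 5) -> float:
    """Average forward-pass time in seconds after one warm-up call."""
    with torch.no_grad():
        fn(x)  # warm-up
        start = time.perf_counter()
        for _ in range(reps):
            fn(x)
    return (time.perf_counter() - start) / reps


if __name__ == "__main__":
    dim, batch = 256, 4
    mha = nn.MultiheadAttention(dim, num_heads=8, batch_first=True)
    lin = LinearAttention(dim, heads=8)
    for n in (512, 1024, 2048, 4096, 8192):
        x = torch.randn(batch, n, dim)
        t_quad = time_forward(lambda inp: mha(inp, inp, inp)[0], x)
        t_lin = time_forward(lin, x)
        print(f"n={n:5d}  multi-head attention: {t_quad * 1e3:7.1f} ms   linear: {t_lin * 1e3:7.1f} ms")
```

Printing the two timings side by side over increasing n should make the quadratic-vs-linear scaling obvious without needing any plotting dependencies.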