Positional Embedding: The Secret behind the Accuracy of Transformer Neural Networks
An article explaining the intuition behind the "positional embedding" in transformer models from the renowned research paper "Attention Is All You Need".

Table of Contents

- Introduction
- Concept of embedding in NLP
- Need for positional embedding in Transformers
- Various types of initial trial-and-error experiments
- Frequency-based positional embedding
- Conclusion
- References

Introduction

The introduction of the transformer architecture in the field of deep learning has undoubtedly paved the way for a silent revolution, especially in the…