Improving TabTransformer Part 1: Linear Numerical Embeddings | by Anton Rubert | Oct, 2022
Deep learning for tabular data with FT-Transformer

In the previous post about TabTransformer, I described how the model works and how it can be applied to your data. This post builds on it, so if you haven't read it yet, I highly recommend starting there and returning to this post afterwards.

TabTransformer was shown to outperform traditional multi-layer perceptrons (MLPs) and to come close to the performance of Gradient Boosted Trees (GBTs) on some datasets. However, there is one…
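To make the idea in the title concrete before diving in, here is a minimal sketch of what a linear numerical embedding does: each scalar feature gets its own learnable weight and bias vector, so a value x_j is mapped to the d-dimensional vector x_j * w_j + b_j. This is my own numpy illustration of the general technique, not the exact implementation discussed in the post; the array names and sizes are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_features, embed_dim = 3, 8                   # hypothetical sizes
W = rng.normal(size=(n_features, embed_dim))   # one weight vector per feature
b = rng.normal(size=(n_features, embed_dim))   # one bias vector per feature

x = rng.normal(size=(4, n_features))           # a batch of 4 samples

# Broadcast each scalar feature against its own weight vector:
# (4, n_features, 1) * (n_features, embed_dim) -> (4, n_features, embed_dim)
embeddings = x[:, :, None] * W + b

print(embeddings.shape)  # (4, 3, 8): one embed_dim vector per feature per sample
```

In a real model, W and b would be trainable parameters, so the network can learn a separate projection for every numerical column instead of feeding raw scalars into the transformer.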