Techno Blender
Digitally Yours.
Browsing Tag: Kafritsas

Time-Series Forecasting: Deep Learning vs Statistics — Who Wins? | by Nikos Kafritsas | Apr, 2023

A comprehensive guide on the ultimate dilemma

Created with Stable Diffusion

In recent years, Deep Learning has made remarkable progress in the field of NLP. Time series, also sequential in nature, raise the question: what happens if we bring the full power of pretrained transformers to time-series forecasting? However, some recent papers have scrutinized Deep Learning models. These papers do not present the full picture. Even for NLP cases, some people attribute the breakthrough of GPT models to “more data and…

Deep GPVAR: Upgrading DeepAR For Multi-Dimensional Forecasting | by Nikos Kafritsas | Mar, 2023

Amazon’s new Time-Series Forecasting model

Created with DALLE

What is the most enjoyable thing when you read a new paper? For me, it is the following: imagine a popular model suddenly getting upgraded with just a few elegant tweaks. Three years after DeepAR, Amazon engineers published its revamped version, known as Deep GPVAR (Deep Gaussian-Process Vector Auto-regressive). It is a much-improved version of the original model. Plus, it is open-source. In this article, we discuss:
- How Deep GPVAR works in depth.
- How DeepAR…
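One of Deep GPVAR's key ideas is making a joint distribution over many series tractable with a low-rank-plus-diagonal covariance. The sketch below illustrates that structure in NumPy; the variances and factor loadings are random illustrative values (in the actual model they are produced by the network), so treat this as a conceptual sketch, not Deep GPVAR itself.

```python
import numpy as np

rng = np.random.default_rng(0)

N, r = 1000, 10          # N series, rank-r factor (r << N)

# Per-series variances d_i and factor loadings v_i; here they are
# random illustrative values standing in for network outputs.
d = rng.uniform(0.5, 1.5, size=N)          # diagonal part
V = rng.normal(scale=0.3, size=(N, r))     # low-rank part

# Full covariance: Sigma = diag(d) + V @ V.T
# Materializing it costs O(N^2); the low-rank form costs only O(N * r).
Sigma = np.diag(d) + V @ V.T

# Sampling from N(0, Sigma) without ever building the N x N matrix:
eps_diag = rng.normal(size=N)      # noise for the diagonal part
eps_factor = rng.normal(size=r)    # noise for the r latent factors
sample = np.sqrt(d) * eps_diag + V @ eps_factor
```

The payoff is that `sample` has the same covariance structure as a draw from the full `Sigma`, while only `d` and `V` ever need to be stored or predicted.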

Copulas: An Essential Guide & Applications in Time Series Forecasting | by Nikos Kafritsas | Mar, 2023

What are copula functions and why do we need them

A 3D Gaussian Copula (Image by Author)

Copulas are functions that model the dependency among several distributions. They are mostly used in financial applications like portfolio risk assessment and hedge fund management. They came to prominence in 2008, when it was discovered that quantitative scientists had misused copulas in their calculations, failing to predict significant events. Nonetheless, copulas are still great mathematical tools. One compelling reason for studying…
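To make the idea concrete, here is a minimal Gaussian-copula sketch: draw correlated Gaussians, push each margin through the Gaussian CDF to get uniforms, then apply any inverse CDFs you like. The correlation value and the exponential/normal marginals are arbitrary choices for illustration.

```python
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(42)

# Correlation matrix of the latent Gaussian (the copula parameter).
rho = 0.8
corr = np.array([[1.0, rho],
                 [rho, 1.0]])

# 1) Draw correlated Gaussian samples.
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=10_000)

# 2) Map each margin to uniforms via the Gaussian CDF (this is the copula step).
u = norm.cdf(z)

# 3) Apply arbitrary inverse CDFs to impose the desired marginals,
#    while keeping the dependency structure from step 1.
x1 = expon.ppf(u[:, 0], scale=2.0)   # exponential marginal
x2 = norm.ppf(u[:, 1], loc=5.0)      # shifted normal marginal
```

The samples `x1` and `x2` remain strongly dependent even though one is exponential and the other is Gaussian, which is exactly the separation of marginals from dependency that copulas provide.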

N-BEATS: Time-Series Forecasting with Neural Basis Expansion | by Nikos Kafritsas | Nov, 2022

A Deep Learning model for zero-shot time-series forecasting

Created with DALLE

There’s one thing that makes Time-Series Forecasting special: it was the only area of Data Science where Deep Learning and Transformers didn’t decisively outperform the other models. Let’s use the prestigious Makridakis M-competitions as a benchmark, a series of large-scale challenges that showcase the latest advances in the time-series forecasting area. In the fourth iteration of the competition, known as M4, the winning solution was ES-RNN, a…

DeepAR: Mastering Time-Series Forecasting with Deep Learning | by Nikos Kafritsas | Nov, 2022

Amazon’s autoregressive deep network

Created with Stable Diffusion

A few years ago, time-series models worked on a single sequence only. Hence, if we had multiple time series, one option was to create one model per sequence. Or, if we could “tabularize” our data, we could apply gradient-boosted tree models, which work very well even today. The first model that could natively work on multiple time series was DeepAR, an autoregressive recurrent network developed by Amazon. In this article, we will see how DeepAR works…
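DeepAR's forecasting loop can be sketched without the neural network: sample the next point from a predictive distribution, feed it back in, and repeat, producing many Monte Carlo paths. Below, a hypothetical AR(1) with Gaussian noise stands in for the trained network; the coefficients are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for DeepAR's learned predictive distribution:
# a hypothetical AR(1) process with Gaussian noise.
phi, sigma = 0.9, 0.2

def sample_forecast(last_value, horizon, n_samples):
    """Roll forward autoregressively, feeding each sampled point back in."""
    paths = np.empty((n_samples, horizon))
    y = np.full(n_samples, last_value, dtype=float)
    for t in range(horizon):
        # Sample the next value and condition the following step on it.
        y = phi * y + rng.normal(scale=sigma, size=n_samples)
        paths[:, t] = y
    return paths

paths = sample_forecast(last_value=1.0, horizon=24, n_samples=500)

# Like DeepAR, report quantiles of the sampled paths, not a single point forecast.
p10, p50, p90 = np.quantile(paths, [0.1, 0.5, 0.9], axis=0)
```

The widening gap between `p10` and `p90` over the horizon is the probabilistic output that distinguishes DeepAR-style models from point forecasters.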

Temporal Fusion Transformer: Time Series Forecasting with Deep Learning — Complete Tutorial | by Nikos Kafritsas | Nov, 2022

Create accurate and interpretable predictions

Created with DALLE

According to its authors, Temporal Fusion Transformer outperforms all prominent Deep Learning models for time series forecasting, including a featured Gradient Boosting Tree model for tabular time series data. But what is Temporal Fusion Transformer (TFT), and why is it so interesting? In this article, we briefly explain the novelties of Temporal Fusion Transformer and build an end-to-end project on Energy Demand Forecasting. Specifically, we will cover: how to prepare our…

Whisper: Transcribe & Translate Audio Files With Human-Level Performance | by Nikos Kafritsas | Oct, 2022

An AI model compatible with 97 languages, and how to use it

Photo by DeepMind on Unsplash

It is now clear that the general direction of Deep Learning research has fundamentally changed. A few years ago, the modus operandi of most innovative papers was this: select a dataset X, build and train a novel model on that dataset, and prove your model was the best by reporting SOTA results. But what happens if we test the model on dataset Y? For example: train a powerful ResNet on the ImageNet dataset. The model will excel in…

CLIP: The Most Influential AI Model From OpenAI — And How To Use It | by Nikos Kafritsas | Sep, 2022

Find out how the model works, coding example included

Photo by Maximalfocus on Unsplash

What do the recent AI breakthroughs DALLE and Stable Diffusion have in common? They both use components of CLIP’s architecture. Hence, if you want to grasp how those models work, understanding CLIP is a prerequisite. Besides, CLIP has been used to index photos on Unsplash. But what does CLIP do, and why is it a milestone for the AI community? Let’s dive in! CLIP stands for Contrastive Language-Image Pretraining. CLIP is an open-source,…
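The core mechanic behind CLIP's contrastive pretraining is simple to sketch: an image encoder and a text encoder map inputs into the same embedding space, and matching pairs should have the highest cosine similarity. The embeddings below are random illustrative stand-ins, not real encoder outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for CLIP's two encoders: random "image" embeddings, and
# "text" embeddings constructed to align with their image counterparts.
batch, dim = 4, 512
image_emb = rng.normal(size=(batch, dim))
text_emb = image_emb + 0.1 * rng.normal(size=(batch, dim))

def l2_normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

img = l2_normalize(image_emb)
txt = l2_normalize(text_emb)

# Pairwise cosine similarities: entry [i, j] scores image i against text j.
# During training, CLIP pushes the diagonal up and everything else down.
logits = img @ txt.T

# Zero-shot-style classification: each image picks its best-matching text.
predictions = logits.argmax(axis=1)
```

With aligned pairs, `predictions` recovers the identity matching, which is also how CLIP performs zero-shot classification: compare an image against a set of candidate text prompts and pick the closest one.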

Build a Named Entity Recognition App with Streamlit | by Nikos Kafritsas | Aug, 2022

From building the app to deployment, with code included

NER App with Streamlit, image by author (Source)

In my previous article, we fine-tuned a Named Entity Recognition (NER) model, trained on the wnut_17 dataset. In this article, we show step by step how to integrate this model with Streamlit and deploy it using Hugging Face Spaces. The goal of this app is to tag input sentences per user request in real time. Also, keep in mind that, contrary to trivial ML models, deploying a large language model on Streamlit is tricky. We…

Named Entity Recognition with Deep Learning (BERT) — The Essential Guide | by Nikos Kafritsas | Aug, 2022

From data preparation to model training for NER tasks, and how to tag your own sentences

Photo by Aaron Burden on Unsplash

Nowadays, NLP has become synonymous with Deep Learning. But Deep Learning is not the ‘magic bullet’ for every NLP task. For example, in sentence classification tasks, a simple linear classifier could work reasonably well, especially if you have a small training dataset. However, some NLP tasks flourish with Deep Learning. One such task is Named Entity Recognition (NER): NER is the process of identifying…
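A NER model like the BERT-based one discussed here ultimately emits one BIO tag per token; turning those tags back into entity spans is plain bookkeeping. The sentence and tags below are hand-written examples, not actual model predictions.

```python
# One BIO tag per token: B- starts an entity, I- continues it, O is outside.
tokens = ["Nikos", "Kafritsas", "writes", "about", "NLP", "at", "Amazon"]
tags   = ["B-PER", "I-PER",     "O",      "O",     "O",   "O",  "B-ORG"]

def extract_entities(tokens, tags):
    """Group B-/I- tagged tokens into (entity_text, entity_type) spans."""
    entities, current, current_type = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):                 # a new entity starts
            if current:
                entities.append((" ".join(current), current_type))
            current, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current:   # the current entity continues
            current.append(token)
        else:                                    # outside any entity
            if current:
                entities.append((" ".join(current), current_type))
            current, current_type = [], None
    if current:                                  # flush a trailing entity
        entities.append((" ".join(current), current_type))
    return entities

print(extract_entities(tokens, tags))
# [('Nikos Kafritsas', 'PER'), ('Amazon', 'ORG')]
```

This post-processing step is the same regardless of whether the tags come from BERT, a linear classifier, or hand annotation, which is why the BIO scheme is the common currency of NER datasets like wnut_17.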