Techno Blender
Digitally Yours.
Browsing tag: Skanda

When Should You Fine-Tune LLMs?. There has been a flurry of exciting… | by Skanda Vivek | May, 2023

The problem of supplying the model with all the information needed to answer a question is offloaded from the model architecture to a database containing document chunks. Relevant documents can then be found by computing similarities between the question and the document chunks. Typically, the chunks and the question are converted into embedding vectors, cosine similarities between the question and each chunk are computed, and only chunks above a certain cosine-similarity threshold are kept as relevant…
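The retrieval step described in the teaser can be sketched in a few lines. This is a minimal illustration with toy 3-dimensional vectors standing in for real embedding-model outputs; the function names and the 0.7 threshold are assumptions, not values from the article.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(question_emb, chunk_embs, threshold=0.7):
    # Return indices of chunks whose similarity to the question
    # exceeds the threshold, most similar first.
    sims = [cosine_similarity(question_emb, c) for c in chunk_embs]
    hits = [i for i, s in enumerate(sims) if s >= threshold]
    return sorted(hits, key=lambda i: sims[i], reverse=True)

# Toy "embeddings"; a real system would use a sentence-embedding model.
question = np.array([1.0, 0.0, 0.0])
chunks = [np.array([0.9, 0.1, 0.0]),   # similar to the question
          np.array([0.0, 1.0, 0.0]),   # unrelated
          np.array([0.8, 0.0, 0.2])]   # similar
print(retrieve(question, chunks))      # → [0, 2]
```

In production the brute-force loop is usually replaced by an approximate nearest-neighbor index, but the scoring logic is the same.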

LLM Economics: ChatGPT vs Open-Source | by Skanda Vivek | Apr, 2023

How much does it cost to deploy LLMs like ChatGPT? Are open-source LLMs cheaper to deploy? What are the tradeoffs?

Cartoon schematic for comparing LLM costs | Skanda Vivek

TL;DR: At lower usage, in the range of thousands of requests per day, ChatGPT works out cheaper than open-source LLMs deployed to AWS. At millions of requests per day, open-source models deployed on AWS work out cheaper. (As of writing this article on April 24th, 2023.) Large Language Models are taking the world by storm. Transformers were introduced in…
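The crossover the teaser describes is simple break-even arithmetic: a pay-per-request API versus a roughly fixed-cost self-hosted deployment. A minimal sketch, with all dollar figures being illustrative assumptions rather than the article's numbers:

```python
# Hypothetical unit costs (assumptions for illustration only):
API_COST_PER_REQUEST = 0.002      # assumed $ per request for a hosted API
SELF_HOSTED_COST_PER_DAY = 150.0  # assumed $ per day for GPU instances

def cheaper_option(requests_per_day):
    # Per-request API billing scales linearly with volume; a self-hosted
    # deployment is (to first order) a fixed daily cost.
    api_cost = requests_per_day * API_COST_PER_REQUEST
    return "api" if api_cost < SELF_HOSTED_COST_PER_DAY else "self-hosted"

print(cheaper_option(5_000))      # low volume: "api"
print(cheaper_option(1_000_000))  # high volume: "self-hosted"
```

The qualitative conclusion matches the article's TL;DR: the fixed cost of self-hosting only pays off once request volume is high enough to amortize it.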

Transformer Models For Custom Text Classification Through Fine-Tuning | by Skanda Vivek | Jan, 2023

A tutorial on how to build a spam classifier (or any other classifier) by fine-tuning the DistilBERT model.

Fine-Tuned SMS Spam Classifier Model Output | Skanda Vivek

The DistilBERT model was released by the folks at Hugging Face as a cheaper, faster alternative to large transformer models like BERT. It was originally introduced in a blog post. The model works by using a teacher-student training approach, where the "student" model is a smaller version of the teacher model. Then, instead of training the student…
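The teacher-student idea mentioned above can be made concrete with a tiny numeric sketch of a distillation loss: the student is trained against the teacher's softened output distribution as well as the hard label. This is only an illustration of the general technique, not the exact DistilBERT training objective (which combines several loss terms); all logits below are made up.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, label, T=2.0, alpha=0.5):
    # Weighted sum of (a) cross-entropy against the soft teacher
    # distribution and (b) cross-entropy against the hard label.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = -np.sum(p_teacher * np.log(p_student))
    hard = -np.log(softmax(student_logits)[label])
    return alpha * soft + (1 - alpha) * hard

# A student that mimics the teacher closely incurs a lower loss
# than one that disagrees with it.
teacher = [3.0, 0.5]
close = distillation_loss([2.8, 0.4], teacher, label=0)
far = distillation_loss([0.0, 2.0], teacher, label=0)
print(close < far)  # True
```

Minimizing a loss of this shape is what lets the smaller student recover most of the teacher's behavior at a fraction of the inference cost.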

Fine-Tune Transformer Models For Question Answering On Custom Data | by Skanda Vivek | Dec, 2022

A tutorial on fine-tuning the Hugging Face RoBERTa QA model on custom data and obtaining significant performance boosts.

Extractive Question Answering | Skanda Vivek

BERT is a transformer model that took the world by storm in 2019. BERT was trained on unlabeled data by masking words and training the model to predict these masked words based on context. BERT was later fine-tuned on multiple tasks and achieved state-of-the-art performance on many specific language tasks. In particular, BERT was fine-tuned on 100k+ question…
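In extractive QA, the model does not generate text; it predicts, for each token of the context, a start logit and an end logit, and the answer is the span with the best combined score. A minimal sketch of that span-selection step, with toy logits standing in for real model outputs:

```python
import numpy as np

def best_span(start_logits, end_logits, max_len=15):
    # Extractive QA decoding: pick the (start, end) token pair with the
    # highest combined score, subject to start <= end and a length cap.
    best, best_score = (0, 0), -np.inf
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

tokens = ["BERT", "was", "released", "in", "2018", "by", "Google"]
start_logits = [0.1, 0.0, 0.2, 0.3, 4.0, 0.1, 0.5]
end_logits   = [0.0, 0.1, 0.0, 0.2, 4.5, 0.2, 1.0]
s, e = best_span(start_logits, end_logits)
print(" ".join(tokens[s:e + 1]))  # → 2018
```

Fine-tuning on custom data, as the tutorial does, adjusts the model that produces these start/end logits; the decoding step stays the same.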

The Future Perfect 50: Skanda Amarnath, executive director of Employ America

In the summer of 2022, President Joe Biden had a problem. Gas prices had been soaring for most of 2021 and 2022, due to a combination of overhang from reduced production during the height of the Covid-19 pandemic and the Russian invasion of Ukraine. And American voters hate when gas prices go up. Biden’s approval rating plunged over his first two years in office. He needed some kind of policy response to address the problem and prevent his party from getting slaughtered in the midterms. The plan he ultimately arrived…

Generating New Carnatic Music Patterns Using LSTM Neural Networks | by Skanda Vivek | Aug, 2022

Carnatic music features rich recurrent patterns built from a few building blocks, making it a rich playground for exploring AI-based music generation. Let's see how an LSTM can generate new basic compositions.

Carnatic music is one of the two Indian classical music forms (the other being Hindustani music). Unlike Western classical music, where compositions are set to a particular key (e.g. Beethoven's Symphony No. 5 in C minor) and feature multiple modes, Carnatic compositions are mostly set to a single distinct mode, known as a Raga.…
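Before any LSTM can be trained on note sequences, the notes must be turned into fixed-size input windows with a next-note target. A minimal sketch of that preprocessing step, assuming a note-by-note generator; the ascending swara sequence below is illustrative, not a composition from the article:

```python
def make_windows(notes, seq_len=4):
    # Slide a fixed-size window over the note sequence so the model
    # learns to predict the next note from the preceding seq_len notes.
    vocab = sorted(set(notes))
    idx = {n: i for i, n in enumerate(vocab)}
    encoded = [idx[n] for n in notes]
    X, y = [], []
    for i in range(len(encoded) - seq_len):
        X.append(encoded[i:i + seq_len])  # input window
        y.append(encoded[i + seq_len])    # next-note target
    return X, y, vocab

# Illustrative ascending scale using Carnatic solfege syllables (swaras).
swaras = ["Sa", "Ri", "Ga", "Ma", "Pa", "Dha", "Ni", "Sa"]
X, y, vocab = make_windows(swaras, seq_len=4)
print(len(X))  # → 4 training windows
```

The integer-encoded windows are what get fed (typically one-hot or embedded) into an LSTM layer; at generation time the trained model is sampled repeatedly, feeding each predicted note back into the window.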

How I Transitioned from Academia To the Data Science Industry | by Skanda Vivek | Jul, 2022

One day I realized it was time for a new adventure. Here's how small, consistent efforts laid the foundations for my next role.

https://pxhere.com/en/photo/1284568

I want to share my experiences in transitioning from academia to data science, as it might benefit many in a similar boat. More broadly, my experiences could also help lay a solid foundation for achieving your goals in transitioning to a completely different field. I've always wanted to be a scientist, probably since I was 10. Since many close family members were…