Browsing Tag: Bayesian

Paper: ‘EpiLPS: A Fast and Flexible Bayesian Tool for Estimation of the Time-Varying Reproduction Number’ | by Antoine Soetewey | Oct, 2022

How to smooth epidemic curves and estimate the time-varying reproduction number in a flexible way

Plot by author

A colleague (and friend) of mine recently published a research paper entitled “EpiLPS: A fast and flexible Bayesian tool for estimation of the time-varying reproduction number” in PLoS Computational Biology. I am not in the habit of sharing research papers to which I did not contribute. Nevertheless, I would like to make an exception for this one because I strongly believe that the method developed in the paper…

Bayesian Regression Using PyMC3. How to implement Bayesian Regression in… | by Egor Howell | Sep, 2022

How to implement Bayesian Regression in Python using the PyMC3 package

Photo by Joachim Schnürle on Unsplash

PyMC3 (now simply PyMC) is a Bayesian modelling package that enables us, as Data Scientists, to carry out Bayesian inference easily. Under the hood, PyMC3 uses Markov Chain Monte Carlo (MCMC) to approximate the posterior distribution. This method is quite complex and would require a whole other article to fully cover, so I have linked a great post here that explains the topic very well. You…
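As a taste of the workflow the article walks through, here is a minimal PyMC3 linear-regression sketch on synthetic data; the priors, variable names, and data are my own illustrative choices, not the article's.

```python
import numpy as np
import pymc3 as pm
import arviz as az

# Synthetic data: y = 1 + 2x + noise
rng = np.random.default_rng(42)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=100)

with pm.Model() as model:
    # Weakly informative priors on the regression parameters
    intercept = pm.Normal("intercept", mu=0, sigma=10)
    slope = pm.Normal("slope", mu=0, sigma=10)
    sigma = pm.HalfNormal("sigma", sigma=1)

    # Likelihood of the observed data
    pm.Normal("y_obs", mu=intercept + slope * x, sigma=sigma, observed=y)

    # MCMC sampling of the posterior (NUTS by default)
    trace = pm.sample(2000, tune=1000, return_inferencedata=True)

print(az.summary(trace))
```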

Bayesian Regression From Scratch. Deriving Bayesian Linear Regression… | by Egor Howell | Sep, 2022

Deriving Bayesian Linear Regression from first principles using Python

Photo by Klim Musalimov on Unsplash

Linear Regression is the most well-known algorithm in Data Science; however, there is more than one version of it. The version most people use comes from the Frequentist interpretation of statistics, but there is another that comes from the Bayesian school of thought. In this article, we will go over Bayes’ theorem, the difference between Frequentist and Bayesian statistics and finally carry out Bayesian Linear…
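For a preview of where such a derivation lands, here is a small NumPy sketch of the closed-form posterior for the weights under a Gaussian prior and known noise precision (the standard conjugate result); the data and precision values are illustrative assumptions, not taken from the article.

```python
import numpy as np

# Bayesian linear regression with a Gaussian prior w ~ N(0, alpha^-1 I)
# and Gaussian noise of known precision beta. The posterior is Gaussian with
#   S_N = (alpha*I + beta*X^T X)^{-1}   and   m_N = beta * S_N X^T y
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(-3, 3, size=50)])  # bias + one feature
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=50)

alpha = 1.0            # prior precision on the weights
beta = 1.0 / 0.5**2    # noise precision (assumed known here)

S_N = np.linalg.inv(alpha * np.eye(X.shape[1]) + beta * X.T @ X)  # posterior covariance
m_N = beta * S_N @ X.T @ y                                        # posterior mean

print("posterior mean of the weights:", m_N)
print("posterior covariance:\n", S_N)
```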

Not-so-naive Bayes. Improved Bayesian classifier.

Improve the simple Bayesian classifier by relaxing its naive assumption

Despite being very simple, naive Bayes classifiers tend to work decently in some real-world applications, famously document classification and spam filtering. They don’t need much training data and are very fast. As a result, they are often adopted as simple baselines for classification tasks. What many don’t know is that we can make them much less naive by using a simple trick. Naive Bayes is a simple probabilistic algorithm that makes use of the…
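One common way to drop the independence assumption for continuous features is to fit a full covariance matrix per class instead of a diagonal one, which is what quadratic discriminant analysis does; the sketch below contrasts the two, though the article's own trick may differ.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

X, y = load_breast_cancer(return_X_y=True)

# GaussianNB assumes conditionally independent features (diagonal covariance);
# QDA fits a full covariance per class, i.e. a "less naive" Gaussian Bayes classifier.
for name, clf in [("Gaussian naive Bayes", GaussianNB()),
                  ("QDA (full covariance)", QuadraticDiscriminantAnalysis())]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```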

New method to identify symmetries in data using Bayesian statistics

Examples of colored graphs designating symmetries of four-dimensional data: vertices and edges of the same color and shape in a graph are mapped to each other by a symmetry permutation preserving the structure of the data. Credit: Hideyuki Ishi, Osaka Metropolitan University

An international research team led by scientists from Osaka Metropolitan University has developed a method to identify symmetries in multi-dimensional data…

Bayesian A/B Testing in R. Analyze social media performance with… | by Hannah Roos | Sep, 2022

Analyze social media performance with Bayesian vs. Frequentist statistics

Photo by Adem AY on Unsplash

Professionals have always wondered how they can improve their products and services. Marketing practitioners usually ask how modifications to their website would change online purchases so as to maximize sales. Similarly, scientists test different drug versions against each other to identify the most effective one for future patients. Every time comparable options need to be tested against each other, we can run a little…
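The article works through the Bayesian side in R; as a rough, language-agnostic illustration of the same Beta-Binomial idea, here is a Python sketch that estimates the probability that one variant converts better than another (the counts are made up).

```python
import numpy as np

# Beta-Binomial A/B test: posterior probability that variant B converts better
# than variant A. The click / impression counts below are hypothetical.
rng = np.random.default_rng(1)
clicks_a, n_a = 120, 1000
clicks_b, n_b = 145, 1000

# Beta(1, 1) prior  ->  Beta(1 + successes, 1 + failures) posterior
post_a = rng.beta(1 + clicks_a, 1 + n_a - clicks_a, size=100_000)
post_b = rng.beta(1 + clicks_b, 1 + n_b - clicks_b, size=100_000)

print("P(rate_B > rate_A) is approximately", (post_b > post_a).mean())
```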

A Guide to Find the Best Boosting Model using Bayesian Hyperparameter Tuning but without Overfitting.

With boosted decision tree algorithms such as XGBoost, CatBoost, and LightBoost, you may outperform other models, but overfitting is a real danger. Learn how to split the data, optimize hyperparameters, and find the best-performing model without overtraining it using the HGBoost library.

Image from the author.

Gradient boosting techniques have gained much popularity in recent years for classification and regression tasks. An important part is the tuning of hyperparameters to achieve the best predictive performance. This requires…
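The general recipe the article describes, a train/validation/test split with Bayesian (TPE) search driven only by the validation score, can be sketched without the HGBoost wrapper. The example below uses hyperopt and XGBoost directly, with made-up search ranges; it is not the library's own API.

```python
import numpy as np
from hyperopt import fmin, tpe, hp, Trials
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
# Hold out a test set first, then carve a validation set out of the training data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X_train, y_train, test_size=0.25, random_state=0)

space = {
    "max_depth": hp.quniform("max_depth", 3, 10, 1),
    "learning_rate": hp.loguniform("learning_rate", np.log(0.01), np.log(0.3)),
    "n_estimators": hp.quniform("n_estimators", 50, 300, 25),
}

def objective(params):
    model = XGBClassifier(max_depth=int(params["max_depth"]),
                          learning_rate=params["learning_rate"],
                          n_estimators=int(params["n_estimators"]))
    model.fit(X_tr, y_tr)
    # Minimize negative validation AUC; the test set stays untouched during tuning.
    return -roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])

best = fmin(objective, space, algo=tpe.suggest, max_evals=25, trials=Trials())
print("best hyperparameters found:", best)
```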

How Bayesian statistics works in updating probabilities | by Giovanni Organtini | Jul, 2022

How experiments update knowledge, leading to accurate probability estimates

Dice-players and a bird-seller gathered around a stone slab, by the Master of the Gamblers, oil on canvas (public domain image taken from Wikipedia)

By studying a cheat’s winnings, it is possible to find out, by applying Bayes’ Theorem, with what probability he will score points, without knowing his resources. In this story we see how performing some “experiments” makes it possible to correctly estimate the probability of an event happening, even if we do…
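As a toy illustration of the kind of updating the article describes, the sketch below starts from a 50/50 prior over two hypotheses about a die (fair vs. loaded towards six) and updates it roll by roll with Bayes' theorem; the hypotheses, likelihoods, and observations are entirely made up.

```python
# Toy Bayes update: prior over two hypotheses about a die, updated roll by roll.
# All numbers (prior, likelihoods, observations) are made up for illustration.
p_fair, p_loaded = 0.5, 0.5                    # prior probabilities
lik_six = {"fair": 1 / 6, "loaded": 1 / 2}     # P(rolling a six | hypothesis), assumed

rolls = [True, True, False, True, True]        # True means a six was observed
for six in rolls:
    l_fair = lik_six["fair"] if six else 1 - lik_six["fair"]
    l_loaded = lik_six["loaded"] if six else 1 - lik_six["loaded"]
    evidence = p_fair * l_fair + p_loaded * l_loaded
    p_fair, p_loaded = p_fair * l_fair / evidence, p_loaded * l_loaded / evidence
    print(f"observed six={six}: P(loaded die) = {p_loaded:.3f}")
```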

Bayesian Conjugate Priors Simply Explained | by Egor Howell | Aug, 2022

A computationally effective way of carrying out Bayesian statistics

Photo by Heather Gill on Unsplash

In Bayesian statistics, a prior is conjugate when the posterior and prior belong to the same family of distributions. This property allows for simpler calculation of the posterior, making Bayesian inference a lot easier. In this article, we will gain an in-depth view of the conjugate prior. We will show the need for it, derive an example from first principles and finally apply it to a real-world problem. Bayes’…
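The classic example of this convenience is the Beta-Binomial pair: a Beta(a, b) prior on a success probability combined with k successes in n trials gives a Beta(a + k, b + n - k) posterior in closed form. A minimal sketch, with illustrative numbers of my own:

```python
from scipy import stats

# Beta-Binomial conjugacy: Beta(a, b) prior + k successes in n trials
# gives a Beta(a + k, b + n - k) posterior, so no numerical integration is needed.
a, b = 2, 2          # prior pseudo-counts (illustrative)
k, n = 27, 100       # observed successes and trials (illustrative)

posterior = stats.beta(a + k, b + n - k)
print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```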

Mango: A new way to do Bayesian optimization in Python | by Carmen Adriana Martinez Barbosa | Aug, 2022

All you need to know about this library for scalable hyperparameter tuning of machine learning models

Photo by Kvistholt Photography on Unsplash

The optimization of model hyperparameters (or model settings) is perhaps the most important step in training a machine learning algorithm, as it leads to finding the hyperparameter values that minimize your model’s loss. This step is also essential for building generalizable models that are not prone to overfitting. The best-known techniques for optimizing model hyperparameters are…
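To give a flavour of how a Mango search looks, here is a small sketch based on my recollection of the project's README (a Tuner driven by an objective that scores a batch of candidate configurations); treat the exact names and signatures as assumptions and check them against the current Mango documentation.

```python
from mango import Tuner  # pip install arm-mango; API names assumed from the README
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Search space: one range or distribution per hyperparameter
param_space = {"n_estimators": range(50, 301, 50),
               "max_depth": range(3, 11)}

def objective(params_batch):
    # Mango (as I recall) passes a batch (list) of candidate configurations
    # and expects one score per candidate; here we return mean 3-fold CV accuracy.
    return [cross_val_score(RandomForestClassifier(**params), X, y, cv=3).mean()
            for params in params_batch]

tuner = Tuner(param_space, objective)
results = tuner.maximize()
print(results["best_params"], results["best_objective"])
```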