
Failed in Data Science Interview? Here Are Some Tips to Help You Succeed | by Sharan Kumar Ravindran | Jul, 2022



Based on 10,000+ hours of experience and many failures!

Photo by Brett Jordan on Unsplash

First of all, if you have recently failed an interview, remember: failure is not the end. It is an opportunity to learn and to perform better the next time. Here is one of my favorite quotes on overcoming failure.

I’ve missed more than 9000 shots in my career. I’ve lost almost 300 games. Twenty-six times I’ve been trusted to take the game-winning shot and missed. I’ve failed over and over and over again in my life. And that is why I succeed.

— Michael Jordan

I have failed in many data science job interviews, and I continue to fail at many things. My early failures were very frustrating and even made me doubt my capabilities, but I eventually found success in data science. This article is not to boast about that success; on the contrary, it is about the learnings from my failures. The most frustrating thing is to see people quit when they are very close to success. I have seen many people invest a lot of time in learning data science, but after a few failures they start to doubt their capability and quit. If you have ever failed a data science interview, or if you are interested in learning from others' failures, I promise this article will not disappoint you.

Some people find success within a few attempts; others take longer. The key is to be patient and keep focusing on learning. Below is an article about a person who took 475 job applications and 6 months to find success.

It doesn’t matter how many times you have failed. The most important thing is having a growth mindset. In this article, I am going to talk about important areas to focus on to crack data science interviews.

The first few minutes are the most important part of the interview. As per this article, 33% of bosses make their decision in the first 90 seconds of the interview.

In most interviews, the first question asked is “Tell me about yourself”. Be well prepared for this question; you can leverage it to steer the interview discussion in your favor. A simple formula for answering it:

  • Talk about your current role and projects
  • Talk about your experience and achievements
  • Talk about your plans and how this role is aligned with your career goals

Keep your response under 2 minutes and practice it a few times. Do not simply repeat the contents of your resume; your interviewer will already have a copy of it. Here is an interesting article by Seek about preparing for this question.

Most people hardly spend any time preparing for this question; the importance of a good response is often underestimated and taken for granted.

It is not possible to predict every interview question, but you can prepare for the ones most likely to be asked, and this is certainly one of those questions. You will be able to make an impressive start if you are well prepared for it.

Once you have made a good impression in the first few minutes, the main thing to focus on is not giving the interviewer reasons to reject you. One way to ensure that is by not making any mistakes on the data science basics.

You need to be familiar with the basic concepts across the entire life cycle of a data science project. Here are some references to concepts behind key steps common in most data science projects. It is very important to revise these topics before your interview.

Feature engineering

Data is rarely clean enough to be used directly by models; it almost always requires some refinement. Interviewers are usually keen to check whether you know the common feature engineering techniques and understand when to use each of them. Below is an amazing resource that clearly explains the fundamentals of feature engineering, the rationale behind these techniques, and their benefits.

Below is a link to a Kaggle Notebook that implements the frequently used feature engineering techniques.
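As a minimal sketch (on a made-up toy dataset, not from the linked notebook), three of the most common techniques, median imputation, min-max scaling, and one-hot encoding, look like this in pandas:

```python
import pandas as pd

# Toy dataset: a numeric column with a missing value and a categorical column.
# All column names and values here are invented for illustration.
df = pd.DataFrame({
    "age": [25, 32, None, 41],
    "city": ["NY", "SF", "NY", "LA"],
})

# 1. Impute the missing numeric value with the column median.
df["age"] = df["age"].fillna(df["age"].median())

# 2. Min-max scale the numeric feature into the [0, 1] range.
df["age_scaled"] = (df["age"] - df["age"].min()) / (df["age"].max() - df["age"].min())

# 3. One-hot encode the categorical feature into indicator columns.
df = pd.get_dummies(df, columns=["city"], prefix="city")

print(df.columns.tolist())
```

The right combination of techniques depends on the data and the model, which is exactly the kind of trade-off interviewers probe.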

Feature selection

After feature engineering, the next obvious step is feature selection. We rarely end up using all the features; feature selection is an important step that helps us identify the critical features and drop the unimportant ones. Below is an amazing resource to learn and revise the techniques used for feature selection under different circumstances.
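As one illustration (on synthetic data, not from the linked resource), scikit-learn's `SelectKBest` scores each feature with a univariate statistical test (here the ANOVA F-test) and keeps only the top-scoring ones:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
n = 200

# Two informative features plus three pure-noise features.
informative = rng.normal(size=(n, 2))
noise = rng.normal(size=(n, 3))
X = np.hstack([informative, noise])
y = (informative[:, 0] + informative[:, 1] > 0).astype(int)

# Keep the 2 features with the strongest univariate relationship to y.
selector = SelectKBest(score_func=f_classif, k=2)
selector.fit(X, y)
print(selector.get_support())  # boolean mask over the 5 columns
```

In an interview, being able to say why you used a filter method like this versus a wrapper or embedded method is worth more than the code itself.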

Knowing the algorithms

The minimum expertise required here differs depending on the role and the job level. It is not enough to simply name the algorithms that can solve a particular problem; it is important to understand which algorithms are best suited for different scenarios. Below is an interesting article that explains different algorithms, their advantages and disadvantages, and the scenarios in which each could be used.

Tree-based ensemble models are very popular in many real-life use cases and often outperform traditional regression algorithms. Below is another in-depth article that explains tree-based ensemble models, covering their strengths, weaknesses, and parameters, along with a comparative study.
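A quick way to see why (a sketch on scikit-learn's synthetic two-moons data, not a real-world benchmark) is to compare a linear model with a tree-based ensemble on a non-linear problem:

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A small non-linearly-separable dataset.
X, y = make_moons(n_samples=500, noise=0.25, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# A linear model can only draw a straight decision boundary...
linear = LogisticRegression().fit(X_train, y_train)
# ...while a tree ensemble can carve out the curved regions.
forest = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_train, y_train)

print("logistic regression accuracy:", linear.score(X_test, y_test))
print("random forest accuracy:     ", forest.score(X_test, y_test))
```

On data like this the ensemble usually scores noticeably higher, which is the kind of "when would you prefer which algorithm" reasoning interviewers look for.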

Model evaluation

Generally, while developing a solution, we try several algorithms and then select the one that performs best. Model evaluation metrics are used to measure the performance of these models. Here is an ultimate guide on the different evaluation techniques that can be used for different problems.
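For a classification problem, a handful of hypothetical predictions is enough to compute the standard metrics with scikit-learn (the labels below are made up for illustration):

```python
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

# Hypothetical ground-truth labels and model predictions.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

acc = accuracy_score(y_true, y_pred)    # fraction of correct predictions
prec = precision_score(y_true, y_pred)  # of predicted positives, how many are real
rec = recall_score(y_true, y_pred)      # of real positives, how many were found
f1 = f1_score(y_true, y_pred)           # harmonic mean of precision and recall

print(acc, prec, rec, f1)
```

Knowing which metric to optimize (e.g. recall for fraud detection, precision for spam filtering) matters more in interviews than the formulas themselves.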

Other concepts frequently asked in interviews

Apart from the above topics, here are some more topics to be revised. Questions on these topics are often asked in the interviews.

  • Dimensionality reduction: This technique is commonly used when the dataset has too many features. Fewer features mean lower memory usage, likely shorter computation time, and often less noise in the dataset. To learn about the advantages and the different techniques in detail, check the below article.
  • Regularization: This is used to prevent over-fitting. Lasso and Ridge are the most commonly used regularization techniques. One key difference between them is that Ridge retains all the features while shrinking their coefficients, whereas Lasso can drop unimportant features by driving their coefficients to zero. To revise both techniques, check the below article.
  • Bias and Variance: Bias and variance are very handy for studying model outcomes. They trade off against each other: reducing one typically increases the other, so it is generally not possible to have a model with both very low bias and very low variance. Finding the right balance between them is an art. To revise the topic, refer to the below article.
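The Ridge-versus-Lasso difference above can be seen directly on synthetic data (the features and coefficients below are invented for illustration): Lasso zeroes out the irrelevant features, while Ridge only shrinks them.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n = 100
X = rng.normal(size=(n, 5))
# The target depends only on the first two features;
# the remaining three are pure noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=n)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.5).fit(X, y)

print("ridge coefficients:", np.round(ridge.coef_, 3))  # all small but non-zero
print("lasso coefficients:", np.round(lasso.coef_, 3))  # noise features exactly zero
```

This is why Lasso doubles as a feature selection method, a point interviewers often ask as a follow-up.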

Projects are an important aspect of our resume and the job interview; many interview discussions are driven by project details. Over time, our memory might fail us and we could forget granular details of a project, for example the data quality issues present in the data or the limitations of the project environment. While these may seem trivial in the larger scheme of things, they could have been the reason behind some key project decisions, and such details can have an extraordinary impact on your interviews. Hence, I always recommend documenting each project: the techniques and models used and the rationale behind them.

These project notes need not be huge documents; a huge document doesn’t help much with interview preparation. For interview purposes, keep the notes minimal, with no more than a couple of bullet points per question. Below are some questions that can be used as pointers:

  • A brief 1–2 mins summary of the project
  • The key goal of the project
  • Team and skillsets
  • The key decisions and the rationale behind them
  • Technical and non-technical issues
  • Steps involved in the project
  • Technology and tools details and rationale for using them
  • Impact of the project
  • Improvement/Changes that could have been done

This will not just be useful for your interview preparation; it will also help you develop a deeper understanding of your own projects.

If you are new to data science, you need to first focus on building a good project portfolio. A generic Titanic project alone is unlikely to help you crack an interview; aim for projects that stand out.

Not knowing enough about the role or the company raises red flags with the interviewer. Researching the company and the job profile also helps improve your confidence during the general discussion with your interviewer, and it sends a clear signal of your preparedness.

Some places where you can get to know the role and the company better are:

  • Company’s website
  • Search for recent news about the company
  • Check out their blogs
  • Check their social media pages
  • Search and learn about their competitors
  • Research about the industry in general
  • Go through the employee reviews about the work culture and interview process on Glassdoor

Failing an interview is OK as long as you learn something from it. After the interview, spend some time thinking retrospectively about your responses and write notes on the ones that could have been better. This will help you be better prepared when a similar question is asked again. Another way to learn about your performance is to ask the recruiter directly for feedback.

One way to be better prepared is to discuss the questions with a friend or a mentor; they could point out details that you were possibly unaware of. Or you could ask a more experienced professional for a mock interview. Mock interviews help improve your confidence, can be very helpful in managing your stress levels before the real interview, and can also help you come up with the right strategy.

Check whether your resume needs any updates based on the feedback you have received. Also, get your resume reviewed by your friends in data science or, even better, by a mentor. There are some amazing techniques for writing a good resume; below is a link to one such article on powerful techniques for writing highly impactful resumes.

Mentors play a very important role in one’s career development. They don’t just provide feedback; they also offer emotional support and give you access to their network. Networking can be of great help in getting to know about interesting job opportunities.

Finally, here are some quick tips for your interview preparation.

  • Be yourself! It will make you more comfortable
  • Not knowing enough about the interviewing company is a grave mistake
  • Be ready to answer why you want to join the company
  • Be ready to discuss your salary expectations
  • Be prepared to ask questions about the role and company
  • Don’t talk about concepts, techniques, and algorithms that you are not confident about
  • Have a mentor who is 3–5 years ahead of you in their career
  • Maintain eye contact while answering questions

