Depends on the definition

it's about machine learning, data science and more



Understanding text data with topic models

This is the first post of my series about understanding text data sets. In practice, you often want and need to know what is going on in your data. In this post, we will focus on applying a Latent Dirichlet allocation (LDA) topic model to the “Quora Insincere Questions Classification” data set on Kaggle.
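As a rough sketch of the idea (not the post's actual code), LDA can be fit on a bag-of-words representation of the documents. Here scikit-learn's `LatentDirichletAllocation` stands in, on a tiny toy corpus instead of the Quora data:

```python
# Minimal LDA sketch: toy corpus, 2 latent topics (assumed values for illustration)
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "machine learning models learn from data",
    "deep learning uses neural networks",
    "cats and dogs are popular pets",
    "dogs love to play fetch",
]

# Bag-of-words counts, the usual input for LDA
vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(docs)

# Fit a 2-topic model; n_components is the number of latent topics
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)  # shape: (n_docs, n_topics), rows sum to 1
```

Each row of `doc_topics` is a distribution over topics for one document, which is exactly the "what is going on in my data" summary a topic model provides.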

Image segmentation with test time augmentation with keras

In the last post, I introduced the U-Net model for segmenting salt deposits in seismic images. This time, we will see how to improve the model with data augmentation, and especially test time augmentation (TTA). You will learn how to…
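The core TTA loop can be sketched in a few lines (this is an illustration, not the post's code): augment the input, predict on each augmented copy, undo the augmentation on each prediction, and average. The `predict` function below is a hypothetical stand-in for a trained segmentation model:

```python
import numpy as np

def predict(image):
    # Hypothetical stand-in for model.predict, returning a "mask"
    return image * 0.5

def tta_predict(image, predict_fn):
    # Augmentations and their inverses; flips are their own inverse
    augments = [lambda x: x, np.fliplr, np.flipud]
    inverses = [lambda x: x, np.fliplr, np.flipud]
    # Predict on each augmented copy, map each prediction back, then average
    preds = [inv(predict_fn(aug(image))) for aug, inv in zip(augments, inverses)]
    return np.mean(preds, axis=0)

img = np.random.rand(8, 8)
mask = tta_predict(img, predict)
```

Averaging over several views of the same image smooths out prediction noise, which is why TTA typically gives a small but consistent boost for segmentation.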

U-Net for segmenting seismic images with keras

Today I’m going to write about a Kaggle competition I started working on recently. I will show you how to approach the problem using the U-Net neural network architecture in keras. In the TGS Salt Identification Challenge, you are asked…
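A minimal U-Net sketch in keras looks like the following. The filter counts, depth, and 128×128 input are assumptions for illustration, not the post's actual configuration; what matters is the encoder–decoder shape with skip connections:

```python
# Tiny U-Net sketch: encoder downsamples, decoder upsamples, skip connections
# concatenate encoder features into the decoder at matching resolutions.
from tensorflow import keras
from tensorflow.keras import layers

def build_unet(input_shape=(128, 128, 1)):
    inputs = keras.Input(shape=input_shape)
    # Encoder
    c1 = layers.Conv2D(16, 3, activation="relu", padding="same")(inputs)
    p1 = layers.MaxPooling2D(2)(c1)
    c2 = layers.Conv2D(32, 3, activation="relu", padding="same")(p1)
    p2 = layers.MaxPooling2D(2)(c2)
    # Bottleneck
    b = layers.Conv2D(64, 3, activation="relu", padding="same")(p2)
    # Decoder with skip connections
    u2 = layers.concatenate([layers.UpSampling2D(2)(b), c2])
    c3 = layers.Conv2D(32, 3, activation="relu", padding="same")(u2)
    u1 = layers.concatenate([layers.UpSampling2D(2)(c3), c1])
    c4 = layers.Conv2D(16, 3, activation="relu", padding="same")(u1)
    # One sigmoid channel: per-pixel salt probability
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(c4)
    return keras.Model(inputs, outputs)

model = build_unet()
```

The output has the same spatial size as the input, so the model predicts a salt/no-salt probability for every pixel.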

Guide to word vectors with gensim and keras

Today, I will tell you what word vectors are, how you create them in python, and finally how you can use them with neural networks in keras. For a long time, NLP methods have used a vector space model to represent words…

How to build a smart product: Transfer Learning for Dog Breed Identification with keras

This time I will show you how to build a simple “AI” product with transfer learning. We will build a “dog breed identification chat bot”. In this first post, I will show how to build a good model using keras…
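The transfer-learning setup can be sketched as follows. MobileNetV2 stands in for whatever backbone the post uses, and the 120 classes match the Stanford Dogs breed count; `weights=None` is used here only to keep the sketch offline, whereas real transfer learning would pass `weights="imagenet"`:

```python
# Transfer-learning sketch: frozen pretrained backbone + new classification head
from tensorflow import keras
from tensorflow.keras import layers

base = keras.applications.MobileNetV2(
    input_shape=(128, 128, 3), include_top=False, weights=None
)
base.trainable = False  # freeze the (pretrained) feature extractor

# New head: pool backbone features, then classify into 120 dog breeds
model = keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(120, activation="softmax"),
])
```

Only the small head is trained, which is what makes transfer learning practical on a modest breed-identification data set.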

A strong baseline to classify toxic comments on Wikipedia with fasttext in keras

This time we’re going to discuss a current machine learning competition on Kaggle. In this competition, you’re challenged to build a model that’s capable of detecting different types of toxicity in comments from Wikipedia’s talk page edits. I will show you how to create a strong baseline using python and keras.
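A fasttext-style baseline in keras is very short: embed the tokens, average the embeddings, and classify. The vocabulary size, sequence length, and embedding dimension below are assumed values for illustration; the six sigmoid outputs match the competition's six toxicity labels:

```python
# fasttext-style baseline: Embedding -> average pooling -> multi-label sigmoid
from tensorflow import keras
from tensorflow.keras import layers

max_words, max_len = 20000, 200  # assumed vocabulary size and sequence length

model = keras.Sequential([
    keras.Input(shape=(max_len,)),
    layers.Embedding(max_words, 50),       # one 50-d vector per token
    layers.GlobalAveragePooling1D(),       # average word vectors, as fasttext does
    layers.Dense(6, activation="sigmoid"), # one sigmoid per toxicity type
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```

Because a comment can carry several toxicity labels at once, the head uses independent sigmoids with binary cross-entropy rather than a softmax.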

© 2019 Depends on the definition
