Depends on the definition

It's about machine learning, data science and more

LSTM with attention for relation classification

Once named entities have been identified in a text, we want to extract the relations that exist between them. As indicated earlier, we will typically be looking for relations between specified types of named entities. I covered named entity… Continue Reading →
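A minimal sketch of what such a model can look like in keras: a bidirectional LSTM over the sentence, a simple attention layer that learns to weight the timesteps, and a softmax over relation types. The sizes and names below (max_len, n_words, n_relations) are illustrative, not the exact configuration from the post.

```python
# Sketch: BiLSTM with a simple attention layer for relation classification.
from keras.models import Model
from keras.layers import (Input, Embedding, Bidirectional, LSTM, Dense,
                          TimeDistributed, Flatten, Activation, RepeatVector,
                          Permute, Multiply, Lambda)
import keras.backend as K

max_len, n_words, n_relations = 50, 10000, 19   # hypothetical sizes

inp = Input(shape=(max_len,))
x = Embedding(input_dim=n_words, output_dim=100, input_length=max_len)(inp)
x = Bidirectional(LSTM(64, return_sequences=True))(x)    # (batch, T, 128)

# attention: score every timestep, softmax over time, weight and sum
att = TimeDistributed(Dense(1))(x)                        # (batch, T, 1)
att = Flatten()(att)                                      # (batch, T)
att = Activation("softmax")(att)                          # attention weights
att = RepeatVector(128)(att)                              # (batch, 128, T)
att = Permute((2, 1))(att)                                # (batch, T, 128)
context = Lambda(lambda t: K.sum(t, axis=1))(Multiply()([x, att]))

out = Dense(n_relations, activation="softmax")(context)
model = Model(inp, out)
model.compile(optimizer="adam", loss="categorical_crossentropy")
```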

Evaluate sequence models in python

An important part of every machine learning project is the proper evaluation of the performance of the system. In this post I will show you how to evaluate sequence models with token-based labels. This is especially tricky because: some entity types… Continue Reading →
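One common way to handle this is to score at the entity level rather than the token level, for example with the seqeval package; the post may use a different implementation, so treat this as an illustrative sketch.

```python
# Entity-level evaluation of BIO-tagged sequences with seqeval.
from seqeval.metrics import classification_report, f1_score

y_true = [["B-PER", "I-PER", "O", "B-LOC"],
          ["O", "B-ORG", "I-ORG", "O"]]
y_pred = [["B-PER", "I-PER", "O", "O"],
          ["O", "B-ORG", "I-ORG", "O"]]

print(f1_score(y_true, y_pred))              # micro-averaged entity-level F1
print(classification_report(y_true, y_pred)) # per-entity-type breakdown
```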

Image segmentation with test time augmentation with keras

In the last post, I introduced the U-Net model for segmenting salt depots in seismic images. This time, we will see how to improve the model with data augmentation, and especially test time augmentation (TTA). You will learn how to… Continue Reading →
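The core idea of TTA can be sketched in a few lines: predict on the original image and on an augmented copy (here just a horizontal flip), undo the augmentation on the prediction, and average. The function below is a hypothetical helper, not code from the post.

```python
import numpy as np

def predict_with_tta(model, images):
    """Average predictions over the original and horizontally flipped images.
    `model` is any keras model mapping (n, h, w, c) -> (n, h, w, 1) masks."""
    preds = model.predict(images)
    preds_flipped = model.predict(images[:, :, ::-1, :])  # flip width axis
    preds_flipped = preds_flipped[:, :, ::-1, :]          # flip masks back
    return (preds + preds_flipped) / 2.0
```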

U-Net for segmenting seismic images with keras

Today I’m going to write about a kaggle competition I started working on recently. I will show you how to approach the problem using the U-Net neural network architecture in keras. In the TGS Salt Identification Challenge, you are asked… Continue Reading →
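For orientation, here is a heavily shrunken U-Net-style model in keras, with only two pooling steps and small filter counts; the model used for the competition in the post is deeper.

```python
# A very small U-Net-style model: contracting path, bottleneck, expanding
# path with skip connections, and a per-pixel sigmoid mask.
from keras.models import Model
from keras.layers import Input, Conv2D, MaxPooling2D, UpSampling2D, concatenate

def small_unet(input_shape=(128, 128, 1)):
    inp = Input(input_shape)

    c1 = Conv2D(16, 3, activation="relu", padding="same")(inp)
    c1 = Conv2D(16, 3, activation="relu", padding="same")(c1)
    p1 = MaxPooling2D(2)(c1)

    c2 = Conv2D(32, 3, activation="relu", padding="same")(p1)
    c2 = Conv2D(32, 3, activation="relu", padding="same")(c2)
    p2 = MaxPooling2D(2)(c2)

    b = Conv2D(64, 3, activation="relu", padding="same")(p2)   # bottleneck

    u2 = concatenate([UpSampling2D(2)(b), c2])                  # skip connection
    c3 = Conv2D(32, 3, activation="relu", padding="same")(u2)

    u1 = concatenate([UpSampling2D(2)(c3), c1])                 # skip connection
    c4 = Conv2D(16, 3, activation="relu", padding="same")(u1)

    out = Conv2D(1, 1, activation="sigmoid")(c4)                # salt mask
    model = Model(inp, out)
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model
```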

State-of-the-art named entity recognition with residual LSTM and ELMo

This is the sixth post in my series about named entity recognition. If you haven’t seen the last five, have a look now. The last time we used character embeddings and an LSTM to model the sequence structure of our… Continue Reading →
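To give an idea of what plugging in ELMo looks like, here is a sketch using the TensorFlow Hub module with its TF 1.x API; the exact integration with the residual LSTM in the post may differ.

```python
# Contextual ELMo embeddings from TensorFlow Hub (TF 1.x style API),
# fed with pre-tokenized, ""-padded sentences via the "tokens" signature.
import tensorflow as tf
import tensorflow_hub as hub

elmo = hub.Module("https://tfhub.dev/google/elmo/2", trainable=False)
embeddings = elmo(
    inputs={"tokens": [["John", "lives", "in", "London", ".", "", ""]],
            "sequence_len": [5]},
    signature="tokens",
    as_dict=True)["elmo"]            # shape: (batch, max_len, 1024)

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    print(sess.run(embeddings).shape)
```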

Explain neural networks with keras and eli5

In this post I’m going to show you how you can use a neural network from keras with the LIME algorithm implemented in the eli5 TextExplainer class. For this we will write a scikit-learn compatible wrapper for a keras… Continue Reading →
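As a rough sketch of the idea, assuming a fitted keras `tokenizer`, a padding length `max_len`, and a trained binary `model` (all hypothetical names from a typical setup, not the post's exact code), the wrapper only has to turn raw texts into padded sequences and return two-column probabilities:

```python
# Sketch: explaining a keras text classifier with eli5's TextExplainer.
import numpy as np
from keras.preprocessing.sequence import pad_sequences
from eli5.lime import TextExplainer

def predict_proba(texts):
    """Map raw texts to class probabilities, as TextExplainer expects."""
    seqs = tokenizer.texts_to_sequences(texts)
    x = pad_sequences(seqs, maxlen=max_len)
    p = model.predict(x)               # (n, 1) sigmoid output assumed
    return np.hstack([1 - p, p])       # two-column class probabilities

te = TextExplainer(random_state=42)
te.fit("I really loved this movie, great acting!", predict_proba)
te.show_prediction()   # renders the explanation in a notebook
```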

Debugging black-box text classifiers with LIME

Often in text classification, we use so-called black-box classifiers. By black-box classifiers I mean classification systems whose internal workings are completely hidden from you. A famous example is deep neural nets, in text classification often recurrent or… Continue Reading →
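For comparison, the lime package itself can be used directly on any black-box classifier; in the sketch below, `pipeline.predict_proba` is a stand-in for whatever function maps raw texts to class probabilities in your setup.

```python
# Sketch: probing a black-box text classifier with the lime package.
from lime.lime_text import LimeTextExplainer

explainer = LimeTextExplainer(class_names=["negative", "positive"])
exp = explainer.explain_instance(
    "The plot was dull but the photography was stunning.",
    pipeline.predict_proba,     # callable: list of texts -> (n, n_classes)
    num_features=6)
print(exp.as_list())            # tokens with their weight for the prediction
```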

PyData Amsterdam 2018

Last weekend I attended the PyData Amsterdam 2018 conference in, you guessed it, Amsterdam. It was a great conference; I met a lot of great people and had a very good time. In this… Continue Reading →

Enhancing LSTMs with character embeddings for named entity recognition

This is the fifth post in my series about named entity recognition with python. If you haven’t seen the last four, have a look now. The last time we used a CRF-LSTM to model the sequence structure of our sentences. We… Continue Reading →
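The gist of the character-embedding part can be sketched as a TimeDistributed embedding plus LSTM per word, concatenated with the word embeddings; all sizes below are illustrative, not the post's hyperparameters.

```python
# Sketch: word embeddings combined with a character-level LSTM per word,
# feeding a BiLSTM tagger with a softmax over tags at every timestep.
from keras.models import Model
from keras.layers import (Input, Embedding, LSTM, Bidirectional, Dense,
                          TimeDistributed, concatenate)

max_len, max_len_char = 50, 10          # words per sentence, chars per word
n_words, n_chars, n_tags = 10000, 100, 17

word_in = Input(shape=(max_len,))
emb_word = Embedding(input_dim=n_words, output_dim=20)(word_in)

char_in = Input(shape=(max_len, max_len_char))
emb_char = TimeDistributed(Embedding(input_dim=n_chars, output_dim=10))(char_in)
char_enc = TimeDistributed(LSTM(20))(emb_char)      # one vector per word

x = concatenate([emb_word, char_enc])
x = Bidirectional(LSTM(50, return_sequences=True))(x)
out = TimeDistributed(Dense(n_tags, activation="softmax"))(x)

model = Model([word_in, char_in], out)
model.compile(optimizer="adam", loss="categorical_crossentropy")
```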

Guide to word vectors with gensim and keras

Today I will show you what word vectors are, how to create them in python, and finally how to use them with neural networks in keras. For a long time, NLP methods have used a vector space model to represent words. Commonly… Continue Reading →
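As a quick taste, here is a hedged sketch of the gensim-to-keras handover, using gensim 3.x parameter names; the toy corpus and the way the vocabulary is ordered are purely illustrative.

```python
# Sketch: train word vectors with gensim, then load them into a keras
# Embedding layer as fixed weights.
import numpy as np
from gensim.models import Word2Vec
from keras.layers import Embedding

sentences = [["the", "cat", "sat", "on", "the", "mat"],
             ["dogs", "and", "cats", "are", "great"]]
w2v = Word2Vec(sentences, size=50, window=3, min_count=1, iter=20)

vocab = sorted(w2v.wv.vocab)                         # fix a word order
weights = np.array([w2v.wv[w] for w in vocab])       # (n_words, 50)

embedding = Embedding(input_dim=weights.shape[0],
                      output_dim=weights.shape[1],
                      weights=[weights],
                      trainable=False)               # keep vectors fixed
```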
