Depends on the definition

It's about machine learning, data science and more.


Deep learning

How to use magnitude with keras

This time we take a look at the magnitude library, a feature-packed Python package for using vector embeddings in machine learning models in a fast, efficient, and simple manner. We will take the embeddings magnitude provides and use them in keras.

Named Entity Recognition with Bert

One of the latest milestones in pre-training and fine-tuning in natural language processing is the release of BERT. This is a new post in my NER series. I will show you how you can fine-tune the BERT model to do state-of-the-art named entity recognition in pytorch.

LSTM with attention for relation classification

Once named entities have been identified in a text, we then want to extract the relations that exist between them. As indicated earlier, we will typically be looking for relations between specified types of named entity. I covered named entity…

Image segmentation with test time augmentation with keras

In the last post, I introduced the U-Net model for segmenting salt deposits in seismic images. This time, we will see how to improve the model with data augmentation and especially test time augmentation (TTA). You will learn how to…
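The idea behind TTA can be sketched without any deep learning framework: run the model on several augmented copies of the input, undo each augmentation on the corresponding prediction, and average the results. Here is a minimal numpy sketch; the `predict` function below is a stand-in placeholder, not the trained U-Net from the post:

```python
import numpy as np

# Stand-in "model": thresholds pixel values into a mask.
# (Assumption for illustration -- in the post this is a trained keras U-Net.)
def predict(image):
    return (image > 0.5).astype(float)

def predict_tta(image):
    """Test-time augmentation: predict on the image and on its horizontal
    flip, un-flip the second prediction, then average the two masks."""
    p_plain = predict(image)
    p_flip = np.fliplr(predict(np.fliplr(image)))
    return (p_plain + p_flip) / 2.0

img = np.array([[0.2, 0.9],
                [0.8, 0.1]])
mask = predict_tta(img)
```

The key detail is that every augmentation must be inverted on the prediction (here, flipping the mask back) before averaging, so all predictions line up pixel by pixel.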

U-Net for segmenting seismic images with keras

Today I’m going to write about a kaggle competition I started working on recently. I will show you how to approach the problem using the U-Net neural model architecture in keras. In the TGS Salt Identification Challenge, you are asked…

State-of-the-art named entity recognition with residual LSTM and ELMo

This is the sixth post in my series about named entity recognition. This time I’m going to show you some cutting edge stuff. We will use a residual LSTM network together with ELMo embeddings, developed at AllenNLP. You will learn how to wrap a tensorflow hub pre-trained model to work with keras. The resulting model will give you state-of-the-art performance on the named entity recognition task.

Explain neural networks with keras and eli5

In this post I’m going to show you how you can use a neural network from keras with the LIME algorithm implemented in the eli5 TextExplainer class. For this we will write a scikit-learn compatible wrapper for a keras…
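The core trick behind LIME-style text explanation is simple: perturb the input by masking words and watch how the black-box prediction changes. A minimal sketch of that idea in plain python follows; the toy `predict` classifier and the function name are assumptions for illustration, not eli5’s actual implementation:

```python
import random

# Toy "black box" classifier: predicts 1.0 if the text contains "good".
# (Assumption for illustration -- in the post this is a keras model.)
def predict(text):
    return 1.0 if "good" in text.split() else 0.0

def lime_style_importance(text, n_samples=200, seed=0):
    """Estimate per-word importance by randomly masking words and
    observing how much the prediction drops -- the core perturbation
    idea behind LIME."""
    rng = random.Random(seed)
    words = text.split()
    scores = [0.0] * len(words)
    counts = [0] * len(words)
    base = predict(text)
    for _ in range(n_samples):
        keep = [rng.random() < 0.5 for _ in words]
        sample = " ".join(w for w, k in zip(words, keep) if k)
        drop = base - predict(sample)
        for i, kept in enumerate(keep):
            if not kept:           # attribute the drop to masked words
                scores[i] += drop
                counts[i] += 1
    return {w: scores[i] / max(counts[i], 1) for i, w in enumerate(words)}

imp = lime_style_importance("this movie is good fun")
```

Words whose removal consistently hurts the prediction get high importance; the real algorithm additionally fits a local linear model to the perturbed samples rather than averaging drops.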

Enhancing LSTMs with character embeddings for Named entity recognition

This is the fifth post in my series about named entity recognition with python. Last time we used a CRF-LSTM to model the sequence structure of our sentences. While this approach is straightforward and often yields strong results, it has some potential shortcomings. If we haven’t seen a word at prediction time, we have to encode it as unknown and infer its meaning from its surrounding words. Character-level features such as prefixes, suffixes, or capitalization can help here. To encode this character-level information, we will use character embeddings and an LSTM to encode every word to a vector. In principle, we can use anything that produces a single vector for the sequence of characters that make up a word.
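The preprocessing step behind this can be sketched in plain python: map each word to a fixed-length sequence of character indices, which would then feed a character embedding layer and LSTM in keras. Function names like `build_char_index` are illustrative, not from the post:

```python
def build_char_index(words):
    """Index every character seen in the training words.
    Index 0 is reserved for padding, 1 for unseen ('UNK') characters."""
    chars = sorted({c for w in words for c in w})
    return {c: i + 2 for i, c in enumerate(chars)}

def encode_word(word, char2idx, max_len=10):
    """Truncate a word to max_len characters, map each to its index,
    and pad with zeros up to max_len."""
    idxs = [char2idx.get(c, 1) for c in word[:max_len]]
    return idxs + [0] * (max_len - len(idxs))

char2idx = build_char_index(["the", "cat", "sat"])
enc = encode_word("cat", char2idx, max_len=5)
unseen = encode_word("dog", char2idx, max_len=5)  # all chars unseen -> UNK
```

Because unseen words still decompose into known characters, the character LSTM can produce a meaningful vector even for words that never appeared in training.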

Guide to word vectors with gensim and keras

Today, I tell you what word vectors are, how you create them in python and finally how you can use them with neural networks in keras. For a long time, NLP methods have used a vector space model to represent words…
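The vector space model boils down to representing each word as a point in a vector space and comparing words by cosine similarity. A tiny numpy sketch with made-up example vectors (real ones would come from gensim’s Word2Vec or pre-trained embeddings):

```python
import numpy as np

# Toy word vectors -- the numbers are assumptions for illustration only.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.85, 0.75, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_royal = cosine(vectors["king"], vectors["queen"])
sim_fruit = cosine(vectors["king"], vectors["apple"])
```

Semantically related words end up close together in this space, which is exactly the property neural networks in keras exploit when you initialize an embedding layer with such vectors.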

How to build a smart product: Transfer Learning for Dog Breed Identification with keras

This time I will show you how to build a simple “AI” product with transfer learning. We will build a “dog breed identification chat bot”. In this first post, I will show how to build a good model using keras…

© 2019 Depends on the definition
