Description

BERT, published by Google, is a new way to obtain pre-trained language model word representations. Many NLP tasks benefit from BERT to achieve state-of-the-art (SOTA) results.

The goal of this project is to obtain sentence and token embeddings from BERT's pre-trained models. This way, instead of building and fine-tuning an end-to-end NLP model, you can build your model simply by using the sentence or token embeddings.
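As a sketch of the idea, a sentence embedding is commonly derived by mean-pooling the per-token vectors a model such as BERT produces. The snippet below illustrates this pooling step with NumPy on dummy vectors; the function name `sentence_embedding` and the toy 4-dimensional vectors are illustrative assumptions, standing in for the real 768-dimensional BERT outputs.

```python
import numpy as np

def sentence_embedding(token_vectors):
    """Mean-pool per-token vectors into a single sentence vector.

    token_vectors: list of 1-D arrays, one per token
    (e.g. 768-dimensional for BERT base; 4-dimensional dummies here).
    """
    return np.mean(np.stack(token_vectors), axis=0)

# Dummy "token embeddings" standing in for real BERT outputs.
tokens = [np.array([1.0, 0.0, 0.0, 0.0]),
          np.array([0.0, 1.0, 0.0, 0.0])]
sent_vec = sentence_embedding(tokens)  # element-wise mean of the two vectors
```

With the actual library, the per-token vectors returned for each sentence could be pooled the same way to get a fixed-size sentence representation.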

This project is implemented with @MXNet. Special thanks to the @gluon-nlp team.

Repository

https://github.com/imgarylai/bert-embedding

Project Slug

bert-embedding

Last Built

4 weeks ago (passed)

Home Page

https://github.com/imgarylai/bert-embedding

Tags

nlp, mxnet, bert, natural-language-processing, word-embeddings, gluonnlp

Project Privacy Level

Public

Short URLs

bert-embedding.readthedocs.io
bert-embedding.rtfd.io

Default Version

latest

'latest' Version

master