TensorFlow code and pre-trained models for BERT and ERNIE
Efficient binary encoding for your data (based on Erlang's External Term Format)
A MindSpore implementation of an emotion detection model based on ERNIE.
A solution for the Qianyan (千言) sentiment analysis competition, built on PaddlePaddle's high-level API.
Fine-tuning ERNIE with Baidu PaddleHub for Chinese sentiment analysis (text classification).
Pre-trained model loading and fine-tuning based on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision. Flexible training, validation, and prediction.
Chinese text classification using BERT (Bidirectional Encoder Representations from Transformers), BERT variants, and ERNIE (Enhanced Language Representation with Informative Entities), implemented in PyTorch, with training monitored via WandB (Weights & Biases).
Second-place solution for the intelligent human-computer interaction natural language understanding track of the 2021 CCF BDCI National Information Retrieval Challenge Cup (CCIR-Cup).
Code and data for the paper "Fine-tuning ERNIE for chest abnormal imaging signs extraction".
NLP Text Classification
Pre-trained ERNIE models can be loaded with Keras for feature extraction and prediction.
This repo contains a PyTorch implementation of a pretrained ERNIE model for text classification.
AiSpace: better practices for deep learning model development and deployment for TensorFlow 2.0.