Releases: ant-research/StructuredLM_RTDT

Official code for the paper "Augmenting Transformers with Recursively Composed Multi-grained Representations"

20 May 14:08

In this work, we successfully combine a composition model with bi-directional Transformers and make them jointly pre-trainable.

self-interpretable classification

Fast-R2D2

10 Oct 03:48

The code version matching the paper, together with the corresponding model pretrained on wiki-103.

r2d2

13 Feb 03:38

The code for the paper "R2D2: Recursive Transformer based on Differentiable Tree for Interpretable Hierarchical Language Modeling"