wietsedv

  • University of Groningen

Pinned

  1. bertje (Public)

    BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. Paper: "What’s so special about BERT’s layers? A closer look at the NLP pipeline in monolingual and multilingual models" (Findings of EMNLP 2020). (A minimal usage sketch follows this list.)

    Python · 131 stars · 10 forks

  2. gpt2-recycle (Public)

    As good as new. How to successfully recycle English GPT-2 to make models for other languages (ACL Findings 2021)

    Jupyter Notebook · 45 stars · 5 forks

  3. low-resource-adapt (Public)

    Code for the paper "Adapting Monolingual Models: Data can be Scarce when Language Similarity is High" (ACL Findings 2021)

    Python · 5 stars · 1 fork

  4. xpos (Public)

    Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages (ACL 2022)

    Jupyter Notebook · 18 stars · 7 forks

  5. dumb (Public)

    A Benchmark for Smart Evaluation of Dutch Models (EMNLP 2023)

    Python · 7 stars · 1 fork
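
The bertje repository above distributes its model through the Hugging Face Hub. The following is a minimal sketch of loading it with the transformers library, assuming the model identifier "GroNLP/bert-base-dutch-cased" documented for BERTje is still current; adjust the identifier if it has changed.

    # Minimal sketch: load BERTje and get contextual embeddings for a Dutch sentence.
    # Assumes the Hugging Face model id "GroNLP/bert-base-dutch-cased".
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("GroNLP/bert-base-dutch-cased")
    model = AutoModel.from_pretrained("GroNLP/bert-base-dutch-cased")

    # Tokenize one Dutch sentence and run it through the encoder.
    inputs = tokenizer("Dit is een voorbeeldzin.", return_tensors="pt")
    outputs = model(**inputs)

    # Token-level embeddings: shape (batch, number of tokens, hidden size).
    print(outputs.last_hidden_state.shape)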