Maybe we should add readability assessment task, too? #533

Open
brucewlee opened this issue Feb 16, 2021 · 2 comments

Comments


brucewlee commented Feb 16, 2021

There have been meaningful neural and non-neural approaches to this task. The linguistic features developed in this field can often be transferred to other areas of text classification.

We shouldn't add this task under text classification, since readability assessment tends to focus on "measuring" the difficulty of a text. Similar classification models (SVM, HAN, etc.) are frequently used in readability assessment, but I believe the goal is different.

sebastianruder (Owner) commented

Good idea. Could this be part of a section on "Automated assessment of written text" or something along those lines? See, for example, Yannakoudakis and Briscoe (2004).


brucewlee commented Mar 7, 2021

"Automated assessment of written text" is a great paper. A more recent example would be SOTA, non-neural 2016 and 'SOTA, neural 2020'.

Readability assessment is traditionally a very handcrafted, feature-dependent task. SOTA models tend to be neural network-based, but more traditional ones use an SVM with ~100 linguistic features. There can be some useful insights from the traditional models as well.
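
For illustration (not from this thread), here is a minimal sketch of that traditional pipeline: a handful of handcrafted surface features fed into an SVM classifier. The three features, the toy texts, and the labels are placeholders I made up; real systems combine on the order of 100 linguistic features and train on an actual graded-readability corpus.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def handcrafted_features(text: str) -> np.ndarray:
    """A few classic surface features; real systems combine ~100 of these."""
    words = text.split()
    sentences = [s for s in text.split(".") if s.strip()]
    avg_word_len = float(np.mean([len(w) for w in words])) if words else 0.0
    avg_sent_len = len(words) / max(len(sentences), 1)
    type_token_ratio = len({w.lower() for w in words}) / max(len(words), 1)
    return np.array([avg_word_len, avg_sent_len, type_token_ratio])

# Toy corpus: 0 = easy, 1 = hard (placeholder data, not a real readability dataset).
texts = [
    "The cat sat on the mat. The dog ran.",
    "Notwithstanding the aforementioned considerations, the committee deliberated extensively.",
]
labels = [0, 1]

X = np.vstack([handcrafted_features(t) for t in texts])
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X, labels)

print(clf.predict(handcrafted_features("Short words help. Readers like that.").reshape(1, -1)))
```

The neural SOTA systems replace the feature extractor with a learned encoder, but the overall framing (text in, difficulty level out) stays the same, which is why the task sits awkwardly inside generic text classification.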
