alanalynch/computers-will-never
Based on this xkcd comic: http://xkcd.com/1263/, and released under the MIT Licence.

pos_tokeniser.py is included for reference; it prunes the list of 30,000 verbs down to 9,600 based on tense.

Requires NLTK and the punkt corpus:
$ sudo pip install nltk
>>> import nltk
>>> nltk.download('punkt')
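The README doesn't show how the tense-based pruning works. As a toy stand-in (not the actual pos_tokeniser.py logic, which presumably uses NLTK's POS tags), a plain-Python suffix heuristic might look like:

```python
# Toy sketch of tense-based verb pruning: keep only verbs that look
# like base forms, dropping obvious past-tense ("-ed") and present
# participle ("-ing") inflections. This heuristic is illustrative
# only; a real tagger handles irregular verbs ("ran", "sang") and
# base forms that merely end in "ing" ("sing", "ring").
def prune_by_tense(verbs):
    kept = []
    for verb in verbs:
        if verb.endswith("ed") or verb.endswith("ing"):
            continue  # skip apparently inflected forms
        kept.append(verb)
    return kept

print(prune_by_tense(["walk", "walked", "walking", "jump"]))
# → ['walk', 'jump']
```

A POS-tag-based version would instead keep entries tagged `VB` (base form) in the Penn Treebank tagset and drop `VBD`, `VBG`, and `VBN`.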

TODO:
- Replace flat text files with database
- Fix control flow to speed up parsing
- Make into web service

About

Things computers will never do.
