tokenize
Simple tokenizer for English text
https://github.com/haskell/tokenize
Stackage Nightly 2024-11-12: 0.3.0.1
Latest on Hackage: 0.3.0.1
Maintained by Andreas Abel
This version can be pinned in stack with:
tokenize-0.3.0.1@sha256:146423dea3f30146cdce0e4f386854568fc11ea2f8c284745544148926dc87f2,1495
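For example, the pin goes under extra-deps in a project's stack.yaml (a sketch; adjust to your own resolver setup):

    extra-deps:
    - tokenize-0.3.0.1@sha256:146423dea3f30146cdce0e4f386854568fc11ea2f8c284745544148926dc87f2,1495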
Module documentation for 0.3.0.1
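A minimal usage sketch (the NLP.Tokenize.String module and its tokenize :: String -> [String] function are assumed here; consult the module documentation above for the exact exported interface):

    module Main where

    import NLP.Tokenize.String (tokenize)

    main :: IO ()
    main =
      -- Print one token per line; the tokenizer splits off punctuation
      -- and handles contractions according to its built-in rules.
      mapM_ putStrLn (tokenize "Hello, world! This is a simple tokenizer.")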
Depends on 3 packages (full list with versions):