Segmentation > pragmatic_tokenizer

Multilingual tokenizer that splits a string into tokens. 93 stars · GitHub