r-cran-tokenizers 0.3.0-1 source package in Ubuntu
Changelog
r-cran-tokenizers (0.3.0-1) unstable; urgency=medium

  * Team Upload.
  * New upstream version 0.3.0
  * Bump Standards-Version to 4.6.2 (no changes needed)

 -- Nilesh Patra <email address hidden>  Fri, 30 Dec 2022 18:07:46 +0530
Upload details
- Uploaded by: Debian R Packages Maintainers
- Uploaded to: Sid
- Original maintainer: Debian R Packages Maintainers
- Architectures: any
- Section: misc
- Urgency: Medium Urgency
Publishing
Series | Published | Component | Section
---|---|---|---
Oracular | release | universe | misc
Noble | release | universe | misc
Mantic | release | universe | misc
Lunar | release | universe | misc
Downloads
File | Size | SHA-256 Checksum
---|---|---
r-cran-tokenizers_0.3.0-1.dsc | 1.5 KiB | de1293074c2eb89bb9d8d0f84d7951e7655edc34e187e1dbf2339f5d2dff5da3
r-cran-tokenizers_0.3.0.orig.tar.gz | 445.2 KiB | decf3caa0d38f0679dc20e96972fab06162be94162fc228758f93cdf041d53db
r-cran-tokenizers_0.3.0-1.debian.tar.xz | 3.0 KiB | aa2c95aab2b1265fbc5c723d01c1d514823ed6888f6c839d5dfc037c90215526
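After downloading, the checksums in the table above can be checked with `sha256sum -c`. A minimal sketch of the pattern; since no file is actually fetched here, it is demonstrated on a stand-in file rather than the real tarball:

```shell
# Real usage, once r-cran-tokenizers_0.3.0.orig.tar.gz has been downloaded:
#   echo "decf3caa0d38f0679dc20e96972fab06162be94162fc228758f93cdf041d53db  r-cran-tokenizers_0.3.0.orig.tar.gz" | sha256sum -c -
# Same pattern on a stand-in file:
printf 'demo payload\n' > demo.tar.gz
expected=$(sha256sum demo.tar.gz | cut -d' ' -f1)
# Feed "<checksum>  <filename>" to sha256sum's check mode (note: two spaces).
echo "${expected}  demo.tar.gz" | sha256sum -c -
# prints "demo.tar.gz: OK" on success; a mismatch exits non-zero with "FAILED"
```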
Available diffs
- diff from 0.2.3-1 to 0.3.0-1 (9.4 KiB)
No changes file available.
Binary packages built by this source
- r-cran-tokenizers: GNU R fast, consistent tokenization of natural language text
Convert natural language text into tokens. Includes tokenizers for
shingled n-grams, skip n-grams, words, word stems, sentences,
paragraphs, characters, shingled characters, lines, tweets, Penn
Treebank, regular expressions, as well as functions for counting
characters, words, and sentences, and a function for splitting longer
texts into separate documents, each with the same number of words.
The tokenizers have a consistent interface, and the package is built
on the 'stringi' and 'Rcpp' packages for fast yet correct
tokenization in 'UTF-8'.
- r-cran-tokenizers-dbgsym: debug symbols for r-cran-tokenizers