Provides a consistent `Transform` interface to tokenizers operating on `DataFrame`s and folders.

Tokenizer(
  tok,
  rules = NULL,
  counter = NULL,
  lengths = NULL,
  mode = NULL,
  sep = " "
)

Arguments

tok

the tokenizer object to wrap (for example, a `SpacyTokenizer`)

rules

custom processing rules applied to the texts; when `NULL`, fastai's default text-processing rules are used

counter

token-frequency counter updated as texts are tokenized, later used to build a vocabulary

lengths

precomputed lengths (token counts) of the documents

mode

tokenization mode (for example, whether the tokenizer was set up from a `DataFrame` or from a folder of text files)

sep

separator used to join tokens back into a single string when decoding

Value

A fastai `Tokenizer` object
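
Conceptually, the transform applies the `rules` to each text, runs the wrapped tokenizer `tok`, updates the `counter` with token frequencies, and uses `sep` to rejoin tokens when decoding. The following is a minimal Python sketch of that behavior under those assumptions; the class and helper names (`TokenizerSketch`, `lowercase_rule`, `simple_tokenizer`) are illustrative stand-ins, not fastai's actual implementation or API.

```python
from collections import Counter

# Illustrative stand-ins for a processing rule and a base tokenizer.
def lowercase_rule(text):
    return text.lower()

def simple_tokenizer(text):
    return text.split()

class TokenizerSketch:
    """Sketch of the Tokenizer transform: rules -> tokenize -> count,
    with `sep` rejoining tokens on decode."""

    def __init__(self, tok, rules=None, counter=None, sep=" "):
        self.tok = tok
        self.rules = rules or []
        self.counter = counter if counter is not None else Counter()
        self.sep = sep

    def encodes(self, text):
        for rule in self.rules:       # processing rules run first
            text = rule(text)
        tokens = self.tok(text)       # then the wrapped tokenizer
        self.counter.update(tokens)   # frequencies later feed the vocabulary
        return tokens

    def decodes(self, tokens):
        return self.sep.join(tokens)  # `sep` restores a single string

tk = TokenizerSketch(simple_tokenizer, rules=[lowercase_rule])
print(tk.encodes("Hello World"))          # ['hello', 'world']
print(tk.decodes(["hello", "world"]))     # hello world
```

The same encode/decode symmetry is what lets the real transform show readable text in data summaries while feeding token sequences to the model.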