
Tokenize a string or token ids.

Usage

tokenize_lst(
  x,
  model = getOption("pangoling.causal.default"),
  add_special_tokens = NULL,
  config_tokenizer = NULL
)

Arguments

x

Strings or token ids.

model

Name of a pre-trained model or folder.

add_special_tokens

Whether to include special tokens. It has the same default as the AutoTokenizer method in the Python transformers library.

config_tokenizer

List with other arguments that control how the tokenizer from Hugging Face is accessed (a combined call is sketched below the argument list).
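
A minimal sketch of how these arguments might be combined. The add_prefix_space option is an assumption: it is a GPT-2 tokenizer option in the Python transformers library and is assumed here to be forwarded unchanged through config_tokenizer; it is not part of this function's documented interface.

# Illustrative call; the effect of add_special_tokens depends on the
# tokenizer of the chosen model.
tokenize_lst(
  x = "The apple doesn't fall far from the tree.",
  model = "gpt2",
  add_special_tokens = FALSE,
  config_tokenizer = list(add_prefix_space = TRUE)
)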

Value

A list of tokens, with one element for each string (or set of token ids) in x.

See also

Other token-related functions: ntokens(), transformer_vocab()

Examples

if (FALSE) { # interactive()
tokenize_lst(x = c("The apple doesn't fall far from the tree."), model = "gpt2")
}
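
Since x also accepts token ids, a second hedged sketch is shown below; the ids are placeholders chosen for illustration, not the ids of any particular sentence.

if (FALSE) { # interactive()
# Token ids can be passed instead of strings; these ids are placeholders.
tokenize_lst(x = c(464L, 17180L), model = "gpt2")
}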