
Returns the configuration of a masked model.


masked_config(
  model = getOption("pangoling.masked.default"),
  config_model = NULL
)



model

Name of a pre-trained model or folder.


config_model

List with other arguments that control how the model from Hugging Face is accessed.


A list with the configuration of the model.


A masked language model (also called a BERT-like or encoder model) is a type of large language model that can be used to predict the content of a mask in a sentence.

If not specified, the masked model used is the one set in the global option pangoling.masked.default, which can be accessed via getOption("pangoling.masked.default") (by default "bert-base-uncased"). To change the default, use options(pangoling.masked.default = "newmaskedmodel").
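As a minimal sketch of working with this option (the alternative model name below is illustrative, and no model is downloaded by these calls):

```r
# Inspect the current default masked model
getOption("pangoling.masked.default")

# Point the default at another BERT-like model for this session
options(pangoling.masked.default = "distilbert-base-uncased")

# Restore the package default
options(pangoling.masked.default = "bert-base-uncased")
```

Because options() is session-scoped, the change applies to all subsequent masked-model calls until the session ends or the option is reset.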

A list of possible masked models can be found on the Hugging Face website.

Using the config_model and config_tokenizer arguments, it is possible to control how the model and tokenizer from Hugging Face are accessed; see the Python method from_pretrained for details. In case of errors, check the status of https://status.huggingface.co/.
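As a hedged sketch, assuming config_model entries are forwarded to from_pretrained, a named list can pin a model revision on the Hub (wrapped in if (FALSE) because it requires network access and a configured Python backend):

```r
if (FALSE) { # requires network access and a configured Python backend
  # revision is a from_pretrained argument selecting a branch/tag/commit
  masked_config(
    model = "bert-base-uncased",
    config_model = list(revision = "main")
  )
}
```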

See also

Other masked model functions: masked_lp(), masked_preload(), masked_tokens_tbl()


if (FALSE) { # interactive()
masked_config(model = "bert-base-uncased")
}