Download Model (Generative Models)


Downloads a model from Hugging Face


This operator downloads a model from Hugging Face and stores it in the specified model directory. After downloading, you can use the model directly with the corresponding task operator or use it as a foundation model for finetuning. Please note that you need to specify the full model name, e.g., “facebook/bart-large” and not just “bart-large”, as the model parameter. Also note that you have to specify the correct type of the model and the intended task type (e.g., "text classification"), since otherwise unexpected results may occur.
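Conceptually, the download step resembles fetching a model snapshot with the `huggingface_hub` Python package. The following is an illustrative sketch, not the operator's actual implementation; the helper names are hypothetical:

```python
# Sketch of a Hugging Face model download. The validation helper illustrates
# why the full repository id ("facebook/bart-large") is required rather than
# the short name ("bart-large").

def validate_model_id(model_id):
    """Require the full '<owner>/<model>' id, e.g. 'facebook/bart-large'."""
    if model_id.count("/") != 1 or not all(model_id.split("/")):
        raise ValueError(
            f"'{model_id}' is not a full model id; use the '<owner>/<model>' "
            "form, e.g. 'facebook/bart-large'"
        )
    return model_id

def download_model(model_id, revision="main", token=None):
    """Download a model snapshot and return its local directory."""
    # Requires the huggingface_hub package; imported lazily so the
    # validation helper above works without it.
    from huggingface_hub import snapshot_download

    return snapshot_download(
        repo_id=validate_model_id(model_id),
        revision=revision,  # branch name, tag name, or commit id
        token=token,        # Hugging Face token, needed for gated models
    )
```

The downloaded directory can then be pointed at by the task operators or used as the base model for finetuning.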


Input

  • connection (Connection)

    An optional connection containing your Hugging Face token, which may be needed to download some of the models. The connection has to be a Dictionary Connection with a single key-value pair whose key is 'token'.
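The expected shape of that Dictionary Connection can be sketched as follows (the helper name is illustrative, not part of the extension's API):

```python
# A Dictionary Connection for this operator holds exactly one entry with
# the key 'token'. Minimal validation sketch.

def extract_token(connection):
    """Return the Hugging Face token from a single-entry dictionary connection."""
    if set(connection) != {"token"}:
        raise ValueError(
            "connection must be a Dictionary Connection with the single key 'token'"
        )
    return connection["token"]
```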


Output

  • model (File)

    The model directory into which the model has been downloaded.

  • connection (Connection)

    The input connection if one has been provided.


Parameters

  • model The model to download. This is often a base or foundation model, but it can also be a model which has already been finetuned for a specific task. Range:
  • type This parameter must be set to the correct type of the model and the task it is supposed to solve. Failing to do so will likely lead to unexpected results or execution failures. Range:
  • storage_type Determines where the large language model will be stored: either in a folder in one of your projects / repositories (recommended), in a folder of your file system, or in a temporary folder. Range:
  • project_folder The folder in a project / repository to store the large language model in. Range:
  • file_folder The folder in your file system to store the large language model in. Range:
  • data_type Specifies the data type under which the model should be loaded. Using lower precisions can reduce memory usage, while leading to slightly less accurate results in some cases. If set to “auto”, the data precision is derived from the model itself. Range:
  • revision The specific model version to use. The default is “main”. The value can be a branch name, a tag name, or a commit id of the model in the Hugging Face git repository. Range:
  • proxy An HTTP proxy server, in case you need to use one. Range:
  • conda_environment The conda environment used for this download task. Additional packages may be installed into this environment; please refer to the extension documentation for details on this and on the version requirements for Python and for some packages which have to be present in this environment. Range:
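Regarding the proxy parameter: most HTTP client libraries, including those used for Hugging Face downloads, also honor the standard proxy environment variables. A sketch with a placeholder proxy URL:

```python
import os

# Placeholder proxy URL; replace with your actual proxy server.
PROXY = "http://proxy.example.com:8080"

# Standard environment variables honored by most HTTP client libraries.
os.environ["HTTP_PROXY"] = PROXY
os.environ["HTTPS_PROXY"] = PROXY
```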

Tutorial Processes

Download a model and delete it again

This process simply downloads a model from Hugging Face and deletes it afterwards. Do not forget to resume the process after the breakpoint is reached in order to delete the model from the temporary directory.