Download Model (Generative Models)
Synopsis
Downloads a model from Hugging Face.
Description
This operator downloads a model from Hugging Face and stores it in the specified model directory. After downloading, you can use the model directly with the corresponding task operator or as a foundation model for fine-tuning. Please note that you need to specify the full model name, e.g., “facebook/bart-large” and not just “bart-large”, as the model parameter. Also note that you have to specify the correct type of the model and the intended task type (e.g., "text classification"), since otherwise unexpected results may occur.
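Conceptually, the download is similar to fetching a model snapshot from the Hugging Face Hub with the huggingface_hub Python package. The following sketch only illustrates that idea and is not the operator's implementation; the repository id, target folder, and token are placeholder values:

```python
# Minimal sketch of a Hugging Face Hub download, assuming the huggingface_hub
# package is installed. The repository id, target folder, and token are
# placeholders for illustration, not values used by the operator itself.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="facebook/bart-large",    # full name, not just "bart-large"
    revision="main",                  # branch name, tag name, or commit id
    local_dir="models/bart-large",    # directory the model files end up in
    token=None,                       # Hugging Face token, needed for gated models
)
print(local_path)
```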
Input
- connection (Connection)
An optional connection containing your Hugging Face token, which may be needed to download some of the models. The connection has to be a Dictionary Connection with a single key-value pair with key 'token'.
Output
- model (File)
The model directory into which the model has been downloaded.
- connection (Connection)
The input connection if one has been provided.
Parameters
- model The model to download. This will often be a base or foundation model, but it can also be a model which has already been fine-tuned for specific tasks.
- type This parameter must be set to the correct type of the model and the task it is supposed to solve. Failing to do so will likely lead to unexpected results or execution failures.
- storage type Determines where the large language model will be stored. Either in a folder in one of your projects / repositories (recommended), in a folder of your file system, or in a temporary folder.
- project folder The folder in a project / repository to store the large language model in.
- file folder The folder in your file system to store the large language model in.
- data type Specifies the data type under which the model should be loaded. Using lower precisions can reduce memory usage while leading to slightly less accurate results in some cases. If set to “auto”, the data precision is derived from the model itself. See the sketch after this parameter list for how this setting maps onto a typical model load.
- revision The specific model version to use. The default is “main”. The value can be a branch name, a tag name, or a commit id of the model in the Hugging Face git repository.
- proxy An HTTP proxy server, in case you need to use one.
- conda environment The conda environment used for this downloading task. Additional packages may be installed into this environment; please refer to the extension documentation for details on this and on the version requirements for Python and some packages which have to be present in this environment.
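As an illustration of how the model, type, data type, and revision parameters relate to loading a downloaded model, the following sketch uses the transformers library. The model id, Auto class, and dtype are example values chosen for this sketch, not defaults of the operator:

```python
# Illustrative sketch only: shows how "type", "data type", and "revision"
# roughly correspond to a transformers load call. The model id, Auto class,
# and dtype below are example values, not defaults of this operator.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained(
    "facebook/bart-large",       # the Auto class must match the intended task type
    revision="main",             # same meaning as the "revision" parameter
    torch_dtype=torch.float16,   # lower precision reduces memory usage
)
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large", revision="main")
```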
Tutorial Processes
Download a model and delete it again
This process simply downloads a model from Hugging Face and deletes it afterwards. Do not forget to resume the process after the breakpoint is reached in order to delete the model from the temporary directory.