Yes, it is possible to load a Hugging Face big model that has been sharded and save it back as a single, unsharded file. Load the sharded checkpoint with `from_pretrained`, then save it with `save_pretrained`, raising `max_shard_size` so the weights are not split again on save. Here's an example:
```python
from transformers import AutoModel

# Load the sharded model. Passing the directory is enough:
# from_pretrained reads config.json and the shard index file
# automatically and reassembles the weights.
model = AutoModel.from_pretrained('path/to/sharded/model/')

# Save the model as a single file. max_shard_size must exceed the
# total model size, otherwise save_pretrained will re-shard the
# checkpoint (the default limit is only a few GB).
model.save_pretrained('path/to/saved/model/', max_shard_size='100GB')
```
This writes `config.json` and the weights to the given directory, with the weights in a single file as long as `max_shard_size` exceeds the model's total size. Note that `from_pretrained` detects a sharded checkpoint automatically (via the shard index file) and loads it accordingly.
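For context on what `from_pretrained` detects: a sharded checkpoint ships an index file (e.g. `pytorch_model.bin.index.json`) that maps each parameter name to the shard file containing it. Here is a minimal sketch of reading such an index, using dummy data rather than a real checkpoint:

```python
import json

# Dummy index in the shape save_pretrained writes when a model
# exceeds max_shard_size (parameter names and sizes are made up).
index = json.loads("""
{
  "metadata": {"total_size": 28000000000},
  "weight_map": {
    "embed_tokens.weight": "pytorch_model-00001-of-00002.bin",
    "lm_head.weight": "pytorch_model-00002-of-00002.bin"
  }
}
""")

# Group parameter names by shard to see which tensors each file holds.
shards = {}
for name, shard_file in index["weight_map"].items():
    shards.setdefault(shard_file, []).append(name)

for shard_file in sorted(shards):
    print(shard_file, shards[shard_file])
```

When no index file is present, `from_pretrained` falls back to loading a single weights file; when one is present, it loads each listed shard in turn.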
Asked: 2022-12-02 11:00:00 +0000
Last updated: Jan 27 '23