Yes, it is possible to load a Hugging Face big model that was saved as a sharded checkpoint and re-save it as a single, unsharded file. Load the model with the from_pretrained
method, then write it back out with the save_pretrained
method. Here's an example:
from transformers import AutoModel

# Load the sharded checkpoint; from_pretrained reads config.json from
# the directory and reassembles the shards automatically, so there is
# no need to load the config separately.
model = AutoModel.from_pretrained('path/to/sharded/model/')

# Save as a single file. save_pretrained re-shards any model whose
# state dict exceeds max_shard_size (the default is a few GB), so set
# the limit above the model's total size to force a single file.
model.save_pretrained('path/to/saved/model/', max_shard_size='100GB')
This writes the checkpoint to the specified path. Two things to note: the from_pretrained
method detects a sharded checkpoint automatically and loads it accordingly, and save_pretrained will shard the output again whenever the state dict is larger than its max_shard_size argument, so pass a max_shard_size larger than the model to guarantee a single file.
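Under the hood, a sharded checkpoint is a set of shard files plus an index JSON that maps each weight name to the shard containing it; merging them into one file is just reading the index, loading every shard, and combining the dicts. The following is a minimal sketch of that mechanism, using pickle in place of torch.save/torch.load and hypothetical file names (model.index.json, shard-0.bin), not the actual transformers implementation:

```python
import json
import pickle
import tempfile
from pathlib import Path

def merge_shards(ckpt_dir):
    """Merge a sharded checkpoint directory into one state dict."""
    index = json.loads((Path(ckpt_dir) / "model.index.json").read_text())
    merged = {}
    # Load each referenced shard once and collect all of its weights.
    for shard_name in sorted(set(index["weight_map"].values())):
        with open(Path(ckpt_dir) / shard_name, "rb") as f:
            merged.update(pickle.load(f))
    return merged

# Build a fake two-shard checkpoint to demonstrate the merge.
with tempfile.TemporaryDirectory() as d:
    shards = {
        "shard-0.bin": {"embed.weight": [1, 2], "layer0.weight": [3]},
        "shard-1.bin": {"layer1.weight": [4], "head.weight": [5, 6]},
    }
    # The index maps each weight name to the shard file holding it.
    weight_map = {name: shard for shard, ws in shards.items() for name in ws}
    (Path(d) / "model.index.json").write_text(json.dumps({"weight_map": weight_map}))
    for shard_name, weights in shards.items():
        with open(Path(d) / shard_name, "wb") as f:
            pickle.dump(weights, f)

    model_state = merge_shards(d)
    print(sorted(model_state))  # all four weights end up in one dict
```

The real library does the same walk over the index file, which is why from_pretrained needs only the directory path to reassemble the full model.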
Asked: 2022-12-02 11:00:00 +0000
Last updated: Jan 27 '23