Rehosted from https://huggingface.co/nasaharvest/galileo using the following code:
```python
import os
import hashlib

import torch

urls = dict(
    nano='https://hf.co/nasaharvest/galileo/resolve/main/models/nano/encoder.pt',
    tiny='https://hf.co/nasaharvest/galileo/resolve/main/models/tiny/encoder.pt',
    base='https://hf.co/nasaharvest/galileo/resolve/main/models/base/encoder.pt',
)
for name, url in urls.items():
    # Download the checkpoint and strip the '.backbone' infix from parameter names
    state_dict = torch.hub.load_state_dict_from_url(url, map_location="cpu", file_name=f'model_{name}.pth')
    state_dict = {k.replace('.backbone', ''): v for k, v in state_dict.items()}
    filename = f'model_{name}.pth'
    torch.save(state_dict, filename)
    # Append a short content hash to the filename so cached copies are distinguishable
    with open(filename, "rb") as f:
        md5 = hashlib.md5(f.read()).hexdigest()[:8]
    os.rename(filename, filename.replace(".pth", f"-{md5}.pth"))
```
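The `.backbone` renaming step can be sketched on a toy state dict (the keys here are illustrative, not the actual Galileo checkpoint keys):

```python
import torch

# Toy state dict with one key containing the '.backbone' infix
raw = {
    'blocks.0.backbone.weight': torch.zeros(2, 2),
    'norm.bias': torch.zeros(2),
}
# Same transformation as in the rehosting script above
renamed = {k.replace('.backbone', ''): v for k, v in raw.items()}
print(sorted(renamed))  # ['blocks.0.weight', 'norm.bias']
```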