Accessing the pretrained encoders to test our vision-language survival analysis framework
Hello, authors! Big congratulations on your incredible work. I really love it.
Having witnessed the exciting performance of mSTAR on various downstream tasks, I would like to use this pathology foundation model to test our vision-language survival analysis framework. I will strictly follow the model's usage license, and the model will not be used for any commercial purposes.
Hope to get approved soon. Thanks!
Hello, authors! Great work.
Requesting access to the model!
Thank you very much!
Thank you for your request. Your access has been approved. Please proceed as needed. Let us know if you encounter any issues.
mSTAR can now be loaded directly from timm; please use the following code.
import timm

model = timm.create_model(
    'hf-hub:Wangyh/mSTAR',
    pretrained=True,
    init_values=1e-5,
    dynamic_img_size=True,
)