Tags:
- Feature Extraction
- Transformers
- Safetensors
- English
- modernbert
- cybersecurity
- APT
- threat-intelligence
- contrastive-learning
- embeddings
- attribution
- MITRE-ATTACK
- CTI
- ModernBERT
- text-embeddings-inference
Instructions for using selfconstruct3d/FALCON with the Transformers library.

How to use selfconstruct3d/FALCON with Transformers:

Use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline("feature-extraction", model="selfconstruct3d/FALCON")
```

Or load the model directly:

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("selfconstruct3d/FALCON")
model = AutoModel.from_pretrained("selfconstruct3d/FALCON")
```
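The feature-extraction pipeline and `AutoModel` both return one embedding per token, so downstream uses such as attribution or similarity search need a pooling step to get one vector per text. The masked mean pooling below is a common choice for embedding models, but it is an assumption here: check the model card for the pooling strategy FALCON was actually trained with.

```python
# Hypothetical sketch: collapsing FALCON token embeddings into one vector
# per input text via masked mean pooling. Mean pooling is an assumption;
# the model may instead use e.g. the [CLS] token.
import torch


def mean_pool(last_hidden_state, attention_mask):
    # last_hidden_state: (batch, seq_len, hidden); attention_mask: (batch, seq_len)
    mask = attention_mask.unsqueeze(-1).float()      # (batch, seq_len, 1)
    summed = (last_hidden_state * mask).sum(dim=1)   # sum only over real tokens
    counts = mask.sum(dim=1).clamp(min=1e-9)         # guard against empty masks
    return summed / counts                           # (batch, hidden)


# With the tokenizer and model loaded as shown above, usage would look like:
# inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
# with torch.no_grad():
#     out = model(**inputs)
# embeddings = mean_pool(out.last_hidden_state, inputs["attention_mask"])
```

The resulting per-text vectors can then be compared with cosine similarity for tasks such as APT attribution or threat-intelligence retrieval.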
File metadata:
- Size of remote file: 597 MB
- SHA256: 061d3df02a202132b77b9eef18aa4610fe82a06fda03332e895b2690f880f500
- Xet hash: 98313b1ee808f491d86995720017677359b112b2049b8f0c9df92d39edb53557