---
license: mit
---

This is the placeholder model card for the [finesse-benchmark-space](https://huggingface.co/spaces/enzoescipy/finesse-benchmark-space) and its [database](https://huggingface.co/datasets/enzoescipy/finesse-benchmark-results).

```python
import torch

from finesse_benchmark.interfaces import FinesseSynthesizer


# --- Custom Synthesizer Example ---
# Customize this class for your own synthesizer.
class AverageSynthesizer(FinesseSynthesizer):
    """
    Average Synthesizer: computes the mean of the input embeddings
    without using any model.
    """

    def __init__(self, config_path: str):
        super().__init__()
        # No model to load for average pooling
        print(f"{self.__class__.__name__} initialized - Average pooling ready.")

    def synthesize(self, embeddings: torch.Tensor, **kwargs) -> torch.Tensor:
        """
        Average synthesis: compute the mean along the sequence dimension.

        Args:
            embeddings: torch.Tensor of shape (batch, seq_len, embedding_dim)
            **kwargs: Additional arguments

        Returns:
            torch.Tensor of shape (batch, embedding_dim)
        """
        return embeddings.mean(dim=1)

    def device(self):
        return "cpu"
```

This synthesizer simply averages the embedding vectors along the sequence dimension.
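For reference, the pooling step can be sketched on its own with plain `torch`, with no `finesse_benchmark` dependency. The tensor shapes below are illustrative assumptions, not values prescribed by the benchmark:

```python
import torch

# Standalone sketch of the same mean pooling that
# AverageSynthesizer.synthesize performs. The shapes here
# (batch=2, seq_len=5, embedding_dim=8) are arbitrary examples.
batch, seq_len, embedding_dim = 2, 5, 8
embeddings = torch.randn(batch, seq_len, embedding_dim)

# Collapse the sequence dimension by averaging:
# (batch, seq_len, embedding_dim) -> (batch, embedding_dim)
pooled = embeddings.mean(dim=1)
print(pooled.shape)  # torch.Size([2, 8])
```

Because the result is just the element-wise mean over `seq_len`, each pooled vector equals the sum of that sequence's embeddings divided by the sequence length.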