---
license: mit
---
This is the placeholder model card for the [finesse-benchmark-space](https://huggingface.co/spaces/enzoescipy/finesse-benchmark-space) and its [database](https://huggingface.co/datasets/enzoescipy/finesse-benchmark-results).

```python
import torch
from typing import List
from transformers import AutoConfig, PreTrainedModel  # Optional: for loading configs

from finesse_benchmark.interfaces import FinesseSynthesizer

# --- Custom Synthesizer Example ---
# Customize this class for your own synthesizer.

class AverageSynthesizer(FinesseSynthesizer):
    """
    Average Synthesizer: Computes the mean of input embeddings without using any model.
    """
    def __init__(self, config_path: str):
        super().__init__()
        # config_path is unused: there is no model to load for average pooling.
        print(f"{self.__class__.__name__} initialized - Average pooling ready.")

    def synthesize(self, embeddings: torch.Tensor, **kwargs) -> torch.Tensor:
        """
        Average synthesis: Compute the mean along the sequence dimension.
        
        Args:
            embeddings: torch.Tensor of shape (batch, seq_len, embedding_dim)
            **kwargs: Additional arguments
        
        Returns:
            torch.Tensor of shape (batch, embedding_dim)
        """
        return embeddings.mean(dim=1)

    def device(self) -> str:
        # Pure tensor arithmetic; no GPU is needed.
        return "cpu"
```

It simply averages the embedding vectors along the sequence dimension; no trained model is involved.
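The averaging step can be exercised on its own. The sketch below is self-contained: it substitutes a minimal stand-in for `FinesseSynthesizer` so it runs without the `finesse_benchmark` package installed (an assumption made purely so the example executes; the real base class may add required behavior).

```python
import torch

class FinesseSynthesizer:
    """Minimal stand-in for finesse_benchmark.interfaces.FinesseSynthesizer
    (assumption: used here only so the example runs standalone)."""
    pass

class AverageSynthesizer(FinesseSynthesizer):
    def synthesize(self, embeddings: torch.Tensor, **kwargs) -> torch.Tensor:
        # Mean over the sequence dimension: (batch, seq_len, dim) -> (batch, dim)
        return embeddings.mean(dim=1)

# Usage: a batch of 1 with 3 token embeddings of dimension 4.
synth = AverageSynthesizer()
pooled = synth.synthesize(torch.ones(1, 3, 4))
print(pooled.shape)  # torch.Size([1, 4])
```

Because the mean of identical vectors is the vector itself, the pooled output here equals the all-ones vector; with real embeddings it collapses a variable-length sequence into a single fixed-size vector.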