from dataclasses import asdict, dataclass
from enum import Enum
from typing import (
    Any,
    Dict,
    List,
    Mapping,
    Optional,
    Sequence,
    Tuple,
)
from aleph_alpha_client.prompt import Prompt
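# Usage sketch (illustrative, not part of this module): constructing an
# EmbeddingRequest and sending it via the client. The model name
# "luminous-base", the API token placeholder, and the prompt text below are
# assumptions for illustration only.
#
#   from aleph_alpha_client import Client, Prompt
#   from aleph_alpha_client.embedding import EmbeddingRequest
#
#   request = EmbeddingRequest(
#       prompt=Prompt.from_text("An apple a day keeps the doctor away."),
#       layers=[-1],          # embeddings from the last transformer layer
#       pooling=["mean"],     # average token embeddings over the sequence
#   )
#   client = Client(token="YOUR_API_TOKEN")
#   result = client.embed(request, model="luminous-base")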


@dataclass(frozen=True)
class EmbeddingRequest:
    """
    Embeds a text and returns vectors that can be used for downstream tasks
    (e.g. semantic similarity) and models (e.g. classifiers).

    Parameters:
        prompt
            The text and/or image(s) to be embedded.
        layers
            A list of layer indices from which to return embeddings.

            * Index 0 corresponds to the word embeddings used as input to the first transformer layer
            * Index 1 corresponds to the hidden state as output by the first transformer layer, index 2 to the output of the second layer, etc.
            * Index -1 corresponds to the last transformer layer (not the language modelling head), index -2 to the second-to-last layer, etc.
        pooling
            Pooling operation to use. Pooling operations include:

            * mean: aggregate token embeddings across the sequence dimension using an average
            * max: aggregate token embeddings across the sequence dimension using a maximum
            * last_token: just use the last token
            * abs_max: aggregate token embeddings across the sequence dimension using a maximum of absolute values
        type
            Type of the embedding (e.g. symmetric or asymmetric).
        tokens
            Flag indicating whether the tokenized prompt is to be returned (True) or not (False).
        normalize
            Return normalized embeddings. This can be used to save on additional compute when applying a cosine similarity metric.

            Note that at the moment this parameter does not yet have any effect. This will change as soon as the
            corresponding feature is available in the backend.
        contextual_control_threshold (float, default None)
            If set to None, attention control parameters only apply to those tokens that have
            explicitly b