
quaterion.loss.softmax_loss module

class SoftmaxLoss(embedding_size: int, num_groups: int, temperature: float = 0.05)

Bases: GroupLoss

Regular cross-entropy loss over dot-product logits.

Embeddings are compared against a learned weight vector per group via dot product, and the resulting logits (divided by the temperature) are scored with softmax cross-entropy. It is designed to work with the base GroupLoss.

Parameters:
  • embedding_size – Output dimension of the encoder.

  • num_groups – Number of groups in the dataset.

  • temperature – Temperature value by which logits are divided. Defaults to 0.05.

forward(embeddings: Tensor, groups: LongTensor) → Tensor

Compute loss value.

Parameters:
  • embeddings – shape: (batch_size, vector_length) - Output embeddings from the encoder.

  • groups – shape: (batch_size,) - Group ids associated with the embeddings.

Returns:

Tensor – a zero-dimensional tensor containing the loss value.

training: bool
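The computation described above can be sketched outside the library. The snippet below is a minimal NumPy illustration, not the actual implementation: `kernel` is a hypothetical stand-in for the loss module's learned (num_groups, embedding_size) projection, and the function mirrors the documented behavior of forward — dot-product logits divided by the temperature, followed by cross-entropy against the group ids.

```python
import numpy as np

def softmax_loss_sketch(embeddings, groups, kernel, temperature=0.05):
    """Cross-entropy over temperature-scaled dot-product logits.

    kernel: hypothetical (num_groups, embedding_size) weight matrix
    standing in for the module's learned projection.
    """
    # (batch_size, num_groups) similarity logits, sharpened by the temperature
    logits = embeddings @ kernel.T / temperature
    # numerically stable log-softmax
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # negative log-likelihood of each sample's group id, averaged over the batch
    return -log_probs[np.arange(len(groups)), groups].mean()

rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))     # batch_size=4, embedding_size=8
kernel = rng.normal(size=(3, 8))  # num_groups=3
groups = np.array([0, 2, 1, 0])   # group id per sample
loss = softmax_loss_sketch(emb, groups, kernel)
```

As in the real module, the result is a single scalar, and a lower temperature makes the softmax distribution sharper, penalizing near-misses between groups more aggressively.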
