Small error in the sentence transformer example

#9
by fbilhaut - opened

The comment in the sentence transformer example says:

"Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range."

After checking the numbers and the code, it appears these scores do not come from a softmax, which isn't applied by default, but from the default activation function, which is a sigmoid.

See the predict() method in CrossEncoder.py.
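
To make the difference concrete, here is a minimal sketch (not part of the original example) that compares the default predict() scores with a manual sigmoid over the raw logits. The model name and the use of the activation_fct parameter follow the classic sentence-transformers CrossEncoder API and are my assumptions, not code from the example:

```python
# Sketch: for a single-label CrossEncoder, predict() applies a sigmoid by default,
# so applying torch.sigmoid to the raw logits reproduces the same scores.
import numpy as np
import torch
from sentence_transformers import CrossEncoder

model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")  # assumed example model
pairs = [("How many people live in Berlin?",
          "Berlin has a population of about 3.5 million.")]

# Default behaviour: scores land in [0, 1] because of the sigmoid activation.
default_scores = model.predict(pairs)

# Raw logits: replace the default sigmoid with an identity activation.
raw_logits = model.predict(pairs, activation_fct=torch.nn.Identity())

# A sigmoid over the raw logits should match the default scores,
# which would not be the case if a softmax were applied instead.
manual_scores = torch.sigmoid(torch.as_tensor(raw_logits)).numpy()
print(np.allclose(default_scores, manual_scores, atol=1e-6))  # expected: True
```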
