guymorganb committed
Commit c869900 · 1 Parent(s): c359274

Fix imports for LSG code: explicit BERT import + import math + logger

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -14,7 +14,7 @@ Below is an example to encode queries and passages from the MS-MARCO passage ran

 ```python
 from sentence_transformers import SentenceTransformer
-model = SentenceTransformer('efederici/e5-large-v2-4096', {"trust_remote_code": True})
+model = SentenceTransformer('guymorganb/e5-large-v2-4096-lsg-patched', {"trust_remote_code": True})
 input_texts = [
 'query: how much protein should a female eat',
 'query: summit define',
@@ -45,8 +45,8 @@ input_texts = [
 "passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
 ]

-tokenizer = AutoTokenizer.from_pretrained('efederici/e5-large-v2-4096')
-model = AutoModel.from_pretrained('efederici/e5-large-v2-4096', trust_remote_code=True)
+tokenizer = AutoTokenizer.from_pretrained('guymorganb/e5-large-v2-4096-lsg-patched')
+model = AutoModel.from_pretrained('guymorganb/e5-large-v2-4096-lsg-patched', trust_remote_code=True)

 batch_dict = tokenizer(input_texts, max_length=4096, padding=True, truncation=True, return_tensors='pt')
 outputs = model(**batch_dict)
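For context, the README snippet touched by this commit stops at the raw model outputs. Below is a minimal end-to-end sketch of the Transformers path; the `average_pool` helper, L2 normalization, and cosine-similarity scoring follow the standard E5 usage recipe and are assumptions here, not part of this commit's diff. Only the model id and the tokenizer/model calls come from the patched README.

```python
# Sketch only: average_pool and the similarity scoring are the usual E5 recipe,
# assumed rather than taken from this commit.
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoModel, AutoTokenizer

def average_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor:
    # Zero out padded positions, then mean-pool over the sequence dimension.
    last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]

input_texts = [
    'query: how much protein should a female eat',
    'query: summit define',
    "passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments.",
]

tokenizer = AutoTokenizer.from_pretrained('guymorganb/e5-large-v2-4096-lsg-patched')
model = AutoModel.from_pretrained('guymorganb/e5-large-v2-4096-lsg-patched', trust_remote_code=True)

# Long-context tokenization (up to 4096 tokens), as in the README.
batch_dict = tokenizer(input_texts, max_length=4096, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)

# Pool, normalize, and score the two queries against the passage.
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```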