Update README.md
README.md
CHANGED
@@ -26,21 +26,15 @@ widget:
       example_title: "language - Portuguese"
     - text: "se potesse vivere ovunque, dove sarebbe? peter szemraj: Io"
       example_title: "dream living place - Italian"
-    - text: "What would you change about yourself if you could?\npeter szemraj:\n\n"
-      example_title: "change"
-    - text: "My first is in Asia, my second is in Europe, my third is in North America, and my fourth is in South America. What am I?\npeter szemraj:\n\n"
-      example_title: "continent"
     - text: "Can you take me for dinner somewhere nice this time?\npeter szemraj:\n\n"
       example_title: "dinner"
-    - text: "Honey, I have clogged the toilet for the third time this month.. sorry..\npeter szemraj:\n\n"
-      example_title: "overflow"
     - text: "What really makes you angry?\npeter szemraj:\n\n"
       example_title: "pet peeve"
     - text: "Jak nazwać aligatora, który właśnie przeszedł operację usunięcia lewego ramienia?peter szemraj: Ja"
       example_title: "alligator - Polish"
     - text: "Warum sind Transformers für die Sprachmodellierung wichtig? peter szemraj: Es ist"
       example_title: "Transformers - German"
-    - text: "как написать хорошие подсказки для языковых моделей? peter szemraj:
+    - text: "как написать хорошие подсказки для языковых моделей? peter szemraj: сначала вам нужно"
       example_title: "prompt tutorial - Russian"
     - text: "Pewien mężczyzna wpycha swój samochód do hotelu i mówi właścicielowi, że jest bankrutem. Dlaczego? peter szemraj: może"
       example_title: "brain teaser - Polish 2"
@@ -73,7 +67,7 @@ inference:
 **Interesting findings thus far:**
 
 - Passing a generic word after the `<name-identifier>` that is in a non-English language helps ensure the model responds in the question language (see: any example).
-- Model generations (in general) remain semantically consistent, even if the generations switch from `<language>`to English in the middle of the generated text. This demonstrates some sort of "universal concept understanding"
+- Model generations (in general) remain semantically consistent, even if the generations switch from `<language>`to English in the middle of the generated text. This demonstrates some sort of "universal concept understanding"
 
 ### Usage in python
 
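For reference, here is a minimal sketch of how the prompt format shown in the widget examples could be used from Python. It assumes the model is loaded through the `transformers` text-generation pipeline; the checkpoint id is a placeholder and the sampling parameters are illustrative, not values taken from the model card.

```python
# Minimal sketch (placeholder repo id and illustrative settings),
# not the model card's own usage example.
from transformers import pipeline

# Placeholder -- replace with this model's actual Hub identifier.
MODEL_ID = "<your-username>/<this-model>"

generator = pipeline("text-generation", model=MODEL_ID)

# Prompts follow the widget format: the question, then the name identifier.
# Per the findings above, appending a word in the question's language after
# "peter szemraj:" helps keep the reply in that language.
prompt = "Warum sind Transformers für die Sprachmodellierung wichtig? peter szemraj: Es ist"

result = generator(
    prompt,
    max_new_tokens=64,   # illustrative generation settings (assumptions)
    do_sample=True,
    top_k=50,
    temperature=0.7,
)

print(result[0]["generated_text"])
```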