prithivMLmods committed · Commit 02a67a4 · verified · 1 Parent(s): 304eea4

Update README.md

Files changed (1): README.md +32 -0
README.md CHANGED
@@ -18,3 +18,35 @@ tags:
# **PyThagorean-10B**

PyThagorean [Python + Math] is a Python- and mathematics-focused model designed to solve mathematical problems with Python code and libraries. It has been fine-tuned on 1.5 million entries and is built on LLaMA's architecture. The model is available in several parameter sizes, including 10B, 3B, and 1B (Tiny). These instruction-tuned, text-only models are optimized for multilingual dialogue use cases, including agent-based retrieval and summarization tasks. PyThagorean is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions employ supervised fine-tuning (SFT) and reinforcement learning from human feedback (RLHF) to align with human preferences for helpfulness and safety.

# **Use with transformers**

Starting with `transformers >= 4.43.0`, you can run conversational inference using the Transformers `pipeline` abstraction or by leveraging the Auto classes with the `generate()` function; a sketch of the latter follows the pipeline example below.

Make sure to update your transformers installation via `pip install --upgrade transformers`.

```python
import transformers
import torch

model_id = "prithivMLmods/PyThagorean-10B"

# Build a text-generation pipeline in bfloat16, spreading the model across the
# available devices.
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

# Chat-style input: a system instruction plus a math problem to solve in Python.
# The user message is a raw string so the LaTeX backslashes are preserved.
messages = [
    {"role": "system", "content": "You are a helpful assistant. Solve the mathematical problem in Python programming."},
    {"role": "user", "content": r"Find all real numbers $x$ such that \[\frac{x^3+2x^2}{x^2+3x+2} + x = -6.\] Enter all the solutions, separated by commas."},
]

outputs = pipeline(
    messages,
    max_new_tokens=256,
)

# The pipeline returns the full conversation; print the final (assistant) turn.
print(outputs[0]["generated_text"][-1])
```
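
For finer control over tokenization and decoding, the same conversation can also be run through the Auto classes with `generate()`. The following is a minimal sketch of that route, assuming the repository ships a chat template (as Llama-based instruct models typically do).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "prithivMLmods/PyThagorean-10B"

# Load the tokenizer and model; bfloat16 and device_map="auto" mirror the
# pipeline example above.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant. Solve the mathematical problem in Python programming."},
    {"role": "user", "content": r"Find all real numbers $x$ such that \[\frac{x^3+2x^2}{x^2+3x+2} + x = -6.\] Enter all the solutions, separated by commas."},
]

# Render the chat with the model's chat template and generate a completion.
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)

# Decode only the newly generated tokens (the assistant's reply).
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```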