TanishkB committed · Commit 5c70ebe · verified · 1 Parent(s): 9f7429c

Update README.md

Files changed (1): README.md +31 -4
README.md CHANGED
@@ -19,10 +19,37 @@ It has been trained using [TRL](https://github.com/huggingface/trl).
  ```python
  from transformers import pipeline

- question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
- generator = pipeline("text-generation", model="TanishkB/Philosopher", device="cuda")
- output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
- print(output["generated_text"])
+ # Load text-generation pipeline
+ generator = pipeline(
+     "text-generation",
+     model="TanishkB/RandomNumber",
+     device=-1  # use 0 if you have a GPU
+ )
+
+ print("Chat with it — type 'exit' to quit.")
+
+ while True:
+     user_input = input(">> ").strip()
+     if user_input.lower() in ("exit", "quit"):
+         break
+
+     # Build single-turn prompt (no history)
+     prompt = f"User: {user_input}\nAssistant:"
+
+     # Generate reply
+     response = generator(
+         prompt,
+         max_new_tokens=128,
+         return_full_text=False
+     )[0]["generated_text"]
+
+     # Clean up model output (remove repeated labels if any)
+     reply = response.strip()
+     if reply.lower().startswith("assistant:"):
+         reply = reply[len("assistant:"):].strip()
+
+     print(reply)
+
  ```

  ## Training procedure
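
For comparison, the snippet this commit replaces passed chat-style messages to the pipeline instead of a hand-built `User:`/`Assistant:` prompt. A minimal sketch of that pattern, assuming the `TanishkB/RandomNumber` checkpoint ships a chat template (typical for TRL-trained models); the example question is illustrative only:

```python
from transformers import pipeline

# Sketch only: chat-format input, as in the snippet this commit replaces.
# Assumes the checkpoint provides a chat template (usual for TRL-trained models).
generator = pipeline("text-generation", model="TanishkB/RandomNumber", device=-1)

messages = [{"role": "user", "content": "Which would you pick: the past or the future?"}]
output = generator(messages, max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```

With `return_full_text=False` the pipeline should return only the newly generated turn, so no "Assistant:" label stripping is needed here.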