Daemontatox committed
Commit 150ce81 · verified · 1 parent: 0ff168e

Update README.md

Files changed (1): README.md +4 -4
README.md CHANGED

````diff
@@ -80,7 +80,7 @@ Prompt:
 Write a simple multithreaded web server in Rust that serves "Hello, world!" to any GET request.
 
 HydraCoder Output:
-
+```rust
 use std::net::{TcpListener, TcpStream};
 use std::io::{Read, Write};
 use std::thread;
@@ -110,14 +110,14 @@ fn main() -> std::io::Result<()> {
 
     Ok(())
 }
-
+```
 
 ---
 
 ⚡ Inference Code
 
 You can run inference using transformers and text-generation pipeline:
-
+```python
 from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
 
 model_id = "Daemontatox/HydraCoder"
@@ -131,7 +131,7 @@ prompt = "Write a function in Rust that takes a list of integers and returns the
 
 output = pipe(prompt, max_new_tokens=200, do_sample=True, temperature=0.2)[0]["generated_text"]
 print(output)
-
+```
 
 ---
````
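The hunks only show the edges of the README's Rust answer (its lines 87–109 fall between the two hunks and are elided), so for context, here is a self-contained sketch of such a multithreaded server. Everything beyond the three `use` lines and the closing `Ok(())` / `}` is an assumption, not the model's actual output: the `hello_response` and `handle_client` helpers, the OS-assigned port, and the one-shot demo client in `main` are all invented so the example runs and exits cleanly.

```rust
use std::io::{Read, Write};
use std::net::{TcpListener, TcpStream};
use std::thread;

// Build the fixed HTTP response that every request receives.
fn hello_response() -> String {
    let body = "Hello, world!";
    format!(
        "HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\nContent-Length: {}\r\n\r\n{}",
        body.len(),
        body
    )
}

// Read (and ignore) the request, reply, then drop the stream to close it.
fn handle_client(mut stream: TcpStream) -> std::io::Result<()> {
    let mut buf = [0u8; 1024];
    let _ = stream.read(&mut buf)?;
    stream.write_all(hello_response().as_bytes())
}

fn main() -> std::io::Result<()> {
    // Port 0 lets the OS pick a free port, so the demo always binds.
    let listener = TcpListener::bind("127.0.0.1:0")?;
    let addr = listener.local_addr()?;

    // Accept loop on a background thread, one new thread per connection.
    thread::spawn(move || {
        for stream in listener.incoming().flatten() {
            thread::spawn(move || {
                let _ = handle_client(stream);
            });
        }
    });

    // Demo client: send one GET request and print the reply body.
    let mut client = TcpStream::connect(addr)?;
    client.write_all(b"GET / HTTP/1.1\r\n\r\n")?;
    let mut reply = String::new();
    client.read_to_string(&mut reply)?;
    println!("{}", reply.lines().last().unwrap_or("")); // prints "Hello, world!"
    Ok(())
}
```

The thread-per-connection design matches the `std::thread` import visible in the hunk; a real server would run the accept loop directly in `main` and never return, whereas this sketch serves a single request and exits so it can be executed as-is.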