Ziyi223 committed
Commit 6b3abe1 · verified · 1 Parent(s): 8db09f4

Update README.md

Files changed (1)
  1. README.md +4 -3
README.md CHANGED
@@ -27,6 +27,7 @@ tags:


*Latest News* 🔥
+ - [2025/07] We support **Python inference** on **macOS** and **Windows** using the prebuilt library!
- [2025/06] We **finally** released and **open-sourced** the **ONNX** model and the corresponding **preprocessing code**! Now you can deploy **TEN VAD** on **any platform** and **any hardware architecture**!
- [2025/06] We are excited to announce the release of **WASM+JS** for Web WASM Support.

@@ -256,7 +257,7 @@ The project supports five major platforms with dynamic library linking.
</table>

### **Python Usage**
- #### **1. Linux**
+ #### **1. Linux / macOS / Windows**
#### **Requirements**
- numpy (Version 1.17.4/1.26.4 verified)
- scipy (Version >= 1.5.0)
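As a minimal sketch, the requirements listed in the hunk above could be installed as follows; the virtual-environment name is illustrative, and the pinned numpy version is one of the two listed as verified:
```
# Optional: isolate the dependencies in a virtual environment (name is illustrative)
python3 -m venv ten-vad-env
source ten-vad-env/bin/activate

# Install the verified/required versions from the Requirements list above
pip install "numpy==1.26.4" "scipy>=1.5.0"
```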
@@ -377,7 +378,7 @@ The detailed usage methods of each platform are as follows <br>
- CMake
- Terminal

- Note that if you did not install **libc++1**, you have to run the code below to install it:
+ Note that if you did not install **libc++1** (Linux), you have to run the code below to install it:
```
sudo apt update
sudo apt install libc++1
@@ -395,7 +396,7 @@ You have to download the **onnxruntime** packages from the [microsoft official o
You can check the official **ONNX Runtime releases** from [this website](https://github.com/microsoft/onnxruntime/tags). For example, to download version 1.17.1 (Linux x64), use [this link](https://github.com/microsoft/onnxruntime/releases/download/v1.17.1/onnxruntime-linux-x64-1.17.1.tgz). After extracting the compressed file, you'll find two important directories: `include/` (header files) and `lib/` (library files).
```
1) cd examples_onnx/
- 2) ./build-and-deploy-linux.sh --ort-root /absolute/path/to/your/onnxruntime/root/dir
+ 2) ./build-and-deploy-linux.sh --ort-path /absolute/path/to/your/onnxruntime/root/dir
```
Note 1: If you run the onnx demo from a different directory than the one used when running build-and-deploy-linux.sh, create a symbolic link to src/onnx_model/ to prevent ONNX model file loading failures.
<br>
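Putting the ONNX Runtime steps above together, a hedged end-to-end sketch could look like the following. It assumes the archive from the release link above is downloaded and extracted next to examples_onnx/, and it uses the corrected --ort-path flag from this commit:
```
# Download and unpack ONNX Runtime 1.17.1 (Linux x64) via the release link above
wget https://github.com/microsoft/onnxruntime/releases/download/v1.17.1/onnxruntime-linux-x64-1.17.1.tgz
tar -xzf onnxruntime-linux-x64-1.17.1.tgz    # extracted root contains include/ and lib/

# Build and deploy the ONNX demo, passing the absolute path of the extracted root
cd examples_onnx/
./build-and-deploy-linux.sh --ort-path "$(realpath ../onnxruntime-linux-x64-1.17.1)"
```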
 