auto-patch README.md
README.md CHANGED
@@ -1,7 +1,35 @@
 ---
 base_model: LeroyDyer/SpydazWeb_AI_HumanAGI_002
+datasets:
+- neoneye/base64-decode-v2
+- neoneye/base64-encode-v1
+- VuongQuoc/Chemistry_text_to_image
+- Kamizuru00/diagram_image_to_text
+- LeroyDyer/Chemistry_text_to_image_BASE64
+- LeroyDyer/AudioCaps-Spectrograms_to_Base64
+- LeroyDyer/winogroud_text_to_imaget_BASE64
+- LeroyDyer/chart_text_to_Base64
+- LeroyDyer/diagram_image_to_text_BASE64
+- mekaneeky/salt_m2e_15_3_instruction
+- mekaneeky/SALT-languages-bible
+- xz56/react-llama
+- BeIR/hotpotqa
+- arcee-ai/agent-data
 language:
 - en
+- sw
+- ig
+- so
+- es
+- ca
+- xh
+- zu
+- ha
+- tw
+- af
+- hi
+- bm
+- su
 library_name: transformers
 license: apache-2.0
 quantized_by: mradermacher
@@ -10,7 +38,46 @@ tags:
 - transformers
 - unsloth
 - mistral
--
+- Mistral_Star
+- Mistral_Quiet
+- Mistral
+- Mixtral
+- Question-Answer
+- Token-Classification
+- Sequence-Classification
+- SpydazWeb-AI
+- chemistry
+- biology
+- legal
+- code
+- climate
+- medical
+- LCARS_AI_StarTrek_Computer
+- text-generation-inference
+- chain-of-thought
+- tree-of-knowledge
+- forest-of-thoughts
+- visual-spacial-sketchpad
+- alpha-mind
+- knowledge-graph
+- entity-detection
+- encyclopedia
+- wikipedia
+- stack-exchange
+- Reddit
+- Cyber-series
+- MegaMind
+- Cybertron
+- SpydazWeb
+- Spydaz
+- LCARS
+- star-trek
+- mega-transformers
+- Mulit-Mega-Merge
+- Multi-Lingual
+- Afro-Centric
+- African-Model
+- Ancient-One
 ---
 ## About
 
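Not part of the commit itself, but the patched front matter has to stay valid YAML for the Hub to index the `datasets`, `language`, and `tags` keys. A minimal sanity-check sketch (the use of PyYAML and the abridged snippet below are assumptions, not the auto-patch tool's actual validation):

```python
import yaml  # assumption: PyYAML is installed (pip install pyyaml)

# Abridged copy of the patched front matter, limited to keys this diff touches.
front_matter = """
base_model: LeroyDyer/SpydazWeb_AI_HumanAGI_002
datasets:
- neoneye/base64-decode-v2
- BeIR/hotpotqa
language:
- en
- sw
- su
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- transformers
- unsloth
- mistral
- Ancient-One
"""

meta = yaml.safe_load(front_matter)

# Each list the diff extends must parse as a YAML sequence, not a string.
assert isinstance(meta["datasets"], list)
assert isinstance(meta["language"], list)
assert isinstance(meta["tags"], list)
assert "sw" in meta["language"]
print(meta["base_model"])  # LeroyDyer/SpydazWeb_AI_HumanAGI_002
```

If a stray item (like the bare `-` this patch removes) breaks the list, `safe_load` returns a malformed structure or raises, which is exactly what such a check would catch.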