ozkurt7 committed on
Commit e99891c · verified · 1 Parent(s): 89d63fd

Upload README.md with huggingface_hub

Files changed (1): README.md added (+90 lines)
---
base_model: unsloth/DeepSeek-R1-Distill-Llama-8B-unsloth-bnb-4bit
library_name: peft
license: apache-2.0
tags:
- oracle
- scm
- fusion-cloud
- adapter
- dora
- lora
language:
- en
pipeline_tag: text-generation
---

# Oracle Fusion Cloud SCM - DoRA Adapter

This is a **DoRA (Weight-Decomposed Low-Rank Adaptation)** adapter specialized in Oracle Fusion Cloud SCM topics.

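DoRA differs from plain LoRA in that it splits each pretrained weight into a per-column magnitude and a direction, and applies the low-rank update to the direction only. A minimal pure-Python sketch of the merged-weight formula, `W' = m · (W0 + BA) / ||W0 + BA||` (toy matrices for illustration, not the actual `peft` implementation):

```python
import math

def column_norms(W):
    """Euclidean norm of each column of a matrix given as a list of rows."""
    return [math.sqrt(sum(x * x for x in col)) for col in zip(*W)]

def dora_merge(W0, B, A, m):
    """DoRA merge: W' = m * (W0 + B @ A) / ||W0 + B @ A||, column-wise.

    W0: base weight, B @ A: low-rank update, m: learned magnitude per column.
    """
    # Low-rank update: delta = B @ A
    delta = [[sum(B[i][k] * A[k][j] for k in range(len(A)))
              for j in range(len(A[0]))] for i in range(len(B))]
    # Directional component V = W0 + delta
    V = [[W0[i][j] + delta[i][j] for j in range(len(W0[0]))]
         for i in range(len(W0))]
    norms = column_norms(V)
    # Rescale each column of V to the learned magnitude m[j]
    return [[m[j] * V[i][j] / norms[j] for j in range(len(V[0]))]
            for i in range(len(V))]

W0 = [[1.0, 0.0], [0.0, 1.0]]   # toy 2x2 base weight
B = [[0.5], [0.0]]              # rank-1 factors B (2x1) and A (1x2)
A = [[0.0, 1.0]]
m = [1.0, 2.0]                  # learned per-column magnitudes

W = dora_merge(W0, B, A, m)
print(column_norms(W))  # each column of W' now has the learned magnitude m[j]
```

After the merge, the magnitude of each column is controlled entirely by `m`, which is what lets DoRA retune magnitude and direction separately during training.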
## 🎯 Usage

### Merging in Google Colab:
```python
# 1. Load the base model
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel
import torch

base_model_name = "unsloth/DeepSeek-R1-Distill-Llama-8B-unsloth-bnb-4bit"
adapter_name = "ozkurt7/oracle-deepseek-r1-adapter"

# 2. Load the model and the adapter
tokenizer = AutoTokenizer.from_pretrained(base_model_name)
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_name,
    torch_dtype=torch.float16,
    device_map="auto",
)

model = PeftModel.from_pretrained(base_model, adapter_name)

# 3. Merge the adapter weights into the base model
merged_model = model.merge_and_unload()

# 4. Save the merged model
merged_model.save_pretrained("./oracle-merged")
tokenizer.save_pretrained("./oracle-merged")

# 5. Quick test
messages = [
    {"role": "system", "content": "You are an Oracle Fusion Cloud SCM expert."},
    {"role": "user", "content": "What is Oracle SCM Cloud?"},
]

text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(text, return_tensors="pt").to(merged_model.device)  # move inputs to the model's device
outputs = merged_model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```

### Local usage:
```bash
# Download the adapter
git clone https://huggingface.co/ozkurt7/oracle-deepseek-r1-adapter

# Merge in Python
python merge_adapter.py
```

## 📊 Model Details
- **Base Model**: unsloth/DeepSeek-R1-Distill-Llama-8B-unsloth-bnb-4bit
- **Technique**: DoRA (Weight-Decomposed Low-Rank Adaptation)
- **Domain**: Oracle Fusion Cloud SCM
- **Status**: Adapter only (merge required)
- **Memory**: ~500MB (adapter only)

## 🚀 Next Steps
1. Merge this adapter with the base model in Google Colab
2. Upload the merged model to a new repo
3. Convert it to GGUF format

## 🛠️ Troubleshooting
- **Memory Error**: use Colab Pro or merge locally
- **Loading Error**: add `trust_remote_code=True`
- **CUDA Error**: use `device_map="auto"`
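On the memory error: a quick back-of-the-envelope check shows why merging is tight on a free-tier Colab GPU. Materializing the full fp16 weights of the 8B-parameter base model takes roughly 15 GiB on its own, before activations:

```python
# Weights-only fp16 footprint of an 8B-parameter model
# (activations and KV cache come on top of this)
params = 8e9            # approximate parameter count of the base model
bytes_per_param = 2     # fp16 = 2 bytes per parameter
gib = params * bytes_per_param / 1024**3
print(f"{gib:.1f} GiB")  # ≈ 14.9 GiB
```

This is why the 4-bit (bnb) base checkpoint is used for loading, and why merged fp16 output is best saved straight to disk rather than kept resident.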
88
+
89
+ **Created by**: Kaggle → Google Colab workflow
90
+ **Date**: 2025-08-12