The input and output embedding weights in this checkpoint may not be tied. Loading it with `transformers` emits the following warning:

```
Some weights of LlamaForCausalLM were not initialized from the model checkpoint at ./llama3-pa/llama3-32kpa-emb-init-weight-tied and are newly initialized: ['lm_head.weight']
```

To re-tie them after loading, the following code was used:

```python
from transformers import AutoModelForCausalLM

source_model = AutoModelForCausalLM.from_pretrained("./llama3-pa/llama3-32kpa-emb-init-weight-tied")
# Point the output head at the input embedding matrix, then tie them
source_model.lm_head.weight = source_model.model.embed_tokens.weight
source_model.tie_weights()
```
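
As a quick sanity check (a minimal sketch, not part of the original card), one can confirm that both module attributes now refer to the same underlying tensor:

```python
# Hypothetical check: after tying, both names point to one shared Parameter
assert source_model.lm_head.weight is source_model.model.embed_tokens.weight
print(source_model.lm_head.weight.data_ptr()
      == source_model.model.embed_tokens.weight.data_ptr())  # True
```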