rajabmondal committed · Commit 558d010 · verified · 1 parent: 5ce8d3f

updated model summary

Files changed (1): README.md (+2 −3)
README.md CHANGED
@@ -53,13 +53,13 @@ duplicated_from: bigcode-data/starcoderbase-1b
 
 ## Model Summary
 
-The Narrow Transformer (NT) model NT-Java-1.1B is an open-source specialized code model built by extending pre-training on starcoderbase-1b, designed for code related tasks in Java programming. The model is a decoder-only transformer with Multi-Query-Attention and a context length of 8192 tokens. The model has been trained with Java subset of the starcoderdata dataset, which is ~22B tokens.
+The Narrow Transformer (NT) model NT-Java-1.1B is an open-source specialized code model built by extending pre-training on StarCoderBase-1B, designed for coding tasks in Java programming. The model is a decoder-only transformer with Multi-Query-Attention and with a context length of 8192 tokens. The model was trained with Java subset of the StarCoderData dataset, which is ~22B tokens.
 
 - **Repository:** [bigcode/Megatron-LM](https://github.com/bigcode-project/Megatron-LM)
 - **Project Website:**
 - **Paper:**
 - **Point of Contact:**
-- **Languages:** Java
+- **Language(s):** Java
 
 ## Use
 
@@ -115,7 +115,6 @@ The model, NT-Java-1.1B, has been trained on publicly available datasets and com
 
 - **Orchestration:** [Megatron-LM](https://github.com/bigcode-project/Megatron-LM)
 - **Neural networks:** [PyTorch](https://github.com/pytorch/pytorch)
-- **BP16 if applicable:** [apex](https://github.com/NVIDIA/apex)
 
 # License
 The model checkpoint and vocabulary file are licensed under the [BigCode OpenRAIL-M v1](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement). Under the license, you must evaluate whether your use case violates the use-case restrictions under Attachment A of the License. Any modification of the model (finetuning or extended pre-training) for a further downstream task needs to be released under the [BigCode OpenRAIL-M v1](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement).
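Since the updated summary describes a standard decoder-only causal language model, checkpoints like this one are typically loadable through the Hugging Face `transformers` API. Below is a minimal usage sketch; the repository id `infosys/NT-Java-1.1B` and the sample prompt are illustrative assumptions, not taken from this diff.

```python
# Minimal sketch: loading the Java code model described in the summary
# and generating a completion with the standard transformers API.
# NOTE: the checkpoint id is an assumption; substitute the actual repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "infosys/NT-Java-1.1B"  # assumed Hugging Face repository id

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# A Java completion prompt, since the model was pre-trained on Java only.
prompt = "public static int fibonacci(int n) {"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```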