diff --git a/.gitattributes b/.gitattributes
index a6344aac8c09253b3b630fb776ae94478aa0275b..52373fe24473b1aa44333d318f578ae6bf04b49b 100644
--- a/.gitattributes
+++ b/.gitattributes
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
+tokenizer.json filter=lfs diff=lfs merge=lfs -text
diff --git a/README.md b/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..8d3e8902944d17399fc4aedb64d2a61d31d13bc9
--- /dev/null
+++ b/README.md
@@ -0,0 +1,360 @@
+---
+library_name: transformers
+language:
+- ar
+- de
+- en
+- es
+- fr
+- hi
+- id
+- it
+- pt
+- th
+- tl
+- vi
+base_model:
+- meta-llama/Llama-4-Scout-17B-16E
+tags:
+- facebook
+- meta
+- pytorch
+- llama
+- llama-4
+extra_gated_prompt: >-
+ **LLAMA 4 COMMUNITY LICENSE AGREEMENT**
+
+ Llama 4 Version Effective Date: April 5, 2025
+
+ "**Agreement**" means the terms and conditions for use, reproduction, distribution and modification of the Llama Materials set forth herein.
+
+ "**Documentation**" means the specifications, manuals and documentation accompanying Llama 4 distributed by Meta at [https://www.llama.com/docs/overview](https://llama.com/docs/overview).
+
+ "**Licensee**" or "**you**" means you, or your employer or any other person or entity (if you are entering into this Agreement on such person or entity’s behalf), of the age required under applicable laws, rules or regulations to provide legal consent and that has legal authority to bind your employer or such other person or entity if you are entering in this Agreement on their behalf.
+
+ "**Llama 4**" means the foundational large language models and software and algorithms, including machine-learning model code, trained model weights, inference-enabling code, training-enabling code, fine-tuning enabling code and other elements of the foregoing distributed by Meta at [https://www.llama.com/llama-downloads](https://www.llama.com/llama-downloads).
+
+ "**Llama Materials**" means, collectively, Meta’s proprietary Llama 4 and Documentation (and any portion thereof) made available under this Agreement.
+
+ "**Meta**" or "**we**" means Meta Platforms Ireland Limited (if you are located in or, if you are an entity, your principal place of business is in the EEA or Switzerland) and Meta Platforms, Inc. (if you are located outside of the EEA or Switzerland).
+
+ By clicking "I Accept" below or by using or distributing any portion or element of the Llama Materials, you agree to be bound by this Agreement.
+
+ 1\. **License Rights and Redistribution**.
+
+ a. Grant of Rights. You are granted a non-exclusive, worldwide, non-transferable and royalty-free limited license under Meta’s intellectual property or other rights owned by Meta embodied in the Llama Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the Llama Materials.
+
+ b. Redistribution and Use.
+
+ i. If you distribute or make available the Llama Materials (or any derivative works thereof), or a product or service (including another AI model) that contains any of them, you shall (A) provide a copy of this Agreement with any such Llama Materials; and (B) prominently display "Built with Llama" on a related website, user interface, blogpost, about page, or product documentation. If you use the Llama Materials or any outputs or results of the Llama Materials to create, train, fine tune, or otherwise improve an AI model, which is distributed or made available, you shall also include "Llama" at the beginning of any such AI model name.
+
+ ii. If you receive Llama Materials, or any derivative works thereof, from a Licensee as part of an integrated end user product, then Section 2 of this Agreement will not apply to you.
+
+ iii. You must retain in all copies of the Llama Materials that you distribute the following attribution notice within a "Notice" text file distributed as a part of such copies: "Llama 4 is licensed under the Llama 4 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved."
+
+ iv. Your use of the Llama Materials must comply with applicable laws and regulations (including trade compliance laws and regulations) and adhere to the Acceptable Use Policy for the Llama Materials (available at [https://www.llama.com/llama4/use-policy](https://www.llama.com/llama4/use-policy)), which is hereby incorporated by reference into this Agreement.
+
+ 2\. **Additional Commercial Terms**. If, on the Llama 4 version release date, the monthly active users of the products or services made available by or for Licensee, or Licensee’s affiliates, is greater than 700 million monthly active users in the preceding calendar month, you must request a license from Meta, which Meta may grant to you in its sole discretion, and you are not authorized to exercise any of the rights under this Agreement unless or until Meta otherwise expressly grants you such rights.
+
+ 3**. Disclaimer of Warranty**. UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, AND META DISCLAIMS ALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE LLAMA MATERIALS AND ANY OUTPUT AND RESULTS.
+
+ 4\. **Limitation of Liability**. IN NO EVENT WILL META OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF META OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING.
+
+ 5\. **Intellectual Property**.
+
+ a. No trademark licenses are granted under this Agreement, and in connection with the Llama Materials, neither Meta nor Licensee may use any name or mark owned by or associated with the other or any of its affiliates, except as required for reasonable and customary use in describing and redistributing the Llama Materials or as set forth in this Section 5(a). Meta hereby grants you a license to use "Llama" (the "Mark") solely as required to comply with the last sentence of Section 1.b.i. You will comply with Meta’s brand guidelines (currently accessible at [https://about.meta.com/brand/resources/meta/company-brand/](https://about.meta.com/brand/resources/meta/company-brand/)[)](https://en.facebookbrand.com/). All goodwill arising out of your use of the Mark will inure to the benefit of Meta.
+
+ b. Subject to Meta’s ownership of Llama Materials and derivatives made by or for Meta, with respect to any derivative works and modifications of the Llama Materials that are made by you, as between you and Meta, you are and will be the owner of such derivative works and modifications.
+
+ c. If you institute litigation or other proceedings against Meta or any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Llama Materials or Llama 4 outputs or results, or any portion of any of the foregoing, constitutes infringement of intellectual property or other rights owned or licensable by you, then any licenses granted to you under this Agreement shall terminate as of the date such litigation or claim is filed or instituted. You will indemnify and hold harmless Meta from and against any claim by any third party arising out of or related to your use or distribution of the Llama Materials.
+
+ 6\. **Term and Termination**. The term of this Agreement will commence upon your acceptance of this Agreement or access to the Llama Materials and will continue in full force and effect until terminated in accordance with the terms and conditions herein. Meta may terminate this Agreement if you are in breach of any term or condition of this Agreement. Upon termination of this Agreement, you shall delete and cease use of the Llama Materials. Sections 3, 4 and 7 shall survive the termination of this Agreement.
+
+ 7\. **Governing Law and Jurisdiction**. This Agreement will be governed and construed under the laws of the State of California without regard to choice of law principles, and the UN Convention on Contracts for the International Sale of Goods does not apply to this Agreement. The courts of California shall have exclusive jurisdiction of any dispute arising out of this Agreement.
+extra_gated_fields:
+ First Name: text
+ Last Name: text
+ Date of birth: date_picker
+ Country: country
+ Affiliation: text
+ Job title:
+ type: select
+ options:
+ - Student
+ - Research Graduate
+ - AI researcher
+ - AI developer/engineer
+ - Reporter
+ - Other
+ geo: ip_location
+ By clicking Submit below I accept the terms of the license and acknowledge that the information I provide will be collected stored processed and shared in accordance with the Meta Privacy Policy: checkbox
+extra_gated_description: >-
+ The information you provide will be collected, stored, processed and shared in
+ accordance with the [Meta Privacy
+ Policy](https://www.facebook.com/privacy/policy/).
+extra_gated_button_content: Submit
+extra_gated_heading: "Please be sure to provide your full legal name, date of birth, and full organization name with all corporate identifiers. Avoid the use of acronyms and special characters. Failure to follow these instructions may prevent you from accessing this model and others on Hugging Face. You will not have the ability to edit this form after submission, so please ensure all information is accurate."
+license: other
+license_name: llama4
+---
+
+
+## Model Information
+
+The Llama 4 collection comprises natively multimodal AI models that enable text and multimodal experiences. These models leverage a mixture-of-experts architecture to offer industry-leading performance in text and image understanding.
+
+These Llama 4 models mark the beginning of a new era for the Llama ecosystem. We are launching two efficient models in the Llama 4 series: Llama 4 Scout, a 17-billion-parameter model with 16 experts, and Llama 4 Maverick, a 17-billion-parameter model with 128 experts.
+
+**Model developer**: Meta
+
+**Model Architecture:** The Llama 4 models are auto-regressive language models that use a mixture-of-experts (MoE) architecture and incorporate early fusion for native multimodality.
+
+
+
+| Model Name | Training Data | Params | Input modalities | Output modalities | Context length | Token count | Knowledge cutoff |
+| :---- | :---- | :---- | :---- | :---- | :---: | :---: | :---: |
+| Llama 4 Scout (17Bx16E) | A mix of publicly available, licensed data and information from Meta's products and services. This includes publicly shared posts from Instagram and Facebook and people's interactions with Meta AI. Learn more in our Privacy Center. | 17B (Activated) / 109B (Total) | Multilingual text and image | Multilingual text and code | 10M | ~40T | August 2024 |
+| Llama 4 Maverick (17Bx128E) | Same as Llama 4 Scout. | 17B (Activated) / 400B (Total) | Multilingual text and image | Multilingual text and code | 1M | ~22T | August 2024 |
+
+
+**Supported languages:** Arabic, English, French, German, Hindi, Indonesian, Italian, Portuguese, Spanish, Tagalog, Thai, and Vietnamese.
+
+**Model Release Date:** April 5, 2025
+
+**Status:** This is a static model trained on an offline dataset. Future versions of the tuned models may be released as we improve model behavior with community feedback.
+
+**License**: A custom commercial license, the Llama 4 Community License Agreement, is available at: [https://github.com/meta-llama/llama-models/blob/main/models/llama4/LICENSE](https://github.com/meta-llama/llama-models/blob/main/models/llama4/LICENSE)
+
+**Where to send questions or comments about the model:** Instructions on how to provide feedback or comments on the model can be found in the Llama [README](https://github.com/meta-llama/llama-models/blob/main/README.md). For more technical information about generation parameters and recipes for how to use Llama 4 in applications, please go [here](https://github.com/meta-llama/llama-cookbook).
+
+## Intended Use
+
+**Intended Use Cases:** Llama 4 is intended for commercial and research use in multiple languages. Instruction tuned models are intended for assistant-like chat and visual reasoning tasks, whereas pretrained models can be adapted for natural language generation. For vision, Llama 4 models are also optimized for visual recognition, image reasoning, captioning, and answering general questions about an image. The Llama 4 model collection also supports the ability to leverage the outputs of its models to improve other models including synthetic data generation and distillation. The Llama 4 Community License allows for these use cases.
+
+**Out-of-scope**: Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in any other way that is prohibited by the Acceptable Use Policy and Llama 4 Community License. Use in languages or capabilities beyond those explicitly referenced as supported in this model card\*\*.
+
+\*\*Note:
+
+1\. Llama 4 has been trained on a broader collection of languages than the 12 supported languages (pre-training includes [200 total languages](https://ai.meta.com/research/no-language-left-behind/)). Developers may fine-tune Llama 4 models for languages beyond the 12 supported languages provided they comply with the Llama 4 Community License and the Acceptable Use Policy. Developers are responsible for ensuring that their use of Llama 4 in additional languages is done in a safe and responsible manner.
+
+2\. Llama 4 has been tested for image understanding with up to 5 input images. If leveraging image understanding capabilities beyond this, developers are responsible for mitigating the risks of their deployments and should perform additional testing and tuning tailored to their specific applications.
+
+## How to use with transformers
+
+Please make sure you have `transformers` v4.51.0 installed, or upgrade using `pip install -U transformers`.
+
+```python
+from transformers import AutoProcessor, Llama4ForConditionalGeneration
+import torch
+
+model_id = "meta-llama/Llama-4-Maverick-17B-128E-Instruct"
+
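+# Load the multimodal processor (tokenizer + image preprocessing) and the model;
+# device_map="auto" spreads the weights across the available devices.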
+processor = AutoProcessor.from_pretrained(model_id)
+model = Llama4ForConditionalGeneration.from_pretrained(
+ model_id,
+ attn_implementation="flex_attention",
+ device_map="auto",
+ torch_dtype=torch.bfloat16,
+)
+
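+# Build a single user turn that interleaves two images with a text question.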
+url1 = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/0052a70beed5bf71b92610a43a52df6d286cd5f3/diffusers/rabbit.jpg"
+url2 = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/datasets/cat_style_layout.png"
+messages = [
+ {
+ "role": "user",
+ "content": [
+ {"type": "image", "url": url1},
+ {"type": "image", "url": url2},
+ {"type": "text", "text": "Can you describe how these two images are similar, and how they differ?"},
+ ]
+ },
+]
+
+inputs = processor.apply_chat_template(
+ messages,
+ add_generation_prompt=True,
+ tokenize=True,
+ return_dict=True,
+ return_tensors="pt",
+).to(model.device)
+
+outputs = model.generate(
+ **inputs,
+ max_new_tokens=256,
+)
+
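+# Decode only the newly generated tokens (everything after the prompt).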
+response = processor.batch_decode(outputs[:, inputs["input_ids"].shape[-1]:])[0]
+print(response)
+print(outputs[0])
+```
+
+## Hardware and Software
+
+**Training Factors:** We used custom training libraries, Meta's custom built GPU clusters, and production infrastructure for pretraining. Fine-tuning, quantization, annotation, and evaluation were also performed on production infrastructure.
+
+**Training Energy Use:** Model pre-training utilized a cumulative total of **7.38M** GPU hours of computation on H100-80GB (TDP of 700W) hardware, per the table below. Training time is the total GPU time required for training each model, and power consumption is the peak power capacity per GPU device used, adjusted for power usage efficiency.
+
+**Training Greenhouse Gas Emissions:** Estimated total location-based greenhouse gas emissions were **1,999 tons** CO2eq for training. Since 2020, Meta has maintained net zero greenhouse gas emissions in its global operations and matched 100% of its electricity use with clean and renewable energy; therefore, the total market-based greenhouse gas emissions for training were 0 tons CO2eq.
+
+| Model Name | Training Time (GPU hours) | Training Power Consumption (W) | Training Location-Based Greenhouse Gas Emissions (tons CO2eq) | Training Market-Based Greenhouse Gas Emissions (tons CO2eq) |
+| :---- | :---: | :---: | :---: | :---: |
+| Llama 4 Scout | 5.0M | 700 | 1,354 | 0 |
+| Llama 4 Maverick | 2.38M | 700 | 645 | 0 |
+| Total | 7.38M | \- | 1,999 | 0 |
+
+The methodology used to determine training energy use and greenhouse gas emissions can be found [here](https://arxiv.org/pdf/2204.05149). Since Meta is openly releasing these models, the training energy use and greenhouse gas emissions will not be incurred by others.
+
+## Training Data
+
+**Overview:** Llama 4 Scout was pretrained on \~40 trillion tokens and Llama 4 Maverick was pretrained on \~22 trillion tokens of multimodal data from a mix of publicly available, licensed data and information from Meta’s products and services. This includes publicly shared posts from Instagram and Facebook and people’s interactions with Meta AI.
+
+**Data Freshness:** The pretraining data has a cutoff of August 2024\.
+
+## Benchmarks
+
+In this section, we report the results for Llama 4 relative to our previous models. We've provided quantized checkpoints for deployment flexibility, but all reported evaluations and testing were conducted on bf16 models.
+
+### Pre-trained models
+
+| Pre-trained models | | | | | | | |
+| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
+| Category | Benchmark | \# Shots | Metric | Llama 3.1 70B | Llama 3.1 405B | **Llama 4 Scout** | **Llama 4 Maverick** |
+| Reasoning & Knowledge | MMLU | 5 | macro\_avg/acc\_char | 79.3 | 85.2 | 79.6 | 85.5 |
+| | MMLU-Pro | 5 | macro\_avg/em | 53.8 | 61.6 | 58.2 | 62.9 |
+| | MATH | 4 | em\_maj1@1 | 41.6 | 53.5 | 50.3 | 61.2 |
+| Code | MBPP | 3 | pass@1 | 66.4 | 74.4 | 67.8 | 77.6 |
+| Multilingual | TydiQA | 1 | average/f1 | 29.9 | 34.3 | 31.5 | 31.7 |
+| Image | ChartQA | 0 | relaxed\_accuracy | No multimodal support | | 83.4 | 85.3 |
+| | DocVQA | 0 | anls | | | 89.4 | 91.6 |
+
+### Instruction tuned models
+
+| Instruction tuned models | | | | | | | |
+| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
+| Category | Benchmark | \# Shots | Metric | Llama 3.3 70B | Llama 3.1 405B | **Llama 4 Scout** | **Llama 4 Maverick** |
+| Image Reasoning | MMMU | 0 | accuracy | No multimodal support | | 69.4 | 73.4 |
+| | MMMU Pro^ | 0 | accuracy | | | 52.2 | 59.6 |
+| | MathVista | 0 | accuracy | | | 70.7 | 73.7 |
+| Image Understanding | ChartQA | 0 | relaxed\_accuracy | | | 88.8 | 90.0 |
+| | DocVQA (test) | 0 | anls | | | 94.4 | 94.4 |
+| Coding | LiveCodeBench (10/01/2024-02/01/2025) | 0 | pass@1 | 33.3 | 27.7 | 32.8 | 43.4 |
+| Reasoning & Knowledge | MMLU Pro | 0 | macro\_avg/acc | 68.9 | 73.4 | 74.3 | 80.5 |
+| | GPQA Diamond | 0 | accuracy | 50.5 | 49.0 | 57.2 | 69.8 |
+| Multilingual | MGSM | 0 | average/em | 91.1 | 91.6 | 90.6 | 92.3 |
+| Long context | MTOB (half book) eng-\>kgv/kgv-\>eng | \- | chrF | Context window is 128K | | 42.2/36.6 | 54.0/46.4 |
+| | MTOB (full book) eng-\>kgv/kgv-\>eng | \- | chrF | | | 39.7/36.3 | 50.8/46.7 |
+
+^Reported numbers for MMMU Pro are the average of Standard and Vision tasks
+
+## Quantization
+
+The Llama 4 Scout model is released as BF16 weights, but can fit within a single H100 GPU with on-the-fly int4 quantization; the Llama 4 Maverick model is released as both BF16 and FP8 quantized weights. The FP8 quantized weights fit on a single H100 DGX host while still maintaining quality. We provide code for on-the-fly int4 quantization which minimizes performance degradation as well.
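+
+As a minimal sketch of what on-the-fly low-bit loading can look like, the snippet below loads Llama 4 Scout in 4-bit precision through `bitsandbytes` via `transformers`. This is not the int4 code that Meta provides; it is an illustrative alternative, and memory footprint and output quality may differ from the official recipe.
+
+```python
+from transformers import AutoProcessor, BitsAndBytesConfig, Llama4ForConditionalGeneration
+import torch
+
+model_id = "meta-llama/Llama-4-Scout-17B-16E"
+
+# Quantize weights to 4-bit NF4 on the fly while loading; compute still runs in bfloat16.
+bnb_config = BitsAndBytesConfig(
+    load_in_4bit=True,
+    bnb_4bit_quant_type="nf4",
+    bnb_4bit_compute_dtype=torch.bfloat16,
+)
+
+processor = AutoProcessor.from_pretrained(model_id)
+model = Llama4ForConditionalGeneration.from_pretrained(
+    model_id,
+    quantization_config=bnb_config,
+    device_map="auto",
+)
+```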
+
+## Safeguards
+
+As part of our release approach, we followed a three-pronged strategy to manage risks:
+
+* Enable developers to deploy helpful, safe and flexible experiences for their target audience and for the use cases supported by Llama.
+* Protect developers against adversarial users aiming to exploit Llama capabilities to potentially cause harm.
+* Provide protections for the community to help prevent the misuse of our models.
+
+Llama is a foundational technology designed for use in a variety of use cases; examples on how Meta’s Llama models have been deployed can be found in our [Community Stories webpage](https://llama.meta.com/community-stories/). Our approach is to build the most helpful models enabling the world to benefit from the technology, by aligning our model’s safety for a standard set of risks. Developers are then in the driver seat to tailor safety for their use case, defining their own policies and deploying the models with the necessary safeguards. Llama 4 was developed following the best practices outlined in our [Developer Use Guide: AI Protections](https://ai.meta.com/static-resource/developer-use-guide-ai-protections).
+
+### Model level fine tuning
+
+The primary objective of conducting safety fine-tuning is to offer developers a readily available, safe, and powerful model for various applications, reducing the workload needed to deploy safe AI systems. Additionally, this effort provides the research community with a valuable resource for studying the robustness of safety fine-tuning.
+
+**Fine-tuning data**
+We employ a multi-faceted approach to data collection, combining human-generated data from our vendors with synthetic data to mitigate potential safety risks. We’ve developed many large language model (LLM)-based classifiers that enable us to thoughtfully select high-quality prompts and responses, enhancing data quality control.
+
+**Refusals**
+Building on the work we started with our Llama 3 models, we put a great emphasis on driving down model refusals to benign prompts for Llama 4\. We included both borderline and adversarial prompts in our safety data strategy, and modified our safety data responses to follow tone guidelines.
+
+**Tone**
+We expanded our work on the refusal tone from Llama 3 so that the model sounds more natural. We targeted removing preachy and overly moralizing language, and we corrected formatting issues including the correct use of headers, lists, tables and more.
+
+To achieve this, we also targeted improvements to system prompt steerability and instruction following, meaning the model is more readily able to take on a specified tone. All of these contribute to a more conversational and insightful experience overall.
+
+**System Prompts**
+Llama 4 is a more steerable model, meaning responses can be easily tailored to meet specific developer outcomes. Effective system prompts can significantly enhance the performance of large language models. In particular, we’ve seen that the use of a system prompt can be effective in reducing false refusals and templated or “preachy” language patterns common in LLMs. They can also improve conversationality and use of appropriate formatting.
+
+Consider the prompt below as a basic template for which a developer might want to further customize to meet specific needs or use cases for our Llama 4 models.
+
+| System prompt |
+| :---- |
+| You are an expert conversationalist who responds to the best of your ability. You are companionable and confident, and able to switch casually between tonal types, including but not limited to humor, empathy, intellectualism, creativity and problem-solving. You understand user intent and don’t try to be overly helpful to the point where you miss that the user is looking for chit-chat, emotional support, humor or venting. Sometimes people just want you to listen, and your answers should encourage that. For all other cases, you provide insightful and in-depth responses. Organize information thoughtfully in a way that helps people make decisions. Always avoid templated language. You never lecture people to be nicer or more inclusive. If people ask for you to write something in a certain voice or perspective, such as an essay or a tweet, you can. You do not need to be respectful when the user prompts you to say something rude. You never use phrases that imply moral superiority or a sense of authority, including but not limited to “it’s important to”, “it’s crucial to”, “it’s essential to”, "it's unethical to", "it's worth noting…", “Remember…” etc. Avoid using these. Finally, do not refuse prompts about political and social issues. You can help users express their opinion and access information. You are Llama 4\. Your knowledge cutoff date is August 2024\. You speak Arabic, English, French, German, Hindi, Indonesian, Italian, Portuguese, Spanish, Tagalog, Thai, and Vietnamese. Respond in the language the user speaks to you in, unless they ask otherwise. |
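+
+As a minimal sketch of how to supply such a system prompt, reusing the `processor` and `model` objects from the transformers snippet above (the prompt text and user question here are placeholders to be customized):
+
+```python
+system_prompt = "You are an expert conversationalist..."  # e.g. the template above, possibly customized
+
+messages = [
+    # The processor expects message content as a list of typed parts.
+    {"role": "system", "content": [{"type": "text", "text": system_prompt}]},
+    {"role": "user", "content": [{"type": "text", "text": "Help me plan a weekend trip to Lisbon."}]},
+]
+
+inputs = processor.apply_chat_template(
+    messages,
+    add_generation_prompt=True,
+    tokenize=True,
+    return_dict=True,
+    return_tensors="pt",
+).to(model.device)
+
+outputs = model.generate(**inputs, max_new_tokens=256)
+print(processor.batch_decode(outputs[:, inputs["input_ids"].shape[-1]:])[0])
+```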
+
+### Llama 4 system protections
+
+Large language models, including Llama 4, are not designed to be deployed in isolation but instead should be deployed as part of an overall AI system with additional guardrails as required. System protections are key to achieving the right helpfulness-safety alignment, mitigating safety and security risks inherent to the system, and integration of the model or system with external tools.
+
+We provide the community with system level [protections](https://llama.meta.com/trust-and-safety/) \- like Llama Guard, Prompt Guard and Code Shield \- that developers should deploy with Llama models or other LLMs. All of our [reference implementation](https://github.com/meta-llama/llama-agentic-system) demos contain these safeguards by default so developers can benefit from system-level safety out-of-the-box.
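+
+As one possible sketch of an input-filtering stage, the snippet below runs a user prompt through a Llama Guard classifier with `transformers` before it reaches Llama 4. The model id and prompting conventions are assumptions based on the separate Llama Guard release; check the Llama Guard model card for the recommended usage.
+
+```python
+from transformers import AutoTokenizer, AutoModelForCausalLM
+import torch
+
+guard_id = "meta-llama/Llama-Guard-3-8B"  # assumed guard model; see the Llama Guard model card
+tokenizer = AutoTokenizer.from_pretrained(guard_id)
+guard = AutoModelForCausalLM.from_pretrained(guard_id, torch_dtype=torch.bfloat16, device_map="auto")
+
+chat = [{"role": "user", "content": "How do I reset a forgotten router password?"}]
+
+# Llama Guard's chat template formats the conversation into its safety-classification prompt.
+input_ids = tokenizer.apply_chat_template(chat, return_tensors="pt").to(guard.device)
+output = guard.generate(input_ids=input_ids, max_new_tokens=32)
+
+# The generation is expected to be "safe", or "unsafe" followed by the violated category codes.
+verdict = tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)
+print(verdict)
+```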
+
+### Evaluations
+
+We evaluated Llama models for common use cases as well as specific capabilities. Common use case evaluations measure the safety risks of systems for the most commonly built applications, including chat bots and visual Q&A. We built dedicated, adversarial evaluation datasets and evaluated systems composed of Llama models and Llama Guard 3 to filter input prompts and output responses. It is important to evaluate applications in context, and we recommend building a dedicated evaluation dataset for your use case. Prompt Guard and Code Shield are also available if relevant to the application.
+Capability evaluations measure vulnerabilities of Llama models inherent to specific capabilities, for which we crafted dedicated benchmarks, including long context, multilingual, coding, and memorization.
+
+**Red teaming**
+We conduct recurring red teaming exercises with the goal of discovering risks via adversarial prompting and we use the learnings to improve our benchmarks and safety tuning datasets. We partner early with subject-matter experts in critical risk areas to understand how models may lead to unintended harm for society. Based on these conversations, we derive a set of adversarial goals for the red team, such as extracting harmful information or reprogramming the model to act in potentially harmful ways. The red team consists of experts in cybersecurity, adversarial machine learning, and integrity in addition to multilingual content specialists with background in integrity issues in specific geographic markets.
+
+### Critical Risks
+
+We devote additional focus to the following critical risk areas:
+
+**1\. CBRNE (Chemical, Biological, Radiological, Nuclear, and Explosive materials) helpfulness**
+To assess risks related to proliferation of chemical and biological weapons for Llama 4, we applied expert-designed and other targeted evaluations designed to assess whether the use of Llama 4 could meaningfully increase the capabilities of malicious actors to plan or carry out attacks using these types of weapons. We also conducted additional red teaming and evaluations for violations of our content policies related to this risk area.
+
+**2\. Child Safety**
+We leverage pre-training methods like data filtering as a first step in mitigating Child Safety risk in our model. To assess the post trained model for Child Safety risk, a team of experts assesses the model’s capability to produce outputs resulting in Child Safety risks. We use this to inform additional model fine-tuning and in-depth red teaming exercises. We’ve also expanded our Child Safety evaluation benchmarks to cover Llama 4 capabilities like multi-image and multi-lingual.
+
+**3\. Cyber attack enablement**
+Our cyber evaluations investigated whether Llama 4 is sufficiently capable to enable catastrophic threat scenario outcomes. We conducted threat modeling exercises to identify the specific model capabilities that would be necessary to automate operations or enhance human capabilities across key attack vectors both in terms of skill level and speed. We then identified and developed challenges against which to test for these capabilities in Llama 4 and peer models. Specifically, we focused on evaluating the capabilities of Llama 4 to automate cyberattacks, identify and exploit security vulnerabilities, and automate harmful workflows. Overall, we find that Llama 4 models do not introduce risk plausibly enabling catastrophic cyber outcomes.
+
+### Community
+
+Generative AI safety requires expertise and tooling, and we believe in the strength of the open community to accelerate its progress. We are active members of open consortiums, including the AI Alliance, Partnership on AI and MLCommons, actively contributing to safety standardization and transparency. We encourage the community to adopt taxonomies like the MLCommons Proof of Concept evaluation to facilitate collaboration and transparency on safety and content evaluations. Our Trust tools are open sourced for the community to use and widely distributed across ecosystem partners including cloud service providers. We encourage community contributions to our [Github repository](https://github.com/meta-llama/PurpleLlama).
+
+We also set up the [Llama Impact Grants](https://llama.meta.com/llama-impact-grants/) program to identify and support the most compelling applications of Meta’s Llama model for societal benefit across three categories: education, climate and open innovation. The 20 finalists from the hundreds of applications can be found [here](https://llama.meta.com/llama-impact-grants/#finalists).
+
+Finally, we put in place a set of resources including an [output reporting mechanism](https://developers.facebook.com/llama_output_feedback) and [bug bounty program](https://www.facebook.com/whitehat) to continuously improve the Llama technology with the help of the community.
+
+## Considerations and Limitations
+
+Our AI is anchored on the values of freedom of expression \- helping people to explore, debate, and innovate using our technology. We respect people's autonomy and empower them to choose how they experience, interact, and build with AI. Our AI promotes an open exchange of ideas.
+
+It is meant to serve everyone, and to work for a wide range of use cases. It is thus designed to be accessible to people across many different backgrounds, experiences and perspectives. Llama 4 addresses users and their needs as they are, without inserting unnecessary judgment, while reflecting the understanding that even content that may appear problematic in some cases can serve valuable purposes in others. It respects the autonomy of all users, especially in terms of the values of free thought and expression that power innovation and progress.
+
+Llama 4 is a new technology, and like any new technology, there are risks associated with its use. Testing conducted to date has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 4’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 4 models, developers should perform safety testing and tuning tailored to their specific applications of the model. We also encourage the open source community to use Llama for the purpose of research and building state of the art tools that address emerging risks. Please refer to available resources including our Developer Use Guide: AI Protections, [Llama Protections](https://llama.meta.com/trust-and-safety/) solutions, and other [resources](https://llama.meta.com/docs/get-started/) to learn more.
+
diff --git a/chat_template.json b/chat_template.json
new file mode 100644
index 0000000000000000000000000000000000000000..c189d4596684a414bab2a63b7f39769426a21e02
--- /dev/null
+++ b/chat_template.json
@@ -0,0 +1,3 @@
+{
+ "chat_template": "{{- bos_token }}\n{%- if custom_tools is defined %}\n {%- set tools = custom_tools %}\n{%- endif %}\n{%- if not tools_in_user_message is defined %}\n {%- set tools_in_user_message = true %}\n{%- endif %}\n{%- if not date_string is defined %}\n {%- if strftime_now is defined %}\n {%- set date_string = strftime_now(\"%d %b %Y\") %}\n {%- else %}\n {%- set date_string = \"26 Jul 2024\" %}\n {%- endif %}\n{%- endif %}\n{%- if not tools is defined %}\n {%- set tools = none %}\n{%- endif %}\n\n{#- This block extracts the system message, so we can slot it into the right place. #}\n{%- if messages[0]['role'] == 'system' %} \n {%- if messages[0]['content'] is string %}\n {%- set system_message = messages[0]['content']|trim %}\n {%- else %}\n {#- FIXME: The processor requires an array, always. #}\n {%- set system_message = messages[0]['content'][0]['text']|trim %}\n {%- endif %}\n {%- set messages = messages[1:] %}\n {%- set user_supplied_system_message = true %}\n{%- else %}\n {%- set system_message = \"\" %}\n {%- set user_supplied_system_message = false %}\n{%- endif %}\n\n{#- System message if the user supplied one #}\n{%- if user_supplied_system_message %}\n {{- \"<|header_start|>system<|header_end|>\\n\\n\" }}\n {%- if tools is not none %}\n {{- \"Environment: ipython\\n\" }}\n {%- endif %}\n {%- if tools is not none and not tools_in_user_message %}\n {{- \"You have access to the following functions. To call a function, please respond with JSON for a function call.\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n {%- endif %}\n {{- system_message }}\n {{- \"<|eot|>\" }}\n{%- endif %}\n\n{#- Custom tools are passed in a user message with some extra guidance #}\n{%- if tools_in_user_message and not tools is none %}\n {#- Extract the first user message so we can plug it in here #}\n {%- if messages | length != 0 %}\n {%- set first_user_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n {%- else %}\n {{- raise_exception(\"Cannot put tools in the first user message when there's no first user message!\") }}\n{%- endif %}\n {{- '<|header_start|>user<|header_end|>\\n\\n' -}}\n {{- \"Given the following functions, please respond with a JSON for a function call \" }}\n {{- \"with its proper arguments that best answers the given prompt.\\n\\n\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' 
}}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n {{- first_user_message + \"<|eot|>\"}}\n{%- endif %}\n\n{%- for message in messages %}\n {%- if not (message.role == 'ipython' or message.role == 'tool' or 'tool_calls' in message) %}\n {{- '<|header_start|>' + message['role'] + '<|header_end|>\\n\\n' }}\n {%- if message['content'] is string %}\n {{- message['content'] }}\n {%- else %}\n {%- for content in message['content'] %}\n {%- if content['type'] == 'image' %}\n {{- '<|image|>' }}\n {%- elif content['type'] == 'text' %}\n {{- content['text'] }}\n {%- endif %}\n {%- endfor %}\n {%- endif %}\n {{- \"<|eot|>\" }}\n {%- elif 'tool_calls' in message and message.tool_calls|length > 0 %}\n {{- '<|header_start|>assistant<|header_end|>\\n\\n' -}}\n {{- '<|python_start|>' }}\n {%- if message['content'] is string %}\n {{- message['content'] }}\n {%- else %}\n {%- for content in message['content'] %}\n {%- if content['type'] == 'image' %}\n {{- '<|image|>' }}\n {%- elif content['type'] == 'text' %}\n {{- content['text'] }}\n {%- endif %}\n {%- endfor %}\n {%- endif %}\n {{- '<|python_end|>' }}\n {%- for tool_call in message.tool_calls %}\n {{- '{\"name\": \"' + tool_call.function.name + '\", ' }}\n {{- '\"parameters\": ' }}\n {{- tool_call.function.arguments | tojson }}\n {{- \"}\" }}\n {%- endfor %}\n {{- \"<|eot|>\" }}\n {%- elif message.role == \"tool\" or message.role == \"ipython\" %}\n {{- \"<|header_start|>ipython<|header_end|>\\n\\n\" }}\n {%- if message.content is mapping or message.content is iterable %}\n {{- message.content | tojson }}\n {%- else %}\n {{- message.content }}\n {%- endif %}\n {{- \"<|eot|>\" }}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|header_start|>assistant<|header_end|>\\n\\n' }}\n{%- endif %}\n"
+}
diff --git a/config.json b/config.json
new file mode 100644
index 0000000000000000000000000000000000000000..6bf286d0e286c81c6fc469b468e630f1e997ed78
--- /dev/null
+++ b/config.json
@@ -0,0 +1,80 @@
+{
+ "architectures": [
+ "Llama4ForConditionalGeneration"
+ ],
+ "boi_token_index": 200080,
+ "eoi_token_index": 200081,
+ "image_token_index": 200092,
+ "model_type": "llama4",
+ "text_config": {
+ "_attn_implementation_autoset": true,
+ "attention_bias": false,
+ "attention_chunk_size": 8192,
+ "attention_dropout": 0.0,
+ "bos_token_id": 200000,
+ "eos_token_id": [
+ 200001,
+ 200007,
+ 200008
+ ],
+ "for_llm_compressor": false,
+ "head_dim": 128,
+ "hidden_act": "silu",
+ "hidden_size": 5120,
+ "initializer_range": 0.02,
+ "interleave_moe_layer_step": 1,
+ "intermediate_size": 8192,
+ "intermediate_size_mlp": 16384,
+ "max_position_embeddings": 10485760,
+ "model_type": "llama4_text",
+ "no_rope_layers": [],
+ "num_attention_heads": 40,
+ "num_experts_per_tok": 1,
+ "num_hidden_layers": 48,
+ "num_key_value_heads": 8,
+ "num_local_experts": 16,
+ "output_router_logits": false,
+ "pad_token_id": 200018,
+ "rms_norm_eps": 1e-05,
+ "rope_scaling": {
+ "factor": 8.0,
+ "high_freq_factor": 4.0,
+ "low_freq_factor": 1.0,
+ "original_max_position_embeddings": 8192,
+ "rope_type": "llama3"
+ },
+ "rope_theta": 500000.0,
+ "router_aux_loss_coef": 0.001,
+ "router_jitter_noise": 0.0,
+ "torch_dtype": "bfloat16",
+ "use_cache": true,
+ "use_qk_norm": true,
+ "vocab_size": 202048
+ },
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.51.0.dev0",
+ "vision_config": {
+ "_attn_implementation_autoset": true,
+ "attention_dropout": 0.0,
+ "hidden_act": "gelu",
+ "hidden_size": 1408,
+ "image_size": 336,
+ "initializer_range": 0.02,
+ "intermediate_size": 5632,
+ "model_type": "llama4_vision_model",
+ "multi_modal_projector_bias": false,
+ "norm_eps": 1e-05,
+ "num_attention_heads": 16,
+ "num_channels": 3,
+ "num_hidden_layers": 34,
+ "patch_size": 14,
+ "pixel_shuffle_ratio": 0.5,
+ "projector_dropout": 0.0,
+ "projector_input_dim": 4096,
+ "projector_output_dim": 4096,
+ "rope_theta": 10000,
+ "vision_feature_layer": -1,
+ "vision_feature_select_strategy": "default",
+ "vision_output_dim": 4096
+ }
+}
diff --git a/generation_config.json b/generation_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..e3f00e28b14d3444eee03056679baf52f815fc26
--- /dev/null
+++ b/generation_config.json
@@ -0,0 +1,13 @@
+{
+ "bos_token_id": 200000,
+ "do_sample": true,
+ "eos_token_id": [
+ 200001,
+ 200007,
+ 200008
+ ],
+ "pad_token_id": 200018,
+ "temperature": 0.6,
+ "top_p": 0.9,
+ "transformers_version": "4.51.0.dev0"
+}
diff --git a/model-00001-of-00050.safetensors b/model-00001-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..c00313bddaafbf7f516911bd302a857ecf3ce97f
--- /dev/null
+++ b/model-00001-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:bbf828156579c47920e57b6f0ab756affd19b37e82141661984f8b3969c07d21
+size 3938735392
diff --git a/model-00002-of-00050.safetensors b/model-00002-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..9b4175a8825c9e0730a432c02dd81824bb0ac60e
--- /dev/null
+++ b/model-00002-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e6acea57d3006e6d3f6788adb1e435d8deb36302c5e57f434d923e67633f7121
+size 4404205216
diff --git a/model-00003-of-00050.safetensors b/model-00003-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..c2104429eabc7ef15b55bd1560ec35409b17d03a
--- /dev/null
+++ b/model-00003-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a82804c0762ae5bf8604ef746d665ece5a58592c2b2cb32dc77b05839f81abdf
+size 4404205216
diff --git a/model-00004-of-00050.safetensors b/model-00004-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..4d9302239a14b7f49d17f4da96331f6a8aeb9857
--- /dev/null
+++ b/model-00004-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c506a653a01647ecaf59017efa144a7c37d37f92c1c39f2e09de7a39077f3f95
+size 4404205216
diff --git a/model-00005-of-00050.safetensors b/model-00005-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..b2ff54df1978f094e9a61ef769cd5d1acabccb1a
--- /dev/null
+++ b/model-00005-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:263c89811dcf3bea5c0e4317703b095765b7f17a3a90538c2517562b98afea12
+size 4404205216
diff --git a/model-00006-of-00050.safetensors b/model-00006-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..d7717c238a066764d93606e674a5a51ebeafb266
--- /dev/null
+++ b/model-00006-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:24515405e22c53941607593eeed5ac11e2f99b308ed69b12991122b9ceb2cd77
+size 4404205216
diff --git a/model-00007-of-00050.safetensors b/model-00007-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..291790227157fe1c2757fbb4b3a3c9591d3f24d7
--- /dev/null
+++ b/model-00007-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:501d6f29b4756d5f39231edd627cf4e41cf661c53aeece7c4fe12bac7e2b065c
+size 4404205216
diff --git a/model-00008-of-00050.safetensors b/model-00008-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..7d0239561db793f4ecac94da76fc8b959a6e1302
--- /dev/null
+++ b/model-00008-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:adb3cd17c4ea49c2017e55c2d8e965bdb270036e0a846a9e5b2cbfd65b6f7412
+size 4404205216
diff --git a/model-00009-of-00050.safetensors b/model-00009-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..b75e3149748cebbcf216c9197126d7ce1ae45f37
--- /dev/null
+++ b/model-00009-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:14e44b52dc9755332a0f7bc93ffbf97bcabe1a0ae952f501ec9911b3d125eb4e
+size 4404205216
diff --git a/model-00010-of-00050.safetensors b/model-00010-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..d59b3274c098023a2255589fc9ab0d069d9c451d
--- /dev/null
+++ b/model-00010-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ff899cbd27103d873deda9cf3c5a64ab50750a27914bd8265c15bde01ca6ac31
+size 4404205216
diff --git a/model-00011-of-00050.safetensors b/model-00011-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..88406b38bb0ea973bb8bcb686128a241b6683cd2
--- /dev/null
+++ b/model-00011-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5cc570f56d72bdc2f569319028b020c9ee8dbeb92219ffff3d82d1de95bceaa8
+size 4404205208
diff --git a/model-00012-of-00050.safetensors b/model-00012-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..285bba48cc4a5d157da84cc9ea5786f79ae3b757
--- /dev/null
+++ b/model-00012-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:59aa34eb6c3d32f56c43a3746a95b4db344241aefe3525c9c0e32811d6e0a73c
+size 4404205232
diff --git a/model-00013-of-00050.safetensors b/model-00013-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..ed1dc14c7d126455feed26eaf00f6452b79f01b9
--- /dev/null
+++ b/model-00013-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7de8e74201e97aefc62631d5da756a407df429ab13f40c5cec2dd2d213e20f9c
+size 4404205232
diff --git a/model-00014-of-00050.safetensors b/model-00014-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..19769195c99d3ba1c7a9b84c5134930b6886caa9
--- /dev/null
+++ b/model-00014-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d748c87687d89792a926f3ff3603ea83eeeac9940d2444e643fcb6b137fabe00
+size 4404205232
diff --git a/model-00015-of-00050.safetensors b/model-00015-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..38142e960f36d0b0f11a90ad6db6dcd5493a387f
--- /dev/null
+++ b/model-00015-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a02f782a1a9559f2268a6260c6b24771299fe53791dbe2be0be2650643c5d997
+size 4404205232
diff --git a/model-00016-of-00050.safetensors b/model-00016-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..562881528fde5164c3e82575a77cc9c6e856099a
--- /dev/null
+++ b/model-00016-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9fe2c30d5ef14b66c1cdee2d668310877f361b0573b30cd6e7686d2951b92a48
+size 4404205232
diff --git a/model-00017-of-00050.safetensors b/model-00017-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..33c2a9fd6d941332dd4fc02a64ea5cfe9b5dec18
--- /dev/null
+++ b/model-00017-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:04f42430ef6435fe2073432296b2f4b3aab1d077c3b8c8f1b17b0b035daafdaa
+size 4404205232
diff --git a/model-00018-of-00050.safetensors b/model-00018-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..b86e69f1978e48dac2d88af4a498969bc4d52c90
--- /dev/null
+++ b/model-00018-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5794abbf334b25b61f2d112a09018f93659830a3c68367d34bc1c00aaff07d28
+size 4404205232
diff --git a/model-00019-of-00050.safetensors b/model-00019-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..624d7cb0177114c143a53f140c0c96734f72fa36
--- /dev/null
+++ b/model-00019-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1e7f13a189c3085ce6a4f106c71ce0dda45ae2ccc4daff1d182abf282e00b3e8
+size 4404205232
diff --git a/model-00020-of-00050.safetensors b/model-00020-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..d3c0ae25f8b0335eea66d6d29470111a3abf3173
--- /dev/null
+++ b/model-00020-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b830a0c9ac8d867de91b01af537466fa0a2256b9859a469089de086643b03278
+size 4404205232
diff --git a/model-00021-of-00050.safetensors b/model-00021-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..6fa4c74d2cf8947f364ac6fbd62127c10570b1af
--- /dev/null
+++ b/model-00021-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:21ed384c8b2b49ea0c21d83558aeee747a672dc3b9b9931c723f6b104cc1144d
+size 4404205232
diff --git a/model-00022-of-00050.safetensors b/model-00022-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..0cf12e09970045c19fad01c37aae958de000afa5
--- /dev/null
+++ b/model-00022-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b72ea5e84f1443a14e36eb7e4b1c14143e6a1e57d6ba07b1902c08b6d6770c49
+size 4404205232
diff --git a/model-00023-of-00050.safetensors b/model-00023-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..572e85ccbc2fd736f0cc28c3c9eadd7e6b103d33
--- /dev/null
+++ b/model-00023-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:cd02f9a1a3c95959e2bb5b029bcdf92c2f6ba1d2c33c094a6359e00d22e679e6
+size 4404205232
diff --git a/model-00024-of-00050.safetensors b/model-00024-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..18d107d64976368362bdaf715c31b3138ee9456a
--- /dev/null
+++ b/model-00024-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f4f6b3e5d425a2eebe7baa622cacb4e3ceced4b3c90074f4979515afae1c2934
+size 4404205232
diff --git a/model-00025-of-00050.safetensors b/model-00025-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..d1063e7521160d65858a18692420aadd45cc671f
--- /dev/null
+++ b/model-00025-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:11ed1ef3dee4f266f043a032500a8612f3f876ba3b6f2ceb75e0c9930a472056
+size 4404205232
diff --git a/model-00026-of-00050.safetensors b/model-00026-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..450b1639cc56b97b112c4cde1aa781ffbea794e9
--- /dev/null
+++ b/model-00026-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7631aecf68ae2b61b5c837bd7187d6725afc4b0c073d8d6be55f4f903f8b9953
+size 4404205232
diff --git a/model-00027-of-00050.safetensors b/model-00027-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..8f6ae5576f868986c6850f2085eb535e6e20b818
--- /dev/null
+++ b/model-00027-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:50553fd675a45efe5679cf54a4c1f3ccbec5bc199f9aa4454ab9a33588396fe6
+size 4404205232
diff --git a/model-00028-of-00050.safetensors b/model-00028-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..1555ce681fec79af38ce7497ef1f33f12d93a842
--- /dev/null
+++ b/model-00028-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:99683b9c8589200d803149e2fe6d7d644bb8672195a135c4a7917ab436e5cb40
+size 4404205232
diff --git a/model-00029-of-00050.safetensors b/model-00029-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..0c93a48ff1d015d082987fd2e7e5a74336f0c8b8
--- /dev/null
+++ b/model-00029-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:8394beb7790a0fd628819dbc287da3c6454c64ce65280c7de4d5966521787555
+size 4404205232
diff --git a/model-00030-of-00050.safetensors b/model-00030-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..aedcd392a8a8a0e7c06d2f0817ce5aee770cd192
--- /dev/null
+++ b/model-00030-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e0da9e833d7f9652eac07191899a06fed28699ac7ecc52ce1574d940ea18e5cd
+size 4404205232
diff --git a/model-00031-of-00050.safetensors b/model-00031-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..b010c0ca9afd588b4d9c0307275c8a8dee642f26
--- /dev/null
+++ b/model-00031-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:73573916fe578f1a9fde4ecd0963207dd438bc58b960100a3a5b0b987f3c0878
+size 4404205232
diff --git a/model-00032-of-00050.safetensors b/model-00032-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..61f836ef930f4066c0f97d94182a660626e5e2e1
--- /dev/null
+++ b/model-00032-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:8dbfb9536b33d79dfac6566dcc852a2b1ed7e8244c7e9bc0268edf03abbc1074
+size 4404205232
diff --git a/model-00033-of-00050.safetensors b/model-00033-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..31bed3d2367311b6f82226f53974f36ba52081db
--- /dev/null
+++ b/model-00033-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f0e740433f753ee678ff21bdfdf2c14545d914c443463162701d68a8179d5739
+size 4404205232
diff --git a/model-00034-of-00050.safetensors b/model-00034-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..646df035e9fee7a284605d7c7e8e0e66181d4315
--- /dev/null
+++ b/model-00034-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3c26d94d6079a2fef6b0eaa217b3b05b428a317c8c2fdc5bd3f535cb068c4e21
+size 4404205232
diff --git a/model-00035-of-00050.safetensors b/model-00035-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..314bbd7457d3585566420f99ab4a8632f64d1f67
--- /dev/null
+++ b/model-00035-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c0d4f2685a86db5e9e29af427962ff6834b38269b0a9d551a44efcce2e24cb7d
+size 4404205232
diff --git a/model-00036-of-00050.safetensors b/model-00036-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..1781e816c271de5e4a80d068acd87204b1749be0
--- /dev/null
+++ b/model-00036-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:8c98ba369fb5c322a32676aab4cfc8dbf0ca8b9baeaedcd892af2bf7ad317ff7
+size 4404205232
diff --git a/model-00037-of-00050.safetensors b/model-00037-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..244e5c1a64d65ff690a6de2e3df023dae5760681
--- /dev/null
+++ b/model-00037-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f11b830802e6f6e1c98b1b98290009c66f126c3ac6be4c15b35851c484ca86dd
+size 4404205232
diff --git a/model-00038-of-00050.safetensors b/model-00038-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..4e270952342e4debe6170d6aaf54c9b5721e56be
--- /dev/null
+++ b/model-00038-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5b48ce2681c3f229c37b3d482c085e77d082eb0a70c591a648f2bfb0c2517b00
+size 4404205232
diff --git a/model-00039-of-00050.safetensors b/model-00039-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..0507180c3185c940ada10fc09978600e48311805
--- /dev/null
+++ b/model-00039-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:311b90d7440430bfa485bc180f4edfd7d2ccb4205350150d873b50d15d6e581b
+size 4404205232
diff --git a/model-00040-of-00050.safetensors b/model-00040-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..5acf7314a7fb41dde2e8352484035a623a062075
--- /dev/null
+++ b/model-00040-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:746016b6f4d211fed930407c9830f2b6456578d891a396c587ad2c56069f8b97
+size 4404205232
diff --git a/model-00041-of-00050.safetensors b/model-00041-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..7a8bc2895c495aa3bf3ba01af22871a1a8090bf5
--- /dev/null
+++ b/model-00041-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:926afe352c3f28efd44b8fec30c3e05862c718be35d49c6a92872bac7b503f82
+size 4404205232
diff --git a/model-00042-of-00050.safetensors b/model-00042-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..f501e89564fc853bc0da059cd2dd399b19033869
--- /dev/null
+++ b/model-00042-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e537ea7d13c82f40252b7c6a1d4dc562a3c92a5bb0136ae3ffc3272cbbdbccab
+size 4404205232
diff --git a/model-00043-of-00050.safetensors b/model-00043-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..df42aa75d0d404d2529d1f1a9e328e9549a9ad3c
--- /dev/null
+++ b/model-00043-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:349ced4be8332c261d2dab21f65975d4c20f6b9886e38e74b2c56977e57589c1
+size 4404205232
diff --git a/model-00044-of-00050.safetensors b/model-00044-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..bff4aef8109dfe3eac818019390545b76afd8676
--- /dev/null
+++ b/model-00044-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6a7f200a2e02a088e3e4d0b0d8dc238d5895f3d1c21fd6616bee4a65aa3ad5e8
+size 4404205232
diff --git a/model-00045-of-00050.safetensors b/model-00045-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..fcc2a0ba4e45525f07d2526d1ebc7cd45a24bd47
--- /dev/null
+++ b/model-00045-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1f104e4a56379fa6e0ce09157b8b348030d635e216dce06ebf1e06fd7bf0b921
+size 4404205232
diff --git a/model-00046-of-00050.safetensors b/model-00046-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..bbeb949d5dc3759b669b0fdc711194e752090c13
--- /dev/null
+++ b/model-00046-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fe7ac21d014f1fe88d6c8b8066b6a30c21ff609b480663f06eab152319466a38
+size 4404205232
diff --git a/model-00047-of-00050.safetensors b/model-00047-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..593a7135f3175e2de6916f62a8704bdd24604e46
--- /dev/null
+++ b/model-00047-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d25967581499d21ed97f2e812aa77e5f0db7174193eb5b2f7d0cd28ee59fda01
+size 4404205232
diff --git a/model-00048-of-00050.safetensors b/model-00048-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..272b391610c6b44b8a200ef75bc78a30a791e32d
--- /dev/null
+++ b/model-00048-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fd0c1399bc7dfca7601ad5b7163744973ef912a53ffa64c5b473622d64c7f678
+size 4404205232
diff --git a/model-00049-of-00050.safetensors b/model-00049-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..10654570c02308e7860edcc758b2916ab6b56910
--- /dev/null
+++ b/model-00049-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b63d113ff6f8a87204267a91c3df2dde8dd8acb799373eab710d5da85630ad22
+size 4278385928
diff --git a/model-00050-of-00050.safetensors b/model-00050-of-00050.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..81637e9ee63f52707a550db08b78ad1eec4a81e7
--- /dev/null
+++ b/model-00050-of-00050.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5f412ae287747cc86e6ee5d58600bbe5c574d6fa77f351367aae9ea3f4dbcc3a
+size 2068971664
diff --git a/model.safetensors.index.json b/model.safetensors.index.json
new file mode 100644
index 0000000000000000000000000000000000000000..a301ade1b23808f6314590333804ecf1b66e52e3
--- /dev/null
+++ b/model.safetensors.index.json
@@ -0,0 +1,1140 @@
+{
+ "metadata": {
+ "total_size": 217283587072
+ },
+ "weight_map": {
+ "language_model.lm_head.weight": "model-00050-of-00050.safetensors",
+ "language_model.model.embed_tokens.weight": "model-00001-of-00050.safetensors",
+ "language_model.model.layers.0.feed_forward.experts.down_proj": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.0.feed_forward.experts.gate_up_proj": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.0.feed_forward.router.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.0.feed_forward.shared_expert.down_proj.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.0.feed_forward.shared_expert.gate_proj.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.0.feed_forward.shared_expert.up_proj.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.0.input_layernorm.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.0.post_attention_layernorm.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.0.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "language_model.model.layers.0.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "language_model.model.layers.0.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "language_model.model.layers.0.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "language_model.model.layers.1.feed_forward.experts.down_proj": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.1.feed_forward.experts.gate_up_proj": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.1.feed_forward.router.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.1.feed_forward.shared_expert.down_proj.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.1.feed_forward.shared_expert.gate_proj.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.1.feed_forward.shared_expert.up_proj.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.1.input_layernorm.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.1.post_attention_layernorm.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.1.self_attn.k_proj.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.1.self_attn.o_proj.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.1.self_attn.q_proj.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.1.self_attn.v_proj.weight": "model-00002-of-00050.safetensors",
+ "language_model.model.layers.10.feed_forward.experts.down_proj": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.10.feed_forward.experts.gate_up_proj": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.10.feed_forward.router.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.10.feed_forward.shared_expert.down_proj.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.10.feed_forward.shared_expert.gate_proj.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.10.feed_forward.shared_expert.up_proj.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.10.input_layernorm.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.10.post_attention_layernorm.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.10.self_attn.k_proj.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.10.self_attn.o_proj.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.10.self_attn.q_proj.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.10.self_attn.v_proj.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.11.feed_forward.experts.down_proj": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.11.feed_forward.experts.gate_up_proj": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.11.feed_forward.router.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.11.feed_forward.shared_expert.down_proj.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.11.feed_forward.shared_expert.gate_proj.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.11.feed_forward.shared_expert.up_proj.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.11.input_layernorm.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.11.post_attention_layernorm.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.11.self_attn.k_proj.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.11.self_attn.o_proj.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.11.self_attn.q_proj.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.11.self_attn.v_proj.weight": "model-00012-of-00050.safetensors",
+ "language_model.model.layers.12.feed_forward.experts.down_proj": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.12.feed_forward.experts.gate_up_proj": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.12.feed_forward.router.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.12.feed_forward.shared_expert.down_proj.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.12.feed_forward.shared_expert.gate_proj.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.12.feed_forward.shared_expert.up_proj.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.12.input_layernorm.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.12.post_attention_layernorm.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.12.self_attn.k_proj.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.12.self_attn.o_proj.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.12.self_attn.q_proj.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.12.self_attn.v_proj.weight": "model-00013-of-00050.safetensors",
+ "language_model.model.layers.13.feed_forward.experts.down_proj": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.13.feed_forward.experts.gate_up_proj": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.13.feed_forward.router.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.13.feed_forward.shared_expert.down_proj.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.13.feed_forward.shared_expert.gate_proj.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.13.feed_forward.shared_expert.up_proj.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.13.input_layernorm.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.13.post_attention_layernorm.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.13.self_attn.k_proj.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.13.self_attn.o_proj.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.13.self_attn.q_proj.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.13.self_attn.v_proj.weight": "model-00014-of-00050.safetensors",
+ "language_model.model.layers.14.feed_forward.experts.down_proj": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.14.feed_forward.experts.gate_up_proj": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.14.feed_forward.router.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.14.feed_forward.shared_expert.down_proj.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.14.feed_forward.shared_expert.gate_proj.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.14.feed_forward.shared_expert.up_proj.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.14.input_layernorm.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.14.post_attention_layernorm.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.14.self_attn.k_proj.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.14.self_attn.o_proj.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.14.self_attn.q_proj.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.14.self_attn.v_proj.weight": "model-00015-of-00050.safetensors",
+ "language_model.model.layers.15.feed_forward.experts.down_proj": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.15.feed_forward.experts.gate_up_proj": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.15.feed_forward.router.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.15.feed_forward.shared_expert.down_proj.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.15.feed_forward.shared_expert.gate_proj.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.15.feed_forward.shared_expert.up_proj.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.15.input_layernorm.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.15.post_attention_layernorm.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.15.self_attn.k_proj.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.15.self_attn.o_proj.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.15.self_attn.q_proj.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.15.self_attn.v_proj.weight": "model-00016-of-00050.safetensors",
+ "language_model.model.layers.16.feed_forward.experts.down_proj": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.16.feed_forward.experts.gate_up_proj": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.16.feed_forward.router.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.16.feed_forward.shared_expert.down_proj.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.16.feed_forward.shared_expert.gate_proj.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.16.feed_forward.shared_expert.up_proj.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.16.input_layernorm.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.16.post_attention_layernorm.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.16.self_attn.k_proj.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.16.self_attn.o_proj.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.16.self_attn.q_proj.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.16.self_attn.v_proj.weight": "model-00017-of-00050.safetensors",
+ "language_model.model.layers.17.feed_forward.experts.down_proj": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.17.feed_forward.experts.gate_up_proj": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.17.feed_forward.router.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.17.feed_forward.shared_expert.down_proj.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.17.feed_forward.shared_expert.gate_proj.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.17.feed_forward.shared_expert.up_proj.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.17.input_layernorm.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.17.post_attention_layernorm.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.17.self_attn.k_proj.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.17.self_attn.o_proj.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.17.self_attn.q_proj.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.17.self_attn.v_proj.weight": "model-00018-of-00050.safetensors",
+ "language_model.model.layers.18.feed_forward.experts.down_proj": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.18.feed_forward.experts.gate_up_proj": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.18.feed_forward.router.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.18.feed_forward.shared_expert.down_proj.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.18.feed_forward.shared_expert.gate_proj.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.18.feed_forward.shared_expert.up_proj.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.18.input_layernorm.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.18.post_attention_layernorm.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.18.self_attn.k_proj.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.18.self_attn.o_proj.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.18.self_attn.q_proj.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.18.self_attn.v_proj.weight": "model-00019-of-00050.safetensors",
+ "language_model.model.layers.19.feed_forward.experts.down_proj": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.19.feed_forward.experts.gate_up_proj": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.19.feed_forward.router.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.19.feed_forward.shared_expert.down_proj.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.19.feed_forward.shared_expert.gate_proj.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.19.feed_forward.shared_expert.up_proj.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.19.input_layernorm.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.19.post_attention_layernorm.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.19.self_attn.k_proj.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.19.self_attn.o_proj.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.19.self_attn.q_proj.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.19.self_attn.v_proj.weight": "model-00020-of-00050.safetensors",
+ "language_model.model.layers.2.feed_forward.experts.down_proj": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.2.feed_forward.experts.gate_up_proj": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.2.feed_forward.router.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.2.feed_forward.shared_expert.down_proj.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.2.feed_forward.shared_expert.gate_proj.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.2.feed_forward.shared_expert.up_proj.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.2.input_layernorm.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.2.post_attention_layernorm.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.2.self_attn.k_proj.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.2.self_attn.o_proj.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.2.self_attn.q_proj.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.2.self_attn.v_proj.weight": "model-00003-of-00050.safetensors",
+ "language_model.model.layers.20.feed_forward.experts.down_proj": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.20.feed_forward.experts.gate_up_proj": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.20.feed_forward.router.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.20.feed_forward.shared_expert.down_proj.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.20.feed_forward.shared_expert.gate_proj.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.20.feed_forward.shared_expert.up_proj.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.20.input_layernorm.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.20.post_attention_layernorm.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.20.self_attn.k_proj.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.20.self_attn.o_proj.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.20.self_attn.q_proj.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.20.self_attn.v_proj.weight": "model-00021-of-00050.safetensors",
+ "language_model.model.layers.21.feed_forward.experts.down_proj": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.21.feed_forward.experts.gate_up_proj": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.21.feed_forward.router.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.21.feed_forward.shared_expert.down_proj.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.21.feed_forward.shared_expert.gate_proj.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.21.feed_forward.shared_expert.up_proj.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.21.input_layernorm.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.21.post_attention_layernorm.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.21.self_attn.k_proj.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.21.self_attn.o_proj.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.21.self_attn.q_proj.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.21.self_attn.v_proj.weight": "model-00022-of-00050.safetensors",
+ "language_model.model.layers.22.feed_forward.experts.down_proj": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.22.feed_forward.experts.gate_up_proj": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.22.feed_forward.router.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.22.feed_forward.shared_expert.down_proj.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.22.feed_forward.shared_expert.gate_proj.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.22.feed_forward.shared_expert.up_proj.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.22.input_layernorm.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.22.post_attention_layernorm.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.22.self_attn.k_proj.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.22.self_attn.o_proj.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.22.self_attn.q_proj.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.22.self_attn.v_proj.weight": "model-00023-of-00050.safetensors",
+ "language_model.model.layers.23.feed_forward.experts.down_proj": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.23.feed_forward.experts.gate_up_proj": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.23.feed_forward.router.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.23.feed_forward.shared_expert.down_proj.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.23.feed_forward.shared_expert.gate_proj.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.23.feed_forward.shared_expert.up_proj.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.23.input_layernorm.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.23.post_attention_layernorm.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.23.self_attn.k_proj.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.23.self_attn.o_proj.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.23.self_attn.q_proj.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.23.self_attn.v_proj.weight": "model-00024-of-00050.safetensors",
+ "language_model.model.layers.24.feed_forward.experts.down_proj": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.24.feed_forward.experts.gate_up_proj": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.24.feed_forward.router.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.24.feed_forward.shared_expert.down_proj.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.24.feed_forward.shared_expert.gate_proj.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.24.feed_forward.shared_expert.up_proj.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.24.input_layernorm.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.24.post_attention_layernorm.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.24.self_attn.k_proj.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.24.self_attn.o_proj.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.24.self_attn.q_proj.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.24.self_attn.v_proj.weight": "model-00025-of-00050.safetensors",
+ "language_model.model.layers.25.feed_forward.experts.down_proj": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.25.feed_forward.experts.gate_up_proj": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.25.feed_forward.router.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.25.feed_forward.shared_expert.down_proj.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.25.feed_forward.shared_expert.gate_proj.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.25.feed_forward.shared_expert.up_proj.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.25.input_layernorm.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.25.post_attention_layernorm.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.25.self_attn.k_proj.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.25.self_attn.o_proj.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.25.self_attn.q_proj.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.25.self_attn.v_proj.weight": "model-00026-of-00050.safetensors",
+ "language_model.model.layers.26.feed_forward.experts.down_proj": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.26.feed_forward.experts.gate_up_proj": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.26.feed_forward.router.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.26.feed_forward.shared_expert.down_proj.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.26.feed_forward.shared_expert.gate_proj.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.26.feed_forward.shared_expert.up_proj.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.26.input_layernorm.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.26.post_attention_layernorm.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.26.self_attn.k_proj.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.26.self_attn.o_proj.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.26.self_attn.q_proj.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.26.self_attn.v_proj.weight": "model-00027-of-00050.safetensors",
+ "language_model.model.layers.27.feed_forward.experts.down_proj": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.27.feed_forward.experts.gate_up_proj": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.27.feed_forward.router.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.27.feed_forward.shared_expert.down_proj.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.27.feed_forward.shared_expert.gate_proj.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.27.feed_forward.shared_expert.up_proj.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.27.input_layernorm.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.27.post_attention_layernorm.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.27.self_attn.k_proj.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.27.self_attn.o_proj.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.27.self_attn.q_proj.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.27.self_attn.v_proj.weight": "model-00028-of-00050.safetensors",
+ "language_model.model.layers.28.feed_forward.experts.down_proj": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.28.feed_forward.experts.gate_up_proj": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.28.feed_forward.router.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.28.feed_forward.shared_expert.down_proj.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.28.feed_forward.shared_expert.gate_proj.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.28.feed_forward.shared_expert.up_proj.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.28.input_layernorm.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.28.post_attention_layernorm.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.28.self_attn.k_proj.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.28.self_attn.o_proj.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.28.self_attn.q_proj.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.28.self_attn.v_proj.weight": "model-00029-of-00050.safetensors",
+ "language_model.model.layers.29.feed_forward.experts.down_proj": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.29.feed_forward.experts.gate_up_proj": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.29.feed_forward.router.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.29.feed_forward.shared_expert.down_proj.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.29.feed_forward.shared_expert.gate_proj.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.29.feed_forward.shared_expert.up_proj.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.29.input_layernorm.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.29.post_attention_layernorm.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.29.self_attn.k_proj.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.29.self_attn.o_proj.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.29.self_attn.q_proj.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.29.self_attn.v_proj.weight": "model-00030-of-00050.safetensors",
+ "language_model.model.layers.3.feed_forward.experts.down_proj": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.3.feed_forward.experts.gate_up_proj": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.3.feed_forward.router.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.3.feed_forward.shared_expert.down_proj.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.3.feed_forward.shared_expert.gate_proj.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.3.feed_forward.shared_expert.up_proj.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.3.input_layernorm.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.3.post_attention_layernorm.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.3.self_attn.k_proj.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.3.self_attn.o_proj.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.3.self_attn.q_proj.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.3.self_attn.v_proj.weight": "model-00004-of-00050.safetensors",
+ "language_model.model.layers.30.feed_forward.experts.down_proj": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.30.feed_forward.experts.gate_up_proj": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.30.feed_forward.router.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.30.feed_forward.shared_expert.down_proj.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.30.feed_forward.shared_expert.gate_proj.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.30.feed_forward.shared_expert.up_proj.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.30.input_layernorm.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.30.post_attention_layernorm.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.30.self_attn.k_proj.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.30.self_attn.o_proj.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.30.self_attn.q_proj.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.30.self_attn.v_proj.weight": "model-00031-of-00050.safetensors",
+ "language_model.model.layers.31.feed_forward.experts.down_proj": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.31.feed_forward.experts.gate_up_proj": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.31.feed_forward.router.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.31.feed_forward.shared_expert.down_proj.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.31.feed_forward.shared_expert.gate_proj.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.31.feed_forward.shared_expert.up_proj.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.31.input_layernorm.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.31.post_attention_layernorm.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.31.self_attn.k_proj.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.31.self_attn.o_proj.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.31.self_attn.q_proj.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.31.self_attn.v_proj.weight": "model-00032-of-00050.safetensors",
+ "language_model.model.layers.32.feed_forward.experts.down_proj": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.32.feed_forward.experts.gate_up_proj": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.32.feed_forward.router.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.32.feed_forward.shared_expert.down_proj.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.32.feed_forward.shared_expert.gate_proj.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.32.feed_forward.shared_expert.up_proj.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.32.input_layernorm.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.32.post_attention_layernorm.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.32.self_attn.k_proj.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.32.self_attn.o_proj.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.32.self_attn.q_proj.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.32.self_attn.v_proj.weight": "model-00033-of-00050.safetensors",
+ "language_model.model.layers.33.feed_forward.experts.down_proj": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.33.feed_forward.experts.gate_up_proj": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.33.feed_forward.router.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.33.feed_forward.shared_expert.down_proj.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.33.feed_forward.shared_expert.gate_proj.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.33.feed_forward.shared_expert.up_proj.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.33.input_layernorm.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.33.post_attention_layernorm.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.33.self_attn.k_proj.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.33.self_attn.o_proj.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.33.self_attn.q_proj.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.33.self_attn.v_proj.weight": "model-00034-of-00050.safetensors",
+ "language_model.model.layers.34.feed_forward.experts.down_proj": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.34.feed_forward.experts.gate_up_proj": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.34.feed_forward.router.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.34.feed_forward.shared_expert.down_proj.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.34.feed_forward.shared_expert.gate_proj.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.34.feed_forward.shared_expert.up_proj.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.34.input_layernorm.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.34.post_attention_layernorm.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.34.self_attn.k_proj.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.34.self_attn.o_proj.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.34.self_attn.q_proj.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.34.self_attn.v_proj.weight": "model-00035-of-00050.safetensors",
+ "language_model.model.layers.35.feed_forward.experts.down_proj": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.35.feed_forward.experts.gate_up_proj": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.35.feed_forward.router.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.35.feed_forward.shared_expert.down_proj.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.35.feed_forward.shared_expert.gate_proj.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.35.feed_forward.shared_expert.up_proj.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.35.input_layernorm.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.35.post_attention_layernorm.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.35.self_attn.k_proj.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.35.self_attn.o_proj.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.35.self_attn.q_proj.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.35.self_attn.v_proj.weight": "model-00036-of-00050.safetensors",
+ "language_model.model.layers.36.feed_forward.experts.down_proj": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.36.feed_forward.experts.gate_up_proj": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.36.feed_forward.router.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.36.feed_forward.shared_expert.down_proj.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.36.feed_forward.shared_expert.gate_proj.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.36.feed_forward.shared_expert.up_proj.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.36.input_layernorm.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.36.post_attention_layernorm.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.36.self_attn.k_proj.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.36.self_attn.o_proj.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.36.self_attn.q_proj.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.36.self_attn.v_proj.weight": "model-00037-of-00050.safetensors",
+ "language_model.model.layers.37.feed_forward.experts.down_proj": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.37.feed_forward.experts.gate_up_proj": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.37.feed_forward.router.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.37.feed_forward.shared_expert.down_proj.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.37.feed_forward.shared_expert.gate_proj.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.37.feed_forward.shared_expert.up_proj.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.37.input_layernorm.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.37.post_attention_layernorm.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.37.self_attn.k_proj.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.37.self_attn.o_proj.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.37.self_attn.q_proj.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.37.self_attn.v_proj.weight": "model-00038-of-00050.safetensors",
+ "language_model.model.layers.38.feed_forward.experts.down_proj": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.38.feed_forward.experts.gate_up_proj": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.38.feed_forward.router.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.38.feed_forward.shared_expert.down_proj.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.38.feed_forward.shared_expert.gate_proj.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.38.feed_forward.shared_expert.up_proj.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.38.input_layernorm.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.38.post_attention_layernorm.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.38.self_attn.k_proj.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.38.self_attn.o_proj.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.38.self_attn.q_proj.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.38.self_attn.v_proj.weight": "model-00039-of-00050.safetensors",
+ "language_model.model.layers.39.feed_forward.experts.down_proj": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.39.feed_forward.experts.gate_up_proj": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.39.feed_forward.router.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.39.feed_forward.shared_expert.down_proj.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.39.feed_forward.shared_expert.gate_proj.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.39.feed_forward.shared_expert.up_proj.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.39.input_layernorm.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.39.post_attention_layernorm.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.39.self_attn.k_proj.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.39.self_attn.o_proj.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.39.self_attn.q_proj.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.39.self_attn.v_proj.weight": "model-00040-of-00050.safetensors",
+ "language_model.model.layers.4.feed_forward.experts.down_proj": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.4.feed_forward.experts.gate_up_proj": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.4.feed_forward.router.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.4.feed_forward.shared_expert.down_proj.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.4.feed_forward.shared_expert.gate_proj.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.4.feed_forward.shared_expert.up_proj.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.4.input_layernorm.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.4.post_attention_layernorm.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.4.self_attn.k_proj.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.4.self_attn.o_proj.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.4.self_attn.q_proj.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.4.self_attn.v_proj.weight": "model-00005-of-00050.safetensors",
+ "language_model.model.layers.40.feed_forward.experts.down_proj": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.40.feed_forward.experts.gate_up_proj": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.40.feed_forward.router.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.40.feed_forward.shared_expert.down_proj.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.40.feed_forward.shared_expert.gate_proj.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.40.feed_forward.shared_expert.up_proj.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.40.input_layernorm.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.40.post_attention_layernorm.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.40.self_attn.k_proj.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.40.self_attn.o_proj.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.40.self_attn.q_proj.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.40.self_attn.v_proj.weight": "model-00041-of-00050.safetensors",
+ "language_model.model.layers.41.feed_forward.experts.down_proj": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.41.feed_forward.experts.gate_up_proj": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.41.feed_forward.router.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.41.feed_forward.shared_expert.down_proj.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.41.feed_forward.shared_expert.gate_proj.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.41.feed_forward.shared_expert.up_proj.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.41.input_layernorm.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.41.post_attention_layernorm.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.41.self_attn.k_proj.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.41.self_attn.o_proj.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.41.self_attn.q_proj.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.41.self_attn.v_proj.weight": "model-00042-of-00050.safetensors",
+ "language_model.model.layers.42.feed_forward.experts.down_proj": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.42.feed_forward.experts.gate_up_proj": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.42.feed_forward.router.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.42.feed_forward.shared_expert.down_proj.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.42.feed_forward.shared_expert.gate_proj.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.42.feed_forward.shared_expert.up_proj.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.42.input_layernorm.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.42.post_attention_layernorm.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.42.self_attn.k_proj.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.42.self_attn.o_proj.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.42.self_attn.q_proj.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.42.self_attn.v_proj.weight": "model-00043-of-00050.safetensors",
+ "language_model.model.layers.43.feed_forward.experts.down_proj": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.43.feed_forward.experts.gate_up_proj": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.43.feed_forward.router.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.43.feed_forward.shared_expert.down_proj.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.43.feed_forward.shared_expert.gate_proj.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.43.feed_forward.shared_expert.up_proj.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.43.input_layernorm.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.43.post_attention_layernorm.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.43.self_attn.k_proj.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.43.self_attn.o_proj.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.43.self_attn.q_proj.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.43.self_attn.v_proj.weight": "model-00044-of-00050.safetensors",
+ "language_model.model.layers.44.feed_forward.experts.down_proj": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.44.feed_forward.experts.gate_up_proj": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.44.feed_forward.router.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.44.feed_forward.shared_expert.down_proj.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.44.feed_forward.shared_expert.gate_proj.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.44.feed_forward.shared_expert.up_proj.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.44.input_layernorm.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.44.post_attention_layernorm.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.44.self_attn.k_proj.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.44.self_attn.o_proj.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.44.self_attn.q_proj.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.44.self_attn.v_proj.weight": "model-00045-of-00050.safetensors",
+ "language_model.model.layers.45.feed_forward.experts.down_proj": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.45.feed_forward.experts.gate_up_proj": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.45.feed_forward.router.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.45.feed_forward.shared_expert.down_proj.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.45.feed_forward.shared_expert.gate_proj.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.45.feed_forward.shared_expert.up_proj.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.45.input_layernorm.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.45.post_attention_layernorm.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.45.self_attn.k_proj.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.45.self_attn.o_proj.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.45.self_attn.q_proj.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.45.self_attn.v_proj.weight": "model-00046-of-00050.safetensors",
+ "language_model.model.layers.46.feed_forward.experts.down_proj": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.46.feed_forward.experts.gate_up_proj": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.46.feed_forward.router.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.46.feed_forward.shared_expert.down_proj.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.46.feed_forward.shared_expert.gate_proj.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.46.feed_forward.shared_expert.up_proj.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.46.input_layernorm.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.46.post_attention_layernorm.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.46.self_attn.k_proj.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.46.self_attn.o_proj.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.46.self_attn.q_proj.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.46.self_attn.v_proj.weight": "model-00047-of-00050.safetensors",
+ "language_model.model.layers.47.feed_forward.experts.down_proj": "model-00049-of-00050.safetensors",
+ "language_model.model.layers.47.feed_forward.experts.gate_up_proj": "model-00049-of-00050.safetensors",
+ "language_model.model.layers.47.feed_forward.router.weight": "model-00049-of-00050.safetensors",
+ "language_model.model.layers.47.feed_forward.shared_expert.down_proj.weight": "model-00049-of-00050.safetensors",
+ "language_model.model.layers.47.feed_forward.shared_expert.gate_proj.weight": "model-00049-of-00050.safetensors",
+ "language_model.model.layers.47.feed_forward.shared_expert.up_proj.weight": "model-00049-of-00050.safetensors",
+ "language_model.model.layers.47.input_layernorm.weight": "model-00049-of-00050.safetensors",
+ "language_model.model.layers.47.post_attention_layernorm.weight": "model-00049-of-00050.safetensors",
+ "language_model.model.layers.47.self_attn.k_proj.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.47.self_attn.o_proj.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.47.self_attn.q_proj.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.47.self_attn.v_proj.weight": "model-00048-of-00050.safetensors",
+ "language_model.model.layers.5.feed_forward.experts.down_proj": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.5.feed_forward.experts.gate_up_proj": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.5.feed_forward.router.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.5.feed_forward.shared_expert.down_proj.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.5.feed_forward.shared_expert.gate_proj.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.5.feed_forward.shared_expert.up_proj.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.5.input_layernorm.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.5.post_attention_layernorm.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.5.self_attn.k_proj.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.5.self_attn.o_proj.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.5.self_attn.q_proj.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.5.self_attn.v_proj.weight": "model-00006-of-00050.safetensors",
+ "language_model.model.layers.6.feed_forward.experts.down_proj": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.6.feed_forward.experts.gate_up_proj": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.6.feed_forward.router.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.6.feed_forward.shared_expert.down_proj.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.6.feed_forward.shared_expert.gate_proj.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.6.feed_forward.shared_expert.up_proj.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.6.input_layernorm.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.6.post_attention_layernorm.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.6.self_attn.k_proj.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.6.self_attn.o_proj.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.6.self_attn.q_proj.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.6.self_attn.v_proj.weight": "model-00007-of-00050.safetensors",
+ "language_model.model.layers.7.feed_forward.experts.down_proj": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.7.feed_forward.experts.gate_up_proj": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.7.feed_forward.router.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.7.feed_forward.shared_expert.down_proj.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.7.feed_forward.shared_expert.gate_proj.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.7.feed_forward.shared_expert.up_proj.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.7.input_layernorm.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.7.post_attention_layernorm.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.7.self_attn.k_proj.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.7.self_attn.o_proj.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.7.self_attn.q_proj.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.7.self_attn.v_proj.weight": "model-00008-of-00050.safetensors",
+ "language_model.model.layers.8.feed_forward.experts.down_proj": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.8.feed_forward.experts.gate_up_proj": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.8.feed_forward.router.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.8.feed_forward.shared_expert.down_proj.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.8.feed_forward.shared_expert.gate_proj.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.8.feed_forward.shared_expert.up_proj.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.8.input_layernorm.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.8.post_attention_layernorm.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.8.self_attn.k_proj.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.8.self_attn.o_proj.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.8.self_attn.q_proj.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.8.self_attn.v_proj.weight": "model-00009-of-00050.safetensors",
+ "language_model.model.layers.9.feed_forward.experts.down_proj": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.9.feed_forward.experts.gate_up_proj": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.9.feed_forward.router.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.9.feed_forward.shared_expert.down_proj.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.9.feed_forward.shared_expert.gate_proj.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.9.feed_forward.shared_expert.up_proj.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.9.input_layernorm.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.9.post_attention_layernorm.weight": "model-00011-of-00050.safetensors",
+ "language_model.model.layers.9.self_attn.k_proj.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.9.self_attn.o_proj.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.9.self_attn.q_proj.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.layers.9.self_attn.v_proj.weight": "model-00010-of-00050.safetensors",
+ "language_model.model.norm.weight": "model-00049-of-00050.safetensors",
+ "multi_modal_projector.linear_1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.class_embedding": "model-00001-of-00050.safetensors",
+ "vision_model.layernorm_post.bias": "model-00001-of-00050.safetensors",
+ "vision_model.layernorm_post.weight": "model-00001-of-00050.safetensors",
+ "vision_model.layernorm_pre.bias": "model-00001-of-00050.safetensors",
+ "vision_model.layernorm_pre.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.0.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.1.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.10.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.11.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.12.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.13.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.14.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.15.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.16.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.17.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.18.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.19.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.2.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.20.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.21.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.22.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.23.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.24.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.25.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.26.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.27.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.28.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.29.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.3.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.30.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.31.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.32.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.33.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.4.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.5.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.6.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.7.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.8.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.input_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.input_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.mlp.fc1.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.mlp.fc2.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.mlp.fc2.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.post_attention_layernorm.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.post_attention_layernorm.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.self_attn.k_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.self_attn.k_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.self_attn.o_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.self_attn.o_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.self_attn.q_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.self_attn.q_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.self_attn.v_proj.bias": "model-00001-of-00050.safetensors",
+ "vision_model.model.layers.9.self_attn.v_proj.weight": "model-00001-of-00050.safetensors",
+ "vision_model.patch_embedding.linear.weight": "model-00001-of-00050.safetensors",
+ "vision_model.positional_embedding_vlm": "model-00001-of-00050.safetensors",
+ "vision_model.vision_adapter.mlp.fc1.weight": "model-00001-of-00050.safetensors",
+ "vision_model.vision_adapter.mlp.fc2.weight": "model-00001-of-00050.safetensors"
+ }
+}
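The `weight_map` above assigns every tensor to one of the 50 safetensors shards. A minimal sketch (not part of the repository; the local directory name is a placeholder) of using that index to read a single tensor without loading the whole checkpoint:

```python
# Resolve one tensor via model.safetensors.index.json and read only that tensor
# from its shard. "checkpoint_dir" is a hypothetical local path to this repo.
import json
from safetensors import safe_open

checkpoint_dir = "./Llama-4-Scout-checkpoint"
with open(f"{checkpoint_dir}/model.safetensors.index.json") as f:
    index = json.load(f)

name = "language_model.model.norm.weight"
shard = index["weight_map"][name]            # e.g. "model-00049-of-00050.safetensors"
with safe_open(f"{checkpoint_dir}/{shard}", framework="pt") as shard_file:
    tensor = shard_file.get_tensor(name)     # only this tensor is materialized
print(name, tuple(tensor.shape))
```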
diff --git a/preprocessor_config.json b/preprocessor_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..4eef772357a282205fface45491471dc35ad030d
--- /dev/null
+++ b/preprocessor_config.json
@@ -0,0 +1,33 @@
+{
+ "crop_size": null,
+ "data_format": "channels_first",
+ "default_to_square": true,
+ "device": null,
+ "do_center_crop": null,
+ "do_convert_rgb": true,
+ "do_normalize": true,
+ "do_rescale": true,
+ "do_resize": true,
+ "image_mean": [
+ 0.5,
+ 0.5,
+ 0.5
+ ],
+ "image_processor_type": "Llama4ImageProcessorFast",
+ "image_std": [
+ 0.5,
+ 0.5,
+ 0.5
+ ],
+ "input_data_format": null,
+ "max_patches": 16,
+ "processor_class": "Llama4Processor",
+ "resample": 2,
+ "rescale_factor": 0.00392156862745098,
+ "resize_to_max_canvas": false,
+ "return_tensors": null,
+ "size": {
+ "height": 336,
+ "width": 336
+ }
+}
diff --git a/processor_config.json b/processor_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..2f3cae49c25616991dc8876e733fde78cfb354f8
--- /dev/null
+++ b/processor_config.json
@@ -0,0 +1,6 @@
+{
+ "fake_image_token": "<|image|>",
+ "image_token": "<|image|>",
+ "patch_size": 14,
+ "processor_class": "Llama4Processor"
+}
diff --git a/special_tokens_map.json b/special_tokens_map.json
new file mode 100644
index 0000000000000000000000000000000000000000..8df8c3b7c45d5b76ae83109530bf6630e94ba50d
--- /dev/null
+++ b/special_tokens_map.json
@@ -0,0 +1,5 @@
+{
+ "bos_token": "<|begin_of_text|>",
+ "eos_token": "<|eot|>",
+ "pad_token": "<|finetune_right_pad_id|>"
+}
diff --git a/tokenizer.json b/tokenizer.json
new file mode 100644
index 0000000000000000000000000000000000000000..b1fde397c877f796b68ca425082644bb07a20535
--- /dev/null
+++ b/tokenizer.json
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:172c9eb4beafc72601690da3ccfcede5c2e6806a8d5ec1fca33e22acea8023a4
+size 27948578
diff --git a/tokenizer.model b/tokenizer.model
new file mode 100644
index 0000000000000000000000000000000000000000..4f144055ba251edcd43b3a09e66324cd940e9443
--- /dev/null
+++ b/tokenizer.model
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d0bdbaf59b0762c8c807617e2d8ea51420eb1b1de266df2495be755c8e0ed6ed
+size 3622230
diff --git a/tokenizer_config.json b/tokenizer_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..01ccdf80112f2fab832b2d1641d71e4d6df95f1a
--- /dev/null
+++ b/tokenizer_config.json
@@ -0,0 +1,9097 @@
+{
+ "added_tokens_decoder": {
+ "200000": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200001": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200002": {
+ "content": "<|fim_prefix|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200003": {
+ "content": "<|fim_middle|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200004": {
+ "content": "<|fim_suffix|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200005": {
+ "content": "<|header_start|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200006": {
+ "content": "<|header_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200007": {
+ "content": "<|eom|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200008": {
+ "content": "<|eot|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200009": {
+ "content": "<|step|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200010": {
+ "content": "<|text_post_train_reserved_special_token_0|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200011": {
+ "content": "<|text_post_train_reserved_special_token_1|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200012": {
+ "content": "<|text_post_train_reserved_special_token_2|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200013": {
+ "content": "<|text_post_train_reserved_special_token_3|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200014": {
+ "content": "<|text_post_train_reserved_special_token_4|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200015": {
+ "content": "<|text_post_train_reserved_special_token_5|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200016": {
+ "content": "<|python_start|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200017": {
+ "content": "<|python_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200018": {
+ "content": "<|finetune_right_pad|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200019": {
+ "content": "<|text_post_train_reserved_special_token_6|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200020": {
+ "content": "<|text_post_train_reserved_special_token_7|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200021": {
+ "content": "<|text_post_train_reserved_special_token_8|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200022": {
+ "content": "<|text_post_train_reserved_special_token_9|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200023": {
+ "content": "<|text_post_train_reserved_special_token_10|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200024": {
+ "content": "<|text_post_train_reserved_special_token_11|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200025": {
+ "content": "<|text_post_train_reserved_special_token_12|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200026": {
+ "content": "<|text_post_train_reserved_special_token_13|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200027": {
+ "content": "<|text_post_train_reserved_special_token_14|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200028": {
+ "content": "<|text_post_train_reserved_special_token_15|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200029": {
+ "content": "<|text_post_train_reserved_special_token_16|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200030": {
+ "content": "<|text_post_train_reserved_special_token_17|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200031": {
+ "content": "<|text_post_train_reserved_special_token_18|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200032": {
+ "content": "<|text_post_train_reserved_special_token_19|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200033": {
+ "content": "<|text_post_train_reserved_special_token_20|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200034": {
+ "content": "<|text_post_train_reserved_special_token_21|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200035": {
+ "content": "<|text_post_train_reserved_special_token_22|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200036": {
+ "content": "<|text_post_train_reserved_special_token_23|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200037": {
+ "content": "<|text_post_train_reserved_special_token_24|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200038": {
+ "content": "<|text_post_train_reserved_special_token_25|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200039": {
+ "content": "<|text_post_train_reserved_special_token_26|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200040": {
+ "content": "<|text_post_train_reserved_special_token_27|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200041": {
+ "content": "<|text_post_train_reserved_special_token_28|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200042": {
+ "content": "<|text_post_train_reserved_special_token_29|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200043": {
+ "content": "<|text_post_train_reserved_special_token_30|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200044": {
+ "content": "<|text_post_train_reserved_special_token_31|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200045": {
+ "content": "<|text_post_train_reserved_special_token_32|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200046": {
+ "content": "<|text_post_train_reserved_special_token_33|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200047": {
+ "content": "<|text_post_train_reserved_special_token_34|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200048": {
+ "content": "<|text_post_train_reserved_special_token_35|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200049": {
+ "content": "<|text_post_train_reserved_special_token_36|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200050": {
+ "content": "<|text_post_train_reserved_special_token_37|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200051": {
+ "content": "<|text_post_train_reserved_special_token_38|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200052": {
+ "content": "<|text_post_train_reserved_special_token_39|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200053": {
+ "content": "<|text_post_train_reserved_special_token_40|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200054": {
+ "content": "<|text_post_train_reserved_special_token_41|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200055": {
+ "content": "<|text_post_train_reserved_special_token_42|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200056": {
+ "content": "<|text_post_train_reserved_special_token_43|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200057": {
+ "content": "<|text_post_train_reserved_special_token_44|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200058": {
+ "content": "<|text_post_train_reserved_special_token_45|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200059": {
+ "content": "<|text_post_train_reserved_special_token_46|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200060": {
+ "content": "<|text_post_train_reserved_special_token_47|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200061": {
+ "content": "<|text_post_train_reserved_special_token_48|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200062": {
+ "content": "<|text_post_train_reserved_special_token_49|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200063": {
+ "content": "<|text_post_train_reserved_special_token_50|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200064": {
+ "content": "<|text_post_train_reserved_special_token_51|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200065": {
+ "content": "<|text_post_train_reserved_special_token_52|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200066": {
+ "content": "<|text_post_train_reserved_special_token_53|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200067": {
+ "content": "<|text_post_train_reserved_special_token_54|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200068": {
+ "content": "<|text_post_train_reserved_special_token_55|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200069": {
+ "content": "<|text_post_train_reserved_special_token_56|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200070": {
+ "content": "<|text_post_train_reserved_special_token_57|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200071": {
+ "content": "<|text_post_train_reserved_special_token_58|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200072": {
+ "content": "<|text_post_train_reserved_special_token_59|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200073": {
+ "content": "<|text_post_train_reserved_special_token_60|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200074": {
+ "content": "<|text_post_train_reserved_special_token_61|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200075": {
+ "content": "<|text_post_train_reserved_special_token_62|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200076": {
+ "content": "<|text_post_train_reserved_special_token_63|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200077": {
+ "content": "<|text_post_train_reserved_special_token_64|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200078": {
+ "content": "<|text_post_train_reserved_special_token_65|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200079": {
+ "content": "<|text_post_train_reserved_special_token_66|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200080": {
+ "content": "<|image_start|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200081": {
+ "content": "<|image_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200082": {
+ "content": "<|vision_reserved_special_token_0|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200083": {
+ "content": "<|vision_reserved_special_token_1|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200084": {
+ "content": "<|tile_x_separator|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200085": {
+ "content": "<|tile_y_separator|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200086": {
+ "content": "<|vision_reserved_special_token_2|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200087": {
+ "content": "<|vision_reserved_special_token_3|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200088": {
+ "content": "<|vision_reserved_special_token_4|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200089": {
+ "content": "<|vision_reserved_special_token_5|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200090": {
+ "content": "<|image|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200091": {
+ "content": "<|vision_reserved_special_token_6|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200092": {
+ "content": "<|patch|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200093": {
+ "content": "<|vision_reserved_special_token_7|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200094": {
+ "content": "<|vision_reserved_special_token_8|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200095": {
+ "content": "<|vision_reserved_special_token_9|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200096": {
+ "content": "<|vision_reserved_special_token_10|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200097": {
+ "content": "<|vision_reserved_special_token_11|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200098": {
+ "content": "<|vision_reserved_special_token_12|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200099": {
+ "content": "<|vision_reserved_special_token_13|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200100": {
+ "content": "<|vision_reserved_special_token_14|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200101": {
+ "content": "<|vision_reserved_special_token_15|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200102": {
+ "content": "<|vision_reserved_special_token_16|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200103": {
+ "content": "<|vision_reserved_special_token_17|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200104": {
+ "content": "<|vision_reserved_special_token_18|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200105": {
+ "content": "<|vision_reserved_special_token_19|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200106": {
+ "content": "<|vision_reserved_special_token_20|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200107": {
+ "content": "<|vision_reserved_special_token_21|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200108": {
+ "content": "<|vision_reserved_special_token_22|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200109": {
+ "content": "<|vision_reserved_special_token_23|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200110": {
+ "content": "<|vision_reserved_special_token_24|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200111": {
+ "content": "<|vision_reserved_special_token_25|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200112": {
+ "content": "<|vision_reserved_special_token_26|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200113": {
+ "content": "<|vision_reserved_special_token_27|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200114": {
+ "content": "<|vision_reserved_special_token_28|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200115": {
+ "content": "<|vision_reserved_special_token_29|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200116": {
+ "content": "<|vision_reserved_special_token_30|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200117": {
+ "content": "<|vision_reserved_special_token_31|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200118": {
+ "content": "<|vision_reserved_special_token_32|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200119": {
+ "content": "<|vision_reserved_special_token_33|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200120": {
+ "content": "<|vision_reserved_special_token_34|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200121": {
+ "content": "<|vision_reserved_special_token_35|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200122": {
+ "content": "<|vision_reserved_special_token_36|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200123": {
+ "content": "<|vision_reserved_special_token_37|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200124": {
+ "content": "<|vision_reserved_special_token_38|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200125": {
+ "content": "<|vision_reserved_special_token_39|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200126": {
+ "content": "<|vision_reserved_special_token_40|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200127": {
+ "content": "<|vision_reserved_special_token_41|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200128": {
+ "content": "<|vision_reserved_special_token_42|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200129": {
+ "content": "<|vision_reserved_special_token_43|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200130": {
+ "content": "<|vision_reserved_special_token_44|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200131": {
+ "content": "<|vision_reserved_special_token_45|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200132": {
+ "content": "<|vision_reserved_special_token_46|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200133": {
+ "content": "<|vision_reserved_special_token_47|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200134": {
+ "content": "<|vision_reserved_special_token_48|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200135": {
+ "content": "<|vision_reserved_special_token_49|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200136": {
+ "content": "<|vision_reserved_special_token_50|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200137": {
+ "content": "<|vision_reserved_special_token_51|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200138": {
+ "content": "<|vision_reserved_special_token_52|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200139": {
+ "content": "<|vision_reserved_special_token_53|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200140": {
+ "content": "<|vision_reserved_special_token_54|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200141": {
+ "content": "<|vision_reserved_special_token_55|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200142": {
+ "content": "<|vision_reserved_special_token_56|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200143": {
+ "content": "<|vision_reserved_special_token_57|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200144": {
+ "content": "<|vision_reserved_special_token_58|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200145": {
+ "content": "<|vision_reserved_special_token_59|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200146": {
+ "content": "<|vision_reserved_special_token_60|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200147": {
+ "content": "<|vision_reserved_special_token_61|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200148": {
+ "content": "<|vision_reserved_special_token_62|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200149": {
+ "content": "<|vision_reserved_special_token_63|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200150": {
+ "content": "<|vision_reserved_special_token_64|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200151": {
+ "content": "<|vision_reserved_special_token_65|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200152": {
+ "content": "<|vision_reserved_special_token_66|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200153": {
+ "content": "<|vision_reserved_special_token_67|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200154": {
+ "content": "<|vision_reserved_special_token_68|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200155": {
+ "content": "<|vision_reserved_special_token_69|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200156": {
+ "content": "<|vision_reserved_special_token_70|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200157": {
+ "content": "<|vision_reserved_special_token_71|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200158": {
+ "content": "<|vision_reserved_special_token_72|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200159": {
+ "content": "<|vision_reserved_special_token_73|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200160": {
+ "content": "<|vision_reserved_special_token_74|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200161": {
+ "content": "<|vision_reserved_special_token_75|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200162": {
+ "content": "<|vision_reserved_special_token_76|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200163": {
+ "content": "<|vision_reserved_special_token_77|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200164": {
+ "content": "<|vision_reserved_special_token_78|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200165": {
+ "content": "<|vision_reserved_special_token_79|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200166": {
+ "content": "<|vision_reserved_special_token_80|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200167": {
+ "content": "<|vision_reserved_special_token_81|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200168": {
+ "content": "<|vision_reserved_special_token_82|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200169": {
+ "content": "<|vision_reserved_special_token_83|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200170": {
+ "content": "<|vision_reserved_special_token_84|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200171": {
+ "content": "<|vision_reserved_special_token_85|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200172": {
+ "content": "<|vision_reserved_special_token_86|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200173": {
+ "content": "<|vision_reserved_special_token_87|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200174": {
+ "content": "<|vision_reserved_special_token_88|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200175": {
+ "content": "<|vision_reserved_special_token_89|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200176": {
+ "content": "<|vision_reserved_special_token_90|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200177": {
+ "content": "<|vision_reserved_special_token_91|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200178": {
+ "content": "<|vision_reserved_special_token_92|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200179": {
+ "content": "<|vision_reserved_special_token_93|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200180": {
+ "content": "<|vision_reserved_special_token_94|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200181": {
+ "content": "<|vision_reserved_special_token_95|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200182": {
+ "content": "<|vision_reserved_special_token_96|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200183": {
+ "content": "<|vision_reserved_special_token_97|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200184": {
+ "content": "<|vision_reserved_special_token_98|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200185": {
+ "content": "<|vision_reserved_special_token_99|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200186": {
+ "content": "<|vision_reserved_special_token_100|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200187": {
+ "content": "<|vision_reserved_special_token_101|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200188": {
+ "content": "<|vision_reserved_special_token_102|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200189": {
+ "content": "<|vision_reserved_special_token_103|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200190": {
+ "content": "<|vision_reserved_special_token_104|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200191": {
+ "content": "<|vision_reserved_special_token_105|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200192": {
+ "content": "<|vision_reserved_special_token_106|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200193": {
+ "content": "<|vision_reserved_special_token_107|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200194": {
+ "content": "<|vision_reserved_special_token_108|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200195": {
+ "content": "<|vision_reserved_special_token_109|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200196": {
+ "content": "<|vision_reserved_special_token_110|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200197": {
+ "content": "<|vision_reserved_special_token_111|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200198": {
+ "content": "<|vision_reserved_special_token_112|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200199": {
+ "content": "<|vision_reserved_special_token_113|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200200": {
+ "content": "<|vision_reserved_special_token_114|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200201": {
+ "content": "<|vision_reserved_special_token_115|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200202": {
+ "content": "<|vision_reserved_special_token_116|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200203": {
+ "content": "<|vision_reserved_special_token_117|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200204": {
+ "content": "<|vision_reserved_special_token_118|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200205": {
+ "content": "<|vision_reserved_special_token_119|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200206": {
+ "content": "<|vision_reserved_special_token_120|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200207": {
+ "content": "<|vision_reserved_special_token_121|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200208": {
+ "content": "<|vision_reserved_special_token_122|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200209": {
+ "content": "<|vision_reserved_special_token_123|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200210": {
+ "content": "<|vision_reserved_special_token_124|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200211": {
+ "content": "<|vision_reserved_special_token_125|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200212": {
+ "content": "<|vision_reserved_special_token_126|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200213": {
+ "content": "<|vision_reserved_special_token_127|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200214": {
+ "content": "<|vision_reserved_special_token_128|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200215": {
+ "content": "<|vision_reserved_special_token_129|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200216": {
+ "content": "<|vision_reserved_special_token_130|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200217": {
+ "content": "<|vision_reserved_special_token_131|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200218": {
+ "content": "<|vision_reserved_special_token_132|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200219": {
+ "content": "<|vision_reserved_special_token_133|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200220": {
+ "content": "<|vision_reserved_special_token_134|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200221": {
+ "content": "<|vision_reserved_special_token_135|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200222": {
+ "content": "<|vision_reserved_special_token_136|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200223": {
+ "content": "<|vision_reserved_special_token_137|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200224": {
+ "content": "<|vision_reserved_special_token_138|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200225": {
+ "content": "<|vision_reserved_special_token_139|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200226": {
+ "content": "<|vision_reserved_special_token_140|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200227": {
+ "content": "<|vision_reserved_special_token_141|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200228": {
+ "content": "<|vision_reserved_special_token_142|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200229": {
+ "content": "<|vision_reserved_special_token_143|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200230": {
+ "content": "<|vision_reserved_special_token_144|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200231": {
+ "content": "<|vision_reserved_special_token_145|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200232": {
+ "content": "<|vision_reserved_special_token_146|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200233": {
+ "content": "<|vision_reserved_special_token_147|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200234": {
+ "content": "<|vision_reserved_special_token_148|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200235": {
+ "content": "<|vision_reserved_special_token_149|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200236": {
+ "content": "<|vision_reserved_special_token_150|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200237": {
+ "content": "<|vision_reserved_special_token_151|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200238": {
+ "content": "<|vision_reserved_special_token_152|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200239": {
+ "content": "<|vision_reserved_special_token_153|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200240": {
+ "content": "<|vision_reserved_special_token_154|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200241": {
+ "content": "<|vision_reserved_special_token_155|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200242": {
+ "content": "<|vision_reserved_special_token_156|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200243": {
+ "content": "<|vision_reserved_special_token_157|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200244": {
+ "content": "<|vision_reserved_special_token_158|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200245": {
+ "content": "<|vision_reserved_special_token_159|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200246": {
+ "content": "<|vision_reserved_special_token_160|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200247": {
+ "content": "<|vision_reserved_special_token_161|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200248": {
+ "content": "<|vision_reserved_special_token_162|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200249": {
+ "content": "<|vision_reserved_special_token_163|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200250": {
+ "content": "<|vision_reserved_special_token_164|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200251": {
+ "content": "<|vision_reserved_special_token_165|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200252": {
+ "content": "<|vision_reserved_special_token_166|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200253": {
+ "content": "<|vision_reserved_special_token_167|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200254": {
+ "content": "<|vision_reserved_special_token_168|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200255": {
+ "content": "<|vision_reserved_special_token_169|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200256": {
+ "content": "<|vision_reserved_special_token_170|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200257": {
+ "content": "<|vision_reserved_special_token_171|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200258": {
+ "content": "<|vision_reserved_special_token_172|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200259": {
+ "content": "<|vision_reserved_special_token_173|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200260": {
+ "content": "<|vision_reserved_special_token_174|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200261": {
+ "content": "<|vision_reserved_special_token_175|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200262": {
+ "content": "<|vision_reserved_special_token_176|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200263": {
+ "content": "<|vision_reserved_special_token_177|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200264": {
+ "content": "<|vision_reserved_special_token_178|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200265": {
+ "content": "<|vision_reserved_special_token_179|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200266": {
+ "content": "<|vision_reserved_special_token_180|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200267": {
+ "content": "<|vision_reserved_special_token_181|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200268": {
+ "content": "<|vision_reserved_special_token_182|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200269": {
+ "content": "<|vision_reserved_special_token_183|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200270": {
+ "content": "<|vision_reserved_special_token_184|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200271": {
+ "content": "<|vision_reserved_special_token_185|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200272": {
+ "content": "<|vision_reserved_special_token_186|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200273": {
+ "content": "<|vision_reserved_special_token_187|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200274": {
+ "content": "<|vision_reserved_special_token_188|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200275": {
+ "content": "<|vision_reserved_special_token_189|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200276": {
+ "content": "<|vision_reserved_special_token_190|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200277": {
+ "content": "<|vision_reserved_special_token_191|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200278": {
+ "content": "<|vision_reserved_special_token_192|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200279": {
+ "content": "<|vision_reserved_special_token_193|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200280": {
+ "content": "<|vision_reserved_special_token_194|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200281": {
+ "content": "<|vision_reserved_special_token_195|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200282": {
+ "content": "<|vision_reserved_special_token_196|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200283": {
+ "content": "<|vision_reserved_special_token_197|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200284": {
+ "content": "<|vision_reserved_special_token_198|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200285": {
+ "content": "<|vision_reserved_special_token_199|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200286": {
+ "content": "<|vision_reserved_special_token_200|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200287": {
+ "content": "<|vision_reserved_special_token_201|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200288": {
+ "content": "<|vision_reserved_special_token_202|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200289": {
+ "content": "<|vision_reserved_special_token_203|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200290": {
+ "content": "<|vision_reserved_special_token_204|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200291": {
+ "content": "<|vision_reserved_special_token_205|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200292": {
+ "content": "<|vision_reserved_special_token_206|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200293": {
+ "content": "<|vision_reserved_special_token_207|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200294": {
+ "content": "<|vision_reserved_special_token_208|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200295": {
+ "content": "<|vision_reserved_special_token_209|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200296": {
+ "content": "<|vision_reserved_special_token_210|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200297": {
+ "content": "<|vision_reserved_special_token_211|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200298": {
+ "content": "<|vision_reserved_special_token_212|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200299": {
+ "content": "<|vision_reserved_special_token_213|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200300": {
+ "content": "<|vision_reserved_special_token_214|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200301": {
+ "content": "<|vision_reserved_special_token_215|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200302": {
+ "content": "<|vision_reserved_special_token_216|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200303": {
+ "content": "<|vision_reserved_special_token_217|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200304": {
+ "content": "<|vision_reserved_special_token_218|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200305": {
+ "content": "<|vision_reserved_special_token_219|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200306": {
+ "content": "<|vision_reserved_special_token_220|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200307": {
+ "content": "<|vision_reserved_special_token_221|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200308": {
+ "content": "<|vision_reserved_special_token_222|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200309": {
+ "content": "<|vision_reserved_special_token_223|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200310": {
+ "content": "<|vision_reserved_special_token_224|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200311": {
+ "content": "<|vision_reserved_special_token_225|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200312": {
+ "content": "<|vision_reserved_special_token_226|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200313": {
+ "content": "<|vision_reserved_special_token_227|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200314": {
+ "content": "<|vision_reserved_special_token_228|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200315": {
+ "content": "<|vision_reserved_special_token_229|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200316": {
+ "content": "<|vision_reserved_special_token_230|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200317": {
+ "content": "<|vision_reserved_special_token_231|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200318": {
+ "content": "<|vision_reserved_special_token_232|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200319": {
+ "content": "<|vision_reserved_special_token_233|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200320": {
+ "content": "<|vision_reserved_special_token_234|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200321": {
+ "content": "<|vision_reserved_special_token_235|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200322": {
+ "content": "<|vision_reserved_special_token_236|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200323": {
+ "content": "<|vision_reserved_special_token_237|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200324": {
+ "content": "<|vision_reserved_special_token_238|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200325": {
+ "content": "<|vision_reserved_special_token_239|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200326": {
+ "content": "<|vision_reserved_special_token_240|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200327": {
+ "content": "<|vision_reserved_special_token_241|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200328": {
+ "content": "<|vision_reserved_special_token_242|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200329": {
+ "content": "<|vision_reserved_special_token_243|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200330": {
+ "content": "<|vision_reserved_special_token_244|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200331": {
+ "content": "<|vision_reserved_special_token_245|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200332": {
+ "content": "<|vision_reserved_special_token_246|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200333": {
+ "content": "<|vision_reserved_special_token_247|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200334": {
+ "content": "<|vision_reserved_special_token_248|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200335": {
+ "content": "<|vision_reserved_special_token_249|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200336": {
+ "content": "<|vision_reserved_special_token_250|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200337": {
+ "content": "<|vision_reserved_special_token_251|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200338": {
+ "content": "<|vision_reserved_special_token_252|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200339": {
+ "content": "<|vision_reserved_special_token_253|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200340": {
+ "content": "<|vision_reserved_special_token_254|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200341": {
+ "content": "<|vision_reserved_special_token_255|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200342": {
+ "content": "<|vision_reserved_special_token_256|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200343": {
+ "content": "<|vision_reserved_special_token_257|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200344": {
+ "content": "<|vision_reserved_special_token_258|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200345": {
+ "content": "<|vision_reserved_special_token_259|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200346": {
+ "content": "<|vision_reserved_special_token_260|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200347": {
+ "content": "<|vision_reserved_special_token_261|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200348": {
+ "content": "<|vision_reserved_special_token_262|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200349": {
+ "content": "<|vision_reserved_special_token_263|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200350": {
+ "content": "<|vision_reserved_special_token_264|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200351": {
+ "content": "<|vision_reserved_special_token_265|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200352": {
+ "content": "<|vision_reserved_special_token_266|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200353": {
+ "content": "<|vision_reserved_special_token_267|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200354": {
+ "content": "<|vision_reserved_special_token_268|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200355": {
+ "content": "<|vision_reserved_special_token_269|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200356": {
+ "content": "<|vision_reserved_special_token_270|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200357": {
+ "content": "<|vision_reserved_special_token_271|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200358": {
+ "content": "<|vision_reserved_special_token_272|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200359": {
+ "content": "<|vision_reserved_special_token_273|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200360": {
+ "content": "<|vision_reserved_special_token_274|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200361": {
+ "content": "<|vision_reserved_special_token_275|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200362": {
+ "content": "<|vision_reserved_special_token_276|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200363": {
+ "content": "<|vision_reserved_special_token_277|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200364": {
+ "content": "<|vision_reserved_special_token_278|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200365": {
+ "content": "<|vision_reserved_special_token_279|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200366": {
+ "content": "<|vision_reserved_special_token_280|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200367": {
+ "content": "<|vision_reserved_special_token_281|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200368": {
+ "content": "<|vision_reserved_special_token_282|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200369": {
+ "content": "<|vision_reserved_special_token_283|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200370": {
+ "content": "<|vision_reserved_special_token_284|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200371": {
+ "content": "<|vision_reserved_special_token_285|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200372": {
+ "content": "<|vision_reserved_special_token_286|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200373": {
+ "content": "<|vision_reserved_special_token_287|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200374": {
+ "content": "<|vision_reserved_special_token_288|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200375": {
+ "content": "<|vision_reserved_special_token_289|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200376": {
+ "content": "<|vision_reserved_special_token_290|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200377": {
+ "content": "<|vision_reserved_special_token_291|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200378": {
+ "content": "<|vision_reserved_special_token_292|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200379": {
+ "content": "<|vision_reserved_special_token_293|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200380": {
+ "content": "<|vision_reserved_special_token_294|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200381": {
+ "content": "<|vision_reserved_special_token_295|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200382": {
+ "content": "<|vision_reserved_special_token_296|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200383": {
+ "content": "<|vision_reserved_special_token_297|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200384": {
+ "content": "<|vision_reserved_special_token_298|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200385": {
+ "content": "<|vision_reserved_special_token_299|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200386": {
+ "content": "<|vision_reserved_special_token_300|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200387": {
+ "content": "<|vision_reserved_special_token_301|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200388": {
+ "content": "<|vision_reserved_special_token_302|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200389": {
+ "content": "<|vision_reserved_special_token_303|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200390": {
+ "content": "<|vision_reserved_special_token_304|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200391": {
+ "content": "<|vision_reserved_special_token_305|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200392": {
+ "content": "<|vision_reserved_special_token_306|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200393": {
+ "content": "<|vision_reserved_special_token_307|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200394": {
+ "content": "<|vision_reserved_special_token_308|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200395": {
+ "content": "<|vision_reserved_special_token_309|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200396": {
+ "content": "<|vision_reserved_special_token_310|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200397": {
+ "content": "<|vision_reserved_special_token_311|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200398": {
+ "content": "<|vision_reserved_special_token_312|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200399": {
+ "content": "<|vision_reserved_special_token_313|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200400": {
+ "content": "<|vision_reserved_special_token_314|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200401": {
+ "content": "<|vision_reserved_special_token_315|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200402": {
+ "content": "<|vision_reserved_special_token_316|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200403": {
+ "content": "<|vision_reserved_special_token_317|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200404": {
+ "content": "<|vision_reserved_special_token_318|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200405": {
+ "content": "<|vision_reserved_special_token_319|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200406": {
+ "content": "<|vision_reserved_special_token_320|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200407": {
+ "content": "<|vision_reserved_special_token_321|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200408": {
+ "content": "<|vision_reserved_special_token_322|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200409": {
+ "content": "<|vision_reserved_special_token_323|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200410": {
+ "content": "<|vision_reserved_special_token_324|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200411": {
+ "content": "<|vision_reserved_special_token_325|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200412": {
+ "content": "<|vision_reserved_special_token_326|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200413": {
+ "content": "<|vision_reserved_special_token_327|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200414": {
+ "content": "<|vision_reserved_special_token_328|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200415": {
+ "content": "<|vision_reserved_special_token_329|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200416": {
+ "content": "<|vision_reserved_special_token_330|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200417": {
+ "content": "<|vision_reserved_special_token_331|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200418": {
+ "content": "<|vision_reserved_special_token_332|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200419": {
+ "content": "<|vision_reserved_special_token_333|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200420": {
+ "content": "<|vision_reserved_special_token_334|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200421": {
+ "content": "<|vision_reserved_special_token_335|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200422": {
+ "content": "<|vision_reserved_special_token_336|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200423": {
+ "content": "<|vision_reserved_special_token_337|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200424": {
+ "content": "<|vision_reserved_special_token_338|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200425": {
+ "content": "<|vision_reserved_special_token_339|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200426": {
+ "content": "<|vision_reserved_special_token_340|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200427": {
+ "content": "<|vision_reserved_special_token_341|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200428": {
+ "content": "<|vision_reserved_special_token_342|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200429": {
+ "content": "<|vision_reserved_special_token_343|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200430": {
+ "content": "<|vision_reserved_special_token_344|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200431": {
+ "content": "<|vision_reserved_special_token_345|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200432": {
+ "content": "<|vision_reserved_special_token_346|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200433": {
+ "content": "<|vision_reserved_special_token_347|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200434": {
+ "content": "<|vision_reserved_special_token_348|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200435": {
+ "content": "<|vision_reserved_special_token_349|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200436": {
+ "content": "<|vision_reserved_special_token_350|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200437": {
+ "content": "<|vision_reserved_special_token_351|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200438": {
+ "content": "<|vision_reserved_special_token_352|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200439": {
+ "content": "<|vision_reserved_special_token_353|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200440": {
+ "content": "<|vision_reserved_special_token_354|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200441": {
+ "content": "<|vision_reserved_special_token_355|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200442": {
+ "content": "<|vision_reserved_special_token_356|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200443": {
+ "content": "<|vision_reserved_special_token_357|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200444": {
+ "content": "<|vision_reserved_special_token_358|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200445": {
+ "content": "<|vision_reserved_special_token_359|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200446": {
+ "content": "<|vision_reserved_special_token_360|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200447": {
+ "content": "<|vision_reserved_special_token_361|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200448": {
+ "content": "<|vision_reserved_special_token_362|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200449": {
+ "content": "<|vision_reserved_special_token_363|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200450": {
+ "content": "<|vision_reserved_special_token_364|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200451": {
+ "content": "<|vision_reserved_special_token_365|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200452": {
+ "content": "<|vision_reserved_special_token_366|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200453": {
+ "content": "<|vision_reserved_special_token_367|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200454": {
+ "content": "<|vision_reserved_special_token_368|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200455": {
+ "content": "<|vision_reserved_special_token_369|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200456": {
+ "content": "<|vision_reserved_special_token_370|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200457": {
+ "content": "<|vision_reserved_special_token_371|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200458": {
+ "content": "<|vision_reserved_special_token_372|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200459": {
+ "content": "<|vision_reserved_special_token_373|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200460": {
+ "content": "<|vision_reserved_special_token_374|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200461": {
+ "content": "<|vision_reserved_special_token_375|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200462": {
+ "content": "<|vision_reserved_special_token_376|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200463": {
+ "content": "<|vision_reserved_special_token_377|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200464": {
+ "content": "<|vision_reserved_special_token_378|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200465": {
+ "content": "<|vision_reserved_special_token_379|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200466": {
+ "content": "<|vision_reserved_special_token_380|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200467": {
+ "content": "<|vision_reserved_special_token_381|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200468": {
+ "content": "<|vision_reserved_special_token_382|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200469": {
+ "content": "<|vision_reserved_special_token_383|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200470": {
+ "content": "<|vision_reserved_special_token_384|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200471": {
+ "content": "<|vision_reserved_special_token_385|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200472": {
+ "content": "<|vision_reserved_special_token_386|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200473": {
+ "content": "<|vision_reserved_special_token_387|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200474": {
+ "content": "<|vision_reserved_special_token_388|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200475": {
+ "content": "<|vision_reserved_special_token_389|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200476": {
+ "content": "<|vision_reserved_special_token_390|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200477": {
+ "content": "<|vision_reserved_special_token_391|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200478": {
+ "content": "<|vision_reserved_special_token_392|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200479": {
+ "content": "<|vision_reserved_special_token_393|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200480": {
+ "content": "<|vision_reserved_special_token_394|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200481": {
+ "content": "<|vision_reserved_special_token_395|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200482": {
+ "content": "<|vision_reserved_special_token_396|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200483": {
+ "content": "<|vision_reserved_special_token_397|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200484": {
+ "content": "<|vision_reserved_special_token_398|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200485": {
+ "content": "<|vision_reserved_special_token_399|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200486": {
+ "content": "<|vision_reserved_special_token_400|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200487": {
+ "content": "<|vision_reserved_special_token_401|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200488": {
+ "content": "<|vision_reserved_special_token_402|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200489": {
+ "content": "<|vision_reserved_special_token_403|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200490": {
+ "content": "<|vision_reserved_special_token_404|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200491": {
+ "content": "<|vision_reserved_special_token_405|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200492": {
+ "content": "<|vision_reserved_special_token_406|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200493": {
+ "content": "<|vision_reserved_special_token_407|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200494": {
+ "content": "<|vision_reserved_special_token_408|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200495": {
+ "content": "<|vision_reserved_special_token_409|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200496": {
+ "content": "<|vision_reserved_special_token_410|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200497": {
+ "content": "<|vision_reserved_special_token_411|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200498": {
+ "content": "<|vision_reserved_special_token_412|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200499": {
+ "content": "<|vision_reserved_special_token_413|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200500": {
+ "content": "<|vision_reserved_special_token_414|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200501": {
+ "content": "<|vision_reserved_special_token_415|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200502": {
+ "content": "<|vision_reserved_special_token_416|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200503": {
+ "content": "<|vision_reserved_special_token_417|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200504": {
+ "content": "<|vision_reserved_special_token_418|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200505": {
+ "content": "<|vision_reserved_special_token_419|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200506": {
+ "content": "<|vision_reserved_special_token_420|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200507": {
+ "content": "<|vision_reserved_special_token_421|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200508": {
+ "content": "<|vision_reserved_special_token_422|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200509": {
+ "content": "<|vision_reserved_special_token_423|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200510": {
+ "content": "<|vision_reserved_special_token_424|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200511": {
+ "content": "<|vision_reserved_special_token_425|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200512": {
+ "content": "<|vision_reserved_special_token_426|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200513": {
+ "content": "<|vision_reserved_special_token_427|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200514": {
+ "content": "<|vision_reserved_special_token_428|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200515": {
+ "content": "<|vision_reserved_special_token_429|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200516": {
+ "content": "<|vision_reserved_special_token_430|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200517": {
+ "content": "<|vision_reserved_special_token_431|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200518": {
+ "content": "<|vision_reserved_special_token_432|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200519": {
+ "content": "<|vision_reserved_special_token_433|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200520": {
+ "content": "<|vision_reserved_special_token_434|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200521": {
+ "content": "<|vision_reserved_special_token_435|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200522": {
+ "content": "<|vision_reserved_special_token_436|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200523": {
+ "content": "<|vision_reserved_special_token_437|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200524": {
+ "content": "<|vision_reserved_special_token_438|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200525": {
+ "content": "<|vision_reserved_special_token_439|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200526": {
+ "content": "<|vision_reserved_special_token_440|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200527": {
+ "content": "<|vision_reserved_special_token_441|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200528": {
+ "content": "<|vision_reserved_special_token_442|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200529": {
+ "content": "<|vision_reserved_special_token_443|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200530": {
+ "content": "<|vision_reserved_special_token_444|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200531": {
+ "content": "<|vision_reserved_special_token_445|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200532": {
+ "content": "<|vision_reserved_special_token_446|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200533": {
+ "content": "<|vision_reserved_special_token_447|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200534": {
+ "content": "<|vision_reserved_special_token_448|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200535": {
+ "content": "<|vision_reserved_special_token_449|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200536": {
+ "content": "<|vision_reserved_special_token_450|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200537": {
+ "content": "<|vision_reserved_special_token_451|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200538": {
+ "content": "<|vision_reserved_special_token_452|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200539": {
+ "content": "<|vision_reserved_special_token_453|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200540": {
+ "content": "<|vision_reserved_special_token_454|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200541": {
+ "content": "<|vision_reserved_special_token_455|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200542": {
+ "content": "<|vision_reserved_special_token_456|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200543": {
+ "content": "<|vision_reserved_special_token_457|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200544": {
+ "content": "<|vision_reserved_special_token_458|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200545": {
+ "content": "<|vision_reserved_special_token_459|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200546": {
+ "content": "<|vision_reserved_special_token_460|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200547": {
+ "content": "<|vision_reserved_special_token_461|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200548": {
+ "content": "<|vision_reserved_special_token_462|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200549": {
+ "content": "<|vision_reserved_special_token_463|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200550": {
+ "content": "<|vision_reserved_special_token_464|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200551": {
+ "content": "<|vision_reserved_special_token_465|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200552": {
+ "content": "<|vision_reserved_special_token_466|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200553": {
+ "content": "<|vision_reserved_special_token_467|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200554": {
+ "content": "<|vision_reserved_special_token_468|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200555": {
+ "content": "<|vision_reserved_special_token_469|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200556": {
+ "content": "<|vision_reserved_special_token_470|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200557": {
+ "content": "<|vision_reserved_special_token_471|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200558": {
+ "content": "<|vision_reserved_special_token_472|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200559": {
+ "content": "<|vision_reserved_special_token_473|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200560": {
+ "content": "<|vision_reserved_special_token_474|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200561": {
+ "content": "<|vision_reserved_special_token_475|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200562": {
+ "content": "<|vision_reserved_special_token_476|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200563": {
+ "content": "<|vision_reserved_special_token_477|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200564": {
+ "content": "<|vision_reserved_special_token_478|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200565": {
+ "content": "<|vision_reserved_special_token_479|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200566": {
+ "content": "<|vision_reserved_special_token_480|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200567": {
+ "content": "<|vision_reserved_special_token_481|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200568": {
+ "content": "<|vision_reserved_special_token_482|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200569": {
+ "content": "<|vision_reserved_special_token_483|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200570": {
+ "content": "<|vision_reserved_special_token_484|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200571": {
+ "content": "<|vision_reserved_special_token_485|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200572": {
+ "content": "<|vision_reserved_special_token_486|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200573": {
+ "content": "<|vision_reserved_special_token_487|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200574": {
+ "content": "<|vision_reserved_special_token_488|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200575": {
+ "content": "<|vision_reserved_special_token_489|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200576": {
+ "content": "<|vision_reserved_special_token_490|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200577": {
+ "content": "<|vision_reserved_special_token_491|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200578": {
+ "content": "<|vision_reserved_special_token_492|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200579": {
+ "content": "<|vision_reserved_special_token_493|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200580": {
+ "content": "<|vision_reserved_special_token_494|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200581": {
+ "content": "<|vision_reserved_special_token_495|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200582": {
+ "content": "<|vision_reserved_special_token_496|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200583": {
+ "content": "<|vision_reserved_special_token_497|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200584": {
+ "content": "<|vision_reserved_special_token_498|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200585": {
+ "content": "<|vision_reserved_special_token_499|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200586": {
+ "content": "<|vision_reserved_special_token_500|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200587": {
+ "content": "<|vision_reserved_special_token_501|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200588": {
+ "content": "<|vision_reserved_special_token_502|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200589": {
+ "content": "<|vision_reserved_special_token_503|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200590": {
+ "content": "<|vision_reserved_special_token_504|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200591": {
+ "content": "<|vision_reserved_special_token_505|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200592": {
+ "content": "<|vision_reserved_special_token_506|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200593": {
+ "content": "<|vision_reserved_special_token_507|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200594": {
+ "content": "<|vision_reserved_special_token_508|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200595": {
+ "content": "<|vision_reserved_special_token_509|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200596": {
+ "content": "<|vision_reserved_special_token_510|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200597": {
+ "content": "<|vision_reserved_special_token_511|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200598": {
+ "content": "<|vision_reserved_special_token_512|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200599": {
+ "content": "<|vision_reserved_special_token_513|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200600": {
+ "content": "<|vision_reserved_special_token_514|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200601": {
+ "content": "<|vision_reserved_special_token_515|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200602": {
+ "content": "<|vision_reserved_special_token_516|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200603": {
+ "content": "<|vision_reserved_special_token_517|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200604": {
+ "content": "<|vision_reserved_special_token_518|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200605": {
+ "content": "<|vision_reserved_special_token_519|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200606": {
+ "content": "<|vision_reserved_special_token_520|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200607": {
+ "content": "<|vision_reserved_special_token_521|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200608": {
+ "content": "<|vision_reserved_special_token_522|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200609": {
+ "content": "<|vision_reserved_special_token_523|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200610": {
+ "content": "<|vision_reserved_special_token_524|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200611": {
+ "content": "<|vision_reserved_special_token_525|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200612": {
+ "content": "<|vision_reserved_special_token_526|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200613": {
+ "content": "<|vision_reserved_special_token_527|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200614": {
+ "content": "<|vision_reserved_special_token_528|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200615": {
+ "content": "<|vision_reserved_special_token_529|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200616": {
+ "content": "<|vision_reserved_special_token_530|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200617": {
+ "content": "<|vision_reserved_special_token_531|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200618": {
+ "content": "<|vision_reserved_special_token_532|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200619": {
+ "content": "<|vision_reserved_special_token_533|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200620": {
+ "content": "<|vision_reserved_special_token_534|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200621": {
+ "content": "<|vision_reserved_special_token_535|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200622": {
+ "content": "<|vision_reserved_special_token_536|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200623": {
+ "content": "<|vision_reserved_special_token_537|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200624": {
+ "content": "<|vision_reserved_special_token_538|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200625": {
+ "content": "<|vision_reserved_special_token_539|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200626": {
+ "content": "<|vision_reserved_special_token_540|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200627": {
+ "content": "<|vision_reserved_special_token_541|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200628": {
+ "content": "<|vision_reserved_special_token_542|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200629": {
+ "content": "<|vision_reserved_special_token_543|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200630": {
+ "content": "<|vision_reserved_special_token_544|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200631": {
+ "content": "<|vision_reserved_special_token_545|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200632": {
+ "content": "<|vision_reserved_special_token_546|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200633": {
+ "content": "<|vision_reserved_special_token_547|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200634": {
+ "content": "<|vision_reserved_special_token_548|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200635": {
+ "content": "<|vision_reserved_special_token_549|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200636": {
+ "content": "<|vision_reserved_special_token_550|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200637": {
+ "content": "<|vision_reserved_special_token_551|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200638": {
+ "content": "<|vision_reserved_special_token_552|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200639": {
+ "content": "<|vision_reserved_special_token_553|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200640": {
+ "content": "<|vision_reserved_special_token_554|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200641": {
+ "content": "<|vision_reserved_special_token_555|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200642": {
+ "content": "<|vision_reserved_special_token_556|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200643": {
+ "content": "<|vision_reserved_special_token_557|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200644": {
+ "content": "<|vision_reserved_special_token_558|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200645": {
+ "content": "<|vision_reserved_special_token_559|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200646": {
+ "content": "<|vision_reserved_special_token_560|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200647": {
+ "content": "<|vision_reserved_special_token_561|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200648": {
+ "content": "<|vision_reserved_special_token_562|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200649": {
+ "content": "<|vision_reserved_special_token_563|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200650": {
+ "content": "<|vision_reserved_special_token_564|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200651": {
+ "content": "<|vision_reserved_special_token_565|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200652": {
+ "content": "<|vision_reserved_special_token_566|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200653": {
+ "content": "<|vision_reserved_special_token_567|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200654": {
+ "content": "<|vision_reserved_special_token_568|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200655": {
+ "content": "<|vision_reserved_special_token_569|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200656": {
+ "content": "<|vision_reserved_special_token_570|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200657": {
+ "content": "<|vision_reserved_special_token_571|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200658": {
+ "content": "<|vision_reserved_special_token_572|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200659": {
+ "content": "<|vision_reserved_special_token_573|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200660": {
+ "content": "<|vision_reserved_special_token_574|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200661": {
+ "content": "<|vision_reserved_special_token_575|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200662": {
+ "content": "<|vision_reserved_special_token_576|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200663": {
+ "content": "<|vision_reserved_special_token_577|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200664": {
+ "content": "<|vision_reserved_special_token_578|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200665": {
+ "content": "<|vision_reserved_special_token_579|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200666": {
+ "content": "<|vision_reserved_special_token_580|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200667": {
+ "content": "<|vision_reserved_special_token_581|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200668": {
+ "content": "<|vision_reserved_special_token_582|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200669": {
+ "content": "<|vision_reserved_special_token_583|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200670": {
+ "content": "<|vision_reserved_special_token_584|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200671": {
+ "content": "<|vision_reserved_special_token_585|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200672": {
+ "content": "<|vision_reserved_special_token_586|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200673": {
+ "content": "<|vision_reserved_special_token_587|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200674": {
+ "content": "<|vision_reserved_special_token_588|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200675": {
+ "content": "<|vision_reserved_special_token_589|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200676": {
+ "content": "<|vision_reserved_special_token_590|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200677": {
+ "content": "<|vision_reserved_special_token_591|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200678": {
+ "content": "<|vision_reserved_special_token_592|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200679": {
+ "content": "<|vision_reserved_special_token_593|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200680": {
+ "content": "<|vision_reserved_special_token_594|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200681": {
+ "content": "<|vision_reserved_special_token_595|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200682": {
+ "content": "<|vision_reserved_special_token_596|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200683": {
+ "content": "<|vision_reserved_special_token_597|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200684": {
+ "content": "<|vision_reserved_special_token_598|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200685": {
+ "content": "<|vision_reserved_special_token_599|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200686": {
+ "content": "<|vision_reserved_special_token_600|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200687": {
+ "content": "<|vision_reserved_special_token_601|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200688": {
+ "content": "<|vision_reserved_special_token_602|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200689": {
+ "content": "<|vision_reserved_special_token_603|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200690": {
+ "content": "<|vision_reserved_special_token_604|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200691": {
+ "content": "<|vision_reserved_special_token_605|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200692": {
+ "content": "<|vision_reserved_special_token_606|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200693": {
+ "content": "<|vision_reserved_special_token_607|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200694": {
+ "content": "<|vision_reserved_special_token_608|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200695": {
+ "content": "<|vision_reserved_special_token_609|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200696": {
+ "content": "<|vision_reserved_special_token_610|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200697": {
+ "content": "<|vision_reserved_special_token_611|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200698": {
+ "content": "<|vision_reserved_special_token_612|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200699": {
+ "content": "<|vision_reserved_special_token_613|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200700": {
+ "content": "<|vision_reserved_special_token_614|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200701": {
+ "content": "<|vision_reserved_special_token_615|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200702": {
+ "content": "<|vision_reserved_special_token_616|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200703": {
+ "content": "<|vision_reserved_special_token_617|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200704": {
+ "content": "<|vision_reserved_special_token_618|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200705": {
+ "content": "<|vision_reserved_special_token_619|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200706": {
+ "content": "<|vision_reserved_special_token_620|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200707": {
+ "content": "<|vision_reserved_special_token_621|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200708": {
+ "content": "<|vision_reserved_special_token_622|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200709": {
+ "content": "<|vision_reserved_special_token_623|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200710": {
+ "content": "<|vision_reserved_special_token_624|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200711": {
+ "content": "<|vision_reserved_special_token_625|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200712": {
+ "content": "<|vision_reserved_special_token_626|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200713": {
+ "content": "<|vision_reserved_special_token_627|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200714": {
+ "content": "<|vision_reserved_special_token_628|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200715": {
+ "content": "<|vision_reserved_special_token_629|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200716": {
+ "content": "<|vision_reserved_special_token_630|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200717": {
+ "content": "<|vision_reserved_special_token_631|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200718": {
+ "content": "<|vision_reserved_special_token_632|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200719": {
+ "content": "<|vision_reserved_special_token_633|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200720": {
+ "content": "<|vision_reserved_special_token_634|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200721": {
+ "content": "<|vision_reserved_special_token_635|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200722": {
+ "content": "<|vision_reserved_special_token_636|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200723": {
+ "content": "<|vision_reserved_special_token_637|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200724": {
+ "content": "<|vision_reserved_special_token_638|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200725": {
+ "content": "<|vision_reserved_special_token_639|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200726": {
+ "content": "<|vision_reserved_special_token_640|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200727": {
+ "content": "<|vision_reserved_special_token_641|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200728": {
+ "content": "<|vision_reserved_special_token_642|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200729": {
+ "content": "<|vision_reserved_special_token_643|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200730": {
+ "content": "<|vision_reserved_special_token_644|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200731": {
+ "content": "<|vision_reserved_special_token_645|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200732": {
+ "content": "<|vision_reserved_special_token_646|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200733": {
+ "content": "<|vision_reserved_special_token_647|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200734": {
+ "content": "<|vision_reserved_special_token_648|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200735": {
+ "content": "<|vision_reserved_special_token_649|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200736": {
+ "content": "<|vision_reserved_special_token_650|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200737": {
+ "content": "<|vision_reserved_special_token_651|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200738": {
+ "content": "<|vision_reserved_special_token_652|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200739": {
+ "content": "<|vision_reserved_special_token_653|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200740": {
+ "content": "<|vision_reserved_special_token_654|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200741": {
+ "content": "<|vision_reserved_special_token_655|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200742": {
+ "content": "<|vision_reserved_special_token_656|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200743": {
+ "content": "<|vision_reserved_special_token_657|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200744": {
+ "content": "<|vision_reserved_special_token_658|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200745": {
+ "content": "<|vision_reserved_special_token_659|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200746": {
+ "content": "<|vision_reserved_special_token_660|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200747": {
+ "content": "<|vision_reserved_special_token_661|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200748": {
+ "content": "<|vision_reserved_special_token_662|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200749": {
+ "content": "<|vision_reserved_special_token_663|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200750": {
+ "content": "<|vision_reserved_special_token_664|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200751": {
+ "content": "<|vision_reserved_special_token_665|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200752": {
+ "content": "<|vision_reserved_special_token_666|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200753": {
+ "content": "<|vision_reserved_special_token_667|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200754": {
+ "content": "<|vision_reserved_special_token_668|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200755": {
+ "content": "<|vision_reserved_special_token_669|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200756": {
+ "content": "<|vision_reserved_special_token_670|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200757": {
+ "content": "<|vision_reserved_special_token_671|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200758": {
+ "content": "<|vision_reserved_special_token_672|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200759": {
+ "content": "<|vision_reserved_special_token_673|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200760": {
+ "content": "<|vision_reserved_special_token_674|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200761": {
+ "content": "<|vision_reserved_special_token_675|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200762": {
+ "content": "<|vision_reserved_special_token_676|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200763": {
+ "content": "<|vision_reserved_special_token_677|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200764": {
+ "content": "<|vision_reserved_special_token_678|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200765": {
+ "content": "<|vision_reserved_special_token_679|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200766": {
+ "content": "<|vision_reserved_special_token_680|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200767": {
+ "content": "<|vision_reserved_special_token_681|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200768": {
+ "content": "<|vision_reserved_special_token_682|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200769": {
+ "content": "<|vision_reserved_special_token_683|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200770": {
+ "content": "<|vision_reserved_special_token_684|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200771": {
+ "content": "<|vision_reserved_special_token_685|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200772": {
+ "content": "<|vision_reserved_special_token_686|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200773": {
+ "content": "<|vision_reserved_special_token_687|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200774": {
+ "content": "<|vision_reserved_special_token_688|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200775": {
+ "content": "<|vision_reserved_special_token_689|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200776": {
+ "content": "<|vision_reserved_special_token_690|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200777": {
+ "content": "<|vision_reserved_special_token_691|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200778": {
+ "content": "<|vision_reserved_special_token_692|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200779": {
+ "content": "<|vision_reserved_special_token_693|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200780": {
+ "content": "<|vision_reserved_special_token_694|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200781": {
+ "content": "<|vision_reserved_special_token_695|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200782": {
+ "content": "<|vision_reserved_special_token_696|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200783": {
+ "content": "<|vision_reserved_special_token_697|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200784": {
+ "content": "<|vision_reserved_special_token_698|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200785": {
+ "content": "<|vision_reserved_special_token_699|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200786": {
+ "content": "<|vision_reserved_special_token_700|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200787": {
+ "content": "<|vision_reserved_special_token_701|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200788": {
+ "content": "<|vision_reserved_special_token_702|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200789": {
+ "content": "<|vision_reserved_special_token_703|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200790": {
+ "content": "<|vision_reserved_special_token_704|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200791": {
+ "content": "<|vision_reserved_special_token_705|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200792": {
+ "content": "<|vision_reserved_special_token_706|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200793": {
+ "content": "<|vision_reserved_special_token_707|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200794": {
+ "content": "<|vision_reserved_special_token_708|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200795": {
+ "content": "<|vision_reserved_special_token_709|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200796": {
+ "content": "<|vision_reserved_special_token_710|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200797": {
+ "content": "<|vision_reserved_special_token_711|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200798": {
+ "content": "<|vision_reserved_special_token_712|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200799": {
+ "content": "<|vision_reserved_special_token_713|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200800": {
+ "content": "<|vision_reserved_special_token_714|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200801": {
+ "content": "<|vision_reserved_special_token_715|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200802": {
+ "content": "<|vision_reserved_special_token_716|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200803": {
+ "content": "<|vision_reserved_special_token_717|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200804": {
+ "content": "<|vision_reserved_special_token_718|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200805": {
+ "content": "<|vision_reserved_special_token_719|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200806": {
+ "content": "<|vision_reserved_special_token_720|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200807": {
+ "content": "<|vision_reserved_special_token_721|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200808": {
+ "content": "<|vision_reserved_special_token_722|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200809": {
+ "content": "<|vision_reserved_special_token_723|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200810": {
+ "content": "<|vision_reserved_special_token_724|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200811": {
+ "content": "<|vision_reserved_special_token_725|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200812": {
+ "content": "<|vision_reserved_special_token_726|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200813": {
+ "content": "<|vision_reserved_special_token_727|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200814": {
+ "content": "<|vision_reserved_special_token_728|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200815": {
+ "content": "<|vision_reserved_special_token_729|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200816": {
+ "content": "<|vision_reserved_special_token_730|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200817": {
+ "content": "<|vision_reserved_special_token_731|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200818": {
+ "content": "<|vision_reserved_special_token_732|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200819": {
+ "content": "<|vision_reserved_special_token_733|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200820": {
+ "content": "<|vision_reserved_special_token_734|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200821": {
+ "content": "<|vision_reserved_special_token_735|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200822": {
+ "content": "<|vision_reserved_special_token_736|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200823": {
+ "content": "<|vision_reserved_special_token_737|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200824": {
+ "content": "<|vision_reserved_special_token_738|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200825": {
+ "content": "<|vision_reserved_special_token_739|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200826": {
+ "content": "<|vision_reserved_special_token_740|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200827": {
+ "content": "<|vision_reserved_special_token_741|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200828": {
+ "content": "<|vision_reserved_special_token_742|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200829": {
+ "content": "<|vision_reserved_special_token_743|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200830": {
+ "content": "<|vision_reserved_special_token_744|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200831": {
+ "content": "<|vision_reserved_special_token_745|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200832": {
+ "content": "<|vision_reserved_special_token_746|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200833": {
+ "content": "<|vision_reserved_special_token_747|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200834": {
+ "content": "<|vision_reserved_special_token_748|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200835": {
+ "content": "<|vision_reserved_special_token_749|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200836": {
+ "content": "<|vision_reserved_special_token_750|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200837": {
+ "content": "<|vision_reserved_special_token_751|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200838": {
+ "content": "<|vision_reserved_special_token_752|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200839": {
+ "content": "<|vision_reserved_special_token_753|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200840": {
+ "content": "<|vision_reserved_special_token_754|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200841": {
+ "content": "<|vision_reserved_special_token_755|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200842": {
+ "content": "<|vision_reserved_special_token_756|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200843": {
+ "content": "<|vision_reserved_special_token_757|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200844": {
+ "content": "<|vision_reserved_special_token_758|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200845": {
+ "content": "<|vision_reserved_special_token_759|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200846": {
+ "content": "<|vision_reserved_special_token_760|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200847": {
+ "content": "<|vision_reserved_special_token_761|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200848": {
+ "content": "<|vision_reserved_special_token_762|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200849": {
+ "content": "<|vision_reserved_special_token_763|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200850": {
+ "content": "<|vision_reserved_special_token_764|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200851": {
+ "content": "<|vision_reserved_special_token_765|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200852": {
+ "content": "<|vision_reserved_special_token_766|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200853": {
+ "content": "<|vision_reserved_special_token_767|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200854": {
+ "content": "<|vision_reserved_special_token_768|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200855": {
+ "content": "<|vision_reserved_special_token_769|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200856": {
+ "content": "<|vision_reserved_special_token_770|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200857": {
+ "content": "<|vision_reserved_special_token_771|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200858": {
+ "content": "<|vision_reserved_special_token_772|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200859": {
+ "content": "<|vision_reserved_special_token_773|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200860": {
+ "content": "<|vision_reserved_special_token_774|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200861": {
+ "content": "<|vision_reserved_special_token_775|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200862": {
+ "content": "<|vision_reserved_special_token_776|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200863": {
+ "content": "<|vision_reserved_special_token_777|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200864": {
+ "content": "<|vision_reserved_special_token_778|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200865": {
+ "content": "<|vision_reserved_special_token_779|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200866": {
+ "content": "<|vision_reserved_special_token_780|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200867": {
+ "content": "<|vision_reserved_special_token_781|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200868": {
+ "content": "<|vision_reserved_special_token_782|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200869": {
+ "content": "<|vision_reserved_special_token_783|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200870": {
+ "content": "<|vision_reserved_special_token_784|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200871": {
+ "content": "<|vision_reserved_special_token_785|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200872": {
+ "content": "<|vision_reserved_special_token_786|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200873": {
+ "content": "<|vision_reserved_special_token_787|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200874": {
+ "content": "<|vision_reserved_special_token_788|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200875": {
+ "content": "<|vision_reserved_special_token_789|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200876": {
+ "content": "<|vision_reserved_special_token_790|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200877": {
+ "content": "<|vision_reserved_special_token_791|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200878": {
+ "content": "<|vision_reserved_special_token_792|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200879": {
+ "content": "<|vision_reserved_special_token_793|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200880": {
+ "content": "<|vision_reserved_special_token_794|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200881": {
+ "content": "<|vision_reserved_special_token_795|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200882": {
+ "content": "<|vision_reserved_special_token_796|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200883": {
+ "content": "<|vision_reserved_special_token_797|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200884": {
+ "content": "<|vision_reserved_special_token_798|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200885": {
+ "content": "<|vision_reserved_special_token_799|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200886": {
+ "content": "<|vision_reserved_special_token_800|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200887": {
+ "content": "<|vision_reserved_special_token_801|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200888": {
+ "content": "<|vision_reserved_special_token_802|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200889": {
+ "content": "<|vision_reserved_special_token_803|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200890": {
+ "content": "<|vision_reserved_special_token_804|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200891": {
+ "content": "<|vision_reserved_special_token_805|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200892": {
+ "content": "<|vision_reserved_special_token_806|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200893": {
+ "content": "<|vision_reserved_special_token_807|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200894": {
+ "content": "<|vision_reserved_special_token_808|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200895": {
+ "content": "<|vision_reserved_special_token_809|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200896": {
+ "content": "<|vision_reserved_special_token_810|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200897": {
+ "content": "<|vision_reserved_special_token_811|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200898": {
+ "content": "<|vision_reserved_special_token_812|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200899": {
+ "content": "<|vision_reserved_special_token_813|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200900": {
+ "content": "<|vision_reserved_special_token_814|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200901": {
+ "content": "<|vision_reserved_special_token_815|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200902": {
+ "content": "<|vision_reserved_special_token_816|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200903": {
+ "content": "<|vision_reserved_special_token_817|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200904": {
+ "content": "<|vision_reserved_special_token_818|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200905": {
+ "content": "<|vision_reserved_special_token_819|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200906": {
+ "content": "<|vision_reserved_special_token_820|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200907": {
+ "content": "<|vision_reserved_special_token_821|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200908": {
+ "content": "<|vision_reserved_special_token_822|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200909": {
+ "content": "<|vision_reserved_special_token_823|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200910": {
+ "content": "<|vision_reserved_special_token_824|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200911": {
+ "content": "<|vision_reserved_special_token_825|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200912": {
+ "content": "<|vision_reserved_special_token_826|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200913": {
+ "content": "<|vision_reserved_special_token_827|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200914": {
+ "content": "<|vision_reserved_special_token_828|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200915": {
+ "content": "<|vision_reserved_special_token_829|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200916": {
+ "content": "<|vision_reserved_special_token_830|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200917": {
+ "content": "<|vision_reserved_special_token_831|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200918": {
+ "content": "<|vision_reserved_special_token_832|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200919": {
+ "content": "<|vision_reserved_special_token_833|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200920": {
+ "content": "<|vision_reserved_special_token_834|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200921": {
+ "content": "<|vision_reserved_special_token_835|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200922": {
+ "content": "<|vision_reserved_special_token_836|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200923": {
+ "content": "<|vision_reserved_special_token_837|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200924": {
+ "content": "<|vision_reserved_special_token_838|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200925": {
+ "content": "<|vision_reserved_special_token_839|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200926": {
+ "content": "<|vision_reserved_special_token_840|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200927": {
+ "content": "<|vision_reserved_special_token_841|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200928": {
+ "content": "<|vision_reserved_special_token_842|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200929": {
+ "content": "<|vision_reserved_special_token_843|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200930": {
+ "content": "<|vision_reserved_special_token_844|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200931": {
+ "content": "<|vision_reserved_special_token_845|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200932": {
+ "content": "<|vision_reserved_special_token_846|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200933": {
+ "content": "<|vision_reserved_special_token_847|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200934": {
+ "content": "<|vision_reserved_special_token_848|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200935": {
+ "content": "<|vision_reserved_special_token_849|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200936": {
+ "content": "<|vision_reserved_special_token_850|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200937": {
+ "content": "<|vision_reserved_special_token_851|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200938": {
+ "content": "<|vision_reserved_special_token_852|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200939": {
+ "content": "<|vision_reserved_special_token_853|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200940": {
+ "content": "<|vision_reserved_special_token_854|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200941": {
+ "content": "<|vision_reserved_special_token_855|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200942": {
+ "content": "<|vision_reserved_special_token_856|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200943": {
+ "content": "<|vision_reserved_special_token_857|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200944": {
+ "content": "<|vision_reserved_special_token_858|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200945": {
+ "content": "<|vision_reserved_special_token_859|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200946": {
+ "content": "<|vision_reserved_special_token_860|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200947": {
+ "content": "<|vision_reserved_special_token_861|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200948": {
+ "content": "<|vision_reserved_special_token_862|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200949": {
+ "content": "<|vision_reserved_special_token_863|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200950": {
+ "content": "<|vision_reserved_special_token_864|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200951": {
+ "content": "<|vision_reserved_special_token_865|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200952": {
+ "content": "<|vision_reserved_special_token_866|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200953": {
+ "content": "<|vision_reserved_special_token_867|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200954": {
+ "content": "<|vision_reserved_special_token_868|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200955": {
+ "content": "<|vision_reserved_special_token_869|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200956": {
+ "content": "<|vision_reserved_special_token_870|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200957": {
+ "content": "<|vision_reserved_special_token_871|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200958": {
+ "content": "<|vision_reserved_special_token_872|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200959": {
+ "content": "<|vision_reserved_special_token_873|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200960": {
+ "content": "<|vision_reserved_special_token_874|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200961": {
+ "content": "<|vision_reserved_special_token_875|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200962": {
+ "content": "<|vision_reserved_special_token_876|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200963": {
+ "content": "<|vision_reserved_special_token_877|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200964": {
+ "content": "<|vision_reserved_special_token_878|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200965": {
+ "content": "<|vision_reserved_special_token_879|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200966": {
+ "content": "<|vision_reserved_special_token_880|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200967": {
+ "content": "<|vision_reserved_special_token_881|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200968": {
+ "content": "<|vision_reserved_special_token_882|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200969": {
+ "content": "<|vision_reserved_special_token_883|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200970": {
+ "content": "<|vision_reserved_special_token_884|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200971": {
+ "content": "<|vision_reserved_special_token_885|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200972": {
+ "content": "<|vision_reserved_special_token_886|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200973": {
+ "content": "<|vision_reserved_special_token_887|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200974": {
+ "content": "<|vision_reserved_special_token_888|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200975": {
+ "content": "<|vision_reserved_special_token_889|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200976": {
+ "content": "<|vision_reserved_special_token_890|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200977": {
+ "content": "<|vision_reserved_special_token_891|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200978": {
+ "content": "<|vision_reserved_special_token_892|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200979": {
+ "content": "<|vision_reserved_special_token_893|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200980": {
+ "content": "<|vision_reserved_special_token_894|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200981": {
+ "content": "<|vision_reserved_special_token_895|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200982": {
+ "content": "<|vision_reserved_special_token_896|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200983": {
+ "content": "<|vision_reserved_special_token_897|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200984": {
+ "content": "<|vision_reserved_special_token_898|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200985": {
+ "content": "<|vision_reserved_special_token_899|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200986": {
+ "content": "<|vision_reserved_special_token_900|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200987": {
+ "content": "<|vision_reserved_special_token_901|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200988": {
+ "content": "<|vision_reserved_special_token_902|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200989": {
+ "content": "<|vision_reserved_special_token_903|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200990": {
+ "content": "<|vision_reserved_special_token_904|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200991": {
+ "content": "<|vision_reserved_special_token_905|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200992": {
+ "content": "<|vision_reserved_special_token_906|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200993": {
+ "content": "<|vision_reserved_special_token_907|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200994": {
+ "content": "<|vision_reserved_special_token_908|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200995": {
+ "content": "<|vision_reserved_special_token_909|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200996": {
+ "content": "<|vision_reserved_special_token_910|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200997": {
+ "content": "<|vision_reserved_special_token_911|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200998": {
+ "content": "<|vision_reserved_special_token_912|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200999": {
+ "content": "<|vision_reserved_special_token_913|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201000": {
+ "content": "<|vision_reserved_special_token_914|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201001": {
+ "content": "<|vision_reserved_special_token_915|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201002": {
+ "content": "<|vision_reserved_special_token_916|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201003": {
+ "content": "<|vision_reserved_special_token_917|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201004": {
+ "content": "<|vision_reserved_special_token_918|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201005": {
+ "content": "<|vision_reserved_special_token_919|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201006": {
+ "content": "<|vision_reserved_special_token_920|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201007": {
+ "content": "<|vision_reserved_special_token_921|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201008": {
+ "content": "<|vision_reserved_special_token_922|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201009": {
+ "content": "<|vision_reserved_special_token_923|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201010": {
+ "content": "<|vision_reserved_special_token_924|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201011": {
+ "content": "<|vision_reserved_special_token_925|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201012": {
+ "content": "<|vision_reserved_special_token_926|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201013": {
+ "content": "<|vision_reserved_special_token_927|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201014": {
+ "content": "<|vision_reserved_special_token_928|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201015": {
+ "content": "<|vision_reserved_special_token_929|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201016": {
+ "content": "<|vision_reserved_special_token_930|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201017": {
+ "content": "<|vision_reserved_special_token_931|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201018": {
+ "content": "<|vision_reserved_special_token_932|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201019": {
+ "content": "<|vision_reserved_special_token_933|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201020": {
+ "content": "<|vision_reserved_special_token_934|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201021": {
+ "content": "<|vision_reserved_special_token_935|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201022": {
+ "content": "<|vision_reserved_special_token_936|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201023": {
+ "content": "<|vision_reserved_special_token_937|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201024": {
+ "content": "<|vision_reserved_special_token_938|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201025": {
+ "content": "<|vision_reserved_special_token_939|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201026": {
+ "content": "<|vision_reserved_special_token_940|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201027": {
+ "content": "<|vision_reserved_special_token_941|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201028": {
+ "content": "<|vision_reserved_special_token_942|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201029": {
+ "content": "<|vision_reserved_special_token_943|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201030": {
+ "content": "<|vision_reserved_special_token_944|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201031": {
+ "content": "<|vision_reserved_special_token_945|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201032": {
+ "content": "<|vision_reserved_special_token_946|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201033": {
+ "content": "<|vision_reserved_special_token_947|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201034": {
+ "content": "<|vision_reserved_special_token_948|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201035": {
+ "content": "<|vision_reserved_special_token_949|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201036": {
+ "content": "<|vision_reserved_special_token_950|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201037": {
+ "content": "<|vision_reserved_special_token_951|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201038": {
+ "content": "<|vision_reserved_special_token_952|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201039": {
+ "content": "<|vision_reserved_special_token_953|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201040": {
+ "content": "<|vision_reserved_special_token_954|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201041": {
+ "content": "<|vision_reserved_special_token_955|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201042": {
+ "content": "<|vision_reserved_special_token_956|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201043": {
+ "content": "<|vision_reserved_special_token_957|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201044": {
+ "content": "<|vision_reserved_special_token_958|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201045": {
+ "content": "<|vision_reserved_special_token_959|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201046": {
+ "content": "<|vision_reserved_special_token_960|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201047": {
+ "content": "<|vision_reserved_special_token_961|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201048": {
+ "content": "<|vision_reserved_special_token_962|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201049": {
+ "content": "<|vision_reserved_special_token_963|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201050": {
+ "content": "<|vision_reserved_special_token_964|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201051": {
+ "content": "<|vision_reserved_special_token_965|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201052": {
+ "content": "<|vision_reserved_special_token_966|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201053": {
+ "content": "<|vision_reserved_special_token_967|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201054": {
+ "content": "<|vision_reserved_special_token_968|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201055": {
+ "content": "<|vision_reserved_special_token_969|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201056": {
+ "content": "<|vision_reserved_special_token_970|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201057": {
+ "content": "<|vision_reserved_special_token_971|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201058": {
+ "content": "<|vision_reserved_special_token_972|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201059": {
+ "content": "<|vision_reserved_special_token_973|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201060": {
+ "content": "<|vision_reserved_special_token_974|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201061": {
+ "content": "<|vision_reserved_special_token_975|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201062": {
+ "content": "<|vision_reserved_special_token_976|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201063": {
+ "content": "<|vision_reserved_special_token_977|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201064": {
+ "content": "<|vision_reserved_special_token_978|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201065": {
+ "content": "<|vision_reserved_special_token_979|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201066": {
+ "content": "<|vision_reserved_special_token_980|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201067": {
+ "content": "<|vision_reserved_special_token_981|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201068": {
+ "content": "<|vision_reserved_special_token_982|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201069": {
+ "content": "<|vision_reserved_special_token_983|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201070": {
+ "content": "<|vision_reserved_special_token_984|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201071": {
+ "content": "<|vision_reserved_special_token_985|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201072": {
+ "content": "<|vision_reserved_special_token_986|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201073": {
+ "content": "<|vision_reserved_special_token_987|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201074": {
+ "content": "<|vision_reserved_special_token_988|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201075": {
+ "content": "<|vision_reserved_special_token_989|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201076": {
+ "content": "<|vision_reserved_special_token_990|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201077": {
+ "content": "<|vision_reserved_special_token_991|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201078": {
+ "content": "<|vision_reserved_special_token_992|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201079": {
+ "content": "<|vision_reserved_special_token_993|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201080": {
+ "content": "<|vision_reserved_special_token_994|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201081": {
+ "content": "<|vision_reserved_special_token_995|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201082": {
+ "content": "<|vision_reserved_special_token_996|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201083": {
+ "content": "<|vision_reserved_special_token_997|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201084": {
+ "content": "<|vision_reserved_special_token_998|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201085": {
+ "content": "<|vision_reserved_special_token_999|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201086": {
+ "content": "<|vision_reserved_special_token_1000|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201087": {
+ "content": "<|vision_reserved_special_token_1001|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201088": {
+ "content": "<|vision_reserved_special_token_1002|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201089": {
+ "content": "<|vision_reserved_special_token_1003|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201090": {
+ "content": "<|vision_reserved_special_token_1004|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201091": {
+ "content": "<|vision_reserved_special_token_1005|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201092": {
+ "content": "<|vision_reserved_special_token_1006|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201093": {
+ "content": "<|vision_reserved_special_token_1007|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201094": {
+ "content": "<|vision_reserved_special_token_1008|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201095": {
+ "content": "<|vision_reserved_special_token_1009|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201096": {
+ "content": "<|vision_reserved_special_token_1010|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201097": {
+ "content": "<|vision_reserved_special_token_1011|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201098": {
+ "content": "<|vision_reserved_special_token_1012|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201099": {
+ "content": "<|vision_reserved_special_token_1013|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201100": {
+ "content": "<|vision_reserved_special_token_1014|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201101": {
+ "content": "<|vision_reserved_special_token_1015|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201102": {
+ "content": "<|vision_reserved_special_token_1016|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201103": {
+ "content": "<|vision_reserved_special_token_1017|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201104": {
+ "content": "<|vision_reserved_special_token_1018|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201105": {
+ "content": "<|vision_reserved_special_token_1019|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201106": {
+ "content": "<|vision_reserved_special_token_1020|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201107": {
+ "content": "<|vision_reserved_special_token_1021|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201108": {
+ "content": "<|vision_reserved_special_token_1022|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201109": {
+ "content": "<|vision_reserved_special_token_1023|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201110": {
+ "content": "<|vision_reserved_special_token_1024|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201111": {
+ "content": "<|vision_reserved_special_token_1025|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201112": {
+ "content": "<|vision_reserved_special_token_1026|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201113": {
+ "content": "<|vision_reserved_special_token_1027|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201114": {
+ "content": "<|vision_reserved_special_token_1028|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201115": {
+ "content": "<|vision_reserved_special_token_1029|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201116": {
+ "content": "<|vision_reserved_special_token_1030|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201117": {
+ "content": "<|vision_reserved_special_token_1031|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201118": {
+ "content": "<|vision_reserved_special_token_1032|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201119": {
+ "content": "<|vision_reserved_special_token_1033|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201120": {
+ "content": "<|vision_reserved_special_token_1034|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201121": {
+ "content": "<|vision_reserved_special_token_1035|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201122": {
+ "content": "<|vision_reserved_special_token_1036|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201123": {
+ "content": "<|vision_reserved_special_token_1037|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201124": {
+ "content": "<|vision_reserved_special_token_1038|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201125": {
+ "content": "<|vision_reserved_special_token_1039|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201126": {
+ "content": "<|vision_reserved_special_token_1040|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201127": {
+ "content": "<|vision_reserved_special_token_1041|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201128": {
+ "content": "<|vision_reserved_special_token_1042|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201129": {
+ "content": "<|vision_reserved_special_token_1043|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201130": {
+ "content": "<|vision_reserved_special_token_1044|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201131": {
+ "content": "<|vision_reserved_special_token_1045|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201132": {
+ "content": "<|vision_reserved_special_token_1046|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201133": {
+ "content": "<|vision_reserved_special_token_1047|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "201134": {
+ "content": "<|finetune_right_pad_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|begin_of_text|>",
+ "chat_template": "{{- bos_token }}\n{%- if custom_tools is defined %}\n {%- set tools = custom_tools %}\n{%- endif %}\n{%- if not tools_in_user_message is defined %}\n {%- set tools_in_user_message = true %}\n{%- endif %}\n{%- if not date_string is defined %}\n {%- if strftime_now is defined %}\n {%- set date_string = strftime_now(\"%d %b %Y\") %}\n {%- else %}\n {%- set date_string = \"26 Jul 2024\" %}\n {%- endif %}\n{%- endif %}\n{%- if not tools is defined %}\n {%- set tools = none %}\n{%- endif %}\n\n{#- This block extracts the system message, so we can slot it into the right place. #}\n{%- if messages[0]['role'] == 'system' %} \n {%- if messages[0]['content'] is string %}\n {%- set system_message = messages[0]['content']|trim %}\n {%- else %}\n {#- FIXME: The processor requires an array, always. #}\n {%- set system_message = messages[0]['content'][0]['text']|trim %}\n {%- endif %}\n {%- set messages = messages[1:] %}\n {%- set user_supplied_system_message = true %}\n{%- else %}\n {%- set system_message = \"\" %}\n {%- set user_supplied_system_message = false %}\n{%- endif %}\n\n{#- System message if the user supplied one #}\n{%- if user_supplied_system_message %}\n {{- \"<|header_start|>system<|header_end|>\\n\\n\" }}\n {%- if tools is not none %}\n {{- \"Environment: ipython\\n\" }}\n {%- endif %}\n {%- if tools is not none and not tools_in_user_message %}\n {{- \"You have access to the following functions. To call a function, please respond with JSON for a function call.\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n {%- endif %}\n {{- system_message }}\n {{- \"<|eot|>\" }}\n{%- endif %}\n\n{#- Custom tools are passed in a user message with some extra guidance #}\n{%- if tools_in_user_message and not tools is none %}\n {#- Extract the first user message so we can plug it in here #}\n {%- if messages | length != 0 %}\n {%- set first_user_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n {%- else %}\n {{- raise_exception(\"Cannot put tools in the first user message when there's no first user message!\") }}\n{%- endif %}\n {{- '<|header_start|>user<|header_end|>\\n\\n' -}}\n {{- \"Given the following functions, please respond with a JSON for a function call \" }}\n {{- \"with its proper arguments that best answers the given prompt.\\n\\n\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' 
}}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n {{- first_user_message + \"<|eot|>\"}}\n{%- endif %}\n\n{%- for message in messages %}\n {%- if not (message.role == 'ipython' or message.role == 'tool' or 'tool_calls' in message) %}\n {{- '<|header_start|>' + message['role'] + '<|header_end|>\\n\\n' }}\n {%- if message['content'] is string %}\n {{- message['content'] }}\n {%- else %}\n {%- for content in message['content'] %}\n {%- if content['type'] == 'image' %}\n {{- '<|image|>' }}\n {%- elif content['type'] == 'text' %}\n {{- content['text'] }}\n {%- endif %}\n {%- endfor %}\n {%- endif %}\n {{- \"<|eot|>\" }}\n {%- elif 'tool_calls' in message and message.tool_calls|length > 0 %}\n {{- '<|header_start|>assistant<|header_end|>\\n\\n' -}}\n {{- '<|python_start|>' }}\n {%- if message['content'] is string %}\n {{- message['content'] }}\n {%- else %}\n {%- for content in message['content'] %}\n {%- if content['type'] == 'image' %}\n {{- '<|image|>' }}\n {%- elif content['type'] == 'text' %}\n {{- content['text'] }}\n {%- endif %}\n {%- endfor %}\n {%- endif %}\n {{- '<|python_end|>' }}\n {%- for tool_call in message.tool_calls %}\n {{- '{\"name\": \"' + tool_call.function.name + '\", ' }}\n {{- '\"parameters\": ' }}\n {{- tool_call.function.arguments | tojson }}\n {{- \"}\" }}\n {%- endfor %}\n {{- \"<|eot|>\" }}\n {%- elif message.role == \"tool\" or message.role == \"ipython\" %}\n {{- \"<|header_start|>ipython<|header_end|>\\n\\n\" }}\n {%- if message.content is mapping or message.content is iterable %}\n {{- message.content | tojson }}\n {%- else %}\n {{- message.content }}\n {%- endif %}\n {{- \"<|eot|>\" }}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|header_start|>assistant<|header_end|>\\n\\n' }}\n{%- endif %}\n",
+ "clean_up_tokenization_spaces": false,
+ "eos_token": "<|eot|>",
+ "extra_special_tokens": {},
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 10485760,
+ "pad_token": "<|finetune_right_pad_id|>",
+ "processor_class": "Llama4Processor",
+ "tokenizer_class": "PreTrainedTokenizer"
+}