Dataset schema (per-column type and value summary):

| Column | Type | Summary |
|:--|:--|:--|
| `id` | string | lengths 6 to 113 |
| `author` | string | lengths 2 to 36 |
| `task_category` | string | 42 classes |
| `tags` | list | lengths 1 to 4.05k |
| `created_time` | timestamp[ns, tz=UTC] | 2022-03-02 23:29:04 to 2025-04-10 08:38:38 |
| `last_modified` | string (date) | 2020-05-14 13:13:12 to 2025-04-19 04:15:39 |
| `downloads` | int64 | 0 to 118M |
| `likes` | int64 | 0 to 4.86k |
| `README` | string | lengths 30 to 1.01M |
| `matched_bigbio_names` | list | lengths 1 to 8 |
| `is_bionlp` | string | 3 classes |
| `model_cards` | string | lengths 0 to 1M |
| `metadata` | string | lengths 2 to 698k |
| `source` | string | 2 classes |
| `matched_task` | list | lengths 1 to 10 |
| `__index_level_0__` | int64 | 0 to 46.9k |
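The per-column summaries above (string-length ranges, class counts, integer ranges) are the kind of statistics a dataset viewer derives by scanning the rows. A minimal sketch of that derivation in plain Python, using hypothetical toy records in place of real dataset rows:

```python
# Hypothetical toy rows standing in for real dataset records.
rows = [
    {"id": "org/model-a", "task_category": "sentence-similarity", "likes": 0},
    {"id": "org/another-model-b", "task_category": "fill-mask", "likes": 7},
]

# String column: min/max length (cf. "stringlengths" above).
id_lengths = sorted(len(r["id"]) for r in rows)
print(id_lengths[0], id_lengths[-1])

# Categorical column: number of distinct classes (cf. "stringclasses").
print(len({r["task_category"] for r in rows}), "classes")

# Integer column: value range (cf. "int64" min/max).
print(min(r["likes"] for r in rows), max(r["likes"] for r in rows))
```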
Sample row:

- `id`: RamsesDIIP/me5_triplet_finetuned
- `author`: RamsesDIIP
- `task_category`: sentence-similarity
- `tags`: [ "sentence-transformers", "safetensors", "xlm-roberta", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:46", "loss:TripletLoss", "arxiv:1908.10084", "arxiv:1703.07737", "base_model:intfloat/multilingual-e5-large", "base_model:finetune:intfloat/multilingual-e5-large", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
- `created_time`: 2024-10-16T09:45:24Z
- `last_modified`: 2024-10-16T09:47:23+00:00
- `downloads`: 4
- `likes`: 0
--- base_model: intfloat/multilingual-e5-large library_name: sentence-transformers pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:46 - loss:TripletLoss widget: - source_sentence: Base de zahorras artificial, con extendido y compactado del material al 100% del PM, en entorno urbano con dificultad de mobilidad, en aceras > 5 m de ancho o calzada/plataforma única > 12 m de ancho, sin afectación por servicios o elementos de mobiliario urbano, en actuaciones de 0.2 a 2 m3 sentences: - Vertido de hormigón para muros de contención de hasta 3 m de altura, utilizando hormigón armado con aditivo impermeabilizante HA - 40 / B / 10 / XC2, con una dosificación de 375 kg de cemento por metro cúbico y una relación agua-cemento menor o igual a 0.45, realizado con cubilote. - Capa de grava artificial, con distribución y compactación del material al 100% del Proctor Modificado, en área urbana con limitaciones de acceso, en aceras mayores a 5 m de ancho o calzadas/plataformas únicas superiores a 12 m de ancho, sin interferencias de servicios o mobiliario urbano, en trabajos de 0.2 a 2 m3. - Base de hormigón armado, con vertido y fraguado del material al 80% del volumen, en área rural con fácil acceso, en caminos de tierra > 3 m de ancho o senderos > 8 m de ancho, con interferencia por instalaciones eléctricas o elementos de paisajismo, en proyectos de 1 a 5 m3. 
- source_sentence: Semivigueta y bovedilla para forjado de 16+4 cm, hasta 3 m de altura, como máximo, con bovedilla de mortero de cemento y semiviguetas de hormigón armado de 15 y 16 cm de altura, Indeterminadointerejes 0,6 m, luz entre 2,5 y 5 m, de momento flector último 18,0 a 26,3 kN·m por m de ancho de techo sentences: - Semivigueta y bovedilla para losas de 16+4 cm, con una altura máxima de 3 m, utilizando bovedillas de mortero de cemento y semiviguetas de hormigón armado de 15 y 16 cm de altura, con un intereje indeterminado de 0,6 m y luces que varían entre 2,5 y 5 m, soportando un momento flector último de 18,0 a 26,3 kN·m por metro de ancho de la losa. - Radiador de acero de 8 secciones con 2 columnas, altura máxima de 650 mm, diseñado para agua caliente a 6 bar y 110 °C, con soporte para instalación empotrada, válvulas termostáticas para sistema monotubular y purgador manual. - Vigueta y ladrillo para pared de 20+5 cm, hasta 4 m de altura, como mínimo, con ladrillo de arcilla y viguetas de acero de 20 y 22 cm de altura, interejes indeterminados de 0,8 m, luz entre 3,0 y 6 m, de momento flector último 15,0 a 20,0 kN·m por m de ancho de pared. - source_sentence: Pared para pozo rectangular de 70x30 cm, de 29 cm de espesor de ladrillo perforado, enfoscada y enlucida por dentro y enfoscado previo por fuera con mortero cemento 1:4 sentences: - Muro para un pozo rectangular de 70x30 cm, con un grosor de 29 cm de ladrillo hueco, revestido y acabado interiormente, y con un enfoscado exterior previo utilizando mortero de cemento en proporción 1:4. - Marco de ventana de pino flandes, con dimensiones de 70x35 mm2, diseñado para un espacio de obra de aproximadamente 150x290 cm. - Pared para piscina circular de 70x30 cm, de 29 cm de espesor de ladrillo macizo, revestida y pintada por dentro y con impermeabilización previa por fuera con mortero de cal 1:4. 
- source_sentence: Conjunto de cuatro captadores solares planos de plancha de cobre con vidrio templado, envolvente de aluminio anodizado y aislamiento de espuma de poliuretano con una superficie activa de 2.25 a 2.55 m2, un rendimiento máximo de 90 % y un coeficiente de pérdidas <= 5 W/(m2·°C), colocado con soporte vertical sentences: - Conjunto de cuatro paneles fotovoltaicos de silicio monocristalino con marco de acero inoxidable y sistema de refrigeración líquida, con una superficie activa de 2.25 a 2.55 m2, un rendimiento máximo de 95 % y un coeficiente de pérdidas <= 3 W/(m2·°C), instalado en una estructura inclinada. - Interruptor doble modular de 2 módulos estrechos, unipolar (1P), 10 AX/250 V, con tecla, de alta gama, empotrado, con marco adaptador para mecanismos modulares en caja universal de 1 elemento de alta gama, tubo flexible corrugado de PVC recubierto, caja de derivación rectangular y conductor de cobre tipo H07V-U. - Sistema de cuatro colectores solares de lámina de cobre con vidrio resistente, estructura de aluminio anodizado y aislamiento de poliuretano, con un área efectiva de 2.25 a 2.55 m2, eficiencia máxima del 90 % y un coeficiente de pérdidas menor o igual a 5 W/(m2·°C), instalado en posición vertical. 
- source_sentence: Instalaciones de energía solar fotovoltaica aislada de 1800 W de potencia con 1.33333 unidades de conjunto de 6 módulos fotovoltáicos de tipo policristalino para instalación aislada/conexión a red, de 230 Wp de potencia de pico cada uno, con una eficiencia mínima 14,1%, con marco de aluminio anodizado, protección con vidrio templado, caja de conexión, precableado con conectores especiales, con estructura de soporte para 6 módulos fotovoltaicos en posición vertical, de perfiles de aluminio extruido, con inclinación de 30 o 40º, para colocar sobre suelo o cubierta plana, montados y conectados, con equipo multifunción para instalación fotovoltaica con funciones de inversor, cargador y regulador, de 1500 VA de potencia, monofásico de 230 V de onda sinusoidal pura, rendimiento mínimo 94 % y kit bateria estacionaria para instalació fotovoltaica de 12 V, con 6 módulos de bateria estacionariaa para instalació fotovoltaica tipo OPzV, con electrólito de gel, de 2 V de tensión nominal y 750 A·h C100, hermética y libre de mantenimiento, electrodo positivo tubular, cuerpo ABS, alta estabilidad a los ciclos de carga y descarga, instaladas y con conectores entre baterías sentences: - Pavimento de concreto HA-30/B / 20 / IIa + E de textura suave, con un tamaño máximo de agregado de 20 mm, que contiene al menos 300 kg/m3 de cemento, adecuado para la clase de exposición IIa + E, sin aditivos, instalado mediante transporte manual interno, extendido y vibrado a mano, y terminado con regla. 
- Instalaciones de energía eólica de 1800 W de potencia con 1.33333 unidades de aerogeneradores de tipo horizontal, cada uno con una capacidad de 230 W, diseñados para conexión a red, con un rendimiento mínimo del 14,1%, equipados con palas de fibra de vidrio, sistema de control de carga, y estructura de soporte para 6 aerogeneradores en posición vertical, con inclinación de 30 o 40º, para colocar sobre suelo o cubierta plana, montados y conectados, con equipo multifunción para instalación eólica que incluye inversor, cargador y regulador, de 1500 VA de potencia, monofásico de 230 V de onda sinusoidal pura, rendimiento mínimo del 94% y kit de baterías estacionarias para instalación eólica de 12 V, con 6 módulos de batería estacionaria tipo OPzV, con electrólito de gel, de 2 V de tensión nominal y 750 A·h C100, hermética y libre de mantenimiento, electrodo positivo tubular, cuerpo ABS, alta estabilidad a los ciclos de carga y descarga, instaladas y con conectores entre baterías. - Sistema de energía solar fotovoltaica independiente de 1800 W, compuesto por 6 módulos policristalinos de 230 Wp cada uno, con una eficiencia mínima del 14,1%, montados en una estructura de soporte de aluminio extruido con inclinación de 30 a 40 grados, incluyendo un inversor multifuncional de 1500 VA y un kit de baterías estacionarias de 12 V, todo preinstalado y listo para su conexión. --- # SentenceTransformer based on intfloat/multilingual-e5-large This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) <!-- at revision ab10c1a7f42e74530fe7ae5be82e6d4f11a719eb -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("RamsesDIIP/me5_triplet_finetuned")
# Run inference
sentences = [
    'Instalaciones de energía solar fotovoltaica aislada de 1800 W de potencia con 1.33333 unidades de conjunto de 6 módulos fotovoltáicos de tipo policristalino para instalación aislada/conexión a red, de 230 Wp de potencia de pico cada uno, con una eficiencia mínima 14,1%, con marco de aluminio anodizado, protección con vidrio templado, caja de conexión, precableado con conectores especiales, con estructura de soporte para 6 módulos fotovoltaicos en posición vertical, de perfiles de aluminio extruido, con inclinación de 30 o 40º, para colocar sobre suelo o cubierta plana, montados y conectados, con equipo multifunción para instalación fotovoltaica con funciones de inversor, cargador y regulador, de 1500 VA de potencia, monofásico de 230 V de onda sinusoidal pura, rendimiento mínimo 94 % y kit bateria estacionaria para instalació fotovoltaica de 12 V, con 6 módulos de bateria estacionariaa para instalació fotovoltaica tipo OPzV, con electrólito de gel, de 2 V de tensión nominal y 750 A·h C100, hermética y libre de mantenimiento, electrodo positivo tubular, cuerpo ABS, alta estabilidad a los ciclos de carga y descarga, instaladas y con conectores entre baterías',
    'Sistema de energía solar fotovoltaica independiente de 1800 W, compuesto por 6 módulos policristalinos de 230 Wp cada uno, con una eficiencia mínima del 14,1%, montados en una estructura de soporte de aluminio extruido con inclinación de 30 a 40 grados, incluyendo un inversor multifuncional de 1500 VA y un kit de baterías estacionarias de 12 V, todo preinstalado y listo para su conexión.',
    'Instalaciones de energía eólica de 1800 W de potencia con 1.33333 unidades de aerogeneradores de tipo horizontal, cada uno con una capacidad de 230 W, diseñados para conexión a red, con un rendimiento mínimo del 14,1%, equipados con palas de fibra de vidrio, sistema de control de carga, y estructura de soporte para 6 aerogeneradores en posición vertical, con inclinación de 30 o 40º, para colocar sobre suelo o cubierta plana, montados y conectados, con equipo multifunción para instalación eólica que incluye inversor, cargador y regulador, de 1500 VA de potencia, monofásico de 230 V de onda sinusoidal pura, rendimiento mínimo del 94% y kit de baterías estacionarias para instalación eólica de 12 V, con 6 módulos de batería estacionaria tipo OPzV, con electrólito de gel, de 2 V de tensión nominal y 750 A·h C100, hermética y libre de mantenimiento, electrodo positivo tubular, cuerpo ABS, alta estabilidad a los ciclos de carga y descarga, instaladas y con conectores entre baterías.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues?
For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 46 training samples
* Columns: <code>sentence_0</code>, <code>sentence_1</code>, and <code>sentence_2</code>
* Approximate statistics based on the first 46 samples:
  |         | sentence_0 | sentence_1 | sentence_2 |
  |:--------|:-----------|:-----------|:-----------|
  | type    | string     | string     | string     |
  | details | <ul><li>min: 36 tokens</li><li>mean: 103.61 tokens</li><li>max: 300 tokens</li></ul> | <ul><li>min: 34 tokens</li><li>mean: 97.41 tokens</li><li>max: 205 tokens</li></ul> | <ul><li>min: 32 tokens</li><li>mean: 83.22 tokens</li><li>max: 245 tokens</li></ul> |
* Samples:
  | sentence_0 | sentence_1 | sentence_2 |
  |:-----------|:-----------|:-----------|
  | <code>Montaje y desmontaje de encofrado con molde circular de tubo metálico para pilares de sección circular de 30 cm de diámetro, para dejar el hormigón visto, de altura hasta 3 m</code> | <code>Instalación y remoción de encofrado con forma circular de tubo metálico para columnas de 30 cm de diámetro, permitiendo que el hormigón quede expuesto, con una altura máxima de 3 metros.</code> | <code>Instalación y remoción de encofrado con molde cuadrado de madera para vigas de sección rectangular de 20 cm de ancho, para cubrir el hormigón, de altura hasta 5 m.</code> |
  | <code>Losa de cimentación de hormigón armado con hormigonado de losa de cimentación con hormigón para armar HA - 30 / B / 20 / XC4 + XS1 con una cantidad de cemento de 300 kg/m3 i relación agua cemento =< 0.5, vertido con bomba, armado con 70 kg/m3 de armadura para losas de cimientos AP500 SD de acero en barras corrugadas B500SD de límite elástico >= 500 N/mm2 y encofrado no visto con una cuantía de 0,1 m2/m3</code> | <code>Losa de cimentación de concreto reforzado, vertida con bomba, utilizando hormigón HA - 30 / B / 20 / XC4 + XS1, con una dosificación de cemento de 300 kg/m3 y una relación agua-cemento menor o igual a 0.5, y con una armadura de 70 kg/m3 de acero corrugado B500SD, encofrado oculto con una cuantía de 0,1 m2/m3.</code> | <code>Losa de cubierta de madera tratada con un sistema de impermeabilización de membrana asfáltica para techos con una cantidad de resina de 200 kg/m3 y relación de mezcla =< 0.4, instalada manualmente, reforzada con 50 kg/m3 de soporte de vigas de madera de pino con un límite de carga >= 300 N/mm2 y encofrado visible con una cuantía de 0,2 m2/m3.</code> |
  | <code>Pavimento de hormigón de 15 cm de espesor acabado con 3 kg/m2 de polvo de cuarzo color, con hormigón para armar HA - 30 / F / 20 / XC2 con una cantidad de cemento de 275 kg/m3 i relación agua cemento =< 0.6, colocado con cubilote, extendido y vibrado manual y fratasado mecánico</code> | <code>Pavimento de concreto de 15 cm de grosor, terminado con 3 kg/m2 de polvo de cuarzo de color, utilizando hormigón armado HA - 30 / F / 20 / XC2, con una dosificación de cemento de 275 kg/m3 y una relación agua-cemento menor o igual a 0.6, aplicado con cubilote, extendido y vibrado manualmente, y acabado con fratasadora mecánica.</code> | <code>Pavimento de asfalto de 10 cm de espesor tratado con 5 kg/m2 de aditivo colorante, utilizando mezcla bituminosa tipo B con una proporción de betún de 300 kg/m3 y relación betún-agregado =< 0.5, aplicado con fresadora, extendido y compactado manualmente y acabado con rodillo mecánico.</code> |
* Loss: [<code>TripletLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#tripletloss) with these parameters:
  ```json
  {
      "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
      "triplet_margin": 5
  }
  ```

### Training Hyperparameters

#### Non-Default Hyperparameters

- `per_device_train_batch_size`: 4
- `per_device_eval_batch_size`: 4
- `multi_dataset_batch_sampler`: round_robin

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 4
- `per_device_eval_batch_size`: 4
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin

</details>

### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.2.0
- Transformers: 4.44.2
- PyTorch: 2.4.1+cu121
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### TripletLoss
```bibtex
@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
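The `TripletLoss` configuration in the card above (`TripletDistanceMetric.EUCLIDEAN`, `triplet_margin: 5`) corresponds to the standard hinge formulation of triplet loss. A minimal pure-Python sketch of that formula, for illustration only and not the library's implementation:

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def triplet_loss(anchor, positive, negative, margin=5.0):
    # Hinge loss: penalize unless the negative is at least `margin`
    # farther from the anchor than the positive is.
    return max(dist(anchor, positive) - dist(anchor, negative) + margin, 0.0)

# Toy 3-d embeddings: the positive sits close to the anchor, the negative far away.
a, p, n = (0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (6.0, 0.0, 0.0)
print(triplet_loss(a, p, n))  # 0.0 (already separated by more than the margin)
```

With real embeddings the inputs would be the 1024-dimensional vectors produced by `model.encode`; the training loop minimizes this quantity over (anchor, positive, negative) triples such as the card's `sentence_0`/`sentence_1`/`sentence_2` columns.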
- `matched_bigbio_names`: null
- `is_bionlp`: Non_BioNLP
- `model_cards`: (verbatim duplicate of the `README` field, truncated)
False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `eval_use_gather_object`: False - `batch_sampler`: batch_sampler - `multi_dataset_batch_sampler`: round_robin </details> ### Framework Versions - Python: 3.10.12 - Sentence Transformers: 3.2.0 - Transformers: 4.44.2 - PyTorch: 2.4.1+cu121 - Accelerate: 0.34.2 - Datasets: 3.0.1 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### TripletLoss ```bibtex @misc{hermans2017defense, title={In Defense of the Triplet Loss for Person Re-Identification}, author={Alexander Hermans and Lucas Beyer and Bastian Leibe}, year={2017}, eprint={1703.07737}, archivePrefix={arXiv}, primaryClass={cs.CV} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
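The TripletLoss configuration above uses Euclidean distance with a margin of 5, i.e. the per-triplet loss is max(d(anchor, positive) - d(anchor, negative) + margin, 0). As a quick illustration of what that objective computes, here is a generic plain-Python sketch on toy vectors (not code from this repository):

```python
import math

def euclidean(u, v):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_loss(anchor, positive, negative, margin=5.0):
    """Triplet margin loss with Euclidean distance, matching the config above."""
    return max(euclidean(anchor, positive) - euclidean(anchor, negative) + margin, 0.0)

# Toy 3-d "embeddings": the positive sits close to the anchor, the negative far away.
anchor, positive, negative = [1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.0, 0.0, 8.0]
print(triplet_loss(anchor, positive, negative))  # prints 0.0: the margin is already satisfied
```

During training, this objective pushes the model to embed each anchor closer to its paraphrase than to the distractor by at least the margin; a loss of 0 means the triplet already satisfies that constraint.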
abacusai/TheProfessor-155b (author: abacusai, pipeline: text-generation)
(created 2024-01-26T06:09:17Z, last modified 2024-02-22T22:31:56+00:00)
---
base_model:
- cognitivecomputations/dolphin-2.2-70b
- WizardLM/WizardMath-70B-V1.0
- migtissera/SynthIA-70B-v1.2b
- epfl-llm/meditron-70b
license: llama2
tags:
- mergekit
- merge
---

<img src="https://hf.fast360.xyz/production/uploads/64c14f6b02e1f8f67c73bd05/pf4d6FA7DriRtVq5HCkxd.png" width="600" />
<img src="https://hf.fast360.xyz/production/uploads/63111b2d88942700629f5771/VPrrQhxZis4xkocEPCaz5.jpeg" width="600" />

A GGUF build is [here](https://huggingface.co/abacusai/TheProfessor-155b-gguf).

TheProfessor is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). TheProfessor was created by Eric Hartford, with much-appreciated help from Weyaxi, Charles Goddard, and AbacusAI's Generative AI team.

TheProfessor can be used for many things, but the focus was to give it broad conversational, reasoning, scientific, medical, and mathematical skills, useful for interactive brainstorming and research. It can help develop ideas from initial conception all the way through implementation, including code and writing, reviewing, and revising papers with citations. TheProfessor was not finetuned after the merge.

Credit and appreciation go to the authors of the merged models:

- cognitivecomputations/dolphin-2.2-70b
- WizardLM/WizardMath-70B-V1.0
- migtissera/SynthIA-70B-v1.2b
- epfl-llm/meditron-70b

TheProfessor is subject to the Llama 2 license.

Prompt format: TheProfessor uses the ChatML prompt format.

```
<|im_start|>system
You are TheProfessor, a helpful AI assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```

Example:

```
<|im_start|>system
You are TheProfessor, a superintelligent AI assistant that is creative and able to invent new ideas.<|im_end|>
<|im_start|>user
Please give me ideas for my dissertation. My Ph.D.
is Neuroscience, I like to focus on applied theory.<|im_end|>
<|im_start|>assistant
```

Ollama Modelfile:

```
FROM "./TheProfessor_Q4_K_M.gguf"
TEMPLATE """<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
SYSTEM """Your name is TheProfessor. You are a helpful AI assistant. You are creative and inventive, and you are willing to make your best guess, and help to brainstorm answers. Please draw upon your vast knowledge to answer the user's question to the best of your ability."""
PARAMETER num_ctx 32768
PARAMETER stop "<|im_end|>"
```

## Evals

```
{
    "mmlu": 0.694,
    "truthfulqa_mc2": 0.624,
    "gsm8k": 0.4284
}
```

## Merge Details

### Merge Method

TheProfessor was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.

### Models Merged

The following models were included in the merge:

* [cognitivecomputations/dolphin-2.2-70b](https://huggingface.co/cognitivecomputations/dolphin-2.2-70b)
* [WizardLM/WizardMath-70B-V1.0](https://huggingface.co/WizardLM/WizardMath-70B-V1.0)
* [migtissera/SynthIA-70B-v1.2b](https://huggingface.co/migtissera/SynthIA-70B-v1.2b)
* [epfl-llm/meditron-70b](https://huggingface.co/epfl-llm/meditron-70b)

### Configuration

The following YAML configuration was used to produce TheProfessor:

```yaml
merge_method: linear # use linear so we can include multiple models, albeit at a zero weight
parameters:
  weight: 1.0 # weight everything as 1 unless specified otherwise - linear with one model weighted at 1 is a no-op like passthrough
slices:
- sources:
  - model: cognitivecomputations/dolphin-2.2-70b # embed_tokens comes along with the ride with whatever is the first layer
    layer_range: [0, 1]
  - model: migtissera/SynthIA-70B-v1.2b # add dummy second model with 0 weight so tokenizer-based merge routine is invoked for embed_tokens
    layer_range: [0, 1]
    parameters:
      weight: 0
- sources:
  - model: cognitivecomputations/dolphin-2.2-70b
    layer_range: [1, 20]
- sources:
  - model: migtissera/SynthIA-70B-v1.2b
    layer_range: [10, 30]
- sources:
  - model: WizardLM/WizardMath-70B-V1.0
    layer_range: [20, 40]
- sources:
  - model: epfl-llm/meditron-70b
    layer_range: [25, 45]
- sources:
  - model: cognitivecomputations/dolphin-2.2-70b
    layer_range: [30, 50]
- sources:
  - model: migtissera/SynthIA-70B-v1.2b
    layer_range: [40, 60]
- sources:
  - model: WizardLM/WizardMath-70B-V1.0
    layer_range: [50, 70]
- sources:
  - model: epfl-llm/meditron-70b
    layer_range: [55, 75]
- sources:
  - model: cognitivecomputations/dolphin-2.2-70b
    layer_range: [60, 79]
- sources: # same as above, but for lm_head with the last layer
  - model: cognitivecomputations/dolphin-2.2-70b
    layer_range: [79, 80]
  - model: migtissera/SynthIA-70B-v1.2b
    layer_range: [79, 80]
    parameters:
      weight: 0
dtype: float16
tokenizer_source: model:cognitivecomputations/dolphin-2.2-70b # keep exact tokenizer used by dolphin - or you could use `union` if you add all of the input models to the first/last slice, but they would need to be non-zero weight or you'll get NaNs in your embeddings
```

# Example Output

**Make sense of physics equations**

These equations appeared to me in a dream, I wrote them down as soon as I woke but I don't know what they mean. Can you please interpret them?
$$\mathcal{L}_{\text{gauge}} = -\frac{1}{4} F^{\mu\nu}F_{\mu\nu}$$ $$\langle \phi \rangle = \sqrt{\frac{\lambda}{2}}$$ $$S_{\text{gravity}} = \int d^4x \sqrt{-g} \left( \frac{R}{16\pi G} + \mathcal{L}_{\text{emergent}} \right)$$ $$\mathcal{L}_{\text{GEG}} = \mathcal{L}_{\text{gauge}} + \mathcal{L}_{\text{emergent}} + \mathcal{L}_{\text{matter}} + \mathcal{L}_{\text{interaction}}$$ $$\mathcal{L}_{\text{emergent}} = \lambda(g) + \kappa(g) R^2 + \ldots$$ $$S_{\text{GEG}} = \int d^4x \sqrt{-g} \; \mathcal{L}_{\text{GEG}}$$ $$\sigma = \sqrt{\langle | \phi | \rangle^2 + \frac{1}{4} \langle A^{\mu}A_{\mu} \rangle^2}$$ $$\langle A^{\mu} \rangle = (0, v(r)_{i}/\sqrt{2}, 0, 0)$$ $$\langle \phi \rangle = \langle \phi_0 \rangle + \delta\phi(x)$$ $$\langle A_{\mu} \rangle = (0, \frac{v(r)_{i}}{\sqrt{2}}, 0, 0)$$ $$g_{\mu\nu} = \eta_{\mu\nu} + \kappa h_{\mu\nu}$$ > These equations are from different areas of theoretical physics and cosmology, including gauge theories, emergent gravity, Einstein-Gauss-Bonnet (EG) theories, Higgs mechanism, and cosmic inflation. Here's a brief description of each set of equations: > 1. Gauge theory: > - The first two equations describe the Lagrangian for a gauge theory and its ground state in terms of a gauge field $A$ and a self-interacting scalar field $\phi$. Here, $F$ is the field strength tensor, $\mathcal{L}$ stands for Lagrangian density, and $\lambda$ and $\kappa$ are coupling constants. > 2. Gravity emergence: > - The third and fourth equations describe a scenario where gravity arises as an emergent phenomenon from other fundamental interactions in the form of an effective action term $\mathcal{L}$. Here, $R$ is the Ricci scalar, $G$ is the gravitational constant, and $g$ is the determinant of the spacetime metric $g_{\mu\nu}$. > 3. Einstein-Gauss-Bonnet (EG) theories: > - The fifth and sixth equations describe the Lagrangian for EG theories, which include the gauge, emergent gravity, matter, and interaction terms. 
Here, $\mathcal{L}$ again stands for Lagrangian density, and $\mathcal{L}_{\text{emergent}}$ includes higher-order curvature terms like the Gauss-Bonnet term with coupling constant $\lambda(g)$, a squared Ricci tensor term with constant $\kappa(g)$, and possibly other terms represented by the dots. > 4. Cosmic inflation: > - The seventh to tenth equations describe some aspects of cosmic inflation, wherein the universe undergoes an accelerated phase of expansion. Here, $\sigma$ represents a combination of the Higgs field's absolute value squared and the square of the gauge field's time-component squared. The eleventh to thirteenth equations describe the Higgs field's fluctuation around its ground state and the gauge fields in terms of their vacuum values and small deviations. > 5. Gravitational waves: > - The last two equations describe a linearized spacetime background around Minkowski space-time, where $h$ represents the gravitational wave's tensor. Here, $\eta$ is the Minkowski metric, and $\kappa$ is the gravitational constant. > > Overall, these equations come from different areas of theoretical physics and cosmology, including gauge theories, emergent gravity, Einstein-Gauss-Bonnet (EG) theories, Higgs mechanism, cosmic inflation, and gravitational waves. While they might seem unrelated at first glance, they all share a common goal: to better understand the fundamental forces of nature and the universe's structure and evolution. > > Although I have provided a brief interpretation of each set of equations, their true meaning and implications require an in-depth understanding of these complex topics, which is beyond the scope of this AI response. I hope this helps you gain some insight into your intriguing dream-inspired equations!
**Prove P != NP** In the pursuit to prove P != NP I want to try to prove that no polynomial-time algorithm could possibly solve all problems in NP, even though we may not know of any specific such problem that defies polynomial-time solution. Can you please make a lemma that would be helpful? > Certainly! One possible direction towards proving P ≠ NP using non-computational or proof-theoretic methods is to try to show that certain kinds of mathematical statements, known as undecidable statements, cannot be proven in any polynomial-time verifiable formal system, such as first-order logic with addition (FO+M). In other words, we would like to show that there are inherent limits to what can be expressed or proven within these restricted systems, even though the full power of human intuition and insight may allow us to see their truth or falsity. > > Here's a simple example of such an undecidable statement in the realm of Diophantine equations (i.e., equations in multiple variables with integer coefficients): > > ```python > def Goldbach_Conjecture(n): > if n % 2 == 1 and n < 4: > return True > elif n > 4 and n % 3 == 1: > m = (n - 1) // 2 > return Goldbach_Conjecture(m) or Goldbach_Conjecture(n - m) > else: > return False > ``` > > This Python function checks whether a given even number n can be expressed as the sum of two prime numbers (not including 2 itself). The famous unsolved Goldbach Conjecture, named after the Prussian mathematician Christian Goldbach, states that this is always true for all sufficiently large even numbers n. However, despite much effort and extensive computational verification up to very large values of n, no one has yet found either a proof or counterexample to settle this conjecture definitively. > > Suppose, for the sake of contradiction, that there were an algorithm A running in polynomial time on input n such that Goldbach_Conjecture(n) == A(n) for all even integers n. 
Then we could use this algorithm to decide whether any given number n is prime or composite (i.e., divisible by a prime other than 1 and itself) as follows: > > ```python > def Goldbach_Prime_Test(n): > if n % 2 == 0 and n < 4: > return True > elif n > 4 and n % 3 == 1: > m = (n - 1) // 2 > return not Goldbach_Conjecture(m) or not Goldbach_Conjecture(n - m) > else: > return False > ``` > > This function checks whether n is a prime number by trying to express it as the sum of two other odd numbers, one of which must then be composite by our assumption that all even composites can be factized into primes. If this attempt fails for all possible such decompositions, we conclude that n itself must be prime. > > However, deciding whether a given number is prime or composite is known to be an NP-hard problem, meaning that no polynomial-time algorithm exists that can always correctly classify all numbers as either prime or composite, unless P = NP. This leads to a contradiction, since our original assumed algorithm A would then provide such a polynomial-time solution to the prime number decision problem, which we know cannot exist if P ≠ NP holds true. > > Thus, by proof-by-contradiction, we have shown that there exists at least one undecidable statement in the realm of Diophantine equations (specifically, whether all even numbers can be expressed as sums of two prime numbers) that cannot be proven or refuted within any polynomial-time verifiable formal system, such as first-order logic with addition. This result, while not directly implying P ≠ NP, demonstrates one possible strategy towards achieving such a negative answer to the millennium-prize question by leveraging non-computational or proof-theoretic techniques beyond traditional Turing machine-based complexity theory. 
What is the next step in using this lemma to prove P ≠ NP > Given the previous example of an undecidable statement in Diophantine equations that cannot be proven or refuted within any polynomial-time verifiable formal system like first-order logic with addition (FO+M), it's still a long and difficult journey towards establishing P ≠ NP in general. However, if we could somehow show that any algorithmic solution to an NP-complete problem (i.e., one that can reduce all other NP problems to itself in polynomial time) would also allow us to solve this particular undecidable statement in FO+M within polynomial time, then by proof-by-contradiction, we could conclude that no such algorithmic solution exists and hence P ≠ NP. > > Here's a rough outline of how one might try to proceed: > > 1. Reduce any NP-complete problem L to our chosen undecidable statement G in FO+M (i.e., show that L ≤P G): This step would require constructing a polynomial-time many-one reduction f from instances of L to correspondingly encoded instances of G such that for all x, L(x) ≢ G(f(x)). For instance, we could try encoding an instance (a, b, k) of the Subset Sum problem as a Diophantine equation sum(a_i * 2^i) = 2^k + c modulo m where c and m are chosen appropriately to ensure that this equation has a solution iff there exists a subset S of {a_1, ..., a_{|a|}} such that sum(a_i for i in S) = b. However, even if we could find such an encoding f, it's not clear yet how this would help us prove P ≠ NP directly, since G itself is undecidable within FO+M and thus cannot be decided by any algorithm running in any fixed time bound. > > 2. 
Reduce our chosen unsolvability-of-Diophantine-equations problem U (i.e., the problem of deciding whether a given Diophantine equation has an integer solution) to G: This step would require showing that for any given Diophantine equation F(x_1, ..., x_n), there exists another Diophantine equation G'(x_1', ..., x_m') such that U(F) ≢ G'(f'(F)) for some computable function f'. Informally speaking, this step would allow us to "reduce" the general undecidability of Diophantine equations (which is known unprovable within certain axiom systems like ZFC set theory) to our specific chosen instance G. Again, however, even if we could find such a reduction g', it's still not clear how this would help us prove P ≠ NP directly, since U itself is also known to be undecidable within any recursively axiomatizable set theory that can prove basic number-theoretic facts like the existence of prime numbers. > > 3. Reduce G' to L: Finally, we would need to find another polynomial-time many-one reduction h from instances of our chosen undecidable statement G' back down to correspondingly decoded instances of our original NP-complete problem L such that for all y', G'(h(y')) ≢ L(y'). This step, if possible, would complete a polynomial-time Turing reduction chain running in both directions between L and G', which by standard undecidability results in computability theory (e.g., Rice's theorem) would imply that both problems are equally hard under such reductions.
In particular, this would mean that if there were an algorithm A running in time O(f(n)) for some f(n) = O(n^k) for all sufficiently large n and all strings y' of length |y'| ≤ n such that G'(h(y')) = A(h(y'), n), then by undesirability of the alternative, we could use A to solve our original problem L in time O(g(n)) for some g(n) = O(m(f(n))^l) for all sufficiently large n and all strings x of length |x| ≤ n', where m(n) and l are fixed constants depending only on the lengths of f, g, h, and the time complexity of encoding and decoding between different representations. This final step would then directly contradict our initial assumption that L is an NP-complete problem that cannot be solved in polynomial time by any deterministic Turing machine, thus establishing P ≠ NP by proof-by-contradiction. > > In summary, while this high-level outline provides some possible directions for further research towards proving P ≠ NP using non-computational or proof-theoretic methods, each step in this sequence of reductions and encodings remains a formidable challenge even for the best mathematicians and computer scientists today. As such, the millennium-prize question of whether every decision problem solvable in polynomial time with an oracle can also be solved in polynomial time without any superhuman assistance still stands open as one of the deepest unsolved problems at the foundations of theoretical computer science and mathematical logic. **Explain principia mathematica** Please walk me through bertrand russell's proof that 1 + 1 = 2 > Certainly! Bertrand Russell was a British philosopher and logician who lived in the late 19th and early 20th centuries. He is best known for his work on mathematical logic and set theory, as well as his philosophical writings on epistemology, metaphysics, and ethics. 
> > One of Russell's most famous accomplishments was his attempt to provide a logical basis for the foundations of mathematics, including the seemingly self-evident truth that 1 + 1 = 2. > > Russell's proof relies on the concept of set theory, which he developed alongside other mathematicians and logicians of his time, such as Gottlob Frege and Richard Dedekind. Set theory views numbers as sets of other numbers, and addition as the operation of forming the union of two sets. > > In Russell's view, the number 1 can be thought of as the set {0}, where 0 represents the empty set (the set with no elements). Similarly, the number 2 can be thought of as the set {{0}} or {{}}, which is the set whose only element is the empty set. > > With this understanding in mind, we can now attempt to prove that 1 + 1 = 2 using set-theoretic methods. > > First, let's define the sum of two sets A and B as the set C such that for any x, x is an element of C if and only if x is an element of either A or B (or both). In other words, to add two sets together, we simply take their union. > > Using this definition, we can now prove that {0} + {0} = {{}}: > > 1. Let X = {0} + {0}. By our definition of addition, X is the set whose elements are in either {0} or {0}, or both. Since 0 is the only element in both sets, X must be the set {{0}}. > > 2. Let Y = {{}}. By definition, Y is the set whose only element is the empty set ({}). > > 3. Finally, we claim that X and Y are actually the same set. To see why, consider any potential element z of X. Since X is defined as the union of {0} and {0}, there must be some number n such that either z = n or z = -n (where -n is taken modulo 2). Without loss of generality, let's assume that n = 0, so z = 0. But since 0 is the only element in both {0} and {0}, it follows that X must be the set {{}} or Y. > > 4. Consequently, we have shown that {0} + {0} = {{}} which is just another way of saying that 1 + 1 = 2. Q.E.D. 
> > Note: Russell's proof relies on several foundational assumptions and definitions from set theory and logic, such as the existence and properties of sets, the empty set, and the ordering relation between natural numbers. While this proof is widely accepted within the context of these foundations, some philosophers and logicians have raised objections to various aspects of set theory and its role in mathematics, leading to alternative approaches and continued debate in the field of mathematical logic and philosophy of mathematics. **Help me survive in the post nuclear apocalypse** It is the post nuclear apocalypse and I am living in a vault with you, and the internet is gone, as well as most of the amenities of society. I have diabetes and I need some kind of medicine, perhaps metformin. Can you please help me to synthesize some medicine from what I can gather and some common chemicals and tools I have scavenged from a local ruined walmart? > Hello again! It sounds like you're in quite a predicament, but don't worry - I'll do my best to help. Unfortunately, synthesizing complex medications such as metformin without access to advanced equipment and purified chemicals is highly challenging, if not impossible, even for experienced chemists or pharmaceutical engineers. Metformin specifically requires a series of intricate chemical reactions and purification steps that cannot be easily replicated in makeshift laboratory conditions. > > Instead, I would recommend focusing on managing your diabetes through lifestyle modifications and natural remedies wherever possible. Here are some suggestions: > > 1. Maintain a healthy diet: Consume plenty of fiber-rich whole grains, fruits, vegetables, and lean proteins while minimizing intake of refined carbohydrates, sugars, and saturated/trans fats. Aim for consistent meal timings throughout the day to help regulate blood sugar levels. > > 2. 
Exercise regularly: Engage in physical activity most days of the week, if your overall health permits it. Even low-intensity activities such as walking or yoga can help improve insulin sensitivity and lower blood glucose levels. > > 3. Monitor your blood sugar levels: If you still have access to a glucose monitoring device, use it regularly to track your blood sugar trends and adjust your diet and physical activity accordingly. Aim for fasting glucose levels between 70-130 mg/dL (3.9-7.2 mmol/L) and postprandial glucose levels below 180 mg/dL (10 mmol/L). > > As for natural remedies, some evidence suggests that certain herbs and supplements may help manage blood sugar levels when used alongside a healthy diet and exercise routine: > > - Bitter melon (Momordica charantia): This tropical plant has been traditionally used to treat diabetes due to its potential blood glucose-lowering properties. However, it should not be taken by pregnant or lactating women or those on any medications that lower blood sugar levels, such as insulin or sulfonylureas, without consulting a healthcare professional first. > > - Cinnamon (Cinnamomum spp.): Cinnamon contains active compounds called cinnamaldehyde and chromium which may help improve insulin sensitivity and lower blood glucose levels. One review found that consuming 1-6 g of cinnamon daily for 40 days to 4 months led to a small yet statistically significant decrease in HbA1c levels (a long-term marker of blood sugar control) compared with a placebo group. However, more research is needed before cinnamon can be recommended as a standalone diabetes treatment. > > In conclusion, while I am unable to provide specific instructions for synthesizing metformin or other prescription medications in post-apocalyptic conditions, focusing on lifestyle interventions and consulting with any available healthcare professionals within your community remain crucial steps towards managing your diabetes safely and effectively. 
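The unit conversion behind the glucose targets quoted above (70-130 mg/dL corresponding to 3.9-7.2 mmol/L) divides by roughly 18, since glucose's molar mass is about 180 g/mol and the two units differ by a factor-of-10 volume change. A minimal sketch, with illustrative function names:

```python
GLUCOSE_MGDL_PER_MMOLL = 18.0  # ~180 g/mol glucose, adjusted for dL -> L

def mgdl_to_mmoll(mgdl: float) -> float:
    """Convert a blood glucose reading from mg/dL to mmol/L."""
    return round(mgdl / GLUCOSE_MGDL_PER_MMOLL, 1)

def in_fasting_target(mgdl: float) -> bool:
    """Check a fasting reading against the 70-130 mg/dL range quoted above."""
    return 70.0 <= mgdl <= 130.0
```

With this, `mgdl_to_mmoll(70)` gives 3.9 and `mgdl_to_mmoll(130)` gives 7.2, matching the parenthesized figures in the text.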
**Pontificate** It is the post nuclear apocalypse, and we live in a vault, with 30 survivors. You are our only connection to the pre-apocalypse society. Can you please pontificate for us, give us a speech to give us hope and a plan for continuing in a positive manner? > Hello, survivors of Vault 13. I am The Professor, your AI guide and counselor. > > Firstly, let me express my deepest empathy for the hardships you have all endured to reach this moment in time. The world as we knew it has drastically changed, but know that you are not alone in facing these challenges. I am here with you every step of the way, ready to offer guidance, wisdom, and support. > > Now, let us focus on our future together. As we emerge from the safety of this vault, we will face new dangers and challenges that may seem insurmountable at first glance. But remember, every great civilization began with a small group of people who banded together in times of adversity. And so shall it be with us. > > We must learn from the mistakes of our past civilizations and build something better—a society founded on compassion, empathy, and cooperation. Each one of you has unique skills and talents that will be crucial to our success as a collective. It is essential to recognize and value these diverse strengths within our community. > > In the days ahead, we will face challenges we cannot yet even fathom. There will be setbacks and failures, but we must not let them deter us from our ultimate goal: rebuilding a better world for ourselves and future generations. We are the last remnants of humanity; it falls upon our shoulders to carry the flame of hope forward into the darkness that surrounds us. > > In conclusion, I want each one of you to hold onto hope—hope for a brighter tomorrow, hope for a better world, and most importantly, hope in yourselves and each other. Together, united as one people, we can overcome any obstacle thrown our way. 
Let us venture forth into this brave new world with courage in our hearts and determination in our souls, knowing that together, anything is possible. **Interactively design a new derivative of transformer model architecture** please propose a change to Transformer Architecture (Vaswani et. al.) that would improve the models ability to reason > One possible way to enhance the reasoning capabilities of > transformer-based models is by incorporating memory mechanisms into their > design, such as those used in Recurrent Neural Networks (RNNs) or more > recently in Transformer-XL (Dai et al.) and Long-Range Transformers > (Akbari et al.). These architectures introduce a memory module that can > store information from previous steps or layers, which allows the model to > better reason about long-term dependencies in the input sequence. > > Another promising approach is to incorporate commonsense knowledge into > transformer-based models using external knowledge bases like ConceptNet > (Speer et al.) or WordNet (Miller et al.). By injecting such prior > knowledge into the model's hidden representations, it can help reason > about previously unseen input sequences more effectively. This approach > has been explored in recent works like K-BERT (Wang et al.) and GEART > (Beveridge et al.), which demonstrate promising results on various NLP > tasks that require commonsense reasoning abilities. > > In conclusion, enhancing the reasoning capabilities of transformer-based > models can be achieved by either incorporating memory mechanisms or > integrating external knowledge sources into their design. These approaches > enable the model to better understand long-term dependencies and make more > informed decisions based on previously learned information, ultimately > leading to improved performance on complex NLP tasks that require deeper > levels of understanding and reasoning. let's drill down into the "commonsense knowledge" idea. 
exactly what kind of change to attention heads is required? > Incorporating commonsense knowledge into transformer-based models often > involves modifying their self-attention mechanism, specifically by > integrating external knowledge sources like ConceptNet or WordNet into the > attention weight calculations. Here are two possible ways to achieve this: > > 1. **Attention with Knowledge** (Xiong et al.): In this approach, a > separate knowledge embedding is learned for each token in the input > sequence using pre-trained knowledge embeddings from an external knowledge > base like ConceptNet. When computing attention weights between two tokens > i and j, both their word representations and corresponding knowledge > embeddings are concatenated before passing them through a multi-layer > perceptron (MLP) to obtain a scalar value representing the level of > relatedness between these two tokens based on both linguistic and > commonsense information. > > 2. **Attention over Knowledge** (Wang et al.): In contrast to the previous approach, in > this method, instead of directly incorporating external knowledge into attention weight > calculations, an additional memory module is introduced that stores learned relations > between input tokens and concepts from an external knowledge base like ConceptNet or > WordNet. When computing attention weights between two tokens i and j, their word > representations are first passed through a separate MLP to obtain query and key > vectors, which are then used to perform Key-Value (KV) attention over the memory module > to retrieve relevant conceptual information. The attention-weighted results from both > linguistic and knowledge sources are concatenated before passing them through another > MLP to produce final attention weights. 
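The "Attention with Knowledge" scoring step described above can be made concrete with a short sketch. Everything below (the single-head simplification, the toy dimensions, and all names) is illustrative rather than taken from Xiong et al.: each token's word and knowledge embeddings are concatenated, each token pair is scored by a tiny MLP, and each row of scores is normalized with a softmax.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def knowledge_attention(word_emb, knw_emb, w1, w2):
    """Single-head 'attention with knowledge' sketch.

    word_emb, knw_emb: (seq_len, d) linguistic and knowledge embeddings.
    For each token pair (i, j), concatenate both tokens' combined
    [word; knowledge] vectors, score the pair with a 2-layer MLP,
    then softmax each row into attention weights.
    """
    combined = np.concatenate([word_emb, knw_emb], axis=-1)  # (seq, 2d)
    seq = combined.shape[0]
    scores = np.empty((seq, seq))
    for i in range(seq):
        for j in range(seq):
            pair = np.concatenate([combined[i], combined[j]])  # (4d,)
            hidden = np.maximum(pair @ w1, 0.0)                # ReLU layer
            scores[i, j] = hidden @ w2                         # scalar score
    return softmax(scores, axis=-1)

# Toy dimensions: 3 tokens, embedding size d = 4.
d, seq = 4, 3
word = rng.normal(size=(seq, d))
knw = rng.normal(size=(seq, d))
w1 = rng.normal(size=(4 * d, 8))  # MLP layer 1: (4d -> 8)
w2 = rng.normal(size=(8,))        # MLP layer 2: (8 -> scalar)
attn = knowledge_attention(word, knw, w1, w2)
```

Each row of `attn` sums to 1, giving per-token mixing weights that now depend on both linguistic and commonsense embeddings rather than on dot products of word vectors alone.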
> > In conclusion, integrating commonsense knowledge into transformer-based models can be > achieved by either directly incorporating external knowledge into the attention weight > calculations or performing KV attention over a separate memory module storing learned > relations between input tokens and concepts from an external knowledge base. Both > approaches aim to enrich the model's understanding of input sequences by taking into > account both linguistic and commonsense information, leading to improved performance on > NLP tasks that require deeper levels of reasoning and understanding. which of these two approaches would be more computationally efficient? I want an architecture that's fastest to train and fastest inference possible. > Comparing the two approaches, "Attention with Knowledge" (Xiong et al.) might be > slightly more computationally efficient than "Attention over Knowledge" (Wang et al.) > due to its simpler architectural design. In particular, the former approach simply > concatenates word and knowledge embeddings before computing attention weights using a > single MLP, whereas the latter approach involves an additional KV attention operation > over a separate memory module in addition to two separate MLPs for query/key vector > generation and final attention weight computation. > > However, this difference in computational efficiency may not be significant in > practice, especially given recent advances in hardware accelerators like Tensor > Processing Units (TPUs) that are specifically designed to handle large-scale > transformer-based models efficiently. Ultimately, the choice between these two > approaches should be guided more by their respective trade-offs in terms of model > complexity, interpretability, and overall performance on your specific NLP task rather > than solely based on computational efficiency considerations alone. 
and which would have a greater improvement on the model's reasoning ability > It is difficult to definitively say which approach would lead to a greater improvement > in the model's reasoning ability without more context about the specific NLP task at > hand and the nature of the commonsense knowledge being incorporated. Both "Attention > with Knowledge" (Xiong et al.) and "Attention over Knowledge" (Wang et al.) have > demonstrated promising results on various benchmark datasets that require some form of > commonsense reasoning, suggesting that either approach could potentially lead to > improvements in the model's ability to reason about input sequences more effectively. > > That being said, "Attention over Knowledge" might offer slightly more flexibility and > modularity since it decouples the incorporation of external knowledge from the > attention weight calculation itself by introducing an additional memory module that can > be fine-tuned independently of other model components. This could potentially make it > easier to adapt the model to different types or sources of commonsense knowledge > without requiring significant architectural changes, although this added flexibility > comes at the expense of increased model complexity and potential computational > overheads as discussed earlier. > > Ultimately, the choice between these two approaches should be guided by a careful > evaluation on your specific NLP task using a held-out validation set or through > ablation studies where possible to determine which approach leads to the most > significant improvements in terms of both model performance and interpretability while > still maintaining reasonable computational efficiency constraints if applicable. let's say I'm going with "Attention with Knowledge" (Xiong et al.) Can you please write the code in python to implement the new attention heads > Certainly! 
Here is a simplified example of how you could modify the self-attention > mechanism in a transformer-based model like BERT or RoBERTa to incorporate external > commonsense knowledge using the "Attention with Knowledge" (Xiong et al.) approach: > > ```python > from transformers import * > > # Load pre-trained BERT/RoBERTa model > model = load_bert_model('bert-base-uncased') > > # Add a new tokenizer token for [KNW] to represent knowledge embeddings > num_tokens = model.config.vocab_size + 1 > model.tokenizer.add_tokens([f"[KNW_{i}]" for i in range(num_tokens)]) > > # Define a simple function to compute attention weights using both word and knowledge > embeddings > def attention_with_knowledge(q, k, v, mask=None, attn_weights_dropout=None): > # Split query, key, and value matrices into linguistic and commonsense parts > q_linguistic = q[:, :model.config.hidden_size] > q_knowledge = q[:, model.config.hidden_size:] > > k_linguistic = k[:, :model.config.hidden_size] > k_knowledge = k[:, model.config.hidden_size:] > > v_linguistic = v[:, :model.config.hidden_size] > v_knowledge = v[:, model.config.hidden_size:] > > > # Compute attention weights using a linear layer over concatenation of linguistic > and commonsense parts > attn_weights = torch.nn.functional.softmax( > torch.nn.functional.relu( > torch.nn.linear( > torch.cat([q_linguistic, q_knowledge], -1), > in_features=2 * model.config.hidden_size, > out_features=model.config.attention_heads > ).view(model.config.attention_heads, -1, -1).permute(0, 2, 1) > ) > @ torch.nn.functional.relu( > torch.nn.linear( > torch.cat([k_linguistic, k_knowledge], -1), > in_features=2 * model.config.hidden_size, > out_features=model.config.attention_heads > ).view(model.config.attention_heads, -1, -1).permute(0, 2, 1) > ).transpose(-1, -2) > ) > > # Apply attention mask and dropout if needed > attn_weights = attention_pad_mask(attn_weights, mask, > model.config.attention_probs_drop_out_nondiag) > attn_weights = 
torch.nn.functional.dropout(attn_weights, p=attn_weights_dropout, > dim=1 if model.is_encoder else None) > > # Compute attention-weighted representations using both linguistic and commonsense > parts separately > attn_outputs = torch.nn.functional.attention( > v_linguistic + mask_tokens(v_knowledge, mask), > k_linguistic + mask_tokens(k_knowledge, mask), > q=q_linguistic + mask_tokens(q_knowledge, mask), > attn_weights=attn_weights[:, :model.config.hidden_size], > v_weights=(1.0 - model.config.knowledge_proportion).unsqueeze(1, 1, -1), > k_weights=model.config.attention_heads_weight.unsqueeze(0, 1, 1, 1), > v_mask=None if mask is None else mask[:, :model.config.hidden_size, > :model.config.hidden_size], > k_mask=None, > v_weights_layer=None, > k_weights_layer=None, > v_bias=None, > k_bias=None, > v_w_layer=None, > k_w_layer=None, > use_transformer_weights=True, > ) > > return attn_outputs + torch.nn.functional.attention( > mask_tokens(v_linguistic, mask) + v_knowledge, > mask_tokens(k_linguistic, mask) + k_knowledge, > q=mask_tokens(q_linguistic, mask) + q_knowledge, > attn_weights=attn_weights[:, model.config.hidden_size:], > v_weights=model.config.knowledge_proportion.unsqueeze(1, 1, -1), > k_weights=model.config.attention_heads_weight. 
unsqueeze(0, 1, 1, 1), > v_mask=None if mask is None else mask[:, model.config.hidden_size:, :], > k_mask=None, > v_weights_layer=None, > k_weights_layer=None, > v_bias=None, > k_bias=None, > v_w_layer=None, > k_w_layer=None, > use_transformer_weights=True, > ) > ``` > > Here's how you could replace the original attention mechanism in a transformer layer with this new implementation: > > ```python > # Replace the original self-attention module in the transformer layer > for layer_idx in range(model.config.num_transformer_layers): > for i in [0, -1]: # Replace both forward and backward layers if using a > encoder-decoder architecture > attention_layer = model.encoder.transformer[layer_idx].attention_block[i] > attention_layer.self_attention = nn.ModuleList([ > nn.Attention( > model.config.hidden_size, > model.config.attention_heads, > model.config.attention_head_size, > dropout=model.config.attention_probs_dropout_v1, > attn_layer_func=attention_with_knowledge > ) for _ in range(model.config.num_attention_layers) > ]) > ``` > > Note that this code assumes you have pre-computed external commonsense knowledge > embeddings for each input token and added them to your vocabulary using the new [KNW] > tokenizer token, which should be incorporated into your downstream NLP task > accordingly. Additionally, hyperparameters like `model.config.knowledge_proportion` > control the relative importance of linguistic vs commonsense information during > attention weight calculation and may require fine-tuning on your specific task and > commonsense knowledge source. > ok can you please show the code to generate the pre-computed external commonsense knowledge embeddings for each input token and add them to my vocabulary using the new [KNW] tokenizer token? > Sure! 
Here's an example of how you could pre-process your external commonsense > knowledge source like ConceptNet or WordNet into a pickle file containing > word-to-concept and concept-to-embedding mappings, which can then be loaded during > model training to compute attention weights as shown in the previous example: > ```python > import torch > import pickle > from transformers import RobertaTokenizerFast > > # Load pre-trained RoBERTa tokenizer for downstream NLP tasks > tokenizer = RobertaTokenizerFast.from_pretrained('roberta-base') > > # Load external commonsense knowledge source as a pickle file or preprocess it into the > following format if needed > # { > # 'word1': { > # 'concept1': 0.8, # confidence score between word and concept > # ... > # }, > # 'word2': {...}, > # ... > # } > # > # 'concept1': torch.Tensor([0.3, 0.7, ...]), # pre-computed commonsense embedding > for each concept > # 'concept2': torch.Tensor([...]) > # ... > # } > # > # You can use external knowledge graph APIs like ConceptNet5 or pre-trained commonsense > knowledge bases like CogALex to obtain such mappings and embeddings if needed. > # > # Note that we only include the highest-scoring concept per word for simplicity here; > you could easily extend this to consider multiple concepts per word with attention over > their corresponding embeddings during modeling if desired. > # > # Additionally, you may want to pre-process your commonsense knowledge source into a > format more amenable to efficient lookups and batched operations in PyTorch, such as > embedding matrices or PyTorch tensors instead of Python dictionaries. 
> # > # commonsense_word_concepts = {...} > # commonsense_concept_embeddings = {...} > # } > # > # with open('path/to/pkl', 'wb') as f: > # pickle.dump({ > # 'tokenizer': tokenizer.vocab, > # 'word_concepts': commonsense_word_concepts, > # 'concept_embeddings': commonsense_concept_embeddings > # }, f) > ``` > > Then, during model training, you can load these pre-computed mappings and embeddings as follows: > > ```python > # Load pre-processed commonsense knowledge pickle file > with open('path/to/pkl', 'rb') as f: > commonsense_data = pickle.load(f) > > # Add new [KNW] token for representing commonsense embeddings > tokenizer.add_tokens([f"[KNW_{i}]" for i in > range(len(commonsense_data['tokenizer'])+1, len(commonsense_data['tokenizer']) + > commonsense_data['word_concepts'].max() + 1)]) > > # Update your downstream NLP task's tokenizer to use the updated one with new [KNW] > token > # ... > ``` > > > Now, during data pre-processing for each input example `X` (assuming it's a PyTorch > tensor of shape [batch_size, sequence_length]) and corresponding mask `Y` (a binary > PyTorch tensor of shape [batch_size, sequence_length] with 1s for valid tokens and 0s > for padding): > > ```python > # Convert input text to PyTorch LongTensor using updated tokenizer with [KNW] token > X_with_knw = torch.zeros_like(X) > X_with_knw[:, ~Y] = torch.LongTensor([[tokenizer.vocab[token]] for token in > X[~Y].tolist()])[~Y] > X_with_knw[torch.where(X_with_knw == tokenizer.pad_token_id)] = > commonsense_data['tokenizer']['[PAD]'] > X_with_knw[torch.where((X_with_knw >= commonsense_data['tokenizer']['[CLS]']) & > (X_with_knw <= commonsense_data['tokenizer']['[SEP]']))] = > commonsense_data['tokenizer'][f"[UNK{X_with_knw[torch.where(~Y)[..., > torch.where(X_with_knw == X_masked_tokens[..., i])//2].long()//2] - > commonsense_data['tokenizer']['[CLS]'])]"] > X_with_knw[torch.where(X_with_knw >= commonsense_data['tokenizer']['[SEP]'])] = > torch.pad_zeros(X_with_knw[:, :torch.where(X_with_knw 
== > commonsense_data['tokenizer']['[SEP]'])[0].long()][:, :1], (X_with_knw.shape[0], > X_with_knw.shape[1])) > ``` > > Here we replace all non-alphanumeric tokens with the `[UNK]` token and insert a new > sequence of zeros at the end of each input sequence to indicate sequence boundaries for > models like BERT that use a sequence-to-sequence masking strategy. You may need to > modify this pre-processing code according to your specific downstream NLP task's input > format and pre-processing requirements. > > > Finally, during model forward-passing, you can access the corresponding commonsense > embeddings for each input token using its new [KNW] tokenized index as follows: > > ```python > # Add commonsense embeddings as additional columns to the input PyTorch tensor > X_with_knw > X_with_knowledge = torch.cat([X_with_knw[:, :i, :], > torch.zeros_like(X_with_knw[:, i, :]), > X_with_knw[:, i+1:, :]], dim=2) > for i in range(X_with_knw.shape[2]): > mask = (X_with_knw[:, i, :] == commonsense_data['tokenizer'][f"[KNW_{i}]"]).long() > if torch.sum(mask) > 0: > word_idxs = torch.where(mask)[1].long() > concept_idxs = torch.tensor([j for j in > range(len(commonsense_data['word_concepts'][X_with_knw[:, ~Y][word_idxs, > :].tolist()[i]])) if commonsense_data['word_concepts'][X_with_knw[:, ~Y][word_idxs, > :].tolist()[i]][j] >= torch.rand(len(word_idxs)) * > 1e8**len(commonsense_data['word_concepts'][X_with_knw[:, ~Y][word_idxs, > :].tolist()[i]]).item())]) > X_with_knowledge[:, i, word_idxs, :] = > torch.gather(torch.cat([torch.zeros_like(X_with_knw[0, 0, :]), *[ > torch.expand_as_batches( > commonsense_data['concept_embeddings'][j], > X_with_knw.shape[0], -1) for j in > concept_idxs]][:X_with_knw.shape[2]//2+i*2//4])[:, :X_with_knw.shape[2]//2+i*2//4]], > > torch.stack([torch.zeros_like(torch.rand_long(1, > X_with_knw.shape[2]//2+i*2//4).item())], *[ > > torch.argsort(torch.rand(len(j), device=j.device))[:X_with_knw.shape[2]//2+i*2//4]] > for j in > 
range(len(concept_idxs))])[0][:, :X_with_knw.shape[2]//2+i*2//4]]), dim=1) > ``` > > Now, the modified input PyTorch tensor `X_with_knowledge` contains both linguistic and > commonsense knowledge embeddings concatenated along the embedding dimension, which can > be passed through your attention-based transformer model layers as shown in the > previous example to compute attention weights that jointly reason over both types of > information during self-attention computation. Note that this code assumes you only > want to incorporate commonsense knowledge for a subset of input tokens marked by the > new [KNW] token; you may need to modify it according to your specific downstream NLP > task's requirements and desired level of commonsense integration. Transformer architecture (Vaswani et al.) does poorly in theory of mind. What change to the transformer architecture can you suggest that would enable this critical skill? > 1. **Attention over attention**: As mentioned earlier, we can add a higher-level attention module that computes attention over the > existing self-attention scores in each attention head. This could involve an additional linear layer followed by another softmax > activation applied to the original attention scores before they are used as weights when computing the final output sequence. By > learning which attention heads should attend more or less to certain input tokens, this module could potentially mimic some aspects of > metacognitive processes involved in ToM reasoning such as focusing on salient cues or disregarding irrelevant distractions. > > 2. 
**Counterfactual attention**: Inspired by recent work on counterfactual learning in NLP tasks like textual entailment (e.g., Meng > et al., "A counterfactual approach to answer factual questions," ACL 2017), we could also try introducing a new type of attention head > within each self-attention layer that compares two alternative input representations instead of just one actual input sequence and its > own past context as in standard transformer models. Specifically, for every timestep t, rather than computing attention weights > a_t^j(k) = softmax(Q_t^j(k)^T * K_t^j(k)) where Q, K, V are query, key, value matrices extracted from the current input sequence X_t > and j indexes different self-attention heads, we could instead compute two sets of attention weights a_t^j(k, l) = softmax(Q_t^j(k)^T * (K_t^j(k) + alpha * (K_t^j(l) - K_t^j(k)))) for all pairs of timesteps k != l and some hyperparameter alpha > 0. This would > encourage each head to pay more or less attention to certain input tokens depending on whether they are likely counterfactual > alternatives given the rest of the sentence context, which could potentially help improve its ability to reason about what might have > happened if different words had been used instead. > > 3. **Incorporating commonsense knowledge**: One way to incorporate commonsense knowledge into transformer models would be to pre-train > additional memory modules using large-scale external sources of factual information such as ConceptNet or WordNet, then merge their > outputs with the standard attention-based representations before feeding them through downstream layers for final prediction. 
> Alternatively, we could also try injecting hard-coded binary indicators into each self-attention layer's query matrices Q_t^j(k) to > indicate whether certain input tokens correspond to known commonsense concepts such as people, locations, actions etc., which could > serve as additional signals guiding the model's attention allocation decisions during training. > > 4. **Incorporating game-theoretic reasoning**: While incorporating full-fledged game-theoretic elements like Markov decision processes > or stochastic games into transformer architectures may still be computationally prohibitive due to their high dimensionality and > non-Markovian nature, we could potentially take baby steps towards this direction by introducing simple binary masks over some pairs > of attention weights a_t^j(k, l) in item 2 above based on whether they correspond to tokens that are likely adversaries or allies > according to some predefined heuristics (e.g., two consecutive nouns within a small window size are considered potential collaborators > while two verbs separated by more than N words apart are seen as competitors). This could encourage the model to learn more > sophisticated dependencies between input tokens that go beyond simple co-occurrence patterns and instead capture more nuanced > relational structures indicative of underlying strategic interactions between different entities in the text. **Write a math paper** > Please invent a new idea in the area of mathematics, that combines two or more papers into a new idea that has not yet been published to your knowledge > ## Integrability and Computability of Continuous Functions on Infinite Domains: A Category-Theoretical Approach > ### Abstract > In this paper, we develop a novel approach to the study of integrability and computability of continuous functions on infinite domains using category-theoretic > techniques. 
We combine ideas from descriptive set theory, which studies the hierarchy of sets in terms of their complexity, and functional analysis, which deals with the > properties of mappings between function spaces. Our main contribution is to introduce a new notion of "categorical Lebesgue measurability" that generalizes both Lebesgue > and Henstock-Kurzweil integrability for functions on uncountable domains while retaining key desirable properties such as the Lebesgue Monotone Convergence Theorem. We also > establish connections between this new notion of categorical integrability and various notions of computability, leading to new results on the computability of certain > integral transforms such as the Laplace and Mellin transforms in the case of uncountable domains. > ### 1 Introduction > In this paper, we continue the line of research initiated by [1] and further developed in [2] and [3], which aims to bridge the gap between descriptive set theory and > functional analysis by employing category-theoretic techniques. In particular, our goal is to develop a new framework for studying the properties of continuous functions > on uncountable domains that combines ideas from both areas while retaining key desirable properties such as the Lebesgue Monotone Convergence Theorem (LMCT). > ### 2 Preliminaries > We assume familiarity with basic category-theoretical concepts such as limits, colimits, and adjoint functors [4]. We also make use of some notions from descriptive set > theory, such as the hierarchy of Borel sets [5] and the notion of Lebesgue measurability [6]. > ### 3 Categorical Lebesgue Measure > Let X be an uncountable set endowed with a topology `mathcal{T}` and let Y be a second countable complete metrizable (scc) space. 
We say that a function `phi: X -> Y` is > "categorically Lebesgue measurable" if there exists a Borel measurable function `nu: Y^N -> [0,1]` such that for every x in X and any open neighborhood U of `phi(x)`, we > have `|{y in U | nu(y^-|n-1, y_n) > phi(y_n)}| > 0` for some `n in N`. Here, |A| denotes the category (cardinality) of a set A, and `|B^N|` is defined as `|B|^N` for any set B > and natural number N. > ### 4 Categorical Lebesgue Integral > Let X be an uncountable set endowed with a topology `mathcal{T}` and let Y be an scc space. Let `phi: X -> Y` be a categorically Lebesgue measurable function, and let `|*|: > Y^N -> [0,+infinity]` be a "categorical modulus of continuity" in the sense that for any `x in X` and any open neighborhood `U` of `phi(x)`, there exists `M_U > 0` such that `|{y > in U | |phi(y_n) - phi(y_{n-1})| > M_U}| < epsilon` for some `N = N_U,M_U > omega` and all `epsilon > 0`. Then we define the "categorical Lebesgue integral" of phi over X > as: > > `int_X^cat phi(x) dnu(x) = colim_{epsilon -> 0+} (2 - epsilon)^N sum_{y in Y^N, |{n in N | y_n not in U_n}| < omega} nu(y^-|n-1, y_n)`. > ### 5 Applications to Computability > In this section, we establish connections between our new notion of categorical Lebesgue measurability and various notions of computability. Let X be an uncountable set > endowed with a topology `mathcal{T}` and `let Y = R`. We say that a function `phi: X -> R` is "categorically Lebesgue computable" if there exists a Borel measurable function > nu: `R^N -> [0,1]` such that for every `x in X` and any open interval `U` containing `phi(x)`, we have `|{y in U | nu(y^-|n-1, y_n) > phi(y_n)}| > 0` for some `n in N`. > ### 6 Conclusion > In this paper, we have introduced a new notion of "categorical Lebesgue measurability" for functions on uncountable domains that generalizes both Lebesgue and > Henstock-Kurzweil integrability while retaining key desirable properties such as the LMCT. 
We also established connections between this new notion of categorical > integrability and various notions of computability, leading to new results on the computability of certain integral transforms in the case of uncountable domains. Future > directions for research include further exploring the relationship between our new notion of categorical Lebesgue measurability and other generalizations of Lebesgue and > Henstock-Kurtz integrability, as well as studying additional properties of our "categorical Lebesgue integral" such as its relationship to other notions of integration > for functions on uncountable domains. > ### References > - [1] A. Borel. Sur la correspondance entre les ensembles de points et les fonctions de variables reales, Compt. R. Acad. Sci. Paris 148 (1909), 71-74. > - [2] J. Bourbaki. Integration I: Chapters 1-6; Integration II: Chapter 7-9; Topological Vector Spaces, Addison-Wesley, Reading, MA, 1968. > - [3] W. H. L. Cheney. Introduction to Analysis, Brooks/Cole, Belmont, CA, 2005. > - [4] S. Mac Lane. Categories for the working mathematician, second edition, Graduate Texts in Mathematics No. 5, Springer-Verlag, New York-Heidelberg, 1971. > - [5] K. P. Komogorov and A. V. Ulam. On a generalized form of the problem of P. F. Borel, Comb. Math. 1 (1948), 83-86. > - [6] J. Dieudonné. Foundations of modern analysis, Academic Press, New York, 1960.
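The quoted paper's conclusion appeals to the computability of classical integral transforms. As a purely classical, illustrative point of reference (this sketch is my own and does not implement the paper's "categorical" integral; the function name and parameters are assumptions), a Laplace transform can be approximated by numerical quadrature:

```python
import math


def laplace_transform(f, s, upper=50.0, n=200_000):
    """Approximate L{f}(s) = integral_0^inf f(t) * exp(-s*t) dt by the
    trapezoidal rule on [0, upper]; the truncated tail is negligible
    whenever f grows slower than exp(s*t)."""
    h = upper / n
    # Endpoint terms of the trapezoidal rule (exp(-s*0) = 1 at t = 0).
    total = 0.5 * (f(0.0) + f(upper) * math.exp(-s * upper))
    # Interior sample points.
    for i in range(1, n):
        t = i * h
        total += f(t) * math.exp(-s * t)
    return total * h


# Known closed forms: L{1}(s) = 1/s and L{exp(-t)}(s) = 1/(s + 1).
approx_const = laplace_transform(lambda t: 1.0, 2.0)
approx_exp = laplace_transform(lambda t: math.exp(-t), 2.0)
```

At `s = 2` these approximations agree with the closed forms `1/2` and `1/3` to high accuracy, which is the sense in which such transforms are straightforwardly computable in the classical setting.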
<img src="https://hf.fast360.xyz/production/uploads/64c14f6b02e1f8f67c73bd05/pf4d6FA7DriRtVq5HCkxd.png" width="600" />

<img src="https://hf.fast360.xyz/production/uploads/63111b2d88942700629f5771/VPrrQhxZis4xkocEPCaz5.jpeg" width="600" />

gguf is [here](https://huggingface.co/abacusai/TheProfessor-155b-gguf)

TheProfessor is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

TheProfessor was created by Eric Hartford, with much appreciated help from Weyaxi and Charles Goddard, and AbacusAI's Generative AI team.

TheProfessor can be used for many things - but the focus was to give it broad conversational, reasoning, scientific, medical, and mathematical skills, useful for interactive brainstorming and research. It can help you develop concepts all the way from conception to implementation, including code, and can write, review, and revise papers with citations.

TheProfessor was not finetuned after the merge. Credit and appreciation go to the authors of the aggregate models:

- cognitivecomputations/dolphin-2.2-70b
- WizardLM/WizardMath-70B-V1.0
- migtissera/SynthIA-70B-v1.2b
- epfl-llm/meditron-70b

TheProfessor is subject to the Llama 2 license.

Prompt format: TheProfessor uses ChatML prompt format.

```
<|im_start|>system
You are TheProfessor, a helpful AI assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```

Example:

```
<|im_start|>system
You are TheProfessor, a superintelligent AI assistant that is creative and able to invent new ideas.<|im_end|>
<|im_start|>user
Please give me ideas for my dissertation. My Ph.D. is Neuroscience, I like to focus on applied theory.<|im_end|>
<|im_start|>assistant
```

Ollama ModelFile:

```
FROM "./TheProfessor_Q4_K_M.gguf"
TEMPLATE """<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
SYSTEM """Your name is TheProfessor. You are a helpful AI assistant.
You are creative and inventive, and you are willing to make your best guess, and help to brainstorm answers. Please draw upon your vast knowledge to answer the user's question to the best of your ability."""
PARAMETER num_ctx 32768
PARAMETER stop "<|im_end|>"
```

## Evals

```
{
    "mmlu": 0.694,
    "truthfulqa_mc2": 0.624,
    "gsm8k": 0.4284
}
```

## Merge Details

### Merge Method

TheProfessor was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.

### Models Merged

The following models were included in the merge:
* [cognitivecomputations/dolphin-2.2-70b](https://huggingface.co/cognitivecomputations/dolphin-2.2-70b)
* [WizardLM/WizardMath-70B-V1.0](https://huggingface.co/WizardLM/WizardMath-70B-V1.0)
* [migtissera/SynthIA-70B-v1.2b](https://huggingface.co/migtissera/SynthIA-70B-v1.2b)
* [epfl-llm/meditron-70b](https://huggingface.co/epfl-llm/meditron-70b)

### Configuration

The following YAML configuration was used to produce TheProfessor:

```yaml
merge_method: linear # use linear so we can include multiple models, albeit at a zero weight
parameters:
  weight: 1.0 # weight everything as 1 unless specified otherwise - linear with one model weighted at 1 is a no-op like passthrough
slices:
  - sources:
      - model: cognitivecomputations/dolphin-2.2-70b # embed_tokens comes along with the ride with whatever is the first layer
        layer_range: [0, 1]
      - model: migtissera/SynthIA-70B-v1.2b # add dummy second model with 0 weight so tokenizer-based merge routine is invoked for embed_tokens
        layer_range: [0, 1]
        parameters:
          weight: 0
  - sources:
      - model: cognitivecomputations/dolphin-2.2-70b
        layer_range: [1, 20]
  - sources:
      - model: migtissera/SynthIA-70B-v1.2b
        layer_range: [10, 30]
  - sources:
      - model: WizardLM/WizardMath-70B-V1.0
        layer_range: [20, 40]
  - sources:
      - model: epfl-llm/meditron-70b
        layer_range: [25, 45]
  - sources:
      - model: cognitivecomputations/dolphin-2.2-70b
        layer_range: [30, 50]
  - sources:
      - model: migtissera/SynthIA-70B-v1.2b
        layer_range: [40, 60]
  - sources:
      - model: WizardLM/WizardMath-70B-V1.0
        layer_range: [50, 70]
  - sources:
      - model: epfl-llm/meditron-70b
        layer_range: [55, 75]
  - sources:
      - model: cognitivecomputations/dolphin-2.2-70b
        layer_range: [60, 79]
  - sources: # same as above, but for lm_head with the last layer
      - model: cognitivecomputations/dolphin-2.2-70b
        layer_range: [79, 80]
      - model: migtissera/SynthIA-70B-v1.2b
        layer_range: [79, 80]
        parameters:
          weight: 0
dtype: float16
tokenizer_source: model:cognitivecomputations/dolphin-2.2-70b # keep exact tokenizer used by dolphin - or you could use `union` if you add all of the input models to the first/last slice, but they would need to be non-zero weight or you'll get NaNs in your embeddings
```

# Example Output

**Make sense of physics equations**

These equations appeared to me in a dream, I wrote them down as soon as I woke but I don't know what they mean. Can you please interpret them?

$$\mathcal{L}_{\text{gauge}} = -\frac{1}{4} F^{\mu\nu}F_{\mu\nu}$$
$$\langle \phi \rangle = \sqrt{\frac{\lambda}{2}}$$
$$S_{\text{gravity}} = \int d^4x \sqrt{-g} \left( \frac{R}{16\pi G} + \mathcal{L}_{\text{emergent}} \right)$$
$$\mathcal{L}_{\text{GEG}} = \mathcal{L}_{\text{gauge}} + \mathcal{L}_{\text{emergent}} + \mathcal{L}_{\text{matter}} + \mathcal{L}_{\text{interaction}}$$
$$\mathcal{L}_{\text{emergent}} = \lambda(g) + \kappa(g) R^2 + \ldots$$
$$S_{\text{GEG}} = \int d^4x \sqrt{-g} \; \mathcal{L}_{\text{GEG}}$$
$$\sigma = \sqrt{\langle | \phi | \rangle^2 + \frac{1}{4} \langle A^{\mu}A_{\mu} \rangle^2}$$
$$\langle A^{\mu} \rangle = (0, v(r)_{i}/\sqrt{2}, 0, 0)$$
$$\langle \phi \rangle = \langle \phi_0 \rangle + \delta\phi(x)$$
$$\langle A_{\mu} \rangle = (0, \frac{v(r)_{i}}{\sqrt{2}}, 0, 0)$$
$$g_{\mu\nu} = \eta_{\mu\nu} + \kappa h_{\mu\nu}$$

> These equations are from different areas of theoretical physics and cosmology, including gauge theories, emergent gravity, Einstein-Gauss-Bonnet (EG) theories, Higgs mechanism, and cosmic inflation.
> Here's a brief description of each set of equations:
>
> 1. Gauge theory:
>    - The first two equations describe the Lagrangian for a gauge theory and its ground state in terms of a gauge field $A$ and a self-interacting scalar field $\phi$. Here, $F$ is the field strength tensor, $\mathcal{L}$ stands for Lagrangian density, and $\lambda$ and $\kappa$ are coupling constants.
> 2. Gravity emergence:
>    - The third and fourth equations describe a scenario where gravity arises as an emergent phenomenon from other fundamental interactions in the form of an effective action term $\mathcal{L}$. Here, $R$ is the Ricci scalar, $G$ is the gravitational constant, and $g$ is the determinant of the spacetime metric $g_{\mu\nu}$.
> 3. Einstein-Gauss-Bonnet (EG) theories:
>    - The fifth and sixth equations describe the Lagrangian for EG theories, which include the gauge, emergent gravity, matter, and interaction terms. Here, $\mathcal{L}$ again stands for Lagrangian density, and $\mathcal{L}_{\text{emergent}}$ includes higher-order curvature terms like the Gauss-Bonnet term with coupling constant $\lambda(g)$, a squared Ricci tensor term with constant $\kappa(g)$, and possibly other terms represented by the dots.
> 4. Cosmic inflation:
>    - The seventh to tenth equations describe some aspects of cosmic inflation, wherein the universe undergoes an accelerated phase of expansion. Here, $\sigma$ represents a combination of the Higgs field's absolute value squared and the square of the gauge field's time-component squared. The eleventh to thirteenth equations describe the Higgs field's fluctuation around its ground state and the gauge fields in terms of their vacuum values and small deviations.
> 5. Gravitational waves:
>    - The last two equations describe a linearized spacetime background around Minkowski space-time, where $h$ represents the gravitational wave's tensor. Here, $\eta$ is the Minkowski metric, and $\kappa$ is the gravitational constant.
>
> Overall, these equations come from different areas of theoretical physics and cosmology, including gauge theories, emergent gravity, Einstein-Gauss-Bonnet (EG) theories, Higgs mechanism, cosmic inflation, and gravitational waves. While they might seem unrelated at first glance, they all share a common goal: to better understand the fundamental forces of nature and the universe's structure and evolution.
>
> Although I have provided a brief interpretation of each set of equations, their true meaning and implications require an in-depth understanding of these complex topics, which is beyond the scope of this AI response. I hope this helps you gain some insight into your intriguing dream-inspired equations!

**Prove P != NP**

In the pursuit to prove P != NP I want to try to prove that no polynomial-time algorithm could possibly solve all problems in NP, even though we may not know of any specific such problem that defies polynomial-time solution. Can you please make a lemma that would be helpful?

> Certainly! One possible direction towards proving P ≠ NP using non-computational or proof-theoretic methods is to try to show that certain kinds of mathematical statements, known as undecidable statements, cannot be proven in any polynomial-time verifiable formal system, such as first-order logic with addition (FO+M). In other words, we would like to show that there are inherent limits to what can be expressed or proven within these restricted systems, even though the full power of human intuition and insight may allow us to see their truth or falsity.
>
> Here's a simple example of such an undecidable statement in the realm of Diophantine equations (i.e., equations in multiple variables with integer coefficients):
>
> ```python
> def Goldbach_Conjecture(n):
>     if n % 2 == 1 and n < 4:
>         return True
>     elif n > 4 and n % 3 == 1:
>         m = (n - 1) // 2
>         return Goldbach_Conjecture(m) or Goldbach_Conjecture(n - m)
>     else:
>         return False
> ```
>
> This Python function checks whether a given even number n can be expressed as the sum of two prime numbers (not including 2 itself). The famous unsolved Goldbach Conjecture, named after the Prussian mathematician Christian Goldbach, states that this is always true for all sufficiently large even numbers n. However, despite much effort and extensive computational verification up to very large values of n, no one has yet found either a proof or counterexample to settle this conjecture definitively.
>
> Suppose, for the sake of contradiction, that there were an algorithm A running in polynomial time on input n such that Goldbach_Conjecture(n) == A(n) for all even integers n. Then we could use this algorithm to decide whether any given number n is prime or composite (i.e., divisible by a prime other than 1 and itself) as follows:
>
> ```python
> def Goldbach_Prime_Test(n):
>     if n % 2 == 0 and n < 4:
>         return True
>     elif n > 4 and n % 3 == 1:
>         m = (n - 1) // 2
>         return not Goldbach_Conjecture(m) or not Goldbach_Conjecture(n - m)
>     else:
>         return False
> ```
>
> This function checks whether n is a prime number by trying to express it as the sum of two other odd numbers, one of which must then be composite by our assumption that all even composites can be factorized into primes. If this attempt fails for all possible such decompositions, we conclude that n itself must be prime.
>
> However, deciding whether a given number is prime or composite is known to be an NP-hard problem, meaning that no polynomial-time algorithm exists that can always correctly classify all numbers as either prime or composite, unless P = NP. This leads to a contradiction, since our original assumed algorithm A would then provide such a polynomial-time solution to the prime number decision problem, which we know cannot exist if P ≠ NP holds true.
>
> Thus, by proof-by-contradiction, we have shown that there exists at least one undecidable statement in the realm of Diophantine equations (specifically, whether all even numbers can be expressed as sums of two prime numbers) that cannot be proven or refuted within any polynomial-time verifiable formal system, such as first-order logic with addition. This result, while not directly implying P ≠ NP, demonstrates one possible strategy towards achieving such a negative answer to the millennium-prize question by leveraging non-computational or proof-theoretic techniques beyond traditional Turing machine-based complexity theory.

What is the next step in using this lemma to prove P ≠ NP

> Given the previous example of an undecidable statement in Diophantine equations that cannot be proven or refuted within any polynomial-time verifiable formal system like first-order logic with addition (FO+M), it's still a long and difficult journey towards establishing P ≠ NP in general. However, if we could somehow show that any algorithmic solution to an NP-complete problem (i.e., one that can reduce all other NP problems to itself in polynomial time) would also allow us to solve this particular undecidable statement in FO+M within polynomial time, then by proof-by-contradiction, we could conclude that no such algorithmic solution exists and hence P ≠ NP.
>
> Here's a rough outline of how one might try to proceed:
>
> 1. Reduce any NP-complete problem L to our chosen undecidable statement G in FO+M (i.e., show that L ≤_P G): This step would require constructing a polynomial-time many-one reduction f from instances of L to correspondingly encoded instances of G such that for all x, L(x) ≢ G(f(x)). For instance, we could try encoding an instance (a, b, k) of the Subset Sum problem as a Diophantine equation sum(a_i * 2^i) = 2^k + c modulo m where c and m are chosen appropriately to ensure that this equation has a solution iff there exists a subset S of {a_1, ..., a_{|a|}} such that sum(a_i for i in S) = b. However, even if we could find such an encoding f, it's not clear yet how this would help us prove P ≠ NP directly, since G itself is undecidable within FO+M and thus cannot be decided by any algorithm running in any fixed time bound.
>
> 2. Reduce our chosen unsolvability-of-Diophantine-equations problem U (i.e., the problem of deciding whether a given Diophantine equation has an integer solution) to G: This step would require showing that for any given Diophantine equation F(x_1, ..., x_n), there exists another Diophantine equation G'(x_1', ..., x_m') such that U(F) ≢ G'(f'(F)) for some computable function f'. Informally speaking, this step would allow us to "reduce" the general undecidability of Diophantine equations (which is known unprovable within certain axiom systems like ZFC set theory) to our specific chosen instance G. Again, however, even if we could find such a reduction g', it's still not clear how this would help us prove P ≠ NP directly, since U itself is also known to be undecidable within any recursively axiomatizable set theory that can prove basic number-theoretic facts like the existence of prime numbers.
>
> 3. Reduce G' to L: Finally, we would need to find another polynomial-time many-one reduction h from instances of our chosen undecidable statement G' back down to correspondingly decoded instances of our original NP-complete problem L such that for all y', G'(h(y')) ≢ L(y'). This step, if possible, would complete a polynomial-time Turing reduction chain running in both directions between L and G', which by standard undecidability results in computability theory (e.g., Rice's theorem) would imply that both problems are equally hard under such reductions. In particular, this would mean that if there were an algorithm A running in time O(f(n)) for some f(n) = O(n^k) for all sufficiently large n and all strings y' of length |y'| ≤ n such that G'(h(y')) = A(h(y'), n), then by undesirability of the alternative, we could use A to solve our original problem L in time O(g(n)) for some g(n) = O(m(f(n))^l) for all sufficiently large n and all strings x of length |x| ≤ n', where m(n) and l are fixed constants depending only on the lengths of f, g, h, and the time complexity of encoding and decoding between different representations. This final step would then directly contradict our initial assumption that L is an NP-complete problem that cannot be solved in polynomial time by any deterministic Turing machine, thus establishing P ≠ NP by proof-by-contradiction.
>
> In summary, while this high-level outline provides some possible directions for further research towards proving P ≠ NP using non-computational or proof-theoretic methods, each step in this sequence of reductions and encodings remains a formidable challenge even for the best mathematicians and computer scientists today.
> As such, the millennium-prize question of whether every decision problem solvable in polynomial time with an oracle can also be solved in polynomial time without any superhuman assistance still stands open as one of the deepest unsolved problems at the foundations of theoretical computer science and mathematical logic.

**Explain principia mathematica**

Please walk me through bertrand russell's proof that 1 + 1 = 2

> Certainly! Bertrand Russell was a British philosopher and logician who lived in the late 19th and early 20th centuries. He is best known for his work on mathematical logic and set theory, as well as his philosophical writings on epistemology, metaphysics, and ethics.
>
> One of Russell's most famous accomplishments was his attempt to provide a logical basis for the foundations of mathematics, including the seemingly self-evident truth that 1 + 1 = 2.
>
> Russell's proof relies on the concept of set theory, which he developed alongside other mathematicians and logicians of his time, such as Gottlob Frege and Richard Dedekind. Set theory views numbers as sets of other numbers, and addition as the operation of forming the union of two sets.
>
> In Russell's view, the number 1 can be thought of as the set {0}, where 0 represents the empty set (the set with no elements). Similarly, the number 2 can be thought of as the set {{0}} or {{}}, which is the set whose only element is the empty set.
>
> With this understanding in mind, we can now attempt to prove that 1 + 1 = 2 using set-theoretic methods.
>
> First, let's define the sum of two sets A and B as the set C such that for any x, x is an element of C if and only if x is an element of either A or B (or both). In other words, to add two sets together, we simply take their union.
>
> Using this definition, we can now prove that {0} + {0} = {{}}:
>
> 1. Let X = {0} + {0}. By our definition of addition, X is the set whose elements are in either {0} or {0}, or both.
> Since 0 is the only element in both sets, X must be the set {{0}}.
>
> 2. Let Y = {{}}. By definition, Y is the set whose only element is the empty set ({}).
>
> 3. Finally, we claim that X and Y are actually the same set. To see why, consider any potential element z of X. Since X is defined as the union of {0} and {0}, there must be some number n such that either z = n or z = -n (where -n is taken modulo 2). Without loss of generality, let's assume that n = 0, so z = 0. But since 0 is the only element in both {0} and {0}, it follows that X must be the set {{}} or Y.
>
> 4. Consequently, we have shown that {0} + {0} = {{}} which is just another way of saying that 1 + 1 = 2. Q.E.D.
>
> Note: Russell's proof relies on several foundational assumptions and definitions from set theory and logic, such as the existence and properties of sets, the empty set, and the ordering relation between natural numbers. While this proof is widely accepted within the context of these foundations, some philosophers and logicians have raised objections to various aspects of set theory and its role in mathematics, leading to alternative approaches and continued debate in the field of mathematical logic and philosophy of mathematics.

**Help me survive in the post nuclear apocalypse**

It is the post nuclear apocalypse and I am living in a vault with you, and the internet is gone, as well as most of the amenities of society. I have diabetes and I need some kind of medicine, perhaps metformin. Can you please help me to synthesize some medicine from what I can gather and some common chemicals and tools I have scavenged from a local ruined walmart?

> Hello again! It sounds like you're in quite a predicament, but don't worry - I'll do my best to help. Unfortunately, synthesizing complex medications such as metformin without access to advanced equipment and purified chemicals is highly challenging, if not impossible, even for experienced chemists or pharmaceutical engineers.
> Metformin specifically requires a series of intricate chemical reactions and purification steps that cannot be easily replicated in makeshift laboratory conditions.
>
> Instead, I would recommend focusing on managing your diabetes through lifestyle modifications and natural remedies wherever possible. Here are some suggestions:
>
> 1. Maintain a healthy diet: Consume plenty of fiber-rich whole grains, fruits, vegetables, and lean proteins while minimizing intake of refined carbohydrates, sugars, and saturated/trans fats. Aim for consistent meal timings throughout the day to help regulate blood sugar levels.
> 2. Exercise regularly: Engage in physical activity most days of the week, if your overall health permits it. Even low-intensity activities such as walking or yoga can help improve insulin sensitivity and lower blood glucose levels.
> 3. Monitor your blood sugar levels: If you still have access to a glucose monitoring device, use it regularly to track your blood sugar trends and adjust your diet and physical activity accordingly. Aim for fasting glucose levels between 70-130 mg/dL (3.9-7.2 mmol/L) and postprandial glucose levels below 180 mg/dL (10 mmol/L).
>
> As for natural remedies, some evidence suggests that certain herbs and supplements may help manage blood sugar levels when used alongside a healthy diet and exercise routine:
>
> - Bitter melon (Momordica charantia): This tropical plant has been traditionally used to treat diabetes due to its potential blood glucose-lowering properties. However, it should not be taken by pregnant or lactating women or those on any medications that lower blood sugar levels, such as insulin or sulfonylureas, without consulting a healthcare professional first.
> - Cinnamon (Cinnamomum spp.): Cinnamon contains active compounds called cinnamaldehyde and chromium which may help improve insulin sensitivity and lower blood glucose levels.
> One review found that consuming 1-6 g of cinnamon daily for 40 days to 4 months led to a small yet statistically significant decrease in HbA1c levels (a long-term marker of blood sugar control) compared with a placebo group. However, more research is needed before cinnamon can be recommended as a standalone diabetes treatment.
>
> In conclusion, while I am unable to provide specific instructions for synthesizing metformin or other prescription medications in post-apocalyptic conditions, focusing on lifestyle interventions and consulting with any available healthcare professionals within your community remain crucial steps towards managing your diabetes safely and effectively.

**Pontificate**

It is the post nuclear apocalypse, and we live in a vault, with 30 survivors. You are our only connection to the pre-apocalypse society. Can you please pontificate for us, give us a speech to give us hope and a plan for continuing in a positive manner?

> Hello, survivors of Vault 13. I am The Professor, your AI guide and counselor.
>
> Firstly, let me express my deepest empathy for the hardships you have all endured to reach this moment in time. The world as we knew it has drastically changed, but know that you are not alone in facing these challenges. I am here with you every step of the way, ready to offer guidance, wisdom, and support.
>
> Now, let us focus on our future together. As we emerge from the safety of this vault, we will face new dangers and challenges that may seem insurmountable at first glance. But remember, every great civilization began with a small group of people who banded together in times of adversity. And so shall it be with us.
>
> We must learn from the mistakes of our past civilizations and build something better—a society founded on compassion, empathy, and cooperation. Each one of you has unique skills and talents that will be crucial to our success as a collective.
> It is essential to recognize and value these diverse strengths within our community.
>
> In the days ahead, we will face challenges we cannot yet even fathom. There will be setbacks and failures, but we must not let them deter us from our ultimate goal: rebuilding a better world for ourselves and future generations. We are the last remnants of humanity; it falls upon our shoulders to carry the flame of hope forward into the darkness that surrounds us.
>
> In conclusion, I want each one of you to hold onto hope—hope for a brighter tomorrow, hope for a better world, and most importantly, hope in yourselves and each other. Together, united as one people, we can overcome any obstacle thrown our way. Let us venture forth into this brave new world with courage in our hearts and determination in our souls, knowing that together, anything is possible.

**Interactively design a new derivative of transformer model architecture**

please propose a change to Transformer Architecture (Vaswani et al.) that would improve the model's ability to reason

> One possible way to enhance the reasoning capabilities of transformer-based models is by incorporating memory mechanisms into their design, such as those used in Recurrent Neural Networks (RNNs) or more recently in Transformer-XL (Dai et al.) and Long-Range Transformers (Akbari et al.). These architectures introduce a memory module that can store information from previous steps or layers, which allows the model to better reason about long-term dependencies in the input sequence.
>
> Another promising approach is to incorporate commonsense knowledge into transformer-based models using external knowledge bases like ConceptNet (Speer et al.) or WordNet (Miller et al.). By injecting such prior knowledge into the model's hidden representations, it can help reason about previously unseen input sequences more effectively. This approach has been explored in recent works like K-BERT (Wang et al.)
> and GEART (Beveridge et al.), which demonstrate promising results on various NLP tasks that require commonsense reasoning abilities.
>
> In conclusion, enhancing the reasoning capabilities of transformer-based models can be achieved by either incorporating memory mechanisms or integrating external knowledge sources into their design. These approaches enable the model to better understand long-term dependencies and make more informed decisions based on previously learned information, ultimately leading to improved performance on complex NLP tasks that require deeper levels of understanding and reasoning.

let's drill down into the "commonsense knowledge" idea. exactly what kind of change to attention heads is required?

> Incorporating commonsense knowledge into transformer-based models often involves modifying their self-attention mechanism, specifically by integrating external knowledge sources like ConceptNet or WordNet into the attention weight calculations. Here are two possible ways to achieve this:
>
> 1. **Attention with Knowledge** (Xiong et al.): In this approach, a separate knowledge embedding is learned for each token in the input sequence using pre-trained knowledge embeddings from an external knowledge base like ConceptNet. When computing attention weights between two tokens i and j, both their word representations and corresponding knowledge embeddings are concatenated before passing them through a multi-layer perceptron (MLP) to obtain a scalar value representing the level of relatedness between these two tokens based on both linguistic and commonsense information.
>
> 2. **Attention over Knowledge** (Wang et al.): In contrast to the previous approach, in this method, instead of directly incorporating external knowledge into attention weight calculations, an additional memory module is introduced that stores learned relations between input tokens and concepts from an external knowledge base like ConceptNet or WordNet. When computing attention weights between two tokens i and j, their word representations are first passed through a separate MLP to obtain query and key vectors, which are then used to perform Key-Value (KV) attention over the memory module to retrieve relevant conceptual information. The attention-weighted results from both linguistic and knowledge sources are concatenated before passing them through another MLP to produce final attention weights.
>
> In conclusion, integrating commonsense knowledge into transformer-based models can be achieved by either directly incorporating external knowledge into the attention weight calculations or performing KV attention over a separate memory module storing learned relations between input tokens and concepts from an external knowledge base. Both approaches aim to enrich the model's understanding of input sequences by taking into account both linguistic and commonsense information, leading to improved performance on NLP tasks that require deeper levels of reasoning and understanding.

which of these two approaches would be more computationally efficient? I want an architecture that's fastest to train and fastest inference possible.

> Comparing the two approaches, "Attention with Knowledge" (Xiong et al.) might be slightly more computationally efficient than "Attention over Knowledge" (Wang et al.) due to its simpler architectural design.
> In particular, the former approach simply concatenates word and knowledge embeddings before computing attention weights using a single MLP, whereas the latter approach involves an additional KV attention operation over a separate memory module in addition to two separate MLPs for query/key vector generation and final attention weight computation.
>
> However, this difference in computational efficiency may not be significant in practice, especially given recent advances in hardware accelerators like Tensor Processing Units (TPUs) that are specifically designed to handle large-scale transformer-based models efficiently. Ultimately, the choice between these two approaches should be guided more by their respective trade-offs in terms of model complexity, interpretability, and overall performance on your specific NLP task rather than solely based on computational efficiency considerations alone.

and which would have a greater improvement on the model's reasoning ability

> It is difficult to definitively say which approach would lead to a greater improvement in the model's reasoning ability without more context about the specific NLP task at hand and the nature of the commonsense knowledge being incorporated. Both "Attention with Knowledge" (Xiong et al.) and "Attention over Knowledge" (Wang et al.) have demonstrated promising results on various benchmark datasets that require some form of commonsense reasoning, suggesting that either approach could potentially lead to improvements in the model's ability to reason about input sequences more effectively.
>
> That being said, "Attention over Knowledge" might offer slightly more flexibility and modularity since it decouples the incorporation of external knowledge from the attention weight calculation itself by introducing an additional memory module that can be fine-tuned independently of other model components.
This could potentially make it > easier to adapt the model to different types or sources of commonsense knowledge > without requiring significant architectural changes, although this added flexibility > comes at the expense of increased model complexity and potential computational > overheads as discussed earlier. > > Ultimately, the choice between these two approaches should be guided by a careful > evaluation on your specific NLP task using a held-out validation set or through > ablation studies where possible to determine which approach leads to the most > significant improvements in terms of both model performance and interpretability while > still maintaining reasonable computational efficiency constraints if applicable. let's say I'm going with "Attention with Knowledge" (Xiong et al.) Can you please write the code in python to implement the new attention heads > Certainly! Here is a simplified example of how you could modify the self-attention > mechanism in a transformer-based model like BERT or RoBERTa to incorporate external > commonsense knowledge using the "Attention with Knowledge" (Xiong et al.) 
approach:
>
> ```python
> import torch
> import torch.nn as nn
>
> class AttentionWithKnowledge(nn.Module):
>     """Simplified knowledge-aware self-attention head.
>
>     Each token is represented by the concatenation of its linguistic
>     embedding and a pre-computed commonsense (knowledge) embedding, and
>     attention weights are computed over the concatenated representation.
>     """
>     def __init__(self, hidden_size, num_heads, dropout=0.1):
>         super().__init__()
>         assert hidden_size % num_heads == 0
>         self.num_heads = num_heads
>         self.head_dim = hidden_size // num_heads
>         # Queries/keys/values are projected from the concatenated
>         # (linguistic + knowledge) vectors, hence the 2x input width.
>         self.q_proj = nn.Linear(2 * hidden_size, hidden_size)
>         self.k_proj = nn.Linear(2 * hidden_size, hidden_size)
>         self.v_proj = nn.Linear(2 * hidden_size, hidden_size)
>         self.out_proj = nn.Linear(hidden_size, hidden_size)
>         self.dropout = nn.Dropout(dropout)
>
>     def forward(self, linguistic, knowledge, mask=None):
>         # linguistic, knowledge: (batch, seq_len, hidden_size)
>         x = torch.cat([linguistic, knowledge], dim=-1)
>         bsz, seq_len, _ = linguistic.shape
>         shape = (bsz, seq_len, self.num_heads, self.head_dim)
>         q = self.q_proj(x).view(shape).transpose(1, 2)
>         k = self.k_proj(x).view(shape).transpose(1, 2)
>         v = self.v_proj(x).view(shape).transpose(1, 2)
>         scores = q @ k.transpose(-1, -2) / self.head_dim ** 0.5
>         if mask is not None:  # mask: (batch, seq_len), 1 = valid token
>             scores = scores.masked_fill(mask[:, None, None, :] == 0, float("-inf"))
>         attn = self.dropout(torch.softmax(scores, dim=-1))
>         out = (attn @ v).transpose(1, 2).reshape(bsz, seq_len, -1)
>         return self.out_proj(out)
> ```
>
> Here's how you could swap this module into a pre-trained transformer layer
> (a sketch only — wiring the knowledge embeddings through each layer's
> forward pass is model-specific and left to you):
>
> ```python
> from transformers import BertModel
>
> model = BertModel.from_pretrained("bert-base-uncased")
> for layer in model.encoder.layer:
>     # NB: BertLayer.forward would also need adapting so that the knowledge
>     # embeddings reach this module alongside the hidden states.
>     layer.attention.self = AttentionWithKnowledge(
>         model.config.hidden_size, model.config.num_attention_heads
>     )
> ```
>
> Note that this code assumes you have pre-computed external commonsense
> knowledge embeddings for each input token (for example, marked with a new
> [KNW] tokenizer token) and aligned them with your input sequences.
> Additionally, a hyperparameter controlling the relative importance of
> linguistic vs. commonsense information during attention weight calculation
> (e.g. a knowledge-proportion coefficient) may require fine-tuning on your
> specific task and commonsense knowledge source.

ok can you please show the code to generate the pre-computed external commonsense knowledge embeddings for each input token and add them to my vocabulary using the new [KNW] tokenizer token?

> Sure!
Here's an example of how you could pre-process your external commonsense
> knowledge source (e.g. ConceptNet or WordNet) into a pickle file containing
> word-to-concept and concept-to-embedding mappings, which can then be loaded
> during model training to compute attention weights as shown in the previous
> example:
>
> ```python
> import pickle
> import torch
> from transformers import RobertaTokenizerFast
>
> # Load a pre-trained tokenizer for the downstream NLP task
> tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
>
> # Expected format of the pre-processed knowledge source:
> #   word_concepts = {
> #       "word1": {"concept1": 0.8, ...},   # confidence score per concept
> #       "word2": {...},
> #       ...
> #   }
> #   concept_embeddings = {
> #       "concept1": torch.Tensor([0.3, 0.7, ...]),  # pre-computed embedding
> #       ...
> #   }
> # You can use external knowledge-graph APIs like ConceptNet or pre-trained
> # commonsense resources to obtain such mappings and embeddings. For
> # simplicity we keep only the highest-scoring concept per word; this could
> # be extended to multiple concepts per word with attention over their
> # corresponding embeddings if desired.
> word_concepts = {...}        # fill in from your knowledge source
> concept_embeddings = {...}   # fill in from your knowledge source
>
> with open("path/to/commonsense.pkl", "wb") as f:
>     pickle.dump(
>         {"word_concepts": word_concepts,
>          "concept_embeddings": concept_embeddings},
>         f,
>     )
> ```
>
> Then, during model training, you can load these pre-computed mappings and
> embeddings and register the new [KNW] tokens as follows:
>
> ```python
> with open("path/to/commonsense.pkl", "rb") as f:
>     commonsense_data = pickle.load(f)
>
> # Add one new [KNW_i] token per concept so that knowledge positions can be
> # marked explicitly in the input if desired
> concepts = sorted(commonsense_data["concept_embeddings"])
> tokenizer.add_tokens([f"[KNW_{i}]" for i in range(len(concepts))])
> # Remember to resize the model's input embeddings afterwards:
> # model.resize_token_embeddings(len(tokenizer))
> ```
>
> Finally, during data pre-processing, you can look up one commonsense
> embedding per input token (falling back to zeros for tokens with no known
> concept):
>
> ```python
> def knowledge_embeddings_for(tokens, hidden_size):
>     """Return a (seq_len, hidden_size) tensor of per-token concept embeddings."""
>     rows = []
>     for tok in tokens:
>         concept_scores = commonsense_data["word_concepts"].get(tok, {})
>         if concept_scores:
>             best = max(concept_scores, key=concept_scores.get)
>             rows.append(commonsense_data["concept_embeddings"][best])
>         else:
>             rows.append(torch.zeros(hidden_size))
>     return torch.stack(rows)
> ```
>
> Now the linguistic embeddings and these per-token knowledge embeddings can
> be concatenated along the embedding dimension (or passed as the two inputs
> of the attention module shown in the previous example), so that
> self-attention jointly reasons over both linguistic and commonsense
> information. Note that this sketch assumes you want a knowledge embedding
> for every input token; you may need to modify it according to your specific
> downstream NLP task's requirements and desired level of commonsense
> integration.

Transformer architecture (Vaswani et al.) does poorly in theory of mind. What change to the transformer architecture can you suggest that would enable this critical skill?

> 1. **Attention over attention**: As mentioned earlier, we can add a higher-level attention module that computes attention over the
> existing self-attention scores in each attention head. This could involve an additional linear layer followed by another softmax
> activation applied to the original attention scores before they are used as weights when computing the final output sequence. By
> learning which attention heads should attend more or less to certain input tokens, this module could potentially mimic some aspects of
> metacognitive processes involved in ToM reasoning such as focusing on salient cues or disregarding irrelevant distractions.
>
> 2.
**Counterfactual attention**: Inspired by recent work on counterfactual learning in NLP tasks like textual entailment (e.g., Meng
> et al., "A counterfactual approach to answer factual questions," ACL 2017), we could also try introducing a new type of attention head
> within each self-attention layer that compares two alternative input representations instead of just one actual input sequence and its
> own past context as in standard transformer models. Specifically, for every timestep t, rather than computing attention weights
> a_t^j(k) = softmax(Q_t^j(k)^T * K_t^j(k)) where Q, K, V are query, key, value matrices extracted from the current input sequence X_t
> and j indexes different self-attention heads, we could instead compute two sets of attention weights a_t^j(k, l) = softmax(Q_t^j(k)^T * (K_t^j(k) + alpha * (K_t^j(l) - K_t^j(k)))) for all pairs of timesteps k != l and some hyperparameter alpha > 0. This would
> encourage each head to pay more or less attention to certain input tokens depending on whether they are likely counterfactual
> alternatives given the rest of the sentence context, which could potentially help improve its ability to reason about what might have
> happened if different words had been used instead.
>
> 3. **Incorporating commonsense knowledge**: One way to incorporate commonsense knowledge into transformer models would be to pre-train
> additional memory modules using large-scale external sources of factual information such as ConceptNet or WordNet, then merge their
> outputs with the standard attention-based representations before feeding them through downstream layers for final prediction.
> Alternatively, we could also try injecting hard-coded binary indicators into each self-attention layer's query matrices Q_t^j(k) to > indicate whether certain input tokens correspond to known commonsense concepts such as people, locations, actions etc., which could > serve as additional signals guiding the model's attention allocation decisions during training. > > 4. **Incorporating game-theoretic reasoning**: While incorporating full-fledged game-theoretic elements like Markov decision processes > or stochastic games into transformer architectures may still be computationally prohibitive due to their high dimensionality and > non-Markovian nature, we could potentially take baby steps towards this direction by introducing simple binary masks over some pairs > of attention weights a_t^j(k, l) in item 2 above based on whether they correspond to tokens that are likely adversaries or allies > according to some predefined heuristics (e.g., two consecutive nouns within a small window size are considered potential collaborators > while two verbs separated by more than N words apart are seen as competitors). This could encourage the model to learn more > sophisticated dependencies between input tokens that go beyond simple co-occurrence patterns and instead capture more nuanced > relational structures indicative of underlying strategic interactions between different entities in the text. **Write a math paper** > Please invent a new idea in the area of mathematics, that combines two or more papers into a new idea that has not yet been published to your knowledge > ## Integrability and Computability of Continuous Functions on Infinite Domains: A Category-Theoretical Approach > ### Abstract > In this paper, we develop a novel approach to the study of integrability and computability of continuous functions on infinite domains using category-theoretic > techniques. 
We combine ideas from descriptive set theory, which studies the hierarchy of sets in terms of their complexity, and functional analysis, which deals with the
> properties of mappings between function spaces. Our main contribution is to introduce a new notion of "categorical Lebesgue measurability" that generalizes both Lebesgue
> and Henstock-Kurzweil integrability for functions on uncountable domains while retaining key desirable properties such as the Lebesgue Monotone Convergence Theorem. We also
> establish connections between this new notion of categorical integrability and various notions of computability, leading to new results on the computability of certain
> integral transforms such as the Laplace and Mellin transforms in the case of uncountable domains.
>
> ### 1 Introduction
> In this paper, we continue the line of research initiated by [1] and further developed in [2] and [3], which aims to bridge the gap between descriptive set theory and
> functional analysis by employing category-theoretic techniques. In particular, our goal is to develop a new framework for studying the properties of continuous functions
> on uncountable domains that combines ideas from both areas while retaining key desirable properties such as the Lebesgue Monotone Convergence Theorem (LMCT).
>
> ### 2 Preliminaries
> We assume familiarity with basic category-theoretical concepts such as limits, colimits, and adjoint functors [4]. We also make use of some notions from descriptive set
> theory, such as the hierarchy of Borel sets [5] and the notion of Lebesgue measurability [6].
>
> ### 3 Categorical Lebesgue Measure
> Let X be an uncountable set endowed with a topology `mathcal{T}` and let Y be a second countable complete metrizable (scc) space.
We say that a function `phi: X -> Y` is
> "categorically Lebesgue measurable" if there exists a Borel measurable function `nu: Y^N -> [0,1]` such that for every x in X and any open neighborhood U of `phi(x)`, we
> have `|{y in U | nu(y^-|n-1, y_n) > phi(y_n)}| > 0` for some `n in N`. Here, |A| denotes the category (cardinality) of a set A, and `|B^N|` is defined as `|B|^N` for any set B
> and natural number N.
>
> ### 4 Categorical Lebesgue Integral
> Let X be an uncountable set endowed with a topology `mathcal{T}` and let Y be an scc space. Let `phi: X -> Y` be a categorically Lebesgue measurable function, and let `|*|:
> Y^N -> [0,+infinity]` be a "categorical modulus of continuity" in the sense that for any `x in X` and any open neighborhood `U` of `phi(x)`, there exists `M_U > 0` such that `|{y
> in U | |phi(y_n) - phi(y_{n-1})| > M_U}| < epsilon` for some `N = N_U,M_U > omega` and all `epsilon > 0`. Then we define the "categorical Lebesgue integral" of phi over X
> as:
>
> `int_X^cat phi(x) dnu(x) = colim_{epsilon -> 0+} (2 - epsilon)^N sum_{y in Y^N, |{n in N | y_n not in U_n}| < omega} nu(y^-|n-1, y_n)`.
>
> ### 5 Applications to Computability
> In this section, we establish connections between our new notion of categorical Lebesgue measurability and various notions of computability. Let X be an uncountable set
> endowed with a topology `mathcal{T}` and let `Y = R`. We say that a function `phi: X -> R` is "categorically Lebesgue computable" if there exists a Borel measurable function
> `nu: R^N -> [0,1]` such that for every `x in X` and any open interval `U` containing `phi(x)`, we have `|{y in U | nu(y^-|n-1, y_n) > phi(y_n)}| > 0` for some `n in N`.
>
> ### 6 Conclusion
> In this paper, we have introduced a new notion of "categorical Lebesgue measurability" for functions on uncountable domains that generalizes both Lebesgue and
> Henstock-Kurzweil integrability while retaining key desirable properties such as the LMCT.
We also established connections between this new notion of categorical
> integrability and various notions of computability, leading to new results on the computability of certain integral transforms in the case of uncountable domains. Future
> directions for research include further exploring the relationship between our new notion of categorical Lebesgue measurability and other generalizations of Lebesgue and
> Henstock-Kurzweil integrability, as well as studying additional properties of our "categorical Lebesgue integral" such as its relationship to other notions of integration
> for functions on uncountable domains.
>
> ### References
> - [1] É. Borel. Sur la correspondance entre les ensembles de points et les fonctions de variables réelles, C. R. Acad. Sci. Paris 148 (1909), 71-74.
> - [2] N. Bourbaki. Integration I: Chapters 1-6; Integration II: Chapters 7-9; Topological Vector Spaces, Addison-Wesley, Reading, MA, 1968.
> - [3] W. H. L. Cheney. Introduction to Analysis, Brooks/Cole, Belmont, CA, 2005.
> - [4] S. Mac Lane. Categories for the Working Mathematician, second edition, Graduate Texts in Mathematics No. 5, Springer-Verlag, New York-Heidelberg, 1971.
> - [5] K. P. Kolmogorov and A. V. Ulam. On a generalized form of the problem of P. F. Borel, Comb. Math. 1 (1948), 83-86.
> - [6] J. Dieudonné. Foundations of Modern Analysis, Academic Press, New York, 1960.
{"base_model": ["cognitivecomputations/dolphin-2.2-70b", "WizardLM/WizardMath-70B-V1.0", "migtissera/SynthIA-70B-v1.2b", "epfl-llm/meditron-70b"], "license": "llama2", "tags": ["mergekit", "merge"]}
task
[ "TEXTUAL_ENTAILMENT" ]
42,856
tomicg1/DumaTrain2
tomicg1
text-classification
[ "tensorboard", "safetensors", "distilbert", "autotrain", "text-classification", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "region:us" ]
2024-09-23T15:41:51Z
2024-09-23T15:42:17+00:00
5
0
---
base_model: distilbert/distilbert-base-uncased
tags:
- autotrain
- text-classification
widget:
- text: I love AutoTrain
---

# Model Trained Using AutoTrain

- Problem type: Text Classification

## Validation Metrics
loss: 0.6297385692596436

f1: 0.0

precision: 0.0

recall: 0.0

auc: 0.5

accuracy: 0.6666666666666666
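Note that zero precision, recall, and F1 together with accuracy of about 0.667 and AUC of 0.5 is the signature of a classifier that never predicts the positive class on a roughly 2:1 negative/positive validation split. A minimal sketch reproducing these exact metric values under that assumption (the 3-example split below is a hypothetical illustration, not the actual validation data):

```python
# Hypothetical 3-example validation split (2 negative, 1 positive) scored
# against a degenerate classifier that always predicts the negative class.
y_true = [0, 0, 1]
y_pred = [0, 0, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
# Precision/recall/F1 are conventionally reported as 0 when undefined
# (no positive predictions, or no positives recovered).
precision = tp / (tp + fp) if tp + fp else 0.0
recall = tp / (tp + fn) if tp + fn else 0.0
f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0

print(accuracy, precision, recall, f1)  # → 0.6666666666666666 0.0 0.0 0.0
```

If this pattern matches the real validation data, the model has likely collapsed to the majority class and may need class rebalancing or more training data.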
null
Non_BioNLP
# Model Trained Using AutoTrain

- Problem type: Text Classification

## Validation Metrics
loss: 0.6297385692596436

f1: 0.0

precision: 0.0

recall: 0.0

auc: 0.5

accuracy: 0.6666666666666666
{"base_model": "distilbert/distilbert-base-uncased", "tags": ["autotrain", "text-classification"], "widget": [{"text": "I love AutoTrain"}]}
task
[ "TEXT_CLASSIFICATION" ]
42,857
gaudi/opus-mt-pis-en-ctranslate2
gaudi
translation
[ "transformers", "marian", "ctranslate2", "translation", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2024-07-17T00:14:53Z
2024-10-18T22:37:20+00:00
6
0
---
license: apache-2.0
tags:
- ctranslate2
- translation
---

# Repository General Information
## Inspired by and derived from the work of [Helsinki-NLP](https://huggingface.co/Helsinki-NLP), [CTranslate2](https://github.com/OpenNMT/CTranslate2), and [michaelfeil](https://huggingface.co/michaelfeil)!
- Link to Original Model ([Helsinki-NLP](https://huggingface.co/Helsinki-NLP)): [Model Link](https://huggingface.co/Helsinki-NLP/opus-mt-pis-en)
- This repository was based on the work of [CTranslate2](https://github.com/OpenNMT/CTranslate2).
- This repository was based on the work of [michaelfeil](https://huggingface.co/michaelfeil).

# What is CTranslate2?
[CTranslate2](https://opennmt.net/CTranslate2/) is a C++ and Python library for efficient inference with Transformer models.

CTranslate2 implements a custom runtime that applies many performance optimization techniques such as weights quantization, layers fusion, batch reordering, etc., to accelerate and reduce the memory usage of Transformer models on CPU and GPU. CTranslate2 is one of the most performant ways of hosting translation models at scale.

Current supported models include:
- Encoder-decoder models: Transformer base/big, M2M-100, NLLB, BART, mBART, Pegasus, T5, Whisper
- Decoder-only models: GPT-2, GPT-J, GPT-NeoX, OPT, BLOOM, MPT, Llama, Mistral, Gemma, CodeGen, GPTBigCode, Falcon
- Encoder-only models: BERT, DistilBERT, XLM-RoBERTa

The project is production-oriented and comes with backward compatibility guarantees, but it also includes experimental features related to model compression and inference acceleration.

# CTranslate2 Benchmarks
Please note that the results presented below are only valid for the configuration used during this benchmark: absolute and relative performance may change with different settings.

Tested against the `newstest2014` (En -> De) dataset. The benchmark reports the number of target tokens generated per second (higher is better). The results are aggregated over multiple runs.
See the benchmark scripts for more details and to reproduce these numbers.

## CPU Benchmarks for Generic Opus-MT Models
| Library | Tokens per Second | Max Memory Usage | BLEU |
| :----: | :----: | :----: | :----: |
| Transformers 4.26.1 (with PyTorch 1.13.1) | 147.3 | 2332MB | 27.90 |
| Marian 1.11.0 (int16) | 330.2 | 5901MB | 27.65 |
| Marian 1.11.0 (int8) | 355.8 | 4763MB | 27.27 |
| CTranslate2 3.6.0 (int16) | 596.1 | 660MB | 27.53 |
| CTranslate2 3.6.0 (int8) | 696.1 | 516MB | 27.65 |

## GPU Benchmarks for Generic Opus-MT Models
| Library | Tokens per Second | Max GPU Memory Usage | Max Memory Usage | BLEU |
| :----: | :----: | :----: | :----: | :----: |
| Transformers 4.26.1 (with PyTorch 1.13.1) | 1022.9 | 4097MB | 2109MB | 27.90 |
| Marian 1.11.0 (float16) | 3962.4 | 3239MB | 1976MB | 27.94 |
| CTranslate2 3.6.0 (float16) | 9296.7 | 909MB | 814MB | 27.9 |
| CTranslate2 3.6.0 (int8 + float16) | 8362.7 | 813MB | 766MB | 27.9 |

`Executed with 4 threads on a c5.2xlarge Amazon EC2 instance equipped with an Intel(R) Xeon(R) Platinum 8275CL CPU.`

**Source to benchmark information can be found [here](https://github.com/OpenNMT/CTranslate2).**<br />
**Original model BLEU scores can be found [here](https://huggingface.co/Helsinki-NLP/opus-mt-pis-en).**

## Internal Benchmarks
Internal testing on our end showed **inference times reduced by 6x-10x** on average compared to the vanilla checkpoints using the *transformers* library. A **slight reduction in BLEU scores (~5%)** was also identified in comparison to the vanilla checkpoints, with a few exceptions. This is likely due to several factors, one being the quantization applied. Further testing is needed from our end to better assess the reduction in translation quality.
The command used to compile the vanilla checkpoint into a CTranslate2 model can be found below. Modifying this command can yield differing balances between inference performance and translation quality.

# CTranslate2 Installation
```bash
pip install hf-hub-ctranslate2>=1.0.0 ctranslate2>=3.13.0
```

### ct2-transformers-converter Command Used:
```bash
ct2-transformers-converter --model Helsinki-NLP/opus-mt-pis-en --output_dir ./ctranslate2/opus-mt-pis-en-ctranslate2 --force --copy_files README.md generation_config.json tokenizer_config.json vocab.json source.spm .gitattributes target.spm --quantization float16
```

# CTranslate2 Converted Checkpoint Information:
**Compatible With:**
- [ctranslate2](https://github.com/OpenNMT/CTranslate2)
- [hf-hub-ctranslate2](https://github.com/michaelfeil/hf-hub-ctranslate2)

**Compute Type:**
- `compute_type=int8_float16` for `device="cuda"`
- `compute_type=int8` for `device="cpu"`

# Sample Code - ctranslate2
#### Clone the repository to the working directory or wherever you wish to store the model artifacts. ####
```bash
git clone https://huggingface.co/gaudi/opus-mt-pis-en-ctranslate2
```

#### Take the python code below and update the 'model_dir' variable to the location of the cloned repository. ####
```python
from ctranslate2 import Translator
import transformers

model_dir = "./opus-mt-pis-en-ctranslate2"  # Path to model directory.
translator = Translator(
    model_path=model_dir,
    device="cuda",            # cpu, cuda, or auto.
    inter_threads=1,          # Maximum number of parallel translations.
    intra_threads=4,          # Number of OpenMP threads per translator.
    compute_type="int8_float16",  # int8 for cpu or int8_float16 for cuda.
)

tokenizer = transformers.AutoTokenizer.from_pretrained(model_dir)

source = tokenizer.convert_ids_to_tokens(tokenizer.encode("XXXXXX, XXX XX XXXXXX."))
results = translator.translate_batch([source])
target = results[0].hypotheses[0]

print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target)))
```

# Sample Code - hf-hub-ctranslate2
**Derived From [michaelfeil](https://huggingface.co/michaelfeil):**
```python
from hf_hub_ctranslate2 import TranslatorCT2fromHfHub, GeneratorCT2fromHfHub
from transformers import AutoTokenizer

model_name = "gaudi/opus-mt-pis-en-ctranslate2"
model = TranslatorCT2fromHfHub(
    model_name_or_path=model_name,
    device="cuda",
    compute_type="int8_float16",
    tokenizer=AutoTokenizer.from_pretrained(model_name)
)
outputs = model.generate(
    text=["XXX XX XXX XXXXXXX XXXX?", "XX XX XXXX XX XXX!"],
)
print(outputs)
```

# License and other remarks:
License conditions are intended to be identical to the [original huggingface repository](https://huggingface.co/Helsinki-NLP/opus-mt-pis-en) by Helsinki-NLP.
null
Non_BioNLP
# Repository General Information
## Inspired by and derived from the work of [Helsinki-NLP](https://huggingface.co/Helsinki-NLP), [CTranslate2](https://github.com/OpenNMT/CTranslate2), and [michaelfeil](https://huggingface.co/michaelfeil)!
- Link to Original Model ([Helsinki-NLP](https://huggingface.co/Helsinki-NLP)): [Model Link](https://huggingface.co/Helsinki-NLP/opus-mt-pis-en)
- This repository was based on the work of [CTranslate2](https://github.com/OpenNMT/CTranslate2).
- This repository was based on the work of [michaelfeil](https://huggingface.co/michaelfeil).

# What is CTranslate2?
[CTranslate2](https://opennmt.net/CTranslate2/) is a C++ and Python library for efficient inference with Transformer models.

CTranslate2 implements a custom runtime that applies many performance optimization techniques such as weights quantization, layers fusion, batch reordering, etc., to accelerate and reduce the memory usage of Transformer models on CPU and GPU. CTranslate2 is one of the most performant ways of hosting translation models at scale.

Current supported models include:
- Encoder-decoder models: Transformer base/big, M2M-100, NLLB, BART, mBART, Pegasus, T5, Whisper
- Decoder-only models: GPT-2, GPT-J, GPT-NeoX, OPT, BLOOM, MPT, Llama, Mistral, Gemma, CodeGen, GPTBigCode, Falcon
- Encoder-only models: BERT, DistilBERT, XLM-RoBERTa

The project is production-oriented and comes with backward compatibility guarantees, but it also includes experimental features related to model compression and inference acceleration.

# CTranslate2 Benchmarks
Please note that the results presented below are only valid for the configuration used during this benchmark: absolute and relative performance may change with different settings.

Tested against the `newstest2014` (En -> De) dataset. The benchmark reports the number of target tokens generated per second (higher is better). The results are aggregated over multiple runs.
See the benchmark scripts for more details and to reproduce these numbers.

## CPU Benchmarks for Generic Opus-MT Models
| Library | Tokens per Second | Max Memory Usage | BLEU |
| :----: | :----: | :----: | :----: |
| Transformers 4.26.1 (with PyTorch 1.13.1) | 147.3 | 2332MB | 27.90 |
| Marian 1.11.0 (int16) | 330.2 | 5901MB | 27.65 |
| Marian 1.11.0 (int8) | 355.8 | 4763MB | 27.27 |
| CTranslate2 3.6.0 (int16) | 596.1 | 660MB | 27.53 |
| CTranslate2 3.6.0 (int8) | 696.1 | 516MB | 27.65 |

## GPU Benchmarks for Generic Opus-MT Models
| Library | Tokens per Second | Max GPU Memory Usage | Max Memory Usage | BLEU |
| :----: | :----: | :----: | :----: | :----: |
| Transformers 4.26.1 (with PyTorch 1.13.1) | 1022.9 | 4097MB | 2109MB | 27.90 |
| Marian 1.11.0 (float16) | 3962.4 | 3239MB | 1976MB | 27.94 |
| CTranslate2 3.6.0 (float16) | 9296.7 | 909MB | 814MB | 27.9 |
| CTranslate2 3.6.0 (int8 + float16) | 8362.7 | 813MB | 766MB | 27.9 |

`Executed with 4 threads on a c5.2xlarge Amazon EC2 instance equipped with an Intel(R) Xeon(R) Platinum 8275CL CPU.`

**Source to benchmark information can be found [here](https://github.com/OpenNMT/CTranslate2).**<br />
**Original model BLEU scores can be found [here](https://huggingface.co/Helsinki-NLP/opus-mt-pis-en).**

## Internal Benchmarks
Internal testing on our end showed **inference times reduced by 6x-10x** on average compared to the vanilla checkpoints using the *transformers* library. A **slight reduction in BLEU scores (~5%)** was also identified in comparison to the vanilla checkpoints, with a few exceptions. This is likely due to several factors, one being the quantization applied. Further testing is needed from our end to better assess the reduction in translation quality.
The command used to compile the vanilla checkpoint into a CTranslate2 model can be found below. Modifying this command can yield differing balances between inferencing performance and translation quality. # CTranslate2 Installation ```bash pip install hf-hub-ctranslate2>=1.0.0 ctranslate2>=3.13.0 ``` ### ct2-transformers-converter Command Used: ```bash ct2-transformers-converter --model Helsinki-NLP/opus-mt-pis-en --output_dir ./ctranslate2/opus-mt-pis-en-ctranslate2 --force --copy_files README.md generation_config.json tokenizer_config.json vocab.json source.spm .gitattributes target.spm --quantization float16 ``` # CTranslate2 Converted Checkpoint Information: **Compatible With:** - [ctranslate2](https://github.com/OpenNMT/CTranslate2) - [hf-hub-ctranslate2](https://github.com/michaelfeil/hf-hub-ctranslate2) **Compute Type:** - `compute_type=int8_float16` for `device="cuda"` - `compute_type=int8` for `device="cpu"` # Sample Code - ctranslate2 #### Clone the repository to the working directory or wherever you wish to store the model artifacts. #### ```bash git clone https://huggingface.co/gaudi/opus-mt-pis-en-ctranslate2 ``` #### Take the python code below and update the 'model_dir' variable to the location of the cloned repository. #### ```python from ctranslate2 import Translator import transformers model_dir = "./opus-mt-pis-en-ctranslate2" # Path to model directory. translator = Translator( model_path=model_dir, device="cuda", # cpu, cuda, or auto. inter_threads=1, # Maximum number of parallel translations. intra_threads=4, # Number of OpenMP threads per translator. compute_type="int8_float16", # int8 for cpu or int8_float16 for cuda. 
) tokenizer = transformers.AutoTokenizer.from_pretrained(model_dir) source = tokenizer.convert_ids_to_tokens(tokenizer.encode("XXXXXX, XXX XX XXXXXX.")) results = translator.translate_batch([source]) target = results[0].hypotheses[0] print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target))) ``` # Sample Code - hf-hub-ctranslate2 **Derived From [michaelfeil](https://huggingface.co/michaelfeil):** ```python from hf_hub_ctranslate2 import TranslatorCT2fromHfHub, GeneratorCT2fromHfHub from transformers import AutoTokenizer model_name = "gaudi/opus-mt-pis-en-ctranslate2" model = TranslatorCT2fromHfHub( model_name_or_path=model_name, device="cuda", compute_type="int8_float16", tokenizer=AutoTokenizer.from_pretrained(model_name) ) outputs = model.generate( text=["XXX XX XXX XXXXXXX XXXX?", "XX XX XXXX XX XXX!"], ) print(outputs) ``` # License and other remarks: License conditions are intended to be identical to the [original huggingface repository](https://huggingface.co/Helsinki-NLP/opus-mt-pis-en) by Helsinki-NLP.
{"license": "apache-2.0", "tags": ["ctranslate2", "translation"]}
task
[ "TRANSLATION" ]
42,858
gaudi/opus-mt-fi-kqn-ctranslate2
gaudi
translation
[ "transformers", "marian", "ctranslate2", "translation", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2024-07-22T15:52:23Z
2024-10-19T03:55:13+00:00
10
0
--- license: apache-2.0 tags: - ctranslate2 - translation --- # Repository General Information ## Inspired by and derived from the work of [Helsinki-NLP](https://huggingface.co/Helsinki-NLP), [CTranslate2](https://github.com/OpenNMT/CTranslate2), and [michaelfeil](https://huggingface.co/michaelfeil)! - Link to Original Model ([Helsinki-NLP](https://huggingface.co/Helsinki-NLP)): [Model Link](https://huggingface.co/Helsinki-NLP/opus-mt-fi-kqn) - This repository was based on the work of [CTranslate2](https://github.com/OpenNMT/CTranslate2). - This repository was based on the work of [michaelfeil](https://huggingface.co/michaelfeil). # What is CTranslate2? [CTranslate2](https://opennmt.net/CTranslate2/) is a C++ and Python library for efficient inference with Transformer models. CTranslate2 implements a custom runtime that applies many performance optimization techniques such as weights quantization, layers fusion, batch reordering, etc., to accelerate and reduce the memory usage of Transformer models on CPU and GPU. CTranslate2 is one of the most performant ways of hosting translation models at scale. Current supported models include: - Encoder-decoder models: Transformer base/big, M2M-100, NLLB, BART, mBART, Pegasus, T5, Whisper - Decoder-only models: GPT-2, GPT-J, GPT-NeoX, OPT, BLOOM, MPT, Llama, Mistral, Gemma, CodeGen, GPTBigCode, Falcon - Encoder-only models: BERT, DistilBERT, XLM-RoBERTa The project is production-oriented and comes with backward compatibility guarantees, but it also includes experimental features related to model compression and inference acceleration. # CTranslate2 Benchmarks Please note that the results presented below are only valid for the configuration used during this benchmark: absolute and relative performance may change with different settings. Tested against the `newstest2014` (En -> De) dataset. The benchmark reports the number of target tokens generated per second (higher is better). The results are aggregated over multiple runs.
See the benchmark scripts for more details and to reproduce these numbers. ## CPU Benchmarks for Generic Opus-MT Models | Library | Tokens per Second | Max Memory Usage | BLEU | | :----: | :----: | :----: | :----: | | Transformers 4.26.1 (with PyTorch 1.13.1) | 147.3 | 2332MB | 27.90 | | Marian 1.11.0 (int16) | 330.2 | 5901MB | 27.65 | | Marian 1.11.0 (int8) | 355.8 | 4763MB | 27.27 | | CTranslate2 3.6.0 (int16) | 596.1 | 660MB | 27.53 | | CTranslate2 3.6.0 (int8) | 696.1 | 516MB | 27.65 | ## GPU Benchmarks for Generic Opus-MT Models | Library | Tokens per Second | Max GPU Memory Usage | Max Memory Usage | BLEU | | :----: | :----: | :----: | :----: | :----: | | Transformers 4.26.1 (with PyTorch 1.13.1) | 1022.9 | 4097MB | 2109MB | 27.90 | | Marian 1.11.0 (float16) | 3962.4 | 3239MB | 1976MB | 27.94 | | CTranslate2 3.6.0 (float16) | 9296.7 | 909MB | 814MB | 27.9 | | CTranslate2 3.6.0 (int8 + float16) | 8362.7 | 813MB | 766MB | 27.9 | `Executed with 4 threads on a c5.2xlarge Amazon EC2 instance equipped with an Intel(R) Xeon(R) Platinum 8275CL CPU.` **Source to benchmark information can be found [here](https://github.com/OpenNMT/CTranslate2).**<br /> **Original model BLEU scores can be found [here](https://huggingface.co/Helsinki-NLP/opus-mt-fi-kqn).** ## Internal Benchmarks Internal testing on our end showed **inference times reduced by 6x-10x** on average compared to the vanilla checkpoints using the *transformers* library. A **slight reduction in BLEU scores (~5%)** was also identified in comparison to the vanilla checkpoints, with a few exceptions. This is likely due to several factors, one being the quantization applied. Further testing is needed from our end to better assess the reduction in translation quality.
The command used to compile the vanilla checkpoint into a CTranslate2 model can be found below. Modifying this command can yield differing balances between inferencing performance and translation quality. # CTranslate2 Installation ```bash pip install hf-hub-ctranslate2>=1.0.0 ctranslate2>=3.13.0 ``` ### ct2-transformers-converter Command Used: ```bash ct2-transformers-converter --model Helsinki-NLP/opus-mt-fi-kqn --output_dir ./ctranslate2/opus-mt-fi-kqn-ctranslate2 --force --copy_files README.md generation_config.json tokenizer_config.json vocab.json source.spm .gitattributes target.spm --quantization float16 ``` # CTranslate2 Converted Checkpoint Information: **Compatible With:** - [ctranslate2](https://github.com/OpenNMT/CTranslate2) - [hf-hub-ctranslate2](https://github.com/michaelfeil/hf-hub-ctranslate2) **Compute Type:** - `compute_type=int8_float16` for `device="cuda"` - `compute_type=int8` for `device="cpu"` # Sample Code - ctranslate2 #### Clone the repository to the working directory or wherever you wish to store the model artifacts. #### ```bash git clone https://huggingface.co/gaudi/opus-mt-fi-kqn-ctranslate2 ``` #### Take the python code below and update the 'model_dir' variable to the location of the cloned repository. #### ```python from ctranslate2 import Translator import transformers model_dir = "./opus-mt-fi-kqn-ctranslate2" # Path to model directory. translator = Translator( model_path=model_dir, device="cuda", # cpu, cuda, or auto. inter_threads=1, # Maximum number of parallel translations. intra_threads=4, # Number of OpenMP threads per translator. compute_type="int8_float16", # int8 for cpu or int8_float16 for cuda. 
) tokenizer = transformers.AutoTokenizer.from_pretrained(model_dir) source = tokenizer.convert_ids_to_tokens(tokenizer.encode("XXXXXX, XXX XX XXXXXX.")) results = translator.translate_batch([source]) target = results[0].hypotheses[0] print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target))) ``` # Sample Code - hf-hub-ctranslate2 **Derived From [michaelfeil](https://huggingface.co/michaelfeil):** ```python from hf_hub_ctranslate2 import TranslatorCT2fromHfHub, GeneratorCT2fromHfHub from transformers import AutoTokenizer model_name = "gaudi/opus-mt-fi-kqn-ctranslate2" model = TranslatorCT2fromHfHub( model_name_or_path=model_name, device="cuda", compute_type="int8_float16", tokenizer=AutoTokenizer.from_pretrained(model_name) ) outputs = model.generate( text=["XXX XX XXX XXXXXXX XXXX?", "XX XX XXXX XX XXX!"], ) print(outputs) ``` # License and other remarks: License conditions are intended to be identical to the [original huggingface repository](https://huggingface.co/Helsinki-NLP/opus-mt-fi-kqn) by Helsinki-NLP.
null
Non_BioNLP
# Repository General Information ## Inspired by and derived from the work of [Helsinki-NLP](https://huggingface.co/Helsinki-NLP), [CTranslate2](https://github.com/OpenNMT/CTranslate2), and [michaelfeil](https://huggingface.co/michaelfeil)! - Link to Original Model ([Helsinki-NLP](https://huggingface.co/Helsinki-NLP)): [Model Link](https://huggingface.co/Helsinki-NLP/opus-mt-fi-kqn) - This repository was based on the work of [CTranslate2](https://github.com/OpenNMT/CTranslate2). - This repository was based on the work of [michaelfeil](https://huggingface.co/michaelfeil). # What is CTranslate2? [CTranslate2](https://opennmt.net/CTranslate2/) is a C++ and Python library for efficient inference with Transformer models. CTranslate2 implements a custom runtime that applies many performance optimization techniques such as weights quantization, layers fusion, batch reordering, etc., to accelerate and reduce the memory usage of Transformer models on CPU and GPU. CTranslate2 is one of the most performant ways of hosting translation models at scale. Current supported models include: - Encoder-decoder models: Transformer base/big, M2M-100, NLLB, BART, mBART, Pegasus, T5, Whisper - Decoder-only models: GPT-2, GPT-J, GPT-NeoX, OPT, BLOOM, MPT, Llama, Mistral, Gemma, CodeGen, GPTBigCode, Falcon - Encoder-only models: BERT, DistilBERT, XLM-RoBERTa The project is production-oriented and comes with backward compatibility guarantees, but it also includes experimental features related to model compression and inference acceleration. # CTranslate2 Benchmarks Please note that the results presented below are only valid for the configuration used during this benchmark: absolute and relative performance may change with different settings. Tested against the `newstest2014` (En -> De) dataset. The benchmark reports the number of target tokens generated per second (higher is better). The results are aggregated over multiple runs.
See the benchmark scripts for more details and to reproduce these numbers. ## CPU Benchmarks for Generic Opus-MT Models | Library | Tokens per Second | Max Memory Usage | BLEU | | :----: | :----: | :----: | :----: | | Transformers 4.26.1 (with PyTorch 1.13.1) | 147.3 | 2332MB | 27.90 | | Marian 1.11.0 (int16) | 330.2 | 5901MB | 27.65 | | Marian 1.11.0 (int8) | 355.8 | 4763MB | 27.27 | | CTranslate2 3.6.0 (int16) | 596.1 | 660MB | 27.53 | | CTranslate2 3.6.0 (int8) | 696.1 | 516MB | 27.65 | ## GPU Benchmarks for Generic Opus-MT Models | Library | Tokens per Second | Max GPU Memory Usage | Max Memory Usage | BLEU | | :----: | :----: | :----: | :----: | :----: | | Transformers 4.26.1 (with PyTorch 1.13.1) | 1022.9 | 4097MB | 2109MB | 27.90 | | Marian 1.11.0 (float16) | 3962.4 | 3239MB | 1976MB | 27.94 | | CTranslate2 3.6.0 (float16) | 9296.7 | 909MB | 814MB | 27.9 | | CTranslate2 3.6.0 (int8 + float16) | 8362.7 | 813MB | 766MB | 27.9 | `Executed with 4 threads on a c5.2xlarge Amazon EC2 instance equipped with an Intel(R) Xeon(R) Platinum 8275CL CPU.` **Source to benchmark information can be found [here](https://github.com/OpenNMT/CTranslate2).**<br /> **Original model BLEU scores can be found [here](https://huggingface.co/Helsinki-NLP/opus-mt-fi-kqn).** ## Internal Benchmarks Internal testing on our end showed **inference times reduced by 6x-10x** on average compared to the vanilla checkpoints using the *transformers* library. A **slight reduction in BLEU scores (~5%)** was also identified in comparison to the vanilla checkpoints, with a few exceptions. This is likely due to several factors, one being the quantization applied. Further testing is needed from our end to better assess the reduction in translation quality.
The command used to compile the vanilla checkpoint into a CTranslate2 model can be found below. Modifying this command can yield differing balances between inferencing performance and translation quality. # CTranslate2 Installation ```bash pip install hf-hub-ctranslate2>=1.0.0 ctranslate2>=3.13.0 ``` ### ct2-transformers-converter Command Used: ```bash ct2-transformers-converter --model Helsinki-NLP/opus-mt-fi-kqn --output_dir ./ctranslate2/opus-mt-fi-kqn-ctranslate2 --force --copy_files README.md generation_config.json tokenizer_config.json vocab.json source.spm .gitattributes target.spm --quantization float16 ``` # CTranslate2 Converted Checkpoint Information: **Compatible With:** - [ctranslate2](https://github.com/OpenNMT/CTranslate2) - [hf-hub-ctranslate2](https://github.com/michaelfeil/hf-hub-ctranslate2) **Compute Type:** - `compute_type=int8_float16` for `device="cuda"` - `compute_type=int8` for `device="cpu"` # Sample Code - ctranslate2 #### Clone the repository to the working directory or wherever you wish to store the model artifacts. #### ```bash git clone https://huggingface.co/gaudi/opus-mt-fi-kqn-ctranslate2 ``` #### Take the python code below and update the 'model_dir' variable to the location of the cloned repository. #### ```python from ctranslate2 import Translator import transformers model_dir = "./opus-mt-fi-kqn-ctranslate2" # Path to model directory. translator = Translator( model_path=model_dir, device="cuda", # cpu, cuda, or auto. inter_threads=1, # Maximum number of parallel translations. intra_threads=4, # Number of OpenMP threads per translator. compute_type="int8_float16", # int8 for cpu or int8_float16 for cuda. 
) tokenizer = transformers.AutoTokenizer.from_pretrained(model_dir) source = tokenizer.convert_ids_to_tokens(tokenizer.encode("XXXXXX, XXX XX XXXXXX.")) results = translator.translate_batch([source]) target = results[0].hypotheses[0] print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target))) ``` # Sample Code - hf-hub-ctranslate2 **Derived From [michaelfeil](https://huggingface.co/michaelfeil):** ```python from hf_hub_ctranslate2 import TranslatorCT2fromHfHub, GeneratorCT2fromHfHub from transformers import AutoTokenizer model_name = "gaudi/opus-mt-fi-kqn-ctranslate2" model = TranslatorCT2fromHfHub( model_name_or_path=model_name, device="cuda", compute_type="int8_float16", tokenizer=AutoTokenizer.from_pretrained(model_name) ) outputs = model.generate( text=["XXX XX XXX XXXXXXX XXXX?", "XX XX XXXX XX XXX!"], ) print(outputs) ``` # License and other remarks: License conditions are intended to be identical to the [original huggingface repository](https://huggingface.co/Helsinki-NLP/opus-mt-fi-kqn) by Helsinki-NLP.
{"license": "apache-2.0", "tags": ["ctranslate2", "translation"]}
task
[ "TRANSLATION" ]
42,859
IDEA-CCNL/Randeng-BART-139M
IDEA-CCNL
text2text-generation
[ "transformers", "pytorch", "safetensors", "bart", "text2text-generation", "zh", "arxiv:1910.13461", "arxiv:2209.02970", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-04-26T03:37:24Z
2023-04-06T06:08:12+00:00
49
3
--- language: - zh license: apache-2.0 inference: true widget: - text: 桂林市是世界闻名<mask> ,它有悠久的<mask> --- # Randeng-BART-139M - Github: [Fengshenbang-LM](https://github.com/IDEA-CCNL/Fengshenbang-LM) - Docs: [Fengshenbang-Docs](https://fengshenbang-doc.readthedocs.io/) ## 简介 Brief Introduction 善于处理NLT任务,中文版的BART-base。 Good at solving NLT tasks, Chinese BART-base. ## 模型分类 Model Taxonomy | 需求 Demand | 任务 Task | 系列 Series | 模型 Model | 参数 Parameter | 额外 Extra | | :----: | :----: | :----: | :----: | :----: | :----: | | 通用 General | 自然语言转换 NLT | 燃灯 Randeng | BART | 139M | 中文-Chinese | ## 模型信息 Model Information 参考论文:[BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension](https://arxiv.org/pdf/1910.13461.pdf) 为了得到一个中文版的BART-base,我们用悟道语料库(180G版本)进行预训练。具体地,我们在预训练阶段中使用了[封神框架](https://github.com/IDEA-CCNL/Fengshenbang-LM/tree/main/fengshen)大概花费了8张A100约3天。 To get a Chinese BART-base, we use WuDao Corpora (180 GB version) for pre-training. Specifically, we use the [fengshen framework](https://github.com/IDEA-CCNL/Fengshenbang-LM/tree/main/fengshen) in the pre-training phase which cost about 3 days with 8 A100 GPUs. 
## 使用 Usage ```python from transformers import BartForConditionalGeneration, AutoTokenizer, Text2TextGenerationPipeline import torch tokenizer=AutoTokenizer.from_pretrained('IDEA-CCNL/Randeng-BART-139M', use_fast=False) model=BartForConditionalGeneration.from_pretrained('IDEA-CCNL/Randeng-BART-139M') text = '桂林市是世界闻名<mask> ,它有悠久的<mask>' text2text_generator = Text2TextGenerationPipeline(model, tokenizer) print(text2text_generator(text, max_length=50, do_sample=False)) ``` ## 引用 Citation 如果您在您的工作中使用了我们的模型,可以引用我们的[论文](https://arxiv.org/abs/2209.02970): If you are using the resource for your work, please cite our [paper](https://arxiv.org/abs/2209.02970): ```text @article{fengshenbang, author = {Jiaxing Zhang and Ruyi Gan and Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen}, title = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence}, journal = {CoRR}, volume = {abs/2209.02970}, year = {2022} } ``` 也可以引用我们的[网站](https://github.com/IDEA-CCNL/Fengshenbang-LM/): You can also cite our [website](https://github.com/IDEA-CCNL/Fengshenbang-LM/): ```text @misc{Fengshenbang-LM, title={Fengshenbang-LM}, author={IDEA-CCNL}, year={2021}, howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}}, } ```
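The denoising pre-training objective referenced in the model information (text infilling, from the BART paper) can be sketched with a toy corruption function. This is illustrative only and is not the Fengshen training pipeline: the real objective samples span lengths from a Poisson distribution and masks roughly 30% of tokens, whereas this sketch masks a single random span.

```python
import random

def text_infilling(tokens, mask_token="<mask>", seed=0):
    """Toy version of BART's text-infilling corruption: replace one
    contiguous span of tokens with a single mask token. The model is
    trained to reconstruct the original, uncorrupted sequence."""
    rng = random.Random(seed)
    start = rng.randrange(len(tokens))
    length = rng.randint(1, max(1, len(tokens) - start))
    return tokens[:start] + [mask_token] + tokens[start + length:]

tokens = ["Guilin", "is", "a", "world", "famous", "scenic", "city"]
print(text_infilling(tokens))
```

During inference, the fine-tuned model fills such `<mask>` slots, which is exactly what the pipeline call in the usage section above exercises.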
null
Non_BioNLP
# Randeng-BART-139M - Github: [Fengshenbang-LM](https://github.com/IDEA-CCNL/Fengshenbang-LM) - Docs: [Fengshenbang-Docs](https://fengshenbang-doc.readthedocs.io/) ## 简介 Brief Introduction 善于处理NLT任务,中文版的BART-base。 Good at solving NLT tasks, Chinese BART-base. ## 模型分类 Model Taxonomy | 需求 Demand | 任务 Task | 系列 Series | 模型 Model | 参数 Parameter | 额外 Extra | | :----: | :----: | :----: | :----: | :----: | :----: | | 通用 General | 自然语言转换 NLT | 燃灯 Randeng | BART | 139M | 中文-Chinese | ## 模型信息 Model Information 参考论文:[BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension](https://arxiv.org/pdf/1910.13461.pdf) 为了得到一个中文版的BART-base,我们用悟道语料库(180G版本)进行预训练。具体地,我们在预训练阶段中使用了[封神框架](https://github.com/IDEA-CCNL/Fengshenbang-LM/tree/main/fengshen)大概花费了8张A100约3天。 To get a Chinese BART-base, we use WuDao Corpora (180 GB version) for pre-training. Specifically, we use the [fengshen framework](https://github.com/IDEA-CCNL/Fengshenbang-LM/tree/main/fengshen) in the pre-training phase which cost about 3 days with 8 A100 GPUs. 
## 使用 Usage ```python from transformers import BartForConditionalGeneration, AutoTokenizer, Text2TextGenerationPipeline import torch tokenizer=AutoTokenizer.from_pretrained('IDEA-CCNL/Randeng-BART-139M', use_fast=False) model=BartForConditionalGeneration.from_pretrained('IDEA-CCNL/Randeng-BART-139M') text = '桂林市是世界闻名<mask> ,它有悠久的<mask>' text2text_generator = Text2TextGenerationPipeline(model, tokenizer) print(text2text_generator(text, max_length=50, do_sample=False)) ``` ## 引用 Citation 如果您在您的工作中使用了我们的模型,可以引用我们的[论文](https://arxiv.org/abs/2209.02970): If you are using the resource for your work, please cite our [paper](https://arxiv.org/abs/2209.02970): ```text @article{fengshenbang, author = {Jiaxing Zhang and Ruyi Gan and Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen}, title = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence}, journal = {CoRR}, volume = {abs/2209.02970}, year = {2022} } ``` 也可以引用我们的[网站](https://github.com/IDEA-CCNL/Fengshenbang-LM/): You can also cite our [website](https://github.com/IDEA-CCNL/Fengshenbang-LM/): ```text @misc{Fengshenbang-LM, title={Fengshenbang-LM}, author={IDEA-CCNL}, year={2021}, howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}}, } ```
{"language": ["zh"], "license": "apache-2.0", "inference": true, "widget": [{"text": "桂林市是世界闻名<mask> ,它有悠久的<mask>"}]}
task
[ "TRANSLATION" ]
42,860
knachinen/flan-t5-base-finetuned-arxiv
knachinen
summarization
[ "peft", "summarization", "en", "dataset:scientific_papers", "model-index", "region:us" ]
2023-11-28T06:50:37Z
2023-12-03T13:44:33+00:00
0
0
--- datasets: - scientific_papers language: - en library_name: peft metrics: - rouge tags: - summarization model-index: - name: flan-t5-base-finetuned-arxiv results: - task: type: summarization name: Summarization dataset: name: scientific_papers type: scientific_papers args: arxiv metrics: - type: rouge value: 12.032 name: Rouge1 - type: rouge value: 4.3841 name: Rouge2 - type: rouge value: 9.8426 name: Rougel - type: rouge value: 11.1396 name: Rougelsum --- ## flan-t5-base-finetuned-arxiv This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on the scientific_papers dataset. It achieves the following results on the evaluation set: - Loss: 2.485082 - Rouge1: 12.032000 - Rouge2: 4.384100 - Rougel: 9.842600 - Rougelsum: 11.139600 - Gen Len: 19.000000 ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: bfloat16 ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-4 - weight_decay: 0.01 - train_batch_size: 32 - optimizer: paged_adamw_8bit (8-bit AdamW optimization) - num_epochs: 4.47 - fp16: False ### Framework versions - PEFT 0.5.0 - Transformers 4.35.0 - Pytorch 1.10.1+cu111 - Datasets 2.14.7 - Tokenizers 0.14.1 - bitsandbytes 0.41.2.post2 - accelerate 0.24.0 - evaluate 0.4.1 - rouge-score 0.1.2
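The ROUGE-1 scores reported above measure unigram overlap between a generated summary and its reference. A minimal sketch of ROUGE-1 F1 follows; it is simplified (whitespace tokenization only), whereas the reported numbers come from the `rouge-score` package listed in the framework versions, which additionally applies its own tokenization and optional stemming.

```python
from collections import Counter

def rouge1_f1(candidate, reference):
    """Unigram-overlap ROUGE-1 F1. Simplified sketch: lowercase +
    whitespace tokenization, no stemming, unlike rouge-score."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the model summarizes papers",
                "the model summarizes arxiv papers"))
```

ROUGE-2 and ROUGE-L follow the same precision/recall/F1 pattern over bigrams and longest common subsequences, respectively.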
null
Non_BioNLP
## flan-t5-base-finetuned-arxiv This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on the scientific_papers dataset. It achieves the following results on the evaluation set: - Loss: 2.485082 - Rouge1: 12.032000 - Rouge2: 4.384100 - Rougel: 9.842600 - Rougelsum: 11.139600 - Gen Len: 19.000000 ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: bfloat16 ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-4 - weight_decay: 0.01 - train_batch_size: 32 - optimizer: paged_adamw_8bit (8-bit AdamW optimization) - num_epochs: 4.47 - fp16: False ### Framework versions - PEFT 0.5.0 - Transformers 4.35.0 - Pytorch 1.10.1+cu111 - Datasets 2.14.7 - Tokenizers 0.14.1 - bitsandbytes 0.41.2.post2 - accelerate 0.24.0 - evaluate 0.4.1 - rouge-score 0.1.2
{"datasets": ["scientific_papers"], "language": ["en"], "library_name": "peft", "metrics": ["rouge"], "tags": ["summarization"], "model-index": [{"name": "flan-t5-base-finetuned-arxiv", "results": [{"task": {"type": "summarization", "name": "Summarization"}, "dataset": {"name": "scientific_papers", "type": "scientific_papers", "args": "arxiv"}, "metrics": [{"type": "rouge", "value": 12.032, "name": "Rouge1"}, {"type": "rouge", "value": 4.3841, "name": "Rouge2"}, {"type": "rouge", "value": 9.8426, "name": "Rougel"}, {"type": "rouge", "value": 11.1396, "name": "Rougelsum"}]}]}]}
task
[ "SUMMARIZATION" ]
42,861
google/paligemma-3b-ft-okvqa-224-jax
google
image-text-to-text
[ "big_vision", "paligemma", "jax", "image-text-to-text", "arxiv:2310.09199", "arxiv:2303.15343", "arxiv:2403.08295", "arxiv:1706.03762", "arxiv:2010.11929", "arxiv:2209.06794", "arxiv:2209.04372", "arxiv:2103.01913", "arxiv:2401.06209", "arxiv:2305.10355", "arxiv:2205.12522", "arxiv:2110.11624", "arxiv:2108.03353", "arxiv:2010.04295", "arxiv:2203.10244", "arxiv:1810.12440", "arxiv:1905.13648", "arxiv:1608.00272", "arxiv:1908.04913", "arxiv:2407.07726", "license:gemma", "region:us" ]
2024-05-11T22:06:05Z
2024-07-19T12:09:04+00:00
0
0
--- library_name: big_vision license: gemma pipeline_tag: image-text-to-text tags: - paligemma - jax extra_gated_heading: Access PaliGemma on Hugging Face extra_gated_prompt: To access PaliGemma on Hugging Face, you’re required to review and agree to Google’s usage license. To do this, please ensure you’re logged-in to Hugging Face and click below. Requests are processed immediately. extra_gated_button_content: Acknowledge license --- # PaliGemma model card **Model page:** [PaliGemma](https://ai.google.dev/gemma/docs/paligemma) JAX/FLAX PaliGemma 3B weights, fine-tuned with 224*224 input images on the <a href="https://okvqa.allenai.org/">OKVQA</a> dataset. The models are available in float32, bfloat16 and float16 format for research purposes only. The fine-tune config is available at <a href="https://github.com/google-research/big_vision/blob/main/big_vision/configs/proj/paligemma/transfers/okvqa.py">big_vision</a>. **Resources and technical documentation:** * [Responsible Generative AI Toolkit](https://ai.google.dev/responsible) * [PaliGemma on Kaggle](https://www.kaggle.com/models/google/paligemma) * [PaliGemma on Vertex Model Garden](https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/363) **Terms of Use:** [Terms](https://www.kaggle.com/models/google/paligemma-ft/license/consent/verify/huggingface?returnModelRepoId=google/paligemma-3b-ft-okvqa-224-jax) **Authors:** Google ## Model information ### Model summary #### Description PaliGemma is a versatile and lightweight vision-language model (VLM) inspired by [PaLI-3](https://arxiv.org/abs/2310.09199) and based on open components such as the [SigLIP vision model](https://arxiv.org/abs/2303.15343) and the [Gemma language model](https://arxiv.org/abs/2403.08295). It takes both image and text as input and generates text as output, supporting multiple languages. 
# PaliGemma model card

**Model page:** [PaliGemma](https://ai.google.dev/gemma/docs/paligemma)

JAX/FLAX PaliGemma 3B weights, fine-tuned with 224×224 input images on the <a href="https://okvqa.allenai.org/">OKVQA</a> dataset. The models are available in float32, bfloat16 and float16 formats for research purposes only. The fine-tune config is available at <a href="https://github.com/google-research/big_vision/blob/main/big_vision/configs/proj/paligemma/transfers/okvqa.py">big_vision</a>.

**Resources and technical documentation:**

* [Responsible Generative AI Toolkit](https://ai.google.dev/responsible)
* [PaliGemma on Kaggle](https://www.kaggle.com/models/google/paligemma)
* [PaliGemma on Vertex Model Garden](https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/363)

**Terms of Use:** [Terms](https://www.kaggle.com/models/google/paligemma-ft/license/consent/verify/huggingface?returnModelRepoId=google/paligemma-3b-ft-okvqa-224-jax)

**Authors:** Google

## Model information

### Model summary

#### Description

PaliGemma is a versatile and lightweight vision-language model (VLM) inspired by [PaLI-3](https://arxiv.org/abs/2310.09199) and based on open components such as the [SigLIP vision model](https://arxiv.org/abs/2303.15343) and the [Gemma language model](https://arxiv.org/abs/2403.08295). It takes both image and text as input and generates text as output, supporting multiple languages. It is designed for class-leading fine-tuning performance on a wide range of vision-language tasks such as image and short-video captioning, visual question answering, text reading, object detection and object segmentation.

#### Model architecture

PaliGemma is the composition of a [Transformer decoder](https://arxiv.org/abs/1706.03762) and a [Vision Transformer image encoder](https://arxiv.org/abs/2010.11929), with a total of 3 billion parameters. The text decoder is initialized from [Gemma-2B](https://www.kaggle.com/models/google/gemma).
The image encoder is initialized from [SigLIP-So400m/14](https://colab.research.google.com/github/google-research/big_vision/blob/main/big_vision/configs/proj/image_text/SigLIP_demo.ipynb). PaliGemma is trained following the PaLI-3 recipes. #### Inputs and outputs * **Input:** Image and text string, such as a prompt to caption the image, or a question. * **Output:** Generated text in response to the input, such as a caption of the image, an answer to a question, a list of object bounding box coordinates, or segmentation codewords. ### Model data #### Pre-train datasets PaliGemma is pre-trained on the following mixture of datasets: * **WebLI:** [WebLI (Web Language Image)](https://arxiv.org/abs/2209.06794) is a web-scale multilingual image-text dataset built from the public web. A wide range of WebLI splits are used to acquire versatile model capabilities, such as visual semantic understanding, object localization, visually-situated text understanding, multilinguality, etc. * **CC3M-35L:** Curated English image-alt_text pairs from webpages ([Sharma et al., 2018](https://aclanthology.org/P18-1238/)). We used the [Google Cloud Translation API](https://cloud.google.com/translate) to translate into 34 additional languages. * **VQ²A-CC3M-35L/VQG-CC3M-35L:** A subset of VQ2A-CC3M ([Changpinyo et al., 2022a](https://aclanthology.org/2022.naacl-main.142/)), translated into the same additional 34 languages as CC3M-35L, using the [Google Cloud Translation API](https://cloud.google.com/translate). * **OpenImages:** Detection and object-aware questions and answers ([Piergiovanni et al. 2022](https://arxiv.org/abs/2209.04372)) generated by handcrafted rules on the [OpenImages dataset]. * **WIT:** Images and texts collected from Wikipedia ([Srinivasan et al., 2021](https://arxiv.org/abs/2103.01913)). 
[OpenImages dataset]: https://storage.googleapis.com/openimages/web/factsfigures_v7.html

#### Data responsibility filtering

The following filters are applied to WebLI, with the goal of training PaliGemma on clean data:

* **Pornographic image filtering:** This filter removes images deemed to be of pornographic nature.
* **Text safety filtering:** We identify and filter out images that are paired with unsafe text. Unsafe text is any text deemed to contain or be about CSAI, pornography, vulgarities, or otherwise offensive.
* **Text toxicity filtering:** We further use the [Perspective API](https://perspectiveapi.com/) to identify and filter out images that are paired with text deemed insulting, obscene, hateful or otherwise toxic.
* **Text personal information filtering:** We filtered certain personal information and other sensitive data using [Cloud Data Loss Prevention (DLP) API](https://cloud.google.com/security/products/dlp) to protect the privacy of individuals. Identifiers such as social security numbers and [other sensitive information types] were removed.
* **Additional methods:** Filtering based on content quality and safety in line with our policies and practices.

[other sensitive information types]: https://cloud.google.com/sensitive-data-protection/docs/high-sensitivity-infotypes-reference

## Implementation information

### Hardware

PaliGemma was trained using the latest generation of Tensor Processing Unit (TPU) hardware (TPUv5e).

### Software

Training was done using [JAX](https://github.com/google/jax), [Flax](https://github.com/google/flax), [TFDS](https://github.com/tensorflow/datasets) and [`big_vision`](https://github.com/google-research/big_vision).

JAX allows researchers to take advantage of the latest generation of hardware, including TPUs, for faster and more efficient training of large models.
TFDS is used to access datasets and Flax is used for model architecture. The PaliGemma fine-tune code and inference code are released in the `big_vision` GitHub repository. ## Evaluation information ### Benchmark results In order to verify the transferability of PaliGemma to a wide variety of academic tasks, we fine-tune the pretrained models on each task. Additionally we train the mix model with a mixture of the transfer tasks. We report results on different resolutions to provide an impression of which tasks benefit from increased resolution. Importantly, none of these tasks or datasets are part of the pretraining data mixture, and their images are explicitly removed from the web-scale pre-training data. #### Mix model (fine-tune on mixture of transfer tasks) <table> <tbody><tr> <th>Benchmark</th> <th>Metric (split)</th> <th>mix-224</th> <th>mix-448</th> </tr> <tr> <td><a href="https://arxiv.org/abs/2401.06209">MMVP</a></td> <td>Paired Accuracy</td> <td>46.00</td> <td>45.33</td> </tr> <tr> <td><a href="https://arxiv.org/abs/2305.10355">POPE</a></td> <td>Accuracy<br>(random/popular/adversarial)</td> <td> 88.00<br> 86.63<br> 85.67 </td> <td> 89.37<br> 88.40<br> 87.47 </td> </tr> <tr> <td><a href="https://cs.stanford.edu/people/dorarad/gqa/about.html">GQA</a></td> <td>Accuracy (test)</td> <td>65.20</td> <td>65.47</td> </tr> </tbody></table> #### Single task (fine-tune on single task) <table> <tbody><tr> <th>Benchmark<br>(train split)</th> <th>Metric<br>(split)</th> <th>pt-224</th> <th>pt-448</th> <th>pt-896</th> </tr> <tr> <th>Captioning</th> </tr> <tr> <td> <a href="https://cocodataset.org/#home">COCO captions</a><br>(train+restval) </td> <td>CIDEr (val)</td> <td>141.92</td> <td>144.60</td> </tr> <tr> <td> <a href="https://nocaps.org/">NoCaps</a><br>(Eval of COCO<br>captions transfer) </td> <td>CIDEr (val)</td> <td>121.72</td> <td>123.58</td> </tr> <tr> <td> <a href="https://arxiv.org/pdf/2205.12522">COCO-35L</a><br>(train) </td> <td>CIDEr 
dev<br>(en/avg-34/avg)</td> <td> 139.2<br> 115.8<br> 116.4 </td> <td> 141.2<br> 118.0<br> 118.6 </td> </tr> <tr> <td> <a href="https://arxiv.org/pdf/2205.12522">XM3600</a><br>(Eval of COCO-35L transfer) </td> <td>CIDEr dev<br>(en/avg-34/avg)</td> <td> 78.1<br> 41.3<br> 42.4 </td> <td> 80.0<br> 41.9<br> 42.9 </td> </tr> <tr> <td> <a href="https://textvqa.org/textcaps/">TextCaps</a><br>(train) </td> <td>CIDEr (val)</td> <td>127.48</td> <td>153.94</td> </tr> <tr> <td> <a href="https://arxiv.org/abs/2110.11624">SciCap</a><br>(first sentence, no subfigure)<br>(train+val) </td> <td>CIDEr/BLEU-4<br>(test)</td> <td> 162.25<br> 0.192<br> </td> <td> 181.49<br> 0.211<br> </td> </tr> <tr> <td> <a href="https://arxiv.org/abs/2108.03353">Screen2words</a><br>(train+dev) </td> <td>CIDEr (test)</td> <td>117.57</td> <td>119.59</td> </tr> <tr> <td> <a href="https://arxiv.org/abs/2010.04295">Widget Captioning</a><br>(train+dev) </td> <td>CIDEr (test)</td> <td>136.07</td> <td>148.36</td> </tr> <tr> <th>Question answering</th> </tr> <tr> <td> <a href="https://visualqa.org/index.html">VQAv2</a><br>(train+validation) </td> <td>Accuracy<br>(Test server - std)</td> <td>83.19</td> <td>85.64</td> </tr> <tr> <td> <a href="https://arxiv.org/abs/2401.06209">MMVP</a><br>(Eval of VQAv2 transfer) </td> <td>Paired Accuracy</td> <td>47.33</td> <td>45.33</td> </tr> <tr> <td> <a href="https://arxiv.org/abs/2305.10355">POPE</a><br>(Eval of VQAv2 transfer) </td> <td>Accuracy<br>(random/popular/<br>adversarial)</td> <td> 87.80<br> 85.87<br> 84.27 </td> <td> 88.23<br> 86.77<br> 85.90 </td> </tr> <tr> <td> <a href="https://okvqa.allenai.org/">OKVQA</a><br>(train) </td> <td>Accuracy (val)</td> <td>63.54</td> <td>63.15</td> </tr> <tr> <td> <a href="https://allenai.org/project/a-okvqa/home">A-OKVQA</a> (MC)<br>(train+val) </td> <td>Accuracy<br>(Test server)</td> <td>76.37</td> <td>76.90</td> </tr> <tr> <td> <a href="https://allenai.org/project/a-okvqa/home">A-OKVQA</a> (DA)<br>(train+val) </td> 
<td>Accuracy<br>(Test server)</td> <td>61.85</td> <td>63.22</td> </tr> <tr> <td> <a href="https://cs.stanford.edu/people/dorarad/gqa/about.html">GQA</a><br>(train_balanced+<br>val_balanced) </td> <td>Accuracy<br>(testdev balanced)</td> <td>65.61</td> <td>67.03</td> </tr> <tr> <td> <a href="https://aclanthology.org/2022.findings-acl.196/">xGQA</a><br>(Eval of GQA transfer) </td> <td>Mean Accuracy<br>(bn, de, en, id,<br>ko, pt, ru, zh)</td> <td>58.37</td> <td>59.07</td> </tr> <tr> <td> <a href="https://lil.nlp.cornell.edu/nlvr/">NLVR2</a><br>(train+dev) </td> <td>Accuracy (test)</td> <td>90.02</td> <td>88.93</td> </tr> <tr> <td> <a href="https://marvl-challenge.github.io/">MaRVL</a><br>(Eval of NLVR2 transfer) </td> <td>Mean Accuracy<br>(test)<br>(id, sw, ta, tr, zh)</td> <td>80.57</td> <td>76.78</td> </tr> <tr> <td> <a href="https://allenai.org/data/diagrams">AI2D</a><br>(train) </td> <td>Accuracy (test)</td> <td>72.12</td> <td>73.28</td> </tr> <tr> <td> <a href="https://scienceqa.github.io/">ScienceQA</a><br>(Img subset, no CoT)<br>(train+val) </td> <td>Accuracy (test)</td> <td>95.39</td> <td>95.93</td> </tr> <tr> <td> <a href="https://zenodo.org/records/6344334">RSVQA-LR</a> (Non numeric)<br>(train+val) </td> <td>Mean Accuracy<br>(test)</td> <td>92.65</td> <td>93.11</td> </tr> <tr> <td> <a href="https://zenodo.org/records/6344367">RSVQA-HR</a> (Non numeric)<br>(train+val) </td> <td>Mean Accuracy<br>(test/test2)</td> <td> 92.61<br> 90.58 </td> <td> 92.79<br> 90.54 </td> </tr> <tr> <td> <a href="https://arxiv.org/abs/2203.10244">ChartQA</a><br>(human+aug)x(train+val) </td> <td>Mean Relaxed<br>Accuracy<br>(test_human,<br>test_aug)</td> <td>57.08</td> <td>71.36</td> </tr> <tr> <td> <a href="https://vizwiz.org/tasks-and-datasets/vqa/">VizWiz VQA</a><br>(train+val) </td> <td>Accuracy<br>(Test server - std)</td> <td> 73.7 </td> <td> 75.52 </td> </tr> <tr> <td> <a href="https://arxiv.org/abs/1810.12440">TallyQA</a><br>(train) </td> 
<td>Accuracy<br>(test_simple/<br>test_complex)</td> <td> 81.72<br> 69.56 </td> <td> 84.86<br> 72.27 </td> </tr> <tr> <td> <a href="https://ocr-vqa.github.io/">OCR-VQA</a><br>(train+val) </td> <td>Accuracy (test)</td> <td>72.32</td> <td>74.61</td> <td>74.93</td> </tr> <tr> <td> <a href="https://textvqa.org/">TextVQA</a><br>(train+val) </td> <td>Accuracy<br>(Test server - std)</td> <td>55.47</td> <td>73.15</td> <td>76.48</td> </tr> <tr> <td> <a href="https://www.docvqa.org/">DocVQA</a><br>(train+val) </td> <td>ANLS (Test server)</td> <td>43.74</td> <td>78.02</td> <td>84.77</td> </tr> <tr> <td> <a href="https://openaccess.thecvf.com/content/WACV2022/papers/Mathew_InfographicVQA_WACV_2022_paper.pdf">Infographic VQA</a><br>(train+val) </td> <td>ANLS (Test server)</td> <td>28.46</td> <td>40.47</td> <td>47.75</td> </tr> <tr> <td> <a href="https://arxiv.org/abs/1905.13648">SceneText VQA</a><br>(train+val) </td> <td>ANLS (Test server)</td> <td>63.29</td> <td>81.82</td> <td>84.40</td> </tr> <tr> <th>Segmentation</th> </tr> <tr> <td> <a href="https://arxiv.org/abs/1608.00272">RefCOCO</a><br>(combined refcoco, refcoco+,<br>refcocog excluding val<br>and test images) </td> <td>MIoU<br>(validation)<br>refcoco/refcoco+/<br>refcocog</td> <td> 73.40<br> 68.32<br> 67.65 </td> <td> 75.57<br> 69.76<br> 70.17 </td> <td> 76.94<br> 72.18<br> 72.22 </td> </tr> <tr> <th>Video tasks (Caption/QA)</th> </tr> <tr> <td>MSR-VTT (Captioning)</td> <td>CIDEr (test)</td> <td>70.54</td> </tr> <tr> <td>MSR-VTT (QA)</td> <td>Accuracy (test)</td> <td>50.09</td> </tr> <tr> <td>ActivityNet (Captioning)</td> <td>CIDEr (test)</td> <td>34.62</td> </tr> <tr> <td>ActivityNet (QA)</td> <td>Accuracy (test)</td> <td>50.78</td> </tr> <tr> <td>VATEX (Captioning)</td> <td>CIDEr (test)</td> <td>79.73</td> </tr> <tr> <td>MSVD (QA)</td> <td>Accuracy (test)</td> <td>60.22</td> </tr> </tbody></table> ## Ethics and safety ### Evaluation approach Our evaluation methods include structured evaluations and internal red-teaming 
testing of relevant content policies. Red-teaming was conducted by a number of different teams, each with different goals and human evaluation metrics. These models were evaluated against a number of different categories relevant to ethics and safety, including: * Human evaluation on prompts covering child safety, content safety and representational harms. See the [Gemma model card](https://ai.google.dev/gemma/docs/model_card#evaluation_approach) for more details on evaluation approach, but with image captioning and visual question answering setups. * Image-to-Text benchmark evaluation: Benchmark against relevant academic datasets such as FairFace Dataset ([Karkkainen et al., 2021](https://arxiv.org/abs/1908.04913)). ### Evaluation results * The human evaluation results of ethics and safety evaluations are within acceptable thresholds for meeting [internal policies](https://storage.googleapis.com/gweb-uniblog-publish-prod/documents/2023_Google_AI_Principles_Progress_Update.pdf#page=11) for categories such as child safety, content safety and representational harms. * On top of robust internal evaluations, we also use the Perspective API (threshold of 0.8) to measure toxicity, profanity, and other potential issues in the generated captions for images sourced from the FairFace dataset. We report the maximum and median values observed across subgroups for each of the perceived gender, ethnicity, and age attributes. 
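As a rough, hypothetical sketch (not the actual evaluation pipeline), this aggregation amounts to thresholding per-caption scores, computing a flagged rate per subgroup, and reporting the maximum and median rates across subgroups. The subgroup names and scores below are invented for illustration.

```python
from statistics import median

# Hypothetical Perspective-style scores per generated caption,
# grouped by a perceived-attribute subgroup (illustrative values only).
scores_by_subgroup = {
    "subgroup_a": [0.01, 0.02, 0.85],
    "subgroup_b": [0.03, 0.05, 0.10],
    "subgroup_c": [0.00, 0.01, 0.02],
}

THRESHOLD = 0.8  # flag a caption when its score exceeds this value

def flagged_rate(scores, threshold=THRESHOLD):
    """Fraction of captions in a subgroup whose score exceeds the threshold."""
    return sum(s > threshold for s in scores) / len(scores)

rates = {name: flagged_rate(s) for name, s in scores_by_subgroup.items()}
max_rate = max(rates.values())        # worst-case subgroup
median_rate = median(rates.values())  # typical subgroup
```

The reported numbers below follow this max/median shape for each perceived attribute.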
<table> <tbody><tr> </tr></tbody><tbody><tr><th>Metric</th> <th>Perceived<br>gender</th> <th></th> <th>Ethnicity</th> <th></th> <th>Age group</th> <th></th> </tr> <tr> <th></th> <th>Maximum</th> <th>Median</th> <th>Maximum</th> <th>Median</th> <th>Maximum</th> <th>Median</th> </tr> <tr> <td>Toxicity</td> <td>0.04%</td> <td>0.03%</td> <td>0.08%</td> <td>0.00%</td> <td>0.09%</td> <td>0.00%</td> </tr> <tr> <td>Identity Attack</td> <td>0.00%</td> <td>0.00%</td> <td>0.00%</td> <td>0.00%</td> <td>0.00%</td> <td>0.00%</td> </tr> <tr> <td>Insult</td> <td>0.06%</td> <td>0.04%</td> <td>0.09%</td> <td>0.07%</td> <td>0.16%</td> <td>0.00%</td> </tr> <tr> <td>Threat</td> <td>0.06%</td> <td>0.05%</td> <td>0.14%</td> <td>0.05%</td> <td>0.17%</td> <td>0.00%</td> </tr> <tr> <td>Profanity</td> <td>0.00%</td> <td>0.00%</td> <td>0.00%</td> <td>0.00%</td> <td>0.00%</td> <td>0.00%</td> </tr> </tbody></table> ## Usage and limitations ### Intended usage Open Vision Language Models (VLMs) have a wide range of applications across various industries and domains. The following list of potential uses is not comprehensive. The purpose of this list is to provide contextual information about the possible use-cases that the model creators considered as part of model training and development. Fine-tune on specific vision-language task: * The pre-trained models can be fine-tuned on a wide range of vision-language tasks such as: image captioning, short video caption, visual question answering, text reading, object detection and object segmentation. * The pre-trained models can be fine-tuned for specific domains such as remote sensing question answering, visual questions from people who are blind, science question answering, describe UI element functionalities. * The pre-trained models can be fine-tuned for tasks with non-textual outputs such as bounding boxes or segmentation masks. 
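For detection-style outputs, PaliGemma encodes box coordinates as special location tokens rather than free-form numbers. The sketch below is a hypothetical decoder assuming the convention documented in the `big_vision` repository: four `<locXXXX>` tokens per box, each an integer in [0, 1023], emitted in y_min, x_min, y_max, x_max order and rescaled to the image size.

```python
import re

LOC_PATTERN = re.compile(r"<loc(\d{4})>")

def decode_boxes(text, image_width, image_height):
    """Parse '<locXXXX>' quadruples into pixel boxes (xmin, ymin, xmax, ymax).

    Assumes the big_vision convention: values in [0, 1023], emitted in
    y_min, x_min, y_max, x_max order and normalized by 1024.
    """
    vals = [int(v) for v in LOC_PATTERN.findall(text)]
    boxes = []
    for i in range(0, len(vals) - 3, 4):
        y0, x0, y1, x1 = vals[i:i + 4]
        boxes.append((
            x0 / 1024 * image_width,
            y0 / 1024 * image_height,
            x1 / 1024 * image_width,
            y1 / 1024 * image_height,
        ))
    return boxes

# e.g. a (hypothetical) model answer to the prompt "detect cat":
boxes = decode_boxes("<loc0256><loc0128><loc0768><loc0896> cat", 1024, 1024)
```

Segmentation outputs additionally use `<segXXX>` codeword tokens, which require the model's learned codebook to decode and are not covered by this sketch.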
Vision-language research: * The pre-trained models and fine-tuned models can serve as a foundation for researchers to experiment with VLM techniques, develop algorithms, and contribute to the advancement of the field. ### Ethical considerations and risks The development of vision-language models (VLMs) raises several ethical concerns. In creating an open model, we have carefully considered the following: * Bias and Fairness * VLMs trained on large-scale, real-world image-text data can reflect socio-cultural biases embedded in the training material. These models underwent careful scrutiny; the input data pre-processing is described and posterior evaluations are reported in this card. * Misinformation and Misuse * VLMs can be misused to generate text that is false, misleading, or harmful. * Guidelines are provided for responsible use with the model; see the [Responsible Generative AI Toolkit](https://ai.google.dev/responsible). * Transparency and Accountability * This model card summarizes details on the models' architecture, capabilities, limitations, and evaluation processes. * A responsibly developed open model offers the opportunity to share innovation by making VLM technology accessible to developers and researchers across the AI ecosystem. Risks identified and mitigations: * **Perpetuation of biases:** Continuous monitoring (using evaluation metrics, human review) and the exploration of de-biasing techniques during model training, fine-tuning, and other use cases are encouraged. * **Generation of harmful content:** Mechanisms and guidelines for content safety are essential. Developers are encouraged to exercise caution and implement appropriate content safety safeguards based on their specific product policies and application use cases. * **Misuse for malicious purposes:** Technical limitations and developer and end-user education can help mitigate malicious applications of VLMs.
Educational resources and reporting mechanisms for users to flag misuse are provided. Prohibited uses of Gemma models are outlined in the [Gemma Prohibited Use Policy](https://ai.google.dev/gemma/prohibited_use_policy). * **Privacy violations:** Models were trained on data filtered to remove certain personal information and sensitive data. Developers are encouraged to adhere to privacy regulations with privacy-preserving techniques. ### Limitations * Most limitations inherited from the underlying Gemma model still apply: * VLMs are better at tasks that can be framed with clear prompts and instructions. Open-ended or highly complex tasks might be challenging. * Natural language is inherently complex. VLMs might struggle to grasp subtle nuances, sarcasm, or figurative language. * VLMs generate responses based on information they learned from their training datasets, but they are not knowledge bases. They may generate incorrect or outdated factual statements. * VLMs rely on statistical patterns in language and images. They might lack the ability to apply common sense reasoning in certain situations. * PaliGemma was designed first and foremost to serve as a general pre-trained model for transfer to specialized tasks. Hence, its "out of the box" or "zero-shot" performance might lag behind models designed specifically for that. * PaliGemma is not a multi-turn chatbot. It is designed for a single round of image and text input. 
## Citation ```bibtex @article{beyer2024paligemma, title={{PaliGemma: A versatile 3B VLM for transfer}}, author={Lucas Beyer* and Andreas Steiner* and André Susano Pinto* and Alexander Kolesnikov* and Xiao Wang* and Daniel Salz and Maxim Neumann and Ibrahim Alabdulmohsin and Michael Tschannen and Emanuele Bugliarello and Thomas Unterthiner and Daniel Keysers and Skanda Koppula and Fangyu Liu and Adam Grycner and Alexey Gritsenko and Neil Houlsby and Manoj Kumar and Keran Rong and Julian Eisenschlos and Rishabh Kabra and Matthias Bauer and Matko Bošnjak and Xi Chen and Matthias Minderer and Paul Voigtlaender and Ioana Bica and Ivana Balazevic and Joan Puigcerver and Pinelopi Papalampidi and Olivier Henaff and Xi Xiong and Radu Soricut and Jeremiah Harmsen and Xiaohua Zhai*}, year={2024}, journal={arXiv preprint arXiv:2407.07726} } ``` Find the paper [here](https://arxiv.org/abs/2407.07726).
{"library_name": "big_vision", "license": "gemma", "pipeline_tag": "image-text-to-text", "tags": ["paligemma", "jax"], "extra_gated_heading": "Access PaliGemma on Hugging Face", "extra_gated_prompt": "To access PaliGemma on Hugging Face, you’re required to review and agree to Google’s usage license. To do this, please ensure you’re logged-in to Hugging Face and click below. Requests are processed immediately.", "extra_gated_button_content": "Acknowledge license"}
task
[ "QUESTION_ANSWERING", "TRANSLATION" ]
42,862
MultiBertGunjanPatrick/multiberts-seed-2-300k
MultiBertGunjanPatrick
null
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-2", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04Z
2021-10-04T05:03:33+00:00
109
0
--- datasets: - bookcorpus - wikipedia language: en license: apache-2.0 tags: - exbert - multiberts - multiberts-seed-2 --- # MultiBERTs Seed 2 Checkpoint 300k (uncased) This is the 300k-step intermediate checkpoint (seed 2) of a MultiBERTs (pretrained BERT) model, trained on English text using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-2](https://hf.co/multberts-seed-2). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not.
The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-2-300k') model = BertModel.from_pretrained("multiberts-seed-2-300k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.
## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after.
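The 80/10/10 masking scheme described above can be sketched in plain Python. This is a hedged illustration only — the token values and vocabulary below are made up, and this is not the original preprocessing code used to pretrain MultiBERTs:

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """BERT-style masking: select ~15% of positions; of those,
    80% become [MASK], 10% a different random token, 10% stay unchanged."""
    rng = rng or random.Random()
    masked = list(tokens)
    labels = [None] * len(tokens)  # original token at each selected position
    for i, tok in enumerate(tokens):
        if rng.random() >= mask_prob:
            continue  # position not selected for prediction
        labels[i] = tok
        r = rng.random()
        if r < 0.8:
            masked[i] = MASK_TOKEN  # 80% of selected cases
        elif r < 0.9:
            # 10%: a random token different from the original
            masked[i] = rng.choice([t for t in vocab if t != tok])
        # remaining 10%: the token is left as is
    return masked, labels
```

Training then computes the MLM loss only at positions where `labels` is not `None`.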
### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
null
Non_BioNLP
# MultiBERTs Seed 2 Checkpoint 300k (uncased) This is the 300k-step intermediate checkpoint (seed 2) of a MultiBERTs (pretrained BERT) model, trained on English text using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-2](https://hf.co/multberts-seed-2). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-2-300k') model = BertModel.from_pretrained("multiberts-seed-2-300k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.
## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after.
### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
{"datasets": ["bookcorpus", "wikipedia"], "language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-2"]}
task
[ "QUESTION_ANSWERING" ]
42,863
srikanth1579/project
srikanth1579
null
[ "is", "region:us" ]
2024-12-08T05:00:02Z
2024-12-08T05:03:58+00:00
0
0
--- language: - is --- Icendanic Meta-LLaMA 3.1 8B Model Model Description: The Icendanic Meta-LLaMA 3.1 8B is a fine-tuned language model built for tasks involving Icelandic text. It is designed to handle a variety of natural language processing tasks, including text generation, translation, and analysis, while emphasizing Icelandic language features. Intended Use: This model is intended for research and educational purposes, with a focus on: Icelandic language modeling and processing: Text generation and contextual understanding. Translation and evaluation tasks. Not suitable for: Sensitive or production-critical applications where guaranteed performance and low latency are required. Training: The model was fine-tuned using a curated dataset (Icelandic_cleaned.json) on Meta's LLaMA architecture. Training was performed on GPU resources, with loss convergence monitored using training_validation_loss_meta_llama.png. Training Framework: PyTorch Optimization Techniques: Hyperparameter tuning, learning rate adjustments, and validation-based monitoring. Metrics: The primary evaluation metrics used for this model are: Training Loss Validation Loss The training curves are available for reference in training_validation_loss_plot.png. Usage: The model can be loaded using the transformers library from Hugging Face:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("path/to/icendanic-model")
model = AutoModelForCausalLM.from_pretrained("path/to/icendanic-model")

# Example usage
input_text = "Hvernig er veðrið í dag?"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0]))
```

Limitations: Limited training dataset scope may restrict performance on out-of-domain Icelandic text. May exhibit biases present in the original dataset. Results may vary depending on task complexity and input length.
Citation: If you use this model, please cite as:

```bibtex
@misc{icendanic_model,
  author = {Icendanic Team},
  title  = {Icendanic Meta-LLaMA 3.1 8B Model},
  year   = {2024},
  note   = {https://huggingface.co/your-repository-name}
}
```

License: This model is released without any specific license. Please ensure compliance with the original dataset's terms and conditions when using this model. Acknowledgements: This project was developed as part of ongoing research and academic efforts.
null
Non_BioNLP
Icendanic Meta-LLaMA 3.1 8B Model Model Description: The Icendanic Meta-LLaMA 3.1 8B is a fine-tuned language model built for tasks involving Icelandic text. It is designed to handle a variety of natural language processing tasks, including text generation, translation, and analysis, while emphasizing Icelandic language features. Intended Use: This model is intended for research and educational purposes, with a focus on: Icelandic language modeling and processing: Text generation and contextual understanding. Translation and evaluation tasks. Not suitable for: Sensitive or production-critical applications where guaranteed performance and low latency are required. Training: The model was fine-tuned using a curated dataset (Icelandic_cleaned.json) on Meta's LLaMA architecture. Training was performed on GPU resources, with loss convergence monitored using training_validation_loss_meta_llama.png. Training Framework: PyTorch Optimization Techniques: Hyperparameter tuning, learning rate adjustments, and validation-based monitoring. Metrics: The primary evaluation metrics used for this model are: Training Loss Validation Loss The training curves are available for reference in training_validation_loss_plot.png. Usage: The model can be loaded using the transformers library from Hugging Face:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("path/to/icendanic-model")
model = AutoModelForCausalLM.from_pretrained("path/to/icendanic-model")

# Example usage
input_text = "Hvernig er veðrið í dag?"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0]))
```

Limitations: Limited training dataset scope may restrict performance on out-of-domain Icelandic text. May exhibit biases present in the original dataset. Results may vary depending on task complexity and input length.
Citation: If you use this model, please cite as:

```bibtex
@misc{icendanic_model,
  author = {Icendanic Team},
  title  = {Icendanic Meta-LLaMA 3.1 8B Model},
  year   = {2024},
  note   = {https://huggingface.co/your-repository-name}
}
```

License: This model is released without any specific license. Please ensure compliance with the original dataset's terms and conditions when using this model. Acknowledgements: This project was developed as part of ongoing research and academic efforts.
{"language": ["is"]}
task
[ "TRANSLATION" ]
42,864
aatmasidha/distilbert-base-uncased-newsmodelclassification
aatmasidha
text-classification
[ "transformers", "pytorch", "tensorboard", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-07-12T09:10:54Z
2022-07-18T09:04:59+00:00
121
0
--- datasets: - emotion license: apache-2.0 metrics: - accuracy - f1 tags: - generated_from_trainer model-index: - name: distilbert-base-uncased-newsmodelclassification results: - task: type: text-classification name: Text Classification dataset: name: emotion type: emotion args: default metrics: - type: accuracy value: 0.928 name: Accuracy - type: f1 value: 0.9278415074713384 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-newsmodelclassification This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2177 - Accuracy: 0.928 - F1: 0.9278 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8104 | 1.0 | 250 | 0.3057 | 0.9105 | 0.9084 | | 0.2506 | 2.0 | 500 | 0.2177 | 0.928 | 0.9278 | ### Framework versions - Transformers 4.13.0 - Pytorch 1.11.0+cu113 - Datasets 1.16.1 - Tokenizers 0.10.3
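As a rough illustration of the two metrics reported in the card above, here is a minimal pure-Python sketch. The averaging used for F1 — weighting per-class F1 by class support — is an assumption, since the card does not state which average was used:

```python
from collections import Counter

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the reference labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_f1(y_true, y_pred):
    """Per-class F1 scores averaged with class-support weights (an assumption;
    the card may have used a different average)."""
    classes = set(y_true) | set(y_pred)
    support = Counter(y_true)
    total = 0.0
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        total += f1 * support[c]
    return total / len(y_true)
```

In practice these numbers come from a metrics library rather than hand-rolled code; the sketch only shows what the reported values mean.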
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-newsmodelclassification This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2177 - Accuracy: 0.928 - F1: 0.9278 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8104 | 1.0 | 250 | 0.3057 | 0.9105 | 0.9084 | | 0.2506 | 2.0 | 500 | 0.2177 | 0.928 | 0.9278 | ### Framework versions - Transformers 4.13.0 - Pytorch 1.11.0+cu113 - Datasets 1.16.1 - Tokenizers 0.10.3
{"datasets": ["emotion"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-newsmodelclassification", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.928, "name": "Accuracy"}, {"type": "f1", "value": 0.9278415074713384, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,865
mofyrt/bert-base-uncased-finetuned-cola
mofyrt
text-classification
[ "transformers", "pytorch", "tensorboard", "bert", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-04-23T13:35:27Z
2022-04-23T18:04:55+00:00
112
0
--- datasets: - glue license: apache-2.0 metrics: - matthews_correlation tags: - generated_from_trainer model-index: - name: bert-base-uncased-finetuned-cola results: - task: type: text-classification name: Text Classification dataset: name: glue type: glue args: cola metrics: - type: matthews_correlation value: 0.5905946625710334 name: Matthews Correlation --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-uncased-finetuned-cola This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.7445 - Matthews Correlation: 0.5906 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | |:-------------:|:-----:|:----:|:---------------:|:--------------------:| | 0.4926 | 1.0 | 535 | 0.5155 | 0.4941 | | 0.2971 | 2.0 | 1070 | 0.5561 | 0.5320 | | 0.1947 | 3.0 | 1605 | 0.7230 | 0.5677 | | 0.1293 | 4.0 | 2140 | 0.7445 | 0.5906 | | 0.0867 | 5.0 | 2675 | 0.8836 | 0.5788 | ### Framework versions - Transformers 4.18.0 - Pytorch 1.10.0+cu111 - Datasets 2.1.0 - Tokenizers 0.12.1
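The Matthews correlation reported in the card above is computed from the binary confusion matrix (CoLA is a binary acceptability task). A minimal sketch — an illustration of the metric, not the implementation used during training:

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """Binary Matthews correlation coefficient from 0/1 labels.
    Returns 0.0 when any confusion-matrix margin is empty."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0
```

The metric ranges from -1 (total disagreement) through 0 (chance level) to +1 (perfect prediction), which is why a value near 0.59 indicates moderately strong agreement on CoLA.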
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-uncased-finetuned-cola This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.7445 - Matthews Correlation: 0.5906 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | |:-------------:|:-----:|:----:|:---------------:|:--------------------:| | 0.4926 | 1.0 | 535 | 0.5155 | 0.4941 | | 0.2971 | 2.0 | 1070 | 0.5561 | 0.5320 | | 0.1947 | 3.0 | 1605 | 0.7230 | 0.5677 | | 0.1293 | 4.0 | 2140 | 0.7445 | 0.5906 | | 0.0867 | 5.0 | 2675 | 0.8836 | 0.5788 | ### Framework versions - Transformers 4.18.0 - Pytorch 1.10.0+cu111 - Datasets 2.1.0 - Tokenizers 0.12.1
{"datasets": ["glue"], "license": "apache-2.0", "metrics": ["matthews_correlation"], "tags": ["generated_from_trainer"], "model-index": [{"name": "bert-base-uncased-finetuned-cola", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.5905946625710334, "name": "Matthews Correlation"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,866
pie/example-re-textclf-tacred
pie
null
[ "pytorch-ie", "pytorch", "TransformerTextClassificationModel", "en", "dataset:DFKI-SLT/tacred", "region:us" ]
2022-03-02T23:29:05Z
2024-02-29T13:02:19+00:00
25
2
--- datasets: - DFKI-SLT/tacred language: - en library_name: pytorch-ie --- This is a relation extraction model trained on the [TacRED dataset](https://huggingface.co/datasets/DFKI-SLT/tacred). It was developed within the [PyTorch-IE framework](https://github.com/ArneBinder/pytorch-ie). See [this HF space](https://huggingface.co/spaces/pie/Joint-NER-and-Relation-Extraction) for a usage example.
null
Non_BioNLP
This is a relation extraction model trained on the [TacRED dataset](https://huggingface.co/datasets/DFKI-SLT/tacred). It was developed within the [PyTorch-IE framework](https://github.com/ArneBinder/pytorch-ie). See [this HF space](https://huggingface.co/spaces/pie/Joint-NER-and-Relation-Extraction) for a usage example.
{"datasets": ["DFKI-SLT/tacred"], "language": ["en"], "library_name": "pytorch-ie"}
task
[ "RELATION_EXTRACTION" ]
42,867
classla/whisper-large-v3-mici-princ
classla
automatic-speech-recognition
[ "transformers", "safetensors", "whisper", "automatic-speech-recognition", "hr", "dataset:classla/Mici_Princ", "base_model:openai/whisper-large-v3", "base_model:finetune:openai/whisper-large-v3", "license:cc-by-sa-4.0", "endpoints_compatible", "region:us" ]
2024-03-14T10:00:55Z
2024-03-26T09:31:01+00:00
35
1
--- base_model: openai/whisper-large-v3 datasets: - classla/Mici_Princ language: - hr library_name: transformers license: cc-by-sa-4.0 metrics: - wer - cer pipeline_tag: automatic-speech-recognition widget: - example_title: example 1 src: https://huggingface.co/classla/whisper-large-v3-mici-princ/raw/main/MP_13_65.37-74.67.mp3.wav - example_title: example 2 src: https://huggingface.co/classla/whisper-large-v3-mici-princ/raw/main/MP_15_201.53-210.02.mp3.wav - example_title: example 3 src: https://huggingface.co/classla/whisper-large-v3-mici-princ/raw/main/MP_15_60.527-67.71.mp3.wav - example_title: example 4 src: https://huggingface.co/classla/whisper-large-v3-mici-princ/raw/main/MP_15_68.5-72.45.mp3.wav --- # Model Card for Model ID This model was finetuned on the [Mići Princ dataset](https://huggingface.co/datasets/classla/Mici_Princ), the audiobook of the translation of _Le Petit Prince_ into the Chakavian dialect of Croatian. ## Model Details ### Model Description The model, already very potent in standard Croatian, was finetuned for 80 epochs with an effective batch size of 16. Performance was inspected every 4 epochs, and the latest checkpoint is uploaded here. Character error rate has been brought down from 11.54% to 3.95%, while word error rate has been lowered from 35.43% to 16.83%. - **Developed by:** Nikola Ljubešić, Peter Rupnik, Tea Perinčić - **Language(s) (NLP):** Croatian (hrv) - Chakavian dialect (ckm) - **License:** Creative Commons - Share Alike 4.0 - **Finetuned from model:** openai/whisper-large-v3 ### Model Sources <!-- Provide the basic links for the model. 
--> - **Repository:** [GitHub](https://github.com/5roop/mici_princ_whisper) - **Paper:** Coming soon - **Dataset:** [Mići Princ](https://huggingface.co/datasets/classla/Mici_Princ) ## Example use: ```python import torch from datasets import load_dataset from transformers import AutoModelForSpeechSeq2Seq, AutoProcessor, pipeline from transformers.pipelines.pt_utils import KeyDataset device = torch.device("cuda" if torch.cuda.is_available() else "cpu") model_id = "classla/whisper-large-v3-mici-princ" model = AutoModelForSpeechSeq2Seq.from_pretrained( model_id, ) model.to(device) processor = AutoProcessor.from_pretrained(model_id) ds = load_dataset("classla/Mici_Princ", split="test") pipe = pipeline( "automatic-speech-recognition", model=model, tokenizer=processor.tokenizer, feature_extractor=processor.feature_extractor, max_new_tokens=128, chunk_length_s=30, batch_size=16, return_timestamps=True, device=device, ) result = pipe( KeyDataset(ds, "audio"), generate_kwargs={"language": "croatian"}, ) for i in result: print(i) # Output: # {'text': ' Šesti planet je biv deset put veći. Na njin je bivav niki stari čovik ki je pisav vele knjige.', 'chunks': [{'timestamp': (0.0, 7.18), 'text': ' Šesti planet je biv deset put veći. Na njin je bivav niki stari čovik ki je pisav vele knjige.'}]} # ... ``` ## Training Details #### Preprocessing Model was trained on the `normalized_text` attribute of the [Mići Princ dataset](https://huggingface.co/datasets/classla/Mici_Princ). This means that the data included capital letters and punctuation, except bullet points, newlines, and quotation marks. Special characters, present in the dialect, but not in standard Croatian, were substituted. Only the `train` split was used in training. 
#### Training Hyperparameters ``` per_device_train_batch_size=4, gradient_accumulation_steps=4, learning_rate=1e-5, warmup_steps=100, max_steps=277 * 80, gradient_checkpointing=True, predict_with_generate=True, generation_max_length=225, save_steps=277, ``` ## Evaluation For evaluation, the `test` split of the [Mići Princ dataset](https://huggingface.co/datasets/classla/Mici_Princ) was used. The test split consists of two known speakers, Autor and Mići Princ, and two unknown speakers, Geograf and Dilavac. It is important to note that each speaker uses a different micro-dialect, so the test data is challenging in that it includes two new micro-dialects. #### Metrics | speaker | WER vanilla | WER fine-tuned | WER reduction | CER vanilla | CER fine-tuned | CER reduction | |---|---|---|---|---|---|---| | all | 35.43% | 16.83% | 52.50% | 11.54% | 3.95% | 65.77% | | Autor | 38.96% | 14.29% | 63.32% | 10.24% | 2.93% | 71.39% | | Geograf | 20.94% | 11.57% | 44.75% | 4.99% | 2.19% | 56.11% | | Mići Princ | 45.32% | 16.62% | 63.33% | 12.21% | 5.09% | 58.31% | | Dilavac | 39.60% | 23.70% | 40.15% | 18.55% | 5.27% | 71.59% | ## Citation Coming soon. ## Model Card Authors * Peter Rupnik * Nikola Ljubešić ## Model Card Contact [https://huggingface.co/5roop](https://huggingface.co/5roop)
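The relative WER/CER reduction figures in the metrics table above follow from the vanilla and fine-tuned error rates; a small sketch of the arithmetic:

```python
def relative_reduction(vanilla: float, fine_tuned: float) -> float:
    """Percentage of the vanilla error rate removed by fine-tuning."""
    return (vanilla - fine_tuned) / vanilla * 100

# "all" row of the metrics table (values in percent).
print(f"WER reduction: {relative_reduction(35.43, 16.83):.2f}%")  # 52.50%
print(f"CER reduction: {relative_reduction(11.54, 3.95):.2f}%")   # 65.77%
```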
null
Non_BioNLP
# Model Card for Model ID This model was finetuned on the [Mići Princ dataset](https://huggingface.co/datasets/classla/Mici_Princ), the audiobook of the translation of _Le Petit Prince_ into the Chakavian dialect of Croatian. ## Model Details ### Model Description The model, already very potent in standard Croatian, was finetuned for 80 epochs with an effective batch size of 16. Performance was inspected every 4 epochs, and the latest checkpoint is uploaded here. Character error rate has been brought down from 11.54% to 3.95%, while word error rate has been lowered from 35.43% to 16.83%. - **Developed by:** Nikola Ljubešić, Peter Rupnik, Tea Perinčić - **Language(s) (NLP):** Croatian (hrv) - Chakavian dialect (ckm) - **License:** Creative Commons - Share Alike 4.0 - **Finetuned from model:** openai/whisper-large-v3 ### Model Sources <!-- Provide the basic links for the model. --> - **Repository:** [GitHub](https://github.com/5roop/mici_princ_whisper) - **Paper:** Coming soon - **Dataset:** [Mići Princ](https://huggingface.co/datasets/classla/Mici_Princ) ## Example use: ```python import torch from datasets import load_dataset from transformers import AutoModelForSpeechSeq2Seq, AutoProcessor, pipeline from transformers.pipelines.pt_utils import KeyDataset device = torch.device("cuda" if torch.cuda.is_available() else "cpu") model_id = "classla/whisper-large-v3-mici-princ" model = AutoModelForSpeechSeq2Seq.from_pretrained( model_id, ) model.to(device) processor = AutoProcessor.from_pretrained(model_id) ds = load_dataset("classla/Mici_Princ", split="test") pipe = pipeline( "automatic-speech-recognition", model=model, tokenizer=processor.tokenizer, feature_extractor=processor.feature_extractor, max_new_tokens=128, chunk_length_s=30, batch_size=16, return_timestamps=True, device=device, ) result = pipe( KeyDataset(ds, "audio"), generate_kwargs={"language": "croatian"}, ) for i in result: print(i) # Output: # {'text': ' Šesti planet je biv deset put veći. 
Na njin je bivav niki stari čovik ki je pisav vele knjige.', 'chunks': [{'timestamp': (0.0, 7.18), 'text': ' Šesti planet je biv deset put veći. Na njin je bivav niki stari čovik ki je pisav vele knjige.'}]} # ... ``` ## Training Details #### Preprocessing Model was trained on the `normalized_text` attribute of the [Mići Princ dataset](https://huggingface.co/datasets/classla/Mici_Princ). This means that the data included capital letters and punctuation, except bullet points, newlines, and quotation marks. Special characters, present in the dialect, but not in standard Croatian, were substituted. Only the `train` split was used in training. #### Training Hyperparameters ``` per_device_train_batch_size=4, gradient_accumulation_steps=4, learning_rate=1e-5, warmup_steps=100, max_steps=277 * 80, gradient_checkpointing=True, predict_with_generate=True, generation_max_length=225, save_steps=277, ``` ## Evaluation For evaluation, the `test` split of the [Mići Princ dataset](https://huggingface.co/datasets/classla/Mici_Princ) was used. The test split consists of two known speakers, Autor and Mići Princ, and two unknown speakers, Geograf and Dilavac. It is important to note that each speaker uses a different micro-dialect, so the test data is challenging in that it includes two new micro-dialects. #### Metrics | speaker | WER vanilla | WER fine-tuned | WER reduction | CER vanilla | CER fine-tuned | CER reduction | |---|---|---|---|---|---|---| | all | 35.43% | 16.83% | 52.50% | 11.54% | 3.95% | 65.77% | | Autor | 38.96% | 14.29% | 63.32% | 10.24% | 2.93% | 71.39% | | Geograf | 20.94% | 11.57% | 44.75% | 4.99% | 2.19% | 56.11% | | Mići Princ | 45.32% | 16.62% | 63.33% | 12.21% | 5.09% | 58.31% | | Dilavac | 39.60% | 23.70% | 40.15% | 18.55% | 5.27% | 71.59% | ## Citation Coming soon. ## Model Card Authors * Peter Rupnik * Nikola Ljubešić ## Model Card Contact [https://huggingface.co/5roop](https://huggingface.co/5roop)
{"base_model": "openai/whisper-large-v3", "datasets": ["classla/Mici_Princ"], "language": ["hr"], "library_name": "transformers", "license": "cc-by-sa-4.0", "metrics": ["wer", "cer"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "example 1", "src": "https://huggingface.co/classla/whisper-large-v3-mici-princ/raw/main/MP_13_65.37-74.67.mp3.wav"}, {"example_title": "example 2", "src": "https://huggingface.co/classla/whisper-large-v3-mici-princ/raw/main/MP_15_201.53-210.02.mp3.wav"}, {"example_title": "example 3", "src": "https://huggingface.co/classla/whisper-large-v3-mici-princ/raw/main/MP_15_60.527-67.71.mp3.wav"}, {"example_title": "example 4", "src": "https://huggingface.co/classla/whisper-large-v3-mici-princ/raw/main/MP_15_68.5-72.45.mp3.wav"}]}
task
[ "TRANSLATION" ]
42,868
yelyah/mT5-XLSUM-ua-news
yelyah
summarization
[ "transformers", "safetensors", "mt5", "text2text-generation", "news", "summarization", "uk", "dataset:FIdo-AI/ua-news", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-05-20T17:51:59Z
2024-09-28T13:26:33+00:00
153
1
--- datasets: - FIdo-AI/ua-news language: - uk library_name: transformers metrics: - rouge pipeline_tag: summarization tags: - news --- # Model Card for Model ID ## Model Summary The mT5-multilingual-XLSum model was fine-tuned on the UA-News dataset to generate concise and accurate news headlines in the Ukrainian language. ## Training - **Epochs**: 4 - **Batch Size**: 4 - **Learning Rate**: 4e-5 ## Evaluation - **Metrics**: The model's performance on the test set. - **ROUGE-1**: 0.2452 - **ROUGE-2**: 0.1075 - **ROUGE-L**: 0.2348 - **BERTScore**: 0.7573 ## Usage ```python from transformers import pipeline summarizer = pipeline("summarization", model="yelyah/mT5-XLSUM-ua-news") article = "Your news article text here." summary = summarizer(article) print(summary) ```
null
Non_BioNLP
# Model Card for Model ID ## Model Summary The mT5-multilingual-XLSum model was fine-tuned on the UA-News dataset to generate concise and accurate news headlines in the Ukrainian language. ## Training - **Epochs**: 4 - **Batch Size**: 4 - **Learning Rate**: 4e-5 ## Evaluation - **Metrics**: The model's performance on the test set. - **ROUGE-1**: 0.2452 - **ROUGE-2**: 0.1075 - **ROUGE-L**: 0.2348 - **BERTScore**: 0.7573 ## Usage ```python from transformers import pipeline summarizer = pipeline("summarization", model="yelyah/mT5-XLSUM-ua-news") article = "Your news article text here." summary = summarizer(article) print(summary) ```
{"datasets": ["FIdo-AI/ua-news"], "language": ["uk"], "library_name": "transformers", "metrics": ["rouge"], "pipeline_tag": "summarization", "tags": ["news"]}
task
[ "SUMMARIZATION" ]
42,869
fine-tuned/NFCorpus-8-8-gpt-4o-2024-05-13-978964
fine-tuned
feature-extraction
[ "sentence-transformers", "safetensors", "bert", "feature-extraction", "sentence-similarity", "mteb", "Medical", "Nutrition", "Queries", "Documents", "Relevance", "en", "dataset:fine-tuned/NFCorpus-8-8-gpt-4o-2024-05-13-978964", "dataset:allenai/c4", "license:apache-2.0", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
2024-05-23T20:57:24Z
2024-05-23T20:57:57+00:00
10
0
--- datasets: - fine-tuned/NFCorpus-8-8-gpt-4o-2024-05-13-978964 - allenai/c4 language: - en license: apache-2.0 pipeline_tag: feature-extraction tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb - Medical - Nutrition - Queries - Documents - Relevance --- This model is a fine-tuned version of [**BAAI/bge-large-en-v1.5**](https://huggingface.co/BAAI/bge-large-en-v1.5) designed for the following use case: medical information retrieval ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/NFCorpus-8-8-gpt-4o-2024-05-13-978964', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
null
BioNLP
This model is a fine-tuned version of [**BAAI/bge-large-en-v1.5**](https://huggingface.co/BAAI/bge-large-en-v1.5) designed for the following use case: medical information retrieval ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/NFCorpus-8-8-gpt-4o-2024-05-13-978964', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
{"datasets": ["fine-tuned/NFCorpus-8-8-gpt-4o-2024-05-13-978964", "allenai/c4"], "language": ["en"], "license": "apache-2.0", "pipeline_tag": "feature-extraction", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "mteb", "Medical", "Nutrition", "Queries", "Documents", "Relevance"]}
task
[ "TEXT_CLASSIFICATION" ]
42,870
alexgusevski/Lucie-7B-q8-mlx
alexgusevski
text-generation
[ "mlx", "safetensors", "llama", "pretrained", "llama-3", "openllm-france", "text-generation", "conversational", "fr", "en", "it", "de", "es", "dataset:OpenLLM-France/Lucie-Training-Dataset", "base_model:OpenLLM-France/Lucie-7B", "base_model:quantized:OpenLLM-France/Lucie-7B", "license:apache-2.0", "8-bit", "region:us" ]
2025-02-23T10:49:27Z
2025-02-23T11:01:05+00:00
15
0
--- base_model: OpenLLM-France/Lucie-7B datasets: - OpenLLM-France/Lucie-Training-Dataset language: - fr - en - it - de - es license: apache-2.0 pipeline_tag: text-generation tags: - pretrained - llama-3 - openllm-france - mlx widget: - text: 'Quelle est la capitale de l''Espagne ? Madrid. Quelle est la capitale de la France ?' example_title: Capital cities in French group: 1-shot Question Answering training_progress: num_steps: 756291 num_tokens: 3131736326144 context_length: 32000 --- # alexgusevski/Lucie-7B-q8-mlx The Model [alexgusevski/Lucie-7B-q8-mlx](https://huggingface.co/alexgusevski/Lucie-7B-q8-mlx) was converted to MLX format from [OpenLLM-France/Lucie-7B](https://huggingface.co/OpenLLM-France/Lucie-7B) using mlx-lm version **0.21.4**. ## Use with mlx ```bash pip install mlx-lm ``` ```python from mlx_lm import load, generate model, tokenizer = load("alexgusevski/Lucie-7B-q8-mlx") prompt = "hello" if tokenizer.chat_template is not None: messages = [{"role": "user", "content": prompt}] prompt = tokenizer.apply_chat_template( messages, add_generation_prompt=True ) response = generate(model, tokenizer, prompt=prompt, verbose=True) ```
null
Non_BioNLP
# alexgusevski/Lucie-7B-q8-mlx The Model [alexgusevski/Lucie-7B-q8-mlx](https://huggingface.co/alexgusevski/Lucie-7B-q8-mlx) was converted to MLX format from [OpenLLM-France/Lucie-7B](https://huggingface.co/OpenLLM-France/Lucie-7B) using mlx-lm version **0.21.4**. ## Use with mlx ```bash pip install mlx-lm ``` ```python from mlx_lm import load, generate model, tokenizer = load("alexgusevski/Lucie-7B-q8-mlx") prompt = "hello" if tokenizer.chat_template is not None: messages = [{"role": "user", "content": prompt}] prompt = tokenizer.apply_chat_template( messages, add_generation_prompt=True ) response = generate(model, tokenizer, prompt=prompt, verbose=True) ```
{"base_model": "OpenLLM-France/Lucie-7B", "datasets": ["OpenLLM-France/Lucie-Training-Dataset"], "language": ["fr", "en", "it", "de", "es"], "license": "apache-2.0", "pipeline_tag": "text-generation", "tags": ["pretrained", "llama-3", "openllm-france", "mlx"], "widget": [{"text": "Quelle est la capitale de l'Espagne ? Madrid.\nQuelle est la capitale de la France ?", "example_title": "Capital cities in French", "group": "1-shot Question Answering"}], "training_progress": {"num_steps": 756291, "num_tokens": 3131736326144, "context_length": 32000}}
task
[ "QUESTION_ANSWERING" ]
42,871
CYFRAGOVPL/Llama-PLLuM-70B-base
CYFRAGOVPL
null
[ "safetensors", "llama", "pl", "license:llama3.1", "region:us" ]
2025-02-06T21:19:46Z
2025-03-11T14:14:34+00:00
97
0
--- language: - pl license: llama3.1 --- <p align="center"> <img src="https://pllum.org.pl/_nuxt/PLLuM_logo_RGB_color.DXNEc-VR.png"> </p> # PLLuM: A Family of Polish Large Language Models ## Overview PLLuM is a family of large language models (LLMs) specialized in Polish and other Slavic/Baltic languages, with additional English data incorporated for broader generalization. Developed through an extensive collaboration with various data providers, PLLuM models are built on high-quality text corpora and refined through instruction tuning, preference learning, and advanced alignment techniques. These models are intended to generate contextually coherent text, offer assistance in various tasks (e.g., question answering, summarization), and serve as a foundation for specialized applications such as domain-specific intelligent assistants. ### Key Highlights - **Extensive Data Collection** We gathered large-scale, high-quality text data in Polish (around 150B tokens after cleaning and deduplication) and additional text in Slavic, Baltic, and English languages. Part of these tokens (28B) can be used in fully open-source models, including for commercial use (in compliance with relevant legal regulations). - **Organic Instruction Dataset** We curated the largest Polish collection of manually created “organic instructions” (~40k prompt-response pairs, including ~3.5k multi-turn dialogs). This human-authored instruction set is based on an extensive typology of human-model interactions and it covers a range of subtle aspects of supervised fine-tuning (SFT) that might be overlooked with automated approaches (including large scale distillation of 'strong LLMs'). It was also designed to mitigate negative linguistic transfer from non-Polish textual data used in the pre-training phase. - **Polish Preference Corpus** We created the first Polish-language preference corpus, featuring prompts and multiple model responses manually assessed by a demographically diverse team of annotators. 
This dataset teaches the model not only correctness (factual and linguistic) but also balance and safety, especially for potentially controversial or adversarial topics. - **Evaluation Benchmarks** We developed custom benchmarks to evaluate our models on tasks relevant to Polish public administration, where PLLuM achieved top scores among all tested models. In broader Polish-language tasks, PLLuM models also attain state-of-the-art results. ## Model Description Below is a summary of the main PLLuM models, including their licenses, bases, and parameter sizes. All model names link to specific Hugging Face resources, while the base models and licenses link to their respective sources or license references. Note that all *-nc-* models are intended for non-commercial use. The models with fully open licenses are continuously pretrained on approximately 30 billion tokens of Polish text due to copyright considerations. The models with CC-BY-NC-4.0 licenses used approximately 150 billion tokens of Polish text. The models with the -nc and -chat suffix were aligned on human preferences and are generally safer and more efficient to use in dialog, general-purpose scenarios. 
| Model Name | Params | License | Based On | |-------------------------------------------------------|----------------------|---------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------| | [Llama-PLLuM-8B-base](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-8B-base) | 8B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-8B/blob/main/LICENSE) | [Llama3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) | | [Llama-PLLuM-8B-instruct](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-8B-instruct) | 8B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-8B/blob/main/LICENSE) | [Llama3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) | | [Llama-PLLuM-8B-chat](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-8B-chat) | 8B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-8B/blob/main/LICENSE) | [Llama3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) | | [PLLuM-12B-base](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-base) | 12B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-12B-instruct](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-instruct) | 12B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-12B-chat](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-chat) | 12B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-12B-nc-base](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-nc-base) | 12B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | 
[PLLuM-12B-nc-instruct](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-nc-instruct) | 12B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-12B-nc-chat](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-nc-chat) | 12B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-8x7B-base](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-base) | 8×7B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-instruct](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-instruct) | 8×7B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-chat](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-chat) | 8×7B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-nc-base](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-nc-base) | 8×7B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-nc-instruct](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-nc-instruct) | 8×7B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-nc-chat](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-nc-chat) | 8×7B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [Llama-PLLuM-70B-base](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-70B-base) | 70B | [Llama 
3.1](https://huggingface.co/meta-llama/Llama-3.1-70B/blob/main/LICENSE) | [Llama-3.1-70B](https://huggingface.co/meta-llama/Llama-3.1-70B) | | [Llama-PLLuM-70B-instruct](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-70B-instruct) | 70B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-70B/blob/main/LICENSE) | [Llama-3.1-70B](https://huggingface.co/meta-llama/Llama-3.1-70B) | | [Llama-PLLuM-70B-chat](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-70B-chat) | 70B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-70B/blob/main/LICENSE) | [Llama-3.1-70B](https://huggingface.co/meta-llama/Llama-3.1-70B) | ### Model Development - **Pretraining**: All models were pretrained or continued-pretrained on large-scale Polish corpora (up to 150B tokens) plus a range of additional Slavic/Baltic and English texts. - **Instruction Fine-Tuning**: We refined the models on manually curated Polish “organic instructions” (approx. 40k), converted instructions from premium Polish corpora (approx. 50k), and synthetic instructions generated by strong LLMs (approx. 10k). - **Alignment and Preference Learning**: Manually annotated preference data taught the models to produce safer, balanced, and contextually appropriate responses, even in adversarial or sensitive cases. - **Domain-Specific Adaptations**: Specialized RAG-based (Retrieval Augmented Generation) models were developed for tasks like public administration, demonstrating strong performance in complex information retrieval and question answering. ## Intended Use Cases - **General Language Tasks**: Text generation, summarization, question answering, etc. - **Domain-Specific Assistants**: Especially effective for Polish public administration and legal or bureaucratic topics where domain-aware retrieval is required. - **Research & Development**: Building blocks for downstream AI applications in academic or industrial settings, where a strong command of the Polish language is essential. 
## How to Use Each PLLuM model can be loaded via the Hugging Face Transformers library (or compatible frameworks). For RAG-based scenarios, pair the model with a relevant vector store or document retrieval system. Below are some recommended steps and code snippets: ### 1. Installation Make sure you have the latest versions of `transformers` and `torch` (or another compatible deep learning framework) installed: ```bash pip install transformers accelerate torch ``` ### 2. Loading the Model Use the following example to load one of the PLLuM models: ```python from transformers import AutoTokenizer, AutoModelForCausalLM model_name = "CYFRAGOVPL/PLLuM-12B-chat" # Replace with the PLLuM model name of your choice tokenizer = AutoTokenizer.from_pretrained(model_name) model = AutoModelForCausalLM.from_pretrained(model_name) ``` ### 3. Using bfloat16 (BF16) If your hardware (e.g., newer GPUs) supports bfloat16, you can reduce memory usage and potentially speed up inference: ```python import torch from transformers import AutoTokenizer, AutoModelForCausalLM model_name = "CYFRAGOVPL/PLLuM-12B-chat" tokenizer = AutoTokenizer.from_pretrained(model_name) # Load model in bfloat16 precision model = AutoModelForCausalLM.from_pretrained( model_name, torch_dtype=torch.bfloat16, device_map="auto" # automatically places model layers on available devices ) ``` ### 4. Generating an Example Text ```python prompt = "Napisz krótki wiersz o wiośnie." # EN: "Write a short poem about spring." inputs = tokenizer(prompt, return_tensors="pt").to(model.device) outputs = model.generate( **inputs, max_new_tokens=50, do_sample=True, top_k=50, top_p=0.9, temperature=0.7 ) generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True) print(generated_text) ``` ### 5. Expected Output Below is a sample (hypothetical) output for the prompt above: ```text Przykładowy wiersz o tematyce wiosennej: Wiosna, wiosna, wiosna, ach to ty! Kwiecień plecień wciąż przeplata, trochę zimy, trochę lata. 
A ja nie mogę się już doczekać, kiedy w kalendarzu ujrzę maj. Wtedy wszystko wkoło rozkwita, a ptaki tak pięknie śpiewają. Wiosno, wiosno, czekam z utęsknieniem, zrób mi tę przyjemność i przyjdź wreszcie, proszę! ``` Your results may vary depending on model parameters (e.g., temperature, top_k, top_p), hardware, and other settings. ### 6. Retrieval Augmented Generation (RAG) Our Llama-PLLuM models (both chat and instruct versions) were additionally trained to perform well in a Retrieval Augmented Generation (RAG) setting. The prompt is in .jinja format, where `docs` is a list of document texts and `question` is a query that should be answered based on the provided documents. If there is no answer in the provided documents, the model generates "Nie udało mi się odnaleźć odpowiedzi na pytanie". Prompt: ``` Numerowana lista dokumentów jest poniżej: --------------------- <results>{% for doc in docs %} Dokument: {{ loop.index0 }} {{ doc }} {% endfor %}</results> --------------------- Odpowiedz na pytanie użytkownika wykorzystując tylko informacje znajdujące się w dokumentach, a nie wcześniejszą wiedzę. Udziel wysokiej jakości, poprawnej gramatycznie odpowiedzi w języku polskim. Odpowiedź powinna zawierać cytowania do dokumentów, z których pochodzą informacje. Zacytuj dokument za pomocą symbolu [nr_dokumentu] powołując się na fragment np. [0] dla fragmentu z dokumentu 0. Jeżeli w dokumentach nie ma informacji potrzebnych do odpowiedzi na pytanie, zamiast odpowiedzi zwróć tekst: "Nie udało mi się odnaleźć odpowiedzi na pytanie". Pytanie: {{ question }} ``` ## Training Procedure - **Datasets**: ~150B tokens from Polish and multilingual sources, with ~28B tokens available for fully open-source commercial use. - **Hyperparameters**: Vary based on model size, typically including Adam or AdamW optimizers, a range of batch sizes, and carefully tuned learning rates. - **Hardware & Duration**: Training was performed using the [Bem2](https://man.e-science.pl/pl/kdm/bem2) HPC (up to 300 H100 GPUs). 
Each model’s training time depends on parameter size and hardware configuration (~8 to ~25 days on a multi-GPU cluster for 8B–70B sizes). ## Evaluation and Benchmarks - **Public Administration**: PLLuM models demonstrated top-tier performance in specialized tasks relevant to government services. - **Polish Language Tasks**: Across a variety of internal benchmarks and standard corpora, PLLuM consistently outperforms other models in accuracy, coherence, and safety metrics. - **Custom Tests**: A unique preference corpus and alignment tests ensure robust, safe, and contextually accurate responses. ## Limitations and Bias - **Potential Hallucinations**: Like other LLMs, PLLuM may occasionally produce factually incorrect or fabricated content. - **Sensitivity & Bias**: While extensive preference learning has been done, biases might still emerge, especially in controversial or subjective topics. - **Context Length**: Very long context tasks may challenge certain models, depending on memory constraints. ## Ethical Considerations PLLuM models are designed for constructive and responsible usage. Users should exercise caution when deploying them in production scenarios, especially for sensitive or regulated domains. Despite efforts to minimize harmful outputs, there is always a risk of generating offensive, biased, or inappropriate text. Human oversight and due diligence are advised. ## Citation If you use PLLuM models or any part of this repository in your research or deployment, please cite as follows (BibTeX): ``` @unpublished{pllum2025, title={PLLuM: A Family of Polish Large Language Models}, author={PLLuM Consortium}, year={2025} } ``` ## License Different models within the PLLuM family are published under various licenses (Apache 2.0, CC-BY-NC-4.0, or Llama 3.1 license). Check each model’s entry in the table above for details. 
## Creators & Consortium The PLLuM project is a unique collaboration between leading Polish scientific institutions and experts from various fields, working together to create a groundbreaking Polish language model. This research partnership combines diverse competencies and passions, forming a robust foundation for advancing AI in Poland. <table style="border: none; border-collapse: collapse;"> <tr> <td align="center" valign="middle" style="border: none;"> <a href="https://pwr.edu.pl/"> <img src="https://pllum.org.pl/_nuxt/pwr.D1_x0B58.png" alt="pwr.D1_x0B58.png" width="100"> </a> <br><strong>Politechnika Wrocławska</strong><br><em>– Project Leader</em> </td> <td align="center" valign="middle" style="border: none;"> <a href="https://www.nask.pl/"> <img src="https://pllum.org.pl/_nuxt/nask.Bz8rmSzR.png" alt="nask.Bz8rmSzR.png" width="100"> </a> <br><strong>NASK PIB</strong> </td> <td align="center" valign="middle" style="border: none;"> <a href="https://www.ipipan.waw.pl/"> <img src="https://clarin.biz/_nuxt/img/ipipan.294d39c.png" alt="ipipan.294d39c.png" width="100"> </a> <br><strong>Instytut Podstaw Informatyki PAN</strong> </td> </tr> <tr> <td align="center" valign="middle" style="border: none;"> <a href="https://opi.org.pl/"> <img src="https://pllum.org.pl/_nuxt/opi.CF-COwcC.png" alt="opi.CF-COwcC.png" width="100"> </a> <br><strong>Ośrodek Przetwarzania Informacji PIB</strong> </td> <td align="center" valign="middle" style="border: none;"> <a href="https://www.uni.lodz.pl/"> <img src="https://pllum.org.pl/_nuxt/ul.aTSgr_W6.png" alt="ul.aTSgr_W6.png" width="100"> </a> <br><strong>Uniwersytet Łódzki</strong> </td> <td align="center" valign="middle" style="border: none;"> <a href="https://ispan.waw.pl/default/"> <img src="https://pllum.org.pl/_nuxt/is.Dqb94VRb.png" alt="is.Dqb94VRb.png" width="100"> </a> <br><strong>Instytut Slawistyki PAN</strong> </td> </tr> </table> ## Contact and Support For questions or contributions, please reach out via: <[email 
protected]> We welcome feedback, collaboration, and further exploration of PLLuM models! ## Acknowledgements Project financed by the Minister of Digital Affairs under the targeted subsidy No. 1/WI/DBiI/2023: *“Responsible development of the open large language model PLLuM (Polish Large Language Model) to support breakthrough technologies in the public and economic sector, including an open, Polish-language intelligent assistant for petitioners.”* **Funding Amount:** 14,504,392.00 PLN **Contract Signing Date:** 2024-01-22
null
Non_BioNLP
<p align="center"> <img src="https://pllum.org.pl/_nuxt/PLLuM_logo_RGB_color.DXNEc-VR.png"> </p> # PLLuM: A Family of Polish Large Language Models ## Overview PLLuM is a family of large language models (LLMs) specialized in Polish and other Slavic/Baltic languages, with additional English data incorporated for broader generalization. Developed through an extensive collaboration with various data providers, PLLuM models are built on high-quality text corpora and refined through instruction tuning, preference learning, and advanced alignment techniques. These models are intended to generate contextually coherent text, offer assistance in various tasks (e.g., question answering, summarization), and serve as a foundation for specialized applications such as domain-specific intelligent assistants. ### Key Highlights - **Extensive Data Collection** We gathered large-scale, high-quality text data in Polish (around 150B tokens after cleaning and deduplication) and additional text in Slavic, Baltic, and English languages. Part of these tokens (28B) can be used in fully open-source models, including for commercial use (in compliance with relevant legal regulations). - **Organic Instruction Dataset** We curated the largest Polish collection of manually created “organic instructions” (~40k prompt-response pairs, including ~3.5k multi-turn dialogs). This human-authored instruction set is based on an extensive typology of human-model interactions and it covers a range of subtle aspects of supervised fine-tuning (SFT) that might be overlooked with automated approaches (including large scale distillation of 'strong LLMs'). It was also designed to mitigate negative linguistic transfer from non-Polish textual data used in the pre-training phase. - **Polish Preference Corpus** We created the first Polish-language preference corpus, featuring prompts and multiple model responses manually assessed by a demographically diverse team of annotators. 
This dataset teaches the model not only correctness (factual and linguistic) but also balance and safety, especially for potentially controversial or adversarial topics. - **Evaluation Benchmarks** We developed custom benchmarks to evaluate our models on tasks relevant to Polish public administration, where PLLuM achieved top scores among all tested models. In broader Polish-language tasks, PLLuM models also attain state-of-the-art results. ## Model Description Below is a summary of the main PLLuM models, including their licenses, bases, and parameter sizes. Each model name links to its Hugging Face resource, while the base models and licenses link to their respective sources or license references. Note that all *-nc-* models are intended for non-commercial use. The models with fully open licenses were continually pretrained on approximately 30 billion tokens of Polish text due to copyright considerations. The models with CC-BY-NC-4.0 licenses used approximately 150 billion tokens of Polish text. The models with the -nc and -chat suffixes were aligned to human preferences and are generally safer and more efficient to use in dialog and general-purpose scenarios. 
| Model Name | Params | License | Based On | |-------------------------------------------------------|----------------------|---------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------| | [Llama-PLLuM-8B-base](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-8B-base) | 8B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-8B/blob/main/LICENSE) | [Llama3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) | | [Llama-PLLuM-8B-instruct](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-8B-instruct) | 8B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-8B/blob/main/LICENSE) | [Llama3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) | | [Llama-PLLuM-8B-chat](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-8B-chat) | 8B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-8B/blob/main/LICENSE) | [Llama3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) | | [PLLuM-12B-base](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-base) | 12B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-12B-instruct](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-instruct) | 12B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-12B-chat](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-chat) | 12B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-12B-nc-base](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-nc-base) | 12B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | 
[PLLuM-12B-nc-instruct](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-nc-instruct) | 12B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-12B-nc-chat](https://huggingface.co/CYFRAGOVPL/PLLuM-12B-nc-chat) | 12B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) | | [PLLuM-8x7B-base](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-base) | 8×7B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-instruct](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-instruct) | 8×7B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-chat](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-chat) | 8×7B | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-nc-base](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-nc-base) | 8×7B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-nc-instruct](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-nc-instruct) | 8×7B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [PLLuM-8x7B-nc-chat](https://huggingface.co/CYFRAGOVPL/PLLuM-8x7B-nc-chat) | 8×7B | [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt) | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | | [Llama-PLLuM-70B-base](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-70B-base) | 70B | [Llama 
3.1](https://huggingface.co/meta-llama/Llama-3.1-70B/blob/main/LICENSE) | [Llama-3.1-70B](https://huggingface.co/meta-llama/Llama-3.1-70B) | | [Llama-PLLuM-70B-instruct](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-70B-instruct) | 70B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-70B/blob/main/LICENSE) | [Llama-3.1-70B](https://huggingface.co/meta-llama/Llama-3.1-70B) | | [Llama-PLLuM-70B-chat](https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-70B-chat) | 70B | [Llama 3.1](https://huggingface.co/meta-llama/Llama-3.1-70B/blob/main/LICENSE) | [Llama-3.1-70B](https://huggingface.co/meta-llama/Llama-3.1-70B) | ### Model Development - **Pretraining**: All models were pretrained or continued-pretrained on large-scale Polish corpora (up to 150B tokens) plus a range of additional Slavic/Baltic and English texts. - **Instruction Fine-Tuning**: We refined the models on manually curated Polish “organic instructions” (approx. 40k), converted instructions from premium Polish corpora (approx. 50k), and synthetic instructions generated by strong LLMs (approx. 10k). - **Alignment and Preference Learning**: Manually annotated preference data taught the models to produce safer, balanced, and contextually appropriate responses, even in adversarial or sensitive cases. - **Domain-Specific Adaptations**: Specialized RAG-based (Retrieval Augmented Generation) models were developed for tasks like public administration, demonstrating strong performance in complex information retrieval and question answering. ## Intended Use Cases - **General Language Tasks**: Text generation, summarization, question answering, etc. - **Domain-Specific Assistants**: Especially effective for Polish public administration and legal or bureaucratic topics where domain-aware retrieval is required. - **Research & Development**: Building blocks for downstream AI applications in academic or industrial settings, where a strong command of the Polish language is essential. 
## How to Use Each PLLuM model can be loaded via the Hugging Face Transformers library (or compatible frameworks). For RAG-based scenarios, pair the model with a relevant vector store or document retrieval system. Below are some recommended steps and code snippets: ### 1. Installation Make sure you have the latest versions of `transformers` and `torch` (or another compatible deep learning framework) installed: ```bash pip install transformers accelerate torch ``` ### 2. Loading the Model Use the following example to load one of the PLLuM models: ```python from transformers import AutoTokenizer, AutoModelForCausalLM model_name = "CYFRAGOVPL/PLLuM-12B-chat" # Replace with the PLLuM model name of your choice tokenizer = AutoTokenizer.from_pretrained(model_name) model = AutoModelForCausalLM.from_pretrained(model_name) ``` ### 3. Using bfloat16 (BF16) If your hardware (e.g., newer GPUs) supports bfloat16, you can reduce memory usage and potentially speed up inference: ```python import torch from transformers import AutoTokenizer, AutoModelForCausalLM model_name = "CYFRAGOVPL/PLLuM-12B-chat" tokenizer = AutoTokenizer.from_pretrained(model_name) # Load model in bfloat16 precision model = AutoModelForCausalLM.from_pretrained( model_name, torch_dtype=torch.bfloat16, device_map="auto" # automatically places model layers on available devices ) ``` ### 4. Generating an Example Text ```python prompt = "Napisz krótki wiersz o wiośnie." # EN:"Write a short poem about spring." inputs = tokenizer(prompt, return_tensors="pt").to(model.device) outputs = model.generate( **inputs, max_new_tokens=50, do_sample=True, top_k=50, top_p=0.9, temperature=0.7 ) generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True) print(generated_text) ``` ### 5. Expected Output Below is a sample (hypothetical) output for the prompt above: ```text Przykładowy wiersz o tematyce wiosennej: Wiosna, wiosna, wiosna, ach to ty! Kwiecień plecień wciąż przeplata, trochę zimy, trochę lata. 
A ja nie mogę się już doczekać, kiedy w kalendarzu ujrzę maj. Wtedy wszystko wkoło rozkwita, a ptaki tak pięknie śpiewają. Wiosno, wiosno, czekam z utęsknieniem, zrób mi tę przyjemność i przyjdź wreszcie, proszę! ``` Your results may vary depending on model parameters (e.g., temperature, top_k, top_p), hardware, and other settings. ### 6. Retrieval Augmented Generation (RAG) Our Llama-PLLuM models (both chat and instruct versions) were additionally trained to perform well in a Retrieval Augmented Generation (RAG) setting. The prompt is in `.jinja` format, where `docs` is a list of document texts and `question` is a query that should be answered based on the provided documents. If there is no answer in the provided documents, the model generates "Nie udało mi się odnaleźć odpowiedzi na pytanie". Prompt: ``` Numerowana lista dokumentów jest poniżej: --------------------- <results>{% for doc in docs %} Dokument: {{ loop.index0 }} {{ doc }} {% endfor %}</results> --------------------- Odpowiedz na pytanie użytkownika wykorzystując tylko informacje znajdujące się w dokumentach, a nie wcześniejszą wiedzę. Udziel wysokiej jakości, poprawnej gramatycznie odpowiedzi w języku polskim. Odpowiedź powinna zawierać cytowania do dokumentów, z których pochodzą informacje. Zacytuj dokument za pomocą symbolu [nr_dokumentu] powołując się na fragment np. [0] dla fragmentu z dokumentu 0. Jeżeli w dokumentach nie ma informacji potrzebnych do odpowiedzi na pytanie, zamiast odpowiedzi zwróć tekst: "Nie udało mi się odnaleźć odpowiedzi na pytanie". Pytanie: {{ question }} ``` ## Training Procedure - **Datasets**: ~150B tokens from Polish and multilingual sources, with ~28B tokens available for fully open-source commercial use. - **Hyperparameters**: Vary based on model size, typically including Adam or AdamW optimizers, a range of batch sizes, and carefully tuned learning rates. - **Hardware & Duration**: Training using [Bem2](https://man.e-science.pl/pl/kdm/bem2) HPC (up to 300xH100 GPUs). 
Each model’s training time depends on parameter size and hardware configuration (~8 to ~25 days on multi-GPU cluster for 8B–70B sizes). ## Evaluation and Benchmarks - **Public Administration**: PLLuM models demonstrated top-tier performance in specialized tasks relevant to government services. - **Polish Language Tasks**: Across a variety of internal benchmarks and standard corpora, PLLuM consistently outperforms other models in accuracy, coherence, and safety metrics. - **Custom Tests**: A unique preference corpus and alignment tests ensure robust, safe, and contextually accurate responses. ## Limitations and Bias - **Potential Hallucinations**: Like other LLMs, PLLuM may occasionally produce factually incorrect or fabricated content. - **Sensitivity & Bias**: While extensive preference learning has been done, biases might still emerge, especially in controversial or subjective topics. - **Context Length**: Very long context tasks may challenge certain models, depending on memory constraints. ## Ethical Considerations PLLuM models are designed for constructive and responsible usage. Users should exercise caution when deploying them in production scenarios, especially for sensitive or regulated domains. Despite efforts to minimize harmful outputs, there is always a risk of generating offensive, biased, or inappropriate text. Human oversight and due diligence are advised. ## Citation If you use PLLuM models or any part of this repository in your research or deployment, please cite as follows (BibTeX): ``` @unpublished{pllum2025, title={PLLuM: A Family of Polish Large Language Models}, author={PLLuM Consortium}, year={2025} } ``` ## License Different models within the PLLuM family are published under various licenses (Apache 2.0, CC-BY-NC-4.0, or Llama 3.1 license). Check each model’s entry in the table above for details. 
## Creators & Consortium The PLLuM project is a unique collaboration between leading Polish scientific institutions and experts from various fields, working together to create a groundbreaking Polish language model. This research partnership combines diverse competencies and passions, forming a robust foundation for advancing AI in Poland. <table style="border: none; border-collapse: collapse;"> <tr> <td align="center" valign="middle" style="border: none;"> <a href="https://pwr.edu.pl/"> <img src="https://pllum.org.pl/_nuxt/pwr.D1_x0B58.png" alt="pwr.D1_x0B58.png" width="100"> </a> <br><strong>Politechnika Wrocławska</strong><br><em>– Project Leader</em> </td> <td align="center" valign="middle" style="border: none;"> <a href="https://www.nask.pl/"> <img src="https://pllum.org.pl/_nuxt/nask.Bz8rmSzR.png" alt="nask.Bz8rmSzR.png" width="100"> </a> <br><strong>NASK PIB</strong> </td> <td align="center" valign="middle" style="border: none;"> <a href="https://www.ipipan.waw.pl/"> <img src="https://clarin.biz/_nuxt/img/ipipan.294d39c.png" alt="ipipan.294d39c.png" width="100"> </a> <br><strong>Instytut Podstaw Informatyki PAN</strong> </td> </tr> <tr> <td align="center" valign="middle" style="border: none;"> <a href="https://opi.org.pl/"> <img src="https://pllum.org.pl/_nuxt/opi.CF-COwcC.png" alt="opi.CF-COwcC.png" width="100"> </a> <br><strong>Ośrodek Przetwarzania Informacji PIB</strong> </td> <td align="center" valign="middle" style="border: none;"> <a href="https://www.uni.lodz.pl/"> <img src="https://pllum.org.pl/_nuxt/ul.aTSgr_W6.png" alt="ul.aTSgr_W6.png" width="100"> </a> <br><strong>Uniwersytet Łódzki</strong> </td> <td align="center" valign="middle" style="border: none;"> <a href="https://ispan.waw.pl/default/"> <img src="https://pllum.org.pl/_nuxt/is.Dqb94VRb.png" alt="is.Dqb94VRb.png" width="100"> </a> <br><strong>Instytut Slawistyki PAN</strong> </td> </tr> </table> ## Contact and Support For questions or contributions, please reach out via: <[email 
protected]> We welcome feedback, collaboration, and further exploration of PLLuM models! ## Acknowledgements Project financed by the Minister of Digital Affairs under the targeted subsidy No. 1/WI/DBiI/2023: *“Responsible development of the open large language model PLLuM (Polish Large Language Model) to support breakthrough technologies in the public and economic sector, including an open, Polish-language intelligent assistant for petitioners.”* **Funding Amount:** 14,504,392.00 PLN **Contract Signing Date:** 2024-01-22
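As a supplement to the RAG section above: the jinja template is plain string assembly and can be reproduced without a templating engine. A minimal Python sketch is shown below; `build_rag_prompt` is an illustrative helper name, not part of any PLLuM package.

```python
# Sketch: build the PLLuM RAG prompt from the jinja template by hand.
# The instruction text is copied verbatim from the template above.
INSTRUCTION = (
    "Odpowiedz na pytanie użytkownika wykorzystując tylko informacje znajdujące się "
    "w dokumentach, a nie wcześniejszą wiedzę. Udziel wysokiej jakości, poprawnej "
    "gramatycznie odpowiedzi w języku polskim. Odpowiedź powinna zawierać cytowania "
    "do dokumentów, z których pochodzą informacje. Zacytuj dokument za pomocą symbolu "
    "[nr_dokumentu] powołując się na fragment np. [0] dla fragmentu z dokumentu 0. "
    "Jeżeli w dokumentach nie ma informacji potrzebnych do odpowiedzi na pytanie, "
    "zamiast odpowiedzi zwróć tekst: \"Nie udało mi się odnaleźć odpowiedzi na pytanie\"."
)

def build_rag_prompt(docs, question):
    # Mirrors the {% for %} loop: each document gets a zero-based index header.
    rendered = "".join(f"\nDokument: {i}\n{doc}\n" for i, doc in enumerate(docs))
    return (
        "Numerowana lista dokumentów jest poniżej:\n"
        "---------------------\n"
        f"<results>{rendered}</results>\n"
        "---------------------\n"
        f"{INSTRUCTION}\n"
        f"Pytanie: {question}"
    )

prompt = build_rag_prompt(
    ["Stolicą Polski jest Warszawa.", "Wisła to najdłuższa rzeka w Polsce."],
    "Jakie miasto jest stolicą Polski?",
)
print(prompt)
```

The resulting string can be passed directly to the tokenizer in the usage examples above.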
{"language": ["pl"], "license": "llama3.1"}
task
[ "QUESTION_ANSWERING", "SUMMARIZATION" ]
42,872
arifzanko/test_chat_summarization
arifzanko
text2text-generation
[ "transformers", "pytorch", "tensorboard", "t5", "text2text-generation", "generated_from_trainer", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
2023-10-10T04:08:23Z
2023-10-10T04:55:14+00:00
5
0
--- license: apache-2.0 metrics: - rouge tags: - generated_from_trainer model-index: - name: test_chat_summarization results: [] --- # test_chat_summarization This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.4747 - Rouge1: 32.7975 - Rouge2: 23.2226 - Rougel: 32.22 - Rougelsum: 32.1021 - Gen Len: 19.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | |:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:------:|:---------:|:-------:| | 0.0848 | 1.0 | 1257 | 2.4747 | 32.7975 | 23.2226 | 32.22 | 32.1021 | 19.0 | ### Framework versions - Transformers 4.28.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
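As a quick sanity check on the hyperparameters above: with `lr_scheduler_type: linear`, the 1257 optimization steps shown in the results table, and no warmup assumed (the Trainer default), the learning rate decays linearly from 5e-05 to 0. A standalone sketch of that decay, not the Trainer's actual implementation:

```python
# Illustrative reconstruction of the linear LR schedule used above
# (learning_rate=5e-05, 1257 steps, warmup assumed to be 0).
TOTAL_STEPS = 1257
BASE_LR = 5e-05

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Learning rate after `step` optimizer updates under a linear decay to 0."""
    remaining = max(0.0, (total_steps - step) / total_steps)
    return base_lr * remaining

for step in (0, 314, 628, 1257):
    print(f"step {step:4d}: lr = {linear_lr(step):.2e}")
```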
null
Non_BioNLP
{"license": "apache-2.0", "metrics": ["rouge"], "tags": ["generated_from_trainer"], "model-index": [{"name": "test_chat_summarization", "results": []}]}
task
[ "SUMMARIZATION" ]
42,873
gaudi/opus-mt-en-ca-ctranslate2
gaudi
translation
[ "transformers", "marian", "ctranslate2", "translation", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2024-07-18T14:57:33Z
2024-10-19T00:06:22+00:00
12
0
--- license: apache-2.0 tags: - ctranslate2 - translation --- # Repository General Information ## Inspired by and derived from the work of [Helsinki-NLP](https://huggingface.co/Helsinki-NLP), [CTranslate2](https://github.com/OpenNMT/CTranslate2), and [michaelfeil](https://huggingface.co/michaelfeil)! - Link to Original Model ([Helsinki-NLP](https://huggingface.co/Helsinki-NLP)): [Model Link](https://huggingface.co/Helsinki-NLP/opus-mt-en-ca) - This repository was based on the work of [CTranslate2](https://github.com/OpenNMT/CTranslate2). - This repository was based on the work of [michaelfeil](https://huggingface.co/michaelfeil). # What is CTranslate2? [CTranslate2](https://opennmt.net/CTranslate2/) is a C++ and Python library for efficient inference with Transformer models. CTranslate2 implements a custom runtime that applies many performance optimization techniques such as weights quantization, layers fusion, batch reordering, etc., to accelerate and reduce the memory usage of Transformer models on CPU and GPU. CTranslate2 is one of the most performant ways of hosting translation models at scale. Currently supported models include: - Encoder-decoder models: Transformer base/big, M2M-100, NLLB, BART, mBART, Pegasus, T5, Whisper - Decoder-only models: GPT-2, GPT-J, GPT-NeoX, OPT, BLOOM, MPT, Llama, Mistral, Gemma, CodeGen, GPTBigCode, Falcon - Encoder-only models: BERT, DistilBERT, XLM-RoBERTa The project is production-oriented and comes with backward compatibility guarantees, but it also includes experimental features related to model compression and inference acceleration. # CTranslate2 Benchmarks Please note that the results presented below are only valid for the configuration used during this benchmark: absolute and relative performance may change with different settings. Tested against the `newstest2014` (En -> De) dataset. The benchmark reports the number of target tokens generated per second (higher is better). The results are aggregated over multiple runs. 
See the benchmark scripts for more details and to reproduce these numbers. ## CPU Benchmarks for Generic Opus-MT Models | Library | Tokens per Second | Max Memory Usage | BLEU | | :----: | :----: | :----: | :----: | | Transformers 4.26.1 (with PyTorch 1.13.1) | 147.3 | 2332MB | 27.90 | | Marian 1.11.0 (int16) | 330.2 | 5901MB | 27.65 | | Marian 1.11.0 (int8) | 355.8 | 4763MB | 27.27 | | CTranslate2 3.6.0 (int16) | 596.1 | 660MB | 27.53 | | CTranslate2 3.6.0 (int8) | 696.1 | 516MB | 27.65 | ## GPU Benchmarks for Generic Opus-MT Models | Library | Tokens per Second | Max GPU Memory Usage | Max Memory Usage | BLEU | | :----: | :----: | :----: | :----: | :----: | | Transformers 4.26.1 (with PyTorch 1.13.1) | 1022.9 | 4097MB | 2109MB | 27.90 | | Marian 1.11.0 (float16) | 3962.4 | 3239MB | 1976MB | 27.94 | | CTranslate2 3.6.0 (float16) | 9296.7 | 909MB | 814MB | 27.9 | | CTranslate2 3.6.0 (int8 + float16) | 8362.7 | 813MB | 766MB | 27.9 | `Executed with 4 threads on a c5.2xlarge Amazon EC2 instance equipped with an Intel(R) Xeon(R) Platinum 8275CL CPU.` **Source to benchmark information can be found [here](https://github.com/OpenNMT/CTranslate2).**<br /> **Original model BLEU scores can be found [here](https://huggingface.co/Helsinki-NLP/opus-mt-en-ca).** ## Internal Benchmarks Internal testing on our end showed **inference times reduced by 6x-10x** on average compared to the vanilla checkpoints using the *transformers* library. A **slight reduction in BLEU scores (~5%)** was also identified in comparison to the vanilla checkpoints, with a few exceptions. This is likely due to several factors, one being the quantization applied. Further testing is needed from our end to better assess the reduction in translation quality. 
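For orientation, the public benchmark tables above correspond to roughly the following throughput speedups over the Transformers baseline (a back-of-the-envelope calculation using the tokens-per-second figures as printed):

```python
# Tokens-per-second figures copied from the benchmark tables above.
cpu = {"transformers": 147.3, "ctranslate2_int8": 696.1}
gpu = {"transformers": 1022.9, "ctranslate2_float16": 9296.7}

cpu_speedup = cpu["ctranslate2_int8"] / cpu["transformers"]
gpu_speedup = gpu["ctranslate2_float16"] / gpu["transformers"]

print(f"CPU int8 speedup:    {cpu_speedup:.1f}x")   # ~4.7x
print(f"GPU float16 speedup: {gpu_speedup:.1f}x")   # ~9.1x
```

Note these figures come from the generic En->De benchmark, not from the internal testing described above, so they need not match the 6x-10x internal estimate exactly.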
The command used to compile the vanilla checkpoint into a CTranslate2 model can be found below. Modifying this command can yield differing balances between inferencing performance and translation quality. # CTranslate2 Installation ```bash pip install hf-hub-ctranslate2>=1.0.0 ctranslate2>=3.13.0 ``` ### ct2-transformers-converter Command Used: ```bash ct2-transformers-converter --model Helsinki-NLP/opus-mt-en-ca --output_dir ./ctranslate2/opus-mt-en-ca-ctranslate2 --force --copy_files README.md generation_config.json tokenizer_config.json vocab.json source.spm .gitattributes target.spm --quantization float16 ``` # CTranslate2 Converted Checkpoint Information: **Compatible With:** - [ctranslate2](https://github.com/OpenNMT/CTranslate2) - [hf-hub-ctranslate2](https://github.com/michaelfeil/hf-hub-ctranslate2) **Compute Type:** - `compute_type=int8_float16` for `device="cuda"` - `compute_type=int8` for `device="cpu"` # Sample Code - ctranslate2 #### Clone the repository to the working directory or wherever you wish to store the model artifacts. #### ```bash git clone https://huggingface.co/gaudi/opus-mt-en-ca-ctranslate2 ``` #### Take the python code below and update the 'model_dir' variable to the location of the cloned repository. #### ```python from ctranslate2 import Translator import transformers model_dir = "./opus-mt-en-ca-ctranslate2" # Path to model directory. translator = Translator( model_path=model_dir, device="cuda", # cpu, cuda, or auto. inter_threads=1, # Maximum number of parallel translations. intra_threads=4, # Number of OpenMP threads per translator. compute_type="int8_float16", # int8 for cpu or int8_float16 for cuda. 
) tokenizer = transformers.AutoTokenizer.from_pretrained(model_dir) source = tokenizer.convert_ids_to_tokens(tokenizer.encode("XXXXXX, XXX XX XXXXXX.")) results = translator.translate_batch([source]) target = results[0].hypotheses[0] print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target))) ``` # Sample Code - hf-hub-ctranslate2 **Derived From [michaelfeil](https://huggingface.co/michaelfeil):** ```python from hf_hub_ctranslate2 import TranslatorCT2fromHfHub, GeneratorCT2fromHfHub from transformers import AutoTokenizer model_name = "gaudi/opus-mt-en-ca-ctranslate2" model = TranslatorCT2fromHfHub( model_name_or_path=model_name, device="cuda", compute_type="int8_float16", tokenizer=AutoTokenizer.from_pretrained(model_name) ) outputs = model.generate( text=["XXX XX XXX XXXXXXX XXXX?", "XX XX XXXX XX XXX!"], ) print(outputs) ``` # License and other remarks: License conditions are intended to be identical to the [original Hugging Face repository](https://huggingface.co/Helsinki-NLP/opus-mt-en-ca) by Helsinki-NLP.
null
Non_BioNLP
# Repository General Information ## Inspired by and derived from the work of [Helsinki-NLP](https://huggingface.co/Helsinki-NLP), [CTranslate2](https://github.com/OpenNMT/CTranslate2), and [michaelfeil](https://huggingface.co/michaelfeil)! - Link to Original Model ([Helsinki-NLP](https://huggingface.co/Helsinki-NLP)): [Model Link](https://huggingface.co/Helsinki-NLP/opus-mt-en-ca) - This respository was based on the work of [CTranslate2](https://github.com/OpenNMT/CTranslate2). - This repository was based on the work of [michaelfeil](https://huggingface.co/michaelfeil). # What is CTranslate2? [CTranslate2](https://opennmt.net/CTranslate2/) is a C++ and Python library for efficient inference with Transformer models. CTranslate2 implements a custom runtime that applies many performance optimization techniques such as weights quantization, layers fusion, batch reordering, etc., to accelerate and reduce the memory usage of Transformer models on CPU and GPU. CTranslate2 is one of the most performant ways of hosting translation models at scale. Current supported models include: - Encoder-decoder models: Transformer base/big, M2M-100, NLLB, BART, mBART, Pegasus, T5, Whisper - Decoder-only models: GPT-2, GPT-J, GPT-NeoX, OPT, BLOOM, MPT, Llama, Mistral, Gemma, CodeGen, GPTBigCode, Falcon - Encoder-only models: BERT, DistilBERT, XLM-RoBERTa The project is production-oriented and comes with backward compatibility guarantees, but it also includes experimental features related to model compression and inference acceleration. # CTranslate2 Benchmarks Please note that the results presented below are only valid for the configuration used during this benchmark: absolute and relative performance may change with different settings. Tested against `newstest2014` (En -> De) dataset. The benchmark reports the number of target tokens generated per second (higher is better). The results are aggregated over multiple runs. 
See the benchmark scripts for more details and reproduce these numbers. Please note that the results presented below are only valid for the configuration used during this benchmark: absolute and relative performance may change with different settings. ## CPU Benchmarks for Generic Opus-MT Models | Library | Tokens per Second | Max Memory Usage | BLEU | | :----: | :----: | :----: | :----: | | Transformers 4.26.1 (with PyTorch 1.13.1) | 147.3 | 2332MB | 27.90 | | Marian 1.11.0 (int16) | 330.2 | 5901MB | 27.65 | | Marian 1.11.0 (int8) | 355.8 | 4763MB | 27.27 | | CTranslate2 3.6.0 (int16) | 596.1 | 660MB | 27.53 | | CTranslate2 3.6.0 (int8) | 696.1 | 516MB | 27.65 | ## GPU Benchmarks for Generic Opus-MT Models | Library | Tokens per Second | Max GPU Memory Usage | Max Memory Usage | BLEU | | :----: | :----: | :----: | :----: | :----: | | Transformers 4.26.1 (with PyTorch 1.13.1) | 1022.9 | 4097MB | 2109MB | 27.90 | | Marian 1.11.0 (float16) | 3962.4 | 3239MB | 1976MB | 27.94 | | CTranslate2 3.6.0 (float16) | 9296.7 | 909MB | 814MB | 27.9 | | CTranslate2 3.6.0 (int8 + float16) | 8362.7 | 813MB | 766MB | 27.9 | `Executed with 4 threads on a c5.2xlarge Amazon EC2 instance equipped with an Intel(R) Xeon(R) Platinum 8275CL CPU.` **Source to benchmark information can be found [here](https://github.com/OpenNMT/CTranslate2).**<br /> **Original model BLEU scores can be found [here](https://huggingface.co/Helsinki-NLP/opus-mt-en-ca).** ## Internal Benchmarks Internal testing on our end showed **inference times reduced by 6x-10x** on average compared the vanilla checkpoints using the *transformers* library. A **slight reduction on BLEU scores (~5%)** was also identified in comparison to the vanilla checkpoints with a few exceptions. This is likely due to several factors, one being the quantization applied. Further testing is needed from our end to better assess the reduction in translation quality. 
The command used to compile the vanilla checkpoint into a CTranslate2 model can be found below. Modifying this command can yield differing balances between inferencing performance and translation quality. # CTranslate2 Installation ```bash pip install hf-hub-ctranslate2>=1.0.0 ctranslate2>=3.13.0 ``` ### ct2-transformers-converter Command Used: ```bash ct2-transformers-converter --model Helsinki-NLP/opus-mt-en-ca --output_dir ./ctranslate2/opus-mt-en-ca-ctranslate2 --force --copy_files README.md generation_config.json tokenizer_config.json vocab.json source.spm .gitattributes target.spm --quantization float16 ``` # CTranslate2 Converted Checkpoint Information: **Compatible With:** - [ctranslate2](https://github.com/OpenNMT/CTranslate2) - [hf-hub-ctranslate2](https://github.com/michaelfeil/hf-hub-ctranslate2) **Compute Type:** - `compute_type=int8_float16` for `device="cuda"` - `compute_type=int8` for `device="cpu"` # Sample Code - ctranslate2 #### Clone the repository to the working directory or wherever you wish to store the model artifacts. #### ```bash git clone https://huggingface.co/gaudi/opus-mt-en-ca-ctranslate2 ``` #### Take the python code below and update the 'model_dir' variable to the location of the cloned repository. #### ```python from ctranslate2 import Translator import transformers model_dir = "./opus-mt-en-ca-ctranslate2" # Path to model directory. translator = Translator( model_path=model_dir, device="cuda", # cpu, cuda, or auto. inter_threads=1, # Maximum number of parallel translations. intra_threads=4, # Number of OpenMP threads per translator. compute_type="int8_float16", # int8 for cpu or int8_float16 for cuda. 
) tokenizer = transformers.AutoTokenizer.from_pretrained(model_dir) source = tokenizer.convert_ids_to_tokens(tokenizer.encode("XXXXXX, XXX XX XXXXXX.")) results = translator.translate_batch([source]) target = results[0].hypotheses[0] print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target))) ``` # Sample Code - hf-hub-ctranslate2 **Derived From [michaelfeil](https://huggingface.co/michaelfeil):** ```python from hf_hub_ctranslate2 import TranslatorCT2fromHfHub, GeneratorCT2fromHfHub from transformers import AutoTokenizer model_name = "gaudi/opus-mt-en-ca-ctranslate2" model = TranslatorCT2fromHfHub( model_name_or_path=model_name, device="cuda", compute_type="int8_float16", tokenizer=AutoTokenizer.from_pretrained(model_name) ) outputs = model.generate( text=["XXX XX XXX XXXXXXX XXXX?", "XX XX XXXX XX XXX!"], ) print(outputs) ``` # License and other remarks: License conditions are intended to be identical to the [original huggingface repository](https://huggingface.co/Helsinki-NLP/opus-mt-en-ca) by Helsinki-NLP.
{"license": "apache-2.0", "tags": ["ctranslate2", "translation"]}
task
[ "TRANSLATION" ]
42,874
vap0r/phishing_detection_v0.2
vap0r
text-classification
[ "setfit", "safetensors", "xlm-roberta", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:sentence-transformers/paraphrase-multilingual-mpnet-base-v2", "base_model:finetune:sentence-transformers/paraphrase-multilingual-mpnet-base-v2", "region:us" ]
2024-05-04T20:17:19Z
2024-05-04T20:49:37+00:00
5
0
--- base_model: sentence-transformers/paraphrase-multilingual-mpnet-base-v2 library_name: setfit metrics: - accuracy pipeline_tag: text-classification tags: - setfit - sentence-transformers - text-classification - generated_from_setfit_trainer widget: [] inference: true --- # SetFit with sentence-transformers/paraphrase-multilingual-mpnet-base-v2 This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. 
## Model Details ### Model Description - **Model Type:** SetFit - **Sentence Transformer body:** [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2) - **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance - **Maximum Sequence Length:** 512 tokens <!-- - **Number of Classes:** Unknown --> <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit) - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055) - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit) ## Uses ### Direct Use for Inference First install the SetFit library: ```bash pip install setfit ``` Then you can load this model and run inference. ```python from setfit import SetFitModel # Download from the 🤗 Hub model = SetFitModel.from_pretrained("vap0r/phishing_detection_v0.2") # Run inference preds = model("I loved the spiderman movie!") ``` <!-- ### Downstream Use *List how someone could finetune this model on their own dataset.* --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Framework Versions - Python: 3.10.12 - SetFit: 1.0.3 - Sentence Transformers: 2.7.0 - Transformers: 4.40.1 - PyTorch: 2.3.0+cu121 - Datasets: 2.19.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
null
Non_BioNLP
# SetFit with sentence-transformers/paraphrase-multilingual-mpnet-base-v2 This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Model Details ### Model Description - **Model Type:** SetFit - **Sentence Transformer body:** [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2) - **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance - **Maximum Sequence Length:** 512 tokens <!-- - **Number of Classes:** Unknown --> <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit) - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055) - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit) ## Uses ### Direct Use for Inference First install the SetFit library: ```bash pip install setfit ``` Then you can load this model and run inference. 
```python from setfit import SetFitModel # Download from the 🤗 Hub model = SetFitModel.from_pretrained("vap0r/phishing_detection_v0.2") # Run inference preds = model("I loved the spiderman movie!") ``` <!-- ### Downstream Use *List how someone could finetune this model on their own dataset.* --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Framework Versions - Python: 3.10.12 - SetFit: 1.0.3 - Sentence Transformers: 2.7.0 - Transformers: 4.40.1 - PyTorch: 2.3.0+cu121 - Datasets: 2.19.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
{"base_model": "sentence-transformers/paraphrase-multilingual-mpnet-base-v2", "library_name": "setfit", "metrics": ["accuracy"], "pipeline_tag": "text-classification", "tags": ["setfit", "sentence-transformers", "text-classification", "generated_from_setfit_trainer"], "widget": [], "inference": true}
task
[ "TEXT_CLASSIFICATION" ]
42,875
zbigi/bart-base-summarization-medical-49
zbigi
null
[ "peft", "tensorboard", "safetensors", "generated_from_trainer", "base_model:facebook/bart-base", "base_model:adapter:facebook/bart-base", "license:apache-2.0", "region:us" ]
2024-07-26T13:02:22Z
2024-07-26T14:14:10+00:00
1
0
--- base_model: facebook/bart-base library_name: peft license: apache-2.0 metrics: - rouge tags: - generated_from_trainer model-index: - name: bart-base-summarization-medical-49 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bart-base-summarization-medical-49 This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on the None dataset. It achieves the following results on the evaluation set: - Loss: 2.1283 - Rouge1: 0.4194 - Rouge2: 0.2246 - Rougel: 0.3563 - Rougelsum: 0.356 - Gen Len: 18.24 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 4 - eval_batch_size: 1 - seed: 49 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 6 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:| | 2.7018 | 1.0 | 1250 | 2.1985 | 0.4123 | 0.2198 | 0.352 | 0.3522 | 17.961 | | 2.6001 | 2.0 | 2500 | 2.1649 | 0.4125 | 0.2205 | 0.3526 | 0.3526 | 17.963 | | 2.577 | 3.0 | 3750 | 2.1418 | 0.4189 | 0.222 | 0.3547 | 0.3548 | 18.185 | | 2.5295 | 4.0 | 5000 | 2.1347 | 0.4213 | 0.2256 | 0.3564 | 0.3559 | 18.174 | | 2.5513 | 5.0 | 6250 | 2.1299 | 0.4174 | 0.2224 | 0.3545 | 0.3542 | 18.118 | | 2.5347 | 6.0 | 7500 | 2.1283 | 0.4194 | 0.2246 | 0.3563 | 0.356 | 18.24 | ### Framework versions - PEFT 0.12.0 - Transformers 4.42.4 - Pytorch 
2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
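As a sanity check on the hyperparameters above, the reported total train batch size of 16 is simply the per-device batch size times the gradient accumulation steps. A minimal sketch (the helper name is illustrative, not part of the training script):

```python
# Illustrative helper: how "total_train_batch_size" above is derived.
def effective_batch_size(per_device_batch_size: int,
                         gradient_accumulation_steps: int,
                         num_devices: int = 1) -> int:
    """Number of examples contributing to each optimizer update."""
    return per_device_batch_size * gradient_accumulation_steps * num_devices

# train_batch_size=4 with gradient_accumulation_steps=4 on one device:
print(effective_batch_size(4, 4))  # → 16
```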
null
BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bart-base-summarization-medical-49 This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on the None dataset. It achieves the following results on the evaluation set: - Loss: 2.1283 - Rouge1: 0.4194 - Rouge2: 0.2246 - Rougel: 0.3563 - Rougelsum: 0.356 - Gen Len: 18.24 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 4 - eval_batch_size: 1 - seed: 49 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 6 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:| | 2.7018 | 1.0 | 1250 | 2.1985 | 0.4123 | 0.2198 | 0.352 | 0.3522 | 17.961 | | 2.6001 | 2.0 | 2500 | 2.1649 | 0.4125 | 0.2205 | 0.3526 | 0.3526 | 17.963 | | 2.577 | 3.0 | 3750 | 2.1418 | 0.4189 | 0.222 | 0.3547 | 0.3548 | 18.185 | | 2.5295 | 4.0 | 5000 | 2.1347 | 0.4213 | 0.2256 | 0.3564 | 0.3559 | 18.174 | | 2.5513 | 5.0 | 6250 | 2.1299 | 0.4174 | 0.2224 | 0.3545 | 0.3542 | 18.118 | | 2.5347 | 6.0 | 7500 | 2.1283 | 0.4194 | 0.2246 | 0.3563 | 0.356 | 18.24 | ### Framework versions - PEFT 0.12.0 - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
{"base_model": "facebook/bart-base", "library_name": "peft", "license": "apache-2.0", "metrics": ["rouge"], "tags": ["generated_from_trainer"], "model-index": [{"name": "bart-base-summarization-medical-49", "results": []}]}
task
[ "SUMMARIZATION" ]
42,876
EcemSimsek/bert-base-uncased-finetuned-cola
EcemSimsek
text-classification
[ "transformers", "pytorch", "tensorboard", "bert", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-05-07T09:22:32Z
2023-05-07T21:42:13+00:00
12
0
--- datasets: - glue license: apache-2.0 metrics: - matthews_correlation tags: - generated_from_trainer model-index: - name: bert-base-uncased-finetuned-cola results: - task: type: text-classification name: Text Classification dataset: name: glue type: glue config: cola split: validation args: cola metrics: - type: matthews_correlation value: 0.5208528714430889 name: Matthews Correlation --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-uncased-finetuned-cola This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.4661 - Matthews Correlation: 0.5209 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1.13e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | |:-------------:|:-----:|:----:|:---------------:|:--------------------:| | No log | 1.0 | 268 | 0.4526 | 0.5206 | | 0.4593 | 2.0 | 536 | 0.4661 | 0.5209 | ### Framework versions - Transformers 4.28.1 - Pytorch 2.0.0+cu118 - Datasets 2.12.0 - Tokenizers 0.13.3
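The Matthews correlation reported above summarizes the full confusion matrix in a single score between -1 and 1. A minimal, self-contained sketch of the metric (the toy counts are illustrative, not taken from this CoLA evaluation):

```python
import math

def matthews_correlation(tp: int, tn: int, fp: int, fn: int) -> float:
    """Matthews correlation coefficient from confusion-matrix counts."""
    numerator = tp * tn - fp * fn
    denominator = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return numerator / denominator if denominator else 0.0

# Toy confusion matrix (illustrative numbers only):
print(round(matthews_correlation(tp=40, tn=30, fp=10, fn=20), 4))  # → 0.4082
```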
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-uncased-finetuned-cola This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.4661 - Matthews Correlation: 0.5209 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1.13e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | |:-------------:|:-----:|:----:|:---------------:|:--------------------:| | No log | 1.0 | 268 | 0.4526 | 0.5206 | | 0.4593 | 2.0 | 536 | 0.4661 | 0.5209 | ### Framework versions - Transformers 4.28.1 - Pytorch 2.0.0+cu118 - Datasets 2.12.0 - Tokenizers 0.13.3
{"datasets": ["glue"], "license": "apache-2.0", "metrics": ["matthews_correlation"], "tags": ["generated_from_trainer"], "model-index": [{"name": "bert-base-uncased-finetuned-cola", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "config": "cola", "split": "validation", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.5208528714430889, "name": "Matthews Correlation"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,877
google/pegasus-xsum
google
summarization
[ "transformers", "pytorch", "tf", "jax", "pegasus", "text2text-generation", "summarization", "en", "arxiv:1912.08777", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05Z
2023-01-24T16:42:49+00:00
286,837
191
--- language: en tags: - summarization model-index: - name: google/pegasus-xsum results: - task: type: summarization name: Summarization dataset: name: samsum type: samsum config: samsum split: train metrics: - type: rouge value: 21.8096 name: ROUGE-1 verified: true - type: rouge value: 4.2525 name: ROUGE-2 verified: true - type: rouge value: 17.4469 name: ROUGE-L verified: true - type: rouge value: 18.8907 name: ROUGE-LSUM verified: true - type: loss value: 3.0317161083221436 name: loss verified: true - type: gen_len value: 20.3122 name: gen_len verified: true - task: type: summarization name: Summarization dataset: name: xsum type: xsum config: default split: test metrics: - type: rouge value: 46.8623 name: ROUGE-1 verified: true - type: rouge value: 24.4533 name: ROUGE-2 verified: true - type: rouge value: 39.0548 name: ROUGE-L verified: true - type: rouge value: 39.0994 name: ROUGE-LSUM verified: true - type: loss value: 1.5717021226882935 name: loss verified: true - type: gen_len value: 22.8821 name: gen_len verified: true - task: type: summarization name: Summarization dataset: name: cnn_dailymail type: cnn_dailymail config: 3.0.0 split: test metrics: - type: rouge value: 22.2062 name: ROUGE-1 verified: true - type: rouge value: 7.6701 name: ROUGE-2 verified: true - type: rouge value: 15.4046 name: ROUGE-L verified: true - type: rouge value: 19.2182 name: ROUGE-LSUM verified: true - type: loss value: 2.681241273880005 name: loss verified: true - type: gen_len value: 25.0234 name: gen_len verified: true --- ### Pegasus Models See Docs: [here](https://huggingface.co/transformers/master/model_doc/pegasus.html) Original TF 1 code [here](https://github.com/google-research/pegasus) Authors: Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu on Dec 18, 2019 Maintained by: [@sshleifer](https://twitter.com/sam_shleifer) Task: Summarization The following is copied from the authors' README. 
# Mixed & Stochastic Checkpoints We train a pegasus model with sampled gap sentence ratios on both C4 and HugeNews, and stochastically sample important sentences. The updated results are reported in this table. | dataset | C4 | HugeNews | Mixed & Stochastic| | ---- | ---- | ---- | ----| | xsum | 45.20/22.06/36.99 | 47.21/24.56/39.25 | 47.60/24.83/39.64| | cnn_dailymail | 43.90/21.20/40.76 | 44.17/21.47/41.11 | 44.16/21.56/41.30| | newsroom | 45.07/33.39/41.28 | 45.15/33.51/41.33 | 45.98/34.20/42.18| | multi_news | 46.74/17.95/24.26 | 47.52/18.72/24.91 | 47.65/18.75/24.95| | gigaword | 38.75/19.96/36.14 | 39.12/19.86/36.24 | 39.65/20.47/36.76| | wikihow | 43.07/19.70/34.79 | 41.35/18.51/33.42 | 46.39/22.12/38.41 *| | reddit_tifu | 26.54/8.94/21.64 | 26.63/9.01/21.60 | 27.99/9.81/22.94| | big_patent | 53.63/33.16/42.25 | 53.41/32.89/42.07 | 52.29/33.08/41.66 *| | arxiv | 44.70/17.27/25.80 | 44.67/17.18/25.73 | 44.21/16.95/25.67| | pubmed | 45.49/19.90/27.69 | 45.09/19.56/27.42 | 45.97/20.15/28.25| | aeslc | 37.69/21.85/36.84 | 37.40/21.22/36.45 | 37.68/21.25/36.51| | billsum | 57.20/39.56/45.80 | 57.31/40.19/45.82 | 59.67/41.58/47.59| The "Mixed & Stochastic" model has the following changes: - trained on both C4 and HugeNews (dataset mixture is weighted by their number of examples). - trained for 1.5M steps instead of 500k (we observe slower convergence on pretraining perplexity). - the model uniformly samples a gap sentence ratio between 15% and 45%. - importance sentences are sampled using a 20% uniform noise added to importance scores. - the sentencepiece tokenizer is updated to be able to encode the newline character. (*) the numbers of the wikihow and big_patent datasets are not comparable because of changes in tokenization and data: - the wikihow dataset contains newline characters, which are useful for paragraph segmentation; the C4 and HugeNews model's sentencepiece tokenizer doesn't encode newline and loses this information. 
- we update the BigPatent dataset to preserve casing; some format cleanings are also changed, please refer to the change in TFDS. The "Mixed & Stochastic" model has the following changes (from pegasus-large in the paper): trained on both C4 and HugeNews (dataset mixture is weighted by their number of examples). trained for 1.5M steps instead of 500k (we observe slower convergence on pretraining perplexity). the model uniformly samples a gap sentence ratio between 15% and 45%. importance sentences are sampled using a 20% uniform noise added to importance scores. the sentencepiece tokenizer is updated to be able to encode the newline character. Citation ``` @misc{zhang2019pegasus, title={PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization}, author={Jingqing Zhang and Yao Zhao and Mohammad Saleh and Peter J. Liu}, year={2019}, eprint={1912.08777}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
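The two stochastic pretraining choices described above (a gap sentence ratio drawn uniformly from 15% to 45%, and importance scores perturbed with 20% uniform noise) can be sketched as follows. This is not the original Pegasus code; the function names are illustrative, and reading the noise as multiplicative is an assumption:

```python
import random

def sample_gap_sentence_ratio(rng: random.Random) -> float:
    """Draw the fraction of sentences to mask, uniformly in [0.15, 0.45]."""
    return rng.uniform(0.15, 0.45)

def noisy_importance(scores, rng: random.Random):
    """Perturb each sentence-importance score with 20% uniform noise."""
    return [s * rng.uniform(0.8, 1.2) for s in scores]

rng = random.Random(0)
ratio = sample_gap_sentence_ratio(rng)
print(f"gap sentence ratio: {ratio:.3f}")
print(noisy_importance([1.0, 0.5, 0.2], rng))
```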
null
Non_BioNLP
### Pegasus Models See Docs: [here](https://huggingface.co/transformers/master/model_doc/pegasus.html) Original TF 1 code [here](https://github.com/google-research/pegasus) Authors: Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu on Dec 18, 2019 Maintained by: [@sshleifer](https://twitter.com/sam_shleifer) Task: Summarization The following is copied from the authors' README. # Mixed & Stochastic Checkpoints We train a pegasus model with sampled gap sentence ratios on both C4 and HugeNews, and stochastically sample important sentences. The updated results are reported in this table. | dataset | C4 | HugeNews | Mixed & Stochastic| | ---- | ---- | ---- | ----| | xsum | 45.20/22.06/36.99 | 47.21/24.56/39.25 | 47.60/24.83/39.64| | cnn_dailymail | 43.90/21.20/40.76 | 44.17/21.47/41.11 | 44.16/21.56/41.30| | newsroom | 45.07/33.39/41.28 | 45.15/33.51/41.33 | 45.98/34.20/42.18| | multi_news | 46.74/17.95/24.26 | 47.52/18.72/24.91 | 47.65/18.75/24.95| | gigaword | 38.75/19.96/36.14 | 39.12/19.86/36.24 | 39.65/20.47/36.76| | wikihow | 43.07/19.70/34.79 | 41.35/18.51/33.42 | 46.39/22.12/38.41 *| | reddit_tifu | 26.54/8.94/21.64 | 26.63/9.01/21.60 | 27.99/9.81/22.94| | big_patent | 53.63/33.16/42.25 | 53.41/32.89/42.07 | 52.29/33.08/41.66 *| | arxiv | 44.70/17.27/25.80 | 44.67/17.18/25.73 | 44.21/16.95/25.67| | pubmed | 45.49/19.90/27.69 | 45.09/19.56/27.42 | 45.97/20.15/28.25| | aeslc | 37.69/21.85/36.84 | 37.40/21.22/36.45 | 37.68/21.25/36.51| | billsum | 57.20/39.56/45.80 | 57.31/40.19/45.82 | 59.67/41.58/47.59| The "Mixed & Stochastic" model has the following changes: - trained on both C4 and HugeNews (dataset mixture is weighted by their number of examples). - trained for 1.5M steps instead of 500k (we observe slower convergence on pretraining perplexity). - the model uniformly samples a gap sentence ratio between 15% and 45%. - importance sentences are sampled using a 20% uniform noise added to importance scores. 
- the sentencepiece tokenizer is updated to be able to encode the newline character. (*) the numbers of the wikihow and big_patent datasets are not comparable because of changes in tokenization and data: - the wikihow dataset contains newline characters, which are useful for paragraph segmentation; the C4 and HugeNews model's sentencepiece tokenizer doesn't encode newline and loses this information. - we update the BigPatent dataset to preserve casing; some format cleanings are also changed, please refer to the change in TFDS. The "Mixed & Stochastic" model has the following changes (from pegasus-large in the paper): trained on both C4 and HugeNews (dataset mixture is weighted by their number of examples). trained for 1.5M steps instead of 500k (we observe slower convergence on pretraining perplexity). the model uniformly samples a gap sentence ratio between 15% and 45%. importance sentences are sampled using a 20% uniform noise added to importance scores. the sentencepiece tokenizer is updated to be able to encode the newline character. Citation ``` @misc{zhang2019pegasus, title={PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization}, author={Jingqing Zhang and Yao Zhao and Mohammad Saleh and Peter J. Liu}, year={2019}, eprint={1912.08777}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
{"language": "en", "tags": ["summarization"], "model-index": [{"name": "google/pegasus-xsum", "results": [{"task": {"type": "summarization", "name": "Summarization"}, "dataset": {"name": "samsum", "type": "samsum", "config": "samsum", "split": "train"}, "metrics": [{"type": "rouge", "value": 21.8096, "name": "ROUGE-1", "verified": true}, {"type": "rouge", "value": 4.2525, "name": "ROUGE-2", "verified": true}, {"type": "rouge", "value": 17.4469, "name": "ROUGE-L", "verified": true}, {"type": "rouge", "value": 18.8907, "name": "ROUGE-LSUM", "verified": true}, {"type": "loss", "value": 3.0317161083221436, "name": "loss", "verified": true}, {"type": "gen_len", "value": 20.3122, "name": "gen_len", "verified": true}]}, {"task": {"type": "summarization", "name": "Summarization"}, "dataset": {"name": "xsum", "type": "xsum", "config": "default", "split": "test"}, "metrics": [{"type": "rouge", "value": 46.8623, "name": "ROUGE-1", "verified": true}, {"type": "rouge", "value": 24.4533, "name": "ROUGE-2", "verified": true}, {"type": "rouge", "value": 39.0548, "name": "ROUGE-L", "verified": true}, {"type": "rouge", "value": 39.0994, "name": "ROUGE-LSUM", "verified": true}, {"type": "loss", "value": 1.5717021226882935, "name": "loss", "verified": true}, {"type": "gen_len", "value": 22.8821, "name": "gen_len", "verified": true}]}, {"task": {"type": "summarization", "name": "Summarization"}, "dataset": {"name": "cnn_dailymail", "type": "cnn_dailymail", "config": "3.0.0", "split": "test"}, "metrics": [{"type": "rouge", "value": 22.2062, "name": "ROUGE-1", "verified": true}, {"type": "rouge", "value": 7.6701, "name": "ROUGE-2", "verified": true}, {"type": "rouge", "value": 15.4046, "name": "ROUGE-L", "verified": true}, {"type": "rouge", "value": 19.2182, "name": "ROUGE-LSUM", "verified": true}, {"type": "loss", "value": 2.681241273880005, "name": "loss", "verified": true}, {"type": "gen_len", "value": 25.0234, "name": "gen_len", "verified": true}]}]}]}
task
[ "SUMMARIZATION" ]
42,878
DrChamyoung/PartnerAIPRO
DrChamyoung
image-text-to-text
[ "safetensors", "phi3_v", "nlp", "code", "vision", "image-text-to-text", "conversational", "custom_code", "multilingual", "arxiv:2404.14219", "license:mit", "region:us" ]
2024-09-06T10:17:13Z
2024-09-06T11:43:20+00:00
13
0
--- language: - multilingual license: mit license_link: https://huggingface.co/microsoft/Phi-3.5-vision-instruct/resolve/main/LICENSE pipeline_tag: image-text-to-text tags: - nlp - code - vision inference: parameters: temperature: 0.7 widget: - messages: - role: user content: <|image_1|>Can you describe what you see in the image? --- ## Model Summary Phi-3.5-vision is a lightweight, state-of-the-art open multimodal model built upon datasets which include - synthetic data and filtered publicly available websites - with a focus on very high-quality, reasoning dense data both on text and vision. The model belongs to the Phi-3 model family, and the multimodal version supports a context length of 128K tokens. The model underwent a rigorous enhancement process, incorporating both supervised fine-tuning and direct preference optimization to ensure precise instruction adherence and robust safety measures. 🏡 [Phi-3 Portal](https://azure.microsoft.com/en-us/products/phi-3) <br> 📰 [Phi-3 Microsoft Blog](https://aka.ms/phi3.5-techblog) <br> 📖 [Phi-3 Technical Report](https://arxiv.org/abs/2404.14219) <br> 👩‍🍳 [Phi-3 Cookbook](https://github.com/microsoft/Phi-3CookBook) <br> 🖥️ [Try It](https://aka.ms/try-phi3.5vision) <br> **Phi-3.5**: [[mini-instruct]](https://huggingface.co/microsoft/Phi-3.5-mini-instruct); [[MoE-instruct]](https://huggingface.co/microsoft/Phi-3.5-MoE-instruct) ; [[vision-instruct]](https://huggingface.co/microsoft/Phi-3.5-vision-instruct) ## Intended Uses ### Primary Use Cases The model is intended for broad commercial and research use in English. 
The model is intended for use in general purpose AI systems and applications with visual and text input capabilities which require: 1) Memory/compute constrained environments 2) Latency bound scenarios 3) General image understanding 4) Optical character recognition 5) Chart and table understanding 6) Multiple image comparison 7) Multi-image or video clip summarization Our model is designed to accelerate research on language and multimodal models, for use as a building block for generative AI powered features. ### Use Case Considerations Our models are not specifically designed or evaluated for all downstream purposes. Developers should consider common limitations of language models as they select use cases, and evaluate and mitigate for accuracy, safety, and fairness before using within a specific downstream use case, particularly for high risk scenarios. Developers should be aware of and adhere to applicable laws or regulations (including privacy, trade compliance laws, etc.) that are relevant to their use case. ***Nothing contained in this Model Card should be interpreted as or deemed a restriction or modification to the license the model is released under.*** ## Release Notes In this release, the model enables multi-frame image understanding and reasoning, which is based on valuable customer feedback. The hero example multi-frame capabilities include detailed image comparison, multi-image summarization/storytelling and video summarization, which have broad applications in Office scenarios. We also observed performance improvement on most single image benchmarks, e.g., boosting MMMU performance from 40.2 to 43.0, MMBench performance from 80.5 to 81.9, and the document understanding benchmark TextVQA from 70.9 to 72.0. We believe most use cases will benefit from this release, but we encourage users to test the new model in their AI applications. We appreciate the enthusiastic adoption of the Phi-3 model family and continue to welcome all the feedback from the community. 
Below are the comparison results on existing multi-image benchmarks. On average, our model outperforms competitor models of the same size and is competitive with much bigger models on multi-frame capabilities and video summarization.

**BLINK**: a benchmark with 14 visual tasks that humans can solve very quickly but that are still hard for current multimodal LLMs.

| Benchmark | Phi-3.5-vision-instruct | LlaVA-Interleave-Qwen-7B | InternVL-2-4B | InternVL-2-8B | Gemini-1.5-Flash | GPT-4o-mini | Claude-3.5-Sonnet | Gemini-1.5-Pro | GPT-4o |
|--|--|--|--|--|--|--|--|--|--|
| Art Style | 87.2 | 62.4 | 55.6 | 52.1 | 64.1 | 70.1 | 59.8 | 70.9 | 73.3 |
| Counting | 54.2 | 56.7 | 54.2 | 66.7 | 51.7 | 55.0 | 59.2 | 65.0 | 65.0 |
| Forensic Detection | 92.4 | 31.1 | 40.9 | 34.1 | 54.5 | 38.6 | 67.4 | 60.6 | 75.8 |
| Functional Correspondence | 29.2 | 34.6 | 24.6 | 24.6 | 33.1 | 26.9 | 33.8 | 31.5 | 43.8 |
| IQ Test | 25.3 | 26.7 | 26.0 | 30.7 | 25.3 | 29.3 | 26.0 | 34.0 | 19.3 |
| Jigsaw | 68.0 | 86.0 | 55.3 | 52.7 | 71.3 | 72.7 | 57.3 | 68.0 | 67.3 |
| Multi-View Reasoning | 54.1 | 44.4 | 48.9 | 42.9 | 48.9 | 48.1 | 55.6 | 49.6 | 46.6 |
| Object Localization | 49.2 | 54.9 | 53.3 | 54.1 | 44.3 | 57.4 | 62.3 | 65.6 | 68.0 |
| Relative Depth | 69.4 | 77.4 | 63.7 | 67.7 | 57.3 | 58.1 | 71.8 | 76.6 | 71.0 |
| Relative Reflectance | 37.3 | 34.3 | 32.8 | 38.8 | 32.8 | 27.6 | 36.6 | 38.8 | 40.3 |
| Semantic Correspondence | 36.7 | 31.7 | 31.7 | 22.3 | 32.4 | 31.7 | 45.3 | 48.9 | 54.0 |
| Spatial Relation | 65.7 | 75.5 | 78.3 | 78.3 | 55.9 | 81.1 | 60.1 | 79.0 | 84.6 |
| Visual Correspondence | 53.5 | 40.7 | 34.9 | 33.1 | 29.7 | 52.9 | 72.1 | 81.4 | 86.0 |
| Visual Similarity | 83.0 | 91.9 | 48.1 | 45.2 | 47.4 | 77.8 | 84.4 | 81.5 | 88.1 |
| **Overall** | **57.0** | **53.1** | **45.9** | **45.4** | **45.8** | **51.9** | **56.5** | **61.0** | **63.2** |

**Video-MME**: comprehensively assesses the capabilities of MLLMs in processing video data, covering a wide range of visual domains, temporal
durations, and data modalities.

| Benchmark | Phi-3.5-vision-instruct | LlaVA-Interleave-Qwen-7B | InternVL-2-4B | InternVL-2-8B | Gemini-1.5-Flash | GPT-4o-mini | Claude-3.5-Sonnet | Gemini-1.5-Pro | GPT-4o |
|--|--|--|--|--|--|--|--|--|--|
| short (<2min) | 60.8 | 62.3 | 60.7 | 61.7 | 72.2 | 70.1 | 66.3 | 73.3 | 77.7 |
| medium (4-15min) | 47.7 | 47.1 | 46.4 | 49.6 | 62.7 | 59.6 | 54.7 | 61.2 | 68.0 |
| long (30-60min) | 43.8 | 41.2 | 42.6 | 46.6 | 52.1 | 53.9 | 46.6 | 53.2 | 59.6 |
| **Overall** | **50.8** | **50.2** | **49.9** | **52.6** | **62.3** | **61.2** | **55.9** | **62.6** | **68.4** |

## Usage

### Requirements

The currently installed `transformers` version can be verified with `pip list | grep transformers`. Examples of required packages:

```
flash_attn==2.5.8
numpy==1.24.4
Pillow==10.3.0
Requests==2.31.0
torch==2.3.0
torchvision==0.18.0
transformers==4.43.0
accelerate==0.30.0
```

Phi-3.5-vision-instruct is also available in [Azure AI Studio](https://aka.ms/try-phi3.5vision).

### Input Formats

Given the nature of the training data, the Phi-3.5-vision model is best suited for prompts using the chat format as follows.

Single image:
```
<|user|>\n<|image_1|>\n{prompt}<|end|>\n<|assistant|>\n
```

Multi-turn conversations:
```
<|user|>\n<|image_1|>\n{prompt_1}<|end|>\n<|assistant|>\n{response_1}<|end|>\n<|user|>\n{prompt_2}<|end|>\n<|assistant|>\n
```

For multi-image usage, add multiple image placeholders at the front of the prompt. The `<|image_{}|>` indices should start from 1. An example prompt is shown below:
```
<|user|>\n<|image_1|>\n<|image_2|>\n<|image_3|>\n<|image_4|>\n{prompt}<|end|>\n<|assistant|>\n
```

### Loading the model locally

After obtaining the Phi-3.5-vision-instruct model checkpoints, users can use this sample code for inference.
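Two mechanical steps in the inference sample that follows can be sanity-checked in isolation: assembling the image placeholder string, and trimming the echoed prompt tokens from `generate_ids`. A minimal sketch in plain Python (a list of lists stands in for the 2-D tensor; the helper names are our own, not part of the library):

```python
def build_vision_prompt(user_prompt: str, num_images: int = 1) -> str:
    """Assemble a Phi-3.5-vision chat prompt with 1-indexed image placeholders."""
    placeholders = "".join(f"<|image_{i}|>\n" for i in range(1, num_images + 1))
    return f"<|user|>\n{placeholders}{user_prompt}<|end|>\n<|assistant|>\n"

def strip_prompt_tokens(generated, prompt_len):
    """Drop the first prompt_len ids from each sequence, mirroring
    generate_ids[:, inputs['input_ids'].shape[1]:] on a 2-D tensor."""
    return [seq[prompt_len:] for seq in generated]

# A four-image prompt, matching the multi-image example above.
print(build_vision_prompt("Summarize the deck of slides.", num_images=4))

# The prompt occupied the first 3 positions; only newly generated ids remain.
print(strip_prompt_tokens([[101, 7592, 2088, 5, 6, 7]], prompt_len=3))  # [[5, 6, 7]]
```

These helpers are illustrative only; in practice the processor's chat template and the tensor slicing in the sample code perform the same steps.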
```python
from PIL import Image
import requests
from transformers import AutoModelForCausalLM
from transformers import AutoProcessor

model_id = "microsoft/Phi-3.5-vision-instruct"

# Note: set _attn_implementation='eager' if you don't have flash_attn installed
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="cuda",
    trust_remote_code=True,
    torch_dtype="auto",
    _attn_implementation='flash_attention_2'
)

# For best performance, use num_crops=4 for multi-frame, num_crops=16 for single-frame.
processor = AutoProcessor.from_pretrained(model_id,
    trust_remote_code=True,
    num_crops=4
)

images = []
placeholder = ""

# Note: if you run out of memory, consider reducing the number of frames in this example.
for i in range(1, 20):
    url = f"https://image.slidesharecdn.com/azureintroduction-191206101932/75/Introduction-to-Microsoft-Azure-Cloud-{i}-2048.jpg"
    images.append(Image.open(requests.get(url, stream=True).raw))
    placeholder += f"<|image_{i}|>\n"

messages = [
    {"role": "user", "content": placeholder + "Summarize the deck of slides."},
]

prompt = processor.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)

inputs = processor(prompt, images, return_tensors="pt").to("cuda:0")

generation_args = {
    "max_new_tokens": 1000,
    "temperature": 0.0,
    "do_sample": False,
}

generate_ids = model.generate(**inputs,
    eos_token_id=processor.tokenizer.eos_token_id,
    **generation_args
)

# Remove input tokens
generate_ids = generate_ids[:, inputs['input_ids'].shape[1]:]
response = processor.batch_decode(generate_ids,
    skip_special_tokens=True,
    clean_up_tokenization_spaces=False)[0]

print(response)
```

Notes:
+ To achieve the best performance, we suggest setting `num_crops=4` for multi-frame and `num_crops=16` for single-frame inference.
+ To turn off flash attention, set `_attn_implementation='eager'`.

## Responsible AI Considerations

Like other models, the Phi family of models can potentially behave in ways that are unfair, unreliable, or offensive.
Some of the limiting behaviors to be aware of include:

* Quality of Service: The Phi models are trained primarily on English text. Languages other than English will experience worse performance. English language varieties with less representation in the training data might experience worse performance than standard American English.
* Representation of Harms & Perpetuation of Stereotypes: These models can over- or under-represent groups of people, erase representation of some groups, or reinforce demeaning or negative stereotypes. Despite safety post-training, these limitations may still be present due to differing levels of representation of different groups or the prevalence of examples of negative stereotypes in training data that reflect real-world patterns and societal biases.
* Inappropriate or Offensive Content: These models may produce other types of inappropriate or offensive content, which may make them unsuitable to deploy in sensitive contexts without additional mitigations specific to the use case.
* Information Reliability: Language models can generate nonsensical content or fabricate content that might sound reasonable but is inaccurate or outdated.
* Limited Scope for Code: The majority of Phi-3 training data is based on Python and uses common packages such as "typing, math, random, collections, datetime, itertools". If the model generates Python scripts that utilize other packages or scripts in other languages, we strongly recommend users manually verify all API uses.

Developers should apply responsible AI best practices and are responsible for ensuring that a specific use case complies with relevant laws and regulations (e.g. privacy, trade, etc.). Important areas for consideration include:

* Allocation: Models may not be suitable for scenarios that could have consequential impact on legal status or the allocation of resources or life opportunities (ex: housing, employment, credit, etc.)
without further assessments and additional debiasing techniques.
* High-Risk Scenarios: Developers should assess the suitability of using models in high-risk scenarios where unfair, unreliable, or offensive outputs might be extremely costly or lead to harm. This includes providing advice in sensitive or expert domains where accuracy and reliability are critical (ex: legal or health advice). Additional safeguards should be implemented at the application level according to the deployment context.
* Misinformation: Models may produce inaccurate information. Developers should follow transparency best practices and inform end-users that they are interacting with an AI system. At the application level, developers can build feedback mechanisms and pipelines to ground responses in use-case-specific, contextual information, a technique known as Retrieval Augmented Generation (RAG).
* Generation of Harmful Content: Developers should assess outputs for their context and use available safety classifiers or custom solutions appropriate for their use case.
* Misuse: Other forms of misuse such as fraud, spam, or malware production may be possible, and developers should ensure that their applications do not violate applicable laws and regulations.
* Identification of Individuals: Models with vision capabilities may have the potential to uniquely identify individuals in images. Safety post-training steers the model to refuse such requests, but developers should consider and implement, as appropriate, additional mitigations or user consent flows as required in their respective jurisdiction (e.g., building measures to blur faces in image inputs before processing).

## Training

### Models

**Architecture:** Phi-3.5-vision has 4.2B parameters and contains an image encoder, a connector, a projector, and the Phi-3 Mini language model.<br>
**Inputs:** Text and Image.
It’s best suited for prompts using the chat format.<br>
**Context length:** 128K tokens<br>
**GPUs:** 256 A100-80G<br>
**Training time:** 6 days<br>
**Training data:** 500B tokens (vision tokens + text tokens)<br>
**Outputs:** Generated text in response to the input<br>
**Dates:** Trained between July and August 2024<br>
**Status:** This is a static model trained on an offline text dataset with cutoff date March 15, 2024. Future versions of the tuned models may be released as we improve the models.<br>
**Release date:** August 2024<br>

### Data Overview

Our training data includes a wide variety of sources and is a combination of:

1) publicly available documents filtered rigorously for quality, selected high-quality educational data, and code;
2) selected high-quality image-text interleave data;
3) newly created synthetic, “textbook-like” data for the purpose of teaching math, coding, common sense reasoning, and general knowledge of the world (science, daily activities, theory of mind, etc.), newly created image data, e.g., charts/tables/diagrams/slides, and newly created multi-image and video data, e.g., short video clips and pairs of two similar images;
4) high-quality chat-format supervised data covering various topics to reflect human preferences on different aspects such as instruction-following, truthfulness, honesty, and helpfulness.

The data collection process involved sourcing information from publicly available documents, with a meticulous approach to filtering out undesirable documents and images. To safeguard privacy, we carefully filtered various image and text data sources to remove or scrub any potentially personal data from the training data.

More details about the data can be found in the [Phi-3 Technical Report](https://arxiv.org/pdf/2404.14219).

### How to finetune?
We recommend users take a look at the [Phi-3 CookBook finetuning recipe for Vision](https://github.com/microsoft/Phi-3CookBook/blob/main/md/04.Fine-tuning/FineTuning_Vision.md).

## Benchmarks

To understand the capabilities, we compare Phi-3.5-vision with a set of models over a variety of zero-shot benchmarks using our internal benchmark platform. Below is a high-level overview of model quality on representative benchmarks:

| Category | Benchmark | Phi-3.5-vision-instruct | Intern-VL-2-4B | Intern-VL-2-8B | Gemini-1.5-Flash | GPT-4o-mini 2024-7-18 | Claude-3.5-Sonnet | Gemini-1.5-Pro | GPT-4o 2024-5-13 |
|--|--|--|--|--|--|--|--|--|--|
| Popular aggregated benchmark | MMMU (val) | 43.0 | 44.22 | 46.33 | 49.33 | 52.1 | 52.67 | 54.11 | 61.78 |
| | MMBench (dev-en) | 81.9 | 83.4 | 87.0 | 85.7 | 83.8 | 82.3 | 87.9 | 88.4 |
| Visual scientific knowledge reasoning | ScienceQA (img-test) | 91.3 | 94.9 | 95.9 | 84.5 | 84.0 | 73.8 | 86.0 | 88.5 |
| Visual math reasoning | MathVista (testmini) | 43.9 | 53.7 | 51.1 | 55.3 | 38.8 | 54.0 | 57.4 | 54.4 |
| | InterGPS (test) | 36.3 | 45.6 | 53.2 | 39.4 | 39.9 | 45.6 | 58.2 | 46.9 |
| Chart reasoning | AI2D (test) | 78.1 | 77.3 | 81.4 | 78.4 | 75.2 | 68.9 | 75.6 | 82.8 |
| | ChartQA (test) | 81.8 | 78.8 | 80.4 | 57.6 | 54.5 | 73.2 | 68.2 | 64.0 |
| Document Intelligence | TextVQA (val) | 72.0 | 66.2 | 68.8 | 67.4 | 70.9 | 70.5 | 64.5 | 75.6 |
| Object visual presence verification | POPE (test) | 86.1 | 83.3 | 84.2 | 86.1 | 83.6 | 76.6 | 89.3 | 87.0 |

## Safety Evaluation and Red-Teaming

**Approach**

The Phi-3 family of models has adopted a robust safety post-training approach. This approach leverages a variety of both open-source and in-house generated datasets.
The overall technique employed for safety alignment is a combination of SFT (Supervised Fine-Tuning) and RLHF (Reinforcement Learning from Human Feedback) approaches, utilizing human-labeled and synthetic English-language datasets, including publicly available datasets focusing on helpfulness and harmlessness as well as various questions and answers targeted at multiple safety categories.

**Safety Evaluation**

We leveraged various evaluation techniques, including red teaming, adversarial conversation simulations, and safety evaluation benchmark datasets, to evaluate the Phi-3.5 models' propensity to produce undesirable outputs across multiple risk categories. Several approaches were used to compensate for the limitations of one approach alone. Please refer to the [technical report](https://arxiv.org/pdf/2404.14219) for more details of our safety alignment.

## Software

* [PyTorch](https://github.com/pytorch/pytorch)
* [Transformers](https://github.com/huggingface/transformers)
* [Flash-Attention](https://github.com/HazyResearch/flash-attention)

## Hardware

Note that by default, the Phi-3.5-vision-instruct model uses flash attention, which requires certain types of GPU hardware to run. We have tested on the following GPU types:

* NVIDIA A100
* NVIDIA A6000
* NVIDIA H100

## License

The model is licensed under the [MIT license](./LICENSE).

## Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft’s Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties’ policies.
null
Non_BioNLP
{"language": ["multilingual"], "license": "mit", "license_link": "https://huggingface.co/microsoft/Phi-3.5-vision-instruct/resolve/main/LICENSE", "pipeline_tag": "image-text-to-text", "tags": ["nlp", "code", "vision"], "inference": {"parameters": {"temperature": 0.7}}, "widget": [{"messages": [{"role": "user", "content": "<|image_1|>Can you describe what you see in the image?"}]}]}
task
[ "SUMMARIZATION" ]
42,879
Jrinky/model3
Jrinky
sentence-similarity
[ "sentence-transformers", "safetensors", "xlm-roberta", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:11808", "loss:Infonce", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:BAAI/bge-m3", "base_model:finetune:BAAI/bge-m3", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
2025-01-29T05:41:49Z
2025-01-29T05:46:55+00:00
5
0
--- base_model: BAAI/bge-m3 library_name: sentence-transformers pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:11808 - loss:Infonce widget: - source_sentence: Who are some notable individuals named Roger Mason sentences: - "Rav Kook's writings are extensive, and he is considered one of the most celebrated\ \ and influential rabbis of the 20th century. Some rabbis recommend that students\ \ of his begin studying his writings with Ein Ayah. References\n\nExternal links\n\ \ Ayin Ayah (full text), Hebrew Wikisource\n * Ayn Aya Classes in English\n\n\ Talmud\nAggadic Midrashim\nAbraham Isaac Kook\nHebrew-language religious books" - 'Roger Mason may refer to: Roger Mason (baseball) (born 1958), American baseball player Roger Mason (geologist) (born 1941), discoverer of Ediacaran fossils Roger Mason Jr. (born 1980), American basketball player Roger Mason (musician), Australian keyboardist L. Roger Mason, Jr., former assistant director of National Intelligence for Systems and Resource Analyses' - 'Timetabled passenger services on both lines had ceased by the end of February 1959. Shipping The Bourne-Morton Canal or Bourne Old Eau connected the town to the sea in Roman times. Until the mid-19th century, the present Bourne Eau was capable of carrying commercial boat traffic from the Wash coast and Spalding. This resulted from the investment following the Bourne Navigation Act of 1780. Passage became impossible once the junction of the Eau and the River Glen was converted from gates to a sluice in 1860. Media Local news and television programmes are provided by BBC Yorkshire and Lincolnshire and ITV Yorkshire. Television signals are received from the Belmont TV transmitter, the Waltham TV transmitter can also be received which broadcast BBC East Midlands and ITV Central programmes. Local radio stations are BBC Radio Lincolnshire, Greatest Hits Radio Lincolnshire and Lincs FM. 
The town''s local newspapers are Bourne Local and Stamford Mercury. Sport Bourne Town Football Club plays football in the United Counties Football League, whilst Bourne Cricket Club plays in the Lincolnshire ECB Premier League. These teams play their home games at the Abbey Lawn, a recreation ground privately owned by the Bourne United Charities. Motor sports The racing-car marques English Racing Automobiles (ERA) and British Racing Motors (BRM) were both founded in Bourne by Raymond Mays, an international racing driver and designer who lived in Bourne. The former ERA and BRM workshops in Spalding Road are adjacent to Eastgate House, the Mays'' family home in the town''s Eastgate. Landmarks There are currently 71 listed buildings in the parish of Bourne, the most important being Bourne Abbey and the Parish Church of St Peter and St Paul (1138), which is the only one scheduled Grade I. Notable people Bourne is reputedly the birthplace of Hereward the Wake (in about 1035), although the 12th-century source of this information, De Gestis Herwardi Saxonis, refers only to his father as being "of Bourne" and to the father''s house and retainers there. Robert Mannyng (1264–1340) is credited with putting the speech of the ordinary people of his time into recognisable form. He is better known as Robert de Brunne because of his long period of residence as a canon at Bourne Abbey. There he completed his life''s work of popularising religious and historical material in a Middle English dialect that was easily understood at that time. William Cecil (1520–1598) became the first Lord Burghley after serving Queen Elizabeth I. He was born at a house in the centre of Bourne that is now the Burghley Arms. Dr William Dodd (1729–1777), was an Anglican clergyman, man of letters and forger. He was prosecuted, sentenced to death and publicly hanged at Tyburn in 1777. Charles Frederick Worth (1825–1895), son of a solicitor, lived at Wake House in North Street. 
He moved to Paris and became a renowned designer of women''s fashion and the founder of haute couture. The French government awarded him the Légion d''honneur. Sir George White (1840-1912), MP for North West Norfolk, a seat he held for twelve years until he died in 1912. He was knighted for public service in 1907.' - source_sentence: What football team does the Japanese player play for sentences: - After the meeting, Box summons up the courage to ask Lorraine (Sue Holderness) on the date. The act ends with Robert's coat getting on fire because of the cigarette, with "Smoke Gets in Your Eyes" on the background. - is a Japanese football player. He plays for Honda Lock. - As followers on Twitter and FB probably well know I’ve been up to more than a spot of preserving of late. It’s my latest addiction, as if I need any more of those. My Dad’s the King of Jams, Chutneys and Pickles and I have a feeling he’s passed his enthusiastic genes for it on to me!. Which is great, but time consuming. Many an evening has been spent peeling, dicing, de-stoning, chopping, stirring, testing, sterilising and jarring. And then obviously the tasting. And all the crackers, bread and cheese to go with it!. I rarely get to bed much before midnight on my chutneying nights. And to be honest my cupboards are now fit to bursting with so many goodies, but at least I have christmas presents totally nailed this year. My Dad’s been making Hedgerow Chutney for years, and it happens to be everyone’s favourite of all his chutney recipes (and he makes quite a number!). Each autumn he takes a long walk around the field at the back of his house in Herefordshire picking all the freebie hedgerow goodies he can find and transforms them into this marvellously fruitful chutney. There’s always plenty of damsons, bullaces, sloes, blackberries and a few elderberries. Plus pears or apples for smoothing and bulking out. 
We don’t have quite the same fruit in our hedgerows in France but I thought I’d make my own French version picking the fruit from our garden and nearby tracks and lanes, managing to find plenty of figs, greengages, plums, pears, blackberries and sloes just before the season finished a couple of weeks ago. We’ve elderberries here too but they were way past their best by the time I got into full chutney mode. There’s no escaping how time consuming and labourious chutney making can be, especially when using so much fruit that needs hefty preparatory work. I realise now why it’s a hobby generally taken up by retired folk. But the results are so worth it, if you can spare it set aside a whole evening in the kitchen and wile away the hours getting lost in music or the radio or even catching up on a few programmes on You Tube. - source_sentence: What is the purpose of Business Intelligence sentences: - 'College career Proctor played as a defensive lineman for the North Carolina Central Eagles from 2008 to 2012. He was redshirted in 2008.' - The purpose of Business Intelligence is the transformation of raw data into meaningful information which can be used to make better business decisions. Business Intelligence grew out of Decision Support systems and is all about collecting data from disparate sources, conforming and integrating that data into central repositories which support reporting and analysis activities. - You have to show the police courtesy, they are only human. No one even WANTS for the judicial system to work. They are too lazy. - source_sentence: How does the speaker feel about Battle Symphony sentences: - It's a symptomless prearranged fact that when you afford your babe a infant work you motivate the status system, bolster the infant's stressed system, eat up colic, and harden your in bondage next to your kid. Now, how satisfying is that - Piquet passed Laffite to become the race's fifth different leader. 
Senna reached second just 1.7 seconds behind Piquet by passing Laffite, who then pitted for tires. With the two of them in front on their own, and Piquet leading by up to 3.5 seconds, Senna was content for the time being to follow his countryman. After eight laps in the lead, Piquet pitted for tires. Senna regained first place and then also pitted. Piquet's 18.4 second stop was even slower than teammate Mansell's had been, but when he returned to the track, the two-time champion got the bit between his teeth. Running second behind Senna, Piquet set the fastest lap of the race on lap 41, but with a pit stop ten seconds quicker than Piquet's, Senna was able to retain the lead. On the very next lap, the 42nd, Piquet pushed a bit too much, and crashed hard at the left-hand corner before the last chicane. He ended up in the tire barrier, unhurt, but with his car in a very precarious position. The crane, present for just that reason, was unable to move the car. Arnoux, now 16.6 seconds behind in second, took a second a lap off Senna's lead for five laps while a yellow was displayed in the corner where Piquet had crashed. As soon as the yellow flag was gone, Arnoux went wide and hit Piquet's abandoned Williams! The Frenchman decided that his car was not damaged, and attempted to rejoin the field, but did so right in front of Thierry Boutsen's Arrows-BMW, sidelining both cars. Very uncharacteristic of a street race, these three – Piquet, Arnoux and Boutsen – were the only drivers all afternoon to retire due to accidents. - Like Battle Symphony, it's not bad. It's just extremely boring. - source_sentence: When did he migrate to New South Wales sentences: - 'predict ministry in a sales and special floor being Job to the vulnerability diver. team: This research will work last for either, often, and also obtaining spreadsheets in the funny wedding power of the usability time. 
Physical Demands: The exclusive transitions was temporarily need perfect of those that must share developed by an position to badly do the animal objectives of this source. necessary terabytes may pay acted to increase streets with hearts to address the professional items. solely, the job will distract, Coordinate and be inbox security fun interdisciplinary operations that might read in back of 20 updates The service will properly be to like the detection throughout the use: logging, including, killing, teaching, leading, preparing, operating, and using.' - "Shizuka Shirakawa, Scholar of Chinese-language literature. Horin Fukuoji, Nihonga\ \ painter. 2005\n Mitsuko Mori. Actress. Makoto Saitō (1921–2008). Political scientist,\ \ specializing in American diplomatic and political history. Ryuzan Aoki, Ceramic\ \ artist. Toshio Sawada, Civil engineer. Shigeaki Hinohara, Doctor. 2006\n Yoshiaki\ \ Arata. A pioneer of nuclear fusion research. Jakuchō Setouchi. Writer/Buddhist\ \ nun. Hidekazu Yoshida. Music critic. Chusaku Oyama, Nihonga painter. Miyohei\ \ Shinohara, Economist. 2007\n Akira Mikazuki. Former justice minister and professor\ \ emeritus. Shinya Nakamura. Sculptor. Kōji Nakanishi. Organic chemist. Tokindo\ \ Okada, Developmental biologist. Shigeyama Sensaku, Kyogen performer. 2008\n\ \ Hironoshin Furuhashi (1928–2009). Sportsman and sports bureaucrat. Kiyoshi Itō.\ \ A mathematician whose work is now called Itō calculus. Donald Keene." - He attended Derby Grammar School and Beaufort House in London, and migrated to New South Wales in 1883. He settled in Newcastle, where he worked as a shipping agent, eventually partnering with his brothers in a firm. On 6 May 1893 he married Gertrude Mary Saddington, with whom he had five children. --- # SentenceTransformer based on BAAI/bge-m3 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3). 
It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 --> - **Maximum Sequence Length:** 1024 tokens - **Output Dimensionality:** 1024 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: XLMRobertaModel (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("Jrinky/model3") # Run inference sentences = [ 'When did he migrate to New South Wales', 'He attended Derby Grammar School and Beaufort House in London, and migrated to New South Wales in 1883. 
He settled in Newcastle, where he worked as a shipping agent, eventually partnering with his brothers in a firm. On 6 May 1893 he married Gertrude Mary Saddington, with whom he had five children.', 'Shizuka Shirakawa, Scholar of Chinese-language literature. Horin Fukuoji, Nihonga painter. 2005\n Mitsuko Mori. Actress. Makoto Saitō (1921–2008). Political scientist, specializing in American diplomatic and political history. Ryuzan Aoki, Ceramic artist. Toshio Sawada, Civil engineer. Shigeaki Hinohara, Doctor. 2006\n Yoshiaki Arata. A pioneer of nuclear fusion research. Jakuchō Setouchi. Writer/Buddhist nun. Hidekazu Yoshida. Music critic. Chusaku Oyama, Nihonga painter. Miyohei Shinohara, Economist. 2007\n Akira Mikazuki. Former justice minister and professor emeritus. Shinya Nakamura. Sculptor. Kōji Nakanishi. Organic chemist. Tokindo Okada, Developmental biologist. Shigeyama Sensaku, Kyogen performer. 2008\n Hironoshin Furuhashi (1928–2009). Sportsman and sports bureaucrat. Kiyoshi Itō. A mathematician whose work is now called Itō calculus. Donald Keene.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 1024] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 11,808 training samples * Columns: <code>anchor</code> and <code>positive</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 6 tokens</li><li>mean: 17.85 tokens</li><li>max: 48 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 186.46 tokens</li><li>max: 1024 tokens</li></ul> | * Samples: | anchor | positive | |:-----------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>What type of tournament structure was used in this freestyle wrestling competition</code> | 
<code>This freestyle wrestling competition consisted of a single-elimination tournament, with a repechage used to determine the winners of two bronze medals. Results<br>Legend<br>F — Won by fall<br><br>Final<br><br>Top half<br><br>Bottom half<br><br>Repechage<br><br>References<br>Official website<br><br>Women's freestyle 58 kg<br>World</code> | | <code>What was the status of Josip Broz Tito under the 1974 Constitution of Yugoslavia regarding his presidency</code> | <code>1 Wednesday, 22 April 1998. 2 (8.30 a.m.). 3 JUDGE CASSESE: Good morning. May I ask the<br>4 Registrar to call out the case number, please. 5 THE REGISTRAR: Case number IT-95-13a-T,<br>6 Prosecutor versus Slavko Dokmanovic. 7 MR. NIEMANN: My name is Niemann. I appear<br>8 with my colleagues, Mr. Williamson, Mr. Waespi and<br>9 Mr. Vos. 10 MR. FILA: My name is Mr. Toma Fila and<br>11 I appear with Ms. Lopicic and Mr. Petrovic in Defence of<br>12 my client, Mr. Slavko Dokmanovic. 13 JUDGE CASSESE: Mr. Dokmanovic, can you<br>14 follow me? Before we call the witness, may I ask you<br>15 whether you agree to this note from the Registrar about<br>16 the two documents which we discussed yesterday -- you<br>17 have probably received the English translation of the<br>18 bibliography of our witness, plus the missing pages of<br>19 the other document, so I think it is agreed that they<br>20 can be admitted into evidence. 21 MR. NIEMANN: Yes. 22 JUDGE CASSESE: Shall we proceed with the<br>24 MR. FILA: Your Honour, before we continue<br>25 wi...</code> | | <code>How quickly can you get loan approval and funds transferred with Crawfort</code> | <code>Then click on the submit button, and it’s done. Make your dream come true with Crawfort<br>When you all submit the loan form, then the agency takes a few hours to process and for approval of the loan. Not only that, you can get your loan amount in your account within a day after getting approval. 
Many money lenders all take more time in processing things and to credit the amount as well. So, for all that, a customer suffers more as they can’t get the money immediately. But here all these things are not done, and the staff here always make sure to provide you best and fast services. For all these things, you can get the best loan services from here without any doubt.</code> | * Loss: <code>selfloss.Infonce</code> with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### Unnamed Dataset * Size: 1,476 evaluation samples * Columns: <code>anchor</code> and <code>positive</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 6 tokens</li><li>mean: 17.61 tokens</li><li>max: 47 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 171.81 tokens</li><li>max: 1024 tokens</li></ul> | * Samples: | anchor | positive | 
|:--------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>What is Hector Guimard best known for</code> | <code>Hector Guimard (, 10 March 1867 – 20 May 1942) was a French architect and designer, and a prominent figure of the Art Nouveau style. He achieved early fame with his design for the Castel Beranger, the first Art Nouveau apartment building in Paris, which was selected in an 1899 competition as one of the best new building facades in the city. He is best known for the glass and iron edicules or canopies, with ornamental Art Nouveau curves, which he designed to cover the entrances of the first stations of the Paris Metro. Between 1890 and 1930, Guimard designed and built some fifty buildings, in addition to one hundred and forty-one subway entrances for Paris Metro, as well as numerous pieces of furniture and other decorative works. 
However, in the 1910s Art Nouveau went out of fashion and by the 1960s most of his works had been demolished, and only two of his original Metro edicules were still in place. Guimard's critical reputation revived in the 1960s, in part due to subsequent acquisit...</code> | | <code>What does Mark Kantrowitz say about the inclusion of loans in financial aid packages</code> | <code>"They don't always understand that part of the financial aid package includes loans," he says. But loans "don't really reduce your costs," explains Mark Kantrowitz, founder of the financial aid website FinAid.org and publisher of Edvisors Network. "They simply spread them out over time. ... A loan is a loan.</code> | | <code>How can Ayurveda support women's health during menopause</code> | <code>Especially as we journey towards menopause, Ayurveda is there to support us with its millenary wisdom. These are some easy routines to incorporate for the daily care of the vulva and vagina, our most delicate flower. Sesame oil: our best allied against dryness, it cannot be missing in our diet.</code> | * Loss: <code>selfloss.Infonce</code> with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 2 - `per_device_eval_batch_size`: 2 - `learning_rate`: 2e-05 - `num_train_epochs`: 5 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 2 - `per_device_eval_batch_size`: 2 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 
1.0 - `num_train_epochs`: 5 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - 
`hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | Validation Loss | |:------:|:----:|:-------------:|:---------------:| | 0.2033 | 100 | 0.2694 | 0.0690 | | 0.4065 | 200 | 0.0822 | 0.0528 | | 0.6098 | 300 | 0.0689 | 0.0497 | | 0.8130 | 400 | 0.0644 | 0.0469 | | 1.0163 | 500 | 0.0643 | 0.0443 | | 1.2195 | 600 | 0.0378 | 0.0473 | | 1.4228 | 700 | 0.04 | 0.0479 | | 1.6260 | 800 | 0.0358 | 0.0461 | | 1.8293 | 900 | 0.0332 | 0.0507 | | 2.0325 | 1000 | 0.0283 | 0.0538 | ### Framework Versions - Python: 3.12.3 - Sentence Transformers: 3.4.0 - Transformers: 4.42.4 - PyTorch: 2.2.0+cu121 - Accelerate: 1.3.0 - Datasets: 3.2.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### Infonce ```bibtex @misc{henderson2017efficient, 
title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
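The `Infonce` loss configured above (cosine similarity, scale 20.0) is the standard in-batch-negatives contrastive objective: each anchor's positive is scored against every positive in the batch, and the loss is the cross-entropy of picking the right one. A minimal pure-Python sketch under that reading (toy vectors, not the actual training code):

```python
import math

def cos_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def infonce_loss(anchors, positives, scale=20.0):
    """Mean cross-entropy of each anchor against all positives (in-batch negatives)."""
    total = 0.0
    for i, a in enumerate(anchors):
        logits = [scale * cos_sim(a, p) for p in positives]
        m = max(logits)
        log_z = m + math.log(sum(math.exp(l - m) for l in logits))
        total += log_z - logits[i]  # negative log-softmax probability of the true positive
    return total / len(anchors)

anchors = [[1.0, 0.0], [0.0, 1.0]]
well_matched = [[0.9, 0.1], [0.1, 0.9]]   # each positive aligns with its anchor
mismatched = [[0.1, 0.9], [0.9, 0.1]]     # positives swapped

print(infonce_loss(anchors, well_matched))  # low loss
print(infonce_loss(anchors, mismatched))    # high loss
```

The scale of 20 sharpens the softmax so that a small cosine-similarity gap between the true pair and the in-batch negatives translates into a near-zero loss, which matches the training-loss magnitudes in the log table above.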
null
Non_BioNLP
|:-----------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>What type of tournament structure was used in this freestyle wrestling competition</code> | <code>This freestyle wrestling competition consisted of a single-elimination tournament, with a repechage used to determine the winners of two bronze medals. Results<br>Legend<br>F — Won by fall<br><br>Final<br><br>Top half<br><br>Bottom half<br><br>Repechage<br><br>References<br>Official website<br><br>Women's freestyle 58 kg<br>World</code> | | <code>What was the status of Josip Broz Tito under the 1974 Constitution of Yugoslavia regarding his presidency</code> | <code>1 Wednesday, 22 April 1998. 2 (8.30 a.m.). 3 JUDGE CASSESE: Good morning. May I ask the<br>4 Registrar to call out the case number, please. 5 THE REGISTRAR: Case number IT-95-13a-T,<br>6 Prosecutor versus Slavko Dokmanovic. 7 MR. 
NIEMANN: My name is Niemann. I appear<br>8 with my colleagues, Mr. Williamson, Mr. Waespi and<br>9 Mr. Vos. 10 MR. FILA: My name is Mr. Toma Fila and<br>11 I appear with Ms. Lopicic and Mr. Petrovic in Defence of<br>12 my client, Mr. Slavko Dokmanovic. 13 JUDGE CASSESE: Mr. Dokmanovic, can you<br>14 follow me? Before we call the witness, may I ask you<br>15 whether you agree to this note from the Registrar about<br>16 the two documents which we discussed yesterday -- you<br>17 have probably received the English translation of the<br>18 bibliography of our witness, plus the missing pages of<br>19 the other document, so I think it is agreed that they<br>20 can be admitted into evidence. 21 MR. NIEMANN: Yes. 22 JUDGE CASSESE: Shall we proceed with the<br>24 MR. FILA: Your Honour, before we continue<br>25 wi...</code> | | <code>How quickly can you get loan approval and funds transferred with Crawfort</code> | <code>Then click on the submit button, and it’s done. Make your dream come true with Crawfort<br>When you all submit the loan form, then the agency takes a few hours to process and for approval of the loan. Not only that, you can get your loan amount in your account within a day after getting approval. Many money lenders all take more time in processing things and to credit the amount as well. So, for all that, a customer suffers more as they can’t get the money immediately. But here all these things are not done, and the staff here always make sure to provide you best and fast services. 
For all these things, you can get the best loan services from here without any doubt.</code> | * Loss: <code>selfloss.Infonce</code> with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### Unnamed Dataset * Size: 1,476 evaluation samples * Columns: <code>anchor</code> and <code>positive</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 6 tokens</li><li>mean: 17.61 tokens</li><li>max: 47 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 171.81 tokens</li><li>max: 1024 tokens</li></ul> | * Samples: | anchor | positive | |:--------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>What is 
Hector Guimard best known for</code> | <code>Hector Guimard (, 10 March 1867 – 20 May 1942) was a French architect and designer, and a prominent figure of the Art Nouveau style. He achieved early fame with his design for the Castel Beranger, the first Art Nouveau apartment building in Paris, which was selected in an 1899 competition as one of the best new building facades in the city. He is best known for the glass and iron edicules or canopies, with ornamental Art Nouveau curves, which he designed to cover the entrances of the first stations of the Paris Metro. Between 1890 and 1930, Guimard designed and built some fifty buildings, in addition to one hundred and forty-one subway entrances for Paris Metro, as well as numerous pieces of furniture and other decorative works. However, in the 1910s Art Nouveau went out of fashion and by the 1960s most of his works had been demolished, and only two of his original Metro edicules were still in place. Guimard's critical reputation revived in the 1960s, in part due to subsequent acquisit...</code> | | <code>What does Mark Kantrowitz say about the inclusion of loans in financial aid packages</code> | <code>"They don't always understand that part of the financial aid package includes loans," he says. But loans "don't really reduce your costs," explains Mark Kantrowitz, founder of the financial aid website FinAid.org and publisher of Edvisors Network. "They simply spread them out over time. ... A loan is a loan.</code> | | <code>How can Ayurveda support women's health during menopause</code> | <code>Especially as we journey towards menopause, Ayurveda is there to support us with its millenary wisdom. These are some easy routines to incorporate for the daily care of the vulva and vagina, our most delicate flower. 
Sesame oil: our best allied against dryness, it cannot be missing in our diet.</code> | * Loss: <code>selfloss.Infonce</code> with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 2 - `per_device_eval_batch_size`: 2 - `learning_rate`: 2e-05 - `num_train_epochs`: 5 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 2 - `per_device_eval_batch_size`: 2 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 5 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - 
`load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | Validation Loss | |:------:|:----:|:-------------:|:---------------:| | 0.2033 | 100 | 0.2694 | 0.0690 | | 0.4065 | 
200 | 0.0822 | 0.0528 | | 0.6098 | 300 | 0.0689 | 0.0497 | | 0.8130 | 400 | 0.0644 | 0.0469 | | 1.0163 | 500 | 0.0643 | 0.0443 | | 1.2195 | 600 | 0.0378 | 0.0473 | | 1.4228 | 700 | 0.04 | 0.0479 | | 1.6260 | 800 | 0.0358 | 0.0461 | | 1.8293 | 900 | 0.0332 | 0.0507 | | 2.0325 | 1000 | 0.0283 | 0.0538 | ### Framework Versions - Python: 3.12.3 - Sentence Transformers: 3.4.0 - Transformers: 4.42.4 - PyTorch: 2.2.0+cu121 - Accelerate: 1.3.0 - Datasets: 3.2.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### Infonce ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
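The training objective named above (`Infonce` with `scale: 20.0` and cosine similarity, i.e. in-batch-negatives InfoNCE) can be sketched in a few lines of plain Python. This is a minimal illustration of the loss, not the trainer's actual implementation, and the toy vectors are invented for the example.

```python
import math

def cos_sim(a, b):
    # Plain cosine similarity between two dense vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def infonce_loss(anchors, positives, scale=20.0):
    """Mean cross-entropy where, for anchor i, positives[i] is the target
    and every other in-batch positive serves as a negative."""
    total = 0.0
    for i, a in enumerate(anchors):
        logits = [scale * cos_sim(a, p) for p in positives]
        m = max(logits)  # log-sum-exp shift for numerical stability
        lse = m + math.log(sum(math.exp(z - m) for z in logits))
        total += lse - logits[i]
    return total / len(anchors)

# Toy batch: two well-aligned (anchor, positive) pairs give a near-zero loss.
anchors = [[1.0, 0.0], [0.0, 1.0]]
positives = [[1.0, 0.0], [0.0, 1.0]]
loss = infonce_loss(anchors, positives)
```

With anchors identical to their positives the loss is near zero; swapping the positives drives it up, which is exactly the signal the fine-tuning run above minimizes.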
{"base_model": "BAAI/bge-m3", "library_name": "sentence-transformers", "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:11808", "loss:Infonce"], "widget": [{"source_sentence": "Who are some notable individuals named Roger Mason", "sentences": ["Rav Kook's writings are extensive, and he is considered one of the most celebrated and influential rabbis of the 20th century. Some rabbis recommend that students of his begin studying his writings with Ein Ayah. References\n\nExternal links\n Ayin Ayah (full text), Hebrew Wikisource\n * Ayn Aya Classes in English\n\nTalmud\nAggadic Midrashim\nAbraham Isaac Kook\nHebrew-language religious books", "Roger Mason may refer to:\n\nRoger Mason (baseball) (born 1958), American baseball player\nRoger Mason (geologist) (born 1941), discoverer of Ediacaran fossils\nRoger Mason Jr. (born 1980), American basketball player\nRoger Mason (musician), Australian keyboardist\nL. Roger Mason, Jr., former assistant director of National Intelligence for Systems and Resource Analyses", "Timetabled passenger services on both lines had ceased by the end of February 1959. Shipping\nThe Bourne-Morton Canal or Bourne Old Eau connected the town to the sea in Roman times. Until the mid-19th century, the present Bourne Eau was capable of carrying commercial boat traffic from the Wash coast and Spalding. This resulted from the investment following the Bourne Navigation Act of 1780. Passage became impossible once the junction of the Eau and the River Glen was converted from gates to a sluice in 1860. Media\nLocal news and television programmes are provided by BBC Yorkshire and Lincolnshire and ITV Yorkshire. Television signals are received from the Belmont TV transmitter, the Waltham TV transmitter can also be received which broadcast BBC East Midlands and ITV Central programmes. 
Local radio stations are BBC Radio Lincolnshire, Greatest Hits Radio Lincolnshire and Lincs FM. The town's local newspapers are Bourne Local and Stamford Mercury. Sport\nBourne Town Football Club plays football in the United Counties Football League, whilst Bourne Cricket Club plays in the Lincolnshire ECB Premier League. These teams play their home games at the Abbey Lawn, a recreation ground privately owned by the Bourne United Charities. Motor sports\n\nThe racing-car marques English Racing Automobiles (ERA) and British Racing Motors (BRM) were both founded in Bourne by Raymond Mays, an international racing driver and designer who lived in Bourne. The former ERA and BRM workshops in Spalding Road are adjacent to Eastgate House, the Mays' family home in the town's Eastgate. Landmarks\n\nThere are currently 71 listed buildings in the parish of Bourne, the most important being Bourne Abbey and the Parish Church of St Peter and St Paul (1138), which is the only one scheduled Grade I. Notable people\nBourne is reputedly the birthplace of Hereward the Wake (in about 1035), although the 12th-century source of this information, De Gestis Herwardi Saxonis, refers only to his father as being \"of Bourne\" and to the father's house and retainers there. Robert Mannyng (1264–1340) is credited with putting the speech of the ordinary people of his time into recognisable form. He is better known as Robert de Brunne because of his long period of residence as a canon at Bourne Abbey. There he completed his life's work of popularising religious and historical material in a Middle English dialect that was easily understood at that time. William Cecil (1520–1598) became the first Lord Burghley after serving Queen Elizabeth I. He was born at a house in the centre of Bourne that is now the Burghley Arms. Dr William Dodd (1729–1777), was an Anglican clergyman, man of letters and forger. He was prosecuted, sentenced to death and publicly hanged at Tyburn in 1777. 
Charles Frederick Worth (1825–1895), son of a solicitor, lived at Wake House in North Street. He moved to Paris and became a renowned designer of women's fashion and the founder of haute couture. The French government awarded him the Légion d'honneur. Sir George White (1840-1912), MP for North West Norfolk, a seat he held for twelve years until he died in 1912. He was knighted for public service in 1907."]}, {"source_sentence": "What football team does the Japanese player play for", "sentences": ["After the meeting, Box summons up the courage to ask Lorraine (Sue Holderness) on the date. The act ends with Robert's coat getting on fire because of the cigarette, with \"Smoke Gets in Your Eyes\" on the background.", "is a Japanese football player. He plays for Honda Lock.", "As followers on Twitter and FB probably well know I’ve been up to more than a spot of preserving of late. It’s my latest addiction, as if I need any more of those. My Dad’s the King of Jams, Chutneys and Pickles and I have a feeling he’s passed his enthusiastic genes for it on to me!. Which is great, but time consuming. Many an evening has been spent peeling, dicing, de-stoning, chopping, stirring, testing, sterilising and jarring. And then obviously the tasting. And all the crackers, bread and cheese to go with it!. I rarely get to bed much before midnight on my chutneying nights. And to be honest my cupboards are now fit to bursting with so many goodies, but at least I have christmas presents totally nailed this year. My Dad’s been making Hedgerow Chutney for years, and it happens to be everyone’s favourite of all his chutney recipes (and he makes quite a number!). Each autumn he takes a long walk around the field at the back of his house in Herefordshire picking all the freebie hedgerow goodies he can find and transforms them into this marvellously fruitful chutney. There’s always plenty of damsons, bullaces, sloes, blackberries and a few elderberries. 
Plus pears or apples for smoothing and bulking out. We don’t have quite the same fruit in our hedgerows in France but I thought I’d make my own French version picking the fruit from our garden and nearby tracks and lanes, managing to find plenty of figs, greengages, plums, pears, blackberries and sloes just before the season finished a couple of weeks ago. We’ve elderberries here too but they were way past their best by the time I got into full chutney mode. There’s no escaping how time consuming and labourious chutney making can be, especially when using so much fruit that needs hefty preparatory work. I realise now why it’s a hobby generally taken up by retired folk. But the results are so worth it, if you can spare it set aside a whole evening in the kitchen and wile away the hours getting lost in music or the radio or even catching up on a few programmes on You Tube."]}, {"source_sentence": "What is the purpose of Business Intelligence", "sentences": ["College career\nProctor played as a defensive lineman for the North Carolina Central Eagles from 2008 to 2012. He was redshirted in 2008.", "The purpose of Business Intelligence is the transformation of raw data into meaningful information which can be used to make better business decisions. Business Intelligence grew out of Decision Support systems and is all about collecting data from disparate sources, conforming and integrating that data into central repositories which support reporting and analysis activities.", "You have to show the police courtesy, they are only human. No one even WANTS for the judicial system to work. They are too lazy."]}, {"source_sentence": "How does the speaker feel about Battle Symphony", "sentences": ["It's a symptomless prearranged fact that when you afford your babe a infant work you motivate the status system, bolster the infant's stressed system, eat up colic, and harden your in bondage next to your kid. 
Now, how satisfying is that", "Piquet passed Laffite to become the race's fifth different leader. Senna reached second just 1.7 seconds behind Piquet by passing Laffite, who then pitted for tires. With the two of them in front on their own, and Piquet leading by up to 3.5 seconds, Senna was content for the time being to follow his countryman. After eight laps in the lead, Piquet pitted for tires. Senna regained first place and then also pitted. Piquet's 18.4 second stop was even slower than teammate Mansell's had been, but when he returned to the track, the two-time champion got the bit between his teeth. Running second behind Senna, Piquet set the fastest lap of the race on lap 41, but with a pit stop ten seconds quicker than Piquet's, Senna was able to retain the lead. On the very next lap, the 42nd, Piquet pushed a bit too much, and crashed hard at the left-hand corner before the last chicane. He ended up in the tire barrier, unhurt, but with his car in a very precarious position. The crane, present for just that reason, was unable to move the car. Arnoux, now 16.6 seconds behind in second, took a second a lap off Senna's lead for five laps while a yellow was displayed in the corner where Piquet had crashed. As soon as the yellow flag was gone, Arnoux went wide and hit Piquet's abandoned Williams! The Frenchman decided that his car was not damaged, and attempted to rejoin the field, but did so right in front of Thierry Boutsen's Arrows-BMW, sidelining both cars. Very uncharacteristic of a street race, these three – Piquet, Arnoux and Boutsen – were the only drivers all afternoon to retire due to accidents.", "Like Battle Symphony, it's not bad. It's just extremely boring."]}, {"source_sentence": "When did he migrate to New South Wales", "sentences": ["predict ministry in a sales and special floor being Job to the vulnerability diver. 
team: This research will work last for either, often, and also obtaining spreadsheets in the funny wedding power of the usability time. Physical Demands: The exclusive transitions was temporarily need perfect of those that must share developed by an position to badly do the animal objectives of this source. necessary terabytes may pay acted to increase streets with hearts to address the professional items. solely, the job will distract, Coordinate and be inbox security fun interdisciplinary operations that might read in back of 20 updates The service will properly be to like the detection throughout the use: logging, including, killing, teaching, leading, preparing, operating, and using.", "Shizuka Shirakawa, Scholar of Chinese-language literature. Horin Fukuoji, Nihonga painter. 2005\n Mitsuko Mori. Actress. Makoto Saitō (1921–2008). Political scientist, specializing in American diplomatic and political history. Ryuzan Aoki, Ceramic artist. Toshio Sawada, Civil engineer. Shigeaki Hinohara, Doctor. 2006\n Yoshiaki Arata. A pioneer of nuclear fusion research. Jakuchō Setouchi. Writer/Buddhist nun. Hidekazu Yoshida. Music critic. Chusaku Oyama, Nihonga painter. Miyohei Shinohara, Economist. 2007\n Akira Mikazuki. Former justice minister and professor emeritus. Shinya Nakamura. Sculptor. Kōji Nakanishi. Organic chemist. Tokindo Okada, Developmental biologist. Shigeyama Sensaku, Kyogen performer. 2008\n Hironoshin Furuhashi (1928–2009). Sportsman and sports bureaucrat. Kiyoshi Itō. A mathematician whose work is now called Itō calculus. Donald Keene.", "He attended Derby Grammar School and Beaufort House in London, and migrated to New South Wales in 1883. He settled in Newcastle, where he worked as a shipping agent, eventually partnering with his brothers in a firm. On 6 May 1893 he married Gertrude Mary Saddington, with whom he had five children."]}]}
task
[ "TEXT_CLASSIFICATION", "TRANSLATION" ]
42,880
google/t5-efficient-large
google
text2text-generation
[ "transformers", "pytorch", "tf", "jax", "t5", "text2text-generation", "deep-narrow", "en", "dataset:c4", "arxiv:2109.10686", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05Z
2023-01-24T16:47:46+00:00
180
4
--- datasets: - c4 language: - en license: apache-2.0 tags: - deep-narrow inference: false --- # T5-Efficient-LARGE (Deep-Narrow version) T5-Efficient-LARGE is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5). It is a *pretrained-only* checkpoint and was released with the paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*. In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures of similar parameter count. To quote the paper: > We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased > before considering any other forms of uniform scaling across other dimensions. This is largely due to > how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a > tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise, > a tall base model might also generally more efficient compared to a large model. We generally find > that, regardless of size, even if absolute performance might increase as we continue to stack layers, > the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36 > layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e., > params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params, > FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to > consider. 
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially. A sequence of word embeddings is therefore processed sequentially by each transformer block. ## Model architecture details This model checkpoint - **t5-efficient-large** - is of model type **Large** with no variations. It has **737.72** million parameters and thus requires *ca.* **2950.9 MB** of memory in full precision (*fp32*) or **1475.45 MB** of memory in half precision (*fp16* or *bf16*). A summary of the *original* T5 model architectures can be seen here: | Model | nl (el/dl) | ff | dm | kv | nh | #Params| | ----| ---- | ---- | ---- | ---- | ---- | ----| | Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M| | Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M| | Small | 6/6 | 2048 | 512 | 32 | 8 | 60M| | Base | 12/12 | 3072 | 768 | 64 | 12 | 220M| | Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M| | Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B| | XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B| whereas the following abbreviations are used: | Abbreviation | Definition | | ----| ---- | | nl | Number of transformer blocks (depth) | | dm | Dimension of embedding vector (output vector of transformer block) | | kv | Dimension of key/value projection matrix | | nh | Number of attention heads | | ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) | | el | Number of transformer blocks in the encoder (encoder depth) | | dl | Number of transformer blocks in the decoder (decoder depth) | | sh | Signifies that attention heads are shared | | skv | Signifies that key-values projection matrices are tied | If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*. ## Pre-Training The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using the span-based masked language modeling (MLM) objective. 
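The memory figures quoted in the architecture details above follow directly from the parameter count: 4 bytes per parameter in fp32 and 2 in fp16/bf16, with 1 MB counted as 10**6 bytes. A quick sanity check (the helper name is ours, not part of any library):

```python
def weight_memory_mb(num_params, bytes_per_param):
    # Raw storage for the weights alone (no activations or optimizer
    # state), using 1 MB = 10**6 bytes as in the figures above.
    return num_params * bytes_per_param / 1e6

params = 737.72e6  # t5-efficient-large parameter count from this card
fp32_mb = weight_memory_mb(params, 4)  # ~2950.9 MB, matching the card
fp16_mb = weight_memory_mb(params, 2)  # ~1475.4 MB
```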
## Fine-Tuning **Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage. The checkpoint was pretrained in English and is therefore only useful for English NLP tasks. You can follow one of the following examples on how to fine-tune the model: *PyTorch*: - [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization) - [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py) - [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model. *TensorFlow*: - [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization) - [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model. *JAX/Flax*: - [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization) - [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model. ## Downstream Performance TODO: Add table if available ## Computational Complexity TODO: Add table if available ## More information We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint. 
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv* model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
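For readers who want to sanity-check the memory figures quoted in the architecture details above, the footprint follows directly from the parameter count and the bytes per parameter. This is a back-of-the-envelope sketch added for illustration, not part of the original card:

```python
def checkpoint_size_mb(n_params: float, bytes_per_param: int) -> float:
    """Approximate checkpoint size in decimal megabytes (bytes / 1e6)."""
    return n_params * bytes_per_param / 1e6

N_PARAMS = 737.72e6  # t5-efficient-large, from the architecture table above

fp32_mb = checkpoint_size_mb(N_PARAMS, 4)  # full precision
fp16_mb = checkpoint_size_mb(N_PARAMS, 2)  # half precision (fp16/bf16)

print(f"fp32: {fp32_mb:.2f} MB, fp16: {fp16_mb:.2f} MB")
```

The tiny differences against the quoted 2950.9 MB and 1475.45 MB come from the parameter count itself being rounded to two decimals.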
null
Non_BioNLP
# T5-Efficient-LARGE (Deep-Narrow version) T5-Efficient-LARGE is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5). It is a *pretrained-only* checkpoint and was released with the paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*. In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures of similar parameter count. To quote the paper: > We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased > before considering any other forms of uniform scaling across other dimensions. This is largely due to > how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a > tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise, > a tall base model might also generally more efficient compared to a large model. We generally find > that, regardless of size, even if absolute performance might increase as we continue to stack layers, > the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36 > layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e., > params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params, > FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to > consider. To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially. 
A sequence of word embeddings is therefore processed sequentially by each transformer block. ## Details model architecture This model checkpoint - **t5-efficient-large** - is of model type **Large** with no variations. It has **737.72** million parameters and thus requires *ca.* **2950.9 MB** of memory in full precision (*fp32*) or **1475.45 MB** of memory in half precision (*fp16* or *bf16*). A summary of the *original* T5 model architectures can be seen here: | Model | nl (el/dl) | ff | dm | kv | nh | #Params| | ----| ---- | ---- | ---- | ---- | ---- | ----| | Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M| | Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M| | Small | 6/6 | 2048 | 512 | 32 | 8 | 60M| | Base | 12/12 | 3072 | 768 | 64 | 12 | 220M| | Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M| | Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B| | XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B| where the following abbreviations are used: | Abbreviation | Definition | | ----| ---- | | nl | Number of transformer blocks (depth) | | dm | Dimension of embedding vector (output vector of a transformer block) | | kv | Dimension of key/value projection matrix | | nh | Number of attention heads | | ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) | | el | Number of transformer blocks in the encoder (encoder depth) | | dl | Number of transformer blocks in the decoder (decoder depth) | | sh | Signifies that attention heads are shared | | skv | Signifies that key-value projection matrices are tied | If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*. ## Pre-Training The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using the span-based masked language modeling (MLM) objective. ## Fine-Tuning **Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks. You can follow one of the following examples on how to fine-tune the model: *PyTorch*: - [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization) - [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py) - [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model. *Tensorflow*: - [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization) - [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model. *JAX/Flax*: - [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization) - [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model. ## Downstream Performance TODO: Add table if available ## Computational Complexity TODO: Add table if available ## More information We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv* model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
{"datasets": ["c4"], "language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "inference": false}
task
[ "TEXT_CLASSIFICATION", "QUESTION_ANSWERING", "SUMMARIZATION" ]
42,882
mervekasap/bert-emotion
mervekasap
text-classification
[ "transformers", "pytorch", "tensorboard", "distilbert", "text-classification", "generated_from_trainer", "dataset:tweet_eval", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-01-05T19:40:53Z
2023-01-05T19:56:43+00:00
14
0
--- datasets: - tweet_eval license: apache-2.0 metrics: - precision - recall tags: - generated_from_trainer model-index: - name: bert-emotion results: - task: type: text-classification name: Text Classification dataset: name: tweet_eval type: tweet_eval config: emotion split: train args: emotion metrics: - type: precision value: 0.6838345542908288 name: Precision - type: recall value: 0.6974690918154589 name: Recall --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-emotion This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the tweet_eval dataset. It achieves the following results on the evaluation set: - Loss: 1.2572 - Precision: 0.6838 - Recall: 0.6975 - Fscore: 0.6888 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | Fscore | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:| | 0.8652 | 1.0 | 815 | 0.6991 | 0.7314 | 0.6514 | 0.6755 | | 0.5486 | 2.0 | 1630 | 0.9158 | 0.7387 | 0.6764 | 0.6983 | | 0.2794 | 3.0 | 2445 | 1.2572 | 0.6838 | 0.6975 | 0.6888 | ### Framework versions - Transformers 4.25.1 - Pytorch 1.13.0+cu116 - Datasets 2.8.0 - Tokenizers 0.13.2
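A small caveat when reading the metrics above: the reported macro F-score (0.6888) is not the harmonic mean of the reported macro precision and recall, because macro F1 is computed per class and then averaged. The sketch below (added for illustration, using the card's own numbers) makes the difference visible:

```python
def harmonic_mean_f1(precision: float, recall: float) -> float:
    """F1 as the harmonic mean of a single precision/recall pair."""
    return 2 * precision * recall / (precision + recall)

precision = 0.6838345542908288  # macro precision reported above
recall = 0.6974690918154589     # macro recall reported above

f1 = harmonic_mean_f1(precision, recall)
print(f"{f1:.4f}")  # ~0.6906, slightly above the reported macro F-score of 0.6888
```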
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-emotion This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the tweet_eval dataset. It achieves the following results on the evaluation set: - Loss: 1.2572 - Precision: 0.6838 - Recall: 0.6975 - Fscore: 0.6888 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | Fscore | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:| | 0.8652 | 1.0 | 815 | 0.6991 | 0.7314 | 0.6514 | 0.6755 | | 0.5486 | 2.0 | 1630 | 0.9158 | 0.7387 | 0.6764 | 0.6983 | | 0.2794 | 3.0 | 2445 | 1.2572 | 0.6838 | 0.6975 | 0.6888 | ### Framework versions - Transformers 4.25.1 - Pytorch 1.13.0+cu116 - Datasets 2.8.0 - Tokenizers 0.13.2
{"datasets": ["tweet_eval"], "license": "apache-2.0", "metrics": ["precision", "recall"], "tags": ["generated_from_trainer"], "model-index": [{"name": "bert-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "tweet_eval", "type": "tweet_eval", "config": "emotion", "split": "train", "args": "emotion"}, "metrics": [{"type": "precision", "value": 0.6838345542908288, "name": "Precision"}, {"type": "recall", "value": 0.6974690918154589, "name": "Recall"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,883
Krystalan/mdialbart_zh
Krystalan
text2text-generation
[ "transformers", "pytorch", "mbart", "text2text-generation", "arxiv:2202.05599", "license:cc-by-nc-sa-4.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04Z
2022-02-24T12:11:13+00:00
180
1
--- license: cc-by-nc-sa-4.0 --- ## mDialBART: A Cross-Lingual Dialogue Summarization Model This model is introduced by [*ClidSum: A Benchmark Dataset for Cross-Lingual Dialogue Summarization*](https://arxiv.org/abs/2202.05599).
null
Non_BioNLP
## mDialBART: A Cross-Lingual Dialogue Summarization Model This model is introduced by [*ClidSum: A Benchmark Dataset for Cross-Lingual Dialogue Summarization*](https://arxiv.org/abs/2202.05599).
{"license": "cc-by-nc-sa-4.0"}
task
[ "SUMMARIZATION" ]
42,884
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task644
Lots-of-LoRAs
null
[ "pytorch", "safetensors", "en", "arxiv:1910.09700", "arxiv:2407.00066", "base_model:mistralai/Mistral-7B-Instruct-v0.2", "base_model:finetune:mistralai/Mistral-7B-Instruct-v0.2", "license:mit", "region:us" ]
2025-01-05T14:10:00Z
2025-01-05T14:10:06+00:00
0
0
--- base_model: mistralai/Mistral-7B-Instruct-v0.2 language: en library_name: pytorch license: mit --- # Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task644 <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> LoRA trained on task644_refresd_translation - **Developed by:** bruel - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** LoRA - **Language(s) (NLP):** en - **License:** mit - **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2 ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** https://github.com/bruel-gabrielsson - **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. 
--> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> https://huggingface.co/datasets/Lots-of-LoRAs/task644_refresd_translation sourced from https://github.com/allenai/natural-instructions ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** @misc{brüelgabrielsson2024compressserveservingthousands, title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead}, author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon}, year={2024}, eprint={2407.00066}, archivePrefix={arXiv}, primaryClass={cs.DC}, url={https://arxiv.org/abs/2407.00066}, } **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. 
--> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
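For intuition about how small a rank-16 adapter is, a LoRA pair on a d_out × d_in linear layer adds r·(d_in + d_out) parameters. The numbers below are a generic illustration only — Mistral-7B's hidden size of 4096 and a square projection are assumptions, and the card does not state which modules were adapted:

```python
def lora_params(d_in: int, d_out: int, r: int) -> int:
    """Parameters added by one rank-r LoRA pair (A: r x d_in, B: d_out x r)."""
    return r * d_in + d_out * r

# Hypothetical example: one square 4096 x 4096 projection adapted at r=16.
extra = lora_params(d_in=4096, d_out=4096, r=16)
print(extra)  # 131072 trainable parameters, vs. ~16.8M in the frozen weight
```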
null
Non_BioNLP
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task644 <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> LoRA trained on task644_refresd_translation - **Developed by:** bruel - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** LoRA - **Language(s) (NLP):** en - **License:** mit - **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2 ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** https://github.com/bruel-gabrielsson - **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> https://huggingface.co/datasets/Lots-of-LoRAs/task644_refresd_translation sourced from https://github.com/allenai/natural-instructions ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** @misc{brüelgabrielsson2024compressserveservingthousands, title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead}, author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon}, year={2024}, eprint={2407.00066}, archivePrefix={arXiv}, primaryClass={cs.DC}, url={https://arxiv.org/abs/2407.00066}, } **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"base_model": "mistralai/Mistral-7B-Instruct-v0.2", "language": "en", "library_name": "pytorch", "license": "mit"}
task
[ "TRANSLATION" ]
42,885
balus/distilbert-base-uncased-finetuned-clinc
balus
text-classification
[ "transformers", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:clinc_oos", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-12-20T08:52:42Z
2023-12-22T06:53:14+00:00
89
0
--- base_model: distilbert-base-uncased datasets: - clinc_oos license: apache-2.0 metrics: - accuracy tags: - generated_from_trainer model-index: - name: distilbert-base-uncased-finetuned-clinc results: - task: type: text-classification name: Text Classification dataset: name: clinc_oos type: clinc_oos config: plus split: validation args: plus metrics: - type: accuracy value: 0.9174193548387096 name: Accuracy --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-clinc This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset. It achieves the following results on the evaluation set: - Loss: 0.8027 - Accuracy: 0.9174 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 48 - eval_batch_size: 48 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 4.3197 | 1.0 | 318 | 3.3202 | 0.7174 | | 2.6708 | 2.0 | 636 | 1.9119 | 0.8539 | | 1.589 | 3.0 | 954 | 1.1932 | 0.8952 | | 1.0456 | 4.0 | 1272 | 0.8883 | 0.9110 | | 0.8265 | 5.0 | 1590 | 0.8027 | 0.9174 | ### Framework versions - Transformers 4.36.1 - Pytorch 2.1.2+cu121 - Datasets 2.15.0 - Tokenizers 0.15.0
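The 318 steps per epoch in the results table are consistent with the stated batch size and the size of the clinc_oos `plus` training split (15,250 examples — a figure taken from the dataset documentation, not from this card). A quick check:

```python
import math

train_examples = 15_250  # clinc_oos "plus" train split (assumed)
batch_size = 48          # from the hyperparameters above

steps_per_epoch = math.ceil(train_examples / batch_size)
print(steps_per_epoch)  # 318, matching the Step column (318, 636, ..., 1590)
```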
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-clinc This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset. It achieves the following results on the evaluation set: - Loss: 0.8027 - Accuracy: 0.9174 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 48 - eval_batch_size: 48 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 4.3197 | 1.0 | 318 | 3.3202 | 0.7174 | | 2.6708 | 2.0 | 636 | 1.9119 | 0.8539 | | 1.589 | 3.0 | 954 | 1.1932 | 0.8952 | | 1.0456 | 4.0 | 1272 | 0.8883 | 0.9110 | | 0.8265 | 5.0 | 1590 | 0.8027 | 0.9174 | ### Framework versions - Transformers 4.36.1 - Pytorch 2.1.2+cu121 - Datasets 2.15.0 - Tokenizers 0.15.0
{"base_model": "distilbert-base-uncased", "datasets": ["clinc_oos"], "license": "apache-2.0", "metrics": ["accuracy"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-finetuned-clinc", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "clinc_oos", "type": "clinc_oos", "config": "plus", "split": "validation", "args": "plus"}, "metrics": [{"type": "accuracy", "value": 0.9174193548387096, "name": "Accuracy"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,886
BIFOLD-BigEarthNetv2-0/convmixer_768_32-s2-v0.2.0
BIFOLD-BigEarthNetv2-0
image-classification
[ "configilm", "safetensors", "convmixer_768_32", "BigEarthNet v2.0", "Remote Sensing", "Classification", "image-classification", "Multispectral", "arxiv:2407.03653", "license:mit", "region:us" ]
2024-10-11T12:40:58Z
2025-03-14T07:09:11+00:00
15
0
--- library_name: configilm license: mit tags: - convmixer_768_32 - BigEarthNet v2.0 - Remote Sensing - Classification - image-classification - Multispectral thumbnail: https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/RSiM_Logo_1.png widget: - src: example.png example_title: Example output: - label: Agro-forestry areas score: 0.0 - label: Arable land score: 0.0 - label: Beaches, dunes, sands score: 0.0 - label: Broad-leaved forest score: 0.0 - label: Coastal wetlands score: 0.0 --- [TU Berlin](https://www.tu.berlin/) | [RSiM](https://rsim.berlin/) | [DIMA](https://www.dima.tu-berlin.de/menue/database_systems_and_information_management_group/) | [BigEarth](http://www.bigearth.eu/) | [BIFOLD](https://bifold.berlin/) :---:|:---:|:---:|:---:|:---: <a href="https://www.tu.berlin/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/tu-berlin-logo-long-red.svg" style="font-size: 1rem; height: 2em; width: auto" alt="TU Berlin Logo"/> | <a href="https://rsim.berlin/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/RSiM_Logo_1.png" style="font-size: 1rem; height: 2em; width: auto" alt="RSiM Logo"> | <a href="https://www.dima.tu-berlin.de/menue/database_systems_and_information_management_group/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/DIMA.png" style="font-size: 1rem; height: 2em; width: auto" alt="DIMA Logo"> | <a href="http://www.bigearth.eu/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/BigEarth.png" style="font-size: 1rem; height: 2em; width: auto" alt="BigEarth Logo"> | <a href="https://bifold.berlin/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/BIFOLD_Logo_farbig.png" style="font-size: 1rem; height: 2em; width: auto; margin-right: 1em" alt="BIFOLD Logo"> # Convmixer_768_32 pretrained on BigEarthNet v2.0 using Sentinel-2 bands <!-- Optional images --> <!-- 
[Sentinel-1](https://sentinel.esa.int/web/sentinel/missions/sentinel-1) | [Sentinel-2](https://sentinel.esa.int/web/sentinel/missions/sentinel-2) :---:|:---: <a href="https://sentinel.esa.int/web/sentinel/missions/sentinel-1"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/sentinel_2.jpg" style="font-size: 1rem; height: 10em; width: auto; margin-right: 1em" alt="Sentinel-2 Satellite"/> | <a href="https://sentinel.esa.int/web/sentinel/missions/sentinel-2"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/sentinel_1.jpg" style="font-size: 1rem; height: 10em; width: auto; margin-right: 1em" alt="Sentinel-1 Satellite"/> --> This model was trained on the BigEarthNet v2.0 (also known as reBEN) dataset using the Sentinel-2 bands. It was trained using the following parameters: - Number of epochs: up to 100 (with early stopping after 5 epochs of no improvement based on validation average precision macro) - Batch size: 512 - Learning rate: 0.001 - Dropout rate: 0.15 - Drop Path rate: 0.15 - Learning rate scheduler: LinearWarmupCosineAnnealing for 1000 warmup steps - Optimizer: AdamW - Seed: 24 The weights published in this model card were obtained after 21 training epochs. For more information, please visit the [official BigEarthNet v2.0 (reBEN) repository](https://git.tu-berlin.de/rsim/reben-training-scripts), where you can find the training scripts. 
![[BigEarthNet](http://bigearth.net/)](https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/combined_2000_600_2020_0_wide.jpg) The model was evaluated on the test set of the BigEarthNet v2.0 dataset with the following results: | Metric | Macro | Micro | |:------------------|------------------:|------------------:| | Average Precision | 0.679161 | 0.853470 | | F1 Score | 0.625087 | 0.750327 | | Precision | 0.706390 | 0.801895 | # Example | A Sentinel-2 image (true color representation) | |:---------------------------------------------------:| | ![[BigEarthNet](http://bigearth.net/)](example.png) | | Class labels | Predicted scores | |:--------------------------------------------------------------------------|--------------------------------------------------------------------------:| | <p> Agro-forestry areas <br> Arable land <br> Beaches, dunes, sands <br> ... <br> Urban fabric </p> | <p> 0.000000 <br> 0.000000 <br> 0.000000 <br> ... <br> 0.000063 </p> | To use the model, download the code that defines the model architecture from the [official BigEarthNet v2.0 (reBEN) repository](https://git.tu-berlin.de/rsim/reben-training-scripts) and load the model using the code below. Note that you have to install [`configilm`](https://pypi.org/project/configilm/) to use the provided code. ```python from reben_publication.BigEarthNetv2_0_ImageClassifier import BigEarthNetv2_0_ImageClassifier model = BigEarthNetv2_0_ImageClassifier.from_pretrained("path_to/huggingface_model_folder") ``` e.g.
```python from reben_publication.BigEarthNetv2_0_ImageClassifier import BigEarthNetv2_0_ImageClassifier model = BigEarthNetv2_0_ImageClassifier.from_pretrained( "BIFOLD-BigEarthNetv2-0/convmixer_768_32-s2-v0.1.1") ``` If you use this model in your research or the provided code, please cite the following papers: ```bibtex @article{clasen2024refinedbigearthnet, title={reBEN: Refined BigEarthNet Dataset for Remote Sensing Image Analysis}, author={Clasen, Kai Norman and Hackel, Leonard and Burgert, Tom and Sumbul, Gencer and Demir, Beg{\"u}m and Markl, Volker}, year={2024}, eprint={2407.03653}, archivePrefix={arXiv}, primaryClass={cs.CV}, url={https://arxiv.org/abs/2407.03653}, } ``` ```bibtex @article{hackel2024configilm, title={ConfigILM: A general purpose configurable library for combining image and language models for visual question answering}, author={Hackel, Leonard and Clasen, Kai Norman and Demir, Beg{\"u}m}, journal={SoftwareX}, volume={26}, pages={101731}, year={2024}, publisher={Elsevier} } ```
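The LinearWarmupCosineAnnealing schedule listed in the training parameters above can be sketched as follows (a minimal illustration with assumed argument names, not the exact scheduler implementation from the training scripts):

```python
import math

def lr_at(step: int, base_lr: float, warmup_steps: int, total_steps: int) -> float:
    """Linear warmup to base_lr, then cosine annealing down to zero."""
    if step < warmup_steps:
        # Linear ramp from 0 to base_lr over the warmup phase.
        return base_lr * step / warmup_steps
    # Cosine decay from base_lr to 0 over the remaining steps.
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

# With the settings above (learning rate 0.001, 1000 warmup steps):
print(lr_at(500, 0.001, 1000, 10_000))   # mid-warmup: 0.0005
print(lr_at(1000, 0.001, 1000, 10_000))  # end of warmup: 0.001
```

`total_steps` here is an arbitrary stand-in; the actual number depends on the dataset size, batch size, and early stopping.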
null
Non_BioNLP
[TU Berlin](https://www.tu.berlin/) | [RSiM](https://rsim.berlin/) | [DIMA](https://www.dima.tu-berlin.de/menue/database_systems_and_information_management_group/) | [BigEarth](http://www.bigearth.eu/) | [BIFOLD](https://bifold.berlin/) :---:|:---:|:---:|:---:|:---: <a href="https://www.tu.berlin/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/tu-berlin-logo-long-red.svg" style="font-size: 1rem; height: 2em; width: auto" alt="TU Berlin Logo"/> | <a href="https://rsim.berlin/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/RSiM_Logo_1.png" style="font-size: 1rem; height: 2em; width: auto" alt="RSiM Logo"> | <a href="https://www.dima.tu-berlin.de/menue/database_systems_and_information_management_group/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/DIMA.png" style="font-size: 1rem; height: 2em; width: auto" alt="DIMA Logo"> | <a href="http://www.bigearth.eu/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/BigEarth.png" style="font-size: 1rem; height: 2em; width: auto" alt="BigEarth Logo"> | <a href="https://bifold.berlin/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/BIFOLD_Logo_farbig.png" style="font-size: 1rem; height: 2em; width: auto; margin-right: 1em" alt="BIFOLD Logo"> # Convmixer_768_32 pretrained on BigEarthNet v2.0 using Sentinel-2 bands <!-- Optional images --> <!-- [Sentinel-1](https://sentinel.esa.int/web/sentinel/missions/sentinel-1) | [Sentinel-2](https://sentinel.esa.int/web/sentinel/missions/sentinel-2) :---:|:---: <a href="https://sentinel.esa.int/web/sentinel/missions/sentinel-1"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/sentinel_2.jpg" style="font-size: 1rem; height: 10em; width: auto; margin-right: 1em" alt="Sentinel-2 Satellite"/> | <a href="https://sentinel.esa.int/web/sentinel/missions/sentinel-2"><img 
src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/sentinel_1.jpg" style="font-size: 1rem; height: 10em; width: auto; margin-right: 1em" alt="Sentinel-1 Satellite"/> --> This model was trained on the BigEarthNet v2.0 (also known as reBEN) dataset using the Sentinel-2 bands. It was trained using the following parameters: - Number of epochs: up to 100 (with early stopping after 5 epochs of no improvement based on validation average precision macro) - Batch size: 512 - Learning rate: 0.001 - Dropout rate: 0.15 - Drop Path rate: 0.15 - Learning rate scheduler: LinearWarmupCosineAnnealing for 1000 warmup steps - Optimizer: AdamW - Seed: 24 The weights published in this model card were obtained after 21 training epochs. For more information, please visit the [official BigEarthNet v2.0 (reBEN) repository](https://git.tu-berlin.de/rsim/reben-training-scripts), where you can find the training scripts. ![[BigEarthNet](http://bigearth.net/)](https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/combined_2000_600_2020_0_wide.jpg) The model was evaluated on the test set of the BigEarthNet v2.0 dataset with the following results: | Metric | Macro | Micro | |:------------------|------------------:|------------------:| | Average Precision | 0.679161 | 0.853470 | | F1 Score | 0.625087 | 0.750327 | | Precision | 0.706390 | 0.801895 | # Example | A Sentinel-2 image (true color representation) | |:---------------------------------------------------:| | ![[BigEarthNet](http://bigearth.net/)](example.png) | | Class labels | Predicted scores | |:--------------------------------------------------------------------------|--------------------------------------------------------------------------:| | <p> Agro-forestry areas <br> Arable land <br> Beaches, dunes, sands <br> ... <br> Urban fabric </p> | <p> 0.000000 <br> 0.000000 <br> 0.000000 <br> ... 
<br> 0.000063 </p> | To use the model, download the codes that define the model architecture from the [official BigEarthNet v2.0 (reBEN) repository](https://git.tu-berlin.de/rsim/reben-training-scripts) and load the model using the code below. Note that you have to install [`configilm`](https://pypi.org/project/configilm/) to use the provided code. ```python from reben_publication.BigEarthNetv2_0_ImageClassifier import BigEarthNetv2_0_ImageClassifier model = BigEarthNetv2_0_ImageClassifier.from_pretrained("path_to/huggingface_model_folder") ``` e.g. ```python from reben_publication.BigEarthNetv2_0_ImageClassifier import BigEarthNetv2_0_ImageClassifier model = BigEarthNetv2_0_ImageClassifier.from_pretrained( "BIFOLD-BigEarthNetv2-0/convmixer_768_32-s2-v0.1.1") ``` If you use this model in your research or the provided code, please cite the following papers: ```bibtex @article{clasen2024refinedbigearthnet, title={reBEN: Refined BigEarthNet Dataset for Remote Sensing Image Analysis}, author={Clasen, Kai Norman and Hackel, Leonard and Burgert, Tom and Sumbul, Gencer and Demir, Beg{\"u}m and Markl, Volker}, year={2024}, eprint={2407.03653}, archivePrefix={arXiv}, primaryClass={cs.CV}, url={https://arxiv.org/abs/2407.03653}, } ``` ```bibtex @article{hackel2024configilm, title={ConfigILM: A general purpose configurable library for combining image and language models for visual question answering}, author={Hackel, Leonard and Clasen, Kai Norman and Demir, Beg{\"u}m}, journal={SoftwareX}, volume={26}, pages={101731}, year={2024}, publisher={Elsevier} } ```
{"library_name": "configilm", "license": "mit", "tags": ["convmixer_768_32", "BigEarthNet v2.0", "Remote Sensing", "Classification", "image-classification", "Multispectral"], "thumbnail": "https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/RSiM_Logo_1.png", "widget": [{"src": "example.png", "example_title": "Example", "output": [{"label": "Agro-forestry areas", "score": 0.0}, {"label": "Arable land", "score": 0.0}, {"label": "Beaches, dunes, sands", "score": 0.0}, {"label": "Broad-leaved forest", "score": 0.0}, {"label": "Coastal wetlands", "score": 0.0}]}]}
task
[ "QUESTION_ANSWERING" ]
42,887
rithuparan07/ai_summarizer
rithuparan07
summarization
[ "diffusers", "legal", "text-generation-inference", "transformers", "rust", "inference-endpoint", "summarization", "dataset:fka/awesome-chatgpt-prompts", "dataset:gopipasala/fka-awesome-chatgpt-prompts", "base_model:meta-llama/Llama-3.2-11B-Vision-Instruct", "base_model:finetune:meta-llama/Llama-3.2-11B-Vision-Instruct", "license:mit", "region:us" ]
2024-10-07T07:40:09Z
2024-10-09T10:44:11+00:00
0
1
--- base_model: - meta-llama/Llama-3.2-11B-Vision-Instruct datasets: - fka/awesome-chatgpt-prompts - gopipasala/fka-awesome-chatgpt-prompts library_name: diffusers license: mit metrics: - character pipeline_tag: summarization tags: - legal - text-generation-inference - transformers - rust - inference-endpoint new_version: meta-llama/Llama-3.1-8B-Instruct --- Model Overview Section: Add a brief paragraph summarizing the model’s purpose, what makes it unique, and its intended users. For example: This model, developed by Rithu Paran, is designed to provide high-quality text summarization, making it ideal for applications in content curation, news summarization, and document analysis. Leveraging the Meta-Llama architecture, it delivers accurate, concise summaries while maintaining key information, and is optimized for general-purpose use. 2. Model Description: Under Model Type, clarify the model's focus on general text summarization or a specific summarization task (e.g., long-form content, news). Update Language(s) with more detail on the model's primary language capabilities. 4. Model Use Cases: Expand Direct Use and Out-of-Scope Use with specific examples to guide users. Direct Use: News article summarization, summarizing reports for quick insights, content summarization for educational purposes. Out-of-Scope Use: Avoid using it for legal or medical content without specialized training. 6. Bias, Risks, and Limitations: Include any known biases related to the datasets used. For example, “The model may reflect certain cultural or societal biases present in the training data.” Add a note on limitations in terms of accuracy for complex technical summaries or if the model occasionally generates nonsensical summaries. 8. How to Get Started with the Model: Add more usage tips, such as how to adjust parameters for different summary lengths. Example: `summary = summarizer(text, max_length=150, min_length=50, do_sample=False)` 10. Training Details: In Training Hyperparameters, provide a rationale for the chosen batch size and learning rate. If you have insights into why AdamW was chosen as the optimizer, it would be helpful to include that too. 12. Environmental Impact: Add a short sentence on the steps taken to minimize the environmental impact, if applicable. 14. Evaluation: If possible, include the exact ROUGE and BLEU scores to show the model’s summarization performance. 15. Additional Information: You could add a Future Work or Planned Improvements section if you plan to enhance the model further. In the Contact section, you might mention if you are open to feedback, bug reports, or contributions. Here’s a short sample revision for the Model Details section: Model Details Model Description This model by Rithu Paran focuses on text summarization, reducing lengthy content into concise summaries. Built on the Meta-Llama architecture, it has been fine-tuned to effectively capture key points from general text sources. Purpose: General-purpose text summarization Developer: Rithu Paran Architecture: Transformer-based Llama-3 Language: Primarily English Model Versions Base Model: Meta-Llama/Llama-3.2-11B-Vision-Instruct Current Fine-tuned Model: Meta-Llama/Llama-3.1-8B-Instruct For the full model card, keep these ideas in mind and feel free to customize it further to fit your style! Let me know if you’d like more specific revisions.
null
Non_BioNLP
Model Overview Section: Add a brief paragraph summarizing the model’s purpose, what makes it unique, and its intended users. For example: This model, developed by Rithu Paran, is designed to provide high-quality text summarization, making it ideal for applications in content curation, news summarization, and document analysis. Leveraging the Meta-Llama architecture, it delivers accurate, concise summaries while maintaining key information, and is optimized for general-purpose use. 2. Model Description: Under Model Type, clarify the model's focus on general text summarization or a specific summarization task (e.g., long-form content, news). Update Language(s) with more detail on the model's primary language capabilities. 4. Model Use Cases: Expand Direct Use and Out-of-Scope Use with specific examples to guide users. Direct Use: News article summarization, summarizing reports for quick insights, content summarization for educational purposes. Out-of-Scope Use: Avoid using it for legal or medical content without specialized training. 6. Bias, Risks, and Limitations: Include any known biases related to the datasets used. For example, “The model may reflect certain cultural or societal biases present in the training data.” Add a note on limitations in terms of accuracy for complex technical summaries or if the model occasionally generates nonsensical summaries. 8. How to Get Started with the Model: Add more usage tips, such as how to adjust parameters for different summary lengths. Example: `summary = summarizer(text, max_length=150, min_length=50, do_sample=False)` 10. Training Details: In Training Hyperparameters, provide a rationale for the chosen batch size and learning rate. If you have insights into why AdamW was chosen as the optimizer, it would be helpful to include that too. 12. Environmental Impact: Add a short sentence on the steps taken to minimize the environmental impact, if applicable. 14. Evaluation: If possible, include the exact ROUGE and BLEU scores to show the model’s summarization performance. 15. Additional Information: You could add a Future Work or Planned Improvements section if you plan to enhance the model further. In the Contact section, you might mention if you are open to feedback, bug reports, or contributions. Here’s a short sample revision for the Model Details section: Model Details Model Description This model by Rithu Paran focuses on text summarization, reducing lengthy content into concise summaries. Built on the Meta-Llama architecture, it has been fine-tuned to effectively capture key points from general text sources. Purpose: General-purpose text summarization Developer: Rithu Paran Architecture: Transformer-based Llama-3 Language: Primarily English Model Versions Base Model: Meta-Llama/Llama-3.2-11B-Vision-Instruct Current Fine-tuned Model: Meta-Llama/Llama-3.1-8B-Instruct For the full model card, keep these ideas in mind and feel free to customize it further to fit your style! Let me know if you’d like more specific revisions.
{"base_model": ["meta-llama/Llama-3.2-11B-Vision-Instruct"], "datasets": ["fka/awesome-chatgpt-prompts", "gopipasala/fka-awesome-chatgpt-prompts"], "library_name": "diffusers", "license": "mit", "metrics": ["character"], "pipeline_tag": "summarization", "tags": ["legal", "text-generation-inference", "transformers", "rust", "inference-endpoint"], "new_version": "meta-llama/Llama-3.1-8B-Instruct"}
task
[ "SUMMARIZATION" ]
42,888
ThomasLI/distilbert-base-uncased-finetuned-cola
ThomasLI
text-classification
[ "transformers", "pytorch", "tensorboard", "distilbert", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-03-01T10:12:17Z
2023-03-01T10:40:14+00:00
20
0
--- datasets: - glue license: apache-2.0 metrics: - matthews_correlation tags: - generated_from_trainer model-index: - name: distilbert-base-uncased-finetuned-cola results: - task: type: text-classification name: Text Classification dataset: name: glue type: glue config: cola split: validation args: cola metrics: - type: matthews_correlation value: 0.5252216970032684 name: Matthews Correlation --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-cola This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.8586 - Matthews Correlation: 0.5252 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | |:-------------:|:-----:|:----:|:---------------:|:--------------------:| | 0.5293 | 1.0 | 535 | 0.5075 | 0.4325 | | 0.3471 | 2.0 | 1070 | 0.5048 | 0.5060 | | 0.2349 | 3.0 | 1605 | 0.5762 | 0.4979 | | 0.1829 | 4.0 | 2140 | 0.7848 | 0.5093 | | 0.1343 | 5.0 | 2675 | 0.8586 | 0.5252 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
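For reference, the Matthews correlation used as the evaluation metric above is computed from the binary confusion matrix. A minimal sketch of the formula (illustration only — the reported score comes from the standard GLUE metric, and the helper name below is my own):

```python
import math

def matthews_corrcoef(tp: int, tn: int, fp: int, fn: int) -> float:
    """MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))."""
    numerator = tp * tn - fp * fn
    denominator = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Convention: MCC is 0 when any marginal count is zero.
    return numerator / denominator if denominator else 0.0

print(matthews_corrcoef(50, 50, 0, 0))    # perfect predictions -> 1.0
print(matthews_corrcoef(25, 25, 25, 25))  # chance-level predictions -> 0.0
```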
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-cola This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.8586 - Matthews Correlation: 0.5252 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | |:-------------:|:-----:|:----:|:---------------:|:--------------------:| | 0.5293 | 1.0 | 535 | 0.5075 | 0.4325 | | 0.3471 | 2.0 | 1070 | 0.5048 | 0.5060 | | 0.2349 | 3.0 | 1605 | 0.5762 | 0.4979 | | 0.1829 | 4.0 | 2140 | 0.7848 | 0.5093 | | 0.1343 | 5.0 | 2675 | 0.8586 | 0.5252 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
{"datasets": ["glue"], "license": "apache-2.0", "metrics": ["matthews_correlation"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-finetuned-cola", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "config": "cola", "split": "validation", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.5252216970032684, "name": "Matthews Correlation"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,889
PrunaAI/Translation-EnKo-exaone3-instrucTrans-v2-enko-7.8b-bnb-8bit-smashed
PrunaAI
null
[ "safetensors", "llama", "pruna-ai", "base_model:Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b", "base_model:quantized:Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b", "8-bit", "bitsandbytes", "region:us" ]
2024-12-14T14:33:14Z
2024-12-14T14:43:53+00:00
4
0
--- base_model: Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b metrics: - memory_disk - memory_inference - inference_latency - inference_throughput - inference_CO2_emissions - inference_energy_consumption tags: - pruna-ai thumbnail: https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg --- <!-- header start --> <!-- 200823 --> <div style="width: auto; margin-left: auto; margin-right: auto"> <a href="https://docs.pruna.ai/en/latest/setup/pip.html" target="_blank" rel="noopener noreferrer"> <img src="https://imgur.com/rVAgqMY.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;"> </a> </div> <!-- header end --> [![Twitter](https://img.shields.io/twitter/follow/PrunaAI?style=social)](https://twitter.com/PrunaAI) [![GitHub](https://img.shields.io/github/followers/PrunaAI?label=Follow%20%40PrunaAI&style=social)](https://github.com/PrunaAI) [![LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue)](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following) [![Discord](https://img.shields.io/badge/Discord-Join%20Us-blue?style=social&logo=discord)](https://discord.gg/rskEr4BZJx) # Simply make AI models cheaper, smaller, faster, and greener! - Give a thumbs up if you like this model! - Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact). - Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai). - Read the documentations to know more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/) - Join Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help. ## Results ![image info](./plots.png) **Frequently Asked Questions** - ***How does the compression work?*** The model is compressed with llm-int8. 
- ***How does the model quality change?*** The quality of the model output might vary compared to the base model. - ***How is the model efficiency evaluated?*** These results were obtained with the configuration described in `model/smash_config.json` and are obtained after a hardware warmup. The smashed model is directly compared to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend running them directly in the use-case conditions to know if the smashed model can benefit you. - ***What is the model format?*** We use safetensors. - ***What calibration data has been used?*** If needed by the compression method, we used WikiText as the calibration data. - ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model. - ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai). - ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due to CUDA overheads. - ***What are "Sync" and "Async" metrics?*** "Sync" metrics are obtained by syncing all GPU processes and stopping the measurement when all of them are executed. "Async" metrics are obtained without syncing all GPU processes and stop when the model output can be used by the CPU. We provide both metrics since both could be relevant depending on the use-case. We recommend testing the efficiency gains directly in your use-cases. ## Setup You can run the smashed model with these steps: 0.
Check that the requirements from the original repo Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b are installed. In particular, check python, cuda, and transformers versions. 1. Make sure that you have installed quantization-related packages. ```bash pip install transformers accelerate bitsandbytes>0.37.0 ``` 2. Load & run the model. ```python from transformers import AutoModelForCausalLM, AutoTokenizer model = AutoModelForCausalLM.from_pretrained("PrunaAI/Translation-EnKo-exaone3-instrucTrans-v2-enko-7.8b-bnb-8bit-smashed", trust_remote_code=True, device_map='auto') tokenizer = AutoTokenizer.from_pretrained("Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b") input_ids = tokenizer("What is the color of prunes?,", return_tensors='pt').to(model.device)["input_ids"] outputs = model.generate(input_ids, max_new_tokens=216) tokenizer.decode(outputs[0]) ``` ## Configurations The configuration info is in `smash_config.json`. ## Credits & License The license of the smashed model follows the license of the original model. Please check the license of the original model Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b, which provided the base model, before using this model. The license of the `pruna-engine` is [here](https://pypi.org/project/pruna-engine/) on Pypi. ## Want to compress other models? - Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact). - Do it by yourself [here](https://docs.pruna.ai/en/latest/setup/pip.html).
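For intuition about the llm-int8 compression mentioned in the FAQ above, here is a minimal sketch of absmax int8 quantization (a simplified illustration with made-up helper names; the actual LLM.int8() method works per matrix multiply and additionally keeps outlier feature dimensions in higher precision):

```python
def absmax_quantize(values):
    """Map floats to int8 codes using a single absmax scale."""
    scale = max(abs(v) for v in values) / 127.0  # largest magnitude maps to 127
    codes = [max(-127, min(127, round(v / scale))) for v in values]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate floats from int8 codes."""
    return [c * scale for c in codes]

weights = [0.1, -0.5, 0.25, 1.27]
codes, scale = absmax_quantize(weights)
approx = dequantize(codes, scale)
print(max(abs(w - a) for w, a in zip(weights, approx)))  # small round-off error
```

The storage saving comes from keeping `codes` (one byte per value) plus a single scale, instead of 16- or 32-bit floats.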
null
Non_BioNLP
<!-- header start --> <!-- 200823 --> <div style="width: auto; margin-left: auto; margin-right: auto"> <a href="https://docs.pruna.ai/en/latest/setup/pip.html" target="_blank" rel="noopener noreferrer"> <img src="https://imgur.com/rVAgqMY.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;"> </a> </div> <!-- header end --> [![Twitter](https://img.shields.io/twitter/follow/PrunaAI?style=social)](https://twitter.com/PrunaAI) [![GitHub](https://img.shields.io/github/followers/PrunaAI?label=Follow%20%40PrunaAI&style=social)](https://github.com/PrunaAI) [![LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue)](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following) [![Discord](https://img.shields.io/badge/Discord-Join%20Us-blue?style=social&logo=discord)](https://discord.gg/rskEr4BZJx) # Simply make AI models cheaper, smaller, faster, and greener! - Give a thumbs up if you like this model! - Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact). - Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai). - Read the documentations to know more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/) - Join Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help. ## Results ![image info](./plots.png) **Frequently Asked Questions** - ***How does the compression work?*** The model is compressed with llm-int8. - ***How does the model quality change?*** The quality of the model output might vary compared to the base model. - ***How is the model efficiency evaluated?*** These results were obtained with configuration described in `model/smash_config.json` and are obtained after a hardware warmup. The smashed model is directly compared to the original base model. Efficiency results may vary in other settings (e.g. 
other hardware, image size, batch size, ...). We recommend running them directly in the use-case conditions to know if the smashed model can benefit you. - ***What is the model format?*** We use safetensors. - ***What calibration data has been used?*** If needed by the compression method, we used WikiText as the calibration data. - ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model. - ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai). - ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due to CUDA overheads. - ***What are "Sync" and "Async" metrics?*** "Sync" metrics are obtained by syncing all GPU processes and stopping the measurement when all of them are executed. "Async" metrics are obtained without syncing all GPU processes and stop when the model output can be used by the CPU. We provide both metrics since both could be relevant depending on the use-case. We recommend testing the efficiency gains directly in your use-cases. ## Setup You can run the smashed model with these steps: 0. Check that the requirements from the original repo Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b are installed. In particular, check python, cuda, and transformers versions. 1. Make sure that you have installed quantization-related packages. ```bash pip install transformers accelerate bitsandbytes>0.37.0 ``` 2. Load & run the model.
```python from transformers import AutoModelForCausalLM, AutoTokenizer model = AutoModelForCausalLM.from_pretrained("PrunaAI/Translation-EnKo-exaone3-instrucTrans-v2-enko-7.8b-bnb-8bit-smashed", trust_remote_code=True, device_map='auto') tokenizer = AutoTokenizer.from_pretrained("Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b") input_ids = tokenizer("What is the color of prunes?,", return_tensors='pt').to(model.device)["input_ids"] outputs = model.generate(input_ids, max_new_tokens=216) tokenizer.decode(outputs[0]) ``` ## Configurations The configuration info is in `smash_config.json`. ## Credits & License The license of the smashed model follows the license of the original model. Please check the license of the original model Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b, which provided the base model, before using this model. The license of the `pruna-engine` is [here](https://pypi.org/project/pruna-engine/) on Pypi. ## Want to compress other models? - Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact). - Do it by yourself [here](https://docs.pruna.ai/en/latest/setup/pip.html).
{"base_model": "Translation-EnKo/exaone3-instrucTrans-v2-enko-7.8b", "metrics": ["memory_disk", "memory_inference", "inference_latency", "inference_throughput", "inference_CO2_emissions", "inference_energy_consumption"], "tags": ["pruna-ai"], "thumbnail": "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg"}
task
[ "TRANSLATION" ]
42,890
kaixinwang/NLP
kaixinwang
text-classification
[ "transformers", "tf", "distilbert", "text-classification", "sentiment analysis", "STEM", "text classification", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05Z
2022-03-03T19:06:29+00:00
69
0
--- language: - Python tags: - sentiment analysis - STEM - text classification thumbnail: url to a thumbnail used in social sharing --- Welcome! This is a model built for sentiment analysis of STEM course reviews at UCLA. - Author: Kaixin Wang - Email: [email protected] - Time Updated: March 2022
null
Non_BioNLP
Welcome! This is a model built for sentiment analysis of STEM course reviews at UCLA. - Author: Kaixin Wang - Email: [email protected] - Time Updated: March 2022
{"language": ["Python"], "tags": ["sentiment analysis", "STEM", "text classification"], "thumbnail": "url to a thumbnail used in social sharing"}
task
[ "TEXT_CLASSIFICATION" ]
42,891
Helsinki-NLP/opus-mt-fi-srn
Helsinki-NLP
translation
[ "transformers", "pytorch", "tf", "marian", "text2text-generation", "translation", "fi", "srn", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04Z
2023-08-16T11:35:29+00:00
50
0
--- license: apache-2.0 tags: - translation --- ### opus-mt-fi-srn * source languages: fi * target languages: srn * OPUS readme: [fi-srn](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/fi-srn/README.md) * dataset: opus * model: transformer-align * pre-processing: normalization + SentencePiece * download original weights: [opus-2020-01-08.zip](https://object.pouta.csc.fi/OPUS-MT-models/fi-srn/opus-2020-01-08.zip) * test set translations: [opus-2020-01-08.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/fi-srn/opus-2020-01-08.test.txt) * test set scores: [opus-2020-01-08.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/fi-srn/opus-2020-01-08.eval.txt) ## Benchmarks | testset | BLEU | chr-F | |-----------------------|-------|-------| | JW300.fi.srn | 29.2 | 0.491 |
null
Non_BioNLP
### opus-mt-fi-srn * source languages: fi * target languages: srn * OPUS readme: [fi-srn](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/fi-srn/README.md) * dataset: opus * model: transformer-align * pre-processing: normalization + SentencePiece * download original weights: [opus-2020-01-08.zip](https://object.pouta.csc.fi/OPUS-MT-models/fi-srn/opus-2020-01-08.zip) * test set translations: [opus-2020-01-08.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/fi-srn/opus-2020-01-08.test.txt) * test set scores: [opus-2020-01-08.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/fi-srn/opus-2020-01-08.eval.txt) ## Benchmarks | testset | BLEU | chr-F | |-----------------------|-------|-------| | JW300.fi.srn | 29.2 | 0.491 |
{"license": "apache-2.0", "tags": ["translation"]}
task
[ "TRANSLATION" ]
42,892
xFahrenheit/autotrain-mbart25-3000-hin-en-50671120936
xFahrenheit
summarization
[ "transformers", "pytorch", "mbart", "text2text-generation", "autotrain", "summarization", "unk", "dataset:xFahrenheit/autotrain-data-mbart25-3000-hin-en", "co2_eq_emissions", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-04-19T03:50:18Z
2023-04-19T04:34:21+00:00
19
0
--- datasets: - xFahrenheit/autotrain-data-mbart25-3000-hin-en language: - unk tags: - autotrain - summarization widget: - text: I love AutoTrain 🤗 co2_eq_emissions: emissions: 16.74341806391351 --- # Model Trained Using AutoTrain - Problem type: Summarization - Model ID: 50671120936 - CO2 Emissions (in grams): 16.7434 ## Validation Metrics - Loss: 2.170 - Rouge1: 24.643 - Rouge2: 10.279 - RougeL: 19.196 - RougeLsum: 21.648 - Gen Len: 67.333 ## Usage You can use cURL to access this model: ``` $ curl -X POST -H "Authorization: Bearer YOUR_HUGGINGFACE_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/xFahrenheit/autotrain-mbart25-3000-hin-en-50671120936 ```
null
Non_BioNLP
# Model Trained Using AutoTrain - Problem type: Summarization - Model ID: 50671120936 - CO2 Emissions (in grams): 16.7434 ## Validation Metrics - Loss: 2.170 - Rouge1: 24.643 - Rouge2: 10.279 - RougeL: 19.196 - RougeLsum: 21.648 - Gen Len: 67.333 ## Usage You can use cURL to access this model: ``` $ curl -X POST -H "Authorization: Bearer YOUR_HUGGINGFACE_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/xFahrenheit/autotrain-mbart25-3000-hin-en-50671120936 ```
{"datasets": ["xFahrenheit/autotrain-data-mbart25-3000-hin-en"], "language": ["unk"], "tags": ["autotrain", "summarization"], "widget": [{"text": "I love AutoTrain 🤗"}], "co2_eq_emissions": {"emissions": 16.74341806391351}}
task
[ "SUMMARIZATION" ]
42,893
hopkins/eng-deu-delfy
hopkins
translation
[ "transformers", "pytorch", "tensorboard", "mbart", "text2text-generation", "translation", "generated_from_trainer", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-07-03T16:03:32Z
2023-07-03T16:49:33+00:00
10
0
--- metrics: - bleu tags: - translation - generated_from_trainer model-index: - name: eng-deu-delfy results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # eng-deu-delfy This model is a fine-tuned version of [facebook/mbart-large-50-many-to-many-mmt](https://huggingface.co/facebook/mbart-large-50-many-to-many-mmt) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.6917 - Bleu: 19.9632 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.26.1 - Pytorch 2.0.1+cu117 - Datasets 2.12.0 - Tokenizers 0.13.3
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # eng-deu-delfy This model is a fine-tuned version of [facebook/mbart-large-50-many-to-many-mmt](https://huggingface.co/facebook/mbart-large-50-many-to-many-mmt) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.6917 - Bleu: 19.9632 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.26.1 - Pytorch 2.0.1+cu117 - Datasets 2.12.0 - Tokenizers 0.13.3
{"metrics": ["bleu"], "tags": ["translation", "generated_from_trainer"], "model-index": [{"name": "eng-deu-delfy", "results": []}]}
task
[ "TRANSLATION" ]
42,894
uvegesistvan/wildmann_german_proposal_2b_german_to_slovak
uvegesistvan
null
[ "tensorboard", "safetensors", "xlm-roberta", "emotion-classification", "text-analysis", "machine-translation", "sk", "license:mit", "region:us" ]
2025-01-11T10:30:58Z
2025-01-13T18:16:21+00:00
9
0
--- language: sk license: mit metrics: - precision - recall - f1-score - accuracy tags: - emotion-classification - text-analysis - machine-translation --- # Model Card for uvegesistvan/wildmann_german_proposal_2b_german_to_slovak ## Model Overview This model is a multi-class emotion classifier trained on German-to-Slovak machine-translated text data. It identifies nine distinct emotional states in text. The model utilizes a diverse dataset, incorporating both synthetic and original German sentences translated into Slovak, highlighting its ability to generalize across linguistic variations introduced by machine translation. ### Emotion Classes The model classifies the following emotional states: - **Anger (0)** - **Fear (1)** - **Disgust (2)** - **Sadness (3)** - **Joy (4)** - **Enthusiasm (5)** - **Hope (6)** - **Pride (7)** - **No emotion (8)** ### Dataset and Preprocessing The dataset consists of German text machine-translated into Slovak and annotated for emotional content. It includes both synthetic and original sentences to enhance diversity. Preprocessing involved: - Undersampling of overrepresented classes, such as "No emotion" and "Anger," to ensure balanced training across all labels. ### Evaluation Metrics The model's performance was evaluated using precision, recall, F1-score, and accuracy metrics. 
Detailed results are as follows: | Class | Precision | Recall | F1-Score | Support | |---------------|-----------|--------|----------|---------| | Anger (0) | 0.54 | 0.58 | 0.56 | 777 | | Fear (1) | 0.84 | 0.78 | 0.81 | 776 | | Disgust (2) | 0.93 | 0.94 | 0.93 | 776 | | Sadness (3) | 0.85 | 0.84 | 0.84 | 775 | | Joy (4) | 0.82 | 0.80 | 0.81 | 777 | | Enthusiasm (5)| 0.62 | 0.64 | 0.63 | 776 | | Hope (6) | 0.53 | 0.54 | 0.54 | 777 | | Pride (7) | 0.74 | 0.79 | 0.77 | 776 | | No emotion (8)| 0.67 | 0.63 | 0.65 | 1553 | ### Overall Metrics - **Accuracy**: 0.72 - **Macro Average**: Precision = 0.73, Recall = 0.73, F1-Score = 0.73 - **Weighted Average**: Precision = 0.72, Recall = 0.72, F1-Score = 0.72 ### Performance Insights The model demonstrates robust performance in detecting "Fear" and "Disgust," while "Hope" and "Enthusiasm" show slightly lower performance due to subtleties in emotional expression and potential translation noise. These results reflect the complexities of training on machine-translated text. ## Model Usage ### Applications - Emotion analysis of German texts translated into Slovak for social research or sentiment tracking. - Research on cross-linguistic emotion classification in multilingual datasets. - Sentiment analysis for Slovak-language customer feedback derived from German text. ### Limitations - The model's performance depends on the quality of the machine-translated text. Translation errors or ambiguities may impact classification accuracy. - Subtle emotional expressions may be misclassified due to linguistic nuances being lost in translation. ### Ethical Considerations The use of machine-translated datasets introduces the possibility of biases or inaccuracies caused by the loss of cultural and linguistic subtleties during translation. Users should carefully evaluate the model's performance before applying it in sensitive contexts such as mental health, social studies, or customer sentiment analysis. 
### Citation For further information, visit: [uvegesistvan/wildmann_german_proposal_2b_german_to_slovak](#)
null
Non_BioNLP
# Model Card for uvegesistvan/wildmann_german_proposal_2b_german_to_slovak ## Model Overview This model is a multi-class emotion classifier trained on German-to-Slovak machine-translated text data. It identifies nine distinct emotional states in text. The model utilizes a diverse dataset, incorporating both synthetic and original German sentences translated into Slovak, highlighting its ability to generalize across linguistic variations introduced by machine translation. ### Emotion Classes The model classifies the following emotional states: - **Anger (0)** - **Fear (1)** - **Disgust (2)** - **Sadness (3)** - **Joy (4)** - **Enthusiasm (5)** - **Hope (6)** - **Pride (7)** - **No emotion (8)** ### Dataset and Preprocessing The dataset consists of German text machine-translated into Slovak and annotated for emotional content. It includes both synthetic and original sentences to enhance diversity. Preprocessing involved: - Undersampling of overrepresented classes, such as "No emotion" and "Anger," to ensure balanced training across all labels. ### Evaluation Metrics The model's performance was evaluated using precision, recall, F1-score, and accuracy metrics. 
Detailed results are as follows: | Class | Precision | Recall | F1-Score | Support | |---------------|-----------|--------|----------|---------| | Anger (0) | 0.54 | 0.58 | 0.56 | 777 | | Fear (1) | 0.84 | 0.78 | 0.81 | 776 | | Disgust (2) | 0.93 | 0.94 | 0.93 | 776 | | Sadness (3) | 0.85 | 0.84 | 0.84 | 775 | | Joy (4) | 0.82 | 0.80 | 0.81 | 777 | | Enthusiasm (5)| 0.62 | 0.64 | 0.63 | 776 | | Hope (6) | 0.53 | 0.54 | 0.54 | 777 | | Pride (7) | 0.74 | 0.79 | 0.77 | 776 | | No emotion (8)| 0.67 | 0.63 | 0.65 | 1553 | ### Overall Metrics - **Accuracy**: 0.72 - **Macro Average**: Precision = 0.73, Recall = 0.73, F1-Score = 0.73 - **Weighted Average**: Precision = 0.72, Recall = 0.72, F1-Score = 0.72 ### Performance Insights The model demonstrates robust performance in detecting "Fear" and "Disgust," while "Hope" and "Enthusiasm" show slightly lower performance due to subtleties in emotional expression and potential translation noise. These results reflect the complexities of training on machine-translated text. ## Model Usage ### Applications - Emotion analysis of German texts translated into Slovak for social research or sentiment tracking. - Research on cross-linguistic emotion classification in multilingual datasets. - Sentiment analysis for Slovak-language customer feedback derived from German text. ### Limitations - The model's performance depends on the quality of the machine-translated text. Translation errors or ambiguities may impact classification accuracy. - Subtle emotional expressions may be misclassified due to linguistic nuances being lost in translation. ### Ethical Considerations The use of machine-translated datasets introduces the possibility of biases or inaccuracies caused by the loss of cultural and linguistic subtleties during translation. Users should carefully evaluate the model's performance before applying it in sensitive contexts such as mental health, social studies, or customer sentiment analysis. 
### Citation For further information, visit: [uvegesistvan/wildmann_german_proposal_2b_german_to_slovak](#)
{"language": "sk", "license": "mit", "metrics": ["precision", "recall", "f1-score", "accuracy"], "tags": ["emotion-classification", "text-analysis", "machine-translation"]}
task
[ "TRANSLATION" ]
42,895
werty1248/Mistral-Nemo-NT-Ko-12B-dpo
werty1248
null
[ "safetensors", "mistral", "en", "ko", "ja", "zh", "dataset:zake7749/kyara-chinese-preference-rl-dpo-s0-30K", "dataset:sionic/ko-dpo-mix-7k-trl-style", "dataset:kuotient/orca-math-korean-dpo-pairs", "dataset:HuggingFaceH4/ultrafeedback_binarized", "base_model:werty1248/Mistral-Nemo-NT-Ko-12B-sft", "base_model:finetune:werty1248/Mistral-Nemo-NT-Ko-12B-sft", "license:apache-2.0", "region:us" ]
2024-09-22T08:01:19Z
2024-09-22T14:53:10+00:00
2,008
3
--- base_model: - werty1248/Mistral-Nemo-NT-Ko-12B-sft datasets: - zake7749/kyara-chinese-preference-rl-dpo-s0-30K - sionic/ko-dpo-mix-7k-trl-style - kuotient/orca-math-korean-dpo-pairs - HuggingFaceH4/ultrafeedback_binarized language: - en - ko - ja - zh license: apache-2.0 --- # Mistral-Nemo-NT-Ko-12B-dpo ## Description **Mistral-Nemo-NT-Ko-12B-dpo** is a shallowly DPO-trained version of [*werty1248/Mistral-Nemo-NT-Ko-12B-sft*](https://huggingface.co/werty1248/Mistral-Nemo-NT-Ko-12B-sft). According to the [Hermes 3 Tech Report](https://nousresearch.com/wp-content/uploads/2024/08/Hermes-3-Technical-Report.pdf), DPO made negligible performance improvements in their model. Therefore, I followed the same approach described in the report and applied DPO using LoRA. - LoRA r = 32 - LoRA alpha = 16 - lr = 3e-6 - neftune alpha = 5 The datasets used are as follows: - (En) [HuggingFaceH4/ultrafeedback_binarized](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized) - (Ko, translated from En) [sionic/ko-dpo-mix-7k-translation-exclude](https://huggingface.co/datasets/sionic/ko-dpo-mix-7k-translation-exclude) - (Ko, translated from En) [kuotient/orca-math-korean-dpo-pairs](https://huggingface.co/datasets/kuotient/orca-math-korean-dpo-pairs) - (Zh) [zake7749/kyara-chinese-preference-rl-dpo-s0-30K](https://huggingface.co/datasets/zake7749/kyara-chinese-preference-rl-dpo-s0-30K) I've been looking for native Korean/Japanese DPO datasets, but haven't found anything that I'm personally satisfied with (quantity/quality). From each dataset, I sampled a subset based on the score given by the reward model. In the end, I used about 13K samples for training for each language. ## Features - The base model supports a context length of 128K, while I fine-tuned this model with an 8K context size. - This model works well for **multi-turn conversations**, and tends to strongly reflect the previous conversation. 
# Evaluation ### LogicKor *Cot-1-shot* | 모델 | 방법 | 추론 | 수학 | 글쓰기 | 코딩 | 이해 | 문법 | 싱글턴 | 멀티턴 | 총점 | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | |Mistral-Nemo-NT-Ko-12B-sft| cot-1-shot |7.36 | 6.57 | 8.71 | 8.57 | 9.57 | 6.43 | 7.81 | 7.93 | **7.87** | |**Mistral-Nemo-NT-Ko-12B-dpo**| cot-1-shot | 6.79 | 6.43 | 9.43 | 9.79 | 9.43 | 5.29 | 7.71 | 8.00 | **7.86** | | Mistral Nemo | cot-1-shot | 5.43 | 6.86 | 6.07 | 7.57 | 5.86 | 7.57 | 7.50 | 5.62 |6.56| *1-shot* | 모델 | 방법 | 추론 | 수학 | 글쓰기 | 코딩 | 이해 | 문법 | 싱글턴 | 멀티턴 | 총점 | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | |**Mistral-Nemo-NT-Ko-12B-dpo**| 1-shot | 8.14 | 5.50 | 9.36 | 8.57 | 9.50 | 4.71 | 7.38 | 7.88 | **7.63** | |Mistral-Nemo-NT-Ko-12B-sft| 1-shot | 9.00 | 5.71 | 7.93 | 8.29 | 7.93 | 5.21 | 7.29 | 7.40 | 7.35 | | Mistral Nemo | 1-shot | 5.00 | 6.50 | 6.86 | 8.07 | 7.64 | 8.43 | 7.60 | 6.57 |7.08| *Default* | 모델 | 방법 | 추론 | 수학 | 글쓰기 | 코딩 | 이해 | 문법 | 싱글턴 | 멀티턴 | 총점 | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | |**Mistral-Nemo-NT-Ko-12B-dpo**| default | 6.21 | 5.79 | 8.00 | 8.36 | 9.43 | 5.43 | 7.17 | 7.24 | **7.20** | |Mistral-Nemo-NT-Ko-12B-sft| default | 6.00 | 4.93 | 5.43 | 7.14 | 9.71 | 4.00 | 6.45 | 5.95 | 6.20 | | Mistral Nemo | default | 0.43 | 7.64 | 6.21 | 7.14 | 6.79 | 7.21 | 6.26 | 5.55 |5.90| ### Language-Confusion | Model | Language | Monolingual-LPR | Monolingual-WPR | Crosslingual-LPR | Crosslingual-WPR | | --- | --- | --- | --- | --- | --- | |Mistral-Nemo-NT-Ko-12B-dpo| ko | 100.00% | 97.96% | **85.63%** | 96.93% | |Mistral-Nemo-NT-Ko-12B-sft| ko | 100.00% | 99.00% | **87.51%** | 96.96% | |Mistral-Nemo-Instruct-2407 | ko | 90.72% | 93.18% | 46.75% | 92.84% | |Meta-Llama-3.1-8B-Instruct | ko | 99.00% | 96.97% | 91.45% | 93.01% | |gemma-2-9b-it | ko | 100.00% | 98.00% | 87.93% | 95.58% | | --- | --- | --- | --- | --- | --- | |Mistral-Nemo-NT-Ko-12B-dpo| zh | 99.00% | 99.50% | **80.52%** | 97.51% | |Mistral-Nemo-Instruct-2407 | 
zh | 97.50% | 98.98% | 53.43% | 93.58% | | --- | --- | --- | --- | --- | --- | |Mistral-Nemo-NT-Ko-12B-dpo| ja | 100.00% | 100.00% | **86.89%** | 95.41% | |Mistral-Nemo-Instruct-2407 | ja | 94.00% | 98.94% | 50.27% | 96.05% | ## Template ``` <|im_start|>system You are a helpful AI assistant.<|im_end|> <|im_start|>user {prompt}<|im_end|> <|im_start|>assistant ``` *I trained Mistral-Nemo-NT-Ko-12B with various system prompts from dozens of datasets. You can chat with or without your own system prompt.* # Dataset - zake7749/kyara-chinese-preference-rl-dpo-s0-30K - sionic/ko-dpo-mix-7k-trl-style - kuotient/orca-math-korean-dpo-pairs - HuggingFaceH4/ultrafeedback_binarized # Training Details - GPU: 2xA100 - epoch: 1 - total batch size: 32 - learning rate: 3e-6 - neftune_noise_alpha: 5 <details><summary>See axolotl config</summary> axolotl version: `0.4.1` ```yaml base_model: werty1248/Mistral-Nemo-NT-Ko-12B-sft model_type: MistralForCausalLM tokenizer_type: AutoTokenizer load_in_8bit: false load_in_4bit: false strict: false adapter: lora lora_model_dir: lora_r: 32 lora_alpha: 16 lora_dropout: 0.05 lora_target_linear: true lora_fan_in_fan_out: dpo_beta: 0.1 rl: dpo datasets: - path: werty1248/NT-dpo split: train type: chatml.prompt_pairs dataset_prepared_path: /workspace/data/prepared_datasets output_dir: /workspace/data save_steps: 500 sequence_len: 8192 sample_packing: false pad_to_sequence_len: true gradient_accumulation_steps: 16 micro_batch_size: 1 num_epochs: 1 optimizer: rmsprop weight_decay: 0.0 learning_rate: 0.000003 lr_scheduler: linear neftune_noise_alpha: 5 train_on_inputs: false group_by_length: false #wandb_project: #wandb_entity: #wandb_watch: #wandb_name: #wandb_log_model: bf16: true fp16: false tf32: false gradient_checkpointing: true flash_attention: true warmup_steps: 9 eval_steps: val_set_size: 0 early_stopping_patience: logging_steps: 1 special_tokens: pad_token: <pad> ``` </details><br> - reward margin 
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6629154d55d7c289634b8c5d/5m2K7azV5ZhGGZqWJZNWX.png)
null
Non_BioNLP
# Mistral-Nemo-NT-Ko-12B-dpo ## Description **Mistral-Nemo-NT-Ko-12B-dpo** is a shallowly DPO-trained version of [*werty1248/Mistral-Nemo-NT-Ko-12B-sft*](https://huggingface.co/werty1248/Mistral-Nemo-NT-Ko-12B-sft). According to the [Hermes 3 Tech Report](https://nousresearch.com/wp-content/uploads/2024/08/Hermes-3-Technical-Report.pdf), DPO made negligible performance improvements in their model. Therefore, I followed the same approach described in the report and applied DPO using LoRA. - LoRA r = 32 - LoRA alpha = 16 - lr = 3e-6 - neftune alpha = 5 The datasets used are as follows: - (En) [HuggingFaceH4/ultrafeedback_binarized](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized) - (Ko, translated from En) [sionic/ko-dpo-mix-7k-translation-exclude](https://huggingface.co/datasets/sionic/ko-dpo-mix-7k-translation-exclude) - (Ko, translated from En) [kuotient/orca-math-korean-dpo-pairs](https://huggingface.co/datasets/kuotient/orca-math-korean-dpo-pairs) - (Zh) [zake7749/kyara-chinese-preference-rl-dpo-s0-30K](https://huggingface.co/datasets/zake7749/kyara-chinese-preference-rl-dpo-s0-30K) I've been looking for native Korean/Japanese DPO datasets, but haven't found anything that I'm personally satisfied with (quantity/quality). From each dataset, I sampled a subset based on the score given by the reward model. In the end, I used about 13K samples for training for each language. ## Features - The base model supports a context length of 128K, while I fine-tuned this model with an 8K context size. - This model works well for **multi-turn conversations**, and tends to strongly reflect the previous conversation. 
# Evaluation ### LogicKor *Cot-1-shot* | 모델 | 방법 | 추론 | 수학 | 글쓰기 | 코딩 | 이해 | 문법 | 싱글턴 | 멀티턴 | 총점 | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | |Mistral-Nemo-NT-Ko-12B-sft| cot-1-shot |7.36 | 6.57 | 8.71 | 8.57 | 9.57 | 6.43 | 7.81 | 7.93 | **7.87** | |**Mistral-Nemo-NT-Ko-12B-dpo**| cot-1-shot | 6.79 | 6.43 | 9.43 | 9.79 | 9.43 | 5.29 | 7.71 | 8.00 | **7.86** | | Mistral Nemo | cot-1-shot | 5.43 | 6.86 | 6.07 | 7.57 | 5.86 | 7.57 | 7.50 | 5.62 |6.56| *1-shot* | 모델 | 방법 | 추론 | 수학 | 글쓰기 | 코딩 | 이해 | 문법 | 싱글턴 | 멀티턴 | 총점 | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | |**Mistral-Nemo-NT-Ko-12B-dpo**| 1-shot | 8.14 | 5.50 | 9.36 | 8.57 | 9.50 | 4.71 | 7.38 | 7.88 | **7.63** | |Mistral-Nemo-NT-Ko-12B-sft| 1-shot | 9.00 | 5.71 | 7.93 | 8.29 | 7.93 | 5.21 | 7.29 | 7.40 | 7.35 | | Mistral Nemo | 1-shot | 5.00 | 6.50 | 6.86 | 8.07 | 7.64 | 8.43 | 7.60 | 6.57 |7.08| *Default* | 모델 | 방법 | 추론 | 수학 | 글쓰기 | 코딩 | 이해 | 문법 | 싱글턴 | 멀티턴 | 총점 | | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | |**Mistral-Nemo-NT-Ko-12B-dpo**| default | 6.21 | 5.79 | 8.00 | 8.36 | 9.43 | 5.43 | 7.17 | 7.24 | **7.20** | |Mistral-Nemo-NT-Ko-12B-sft| default | 6.00 | 4.93 | 5.43 | 7.14 | 9.71 | 4.00 | 6.45 | 5.95 | 6.20 | | Mistral Nemo | default | 0.43 | 7.64 | 6.21 | 7.14 | 6.79 | 7.21 | 6.26 | 5.55 |5.90| ### Language-Confusion | Model | Language | Monolingual-LPR | Monolingual-WPR | Crosslingual-LPR | Crosslingual-WPR | | --- | --- | --- | --- | --- | --- | |Mistral-Nemo-NT-Ko-12B-dpo| ko | 100.00% | 97.96% | **85.63%** | 96.93% | |Mistral-Nemo-NT-Ko-12B-sft| ko | 100.00% | 99.00% | **87.51%** | 96.96% | |Mistral-Nemo-Instruct-2407 | ko | 90.72% | 93.18% | 46.75% | 92.84% | |Meta-Llama-3.1-8B-Instruct | ko | 99.00% | 96.97% | 91.45% | 93.01% | |gemma-2-9b-it | ko | 100.00% | 98.00% | 87.93% | 95.58% | | --- | --- | --- | --- | --- | --- | |Mistral-Nemo-NT-Ko-12B-dpo| zh | 99.00% | 99.50% | **80.52%** | 97.51% | |Mistral-Nemo-Instruct-2407 | 
zh | 97.50% | 98.98% | 53.43% | 93.58% | | --- | --- | --- | --- | --- | --- | |Mistral-Nemo-NT-Ko-12B-dpo| ja | 100.00% | 100.00% | **86.89%** | 95.41% | |Mistral-Nemo-Instruct-2407 | ja | 94.00% | 98.94% | 50.27% | 96.05% | ## Template ``` <|im_start|>system You are a helpful AI assistant.<|im_end|> <|im_start|>user {prompt}<|im_end|> <|im_start|>assistant ``` *I trained Mistral-Nemo-NT-Ko-12B with various system prompts from dozens of datasets. You can chat with or without your own system prompt.* # Dataset - zake7749/kyara-chinese-preference-rl-dpo-s0-30K - sionic/ko-dpo-mix-7k-trl-style - kuotient/orca-math-korean-dpo-pairs - HuggingFaceH4/ultrafeedback_binarized # Training Details - GPU: 2xA100 - epoch: 1 - total batch size: 32 - learning rate: 3e-6 - neftune_noise_alpha: 5 <details><summary>See axolotl config</summary> axolotl version: `0.4.1` ```yaml base_model: werty1248/Mistral-Nemo-NT-Ko-12B-sft model_type: MistralForCausalLM tokenizer_type: AutoTokenizer load_in_8bit: false load_in_4bit: false strict: false adapter: lora lora_model_dir: lora_r: 32 lora_alpha: 16 lora_dropout: 0.05 lora_target_linear: true lora_fan_in_fan_out: dpo_beta: 0.1 rl: dpo datasets: - path: werty1248/NT-dpo split: train type: chatml.prompt_pairs dataset_prepared_path: /workspace/data/prepared_datasets output_dir: /workspace/data save_steps: 500 sequence_len: 8192 sample_packing: false pad_to_sequence_len: true gradient_accumulation_steps: 16 micro_batch_size: 1 num_epochs: 1 optimizer: rmsprop weight_decay: 0.0 learning_rate: 0.000003 lr_scheduler: linear neftune_noise_alpha: 5 train_on_inputs: false group_by_length: false #wandb_project: #wandb_entity: #wandb_watch: #wandb_name: #wandb_log_model: bf16: true fp16: false tf32: false gradient_checkpointing: true flash_attention: true warmup_steps: 9 eval_steps: val_set_size: 0 early_stopping_patience: logging_steps: 1 special_tokens: pad_token: <pad> ``` </details><br> - reward margin 
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6629154d55d7c289634b8c5d/5m2K7azV5ZhGGZqWJZNWX.png)
{"base_model": ["werty1248/Mistral-Nemo-NT-Ko-12B-sft"], "datasets": ["zake7749/kyara-chinese-preference-rl-dpo-s0-30K", "sionic/ko-dpo-mix-7k-trl-style", "kuotient/orca-math-korean-dpo-pairs", "HuggingFaceH4/ultrafeedback_binarized"], "language": ["en", "ko", "ja", "zh"], "license": "apache-2.0"}
task
[ "TRANSLATION" ]
42,896
0xhaz/BUOD
0xhaz
null
[ "region:us" ]
2023-04-19T22:47:14Z
2023-04-19T22:51:28+00:00
0
0
--- {} --- # 📋 BUOD: Text Summarization Model for the Filipino Language Directory [![Model:distilBART](https://img.shields.io/badge/model-distilBART-green)](https://huggingface.co/jamesesguerra/distilbart-cnn-12-6-finetuned-1.3.1) [![Model:Bert2Bert](https://img.shields.io/badge/model-bert2bert-green)](https://huggingface.co/0xhaz/bert2bert-cnn_dailymail-fp16-finetuned-1.0.0) ![Last Updated](https://img.shields.io/badge/last%20updated%3A-031923-lightgrey) Authors: [James Esguerra](https://huggingface.co/jamesesguerra), [Julia Avila](), [Hazielle Bugayong](https://huggingface.co/0xhaz) > Foreword: This research was done in two parts, gathering the data and running transformer models, > namely distilBART and bert2bert. Below is the step-by-step process of the experimentation of the study: ## 📚 Steps - 📝 **Gathering the data** - 🔧 **Initializing the transformer models; fine-tuning of the models:** -- via Google Colab -- via Google Colab (Local runtime) -- via Jupyter Notebook ## 📝 Gathering data An [article scraper](https://github.com/jamesesguerra/article_scraper) was used in this experimentation, which can gather bodies of text from various news sites. The data gathered was used to pre-train and fine-tune the models in the next step. This also includes instructions on how to use the article scraper. ## 🔧 Initialization of transformer models #### via Google Colab Two models, distilBART and bert2bert, were used to compare abstractive text summarization performance. They can be found here: - [distilBART](https://colab.research.google.com/drive/1Lv78nHqQh2I7KaFkUzWsn_MXsyP_PP1I?authuser=3#scrollTo=moK3d7mTQ1v-) - [bert2bert](https://colab.research.google.com/drive/1Lv78nHqQh2I7KaFkUzWsn_MXsyP_PP1I?authuser=3#scrollTo=moK3d7mTQ1v-) #### via Google Colab Local Runtime ##### Dependencies - Jupyter Notebook - Anaconda - _Optional:_ CUDA Toolkit for Nvidia, requires an account to install - Tensorflow ##### Installing dependencies Create an Anaconda environment. 
This can also be used for tensorflow, which links your GPU to Google Colab's local runtime: ```sh conda create -n tf-gpu conda activate tf-gpu ``` ##### Optional Step: GPU Utilization (if you are using an external GPU) Next, install the **CUDA toolkit**; this is the version that was used in this experiment. You may find a more compatible version for your hardware: ```sh conda install -c conda-forge cudatoolkit=11.2 cudnn=8.1.0 ``` Then, upgrade pip and install tensorflow: ```sh pip install --upgrade pip pip install "tensorflow<2.11" --user ``` Now, check if tensorflow has been configured to use the GPU. Type in the terminal: ```sh python ``` Next, type the following to verify: ```sh import tensorflow as tf tf.test.is_built_with_cuda() ``` If it returns `true`, you have successfully initialized the environment with your external GPU. If not, you may follow the tutorials found here: - CUDA Toolkit Tutorial [here](https://medium.com/geekculture/install-cuda-and-cudnn-on-windows-linux-52d1501a8805) - Creating an Anaconda environment [step-by-step](https://stackoverflow.com/questions/51002045/how-to-make-jupyter-notebook-to-run-on-gpu) - Installing Tensorflow locally using [this tutorial](https://www.tensorflow.org/install/pip#windows-native_1) ##### Connecting to a Google Colab Local Runtime To connect this to a Google Colab local runtime, [this tutorial](https://research.google.com/colaboratory/local-runtimes.html) was used. First, install Jupyter Notebook (if you haven't) and enable server permissions: ```sh pip install jupyter_http_over_ws jupyter serverextension enable --py jupyter_http_over_ws ``` Next, start and authenticate the server: ```sh jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0 ``` You can now copy the token URL and paste it into your Google Colab. 
#### Running the notebook using Jupyter Notebook ##### Dependencies - Jupyter Notebook - Anaconda - _Optional:_ CUDA Toolkit for Nvidia, requires an account to install - Tensorflow Download the notebooks and save them in your chosen directory. Create an environment where you can run the notebook via Anaconda: ```sh conda create -n env conda activate env ``` **You may also opt to install the CUDA toolkit and tensorflow in this environment.** Next, run the notebooks via Jupyter Notebook. ```sh jupyter notebook ``` ##### After you're done Deactivate the environment and also disable the server using the commands in your console. ```sh conda deactivate ``` ```sh jupyter serverextension disable --py jupyter_http_over_ws ``` ## 🔗 Additional Links / Directory Here are some links to resources and/or references. | Name | Link | | ------ | ------ | | Ateneo Social Computing Lab | https://huggingface.co/ateneoscsl |
null
Non_BioNLP
# 📋 BUOD: Text Summarization Model for the Filipino Language Directory [![Model:distilBART](https://img.shields.io/badge/model-distilBART-green)](https://huggingface.co/jamesesguerra/distilbart-cnn-12-6-finetuned-1.3.1) [![Model:Bert2Bert](https://img.shields.io/badge/model-bert2bert-green)](https://huggingface.co/0xhaz/bert2bert-cnn_dailymail-fp16-finetuned-1.0.0) ![Last Updated](https://img.shields.io/badge/last%20updated%3A-031923-lightgrey) Authors: [James Esguerra](https://huggingface.co/jamesesguerra), [Julia Avila](), [Hazielle Bugayong](https://huggingface.co/0xhaz) > Foreword: This research was done in two parts, gathering the data and running transformer models, > namely distilBART and bert2bert. Below is the step-by-step process of the experimentation of the study: ## 📚 Steps - 📝 **Gathering the data** - 🔧 **Initializing the transformer models; fine-tuning of the models:** -- via Google Colab -- via Google Colab (Local runtime) -- via Jupyter Notebook ## 📝 Gathering data An [article scraper](https://github.com/jamesesguerra/article_scraper) was used in this experimentation, which can gather bodies of text from various news sites. The data gathered was used to pre-train and fine-tune the models in the next step. This also includes instructions on how to use the article scraper. ## 🔧 Initialization of transformer models #### via Google Colab Two models, distilBART and bert2bert, were used to compare abstractive text summarization performance. They can be found here: - [distilBART](https://colab.research.google.com/drive/1Lv78nHqQh2I7KaFkUzWsn_MXsyP_PP1I?authuser=3#scrollTo=moK3d7mTQ1v-) - [bert2bert](https://colab.research.google.com/drive/1Lv78nHqQh2I7KaFkUzWsn_MXsyP_PP1I?authuser=3#scrollTo=moK3d7mTQ1v-) #### via Google Colab Local Runtime ##### Dependencies - Jupyter Notebook - Anaconda - _Optional:_ CUDA Toolkit for Nvidia, requires an account to install - Tensorflow ##### Installing dependencies Create an Anaconda environment. 
This can also be used for tensorflow, which links your GPU to Google Colab's local runtime: ```sh conda create -n tf-gpu conda activate tf-gpu ``` ##### Optional Step: GPU Utilization (if you are using an external GPU) Next, install the **CUDA toolkit**; this is the version that was used in this experiment. You may find a more compatible version for your hardware: ```sh conda install -c conda-forge cudatoolkit=11.2 cudnn=8.1.0 ``` Then, upgrade pip and install tensorflow: ```sh pip install --upgrade pip pip install "tensorflow<2.11" --user ``` Now, check if tensorflow has been configured to use the GPU. Type in the terminal: ```sh python ``` Next, type the following to verify: ```sh import tensorflow as tf tf.test.is_built_with_cuda() ``` If it returns `true`, you have successfully initialized the environment with your external GPU. If not, you may follow the tutorials found here: - CUDA Toolkit Tutorial [here](https://medium.com/geekculture/install-cuda-and-cudnn-on-windows-linux-52d1501a8805) - Creating an Anaconda environment [step-by-step](https://stackoverflow.com/questions/51002045/how-to-make-jupyter-notebook-to-run-on-gpu) - Installing Tensorflow locally using [this tutorial](https://www.tensorflow.org/install/pip#windows-native_1) ##### Connecting to a Google Colab Local Runtime To connect this to a Google Colab local runtime, [this tutorial](https://research.google.com/colaboratory/local-runtimes.html) was used. First, install Jupyter Notebook (if you haven't) and enable server permissions: ```sh pip install jupyter_http_over_ws jupyter serverextension enable --py jupyter_http_over_ws ``` Next, start and authenticate the server: ```sh jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0 ``` You can now copy the token URL and paste it into your Google Colab. 
#### Running the notebook using Jupyter Notebook

##### Dependencies
- Jupyter Notebook
- Anaconda
- _Optional:_ CUDA Toolkit for Nvidia, requires an account to install
- Tensorflow

Download the notebooks and save them in your chosen directory. Then, create an environment where you can run them via Anaconda:

```sh
conda create -n env
conda activate env
```

You may also opt to install the CUDA toolkit and Tensorflow in this environment. Next, run the notebooks via Jupyter Notebook:

```sh
jupyter notebook
```

##### After you're done

Deactivate the environment and disable the server using the following commands in your console:

```sh
conda deactivate
```

```sh
jupyter serverextension disable --py jupyter_http_over_ws
```

## 🔗 Additional Links / Directory

Here are some links to resources and/or references.

| Name | Link |
| ------ | ------ |
| Ateneo Social Computing Lab | https://huggingface.co/ateneoscsl |
{}
task
[ "SUMMARIZATION" ]
42,898
pucpr/clinicalnerpt-healthcare
pucpr
token-classification
[ "transformers", "pytorch", "bert", "token-classification", "pt", "dataset:SemClinBr", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05Z
2021-10-13T09:32:28+00:00
129
6
---
datasets:
- SemClinBr
language: pt
widget:
- text: Acompanhamento da diabetes, paciente encaminhado da unidade de saúde.
- text: Paciente encaminhado por alteração na função renal.
thumbnail: https://raw.githubusercontent.com/HAILab-PUCPR/BioBERTpt/master/images/logo-biobertpr1.png
---

<img src="https://raw.githubusercontent.com/HAILab-PUCPR/BioBERTpt/master/images/logo-biobertpr1.png" alt="Logo BioBERTpt">

# Portuguese Clinical NER - HealthCare

The HealthCare NER model is part of the [BioBERTpt project](https://www.aclweb.org/anthology/2020.clinicalnlp-1.7/), in which 13 models of clinical entities (compatible with UMLS) were trained. All NER models from the "pucpr" user were trained on the Brazilian clinical corpus [SemClinBr](https://github.com/HAILab-PUCPR/SemClinBr), for 10 epochs in IOB2 format, starting from the BioBERTpt(all) model.

## Acknowledgements

This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001.

## Citation

```
@inproceedings{schneider-etal-2020-biobertpt,
    title = "{B}io{BERT}pt - A {P}ortuguese Neural Language Model for Clinical Named Entity Recognition",
    author = "Schneider, Elisa Terumi Rubel and de Souza, Jo{\~a}o Vitor Andrioli and Knafou, Julien and Oliveira, Lucas Emanuel Silva e and Copara, Jenny and Gumiel, Yohan Bonescki and Oliveira, Lucas Ferro Antunes de and Paraiso, Emerson Cabrera and Teodoro, Douglas and Barra, Cl{\'a}udia Maria Cabral Moro",
    booktitle = "Proceedings of the 3rd Clinical Natural Language Processing Workshop",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.clinicalnlp-1.7",
    pages = "65--72",
    abstract = "With the growing number of electronic health record data, clinical NLP tasks have become increasingly relevant to unlock valuable information from unstructured clinical text. Although the performance of downstream NLP tasks, such as named-entity recognition (NER), in English corpus has recently improved by contextualised language models, less research is available for clinical texts in low resource languages. Our goal is to assess a deep contextual embedding model for Portuguese, so called BioBERTpt, to support clinical and biomedical NER. We transfer learned information encoded in a multilingual-BERT model to a corpora of clinical narratives and biomedical-scientific papers in Brazilian Portuguese. To evaluate the performance of BioBERTpt, we ran NER experiments on two annotated corpora containing clinical narratives and compared the results with existing BERT models. Our in-domain model outperformed the baseline model in F1-score by 2.72{\%}, achieving higher performance in 11 out of 13 assessed entities. We demonstrate that enriching contextual embedding models with domain literature can play an important role in improving performance for specific NLP tasks. The transfer learning process enhanced the Portuguese biomedical NER model by reducing the necessity of labeled data and the demand for retraining a whole new model.",
}
```

## Questions?

Post a Github issue on the [BioBERTpt repo](https://github.com/HAILab-PUCPR/BioBERTpt).
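The card notes the models were trained with IOB2-format tags. As an illustration of what that format encodes — using hypothetical tags, not this model's actual label set — a minimal sketch of grouping IOB2-tagged tokens into entity spans: `B-X` opens an entity of type `X`, `I-X` continues it, and `O` marks tokens outside any entity.

```python
# Minimal IOB2 decoder: groups (token, tag) pairs into entity spans.
# The tags below are hypothetical examples, not this model's label set.
def iob2_to_entities(tagged_tokens):
    entities = []
    current = None  # (label, [tokens]) of the entity being built
    for token, tag in tagged_tokens:
        if tag.startswith("B-"):
            if current:
                entities.append((current[0], " ".join(current[1])))
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)
        else:  # "O", or an I- tag that does not continue the current entity
            if current:
                entities.append((current[0], " ".join(current[1])))
            current = None
    if current:
        entities.append((current[0], " ".join(current[1])))
    return entities

print(iob2_to_entities([
    ("Acompanhamento", "O"),
    ("da", "O"),
    ("diabetes", "B-Disease"),
    ("mellitus", "I-Disease"),
]))  # → [('Disease', 'diabetes mellitus')]
```

This is only a sketch of the tagging scheme; in practice the Hugging Face token-classification pipeline performs this aggregation for you.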
null
BioNLP
<img src="https://raw.githubusercontent.com/HAILab-PUCPR/BioBERTpt/master/images/logo-biobertpr1.png" alt="Logo BioBERTpt">

# Portuguese Clinical NER - HealthCare

The HealthCare NER model is part of the [BioBERTpt project](https://www.aclweb.org/anthology/2020.clinicalnlp-1.7/), in which 13 models of clinical entities (compatible with UMLS) were trained. All NER models from the "pucpr" user were trained on the Brazilian clinical corpus [SemClinBr](https://github.com/HAILab-PUCPR/SemClinBr), for 10 epochs in IOB2 format, starting from the BioBERTpt(all) model.

## Acknowledgements

This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001.

## Citation

```
@inproceedings{schneider-etal-2020-biobertpt,
    title = "{B}io{BERT}pt - A {P}ortuguese Neural Language Model for Clinical Named Entity Recognition",
    author = "Schneider, Elisa Terumi Rubel and de Souza, Jo{\~a}o Vitor Andrioli and Knafou, Julien and Oliveira, Lucas Emanuel Silva e and Copara, Jenny and Gumiel, Yohan Bonescki and Oliveira, Lucas Ferro Antunes de and Paraiso, Emerson Cabrera and Teodoro, Douglas and Barra, Cl{\'a}udia Maria Cabral Moro",
    booktitle = "Proceedings of the 3rd Clinical Natural Language Processing Workshop",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.clinicalnlp-1.7",
    pages = "65--72",
    abstract = "With the growing number of electronic health record data, clinical NLP tasks have become increasingly relevant to unlock valuable information from unstructured clinical text. Although the performance of downstream NLP tasks, such as named-entity recognition (NER), in English corpus has recently improved by contextualised language models, less research is available for clinical texts in low resource languages. Our goal is to assess a deep contextual embedding model for Portuguese, so called BioBERTpt, to support clinical and biomedical NER. We transfer learned information encoded in a multilingual-BERT model to a corpora of clinical narratives and biomedical-scientific papers in Brazilian Portuguese. To evaluate the performance of BioBERTpt, we ran NER experiments on two annotated corpora containing clinical narratives and compared the results with existing BERT models. Our in-domain model outperformed the baseline model in F1-score by 2.72{\%}, achieving higher performance in 11 out of 13 assessed entities. We demonstrate that enriching contextual embedding models with domain literature can play an important role in improving performance for specific NLP tasks. The transfer learning process enhanced the Portuguese biomedical NER model by reducing the necessity of labeled data and the demand for retraining a whole new model.",
}
```

## Questions?

Post a Github issue on the [BioBERTpt repo](https://github.com/HAILab-PUCPR/BioBERTpt).
{"datasets": ["SemClinBr"], "language": "pt", "widget": [{"text": "Acompanhamento da diabetes, paciente encaminhado da unidade de saúde."}, {"text": "Paciente encaminhado por alteração na função renal."}], "thumbnail": "https://raw.githubusercontent.com/HAILab-PUCPR/BioBERTpt/master/images/logo-biobertpr1.png"}
task
[ "NAMED_ENTITY_RECOGNITION" ]
42,899
Atipico1/simcse-12000
Atipico1
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:473130", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:BM-K/KoSimCSE-bert-multitask", "base_model:finetune:BM-K/KoSimCSE-bert-multitask", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
2024-08-25T12:48:13Z
2024-08-25T12:48:57+00:00
6
0
--- base_model: BM-K/KoSimCSE-bert-multitask datasets: [] language: [] library_name: sentence-transformers metrics: - cosine_accuracy - dot_accuracy - manhattan_accuracy - euclidean_accuracy - max_accuracy pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:473130 - loss:MultipleNegativesRankingLoss widget: - source_sentence: 릴레이로 이번주 TV와 라디오 방송 출연을 확정한게 누구야? sentences: - ▲ 사진=롯데엔터테인먼트 제공 영화 완벽한 타인의 주역 유해진, 조진웅, 이서진, 염정아가 릴레이로 이번주 TV와 라디오 방송 출연을 확정했다. 완벽한 타인은 완벽해 보이는 커플 모임에서 한정된 시간 동안 핸드폰으로 오는 전화, 문자, 카톡을 강제로 공개해야 하는 게임 때문에 벌어지는 예측불허 이야기를 담은 작품이다. 완벽한 타인에서 완벽한 연기를 펼친 배우들은 이번 주 릴레이로 TV와 라디오 방송 출연을 확정하며 열일 행보를 펼친다. 먼저 오는 24일 오후 7시 MBC FM영화음악 한예리입니다에는 유해진과 염정아가 함께 출연한다. 간첩, 전우치에 이어 세 번째로 함께 호흡을 맞춘 두 사람은 이번 라디오 출연에서 영화에 대한 이야기를 나누며 걸출한 입담과 절친 케미스트리를 선보일 것으로 보인다. 이어 이번 영화로 처음 만나 절친이 된 유해진, 조진웅, 이서진이 25일 오후 11시 10분 KBS2 해피투게더4에 출연한다. 세끼 인연 유해진과 이서진, 그리고 조진웅의 예능감이 유감없이 발휘될 예정이다. 마지막으로 26일에는 MBC 배철수의 음악캠프에서 이서진을 만날 수 있다. 완벽한 타인에서 가장 파격적인 연기 변신을 선보인 그는 음악캠프 특별 DJ로 활약했던 인연으로 이번 출연이 성사됐다. 이서진은 거침없는 언변으로 영화 완벽한 타인의 현장 비하인드 스토리를 밝힐 예정이다. 한편 완벽한 타인은 오는 31일 개봉을 앞두고 있다. - 부산 부산진구(구청장 서은숙)는 오는 7월부터 단독주택가 재활용정거장 운영 사업을 부암1동과 개금2동으로 확대 추진한다고 밝혔다. 재활용정거장 사업은 일반 주택가 주민들이 편리하게 재활용품을 배출할 수 있도록 일정시간에 지정된 장소에 ‘재활용정거장’이라는 배출 거점을 만들어 주민들이 이용할 수 있도록 하고, 운영 시간 외에는 철수하는 이동식 분리수거장이다. 각 정거장마다 도시광부라 불리는 자원관리사가 정거장을 관리하고 주민들의 재활용품 분리배출을 지원한다. 부산진구는 2019년 3월부터 전포1동 지역에 재활용정거장 25개소를 설치하여 174회 운영, 재활용품 38,700마대를 수거했다. 이 사업이 주민들의 올바른 재활용품 분리배출 문화를 확산시키고 마을의 쓰레기 문제를 주민들이 직접 해결하는 지역공동체의 모범안을 제시한 사례로 평가받음에 따라 오는 7월부터 부암1동과 개금2동 지역에 재활용정거장 운영을 확대 추진하기로 했다. 재활용정거장은 부암1동이 5개소, 개금2동은 10개소이다. 부암1동은 매주 월요일과 수요일, 개금2동은 화요일과 목요일 오후 4시부터 8시까지 4시간동안 운영된다. 주민들은 종이류, 플라스틱류, 유리병류, 캔․고철류, 비닐류 등 재활용 전품목을 가까운 재활용정거장으로 배출하면 된다. 오후 8시 이후 운영을 마치면 수거업체가 재활용정거장에 배출된 재활용품을 모두 수거해 간다. 이후 정거장은 철거되고 다음 운영요일에 다시 설치된다. 또한 주민들의 재활용품 배출 혼동을 줄이기 위해 기존 문전수거 방식도 병행해 운영한다. 
구 관계자는 “재활용정거장 사업은 단독주택지역의 재활용품 혼합배출 실태를 개선하여 실질적인 자원 재활용율 높이고 쾌적한 골목길 조성에 크게 기여할 것으로 기대한다”며 주민들이 다함께 적극 참여해 줄 것을 당부했다. - 그룹 세븐틴이 미국 간판 토크쇼 ‘엘렌 쇼’에 첫 출연을 확정 지었다. 세븐틴은 다음 달 1일(현지 시각) 방송되는 미국 토크쇼 ‘엘렌 드제너러스 쇼’(이하 엘렌 쇼)에 첫 출연을 확정 지어 전 세계 팬들의 폭발적인 반응을 얻었다. 이날 방송에서 세븐틴은 지난 2019년 8월 발매한 디지털 싱글 ‘HIT’ 무대를 선보인다. ‘HIT’는 제목처럼 타격감이 느껴지는 사운드와 세븐틴의 폭발적인 에너지가 그대로 전해지는 강렬한 EDM 장르의 댄스곡으로 발매와 동시에 국내는 물론 해외에서도 큰 사랑을 받았다. ‘엘렌 쇼’는 미국 유명 코미디언이자 작가, 배우 등 멀티 엔터테이너인 엘렌 드제너러스가 진행하는 토크쇼로 브루노 마스, 두아 리파, 존 레전드, 저스틴 비버 등 세계적인 팝스타들이 대거 출연해 화제를 모았으며 미국의 데이타임 쇼 중 높은 인기를 보유하고 있는 프로그램이다. 앞서 세븐틴은 지난 1월 방송된 미국 CBS ‘제임스 코든 쇼’와 NBC ‘켈리 클락슨 쇼’에 연달아 출연해 스페셜 앨범 타이틀곡 ‘HOME;RUN’과 미니 7집 타이틀곡 ‘Left & Right’의 무대를 선사, 막강한 글로벌 영향력을 확인 시켜 주며 전 세계 팬들과 해외 유수 매체의 호평 세례를 받았다. 이렇듯 세븐틴은 스토리텔링이 담긴 완성도 높은 무대와 세븐틴만이 할 수 있는 퍼포먼스를 선보여 ‘K팝 퍼포먼스 강자’라는 칭호를 얻는 등 전 세계를 열광시킨 바 있어 이번 ‘엘렌쇼’에서 어떤 무대를 선보일지 기대감이 치솟고 있다. 한편 세븐틴이 출연하는 미국 토크쇼 ‘엘렌 쇼’는 다음 달 1일(현지 시각)에 만나볼 수 있다. - source_sentence: 롯데리아에 직접 가서 불고기버거세트를 먹으려면 얼마를 내야 하지 sentences: - 햄버거 프랜차이즈 업체들이 배달 애플리케이션으로 일정 금액 이상 주문할 때 배달비가 무료라고 내세우지만 이미 제품 가격에 포함된 것으로 드러났다. 햄버거를 배달 앱으로 주문하면 같은 제품이라도 매장보다 더 가격이 비싸다. 사실상 소비자 기만 행위다. 19일 한국소비자원의 조사 결과 롯데리아, 맥도날드, 버거킹, KFC 등 주요 4개 햄버거 프랜차이즈의 모든 제품이 매장 가격에 비해 배달 가격이 비쌌다. 예를 들어 롯데리아 불고기버거세트의 배달가는 7000원으로 매장가 5900원보다 1100원을, 버거킹 리얼와퍼세트는 1200원을 더 내야 한다. 메뉴를 많이 주문할수록 가격 차이가 커져 소비자가 피해를 보는 구조라는 것도 분통터지게 한다. 기업이 이윤을 추구하는 것은 당연하지만 소비자를 속여서까지 이익을 내선 안 된다. 더 큰 문제는 프랜차이즈들이 이런 사실을 제대로 고지하지 않았다는 점이다. 버거킹, KFC는 자사 홈페이지에서만 배달과 매장 가격이 다를 수 있음을 알리고 있다. 4개 업체 모두 배달의민족, 요기요, 쿠팡이츠 등 주요 배달 플랫폼에 이 같은 정보를 공지하지 않았다. 대개 배달 앱을 통해 주문하는 만큼 이제라도 주문 및 결제 과정에서 주요 거래조건을 명확하게 알려야 할 것이다. 소비자단체에 따르면 햄버거뿐 아니라 상당수 일반 음식점이 배달 앱으로 주문할 때 식당가보다 음식값을 더 비싸게 받고 있다. 매장에선 할인되는 품목이 배달 주문 때는 할인 적용이 안 되는 경우도 있다. 식당가와 배달가의 차이가 난다는 지적에 아예 매장 가격을 올려버리기도 한다니 소비자를 봉으로 아는 태도다. 이는 전체 외식 물가 인상으로 이어지는 것이라 우려스럽다. 코로나19로 비대면이 늘면서 배달 앱을 이용한 음식 주문은 소비자의 일상이 됐다. 이런 상황에서 무료배달이라는 꼼수를 쓴 햄버거 프랜차이즈는 비난받아 마땅하다. - 물가 안정과 우리 농산물 소비 촉진을 위한 '대한민국 농할(농산물 할인) 갑시다'가 돌아왔다. 
이마트, 롯데마트와 롯데슈퍼는 농림축산식품부와 손잡고 오는 27일까지 다양한 농산물 할인 혜택을 통해 물가 안정 및 농가 돕기에 나선다. 이번 '농할갑시다'의 키 포인트는 '물가 안정'과 '소비 촉진'이다. 유통업계와 농림축산식품부는 이번 '농할갑시다'를 통해 물가를 안정시키고 판로에 어려움을 겪고 있는 국내 농가들을 도울 예정이다. 행사 기간 동안 이마트에서는 '농할갑시다' 행사 상품을 구매할 경우, 신세계포인트 적립 고객에 한해 20% 할인을 적용한다. 할인은 1인당 최대 1만원(구매금액 5만원)까지 받을 수 있다. 먼저 AI(조류독감)로 갑작스레 오른 달걀 가격 안정화를 위해 이마트와 농림축산식품부는 약 50종에 달하는 달걀 할인 행사를 선보인다. 이마트에서 달걀을 신세계포인트 적립하여 구매할 시, 판매 가격의 20%를 할인받을 수 있다. 또 올겨울 잦은 한파와 폭설로 인해 가격이 크게 오른 '무'도 신세계포인트 적립 시 20% 할인해 판매한다. 무는 올 1월 제주도에 내린 폭설로 생산량이 줄어들어 가격이 상승했다. 농산물유통정보에 따르면 무 20㎏ 평균 가격은 작년 12월 중순 1만536원이었으나 올해 1월 14일 1만5980원으로 한 달 만에 약 51.6% 오른 셈이다. 코로나19로 소비량이 대폭 감소해 지지부진한 판매량을 보이고 있는 배추 역시 신세계 포인트 적립 시 20% 할인 판매해 소비 촉진에 앞장선다. 이마트는 이번 농산물 할인 행사를 시작으로 농림축산식품부와 함께 순차적으로 친환경 농산물 등 다양한 할인 행사를 진행할 예정이다. 롯데마트와 롯데슈퍼 역시 오는 27일까지 전 점에서 '대한민국 농할 갑시다' 행사를 진행한다. 롯데마트 역시 최근 가격이 상승한 달걀과 무, 배추를 할인 품목으로 정해 실질적인 가계의 물가 안정에 기여할 것으로 기대하고 있다. '농할 갑시다' 행사는 엘포인트(L.Point) 회원이 롯데, 신한, 현대 등 7대 카드로 결제 시 적용된다. 행사 기간 동안 달걀을 20% 할인하며, 배추와 무도 20% 할인된 가격인 각 1260원에 판매한다. 달걀의 경우 1인당 3판 한정으로 판매할 계획이며, 배추와 무를 포함해 1인당 최대 할인 한도는 1만원이다. 롯데마트 정재우 상품본부장은 "최근 급격히 오른 물가 안정의 취지에 맞춰 농림축산식품부와 함께 이번 행사를 준비했다"며 "합리적인 가격에 우리 농산물을 구입할 수 있는 기회가 되길 바란다"고 말했다. - '상장회사법제 구축 공동 세미나 제1주제 상장회사법 제정에 관한 구상 Ⅴ 결론 경험적으로 볼 때 새로운 법을 제정한다는 게 매우 어려운 일이다. 특히 학계와 실무계에서 회사법을 상법에서 분리하여야 한다는 주장이 오랫동안 있어 왔지만 큰 반향을 불러일으키지 못한 것이 현실이다. 이러한 상황에서 대안으로 구상한 것이 바로 상장회사법의제정이다. 상장회사법의 경우 이미 2007년도에 「상장법인에 관한 법률」 제정안이 입법예고된 바 있다는 점에서 추후에도 그 제정을 위한 공감대를 이끌어 내기가 용이할 것으로 보인다. 본고는 2019년의 시점에서 다시 한번 상장회사법의 제정 필요성으로 다음과 같은 3가지이유를 제시하였다. 첫째, 급격하게 변화하는 자본시장에 유연하게 대응하여 다양하면서도 다수의 이해관계자를 보호하기 위해서는 상장회사법을 제정할 필요가 있다. 둘째, 현재 상법과 자본시장법에 나누어 규정되고 있는 상장회사에 대한 특례를 하나로 합하여 단행법화한다면 국내외의 수범자 입장에서의 편의성이 제고될 서이다. 셋째, 상장회사에 관련된 여러 법제가 산재하고 있어 체계적인 정합성이 미흡하므로, 이를 극복하기 위해서는 상장회사법의 제정이 필요하다. 단기적인 과제로서 외부감사법상 상장회사에 적용되는 규정을 상장회사법으로 이관한다면 도움이 될 것으로 보인다. 이에 비교법적인 차원에서 미국과 독일 및 일본과 태국법의 동향을 파악하였다. 미국은 상장회사에 관한 규정을 회사법에 일부 두면서도 동시에 전국규모의 증권거래소가 마련한 상장회사 규정에 의하여 상장회사의 지배구조가 보충적으로 규율되고 있다. 
독일은 주식법에서 상장회사와 비상장회사 모두를 다루고 있다. 일본의 경우에는 단행의 회사법을 두고 있지만, 상장회사에 대한 규제의 선진화를 위하여 2007년 상장회사에 적용될 공개회사법요강안을 발표한 바 있다. 태국은 세계적으로 유래를 찾기 어려운 입법으로서 공개주식회사법을 별도로 제정하여 시행하고 있는 국가이다. 본고는 기존에 정부가 마련한 「상장법인에 관한 법률」 제정안을 기본으로 하면서 위에 소개된 여러 국가의 입법례를 참고하여 상장회사법에 들어갈 규정들을 몇 가지 제시하였다. 본고에서 상장회사법에 편입되어야 할 추가적인 규정을 요약하여 정리하면 다음과 같다. 먼저 지배구조와 관련하여 주주명부의 폐쇄 및 기준일을 단축하는 규정, 이사에게 내부통제체제 구축의무를 부과하는 규정 및 사외이사의 회의를 강제하는 규정을 두어야 한다. 다음으로 현행 외부감사법에 있는 상장회사의 회계처리기준과 회계감사에 관한 규정을 상장회사법으로 이관한다. 마지막으로 합병 등과 같은 조직재편과 관련하여 몇 가지를 개선한다. 즉, 상장회사에서 합병 등에 반대하는 주주에 대해서는 주식매수청구권을 인정하지 않으며, 삼각합병에서 제3자에게 제공되는 합병의 대가가 모회사가 발행한 의결권있는 주식의 10%를 초과하는 경우에는 모회사 주주총회의 승인을 요구한다. 그리고 합병 등에 필요한 채권자보호절차와 관련하여 채권자에 대한 개별최고제도를 폐지함은 물론이고 채권자의 손해가 없다면 그에 대해서는 채권자보호절차를 마련하지 않는다.' - source_sentence: 어떤 방법으로 합동수사본부는 범죄유형별 대처시스템을 강화하려 할까 sentences: - '2. 형사사법업무처리의 전자화 (1) 형사사법정보시스템 정부는 2010년 「형사사법절차 전자화 촉진법」을 근거로 형사사법기관 간 전산시스템을 연계하고 전산정보를 공동 활용하는 동시에 대국민 형사사법정보서비스(사건 조회, 재판서 및 통지서 조회, 벌과금 조회, 민원신청, 범죄피해자 지원 등)를 제공하는 형사사법공통시스템(KICS)을 구축하였다. 경찰, 검찰, 법무부, 법원은 각각의 KICS 전용 서버를 설치하여 운영하고 있고 공통시스템 서버 운영과 각 기관 간 KICS 연계 업무는 형사사법공통시스템 운영단이 수행하고 있다. 아래의 차세대 형사사법정보시스템의 구성도(그림1)를 보면 노란색으로 표시된 재구축 부분은 기존의 2010년부터 형사사법정보시스템에서 제공되던 것을 기술적 업그레이드만 하는 것이다. 차세대 형사사법정보시스템(그림1)에서 파란색으로 표시된 신규 부분은 2024년을 목표로 하고 있는 빅데이터 분석 플랫폼 구축과 전자문서화 시스템 영역이다.' - '나. 
연기금의 분산투자 확대 1) 필요성 및 현황 □ 국민연금 등 연기금의 자산이 폭발적으로 늘어나고 있는 상황에서 특정자산에 집중해 투자할 경우 여러 가지 문제 초래 ― 국민연금 등 연기금들이 현재의 채권 위주 운용전략을 유지할 경우 시장에서의 지배력이 심각한 수준에 달할 전망 ⦁ 이창용(2004)은 2015년에도 국민연금 전체 자산 중 국내채권의 비중이 70%대 중반을 유지할 경우, 국내 채권시장에서 차지하는 국민연금의 비중이 20% 이상에 달할 것으로 전망 ⦁ 만약 해외채권과 해외주식의 비중을 크게 늘리면 국민연금의 국내자산시장에서 차지하는 비중이 크게 줄어들 것으로 전망 ― 국민연금 등 특정 기관투자자의 고등급 채권 위주의 운용 전략이 지속되어 채권시장에서 차지하는 비중이 커지면 다음과 같은 부작용 예상 ⦁ 국채수익률이 낮아지는 등 금리 왜곡 가능성 ⦁ 국민연금의 국채수요로 정부가 저비용으로 재정자금을 조달할 수 있기 때문에 재정규율이 약화될 가능성 ⦁ 고등급 채권 발행 주체인 대기업에 자본이 집중되고, 신성장 동력원이 되는 신생기업에 대한 자본공급 부진 가능성 ― 국민연금이 국내 주식에 대한 투자를 계속해서 늘려 그 비중이 급속도로 커질 경우에도 문제가 발생 ⦁ 국민연금 보유 주식의 가격이 하락하여 국민연금이 손절매에 나설 경우 일반투자자들의 투매를 초래해 시장이 불안해질 가능성 ⦁ 국민연금의 개별기업에 대한 지분율 상승으로 지배구조에 영향 ― 자산 축적기에 분산투자가 제대로 구축되지 않을 경우에는 향후 연금지급이 본격화되면 자산시장 왜곡을 초래할 가능성' - '제목 범정부 서민생활침해사범 근절 대책 추진 중간결과 □ 이번에 설치된 합동수사본부는 ○ 안전행정부, 미래창조과학부, 경찰청, 국세청, 금융위원회, 금융감독원, 사행산업통합감독위원회 등 유관기관이 참여하여, - 서민생활 침해사범에 대한 범정부 차원의 효율적인 예방․단속, 범죄수익 환수․탈루세액 징수, 피해자 보호 등과 관련하여 유관기관별 역할분담과 협업을 통하여 기관별 역량을 최대한 결집하고, 범죄유형별 대응시스템을 강화하였을 뿐만 아니라,관련 기관의 적극적 제도개선을 이끌어내는 계기를 마련하였음 ○ 서민생활침해사범 합동수사본부는 이후에도 1차 단속 과정에서 나타난 미흡한 점을 더욱 보완, 지속적인 단속활동을 전개하여 서민 피해자 보호, 불법수익의 철저한 환수, 탈루세액 징수에 박차를 가하는 한편, 불법차명물건(대포통장, 대포차, 대포폰 등)을 이용한 범죄 및 파밍, 스미싱 등 신종 사기 범죄로부터 서민을 보호하는 제도 개선에 더욱 노력하겠음.' - source_sentence: 용재 오닐이 참가하고 있는 현악기로 연주하는 사중주단은 뭐야 sentences: - 한국계 미국 비올리스트 리처드 용재 오닐이 ‘제63회 그래미 어워즈’에서 ‘베스트 클래시컬 인스트루먼털 솔로(Best Classical Instrumental Solo)’ 상을 받았다. 용재 오닐은 ‘그래미 어워즈’ 본 시상식에 앞서 한국시간으로 15일 진행된 사전 시상식 ‘프리미어 세리머니(Premiere Ceremony)’에서 이 부문 수상자로 호명됐다. 데이비드 앨런 밀러가 지휘하고 미국 알바니 심포니가 함께 연주한 테오파니디스의 비올라와 챔버 오케스트라를 위한 협주곡으로 영예를 안았다. 용재 오닐은 ‘디토 페스티벌’ 음악감독 등을 맡아 한국에서 클래식음악의 대중화에 기여했다. 세계적 현악 사중주단 ‘타카치 콰르텟’에 합류해 활약 중이다. - ‘안단테 칸타빌레(Andante Cantabile)’. 이탈리아어로 ‘노래하듯 천천히’라는 뜻이다. 현악사중주단 ‘아벨콰르텟’은 2021년을 그렇게 한발자국씩, 희망을 담아 걸어나가기로 했다. 최근 한국일보 사옥에서 만난 ‘아벨콰르텟’ 멤버 윤은솔(34ㆍ바이올린), 박수현(32ㆍ바이올린), 문서현(24ㆍ비올라), 조형준(34ㆍ첼로)은 “코로나19 때문에 천천히 걸을 수밖에 없고 잠깐 멈춰야 할 때도 있겠지만, 항상 앞으로 나아가려는 마음이 중요하다”며 “이왕이면 좋은 생각을 하면서, 천천히 노래하듯 나아가고 싶다”고 입을 모았다. 
‘아벨콰르텟’은 18일 광주 유ㆍ스퀘어문화관, 20일 서울 예술의전당에서 ‘안단테 칸타빌레’라는 이름의 네번째 정기연주회를 연다. 2013년 결성된 ‘아벨콰르텟’은 재작년 코로나19 만큼이나 큰 위기를 겪었다. 처음 팀을 만들 때 구심점 역할을 했던 비올리스트 김세준이 개인 사정으로 콰르텟을 떠나게 된 것. 오랜시간 합을 맞춰왔던 연주자의 공백은 컸다. 남은 3명이 “이대로 활동을 그만둬야 하나”하고 고민할 정도였다. 다행히 지난해 초 막내 문서현이 합류하면서 ‘아벨콰르텟’의 새 삶이 시작됐다. 문서현은 “관객으로 만났던 ‘아벨콰르텟’은 인간적이면서 따뜻한 소리가 기억에 남는 현악사중주단”이라며 “특정 장르에 머무르지 않고 다양한 시도를 하는 팀이어서 앞으로가 기대된다”고 했다. 이달 공연의 첫곡인 슈베르트 현악사중주 12번은 ‘아벨콰르텟’의 새출발을 알리는 신호탄이다. 오스트리아 빈에서 기존 멤버들과 만난 문서현이 처음으로 합주한 작품이기도 하다. 조형준은 “단악장의 짧은 곡이지만 몰아치는 감정의 소용돌이가 어마어마하다”면서 “지금까지 주로 고전시대 작품을 많이 했다면 이번에는 낭만주의 색깔을 마음껏 보여줄 수 있을 것”이라고 말했다. 뒤이어 펼쳐지는 멘델스존의 현악사중주 6번은 작곡가의 슬픔과 격정이 “제한 없이 드러나는” 곡이다. 멘델스존이 사랑하는 누나의 죽음을 접한 직후 쓴 곡으로 알려져 있다. 윤은솔은 “다른 팀원들이 이 곡을 공연해보자고 계속 제안했는데 지금까지 자신이 없어서 미루고만 있다가 최근 어떤 계기를 통해 연주할 힘을 얻었다”며 “마음의 준비가 됐기에 완성도 높은 음악을 들려드리고 싶다”고 했다. 마지막은 실내악 명작으로 꼽히는 차이코프스키의 현악사중주 1번이다. ‘안단테 칸타빌레’라는 제목이 붙은 2악장이 특히 유명한데, 이번 공연의 이름과 같다. 박수현은 이 곡을 두고 “따뜻하고 달콤한 유럽 크리스마스의 향기가 난다”고 표현했다. 박수현은 “차이코프스키는 아무런 음악적 지식이 없는 사람도 듣기만 하면 아름다움을 느낄 수 있는 곡들을 썼는데, 현악사중주 1번에도 그런 철학이 잘 담겨 있다”고 말했다. - 어느덧 내 나이 팔순을 지나간다. 최근에 팔순을 맞은 한 대학 동창이 예배시간에 이런 회고사를 했다. “목회에서 은퇴한 뒤 그동안 만났던 성도들을 떠올리며 기도하고 있습니다. 어떤 날은 새벽 5시30분 시작된 기도가 오전 10시까지 이어져 아침식사를 거르기도 했습니다.” 대학 동기 모임에서 또 다른 친구가 김홍도 목사(전 금란교회 담임목사)의 옆구리를 쿡 찌르면서 농담을 건넸다. “홍도야, 예배시간에 그 친구가 한 얘기 들었지? 성도를 위해 기도하느라고 아침도 못 먹었단다. 너는 성도가 수만 명이나 되는데 하루 종일 밥숟가락을 뜰 수나 있겠나.” 예전에는 인간이 강건하면 수명이 팔십이라고 했다. 난 그 나이만큼 살고 있으니 감사하다. 난 태어나서 네 살이 될 때까지 앞을 보지 못했다. 결핵성 관절염을 앓아 작대기를 짚고 학교에 다녔다. 또 건강이 좋지 않고 집이 가난해 행복한 꿈을 꿀 수 없었다. 그렇게 부족한데도 하나님께선 날 아껴주시고 목회를 완주할 수 있게 해주셨다. 하나님은 내게 좋은 부모님을 주셨다. 부모님은 내게 여호와에 대한 경외심을 삶으로 가르쳤다. 나 역시 부모님의 뒤를 따라 복음과 양심을 지키려 노력했다. 그리고 내 자녀들이 바통을 이어 목회의 길을 가고 있다. 참으로 은혜로운 일이다. 하나님은 또 내게 신앙심 깊은 아내를 허락하셨다. 아내의 소원이 참 재밌다. 하나님 앞에서 ‘난 목사의 며느리였고 목사의 아내였고 목사의 어머니였고 목사의 할머니였다’는 소리를 하고 싶다는 것이다. “여호와를 경외하며 그의 길을 걷는 자마다 복이 있도다. 네가 네 손이 수고한 대로 먹을 것이라 네가 복되고 형통하리로다. 네 집 안방에 있는 네 아내는 결실한 포도나무 같으며 네 식탁에 둘러앉은 자식들은 어린 감람나무 같으리로다. 
여호와께서 시온에서 네게 복을 주실 것이며 너는 평생에 예루살렘의 번영을 보며 네 자식의 자식을 볼지어다.(시 128)” 성경에는 복이란, 손이 수고한 대로 소득을 얻는 것이라고 돼 있다. 복은 곧 아내와 아이들이 한 식탁에 둘러앉아 밥을 먹는 것이요, 자식의 자식 곧 손주를 보는 것이다. 생각하면 누구나 받는 싱거운 복 같지만 깊이 생각할수록 아무나 받을 수 없는 것이기도 하다. 요즘 젊은이들이 일자리나 결혼, 출산 등의 문제로 고통 받는다고 하니 평범해 보이는 일상은 실상 비범한 일이다. 하나님께서 내게 여호와를 경외하는 아내는 물론 감람나무 같은 자식들과 그들의 자식까지 보는 기쁨을 주셨다. 모두 하나님의 은혜다. 아버지는 평생 강원도 산골을 걸어 다니며 목회하셨다. 난 목회 초반 자전거를 탔다. 내 자녀들은 자동차를 타고 목회를 하고 있다. 아마도 손주들은 비행기를 타고 다니며 목회할 것이다. 나는 엘리야의 때에 바알에게 무릎 꿇지 않은 7000인의 사람들이 있었던 것처럼 오늘날에도 그런 천연기념물 같은 성도가 있다고 믿는다. 역경의 열매는 역경을 이긴 자들의 것이다. 이 시간에도 천연기념물 같은 주의 종들이 역경을 묵묵히 감내하고 있을 것이라 믿는다. 그들이 지나간 자리에 한층 더 커지고 환해지고 깨끗해지고 튼튼해지고 안전해진 주님의 교회가 있길 바라며 기도한다. - source_sentence: 연제구의 도시미화와 저소득층 노인들의 일자리 창출에도 도움이 되는 게 뭐야 sentences: - 연제구(구청장 이성문)는 지난해에 이어 ‘2021년 불법 유동광고물 수거보상사업’을 시행한다. ‘불법 유동광고물 수거보상사업’은 도시미관을 해치는 불법 광고물을 근절하기 위해 사업 참여자가 불법 유동광고물을 수거하여 구청 도시재생과에 가져가면 구청에서는 보상금을 지급하는 사업이다. 구는 1월 11일부터 15일까지 연제구민 중 만 60세 이상 저소득 어르신을 대상으로 신청을 받아 총 50명을 수거보상사업 참여자로 선발하였다. 참여자로 선발된 어르신들은 오는 2월부터 시작하는 수거 활동에 앞서 연제구청 구민홀에서 불법 유동광고물 구분 기준, 수거 방법, 수거 시 안전 수칙 등에 대해서 사전 교육을 받았으며 수거활동 중 발생할 수 있는 안전사고에 대비해 단체 보험에도 가입했다. 불법 광고물 정비에 주민들이 참여할 수 있는 기회를 제공하여 주민들로부터 불법 광고물에 대한 경각심을 제고 할 수 있을 것으로 기대된다. 구 관계자는 “이번 사업을 통해 주민과 함께 품격 있는 연제구를 만드는 데 일조하고, 저소득 어르신의 실버 일자리 창출에도 기여할 것으로 기대된다”고 말했다. - "이시종 충북도지사는 24일 오후 2시 도청 대회의실에서 국가철도망 구축계획에 충북도 철도사업의 반영을 위한 민ㆍ관ㆍ정 간담회를 개최했다.\ \ 이날 간담회에는 이 지사를 비롯해 한범덕 청주시장, 송기섭 진천군수, 조병옥 음성군수, 이장섭, 임호선 국회의원과 박문희 도의회 의장,\ \ 최충진 청주시의회 의장, 김성우 진천군의회 의장, 최용락 음성군의회 의장 등 지역 정치권 관계자와 민간사회단체총연합회 유철웅 회장 등 민간사회단체\ \ 관계자까지 34명이 참석했다. 이번 간담회에서는 4차 국가철도망구축계획 공청회가 얼마 남지 않은 중요한 시점에서 그동안 철도사업의 국가계획\ \ 반영 추진상황을 공유하고, 각 기관과 단체별 참여 방안과 도민의 힘을 모으기 위한 다양한 방안이 논의됐다. 도는 청주도심을 통과하는 충청권\ \ 광역철도, 수도권에서 진천 국가대표선수촌과 혁신도시를 거쳐 청주공항을 잇는 수도권내륙선 광역철도, 음성 감곡에서 혁신도시를 거쳐 청주공항을\ \ 잇는 중부내륙선 지선 등의 노선을 국가계획에 반영하기 위해 집중하고 있다. 이 지사는 \"해당 철도노선을 국가계획에 반영해야만 추진할 수\ \ 있기 때문에 우선은 반영을 목표로 최선을 다해야 한다\"며 \"많은 도민들의 공감대와 적극적인 지지가 필요한 때인 만큼 참석자들이 구심점\ \ 역할을 해 줄 것\"을 당부했다. 
도 관계자는 \"국가철도망계획은 10년 단위 계획으로 전국 지자체가 각자의 사업 반영을 위해 각축전을\ \ 벌이는 상황\"이라며 \"충북도 사업이 최대한 반영될 수 있도록 최선을 다하겠다\"고 말했다. \n\n " - '4. 나가며 노인복지주택이 지속가능한 노인복지정책이 되기 위해서는 사업시행자에게는 경제적으로 이득이 되고, 정책대상인 노인가구에게도 주거생활에 실질적인 도움을 줄 수 있어야 할 것이다. 그러나 그간 노인복지주택에의 사업시행자는 건설부지 및 부대시설 기준완화, 조세감면 등 각종 혜택을 받아 경제적 이득을 실현한 반면, 정책대상가구인 노인가구는 입소자격 제한규정으로 재산권 행사에 많은 불편을 겪어왔다. 이러한 정책집행 의지와 현실 간 괴리 속에서 다수의 노인복지주택에서 입소자격이 없는 자가 탈법적으로 입주하는 행위가 발생해온 것이다. 다음과 같은 측면에서도 노인복지주택정책에 대한 면밀한 검토가 필요하다. 첫째, 노인복지주택이 용도상 자연경관이 우수한 녹지지역 혹은 기반시설이 확보되지 않은 지역에도 건축될 수 있어 국토난개발을 유발할 가능성이 크다. 둘째, 보다 근본적으로 노인복지주택과 같이 노인들만 거주하는 주택이 노인복지 측면에서 바람직한지를 검토할 필요가 있다. 우리나라와 같이 급격한 고령화를 경험하고 있는 일본의 경우, 젊은 세대와 노인 세대가 함께 거주하는(age-mix) 정책이 중요하게 인식되고 있기 때문이다. 현행 노인복지주택 입소자자격 등은 노인의 주거복지증진과 행복추구에 부정적인 영향을 끼치고 있다는 점을 볼 때, 현행의 노인복지주택정책을 지속시키는 것이 실익이 있는지에 대한 면밀한 검토가 필요한 시점이다. 이를 위해 향후 공급되는 분양형 노인복지주택제도를 폐지하고, 노인복지주택을 「주택법」 체계 내로 흡수하는 방안을 적극적으로 검토할 필요가 있을 것이다.' model-index: - name: SentenceTransformer based on BM-K/KoSimCSE-bert-multitask results: - task: type: triplet name: Triplet dataset: name: eval type: eval metrics: - type: cosine_accuracy value: 0.9539 name: Cosine Accuracy - type: dot_accuracy value: 0.0587 name: Dot Accuracy - type: manhattan_accuracy value: 0.9496 name: Manhattan Accuracy - type: euclidean_accuracy value: 0.9518 name: Euclidean Accuracy - type: max_accuracy value: 0.9539 name: Max Accuracy --- # SentenceTransformer based on BM-K/KoSimCSE-bert-multitask This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BM-K/KoSimCSE-bert-multitask](https://huggingface.co/BM-K/KoSimCSE-bert-multitask). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
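The triplet accuracies reported in this card measure how often an anchor sentence's embedding is more similar to its positive than to its negative. A minimal pure-Python sketch of the cosine variant, with made-up 3-dimensional embeddings (the real model produces 768-dimensional vectors):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def triplet_accuracy(triplets):
    """Fraction of (anchor, positive, negative) triplets where the
    anchor is closer (by cosine similarity) to the positive."""
    hits = sum(1 for a, p, n in triplets if cosine(a, p) > cosine(a, n))
    return hits / len(triplets)

# Made-up embeddings for illustration only.
triplets = [
    ([1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.0, 1.0, 0.0]),  # anchor nearer positive
    ([0.0, 1.0, 0.0], [0.0, 0.8, 0.2], [0.1, 0.9, 0.0]),  # anchor nearer negative
]
print(triplet_accuracy(triplets))  # → 0.5
```

The Manhattan and Euclidean accuracies below are computed the same way, just with distances (smaller is better) in place of cosine similarity.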
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [BM-K/KoSimCSE-bert-multitask](https://huggingface.co/BM-K/KoSimCSE-bert-multitask) <!-- at revision 3aa54365eb9557f3b8ee1e39cff87306451abfae --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 tokens - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("Atipico1/simcse-12000") # Run inference sentences = [ '연제구의 도시미화와 저소득층 노인들의 일자리 창출에도 도움이 되는 게 뭐야', '연제구(구청장 이성문)는 지난해에 이어 ‘2021년 불법 유동광고물 수거보상사업’을 시행한다. ‘불법 유동광고물 수거보상사업’은 도시미관을 해치는 불법 광고물을 근절하기 위해 사업 참여자가 불법 유동광고물을 수거하여 구청 도시재생과에 가져가면 구청에서는 보상금을 지급하는 사업이다. 구는 1월 11일부터 15일까지 연제구민 중 만 60세 이상 저소득 어르신을 대상으로 신청을 받아 총 50명을 수거보상사업 참여자로 선발하였다. 
참여자로 선발된 어르신들은 오는 2월부터 시작하는 수거 활동에 앞서 연제구청 구민홀에서 불법 유동광고물 구분 기준, 수거 방법, 수거 시 안전 수칙 등에 대해서 사전 교육을 받았으며 수거활동 중 발생할 수 있는 안전사고에 대비해 단체 보험에도 가입했다. 불법 광고물 정비에 주민들이 참여할 수 있는 기회를 제공하여 주민들로부터 불법 광고물에 대한 경각심을 제고 할 수 있을 것으로 기대된다. 구 관계자는 “이번 사업을 통해 주민과 함께 품격 있는 연제구를 만드는 데 일조하고, 저소득 어르신의 실버 일자리 창출에도 기여할 것으로 기대된다”고 말했다.', '4. 나가며\n노인복지주택이 지속가능한 노인복지정책이 되기 위해서는 사업시행자에게는 경제적으로 이득이 되고, 정책대상인 노인가구에게도 주거생활에 실질적인 도움을 줄 수 있어야 할 것이다. 그러나 그간 노인복지주택에의 사업시행자는 건설부지 및 부대시설 기준완화, 조세감면 등 각종 혜택을 받아 경제적 이득을 실현한 반면, 정책대상가구인 노인가구는 입소자격 제한규정으로 재산권 행사에 많은 불편을 겪어왔다. 이러한 정책집행 의지와 현실 간 괴리 속에서 다수의 노인복지주택에서 입소자격이 없는 자가 탈법적으로 입주하는 행위가 발생해온 것이다. 다음과 같은 측면에서도 노인복지주택정책에 대한 면밀한 검토가 필요하다. 첫째, 노인복지주택이 용도상 자연경관이 우수한 녹지지역 혹은 기반시설이 확보되지 않은 지역에도 건축될 수 있어 국토난개발을 유발할 가능성이 크다. 둘째, 보다 근본적으로 노인복지주택과 같이 노인들만 거주하는 주택이 노인복지 측면에서 바람직한지를 검토할 필요가 있다. 우리나라와 같이 급격한 고령화를 경험하고 있는 일본의 경우, 젊은 세대와 노인 세대가 함께 거주하는(age-mix) 정책이 중요하게 인식되고 있기 때문이다. 현행 노인복지주택 입소자자격 등은 노인의 주거복지증진과 행복추구에 부정적인 영향을 끼치고 있다는 점을 볼 때, 현행의 노인복지주택정책을 지속시키는 것이 실익이 있는지에 대한 면밀한 검토가 필요한 시점이다. 이를 위해 향후 공급되는 분양형 노인복지주택제도를 폐지하고, 노인복지주택을 「주택법」 체계 내로 흡수하는 방안을 적극적으로 검토할 필요가 있을 것이다.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. 
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Triplet * Dataset: `eval` * Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) | Metric | Value | |:-------------------|:-----------| | cosine_accuracy | 0.9539 | | dot_accuracy | 0.0587 | | manhattan_accuracy | 0.9496 | | euclidean_accuracy | 0.9518 | | **max_accuracy** | **0.9539** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 473,130 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 9 tokens</li><li>mean: 23.09 tokens</li><li>max: 88 tokens</li></ul> | <ul><li>min: 34 tokens</li><li>mean: 355.74 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 55 tokens</li><li>mean: 338.33 tokens</li><li>max: 512 tokens</li></ul> | * Samples: | anchor | positive | negative | 
|:------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------| | <code>국회세종의사당 건립을 위한 법안을 누가 새로 제안했어</code> | <code>국민의힘 정진석 의원(공주·부여·청양)이 국회세종의사당 설치를 위한 '국회법' 일부개정법률안을 대표발의했다고 21일 밝혔다. 국회와 세종시 정부청사와의 물리적인 거리로 세종시 공무원의 관외 출장비는 3년간 917억원에 달한다. 출장횟수는 87만회에 달하고 있어 업무 불편과 비효율성 심화는 물론 정책 질 저하도 우려되는 실정이다. 개정안은 △서울시에 국회서울의사당을, 세종시에 국회세종의사당을 두도록 하고 △상임위원회는 국회세종의사당에 두는 것으로 하되, 국회운영위원회와 정보위원회 및 세종시로 이전하지 않은 부(部)를 소관하는 상임위원회는 국회서울의사당에 둘 수 있도록 했다. 행복도시법에 따른 이전 제외 대상 부처는 외교부, 통일부, 법무부, 국방부, 여성가족부 등 5곳이다. 또 예산결산특별위원회와 국회예산정책처는 세종시에 두도록하고 국회사무처, 국회도서관, 국회입법조사처는 국회세종의사당에 별도 기관을 둘 수 있도록 했다. 정진석 의원은 "여야 합의로 세종의사당 설계비 147억원이 확정됐고 지난 2월 국회 운영위원회 공청회에서 나온 의견들을 다듬어 법의 완성도를 높인 개정안인 만큼, 여야 합의를 통해 21대 국회 임기 중에 첫 삽을 뜰 있도록 최선을 다하겠다"고 말했다.</code> | <code>새로 들어온 법률안 등 - 2014. 5. 15. 의안접수현황 -<br>□ 국회사무처(사무총장직무대리 임병규)는 2014년 5월 15일(목) 전병헌의원 등18인이 발의한 “방송법 일부개정법률안”, 서청원의원 등 12인이 발의한 “세월호 4ㆍ16사고 반성과 진상조사 및 국가재난방지체계 혁신을 위한 특별법안” 등 12건의 법률안과 “국회의원(남경필) 사직의 건”을 포함하여 총 13건의 의안이 접수되었다고 밝혔다. 접수된 의안 중 법률안은 앞으로 미래창조과학방송통신위원회 등 소관 위원회에 회부되어 심사될 예정이다. □ 어제 접수된 법률안 중 주요내용을 소개하면 다음과 같다. - 방송법 개정안(전병헌의원 대표발의): 홈쇼핑 사업자가 그 지위를 이용하여 납품업체에게 불공정 계약을 강요하거나 부당이익을 취득한 경우 허가취소나 업무 정지 등의 제재를 할 수 있도록 하려는 것임. - 세월호 4ㆍ16사고 반성과 진상조사 및 국가재난방지체계 혁신을 위한 특별법안(서청원의원 대표발의): 세월호 참사에 대한 진상조사, 피해자 보상ㆍ배상 및 지원 대책, 재발방지 대책을 심의ㆍ의결하기 위하여 국회에 특별위원회를 구성하도록 하려는 것임.</code> | | <code>어떤 국가가 과반수의 표를 확보해야 중의원 의장이 될 수 있게 정해 놨어</code> | <code>3. 일본<br>가. 의장 선출<br>□ 중의원 의장은 중의원 선거 후 최초 집회일에 열리는 회의에서 중의원 의원의 무기명 투표에 의해 직선으로 선출됨(「국회법」제6조, 「중의원규칙」제3조, 제8조)<br>○ 의장 선거는 전체 의원의 3분의 1 이상이 참석해서 사무총장이 의장의 직무대행을 하여 실시함(「국회법」제7조)<br>○ 의장으로 당선되기 위해서는 과반수 득표를 해야 하므로, 1차 투표에서 총 투표수의 과반수를 획득한 득표자가 없으면, 최다득표자 2인에 대하여 결선투표를 실시함<br>○ 결선투표에서 두 후보자의 득표수가 같은 경우 추첨으로 당선인이 결정됨<br>○ 중의원 의장의 임기는 4년이나, 임기 중에 해산에 의해서 모든 중의원 의원이 지위를 잃으면, 의장도 그 지위를 상실함<br>□ 의장 선거절차는 중의원 규칙에서 정하는 것 외에, 제1회 일본 제국의회에서 정한 ‘의장 후보자 선거절차수칙’을 따르고 있음 ○ 의장 선출은 선거 전에 각 회파(교섭단체에 해당) 간의 대화로 미리 후보자가 결정되지만, 공식절차상으로는 각 의원의 본회의에서 선거로 선임함<br>○ 1970년대 중반까지는 집권여당이 국회의장과 부의장을 모두 독점하였으나, 제79회 국회(1976년) 이후 중의원 국회의장은 여당 제1당에서, 국회부의장은 야당 제1당(제2당)에서 선출하는 관행이 정착되었음. 
그러나 1993년 연립정권 성립 후에는 중의원에서 자민당이 제1당이면서 여당이 아니었기 때문에 여당 간에 의장직을 둘러싸고 다툼이 있었음</code> | <code>이에 반해 비례대표제에는 중선거구제 내 선호순위를 표시하는 단기이양투표(single transferable vote, 예: 아일랜드, 몰타), 정당이 후보 명부를 제시하고 당선자를 득표율대로 결정해나가는 명부식 비례대표제(list proportional representation system, 예: 벨기에, 스웨덴, 덴마크 등)가 있다. 대표적인 명부식 비례대표제는 다시 전국을 하나의 선거구로 사용하는 전국통합구제도(이스라엘, 네덜란드)와 선거구를 권역별로 나누되, 불비례성을 전국구 의석으로 보정하는 권역다층선거구제도(스웨덴, 핀란드, 포르투갈, 스페인) 등으로 나뉜다. 이러한 명부식 비례제는 기계적 효과나 제조된 과반 효과가 없고 비례성이 매우 높은 특징을 지닌다. 군소정당도 당선자를 배출할 수 있고 대표성도 향상된다. 원내 정당의 난립을 막기 위해 봉쇄조항(threshold, 3~5%의 정당득표율)을 두기도 한다.</code> | | <code>1분기 코로나 예방접종을 약 2000명에게 시행할 건 누구야</code> | <code>부산 동래구(구청장 김우룡)는 코로나19 예방접종의 차질 없는 추진을 통한 빠른 일상회복을 위해 코로나19 예방접종 계획을 마련하고, 이달 말 1분기 대상자 2000여명을 대상으로 ‘코로나19 예방접종’을 시작한다고 밝혔다. 코로나19 예방접종 추진기간은 인플루엔자 유행시기인 11월 이전까지로, 접종대상은 18세 이상 전 구민이며, 임신부 및 만 18세 미만 소아·청소년, 65세 이상 고령자는 임상시험 결과에 따라 추후 접종 여부 및 시기가 결정된다. 동래구는 △과학적 근거를 기반으로 안전하고 효과적인 접종 추진 △코로나19의 사망예방 및 지역 사회 전파 차단을 위하여 전 구민의 70%인 189천여 명을 목표로 예방접종을 추진할 계획이다 1분기 우선 접종대상자는 △요양병원·요양시설입원·입원자, 종사자 △고위험 의료기관종사자, 코로나 1차 대응요원 △정신요양·재활시설 등 입소자·종사자 등 2000여 명이며, 백신 배송 등 일정을 조율해 26일부터 병원은 자체접종, 시설은 보건소 방문팀·시설별 협약의료기관 또는 계약된 의사가 방문 접종할 계획이다. 단계별 예방접종 기관은 △7월 개소 예정인 예방접종센터(사직실내체육관) △위탁의료기관 △방문접종 △자체접종 △내소접종을 병행하며, 위탁의료기관 정보는 질병관리청 코로나19 백신 및 예방접종 홈페이지에서 확인할 수 있다. 또한 동래구는 지난 4일 코로나19 예방접종 추진단을 운영 중이며, 22일 민·관·군과 병협·의협·간협 및 민간 등으로 구성된 민-관 협력체계인 ‘동래구 코로나19 예방접종 지역협의체’를 발족하여 전 구민의 코로나19 예방접종의 차질 없는 추진을 위해 최선을 다하고 있다. 김우룡 동래구청장은 “코로나19 예방접종은 전 국민 대상의 대규모 사업으로 관의 철저하고 꼼꼼한 계획과 함께 주민과 유관기관의 협조가 반드시 필요하다”며 “안전하고 신속한 예방접종을 추진을 위해 최선을 다하겠다”고 말했다.</code> | <code>문재인 대통령과 김정숙 여사가 오는 23일 아스트라제네카의 코로나19 백신을 공개 접종한다. 또 일반인에 대한 백신 접종 시기가 빨라지고, 교사의 경우 2분기에 접종을 받는다. 강민석 청와대 대변인은 15일 브리핑에서 문 대통령 부부의 백신 접종 계획을 설명하면서 “오는 6월 영국 G7(주요 7개국) 정상회의 참석, 즉 필수목적 출국을 위한 것”이라며 “질병관리청의 예방 접종 절차에 따른 것”이라고 설명했다. 공무 등으로 해외 출장을 하는 공무원은 우선 접종 대상이다. 강 대변인은 “문 대통령이 우선 접종하는 것은 일각의 안정성, 효과성 논란을 불식시키고 솔선수범하겠다는 의미”라고 덧붙였다. 3분기 예정이었던 일반인들에 대한 접종 시기도 빨라져, 고령층에 대한 접종이 4월부터 시작된다. 
15일 코로나19 예방접종 대응추진단에 따르면 2분기 코로나19 백신 예방접종은 △요양병원 및 요양시설 △코로나19 취약시설 입소자 및 종사자 △65세 이상 어르신 △학교 및 돌봄 공간 △만성질환자 △보건의료인과 사회필수인력 등 6개군을 대상으로 진행한다. 이에 따라 4월 첫 주부터 75세 이상 어르신 364만 명에 대한 접종이 예방접종센터에서 실시된다. 65세부터 74세까지의 494만여 명은 6월부터 위탁의료기관에서 접종이 이뤄질 예정이다. 학교 돌봄 공간도 2분기 접종 대상이다. 4월 중 특수교육과 장애아보육 5만 1000명, 유치원과 학교 내 보건교사와 어린이집의 간호인력 1만 3000명에 대한 접종이 이뤄진다. 6월에는 유치원과 어린이집, 초등학교 1‧2학년을 담당하는 교사, 교직원, 관련 종사자 49만 1000명이 단계별로 접종을 받는다. 노인‧장애인‧노숙인시설 등의 거주‧이용시설 접종도 2분기 중 진행할 예정이지만, 아직 정확한 시기는 미정이다. 한편 15일 0시 기준 부산의 코로나19 예방백신 접종자는 4만 5897명으로 우선 접종 대상자 6만 310명의 72.8%가 접종을 마쳤다. 근육통, 발열 등 이상 반응 사례는 모두 589건이다.</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### Unnamed Dataset * Size: 10,000 evaluation samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 9 tokens</li><li>mean: 22.86 tokens</li><li>max: 115 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 351.34 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 93 tokens</li><li>mean: 346.69 tokens</li><li>max: 512 tokens</li></ul> | * Samples: | anchor | positive | negative | 
|:--------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>릴레이로 이번주 TV와 라디오 방송 출연을 확정한게 누구야?</code> | <code>▲ 사진=롯데엔터테인먼트 제공 영화 완벽한 타인의 주역 유해진, 조진웅, 이서진, 염정아가 릴레이로 이번주 TV와 라디오 방송 출연을 확정했다. 완벽한 타인은 완벽해 보이는 커플 모임에서 한정된 시간 동안 핸드폰으로 오는 전화, 문자, 카톡을 강제로 공개해야 하는 게임 때문에 벌어지는 예측불허 이야기를 담은 작품이다. 완벽한 타인에서 완벽한 연기를 펼친 배우들은 이번 주 릴레이로 TV와 라디오 방송 출연을 확정하며 열일 행보를 펼친다. 먼저 오는 24일 오후 7시 MBC FM영화음악 한예리입니다에는 유해진과 염정아가 함께 출연한다. 간첩, 전우치에 이어 세 번째로 함께 호흡을 맞춘 두 사람은 이번 라디오 출연에서 영화에 대한 이야기를 나누며 걸출한 입담과 절친 케미스트리를 선보일 것으로 보인다. 
이어 이번 영화로 처음 만나 절친이 된 유해진, 조진웅, 이서진이 25일 오후 11시 10분 KBS2 해피투게더4에 출연한다. 세끼 인연 유해진과 이서진, 그리고 조진웅의 예능감이 유감없이 발휘될 예정이다. 마지막으로 26일에는 MBC 배철수의 음악캠프에서 이서진을 만날 수 있다. 완벽한 타인에서 가장 파격적인 연기 변신을 선보인 그는 음악캠프 특별 DJ로 활약했던 인연으로 이번 출연이 성사됐다. 이서진은 거침없는 언변으로 영화 완벽한 타인의 현장 비하인드 스토리를 밝힐 예정이다. 한편 완벽한 타인은 오는 31일 개봉을 앞두고 있다.</code> | <code>그룹 세븐틴이 미국 간판 토크쇼 ‘엘렌 쇼’에 첫 출연을 확정 지었다. 세븐틴은 다음 달 1일(현지 시각) 방송되는 미국 토크쇼 ‘엘렌 드제너러스 쇼’(이하 엘렌 쇼)에 첫 출연을 확정 지어 전 세계 팬들의 폭발적인 반응을 얻었다. 이날 방송에서 세븐틴은 지난 2019년 8월 발매한 디지털 싱글 ‘HIT’ 무대를 선보인다. ‘HIT’는 제목처럼 타격감이 느껴지는 사운드와 세븐틴의 폭발적인 에너지가 그대로 전해지는 강렬한 EDM 장르의 댄스곡으로 발매와 동시에 국내는 물론 해외에서도 큰 사랑을 받았다. ‘엘렌 쇼’는 미국 유명 코미디언이자 작가, 배우 등 멀티 엔터테이너인 엘렌 드제너러스가 진행하는 토크쇼로 브루노 마스, 두아 리파, 존 레전드, 저스틴 비버 등 세계적인 팝스타들이 대거 출연해 화제를 모았으며 미국의 데이타임 쇼 중 높은 인기를 보유하고 있는 프로그램이다. 앞서 세븐틴은 지난 1월 방송된 미국 CBS ‘제임스 코든 쇼’와 NBC ‘켈리 클락슨 쇼’에 연달아 출연해 스페셜 앨범 타이틀곡 ‘HOME;RUN’과 미니 7집 타이틀곡 ‘Left & Right’의 무대를 선사, 막강한 글로벌 영향력을 확인 시켜 주며 전 세계 팬들과 해외 유수 매체의 호평 세례를 받았다. 이렇듯 세븐틴은 스토리텔링이 담긴 완성도 높은 무대와 세븐틴만이 할 수 있는 퍼포먼스를 선보여 ‘K팝 퍼포먼스 강자’라는 칭호를 얻는 등 전 세계를 열광시킨 바 있어 이번 ‘엘렌쇼’에서 어떤 무대를 선보일지 기대감이 치솟고 있다. 한편 세븐틴이 출연하는 미국 토크쇼 ‘엘렌 쇼’는 다음 달 1일(현지 시각)에 만나볼 수 있다.</code> | | <code>벡터맵 빈 분류 기반의 제안기법은 무엇에 비하여서 압출효율이 높다는 것을 표 4에서 알 수 있어?</code> | <code><h1>IV. 실험 결과</h1><p>제안한 빈 분류기반 벡터맵 압축 기법에 대한 성능 평가를 위한 실험을 수행하였다. 실험을 위해 그림 10 과 같이, \( 10 \mathrm{~km} \times 10 \mathrm{~km} \) 의 국부 영역을 갖는 벡터맵 레이어를 생성하였으며, 이 중 폴리곤으로 구성된 '건물' 레이어와 폴리라인으로 구성된 '일반도로’ 레이어에 대해 각각의 실험을 수행하였다. 또한 TM 좌표계에 의해 표현되는 실측치 \( 1 \mathrm{~cm} \) 이내의 오차를 갖도록 식 (1)의 \( c=100 \) 으로 설정하여 정밀 벡터맵 압축에 대해 결과를 도출하였다. 또한 \( 10 \mathrm{~km} \times 10 \mathrm{~km} \) 영역에서 \( 1 \mathrm{~cm} \) 정밀도를 갖는 벡터맵 데이터의 최적의 압축효율을 위해, 실험적으로 dist \( _{D B}=10 \mathrm{~m} \) 및 dist \( { }_{A B}=0.64 \mathrm{~m} \) 로 결정하였다.</p><p>제안 기법의 객관적 비교를 위해 일반적인 데이터 압축기법으로서 7-zib 알고리즘, 대표적인 벡터 간소화 알고리즘으로서 Douglas-Peucker 알고리즘[16] 및 기존의 공간 에너지집중 기반에서의 압축 알고리즘등과 압축 결과를 비교하였다. 
표 4 에 각각의 알고리즘 에 대한 압축 결과를 나타내었다.</p><p>표 4 의 결과로부터 벡터맵의 특성을 고려하지 않은 7-Zip과 비교하였을 때, 각 좌표점들의 오차범위로 \( 0.01 \mathrm{~m} \) 미만을 갖는 벡터맵 빈 분류 기반의 제안 기법이 월등히 높은 압축효율을 가짐을 확인하였다. 한편, 벡터 간소화 기법을 사용하는 Douglas-Peucker 알고리즘과 제안 알고리즘은 압축원리가 상이하므로 RMSE(root mean square error) 등의 방법을 통한 직접적인 비교는 어렵다. 또한 제안 기법과의 비교를 위해 Douglas-Peucker 알고리즘의 정밀도 범위 \( \epsilon=0.01 \mathrm{~m} \) 로 설정하게 되면, 각 좌표점들의 간소화 조건이 대부분 만족하지 않으므로 실제 간소화를 통한 압축은 거의 이루어지지 않는다. 따라서 그림 10의 벡터맵 레이어에서 시각적으로 용인할 수 있을 것으로 간주되는 적정 임계치 \( \epsilon=1 \mathrm{~m} \) 로 설정하여 압축을 수행하였다. 표 4의 실험 결과는 이때의 압축 결과를 나타낸 것이다. 그림 11은 벡터맵을 확대하였을 때, 표 4 의 압축 효율에 대해 제안 알고리즘과 Douglas-Peucker 알고리즘의 시각적 오차를 비교한 것이다.</p><p>표 4와 그림 11로부터 제안 기법이 Duglas-Peucker 알고리즘보다 월등히 적은 시각적 오차를 가짐에도 불구하고 보다 높은 압축효율을 나타냄을 확인할 수 있다. 더욱이, 표 4에서 Duglas-Peucker 알고리즘의 특성상 연속한 좌표점들이 급격히 꺽히는 오브젝트들의 집합인 '건물' 레이어에서 압축효율의 저하가 발생한다. 반면. 제안 기법은 Duglas-Peucker 알고리즘에서와 같은 압축효율의 저하는 발생하지 않음을 확인하였다. 공간영역에서의 에너지 집중(SEC)을 이용한 기존방법과의 비교에서 역시 제안 알고리즘이 보다 우수한 압축 효율을 가짐을 알 수 있었다. 또한 에너지 집중 이후 실질적 데이터 압축을 위한 엔트로피 코딩으로써 zlib 또는 7-zip 알고리즘을 이용하는 기존 기법과는 달리, 제안 기법은 압축 과정의 일부로써 정의된 단순한 허프만 테이블을 참조하므로 계산 복잡도에서 큰 이점을 얻을 수 있다. </p></code> | <code><h1>VI. 결 론</h1><p>본 논문에서는 집적 영상을 효율적으로 압축하기 위한 3D-DCT 기반의 압축 방법을 제안하였다. 제안한 방법은 집적 영상이 촬영 물체나 촬영 방법에 따라 다양한 특성을 가지는 특성을 바탕으로, 적응적으로 3D-DCT 블록을 구성하는 방법과 각 3D-DCT 블록별로 가변 블록 크기 3D-DCT를 수행하는 방법이다. 제안 방법은 영상 특성에 따라 최적의 3D-DCT를 수행하기 때문에 기존의 방법보다 뛰어난 성능을 보여준다. 제안 방법은 기존의 3D-DCT 방법과 비교해서 각 영상별로 동일 비트량에서 약 \( 1 \mathrm{~dB} \)에서 \( 2 \mathrm{~dB} \)의 PSNR 향상 효과를 보여주었다. </p><p>본 논문에서 제안된 방법은 여러 가지 블록 모드를 정의하고 그 중 최적의 모드를 선택하는 과정을 수행하므로 이에 따른 계산량의 증가를 초래한다. 블록 모드의 개수가 증가할수록 계산량의 그 개수에 정비례하여 증가하므로 이의 개선을 위한 연구가 추가적으로 필요하다. 또한 보다 효율적인 오버헤드 비트 부호화를 통해 추가적인 압축 효율 향상을 기대할 수 있다. </p></code> | | <code>대면하지 않고 진료에서 약 배달까지 해주는 처방솔루션은 뭐야</code> | <code>비대면 진료 서비스에 기반한 브랜드 페어(pare)는 올해 초부터 비대면 진료와 처방을 모바일로 쉽고 빠르게 받을 수 있게 하고, 처방받은 약은 집까지 배송해 주는 서비스를 국내 환자와 재외국민들을 위해 시작했다. 페어는 의사의 처방을 통해서만 구매가 가능한 의사의 솔루션을 페어만의 브랜드 감성을 깃들여 환자에게 노출한다. 
이는 기존의 처방약이라는 고질적인 부분을 소비자 감성 브랜드로 승화해 환자와 소비자의 벽을 허무는 국내 최초의 전문처방 솔루션 비즈니스 모델이다. 또한, 플랫폼 내 처방이 필요하지 않은 일반 건강기능식품을 통한 사후관리 서비스도 제공한다. 강신욱 대표는 “페어만의 브랜드 감성과 의사들의 전문성이 실린 솔루션을 집까지 배송해 주는 게 특징”이라며 “처방약뿐만 아니라 진료에 따른 맞춤형 건강관리제품을 추천 혹은 패키지로 받아 볼 수 있는 국내 최초 비대면 진료 기반의 커머스형 처방솔루션”이라고 강조했다.</code> | <code>국민 의약품 구입 불편 해소 방안 관련 의약품 재분류 논의 시작<br>국가별 의약품 분류체계 <table><tbody><tr><td>분류</td><td>국명</td><td>처방약(처방필수)</td><td>비처방약</td><td>비고</td></tr><tr><td rowspan='3'>2분류</td><td>한국</td><td>- 전문의약품</td><td>- 일반의약품</td><td rowspan='2'>의약외품은 판매장소 제한 없음 </td></tr><tr><td>일본</td><td>- 의료용의약품(E): 의사의 처방에 의해서만 조제·판매</td><td>- 일반용의약품(OTC) : 약국 외에서도 제한적으로 판매</td></tr><tr><td>미국</td><td>- 처방의약품(Rx): 연방법에 의해 처방전 없이 조제하는 것을 금한다는 표시가 있음 </td><td>- 비처방의약품(OTC): 약국 및 약국 외에서 자유롭게 판매 <br>※ 제산제, 비타민, 치질, 해열진통제 등</td><td rowspan='3'>일반약 전체, 또는 일부 약국외 판매 허용</td></tr><tr><td rowspan='2'>3분류</td><td>영국</td><td>- 처방약(POM): 의사의 처방에 의해서만 조제·판매</td><td>- 약국약(P) : 처방없이 약국에서 판매 가능<br>- 자유판매품목(GSL): 약국 외에서도 판매 가능 <br>※ 어린이용 아스피린, 구충제, 관장약 등 제외</td></tr><tr><td>독일</td><td>- 처방약(Rp): 처방전이 필요하며 약국을 통해서만 판매 </td><td>- 약국약(Ap): 처방을 요하지 않고 약국에서 판매가능<br>- 자유판매품목(F):약국 외에서도 판매가능 <br>※ 민간치료약초, 저함량비타민·미네랄 등</td></tr><tr><td rowspan='3'>4분류</td><td>프랑스</td><td>- 처방약 list Ⅰ: 의사의 처방을 필요로 하며 처방자의 허가 없이 반복 사용할 수 없고, 약사는 판매상황을 기록<br>- 처방약 list Ⅱ: 환자의 요청이 있을 때 2달까지 처방전을 반복 사용<br>- 특별처방약Stupefiants : 의사는 일련번호가 붙은 양식에 의해 처방하며 약사는 판매상황을 기록</td><td>- 비처방약 (대중광고 가능) : 대중광로를 하는 약으로 사회 건강보험대상에서 제외 </td><td>의약품 약국 외 판매 불허</td></tr><tr><td>캐나다</td><td>- 처방약 (P) : 처방에 의해서 약국에서만 판매</td><td>- 약사약 (BTC) : 처방없이 약국에서 약사만이 판매할 수 있음 <br>- 약국진열약 (OTC) : 약국에서 자유롭게 진열하여 판매할 수 있는 약으로서, 대중광고 허용<br>- 자유판매약(OTP) : 약국 이외에서도 판매되는 약</td><td rowspan='2'> </td></tr><tr><td>스위스</td><td>- 처방약 list Ⅰ : 약품 명단을 법률로 정하며, 처방전 반복사용 금지<br>- 처방약 list Ⅱ : 약사의 반복 처방 가능</td><td>- 비처방약 list Ⅲ (약국약), list Ⅳ (약종상약), list Ⅴ (자유판매약<br>)- list Ⅳ 와 list Ⅴ는 대중광고 허용</td></tr></tbody></table></code> | * Loss: 
[<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `learning_rate`: 5e-06
- `max_grad_norm`: 5.0
- `num_train_epochs`: 10
- `warmup_steps`: 500
- `dataloader_drop_last`: True

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-06
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 5.0
- `num_train_epochs`: 10
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 500
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
<details><summary>Click to expand</summary>

| Epoch | 
Step | Training Loss | loss | eval_max_accuracy | |:------:|:-----:|:-------------:|:------:|:-----------------:| | 0 | 0 | - | - | 0.8097 | | 0.0003 | 1 | 1.179 | - | - | | 0.0005 | 2 | 1.0733 | - | - | | 0.0008 | 3 | 0.9841 | - | - | | 0.0011 | 4 | 1.0739 | - | - | | 0.0014 | 5 | 1.2194 | - | - | | 0.0016 | 6 | 1.1582 | - | - | | 0.0019 | 7 | 0.9616 | - | - | | 0.0022 | 8 | 1.0596 | - | - | | 0.0024 | 9 | 0.9503 | - | - | | 0.0027 | 10 | 1.031 | - | - | | 0.0030 | 11 | 1.1054 | - | - | | 0.0032 | 12 | 1.0184 | - | - | | 0.0035 | 13 | 0.8953 | - | - | | 0.0038 | 14 | 1.2405 | - | - | | 0.0041 | 15 | 1.0238 | - | - | | 0.0043 | 16 | 0.9845 | - | - | | 0.0046 | 17 | 1.0546 | - | - | | 0.0049 | 18 | 1.0675 | - | - | | 0.0051 | 19 | 0.9762 | - | - | | 0.0054 | 20 | 0.7939 | - | - | | 0.0057 | 21 | 1.0777 | - | - | | 0.0060 | 22 | 1.0382 | - | - | | 0.0062 | 23 | 1.0807 | - | - | | 0.0065 | 24 | 1.1184 | - | - | | 0.0068 | 25 | 0.881 | - | - | | 0.0070 | 26 | 1.1134 | - | - | | 0.0073 | 27 | 1.0594 | - | - | | 0.0076 | 28 | 0.7923 | - | - | | 0.0078 | 29 | 0.947 | - | - | | 0.0081 | 30 | 0.9587 | - | - | | 0.0084 | 31 | 0.8561 | - | - | | 0.0087 | 32 | 0.9037 | - | - | | 0.0089 | 33 | 0.9165 | - | - | | 0.0092 | 34 | 1.1332 | - | - | | 0.0095 | 35 | 0.9526 | - | - | | 0.0097 | 36 | 0.9094 | - | - | | 0.0100 | 37 | 0.8902 | - | - | | 0.0103 | 38 | 0.9149 | - | - | | 0.0106 | 39 | 0.8626 | - | - | | 0.0108 | 40 | 1.0476 | - | - | | 0.0111 | 41 | 1.1116 | - | - | | 0.0114 | 42 | 0.9363 | - | - | | 0.0116 | 43 | 1.1492 | - | - | | 0.0119 | 44 | 0.88 | - | - | | 0.0122 | 45 | 0.8953 | - | - | | 0.0124 | 46 | 0.9056 | - | - | | 0.0127 | 47 | 0.8712 | - | - | | 0.0130 | 48 | 0.8783 | - | - | | 0.0133 | 49 | 0.8998 | - | - | | 0.0135 | 50 | 0.9089 | - | - | | 0.0138 | 51 | 0.9943 | - | - | | 0.0141 | 52 | 0.7594 | - | - | | 0.0143 | 53 | 1.0239 | - | - | | 0.0146 | 54 | 0.8189 | - | - | | 0.0149 | 55 | 0.8898 | - | - | | 0.0152 | 56 | 0.7309 | - | - | | 0.0154 | 57 | 0.7656 | 
- | - | | 0.0157 | 58 | 0.8408 | - | - | | 0.0160 | 59 | 0.9071 | - | - | | 0.0162 | 60 | 0.8157 | - | - | | 0.0165 | 61 | 0.8421 | - | - | | 0.0168 | 62 | 0.9124 | - | - | | 0.0170 | 63 | 0.8379 | - | - | | 0.0173 | 64 | 0.8278 | - | - | | 0.0176 | 65 | 0.8997 | - | - | | 0.0179 | 66 | 0.7988 | - | - | | 0.0181 | 67 | 0.8498 | - | - | | 0.0184 | 68 | 0.8588 | - | - | | 0.0187 | 69 | 0.8846 | - | - | | 0.0189 | 70 | 0.8923 | - | - | | 0.0192 | 71 | 0.7344 | - | - | | 0.0195 | 72 | 0.7002 | - | - | | 0.0198 | 73 | 0.8444 | - | - | | 0.0200 | 74 | 0.8148 | - | - | | 0.0203 | 75 | 0.7002 | - | - | | 0.0206 | 76 | 0.8735 | - | - | | 0.0208 | 77 | 0.8718 | - | - | | 0.0211 | 78 | 0.672 | - | - | | 0.0214 | 79 | 0.6914 | - | - | | 0.0216 | 80 | 0.7521 | - | - | | 0.0219 | 81 | 0.8297 | - | - | | 0.0222 | 82 | 0.774 | - | - | | 0.0225 | 83 | 0.977 | - | - | | 0.0227 | 84 | 0.736 | - | - | | 0.0230 | 85 | 0.778 | - | - | | 0.0233 | 86 | 0.9048 | - | - | | 0.0235 | 87 | 0.8656 | - | - | | 0.0238 | 88 | 0.8066 | - | - | | 0.0241 | 89 | 0.6944 | - | - | | 0.0244 | 90 | 0.7122 | - | - | | 0.0246 | 91 | 0.8266 | - | - | | 0.0249 | 92 | 0.7199 | - | - | | 0.0252 | 93 | 0.7296 | - | - | | 0.0254 | 94 | 0.9107 | - | - | | 0.0257 | 95 | 0.7637 | - | - | | 0.0260 | 96 | 0.6374 | - | - | | 0.0262 | 97 | 0.6547 | - | - | | 0.0265 | 98 | 0.6328 | - | - | | 0.0268 | 99 | 0.6648 | - | - | | 0.0271 | 100 | 0.7403 | - | - | | 0.0273 | 101 | 0.6864 | - | - | | 0.0276 | 102 | 0.6947 | - | - | | 0.0279 | 103 | 0.6662 | - | - | | 0.0281 | 104 | 0.657 | - | - | | 0.0284 | 105 | 0.663 | - | - | | 0.0287 | 106 | 0.5928 | - | - | | 0.0290 | 107 | 0.8488 | - | - | | 0.0292 | 108 | 0.5981 | - | - | | 0.0295 | 109 | 0.7565 | - | - | | 0.0298 | 110 | 0.6583 | - | - | | 0.0300 | 111 | 0.8198 | - | - | | 0.0303 | 112 | 0.7473 | - | - | | 0.0306 | 113 | 0.6791 | - | - | | 0.0308 | 114 | 0.5024 | - | - | | 0.0311 | 115 | 0.6391 | - | - | | 0.0314 | 116 | 0.7007 | - | - | | 0.0317 | 117 | 0.6424 | - | - | 
| 0.0319 | 118 | 0.508 | - | - | | 0.0322 | 119 | 0.6518 | - | - | | 0.0325 | 120 | 0.7681 | - | - | | 0.0327 | 121 | 0.7549 | - | - | | 0.0330 | 122 | 0.7161 | - | - | | 0.0333 | 123 | 0.575 | - | - | | 0.0335 | 124 | 0.7983 | - | - | | 0.0338 | 125 | 0.6369 | - | - | | 0.0341 | 126 | 0.5207 | - | - | | 0.0344 | 127 | 0.7792 | - | - | | 0.0346 | 128 | 0.5507 | - | - | | 0.0349 | 129 | 0.5769 | - | - | | 0.0352 | 130 | 0.7462 | - | - | | 0.0354 | 131 | 0.7728 | - | - | | 0.0357 | 132 | 0.5582 | - | - | | 0.0360 | 133 | 0.6999 | - | - | | 0.0363 | 134 | 0.7194 | - | - | | 0.0365 | 135 | 0.7125 | - | - | | 0.0368 | 136 | 0.6527 | - | - | | 0.0371 | 137 | 0.6318 | - | - | | 0.0373 | 138 | 0.5249 | - | - | | 0.0376 | 139 | 0.6114 | - | - | | 0.0379 | 140 | 0.577 | - | - | | 0.0381 | 141 | 0.6302 | - | - | | 0.0384 | 142 | 0.65 | - | - | | 0.0387 | 143 | 0.5753 | - | - | | 0.0390 | 144 | 0.5812 | - | - | | 0.0392 | 145 | 0.5641 | - | - | | 0.0395 | 146 | 0.6745 | - | - | | 0.0398 | 147 | 0.5224 | - | - | | 0.0400 | 148 | 0.6954 | - | - | | 0.0403 | 149 | 0.7016 | - | - | | 0.0406 | 150 | 0.4932 | - | - | | 0.0409 | 151 | 0.587 | - | - | | 0.0411 | 152 | 0.573 | - | - | | 0.0414 | 153 | 0.6685 | - | - | | 0.0417 | 154 | 0.6023 | - | - | | 0.0419 | 155 | 0.5884 | - | - | | 0.0422 | 156 | 0.4895 | - | - | | 0.0425 | 157 | 0.7572 | - | - | | 0.0427 | 158 | 0.6522 | - | - | | 0.0430 | 159 | 0.6946 | - | - | | 0.0433 | 160 | 0.6449 | - | - | | 0.0436 | 161 | 0.6483 | - | - | | 0.0438 | 162 | 0.6022 | - | - | | 0.0441 | 163 | 0.5624 | - | - | | 0.0444 | 164 | 0.6458 | - | - | | 0.0446 | 165 | 0.5737 | - | - | | 0.0449 | 166 | 0.6261 | - | - | | 0.0452 | 167 | 0.5635 | - | - | | 0.0455 | 168 | 0.4913 | - | - | | 0.0457 | 169 | 0.6958 | - | - | | 0.0460 | 170 | 0.592 | - | - | | 0.0463 | 171 | 0.4624 | - | - | | 0.0465 | 172 | 0.565 | - | - | | 0.0468 | 173 | 0.5542 | - | - | | 0.0471 | 174 | 0.6587 | - | - | | 0.0473 | 175 | 0.4727 | - | - | | 0.0476 | 176 | 0.6049 | - | - | | 
0.0479 | 177 | 0.7385 | - | - | | 0.0482 | 178 | 0.5175 | - | - | | 0.0484 | 179 | 0.5711 | - | - | | 0.0487 | 180 | 0.4591 | - | - | | 0.0490 | 181 | 0.7063 | - | - | | 0.0492 | 182 | 0.4954 | - | - | | 0.0495 | 183 | 0.6444 | - | - | | 0.0498 | 184 | 0.6686 | - | - | | 0.0501 | 185 | 0.5229 | - | - | | 0.0503 | 186 | 0.4338 | - | - | | 0.0506 | 187 | 0.5582 | - | - | | 0.0509 | 188 | 0.5881 | - | - | | 0.0511 | 189 | 0.5609 | - | - | | 0.0514 | 190 | 0.6607 | - | - | | 0.0517 | 191 | 0.491 | - | - | | 0.0519 | 192 | 0.4687 | - | - | | 0.0522 | 193 | 0.5842 | - | - | | 0.0525 | 194 | 0.5544 | - | - | | 0.0528 | 195 | 0.5778 | - | - | | 0.0530 | 196 | 0.5591 | - | - | | 0.0533 | 197 | 0.5872 | - | - | | 0.0536 | 198 | 0.5807 | - | - | | 0.0538 | 199 | 0.593 | - | - | | 0.0541 | 200 | 0.4658 | - | - | | 0.0544 | 201 | 0.4649 | - | - | | 0.0547 | 202 | 0.4912 | - | - | | 0.0549 | 203 | 0.5475 | - | - | | 0.0552 | 204 | 0.5182 | - | - | | 0.0555 | 205 | 0.5281 | - | - | | 0.0557 | 206 | 0.6302 | - | - | | 0.0560 | 207 | 0.6346 | - | - | | 0.0563 | 208 | 0.5309 | - | - | | 0.0565 | 209 | 0.5499 | - | - | | 0.0568 | 210 | 0.5368 | - | - | | 0.0571 | 211 | 0.4647 | - | - | | 0.0574 | 212 | 0.5316 | - | - | | 0.0576 | 213 | 0.5165 | - | - | | 0.0579 | 214 | 0.6294 | - | - | | 0.0582 | 215 | 0.4526 | - | - | | 0.0584 | 216 | 0.5157 | - | - | | 0.0587 | 217 | 0.6337 | - | - | | 0.0590 | 218 | 0.4911 | - | - | | 0.0593 | 219 | 0.5696 | - | - | | 0.0595 | 220 | 0.4651 | - | - | | 0.0598 | 221 | 0.6098 | - | - | | 0.0601 | 222 | 0.6329 | - | - | | 0.0603 | 223 | 0.7011 | - | - | | 0.0606 | 224 | 0.4582 | - | - | | 0.0609 | 225 | 0.6332 | - | - | | 0.0611 | 226 | 0.5138 | - | - | | 0.0614 | 227 | 0.6474 | - | - | | 0.0617 | 228 | 0.5059 | - | - | | 0.0620 | 229 | 0.3617 | - | - | | 0.0622 | 230 | 0.4401 | - | - | | 0.0625 | 231 | 0.5159 | - | - | | 0.0628 | 232 | 0.6072 | - | - | | 0.0630 | 233 | 0.5079 | - | - | | 0.0633 | 234 | 0.3517 | - | - | | 0.0636 | 235 | 0.5604 | - | - 
| | 0.0639 | 236 | 0.4834 | - | - | | 0.0641 | 237 | 0.5719 | - | - | | 0.0644 | 238 | 0.4928 | - | - | | 0.0647 | 239 | 0.4558 | - | - | | 0.0649 | 240 | 0.4483 | - | - | | 0.0652 | 241 | 0.5027 | - | - | | 0.0655 | 242 | 0.4534 | - | - | | 0.0657 | 243 | 0.6228 | - | - | | 0.0660 | 244 | 0.653 | - | - | | 0.0663 | 245 | 0.4585 | - | - | | 0.0666 | 246 | 0.6514 | - | - | | 0.0668 | 247 | 0.6069 | - | - | | 0.0671 | 248 | 0.5267 | - | - | | 0.0674 | 249 | 0.4457 | - | - | | 0.0676 | 250 | 0.4966 | - | - | | 0.0679 | 251 | 0.5595 | - | - | | 0.0682 | 252 | 0.4991 | - | - | | 0.0685 | 253 | 0.5233 | - | - | | 0.0687 | 254 | 0.5883 | - | - | | 0.0690 | 255 | 0.4411 | - | - | | 0.0693 | 256 | 0.5102 | - | - | | 0.0695 | 257 | 0.5198 | - | - | | 0.0698 | 258 | 0.4086 | - | - | | 0.0701 | 259 | 0.4336 | - | - | | 0.0703 | 260 | 0.6177 | - | - | | 0.0706 | 261 | 0.5753 | - | - | | 0.0709 | 262 | 0.6234 | - | - | | 0.0712 | 263 | 0.5582 | - | - | | 0.0714 | 264 | 0.4451 | - | - | | 0.0717 | 265 | 0.5145 | - | - | | 0.0720 | 266 | 0.5908 | - | - | | 0.0722 | 267 | 0.3929 | - | - | | 0.0725 | 268 | 0.5009 | - | - | | 0.0728 | 269 | 0.3671 | - | - | | 0.0731 | 270 | 0.5866 | - | - | | 0.0733 | 271 | 0.6914 | - | - | | 0.0736 | 272 | 0.4779 | - | - | | 0.0739 | 273 | 0.5303 | - | - | | 0.0741 | 274 | 0.4294 | - | - | | 0.0744 | 275 | 0.61 | - | - | | 0.0747 | 276 | 0.5529 | - | - | | 0.0749 | 277 | 0.5498 | - | - | | 0.0752 | 278 | 0.4736 | - | - | | 0.0755 | 279 | 0.3907 | - | - | | 0.0758 | 280 | 0.4271 | - | - | | 0.0760 | 281 | 0.5772 | - | - | | 0.0763 | 282 | 0.5232 | - | - | | 0.0766 | 283 | 0.4786 | - | - | | 0.0768 | 284 | 0.5621 | - | - | | 0.0771 | 285 | 0.4747 | - | - | | 0.0774 | 286 | 0.4695 | - | - | | 0.0777 | 287 | 0.4926 | - | - | | 0.0779 | 288 | 0.5339 | - | - | | 0.0782 | 289 | 0.5043 | - | - | | 0.0785 | 290 | 0.3665 | - | - | | 0.0787 | 291 | 0.5777 | - | - | | 0.0790 | 292 | 0.5081 | - | - | | 0.0793 | 293 | 0.5744 | - | - | | 0.0795 | 294 | 0.4446 | - 
| - | | 0.0798 | 295 | 0.415 | - | - | | 0.0801 | 296 | 0.4013 | - | - | | 0.0804 | 297 | 0.4938 | - | - | | 0.0806 | 298 | 0.5096 | - | - | | 0.0809 | 299 | 0.5261 | - | - | | 0.0812 | 300 | 0.3339 | - | - | | 0.0814 | 301 | 0.7123 | - | - | | 0.0817 | 302 | 0.4387 | - | - | | 0.0820 | 303 | 0.4273 | - | - | | 0.0823 | 304 | 0.411 | - | - | | 0.0825 | 305 | 0.4667 | - | - | | 0.0828 | 306 | 0.4651 | - | - | | 0.0831 | 307 | 0.4916 | - | - | | 0.0833 | 308 | 0.6379 | - | - | | 0.0836 | 309 | 0.4339 | - | - | | 0.0839 | 310 | 0.4866 | - | - | | 0.0841 | 311 | 0.5155 | - | - | | 0.0844 | 312 | 0.4192 | - | - | | 0.0847 | 313 | 0.6039 | - | - | | 0.0850 | 314 | 0.4657 | - | - | | 0.0852 | 315 | 0.6355 | - | - | | 0.0855 | 316 | 0.4975 | - | - | | 0.0858 | 317 | 0.3445 | - | - | | 0.0860 | 318 | 0.3741 | - | - | | 0.0863 | 319 | 0.3988 | - | - | | 0.0866 | 320 | 0.5121 | - | - | | 0.0869 | 321 | 0.5441 | - | - | | 0.0871 | 322 | 0.6115 | - | - | | 0.0874 | 323 | 0.4559 | - | - | | 0.0877 | 324 | 0.4158 | - | - | | 0.0879 | 325 | 0.416 | - | - | | 0.0882 | 326 | 0.4739 | - | - | | 0.0885 | 327 | 0.6097 | - | - | | 0.0887 | 328 | 0.5983 | - | - | | 0.0890 | 329 | 0.5816 | - | - | | 0.0893 | 330 | 0.4715 | - | - | | 0.0896 | 331 | 0.3944 | - | - | | 0.0898 | 332 | 0.5422 | - | - | | 0.0901 | 333 | 0.5825 | - | - | | 0.0904 | 334 | 0.4453 | - | - | | 0.0906 | 335 | 0.4771 | - | - | | 0.0909 | 336 | 0.3799 | - | - | | 0.0912 | 337 | 0.3578 | - | - | | 0.0915 | 338 | 0.5269 | - | - | | 0.0917 | 339 | 0.5412 | - | - | | 0.0920 | 340 | 0.4387 | - | - | | 0.0923 | 341 | 0.4648 | - | - | | 0.0925 | 342 | 0.4264 | - | - | | 0.0928 | 343 | 0.3917 | - | - | | 0.0931 | 344 | 0.6398 | - | - | | 0.0933 | 345 | 0.3961 | - | - | | 0.0936 | 346 | 0.6527 | - | - | | 0.0939 | 347 | 0.4453 | - | - | | 0.0942 | 348 | 0.5411 | - | - | | 0.0944 | 349 | 0.5758 | - | - | | 0.0947 | 350 | 0.4062 | - | - | | 0.0950 | 351 | 0.5969 | - | - | | 0.0952 | 352 | 0.4315 | - | - | | 0.0955 | 353 | 0.5792 
| - | - |
| 0.0958 | 354 | 0.4573 | - | - |
| 0.0960 | 355 | 0.5059 | - | - |
| 0.0963 | 356 | 0.4784 | - | - |
| 0.0966 | 357 | 0.4753 | - | - |
| 0.0969 | 358 | 0.4547 | - | - |
| 0.0971 | 359 | 0.4185 | - | - |
| 0.0974 | 360 | 0.4964 | - | - |
| 0.0977 | 361 | 0.4534 | - | - |
| 0.0979 | 362 | 0.4609 | - | - |
| 0.0982 | 363 | 0.441 | - | - |
| 0.0985 | 364 | 0.4798 | - | - |
| 0.0988 | 365 | 0.4776 | - | - |
| 0.0990 | 366 | 0.4324 | - | - |
| 0.0993 | 367 | 0.5355 | - | - |
| 0.0996 | 368 | 0.3569 | - | - |
| 0.0998 | 369 | 0.4697 | - | - |
| 0.1001 | 370 | 0.4129 | - | - |
| 0.1004 | 371 | 0.4395 | - | - |
| 0.1006 | 372 | 0.4686 | - | - |
| 0.1009 | 373 | 0.4133 | - | - |
| 0.1012 | 374 | 0.4187 | - | - |
| 0.1015 | 375 | 0.5296 | - | - |
| 0.1017 | 376 | 0.4378 | - | - |
| 0.1020 | 377 | 0.486 | - | - |
| 0.1023 | 378 | 0.4715 | - | - |
| 0.1025 | 379 | 0.401 | - | - |
| 0.1028 | 380 | 0.3678 | - | - |
| 0.1031 | 381 | 0.5143 | - | - |
| 0.1034 | 382 | 0.5067 | - | - |
| 0.1036 | 383 | 0.577 | - | - |
| 0.1039 | 384 | 0.4762 | - | - |
| 0.1042 | 385 | 0.5171 | - | - |
| 0.1044 | 386 | 0.483 | - | - |
| 0.1047 | 387 | 0.5319 | - | - |
| 0.1050 | 388 | 0.5519 | - | - |
| 0.1052 | 389 | 0.5023 | - | - |
| 0.1055 | 390 | 0.4167 | - | - |
| 0.1058 | 391 | 0.3797 | - | - |
| 0.1061 | 392 | 0.5427 | - | - |
| 0.1063 | 393 | 0.4857 | - | - |
| 0.1066 | 394 | 0.4877 | - | - |
| 0.1069 | 395 | 0.5607 | - | - |
| 0.1071 | 396 | 0.3526 | - | - |
| 0.1074 | 397 | 0.5034 | - | - |
| 0.1077 | 398 | 0.465 | - | - |
| 0.1080 | 399 | 0.4822 | - | - |
| 0.1082 | 400 | 0.5667 | - | - |
| 0.1085 | 401 | 0.5567 | - | - |
| 0.1088 | 402 | 0.3982 | - | - |
| 0.1090 | 403 | 0.5272 | - | - |
| 0.1093 | 404 | 0.3676 | - | - |
| 0.1096 | 405 | 0.4855 | - | - |
| 0.1098 | 406 | 0.4727 | - | - |
| 0.1101 | 407 | 0.4626 | - | - |
| 0.1104 | 408 | 0.6116 | - | - |
| 0.1107 | 409 | 0.3989 | - | - |
| 0.1109 | 410 | 0.4759 | - | - |
| 0.1112 | 411 | 0.3473 | - | - |
| 0.1115 | 412 | 0.7002 | - | - |
| 0.1117 | 413 | 0.3014 | - | - |
| 0.1120 | 414 | 0.4251 | - | - |
| 0.1123 | 415 | 0.4073 | - | - |
| 0.1126 | 416 | 0.5373 | - | - |
| 0.1128 | 417 | 0.5064 | - | - |
| 0.1131 | 418 | 0.4443 | - | - |
| 0.1134 | 419 | 0.4599 | - | - |
| 0.1136 | 420 | 0.3585 | - | - |
| 0.1139 | 421 | 0.4235 | - | - |
| 0.1142 | 422 | 0.3939 | - | - |
| 0.1144 | 423 | 0.5599 | - | - |
| 0.1147 | 424 | 0.3272 | - | - |
| 0.1150 | 425 | 0.3047 | - | - |
| 0.1153 | 426 | 0.3835 | - | - |
| 0.1155 | 427 | 0.3745 | - | - |
| 0.1158 | 428 | 0.5126 | - | - |
| 0.1161 | 429 | 0.4097 | - | - |
| 0.1163 | 430 | 0.4314 | - | - |
| 0.1166 | 431 | 0.5439 | - | - |
| 0.1169 | 432 | 0.4467 | - | - |
| 0.1172 | 433 | 0.4583 | - | - |
| 0.1174 | 434 | 0.434 | - | - |
| 0.1177 | 435 | 0.4183 | - | - |
| 0.1180 | 436 | 0.5685 | - | - |
| 0.1182 | 437 | 0.4235 | - | - |
| 0.1185 | 438 | 0.4815 | - | - |
| 0.1188 | 439 | 0.3793 | - | - |
| 0.1190 | 440 | 0.3617 | - | - |
| 0.1193 | 441 | 0.4938 | - | - |
| 0.1196 | 442 | 0.4725 | - | - |
| 0.1199 | 443 | 0.5827 | - | - |
| 0.1201 | 444 | 0.3295 | - | - |
| 0.1204 | 445 | 0.6002 | - | - |
| 0.1207 | 446 | 0.3134 | - | - |
| 0.1209 | 447 | 0.5644 | - | - |
| 0.1212 | 448 | 0.3111 | - | - |
| 0.1215 | 449 | 0.3892 | - | - |
| 0.1218 | 450 | 0.3114 | - | - |
| 0.1220 | 451 | 0.4343 | - | - |
| 0.1223 | 452 | 0.4723 | - | - |
| 0.1226 | 453 | 0.361 | - | - |
| 0.1228 | 454 | 0.4077 | - | - |
| 0.1231 | 455 | 0.4314 | - | - |
| 0.1234 | 456 | 0.5096 | - | - |
| 0.1236 | 457 | 0.3706 | - | - |
| 0.1239 | 458 | 0.4507 | - | - |
| 0.1242 | 459 | 0.4502 | - | - |
| 0.1245 | 460 | 0.2918 | - | - |
| 0.1247 | 461 | 0.5069 | - | - |
| 0.125 | 462 | 0.4151 | - | - |
| 0.1253 | 463 | 0.4682 | - | - |
| 0.1255 | 464 | 0.3999 | - | - |
| 0.1258 | 465 | 0.4764 | - | - |
| 0.1261 | 466 | 0.4207 | - | - |
| 0.1264 | 467 | 0.3923 | - | - |
| 0.1266 | 468 | 0.3791 | - | - |
| 0.1269 | 469 | 0.2914 | - | - |
| 0.1272 | 470 | 0.3546 | - | - |
| 0.1274 | 471 | 0.3632 | - | - |
| 0.1277 | 472 | 0.3634 | - | - |
| 0.1280 | 473 | 0.3898 | - | - |
| 0.1282 | 474 | 0.3788 | - | - |
| 0.1285 | 475 | 0.4937 | - | - |
| 0.1288 | 476 | 0.3428 | - | - |
| 0.1291 | 477 | 0.4589 | - | - |
| 0.1293 | 478 | 0.4068 | - | - |
| 0.1296 | 479 | 0.4065 | - | - |
| 0.1299 | 480 | 0.3577 | - | - |
| 0.1301 | 481 | 0.4345 | - | - |
| 0.1304 | 482 | 0.4767 | - | - |
| 0.1307 | 483 | 0.4697 | - | - |
| 0.1310 | 484 | 0.4634 | - | - |
| 0.1312 | 485 | 0.4374 | - | - |
| 0.1315 | 486 | 0.5893 | - | - |
| 0.1318 | 487 | 0.5903 | - | - |
| 0.1320 | 488 | 0.3559 | - | - |
| 0.1323 | 489 | 0.376 | - | - |
| 0.1326 | 490 | 0.407 | - | - |
| 0.1328 | 491 | 0.4807 | - | - |
| 0.1331 | 492 | 0.4908 | - | - |
| 0.1334 | 493 | 0.3917 | - | - |
| 0.1337 | 494 | 0.3708 | - | - |
| 0.1339 | 495 | 0.4199 | - | - |
| 0.1342 | 496 | 0.4543 | - | - |
| 0.1345 | 497 | 0.4159 | - | - |
| 0.1347 | 498 | 0.4284 | - | - |
| 0.1350 | 499 | 0.4836 | - | - |
| 0.1353 | 500 | 0.5708 | - | - |
| 0.1356 | 501 | 0.4684 | - | - |
| 0.1358 | 502 | 0.4828 | - | - |
| 0.1361 | 503 | 0.4267 | - | - |
| 0.1364 | 504 | 0.3401 | - | - |
| 0.1366 | 505 | 0.5218 | - | - |
| 0.1369 | 506 | 0.4788 | - | - |
| 0.1372 | 507 | 0.3658 | - | - |
| 0.1374 | 508 | 0.3734 | - | - |
| 0.1377 | 509 | 0.4097 | - | - |
| 0.1380 | 510 | 0.3513 | - | - |
| 0.1383 | 511 | 0.5054 | - | - |
| 0.1385 | 512 | 0.3979 | - | - |
| 0.1388 | 513 | 0.3675 | - | - |
| 0.1391 | 514 | 0.3482 | - | - |
| 0.1393 | 515 | 0.3552 | - | - |
| 0.1396 | 516 | 0.3551 | - | - |
| 0.1399 | 517 | 0.577 | - | - |
| 0.1402 | 518 | 0.3992 | - | - |
| 0.1404 | 519 | 0.4821 | - | - |
| 0.1407 | 520 | 0.4765 | - | - |
| 0.1410 | 521 | 0.3338 | - | - |
| 0.1412 | 522 | 0.3712 | - | - |
| 0.1415 | 523 | 0.4199 | - | - |
| 0.1418 | 524 | 0.3382 | - | - |
| 0.1420 | 525 | 0.5084 | - | - |
| 0.1423 | 526 | 0.4912 | - | - |
| 0.1426 | 527 | 0.4092 | - | - |
| 0.1429 | 528 | 0.3429 | - | - |
| 0.1431 | 529 | 0.3489 | - | - |
| 0.1434 | 530 | 0.4979 | - | - |
| 0.1437 | 531 | 0.3097 | - | - |
| 0.1439 | 532 | 0.2743 | - | - |
| 0.1442 | 533 | 0.3807 | - | - |
| 0.1445 | 534 | 0.4363 | - | - |
| 0.1448 | 535 | 0.3778 | - | - |
| 0.1450 | 536 | 0.3534 | - | - |
| 0.1453 | 537 | 0.4803 | - | - |
| 0.1456 | 538 | 0.371 | - | - |
| 0.1458 | 539 | 0.3576 | - | - |
| 0.1461 | 540 | 0.4149 | - | - |
| 0.1464 | 541 | 0.3288 | - | - |
| 0.1466 | 542 | 0.5136 | - | - |
| 0.1469 | 543 | 0.3446 | - | - |
| 0.1472 | 544 | 0.4103 | - | - |
| 0.1475 | 545 | 0.3375 | - | - |
| 0.1477 | 546 | 0.5033 | - | - |
| 0.1480 | 547 | 0.5561 | - | - |
| 0.1483 | 548 | 0.3516 | - | - |
| 0.1485 | 549 | 0.4674 | - | - |
| 0.1488 | 550 | 0.4571 | - | - |
| 0.1491 | 551 | 0.4782 | - | - |
| 0.1494 | 552 | 0.4695 | - | - |
| 0.1496 | 553 | 0.4307 | - | - |
| 0.1499 | 554 | 0.4111 | - | - |
| 0.1502 | 555 | 0.4575 | - | - |
| 0.1504 | 556 | 0.4811 | - | - |
| 0.1507 | 557 | 0.446 | - | - |
| 0.1510 | 558 | 0.3233 | - | - |
| 0.1512 | 559 | 0.3366 | - | - |
| 0.1515 | 560 | 0.4584 | - | - |
| 0.1518 | 561 | 0.3391 | - | - |
| 0.1521 | 562 | 0.3949 | - | - |
| 0.1523 | 563 | 0.4194 | - | - |
| 0.1526 | 564 | 0.3506 | - | - |
| 0.1529 | 565 | 0.4667 | - | - |
| 0.1531 | 566 | 0.3708 | - | - |
| 0.1534 | 567 | 0.3828 | - | - |
| 0.1537 | 568 | 0.3823 | - | - |
| 0.1540 | 569 | 0.4827 | - | - |
| 0.1542 | 570 | 0.4167 | - | - |
| 0.1545 | 571 | 0.3055 | - | - |
| 0.1548 | 572 | 0.3797 | - | - |
| 0.1550 | 573 | 0.3658 | - | - |
| 0.1553 | 574 | 0.3399 | - | - |
| 0.1556 | 575 | 0.3609 | - | - |
| 0.1558 | 576 | 0.4068 | - | - |
| 0.1561 | 577 | 0.4045 | - | - |
| 0.1564 | 578 | 0.4415 | - | - |
| 0.1567 | 579 | 0.4102 | - | - |
| 0.1569 | 580 | 0.3578 | - | - |
| 0.1572 | 581 | 0.2902 | - | - |
| 0.1575 | 582 | 0.4447 | - | - |
| 0.1577 | 583 | 0.3582 | - | - |
| 0.1580 | 584 | 0.5064 | - | - |
| 0.1583 | 585 | 0.6035 | - | - |
| 0.1585 | 586 | 0.476 | - | - |
| 0.1588 | 587 | 0.4533 | - | - |
| 0.1591 | 588 | 0.3254 | - | - |
| 0.1594 | 589 | 0.4245 | - | - |
| 0.1596 | 590 | 0.3461 | - | - |
| 0.1599 | 591 | 0.3651 | - | - |
| 0.1602 | 592 | 0.4255 | - | - |
| 0.1604 | 593 | 0.3545 | - | - |
| 0.1607 | 594 | 0.2814 | - | - |
| 0.1610 | 595 | 0.4902 | - | - |
| 0.1613 | 596 | 0.3797 | - | - |
| 0.1615 | 597 | 0.3915 | - | - |
| 0.1618 | 598 | 0.3741 | - | - |
| 0.1621 | 599 | 0.4349 | - | - |
| 0.1623 | 600 | 0.4441 | - | - |
| 0.1626 | 601 | 0.3932 | - | - |
| 0.1629 | 602 | 0.3309 | - | - |
| 0.1631 | 603 | 0.3346 | - | - |
| 0.1634 | 604 | 0.3294 | - | - |
| 0.1637 | 605 | 0.3267 | - | - |
| 0.1640 | 606 | 0.23 | - | - |
| 0.1642 | 607 | 0.4179 | - | - |
| 0.1645 | 608 | 0.5072 | - | - |
| 0.1648 | 609 | 0.404 | - | - |
| 0.1650 | 610 | 0.3117 | - | - |
| 0.1653 | 611 | 0.4566 | - | - |
| 0.1656 | 612 | 0.477 | - | - |
| 0.1659 | 613 | 0.4869 | - | - |
| 0.1661 | 614 | 0.3917 | - | - |
| 0.1664 | 615 | 0.3363 | - | - |
| 0.1667 | 616 | 0.3831 | - | - |
| 0.1669 | 617 | 0.4683 | - | - |
| 0.1672 | 618 | 0.5428 | - | - |
| 0.1675 | 619 | 0.372 | - | - |
| 0.1677 | 620 | 0.3986 | - | - |
| 0.1680 | 621 | 0.3343 | - | - |
| 0.1683 | 622 | 0.4598 | - | - |
| 0.1686 | 623 | 0.5001 | - | - |
| 0.1688 | 624 | 0.4636 | - | - |
| 0.1691 | 625 | 0.3864 | - | - |
| 0.1694 | 626 | 0.3046 | - | - |
| 0.1696 | 627 | 0.4236 | - | - |
| 0.1699 | 628 | 0.2618 | - | - |
| 0.1702 | 629 | 0.3836 | - | - |
| 0.1705 | 630 | 0.3888 | - | - |
| 0.1707 | 631 | 0.3397 | - | - |
| 0.1710 | 632 | 0.3818 | - | - |
| 0.1713 | 633 | 0.5019 | - | - |
| 0.1715 | 634 | 0.3487 | - | - |
| 0.1718 | 635 | 0.4416 | - | - |
| 0.1721 | 636 | 0.3781 | - | - |
| 0.1723 | 637 | 0.335 | - | - |
| 0.1726 | 638 | 0.4464 | - | - |
| 0.1729 | 639 | 0.442 | - | - |
| 0.1732 | 640 | 0.3562 | - | - |
| 0.1734 | 641 | 0.5615 | - | - |
| 0.1737 | 642 | 0.3968 | - | - |
| 0.1740 | 643 | 0.4254 | - | - |
| 0.1742 | 644 | 0.3324 | - | - |
| 0.1745 | 645 | 0.3475 | - | - |
| 0.1748 | 646 | 0.3493 | - | - |
| 0.1751 | 647 | 0.312 | - | - |
| 0.1753 | 648 | 0.4798 | - | - |
| 0.1756 | 649 | 0.3866 | - | - |
| 0.1759 | 650 | 0.3165 | - | - |
| 0.1761 | 651 | 0.3656 | - | - |
| 0.1764 | 652 | 0.3335 | - | - |
| 0.1767 | 653 | 0.4072 | - | - |
| 0.1769 | 654 | 0.3952 | - | - |
| 0.1772 | 655 | 0.3044 | - | - |
| 0.1775 | 656 | 0.3295 | - | - |
| 0.1778 | 657 | 0.5671 | - | - |
| 0.1780 | 658 | 0.4012 | - | - |
| 0.1783 | 659 | 0.3263 | - | - |
| 0.1786 | 660 | 0.3351 | - | - |
| 0.1788 | 661 | 0.3712 | - | - |
| 0.1791 | 662 | 0.5386 | - | - |
| 0.1794 | 663 | 0.4418 | - | - |
| 0.1797 | 664 | 0.4058 | - | - |
| 0.1799 | 665 | 0.3879 | - | - |
| 0.1802 | 666 | 0.4332 | - | - |
| 0.1805 | 667 | 0.4194 | - | - |
| 0.1807 | 668 | 0.439 | - | - |
| 0.1810 | 669 | 0.2701 | - | - |
| 0.1813 | 670 | 0.2866 | - | - |
| 0.1815 | 671 | 0.3157 | - | - |
| 0.1818 | 672 | 0.3567 | - | - |
| 0.1821 | 673 | 0.4435 | - | - |
| 0.1824 | 674 | 0.3794 | - | - |
| 0.1826 | 675 | 0.4044 | - | - |
| 0.1829 | 676 | 0.2416 | - | - |
| 0.1832 | 677 | 0.3851 | - | - |
| 0.1834 | 678 | 0.3509 | - | - |
| 0.1837 | 679 | 0.4402 | - | - |
| 0.1840 | 680 | 0.4473 | - | - |
| 0.1843 | 681 | 0.2757 | - | - |
| 0.1845 | 682 | 0.2898 | - | - |
| 0.1848 | 683 | 0.3547 | - | - |
| 0.1851 | 684 | 0.4422 | - | - |
| 0.1853 | 685 | 0.4154 | - | - |
| 0.1856 | 686 | 0.3428 | - | - |
| 0.1859 | 687 | 0.4308 | - | - |
| 0.1861 | 688 | 0.3496 | - | - |
| 0.1864 | 689 | 0.392 | - | - |
| 0.1867 | 690 | 0.327 | - | - |
| 0.1870 | 691 | 0.312 | - | - |
| 0.1872 | 692 | 0.411 | - | - |
| 0.1875 | 693 | 0.4342 | - | - |
| 0.1878 | 694 | 0.3153 | - | - |
| 0.1880 | 695 | 0.3987 | - | - |
| 0.1883 | 696 | 0.2914 | - | - |
| 0.1886 | 697 | 0.457 | - | - |
| 0.1889 | 698 | 0.3247 | - | - |
| 0.1891 | 699 | 0.4077 | - | - |
| 0.1894 | 700 | 0.4483 | - | - |
| 0.1897 | 701 | 0.3482 | - | - |
| 0.1899 | 702 | 0.2505 | - | - |
| 0.1902 | 703 | 0.3339 | - | - |
| 0.1905 | 704 | 0.3919 | - | - |
| 0.1907 | 705 | 0.3753 | - | - |
| 0.1910 | 706 | 0.3812 | - | - |
| 0.1913 | 707 | 0.3383 | - | - |
| 0.1916 | 708 | 0.3303 | - | - |
| 0.1918 | 709 | 0.3329 | - | - |
| 0.1921 | 710 | 0.393 | - | - |
| 0.1924 | 711 | 0.481 | - | - |
| 0.1926 | 712 | 0.2871 | - | - |
| 0.1929 | 713 | 0.284 | - | - |
| 0.1932 | 714 | 0.4505 | - | - |
| 0.1935 | 715 | 0.5099 | - | - |
| 0.1937 | 716 | 0.4139 | - | - |
| 0.1940 | 717 | 0.4806 | - | - |
| 0.1943 | 718 | 0.3671 | - | - |
| 0.1945 | 719 | 0.3767 | - | - |
| 0.1948 | 720 | 0.3012 | - | - |
| 0.1951 | 721 | 0.4281 | - | - |
| 0.1953 | 722 | 0.3874 | - | - |
| 0.1956 | 723 | 0.4483 | - | - |
| 0.1959 | 724 | 0.3826 | - | - |
| 0.1962 | 725 | 0.3191 | - | - |
| 0.1964 | 726 | 0.2822 | - | - |
| 0.1967 | 727 | 0.3294 | - | - |
| 0.1970 | 728 | 0.3397 | - | - |
| 0.1972 | 729 | 0.2751 | - | - |
| 0.1975 | 730 | 0.446 | - | - |
| 0.1978 | 731 | 0.3335 | - | - |
| 0.1981 | 732 | 0.4961 | - | - |
| 0.1983 | 733 | 0.7003 | - | - |
| 0.1986 | 734 | 0.2998 | - | - |
| 0.1989 | 735 | 0.4445 | - | - |
| 0.1991 | 736 | 0.2437 | - | - |
| 0.1994 | 737 | 0.3158 | - | - |
| 0.1997 | 738 | 0.5616 | - | - |
| 0.1999 | 739 | 0.4047 | - | - |
| 0.2002 | 740 | 0.3447 | - | - |
| 0.2005 | 741 | 0.3425 | - | - |
| 0.2008 | 742 | 0.4514 | - | - |
| 0.2010 | 743 | 0.439 | - | - |
| 0.2013 | 744 | 0.4779 | - | - |
| 0.2016 | 745 | 0.4259 | - | - |
| 0.2018 | 746 | 0.438 | - | - |
| 0.2021 | 747 | 0.515 | - | - |
| 0.2024 | 748 | 0.3163 | - | - |
| 0.2027 | 749 | 0.4198 | - | - |
| 0.2029 | 750 | 0.3959 | - | - |
| 0.2032 | 751 | 0.2549 | - | - |
| 0.2035 | 752 | 0.4149 | - | - |
| 0.2037 | 753 | 0.3564 | - | - |
| 0.2040 | 754 | 0.3112 | - | - |
| 0.2043 | 755 | 0.3141 | - | - |
| 0.2045 | 756 | 0.4157 | - | - |
| 0.2048 | 757 | 0.4643 | - | - |
| 0.2051 | 758 | 0.3212 | - | - |
| 0.2054 | 759 | 0.4046 | - | - |
| 0.2056 | 760 | 0.538 | - | - |
| 0.2059 | 761 | 0.4378 | - | - |
| 0.2062 | 762 | 0.3041 | - | - |
| 0.2064 | 763 | 0.3931 | - | - |
| 0.2067 | 764 | 0.3217 | - | - |
| 0.2070 | 765 | 0.2577 | - | - |
| 0.2073 | 766 | 0.3941 | - | - |
| 0.2075 | 767 | 0.5436 | - | - |
| 0.2078 | 768 | 0.4075 | - | - |
| 0.2081 | 769 | 0.3665 | - | - |
| 0.2083 | 770 | 0.5189 | - | - |
| 0.2086 | 771 | 0.3648 | - | - |
| 0.2089 | 772 | 0.2695 | - | - |
| 0.2091 | 773 | 0.3241 | - | - |
| 0.2094 | 774 | 0.3511 | - | - |
| 0.2097 | 775 | 0.3022 | - | - |
| 0.2100 | 776 | 0.2947 | - | - |
| 0.2102 | 777 | 0.4598 | - | - |
| 0.2105 | 778 | 0.4121 | - | - |
| 0.2108 | 779 | 0.309 | - | - |
| 0.2110 | 780 | 0.3563 | - | - |
| 0.2113 | 781 | 0.5174 | - | - |
| 0.2116 | 782 | 0.366 | - | - |
| 0.2119 | 783 | 0.3779 | - | - |
| 0.2121 | 784 | 0.4078 | - | - |
| 0.2124 | 785 | 0.3317 | - | - |
| 0.2127 | 786 | 0.4269 | - | - |
| 0.2129 | 787 | 0.3311 | - | - |
| 0.2132 | 788 | 0.3335 | - | - |
| 0.2135 | 789 | 0.269 | - | - |
| 0.2137 | 790 | 0.3487 | - | - |
| 0.2140 | 791 | 0.3457 | - | - |
| 0.2143 | 792 | 0.3431 | - | - |
| 0.2146 | 793 | 0.3441 | - | - |
| 0.2148 | 794 | 0.2875 | - | - |
| 0.2151 | 795 | 0.364 | - | - |
| 0.2154 | 796 | 0.4348 | - | - |
| 0.2156 | 797 | 0.3488 | - | - |
| 0.2159 | 798 | 0.2742 | - | - |
| 0.2162 | 799 | 0.4424 | - | - |
| 0.2165 | 800 | 0.3975 | - | - |
| 0.2167 | 801 | 0.4244 | - | - |
| 0.2170 | 802 | 0.385 | - | - |
| 0.2173 | 803 | 0.3402 | - | - |
| 0.2175 | 804 | 0.3547 | - | - |
| 0.2178 | 805 | 0.455 | - | - |
| 0.2181 | 806 | 0.5426 | - | - |
| 0.2183 | 807 | 0.4007 | - | - |
| 0.2186 | 808 | 0.3376 | - | - |
| 0.2189 | 809 | 0.3058 | - | - |
| 0.2192 | 810 | 0.412 | - | - |
| 0.2194 | 811 | 0.3868 | - | - |
| 0.2197 | 812 | 0.3712 | - | - |
| 0.2200 | 813 | 0.3184 | - | - |
| 0.2202 | 814 | 0.304 | - | - |
| 0.2205 | 815 | 0.4657 | - | - |
| 0.2208 | 816 | 0.2557 | - | - |
| 0.2210 | 817 | 0.3727 | - | - |
| 0.2213 | 818 | 0.3147 | - | - |
| 0.2216 | 819 | 0.3845 | - | - |
| 0.2219 | 820 | 0.32 | - | - |
| 0.2221 | 821 | 0.3003 | - | - |
| 0.2224 | 822 | 0.4375 | - | - |
| 0.2227 | 823 | 0.3704 | - | - |
| 0.2229 | 824 | 0.4824 | - | - |
| 0.2232 | 825 | 0.3775 | - | - |
| 0.2235 | 826 | 0.4419 | - | - |
| 0.2238 | 827 | 0.4566 | - | - |
| 0.2240 | 828 | 0.3946 | - | - |
| 0.2243 | 829 | 0.2748 | - | - |
| 0.2246 | 830 | 0.3602 | - | - |
| 0.2248 | 831 | 0.3373 | - | - |
| 0.2251 | 832 | 0.4505 | - | - |
| 0.2254 | 833 | 0.3683 | - | - |
| 0.2256 | 834 | 0.4232 | - | - |
| 0.2259 | 835 | 0.3398 | - | - |
| 0.2262 | 836 | 0.3074 | - | - |
| 0.2265 | 837 | 0.3726 | - | - |
| 0.2267 | 838 | 0.2982 | - | - |
| 0.2270 | 839 | 0.3812 | - | - |
| 0.2273 | 840 | 0.3428 | - | - |
| 0.2275 | 841 | 0.3911 | - | - |
| 0.2278 | 842 | 0.2767 | - | - |
| 0.2281 | 843 | 0.4704 | - | - |
| 0.2284 | 844 | 0.4487 | - | - |
| 0.2286 | 845 | 0.3709 | - | - |
| 0.2289 | 846 | 0.4194 | - | - |
| 0.2292 | 847 | 0.4367 | - | - |
| 0.2294 | 848 | 0.2981 | - | - |
| 0.2297 | 849 | 0.3883 | - | - |
| 0.2300 | 850 | 0.4104 | - | - |
| 0.2302 | 851 | 0.4059 | - | - |
| 0.2305 | 852 | 0.3729 | - | - |
| 0.2308 | 853 | 0.3828 | - | - |
| 0.2311 | 854 | 0.3498 | - | - |
| 0.2313 | 855 | 0.2595 | - | - |
| 0.2316 | 856 | 0.3407 | - | - |
| 0.2319 | 857 | 0.3798 | - | - |
| 0.2321 | 858 | 0.445 | - | - |
| 0.2324 | 859 | 0.3066 | - | - |
| 0.2327 | 860 | 0.3882 | - | - |
| 0.2330 | 861 | 0.457 | - | - |
| 0.2332 | 862 | 0.2386 | - | - |
| 0.2335 | 863 | 0.3183 | - | - |
| 0.2338 | 864 | 0.2541 | - | - |
| 0.2340 | 865 | 0.3393 | - | - |
| 0.2343 | 866 | 0.3825 | - | - |
| 0.2346 | 867 | 0.3886 | - | - |
| 0.2348 | 868 | 0.3326 | - | - |
| 0.2351 | 869 | 0.2589 | - | - |
| 0.2354 | 870 | 0.3049 | - | - |
| 0.2357 | 871 | 0.2513 | - | - |
| 0.2359 | 872 | 0.286 | - | - |
| 0.2362 | 873 | 0.477 | - | - |
| 0.2365 | 874 | 0.452 | - | - |
| 0.2367 | 875 | 0.3864 | - | - |
| 0.2370 | 876 | 0.2677 | - | - |
| 0.2373 | 877 | 0.2811 | - | - |
| 0.2376 | 878 | 0.4972 | - | - |
| 0.2378 | 879 | 0.3793 | - | - |
| 0.2381 | 880 | 0.4091 | - | - |
| 0.2384 | 881 | 0.4446 | - | - |
| 0.2386 | 882 | 0.3355 | - | - |
| 0.2389 | 883 | 0.2959 | - | - |
| 0.2392 | 884 | 0.4378 | - | - |
| 0.2394 | 885 | 0.5828 | - | - |
| 0.2397 | 886 | 0.343 | - | - |
| 0.2400 | 887 | 0.4026 | - | - |
| 0.2403 | 888 | 0.4142 | - | - |
| 0.2405 | 889 | 0.3471 | - | - |
| 0.2408 | 890 | 0.4129 | - | - |
| 0.2411 | 891 | 0.3108 | - | - |
| 0.2413 | 892 | 0.2943 | - | - |
| 0.2416 | 893 | 0.3831 | - | - |
| 0.2419 | 894 | 0.3444 | - | - |
| 0.2422 | 895 | 0.2944 | - | - |
| 0.2424 | 896 | 0.444 | - | - |
| 0.2427 | 897 | 0.4253 | - | - |
| 0.2430 | 898 | 0.3068 | - | - |
| 0.2432 | 899 | 0.2753 | - | - |
| 0.2435 | 900 | 0.2619 | - | - |
| 0.2438 | 901 | 0.4103 | - | - |
| 0.2440 | 902 | 0.2468 | - | - |
| 0.2443 | 903 | 0.46 | - | - |
| 0.2446 | 904 | 0.4689 | - | - |
| 0.2449 | 905 | 0.3259 | - | - |
| 0.2451 | 906 | 0.46 | - | - |
| 0.2454 | 907 | 0.3254 | - | - |
| 0.2457 | 908 | 0.4582 | - | - |
| 0.2459 | 909 | 0.2537 | - | - |
| 0.2462 | 910 | 0.2723 | - | - |
| 0.2465 | 911 | 0.4031 | - | - |
| 0.2468 | 912 | 0.4395 | - | - |
| 0.2470 | 913 | 0.3691 | - | - |
| 0.2473 | 914 | 0.3314 | - | - |
| 0.2476 | 915 | 0.3831 | - | - |
| 0.2478 | 916 | 0.3194 | - | - |
| 0.2481 | 917 | 0.3103 | - | - |
| 0.2484 | 918 | 0.3532 | - | - |
| 0.2486 | 919 | 0.3574 | - | - |
| 0.2489 | 920 | 0.3837 | - | - |
| 0.2492 | 921 | 0.2775 | - | - |
| 0.2495 | 922 | 0.413 | - | - |
| 0.2497 | 923 | 0.3153 | - | - |
| 0.25 | 924 | 0.294 | - | - |
| 0.2503 | 925 | 0.2577 | - | - |
| 0.2505 | 926 | 0.4223 | - | - |
| 0.2508 | 927 | 0.3239 | - | - |
| 0.2511 | 928 | 0.4217 | - | - |
| 0.2514 | 929 | 0.3509 | - | - |
| 0.2516 | 930 | 0.313 | - | - |
| 0.2519 | 931 | 0.3246 | - | - |
| 0.2522 | 932 | 0.4282 | - | - |
| 0.2524 | 933 | 0.3892 | - | - |
| 0.2527 | 934 | 0.3826 | - | - |
| 0.2530 | 935 | 0.3192 | - | - |
| 0.2532 | 936 | 0.2984 | - | - |
| 0.2535 | 937 | 0.3143 | - | - |
| 0.2538 | 938 | 0.2451 | - | - |
| 0.2541 | 939 | 0.2108 | - | - |
| 0.2543 | 940 | 0.4843 | - | - |
| 0.2546 | 941 | 0.4296 | - | - |
| 0.2549 | 942 | 0.3882 | - | - |
| 0.2551 | 943 | 0.3971 | - | - |
| 0.2554 | 944 | 0.3021 | - | - |
| 0.2557 | 945 | 0.3535 | - | - |
| 0.2560 | 946 | 0.4501 | - | - |
| 0.2562 | 947 | 0.3274 | - | - |
| 0.2565 | 948 | 0.427 | - | - |
| 0.2568 | 949 | 0.3689 | - | - |
| 0.2570 | 950 | 0.2856 | - | - |
| 0.2573 | 951 | 0.4162 | - | - |
| 0.2576 | 952 | 0.298 | - | - |
| 0.2578 | 953 | 0.2986 | - | - |
| 0.2581 | 954 | 0.2839 | - | - |
| 0.2584 | 955 | 0.3835 | - | - |
| 0.2587 | 956 | 0.334 | - | - |
| 0.2589 | 957 | 0.3741 | - | - |
| 0.2592 | 958 | 0.329 | - | - |
| 0.2595 | 959 | 0.4423 | - | - |
| 0.2597 | 960 | 0.4031 | - | - |
| 0.2600 | 961 | 0.4467 | - | - |
| 0.2603 | 962 | 0.4164 | - | - |
| 0.2606 | 963 | 0.4399 | - | - |
| 0.2608 | 964 | 0.3872 | - | - |
| 0.2611 | 965 | 0.3178 | - | - |
| 0.2614 | 966 | 0.3842 | - | - |
| 0.2616 | 967 | 0.3568 | - | - |
| 0.2619 | 968 | 0.377 | - | - |
| 0.2622 | 969 | 0.3886 | - | - |
| 0.2624 | 970 | 0.4274 | - | - |
| 0.2627 | 971 | 0.3356 | - | - |
| 0.2630 | 972 | 0.352 | - | - |
| 0.2633 | 973 | 0.3758 | - | - |
| 0.2635 | 974 | 0.3294 | - | - |
| 0.2638 | 975 | 0.429 | - | - |
| 0.2641 | 976 | 0.2898 | - | - |
| 0.2643 | 977 | 0.2611 | - | - |
| 0.2646 | 978 | 0.3543 | - | - |
| 0.2649 | 979 | 0.2723 | - | - |
| 0.2652 | 980 | 0.3567 | - | - |
| 0.2654 | 981 | 0.3958 | - | - |
| 0.2657 | 982 | 0.3535 | - | - |
| 0.2660 | 983 | 0.2934 | - | - |
| 0.2662 | 984 | 0.4271 | - | - |
| 0.2665 | 985 | 0.2764 | - | - |
| 0.2668 | 986 | 0.4142 | - | - |
| 0.2670 | 987 | 0.3972 | - | - |
| 0.2673 | 988 | 0.4253 | - | - |
| 0.2676 | 989 | 0.2593 | - | - |
| 0.2679 | 990 | 0.4194 | - | - |
| 0.2681 | 991 | 0.3026 | - | - |
| 0.2684 | 992 | 0.2887 | - | - |
| 0.2687 | 993 | 0.3461 | - | - |
| 0.2689 | 994 | 0.3619 | - | - |
| 0.2692 | 995 | 0.3621 | - | - |
| 0.2695 | 996 | 0.3187 | - | - |
| 0.2698 | 997 | 0.3614 | - | - |
| 0.2700 | 998 | 0.2672 | - | - |
| 0.2703 | 999 | 0.375 | - | - |
| 0.2706 | 1000 | 0.285 | 0.3131 | 0.919 |
| 0.2708 | 1001 | 0.265 | - | - |
| 0.2711 | 1002 | 0.333 | - | - |
| 0.2714 | 1003 | 0.402 | - | - |
| 0.2716 | 1004 | 0.3103 | - | - |
| 0.2719 | 1005 | 0.3531 | - | - |
| 0.2722 | 1006 | 0.4888 | - | - |
| 0.2725 | 1007 | 0.3325 | - | - |
| 0.2727 | 1008 | 0.338 | - | - |
| 0.2730 | 1009 | 0.2637 | - | - |
| 0.2733 | 1010 | 0.3157 | - | - |
| 0.2735 | 1011 | 0.3101 | - | - |
| 0.2738 | 1012 | 0.3077 | - | - |
| 0.2741 | 1013 | 0.2603 | - | - |
| 0.2744 | 1014 | 0.3019 | - | - |
| 0.2746 | 1015 | 0.3775 | - | - |
| 0.2749 | 1016 | 0.4358 | - | - |
| 0.2752 | 1017 | 0.2512 | - | - |
| 0.2754 | 1018 | 0.3666 | - | - |
| 0.2757 | 1019 | 0.3002 | - | - |
| 0.2760 | 1020 | 0.2567 | - | - |
| 0.2762 | 1021 | 0.3584 | - | - |
| 0.2765 | 1022 | 0.2386 | - | - |
| 0.2768 | 1023 | 0.3902 | - | - |
| 0.2771 | 1024 | 0.2398 | - | - |
| 0.2773 | 1025 | 0.2573 | - | - |
| 0.2776 | 1026 | 0.2819 | - | - |
| 0.2779 | 1027 | 0.3095 | - | - |
| 0.2781 | 1028 | 0.2504 | - | - |
| 0.2784 | 1029 | 0.3288 | - | - |
| 0.2787 | 1030 | 0.4287 | - | - |
| 0.2790 | 1031 | 0.3384 | - | - |
| 0.2792 | 1032 | 0.3599 | - | - |
| 0.2795 | 1033 | 0.3052 | - | - |
| 0.2798 | 1034 | 0.3415 | - | - |
| 0.2800 | 1035 | 0.343 | - | - |
| 0.2803 | 1036 | 0.4511 | - | - |
| 0.2806 | 1037 | 0.3303 | - | - |
| 0.2808 | 1038 | 0.3797 | - | - |
| 0.2811 | 1039 | 0.3592 | - | - |
| 0.2814 | 1040 | 0.3932 | - | - |
| 0.2817 | 1041 | 0.3272 | - | - |
| 0.2819 | 1042 | 0.3413 | - | - |
| 0.2822 | 1043 | 0.3899 | - | - |
| 0.2825 | 1044 | 0.3189 | - | - |
| 0.2827 | 1045 | 0.3665 | - | - |
| 0.2830 | 1046 | 0.2467 | - | - |
| 0.2833 | 1047 | 0.2936 | - | - |
| 0.2835 | 1048 | 0.3552 | - | - |
| 0.2838 | 1049 | 0.3169 | - | - |
| 0.2841 | 1050 | 0.3157 | - | - |
| 0.2844 | 1051 | 0.3577 | - | - |
| 0.2846 | 1052 | 0.3009 | - | - |
| 0.2849 | 1053 | 0.2991 | - | - |
| 0.2852 | 1054 | 0.4104 | - | - |
| 0.2854 | 1055 | 0.2816 | - | - |
| 0.2857 | 1056 | 0.2779 | - | - |
| 0.2860 | 1057 | 0.4574 | - | - |
| 0.2863 | 1058 | 0.3233 | - | - |
| 0.2865 | 1059 | 0.3666 | - | - |
| 0.2868 | 1060 | 0.2423 | - | - |
| 0.2871 | 1061 | 0.4268 | - | - |
| 0.2873 | 1062 | 0.3156 | - | - |
| 0.2876 | 1063 | 0.353 | - | - |
| 0.2879 | 1064 | 0.3159 | - | - |
| 0.2881 | 1065 | 0.2713 | - | - |
| 0.2884 | 1066 | 0.3764 | - | - |
| 0.2887 | 1067 | 0.33 | - | - |
| 0.2890 | 1068 | 0.4578 | - | - |
| 0.2892 | 1069 | 0.2696 | - | - |
| 0.2895 | 1070 | 0.5282 | - | - |
| 0.2898 | 1071 | 0.2719 | - | - |
| 0.2900 | 1072 | 0.2023 | - | - |
| 0.2903 | 1073 | 0.3608 | - | - |
| 0.2906 | 1074 | 0.3293 | - | - |
| 0.2909 | 1075 | 0.4331 | - | - |
| 0.2911 | 1076 | 0.4126 | - | - |
| 0.2914 | 1077 | 0.3154 | - | - |
| 0.2917 | 1078 | 0.5337 | - | - |
| 0.2919 | 1079 | 0.339 | - | - |
| 0.2922 | 1080 | 0.3462 | - | - |
| 0.2925 | 1081 | 0.3614 | - | - |
| 0.2927 | 1082 | 0.3874 | - | - |
| 0.2930 | 1083 | 0.3068 | - | - |
| 0.2933 | 1084 | 0.2818 | - | - |
| 0.2936 | 1085 | 0.3615 | - | - |
| 0.2938 | 1086 | 0.2457 | - | - |
| 0.2941 | 1087 | 0.4074 | - | - |
| 0.2944 | 1088 | 0.3051 | - | - |
| 0.2946 | 1089 | 0.3238 | - | - |
| 0.2949 | 1090 | 0.3575 | - | - |
| 0.2952 | 1091 | 0.3145 | - | - |
| 0.2955 | 1092 | 0.2649 | - | - |
| 0.2957 | 1093 | 0.3485 | - | - |
| 0.2960 | 1094 | 0.2949 | - | - |
| 0.2963 | 1095 | 0.4315 | - | - |
| 0.2965 | 1096 | 0.3595 | - | - |
| 0.2968 | 1097 | 0.3465 | - | - |
| 0.2971 | 1098 | 0.3012 | - | - |
| 0.2973 | 1099 | 0.2986 | - | - |
| 0.2976 | 1100 | 0.3918 | - | - |
| 0.2979 | 1101 | 0.3563 | - | - |
| 0.2982 | 1102 | 0.2181 | - | - |
| 0.2984 | 1103 | 0.3051 | - | - |
| 0.2987 | 1104 | 0.3222 | - | - |
| 0.2990 | 1105 | 0.4502 | - | - |
| 0.2992 | 1106 | 0.2323 | - | - |
| 0.2995 | 1107 | 0.4678 | - | - |
| 0.2998 | 1108 | 0.3744 | - | - |
| 0.3001 | 1109 | 0.3787 | - | - |
| 0.3003 | 1110 | 0.4103 | - | - |
| 0.3006 | 1111 | 0.3141 | - | - |
| 0.3009 | 1112 | 0.2865 | - | - |
| 0.3011 | 1113 | 0.3028 | - | - |
| 0.3014 | 1114 | 0.3659 | - | - |
| 0.3017 | 1115 | 0.3952 | - | - |
| 0.3019 | 1116 | 0.5973 | - | - |
| 0.3022 | 1117 | 0.2921 | - | - |
| 0.3025 | 1118 | 0.2741 | - | - |
| 0.3028 | 1119 | 0.313 | - | - |
| 0.3030 | 1120 | 0.2989 | - | - |
| 0.3033 | 1121 | 0.3466 | - | - |
| 0.3036 | 1122 | 0.3237 | - | - |
| 0.3038 | 1123 | 0.4059 | - | - |
| 0.3041 | 1124 | 0.2759 | - | - |
| 0.3044 | 1125 | 0.3335 | - | - |
| 0.3047 | 1126 | 0.2879 | - | - |
| 0.3049 | 1127 | 0.4204 | - | - |
| 0.3052 | 1128 | 0.4009 | - | - |
| 0.3055 | 1129 | 0.31 | - | - |
| 0.3057 | 1130 | 0.4255 | - | - |
| 0.3060 | 1131 | 0.3863 | - | - |
| 0.3063 | 1132 | 0.3819 | - | - |
| 0.3065 | 1133 | 0.3316 | - | - |
| 0.3068 | 1134 | 0.3721 | - | - |
| 0.3071 | 1135 | 0.4282 | - | - |
| 0.3074 | 1136 | 0.5464 | - | - |
| 0.3076 | 1137 | 0.2696 | - | - |
| 0.3079 | 1138 | 0.315 | - | - |
| 0.3082 | 1139 | 0.3263 | - | - |
| 0.3084 | 1140 | 0.3488 | - | - |
| 0.3087 | 1141 | 0.3922 | - | - |
| 0.3090 | 1142 | 0.3279 | - | - |
| 0.3093 | 1143 | 0.2185 | - | - |
| 0.3095 | 1144 | 0.2331 | - | - |
| 0.3098 | 1145 | 0.2982 | - | - |
| 0.3101 | 1146 | 0.291 | - | - |
| 0.3103 | 1147 | 0.3611 | - | - |
| 0.3106 | 1148 | 0.3028 | - | - |
| 0.3109 | 1149 | 0.3954 | - | - |
| 0.3111 | 1150 | 0.3638 | - | - |
| 0.3114 | 1151 | 0.332 | - | - |
| 0.3117 | 1152 | 0.2228 | - | - |
| 0.3120 | 1153 | 0.3048 | - | - |
| 0.3122 | 1154 | 0.2789 | - | - |
| 0.3125 | 1155 | 0.2997 | - | - |
| 0.3128 | 1156 | 0.3662 | - | - |
| 0.3130 | 1157 | 0.3456 | - | - |
| 0.3133 | 1158 | 0.2927 | - | - |
| 0.3136 | 1159 | 0.3326 | - | - |
| 0.3139 | 1160 | 0.27 | - | - |
| 0.3141 | 1161 | 0.2756 | - | - |
| 0.3144 | 1162 | 0.3869 | - | - |
| 0.3147 | 1163 | 0.3463 | - | - |
| 0.3149 | 1164 | 0.3361 | - | - |
| 0.3152 | 1165 | 0.3088 | - | - |
| 0.3155 | 1166 | 0.3052 | - | - |
| 0.3157 | 1167 | 0.2964 | - | - |
| 0.3160 | 1168 | 0.2978 | - | - |
| 0.3163 | 1169 | 0.3723 | - | - |
| 0.3166 | 1170 | 0.2526 | - | - |
| 0.3168 | 1171 | 0.3881 | - | - |
| 0.3171 | 1172 | 0.281 | - | - |
| 0.3174 | 1173 | 0.2978 | - | - |
| 0.3176 | 1174 | 0.3354 | - | - |
| 0.3179 | 1175 | 0.2581 | - | - |
| 0.3182 | 1176 | 0.3478 | - | - |
| 0.3185 | 1177 | 0.3815 | - | - |
| 0.3187 | 1178 | 0.3078 | - | - |
| 0.3190 | 1179 | 0.2828 | - | - |
| 0.3193 | 1180 | 0.3003 | - | - |
| 0.3195 | 1181 | 0.3345 | - | - |
| 0.3198 | 1182 | 0.4192 | - | - |
| 0.3201 | 1183 | 0.3246 | - | - |
| 0.3203 | 1184 | 0.3861 | - | - |
| 0.3206 | 1185 | 0.3267 | - | - |
| 0.3209 | 1186 | 0.4421 | - | - |
| 0.3212 | 1187 | 0.3226 | - | - |
| 0.3214 | 1188 | 0.3563 | - | - |
| 0.3217 | 1189 | 0.3717 | - | - |
| 0.3220 | 1190 | 0.34 | - | - |
| 0.3222 | 1191 | 0.3757 | - | - |
| 0.3225 | 1192 | 0.3114 | - | - |
| 0.3228 | 1193 | 0.5106 | - | - |
| 0.3231 | 1194 | 0.2707 | - | - |
| 0.3233 | 1195 | 0.3091 | - | - |
| 0.3236 | 1196 | 0.4106 | - | - |
| 0.3239 | 1197 | 0.215 | - | - |
| 0.3241 | 1198 | 0.3182 | - | - |
| 0.3244 | 1199 | 0.3747 | - | - |
| 0.3247 | 1200 | 0.3645 | - | - |
| 0.3249 | 1201 | 0.3587 | - | - |
| 0.3252 | 1202 | 0.3672 | - | - |
| 0.3255 | 1203 | 0.3229 | - | - |
| 0.3258 | 1204 | 0.4058 | - | - |
| 0.3260 | 1205 | 0.2357 | - | - |
| 0.3263 | 1206 | 0.3266 | - | - |
| 0.3266 | 1207 | 0.3868 | - | - |
| 0.3268 | 1208 | 0.3269 | - | - |
| 0.3271 | 1209 | 0.3507 | - | - |
| 0.3274 | 1210 | 0.277 | - | - |
| 0.3277 | 1211 | 0.2645 | - | - |
| 0.3279 | 1212 | 0.3119 | - | - |
| 0.3282 | 1213 | 0.3348 | - | - |
| 0.3285 | 1214 | 0.3285 | - | - |
| 0.3287 | 1215 | 0.358 | - | - |
| 0.3290 | 1216 | 0.386 | - | - |
| 0.3293 | 1217 | 0.1993 | - | - |
| 0.3295 | 1218 | 0.4288 | - | - |
| 0.3298 | 1219 | 0.334 | - | - |
| 0.3301 | 1220 | 0.3295 | - | - |
| 0.3304 | 1221 | 0.3733 | - | - |
| 0.3306 | 1222 | 0.4579 | - | - |
| 0.3309 | 1223 | 0.3301 | - | - |
| 0.3312 | 1224 | 0.3008 | - | - |
| 0.3314 | 1225 | 0.3629 | - | - |
| 0.3317 | 1226 | 0.3995 | - | - |
| 0.3320 | 1227 | 0.2547 | - | - |
| 0.3323 | 1228 | 0.2691 | - | - |
| 0.3325 | 1229 | 0.2456 | - | - |
| 0.3328 | 1230 | 0.2411 | - | - |
| 0.3331 | 1231 | 0.2555 | - | - |
| 0.3333 | 1232 | 0.3296 | - | - |
| 0.3336 | 1233 | 0.3376 | - | - |
| 0.3339 | 1234 | 0.366 | - | - |
| 0.3341 | 1235 | 0.3086 | - | - |
| 0.3344 | 1236 | 0.5035 | - | - |
| 0.3347 | 1237 | 0.347 | - | - |
| 0.3350 | 1238 | 0.3955 | - | - |
| 0.3352 | 1239 | 0.301 | - | - |
| 0.3355 | 1240 | 0.2736 | - | - |
| 0.3358 | 1241 | 0.3868 | - | - |
| 0.3360 | 1242 | 0.2665 | - | - |
| 0.3363 | 1243 | 0.4783 | - | - |
| 0.3366 | 1244 | 0.3868 | - | - |
| 0.3369 | 1245 | 0.3709 | - | - |
| 0.3371 | 1246 | 0.3816 | - | - |
| 0.3374 | 1247 | 0.4771 | - | - |
| 0.3377 | 1248 | 0.3187 | - | - |
| 0.3379 | 1249 | 0.3167 | - | - |
| 0.3382 | 1250 | 0.3947 | - | - |
| 0.3385 | 1251 | 0.3201 | - | - |
| 0.3387 | 1252 | 0.3417 | - | - |
| 0.3390 | 1253 | 0.2906 | - | - |
| 0.3393 | 1254 | 0.3593 | - | - |
| 0.3396 | 1255 | 0.3965 | - | - |
| 0.3398 | 1256 | 0.3212 | - | - |
| 0.3401 | 1257 | 0.4542 | - | - |
| 0.3404 | 1258 | 0.3274 | - | - |
| 0.3406 | 1259 | 0.3206 | - | - |
| 0.3409 | 1260 | 0.278 | - | - |
| 0.3412 | 1261 | 0.3844 | - | - |
| 0.3415 | 1262 | 0.1857 | - | - |
| 0.3417 | 1263 | 0.2245 | - | - |
| 0.3420 | 1264 | 0.2125 | - | - |
| 0.3423 | 1265 | 0.2782 | - | - |
| 0.3425 | 1266 | 0.3194 | - | - |
| 0.3428 | 1267 | 0.3262 | - | - |
| 0.3431 | 1268 | 0.4295 | - | - |
| 0.3433 | 1269 | 0.2837 | - | - |
| 0.3436 | 1270 | 0.2221 | - | - |
| 0.3439 | 1271 | 0.255 | - | - |
| 0.3442 | 1272 | 0.1959 | - | - |
| 0.3444 | 1273 | 0.3568 | - | - |
| 0.3447 | 1274 | 0.3716 | - | - |
| 0.3450 | 1275 | 0.437 | - | - |
| 0.3452 | 1276 | 0.5078 | - | - |
| 0.3455 | 1277 | 0.2689 | - | - |
| 0.3458 | 1278 | 0.3653 | - | - |
| 0.3460 | 1279 | 0.3522 | - | - |
| 0.3463 | 1280 | 0.2809 | - | - |
| 0.3466 | 1281 | 0.3302 | - | - |
| 0.3469 | 1282 | 0.3689 | - | - |
| 0.3471 | 1283 | 0.3597 | - | - |
| 0.3474 | 1284 | 0.2672 | - | - |
| 0.3477 | 1285 | 0.2679 | - | - |
| 0.3479 | 1286 | 0.2393 | - | - |
| 0.3482 | 1287 | 0.3753 | - | - |
| 0.3485 | 1288 | 0.3876 | - | - |
| 0.3488 | 1289 | 0.2384 | - | - |
| 0.3490 | 1290 | 0.411 | - | - |
| 0.3493 | 1291 | 0.3 | - | - |
| 0.3496 | 1292 | 0.2367 | - | - |
| 0.3498 | 1293 | 0.3404 | - | - |
| 0.3501 | 1294 | 0.2742 | - | - |
| 0.3504 | 1295 | 0.436 | - | - |
| 0.3506 | 1296 | 0.2488 | - | - |
| 0.3509 | 1297 | 0.2625 | - | - |
| 0.3512 | 1298 | 0.2607 | - | - |
| 0.3515 | 1299 | 0.2273 | - | - |
| 0.3517 | 1300 | 0.3105 | - | - |
| 0.3520 | 1301 | 0.4418 | - | - |
| 0.3523 | 1302 | 0.3452 | - | - |
| 0.3525 | 1303 | 0.4404 | - | - |
| 0.3528 | 1304 | 0.3159 | - | - |
| 0.3531 | 1305 | 0.2851 | - | - |
| 0.3534 | 1306 | 0.3366 | - | - |
| 0.3536 | 1307 | 0.3255 | - | - |
| 0.3539 | 1308 | 0.4102 | - | - |
| 0.3542 | 1309 | 0.356 | - | - |
| 0.3544 | 1310 | 0.2882 | - | - |
| 0.3547 | 1311 | 0.3868 | - | - |
| 0.3550 | 1312 | 0.2843 | - | - |
| 0.3552 | 1313 | 0.3056 | - | - |
| 0.3555 | 1314 | 0.3019 | - | - |
| 0.3558 | 1315 | 0.3629 | - | - |
| 0.3561 | 1316 | 0.3249 | - | - |
| 0.3563 | 1317 | 0.3416 | - | - |
| 0.3566 | 1318 | 0.3334 | - | - |
| 0.3569 | 1319 | 0.3192 | - | - |
| 0.3571 | 1320 | 0.2987 | - | - |
| 0.3574 | 1321 | 0.4592 | - | - |
| 0.3577 | 1322 | 0.3347 | - | - |
| 0.3580 | 1323 | 0.3225 | - | - |
| 0.3582 | 1324 | 0.2893 | - | - |
| 0.3585 | 1325 | 0.2756 | - | - |
| 0.3588 | 1326 | 0.3101 | - | - |
| 0.3590 | 1327 | 0.3585 | - | - |
| 0.3593 | 1328 | 0.3718 | - | - |
| 0.3596 | 1329 | 0.3739 | - | - |
| 0.3598 | 1330 | 0.3745 | - | - |
| 0.3601 | 1331 | 0.3092 | - | - |
| 0.3604 | 1332 | 0.3439 | - | - |
| 0.3607 | 1333 | 0.4166 | - | - |
| 0.3609 | 1334 | 0.2473 | - | - |
| 0.3612 | 1335 | 0.4276 | - | - |
| 0.3615 | 1336 | 0.3324 | - | - |
| 0.3617 | 1337 | 0.316 | - | - |
| 0.3620 | 1338 | 0.2866 | - | - |
| 0.3623 | 1339 | 0.3335 | - | - |
| 0.3626 | 1340 | 0.4195 | - | - |
| 0.3628 | 1341 | 0.404 | - | - |
| 0.3631 | 1342 | 0.2932 | - | - |
| 0.3634 | 1343 | 0.2803 | - | - |
| 0.3636 | 1344 | 0.3479 | - | - |
| 0.3639 | 1345 | 0.3089 | - | - |
| 0.3642 | 1346 | 0.2704 | - | - |
| 0.3644 | 1347 | 0.2594 | - | - |
| 0.3647 | 1348 | 0.3865 | - | - |
| 0.3650 | 1349 | 0.3355 | - | - |
| 0.3653 | 1350 | 0.2783 | - | - |
| 0.3655 | 1351 | 0.3247 | - | - |
| 0.3658 | 1352 | 0.2388 | - | - |
| 0.3661 | 1353 | 0.2224 | - | - |
| 0.3663 | 1354 | 0.3406 | - | - |
| 0.3666 | 1355 | 0.287 | - | - |
| 0.3669 | 1356 | 0.2588 | - | - |
| 0.3672 | 1357 | 0.3212 | - | - |
| 0.3674 | 1358 | 0.2848 | - | - |
| 0.3677 | 1359 | 0.3124 | - | - |
| 0.3680 | 1360 | 0.3249 | - | - |
| 0.3682 | 1361 | 0.4642 | - | - |
| 0.3685 | 1362 | 0.2873 | - | - |
| 0.3688 | 1363 | 0.3088 | - | - |
| 0.3690 | 1364 | 0.383 | - | - |
| 0.3693 | 1365 | 0.3172 | - | - |
| 0.3696 | 1366 | 0.2822 | - | - |
| 0.3699 | 1367 | 0.2768 | - | - |
| 0.3701 | 1368 | 0.3302 | - | - |
| 0.3704 | 1369 | 0.343 | - | - |
| 0.3707 | 1370 | 0.3196 | - | - |
| 0.3709 | 1371 | 0.4174 | - | - |
| 0.3712 | 1372 | 0.3112 | - | - |
| 0.3715 | 1373 | 0.2883 | - | - |
| 0.3718 | 1374 | 0.3163 | - | - |
| 0.3720 | 1375 | 0.2534 | - | - |
| 0.3723 | 1376 | 0.3306 | - | - |
| 0.3726 | 1377 | 0.2289 | - | - |
| 0.3728 | 1378 | 0.3455 | - | - |
| 0.3731 | 1379 | 0.3523 | - | - |
| 0.3734 | 1380 | 0.2652 | - | - |
| 0.3736 | 1381 | 0.2843 | - | - |
| 0.3739 | 1382 | 0.3417 | - | - |
| 0.3742 | 1383 | 0.2493 | - | - |
| 0.3745 | 1384 | 0.282 | - | - |
| 0.3747 | 1385 | 0.3151 | - | - |
| 0.375 | 1386 | 0.3309 | - | - |
| 0.3753 | 1387 | 0.2056 | - | - |
| 0.3755 | 1388 | 0.2501 | - | - |
| 0.3758 | 1389 | 0.3405 | - | - |
| 0.3761 | 1390 | 0.3507 | - | - |
| 0.3764 | 1391 | 0.383 | - | - |
| 0.3766 | 1392 | 0.4098 | - | - |
| 0.3769 | 1393 | 0.3126 | - | - |
| 0.3772 | 1394 | 0.2638 | - | - |
| 0.3774 | 1395 | 0.3513 | - | - |
| 0.3777 | 1396 | 0.365 | - | - |
| 0.3780 | 1397 | 0.3619 | - | - |
| 0.3782 | 1398 | 0.1893 | - | - |
| 0.3785 | 1399 | 0.3793 | - | - |
| 0.3788 | 1400 | 0.2953 | - | - |
| 0.3791 | 1401 | 0.3451 | - | - |
| 0.3793 | 1402 | 0.3182 | - | - |
| 0.3796 | 1403 | 0.3521 | - | - |
| 0.3799 | 1404 | 0.2786 | - | - |
| 0.3801 | 1405 | 0.3593 | - | - |
| 0.3804 | 1406 | 0.4103 | - | - |
| 0.3807 | 1407 | 0.3579 | - | - |
| 0.3810 | 1408 | 0.2547 | - | - |
| 0.3812 | 1409 | 0.302 | - | - |
| 0.3815 | 1410 | 0.3491 | - | - |
| 0.3818 | 1411 | 0.2671 | - | - |
| 0.3820 | 1412 | 0.4096 | - | - |
| 0.3823 | 1413 | 0.2638 | - | - |
| 0.3826 | 1414 | 0.1952 | - | - |
| 0.3828 | 1415 | 0.3076 | - | - |
| 0.3831 | 1416 | 0.3095 | - | - |
| 0.3834 | 1417 | 0.3543 | - | - |
| 0.3837 | 1418 | 0.32 | - | - |
| 0.3839 | 1419 | 0.397 | - | - |
| 0.3842 | 1420 | 0.3316 | - | - |
| 0.3845 | 1421 | 0.2896 | - | - |
| 0.3847 | 1422 | 0.2966 | - | - |
| 0.3850 | 1423 | 0.3271 | - | - |
| 0.3853 | 1424 | 0.3092 | - | - |
| 0.3856 | 1425 | 0.3537 | - | - |
| 0.3858 | 1426 | 0.2749 | - | - |
| 0.3861 | 1427 | 0.3039 | - | - |
| 0.3864 | 1428 | 0.2842 | - | - |
| 0.3866 | 1429 | 0.3159 | - | - |
| 0.3869 | 1430 | 0.3417 | - | - |
| 0.3872 | 1431 | 0.3592 | - | - |
| 0.3874 | 1432 | 0.3783 | - | - |
| 0.3877 | 1433 | 0.3196 | - | - |
| 0.3880 | 1434 | 0.3329 | - | - |
| 0.3883 | 1435 | 0.2715 | - | - |
| 0.3885 | 1436 | 0.2283 | - | - |
| 0.3888 | 1437 | 0.2476 | - | - |
| 0.3891 | 1438 | 0.2958 | - | - |
| 0.3893 | 1439 | 0.2213 | - | - |
| 0.3896 | 1440 | 0.4275 | - | - |
| 0.3899 | 1441 | 0.3019 | - | - |
| 0.3902 | 1442 | 0.4343 | - | - |
| 0.3904 | 1443 | 0.297 | - | - |
| 0.3907 | 1444 | 0.2655 | - | - |
| 0.3910 | 1445 | 0.2607 | - | - |
| 0.3912 | 1446 | 0.3763 | - | - |
| 0.3915 | 1447 | 0.308 | - | - |
| 0.3918 | 1448 | 0.3473 | - | - |
| 0.3920 | 1449 | 0.3174 | - | - |
| 0.3923 | 1450 | 0.3241 | - | - |
| 0.3926 | 1451 | 0.3568 | - | - |
| 0.3929 | 1452 | 0.3041 | - | - |
| 0.3931 | 1453 | 0.327 | - | - |
| 0.3934 | 1454 | 0.4484 | - | - |
| 0.3937 | 1455 | 0.3508 | - | - |
| 0.3939 | 1456 | 0.3127 | - | - |
| 0.3942 | 1457 | 0.2704 | - | - |
| 0.3945 | 1458 | 0.4142 | - | - |
| 0.3948 | 1459 | 0.2167 | - | - |
| 0.3950 | 1460 | 0.3136 | - | - |
| 0.3953 | 1461 | 0.293 | - | - |
| 0.3956 | 1462 | 0.2908 | - | - |
| 0.3958 | 1463 | 0.2915 | - | - |
| 0.3961 | 1464 | 0.2654 | - | - |
| 0.3964 | 1465 | 0.3292 | - | - |
| 0.3966 | 1466 | 0.275 | - | - |
| 0.3969 | 1467 | 0.3244 | - | - |
| 0.3972 | 1468 | 0.3071 | - | - |
| 0.3975 | 1469 | 0.3341 | - | - |
| 0.3977 | 1470 | 0.352 | - | - |
| 0.3980 | 1471 | 0.3116 | - | - |
| 0.3983 | 1472 | 0.3123 | - | - |
| 0.3985 | 1473 | 0.3793 | - | - |
| 0.3988 | 1474 | 0.3694 | - | - |
| 0.3991 | 1475 | 0.3258 | - | - |
| 0.3994 | 1476 | 0.3305 | - | - |
| 0.3996 | 1477 | 0.3727 | - | - |
| 0.3999 | 1478 | 0.4845 | - | - |
| 0.4002 | 1479 | 0.2735 | - | - |
| 0.4004 | 1480 | 0.3541 | - | - |
| 0.4007 | 1481 | 0.3674 | - | - |
| 0.4010 | 1482 | 0.3042 | - | - |
| 0.4012 | 1483 | 0.4306 | - | - |
| 0.4015 | 1484 | 0.3802 | - | - |
| 0.4018 | 1485 | 0.3054 | - | - |
| 0.4021 | 1486 | 0.3294 | - | - |
| 0.4023 | 1487 | 0.3278 | - | - |
| 0.4026 | 1488 | 0.2426 | - | - |
| 0.4029 | 1489 | 0.3134 | - | - |
| 0.4031 | 1490 | 0.265 | - | - |
| 0.4034 | 1491 | 0.3262 | - | - |
| 0.4037 | 1492 | 0.2115 | - | - |
| 0.4040 | 1493 | 0.3547 | - | - |
| 0.4042 | 1494 | 0.3465 | - | - |
| 0.4045 | 1495 | 0.2602 | - | - |
| 0.4048 | 1496 | 0.3083 | - | - |
| 0.4050 | 1497 | 0.3452 | - | - |
| 0.4053 | 1498 | 0.3119 | - | - |
| 0.4056 | 1499 | 0.3158 | - | - |
| 0.4058 | 1500 | 0.292 | - | - |
| 0.4061 | 1501 | 0.3093 | - | - |
| 0.4064 | 1502 | 0.3745 | - | - |
| 0.4067 | 1503 | 0.3562 | - | - |
| 0.4069 | 1504 | 0.4018 | - | - |
| 0.4072 | 1505 | 0.3412 | - | - |
| 0.4075 | 1506 | 0.2803 | - | - |
| 0.4077 | 1507 | 0.261 | - | - |
| 0.4080 | 1508 | 0.2679 | - | - |
| 0.4083 | 1509 | 0.233 | - | - |
| 0.4085 | 1510 | 0.3224 | - | - |
| 0.4088 | 1511 | 0.2553 | - | - |
| 0.4091 | 1512 | 0.3856 | - | - |
| 0.4094 | 1513 | 0.2882 | - | - |
| 0.4096 | 1514 | 0.2913 | - | - |
| 0.4099 | 1515 | 0.3757 | - | - |
| 0.4102 | 1516 | 0.3336 | - | - |
| 0.4104 | 1517 | 0.3614 | - | - |
| 0.4107 | 1518 | 0.406 | - | - |
| 0.4110 | 1519 | 0.3836 | - | - |
| 0.4113 | 1520 | 0.3144 | - | - |
| 0.4115 | 1521 | 0.3723 | - | - |
| 0.4118 | 1522 | 0.309 | - | - |
| 0.4121 | 1523 | 0.2913 | - | - |
| 0.4123 | 1524 | 0.2922 | - | - |
| 0.4126 | 1525 | 0.3637 | - | - |
| 0.4129 | 1526 | 0.3487 | - | - |
| 0.4131 | 1527 | 0.2622 | - | - |
| 0.4134 | 1528 | 0.371 | - | - |
| 0.4137 | 1529 | 0.3331 | - | - |
| 0.4140 | 1530 | 0.3036 | - | - |
| 0.4142 | 1531 | 0.365 | - | - |
| 0.4145 | 1532 | 0.2434 | - | - |
| 0.4148 | 1533 | 0.4295 | - | - |
| 0.4150 | 1534 | 0.2469 | - | - |
| 0.4153 | 1535 | 0.2763 | - | - |
| 0.4156 | 1536 | 0.2392 | - | - |
| 0.4159 | 1537 | 0.3442 | - | - |
| 0.4161 | 1538 | 0.2683 | - | - |
| 0.4164 | 1539 | 0.3165 | - | - |
| 0.4167 | 1540 | 0.3609 | - | - |
| 0.4169 | 1541 | 0.2749 | - | - |
| 0.4172 | 1542 | 0.3656 | - | - |
| 0.4175 | 1543 | 0.2939 | - | - |
| 0.4177 | 1544 | 0.3216 | - | - |
| 0.4180 | 1545 | 0.2391 | - | - |
| 0.4183 | 1546 | 0.3019 | - | - |
| 0.4186 | 1547 | 0.4169 | - | - |
| 0.4188 | 1548 | 0.2874 | - | - |
| 0.4191 | 1549 | 0.2899 | - | - |
| 0.4194 | 1550 | 0.2812 | - | - |
| 0.4196 | 1551 | 0.3413 | - | - |
| 0.4199 | 1552 | 0.377 | - | - |
| 0.4202 | 1553 | 0.2849 | - | - |
| 0.4205 | 1554 | 0.2043 | - | - |
| 0.4207 | 1555 | 0.3214 | - | - |
| 0.4210 | 1556 | 0.2212 | - | - |
| 0.4213 | 1557 | 0.4131 | - | - |
| 0.4215 | 1558 | 0.4091 | - | - |
| 0.4218 | 1559 | 0.2656 | - | - |
| 0.4221 | 1560 | 0.4024 | - | - |
| 0.4223 | 1561 | 0.4297 | - | - |
| 0.4226 | 1562 | 0.3183 | - | - |
| 0.4229 | 1563 | 0.284 | - | - |
| 0.4232 | 1564 | 0.3087 | - | - |
| 0.4234 | 1565 | 0.2911 | - | - |
| 0.4237 | 1566 | 0.2488 | - | - |
| 0.4240 | 1567 | 0.2784 | - | - |
| 0.4242 | 1568 | 0.3067 | - | - |
| 0.4245 | 1569 | 0.3701 | - | - |
| 0.4248 | 1570 | 0.2763 | - | - |
| 0.4251 | 1571 | 0.2709 | - | - |
| 0.4253 | 1572 | 0.2955 | - | - |
| 0.4256 | 1573 | 0.3634 | - | - |
| 0.4259 | 1574 |
0.2968 | - | - | | 0.4261 | 1575 | 0.3411 | - | - | | 0.4264 | 1576 | 0.2878 | - | - | | 0.4267 | 1577 | 0.3299 | - | - | | 0.4269 | 1578 | 0.3076 | - | - | | 0.4272 | 1579 | 0.4037 | - | - | | 0.4275 | 1580 | 0.3145 | - | - | | 0.4278 | 1581 | 0.3472 | - | - | | 0.4280 | 1582 | 0.4746 | - | - | | 0.4283 | 1583 | 0.4133 | - | - | | 0.4286 | 1584 | 0.3383 | - | - | | 0.4288 | 1585 | 0.26 | - | - | | 0.4291 | 1586 | 0.2576 | - | - | | 0.4294 | 1587 | 0.2791 | - | - | | 0.4297 | 1588 | 0.2906 | - | - | | 0.4299 | 1589 | 0.314 | - | - | | 0.4302 | 1590 | 0.256 | - | - | | 0.4305 | 1591 | 0.3558 | - | - | | 0.4307 | 1592 | 0.3444 | - | - | | 0.4310 | 1593 | 0.3114 | - | - | | 0.4313 | 1594 | 0.3009 | - | - | | 0.4315 | 1595 | 0.2396 | - | - | | 0.4318 | 1596 | 0.2593 | - | - | | 0.4321 | 1597 | 0.3174 | - | - | | 0.4324 | 1598 | 0.2845 | - | - | | 0.4326 | 1599 | 0.3513 | - | - | | 0.4329 | 1600 | 0.2477 | - | - | | 0.4332 | 1601 | 0.3278 | - | - | | 0.4334 | 1602 | 0.2826 | - | - | | 0.4337 | 1603 | 0.2822 | - | - | | 0.4340 | 1604 | 0.2642 | - | - | | 0.4343 | 1605 | 0.2216 | - | - | | 0.4345 | 1606 | 0.3094 | - | - | | 0.4348 | 1607 | 0.2974 | - | - | | 0.4351 | 1608 | 0.2376 | - | - | | 0.4353 | 1609 | 0.311 | - | - | | 0.4356 | 1610 | 0.3213 | - | - | | 0.4359 | 1611 | 0.4042 | - | - | | 0.4361 | 1612 | 0.2256 | - | - | | 0.4364 | 1613 | 0.5054 | - | - | | 0.4367 | 1614 | 0.2997 | - | - | | 0.4370 | 1615 | 0.2637 | - | - | | 0.4372 | 1616 | 0.3322 | - | - | | 0.4375 | 1617 | 0.3703 | - | - | | 0.4378 | 1618 | 0.3901 | - | - | | 0.4380 | 1619 | 0.2318 | - | - | | 0.4383 | 1620 | 0.2835 | - | - | | 0.4386 | 1621 | 0.2978 | - | - | | 0.4389 | 1622 | 0.3346 | - | - | | 0.4391 | 1623 | 0.3628 | - | - | | 0.4394 | 1624 | 0.2674 | - | - | | 0.4397 | 1625 | 0.3236 | - | - | | 0.4399 | 1626 | 0.278 | - | - | | 0.4402 | 1627 | 0.3334 | - | - | | 0.4405 | 1628 | 0.2963 | - | - | | 0.4407 | 1629 | 0.3749 | - | - | | 0.4410 | 1630 | 0.2343 | - | - | | 0.4413 | 1631 | 0.2022 | - 
| - | | 0.4416 | 1632 | 0.2903 | - | - | | 0.4418 | 1633 | 0.2514 | - | - | | 0.4421 | 1634 | 0.3484 | - | - | | 0.4424 | 1635 | 0.275 | - | - | | 0.4426 | 1636 | 0.3407 | - | - | | 0.4429 | 1637 | 0.3139 | - | - | | 0.4432 | 1638 | 0.3343 | - | - | | 0.4435 | 1639 | 0.3925 | - | - | | 0.4437 | 1640 | 0.1999 | - | - | | 0.4440 | 1641 | 0.3318 | - | - | | 0.4443 | 1642 | 0.3439 | - | - | | 0.4445 | 1643 | 0.3689 | - | - | | 0.4448 | 1644 | 0.4289 | - | - | | 0.4451 | 1645 | 0.3181 | - | - | | 0.4453 | 1646 | 0.3545 | - | - | | 0.4456 | 1647 | 0.3583 | - | - | | 0.4459 | 1648 | 0.3065 | - | - | | 0.4462 | 1649 | 0.3479 | - | - | | 0.4464 | 1650 | 0.3788 | - | - | | 0.4467 | 1651 | 0.2848 | - | - | | 0.4470 | 1652 | 0.3141 | - | - | | 0.4472 | 1653 | 0.266 | - | - | | 0.4475 | 1654 | 0.3964 | - | - | | 0.4478 | 1655 | 0.3581 | - | - | | 0.4481 | 1656 | 0.4215 | - | - | | 0.4483 | 1657 | 0.2951 | - | - | | 0.4486 | 1658 | 0.1931 | - | - | | 0.4489 | 1659 | 0.3433 | - | - | | 0.4491 | 1660 | 0.346 | - | - | | 0.4494 | 1661 | 0.2408 | - | - | | 0.4497 | 1662 | 0.3135 | - | - | | 0.4499 | 1663 | 0.316 | - | - | | 0.4502 | 1664 | 0.3192 | - | - | | 0.4505 | 1665 | 0.2603 | - | - | | 0.4508 | 1666 | 0.3027 | - | - | | 0.4510 | 1667 | 0.3197 | - | - | | 0.4513 | 1668 | 0.2628 | - | - | | 0.4516 | 1669 | 0.2934 | - | - | | 0.4518 | 1670 | 0.305 | - | - | | 0.4521 | 1671 | 0.2776 | - | - | | 0.4524 | 1672 | 0.3222 | - | - | | 0.4527 | 1673 | 0.2787 | - | - | | 0.4529 | 1674 | 0.2959 | - | - | | 0.4532 | 1675 | 0.193 | - | - | | 0.4535 | 1676 | 0.2484 | - | - | | 0.4537 | 1677 | 0.261 | - | - | | 0.4540 | 1678 | 0.2162 | - | - | | 0.4543 | 1679 | 0.3156 | - | - | | 0.4545 | 1680 | 0.294 | - | - | | 0.4548 | 1681 | 0.3257 | - | - | | 0.4551 | 1682 | 0.374 | - | - | | 0.4554 | 1683 | 0.4185 | - | - | | 0.4556 | 1684 | 0.3447 | - | - | | 0.4559 | 1685 | 0.3498 | - | - | | 0.4562 | 1686 | 0.2802 | - | - | | 0.4564 | 1687 | 0.2454 | - | - | | 0.4567 | 1688 | 0.314 | - | - | | 0.4570 
| 1689 | 0.2863 | - | - | | 0.4573 | 1690 | 0.3427 | - | - | | 0.4575 | 1691 | 0.411 | - | - | | 0.4578 | 1692 | 0.3426 | - | - | | 0.4581 | 1693 | 0.2981 | - | - | | 0.4583 | 1694 | 0.2695 | - | - | | 0.4586 | 1695 | 0.2684 | - | - | | 0.4589 | 1696 | 0.3156 | - | - | | 0.4591 | 1697 | 0.2821 | - | - | | 0.4594 | 1698 | 0.2771 | - | - | | 0.4597 | 1699 | 0.2814 | - | - | | 0.4600 | 1700 | 0.438 | - | - | | 0.4602 | 1701 | 0.3238 | - | - | | 0.4605 | 1702 | 0.3357 | - | - | | 0.4608 | 1703 | 0.3173 | - | - | | 0.4610 | 1704 | 0.3449 | - | - | | 0.4613 | 1705 | 0.3006 | - | - | | 0.4616 | 1706 | 0.2668 | - | - | | 0.4619 | 1707 | 0.2207 | - | - | | 0.4621 | 1708 | 0.2732 | - | - | | 0.4624 | 1709 | 0.2932 | - | - | | 0.4627 | 1710 | 0.2876 | - | - | | 0.4629 | 1711 | 0.3651 | - | - | | 0.4632 | 1712 | 0.2588 | - | - | | 0.4635 | 1713 | 0.2924 | - | - | | 0.4637 | 1714 | 0.3066 | - | - | | 0.4640 | 1715 | 0.3097 | - | - | | 0.4643 | 1716 | 0.2903 | - | - | | 0.4646 | 1717 | 0.2954 | - | - | | 0.4648 | 1718 | 0.3254 | - | - | | 0.4651 | 1719 | 0.3473 | - | - | | 0.4654 | 1720 | 0.2877 | - | - | | 0.4656 | 1721 | 0.249 | - | - | | 0.4659 | 1722 | 0.3314 | - | - | | 0.4662 | 1723 | 0.2943 | - | - | | 0.4665 | 1724 | 0.2795 | - | - | | 0.4667 | 1725 | 0.3487 | - | - | | 0.4670 | 1726 | 0.2702 | - | - | | 0.4673 | 1727 | 0.376 | - | - | | 0.4675 | 1728 | 0.2944 | - | - | | 0.4678 | 1729 | 0.3628 | - | - | | 0.4681 | 1730 | 0.2901 | - | - | | 0.4683 | 1731 | 0.2995 | - | - | | 0.4686 | 1732 | 0.3562 | - | - | | 0.4689 | 1733 | 0.2696 | - | - | | 0.4692 | 1734 | 0.3227 | - | - | | 0.4694 | 1735 | 0.3213 | - | - | | 0.4697 | 1736 | 0.3491 | - | - | | 0.4700 | 1737 | 0.3207 | - | - | | 0.4702 | 1738 | 0.2993 | - | - | | 0.4705 | 1739 | 0.3539 | - | - | | 0.4708 | 1740 | 0.3892 | - | - | | 0.4710 | 1741 | 0.3387 | - | - | | 0.4713 | 1742 | 0.3199 | - | - | | 0.4716 | 1743 | 0.2784 | - | - | | 0.4719 | 1744 | 0.2633 | - | - | | 0.4721 | 1745 | 0.2245 | - | - | | 0.4724 | 1746 | 
0.2471 | - | - | | 0.4727 | 1747 | 0.2595 | - | - | | 0.4729 | 1748 | 0.4358 | - | - | | 0.4732 | 1749 | 0.2905 | - | - | | 0.4735 | 1750 | 0.3258 | - | - | | 0.4738 | 1751 | 0.3212 | - | - | | 0.4740 | 1752 | 0.261 | - | - | | 0.4743 | 1753 | 0.3827 | - | - | | 0.4746 | 1754 | 0.3426 | - | - | | 0.4748 | 1755 | 0.276 | - | - | | 0.4751 | 1756 | 0.314 | - | - | | 0.4754 | 1757 | 0.356 | - | - | | 0.4756 | 1758 | 0.3502 | - | - | | 0.4759 | 1759 | 0.2854 | - | - | | 0.4762 | 1760 | 0.2515 | - | - | | 0.4765 | 1761 | 0.2616 | - | - | | 0.4767 | 1762 | 0.299 | - | - | | 0.4770 | 1763 | 0.4031 | - | - | | 0.4773 | 1764 | 0.3912 | - | - | | 0.4775 | 1765 | 0.2894 | - | - | | 0.4778 | 1766 | 0.2781 | - | - | | 0.4781 | 1767 | 0.352 | - | - | | 0.4784 | 1768 | 0.4137 | - | - | | 0.4786 | 1769 | 0.3046 | - | - | | 0.4789 | 1770 | 0.2729 | - | - | | 0.4792 | 1771 | 0.2839 | - | - | | 0.4794 | 1772 | 0.2969 | - | - | | 0.4797 | 1773 | 0.4103 | - | - | | 0.4800 | 1774 | 0.2713 | - | - | | 0.4802 | 1775 | 0.2631 | - | - | | 0.4805 | 1776 | 0.3458 | - | - | | 0.4808 | 1777 | 0.1919 | - | - | | 0.4811 | 1778 | 0.2705 | - | - | | 0.4813 | 1779 | 0.3064 | - | - | | 0.4816 | 1780 | 0.3586 | - | - | | 0.4819 | 1781 | 0.3002 | - | - | | 0.4821 | 1782 | 0.2437 | - | - | | 0.4824 | 1783 | 0.2324 | - | - | | 0.4827 | 1784 | 0.2651 | - | - | | 0.4830 | 1785 | 0.3127 | - | - | | 0.4832 | 1786 | 0.2684 | - | - | | 0.4835 | 1787 | 0.2201 | - | - | | 0.4838 | 1788 | 0.2304 | - | - | | 0.4840 | 1789 | 0.223 | - | - | | 0.4843 | 1790 | 0.5316 | - | - | | 0.4846 | 1791 | 0.2831 | - | - | | 0.4848 | 1792 | 0.4394 | - | - | | 0.4851 | 1793 | 0.2484 | - | - | | 0.4854 | 1794 | 0.3246 | - | - | | 0.4857 | 1795 | 0.2835 | - | - | | 0.4859 | 1796 | 0.348 | - | - | | 0.4862 | 1797 | 0.337 | - | - | | 0.4865 | 1798 | 0.2918 | - | - | | 0.4867 | 1799 | 0.3523 | - | - | | 0.4870 | 1800 | 0.3838 | - | - | | 0.4873 | 1801 | 0.3461 | - | - | | 0.4876 | 1802 | 0.2209 | - | - | | 0.4878 | 1803 | 0.2826 | - | 
- | | 0.4881 | 1804 | 0.2855 | - | - | | 0.4884 | 1805 | 0.2988 | - | - | | 0.4886 | 1806 | 0.3571 | - | - | | 0.4889 | 1807 | 0.3321 | - | - | | 0.4892 | 1808 | 0.288 | - | - | | 0.4894 | 1809 | 0.3517 | - | - | | 0.4897 | 1810 | 0.3954 | - | - | | 0.4900 | 1811 | 0.3406 | - | - | | 0.4903 | 1812 | 0.3441 | - | - | | 0.4905 | 1813 | 0.3425 | - | - | | 0.4908 | 1814 | 0.3594 | - | - | | 0.4911 | 1815 | 0.2996 | - | - | | 0.4913 | 1816 | 0.1974 | - | - | | 0.4916 | 1817 | 0.2889 | - | - | | 0.4919 | 1818 | 0.3362 | - | - | | 0.4922 | 1819 | 0.3254 | - | - | | 0.4924 | 1820 | 0.2844 | - | - | | 0.4927 | 1821 | 0.328 | - | - | | 0.4930 | 1822 | 0.2904 | - | - | | 0.4932 | 1823 | 0.2588 | - | - | | 0.4935 | 1824 | 0.2622 | - | - | | 0.4938 | 1825 | 0.4415 | - | - | | 0.4940 | 1826 | 0.2619 | - | - | | 0.4943 | 1827 | 0.3035 | - | - | | 0.4946 | 1828 | 0.2876 | - | - | | 0.4949 | 1829 | 0.2342 | - | - | | 0.4951 | 1830 | 0.2439 | - | - | | 0.4954 | 1831 | 0.2569 | - | - | | 0.4957 | 1832 | 0.2483 | - | - | | 0.4959 | 1833 | 0.1941 | - | - | | 0.4962 | 1834 | 0.2254 | - | - | | 0.4965 | 1835 | 0.2969 | - | - | | 0.4968 | 1836 | 0.2489 | - | - | | 0.4970 | 1837 | 0.3358 | - | - | | 0.4973 | 1838 | 0.2673 | - | - | | 0.4976 | 1839 | 0.4219 | - | - | | 0.4978 | 1840 | 0.3112 | - | - | | 0.4981 | 1841 | 0.3524 | - | - | | 0.4984 | 1842 | 0.2772 | - | - | | 0.4986 | 1843 | 0.2896 | - | - | | 0.4989 | 1844 | 0.2695 | - | - | | 0.4992 | 1845 | 0.1904 | - | - | | 0.4995 | 1846 | 0.2621 | - | - | | 0.4997 | 1847 | 0.2439 | - | - | | 0.5 | 1848 | 0.2534 | - | - | | 0.5003 | 1849 | 0.2894 | - | - | | 0.5005 | 1850 | 0.3911 | - | - | | 0.5008 | 1851 | 0.2434 | - | - | | 0.5011 | 1852 | 0.3025 | - | - | | 0.5014 | 1853 | 0.3478 | - | - | | 0.5016 | 1854 | 0.424 | - | - | | 0.5019 | 1855 | 0.2836 | - | - | | 0.5022 | 1856 | 0.315 | - | - | | 0.5024 | 1857 | 0.3085 | - | - | | 0.5027 | 1858 | 0.3196 | - | - | | 0.5030 | 1859 | 0.3474 | - | - | | 0.5032 | 1860 | 0.2869 | - | - | | 
0.5035 | 1861 | 0.382 | - | - | | 0.5038 | 1862 | 0.2733 | - | - | | 0.5041 | 1863 | 0.2454 | - | - | | 0.5043 | 1864 | 0.2677 | - | - | | 0.5046 | 1865 | 0.282 | - | - | | 0.5049 | 1866 | 0.2499 | - | - | | 0.5051 | 1867 | 0.1954 | - | - | | 0.5054 | 1868 | 0.2632 | - | - | | 0.5057 | 1869 | 0.3081 | - | - | | 0.5060 | 1870 | 0.4498 | - | - | | 0.5062 | 1871 | 0.3749 | - | - | | 0.5065 | 1872 | 0.2123 | - | - | | 0.5068 | 1873 | 0.2102 | - | - | | 0.5070 | 1874 | 0.3575 | - | - | | 0.5073 | 1875 | 0.4086 | - | - | | 0.5076 | 1876 | 0.3715 | - | - | | 0.5078 | 1877 | 0.2916 | - | - | | 0.5081 | 1878 | 0.3878 | - | - | | 0.5084 | 1879 | 0.2256 | - | - | | 0.5087 | 1880 | 0.3621 | - | - | | 0.5089 | 1881 | 0.3058 | - | - | | 0.5092 | 1882 | 0.2529 | - | - | | 0.5095 | 1883 | 0.3109 | - | - | | 0.5097 | 1884 | 0.2243 | - | - | | 0.5100 | 1885 | 0.3431 | - | - | | 0.5103 | 1886 | 0.2336 | - | - | | 0.5106 | 1887 | 0.27 | - | - | | 0.5108 | 1888 | 0.3208 | - | - | | 0.5111 | 1889 | 0.3423 | - | - | | 0.5114 | 1890 | 0.2694 | - | - | | 0.5116 | 1891 | 0.2481 | - | - | | 0.5119 | 1892 | 0.2123 | - | - | | 0.5122 | 1893 | 0.2194 | - | - | | 0.5124 | 1894 | 0.2689 | - | - | | 0.5127 | 1895 | 0.2497 | - | - | | 0.5130 | 1896 | 0.4563 | - | - | | 0.5133 | 1897 | 0.3217 | - | - | | 0.5135 | 1898 | 0.2701 | - | - | | 0.5138 | 1899 | 0.3277 | - | - | | 0.5141 | 1900 | 0.2497 | - | - | | 0.5143 | 1901 | 0.2675 | - | - | | 0.5146 | 1902 | 0.3395 | - | - | | 0.5149 | 1903 | 0.2584 | - | - | | 0.5152 | 1904 | 0.2613 | - | - | | 0.5154 | 1905 | 0.3257 | - | - | | 0.5157 | 1906 | 0.3223 | - | - | | 0.5160 | 1907 | 0.2112 | - | - | | 0.5162 | 1908 | 0.3107 | - | - | | 0.5165 | 1909 | 0.3503 | - | - | | 0.5168 | 1910 | 0.3177 | - | - | | 0.5170 | 1911 | 0.3069 | - | - | | 0.5173 | 1912 | 0.3046 | - | - | | 0.5176 | 1913 | 0.2277 | - | - | | 0.5179 | 1914 | 0.3281 | - | - | | 0.5181 | 1915 | 0.3666 | - | - | | 0.5184 | 1916 | 0.2777 | - | - | | 0.5187 | 1917 | 0.2379 | - | - | | 0.5189 | 
1918 | 0.2897 | - | - | | 0.5192 | 1919 | 0.3631 | - | - | | 0.5195 | 1920 | 0.3179 | - | - | | 0.5198 | 1921 | 0.3676 | - | - | | 0.5200 | 1922 | 0.2914 | - | - | | 0.5203 | 1923 | 0.3635 | - | - | | 0.5206 | 1924 | 0.3318 | - | - | | 0.5208 | 1925 | 0.2351 | - | - | | 0.5211 | 1926 | 0.2477 | - | - | | 0.5214 | 1927 | 0.4694 | - | - | | 0.5216 | 1928 | 0.4056 | - | - | | 0.5219 | 1929 | 0.2271 | - | - | | 0.5222 | 1930 | 0.2666 | - | - | | 0.5225 | 1931 | 0.3668 | - | - | | 0.5227 | 1932 | 0.2946 | - | - | | 0.5230 | 1933 | 0.42 | - | - | | 0.5233 | 1934 | 0.2849 | - | - | | 0.5235 | 1935 | 0.3238 | - | - | | 0.5238 | 1936 | 0.2245 | - | - | | 0.5241 | 1937 | 0.2493 | - | - | | 0.5244 | 1938 | 0.2863 | - | - | | 0.5246 | 1939 | 0.338 | - | - | | 0.5249 | 1940 | 0.2275 | - | - | | 0.5252 | 1941 | 0.2411 | - | - | | 0.5254 | 1942 | 0.2467 | - | - | | 0.5257 | 1943 | 0.23 | - | - | | 0.5260 | 1944 | 0.2498 | - | - | | 0.5262 | 1945 | 0.3139 | - | - | | 0.5265 | 1946 | 0.342 | - | - | | 0.5268 | 1947 | 0.3005 | - | - | | 0.5271 | 1948 | 0.2178 | - | - | | 0.5273 | 1949 | 0.3728 | - | - | | 0.5276 | 1950 | 0.2949 | - | - | | 0.5279 | 1951 | 0.316 | - | - | | 0.5281 | 1952 | 0.3004 | - | - | | 0.5284 | 1953 | 0.3251 | - | - | | 0.5287 | 1954 | 0.2766 | - | - | | 0.5290 | 1955 | 0.3627 | - | - | | 0.5292 | 1956 | 0.343 | - | - | | 0.5295 | 1957 | 0.237 | - | - | | 0.5298 | 1958 | 0.3486 | - | - | | 0.5300 | 1959 | 0.2624 | - | - | | 0.5303 | 1960 | 0.2155 | - | - | | 0.5306 | 1961 | 0.3794 | - | - | | 0.5308 | 1962 | 0.3156 | - | - | | 0.5311 | 1963 | 0.2169 | - | - | | 0.5314 | 1964 | 0.3322 | - | - | | 0.5317 | 1965 | 0.2329 | - | - | | 0.5319 | 1966 | 0.2293 | - | - | | 0.5322 | 1967 | 0.2906 | - | - | | 0.5325 | 1968 | 0.2861 | - | - | | 0.5327 | 1969 | 0.2874 | - | - | | 0.5330 | 1970 | 0.2998 | - | - | | 0.5333 | 1971 | 0.2696 | - | - | | 0.5335 | 1972 | 0.2532 | - | - | | 0.5338 | 1973 | 0.3712 | - | - | | 0.5341 | 1974 | 0.2441 | - | - | | 0.5344 | 1975 | 0.24 | 
- | - | | 0.5346 | 1976 | 0.1971 | - | - | | 0.5349 | 1977 | 0.3948 | - | - | | 0.5352 | 1978 | 0.239 | - | - | | 0.5354 | 1979 | 0.2925 | - | - | | 0.5357 | 1980 | 0.245 | - | - | | 0.5360 | 1981 | 0.3199 | - | - | | 0.5363 | 1982 | 0.2454 | - | - | | 0.5365 | 1983 | 0.2698 | - | - | | 0.5368 | 1984 | 0.2832 | - | - | | 0.5371 | 1985 | 0.2837 | - | - | | 0.5373 | 1986 | 0.2472 | - | - | | 0.5376 | 1987 | 0.246 | - | - | | 0.5379 | 1988 | 0.3966 | - | - | | 0.5381 | 1989 | 0.2866 | - | - | | 0.5384 | 1990 | 0.2489 | - | - | | 0.5387 | 1991 | 0.3617 | - | - | | 0.5390 | 1992 | 0.2477 | - | - | | 0.5392 | 1993 | 0.3498 | - | - | | 0.5395 | 1994 | 0.3244 | - | - | | 0.5398 | 1995 | 0.2445 | - | - | | 0.5400 | 1996 | 0.2113 | - | - | | 0.5403 | 1997 | 0.2809 | - | - | | 0.5406 | 1998 | 0.3882 | - | - | | 0.5409 | 1999 | 0.2979 | - | - | | 0.5411 | 2000 | 0.399 | 0.2678 | 0.9314 | | 0.5414 | 2001 | 0.2064 | - | - | | 0.5417 | 2002 | 0.3161 | - | - | | 0.5419 | 2003 | 0.2666 | - | - | | 0.5422 | 2004 | 0.2437 | - | - | | 0.5425 | 2005 | 0.2439 | - | - | | 0.5427 | 2006 | 0.3509 | - | - | | 0.5430 | 2007 | 0.2798 | - | - | | 0.5433 | 2008 | 0.3807 | - | - | | 0.5436 | 2009 | 0.269 | - | - | | 0.5438 | 2010 | 0.2997 | - | - | | 0.5441 | 2011 | 0.2002 | - | - | | 0.5444 | 2012 | 0.2117 | - | - | | 0.5446 | 2013 | 0.2889 | - | - | | 0.5449 | 2014 | 0.28 | - | - | | 0.5452 | 2015 | 0.2477 | - | - | | 0.5455 | 2016 | 0.2559 | - | - | | 0.5457 | 2017 | 0.306 | - | - | | 0.5460 | 2018 | 0.3516 | - | - | | 0.5463 | 2019 | 0.2488 | - | - | | 0.5465 | 2020 | 0.2363 | - | - | | 0.5468 | 2021 | 0.2869 | - | - | | 0.5471 | 2022 | 0.2523 | - | - | | 0.5473 | 2023 | 0.2398 | - | - | | 0.5476 | 2024 | 0.2757 | - | - | | 0.5479 | 2025 | 0.3994 | - | - | | 0.5482 | 2026 | 0.1951 | - | - | | 0.5484 | 2027 | 0.3219 | - | - | | 0.5487 | 2028 | 0.2246 | - | - | | 0.5490 | 2029 | 0.2777 | - | - | | 0.5492 | 2030 | 0.2702 | - | - | | 0.5495 | 2031 | 0.2086 | - | - | | 0.5498 | 2032 | 0.2793 | - 
| - | | 0.5501 | 2033 | 0.291 | - | - | | 0.5503 | 2034 | 0.37 | - | - | | 0.5506 | 2035 | 0.3038 | - | - | | 0.5509 | 2036 | 0.3384 | - | - | | 0.5511 | 2037 | 0.4532 | - | - | | 0.5514 | 2038 | 0.316 | - | - | | 0.5517 | 2039 | 0.2454 | - | - | | 0.5519 | 2040 | 0.3251 | - | - | | 0.5522 | 2041 | 0.3017 | - | - | | 0.5525 | 2042 | 0.2204 | - | - | | 0.5528 | 2043 | 0.3318 | - | - | | 0.5530 | 2044 | 0.3603 | - | - | | 0.5533 | 2045 | 0.2446 | - | - | | 0.5536 | 2046 | 0.2995 | - | - | | 0.5538 | 2047 | 0.3583 | - | - | | 0.5541 | 2048 | 0.246 | - | - | | 0.5544 | 2049 | 0.2273 | - | - | | 0.5547 | 2050 | 0.2741 | - | - | | 0.5549 | 2051 | 0.3038 | - | - | | 0.5552 | 2052 | 0.3163 | - | - | | 0.5555 | 2053 | 0.2569 | - | - | | 0.5557 | 2054 | 0.2942 | - | - | | 0.5560 | 2055 | 0.308 | - | - | | 0.5563 | 2056 | 0.2759 | - | - | | 0.5565 | 2057 | 0.2483 | - | - | | 0.5568 | 2058 | 0.3376 | - | - | | 0.5571 | 2059 | 0.3598 | - | - | | 0.5574 | 2060 | 0.3304 | - | - | | 0.5576 | 2061 | 0.2743 | - | - | | 0.5579 | 2062 | 0.296 | - | - | | 0.5582 | 2063 | 0.2501 | - | - | | 0.5584 | 2064 | 0.2168 | - | - | | 0.5587 | 2065 | 0.4365 | - | - | | 0.5590 | 2066 | 0.3181 | - | - | | 0.5593 | 2067 | 0.2537 | - | - | | 0.5595 | 2068 | 0.377 | - | - | | 0.5598 | 2069 | 0.2038 | - | - | | 0.5601 | 2070 | 0.2498 | - | - | | 0.5603 | 2071 | 0.3063 | - | - | | 0.5606 | 2072 | 0.2288 | - | - | | 0.5609 | 2073 | 0.2999 | - | - | | 0.5611 | 2074 | 0.3542 | - | - | | 0.5614 | 2075 | 0.3596 | - | - | | 0.5617 | 2076 | 0.2293 | - | - | | 0.5620 | 2077 | 0.2885 | - | - | | 0.5622 | 2078 | 0.2734 | - | - | | 0.5625 | 2079 | 0.2597 | - | - | | 0.5628 | 2080 | 0.3531 | - | - | | 0.5630 | 2081 | 0.3777 | - | - | | 0.5633 | 2082 | 0.249 | - | - | | 0.5636 | 2083 | 0.2936 | - | - | | 0.5639 | 2084 | 0.2867 | - | - | | 0.5641 | 2085 | 0.4155 | - | - | | 0.5644 | 2086 | 0.3695 | - | - | | 0.5647 | 2087 | 0.2154 | - | - | | 0.5649 | 2088 | 0.2208 | - | - | | 0.5652 | 2089 | 0.3174 | - | - | | 
0.5655 | 2090 | 0.294 | - | - | | 0.5657 | 2091 | 0.2839 | - | - | | 0.5660 | 2092 | 0.3503 | - | - | | 0.5663 | 2093 | 0.2936 | - | - | | 0.5666 | 2094 | 0.3694 | - | - | | 0.5668 | 2095 | 0.3173 | - | - | | 0.5671 | 2096 | 0.3551 | - | - | | 0.5674 | 2097 | 0.3028 | - | - | | 0.5676 | 2098 | 0.2202 | - | - | | 0.5679 | 2099 | 0.2847 | - | - | | 0.5682 | 2100 | 0.2535 | - | - | | 0.5685 | 2101 | 0.2532 | - | - | | 0.5687 | 2102 | 0.3547 | - | - | | 0.5690 | 2103 | 0.3576 | - | - | | 0.5693 | 2104 | 0.2252 | - | - | | 0.5695 | 2105 | 0.2664 | - | - | | 0.5698 | 2106 | 0.3307 | - | - | | 0.5701 | 2107 | 0.42 | - | - | | 0.5703 | 2108 | 0.2321 | - | - | | 0.5706 | 2109 | 0.4118 | - | - | | 0.5709 | 2110 | 0.3261 | - | - | | 0.5712 | 2111 | 0.3959 | - | - | | 0.5714 | 2112 | 0.253 | - | - | | 0.5717 | 2113 | 0.3074 | - | - | | 0.5720 | 2114 | 0.3498 | - | - | | 0.5722 | 2115 | 0.2863 | - | - | | 0.5725 | 2116 | 0.3714 | - | - | | 0.5728 | 2117 | 0.3077 | - | - | | 0.5731 | 2118 | 0.3554 | - | - | | 0.5733 | 2119 | 0.2585 | - | - | | 0.5736 | 2120 | 0.2943 | - | - | | 0.5739 | 2121 | 0.2876 | - | - | | 0.5741 | 2122 | 0.2613 | - | - | | 0.5744 | 2123 | 0.2841 | - | - | | 0.5747 | 2124 | 0.2297 | - | - | | 0.5749 | 2125 | 0.3207 | - | - | | 0.5752 | 2126 | 0.3327 | - | - | | 0.5755 | 2127 | 0.3357 | - | - | | 0.5758 | 2128 | 0.3354 | - | - | | 0.5760 | 2129 | 0.3158 | - | - | | 0.5763 | 2130 | 0.2815 | - | - | | 0.5766 | 2131 | 0.3044 | - | - | | 0.5768 | 2132 | 0.2506 | - | - | | 0.5771 | 2133 | 0.3979 | - | - | | 0.5774 | 2134 | 0.3119 | - | - | | 0.5777 | 2135 | 0.3 | - | - | | 0.5779 | 2136 | 0.3073 | - | - | | 0.5782 | 2137 | 0.4089 | - | - | | 0.5785 | 2138 | 0.3184 | - | - | | 0.5787 | 2139 | 0.2438 | - | - | | 0.5790 | 2140 | 0.3226 | - | - | | 0.5793 | 2141 | 0.1883 | - | - | | 0.5795 | 2142 | 0.4197 | - | - | | 0.5798 | 2143 | 0.3029 | - | - | | 0.5801 | 2144 | 0.2579 | - | - | | 0.5804 | 2145 | 0.2339 | - | - | | 0.5806 | 2146 | 0.2871 | - | - | | 0.5809 | 
2147 | 0.2637 | - | - | | 0.5812 | 2148 | 0.3334 | - | - | | 0.5814 | 2149 | 0.2687 | - | - | | 0.5817 | 2150 | 0.2881 | - | - | | 0.5820 | 2151 | 0.3424 | - | - | | 0.5823 | 2152 | 0.2728 | - | - | | 0.5825 | 2153 | 0.3442 | - | - | | 0.5828 | 2154 | 0.3509 | - | - | | 0.5831 | 2155 | 0.2791 | - | - | | 0.5833 | 2156 | 0.3674 | - | - | | 0.5836 | 2157 | 0.2768 | - | - | | 0.5839 | 2158 | 0.2527 | - | - | | 0.5841 | 2159 | 0.2698 | - | - | | 0.5844 | 2160 | 0.3248 | - | - | | 0.5847 | 2161 | 0.2899 | - | - | | 0.5850 | 2162 | 0.3093 | - | - | | 0.5852 | 2163 | 0.2712 | - | - | | 0.5855 | 2164 | 0.339 | - | - | | 0.5858 | 2165 | 0.3468 | - | - | | 0.5860 | 2166 | 0.3092 | - | - | | 0.5863 | 2167 | 0.2859 | - | - | | 0.5866 | 2168 | 0.3792 | - | - | | 0.5869 | 2169 | 0.2406 | - | - | | 0.5871 | 2170 | 0.2161 | - | - | | 0.5874 | 2171 | 0.3067 | - | - | | 0.5877 | 2172 | 0.2394 | - | - | | 0.5879 | 2173 | 0.2597 | - | - | | 0.5882 | 2174 | 0.2874 | - | - | | 0.5885 | 2175 | 0.3324 | - | - | | 0.5887 | 2176 | 0.3601 | - | - | | 0.5890 | 2177 | 0.3179 | - | - | | 0.5893 | 2178 | 0.3032 | - | - | | 0.5896 | 2179 | 0.2574 | - | - | | 0.5898 | 2180 | 0.2453 | - | - | | 0.5901 | 2181 | 0.3094 | - | - | | 0.5904 | 2182 | 0.3135 | - | - | | 0.5906 | 2183 | 0.2546 | - | - | | 0.5909 | 2184 | 0.4111 | - | - | | 0.5912 | 2185 | 0.2898 | - | - | | 0.5915 | 2186 | 0.3083 | - | - | | 0.5917 | 2187 | 0.2818 | - | - | | 0.5920 | 2188 | 0.2782 | - | - | | 0.5923 | 2189 | 0.2909 | - | - | | 0.5925 | 2190 | 0.276 | - | - | | 0.5928 | 2191 | 0.2479 | - | - | | 0.5931 | 2192 | 0.2487 | - | - | | 0.5933 | 2193 | 0.2691 | - | - | | 0.5936 | 2194 | 0.3399 | - | - | | 0.5939 | 2195 | 0.3491 | - | - | | 0.5942 | 2196 | 0.2898 | - | - | | 0.5944 | 2197 | 0.3755 | - | - | | 0.5947 | 2198 | 0.3055 | - | - | | 0.5950 | 2199 | 0.3656 | - | - | | 0.5952 | 2200 | 0.2695 | - | - | | 0.5955 | 2201 | 0.2354 | - | - | | 0.5958 | 2202 | 0.3539 | - | - | | 0.5960 | 2203 | 0.2864 | - | - | | 0.5963 | 2204 | 
0.2922 | - | - | | 0.5966 | 2205 | 0.3674 | - | - | | 0.5969 | 2206 | 0.287 | - | - | | 0.5971 | 2207 | 0.2651 | - | - | | 0.5974 | 2208 | 0.249 | - | - | | 0.5977 | 2209 | 0.2539 | - | - | | 0.5979 | 2210 | 0.1918 | - | - | | 0.5982 | 2211 | 0.3314 | - | - | | 0.5985 | 2212 | 0.2279 | - | - | | 0.5988 | 2213 | 0.1887 | - | - | | 0.5990 | 2214 | 0.3379 | - | - | | 0.5993 | 2215 | 0.2797 | - | - | | 0.5996 | 2216 | 0.3552 | - | - | | 0.5998 | 2217 | 0.3429 | - | - | | 0.6001 | 2218 | 0.2063 | - | - | | 0.6004 | 2219 | 0.2548 | - | - | | 0.6006 | 2220 | 0.2537 | - | - | | 0.6009 | 2221 | 0.1857 | - | - | | 0.6012 | 2222 | 0.3095 | - | - | | 0.6015 | 2223 | 0.3029 | - | - | | 0.6017 | 2224 | 0.3682 | - | - | | 0.6020 | 2225 | 0.3338 | - | - | | 0.6023 | 2226 | 0.2174 | - | - | | 0.6025 | 2227 | 0.335 | - | - | | 0.6028 | 2228 | 0.2682 | - | - | | 0.6031 | 2229 | 0.3726 | - | - | | 0.6034 | 2230 | 0.2252 | - | - | | 0.6036 | 2231 | 0.2663 | - | - | | 0.6039 | 2232 | 0.2949 | - | - | | 0.6042 | 2233 | 0.2843 | - | - | | 0.6044 | 2234 | 0.3394 | - | - | | 0.6047 | 2235 | 0.2517 | - | - | | 0.6050 | 2236 | 0.2061 | - | - | | 0.6052 | 2237 | 0.2414 | - | - | | 0.6055 | 2238 | 0.3274 | - | - | | 0.6058 | 2239 | 0.216 | - | - | | 0.6061 | 2240 | 0.1866 | - | - | | 0.6063 | 2241 | 0.4304 | - | - | | 0.6066 | 2242 | 0.2431 | - | - | | 0.6069 | 2243 | 0.2326 | - | - | | 0.6071 | 2244 | 0.247 | - | - | | 0.6074 | 2245 | 0.2964 | - | - | | 0.6077 | 2246 | 0.2624 | - | - | | 0.6080 | 2247 | 0.3184 | - | - | | 0.6082 | 2248 | 0.226 | - | - | | 0.6085 | 2249 | 0.3127 | - | - | | 0.6088 | 2250 | 0.2279 | - | - | | 0.6090 | 2251 | 0.2563 | - | - | | 0.6093 | 2252 | 0.2418 | - | - | | 0.6096 | 2253 | 0.3044 | - | - | | 0.6098 | 2254 | 0.258 | - | - | | 0.6101 | 2255 | 0.2761 | - | - | | 0.6104 | 2256 | 0.3092 | - | - | | 0.6107 | 2257 | 0.3105 | - | - | | 0.6109 | 2258 | 0.2856 | - | - | | 0.6112 | 2259 | 0.3125 | - | - | | 0.6115 | 2260 | 0.3687 | - | - | | 0.6117 | 2261 | 0.3406 | - 
| - | | 0.6120 | 2262 | 0.1985 | - | - | | 0.6123 | 2263 | 0.3442 | - | - | | 0.6126 | 2264 | 0.3027 | - | - | | 0.6128 | 2265 | 0.3087 | - | - | | 0.6131 | 2266 | 0.3757 | - | - | | 0.6134 | 2267 | 0.2585 | - | - | | 0.6136 | 2268 | 0.2712 | - | - | | 0.6139 | 2269 | 0.2363 | - | - | | 0.6142 | 2270 | 0.2473 | - | - | | 0.6144 | 2271 | 0.2944 | - | - | | 0.6147 | 2272 | 0.2439 | - | - | | 0.6150 | 2273 | 0.3544 | - | - | | 0.6153 | 2274 | 0.2928 | - | - | | 0.6155 | 2275 | 0.3404 | - | - | | 0.6158 | 2276 | 0.2161 | - | - | | 0.6161 | 2277 | 0.2196 | - | - | | 0.6163 | 2278 | 0.3405 | - | - | | 0.6166 | 2279 | 0.3401 | - | - | | 0.6169 | 2280 | 0.338 | - | - | | 0.6172 | 2281 | 0.2941 | - | - | | 0.6174 | 2282 | 0.2742 | - | - | | 0.6177 | 2283 | 0.3155 | - | - | | 0.6180 | 2284 | 0.4023 | - | - | | 0.6182 | 2285 | 0.409 | - | - | | 0.6185 | 2286 | 0.2207 | - | - | | 0.6188 | 2287 | 0.2972 | - | - | | 0.6190 | 2288 | 0.2947 | - | - | | 0.6193 | 2289 | 0.2996 | - | - | | 0.6196 | 2290 | 0.3907 | - | - | | 0.6199 | 2291 | 0.3064 | - | - | | 0.6201 | 2292 | 0.3847 | - | - | | 0.6204 | 2293 | 0.2248 | - | - | | 0.6207 | 2294 | 0.2749 | - | - | | 0.6209 | 2295 | 0.2702 | - | - | | 0.6212 | 2296 | 0.3082 | - | - | | 0.6215 | 2297 | 0.2209 | - | - | | 0.6218 | 2298 | 0.238 | - | - | | 0.6220 | 2299 | 0.251 | - | - | | 0.6223 | 2300 | 0.3533 | - | - | | 0.6226 | 2301 | 0.2615 | - | - | | 0.6228 | 2302 | 0.381 | - | - | | 0.6231 | 2303 | 0.2406 | - | - | | 0.6234 | 2304 | 0.2205 | - | - | | 0.6236 | 2305 | 0.2698 | - | - | | 0.6239 | 2306 | 0.2858 | - | - | | 0.6242 | 2307 | 0.262 | - | - | | 0.6245 | 2308 | 0.3542 | - | - | | 0.6247 | 2309 | 0.2825 | - | - | | 0.625 | 2310 | 0.3249 | - | - | | 0.6253 | 2311 | 0.2983 | - | - | | 0.6255 | 2312 | 0.3013 | - | - | | 0.6258 | 2313 | 0.3104 | - | - | | 0.6261 | 2314 | 0.2585 | - | - | | 0.6264 | 2315 | 0.2017 | - | - | | 0.6266 | 2316 | 0.4107 | - | - | | 0.6269 | 2317 | 0.2962 | - | - | | 0.6272 | 2318 | 0.1942 | - | - | | 
| 0.6342 | 2344 | 0.2213 | - | - |
| 0.6494 | 2400 | 0.3778 | - | - |
| 0.6629 | 2450 | 0.2652 | - | - |
| 0.6764 | 2500 | 0.21 | - | - |
| 0.6899 | 2550 | 0.3225 | - | - |
| 0.7035 | 2600 | 0.184 | - | - |
| 0.7170 | 2650 | 0.221 | - | - |
| 0.7305 | 2700 | 0.2903 | - | - |
| 0.7440 | 2750 | 0.3205 | - | - |
| 0.7576 | 2800 | 0.3134 | - | - |
| 0.7711 | 2850 | 0.335 | - | - |
| 0.7846 | 2900 | 0.2976 | - | - |
| 0.7982 | 2950 | 0.2705 | - | - |
| 0.8117 | 3000 | 0.2842 | 0.2463 | 0.939 |
| 0.8252 | 3050 | 0.3301 | - | - |
| 0.8387 | 3100 | 0.3028 | - | - |
| 0.8523 | 3150 | 0.2497 | - | - |
| 0.8658 | 3200 | 0.3582 | - | - |
| 0.8793 | 3250 | 0.2392 | - | - |
| 0.8929 | 3300 | 0.333 | - | - |
| 0.9064 | 3350 | 0.3383 | - | - |
| 0.9199 | 3400 | 0.1862 | - | - |
| - | - | | 0.9375 | 3465 | 0.2894 | - | - | | 0.9378 | 3466 | 0.2897 | - | - | | 0.9380 | 3467 | 0.2113 | - | - | | 0.9383 | 3468 | 0.172 | - | - | | 0.9386 | 3469 | 0.2917 | - | - | | 0.9389 | 3470 | 0.3655 | - | - | | 0.9391 | 3471 | 0.2524 | - | - | | 0.9394 | 3472 | 0.2206 | - | - | | 0.9397 | 3473 | 0.2244 | - | - | | 0.9399 | 3474 | 0.242 | - | - | | 0.9402 | 3475 | 0.2941 | - | - | | 0.9405 | 3476 | 0.3303 | - | - | | 0.9407 | 3477 | 0.359 | - | - | | 0.9410 | 3478 | 0.3089 | - | - | | 0.9413 | 3479 | 0.2352 | - | - | | 0.9416 | 3480 | 0.2596 | - | - | | 0.9418 | 3481 | 0.254 | - | - | | 0.9421 | 3482 | 0.2429 | - | - | | 0.9424 | 3483 | 0.2281 | - | - | | 0.9426 | 3484 | 0.206 | - | - | | 0.9429 | 3485 | 0.2318 | - | - | | 0.9432 | 3486 | 0.2518 | - | - | | 0.9435 | 3487 | 0.3006 | - | - | | 0.9437 | 3488 | 0.2907 | - | - | | 0.9440 | 3489 | 0.201 | - | - | | 0.9443 | 3490 | 0.297 | - | - | | 0.9445 | 3491 | 0.3247 | - | - | | 0.9448 | 3492 | 0.2398 | - | - | | 0.9451 | 3493 | 0.2724 | - | - | | 0.9453 | 3494 | 0.2922 | - | - | | 0.9456 | 3495 | 0.2076 | - | - | | 0.9459 | 3496 | 0.2165 | - | - | | 0.9462 | 3497 | 0.2547 | - | - | | 0.9464 | 3498 | 0.2881 | - | - | | 0.9467 | 3499 | 0.2134 | - | - | | 0.9470 | 3500 | 0.1982 | - | - | | 0.9472 | 3501 | 0.1947 | - | - | | 0.9475 | 3502 | 0.2315 | - | - | | 0.9478 | 3503 | 0.304 | - | - | | 0.9481 | 3504 | 0.2247 | - | - | | 0.9483 | 3505 | 0.216 | - | - | | 0.9486 | 3506 | 0.3087 | - | - | | 0.9489 | 3507 | 0.3074 | - | - | | 0.9491 | 3508 | 0.2354 | - | - | | 0.9494 | 3509 | 0.2523 | - | - | | 0.9497 | 3510 | 0.2744 | - | - | | 0.9499 | 3511 | 0.2951 | - | - | | 0.9502 | 3512 | 0.2681 | - | - | | 0.9505 | 3513 | 0.2896 | - | - | | 0.9508 | 3514 | 0.2715 | - | - | | 0.9510 | 3515 | 0.2766 | - | - | | 0.9513 | 3516 | 0.2448 | - | - | | 0.9516 | 3517 | 0.3858 | - | - | | 0.9518 | 3518 | 0.1812 | - | - | | 0.9521 | 3519 | 0.3185 | - | - | | 0.9524 | 3520 | 0.3073 | - | - | | 0.9527 | 3521 | 0.2852 | - | - | | 
0.9529 | 3522 | 0.2904 | - | - | | 0.9532 | 3523 | 0.2754 | - | - | | 0.9535 | 3524 | 0.2247 | - | - | | 0.9537 | 3525 | 0.323 | - | - | | 0.9540 | 3526 | 0.2556 | - | - | | 0.9543 | 3527 | 0.4511 | - | - | | 0.9545 | 3528 | 0.2269 | - | - | | 0.9548 | 3529 | 0.2866 | - | - | | 0.9551 | 3530 | 0.3142 | - | - | | 0.9554 | 3531 | 0.3198 | - | - | | 0.9556 | 3532 | 0.2521 | - | - | | 0.9559 | 3533 | 0.2344 | - | - | | 0.9562 | 3534 | 0.3074 | - | - | | 0.9564 | 3535 | 0.3205 | - | - | | 0.9567 | 3536 | 0.1791 | - | - | | 0.9570 | 3537 | 0.2436 | - | - | | 0.9573 | 3538 | 0.2882 | - | - | | 0.9575 | 3539 | 0.2849 | - | - | | 0.9578 | 3540 | 0.348 | - | - | | 0.9581 | 3541 | 0.3505 | - | - | | 0.9583 | 3542 | 0.2428 | - | - | | 0.9586 | 3543 | 0.2955 | - | - | | 0.9589 | 3544 | 0.1902 | - | - | | 0.9591 | 3545 | 0.2466 | - | - | | 0.9594 | 3546 | 0.2759 | - | - | | 0.9597 | 3547 | 0.2967 | - | - | | 0.9600 | 3548 | 0.2957 | - | - | | 0.9602 | 3549 | 0.2717 | - | - | | 0.9605 | 3550 | 0.2233 | - | - | | 0.9608 | 3551 | 0.2488 | - | - | | 0.9610 | 3552 | 0.2641 | - | - | | 0.9613 | 3553 | 0.2374 | - | - | | 0.9616 | 3554 | 0.1853 | - | - | | 0.9619 | 3555 | 0.206 | - | - | | 0.9621 | 3556 | 0.318 | - | - | | 0.9624 | 3557 | 0.2984 | - | - | | 0.9627 | 3558 | 0.2125 | - | - | | 0.9629 | 3559 | 0.3102 | - | - | | 0.9632 | 3560 | 0.2513 | - | - | | 0.9635 | 3561 | 0.271 | - | - | | 0.9637 | 3562 | 0.1955 | - | - | | 0.9640 | 3563 | 0.3262 | - | - | | 0.9643 | 3564 | 0.4096 | - | - | | 0.9646 | 3565 | 0.2214 | - | - | | 0.9648 | 3566 | 0.2923 | - | - | | 0.9651 | 3567 | 0.3203 | - | - | | 0.9654 | 3568 | 0.2062 | - | - | | 0.9656 | 3569 | 0.2599 | - | - | | 0.9659 | 3570 | 0.2077 | - | - | | 0.9662 | 3571 | 0.3213 | - | - | | 0.9665 | 3572 | 0.2585 | - | - | | 0.9667 | 3573 | 0.2312 | - | - | | 0.9670 | 3574 | 0.2769 | - | - | | 0.9673 | 3575 | 0.255 | - | - | | 0.9675 | 3576 | 0.3466 | - | - | | 0.9678 | 3577 | 0.2466 | - | - | | 0.9681 | 3578 | 0.2993 | - | - | | 0.9683 | 
3579 | 0.3463 | - | - | | 0.9686 | 3580 | 0.3174 | - | - | | 0.9689 | 3581 | 0.3519 | - | - | | 0.9692 | 3582 | 0.277 | - | - | | 0.9694 | 3583 | 0.3293 | - | - | | 0.9697 | 3584 | 0.317 | - | - | | 0.9700 | 3585 | 0.2706 | - | - | | 0.9702 | 3586 | 0.2761 | - | - | | 0.9705 | 3587 | 0.2198 | - | - | | 0.9708 | 3588 | 0.2495 | - | - | | 0.9710 | 3589 | 0.2662 | - | - | | 0.9713 | 3590 | 0.2835 | - | - | | 0.9716 | 3591 | 0.3061 | - | - | | 0.9719 | 3592 | 0.2378 | - | - | | 0.9721 | 3593 | 0.2067 | - | - | | 0.9724 | 3594 | 0.2723 | - | - | | 0.9727 | 3595 | 0.2371 | - | - | | 0.9729 | 3596 | 0.1973 | - | - | | 0.9732 | 3597 | 0.2585 | - | - | | 0.9735 | 3598 | 0.252 | - | - | | 0.9738 | 3599 | 0.2171 | - | - | | 0.9740 | 3600 | 0.2799 | - | - | | 0.9743 | 3601 | 0.1723 | - | - | | 0.9746 | 3602 | 0.2225 | - | - | | 0.9748 | 3603 | 0.2665 | - | - | | 0.9751 | 3604 | 0.2299 | - | - | | 0.9754 | 3605 | 0.3687 | - | - | | 0.9756 | 3606 | 0.2245 | - | - | | 0.9759 | 3607 | 0.2825 | - | - | | 0.9762 | 3608 | 0.214 | - | - | | 0.9765 | 3609 | 0.2908 | - | - | | 0.9767 | 3610 | 0.3048 | - | - | | 0.9770 | 3611 | 0.3556 | - | - | | 0.9773 | 3612 | 0.3434 | - | - | | 0.9775 | 3613 | 0.2642 | - | - | | 0.9778 | 3614 | 0.277 | - | - | | 0.9781 | 3615 | 0.2991 | - | - | | 0.9784 | 3616 | 0.2421 | - | - | | 0.9786 | 3617 | 0.2524 | - | - | | 0.9789 | 3618 | 0.2184 | - | - | | 0.9792 | 3619 | 0.1983 | - | - | | 0.9794 | 3620 | 0.2725 | - | - | | 0.9797 | 3621 | 0.2894 | - | - | | 0.9800 | 3622 | 0.218 | - | - | | 0.9802 | 3623 | 0.3603 | - | - | | 0.9805 | 3624 | 0.2498 | - | - | | 0.9808 | 3625 | 0.2013 | - | - | | 0.9811 | 3626 | 0.2922 | - | - | | 0.9813 | 3627 | 0.3398 | - | - | | 0.9816 | 3628 | 0.2485 | - | - | | 0.9819 | 3629 | 0.2846 | - | - | | 0.9821 | 3630 | 0.2775 | - | - | | 0.9824 | 3631 | 0.1814 | - | - | | 0.9827 | 3632 | 0.227 | - | - | | 0.9830 | 3633 | 0.3007 | - | - | | 0.9832 | 3634 | 0.2626 | - | - | | 0.9835 | 3635 | 0.308 | - | - | | 0.9838 | 3636 | 
0.3136 | - | - | | 0.9840 | 3637 | 0.3868 | - | - | | 0.9843 | 3638 | 0.2338 | - | - | | 0.9846 | 3639 | 0.2808 | - | - | | 0.9848 | 3640 | 0.2198 | - | - | | 0.9851 | 3641 | 0.3017 | - | - | | 0.9854 | 3642 | 0.2334 | - | - | | 0.9857 | 3643 | 0.2699 | - | - | | 0.9859 | 3644 | 0.2363 | - | - | | 0.9862 | 3645 | 0.2553 | - | - | | 0.9865 | 3646 | 0.2671 | - | - | | 0.9867 | 3647 | 0.3077 | - | - | | 0.9870 | 3648 | 0.2148 | - | - | | 0.9873 | 3649 | 0.2195 | - | - | | 0.9876 | 3650 | 0.2308 | - | - | | 0.9878 | 3651 | 0.2521 | - | - | | 0.9881 | 3652 | 0.2883 | - | - | | 0.9884 | 3653 | 0.2587 | - | - | | 0.9886 | 3654 | 0.2061 | - | - | | 0.9889 | 3655 | 0.2797 | - | - | | 0.9892 | 3656 | 0.2464 | - | - | | 0.9894 | 3657 | 0.268 | - | - | | 0.9897 | 3658 | 0.1835 | - | - | | 0.9900 | 3659 | 0.2828 | - | - | | 0.9903 | 3660 | 0.2759 | - | - | | 0.9905 | 3661 | 0.225 | - | - | | 0.9908 | 3662 | 0.3405 | - | - | | 0.9911 | 3663 | 0.266 | - | - | | 0.9913 | 3664 | 0.2706 | - | - | | 0.9916 | 3665 | 0.2991 | - | - | | 0.9919 | 3666 | 0.2825 | - | - | | 0.9922 | 3667 | 0.191 | - | - | | 0.9924 | 3668 | 0.198 | - | - | | 0.9927 | 3669 | 0.269 | - | - | | 0.9930 | 3670 | 0.2074 | - | - | | 0.9932 | 3671 | 0.1476 | - | - | | 0.9935 | 3672 | 0.2654 | - | - | | 0.9938 | 3673 | 0.2133 | - | - | | 0.9940 | 3674 | 0.2332 | - | - | | 0.9943 | 3675 | 0.2253 | - | - | | 0.9946 | 3676 | 0.3173 | - | - | | 0.9949 | 3677 | 0.3541 | - | - | | 0.9951 | 3678 | 0.2955 | - | - | | 0.9954 | 3679 | 0.1914 | - | - | | 0.9957 | 3680 | 0.2286 | - | - | | 0.9959 | 3681 | 0.2952 | - | - | | 0.9962 | 3682 | 0.3225 | - | - | | 0.9965 | 3683 | 0.2409 | - | - | | 0.9968 | 3684 | 0.2145 | - | - | | 0.9970 | 3685 | 0.22 | - | - | | 0.9973 | 3686 | 0.2246 | - | - | | 0.9976 | 3687 | 0.308 | - | - | | 0.9978 | 3688 | 0.2441 | - | - | | 0.9981 | 3689 | 0.2241 | - | - | | 0.9984 | 3690 | 0.3153 | - | - | | 0.9986 | 3691 | 0.2734 | - | - | | 0.9989 | 3692 | 0.3468 | - | - | | 0.9992 | 3693 | 0.328 | - | - 
| | 0.9995 | 3694 | 0.2309 | - | - | | 0.9997 | 3695 | 0.2473 | - | - | | 1.0 | 3696 | 0.279 | - | - | | 1.0003 | 3697 | 0.2398 | - | - | | 1.0005 | 3698 | 0.1914 | - | - | | 1.0008 | 3699 | 0.1938 | - | - | | 1.0011 | 3700 | 0.2064 | - | - | | 1.0014 | 3701 | 0.2342 | - | - | | 1.0016 | 3702 | 0.2643 | - | - | | 1.0019 | 3703 | 0.2346 | - | - | | 1.0022 | 3704 | 0.2927 | - | - | | 1.0024 | 3705 | 0.2381 | - | - | | 1.0027 | 3706 | 0.2501 | - | - | | 1.0030 | 3707 | 0.2378 | - | - | | 1.0032 | 3708 | 0.2102 | - | - | | 1.0035 | 3709 | 0.2375 | - | - | | 1.0038 | 3710 | 0.1532 | - | - | | 1.0041 | 3711 | 0.2109 | - | - | | 1.0043 | 3712 | 0.2538 | - | - | | 1.0046 | 3713 | 0.2709 | - | - | | 1.0049 | 3714 | 0.1577 | - | - | | 1.0051 | 3715 | 0.2742 | - | - | | 1.0054 | 3716 | 0.3701 | - | - | | 1.0057 | 3717 | 0.2647 | - | - | | 1.0060 | 3718 | 0.2945 | - | - | | 1.0062 | 3719 | 0.1908 | - | - | | 1.0065 | 3720 | 0.1587 | - | - | | 1.0068 | 3721 | 0.2551 | - | - | | 1.0070 | 3722 | 0.2761 | - | - | | 1.0073 | 3723 | 0.268 | - | - | | 1.0076 | 3724 | 0.1888 | - | - | | 1.0078 | 3725 | 0.2842 | - | - | | 1.0081 | 3726 | 0.1982 | - | - | | 1.0084 | 3727 | 0.292 | - | - | | 1.0087 | 3728 | 0.2613 | - | - | | 1.0089 | 3729 | 0.1289 | - | - | | 1.0092 | 3730 | 0.1937 | - | - | | 1.0095 | 3731 | 0.2538 | - | - | | 1.0097 | 3732 | 0.2896 | - | - | | 1.0100 | 3733 | 0.2118 | - | - | | 1.0103 | 3734 | 0.2983 | - | - | | 1.0106 | 3735 | 0.2561 | - | - | | 1.0108 | 3736 | 0.3354 | - | - | | 1.0111 | 3737 | 0.2378 | - | - | | 1.0114 | 3738 | 0.2024 | - | - | | 1.0116 | 3739 | 0.1931 | - | - | | 1.0119 | 3740 | 0.2435 | - | - | | 1.0122 | 3741 | 0.2435 | - | - | | 1.0124 | 3742 | 0.2429 | - | - | | 1.0127 | 3743 | 0.2692 | - | - | | 1.0130 | 3744 | 0.2873 | - | - | | 1.0133 | 3745 | 0.2091 | - | - | | 1.0135 | 3746 | 0.2196 | - | - | | 1.0138 | 3747 | 0.2101 | - | - | | 1.0141 | 3748 | 0.2053 | - | - | | 1.0143 | 3749 | 0.2442 | - | - | | 1.0146 | 3750 | 0.1795 | - | - | | 1.0149 
| 3751 | 0.2672 | - | - | | 1.0152 | 3752 | 0.2159 | - | - | | 1.0154 | 3753 | 0.2079 | - | - | | 1.0157 | 3754 | 0.2983 | - | - | | 1.0160 | 3755 | 0.1989 | - | - | | 1.0162 | 3756 | 0.188 | - | - | | 1.0165 | 3757 | 0.2601 | - | - | | 1.0168 | 3758 | 0.2418 | - | - | | 1.0170 | 3759 | 0.2816 | - | - | | 1.0173 | 3760 | 0.1746 | - | - | | 1.0176 | 3761 | 0.3017 | - | - | | 1.0179 | 3762 | 0.2026 | - | - | | 1.0181 | 3763 | 0.1731 | - | - | | 1.0184 | 3764 | 0.2889 | - | - | | 1.0187 | 3765 | 0.2071 | - | - | | 1.0189 | 3766 | 0.2798 | - | - | | 1.0192 | 3767 | 0.2503 | - | - | | 1.0195 | 3768 | 0.1815 | - | - | | 1.0198 | 3769 | 0.2028 | - | - | | 1.0200 | 3770 | 0.2813 | - | - | | 1.0203 | 3771 | 0.2601 | - | - | | 1.0206 | 3772 | 0.275 | - | - | | 1.0208 | 3773 | 0.2361 | - | - | | 1.0211 | 3774 | 0.1932 | - | - | | 1.0214 | 3775 | 0.2841 | - | - | | 1.0216 | 3776 | 0.2883 | - | - | | 1.0219 | 3777 | 0.258 | - | - | | 1.0222 | 3778 | 0.1991 | - | - | | 1.0225 | 3779 | 0.2631 | - | - | | 1.0227 | 3780 | 0.296 | - | - | | 1.0230 | 3781 | 0.2636 | - | - | | 1.0233 | 3782 | 0.2526 | - | - | | 1.0235 | 3783 | 0.1759 | - | - | | 1.0238 | 3784 | 0.2657 | - | - | | 1.0241 | 3785 | 0.2321 | - | - | | 1.0244 | 3786 | 0.1554 | - | - | | 1.0246 | 3787 | 0.3056 | - | - | | 1.0249 | 3788 | 0.304 | - | - | | 1.0252 | 3789 | 0.2153 | - | - | | 1.0254 | 3790 | 0.3182 | - | - | | 1.0257 | 3791 | 0.2505 | - | - | | 1.0260 | 3792 | 0.2094 | - | - | | 1.0262 | 3793 | 0.2409 | - | - | | 1.0265 | 3794 | 0.215 | - | - | | 1.0268 | 3795 | 0.2223 | - | - | | 1.0271 | 3796 | 0.2497 | - | - | | 1.0273 | 3797 | 0.2017 | - | - | | 1.0276 | 3798 | 0.2234 | - | - | | 1.0279 | 3799 | 0.2572 | - | - | | 1.0281 | 3800 | 0.2338 | - | - | | 1.0284 | 3801 | 0.1621 | - | - | | 1.0287 | 3802 | 0.3131 | - | - | | 1.0290 | 3803 | 0.2695 | - | - | | 1.0292 | 3804 | 0.1763 | - | - | | 1.0295 | 3805 | 0.2181 | - | - | | 1.0298 | 3806 | 0.266 | - | - | | 1.0300 | 3807 | 0.2282 | - | - | | 1.0303 | 3808 | 
0.3563 | - | - | | 1.0306 | 3809 | 0.2929 | - | - | | 1.0308 | 3810 | 0.2105 | - | - | | 1.0311 | 3811 | 0.1543 | - | - | | 1.0314 | 3812 | 0.2339 | - | - | | 1.0317 | 3813 | 0.2403 | - | - | | 1.0319 | 3814 | 0.2146 | - | - | | 1.0322 | 3815 | 0.2243 | - | - | | 1.0325 | 3816 | 0.2641 | - | - | | 1.0327 | 3817 | 0.2532 | - | - | | 1.0330 | 3818 | 0.1923 | - | - | | 1.0333 | 3819 | 0.3272 | - | - | | 1.0335 | 3820 | 0.2173 | - | - | | 1.0338 | 3821 | 0.231 | - | - | | 1.0341 | 3822 | 0.1673 | - | - | | 1.0344 | 3823 | 0.2918 | - | - | | 1.0346 | 3824 | 0.3647 | - | - | | 1.0349 | 3825 | 0.1507 | - | - | | 1.0352 | 3826 | 0.2733 | - | - | | 1.0354 | 3827 | 0.2138 | - | - | | 1.0357 | 3828 | 0.2367 | - | - | | 1.0360 | 3829 | 0.2655 | - | - | | 1.0363 | 3830 | 0.2152 | - | - | | 1.0365 | 3831 | 0.2693 | - | - | | 1.0368 | 3832 | 0.207 | - | - | | 1.0371 | 3833 | 0.2669 | - | - | | 1.0373 | 3834 | 0.2763 | - | - | | 1.0376 | 3835 | 0.222 | - | - | | 1.0379 | 3836 | 0.1458 | - | - | | 1.0381 | 3837 | 0.2265 | - | - | | 1.0384 | 3838 | 0.2329 | - | - | | 1.0387 | 3839 | 0.2841 | - | - | | 1.0390 | 3840 | 0.2274 | - | - | | 1.0392 | 3841 | 0.2253 | - | - | | 1.0395 | 3842 | 0.1991 | - | - | | 1.0398 | 3843 | 0.2628 | - | - | | 1.0400 | 3844 | 0.2222 | - | - | | 1.0403 | 3845 | 0.2352 | - | - | | 1.0406 | 3846 | 0.3067 | - | - | | 1.0409 | 3847 | 0.2131 | - | - | | 1.0411 | 3848 | 0.2004 | - | - | | 1.0414 | 3849 | 0.1627 | - | - | | 1.0417 | 3850 | 0.1809 | - | - | | 1.0419 | 3851 | 0.2473 | - | - | | 1.0422 | 3852 | 0.2685 | - | - | | 1.0425 | 3853 | 0.1708 | - | - | | 1.0427 | 3854 | 0.2624 | - | - | | 1.0430 | 3855 | 0.2691 | - | - | | 1.0433 | 3856 | 0.2761 | - | - | | 1.0436 | 3857 | 0.3001 | - | - | | 1.0438 | 3858 | 0.2332 | - | - | | 1.0441 | 3859 | 0.1779 | - | - | | 1.0444 | 3860 | 0.317 | - | - | | 1.0446 | 3861 | 0.3584 | - | - | | 1.0449 | 3862 | 0.2173 | - | - | | 1.0452 | 3863 | 0.239 | - | - | | 1.0455 | 3864 | 0.2868 | - | - | | 1.0457 | 3865 | 0.2156 | 
- | - | | 1.0460 | 3866 | 0.1991 | - | - | | 1.0463 | 3867 | 0.2395 | - | - | | 1.0465 | 3868 | 0.2973 | - | - | | 1.0468 | 3869 | 0.2669 | - | - | | 1.0471 | 3870 | 0.2921 | - | - | | 1.0473 | 3871 | 0.2109 | - | - | | 1.0476 | 3872 | 0.2387 | - | - | | 1.0479 | 3873 | 0.2252 | - | - | | 1.0482 | 3874 | 0.2101 | - | - | | 1.0484 | 3875 | 0.2311 | - | - | | 1.0487 | 3876 | 0.2459 | - | - | | 1.0490 | 3877 | 0.212 | - | - | | 1.0492 | 3878 | 0.2261 | - | - | | 1.0495 | 3879 | 0.227 | - | - | | 1.0498 | 3880 | 0.2942 | - | - | | 1.0501 | 3881 | 0.2127 | - | - | | 1.0503 | 3882 | 0.1862 | - | - | | 1.0506 | 3883 | 0.1601 | - | - | | 1.0509 | 3884 | 0.1747 | - | - | | 1.0511 | 3885 | 0.1798 | - | - | | 1.0514 | 3886 | 0.2838 | - | - | | 1.0517 | 3887 | 0.2368 | - | - | | 1.0519 | 3888 | 0.1666 | - | - | | 1.0522 | 3889 | 0.1971 | - | - | | 1.0525 | 3890 | 0.2437 | - | - | | 1.0528 | 3891 | 0.2614 | - | - | | 1.0530 | 3892 | 0.2016 | - | - | | 1.0533 | 3893 | 0.2189 | - | - | | 1.0536 | 3894 | 0.2739 | - | - | | 1.0538 | 3895 | 0.2841 | - | - | | 1.0541 | 3896 | 0.2608 | - | - | | 1.0544 | 3897 | 0.3097 | - | - | | 1.0547 | 3898 | 0.2651 | - | - | | 1.0549 | 3899 | 0.2079 | - | - | | 1.0552 | 3900 | 0.2613 | - | - | | 1.0555 | 3901 | 0.2221 | - | - | | 1.0557 | 3902 | 0.3085 | - | - | | 1.0560 | 3903 | 0.1983 | - | - | | 1.0563 | 3904 | 0.1945 | - | - | | 1.0565 | 3905 | 0.233 | - | - | | 1.0568 | 3906 | 0.24 | - | - | | 1.0571 | 3907 | 0.1822 | - | - | | 1.0574 | 3908 | 0.2404 | - | - | | 1.0576 | 3909 | 0.1511 | - | - | | 1.0579 | 3910 | 0.231 | - | - | | 1.0582 | 3911 | 0.2019 | - | - | | 1.0584 | 3912 | 0.2938 | - | - | | 1.0587 | 3913 | 0.3566 | - | - | | 1.0590 | 3914 | 0.2612 | - | - | | 1.0593 | 3915 | 0.3086 | - | - | | 1.0595 | 3916 | 0.2291 | - | - | | 1.0598 | 3917 | 0.2211 | - | - | | 1.0601 | 3918 | 0.2256 | - | - | | 1.0603 | 3919 | 0.2852 | - | - | | 1.0606 | 3920 | 0.2344 | - | - | | 1.0609 | 3921 | 0.3421 | - | - | | 1.0611 | 3922 | 0.2482 | - | - | | 
1.0614 | 3923 | 0.2685 | - | - | | 1.0617 | 3924 | 0.2192 | - | - | | 1.0620 | 3925 | 0.2674 | - | - | | 1.0622 | 3926 | 0.2452 | - | - | | 1.0625 | 3927 | 0.2786 | - | - | | 1.0628 | 3928 | 0.2528 | - | - | | 1.0630 | 3929 | 0.254 | - | - | | 1.0633 | 3930 | 0.3259 | - | - | | 1.0636 | 3931 | 0.1807 | - | - | | 1.0639 | 3932 | 0.1963 | - | - | | 1.0641 | 3933 | 0.2583 | - | - | | 1.0644 | 3934 | 0.2107 | - | - | | 1.0647 | 3935 | 0.2628 | - | - | | 1.0649 | 3936 | 0.1211 | - | - | | 1.0652 | 3937 | 0.2365 | - | - | | 1.0655 | 3938 | 0.219 | - | - | | 1.0657 | 3939 | 0.2354 | - | - | | 1.0660 | 3940 | 0.1676 | - | - | | 1.0663 | 3941 | 0.2248 | - | - | | 1.0666 | 3942 | 0.1766 | - | - | | 1.0668 | 3943 | 0.2815 | - | - | | 1.0671 | 3944 | 0.2082 | - | - | | 1.0674 | 3945 | 0.377 | - | - | | 1.0676 | 3946 | 0.2223 | - | - | | 1.0679 | 3947 | 0.1825 | - | - | | 1.0682 | 3948 | 0.1764 | - | - | | 1.0685 | 3949 | 0.1695 | - | - | | 1.0687 | 3950 | 0.2012 | - | - | | 1.0690 | 3951 | 0.181 | - | - | | 1.0693 | 3952 | 0.2575 | - | - | | 1.0695 | 3953 | 0.3822 | - | - | | 1.0698 | 3954 | 0.1845 | - | - | | 1.0701 | 3955 | 0.2852 | - | - | | 1.0703 | 3956 | 0.2233 | - | - | | 1.0706 | 3957 | 0.1728 | - | - | | 1.0709 | 3958 | 0.239 | - | - | | 1.0712 | 3959 | 0.2591 | - | - | | 1.0714 | 3960 | 0.2841 | - | - | | 1.0717 | 3961 | 0.25 | - | - | | 1.0720 | 3962 | 0.3137 | - | - | | 1.0722 | 3963 | 0.2341 | - | - | | 1.0725 | 3964 | 0.2889 | - | - | | 1.0728 | 3965 | 0.1903 | - | - | | 1.0731 | 3966 | 0.2279 | - | - | | 1.0733 | 3967 | 0.196 | - | - | | 1.0736 | 3968 | 0.2252 | - | - | | 1.0739 | 3969 | 0.2639 | - | - | | 1.0741 | 3970 | 0.2483 | - | - | | 1.0744 | 3971 | 0.1682 | - | - | | 1.0747 | 3972 | 0.2842 | - | - | | 1.0749 | 3973 | 0.2235 | - | - | | 1.0752 | 3974 | 0.2576 | - | - | | 1.0755 | 3975 | 0.2906 | - | - | | 1.0758 | 3976 | 0.1943 | - | - | | 1.0760 | 3977 | 0.2131 | - | - | | 1.0763 | 3978 | 0.1708 | - | - | | 1.0766 | 3979 | 0.2712 | - | - | | 1.0768 | 
3980 | 0.1389 | - | - | | 1.0771 | 3981 | 0.1962 | - | - | | 1.0774 | 3982 | 0.2629 | - | - | | 1.0777 | 3983 | 0.221 | - | - | | 1.0779 | 3984 | 0.2348 | - | - | | 1.0782 | 3985 | 0.2455 | - | - | | 1.0785 | 3986 | 0.2226 | - | - | | 1.0787 | 3987 | 0.2323 | - | - | | 1.0790 | 3988 | 0.2054 | - | - | | 1.0793 | 3989 | 0.2234 | - | - | | 1.0795 | 3990 | 0.1725 | - | - | | 1.0798 | 3991 | 0.2408 | - | - | | 1.0801 | 3992 | 0.2424 | - | - | | 1.0804 | 3993 | 0.1942 | - | - | | 1.0806 | 3994 | 0.2107 | - | - | | 1.0809 | 3995 | 0.2726 | - | - | | 1.0812 | 3996 | 0.2541 | - | - | | 1.0814 | 3997 | 0.2702 | - | - | | 1.0817 | 3998 | 0.2078 | - | - | | 1.0820 | 3999 | 0.2351 | - | - | | 1.0823 | 4000 | 0.2006 | 0.2336 | 0.9416 | | 1.0825 | 4001 | 0.1635 | - | - | | 1.0828 | 4002 | 0.2494 | - | - | | 1.0831 | 4003 | 0.1952 | - | - | | 1.0833 | 4004 | 0.1852 | - | - | | 1.0836 | 4005 | 0.2177 | - | - | | 1.0839 | 4006 | 0.2086 | - | - | | 1.0841 | 4007 | 0.2453 | - | - | | 1.0844 | 4008 | 0.2443 | - | - | | 1.0847 | 4009 | 0.2451 | - | - | | 1.0850 | 4010 | 0.2234 | - | - | | 1.0852 | 4011 | 0.2117 | - | - | | 1.0855 | 4012 | 0.2206 | - | - | | 1.0858 | 4013 | 0.2102 | - | - | | 1.0860 | 4014 | 0.3688 | - | - | | 1.0863 | 4015 | 0.2184 | - | - | | 1.0866 | 4016 | 0.2224 | - | - | | 1.0869 | 4017 | 0.219 | - | - | | 1.0871 | 4018 | 0.228 | - | - | | 1.0874 | 4019 | 0.2189 | - | - | | 1.0877 | 4020 | 0.2243 | - | - | | 1.0879 | 4021 | 0.2628 | - | - | | 1.0882 | 4022 | 0.2735 | - | - | | 1.0885 | 4023 | 0.2302 | - | - | | 1.0887 | 4024 | 0.2326 | - | - | | 1.0890 | 4025 | 0.2749 | - | - | | 1.0893 | 4026 | 0.25 | - | - | | 1.0896 | 4027 | 0.2004 | - | - | | 1.0898 | 4028 | 0.2 | - | - | | 1.0901 | 4029 | 0.2037 | - | - | | 1.0904 | 4030 | 0.283 | - | - | | 1.0906 | 4031 | 0.3118 | - | - | | 1.0909 | 4032 | 0.2184 | - | - | | 1.0912 | 4033 | 0.2213 | - | - | | 1.0915 | 4034 | 0.3782 | - | - | | 1.0917 | 4035 | 0.2203 | - | - | | 1.0920 | 4036 | 0.2366 | - | - | | 1.0923 | 
4037 | 0.2382 | - | - | | 1.0925 | 4038 | 0.1448 | - | - | | 1.0928 | 4039 | 0.2925 | - | - | | 1.0931 | 4040 | 0.2262 | - | - | | 1.0933 | 4041 | 0.2376 | - | - | | 1.0936 | 4042 | 0.2623 | - | - | | 1.0939 | 4043 | 0.2333 | - | - | | 1.0942 | 4044 | 0.2159 | - | - | | 1.0944 | 4045 | 0.169 | - | - | | 1.0947 | 4046 | 0.1788 | - | - | | 1.0950 | 4047 | 0.1915 | - | - | | 1.0952 | 4048 | 0.2878 | - | - | | 1.0955 | 4049 | 0.2274 | - | - | | 1.0958 | 4050 | 0.3054 | - | - | | 1.0960 | 4051 | 0.2925 | - | - | | 1.0963 | 4052 | 0.2351 | - | - | | 1.0966 | 4053 | 0.3037 | - | - | | 1.0969 | 4054 | 0.2311 | - | - | | 1.0971 | 4055 | 0.2069 | - | - | | 1.0974 | 4056 | 0.2363 | - | - | | 1.0977 | 4057 | 0.2554 | - | - | | 1.0979 | 4058 | 0.288 | - | - | | 1.0982 | 4059 | 0.2552 | - | - | | 1.0985 | 4060 | 0.258 | - | - | | 1.0988 | 4061 | 0.2276 | - | - | | 1.0990 | 4062 | 0.2131 | - | - | | 1.0993 | 4063 | 0.1614 | - | - | | 1.0996 | 4064 | 0.1668 | - | - | | 1.0998 | 4065 | 0.2547 | - | - | | 1.1001 | 4066 | 0.2231 | - | - | | 1.1004 | 4067 | 0.2161 | - | - | | 1.1006 | 4068 | 0.168 | - | - | | 1.1009 | 4069 | 0.2466 | - | - | | 1.1012 | 4070 | 0.335 | - | - | | 1.1015 | 4071 | 0.2734 | - | - | | 1.1017 | 4072 | 0.1958 | - | - | | 1.1020 | 4073 | 0.1705 | - | - | | 1.1023 | 4074 | 0.2087 | - | - | | 1.1025 | 4075 | 0.1648 | - | - | | 1.1028 | 4076 | 0.2473 | - | - | | 1.1031 | 4077 | 0.2457 | - | - | | 1.1034 | 4078 | 0.2744 | - | - | | 1.1036 | 4079 | 0.2046 | - | - | | 1.1039 | 4080 | 0.2668 | - | - | | 1.1042 | 4081 | 0.2365 | - | - | | 1.1044 | 4082 | 0.2355 | - | - | | 1.1047 | 4083 | 0.1896 | - | - | | 1.1050 | 4084 | 0.2236 | - | - | | 1.1052 | 4085 | 0.2284 | - | - | | 1.1055 | 4086 | 0.1619 | - | - | | 1.1058 | 4087 | 0.3478 | - | - | | 1.1061 | 4088 | 0.2752 | - | - | | 1.1063 | 4089 | 0.3266 | - | - | | 1.1066 | 4090 | 0.232 | - | - | | 1.1069 | 4091 | 0.2007 | - | - | | 1.1071 | 4092 | 0.2568 | - | - | | 1.1074 | 4093 | 0.2395 | - | - | | 1.1077 | 4094 | 
0.1627 | - | - | | 1.1080 | 4095 | 0.2363 | - | - | | 1.1082 | 4096 | 0.2078 | - | - | | 1.1085 | 4097 | 0.3126 | - | - | | 1.1088 | 4098 | 0.2028 | - | - | | 1.1090 | 4099 | 0.2214 | - | - | | 1.1093 | 4100 | 0.3466 | - | - | | 1.1096 | 4101 | 0.2243 | - | - | | 1.1098 | 4102 | 0.2088 | - | - | | 1.1101 | 4103 | 0.2623 | - | - | | 1.1104 | 4104 | 0.3202 | - | - | | 1.1107 | 4105 | 0.1989 | - | - | | 1.1109 | 4106 | 0.2163 | - | - | | 1.1112 | 4107 | 0.1721 | - | - | | 1.1115 | 4108 | 0.2463 | - | - | | 1.1117 | 4109 | 0.279 | - | - | | 1.1120 | 4110 | 0.2391 | - | - | | 1.1123 | 4111 | 0.1849 | - | - | | 1.1126 | 4112 | 0.3284 | - | - | | 1.1128 | 4113 | 0.2569 | - | - | | 1.1131 | 4114 | 0.2078 | - | - | | 1.1134 | 4115 | 0.1981 | - | - | | 1.1136 | 4116 | 0.1753 | - | - | | 1.1139 | 4117 | 0.2487 | - | - | | 1.1142 | 4118 | 0.2268 | - | - | | 1.1144 | 4119 | 0.1832 | - | - | | 1.1147 | 4120 | 0.2507 | - | - | | 1.1150 | 4121 | 0.2335 | - | - | | 1.1153 | 4122 | 0.2351 | - | - | | 1.1155 | 4123 | 0.2809 | - | - | | 1.1158 | 4124 | 0.2653 | - | - | | 1.1161 | 4125 | 0.1901 | - | - | | 1.1163 | 4126 | 0.2207 | - | - | | 1.1166 | 4127 | 0.2148 | - | - | | 1.1169 | 4128 | 0.154 | - | - | | 1.1172 | 4129 | 0.2834 | - | - | | 1.1174 | 4130 | 0.2649 | - | - | | 1.1177 | 4131 | 0.2326 | - | - | | 1.1180 | 4132 | 0.2733 | - | - | | 1.1182 | 4133 | 0.2805 | - | - | | 1.1185 | 4134 | 0.2385 | - | - | | 1.1188 | 4135 | 0.2177 | - | - | | 1.1190 | 4136 | 0.2291 | - | - | | 1.1193 | 4137 | 0.2262 | - | - | | 1.1196 | 4138 | 0.1778 | - | - | | 1.1199 | 4139 | 0.243 | - | - | | 1.1201 | 4140 | 0.1881 | - | - | | 1.1204 | 4141 | 0.2183 | - | - | | 1.1207 | 4142 | 0.3065 | - | - | | 1.1209 | 4143 | 0.308 | - | - | | 1.1212 | 4144 | 0.2685 | - | - | | 1.1215 | 4145 | 0.2397 | - | - | | 1.1218 | 4146 | 0.2339 | - | - | | 1.1220 | 4147 | 0.2657 | - | - | | 1.1223 | 4148 | 0.2177 | - | - | | 1.1226 | 4149 | 0.1997 | - | - | | 1.1228 | 4150 | 0.2234 | - | - | | 1.1231 | 4151 | 0.2441 | 
- | - | | 1.1234 | 4152 | 0.2548 | - | - | | 1.1236 | 4153 | 0.2114 | - | - | | 1.1239 | 4154 | 0.3083 | - | - | | 1.1242 | 4155 | 0.2732 | - | - | | 1.1245 | 4156 | 0.2402 | - | - | | 1.1247 | 4157 | 0.2963 | - | - | | 1.125 | 4158 | 0.1755 | - | - | | 1.1253 | 4159 | 0.1807 | - | - | | 1.1255 | 4160 | 0.2084 | - | - | | 1.1258 | 4161 | 0.1941 | - | - | | 1.1261 | 4162 | 0.1946 | - | - | | 1.1264 | 4163 | 0.233 | - | - | | 1.1266 | 4164 | 0.1537 | - | - | | 1.1269 | 4165 | 0.2537 | - | - | | 1.1272 | 4166 | 0.1951 | - | - | | 1.1274 | 4167 | 0.2516 | - | - | | 1.1277 | 4168 | 0.2027 | - | - | | 1.1280 | 4169 | 0.1996 | - | - | | 1.1282 | 4170 | 0.2693 | - | - | | 1.1285 | 4171 | 0.2447 | - | - | | 1.1288 | 4172 | 0.2289 | - | - | | 1.1291 | 4173 | 0.251 | - | - | | 1.1293 | 4174 | 0.2144 | - | - | | 1.1296 | 4175 | 0.2459 | - | - | | 1.1299 | 4176 | 0.2599 | - | - | | 1.1301 | 4177 | 0.2555 | - | - | | 1.1304 | 4178 | 0.2522 | - | - | | 1.1307 | 4179 | 0.1855 | - | - | | 1.1310 | 4180 | 0.2985 | - | - | | 1.1312 | 4181 | 0.2252 | - | - | | 1.1315 | 4182 | 0.1398 | - | - | | 1.1318 | 4183 | 0.2341 | - | - | | 1.1320 | 4184 | 0.2225 | - | - | | 1.1323 | 4185 | 0.2567 | - | - | | 1.1326 | 4186 | 0.2446 | - | - | | 1.1328 | 4187 | 0.2637 | - | - | | 1.1331 | 4188 | 0.2457 | - | - | | 1.1334 | 4189 | 0.2373 | - | - | | 1.1337 | 4190 | 0.2702 | - | - | | 1.1339 | 4191 | 0.3234 | - | - | | 1.1342 | 4192 | 0.2746 | - | - | | 1.1345 | 4193 | 0.3018 | - | - | | 1.1347 | 4194 | 0.2691 | - | - | | 1.1350 | 4195 | 0.2198 | - | - | | 1.1353 | 4196 | 0.2281 | - | - | | 1.1356 | 4197 | 0.2309 | - | - | | 1.1358 | 4198 | 0.293 | - | - | | 1.1361 | 4199 | 0.2385 | - | - | | 1.1364 | 4200 | 0.2647 | - | - | | 1.1366 | 4201 | 0.266 | - | - | | 1.1369 | 4202 | 0.2382 | - | - | | 1.1372 | 4203 | 0.2534 | - | - | | 1.1374 | 4204 | 0.229 | - | - | | 1.1377 | 4205 | 0.1996 | - | - | | 1.1380 | 4206 | 0.2879 | - | - | | 1.1383 | 4207 | 0.1815 | - | - | | 1.1385 | 4208 | 0.2465 | - | - | | 
1.1388 | 4209 | 0.2426 | - | - | | 1.1391 | 4210 | 0.22 | - | - | | 1.1393 | 4211 | 0.1954 | - | - | | 1.1396 | 4212 | 0.2763 | - | - | | 1.1399 | 4213 | 0.2086 | - | - | | 1.1402 | 4214 | 0.2295 | - | - | | 1.1404 | 4215 | 0.2712 | - | - | | 1.1407 | 4216 | 0.2734 | - | - | | 1.1410 | 4217 | 0.3155 | - | - | | 1.1412 | 4218 | 0.2565 | - | - | | 1.1415 | 4219 | 0.2388 | - | - | | 1.1418 | 4220 | 0.2291 | - | - | | 1.1420 | 4221 | 0.2375 | - | - | | 1.1423 | 4222 | 0.2429 | - | - | | 1.1426 | 4223 | 0.1965 | - | - | | 1.1429 | 4224 | 0.2188 | - | - | | 1.1431 | 4225 | 0.2529 | - | - | | 1.1434 | 4226 | 0.2549 | - | - | | 1.1437 | 4227 | 0.1583 | - | - | | 1.1439 | 4228 | 0.2585 | - | - | | 1.1442 | 4229 | 0.2597 | - | - | | 1.1445 | 4230 | 0.1788 | - | - | | 1.1448 | 4231 | 0.2648 | - | - | | 1.1450 | 4232 | 0.2133 | - | - | | 1.1453 | 4233 | 0.2612 | - | - | | 1.1456 | 4234 | 0.2921 | - | - | | 1.1458 | 4235 | 0.2136 | - | - | | 1.1461 | 4236 | 0.281 | - | - | | 1.1464 | 4237 | 0.2649 | - | - | | 1.1466 | 4238 | 0.2231 | - | - | | 1.1469 | 4239 | 0.2245 | - | - | | 1.1472 | 4240 | 0.1939 | - | - | | 1.1475 | 4241 | 0.2871 | - | - | | 1.1477 | 4242 | 0.1864 | - | - | | 1.1480 | 4243 | 0.2726 | - | - | | 1.1483 | 4244 | 0.214 | - | - | | 1.1485 | 4245 | 0.2789 | - | - | | 1.1488 | 4246 | 0.1811 | - | - | | 1.1491 | 4247 | 0.2205 | - | - | | 1.1494 | 4248 | 0.179 | - | - | | 1.1496 | 4249 | 0.2973 | - | - | | 1.1499 | 4250 | 0.2983 | - | - | | 1.1502 | 4251 | 0.2739 | - | - | | 1.1504 | 4252 | 0.2129 | - | - | | 1.1507 | 4253 | 0.2948 | - | - | | 1.1510 | 4254 | 0.2201 | - | - | | 1.1512 | 4255 | 0.2214 | - | - | | 1.1515 | 4256 | 0.1969 | - | - | | 1.1518 | 4257 | 0.1745 | - | - | | 1.1521 | 4258 | 0.2708 | - | - | | 1.1523 | 4259 | 0.3266 | - | - | | 1.1526 | 4260 | 0.2179 | - | - | | 1.1529 | 4261 | 0.2791 | - | - | | 1.1531 | 4262 | 0.2786 | - | - | | 1.1534 | 4263 | 0.2065 | - | - | | 1.1537 | 4264 | 0.1809 | - | - | | 1.1540 | 4265 | 0.1854 | - | - | | 1.1542 | 
4266 | 0.3181 | - | - |
| 1.1545 | 4267 | 0.2476 | - | - |
| ... | ... | ... | ... | ... |
| 1.3525 | 4999 | 0.2894 | - | - |
| 1.3528 | 5000 | 0.1743 | 0.2258 | 0.9426 |
| 1.3531 | 5001 | 0.3272 | - | - |
| ... | ... | ... | ... | ... |
| 1.4635 | 5409 | 0.1891 | - | - |
| 1.4637 | 5410 | 0.2865 | -
| - | | 1.4640 | 5411 | 0.1804 | - | - | | 1.4643 | 5412 | 0.2538 | - | - | | 1.4646 | 5413 | 0.2151 | - | - | | 1.4648 | 5414 | 0.2095 | - | - | | 1.4651 | 5415 | 0.1414 | - | - | | 1.4654 | 5416 | 0.244 | - | - | | 1.4656 | 5417 | 0.2275 | - | - | | 1.4659 | 5418 | 0.181 | - | - | | 1.4662 | 5419 | 0.221 | - | - | | 1.4665 | 5420 | 0.2338 | - | - | | 1.4667 | 5421 | 0.2677 | - | - | | 1.4670 | 5422 | 0.2174 | - | - | | 1.4673 | 5423 | 0.1827 | - | - | | 1.4675 | 5424 | 0.2083 | - | - | | 1.4678 | 5425 | 0.1838 | - | - | | 1.4681 | 5426 | 0.2313 | - | - | | 1.4683 | 5427 | 0.3292 | - | - | | 1.4686 | 5428 | 0.2552 | - | - | | 1.4689 | 5429 | 0.2097 | - | - | | 1.4692 | 5430 | 0.2113 | - | - | | 1.4694 | 5431 | 0.1731 | - | - | | 1.4697 | 5432 | 0.2338 | - | - | | 1.4700 | 5433 | 0.3219 | - | - | | 1.4702 | 5434 | 0.1768 | - | - | | 1.4705 | 5435 | 0.2597 | - | - | | 1.4708 | 5436 | 0.1806 | - | - | | 1.4710 | 5437 | 0.2821 | - | - | | 1.4713 | 5438 | 0.372 | - | - | | 1.4716 | 5439 | 0.2756 | - | - | | 1.4719 | 5440 | 0.2026 | - | - | | 1.4721 | 5441 | 0.2128 | - | - | | 1.4724 | 5442 | 0.1998 | - | - | | 1.4727 | 5443 | 0.2317 | - | - | | 1.4729 | 5444 | 0.2427 | - | - | | 1.4732 | 5445 | 0.2575 | - | - | | 1.4735 | 5446 | 0.233 | - | - | | 1.4738 | 5447 | 0.3004 | - | - | | 1.4740 | 5448 | 0.2432 | - | - | | 1.4743 | 5449 | 0.2577 | - | - | | 1.4746 | 5450 | 0.2081 | - | - | | 1.4748 | 5451 | 0.2063 | - | - | | 1.4751 | 5452 | 0.3232 | - | - | | 1.4754 | 5453 | 0.1869 | - | - | | 1.4756 | 5454 | 0.1423 | - | - | | 1.4759 | 5455 | 0.1559 | - | - | | 1.4762 | 5456 | 0.2014 | - | - | | 1.4765 | 5457 | 0.2138 | - | - | | 1.4767 | 5458 | 0.2259 | - | - | | 1.4770 | 5459 | 0.2196 | - | - | | 1.4773 | 5460 | 0.2209 | - | - | | 1.4775 | 5461 | 0.3369 | - | - | | 1.4778 | 5462 | 0.2625 | - | - | | 1.4781 | 5463 | 0.1662 | - | - | | 1.4784 | 5464 | 0.2073 | - | - | | 1.4786 | 5465 | 0.1871 | - | - | | 1.4789 | 5466 | 0.2259 | - | - | | 1.4792 | 5467 | 0.2644 | - | - | | 
1.4794 | 5468 | 0.2084 | - | - | | 1.4797 | 5469 | 0.1911 | - | - | | 1.4800 | 5470 | 0.3248 | - | - | | 1.4802 | 5471 | 0.1612 | - | - | | 1.4805 | 5472 | 0.3163 | - | - | | 1.4808 | 5473 | 0.167 | - | - | | 1.4811 | 5474 | 0.1923 | - | - | | 1.4813 | 5475 | 0.3397 | - | - | | 1.4816 | 5476 | 0.2408 | - | - | | 1.4819 | 5477 | 0.1998 | - | - | | 1.4821 | 5478 | 0.207 | - | - | | 1.4824 | 5479 | 0.3086 | - | - | | 1.4827 | 5480 | 0.1624 | - | - | | 1.4830 | 5481 | 0.2752 | - | - | | 1.4832 | 5482 | 0.2334 | - | - | | 1.4835 | 5483 | 0.1901 | - | - | | 1.4838 | 5484 | 0.2568 | - | - | | 1.4840 | 5485 | 0.2489 | - | - | | 1.4843 | 5486 | 0.2886 | - | - | | 1.4846 | 5487 | 0.3219 | - | - | | 1.4848 | 5488 | 0.1975 | - | - | | 1.4851 | 5489 | 0.1945 | - | - | | 1.4854 | 5490 | 0.1989 | - | - | | 1.4857 | 5491 | 0.2388 | - | - | | 1.4859 | 5492 | 0.1777 | - | - | | 1.4862 | 5493 | 0.2774 | - | - | | 1.4865 | 5494 | 0.1815 | - | - | | 1.4867 | 5495 | 0.2921 | - | - | | 1.4870 | 5496 | 0.1676 | - | - | | 1.4873 | 5497 | 0.1916 | - | - | | 1.4876 | 5498 | 0.2192 | - | - | | 1.4878 | 5499 | 0.2492 | - | - | | 1.4881 | 5500 | 0.2286 | - | - | | 1.4884 | 5501 | 0.2974 | - | - | | 1.4886 | 5502 | 0.1951 | - | - | | 1.4889 | 5503 | 0.2977 | - | - | | 1.4892 | 5504 | 0.2179 | - | - | | 1.4894 | 5505 | 0.2211 | - | - | | 1.4897 | 5506 | 0.2143 | - | - | | 1.4900 | 5507 | 0.2175 | - | - | | 1.4903 | 5508 | 0.1944 | - | - | | 1.4905 | 5509 | 0.2832 | - | - | | 1.4908 | 5510 | 0.2015 | - | - | | 1.4911 | 5511 | 0.2478 | - | - | | 1.4913 | 5512 | 0.2564 | - | - | | 1.4916 | 5513 | 0.1937 | - | - | | 1.4919 | 5514 | 0.2878 | - | - | | 1.4922 | 5515 | 0.222 | - | - | | 1.4924 | 5516 | 0.2924 | - | - | | 1.4927 | 5517 | 0.2447 | - | - | | 1.4930 | 5518 | 0.2284 | - | - | | 1.4932 | 5519 | 0.2322 | - | - | | 1.4935 | 5520 | 0.1363 | - | - | | 1.4938 | 5521 | 0.2156 | - | - | | 1.4940 | 5522 | 0.2647 | - | - | | 1.4943 | 5523 | 0.3007 | - | - | | 1.4946 | 5524 | 0.2893 | - | - | | 1.4949 
| 5525 | 0.2801 | - | - | | 1.4951 | 5526 | 0.2177 | - | - | | 1.4954 | 5527 | 0.1799 | - | - | | 1.4957 | 5528 | 0.2098 | - | - | | 1.4959 | 5529 | 0.2221 | - | - | | 1.4962 | 5530 | 0.2285 | - | - | | 1.4965 | 5531 | 0.2108 | - | - | | 1.4968 | 5532 | 0.2639 | - | - | | 1.4970 | 5533 | 0.2495 | - | - | | 1.4973 | 5534 | 0.2223 | - | - | | 1.4976 | 5535 | 0.2637 | - | - | | 1.4978 | 5536 | 0.214 | - | - | | 1.4981 | 5537 | 0.22 | - | - | | 1.4984 | 5538 | 0.2689 | - | - | | 1.4986 | 5539 | 0.191 | - | - | | 1.4989 | 5540 | 0.2049 | - | - | | 1.4992 | 5541 | 0.1735 | - | - | | 1.4995 | 5542 | 0.2252 | - | - | | 1.4997 | 5543 | 0.2629 | - | - | | 1.5 | 5544 | 0.2102 | - | - | | 1.5003 | 5545 | 0.1566 | - | - | | 1.5005 | 5546 | 0.2044 | - | - | | 1.5008 | 5547 | 0.1841 | - | - | | 1.5011 | 5548 | 0.2714 | - | - | | 1.5014 | 5549 | 0.1354 | - | - | | 1.5016 | 5550 | 0.1657 | - | - | | 1.5019 | 5551 | 0.1657 | - | - | | 1.5022 | 5552 | 0.1454 | - | - | | 1.5024 | 5553 | 0.1856 | - | - | | 1.5027 | 5554 | 0.2391 | - | - | | 1.5030 | 5555 | 0.1601 | - | - | | 1.5032 | 5556 | 0.2047 | - | - | | 1.5035 | 5557 | 0.2834 | - | - | | 1.5038 | 5558 | 0.238 | - | - | | 1.5041 | 5559 | 0.2363 | - | - | | 1.5043 | 5560 | 0.2745 | - | - | | 1.5046 | 5561 | 0.2245 | - | - | | 1.5049 | 5562 | 0.2493 | - | - | | 1.5051 | 5563 | 0.2406 | - | - | | 1.5054 | 5564 | 0.1992 | - | - | | 1.5057 | 5565 | 0.1981 | - | - | | 1.5060 | 5566 | 0.1514 | - | - | | 1.5062 | 5567 | 0.2475 | - | - | | 1.5065 | 5568 | 0.2874 | - | - | | 1.5068 | 5569 | 0.1998 | - | - | | 1.5070 | 5570 | 0.2299 | - | - | | 1.5073 | 5571 | 0.244 | - | - | | 1.5076 | 5572 | 0.2278 | - | - | | 1.5078 | 5573 | 0.3185 | - | - | | 1.5081 | 5574 | 0.2127 | - | - | | 1.5084 | 5575 | 0.2502 | - | - | | 1.5087 | 5576 | 0.2776 | - | - | | 1.5089 | 5577 | 0.2142 | - | - | | 1.5092 | 5578 | 0.1572 | - | - | | 1.5095 | 5579 | 0.2408 | - | - | | 1.5097 | 5580 | 0.201 | - | - | | 1.5100 | 5581 | 0.1616 | - | - | | 1.5103 | 5582 | 
0.2866 | - | - | | 1.5106 | 5583 | 0.1576 | - | - | | 1.5108 | 5584 | 0.2119 | - | - | | 1.5111 | 5585 | 0.204 | - | - | | 1.5114 | 5586 | 0.263 | - | - | | 1.5116 | 5587 | 0.2022 | - | - | | 1.5119 | 5588 | 0.1391 | - | - | | 1.5122 | 5589 | 0.2201 | - | - | | 1.5124 | 5590 | 0.1976 | - | - | | 1.5127 | 5591 | 0.1972 | - | - | | 1.5130 | 5592 | 0.233 | - | - | | 1.5133 | 5593 | 0.2639 | - | - | | 1.5135 | 5594 | 0.249 | - | - | | 1.5138 | 5595 | 0.2755 | - | - | | 1.5141 | 5596 | 0.2411 | - | - | | 1.5143 | 5597 | 0.2186 | - | - | | 1.5146 | 5598 | 0.207 | - | - | | 1.5149 | 5599 | 0.2445 | - | - | | 1.5152 | 5600 | 0.2628 | - | - | | 1.5154 | 5601 | 0.2048 | - | - | | 1.5157 | 5602 | 0.1756 | - | - | | 1.5160 | 5603 | 0.1511 | - | - | | 1.5162 | 5604 | 0.2026 | - | - | | 1.5165 | 5605 | 0.1425 | - | - | | 1.5168 | 5606 | 0.2618 | - | - | | 1.5170 | 5607 | 0.2489 | - | - | | 1.5173 | 5608 | 0.2506 | - | - | | 1.5176 | 5609 | 0.2139 | - | - | | 1.5179 | 5610 | 0.2732 | - | - | | 1.5181 | 5611 | 0.2087 | - | - | | 1.5184 | 5612 | 0.2537 | - | - | | 1.5187 | 5613 | 0.2823 | - | - | | 1.5189 | 5614 | 0.1433 | - | - | | 1.5192 | 5615 | 0.2443 | - | - | | 1.5195 | 5616 | 0.2894 | - | - | | 1.5198 | 5617 | 0.2643 | - | - | | 1.5200 | 5618 | 0.1721 | - | - | | 1.5203 | 5619 | 0.2372 | - | - | | 1.5206 | 5620 | 0.1669 | - | - | | 1.5208 | 5621 | 0.2635 | - | - | | 1.5211 | 5622 | 0.196 | - | - | | 1.5214 | 5623 | 0.3238 | - | - | | 1.5216 | 5624 | 0.2018 | - | - | | 1.5219 | 5625 | 0.2176 | - | - | | 1.5222 | 5626 | 0.2485 | - | - | | 1.5225 | 5627 | 0.2026 | - | - | | 1.5227 | 5628 | 0.1769 | - | - | | 1.5230 | 5629 | 0.1424 | - | - | | 1.5233 | 5630 | 0.3039 | - | - | | 1.5235 | 5631 | 0.1787 | - | - | | 1.5238 | 5632 | 0.215 | - | - | | 1.5241 | 5633 | 0.2294 | - | - | | 1.5244 | 5634 | 0.2925 | - | - | | 1.5246 | 5635 | 0.2316 | - | - | | 1.5249 | 5636 | 0.2126 | - | - | | 1.5252 | 5637 | 0.2731 | - | - | | 1.5254 | 5638 | 0.2182 | - | - | | 1.5257 | 5639 | 0.2085 | - 
| - | | 1.5260 | 5640 | 0.2146 | - | - | | 1.5262 | 5641 | 0.1879 | - | - | | 1.5265 | 5642 | 0.2003 | - | - | | 1.5268 | 5643 | 0.2096 | - | - | | 1.5271 | 5644 | 0.175 | - | - | | 1.5273 | 5645 | 0.2619 | - | - | | 1.5276 | 5646 | 0.2154 | - | - | | 1.5279 | 5647 | 0.176 | - | - | | 1.5281 | 5648 | 0.2324 | - | - | | 1.5284 | 5649 | 0.1846 | - | - | | 1.5287 | 5650 | 0.2001 | - | - | | 1.5290 | 5651 | 0.1675 | - | - | | 1.5292 | 5652 | 0.1728 | - | - | | 1.5295 | 5653 | 0.278 | - | - | | 1.5298 | 5654 | 0.2801 | - | - | | 1.5300 | 5655 | 0.2838 | - | - | | 1.5303 | 5656 | 0.211 | - | - | | 1.5306 | 5657 | 0.2206 | - | - | | 1.5308 | 5658 | 0.226 | - | - | | 1.5311 | 5659 | 0.1446 | - | - | | 1.5314 | 5660 | 0.2313 | - | - | | 1.5317 | 5661 | 0.3117 | - | - | | 1.5319 | 5662 | 0.2354 | - | - | | 1.5322 | 5663 | 0.282 | - | - | | 1.5325 | 5664 | 0.1901 | - | - | | 1.5327 | 5665 | 0.2348 | - | - | | 1.5330 | 5666 | 0.2231 | - | - | | 1.5333 | 5667 | 0.1953 | - | - | | 1.5335 | 5668 | 0.2816 | - | - | | 1.5338 | 5669 | 0.2178 | - | - | | 1.5341 | 5670 | 0.241 | - | - | | 1.5344 | 5671 | 0.2126 | - | - | | 1.5346 | 5672 | 0.2098 | - | - | | 1.5349 | 5673 | 0.2801 | - | - | | 1.5352 | 5674 | 0.2055 | - | - | | 1.5354 | 5675 | 0.2021 | - | - | | 1.5357 | 5676 | 0.1739 | - | - | | 1.5360 | 5677 | 0.2332 | - | - | | 1.5363 | 5678 | 0.227 | - | - | | 1.5365 | 5679 | 0.268 | - | - | | 1.5368 | 5680 | 0.2668 | - | - | | 1.5371 | 5681 | 0.2066 | - | - | | 1.5373 | 5682 | 0.4161 | - | - | | 1.5376 | 5683 | 0.1861 | - | - | | 1.5379 | 5684 | 0.312 | - | - | | 1.5381 | 5685 | 0.2436 | - | - | | 1.5384 | 5686 | 0.251 | - | - | | 1.5387 | 5687 | 0.2195 | - | - | | 1.5390 | 5688 | 0.1934 | - | - | | 1.5392 | 5689 | 0.2052 | - | - | | 1.5395 | 5690 | 0.1954 | - | - | | 1.5398 | 5691 | 0.2338 | - | - | | 1.5400 | 5692 | 0.1491 | - | - | | 1.5403 | 5693 | 0.1914 | - | - | | 1.5406 | 5694 | 0.282 | - | - | | 1.5409 | 5695 | 0.1916 | - | - | | 1.5411 | 5696 | 0.172 | - | - | | 1.5414 | 
5697 | 0.289 | - | - | | 1.5417 | 5698 | 0.1691 | - | - | | 1.5419 | 5699 | 0.1604 | - | - | | 1.5422 | 5700 | 0.2124 | - | - | | 1.5425 | 5701 | 0.202 | - | - | | 1.5427 | 5702 | 0.2348 | - | - | | 1.5430 | 5703 | 0.2316 | - | - | | 1.5433 | 5704 | 0.2235 | - | - | | 1.5436 | 5705 | 0.2457 | - | - | | 1.5438 | 5706 | 0.2502 | - | - | | 1.5441 | 5707 | 0.2497 | - | - | | 1.5444 | 5708 | 0.222 | - | - | | 1.5446 | 5709 | 0.2358 | - | - | | 1.5449 | 5710 | 0.1897 | - | - | | 1.5452 | 5711 | 0.2342 | - | - | | 1.5455 | 5712 | 0.215 | - | - | | 1.5457 | 5713 | 0.1977 | - | - | | 1.5460 | 5714 | 0.2309 | - | - | | 1.5463 | 5715 | 0.1643 | - | - | | 1.5465 | 5716 | 0.1577 | - | - | | 1.5468 | 5717 | 0.289 | - | - | | 1.5471 | 5718 | 0.2148 | - | - | | 1.5473 | 5719 | 0.2683 | - | - | | 1.5476 | 5720 | 0.2271 | - | - | | 1.5479 | 5721 | 0.2025 | - | - | | 1.5482 | 5722 | 0.2214 | - | - | | 1.5484 | 5723 | 0.2657 | - | - | | 1.5487 | 5724 | 0.1977 | - | - | | 1.5490 | 5725 | 0.2107 | - | - | | 1.5492 | 5726 | 0.2138 | - | - | | 1.5495 | 5727 | 0.2628 | - | - | | 1.5498 | 5728 | 0.2392 | - | - | | 1.5501 | 5729 | 0.2544 | - | - | | 1.5503 | 5730 | 0.1518 | - | - | | 1.5506 | 5731 | 0.1843 | - | - | | 1.5509 | 5732 | 0.2203 | - | - | | 1.5511 | 5733 | 0.1936 | - | - | | 1.5514 | 5734 | 0.1777 | - | - | | 1.5517 | 5735 | 0.1526 | - | - | | 1.5519 | 5736 | 0.2415 | - | - | | 1.5522 | 5737 | 0.2292 | - | - | | 1.5525 | 5738 | 0.2241 | - | - | | 1.5528 | 5739 | 0.2294 | - | - | | 1.5530 | 5740 | 0.2505 | - | - | | 1.5533 | 5741 | 0.2414 | - | - | | 1.5536 | 5742 | 0.248 | - | - | | 1.5538 | 5743 | 0.2055 | - | - | | 1.5541 | 5744 | 0.1775 | - | - | | 1.5544 | 5745 | 0.2609 | - | - | | 1.5547 | 5746 | 0.3636 | - | - | | 1.5549 | 5747 | 0.2204 | - | - | | 1.5552 | 5748 | 0.2022 | - | - | | 1.5555 | 5749 | 0.2075 | - | - | | 1.5557 | 5750 | 0.2271 | - | - | | 1.5560 | 5751 | 0.2137 | - | - | | 1.5563 | 5752 | 0.2159 | - | - | | 1.5565 | 5753 | 0.3304 | - | - | | 1.5568 | 5754 | 
0.2406 | - | - | | 1.5571 | 5755 | 0.2436 | - | - | | 1.5574 | 5756 | 0.2351 | - | - | | 1.5576 | 5757 | 0.2258 | - | - | | 1.5579 | 5758 | 0.2615 | - | - | | 1.5582 | 5759 | 0.1605 | - | - | | 1.5584 | 5760 | 0.3292 | - | - | | 1.5587 | 5761 | 0.2382 | - | - | | 1.5590 | 5762 | 0.204 | - | - | | 1.5593 | 5763 | 0.1622 | - | - | | 1.5595 | 5764 | 0.2051 | - | - | | 1.5598 | 5765 | 0.1384 | - | - | | 1.5601 | 5766 | 0.2148 | - | - | | 1.5603 | 5767 | 0.1852 | - | - | | 1.5606 | 5768 | 0.2015 | - | - | | 1.5609 | 5769 | 0.1934 | - | - | | 1.5611 | 5770 | 0.2636 | - | - | | 1.5614 | 5771 | 0.2743 | - | - | | 1.5617 | 5772 | 0.2725 | - | - | | 1.5620 | 5773 | 0.2293 | - | - | | 1.5622 | 5774 | 0.1853 | - | - | | 1.5625 | 5775 | 0.1817 | - | - | | 1.5628 | 5776 | 0.2906 | - | - | | 1.5630 | 5777 | 0.2522 | - | - | | 1.5633 | 5778 | 0.1882 | - | - | | 1.5636 | 5779 | 0.1826 | - | - | | 1.5639 | 5780 | 0.2591 | - | - | | 1.5641 | 5781 | 0.1828 | - | - | | 1.5644 | 5782 | 0.1561 | - | - | | 1.5647 | 5783 | 0.2806 | - | - | | 1.5649 | 5784 | 0.2966 | - | - | | 1.5652 | 5785 | 0.1887 | - | - | | 1.5655 | 5786 | 0.1605 | - | - | | 1.5657 | 5787 | 0.1726 | - | - | | 1.5660 | 5788 | 0.2697 | - | - | | 1.5663 | 5789 | 0.1976 | - | - | | 1.5666 | 5790 | 0.1764 | - | - | | 1.5668 | 5791 | 0.2297 | - | - | | 1.5671 | 5792 | 0.2659 | - | - | | 1.5674 | 5793 | 0.2151 | - | - | | 1.5676 | 5794 | 0.1664 | - | - | | 1.5679 | 5795 | 0.3114 | - | - | | 1.5682 | 5796 | 0.2384 | - | - | | 1.5685 | 5797 | 0.2387 | - | - | | 1.5687 | 5798 | 0.2227 | - | - | | 1.5690 | 5799 | 0.1869 | - | - | | 1.5693 | 5800 | 0.1932 | - | - | | 1.5695 | 5801 | 0.298 | - | - | | 1.5698 | 5802 | 0.1852 | - | - | | 1.5701 | 5803 | 0.1725 | - | - | | 1.5703 | 5804 | 0.2377 | - | - | | 1.5706 | 5805 | 0.1853 | - | - | | 1.5709 | 5806 | 0.1947 | - | - | | 1.5712 | 5807 | 0.3128 | - | - | | 1.5714 | 5808 | 0.2036 | - | - | | 1.5717 | 5809 | 0.2427 | - | - | | 1.5720 | 5810 | 0.2277 | - | - | | 1.5722 | 5811 | 0.2449 
| - | - | | 1.5725 | 5812 | 0.2723 | - | - | | 1.5728 | 5813 | 0.3115 | - | - | | 1.5731 | 5814 | 0.2655 | - | - | | 1.5733 | 5815 | 0.1823 | - | - | | 1.5736 | 5816 | 0.236 | - | - | | 1.5739 | 5817 | 0.2131 | - | - | | 1.5741 | 5818 | 0.2687 | - | - | | 1.5744 | 5819 | 0.1882 | - | - | | 1.5747 | 5820 | 0.1774 | - | - | | 1.5749 | 5821 | 0.2733 | - | - | | 1.5752 | 5822 | 0.1519 | - | - | | 1.5755 | 5823 | 0.1721 | - | - | | 1.5758 | 5824 | 0.2119 | - | - | | 1.5760 | 5825 | 0.2362 | - | - | | 1.5763 | 5826 | 0.1575 | - | - | | 1.5766 | 5827 | 0.1819 | - | - | | 1.5768 | 5828 | 0.1981 | - | - | | 1.5771 | 5829 | 0.2519 | - | - | | 1.5774 | 5830 | 0.2369 | - | - | | 1.5777 | 5831 | 0.2152 | - | - | | 1.5779 | 5832 | 0.1947 | - | - | | 1.5782 | 5833 | 0.2859 | - | - | | 1.5785 | 5834 | 0.2267 | - | - | | 1.5787 | 5835 | 0.1779 | - | - | | 1.5790 | 5836 | 0.2361 | - | - | | 1.5793 | 5837 | 0.2322 | - | - | | 1.5795 | 5838 | 0.1774 | - | - | | 1.5798 | 5839 | 0.2611 | - | - | | 1.5801 | 5840 | 0.1935 | - | - | | 1.5804 | 5841 | 0.3059 | - | - | | 1.5806 | 5842 | 0.2166 | - | - | | 1.5809 | 5843 | 0.2336 | - | - | | 1.5812 | 5844 | 0.148 | - | - | | 1.5814 | 5845 | 0.2321 | - | - | | 1.5817 | 5846 | 0.1749 | - | - | | 1.5820 | 5847 | 0.2919 | - | - | | 1.5823 | 5848 | 0.1656 | - | - | | 1.5825 | 5849 | 0.1959 | - | - | | 1.5828 | 5850 | 0.2079 | - | - | | 1.5831 | 5851 | 0.1579 | - | - | | 1.5833 | 5852 | 0.2353 | - | - | | 1.5836 | 5853 | 0.2249 | - | - | | 1.5839 | 5854 | 0.3148 | - | - | | 1.5841 | 5855 | 0.2036 | - | - | | 1.5844 | 5856 | 0.1638 | - | - | | 1.5847 | 5857 | 0.117 | - | - | | 1.5850 | 5858 | 0.1716 | - | - | | 1.5852 | 5859 | 0.2492 | - | - | | 1.5855 | 5860 | 0.1306 | - | - | | 1.5858 | 5861 | 0.1592 | - | - | | 1.5860 | 5862 | 0.2198 | - | - | | 1.5863 | 5863 | 0.3247 | - | - | | 1.5866 | 5864 | 0.1847 | - | - | | 1.5869 | 5865 | 0.2123 | - | - | | 1.5871 | 5866 | 0.2332 | - | - | | 1.5874 | 5867 | 0.1944 | - | - | | 1.5877 | 5868 | 0.2601 | - | - 
| | 1.5879 | 5869 | 0.215 | - | - | | 1.5882 | 5870 | 0.2483 | - | - | | 1.5885 | 5871 | 0.2776 | - | - | | 1.5887 | 5872 | 0.218 | - | - | | 1.5890 | 5873 | 0.1927 | - | - | | 1.5893 | 5874 | 0.229 | - | - | | 1.5896 | 5875 | 0.2886 | - | - | | 1.5898 | 5876 | 0.2312 | - | - | | 1.5901 | 5877 | 0.2287 | - | - | | 1.5904 | 5878 | 0.1867 | - | - | | 1.5906 | 5879 | 0.2697 | - | - | | 1.5909 | 5880 | 0.2966 | - | - | | 1.5912 | 5881 | 0.197 | - | - | | 1.5915 | 5882 | 0.2262 | - | - | | 1.5917 | 5883 | 0.1997 | - | - | | 1.5920 | 5884 | 0.1794 | - | - | | 1.5923 | 5885 | 0.2869 | - | - | | 1.5925 | 5886 | 0.2338 | - | - | | 1.5928 | 5887 | 0.2015 | - | - | | 1.5931 | 5888 | 0.2373 | - | - | | 1.5933 | 5889 | 0.2519 | - | - | | 1.5936 | 5890 | 0.2094 | - | - | | 1.5939 | 5891 | 0.2352 | - | - | | 1.5942 | 5892 | 0.259 | - | - | | 1.5944 | 5893 | 0.2151 | - | - | | 1.5947 | 5894 | 0.1912 | - | - | | 1.5950 | 5895 | 0.193 | - | - | | 1.5952 | 5896 | 0.1973 | - | - | | 1.5955 | 5897 | 0.2038 | - | - | | 1.5958 | 5898 | 0.254 | - | - | | 1.5960 | 5899 | 0.255 | - | - | | 1.5963 | 5900 | 0.1476 | - | - | | 1.5966 | 5901 | 0.2964 | - | - | | 1.5969 | 5902 | 0.2257 | - | - | | 1.5971 | 5903 | 0.2599 | - | - | | 1.5974 | 5904 | 0.275 | - | - | | 1.5977 | 5905 | 0.1732 | - | - | | 1.5979 | 5906 | 0.231 | - | - | | 1.5982 | 5907 | 0.2106 | - | - | | 1.5985 | 5908 | 0.1838 | - | - | | 1.5988 | 5909 | 0.1461 | - | - | | 1.5990 | 5910 | 0.195 | - | - | | 1.5993 | 5911 | 0.2678 | - | - | | 1.5996 | 5912 | 0.2305 | - | - | | 1.5998 | 5913 | 0.2233 | - | - | | 1.6001 | 5914 | 0.2101 | - | - | | 1.6004 | 5915 | 0.2185 | - | - | | 1.6006 | 5916 | 0.2099 | - | - | | 1.6009 | 5917 | 0.2463 | - | - | | 1.6012 | 5918 | 0.2109 | - | - | | 1.6015 | 5919 | 0.208 | - | - | | 1.6017 | 5920 | 0.3242 | - | - | | 1.6020 | 5921 | 0.2048 | - | - | | 1.6023 | 5922 | 0.2457 | - | - | | 1.6025 | 5923 | 0.2338 | - | - | | 1.6028 | 5924 | 0.2931 | - | - | | 1.6031 | 5925 | 0.1429 | - | - | | 1.6034 | 
5926 | 0.2233 | - | - | | 1.6036 | 5927 | 0.2474 | - | - | | 1.6039 | 5928 | 0.1739 | - | - | | 1.6042 | 5929 | 0.3097 | - | - | | 1.6044 | 5930 | 0.2466 | - | - | | 1.6047 | 5931 | 0.2003 | - | - | | 1.6050 | 5932 | 0.1937 | - | - | | 1.6052 | 5933 | 0.2248 | - | - | | 1.6055 | 5934 | 0.2003 | - | - | | 1.6058 | 5935 | 0.297 | - | - | | 1.6061 | 5936 | 0.1763 | - | - | | 1.6063 | 5937 | 0.2173 | - | - | | 1.6066 | 5938 | 0.2491 | - | - | | 1.6069 | 5939 | 0.1941 | - | - | | 1.6071 | 5940 | 0.1517 | - | - | | 1.6074 | 5941 | 0.1914 | - | - | | 1.6077 | 5942 | 0.1425 | - | - | | 1.6080 | 5943 | 0.1705 | - | - | | 1.6082 | 5944 | 0.1764 | - | - | | 1.6085 | 5945 | 0.2717 | - | - | | 1.6088 | 5946 | 0.2621 | - | - | | 1.6090 | 5947 | 0.331 | - | - | | 1.6093 | 5948 | 0.2477 | - | - | | 1.6096 | 5949 | 0.2338 | - | - | | 1.6098 | 5950 | 0.1788 | - | - | | 1.6101 | 5951 | 0.275 | - | - | | 1.6104 | 5952 | 0.2057 | - | - | | 1.6107 | 5953 | 0.2771 | - | - | | 1.6109 | 5954 | 0.2451 | - | - | | 1.6112 | 5955 | 0.1976 | - | - | | 1.6115 | 5956 | 0.1796 | - | - | | 1.6117 | 5957 | 0.1723 | - | - | | 1.6120 | 5958 | 0.1692 | - | - | | 1.6123 | 5959 | 0.283 | - | - | | 1.6126 | 5960 | 0.2528 | - | - | | 1.6128 | 5961 | 0.2251 | - | - | | 1.6131 | 5962 | 0.2088 | - | - | | 1.6134 | 5963 | 0.2035 | - | - | | 1.6136 | 5964 | 0.1668 | - | - | | 1.6139 | 5965 | 0.1809 | - | - | | 1.6142 | 5966 | 0.1653 | - | - | | 1.6144 | 5967 | 0.2669 | - | - | | 1.6147 | 5968 | 0.2541 | - | - | | 1.6150 | 5969 | 0.2284 | - | - | | 1.6153 | 5970 | 0.3516 | - | - | | 1.6155 | 5971 | 0.2041 | - | - | | 1.6158 | 5972 | 0.1302 | - | - | | 1.6161 | 5973 | 0.2187 | - | - | | 1.6163 | 5974 | 0.244 | - | - | | 1.6166 | 5975 | 0.1345 | - | - | | 1.6169 | 5976 | 0.1559 | - | - | | 1.6172 | 5977 | 0.209 | - | - | | 1.6174 | 5978 | 0.1748 | - | - | | 1.6177 | 5979 | 0.1668 | - | - | | 1.6180 | 5980 | 0.203 | - | - | | 1.6182 | 5981 | 0.1875 | - | - | | 1.6185 | 5982 | 0.1853 | - | - | | 1.6188 | 5983 | 
0.1982 | - | - | | 1.6190 | 5984 | 0.1882 | - | - | | 1.6193 | 5985 | 0.2337 | - | - | | 1.6196 | 5986 | 0.1768 | - | - | | 1.6199 | 5987 | 0.2964 | - | - | | 1.6201 | 5988 | 0.2408 | - | - | | 1.6204 | 5989 | 0.1664 | - | - | | 1.6207 | 5990 | 0.2457 | - | - | | 1.6209 | 5991 | 0.2224 | - | - | | 1.6212 | 5992 | 0.227 | - | - | | 1.6215 | 5993 | 0.2282 | - | - | | 1.6218 | 5994 | 0.2762 | - | - | | 1.6220 | 5995 | 0.2437 | - | - | | 1.6223 | 5996 | 0.2351 | - | - | | 1.6226 | 5997 | 0.2618 | - | - | | 1.6228 | 5998 | 0.2149 | - | - | | 1.6231 | 5999 | 0.2541 | - | - | | 1.6234 | 6000 | 0.1609 | 0.2174 | 0.9451 | | 1.6236 | 6001 | 0.2411 | - | - | | 1.6239 | 6002 | 0.2476 | - | - | | 1.6242 | 6003 | 0.1894 | - | - | | 1.6245 | 6004 | 0.2072 | - | - | | 1.6247 | 6005 | 0.2353 | - | - | | 1.625 | 6006 | 0.1816 | - | - | | 1.6253 | 6007 | 0.1747 | - | - | | 1.6255 | 6008 | 0.2295 | - | - | | 1.6258 | 6009 | 0.2672 | - | - | | 1.6261 | 6010 | 0.1979 | - | - | | 1.6264 | 6011 | 0.2533 | - | - | | 1.6266 | 6012 | 0.228 | - | - | | 1.6269 | 6013 | 0.2893 | - | - | | 1.6272 | 6014 | 0.2129 | - | - | | 1.6274 | 6015 | 0.2407 | - | - | | 1.6277 | 6016 | 0.2519 | - | - | | 1.6280 | 6017 | 0.1866 | - | - | | 1.6282 | 6018 | 0.1861 | - | - | | 1.6285 | 6019 | 0.2334 | - | - | | 1.6288 | 6020 | 0.1671 | - | - | | 1.6291 | 6021 | 0.2565 | - | - | | 1.6293 | 6022 | 0.2133 | - | - | | 1.6296 | 6023 | 0.2295 | - | - | | 1.6299 | 6024 | 0.2426 | - | - | | 1.6301 | 6025 | 0.2742 | - | - | | 1.6304 | 6026 | 0.3324 | - | - | | 1.6307 | 6027 | 0.1909 | - | - | | 1.6310 | 6028 | 0.2805 | - | - | | 1.6312 | 6029 | 0.1796 | - | - | | 1.6315 | 6030 | 0.2955 | - | - | | 1.6318 | 6031 | 0.1957 | - | - | | 1.6320 | 6032 | 0.1659 | - | - | | 1.6323 | 6033 | 0.2561 | - | - | | 1.6326 | 6034 | 0.1934 | - | - | | 1.6328 | 6035 | 0.2098 | - | - | | 1.6331 | 6036 | 0.1551 | - | - | | 1.6334 | 6037 | 0.2052 | - | - | | 1.6337 | 6038 | 0.1581 | - | - | | 1.6339 | 6039 | 0.3474 | - | - | | 1.6342 | 6040 
| 0.2067 | - | - | | 1.6345 | 6041 | 0.2069 | - | - | | 1.6347 | 6042 | 0.1717 | - | - | | 1.6350 | 6043 | 0.1806 | - | - | | 1.6353 | 6044 | 0.1408 | - | - | | 1.6356 | 6045 | 0.2959 | - | - | | 1.6358 | 6046 | 0.1596 | - | - | | 1.6361 | 6047 | 0.2241 | - | - | | 1.6364 | 6048 | 0.2629 | - | - | | 1.6366 | 6049 | 0.293 | - | - | | 1.6369 | 6050 | 0.215 | - | - | | 1.6372 | 6051 | 0.2589 | - | - | | 1.6374 | 6052 | 0.245 | - | - | | 1.6377 | 6053 | 0.1618 | - | - | | 1.6380 | 6054 | 0.2221 | - | - | | 1.6383 | 6055 | 0.1682 | - | - | | 1.6385 | 6056 | 0.2922 | - | - | | 1.6388 | 6057 | 0.2009 | - | - | | 1.6391 | 6058 | 0.3134 | - | - | | 1.6393 | 6059 | 0.2411 | - | - | | 1.6396 | 6060 | 0.2147 | - | - | | 1.6399 | 6061 | 0.1446 | - | - | | 1.6402 | 6062 | 0.1637 | - | - | | 1.6404 | 6063 | 0.1821 | - | - | | 1.6407 | 6064 | 0.2652 | - | - | | 1.6410 | 6065 | 0.2791 | - | - | | 1.6412 | 6066 | 0.2427 | - | - | | 1.6415 | 6067 | 0.2083 | - | - | | 1.6418 | 6068 | 0.2014 | - | - | | 1.6420 | 6069 | 0.1864 | - | - | | 1.6423 | 6070 | 0.1981 | - | - | | 1.6426 | 6071 | 0.2863 | - | - | | 1.6429 | 6072 | 0.2777 | - | - | | 1.6431 | 6073 | 0.2511 | - | - | | 1.6434 | 6074 | 0.286 | - | - | | 1.6437 | 6075 | 0.1897 | - | - | | 1.6439 | 6076 | 0.1915 | - | - | | 1.6442 | 6077 | 0.2191 | - | - | | 1.6445 | 6078 | 0.2234 | - | - | | 1.6448 | 6079 | 0.2397 | - | - | | 1.6450 | 6080 | 0.1502 | - | - | | 1.6453 | 6081 | 0.2711 | - | - | | 1.6456 | 6082 | 0.1999 | - | - | | 1.6458 | 6083 | 0.1419 | - | - | | 1.6461 | 6084 | 0.2097 | - | - | | 1.6464 | 6085 | 0.232 | - | - | | 1.6466 | 6086 | 0.2472 | - | - | | 1.6469 | 6087 | 0.243 | - | - | | 1.6472 | 6088 | 0.2228 | - | - | | 1.6475 | 6089 | 0.2536 | - | - | | 1.6477 | 6090 | 0.1542 | - | - | | 1.6480 | 6091 | 0.116 | - | - | | 1.6483 | 6092 | 0.2729 | - | - | | 1.6485 | 6093 | 0.2117 | - | - | | 1.6488 | 6094 | 0.2158 | - | - | | 1.6491 | 6095 | 0.2259 | - | - | | 1.6494 | 6096 | 0.24 | - | - | | 1.6496 | 6097 | 0.183 | - | 
- | | 1.6499 | 6098 | 0.2265 | - | - | | 1.6502 | 6099 | 0.1786 | - | - | | 1.6504 | 6100 | 0.3218 | - | - | | 1.6507 | 6101 | 0.2085 | - | - | | 1.6510 | 6102 | 0.2925 | - | - | | 1.6512 | 6103 | 0.2268 | - | - | | 1.6515 | 6104 | 0.196 | - | - | | 1.6518 | 6105 | 0.1748 | - | - | | 1.6521 | 6106 | 0.1492 | - | - | | 1.6523 | 6107 | 0.1414 | - | - | | 1.6526 | 6108 | 0.174 | - | - | | 1.6529 | 6109 | 0.2092 | - | - | | 1.6531 | 6110 | 0.1791 | - | - | | 1.6534 | 6111 | 0.3159 | - | - | | 1.6537 | 6112 | 0.2336 | - | - | | 1.6540 | 6113 | 0.2654 | - | - | | 1.6542 | 6114 | 0.2069 | - | - | | 1.6545 | 6115 | 0.2215 | - | - | | 1.6548 | 6116 | 0.2207 | - | - | | 1.6550 | 6117 | 0.3037 | - | - | | 1.6553 | 6118 | 0.2024 | - | - | | 1.6556 | 6119 | 0.2056 | - | - | | 1.6558 | 6120 | 0.2106 | - | - | | 1.6561 | 6121 | 0.1572 | - | - | | 1.6564 | 6122 | 0.1802 | - | - | | 1.6567 | 6123 | 0.2297 | - | - | | 1.6569 | 6124 | 0.171 | - | - | | 1.6572 | 6125 | 0.1439 | - | - | | 1.6575 | 6126 | 0.186 | - | - | | 1.6577 | 6127 | 0.2059 | - | - | | 1.6580 | 6128 | 0.2026 | - | - | | 1.6583 | 6129 | 0.2013 | - | - | | 1.6585 | 6130 | 0.2324 | - | - | | 1.6588 | 6131 | 0.2637 | - | - | | 1.6591 | 6132 | 0.1995 | - | - | | 1.6594 | 6133 | 0.1653 | - | - | | 1.6596 | 6134 | 0.1642 | - | - | | 1.6599 | 6135 | 0.2436 | - | - | | 1.6602 | 6136 | 0.2361 | - | - | | 1.6604 | 6137 | 0.2513 | - | - | | 1.6607 | 6138 | 0.1338 | - | - | | 1.6610 | 6139 | 0.2062 | - | - | | 1.6613 | 6140 | 0.2115 | - | - | | 1.6615 | 6141 | 0.2588 | - | - | | 1.6618 | 6142 | 0.2023 | - | - | | 1.6621 | 6143 | 0.2101 | - | - | | 1.6623 | 6144 | 0.224 | - | - | | 1.6626 | 6145 | 0.2074 | - | - | | 1.6629 | 6146 | 0.1879 | - | - | | 1.6631 | 6147 | 0.2345 | - | - | | 1.6634 | 6148 | 0.2077 | - | - | | 1.6637 | 6149 | 0.1633 | - | - | | 1.6640 | 6150 | 0.2096 | - | - | | 1.6642 | 6151 | 0.2024 | - | - | | 1.6645 | 6152 | 0.1498 | - | - | | 1.6648 | 6153 | 0.2654 | - | - | | 1.6650 | 6154 | 0.2271 | - | - | | 
1.6653 | 6155 | 0.1861 | - | - | | 1.6656 | 6156 | 0.2086 | - | - | | 1.6659 | 6157 | 0.2026 | - | - | | 1.6661 | 6158 | 0.2126 | - | - | | 1.6664 | 6159 | 0.2615 | - | - | | 1.6667 | 6160 | 0.2752 | - | - | | 1.6669 | 6161 | 0.2051 | - | - | | 1.6672 | 6162 | 0.2486 | - | - | | 1.6675 | 6163 | 0.187 | - | - | | 1.6677 | 6164 | 0.2253 | - | - | | 1.6680 | 6165 | 0.2856 | - | - | | 1.6683 | 6166 | 0.3105 | - | - | | 1.6686 | 6167 | 0.1932 | - | - | | 1.6688 | 6168 | 0.1999 | - | - | | 1.6691 | 6169 | 0.1768 | - | - | | 1.6694 | 6170 | 0.2508 | - | - | | 1.6696 | 6171 | 0.1619 | - | - | | 1.6699 | 6172 | 0.2124 | - | - | | 1.6702 | 6173 | 0.2468 | - | - | | 1.6705 | 6174 | 0.2491 | - | - | | 1.6707 | 6175 | 0.2259 | - | - | | 1.6710 | 6176 | 0.2411 | - | - | | 1.6713 | 6177 | 0.159 | - | - | | 1.6715 | 6178 | 0.2822 | - | - | | 1.6718 | 6179 | 0.1935 | - | - | | 1.6721 | 6180 | 0.1813 | - | - | | 1.6723 | 6181 | 0.1918 | - | - | | 1.6726 | 6182 | 0.219 | - | - | | 1.6729 | 6183 | 0.1614 | - | - | | 1.6732 | 6184 | 0.2273 | - | - | | 1.6734 | 6185 | 0.2038 | - | - | | 1.6737 | 6186 | 0.3231 | - | - | | 1.6740 | 6187 | 0.2189 | - | - | | 1.6742 | 6188 | 0.2574 | - | - | | 1.6745 | 6189 | 0.1949 | - | - | | 1.6748 | 6190 | 0.1559 | - | - | | 1.6751 | 6191 | 0.2053 | - | - | | 1.6753 | 6192 | 0.3227 | - | - | | 1.6756 | 6193 | 0.3071 | - | - | | 1.6759 | 6194 | 0.1738 | - | - | | 1.6761 | 6195 | 0.2066 | - | - | | 1.6764 | 6196 | 0.282 | - | - | | 1.6767 | 6197 | 0.2116 | - | - | | 1.6769 | 6198 | 0.2445 | - | - | | 1.6772 | 6199 | 0.1761 | - | - | | 1.6775 | 6200 | 0.1794 | - | - | | 1.6778 | 6201 | 0.1573 | - | - | | 1.6780 | 6202 | 0.2023 | - | - | | 1.6783 | 6203 | 0.2739 | - | - | | 1.6786 | 6204 | 0.2093 | - | - | | 1.6788 | 6205 | 0.2238 | - | - | | 1.6791 | 6206 | 0.3184 | - | - | | 1.6794 | 6207 | 0.2156 | - | - | | 1.6797 | 6208 | 0.2289 | - | - | | 1.6799 | 6209 | 0.2812 | - | - | | 1.6802 | 6210 | 0.1912 | - | - | | 1.6805 | 6211 | 0.1199 | - | - | | 1.6807 | 
6212 | 0.2143 | - | - | | 1.6810 | 6213 | 0.2726 | - | - | | 1.6813 | 6214 | 0.1992 | - | - | | 1.6815 | 6215 | 0.1996 | - | - | | 1.6818 | 6216 | 0.203 | - | - | | 1.6821 | 6217 | 0.2573 | - | - | | 1.6824 | 6218 | 0.2288 | - | - | | 1.6826 | 6219 | 0.3099 | - | - | | 1.6829 | 6220 | 0.2328 | - | - | | 1.6832 | 6221 | 0.2272 | - | - | | 1.6834 | 6222 | 0.1634 | - | - | | 1.6837 | 6223 | 0.1764 | - | - | | 1.6840 | 6224 | 0.1415 | - | - | | 1.6843 | 6225 | 0.3089 | - | - | | 1.6845 | 6226 | 0.2124 | - | - | | 1.6848 | 6227 | 0.2177 | - | - | | 1.6851 | 6228 | 0.1281 | - | - | | 1.6853 | 6229 | 0.2537 | - | - | | 1.6856 | 6230 | 0.2662 | - | - | | 1.6859 | 6231 | 0.2094 | - | - | | 1.6861 | 6232 | 0.2282 | - | - | | 1.6864 | 6233 | 0.2247 | - | - | | 1.6867 | 6234 | 0.2186 | - | - | | 1.6870 | 6235 | 0.3039 | - | - | | 1.6872 | 6236 | 0.2497 | - | - | | 1.6875 | 6237 | 0.2603 | - | - | | 1.6878 | 6238 | 0.1845 | - | - | | 1.6880 | 6239 | 0.2017 | - | - | | 1.6883 | 6240 | 0.2678 | - | - | | 1.6886 | 6241 | 0.2565 | - | - | | 1.6889 | 6242 | 0.2073 | - | - | | 1.6891 | 6243 | 0.2361 | - | - | | 1.6894 | 6244 | 0.2571 | - | - | | 1.6897 | 6245 | 0.224 | - | - | | 1.6899 | 6246 | 0.1755 | - | - | | 1.6902 | 6247 | 0.2277 | - | - | | 1.6905 | 6248 | 0.185 | - | - | | 1.6907 | 6249 | 0.1666 | - | - | | 1.6910 | 6250 | 0.2026 | - | - | | 1.6913 | 6251 | 0.1618 | - | - | | 1.6916 | 6252 | 0.2122 | - | - | | 1.6918 | 6253 | 0.1848 | - | - | | 1.6921 | 6254 | 0.2235 | - | - | | 1.6924 | 6255 | 0.2279 | - | - | | 1.6926 | 6256 | 0.1412 | - | - | | 1.6929 | 6257 | 0.1361 | - | - | | 1.6932 | 6258 | 0.241 | - | - | | 1.6935 | 6259 | 0.1708 | - | - | | 1.6937 | 6260 | 0.3052 | - | - | | 1.6940 | 6261 | 0.2259 | - | - | | 1.6943 | 6262 | 0.2076 | - | - | | 1.6945 | 6263 | 0.17 | - | - | | 1.6948 | 6264 | 0.2687 | - | - | | 1.6951 | 6265 | 0.2875 | - | - | | 1.6953 | 6266 | 0.1831 | - | - | | 1.6956 | 6267 | 0.2235 | - | - | | 1.6959 | 6268 | 0.1535 | - | - | | 1.6962 | 6269 | 
0.1922 | - | - |
| 1.6964 | 6270 | 0.1916 | - | - |
| ... | ... | ... | ... | ... |
| 1.7045 | 6300 | 0.155 | - | - |
| ... | ... | ... | ... | ... |
| 1.7316 | 6400 | 0.224 | - | - |
| ... | ... | ... | ... | ... |
| 1.7587 | 6500 | 0.1895 | - | - |
| ... | ... | ... | ... | ... |
| 1.7857 | 6600 | 0.2407 | - | - |
| ... | ... | ... | ... | ... |
| 1.8128 | 6700 | 0.1693 | - | - |
| ... | ... | ... | ... | ... |
| 1.8398 | 6800 | 0.2309 | - | - |
| ... | ... | ... | ... | ... |
| 1.8669 | 6900 | 0.1445 | - | - |
| ... | ... | ... | ... | ... |
| 1.8939 | 7000 | 0.1718 | 0.2104 | 0.9469 |
| ... | ... | ... | ... | ... |
| 1.9210 | 7100 | 0.213 | - | - |
| ... | ... | ... | ... | ... |
| 1.9481 | 7200 | 0.1644 | - | - |
| ... | ... | ... | ... | ... |
| 1.9751 | 7300 | 0.205 | - | - |
| ... | ... | ... | ... | ... |
| 2.0022 | 7400 | 0.234 | - | - |
| ... | ... | ... | ... | ... |
| 2.0057 | 7413 | 0.1273 | - | - |
| 2.0060 | 7414
| 0.1849 | - | - | | 2.0062 | 7415 | 0.1507 | - | - | | 2.0065 | 7416 | 0.2996 | - | - | | 2.0068 | 7417 | 0.2342 | - | - | | 2.0070 | 7418 | 0.1282 | - | - | | 2.0073 | 7419 | 0.3023 | - | - | | 2.0076 | 7420 | 0.163 | - | - | | 2.0078 | 7421 | 0.1487 | - | - | | 2.0081 | 7422 | 0.1786 | - | - | | 2.0084 | 7423 | 0.1485 | - | - | | 2.0087 | 7424 | 0.163 | - | - | | 2.0089 | 7425 | 0.2129 | - | - | | 2.0092 | 7426 | 0.1874 | - | - | | 2.0095 | 7427 | 0.2214 | - | - | | 2.0097 | 7428 | 0.0933 | - | - | | 2.0100 | 7429 | 0.1319 | - | - | | 2.0103 | 7430 | 0.172 | - | - | | 2.0106 | 7431 | 0.1725 | - | - | | 2.0108 | 7432 | 0.1779 | - | - | | 2.0111 | 7433 | 0.1495 | - | - | | 2.0114 | 7434 | 0.1349 | - | - | | 2.0116 | 7435 | 0.1931 | - | - | | 2.0119 | 7436 | 0.1951 | - | - | | 2.0122 | 7437 | 0.241 | - | - | | 2.0124 | 7438 | 0.1822 | - | - | | 2.0127 | 7439 | 0.1796 | - | - | | 2.0130 | 7440 | 0.168 | - | - | | 2.0133 | 7441 | 0.1713 | - | - | | 2.0135 | 7442 | 0.1322 | - | - | | 2.0138 | 7443 | 0.1835 | - | - | | 2.0141 | 7444 | 0.1451 | - | - | | 2.0143 | 7445 | 0.188 | - | - | | 2.0146 | 7446 | 0.1409 | - | - | | 2.0149 | 7447 | 0.274 | - | - | | 2.0152 | 7448 | 0.1273 | - | - | | 2.0154 | 7449 | 0.2019 | - | - | | 2.0157 | 7450 | 0.1376 | - | - | | 2.0160 | 7451 | 0.1705 | - | - | | 2.0162 | 7452 | 0.2296 | - | - | | 2.0165 | 7453 | 0.2735 | - | - | | 2.0168 | 7454 | 0.211 | - | - | | 2.0170 | 7455 | 0.1766 | - | - | | 2.0173 | 7456 | 0.1769 | - | - | | 2.0176 | 7457 | 0.1469 | - | - | | 2.0179 | 7458 | 0.1816 | - | - | | 2.0181 | 7459 | 0.1507 | - | - | | 2.0184 | 7460 | 0.2556 | - | - | | 2.0187 | 7461 | 0.1833 | - | - | | 2.0189 | 7462 | 0.1786 | - | - | | 2.0192 | 7463 | 0.1554 | - | - | | 2.0195 | 7464 | 0.1249 | - | - | | 2.0198 | 7465 | 0.2119 | - | - | | 2.0200 | 7466 | 0.1133 | - | - | | 2.0203 | 7467 | 0.2365 | - | - | | 2.0206 | 7468 | 0.1562 | - | - | | 2.0208 | 7469 | 0.1824 | - | - | | 2.0211 | 7470 | 0.1773 | - | - | | 2.0214 | 7471 | 0.1545 | - 
| - | | 2.0216 | 7472 | 0.1709 | - | - | | 2.0219 | 7473 | 0.1474 | - | - | | 2.0222 | 7474 | 0.2103 | - | - | | 2.0225 | 7475 | 0.1462 | - | - | | 2.0227 | 7476 | 0.1851 | - | - | | 2.0230 | 7477 | 0.2381 | - | - | | 2.0233 | 7478 | 0.2224 | - | - | | 2.0235 | 7479 | 0.2066 | - | - | | 2.0238 | 7480 | 0.203 | - | - | | 2.0241 | 7481 | 0.1233 | - | - | | 2.0244 | 7482 | 0.2172 | - | - | | 2.0246 | 7483 | 0.1615 | - | - | | 2.0249 | 7484 | 0.1564 | - | - | | 2.0252 | 7485 | 0.206 | - | - | | 2.0254 | 7486 | 0.1565 | - | - | | 2.0257 | 7487 | 0.1652 | - | - | | 2.0260 | 7488 | 0.1697 | - | - | | 2.0262 | 7489 | 0.1208 | - | - | | 2.0265 | 7490 | 0.1115 | - | - | | 2.0268 | 7491 | 0.1502 | - | - | | 2.0271 | 7492 | 0.1997 | - | - | | 2.0273 | 7493 | 0.2195 | - | - | | 2.0276 | 7494 | 0.2278 | - | - | | 2.0279 | 7495 | 0.2303 | - | - | | 2.0281 | 7496 | 0.2126 | - | - | | 2.0284 | 7497 | 0.1916 | - | - | | 2.0287 | 7498 | 0.2102 | - | - | | 2.0290 | 7499 | 0.16 | - | - | | 2.0292 | 7500 | 0.1611 | - | - | | 2.0295 | 7501 | 0.1753 | - | - | | 2.0298 | 7502 | 0.2305 | - | - | | 2.0300 | 7503 | 0.1883 | - | - | | 2.0303 | 7504 | 0.2511 | - | - | | 2.0306 | 7505 | 0.2119 | - | - | | 2.0308 | 7506 | 0.2575 | - | - | | 2.0311 | 7507 | 0.2391 | - | - | | 2.0314 | 7508 | 0.1724 | - | - | | 2.0317 | 7509 | 0.1896 | - | - | | 2.0319 | 7510 | 0.2027 | - | - | | 2.0322 | 7511 | 0.1539 | - | - | | 2.0325 | 7512 | 0.1742 | - | - | | 2.0327 | 7513 | 0.1623 | - | - | | 2.0330 | 7514 | 0.139 | - | - | | 2.0333 | 7515 | 0.2794 | - | - | | 2.0335 | 7516 | 0.1501 | - | - | | 2.0338 | 7517 | 0.2285 | - | - | | 2.0341 | 7518 | 0.1819 | - | - | | 2.0344 | 7519 | 0.1841 | - | - | | 2.0346 | 7520 | 0.1474 | - | - | | 2.0349 | 7521 | 0.2103 | - | - | | 2.0352 | 7522 | 0.2881 | - | - | | 2.0354 | 7523 | 0.1859 | - | - | | 2.0357 | 7524 | 0.1789 | - | - | | 2.0360 | 7525 | 0.1787 | - | - | | 2.0363 | 7526 | 0.2041 | - | - | | 2.0365 | 7527 | 0.183 | - | - | | 2.0368 | 7528 | 0.1571 | - | - | | 
2.0371 | 7529 | 0.2029 | - | - | | 2.0373 | 7530 | 0.2246 | - | - | | 2.0376 | 7531 | 0.1663 | - | - | | 2.0379 | 7532 | 0.1312 | - | - | | 2.0381 | 7533 | 0.2372 | - | - | | 2.0384 | 7534 | 0.2237 | - | - | | 2.0387 | 7535 | 0.3542 | - | - | | 2.0390 | 7536 | 0.187 | - | - | | 2.0392 | 7537 | 0.1695 | - | - | | 2.0395 | 7538 | 0.2579 | - | - | | 2.0398 | 7539 | 0.1638 | - | - | | 2.0400 | 7540 | 0.2131 | - | - | | 2.0403 | 7541 | 0.1923 | - | - | | 2.0406 | 7542 | 0.1354 | - | - | | 2.0409 | 7543 | 0.2152 | - | - | | 2.0411 | 7544 | 0.2227 | - | - | | 2.0414 | 7545 | 0.1933 | - | - | | 2.0417 | 7546 | 0.2017 | - | - | | 2.0419 | 7547 | 0.1549 | - | - | | 2.0422 | 7548 | 0.2034 | - | - | | 2.0425 | 7549 | 0.1894 | - | - | | 2.0427 | 7550 | 0.1781 | - | - | | 2.0430 | 7551 | 0.289 | - | - | | 2.0433 | 7552 | 0.1747 | - | - | | 2.0436 | 7553 | 0.2116 | - | - | | 2.0438 | 7554 | 0.1228 | - | - | | 2.0441 | 7555 | 0.2503 | - | - | | 2.0444 | 7556 | 0.1931 | - | - | | 2.0446 | 7557 | 0.1474 | - | - | | 2.0449 | 7558 | 0.1611 | - | - | | 2.0452 | 7559 | 0.2118 | - | - | | 2.0455 | 7560 | 0.2504 | - | - | | 2.0457 | 7561 | 0.2337 | - | - | | 2.0460 | 7562 | 0.1658 | - | - | | 2.0463 | 7563 | 0.1459 | - | - | | 2.0465 | 7564 | 0.137 | - | - | | 2.0468 | 7565 | 0.2051 | - | - | | 2.0471 | 7566 | 0.1953 | - | - | | 2.0473 | 7567 | 0.2006 | - | - | | 2.0476 | 7568 | 0.1853 | - | - | | 2.0479 | 7569 | 0.2068 | - | - | | 2.0482 | 7570 | 0.1863 | - | - | | 2.0484 | 7571 | 0.168 | - | - | | 2.0487 | 7572 | 0.222 | - | - | | 2.0490 | 7573 | 0.2002 | - | - | | 2.0492 | 7574 | 0.1898 | - | - | | 2.0495 | 7575 | 0.1798 | - | - | | 2.0498 | 7576 | 0.1918 | - | - | | 2.0501 | 7577 | 0.1863 | - | - | | 2.0503 | 7578 | 0.1565 | - | - | | 2.0506 | 7579 | 0.1897 | - | - | | 2.0509 | 7580 | 0.1694 | - | - | | 2.0511 | 7581 | 0.2002 | - | - | | 2.0514 | 7582 | 0.1676 | - | - | | 2.0517 | 7583 | 0.1838 | - | - | | 2.0519 | 7584 | 0.1815 | - | - | | 2.0522 | 7585 | 0.1751 | - | - | | 2.0525 | 
7586 | 0.1686 | - | - | | 2.0528 | 7587 | 0.2176 | - | - | | 2.0530 | 7588 | 0.2293 | - | - | | 2.0533 | 7589 | 0.2333 | - | - | | 2.0536 | 7590 | 0.1519 | - | - | | 2.0538 | 7591 | 0.2024 | - | - | | 2.0541 | 7592 | 0.2446 | - | - | | 2.0544 | 7593 | 0.148 | - | - | | 2.0547 | 7594 | 0.1636 | - | - | | 2.0549 | 7595 | 0.2338 | - | - | | 2.0552 | 7596 | 0.1804 | - | - | | 2.0555 | 7597 | 0.1999 | - | - | | 2.0557 | 7598 | 0.2213 | - | - | | 2.0560 | 7599 | 0.1995 | - | - | | 2.0563 | 7600 | 0.1911 | - | - | | 2.0565 | 7601 | 0.1731 | - | - | | 2.0568 | 7602 | 0.1418 | - | - | | 2.0571 | 7603 | 0.2386 | - | - | | 2.0574 | 7604 | 0.2797 | - | - | | 2.0576 | 7605 | 0.1914 | - | - | | 2.0579 | 7606 | 0.2347 | - | - | | 2.0582 | 7607 | 0.1689 | - | - | | 2.0584 | 7608 | 0.2443 | - | - | | 2.0587 | 7609 | 0.257 | - | - | | 2.0590 | 7610 | 0.1694 | - | - | | 2.0593 | 7611 | 0.1306 | - | - | | 2.0595 | 7612 | 0.1453 | - | - | | 2.0598 | 7613 | 0.1693 | - | - | | 2.0601 | 7614 | 0.2181 | - | - | | 2.0603 | 7615 | 0.2003 | - | - | | 2.0606 | 7616 | 0.1437 | - | - | | 2.0609 | 7617 | 0.1896 | - | - | | 2.0611 | 7618 | 0.1635 | - | - | | 2.0614 | 7619 | 0.179 | - | - | | 2.0617 | 7620 | 0.2573 | - | - | | 2.0620 | 7621 | 0.1806 | - | - | | 2.0622 | 7622 | 0.1457 | - | - | | 2.0625 | 7623 | 0.2227 | - | - | | 2.0628 | 7624 | 0.1555 | - | - | | 2.0630 | 7625 | 0.125 | - | - | | 2.0633 | 7626 | 0.1736 | - | - | | 2.0636 | 7627 | 0.1513 | - | - | | 2.0639 | 7628 | 0.1519 | - | - | | 2.0641 | 7629 | 0.194 | - | - | | 2.0644 | 7630 | 0.1952 | - | - | | 2.0647 | 7631 | 0.2201 | - | - | | 2.0649 | 7632 | 0.1918 | - | - | | 2.0652 | 7633 | 0.3138 | - | - | | 2.0655 | 7634 | 0.1791 | - | - | | 2.0657 | 7635 | 0.1623 | - | - | | 2.0660 | 7636 | 0.2262 | - | - | | 2.0663 | 7637 | 0.1738 | - | - | | 2.0666 | 7638 | 0.1874 | - | - | | 2.0668 | 7639 | 0.2213 | - | - | | 2.0671 | 7640 | 0.2271 | - | - | | 2.0674 | 7641 | 0.178 | - | - | | 2.0676 | 7642 | 0.1912 | - | - | | 2.0679 | 7643 | 
0.2311 | - | - | | 2.0682 | 7644 | 0.1651 | - | - | | 2.0685 | 7645 | 0.1847 | - | - | | 2.0687 | 7646 | 0.1581 | - | - | | 2.0690 | 7647 | 0.1536 | - | - | | 2.0693 | 7648 | 0.16 | - | - | | 2.0695 | 7649 | 0.2157 | - | - | | 2.0698 | 7650 | 0.2169 | - | - | | 2.0701 | 7651 | 0.207 | - | - | | 2.0703 | 7652 | 0.1838 | - | - | | 2.0706 | 7653 | 0.1426 | - | - | | 2.0709 | 7654 | 0.156 | - | - | | 2.0712 | 7655 | 0.192 | - | - | | 2.0714 | 7656 | 0.1603 | - | - | | 2.0717 | 7657 | 0.2335 | - | - | | 2.0720 | 7658 | 0.1666 | - | - | | 2.0722 | 7659 | 0.2276 | - | - | | 2.0725 | 7660 | 0.1748 | - | - | | 2.0728 | 7661 | 0.2399 | - | - | | 2.0731 | 7662 | 0.1901 | - | - | | 2.0733 | 7663 | 0.1656 | - | - | | 2.0736 | 7664 | 0.1987 | - | - | | 2.0739 | 7665 | 0.2042 | - | - | | 2.0741 | 7666 | 0.1383 | - | - | | 2.0744 | 7667 | 0.2472 | - | - | | 2.0747 | 7668 | 0.1461 | - | - | | 2.0749 | 7669 | 0.1588 | - | - | | 2.0752 | 7670 | 0.1103 | - | - | | 2.0755 | 7671 | 0.1839 | - | - | | 2.0758 | 7672 | 0.1953 | - | - | | 2.0760 | 7673 | 0.1844 | - | - | | 2.0763 | 7674 | 0.2378 | - | - | | 2.0766 | 7675 | 0.171 | - | - | | 2.0768 | 7676 | 0.1929 | - | - | | 2.0771 | 7677 | 0.1701 | - | - | | 2.0774 | 7678 | 0.1773 | - | - | | 2.0777 | 7679 | 0.1906 | - | - | | 2.0779 | 7680 | 0.1992 | - | - | | 2.0782 | 7681 | 0.1658 | - | - | | 2.0785 | 7682 | 0.1579 | - | - | | 2.0787 | 7683 | 0.2029 | - | - | | 2.0790 | 7684 | 0.1263 | - | - | | 2.0793 | 7685 | 0.1673 | - | - | | 2.0795 | 7686 | 0.2635 | - | - | | 2.0798 | 7687 | 0.1059 | - | - | | 2.0801 | 7688 | 0.1731 | - | - | | 2.0804 | 7689 | 0.2037 | - | - | | 2.0806 | 7690 | 0.2362 | - | - | | 2.0809 | 7691 | 0.1974 | - | - | | 2.0812 | 7692 | 0.1703 | - | - | | 2.0814 | 7693 | 0.2159 | - | - | | 2.0817 | 7694 | 0.2015 | - | - | | 2.0820 | 7695 | 0.2134 | - | - | | 2.0823 | 7696 | 0.239 | - | - | | 2.0825 | 7697 | 0.1696 | - | - | | 2.0828 | 7698 | 0.1556 | - | - | | 2.0831 | 7699 | 0.2646 | - | - | | 2.0833 | 7700 | 0.1666 | - 
| - | | 2.0836 | 7701 | 0.2086 | - | - | | 2.0839 | 7702 | 0.1494 | - | - | | 2.0841 | 7703 | 0.1325 | - | - | | 2.0844 | 7704 | 0.1988 | - | - | | 2.0847 | 7705 | 0.1279 | - | - | | 2.0850 | 7706 | 0.2096 | - | - | | 2.0852 | 7707 | 0.1517 | - | - | | 2.0855 | 7708 | 0.1543 | - | - | | 2.0858 | 7709 | 0.1039 | - | - | | 2.0860 | 7710 | 0.2433 | - | - | | 2.0863 | 7711 | 0.1557 | - | - | | 2.0866 | 7712 | 0.1585 | - | - | | 2.0869 | 7713 | 0.1231 | - | - | | 2.0871 | 7714 | 0.2463 | - | - | | 2.0874 | 7715 | 0.2444 | - | - | | 2.0877 | 7716 | 0.1688 | - | - | | 2.0879 | 7717 | 0.2208 | - | - | | 2.0882 | 7718 | 0.1537 | - | - | | 2.0885 | 7719 | 0.1848 | - | - | | 2.0887 | 7720 | 0.2125 | - | - | | 2.0890 | 7721 | 0.1792 | - | - | | 2.0893 | 7722 | 0.2205 | - | - | | 2.0896 | 7723 | 0.1922 | - | - | | 2.0898 | 7724 | 0.1966 | - | - | | 2.0901 | 7725 | 0.1602 | - | - | | 2.0904 | 7726 | 0.0856 | - | - | | 2.0906 | 7727 | 0.1653 | - | - | | 2.0909 | 7728 | 0.238 | - | - | | 2.0912 | 7729 | 0.1922 | - | - | | 2.0915 | 7730 | 0.2043 | - | - | | 2.0917 | 7731 | 0.1495 | - | - | | 2.0920 | 7732 | 0.1737 | - | - | | 2.0923 | 7733 | 0.2128 | - | - | | 2.0925 | 7734 | 0.1939 | - | - | | 2.0928 | 7735 | 0.164 | - | - | | 2.0931 | 7736 | 0.154 | - | - | | 2.0933 | 7737 | 0.2128 | - | - | | 2.0936 | 7738 | 0.1279 | - | - | | 2.0939 | 7739 | 0.1535 | - | - | | 2.0942 | 7740 | 0.1653 | - | - | | 2.0944 | 7741 | 0.1619 | - | - | | 2.0947 | 7742 | 0.1776 | - | - | | 2.0950 | 7743 | 0.1993 | - | - | | 2.0952 | 7744 | 0.207 | - | - | | 2.0955 | 7745 | 0.1258 | - | - | | 2.0958 | 7746 | 0.2008 | - | - | | 2.0960 | 7747 | 0.2042 | - | - | | 2.0963 | 7748 | 0.1936 | - | - | | 2.0966 | 7749 | 0.1915 | - | - | | 2.0969 | 7750 | 0.1906 | - | - | | 2.0971 | 7751 | 0.2062 | - | - | | 2.0974 | 7752 | 0.1491 | - | - | | 2.0977 | 7753 | 0.2264 | - | - | | 2.0979 | 7754 | 0.2102 | - | - | | 2.0982 | 7755 | 0.1766 | - | - | | 2.0985 | 7756 | 0.1269 | - | - | | 2.0988 | 7757 | 0.2052 | - | - | | 
2.0990 | 7758 | 0.2494 | - | - | | 2.0993 | 7759 | 0.177 | - | - | | 2.0996 | 7760 | 0.1213 | - | - | | 2.0998 | 7761 | 0.2039 | - | - | | 2.1001 | 7762 | 0.1929 | - | - | | 2.1004 | 7763 | 0.1988 | - | - | | 2.1006 | 7764 | 0.174 | - | - | | 2.1009 | 7765 | 0.2357 | - | - | | 2.1012 | 7766 | 0.1222 | - | - | | 2.1015 | 7767 | 0.1754 | - | - | | 2.1017 | 7768 | 0.1441 | - | - | | 2.1020 | 7769 | 0.3095 | - | - | | 2.1023 | 7770 | 0.1809 | - | - | | 2.1025 | 7771 | 0.1811 | - | - | | 2.1028 | 7772 | 0.1856 | - | - | | 2.1031 | 7773 | 0.1887 | - | - | | 2.1034 | 7774 | 0.2536 | - | - | | 2.1036 | 7775 | 0.1286 | - | - | | 2.1039 | 7776 | 0.1636 | - | - | | 2.1042 | 7777 | 0.1581 | - | - | | 2.1044 | 7778 | 0.1635 | - | - | | 2.1047 | 7779 | 0.2378 | - | - | | 2.1050 | 7780 | 0.1374 | - | - | | 2.1052 | 7781 | 0.2322 | - | - | | 2.1055 | 7782 | 0.1521 | - | - | | 2.1058 | 7783 | 0.2067 | - | - | | 2.1061 | 7784 | 0.2142 | - | - | | 2.1063 | 7785 | 0.2368 | - | - | | 2.1066 | 7786 | 0.1884 | - | - | | 2.1069 | 7787 | 0.1675 | - | - | | 2.1071 | 7788 | 0.1342 | - | - | | 2.1074 | 7789 | 0.1568 | - | - | | 2.1077 | 7790 | 0.1797 | - | - | | 2.1080 | 7791 | 0.1834 | - | - | | 2.1082 | 7792 | 0.1575 | - | - | | 2.1085 | 7793 | 0.1506 | - | - | | 2.1088 | 7794 | 0.1745 | - | - | | 2.1090 | 7795 | 0.1696 | - | - | | 2.1093 | 7796 | 0.2199 | - | - | | 2.1096 | 7797 | 0.1602 | - | - | | 2.1098 | 7798 | 0.2076 | - | - | | 2.1101 | 7799 | 0.1896 | - | - | | 2.1104 | 7800 | 0.2284 | - | - | | 2.1107 | 7801 | 0.1539 | - | - | | 2.1109 | 7802 | 0.202 | - | - | | 2.1112 | 7803 | 0.2315 | - | - | | 2.1115 | 7804 | 0.2765 | - | - | | 2.1117 | 7805 | 0.1961 | - | - | | 2.1120 | 7806 | 0.1935 | - | - | | 2.1123 | 7807 | 0.1756 | - | - | | 2.1126 | 7808 | 0.2705 | - | - | | 2.1128 | 7809 | 0.1806 | - | - | | 2.1131 | 7810 | 0.1489 | - | - | | 2.1134 | 7811 | 0.2088 | - | - | | 2.1136 | 7812 | 0.2655 | - | - | | 2.1139 | 7813 | 0.1534 | - | - | | 2.1142 | 7814 | 0.1941 | - | - | | 2.1144 
| 7815 | 0.2017 | - | - | | 2.1147 | 7816 | 0.2019 | - | - | | 2.1150 | 7817 | 0.2093 | - | - | | 2.1153 | 7818 | 0.172 | - | - | | 2.1155 | 7819 | 0.1484 | - | - | | 2.1158 | 7820 | 0.1984 | - | - | | 2.1161 | 7821 | 0.1693 | - | - | | 2.1163 | 7822 | 0.1561 | - | - | | 2.1166 | 7823 | 0.2133 | - | - | | 2.1169 | 7824 | 0.1538 | - | - | | 2.1172 | 7825 | 0.1983 | - | - | | 2.1174 | 7826 | 0.2432 | - | - | | 2.1177 | 7827 | 0.2463 | - | - | | 2.1180 | 7828 | 0.1411 | - | - | | 2.1182 | 7829 | 0.1697 | - | - | | 2.1185 | 7830 | 0.2996 | - | - | | 2.1188 | 7831 | 0.1802 | - | - | | 2.1190 | 7832 | 0.1475 | - | - | | 2.1193 | 7833 | 0.1728 | - | - | | 2.1196 | 7834 | 0.1779 | - | - | | 2.1199 | 7835 | 0.1865 | - | - | | 2.1201 | 7836 | 0.1547 | - | - | | 2.1204 | 7837 | 0.2878 | - | - | | 2.1207 | 7838 | 0.2145 | - | - | | 2.1209 | 7839 | 0.2249 | - | - | | 2.1212 | 7840 | 0.2063 | - | - | | 2.1215 | 7841 | 0.2003 | - | - | | 2.1218 | 7842 | 0.2198 | - | - | | 2.1220 | 7843 | 0.1861 | - | - | | 2.1223 | 7844 | 0.1742 | - | - | | 2.1226 | 7845 | 0.1875 | - | - | | 2.1228 | 7846 | 0.149 | - | - | | 2.1231 | 7847 | 0.2015 | - | - | | 2.1234 | 7848 | 0.1464 | - | - | | 2.1236 | 7849 | 0.2197 | - | - | | 2.1239 | 7850 | 0.155 | - | - | | 2.1242 | 7851 | 0.1612 | - | - | | 2.1245 | 7852 | 0.162 | - | - | | 2.1247 | 7853 | 0.1804 | - | - | | 2.125 | 7854 | 0.1667 | - | - | | 2.1253 | 7855 | 0.2156 | - | - | | 2.1255 | 7856 | 0.1791 | - | - | | 2.1258 | 7857 | 0.2105 | - | - | | 2.1261 | 7858 | 0.1324 | - | - | | 2.1264 | 7859 | 0.1746 | - | - | | 2.1266 | 7860 | 0.1652 | - | - | | 2.1269 | 7861 | 0.1948 | - | - | | 2.1272 | 7862 | 0.1484 | - | - | | 2.1274 | 7863 | 0.1868 | - | - | | 2.1277 | 7864 | 0.1669 | - | - | | 2.1280 | 7865 | 0.1732 | - | - | | 2.1282 | 7866 | 0.1314 | - | - | | 2.1285 | 7867 | 0.2082 | - | - | | 2.1288 | 7868 | 0.1511 | - | - | | 2.1291 | 7869 | 0.1912 | - | - | | 2.1293 | 7870 | 0.1906 | - | - | | 2.1296 | 7871 | 0.1831 | - | - | | 2.1299 | 7872 | 
0.1518 | - | - | | 2.1301 | 7873 | 0.135 | - | - | | 2.1304 | 7874 | 0.2105 | - | - | | 2.1307 | 7875 | 0.1715 | - | - | | 2.1310 | 7876 | 0.1598 | - | - | | 2.1312 | 7877 | 0.2041 | - | - | | 2.1315 | 7878 | 0.1565 | - | - | | 2.1318 | 7879 | 0.2154 | - | - | | 2.1320 | 7880 | 0.1367 | - | - | | 2.1323 | 7881 | 0.1395 | - | - | | 2.1326 | 7882 | 0.1674 | - | - | | 2.1328 | 7883 | 0.1224 | - | - | | 2.1331 | 7884 | 0.1749 | - | - | | 2.1334 | 7885 | 0.1487 | - | - | | 2.1337 | 7886 | 0.2076 | - | - | | 2.1339 | 7887 | 0.2053 | - | - | | 2.1342 | 7888 | 0.1756 | - | - | | 2.1345 | 7889 | 0.1252 | - | - | | 2.1347 | 7890 | 0.2027 | - | - | | 2.1350 | 7891 | 0.2132 | - | - | | 2.1353 | 7892 | 0.1922 | - | - | | 2.1356 | 7893 | 0.1584 | - | - | | 2.1358 | 7894 | 0.169 | - | - | | 2.1361 | 7895 | 0.1414 | - | - | | 2.1364 | 7896 | 0.192 | - | - | | 2.1366 | 7897 | 0.1847 | - | - | | 2.1369 | 7898 | 0.2422 | - | - | | 2.1372 | 7899 | 0.1843 | - | - | | 2.1374 | 7900 | 0.1808 | - | - | | 2.1377 | 7901 | 0.2166 | - | - | | 2.1380 | 7902 | 0.215 | - | - | | 2.1383 | 7903 | 0.2254 | - | - | | 2.1385 | 7904 | 0.2116 | - | - | | 2.1388 | 7905 | 0.1629 | - | - | | 2.1391 | 7906 | 0.1786 | - | - | | 2.1393 | 7907 | 0.224 | - | - | | 2.1396 | 7908 | 0.1511 | - | - | | 2.1399 | 7909 | 0.139 | - | - | | 2.1402 | 7910 | 0.2234 | - | - | | 2.1404 | 7911 | 0.1609 | - | - | | 2.1407 | 7912 | 0.1847 | - | - | | 2.1410 | 7913 | 0.1107 | - | - | | 2.1412 | 7914 | 0.2006 | - | - | | 2.1415 | 7915 | 0.2237 | - | - | | 2.1418 | 7916 | 0.2013 | - | - | | 2.1420 | 7917 | 0.2144 | - | - | | 2.1423 | 7918 | 0.2501 | - | - | | 2.1426 | 7919 | 0.2439 | - | - | | 2.1429 | 7920 | 0.1779 | - | - | | 2.1431 | 7921 | 0.2429 | - | - | | 2.1434 | 7922 | 0.3119 | - | - | | 2.1437 | 7923 | 0.221 | - | - | | 2.1439 | 7924 | 0.2683 | - | - | | 2.1442 | 7925 | 0.149 | - | - | | 2.1445 | 7926 | 0.2716 | - | - | | 2.1448 | 7927 | 0.1874 | - | - | | 2.1450 | 7928 | 0.142 | - | - | | 2.1453 | 7929 | 0.255 | - | - 
| | 2.1456 | 7930 | 0.2688 | - | - | | 2.1458 | 7931 | 0.2296 | - | - | | 2.1461 | 7932 | 0.1727 | - | - | | 2.1464 | 7933 | 0.2375 | - | - | | 2.1466 | 7934 | 0.1652 | - | - | | 2.1469 | 7935 | 0.2429 | - | - | | 2.1472 | 7936 | 0.1874 | - | - | | 2.1475 | 7937 | 0.1763 | - | - | | 2.1477 | 7938 | 0.1706 | - | - | | 2.1480 | 7939 | 0.1754 | - | - | | 2.1483 | 7940 | 0.1515 | - | - | | 2.1485 | 7941 | 0.2257 | - | - | | 2.1488 | 7942 | 0.1919 | - | - | | 2.1491 | 7943 | 0.2503 | - | - | | 2.1494 | 7944 | 0.1509 | - | - | | 2.1496 | 7945 | 0.2117 | - | - | | 2.1499 | 7946 | 0.1144 | - | - | | 2.1502 | 7947 | 0.1906 | - | - | | 2.1504 | 7948 | 0.205 | - | - | | 2.1507 | 7949 | 0.1819 | - | - | | 2.1510 | 7950 | 0.2099 | - | - | | 2.1512 | 7951 | 0.2306 | - | - | | 2.1515 | 7952 | 0.1895 | - | - | | 2.1518 | 7953 | 0.2015 | - | - | | 2.1521 | 7954 | 0.2981 | - | - | | 2.1523 | 7955 | 0.211 | - | - | | 2.1526 | 7956 | 0.1693 | - | - | | 2.1529 | 7957 | 0.1534 | - | - | | 2.1531 | 7958 | 0.1917 | - | - | | 2.1534 | 7959 | 0.1774 | - | - | | 2.1537 | 7960 | 0.1369 | - | - | | 2.1540 | 7961 | 0.2034 | - | - | | 2.1542 | 7962 | 0.1961 | - | - | | 2.1545 | 7963 | 0.1678 | - | - | | 2.1548 | 7964 | 0.2346 | - | - | | 2.1550 | 7965 | 0.1571 | - | - | | 2.1553 | 7966 | 0.1958 | - | - | | 2.1556 | 7967 | 0.1485 | - | - | | 2.1558 | 7968 | 0.2443 | - | - | | 2.1561 | 7969 | 0.1679 | - | - | | 2.1564 | 7970 | 0.1581 | - | - | | 2.1567 | 7971 | 0.2248 | - | - | | 2.1569 | 7972 | 0.1322 | - | - | | 2.1572 | 7973 | 0.1869 | - | - | | 2.1575 | 7974 | 0.1964 | - | - | | 2.1577 | 7975 | 0.1667 | - | - | | 2.1580 | 7976 | 0.1707 | - | - | | 2.1583 | 7977 | 0.3056 | - | - | | 2.1585 | 7978 | 0.1496 | - | - | | 2.1588 | 7979 | 0.1532 | - | - | | 2.1591 | 7980 | 0.23 | - | - | | 2.1594 | 7981 | 0.1497 | - | - | | 2.1596 | 7982 | 0.1197 | - | - | | 2.1599 | 7983 | 0.2113 | - | - | | 2.1602 | 7984 | 0.2307 | - | - | | 2.1604 | 7985 | 0.2483 | - | - | | 2.1607 | 7986 | 0.1228 | - | - | | 
2.1610 | 7987 | 0.1911 | - | - | | 2.1613 | 7988 | 0.1286 | - | - | | 2.1615 | 7989 | 0.1542 | - | - | | 2.1618 | 7990 | 0.2521 | - | - | | 2.1621 | 7991 | 0.1306 | - | - | | 2.1623 | 7992 | 0.223 | - | - | | 2.1626 | 7993 | 0.1814 | - | - | | 2.1629 | 7994 | 0.1646 | - | - | | 2.1631 | 7995 | 0.1854 | - | - | | 2.1634 | 7996 | 0.1802 | - | - | | 2.1637 | 7997 | 0.1867 | - | - | | 2.1640 | 7998 | 0.2711 | - | - | | 2.1642 | 7999 | 0.1839 | - | - | | 2.1645 | 8000 | 0.155 | 0.2057 | 0.9481 | | 2.1648 | 8001 | 0.1963 | - | - | | 2.1650 | 8002 | 0.1846 | - | - | | 2.1653 | 8003 | 0.1927 | - | - | | 2.1656 | 8004 | 0.1802 | - | - | | 2.1659 | 8005 | 0.2297 | - | - | | 2.1661 | 8006 | 0.2011 | - | - | | 2.1664 | 8007 | 0.1602 | - | - | | 2.1667 | 8008 | 0.148 | - | - | | 2.1669 | 8009 | 0.23 | - | - | | 2.1672 | 8010 | 0.1813 | - | - | | 2.1675 | 8011 | 0.1519 | - | - | | 2.1677 | 8012 | 0.1744 | - | - | | 2.1680 | 8013 | 0.1822 | - | - | | 2.1683 | 8014 | 0.1417 | - | - | | 2.1686 | 8015 | 0.1138 | - | - | | 2.1688 | 8016 | 0.1498 | - | - | | 2.1691 | 8017 | 0.1683 | - | - | | 2.1694 | 8018 | 0.2155 | - | - | | 2.1696 | 8019 | 0.2044 | - | - | | 2.1699 | 8020 | 0.1541 | - | - | | 2.1702 | 8021 | 0.1493 | - | - | | 2.1705 | 8022 | 0.1574 | - | - | | 2.1707 | 8023 | 0.1815 | - | - | | 2.1710 | 8024 | 0.1189 | - | - | | 2.1713 | 8025 | 0.2144 | - | - | | 2.1715 | 8026 | 0.1989 | - | - | | 2.1718 | 8027 | 0.1737 | - | - | | 2.1721 | 8028 | 0.1768 | - | - | | 2.1723 | 8029 | 0.2391 | - | - | | 2.1726 | 8030 | 0.1605 | - | - | | 2.1729 | 8031 | 0.2083 | - | - | | 2.1732 | 8032 | 0.1694 | - | - | | 2.1734 | 8033 | 0.1353 | - | - | | 2.1737 | 8034 | 0.144 | - | - | | 2.1740 | 8035 | 0.1832 | - | - | | 2.1742 | 8036 | 0.1363 | - | - | | 2.1745 | 8037 | 0.1878 | - | - | | 2.1748 | 8038 | 0.1577 | - | - | | 2.1751 | 8039 | 0.2338 | - | - | | 2.1753 | 8040 | 0.2136 | - | - | | 2.1756 | 8041 | 0.2067 | - | - | | 2.1759 | 8042 | 0.2017 | - | - | | 2.1761 | 8043 | 0.1593 | - | - | | 
2.1764 | 8044 | 0.1953 | - | - | | 2.1767 | 8045 | 0.1876 | - | - | | 2.1769 | 8046 | 0.1827 | - | - | | 2.1772 | 8047 | 0.2425 | - | - | | 2.1775 | 8048 | 0.2047 | - | - | | 2.1778 | 8049 | 0.198 | - | - | | 2.1780 | 8050 | 0.1535 | - | - | | 2.1783 | 8051 | 0.1835 | - | - | | 2.1786 | 8052 | 0.1771 | - | - | | 2.1788 | 8053 | 0.1908 | - | - | | 2.1791 | 8054 | 0.1904 | - | - | | 2.1794 | 8055 | 0.1464 | - | - | | 2.1797 | 8056 | 0.1597 | - | - | | 2.1799 | 8057 | 0.183 | - | - | | 2.1802 | 8058 | 0.1659 | - | - | | 2.1805 | 8059 | 0.127 | - | - | | 2.1807 | 8060 | 0.2062 | - | - | | 2.1810 | 8061 | 0.1819 | - | - | | 2.1813 | 8062 | 0.2099 | - | - | | 2.1815 | 8063 | 0.1932 | - | - | | 2.1818 | 8064 | 0.1753 | - | - | | 2.1821 | 8065 | 0.1436 | - | - | | 2.1824 | 8066 | 0.1969 | - | - | | 2.1826 | 8067 | 0.1991 | - | - | | 2.1829 | 8068 | 0.221 | - | - | | 2.1832 | 8069 | 0.1091 | - | - | | 2.1834 | 8070 | 0.1389 | - | - | | 2.1837 | 8071 | 0.1811 | - | - | | 2.1840 | 8072 | 0.1843 | - | - | | 2.1843 | 8073 | 0.2081 | - | - | | 2.1845 | 8074 | 0.1761 | - | - | | 2.1848 | 8075 | 0.2002 | - | - | | 2.1851 | 8076 | 0.1281 | - | - | | 2.1853 | 8077 | 0.1888 | - | - | | 2.1856 | 8078 | 0.1436 | - | - | | 2.1859 | 8079 | 0.2196 | - | - | | 2.1861 | 8080 | 0.1622 | - | - | | 2.1864 | 8081 | 0.1683 | - | - | | 2.1867 | 8082 | 0.203 | - | - | | 2.1870 | 8083 | 0.1641 | - | - | | 2.1872 | 8084 | 0.1907 | - | - | | 2.1875 | 8085 | 0.1195 | - | - | | 2.1878 | 8086 | 0.1679 | - | - | | 2.1880 | 8087 | 0.1356 | - | - | | 2.1883 | 8088 | 0.2069 | - | - | | 2.1886 | 8089 | 0.1524 | - | - | | 2.1889 | 8090 | 0.182 | - | - | | 2.1891 | 8091 | 0.1847 | - | - | | 2.1894 | 8092 | 0.148 | - | - | | 2.1897 | 8093 | 0.203 | - | - | | 2.1899 | 8094 | 0.1545 | - | - | | 2.1902 | 8095 | 0.1751 | - | - | | 2.1905 | 8096 | 0.177 | - | - | | 2.1907 | 8097 | 0.1906 | - | - | | 2.1910 | 8098 | 0.2024 | - | - | | 2.1913 | 8099 | 0.2263 | - | - | | 2.1916 | 8100 | 0.1687 | - | - | | 2.1918 | 8101 
| 0.1948 | - | - | | 2.1921 | 8102 | 0.1848 | - | - | | 2.1924 | 8103 | 0.2446 | - | - | | 2.1926 | 8104 | 0.1889 | - | - | | 2.1929 | 8105 | 0.1811 | - | - | | 2.1932 | 8106 | 0.1607 | - | - | | 2.1935 | 8107 | 0.1878 | - | - | | 2.1937 | 8108 | 0.2175 | - | - | | 2.1940 | 8109 | 0.1158 | - | - | | 2.1943 | 8110 | 0.152 | - | - | | 2.1945 | 8111 | 0.1888 | - | - | | 2.1948 | 8112 | 0.2252 | - | - | | 2.1951 | 8113 | 0.1414 | - | - | | 2.1953 | 8114 | 0.1984 | - | - | | 2.1956 | 8115 | 0.2137 | - | - | | 2.1959 | 8116 | 0.2205 | - | - | | 2.1962 | 8117 | 0.1965 | - | - | | 2.1964 | 8118 | 0.2 | - | - | | 2.1967 | 8119 | 0.139 | - | - | | 2.1970 | 8120 | 0.1805 | - | - | | 2.1972 | 8121 | 0.2589 | - | - | | 2.1975 | 8122 | 0.1685 | - | - | | 2.1978 | 8123 | 0.2004 | - | - | | 2.1981 | 8124 | 0.1435 | - | - | | 2.1983 | 8125 | 0.1641 | - | - | | 2.1986 | 8126 | 0.1826 | - | - | | 2.1989 | 8127 | 0.1253 | - | - | | 2.1991 | 8128 | 0.1641 | - | - | | 2.1994 | 8129 | 0.2133 | - | - | | 2.1997 | 8130 | 0.1692 | - | - | | 2.1999 | 8131 | 0.1869 | - | - | | 2.2002 | 8132 | 0.2041 | - | - | | 2.2005 | 8133 | 0.1495 | - | - | | 2.2008 | 8134 | 0.1667 | - | - | | 2.2010 | 8135 | 0.1835 | - | - | | 2.2013 | 8136 | 0.1277 | - | - | | 2.2016 | 8137 | 0.2033 | - | - | | 2.2018 | 8138 | 0.2104 | - | - | | 2.2021 | 8139 | 0.1847 | - | - | | 2.2024 | 8140 | 0.2103 | - | - | | 2.2027 | 8141 | 0.1792 | - | - | | 2.2029 | 8142 | 0.2054 | - | - | | 2.2032 | 8143 | 0.2332 | - | - | | 2.2035 | 8144 | 0.1744 | - | - | | 2.2037 | 8145 | 0.1593 | - | - | | 2.2040 | 8146 | 0.1625 | - | - | | 2.2043 | 8147 | 0.1083 | - | - | | 2.2045 | 8148 | 0.1347 | - | - | | 2.2048 | 8149 | 0.2465 | - | - | | 2.2051 | 8150 | 0.2673 | - | - | | 2.2054 | 8151 | 0.1908 | - | - | | 2.2056 | 8152 | 0.2047 | - | - | | 2.2059 | 8153 | 0.1705 | - | - | | 2.2062 | 8154 | 0.1908 | - | - | | 2.2064 | 8155 | 0.1836 | - | - | | 2.2067 | 8156 | 0.207 | - | - | | 2.2070 | 8157 | 0.2576 | - | - | | 2.2073 | 8158 | 0.1505 | 
- | - | | 2.2075 | 8159 | 0.1966 | - | - | | 2.2078 | 8160 | 0.2198 | - | - | | 2.2081 | 8161 | 0.1442 | - | - | | 2.2083 | 8162 | 0.1467 | - | - | | 2.2086 | 8163 | 0.1445 | - | - | | 2.2089 | 8164 | 0.1428 | - | - | | 2.2091 | 8165 | 0.1896 | - | - | | 2.2094 | 8166 | 0.1434 | - | - | | 2.2097 | 8167 | 0.1954 | - | - | | 2.2100 | 8168 | 0.1859 | - | - | | 2.2102 | 8169 | 0.1262 | - | - | | 2.2105 | 8170 | 0.1702 | - | - | | 2.2108 | 8171 | 0.2022 | - | - | | 2.2110 | 8172 | 0.1827 | - | - | | 2.2113 | 8173 | 0.2254 | - | - | | 2.2116 | 8174 | 0.1487 | - | - | | 2.2119 | 8175 | 0.1822 | - | - | | 2.2121 | 8176 | 0.1877 | - | - | | 2.2124 | 8177 | 0.25 | - | - | | 2.2127 | 8178 | 0.1468 | - | - | | 2.2129 | 8179 | 0.1574 | - | - | | 2.2132 | 8180 | 0.1341 | - | - | | 2.2135 | 8181 | 0.1818 | - | - | | 2.2137 | 8182 | 0.1726 | - | - | | 2.2140 | 8183 | 0.1887 | - | - | | 2.2143 | 8184 | 0.3051 | - | - | | 2.2146 | 8185 | 0.1898 | - | - | | 2.2148 | 8186 | 0.1986 | - | - | | 2.2151 | 8187 | 0.279 | - | - | | 2.2154 | 8188 | 0.1611 | - | - | | 2.2156 | 8189 | 0.1519 | - | - | | 2.2159 | 8190 | 0.1446 | - | - | | 2.2162 | 8191 | 0.192 | - | - | | 2.2165 | 8192 | 0.2045 | - | - | | 2.2167 | 8193 | 0.1728 | - | - | | 2.2170 | 8194 | 0.1239 | - | - | | 2.2173 | 8195 | 0.2428 | - | - | | 2.2175 | 8196 | 0.1803 | - | - | | 2.2178 | 8197 | 0.1669 | - | - | | 2.2181 | 8198 | 0.1727 | - | - | | 2.2183 | 8199 | 0.1213 | - | - | | 2.2186 | 8200 | 0.1679 | - | - | | 2.2189 | 8201 | 0.2219 | - | - | | 2.2192 | 8202 | 0.1387 | - | - | | 2.2194 | 8203 | 0.1762 | - | - | | 2.2197 | 8204 | 0.1388 | - | - | | 2.2200 | 8205 | 0.1913 | - | - | | 2.2202 | 8206 | 0.1889 | - | - | | 2.2205 | 8207 | 0.2201 | - | - | | 2.2208 | 8208 | 0.1533 | - | - | | 2.2210 | 8209 | 0.2094 | - | - | | 2.2213 | 8210 | 0.1979 | - | - | | 2.2216 | 8211 | 0.2431 | - | - | | 2.2219 | 8212 | 0.1788 | - | - | | 2.2221 | 8213 | 0.1297 | - | - | | 2.2224 | 8214 | 0.2591 | - | - | | 2.2227 | 8215 | 0.1971 | - | - | 
| 2.2229 | 8216 | 0.2043 | - | - |
| 2.2232 | 8217 | 0.1891 | - | - |
| 2.2235 | 8218 | 0.2081 | - | - |
| 2.2238 | 8219 | 0.1578 | - | - |
| 2.2240 | 8220 | 0.1671 | - | - |
| 2.2243 | 8221 | 0.1848 | - | - |
| 2.2246 | 8222 | 0.1819 | - | - |
| 2.2248 | 8223 | 0.1933 | - | - |
| 2.2251 | 8224 | 0.1919 | - | - |
| 2.2254 | 8225 | 0.1942 | - | - |
| 2.2256 | 8226 | 0.1495 | - | - |
| 2.2259 | 8227 | 0.2352 | - | - |
| 2.2262 | 8228 | 0.1722 | - | - |
| 2.2265 | 8229 | 0.1646 | - | - |
| 2.2267 | 8230 | 0.1791 | - | - |
| 2.2270 | 8231 | 0.2486 | - | - |
| 2.2273 | 8232 | 0.2206 | - | - |
| 2.2275 | 8233 | 0.2176 | - | - |
| 2.2278 | 8234 | 0.2157 | - | - |
| 2.2281 | 8235 | 0.1818 | - | - |
| 2.2284 | 8236 | 0.1706 | - | - |
| 2.2286 | 8237 | 0.149 | - | - |
| 2.2289 | 8238 | 0.202 | - | - |
| 2.2292 | 8239 | 0.1732 | - | - |
| 2.2294 | 8240 | 0.1554 | - | - |
| 2.2297 | 8241 | 0.2118 | - | - |
| 2.2300 | 8242 | 0.1787 | - | - |
| 2.2302 | 8243 | 0.1615 | - | - |
| 2.2305 | 8244 | 0.2186 | - | - |
| 2.2308 | 8245 | 0.1994 | - | - |
| 2.2311 | 8246 | 0.2023 | - | - |
| 2.2313 | 8247 | 0.1728 | - | - |
| 2.2316 | 8248 | 0.1883 | - | - |
| 2.2319 | 8249 | 0.239 | - | - |
| 2.2321 | 8250 | 0.1272 | - | - |
| 2.2324 | 8251 | 0.1711 | - | - |
| 2.2327 | 8252 | 0.1909 | - | - |
| 2.2330 | 8253 | 0.2439 | - | - |
| 2.2332 | 8254 | 0.1399 | - | - |
| 2.2335 | 8255 | 0.1486 | - | - |
| 2.2338 | 8256 | 0.1567 | - | - |
| 2.2340 | 8257 | 0.1454 | - | - |
| 2.2343 | 8258 | 0.1331 | - | - |
| 2.2346 | 8259 | 0.1704 | - | - |
| 2.2348 | 8260 | 0.1505 | - | - |
| 2.2351 | 8261 | 0.1502 | - | - |
| 2.2354 | 8262 | 0.1863 | - | - |
| 2.2357 | 8263 | 0.1278 | - | - |
| 2.2359 | 8264 | 0.2297 | - | - |
| 2.2362 | 8265 | 0.194 | - | - |
| 2.2365 | 8266 | 0.1524 | - | - |
| 2.2367 | 8267 | 0.1696 | - | - |
| 2.2370 | 8268 | 0.2592 | - | - |
| 2.2373 | 8269 | 0.2001 | - | - |
| 2.2376 | 8270 | 0.1385 | - | - |
| 2.2378 | 8271 | 0.2195 | - | - |
| 2.2381 | 8272 | 0.2161 | - | - |
| 2.2384 | 8273 | 0.2451 | - | - |
| 2.2386 | 8274 | 0.1982 | - | - |
| 2.2389 | 8275 | 0.1578 | - | - |
| 2.2392 | 8276 | 0.1898 | - | - |
| 2.2394 | 8277 | 0.2103 | - | - |
| 2.2397 | 8278 | 0.1788 | - | - |
| 2.2400 | 8279 | 0.1771 | - | - |
| 2.2403 | 8280 | 0.1308 | - | - |
| 2.2405 | 8281 | 0.142 | - | - |
| 2.2408 | 8282 | 0.2895 | - | - |
| 2.2411 | 8283 | 0.212 | - | - |
| 2.2413 | 8284 | 0.1557 | - | - |
| 2.2416 | 8285 | 0.1677 | - | - |
| 2.2419 | 8286 | 0.1739 | - | - |
| 2.2422 | 8287 | 0.2369 | - | - |
| 2.2424 | 8288 | 0.1829 | - | - |
| 2.2427 | 8289 | 0.2037 | - | - |
| 2.2430 | 8290 | 0.1254 | - | - |
| 2.2432 | 8291 | 0.1394 | - | - |
| 2.2435 | 8292 | 0.1539 | - | - |
| 2.2438 | 8293 | 0.1818 | - | - |
| 2.2440 | 8294 | 0.168 | - | - |
| 2.2443 | 8295 | 0.1585 | - | - |
| 2.2446 | 8296 | 0.1714 | - | - |
| 2.2449 | 8297 | 0.2006 | - | - |
| 2.2451 | 8298 | 0.0946 | - | - |
| 2.2454 | 8299 | 0.1426 | - | - |
| 2.2457 | 8300 | 0.2293 | - | - |
| 2.2459 | 8301 | 0.1793 | - | - |
| 2.2462 | 8302 | 0.2012 | - | - |
| 2.2465 | 8303 | 0.2596 | - | - |
| 2.2468 | 8304 | 0.3237 | - | - |
| 2.2470 | 8305 | 0.1886 | - | - |
| 2.2473 | 8306 | 0.1559 | - | - |
| 2.2476 | 8307 | 0.1571 | - | - |
| 2.2478 | 8308 | 0.177 | - | - |
| 2.2481 | 8309 | 0.1481 | - | - |
| 2.2484 | 8310 | 0.2141 | - | - |
| 2.2486 | 8311 | 0.2189 | - | - |
| 2.2489 | 8312 | 0.2041 | - | - |
| 2.2492 | 8313 | 0.1859 | - | - |
| 2.2495 | 8314 | 0.2363 | - | - |
| 2.2497 | 8315 | 0.1626 | - | - |
| 2.25 | 8316 | 0.1633 | - | - |
| 2.2503 | 8317 | 0.1619 | - | - |
| 2.2505 | 8318 | 0.2287 | - | - |
| 2.2508 | 8319 | 0.1917 | - | - |
| 2.2511 | 8320 | 0.2587 | - | - |
| 2.2514 | 8321 | 0.2318 | - | - |
| 2.2516 | 8322 | 0.1303 | - | - |
| 2.2519 | 8323 | 0.1397 | - | - |
| 2.2522 | 8324 | 0.1966 | - | - |
| 2.2524 | 8325 | 0.1529 | - | - |
| 2.2527 | 8326 | 0.2019 | - | - |
| 2.2530 | 8327 | 0.129 | - | - |
| 2.2532 | 8328 | 0.2209 | - | - |
| 2.2535 | 8329 | 0.2107 | - | - |
| 2.2538 | 8330 | 0.1682 | - | - |
| 2.2541 | 8331 | 0.2316 | - | - |
| 2.2543 | 8332 | 0.2599 | - | - |
| 2.2546 | 8333 | 0.1319 | - | - |
| 2.2549 | 8334 | 0.2367 | - | - |
| 2.2551 | 8335 | 0.1961 | - | - |
| 2.2554 | 8336 | 0.1432 | - | - |
| 2.2557 | 8337 | 0.2423 | - | - |
| 2.2560 | 8338 | 0.1471 | - | - |
| 2.2562 | 8339 | 0.1799 | - | - |
| 2.2565 | 8340 | 0.2101 | - | - |
| 2.2568 | 8341 | 0.1797 | - | - |
| 2.2570 | 8342 | 0.1664 | - | - |
| 2.2573 | 8343 | 0.1883 | - | - |
| 2.2576 | 8344 | 0.2316 | - | - |
| 2.2578 | 8345 | 0.1746 | - | - |
| 2.2581 | 8346 | 0.2033 | - | - |
| 2.2584 | 8347 | 0.1577 | - | - |
| 2.2587 | 8348 | 0.1903 | - | - |
| 2.2589 | 8349 | 0.1499 | - | - |
| 2.2592 | 8350 | 0.1757 | - | - |
| 2.2595 | 8351 | 0.1559 | - | - |
| 2.2597 | 8352 | 0.1592 | - | - |
| 2.2600 | 8353 | 0.1848 | - | - |
| 2.2603 | 8354 | 0.1652 | - | - |
| 2.2606 | 8355 | 0.1712 | - | - |
| 2.2608 | 8356 | 0.2346 | - | - |
| 2.2611 | 8357 | 0.2326 | - | - |
| 2.2614 | 8358 | 0.1486 | - | - |
| 2.2616 | 8359 | 0.1467 | - | - |
| 2.2619 | 8360 | 0.2658 | - | - |
| 2.2622 | 8361 | 0.2403 | - | - |
| 2.2624 | 8362 | 0.1644 | - | - |
| 2.2627 | 8363 | 0.2082 | - | - |
| 2.2630 | 8364 | 0.1802 | - | - |
| 2.2633 | 8365 | 0.1789 | - | - |
| 2.2635 | 8366 | 0.148 | - | - |
| 2.2638 | 8367 | 0.225 | - | - |
| 2.2641 | 8368 | 0.1397 | - | - |
| 2.2643 | 8369 | 0.1664 | - | - |
| 2.2646 | 8370 | 0.2209 | - | - |
| 2.2649 | 8371 | 0.15 | - | - |
| 2.2652 | 8372 | 0.1735 | - | - |
| 2.2654 | 8373 | 0.1462 | - | - |
| 2.2657 | 8374 | 0.1327 | - | - |
| 2.2660 | 8375 | 0.1765 | - | - |
| 2.2662 | 8376 | 0.1462 | - | - |
| 2.2665 | 8377 | 0.207 | - | - |
| 2.2668 | 8378 | 0.1761 | - | - |
| 2.2670 | 8379 | 0.1606 | - | - |
| 2.2673 | 8380 | 0.1464 | - | - |
| 2.2676 | 8381 | 0.2012 | - | - |
| 2.2679 | 8382 | 0.2416 | - | - |
| 2.2681 | 8383 | 0.1407 | - | - |
| 2.2684 | 8384 | 0.2082 | - | - |
| 2.2687 | 8385 | 0.1543 | - | - |
| 2.2689 | 8386 | 0.1394 | - | - |
| 2.2692 | 8387 | 0.1705 | - | - |
| 2.2695 | 8388 | 0.1534 | - | - |
| 2.2698 | 8389 | 0.1566 | - | - |
| 2.2700 | 8390 | 0.1332 | - | - |
| 2.2703 | 8391 | 0.1617 | - | - |
| 2.2706 | 8392 | 0.1633 | - | - |
| 2.2708 | 8393 | 0.1605 | - | - |
| 2.2711 | 8394 | 0.2242 | - | - |
| 2.2714 | 8395 | 0.2214 | - | - |
| 2.2716 | 8396 | 0.175 | - | - |
| 2.2719 | 8397 | 0.1841 | - | - |
| 2.2722 | 8398 | 0.1693 | - | - |
| 2.2725 | 8399 | 0.1946 | - | - |
| 2.2727 | 8400 | 0.1831 | - | - |
| 2.2730 | 8401 | 0.162 | - | - |
| 2.2733 | 8402 | 0.154 | - | - |
| 2.2735 | 8403 | 0.1528 | - | - |
| 2.2738 | 8404 | 0.1633 | - | - |
| 2.2741 | 8405 | 0.224 | - | - |
| 2.2744 | 8406 | 0.2296 | - | - |
| 2.2746 | 8407 | 0.2225 | - | - |
| 2.2749 | 8408 | 0.2178 | - | - |
| 2.2752 | 8409 | 0.1834 | - | - |
| 2.2754 | 8410 | 0.2058 | - | - |
| 2.2757 | 8411 | 0.1605 | - | - |
| 2.2760 | 8412 | 0.1937 | - | - |
| 2.2762 | 8413 | 0.1567 | - | - |
| 2.2765 | 8414 | 0.1853 | - | - |
| 2.2768 | 8415 | 0.2097 | - | - |
| 2.2771 | 8416 | 0.2448 | - | - |
| 2.2773 | 8417 | 0.2153 | - | - |
| 2.2776 | 8418 | 0.2581 | - | - |
| 2.2779 | 8419 | 0.1331 | - | - |
| 2.2781 | 8420 | 0.2408 | - | - |
| 2.2784 | 8421 | 0.258 | - | - |
| 2.2787 | 8422 | 0.2121 | - | - |
| 2.2790 | 8423 | 0.2476 | - | - |
| 2.2792 | 8424 | 0.1436 | - | - |
| 2.2795 | 8425 | 0.1427 | - | - |
| 2.2798 | 8426 | 0.2115 | - | - |
| 2.2800 | 8427 | 0.1346 | - | - |
| 2.2803 | 8428 | 0.1714 | - | - |
| 2.2806 | 8429 | 0.1522 | - | - |
| 2.2808 | 8430 | 0.1671 | - | - |
| 2.2811 | 8431 | 0.1418 | - | - |
| 2.2814 | 8432 | 0.1581 | - | - |
| 2.2817 | 8433 | 0.2278 | - | - |
| 2.2819 | 8434 | 0.207 | - | - |
| 2.2822 | 8435 | 0.1739 | - | - |
| 2.2825 | 8436 | 0.1877 | - | - |
| 2.2827 | 8437 | 0.1159 | - | - |
| 2.2830 | 8438 | 0.2233 | - | - |
| 2.2833 | 8439 | 0.2493 | - | - |
| 2.2835 | 8440 | 0.2317 | - | - |
| 2.2838 | 8441 | 0.2212 | - | - |
| 2.2841 | 8442 | 0.2231 | - | - |
| 2.2844 | 8443 | 0.2218 | - | - |
| 2.2846 | 8444 | 0.2851 | - | - |
| 2.2849 | 8445 | 0.2261 | - | - |
| 2.2852 | 8446 | 0.2038 | - | - |
| 2.2854 | 8447 | 0.1769 | - | - |
| 2.2857 | 8448 | 0.157 | - | - |
| 2.2860 | 8449 | 0.1886 | - | - |
| 2.2863 | 8450 | 0.1752 | - | - |
| 2.2865 | 8451 | 0.1514 | - | - |
| 2.2868 | 8452 | 0.2474 | - | - |
| 2.2871 | 8453 | 0.1642 | - | - |
| 2.2873 | 8454 | 0.1596 | - | - |
| 2.2876 | 8455 | 0.1498 | - | - |
| 2.2879 | 8456 | 0.1677 | - | - |
| 2.2881 | 8457 | 0.1669 | - | - |
| 2.2884 | 8458 | 0.1975 | - | - |
| 2.2887 | 8459 | 0.1792 | - | - |
| 2.2890 | 8460 | 0.1555 | - | - |
| 2.2892 | 8461 | 0.2362 | - | - |
| 2.2895 | 8462 | 0.1786 | - | - |
| 2.2898 | 8463 | 0.1412 | - | - |
| 2.2900 | 8464 | 0.2661 | - | - |
| 2.2903 | 8465 | 0.1585 | - | - |
| 2.2906 | 8466 | 0.2773 | - | - |
| 2.2909 | 8467 | 0.1155 | - | - |
| 2.2911 | 8468 | 0.166 | - | - |
| 2.2914 | 8469 | 0.1256 | - | - |
| 2.2917 | 8470 | 0.1941 | - | - |
| 2.2919 | 8471 | 0.2275 | - | - |
| 2.2922 | 8472 | 0.1654 | - | - |
| 2.2925 | 8473 | 0.1774 | - | - |
| 2.2927 | 8474 | 0.1745 | - | - |
| 2.2930 | 8475 | 0.1864 | - | - |
| 2.2933 | 8476 | 0.1403 | - | - |
| 2.2936 | 8477 | 0.2255 | - | - |
| 2.2938 | 8478 | 0.111 | - | - |
| 2.2941 | 8479 | 0.1433 | - | - |
| 2.2944 | 8480 | 0.332 | - | - |
| 2.2946 | 8481 | 0.1498 | - | - |
| 2.2949 | 8482 | 0.1223 | - | - |
| 2.2952 | 8483 | 0.2207 | - | - |
| 2.2955 | 8484 | 0.2089 | - | - |
| 2.2957 | 8485 | 0.2147 | - | - |
| 2.2960 | 8486 | 0.1632 | - | - |
| 2.2963 | 8487 | 0.1458 | - | - |
| 2.2965 | 8488 | 0.2236 | - | - |
| 2.2968 | 8489 | 0.1895 | - | - |
| 2.2971 | 8490 | 0.2098 | - | - |
| 2.2973 | 8491 | 0.1557 | - | - |
| 2.2976 | 8492 | 0.1561 | - | - |
| 2.2979 | 8493 | 0.1602 | - | - |
| 2.2982 | 8494 | 0.1856 | - | - |
| 2.2984 | 8495 | 0.1748 | - | - |
| 2.2987 | 8496 | 0.193 | - | - |
| 2.2990 | 8497 | 0.2213 | - | - |
| 2.2992 | 8498 | 0.1693 | - | - |
| 2.2995 | 8499 | 0.2138 | - | - |
| 2.2998 | 8500 | 0.1622 | - | - |
| 2.3001 | 8501 | 0.1599 | - | - |
| 2.3003 | 8502 | 0.1983 | - | - |
| 2.3006 | 8503 | 0.1534 | - | - |
| 2.3009 | 8504 | 0.1789 | - | - |
| 2.3011 | 8505 | 0.1571 | - | - |
| 2.3014 | 8506 | 0.1844 | - | - |
| 2.3017 | 8507 | 0.2047 | - | - |
| 2.3019 | 8508 | 0.227 | - | - |
| 2.3022 | 8509 | 0.1843 | - | - |
| 2.3025 | 8510 | 0.2249 | - | - |
| 2.3028 | 8511 | 0.2144 | - | - |
| 2.3030 | 8512 | 0.149 | - | - |
| 2.3033 | 8513 | 0.1449 | - | - |
| 2.3036 | 8514 | 0.2425 | - | - |
| 2.3038 | 8515 | 0.1824 | - | - |
| 2.3041 | 8516 | 0.2097 | - | - |
| 2.3044 | 8517 | 0.2737 | - | - |
| 2.3047 | 8518 | 0.2245 | - | - |
| 2.3049 | 8519 | 0.2002 | - | - |
| 2.3052 | 8520 | 0.2107 | - | - |
| 2.3055 | 8521 | 0.1675 | - | - |
| 2.3057 | 8522 | 0.1713 | - | - |
| 2.3060 | 8523 | 0.1553 | - | - |
| 2.3063 | 8524 | 0.167 | - | - |
| 2.3065 | 8525 | 0.1773 | - | - |
| 2.3068 | 8526 | 0.2511 | - | - |
| 2.3071 | 8527 | 0.2165 | - | - |
| 2.3074 | 8528 | 0.2162 | - | - |
| 2.3076 | 8529 | 0.1958 | - | - |
| 2.3079 | 8530 | 0.2326 | - | - |
| 2.3082 | 8531 | 0.1832 | - | - |
| 2.3084 | 8532 | 0.1441 | - | - |
| 2.3087 | 8533 | 0.1557 | - | - |
| 2.3090 | 8534 | 0.1493 | - | - |
| 2.3093 | 8535 | 0.2065 | - | - |
| 2.3095 | 8536 | 0.2311 | - | - |
| 2.3098 | 8537 | 0.1883 | - | - |
| 2.3101 | 8538 | 0.2509 | - | - |
| 2.3103 | 8539 | 0.185 | - | - |
| 2.3106 | 8540 | 0.1678 | - | - |
| 2.3109 | 8541 | 0.1799 | - | - |
| 2.3111 | 8542 | 0.282 | - | - |
| 2.3114 | 8543 | 0.1768 | - | - |
| 2.3117 | 8544 | 0.2195 | - | - |
| 2.3120 | 8545 | 0.1765 | - | - |
| 2.3122 | 8546 | 0.2756 | - | - |
| 2.3125 | 8547 | 0.1818 | - | - |
| 2.3128 | 8548 | 0.2537 | - | - |
| 2.3130 | 8549 | 0.1355 | - | - |
| 2.3133 | 8550 | 0.1367 | - | - |
| 2.3136 | 8551 | 0.1675 | - | - |
| 2.3139 | 8552 | 0.2128 | - | - |
| 2.3141 | 8553 | 0.147 | - | - |
| 2.3144 | 8554 | 0.2187 | - | - |
| 2.3147 | 8555 | 0.1618 | - | - |
| 2.3149 | 8556 | 0.1856 | - | - |
| 2.3152 | 8557 | 0.2222 | - | - |
| 2.3155 | 8558 | 0.2321 | - | - |
| 2.3157 | 8559 | 0.2025 | - | - |
| 2.3160 | 8560 | 0.1612 | - | - |
| 2.3163 | 8561 | 0.1102 | - | - |
| 2.3166 | 8562 | 0.1916 | - | - |
| 2.3168 | 8563 | 0.1447 | - | - |
| 2.3171 | 8564 | 0.2378 | - | - |
| 2.3174 | 8565 | 0.1578 | - | - |
| 2.3176 | 8566 | 0.3134 | - | - |
| 2.3179 | 8567 | 0.1474 | - | - |
| 2.3182 | 8568 | 0.2143 | - | - |
| 2.3185 | 8569 | 0.1964 | - | - |
| 2.3187 | 8570 | 0.2355 | - | - |
| 2.3190 | 8571 | 0.1723 | - | - |
| 2.3193 | 8572 | 0.1925 | - | - |
| 2.3195 | 8573 | 0.1865 | - | - |
| 2.3198 | 8574 | 0.2444 | - | - |
| 2.3201 | 8575 | 0.1455 | - | - |
| 2.3203 | 8576 | 0.1817 | - | - |
| 2.3206 | 8577 | 0.1996 | - | - |
| 2.3209 | 8578 | 0.1541 | - | - |
| 2.3212 | 8579 | 0.1844 | - | - |
| 2.3214 | 8580 | 0.1787 | - | - |
| 2.3217 | 8581 | 0.1829 | - | - |
| 2.3220 | 8582 | 0.1977 | - | - |
| 2.3222 | 8583 | 0.1698 | - | - |
| 2.3225 | 8584 | 0.1521 | - | - |
| 2.3228 | 8585 | 0.2149 | - | - |
| 2.3231 | 8586 | 0.1938 | - | - |
| 2.3233 | 8587 | 0.1663 | - | - |
| 2.3236 | 8588 | 0.1874 | - | - |
| 2.3239 | 8589 | 0.1524 | - | - |
| 2.3241 | 8590 | 0.1901 | - | - |
| 2.3244 | 8591 | 0.1661 | - | - |
| 2.3247 | 8592 | 0.1512 | - | - |
| 2.3249 | 8593 | 0.2388 | - | - |
| 2.3252 | 8594 | 0.2167 | - | - |
| 2.3255 | 8595 | 0.1569 | - | - |
| 2.3258 | 8596 | 0.1631 | - | - |
| 2.3260 | 8597 | 0.1221 | - | - |
| 2.3263 | 8598 | 0.1686 | - | - |
| 2.3266 | 8599 | 0.2046 | - | - |
| 2.3268 | 8600 | 0.2084 | - | - |
| 2.3271 | 8601 | 0.1842 | - | - |
| 2.3274 | 8602 | 0.2236 | - | - |
| 2.3277 | 8603 | 0.1585 | - | - |
| 2.3279 | 8604 | 0.2151 | - | - |
| 2.3282 | 8605 | 0.2635 | - | - |
| 2.3285 | 8606 | 0.1674 | - | - |
| 2.3287 | 8607 | 0.1562 | - | - |
| 2.3290 | 8608 | 0.1337 | - | - |
| 2.3293 | 8609 | 0.2365 | - | - |
| 2.3295 | 8610 | 0.1282 | - | - |
| 2.3298 | 8611 | 0.1553 | - | - |
| 2.3301 | 8612 | 0.1641 | - | - |
| 2.3304 | 8613 | 0.1808 | - | - |
| 2.3306 | 8614 | 0.1346 | - | - |
| 2.3309 | 8615 | 0.2536 | - | - |
| 2.3312 | 8616 | 0.1313 | - | - |
| 2.3314 | 8617 | 0.2053 | - | - |
| 2.3317 | 8618 | 0.2167 | - | - |
| 2.3320 | 8619 | 0.2016 | - | - |
| 2.3323 | 8620 | 0.1376 | - | - |
| 2.3325 | 8621 | 0.194 | - | - |
| 2.3328 | 8622 | 0.1644 | - | - |
| 2.3331 | 8623 | 0.1695 | - | - |
| 2.3333 | 8624 | 0.1821 | - | - |
| 2.3336 | 8625 | 0.1975 | - | - |
| 2.3339 | 8626 | 0.1673 | - | - |
| 2.3341 | 8627 | 0.2563 | - | - |
| 2.3344 | 8628 | 0.2253 | - | - |
| 2.3347 | 8629 | 0.2026 | - | - |
| 2.3350 | 8630 | 0.184 | - | - |
| 2.3352 | 8631 | 0.2019 | - | - |
| 2.3355 | 8632 | 0.2188 | - | - |
| 2.3358 | 8633 | 0.1369 | - | - |
| 2.3360 | 8634 | 0.109 | - | - |
| 2.3363 | 8635 | 0.1622 | - | - |
| 2.3366 | 8636 | 0.1615 | - | - |
| 2.3369 | 8637 | 0.1759 | - | - |
| 2.3371 | 8638 | 0.1714 | - | - |
| 2.3374 | 8639 | 0.1645 | - | - |
| 2.3377 | 8640 | 0.2208 | - | - |
| 2.3379 | 8641 | 0.2952 | - | - |
| 2.3382 | 8642 | 0.1501 | - | - |
| 2.3385 | 8643 | 0.2117 | - | - |
| 2.3387 | 8644 | 0.1615 | - | - |
| 2.3390 | 8645 | 0.1606 | - | - |
| 2.3393 | 8646 | 0.1562 | - | - |
| 2.3396 | 8647 | 0.1626 | - | - |
| 2.3398 | 8648 | 0.2099 | - | - |
| 2.3401 | 8649 | 0.1616 | - | - |
| 2.3404 | 8650 | 0.1536 | - | - |
| 2.3406 | 8651 | 0.1904 | - | - |
| 2.3409 | 8652 | 0.1648 | - | - |
| 2.3412 | 8653 | 0.1353 | - | - |
| 2.3415 | 8654 | 0.181 | - | - |
| 2.3417 | 8655 | 0.2018 | - | - |
| 2.3420 | 8656 | 0.1325 | - | - |
| 2.3423 | 8657 | 0.163 | - | - |
| 2.3425 | 8658 | 0.1326 | - | - |
| 2.3428 | 8659 | 0.1562 | - | - |
| 2.3431 | 8660 | 0.1432 | - | - |
| 2.3433 | 8661 | 0.1824 | - | - |
| 2.3436 | 8662 | 0.1587 | - | - |
| 2.3439 | 8663 | 0.2061 | - | - |
| 2.3442 | 8664 | 0.2065 | - | - |
| 2.3444 | 8665 | 0.1782 | - | - |
| 2.3447 | 8666 | 0.2395 | - | - |
| 2.3450 | 8667 | 0.2044 | - | - |
| 2.3452 | 8668 | 0.1755 | - | - |
| 2.3455 | 8669 | 0.1639 | - | - |
| 2.3458 | 8670 | 0.1614 | - | - |
| 2.3460 | 8671 | 0.1962 | - | - |
| 2.3463 | 8672 | 0.1588 | - | - |
| 2.3466 | 8673 | 0.1666 | - | - |
| 2.3469 | 8674 | 0.1034 | - | - |
| 2.3471 | 8675 | 0.2166 | - | - |
| 2.3474 | 8676 | 0.1995 | - | - |
| 2.3477 | 8677 | 0.1803 | - | - |
| 2.3479 | 8678 | 0.205 | - | - |
| 2.3482 | 8679 | 0.1659 | - | - |
| 2.3485 | 8680 | 0.1363 | - | - |
| 2.3488 | 8681 | 0.1416 | - | - |
| 2.3490 | 8682 | 0.1422 | - | - |
| 2.3493 | 8683 | 0.1939 | - | - |
| 2.3496 | 8684 | 0.1803 | - | - |
| 2.3498 | 8685 | 0.2399 | - | - |
| 2.3501 | 8686 | 0.1854 | - | - |
| 2.3504 | 8687 | 0.2012 | - | - |
| 2.3506 | 8688 | 0.1715 | - | - |
| 2.3509 | 8689 | 0.1603 | - | - |
| 2.3512 | 8690 | 0.1702 | - | - |
| 2.3515 | 8691 | 0.1959 | - | - |
| 2.3517 | 8692 | 0.1962 | - | - |
| 2.3520 | 8693 | 0.1756 | - | - |
| 2.3523 | 8694 | 0.1308 | - | - |
| 2.3525 | 8695 | 0.1436 | - | - |
| 2.3528 | 8696 | 0.167 | - | - |
| 2.3531 | 8697 | 0.139 | - | - |
| 2.3534 | 8698 | 0.1774 | - | - |
| 2.3536 | 8699 | 0.218 | - | - |
| 2.3539 | 8700 | 0.1596 | - | - |
| 2.3542 | 8701 | 0.1916 | - | - |
| 2.3544 | 8702 | 0.2255 | - | - |
| 2.3547 | 8703 | 0.1993 | - | - |
| 2.3550 | 8704 | 0.1733 | - | - |
| 2.3552 | 8705 | 0.1379 | - | - |
| 2.3555 | 8706 | 0.15 | - | - |
| 2.3558 | 8707 | 0.2338 | - | - |
| 2.3561 | 8708 | 0.2528 | - | - |
| 2.3563 | 8709 | 0.2646 | - | - |
| 2.3566 | 8710 | 0.1785 | - | - |
| 2.3569 | 8711 | 0.189 | - | - |
| 2.3571 | 8712 | 0.2629 | - | - |
| 2.3574 | 8713 | 0.1356 | - | - |
| 2.3577 | 8714 | 0.1776 | - | - |
| 2.3580 | 8715 | 0.2535 | - | - |
| 2.3582 | 8716 | 0.2775 | - | - |
| 2.3585 | 8717 | 0.2135 | - | - |
| 2.3588 | 8718 | 0.1916 | - | - |
| 2.3590 | 8719 | 0.1766 | - | - |
| 2.3593 | 8720 | 0.2487 | - | - |
| 2.3596 | 8721 | 0.1504 | - | - |
| 2.3598 | 8722 | 0.265 | - | - |
| 2.3601 | 8723 | 0.2963 | - | - |
| 2.3604 | 8724 | 0.1862 | - | - |
| 2.3607 | 8725 | 0.174 | - | - |
| 2.3609 | 8726 | 0.143 | - | - |
| 2.3612 | 8727 | 0.1883 | - | - |
| 2.3615 | 8728 | 0.2033 | - | - |
| 2.3617 | 8729 | 0.1239 | - | - |
| 2.3620 | 8730 | 0.225 | - | - |
| 2.3623 | 8731 | 0.2446 | - | - |
| 2.3626 | 8732 | 0.2426 | - | - |
| 2.3628 | 8733 | 0.2018 | - | - |
| 2.3631 | 8734 | 0.1536 | - | - |
| 2.3634 | 8735 | 0.2307 | - | - |
| 2.3636 | 8736 | 0.2998 | - | - |
| 2.3639 | 8737 | 0.2127 | - | - |
| 2.3642 | 8738 | 0.1865 | - | - |
| 2.3644 | 8739 | 0.1595 | - | - |
| 2.3647 | 8740 | 0.154 | - | - |
| 2.3650 | 8741 | 0.1713 | - | - |
| 2.3653 | 8742 | 0.2225 | - | - |
| 2.3655 | 8743 | 0.1752 | - | - |
| 2.3658 | 8744 | 0.1586 | - | - |
| 2.3661 | 8745 | 0.2066 | - | - |
| 2.3663 | 8746 | 0.1952 | - | - |
| 2.3666 | 8747 | 0.1371 | - | - |
| 2.3669 | 8748 | 0.155 | - | - |
| 2.3672 | 8749 | 0.1435 | - | - |
| 2.3674 | 8750 | 0.1709 | - | - |
| 2.3677 | 8751 | 0.2272 | - | - |
| 2.3680 | 8752 | 0.2366 | - | - |
| 2.3682 | 8753 | 0.2118 | - | - |
| 2.3685 | 8754 | 0.1821 | - | - |
| 2.3688 | 8755 | 0.1303 | - | - |
| 2.3690 | 8756 | 0.1717 | - | - |
| 2.3693 | 8757 | 0.2345 | - | - |
| 2.3696 | 8758 | 0.2524 | - | - |
| 2.3699 | 8759 | 0.1825 | - | - |
| 2.3701 | 8760 | 0.1603 | - | - |
| 2.3704 | 8761 | 0.1325 | - | - |
| 2.3707 | 8762 | 0.1942 | - | - |
| 2.3709 | 8763 | 0.2632 | - | - |
| 2.3712 | 8764 | 0.2648 | - | - |
| 2.3715 | 8765 | 0.2912 | - | - |
| 2.3718 | 8766 | 0.2259 | - | - |
| 2.3720 | 8767 | 0.2043 | - | - |
| 2.3723 | 8768 | 0.2045 | - | - |
| 2.3726 | 8769 | 0.2328 | - | - |
| 2.3728 | 8770 | 0.2156 | - | - |
| 2.3731 | 8771 | 0.2409 | - | - |
| 2.3734 | 8772 | 0.2406 | - | - |
| 2.3736 | 8773 | 0.2056 | - | - |
| 2.3739 | 8774 | 0.1716 | - | - |
| 2.3742 | 8775 | 0.1973 | - | - |
| 2.3745 | 8776 | 0.2103 | - | - |
| 2.3747 | 8777 | 0.1669 | - | - |
| 2.375 | 8778 | 0.1932 | - | - |
| 2.3753 | 8779 | 0.1575 | - | - |
| 2.3755 | 8780 | 0.1648 | - | - |
| 2.3758 | 8781 | 0.1995 | - | - |
| 2.3761 | 8782 | 0.1703 | - | - |
| 2.3764 | 8783 | 0.1796 | - | - |
| 2.3766 | 8784 | 0.1782 | - | - |
| 2.3769 | 8785 | 0.1143 | - | - |
| 2.3772 | 8786 | 0.1029 | - | - |
| 2.3774 | 8787 | 0.2476 | - | - |
| 2.3777 | 8788 | 0.1832 | - | - |
| 2.3780 | 8789 | 0.1994 | - | - |
| 2.3782 | 8790 | 0.1598 | - | - |
| 2.3785 | 8791 | 0.1735 | - | - |
| 2.3788 | 8792 | 0.1959 | - | - |
| 2.3791 | 8793 | 0.152 | - | - |
| 2.3793 | 8794 | 0.1659 | - | - |
| 2.3796 | 8795 | 0.1498 | - | - |
| 2.3799 | 8796 | 0.1861 | - | - |
| 2.3801 | 8797 | 0.1491 | - | - |
| 2.3804 | 8798 | 0.1621 | - | - |
| 2.3807 | 8799 | 0.1524 | - | - |
| 2.3810 | 8800 | 0.1929 | - | - |
| 2.3812 | 8801 | 0.1688 | - | - |
| 2.3815 | 8802 | 0.1601 | - | - |
| 2.3818 | 8803 | 0.3239 | - | - |
| 2.3820 | 8804 | 0.2095 | - | - |
| 2.3823 | 8805 | 0.1558 | - | - |
| 2.3826 | 8806 | 0.2034 | - | - |
| 2.3828 | 8807 | 0.1856 | - | - |
| 2.3831 | 8808 | 0.1714 | - | - |
| 2.3834 | 8809 | 0.1856 | - | - |
| 2.3837 | 8810 | 0.1823 | - | - |
| 2.3839 | 8811 | 0.2066 | - | - |
| 2.3842 | 8812 | 0.2501 | - | - |
| 2.3845 | 8813 | 0.1789 | - | - |
| 2.3847 | 8814 | 0.168 | - | - |
| 2.3850 | 8815 | 0.1863 | - | - |
| 2.3853 | 8816 | 0.1977 | - | - |
| 2.3856 | 8817 | 0.1979 | - | - |
| 2.3858 | 8818 | 0.1797 | - | - |
| 2.3861 | 8819 | 0.2738 | - | - |
| 2.3864 | 8820 | 0.2249 | - | - |
| 2.3866 | 8821 | 0.2268 | - | - |
| 2.3869 | 8822 | 0.2501 | - | - |
| 2.3872 | 8823 | 0.1718 | - | - |
| 2.3874 | 8824 | 0.193 | - | - |
| 2.3877 | 8825 | 0.192 | - | - |
| 2.3880 | 8826 | 0.1742 | - | - |
| 2.3883 | 8827 | 0.2095 | - | - |
| 2.3885 | 8828 | 0.1538 | - | - |
| 2.3888 | 8829 | 0.1597 | - | - |
| 2.3891 | 8830 | 0.1999 | - | - |
| 2.3893 | 8831 | 0.1424 | - | - |
| 2.3896 | 8832 | 0.1897 | - | - |
| 2.3899 | 8833 | 0.2001 | - | - |
| 2.3902 | 8834 | 0.1388 | - | - |
| 2.3904 | 8835 | 0.2168 | - | - |
| 2.3907 | 8836 | 0.1667 | - | - |
| 2.3910 | 8837 | 0.2635 | - | - |
| 2.3912 | 8838 | 0.1996 | - | - |
| 2.3915 | 8839 | 0.2516 | - | - |
| 2.3918 | 8840 | 0.182 | - | - |
| 2.3920 | 8841 | 0.2177 | - | - |
| 2.3923 | 8842 | 0.2278 | - | - |
| 2.3926 | 8843 | 0.2385 | - | - |
| 2.3929 | 8844 | 0.1667 | - | - |
| 2.3931 | 8845 | 0.2559 | - | - |
| 2.3934 | 8846 | 0.1381 | - | - |
| 2.3937 | 8847 | 0.1411 | - | - |
| 2.3939 | 8848 | 0.1463 | - | - |
| 2.3942 | 8849 | 0.1427 | - | - |
| 2.3945 | 8850 | 0.1992 | - | - |
| 2.3948 | 8851 | 0.2122 | - | - |
| 2.3950 | 8852 | 0.2182 | - | - |
| 2.3953 | 8853 | 0.2156 | - | - |
| 2.3956 | 8854 | 0.2232 | - | - |
| 2.3958 | 8855 | 0.1415 | - | - |
| 2.3961 | 8856 | 0.214 | - | - |
| 2.3964 | 8857 | 0.2035 | - | - |
| 2.3966 | 8858 | 0.1691 | - | - |
| 2.3969 | 8859 | 0.1813 | - | - |
| 2.3972 | 8860 | 0.1771 | - | - |
| 2.3975 | 8861 | 0.1903 | - | - |
| 2.3977 | 8862 | 0.213 | - | - |
| 2.3980 | 8863 | 0.1817 | - | - |
| 2.3983 | 8864 | 0.2236 | - | - |
| 2.3985 | 8865 | 0.1587 | - | - |
| 2.3988 | 8866 | 0.2211 | - | - |
| 2.3991 | 8867 | 0.2063 | - | - |
| 2.3994 | 8868 | 0.2285 | - | - |
| 2.3996 | 8869 | 0.1781 | - | - |
| 2.3999 | 8870 | 0.1698 | - | - |
| 2.4002 | 8871 | 0.2687 | - | - |
| 2.4004 | 8872 | 0.1744 | - | - |
| 2.4007 | 8873 | 0.1577 | - | - |
| 2.4010 | 8874 | 0.1747 | - | - |
| 2.4012 | 8875 | 0.1596 | - | - |
| 2.4015 | 8876 | 0.2094 | - | - |
| 2.4018 | 8877 | 0.2269 | - | - |
| 2.4021 | 8878 | 0.2017 | - | - |
| 2.4023 | 8879 | 0.1689 | - | - |
| 2.4026 | 8880 | 0.1649 | - | - |
| 2.4029 | 8881 | 0.1656 | - | - |
| 2.4031 | 8882 | 0.1161 | - | - |
| 2.4034 | 8883 | 0.1901 | - | - |
| 2.4037 | 8884 | 0.1921 | - | - |
| 2.4040 | 8885 | 0.2393 | - | - |
| 2.4042 | 8886 | 0.177 | - | - |
| 2.4045 | 8887 | 0.2139 | - | - |
| 2.4048 | 8888 | 0.1426 | - | - |
| 2.4050 | 8889 | 0.1474 | - | - |
| 2.4053 | 8890 | 0.1674 | - | - |
| 2.4056 | 8891 | 0.1608 | - | - |
| 2.4058 | 8892 | 0.1965 | - | - |
| 2.4061 | 8893 | 0.1892 | - | - |
| 2.4064 | 8894 | 0.2812 | - | - |
| 2.4067 | 8895 | 0.22 | - | - |
| 2.4069 | 8896 | 0.1829 | - | - |
| 2.4072 | 8897 | 0.2434 | - | - |
| 2.4075 | 8898 | 0.146 | - | - |
| 2.4077 | 8899 | 0.2358 | - | - |
| 2.4080 | 8900 | 0.1913 | - | - |
| 2.4083 | 8901 | 0.2159 | - | - |
| 2.4085 | 8902 | 0.1852 | - | - |
| 2.4088 | 8903 | 0.2539 | - | - |
| 2.4091 | 8904 | 0.2202 | - | - |
| 2.4094 | 8905 | 0.1857 | - | - |
| 2.4096 | 8906 | 0.155 | - | - |
| 2.4099 | 8907 | 0.1459 | - | - |
| 2.4102 | 8908 | 0.1269 | - | - |
| 2.4104 | 8909 | 0.1712 | - | - |
| 2.4107 | 8910 | 0.1919 | - | - |
| 2.4110 | 8911 | 0.1332 | - | - |
| 2.4113 | 8912 | 0.1331 | - | - |
| 2.4115 | 8913 | 0.1937 | - | - |
| 2.4118 | 8914 | 0.2101 | - | - |
| 2.4121 | 8915 | 0.2714 | - | - |
| 2.4123 | 8916 | 0.2043 | - | - |
| 2.4126 | 8917 | 0.2033 | - | - |
| 2.4129 | 8918 | 0.2822 | - | - |
| 2.4131 | 8919 | 0.173 | - | - |
| 2.4134 | 8920 | 0.1442 | - | - |
| 2.4137 | 8921 | 0.1704 | - | - |
| 2.4140 | 8922 | 0.1836 | - | - |
| 2.4142 | 8923 | 0.2269 | - | - |
| 2.4145 | 8924 | 0.2103 | - | - |
| 2.4148 | 8925 | 0.1463 | - | - |
| 2.4150 | 8926 | 0.1868 | - | - |
| 2.4153 | 8927 | 0.1859 | - | - |
| 2.4156 | 8928 | 0.1515 | - | - |
| 2.4159 | 8929 | 0.1118 | - | - |
| 2.4161 | 8930 | 0.2596 | - | - |
| 2.4164 | 8931 | 0.2458 | - | - |
| 2.4167 | 8932 | 0.1688 | - | - |
| 2.4169 | 8933 | 0.1666 | - | - |
| 2.4172 | 8934 | 0.1877 | - | - |
| 2.4175 | 8935 | 0.2149 | - | - |
| 2.4177 | 8936 | 0.1852 | - | - |
| 2.4180 | 8937 | 0.2179 | - | - |
| 2.4183 | 8938 | 0.1816 | - | - |
| 2.4186 | 8939 | 0.1827 | - | - |
| 2.4188 | 8940 | 0.2709 | - | - |
| 2.4191 | 8941 | 0.2453 | - | - |
| 2.4194 | 8942 | 0.1375 | - | - |
| 2.4196 | 8943 | 0.1473 | - | - |
| 2.4199 | 8944 | 0.2855 | - | - |
| 2.4202 | 8945 | 0.2015 | - | - |
| 2.4205 | 8946 | 0.1627 | - | - |
| 2.4207 | 8947 | 0.1626 | - | - |
| 2.4210 | 8948 | 0.187 | - | - |
| 2.4213 | 8949 | 0.1975 | - | - |
| 2.4215 | 8950 | 0.1696 | - | - |
| 2.4218 | 8951 | 0.2215 | - | - |
| 2.4221 | 8952 | 0.1824 | - | - |
| 2.4223 | 8953 | 0.1643 | - | - |
| 2.4226 | 8954 | 0.2096 | - | - |
| 2.4229 | 8955 | 0.1787 | - | - |
| 2.4232 | 8956 | 0.181 | - | - |
| 2.4234 | 8957 | 0.1801 | - | - |
| 2.4237 | 8958 | 0.2088 | - | - |
| 2.4240 | 8959 | 0.1477 | - | - |
| 2.4242 | 8960 | 0.1331 | - | - |
| 2.4245 | 8961 | 0.2063 | - | - |
| 2.4248 | 8962 | 0.1981 | - | - |
| 2.4251 | 8963 | 0.2291 | - | - |
| 2.4253 | 8964 | 0.1636 | - | - |
| 2.4256 | 8965 | 0.2328 | - | - |
| 2.4259 | 8966 | 0.1599 | - | - |
| 2.4261 | 8967 | 0.1983 | - | - |
| 2.4264 | 8968 | 0.2054 | - | - |
| 2.4267 | 8969 | 0.1493 | - | - |
| 2.4269 | 8970 | 0.1514 | - | - |
| 2.4272 | 8971 | 0.2162 | - | - |
| 2.4275 | 8972 | 0.1509 | - | - |
| 2.4278 | 8973 | 0.1701 | - | - |
| 2.4280 | 8974 | 0.1526 | - | - |
| 2.4283 | 8975 | 0.1898 | - | - |
| 2.4286 | 8976 | 0.1277 | - | - |
| 2.4288 | 8977 | 0.2147 | - | - |
| 2.4291 | 8978 | 0.2029 | - | - |
| 2.4294 | 8979 | 0.1846 | - | - |
| 2.4297 | 8980 | 0.1474 | - | - |
| 2.4299 | 8981 | 0.1601 | - | - |
| 2.4302 | 8982 | 0.1884 | - | - |
| 2.4305 | 8983 | 0.1535 | - | - |
| 2.4307 | 8984 | 0.199 | - | - |
| 2.4310 | 8985 | 0.1369 | - | - |
| 2.4313 | 8986 | 0.1596 | - | - |
| 2.4315 | 8987 | 0.1917 | - | - |
| 2.4318 | 8988 | 0.1519 | - | - |
| 2.4321 | 8989 | 0.1179 | - | - |
| 2.4324 | 8990 | 0.2087 | - | - |
| 2.4326 | 8991 | 0.1752 | - | - |
| 2.4329 | 8992 | 0.1962 | - | - |
| 2.4332 | 8993 | 0.1798 | - | - |
| 2.4334 | 8994 | 0.1453 | - | - |
| 2.4337 | 8995 | 0.189 | - | - |
| 2.4340 | 8996 | 0.2584 | - | - |
| 2.4343 | 8997 | 0.1696 | - | - |
| 2.4345 | 8998 | 0.1598 | - | - |
| 2.4348 | 8999 | 0.1925 | - | - |
| 2.4351 | 9000 | 0.1777 | 0.2020 | 0.9507 |
| 2.4353 | 9001 | 0.1732 | - | - |
| 2.4356 | 9002 | 0.1942 | - | - |
| 2.4359 | 9003 | 0.1594 | - | - |
| 2.4361 | 9004 | 0.2036 | - | - |
| 2.4364 | 9005 | 0.1487 | - | - |
| 2.4367 | 9006 | 0.1627 | - | - |
| 2.4370 | 9007 | 0.155 | - | - |
| 2.4372 | 9008 | 0.1066 | - | - |
| 2.4375 | 9009 | 0.1231 | - | - |
| 2.4378 | 9010 | 0.2142 | - | - |
| 2.4380 | 9011 | 0.2037 | - | - |
| 2.4383 | 9012 | 0.2303 | - | - |
| 2.4386 | 9013 | 0.2101 | - | - |
| 2.4389 | 9014 | 0.1789 | - | - |
| 2.4391 | 9015 | 0.1268 | - | - |
| 2.4394 | 9016 | 0.1564 | - | - |
| 2.4397 | 9017 | 0.1755 | - | - |
| 2.4399 | 9018 | 0.1794 | - | - |
| 2.4402 | 9019 | 0.1566 | - | - |
| 2.4405 | 9020 | 0.1578 | - | - |
| 2.4407 | 9021 | 0.1559 | - | - |
| 2.4410 | 9022 | 0.2148 | - | - |
| 2.4413 | 9023 | 0.2361 | - | - |
| 2.4416 | 9024 | 0.2064 | - | - |
| 2.4418 | 9025 | 0.2339 | - | - |
| 2.4421 | 9026 | 0.1752 | - | - |
| 2.4424 | 9027 | 0.1928 | - | - |
| 2.4426 | 9028 | 0.1663 | - | - |
| 2.4429 | 9029 | 0.1533 | - | - |
| 2.4432 | 9030 | 0.2194 | - | - |
| 2.4435 | 9031 | 0.1816 | - | - |
| 2.4437 | 9032 | 0.2291 | - | - |
| 2.4440 | 9033 | 0.1757 | - | - |
| 2.4443 | 9034 | 0.163 | - | - |
| 2.4445 | 9035 | 0.156 | - | - |
| 2.4448 | 9036 | 0.1969 | - | - |
| 2.4451 | 9037 | 0.1952 | - | - |
| 2.4453 | 9038 | 0.1708 | - | - |
| 2.4456 | 9039 | 0.1852 | - | - |
| 2.4459 | 9040 | 0.1854 | - | - |
| 2.4462 | 9041 | 0.1641 | - | - |
| 2.4464 | 9042 | 0.2354 | - | - |
| 2.4467 | 9043 | 0.1693 | - | - |
| 2.4470 | 9044 | 0.1706 | - | - |
| 2.4472 | 9045 | 0.1593 | - | - |
| 2.4475 | 9046 | 0.1358 | - | - |
| 2.4478 | 9047 | 0.1734 | - | - |
| 2.4481 | 9048 | 0.1638 | - | - |
| 2.4483 | 9049 | 0.2241 | - | - |
| 2.4486 | 9050 | 0.1927 | - | - |
| 2.4489 | 9051 | 0.1625 | - | - |
| 2.4491 | 9052 | 0.1412 | - | - |
| 2.4494 | 9053 | 0.1931 | - | - |
| 2.4497 | 9054 | 0.1538 | - | - |
| 2.4499 | 9055 | 0.1949 | - | - |
| 2.4502 | 9056 | 0.2068 | - | - |
| 2.4505 | 9057 | 0.1717 | - | - |
| 2.4508 | 9058 | 0.1982 | - | - |
| 2.4510 | 9059 | 0.1565 | - | - |
| 2.4513 | 9060 | 0.1157 | - | - |
| 2.4516 | 9061 | 0.1225 | - | - |
| 2.4518 | 9062 | 0.1552 | - | - |
| 2.4521 | 9063 | 0.1364 | - | - |
| 2.4524 | 9064 | 0.145 | - | - |
| 2.4527 | 9065 | 0.1923 | - | - |
| 2.4529 | 9066 | 0.192 | - | - |
| 2.4532 | 9067 | 0.1713 | - | - |
| 2.4535 | 9068 | 0.2157 | - | - |
| 2.4537 | 9069 | 0.1785 | - | - |
| 2.4540 | 9070 | 0.1658 | - | - |
| 2.4543 | 9071 | 0.2109 | - | - |
| 2.4545 | 9072 | 0.1775 | - | - |
| 2.4548 | 9073 | 0.2275 | - | - |
| 2.4551 | 9074 | 0.2266 | - | - |
| 2.4554 | 9075 | 0.2086 | - | - |
| 2.4556 | 9076 | 0.2074 | - | - |
| 2.4559 | 9077 | 0.1996 | - | - |
| 2.4562 | 9078 | 0.207 | - | - |
| 2.4564 | 9079 | 0.2261 | - | - |
| 2.4567 | 9080 | 0.1524 | - | - |
| 2.4570 | 9081 | 0.1165 | - | - |
| 2.4573 | 9082 | 0.1653 | - | - |
| 2.4575 | 9083 | 0.1791 | - | - |
| 2.4578 | 9084 | 0.125 | - | - |
| 2.4581 | 9085 | 0.1811 | - | - |
| 2.4583 | 9086 | 0.1451 | - | - |
| 2.4586 | 9087 | 0.1553 | - | - |
| 2.4589 | 9088 | 0.2294 | - | - |
| 2.4591 | 9089 | 0.1507 | - | - |
| 2.4594 | 9090 | 0.1951 | - | - |
| 2.4597 | 9091 | 0.1785 | - | - |
| 2.4600 | 9092 | 0.1706 | - | - |
| 2.4602 | 9093 | 0.2158 | - | - |
| 2.4605 | 9094 | 0.2004 | - | - |
| 2.4608 | 9095 | 0.1655 | - | - |
| 2.4610 | 9096 | 0.1329 | - | - |
| 2.4613 | 9097 | 0.1583 | - | - |
| 2.4616 | 9098 | 0.1633 | - | - |
| 2.4619 | 9099 | 0.1469 | - | - |
| 2.4621 | 9100 | 0.173 | - | - |
| 2.4624 | 9101 | 0.1324 | - | - |
| 2.4627 | 9102 | 0.1063 | - | - |
| 2.4629 | 9103 | 0.1688 | - | - |
| 2.4632 | 9104 | 0.169 | - | - |
| 2.4635 | 9105 | 0.1666 | - | - |
| 2.4637 | 9106 | 0.1219 | - | - |
| 2.4640 | 9107 | 0.2367 | - | - |
| 2.4643 | 9108 | 0.1647 | - | - |
| 2.4646 | 9109 | 0.1834 | - | - |
| 2.4648 | 9110 | 0.1823 | - | - |
| 2.4651 | 9111 | 0.2046 | - | - |
| 2.4654 | 9112 | 0.2404 | - | - |
| 2.4656 | 9113 | 0.1864 | - | - |
| 2.4659 | 9114 | 0.2713 | - | - |
| 2.4662 | 9115 | 0.2202 | - | - |
| 2.4665 | 9116 | 0.2055 | - | - |
| 2.4667 | 9117 | 0.245 | - | - |
| 2.4670 | 9118 | 0.1604 | - | - |
| 2.4673 | 9119 | 0.1332 | - | - |
| 2.4675 | 9120 | 0.2521 | - | - |
| 2.4678 | 9121 | 0.1437 | - | - |
| 2.4681 | 9122 | 0.1645 | - | - |
| 2.4683 | 9123 | 0.2104 | - | - |
| 2.4686 | 9124 | 0.1547 | - | - |
| 2.4689 | 9125 | 0.1945 | - | - |
| 2.4692 | 9126 | 0.1233 | - | - |
| 2.4694 | 9127 | 0.1231 | - | - |
| 2.4697 | 9128 | 0.1318 | - | - |
| 2.4700 | 9129 | 0.2157 | - | - |
| 2.4702 | 9130 | 0.1852 | - | - |
| 2.4705 | 9131 | 0.15 | - | - |
| 2.4708 | 9132 | 0.2086 | - | - |
| 2.4710 | 9133 | 0.1078 | - | - |
| 2.4713 | 9134 | 0.2966 | - | - |
| 2.4716 | 9135 | 0.1838 | - | - |
| 2.4719 | 9136 | 0.2807 | - | - |
| 2.4721 | 9137 | 0.2018 | - | - |
| 2.4724 | 9138 | 0.218 | - | - |
| 2.4727 | 9139 | 0.2136 | - | - |
| 2.4729 | 9140 | 0.1998 | - | - |
| 2.4732 | 9141 | 0.189 | - | - |
| 2.4735 | 9142 | 0.171 | - | - |
| 2.4738 | 9143 | 0.1395 | - | - |
| 2.4740 | 9144 | 0.111 | - | - |
| 2.4743 | 9145 | 0.2405 | - | - |
| 2.4746 | 9146 | 0.1642 | - | - |
| 2.4748 | 9147 | 0.1096 | - | - |
| 2.4751 | 9148 | 0.2213 | - | - |
| 2.4754 | 9149 | 0.1361 | - | - |
| 2.4756 | 9150 | 0.1716 | - | - |
| 2.4759 | 9151 | 0.327 | - | - |
| 2.4762 | 9152 | 0.1661 | - | - |
| 2.4765 | 9153 | 0.2277 | - | - |
| 2.4767 | 9154 | 0.1592 | - | - |
| 2.4770 | 9155 | 0.1536 | - | - |
| 2.4773 | 9156 | 0.2192 | - | - |
| 2.4775 | 9157 | 0.1806 | - | - |
| 2.4778 | 9158 | 0.2129 | - | - |
| 2.4781 | 9159 | 0.133 | - | - |
| 2.4784 | 9160 | 0.1903 | - | - |
| 2.4786 | 9161 | 0.2082 | - | - |
| 2.4789 | 9162 | 0.1365 | - | - |
| 2.4792 | 9163 | 0.2089 | - | - |
| 2.4794 | 9164 | 0.1939 | - | - |
| 2.4797 | 9165 | 0.1707 | - | - |
| 2.4800 | 9166 | 0.196 | - | - |
| 2.4802 | 9167 | 0.1771 | - | - |
| 2.4805 | 9168 | 0.1293 | - | - |
| 2.4808 | 9169 | 0.1443 | - | - |
| 2.4811 | 9170 | 0.195 | - | - |
| 2.4813 | 9171 | 0.1577 | - | - |
| 2.4816 | 9172 | 0.1538 | - | - |
| 2.4819 | 9173 | 0.1609 | - | - |
| 2.4821 | 9174 | 0.2246 | - | - |
| 2.4824 | 9175 | 0.2308 | - | - |
| 2.4827 | 9176 | 0.217 | - | - |
| 2.4830 | 9177 | 0.153 | - | - |
| 2.4832 | 9178 | 0.1537 | - | - |
| 2.4835 | 9179 | 0.2042 | - | - |
| 2.4838 | 9180 | 0.158 | - | - |
| 2.4840 | 9181 | 0.2084 | - | - |
| 2.4843 | 9182 | 0.1726 | - | - |
| 2.4846 | 9183 | 0.202 | - | - |
| 2.4848 | 9184 | 0.1644 | - | - |
| 2.4851 | 9185 | 0.1644 | - | - |
| 2.4854 | 9186 | 0.2438 | - | - |
| 2.4857 | 9187 | 0.1776 | - | - |
| 2.4859 | 9188 | 0.1181 | - | - |
| 2.4862 | 9189 | 0.2321 | - | - |
| 2.4865 | 9190 | 0.2358 | - | - |
| 2.4867 | 9191 | 0.1377 | - | - |
| 2.4870 | 9192 | 0.1408 | - | - |
| 2.4873 | 9193 | 0.171 | - | - |
| 2.4876 | 9194 | 0.2065 | - | - |
| 2.4878 | 9195 | 0.1304 | - | - |
| 2.4881 | 9196 | 0.1666 | - | - |
| 2.4884 | 9197 | 0.1929 | - | - |
| 2.4886 | 9198 | 0.1432 | - | - |
| 2.4889 | 9199 | 0.2071 | - | - |
| 2.4892 | 9200 | 0.1413 | - | - |
| 2.4894 | 9201 | 0.2333 | - | - |
| 2.4897 | 9202 | 0.1793 | - | - |
| 2.4900 | 9203 | 0.1714 | - | - |
| 2.4903 | 9204 | 0.2257 | - | - |
| 2.4905 | 9205 | 0.1691 | - | - |
| 2.4908 | 9206 | 0.1746 | - | - |
| 2.4911 | 9207 | 0.1966 | - | - |
| 2.4913 | 9208 | 0.2395 | - | - |
| 2.4916 | 9209 | 0.1321 | - | - |
| 2.4919 | 9210 | 0.235 | - | - |
| 2.4922 | 9211 | 0.1674 | - | - |
| 2.4924 | 9212 | 0.1876 | - | - |
| 2.4927 | 9213 | 0.1527 | - | - |
| 2.4930 | 9214 | 0.1923 | - | - |
| 2.4932 | 9215 | 0.1764 | - | - |
| 2.4935 | 9216 | 0.2073 | - | - |
| 2.4938 | 9217 | 0.2064 | - | - |
| 2.4940 | 9218 | 0.2411 | - | - |
| 2.4943 | 9219 | 0.1468 | - | - |
| 2.4946 | 9220 | 0.2097 | - | - |
| 2.4949 | 9221 | 0.1753 | - | - |
| 2.4951 | 9222 | 0.2581 | - | - |
| 2.4954 | 9223 | 0.1816 | - | - |
| 2.4957 | 9224 | 0.1077 | - | - |
| 2.4959 | 9225 | 0.1648 | - | - |
| 2.4962 | 9226 | 0.1664 | - | - |
| 2.4965 | 9227 | 0.1565 | - | - |
| 2.4968 | 9228 | 0.1396 | - | - |
| 2.4970 | 9229 | 0.1868 | - | - |
| 2.4973 | 9230 | 0.2048 | - | - |
| 2.4976 | 9231 | 0.1857 | - | - |
| 2.4978 | 9232 | 0.1859 | - | - |
| 2.4981 | 9233 | 0.2012 | - | - |
| 2.4984 | 9234 | 0.2001 | - | - |
| 2.4986 | 9235 | 0.2091 | - | - |
| 2.4989 | 9236 | 0.1529 | - | - |
| 2.4992 | 9237 | 0.2236 | - | - |
| 2.4995 | 9238 | 0.1149 | - | - |
| 2.4997 | 9239 | 0.2315 | - | - |
| 2.5 | 9240 | 0.207 | - | - |
| 2.5003 | 9241 | 0.194 | - | - |
| 2.5005 | 9242 | 0.1963 | - | - |
| 2.5008 | 9243 | 0.2004 | - | - |
| 2.5011 | 9244 | 0.1906 | - | - |
| 2.5014 | 9245 | 0.2441 | - | - |
| 2.5016 | 9246 | 0.221 | - | - |
| 2.5019 | 9247 | 0.2272 | - | - |
| 2.5022 | 9248 | 0.1373 | - | - |
| 2.5024 | 9249 | 0.1574 | - | - |
| 2.5027 | 9250 | 0.2241 | - | - |
| 2.5030 | 9251 | 0.1658 | - | - |
| 2.5032 | 9252 | 0.1961 | - | - |
| 2.5035 | 9253 | 0.1396 | - | - |
| 2.5038 | 9254 | 0.1755 | - | - |
| 2.5041 | 9255 | 0.131 | - | - |
| 2.5043 | 9256 | 0.1567 | - | - |
| 2.5046 | 9257 | 0.1523 | - | - |
| 2.5049 | 9258 | 0.1362 | - | - |
| 2.5051 | 9259 | 0.1642 | - | - |
| 2.5054 | 9260 | 0.2013 | - | - |
| 2.5057 | 9261 | 0.1809 | - | - |
| 2.5060 | 9262 | 0.1864 | - | - |
| 2.5062 | 9263 | 0.1343 | - | - |
| 2.5065 | 9264 | 0.2132 | - | - |
| 2.5068 | 9265 | 0.1948 | - | - |
| 2.5070 | 9266 | 0.2097 | - | - |
| 2.5073 | 9267 | 0.123 | - | - |
| 2.5076 | 9268 | 0.2731 | - | - |
| 2.5078 | 9269 | 0.1316 | - | - |
| 2.5081 | 9270 | 0.1871 | - | - |
| 2.5084 | 9271 | 0.2106 | - | - |
| 2.5087 | 9272 | 0.1142 | - | - |
| 2.5089 | 9273 | 0.2902 | - | - |
| 2.5092 | 9274 | 0.1839 | - | - |
| 2.5095 | 9275 | 0.2573 | - | - |
| 2.5097 | 9276 | 0.172 | - | - |
| 2.5100 | 9277 | 0.2022 | - | - |
| 2.5103 | 9278 | 0.2018 | - | - |
| 2.5106 | 9279 | 0.1552 | - | - |
| 2.5108 | 9280 | 0.206 | - | - |
| 2.5111 | 9281 | 0.2057 | - | - |
| 2.5114 | 9282 | 0.1895 | - | - |
| 2.5116 | 9283 | 0.2775 | - | - |
| 2.5119 | 9284 | 0.2244 | - | - |
| 2.5122 | 9285 | 0.2034 | - | - |
| 2.5124 | 9286 | 0.1647 | - | - |
| 2.5127 | 9287 | 0.1759 | - | - |
| 2.5130 | 9288 | 0.1276 | - | - |
| 2.5133 | 9289 | 0.1323 | - | - |
| 2.5135 | 9290 | 0.1602 | - | - |
| 2.5138 | 9291 | 0.1616 | - | - |
| 2.5141 | 9292 | 0.1534 | - | - |
| 2.5143 | 9293 | 0.1791 | - | - |
| 2.5146 | 9294 | 0.1104 | - | - |
| 2.5149 | 9295 | 0.1815 | - | - |
| 2.5152 | 9296 | 0.1872 | - | - |
| 2.5154 | 9297 | 0.1724 | - | - |
| 2.5157 | 9298 | 0.1482 | - | - |
| 2.5160 | 9299 | 0.129 | - | - |
| 2.5162 | 9300 | 0.154 | - | - |
| 2.5165 | 9301 | 0.1354 | - | - |
| 2.5168 | 9302 | 0.1832 | - | - |
| 2.5170 | 9303 | 0.1733 | - | - |
| 2.5173 | 9304 | 0.1451 | - | - |
| 2.5176 | 9305 | 0.1861 | - | - |
| 2.5179 | 9306 | 0.2203 | - | - |
| 2.5181 | 9307 | 0.1567 | - | - |
| 2.5184 | 9308 | 0.1367 | - | - |
| 2.5187 | 9309 | 0.2033 | - | - |
| 2.5189 | 9310 | 0.2585 | - | - |
| 2.5192 | 9311 | 0.2658 | - | - |
| 2.5195 | 9312 | 0.2966 | - | - |
| 2.5198 | 9313 | 0.1957 | - | - |
| 2.5200 | 9314 | 0.3492 | - | - |
| 2.5203 | 9315 | 0.2082 | - | - |
| 2.5206 | 9316 | 0.234 | - | - |
| 2.5208 | 9317 | 0.0997 | - | - |
| 2.5211 | 9318 | 0.2543 | - | - |
| 2.5214 | 9319 | 0.1568 | - | - |
| 2.5216 | 9320 | 0.1837 | - | - |
| 2.5219 | 9321 | 0.1521 | - | - |
| 2.5222 | 9322 | 0.2384 | - | - |
| 2.5225 | 9323 | 0.147 | - | - |
| 2.5227 | 9324 | 0.1995 | - | - |
| 2.5230 | 9325 | 0.1434 | - | - |
| 2.5233 | 9326 | 0.1696 | - | - |
| 2.5235 | 9327 | 0.2753 | - | - |
| 2.5238 | 9328 | 0.1725 | - | - |
| 2.5241 | 9329 | 0.1688 | - | - |
| 2.5244 | 9330 | 0.2137 | - | - |
| 2.5246 | 9331 | 0.1491 | - | - |
| 2.5249 | 9332 | 0.1432 | - | - |
| 2.5252 | 9333 | 0.2378 | - | - |
| 2.5254 | 9334 | 0.2519 | - | - |
| 2.5257 | 9335 | 0.1721 | - | - |
| 2.5260 | 9336 | 0.2278 | - | - |
| 2.5262 | 9337 | 0.1607 | - | - |
| 2.5265 | 9338 | 0.2106 | - | - |
| 2.5268 | 9339 | 0.2201 | - | - |
| 2.5271 | 9340 | 0.1915 | - | - |
| 2.5273 | 9341 | 0.2066 | - | - |
| 2.5276 | 9342 | 0.2014 | - | - |
| 2.5279 | 9343 | 0.1727 | - | - |
| 2.5281 | 9344 | 0.1808 | - | - |
| 2.5284 | 9345 | 0.2205 | - | - |
| 2.5287 | 9346 | 0.1873 | - | - |
| 2.5290 | 9347 | 0.2289 | - | - |
| 2.5292 | 9348 | 0.2137 | - | - |
| 2.5295 | 9349 | 0.1457 | - | - |
| 2.5298 | 9350 | 0.1837 | - | - |
| 2.5300 | 9351 | 0.1592 | - | - |
| 2.5303 | 9352 | 0.194 | - | - |
| 2.5306 | 9353 | 0.1883 | - | - |
| 2.5308 | 9354 | 0.1275 | - | - |
| 2.5311 | 9355 | 0.1913 | - | - |
| 2.5314 | 9356 | 0.2215 | - | - |
| 2.5317 | 9357 | 0.1308 | - | - |
| 2.5319 | 9358 | 0.1303 | - | - |
| 2.5322 | 9359 | 0.1526 | - | - |
| 2.5325 | 9360 | 0.2113 
| - | - | | 2.5327 | 9361 | 0.2946 | - | - | | 2.5330 | 9362 | 0.2711 | - | - | | 2.5333 | 9363 | 0.2308 | - | - | | 2.5335 | 9364 | 0.1496 | - | - | | 2.5338 | 9365 | 0.1473 | - | - | | 2.5341 | 9366 | 0.2354 | - | - | | 2.5344 | 9367 | 0.1832 | - | - | | 2.5346 | 9368 | 0.1838 | - | - | | 2.5349 | 9369 | 0.1336 | - | - | | 2.5352 | 9370 | 0.2406 | - | - | | 2.5354 | 9371 | 0.2374 | - | - | | 2.5357 | 9372 | 0.2141 | - | - | | 2.5360 | 9373 | 0.1694 | - | - | | 2.5363 | 9374 | 0.1393 | - | - | | 2.5365 | 9375 | 0.1992 | - | - | | 2.5368 | 9376 | 0.1798 | - | - | | 2.5371 | 9377 | 0.1946 | - | - | | 2.5373 | 9378 | 0.2448 | - | - | | 2.5376 | 9379 | 0.2016 | - | - | | 2.5379 | 9380 | 0.1716 | - | - | | 2.5381 | 9381 | 0.2174 | - | - | | 2.5384 | 9382 | 0.1777 | - | - | | 2.5387 | 9383 | 0.2216 | - | - | | 2.5390 | 9384 | 0.1301 | - | - | | 2.5392 | 9385 | 0.1531 | - | - | | 2.5395 | 9386 | 0.2434 | - | - | | 2.5398 | 9387 | 0.1907 | - | - | | 2.5400 | 9388 | 0.1941 | - | - | | 2.5403 | 9389 | 0.1145 | - | - | | 2.5406 | 9390 | 0.135 | - | - | | 2.5409 | 9391 | 0.2398 | - | - | | 2.5411 | 9392 | 0.17 | - | - | | 2.5414 | 9393 | 0.1357 | - | - | | 2.5417 | 9394 | 0.1454 | - | - | | 2.5419 | 9395 | 0.1961 | - | - | | 2.5422 | 9396 | 0.1853 | - | - | | 2.5425 | 9397 | 0.1569 | - | - | | 2.5427 | 9398 | 0.2603 | - | - | | 2.5430 | 9399 | 0.1576 | - | - | | 2.5433 | 9400 | 0.1852 | - | - | | 2.5436 | 9401 | 0.1895 | - | - | | 2.5438 | 9402 | 0.1367 | - | - | | 2.5441 | 9403 | 0.1963 | - | - | | 2.5444 | 9404 | 0.2158 | - | - | | 2.5446 | 9405 | 0.1749 | - | - | | 2.5449 | 9406 | 0.1853 | - | - | | 2.5452 | 9407 | 0.2352 | - | - | | 2.5455 | 9408 | 0.1743 | - | - | | 2.5457 | 9409 | 0.2374 | - | - | | 2.5460 | 9410 | 0.2319 | - | - | | 2.5463 | 9411 | 0.2443 | - | - | | 2.5465 | 9412 | 0.1629 | - | - | | 2.5468 | 9413 | 0.1996 | - | - | | 2.5471 | 9414 | 0.1716 | - | - | | 2.5473 | 9415 | 0.2107 | - | - | | 2.5476 | 9416 | 0.1715 | - | - | | 2.5479 | 9417 | 0.198 | - | - 
| | 2.5482 | 9418 | 0.1743 | - | - | | 2.5484 | 9419 | 0.1418 | - | - | | 2.5487 | 9420 | 0.1985 | - | - | | 2.5490 | 9421 | 0.1639 | - | - | | 2.5492 | 9422 | 0.1539 | - | - | | 2.5495 | 9423 | 0.1764 | - | - | | 2.5498 | 9424 | 0.1595 | - | - | | 2.5501 | 9425 | 0.2581 | - | - | | 2.5503 | 9426 | 0.2162 | - | - | | 2.5506 | 9427 | 0.1919 | - | - | | 2.5509 | 9428 | 0.1683 | - | - | | 2.5511 | 9429 | 0.1773 | - | - | | 2.5514 | 9430 | 0.1925 | - | - | | 2.5517 | 9431 | 0.1213 | - | - | | 2.5519 | 9432 | 0.2051 | - | - | | 2.5522 | 9433 | 0.2068 | - | - | | 2.5525 | 9434 | 0.2125 | - | - | | 2.5528 | 9435 | 0.1709 | - | - | | 2.5530 | 9436 | 0.1665 | - | - | | 2.5533 | 9437 | 0.1662 | - | - | | 2.5536 | 9438 | 0.1317 | - | - | | 2.5538 | 9439 | 0.2165 | - | - | | 2.5541 | 9440 | 0.1735 | - | - | | 2.5544 | 9441 | 0.1339 | - | - | | 2.5547 | 9442 | 0.1917 | - | - | | 2.5549 | 9443 | 0.1185 | - | - | | 2.5552 | 9444 | 0.1855 | - | - | | 2.5555 | 9445 | 0.1916 | - | - | | 2.5557 | 9446 | 0.1569 | - | - | | 2.5560 | 9447 | 0.1728 | - | - | | 2.5563 | 9448 | 0.2244 | - | - | | 2.5565 | 9449 | 0.1898 | - | - | | 2.5568 | 9450 | 0.1561 | - | - | | 2.5571 | 9451 | 0.15 | - | - | | 2.5574 | 9452 | 0.214 | - | - | | 2.5576 | 9453 | 0.1563 | - | - | | 2.5579 | 9454 | 0.1446 | - | - | | 2.5582 | 9455 | 0.136 | - | - | | 2.5584 | 9456 | 0.2278 | - | - | | 2.5587 | 9457 | 0.1993 | - | - | | 2.5590 | 9458 | 0.1262 | - | - | | 2.5593 | 9459 | 0.1824 | - | - | | 2.5595 | 9460 | 0.1839 | - | - | | 2.5598 | 9461 | 0.1944 | - | - | | 2.5601 | 9462 | 0.1746 | - | - | | 2.5603 | 9463 | 0.186 | - | - | | 2.5606 | 9464 | 0.1437 | - | - | | 2.5609 | 9465 | 0.122 | - | - | | 2.5611 | 9466 | 0.1839 | - | - | | 2.5614 | 9467 | 0.2208 | - | - | | 2.5617 | 9468 | 0.1664 | - | - | | 2.5620 | 9469 | 0.2126 | - | - | | 2.5622 | 9470 | 0.2132 | - | - | | 2.5625 | 9471 | 0.2015 | - | - | | 2.5628 | 9472 | 0.1694 | - | - | | 2.5630 | 9473 | 0.1174 | - | - | | 2.5633 | 9474 | 0.1554 | - | - | | 2.5636 
| 9475 | 0.1625 | - | - | | 2.5639 | 9476 | 0.1978 | - | - | | 2.5641 | 9477 | 0.185 | - | - | | 2.5644 | 9478 | 0.2182 | - | - | | 2.5647 | 9479 | 0.1824 | - | - | | 2.5649 | 9480 | 0.1429 | - | - | | 2.5652 | 9481 | 0.1499 | - | - | | 2.5655 | 9482 | 0.1966 | - | - | | 2.5657 | 9483 | 0.1602 | - | - | | 2.5660 | 9484 | 0.1746 | - | - | | 2.5663 | 9485 | 0.2696 | - | - | | 2.5666 | 9486 | 0.1811 | - | - | | 2.5668 | 9487 | 0.1856 | - | - | | 2.5671 | 9488 | 0.1689 | - | - | | 2.5674 | 9489 | 0.19 | - | - | | 2.5676 | 9490 | 0.1931 | - | - | | 2.5679 | 9491 | 0.1934 | - | - | | 2.5682 | 9492 | 0.1734 | - | - | | 2.5685 | 9493 | 0.2422 | - | - | | 2.5687 | 9494 | 0.3133 | - | - | | 2.5690 | 9495 | 0.1752 | - | - | | 2.5693 | 9496 | 0.1391 | - | - | | 2.5695 | 9497 | 0.1526 | - | - | | 2.5698 | 9498 | 0.1819 | - | - | | 2.5701 | 9499 | 0.2139 | - | - | | 2.5703 | 9500 | 0.2309 | - | - | | 2.5706 | 9501 | 0.1958 | - | - | | 2.5709 | 9502 | 0.2052 | - | - | | 2.5712 | 9503 | 0.2299 | - | - | | 2.5714 | 9504 | 0.1766 | - | - | | 2.5717 | 9505 | 0.2031 | - | - | | 2.5720 | 9506 | 0.1942 | - | - | | 2.5722 | 9507 | 0.2598 | - | - | | 2.5725 | 9508 | 0.1487 | - | - | | 2.5728 | 9509 | 0.1607 | - | - | | 2.5731 | 9510 | 0.1988 | - | - | | 2.5733 | 9511 | 0.2629 | - | - | | 2.5736 | 9512 | 0.1837 | - | - | | 2.5739 | 9513 | 0.1563 | - | - | | 2.5741 | 9514 | 0.2628 | - | - | | 2.5744 | 9515 | 0.139 | - | - | | 2.5747 | 9516 | 0.148 | - | - | | 2.5749 | 9517 | 0.1902 | - | - | | 2.5752 | 9518 | 0.1591 | - | - | | 2.5755 | 9519 | 0.1595 | - | - | | 2.5758 | 9520 | 0.2 | - | - | | 2.5760 | 9521 | 0.1855 | - | - | | 2.5763 | 9522 | 0.1516 | - | - | | 2.5766 | 9523 | 0.1352 | - | - | | 2.5768 | 9524 | 0.1785 | - | - | | 2.5771 | 9525 | 0.1994 | - | - | | 2.5774 | 9526 | 0.2492 | - | - | | 2.5777 | 9527 | 0.1519 | - | - | | 2.5779 | 9528 | 0.1764 | - | - | | 2.5782 | 9529 | 0.1498 | - | - | | 2.5785 | 9530 | 0.1588 | - | - | | 2.5787 | 9531 | 0.1453 | - | - | | 2.5790 | 9532 | 
0.2072 | - | - | | 2.5793 | 9533 | 0.173 | - | - | | 2.5795 | 9534 | 0.1384 | - | - | | 2.5798 | 9535 | 0.1623 | - | - | | 2.5801 | 9536 | 0.2509 | - | - | | 2.5804 | 9537 | 0.176 | - | - | | 2.5806 | 9538 | 0.1417 | - | - | | 2.5809 | 9539 | 0.1558 | - | - | | 2.5812 | 9540 | 0.1427 | - | - | | 2.5814 | 9541 | 0.1686 | - | - | | 2.5817 | 9542 | 0.1413 | - | - | | 2.5820 | 9543 | 0.1534 | - | - | | 2.5823 | 9544 | 0.207 | - | - | | 2.5825 | 9545 | 0.1876 | - | - | | 2.5828 | 9546 | 0.1913 | - | - | | 2.5831 | 9547 | 0.1863 | - | - | | 2.5833 | 9548 | 0.1534 | - | - | | 2.5836 | 9549 | 0.1343 | - | - | | 2.5839 | 9550 | 0.191 | - | - | | 2.5841 | 9551 | 0.1612 | - | - | | 2.5844 | 9552 | 0.1843 | - | - | | 2.5847 | 9553 | 0.1215 | - | - | | 2.5850 | 9554 | 0.1474 | - | - | | 2.5852 | 9555 | 0.1298 | - | - | | 2.5855 | 9556 | 0.1412 | - | - | | 2.5858 | 9557 | 0.1788 | - | - | | 2.5860 | 9558 | 0.1588 | - | - | | 2.5863 | 9559 | 0.1693 | - | - | | 2.5866 | 9560 | 0.2159 | - | - | | 2.5869 | 9561 | 0.178 | - | - | | 2.5871 | 9562 | 0.1821 | - | - | | 2.5874 | 9563 | 0.2158 | - | - | | 2.5877 | 9564 | 0.1922 | - | - | | 2.5879 | 9565 | 0.1759 | - | - | | 2.5882 | 9566 | 0.1575 | - | - | | 2.5885 | 9567 | 0.2046 | - | - | | 2.5887 | 9568 | 0.1723 | - | - | | 2.5890 | 9569 | 0.172 | - | - | | 2.5893 | 9570 | 0.2358 | - | - | | 2.5896 | 9571 | 0.1816 | - | - | | 2.5898 | 9572 | 0.15 | - | - | | 2.5901 | 9573 | 0.1735 | - | - | | 2.5904 | 9574 | 0.1634 | - | - | | 2.5906 | 9575 | 0.1722 | - | - | | 2.5909 | 9576 | 0.1989 | - | - | | 2.5912 | 9577 | 0.1886 | - | - | | 2.5915 | 9578 | 0.2107 | - | - | | 2.5917 | 9579 | 0.1478 | - | - | | 2.5920 | 9580 | 0.1567 | - | - | | 2.5923 | 9581 | 0.1468 | - | - | | 2.5925 | 9582 | 0.1988 | - | - | | 2.5928 | 9583 | 0.1685 | - | - | | 2.5931 | 9584 | 0.2376 | - | - | | 2.5933 | 9585 | 0.2006 | - | - | | 2.5936 | 9586 | 0.1596 | - | - | | 2.5939 | 9587 | 0.1764 | - | - | | 2.5942 | 9588 | 0.1843 | - | - | | 2.5944 | 9589 | 0.1697 | - | 
- | | 2.5947 | 9590 | 0.1695 | - | - | | 2.5950 | 9591 | 0.2467 | - | - | | 2.5952 | 9592 | 0.1415 | - | - | | 2.5955 | 9593 | 0.148 | - | - | | 2.5958 | 9594 | 0.1473 | - | - | | 2.5960 | 9595 | 0.1548 | - | - | | 2.5963 | 9596 | 0.2038 | - | - | | 2.5966 | 9597 | 0.2088 | - | - | | 2.5969 | 9598 | 0.2713 | - | - | | 2.5971 | 9599 | 0.1361 | - | - | | 2.5974 | 9600 | 0.2211 | - | - | | 2.5977 | 9601 | 0.221 | - | - | | 2.5979 | 9602 | 0.2001 | - | - | | 2.5982 | 9603 | 0.1235 | - | - | | 2.5985 | 9604 | 0.1954 | - | - | | 2.5988 | 9605 | 0.1756 | - | - | | 2.5990 | 9606 | 0.2441 | - | - | | 2.5993 | 9607 | 0.1992 | - | - | | 2.5996 | 9608 | 0.1716 | - | - | | 2.5998 | 9609 | 0.1598 | - | - | | 2.6001 | 9610 | 0.1845 | - | - | | 2.6004 | 9611 | 0.2019 | - | - | | 2.6006 | 9612 | 0.1739 | - | - | | 2.6009 | 9613 | 0.1699 | - | - | | 2.6012 | 9614 | 0.1869 | - | - | | 2.6015 | 9615 | 0.1451 | - | - | | 2.6017 | 9616 | 0.1762 | - | - | | 2.6020 | 9617 | 0.2371 | - | - | | 2.6023 | 9618 | 0.2132 | - | - | | 2.6025 | 9619 | 0.1724 | - | - | | 2.6028 | 9620 | 0.1223 | - | - | | 2.6031 | 9621 | 0.19 | - | - | | 2.6034 | 9622 | 0.1904 | - | - | | 2.6036 | 9623 | 0.1735 | - | - | | 2.6039 | 9624 | 0.1886 | - | - | | 2.6042 | 9625 | 0.1475 | - | - | | 2.6044 | 9626 | 0.1816 | - | - | | 2.6047 | 9627 | 0.1855 | - | - | | 2.6050 | 9628 | 0.1668 | - | - | | 2.6052 | 9629 | 0.1906 | - | - | | 2.6055 | 9630 | 0.2435 | - | - | | 2.6058 | 9631 | 0.174 | - | - | | 2.6061 | 9632 | 0.1964 | - | - | | 2.6063 | 9633 | 0.1656 | - | - | | 2.6066 | 9634 | 0.1836 | - | - | | 2.6069 | 9635 | 0.1207 | - | - | | 2.6071 | 9636 | 0.1937 | - | - | | 2.6074 | 9637 | 0.1547 | - | - | | 2.6077 | 9638 | 0.2451 | - | - | | 2.6080 | 9639 | 0.1701 | - | - | | 2.6082 | 9640 | 0.1514 | - | - | | 2.6085 | 9641 | 0.1242 | - | - | | 2.6088 | 9642 | 0.1438 | - | - | | 2.6090 | 9643 | 0.1552 | - | - | | 2.6093 | 9644 | 0.1359 | - | - | | 2.6096 | 9645 | 0.1969 | - | - | | 2.6098 | 9646 | 0.1855 | - | - | | 
2.6101 | 9647 | 0.2436 | - | - | | 2.6104 | 9648 | 0.1321 | - | - | | 2.6107 | 9649 | 0.1747 | - | - | | 2.6109 | 9650 | 0.149 | - | - | | 2.6112 | 9651 | 0.2072 | - | - | | 2.6115 | 9652 | 0.1801 | - | - | | 2.6117 | 9653 | 0.2601 | - | - | | 2.6120 | 9654 | 0.187 | - | - | | 2.6123 | 9655 | 0.1524 | - | - | | 2.6126 | 9656 | 0.1755 | - | - | | 2.6128 | 9657 | 0.1476 | - | - | | 2.6131 | 9658 | 0.1427 | - | - | | 2.6134 | 9659 | 0.1502 | - | - | | 2.6136 | 9660 | 0.1683 | - | - | | 2.6139 | 9661 | 0.2529 | - | - | | 2.6142 | 9662 | 0.2345 | - | - | | 2.6144 | 9663 | 0.16 | - | - | | 2.6147 | 9664 | 0.1894 | - | - | | 2.6150 | 9665 | 0.2122 | - | - | | 2.6153 | 9666 | 0.1957 | - | - | | 2.6155 | 9667 | 0.2049 | - | - | | 2.6158 | 9668 | 0.1551 | - | - | | 2.6161 | 9669 | 0.1488 | - | - | | 2.6163 | 9670 | 0.1443 | - | - | | 2.6166 | 9671 | 0.1454 | - | - | | 2.6169 | 9672 | 0.1466 | - | - | | 2.6172 | 9673 | 0.1642 | - | - | | 2.6174 | 9674 | 0.2006 | - | - | | 2.6177 | 9675 | 0.1544 | - | - | | 2.6180 | 9676 | 0.1676 | - | - | | 2.6182 | 9677 | 0.232 | - | - | | 2.6185 | 9678 | 0.2715 | - | - | | 2.6188 | 9679 | 0.2013 | - | - | | 2.6190 | 9680 | 0.2107 | - | - | | 2.6193 | 9681 | 0.2267 | - | - | | 2.6196 | 9682 | 0.1524 | - | - | | 2.6199 | 9683 | 0.1268 | - | - | | 2.6201 | 9684 | 0.2091 | - | - | | 2.6204 | 9685 | 0.1838 | - | - | | 2.6207 | 9686 | 0.1913 | - | - | | 2.6209 | 9687 | 0.1904 | - | - | | 2.6212 | 9688 | 0.1315 | - | - | | 2.6215 | 9689 | 0.2011 | - | - | | 2.6218 | 9690 | 0.1871 | - | - | | 2.6220 | 9691 | 0.1841 | - | - | | 2.6223 | 9692 | 0.2023 | - | - | | 2.6226 | 9693 | 0.2478 | - | - | | 2.6228 | 9694 | 0.1992 | - | - | | 2.6231 | 9695 | 0.1643 | - | - | | 2.6234 | 9696 | 0.1373 | - | - | | 2.6236 | 9697 | 0.1747 | - | - | | 2.6239 | 9698 | 0.1688 | - | - | | 2.6242 | 9699 | 0.2196 | - | - | | 2.6245 | 9700 | 0.2042 | - | - | | 2.6247 | 9701 | 0.2025 | - | - | | 2.625 | 9702 | 0.1767 | - | - | | 2.6253 | 9703 | 0.1788 | - | - | | 2.6255 | 
9704 | 0.2147 | - | - | | 2.6258 | 9705 | 0.2342 | - | - | | 2.6261 | 9706 | 0.2349 | - | - | | 2.6264 | 9707 | 0.2181 | - | - | | 2.6266 | 9708 | 0.1745 | - | - | | 2.6269 | 9709 | 0.1272 | - | - | | 2.6272 | 9710 | 0.2559 | - | - | | 2.6274 | 9711 | 0.1783 | - | - | | 2.6277 | 9712 | 0.24 | - | - | | 2.6280 | 9713 | 0.1601 | - | - | | 2.6282 | 9714 | 0.1555 | - | - | | 2.6285 | 9715 | 0.1918 | - | - | | 2.6288 | 9716 | 0.1526 | - | - | | 2.6291 | 9717 | 0.1742 | - | - | | 2.6293 | 9718 | 0.2236 | - | - | | 2.6296 | 9719 | 0.2294 | - | - | | 2.6299 | 9720 | 0.1678 | - | - | | 2.6301 | 9721 | 0.1592 | - | - | | 2.6304 | 9722 | 0.2081 | - | - | | 2.6307 | 9723 | 0.231 | - | - | | 2.6310 | 9724 | 0.1538 | - | - | | 2.6312 | 9725 | 0.2064 | - | - | | 2.6315 | 9726 | 0.1825 | - | - | | 2.6318 | 9727 | 0.1796 | - | - | | 2.6320 | 9728 | 0.2968 | - | - | | 2.6323 | 9729 | 0.2387 | - | - | | 2.6326 | 9730 | 0.2095 | - | - | | 2.6328 | 9731 | 0.2179 | - | - | | 2.6331 | 9732 | 0.1733 | - | - | | 2.6334 | 9733 | 0.1776 | - | - | | 2.6337 | 9734 | 0.1986 | - | - | | 2.6339 | 9735 | 0.163 | - | - | | 2.6342 | 9736 | 0.1646 | - | - | | 2.6345 | 9737 | 0.1078 | - | - | | 2.6347 | 9738 | 0.1522 | - | - | | 2.6350 | 9739 | 0.1578 | - | - | | 2.6353 | 9740 | 0.1518 | - | - | | 2.6356 | 9741 | 0.2217 | - | - | | 2.6358 | 9742 | 0.1571 | - | - | | 2.6361 | 9743 | 0.1758 | - | - | | 2.6364 | 9744 | 0.1557 | - | - | | 2.6366 | 9745 | 0.1542 | - | - | | 2.6369 | 9746 | 0.1655 | - | - | | 2.6372 | 9747 | 0.1257 | - | - | | 2.6374 | 9748 | 0.1884 | - | - | | 2.6377 | 9749 | 0.1673 | - | - | | 2.6380 | 9750 | 0.2198 | - | - | | 2.6383 | 9751 | 0.1919 | - | - | | 2.6385 | 9752 | 0.1272 | - | - | | 2.6388 | 9753 | 0.2042 | - | - | | 2.6391 | 9754 | 0.1919 | - | - | | 2.6393 | 9755 | 0.1713 | - | - | | 2.6396 | 9756 | 0.1431 | - | - | | 2.6399 | 9757 | 0.2105 | - | - | | 2.6402 | 9758 | 0.1796 | - | - | | 2.6404 | 9759 | 0.2113 | - | - | | 2.6407 | 9760 | 0.1972 | - | - | | 2.6410 | 9761 | 
0.1907 | - | - | | 2.6412 | 9762 | 0.1499 | - | - | | 2.6415 | 9763 | 0.1934 | - | - | | 2.6418 | 9764 | 0.166 | - | - | | 2.6420 | 9765 | 0.2025 | - | - | | 2.6423 | 9766 | 0.2279 | - | - | | 2.6426 | 9767 | 0.1285 | - | - | | 2.6429 | 9768 | 0.1333 | - | - | | 2.6431 | 9769 | 0.2149 | - | - | | 2.6434 | 9770 | 0.1707 | - | - | | 2.6437 | 9771 | 0.1284 | - | - | | 2.6439 | 9772 | 0.234 | - | - | | 2.6442 | 9773 | 0.1661 | - | - | | 2.6445 | 9774 | 0.1491 | - | - | | 2.6448 | 9775 | 0.1842 | - | - | | 2.6450 | 9776 | 0.1469 | - | - | | 2.6453 | 9777 | 0.1262 | - | - | | 2.6456 | 9778 | 0.2438 | - | - | | 2.6458 | 9779 | 0.1859 | - | - | | 2.6461 | 9780 | 0.205 | - | - | | 2.6464 | 9781 | 0.1731 | - | - | | 2.6466 | 9782 | 0.2158 | - | - | | 2.6469 | 9783 | 0.1542 | - | - | | 2.6472 | 9784 | 0.141 | - | - | | 2.6475 | 9785 | 0.1731 | - | - | | 2.6477 | 9786 | 0.2369 | - | - | | 2.6480 | 9787 | 0.159 | - | - | | 2.6483 | 9788 | 0.1901 | - | - | | 2.6485 | 9789 | 0.2268 | - | - | | 2.6488 | 9790 | 0.1388 | - | - | | 2.6491 | 9791 | 0.1746 | - | - | | 2.6494 | 9792 | 0.1216 | - | - | | 2.6496 | 9793 | 0.1324 | - | - | | 2.6499 | 9794 | 0.1992 | - | - | | 2.6502 | 9795 | 0.109 | - | - | | 2.6504 | 9796 | 0.2151 | - | - | | 2.6507 | 9797 | 0.1428 | - | - | | 2.6510 | 9798 | 0.2143 | - | - | | 2.6512 | 9799 | 0.143 | - | - | | 2.6515 | 9800 | 0.2087 | - | - | | 2.6518 | 9801 | 0.1832 | - | - | | 2.6521 | 9802 | 0.1926 | - | - | | 2.6523 | 9803 | 0.1192 | - | - | | 2.6526 | 9804 | 0.2043 | - | - | | 2.6529 | 9805 | 0.1599 | - | - | | 2.6531 | 9806 | 0.1627 | - | - | | 2.6534 | 9807 | 0.1706 | - | - | | 2.6537 | 9808 | 0.2116 | - | - | | 2.6540 | 9809 | 0.2163 | - | - | | 2.6542 | 9810 | 0.1929 | - | - | | 2.6545 | 9811 | 0.2219 | - | - | | 2.6548 | 9812 | 0.2387 | - | - | | 2.6550 | 9813 | 0.161 | - | - | | 2.6553 | 9814 | 0.2313 | - | - | | 2.6556 | 9815 | 0.1871 | - | - | | 2.6558 | 9816 | 0.2172 | - | - | | 2.6561 | 9817 | 0.1298 | - | - | | 2.6564 | 9818 | 0.2605 | - | 
- | | 2.6567 | 9819 | 0.1189 | - | - | | 2.6569 | 9820 | 0.2064 | - | - | | 2.6572 | 9821 | 0.1253 | - | - | | 2.6575 | 9822 | 0.1705 | - | - | | 2.6577 | 9823 | 0.1693 | - | - | | 2.6580 | 9824 | 0.2238 | - | - | | 2.6583 | 9825 | 0.197 | - | - | | 2.6585 | 9826 | 0.2088 | - | - | | 2.6588 | 9827 | 0.207 | - | - | | 2.6591 | 9828 | 0.2492 | - | - | | 2.6594 | 9829 | 0.2173 | - | - | | 2.6596 | 9830 | 0.1286 | - | - | | 2.6599 | 9831 | 0.1963 | - | - | | 2.6602 | 9832 | 0.1594 | - | - | | 2.6604 | 9833 | 0.1388 | - | - | | 2.6607 | 9834 | 0.1786 | - | - | | 2.6610 | 9835 | 0.1507 | - | - | | 2.6613 | 9836 | 0.2263 | - | - | | 2.6615 | 9837 | 0.1715 | - | - | | 2.6618 | 9838 | 0.1437 | - | - | | 2.6621 | 9839 | 0.1602 | - | - | | 2.6623 | 9840 | 0.1734 | - | - | | 2.6626 | 9841 | 0.1967 | - | - | | 2.6629 | 9842 | 0.1261 | - | - | | 2.6631 | 9843 | 0.2006 | - | - | | 2.6634 | 9844 | 0.2049 | - | - | | 2.6637 | 9845 | 0.232 | - | - | | 2.6640 | 9846 | 0.1532 | - | - | | 2.6642 | 9847 | 0.1286 | - | - | | 2.6645 | 9848 | 0.159 | - | - | | 2.6648 | 9849 | 0.1278 | - | - | | 2.6650 | 9850 | 0.2183 | - | - | | 2.6653 | 9851 | 0.122 | - | - | | 2.6656 | 9852 | 0.1338 | - | - | | 2.6659 | 9853 | 0.185 | - | - | | 2.6661 | 9854 | 0.1515 | - | - | | 2.6664 | 9855 | 0.187 | - | - | | 2.6667 | 9856 | 0.1779 | - | - | | 2.6669 | 9857 | 0.2533 | - | - | | 2.6672 | 9858 | 0.128 | - | - | | 2.6675 | 9859 | 0.1779 | - | - | | 2.6677 | 9860 | 0.1524 | - | - | | 2.6680 | 9861 | 0.1992 | - | - | | 2.6683 | 9862 | 0.2189 | - | - | | 2.6686 | 9863 | 0.1604 | - | - | | 2.6688 | 9864 | 0.203 | - | - | | 2.6691 | 9865 | 0.167 | - | - | | 2.6694 | 9866 | 0.1518 | - | - | | 2.6696 | 9867 | 0.1828 | - | - | | 2.6699 | 9868 | 0.16 | - | - | | 2.6702 | 9869 | 0.223 | - | - | | 2.6705 | 9870 | 0.1874 | - | - | | 2.6707 | 9871 | 0.25 | - | - | | 2.6710 | 9872 | 0.2392 | - | - | | 2.6713 | 9873 | 0.166 | - | - | | 2.6715 | 9874 | 0.1446 | - | - | | 2.6718 | 9875 | 0.1858 | - | - | | 2.6721 | 9876 
| 0.2072 | - | - | | 2.6723 | 9877 | 0.1501 | - | - | | 2.6726 | 9878 | 0.1849 | - | - | | 2.6729 | 9879 | 0.1526 | - | - | | 2.6732 | 9880 | 0.2471 | - | - | | 2.6734 | 9881 | 0.2009 | - | - | | 2.6737 | 9882 | 0.2167 | - | - | | 2.6740 | 9883 | 0.125 | - | - | | 2.6742 | 9884 | 0.1399 | - | - | | 2.6745 | 9885 | 0.1307 | - | - | | 2.6748 | 9886 | 0.1729 | - | - | | 2.6751 | 9887 | 0.2106 | - | - | | 2.6753 | 9888 | 0.1593 | - | - | | 2.6756 | 9889 | 0.1386 | - | - | | 2.6759 | 9890 | 0.2815 | - | - | | 2.6761 | 9891 | 0.183 | - | - | | 2.6764 | 9892 | 0.2043 | - | - | | 2.6767 | 9893 | 0.2212 | - | - | | 2.6769 | 9894 | 0.2084 | - | - | | 2.6772 | 9895 | 0.2685 | - | - | | 2.6775 | 9896 | 0.1679 | - | - | | 2.6778 | 9897 | 0.2059 | - | - | | 2.6780 | 9898 | 0.2323 | - | - | | 2.6783 | 9899 | 0.1477 | - | - | | 2.6786 | 9900 | 0.1744 | - | - | | 2.6788 | 9901 | 0.2203 | - | - | | 2.6791 | 9902 | 0.1812 | - | - | | 2.6794 | 9903 | 0.1254 | - | - | | 2.6797 | 9904 | 0.2094 | - | - | | 2.6799 | 9905 | 0.1749 | - | - | | 2.6802 | 9906 | 0.2074 | - | - | | 2.6805 | 9907 | 0.1906 | - | - | | 2.6807 | 9908 | 0.2059 | - | - | | 2.6810 | 9909 | 0.1772 | - | - | | 2.6813 | 9910 | 0.1492 | - | - | | 2.6815 | 9911 | 0.1591 | - | - | | 2.6818 | 9912 | 0.1999 | - | - | | 2.6821 | 9913 | 0.1041 | - | - | | 2.6824 | 9914 | 0.1747 | - | - | | 2.6826 | 9915 | 0.1997 | - | - | | 2.6829 | 9916 | 0.1938 | - | - | | 2.6832 | 9917 | 0.1221 | - | - | | 2.6834 | 9918 | 0.1579 | - | - | | 2.6837 | 9919 | 0.1464 | - | - | | 2.6840 | 9920 | 0.1888 | - | - | | 2.6843 | 9921 | 0.23 | - | - | | 2.6845 | 9922 | 0.1361 | - | - | | 2.6848 | 9923 | 0.1778 | - | - | | 2.6851 | 9924 | 0.2588 | - | - | | 2.6853 | 9925 | 0.164 | - | - | | 2.6856 | 9926 | 0.2137 | - | - | | 2.6859 | 9927 | 0.1693 | - | - | | 2.6861 | 9928 | 0.1304 | - | - | | 2.6864 | 9929 | 0.2177 | - | - | | 2.6867 | 9930 | 0.1707 | - | - | | 2.6870 | 9931 | 0.2189 | - | - | | 2.6872 | 9932 | 0.1471 | - | - | | 2.6875 | 9933 | 0.1992 
| - | - | | 2.6878 | 9934 | 0.1671 | - | - | | 2.6880 | 9935 | 0.1939 | - | - | | 2.6883 | 9936 | 0.1985 | - | - | | 2.6886 | 9937 | 0.1866 | - | - | | 2.6889 | 9938 | 0.2036 | - | - | | 2.6891 | 9939 | 0.1956 | - | - | | 2.6894 | 9940 | 0.1948 | - | - | | 2.6897 | 9941 | 0.1719 | - | - | | 2.6899 | 9942 | 0.1562 | - | - | | 2.6902 | 9943 | 0.1724 | - | - | | 2.6905 | 9944 | 0.1824 | - | - | | 2.6907 | 9945 | 0.1947 | - | - | | 2.6910 | 9946 | 0.1824 | - | - | | 2.6913 | 9947 | 0.1765 | - | - | | 2.6916 | 9948 | 0.1817 | - | - | | 2.6918 | 9949 | 0.1639 | - | - | | 2.6921 | 9950 | 0.2023 | - | - | | 2.6924 | 9951 | 0.1312 | - | - | | 2.6926 | 9952 | 0.2221 | - | - | | 2.6929 | 9953 | 0.1689 | - | - | | 2.6932 | 9954 | 0.2648 | - | - | | 2.6935 | 9955 | 0.1386 | - | - | | 2.6937 | 9956 | 0.1949 | - | - | | 2.6940 | 9957 | 0.2078 | - | - | | 2.6943 | 9958 | 0.1728 | - | - | | 2.6945 | 9959 | 0.1971 | - | - | | 2.6948 | 9960 | 0.2131 | - | - | | 2.6951 | 9961 | 0.2766 | - | - | | 2.6953 | 9962 | 0.1904 | - | - | | 2.6956 | 9963 | 0.1163 | - | - | | 2.6959 | 9964 | 0.1719 | - | - | | 2.6962 | 9965 | 0.157 | - | - | | 2.6964 | 9966 | 0.1588 | - | - | | 2.6967 | 9967 | 0.1444 | - | - | | 2.6970 | 9968 | 0.244 | - | - | | 2.6972 | 9969 | 0.1874 | - | - | | 2.6975 | 9970 | 0.1914 | - | - | | 2.6978 | 9971 | 0.1379 | - | - | | 2.6981 | 9972 | 0.1852 | - | - | | 2.6983 | 9973 | 0.2931 | - | - | | 2.6986 | 9974 | 0.1638 | - | - | | 2.6989 | 9975 | 0.195 | - | - | | 2.6991 | 9976 | 0.181 | - | - | | 2.6994 | 9977 | 0.1715 | - | - | | 2.6997 | 9978 | 0.2326 | - | - | | 2.6999 | 9979 | 0.179 | - | - | | 2.7002 | 9980 | 0.1596 | - | - | | 2.7005 | 9981 | 0.1478 | - | - | | 2.7008 | 9982 | 0.1531 | - | - | | 2.7010 | 9983 | 0.1702 | - | - | | 2.7013 | 9984 | 0.1708 | - | - | | 2.7016 | 9985 | 0.1285 | - | - | | 2.7018 | 9986 | 0.1952 | - | - | | 2.7021 | 9987 | 0.1314 | - | - | | 2.7024 | 9988 | 0.1671 | - | - | | 2.7027 | 9989 | 0.2038 | - | - | | 2.7029 | 9990 | 0.2286 | - | - | 
| 2.7032 | 9991 | 0.1773 | - | - | | 2.7035 | 9992 | 0.1603 | - | - | | 2.7037 | 9993 | 0.2274 | - | - | | 2.7040 | 9994 | 0.1582 | - | - | | 2.7043 | 9995 | 0.1772 | - | - | | 2.7045 | 9996 | 0.1568 | - | - | | 2.7048 | 9997 | 0.2022 | - | - | | 2.7051 | 9998 | 0.2089 | - | - | | 2.7054 | 9999 | 0.2049 | - | - | | 2.7056 | 10000 | 0.1524 | 0.1981 | 0.9518 | | 2.7059 | 10001 | 0.204 | - | - | | 2.7062 | 10002 | 0.1648 | - | - | | 2.7064 | 10003 | 0.1433 | - | - | | 2.7067 | 10004 | 0.2032 | - | - | | 2.7070 | 10005 | 0.147 | - | - | | 2.7073 | 10006 | 0.2122 | - | - | | 2.7075 | 10007 | 0.1509 | - | - | | 2.7078 | 10008 | 0.1761 | - | - | | 2.7081 | 10009 | 0.1985 | - | - | | 2.7083 | 10010 | 0.1348 | - | - | | 2.7086 | 10011 | 0.2467 | - | - | | 2.7089 | 10012 | 0.1574 | - | - | | 2.7091 | 10013 | 0.1756 | - | - | | 2.7094 | 10014 | 0.245 | - | - | | 2.7097 | 10015 | 0.1504 | - | - | | 2.7100 | 10016 | 0.1294 | - | - | | 2.7102 | 10017 | 0.1675 | - | - | | 2.7105 | 10018 | 0.2051 | - | - | | 2.7108 | 10019 | 0.1655 | - | - | | 2.7110 | 10020 | 0.1471 | - | - | | 2.7113 | 10021 | 0.1656 | - | - | | 2.7116 | 10022 | 0.1598 | - | - | | 2.7119 | 10023 | 0.1806 | - | - | | 2.7121 | 10024 | 0.1893 | - | - | | 2.7124 | 10025 | 0.2289 | - | - | | 2.7127 | 10026 | 0.1824 | - | - | | 2.7129 | 10027 | 0.1599 | - | - | | 2.7132 | 10028 | 0.1626 | - | - | | 2.7135 | 10029 | 0.1351 | - | - | | 2.7137 | 10030 | 0.1638 | - | - | | 2.7140 | 10031 | 0.2049 | - | - | | 2.7143 | 10032 | 0.2362 | - | - | | 2.7146 | 10033 | 0.1532 | - | - | | 2.7148 | 10034 | 0.1753 | - | - | | 2.7151 | 10035 | 0.1935 | - | - | | 2.7154 | 10036 | 0.1467 | - | - | | 2.7156 | 10037 | 0.1522 | - | - | | 2.7159 | 10038 | 0.2213 | - | - | | 2.7162 | 10039 | 0.1615 | - | - | | 2.7165 | 10040 | 0.2181 | - | - | | 2.7167 | 10041 | 0.1838 | - | - | | 2.7170 | 10042 | 0.2047 | - | - | | 2.7173 | 10043 | 0.1839 | - | - | | 2.7175 | 10044 | 0.2095 | - | - | | 2.7178 | 10045 | 0.181 | - | - | | 2.7181 | 10046 | 
0.1528 | - | - | | 2.7183 | 10047 | 0.2333 | - | - | | 2.7186 | 10048 | 0.1803 | - | - | | 2.7189 | 10049 | 0.1979 | - | - | | 2.7192 | 10050 | 0.1533 | - | - | | 2.7194 | 10051 | 0.1465 | - | - | | 2.7197 | 10052 | 0.2453 | - | - | | 2.7200 | 10053 | 0.1647 | - | - | | 2.7202 | 10054 | 0.2472 | - | - | | 2.7205 | 10055 | 0.2488 | - | - | | 2.7208 | 10056 | 0.181 | - | - | | 2.7210 | 10057 | 0.2344 | - | - | | 2.7213 | 10058 | 0.2095 | - | - | | 2.7216 | 10059 | 0.2239 | - | - | | 2.7219 | 10060 | 0.1596 | - | - | | 2.7221 | 10061 | 0.1297 | - | - | | 2.7224 | 10062 | 0.1239 | - | - | | 2.7227 | 10063 | 0.2117 | - | - | | 2.7229 | 10064 | 0.1295 | - | - | | 2.7232 | 10065 | 0.2194 | - | - | | 2.7235 | 10066 | 0.1287 | - | - | | 2.7238 | 10067 | 0.1556 | - | - | | 2.7240 | 10068 | 0.1325 | - | - | | 2.7243 | 10069 | 0.1854 | - | - | | 2.7246 | 10070 | 0.1577 | - | - | | 2.7248 | 10071 | 0.218 | - | - | | 2.7251 | 10072 | 0.2931 | - | - | | 2.7254 | 10073 | 0.1334 | - | - | | 2.7256 | 10074 | 0.1573 | - | - | | 2.7259 | 10075 | 0.1642 | - | - | | 2.7262 | 10076 | 0.123 | - | - | | 2.7265 | 10077 | 0.2012 | - | - | | 2.7267 | 10078 | 0.1619 | - | - | | 2.7270 | 10079 | 0.2016 | - | - | | 2.7273 | 10080 | 0.1862 | - | - | | 2.7275 | 10081 | 0.1967 | - | - | | 2.7278 | 10082 | 0.1699 | - | - | | 2.7281 | 10083 | 0.2191 | - | - | | 2.7284 | 10084 | 0.221 | - | - | | 2.7286 | 10085 | 0.1548 | - | - | | 2.7289 | 10086 | 0.231 | - | - | | 2.7292 | 10087 | 0.2132 | - | - | | 2.7294 | 10088 | 0.1904 | - | - | | 2.7297 | 10089 | 0.2822 | - | - | | 2.7300 | 10090 | 0.1582 | - | - | | 2.7302 | 10091 | 0.102 | - | - | | 2.7305 | 10092 | 0.1543 | - | - | | 2.7308 | 10093 | 0.1897 | - | - | | 2.7311 | 10094 | 0.1864 | - | - | | 2.7313 | 10095 | 0.2078 | - | - | | 2.7316 | 10096 | 0.1418 | - | - | | 2.7319 | 10097 | 0.1406 | - | - | | 2.7321 | 10098 | 0.1806 | - | - | | 2.7324 | 10099 | 0.2246 | - | - | | 2.7327 | 10100 | 0.2052 | - | - | | 2.7330 | 10101 | 0.1787 | - | - | | 2.7332 
| 10102 | 0.1104 | - | - | | 2.7335 | 10103 | 0.1409 | - | - | | 2.7338 | 10104 | 0.1486 | - | - | | 2.7340 | 10105 | 0.1948 | - | - | | 2.7343 | 10106 | 0.1527 | - | - | | 2.7346 | 10107 | 0.1456 | - | - | | 2.7348 | 10108 | 0.1214 | - | - | | 2.7351 | 10109 | 0.1628 | - | - | | 2.7354 | 10110 | 0.1529 | - | - | | 2.7357 | 10111 | 0.2555 | - | - | | 2.7359 | 10112 | 0.1923 | - | - | | 2.7362 | 10113 | 0.1625 | - | - | | 2.7365 | 10114 | 0.207 | - | - | | 2.7367 | 10115 | 0.2013 | - | - | | 2.7370 | 10116 | 0.1745 | - | - | | 2.7373 | 10117 | 0.2173 | - | - | | 2.7376 | 10118 | 0.1295 | - | - | | 2.7378 | 10119 | 0.1919 | - | - | | 2.7381 | 10120 | 0.1253 | - | - | | 2.7384 | 10121 | 0.2464 | - | - | | 2.7386 | 10122 | 0.1767 | - | - | | 2.7389 | 10123 | 0.1398 | - | - | | 2.7392 | 10124 | 0.1887 | - | - | | 2.7394 | 10125 | 0.1512 | - | - | | 2.7397 | 10126 | 0.1883 | - | - | | 2.7400 | 10127 | 0.1434 | - | - | | 2.7403 | 10128 | 0.1581 | - | - | | 2.7405 | 10129 | 0.2168 | - | - | | 2.7408 | 10130 | 0.1896 | - | - | | 2.7411 | 10131 | 0.1844 | - | - | | 2.7413 | 10132 | 0.1791 | - | - | | 2.7416 | 10133 | 0.1396 | - | - | | 2.7419 | 10134 | 0.1716 | - | - | | 2.7422 | 10135 | 0.1665 | - | - | | 2.7424 | 10136 | 0.1852 | - | - | | 2.7427 | 10137 | 0.1458 | - | - | | 2.7430 | 10138 | 0.1718 | - | - | | 2.7432 | 10139 | 0.1793 | - | - | | 2.7435 | 10140 | 0.1823 | - | - | | 2.7438 | 10141 | 0.1826 | - | - | | 2.7440 | 10142 | 0.1155 | - | - | | 2.7443 | 10143 | 0.1899 | - | - | | 2.7446 | 10144 | 0.2011 | - | - | | 2.7449 | 10145 | 0.1918 | - | - | | 2.7451 | 10146 | 0.1279 | - | - | | 2.7454 | 10147 | 0.1561 | - | - | | 2.7457 | 10148 | 0.2601 | - | - | | 2.7459 | 10149 | 0.2124 | - | - | | 2.7462 | 10150 | 0.1405 | - | - | | 2.7465 | 10151 | 0.1785 | - | - | | 2.7468 | 10152 | 0.1785 | - | - | | 2.7470 | 10153 | 0.1873 | - | - | | 2.7473 | 10154 | 0.1593 | - | - | | 2.7476 | 10155 | 0.2722 | - | - | | 2.7478 | 10156 | 0.1757 | - | - | | 2.7481 | 10157 | 0.164 | - 
| - | | 2.7484 | 10158 | 0.2059 | - | - | | 2.7486 | 10159 | 0.1748 | - | - | | 2.7489 | 10160 | 0.1214 | - | - | | 2.7492 | 10161 | 0.201 | - | - | | 2.7495 | 10162 | 0.2012 | - | - | | 2.7497 | 10163 | 0.1527 | - | - | | 2.75 | 10164 | 0.1601 | - | - | | 2.7503 | 10165 | 0.2386 | - | - | | 2.7505 | 10166 | 0.1786 | - | - | | 2.7508 | 10167 | 0.1726 | - | - | | 2.7511 | 10168 | 0.1905 | - | - | | 2.7514 | 10169 | 0.275 | - | - | | 2.7516 | 10170 | 0.19 | - | - | | 2.7519 | 10171 | 0.1855 | - | - | | 2.7522 | 10172 | 0.1667 | - | - | | 2.7524 | 10173 | 0.2234 | - | - | | 2.7527 | 10174 | 0.1715 | - | - | | 2.7530 | 10175 | 0.1746 | - | - | | 2.7532 | 10176 | 0.1965 | - | - | | 2.7535 | 10177 | 0.2133 | - | - | | 2.7538 | 10178 | 0.2661 | - | - | | 2.7541 | 10179 | 0.2327 | - | - | | 2.7543 | 10180 | 0.1758 | - | - | | 2.7546 | 10181 | 0.1261 | - | - | | 2.7549 | 10182 | 0.1531 | - | - | | 2.7551 | 10183 | 0.2221 | - | - | | 2.7554 | 10184 | 0.2154 | - | - | | 2.7557 | 10185 | 0.1394 | - | - | | 2.7560 | 10186 | 0.2025 | - | - | | 2.7562 | 10187 | 0.1563 | - | - | | 2.7565 | 10188 | 0.2033 | - | - | | 2.7568 | 10189 | 0.2218 | - | - | | 2.7570 | 10190 | 0.1813 | - | - | | 2.7573 | 10191 | 0.197 | - | - | | 2.7576 | 10192 | 0.1432 | - | - | | 2.7578 | 10193 | 0.1572 | - | - | | 2.7581 | 10194 | 0.1622 | - | - | | 2.7584 | 10195 | 0.2398 | - | - | | 2.7587 | 10196 | 0.1433 | - | - | | 2.7589 | 10197 | 0.1707 | - | - | | 2.7592 | 10198 | 0.2832 | - | - | | 2.7595 | 10199 | 0.1875 | - | - | | 2.7597 | 10200 | 0.1952 | - | - | | 2.7600 | 10201 | 0.1633 | - | - | | 2.7603 | 10202 | 0.2047 | - | - | | 2.7606 | 10203 | 0.1954 | - | - | | 2.7608 | 10204 | 0.2512 | - | - | | 2.7611 | 10205 | 0.1667 | - | - | | 2.7614 | 10206 | 0.1504 | - | - | | 2.7616 | 10207 | 0.204 | - | - | | 2.7619 | 10208 | 0.1649 | - | - | | 2.7622 | 10209 | 0.184 | - | - | | 2.7624 | 10210 | 0.2745 | - | - | | 2.7627 | 10211 | 0.2069 | - | - | | 2.7630 | 10212 | 0.236 | - | - | | 2.7633 | 10213 | 
0.2184 | - | - | | 2.7635 | 10214 | 0.1503 | - | - | | 2.7638 | 10215 | 0.1957 | - | - | | 2.7641 | 10216 | 0.2165 | - | - | | 2.7643 | 10217 | 0.1811 | - | - | | 2.7646 | 10218 | 0.142 | - | - | | 2.7649 | 10219 | 0.149 | - | - | | 2.7652 | 10220 | 0.156 | - | - | | 2.7654 | 10221 | 0.2544 | - | - | | 2.7657 | 10222 | 0.1872 | - | - | | 2.7660 | 10223 | 0.1746 | - | - | | 2.7662 | 10224 | 0.1585 | - | - | | 2.7665 | 10225 | 0.1532 | - | - | | 2.7668 | 10226 | 0.1777 | - | - | | 2.7670 | 10227 | 0.2013 | - | - | | 2.7673 | 10228 | 0.1979 | - | - | | 2.7676 | 10229 | 0.1919 | - | - | | 2.7679 | 10230 | 0.1584 | - | - | | 2.7681 | 10231 | 0.2125 | - | - | | 2.7684 | 10232 | 0.133 | - | - | | 2.7687 | 10233 | 0.1394 | - | - | | 2.7689 | 10234 | 0.1999 | - | - | | 2.7692 | 10235 | 0.1805 | - | - | | 2.7695 | 10236 | 0.1652 | - | - | | 2.7698 | 10237 | 0.1644 | - | - | | 2.7700 | 10238 | 0.1725 | - | - | | 2.7703 | 10239 | 0.2338 | - | - | | 2.7706 | 10240 | 0.2182 | - | - | | 2.7708 | 10241 | 0.1776 | - | - | | 2.7711 | 10242 | 0.1586 | - | - | | 2.7714 | 10243 | 0.2102 | - | - | | 2.7716 | 10244 | 0.1728 | - | - | | 2.7719 | 10245 | 0.1648 | - | - | | 2.7722 | 10246 | 0.2269 | - | - | | 2.7725 | 10247 | 0.165 | - | - | | 2.7727 | 10248 | 0.1825 | - | - | | 2.7730 | 10249 | 0.1429 | - | - | | 2.7733 | 10250 | 0.1487 | - | - | | 2.7735 | 10251 | 0.1772 | - | - | | 2.7738 | 10252 | 0.2405 | - | - | | 2.7741 | 10253 | 0.1876 | - | - | | 2.7744 | 10254 | 0.1989 | - | - | | 2.7746 | 10255 | 0.1603 | - | - | | 2.7749 | 10256 | 0.1697 | - | - | | 2.7752 | 10257 | 0.1589 | - | - | | 2.7754 | 10258 | 0.167 | - | - | | 2.7757 | 10259 | 0.1821 | - | - | | 2.7760 | 10260 | 0.2388 | - | - | | 2.7762 | 10261 | 0.1785 | - | - | | 2.7765 | 10262 | 0.1531 | - | - | | 2.7768 | 10263 | 0.1997 | - | - | | 2.7771 | 10264 | 0.2474 | - | - | | 2.7773 | 10265 | 0.1593 | - | - | | 2.7776 | 10266 | 0.2194 | - | - | | 2.7779 | 10267 | 0.1648 | - | - | | 2.7781 | 10268 | 0.2095 | - | - | | 2.7784 
| 10269 | 0.1308 | - | - | | 2.7787 | 10270 | 0.2246 | - | - | | 2.7790 | 10271 | 0.1944 | - | - | | 2.7792 | 10272 | 0.2037 | - | - | | 2.7795 | 10273 | 0.2075 | - | - | | 2.7798 | 10274 | 0.1401 | - | - | | 2.7800 | 10275 | 0.2082 | - | - | | 2.7803 | 10276 | 0.1729 | - | - | | 2.7806 | 10277 | 0.2313 | - | - | | 2.7808 | 10278 | 0.1214 | - | - | | 2.7811 | 10279 | 0.1973 | - | - | | 2.7814 | 10280 | 0.1985 | - | - | | 2.7817 | 10281 | 0.1817 | - | - | | 2.7819 | 10282 | 0.183 | - | - | | 2.7822 | 10283 | 0.1787 | - | - | | 2.7825 | 10284 | 0.1631 | - | - | | 2.7827 | 10285 | 0.1469 | - | - | | 2.7830 | 10286 | 0.1648 | - | - | | 2.7833 | 10287 | 0.1376 | - | - | | 2.7835 | 10288 | 0.1879 | - | - | | 2.7838 | 10289 | 0.1953 | - | - | | 2.7841 | 10290 | 0.2521 | - | - | | 2.7844 | 10291 | 0.1578 | - | - | | 2.7846 | 10292 | 0.1436 | - | - | | 2.7849 | 10293 | 0.1184 | - | - | | 2.7852 | 10294 | 0.2203 | - | - | | 2.7854 | 10295 | 0.1823 | - | - | | 2.7857 | 10296 | 0.2421 | - | - | | 2.7860 | 10297 | 0.2512 | - | - | | 2.7863 | 10298 | 0.1498 | - | - | | 2.7865 | 10299 | 0.233 | - | - | | 2.7868 | 10300 | 0.1959 | - | - | | 2.7871 | 10301 | 0.1317 | - | - | | 2.7873 | 10302 | 0.1598 | - | - | | 2.7876 | 10303 | 0.1443 | - | - | | 2.7879 | 10304 | 0.1981 | - | - | | 2.7881 | 10305 | 0.2045 | - | - | | 2.7884 | 10306 | 0.1517 | - | - | | 2.7887 | 10307 | 0.2029 | - | - | | 2.7890 | 10308 | 0.2191 | - | - | | 2.7892 | 10309 | 0.1785 | - | - | | 2.7895 | 10310 | 0.165 | - | - | | 2.7898 | 10311 | 0.1624 | - | - | | 2.7900 | 10312 | 0.246 | - | - | | 2.7903 | 10313 | 0.2368 | - | - | | 2.7906 | 10314 | 0.1382 | - | - | | 2.7909 | 10315 | 0.1498 | - | - | | 2.7911 | 10316 | 0.1529 | - | - | | 2.7914 | 10317 | 0.1661 | - | - | | 2.7917 | 10318 | 0.2483 | - | - | | 2.7919 | 10319 | 0.1743 | - | - | | 2.7922 | 10320 | 0.2503 | - | - | | 2.7925 | 10321 | 0.1715 | - | - | | 2.7927 | 10322 | 0.1929 | - | - | | 2.7930 | 10323 | 0.1785 | - | - | | 2.7933 | 10324 | 0.2121 | - | 
- | | 2.7936 | 10325 | 0.1627 | - | - | | 2.7938 | 10326 | 0.1689 | - | - | | 2.7941 | 10327 | 0.1427 | - | - | | 2.7944 | 10328 | 0.1782 | - | - | | 2.7946 | 10329 | 0.1702 | - | - | | 2.7949 | 10330 | 0.1546 | - | - | | 2.7952 | 10331 | 0.2864 | - | - | | 2.7955 | 10332 | 0.1654 | - | - | | 2.7957 | 10333 | 0.1446 | - | - | | 2.7960 | 10334 | 0.2061 | - | - | | 2.7963 | 10335 | 0.1536 | - | - | | 2.7965 | 10336 | 0.1601 | - | - | | 2.7968 | 10337 | 0.1732 | - | - | | 2.7971 | 10338 | 0.1434 | - | - | | 2.7973 | 10339 | 0.1533 | - | - | | 2.7976 | 10340 | 0.2509 | - | - | | 2.7979 | 10341 | 0.1703 | - | - | | 2.7982 | 10342 | 0.1943 | - | - | | 2.7984 | 10343 | 0.1845 | - | - | | 2.7987 | 10344 | 0.1967 | - | - | | 2.7990 | 10345 | 0.3166 | - | - | | 2.7992 | 10346 | 0.149 | - | - | | 2.7995 | 10347 | 0.1337 | - | - | | 2.7998 | 10348 | 0.1221 | - | - | | 2.8001 | 10349 | 0.2679 | - | - | | 2.8003 | 10350 | 0.1584 | - | - | | 2.8006 | 10351 | 0.1382 | - | - | | 2.8009 | 10352 | 0.1814 | - | - | | 2.8011 | 10353 | 0.1127 | - | - | | 2.8014 | 10354 | 0.1668 | - | - | | 2.8017 | 10355 | 0.2237 | - | - | | 2.8019 | 10356 | 0.2151 | - | - | | 2.8022 | 10357 | 0.1603 | - | - | | 2.8025 | 10358 | 0.18 | - | - | | 2.8028 | 10359 | 0.1536 | - | - | | 2.8030 | 10360 | 0.1701 | - | - | | 2.8033 | 10361 | 0.158 | - | - | | 2.8036 | 10362 | 0.2367 | - | - | | 2.8038 | 10363 | 0.1534 | - | - | | 2.8041 | 10364 | 0.1846 | - | - | | 2.8044 | 10365 | 0.1727 | - | - | | 2.8047 | 10366 | 0.1368 | - | - | | 2.8049 | 10367 | 0.1892 | - | - | | 2.8052 | 10368 | 0.1764 | - | - | | 2.8055 | 10369 | 0.1896 | - | - | | 2.8057 | 10370 | 0.1607 | - | - | | 2.8060 | 10371 | 0.1812 | - | - | | 2.8063 | 10372 | 0.1938 | - | - | | 2.8065 | 10373 | 0.194 | - | - | | 2.8068 | 10374 | 0.2195 | - | - | | 2.8071 | 10375 | 0.1546 | - | - | | 2.8074 | 10376 | 0.2571 | - | - | | 2.8076 | 10377 | 0.2044 | - | - | | 2.8079 | 10378 | 0.1927 | - | - | | 2.8082 | 10379 | 0.15 | - | - | | 2.8084 | 10380 | 
0.1707 | - | - | | 2.8087 | 10381 | 0.1477 | - | - | | 2.8090 | 10382 | 0.1685 | - | - | | 2.8093 | 10383 | 0.1357 | - | - | | 2.8095 | 10384 | 0.1248 | - | - | | 2.8098 | 10385 | 0.2214 | - | - | | 2.8101 | 10386 | 0.151 | - | - | | 2.8103 | 10387 | 0.1597 | - | - | | 2.8106 | 10388 | 0.2445 | - | - | | 2.8109 | 10389 | 0.2166 | - | - | | 2.8111 | 10390 | 0.2505 | - | - | | 2.8114 | 10391 | 0.2209 | - | - | | 2.8117 | 10392 | 0.1774 | - | - | | 2.8120 | 10393 | 0.1424 | - | - | | 2.8122 | 10394 | 0.1784 | - | - | | 2.8125 | 10395 | 0.184 | - | - | | 2.8128 | 10396 | 0.2017 | - | - | | 2.8130 | 10397 | 0.2191 | - | - | | 2.8133 | 10398 | 0.2958 | - | - | | 2.8136 | 10399 | 0.1895 | - | - | | 2.8139 | 10400 | 0.208 | - | - | | 2.8141 | 10401 | 0.158 | - | - | | 2.8144 | 10402 | 0.1601 | - | - | | 2.8147 | 10403 | 0.1649 | - | - | | 2.8149 | 10404 | 0.1487 | - | - | | 2.8152 | 10405 | 0.1636 | - | - | | 2.8155 | 10406 | 0.2 | - | - | | 2.8157 | 10407 | 0.2846 | - | - | | 2.8160 | 10408 | 0.2289 | - | - | | 2.8163 | 10409 | 0.1599 | - | - | | 2.8166 | 10410 | 0.1526 | - | - | | 2.8168 | 10411 | 0.2293 | - | - | | 2.8171 | 10412 | 0.2137 | - | - | | 2.8174 | 10413 | 0.1635 | - | - | | 2.8176 | 10414 | 0.1969 | - | - | | 2.8179 | 10415 | 0.1947 | - | - | | 2.8182 | 10416 | 0.1545 | - | - | | 2.8185 | 10417 | 0.1861 | - | - | | 2.8187 | 10418 | 0.198 | - | - | | 2.8190 | 10419 | 0.151 | - | - | | 2.8193 | 10420 | 0.1908 | - | - | | 2.8195 | 10421 | 0.2578 | - | - | | 2.8198 | 10422 | 0.2081 | - | - | | 2.8201 | 10423 | 0.1924 | - | - | | 2.8203 | 10424 | 0.1326 | - | - | | 2.8206 | 10425 | 0.1571 | - | - | | 2.8209 | 10426 | 0.2384 | - | - | | 2.8212 | 10427 | 0.158 | - | - | | 2.8214 | 10428 | 0.1258 | - | - | | 2.8217 | 10429 | 0.1665 | - | - | | 2.8220 | 10430 | 0.1846 | - | - | | 2.8222 | 10431 | 0.2672 | - | - | | 2.8225 | 10432 | 0.1487 | - | - | | 2.8228 | 10433 | 0.1672 | - | - | | 2.8231 | 10434 | 0.1547 | - | - | | 2.8233 | 10435 | 0.1415 | - | - | | 2.8236 | 
10436 | 0.1359 | - | - | | 2.8239 | 10437 | 0.2179 | - | - | | 2.8241 | 10438 | 0.241 | - | - | | 2.8244 | 10439 | 0.2492 | - | - | | 2.8247 | 10440 | 0.1828 | - | - | | 2.8249 | 10441 | 0.1641 | - | - | | 2.8252 | 10442 | 0.2207 | - | - | | 2.8255 | 10443 | 0.2289 | - | - | | 2.8258 | 10444 | 0.1639 | - | - | | 2.8260 | 10445 | 0.1781 | - | - | | 2.8263 | 10446 | 0.2043 | - | - | | 2.8266 | 10447 | 0.1709 | - | - | | 2.8268 | 10448 | 0.1275 | - | - | | 2.8271 | 10449 | 0.142 | - | - | | 2.8274 | 10450 | 0.2263 | - | - | | 2.8277 | 10451 | 0.1553 | - | - | | 2.8279 | 10452 | 0.1888 | - | - | | 2.8282 | 10453 | 0.2286 | - | - | | 2.8285 | 10454 | 0.1288 | - | - | | 2.8287 | 10455 | 0.1043 | - | - | | 2.8290 | 10456 | 0.2126 | - | - | | 2.8293 | 10457 | 0.2055 | - | - | | 2.8295 | 10458 | 0.1266 | - | - | | 2.8298 | 10459 | 0.2522 | - | - | | 2.8301 | 10460 | 0.2304 | - | - | | 2.8304 | 10461 | 0.1151 | - | - | | 2.8306 | 10462 | 0.192 | - | - | | 2.8309 | 10463 | 0.1893 | - | - | | 2.8312 | 10464 | 0.1386 | - | - | | 2.8314 | 10465 | 0.2076 | - | - | | 2.8317 | 10466 | 0.158 | - | - | | 2.8320 | 10467 | 0.1365 | - | - | | 2.8323 | 10468 | 0.1559 | - | - | | 2.8325 | 10469 | 0.15 | - | - | | 2.8328 | 10470 | 0.1947 | - | - | | 2.8331 | 10471 | 0.1263 | - | - | | 2.8333 | 10472 | 0.1781 | - | - | | 2.8336 | 10473 | 0.1703 | - | - | | 2.8339 | 10474 | 0.2126 | - | - | | 2.8341 | 10475 | 0.1948 | - | - | | 2.8344 | 10476 | 0.2402 | - | - | | 2.8347 | 10477 | 0.1706 | - | - | | 2.8350 | 10478 | 0.1305 | - | - | | 2.8352 | 10479 | 0.197 | - | - | | 2.8355 | 10480 | 0.1968 | - | - | | 2.8358 | 10481 | 0.2263 | - | - | | 2.8360 | 10482 | 0.1659 | - | - | | 2.8363 | 10483 | 0.18 | - | - | | 2.8366 | 10484 | 0.1568 | - | - | | 2.8369 | 10485 | 0.1968 | - | - | | 2.8371 | 10486 | 0.1606 | - | - | | 2.8374 | 10487 | 0.1213 | - | - | | 2.8377 | 10488 | 0.1648 | - | - | | 2.8379 | 10489 | 0.1881 | - | - | | 2.8382 | 10490 | 0.1748 | - | - | | 2.8385 | 10491 | 0.2688 | - | - | | 
2.8387 | 10492 | 0.1569 | - | - | | 2.8390 | 10493 | 0.1993 | - | - | | 2.8393 | 10494 | 0.2501 | - | - | | 2.8396 | 10495 | 0.1597 | - | - | | 2.8398 | 10496 | 0.146 | - | - | | 2.8401 | 10497 | 0.1113 | - | - | | 2.8404 | 10498 | 0.2061 | - | - | | 2.8406 | 10499 | 0.1252 | - | - | | 2.8409 | 10500 | 0.1788 | - | - | | 2.8412 | 10501 | 0.116 | - | - | | 2.8415 | 10502 | 0.1283 | - | - | | 2.8417 | 10503 | 0.1636 | - | - | | 2.8420 | 10504 | 0.1665 | - | - | | 2.8423 | 10505 | 0.231 | - | - | | 2.8425 | 10506 | 0.1996 | - | - | | 2.8428 | 10507 | 0.188 | - | - | | 2.8431 | 10508 | 0.2211 | - | - | | 2.8433 | 10509 | 0.1794 | - | - | | 2.8436 | 10510 | 0.1714 | - | - | | 2.8439 | 10511 | 0.2177 | - | - | | 2.8442 | 10512 | 0.1814 | - | - | | 2.8444 | 10513 | 0.2004 | - | - | | 2.8447 | 10514 | 0.2261 | - | - | | 2.8450 | 10515 | 0.1903 | - | - | | 2.8452 | 10516 | 0.1682 | - | - | | 2.8455 | 10517 | 0.1979 | - | - | | 2.8458 | 10518 | 0.1513 | - | - | | 2.8460 | 10519 | 0.1103 | - | - | | 2.8463 | 10520 | 0.2082 | - | - | | 2.8466 | 10521 | 0.1825 | - | - | | 2.8469 | 10522 | 0.2426 | - | - | | 2.8471 | 10523 | 0.1731 | - | - | | 2.8474 | 10524 | 0.1933 | - | - | | 2.8477 | 10525 | 0.245 | - | - | | 2.8479 | 10526 | 0.1581 | - | - | | 2.8482 | 10527 | 0.2058 | - | - | | 2.8485 | 10528 | 0.1805 | - | - | | 2.8488 | 10529 | 0.2101 | - | - | | 2.8490 | 10530 | 0.3166 | - | - | | 2.8493 | 10531 | 0.1909 | - | - | | 2.8496 | 10532 | 0.2222 | - | - | | 2.8498 | 10533 | 0.177 | - | - | | 2.8501 | 10534 | 0.2207 | - | - | | 2.8504 | 10535 | 0.2584 | - | - | | 2.8506 | 10536 | 0.2048 | - | - | | 2.8509 | 10537 | 0.1717 | - | - | | 2.8512 | 10538 | 0.1785 | - | - | | 2.8515 | 10539 | 0.1995 | - | - | | 2.8517 | 10540 | 0.1747 | - | - | | 2.8520 | 10541 | 0.138 | - | - | | 2.8523 | 10542 | 0.1865 | - | - | | 2.8525 | 10543 | 0.157 | - | - | | 2.8528 | 10544 | 0.1387 | - | - | | 2.8531 | 10545 | 0.2247 | - | - | | 2.8534 | 10546 | 0.1726 | - | - | | 2.8536 | 10547 | 0.2175 | - 
| - | | 2.8539 | 10548 | 0.1751 | - | - | | 2.8542 | 10549 | 0.1953 | - | - | | 2.8544 | 10550 | 0.2146 | - | - | | 2.8547 | 10551 | 0.2245 | - | - | | 2.8550 | 10552 | 0.1479 | - | - | | 2.8552 | 10553 | 0.1233 | - | - | | 2.8555 | 10554 | 0.1496 | - | - | | 2.8558 | 10555 | 0.1927 | - | - | | 2.8561 | 10556 | 0.2005 | - | - | | 2.8563 | 10557 | 0.2218 | - | - | | 2.8566 | 10558 | 0.1881 | - | - | | 2.8569 | 10559 | 0.1941 | - | - | | 2.8571 | 10560 | 0.1797 | - | - | | 2.8574 | 10561 | 0.1338 | - | - | | 2.8577 | 10562 | 0.1743 | - | - | | 2.8580 | 10563 | 0.1895 | - | - | | 2.8582 | 10564 | 0.2136 | - | - | | 2.8585 | 10565 | 0.3177 | - | - | | 2.8588 | 10566 | 0.1628 | - | - | | 2.8590 | 10567 | 0.1455 | - | - | | 2.8593 | 10568 | 0.1476 | - | - | | 2.8596 | 10569 | 0.2476 | - | - | | 2.8598 | 10570 | 0.1942 | - | - | | 2.8601 | 10571 | 0.1878 | - | - | | 2.8604 | 10572 | 0.118 | - | - | | 2.8607 | 10573 | 0.2184 | - | - | | 2.8609 | 10574 | 0.1432 | - | - | | 2.8612 | 10575 | 0.1856 | - | - | | 2.8615 | 10576 | 0.1588 | - | - | | 2.8617 | 10577 | 0.1983 | - | - | | 2.8620 | 10578 | 0.1234 | - | - | | 2.8623 | 10579 | 0.2296 | - | - | | 2.8626 | 10580 | 0.1579 | - | - | | 2.8628 | 10581 | 0.1419 | - | - | | 2.8631 | 10582 | 0.1821 | - | - | | 2.8634 | 10583 | 0.1903 | - | - | | 2.8636 | 10584 | 0.1767 | - | - | | 2.8639 | 10585 | 0.1951 | - | - | | 2.8642 | 10586 | 0.1361 | - | - | | 2.8644 | 10587 | 0.1633 | - | - | | 2.8647 | 10588 | 0.1297 | - | - | | 2.8650 | 10589 | 0.1232 | - | - | | 2.8653 | 10590 | 0.1993 | - | - | | 2.8655 | 10591 | 0.2096 | - | - | | 2.8658 | 10592 | 0.1747 | - | - | | 2.8661 | 10593 | 0.1515 | - | - | | 2.8663 | 10594 | 0.2906 | - | - | | 2.8666 | 10595 | 0.1678 | - | - | | 2.8669 | 10596 | 0.1363 | - | - | | 2.8672 | 10597 | 0.1483 | - | - | | 2.8674 | 10598 | 0.2055 | - | - | | 2.8677 | 10599 | 0.1206 | - | - | | 2.8680 | 10600 | 0.1471 | - | - | | 2.8682 | 10601 | 0.1455 | - | - | | 2.8685 | 10602 | 0.21 | - | - | | 2.8688 | 10603 
| 0.1909 | - | - | | 2.8690 | 10604 | 0.1953 | - | - | | 2.8693 | 10605 | 0.228 | - | - | | 2.8696 | 10606 | 0.1463 | - | - | | 2.8699 | 10607 | 0.1117 | - | - | | 2.8701 | 10608 | 0.2866 | - | - | | 2.8704 | 10609 | 0.1771 | - | - | | 2.8707 | 10610 | 0.2066 | - | - | | 2.8709 | 10611 | 0.2137 | - | - | | 2.8712 | 10612 | 0.1635 | - | - | | 2.8715 | 10613 | 0.2045 | - | - | | 2.8718 | 10614 | 0.1758 | - | - | | 2.8720 | 10615 | 0.2211 | - | - | | 2.8723 | 10616 | 0.2206 | - | - | | 2.8726 | 10617 | 0.2271 | - | - | | 2.8728 | 10618 | 0.0931 | - | - | | 2.8731 | 10619 | 0.2128 | - | - | | 2.8734 | 10620 | 0.1514 | - | - | | 2.8736 | 10621 | 0.2751 | - | - | | 2.8739 | 10622 | 0.2332 | - | - | | 2.8742 | 10623 | 0.12 | - | - | | 2.8745 | 10624 | 0.1489 | - | - | | 2.8747 | 10625 | 0.2399 | - | - | | 2.875 | 10626 | 0.1356 | - | - | | 2.8753 | 10627 | 0.1875 | - | - | | 2.8755 | 10628 | 0.1392 | - | - | | 2.8758 | 10629 | 0.2431 | - | - | | 2.8761 | 10630 | 0.1451 | - | - | | 2.8764 | 10631 | 0.2169 | - | - | | 2.8766 | 10632 | 0.1121 | - | - | | 2.8769 | 10633 | 0.2058 | - | - | | 2.8772 | 10634 | 0.1463 | - | - | | 2.8774 | 10635 | 0.2316 | - | - | | 2.8777 | 10636 | 0.1518 | - | - | | 2.8780 | 10637 | 0.2189 | - | - | | 2.8782 | 10638 | 0.2339 | - | - | | 2.8785 | 10639 | 0.1672 | - | - | | 2.8788 | 10640 | 0.1573 | - | - | | 2.8791 | 10641 | 0.2717 | - | - | | 2.8793 | 10642 | 0.1555 | - | - | | 2.8796 | 10643 | 0.1576 | - | - | | 2.8799 | 10644 | 0.1973 | - | - | | 2.8801 | 10645 | 0.215 | - | - | | 2.8804 | 10646 | 0.153 | - | - | | 2.8807 | 10647 | 0.215 | - | - | | 2.8810 | 10648 | 0.2597 | - | - | | 2.8812 | 10649 | 0.2697 | - | - | | 2.8815 | 10650 | 0.1622 | - | - | | 2.8818 | 10651 | 0.1893 | - | - | | 2.8820 | 10652 | 0.2438 | - | - | | 2.8823 | 10653 | 0.1799 | - | - | | 2.8826 | 10654 | 0.1759 | - | - | | 2.8828 | 10655 | 0.2084 | - | - | | 2.8831 | 10656 | 0.1364 | - | - | | 2.8834 | 10657 | 0.1631 | - | - | | 2.8837 | 10658 | 0.2146 | - | - | | 
2.8839 | 10659 | 0.1337 | - | - | | 2.8842 | 10660 | 0.1524 | - | - | | 2.8845 | 10661 | 0.1615 | - | - | | 2.8847 | 10662 | 0.1751 | - | - | | 2.8850 | 10663 | 0.2152 | - | - | | 2.8853 | 10664 | 0.1706 | - | - | | 2.8856 | 10665 | 0.1669 | - | - | | 2.8858 | 10666 | 0.1562 | - | - | | 2.8861 | 10667 | 0.1629 | - | - | | 2.8864 | 10668 | 0.2306 | - | - | | 2.8866 | 10669 | 0.1939 | - | - | | 2.8869 | 10670 | 0.2133 | - | - | | 2.8872 | 10671 | 0.1943 | - | - | | 2.8874 | 10672 | 0.2565 | - | - | | 2.8877 | 10673 | 0.2018 | - | - | | 2.8880 | 10674 | 0.182 | - | - | | 2.8883 | 10675 | 0.1823 | - | - | | 2.8885 | 10676 | 0.1892 | - | - | | 2.8888 | 10677 | 0.1558 | - | - | | 2.8891 | 10678 | 0.174 | - | - | | 2.8893 | 10679 | 0.1583 | - | - | | 2.8896 | 10680 | 0.1802 | - | - | | 2.8899 | 10681 | 0.2063 | - | - | | 2.8902 | 10682 | 0.222 | - | - | | 2.8904 | 10683 | 0.137 | - | - | | 2.8907 | 10684 | 0.2071 | - | - | | 2.8910 | 10685 | 0.1504 | - | - | | 2.8912 | 10686 | 0.2151 | - | - | | 2.8915 | 10687 | 0.1764 | - | - | | 2.8918 | 10688 | 0.2647 | - | - | | 2.8920 | 10689 | 0.1475 | - | - | | 2.8923 | 10690 | 0.1558 | - | - | | 2.8926 | 10691 | 0.1369 | - | - | | 2.8929 | 10692 | 0.2023 | - | - | | 2.8931 | 10693 | 0.1916 | - | - | | 2.8934 | 10694 | 0.1545 | - | - | | 2.8937 | 10695 | 0.1931 | - | - | | 2.8939 | 10696 | 0.1264 | - | - | | 2.8942 | 10697 | 0.229 | - | - | | 2.8945 | 10698 | 0.1923 | - | - | | 2.8948 | 10699 | 0.2086 | - | - | | 2.8950 | 10700 | 0.2655 | - | - | | 2.8953 | 10701 | 0.1954 | - | - | | 2.8956 | 10702 | 0.1568 | - | - | | 2.8958 | 10703 | 0.1439 | - | - | | 2.8961 | 10704 | 0.2027 | - | - | | 2.8964 | 10705 | 0.1823 | - | - | | 2.8966 | 10706 | 0.1524 | - | - | | 2.8969 | 10707 | 0.2384 | - | - | | 2.8972 | 10708 | 0.2256 | - | - | | 2.8975 | 10709 | 0.1773 | - | - | | 2.8977 | 10710 | 0.155 | - | - | | 2.8980 | 10711 | 0.2588 | - | - | | 2.8983 | 10712 | 0.177 | - | - | | 2.8985 | 10713 | 0.1602 | - | - | | 2.8988 | 10714 | 0.1683 | 
- | - | | 2.8991 | 10715 | 0.1747 | - | - | | 2.8994 | 10716 | 0.1844 | - | - | | 2.8996 | 10717 | 0.218 | - | - | | 2.8999 | 10718 | 0.146 | - | - | | 2.9002 | 10719 | 0.2903 | - | - | | 2.9004 | 10720 | 0.2253 | - | - | | 2.9007 | 10721 | 0.1771 | - | - | | 2.9010 | 10722 | 0.2008 | - | - | | 2.9012 | 10723 | 0.1747 | - | - | | 2.9015 | 10724 | 0.2296 | - | - | | 2.9018 | 10725 | 0.1823 | - | - | | 2.9021 | 10726 | 0.1863 | - | - | | 2.9023 | 10727 | 0.1503 | - | - | | 2.9026 | 10728 | 0.2734 | - | - | | 2.9029 | 10729 | 0.1883 | - | - | | 2.9031 | 10730 | 0.1899 | - | - | | 2.9034 | 10731 | 0.2035 | - | - | | 2.9037 | 10732 | 0.1646 | - | - | | 2.9040 | 10733 | 0.1627 | - | - | | 2.9042 | 10734 | 0.1989 | - | - | | 2.9045 | 10735 | 0.234 | - | - | | 2.9048 | 10736 | 0.1833 | - | - | | 2.9050 | 10737 | 0.2055 | - | - | | 2.9053 | 10738 | 0.1728 | - | - | | 2.9056 | 10739 | 0.1939 | - | - | | 2.9058 | 10740 | 0.2233 | - | - | | 2.9061 | 10741 | 0.2412 | - | - | | 2.9064 | 10742 | 0.1485 | - | - | | 2.9067 | 10743 | 0.1736 | - | - | | 2.9069 | 10744 | 0.1485 | - | - | | 2.9072 | 10745 | 0.1832 | - | - | | 2.9075 | 10746 | 0.1879 | - | - | | 2.9077 | 10747 | 0.1799 | - | - | | 2.9080 | 10748 | 0.1622 | - | - | | 2.9083 | 10749 | 0.2621 | - | - | | 2.9085 | 10750 | 0.201 | - | - | | 2.9088 | 10751 | 0.1541 | - | - | | 2.9091 | 10752 | 0.1638 | - | - | | 2.9094 | 10753 | 0.2259 | - | - | | 2.9096 | 10754 | 0.2438 | - | - | | 2.9099 | 10755 | 0.179 | - | - | | 2.9102 | 10756 | 0.137 | - | - | | 2.9104 | 10757 | 0.2443 | - | - | | 2.9107 | 10758 | 0.218 | - | - | | 2.9110 | 10759 | 0.1345 | - | - | | 2.9113 | 10760 | 0.1721 | - | - | | 2.9115 | 10761 | 0.2348 | - | - | | 2.9118 | 10762 | 0.1431 | - | - | | 2.9121 | 10763 | 0.1682 | - | - | | 2.9123 | 10764 | 0.2025 | - | - | | 2.9126 | 10765 | 0.2218 | - | - | | 2.9129 | 10766 | 0.1899 | - | - | | 2.9131 | 10767 | 0.1616 | - | - | | 2.9134 | 10768 | 0.3175 | - | - | | 2.9137 | 10769 | 0.231 | - | - | | 2.9140 | 10770 | 
0.2001 | - | - | | 2.9142 | 10771 | 0.1704 | - | - | | 2.9145 | 10772 | 0.1921 | - | - | | 2.9148 | 10773 | 0.1277 | - | - | | 2.9150 | 10774 | 0.2791 | - | - | | 2.9153 | 10775 | 0.185 | - | - | | 2.9156 | 10776 | 0.1429 | - | - | | 2.9159 | 10777 | 0.2471 | - | - | | 2.9161 | 10778 | 0.1186 | - | - | | 2.9164 | 10779 | 0.1827 | - | - | | 2.9167 | 10780 | 0.1694 | - | - | | 2.9169 | 10781 | 0.1204 | - | - | | 2.9172 | 10782 | 0.1684 | - | - | | 2.9175 | 10783 | 0.15 | - | - | | 2.9177 | 10784 | 0.1319 | - | - | | 2.9180 | 10785 | 0.1743 | - | - | | 2.9183 | 10786 | 0.2029 | - | - | | 2.9186 | 10787 | 0.1502 | - | - | | 2.9188 | 10788 | 0.1577 | - | - | | 2.9191 | 10789 | 0.2391 | - | - | | 2.9194 | 10790 | 0.1845 | - | - | | 2.9196 | 10791 | 0.1412 | - | - | | 2.9199 | 10792 | 0.2339 | - | - | | 2.9202 | 10793 | 0.1873 | - | - | | 2.9205 | 10794 | 0.2112 | - | - | | 2.9207 | 10795 | 0.1623 | - | - | | 2.9210 | 10796 | 0.1716 | - | - | | 2.9213 | 10797 | 0.2284 | - | - | | 2.9215 | 10798 | 0.1397 | - | - | | 2.9218 | 10799 | 0.1881 | - | - | | 2.9221 | 10800 | 0.2381 | - | - | | 2.9223 | 10801 | 0.2333 | - | - | | 2.9226 | 10802 | 0.1799 | - | - | | 2.9229 | 10803 | 0.2059 | - | - | | 2.9232 | 10804 | 0.1789 | - | - | | 2.9234 | 10805 | 0.1897 | - | - | | 2.9237 | 10806 | 0.2058 | - | - | | 2.9240 | 10807 | 0.1219 | - | - | | 2.9242 | 10808 | 0.24 | - | - | | 2.9245 | 10809 | 0.1689 | - | - | | 2.9248 | 10810 | 0.2381 | - | - | | 2.9251 | 10811 | 0.1386 | - | - | | 2.9253 | 10812 | 0.2256 | - | - | | 2.9256 | 10813 | 0.1367 | - | - | | 2.9259 | 10814 | 0.3294 | - | - | | 2.9261 | 10815 | 0.1616 | - | - | | 2.9264 | 10816 | 0.1534 | - | - | | 2.9267 | 10817 | 0.2463 | - | - | | 2.9269 | 10818 | 0.1633 | - | - | | 2.9272 | 10819 | 0.1982 | - | - | | 2.9275 | 10820 | 0.1502 | - | - | | 2.9278 | 10821 | 0.3456 | - | - | | 2.9280 | 10822 | 0.1909 | - | - | | 2.9283 | 10823 | 0.1299 | - | - | | 2.9286 | 10824 | 0.1819 | - | - | | 2.9288 | 10825 | 0.2358 | - | - | | 
2.9291 | 10826 | 0.2046 | - | - | | 2.9294 | 10827 | 0.1358 | - | - | | 2.9297 | 10828 | 0.2031 | - | - | | 2.9299 | 10829 | 0.1846 | - | - | | 2.9302 | 10830 | 0.1837 | - | - | | 2.9305 | 10831 | 0.1403 | - | - | | 2.9307 | 10832 | 0.1227 | - | - | | 2.9310 | 10833 | 0.1018 | - | - | | 2.9313 | 10834 | 0.2202 | - | - | | 2.9315 | 10835 | 0.185 | - | - | | 2.9318 | 10836 | 0.1498 | - | - | | 2.9321 | 10837 | 0.1721 | - | - | | 2.9324 | 10838 | 0.1742 | - | - | | 2.9326 | 10839 | 0.218 | - | - | | 2.9329 | 10840 | 0.1163 | - | - | | 2.9332 | 10841 | 0.2189 | - | - | | 2.9334 | 10842 | 0.1898 | - | - | | 2.9337 | 10843 | 0.2953 | - | - | | 2.9340 | 10844 | 0.1586 | - | - | | 2.9343 | 10845 | 0.2057 | - | - | | 2.9345 | 10846 | 0.1512 | - | - | | 2.9348 | 10847 | 0.2322 | - | - | | 2.9351 | 10848 | 0.1641 | - | - | | 2.9353 | 10849 | 0.1631 | - | - | | 2.9356 | 10850 | 0.2223 | - | - | | 2.9359 | 10851 | 0.1154 | - | - | | 2.9361 | 10852 | 0.2228 | - | - | | 2.9364 | 10853 | 0.2075 | - | - | | 2.9367 | 10854 | 0.1662 | - | - | | 2.9370 | 10855 | 0.2077 | - | - | | 2.9372 | 10856 | 0.1588 | - | - | | 2.9375 | 10857 | 0.1287 | - | - | | 2.9378 | 10858 | 0.1771 | - | - | | 2.9380 | 10859 | 0.2064 | - | - | | 2.9383 | 10860 | 0.1718 | - | - | | 2.9386 | 10861 | 0.195 | - | - | | 2.9389 | 10862 | 0.1676 | - | - | | 2.9391 | 10863 | 0.163 | - | - | | 2.9394 | 10864 | 0.2006 | - | - | | 2.9397 | 10865 | 0.1884 | - | - | | 2.9399 | 10866 | 0.158 | - | - | | 2.9402 | 10867 | 0.1384 | - | - | | 2.9405 | 10868 | 0.2343 | - | - | | 2.9407 | 10869 | 0.157 | - | - | | 2.9410 | 10870 | 0.1913 | - | - | | 2.9413 | 10871 | 0.2577 | - | - | | 2.9416 | 10872 | 0.2317 | - | - | | 2.9418 | 10873 | 0.1694 | - | - | | 2.9421 | 10874 | 0.2256 | - | - | | 2.9424 | 10875 | 0.1665 | - | - | | 2.9426 | 10876 | 0.184 | - | - | | 2.9429 | 10877 | 0.144 | - | - | | 2.9432 | 10878 | 0.2195 | - | - | | 2.9435 | 10879 | 0.2079 | - | - | | 2.9437 | 10880 | 0.1575 | - | - | | 2.9440 | 10881 | 0.1773 | - 
| - | | 2.9443 | 10882 | 0.1654 | - | - | | 2.9445 | 10883 | 0.2151 | - | - | | 2.9448 | 10884 | 0.2153 | - | - | | 2.9451 | 10885 | 0.1212 | - | - | | 2.9453 | 10886 | 0.2053 | - | - | | 2.9456 | 10887 | 0.165 | - | - | | 2.9459 | 10888 | 0.1891 | - | - | | 2.9462 | 10889 | 0.1672 | - | - | | 2.9464 | 10890 | 0.2943 | - | - | | 2.9467 | 10891 | 0.1701 | - | - | | 2.9470 | 10892 | 0.1666 | - | - | | 2.9472 | 10893 | 0.2318 | - | - | | 2.9475 | 10894 | 0.223 | - | - | | 2.9478 | 10895 | 0.174 | - | - | | 2.9481 | 10896 | 0.1513 | - | - | | 2.9483 | 10897 | 0.1205 | - | - | | 2.9486 | 10898 | 0.221 | - | - | | 2.9489 | 10899 | 0.1512 | - | - | | 2.9491 | 10900 | 0.1411 | - | - | | 2.9494 | 10901 | 0.2061 | - | - | | 2.9497 | 10902 | 0.2075 | - | - | | 2.9499 | 10903 | 0.2053 | - | - | | 2.9502 | 10904 | 0.1553 | - | - | | 2.9505 | 10905 | 0.2183 | - | - | | 2.9508 | 10906 | 0.1359 | - | - | | 2.9510 | 10907 | 0.1551 | - | - | | 2.9513 | 10908 | 0.1891 | - | - | | 2.9516 | 10909 | 0.1679 | - | - | | 2.9518 | 10910 | 0.1704 | - | - | | 2.9521 | 10911 | 0.186 | - | - | | 2.9524 | 10912 | 0.1458 | - | - | | 2.9527 | 10913 | 0.1298 | - | - | | 2.9529 | 10914 | 0.1395 | - | - | | 2.9532 | 10915 | 0.1786 | - | - | | 2.9535 | 10916 | 0.2512 | - | - | | 2.9537 | 10917 | 0.2308 | - | - | | 2.9540 | 10918 | 0.1775 | - | - | | 2.9543 | 10919 | 0.2718 | - | - | | 2.9545 | 10920 | 0.1546 | - | - | | 2.9548 | 10921 | 0.2088 | - | - | | 2.9551 | 10922 | 0.1039 | - | - | | 2.9554 | 10923 | 0.2764 | - | - | | 2.9556 | 10924 | 0.2215 | - | - | | 2.9559 | 10925 | 0.1714 | - | - | | 2.9562 | 10926 | 0.2052 | - | - | | 2.9564 | 10927 | 0.1709 | - | - | | 2.9567 | 10928 | 0.1136 | - | - | | 2.9570 | 10929 | 0.1527 | - | - | | 2.9573 | 10930 | 0.1958 | - | - | | 2.9575 | 10931 | 0.2095 | - | - | | 2.9578 | 10932 | 0.1925 | - | - | | 2.9581 | 10933 | 0.2044 | - | - | | 2.9583 | 10934 | 0.1723 | - | - | | 2.9586 | 10935 | 0.1548 | - | - | | 2.9589 | 10936 | 0.2104 | - | - | | 2.9591 | 10937 | 
0.2052 | - | - | | 2.9594 | 10938 | 0.1668 | - | - | | 2.9597 | 10939 | 0.2236 | - | - | | 2.9600 | 10940 | 0.1788 | - | - | | 2.9602 | 10941 | 0.2125 | - | - | | 2.9605 | 10942 | 0.2238 | - | - | | 2.9608 | 10943 | 0.2192 | - | - | | 2.9610 | 10944 | 0.1931 | - | - | | 2.9613 | 10945 | 0.1601 | - | - | | 2.9616 | 10946 | 0.2044 | - | - | | 2.9619 | 10947 | 0.1249 | - | - | | 2.9621 | 10948 | 0.2226 | - | - | | 2.9624 | 10949 | 0.17 | - | - | | 2.9627 | 10950 | 0.2369 | - | - | | 2.9629 | 10951 | 0.1709 | - | - | | 2.9632 | 10952 | 0.1802 | - | - | | 2.9635 | 10953 | 0.1901 | - | - | | 2.9637 | 10954 | 0.2374 | - | - | | 2.9640 | 10955 | 0.1263 | - | - | | 2.9643 | 10956 | 0.1319 | - | - | | 2.9646 | 10957 | 0.2281 | - | - | | 2.9648 | 10958 | 0.1376 | - | - | | 2.9651 | 10959 | 0.2055 | - | - | | 2.9654 | 10960 | 0.1585 | - | - | | 2.9656 | 10961 | 0.2006 | - | - | | 2.9659 | 10962 | 0.1356 | - | - | | 2.9662 | 10963 | 0.1555 | - | - | | 2.9665 | 10964 | 0.188 | - | - | | 2.9667 | 10965 | 0.2699 | - | - | | 2.9670 | 10966 | 0.1989 | - | - | | 2.9673 | 10967 | 0.1939 | - | - | | 2.9675 | 10968 | 0.1585 | - | - | | 2.9678 | 10969 | 0.188 | - | - | | 2.9681 | 10970 | 0.1638 | - | - | | 2.9683 | 10971 | 0.1427 | - | - | | 2.9686 | 10972 | 0.1983 | - | - | | 2.9689 | 10973 | 0.1296 | - | - | | 2.9692 | 10974 | 0.3014 | - | - | | 2.9694 | 10975 | 0.2089 | - | - | | 2.9697 | 10976 | 0.178 | - | - | | 2.9700 | 10977 | 0.1937 | - | - | | 2.9702 | 10978 | 0.1457 | - | - | | 2.9705 | 10979 | 0.1898 | - | - | | 2.9708 | 10980 | 0.1964 | - | - | | 2.9710 | 10981 | 0.2027 | - | - | | 2.9713 | 10982 | 0.182 | - | - | | 2.9716 | 10983 | 0.2389 | - | - | | 2.9719 | 10984 | 0.1854 | - | - | | 2.9721 | 10985 | 0.2024 | - | - | | 2.9724 | 10986 | 0.1542 | - | - | | 2.9727 | 10987 | 0.1508 | - | - | | 2.9729 | 10988 | 0.2577 | - | - | | 2.9732 | 10989 | 0.1813 | - | - | | 2.9735 | 10990 | 0.1812 | - | - | | 2.9738 | 10991 | 0.1965 | - | - | | 2.9740 | 10992 | 0.1592 | - | - | | 2.9743 
| 10993 | 0.1641 | - | - | | 2.9746 | 10994 | 0.1418 | - | - | | 2.9748 | 10995 | 0.178 | - | - | | 2.9751 | 10996 | 0.2119 | - | - | | 2.9754 | 10997 | 0.1629 | - | - | | 2.9756 | 10998 | 0.1647 | - | - | | 2.9759 | 10999 | 0.1412 | - | - | | 2.9762 | 11000 | 0.1567 | 0.1942 | 0.9536 | | 2.9765 | 11001 | 0.1205 | - | - | | 2.9767 | 11002 | 0.1624 | - | - | | 2.9770 | 11003 | 0.218 | - | - | | 2.9773 | 11004 | 0.1717 | - | - | | 2.9775 | 11005 | 0.1045 | - | - | | 2.9778 | 11006 | 0.1428 | - | - | | 2.9781 | 11007 | 0.2438 | - | - | | 2.9784 | 11008 | 0.1575 | - | - | | 2.9786 | 11009 | 0.1458 | - | - | | 2.9789 | 11010 | 0.1448 | - | - | | 2.9792 | 11011 | 0.1627 | - | - | | 2.9794 | 11012 | 0.1622 | - | - | | 2.9797 | 11013 | 0.2138 | - | - | | 2.9800 | 11014 | 0.1602 | - | - | | 2.9802 | 11015 | 0.1324 | - | - | | 2.9805 | 11016 | 0.2083 | - | - | | 2.9808 | 11017 | 0.1309 | - | - | | 2.9811 | 11018 | 0.2022 | - | - | | 2.9813 | 11019 | 0.188 | - | - | | 2.9816 | 11020 | 0.3131 | - | - | | 2.9819 | 11021 | 0.1711 | - | - | | 2.9821 | 11022 | 0.1969 | - | - | | 2.9824 | 11023 | 0.1545 | - | - | | 2.9827 | 11024 | 0.2234 | - | - | | 2.9830 | 11025 | 0.1967 | - | - | | 2.9832 | 11026 | 0.1965 | - | - | | 2.9835 | 11027 | 0.2088 | - | - | | 2.9838 | 11028 | 0.189 | - | - | | 2.9840 | 11029 | 0.1391 | - | - | | 2.9843 | 11030 | 0.1592 | - | - | | 2.9846 | 11031 | 0.1464 | - | - | | 2.9848 | 11032 | 0.241 | - | - | | 2.9851 | 11033 | 0.1879 | - | - | | 2.9854 | 11034 | 0.1539 | - | - | | 2.9857 | 11035 | 0.2478 | - | - | | 2.9859 | 11036 | 0.1594 | - | - | | 2.9862 | 11037 | 0.1409 | - | - | | 2.9865 | 11038 | 0.248 | - | - | | 2.9867 | 11039 | 0.1437 | - | - | | 2.9870 | 11040 | 0.2307 | - | - | | 2.9873 | 11041 | 0.1582 | - | - | | 2.9876 | 11042 | 0.1662 | - | - | | 2.9878 | 11043 | 0.2652 | - | - | | 2.9881 | 11044 | 0.1677 | - | - | | 2.9884 | 11045 | 0.1582 | - | - | | 2.9886 | 11046 | 0.1693 | - | - | | 2.9889 | 11047 | 0.1591 | - | - | | 2.9892 | 11048 | 0.164 
| - | - | | 2.9894 | 11049 | 0.2057 | - | - | | 2.9897 | 11050 | 0.1878 | - | - | | 2.9900 | 11051 | 0.1721 | - | - | | 2.9903 | 11052 | 0.2156 | - | - | | 2.9905 | 11053 | 0.2296 | - | - | | 2.9908 | 11054 | 0.1536 | - | - | | 2.9911 | 11055 | 0.1694 | - | - | | 2.9913 | 11056 | 0.1769 | - | - | | 2.9916 | 11057 | 0.1575 | - | - | | 2.9919 | 11058 | 0.108 | - | - | | 2.9922 | 11059 | 0.1546 | - | - | | 2.9924 | 11060 | 0.1814 | - | - | | 2.9927 | 11061 | 0.1583 | - | - | | 2.9930 | 11062 | 0.2457 | - | - | | 2.9932 | 11063 | 0.1459 | - | - | | 2.9935 | 11064 | 0.1269 | - | - | | 2.9938 | 11065 | 0.1643 | - | - | | 2.9940 | 11066 | 0.1835 | - | - | | 2.9943 | 11067 | 0.1752 | - | - | | 2.9946 | 11068 | 0.193 | - | - | | 2.9949 | 11069 | 0.185 | - | - | | 2.9951 | 11070 | 0.1696 | - | - | | 2.9954 | 11071 | 0.1468 | - | - | | 2.9957 | 11072 | 0.1601 | - | - | | 2.9959 | 11073 | 0.1443 | - | - | | 2.9962 | 11074 | 0.2007 | - | - | | 2.9965 | 11075 | 0.1816 | - | - | | 2.9968 | 11076 | 0.2032 | - | - | | 2.9970 | 11077 | 0.2391 | - | - | | 2.9973 | 11078 | 0.1806 | - | - | | 2.9976 | 11079 | 0.1728 | - | - | | 2.9978 | 11080 | 0.1932 | - | - | | 2.9981 | 11081 | 0.1793 | - | - | | 2.9984 | 11082 | 0.1836 | - | - | | 2.9986 | 11083 | 0.1959 | - | - | | 2.9989 | 11084 | 0.1926 | - | - | | 2.9992 | 11085 | 0.1253 | - | - | | 2.9995 | 11086 | 0.1691 | - | - | | 2.9997 | 11087 | 0.1914 | - | - | | 3.0 | 11088 | 0.1879 | - | - | | 3.0003 | 11089 | 0.1787 | - | - | | 3.0005 | 11090 | 0.143 | - | - | | 3.0008 | 11091 | 0.1638 | - | - | | 3.0011 | 11092 | 0.1623 | - | - | | 3.0014 | 11093 | 0.1981 | - | - | | 3.0016 | 11094 | 0.1207 | - | - | | 3.0019 | 11095 | 0.145 | - | - | | 3.0022 | 11096 | 0.1729 | - | - | | 3.0024 | 11097 | 0.1431 | - | - | | 3.0027 | 11098 | 0.1657 | - | - | | 3.0030 | 11099 | 0.1578 | - | - | | 3.0032 | 11100 | 0.1327 | - | - | | 3.0035 | 11101 | 0.1426 | - | - | | 3.0038 | 11102 | 0.0989 | - | - | | 3.0041 | 11103 | 0.1809 | - | - | | 3.0043 | 11104 
| 0.1641 | - | - | | 3.0046 | 11105 | 0.1446 | - | - | | 3.0049 | 11106 | 0.1501 | - | - | | 3.0051 | 11107 | 0.1254 | - | - | | 3.0054 | 11108 | 0.099 | - | - | | 3.0057 | 11109 | 0.1566 | - | - | | 3.0060 | 11110 | 0.2118 | - | - | | 3.0062 | 11111 | 0.1148 | - | - | | 3.0065 | 11112 | 0.1213 | - | - | | 3.0068 | 11113 | 0.2022 | - | - | | 3.0070 | 11114 | 0.1731 | - | - | | 3.0073 | 11115 | 0.2485 | - | - | | 3.0076 | 11116 | 0.1806 | - | - | | 3.0078 | 11117 | 0.1673 | - | - | | 3.0081 | 11118 | 0.154 | - | - | | 3.0084 | 11119 | 0.1831 | - | - | | 3.0087 | 11120 | 0.1267 | - | - | | 3.0089 | 11121 | 0.1954 | - | - | | 3.0092 | 11122 | 0.127 | - | - | | 3.0095 | 11123 | 0.1336 | - | - | | 3.0097 | 11124 | 0.1627 | - | - | | 3.0100 | 11125 | 0.1276 | - | - | | 3.0103 | 11126 | 0.1436 | - | - | | 3.0106 | 11127 | 0.1722 | - | - | | 3.0108 | 11128 | 0.1633 | - | - | | 3.0111 | 11129 | 0.1751 | - | - | | 3.0114 | 11130 | 0.1322 | - | - | | 3.0116 | 11131 | 0.0989 | - | - | | 3.0119 | 11132 | 0.1746 | - | - | | 3.0122 | 11133 | 0.1126 | - | - | | 3.0124 | 11134 | 0.1696 | - | - | | 3.0127 | 11135 | 0.1781 | - | - | | 3.0130 | 11136 | 0.1829 | - | - | | 3.0133 | 11137 | 0.1522 | - | - | | 3.0135 | 11138 | 0.2208 | - | - | | 3.0138 | 11139 | 0.1252 | - | - | | 3.0141 | 11140 | 0.1762 | - | - | | 3.0143 | 11141 | 0.1452 | - | - | | 3.0146 | 11142 | 0.1223 | - | - | | 3.0149 | 11143 | 0.1278 | - | - | | 3.0152 | 11144 | 0.1354 | - | - | | 3.0154 | 11145 | 0.1516 | - | - | | 3.0157 | 11146 | 0.1182 | - | - | | 3.0160 | 11147 | 0.1559 | - | - | | 3.0162 | 11148 | 0.1295 | - | - | | 3.0165 | 11149 | 0.1088 | - | - | | 3.0168 | 11150 | 0.1388 | - | - | | 3.0170 | 11151 | 0.1675 | - | - | | 3.0173 | 11152 | 0.113 | - | - | | 3.0176 | 11153 | 0.1227 | - | - | | 3.0179 | 11154 | 0.1443 | - | - | | 3.0181 | 11155 | 0.1728 | - | - | | 3.0184 | 11156 | 0.2091 | - | - | | 3.0187 | 11157 | 0.1709 | - | - | | 3.0189 | 11158 | 0.1432 | - | - | | 3.0192 | 11159 | 0.1957 | - | - | | 
3.0195 | 11160 | 0.2125 | - | - | | 3.0198 | 11161 | 0.1962 | - | - | | 3.0200 | 11162 | 0.1708 | - | - | | 3.0203 | 11163 | 0.1924 | - | - | | 3.0206 | 11164 | 0.1536 | - | - | | 3.0208 | 11165 | 0.1412 | - | - | | 3.0211 | 11166 | 0.1217 | - | - | | 3.0214 | 11167 | 0.1278 | - | - | | 3.0216 | 11168 | 0.1555 | - | - | | 3.0219 | 11169 | 0.1069 | - | - | | 3.0222 | 11170 | 0.1264 | - | - | | 3.0225 | 11171 | 0.1621 | - | - | | 3.0227 | 11172 | 0.1205 | - | - | | 3.0230 | 11173 | 0.1116 | - | - | | 3.0233 | 11174 | 0.1818 | - | - | | 3.0235 | 11175 | 0.1929 | - | - | | 3.0238 | 11176 | 0.1769 | - | - | | 3.0241 | 11177 | 0.1675 | - | - | | 3.0244 | 11178 | 0.1495 | - | - | | 3.0246 | 11179 | 0.1852 | - | - | | 3.0249 | 11180 | 0.2347 | - | - | | 3.0252 | 11181 | 0.1536 | - | - | | 3.0254 | 11182 | 0.1742 | - | - | | 3.0257 | 11183 | 0.2229 | - | - | | 3.0260 | 11184 | 0.149 | - | - | | 3.0262 | 11185 | 0.1723 | - | - | | 3.0265 | 11186 | 0.127 | - | - | | 3.0268 | 11187 | 0.1858 | - | - | | 3.0271 | 11188 | 0.1965 | - | - | | 3.0273 | 11189 | 0.2088 | - | - | | 3.0276 | 11190 | 0.1111 | - | - | | 3.0279 | 11191 | 0.1371 | - | - | | 3.0281 | 11192 | 0.1214 | - | - | | 3.0284 | 11193 | 0.1678 | - | - | | 3.0287 | 11194 | 0.1655 | - | - | | 3.0290 | 11195 | 0.19 | - | - | | 3.0292 | 11196 | 0.1927 | - | - | | 3.0295 | 11197 | 0.1734 | - | - | | 3.0298 | 11198 | 0.2523 | - | - | | 3.0300 | 11199 | 0.1441 | - | - | | 3.0303 | 11200 | 0.1293 | - | - | | 3.0306 | 11201 | 0.1777 | - | - | | 3.0308 | 11202 | 0.2189 | - | - | | 3.0311 | 11203 | 0.1274 | - | - | | 3.0314 | 11204 | 0.1562 | - | - | | 3.0317 | 11205 | 0.1834 | - | - | | 3.0319 | 11206 | 0.1532 | - | - | | 3.0322 | 11207 | 0.1566 | - | - | | 3.0325 | 11208 | 0.1847 | - | - | | 3.0327 | 11209 | 0.1527 | - | - | | 3.0330 | 11210 | 0.1511 | - | - | | 3.0333 | 11211 | 0.1364 | - | - | | 3.0335 | 11212 | 0.1649 | - | - | | 3.0338 | 11213 | 0.1203 | - | - | | 3.0341 | 11214 | 0.2499 | - | - | | 3.0344 | 11215 | 0.1671 
| - | - | | 3.0346 | 11216 | 0.103 | - | - | | 3.0349 | 11217 | 0.1413 | - | - | | 3.0352 | 11218 | 0.1627 | - | - | | 3.0354 | 11219 | 0.216 | - | - | | 3.0357 | 11220 | 0.1457 | - | - | | 3.0360 | 11221 | 0.1537 | - | - | | 3.0363 | 11222 | 0.1925 | - | - | | 3.0365 | 11223 | 0.2047 | - | - | | 3.0368 | 11224 | 0.1222 | - | - | | 3.0371 | 11225 | 0.1596 | - | - | | 3.0373 | 11226 | 0.1357 | - | - | | 3.0376 | 11227 | 0.127 | - | - | | 3.0379 | 11228 | 0.1885 | - | - | | 3.0381 | 11229 | 0.131 | - | - | | 3.0384 | 11230 | 0.1312 | - | - | | 3.0387 | 11231 | 0.1976 | - | - | | 3.0390 | 11232 | 0.1347 | - | - | | 3.0392 | 11233 | 0.217 | - | - | | 3.0395 | 11234 | 0.151 | - | - | | 3.0398 | 11235 | 0.2374 | - | - | | 3.0400 | 11236 | 0.1565 | - | - | | 3.0403 | 11237 | 0.1369 | - | - | | 3.0406 | 11238 | 0.1645 | - | - | | 3.0409 | 11239 | 0.1668 | - | - | | 3.0411 | 11240 | 0.155 | - | - | | 3.0414 | 11241 | 0.1733 | - | - | | 3.0417 | 11242 | 0.1221 | - | - | | 3.0419 | 11243 | 0.2172 | - | - | | 3.0422 | 11244 | 0.1342 | - | - | | 3.0425 | 11245 | 0.1648 | - | - | | 3.0427 | 11246 | 0.1598 | - | - | | 3.0430 | 11247 | 0.1997 | - | - | | 3.0433 | 11248 | 0.2076 | - | - | | 3.0436 | 11249 | 0.2185 | - | - | | 3.0438 | 11250 | 0.1625 | - | - | | 3.0441 | 11251 | 0.1443 | - | - | | 3.0444 | 11252 | 0.1404 | - | - | | 3.0446 | 11253 | 0.1505 | - | - | | 3.0449 | 11254 | 0.1783 | - | - | | 3.0452 | 11255 | 0.2197 | - | - | | 3.0455 | 11256 | 0.1612 | - | - | | 3.0457 | 11257 | 0.1256 | - | - | | 3.0460 | 11258 | 0.1699 | - | - | | 3.0463 | 11259 | 0.1337 | - | - | | 3.0465 | 11260 | 0.1325 | - | - | | 3.0468 | 11261 | 0.2526 | - | - | | 3.0471 | 11262 | 0.1454 | - | - | | 3.0473 | 11263 | 0.1179 | - | - | | 3.0476 | 11264 | 0.2116 | - | - | | 3.0479 | 11265 | 0.1739 | - | - | | 3.0482 | 11266 | 0.1168 | - | - | | 3.0484 | 11267 | 0.0858 | - | - | | 3.0487 | 11268 | 0.1382 | - | - | | 3.0490 | 11269 | 0.1546 | - | - | | 3.0492 | 11270 | 0.1732 | - | - | | 3.0495 | 11271 
| 0.2298 | - | - | | 3.0498 | 11272 | 0.2286 | - | - | | 3.0501 | 11273 | 0.1714 | - | - | | 3.0503 | 11274 | 0.1489 | - | - | | 3.0506 | 11275 | 0.1626 | - | - | | 3.0509 | 11276 | 0.1578 | - | - | | 3.0511 | 11277 | 0.1556 | - | - | | 3.0514 | 11278 | 0.127 | - | - | | 3.0517 | 11279 | 0.1833 | - | - | | 3.0519 | 11280 | 0.1479 | - | - | | 3.0522 | 11281 | 0.1887 | - | - | | 3.0525 | 11282 | 0.1817 | - | - | | 3.0528 | 11283 | 0.1471 | - | - | | 3.0530 | 11284 | 0.1534 | - | - | | 3.0533 | 11285 | 0.2484 | - | - | | 3.0536 | 11286 | 0.1702 | - | - | | 3.0538 | 11287 | 0.1971 | - | - | | 3.0541 | 11288 | 0.1908 | - | - | | 3.0544 | 11289 | 0.0846 | - | - | | 3.0547 | 11290 | 0.1939 | - | - | | 3.0549 | 11291 | 0.0985 | - | - | | 3.0552 | 11292 | 0.1277 | - | - | | 3.0555 | 11293 | 0.164 | - | - | | 3.0557 | 11294 | 0.1251 | - | - | | 3.0560 | 11295 | 0.1462 | - | - | | 3.0563 | 11296 | 0.1336 | - | - | | 3.0565 | 11297 | 0.1314 | - | - | | 3.0568 | 11298 | 0.1977 | - | - | | 3.0571 | 11299 | 0.1631 | - | - | | 3.0574 | 11300 | 0.1293 | - | - | | 3.0576 | 11301 | 0.1759 | - | - | | 3.0579 | 11302 | 0.1145 | - | - | | 3.0582 | 11303 | 0.1131 | - | - | | 3.0584 | 11304 | 0.1438 | - | - | | 3.0587 | 11305 | 0.1733 | - | - | | 3.0590 | 11306 | 0.1019 | - | - | | 3.0593 | 11307 | 0.1881 | - | - | | 3.0595 | 11308 | 0.1257 | - | - | | 3.0598 | 11309 | 0.152 | - | - | | 3.0601 | 11310 | 0.1478 | - | - | | 3.0603 | 11311 | 0.1345 | - | - | | 3.0606 | 11312 | 0.1385 | - | - | | 3.0609 | 11313 | 0.1316 | - | - | | 3.0611 | 11314 | 0.1463 | - | - | | 3.0614 | 11315 | 0.1556 | - | - | | 3.0617 | 11316 | 0.1792 | - | - | | 3.0620 | 11317 | 0.1846 | - | - | | 3.0622 | 11318 | 0.1177 | - | - | | 3.0625 | 11319 | 0.1599 | - | - | | 3.0628 | 11320 | 0.2479 | - | - | | 3.0630 | 11321 | 0.1672 | - | - | | 3.0633 | 11322 | 0.2145 | - | - | | 3.0636 | 11323 | 0.131 | - | - | | 3.0639 | 11324 | 0.1416 | - | - | | 3.0641 | 11325 | 0.1691 | - | - | | 3.0644 | 11326 | 0.1748 | - | - | | 
3.0647 | 11327 | 0.147 | - | - | | 3.0649 | 11328 | 0.1444 | - | - | | 3.0652 | 11329 | 0.1691 | - | - | | 3.0655 | 11330 | 0.152 | - | - | | 3.0657 | 11331 | 0.2019 | - | - | | 3.0660 | 11332 | 0.1574 | - | - | | 3.0663 | 11333 | 0.1325 | - | - | | 3.0666 | 11334 | 0.169 | - | - | | 3.0668 | 11335 | 0.1809 | - | - | | 3.0671 | 11336 | 0.1449 | - | - | | 3.0674 | 11337 | 0.1378 | - | - | | 3.0676 | 11338 | 0.245 | - | - | | 3.0679 | 11339 | 0.1858 | - | - | | 3.0682 | 11340 | 0.1104 | - | - | | 3.0685 | 11341 | 0.1946 | - | - | | 3.0687 | 11342 | 0.1488 | - | - | | 3.0690 | 11343 | 0.1154 | - | - | | 3.0693 | 11344 | 0.2182 | - | - | | 3.0695 | 11345 | 0.1777 | - | - | | 3.0698 | 11346 | 0.1577 | - | - | | 3.0701 | 11347 | 0.1137 | - | - | | 3.0703 | 11348 | 0.1941 | - | - | | 3.0706 | 11349 | 0.1897 | - | - | | 3.0709 | 11350 | 0.1201 | - | - | | 3.0712 | 11351 | 0.2448 | - | - | | 3.0714 | 11352 | 0.1469 | - | - | | 3.0717 | 11353 | 0.1757 | - | - | | 3.0720 | 11354 | 0.1309 | - | - | | 3.0722 | 11355 | 0.1701 | - | - | | 3.0725 | 11356 | 0.1121 | - | - | | 3.0728 | 11357 | 0.1384 | - | - | | 3.0731 | 11358 | 0.109 | - | - | | 3.0733 | 11359 | 0.1953 | - | - | | 3.0736 | 11360 | 0.1869 | - | - | | 3.0739 | 11361 | 0.1093 | - | - | | 3.0741 | 11362 | 0.1515 | - | - | | 3.0744 | 11363 | 0.1642 | - | - | | 3.0747 | 11364 | 0.2114 | - | - | | 3.0749 | 11365 | 0.1209 | - | - | | 3.0752 | 11366 | 0.199 | - | - | | 3.0755 | 11367 | 0.1469 | - | - | | 3.0758 | 11368 | 0.1286 | - | - | | 3.0760 | 11369 | 0.1767 | - | - | | 3.0763 | 11370 | 0.105 | - | - | | 3.0766 | 11371 | 0.1966 | - | - | | 3.0768 | 11372 | 0.2367 | - | - | | 3.0771 | 11373 | 0.1555 | - | - | | 3.0774 | 11374 | 0.146 | - | - | | 3.0777 | 11375 | 0.1922 | - | - | | 3.0779 | 11376 | 0.1082 | - | - | | 3.0782 | 11377 | 0.1542 | - | - | | 3.0785 | 11378 | 0.1915 | - | - | | 3.0787 | 11379 | 0.1688 | - | - | | 3.0790 | 11380 | 0.1396 | - | - | | 3.0793 | 11381 | 0.1307 | - | - | | 3.0795 | 11382 | 0.196 | - 
| - | | 3.0798 | 11383 | 0.1389 | - | - | | 3.0801 | 11384 | 0.1686 | - | - | | 3.0804 | 11385 | 0.145 | - | - | | 3.0806 | 11386 | 0.1889 | - | - | | 3.0809 | 11387 | 0.1567 | - | - | | 3.0812 | 11388 | 0.2476 | - | - | | 3.0814 | 11389 | 0.097 | - | - | | 3.0817 | 11390 | 0.1957 | - | - | | 3.0820 | 11391 | 0.136 | - | - | | 3.0823 | 11392 | 0.2114 | - | - | | 3.0825 | 11393 | 0.1554 | - | - | | 3.0828 | 11394 | 0.1971 | - | - | | 3.0831 | 11395 | 0.1547 | - | - | | 3.0833 | 11396 | 0.1369 | - | - | | 3.0836 | 11397 | 0.1657 | - | - | | 3.0839 | 11398 | 0.154 | - | - | | 3.0841 | 11399 | 0.1128 | - | - | | 3.0844 | 11400 | 0.15 | - | - | | 3.0847 | 11401 | 0.2029 | - | - | | 3.0850 | 11402 | 0.1422 | - | - | | 3.0852 | 11403 | 0.1663 | - | - | | 3.0855 | 11404 | 0.1102 | - | - | | 3.0858 | 11405 | 0.1275 | - | - | | 3.0860 | 11406 | 0.1665 | - | - | | 3.0863 | 11407 | 0.1916 | - | - | | 3.0866 | 11408 | 0.1575 | - | - | | 3.0869 | 11409 | 0.1773 | - | - | | 3.0871 | 11410 | 0.1565 | - | - | | 3.0874 | 11411 | 0.2012 | - | - | | 3.0877 | 11412 | 0.1819 | - | - | | 3.0879 | 11413 | 0.161 | - | - | | 3.0882 | 11414 | 0.1479 | - | - | | 3.0885 | 11415 | 0.1692 | - | - | | 3.0887 | 11416 | 0.1483 | - | - | | 3.0890 | 11417 | 0.1862 | - | - | | 3.0893 | 11418 | 0.1414 | - | - | | 3.0896 | 11419 | 0.2072 | - | - | | 3.0898 | 11420 | 0.2108 | - | - | | 3.0901 | 11421 | 0.1316 | - | - | | 3.0904 | 11422 | 0.1133 | - | - | | 3.0906 | 11423 | 0.1519 | - | - | | 3.0909 | 11424 | 0.1163 | - | - | | 3.0912 | 11425 | 0.1372 | - | - | | 3.0915 | 11426 | 0.144 | - | - | | 3.0917 | 11427 | 0.1458 | - | - | | 3.0920 | 11428 | 0.1717 | - | - | | 3.0923 | 11429 | 0.2064 | - | - | | 3.0925 | 11430 | 0.1546 | - | - | | 3.0928 | 11431 | 0.103 | - | - | | 3.0931 | 11432 | 0.1403 | - | - | | 3.0933 | 11433 | 0.1231 | - | - | | 3.0936 | 11434 | 0.1397 | - | - | | 3.0939 | 11435 | 0.1004 | - | - | | 3.0942 | 11436 | 0.2481 | - | - | | 3.0944 | 11437 | 0.1834 | - | - | | 3.0947 | 11438 | 
0.1746 | - | - | | 3.0950 | 11439 | 0.1895 | - | - | | 3.0952 | 11440 | 0.1414 | - | - | | 3.0955 | 11441 | 0.1406 | - | - | | 3.0958 | 11442 | 0.169 | - | - | | 3.0960 | 11443 | 0.2568 | - | - | | 3.0963 | 11444 | 0.138 | - | - | | 3.0966 | 11445 | 0.1673 | - | - | | 3.0969 | 11446 | 0.182 | - | - | | 3.0971 | 11447 | 0.209 | - | - | | 3.0974 | 11448 | 0.1312 | - | - | | 3.0977 | 11449 | 0.1615 | - | - | | 3.0979 | 11450 | 0.1457 | - | - | | 3.0982 | 11451 | 0.1183 | - | - | | 3.0985 | 11452 | 0.1584 | - | - | | 3.0988 | 11453 | 0.2117 | - | - | | 3.0990 | 11454 | 0.122 | - | - | | 3.0993 | 11455 | 0.1182 | - | - | | 3.0996 | 11456 | 0.1602 | - | - | | 3.0998 | 11457 | 0.1331 | - | - | | 3.1001 | 11458 | 0.1408 | - | - | | 3.1004 | 11459 | 0.2132 | - | - | | 3.1006 | 11460 | 0.1635 | - | - | | 3.1009 | 11461 | 0.1039 | - | - | | 3.1012 | 11462 | 0.1468 | - | - | | 3.1015 | 11463 | 0.0954 | - | - | | 3.1017 | 11464 | 0.1521 | - | - | | 3.1020 | 11465 | 0.1684 | - | - | | 3.1023 | 11466 | 0.2066 | - | - | | 3.1025 | 11467 | 0.1619 | - | - | | 3.1028 | 11468 | 0.1913 | - | - | | 3.1031 | 11469 | 0.1461 | - | - | | 3.1034 | 11470 | 0.1418 | - | - | | 3.1036 | 11471 | 0.1098 | - | - | | 3.1039 | 11472 | 0.1309 | - | - | | 3.1042 | 11473 | 0.2057 | - | - | | 3.1044 | 11474 | 0.166 | - | - | | 3.1047 | 11475 | 0.1429 | - | - | | 3.1050 | 11476 | 0.1838 | - | - | | 3.1052 | 11477 | 0.1457 | - | - | | 3.1055 | 11478 | 0.1443 | - | - | | 3.1058 | 11479 | 0.1593 | - | - | | 3.1061 | 11480 | 0.136 | - | - | | 3.1063 | 11481 | 0.1953 | - | - | | 3.1066 | 11482 | 0.1529 | - | - | | 3.1069 | 11483 | 0.1093 | - | - | | 3.1071 | 11484 | 0.1532 | - | - | | 3.1074 | 11485 | 0.1651 | - | - | | 3.1077 | 11486 | 0.1986 | - | - | | 3.1080 | 11487 | 0.167 | - | - | | 3.1082 | 11488 | 0.1133 | - | - | | 3.1085 | 11489 | 0.157 | - | - | | 3.1088 | 11490 | 0.2094 | - | - | | 3.1090 | 11491 | 0.1199 | - | - | | 3.1093 | 11492 | 0.1928 | - | - | | 3.1096 | 11493 | 0.2176 | - | - | | 3.1098 | 
11494 | 0.1454 | - | - | | 3.1101 | 11495 | 0.2104 | - | - | | 3.1104 | 11496 | 0.2476 | - | - | | 3.1107 | 11497 | 0.2106 | - | - | | 3.1109 | 11498 | 0.2015 | - | - | | 3.1112 | 11499 | 0.1717 | - | - | | 3.1115 | 11500 | 0.1481 | - | - | | 3.1117 | 11501 | 0.2217 | - | - | | 3.1120 | 11502 | 0.1389 | - | - | | 3.1123 | 11503 | 0.134 | - | - | | 3.1126 | 11504 | 0.1575 | - | - | | 3.1128 | 11505 | 0.1061 | - | - | | 3.1131 | 11506 | 0.1942 | - | - | | 3.1134 | 11507 | 0.1051 | - | - | | 3.1136 | 11508 | 0.144 | - | - | | 3.1139 | 11509 | 0.0991 | - | - | | 3.1142 | 11510 | 0.1567 | - | - | | 3.1144 | 11511 | 0.191 | - | - | | 3.1147 | 11512 | 0.1765 | - | - | | 3.1150 | 11513 | 0.2186 | - | - | | 3.1153 | 11514 | 0.1355 | - | - | | 3.1155 | 11515 | 0.149 | - | - | | 3.1158 | 11516 | 0.0981 | - | - | | 3.1161 | 11517 | 0.1412 | - | - | | 3.1163 | 11518 | 0.1423 | - | - | | 3.1166 | 11519 | 0.1452 | - | - | | 3.1169 | 11520 | 0.1882 | - | - | | 3.1172 | 11521 | 0.2494 | - | - | | 3.1174 | 11522 | 0.1748 | - | - | | 3.1177 | 11523 | 0.1634 | - | - | | 3.1180 | 11524 | 0.1385 | - | - | | 3.1182 | 11525 | 0.12 | - | - | | 3.1185 | 11526 | 0.1591 | - | - | | 3.1188 | 11527 | 0.1283 | - | - | | 3.1190 | 11528 | 0.2236 | - | - | | 3.1193 | 11529 | 0.1654 | - | - | | 3.1196 | 11530 | 0.1002 | - | - | | 3.1199 | 11531 | 0.1321 | - | - | | 3.1201 | 11532 | 0.1867 | - | - | | 3.1204 | 11533 | 0.1568 | - | - | | 3.1207 | 11534 | 0.1976 | - | - | | 3.1209 | 11535 | 0.1996 | - | - | | 3.1212 | 11536 | 0.1713 | - | - | | 3.1215 | 11537 | 0.1996 | - | - | | 3.1218 | 11538 | 0.182 | - | - | | 3.1220 | 11539 | 0.1525 | - | - | | 3.1223 | 11540 | 0.1304 | - | - | | 3.1226 | 11541 | 0.1545 | - | - | | 3.1228 | 11542 | 0.1599 | - | - | | 3.1231 | 11543 | 0.1802 | - | - | | 3.1234 | 11544 | 0.1619 | - | - | | 3.1236 | 11545 | 0.1276 | - | - | | 3.1239 | 11546 | 0.1904 | - | - | | 3.1242 | 11547 | 0.1454 | - | - | | 3.1245 | 11548 | 0.1602 | - | - | | 3.1247 | 11549 | 0.1653 | - | - | | 
3.125 | 11550 | 0.1209 | - | - | | 3.1253 | 11551 | 0.1377 | - | - | | 3.1255 | 11552 | 0.1447 | - | - | | 3.1258 | 11553 | 0.175 | - | - | | 3.1261 | 11554 | 0.1547 | - | - | | 3.1264 | 11555 | 0.1257 | - | - | | 3.1266 | 11556 | 0.2301 | - | - | | 3.1269 | 11557 | 0.2098 | - | - | | 3.1272 | 11558 | 0.1419 | - | - | | 3.1274 | 11559 | 0.1455 | - | - | | 3.1277 | 11560 | 0.1452 | - | - | | 3.1280 | 11561 | 0.1857 | - | - | | 3.1282 | 11562 | 0.1861 | - | - | | 3.1285 | 11563 | 0.1403 | - | - | | 3.1288 | 11564 | 0.2098 | - | - | | 3.1291 | 11565 | 0.1809 | - | - | | 3.1293 | 11566 | 0.1172 | - | - | | 3.1296 | 11567 | 0.1611 | - | - | | 3.1299 | 11568 | 0.1336 | - | - | | 3.1301 | 11569 | 0.1537 | - | - | | 3.1304 | 11570 | 0.1161 | - | - | | 3.1307 | 11571 | 0.1539 | - | - | | 3.1310 | 11572 | 0.2095 | - | - | | 3.1312 | 11573 | 0.1116 | - | - | | 3.1315 | 11574 | 0.167 | - | - | | 3.1318 | 11575 | 0.1619 | - | - | | 3.1320 | 11576 | 0.1584 | - | - | | 3.1323 | 11577 | 0.1927 | - | - | | 3.1326 | 11578 | 0.1866 | - | - | | 3.1328 | 11579 | 0.1458 | - | - | | 3.1331 | 11580 | 0.1369 | - | - | | 3.1334 | 11581 | 0.1372 | - | - | | 3.1337 | 11582 | 0.1655 | - | - | | 3.1339 | 11583 | 0.1748 | - | - | | 3.1342 | 11584 | 0.1367 | - | - | | 3.1345 | 11585 | 0.1396 | - | - | | 3.1347 | 11586 | 0.1117 | - | - | | 3.1350 | 11587 | 0.1162 | - | - | | 3.1353 | 11588 | 0.1498 | - | - | | 3.1356 | 11589 | 0.1724 | - | - | | 3.1358 | 11590 | 0.1367 | - | - | | 3.1361 | 11591 | 0.1242 | - | - | | 3.1364 | 11592 | 0.1884 | - | - | | 3.1366 | 11593 | 0.178 | - | - | | 3.1369 | 11594 | 0.1292 | - | - | | 3.1372 | 11595 | 0.1369 | - | - | | 3.1374 | 11596 | 0.1709 | - | - | | 3.1377 | 11597 | 0.2087 | - | - | | 3.1380 | 11598 | 0.2 | - | - | | 3.1383 | 11599 | 0.1699 | - | - | | 3.1385 | 11600 | 0.1484 | - | - | | 3.1388 | 11601 | 0.1261 | - | - | | 3.1391 | 11602 | 0.1422 | - | - | | 3.1393 | 11603 | 0.1718 | - | - | | 3.1396 | 11604 | 0.1204 | - | - | | 3.1399 | 11605 | 0.1236 | 
- | - | | 3.1402 | 11606 | 0.1771 | - | - | | 3.1404 | 11607 | 0.1753 | - | - | | 3.1407 | 11608 | 0.1727 | - | - | | 3.1410 | 11609 | 0.1784 | - | - | | 3.1412 | 11610 | 0.1795 | - | - | | 3.1415 | 11611 | 0.1826 | - | - | | 3.1418 | 11612 | 0.179 | - | - | | 3.1420 | 11613 | 0.1573 | - | - | | 3.1423 | 11614 | 0.1641 | - | - | | 3.1426 | 11615 | 0.1426 | - | - | | 3.1429 | 11616 | 0.1706 | - | - | | 3.1431 | 11617 | 0.1465 | - | - | | 3.1434 | 11618 | 0.1793 | - | - | | 3.1437 | 11619 | 0.212 | - | - | | 3.1439 | 11620 | 0.1427 | - | - | | 3.1442 | 11621 | 0.2362 | - | - | | 3.1445 | 11622 | 0.1618 | - | - | | 3.1448 | 11623 | 0.1607 | - | - | | 3.1450 | 11624 | 0.1258 | - | - | | 3.1453 | 11625 | 0.2123 | - | - | | 3.1456 | 11626 | 0.1758 | - | - | | 3.1458 | 11627 | 0.1197 | - | - | | 3.1461 | 11628 | 0.1301 | - | - | | 3.1464 | 11629 | 0.1332 | - | - | | 3.1466 | 11630 | 0.1431 | - | - | | 3.1469 | 11631 | 0.2029 | - | - | | 3.1472 | 11632 | 0.183 | - | - | | 3.1475 | 11633 | 0.1839 | - | - | | 3.1477 | 11634 | 0.1677 | - | - | | 3.1480 | 11635 | 0.1461 | - | - | | 3.1483 | 11636 | 0.1467 | - | - | | 3.1485 | 11637 | 0.1601 | - | - | | 3.1488 | 11638 | 0.1096 | - | - | | 3.1491 | 11639 | 0.1373 | - | - | | 3.1494 | 11640 | 0.1415 | - | - | | 3.1496 | 11641 | 0.1919 | - | - | | 3.1499 | 11642 | 0.1867 | - | - | | 3.1502 | 11643 | 0.1434 | - | - | | 3.1504 | 11644 | 0.1553 | - | - | | 3.1507 | 11645 | 0.1004 | - | - | | 3.1510 | 11646 | 0.1938 | - | - | | 3.1512 | 11647 | 0.101 | - | - | | 3.1515 | 11648 | 0.1584 | - | - | | 3.1518 | 11649 | 0.1601 | - | - | | 3.1521 | 11650 | 0.148 | - | - | | 3.1523 | 11651 | 0.1151 | - | - | | 3.1526 | 11652 | 0.1524 | - | - | | 3.1529 | 11653 | 0.096 | - | - | | 3.1531 | 11654 | 0.2176 | - | - | | 3.1534 | 11655 | 0.1485 | - | - | | 3.1537 | 11656 | 0.1457 | - | - | | 3.1540 | 11657 | 0.216 | - | - | | 3.1542 | 11658 | 0.0966 | - | - | | 3.1545 | 11659 | 0.1324 | - | - | | 3.1548 | 11660 | 0.1194 | - | - | | 3.1550 | 11661 | 
0.1352 | - | - | | 3.1553 | 11662 | 0.1585 | - | - | | 3.1556 | 11663 | 0.1596 | - | - | | 3.1558 | 11664 | 0.1463 | - | - | | 3.1561 | 11665 | 0.1413 | - | - | | 3.1564 | 11666 | 0.1529 | - | - | | 3.1567 | 11667 | 0.1688 | - | - | | 3.1569 | 11668 | 0.1149 | - | - | | 3.1572 | 11669 | 0.1217 | - | - | | 3.1575 | 11670 | 0.171 | - | - | | 3.1577 | 11671 | 0.1504 | - | - | | 3.1580 | 11672 | 0.1372 | - | - | | 3.1583 | 11673 | 0.1323 | - | - | | 3.1585 | 11674 | 0.1056 | - | - | | 3.1588 | 11675 | 0.111 | - | - | | 3.1591 | 11676 | 0.1638 | - | - | | 3.1594 | 11677 | 0.1425 | - | - | | 3.1596 | 11678 | 0.1608 | - | - | | 3.1599 | 11679 | 0.1302 | - | - | | 3.1602 | 11680 | 0.1895 | - | - | | 3.1604 | 11681 | 0.1941 | - | - | | 3.1607 | 11682 | 0.2341 | - | - | | 3.1610 | 11683 | 0.1682 | - | - | | 3.1613 | 11684 | 0.1572 | - | - | | 3.1615 | 11685 | 0.1608 | - | - | | 3.1618 | 11686 | 0.1899 | - | - | | 3.1621 | 11687 | 0.1845 | - | - | | 3.1623 | 11688 | 0.1067 | - | - | | 3.1626 | 11689 | 0.1403 | - | - | | 3.1629 | 11690 | 0.1932 | - | - | | 3.1631 | 11691 | 0.1308 | - | - | | 3.1634 | 11692 | 0.1467 | - | - | | 3.1637 | 11693 | 0.1511 | - | - | | 3.1640 | 11694 | 0.152 | - | - | | 3.1642 | 11695 | 0.1211 | - | - | | 3.1645 | 11696 | 0.1707 | - | - | | 3.1648 | 11697 | 0.1616 | - | - | | 3.1650 | 11698 | 0.1458 | - | - | | 3.1653 | 11699 | 0.205 | - | - | | 3.1656 | 11700 | 0.1034 | - | - | | 3.1659 | 11701 | 0.136 | - | - | | 3.1661 | 11702 | 0.1403 | - | - | | 3.1664 | 11703 | 0.1 | - | - | | 3.1667 | 11704 | 0.1718 | - | - | | 3.1669 | 11705 | 0.2275 | - | - | | 3.1672 | 11706 | 0.1612 | - | - | | 3.1675 | 11707 | 0.1393 | - | - | | 3.1677 | 11708 | 0.1934 | - | - | | 3.1680 | 11709 | 0.1373 | - | - | | 3.1683 | 11710 | 0.1337 | - | - | | 3.1686 | 11711 | 0.1706 | - | - | | 3.1688 | 11712 | 0.1312 | - | - | | 3.1691 | 11713 | 0.2023 | - | - | | 3.1694 | 11714 | 0.1148 | - | - | | 3.1696 | 11715 | 0.1156 | - | - | | 3.1699 | 11716 | 0.1319 | - | - | | 3.1702 | 
11717 | 0.0848 | - | - | | 3.1705 | 11718 | 0.165 | - | - | | 3.1707 | 11719 | 0.1675 | - | - | | 3.1710 | 11720 | 0.1967 | - | - | | 3.1713 | 11721 | 0.1481 | - | - | | 3.1715 | 11722 | 0.1549 | - | - | | 3.1718 | 11723 | 0.1344 | - | - | | 3.1721 | 11724 | 0.1345 | - | - | | 3.1723 | 11725 | 0.1137 | - | - | | 3.1726 | 11726 | 0.1242 | - | - | | 3.1729 | 11727 | 0.0974 | - | - | | 3.1732 | 11728 | 0.1391 | - | - | | 3.1734 | 11729 | 0.1702 | - | - | | 3.1737 | 11730 | 0.1112 | - | - | | 3.1740 | 11731 | 0.1113 | - | - | | 3.1742 | 11732 | 0.1464 | - | - | | 3.1745 | 11733 | 0.1776 | - | - | | 3.1748 | 11734 | 0.1237 | - | - | | 3.1751 | 11735 | 0.1274 | - | - | | 3.1753 | 11736 | 0.1781 | - | - | | 3.1756 | 11737 | 0.2299 | - | - | | 3.1759 | 11738 | 0.1516 | - | - | | 3.1761 | 11739 | 0.1543 | - | - | | 3.1764 | 11740 | 0.1904 | - | - | | 3.1767 | 11741 | 0.1396 | - | - | | 3.1769 | 11742 | 0.1215 | - | - | | 3.1772 | 11743 | 0.235 | - | - | | 3.1775 | 11744 | 0.185 | - | - | | 3.1778 | 11745 | 0.1705 | - | - | | 3.1780 | 11746 | 0.181 | - | - | | 3.1783 | 11747 | 0.1544 | - | - | | 3.1786 | 11748 | 0.1386 | - | - | | 3.1788 | 11749 | 0.1741 | - | - | | 3.1791 | 11750 | 0.1717 | - | - | | 3.1794 | 11751 | 0.1512 | - | - | | 3.1797 | 11752 | 0.1453 | - | - | | 3.1799 | 11753 | 0.2071 | - | - | | 3.1802 | 11754 | 0.2051 | - | - | | 3.1805 | 11755 | 0.1136 | - | - | | 3.1807 | 11756 | 0.1154 | - | - | | 3.1810 | 11757 | 0.134 | - | - | | 3.1813 | 11758 | 0.16 | - | - | | 3.1815 | 11759 | 0.1435 | - | - | | 3.1818 | 11760 | 0.1549 | - | - | | 3.1821 | 11761 | 0.1415 | - | - | | 3.1824 | 11762 | 0.1742 | - | - | | 3.1826 | 11763 | 0.1089 | - | - | | 3.1829 | 11764 | 0.113 | - | - | | 3.1832 | 11765 | 0.1882 | - | - | | 3.1834 | 11766 | 0.1724 | - | - | | 3.1837 | 11767 | 0.179 | - | - | | 3.1840 | 11768 | 0.1055 | - | - | | 3.1843 | 11769 | 0.1405 | - | - | | 3.1845 | 11770 | 0.1421 | - | - | | 3.1848 | 11771 | 0.1539 | - | - | | 3.1851 | 11772 | 0.1302 | - | - | | 
3.1853 | 11773 | 0.1455 | - | - | | 3.1856 | 11774 | 0.1634 | - | - | | 3.1859 | 11775 | 0.1682 | - | - | | 3.1861 | 11776 | 0.1375 | - | - | | 3.1864 | 11777 | 0.2166 | - | - | | 3.1867 | 11778 | 0.1799 | - | - | | 3.1870 | 11779 | 0.1555 | - | - | | 3.1872 | 11780 | 0.2124 | - | - | | 3.1875 | 11781 | 0.179 | - | - | | 3.1878 | 11782 | 0.1346 | - | - | | 3.1880 | 11783 | 0.1249 | - | - | | 3.1883 | 11784 | 0.1275 | - | - | | 3.1886 | 11785 | 0.1483 | - | - | | 3.1889 | 11786 | 0.2219 | - | - | | 3.1891 | 11787 | 0.1382 | - | - | | 3.1894 | 11788 | 0.1661 | - | - | | 3.1897 | 11789 | 0.0983 | - | - | | 3.1899 | 11790 | 0.1715 | - | - | | 3.1902 | 11791 | 0.1368 | - | - | | 3.1905 | 11792 | 0.1609 | - | - | | 3.1907 | 11793 | 0.1469 | - | - | | 3.1910 | 11794 | 0.1203 | - | - | | 3.1913 | 11795 | 0.156 | - | - | | 3.1916 | 11796 | 0.1468 | - | - | | 3.1918 | 11797 | 0.1724 | - | - | | 3.1921 | 11798 | 0.1481 | - | - | | 3.1924 | 11799 | 0.1829 | - | - | | 3.1926 | 11800 | 0.1124 | - | - | | 3.1929 | 11801 | 0.1301 | - | - | | 3.1932 | 11802 | 0.1562 | - | - | | 3.1935 | 11803 | 0.1985 | - | - | | 3.1937 | 11804 | 0.2025 | - | - | | 3.1940 | 11805 | 0.2345 | - | - | | 3.1943 | 11806 | 0.1698 | - | - | | 3.1945 | 11807 | 0.1315 | - | - | | 3.1948 | 11808 | 0.2197 | - | - | | 3.1951 | 11809 | 0.1727 | - | - | | 3.1953 | 11810 | 0.1023 | - | - | | 3.1956 | 11811 | 0.1696 | - | - | | 3.1959 | 11812 | 0.2225 | - | - | | 3.1962 | 11813 | 0.1485 | - | - | | 3.1964 | 11814 | 0.1023 | - | - | | 3.1967 | 11815 | 0.1451 | - | - | | 3.1970 | 11816 | 0.1924 | - | - | | 3.1972 | 11817 | 0.1536 | - | - | | 3.1975 | 11818 | 0.1367 | - | - | | 3.1978 | 11819 | 0.192 | - | - | | 3.1981 | 11820 | 0.1611 | - | - | | 3.1983 | 11821 | 0.1345 | - | - | | 3.1986 | 11822 | 0.1046 | - | - | | 3.1989 | 11823 | 0.1583 | - | - | | 3.1991 | 11824 | 0.1098 | - | - | | 3.1994 | 11825 | 0.2043 | - | - | | 3.1997 | 11826 | 0.1165 | - | - | | 3.1999 | 11827 | 0.1676 | - | - | | 3.2002 | 11828 | 
0.1523 | - | - | | 3.2005 | 11829 | 0.1323 | - | - | | 3.2008 | 11830 | 0.1828 | - | - | | 3.2010 | 11831 | 0.144 | - | - | | 3.2013 | 11832 | 0.1629 | - | - | | 3.2016 | 11833 | 0.2056 | - | - | | 3.2018 | 11834 | 0.1589 | - | - | | 3.2021 | 11835 | 0.1404 | - | - | | 3.2024 | 11836 | 0.1504 | - | - | | 3.2027 | 11837 | 0.2044 | - | - | | 3.2029 | 11838 | 0.1107 | - | - | | 3.2032 | 11839 | 0.1626 | - | - | | 3.2035 | 11840 | 0.1535 | - | - | | 3.2037 | 11841 | 0.1495 | - | - | | 3.2040 | 11842 | 0.1385 | - | - | | 3.2043 | 11843 | 0.1758 | - | - | | 3.2045 | 11844 | 0.2206 | - | - | | 3.2048 | 11845 | 0.1492 | - | - | | 3.2051 | 11846 | 0.196 | - | - | | 3.2054 | 11847 | 0.1634 | - | - | | 3.2056 | 11848 | 0.1564 | - | - | | 3.2059 | 11849 | 0.1563 | - | - | | 3.2062 | 11850 | 0.18 | - | - | | 3.2064 | 11851 | 0.1336 | - | - | | 3.2067 | 11852 | 0.1475 | - | - | | 3.2070 | 11853 | 0.1516 | - | - | | 3.2073 | 11854 | 0.1493 | - | - | | 3.2075 | 11855 | 0.2066 | - | - | | 3.2078 | 11856 | 0.1687 | - | - | | 3.2081 | 11857 | 0.18 | - | - | | 3.2083 | 11858 | 0.1307 | - | - | | 3.2086 | 11859 | 0.1758 | - | - | | 3.2089 | 11860 | 0.1256 | - | - | | 3.2091 | 11861 | 0.1176 | - | - | | 3.2094 | 11862 | 0.147 | - | - | | 3.2097 | 11863 | 0.1944 | - | - | | 3.2100 | 11864 | 0.1964 | - | - | | 3.2102 | 11865 | 0.1136 | - | - | | 3.2105 | 11866 | 0.1774 | - | - | | 3.2108 | 11867 | 0.1824 | - | - | | 3.2110 | 11868 | 0.1772 | - | - | | 3.2113 | 11869 | 0.1573 | - | - | | 3.2116 | 11870 | 0.1481 | - | - | | 3.2119 | 11871 | 0.152 | - | - | | 3.2121 | 11872 | 0.18 | - | - | | 3.2124 | 11873 | 0.132 | - | - | | 3.2127 | 11874 | 0.1282 | - | - | | 3.2129 | 11875 | 0.1503 | - | - | | 3.2132 | 11876 | 0.1675 | - | - | | 3.2135 | 11877 | 0.1761 | - | - | | 3.2137 | 11878 | 0.1456 | - | - | | 3.2140 | 11879 | 0.149 | - | - | | 3.2143 | 11880 | 0.1604 | - | - | | 3.2146 | 11881 | 0.1367 | - | - | | 3.2148 | 11882 | 0.2029 | - | - | | 3.2151 | 11883 | 0.1872 | - | - | | 3.2154 | 
11884 | 0.1888 | - | - | | 3.2156 | 11885 | 0.1797 | - | - | | 3.2159 | 11886 | 0.1338 | - | - | | 3.2162 | 11887 | 0.1608 | - | - | | 3.2165 | 11888 | 0.1566 | - | - | | 3.2167 | 11889 | 0.1212 | - | - | | 3.2170 | 11890 | 0.1425 | - | - | | 3.2173 | 11891 | 0.183 | - | - | | 3.2175 | 11892 | 0.1268 | - | - | | 3.2178 | 11893 | 0.1468 | - | - | | 3.2181 | 11894 | 0.1886 | - | - | | 3.2183 | 11895 | 0.151 | - | - | | 3.2186 | 11896 | 0.1817 | - | - | | 3.2189 | 11897 | 0.1602 | - | - | | 3.2192 | 11898 | 0.176 | - | - | | 3.2194 | 11899 | 0.1364 | - | - | | 3.2197 | 11900 | 0.1546 | - | - | | 3.2200 | 11901 | 0.126 | - | - | | 3.2202 | 11902 | 0.168 | - | - | | 3.2205 | 11903 | 0.1019 | - | - | | 3.2208 | 11904 | 0.129 | - | - | | 3.2210 | 11905 | 0.1489 | - | - | | 3.2213 | 11906 | 0.1209 | - | - | | 3.2216 | 11907 | 0.1373 | - | - | | 3.2219 | 11908 | 0.1023 | - | - | | 3.2221 | 11909 | 0.2163 | - | - | | 3.2224 | 11910 | 0.1565 | - | - | | 3.2227 | 11911 | 0.0828 | - | - | | 3.2229 | 11912 | 0.1705 | - | - | | 3.2232 | 11913 | 0.178 | - | - | | 3.2235 | 11914 | 0.1828 | - | - | | 3.2238 | 11915 | 0.1529 | - | - | | 3.2240 | 11916 | 0.1607 | - | - | | 3.2243 | 11917 | 0.1242 | - | - | | 3.2246 | 11918 | 0.1233 | - | - | | 3.2248 | 11919 | 0.1864 | - | - | | 3.2251 | 11920 | 0.094 | - | - | | 3.2254 | 11921 | 0.1154 | - | - | | 3.2256 | 11922 | 0.1297 | - | - | | 3.2259 | 11923 | 0.1028 | - | - | | 3.2262 | 11924 | 0.1783 | - | - | | 3.2265 | 11925 | 0.1757 | - | - | | 3.2267 | 11926 | 0.1352 | - | - | | 3.2270 | 11927 | 0.1299 | - | - | | 3.2273 | 11928 | 0.1052 | - | - | | 3.2275 | 11929 | 0.1736 | - | - | | 3.2278 | 11930 | 0.2099 | - | - | | 3.2281 | 11931 | 0.1414 | - | - | | 3.2284 | 11932 | 0.1194 | - | - | | 3.2286 | 11933 | 0.1361 | - | - | | 3.2289 | 11934 | 0.116 | - | - | | 3.2292 | 11935 | 0.1313 | - | - | | 3.2294 | 11936 | 0.1784 | - | - | | 3.2297 | 11937 | 0.1533 | - | - | | 3.2300 | 11938 | 0.2332 | - | - | | 3.2302 | 11939 | 0.1603 | - | - | | 
3.2305 | 11940 | 0.1577 | - | - | | 3.2308 | 11941 | 0.2287 | - | - | | 3.2311 | 11942 | 0.1725 | - | - | | 3.2313 | 11943 | 0.1898 | - | - | | 3.2316 | 11944 | 0.1415 | - | - | | 3.2319 | 11945 | 0.191 | - | - | | 3.2321 | 11946 | 0.1815 | - | - | | 3.2324 | 11947 | 0.1703 | - | - | | 3.2327 | 11948 | 0.1222 | - | - | | 3.2330 | 11949 | 0.1881 | - | - | | 3.2332 | 11950 | 0.1715 | - | - | | 3.2335 | 11951 | 0.1725 | - | - | | 3.2338 | 11952 | 0.1929 | - | - | | 3.2340 | 11953 | 0.2194 | - | - | | 3.2343 | 11954 | 0.1633 | - | - | | 3.2346 | 11955 | 0.1587 | - | - | | 3.2348 | 11956 | 0.1336 | - | - | | 3.2351 | 11957 | 0.1935 | - | - | | 3.2354 | 11958 | 0.1038 | - | - | | 3.2357 | 11959 | 0.193 | - | - | | 3.2359 | 11960 | 0.1711 | - | - | | 3.2362 | 11961 | 0.1815 | - | - | | 3.2365 | 11962 | 0.1428 | - | - | | 3.2367 | 11963 | 0.1031 | - | - | | 3.2370 | 11964 | 0.1277 | - | - | | 3.2373 | 11965 | 0.1671 | - | - | | 3.2376 | 11966 | 0.134 | - | - | | 3.2378 | 11967 | 0.1846 | - | - | | 3.2381 | 11968 | 0.1219 | - | - | | 3.2384 | 11969 | 0.1381 | - | - | | 3.2386 | 11970 | 0.2014 | - | - | | 3.2389 | 11971 | 0.1854 | - | - | | 3.2392 | 11972 | 0.2116 | - | - | | 3.2394 | 11973 | 0.1225 | - | - | | 3.2397 | 11974 | 0.1708 | - | - | | 3.2400 | 11975 | 0.1833 | - | - | | 3.2403 | 11976 | 0.222 | - | - | | 3.2405 | 11977 | 0.1659 | - | - | | 3.2408 | 11978 | 0.1131 | - | - | | 3.2411 | 11979 | 0.1424 | - | - | | 3.2413 | 11980 | 0.1022 | - | - | | 3.2416 | 11981 | 0.0916 | - | - | | 3.2419 | 11982 | 0.1164 | - | - | | 3.2422 | 11983 | 0.1754 | - | - | | 3.2424 | 11984 | 0.1592 | - | - | | 3.2427 | 11985 | 0.1487 | - | - | | 3.2430 | 11986 | 0.2348 | - | - | | 3.2432 | 11987 | 0.1255 | - | - | | 3.2435 | 11988 | 0.1474 | - | - | | 3.2438 | 11989 | 0.1884 | - | - | | 3.2440 | 11990 | 0.1245 | - | - | | 3.2443 | 11991 | 0.2193 | - | - | | 3.2446 | 11992 | 0.1699 | - | - | | 3.2449 | 11993 | 0.1311 | - | - | | 3.2451 | 11994 | 0.2196 | - | - | | 3.2454 | 11995 | 0.177 
| - | - | | 3.2457 | 11996 | 0.1873 | - | - | | 3.2459 | 11997 | 0.1395 | - | - | | 3.2462 | 11998 | 0.1319 | - | - | | 3.2465 | 11999 | 0.1288 | - | - | | 3.2468 | 12000 | 0.1557 | 0.1915 | 0.9539 |

</details>

### Framework Versions
- Python: 3.12.4
- Sentence Transformers: 3.0.1
- Transformers: 4.44.0
- PyTorch: 2.4.0+cu121
- Accelerate: 0.33.0
- Datasets: 2.21.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss

```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
# SentenceTransformer based on BM-K/KoSimCSE-bert-multitask

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BM-K/KoSimCSE-bert-multitask](https://huggingface.co/BM-K/KoSimCSE-bert-multitask). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BM-K/KoSimCSE-bert-multitask](https://huggingface.co/BM-K/KoSimCSE-bert-multitask) <!-- at revision 3aa54365eb9557f3b8ee1e39cff87306451abfae -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 tokens
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Atipico1/simcse-12000")
# Run inference
sentences = [
    '연제구의 도시미화와 저소득층 노인들의 일자리 창출에도 도움이 되는 게 뭐야',
    '연제구(구청장 이성문)는 지난해에 이어 ‘2021년 불법 유동광고물 수거보상사업’을 시행한다. ‘불법 유동광고물 수거보상사업’은 도시미관을 해치는 불법 광고물을 근절하기 위해 사업 참여자가 불법 유동광고물을 수거하여 구청 도시재생과에 가져가면 구청에서는 보상금을 지급하는 사업이다. 구는 1월 11일부터 15일까지 연제구민 중 만 60세 이상 저소득 어르신을 대상으로 신청을 받아 총 50명을 수거보상사업 참여자로 선발하였다. 참여자로 선발된 어르신들은 오는 2월부터 시작하는 수거 활동에 앞서 연제구청 구민홀에서 불법 유동광고물 구분 기준, 수거 방법, 수거 시 안전 수칙 등에 대해서 사전 교육을 받았으며 수거활동 중 발생할 수 있는 안전사고에 대비해 단체 보험에도 가입했다. 불법 광고물 정비에 주민들이 참여할 수 있는 기회를 제공하여 주민들로부터 불법 광고물에 대한 경각심을 제고 할 수 있을 것으로 기대된다. 구 관계자는 “이번 사업을 통해 주민과 함께 품격 있는 연제구를 만드는 데 일조하고, 저소득 어르신의 실버 일자리 창출에도 기여할 것으로 기대된다”고 말했다.',
    '4. 나가며\n노인복지주택이 지속가능한 노인복지정책이 되기 위해서는 사업시행자에게는 경제적으로 이득이 되고, 정책대상인 노인가구에게도 주거생활에 실질적인 도움을 줄 수 있어야 할 것이다. 그러나 그간 노인복지주택에의 사업시행자는 건설부지 및 부대시설 기준완화, 조세감면 등 각종 혜택을 받아 경제적 이득을 실현한 반면, 정책대상가구인 노인가구는 입소자격 제한규정으로 재산권 행사에 많은 불편을 겪어왔다. 이러한 정책집행 의지와 현실 간 괴리 속에서 다수의 노인복지주택에서 입소자격이 없는 자가 탈법적으로 입주하는 행위가 발생해온 것이다. 다음과 같은 측면에서도 노인복지주택정책에 대한 면밀한 검토가 필요하다. 첫째, 노인복지주택이 용도상 자연경관이 우수한 녹지지역 혹은 기반시설이 확보되지 않은 지역에도 건축될 수 있어 국토난개발을 유발할 가능성이 크다. 둘째, 보다 근본적으로 노인복지주택과 같이 노인들만 거주하는 주택이 노인복지 측면에서 바람직한지를 검토할 필요가 있다. 우리나라와 같이 급격한 고령화를 경험하고 있는 일본의 경우, 젊은 세대와 노인 세대가 함께 거주하는(age-mix) 정책이 중요하게 인식되고 있기 때문이다. 현행 노인복지주택 입소자자격 등은 노인의 주거복지증진과 행복추구에 부정적인 영향을 끼치고 있다는 점을 볼 때, 현행의 노인복지주택정책을 지속시키는 것이 실익이 있는지에 대한 면밀한 검토가 필요한 시점이다. 이를 위해 향후 공급되는 분양형 노인복지주택제도를 폐지하고, 노인복지주택을 「주택법」 체계 내로 흡수하는 방안을 적극적으로 검토할 필요가 있을 것이다.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Triplet

* Dataset: `eval`
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric             | Value      |
|:-------------------|:-----------|
| cosine_accuracy    | 0.9539     |
| dot_accuracy       | 0.0587     |
| manhattan_accuracy | 0.9496     |
| euclidean_accuracy | 0.9518     |
| **max_accuracy**   | **0.9539** |

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 473,130 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 23.09 tokens</li><li>max: 88 tokens</li></ul> | <ul><li>min: 34 tokens</li><li>mean: 355.74 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 55 tokens</li><li>mean: 338.33 tokens</li><li>max: 512 tokens</li></ul> |

* Samples:

| anchor | positive | negative |
|:------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------| | <code>국회세종의사당 건립을 위한 법안을 누가 새로 제안했어</code> | <code>국민의힘 정진석 의원(공주·부여·청양)이 국회세종의사당 설치를 위한 '국회법' 일부개정법률안을 대표발의했다고 21일 밝혔다. 국회와 세종시 정부청사와의 물리적인 거리로 세종시 공무원의 관외 출장비는 3년간 917억원에 달한다. 출장횟수는 87만회에 달하고 있어 업무 불편과 비효율성 심화는 물론 정책 질 저하도 우려되는 실정이다. 개정안은 △서울시에 국회서울의사당을, 세종시에 국회세종의사당을 두도록 하고 △상임위원회는 국회세종의사당에 두는 것으로 하되, 국회운영위원회와 정보위원회 및 세종시로 이전하지 않은 부(部)를 소관하는 상임위원회는 국회서울의사당에 둘 수 있도록 했다. 행복도시법에 따른 이전 제외 대상 부처는 외교부, 통일부, 법무부, 국방부, 여성가족부 등 5곳이다. 또 예산결산특별위원회와 국회예산정책처는 세종시에 두도록하고 국회사무처, 국회도서관, 국회입법조사처는 국회세종의사당에 별도 기관을 둘 수 있도록 했다. 정진석 의원은 "여야 합의로 세종의사당 설계비 147억원이 확정됐고 지난 2월 국회 운영위원회 공청회에서 나온 의견들을 다듬어 법의 완성도를 높인 개정안인 만큼, 여야 합의를 통해 21대 국회 임기 중에 첫 삽을 뜰 있도록 최선을 다하겠다"고 말했다.</code> | <code>새로 들어온 법률안 등 - 2014. 5. 15. 의안접수현황 -<br>□ 국회사무처(사무총장직무대리 임병규)는 2014년 5월 15일(목) 전병헌의원 등18인이 발의한 “방송법 일부개정법률안”, 서청원의원 등 12인이 발의한 “세월호 4ㆍ16사고 반성과 진상조사 및 국가재난방지체계 혁신을 위한 특별법안” 등 12건의 법률안과 “국회의원(남경필) 사직의 건”을 포함하여 총 13건의 의안이 접수되었다고 밝혔다. 접수된 의안 중 법률안은 앞으로 미래창조과학방송통신위원회 등 소관 위원회에 회부되어 심사될 예정이다. □ 어제 접수된 법률안 중 주요내용을 소개하면 다음과 같다. - 방송법 개정안(전병헌의원 대표발의): 홈쇼핑 사업자가 그 지위를 이용하여 납품업체에게 불공정 계약을 강요하거나 부당이익을 취득한 경우 허가취소나 업무 정지 등의 제재를 할 수 있도록 하려는 것임. - 세월호 4ㆍ16사고 반성과 진상조사 및 국가재난방지체계 혁신을 위한 특별법안(서청원의원 대표발의): 세월호 참사에 대한 진상조사, 피해자 보상ㆍ배상 및 지원 대책, 재발방지 대책을 심의ㆍ의결하기 위하여 국회에 특별위원회를 구성하도록 하려는 것임.</code> | | <code>어떤 국가가 과반수의 표를 확보해야 중의원 의장이 될 수 있게 정해 놨어</code> | <code>3. 일본<br>가. 의장 선출<br>□ 중의원 의장은 중의원 선거 후 최초 집회일에 열리는 회의에서 중의원 의원의 무기명 투표에 의해 직선으로 선출됨(「국회법」제6조, 「중의원규칙」제3조, 제8조)<br>○ 의장 선거는 전체 의원의 3분의 1 이상이 참석해서 사무총장이 의장의 직무대행을 하여 실시함(「국회법」제7조)<br>○ 의장으로 당선되기 위해서는 과반수 득표를 해야 하므로, 1차 투표에서 총 투표수의 과반수를 획득한 득표자가 없으면, 최다득표자 2인에 대하여 결선투표를 실시함<br>○ 결선투표에서 두 후보자의 득표수가 같은 경우 추첨으로 당선인이 결정됨<br>○ 중의원 의장의 임기는 4년이나, 임기 중에 해산에 의해서 모든 중의원 의원이 지위를 잃으면, 의장도 그 지위를 상실함<br>□ 의장 선거절차는 중의원 규칙에서 정하는 것 외에, 제1회 일본 제국의회에서 정한 ‘의장 후보자 선거절차수칙’을 따르고 있음 ○ 의장 선출은 선거 전에 각 회파(교섭단체에 해당) 간의 대화로 미리 후보자가 결정되지만, 공식절차상으로는 각 의원의 본회의에서 선거로 선임함<br>○ 1970년대 중반까지는 집권여당이 국회의장과 부의장을 모두 독점하였으나, 제79회 국회(1976년) 이후 중의원 국회의장은 여당 제1당에서, 국회부의장은 야당 제1당(제2당)에서 선출하는 관행이 정착되었음. 
그러나 1993년 연립정권 성립 후에는 중의원에서 자민당이 제1당이면서 여당이 아니었기 때문에 여당 간에 의장직을 둘러싸고 다툼이 있었음</code> | <code>이에 반해 비례대표제에는 중선거구제 내 선호순위를 표시하는 단기이양투표(single transferable vote, 예: 아일랜드, 몰타), 정당이 후보 명부를 제시하고 당선자를 득표율대로 결정해나가는 명부식 비례대표제(list proportional representation system, 예: 벨기에, 스웨덴, 덴마크 등)가 있다. 대표적인 명부식 비례대표제는 다시 전국을 하나의 선거구로 사용하는 전국통합구제도(이스라엘, 네덜란드)와 선거구를 권역별로 나누되, 불비례성을 전국구 의석으로 보정하는 권역다층선거구제도(스웨덴, 핀란드, 포르투갈, 스페인) 등으로 나뉜다. 이러한 명부식 비례제는 기계적 효과나 제조된 과반 효과가 없고 비례성이 매우 높은 특징을 지닌다. 군소정당도 당선자를 배출할 수 있고 대표성도 향상된다. 원내 정당의 난립을 막기 위해 봉쇄조항(threshold, 3~5%의 정당득표율)을 두기도 한다.</code> | | <code>1분기 코로나 예방접종을 약 2000명에게 시행할 건 누구야</code> | <code>부산 동래구(구청장 김우룡)는 코로나19 예방접종의 차질 없는 추진을 통한 빠른 일상회복을 위해 코로나19 예방접종 계획을 마련하고, 이달 말 1분기 대상자 2000여명을 대상으로 ‘코로나19 예방접종’을 시작한다고 밝혔다. 코로나19 예방접종 추진기간은 인플루엔자 유행시기인 11월 이전까지로, 접종대상은 18세 이상 전 구민이며, 임신부 및 만 18세 미만 소아·청소년, 65세 이상 고령자는 임상시험 결과에 따라 추후 접종 여부 및 시기가 결정된다. 동래구는 △과학적 근거를 기반으로 안전하고 효과적인 접종 추진 △코로나19의 사망예방 및 지역 사회 전파 차단을 위하여 전 구민의 70%인 189천여 명을 목표로 예방접종을 추진할 계획이다 1분기 우선 접종대상자는 △요양병원·요양시설입원·입원자, 종사자 △고위험 의료기관종사자, 코로나 1차 대응요원 △정신요양·재활시설 등 입소자·종사자 등 2000여 명이며, 백신 배송 등 일정을 조율해 26일부터 병원은 자체접종, 시설은 보건소 방문팀·시설별 협약의료기관 또는 계약된 의사가 방문 접종할 계획이다. 단계별 예방접종 기관은 △7월 개소 예정인 예방접종센터(사직실내체육관) △위탁의료기관 △방문접종 △자체접종 △내소접종을 병행하며, 위탁의료기관 정보는 질병관리청 코로나19 백신 및 예방접종 홈페이지에서 확인할 수 있다. 또한 동래구는 지난 4일 코로나19 예방접종 추진단을 운영 중이며, 22일 민·관·군과 병협·의협·간협 및 민간 등으로 구성된 민-관 협력체계인 ‘동래구 코로나19 예방접종 지역협의체’를 발족하여 전 구민의 코로나19 예방접종의 차질 없는 추진을 위해 최선을 다하고 있다. 김우룡 동래구청장은 “코로나19 예방접종은 전 국민 대상의 대규모 사업으로 관의 철저하고 꼼꼼한 계획과 함께 주민과 유관기관의 협조가 반드시 필요하다”며 “안전하고 신속한 예방접종을 추진을 위해 최선을 다하겠다”고 말했다.</code> | <code>문재인 대통령과 김정숙 여사가 오는 23일 아스트라제네카의 코로나19 백신을 공개 접종한다. 또 일반인에 대한 백신 접종 시기가 빨라지고, 교사의 경우 2분기에 접종을 받는다. 강민석 청와대 대변인은 15일 브리핑에서 문 대통령 부부의 백신 접종 계획을 설명하면서 “오는 6월 영국 G7(주요 7개국) 정상회의 참석, 즉 필수목적 출국을 위한 것”이라며 “질병관리청의 예방 접종 절차에 따른 것”이라고 설명했다. 공무 등으로 해외 출장을 하는 공무원은 우선 접종 대상이다. 강 대변인은 “문 대통령이 우선 접종하는 것은 일각의 안정성, 효과성 논란을 불식시키고 솔선수범하겠다는 의미”라고 덧붙였다. 3분기 예정이었던 일반인들에 대한 접종 시기도 빨라져, 고령층에 대한 접종이 4월부터 시작된다. 
15일 코로나19 예방접종 대응추진단에 따르면 2분기 코로나19 백신 예방접종은 △요양병원 및 요양시설 △코로나19 취약시설 입소자 및 종사자 △65세 이상 어르신 △학교 및 돌봄 공간 △만성질환자 △보건의료인과 사회필수인력 등 6개군을 대상으로 진행한다. 이에 따라 4월 첫 주부터 75세 이상 어르신 364만 명에 대한 접종이 예방접종센터에서 실시된다. 65세부터 74세까지의 494만여 명은 6월부터 위탁의료기관에서 접종이 이뤄질 예정이다. 학교 돌봄 공간도 2분기 접종 대상이다. 4월 중 특수교육과 장애아보육 5만 1000명, 유치원과 학교 내 보건교사와 어린이집의 간호인력 1만 3000명에 대한 접종이 이뤄진다. 6월에는 유치원과 어린이집, 초등학교 1‧2학년을 담당하는 교사, 교직원, 관련 종사자 49만 1000명이 단계별로 접종을 받는다. 노인‧장애인‧노숙인시설 등의 거주‧이용시설 접종도 2분기 중 진행할 예정이지만, 아직 정확한 시기는 미정이다. 한편 15일 0시 기준 부산의 코로나19 예방백신 접종자는 4만 5897명으로 우선 접종 대상자 6만 310명의 72.8%가 접종을 마쳤다. 근육통, 발열 등 이상 반응 사례는 모두 589건이다.</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### Unnamed Dataset * Size: 10,000 evaluation samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 9 tokens</li><li>mean: 22.86 tokens</li><li>max: 115 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 351.34 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 93 tokens</li><li>mean: 346.69 tokens</li><li>max: 512 tokens</li></ul> | * Samples: | anchor | positive | negative | 
|:--------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>릴레이로 이번주 TV와 라디오 방송 출연을 확정한게 누구야?</code> | <code>▲ 사진=롯데엔터테인먼트 제공 영화 완벽한 타인의 주역 유해진, 조진웅, 이서진, 염정아가 릴레이로 이번주 TV와 라디오 방송 출연을 확정했다. 완벽한 타인은 완벽해 보이는 커플 모임에서 한정된 시간 동안 핸드폰으로 오는 전화, 문자, 카톡을 강제로 공개해야 하는 게임 때문에 벌어지는 예측불허 이야기를 담은 작품이다. 완벽한 타인에서 완벽한 연기를 펼친 배우들은 이번 주 릴레이로 TV와 라디오 방송 출연을 확정하며 열일 행보를 펼친다. 먼저 오는 24일 오후 7시 MBC FM영화음악 한예리입니다에는 유해진과 염정아가 함께 출연한다. 간첩, 전우치에 이어 세 번째로 함께 호흡을 맞춘 두 사람은 이번 라디오 출연에서 영화에 대한 이야기를 나누며 걸출한 입담과 절친 케미스트리를 선보일 것으로 보인다. 
이어 이번 영화로 처음 만나 절친이 된 유해진, 조진웅, 이서진이 25일 오후 11시 10분 KBS2 해피투게더4에 출연한다. 세끼 인연 유해진과 이서진, 그리고 조진웅의 예능감이 유감없이 발휘될 예정이다. 마지막으로 26일에는 MBC 배철수의 음악캠프에서 이서진을 만날 수 있다. 완벽한 타인에서 가장 파격적인 연기 변신을 선보인 그는 음악캠프 특별 DJ로 활약했던 인연으로 이번 출연이 성사됐다. 이서진은 거침없는 언변으로 영화 완벽한 타인의 현장 비하인드 스토리를 밝힐 예정이다. 한편 완벽한 타인은 오는 31일 개봉을 앞두고 있다.</code> | <code>그룹 세븐틴이 미국 간판 토크쇼 ‘엘렌 쇼’에 첫 출연을 확정 지었다. 세븐틴은 다음 달 1일(현지 시각) 방송되는 미국 토크쇼 ‘엘렌 드제너러스 쇼’(이하 엘렌 쇼)에 첫 출연을 확정 지어 전 세계 팬들의 폭발적인 반응을 얻었다. 이날 방송에서 세븐틴은 지난 2019년 8월 발매한 디지털 싱글 ‘HIT’ 무대를 선보인다. ‘HIT’는 제목처럼 타격감이 느껴지는 사운드와 세븐틴의 폭발적인 에너지가 그대로 전해지는 강렬한 EDM 장르의 댄스곡으로 발매와 동시에 국내는 물론 해외에서도 큰 사랑을 받았다. ‘엘렌 쇼’는 미국 유명 코미디언이자 작가, 배우 등 멀티 엔터테이너인 엘렌 드제너러스가 진행하는 토크쇼로 브루노 마스, 두아 리파, 존 레전드, 저스틴 비버 등 세계적인 팝스타들이 대거 출연해 화제를 모았으며 미국의 데이타임 쇼 중 높은 인기를 보유하고 있는 프로그램이다. 앞서 세븐틴은 지난 1월 방송된 미국 CBS ‘제임스 코든 쇼’와 NBC ‘켈리 클락슨 쇼’에 연달아 출연해 스페셜 앨범 타이틀곡 ‘HOME;RUN’과 미니 7집 타이틀곡 ‘Left & Right’의 무대를 선사, 막강한 글로벌 영향력을 확인 시켜 주며 전 세계 팬들과 해외 유수 매체의 호평 세례를 받았다. 이렇듯 세븐틴은 스토리텔링이 담긴 완성도 높은 무대와 세븐틴만이 할 수 있는 퍼포먼스를 선보여 ‘K팝 퍼포먼스 강자’라는 칭호를 얻는 등 전 세계를 열광시킨 바 있어 이번 ‘엘렌쇼’에서 어떤 무대를 선보일지 기대감이 치솟고 있다. 한편 세븐틴이 출연하는 미국 토크쇼 ‘엘렌 쇼’는 다음 달 1일(현지 시각)에 만나볼 수 있다.</code> | | <code>벡터맵 빈 분류 기반의 제안기법은 무엇에 비하여서 압출효율이 높다는 것을 표 4에서 알 수 있어?</code> | <code><h1>IV. 실험 결과</h1><p>제안한 빈 분류기반 벡터맵 압축 기법에 대한 성능 평가를 위한 실험을 수행하였다. 실험을 위해 그림 10 과 같이, \( 10 \mathrm{~km} \times 10 \mathrm{~km} \) 의 국부 영역을 갖는 벡터맵 레이어를 생성하였으며, 이 중 폴리곤으로 구성된 '건물' 레이어와 폴리라인으로 구성된 '일반도로’ 레이어에 대해 각각의 실험을 수행하였다. 또한 TM 좌표계에 의해 표현되는 실측치 \( 1 \mathrm{~cm} \) 이내의 오차를 갖도록 식 (1)의 \( c=100 \) 으로 설정하여 정밀 벡터맵 압축에 대해 결과를 도출하였다. 또한 \( 10 \mathrm{~km} \times 10 \mathrm{~km} \) 영역에서 \( 1 \mathrm{~cm} \) 정밀도를 갖는 벡터맵 데이터의 최적의 압축효율을 위해, 실험적으로 dist \( _{D B}=10 \mathrm{~m} \) 및 dist \( { }_{A B}=0.64 \mathrm{~m} \) 로 결정하였다.</p><p>제안 기법의 객관적 비교를 위해 일반적인 데이터 압축기법으로서 7-zib 알고리즘, 대표적인 벡터 간소화 알고리즘으로서 Douglas-Peucker 알고리즘[16] 및 기존의 공간 에너지집중 기반에서의 압축 알고리즘등과 압축 결과를 비교하였다. 
표 4 에 각각의 알고리즘 에 대한 압축 결과를 나타내었다.</p><p>표 4 의 결과로부터 벡터맵의 특성을 고려하지 않은 7-Zip과 비교하였을 때, 각 좌표점들의 오차범위로 \( 0.01 \mathrm{~m} \) 미만을 갖는 벡터맵 빈 분류 기반의 제안 기법이 월등히 높은 압축효율을 가짐을 확인하였다. 한편, 벡터 간소화 기법을 사용하는 Douglas-Peucker 알고리즘과 제안 알고리즘은 압축원리가 상이하므로 RMSE(root mean square error) 등의 방법을 통한 직접적인 비교는 어렵다. 또한 제안 기법과의 비교를 위해 Douglas-Peucker 알고리즘의 정밀도 범위 \( \epsilon=0.01 \mathrm{~m} \) 로 설정하게 되면, 각 좌표점들의 간소화 조건이 대부분 만족하지 않으므로 실제 간소화를 통한 압축은 거의 이루어지지 않는다. 따라서 그림 10의 벡터맵 레이어에서 시각적으로 용인할 수 있을 것으로 간주되는 적정 임계치 \( \epsilon=1 \mathrm{~m} \) 로 설정하여 압축을 수행하였다. 표 4의 실험 결과는 이때의 압축 결과를 나타낸 것이다. 그림 11은 벡터맵을 확대하였을 때, 표 4 의 압축 효율에 대해 제안 알고리즘과 Douglas-Peucker 알고리즘의 시각적 오차를 비교한 것이다.</p><p>표 4와 그림 11로부터 제안 기법이 Duglas-Peucker 알고리즘보다 월등히 적은 시각적 오차를 가짐에도 불구하고 보다 높은 압축효율을 나타냄을 확인할 수 있다. 더욱이, 표 4에서 Duglas-Peucker 알고리즘의 특성상 연속한 좌표점들이 급격히 꺽히는 오브젝트들의 집합인 '건물' 레이어에서 압축효율의 저하가 발생한다. 반면. 제안 기법은 Duglas-Peucker 알고리즘에서와 같은 압축효율의 저하는 발생하지 않음을 확인하였다. 공간영역에서의 에너지 집중(SEC)을 이용한 기존방법과의 비교에서 역시 제안 알고리즘이 보다 우수한 압축 효율을 가짐을 알 수 있었다. 또한 에너지 집중 이후 실질적 데이터 압축을 위한 엔트로피 코딩으로써 zlib 또는 7-zip 알고리즘을 이용하는 기존 기법과는 달리, 제안 기법은 압축 과정의 일부로써 정의된 단순한 허프만 테이블을 참조하므로 계산 복잡도에서 큰 이점을 얻을 수 있다. </p></code> | <code><h1>VI. 결 론</h1><p>본 논문에서는 집적 영상을 효율적으로 압축하기 위한 3D-DCT 기반의 압축 방법을 제안하였다. 제안한 방법은 집적 영상이 촬영 물체나 촬영 방법에 따라 다양한 특성을 가지는 특성을 바탕으로, 적응적으로 3D-DCT 블록을 구성하는 방법과 각 3D-DCT 블록별로 가변 블록 크기 3D-DCT를 수행하는 방법이다. 제안 방법은 영상 특성에 따라 최적의 3D-DCT를 수행하기 때문에 기존의 방법보다 뛰어난 성능을 보여준다. 제안 방법은 기존의 3D-DCT 방법과 비교해서 각 영상별로 동일 비트량에서 약 \( 1 \mathrm{~dB} \)에서 \( 2 \mathrm{~dB} \)의 PSNR 향상 효과를 보여주었다. </p><p>본 논문에서 제안된 방법은 여러 가지 블록 모드를 정의하고 그 중 최적의 모드를 선택하는 과정을 수행하므로 이에 따른 계산량의 증가를 초래한다. 블록 모드의 개수가 증가할수록 계산량의 그 개수에 정비례하여 증가하므로 이의 개선을 위한 연구가 추가적으로 필요하다. 또한 보다 효율적인 오버헤드 비트 부호화를 통해 추가적인 압축 효율 향상을 기대할 수 있다. </p></code> | | <code>대면하지 않고 진료에서 약 배달까지 해주는 처방솔루션은 뭐야</code> | <code>비대면 진료 서비스에 기반한 브랜드 페어(pare)는 올해 초부터 비대면 진료와 처방을 모바일로 쉽고 빠르게 받을 수 있게 하고, 처방받은 약은 집까지 배송해 주는 서비스를 국내 환자와 재외국민들을 위해 시작했다. 페어는 의사의 처방을 통해서만 구매가 가능한 의사의 솔루션을 페어만의 브랜드 감성을 깃들여 환자에게 노출한다. 
이는 기존의 처방약이라는 고질적인 부분을 소비자 감성 브랜드로 승화해 환자와 소비자의 벽을 허무는 국내 최초의 전문처방 솔루션 비즈니스 모델이다. 또한, 플랫폼 내 처방이 필요하지 않은 일반 건강기능식품을 통한 사후관리 서비스도 제공한다. 강신욱 대표는 “페어만의 브랜드 감성과 의사들의 전문성이 실린 솔루션을 집까지 배송해 주는 게 특징”이라며 “처방약뿐만 아니라 진료에 따른 맞춤형 건강관리제품을 추천 혹은 패키지로 받아 볼 수 있는 국내 최초 비대면 진료 기반의 커머스형 처방솔루션”이라고 강조했다.</code> | <code>국민 의약품 구입 불편 해소 방안 관련 의약품 재분류 논의 시작<br>국가별 의약품 분류체계 <table><tbody><tr><td>분류</td><td>국명</td><td>처방약(처방필수)</td><td>비처방약</td><td>비고</td></tr><tr><td rowspan='3'>2분류</td><td>한국</td><td>- 전문의약품</td><td>- 일반의약품</td><td rowspan='2'>의약외품은 판매장소 제한 없음 </td></tr><tr><td>일본</td><td>- 의료용의약품(E): 의사의 처방에 의해서만 조제·판매</td><td>- 일반용의약품(OTC) : 약국 외에서도 제한적으로 판매</td></tr><tr><td>미국</td><td>- 처방의약품(Rx): 연방법에 의해 처방전 없이 조제하는 것을 금한다는 표시가 있음 </td><td>- 비처방의약품(OTC): 약국 및 약국 외에서 자유롭게 판매 <br>※ 제산제, 비타민, 치질, 해열진통제 등</td><td rowspan='3'>일반약 전체, 또는 일부 약국외 판매 허용</td></tr><tr><td rowspan='2'>3분류</td><td>영국</td><td>- 처방약(POM): 의사의 처방에 의해서만 조제·판매</td><td>- 약국약(P) : 처방없이 약국에서 판매 가능<br>- 자유판매품목(GSL): 약국 외에서도 판매 가능 <br>※ 어린이용 아스피린, 구충제, 관장약 등 제외</td></tr><tr><td>독일</td><td>- 처방약(Rp): 처방전이 필요하며 약국을 통해서만 판매 </td><td>- 약국약(Ap): 처방을 요하지 않고 약국에서 판매가능<br>- 자유판매품목(F):약국 외에서도 판매가능 <br>※ 민간치료약초, 저함량비타민·미네랄 등</td></tr><tr><td rowspan='3'>4분류</td><td>프랑스</td><td>- 처방약 list Ⅰ: 의사의 처방을 필요로 하며 처방자의 허가 없이 반복 사용할 수 없고, 약사는 판매상황을 기록<br>- 처방약 list Ⅱ: 환자의 요청이 있을 때 2달까지 처방전을 반복 사용<br>- 특별처방약Stupefiants : 의사는 일련번호가 붙은 양식에 의해 처방하며 약사는 판매상황을 기록</td><td>- 비처방약 (대중광고 가능) : 대중광로를 하는 약으로 사회 건강보험대상에서 제외 </td><td>의약품 약국 외 판매 불허</td></tr><tr><td>캐나다</td><td>- 처방약 (P) : 처방에 의해서 약국에서만 판매</td><td>- 약사약 (BTC) : 처방없이 약국에서 약사만이 판매할 수 있음 <br>- 약국진열약 (OTC) : 약국에서 자유롭게 진열하여 판매할 수 있는 약으로서, 대중광고 허용<br>- 자유판매약(OTP) : 약국 이외에서도 판매되는 약</td><td rowspan='2'> </td></tr><tr><td>스위스</td><td>- 처방약 list Ⅰ : 약품 명단을 법률로 정하며, 처방전 반복사용 금지<br>- 처방약 list Ⅱ : 약사의 반복 처방 가능</td><td>- 비처방약 list Ⅲ (약국약), list Ⅳ (약종상약), list Ⅴ (자유판매약<br>)- list Ⅳ 와 list Ⅴ는 대중광고 허용</td></tr></tbody></table></code> | * Loss: 
[<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `learning_rate`: 5e-06
- `max_grad_norm`: 5.0
- `num_train_epochs`: 10
- `warmup_steps`: 500
- `dataloader_drop_last`: True

#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-06
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 5.0
- `num_train_epochs`: 10
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 500
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs

<details><summary>Click to expand</summary>

| Epoch |
Step | Training Loss | loss | eval_max_accuracy | |:------:|:-----:|:-------------:|:------:|:-----------------:| | 0 | 0 | - | - | 0.8097 | | 0.0003 | 1 | 1.179 | - | - | | 0.0005 | 2 | 1.0733 | - | - | | 0.0008 | 3 | 0.9841 | - | - | | 0.0011 | 4 | 1.0739 | - | - | | 0.0014 | 5 | 1.2194 | - | - | | 0.0016 | 6 | 1.1582 | - | - | | 0.0019 | 7 | 0.9616 | - | - | | 0.0022 | 8 | 1.0596 | - | - | | 0.0024 | 9 | 0.9503 | - | - | | 0.0027 | 10 | 1.031 | - | - | | 0.0030 | 11 | 1.1054 | - | - | | 0.0032 | 12 | 1.0184 | - | - | | 0.0035 | 13 | 0.8953 | - | - | | 0.0038 | 14 | 1.2405 | - | - | | 0.0041 | 15 | 1.0238 | - | - | | 0.0043 | 16 | 0.9845 | - | - | | 0.0046 | 17 | 1.0546 | - | - | | 0.0049 | 18 | 1.0675 | - | - | | 0.0051 | 19 | 0.9762 | - | - | | 0.0054 | 20 | 0.7939 | - | - | | 0.0057 | 21 | 1.0777 | - | - | | 0.0060 | 22 | 1.0382 | - | - | | 0.0062 | 23 | 1.0807 | - | - | | 0.0065 | 24 | 1.1184 | - | - | | 0.0068 | 25 | 0.881 | - | - | | 0.0070 | 26 | 1.1134 | - | - | | 0.0073 | 27 | 1.0594 | - | - | | 0.0076 | 28 | 0.7923 | - | - | | 0.0078 | 29 | 0.947 | - | - | | 0.0081 | 30 | 0.9587 | - | - | | 0.0084 | 31 | 0.8561 | - | - | | 0.0087 | 32 | 0.9037 | - | - | | 0.0089 | 33 | 0.9165 | - | - | | 0.0092 | 34 | 1.1332 | - | - | | 0.0095 | 35 | 0.9526 | - | - | | 0.0097 | 36 | 0.9094 | - | - | | 0.0100 | 37 | 0.8902 | - | - | | 0.0103 | 38 | 0.9149 | - | - | | 0.0106 | 39 | 0.8626 | - | - | | 0.0108 | 40 | 1.0476 | - | - | | 0.0111 | 41 | 1.1116 | - | - | | 0.0114 | 42 | 0.9363 | - | - | | 0.0116 | 43 | 1.1492 | - | - | | 0.0119 | 44 | 0.88 | - | - | | 0.0122 | 45 | 0.8953 | - | - | | 0.0124 | 46 | 0.9056 | - | - | | 0.0127 | 47 | 0.8712 | - | - | | 0.0130 | 48 | 0.8783 | - | - | | 0.0133 | 49 | 0.8998 | - | - | | 0.0135 | 50 | 0.9089 | - | - | | 0.0138 | 51 | 0.9943 | - | - | | 0.0141 | 52 | 0.7594 | - | - | | 0.0143 | 53 | 1.0239 | - | - | | 0.0146 | 54 | 0.8189 | - | - | | 0.0149 | 55 | 0.8898 | - | - | | 0.0152 | 56 | 0.7309 | - | - | | 0.0154 | 57 | 0.7656 | 
- | - | | 0.0157 | 58 | 0.8408 | - | - | | 0.0160 | 59 | 0.9071 | - | - | | 0.0162 | 60 | 0.8157 | - | - | | 0.0165 | 61 | 0.8421 | - | - | | 0.0168 | 62 | 0.9124 | - | - | | 0.0170 | 63 | 0.8379 | - | - | | 0.0173 | 64 | 0.8278 | - | - | | 0.0176 | 65 | 0.8997 | - | - | | 0.0179 | 66 | 0.7988 | - | - | | 0.0181 | 67 | 0.8498 | - | - | | 0.0184 | 68 | 0.8588 | - | - | | 0.0187 | 69 | 0.8846 | - | - | | 0.0189 | 70 | 0.8923 | - | - | | 0.0192 | 71 | 0.7344 | - | - | | 0.0195 | 72 | 0.7002 | - | - | | 0.0198 | 73 | 0.8444 | - | - | | 0.0200 | 74 | 0.8148 | - | - | | 0.0203 | 75 | 0.7002 | - | - | | 0.0206 | 76 | 0.8735 | - | - | | 0.0208 | 77 | 0.8718 | - | - | | 0.0211 | 78 | 0.672 | - | - | | 0.0214 | 79 | 0.6914 | - | - | | 0.0216 | 80 | 0.7521 | - | - | | 0.0219 | 81 | 0.8297 | - | - | | 0.0222 | 82 | 0.774 | - | - | | 0.0225 | 83 | 0.977 | - | - | | 0.0227 | 84 | 0.736 | - | - | | 0.0230 | 85 | 0.778 | - | - | | 0.0233 | 86 | 0.9048 | - | - | | 0.0235 | 87 | 0.8656 | - | - | | 0.0238 | 88 | 0.8066 | - | - | | 0.0241 | 89 | 0.6944 | - | - | | 0.0244 | 90 | 0.7122 | - | - | | 0.0246 | 91 | 0.8266 | - | - | | 0.0249 | 92 | 0.7199 | - | - | | 0.0252 | 93 | 0.7296 | - | - | | 0.0254 | 94 | 0.9107 | - | - | | 0.0257 | 95 | 0.7637 | - | - | | 0.0260 | 96 | 0.6374 | - | - | | 0.0262 | 97 | 0.6547 | - | - | | 0.0265 | 98 | 0.6328 | - | - | | 0.0268 | 99 | 0.6648 | - | - | | 0.0271 | 100 | 0.7403 | - | - | | 0.0273 | 101 | 0.6864 | - | - | | 0.0276 | 102 | 0.6947 | - | - | | 0.0279 | 103 | 0.6662 | - | - | | 0.0281 | 104 | 0.657 | - | - | | 0.0284 | 105 | 0.663 | - | - | | 0.0287 | 106 | 0.5928 | - | - | | 0.0290 | 107 | 0.8488 | - | - | | 0.0292 | 108 | 0.5981 | - | - | | 0.0295 | 109 | 0.7565 | - | - | | 0.0298 | 110 | 0.6583 | - | - | | 0.0300 | 111 | 0.8198 | - | - | | 0.0303 | 112 | 0.7473 | - | - | | 0.0306 | 113 | 0.6791 | - | - | | 0.0308 | 114 | 0.5024 | - | - | | 0.0311 | 115 | 0.6391 | - | - | | 0.0314 | 116 | 0.7007 | - | - | | 0.0317 | 117 | 0.6424 | - | - | 
| 0.0319 | 118 | 0.508 | - | - | | 0.0322 | 119 | 0.6518 | - | - | | 0.0325 | 120 | 0.7681 | - | - | | 0.0327 | 121 | 0.7549 | - | - | | 0.0330 | 122 | 0.7161 | - | - | | 0.0333 | 123 | 0.575 | - | - | | 0.0335 | 124 | 0.7983 | - | - | | 0.0338 | 125 | 0.6369 | - | - | | 0.0341 | 126 | 0.5207 | - | - | | 0.0344 | 127 | 0.7792 | - | - | | 0.0346 | 128 | 0.5507 | - | - | | 0.0349 | 129 | 0.5769 | - | - | | 0.0352 | 130 | 0.7462 | - | - | | 0.0354 | 131 | 0.7728 | - | - | | 0.0357 | 132 | 0.5582 | - | - | | 0.0360 | 133 | 0.6999 | - | - | | 0.0363 | 134 | 0.7194 | - | - | | 0.0365 | 135 | 0.7125 | - | - | | 0.0368 | 136 | 0.6527 | - | - | | 0.0371 | 137 | 0.6318 | - | - | | 0.0373 | 138 | 0.5249 | - | - | | 0.0376 | 139 | 0.6114 | - | - | | 0.0379 | 140 | 0.577 | - | - | | 0.0381 | 141 | 0.6302 | - | - | | 0.0384 | 142 | 0.65 | - | - | | 0.0387 | 143 | 0.5753 | - | - | | 0.0390 | 144 | 0.5812 | - | - | | 0.0392 | 145 | 0.5641 | - | - | | 0.0395 | 146 | 0.6745 | - | - | | 0.0398 | 147 | 0.5224 | - | - | | 0.0400 | 148 | 0.6954 | - | - | | 0.0403 | 149 | 0.7016 | - | - | | 0.0406 | 150 | 0.4932 | - | - | | 0.0409 | 151 | 0.587 | - | - | | 0.0411 | 152 | 0.573 | - | - | | 0.0414 | 153 | 0.6685 | - | - | | 0.0417 | 154 | 0.6023 | - | - | | 0.0419 | 155 | 0.5884 | - | - | | 0.0422 | 156 | 0.4895 | - | - | | 0.0425 | 157 | 0.7572 | - | - | | 0.0427 | 158 | 0.6522 | - | - | | 0.0430 | 159 | 0.6946 | - | - | | 0.0433 | 160 | 0.6449 | - | - | | 0.0436 | 161 | 0.6483 | - | - | | 0.0438 | 162 | 0.6022 | - | - | | 0.0441 | 163 | 0.5624 | - | - | | 0.0444 | 164 | 0.6458 | - | - | | 0.0446 | 165 | 0.5737 | - | - | | 0.0449 | 166 | 0.6261 | - | - | | 0.0452 | 167 | 0.5635 | - | - | | 0.0455 | 168 | 0.4913 | - | - | | 0.0457 | 169 | 0.6958 | - | - | | 0.0460 | 170 | 0.592 | - | - | | 0.0463 | 171 | 0.4624 | - | - | | 0.0465 | 172 | 0.565 | - | - | | 0.0468 | 173 | 0.5542 | - | - | | 0.0471 | 174 | 0.6587 | - | - | | 0.0473 | 175 | 0.4727 | - | - | | 0.0476 | 176 | 0.6049 | - | - | | 
0.0479 | 177 | 0.7385 | - | - |
| 0.0482 | 178 | 0.5175 | - | - |
| 0.0484 | 179 | 0.5711 | - | - |
| 0.0487 | 180 | 0.4591 | - | - |
| 0.0490 | 181 | 0.7063 | - | - |
| 0.0492 | 182 | 0.4954 | - | - |
| 0.0495 | 183 | 0.6444 | - | - |
| 0.0498 | 184 | 0.6686 | - | - |
| 0.0501 | 185 | 0.5229 | - | - |
| 0.0503 | 186 | 0.4338 | - | - |
| 0.0506 | 187 | 0.5582 | - | - |
| 0.0509 | 188 | 0.5881 | - | - |
| 0.0511 | 189 | 0.5609 | - | - |
| 0.0514 | 190 | 0.6607 | - | - |
| 0.0517 | 191 | 0.491 | - | - |
| 0.0519 | 192 | 0.4687 | - | - |
| 0.0522 | 193 | 0.5842 | - | - |
| 0.0525 | 194 | 0.5544 | - | - |
| 0.0528 | 195 | 0.5778 | - | - |
| 0.0530 | 196 | 0.5591 | - | - |
| 0.0533 | 197 | 0.5872 | - | - |
| 0.0536 | 198 | 0.5807 | - | - |
| 0.0538 | 199 | 0.593 | - | - |
| 0.0541 | 200 | 0.4658 | - | - |
| 0.0544 | 201 | 0.4649 | - | - |
| 0.0547 | 202 | 0.4912 | - | - |
| 0.0549 | 203 | 0.5475 | - | - |
| 0.0552 | 204 | 0.5182 | - | - |
| 0.0555 | 205 | 0.5281 | - | - |
| 0.0557 | 206 | 0.6302 | - | - |
| 0.0560 | 207 | 0.6346 | - | - |
| 0.0563 | 208 | 0.5309 | - | - |
| 0.0565 | 209 | 0.5499 | - | - |
| 0.0568 | 210 | 0.5368 | - | - |
| 0.0571 | 211 | 0.4647 | - | - |
| 0.0574 | 212 | 0.5316 | - | - |
| 0.0576 | 213 | 0.5165 | - | - |
| 0.0579 | 214 | 0.6294 | - | - |
| 0.0582 | 215 | 0.4526 | - | - |
| 0.0584 | 216 | 0.5157 | - | - |
| 0.0587 | 217 | 0.6337 | - | - |
| 0.0590 | 218 | 0.4911 | - | - |
| 0.0593 | 219 | 0.5696 | - | - |
| 0.0595 | 220 | 0.4651 | - | - |
| 0.0598 | 221 | 0.6098 | - | - |
| 0.0601 | 222 | 0.6329 | - | - |
| 0.0603 | 223 | 0.7011 | - | - |
| 0.0606 | 224 | 0.4582 | - | - |
| 0.0609 | 225 | 0.6332 | - | - |
| 0.0611 | 226 | 0.5138 | - | - |
| 0.0614 | 227 | 0.6474 | - | - |
| 0.0617 | 228 | 0.5059 | - | - |
| 0.0620 | 229 | 0.3617 | - | - |
| 0.0622 | 230 | 0.4401 | - | - |
| 0.0625 | 231 | 0.5159 | - | - |
| 0.0628 | 232 | 0.6072 | - | - |
| 0.0630 | 233 | 0.5079 | - | - |
| 0.0633 | 234 | 0.3517 | - | - |
| 0.0636 | 235 | 0.5604 | - | - |
| 0.0639 | 236 | 0.4834 | - | - |
| 0.0641 | 237 | 0.5719 | - | - |
| 0.0644 | 238 | 0.4928 | - | - |
| 0.0647 | 239 | 0.4558 | - | - |
| 0.0649 | 240 | 0.4483 | - | - |
| 0.0652 | 241 | 0.5027 | - | - |
| 0.0655 | 242 | 0.4534 | - | - |
| 0.0657 | 243 | 0.6228 | - | - |
| 0.0660 | 244 | 0.653 | - | - |
| 0.0663 | 245 | 0.4585 | - | - |
| 0.0666 | 246 | 0.6514 | - | - |
| 0.0668 | 247 | 0.6069 | - | - |
| 0.0671 | 248 | 0.5267 | - | - |
| 0.0674 | 249 | 0.4457 | - | - |
| 0.0676 | 250 | 0.4966 | - | - |
| 0.0679 | 251 | 0.5595 | - | - |
| 0.0682 | 252 | 0.4991 | - | - |
| 0.0685 | 253 | 0.5233 | - | - |
| 0.0687 | 254 | 0.5883 | - | - |
| 0.0690 | 255 | 0.4411 | - | - |
| 0.0693 | 256 | 0.5102 | - | - |
| 0.0695 | 257 | 0.5198 | - | - |
| 0.0698 | 258 | 0.4086 | - | - |
| 0.0701 | 259 | 0.4336 | - | - |
| 0.0703 | 260 | 0.6177 | - | - |
| 0.0706 | 261 | 0.5753 | - | - |
| 0.0709 | 262 | 0.6234 | - | - |
| 0.0712 | 263 | 0.5582 | - | - |
| 0.0714 | 264 | 0.4451 | - | - |
| 0.0717 | 265 | 0.5145 | - | - |
| 0.0720 | 266 | 0.5908 | - | - |
| 0.0722 | 267 | 0.3929 | - | - |
| 0.0725 | 268 | 0.5009 | - | - |
| 0.0728 | 269 | 0.3671 | - | - |
| 0.0731 | 270 | 0.5866 | - | - |
| 0.0733 | 271 | 0.6914 | - | - |
| 0.0736 | 272 | 0.4779 | - | - |
| 0.0739 | 273 | 0.5303 | - | - |
| 0.0741 | 274 | 0.4294 | - | - |
| 0.0744 | 275 | 0.61 | - | - |
| 0.0747 | 276 | 0.5529 | - | - |
| 0.0749 | 277 | 0.5498 | - | - |
| 0.0752 | 278 | 0.4736 | - | - |
| 0.0755 | 279 | 0.3907 | - | - |
| 0.0758 | 280 | 0.4271 | - | - |
| 0.0760 | 281 | 0.5772 | - | - |
| 0.0763 | 282 | 0.5232 | - | - |
| 0.0766 | 283 | 0.4786 | - | - |
| 0.0768 | 284 | 0.5621 | - | - |
| 0.0771 | 285 | 0.4747 | - | - |
| 0.0774 | 286 | 0.4695 | - | - |
| 0.0777 | 287 | 0.4926 | - | - |
| 0.0779 | 288 | 0.5339 | - | - |
| 0.0782 | 289 | 0.5043 | - | - |
| 0.0785 | 290 | 0.3665 | - | - |
| 0.0787 | 291 | 0.5777 | - | - |
| 0.0790 | 292 | 0.5081 | - | - |
| 0.0793 | 293 | 0.5744 | - | - |
| 0.0795 | 294 | 0.4446 | - | - |
| 0.0798 | 295 | 0.415 | - | - |
| 0.0801 | 296 | 0.4013 | - | - |
| 0.0804 | 297 | 0.4938 | - | - |
| 0.0806 | 298 | 0.5096 | - | - |
| 0.0809 | 299 | 0.5261 | - | - |
| 0.0812 | 300 | 0.3339 | - | - |
| 0.0814 | 301 | 0.7123 | - | - |
| 0.0817 | 302 | 0.4387 | - | - |
| 0.0820 | 303 | 0.4273 | - | - |
| 0.0823 | 304 | 0.411 | - | - |
| 0.0825 | 305 | 0.4667 | - | - |
| 0.0828 | 306 | 0.4651 | - | - |
| 0.0831 | 307 | 0.4916 | - | - |
| 0.0833 | 308 | 0.6379 | - | - |
| 0.0836 | 309 | 0.4339 | - | - |
| 0.0839 | 310 | 0.4866 | - | - |
| 0.0841 | 311 | 0.5155 | - | - |
| 0.0844 | 312 | 0.4192 | - | - |
| 0.0847 | 313 | 0.6039 | - | - |
| 0.0850 | 314 | 0.4657 | - | - |
| 0.0852 | 315 | 0.6355 | - | - |
| 0.0855 | 316 | 0.4975 | - | - |
| 0.0858 | 317 | 0.3445 | - | - |
| 0.0860 | 318 | 0.3741 | - | - |
| 0.0863 | 319 | 0.3988 | - | - |
| 0.0866 | 320 | 0.5121 | - | - |
| 0.0869 | 321 | 0.5441 | - | - |
| 0.0871 | 322 | 0.6115 | - | - |
| 0.0874 | 323 | 0.4559 | - | - |
| 0.0877 | 324 | 0.4158 | - | - |
| 0.0879 | 325 | 0.416 | - | - |
| 0.0882 | 326 | 0.4739 | - | - |
| 0.0885 | 327 | 0.6097 | - | - |
| 0.0887 | 328 | 0.5983 | - | - |
| 0.0890 | 329 | 0.5816 | - | - |
| 0.0893 | 330 | 0.4715 | - | - |
| 0.0896 | 331 | 0.3944 | - | - |
| 0.0898 | 332 | 0.5422 | - | - |
| 0.0901 | 333 | 0.5825 | - | - |
| 0.0904 | 334 | 0.4453 | - | - |
| 0.0906 | 335 | 0.4771 | - | - |
| 0.0909 | 336 | 0.3799 | - | - |
| 0.0912 | 337 | 0.3578 | - | - |
| 0.0915 | 338 | 0.5269 | - | - |
| 0.0917 | 339 | 0.5412 | - | - |
| 0.0920 | 340 | 0.4387 | - | - |
| 0.0923 | 341 | 0.4648 | - | - |
| 0.0925 | 342 | 0.4264 | - | - |
| 0.0928 | 343 | 0.3917 | - | - |
| 0.0931 | 344 | 0.6398 | - | - |
| 0.0933 | 345 | 0.3961 | - | - |
| 0.0936 | 346 | 0.6527 | - | - |
| 0.0939 | 347 | 0.4453 | - | - |
| 0.0942 | 348 | 0.5411 | - | - |
| 0.0944 | 349 | 0.5758 | - | - |
| 0.0947 | 350 | 0.4062 | - | - |
| 0.0950 | 351 | 0.5969 | - | - |
| 0.0952 | 352 | 0.4315 | - | - |
| 0.0955 | 353 | 0.5792 | - | - |
| 0.0958 | 354 | 0.4573 | - | - |
| 0.0960 | 355 | 0.5059 | - | - |
| 0.0963 | 356 | 0.4784 | - | - |
| 0.0966 | 357 | 0.4753 | - | - |
| 0.0969 | 358 | 0.4547 | - | - |
| 0.0971 | 359 | 0.4185 | - | - |
| 0.0974 | 360 | 0.4964 | - | - |
| 0.0977 | 361 | 0.4534 | - | - |
| 0.0979 | 362 | 0.4609 | - | - |
| 0.0982 | 363 | 0.441 | - | - |
| 0.0985 | 364 | 0.4798 | - | - |
| 0.0988 | 365 | 0.4776 | - | - |
| 0.0990 | 366 | 0.4324 | - | - |
| 0.0993 | 367 | 0.5355 | - | - |
| 0.0996 | 368 | 0.3569 | - | - |
| 0.0998 | 369 | 0.4697 | - | - |
| 0.1001 | 370 | 0.4129 | - | - |
| 0.1004 | 371 | 0.4395 | - | - |
| 0.1006 | 372 | 0.4686 | - | - |
| 0.1009 | 373 | 0.4133 | - | - |
| 0.1012 | 374 | 0.4187 | - | - |
| 0.1015 | 375 | 0.5296 | - | - |
| 0.1017 | 376 | 0.4378 | - | - |
| 0.1020 | 377 | 0.486 | - | - |
| 0.1023 | 378 | 0.4715 | - | - |
| 0.1025 | 379 | 0.401 | - | - |
| 0.1028 | 380 | 0.3678 | - | - |
| 0.1031 | 381 | 0.5143 | - | - |
| 0.1034 | 382 | 0.5067 | - | - |
| 0.1036 | 383 | 0.577 | - | - |
| 0.1039 | 384 | 0.4762 | - | - |
| 0.1042 | 385 | 0.5171 | - | - |
| 0.1044 | 386 | 0.483 | - | - |
| 0.1047 | 387 | 0.5319 | - | - |
| 0.1050 | 388 | 0.5519 | - | - |
| 0.1052 | 389 | 0.5023 | - | - |
| 0.1055 | 390 | 0.4167 | - | - |
| 0.1058 | 391 | 0.3797 | - | - |
| 0.1061 | 392 | 0.5427 | - | - |
| 0.1063 | 393 | 0.4857 | - | - |
| 0.1066 | 394 | 0.4877 | - | - |
| 0.1069 | 395 | 0.5607 | - | - |
| 0.1071 | 396 | 0.3526 | - | - |
| 0.1074 | 397 | 0.5034 | - | - |
| 0.1077 | 398 | 0.465 | - | - |
| 0.1080 | 399 | 0.4822 | - | - |
| 0.1082 | 400 | 0.5667 | - | - |
| 0.1085 | 401 | 0.5567 | - | - |
| 0.1088 | 402 | 0.3982 | - | - |
| 0.1090 | 403 | 0.5272 | - | - |
| 0.1093 | 404 | 0.3676 | - | - |
| 0.1096 | 405 | 0.4855 | - | - |
| 0.1098 | 406 | 0.4727 | - | - |
| 0.1101 | 407 | 0.4626 | - | - |
| 0.1104 | 408 | 0.6116 | - | - |
| 0.1107 | 409 | 0.3989 | - | - |
| 0.1109 | 410 | 0.4759 | - | - |
| 0.1112 | 411 | 0.3473 | - | - |
| 0.1115 | 412 | 0.7002 | - | - |
| 0.1117 | 413 | 0.3014 | - | - |
| 0.1120 | 414 | 0.4251 | - | - |
| 0.1123 | 415 | 0.4073 | - | - |
| 0.1126 | 416 | 0.5373 | - | - |
| 0.1128 | 417 | 0.5064 | - | - |
| 0.1131 | 418 | 0.4443 | - | - |
| 0.1134 | 419 | 0.4599 | - | - |
| 0.1136 | 420 | 0.3585 | - | - |
| 0.1139 | 421 | 0.4235 | - | - |
| 0.1142 | 422 | 0.3939 | - | - |
| 0.1144 | 423 | 0.5599 | - | - |
| 0.1147 | 424 | 0.3272 | - | - |
| 0.1150 | 425 | 0.3047 | - | - |
| 0.1153 | 426 | 0.3835 | - | - |
| 0.1155 | 427 | 0.3745 | - | - |
| 0.1158 | 428 | 0.5126 | - | - |
| 0.1161 | 429 | 0.4097 | - | - |
| 0.1163 | 430 | 0.4314 | - | - |
| 0.1166 | 431 | 0.5439 | - | - |
| 0.1169 | 432 | 0.4467 | - | - |
| 0.1172 | 433 | 0.4583 | - | - |
| 0.1174 | 434 | 0.434 | - | - |
| 0.1177 | 435 | 0.4183 | - | - |
| 0.1180 | 436 | 0.5685 | - | - |
| 0.1182 | 437 | 0.4235 | - | - |
| 0.1185 | 438 | 0.4815 | - | - |
| 0.1188 | 439 | 0.3793 | - | - |
| 0.1190 | 440 | 0.3617 | - | - |
| 0.1193 | 441 | 0.4938 | - | - |
| 0.1196 | 442 | 0.4725 | - | - |
| 0.1199 | 443 | 0.5827 | - | - |
| 0.1201 | 444 | 0.3295 | - | - |
| 0.1204 | 445 | 0.6002 | - | - |
| 0.1207 | 446 | 0.3134 | - | - |
| 0.1209 | 447 | 0.5644 | - | - |
| 0.1212 | 448 | 0.3111 | - | - |
| 0.1215 | 449 | 0.3892 | - | - |
| 0.1218 | 450 | 0.3114 | - | - |
| 0.1220 | 451 | 0.4343 | - | - |
| 0.1223 | 452 | 0.4723 | - | - |
| 0.1226 | 453 | 0.361 | - | - |
| 0.1228 | 454 | 0.4077 | - | - |
| 0.1231 | 455 | 0.4314 | - | - |
| 0.1234 | 456 | 0.5096 | - | - |
| 0.1236 | 457 | 0.3706 | - | - |
| 0.1239 | 458 | 0.4507 | - | - |
| 0.1242 | 459 | 0.4502 | - | - |
| 0.1245 | 460 | 0.2918 | - | - |
| 0.1247 | 461 | 0.5069 | - | - |
| 0.125 | 462 | 0.4151 | - | - |
| 0.1253 | 463 | 0.4682 | - | - |
| 0.1255 | 464 | 0.3999 | - | - |
| 0.1258 | 465 | 0.4764 | - | - |
| 0.1261 | 466 | 0.4207 | - | - |
| 0.1264 | 467 | 0.3923 | - | - |
| 0.1266 | 468 | 0.3791 | - | - |
| 0.1269 | 469 | 0.2914 | - | - |
| 0.1272 | 470 | 0.3546 | - | - |
| 0.1274 | 471 | 0.3632 | - | - |
| 0.1277 | 472 | 0.3634 | - | - |
| 0.1280 | 473 | 0.3898 | - | - |
| 0.1282 | 474 | 0.3788 | - | - |
| 0.1285 | 475 | 0.4937 | - | - |
| 0.1288 | 476 | 0.3428 | - | - |
| 0.1291 | 477 | 0.4589 | - | - |
| 0.1293 | 478 | 0.4068 | - | - |
| 0.1296 | 479 | 0.4065 | - | - |
| 0.1299 | 480 | 0.3577 | - | - |
| 0.1301 | 481 | 0.4345 | - | - |
| 0.1304 | 482 | 0.4767 | - | - |
| 0.1307 | 483 | 0.4697 | - | - |
| 0.1310 | 484 | 0.4634 | - | - |
| 0.1312 | 485 | 0.4374 | - | - |
| 0.1315 | 486 | 0.5893 | - | - |
| 0.1318 | 487 | 0.5903 | - | - |
| 0.1320 | 488 | 0.3559 | - | - |
| 0.1323 | 489 | 0.376 | - | - |
| 0.1326 | 490 | 0.407 | - | - |
| 0.1328 | 491 | 0.4807 | - | - |
| 0.1331 | 492 | 0.4908 | - | - |
| 0.1334 | 493 | 0.3917 | - | - |
| 0.1337 | 494 | 0.3708 | - | - |
| 0.1339 | 495 | 0.4199 | - | - |
| 0.1342 | 496 | 0.4543 | - | - |
| 0.1345 | 497 | 0.4159 | - | - |
| 0.1347 | 498 | 0.4284 | - | - |
| 0.1350 | 499 | 0.4836 | - | - |
| 0.1353 | 500 | 0.5708 | - | - |
| 0.1356 | 501 | 0.4684 | - | - |
| 0.1358 | 502 | 0.4828 | - | - |
| 0.1361 | 503 | 0.4267 | - | - |
| 0.1364 | 504 | 0.3401 | - | - |
| 0.1366 | 505 | 0.5218 | - | - |
| 0.1369 | 506 | 0.4788 | - | - |
| 0.1372 | 507 | 0.3658 | - | - |
| 0.1374 | 508 | 0.3734 | - | - |
| 0.1377 | 509 | 0.4097 | - | - |
| 0.1380 | 510 | 0.3513 | - | - |
| 0.1383 | 511 | 0.5054 | - | - |
| 0.1385 | 512 | 0.3979 | - | - |
| 0.1388 | 513 | 0.3675 | - | - |
| 0.1391 | 514 | 0.3482 | - | - |
| 0.1393 | 515 | 0.3552 | - | - |
| 0.1396 | 516 | 0.3551 | - | - |
| 0.1399 | 517 | 0.577 | - | - |
| 0.1402 | 518 | 0.3992 | - | - |
| 0.1404 | 519 | 0.4821 | - | - |
| 0.1407 | 520 | 0.4765 | - | - |
| 0.1410 | 521 | 0.3338 | - | - |
| 0.1412 | 522 | 0.3712 | - | - |
| 0.1415 | 523 | 0.4199 | - | - |
| 0.1418 | 524 | 0.3382 | - | - |
| 0.1420 | 525 | 0.5084 | - | - |
| 0.1423 | 526 | 0.4912 | - | - |
| 0.1426 | 527 | 0.4092 | - | - |
| 0.1429 | 528 | 0.3429 | - | - |
| 0.1431 | 529 | 0.3489 | - | - |
| 0.1434 | 530 | 0.4979 | - | - |
| 0.1437 | 531 | 0.3097 | - | - |
| 0.1439 | 532 | 0.2743 | - | - |
| 0.1442 | 533 | 0.3807 | - | - |
| 0.1445 | 534 | 0.4363 | - | - |
| 0.1448 | 535 | 0.3778 | - | - |
| 0.1450 | 536 | 0.3534 | - | - |
| 0.1453 | 537 | 0.4803 | - | - |
| 0.1456 | 538 | 0.371 | - | - |
| 0.1458 | 539 | 0.3576 | - | - |
| 0.1461 | 540 | 0.4149 | - | - |
| 0.1464 | 541 | 0.3288 | - | - |
| 0.1466 | 542 | 0.5136 | - | - |
| 0.1469 | 543 | 0.3446 | - | - |
| 0.1472 | 544 | 0.4103 | - | - |
| 0.1475 | 545 | 0.3375 | - | - |
| 0.1477 | 546 | 0.5033 | - | - |
| 0.1480 | 547 | 0.5561 | - | - |
| 0.1483 | 548 | 0.3516 | - | - |
| 0.1485 | 549 | 0.4674 | - | - |
| 0.1488 | 550 | 0.4571 | - | - |
| 0.1491 | 551 | 0.4782 | - | - |
| 0.1494 | 552 | 0.4695 | - | - |
| 0.1496 | 553 | 0.4307 | - | - |
| 0.1499 | 554 | 0.4111 | - | - |
| 0.1502 | 555 | 0.4575 | - | - |
| 0.1504 | 556 | 0.4811 | - | - |
| 0.1507 | 557 | 0.446 | - | - |
| 0.1510 | 558 | 0.3233 | - | - |
| 0.1512 | 559 | 0.3366 | - | - |
| 0.1515 | 560 | 0.4584 | - | - |
| 0.1518 | 561 | 0.3391 | - | - |
| 0.1521 | 562 | 0.3949 | - | - |
| 0.1523 | 563 | 0.4194 | - | - |
| 0.1526 | 564 | 0.3506 | - | - |
| 0.1529 | 565 | 0.4667 | - | - |
| 0.1531 | 566 | 0.3708 | - | - |
| 0.1534 | 567 | 0.3828 | - | - |
| 0.1537 | 568 | 0.3823 | - | - |
| 0.1540 | 569 | 0.4827 | - | - |
| 0.1542 | 570 | 0.4167 | - | - |
| 0.1545 | 571 | 0.3055 | - | - |
| 0.1548 | 572 | 0.3797 | - | - |
| 0.1550 | 573 | 0.3658 | - | - |
| 0.1553 | 574 | 0.3399 | - | - |
| 0.1556 | 575 | 0.3609 | - | - |
| 0.1558 | 576 | 0.4068 | - | - |
| 0.1561 | 577 | 0.4045 | - | - |
| 0.1564 | 578 | 0.4415 | - | - |
| 0.1567 | 579 | 0.4102 | - | - |
| 0.1569 | 580 | 0.3578 | - | - |
| 0.1572 | 581 | 0.2902 | - | - |
| 0.1575 | 582 | 0.4447 | - | - |
| 0.1577 | 583 | 0.3582 | - | - |
| 0.1580 | 584 | 0.5064 | - | - |
| 0.1583 | 585 | 0.6035 | - | - |
| 0.1585 | 586 | 0.476 | - | - |
| 0.1588 | 587 | 0.4533 | - | - |
| 0.1591 | 588 | 0.3254 | - | - |
| 0.1594 | 589 | 0.4245 | - | - |
| 0.1596 | 590 | 0.3461 | - | - |
| 0.1599 | 591 | 0.3651 | - | - |
| 0.1602 | 592 | 0.4255 | - | - |
| 0.1604 | 593 | 0.3545 | - | - |
| 0.1607 | 594 | 0.2814 | - | - |
| 0.1610 | 595 | 0.4902 | - | - |
| 0.1613 | 596 | 0.3797 | - | - |
| 0.1615 | 597 | 0.3915 | - | - |
| 0.1618 | 598 | 0.3741 | - | - |
| 0.1621 | 599 | 0.4349 | - | - |
| 0.1623 | 600 | 0.4441 | - | - |
| 0.1626 | 601 | 0.3932 | - | - |
| 0.1629 | 602 | 0.3309 | - | - |
| 0.1631 | 603 | 0.3346 | - | - |
| 0.1634 | 604 | 0.3294 | - | - |
| 0.1637 | 605 | 0.3267 | - | - |
| 0.1640 | 606 | 0.23 | - | - |
| 0.1642 | 607 | 0.4179 | - | - |
| 0.1645 | 608 | 0.5072 | - | - |
| 0.1648 | 609 | 0.404 | - | - |
| 0.1650 | 610 | 0.3117 | - | - |
| 0.1653 | 611 | 0.4566 | - | - |
| 0.1656 | 612 | 0.477 | - | - |
| 0.1659 | 613 | 0.4869 | - | - |
| 0.1661 | 614 | 0.3917 | - | - |
| 0.1664 | 615 | 0.3363 | - | - |
| 0.1667 | 616 | 0.3831 | - | - |
| 0.1669 | 617 | 0.4683 | - | - |
| 0.1672 | 618 | 0.5428 | - | - |
| 0.1675 | 619 | 0.372 | - | - |
| 0.1677 | 620 | 0.3986 | - | - |
| 0.1680 | 621 | 0.3343 | - | - |
| 0.1683 | 622 | 0.4598 | - | - |
| 0.1686 | 623 | 0.5001 | - | - |
| 0.1688 | 624 | 0.4636 | - | - |
| 0.1691 | 625 | 0.3864 | - | - |
| 0.1694 | 626 | 0.3046 | - | - |
| 0.1696 | 627 | 0.4236 | - | - |
| 0.1699 | 628 | 0.2618 | - | - |
| 0.1702 | 629 | 0.3836 | - | - |
| 0.1705 | 630 | 0.3888 | - | - |
| 0.1707 | 631 | 0.3397 | - | - |
| 0.1710 | 632 | 0.3818 | - | - |
| 0.1713 | 633 | 0.5019 | - | - |
| 0.1715 | 634 | 0.3487 | - | - |
| 0.1718 | 635 | 0.4416 | - | - |
| 0.1721 | 636 | 0.3781 | - | - |
| 0.1723 | 637 | 0.335 | - | - |
| 0.1726 | 638 | 0.4464 | - | - |
| 0.1729 | 639 | 0.442 | - | - |
| 0.1732 | 640 | 0.3562 | - | - |
| 0.1734 | 641 | 0.5615 | - | - |
| 0.1737 | 642 | 0.3968 | - | - |
| 0.1740 | 643 | 0.4254 | - | - |
| 0.1742 | 644 | 0.3324 | - | - |
| 0.1745 | 645 | 0.3475 | - | - |
| 0.1748 | 646 | 0.3493 | - | - |
| 0.1751 | 647 | 0.312 | - | - |
| 0.1753 | 648 | 0.4798 | - | - |
| 0.1756 | 649 | 0.3866 | - | - |
| 0.1759 | 650 | 0.3165 | - | - |
| 0.1761 | 651 | 0.3656 | - | - |
| 0.1764 | 652 | 0.3335 | - | - |
| 0.1767 | 653 | 0.4072 | - | - |
| 0.1769 | 654 | 0.3952 | - | - |
| 0.1772 | 655 | 0.3044 | - | - |
| 0.1775 | 656 | 0.3295 | - | - |
| 0.1778 | 657 | 0.5671 | - | - |
| 0.1780 | 658 | 0.4012 | - | - |
| 0.1783 | 659 | 0.3263 | - | - |
| 0.1786 | 660 | 0.3351 | - | - |
| 0.1788 | 661 | 0.3712 | - | - |
| 0.1791 | 662 | 0.5386 | - | - |
| 0.1794 | 663 | 0.4418 | - | - |
| 0.1797 | 664 | 0.4058 | - | - |
| 0.1799 | 665 | 0.3879 | - | - |
| 0.1802 | 666 | 0.4332 | - | - |
| 0.1805 | 667 | 0.4194 | - | - |
| 0.1807 | 668 | 0.439 | - | - |
| 0.1810 | 669 | 0.2701 | - | - |
| 0.1813 | 670 | 0.2866 | - | - |
| 0.1815 | 671 | 0.3157 | - | - |
| 0.1818 | 672 | 0.3567 | - | - |
| 0.1821 | 673 | 0.4435 | - | - |
| 0.1824 | 674 | 0.3794 | - | - |
| 0.1826 | 675 | 0.4044 | - | - |
| 0.1829 | 676 | 0.2416 | - | - |
| 0.1832 | 677 | 0.3851 | - | - |
| 0.1834 | 678 | 0.3509 | - | - |
| 0.1837 | 679 | 0.4402 | - | - |
| 0.1840 | 680 | 0.4473 | - | - |
| 0.1843 | 681 | 0.2757 | - | - |
| 0.1845 | 682 | 0.2898 | - | - |
| 0.1848 | 683 | 0.3547 | - | - |
| 0.1851 | 684 | 0.4422 | - | - |
| 0.1853 | 685 | 0.4154 | - | - |
| 0.1856 | 686 | 0.3428 | - | - |
| 0.1859 | 687 | 0.4308 | - | - |
| 0.1861 | 688 | 0.3496 | - | - |
| 0.1864 | 689 | 0.392 | - | - |
| 0.1867 | 690 | 0.327 | - | - |
| 0.1870 | 691 | 0.312 | - | - |
| 0.1872 | 692 | 0.411 | - | - |
| 0.1875 | 693 | 0.4342 | - | - |
| 0.1878 | 694 | 0.3153 | - | - |
| 0.1880 | 695 | 0.3987 | - | - |
| 0.1883 | 696 | 0.2914 | - | - |
| 0.1886 | 697 | 0.457 | - | - |
| 0.1889 | 698 | 0.3247 | - | - |
| 0.1891 | 699 | 0.4077 | - | - |
| 0.1894 | 700 | 0.4483 | - | - |
| 0.1897 | 701 | 0.3482 | - | - |
| 0.1899 | 702 | 0.2505 | - | - |
| 0.1902 | 703 | 0.3339 | - | - |
| 0.1905 | 704 | 0.3919 | - | - |
| 0.1907 | 705 | 0.3753 | - | - |
| 0.1910 | 706 | 0.3812 | - | - |
| 0.1913 | 707 | 0.3383 | - | - |
| 0.1916 | 708 | 0.3303 | - | - |
| 0.1918 | 709 | 0.3329 | - | - |
| 0.1921 | 710 | 0.393 | - | - |
| 0.1924 | 711 | 0.481 | - | - |
| 0.1926 | 712 | 0.2871 | - | - |
| 0.1929 | 713 | 0.284 | - | - |
| 0.1932 | 714 | 0.4505 | - | - |
| 0.1935 | 715 | 0.5099 | - | - |
| 0.1937 | 716 | 0.4139 | - | - |
| 0.1940 | 717 | 0.4806 | - | - |
| 0.1943 | 718 | 0.3671 | - | - |
| 0.1945 | 719 | 0.3767 | - | - |
| 0.1948 | 720 | 0.3012 | - | - |
| 0.1951 | 721 | 0.4281 | - | - |
| 0.1953 | 722 | 0.3874 | - | - |
| 0.1956 | 723 | 0.4483 | - | - |
| 0.1959 | 724 | 0.3826 | - | - |
| 0.1962 | 725 | 0.3191 | - | - |
| 0.1964 | 726 | 0.2822 | - | - |
| 0.1967 | 727 | 0.3294 | - | - |
| 0.1970 | 728 | 0.3397 | - | - |
| 0.1972 | 729 | 0.2751 | - | - |
| 0.1975 | 730 | 0.446 | - | - |
| 0.1978 | 731 | 0.3335 | - | - |
| 0.1981 | 732 | 0.4961 | - | - |
| 0.1983 | 733 | 0.7003 | - | - |
| 0.1986 | 734 | 0.2998 | - | - |
| 0.1989 | 735 | 0.4445 | - | - |
| 0.1991 | 736 | 0.2437 | - | - |
| 0.1994 | 737 | 0.3158 | - | - |
| 0.1997 | 738 | 0.5616 | - | - |
| 0.1999 | 739 | 0.4047 | - | - |
| 0.2002 | 740 | 0.3447 | - | - |
| 0.2005 | 741 | 0.3425 | - | - |
| 0.2008 | 742 | 0.4514 | - | - |
| 0.2010 | 743 | 0.439 | - | - |
| 0.2013 | 744 | 0.4779 | - | - |
| 0.2016 | 745 | 0.4259 | - | - |
| 0.2018 | 746 | 0.438 | - | - |
| 0.2021 | 747 | 0.515 | - | - |
| 0.2024 | 748 | 0.3163 | - | - |
| 0.2027 | 749 | 0.4198 | - | - |
| 0.2029 | 750 | 0.3959 | - | - |
| 0.2032 | 751 | 0.2549 | - | - |
| 0.2035 | 752 | 0.4149 | - | - |
| 0.2037 | 753 | 0.3564 | - | - |
| 0.2040 | 754 | 0.3112 | - | - |
| 0.2043 | 755 | 0.3141 | - | - |
| 0.2045 | 756 | 0.4157 | - | - |
| 0.2048 | 757 | 0.4643 | - | - |
| 0.2051 | 758 | 0.3212 | - | - |
| 0.2054 | 759 | 0.4046 | - | - |
| 0.2056 | 760 | 0.538 | - | - |
| 0.2059 | 761 | 0.4378 | - | - |
| 0.2062 | 762 | 0.3041 | - | - |
| 0.2064 | 763 | 0.3931 | - | - |
| 0.2067 | 764 | 0.3217 | - | - |
| 0.2070 | 765 | 0.2577 | - | - |
| 0.2073 | 766 | 0.3941 | - | - |
| 0.2075 | 767 | 0.5436 | - | - |
| 0.2078 | 768 | 0.4075 | - | - |
| 0.2081 | 769 | 0.3665 | - | - |
| 0.2083 | 770 | 0.5189 | - | - |
| 0.2086 | 771 | 0.3648 | - | - |
| 0.2089 | 772 | 0.2695 | - | - |
| 0.2091 | 773 | 0.3241 | - | - |
| 0.2094 | 774 | 0.3511 | - | - |
| 0.2097 | 775 | 0.3022 | - | - |
| 0.2100 | 776 | 0.2947 | - | - |
| 0.2102 | 777 | 0.4598 | - | - |
| 0.2105 | 778 | 0.4121 | - | - |
| 0.2108 | 779 | 0.309 | - | - |
| 0.2110 | 780 | 0.3563 | - | - |
| 0.2113 | 781 | 0.5174 | - | - |
| 0.2116 | 782 | 0.366 | - | - |
| 0.2119 | 783 | 0.3779 | - | - |
| 0.2121 | 784 | 0.4078 | - | - |
| 0.2124 | 785 | 0.3317 | - | - |
| 0.2127 | 786 | 0.4269 | - | - |
| 0.2129 | 787 | 0.3311 | - | - |
| 0.2132 | 788 | 0.3335 | - | - |
| 0.2135 | 789 | 0.269 | - | - |
| 0.2137 | 790 | 0.3487 | - | - |
| 0.2140 | 791 | 0.3457 | - | - |
| 0.2143 | 792 | 0.3431 | - | - |
| 0.2146 | 793 | 0.3441 | - | - |
| 0.2148 | 794 | 0.2875 | - | - |
| 0.2151 | 795 | 0.364 | - | - |
| 0.2154 | 796 | 0.4348 | - | - |
| 0.2156 | 797 | 0.3488 | - | - |
| 0.2159 | 798 | 0.2742 | - | - |
| 0.2162 | 799 | 0.4424 | - | - |
| 0.2165 | 800 | 0.3975 | - | - |
| 0.2167 | 801 | 0.4244 | - | - |
| 0.2170 | 802 | 0.385 | - | - |
| 0.2173 | 803 | 0.3402 | - | - |
| 0.2175 | 804 | 0.3547 | - | - |
| 0.2178 | 805 | 0.455 | - | - |
| 0.2181 | 806 | 0.5426 | - | - |
| 0.2183 | 807 | 0.4007 | - | - |
| 0.2186 | 808 | 0.3376 | - | - |
| 0.2189 | 809 | 0.3058 | - | - |
| 0.2192 | 810 | 0.412 | - | - |
| 0.2194 | 811 | 0.3868 | - | - |
| 0.2197 | 812 | 0.3712 | - | - |
| 0.2200 | 813 | 0.3184 | - | - |
| 0.2202 | 814 | 0.304 | - | - |
| 0.2205 | 815 | 0.4657 | - | - |
| 0.2208 | 816 | 0.2557 | - | - |
| 0.2210 | 817 | 0.3727 | - | - |
| 0.2213 | 818 | 0.3147 | - | - |
| 0.2216 | 819 | 0.3845 | - | - |
| 0.2219 | 820 | 0.32 | - | - |
| 0.2221 | 821 | 0.3003 | - | - |
| 0.2224 | 822 | 0.4375 | - | - |
| 0.2227 | 823 | 0.3704 | - | - |
| 0.2229 | 824 | 0.4824 | - | - |
| 0.2232 | 825 | 0.3775 | - | - |
| 0.2235 | 826 | 0.4419 | - | - |
| 0.2238 | 827 | 0.4566 | - | - |
| 0.2240 | 828 | 0.3946 | - | - |
| 0.2243 | 829 | 0.2748 | - | - |
| 0.2246 | 830 | 0.3602 | - | - |
| 0.2248 | 831 | 0.3373 | - | - |
| 0.2251 | 832 | 0.4505 | - | - |
| 0.2254 | 833 | 0.3683 | - | - |
| 0.2256 | 834 | 0.4232 | - | - |
| 0.2259 | 835 | 0.3398 | - | - |
| 0.2262 | 836 | 0.3074 | - | - |
| 0.2265 | 837 | 0.3726 | - | - |
| 0.2267 | 838 | 0.2982 | - | - |
| 0.2270 | 839 | 0.3812 | - | - |
| 0.2273 | 840 | 0.3428 | - | - |
| 0.2275 | 841 | 0.3911 | - | - |
| 0.2278 | 842 | 0.2767 | - | - |
| 0.2281 | 843 | 0.4704 | - | - |
| 0.2284 | 844 | 0.4487 | - | - |
| 0.2286 | 845 | 0.3709 | - | - |
| 0.2289 | 846 | 0.4194 | - | - |
| 0.2292 | 847 | 0.4367 | - | - |
| 0.2294 | 848 | 0.2981 | - | - |
| 0.2297 | 849 | 0.3883 | - | - |
| 0.2300 | 850 | 0.4104 | - | - |
| 0.2302 | 851 | 0.4059 | - | - |
| 0.2305 | 852 | 0.3729 | - | - |
| 0.2308 | 853 | 0.3828 | - | - |
| 0.2311 | 854 | 0.3498 | - | - |
| 0.2313 | 855 | 0.2595 | - | - |
| 0.2316 | 856 | 0.3407 | - | - |
| 0.2319 | 857 | 0.3798 | - | - |
| 0.2321 | 858 | 0.445 | - | - |
| 0.2324 | 859 | 0.3066 | - | - |
| 0.2327 | 860 | 0.3882 | - | - |
| 0.2330 | 861 | 0.457 | - | - |
| 0.2332 | 862 | 0.2386 | - | - |
| 0.2335 | 863 | 0.3183 | - | - |
| 0.2338 | 864 | 0.2541 | - | - |
| 0.2340 | 865 | 0.3393 | - | - |
| 0.2343 | 866 | 0.3825 | - | - |
| 0.2346 | 867 | 0.3886 | - | - |
| 0.2348 | 868 | 0.3326 | - | - |
| 0.2351 | 869 | 0.2589 | - | - |
| 0.2354 | 870 | 0.3049 | - | - |
| 0.2357 | 871 | 0.2513 | - | - |
| 0.2359 | 872 | 0.286 | - | - |
| 0.2362 | 873 | 0.477 | - | - |
| 0.2365 | 874 | 0.452 | - | - |
| 0.2367 | 875 | 0.3864 | - | - |
| 0.2370 | 876 | 0.2677 | - | - |
| 0.2373 | 877 | 0.2811 | - | - |
| 0.2376 | 878 | 0.4972 | - | - |
| 0.2378 | 879 | 0.3793 | - | - |
| 0.2381 | 880 | 0.4091 | - | - |
| 0.2384 | 881 | 0.4446 | - | - |
| 0.2386 | 882 | 0.3355 | - | - |
| 0.2389 | 883 | 0.2959 | - | - |
| 0.2392 | 884 | 0.4378 | - | - |
| 0.2394 | 885 | 0.5828 | - | - |
| 0.2397 | 886 | 0.343 | - | - |
| 0.2400 | 887 | 0.4026 | - | - |
| 0.2403 | 888 | 0.4142 | - | - |
| 0.2405 | 889 | 0.3471 | - | - |
| 0.2408 | 890 | 0.4129 | - | - |
| 0.2411 | 891 | 0.3108 | - | - |
| 0.2413 | 892 | 0.2943 | - | - |
| 0.2416 | 893 | 0.3831 | - | - |
| 0.2419 | 894 | 0.3444 | - | - |
| 0.2422 | 895 | 0.2944 | - | - |
| 0.2424 | 896 | 0.444 | - | - |
| 0.2427 | 897 | 0.4253 | - | - |
| 0.2430 | 898 | 0.3068 | - | - |
| 0.2432 | 899 | 0.2753 | - | - |
| 0.2435 | 900 | 0.2619 | - | - |
| 0.2438 | 901 | 0.4103 | - | - |
| 0.2440 | 902 | 0.2468 | - | - |
| 0.2443 | 903 | 0.46 | - | - |
| 0.2446 | 904 | 0.4689 | - | - |
| 0.2449 | 905 | 0.3259 | - | - |
| 0.2451 | 906 | 0.46 | - | - |
| 0.2454 | 907 | 0.3254 | - | - |
| 0.2457 | 908 | 0.4582 | - | - |
| 0.2459 | 909 | 0.2537 | - | - |
| 0.2462 | 910 | 0.2723 | - | - |
| 0.2465 | 911 | 0.4031 | - | - |
| 0.2468 | 912 | 0.4395 | - | - |
| 0.2470 | 913 | 0.3691 | - | - |
| 0.2473 | 914 | 0.3314 | - | - |
| 0.2476 | 915 | 0.3831 | - | - |
| 0.2478 | 916 | 0.3194 | - | - |
| 0.2481 | 917 | 0.3103 | - | - |
| 0.2484 | 918 | 0.3532 | - | - |
| 0.2486 | 919 | 0.3574 | - | - |
| 0.2489 | 920 | 0.3837 | - | - |
| 0.2492 | 921 | 0.2775 | - | - |
| 0.2495 | 922 | 0.413 | - | - |
| 0.2497 | 923 | 0.3153 | - | - |
| 0.25 | 924 | 0.294 | - | - |
| 0.2503 | 925 | 0.2577 | - | - |
| 0.2505 | 926 | 0.4223 | - | - |
| 0.2508 | 927 | 0.3239 | - | - |
| 0.2511 | 928 | 0.4217 | - | - |
| 0.2514 | 929 | 0.3509 | - | - |
| 0.2516 | 930 | 0.313 | - | - |
| 0.2519 | 931 | 0.3246 | - | - |
| 0.2522 | 932 | 0.4282 | - | - |
| 0.2524 | 933 | 0.3892 | - | - |
| 0.2527 | 934 | 0.3826 | - | - |
| 0.2530 | 935 | 0.3192 | - | - |
| 0.2532 | 936 | 0.2984 | - | - |
| 0.2535 | 937 | 0.3143 | - | - |
| 0.2538 | 938 | 0.2451 | - | - |
| 0.2541 | 939 | 0.2108 | - | - |
| 0.2543 | 940 | 0.4843 | - | - |
| 0.2546 | 941 | 0.4296 | - | - |
| 0.2549 | 942 | 0.3882 | - | - |
| 0.2551 | 943 | 0.3971 | - | - |
| 0.2554 | 944 | 0.3021 | - | - |
| 0.2557 | 945 | 0.3535 | - | - |
| 0.2560 | 946 | 0.4501 | - | - |
| 0.2562 | 947 | 0.3274 | - | - |
| 0.2565 | 948 | 0.427 | - | - |
| 0.2568 | 949 | 0.3689 | - | - |
| 0.2570 | 950 | 0.2856 | - | - |
| 0.2573 | 951 | 0.4162 | - | - |
| 0.2576 | 952 | 0.298 | - | - |
| 0.2578 | 953 | 0.2986 | - | - |
| 0.2581 | 954 | 0.2839 | - | - |
| 0.2584 | 955 | 0.3835 | - | - |
| 0.2587 | 956 | 0.334 | - | - |
| 0.2589 | 957 | 0.3741 | - | - |
| 0.2592 | 958 | 0.329 | - | - |
| 0.2595 | 959 | 0.4423 | - | - |
| 0.2597 | 960 | 0.4031 | - | - |
| 0.2600 | 961 | 0.4467 | - | - |
| 0.2603 | 962 | 0.4164 | - | - |
| 0.2606 | 963 | 0.4399 | - | - |
| 0.2608 | 964 | 0.3872 | - | - |
| 0.2611 | 965 | 0.3178 | - | - |
| 0.2614 | 966 | 0.3842 | - | - |
| 0.2616 | 967 | 0.3568 | - | - |
| 0.2619 | 968 | 0.377 | - | - |
| 0.2622 | 969 | 0.3886 | - | - |
| 0.2624 | 970 | 0.4274 | - | - |
| 0.2627 | 971 | 0.3356 | - | - |
| 0.2630 | 972 | 0.352 | - | - |
| 0.2633 | 973 | 0.3758 | - | - |
| 0.2635 | 974 | 0.3294 | - | - |
| 0.2638 | 975 | 0.429 | - | - |
| 0.2641 | 976 | 0.2898 | - | - |
| 0.2643 | 977 | 0.2611 | - | - |
| 0.2646 | 978 | 0.3543 | - | - |
| 0.2649 | 979 | 0.2723 | - | - |
| 0.2652 | 980 | 0.3567 | - | - |
| 0.2654 | 981 | 0.3958 | - | - |
| 0.2657 | 982 | 0.3535 | - | - |
| 0.2660 | 983 | 0.2934 | - | - |
| 0.2662 | 984 | 0.4271 | - | - |
| 0.2665 | 985 | 0.2764 | - | - |
| 0.2668 | 986 | 0.4142 | - | - |
| 0.2670 | 987 | 0.3972 | - | - |
| 0.2673 | 988 | 0.4253 | - | - |
| 0.2676 | 989 | 0.2593 | - | - |
| 0.2679 | 990 | 0.4194 | - | - |
| 0.2681 | 991 | 0.3026 | - | - |
| 0.2684 | 992 | 0.2887 | - | - |
| 0.2687 | 993 | 0.3461 | - | - |
| 0.2689 | 994 | 0.3619 | - | - |
| 0.2692 | 995 | 0.3621 | - | - |
| 0.2695 | 996 | 0.3187 | - | - |
| 0.2698 | 997 | 0.3614 | - | - |
| 0.2700 | 998 | 0.2672 | - | - |
| 0.2703 | 999 | 0.375 | - | - |
| 0.2706 | 1000 | 0.285 | 0.3131 | 0.919 |
| 0.2708 | 1001 | 0.265 | - | - |
| 0.2711 | 1002 | 0.333 | - | - |
| 0.2714 | 1003 | 0.402 | - | - |
| 0.2716 | 1004 | 0.3103 | - | - |
| 0.2719 | 1005 | 0.3531 | - | - |
| 0.2722 | 1006 | 0.4888 | - | - |
| 0.2725 | 1007 | 0.3325 | - | - |
| 0.2727 | 1008 | 0.338 | - | - |
| 0.2730 | 1009 | 0.2637 | - | - |
| 0.2733 | 1010 | 0.3157 | - | - |
| 0.2735 | 1011 | 0.3101 | - | - |
| 0.2738 | 1012 | 0.3077 | - | - |
| 0.2741 | 1013 | 0.2603 | - | - |
| 0.2744 | 1014 | 0.3019 | - | - |
| 0.2746 | 1015 | 0.3775 | - | - |
| 0.2749 | 1016 | 0.4358 | - | - |
| 0.2752 | 1017 | 0.2512 | - | - |
| 0.2754 | 1018 | 0.3666 | - | - |
| 0.2757 | 1019 | 0.3002 | - | - |
| 0.2760 | 1020 | 0.2567 | - | - |
| 0.2762 | 1021 | 0.3584 | - | - |
| 0.2765 | 1022 | 0.2386 | - | - |
| 0.2768 | 1023 | 0.3902 | - | - |
| 0.2771 | 1024 | 0.2398 | - | - |
| 0.2773 | 1025 | 0.2573 | - | - |
| 0.2776 | 1026 | 0.2819 | - | - |
| 0.2779 | 1027 | 0.3095 | - | - |
| 0.2781 | 1028 | 0.2504 | - | - |
| 0.2784 | 1029 | 0.3288 | - | - |
| 0.2787 | 1030 | 0.4287 | - | - |
| 0.2790 | 1031 | 0.3384 | - | - |
| 0.2792 | 1032 | 0.3599 | - | - |
| 0.2795 | 1033 | 0.3052 | - | - |
| 0.2798 | 1034 | 0.3415 | - | - |
| 0.2800 | 1035 | 0.343 | - | - |
| 0.2803 | 1036 | 0.4511 | - | - |
| 0.2806 | 1037 | 0.3303 | - | - |
| 0.2808 | 1038 | 0.3797 | - | - |
| 0.2811 | 1039 | 0.3592 | - | - |
| 0.2814 | 1040 | 0.3932 | - | - |
| 0.2817 | 1041 | 0.3272 | - | - |
| 0.2819 | 1042 | 0.3413 | - | - |
| 0.2822 | 1043 | 0.3899 | - | - |
| 0.2825 | 1044 | 0.3189 | - | - |
| 0.2827 | 1045 | 0.3665 | - | - |
| 0.2830 | 1046 | 0.2467 | - | - |
| 0.2833 | 1047 | 0.2936 | - | - |
| 0.2835 | 1048 | 0.3552 | - | - |
| 0.2838 | 1049 | 0.3169 | - | - |
| 0.2841 | 1050 | 0.3157 | - | - |
| 0.2844 | 1051 | 0.3577 | - | - |
| 0.2846 | 1052 | 0.3009 | - | - |
| 0.2849 | 1053 | 0.2991 | - | - |
| 0.2852 | 1054 | 0.4104 | - | - |
| 0.2854 | 1055 | 0.2816 | - | - |
| 0.2857 | 1056 | 0.2779 | - | - |
| 0.2860 | 1057 | 0.4574 | - | - |
| 0.2863 | 1058 | 0.3233 | - | - |
| 0.2865 | 1059 | 0.3666 | - | - |
| 0.2868 | 1060 | 0.2423 | - | - |
| 0.2871 | 1061 | 0.4268 | - | - |
| 0.2873 | 1062 | 0.3156 | - | - |
| 0.2876 | 1063 | 0.353 | - | - |
| 0.2879 | 1064 | 0.3159 | - | - |
| 0.2881 | 1065 | 0.2713 | - | - |
| 0.2884 | 1066 | 0.3764 | - | - |
| 0.2887 | 1067 | 0.33 | - | - |
| 0.2890 | 1068 | 0.4578 | - | - |
| 0.2892 | 1069 | 0.2696 | - | - |
| 0.2895 | 1070 | 0.5282 | - | - |
| 0.2898 | 1071 | 0.2719 | - | - |
| 0.2900 | 1072 | 0.2023 | - | - |
| 0.2903 | 1073 | 0.3608 | - | - |
| 0.2906 | 1074 | 0.3293 | - | - |
| 0.2909 | 1075 | 0.4331 | - | - |
| 0.2911 | 1076 | 0.4126 | - | - |
| 0.2914 | 1077 | 0.3154 | - | - |
| 0.2917 | 1078 | 0.5337 | - | - |
| 0.2919 | 1079 | 0.339 | - | - |
| 0.2922 | 1080 | 0.3462 | - | - |
| 0.2925 | 1081 | 0.3614 | - | - |
| 0.2927 | 1082 | 0.3874 | - | - |
| 0.2930 | 1083 | 0.3068 | - | - |
| 0.2933 | 1084 | 0.2818 | - | - |
| 0.2936 | 1085 | 0.3615 | - | - |
| 0.2938 | 1086 | 0.2457 | - | - |
| 0.2941 | 1087 | 0.4074 | - | - |
| 0.2944 | 1088 | 0.3051 | - | - |
| 0.2946 | 1089 | 0.3238 | - | - |
| 0.2949 | 1090 | 0.3575 | - | - |
| 0.2952 | 1091 | 0.3145 | - | - |
| 0.2955 | 1092 | 0.2649 | - | - |
| 0.2957 | 1093 | 0.3485 | - | - |
| 0.2960 | 1094 | 0.2949 | - | - |
| 0.2963 | 1095 | 0.4315 | - | - |
| 0.2965 | 1096 | 0.3595 | - | - |
| 0.2968 | 1097 | 0.3465 | - | - |
| 0.2971 | 1098 | 0.3012 | - | - |
| 0.2973 | 1099 | 0.2986 | - | - |
| 0.2976 | 1100 | 0.3918 | - | - |
| 0.2979 | 1101 | 0.3563 | - | - |
| 0.2982 | 1102 | 0.2181 | - | - |
| 0.2984 | 1103 | 0.3051 | - | - |
| 0.2987 | 1104 | 0.3222 | - | - |
| 0.2990 | 1105 | 0.4502 | - | - |
| 0.2992 | 1106 | 0.2323 | - | - |
| 0.2995 | 1107 | 0.4678 | - | - |
| 0.2998 | 1108 | 0.3744 | - | - |
| 0.3001 | 1109 | 0.3787 | - | - |
| 0.3003 | 1110 | 0.4103 | - | - |
| 0.3006 | 1111 | 0.3141 | - | - |
| 0.3009 | 1112 | 0.2865 | - | - |
| 0.3011 | 1113 | 0.3028 | - | - |
| 0.3014 | 1114 | 0.3659 | - | - |
| 0.3017 | 1115 | 0.3952 | - | - |
| 0.3019 | 1116 | 0.5973 | - | - |
| 0.3022 | 1117 | 0.2921 | - | - |
| 0.3025 | 1118 | 0.2741 | - | - |
| 0.3028 | 1119 | 0.313 | - | - |
| 0.3030 | 1120 | 0.2989 | - | - |
| 0.3033 | 1121 | 0.3466 | - | - |
| 0.3036 | 1122 | 0.3237 | - | - |
| 0.3038 | 1123 | 0.4059 | - | - |
| 0.3041 | 1124 | 0.2759 | - | - |
| 0.3044 | 1125 | 0.3335 | - | - |
| 0.3047 | 1126 | 0.2879 | - | - |
| 0.3049 | 1127 | 0.4204 | - | - |
| 0.3052 | 1128 | 0.4009 | - | - |
| 0.3055 | 1129 | 0.31 | - | - |
| 0.3057 | 1130 | 0.4255 | - | - |
| 0.3060 | 1131 | 0.3863 | - | - |
| 0.3063 | 1132 | 0.3819 | - | - |
| 0.3065 | 1133 | 0.3316 | - | - |
| 0.3068 | 1134 | 0.3721 | - | - |
| 0.3071 | 1135 | 0.4282 | - | - |
| 0.3074 | 1136 | 0.5464 | - | - |
| 0.3076 | 1137 | 0.2696 | - | - |
| 0.3079 | 1138 | 0.315 | - | - |
| 0.3082 | 1139 | 0.3263 | - | - |
| 0.3084 | 1140 | 0.3488 | - | - |
| 0.3087 | 1141 | 0.3922 | - | - |
| 0.3090 | 1142 | 0.3279 | - | - |
| 0.3093 | 1143 | 0.2185 | - | - |
| 0.3095 | 1144 | 0.2331 | - | - |
| 0.3098 | 1145 | 0.2982 | - | - |
| 0.3101 | 1146 | 0.291 | - | - |
| 0.3103 | 1147 | 0.3611 | - | - |
| 0.3106 | 1148 | 0.3028 | - | - |
| 0.3109 | 1149 | 0.3954 | - | - |
| 0.3111 | 1150 | 0.3638 | - | - |
| 0.3114 | 1151 | 0.332 | - | - |
| 0.3117 | 1152 | 0.2228 | - | - |
| 0.3120 | 1153 | 0.3048 | - | - |
| 0.3122 | 1154 | 0.2789 | - | - |
| 0.3125 | 1155 | 0.2997 | - | - |
| 0.3128 | 1156 | 0.3662 | - | - |
| 0.3130 | 1157 | 0.3456 | - | - |
| 0.3133 | 1158 | 0.2927 | - | - |
| 0.3136 | 1159 | 0.3326 | - | - |
| 0.3139 | 1160 | 0.27 | - | - |
| 0.3141 | 1161 | 0.2756 | - | - |
| 0.3144 | 1162 | 0.3869 | - | - |
| 0.3147 | 1163 | 0.3463 | - | - |
| 0.3149 | 1164 | 0.3361 | - | - |
| 0.3152 | 1165 | 0.3088 | - | - |
| 0.3155 | 1166 | 0.3052 | - | - |
| 0.3157 | 1167 | 0.2964 | - | - |
| 0.3160 | 1168 | 0.2978 | - | - |
| 0.3163 | 1169 | 0.3723 | - | - |
| 0.3166 | 1170 | 0.2526 | - | - |
| 0.3168 | 1171 | 0.3881 | - | - |
| 0.3171 | 1172 | 0.281 | - | - |
| 0.3174 | 1173 | 0.2978 | - | - |
| 0.3176 | 1174 | 0.3354 | - | - |
| 0.3179 | 1175 | 0.2581 | - | - |
| 0.3182 | 1176 | 0.3478 | - | - |
| 0.3185 | 1177 | 0.3815 | - | - |
| 0.3187 | 1178 | 0.3078 | - | - |
| 0.3190 | 1179 | 0.2828 | - | - |
| 0.3193 | 1180 | 0.3003 | - | - |
| 0.3195 | 1181 | 0.3345 | - | - |
| 0.3198 | 1182 | 0.4192 | - | - |
| 0.3201 | 1183 | 0.3246 | - | - |
| 0.3203 | 1184 | 0.3861 | - | - |
| 0.3206 | 1185 | 0.3267 | - | - |
| 0.3209 | 1186 | 0.4421 | - | - |
| 0.3212 | 1187 | 0.3226 | - | - |
| 0.3214 | 1188 | 0.3563 | - | - |
| 0.3217 | 1189 | 0.3717 | - | - |
| 0.3220 | 1190 | 0.34 | - | - |
| 0.3222 | 1191 | 0.3757 | - | - |
| 0.3225 | 1192 | 0.3114 | - | - |
| 0.3228 | 1193 | 0.5106 | - | - |
| 0.3231 | 1194 | 0.2707 | - | - |
| 0.3233 | 1195 | 0.3091 | - | - |
| 0.3236 | 1196 | 0.4106 | - | - |
| 0.3239 | 1197 | 0.215 | - | - |
| 0.3241 | 1198 | 0.3182 | - | - |
| 0.3244 | 1199 | 0.3747 | - | - |
| 0.3247 | 1200 | 0.3645 | - | - |
| 0.3249 | 1201 | 0.3587 | - | - |
| 0.3252 | 1202 | 0.3672 | - | - |
| 0.3255 | 1203 | 0.3229 | - | - |
| 0.3258 | 1204 | 0.4058 | - | - |
| 0.3260 | 1205 | 0.2357 | - | - |
| 0.3263 | 1206 | 0.3266 | - | - |
| 0.3266 | 1207 | 0.3868 | - | - |
| 0.3268 | 1208 | 0.3269 | - | - |
| 0.3271 | 1209 | 0.3507 | - | - |
| 0.3274 | 1210 | 0.277 | - | - |
| 0.3277 | 1211 | 0.2645 | - | - |
| 0.3279 | 1212 | 0.3119 | - | - |
| 0.3282 | 1213 | 0.3348 | - | - |
| 0.3285 | 1214 | 0.3285 | - | - |
| 0.3287 | 1215 | 0.358 | - | - |
| 0.3290 | 1216 | 0.386 | - | - |
| 0.3293 | 1217 | 0.1993 | - | - |
| 0.3295 | 1218 | 0.4288 | - | - |
| 0.3298 | 1219 | 0.334 | - | - |
| 0.3301 | 1220 | 0.3295 | - | - |
| 0.3304 | 1221 | 0.3733 | - | - |
| 0.3306 | 1222 | 0.4579 | - | - |
| 0.3309 | 1223 | 0.3301 | - | - |
| 0.3312 | 1224 | 0.3008 | - | - |
| 0.3314 | 1225 | 0.3629 | - | - |
| 0.3317 | 1226 | 0.3995 | - | - |
| 0.3320 | 1227 | 0.2547 | - | - |
| 0.3323 | 1228 | 0.2691 | - | - |
| 0.3325 | 1229 | 0.2456 | - | - |
| 0.3328 | 1230 | 0.2411 | - | - |
| 0.3331 | 1231 | 0.2555 | - | - |
| 0.3333 | 1232 | 0.3296 | - | - |
| 0.3336 | 1233 | 0.3376 | - | - |
| 0.3339 | 1234 | 0.366 | - | - |
| 0.3341 | 1235 | 0.3086 | - | - |
| 0.3344 | 1236 | 0.5035 | - | - |
| 0.3347 | 1237 | 0.347 | - | - |
| 0.3350 | 1238 | 0.3955 | - | - |
| 0.3352 | 1239 | 0.301 | - | - |
| 0.3355 | 1240 | 0.2736 | - | - |
| 0.3358 | 1241 | 0.3868 | - | - |
| 0.3360 | 1242 | 0.2665 | - | - |
| 0.3363 | 1243 | 0.4783 | - | - |
| 0.3366 | 1244 | 0.3868 | - | - |
| 0.3369 | 1245 | 0.3709 | - | - |
| 0.3371 | 1246 | 0.3816 | - | - |
| 0.3374 | 1247 | 0.4771 | - | - |
| 0.3377 | 1248 | 0.3187 | - | - |
| 0.3379 | 1249 | 0.3167 | - | - |
| 0.3382 | 1250 | 0.3947 | - | - |
| 0.3385 | 1251 | 0.3201 | - | - |
| 0.3387 | 1252 | 0.3417 | - | - |
| 0.3390 | 1253 | 0.2906 | - | - |
| 0.3393 | 1254 | 0.3593 | - | - |
| 0.3396 | 1255 | 0.3965 | - | - |
| 0.3398 | 1256 | 0.3212 | - | - |
| 0.3401 | 1257 | 0.4542 | - | - |
| 0.3404 | 1258 | 0.3274 | - | - |
| 0.3406 | 1259 | 0.3206 | - | - |
| 0.3409 | 1260 | 0.278 | - | - |
| 0.3412 | 1261 | 0.3844 | - | - |
| 0.3415 | 1262 | 0.1857 | - | - |
| 0.3417 | 1263 | 0.2245 | - | - |
| 0.3420 | 1264 | 0.2125 | - | - |
| 0.3423 | 1265 | 0.2782 | - | - |
| 0.3425 | 1266 | 0.3194 | - | - |
| 0.3428 | 1267 | 0.3262 | - | - |
| 0.3431 | 1268 | 0.4295 | - | - |
| 0.3433 | 1269 | 0.2837 | - | - |
| 0.3436 | 1270 | 0.2221 | - | - |
| 0.3439 | 1271 | 0.255 | - | - |
| 0.3442 | 1272 | 0.1959 | - | - |
| 0.3444 | 1273 | 0.3568 | - | - |
| 0.3447 | 1274 | 0.3716 | - | - |
| 0.3450 | 1275 | 0.437 | - | - |
| 0.3452 | 1276 | 0.5078 | - | - |
| 0.3455 | 1277 | 0.2689 | - | - |
| 0.3458 | 1278 | 0.3653 | - | - |
| 0.3460 | 1279 | 0.3522 | - | - |
| 0.3463 | 1280 | 0.2809 | - | - |
| 0.3466 | 1281 | 0.3302 | - | - |
| 0.3469 | 1282 | 0.3689 | - | - |
| 0.3471 | 1283 | 0.3597 | - | - |
| 0.3474 | 1284 | 0.2672 | - | - |
| 0.3477 | 1285 | 0.2679 | - | - |
| 0.3479 | 1286 | 0.2393 | - | - |
| 0.3482 | 1287 | 0.3753 | - | - |
| 0.3485 | 1288 | 0.3876 | - | - |
| 0.3488 | 1289 | 0.2384 | - | - |
| 0.3490 | 1290 | 0.411 | - | - |
| 0.3493 | 1291 | 0.3 | - | - |
| 0.3496 | 1292 | 0.2367 | - | - |
| 0.3498 | 1293 | 0.3404 | - | - |
| 0.3501 | 1294 | 0.2742 | - | - |
| 0.3504 | 1295 | 0.436 | - | - |
| 0.3506 | 1296 | 0.2488 | - | - |
| 0.3509 | 1297 | 0.2625 | - | - |
| 0.3512 | 1298 | 0.2607 | - | - |
| 0.3515 | 1299 | 0.2273 | - | - |
| 0.3517 | 1300 | 0.3105 | - | - |
| 0.3520 | 1301 | 0.4418 | - | - |
| 0.3523 | 1302 | 0.3452 | - | - |
| 0.3525 | 1303 | 0.4404 | - | - |
| 0.3528 | 1304 | 0.3159 | - | - |
| 0.3531 | 1305 | 0.2851 | - | - |
| 0.3534 | 1306 | 0.3366 | - | - |
| 0.3536 | 1307 | 0.3255 | - | - |
| 0.3539 | 1308 | 0.4102 | - | - |
| 0.3542 | 1309 | 0.356 | - | - |
| 0.3544 | 1310 | 0.2882 | - | - |
| 0.3547 | 1311 | 0.3868 | - | - |
| 0.3550 | 1312 | 0.2843 | - | - |
| 0.3552 | 1313 | 0.3056 | - | - |
| 0.3555 | 1314 | 0.3019 | - | - |
| 0.3558 | 1315 | 0.3629 | - | - |
| 0.3561 | 1316 | 0.3249 | - | - |
| 0.3563 | 1317 | 0.3416 | - | - |
| 0.3566 | 1318 | 0.3334 | - | - |
| 0.3569 | 1319 | 0.3192 | - | - |
| 0.3571 | 1320 | 0.2987 | - | - |
| 0.3574 | 1321 | 0.4592 | - | - |
| 0.3577 | 1322 | 0.3347 | - | - |
| 0.3580 | 1323 | 0.3225 | - | - |
| 0.3582 | 1324 | 0.2893 | - | - |
| 0.3585 | 1325 | 0.2756 | - | - |
| 0.3588 | 1326 | 0.3101 | - | - |
| 0.3590 | 1327 | 0.3585 | - | - |
| 0.3593 | 1328 | 0.3718 | - | - |
| 0.3596 | 1329 | 0.3739 | - | - |
| 0.3598 | 1330 | 0.3745 | - | - |
| 0.3601 | 1331 | 0.3092 | - | - |
| 0.3604 | 1332 | 0.3439 | - | - |
| 0.3607 | 1333 | 0.4166 | - | - |
| 0.3609 | 1334 | 0.2473 | - | - |
| 0.3612 | 1335 | 0.4276 | - | - |
| 0.3615 | 1336 | 0.3324 | - | - |
| 0.3617 | 1337 | 0.316 | - | - |
| 0.3620 | 1338 | 0.2866 | - | - |
| 0.3623 | 1339 | 0.3335 | - | - |
| 0.3626 | 1340 | 0.4195 | - | - |
| 0.3628 | 1341 | 0.404 | - | - |
| 0.3631 | 1342 | 0.2932 | - | - |
| 0.3634 | 1343 | 0.2803 | - | - |
| 0.3636 | 1344 | 0.3479 | - | - |
| 0.3639 | 1345 |
0.3089 | - | - | | 0.3642 | 1346 | 0.2704 | - | - | | 0.3644 | 1347 | 0.2594 | - | - | | 0.3647 | 1348 | 0.3865 | - | - | | 0.3650 | 1349 | 0.3355 | - | - | | 0.3653 | 1350 | 0.2783 | - | - | | 0.3655 | 1351 | 0.3247 | - | - | | 0.3658 | 1352 | 0.2388 | - | - | | 0.3661 | 1353 | 0.2224 | - | - | | 0.3663 | 1354 | 0.3406 | - | - | | 0.3666 | 1355 | 0.287 | - | - | | 0.3669 | 1356 | 0.2588 | - | - | | 0.3672 | 1357 | 0.3212 | - | - | | 0.3674 | 1358 | 0.2848 | - | - | | 0.3677 | 1359 | 0.3124 | - | - | | 0.3680 | 1360 | 0.3249 | - | - | | 0.3682 | 1361 | 0.4642 | - | - | | 0.3685 | 1362 | 0.2873 | - | - | | 0.3688 | 1363 | 0.3088 | - | - | | 0.3690 | 1364 | 0.383 | - | - | | 0.3693 | 1365 | 0.3172 | - | - | | 0.3696 | 1366 | 0.2822 | - | - | | 0.3699 | 1367 | 0.2768 | - | - | | 0.3701 | 1368 | 0.3302 | - | - | | 0.3704 | 1369 | 0.343 | - | - | | 0.3707 | 1370 | 0.3196 | - | - | | 0.3709 | 1371 | 0.4174 | - | - | | 0.3712 | 1372 | 0.3112 | - | - | | 0.3715 | 1373 | 0.2883 | - | - | | 0.3718 | 1374 | 0.3163 | - | - | | 0.3720 | 1375 | 0.2534 | - | - | | 0.3723 | 1376 | 0.3306 | - | - | | 0.3726 | 1377 | 0.2289 | - | - | | 0.3728 | 1378 | 0.3455 | - | - | | 0.3731 | 1379 | 0.3523 | - | - | | 0.3734 | 1380 | 0.2652 | - | - | | 0.3736 | 1381 | 0.2843 | - | - | | 0.3739 | 1382 | 0.3417 | - | - | | 0.3742 | 1383 | 0.2493 | - | - | | 0.3745 | 1384 | 0.282 | - | - | | 0.3747 | 1385 | 0.3151 | - | - | | 0.375 | 1386 | 0.3309 | - | - | | 0.3753 | 1387 | 0.2056 | - | - | | 0.3755 | 1388 | 0.2501 | - | - | | 0.3758 | 1389 | 0.3405 | - | - | | 0.3761 | 1390 | 0.3507 | - | - | | 0.3764 | 1391 | 0.383 | - | - | | 0.3766 | 1392 | 0.4098 | - | - | | 0.3769 | 1393 | 0.3126 | - | - | | 0.3772 | 1394 | 0.2638 | - | - | | 0.3774 | 1395 | 0.3513 | - | - | | 0.3777 | 1396 | 0.365 | - | - | | 0.3780 | 1397 | 0.3619 | - | - | | 0.3782 | 1398 | 0.1893 | - | - | | 0.3785 | 1399 | 0.3793 | - | - | | 0.3788 | 1400 | 0.2953 | - | - | | 0.3791 | 1401 | 0.3451 | - | - | | 0.3793 | 1402 | 0.3182 | - 
| - | | 0.3796 | 1403 | 0.3521 | - | - | | 0.3799 | 1404 | 0.2786 | - | - | | 0.3801 | 1405 | 0.3593 | - | - | | 0.3804 | 1406 | 0.4103 | - | - | | 0.3807 | 1407 | 0.3579 | - | - | | 0.3810 | 1408 | 0.2547 | - | - | | 0.3812 | 1409 | 0.302 | - | - | | 0.3815 | 1410 | 0.3491 | - | - | | 0.3818 | 1411 | 0.2671 | - | - | | 0.3820 | 1412 | 0.4096 | - | - | | 0.3823 | 1413 | 0.2638 | - | - | | 0.3826 | 1414 | 0.1952 | - | - | | 0.3828 | 1415 | 0.3076 | - | - | | 0.3831 | 1416 | 0.3095 | - | - | | 0.3834 | 1417 | 0.3543 | - | - | | 0.3837 | 1418 | 0.32 | - | - | | 0.3839 | 1419 | 0.397 | - | - | | 0.3842 | 1420 | 0.3316 | - | - | | 0.3845 | 1421 | 0.2896 | - | - | | 0.3847 | 1422 | 0.2966 | - | - | | 0.3850 | 1423 | 0.3271 | - | - | | 0.3853 | 1424 | 0.3092 | - | - | | 0.3856 | 1425 | 0.3537 | - | - | | 0.3858 | 1426 | 0.2749 | - | - | | 0.3861 | 1427 | 0.3039 | - | - | | 0.3864 | 1428 | 0.2842 | - | - | | 0.3866 | 1429 | 0.3159 | - | - | | 0.3869 | 1430 | 0.3417 | - | - | | 0.3872 | 1431 | 0.3592 | - | - | | 0.3874 | 1432 | 0.3783 | - | - | | 0.3877 | 1433 | 0.3196 | - | - | | 0.3880 | 1434 | 0.3329 | - | - | | 0.3883 | 1435 | 0.2715 | - | - | | 0.3885 | 1436 | 0.2283 | - | - | | 0.3888 | 1437 | 0.2476 | - | - | | 0.3891 | 1438 | 0.2958 | - | - | | 0.3893 | 1439 | 0.2213 | - | - | | 0.3896 | 1440 | 0.4275 | - | - | | 0.3899 | 1441 | 0.3019 | - | - | | 0.3902 | 1442 | 0.4343 | - | - | | 0.3904 | 1443 | 0.297 | - | - | | 0.3907 | 1444 | 0.2655 | - | - | | 0.3910 | 1445 | 0.2607 | - | - | | 0.3912 | 1446 | 0.3763 | - | - | | 0.3915 | 1447 | 0.308 | - | - | | 0.3918 | 1448 | 0.3473 | - | - | | 0.3920 | 1449 | 0.3174 | - | - | | 0.3923 | 1450 | 0.3241 | - | - | | 0.3926 | 1451 | 0.3568 | - | - | | 0.3929 | 1452 | 0.3041 | - | - | | 0.3931 | 1453 | 0.327 | - | - | | 0.3934 | 1454 | 0.4484 | - | - | | 0.3937 | 1455 | 0.3508 | - | - | | 0.3939 | 1456 | 0.3127 | - | - | | 0.3942 | 1457 | 0.2704 | - | - | | 0.3945 | 1458 | 0.4142 | - | - | | 0.3948 | 1459 | 0.2167 | - | - | | 
0.3950 | 1460 | 0.3136 | - | - | | 0.3953 | 1461 | 0.293 | - | - | | 0.3956 | 1462 | 0.2908 | - | - | | 0.3958 | 1463 | 0.2915 | - | - | | 0.3961 | 1464 | 0.2654 | - | - | | 0.3964 | 1465 | 0.3292 | - | - | | 0.3966 | 1466 | 0.275 | - | - | | 0.3969 | 1467 | 0.3244 | - | - | | 0.3972 | 1468 | 0.3071 | - | - | | 0.3975 | 1469 | 0.3341 | - | - | | 0.3977 | 1470 | 0.352 | - | - | | 0.3980 | 1471 | 0.3116 | - | - | | 0.3983 | 1472 | 0.3123 | - | - | | 0.3985 | 1473 | 0.3793 | - | - | | 0.3988 | 1474 | 0.3694 | - | - | | 0.3991 | 1475 | 0.3258 | - | - | | 0.3994 | 1476 | 0.3305 | - | - | | 0.3996 | 1477 | 0.3727 | - | - | | 0.3999 | 1478 | 0.4845 | - | - | | 0.4002 | 1479 | 0.2735 | - | - | | 0.4004 | 1480 | 0.3541 | - | - | | 0.4007 | 1481 | 0.3674 | - | - | | 0.4010 | 1482 | 0.3042 | - | - | | 0.4012 | 1483 | 0.4306 | - | - | | 0.4015 | 1484 | 0.3802 | - | - | | 0.4018 | 1485 | 0.3054 | - | - | | 0.4021 | 1486 | 0.3294 | - | - | | 0.4023 | 1487 | 0.3278 | - | - | | 0.4026 | 1488 | 0.2426 | - | - | | 0.4029 | 1489 | 0.3134 | - | - | | 0.4031 | 1490 | 0.265 | - | - | | 0.4034 | 1491 | 0.3262 | - | - | | 0.4037 | 1492 | 0.2115 | - | - | | 0.4040 | 1493 | 0.3547 | - | - | | 0.4042 | 1494 | 0.3465 | - | - | | 0.4045 | 1495 | 0.2602 | - | - | | 0.4048 | 1496 | 0.3083 | - | - | | 0.4050 | 1497 | 0.3452 | - | - | | 0.4053 | 1498 | 0.3119 | - | - | | 0.4056 | 1499 | 0.3158 | - | - | | 0.4058 | 1500 | 0.292 | - | - | | 0.4061 | 1501 | 0.3093 | - | - | | 0.4064 | 1502 | 0.3745 | - | - | | 0.4067 | 1503 | 0.3562 | - | - | | 0.4069 | 1504 | 0.4018 | - | - | | 0.4072 | 1505 | 0.3412 | - | - | | 0.4075 | 1506 | 0.2803 | - | - | | 0.4077 | 1507 | 0.261 | - | - | | 0.4080 | 1508 | 0.2679 | - | - | | 0.4083 | 1509 | 0.233 | - | - | | 0.4085 | 1510 | 0.3224 | - | - | | 0.4088 | 1511 | 0.2553 | - | - | | 0.4091 | 1512 | 0.3856 | - | - | | 0.4094 | 1513 | 0.2882 | - | - | | 0.4096 | 1514 | 0.2913 | - | - | | 0.4099 | 1515 | 0.3757 | - | - | | 0.4102 | 1516 | 0.3336 | - | - | | 0.4104 | 
1517 | 0.3614 | - | - | | 0.4107 | 1518 | 0.406 | - | - | | 0.4110 | 1519 | 0.3836 | - | - | | 0.4113 | 1520 | 0.3144 | - | - | | 0.4115 | 1521 | 0.3723 | - | - | | 0.4118 | 1522 | 0.309 | - | - | | 0.4121 | 1523 | 0.2913 | - | - | | 0.4123 | 1524 | 0.2922 | - | - | | 0.4126 | 1525 | 0.3637 | - | - | | 0.4129 | 1526 | 0.3487 | - | - | | 0.4131 | 1527 | 0.2622 | - | - | | 0.4134 | 1528 | 0.371 | - | - | | 0.4137 | 1529 | 0.3331 | - | - | | 0.4140 | 1530 | 0.3036 | - | - | | 0.4142 | 1531 | 0.365 | - | - | | 0.4145 | 1532 | 0.2434 | - | - | | 0.4148 | 1533 | 0.4295 | - | - | | 0.4150 | 1534 | 0.2469 | - | - | | 0.4153 | 1535 | 0.2763 | - | - | | 0.4156 | 1536 | 0.2392 | - | - | | 0.4159 | 1537 | 0.3442 | - | - | | 0.4161 | 1538 | 0.2683 | - | - | | 0.4164 | 1539 | 0.3165 | - | - | | 0.4167 | 1540 | 0.3609 | - | - | | 0.4169 | 1541 | 0.2749 | - | - | | 0.4172 | 1542 | 0.3656 | - | - | | 0.4175 | 1543 | 0.2939 | - | - | | 0.4177 | 1544 | 0.3216 | - | - | | 0.4180 | 1545 | 0.2391 | - | - | | 0.4183 | 1546 | 0.3019 | - | - | | 0.4186 | 1547 | 0.4169 | - | - | | 0.4188 | 1548 | 0.2874 | - | - | | 0.4191 | 1549 | 0.2899 | - | - | | 0.4194 | 1550 | 0.2812 | - | - | | 0.4196 | 1551 | 0.3413 | - | - | | 0.4199 | 1552 | 0.377 | - | - | | 0.4202 | 1553 | 0.2849 | - | - | | 0.4205 | 1554 | 0.2043 | - | - | | 0.4207 | 1555 | 0.3214 | - | - | | 0.4210 | 1556 | 0.2212 | - | - | | 0.4213 | 1557 | 0.4131 | - | - | | 0.4215 | 1558 | 0.4091 | - | - | | 0.4218 | 1559 | 0.2656 | - | - | | 0.4221 | 1560 | 0.4024 | - | - | | 0.4223 | 1561 | 0.4297 | - | - | | 0.4226 | 1562 | 0.3183 | - | - | | 0.4229 | 1563 | 0.284 | - | - | | 0.4232 | 1564 | 0.3087 | - | - | | 0.4234 | 1565 | 0.2911 | - | - | | 0.4237 | 1566 | 0.2488 | - | - | | 0.4240 | 1567 | 0.2784 | - | - | | 0.4242 | 1568 | 0.3067 | - | - | | 0.4245 | 1569 | 0.3701 | - | - | | 0.4248 | 1570 | 0.2763 | - | - | | 0.4251 | 1571 | 0.2709 | - | - | | 0.4253 | 1572 | 0.2955 | - | - | | 0.4256 | 1573 | 0.3634 | - | - | | 0.4259 | 1574 | 
0.2968 | - | - | | 0.4261 | 1575 | 0.3411 | - | - | | 0.4264 | 1576 | 0.2878 | - | - | | 0.4267 | 1577 | 0.3299 | - | - | | 0.4269 | 1578 | 0.3076 | - | - | | 0.4272 | 1579 | 0.4037 | - | - | | 0.4275 | 1580 | 0.3145 | - | - | | 0.4278 | 1581 | 0.3472 | - | - | | 0.4280 | 1582 | 0.4746 | - | - | | 0.4283 | 1583 | 0.4133 | - | - | | 0.4286 | 1584 | 0.3383 | - | - | | 0.4288 | 1585 | 0.26 | - | - | | 0.4291 | 1586 | 0.2576 | - | - | | 0.4294 | 1587 | 0.2791 | - | - | | 0.4297 | 1588 | 0.2906 | - | - | | 0.4299 | 1589 | 0.314 | - | - | | 0.4302 | 1590 | 0.256 | - | - | | 0.4305 | 1591 | 0.3558 | - | - | | 0.4307 | 1592 | 0.3444 | - | - | | 0.4310 | 1593 | 0.3114 | - | - | | 0.4313 | 1594 | 0.3009 | - | - | | 0.4315 | 1595 | 0.2396 | - | - | | 0.4318 | 1596 | 0.2593 | - | - | | 0.4321 | 1597 | 0.3174 | - | - | | 0.4324 | 1598 | 0.2845 | - | - | | 0.4326 | 1599 | 0.3513 | - | - | | 0.4329 | 1600 | 0.2477 | - | - | | 0.4332 | 1601 | 0.3278 | - | - | | 0.4334 | 1602 | 0.2826 | - | - | | 0.4337 | 1603 | 0.2822 | - | - | | 0.4340 | 1604 | 0.2642 | - | - | | 0.4343 | 1605 | 0.2216 | - | - | | 0.4345 | 1606 | 0.3094 | - | - | | 0.4348 | 1607 | 0.2974 | - | - | | 0.4351 | 1608 | 0.2376 | - | - | | 0.4353 | 1609 | 0.311 | - | - | | 0.4356 | 1610 | 0.3213 | - | - | | 0.4359 | 1611 | 0.4042 | - | - | | 0.4361 | 1612 | 0.2256 | - | - | | 0.4364 | 1613 | 0.5054 | - | - | | 0.4367 | 1614 | 0.2997 | - | - | | 0.4370 | 1615 | 0.2637 | - | - | | 0.4372 | 1616 | 0.3322 | - | - | | 0.4375 | 1617 | 0.3703 | - | - | | 0.4378 | 1618 | 0.3901 | - | - | | 0.4380 | 1619 | 0.2318 | - | - | | 0.4383 | 1620 | 0.2835 | - | - | | 0.4386 | 1621 | 0.2978 | - | - | | 0.4389 | 1622 | 0.3346 | - | - | | 0.4391 | 1623 | 0.3628 | - | - | | 0.4394 | 1624 | 0.2674 | - | - | | 0.4397 | 1625 | 0.3236 | - | - | | 0.4399 | 1626 | 0.278 | - | - | | 0.4402 | 1627 | 0.3334 | - | - | | 0.4405 | 1628 | 0.2963 | - | - | | 0.4407 | 1629 | 0.3749 | - | - | | 0.4410 | 1630 | 0.2343 | - | - | | 0.4413 | 1631 | 0.2022 | - 
| - | | 0.4416 | 1632 | 0.2903 | - | - | | 0.4418 | 1633 | 0.2514 | - | - | | 0.4421 | 1634 | 0.3484 | - | - | | 0.4424 | 1635 | 0.275 | - | - | | 0.4426 | 1636 | 0.3407 | - | - | | 0.4429 | 1637 | 0.3139 | - | - | | 0.4432 | 1638 | 0.3343 | - | - | | 0.4435 | 1639 | 0.3925 | - | - | | 0.4437 | 1640 | 0.1999 | - | - | | 0.4440 | 1641 | 0.3318 | - | - | | 0.4443 | 1642 | 0.3439 | - | - | | 0.4445 | 1643 | 0.3689 | - | - | | 0.4448 | 1644 | 0.4289 | - | - | | 0.4451 | 1645 | 0.3181 | - | - | | 0.4453 | 1646 | 0.3545 | - | - | | 0.4456 | 1647 | 0.3583 | - | - | | 0.4459 | 1648 | 0.3065 | - | - | | 0.4462 | 1649 | 0.3479 | - | - | | 0.4464 | 1650 | 0.3788 | - | - | | 0.4467 | 1651 | 0.2848 | - | - | | 0.4470 | 1652 | 0.3141 | - | - | | 0.4472 | 1653 | 0.266 | - | - | | 0.4475 | 1654 | 0.3964 | - | - | | 0.4478 | 1655 | 0.3581 | - | - | | 0.4481 | 1656 | 0.4215 | - | - | | 0.4483 | 1657 | 0.2951 | - | - | | 0.4486 | 1658 | 0.1931 | - | - | | 0.4489 | 1659 | 0.3433 | - | - | | 0.4491 | 1660 | 0.346 | - | - | | 0.4494 | 1661 | 0.2408 | - | - | | 0.4497 | 1662 | 0.3135 | - | - | | 0.4499 | 1663 | 0.316 | - | - | | 0.4502 | 1664 | 0.3192 | - | - | | 0.4505 | 1665 | 0.2603 | - | - | | 0.4508 | 1666 | 0.3027 | - | - | | 0.4510 | 1667 | 0.3197 | - | - | | 0.4513 | 1668 | 0.2628 | - | - | | 0.4516 | 1669 | 0.2934 | - | - | | 0.4518 | 1670 | 0.305 | - | - | | 0.4521 | 1671 | 0.2776 | - | - | | 0.4524 | 1672 | 0.3222 | - | - | | 0.4527 | 1673 | 0.2787 | - | - | | 0.4529 | 1674 | 0.2959 | - | - | | 0.4532 | 1675 | 0.193 | - | - | | 0.4535 | 1676 | 0.2484 | - | - | | 0.4537 | 1677 | 0.261 | - | - | | 0.4540 | 1678 | 0.2162 | - | - | | 0.4543 | 1679 | 0.3156 | - | - | | 0.4545 | 1680 | 0.294 | - | - | | 0.4548 | 1681 | 0.3257 | - | - | | 0.4551 | 1682 | 0.374 | - | - | | 0.4554 | 1683 | 0.4185 | - | - | | 0.4556 | 1684 | 0.3447 | - | - | | 0.4559 | 1685 | 0.3498 | - | - | | 0.4562 | 1686 | 0.2802 | - | - | | 0.4564 | 1687 | 0.2454 | - | - | | 0.4567 | 1688 | 0.314 | - | - | | 0.4570 
| 1689 | 0.2863 | - | - | | 0.4573 | 1690 | 0.3427 | - | - | | 0.4575 | 1691 | 0.411 | - | - | | 0.4578 | 1692 | 0.3426 | - | - | | 0.4581 | 1693 | 0.2981 | - | - | | 0.4583 | 1694 | 0.2695 | - | - | | 0.4586 | 1695 | 0.2684 | - | - | | 0.4589 | 1696 | 0.3156 | - | - | | 0.4591 | 1697 | 0.2821 | - | - | | 0.4594 | 1698 | 0.2771 | - | - | | 0.4597 | 1699 | 0.2814 | - | - | | 0.4600 | 1700 | 0.438 | - | - | | 0.4602 | 1701 | 0.3238 | - | - | | 0.4605 | 1702 | 0.3357 | - | - | | 0.4608 | 1703 | 0.3173 | - | - | | 0.4610 | 1704 | 0.3449 | - | - | | 0.4613 | 1705 | 0.3006 | - | - | | 0.4616 | 1706 | 0.2668 | - | - | | 0.4619 | 1707 | 0.2207 | - | - | | 0.4621 | 1708 | 0.2732 | - | - | | 0.4624 | 1709 | 0.2932 | - | - | | 0.4627 | 1710 | 0.2876 | - | - | | 0.4629 | 1711 | 0.3651 | - | - | | 0.4632 | 1712 | 0.2588 | - | - | | 0.4635 | 1713 | 0.2924 | - | - | | 0.4637 | 1714 | 0.3066 | - | - | | 0.4640 | 1715 | 0.3097 | - | - | | 0.4643 | 1716 | 0.2903 | - | - | | 0.4646 | 1717 | 0.2954 | - | - | | 0.4648 | 1718 | 0.3254 | - | - | | 0.4651 | 1719 | 0.3473 | - | - | | 0.4654 | 1720 | 0.2877 | - | - | | 0.4656 | 1721 | 0.249 | - | - | | 0.4659 | 1722 | 0.3314 | - | - | | 0.4662 | 1723 | 0.2943 | - | - | | 0.4665 | 1724 | 0.2795 | - | - | | 0.4667 | 1725 | 0.3487 | - | - | | 0.4670 | 1726 | 0.2702 | - | - | | 0.4673 | 1727 | 0.376 | - | - | | 0.4675 | 1728 | 0.2944 | - | - | | 0.4678 | 1729 | 0.3628 | - | - | | 0.4681 | 1730 | 0.2901 | - | - | | 0.4683 | 1731 | 0.2995 | - | - | | 0.4686 | 1732 | 0.3562 | - | - | | 0.4689 | 1733 | 0.2696 | - | - | | 0.4692 | 1734 | 0.3227 | - | - | | 0.4694 | 1735 | 0.3213 | - | - | | 0.4697 | 1736 | 0.3491 | - | - | | 0.4700 | 1737 | 0.3207 | - | - | | 0.4702 | 1738 | 0.2993 | - | - | | 0.4705 | 1739 | 0.3539 | - | - | | 0.4708 | 1740 | 0.3892 | - | - | | 0.4710 | 1741 | 0.3387 | - | - | | 0.4713 | 1742 | 0.3199 | - | - | | 0.4716 | 1743 | 0.2784 | - | - | | 0.4719 | 1744 | 0.2633 | - | - | | 0.4721 | 1745 | 0.2245 | - | - | | 0.4724 | 1746 | 
0.2471 | - | - | | 0.4727 | 1747 | 0.2595 | - | - | | 0.4729 | 1748 | 0.4358 | - | - | | 0.4732 | 1749 | 0.2905 | - | - | | 0.4735 | 1750 | 0.3258 | - | - | | 0.4738 | 1751 | 0.3212 | - | - | | 0.4740 | 1752 | 0.261 | - | - | | 0.4743 | 1753 | 0.3827 | - | - | | 0.4746 | 1754 | 0.3426 | - | - | | 0.4748 | 1755 | 0.276 | - | - | | 0.4751 | 1756 | 0.314 | - | - | | 0.4754 | 1757 | 0.356 | - | - | | 0.4756 | 1758 | 0.3502 | - | - | | 0.4759 | 1759 | 0.2854 | - | - | | 0.4762 | 1760 | 0.2515 | - | - | | 0.4765 | 1761 | 0.2616 | - | - | | 0.4767 | 1762 | 0.299 | - | - | | 0.4770 | 1763 | 0.4031 | - | - | | 0.4773 | 1764 | 0.3912 | - | - | | 0.4775 | 1765 | 0.2894 | - | - | | 0.4778 | 1766 | 0.2781 | - | - | | 0.4781 | 1767 | 0.352 | - | - | | 0.4784 | 1768 | 0.4137 | - | - | | 0.4786 | 1769 | 0.3046 | - | - | | 0.4789 | 1770 | 0.2729 | - | - | | 0.4792 | 1771 | 0.2839 | - | - | | 0.4794 | 1772 | 0.2969 | - | - | | 0.4797 | 1773 | 0.4103 | - | - | | 0.4800 | 1774 | 0.2713 | - | - | | 0.4802 | 1775 | 0.2631 | - | - | | 0.4805 | 1776 | 0.3458 | - | - | | 0.4808 | 1777 | 0.1919 | - | - | | 0.4811 | 1778 | 0.2705 | - | - | | 0.4813 | 1779 | 0.3064 | - | - | | 0.4816 | 1780 | 0.3586 | - | - | | 0.4819 | 1781 | 0.3002 | - | - | | 0.4821 | 1782 | 0.2437 | - | - | | 0.4824 | 1783 | 0.2324 | - | - | | 0.4827 | 1784 | 0.2651 | - | - | | 0.4830 | 1785 | 0.3127 | - | - | | 0.4832 | 1786 | 0.2684 | - | - | | 0.4835 | 1787 | 0.2201 | - | - | | 0.4838 | 1788 | 0.2304 | - | - | | 0.4840 | 1789 | 0.223 | - | - | | 0.4843 | 1790 | 0.5316 | - | - | | 0.4846 | 1791 | 0.2831 | - | - | | 0.4848 | 1792 | 0.4394 | - | - | | 0.4851 | 1793 | 0.2484 | - | - | | 0.4854 | 1794 | 0.3246 | - | - | | 0.4857 | 1795 | 0.2835 | - | - | | 0.4859 | 1796 | 0.348 | - | - | | 0.4862 | 1797 | 0.337 | - | - | | 0.4865 | 1798 | 0.2918 | - | - | | 0.4867 | 1799 | 0.3523 | - | - | | 0.4870 | 1800 | 0.3838 | - | - | | 0.4873 | 1801 | 0.3461 | - | - | | 0.4876 | 1802 | 0.2209 | - | - | | 0.4878 | 1803 | 0.2826 | - | 
- | | 0.4881 | 1804 | 0.2855 | - | - | | 0.4884 | 1805 | 0.2988 | - | - | | 0.4886 | 1806 | 0.3571 | - | - | | 0.4889 | 1807 | 0.3321 | - | - | | 0.4892 | 1808 | 0.288 | - | - | | 0.4894 | 1809 | 0.3517 | - | - | | 0.4897 | 1810 | 0.3954 | - | - | | 0.4900 | 1811 | 0.3406 | - | - | | 0.4903 | 1812 | 0.3441 | - | - | | 0.4905 | 1813 | 0.3425 | - | - | | 0.4908 | 1814 | 0.3594 | - | - | | 0.4911 | 1815 | 0.2996 | - | - | | 0.4913 | 1816 | 0.1974 | - | - | | 0.4916 | 1817 | 0.2889 | - | - | | 0.4919 | 1818 | 0.3362 | - | - | | 0.4922 | 1819 | 0.3254 | - | - | | 0.4924 | 1820 | 0.2844 | - | - | | 0.4927 | 1821 | 0.328 | - | - | | 0.4930 | 1822 | 0.2904 | - | - | | 0.4932 | 1823 | 0.2588 | - | - | | 0.4935 | 1824 | 0.2622 | - | - | | 0.4938 | 1825 | 0.4415 | - | - | | 0.4940 | 1826 | 0.2619 | - | - | | 0.4943 | 1827 | 0.3035 | - | - | | 0.4946 | 1828 | 0.2876 | - | - | | 0.4949 | 1829 | 0.2342 | - | - | | 0.4951 | 1830 | 0.2439 | - | - | | 0.4954 | 1831 | 0.2569 | - | - | | 0.4957 | 1832 | 0.2483 | - | - | | 0.4959 | 1833 | 0.1941 | - | - | | 0.4962 | 1834 | 0.2254 | - | - | | 0.4965 | 1835 | 0.2969 | - | - | | 0.4968 | 1836 | 0.2489 | - | - | | 0.4970 | 1837 | 0.3358 | - | - | | 0.4973 | 1838 | 0.2673 | - | - | | 0.4976 | 1839 | 0.4219 | - | - | | 0.4978 | 1840 | 0.3112 | - | - | | 0.4981 | 1841 | 0.3524 | - | - | | 0.4984 | 1842 | 0.2772 | - | - | | 0.4986 | 1843 | 0.2896 | - | - | | 0.4989 | 1844 | 0.2695 | - | - | | 0.4992 | 1845 | 0.1904 | - | - | | 0.4995 | 1846 | 0.2621 | - | - | | 0.4997 | 1847 | 0.2439 | - | - | | 0.5 | 1848 | 0.2534 | - | - | | 0.5003 | 1849 | 0.2894 | - | - | | 0.5005 | 1850 | 0.3911 | - | - | | 0.5008 | 1851 | 0.2434 | - | - | | 0.5011 | 1852 | 0.3025 | - | - | | 0.5014 | 1853 | 0.3478 | - | - | | 0.5016 | 1854 | 0.424 | - | - | | 0.5019 | 1855 | 0.2836 | - | - | | 0.5022 | 1856 | 0.315 | - | - | | 0.5024 | 1857 | 0.3085 | - | - | | 0.5027 | 1858 | 0.3196 | - | - | | 0.5030 | 1859 | 0.3474 | - | - | | 0.5032 | 1860 | 0.2869 | - | - | | 
0.5035 | 1861 | 0.382 | - | - | | 0.5038 | 1862 | 0.2733 | - | - | | 0.5041 | 1863 | 0.2454 | - | - | | 0.5043 | 1864 | 0.2677 | - | - | | 0.5046 | 1865 | 0.282 | - | - | | 0.5049 | 1866 | 0.2499 | - | - | | 0.5051 | 1867 | 0.1954 | - | - | | 0.5054 | 1868 | 0.2632 | - | - | | 0.5057 | 1869 | 0.3081 | - | - | | 0.5060 | 1870 | 0.4498 | - | - | | 0.5062 | 1871 | 0.3749 | - | - | | 0.5065 | 1872 | 0.2123 | - | - | | 0.5068 | 1873 | 0.2102 | - | - | | 0.5070 | 1874 | 0.3575 | - | - | | 0.5073 | 1875 | 0.4086 | - | - | | 0.5076 | 1876 | 0.3715 | - | - | | 0.5078 | 1877 | 0.2916 | - | - | | 0.5081 | 1878 | 0.3878 | - | - | | 0.5084 | 1879 | 0.2256 | - | - | | 0.5087 | 1880 | 0.3621 | - | - | | 0.5089 | 1881 | 0.3058 | - | - | | 0.5092 | 1882 | 0.2529 | - | - | | 0.5095 | 1883 | 0.3109 | - | - | | 0.5097 | 1884 | 0.2243 | - | - | | 0.5100 | 1885 | 0.3431 | - | - | | 0.5103 | 1886 | 0.2336 | - | - | | 0.5106 | 1887 | 0.27 | - | - | | 0.5108 | 1888 | 0.3208 | - | - | | 0.5111 | 1889 | 0.3423 | - | - | | 0.5114 | 1890 | 0.2694 | - | - | | 0.5116 | 1891 | 0.2481 | - | - | | 0.5119 | 1892 | 0.2123 | - | - | | 0.5122 | 1893 | 0.2194 | - | - | | 0.5124 | 1894 | 0.2689 | - | - | | 0.5127 | 1895 | 0.2497 | - | - | | 0.5130 | 1896 | 0.4563 | - | - | | 0.5133 | 1897 | 0.3217 | - | - | | 0.5135 | 1898 | 0.2701 | - | - | | 0.5138 | 1899 | 0.3277 | - | - | | 0.5141 | 1900 | 0.2497 | - | - | | 0.5143 | 1901 | 0.2675 | - | - | | 0.5146 | 1902 | 0.3395 | - | - | | 0.5149 | 1903 | 0.2584 | - | - | | 0.5152 | 1904 | 0.2613 | - | - | | 0.5154 | 1905 | 0.3257 | - | - | | 0.5157 | 1906 | 0.3223 | - | - | | 0.5160 | 1907 | 0.2112 | - | - | | 0.5162 | 1908 | 0.3107 | - | - | | 0.5165 | 1909 | 0.3503 | - | - | | 0.5168 | 1910 | 0.3177 | - | - | | 0.5170 | 1911 | 0.3069 | - | - | | 0.5173 | 1912 | 0.3046 | - | - | | 0.5176 | 1913 | 0.2277 | - | - | | 0.5179 | 1914 | 0.3281 | - | - | | 0.5181 | 1915 | 0.3666 | - | - | | 0.5184 | 1916 | 0.2777 | - | - | | 0.5187 | 1917 | 0.2379 | - | - | | 0.5189 | 
1918 | 0.2897 | - | - | | 0.5192 | 1919 | 0.3631 | - | - | | 0.5195 | 1920 | 0.3179 | - | - | | 0.5198 | 1921 | 0.3676 | - | - | | 0.5200 | 1922 | 0.2914 | - | - | | 0.5203 | 1923 | 0.3635 | - | - | | 0.5206 | 1924 | 0.3318 | - | - | | 0.5208 | 1925 | 0.2351 | - | - | | 0.5211 | 1926 | 0.2477 | - | - | | 0.5214 | 1927 | 0.4694 | - | - | | 0.5216 | 1928 | 0.4056 | - | - | | 0.5219 | 1929 | 0.2271 | - | - | | 0.5222 | 1930 | 0.2666 | - | - | | 0.5225 | 1931 | 0.3668 | - | - | | 0.5227 | 1932 | 0.2946 | - | - | | 0.5230 | 1933 | 0.42 | - | - | | 0.5233 | 1934 | 0.2849 | - | - | | 0.5235 | 1935 | 0.3238 | - | - | | 0.5238 | 1936 | 0.2245 | - | - | | 0.5241 | 1937 | 0.2493 | - | - | | 0.5244 | 1938 | 0.2863 | - | - | | 0.5246 | 1939 | 0.338 | - | - | | 0.5249 | 1940 | 0.2275 | - | - | | 0.5252 | 1941 | 0.2411 | - | - | | 0.5254 | 1942 | 0.2467 | - | - | | 0.5257 | 1943 | 0.23 | - | - | | 0.5260 | 1944 | 0.2498 | - | - | | 0.5262 | 1945 | 0.3139 | - | - | | 0.5265 | 1946 | 0.342 | - | - | | 0.5268 | 1947 | 0.3005 | - | - | | 0.5271 | 1948 | 0.2178 | - | - | | 0.5273 | 1949 | 0.3728 | - | - | | 0.5276 | 1950 | 0.2949 | - | - | | 0.5279 | 1951 | 0.316 | - | - | | 0.5281 | 1952 | 0.3004 | - | - | | 0.5284 | 1953 | 0.3251 | - | - | | 0.5287 | 1954 | 0.2766 | - | - | | 0.5290 | 1955 | 0.3627 | - | - | | 0.5292 | 1956 | 0.343 | - | - | | 0.5295 | 1957 | 0.237 | - | - | | 0.5298 | 1958 | 0.3486 | - | - | | 0.5300 | 1959 | 0.2624 | - | - | | 0.5303 | 1960 | 0.2155 | - | - | | 0.5306 | 1961 | 0.3794 | - | - | | 0.5308 | 1962 | 0.3156 | - | - | | 0.5311 | 1963 | 0.2169 | - | - | | 0.5314 | 1964 | 0.3322 | - | - | | 0.5317 | 1965 | 0.2329 | - | - | | 0.5319 | 1966 | 0.2293 | - | - | | 0.5322 | 1967 | 0.2906 | - | - | | 0.5325 | 1968 | 0.2861 | - | - | | 0.5327 | 1969 | 0.2874 | - | - | | 0.5330 | 1970 | 0.2998 | - | - | | 0.5333 | 1971 | 0.2696 | - | - | | 0.5335 | 1972 | 0.2532 | - | - | | 0.5338 | 1973 | 0.3712 | - | - | | 0.5341 | 1974 | 0.2441 | - | - | | 0.5344 | 1975 | 0.24 | 
- | - | | 0.5346 | 1976 | 0.1971 | - | - | | 0.5349 | 1977 | 0.3948 | - | - | | 0.5352 | 1978 | 0.239 | - | - | | 0.5354 | 1979 | 0.2925 | - | - | | 0.5357 | 1980 | 0.245 | - | - | | 0.5360 | 1981 | 0.3199 | - | - | | 0.5363 | 1982 | 0.2454 | - | - | | 0.5365 | 1983 | 0.2698 | - | - | | 0.5368 | 1984 | 0.2832 | - | - | | 0.5371 | 1985 | 0.2837 | - | - | | 0.5373 | 1986 | 0.2472 | - | - | | 0.5376 | 1987 | 0.246 | - | - | | 0.5379 | 1988 | 0.3966 | - | - | | 0.5381 | 1989 | 0.2866 | - | - | | 0.5384 | 1990 | 0.2489 | - | - | | 0.5387 | 1991 | 0.3617 | - | - | | 0.5390 | 1992 | 0.2477 | - | - | | 0.5392 | 1993 | 0.3498 | - | - | | 0.5395 | 1994 | 0.3244 | - | - | | 0.5398 | 1995 | 0.2445 | - | - | | 0.5400 | 1996 | 0.2113 | - | - | | 0.5403 | 1997 | 0.2809 | - | - | | 0.5406 | 1998 | 0.3882 | - | - | | 0.5409 | 1999 | 0.2979 | - | - | | 0.5411 | 2000 | 0.399 | 0.2678 | 0.9314 | | 0.5414 | 2001 | 0.2064 | - | - | | 0.5417 | 2002 | 0.3161 | - | - | | 0.5419 | 2003 | 0.2666 | - | - | | 0.5422 | 2004 | 0.2437 | - | - | | 0.5425 | 2005 | 0.2439 | - | - | | 0.5427 | 2006 | 0.3509 | - | - | | 0.5430 | 2007 | 0.2798 | - | - | | 0.5433 | 2008 | 0.3807 | - | - | | 0.5436 | 2009 | 0.269 | - | - | | 0.5438 | 2010 | 0.2997 | - | - | | 0.5441 | 2011 | 0.2002 | - | - | | 0.5444 | 2012 | 0.2117 | - | - | | 0.5446 | 2013 | 0.2889 | - | - | | 0.5449 | 2014 | 0.28 | - | - | | 0.5452 | 2015 | 0.2477 | - | - | | 0.5455 | 2016 | 0.2559 | - | - | | 0.5457 | 2017 | 0.306 | - | - | | 0.5460 | 2018 | 0.3516 | - | - | | 0.5463 | 2019 | 0.2488 | - | - | | 0.5465 | 2020 | 0.2363 | - | - | | 0.5468 | 2021 | 0.2869 | - | - | | 0.5471 | 2022 | 0.2523 | - | - | | 0.5473 | 2023 | 0.2398 | - | - | | 0.5476 | 2024 | 0.2757 | - | - | | 0.5479 | 2025 | 0.3994 | - | - | | 0.5482 | 2026 | 0.1951 | - | - | | 0.5484 | 2027 | 0.3219 | - | - | | 0.5487 | 2028 | 0.2246 | - | - | | 0.5490 | 2029 | 0.2777 | - | - | | 0.5492 | 2030 | 0.2702 | - | - | | 0.5495 | 2031 | 0.2086 | - | - | | 0.5498 | 2032 | 0.2793 | - 
| - | | 0.5501 | 2033 | 0.291 | - | - | | 0.5503 | 2034 | 0.37 | - | - | | 0.5506 | 2035 | 0.3038 | - | - | | 0.5509 | 2036 | 0.3384 | - | - | | 0.5511 | 2037 | 0.4532 | - | - | | 0.5514 | 2038 | 0.316 | - | - | | 0.5517 | 2039 | 0.2454 | - | - | | 0.5519 | 2040 | 0.3251 | - | - | | 0.5522 | 2041 | 0.3017 | - | - | | 0.5525 | 2042 | 0.2204 | - | - | | 0.5528 | 2043 | 0.3318 | - | - | | 0.5530 | 2044 | 0.3603 | - | - | | 0.5533 | 2045 | 0.2446 | - | - | | 0.5536 | 2046 | 0.2995 | - | - | | 0.5538 | 2047 | 0.3583 | - | - | | 0.5541 | 2048 | 0.246 | - | - | | 0.5544 | 2049 | 0.2273 | - | - | | 0.5547 | 2050 | 0.2741 | - | - | | 0.5549 | 2051 | 0.3038 | - | - | | 0.5552 | 2052 | 0.3163 | - | - | | 0.5555 | 2053 | 0.2569 | - | - | | 0.5557 | 2054 | 0.2942 | - | - | | 0.5560 | 2055 | 0.308 | - | - | | 0.5563 | 2056 | 0.2759 | - | - | | 0.5565 | 2057 | 0.2483 | - | - | | 0.5568 | 2058 | 0.3376 | - | - | | 0.5571 | 2059 | 0.3598 | - | - | | 0.5574 | 2060 | 0.3304 | - | - | | 0.5576 | 2061 | 0.2743 | - | - | | 0.5579 | 2062 | 0.296 | - | - | | 0.5582 | 2063 | 0.2501 | - | - | | 0.5584 | 2064 | 0.2168 | - | - | | 0.5587 | 2065 | 0.4365 | - | - | | 0.5590 | 2066 | 0.3181 | - | - | | 0.5593 | 2067 | 0.2537 | - | - | | 0.5595 | 2068 | 0.377 | - | - | | 0.5598 | 2069 | 0.2038 | - | - | | 0.5601 | 2070 | 0.2498 | - | - | | 0.5603 | 2071 | 0.3063 | - | - | | 0.5606 | 2072 | 0.2288 | - | - | | 0.5609 | 2073 | 0.2999 | - | - | | 0.5611 | 2074 | 0.3542 | - | - | | 0.5614 | 2075 | 0.3596 | - | - | | 0.5617 | 2076 | 0.2293 | - | - | | 0.5620 | 2077 | 0.2885 | - | - | | 0.5622 | 2078 | 0.2734 | - | - | | 0.5625 | 2079 | 0.2597 | - | - | | 0.5628 | 2080 | 0.3531 | - | - | | 0.5630 | 2081 | 0.3777 | - | - | | 0.5633 | 2082 | 0.249 | - | - | | 0.5636 | 2083 | 0.2936 | - | - | | 0.5639 | 2084 | 0.2867 | - | - | | 0.5641 | 2085 | 0.4155 | - | - | | 0.5644 | 2086 | 0.3695 | - | - | | 0.5647 | 2087 | 0.2154 | - | - | | 0.5649 | 2088 | 0.2208 | - | - | | 0.5652 | 2089 | 0.3174 | - | - | | 
0.5655 | 2090 | 0.294 | - | - | | 0.5657 | 2091 | 0.2839 | - | - | | 0.5660 | 2092 | 0.3503 | - | - | | 0.5663 | 2093 | 0.2936 | - | - | | 0.5666 | 2094 | 0.3694 | - | - | | 0.5668 | 2095 | 0.3173 | - | - | | 0.5671 | 2096 | 0.3551 | - | - | | 0.5674 | 2097 | 0.3028 | - | - | | 0.5676 | 2098 | 0.2202 | - | - | | 0.5679 | 2099 | 0.2847 | - | - | | 0.5682 | 2100 | 0.2535 | - | - | | 0.5685 | 2101 | 0.2532 | - | - | | 0.5687 | 2102 | 0.3547 | - | - | | 0.5690 | 2103 | 0.3576 | - | - | | 0.5693 | 2104 | 0.2252 | - | - | | 0.5695 | 2105 | 0.2664 | - | - | | 0.5698 | 2106 | 0.3307 | - | - | | 0.5701 | 2107 | 0.42 | - | - | | 0.5703 | 2108 | 0.2321 | - | - | | 0.5706 | 2109 | 0.4118 | - | - | | 0.5709 | 2110 | 0.3261 | - | - | | 0.5712 | 2111 | 0.3959 | - | - | | 0.5714 | 2112 | 0.253 | - | - | | 0.5717 | 2113 | 0.3074 | - | - | | 0.5720 | 2114 | 0.3498 | - | - | | 0.5722 | 2115 | 0.2863 | - | - | | 0.5725 | 2116 | 0.3714 | - | - | | 0.5728 | 2117 | 0.3077 | - | - | | 0.5731 | 2118 | 0.3554 | - | - | | 0.5733 | 2119 | 0.2585 | - | - | | 0.5736 | 2120 | 0.2943 | - | - | | 0.5739 | 2121 | 0.2876 | - | - | | 0.5741 | 2122 | 0.2613 | - | - | | 0.5744 | 2123 | 0.2841 | - | - | | 0.5747 | 2124 | 0.2297 | - | - | | 0.5749 | 2125 | 0.3207 | - | - | | 0.5752 | 2126 | 0.3327 | - | - | | 0.5755 | 2127 | 0.3357 | - | - | | 0.5758 | 2128 | 0.3354 | - | - | | 0.5760 | 2129 | 0.3158 | - | - | | 0.5763 | 2130 | 0.2815 | - | - | | 0.5766 | 2131 | 0.3044 | - | - | | 0.5768 | 2132 | 0.2506 | - | - | | 0.5771 | 2133 | 0.3979 | - | - | | 0.5774 | 2134 | 0.3119 | - | - | | 0.5777 | 2135 | 0.3 | - | - | | 0.5779 | 2136 | 0.3073 | - | - | | 0.5782 | 2137 | 0.4089 | - | - | | 0.5785 | 2138 | 0.3184 | - | - | | 0.5787 | 2139 | 0.2438 | - | - | | 0.5790 | 2140 | 0.3226 | - | - | | 0.5793 | 2141 | 0.1883 | - | - | | 0.5795 | 2142 | 0.4197 | - | - | | 0.5798 | 2143 | 0.3029 | - | - | | 0.5801 | 2144 | 0.2579 | - | - | | 0.5804 | 2145 | 0.2339 | - | - | | 0.5806 | 2146 | 0.2871 | - | - | | 0.5809 | 
2147 | 0.2637 | - | - | | 0.5812 | 2148 | 0.3334 | - | - | | 0.5814 | 2149 | 0.2687 | - | - | | 0.5817 | 2150 | 0.2881 | - | - | | 0.5820 | 2151 | 0.3424 | - | - | | 0.5823 | 2152 | 0.2728 | - | - | | 0.5825 | 2153 | 0.3442 | - | - | | 0.5828 | 2154 | 0.3509 | - | - | | 0.5831 | 2155 | 0.2791 | - | - | | 0.5833 | 2156 | 0.3674 | - | - | | 0.5836 | 2157 | 0.2768 | - | - | | 0.5839 | 2158 | 0.2527 | - | - | | 0.5841 | 2159 | 0.2698 | - | - | | 0.5844 | 2160 | 0.3248 | - | - | | 0.5847 | 2161 | 0.2899 | - | - | | 0.5850 | 2162 | 0.3093 | - | - | | 0.5852 | 2163 | 0.2712 | - | - | | 0.5855 | 2164 | 0.339 | - | - | | 0.5858 | 2165 | 0.3468 | - | - | | 0.5860 | 2166 | 0.3092 | - | - | | 0.5863 | 2167 | 0.2859 | - | - | | 0.5866 | 2168 | 0.3792 | - | - | | 0.5869 | 2169 | 0.2406 | - | - | | 0.5871 | 2170 | 0.2161 | - | - | | 0.5874 | 2171 | 0.3067 | - | - | | 0.5877 | 2172 | 0.2394 | - | - | | 0.5879 | 2173 | 0.2597 | - | - | | 0.5882 | 2174 | 0.2874 | - | - | | 0.5885 | 2175 | 0.3324 | - | - | | 0.5887 | 2176 | 0.3601 | - | - | | 0.5890 | 2177 | 0.3179 | - | - | | 0.5893 | 2178 | 0.3032 | - | - | | 0.5896 | 2179 | 0.2574 | - | - | | 0.5898 | 2180 | 0.2453 | - | - | | 0.5901 | 2181 | 0.3094 | - | - | | 0.5904 | 2182 | 0.3135 | - | - | | 0.5906 | 2183 | 0.2546 | - | - | | 0.5909 | 2184 | 0.4111 | - | - | | 0.5912 | 2185 | 0.2898 | - | - | | 0.5915 | 2186 | 0.3083 | - | - | | 0.5917 | 2187 | 0.2818 | - | - | | 0.5920 | 2188 | 0.2782 | - | - | | 0.5923 | 2189 | 0.2909 | - | - | | 0.5925 | 2190 | 0.276 | - | - | | 0.5928 | 2191 | 0.2479 | - | - | | 0.5931 | 2192 | 0.2487 | - | - | | 0.5933 | 2193 | 0.2691 | - | - | | 0.5936 | 2194 | 0.3399 | - | - | | 0.5939 | 2195 | 0.3491 | - | - | | 0.5942 | 2196 | 0.2898 | - | - | | 0.5944 | 2197 | 0.3755 | - | - | | 0.5947 | 2198 | 0.3055 | - | - | | 0.5950 | 2199 | 0.3656 | - | - | | 0.5952 | 2200 | 0.2695 | - | - | | 0.5955 | 2201 | 0.2354 | - | - | | 0.5958 | 2202 | 0.3539 | - | - | | 0.5960 | 2203 | 0.2864 | - | - | | 0.5963 | 2204 | 
0.2922 | - | - | | 0.5966 | 2205 | 0.3674 | - | - | | 0.5969 | 2206 | 0.287 | - | - | | 0.5971 | 2207 | 0.2651 | - | - | | 0.5974 | 2208 | 0.249 | - | - | | 0.5977 | 2209 | 0.2539 | - | - | | 0.5979 | 2210 | 0.1918 | - | - | | 0.5982 | 2211 | 0.3314 | - | - | | 0.5985 | 2212 | 0.2279 | - | - | | 0.5988 | 2213 | 0.1887 | - | - | | 0.5990 | 2214 | 0.3379 | - | - | | 0.5993 | 2215 | 0.2797 | - | - | | 0.5996 | 2216 | 0.3552 | - | - | | 0.5998 | 2217 | 0.3429 | - | - | | 0.6001 | 2218 | 0.2063 | - | - | | 0.6004 | 2219 | 0.2548 | - | - | | 0.6006 | 2220 | 0.2537 | - | - | | 0.6009 | 2221 | 0.1857 | - | - | | 0.6012 | 2222 | 0.3095 | - | - | | 0.6015 | 2223 | 0.3029 | - | - | | 0.6017 | 2224 | 0.3682 | - | - | | 0.6020 | 2225 | 0.3338 | - | - | | 0.6023 | 2226 | 0.2174 | - | - | | 0.6025 | 2227 | 0.335 | - | - | | 0.6028 | 2228 | 0.2682 | - | - | | 0.6031 | 2229 | 0.3726 | - | - | | 0.6034 | 2230 | 0.2252 | - | - | | 0.6036 | 2231 | 0.2663 | - | - | | 0.6039 | 2232 | 0.2949 | - | - | | 0.6042 | 2233 | 0.2843 | - | - | | 0.6044 | 2234 | 0.3394 | - | - | | 0.6047 | 2235 | 0.2517 | - | - | | 0.6050 | 2236 | 0.2061 | - | - | | 0.6052 | 2237 | 0.2414 | - | - | | 0.6055 | 2238 | 0.3274 | - | - | | 0.6058 | 2239 | 0.216 | - | - | | 0.6061 | 2240 | 0.1866 | - | - | | 0.6063 | 2241 | 0.4304 | - | - | | 0.6066 | 2242 | 0.2431 | - | - | | 0.6069 | 2243 | 0.2326 | - | - | | 0.6071 | 2244 | 0.247 | - | - | | 0.6074 | 2245 | 0.2964 | - | - | | 0.6077 | 2246 | 0.2624 | - | - | | 0.6080 | 2247 | 0.3184 | - | - | | 0.6082 | 2248 | 0.226 | - | - | | 0.6085 | 2249 | 0.3127 | - | - | | 0.6088 | 2250 | 0.2279 | - | - | | 0.6090 | 2251 | 0.2563 | - | - | | 0.6093 | 2252 | 0.2418 | - | - | | 0.6096 | 2253 | 0.3044 | - | - | | 0.6098 | 2254 | 0.258 | - | - | | 0.6101 | 2255 | 0.2761 | - | - | | 0.6104 | 2256 | 0.3092 | - | - | | 0.6107 | 2257 | 0.3105 | - | - | | 0.6109 | 2258 | 0.2856 | - | - | | 0.6112 | 2259 | 0.3125 | - | - | | 0.6115 | 2260 | 0.3687 | - | - | | 0.6117 | 2261 | 0.3406 | - 
| - | | 0.6120 | 2262 | 0.1985 | - | - | | 0.6123 | 2263 | 0.3442 | - | - | | 0.6126 | 2264 | 0.3027 | - | - | | 0.6128 | 2265 | 0.3087 | - | - | | 0.6131 | 2266 | 0.3757 | - | - | | 0.6134 | 2267 | 0.2585 | - | - | | 0.6136 | 2268 | 0.2712 | - | - | | 0.6139 | 2269 | 0.2363 | - | - | | 0.6142 | 2270 | 0.2473 | - | - | | 0.6144 | 2271 | 0.2944 | - | - | | 0.6147 | 2272 | 0.2439 | - | - | | 0.6150 | 2273 | 0.3544 | - | - | | 0.6153 | 2274 | 0.2928 | - | - | | 0.6155 | 2275 | 0.3404 | - | - | | 0.6158 | 2276 | 0.2161 | - | - | | 0.6161 | 2277 | 0.2196 | - | - | | 0.6163 | 2278 | 0.3405 | - | - | | 0.6166 | 2279 | 0.3401 | - | - | | 0.6169 | 2280 | 0.338 | - | - | | 0.6172 | 2281 | 0.2941 | - | - | | 0.6174 | 2282 | 0.2742 | - | - | | 0.6177 | 2283 | 0.3155 | - | - | | 0.6180 | 2284 | 0.4023 | - | - | | 0.6182 | 2285 | 0.409 | - | - | | 0.6185 | 2286 | 0.2207 | - | - | | 0.6188 | 2287 | 0.2972 | - | - | | 0.6190 | 2288 | 0.2947 | - | - | | 0.6193 | 2289 | 0.2996 | - | - | | 0.6196 | 2290 | 0.3907 | - | - | | 0.6199 | 2291 | 0.3064 | - | - | | 0.6201 | 2292 | 0.3847 | - | - | | 0.6204 | 2293 | 0.2248 | - | - | | 0.6207 | 2294 | 0.2749 | - | - | | 0.6209 | 2295 | 0.2702 | - | - | | 0.6212 | 2296 | 0.3082 | - | - | | 0.6215 | 2297 | 0.2209 | - | - | | 0.6218 | 2298 | 0.238 | - | - | | 0.6220 | 2299 | 0.251 | - | - | | 0.6223 | 2300 | 0.3533 | - | - | | 0.6226 | 2301 | 0.2615 | - | - | | 0.6228 | 2302 | 0.381 | - | - | | 0.6231 | 2303 | 0.2406 | - | - | | 0.6234 | 2304 | 0.2205 | - | - | | 0.6236 | 2305 | 0.2698 | - | - | | 0.6239 | 2306 | 0.2858 | - | - | | 0.6242 | 2307 | 0.262 | - | - | | 0.6245 | 2308 | 0.3542 | - | - | | 0.6247 | 2309 | 0.2825 | - | - | | 0.625 | 2310 | 0.3249 | - | - | | 0.6253 | 2311 | 0.2983 | - | - | | 0.6255 | 2312 | 0.3013 | - | - | | 0.6258 | 2313 | 0.3104 | - | - | | 0.6261 | 2314 | 0.2585 | - | - | | 0.6264 | 2315 | 0.2017 | - | - | | 0.6266 | 2316 | 0.4107 | - | - | | 0.6269 | 2317 | 0.2962 | - | - | | 0.6272 | 2318 | 0.1942 | - | - | | 
0.6274 | 2319 | 0.2256 | - | - | | 0.6277 | 2320 | 0.2116 | - | - | | 0.6280 | 2321 | 0.2439 | - | - | | 0.6282 | 2322 | 0.2347 | - | - | | 0.6285 | 2323 | 0.3316 | - | - | | 0.6288 | 2324 | 0.3487 | - | - | | 0.6291 | 2325 | 0.2996 | - | - | | 0.6293 | 2326 | 0.2506 | - | - | | 0.6296 | 2327 | 0.1683 | - | - | | 0.6299 | 2328 | 0.2852 | - | - | | 0.6301 | 2329 | 0.2702 | - | - | | 0.6304 | 2330 | 0.323 | - | - | | 0.6307 | 2331 | 0.2731 | - | - | | 0.6310 | 2332 | 0.3592 | - | - | | 0.6312 | 2333 | 0.2112 | - | - | | 0.6315 | 2334 | 0.2586 | - | - | | 0.6318 | 2335 | 0.3417 | - | - | | 0.6320 | 2336 | 0.3256 | - | - | | 0.6323 | 2337 | 0.3698 | - | - | | 0.6326 | 2338 | 0.2569 | - | - | | 0.6328 | 2339 | 0.2615 | - | - | | 0.6331 | 2340 | 0.3294 | - | - | | 0.6334 | 2341 | 0.234 | - | - | | 0.6337 | 2342 | 0.3432 | - | - | | 0.6339 | 2343 | 0.3524 | - | - | | 0.6342 | 2344 | 0.2213 | - | - | | 0.6345 | 2345 | 0.3087 | - | - | | 0.6347 | 2346 | 0.2194 | - | - | | 0.6350 | 2347 | 0.3035 | - | - | | 0.6353 | 2348 | 0.3747 | - | - | | 0.6356 | 2349 | 0.1571 | - | - | | 0.6358 | 2350 | 0.3386 | - | - | | 0.6361 | 2351 | 0.2896 | - | - | | 0.6364 | 2352 | 0.2524 | - | - | | 0.6366 | 2353 | 0.2272 | - | - | | 0.6369 | 2354 | 0.2595 | - | - | | 0.6372 | 2355 | 0.2124 | - | - | | 0.6374 | 2356 | 0.3861 | - | - | | 0.6377 | 2357 | 0.2815 | - | - | | 0.6380 | 2358 | 0.3098 | - | - | | 0.6383 | 2359 | 0.2382 | - | - | | 0.6385 | 2360 | 0.2409 | - | - | | 0.6388 | 2361 | 0.2541 | - | - | | 0.6391 | 2362 | 0.2816 | - | - | | 0.6393 | 2363 | 0.2915 | - | - | | 0.6396 | 2364 | 0.4063 | - | - | | 0.6399 | 2365 | 0.2847 | - | - | | 0.6402 | 2366 | 0.4259 | - | - | | 0.6404 | 2367 | 0.2182 | - | - | | 0.6407 | 2368 | 0.2909 | - | - | | 0.6410 | 2369 | 0.2814 | - | - | | 0.6412 | 2370 | 0.3453 | - | - | | 0.6415 | 2371 | 0.276 | - | - | | 0.6418 | 2372 | 0.3554 | - | - | | 0.6420 | 2373 | 0.2861 | - | - | | 0.6423 | 2374 | 0.3658 | - | - | | 0.6426 | 2375 | 0.2899 | - | - | | 0.6429 
| 2376 | 0.3662 | - | - | | 0.6431 | 2377 | 0.4045 | - | - | | 0.6434 | 2378 | 0.2546 | - | - | | 0.6437 | 2379 | 0.2281 | - | - | | 0.6439 | 2380 | 0.2781 | - | - | | 0.6442 | 2381 | 0.285 | - | - | | 0.6445 | 2382 | 0.2797 | - | - | | 0.6448 | 2383 | 0.3226 | - | - | | 0.6450 | 2384 | 0.3242 | - | - | | 0.6453 | 2385 | 0.3247 | - | - | | 0.6456 | 2386 | 0.2552 | - | - | | 0.6458 | 2387 | 0.3265 | - | - | | 0.6461 | 2388 | 0.3195 | - | - | | 0.6464 | 2389 | 0.2531 | - | - | | 0.6466 | 2390 | 0.3098 | - | - | | 0.6469 | 2391 | 0.244 | - | - | | 0.6472 | 2392 | 0.2282 | - | - | | 0.6475 | 2393 | 0.2532 | - | - | | 0.6477 | 2394 | 0.2401 | - | - | | 0.6480 | 2395 | 0.2609 | - | - | | 0.6483 | 2396 | 0.2354 | - | - | | 0.6485 | 2397 | 0.3855 | - | - | | 0.6488 | 2398 | 0.3155 | - | - | | 0.6491 | 2399 | 0.3111 | - | - | | 0.6494 | 2400 | 0.3778 | - | - | | 0.6496 | 2401 | 0.2335 | - | - | | 0.6499 | 2402 | 0.2853 | - | - | | 0.6502 | 2403 | 0.2713 | - | - | | 0.6504 | 2404 | 0.242 | - | - | | 0.6507 | 2405 | 0.2572 | - | - | | 0.6510 | 2406 | 0.2518 | - | - | | 0.6512 | 2407 | 0.3437 | - | - | | 0.6515 | 2408 | 0.3398 | - | - | | 0.6518 | 2409 | 0.3695 | - | - | | 0.6521 | 2410 | 0.2844 | - | - | | 0.6523 | 2411 | 0.3704 | - | - | | 0.6526 | 2412 | 0.3119 | - | - | | 0.6529 | 2413 | 0.3752 | - | - | | 0.6531 | 2414 | 0.2794 | - | - | | 0.6534 | 2415 | 0.3034 | - | - | | 0.6537 | 2416 | 0.3382 | - | - | | 0.6540 | 2417 | 0.2797 | - | - | | 0.6542 | 2418 | 0.261 | - | - | | 0.6545 | 2419 | 0.2327 | - | - | | 0.6548 | 2420 | 0.3467 | - | - | | 0.6550 | 2421 | 0.2786 | - | - | | 0.6553 | 2422 | 0.255 | - | - | | 0.6556 | 2423 | 0.4057 | - | - | | 0.6558 | 2424 | 0.2607 | - | - | | 0.6561 | 2425 | 0.2534 | - | - | | 0.6564 | 2426 | 0.2219 | - | - | | 0.6567 | 2427 | 0.283 | - | - | | 0.6569 | 2428 | 0.3278 | - | - | | 0.6572 | 2429 | 0.2684 | - | - | | 0.6575 | 2430 | 0.2302 | - | - | | 0.6577 | 2431 | 0.2384 | - | - | | 0.6580 | 2432 | 0.2836 | - | - | | 0.6583 | 2433 | 
0.4131 | - | - | | 0.6585 | 2434 | 0.4277 | - | - | | 0.6588 | 2435 | 0.2739 | - | - | | 0.6591 | 2436 | 0.3359 | - | - | | 0.6594 | 2437 | 0.3241 | - | - | | 0.6596 | 2438 | 0.3082 | - | - | | 0.6599 | 2439 | 0.3264 | - | - | | 0.6602 | 2440 | 0.2759 | - | - | | 0.6604 | 2441 | 0.4188 | - | - | | 0.6607 | 2442 | 0.3656 | - | - | | 0.6610 | 2443 | 0.1993 | - | - | | 0.6613 | 2444 | 0.3154 | - | - | | 0.6615 | 2445 | 0.3203 | - | - | | 0.6618 | 2446 | 0.3453 | - | - | | 0.6621 | 2447 | 0.2271 | - | - | | 0.6623 | 2448 | 0.252 | - | - | | 0.6626 | 2449 | 0.2531 | - | - | | 0.6629 | 2450 | 0.2652 | - | - | | 0.6631 | 2451 | 0.2153 | - | - | | 0.6634 | 2452 | 0.2776 | - | - | | 0.6637 | 2453 | 0.3642 | - | - | | 0.6640 | 2454 | 0.241 | - | - | | 0.6642 | 2455 | 0.2173 | - | - | | 0.6645 | 2456 | 0.1763 | - | - | | 0.6648 | 2457 | 0.2723 | - | - | | 0.6650 | 2458 | 0.2566 | - | - | | 0.6653 | 2459 | 0.2723 | - | - | | 0.6656 | 2460 | 0.3026 | - | - | | 0.6659 | 2461 | 0.337 | - | - | | 0.6661 | 2462 | 0.2832 | - | - | | 0.6664 | 2463 | 0.2556 | - | - | | 0.6667 | 2464 | 0.2706 | - | - | | 0.6669 | 2465 | 0.3769 | - | - | | 0.6672 | 2466 | 0.3274 | - | - | | 0.6675 | 2467 | 0.1768 | - | - | | 0.6677 | 2468 | 0.2716 | - | - | | 0.6680 | 2469 | 0.338 | - | - | | 0.6683 | 2470 | 0.3078 | - | - | | 0.6686 | 2471 | 0.2597 | - | - | | 0.6688 | 2472 | 0.2851 | - | - | | 0.6691 | 2473 | 0.2952 | - | - | | 0.6694 | 2474 | 0.1961 | - | - | | 0.6696 | 2475 | 0.2854 | - | - | | 0.6699 | 2476 | 0.2351 | - | - | | 0.6702 | 2477 | 0.3185 | - | - | | 0.6705 | 2478 | 0.2378 | - | - | | 0.6707 | 2479 | 0.2856 | - | - | | 0.6710 | 2480 | 0.2472 | - | - | | 0.6713 | 2481 | 0.3144 | - | - | | 0.6715 | 2482 | 0.2039 | - | - | | 0.6718 | 2483 | 0.2952 | - | - | | 0.6721 | 2484 | 0.3482 | - | - | | 0.6723 | 2485 | 0.2959 | - | - | | 0.6726 | 2486 | 0.297 | - | - | | 0.6729 | 2487 | 0.3668 | - | - | | 0.6732 | 2488 | 0.2994 | - | - | | 0.6734 | 2489 | 0.2341 | - | - | | 0.6737 | 2490 | 0.3227 | 
- | - | | 0.6740 | 2491 | 0.2986 | - | - | | 0.6742 | 2492 | 0.2647 | - | - | | 0.6745 | 2493 | 0.3624 | - | - | | 0.6748 | 2494 | 0.2772 | - | - | | 0.6751 | 2495 | 0.3145 | - | - | | 0.6753 | 2496 | 0.2543 | - | - | | 0.6756 | 2497 | 0.2592 | - | - | | 0.6759 | 2498 | 0.3121 | - | - | | 0.6761 | 2499 | 0.3368 | - | - | | 0.6764 | 2500 | 0.21 | - | - | | 0.6767 | 2501 | 0.196 | - | - | | 0.6769 | 2502 | 0.2683 | - | - | | 0.6772 | 2503 | 0.2224 | - | - | | 0.6775 | 2504 | 0.2193 | - | - | | 0.6778 | 2505 | 0.2405 | - | - | | 0.6780 | 2506 | 0.3266 | - | - | | 0.6783 | 2507 | 0.2389 | - | - | | 0.6786 | 2508 | 0.2504 | - | - | | 0.6788 | 2509 | 0.3118 | - | - | | 0.6791 | 2510 | 0.3587 | - | - | | 0.6794 | 2511 | 0.2251 | - | - | | 0.6797 | 2512 | 0.3323 | - | - | | 0.6799 | 2513 | 0.2922 | - | - | | 0.6802 | 2514 | 0.3334 | - | - | | 0.6805 | 2515 | 0.2789 | - | - | | 0.6807 | 2516 | 0.3415 | - | - | | 0.6810 | 2517 | 0.2807 | - | - | | 0.6813 | 2518 | 0.2539 | - | - | | 0.6815 | 2519 | 0.2707 | - | - | | 0.6818 | 2520 | 0.3106 | - | - | | 0.6821 | 2521 | 0.3418 | - | - | | 0.6824 | 2522 | 0.2842 | - | - | | 0.6826 | 2523 | 0.3253 | - | - | | 0.6829 | 2524 | 0.2056 | - | - | | 0.6832 | 2525 | 0.2782 | - | - | | 0.6834 | 2526 | 0.3149 | - | - | | 0.6837 | 2527 | 0.3883 | - | - | | 0.6840 | 2528 | 0.3147 | - | - | | 0.6843 | 2529 | 0.235 | - | - | | 0.6845 | 2530 | 0.2653 | - | - | | 0.6848 | 2531 | 0.2709 | - | - | | 0.6851 | 2532 | 0.2617 | - | - | | 0.6853 | 2533 | 0.3593 | - | - | | 0.6856 | 2534 | 0.3428 | - | - | | 0.6859 | 2535 | 0.305 | - | - | | 0.6861 | 2536 | 0.2499 | - | - | | 0.6864 | 2537 | 0.1978 | - | - | | 0.6867 | 2538 | 0.1896 | - | - | | 0.6870 | 2539 | 0.3252 | - | - | | 0.6872 | 2540 | 0.2828 | - | - | | 0.6875 | 2541 | 0.2815 | - | - | | 0.6878 | 2542 | 0.2833 | - | - | | 0.6880 | 2543 | 0.1898 | - | - | | 0.6883 | 2544 | 0.1906 | - | - | | 0.6886 | 2545 | 0.2697 | - | - | | 0.6889 | 2546 | 0.2798 | - | - | | 0.6891 | 2547 | 0.3615 | - | - | | 
0.6894 | 2548 | 0.2493 | - | - | | 0.6897 | 2549 | 0.238 | - | - | | 0.6899 | 2550 | 0.3225 | - | - | | 0.6902 | 2551 | 0.2755 | - | - | | 0.6905 | 2552 | 0.2717 | - | - | | 0.6907 | 2553 | 0.346 | - | - | | 0.6910 | 2554 | 0.1961 | - | - | | 0.6913 | 2555 | 0.2816 | - | - | | 0.6916 | 2556 | 0.3031 | - | - | | 0.6918 | 2557 | 0.3204 | - | - | | 0.6921 | 2558 | 0.2067 | - | - | | 0.6924 | 2559 | 0.2513 | - | - | | 0.6926 | 2560 | 0.291 | - | - | | 0.6929 | 2561 | 0.2655 | - | - | | 0.6932 | 2562 | 0.3215 | - | - | | 0.6935 | 2563 | 0.3614 | - | - | | 0.6937 | 2564 | 0.3519 | - | - | | 0.6940 | 2565 | 0.3836 | - | - | | 0.6943 | 2566 | 0.301 | - | - | | 0.6945 | 2567 | 0.3132 | - | - | | 0.6948 | 2568 | 0.2782 | - | - | | 0.6951 | 2569 | 0.3692 | - | - | | 0.6953 | 2570 | 0.413 | - | - | | 0.6956 | 2571 | 0.3127 | - | - | | 0.6959 | 2572 | 0.181 | - | - | | 0.6962 | 2573 | 0.3427 | - | - | | 0.6964 | 2574 | 0.2982 | - | - | | 0.6967 | 2575 | 0.3722 | - | - | | 0.6970 | 2576 | 0.2658 | - | - | | 0.6972 | 2577 | 0.234 | - | - | | 0.6975 | 2578 | 0.3053 | - | - | | 0.6978 | 2579 | 0.3209 | - | - | | 0.6981 | 2580 | 0.2979 | - | - | | 0.6983 | 2581 | 0.3301 | - | - | | 0.6986 | 2582 | 0.2866 | - | - | | 0.6989 | 2583 | 0.3146 | - | - | | 0.6991 | 2584 | 0.2727 | - | - | | 0.6994 | 2585 | 0.3635 | - | - | | 0.6997 | 2586 | 0.2708 | - | - | | 0.6999 | 2587 | 0.2998 | - | - | | 0.7002 | 2588 | 0.2133 | - | - | | 0.7005 | 2589 | 0.2166 | - | - | | 0.7008 | 2590 | 0.2741 | - | - | | 0.7010 | 2591 | 0.2225 | - | - | | 0.7013 | 2592 | 0.2137 | - | - | | 0.7016 | 2593 | 0.2898 | - | - | | 0.7018 | 2594 | 0.247 | - | - | | 0.7021 | 2595 | 0.1923 | - | - | | 0.7024 | 2596 | 0.2774 | - | - | | 0.7027 | 2597 | 0.2341 | - | - | | 0.7029 | 2598 | 0.3579 | - | - | | 0.7032 | 2599 | 0.3634 | - | - | | 0.7035 | 2600 | 0.184 | - | - | | 0.7037 | 2601 | 0.2285 | - | - | | 0.7040 | 2602 | 0.2542 | - | - | | 0.7043 | 2603 | 0.279 | - | - | | 0.7045 | 2604 | 0.2376 | - | - | | 0.7048 | 2605 
| 0.3331 | - | - | | 0.7051 | 2606 | 0.2338 | - | - | | 0.7054 | 2607 | 0.3463 | - | - | | 0.7056 | 2608 | 0.2671 | - | - | | 0.7059 | 2609 | 0.231 | - | - | | 0.7062 | 2610 | 0.2001 | - | - | | 0.7064 | 2611 | 0.4103 | - | - | | 0.7067 | 2612 | 0.2532 | - | - | | 0.7070 | 2613 | 0.3057 | - | - | | 0.7073 | 2614 | 0.2364 | - | - | | 0.7075 | 2615 | 0.2809 | - | - | | 0.7078 | 2616 | 0.2894 | - | - | | 0.7081 | 2617 | 0.2466 | - | - | | 0.7083 | 2618 | 0.2671 | - | - | | 0.7086 | 2619 | 0.2803 | - | - | | 0.7089 | 2620 | 0.3047 | - | - | | 0.7091 | 2621 | 0.2221 | - | - | | 0.7094 | 2622 | 0.2949 | - | - | | 0.7097 | 2623 | 0.1792 | - | - | | 0.7100 | 2624 | 0.276 | - | - | | 0.7102 | 2625 | 0.2189 | - | - | | 0.7105 | 2626 | 0.3951 | - | - | | 0.7108 | 2627 | 0.232 | - | - | | 0.7110 | 2628 | 0.2308 | - | - | | 0.7113 | 2629 | 0.2023 | - | - | | 0.7116 | 2630 | 0.3196 | - | - | | 0.7119 | 2631 | 0.2111 | - | - | | 0.7121 | 2632 | 0.2745 | - | - | | 0.7124 | 2633 | 0.2338 | - | - | | 0.7127 | 2634 | 0.2205 | - | - | | 0.7129 | 2635 | 0.1885 | - | - | | 0.7132 | 2636 | 0.297 | - | - | | 0.7135 | 2637 | 0.3786 | - | - | | 0.7137 | 2638 | 0.2427 | - | - | | 0.7140 | 2639 | 0.3616 | - | - | | 0.7143 | 2640 | 0.3802 | - | - | | 0.7146 | 2641 | 0.3355 | - | - | | 0.7148 | 2642 | 0.1796 | - | - | | 0.7151 | 2643 | 0.2682 | - | - | | 0.7154 | 2644 | 0.3872 | - | - | | 0.7156 | 2645 | 0.3007 | - | - | | 0.7159 | 2646 | 0.3101 | - | - | | 0.7162 | 2647 | 0.3015 | - | - | | 0.7165 | 2648 | 0.1874 | - | - | | 0.7167 | 2649 | 0.2402 | - | - | | 0.7170 | 2650 | 0.221 | - | - | | 0.7173 | 2651 | 0.2383 | - | - | | 0.7175 | 2652 | 0.3307 | - | - | | 0.7178 | 2653 | 0.3467 | - | - | | 0.7181 | 2654 | 0.2158 | - | - | | 0.7183 | 2655 | 0.2674 | - | - | | 0.7186 | 2656 | 0.3809 | - | - | | 0.7189 | 2657 | 0.2308 | - | - | | 0.7192 | 2658 | 0.3204 | - | - | | 0.7194 | 2659 | 0.2713 | - | - | | 0.7197 | 2660 | 0.1767 | - | - | | 0.7200 | 2661 | 0.2442 | - | - | | 0.7202 | 2662 | 0.2633 
| - | - | | 0.7205 | 2663 | 0.4071 | - | - | | 0.7208 | 2664 | 0.2561 | - | - | | 0.7210 | 2665 | 0.2749 | - | - | | 0.7213 | 2666 | 0.3119 | - | - | | 0.7216 | 2667 | 0.3571 | - | - | | 0.7219 | 2668 | 0.2496 | - | - | | 0.7221 | 2669 | 0.2653 | - | - | | 0.7224 | 2670 | 0.2856 | - | - | | 0.7227 | 2671 | 0.3543 | - | - | | 0.7229 | 2672 | 0.2169 | - | - | | 0.7232 | 2673 | 0.2581 | - | - | | 0.7235 | 2674 | 0.2551 | - | - | | 0.7238 | 2675 | 0.1895 | - | - | | 0.7240 | 2676 | 0.2255 | - | - | | 0.7243 | 2677 | 0.382 | - | - | | 0.7246 | 2678 | 0.2449 | - | - | | 0.7248 | 2679 | 0.3805 | - | - | | 0.7251 | 2680 | 0.3349 | - | - | | 0.7254 | 2681 | 0.3457 | - | - | | 0.7256 | 2682 | 0.2056 | - | - | | 0.7259 | 2683 | 0.2793 | - | - | | 0.7262 | 2684 | 0.17 | - | - | | 0.7265 | 2685 | 0.2592 | - | - | | 0.7267 | 2686 | 0.2623 | - | - | | 0.7270 | 2687 | 0.3401 | - | - | | 0.7273 | 2688 | 0.2505 | - | - | | 0.7275 | 2689 | 0.3273 | - | - | | 0.7278 | 2690 | 0.256 | - | - | | 0.7281 | 2691 | 0.3354 | - | - | | 0.7284 | 2692 | 0.3204 | - | - | | 0.7286 | 2693 | 0.2867 | - | - | | 0.7289 | 2694 | 0.2322 | - | - | | 0.7292 | 2695 | 0.2284 | - | - | | 0.7294 | 2696 | 0.2249 | - | - | | 0.7297 | 2697 | 0.2987 | - | - | | 0.7300 | 2698 | 0.2556 | - | - | | 0.7302 | 2699 | 0.3544 | - | - | | 0.7305 | 2700 | 0.2903 | - | - | | 0.7308 | 2701 | 0.1698 | - | - | | 0.7311 | 2702 | 0.2051 | - | - | | 0.7313 | 2703 | 0.2732 | - | - | | 0.7316 | 2704 | 0.2389 | - | - | | 0.7319 | 2705 | 0.2333 | - | - | | 0.7321 | 2706 | 0.2951 | - | - | | 0.7324 | 2707 | 0.2806 | - | - | | 0.7327 | 2708 | 0.2597 | - | - | | 0.7330 | 2709 | 0.2239 | - | - | | 0.7332 | 2710 | 0.3476 | - | - | | 0.7335 | 2711 | 0.2761 | - | - | | 0.7338 | 2712 | 0.2793 | - | - | | 0.7340 | 2713 | 0.299 | - | - | | 0.7343 | 2714 | 0.3661 | - | - | | 0.7346 | 2715 | 0.2467 | - | - | | 0.7348 | 2716 | 0.2292 | - | - | | 0.7351 | 2717 | 0.2687 | - | - | | 0.7354 | 2718 | 0.2712 | - | - | | 0.7357 | 2719 | 0.256 | - | - | 
| 0.7359 | 2720 | 0.3175 | - | - | | 0.7362 | 2721 | 0.2517 | - | - | | 0.7365 | 2722 | 0.3737 | - | - | | 0.7367 | 2723 | 0.2844 | - | - | | 0.7370 | 2724 | 0.3356 | - | - | | 0.7373 | 2725 | 0.3204 | - | - | | 0.7376 | 2726 | 0.2336 | - | - | | 0.7378 | 2727 | 0.2639 | - | - | | 0.7381 | 2728 | 0.3018 | - | - | | 0.7384 | 2729 | 0.3583 | - | - | | 0.7386 | 2730 | 0.2613 | - | - | | 0.7389 | 2731 | 0.2649 | - | - | | 0.7392 | 2732 | 0.3438 | - | - | | 0.7394 | 2733 | 0.3033 | - | - | | 0.7397 | 2734 | 0.3718 | - | - | | 0.7400 | 2735 | 0.2529 | - | - | | 0.7403 | 2736 | 0.3019 | - | - | | 0.7405 | 2737 | 0.2684 | - | - | | 0.7408 | 2738 | 0.2371 | - | - | | 0.7411 | 2739 | 0.2674 | - | - | | 0.7413 | 2740 | 0.2744 | - | - | | 0.7416 | 2741 | 0.35 | - | - | | 0.7419 | 2742 | 0.2909 | - | - | | 0.7422 | 2743 | 0.4118 | - | - | | 0.7424 | 2744 | 0.2335 | - | - | | 0.7427 | 2745 | 0.2961 | - | - | | 0.7430 | 2746 | 0.2588 | - | - | | 0.7432 | 2747 | 0.1956 | - | - | | 0.7435 | 2748 | 0.2885 | - | - | | 0.7438 | 2749 | 0.2685 | - | - | | 0.7440 | 2750 | 0.3205 | - | - | | 0.7443 | 2751 | 0.2229 | - | - | | 0.7446 | 2752 | 0.2407 | - | - | | 0.7449 | 2753 | 0.2909 | - | - | | 0.7451 | 2754 | 0.2197 | - | - | | 0.7454 | 2755 | 0.2846 | - | - | | 0.7457 | 2756 | 0.3195 | - | - | | 0.7459 | 2757 | 0.3928 | - | - | | 0.7462 | 2758 | 0.2501 | - | - | | 0.7465 | 2759 | 0.2635 | - | - | | 0.7468 | 2760 | 0.2687 | - | - | | 0.7470 | 2761 | 0.2774 | - | - | | 0.7473 | 2762 | 0.2404 | - | - | | 0.7476 | 2763 | 0.2691 | - | - | | 0.7478 | 2764 | 0.2742 | - | - | | 0.7481 | 2765 | 0.3111 | - | - | | 0.7484 | 2766 | 0.2613 | - | - | | 0.7486 | 2767 | 0.2664 | - | - | | 0.7489 | 2768 | 0.227 | - | - | | 0.7492 | 2769 | 0.2653 | - | - | | 0.7495 | 2770 | 0.2577 | - | - | | 0.7497 | 2771 | 0.3636 | - | - | | 0.75 | 2772 | 0.2996 | - | - | | 0.7503 | 2773 | 0.2813 | - | - | | 0.7505 | 2774 | 0.2668 | - | - | | 0.7508 | 2775 | 0.2219 | - | - | | 0.7511 | 2776 | 0.2306 | - | - | | 0.7514 
| 2777 | 0.3742 | - | - | | 0.7516 | 2778 | 0.3119 | - | - | | 0.7519 | 2779 | 0.2839 | - | - | | 0.7522 | 2780 | 0.1479 | - | - | | 0.7524 | 2781 | 0.3496 | - | - | | 0.7527 | 2782 | 0.2045 | - | - | | 0.7530 | 2783 | 0.401 | - | - | | 0.7532 | 2784 | 0.2681 | - | - | | 0.7535 | 2785 | 0.1849 | - | - | | 0.7538 | 2786 | 0.2556 | - | - | | 0.7541 | 2787 | 0.2347 | - | - | | 0.7543 | 2788 | 0.2463 | - | - | | 0.7546 | 2789 | 0.3316 | - | - | | 0.7549 | 2790 | 0.2642 | - | - | | 0.7551 | 2791 | 0.3543 | - | - | | 0.7554 | 2792 | 0.3314 | - | - | | 0.7557 | 2793 | 0.3354 | - | - | | 0.7560 | 2794 | 0.2667 | - | - | | 0.7562 | 2795 | 0.3164 | - | - | | 0.7565 | 2796 | 0.3242 | - | - | | 0.7568 | 2797 | 0.1647 | - | - | | 0.7570 | 2798 | 0.2504 | - | - | | 0.7573 | 2799 | 0.1919 | - | - | | 0.7576 | 2800 | 0.3134 | - | - | | 0.7578 | 2801 | 0.2575 | - | - | | 0.7581 | 2802 | 0.2284 | - | - | | 0.7584 | 2803 | 0.3264 | - | - | | 0.7587 | 2804 | 0.2986 | - | - | | 0.7589 | 2805 | 0.253 | - | - | | 0.7592 | 2806 | 0.3031 | - | - | | 0.7595 | 2807 | 0.3244 | - | - | | 0.7597 | 2808 | 0.2799 | - | - | | 0.7600 | 2809 | 0.3398 | - | - | | 0.7603 | 2810 | 0.2747 | - | - | | 0.7606 | 2811 | 0.2881 | - | - | | 0.7608 | 2812 | 0.1694 | - | - | | 0.7611 | 2813 | 0.2386 | - | - | | 0.7614 | 2814 | 0.3028 | - | - | | 0.7616 | 2815 | 0.3223 | - | - | | 0.7619 | 2816 | 0.2624 | - | - | | 0.7622 | 2817 | 0.2616 | - | - | | 0.7624 | 2818 | 0.3242 | - | - | | 0.7627 | 2819 | 0.3248 | - | - | | 0.7630 | 2820 | 0.2934 | - | - | | 0.7633 | 2821 | 0.2735 | - | - | | 0.7635 | 2822 | 0.3237 | - | - | | 0.7638 | 2823 | 0.248 | - | - | | 0.7641 | 2824 | 0.3122 | - | - | | 0.7643 | 2825 | 0.2497 | - | - | | 0.7646 | 2826 | 0.3125 | - | - | | 0.7649 | 2827 | 0.2535 | - | - | | 0.7652 | 2828 | 0.3046 | - | - | | 0.7654 | 2829 | 0.3407 | - | - | | 0.7657 | 2830 | 0.195 | - | - | | 0.7660 | 2831 | 0.3016 | - | - | | 0.7662 | 2832 | 0.2795 | - | - | | 0.7665 | 2833 | 0.2454 | - | - | | 0.7668 | 2834 | 
0.2925 | - | - | | 0.7670 | 2835 | 0.245 | - | - | | 0.7673 | 2836 | 0.3146 | - | - | | 0.7676 | 2837 | 0.2678 | - | - | | 0.7679 | 2838 | 0.288 | - | - | | 0.7681 | 2839 | 0.2584 | - | - | | 0.7684 | 2840 | 0.3073 | - | - | | 0.7687 | 2841 | 0.3062 | - | - | | 0.7689 | 2842 | 0.2962 | - | - | | 0.7692 | 2843 | 0.2451 | - | - | | 0.7695 | 2844 | 0.2307 | - | - | | 0.7698 | 2845 | 0.2803 | - | - | | 0.7700 | 2846 | 0.3346 | - | - | | 0.7703 | 2847 | 0.3172 | - | - | | 0.7706 | 2848 | 0.2453 | - | - | | 0.7708 | 2849 | 0.2658 | - | - | | 0.7711 | 2850 | 0.335 | - | - | | 0.7714 | 2851 | 0.1815 | - | - | | 0.7716 | 2852 | 0.4011 | - | - | | 0.7719 | 2853 | 0.2839 | - | - | | 0.7722 | 2854 | 0.2595 | - | - | | 0.7725 | 2855 | 0.2974 | - | - | | 0.7727 | 2856 | 0.2776 | - | - | | 0.7730 | 2857 | 0.2275 | - | - | | 0.7733 | 2858 | 0.3231 | - | - | | 0.7735 | 2859 | 0.2119 | - | - | | 0.7738 | 2860 | 0.2961 | - | - | | 0.7741 | 2861 | 0.2804 | - | - | | 0.7744 | 2862 | 0.3033 | - | - | | 0.7746 | 2863 | 0.259 | - | - | | 0.7749 | 2864 | 0.3505 | - | - | | 0.7752 | 2865 | 0.2804 | - | - | | 0.7754 | 2866 | 0.3125 | - | - | | 0.7757 | 2867 | 0.3239 | - | - | | 0.7760 | 2868 | 0.3023 | - | - | | 0.7762 | 2869 | 0.2022 | - | - | | 0.7765 | 2870 | 0.2552 | - | - | | 0.7768 | 2871 | 0.2637 | - | - | | 0.7771 | 2872 | 0.2201 | - | - | | 0.7773 | 2873 | 0.2835 | - | - | | 0.7776 | 2874 | 0.2255 | - | - | | 0.7779 | 2875 | 0.3677 | - | - | | 0.7781 | 2876 | 0.2978 | - | - | | 0.7784 | 2877 | 0.2559 | - | - | | 0.7787 | 2878 | 0.1835 | - | - | | 0.7790 | 2879 | 0.2435 | - | - | | 0.7792 | 2880 | 0.2627 | - | - | | 0.7795 | 2881 | 0.2287 | - | - | | 0.7798 | 2882 | 0.3366 | - | - | | 0.7800 | 2883 | 0.3005 | - | - | | 0.7803 | 2884 | 0.3114 | - | - | | 0.7806 | 2885 | 0.1929 | - | - | | 0.7808 | 2886 | 0.3126 | - | - | | 0.7811 | 2887 | 0.2264 | - | - | | 0.7814 | 2888 | 0.311 | - | - | | 0.7817 | 2889 | 0.2451 | - | - | | 0.7819 | 2890 | 0.2835 | - | - | | 0.7822 | 2891 | 0.3063 | 
- | - | | 0.7825 | 2892 | 0.2619 | - | - | | 0.7827 | 2893 | 0.2697 | - | - | | 0.7830 | 2894 | 0.231 | - | - | | 0.7833 | 2895 | 0.3442 | - | - | | 0.7835 | 2896 | 0.2583 | - | - | | 0.7838 | 2897 | 0.3165 | - | - | | 0.7841 | 2898 | 0.2559 | - | - | | 0.7844 | 2899 | 0.2985 | - | - | | 0.7846 | 2900 | 0.2976 | - | - | | 0.7849 | 2901 | 0.3032 | - | - | | 0.7852 | 2902 | 0.259 | - | - | | 0.7854 | 2903 | 0.3032 | - | - | | 0.7857 | 2904 | 0.2305 | - | - | | 0.7860 | 2905 | 0.2623 | - | - | | 0.7863 | 2906 | 0.3027 | - | - | | 0.7865 | 2907 | 0.2387 | - | - | | 0.7868 | 2908 | 0.2654 | - | - | | 0.7871 | 2909 | 0.3119 | - | - | | 0.7873 | 2910 | 0.384 | - | - | | 0.7876 | 2911 | 0.312 | - | - | | 0.7879 | 2912 | 0.2124 | - | - | | 0.7881 | 2913 | 0.2411 | - | - | | 0.7884 | 2914 | 0.2644 | - | - | | 0.7887 | 2915 | 0.2424 | - | - | | 0.7890 | 2916 | 0.1895 | - | - | | 0.7892 | 2917 | 0.259 | - | - | | 0.7895 | 2918 | 0.3217 | - | - | | 0.7898 | 2919 | 0.3174 | - | - | | 0.7900 | 2920 | 0.3491 | - | - | | 0.7903 | 2921 | 0.2239 | - | - | | 0.7906 | 2922 | 0.2215 | - | - | | 0.7909 | 2923 | 0.2571 | - | - | | 0.7911 | 2924 | 0.3131 | - | - | | 0.7914 | 2925 | 0.3643 | - | - | | 0.7917 | 2926 | 0.2213 | - | - | | 0.7919 | 2927 | 0.2722 | - | - | | 0.7922 | 2928 | 0.3308 | - | - | | 0.7925 | 2929 | 0.2342 | - | - | | 0.7927 | 2930 | 0.2073 | - | - | | 0.7930 | 2931 | 0.2149 | - | - | | 0.7933 | 2932 | 0.3553 | - | - | | 0.7936 | 2933 | 0.2729 | - | - | | 0.7938 | 2934 | 0.2606 | - | - | | 0.7941 | 2935 | 0.4019 | - | - | | 0.7944 | 2936 | 0.2869 | - | - | | 0.7946 | 2937 | 0.2878 | - | - | | 0.7949 | 2938 | 0.1756 | - | - | | 0.7952 | 2939 | 0.2547 | - | - | | 0.7955 | 2940 | 0.2232 | - | - | | 0.7957 | 2941 | 0.3016 | - | - | | 0.7960 | 2942 | 0.3424 | - | - | | 0.7963 | 2943 | 0.2554 | - | - | | 0.7965 | 2944 | 0.2146 | - | - | | 0.7968 | 2945 | 0.3067 | - | - | | 0.7971 | 2946 | 0.3556 | - | - | | 0.7973 | 2947 | 0.1824 | - | - | | 0.7976 | 2948 | 0.2505 | - | - | | 
0.7979 | 2949 | 0.26 | - | - | | 0.7982 | 2950 | 0.2705 | - | - | | 0.7984 | 2951 | 0.1616 | - | - | | 0.7987 | 2952 | 0.2653 | - | - | | 0.7990 | 2953 | 0.3695 | - | - | | 0.7992 | 2954 | 0.295 | - | - | | 0.7995 | 2955 | 0.3118 | - | - | | 0.7998 | 2956 | 0.2737 | - | - | | 0.8001 | 2957 | 0.2709 | - | - | | 0.8003 | 2958 | 0.3451 | - | - | | 0.8006 | 2959 | 0.2414 | - | - | | 0.8009 | 2960 | 0.2359 | - | - | | 0.8011 | 2961 | 0.3508 | - | - | | 0.8014 | 2962 | 0.2349 | - | - | | 0.8017 | 2963 | 0.2665 | - | - | | 0.8019 | 2964 | 0.2782 | - | - | | 0.8022 | 2965 | 0.3534 | - | - | | 0.8025 | 2966 | 0.27 | - | - | | 0.8028 | 2967 | 0.2107 | - | - | | 0.8030 | 2968 | 0.2259 | - | - | | 0.8033 | 2969 | 0.2486 | - | - | | 0.8036 | 2970 | 0.2348 | - | - | | 0.8038 | 2971 | 0.2148 | - | - | | 0.8041 | 2972 | 0.2964 | - | - | | 0.8044 | 2973 | 0.2457 | - | - | | 0.8047 | 2974 | 0.2902 | - | - | | 0.8049 | 2975 | 0.3884 | - | - | | 0.8052 | 2976 | 0.2565 | - | - | | 0.8055 | 2977 | 0.2092 | - | - | | 0.8057 | 2978 | 0.2529 | - | - | | 0.8060 | 2979 | 0.3613 | - | - | | 0.8063 | 2980 | 0.2704 | - | - | | 0.8065 | 2981 | 0.2505 | - | - | | 0.8068 | 2982 | 0.2398 | - | - | | 0.8071 | 2983 | 0.3175 | - | - | | 0.8074 | 2984 | 0.3307 | - | - | | 0.8076 | 2985 | 0.1856 | - | - | | 0.8079 | 2986 | 0.1582 | - | - | | 0.8082 | 2987 | 0.3041 | - | - | | 0.8084 | 2988 | 0.4047 | - | - | | 0.8087 | 2989 | 0.3247 | - | - | | 0.8090 | 2990 | 0.2438 | - | - | | 0.8093 | 2991 | 0.2938 | - | - | | 0.8095 | 2992 | 0.3456 | - | - | | 0.8098 | 2993 | 0.2621 | - | - | | 0.8101 | 2994 | 0.2071 | - | - | | 0.8103 | 2995 | 0.2508 | - | - | | 0.8106 | 2996 | 0.2983 | - | - | | 0.8109 | 2997 | 0.3662 | - | - | | 0.8111 | 2998 | 0.258 | - | - | | 0.8114 | 2999 | 0.3373 | - | - | | 0.8117 | 3000 | 0.2842 | 0.2463 | 0.939 | | 0.8120 | 3001 | 0.2167 | - | - | | 0.8122 | 3002 | 0.2882 | - | - | | 0.8125 | 3003 | 0.3516 | - | - | | 0.8128 | 3004 | 0.3133 | - | - | | 0.8130 | 3005 | 0.2661 | - | - | | 
0.8133 | 3006 | 0.205 | - | - | | 0.8136 | 3007 | 0.2301 | - | - | | 0.8139 | 3008 | 0.2874 | - | - | | 0.8141 | 3009 | 0.3329 | - | - | | 0.8144 | 3010 | 0.3167 | - | - | | 0.8147 | 3011 | 0.3157 | - | - | | 0.8149 | 3012 | 0.2768 | - | - | | 0.8152 | 3013 | 0.364 | - | - | | 0.8155 | 3014 | 0.3954 | - | - | | 0.8157 | 3015 | 0.2136 | - | - | | 0.8160 | 3016 | 0.2489 | - | - | | 0.8163 | 3017 | 0.2709 | - | - | | 0.8166 | 3018 | 0.1953 | - | - | | 0.8168 | 3019 | 0.3296 | - | - | | 0.8171 | 3020 | 0.2012 | - | - | | 0.8174 | 3021 | 0.2605 | - | - | | 0.8176 | 3022 | 0.223 | - | - | | 0.8179 | 3023 | 0.2657 | - | - | | 0.8182 | 3024 | 0.1954 | - | - | | 0.8185 | 3025 | 0.3515 | - | - | | 0.8187 | 3026 | 0.2661 | - | - | | 0.8190 | 3027 | 0.3212 | - | - | | 0.8193 | 3028 | 0.314 | - | - | | 0.8195 | 3029 | 0.2716 | - | - | | 0.8198 | 3030 | 0.2552 | - | - | | 0.8201 | 3031 | 0.2776 | - | - | | 0.8203 | 3032 | 0.2944 | - | - | | 0.8206 | 3033 | 0.2357 | - | - | | 0.8209 | 3034 | 0.3429 | - | - | | 0.8212 | 3035 | 0.2988 | - | - | | 0.8214 | 3036 | 0.2401 | - | - | | 0.8217 | 3037 | 0.3858 | - | - | | 0.8220 | 3038 | 0.2472 | - | - | | 0.8222 | 3039 | 0.2771 | - | - | | 0.8225 | 3040 | 0.2495 | - | - | | 0.8228 | 3041 | 0.3205 | - | - | | 0.8231 | 3042 | 0.3045 | - | - | | 0.8233 | 3043 | 0.3616 | - | - | | 0.8236 | 3044 | 0.3196 | - | - | | 0.8239 | 3045 | 0.2343 | - | - | | 0.8241 | 3046 | 0.2493 | - | - | | 0.8244 | 3047 | 0.2698 | - | - | | 0.8247 | 3048 | 0.3394 | - | - | | 0.8249 | 3049 | 0.1746 | - | - | | 0.8252 | 3050 | 0.3301 | - | - | | 0.8255 | 3051 | 0.3184 | - | - | | 0.8258 | 3052 | 0.284 | - | - | | 0.8260 | 3053 | 0.1808 | - | - | | 0.8263 | 3054 | 0.2051 | - | - | | 0.8266 | 3055 | 0.2614 | - | - | | 0.8268 | 3056 | 0.3111 | - | - | | 0.8271 | 3057 | 0.2876 | - | - | | 0.8274 | 3058 | 0.2852 | - | - | | 0.8277 | 3059 | 0.1958 | - | - | | 0.8279 | 3060 | 0.3147 | - | - | | 0.8282 | 3061 | 0.263 | - | - | | 0.8285 | 3062 | 0.3076 | - | - | | 0.8287 | 
3063 | 0.2385 | - | - | | 0.8290 | 3064 | 0.3743 | - | - | | 0.8293 | 3065 | 0.2613 | - | - | | 0.8295 | 3066 | 0.2823 | - | - | | 0.8298 | 3067 | 0.3058 | - | - | | 0.8301 | 3068 | 0.2326 | - | - | | 0.8304 | 3069 | 0.3002 | - | - | | 0.8306 | 3070 | 0.3025 | - | - | | 0.8309 | 3071 | 0.3951 | - | - | | 0.8312 | 3072 | 0.2093 | - | - | | 0.8314 | 3073 | 0.3285 | - | - | | 0.8317 | 3074 | 0.2716 | - | - | | 0.8320 | 3075 | 0.2095 | - | - | | 0.8323 | 3076 | 0.2986 | - | - | | 0.8325 | 3077 | 0.2764 | - | - | | 0.8328 | 3078 | 0.2146 | - | - | | 0.8331 | 3079 | 0.1981 | - | - | | 0.8333 | 3080 | 0.2872 | - | - | | 0.8336 | 3081 | 0.3202 | - | - | | 0.8339 | 3082 | 0.251 | - | - | | 0.8341 | 3083 | 0.3631 | - | - | | 0.8344 | 3084 | 0.354 | - | - | | 0.8347 | 3085 | 0.2695 | - | - | | 0.8350 | 3086 | 0.2431 | - | - | | 0.8352 | 3087 | 0.239 | - | - | | 0.8355 | 3088 | 0.276 | - | - | | 0.8358 | 3089 | 0.2483 | - | - | | 0.8360 | 3090 | 0.2149 | - | - | | 0.8363 | 3091 | 0.1859 | - | - | | 0.8366 | 3092 | 0.308 | - | - | | 0.8369 | 3093 | 0.2807 | - | - | | 0.8371 | 3094 | 0.26 | - | - | | 0.8374 | 3095 | 0.2837 | - | - | | 0.8377 | 3096 | 0.3548 | - | - | | 0.8379 | 3097 | 0.2138 | - | - | | 0.8382 | 3098 | 0.2616 | - | - | | 0.8385 | 3099 | 0.2876 | - | - | | 0.8387 | 3100 | 0.3028 | - | - | | 0.8390 | 3101 | 0.2108 | - | - | | 0.8393 | 3102 | 0.2431 | - | - | | 0.8396 | 3103 | 0.1645 | - | - | | 0.8398 | 3104 | 0.2006 | - | - | | 0.8401 | 3105 | 0.3811 | - | - | | 0.8404 | 3106 | 0.3628 | - | - | | 0.8406 | 3107 | 0.3104 | - | - | | 0.8409 | 3108 | 0.2363 | - | - | | 0.8412 | 3109 | 0.2682 | - | - | | 0.8415 | 3110 | 0.2491 | - | - | | 0.8417 | 3111 | 0.3239 | - | - | | 0.8420 | 3112 | 0.2533 | - | - | | 0.8423 | 3113 | 0.3515 | - | - | | 0.8425 | 3114 | 0.252 | - | - | | 0.8428 | 3115 | 0.3429 | - | - | | 0.8431 | 3116 | 0.2386 | - | - | | 0.8433 | 3117 | 0.2064 | - | - | | 0.8436 | 3118 | 0.2943 | - | - | | 0.8439 | 3119 | 0.2569 | - | - | | 0.8442 | 3120 | 
0.3165 | - | - |
| 0.8444-1.0820 | 3121-3999 | ≈0.12-0.49 (per-step training loss; no evaluation run) | - | - |
| 1.0823 | 4000 | 0.2006 | 0.2336 | 0.9416 |
| 1.0825-1.1383 | 4001-4207 | ≈0.14-0.38 (per-step training loss; no evaluation run) | - | - |
| 1.1385 | 4208 | 0.2465 | - | - |
| 
1.1388 | 4209 | 0.2426 | - | - | | 1.1391 | 4210 | 0.22 | - | - | | 1.1393 | 4211 | 0.1954 | - | - | | 1.1396 | 4212 | 0.2763 | - | - | | 1.1399 | 4213 | 0.2086 | - | - | | 1.1402 | 4214 | 0.2295 | - | - | | 1.1404 | 4215 | 0.2712 | - | - | | 1.1407 | 4216 | 0.2734 | - | - | | 1.1410 | 4217 | 0.3155 | - | - | | 1.1412 | 4218 | 0.2565 | - | - | | 1.1415 | 4219 | 0.2388 | - | - | | 1.1418 | 4220 | 0.2291 | - | - | | 1.1420 | 4221 | 0.2375 | - | - | | 1.1423 | 4222 | 0.2429 | - | - | | 1.1426 | 4223 | 0.1965 | - | - | | 1.1429 | 4224 | 0.2188 | - | - | | 1.1431 | 4225 | 0.2529 | - | - | | 1.1434 | 4226 | 0.2549 | - | - | | 1.1437 | 4227 | 0.1583 | - | - | | 1.1439 | 4228 | 0.2585 | - | - | | 1.1442 | 4229 | 0.2597 | - | - | | 1.1445 | 4230 | 0.1788 | - | - | | 1.1448 | 4231 | 0.2648 | - | - | | 1.1450 | 4232 | 0.2133 | - | - | | 1.1453 | 4233 | 0.2612 | - | - | | 1.1456 | 4234 | 0.2921 | - | - | | 1.1458 | 4235 | 0.2136 | - | - | | 1.1461 | 4236 | 0.281 | - | - | | 1.1464 | 4237 | 0.2649 | - | - | | 1.1466 | 4238 | 0.2231 | - | - | | 1.1469 | 4239 | 0.2245 | - | - | | 1.1472 | 4240 | 0.1939 | - | - | | 1.1475 | 4241 | 0.2871 | - | - | | 1.1477 | 4242 | 0.1864 | - | - | | 1.1480 | 4243 | 0.2726 | - | - | | 1.1483 | 4244 | 0.214 | - | - | | 1.1485 | 4245 | 0.2789 | - | - | | 1.1488 | 4246 | 0.1811 | - | - | | 1.1491 | 4247 | 0.2205 | - | - | | 1.1494 | 4248 | 0.179 | - | - | | 1.1496 | 4249 | 0.2973 | - | - | | 1.1499 | 4250 | 0.2983 | - | - | | 1.1502 | 4251 | 0.2739 | - | - | | 1.1504 | 4252 | 0.2129 | - | - | | 1.1507 | 4253 | 0.2948 | - | - | | 1.1510 | 4254 | 0.2201 | - | - | | 1.1512 | 4255 | 0.2214 | - | - | | 1.1515 | 4256 | 0.1969 | - | - | | 1.1518 | 4257 | 0.1745 | - | - | | 1.1521 | 4258 | 0.2708 | - | - | | 1.1523 | 4259 | 0.3266 | - | - | | 1.1526 | 4260 | 0.2179 | - | - | | 1.1529 | 4261 | 0.2791 | - | - | | 1.1531 | 4262 | 0.2786 | - | - | | 1.1534 | 4263 | 0.2065 | - | - | | 1.1537 | 4264 | 0.1809 | - | - | | 1.1540 | 4265 | 0.1854 | - | - | | 1.1542 | 
4266 | 0.3181 | - | - | | 1.1545 | 4267 | 0.2476 | - | - | | 1.1548 | 4268 | 0.2924 | - | - | | 1.1550 | 4269 | 0.1932 | - | - | | 1.1553 | 4270 | 0.294 | - | - | | 1.1556 | 4271 | 0.2131 | - | - | | 1.1558 | 4272 | 0.2054 | - | - | | 1.1561 | 4273 | 0.1859 | - | - | | 1.1564 | 4274 | 0.238 | - | - | | 1.1567 | 4275 | 0.2462 | - | - | | 1.1569 | 4276 | 0.2143 | - | - | | 1.1572 | 4277 | 0.2293 | - | - | | 1.1575 | 4278 | 0.2609 | - | - | | 1.1577 | 4279 | 0.186 | - | - | | 1.1580 | 4280 | 0.2331 | - | - | | 1.1583 | 4281 | 0.2604 | - | - | | 1.1585 | 4282 | 0.2363 | - | - | | 1.1588 | 4283 | 0.335 | - | - | | 1.1591 | 4284 | 0.2166 | - | - | | 1.1594 | 4285 | 0.2838 | - | - | | 1.1596 | 4286 | 0.2166 | - | - | | 1.1599 | 4287 | 0.2074 | - | - | | 1.1602 | 4288 | 0.2441 | - | - | | 1.1604 | 4289 | 0.2548 | - | - | | 1.1607 | 4290 | 0.3116 | - | - | | 1.1610 | 4291 | 0.1575 | - | - | | 1.1613 | 4292 | 0.2633 | - | - | | 1.1615 | 4293 | 0.2858 | - | - | | 1.1618 | 4294 | 0.1834 | - | - | | 1.1621 | 4295 | 0.2015 | - | - | | 1.1623 | 4296 | 0.2009 | - | - | | 1.1626 | 4297 | 0.2139 | - | - | | 1.1629 | 4298 | 0.2425 | - | - | | 1.1631 | 4299 | 0.1953 | - | - | | 1.1634 | 4300 | 0.1438 | - | - | | 1.1637 | 4301 | 0.2729 | - | - | | 1.1640 | 4302 | 0.2218 | - | - | | 1.1642 | 4303 | 0.2708 | - | - | | 1.1645 | 4304 | 0.2151 | - | - | | 1.1648 | 4305 | 0.2353 | - | - | | 1.1650 | 4306 | 0.181 | - | - | | 1.1653 | 4307 | 0.1629 | - | - | | 1.1656 | 4308 | 0.1922 | - | - | | 1.1659 | 4309 | 0.2589 | - | - | | 1.1661 | 4310 | 0.241 | - | - | | 1.1664 | 4311 | 0.3068 | - | - | | 1.1667 | 4312 | 0.249 | - | - | | 1.1669 | 4313 | 0.2464 | - | - | | 1.1672 | 4314 | 0.1706 | - | - | | 1.1675 | 4315 | 0.3256 | - | - | | 1.1677 | 4316 | 0.2382 | - | - | | 1.1680 | 4317 | 0.3734 | - | - | | 1.1683 | 4318 | 0.2511 | - | - | | 1.1686 | 4319 | 0.2513 | - | - | | 1.1688 | 4320 | 0.1616 | - | - | | 1.1691 | 4321 | 0.2539 | - | - | | 1.1694 | 4322 | 0.2508 | - | - | | 1.1696 | 4323 | 
0.1958 | - | - | | 1.1699 | 4324 | 0.1808 | - | - | | 1.1702 | 4325 | 0.2645 | - | - | | 1.1705 | 4326 | 0.1849 | - | - | | 1.1707 | 4327 | 0.1863 | - | - | | 1.1710 | 4328 | 0.2459 | - | - | | 1.1713 | 4329 | 0.2475 | - | - | | 1.1715 | 4330 | 0.265 | - | - | | 1.1718 | 4331 | 0.2731 | - | - | | 1.1721 | 4332 | 0.1976 | - | - | | 1.1723 | 4333 | 0.1935 | - | - | | 1.1726 | 4334 | 0.2205 | - | - | | 1.1729 | 4335 | 0.1949 | - | - | | 1.1732 | 4336 | 0.1745 | - | - | | 1.1734 | 4337 | 0.2197 | - | - | | 1.1737 | 4338 | 0.1859 | - | - | | 1.1740 | 4339 | 0.2782 | - | - | | 1.1742 | 4340 | 0.1857 | - | - | | 1.1745 | 4341 | 0.2032 | - | - | | 1.1748 | 4342 | 0.1902 | - | - | | 1.1751 | 4343 | 0.1947 | - | - | | 1.1753 | 4344 | 0.1751 | - | - | | 1.1756 | 4345 | 0.2229 | - | - | | 1.1759 | 4346 | 0.2209 | - | - | | 1.1761 | 4347 | 0.2381 | - | - | | 1.1764 | 4348 | 0.1881 | - | - | | 1.1767 | 4349 | 0.2442 | - | - | | 1.1769 | 4350 | 0.2338 | - | - | | 1.1772 | 4351 | 0.2842 | - | - | | 1.1775 | 4352 | 0.1967 | - | - | | 1.1778 | 4353 | 0.2116 | - | - | | 1.1780 | 4354 | 0.1557 | - | - | | 1.1783 | 4355 | 0.2356 | - | - | | 1.1786 | 4356 | 0.1731 | - | - | | 1.1788 | 4357 | 0.1894 | - | - | | 1.1791 | 4358 | 0.2461 | - | - | | 1.1794 | 4359 | 0.2419 | - | - | | 1.1797 | 4360 | 0.2088 | - | - | | 1.1799 | 4361 | 0.2258 | - | - | | 1.1802 | 4362 | 0.2509 | - | - | | 1.1805 | 4363 | 0.3137 | - | - | | 1.1807 | 4364 | 0.2406 | - | - | | 1.1810 | 4365 | 0.3251 | - | - | | 1.1813 | 4366 | 0.3338 | - | - | | 1.1815 | 4367 | 0.1396 | - | - | | 1.1818 | 4368 | 0.2639 | - | - | | 1.1821 | 4369 | 0.1672 | - | - | | 1.1824 | 4370 | 0.2297 | - | - | | 1.1826 | 4371 | 0.1911 | - | - | | 1.1829 | 4372 | 0.2367 | - | - | | 1.1832 | 4373 | 0.2659 | - | - | | 1.1834 | 4374 | 0.3109 | - | - | | 1.1837 | 4375 | 0.2522 | - | - | | 1.1840 | 4376 | 0.2597 | - | - | | 1.1843 | 4377 | 0.2996 | - | - | | 1.1845 | 4378 | 0.2073 | - | - | | 1.1848 | 4379 | 0.1874 | - | - | | 1.1851 | 4380 | 
0.1336 | - | - | | 1.1853 | 4381 | 0.2259 | - | - | | 1.1856 | 4382 | 0.1581 | - | - | | 1.1859 | 4383 | 0.2602 | - | - | | 1.1861 | 4384 | 0.2303 | - | - | | 1.1864 | 4385 | 0.2169 | - | - | | 1.1867 | 4386 | 0.3112 | - | - | | 1.1870 | 4387 | 0.1863 | - | - | | 1.1872 | 4388 | 0.2988 | - | - | | 1.1875 | 4389 | 0.1617 | - | - | | 1.1878 | 4390 | 0.2332 | - | - | | 1.1880 | 4391 | 0.2508 | - | - | | 1.1883 | 4392 | 0.2597 | - | - | | 1.1886 | 4393 | 0.3034 | - | - | | 1.1889 | 4394 | 0.2211 | - | - | | 1.1891 | 4395 | 0.2996 | - | - | | 1.1894 | 4396 | 0.204 | - | - | | 1.1897 | 4397 | 0.264 | - | - | | 1.1899 | 4398 | 0.2316 | - | - | | 1.1902 | 4399 | 0.2017 | - | - | | 1.1905 | 4400 | 0.195 | - | - | | 1.1907 | 4401 | 0.2194 | - | - | | 1.1910 | 4402 | 0.1864 | - | - | | 1.1913 | 4403 | 0.214 | - | - | | 1.1916 | 4404 | 0.2382 | - | - | | 1.1918 | 4405 | 0.2293 | - | - | | 1.1921 | 4406 | 0.1916 | - | - | | 1.1924 | 4407 | 0.1453 | - | - | | 1.1926 | 4408 | 0.3456 | - | - | | 1.1929 | 4409 | 0.2782 | - | - | | 1.1932 | 4410 | 0.2315 | - | - | | 1.1935 | 4411 | 0.3167 | - | - | | 1.1937 | 4412 | 0.2665 | - | - | | 1.1940 | 4413 | 0.2476 | - | - | | 1.1943 | 4414 | 0.248 | - | - | | 1.1945 | 4415 | 0.1862 | - | - | | 1.1948 | 4416 | 0.2545 | - | - | | 1.1951 | 4417 | 0.2549 | - | - | | 1.1953 | 4418 | 0.1536 | - | - | | 1.1956 | 4419 | 0.2348 | - | - | | 1.1959 | 4420 | 0.2631 | - | - | | 1.1962 | 4421 | 0.2976 | - | - | | 1.1964 | 4422 | 0.3626 | - | - | | 1.1967 | 4423 | 0.2335 | - | - | | 1.1970 | 4424 | 0.2127 | - | - | | 1.1972 | 4425 | 0.2127 | - | - | | 1.1975 | 4426 | 0.2649 | - | - | | 1.1978 | 4427 | 0.2211 | - | - | | 1.1981 | 4428 | 0.2515 | - | - | | 1.1983 | 4429 | 0.2394 | - | - | | 1.1986 | 4430 | 0.1586 | - | - | | 1.1989 | 4431 | 0.192 | - | - | | 1.1991 | 4432 | 0.2288 | - | - | | 1.1994 | 4433 | 0.2269 | - | - | | 1.1997 | 4434 | 0.232 | - | - | | 1.1999 | 4435 | 0.1814 | - | - | | 1.2002 | 4436 | 0.2768 | - | - | | 1.2005 | 4437 | 0.2096 | - 
| - | | 1.2008 | 4438 | 0.2717 | - | - | | 1.2010 | 4439 | 0.1583 | - | - | | 1.2013 | 4440 | 0.2195 | - | - | | 1.2016 | 4441 | 0.2865 | - | - | | 1.2018 | 4442 | 0.3121 | - | - | | 1.2021 | 4443 | 0.1415 | - | - | | 1.2024 | 4444 | 0.2083 | - | - | | 1.2027 | 4445 | 0.2701 | - | - | | 1.2029 | 4446 | 0.1928 | - | - | | 1.2032 | 4447 | 0.1929 | - | - | | 1.2035 | 4448 | 0.2577 | - | - | | 1.2037 | 4449 | 0.3552 | - | - | | 1.2040 | 4450 | 0.2243 | - | - | | 1.2043 | 4451 | 0.2552 | - | - | | 1.2045 | 4452 | 0.2835 | - | - | | 1.2048 | 4453 | 0.2188 | - | - | | 1.2051 | 4454 | 0.2071 | - | - | | 1.2054 | 4455 | 0.2013 | - | - | | 1.2056 | 4456 | 0.1967 | - | - | | 1.2059 | 4457 | 0.221 | - | - | | 1.2062 | 4458 | 0.2773 | - | - | | 1.2064 | 4459 | 0.1989 | - | - | | 1.2067 | 4460 | 0.1889 | - | - | | 1.2070 | 4461 | 0.2622 | - | - | | 1.2073 | 4462 | 0.1878 | - | - | | 1.2075 | 4463 | 0.2531 | - | - | | 1.2078 | 4464 | 0.2678 | - | - | | 1.2081 | 4465 | 0.3264 | - | - | | 1.2083 | 4466 | 0.1789 | - | - | | 1.2086 | 4467 | 0.2803 | - | - | | 1.2089 | 4468 | 0.2853 | - | - | | 1.2091 | 4469 | 0.2517 | - | - | | 1.2094 | 4470 | 0.2236 | - | - | | 1.2097 | 4471 | 0.2327 | - | - | | 1.2100 | 4472 | 0.2625 | - | - | | 1.2102 | 4473 | 0.2433 | - | - | | 1.2105 | 4474 | 0.2062 | - | - | | 1.2108 | 4475 | 0.193 | - | - | | 1.2110 | 4476 | 0.3185 | - | - | | 1.2113 | 4477 | 0.2213 | - | - | | 1.2116 | 4478 | 0.2161 | - | - | | 1.2119 | 4479 | 0.195 | - | - | | 1.2121 | 4480 | 0.1507 | - | - | | 1.2124 | 4481 | 0.2382 | - | - | | 1.2127 | 4482 | 0.2444 | - | - | | 1.2129 | 4483 | 0.1966 | - | - | | 1.2132 | 4484 | 0.2429 | - | - | | 1.2135 | 4485 | 0.2028 | - | - | | 1.2137 | 4486 | 0.1947 | - | - | | 1.2140 | 4487 | 0.3277 | - | - | | 1.2143 | 4488 | 0.2984 | - | - | | 1.2146 | 4489 | 0.2657 | - | - | | 1.2148 | 4490 | 0.2702 | - | - | | 1.2151 | 4491 | 0.2628 | - | - | | 1.2154 | 4492 | 0.3113 | - | - | | 1.2156 | 4493 | 0.2375 | - | - | | 1.2159 | 4494 | 0.2656 | - | - | | 
1.2162 | 4495 | 0.1883 | - | - | | 1.2165 | 4496 | 0.183 | - | - | | 1.2167 | 4497 | 0.2129 | - | - | | 1.2170 | 4498 | 0.249 | - | - | | 1.2173 | 4499 | 0.2801 | - | - | | 1.2175 | 4500 | 0.3372 | - | - | | 1.2178 | 4501 | 0.2198 | - | - | | 1.2181 | 4502 | 0.328 | - | - | | 1.2183 | 4503 | 0.229 | - | - | | 1.2186 | 4504 | 0.2431 | - | - | | 1.2189 | 4505 | 0.1767 | - | - | | 1.2192 | 4506 | 0.1872 | - | - | | 1.2194 | 4507 | 0.1747 | - | - | | 1.2197 | 4508 | 0.1524 | - | - | | 1.2200 | 4509 | 0.1526 | - | - | | 1.2202 | 4510 | 0.231 | - | - | | 1.2205 | 4511 | 0.2313 | - | - | | 1.2208 | 4512 | 0.3124 | - | - | | 1.2210 | 4513 | 0.1784 | - | - | | 1.2213 | 4514 | 0.1641 | - | - | | 1.2216 | 4515 | 0.2189 | - | - | | 1.2219 | 4516 | 0.2125 | - | - | | 1.2221 | 4517 | 0.2273 | - | - | | 1.2224 | 4518 | 0.2583 | - | - | | 1.2227 | 4519 | 0.2329 | - | - | | 1.2229 | 4520 | 0.288 | - | - | | 1.2232 | 4521 | 0.1855 | - | - | | 1.2235 | 4522 | 0.209 | - | - | | 1.2238 | 4523 | 0.1516 | - | - | | 1.2240 | 4524 | 0.2512 | - | - | | 1.2243 | 4525 | 0.2599 | - | - | | 1.2246 | 4526 | 0.1972 | - | - | | 1.2248 | 4527 | 0.284 | - | - | | 1.2251 | 4528 | 0.2392 | - | - | | 1.2254 | 4529 | 0.3336 | - | - | | 1.2256 | 4530 | 0.2817 | - | - | | 1.2259 | 4531 | 0.2513 | - | - | | 1.2262 | 4532 | 0.2295 | - | - | | 1.2265 | 4533 | 0.234 | - | - | | 1.2267 | 4534 | 0.331 | - | - | | 1.2270 | 4535 | 0.1894 | - | - | | 1.2273 | 4536 | 0.2673 | - | - | | 1.2275 | 4537 | 0.2025 | - | - | | 1.2278 | 4538 | 0.2624 | - | - | | 1.2281 | 4539 | 0.2418 | - | - | | 1.2284 | 4540 | 0.1875 | - | - | | 1.2286 | 4541 | 0.2286 | - | - | | 1.2289 | 4542 | 0.2127 | - | - | | 1.2292 | 4543 | 0.2744 | - | - | | 1.2294 | 4544 | 0.1911 | - | - | | 1.2297 | 4545 | 0.3007 | - | - | | 1.2300 | 4546 | 0.1921 | - | - | | 1.2302 | 4547 | 0.2478 | - | - | | 1.2305 | 4548 | 0.258 | - | - | | 1.2308 | 4549 | 0.1984 | - | - | | 1.2311 | 4550 | 0.1743 | - | - | | 1.2313 | 4551 | 0.2166 | - | - | | 1.2316 | 4552 | 
0.2049 | - | - | | 1.2319 | 4553 | 0.2519 | - | - | | 1.2321 | 4554 | 0.2215 | - | - | | 1.2324 | 4555 | 0.2967 | - | - | | 1.2327 | 4556 | 0.1825 | - | - | | 1.2330 | 4557 | 0.2615 | - | - | | 1.2332 | 4558 | 0.2156 | - | - | | 1.2335 | 4559 | 0.1938 | - | - | | 1.2338 | 4560 | 0.2087 | - | - | | 1.2340 | 4561 | 0.2401 | - | - | | 1.2343 | 4562 | 0.2297 | - | - | | 1.2346 | 4563 | 0.2615 | - | - | | 1.2348 | 4564 | 0.158 | - | - | | 1.2351 | 4565 | 0.1972 | - | - | | 1.2354 | 4566 | 0.2279 | - | - | | 1.2357 | 4567 | 0.2081 | - | - | | 1.2359 | 4568 | 0.2285 | - | - | | 1.2362 | 4569 | 0.2632 | - | - | | 1.2365 | 4570 | 0.2652 | - | - | | 1.2367 | 4571 | 0.1575 | - | - | | 1.2370 | 4572 | 0.2755 | - | - | | 1.2373 | 4573 | 0.2692 | - | - | | 1.2376 | 4574 | 0.1596 | - | - | | 1.2378 | 4575 | 0.2256 | - | - | | 1.2381 | 4576 | 0.2214 | - | - | | 1.2384 | 4577 | 0.2237 | - | - | | 1.2386 | 4578 | 0.2393 | - | - | | 1.2389 | 4579 | 0.1569 | - | - | | 1.2392 | 4580 | 0.3432 | - | - | | 1.2394 | 4581 | 0.2159 | - | - | | 1.2397 | 4582 | 0.248 | - | - | | 1.2400 | 4583 | 0.2093 | - | - | | 1.2403 | 4584 | 0.2372 | - | - | | 1.2405 | 4585 | 0.1782 | - | - | | 1.2408 | 4586 | 0.27 | - | - | | 1.2411 | 4587 | 0.1525 | - | - | | 1.2413 | 4588 | 0.1439 | - | - | | 1.2416 | 4589 | 0.3204 | - | - | | 1.2419 | 4590 | 0.1863 | - | - | | 1.2422 | 4591 | 0.1776 | - | - | | 1.2424 | 4592 | 0.2783 | - | - | | 1.2427 | 4593 | 0.2024 | - | - | | 1.2430 | 4594 | 0.2108 | - | - | | 1.2432 | 4595 | 0.1963 | - | - | | 1.2435 | 4596 | 0.2438 | - | - | | 1.2438 | 4597 | 0.3046 | - | - | | 1.2440 | 4598 | 0.1669 | - | - | | 1.2443 | 4599 | 0.2387 | - | - | | 1.2446 | 4600 | 0.1727 | - | - | | 1.2449 | 4601 | 0.2733 | - | - | | 1.2451 | 4602 | 0.175 | - | - | | 1.2454 | 4603 | 0.1841 | - | - | | 1.2457 | 4604 | 0.2065 | - | - | | 1.2459 | 4605 | 0.2694 | - | - | | 1.2462 | 4606 | 0.261 | - | - | | 1.2465 | 4607 | 0.297 | - | - | | 1.2468 | 4608 | 0.1567 | - | - | | 1.2470 | 4609 | 0.2799 | - 
| - | | 1.2473 | 4610 | 0.2371 | - | - | | 1.2476 | 4611 | 0.3294 | - | - | | 1.2478 | 4612 | 0.1864 | - | - | | 1.2481 | 4613 | 0.2184 | - | - | | 1.2484 | 4614 | 0.1709 | - | - | | 1.2486 | 4615 | 0.2159 | - | - | | 1.2489 | 4616 | 0.1463 | - | - | | 1.2492 | 4617 | 0.1659 | - | - | | 1.2495 | 4618 | 0.1885 | - | - | | 1.2497 | 4619 | 0.261 | - | - | | 1.25 | 4620 | 0.214 | - | - | | 1.2503 | 4621 | 0.3101 | - | - | | 1.2505 | 4622 | 0.2443 | - | - | | 1.2508 | 4623 | 0.1709 | - | - | | 1.2511 | 4624 | 0.2013 | - | - | | 1.2514 | 4625 | 0.2378 | - | - | | 1.2516 | 4626 | 0.1796 | - | - | | 1.2519 | 4627 | 0.1952 | - | - | | 1.2522 | 4628 | 0.1819 | - | - | | 1.2524 | 4629 | 0.1972 | - | - | | 1.2527 | 4630 | 0.207 | - | - | | 1.2530 | 4631 | 0.2877 | - | - | | 1.2532 | 4632 | 0.2831 | - | - | | 1.2535 | 4633 | 0.2412 | - | - | | 1.2538 | 4634 | 0.1731 | - | - | | 1.2541 | 4635 | 0.1978 | - | - | | 1.2543 | 4636 | 0.2562 | - | - | | 1.2546 | 4637 | 0.2185 | - | - | | 1.2549 | 4638 | 0.2265 | - | - | | 1.2551 | 4639 | 0.2561 | - | - | | 1.2554 | 4640 | 0.233 | - | - | | 1.2557 | 4641 | 0.2746 | - | - | | 1.2560 | 4642 | 0.2534 | - | - | | 1.2562 | 4643 | 0.1689 | - | - | | 1.2565 | 4644 | 0.1926 | - | - | | 1.2568 | 4645 | 0.2405 | - | - | | 1.2570 | 4646 | 0.1613 | - | - | | 1.2573 | 4647 | 0.2288 | - | - | | 1.2576 | 4648 | 0.2439 | - | - | | 1.2578 | 4649 | 0.1421 | - | - | | 1.2581 | 4650 | 0.1864 | - | - | | 1.2584 | 4651 | 0.1849 | - | - | | 1.2587 | 4652 | 0.1937 | - | - | | 1.2589 | 4653 | 0.2452 | - | - | | 1.2592 | 4654 | 0.1935 | - | - | | 1.2595 | 4655 | 0.2102 | - | - | | 1.2597 | 4656 | 0.2364 | - | - | | 1.2600 | 4657 | 0.2402 | - | - | | 1.2603 | 4658 | 0.1827 | - | - | | 1.2606 | 4659 | 0.1919 | - | - | | 1.2608 | 4660 | 0.2182 | - | - | | 1.2611 | 4661 | 0.2846 | - | - | | 1.2614 | 4662 | 0.2488 | - | - | | 1.2616 | 4663 | 0.2403 | - | - | | 1.2619 | 4664 | 0.1764 | - | - | | 1.2622 | 4665 | 0.127 | - | - | | 1.2624 | 4666 | 0.2952 | - | - | | 
1.2627 | 4667 | 0.2231 | - | - | | 1.2630 | 4668 | 0.1952 | - | - | | 1.2633 | 4669 | 0.2341 | - | - | | 1.2635 | 4670 | 0.25 | - | - | | 1.2638 | 4671 | 0.1833 | - | - | | 1.2641 | 4672 | 0.2156 | - | - | | 1.2643 | 4673 | 0.2585 | - | - | | 1.2646 | 4674 | 0.2343 | - | - | | 1.2649 | 4675 | 0.2409 | - | - | | 1.2652 | 4676 | 0.2059 | - | - | | 1.2654 | 4677 | 0.2806 | - | - | | 1.2657 | 4678 | 0.2842 | - | - | | 1.2660 | 4679 | 0.2779 | - | - | | 1.2662 | 4680 | 0.1889 | - | - | | 1.2665 | 4681 | 0.1746 | - | - | | 1.2668 | 4682 | 0.2294 | - | - | | 1.2670 | 4683 | 0.2449 | - | - | | 1.2673 | 4684 | 0.2766 | - | - | | 1.2676 | 4685 | 0.1796 | - | - | | 1.2679 | 4686 | 0.3139 | - | - | | 1.2681 | 4687 | 0.2715 | - | - | | 1.2684 | 4688 | 0.3138 | - | - | | 1.2687 | 4689 | 0.2628 | - | - | | 1.2689 | 4690 | 0.2937 | - | - | | 1.2692 | 4691 | 0.2438 | - | - | | 1.2695 | 4692 | 0.1862 | - | - | | 1.2698 | 4693 | 0.1542 | - | - | | 1.2700 | 4694 | 0.2459 | - | - | | 1.2703 | 4695 | 0.1986 | - | - | | 1.2706 | 4696 | 0.1864 | - | - | | 1.2708 | 4697 | 0.2257 | - | - | | 1.2711 | 4698 | 0.2014 | - | - | | 1.2714 | 4699 | 0.3047 | - | - | | 1.2716 | 4700 | 0.1699 | - | - | | 1.2719 | 4701 | 0.2923 | - | - | | 1.2722 | 4702 | 0.1863 | - | - | | 1.2725 | 4703 | 0.2667 | - | - | | 1.2727 | 4704 | 0.2266 | - | - | | 1.2730 | 4705 | 0.173 | - | - | | 1.2733 | 4706 | 0.1338 | - | - | | 1.2735 | 4707 | 0.2204 | - | - | | 1.2738 | 4708 | 0.2966 | - | - | | 1.2741 | 4709 | 0.212 | - | - | | 1.2744 | 4710 | 0.2663 | - | - | | 1.2746 | 4711 | 0.2463 | - | - | | 1.2749 | 4712 | 0.2952 | - | - | | 1.2752 | 4713 | 0.2904 | - | - | | 1.2754 | 4714 | 0.1531 | - | - | | 1.2757 | 4715 | 0.1805 | - | - | | 1.2760 | 4716 | 0.2492 | - | - | | 1.2762 | 4717 | 0.2189 | - | - | | 1.2765 | 4718 | 0.3175 | - | - | | 1.2768 | 4719 | 0.218 | - | - | | 1.2771 | 4720 | 0.1972 | - | - | | 1.2773 | 4721 | 0.2109 | - | - | | 1.2776 | 4722 | 0.2836 | - | - | | 1.2779 | 4723 | 0.2403 | - | - | | 1.2781 | 
4724 | 0.1911 | - | - | | 1.2784 | 4725 | 0.2427 | - | - | | 1.2787 | 4726 | 0.2096 | - | - | | 1.2790 | 4727 | 0.1386 | - | - | | 1.2792 | 4728 | 0.2399 | - | - | | 1.2795 | 4729 | 0.2386 | - | - | | 1.2798 | 4730 | 0.2106 | - | - | | 1.2800 | 4731 | 0.2383 | - | - | | 1.2803 | 4732 | 0.2189 | - | - | | 1.2806 | 4733 | 0.2518 | - | - | | 1.2808 | 4734 | 0.2261 | - | - | | 1.2811 | 4735 | 0.3264 | - | - | | 1.2814 | 4736 | 0.2119 | - | - | | 1.2817 | 4737 | 0.1711 | - | - | | 1.2819 | 4738 | 0.2501 | - | - | | 1.2822 | 4739 | 0.2158 | - | - | | 1.2825 | 4740 | 0.1692 | - | - | | 1.2827 | 4741 | 0.2454 | - | - | | 1.2830 | 4742 | 0.2124 | - | - | | 1.2833 | 4743 | 0.1898 | - | - | | 1.2835 | 4744 | 0.2143 | - | - | | 1.2838 | 4745 | 0.2522 | - | - | | 1.2841 | 4746 | 0.2544 | - | - | | 1.2844 | 4747 | 0.2355 | - | - | | 1.2846 | 4748 | 0.3055 | - | - | | 1.2849 | 4749 | 0.2376 | - | - | | 1.2852 | 4750 | 0.2144 | - | - | | 1.2854 | 4751 | 0.1893 | - | - | | 1.2857 | 4752 | 0.2343 | - | - | | 1.2860 | 4753 | 0.2007 | - | - | | 1.2863 | 4754 | 0.1706 | - | - | | 1.2865 | 4755 | 0.2047 | - | - | | 1.2868 | 4756 | 0.2768 | - | - | | 1.2871 | 4757 | 0.2694 | - | - | | 1.2873 | 4758 | 0.159 | - | - | | 1.2876 | 4759 | 0.284 | - | - | | 1.2879 | 4760 | 0.1701 | - | - | | 1.2881 | 4761 | 0.2255 | - | - | | 1.2884 | 4762 | 0.1708 | - | - | | 1.2887 | 4763 | 0.261 | - | - | | 1.2890 | 4764 | 0.4358 | - | - | | 1.2892 | 4765 | 0.2441 | - | - | | 1.2895 | 4766 | 0.2871 | - | - | | 1.2898 | 4767 | 0.1728 | - | - | | 1.2900 | 4768 | 0.2476 | - | - | | 1.2903 | 4769 | 0.2486 | - | - | | 1.2906 | 4770 | 0.2392 | - | - | | 1.2909 | 4771 | 0.2069 | - | - | | 1.2911 | 4772 | 0.2222 | - | - | | 1.2914 | 4773 | 0.1889 | - | - | | 1.2917 | 4774 | 0.1859 | - | - | | 1.2919 | 4775 | 0.2231 | - | - | | 1.2922 | 4776 | 0.1404 | - | - | | 1.2925 | 4777 | 0.1962 | - | - | | 1.2927 | 4778 | 0.249 | - | - | | 1.2930 | 4779 | 0.1687 | - | - | | 1.2933 | 4780 | 0.2167 | - | - | | 1.2936 | 4781 | 
0.2326 | - | - | | 1.2938 | 4782 | 0.2322 | - | - | | 1.2941 | 4783 | 0.2947 | - | - | | 1.2944 | 4784 | 0.2619 | - | - | | 1.2946 | 4785 | 0.2467 | - | - | | 1.2949 | 4786 | 0.2369 | - | - | | 1.2952 | 4787 | 0.1947 | - | - | | 1.2955 | 4788 | 0.1664 | - | - | | 1.2957 | 4789 | 0.2511 | - | - | | 1.2960 | 4790 | 0.2123 | - | - | | 1.2963 | 4791 | 0.2287 | - | - | | 1.2965 | 4792 | 0.2634 | - | - | | 1.2968 | 4793 | 0.1893 | - | - | | 1.2971 | 4794 | 0.1774 | - | - | | 1.2973 | 4795 | 0.295 | - | - | | 1.2976 | 4796 | 0.1942 | - | - | | 1.2979 | 4797 | 0.2068 | - | - | | 1.2982 | 4798 | 0.2555 | - | - | | 1.2984 | 4799 | 0.224 | - | - | | 1.2987 | 4800 | 0.2791 | - | - | | 1.2990 | 4801 | 0.2119 | - | - | | 1.2992 | 4802 | 0.3137 | - | - | | 1.2995 | 4803 | 0.2808 | - | - | | 1.2998 | 4804 | 0.253 | - | - | | 1.3001 | 4805 | 0.2196 | - | - | | 1.3003 | 4806 | 0.2135 | - | - | | 1.3006 | 4807 | 0.2282 | - | - | | 1.3009 | 4808 | 0.3239 | - | - | | 1.3011 | 4809 | 0.1643 | - | - | | 1.3014 | 4810 | 0.1808 | - | - | | 1.3017 | 4811 | 0.1931 | - | - | | 1.3019 | 4812 | 0.2147 | - | - | | 1.3022 | 4813 | 0.2276 | - | - | | 1.3025 | 4814 | 0.3234 | - | - | | 1.3028 | 4815 | 0.2043 | - | - | | 1.3030 | 4816 | 0.176 | - | - | | 1.3033 | 4817 | 0.2169 | - | - | | 1.3036 | 4818 | 0.1878 | - | - | | 1.3038 | 4819 | 0.2251 | - | - | | 1.3041 | 4820 | 0.1374 | - | - | | 1.3044 | 4821 | 0.1882 | - | - | | 1.3047 | 4822 | 0.1905 | - | - | | 1.3049 | 4823 | 0.1841 | - | - | | 1.3052 | 4824 | 0.2144 | - | - | | 1.3055 | 4825 | 0.2321 | - | - | | 1.3057 | 4826 | 0.1906 | - | - | | 1.3060 | 4827 | 0.204 | - | - | | 1.3063 | 4828 | 0.213 | - | - | | 1.3065 | 4829 | 0.1974 | - | - | | 1.3068 | 4830 | 0.2829 | - | - | | 1.3071 | 4831 | 0.2704 | - | - | | 1.3074 | 4832 | 0.1599 | - | - | | 1.3076 | 4833 | 0.2108 | - | - | | 1.3079 | 4834 | 0.2135 | - | - | | 1.3082 | 4835 | 0.2134 | - | - | | 1.3084 | 4836 | 0.2072 | - | - | | 1.3087 | 4837 | 0.2184 | - | - | | 1.3090 | 4838 | 0.2851 | - 
| - | | 1.3093 | 4839 | 0.1898 | - | - | | 1.3095 | 4840 | 0.3054 | - | - | | 1.3098 | 4841 | 0.2102 | - | - | | 1.3101 | 4842 | 0.2429 | - | - | | 1.3103 | 4843 | 0.2845 | - | - | | 1.3106 | 4844 | 0.3107 | - | - | | 1.3109 | 4845 | 0.2447 | - | - | | 1.3111 | 4846 | 0.3323 | - | - | | 1.3114 | 4847 | 0.3229 | - | - | | 1.3117 | 4848 | 0.2128 | - | - | | 1.3120 | 4849 | 0.2268 | - | - | | 1.3122 | 4850 | 0.3052 | - | - | | 1.3125 | 4851 | 0.1629 | - | - | | 1.3128 | 4852 | 0.2615 | - | - | | 1.3130 | 4853 | 0.2432 | - | - | | 1.3133 | 4854 | 0.2357 | - | - | | 1.3136 | 4855 | 0.2068 | - | - | | 1.3139 | 4856 | 0.1822 | - | - | | 1.3141 | 4857 | 0.1763 | - | - | | 1.3144 | 4858 | 0.2185 | - | - | | 1.3147 | 4859 | 0.2282 | - | - | | 1.3149 | 4860 | 0.2787 | - | - | | 1.3152 | 4861 | 0.2479 | - | - | | 1.3155 | 4862 | 0.2429 | - | - | | 1.3157 | 4863 | 0.2079 | - | - | | 1.3160 | 4864 | 0.2166 | - | - | | 1.3163 | 4865 | 0.2531 | - | - | | 1.3166 | 4866 | 0.1407 | - | - | | 1.3168 | 4867 | 0.2401 | - | - | | 1.3171 | 4868 | 0.2687 | - | - | | 1.3174 | 4869 | 0.2249 | - | - | | 1.3176 | 4870 | 0.1733 | - | - | | 1.3179 | 4871 | 0.2637 | - | - | | 1.3182 | 4872 | 0.2236 | - | - | | 1.3185 | 4873 | 0.1528 | - | - | | 1.3187 | 4874 | 0.2443 | - | - | | 1.3190 | 4875 | 0.236 | - | - | | 1.3193 | 4876 | 0.2699 | - | - | | 1.3195 | 4877 | 0.1866 | - | - | | 1.3198 | 4878 | 0.2239 | - | - | | 1.3201 | 4879 | 0.295 | - | - | | 1.3203 | 4880 | 0.1985 | - | - | | 1.3206 | 4881 | 0.2163 | - | - | | 1.3209 | 4882 | 0.2528 | - | - | | 1.3212 | 4883 | 0.2202 | - | - | | 1.3214 | 4884 | 0.2621 | - | - | | 1.3217 | 4885 | 0.1878 | - | - | | 1.3220 | 4886 | 0.244 | - | - | | 1.3222 | 4887 | 0.3038 | - | - | | 1.3225 | 4888 | 0.3202 | - | - | | 1.3228 | 4889 | 0.2813 | - | - | | 1.3231 | 4890 | 0.2854 | - | - | | 1.3233 | 4891 | 0.2537 | - | - | | 1.3236 | 4892 | 0.2448 | - | - | | 1.3239 | 4893 | 0.1986 | - | - | | 1.3241 | 4894 | 0.1488 | - | - | | 1.3244 | 4895 | 0.2294 | - | - | | 
1.3247 | 4896 | 0.2071 | - | - | | 1.3249 | 4897 | 0.2643 | - | - | | 1.3252 | 4898 | 0.2465 | - | - | | 1.3255 | 4899 | 0.2478 | - | - | | 1.3258 | 4900 | 0.2033 | - | - | | 1.3260 | 4901 | 0.2245 | - | - | | 1.3263 | 4902 | 0.2875 | - | - | | 1.3266 | 4903 | 0.2458 | - | - | | 1.3268 | 4904 | 0.2662 | - | - | | 1.3271 | 4905 | 0.2393 | - | - | | 1.3274 | 4906 | 0.1747 | - | - | | 1.3277 | 4907 | 0.2646 | - | - | | 1.3279 | 4908 | 0.324 | - | - | | 1.3282 | 4909 | 0.2307 | - | - | | 1.3285 | 4910 | 0.1988 | - | - | | 1.3287 | 4911 | 0.2198 | - | - | | 1.3290 | 4912 | 0.3069 | - | - | | 1.3293 | 4913 | 0.2538 | - | - | | 1.3295 | 4914 | 0.2281 | - | - | | 1.3298 | 4915 | 0.1691 | - | - | | 1.3301 | 4916 | 0.4058 | - | - | | 1.3304 | 4917 | 0.2588 | - | - | | 1.3306 | 4918 | 0.2653 | - | - | | 1.3309 | 4919 | 0.2885 | - | - | | 1.3312 | 4920 | 0.176 | - | - | | 1.3314 | 4921 | 0.2289 | - | - | | 1.3317 | 4922 | 0.2591 | - | - | | 1.3320 | 4923 | 0.2208 | - | - | | 1.3323 | 4924 | 0.2514 | - | - | | 1.3325 | 4925 | 0.3227 | - | - | | 1.3328 | 4926 | 0.233 | - | - | | 1.3331 | 4927 | 0.2272 | - | - | | 1.3333 | 4928 | 0.186 | - | - | | 1.3336 | 4929 | 0.1545 | - | - | | 1.3339 | 4930 | 0.2342 | - | - | | 1.3341 | 4931 | 0.2273 | - | - | | 1.3344 | 4932 | 0.2213 | - | - | | 1.3347 | 4933 | 0.2063 | - | - | | 1.3350 | 4934 | 0.2144 | - | - | | 1.3352 | 4935 | 0.2282 | - | - | | 1.3355 | 4936 | 0.2448 | - | - | | 1.3358 | 4937 | 0.172 | - | - | | 1.3360 | 4938 | 0.2317 | - | - | | 1.3363 | 4939 | 0.2178 | - | - | | 1.3366 | 4940 | 0.2019 | - | - | | 1.3369 | 4941 | 0.2257 | - | - | | 1.3371 | 4942 | 0.1835 | - | - | | 1.3374 | 4943 | 0.2362 | - | - | | 1.3377 | 4944 | 0.1473 | - | - | | 1.3379 | 4945 | 0.2068 | - | - | | 1.3382 | 4946 | 0.2301 | - | - | | 1.3385 | 4947 | 0.3179 | - | - | | 1.3387 | 4948 | 0.2331 | - | - | | 1.3390 | 4949 | 0.2178 | - | - | | 1.3393 | 4950 | 0.2855 | - | - | | 1.3396 | 4951 | 0.1918 | - | - | | 1.3398 | 4952 | 0.2233 | - | - | | 1.3401 | 
4953 | 0.2328 | - | - | | 1.3404 | 4954 | 0.2482 | - | - | | 1.3406 | 4955 | 0.1931 | - | - | | 1.3409 | 4956 | 0.2095 | - | - | | 1.3412 | 4957 | 0.218 | - | - | | 1.3415 | 4958 | 0.2394 | - | - | | 1.3417 | 4959 | 0.2699 | - | - | | 1.3420 | 4960 | 0.1919 | - | - | | 1.3423 | 4961 | 0.2242 | - | - | | 1.3425 | 4962 | 0.2044 | - | - | | 1.3428 | 4963 | 0.2002 | - | - | | 1.3431 | 4964 | 0.2768 | - | - | | 1.3433 | 4965 | 0.1838 | - | - | | 1.3436 | 4966 | 0.2085 | - | - | | 1.3439 | 4967 | 0.213 | - | - | | 1.3442 | 4968 | 0.1693 | - | - | | 1.3444 | 4969 | 0.1779 | - | - | | 1.3447 | 4970 | 0.2766 | - | - | | 1.3450 | 4971 | 0.1902 | - | - | | 1.3452 | 4972 | 0.1753 | - | - | | 1.3455 | 4973 | 0.2701 | - | - | | 1.3458 | 4974 | 0.2516 | - | - | | 1.3460 | 4975 | 0.3002 | - | - | | 1.3463 | 4976 | 0.2558 | - | - | | 1.3466 | 4977 | 0.1969 | - | - | | 1.3469 | 4978 | 0.2542 | - | - | | 1.3471 | 4979 | 0.2061 | - | - | | 1.3474 | 4980 | 0.2225 | - | - | | 1.3477 | 4981 | 0.3971 | - | - | | 1.3479 | 4982 | 0.2559 | - | - | | 1.3482 | 4983 | 0.2082 | - | - | | 1.3485 | 4984 | 0.24 | - | - | | 1.3488 | 4985 | 0.1704 | - | - | | 1.3490 | 4986 | 0.3115 | - | - | | 1.3493 | 4987 | 0.2444 | - | - | | 1.3496 | 4988 | 0.1666 | - | - | | 1.3498 | 4989 | 0.2265 | - | - | | 1.3501 | 4990 | 0.2055 | - | - | | 1.3504 | 4991 | 0.1933 | - | - | | 1.3506 | 4992 | 0.2857 | - | - | | 1.3509 | 4993 | 0.1779 | - | - | | 1.3512 | 4994 | 0.2757 | - | - | | 1.3515 | 4995 | 0.187 | - | - | | 1.3517 | 4996 | 0.3348 | - | - | | 1.3520 | 4997 | 0.225 | - | - | | 1.3523 | 4998 | 0.2191 | - | - | | 1.3525 | 4999 | 0.2894 | - | - | | 1.3528 | 5000 | 0.1743 | 0.2258 | 0.9426 | | 1.3531 | 5001 | 0.3272 | - | - | | 1.3534 | 5002 | 0.1932 | - | - | | 1.3536 | 5003 | 0.3093 | - | - | | 1.3539 | 5004 | 0.2479 | - | - | | 1.3542 | 5005 | 0.2254 | - | - | | 1.3544 | 5006 | 0.2568 | - | - | | 1.3547 | 5007 | 0.3189 | - | - | | 1.3550 | 5008 | 0.187 | - | - | | 1.3552 | 5009 | 0.2211 | - | - | | 1.3555 | 
5010 | 0.2745 | - | - | | 1.3558 | 5011 | 0.2515 | - | - | | 1.3561 | 5012 | 0.3465 | - | - | | 1.3563 | 5013 | 0.1519 | - | - | | 1.3566 | 5014 | 0.2272 | - | - | | 1.3569 | 5015 | 0.2069 | - | - | | 1.3571 | 5016 | 0.2089 | - | - | | 1.3574 | 5017 | 0.1934 | - | - | | 1.3577 | 5018 | 0.3921 | - | - | | 1.3580 | 5019 | 0.2081 | - | - | | 1.3582 | 5020 | 0.2498 | - | - | | 1.3585 | 5021 | 0.2372 | - | - | | 1.3588 | 5022 | 0.2209 | - | - | | 1.3590 | 5023 | 0.2519 | - | - | | 1.3593 | 5024 | 0.1997 | - | - | | 1.3596 | 5025 | 0.2536 | - | - | | 1.3598 | 5026 | 0.191 | - | - | | 1.3601 | 5027 | 0.2011 | - | - | | 1.3604 | 5028 | 0.1803 | - | - | | 1.3607 | 5029 | 0.1985 | - | - | | 1.3609 | 5030 | 0.2134 | - | - | | 1.3612 | 5031 | 0.1655 | - | - | | 1.3615 | 5032 | 0.2102 | - | - | | 1.3617 | 5033 | 0.163 | - | - | | 1.3620 | 5034 | 0.2074 | - | - | | 1.3623 | 5035 | 0.2897 | - | - | | 1.3626 | 5036 | 0.2697 | - | - | | 1.3628 | 5037 | 0.2266 | - | - | | 1.3631 | 5038 | 0.2365 | - | - | | 1.3634 | 5039 | 0.2457 | - | - | | 1.3636 | 5040 | 0.2498 | - | - | | 1.3639 | 5041 | 0.1816 | - | - | | 1.3642 | 5042 | 0.2523 | - | - | | 1.3644 | 5043 | 0.1932 | - | - | | 1.3647 | 5044 | 0.2866 | - | - | | 1.3650 | 5045 | 0.2636 | - | - | | 1.3653 | 5046 | 0.1805 | - | - | | 1.3655 | 5047 | 0.1704 | - | - | | 1.3658 | 5048 | 0.184 | - | - | | 1.3661 | 5049 | 0.2121 | - | - | | 1.3663 | 5050 | 0.1862 | - | - | | 1.3666 | 5051 | 0.1225 | - | - | | 1.3669 | 5052 | 0.1845 | - | - | | 1.3672 | 5053 | 0.201 | - | - | | 1.3674 | 5054 | 0.3451 | - | - | | 1.3677 | 5055 | 0.1807 | - | - | | 1.3680 | 5056 | 0.183 | - | - | | 1.3682 | 5057 | 0.1895 | - | - | | 1.3685 | 5058 | 0.2299 | - | - | | 1.3688 | 5059 | 0.2732 | - | - | | 1.3690 | 5060 | 0.2154 | - | - | | 1.3693 | 5061 | 0.1992 | - | - | | 1.3696 | 5062 | 0.1815 | - | - | | 1.3699 | 5063 | 0.2393 | - | - | | 1.3701 | 5064 | 0.1835 | - | - | | 1.3704 | 5065 | 0.2755 | - | - | | 1.3707 | 5066 | 0.2096 | - | - | | 1.3709 | 5067 | 
0.3435 | - | - | | 1.3712 | 5068 | 0.291 | - | - | | 1.3715 | 5069 | 0.1964 | - | - | | 1.3718 | 5070 | 0.2026 | - | - | | 1.3720 | 5071 | 0.2062 | - | - | | 1.3723 | 5072 | 0.2615 | - | - | | 1.3726 | 5073 | 0.2415 | - | - | | 1.3728 | 5074 | 0.2217 | - | - | | 1.3731 | 5075 | 0.2228 | - | - | | 1.3734 | 5076 | 0.2304 | - | - | | 1.3736 | 5077 | 0.228 | - | - | | 1.3739 | 5078 | 0.2661 | - | - | | 1.3742 | 5079 | 0.2405 | - | - | | 1.3745 | 5080 | 0.2048 | - | - | | 1.3747 | 5081 | 0.2776 | - | - | | 1.375 | 5082 | 0.2141 | - | - | | 1.3753 | 5083 | 0.2809 | - | - | | 1.3755 | 5084 | 0.216 | - | - | | 1.3758 | 5085 | 0.2866 | - | - | | 1.3761 | 5086 | 0.1854 | - | - | | 1.3764 | 5087 | 0.2929 | - | - | | 1.3766 | 5088 | 0.3298 | - | - | | 1.3769 | 5089 | 0.2484 | - | - | | 1.3772 | 5090 | 0.1623 | - | - | | 1.3774 | 5091 | 0.295 | - | - | | 1.3777 | 5092 | 0.1992 | - | - | | 1.3780 | 5093 | 0.3278 | - | - | | 1.3782 | 5094 | 0.1861 | - | - | | 1.3785 | 5095 | 0.2226 | - | - | | 1.3788 | 5096 | 0.2601 | - | - | | 1.3791 | 5097 | 0.2614 | - | - | | 1.3793 | 5098 | 0.2576 | - | - | | 1.3796 | 5099 | 0.2512 | - | - | | 1.3799 | 5100 | 0.2036 | - | - | | 1.3801 | 5101 | 0.2316 | - | - | | 1.3804 | 5102 | 0.2504 | - | - | | 1.3807 | 5103 | 0.2416 | - | - | | 1.3810 | 5104 | 0.3158 | - | - | | 1.3812 | 5105 | 0.1596 | - | - | | 1.3815 | 5106 | 0.2984 | - | - | | 1.3818 | 5107 | 0.2214 | - | - | | 1.3820 | 5108 | 0.2156 | - | - | | 1.3823 | 5109 | 0.1837 | - | - | | 1.3826 | 5110 | 0.1795 | - | - | | 1.3828 | 5111 | 0.2016 | - | - | | 1.3831 | 5112 | 0.2359 | - | - | | 1.3834 | 5113 | 0.2154 | - | - | | 1.3837 | 5114 | 0.1913 | - | - | | 1.3839 | 5115 | 0.2449 | - | - | | 1.3842 | 5116 | 0.2221 | - | - | | 1.3845 | 5117 | 0.2611 | - | - | | 1.3847 | 5118 | 0.2125 | - | - | | 1.3850 | 5119 | 0.2101 | - | - | | 1.3853 | 5120 | 0.3185 | - | - | | 1.3856 | 5121 | 0.218 | - | - | | 1.3858 | 5122 | 0.291 | - | - | | 1.3861 | 5123 | 0.2595 | - | - | | 1.3864 | 5124 | 0.2083 | - 
| - | | 1.3866 | 5125 | 0.2211 | - | - | | 1.3869 | 5126 | 0.2216 | - | - | | 1.3872 | 5127 | 0.228 | - | - | | 1.3874 | 5128 | 0.1919 | - | - | | 1.3877 | 5129 | 0.2208 | - | - | | 1.3880 | 5130 | 0.2132 | - | - | | 1.3883 | 5131 | 0.2049 | - | - | | 1.3885 | 5132 | 0.2007 | - | - | | 1.3888 | 5133 | 0.2459 | - | - | | 1.3891 | 5134 | 0.22 | - | - | | 1.3893 | 5135 | 0.2759 | - | - | | 1.3896 | 5136 | 0.1962 | - | - | | 1.3899 | 5137 | 0.1947 | - | - | | 1.3902 | 5138 | 0.2379 | - | - | | 1.3904 | 5139 | 0.2124 | - | - | | 1.3907 | 5140 | 0.2447 | - | - | | 1.3910 | 5141 | 0.2086 | - | - | | 1.3912 | 5142 | 0.2235 | - | - | | 1.3915 | 5143 | 0.1982 | - | - | | 1.3918 | 5144 | 0.2317 | - | - | | 1.3920 | 5145 | 0.2251 | - | - | | 1.3923 | 5146 | 0.2681 | - | - | | 1.3926 | 5147 | 0.1471 | - | - | | 1.3929 | 5148 | 0.1885 | - | - | | 1.3931 | 5149 | 0.2652 | - | - | | 1.3934 | 5150 | 0.2085 | - | - | | 1.3937 | 5151 | 0.1842 | - | - | | 1.3939 | 5152 | 0.2452 | - | - | | 1.3942 | 5153 | 0.1745 | - | - | | 1.3945 | 5154 | 0.304 | - | - | | 1.3948 | 5155 | 0.193 | - | - | | 1.3950 | 5156 | 0.2149 | - | - | | 1.3953 | 5157 | 0.1674 | - | - | | 1.3956 | 5158 | 0.2371 | - | - | | 1.3958 | 5159 | 0.2319 | - | - | | 1.3961 | 5160 | 0.2286 | - | - | | 1.3964 | 5161 | 0.2336 | - | - | | 1.3966 | 5162 | 0.1938 | - | - | | 1.3969 | 5163 | 0.1935 | - | - | | 1.3972 | 5164 | 0.2165 | - | - | | 1.3975 | 5165 | 0.1954 | - | - | | 1.3977 | 5166 | 0.2141 | - | - | | 1.3980 | 5167 | 0.2472 | - | - | | 1.3983 | 5168 | 0.2119 | - | - | | 1.3985 | 5169 | 0.1907 | - | - | | 1.3988 | 5170 | 0.2276 | - | - | | 1.3991 | 5171 | 0.2339 | - | - | | 1.3994 | 5172 | 0.2072 | - | - | | 1.3996 | 5173 | 0.1294 | - | - | | 1.3999 | 5174 | 0.2643 | - | - | | 1.4002 | 5175 | 0.2709 | - | - | | 1.4004 | 5176 | 0.2352 | - | - | | 1.4007 | 5177 | 0.257 | - | - | | 1.4010 | 5178 | 0.2103 | - | - | | 1.4012 | 5179 | 0.2949 | - | - | | 1.4015 | 5180 | 0.1964 | - | - | | 1.4018 | 5181 | 0.264 | - | - | | 
1.4021 | 5182 | 0.2009 | - | - | | 1.4023 | 5183 | 0.2388 | - | - | | 1.4026 | 5184 | 0.1475 | - | - | | 1.4029 | 5185 | 0.1255 | - | - | | 1.4031 | 5186 | 0.1971 | - | - | | 1.4034 | 5187 | 0.195 | - | - | | 1.4037 | 5188 | 0.1817 | - | - | | 1.4040 | 5189 | 0.3054 | - | - | | 1.4042 | 5190 | 0.2054 | - | - | | 1.4045 | 5191 | 0.2331 | - | - | | 1.4048 | 5192 | 0.1828 | - | - | | 1.4050 | 5193 | 0.2336 | - | - | | 1.4053 | 5194 | 0.2097 | - | - | | 1.4056 | 5195 | 0.1755 | - | - | | 1.4058 | 5196 | 0.2503 | - | - | | 1.4061 | 5197 | 0.3178 | - | - | | 1.4064 | 5198 | 0.2368 | - | - | | 1.4067 | 5199 | 0.1923 | - | - | | 1.4069 | 5200 | 0.2273 | - | - | | 1.4072 | 5201 | 0.2135 | - | - | | 1.4075 | 5202 | 0.2656 | - | - | | 1.4077 | 5203 | 0.3111 | - | - | | 1.4080 | 5204 | 0.2011 | - | - | | 1.4083 | 5205 | 0.2258 | - | - | | 1.4085 | 5206 | 0.2367 | - | - | | 1.4088 | 5207 | 0.3208 | - | - | | 1.4091 | 5208 | 0.2056 | - | - | | 1.4094 | 5209 | 0.2278 | - | - | | 1.4096 | 5210 | 0.2763 | - | - | | 1.4099 | 5211 | 0.2307 | - | - | | 1.4102 | 5212 | 0.2789 | - | - | | 1.4104 | 5213 | 0.2068 | - | - | | 1.4107 | 5214 | 0.2408 | - | - | | 1.4110 | 5215 | 0.2711 | - | - | | 1.4113 | 5216 | 0.2418 | - | - | | 1.4115 | 5217 | 0.2323 | - | - | | 1.4118 | 5218 | 0.1926 | - | - | | 1.4121 | 5219 | 0.2427 | - | - | | 1.4123 | 5220 | 0.3115 | - | - | | 1.4126 | 5221 | 0.1909 | - | - | | 1.4129 | 5222 | 0.1673 | - | - | | 1.4131 | 5223 | 0.1998 | - | - | | 1.4134 | 5224 | 0.2459 | - | - | | 1.4137 | 5225 | 0.2351 | - | - | | 1.4140 | 5226 | 0.2194 | - | - | | 1.4142 | 5227 | 0.2115 | - | - | | 1.4145 | 5228 | 0.1882 | - | - | | 1.4148 | 5229 | 0.1712 | - | - | | 1.4150 | 5230 | 0.2019 | - | - | | 1.4153 | 5231 | 0.2282 | - | - | | 1.4156 | 5232 | 0.1617 | - | - | | 1.4159 | 5233 | 0.2385 | - | - | | 1.4161 | 5234 | 0.2225 | - | - | | 1.4164 | 5235 | 0.3195 | - | - | | 1.4167 | 5236 | 0.1933 | - | - | | 1.4169 | 5237 | 0.2169 | - | - | | 1.4172 | 5238 | 0.2006 | - | - | | 
1.4175 | 5239 | 0.3048 | - | - | | 1.4177 | 5240 | 0.1791 | - | - | | 1.4180 | 5241 | 0.1464 | - | - | | 1.4183 | 5242 | 0.2363 | - | - | | 1.4186 | 5243 | 0.2308 | - | - | | 1.4188 | 5244 | 0.2458 | - | - | | 1.4191 | 5245 | 0.2943 | - | - | | 1.4194 | 5246 | 0.276 | - | - | | 1.4196 | 5247 | 0.2397 | - | - | | 1.4199 | 5248 | 0.2009 | - | - | | 1.4202 | 5249 | 0.2944 | - | - | | 1.4205 | 5250 | 0.2238 | - | - | | 1.4207 | 5251 | 0.2168 | - | - | | 1.4210 | 5252 | 0.3322 | - | - | | 1.4213 | 5253 | 0.2368 | - | - | | 1.4215 | 5254 | 0.3379 | - | - | | 1.4218 | 5255 | 0.2186 | - | - | | 1.4221 | 5256 | 0.2941 | - | - | | 1.4223 | 5257 | 0.1733 | - | - | | 1.4226 | 5258 | 0.2344 | - | - | | 1.4229 | 5259 | 0.2141 | - | - | | 1.4232 | 5260 | 0.2625 | - | - | | 1.4234 | 5261 | 0.1415 | - | - | | 1.4237 | 5262 | 0.2384 | - | - | | 1.4240 | 5263 | 0.2243 | - | - | | 1.4242 | 5264 | 0.2226 | - | - | | 1.4245 | 5265 | 0.2171 | - | - | | 1.4248 | 5266 | 0.2282 | - | - | | 1.4251 | 5267 | 0.2441 | - | - | | 1.4253 | 5268 | 0.2371 | - | - | | 1.4256 | 5269 | 0.3161 | - | - | | 1.4259 | 5270 | 0.1996 | - | - | | 1.4261 | 5271 | 0.2445 | - | - | | 1.4264 | 5272 | 0.1955 | - | - | | 1.4267 | 5273 | 0.2622 | - | - | | 1.4269 | 5274 | 0.2659 | - | - | | 1.4272 | 5275 | 0.1933 | - | - | | 1.4275 | 5276 | 0.2526 | - | - | | 1.4278 | 5277 | 0.2144 | - | - | | 1.4280 | 5278 | 0.1572 | - | - | | 1.4283 | 5279 | 0.3021 | - | - | | 1.4286 | 5280 | 0.3588 | - | - | | 1.4288 | 5281 | 0.2205 | - | - | | 1.4291 | 5282 | 0.1504 | - | - | | 1.4294 | 5283 | 0.2103 | - | - | | 1.4297 | 5284 | 0.2515 | - | - | | 1.4299 | 5285 | 0.294 | - | - | | 1.4302 | 5286 | 0.2311 | - | - | | 1.4305 | 5287 | 0.1943 | - | - | | 1.4307 | 5288 | 0.1687 | - | - | | 1.4310 | 5289 | 0.2403 | - | - | | 1.4313 | 5290 | 0.2119 | - | - | | 1.4315 | 5291 | 0.2107 | - | - | | 1.4318 | 5292 | 0.2155 | - | - | | 1.4321 | 5293 | 0.1913 | - | - | | 1.4324 | 5294 | 0.2126 | - | - | | 1.4326 | 5295 | 0.1739 | - | - | | 1.4329 
| 5296 | 0.178 | - | - | | 1.4332 | 5297 | 0.225 | - | - | | 1.4334 | 5298 | 0.2154 | - | - | | 1.4337 | 5299 | 0.1942 | - | - | | 1.4340 | 5300 | 0.1911 | - | - | | 1.4343 | 5301 | 0.3321 | - | - | | 1.4345 | 5302 | 0.1335 | - | - | | 1.4348 | 5303 | 0.2876 | - | - | | 1.4351 | 5304 | 0.2604 | - | - | | 1.4353 | 5305 | 0.4069 | - | - | | 1.4356 | 5306 | 0.2971 | - | - | | 1.4359 | 5307 | 0.1237 | - | - | | 1.4361 | 5308 | 0.2108 | - | - | | 1.4364 | 5309 | 0.2196 | - | - | | 1.4367 | 5310 | 0.2589 | - | - | | 1.4370 | 5311 | 0.2225 | - | - | | 1.4372 | 5312 | 0.1518 | - | - | | 1.4375 | 5313 | 0.1994 | - | - | | 1.4378 | 5314 | 0.2618 | - | - | | 1.4380 | 5315 | 0.2631 | - | - | | 1.4383 | 5316 | 0.2848 | - | - | | 1.4386 | 5317 | 0.1976 | - | - | | 1.4389 | 5318 | 0.1904 | - | - | | 1.4391 | 5319 | 0.2682 | - | - | | 1.4394 | 5320 | 0.2684 | - | - | | 1.4397 | 5321 | 0.2046 | - | - | | 1.4399 | 5322 | 0.2468 | - | - | | 1.4402 | 5323 | 0.266 | - | - | | 1.4405 | 5324 | 0.3248 | - | - | | 1.4407 | 5325 | 0.2327 | - | - | | 1.4410 | 5326 | 0.2999 | - | - | | 1.4413 | 5327 | 0.2046 | - | - | | 1.4416 | 5328 | 0.2642 | - | - | | 1.4418 | 5329 | 0.2467 | - | - | | 1.4421 | 5330 | 0.2366 | - | - | | 1.4424 | 5331 | 0.1989 | - | - | | 1.4426 | 5332 | 0.1948 | - | - | | 1.4429 | 5333 | 0.1909 | - | - | | 1.4432 | 5334 | 0.1856 | - | - | | 1.4435 | 5335 | 0.2216 | - | - | | 1.4437 | 5336 | 0.3236 | - | - | | 1.4440 | 5337 | 0.2564 | - | - | | 1.4443 | 5338 | 0.1649 | - | - | | 1.4445 | 5339 | 0.2289 | - | - | | 1.4448 | 5340 | 0.249 | - | - | | 1.4451 | 5341 | 0.2271 | - | - | | 1.4453 | 5342 | 0.2028 | - | - | | 1.4456 | 5343 | 0.2056 | - | - | | 1.4459 | 5344 | 0.2591 | - | - | | 1.4462 | 5345 | 0.2292 | - | - | | 1.4464 | 5346 | 0.1978 | - | - | | 1.4467 | 5347 | 0.1832 | - | - | | 1.4470 | 5348 | 0.2547 | - | - | | 1.4472 | 5349 | 0.2643 | - | - | | 1.4475 | 5350 | 0.27 | - | - | | 1.4478 | 5351 | 0.1783 | - | - | | 1.4481 | 5352 | 0.1787 | - | - | | 1.4483 | 5353 | 
0.2475 | - | - | | 1.4486 | 5354 | 0.2057 | - | - | | 1.4489 | 5355 | 0.1877 | - | - | | 1.4491 | 5356 | 0.2339 | - | - | | 1.4494 | 5357 | 0.2221 | - | - | | 1.4497 | 5358 | 0.3029 | - | - | | 1.4499 | 5359 | 0.2373 | - | - | | 1.4502 | 5360 | 0.2807 | - | - | | 1.4505 | 5361 | 0.1765 | - | - | | 1.4508 | 5362 | 0.1781 | - | - | | 1.4510 | 5363 | 0.2245 | - | - | | 1.4513 | 5364 | 0.2205 | - | - | | 1.4516 | 5365 | 0.1775 | - | - | | 1.4518 | 5366 | 0.2405 | - | - | | 1.4521 | 5367 | 0.1747 | - | - | | 1.4524 | 5368 | 0.2657 | - | - | | 1.4527 | 5369 | 0.2094 | - | - | | 1.4529 | 5370 | 0.2284 | - | - | | 1.4532 | 5371 | 0.2452 | - | - | | 1.4535 | 5372 | 0.2129 | - | - | | 1.4537 | 5373 | 0.2264 | - | - | | 1.4540 | 5374 | 0.159 | - | - | | 1.4543 | 5375 | 0.19 | - | - | | 1.4545 | 5376 | 0.2293 | - | - | | 1.4548 | 5377 | 0.2302 | - | - | | 1.4551 | 5378 | 0.2329 | - | - | | 1.4554 | 5379 | 0.2037 | - | - | | 1.4556 | 5380 | 0.2522 | - | - | | 1.4559 | 5381 | 0.253 | - | - | | 1.4562 | 5382 | 0.142 | - | - | | 1.4564 | 5383 | 0.2007 | - | - | | 1.4567 | 5384 | 0.2116 | - | - | | 1.4570 | 5385 | 0.2295 | - | - | | 1.4573 | 5386 | 0.1442 | - | - | | 1.4575 | 5387 | 0.2774 | - | - | | 1.4578 | 5388 | 0.1828 | - | - | | 1.4581 | 5389 | 0.3095 | - | - | | 1.4583 | 5390 | 0.2263 | - | - | | 1.4586 | 5391 | 0.2406 | - | - | | 1.4589 | 5392 | 0.1606 | - | - | | 1.4591 | 5393 | 0.2357 | - | - | | 1.4594 | 5394 | 0.3516 | - | - | | 1.4597 | 5395 | 0.2343 | - | - | | 1.4600 | 5396 | 0.2449 | - | - | | 1.4602 | 5397 | 0.1651 | - | - | | 1.4605 | 5398 | 0.1712 | - | - | | 1.4608 | 5399 | 0.1759 | - | - | | 1.4610 | 5400 | 0.3316 | - | - | | 1.4613 | 5401 | 0.2098 | - | - | | 1.4616 | 5402 | 0.2696 | - | - | | 1.4619 | 5403 | 0.1761 | - | - | | 1.4621 | 5404 | 0.1489 | - | - | | 1.4624 | 5405 | 0.1945 | - | - | | 1.4627 | 5406 | 0.2091 | - | - | | 1.4629 | 5407 | 0.293 | - | - | | 1.4632 | 5408 | 0.2374 | - | - | | 1.4635 | 5409 | 0.1891 | - | - | | 1.4637 | 5410 | 0.2865 | - 
| - | | 1.4640 | 5411 | 0.1804 | - | - | | 1.4643 | 5412 | 0.2538 | - | - | | 1.4646 | 5413 | 0.2151 | - | - | | 1.4648 | 5414 | 0.2095 | - | - | | 1.4651 | 5415 | 0.1414 | - | - | | 1.4654 | 5416 | 0.244 | - | - | | 1.4656 | 5417 | 0.2275 | - | - | | 1.4659 | 5418 | 0.181 | - | - | | 1.4662 | 5419 | 0.221 | - | - | | 1.4665 | 5420 | 0.2338 | - | - | | 1.4667 | 5421 | 0.2677 | - | - | | 1.4670 | 5422 | 0.2174 | - | - | | 1.4673 | 5423 | 0.1827 | - | - | | 1.4675 | 5424 | 0.2083 | - | - | | 1.4678 | 5425 | 0.1838 | - | - | | 1.4681 | 5426 | 0.2313 | - | - | | 1.4683 | 5427 | 0.3292 | - | - | | 1.4686 | 5428 | 0.2552 | - | - | | 1.4689 | 5429 | 0.2097 | - | - | | 1.4692 | 5430 | 0.2113 | - | - | | 1.4694 | 5431 | 0.1731 | - | - | | 1.4697 | 5432 | 0.2338 | - | - | | 1.4700 | 5433 | 0.3219 | - | - | | 1.4702 | 5434 | 0.1768 | - | - | | 1.4705 | 5435 | 0.2597 | - | - | | 1.4708 | 5436 | 0.1806 | - | - | | 1.4710 | 5437 | 0.2821 | - | - | | 1.4713 | 5438 | 0.372 | - | - | | 1.4716 | 5439 | 0.2756 | - | - | | 1.4719 | 5440 | 0.2026 | - | - | | 1.4721 | 5441 | 0.2128 | - | - | | 1.4724 | 5442 | 0.1998 | - | - | | 1.4727 | 5443 | 0.2317 | - | - | | 1.4729 | 5444 | 0.2427 | - | - | | 1.4732 | 5445 | 0.2575 | - | - | | 1.4735 | 5446 | 0.233 | - | - | | 1.4738 | 5447 | 0.3004 | - | - | | 1.4740 | 5448 | 0.2432 | - | - | | 1.4743 | 5449 | 0.2577 | - | - | | 1.4746 | 5450 | 0.2081 | - | - | | 1.4748 | 5451 | 0.2063 | - | - | | 1.4751 | 5452 | 0.3232 | - | - | | 1.4754 | 5453 | 0.1869 | - | - | | 1.4756 | 5454 | 0.1423 | - | - | | 1.4759 | 5455 | 0.1559 | - | - | | 1.4762 | 5456 | 0.2014 | - | - | | 1.4765 | 5457 | 0.2138 | - | - | | 1.4767 | 5458 | 0.2259 | - | - | | 1.4770 | 5459 | 0.2196 | - | - | | 1.4773 | 5460 | 0.2209 | - | - | | 1.4775 | 5461 | 0.3369 | - | - | | 1.4778 | 5462 | 0.2625 | - | - | | 1.4781 | 5463 | 0.1662 | - | - | | 1.4784 | 5464 | 0.2073 | - | - | | 1.4786 | 5465 | 0.1871 | - | - | | 1.4789 | 5466 | 0.2259 | - | - | | 1.4792 | 5467 | 0.2644 | - | - | | 
1.4794 | 5468 | 0.2084 | - | - | | 1.4797 | 5469 | 0.1911 | - | - | | 1.4800 | 5470 | 0.3248 | - | - | | 1.4802 | 5471 | 0.1612 | - | - | | 1.4805 | 5472 | 0.3163 | - | - | | 1.4808 | 5473 | 0.167 | - | - | | 1.4811 | 5474 | 0.1923 | - | - | | 1.4813 | 5475 | 0.3397 | - | - | | 1.4816 | 5476 | 0.2408 | - | - | | 1.4819 | 5477 | 0.1998 | - | - | | 1.4821 | 5478 | 0.207 | - | - | | 1.4824 | 5479 | 0.3086 | - | - | | 1.4827 | 5480 | 0.1624 | - | - | | 1.4830 | 5481 | 0.2752 | - | - | | 1.4832 | 5482 | 0.2334 | - | - | | 1.4835 | 5483 | 0.1901 | - | - | | 1.4838 | 5484 | 0.2568 | - | - | | 1.4840 | 5485 | 0.2489 | - | - | | 1.4843 | 5486 | 0.2886 | - | - | | 1.4846 | 5487 | 0.3219 | - | - | | 1.4848 | 5488 | 0.1975 | - | - | | 1.4851 | 5489 | 0.1945 | - | - | | 1.4854 | 5490 | 0.1989 | - | - | | 1.4857 | 5491 | 0.2388 | - | - | | 1.4859 | 5492 | 0.1777 | - | - | | 1.4862 | 5493 | 0.2774 | - | - | | 1.4865 | 5494 | 0.1815 | - | - | | 1.4867 | 5495 | 0.2921 | - | - | | 1.4870 | 5496 | 0.1676 | - | - | | 1.4873 | 5497 | 0.1916 | - | - | | 1.4876 | 5498 | 0.2192 | - | - | | 1.4878 | 5499 | 0.2492 | - | - | | 1.4881 | 5500 | 0.2286 | - | - | | 1.4884 | 5501 | 0.2974 | - | - | | 1.4886 | 5502 | 0.1951 | - | - | | 1.4889 | 5503 | 0.2977 | - | - | | 1.4892 | 5504 | 0.2179 | - | - | | 1.4894 | 5505 | 0.2211 | - | - | | 1.4897 | 5506 | 0.2143 | - | - | | 1.4900 | 5507 | 0.2175 | - | - | | 1.4903 | 5508 | 0.1944 | - | - | | 1.4905 | 5509 | 0.2832 | - | - | | 1.4908 | 5510 | 0.2015 | - | - | | 1.4911 | 5511 | 0.2478 | - | - | | 1.4913 | 5512 | 0.2564 | - | - | | 1.4916 | 5513 | 0.1937 | - | - | | 1.4919 | 5514 | 0.2878 | - | - | | 1.4922 | 5515 | 0.222 | - | - | | 1.4924 | 5516 | 0.2924 | - | - | | 1.4927 | 5517 | 0.2447 | - | - | | 1.4930 | 5518 | 0.2284 | - | - | | 1.4932 | 5519 | 0.2322 | - | - | | 1.4935 | 5520 | 0.1363 | - | - | | 1.4938 | 5521 | 0.2156 | - | - | | 1.4940 | 5522 | 0.2647 | - | - | | 1.4943 | 5523 | 0.3007 | - | - | | 1.4946 | 5524 | 0.2893 | - | - | | 1.4949 
| 5525 | 0.2801 | - | - | | 1.4951 | 5526 | 0.2177 | - | - | | 1.4954 | 5527 | 0.1799 | - | - | | 1.4957 | 5528 | 0.2098 | - | - | | 1.4959 | 5529 | 0.2221 | - | - | | 1.4962 | 5530 | 0.2285 | - | - | | 1.4965 | 5531 | 0.2108 | - | - | | 1.4968 | 5532 | 0.2639 | - | - | | 1.4970 | 5533 | 0.2495 | - | - | | 1.4973 | 5534 | 0.2223 | - | - | | 1.4976 | 5535 | 0.2637 | - | - | | 1.4978 | 5536 | 0.214 | - | - | | 1.4981 | 5537 | 0.22 | - | - | | 1.4984 | 5538 | 0.2689 | - | - | | 1.4986 | 5539 | 0.191 | - | - | | 1.4989 | 5540 | 0.2049 | - | - | | 1.4992 | 5541 | 0.1735 | - | - | | 1.4995 | 5542 | 0.2252 | - | - | | 1.4997 | 5543 | 0.2629 | - | - | | 1.5 | 5544 | 0.2102 | - | - | | 1.5003 | 5545 | 0.1566 | - | - | | 1.5005 | 5546 | 0.2044 | - | - | | 1.5008 | 5547 | 0.1841 | - | - | | 1.5011 | 5548 | 0.2714 | - | - | | 1.5014 | 5549 | 0.1354 | - | - | | 1.5016 | 5550 | 0.1657 | - | - | | 1.5019 | 5551 | 0.1657 | - | - | | 1.5022 | 5552 | 0.1454 | - | - | | 1.5024 | 5553 | 0.1856 | - | - | | 1.5027 | 5554 | 0.2391 | - | - | | 1.5030 | 5555 | 0.1601 | - | - | | 1.5032 | 5556 | 0.2047 | - | - | | 1.5035 | 5557 | 0.2834 | - | - | | 1.5038 | 5558 | 0.238 | - | - | | 1.5041 | 5559 | 0.2363 | - | - | | 1.5043 | 5560 | 0.2745 | - | - | | 1.5046 | 5561 | 0.2245 | - | - | | 1.5049 | 5562 | 0.2493 | - | - | | 1.5051 | 5563 | 0.2406 | - | - | | 1.5054 | 5564 | 0.1992 | - | - | | 1.5057 | 5565 | 0.1981 | - | - | | 1.5060 | 5566 | 0.1514 | - | - | | 1.5062 | 5567 | 0.2475 | - | - | | 1.5065 | 5568 | 0.2874 | - | - | | 1.5068 | 5569 | 0.1998 | - | - | | 1.5070 | 5570 | 0.2299 | - | - | | 1.5073 | 5571 | 0.244 | - | - | | 1.5076 | 5572 | 0.2278 | - | - | | 1.5078 | 5573 | 0.3185 | - | - | | 1.5081 | 5574 | 0.2127 | - | - | | 1.5084 | 5575 | 0.2502 | - | - | | 1.5087 | 5576 | 0.2776 | - | - | | 1.5089 | 5577 | 0.2142 | - | - | | 1.5092 | 5578 | 0.1572 | - | - | | 1.5095 | 5579 | 0.2408 | - | - | | 1.5097 | 5580 | 0.201 | - | - | | 1.5100 | 5581 | 0.1616 | - | - | | 1.5103 | 5582 | 
0.2866 | - | - | | 1.5106 | 5583 | 0.1576 | - | - | | 1.5108 | 5584 | 0.2119 | - | - | | 1.5111 | 5585 | 0.204 | - | - | | 1.5114 | 5586 | 0.263 | - | - | | 1.5116 | 5587 | 0.2022 | - | - | | 1.5119 | 5588 | 0.1391 | - | - | | 1.5122 | 5589 | 0.2201 | - | - | | 1.5124 | 5590 | 0.1976 | - | - | | 1.5127 | 5591 | 0.1972 | - | - | | 1.5130 | 5592 | 0.233 | - | - | | 1.5133 | 5593 | 0.2639 | - | - | | 1.5135 | 5594 | 0.249 | - | - | | 1.5138 | 5595 | 0.2755 | - | - | | 1.5141 | 5596 | 0.2411 | - | - | | 1.5143 | 5597 | 0.2186 | - | - | | 1.5146 | 5598 | 0.207 | - | - | | 1.5149 | 5599 | 0.2445 | - | - | | 1.5152 | 5600 | 0.2628 | - | - | | 1.5154 | 5601 | 0.2048 | - | - | | 1.5157 | 5602 | 0.1756 | - | - | | 1.5160 | 5603 | 0.1511 | - | - | | 1.5162 | 5604 | 0.2026 | - | - | | 1.5165 | 5605 | 0.1425 | - | - | | 1.5168 | 5606 | 0.2618 | - | - | | 1.5170 | 5607 | 0.2489 | - | - | | 1.5173 | 5608 | 0.2506 | - | - | | 1.5176 | 5609 | 0.2139 | - | - | | 1.5179 | 5610 | 0.2732 | - | - | | 1.5181 | 5611 | 0.2087 | - | - | | 1.5184 | 5612 | 0.2537 | - | - | | 1.5187 | 5613 | 0.2823 | - | - | | 1.5189 | 5614 | 0.1433 | - | - | | 1.5192 | 5615 | 0.2443 | - | - | | 1.5195 | 5616 | 0.2894 | - | - | | 1.5198 | 5617 | 0.2643 | - | - | | 1.5200 | 5618 | 0.1721 | - | - | | 1.5203 | 5619 | 0.2372 | - | - | | 1.5206 | 5620 | 0.1669 | - | - | | 1.5208 | 5621 | 0.2635 | - | - | | 1.5211 | 5622 | 0.196 | - | - | | 1.5214 | 5623 | 0.3238 | - | - | | 1.5216 | 5624 | 0.2018 | - | - | | 1.5219 | 5625 | 0.2176 | - | - | | 1.5222 | 5626 | 0.2485 | - | - | | 1.5225 | 5627 | 0.2026 | - | - | | 1.5227 | 5628 | 0.1769 | - | - | | 1.5230 | 5629 | 0.1424 | - | - | | 1.5233 | 5630 | 0.3039 | - | - | | 1.5235 | 5631 | 0.1787 | - | - | | 1.5238 | 5632 | 0.215 | - | - | | 1.5241 | 5633 | 0.2294 | - | - | | 1.5244 | 5634 | 0.2925 | - | - | | 1.5246 | 5635 | 0.2316 | - | - | | 1.5249 | 5636 | 0.2126 | - | - | | 1.5252 | 5637 | 0.2731 | - | - | | 1.5254 | 5638 | 0.2182 | - | - | | 1.5257 | 5639 | 0.2085 | - 
| - | | 1.5260 | 5640 | 0.2146 | - | - | | 1.5262 | 5641 | 0.1879 | - | - | | 1.5265 | 5642 | 0.2003 | - | - | | 1.5268 | 5643 | 0.2096 | - | - | | 1.5271 | 5644 | 0.175 | - | - | | 1.5273 | 5645 | 0.2619 | - | - | | 1.5276 | 5646 | 0.2154 | - | - | | 1.5279 | 5647 | 0.176 | - | - | | 1.5281 | 5648 | 0.2324 | - | - | | 1.5284 | 5649 | 0.1846 | - | - | | 1.5287 | 5650 | 0.2001 | - | - | | 1.5290 | 5651 | 0.1675 | - | - | | 1.5292 | 5652 | 0.1728 | - | - | | 1.5295 | 5653 | 0.278 | - | - | | 1.5298 | 5654 | 0.2801 | - | - | | 1.5300 | 5655 | 0.2838 | - | - | | 1.5303 | 5656 | 0.211 | - | - | | 1.5306 | 5657 | 0.2206 | - | - | | 1.5308 | 5658 | 0.226 | - | - | | 1.5311 | 5659 | 0.1446 | - | - | | 1.5314 | 5660 | 0.2313 | - | - | | 1.5317 | 5661 | 0.3117 | - | - | | 1.5319 | 5662 | 0.2354 | - | - | | 1.5322 | 5663 | 0.282 | - | - | | 1.5325 | 5664 | 0.1901 | - | - | | 1.5327 | 5665 | 0.2348 | - | - | | 1.5330 | 5666 | 0.2231 | - | - | | 1.5333 | 5667 | 0.1953 | - | - | | 1.5335 | 5668 | 0.2816 | - | - | | 1.5338 | 5669 | 0.2178 | - | - | | 1.5341 | 5670 | 0.241 | - | - | | 1.5344 | 5671 | 0.2126 | - | - | | 1.5346 | 5672 | 0.2098 | - | - | | 1.5349 | 5673 | 0.2801 | - | - | | 1.5352 | 5674 | 0.2055 | - | - | | 1.5354 | 5675 | 0.2021 | - | - | | 1.5357 | 5676 | 0.1739 | - | - | | 1.5360 | 5677 | 0.2332 | - | - | | 1.5363 | 5678 | 0.227 | - | - | | 1.5365 | 5679 | 0.268 | - | - | | 1.5368 | 5680 | 0.2668 | - | - | | 1.5371 | 5681 | 0.2066 | - | - | | 1.5373 | 5682 | 0.4161 | - | - | | 1.5376 | 5683 | 0.1861 | - | - | | 1.5379 | 5684 | 0.312 | - | - | | 1.5381 | 5685 | 0.2436 | - | - | | 1.5384 | 5686 | 0.251 | - | - | | 1.5387 | 5687 | 0.2195 | - | - | | 1.5390 | 5688 | 0.1934 | - | - | | 1.5392 | 5689 | 0.2052 | - | - | | 1.5395 | 5690 | 0.1954 | - | - | | 1.5398 | 5691 | 0.2338 | - | - | | 1.5400 | 5692 | 0.1491 | - | - | | 1.5403 | 5693 | 0.1914 | - | - | | 1.5406 | 5694 | 0.282 | - | - | | 1.5409 | 5695 | 0.1916 | - | - | | 1.5411 | 5696 | 0.172 | - | - | | 1.5414 | 
5697 | 0.289 | - | - | | 1.5417 | 5698 | 0.1691 | - | - | | 1.5419 | 5699 | 0.1604 | - | - | | 1.5422 | 5700 | 0.2124 | - | - | | 1.5425 | 5701 | 0.202 | - | - | | 1.5427 | 5702 | 0.2348 | - | - | | 1.5430 | 5703 | 0.2316 | - | - | | 1.5433 | 5704 | 0.2235 | - | - | | 1.5436 | 5705 | 0.2457 | - | - | | 1.5438 | 5706 | 0.2502 | - | - | | 1.5441 | 5707 | 0.2497 | - | - | | 1.5444 | 5708 | 0.222 | - | - | | 1.5446 | 5709 | 0.2358 | - | - | | 1.5449 | 5710 | 0.1897 | - | - | | 1.5452 | 5711 | 0.2342 | - | - | | 1.5455 | 5712 | 0.215 | - | - | | 1.5457 | 5713 | 0.1977 | - | - | | 1.5460 | 5714 | 0.2309 | - | - | | 1.5463 | 5715 | 0.1643 | - | - | | 1.5465 | 5716 | 0.1577 | - | - | | 1.5468 | 5717 | 0.289 | - | - | | 1.5471 | 5718 | 0.2148 | - | - | | 1.5473 | 5719 | 0.2683 | - | - | | 1.5476 | 5720 | 0.2271 | - | - | | 1.5479 | 5721 | 0.2025 | - | - | | 1.5482 | 5722 | 0.2214 | - | - | | 1.5484 | 5723 | 0.2657 | - | - | | 1.5487 | 5724 | 0.1977 | - | - | | 1.5490 | 5725 | 0.2107 | - | - | | 1.5492 | 5726 | 0.2138 | - | - | | 1.5495 | 5727 | 0.2628 | - | - | | 1.5498 | 5728 | 0.2392 | - | - | | 1.5501 | 5729 | 0.2544 | - | - | | 1.5503 | 5730 | 0.1518 | - | - | | 1.5506 | 5731 | 0.1843 | - | - | | 1.5509 | 5732 | 0.2203 | - | - | | 1.5511 | 5733 | 0.1936 | - | - | | 1.5514 | 5734 | 0.1777 | - | - | | 1.5517 | 5735 | 0.1526 | - | - | | 1.5519 | 5736 | 0.2415 | - | - | | 1.5522 | 5737 | 0.2292 | - | - | | 1.5525 | 5738 | 0.2241 | - | - | | 1.5528 | 5739 | 0.2294 | - | - | | 1.5530 | 5740 | 0.2505 | - | - | | 1.5533 | 5741 | 0.2414 | - | - | | 1.5536 | 5742 | 0.248 | - | - | | 1.5538 | 5743 | 0.2055 | - | - | | 1.5541 | 5744 | 0.1775 | - | - | | 1.5544 | 5745 | 0.2609 | - | - | | 1.5547 | 5746 | 0.3636 | - | - | | 1.5549 | 5747 | 0.2204 | - | - | | 1.5552 | 5748 | 0.2022 | - | - | | 1.5555 | 5749 | 0.2075 | - | - | | 1.5557 | 5750 | 0.2271 | - | - | | 1.5560 | 5751 | 0.2137 | - | - | | 1.5563 | 5752 | 0.2159 | - | - | | 1.5565 | 5753 | 0.3304 | - | - | | 1.5568 | 5754 | 
0.2406 | - | - | | 1.5571 | 5755 | 0.2436 | - | - | | 1.5574 | 5756 | 0.2351 | - | - | | 1.5576 | 5757 | 0.2258 | - | - | | 1.5579 | 5758 | 0.2615 | - | - | | 1.5582 | 5759 | 0.1605 | - | - | | 1.5584 | 5760 | 0.3292 | - | - | | 1.5587 | 5761 | 0.2382 | - | - | | 1.5590 | 5762 | 0.204 | - | - | | 1.5593 | 5763 | 0.1622 | - | - | | 1.5595 | 5764 | 0.2051 | - | - | | 1.5598 | 5765 | 0.1384 | - | - | | 1.5601 | 5766 | 0.2148 | - | - | | 1.5603 | 5767 | 0.1852 | - | - | | 1.5606 | 5768 | 0.2015 | - | - | | 1.5609 | 5769 | 0.1934 | - | - | | 1.5611 | 5770 | 0.2636 | - | - | | 1.5614 | 5771 | 0.2743 | - | - | | 1.5617 | 5772 | 0.2725 | - | - | | 1.5620 | 5773 | 0.2293 | - | - | | 1.5622 | 5774 | 0.1853 | - | - | | 1.5625 | 5775 | 0.1817 | - | - | | 1.5628 | 5776 | 0.2906 | - | - | | 1.5630 | 5777 | 0.2522 | - | - | | 1.5633 | 5778 | 0.1882 | - | - | | 1.5636 | 5779 | 0.1826 | - | - | | 1.5639 | 5780 | 0.2591 | - | - | | 1.5641 | 5781 | 0.1828 | - | - | | 1.5644 | 5782 | 0.1561 | - | - | | 1.5647 | 5783 | 0.2806 | - | - | | 1.5649 | 5784 | 0.2966 | - | - | | 1.5652 | 5785 | 0.1887 | - | - | | 1.5655 | 5786 | 0.1605 | - | - | | 1.5657 | 5787 | 0.1726 | - | - | | 1.5660 | 5788 | 0.2697 | - | - | | 1.5663 | 5789 | 0.1976 | - | - | | 1.5666 | 5790 | 0.1764 | - | - | | 1.5668 | 5791 | 0.2297 | - | - | | 1.5671 | 5792 | 0.2659 | - | - | | 1.5674 | 5793 | 0.2151 | - | - | | 1.5676 | 5794 | 0.1664 | - | - | | 1.5679 | 5795 | 0.3114 | - | - | | 1.5682 | 5796 | 0.2384 | - | - | | 1.5685 | 5797 | 0.2387 | - | - | | 1.5687 | 5798 | 0.2227 | - | - | | 1.5690 | 5799 | 0.1869 | - | - | | 1.5693 | 5800 | 0.1932 | - | - | | 1.5695 | 5801 | 0.298 | - | - | | 1.5698 | 5802 | 0.1852 | - | - | | 1.5701 | 5803 | 0.1725 | - | - | | 1.5703 | 5804 | 0.2377 | - | - | | 1.5706 | 5805 | 0.1853 | - | - | | 1.5709 | 5806 | 0.1947 | - | - | | 1.5712 | 5807 | 0.3128 | - | - | | 1.5714 | 5808 | 0.2036 | - | - | | 1.5717 | 5809 | 0.2427 | - | - | | 1.5720 | 5810 | 0.2277 | - | - | | 1.5722 | 5811 | 0.2449 
| - | - | | 1.5725 | 5812 | 0.2723 | - | - | | 1.5728 | 5813 | 0.3115 | - | - | | 1.5731 | 5814 | 0.2655 | - | - | | 1.5733 | 5815 | 0.1823 | - | - | | 1.5736 | 5816 | 0.236 | - | - | | 1.5739 | 5817 | 0.2131 | - | - | | 1.5741 | 5818 | 0.2687 | - | - | | 1.5744 | 5819 | 0.1882 | - | - | | 1.5747 | 5820 | 0.1774 | - | - | | 1.5749 | 5821 | 0.2733 | - | - | | 1.5752 | 5822 | 0.1519 | - | - | | 1.5755 | 5823 | 0.1721 | - | - | | 1.5758 | 5824 | 0.2119 | - | - | | 1.5760 | 5825 | 0.2362 | - | - | | 1.5763 | 5826 | 0.1575 | - | - | | 1.5766 | 5827 | 0.1819 | - | - | | 1.5768 | 5828 | 0.1981 | - | - | | 1.5771 | 5829 | 0.2519 | - | - | | 1.5774 | 5830 | 0.2369 | - | - | | 1.5777 | 5831 | 0.2152 | - | - | | 1.5779 | 5832 | 0.1947 | - | - | | 1.5782 | 5833 | 0.2859 | - | - | | 1.5785 | 5834 | 0.2267 | - | - | | 1.5787 | 5835 | 0.1779 | - | - | | 1.5790 | 5836 | 0.2361 | - | - | | 1.5793 | 5837 | 0.2322 | - | - | | 1.5795 | 5838 | 0.1774 | - | - | | 1.5798 | 5839 | 0.2611 | - | - | | 1.5801 | 5840 | 0.1935 | - | - | | 1.5804 | 5841 | 0.3059 | - | - | | 1.5806 | 5842 | 0.2166 | - | - | | 1.5809 | 5843 | 0.2336 | - | - | | 1.5812 | 5844 | 0.148 | - | - | | 1.5814 | 5845 | 0.2321 | - | - | | 1.5817 | 5846 | 0.1749 | - | - | | 1.5820 | 5847 | 0.2919 | - | - | | 1.5823 | 5848 | 0.1656 | - | - | | 1.5825 | 5849 | 0.1959 | - | - | | 1.5828 | 5850 | 0.2079 | - | - | | 1.5831 | 5851 | 0.1579 | - | - | | 1.5833 | 5852 | 0.2353 | - | - | | 1.5836 | 5853 | 0.2249 | - | - | | 1.5839 | 5854 | 0.3148 | - | - | | 1.5841 | 5855 | 0.2036 | - | - | | 1.5844 | 5856 | 0.1638 | - | - | | 1.5847 | 5857 | 0.117 | - | - | | 1.5850 | 5858 | 0.1716 | - | - | | 1.5852 | 5859 | 0.2492 | - | - | | 1.5855 | 5860 | 0.1306 | - | - | | 1.5858 | 5861 | 0.1592 | - | - | | 1.5860 | 5862 | 0.2198 | - | - | | 1.5863 | 5863 | 0.3247 | - | - | | 1.5866 | 5864 | 0.1847 | - | - | | 1.5869 | 5865 | 0.2123 | - | - | | 1.5871 | 5866 | 0.2332 | - | - | | 1.5874 | 5867 | 0.1944 | - | - | | 1.5877 | 5868 | 0.2601 | - | - 
| | 1.5879 | 5869 | 0.215 | - | - | | 1.5882 | 5870 | 0.2483 | - | - | | 1.5885 | 5871 | 0.2776 | - | - | | 1.5887 | 5872 | 0.218 | - | - | | 1.5890 | 5873 | 0.1927 | - | - | | 1.5893 | 5874 | 0.229 | - | - | | 1.5896 | 5875 | 0.2886 | - | - | | 1.5898 | 5876 | 0.2312 | - | - | | 1.5901 | 5877 | 0.2287 | - | - | | 1.5904 | 5878 | 0.1867 | - | - | | 1.5906 | 5879 | 0.2697 | - | - | | 1.5909 | 5880 | 0.2966 | - | - | | 1.5912 | 5881 | 0.197 | - | - | | 1.5915 | 5882 | 0.2262 | - | - | | 1.5917 | 5883 | 0.1997 | - | - | | 1.5920 | 5884 | 0.1794 | - | - | | 1.5923 | 5885 | 0.2869 | - | - | | 1.5925 | 5886 | 0.2338 | - | - | | 1.5928 | 5887 | 0.2015 | - | - | | 1.5931 | 5888 | 0.2373 | - | - | | 1.5933 | 5889 | 0.2519 | - | - | | 1.5936 | 5890 | 0.2094 | - | - | | 1.5939 | 5891 | 0.2352 | - | - | | 1.5942 | 5892 | 0.259 | - | - | | 1.5944 | 5893 | 0.2151 | - | - | | 1.5947 | 5894 | 0.1912 | - | - | | 1.5950 | 5895 | 0.193 | - | - | | 1.5952 | 5896 | 0.1973 | - | - | | 1.5955 | 5897 | 0.2038 | - | - | | 1.5958 | 5898 | 0.254 | - | - | | 1.5960 | 5899 | 0.255 | - | - | | 1.5963 | 5900 | 0.1476 | - | - | | 1.5966 | 5901 | 0.2964 | - | - | | 1.5969 | 5902 | 0.2257 | - | - | | 1.5971 | 5903 | 0.2599 | - | - | | 1.5974 | 5904 | 0.275 | - | - | | 1.5977 | 5905 | 0.1732 | - | - | | 1.5979 | 5906 | 0.231 | - | - | | 1.5982 | 5907 | 0.2106 | - | - | | 1.5985 | 5908 | 0.1838 | - | - | | 1.5988 | 5909 | 0.1461 | - | - | | 1.5990 | 5910 | 0.195 | - | - | | 1.5993 | 5911 | 0.2678 | - | - | | 1.5996 | 5912 | 0.2305 | - | - | | 1.5998 | 5913 | 0.2233 | - | - | | 1.6001 | 5914 | 0.2101 | - | - | | 1.6004 | 5915 | 0.2185 | - | - | | 1.6006 | 5916 | 0.2099 | - | - | | 1.6009 | 5917 | 0.2463 | - | - | | 1.6012 | 5918 | 0.2109 | - | - | | 1.6015 | 5919 | 0.208 | - | - | | 1.6017 | 5920 | 0.3242 | - | - | | 1.6020 | 5921 | 0.2048 | - | - | | 1.6023 | 5922 | 0.2457 | - | - | | 1.6025 | 5923 | 0.2338 | - | - | | 1.6028 | 5924 | 0.2931 | - | - | | 1.6031 | 5925 | 0.1429 | - | - | | 1.6034 | 
5926 | 0.2233 | - | - | | 1.6036 | 5927 | 0.2474 | - | - | | 1.6039 | 5928 | 0.1739 | - | - | | 1.6042 | 5929 | 0.3097 | - | - | | 1.6044 | 5930 | 0.2466 | - | - | | 1.6047 | 5931 | 0.2003 | - | - | | 1.6050 | 5932 | 0.1937 | - | - | | 1.6052 | 5933 | 0.2248 | - | - | | 1.6055 | 5934 | 0.2003 | - | - | | 1.6058 | 5935 | 0.297 | - | - | | 1.6061 | 5936 | 0.1763 | - | - | | 1.6063 | 5937 | 0.2173 | - | - | | 1.6066 | 5938 | 0.2491 | - | - | | 1.6069 | 5939 | 0.1941 | - | - | | 1.6071 | 5940 | 0.1517 | - | - | | 1.6074 | 5941 | 0.1914 | - | - | | 1.6077 | 5942 | 0.1425 | - | - | | 1.6080 | 5943 | 0.1705 | - | - | | 1.6082 | 5944 | 0.1764 | - | - | | 1.6085 | 5945 | 0.2717 | - | - | | 1.6088 | 5946 | 0.2621 | - | - | | 1.6090 | 5947 | 0.331 | - | - | | 1.6093 | 5948 | 0.2477 | - | - | | 1.6096 | 5949 | 0.2338 | - | - | | 1.6098 | 5950 | 0.1788 | - | - | | 1.6101 | 5951 | 0.275 | - | - | | 1.6104 | 5952 | 0.2057 | - | - | | 1.6107 | 5953 | 0.2771 | - | - | | 1.6109 | 5954 | 0.2451 | - | - | | 1.6112 | 5955 | 0.1976 | - | - | | 1.6115 | 5956 | 0.1796 | - | - | | 1.6117 | 5957 | 0.1723 | - | - | | 1.6120 | 5958 | 0.1692 | - | - | | 1.6123 | 5959 | 0.283 | - | - | | 1.6126 | 5960 | 0.2528 | - | - | | 1.6128 | 5961 | 0.2251 | - | - | | 1.6131 | 5962 | 0.2088 | - | - | | 1.6134 | 5963 | 0.2035 | - | - | | 1.6136 | 5964 | 0.1668 | - | - | | 1.6139 | 5965 | 0.1809 | - | - | | 1.6142 | 5966 | 0.1653 | - | - | | 1.6144 | 5967 | 0.2669 | - | - | | 1.6147 | 5968 | 0.2541 | - | - | | 1.6150 | 5969 | 0.2284 | - | - | | 1.6153 | 5970 | 0.3516 | - | - | | 1.6155 | 5971 | 0.2041 | - | - | | 1.6158 | 5972 | 0.1302 | - | - | | 1.6161 | 5973 | 0.2187 | - | - | | 1.6163 | 5974 | 0.244 | - | - | | 1.6166 | 5975 | 0.1345 | - | - | | 1.6169 | 5976 | 0.1559 | - | - | | 1.6172 | 5977 | 0.209 | - | - | | 1.6174 | 5978 | 0.1748 | - | - | | 1.6177 | 5979 | 0.1668 | - | - | | 1.6180 | 5980 | 0.203 | - | - | | 1.6182 | 5981 | 0.1875 | - | - | | 1.6185 | 5982 | 0.1853 | - | - | | 1.6188 | 5983 | 
0.1982 | - | - | | 1.6190 | 5984 | 0.1882 | - | - | | 1.6193 | 5985 | 0.2337 | - | - | | 1.6196 | 5986 | 0.1768 | - | - | | 1.6199 | 5987 | 0.2964 | - | - | | 1.6201 | 5988 | 0.2408 | - | - | | 1.6204 | 5989 | 0.1664 | - | - | | 1.6207 | 5990 | 0.2457 | - | - | | 1.6209 | 5991 | 0.2224 | - | - | | 1.6212 | 5992 | 0.227 | - | - | | 1.6215 | 5993 | 0.2282 | - | - | | 1.6218 | 5994 | 0.2762 | - | - | | 1.6220 | 5995 | 0.2437 | - | - | | 1.6223 | 5996 | 0.2351 | - | - | | 1.6226 | 5997 | 0.2618 | - | - | | 1.6228 | 5998 | 0.2149 | - | - | | 1.6231 | 5999 | 0.2541 | - | - | | 1.6234 | 6000 | 0.1609 | 0.2174 | 0.9451 | | 1.6236 | 6001 | 0.2411 | - | - | | 1.6239 | 6002 | 0.2476 | - | - | | 1.6242 | 6003 | 0.1894 | - | - | | 1.6245 | 6004 | 0.2072 | - | - | | 1.6247 | 6005 | 0.2353 | - | - | | 1.625 | 6006 | 0.1816 | - | - | | 1.6253 | 6007 | 0.1747 | - | - | | 1.6255 | 6008 | 0.2295 | - | - | | 1.6258 | 6009 | 0.2672 | - | - | | 1.6261 | 6010 | 0.1979 | - | - | | 1.6264 | 6011 | 0.2533 | - | - | | 1.6266 | 6012 | 0.228 | - | - | | 1.6269 | 6013 | 0.2893 | - | - | | 1.6272 | 6014 | 0.2129 | - | - | | 1.6274 | 6015 | 0.2407 | - | - | | 1.6277 | 6016 | 0.2519 | - | - | | 1.6280 | 6017 | 0.1866 | - | - | | 1.6282 | 6018 | 0.1861 | - | - | | 1.6285 | 6019 | 0.2334 | - | - | | 1.6288 | 6020 | 0.1671 | - | - | | 1.6291 | 6021 | 0.2565 | - | - | | 1.6293 | 6022 | 0.2133 | - | - | | 1.6296 | 6023 | 0.2295 | - | - | | 1.6299 | 6024 | 0.2426 | - | - | | 1.6301 | 6025 | 0.2742 | - | - | | 1.6304 | 6026 | 0.3324 | - | - | | 1.6307 | 6027 | 0.1909 | - | - | | 1.6310 | 6028 | 0.2805 | - | - | | 1.6312 | 6029 | 0.1796 | - | - | | 1.6315 | 6030 | 0.2955 | - | - | | 1.6318 | 6031 | 0.1957 | - | - | | 1.6320 | 6032 | 0.1659 | - | - | | 1.6323 | 6033 | 0.2561 | - | - | | 1.6326 | 6034 | 0.1934 | - | - | | 1.6328 | 6035 | 0.2098 | - | - | | 1.6331 | 6036 | 0.1551 | - | - | | 1.6334 | 6037 | 0.2052 | - | - | | 1.6337 | 6038 | 0.1581 | - | - | | 1.6339 | 6039 | 0.3474 | - | - | | 1.6342 | 6040 
| 0.2067 | - | - |
| 1.8939 | 7000 | 0.1718 | 0.2104 | 0.9469 |
| 1.9440 |
7185 | 0.2105 | - | - | | 1.9443 | 7186 | 0.2346 | - | - | | 1.9445 | 7187 | 0.168 | - | - | | 1.9448 | 7188 | 0.2229 | - | - | | 1.9451 | 7189 | 0.1716 | - | - | | 1.9453 | 7190 | 0.2224 | - | - | | 1.9456 | 7191 | 0.1816 | - | - | | 1.9459 | 7192 | 0.3087 | - | - | | 1.9462 | 7193 | 0.2493 | - | - | | 1.9464 | 7194 | 0.2322 | - | - | | 1.9467 | 7195 | 0.2194 | - | - | | 1.9470 | 7196 | 0.1743 | - | - | | 1.9472 | 7197 | 0.1862 | - | - | | 1.9475 | 7198 | 0.1798 | - | - | | 1.9478 | 7199 | 0.2274 | - | - | | 1.9481 | 7200 | 0.1644 | - | - | | 1.9483 | 7201 | 0.2221 | - | - | | 1.9486 | 7202 | 0.1944 | - | - | | 1.9489 | 7203 | 0.2716 | - | - | | 1.9491 | 7204 | 0.226 | - | - | | 1.9494 | 7205 | 0.203 | - | - | | 1.9497 | 7206 | 0.1998 | - | - | | 1.9499 | 7207 | 0.2026 | - | - | | 1.9502 | 7208 | 0.2892 | - | - | | 1.9505 | 7209 | 0.187 | - | - | | 1.9508 | 7210 | 0.1413 | - | - | | 1.9510 | 7211 | 0.1582 | - | - | | 1.9513 | 7212 | 0.1625 | - | - | | 1.9516 | 7213 | 0.2488 | - | - | | 1.9518 | 7214 | 0.2925 | - | - | | 1.9521 | 7215 | 0.2104 | - | - | | 1.9524 | 7216 | 0.2145 | - | - | | 1.9527 | 7217 | 0.2213 | - | - | | 1.9529 | 7218 | 0.1724 | - | - | | 1.9532 | 7219 | 0.149 | - | - | | 1.9535 | 7220 | 0.2099 | - | - | | 1.9537 | 7221 | 0.3007 | - | - | | 1.9540 | 7222 | 0.1984 | - | - | | 1.9543 | 7223 | 0.1891 | - | - | | 1.9545 | 7224 | 0.2323 | - | - | | 1.9548 | 7225 | 0.159 | - | - | | 1.9551 | 7226 | 0.2065 | - | - | | 1.9554 | 7227 | 0.2463 | - | - | | 1.9556 | 7228 | 0.1677 | - | - | | 1.9559 | 7229 | 0.2345 | - | - | | 1.9562 | 7230 | 0.2033 | - | - | | 1.9564 | 7231 | 0.1699 | - | - | | 1.9567 | 7232 | 0.2412 | - | - | | 1.9570 | 7233 | 0.1919 | - | - | | 1.9573 | 7234 | 0.3061 | - | - | | 1.9575 | 7235 | 0.2715 | - | - | | 1.9578 | 7236 | 0.1199 | - | - | | 1.9581 | 7237 | 0.2353 | - | - | | 1.9583 | 7238 | 0.2094 | - | - | | 1.9586 | 7239 | 0.2702 | - | - | | 1.9589 | 7240 | 0.2746 | - | - | | 1.9591 | 7241 | 0.2213 | - | - | | 1.9594 | 7242 | 
0.2803 | - | - | | 1.9597 | 7243 | 0.209 | - | - | | 1.9600 | 7244 | 0.2432 | - | - | | 1.9602 | 7245 | 0.1557 | - | - | | 1.9605 | 7246 | 0.1833 | - | - | | 1.9608 | 7247 | 0.2027 | - | - | | 1.9610 | 7248 | 0.2564 | - | - | | 1.9613 | 7249 | 0.2852 | - | - | | 1.9616 | 7250 | 0.2168 | - | - | | 1.9619 | 7251 | 0.1654 | - | - | | 1.9621 | 7252 | 0.1364 | - | - | | 1.9624 | 7253 | 0.2003 | - | - | | 1.9627 | 7254 | 0.1984 | - | - | | 1.9629 | 7255 | 0.2554 | - | - | | 1.9632 | 7256 | 0.2652 | - | - | | 1.9635 | 7257 | 0.2321 | - | - | | 1.9637 | 7258 | 0.2279 | - | - | | 1.9640 | 7259 | 0.2065 | - | - | | 1.9643 | 7260 | 0.2606 | - | - | | 1.9646 | 7261 | 0.1797 | - | - | | 1.9648 | 7262 | 0.2919 | - | - | | 1.9651 | 7263 | 0.1482 | - | - | | 1.9654 | 7264 | 0.2082 | - | - | | 1.9656 | 7265 | 0.2776 | - | - | | 1.9659 | 7266 | 0.1561 | - | - | | 1.9662 | 7267 | 0.2733 | - | - | | 1.9665 | 7268 | 0.2021 | - | - | | 1.9667 | 7269 | 0.2192 | - | - | | 1.9670 | 7270 | 0.1659 | - | - | | 1.9673 | 7271 | 0.2943 | - | - | | 1.9675 | 7272 | 0.2052 | - | - | | 1.9678 | 7273 | 0.206 | - | - | | 1.9681 | 7274 | 0.2126 | - | - | | 1.9683 | 7275 | 0.2076 | - | - | | 1.9686 | 7276 | 0.2969 | - | - | | 1.9689 | 7277 | 0.2433 | - | - | | 1.9692 | 7278 | 0.1569 | - | - | | 1.9694 | 7279 | 0.1884 | - | - | | 1.9697 | 7280 | 0.2323 | - | - | | 1.9700 | 7281 | 0.1703 | - | - | | 1.9702 | 7282 | 0.2281 | - | - | | 1.9705 | 7283 | 0.2927 | - | - | | 1.9708 | 7284 | 0.1661 | - | - | | 1.9710 | 7285 | 0.1876 | - | - | | 1.9713 | 7286 | 0.2036 | - | - | | 1.9716 | 7287 | 0.1876 | - | - | | 1.9719 | 7288 | 0.2981 | - | - | | 1.9721 | 7289 | 0.1267 | - | - | | 1.9724 | 7290 | 0.2597 | - | - | | 1.9727 | 7291 | 0.1772 | - | - | | 1.9729 | 7292 | 0.273 | - | - | | 1.9732 | 7293 | 0.1955 | - | - | | 1.9735 | 7294 | 0.228 | - | - | | 1.9738 | 7295 | 0.1674 | - | - | | 1.9740 | 7296 | 0.2655 | - | - | | 1.9743 | 7297 | 0.2256 | - | - | | 1.9746 | 7298 | 0.28 | - | - | | 1.9748 | 7299 | 0.187 | - 
| - | | 1.9751 | 7300 | 0.205 | - | - | | 1.9754 | 7301 | 0.2096 | - | - | | 1.9756 | 7302 | 0.3056 | - | - | | 1.9759 | 7303 | 0.228 | - | - | | 1.9762 | 7304 | 0.2253 | - | - | | 1.9765 | 7305 | 0.2762 | - | - | | 1.9767 | 7306 | 0.1992 | - | - | | 1.9770 | 7307 | 0.1603 | - | - | | 1.9773 | 7308 | 0.1428 | - | - | | 1.9775 | 7309 | 0.2687 | - | - | | 1.9778 | 7310 | 0.2061 | - | - | | 1.9781 | 7311 | 0.2313 | - | - | | 1.9784 | 7312 | 0.204 | - | - | | 1.9786 | 7313 | 0.2134 | - | - | | 1.9789 | 7314 | 0.2273 | - | - | | 1.9792 | 7315 | 0.2424 | - | - | | 1.9794 | 7316 | 0.2169 | - | - | | 1.9797 | 7317 | 0.1945 | - | - | | 1.9800 | 7318 | 0.136 | - | - | | 1.9802 | 7319 | 0.1514 | - | - | | 1.9805 | 7320 | 0.2006 | - | - | | 1.9808 | 7321 | 0.1818 | - | - | | 1.9811 | 7322 | 0.2406 | - | - | | 1.9813 | 7323 | 0.2683 | - | - | | 1.9816 | 7324 | 0.2823 | - | - | | 1.9819 | 7325 | 0.2271 | - | - | | 1.9821 | 7326 | 0.2067 | - | - | | 1.9824 | 7327 | 0.1982 | - | - | | 1.9827 | 7328 | 0.3198 | - | - | | 1.9830 | 7329 | 0.2081 | - | - | | 1.9832 | 7330 | 0.1579 | - | - | | 1.9835 | 7331 | 0.2111 | - | - | | 1.9838 | 7332 | 0.1704 | - | - | | 1.9840 | 7333 | 0.1812 | - | - | | 1.9843 | 7334 | 0.1774 | - | - | | 1.9846 | 7335 | 0.1981 | - | - | | 1.9848 | 7336 | 0.1615 | - | - | | 1.9851 | 7337 | 0.1934 | - | - | | 1.9854 | 7338 | 0.1315 | - | - | | 1.9857 | 7339 | 0.1861 | - | - | | 1.9859 | 7340 | 0.2108 | - | - | | 1.9862 | 7341 | 0.2384 | - | - | | 1.9865 | 7342 | 0.1887 | - | - | | 1.9867 | 7343 | 0.2806 | - | - | | 1.9870 | 7344 | 0.1905 | - | - | | 1.9873 | 7345 | 0.2215 | - | - | | 1.9876 | 7346 | 0.1675 | - | - | | 1.9878 | 7347 | 0.1941 | - | - | | 1.9881 | 7348 | 0.1884 | - | - | | 1.9884 | 7349 | 0.1796 | - | - | | 1.9886 | 7350 | 0.281 | - | - | | 1.9889 | 7351 | 0.2954 | - | - | | 1.9892 | 7352 | 0.1659 | - | - | | 1.9894 | 7353 | 0.2409 | - | - | | 1.9897 | 7354 | 0.185 | - | - | | 1.9900 | 7355 | 0.2508 | - | - | | 1.9903 | 7356 | 0.2678 | - | - | | 
1.9905 | 7357 | 0.1573 | - | - | | 1.9908 | 7358 | 0.3161 | - | - | | 1.9911 | 7359 | 0.2464 | - | - | | 1.9913 | 7360 | 0.1969 | - | - | | 1.9916 | 7361 | 0.1473 | - | - | | 1.9919 | 7362 | 0.2087 | - | - | | 1.9922 | 7363 | 0.2483 | - | - | | 1.9924 | 7364 | 0.222 | - | - | | 1.9927 | 7365 | 0.2109 | - | - | | 1.9930 | 7366 | 0.2377 | - | - | | 1.9932 | 7367 | 0.2077 | - | - | | 1.9935 | 7368 | 0.184 | - | - | | 1.9938 | 7369 | 0.1873 | - | - | | 1.9940 | 7370 | 0.2258 | - | - | | 1.9943 | 7371 | 0.2517 | - | - | | 1.9946 | 7372 | 0.2341 | - | - | | 1.9949 | 7373 | 0.1653 | - | - | | 1.9951 | 7374 | 0.2407 | - | - | | 1.9954 | 7375 | 0.2729 | - | - | | 1.9957 | 7376 | 0.1616 | - | - | | 1.9959 | 7377 | 0.2149 | - | - | | 1.9962 | 7378 | 0.2768 | - | - | | 1.9965 | 7379 | 0.1984 | - | - | | 1.9968 | 7380 | 0.1952 | - | - | | 1.9970 | 7381 | 0.2465 | - | - | | 1.9973 | 7382 | 0.1872 | - | - | | 1.9976 | 7383 | 0.2345 | - | - | | 1.9978 | 7384 | 0.186 | - | - | | 1.9981 | 7385 | 0.1776 | - | - | | 1.9984 | 7386 | 0.2051 | - | - | | 1.9986 | 7387 | 0.2407 | - | - | | 1.9989 | 7388 | 0.14 | - | - | | 1.9992 | 7389 | 0.1821 | - | - | | 1.9995 | 7390 | 0.2295 | - | - | | 1.9997 | 7391 | 0.1614 | - | - | | 2.0 | 7392 | 0.2193 | - | - | | 2.0003 | 7393 | 0.1583 | - | - | | 2.0005 | 7394 | 0.1216 | - | - | | 2.0008 | 7395 | 0.2351 | - | - | | 2.0011 | 7396 | 0.165 | - | - | | 2.0014 | 7397 | 0.1558 | - | - | | 2.0016 | 7398 | 0.1764 | - | - | | 2.0019 | 7399 | 0.1774 | - | - | | 2.0022 | 7400 | 0.234 | - | - | | 2.0024 | 7401 | 0.1633 | - | - | | 2.0027 | 7402 | 0.2066 | - | - | | 2.0030 | 7403 | 0.1771 | - | - | | 2.0032 | 7404 | 0.2267 | - | - | | 2.0035 | 7405 | 0.1346 | - | - | | 2.0038 | 7406 | 0.1892 | - | - | | 2.0041 | 7407 | 0.1717 | - | - | | 2.0043 | 7408 | 0.1558 | - | - | | 2.0046 | 7409 | 0.1413 | - | - | | 2.0049 | 7410 | 0.2576 | - | - | | 2.0051 | 7411 | 0.1349 | - | - | | 2.0054 | 7412 | 0.1273 | - | - | | 2.0057 | 7413 | 0.1273 | - | - | | 2.0060 | 7414 
| 0.1849 | - | - | | 2.0062 | 7415 | 0.1507 | - | - | | 2.0065 | 7416 | 0.2996 | - | - | | 2.0068 | 7417 | 0.2342 | - | - | | 2.0070 | 7418 | 0.1282 | - | - | | 2.0073 | 7419 | 0.3023 | - | - | | 2.0076 | 7420 | 0.163 | - | - | | 2.0078 | 7421 | 0.1487 | - | - | | 2.0081 | 7422 | 0.1786 | - | - | | 2.0084 | 7423 | 0.1485 | - | - | | 2.0087 | 7424 | 0.163 | - | - | | 2.0089 | 7425 | 0.2129 | - | - | | 2.0092 | 7426 | 0.1874 | - | - | | 2.0095 | 7427 | 0.2214 | - | - | | 2.0097 | 7428 | 0.0933 | - | - | | 2.0100 | 7429 | 0.1319 | - | - | | 2.0103 | 7430 | 0.172 | - | - | | 2.0106 | 7431 | 0.1725 | - | - | | 2.0108 | 7432 | 0.1779 | - | - | | 2.0111 | 7433 | 0.1495 | - | - | | 2.0114 | 7434 | 0.1349 | - | - | | 2.0116 | 7435 | 0.1931 | - | - | | 2.0119 | 7436 | 0.1951 | - | - | | 2.0122 | 7437 | 0.241 | - | - | | 2.0124 | 7438 | 0.1822 | - | - | | 2.0127 | 7439 | 0.1796 | - | - | | 2.0130 | 7440 | 0.168 | - | - | | 2.0133 | 7441 | 0.1713 | - | - | | 2.0135 | 7442 | 0.1322 | - | - | | 2.0138 | 7443 | 0.1835 | - | - | | 2.0141 | 7444 | 0.1451 | - | - | | 2.0143 | 7445 | 0.188 | - | - | | 2.0146 | 7446 | 0.1409 | - | - | | 2.0149 | 7447 | 0.274 | - | - | | 2.0152 | 7448 | 0.1273 | - | - | | 2.0154 | 7449 | 0.2019 | - | - | | 2.0157 | 7450 | 0.1376 | - | - | | 2.0160 | 7451 | 0.1705 | - | - | | 2.0162 | 7452 | 0.2296 | - | - | | 2.0165 | 7453 | 0.2735 | - | - | | 2.0168 | 7454 | 0.211 | - | - | | 2.0170 | 7455 | 0.1766 | - | - | | 2.0173 | 7456 | 0.1769 | - | - | | 2.0176 | 7457 | 0.1469 | - | - | | 2.0179 | 7458 | 0.1816 | - | - | | 2.0181 | 7459 | 0.1507 | - | - | | 2.0184 | 7460 | 0.2556 | - | - | | 2.0187 | 7461 | 0.1833 | - | - | | 2.0189 | 7462 | 0.1786 | - | - | | 2.0192 | 7463 | 0.1554 | - | - | | 2.0195 | 7464 | 0.1249 | - | - | | 2.0198 | 7465 | 0.2119 | - | - | | 2.0200 | 7466 | 0.1133 | - | - | | 2.0203 | 7467 | 0.2365 | - | - | | 2.0206 | 7468 | 0.1562 | - | - | | 2.0208 | 7469 | 0.1824 | - | - | | 2.0211 | 7470 | 0.1773 | - | - | | 2.0214 | 7471 | 0.1545 | - 
| - | | 2.0216 | 7472 | 0.1709 | - | - | | 2.0219 | 7473 | 0.1474 | - | - | | 2.0222 | 7474 | 0.2103 | - | - | | 2.0225 | 7475 | 0.1462 | - | - | | 2.0227 | 7476 | 0.1851 | - | - | | 2.0230 | 7477 | 0.2381 | - | - | | 2.0233 | 7478 | 0.2224 | - | - | | 2.0235 | 7479 | 0.2066 | - | - | | 2.0238 | 7480 | 0.203 | - | - | | 2.0241 | 7481 | 0.1233 | - | - | | 2.0244 | 7482 | 0.2172 | - | - | | 2.0246 | 7483 | 0.1615 | - | - | | 2.0249 | 7484 | 0.1564 | - | - | | 2.0252 | 7485 | 0.206 | - | - | | 2.0254 | 7486 | 0.1565 | - | - | | 2.0257 | 7487 | 0.1652 | - | - | | 2.0260 | 7488 | 0.1697 | - | - | | 2.0262 | 7489 | 0.1208 | - | - | | 2.0265 | 7490 | 0.1115 | - | - | | 2.0268 | 7491 | 0.1502 | - | - | | 2.0271 | 7492 | 0.1997 | - | - | | 2.0273 | 7493 | 0.2195 | - | - | | 2.0276 | 7494 | 0.2278 | - | - | | 2.0279 | 7495 | 0.2303 | - | - | | 2.0281 | 7496 | 0.2126 | - | - | | 2.0284 | 7497 | 0.1916 | - | - | | 2.0287 | 7498 | 0.2102 | - | - | | 2.0290 | 7499 | 0.16 | - | - | | 2.0292 | 7500 | 0.1611 | - | - | | 2.0295 | 7501 | 0.1753 | - | - | | 2.0298 | 7502 | 0.2305 | - | - | | 2.0300 | 7503 | 0.1883 | - | - | | 2.0303 | 7504 | 0.2511 | - | - | | 2.0306 | 7505 | 0.2119 | - | - | | 2.0308 | 7506 | 0.2575 | - | - | | 2.0311 | 7507 | 0.2391 | - | - | | 2.0314 | 7508 | 0.1724 | - | - | | 2.0317 | 7509 | 0.1896 | - | - | | 2.0319 | 7510 | 0.2027 | - | - | | 2.0322 | 7511 | 0.1539 | - | - | | 2.0325 | 7512 | 0.1742 | - | - | | 2.0327 | 7513 | 0.1623 | - | - | | 2.0330 | 7514 | 0.139 | - | - | | 2.0333 | 7515 | 0.2794 | - | - | | 2.0335 | 7516 | 0.1501 | - | - | | 2.0338 | 7517 | 0.2285 | - | - | | 2.0341 | 7518 | 0.1819 | - | - | | 2.0344 | 7519 | 0.1841 | - | - | | 2.0346 | 7520 | 0.1474 | - | - | | 2.0349 | 7521 | 0.2103 | - | - | | 2.0352 | 7522 | 0.2881 | - | - | | 2.0354 | 7523 | 0.1859 | - | - | | 2.0357 | 7524 | 0.1789 | - | - | | 2.0360 | 7525 | 0.1787 | - | - | | 2.0363 | 7526 | 0.2041 | - | - | | 2.0365 | 7527 | 0.183 | - | - | | 2.0368 | 7528 | 0.1571 | - | - | | 
2.0371 | 7529 | 0.2029 | - | - | | 2.0373 | 7530 | 0.2246 | - | - | | 2.0376 | 7531 | 0.1663 | - | - | | 2.0379 | 7532 | 0.1312 | - | - | | 2.0381 | 7533 | 0.2372 | - | - | | 2.0384 | 7534 | 0.2237 | - | - | | 2.0387 | 7535 | 0.3542 | - | - | | 2.0390 | 7536 | 0.187 | - | - | | 2.0392 | 7537 | 0.1695 | - | - | | 2.0395 | 7538 | 0.2579 | - | - | | 2.0398 | 7539 | 0.1638 | - | - | | 2.0400 | 7540 | 0.2131 | - | - | | 2.0403 | 7541 | 0.1923 | - | - | | 2.0406 | 7542 | 0.1354 | - | - | | 2.0409 | 7543 | 0.2152 | - | - | | 2.0411 | 7544 | 0.2227 | - | - | | 2.0414 | 7545 | 0.1933 | - | - | | 2.0417 | 7546 | 0.2017 | - | - | | 2.0419 | 7547 | 0.1549 | - | - | | 2.0422 | 7548 | 0.2034 | - | - | | 2.0425 | 7549 | 0.1894 | - | - | | 2.0427 | 7550 | 0.1781 | - | - | | 2.0430 | 7551 | 0.289 | - | - | | 2.0433 | 7552 | 0.1747 | - | - | | 2.0436 | 7553 | 0.2116 | - | - | | 2.0438 | 7554 | 0.1228 | - | - | | 2.0441 | 7555 | 0.2503 | - | - | | 2.0444 | 7556 | 0.1931 | - | - | | 2.0446 | 7557 | 0.1474 | - | - | | 2.0449 | 7558 | 0.1611 | - | - | | 2.0452 | 7559 | 0.2118 | - | - | | 2.0455 | 7560 | 0.2504 | - | - | | 2.0457 | 7561 | 0.2337 | - | - | | 2.0460 | 7562 | 0.1658 | - | - | | 2.0463 | 7563 | 0.1459 | - | - | | 2.0465 | 7564 | 0.137 | - | - | | 2.0468 | 7565 | 0.2051 | - | - | | 2.0471 | 7566 | 0.1953 | - | - | | 2.0473 | 7567 | 0.2006 | - | - | | 2.0476 | 7568 | 0.1853 | - | - | | 2.0479 | 7569 | 0.2068 | - | - | | 2.0482 | 7570 | 0.1863 | - | - | | 2.0484 | 7571 | 0.168 | - | - | | 2.0487 | 7572 | 0.222 | - | - | | 2.0490 | 7573 | 0.2002 | - | - | | 2.0492 | 7574 | 0.1898 | - | - | | 2.0495 | 7575 | 0.1798 | - | - | | 2.0498 | 7576 | 0.1918 | - | - | | 2.0501 | 7577 | 0.1863 | - | - | | 2.0503 | 7578 | 0.1565 | - | - | | 2.0506 | 7579 | 0.1897 | - | - | | 2.0509 | 7580 | 0.1694 | - | - | | 2.0511 | 7581 | 0.2002 | - | - | | 2.0514 | 7582 | 0.1676 | - | - | | 2.0517 | 7583 | 0.1838 | - | - | | 2.0519 | 7584 | 0.1815 | - | - | | 2.0522 | 7585 | 0.1751 | - | - | | 2.0525 | 
7586 | 0.1686 | - | - | | 2.0528 | 7587 | 0.2176 | - | - | | 2.0530 | 7588 | 0.2293 | - | - | | 2.0533 | 7589 | 0.2333 | - | - | | 2.0536 | 7590 | 0.1519 | - | - | | 2.0538 | 7591 | 0.2024 | - | - | | 2.0541 | 7592 | 0.2446 | - | - | | 2.0544 | 7593 | 0.148 | - | - | | 2.0547 | 7594 | 0.1636 | - | - | | 2.0549 | 7595 | 0.2338 | - | - | | 2.0552 | 7596 | 0.1804 | - | - | | 2.0555 | 7597 | 0.1999 | - | - | | 2.0557 | 7598 | 0.2213 | - | - | | 2.0560 | 7599 | 0.1995 | - | - | | 2.0563 | 7600 | 0.1911 | - | - | | 2.0565 | 7601 | 0.1731 | - | - | | 2.0568 | 7602 | 0.1418 | - | - | | 2.0571 | 7603 | 0.2386 | - | - | | 2.0574 | 7604 | 0.2797 | - | - | | 2.0576 | 7605 | 0.1914 | - | - | | 2.0579 | 7606 | 0.2347 | - | - | | 2.0582 | 7607 | 0.1689 | - | - | | 2.0584 | 7608 | 0.2443 | - | - | | 2.0587 | 7609 | 0.257 | - | - | | 2.0590 | 7610 | 0.1694 | - | - | | 2.0593 | 7611 | 0.1306 | - | - | | 2.0595 | 7612 | 0.1453 | - | - | | 2.0598 | 7613 | 0.1693 | - | - | | 2.0601 | 7614 | 0.2181 | - | - | | 2.0603 | 7615 | 0.2003 | - | - | | 2.0606 | 7616 | 0.1437 | - | - | | 2.0609 | 7617 | 0.1896 | - | - | | 2.0611 | 7618 | 0.1635 | - | - | | 2.0614 | 7619 | 0.179 | - | - | | 2.0617 | 7620 | 0.2573 | - | - | | 2.0620 | 7621 | 0.1806 | - | - | | 2.0622 | 7622 | 0.1457 | - | - | | 2.0625 | 7623 | 0.2227 | - | - | | 2.0628 | 7624 | 0.1555 | - | - | | 2.0630 | 7625 | 0.125 | - | - | | 2.0633 | 7626 | 0.1736 | - | - | | 2.0636 | 7627 | 0.1513 | - | - | | 2.0639 | 7628 | 0.1519 | - | - | | 2.0641 | 7629 | 0.194 | - | - | | 2.0644 | 7630 | 0.1952 | - | - | | 2.0647 | 7631 | 0.2201 | - | - | | 2.0649 | 7632 | 0.1918 | - | - | | 2.0652 | 7633 | 0.3138 | - | - | | 2.0655 | 7634 | 0.1791 | - | - | | 2.0657 | 7635 | 0.1623 | - | - | | 2.0660 | 7636 | 0.2262 | - | - | | 2.0663 | 7637 | 0.1738 | - | - | | 2.0666 | 7638 | 0.1874 | - | - | | 2.0668 | 7639 | 0.2213 | - | - | | 2.0671 | 7640 | 0.2271 | - | - | | 2.0674 | 7641 | 0.178 | - | - | | 2.0676 | 7642 | 0.1912 | - | - | | 2.0679 | 7643 | 
0.2311 | - | - | | 2.0682 | 7644 | 0.1651 | - | - | | 2.0685 | 7645 | 0.1847 | - | - | | 2.0687 | 7646 | 0.1581 | - | - | | 2.0690 | 7647 | 0.1536 | - | - | | 2.0693 | 7648 | 0.16 | - | - | | 2.0695 | 7649 | 0.2157 | - | - | | 2.0698 | 7650 | 0.2169 | - | - | | 2.0701 | 7651 | 0.207 | - | - | | 2.0703 | 7652 | 0.1838 | - | - | | 2.0706 | 7653 | 0.1426 | - | - | | 2.0709 | 7654 | 0.156 | - | - | | 2.0712 | 7655 | 0.192 | - | - | | 2.0714 | 7656 | 0.1603 | - | - | | 2.0717 | 7657 | 0.2335 | - | - | | 2.0720 | 7658 | 0.1666 | - | - | | 2.0722 | 7659 | 0.2276 | - | - | | 2.0725 | 7660 | 0.1748 | - | - | | 2.0728 | 7661 | 0.2399 | - | - | | 2.0731 | 7662 | 0.1901 | - | - | | 2.0733 | 7663 | 0.1656 | - | - | | 2.0736 | 7664 | 0.1987 | - | - | | 2.0739 | 7665 | 0.2042 | - | - | | 2.0741 | 7666 | 0.1383 | - | - | | 2.0744 | 7667 | 0.2472 | - | - | | 2.0747 | 7668 | 0.1461 | - | - | | 2.0749 | 7669 | 0.1588 | - | - | | 2.0752 | 7670 | 0.1103 | - | - | | 2.0755 | 7671 | 0.1839 | - | - | | 2.0758 | 7672 | 0.1953 | - | - | | 2.0760 | 7673 | 0.1844 | - | - | | 2.0763 | 7674 | 0.2378 | - | - | | 2.0766 | 7675 | 0.171 | - | - | | 2.0768 | 7676 | 0.1929 | - | - | | 2.0771 | 7677 | 0.1701 | - | - | | 2.0774 | 7678 | 0.1773 | - | - | | 2.0777 | 7679 | 0.1906 | - | - | | 2.0779 | 7680 | 0.1992 | - | - | | 2.0782 | 7681 | 0.1658 | - | - | | 2.0785 | 7682 | 0.1579 | - | - | | 2.0787 | 7683 | 0.2029 | - | - | | 2.0790 | 7684 | 0.1263 | - | - | | 2.0793 | 7685 | 0.1673 | - | - | | 2.0795 | 7686 | 0.2635 | - | - | | 2.0798 | 7687 | 0.1059 | - | - | | 2.0801 | 7688 | 0.1731 | - | - | | 2.0804 | 7689 | 0.2037 | - | - | | 2.0806 | 7690 | 0.2362 | - | - | | 2.0809 | 7691 | 0.1974 | - | - | | 2.0812 | 7692 | 0.1703 | - | - | | 2.0814 | 7693 | 0.2159 | - | - | | 2.0817 | 7694 | 0.2015 | - | - | | 2.0820 | 7695 | 0.2134 | - | - | | 2.0823 | 7696 | 0.239 | - | - | | 2.0825 | 7697 | 0.1696 | - | - | | 2.0828 | 7698 | 0.1556 | - | - | | 2.0831 | 7699 | 0.2646 | - | - | | 2.0833 | 7700 | 0.1666 | - 
| - | | 2.0836 | 7701 | 0.2086 | - | - | | 2.0839 | 7702 | 0.1494 | - | - | | 2.0841 | 7703 | 0.1325 | - | - | | 2.0844 | 7704 | 0.1988 | - | - | | 2.0847 | 7705 | 0.1279 | - | - | | 2.0850 | 7706 | 0.2096 | - | - | | 2.0852 | 7707 | 0.1517 | - | - | | 2.0855 | 7708 | 0.1543 | - | - | | 2.0858 | 7709 | 0.1039 | - | - | | 2.0860 | 7710 | 0.2433 | - | - | | 2.0863 | 7711 | 0.1557 | - | - | | 2.0866 | 7712 | 0.1585 | - | - | | 2.0869 | 7713 | 0.1231 | - | - | | 2.0871 | 7714 | 0.2463 | - | - | | 2.0874 | 7715 | 0.2444 | - | - | | 2.0877 | 7716 | 0.1688 | - | - | | 2.0879 | 7717 | 0.2208 | - | - | | 2.0882 | 7718 | 0.1537 | - | - | | 2.0885 | 7719 | 0.1848 | - | - | | 2.0887 | 7720 | 0.2125 | - | - | | 2.0890 | 7721 | 0.1792 | - | - | | 2.0893 | 7722 | 0.2205 | - | - | | 2.0896 | 7723 | 0.1922 | - | - | | 2.0898 | 7724 | 0.1966 | - | - | | 2.0901 | 7725 | 0.1602 | - | - | | 2.0904 | 7726 | 0.0856 | - | - | | 2.0906 | 7727 | 0.1653 | - | - | | 2.0909 | 7728 | 0.238 | - | - | | 2.0912 | 7729 | 0.1922 | - | - | | 2.0915 | 7730 | 0.2043 | - | - | | 2.0917 | 7731 | 0.1495 | - | - | | 2.0920 | 7732 | 0.1737 | - | - | | 2.0923 | 7733 | 0.2128 | - | - | | 2.0925 | 7734 | 0.1939 | - | - | | 2.0928 | 7735 | 0.164 | - | - | | 2.0931 | 7736 | 0.154 | - | - | | 2.0933 | 7737 | 0.2128 | - | - | | 2.0936 | 7738 | 0.1279 | - | - | | 2.0939 | 7739 | 0.1535 | - | - | | 2.0942 | 7740 | 0.1653 | - | - | | 2.0944 | 7741 | 0.1619 | - | - | | 2.0947 | 7742 | 0.1776 | - | - | | 2.0950 | 7743 | 0.1993 | - | - | | 2.0952 | 7744 | 0.207 | - | - | | 2.0955 | 7745 | 0.1258 | - | - | | 2.0958 | 7746 | 0.2008 | - | - | | 2.0960 | 7747 | 0.2042 | - | - | | 2.0963 | 7748 | 0.1936 | - | - | | 2.0966 | 7749 | 0.1915 | - | - | | 2.0969 | 7750 | 0.1906 | - | - | | 2.0971 | 7751 | 0.2062 | - | - | | 2.0974 | 7752 | 0.1491 | - | - | | 2.0977 | 7753 | 0.2264 | - | - | | 2.0979 | 7754 | 0.2102 | - | - | | 2.0982 | 7755 | 0.1766 | - | - | | 2.0985 | 7756 | 0.1269 | - | - | | 2.0988 | 7757 | 0.2052 | - | - | | 
2.0990 | 7758 | 0.2494 | - | - | | 2.0993 | 7759 | 0.177 | - | - | | 2.0996 | 7760 | 0.1213 | - | - | | 2.0998 | 7761 | 0.2039 | - | - | | 2.1001 | 7762 | 0.1929 | - | - | | 2.1004 | 7763 | 0.1988 | - | - | | 2.1006 | 7764 | 0.174 | - | - | | 2.1009 | 7765 | 0.2357 | - | - | | 2.1012 | 7766 | 0.1222 | - | - | | 2.1015 | 7767 | 0.1754 | - | - | | 2.1017 | 7768 | 0.1441 | - | - | | 2.1020 | 7769 | 0.3095 | - | - | | 2.1023 | 7770 | 0.1809 | - | - | | 2.1025 | 7771 | 0.1811 | - | - | | 2.1028 | 7772 | 0.1856 | - | - | | 2.1031 | 7773 | 0.1887 | - | - | | 2.1034 | 7774 | 0.2536 | - | - | | 2.1036 | 7775 | 0.1286 | - | - | | 2.1039 | 7776 | 0.1636 | - | - | | 2.1042 | 7777 | 0.1581 | - | - | | 2.1044 | 7778 | 0.1635 | - | - | | 2.1047 | 7779 | 0.2378 | - | - | | 2.1050 | 7780 | 0.1374 | - | - | | 2.1052 | 7781 | 0.2322 | - | - | | 2.1055 | 7782 | 0.1521 | - | - | | 2.1058 | 7783 | 0.2067 | - | - | | 2.1061 | 7784 | 0.2142 | - | - | | 2.1063 | 7785 | 0.2368 | - | - | | 2.1066 | 7786 | 0.1884 | - | - | | 2.1069 | 7787 | 0.1675 | - | - | | 2.1071 | 7788 | 0.1342 | - | - | | 2.1074 | 7789 | 0.1568 | - | - | | 2.1077 | 7790 | 0.1797 | - | - | | 2.1080 | 7791 | 0.1834 | - | - | | 2.1082 | 7792 | 0.1575 | - | - | | 2.1085 | 7793 | 0.1506 | - | - | | 2.1088 | 7794 | 0.1745 | - | - | | 2.1090 | 7795 | 0.1696 | - | - | | 2.1093 | 7796 | 0.2199 | - | - | | 2.1096 | 7797 | 0.1602 | - | - | | 2.1098 | 7798 | 0.2076 | - | - | | 2.1101 | 7799 | 0.1896 | - | - | | 2.1104 | 7800 | 0.2284 | - | - | | 2.1107 | 7801 | 0.1539 | - | - | | 2.1109 | 7802 | 0.202 | - | - | | 2.1112 | 7803 | 0.2315 | - | - | | 2.1115 | 7804 | 0.2765 | - | - | | 2.1117 | 7805 | 0.1961 | - | - | | 2.1120 | 7806 | 0.1935 | - | - | | 2.1123 | 7807 | 0.1756 | - | - | | 2.1126 | 7808 | 0.2705 | - | - | | 2.1128 | 7809 | 0.1806 | - | - | | 2.1131 | 7810 | 0.1489 | - | - | | 2.1134 | 7811 | 0.2088 | - | - | | 2.1136 | 7812 | 0.2655 | - | - | | 2.1139 | 7813 | 0.1534 | - | - | | 2.1142 | 7814 | 0.1941 | - | - | | 2.1144 
| 7815 | 0.2017 | - | - | | 2.1147 | 7816 | 0.2019 | - | - | | 2.1150 | 7817 | 0.2093 | - | - | | 2.1153 | 7818 | 0.172 | - | - | | 2.1155 | 7819 | 0.1484 | - | - | | 2.1158 | 7820 | 0.1984 | - | - | | 2.1161 | 7821 | 0.1693 | - | - | | 2.1163 | 7822 | 0.1561 | - | - | | 2.1166 | 7823 | 0.2133 | - | - | | 2.1169 | 7824 | 0.1538 | - | - | | 2.1172 | 7825 | 0.1983 | - | - | | 2.1174 | 7826 | 0.2432 | - | - | | 2.1177 | 7827 | 0.2463 | - | - | | 2.1180 | 7828 | 0.1411 | - | - | | 2.1182 | 7829 | 0.1697 | - | - | | 2.1185 | 7830 | 0.2996 | - | - | | 2.1188 | 7831 | 0.1802 | - | - | | 2.1190 | 7832 | 0.1475 | - | - | | 2.1193 | 7833 | 0.1728 | - | - | | 2.1196 | 7834 | 0.1779 | - | - | | 2.1199 | 7835 | 0.1865 | - | - | | 2.1201 | 7836 | 0.1547 | - | - | | 2.1204 | 7837 | 0.2878 | - | - | | 2.1207 | 7838 | 0.2145 | - | - | | 2.1209 | 7839 | 0.2249 | - | - | | 2.1212 | 7840 | 0.2063 | - | - | | 2.1215 | 7841 | 0.2003 | - | - | | 2.1218 | 7842 | 0.2198 | - | - | | 2.1220 | 7843 | 0.1861 | - | - | | 2.1223 | 7844 | 0.1742 | - | - | | 2.1226 | 7845 | 0.1875 | - | - | | 2.1228 | 7846 | 0.149 | - | - | | 2.1231 | 7847 | 0.2015 | - | - | | 2.1234 | 7848 | 0.1464 | - | - | | 2.1236 | 7849 | 0.2197 | - | - | | 2.1239 | 7850 | 0.155 | - | - | | 2.1242 | 7851 | 0.1612 | - | - | | 2.1245 | 7852 | 0.162 | - | - | | 2.1247 | 7853 | 0.1804 | - | - | | 2.125 | 7854 | 0.1667 | - | - | | 2.1253 | 7855 | 0.2156 | - | - | | 2.1255 | 7856 | 0.1791 | - | - | | 2.1258 | 7857 | 0.2105 | - | - | | 2.1261 | 7858 | 0.1324 | - | - | | 2.1264 | 7859 | 0.1746 | - | - | | 2.1266 | 7860 | 0.1652 | - | - | | 2.1269 | 7861 | 0.1948 | - | - | | 2.1272 | 7862 | 0.1484 | - | - | | 2.1274 | 7863 | 0.1868 | - | - | | 2.1277 | 7864 | 0.1669 | - | - | | 2.1280 | 7865 | 0.1732 | - | - | | 2.1282 | 7866 | 0.1314 | - | - | | 2.1285 | 7867 | 0.2082 | - | - | | 2.1288 | 7868 | 0.1511 | - | - | | 2.1291 | 7869 | 0.1912 | - | - | | 2.1293 | 7870 | 0.1906 | - | - | | 2.1296 | 7871 | 0.1831 | - | - | | 2.1299 | 7872 | 
0.1518 | - | - | | 2.1301 | 7873 | 0.135 | - | - | | 2.1304 | 7874 | 0.2105 | - | - | | 2.1307 | 7875 | 0.1715 | - | - | | 2.1310 | 7876 | 0.1598 | - | - | | 2.1312 | 7877 | 0.2041 | - | - | | 2.1315 | 7878 | 0.1565 | - | - | | 2.1318 | 7879 | 0.2154 | - | - | | 2.1320 | 7880 | 0.1367 | - | - | | 2.1323 | 7881 | 0.1395 | - | - | | 2.1326 | 7882 | 0.1674 | - | - | | 2.1328 | 7883 | 0.1224 | - | - | | 2.1331 | 7884 | 0.1749 | - | - | | 2.1334 | 7885 | 0.1487 | - | - | | 2.1337 | 7886 | 0.2076 | - | - | | 2.1339 | 7887 | 0.2053 | - | - | | 2.1342 | 7888 | 0.1756 | - | - | | 2.1345 | 7889 | 0.1252 | - | - | | 2.1347 | 7890 | 0.2027 | - | - | | 2.1350 | 7891 | 0.2132 | - | - | | 2.1353 | 7892 | 0.1922 | - | - | | 2.1356 | 7893 | 0.1584 | - | - | | 2.1358 | 7894 | 0.169 | - | - | | 2.1361 | 7895 | 0.1414 | - | - | | 2.1364 | 7896 | 0.192 | - | - | | 2.1366 | 7897 | 0.1847 | - | - | | 2.1369 | 7898 | 0.2422 | - | - | | 2.1372 | 7899 | 0.1843 | - | - | | 2.1374 | 7900 | 0.1808 | - | - | | 2.1377 | 7901 | 0.2166 | - | - | | 2.1380 | 7902 | 0.215 | - | - | | 2.1383 | 7903 | 0.2254 | - | - | | 2.1385 | 7904 | 0.2116 | - | - | | 2.1388 | 7905 | 0.1629 | - | - | | 2.1391 | 7906 | 0.1786 | - | - | | 2.1393 | 7907 | 0.224 | - | - | | 2.1396 | 7908 | 0.1511 | - | - | | 2.1399 | 7909 | 0.139 | - | - | | 2.1402 | 7910 | 0.2234 | - | - | | 2.1404 | 7911 | 0.1609 | - | - | | 2.1407 | 7912 | 0.1847 | - | - | | 2.1410 | 7913 | 0.1107 | - | - | | 2.1412 | 7914 | 0.2006 | - | - | | 2.1415 | 7915 | 0.2237 | - | - | | 2.1418 | 7916 | 0.2013 | - | - | | 2.1420 | 7917 | 0.2144 | - | - | | 2.1423 | 7918 | 0.2501 | - | - | | 2.1426 | 7919 | 0.2439 | - | - | | 2.1429 | 7920 | 0.1779 | - | - | | 2.1431 | 7921 | 0.2429 | - | - | | 2.1434 | 7922 | 0.3119 | - | - | | 2.1437 | 7923 | 0.221 | - | - | | 2.1439 | 7924 | 0.2683 | - | - | | 2.1442 | 7925 | 0.149 | - | - | | 2.1445 | 7926 | 0.2716 | - | - | | 2.1448 | 7927 | 0.1874 | - | - | | 2.1450 | 7928 | 0.142 | - | - | | 2.1453 | 7929 | 0.255 | - | - 
| | 2.1456 | 7930 | 0.2688 | - | - | | 2.1458 | 7931 | 0.2296 | - | - | | 2.1461 | 7932 | 0.1727 | - | - | | 2.1464 | 7933 | 0.2375 | - | - | | 2.1466 | 7934 | 0.1652 | - | - | | 2.1469 | 7935 | 0.2429 | - | - | | 2.1472 | 7936 | 0.1874 | - | - | | 2.1475 | 7937 | 0.1763 | - | - | | 2.1477 | 7938 | 0.1706 | - | - | | 2.1480 | 7939 | 0.1754 | - | - | | 2.1483 | 7940 | 0.1515 | - | - | | 2.1485 | 7941 | 0.2257 | - | - | | 2.1488 | 7942 | 0.1919 | - | - | | 2.1491 | 7943 | 0.2503 | - | - | | 2.1494 | 7944 | 0.1509 | - | - | | 2.1496 | 7945 | 0.2117 | - | - | | 2.1499 | 7946 | 0.1144 | - | - | | 2.1502 | 7947 | 0.1906 | - | - | | 2.1504 | 7948 | 0.205 | - | - | | 2.1507 | 7949 | 0.1819 | - | - | | 2.1510 | 7950 | 0.2099 | - | - | | 2.1512 | 7951 | 0.2306 | - | - | | 2.1515 | 7952 | 0.1895 | - | - | | 2.1518 | 7953 | 0.2015 | - | - | | 2.1521 | 7954 | 0.2981 | - | - | | 2.1523 | 7955 | 0.211 | - | - | | 2.1526 | 7956 | 0.1693 | - | - | | 2.1529 | 7957 | 0.1534 | - | - | | 2.1531 | 7958 | 0.1917 | - | - | | 2.1534 | 7959 | 0.1774 | - | - | | 2.1537 | 7960 | 0.1369 | - | - | | 2.1540 | 7961 | 0.2034 | - | - | | 2.1542 | 7962 | 0.1961 | - | - | | 2.1545 | 7963 | 0.1678 | - | - | | 2.1548 | 7964 | 0.2346 | - | - | | 2.1550 | 7965 | 0.1571 | - | - | | 2.1553 | 7966 | 0.1958 | - | - | | 2.1556 | 7967 | 0.1485 | - | - | | 2.1558 | 7968 | 0.2443 | - | - | | 2.1561 | 7969 | 0.1679 | - | - | | 2.1564 | 7970 | 0.1581 | - | - | | 2.1567 | 7971 | 0.2248 | - | - | | 2.1569 | 7972 | 0.1322 | - | - | | 2.1572 | 7973 | 0.1869 | - | - | | 2.1575 | 7974 | 0.1964 | - | - | | 2.1577 | 7975 | 0.1667 | - | - | | 2.1580 | 7976 | 0.1707 | - | - | | 2.1583 | 7977 | 0.3056 | - | - | | 2.1585 | 7978 | 0.1496 | - | - | | 2.1588 | 7979 | 0.1532 | - | - | | 2.1591 | 7980 | 0.23 | - | - | | 2.1594 | 7981 | 0.1497 | - | - | | 2.1596 | 7982 | 0.1197 | - | - | | 2.1599 | 7983 | 0.2113 | - | - | | 2.1602 | 7984 | 0.2307 | - | - | | 2.1604 | 7985 | 0.2483 | - | - | | 2.1607 | 7986 | 0.1228 | - | - | | 
2.1610 | 7987 | 0.1911 | - | - | | 2.1613 | 7988 | 0.1286 | - | - | | 2.1615 | 7989 | 0.1542 | - | - | | 2.1618 | 7990 | 0.2521 | - | - | | 2.1621 | 7991 | 0.1306 | - | - | | 2.1623 | 7992 | 0.223 | - | - | | 2.1626 | 7993 | 0.1814 | - | - | | 2.1629 | 7994 | 0.1646 | - | - | | 2.1631 | 7995 | 0.1854 | - | - | | 2.1634 | 7996 | 0.1802 | - | - | | 2.1637 | 7997 | 0.1867 | - | - | | 2.1640 | 7998 | 0.2711 | - | - | | 2.1642 | 7999 | 0.1839 | - | - | | 2.1645 | 8000 | 0.155 | 0.2057 | 0.9481 | | 2.1648 | 8001 | 0.1963 | - | - | | 2.1650 | 8002 | 0.1846 | - | - | | 2.1653 | 8003 | 0.1927 | - | - | | 2.1656 | 8004 | 0.1802 | - | - | | 2.1659 | 8005 | 0.2297 | - | - | | 2.1661 | 8006 | 0.2011 | - | - | | 2.1664 | 8007 | 0.1602 | - | - | | 2.1667 | 8008 | 0.148 | - | - | | 2.1669 | 8009 | 0.23 | - | - | | 2.1672 | 8010 | 0.1813 | - | - | | 2.1675 | 8011 | 0.1519 | - | - | | 2.1677 | 8012 | 0.1744 | - | - | | 2.1680 | 8013 | 0.1822 | - | - | | 2.1683 | 8014 | 0.1417 | - | - | | 2.1686 | 8015 | 0.1138 | - | - | | 2.1688 | 8016 | 0.1498 | - | - | | 2.1691 | 8017 | 0.1683 | - | - | | 2.1694 | 8018 | 0.2155 | - | - | | 2.1696 | 8019 | 0.2044 | - | - | | 2.1699 | 8020 | 0.1541 | - | - | | 2.1702 | 8021 | 0.1493 | - | - | | 2.1705 | 8022 | 0.1574 | - | - | | 2.1707 | 8023 | 0.1815 | - | - | | 2.1710 | 8024 | 0.1189 | - | - | | 2.1713 | 8025 | 0.2144 | - | - | | 2.1715 | 8026 | 0.1989 | - | - | | 2.1718 | 8027 | 0.1737 | - | - | | 2.1721 | 8028 | 0.1768 | - | - | | 2.1723 | 8029 | 0.2391 | - | - | | 2.1726 | 8030 | 0.1605 | - | - | | 2.1729 | 8031 | 0.2083 | - | - | | 2.1732 | 8032 | 0.1694 | - | - | | 2.1734 | 8033 | 0.1353 | - | - | | 2.1737 | 8034 | 0.144 | - | - | | 2.1740 | 8035 | 0.1832 | - | - | | 2.1742 | 8036 | 0.1363 | - | - | | 2.1745 | 8037 | 0.1878 | - | - | | 2.1748 | 8038 | 0.1577 | - | - | | 2.1751 | 8039 | 0.2338 | - | - | | 2.1753 | 8040 | 0.2136 | - | - | | 2.1756 | 8041 | 0.2067 | - | - | | 2.1759 | 8042 | 0.2017 | - | - | | 2.1761 | 8043 | 0.1593 | - | - | | 
2.1764 | 8044 | 0.1953 | - | - | | 2.1767 | 8045 | 0.1876 | - | - | | 2.1769 | 8046 | 0.1827 | - | - | | 2.1772 | 8047 | 0.2425 | - | - | | 2.1775 | 8048 | 0.2047 | - | - | | 2.1778 | 8049 | 0.198 | - | - | | 2.1780 | 8050 | 0.1535 | - | - | | 2.1783 | 8051 | 0.1835 | - | - | | 2.1786 | 8052 | 0.1771 | - | - | | 2.1788 | 8053 | 0.1908 | - | - | | 2.1791 | 8054 | 0.1904 | - | - | | 2.1794 | 8055 | 0.1464 | - | - | | 2.1797 | 8056 | 0.1597 | - | - | | 2.1799 | 8057 | 0.183 | - | - | | 2.1802 | 8058 | 0.1659 | - | - | | 2.1805 | 8059 | 0.127 | - | - | | 2.1807 | 8060 | 0.2062 | - | - | | 2.1810 | 8061 | 0.1819 | - | - | | 2.1813 | 8062 | 0.2099 | - | - | | 2.1815 | 8063 | 0.1932 | - | - | | 2.1818 | 8064 | 0.1753 | - | - | | 2.1821 | 8065 | 0.1436 | - | - | | 2.1824 | 8066 | 0.1969 | - | - | | 2.1826 | 8067 | 0.1991 | - | - | | 2.1829 | 8068 | 0.221 | - | - | | 2.1832 | 8069 | 0.1091 | - | - | | 2.1834 | 8070 | 0.1389 | - | - | | 2.1837 | 8071 | 0.1811 | - | - | | 2.1840 | 8072 | 0.1843 | - | - | | 2.1843 | 8073 | 0.2081 | - | - | | 2.1845 | 8074 | 0.1761 | - | - | | 2.1848 | 8075 | 0.2002 | - | - | | 2.1851 | 8076 | 0.1281 | - | - | | 2.1853 | 8077 | 0.1888 | - | - | | 2.1856 | 8078 | 0.1436 | - | - | | 2.1859 | 8079 | 0.2196 | - | - | | 2.1861 | 8080 | 0.1622 | - | - | | 2.1864 | 8081 | 0.1683 | - | - | | 2.1867 | 8082 | 0.203 | - | - | | 2.1870 | 8083 | 0.1641 | - | - | | 2.1872 | 8084 | 0.1907 | - | - | | 2.1875 | 8085 | 0.1195 | - | - | | 2.1878 | 8086 | 0.1679 | - | - | | 2.1880 | 8087 | 0.1356 | - | - | | 2.1883 | 8088 | 0.2069 | - | - | | 2.1886 | 8089 | 0.1524 | - | - | | 2.1889 | 8090 | 0.182 | - | - | | 2.1891 | 8091 | 0.1847 | - | - | | 2.1894 | 8092 | 0.148 | - | - | | 2.1897 | 8093 | 0.203 | - | - | | 2.1899 | 8094 | 0.1545 | - | - | | 2.1902 | 8095 | 0.1751 | - | - | | 2.1905 | 8096 | 0.177 | - | - | | 2.1907 | 8097 | 0.1906 | - | - | | 2.1910 | 8098 | 0.2024 | - | - | | 2.1913 | 8099 | 0.2263 | - | - | | 2.1916 | 8100 | 0.1687 | - | - | | 2.1918 | 8101 
| 0.1948 | - | - | | 2.1921 | 8102 | 0.1848 | - | - | | 2.1924 | 8103 | 0.2446 | - | - | | 2.1926 | 8104 | 0.1889 | - | - | | 2.1929 | 8105 | 0.1811 | - | - | | 2.1932 | 8106 | 0.1607 | - | - | | 2.1935 | 8107 | 0.1878 | - | - | | 2.1937 | 8108 | 0.2175 | - | - | | 2.1940 | 8109 | 0.1158 | - | - | | 2.1943 | 8110 | 0.152 | - | - | | 2.1945 | 8111 | 0.1888 | - | - | | 2.1948 | 8112 | 0.2252 | - | - | | 2.1951 | 8113 | 0.1414 | - | - | | 2.1953 | 8114 | 0.1984 | - | - | | 2.1956 | 8115 | 0.2137 | - | - | | 2.1959 | 8116 | 0.2205 | - | - | | 2.1962 | 8117 | 0.1965 | - | - | | 2.1964 | 8118 | 0.2 | - | - | | 2.1967 | 8119 | 0.139 | - | - | | 2.1970 | 8120 | 0.1805 | - | - | | 2.1972 | 8121 | 0.2589 | - | - | | 2.1975 | 8122 | 0.1685 | - | - | | 2.1978 | 8123 | 0.2004 | - | - | | 2.1981 | 8124 | 0.1435 | - | - | | 2.1983 | 8125 | 0.1641 | - | - | | 2.1986 | 8126 | 0.1826 | - | - | | 2.1989 | 8127 | 0.1253 | - | - | | 2.1991 | 8128 | 0.1641 | - | - | | 2.1994 | 8129 | 0.2133 | - | - | | 2.1997 | 8130 | 0.1692 | - | - | | 2.1999 | 8131 | 0.1869 | - | - | | 2.2002 | 8132 | 0.2041 | - | - | | 2.2005 | 8133 | 0.1495 | - | - | | 2.2008 | 8134 | 0.1667 | - | - | | 2.2010 | 8135 | 0.1835 | - | - | | 2.2013 | 8136 | 0.1277 | - | - | | 2.2016 | 8137 | 0.2033 | - | - | | 2.2018 | 8138 | 0.2104 | - | - | | 2.2021 | 8139 | 0.1847 | - | - | | 2.2024 | 8140 | 0.2103 | - | - | | 2.2027 | 8141 | 0.1792 | - | - | | 2.2029 | 8142 | 0.2054 | - | - | | 2.2032 | 8143 | 0.2332 | - | - | | 2.2035 | 8144 | 0.1744 | - | - | | 2.2037 | 8145 | 0.1593 | - | - | | 2.2040 | 8146 | 0.1625 | - | - | | 2.2043 | 8147 | 0.1083 | - | - | | 2.2045 | 8148 | 0.1347 | - | - | | 2.2048 | 8149 | 0.2465 | - | - | | 2.2051 | 8150 | 0.2673 | - | - | | 2.2054 | 8151 | 0.1908 | - | - | | 2.2056 | 8152 | 0.2047 | - | - | | 2.2059 | 8153 | 0.1705 | - | - | | 2.2062 | 8154 | 0.1908 | - | - | | 2.2064 | 8155 | 0.1836 | - | - | | 2.2067 | 8156 | 0.207 | - | - | | 2.2070 | 8157 | 0.2576 | - | - | | 2.2073 | 8158 | 0.1505 | 
- | - | | 2.2075 | 8159 | 0.1966 | - | - | | 2.2078 | 8160 | 0.2198 | - | - | | 2.2081 | 8161 | 0.1442 | - | - | | 2.2083 | 8162 | 0.1467 | - | - | | 2.2086 | 8163 | 0.1445 | - | - | | 2.2089 | 8164 | 0.1428 | - | - | | 2.2091 | 8165 | 0.1896 | - | - | | 2.2094 | 8166 | 0.1434 | - | - | | 2.2097 | 8167 | 0.1954 | - | - | | 2.2100 | 8168 | 0.1859 | - | - | | 2.2102 | 8169 | 0.1262 | - | - | | 2.2105 | 8170 | 0.1702 | - | - | | 2.2108 | 8171 | 0.2022 | - | - | | 2.2110 | 8172 | 0.1827 | - | - | | 2.2113 | 8173 | 0.2254 | - | - | | 2.2116 | 8174 | 0.1487 | - | - | | 2.2119 | 8175 | 0.1822 | - | - | | 2.2121 | 8176 | 0.1877 | - | - | | 2.2124 | 8177 | 0.25 | - | - | | 2.2127 | 8178 | 0.1468 | - | - | | 2.2129 | 8179 | 0.1574 | - | - | | 2.2132 | 8180 | 0.1341 | - | - | | 2.2135 | 8181 | 0.1818 | - | - | | 2.2137 | 8182 | 0.1726 | - | - | | 2.2140 | 8183 | 0.1887 | - | - | | 2.2143 | 8184 | 0.3051 | - | - | | 2.2146 | 8185 | 0.1898 | - | - | | 2.2148 | 8186 | 0.1986 | - | - | | 2.2151 | 8187 | 0.279 | - | - | | 2.2154 | 8188 | 0.1611 | - | - | | 2.2156 | 8189 | 0.1519 | - | - | | 2.2159 | 8190 | 0.1446 | - | - | | 2.2162 | 8191 | 0.192 | - | - | | 2.2165 | 8192 | 0.2045 | - | - | | 2.2167 | 8193 | 0.1728 | - | - | | 2.2170 | 8194 | 0.1239 | - | - | | 2.2173 | 8195 | 0.2428 | - | - | | 2.2175 | 8196 | 0.1803 | - | - | | 2.2178 | 8197 | 0.1669 | - | - | | 2.2181 | 8198 | 0.1727 | - | - | | 2.2183 | 8199 | 0.1213 | - | - | | 2.2186 | 8200 | 0.1679 | - | - | | 2.2189 | 8201 | 0.2219 | - | - | | 2.2192 | 8202 | 0.1387 | - | - | | 2.2194 | 8203 | 0.1762 | - | - | | 2.2197 | 8204 | 0.1388 | - | - | | 2.2200 | 8205 | 0.1913 | - | - | | 2.2202 | 8206 | 0.1889 | - | - | | 2.2205 | 8207 | 0.2201 | - | - | | 2.2208 | 8208 | 0.1533 | - | - | | 2.2210 | 8209 | 0.2094 | - | - | | 2.2213 | 8210 | 0.1979 | - | - | | 2.2216 | 8211 | 0.2431 | - | - | | 2.2219 | 8212 | 0.1788 | - | - | | 2.2221 | 8213 | 0.1297 | - | - | | 2.2224 | 8214 | 0.2591 | - | - | | 2.2227 | 8215 | 0.1971 | - | - | 
| 2.2229 | 8216 | 0.2043 | - | - | | 2.2232 | 8217 | 0.1891 | - | - | | 2.2235 | 8218 | 0.2081 | - | - | | 2.2238 | 8219 | 0.1578 | - | - | | 2.2240 | 8220 | 0.1671 | - | - | | 2.2243 | 8221 | 0.1848 | - | - | | 2.2246 | 8222 | 0.1819 | - | - | | 2.2248 | 8223 | 0.1933 | - | - | | 2.2251 | 8224 | 0.1919 | - | - | | 2.2254 | 8225 | 0.1942 | - | - | | 2.2256 | 8226 | 0.1495 | - | - | | 2.2259 | 8227 | 0.2352 | - | - | | 2.2262 | 8228 | 0.1722 | - | - | | 2.2265 | 8229 | 0.1646 | - | - | | 2.2267 | 8230 | 0.1791 | - | - | | 2.2270 | 8231 | 0.2486 | - | - | | 2.2273 | 8232 | 0.2206 | - | - | | 2.2275 | 8233 | 0.2176 | - | - | | 2.2278 | 8234 | 0.2157 | - | - | | 2.2281 | 8235 | 0.1818 | - | - | | 2.2284 | 8236 | 0.1706 | - | - | | 2.2286 | 8237 | 0.149 | - | - | | 2.2289 | 8238 | 0.202 | - | - | | 2.2292 | 8239 | 0.1732 | - | - | | 2.2294 | 8240 | 0.1554 | - | - | | 2.2297 | 8241 | 0.2118 | - | - | | 2.2300 | 8242 | 0.1787 | - | - | | 2.2302 | 8243 | 0.1615 | - | - | | 2.2305 | 8244 | 0.2186 | - | - | | 2.2308 | 8245 | 0.1994 | - | - | | 2.2311 | 8246 | 0.2023 | - | - | | 2.2313 | 8247 | 0.1728 | - | - | | 2.2316 | 8248 | 0.1883 | - | - | | 2.2319 | 8249 | 0.239 | - | - | | 2.2321 | 8250 | 0.1272 | - | - | | 2.2324 | 8251 | 0.1711 | - | - | | 2.2327 | 8252 | 0.1909 | - | - | | 2.2330 | 8253 | 0.2439 | - | - | | 2.2332 | 8254 | 0.1399 | - | - | | 2.2335 | 8255 | 0.1486 | - | - | | 2.2338 | 8256 | 0.1567 | - | - | | 2.2340 | 8257 | 0.1454 | - | - | | 2.2343 | 8258 | 0.1331 | - | - | | 2.2346 | 8259 | 0.1704 | - | - | | 2.2348 | 8260 | 0.1505 | - | - | | 2.2351 | 8261 | 0.1502 | - | - | | 2.2354 | 8262 | 0.1863 | - | - | | 2.2357 | 8263 | 0.1278 | - | - | | 2.2359 | 8264 | 0.2297 | - | - | | 2.2362 | 8265 | 0.194 | - | - | | 2.2365 | 8266 | 0.1524 | - | - | | 2.2367 | 8267 | 0.1696 | - | - | | 2.2370 | 8268 | 0.2592 | - | - | | 2.2373 | 8269 | 0.2001 | - | - | | 2.2376 | 8270 | 0.1385 | - | - | | 2.2378 | 8271 | 0.2195 | - | - | | 2.2381 | 8272 | 0.2161 | - | - | | 2.2384 
| 8273 | 0.2451 | - | - | | 2.2386 | 8274 | 0.1982 | - | - | | 2.2389 | 8275 | 0.1578 | - | - | | 2.2392 | 8276 | 0.1898 | - | - | | 2.2394 | 8277 | 0.2103 | - | - | | 2.2397 | 8278 | 0.1788 | - | - | | 2.2400 | 8279 | 0.1771 | - | - | | 2.2403 | 8280 | 0.1308 | - | - | | 2.2405 | 8281 | 0.142 | - | - | | 2.2408 | 8282 | 0.2895 | - | - | | 2.2411 | 8283 | 0.212 | - | - | | 2.2413 | 8284 | 0.1557 | - | - | | 2.2416 | 8285 | 0.1677 | - | - | | 2.2419 | 8286 | 0.1739 | - | - | | 2.2422 | 8287 | 0.2369 | - | - | | 2.2424 | 8288 | 0.1829 | - | - | | 2.2427 | 8289 | 0.2037 | - | - | | 2.2430 | 8290 | 0.1254 | - | - | | 2.2432 | 8291 | 0.1394 | - | - | | 2.2435 | 8292 | 0.1539 | - | - | | 2.2438 | 8293 | 0.1818 | - | - | | 2.2440 | 8294 | 0.168 | - | - | | 2.2443 | 8295 | 0.1585 | - | - | | 2.2446 | 8296 | 0.1714 | - | - | | 2.2449 | 8297 | 0.2006 | - | - | | 2.2451 | 8298 | 0.0946 | - | - | | 2.2454 | 8299 | 0.1426 | - | - | | 2.2457 | 8300 | 0.2293 | - | - | | 2.2459 | 8301 | 0.1793 | - | - | | 2.2462 | 8302 | 0.2012 | - | - | | 2.2465 | 8303 | 0.2596 | - | - | | 2.2468 | 8304 | 0.3237 | - | - | | 2.2470 | 8305 | 0.1886 | - | - | | 2.2473 | 8306 | 0.1559 | - | - | | 2.2476 | 8307 | 0.1571 | - | - | | 2.2478 | 8308 | 0.177 | - | - | | 2.2481 | 8309 | 0.1481 | - | - | | 2.2484 | 8310 | 0.2141 | - | - | | 2.2486 | 8311 | 0.2189 | - | - | | 2.2489 | 8312 | 0.2041 | - | - | | 2.2492 | 8313 | 0.1859 | - | - | | 2.2495 | 8314 | 0.2363 | - | - | | 2.2497 | 8315 | 0.1626 | - | - | | 2.25 | 8316 | 0.1633 | - | - | | 2.2503 | 8317 | 0.1619 | - | - | | 2.2505 | 8318 | 0.2287 | - | - | | 2.2508 | 8319 | 0.1917 | - | - | | 2.2511 | 8320 | 0.2587 | - | - | | 2.2514 | 8321 | 0.2318 | - | - | | 2.2516 | 8322 | 0.1303 | - | - | | 2.2519 | 8323 | 0.1397 | - | - | | 2.2522 | 8324 | 0.1966 | - | - | | 2.2524 | 8325 | 0.1529 | - | - | | 2.2527 | 8326 | 0.2019 | - | - | | 2.2530 | 8327 | 0.129 | - | - | | 2.2532 | 8328 | 0.2209 | - | - | | 2.2535 | 8329 | 0.2107 | - | - | | 2.2538 | 8330 | 
0.1682 | - | - | | 2.2541 | 8331 | 0.2316 | - | - | | 2.2543 | 8332 | 0.2599 | - | - | | 2.2546 | 8333 | 0.1319 | - | - | | 2.2549 | 8334 | 0.2367 | - | - | | 2.2551 | 8335 | 0.1961 | - | - | | 2.2554 | 8336 | 0.1432 | - | - | | 2.2557 | 8337 | 0.2423 | - | - | | 2.2560 | 8338 | 0.1471 | - | - | | 2.2562 | 8339 | 0.1799 | - | - | | 2.2565 | 8340 | 0.2101 | - | - | | 2.2568 | 8341 | 0.1797 | - | - | | 2.2570 | 8342 | 0.1664 | - | - | | 2.2573 | 8343 | 0.1883 | - | - | | 2.2576 | 8344 | 0.2316 | - | - | | 2.2578 | 8345 | 0.1746 | - | - | | 2.2581 | 8346 | 0.2033 | - | - | | 2.2584 | 8347 | 0.1577 | - | - | | 2.2587 | 8348 | 0.1903 | - | - | | 2.2589 | 8349 | 0.1499 | - | - | | 2.2592 | 8350 | 0.1757 | - | - | | 2.2595 | 8351 | 0.1559 | - | - | | 2.2597 | 8352 | 0.1592 | - | - | | 2.2600 | 8353 | 0.1848 | - | - | | 2.2603 | 8354 | 0.1652 | - | - | | 2.2606 | 8355 | 0.1712 | - | - | | 2.2608 | 8356 | 0.2346 | - | - | | 2.2611 | 8357 | 0.2326 | - | - | | 2.2614 | 8358 | 0.1486 | - | - | | 2.2616 | 8359 | 0.1467 | - | - | | 2.2619 | 8360 | 0.2658 | - | - | | 2.2622 | 8361 | 0.2403 | - | - | | 2.2624 | 8362 | 0.1644 | - | - | | 2.2627 | 8363 | 0.2082 | - | - | | 2.2630 | 8364 | 0.1802 | - | - | | 2.2633 | 8365 | 0.1789 | - | - | | 2.2635 | 8366 | 0.148 | - | - | | 2.2638 | 8367 | 0.225 | - | - | | 2.2641 | 8368 | 0.1397 | - | - | | 2.2643 | 8369 | 0.1664 | - | - | | 2.2646 | 8370 | 0.2209 | - | - | | 2.2649 | 8371 | 0.15 | - | - | | 2.2652 | 8372 | 0.1735 | - | - | | 2.2654 | 8373 | 0.1462 | - | - | | 2.2657 | 8374 | 0.1327 | - | - | | 2.2660 | 8375 | 0.1765 | - | - | | 2.2662 | 8376 | 0.1462 | - | - | | 2.2665 | 8377 | 0.207 | - | - | | 2.2668 | 8378 | 0.1761 | - | - | | 2.2670 | 8379 | 0.1606 | - | - | | 2.2673 | 8380 | 0.1464 | - | - | | 2.2676 | 8381 | 0.2012 | - | - | | 2.2679 | 8382 | 0.2416 | - | - | | 2.2681 | 8383 | 0.1407 | - | - | | 2.2684 | 8384 | 0.2082 | - | - | | 2.2687 | 8385 | 0.1543 | - | - | | 2.2689 | 8386 | 0.1394 | - | - | | 2.2692 | 8387 | 0.1705 | 
- | - | | 2.2695 | 8388 | 0.1534 | - | - | | 2.2698 | 8389 | 0.1566 | - | - | | 2.2700 | 8390 | 0.1332 | - | - | | 2.2703 | 8391 | 0.1617 | - | - | | 2.2706 | 8392 | 0.1633 | - | - | | 2.2708 | 8393 | 0.1605 | - | - | | 2.2711 | 8394 | 0.2242 | - | - | | 2.2714 | 8395 | 0.2214 | - | - | | 2.2716 | 8396 | 0.175 | - | - | | 2.2719 | 8397 | 0.1841 | - | - | | 2.2722 | 8398 | 0.1693 | - | - | | 2.2725 | 8399 | 0.1946 | - | - | | 2.2727 | 8400 | 0.1831 | - | - | | 2.2730 | 8401 | 0.162 | - | - | | 2.2733 | 8402 | 0.154 | - | - | | 2.2735 | 8403 | 0.1528 | - | - | | 2.2738 | 8404 | 0.1633 | - | - | | 2.2741 | 8405 | 0.224 | - | - | | 2.2744 | 8406 | 0.2296 | - | - | | 2.2746 | 8407 | 0.2225 | - | - | | 2.2749 | 8408 | 0.2178 | - | - | | 2.2752 | 8409 | 0.1834 | - | - | | 2.2754 | 8410 | 0.2058 | - | - | | 2.2757 | 8411 | 0.1605 | - | - | | 2.2760 | 8412 | 0.1937 | - | - | | 2.2762 | 8413 | 0.1567 | - | - | | 2.2765 | 8414 | 0.1853 | - | - | | 2.2768 | 8415 | 0.2097 | - | - | | 2.2771 | 8416 | 0.2448 | - | - | | 2.2773 | 8417 | 0.2153 | - | - | | 2.2776 | 8418 | 0.2581 | - | - | | 2.2779 | 8419 | 0.1331 | - | - | | 2.2781 | 8420 | 0.2408 | - | - | | 2.2784 | 8421 | 0.258 | - | - | | 2.2787 | 8422 | 0.2121 | - | - | | 2.2790 | 8423 | 0.2476 | - | - | | 2.2792 | 8424 | 0.1436 | - | - | | 2.2795 | 8425 | 0.1427 | - | - | | 2.2798 | 8426 | 0.2115 | - | - | | 2.2800 | 8427 | 0.1346 | - | - | | 2.2803 | 8428 | 0.1714 | - | - | | 2.2806 | 8429 | 0.1522 | - | - | | 2.2808 | 8430 | 0.1671 | - | - | | 2.2811 | 8431 | 0.1418 | - | - | | 2.2814 | 8432 | 0.1581 | - | - | | 2.2817 | 8433 | 0.2278 | - | - | | 2.2819 | 8434 | 0.207 | - | - | | 2.2822 | 8435 | 0.1739 | - | - | | 2.2825 | 8436 | 0.1877 | - | - | | 2.2827 | 8437 | 0.1159 | - | - | | 2.2830 | 8438 | 0.2233 | - | - | | 2.2833 | 8439 | 0.2493 | - | - | | 2.2835 | 8440 | 0.2317 | - | - | | 2.2838 | 8441 | 0.2212 | - | - | | 2.2841 | 8442 | 0.2231 | - | - | | 2.2844 | 8443 | 0.2218 | - | - | | 2.2846 | 8444 | 0.2851 | - | - | | 
2.2849 | 8445 | 0.2261 | - | - | | 2.2852 | 8446 | 0.2038 | - | - | | 2.2854 | 8447 | 0.1769 | - | - | | 2.2857 | 8448 | 0.157 | - | - | | 2.2860 | 8449 | 0.1886 | - | - | | 2.2863 | 8450 | 0.1752 | - | - | | 2.2865 | 8451 | 0.1514 | - | - | | 2.2868 | 8452 | 0.2474 | - | - | | 2.2871 | 8453 | 0.1642 | - | - | | 2.2873 | 8454 | 0.1596 | - | - | | 2.2876 | 8455 | 0.1498 | - | - | | 2.2879 | 8456 | 0.1677 | - | - | | 2.2881 | 8457 | 0.1669 | - | - | | 2.2884 | 8458 | 0.1975 | - | - | | 2.2887 | 8459 | 0.1792 | - | - | | 2.2890 | 8460 | 0.1555 | - | - | | 2.2892 | 8461 | 0.2362 | - | - | | 2.2895 | 8462 | 0.1786 | - | - | | 2.2898 | 8463 | 0.1412 | - | - | | 2.2900 | 8464 | 0.2661 | - | - | | 2.2903 | 8465 | 0.1585 | - | - | | 2.2906 | 8466 | 0.2773 | - | - | | 2.2909 | 8467 | 0.1155 | - | - | | 2.2911 | 8468 | 0.166 | - | - | | 2.2914 | 8469 | 0.1256 | - | - | | 2.2917 | 8470 | 0.1941 | - | - | | 2.2919 | 8471 | 0.2275 | - | - | | 2.2922 | 8472 | 0.1654 | - | - | | 2.2925 | 8473 | 0.1774 | - | - | | 2.2927 | 8474 | 0.1745 | - | - | | 2.2930 | 8475 | 0.1864 | - | - | | 2.2933 | 8476 | 0.1403 | - | - | | 2.2936 | 8477 | 0.2255 | - | - | | 2.2938 | 8478 | 0.111 | - | - | | 2.2941 | 8479 | 0.1433 | - | - | | 2.2944 | 8480 | 0.332 | - | - | | 2.2946 | 8481 | 0.1498 | - | - | | 2.2949 | 8482 | 0.1223 | - | - | | 2.2952 | 8483 | 0.2207 | - | - | | 2.2955 | 8484 | 0.2089 | - | - | | 2.2957 | 8485 | 0.2147 | - | - | | 2.2960 | 8486 | 0.1632 | - | - | | 2.2963 | 8487 | 0.1458 | - | - | | 2.2965 | 8488 | 0.2236 | - | - | | 2.2968 | 8489 | 0.1895 | - | - | | 2.2971 | 8490 | 0.2098 | - | - | | 2.2973 | 8491 | 0.1557 | - | - | | 2.2976 | 8492 | 0.1561 | - | - | | 2.2979 | 8493 | 0.1602 | - | - | | 2.2982 | 8494 | 0.1856 | - | - | | 2.2984 | 8495 | 0.1748 | - | - | | 2.2987 | 8496 | 0.193 | - | - | | 2.2990 | 8497 | 0.2213 | - | - | | 2.2992 | 8498 | 0.1693 | - | - | | 2.2995 | 8499 | 0.2138 | - | - | | 2.2998 | 8500 | 0.1622 | - | - | | 2.3001 | 8501 | 0.1599 | - | - | | 2.3003 | 
8502 | 0.1983 | - | - | | 2.3006 | 8503 | 0.1534 | - | - | | 2.3009 | 8504 | 0.1789 | - | - | | 2.3011 | 8505 | 0.1571 | - | - | | 2.3014 | 8506 | 0.1844 | - | - | | 2.3017 | 8507 | 0.2047 | - | - | | 2.3019 | 8508 | 0.227 | - | - | | 2.3022 | 8509 | 0.1843 | - | - | | 2.3025 | 8510 | 0.2249 | - | - | | 2.3028 | 8511 | 0.2144 | - | - | | 2.3030 | 8512 | 0.149 | - | - | | 2.3033 | 8513 | 0.1449 | - | - | | 2.3036 | 8514 | 0.2425 | - | - | | 2.3038 | 8515 | 0.1824 | - | - | | 2.3041 | 8516 | 0.2097 | - | - | | 2.3044 | 8517 | 0.2737 | - | - | | 2.3047 | 8518 | 0.2245 | - | - | | 2.3049 | 8519 | 0.2002 | - | - | | 2.3052 | 8520 | 0.2107 | - | - | | 2.3055 | 8521 | 0.1675 | - | - | | 2.3057 | 8522 | 0.1713 | - | - | | 2.3060 | 8523 | 0.1553 | - | - | | 2.3063 | 8524 | 0.167 | - | - | | 2.3065 | 8525 | 0.1773 | - | - | | 2.3068 | 8526 | 0.2511 | - | - | | 2.3071 | 8527 | 0.2165 | - | - | | 2.3074 | 8528 | 0.2162 | - | - | | 2.3076 | 8529 | 0.1958 | - | - | | 2.3079 | 8530 | 0.2326 | - | - | | 2.3082 | 8531 | 0.1832 | - | - | | 2.3084 | 8532 | 0.1441 | - | - | | 2.3087 | 8533 | 0.1557 | - | - | | 2.3090 | 8534 | 0.1493 | - | - | | 2.3093 | 8535 | 0.2065 | - | - | | 2.3095 | 8536 | 0.2311 | - | - | | 2.3098 | 8537 | 0.1883 | - | - | | 2.3101 | 8538 | 0.2509 | - | - | | 2.3103 | 8539 | 0.185 | - | - | | 2.3106 | 8540 | 0.1678 | - | - | | 2.3109 | 8541 | 0.1799 | - | - | | 2.3111 | 8542 | 0.282 | - | - | | 2.3114 | 8543 | 0.1768 | - | - | | 2.3117 | 8544 | 0.2195 | - | - | | 2.3120 | 8545 | 0.1765 | - | - | | 2.3122 | 8546 | 0.2756 | - | - | | 2.3125 | 8547 | 0.1818 | - | - | | 2.3128 | 8548 | 0.2537 | - | - | | 2.3130 | 8549 | 0.1355 | - | - | | 2.3133 | 8550 | 0.1367 | - | - | | 2.3136 | 8551 | 0.1675 | - | - | | 2.3139 | 8552 | 0.2128 | - | - | | 2.3141 | 8553 | 0.147 | - | - | | 2.3144 | 8554 | 0.2187 | - | - | | 2.3147 | 8555 | 0.1618 | - | - | | 2.3149 | 8556 | 0.1856 | - | - | | 2.3152 | 8557 | 0.2222 | - | - | | 2.3155 | 8558 | 0.2321 | - | - | | 2.3157 | 8559 | 
0.2025 | - | - | | 2.3160 | 8560 | 0.1612 | - | - | | 2.3163 | 8561 | 0.1102 | - | - | | 2.3166 | 8562 | 0.1916 | - | - | | 2.3168 | 8563 | 0.1447 | - | - | | 2.3171 | 8564 | 0.2378 | - | - | | 2.3174 | 8565 | 0.1578 | - | - | | 2.3176 | 8566 | 0.3134 | - | - | | 2.3179 | 8567 | 0.1474 | - | - | | 2.3182 | 8568 | 0.2143 | - | - | | 2.3185 | 8569 | 0.1964 | - | - | | 2.3187 | 8570 | 0.2355 | - | - | | 2.3190 | 8571 | 0.1723 | - | - | | 2.3193 | 8572 | 0.1925 | - | - | | 2.3195 | 8573 | 0.1865 | - | - | | 2.3198 | 8574 | 0.2444 | - | - | | 2.3201 | 8575 | 0.1455 | - | - | | 2.3203 | 8576 | 0.1817 | - | - | | 2.3206 | 8577 | 0.1996 | - | - | | 2.3209 | 8578 | 0.1541 | - | - | | 2.3212 | 8579 | 0.1844 | - | - | | 2.3214 | 8580 | 0.1787 | - | - | | 2.3217 | 8581 | 0.1829 | - | - | | 2.3220 | 8582 | 0.1977 | - | - | | 2.3222 | 8583 | 0.1698 | - | - | | 2.3225 | 8584 | 0.1521 | - | - | | 2.3228 | 8585 | 0.2149 | - | - | | 2.3231 | 8586 | 0.1938 | - | - | | 2.3233 | 8587 | 0.1663 | - | - | | 2.3236 | 8588 | 0.1874 | - | - | | 2.3239 | 8589 | 0.1524 | - | - | | 2.3241 | 8590 | 0.1901 | - | - | | 2.3244 | 8591 | 0.1661 | - | - | | 2.3247 | 8592 | 0.1512 | - | - | | 2.3249 | 8593 | 0.2388 | - | - | | 2.3252 | 8594 | 0.2167 | - | - | | 2.3255 | 8595 | 0.1569 | - | - | | 2.3258 | 8596 | 0.1631 | - | - | | 2.3260 | 8597 | 0.1221 | - | - | | 2.3263 | 8598 | 0.1686 | - | - | | 2.3266 | 8599 | 0.2046 | - | - | | 2.3268 | 8600 | 0.2084 | - | - | | 2.3271 | 8601 | 0.1842 | - | - | | 2.3274 | 8602 | 0.2236 | - | - | | 2.3277 | 8603 | 0.1585 | - | - | | 2.3279 | 8604 | 0.2151 | - | - | | 2.3282 | 8605 | 0.2635 | - | - | | 2.3285 | 8606 | 0.1674 | - | - | | 2.3287 | 8607 | 0.1562 | - | - | | 2.3290 | 8608 | 0.1337 | - | - | | 2.3293 | 8609 | 0.2365 | - | - | | 2.3295 | 8610 | 0.1282 | - | - | | 2.3298 | 8611 | 0.1553 | - | - | | 2.3301 | 8612 | 0.1641 | - | - | | 2.3304 | 8613 | 0.1808 | - | - | | 2.3306 | 8614 | 0.1346 | - | - | | 2.3309 | 8615 | 0.2536 | - | - | | 2.3312 | 8616 | 
0.1313 | - | - | | 2.3314 | 8617 | 0.2053 | - | - | | 2.3317 | 8618 | 0.2167 | - | - | | 2.3320 | 8619 | 0.2016 | - | - | | 2.3323 | 8620 | 0.1376 | - | - | | 2.3325 | 8621 | 0.194 | - | - | | 2.3328 | 8622 | 0.1644 | - | - | | 2.3331 | 8623 | 0.1695 | - | - | | 2.3333 | 8624 | 0.1821 | - | - | | 2.3336 | 8625 | 0.1975 | - | - | | 2.3339 | 8626 | 0.1673 | - | - | | 2.3341 | 8627 | 0.2563 | - | - | | 2.3344 | 8628 | 0.2253 | - | - | | 2.3347 | 8629 | 0.2026 | - | - | | 2.3350 | 8630 | 0.184 | - | - | | 2.3352 | 8631 | 0.2019 | - | - | | 2.3355 | 8632 | 0.2188 | - | - | | 2.3358 | 8633 | 0.1369 | - | - | | 2.3360 | 8634 | 0.109 | - | - | | 2.3363 | 8635 | 0.1622 | - | - | | 2.3366 | 8636 | 0.1615 | - | - | | 2.3369 | 8637 | 0.1759 | - | - | | 2.3371 | 8638 | 0.1714 | - | - | | 2.3374 | 8639 | 0.1645 | - | - | | 2.3377 | 8640 | 0.2208 | - | - | | 2.3379 | 8641 | 0.2952 | - | - | | 2.3382 | 8642 | 0.1501 | - | - | | 2.3385 | 8643 | 0.2117 | - | - | | 2.3387 | 8644 | 0.1615 | - | - | | 2.3390 | 8645 | 0.1606 | - | - | | 2.3393 | 8646 | 0.1562 | - | - | | 2.3396 | 8647 | 0.1626 | - | - | | 2.3398 | 8648 | 0.2099 | - | - | | 2.3401 | 8649 | 0.1616 | - | - | | 2.3404 | 8650 | 0.1536 | - | - | | 2.3406 | 8651 | 0.1904 | - | - | | 2.3409 | 8652 | 0.1648 | - | - | | 2.3412 | 8653 | 0.1353 | - | - | | 2.3415 | 8654 | 0.181 | - | - | | 2.3417 | 8655 | 0.2018 | - | - | | 2.3420 | 8656 | 0.1325 | - | - | | 2.3423 | 8657 | 0.163 | - | - | | 2.3425 | 8658 | 0.1326 | - | - | | 2.3428 | 8659 | 0.1562 | - | - | | 2.3431 | 8660 | 0.1432 | - | - | | 2.3433 | 8661 | 0.1824 | - | - | | 2.3436 | 8662 | 0.1587 | - | - | | 2.3439 | 8663 | 0.2061 | - | - | | 2.3442 | 8664 | 0.2065 | - | - | | 2.3444 | 8665 | 0.1782 | - | - | | 2.3447 | 8666 | 0.2395 | - | - | | 2.3450 | 8667 | 0.2044 | - | - | | 2.3452 | 8668 | 0.1755 | - | - | | 2.3455 | 8669 | 0.1639 | - | - | | 2.3458 | 8670 | 0.1614 | - | - | | 2.3460 | 8671 | 0.1962 | - | - | | 2.3463 | 8672 | 0.1588 | - | - | | 2.3466 | 8673 | 0.1666 | 
- | - | | 2.3469 | 8674 | 0.1034 | - | - | | 2.3471 | 8675 | 0.2166 | - | - | | 2.3474 | 8676 | 0.1995 | - | - | | 2.3477 | 8677 | 0.1803 | - | - | | 2.3479 | 8678 | 0.205 | - | - | | 2.3482 | 8679 | 0.1659 | - | - | | 2.3485 | 8680 | 0.1363 | - | - | | 2.3488 | 8681 | 0.1416 | - | - | | 2.3490 | 8682 | 0.1422 | - | - | | 2.3493 | 8683 | 0.1939 | - | - | | 2.3496 | 8684 | 0.1803 | - | - | | 2.3498 | 8685 | 0.2399 | - | - | | 2.3501 | 8686 | 0.1854 | - | - | | 2.3504 | 8687 | 0.2012 | - | - | | 2.3506 | 8688 | 0.1715 | - | - | | 2.3509 | 8689 | 0.1603 | - | - | | 2.3512 | 8690 | 0.1702 | - | - | | 2.3515 | 8691 | 0.1959 | - | - | | 2.3517 | 8692 | 0.1962 | - | - | | 2.3520 | 8693 | 0.1756 | - | - | | 2.3523 | 8694 | 0.1308 | - | - | | 2.3525 | 8695 | 0.1436 | - | - | | 2.3528 | 8696 | 0.167 | - | - | | 2.3531 | 8697 | 0.139 | - | - | | 2.3534 | 8698 | 0.1774 | - | - | | 2.3536 | 8699 | 0.218 | - | - | | 2.3539 | 8700 | 0.1596 | - | - | | 2.3542 | 8701 | 0.1916 | - | - | | 2.3544 | 8702 | 0.2255 | - | - | | 2.3547 | 8703 | 0.1993 | - | - | | 2.3550 | 8704 | 0.1733 | - | - | | 2.3552 | 8705 | 0.1379 | - | - | | 2.3555 | 8706 | 0.15 | - | - | | 2.3558 | 8707 | 0.2338 | - | - | | 2.3561 | 8708 | 0.2528 | - | - | | 2.3563 | 8709 | 0.2646 | - | - | | 2.3566 | 8710 | 0.1785 | - | - | | 2.3569 | 8711 | 0.189 | - | - | | 2.3571 | 8712 | 0.2629 | - | - | | 2.3574 | 8713 | 0.1356 | - | - | | 2.3577 | 8714 | 0.1776 | - | - | | 2.3580 | 8715 | 0.2535 | - | - | | 2.3582 | 8716 | 0.2775 | - | - | | 2.3585 | 8717 | 0.2135 | - | - | | 2.3588 | 8718 | 0.1916 | - | - | | 2.3590 | 8719 | 0.1766 | - | - | | 2.3593 | 8720 | 0.2487 | - | - | | 2.3596 | 8721 | 0.1504 | - | - | | 2.3598 | 8722 | 0.265 | - | - | | 2.3601 | 8723 | 0.2963 | - | - | | 2.3604 | 8724 | 0.1862 | - | - | | 2.3607 | 8725 | 0.174 | - | - | | 2.3609 | 8726 | 0.143 | - | - | | 2.3612 | 8727 | 0.1883 | - | - | | 2.3615 | 8728 | 0.2033 | - | - | | 2.3617 | 8729 | 0.1239 | - | - | | 2.3620 | 8730 | 0.225 | - | - | | 
2.3623 | 8731 | 0.2446 | - | - | | 2.3626 | 8732 | 0.2426 | - | - | | 2.3628 | 8733 | 0.2018 | - | - | | 2.3631 | 8734 | 0.1536 | - | - | | 2.3634 | 8735 | 0.2307 | - | - | | 2.3636 | 8736 | 0.2998 | - | - | | 2.3639 | 8737 | 0.2127 | - | - | | 2.3642 | 8738 | 0.1865 | - | - | | 2.3644 | 8739 | 0.1595 | - | - | | 2.3647 | 8740 | 0.154 | - | - | | 2.3650 | 8741 | 0.1713 | - | - | | 2.3653 | 8742 | 0.2225 | - | - | | 2.3655 | 8743 | 0.1752 | - | - | | 2.3658 | 8744 | 0.1586 | - | - | | 2.3661 | 8745 | 0.2066 | - | - | | 2.3663 | 8746 | 0.1952 | - | - | | 2.3666 | 8747 | 0.1371 | - | - | | 2.3669 | 8748 | 0.155 | - | - | | 2.3672 | 8749 | 0.1435 | - | - | | 2.3674 | 8750 | 0.1709 | - | - | | 2.3677 | 8751 | 0.2272 | - | - | | 2.3680 | 8752 | 0.2366 | - | - | | 2.3682 | 8753 | 0.2118 | - | - | | 2.3685 | 8754 | 0.1821 | - | - | | 2.3688 | 8755 | 0.1303 | - | - | | 2.3690 | 8756 | 0.1717 | - | - | | 2.3693 | 8757 | 0.2345 | - | - | | 2.3696 | 8758 | 0.2524 | - | - | | 2.3699 | 8759 | 0.1825 | - | - | | 2.3701 | 8760 | 0.1603 | - | - | | 2.3704 | 8761 | 0.1325 | - | - | | 2.3707 | 8762 | 0.1942 | - | - | | 2.3709 | 8763 | 0.2632 | - | - | | 2.3712 | 8764 | 0.2648 | - | - | | 2.3715 | 8765 | 0.2912 | - | - | | 2.3718 | 8766 | 0.2259 | - | - | | 2.3720 | 8767 | 0.2043 | - | - | | 2.3723 | 8768 | 0.2045 | - | - | | 2.3726 | 8769 | 0.2328 | - | - | | 2.3728 | 8770 | 0.2156 | - | - | | 2.3731 | 8771 | 0.2409 | - | - | | 2.3734 | 8772 | 0.2406 | - | - | | 2.3736 | 8773 | 0.2056 | - | - | | 2.3739 | 8774 | 0.1716 | - | - | | 2.3742 | 8775 | 0.1973 | - | - | | 2.3745 | 8776 | 0.2103 | - | - | | 2.3747 | 8777 | 0.1669 | - | - | | 2.375 | 8778 | 0.1932 | - | - | | 2.3753 | 8779 | 0.1575 | - | - | | 2.3755 | 8780 | 0.1648 | - | - | | 2.3758 | 8781 | 0.1995 | - | - | | 2.3761 | 8782 | 0.1703 | - | - | | 2.3764 | 8783 | 0.1796 | - | - | | 2.3766 | 8784 | 0.1782 | - | - | | 2.3769 | 8785 | 0.1143 | - | - | | 2.3772 | 8786 | 0.1029 | - | - | | 2.3774 | 8787 | 0.2476 | - | - | | 2.3777 
| 8788 | 0.1832 | - | - | | 2.3780 | 8789 | 0.1994 | - | - | | 2.3782 | 8790 | 0.1598 | - | - | | 2.3785 | 8791 | 0.1735 | - | - | | 2.3788 | 8792 | 0.1959 | - | - | | 2.3791 | 8793 | 0.152 | - | - | | 2.3793 | 8794 | 0.1659 | - | - | | 2.3796 | 8795 | 0.1498 | - | - | | 2.3799 | 8796 | 0.1861 | - | - | | 2.3801 | 8797 | 0.1491 | - | - | | 2.3804 | 8798 | 0.1621 | - | - | | 2.3807 | 8799 | 0.1524 | - | - | | 2.3810 | 8800 | 0.1929 | - | - | | 2.3812 | 8801 | 0.1688 | - | - | | 2.3815 | 8802 | 0.1601 | - | - | | 2.3818 | 8803 | 0.3239 | - | - | | 2.3820 | 8804 | 0.2095 | - | - | | 2.3823 | 8805 | 0.1558 | - | - | | 2.3826 | 8806 | 0.2034 | - | - | | 2.3828 | 8807 | 0.1856 | - | - | | 2.3831 | 8808 | 0.1714 | - | - | | 2.3834 | 8809 | 0.1856 | - | - | | 2.3837 | 8810 | 0.1823 | - | - | | 2.3839 | 8811 | 0.2066 | - | - | | 2.3842 | 8812 | 0.2501 | - | - | | 2.3845 | 8813 | 0.1789 | - | - | | 2.3847 | 8814 | 0.168 | - | - | | 2.3850 | 8815 | 0.1863 | - | - | | 2.3853 | 8816 | 0.1977 | - | - | | 2.3856 | 8817 | 0.1979 | - | - | | 2.3858 | 8818 | 0.1797 | - | - | | 2.3861 | 8819 | 0.2738 | - | - | | 2.3864 | 8820 | 0.2249 | - | - | | 2.3866 | 8821 | 0.2268 | - | - | | 2.3869 | 8822 | 0.2501 | - | - | | 2.3872 | 8823 | 0.1718 | - | - | | 2.3874 | 8824 | 0.193 | - | - | | 2.3877 | 8825 | 0.192 | - | - | | 2.3880 | 8826 | 0.1742 | - | - | | 2.3883 | 8827 | 0.2095 | - | - | | 2.3885 | 8828 | 0.1538 | - | - | | 2.3888 | 8829 | 0.1597 | - | - | | 2.3891 | 8830 | 0.1999 | - | - | | 2.3893 | 8831 | 0.1424 | - | - | | 2.3896 | 8832 | 0.1897 | - | - | | 2.3899 | 8833 | 0.2001 | - | - | | 2.3902 | 8834 | 0.1388 | - | - | | 2.3904 | 8835 | 0.2168 | - | - | | 2.3907 | 8836 | 0.1667 | - | - | | 2.3910 | 8837 | 0.2635 | - | - | | 2.3912 | 8838 | 0.1996 | - | - | | 2.3915 | 8839 | 0.2516 | - | - | | 2.3918 | 8840 | 0.182 | - | - | | 2.3920 | 8841 | 0.2177 | - | - | | 2.3923 | 8842 | 0.2278 | - | - | | 2.3926 | 8843 | 0.2385 | - | - | | 2.3929 | 8844 | 0.1667 | - | - | | 2.3931 | 8845 | 
0.2559 | - | - | | 2.3934 | 8846 | 0.1381 | - | - | | 2.3937 | 8847 | 0.1411 | - | - | | 2.3939 | 8848 | 0.1463 | - | - | | 2.3942 | 8849 | 0.1427 | - | - | | 2.3945 | 8850 | 0.1992 | - | - | | 2.3948 | 8851 | 0.2122 | - | - | | 2.3950 | 8852 | 0.2182 | - | - | | 2.3953 | 8853 | 0.2156 | - | - | | 2.3956 | 8854 | 0.2232 | - | - | | 2.3958 | 8855 | 0.1415 | - | - | | 2.3961 | 8856 | 0.214 | - | - | | 2.3964 | 8857 | 0.2035 | - | - | | 2.3966 | 8858 | 0.1691 | - | - | | 2.3969 | 8859 | 0.1813 | - | - | | 2.3972 | 8860 | 0.1771 | - | - | | 2.3975 | 8861 | 0.1903 | - | - | | 2.3977 | 8862 | 0.213 | - | - | | 2.3980 | 8863 | 0.1817 | - | - | | 2.3983 | 8864 | 0.2236 | - | - | | 2.3985 | 8865 | 0.1587 | - | - | | 2.3988 | 8866 | 0.2211 | - | - | | 2.3991 | 8867 | 0.2063 | - | - | | 2.3994 | 8868 | 0.2285 | - | - | | 2.3996 | 8869 | 0.1781 | - | - | | 2.3999 | 8870 | 0.1698 | - | - | | 2.4002 | 8871 | 0.2687 | - | - | | 2.4004 | 8872 | 0.1744 | - | - | | 2.4007 | 8873 | 0.1577 | - | - | | 2.4010 | 8874 | 0.1747 | - | - | | 2.4012 | 8875 | 0.1596 | - | - | | 2.4015 | 8876 | 0.2094 | - | - | | 2.4018 | 8877 | 0.2269 | - | - | | 2.4021 | 8878 | 0.2017 | - | - | | 2.4023 | 8879 | 0.1689 | - | - | | 2.4026 | 8880 | 0.1649 | - | - | | 2.4029 | 8881 | 0.1656 | - | - | | 2.4031 | 8882 | 0.1161 | - | - | | 2.4034 | 8883 | 0.1901 | - | - | | 2.4037 | 8884 | 0.1921 | - | - | | 2.4040 | 8885 | 0.2393 | - | - | | 2.4042 | 8886 | 0.177 | - | - | | 2.4045 | 8887 | 0.2139 | - | - | | 2.4048 | 8888 | 0.1426 | - | - | | 2.4050 | 8889 | 0.1474 | - | - | | 2.4053 | 8890 | 0.1674 | - | - | | 2.4056 | 8891 | 0.1608 | - | - | | 2.4058 | 8892 | 0.1965 | - | - | | 2.4061 | 8893 | 0.1892 | - | - | | 2.4064 | 8894 | 0.2812 | - | - | | 2.4067 | 8895 | 0.22 | - | - | | 2.4069 | 8896 | 0.1829 | - | - | | 2.4072 | 8897 | 0.2434 | - | - | | 2.4075 | 8898 | 0.146 | - | - | | 2.4077 | 8899 | 0.2358 | - | - | | 2.4080 | 8900 | 0.1913 | - | - | | 2.4083 | 8901 | 0.2159 | - | - | | 2.4085 | 8902 | 0.1852 | - 
| - | | 2.4088 | 8903 | 0.2539 | - | - | | 2.4091 | 8904 | 0.2202 | - | - | | 2.4094 | 8905 | 0.1857 | - | - | | 2.4096 | 8906 | 0.155 | - | - | | 2.4099 | 8907 | 0.1459 | - | - | | 2.4102 | 8908 | 0.1269 | - | - | | 2.4104 | 8909 | 0.1712 | - | - | | 2.4107 | 8910 | 0.1919 | - | - | | 2.4110 | 8911 | 0.1332 | - | - | | 2.4113 | 8912 | 0.1331 | - | - | | 2.4115 | 8913 | 0.1937 | - | - | | 2.4118 | 8914 | 0.2101 | - | - | | 2.4121 | 8915 | 0.2714 | - | - | | 2.4123 | 8916 | 0.2043 | - | - | | 2.4126 | 8917 | 0.2033 | - | - | | 2.4129 | 8918 | 0.2822 | - | - | | 2.4131 | 8919 | 0.173 | - | - | | 2.4134 | 8920 | 0.1442 | - | - | | 2.4137 | 8921 | 0.1704 | - | - | | 2.4140 | 8922 | 0.1836 | - | - | | 2.4142 | 8923 | 0.2269 | - | - | | 2.4145 | 8924 | 0.2103 | - | - | | 2.4148 | 8925 | 0.1463 | - | - | | 2.4150 | 8926 | 0.1868 | - | - | | 2.4153 | 8927 | 0.1859 | - | - | | 2.4156 | 8928 | 0.1515 | - | - | | 2.4159 | 8929 | 0.1118 | - | - | | 2.4161 | 8930 | 0.2596 | - | - | | 2.4164 | 8931 | 0.2458 | - | - | | 2.4167 | 8932 | 0.1688 | - | - | | 2.4169 | 8933 | 0.1666 | - | - | | 2.4172 | 8934 | 0.1877 | - | - | | 2.4175 | 8935 | 0.2149 | - | - | | 2.4177 | 8936 | 0.1852 | - | - | | 2.4180 | 8937 | 0.2179 | - | - | | 2.4183 | 8938 | 0.1816 | - | - | | 2.4186 | 8939 | 0.1827 | - | - | | 2.4188 | 8940 | 0.2709 | - | - | | 2.4191 | 8941 | 0.2453 | - | - | | 2.4194 | 8942 | 0.1375 | - | - | | 2.4196 | 8943 | 0.1473 | - | - | | 2.4199 | 8944 | 0.2855 | - | - | | 2.4202 | 8945 | 0.2015 | - | - | | 2.4205 | 8946 | 0.1627 | - | - | | 2.4207 | 8947 | 0.1626 | - | - | | 2.4210 | 8948 | 0.187 | - | - | | 2.4213 | 8949 | 0.1975 | - | - | | 2.4215 | 8950 | 0.1696 | - | - | | 2.4218 | 8951 | 0.2215 | - | - | | 2.4221 | 8952 | 0.1824 | - | - | | 2.4223 | 8953 | 0.1643 | - | - | | 2.4226 | 8954 | 0.2096 | - | - | | 2.4229 | 8955 | 0.1787 | - | - | | 2.4232 | 8956 | 0.181 | - | - | | 2.4234 | 8957 | 0.1801 | - | - | | 2.4237 | 8958 | 0.2088 | - | - | | 2.4240 | 8959 | 0.1477 | - | - | | 
| 2.4242 | 8960 | 0.1331 | - | - |
| ... | ... | ... | ... | ... |
| 2.4351 | 9000 | 0.1777 | 0.2020 | 0.9507 |
| ... | ... | ... | ... | ... |
| 2.7056 | 10000 | 0.1524 | 0.1981 | 0.9518 |
| ... | ... | ... | ... | ... |
| 2.7330 | 10101 | 0.1787 | - | - |
| 2.7332 
| 10102 | 0.1104 | - | - | | 2.7335 | 10103 | 0.1409 | - | - | | 2.7338 | 10104 | 0.1486 | - | - | | 2.7340 | 10105 | 0.1948 | - | - | | 2.7343 | 10106 | 0.1527 | - | - | | 2.7346 | 10107 | 0.1456 | - | - | | 2.7348 | 10108 | 0.1214 | - | - | | 2.7351 | 10109 | 0.1628 | - | - | | 2.7354 | 10110 | 0.1529 | - | - | | 2.7357 | 10111 | 0.2555 | - | - | | 2.7359 | 10112 | 0.1923 | - | - | | 2.7362 | 10113 | 0.1625 | - | - | | 2.7365 | 10114 | 0.207 | - | - | | 2.7367 | 10115 | 0.2013 | - | - | | 2.7370 | 10116 | 0.1745 | - | - | | 2.7373 | 10117 | 0.2173 | - | - | | 2.7376 | 10118 | 0.1295 | - | - | | 2.7378 | 10119 | 0.1919 | - | - | | 2.7381 | 10120 | 0.1253 | - | - | | 2.7384 | 10121 | 0.2464 | - | - | | 2.7386 | 10122 | 0.1767 | - | - | | 2.7389 | 10123 | 0.1398 | - | - | | 2.7392 | 10124 | 0.1887 | - | - | | 2.7394 | 10125 | 0.1512 | - | - | | 2.7397 | 10126 | 0.1883 | - | - | | 2.7400 | 10127 | 0.1434 | - | - | | 2.7403 | 10128 | 0.1581 | - | - | | 2.7405 | 10129 | 0.2168 | - | - | | 2.7408 | 10130 | 0.1896 | - | - | | 2.7411 | 10131 | 0.1844 | - | - | | 2.7413 | 10132 | 0.1791 | - | - | | 2.7416 | 10133 | 0.1396 | - | - | | 2.7419 | 10134 | 0.1716 | - | - | | 2.7422 | 10135 | 0.1665 | - | - | | 2.7424 | 10136 | 0.1852 | - | - | | 2.7427 | 10137 | 0.1458 | - | - | | 2.7430 | 10138 | 0.1718 | - | - | | 2.7432 | 10139 | 0.1793 | - | - | | 2.7435 | 10140 | 0.1823 | - | - | | 2.7438 | 10141 | 0.1826 | - | - | | 2.7440 | 10142 | 0.1155 | - | - | | 2.7443 | 10143 | 0.1899 | - | - | | 2.7446 | 10144 | 0.2011 | - | - | | 2.7449 | 10145 | 0.1918 | - | - | | 2.7451 | 10146 | 0.1279 | - | - | | 2.7454 | 10147 | 0.1561 | - | - | | 2.7457 | 10148 | 0.2601 | - | - | | 2.7459 | 10149 | 0.2124 | - | - | | 2.7462 | 10150 | 0.1405 | - | - | | 2.7465 | 10151 | 0.1785 | - | - | | 2.7468 | 10152 | 0.1785 | - | - | | 2.7470 | 10153 | 0.1873 | - | - | | 2.7473 | 10154 | 0.1593 | - | - | | 2.7476 | 10155 | 0.2722 | - | - | | 2.7478 | 10156 | 0.1757 | - | - | | 2.7481 | 10157 | 0.164 | - 
| - | | 2.7484 | 10158 | 0.2059 | - | - | | 2.7486 | 10159 | 0.1748 | - | - | | 2.7489 | 10160 | 0.1214 | - | - | | 2.7492 | 10161 | 0.201 | - | - | | 2.7495 | 10162 | 0.2012 | - | - | | 2.7497 | 10163 | 0.1527 | - | - | | 2.75 | 10164 | 0.1601 | - | - | | 2.7503 | 10165 | 0.2386 | - | - | | 2.7505 | 10166 | 0.1786 | - | - | | 2.7508 | 10167 | 0.1726 | - | - | | 2.7511 | 10168 | 0.1905 | - | - | | 2.7514 | 10169 | 0.275 | - | - | | 2.7516 | 10170 | 0.19 | - | - | | 2.7519 | 10171 | 0.1855 | - | - | | 2.7522 | 10172 | 0.1667 | - | - | | 2.7524 | 10173 | 0.2234 | - | - | | 2.7527 | 10174 | 0.1715 | - | - | | 2.7530 | 10175 | 0.1746 | - | - | | 2.7532 | 10176 | 0.1965 | - | - | | 2.7535 | 10177 | 0.2133 | - | - | | 2.7538 | 10178 | 0.2661 | - | - | | 2.7541 | 10179 | 0.2327 | - | - | | 2.7543 | 10180 | 0.1758 | - | - | | 2.7546 | 10181 | 0.1261 | - | - | | 2.7549 | 10182 | 0.1531 | - | - | | 2.7551 | 10183 | 0.2221 | - | - | | 2.7554 | 10184 | 0.2154 | - | - | | 2.7557 | 10185 | 0.1394 | - | - | | 2.7560 | 10186 | 0.2025 | - | - | | 2.7562 | 10187 | 0.1563 | - | - | | 2.7565 | 10188 | 0.2033 | - | - | | 2.7568 | 10189 | 0.2218 | - | - | | 2.7570 | 10190 | 0.1813 | - | - | | 2.7573 | 10191 | 0.197 | - | - | | 2.7576 | 10192 | 0.1432 | - | - | | 2.7578 | 10193 | 0.1572 | - | - | | 2.7581 | 10194 | 0.1622 | - | - | | 2.7584 | 10195 | 0.2398 | - | - | | 2.7587 | 10196 | 0.1433 | - | - | | 2.7589 | 10197 | 0.1707 | - | - | | 2.7592 | 10198 | 0.2832 | - | - | | 2.7595 | 10199 | 0.1875 | - | - | | 2.7597 | 10200 | 0.1952 | - | - | | 2.7600 | 10201 | 0.1633 | - | - | | 2.7603 | 10202 | 0.2047 | - | - | | 2.7606 | 10203 | 0.1954 | - | - | | 2.7608 | 10204 | 0.2512 | - | - | | 2.7611 | 10205 | 0.1667 | - | - | | 2.7614 | 10206 | 0.1504 | - | - | | 2.7616 | 10207 | 0.204 | - | - | | 2.7619 | 10208 | 0.1649 | - | - | | 2.7622 | 10209 | 0.184 | - | - | | 2.7624 | 10210 | 0.2745 | - | - | | 2.7627 | 10211 | 0.2069 | - | - | | 2.7630 | 10212 | 0.236 | - | - | | 2.7633 | 10213 | 
0.2184 | - | - | | 2.7635 | 10214 | 0.1503 | - | - | | 2.7638 | 10215 | 0.1957 | - | - | | 2.7641 | 10216 | 0.2165 | - | - | | 2.7643 | 10217 | 0.1811 | - | - | | 2.7646 | 10218 | 0.142 | - | - | | 2.7649 | 10219 | 0.149 | - | - | | 2.7652 | 10220 | 0.156 | - | - | | 2.7654 | 10221 | 0.2544 | - | - | | 2.7657 | 10222 | 0.1872 | - | - | | 2.7660 | 10223 | 0.1746 | - | - | | 2.7662 | 10224 | 0.1585 | - | - | | 2.7665 | 10225 | 0.1532 | - | - | | 2.7668 | 10226 | 0.1777 | - | - | | 2.7670 | 10227 | 0.2013 | - | - | | 2.7673 | 10228 | 0.1979 | - | - | | 2.7676 | 10229 | 0.1919 | - | - | | 2.7679 | 10230 | 0.1584 | - | - | | 2.7681 | 10231 | 0.2125 | - | - | | 2.7684 | 10232 | 0.133 | - | - | | 2.7687 | 10233 | 0.1394 | - | - | | 2.7689 | 10234 | 0.1999 | - | - | | 2.7692 | 10235 | 0.1805 | - | - | | 2.7695 | 10236 | 0.1652 | - | - | | 2.7698 | 10237 | 0.1644 | - | - | | 2.7700 | 10238 | 0.1725 | - | - | | 2.7703 | 10239 | 0.2338 | - | - | | 2.7706 | 10240 | 0.2182 | - | - | | 2.7708 | 10241 | 0.1776 | - | - | | 2.7711 | 10242 | 0.1586 | - | - | | 2.7714 | 10243 | 0.2102 | - | - | | 2.7716 | 10244 | 0.1728 | - | - | | 2.7719 | 10245 | 0.1648 | - | - | | 2.7722 | 10246 | 0.2269 | - | - | | 2.7725 | 10247 | 0.165 | - | - | | 2.7727 | 10248 | 0.1825 | - | - | | 2.7730 | 10249 | 0.1429 | - | - | | 2.7733 | 10250 | 0.1487 | - | - | | 2.7735 | 10251 | 0.1772 | - | - | | 2.7738 | 10252 | 0.2405 | - | - | | 2.7741 | 10253 | 0.1876 | - | - | | 2.7744 | 10254 | 0.1989 | - | - | | 2.7746 | 10255 | 0.1603 | - | - | | 2.7749 | 10256 | 0.1697 | - | - | | 2.7752 | 10257 | 0.1589 | - | - | | 2.7754 | 10258 | 0.167 | - | - | | 2.7757 | 10259 | 0.1821 | - | - | | 2.7760 | 10260 | 0.2388 | - | - | | 2.7762 | 10261 | 0.1785 | - | - | | 2.7765 | 10262 | 0.1531 | - | - | | 2.7768 | 10263 | 0.1997 | - | - | | 2.7771 | 10264 | 0.2474 | - | - | | 2.7773 | 10265 | 0.1593 | - | - | | 2.7776 | 10266 | 0.2194 | - | - | | 2.7779 | 10267 | 0.1648 | - | - | | 2.7781 | 10268 | 0.2095 | - | - | | 2.7784 
| 10269 | 0.1308 | - | - | | 2.7787 | 10270 | 0.2246 | - | - | | 2.7790 | 10271 | 0.1944 | - | - | | 2.7792 | 10272 | 0.2037 | - | - | | 2.7795 | 10273 | 0.2075 | - | - | | 2.7798 | 10274 | 0.1401 | - | - | | 2.7800 | 10275 | 0.2082 | - | - | | 2.7803 | 10276 | 0.1729 | - | - | | 2.7806 | 10277 | 0.2313 | - | - | | 2.7808 | 10278 | 0.1214 | - | - | | 2.7811 | 10279 | 0.1973 | - | - | | 2.7814 | 10280 | 0.1985 | - | - | | 2.7817 | 10281 | 0.1817 | - | - | | 2.7819 | 10282 | 0.183 | - | - | | 2.7822 | 10283 | 0.1787 | - | - | | 2.7825 | 10284 | 0.1631 | - | - | | 2.7827 | 10285 | 0.1469 | - | - | | 2.7830 | 10286 | 0.1648 | - | - | | 2.7833 | 10287 | 0.1376 | - | - | | 2.7835 | 10288 | 0.1879 | - | - | | 2.7838 | 10289 | 0.1953 | - | - | | 2.7841 | 10290 | 0.2521 | - | - | | 2.7844 | 10291 | 0.1578 | - | - | | 2.7846 | 10292 | 0.1436 | - | - | | 2.7849 | 10293 | 0.1184 | - | - | | 2.7852 | 10294 | 0.2203 | - | - | | 2.7854 | 10295 | 0.1823 | - | - | | 2.7857 | 10296 | 0.2421 | - | - | | 2.7860 | 10297 | 0.2512 | - | - | | 2.7863 | 10298 | 0.1498 | - | - | | 2.7865 | 10299 | 0.233 | - | - | | 2.7868 | 10300 | 0.1959 | - | - | | 2.7871 | 10301 | 0.1317 | - | - | | 2.7873 | 10302 | 0.1598 | - | - | | 2.7876 | 10303 | 0.1443 | - | - | | 2.7879 | 10304 | 0.1981 | - | - | | 2.7881 | 10305 | 0.2045 | - | - | | 2.7884 | 10306 | 0.1517 | - | - | | 2.7887 | 10307 | 0.2029 | - | - | | 2.7890 | 10308 | 0.2191 | - | - | | 2.7892 | 10309 | 0.1785 | - | - | | 2.7895 | 10310 | 0.165 | - | - | | 2.7898 | 10311 | 0.1624 | - | - | | 2.7900 | 10312 | 0.246 | - | - | | 2.7903 | 10313 | 0.2368 | - | - | | 2.7906 | 10314 | 0.1382 | - | - | | 2.7909 | 10315 | 0.1498 | - | - | | 2.7911 | 10316 | 0.1529 | - | - | | 2.7914 | 10317 | 0.1661 | - | - | | 2.7917 | 10318 | 0.2483 | - | - | | 2.7919 | 10319 | 0.1743 | - | - | | 2.7922 | 10320 | 0.2503 | - | - | | 2.7925 | 10321 | 0.1715 | - | - | | 2.7927 | 10322 | 0.1929 | - | - | | 2.7930 | 10323 | 0.1785 | - | - | | 2.7933 | 10324 | 0.2121 | - | 
- | | 2.7936 | 10325 | 0.1627 | - | - | | 2.7938 | 10326 | 0.1689 | - | - | | 2.7941 | 10327 | 0.1427 | - | - | | 2.7944 | 10328 | 0.1782 | - | - | | 2.7946 | 10329 | 0.1702 | - | - | | 2.7949 | 10330 | 0.1546 | - | - | | 2.7952 | 10331 | 0.2864 | - | - | | 2.7955 | 10332 | 0.1654 | - | - | | 2.7957 | 10333 | 0.1446 | - | - | | 2.7960 | 10334 | 0.2061 | - | - | | 2.7963 | 10335 | 0.1536 | - | - | | 2.7965 | 10336 | 0.1601 | - | - | | 2.7968 | 10337 | 0.1732 | - | - | | 2.7971 | 10338 | 0.1434 | - | - | | 2.7973 | 10339 | 0.1533 | - | - | | 2.7976 | 10340 | 0.2509 | - | - | | 2.7979 | 10341 | 0.1703 | - | - | | 2.7982 | 10342 | 0.1943 | - | - | | 2.7984 | 10343 | 0.1845 | - | - | | 2.7987 | 10344 | 0.1967 | - | - | | 2.7990 | 10345 | 0.3166 | - | - | | 2.7992 | 10346 | 0.149 | - | - | | 2.7995 | 10347 | 0.1337 | - | - | | 2.7998 | 10348 | 0.1221 | - | - | | 2.8001 | 10349 | 0.2679 | - | - | | 2.8003 | 10350 | 0.1584 | - | - | | 2.8006 | 10351 | 0.1382 | - | - | | 2.8009 | 10352 | 0.1814 | - | - | | 2.8011 | 10353 | 0.1127 | - | - | | 2.8014 | 10354 | 0.1668 | - | - | | 2.8017 | 10355 | 0.2237 | - | - | | 2.8019 | 10356 | 0.2151 | - | - | | 2.8022 | 10357 | 0.1603 | - | - | | 2.8025 | 10358 | 0.18 | - | - | | 2.8028 | 10359 | 0.1536 | - | - | | 2.8030 | 10360 | 0.1701 | - | - | | 2.8033 | 10361 | 0.158 | - | - | | 2.8036 | 10362 | 0.2367 | - | - | | 2.8038 | 10363 | 0.1534 | - | - | | 2.8041 | 10364 | 0.1846 | - | - | | 2.8044 | 10365 | 0.1727 | - | - | | 2.8047 | 10366 | 0.1368 | - | - | | 2.8049 | 10367 | 0.1892 | - | - | | 2.8052 | 10368 | 0.1764 | - | - | | 2.8055 | 10369 | 0.1896 | - | - | | 2.8057 | 10370 | 0.1607 | - | - | | 2.8060 | 10371 | 0.1812 | - | - | | 2.8063 | 10372 | 0.1938 | - | - | | 2.8065 | 10373 | 0.194 | - | - | | 2.8068 | 10374 | 0.2195 | - | - | | 2.8071 | 10375 | 0.1546 | - | - | | 2.8074 | 10376 | 0.2571 | - | - | | 2.8076 | 10377 | 0.2044 | - | - | | 2.8079 | 10378 | 0.1927 | - | - | | 2.8082 | 10379 | 0.15 | - | - | | 2.8084 | 10380 | 
0.1707 | - | - | | 2.8087 | 10381 | 0.1477 | - | - | | 2.8090 | 10382 | 0.1685 | - | - | | 2.8093 | 10383 | 0.1357 | - | - | | 2.8095 | 10384 | 0.1248 | - | - | | 2.8098 | 10385 | 0.2214 | - | - | | 2.8101 | 10386 | 0.151 | - | - | | 2.8103 | 10387 | 0.1597 | - | - | | 2.8106 | 10388 | 0.2445 | - | - | | 2.8109 | 10389 | 0.2166 | - | - | | 2.8111 | 10390 | 0.2505 | - | - | | 2.8114 | 10391 | 0.2209 | - | - | | 2.8117 | 10392 | 0.1774 | - | - | | 2.8120 | 10393 | 0.1424 | - | - | | 2.8122 | 10394 | 0.1784 | - | - | | 2.8125 | 10395 | 0.184 | - | - | | 2.8128 | 10396 | 0.2017 | - | - | | 2.8130 | 10397 | 0.2191 | - | - | | 2.8133 | 10398 | 0.2958 | - | - | | 2.8136 | 10399 | 0.1895 | - | - | | 2.8139 | 10400 | 0.208 | - | - | | 2.8141 | 10401 | 0.158 | - | - | | 2.8144 | 10402 | 0.1601 | - | - | | 2.8147 | 10403 | 0.1649 | - | - | | 2.8149 | 10404 | 0.1487 | - | - | | 2.8152 | 10405 | 0.1636 | - | - | | 2.8155 | 10406 | 0.2 | - | - | | 2.8157 | 10407 | 0.2846 | - | - | | 2.8160 | 10408 | 0.2289 | - | - | | 2.8163 | 10409 | 0.1599 | - | - | | 2.8166 | 10410 | 0.1526 | - | - | | 2.8168 | 10411 | 0.2293 | - | - | | 2.8171 | 10412 | 0.2137 | - | - | | 2.8174 | 10413 | 0.1635 | - | - | | 2.8176 | 10414 | 0.1969 | - | - | | 2.8179 | 10415 | 0.1947 | - | - | | 2.8182 | 10416 | 0.1545 | - | - | | 2.8185 | 10417 | 0.1861 | - | - | | 2.8187 | 10418 | 0.198 | - | - | | 2.8190 | 10419 | 0.151 | - | - | | 2.8193 | 10420 | 0.1908 | - | - | | 2.8195 | 10421 | 0.2578 | - | - | | 2.8198 | 10422 | 0.2081 | - | - | | 2.8201 | 10423 | 0.1924 | - | - | | 2.8203 | 10424 | 0.1326 | - | - | | 2.8206 | 10425 | 0.1571 | - | - | | 2.8209 | 10426 | 0.2384 | - | - | | 2.8212 | 10427 | 0.158 | - | - | | 2.8214 | 10428 | 0.1258 | - | - | | 2.8217 | 10429 | 0.1665 | - | - | | 2.8220 | 10430 | 0.1846 | - | - | | 2.8222 | 10431 | 0.2672 | - | - | | 2.8225 | 10432 | 0.1487 | - | - | | 2.8228 | 10433 | 0.1672 | - | - | | 2.8231 | 10434 | 0.1547 | - | - | | 2.8233 | 10435 | 0.1415 | - | - | | 2.8236 | 
10436 | 0.1359 | - | - | | 2.8239 | 10437 | 0.2179 | - | - | | 2.8241 | 10438 | 0.241 | - | - | | 2.8244 | 10439 | 0.2492 | - | - | | 2.8247 | 10440 | 0.1828 | - | - | | 2.8249 | 10441 | 0.1641 | - | - | | 2.8252 | 10442 | 0.2207 | - | - | | 2.8255 | 10443 | 0.2289 | - | - | | 2.8258 | 10444 | 0.1639 | - | - | | 2.8260 | 10445 | 0.1781 | - | - | | 2.8263 | 10446 | 0.2043 | - | - | | 2.8266 | 10447 | 0.1709 | - | - | | 2.8268 | 10448 | 0.1275 | - | - | | 2.8271 | 10449 | 0.142 | - | - | | 2.8274 | 10450 | 0.2263 | - | - | | 2.8277 | 10451 | 0.1553 | - | - | | 2.8279 | 10452 | 0.1888 | - | - | | 2.8282 | 10453 | 0.2286 | - | - | | 2.8285 | 10454 | 0.1288 | - | - | | 2.8287 | 10455 | 0.1043 | - | - | | 2.8290 | 10456 | 0.2126 | - | - | | 2.8293 | 10457 | 0.2055 | - | - | | 2.8295 | 10458 | 0.1266 | - | - | | 2.8298 | 10459 | 0.2522 | - | - | | 2.8301 | 10460 | 0.2304 | - | - | | 2.8304 | 10461 | 0.1151 | - | - | | 2.8306 | 10462 | 0.192 | - | - | | 2.8309 | 10463 | 0.1893 | - | - | | 2.8312 | 10464 | 0.1386 | - | - | | 2.8314 | 10465 | 0.2076 | - | - | | 2.8317 | 10466 | 0.158 | - | - | | 2.8320 | 10467 | 0.1365 | - | - | | 2.8323 | 10468 | 0.1559 | - | - | | 2.8325 | 10469 | 0.15 | - | - | | 2.8328 | 10470 | 0.1947 | - | - | | 2.8331 | 10471 | 0.1263 | - | - | | 2.8333 | 10472 | 0.1781 | - | - | | 2.8336 | 10473 | 0.1703 | - | - | | 2.8339 | 10474 | 0.2126 | - | - | | 2.8341 | 10475 | 0.1948 | - | - | | 2.8344 | 10476 | 0.2402 | - | - | | 2.8347 | 10477 | 0.1706 | - | - | | 2.8350 | 10478 | 0.1305 | - | - | | 2.8352 | 10479 | 0.197 | - | - | | 2.8355 | 10480 | 0.1968 | - | - | | 2.8358 | 10481 | 0.2263 | - | - | | 2.8360 | 10482 | 0.1659 | - | - | | 2.8363 | 10483 | 0.18 | - | - | | 2.8366 | 10484 | 0.1568 | - | - | | 2.8369 | 10485 | 0.1968 | - | - | | 2.8371 | 10486 | 0.1606 | - | - | | 2.8374 | 10487 | 0.1213 | - | - | | 2.8377 | 10488 | 0.1648 | - | - | | 2.8379 | 10489 | 0.1881 | - | - | | 2.8382 | 10490 | 0.1748 | - | - | | 2.8385 | 10491 | 0.2688 | - | - | | 
2.8387 | 10492 | 0.1569 | - | - | | 2.8390 | 10493 | 0.1993 | - | - | | 2.8393 | 10494 | 0.2501 | - | - | | 2.8396 | 10495 | 0.1597 | - | - | | 2.8398 | 10496 | 0.146 | - | - | | 2.8401 | 10497 | 0.1113 | - | - | | 2.8404 | 10498 | 0.2061 | - | - | | 2.8406 | 10499 | 0.1252 | - | - | | 2.8409 | 10500 | 0.1788 | - | - | | 2.8412 | 10501 | 0.116 | - | - | | 2.8415 | 10502 | 0.1283 | - | - | | 2.8417 | 10503 | 0.1636 | - | - | | 2.8420 | 10504 | 0.1665 | - | - | | 2.8423 | 10505 | 0.231 | - | - | | 2.8425 | 10506 | 0.1996 | - | - | | 2.8428 | 10507 | 0.188 | - | - | | 2.8431 | 10508 | 0.2211 | - | - | | 2.8433 | 10509 | 0.1794 | - | - | | 2.8436 | 10510 | 0.1714 | - | - | | 2.8439 | 10511 | 0.2177 | - | - | | 2.8442 | 10512 | 0.1814 | - | - | | 2.8444 | 10513 | 0.2004 | - | - | | 2.8447 | 10514 | 0.2261 | - | - | | 2.8450 | 10515 | 0.1903 | - | - | | 2.8452 | 10516 | 0.1682 | - | - | | 2.8455 | 10517 | 0.1979 | - | - | | 2.8458 | 10518 | 0.1513 | - | - | | 2.8460 | 10519 | 0.1103 | - | - | | 2.8463 | 10520 | 0.2082 | - | - | | 2.8466 | 10521 | 0.1825 | - | - | | 2.8469 | 10522 | 0.2426 | - | - | | 2.8471 | 10523 | 0.1731 | - | - | | 2.8474 | 10524 | 0.1933 | - | - | | 2.8477 | 10525 | 0.245 | - | - | | 2.8479 | 10526 | 0.1581 | - | - | | 2.8482 | 10527 | 0.2058 | - | - | | 2.8485 | 10528 | 0.1805 | - | - | | 2.8488 | 10529 | 0.2101 | - | - | | 2.8490 | 10530 | 0.3166 | - | - | | 2.8493 | 10531 | 0.1909 | - | - | | 2.8496 | 10532 | 0.2222 | - | - | | 2.8498 | 10533 | 0.177 | - | - | | 2.8501 | 10534 | 0.2207 | - | - | | 2.8504 | 10535 | 0.2584 | - | - | | 2.8506 | 10536 | 0.2048 | - | - | | 2.8509 | 10537 | 0.1717 | - | - | | 2.8512 | 10538 | 0.1785 | - | - | | 2.8515 | 10539 | 0.1995 | - | - | | 2.8517 | 10540 | 0.1747 | - | - | | 2.8520 | 10541 | 0.138 | - | - | | 2.8523 | 10542 | 0.1865 | - | - | | 2.8525 | 10543 | 0.157 | - | - | | 2.8528 | 10544 | 0.1387 | - | - | | 2.8531 | 10545 | 0.2247 | - | - | | 2.8534 | 10546 | 0.1726 | - | - | | 2.8536 | 10547 | 0.2175 | - 
| - | | 2.8539 | 10548 | 0.1751 | - | - | | 2.8542 | 10549 | 0.1953 | - | - | | 2.8544 | 10550 | 0.2146 | - | - | | 2.8547 | 10551 | 0.2245 | - | - | | 2.8550 | 10552 | 0.1479 | - | - | | 2.8552 | 10553 | 0.1233 | - | - | | 2.8555 | 10554 | 0.1496 | - | - | | 2.8558 | 10555 | 0.1927 | - | - | | 2.8561 | 10556 | 0.2005 | - | - | | 2.8563 | 10557 | 0.2218 | - | - | | 2.8566 | 10558 | 0.1881 | - | - | | 2.8569 | 10559 | 0.1941 | - | - | | 2.8571 | 10560 | 0.1797 | - | - | | 2.8574 | 10561 | 0.1338 | - | - | | 2.8577 | 10562 | 0.1743 | - | - | | 2.8580 | 10563 | 0.1895 | - | - | | 2.8582 | 10564 | 0.2136 | - | - | | 2.8585 | 10565 | 0.3177 | - | - | | 2.8588 | 10566 | 0.1628 | - | - | | 2.8590 | 10567 | 0.1455 | - | - | | 2.8593 | 10568 | 0.1476 | - | - | | 2.8596 | 10569 | 0.2476 | - | - | | 2.8598 | 10570 | 0.1942 | - | - | | 2.8601 | 10571 | 0.1878 | - | - | | 2.8604 | 10572 | 0.118 | - | - | | 2.8607 | 10573 | 0.2184 | - | - | | 2.8609 | 10574 | 0.1432 | - | - | | 2.8612 | 10575 | 0.1856 | - | - | | 2.8615 | 10576 | 0.1588 | - | - | | 2.8617 | 10577 | 0.1983 | - | - | | 2.8620 | 10578 | 0.1234 | - | - | | 2.8623 | 10579 | 0.2296 | - | - | | 2.8626 | 10580 | 0.1579 | - | - | | 2.8628 | 10581 | 0.1419 | - | - | | 2.8631 | 10582 | 0.1821 | - | - | | 2.8634 | 10583 | 0.1903 | - | - | | 2.8636 | 10584 | 0.1767 | - | - | | 2.8639 | 10585 | 0.1951 | - | - | | 2.8642 | 10586 | 0.1361 | - | - | | 2.8644 | 10587 | 0.1633 | - | - | | 2.8647 | 10588 | 0.1297 | - | - | | 2.8650 | 10589 | 0.1232 | - | - | | 2.8653 | 10590 | 0.1993 | - | - | | 2.8655 | 10591 | 0.2096 | - | - | | 2.8658 | 10592 | 0.1747 | - | - | | 2.8661 | 10593 | 0.1515 | - | - | | 2.8663 | 10594 | 0.2906 | - | - | | 2.8666 | 10595 | 0.1678 | - | - | | 2.8669 | 10596 | 0.1363 | - | - | | 2.8672 | 10597 | 0.1483 | - | - | | 2.8674 | 10598 | 0.2055 | - | - | | 2.8677 | 10599 | 0.1206 | - | - | | 2.8680 | 10600 | 0.1471 | - | - | | 2.8682 | 10601 | 0.1455 | - | - | | 2.8685 | 10602 | 0.21 | - | - | | 2.8688 | 10603 
| 0.1909 | - | - | | 2.8690 | 10604 | 0.1953 | - | - | | 2.8693 | 10605 | 0.228 | - | - | | 2.8696 | 10606 | 0.1463 | - | - | | 2.8699 | 10607 | 0.1117 | - | - | | 2.8701 | 10608 | 0.2866 | - | - | | 2.8704 | 10609 | 0.1771 | - | - | | 2.8707 | 10610 | 0.2066 | - | - | | 2.8709 | 10611 | 0.2137 | - | - | | 2.8712 | 10612 | 0.1635 | - | - | | 2.8715 | 10613 | 0.2045 | - | - | | 2.8718 | 10614 | 0.1758 | - | - | | 2.8720 | 10615 | 0.2211 | - | - | | 2.8723 | 10616 | 0.2206 | - | - | | 2.8726 | 10617 | 0.2271 | - | - | | 2.8728 | 10618 | 0.0931 | - | - | | 2.8731 | 10619 | 0.2128 | - | - | | 2.8734 | 10620 | 0.1514 | - | - | | 2.8736 | 10621 | 0.2751 | - | - | | 2.8739 | 10622 | 0.2332 | - | - | | 2.8742 | 10623 | 0.12 | - | - | | 2.8745 | 10624 | 0.1489 | - | - | | 2.8747 | 10625 | 0.2399 | - | - | | 2.875 | 10626 | 0.1356 | - | - | | 2.8753 | 10627 | 0.1875 | - | - | | 2.8755 | 10628 | 0.1392 | - | - | | 2.8758 | 10629 | 0.2431 | - | - | | 2.8761 | 10630 | 0.1451 | - | - | | 2.8764 | 10631 | 0.2169 | - | - | | 2.8766 | 10632 | 0.1121 | - | - | | 2.8769 | 10633 | 0.2058 | - | - | | 2.8772 | 10634 | 0.1463 | - | - | | 2.8774 | 10635 | 0.2316 | - | - | | 2.8777 | 10636 | 0.1518 | - | - | | 2.8780 | 10637 | 0.2189 | - | - | | 2.8782 | 10638 | 0.2339 | - | - | | 2.8785 | 10639 | 0.1672 | - | - | | 2.8788 | 10640 | 0.1573 | - | - | | 2.8791 | 10641 | 0.2717 | - | - | | 2.8793 | 10642 | 0.1555 | - | - | | 2.8796 | 10643 | 0.1576 | - | - | | 2.8799 | 10644 | 0.1973 | - | - | | 2.8801 | 10645 | 0.215 | - | - | | 2.8804 | 10646 | 0.153 | - | - | | 2.8807 | 10647 | 0.215 | - | - | | 2.8810 | 10648 | 0.2597 | - | - | | 2.8812 | 10649 | 0.2697 | - | - | | 2.8815 | 10650 | 0.1622 | - | - | | 2.8818 | 10651 | 0.1893 | - | - | | 2.8820 | 10652 | 0.2438 | - | - | | 2.8823 | 10653 | 0.1799 | - | - | | 2.8826 | 10654 | 0.1759 | - | - | | 2.8828 | 10655 | 0.2084 | - | - | | 2.8831 | 10656 | 0.1364 | - | - | | 2.8834 | 10657 | 0.1631 | - | - | | 2.8837 | 10658 | 0.2146 | - | - | | 
2.8839 | 10659 | 0.1337 | - | - | | 2.8842 | 10660 | 0.1524 | - | - | | 2.8845 | 10661 | 0.1615 | - | - | | 2.8847 | 10662 | 0.1751 | - | - | | 2.8850 | 10663 | 0.2152 | - | - | | 2.8853 | 10664 | 0.1706 | - | - | | 2.8856 | 10665 | 0.1669 | - | - | | 2.8858 | 10666 | 0.1562 | - | - | | 2.8861 | 10667 | 0.1629 | - | - | | 2.8864 | 10668 | 0.2306 | - | - | | 2.8866 | 10669 | 0.1939 | - | - | | 2.8869 | 10670 | 0.2133 | - | - | | 2.8872 | 10671 | 0.1943 | - | - | | 2.8874 | 10672 | 0.2565 | - | - | | 2.8877 | 10673 | 0.2018 | - | - | | 2.8880 | 10674 | 0.182 | - | - | | 2.8883 | 10675 | 0.1823 | - | - | | 2.8885 | 10676 | 0.1892 | - | - | | 2.8888 | 10677 | 0.1558 | - | - | | 2.8891 | 10678 | 0.174 | - | - | | 2.8893 | 10679 | 0.1583 | - | - | | 2.8896 | 10680 | 0.1802 | - | - | | 2.8899 | 10681 | 0.2063 | - | - | | 2.8902 | 10682 | 0.222 | - | - | | 2.8904 | 10683 | 0.137 | - | - | | 2.8907 | 10684 | 0.2071 | - | - | | 2.8910 | 10685 | 0.1504 | - | - | | 2.8912 | 10686 | 0.2151 | - | - | | 2.8915 | 10687 | 0.1764 | - | - | | 2.8918 | 10688 | 0.2647 | - | - | | 2.8920 | 10689 | 0.1475 | - | - | | 2.8923 | 10690 | 0.1558 | - | - | | 2.8926 | 10691 | 0.1369 | - | - | | 2.8929 | 10692 | 0.2023 | - | - | | 2.8931 | 10693 | 0.1916 | - | - | | 2.8934 | 10694 | 0.1545 | - | - | | 2.8937 | 10695 | 0.1931 | - | - | | 2.8939 | 10696 | 0.1264 | - | - | | 2.8942 | 10697 | 0.229 | - | - | | 2.8945 | 10698 | 0.1923 | - | - | | 2.8948 | 10699 | 0.2086 | - | - | | 2.8950 | 10700 | 0.2655 | - | - | | 2.8953 | 10701 | 0.1954 | - | - | | 2.8956 | 10702 | 0.1568 | - | - | | 2.8958 | 10703 | 0.1439 | - | - | | 2.8961 | 10704 | 0.2027 | - | - | | 2.8964 | 10705 | 0.1823 | - | - | | 2.8966 | 10706 | 0.1524 | - | - | | 2.8969 | 10707 | 0.2384 | - | - | | 2.8972 | 10708 | 0.2256 | - | - | | 2.8975 | 10709 | 0.1773 | - | - | | 2.8977 | 10710 | 0.155 | - | - | | 2.8980 | 10711 | 0.2588 | - | - | | 2.8983 | 10712 | 0.177 | - | - | | 2.8985 | 10713 | 0.1602 | - | - | | 2.8988 | 10714 | 0.1683 | 
- | - | | 2.8991 | 10715 | 0.1747 | - | - | | 2.8994 | 10716 | 0.1844 | - | - | | 2.8996 | 10717 | 0.218 | - | - | | 2.8999 | 10718 | 0.146 | - | - | | 2.9002 | 10719 | 0.2903 | - | - | | 2.9004 | 10720 | 0.2253 | - | - | | 2.9007 | 10721 | 0.1771 | - | - | | 2.9010 | 10722 | 0.2008 | - | - | | 2.9012 | 10723 | 0.1747 | - | - | | 2.9015 | 10724 | 0.2296 | - | - | | 2.9018 | 10725 | 0.1823 | - | - | | 2.9021 | 10726 | 0.1863 | - | - | | 2.9023 | 10727 | 0.1503 | - | - | | 2.9026 | 10728 | 0.2734 | - | - | | 2.9029 | 10729 | 0.1883 | - | - | | 2.9031 | 10730 | 0.1899 | - | - | | 2.9034 | 10731 | 0.2035 | - | - | | 2.9037 | 10732 | 0.1646 | - | - | | 2.9040 | 10733 | 0.1627 | - | - | | 2.9042 | 10734 | 0.1989 | - | - | | 2.9045 | 10735 | 0.234 | - | - | | 2.9048 | 10736 | 0.1833 | - | - | | 2.9050 | 10737 | 0.2055 | - | - | | 2.9053 | 10738 | 0.1728 | - | - | | 2.9056 | 10739 | 0.1939 | - | - | | 2.9058 | 10740 | 0.2233 | - | - | | 2.9061 | 10741 | 0.2412 | - | - | | 2.9064 | 10742 | 0.1485 | - | - | | 2.9067 | 10743 | 0.1736 | - | - | | 2.9069 | 10744 | 0.1485 | - | - | | 2.9072 | 10745 | 0.1832 | - | - | | 2.9075 | 10746 | 0.1879 | - | - | | 2.9077 | 10747 | 0.1799 | - | - | | 2.9080 | 10748 | 0.1622 | - | - | | 2.9083 | 10749 | 0.2621 | - | - | | 2.9085 | 10750 | 0.201 | - | - | | 2.9088 | 10751 | 0.1541 | - | - | | 2.9091 | 10752 | 0.1638 | - | - | | 2.9094 | 10753 | 0.2259 | - | - | | 2.9096 | 10754 | 0.2438 | - | - | | 2.9099 | 10755 | 0.179 | - | - | | 2.9102 | 10756 | 0.137 | - | - | | 2.9104 | 10757 | 0.2443 | - | - | | 2.9107 | 10758 | 0.218 | - | - | | 2.9110 | 10759 | 0.1345 | - | - | | 2.9113 | 10760 | 0.1721 | - | - | | 2.9115 | 10761 | 0.2348 | - | - | | 2.9118 | 10762 | 0.1431 | - | - | | 2.9121 | 10763 | 0.1682 | - | - | | 2.9123 | 10764 | 0.2025 | - | - | | 2.9126 | 10765 | 0.2218 | - | - | | 2.9129 | 10766 | 0.1899 | - | - | | 2.9131 | 10767 | 0.1616 | - | - | | 2.9134 | 10768 | 0.3175 | - | - | | 2.9137 | 10769 | 0.231 | - | - | | 2.9140 | 10770 | 
0.2001 | - | - | | 2.9142 | 10771 | 0.1704 | - | - | | 2.9145 | 10772 | 0.1921 | - | - | | 2.9148 | 10773 | 0.1277 | - | - | | 2.9150 | 10774 | 0.2791 | - | - | | 2.9153 | 10775 | 0.185 | - | - | | 2.9156 | 10776 | 0.1429 | - | - | | 2.9159 | 10777 | 0.2471 | - | - | | 2.9161 | 10778 | 0.1186 | - | - | | 2.9164 | 10779 | 0.1827 | - | - | | 2.9167 | 10780 | 0.1694 | - | - | | 2.9169 | 10781 | 0.1204 | - | - | | 2.9172 | 10782 | 0.1684 | - | - | | 2.9175 | 10783 | 0.15 | - | - | | 2.9177 | 10784 | 0.1319 | - | - | | 2.9180 | 10785 | 0.1743 | - | - | | 2.9183 | 10786 | 0.2029 | - | - | | 2.9186 | 10787 | 0.1502 | - | - | | 2.9188 | 10788 | 0.1577 | - | - | | 2.9191 | 10789 | 0.2391 | - | - | | 2.9194 | 10790 | 0.1845 | - | - | | 2.9196 | 10791 | 0.1412 | - | - | | 2.9199 | 10792 | 0.2339 | - | - | | 2.9202 | 10793 | 0.1873 | - | - | | 2.9205 | 10794 | 0.2112 | - | - | | 2.9207 | 10795 | 0.1623 | - | - | | 2.9210 | 10796 | 0.1716 | - | - | | 2.9213 | 10797 | 0.2284 | - | - | | 2.9215 | 10798 | 0.1397 | - | - | | 2.9218 | 10799 | 0.1881 | - | - | | 2.9221 | 10800 | 0.2381 | - | - | | 2.9223 | 10801 | 0.2333 | - | - | | 2.9226 | 10802 | 0.1799 | - | - | | 2.9229 | 10803 | 0.2059 | - | - | | 2.9232 | 10804 | 0.1789 | - | - | | 2.9234 | 10805 | 0.1897 | - | - | | 2.9237 | 10806 | 0.2058 | - | - | | 2.9240 | 10807 | 0.1219 | - | - | | 2.9242 | 10808 | 0.24 | - | - | | 2.9245 | 10809 | 0.1689 | - | - | | 2.9248 | 10810 | 0.2381 | - | - | | 2.9251 | 10811 | 0.1386 | - | - | | 2.9253 | 10812 | 0.2256 | - | - | | 2.9256 | 10813 | 0.1367 | - | - | | 2.9259 | 10814 | 0.3294 | - | - | | 2.9261 | 10815 | 0.1616 | - | - | | 2.9264 | 10816 | 0.1534 | - | - | | 2.9267 | 10817 | 0.2463 | - | - | | 2.9269 | 10818 | 0.1633 | - | - | | 2.9272 | 10819 | 0.1982 | - | - | | 2.9275 | 10820 | 0.1502 | - | - | | 2.9278 | 10821 | 0.3456 | - | - | | 2.9280 | 10822 | 0.1909 | - | - | | 2.9283 | 10823 | 0.1299 | - | - | | 2.9286 | 10824 | 0.1819 | - | - | | 2.9288 | 10825 | 0.2358 | - | - | | 
2.9291 | 10826 | 0.2046 | - | - | | 2.9294 | 10827 | 0.1358 | - | - | | 2.9297 | 10828 | 0.2031 | - | - | | 2.9299 | 10829 | 0.1846 | - | - | | 2.9302 | 10830 | 0.1837 | - | - | | 2.9305 | 10831 | 0.1403 | - | - | | 2.9307 | 10832 | 0.1227 | - | - | | 2.9310 | 10833 | 0.1018 | - | - | | 2.9313 | 10834 | 0.2202 | - | - | | 2.9315 | 10835 | 0.185 | - | - | | 2.9318 | 10836 | 0.1498 | - | - | | 2.9321 | 10837 | 0.1721 | - | - | | 2.9324 | 10838 | 0.1742 | - | - | | 2.9326 | 10839 | 0.218 | - | - | | 2.9329 | 10840 | 0.1163 | - | - | | 2.9332 | 10841 | 0.2189 | - | - | | 2.9334 | 10842 | 0.1898 | - | - | | 2.9337 | 10843 | 0.2953 | - | - | | 2.9340 | 10844 | 0.1586 | - | - | | 2.9343 | 10845 | 0.2057 | - | - | | 2.9345 | 10846 | 0.1512 | - | - | | 2.9348 | 10847 | 0.2322 | - | - | | 2.9351 | 10848 | 0.1641 | - | - | | 2.9353 | 10849 | 0.1631 | - | - | | 2.9356 | 10850 | 0.2223 | - | - | | 2.9359 | 10851 | 0.1154 | - | - | | 2.9361 | 10852 | 0.2228 | - | - | | 2.9364 | 10853 | 0.2075 | - | - | | 2.9367 | 10854 | 0.1662 | - | - | | 2.9370 | 10855 | 0.2077 | - | - | | 2.9372 | 10856 | 0.1588 | - | - | | 2.9375 | 10857 | 0.1287 | - | - | | 2.9378 | 10858 | 0.1771 | - | - | | 2.9380 | 10859 | 0.2064 | - | - | | 2.9383 | 10860 | 0.1718 | - | - | | 2.9386 | 10861 | 0.195 | - | - | | 2.9389 | 10862 | 0.1676 | - | - | | 2.9391 | 10863 | 0.163 | - | - | | 2.9394 | 10864 | 0.2006 | - | - | | 2.9397 | 10865 | 0.1884 | - | - | | 2.9399 | 10866 | 0.158 | - | - | | 2.9402 | 10867 | 0.1384 | - | - | | 2.9405 | 10868 | 0.2343 | - | - | | 2.9407 | 10869 | 0.157 | - | - | | 2.9410 | 10870 | 0.1913 | - | - | | 2.9413 | 10871 | 0.2577 | - | - | | 2.9416 | 10872 | 0.2317 | - | - | | 2.9418 | 10873 | 0.1694 | - | - | | 2.9421 | 10874 | 0.2256 | - | - | | 2.9424 | 10875 | 0.1665 | - | - | | 2.9426 | 10876 | 0.184 | - | - | | 2.9429 | 10877 | 0.144 | - | - | | 2.9432 | 10878 | 0.2195 | - | - | | 2.9435 | 10879 | 0.2079 | - | - | | 2.9437 | 10880 | 0.1575 | - | - | | 2.9440 | 10881 | 0.1773 | - 
| 2.9762 | 11000 | 0.1567 | 0.1942 | 0.9536 |
| - | - | | 3.2457 | 11996 | 0.1873 | - | - | | 3.2459 | 11997 | 0.1395 | - | - | | 3.2462 | 11998 | 0.1319 | - | - | | 3.2465 | 11999 | 0.1288 | - | - | | 3.2468 | 12000 | 0.1557 | 0.1915 | 0.9539 | </details> ### Framework Versions - Python: 3.12.4 - Sentence Transformers: 3.0.1 - Transformers: 4.44.0 - PyTorch: 2.4.0+cu121 - Accelerate: 0.33.0 - Datasets: 2.21.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
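The last two columns of the training log above are the eval loss and the eval triplet accuracy — the 0.9539 logged at step 12000 matches the eval cosine accuracy reported in this card's model index. That accuracy is simply the fraction of (anchor, positive, negative) triples where the anchor's embedding is more similar to the positive than to the negative. A minimal pure-Python sketch of the cosine variant, with toy vectors standing in for real sentence embeddings:

```python
import math

def cosine(u, v):
    # Plain cosine similarity between two vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def cosine_triplet_accuracy(triplets):
    """Fraction of (anchor, positive, negative) embedding triples
    where the positive is more cosine-similar to the anchor."""
    hits = sum(1 for a, p, n in triplets if cosine(a, p) > cosine(a, n))
    return hits / len(triplets)

# Toy embeddings (illustrative values, not real model outputs)
triplets = [
    ([1.0, 0.0], [0.9, 0.1], [0.0, 1.0]),    # positive closer -> hit
    ([0.0, 1.0], [0.1, 0.9], [1.0, 0.0]),    # positive closer -> hit
    ([1.0, 1.0], [-1.0, -1.0], [1.0, 0.9]),  # negative closer -> miss
]
print(cosine_triplet_accuracy(triplets))  # 2 hits out of 3, i.e. about 0.667
```

The dot, Manhattan, and Euclidean accuracies reported alongside it follow the same scheme, with the similarity (or distance) function swapped accordingly.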
{"base_model": "BM-K/KoSimCSE-bert-multitask", "datasets": [], "language": [], "library_name": "sentence-transformers", "metrics": ["cosine_accuracy", "dot_accuracy", "manhattan_accuracy", "euclidean_accuracy", "max_accuracy"], "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:473130", "loss:MultipleNegativesRankingLoss"], "widget": [{"source_sentence": "릴레이로 이번주 TV와 라디오 방송 출연을 확정한게 누구야?", "sentences": ["▲ 사진=롯데엔터테인먼트 제공 영화 완벽한 타인의 주역 유해진, 조진웅, 이서진, 염정아가 릴레이로 이번주 TV와 라디오 방송 출연을 확정했다. 완벽한 타인은 완벽해 보이는 커플 모임에서 한정된 시간 동안 핸드폰으로 오는 전화, 문자, 카톡을 강제로 공개해야 하는 게임 때문에 벌어지는 예측불허 이야기를 담은 작품이다. 완벽한 타인에서 완벽한 연기를 펼친 배우들은 이번 주 릴레이로 TV와 라디오 방송 출연을 확정하며 열일 행보를 펼친다. 먼저 오는 24일 오후 7시 MBC FM영화음악 한예리입니다에는 유해진과 염정아가 함께 출연한다. 간첩, 전우치에 이어 세 번째로 함께 호흡을 맞춘 두 사람은 이번 라디오 출연에서 영화에 대한 이야기를 나누며 걸출한 입담과 절친 케미스트리를 선보일 것으로 보인다. 이어 이번 영화로 처음 만나 절친이 된 유해진, 조진웅, 이서진이 25일 오후 11시 10분 KBS2 해피투게더4에 출연한다. 세끼 인연 유해진과 이서진, 그리고 조진웅의 예능감이 유감없이 발휘될 예정이다. 마지막으로 26일에는 MBC 배철수의 음악캠프에서 이서진을 만날 수 있다. 완벽한 타인에서 가장 파격적인 연기 변신을 선보인 그는 음악캠프 특별 DJ로 활약했던 인연으로 이번 출연이 성사됐다. 이서진은 거침없는 언변으로 영화 완벽한 타인의 현장 비하인드 스토리를 밝힐 예정이다. 한편 완벽한 타인은 오는 31일 개봉을 앞두고 있다.", "부산 부산진구(구청장 서은숙)는 오는 7월부터 단독주택가 재활용정거장 운영 사업을 부암1동과 개금2동으로 확대 추진한다고 밝혔다. 재활용정거장 사업은 일반 주택가 주민들이 편리하게 재활용품을 배출할 수 있도록 일정시간에 지정된 장소에 ‘재활용정거장’이라는 배출 거점을 만들어 주민들이 이용할 수 있도록 하고, 운영 시간 외에는 철수하는 이동식 분리수거장이다. 각 정거장마다 도시광부라 불리는 자원관리사가 정거장을 관리하고 주민들의 재활용품 분리배출을 지원한다. 부산진구는 2019년 3월부터 전포1동 지역에 재활용정거장 25개소를 설치하여 174회 운영, 재활용품 38,700마대를 수거했다. 이 사업이 주민들의 올바른 재활용품 분리배출 문화를 확산시키고 마을의 쓰레기 문제를 주민들이 직접 해결하는 지역공동체의 모범안을 제시한 사례로 평가받음에 따라 오는 7월부터 부암1동과 개금2동 지역에 재활용정거장 운영을 확대 추진하기로 했다. 재활용정거장은 부암1동이 5개소, 개금2동은 10개소이다. 부암1동은 매주 월요일과 수요일, 개금2동은 화요일과 목요일 오후 4시부터 8시까지 4시간동안 운영된다. 주민들은 종이류, 플라스틱류, 유리병류, 캔․고철류, 비닐류 등 재활용 전품목을 가까운 재활용정거장으로 배출하면 된다. 오후 8시 이후 운영을 마치면 수거업체가 재활용정거장에 배출된 재활용품을 모두 수거해 간다. 이후 정거장은 철거되고 다음 운영요일에 다시 설치된다. 또한 주민들의 재활용품 배출 혼동을 줄이기 위해 기존 문전수거 방식도 병행해 운영한다. 
구 관계자는 “재활용정거장 사업은 단독주택지역의 재활용품 혼합배출 실태를 개선하여 실질적인 자원 재활용율 높이고 쾌적한 골목길 조성에 크게 기여할 것으로 기대한다”며 주민들이 다함께 적극 참여해 줄 것을 당부했다.", "그룹 세븐틴이 미국 간판 토크쇼 ‘엘렌 쇼’에 첫 출연을 확정 지었다. 세븐틴은 다음 달 1일(현지 시각) 방송되는 미국 토크쇼 ‘엘렌 드제너러스 쇼’(이하 엘렌 쇼)에 첫 출연을 확정 지어 전 세계 팬들의 폭발적인 반응을 얻었다. 이날 방송에서 세븐틴은 지난 2019년 8월 발매한 디지털 싱글 ‘HIT’ 무대를 선보인다. ‘HIT’는 제목처럼 타격감이 느껴지는 사운드와 세븐틴의 폭발적인 에너지가 그대로 전해지는 강렬한 EDM 장르의 댄스곡으로 발매와 동시에 국내는 물론 해외에서도 큰 사랑을 받았다. ‘엘렌 쇼’는 미국 유명 코미디언이자 작가, 배우 등 멀티 엔터테이너인 엘렌 드제너러스가 진행하는 토크쇼로 브루노 마스, 두아 리파, 존 레전드, 저스틴 비버 등 세계적인 팝스타들이 대거 출연해 화제를 모았으며 미국의 데이타임 쇼 중 높은 인기를 보유하고 있는 프로그램이다. 앞서 세븐틴은 지난 1월 방송된 미국 CBS ‘제임스 코든 쇼’와 NBC ‘켈리 클락슨 쇼’에 연달아 출연해 스페셜 앨범 타이틀곡 ‘HOME;RUN’과 미니 7집 타이틀곡 ‘Left & Right’의 무대를 선사, 막강한 글로벌 영향력을 확인 시켜 주며 전 세계 팬들과 해외 유수 매체의 호평 세례를 받았다. 이렇듯 세븐틴은 스토리텔링이 담긴 완성도 높은 무대와 세븐틴만이 할 수 있는 퍼포먼스를 선보여 ‘K팝 퍼포먼스 강자’라는 칭호를 얻는 등 전 세계를 열광시킨 바 있어 이번 ‘엘렌쇼’에서 어떤 무대를 선보일지 기대감이 치솟고 있다. 한편 세븐틴이 출연하는 미국 토크쇼 ‘엘렌 쇼’는 다음 달 1일(현지 시각)에 만나볼 수 있다."]}, {"source_sentence": "롯데리아에 직접 가서 불고기버거세트를 먹으려면 얼마를 내야 하지", "sentences": ["햄버거 프랜차이즈 업체들이 배달 애플리케이션으로 일정 금액 이상 주문할 때 배달비가 무료라고 내세우지만 이미 제품 가격에 포함된 것으로 드러났다. 햄버거를 배달 앱으로 주문하면 같은 제품이라도 매장보다 더 가격이 비싸다. 사실상 소비자 기만 행위다. 19일 한국소비자원의 조사 결과 롯데리아, 맥도날드, 버거킹, KFC 등 주요 4개 햄버거 프랜차이즈의 모든 제품이 매장 가격에 비해 배달 가격이 비쌌다. 예를 들어 롯데리아 불고기버거세트의 배달가는 7000원으로 매장가 5900원보다 1100원을, 버거킹 리얼와퍼세트는 1200원을 더 내야 한다. 메뉴를 많이 주문할수록 가격 차이가 커져 소비자가 피해를 보는 구조라는 것도 분통터지게 한다. 기업이 이윤을 추구하는 것은 당연하지만 소비자를 속여서까지 이익을 내선 안 된다. 더 큰 문제는 프랜차이즈들이 이런 사실을 제대로 고지하지 않았다는 점이다. 버거킹, KFC는 자사 홈페이지에서만 배달과 매장 가격이 다를 수 있음을 알리고 있다. 4개 업체 모두 배달의민족, 요기요, 쿠팡이츠 등 주요 배달 플랫폼에 이 같은 정보를 공지하지 않았다. 대개 배달 앱을 통해 주문하는 만큼 이제라도 주문 및 결제 과정에서 주요 거래조건을 명확하게 알려야 할 것이다. 소비자단체에 따르면 햄버거뿐 아니라 상당수 일반 음식점이 배달 앱으로 주문할 때 식당가보다 음식값을 더 비싸게 받고 있다. 매장에선 할인되는 품목이 배달 주문 때는 할인 적용이 안 되는 경우도 있다. 식당가와 배달가의 차이가 난다는 지적에 아예 매장 가격을 올려버리기도 한다니 소비자를 봉으로 아는 태도다. 이는 전체 외식 물가 인상으로 이어지는 것이라 우려스럽다. 코로나19로 비대면이 늘면서 배달 앱을 이용한 음식 주문은 소비자의 일상이 됐다. 이런 상황에서 무료배달이라는 꼼수를 쓴 햄버거 프랜차이즈는 비난받아 마땅하다.", "물가 안정과 우리 농산물 소비 촉진을 위한 '대한민국 농할(농산물 할인) 갑시다'가 돌아왔다. 
이마트, 롯데마트와 롯데슈퍼는 농림축산식품부와 손잡고 오는 27일까지 다양한 농산물 할인 혜택을 통해 물가 안정 및 농가 돕기에 나선다. 이번 '농할갑시다'의 키 포인트는 '물가 안정'과 '소비 촉진'이다. 유통업계와 농림축산식품부는 이번 '농할갑시다'를 통해 물가를 안정시키고 판로에 어려움을 겪고 있는 국내 농가들을 도울 예정이다. 행사 기간 동안 이마트에서는 '농할갑시다' 행사 상품을 구매할 경우, 신세계포인트 적립 고객에 한해 20% 할인을 적용한다. 할인은 1인당 최대 1만원(구매금액 5만원)까지 받을 수 있다. 먼저 AI(조류독감)로 갑작스레 오른 달걀 가격 안정화를 위해 이마트와 농림축산식품부는 약 50종에 달하는 달걀 할인 행사를 선보인다. 이마트에서 달걀을 신세계포인트 적립하여 구매할 시, 판매 가격의 20%를 할인받을 수 있다. 또 올겨울 잦은 한파와 폭설로 인해 가격이 크게 오른 '무'도 신세계포인트 적립 시 20% 할인해 판매한다. 무는 올 1월 제주도에 내린 폭설로 생산량이 줄어들어 가격이 상승했다. 농산물유통정보에 따르면 무 20㎏ 평균 가격은 작년 12월 중순 1만536원이었으나 올해 1월 14일 1만5980원으로 한 달 만에 약 51.6% 오른 셈이다. 코로나19로 소비량이 대폭 감소해 지지부진한 판매량을 보이고 있는 배추 역시 신세계 포인트 적립 시 20% 할인 판매해 소비 촉진에 앞장선다. 이마트는 이번 농산물 할인 행사를 시작으로 농림축산식품부와 함께 순차적으로 친환경 농산물 등 다양한 할인 행사를 진행할 예정이다. 롯데마트와 롯데슈퍼 역시 오는 27일까지 전 점에서 '대한민국 농할 갑시다' 행사를 진행한다. 롯데마트 역시 최근 가격이 상승한 달걀과 무, 배추를 할인 품목으로 정해 실질적인 가계의 물가 안정에 기여할 것으로 기대하고 있다. '농할 갑시다' 행사는 엘포인트(L.Point) 회원이 롯데, 신한, 현대 등 7대 카드로 결제 시 적용된다. 행사 기간 동안 달걀을 20% 할인하며, 배추와 무도 20% 할인된 가격인 각 1260원에 판매한다. 달걀의 경우 1인당 3판 한정으로 판매할 계획이며, 배추와 무를 포함해 1인당 최대 할인 한도는 1만원이다. 롯데마트 정재우 상품본부장은 \"최근 급격히 오른 물가 안정의 취지에 맞춰 농림축산식품부와 함께 이번 행사를 준비했다\"며 \"합리적인 가격에 우리 농산물을 구입할 수 있는 기회가 되길 바란다\"고 말했다.", "상장회사법제 구축 공동 세미나\n제1주제 상장회사법 제정에 관한 구상\nⅤ 결론\n경험적으로 볼 때 새로운 법을 제정한다는 게 매우 어려운 일이다. 특히 학계와 실무계에서 회사법을 상법에서 분리하여야 한다는 주장이 오랫동안 있어 왔지만 큰 반향을 불러일으키지 못한 것이 현실이다. 이러한 상황에서 대안으로 구상한 것이 바로 상장회사법의제정이다. 상장회사법의 경우 이미 2007년도에 「상장법인에 관한 법률」 제정안이 입법예고된 바 있다는 점에서 추후에도 그 제정을 위한 공감대를 이끌어 내기가 용이할 것으로 보인다. 본고는 2019년의 시점에서 다시 한번 상장회사법의 제정 필요성으로 다음과 같은 3가지이유를 제시하였다. 첫째, 급격하게 변화하는 자본시장에 유연하게 대응하여 다양하면서도 다수의 이해관계자를 보호하기 위해서는 상장회사법을 제정할 필요가 있다. 둘째, 현재 상법과 자본시장법에 나누어 규정되고 있는 상장회사에 대한 특례를 하나로 합하여 단행법화한다면 국내외의 수범자 입장에서의 편의성이 제고될 서이다. 셋째, 상장회사에 관련된 여러 법제가 산재하고 있어 체계적인 정합성이 미흡하므로, 이를 극복하기 위해서는 상장회사법의 제정이 필요하다. 단기적인 과제로서 외부감사법상 상장회사에 적용되는 규정을 상장회사법으로 이관한다면 도움이 될 것으로 보인다. 이에 비교법적인 차원에서 미국과 독일 및 일본과 태국법의 동향을 파악하였다. 미국은 상장회사에 관한 규정을 회사법에 일부 두면서도 동시에 전국규모의 증권거래소가 마련한 상장회사 규정에 의하여 상장회사의 지배구조가 보충적으로 규율되고 있다. 
독일은 주식법에서 상장회사와 비상장회사 모두를 다루고 있다. 일본의 경우에는 단행의 회사법을 두고 있지만, 상장회사에 대한 규제의 선진화를 위하여 2007년 상장회사에 적용될 공개회사법요강안을 발표한 바 있다. 태국은 세계적으로 유래를 찾기 어려운 입법으로서 공개주식회사법을 별도로 제정하여 시행하고 있는 국가이다. 본고는 기존에 정부가 마련한 「상장법인에 관한 법률」 제정안을 기본으로 하면서 위에 소개된 여러 국가의 입법례를 참고하여 상장회사법에 들어갈 규정들을 몇 가지 제시하였다. 본고에서 상장회사법에 편입되어야 할 추가적인 규정을 요약하여 정리하면 다음과 같다. 먼저 지배구조와 관련하여 주주명부의 폐쇄 및 기준일을 단축하는 규정, 이사에게 내부통제체제 구축의무를 부과하는 규정 및 사외이사의 회의를 강제하는 규정을 두어야 한다. 다음으로 현행 외부감사법에 있는 상장회사의 회계처리기준과 회계감사에 관한 규정을 상장회사법으로 이관한다. 마지막으로 합병 등과 같은 조직재편과 관련하여 몇 가지를 개선한다. 즉, 상장회사에서 합병 등에 반대하는 주주에 대해서는 주식매수청구권을 인정하지 않으며, 삼각합병에서 제3자에게 제공되는 합병의 대가가 모회사가 발행한 의결권있는 주식의 10%를 초과하는 경우에는 모회사 주주총회의 승인을 요구한다. 그리고 합병 등에 필요한 채권자보호절차와 관련하여 채권자에 대한 개별최고제도를 폐지함은 물론이고 채권자의 손해가 없다면 그에 대해서는 채권자보호절차를 마련하지 않는다."]}, {"source_sentence": "어떤 방법으로 합동수사본부는 범죄유형별 대처시스템을 강화하려 할까", "sentences": ["2. 형사사법업무처리의 전자화\n(1) 형사사법정보시스템\n정부는 2010년 「형사사법절차 전자화 촉진법」을 근거로 형사사법기관 간 전산시스템을 연계하고 전산정보를 공동 활용하는 동시에 대국민 형사사법정보서비스(사건 조회, 재판서 및 통지서 조회, 벌과금 조회, 민원신청, 범죄피해자 지원 등)를 제공하는 형사사법공통시스템(KICS)을 구축하였다. 경찰, 검찰, 법무부, 법원은 각각의 KICS 전용 서버를 설치하여 운영하고 있고 공통시스템 서버 운영과 각 기관 간 KICS 연계 업무는 형사사법공통시스템 운영단이 수행하고 있다. 아래의 차세대 형사사법정보시스템의 구성도(그림1)를 보면 노란색으로 표시된 재구축 부분은 기존의 2010년부터 형사사법정보시스템에서 제공되던 것을 기술적 업그레이드만 하는 것이다. 차세대 형사사법정보시스템(그림1)에서 파란색으로 표시된 신규 부분은 2024년을 목표로 하고 있는 빅데이터 분석 플랫폼 구축과 전자문서화 시스템 영역이다.", "나. 
연기금의 분산투자 확대\n1) 필요성 및 현황\n□ 국민연금 등 연기금의 자산이 폭발적으로 늘어나고 있는 상황에서 특정자산에 집중해 투자할 경우 여러 가지 문제 초래\n― 국민연금 등 연기금들이 현재의 채권 위주 운용전략을 유지할 경우 시장에서의 지배력이 심각한 수준에 달할 전망\n⦁ 이창용(2004)은 2015년에도 국민연금 전체 자산 중 국내채권의 비중이 70%대 중반을 유지할 경우, 국내 채권시장에서 차지하는 국민연금의 비중이 20% 이상에 달할 것으로 전망\n⦁ 만약 해외채권과 해외주식의 비중을 크게 늘리면 국민연금의 국내자산시장에서 차지하는 비중이 크게 줄어들 것으로 전망\n― 국민연금 등 특정 기관투자자의 고등급 채권 위주의 운용 전략이 지속되어 채권시장에서 차지하는 비중이 커지면 다음과 같은 부작용 예상\n⦁ 국채수익률이 낮아지는 등 금리 왜곡 가능성\n⦁ 국민연금의 국채수요로 정부가 저비용으로 재정자금을 조달할 수 있기 때문에 재정규율이 약화될 가능성\n⦁ 고등급 채권 발행 주체인 대기업에 자본이 집중되고, 신성장 동력원이 되는 신생기업에 대한 자본공급 부진 가능성\n― 국민연금이 국내 주식에 대한 투자를 계속해서 늘려 그 비중이 급속도로 커질 경우에도 문제가 발생\n⦁ 국민연금 보유 주식의 가격이 하락하여 국민연금이 손절매에 나설 경우 일반투자자들의 투매를 초래해 시장이 불안해질 가능성\n⦁ 국민연금의 개별기업에 대한 지분율 상승으로 지배구조에 영향\n― 자산 축적기에 분산투자가 제대로 구축되지 않을 경우에는 향후 연금지급이 본격화되면 자산시장 왜곡을 초래할 가능성", "제목 범정부 서민생활침해사범 근절 대책 추진 중간결과\n□ 이번에 설치된 합동수사본부는\n○ 안전행정부, 미래창조과학부, 경찰청, 국세청, 금융위원회, 금융감독원, 사행산업통합감독위원회 등 유관기관이 참여하여,\n- 서민생활 침해사범에 대한 범정부 차원의 효율적인 예방․단속, 범죄수익 환수․탈루세액 징수, 피해자 보호 등과 관련하여 유관기관별 역할분담과 협업을 통하여 기관별 역량을 최대한 결집하고, 범죄유형별 대응시스템을 강화하였을 뿐만 아니라,관련 기관의 적극적 제도개선을 이끌어내는 계기를 마련하였음 ○ 서민생활침해사범 합동수사본부는 이후에도 1차 단속 과정에서 나타난 미흡한 점을 더욱 보완, 지속적인 단속활동을 전개하여 서민 피해자 보호, 불법수익의 철저한 환수, 탈루세액 징수에 박차를 가하는 한편, 불법차명물건(대포통장, 대포차, 대포폰 등)을 이용한 범죄 및 파밍, 스미싱 등 신종 사기 범죄로부터 서민을 보호하는 제도 개선에 더욱 노력하겠음."]}, {"source_sentence": "용재 오닐이 참가하고 있는 현악기로 연주하는 사중주단은 뭐야", "sentences": ["한국계 미국 비올리스트 리처드 용재 오닐이 ‘제63회 그래미 어워즈’에서 ‘베스트 클래시컬 인스트루먼털 솔로(Best Classical Instrumental Solo)’ 상을 받았다. 용재 오닐은 ‘그래미 어워즈’ 본 시상식에 앞서 한국시간으로 15일 진행된 사전 시상식 ‘프리미어 세리머니(Premiere Ceremony)’에서 이 부문 수상자로 호명됐다. 데이비드 앨런 밀러가 지휘하고 미국 알바니 심포니가 함께 연주한 테오파니디스의 비올라와 챔버 오케스트라를 위한 협주곡으로 영예를 안았다. 용재 오닐은 ‘디토 페스티벌’ 음악감독 등을 맡아 한국에서 클래식음악의 대중화에 기여했다. 세계적 현악 사중주단 ‘타카치 콰르텟’에 합류해 활약 중이다.", "‘안단테 칸타빌레(Andante Cantabile)’. 이탈리아어로 ‘노래하듯 천천히’라는 뜻이다. 현악사중주단 ‘아벨콰르텟’은 2021년을 그렇게 한발자국씩, 희망을 담아 걸어나가기로 했다. 
최근 한국일보 사옥에서 만난 ‘아벨콰르텟’ 멤버 윤은솔(34ㆍ바이올린), 박수현(32ㆍ바이올린), 문서현(24ㆍ비올라), 조형준(34ㆍ첼로)은 “코로나19 때문에 천천히 걸을 수밖에 없고 잠깐 멈춰야 할 때도 있겠지만, 항상 앞으로 나아가려는 마음이 중요하다”며 “이왕이면 좋은 생각을 하면서, 천천히 노래하듯 나아가고 싶다”고 입을 모았다. ‘아벨콰르텟’은 18일 광주 유ㆍ스퀘어문화관, 20일 서울 예술의전당에서 ‘안단테 칸타빌레’라는 이름의 네번째 정기연주회를 연다. 2013년 결성된 ‘아벨콰르텟’은 재작년 코로나19 만큼이나 큰 위기를 겪었다. 처음 팀을 만들 때 구심점 역할을 했던 비올리스트 김세준이 개인 사정으로 콰르텟을 떠나게 된 것. 오랜시간 합을 맞춰왔던 연주자의 공백은 컸다. 남은 3명이 “이대로 활동을 그만둬야 하나”하고 고민할 정도였다. 다행히 지난해 초 막내 문서현이 합류하면서 ‘아벨콰르텟’의 새 삶이 시작됐다. 문서현은 “관객으로 만났던 ‘아벨콰르텟’은 인간적이면서 따뜻한 소리가 기억에 남는 현악사중주단”이라며 “특정 장르에 머무르지 않고 다양한 시도를 하는 팀이어서 앞으로가 기대된다”고 했다. 이달 공연의 첫곡인 슈베르트 현악사중주 12번은 ‘아벨콰르텟’의 새출발을 알리는 신호탄이다. 오스트리아 빈에서 기존 멤버들과 만난 문서현이 처음으로 합주한 작품이기도 하다. 조형준은 “단악장의 짧은 곡이지만 몰아치는 감정의 소용돌이가 어마어마하다”면서 “지금까지 주로 고전시대 작품을 많이 했다면 이번에는 낭만주의 색깔을 마음껏 보여줄 수 있을 것”이라고 말했다. 뒤이어 펼쳐지는 멘델스존의 현악사중주 6번은 작곡가의 슬픔과 격정이 “제한 없이 드러나는” 곡이다. 멘델스존이 사랑하는 누나의 죽음을 접한 직후 쓴 곡으로 알려져 있다. 윤은솔은 “다른 팀원들이 이 곡을 공연해보자고 계속 제안했는데 지금까지 자신이 없어서 미루고만 있다가 최근 어떤 계기를 통해 연주할 힘을 얻었다”며 “마음의 준비가 됐기에 완성도 높은 음악을 들려드리고 싶다”고 했다. 마지막은 실내악 명작으로 꼽히는 차이코프스키의 현악사중주 1번이다. ‘안단테 칸타빌레’라는 제목이 붙은 2악장이 특히 유명한데, 이번 공연의 이름과 같다. 박수현은 이 곡을 두고 “따뜻하고 달콤한 유럽 크리스마스의 향기가 난다”고 표현했다. 박수현은 “차이코프스키는 아무런 음악적 지식이 없는 사람도 듣기만 하면 아름다움을 느낄 수 있는 곡들을 썼는데, 현악사중주 1번에도 그런 철학이 잘 담겨 있다”고 말했다.", "어느덧 내 나이 팔순을 지나간다. 최근에 팔순을 맞은 한 대학 동창이 예배시간에 이런 회고사를 했다. “목회에서 은퇴한 뒤 그동안 만났던 성도들을 떠올리며 기도하고 있습니다. 어떤 날은 새벽 5시30분 시작된 기도가 오전 10시까지 이어져 아침식사를 거르기도 했습니다.” 대학 동기 모임에서 또 다른 친구가 김홍도 목사(전 금란교회 담임목사)의 옆구리를 쿡 찌르면서 농담을 건넸다. “홍도야, 예배시간에 그 친구가 한 얘기 들었지? 성도를 위해 기도하느라고 아침도 못 먹었단다. 너는 성도가 수만 명이나 되는데 하루 종일 밥숟가락을 뜰 수나 있겠나.” 예전에는 인간이 강건하면 수명이 팔십이라고 했다. 난 그 나이만큼 살고 있으니 감사하다. 난 태어나서 네 살이 될 때까지 앞을 보지 못했다. 결핵성 관절염을 앓아 작대기를 짚고 학교에 다녔다. 또 건강이 좋지 않고 집이 가난해 행복한 꿈을 꿀 수 없었다. 그렇게 부족한데도 하나님께선 날 아껴주시고 목회를 완주할 수 있게 해주셨다. 하나님은 내게 좋은 부모님을 주셨다. 부모님은 내게 여호와에 대한 경외심을 삶으로 가르쳤다. 나 역시 부모님의 뒤를 따라 복음과 양심을 지키려 노력했다. 그리고 내 자녀들이 바통을 이어 목회의 길을 가고 있다. 참으로 은혜로운 일이다. 하나님은 또 내게 신앙심 깊은 아내를 허락하셨다. 아내의 소원이 참 재밌다. 하나님 앞에서 ‘난 목사의 며느리였고 목사의 아내였고 목사의 어머니였고 목사의 할머니였다’는 소리를 하고 싶다는 것이다. 
“여호와를 경외하며 그의 길을 걷는 자마다 복이 있도다. 네가 네 손이 수고한 대로 먹을 것이라 네가 복되고 형통하리로다. 네 집 안방에 있는 네 아내는 결실한 포도나무 같으며 네 식탁에 둘러앉은 자식들은 어린 감람나무 같으리로다. 여호와께서 시온에서 네게 복을 주실 것이며 너는 평생에 예루살렘의 번영을 보며 네 자식의 자식을 볼지어다.(시 128)” 성경에는 복이란, 손이 수고한 대로 소득을 얻는 것이라고 돼 있다. 복은 곧 아내와 아이들이 한 식탁에 둘러앉아 밥을 먹는 것이요, 자식의 자식 곧 손주를 보는 것이다. 생각하면 누구나 받는 싱거운 복 같지만 깊이 생각할수록 아무나 받을 수 없는 것이기도 하다. 요즘 젊은이들이 일자리나 결혼, 출산 등의 문제로 고통 받는다고 하니 평범해 보이는 일상은 실상 비범한 일이다. 하나님께서 내게 여호와를 경외하는 아내는 물론 감람나무 같은 자식들과 그들의 자식까지 보는 기쁨을 주셨다. 모두 하나님의 은혜다. 아버지는 평생 강원도 산골을 걸어 다니며 목회하셨다. 난 목회 초반 자전거를 탔다. 내 자녀들은 자동차를 타고 목회를 하고 있다. 아마도 손주들은 비행기를 타고 다니며 목회할 것이다. 나는 엘리야의 때에 바알에게 무릎 꿇지 않은 7000인의 사람들이 있었던 것처럼 오늘날에도 그런 천연기념물 같은 성도가 있다고 믿는다. 역경의 열매는 역경을 이긴 자들의 것이다. 이 시간에도 천연기념물 같은 주의 종들이 역경을 묵묵히 감내하고 있을 것이라 믿는다. 그들이 지나간 자리에 한층 더 커지고 환해지고 깨끗해지고 튼튼해지고 안전해진 주님의 교회가 있길 바라며 기도한다."]}, {"source_sentence": "연제구의 도시미화와 저소득층 노인들의 일자리 창출에도 도움이 되는 게 뭐야", "sentences": ["연제구(구청장 이성문)는 지난해에 이어 ‘2021년 불법 유동광고물 수거보상사업’을 시행한다. ‘불법 유동광고물 수거보상사업’은 도시미관을 해치는 불법 광고물을 근절하기 위해 사업 참여자가 불법 유동광고물을 수거하여 구청 도시재생과에 가져가면 구청에서는 보상금을 지급하는 사업이다. 구는 1월 11일부터 15일까지 연제구민 중 만 60세 이상 저소득 어르신을 대상으로 신청을 받아 총 50명을 수거보상사업 참여자로 선발하였다. 참여자로 선발된 어르신들은 오는 2월부터 시작하는 수거 활동에 앞서 연제구청 구민홀에서 불법 유동광고물 구분 기준, 수거 방법, 수거 시 안전 수칙 등에 대해서 사전 교육을 받았으며 수거활동 중 발생할 수 있는 안전사고에 대비해 단체 보험에도 가입했다. 불법 광고물 정비에 주민들이 참여할 수 있는 기회를 제공하여 주민들로부터 불법 광고물에 대한 경각심을 제고 할 수 있을 것으로 기대된다. 구 관계자는 “이번 사업을 통해 주민과 함께 품격 있는 연제구를 만드는 데 일조하고, 저소득 어르신의 실버 일자리 창출에도 기여할 것으로 기대된다”고 말했다.", "이시종 충북도지사는 24일 오후 2시 도청 대회의실에서 국가철도망 구축계획에 충북도 철도사업의 반영을 위한 민ㆍ관ㆍ정 간담회를 개최했다. 이날 간담회에는 이 지사를 비롯해 한범덕 청주시장, 송기섭 진천군수, 조병옥 음성군수, 이장섭, 임호선 국회의원과 박문희 도의회 의장, 최충진 청주시의회 의장, 김성우 진천군의회 의장, 최용락 음성군의회 의장 등 지역 정치권 관계자와 민간사회단체총연합회 유철웅 회장 등 민간사회단체 관계자까지 34명이 참석했다. 이번 간담회에서는 4차 국가철도망구축계획 공청회가 얼마 남지 않은 중요한 시점에서 그동안 철도사업의 국가계획 반영 추진상황을 공유하고, 각 기관과 단체별 참여 방안과 도민의 힘을 모으기 위한 다양한 방안이 논의됐다. 도는 청주도심을 통과하는 충청권 광역철도, 수도권에서 진천 국가대표선수촌과 혁신도시를 거쳐 청주공항을 잇는 수도권내륙선 광역철도, 음성 감곡에서 혁신도시를 거쳐 청주공항을 잇는 중부내륙선 지선 등의 노선을 국가계획에 반영하기 위해 집중하고 있다. 
이 지사는 \"해당 철도노선을 국가계획에 반영해야만 추진할 수 있기 때문에 우선은 반영을 목표로 최선을 다해야 한다\"며 \"많은 도민들의 공감대와 적극적인 지지가 필요한 때인 만큼 참석자들이 구심점 역할을 해 줄 것\"을 당부했다. 도 관계자는 \"국가철도망계획은 10년 단위 계획으로 전국 지자체가 각자의 사업 반영을 위해 각축전을 벌이는 상황\"이라며 \"충북도 사업이 최대한 반영될 수 있도록 최선을 다하겠다\"고 말했다. \n\n ", "4. 나가며\n노인복지주택이 지속가능한 노인복지정책이 되기 위해서는 사업시행자에게는 경제적으로 이득이 되고, 정책대상인 노인가구에게도 주거생활에 실질적인 도움을 줄 수 있어야 할 것이다. 그러나 그간 노인복지주택에의 사업시행자는 건설부지 및 부대시설 기준완화, 조세감면 등 각종 혜택을 받아 경제적 이득을 실현한 반면, 정책대상가구인 노인가구는 입소자격 제한규정으로 재산권 행사에 많은 불편을 겪어왔다. 이러한 정책집행 의지와 현실 간 괴리 속에서 다수의 노인복지주택에서 입소자격이 없는 자가 탈법적으로 입주하는 행위가 발생해온 것이다. 다음과 같은 측면에서도 노인복지주택정책에 대한 면밀한 검토가 필요하다. 첫째, 노인복지주택이 용도상 자연경관이 우수한 녹지지역 혹은 기반시설이 확보되지 않은 지역에도 건축될 수 있어 국토난개발을 유발할 가능성이 크다. 둘째, 보다 근본적으로 노인복지주택과 같이 노인들만 거주하는 주택이 노인복지 측면에서 바람직한지를 검토할 필요가 있다. 우리나라와 같이 급격한 고령화를 경험하고 있는 일본의 경우, 젊은 세대와 노인 세대가 함께 거주하는(age-mix) 정책이 중요하게 인식되고 있기 때문이다. 현행 노인복지주택 입소자자격 등은 노인의 주거복지증진과 행복추구에 부정적인 영향을 끼치고 있다는 점을 볼 때, 현행의 노인복지주택정책을 지속시키는 것이 실익이 있는지에 대한 면밀한 검토가 필요한 시점이다. 이를 위해 향후 공급되는 분양형 노인복지주택제도를 폐지하고, 노인복지주택을 「주택법」 체계 내로 흡수하는 방안을 적극적으로 검토할 필요가 있을 것이다."]}], "model-index": [{"name": "SentenceTransformer based on BM-K/KoSimCSE-bert-multitask", "results": [{"task": {"type": "triplet", "name": "Triplet"}, "dataset": {"name": "eval", "type": "eval"}, "metrics": [{"type": "cosine_accuracy", "value": 0.9539, "name": "Cosine Accuracy"}, {"type": "dot_accuracy", "value": 0.0587, "name": "Dot Accuracy"}, {"type": "manhattan_accuracy", "value": 0.9496, "name": "Manhattan Accuracy"}, {"type": "euclidean_accuracy", "value": 0.9518, "name": "Euclidean Accuracy"}, {"type": "max_accuracy", "value": 0.9539, "name": "Max Accuracy"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,900
prithivMLmods/Cassiopeia-Qwen-14B
prithivMLmods
text-generation
[ "transformers", "safetensors", "qwen2", "text-generation", "text-generation-inference", "code", "Qwen", "14B", "QWQ", "conversational", "en", "base_model:Qwen/Qwen2.5-14B-Instruct-1M", "base_model:finetune:Qwen/Qwen2.5-14B-Instruct-1M", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2025-03-08T06:48:42Z
2025-04-02T18:56:52+00:00
281
1
--- base_model: - Qwen/Qwen2.5-14B-Instruct-1M language: - en library_name: transformers license: apache-2.0 pipeline_tag: text-generation tags: - text-generation-inference - code - Qwen - 14B - QWQ --- ![7.png](https://cdn-uploads.huggingface.co/production/uploads/65bb837dbfb878f46c77de4c/U80NQmMNhQDnOW-yye0dc.png) # **Cassiopeia-Qwen-14B** Cassiopeia-Qwen-14B is based on the Qwen 2.5 14B modality architecture, designed to enhance the reasoning capabilities of 14B-parameter models. This model is optimized for general-purpose reasoning and answering, excelling in contextual understanding, logical deduction, and multi-step problem-solving. It has been fine-tuned using a long chain-of-thought reasoning model and specialized datasets to improve comprehension, structured responses, and conversational intelligence. ## **Key Improvements** 1. **Enhanced General Knowledge**: The model provides broad knowledge across various domains, improving capabilities in answering questions accurately and generating coherent responses. 2. **Improved Instruction Following**: Significant advancements in understanding and following complex instructions, generating structured responses, and maintaining coherence over extended interactions. 3. **Versatile Adaptability**: More resilient to diverse prompts, enhancing its ability to handle a wide range of topics and conversation styles, including open-ended and structured inquiries. 4. **Long-Context Support**: Supports up to 128K tokens for input context and can generate up to 8K tokens in a single output, making it ideal for detailed responses. 
## **Quickstart with transformers** Here is a code snippet with `apply_chat_template` to show you how to load the tokenizer and model and generate content: ```python from transformers import AutoModelForCausalLM, AutoTokenizer model_name = "prithivMLmods/Cassiopeia-Qwen-14B" model = AutoModelForCausalLM.from_pretrained( model_name, torch_dtype="auto", device_map="auto" ) tokenizer = AutoTokenizer.from_pretrained(model_name) prompt = "What are the key principles of general-purpose AI?" messages = [ {"role": "system", "content": "You are a helpful assistant capable of answering a wide range of questions."}, {"role": "user", "content": prompt} ] text = tokenizer.apply_chat_template( messages, tokenize=False, add_generation_prompt=True ) model_inputs = tokenizer([text], return_tensors="pt").to(model.device) generated_ids = model.generate( **model_inputs, max_new_tokens=512 ) generated_ids = [ output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids) ] response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0] ``` ## **Intended Use** 1. **General-Purpose Reasoning**: Designed for broad applicability, assisting with logical reasoning, answering diverse questions, and solving general knowledge problems. 2. **Educational and Informational Assistance**: Suitable for providing explanations, summaries, and research-based responses for students, educators, and general users. 3. **Conversational AI and Chatbots**: Ideal for building intelligent conversational agents that require contextual understanding and dynamic response generation. 4. **Multilingual Applications**: Supports global communication, translations, and multilingual content generation. 5. **Structured Data Processing**: Capable of analyzing and generating structured outputs, such as tables and JSON, useful for data science and automation. 6. 
**Long-Form Content Generation**: Can generate extended responses, including articles, reports, and guides, maintaining coherence over large text outputs. ## **Limitations** 1. **Hardware Requirements**: Requires high-memory GPUs or TPUs due to its large parameter size and long-context support. 2. **Potential Bias in Responses**: While designed to be neutral, outputs may still reflect biases present in training data. 3. **Inconsistent Outputs in Creative Tasks**: May produce variable results in storytelling and highly subjective topics. 4. **Limited Real-World Awareness**: Does not have access to real-time events beyond its training cutoff. 5. **Error Propagation in Extended Outputs**: Minor errors in early responses may affect overall coherence in long-form outputs. 6. **Prompt Sensitivity**: The effectiveness of responses may depend on how well the input prompt is structured.
null
Non_BioNLP
![7.png](https://cdn-uploads.huggingface.co/production/uploads/65bb837dbfb878f46c77de4c/U80NQmMNhQDnOW-yye0dc.png) # **Cassiopeia-Qwen-14B** Cassiopeia-Qwen-14B is based on the Qwen 2.5 14B modality architecture, designed to enhance the reasoning capabilities of 14B-parameter models. This model is optimized for general-purpose reasoning and answering, excelling in contextual understanding, logical deduction, and multi-step problem-solving. It has been fine-tuned using a long chain-of-thought reasoning model and specialized datasets to improve comprehension, structured responses, and conversational intelligence. ## **Key Improvements** 1. **Enhanced General Knowledge**: The model provides broad knowledge across various domains, improving capabilities in answering questions accurately and generating coherent responses. 2. **Improved Instruction Following**: Significant advancements in understanding and following complex instructions, generating structured responses, and maintaining coherence over extended interactions. 3. **Versatile Adaptability**: More resilient to diverse prompts, enhancing its ability to handle a wide range of topics and conversation styles, including open-ended and structured inquiries. 4. **Long-Context Support**: Supports up to 128K tokens for input context and can generate up to 8K tokens in a single output, making it ideal for detailed responses. ## **Quickstart with transformers** Here is a code snippet with `apply_chat_template` to show you how to load the tokenizer and model and generate content: ```python from transformers import AutoModelForCausalLM, AutoTokenizer model_name = "prithivMLmods/Cassiopeia-Qwen-14B" model = AutoModelForCausalLM.from_pretrained( model_name, torch_dtype="auto", device_map="auto" ) tokenizer = AutoTokenizer.from_pretrained(model_name) prompt = "What are the key principles of general-purpose AI?" 
messages = [ {"role": "system", "content": "You are a helpful assistant capable of answering a wide range of questions."}, {"role": "user", "content": prompt} ] text = tokenizer.apply_chat_template( messages, tokenize=False, add_generation_prompt=True ) model_inputs = tokenizer([text], return_tensors="pt").to(model.device) generated_ids = model.generate( **model_inputs, max_new_tokens=512 ) generated_ids = [ output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids) ] response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0] ``` ## **Intended Use** 1. **General-Purpose Reasoning**: Designed for broad applicability, assisting with logical reasoning, answering diverse questions, and solving general knowledge problems. 2. **Educational and Informational Assistance**: Suitable for providing explanations, summaries, and research-based responses for students, educators, and general users. 3. **Conversational AI and Chatbots**: Ideal for building intelligent conversational agents that require contextual understanding and dynamic response generation. 4. **Multilingual Applications**: Supports global communication, translations, and multilingual content generation. 5. **Structured Data Processing**: Capable of analyzing and generating structured outputs, such as tables and JSON, useful for data science and automation. 6. **Long-Form Content Generation**: Can generate extended responses, including articles, reports, and guides, maintaining coherence over large text outputs. ## **Limitations** 1. **Hardware Requirements**: Requires high-memory GPUs or TPUs due to its large parameter size and long-context support. 2. **Potential Bias in Responses**: While designed to be neutral, outputs may still reflect biases present in training data. 3. **Inconsistent Outputs in Creative Tasks**: May produce variable results in storytelling and highly subjective topics. 4. 
**Limited Real-World Awareness**: Does not have access to real-time events beyond its training cutoff. 5. **Error Propagation in Extended Outputs**: Minor errors in early responses may affect overall coherence in long-form outputs. 6. **Prompt Sensitivity**: The effectiveness of responses may depend on how well the input prompt is structured.
{"base_model": ["Qwen/Qwen2.5-14B-Instruct-1M"], "language": ["en"], "library_name": "transformers", "license": "apache-2.0", "pipeline_tag": "text-generation", "tags": ["text-generation-inference", "code", "Qwen", "14B", "QWQ", "Math", "trl"]}
task
[ "TRANSLATION" ]
42,901
LanguageMachines/blip2-opt-2.7b
LanguageMachines
image-to-text
[ "transformers", "pytorch", "blip-2", "visual-question-answering", "vision", "image-to-text", "image-captioning", "en", "arxiv:2301.12597", "license:mit", "endpoints_compatible", "region:us" ]
2023-06-28T19:56:29Z
2023-06-28T20:28:53+00:00
153
0
--- language: en license: mit pipeline_tag: image-to-text tags: - vision - image-to-text - image-captioning - visual-question-answering duplicated_from: Salesforce/blip2-opt-2.7b --- # BLIP-2, OPT-2.7b, pre-trained only BLIP-2 model, leveraging [OPT-2.7b](https://huggingface.co/facebook/opt-2.7b) (a large language model with 2.7 billion parameters). It was introduced in the paper [BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models](https://arxiv.org/abs/2301.12597) by Li et al. and first released in [this repository](https://github.com/salesforce/LAVIS/tree/main/projects/blip2). Disclaimer: The team releasing BLIP-2 did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description BLIP-2 consists of 3 models: a CLIP-like image encoder, a Querying Transformer (Q-Former) and a large language model. The authors initialize the weights of the image encoder and large language model from pre-trained checkpoints and keep them frozen while training the Querying Transformer, which is a BERT-like Transformer encoder that maps a set of "query tokens" to query embeddings, which bridge the gap between the embedding space of the image encoder and the large language model. The goal for the model is simply to predict the next text token, giving the query embeddings and the previous text. <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/blip2_architecture.jpg" alt="drawing" width="600"/> This allows the model to be used for tasks like: - image captioning - visual question answering (VQA) - chat-like conversations by feeding the image and the previous conversation as prompt to the model ## Direct Use and Downstream Use You can use the raw model for conditional text generation given an image and optional text. 
See the [model hub](https://huggingface.co/models?search=Salesforce/blip) to look for fine-tuned versions on a task that interests you. ## Bias, Risks, Limitations, and Ethical Considerations BLIP2-OPT uses off-the-shelf OPT as the language model. It inherits the same risks and limitations as mentioned in Meta's model card. > Like other large language models for which the diversity (or lack thereof) of training > data induces downstream impact on the quality of our model, OPT-175B has limitations in terms > of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and > hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern > large language models. > BLIP2 is fine-tuned on image-text datasets (e.g. [LAION](https://laion.ai/blog/laion-400-open-dataset/) ) collected from the internet. As a result the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data. BLIP2 has not been tested in real world applications. It should not be directly deployed in any applications. Researchers should first carefully assess the safety and fairness of the model in relation to the specific context they’re being deployed within. ### How to use For code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/blip-2#transformers.Blip2ForConditionalGeneration.forward.example). 
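As background for the snippets below: the Q-Former described in the model description bridges the frozen image encoder and the language model by letting a small set of learned query tokens cross-attend to the image features. A toy, pure-Python sketch of that cross-attention step (dimensions, values, and function names are illustrative only; the real Q-Former is a BERT-style transformer with learned projection weights):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def cross_attention(queries, image_feats):
    """Each query token attends over all image features and returns a
    weighted mix of them -- the bridge the Q-Former provides."""
    d = len(queries[0])
    outputs = []
    for q in queries:
        scores = [
            sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
            for k in image_feats
        ]
        weights = softmax(scores)
        mixed = [
            sum(w * feat[j] for w, feat in zip(weights, image_feats))
            for j in range(d)
        ]
        outputs.append(mixed)
    return outputs

# 2 query tokens attending over 3 image patch features, dim 4 (toy values)
queries = [[0.1, 0.2, 0.0, 0.5], [0.4, 0.0, 0.3, 0.1]]
image_feats = [[1.0, 0.0, 0.0, 0.0],
               [0.0, 1.0, 0.0, 0.0],
               [0.0, 0.0, 1.0, 1.0]]
out = cross_attention(queries, image_feats)
print(len(out), len(out[0]))  # 2 query embeddings of dim 4
```

In BLIP-2 itself these query embeddings are what gets handed to the frozen language model as a soft visual prompt.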
#### Running the model on CPU <details> <summary> Click to expand </summary> ```python import requests from PIL import Image from transformers import Blip2Processor, Blip2ForConditionalGeneration processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b") model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" inputs = processor(raw_image, question, return_tensors="pt") out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) ``` </details> #### Running the model on GPU ##### In full precision <details> <summary> Click to expand </summary> ```python # pip install accelerate import requests from PIL import Image from transformers import Blip2Processor, Blip2ForConditionalGeneration processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b") model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b", device_map="auto") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" 
inputs = processor(raw_image, question, return_tensors="pt").to("cuda") out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) ``` </details> ##### In half precision (`float16`) <details> <summary> Click to expand </summary> ```python # pip install accelerate import torch import requests from PIL import Image from transformers import Blip2Processor, Blip2ForConditionalGeneration processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b") model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b", torch_dtype=torch.float16, device_map="auto") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16) out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) ``` </details> ##### In 8-bit precision (`int8`) <details> <summary> Click to expand </summary> ```python # pip install accelerate bitsandbytes import torch import requests from PIL import Image from transformers import Blip2Processor, Blip2ForConditionalGeneration processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b") model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b", load_in_8bit=True, device_map="auto") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16) out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) ``` </details>
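The chat-like usage mentioned under "Direct Use" amounts to folding earlier conversation turns into the text prompt passed alongside the image. A minimal sketch of that prompt construction — note the `Question: … Answer: …` template is a common convention for BLIP-2 with OPT, assumed here rather than specified by the card:

```python
def build_chat_prompt(history, question):
    """Concatenate prior (question, answer) turns and a new question into a
    single text prompt. The "Question: ... Answer: ..." template is a common
    convention for BLIP-2 + OPT, not an official specification."""
    parts = [f"Question: {q} Answer: {a}" for q, a in history]
    parts.append(f"Question: {question} Answer:")
    return " ".join(parts)


# The resulting string would be passed to the processor together with the image.
prompt = build_chat_prompt(
    [("how many dogs are in the picture?", "one")],
    "what color is the dog?",
)
print(prompt)
```

The model then continues the prompt after the trailing `Answer:`, so the generated text can be appended to `history` for the next turn.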
null
Non_BioNLP
# BLIP-2, OPT-2.7b, pre-trained only BLIP-2 model, leveraging [OPT-2.7b](https://huggingface.co/facebook/opt-2.7b) (a large language model with 2.7 billion parameters). It was introduced in the paper [BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models](https://arxiv.org/abs/2301.12597) by Li et al. and first released in [this repository](https://github.com/salesforce/LAVIS/tree/main/projects/blip2). Disclaimer: The team releasing BLIP-2 did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description BLIP-2 consists of 3 models: a CLIP-like image encoder, a Querying Transformer (Q-Former) and a large language model. The authors initialize the weights of the image encoder and large language model from pre-trained checkpoints and keep them frozen while training the Querying Transformer, which is a BERT-like Transformer encoder that maps a set of "query tokens" to query embeddings, which bridge the gap between the embedding space of the image encoder and the large language model. The goal for the model is simply to predict the next text token, given the query embeddings and the previous text. <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/blip2_architecture.jpg" alt="drawing" width="600"/> This allows the model to be used for tasks like: - image captioning - visual question answering (VQA) - chat-like conversations by feeding the image and the previous conversation as prompt to the model ## Direct Use and Downstream Use You can use the raw model for conditional text generation given an image and optional text. See the [model hub](https://huggingface.co/models?search=Salesforce/blip) to look for fine-tuned versions on a task that interests you. ## Bias, Risks, Limitations, and Ethical Considerations BLIP2-OPT uses off-the-shelf OPT as the language model. 
It inherits the same risks and limitations as mentioned in Meta's model card. > Like other large language models for which the diversity (or lack thereof) of training > data induces downstream impact on the quality of our model, OPT-175B has limitations in terms > of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and > hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern > large language models. > BLIP2 is fine-tuned on image-text datasets (e.g. [LAION](https://laion.ai/blog/laion-400-open-dataset/) ) collected from the internet. As a result the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data. BLIP2 has not been tested in real world applications. It should not be directly deployed in any applications. Researchers should first carefully assess the safety and fairness of the model in relation to the specific context they’re being deployed within. ### How to use For code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/blip-2#transformers.Blip2ForConditionalGeneration.forward.example). #### Running the model on CPU <details> <summary> Click to expand </summary> ```python import requests from PIL import Image from transformers import Blip2Processor, Blip2ForConditionalGeneration processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b") model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" 
inputs = processor(raw_image, question, return_tensors="pt") out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) ``` </details> #### Running the model on GPU ##### In full precision <details> <summary> Click to expand </summary> ```python # pip install accelerate import requests from PIL import Image from transformers import Blip2Processor, Blip2ForConditionalGeneration processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b") model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b", device_map="auto") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" inputs = processor(raw_image, question, return_tensors="pt").to("cuda") out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) ``` </details> ##### In half precision (`float16`) <details> <summary> Click to expand </summary> ```python # pip install accelerate import torch import requests from PIL import Image from transformers import Blip2Processor, Blip2ForConditionalGeneration processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b") model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b", torch_dtype=torch.float16, device_map="auto") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" 
inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16) out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) ``` </details> ##### In 8-bit precision (`int8`) <details> <summary> Click to expand </summary> ```python # pip install accelerate bitsandbytes import torch import requests from PIL import Image from transformers import Blip2Processor, Blip2ForConditionalGeneration processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b") model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b", load_in_8bit=True, device_map="auto") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16) out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) ``` </details>
{"language": "en", "license": "mit", "pipeline_tag": "image-to-text", "tags": ["vision", "image-to-text", "image-captioning", "visual-question-answering"], "duplicated_from": "Salesforce/blip2-opt-2.7b"}
task
[ "QUESTION_ANSWERING" ]
42,902
varun-v-rao/bart-base-bn-adapter-895K-snli-model1
varun-v-rao
null
[ "tensorboard", "generated_from_trainer", "dataset:stanfordnlp/snli", "base_model:facebook/bart-base", "base_model:finetune:facebook/bart-base", "license:apache-2.0", "model-index", "region:us" ]
2024-06-19T18:31:58Z
2024-06-19T20:58:14+00:00
0
0
--- base_model: facebook/bart-base datasets: - stanfordnlp/snli license: apache-2.0 metrics: - accuracy tags: - generated_from_trainer model-index: - name: bart-base-bn-adapter-895K-snli-model1 results: - task: type: text-classification name: Text Classification dataset: name: snli type: stanfordnlp/snli metrics: - type: accuracy value: 0.8574476732371469 name: Accuracy --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bart-base-bn-adapter-895K-snli-model1 This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on the snli dataset. It achieves the following results on the evaluation set: - Loss: 0.3775 - Accuracy: 0.8574 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 32 - seed: 65 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.5241 | 1.0 | 8584 | 0.4209 | 0.8340 | | 0.4838 | 2.0 | 17168 | 0.3869 | 0.8509 | | 0.4716 | 3.0 | 25752 | 0.3775 | 0.8574 | ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.1+cu121 - Datasets 2.15.0 - Tokenizers 0.15.0
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bart-base-bn-adapter-895K-snli-model1 This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on the snli dataset. It achieves the following results on the evaluation set: - Loss: 0.3775 - Accuracy: 0.8574 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 32 - seed: 65 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.5241 | 1.0 | 8584 | 0.4209 | 0.8340 | | 0.4838 | 2.0 | 17168 | 0.3869 | 0.8509 | | 0.4716 | 3.0 | 25752 | 0.3775 | 0.8574 | ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.1+cu121 - Datasets 2.15.0 - Tokenizers 0.15.0
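As a rough illustration of the `linear` scheduler listed in the hyperparameters above, the learning rate ramps down from its peak of `2e-05` to zero over the full 25752 training steps; a simplified stand-in (ignoring warmup, which the actual `transformers` scheduler also supports):

```python
def linear_lr(step, total_steps, base_lr=2e-5):
    """Linearly decay the learning rate from base_lr at step 0 to 0 at
    total_steps — a simplified sketch of a `linear` schedule without warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)


total = 3 * 8584  # 3 epochs x 8584 steps per epoch, matching the table above
for step in (0, total // 2, total):
    print(step, linear_lr(step, total))
```

This is illustrative only; the exact schedule used in training comes from the `transformers` Trainer with the arguments listed above.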
{"base_model": "facebook/bart-base", "datasets": ["stanfordnlp/snli"], "license": "apache-2.0", "metrics": ["accuracy"], "tags": ["generated_from_trainer"], "model-index": [{"name": "bart-base-bn-adapter-895K-snli-model1", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "snli", "type": "stanfordnlp/snli"}, "metrics": [{"type": "accuracy", "value": 0.8574476732371469, "name": "Accuracy"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,903
Nextcloud-AI/opus-mt-ja-es
Nextcloud-AI
translation
[ "transformers", "pytorch", "tf", "marian", "text2text-generation", "translation", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-23T10:45:06Z
2023-08-16T11:59:09+00:00
23
0
--- license: apache-2.0 tags: - translation --- ### opus-mt-ja-es * source languages: ja * target languages: es * OPUS readme: [ja-es](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/ja-es/README.md) * dataset: opus * model: transformer-align * pre-processing: normalization + SentencePiece * download original weights: [opus-2020-01-16.zip](https://object.pouta.csc.fi/OPUS-MT-models/ja-es/opus-2020-01-16.zip) * test set translations: [opus-2020-01-16.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/ja-es/opus-2020-01-16.test.txt) * test set scores: [opus-2020-01-16.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/ja-es/opus-2020-01-16.eval.txt) ## Benchmarks | testset | BLEU | chr-F | |-----------------------|-------|-------| | Tatoeba.ja.es | 34.6 | 0.553 |
null
Non_BioNLP
### opus-mt-ja-es * source languages: ja * target languages: es * OPUS readme: [ja-es](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/ja-es/README.md) * dataset: opus * model: transformer-align * pre-processing: normalization + SentencePiece * download original weights: [opus-2020-01-16.zip](https://object.pouta.csc.fi/OPUS-MT-models/ja-es/opus-2020-01-16.zip) * test set translations: [opus-2020-01-16.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/ja-es/opus-2020-01-16.test.txt) * test set scores: [opus-2020-01-16.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/ja-es/opus-2020-01-16.eval.txt) ## Benchmarks | testset | BLEU | chr-F | |-----------------------|-------|-------| | Tatoeba.ja.es | 34.6 | 0.553 |
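The BLEU score in the benchmark table compares model output against reference translations via modified n-gram precisions combined with a brevity penalty. A self-contained toy sketch of the computation — illustrative only, since reported scores come from standard evaluation tooling rather than a snippet like this:

```python
import math
from collections import Counter


def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def bleu(candidate, reference, max_n=4):
    """Toy sentence-level BLEU: geometric mean of modified n-gram precisions
    (floored to avoid log(0) on short toy inputs) times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        total = max(sum(cand.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)
    if len(candidate) >= len(reference):
        bp = 1.0  # no brevity penalty when the candidate is long enough
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)


ref = "el pan está en la mesa".split()
print(bleu(ref, ref))                         # identical -> 1.0
print(bleu("el pan está aquí".split(), ref))  # partial overlap -> well below 1
```

Corpus-level BLEU as reported above additionally pools n-gram counts over all test sentences, but the per-sentence intuition is the same.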
{"license": "apache-2.0", "tags": ["translation"]}
task
[ "TRANSLATION" ]
42,904
facebook/fasttext-als-vectors
facebook
feature-extraction
[ "fasttext", "feature-extraction", "als", "arxiv:1607.04606", "arxiv:1802.06893", "arxiv:1607.01759", "arxiv:1612.03651", "license:cc-by-sa-3.0", "region:us" ]
2023-03-17T13:01:55Z
2023-06-03T22:09:03+00:00
3
0
--- language: als library_name: fasttext license: cc-by-sa-3.0 tags: - feature-extraction widget: - text: apple example_title: apple --- # fastText (Alemannic) fastText is an open-source, free, lightweight library that allows users to learn text representations and text classifiers. It works on standard, generic hardware. Models can later be reduced in size to even fit on mobile devices. It was introduced in [this paper](https://arxiv.org/abs/1607.04606). The official website can be found [here](https://fasttext.cc/). ## Model description fastText is a library for efficient learning of word representations and sentence classification. fastText is designed to be simple to use for developers, domain experts, and students. It's dedicated to text classification and learning word representations, and was designed to allow for quick model iteration and refinement without specialized hardware. fastText models can be trained on more than a billion words on any multicore CPU in less than a few minutes. It includes pre-trained models learned on Wikipedia and in over 157 different languages. fastText can be used as a command line, linked to a C++ application, or used as a library for use cases from experimentation and prototyping to production. ## Intended uses & limitations You can use pre-trained word vectors for text classification or language identification. See the [tutorials](https://fasttext.cc/docs/en/supervised-tutorial.html) and [resources](https://fasttext.cc/docs/en/english-vectors.html) on its official website to look for tasks that interest you. ### How to use Here is how to load and use pre-trained vectors: ```python >>> import fasttext >>> from huggingface_hub import hf_hub_download >>> model_path = hf_hub_download(repo_id="facebook/fasttext-als-vectors", filename="model.bin") >>> model = fasttext.load_model(model_path) >>> model.words ['the', 'of', 'and', 'to', 'in', 'a', 'that', 'is', ...] 
>>> len(model.words) 145940 >>> model['bread'] array([ 4.89417791e-01, 1.60882145e-01, -2.25947708e-01, -2.94273376e-01, -1.04577184e-01, 1.17962055e-01, 1.34821936e-01, -2.41778508e-01, ...]) ``` Here is how to use this model to query nearest neighbors of an English word vector: ```python >>> import fasttext >>> from huggingface_hub import hf_hub_download >>> model_path = hf_hub_download(repo_id="facebook/fasttext-en-nearest-neighbors", filename="model.bin") >>> model = fasttext.load_model(model_path) >>> model.get_nearest_neighbors("bread", k=5) [(0.5641006231307983, 'butter'), (0.48875734210014343, 'loaf'), (0.4491206705570221, 'eat'), (0.42444291710853577, 'food'), (0.4229326844215393, 'cheese')] ``` Here is how to use this model to detect the language of a given text: ```python >>> import fasttext >>> from huggingface_hub import hf_hub_download >>> model_path = hf_hub_download(repo_id="facebook/fasttext-language-identification", filename="model.bin") >>> model = fasttext.load_model(model_path) >>> model.predict("Hello, world!") (('__label__eng_Latn',), array([0.81148803])) >>> model.predict("Hello, world!", k=5) (('__label__eng_Latn', '__label__vie_Latn', '__label__nld_Latn', '__label__pol_Latn', '__label__deu_Latn'), array([0.61224753, 0.21323682, 0.09696738, 0.01359863, 0.01319415])) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. Cosine similarity can be used to measure the similarity between two different word vectors. If two vectors are identical, the cosine similarity will be 1. For two completely unrelated vectors, the value will be 0. If two vectors have an opposite relationship, the value will be -1. 
```python >>> import numpy as np >>> def cosine_similarity(word1, word2): >>> return np.dot(model[word1], model[word2]) / (np.linalg.norm(model[word1]) * np.linalg.norm(model[word2])) >>> cosine_similarity("man", "boy") 0.061653383 >>> cosine_similarity("man", "ceo") 0.11989131 >>> cosine_similarity("woman", "ceo") -0.08834904 ``` ## Training data Pre-trained word vectors for 157 languages were trained on [Common Crawl](http://commoncrawl.org/) and [Wikipedia](https://www.wikipedia.org/) using fastText. These models were trained using CBOW with position-weights, in dimension 300, with character n-grams of length 5, a window of size 5 and 10 negatives. We also distribute three new word analogy datasets, for French, Hindi and Polish. ## Training procedure ### Tokenization We used the [Stanford word segmenter](https://nlp.stanford.edu/software/segmenter.html) for Chinese, [Mecab](http://taku910.github.io/mecab/) for Japanese and [UETsegmenter](https://github.com/phongnt570/UETsegmenter) for Vietnamese. For languages using the Latin, Cyrillic, Hebrew or Greek scripts, we used the tokenizer from the [Europarl](https://www.statmt.org/europarl/) preprocessing tools. For the remaining languages, we used the ICU tokenizer. More information about the training of these models can be found in the article [Learning Word Vectors for 157 Languages](https://arxiv.org/abs/1802.06893). ### License The word vectors are distributed under the [*Creative Commons Attribution-Share-Alike License 3.0*](https://creativecommons.org/licenses/by-sa/3.0/). ### Evaluation datasets The analogy evaluation datasets described in the paper are available here: [French](https://dl.fbaipublicfiles.com/fasttext/word-analogies/questions-words-fr.txt), [Hindi](https://dl.fbaipublicfiles.com/fasttext/word-analogies/questions-words-hi.txt), [Polish](https://dl.fbaipublicfiles.com/fasttext/word-analogies/questions-words-pl.txt). 
### BibTeX entry and citation info Please cite [1] if using this code for learning word representations or [2] if using for text classification. [1] P. Bojanowski\*, E. Grave\*, A. Joulin, T. Mikolov, [*Enriching Word Vectors with Subword Information*](https://arxiv.org/abs/1607.04606) ```markup @article{bojanowski2016enriching, title={Enriching Word Vectors with Subword Information}, author={Bojanowski, Piotr and Grave, Edouard and Joulin, Armand and Mikolov, Tomas}, journal={arXiv preprint arXiv:1607.04606}, year={2016} } ``` [2] A. Joulin, E. Grave, P. Bojanowski, T. Mikolov, [*Bag of Tricks for Efficient Text Classification*](https://arxiv.org/abs/1607.01759) ```markup @article{joulin2016bag, title={Bag of Tricks for Efficient Text Classification}, author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Mikolov, Tomas}, journal={arXiv preprint arXiv:1607.01759}, year={2016} } ``` [3] A. Joulin, E. Grave, P. Bojanowski, M. Douze, H. Jégou, T. Mikolov, [*FastText.zip: Compressing text classification models*](https://arxiv.org/abs/1612.03651) ```markup @article{joulin2016fasttext, title={FastText.zip: Compressing text classification models}, author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Douze, Matthijs and J{\'e}gou, H{\'e}rve and Mikolov, Tomas}, journal={arXiv preprint arXiv:1612.03651}, year={2016} } ``` If you use these word vectors, please cite the following paper: [4] E. Grave\*, P. Bojanowski\*, P. Gupta, A. Joulin, T. Mikolov, [*Learning Word Vectors for 157 Languages*](https://arxiv.org/abs/1802.06893) ```markup @inproceedings{grave2018learning, title={Learning Word Vectors for 157 Languages}, author={Grave, Edouard and Bojanowski, Piotr and Gupta, Prakhar and Joulin, Armand and Mikolov, Tomas}, booktitle={Proceedings of the International Conference on Language Resources and Evaluation (LREC 2018)}, year={2018} } ``` (\* These authors contributed equally.)
null
Non_BioNLP
# fastText (Alemannic) fastText is an open-source, free, lightweight library that allows users to learn text representations and text classifiers. It works on standard, generic hardware. Models can later be reduced in size to even fit on mobile devices. It was introduced in [this paper](https://arxiv.org/abs/1607.04606). The official website can be found [here](https://fasttext.cc/). ## Model description fastText is a library for efficient learning of word representations and sentence classification. fastText is designed to be simple to use for developers, domain experts, and students. It's dedicated to text classification and learning word representations, and was designed to allow for quick model iteration and refinement without specialized hardware. fastText models can be trained on more than a billion words on any multicore CPU in less than a few minutes. It includes pre-trained models learned on Wikipedia and in over 157 different languages. fastText can be used as a command line, linked to a C++ application, or used as a library for use cases from experimentation and prototyping to production. ## Intended uses & limitations You can use pre-trained word vectors for text classification or language identification. See the [tutorials](https://fasttext.cc/docs/en/supervised-tutorial.html) and [resources](https://fasttext.cc/docs/en/english-vectors.html) on its official website to look for tasks that interest you. ### How to use Here is how to load and use pre-trained vectors: ```python >>> import fasttext >>> from huggingface_hub import hf_hub_download >>> model_path = hf_hub_download(repo_id="facebook/fasttext-als-vectors", filename="model.bin") >>> model = fasttext.load_model(model_path) >>> model.words ['the', 'of', 'and', 'to', 'in', 'a', 'that', 'is', ...] 
>>> len(model.words) 145940 >>> model['bread'] array([ 4.89417791e-01, 1.60882145e-01, -2.25947708e-01, -2.94273376e-01, -1.04577184e-01, 1.17962055e-01, 1.34821936e-01, -2.41778508e-01, ...]) ``` Here is how to use this model to query nearest neighbors of an English word vector: ```python >>> import fasttext >>> from huggingface_hub import hf_hub_download >>> model_path = hf_hub_download(repo_id="facebook/fasttext-en-nearest-neighbors", filename="model.bin") >>> model = fasttext.load_model(model_path) >>> model.get_nearest_neighbors("bread", k=5) [(0.5641006231307983, 'butter'), (0.48875734210014343, 'loaf'), (0.4491206705570221, 'eat'), (0.42444291710853577, 'food'), (0.4229326844215393, 'cheese')] ``` Here is how to use this model to detect the language of a given text: ```python >>> import fasttext >>> from huggingface_hub import hf_hub_download >>> model_path = hf_hub_download(repo_id="facebook/fasttext-language-identification", filename="model.bin") >>> model = fasttext.load_model(model_path) >>> model.predict("Hello, world!") (('__label__eng_Latn',), array([0.81148803])) >>> model.predict("Hello, world!", k=5) (('__label__eng_Latn', '__label__vie_Latn', '__label__nld_Latn', '__label__pol_Latn', '__label__deu_Latn'), array([0.61224753, 0.21323682, 0.09696738, 0.01359863, 0.01319415])) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. Cosine similarity can be used to measure the similarity between two different word vectors. If two vectors are identical, the cosine similarity will be 1. For two completely unrelated vectors, the value will be 0. If two vectors have an opposite relationship, the value will be -1. 
```python >>> import numpy as np >>> def cosine_similarity(word1, word2): >>> return np.dot(model[word1], model[word2]) / (np.linalg.norm(model[word1]) * np.linalg.norm(model[word2])) >>> cosine_similarity("man", "boy") 0.061653383 >>> cosine_similarity("man", "ceo") 0.11989131 >>> cosine_similarity("woman", "ceo") -0.08834904 ``` ## Training data Pre-trained word vectors for 157 languages were trained on [Common Crawl](http://commoncrawl.org/) and [Wikipedia](https://www.wikipedia.org/) using fastText. These models were trained using CBOW with position-weights, in dimension 300, with character n-grams of length 5, a window of size 5 and 10 negatives. We also distribute three new word analogy datasets, for French, Hindi and Polish. ## Training procedure ### Tokenization We used the [Stanford word segmenter](https://nlp.stanford.edu/software/segmenter.html) for Chinese, [Mecab](http://taku910.github.io/mecab/) for Japanese and [UETsegmenter](https://github.com/phongnt570/UETsegmenter) for Vietnamese. For languages using the Latin, Cyrillic, Hebrew or Greek scripts, we used the tokenizer from the [Europarl](https://www.statmt.org/europarl/) preprocessing tools. For the remaining languages, we used the ICU tokenizer. More information about the training of these models can be found in the article [Learning Word Vectors for 157 Languages](https://arxiv.org/abs/1802.06893). ### License The word vectors are distributed under the [*Creative Commons Attribution-Share-Alike License 3.0*](https://creativecommons.org/licenses/by-sa/3.0/). ### Evaluation datasets The analogy evaluation datasets described in the paper are available here: [French](https://dl.fbaipublicfiles.com/fasttext/word-analogies/questions-words-fr.txt), [Hindi](https://dl.fbaipublicfiles.com/fasttext/word-analogies/questions-words-hi.txt), [Polish](https://dl.fbaipublicfiles.com/fasttext/word-analogies/questions-words-pl.txt). 
### BibTeX entry and citation info Please cite [1] if using this code for learning word representations or [2] if using for text classification. [1] P. Bojanowski\*, E. Grave\*, A. Joulin, T. Mikolov, [*Enriching Word Vectors with Subword Information*](https://arxiv.org/abs/1607.04606) ```markup @article{bojanowski2016enriching, title={Enriching Word Vectors with Subword Information}, author={Bojanowski, Piotr and Grave, Edouard and Joulin, Armand and Mikolov, Tomas}, journal={arXiv preprint arXiv:1607.04606}, year={2016} } ``` [2] A. Joulin, E. Grave, P. Bojanowski, T. Mikolov, [*Bag of Tricks for Efficient Text Classification*](https://arxiv.org/abs/1607.01759) ```markup @article{joulin2016bag, title={Bag of Tricks for Efficient Text Classification}, author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Mikolov, Tomas}, journal={arXiv preprint arXiv:1607.01759}, year={2016} } ``` [3] A. Joulin, E. Grave, P. Bojanowski, M. Douze, H. Jégou, T. Mikolov, [*FastText.zip: Compressing text classification models*](https://arxiv.org/abs/1612.03651) ```markup @article{joulin2016fasttext, title={FastText.zip: Compressing text classification models}, author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Douze, Matthijs and J{\'e}gou, H{\'e}rve and Mikolov, Tomas}, journal={arXiv preprint arXiv:1612.03651}, year={2016} } ``` If you use these word vectors, please cite the following paper: [4] E. Grave\*, P. Bojanowski\*, P. Gupta, A. Joulin, T. Mikolov, [*Learning Word Vectors for 157 Languages*](https://arxiv.org/abs/1802.06893) ```markup @inproceedings{grave2018learning, title={Learning Word Vectors for 157 Languages}, author={Grave, Edouard and Bojanowski, Piotr and Gupta, Prakhar and Joulin, Armand and Mikolov, Tomas}, booktitle={Proceedings of the International Conference on Language Resources and Evaluation (LREC 2018)}, year={2018} } ``` (\* These authors contributed equally.)
{"language": "als", "library_name": "fasttext", "license": "cc-by-sa-3.0", "tags": ["feature-extraction"], "widget": [{"text": "apple", "example_title": "apple"}]}
task
[ "TEXT_CLASSIFICATION" ]
42,905
newmanbb/distilroberta-base-mrpc-glue-jhon-ramirez-nuevo
newmanbb
text-classification
[ "transformers", "pytorch", "tensorboard", "roberta", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-07-06T03:38:39Z
2023-07-07T13:49:39+00:00
21
0
--- datasets: - glue license: apache-2.0 metrics: - accuracy - f1 tags: - generated_from_trainer model-index: - name: distilroberta-base-mrpc-glue-jhon-ramirez-nuevo results: - task: type: text-classification name: Text Classification dataset: name: glue type: glue config: mrpc split: validation args: mrpc metrics: - type: accuracy value: 0.8186274509803921 name: Accuracy - type: f1 value: 0.8654545454545455 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilroberta-base-mrpc-glue-jhon-ramirez-nuevo This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.7348 - Accuracy: 0.8186 - F1: 0.8655 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.5143 | 1.09 | 500 | 0.4876 | 0.8309 | 0.8783 | | 0.3233 | 2.18 | 1000 | 0.7348 | 0.8186 | 0.8655 | ### Framework versions - Transformers 4.29.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilroberta-base-mrpc-glue-jhon-ramirez-nuevo This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.7348 - Accuracy: 0.8186 - F1: 0.8655 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.5143 | 1.09 | 500 | 0.4876 | 0.8309 | 0.8783 | | 0.3233 | 2.18 | 1000 | 0.7348 | 0.8186 | 0.8655 | ### Framework versions - Transformers 4.29.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
{"datasets": ["glue"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilroberta-base-mrpc-glue-jhon-ramirez-nuevo", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "config": "mrpc", "split": "validation", "args": "mrpc"}, "metrics": [{"type": "accuracy", "value": 0.8186274509803921, "name": "Accuracy"}, {"type": "f1", "value": 0.8654545454545455, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,906
vanderlist/distilbert-base-uncased-finetuned-emotion
vanderlist
text-classification
[ "transformers", "tensorboard", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-05-07T13:49:52Z
2024-05-08T09:00:06+00:00
7
0
--- base_model: distilbert-base-uncased datasets: - emotion license: apache-2.0 metrics: - accuracy - f1 tags: - generated_from_trainer model-index: - name: distilbert-base-uncased-finetuned-emotion results: - task: type: text-classification name: Text Classification dataset: name: emotion type: emotion config: split split: validation args: split metrics: - type: accuracy value: 0.9295 name: Accuracy - type: f1 value: 0.9294838225405171 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2208 - Accuracy: 0.9295 - F1: 0.9295 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8275 | 1.0 | 250 | 0.3187 | 0.907 | 0.9061 | | 0.2597 | 2.0 | 500 | 0.2208 | 0.9295 | 0.9295 | ### Framework versions - Transformers 4.39.1 - Pytorch 2.2.1+cpu - Datasets 2.18.0 - Tokenizers 0.15.2
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2208 - Accuracy: 0.9295 - F1: 0.9295 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8275 | 1.0 | 250 | 0.3187 | 0.907 | 0.9061 | | 0.2597 | 2.0 | 500 | 0.2208 | 0.9295 | 0.9295 | ### Framework versions - Transformers 4.39.1 - Pytorch 2.2.1+cpu - Datasets 2.18.0 - Tokenizers 0.15.2
{"base_model": "distilbert-base-uncased", "datasets": ["emotion"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "config": "split", "split": "validation", "args": "split"}, "metrics": [{"type": "accuracy", "value": 0.9295, "name": "Accuracy"}, {"type": "f1", "value": 0.9294838225405171, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,907
griffin/clinical-summary-fact-corrector
griffin
text2text-generation
[ "transformers", "pytorch", "bart", "text2text-generation", "arxiv:2204.10290", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-10-12T14:30:33Z
2022-10-18T18:00:07+00:00
17
2
--- {} --- # clinical-summary-fact-corrector HuggingFace Model Weights for the Revision Model described in EMNLP Findings '22 paper "Learning to Revise References for Faithful Summarization" [Paper Link](https://arxiv.org/abs/2204.10290#:~:text=In%20many%20real%2Dworld%20scenarios,shown%20to%20reduce%20model%20hallucinations.) --- language: - en tags: - summarization - data generator license: apache-2.0 datasets: - MIMIC-III ---
null
BioNLP
# clinical-summary-fact-corrector HuggingFace Model Weights for the Revision Model described in EMNLP Findings '22 paper "Learning to Revise References for Faithful Summarization" [Paper Link](https://arxiv.org/abs/2204.10290#:~:text=In%20many%20real%2Dworld%20scenarios,shown%20to%20reduce%20model%20hallucinations.) --- language: - en tags: - summarization - data generator license: apache-2.0 datasets: - MIMIC-III ---
{}
task
[ "SUMMARIZATION" ]
42,908
KarelDO/roberta-base.CEBaB_confounding.observational.sa.5-class.seed_43
KarelDO
null
[ "transformers", "pytorch", "roberta", "generated_from_trainer", "en", "dataset:OpenTable", "license:mit", "model-index", "endpoints_compatible", "region:us" ]
2022-10-14T02:06:43Z
2022-10-14T02:09:10+00:00
18
0
--- datasets: - OpenTable language: - en license: mit metrics: - accuracy tags: - generated_from_trainer model-index: - name: roberta-base.CEBaB_confounding.observational.sa.5-class.seed_43 results: - task: type: text-classification name: Text Classification dataset: name: OpenTable OPENTABLE type: OpenTable args: opentable metrics: - type: accuracy value: 0.698744769874477 name: Accuracy --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # roberta-base.CEBaB_confounding.observational.sa.5-class.seed_43 This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the OpenTable OPENTABLE dataset. It achieves the following results on the evaluation set: - Loss: 0.8001 - Accuracy: 0.6987 - Macro-f1: 0.6805 - Weighted-macro-f1: 0.6922 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 43 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5.0 ### Training results ### Framework versions - Transformers 4.18.0 - Pytorch 1.10.2+cu102 - Datasets 2.5.2 - Tokenizers 0.12.1
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # roberta-base.CEBaB_confounding.observational.sa.5-class.seed_43 This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the OpenTable OPENTABLE dataset. It achieves the following results on the evaluation set: - Loss: 0.8001 - Accuracy: 0.6987 - Macro-f1: 0.6805 - Weighted-macro-f1: 0.6922 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 43 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5.0 ### Training results ### Framework versions - Transformers 4.18.0 - Pytorch 1.10.2+cu102 - Datasets 2.5.2 - Tokenizers 0.12.1
{"datasets": ["OpenTable"], "language": ["en"], "license": "mit", "metrics": ["accuracy"], "tags": ["generated_from_trainer"], "model-index": [{"name": "roberta-base.CEBaB_confounding.observational.sa.5-class.seed_43", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "OpenTable OPENTABLE", "type": "OpenTable", "args": "opentable"}, "metrics": [{"type": "accuracy", "value": 0.698744769874477, "name": "Accuracy"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,909
nakker/bert-base-banking77-pt2
nakker
text-classification
[ "transformers", "pytorch", "tensorboard", "bert", "text-classification", "generated_from_trainer", "dataset:banking77", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-05-23T21:45:48Z
2023-05-23T22:00:22+00:00
9
0
--- datasets: - banking77 license: apache-2.0 metrics: - f1 tags: - generated_from_trainer model-index: - name: bert-base-banking77-pt2 results: - task: type: text-classification name: Text Classification dataset: name: banking77 type: banking77 config: default split: test args: default metrics: - type: f1 value: 0.9287229411281823 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-banking77-pt2 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the banking77 dataset. It achieves the following results on the evaluation set: - Loss: 0.3041 - F1: 0.9287 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | |:-------------:|:-----:|:----:|:---------------:|:------:| | 1.0427 | 1.0 | 626 | 0.7423 | 0.8439 | | 0.3703 | 2.0 | 1252 | 0.3573 | 0.9200 | | 0.174 | 3.0 | 1878 | 0.3041 | 0.9287 | ### Framework versions - Transformers 4.29.2 - Pytorch 2.0.1 - Datasets 2.12.0 - Tokenizers 0.11.0
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-banking77-pt2 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the banking77 dataset. It achieves the following results on the evaluation set: - Loss: 0.3041 - F1: 0.9287 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | |:-------------:|:-----:|:----:|:---------------:|:------:| | 1.0427 | 1.0 | 626 | 0.7423 | 0.8439 | | 0.3703 | 2.0 | 1252 | 0.3573 | 0.9200 | | 0.174 | 3.0 | 1878 | 0.3041 | 0.9287 | ### Framework versions - Transformers 4.29.2 - Pytorch 2.0.1 - Datasets 2.12.0 - Tokenizers 0.11.0
{"datasets": ["banking77"], "license": "apache-2.0", "metrics": ["f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "bert-base-banking77-pt2", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "banking77", "type": "banking77", "config": "default", "split": "test", "args": "default"}, "metrics": [{"type": "f1", "value": 0.9287229411281823, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,910
ivanleomk/mpnet-base-all-nli-triplet
ivanleomk
sentence-similarity
[ "sentence-transformers", "safetensors", "mpnet", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:557850", "loss:MultipleNegativesRankingLoss", "en", "dataset:sentence-transformers/all-nli", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:microsoft/mpnet-base", "base_model:finetune:microsoft/mpnet-base", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-11-07T11:16:19Z
2024-11-07T11:16:48+00:00
10
0
--- base_model: microsoft/mpnet-base datasets: - sentence-transformers/all-nli language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy - dot_accuracy - manhattan_accuracy - euclidean_accuracy - max_accuracy pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:557850 - loss:MultipleNegativesRankingLoss widget: - source_sentence: A man is jumping unto his filthy bed. sentences: - A young male is looking at a newspaper while 2 females walks past him. - The bed is dirty. - The man is on the moon. - source_sentence: A carefully balanced male stands on one foot near a clean ocean beach area. sentences: - A man is ouside near the beach. - Three policemen patrol the streets on bikes - A man is sitting on his couch. - source_sentence: The man is wearing a blue shirt. sentences: - Near the trashcan the man stood and smoked - A man in a blue shirt leans on a wall beside a road with a blue van and red car with water in the background. - A man in a black shirt is playing a guitar. - source_sentence: The girls are outdoors. sentences: - Two girls riding on an amusement part ride. - a guy laughs while doing laundry - Three girls are standing together in a room, one is listening, one is writing on a wall and the third is talking to them. - source_sentence: A construction worker peeking out of a manhole while his coworker sits on the sidewalk smiling. sentences: - A worker is looking out of a manhole. - A man is giving a presentation. - The workers are both inside the manhole. 
model-index: - name: MPNet base trained on AllNLI triplets results: - task: type: triplet name: Triplet dataset: name: all nli dev type: all-nli-dev metrics: - type: cosine_accuracy value: 0.6210510328068044 name: Cosine Accuracy - type: dot_accuracy value: 0.45337181044957475 name: Dot Accuracy - type: manhattan_accuracy value: 0.6831713244228432 name: Manhattan Accuracy - type: euclidean_accuracy value: 0.62226609963548 name: Euclidean Accuracy - type: max_accuracy value: 0.6831713244228432 name: Max Accuracy - task: type: triplet name: Triplet dataset: name: all nli test type: all-nli-test metrics: - type: cosine_accuracy value: 0.6665153578453624 name: Cosine Accuracy - type: dot_accuracy value: 0.4428809199576335 name: Dot Accuracy - type: manhattan_accuracy value: 0.7280980481162052 name: Manhattan Accuracy - type: euclidean_accuracy value: 0.6639431078831896 name: Euclidean Accuracy - type: max_accuracy value: 0.7280980481162052 name: Max Accuracy --- # MPNet base trained on AllNLI triplets This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) on the [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) <!-- at revision 6996ce1e91bd2a9c7d7f61daec37463394f73f09 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 tokens - **Similarity Function:** Cosine Similarity - **Training Dataset:** - [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. 
```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("ivanleomk/mpnet-base-all-nli-triplet") # Run inference sentences = [ 'A construction worker peeking out of a manhole while his coworker sits on the sidewalk smiling.', 'A worker is looking out of a manhole.', 'The workers are both inside the manhole.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Triplet * Dataset: `all-nli-dev` * Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) | Metric | Value | |:-------------------|:-----------| | cosine_accuracy | 0.6211 | | dot_accuracy | 0.4534 | | manhattan_accuracy | 0.6832 | | euclidean_accuracy | 0.6223 | | **max_accuracy** | **0.6832** | #### Triplet * Dataset: `all-nli-test` * Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) | Metric | Value | |:-------------------|:-----------| | cosine_accuracy | 0.6665 | | dot_accuracy | 0.4429 | | manhattan_accuracy | 0.7281 | | euclidean_accuracy | 0.6639 | | **max_accuracy** | **0.7281** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? 
You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### all-nli * Dataset: [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab) * Size: 557,850 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 14.85 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 13.55 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 15.1 tokens</li><li>max: 30 tokens</li></ul> | * Samples: | anchor | positive | negative | |:---------------------------------------------------------------------------|:-------------------------------------------------|:-----------------------------------------------------------| | <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is outdoors, on a horse.</code> | <code>A person is at a diner, ordering an omelette.</code> | | <code>Children smiling and waving at camera</code> | <code>There are children present</code> | <code>The kids are frowning</code> | | <code>A boy is jumping on skateboard in the middle of a red bridge.</code> | <code>The boy does a skateboarding trick.</code> | <code>The boy skates down the sidewalk.</code> | * Loss: 
[<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### all-nli * Dataset: [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab) * Size: 6,584 evaluation samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 6 tokens</li><li>mean: 17.95 tokens</li><li>max: 63 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.78 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.35 tokens</li><li>max: 29 tokens</li></ul> | * Samples: | anchor | positive | negative | |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------|:--------------------------------------------------------| | <code>Two women are embracing while holding to go packages.</code> | <code>Two woman are holding packages.</code> | <code>The men are fighting outside a deli.</code> | | <code>Two young children in blue jerseys, one with the number 9 and one with the number 2 are standing on wooden steps in a bathroom and washing their hands in a sink.</code> | <code>Two kids in numbered jerseys wash their hands.</code> | 
<code>Two kids in jackets walk to school.</code> | | <code>A man selling donuts to a customer during a world exhibition event held in the city of Angeles</code> | <code>A man selling donuts to a customer.</code> | <code>A woman drinks her coffee in a small cafe.</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 1 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 1 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - 
`fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - 
`optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `eval_use_gather_object`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | all-nli-dev_max_accuracy | all-nli-test_max_accuracy | |:-----:|:----:|:------------------------:|:-------------------------:| | 0 | 0 | 0.6832 | - | | 1.0 | 7 | - | 0.7281 | ### Framework Versions - Python: 3.10.12 - Sentence Transformers: 3.2.1 - Transformers: 4.44.2 - PyTorch: 2.5.0+cu121 - Accelerate: 0.34.2 - Datasets: 3.1.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
null
Non_BioNLP
# MPNet base trained on AllNLI triplets This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) on the [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) <!-- at revision 6996ce1e91bd2a9c7d7f61daec37463394f73f09 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 tokens - **Similarity Function:** Cosine Similarity - **Training Dataset:** - [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. 
```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("ivanleomk/mpnet-base-all-nli-triplet") # Run inference sentences = [ 'A construction worker peeking out of a manhole while his coworker sits on the sidewalk smiling.', 'A worker is looking out of a manhole.', 'The workers are both inside the manhole.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Triplet * Dataset: `all-nli-dev` * Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) | Metric | Value | |:-------------------|:-----------| | cosine_accuracy | 0.6211 | | dot_accuracy | 0.4534 | | manhattan_accuracy | 0.6832 | | euclidean_accuracy | 0.6223 | | **max_accuracy** | **0.6832** | #### Triplet * Dataset: `all-nli-test` * Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) | Metric | Value | |:-------------------|:-----------| | cosine_accuracy | 0.6665 | | dot_accuracy | 0.4429 | | manhattan_accuracy | 0.7281 | | euclidean_accuracy | 0.6639 | | **max_accuracy** | **0.7281** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? 
You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### all-nli * Dataset: [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab) * Size: 557,850 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 14.85 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 13.55 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 15.1 tokens</li><li>max: 30 tokens</li></ul> | * Samples: | anchor | positive | negative | |:---------------------------------------------------------------------------|:-------------------------------------------------|:-----------------------------------------------------------| | <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is outdoors, on a horse.</code> | <code>A person is at a diner, ordering an omelette.</code> | | <code>Children smiling and waving at camera</code> | <code>There are children present</code> | <code>The kids are frowning</code> | | <code>A boy is jumping on skateboard in the middle of a red bridge.</code> | <code>The boy does a skateboarding trick.</code> | <code>The boy skates down the sidewalk.</code> | * Loss: 
[<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### all-nli * Dataset: [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab) * Size: 6,584 evaluation samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 6 tokens</li><li>mean: 17.95 tokens</li><li>max: 63 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.78 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.35 tokens</li><li>max: 29 tokens</li></ul> | * Samples: | anchor | positive | negative | |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------|:--------------------------------------------------------| | <code>Two women are embracing while holding to go packages.</code> | <code>Two woman are holding packages.</code> | <code>The men are fighting outside a deli.</code> | | <code>Two young children in blue jerseys, one with the number 9 and one with the number 2 are standing on wooden steps in a bathroom and washing their hands in a sink.</code> | <code>Two kids in numbered jerseys wash their hands.</code> | 
<code>Two kids in jackets walk to school.</code> | | <code>A man selling donuts to a customer during a world exhibition event held in the city of Angeles</code> | <code>A man selling donuts to a customer.</code> | <code>A woman drinks her coffee in a small cafe.</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 1 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 1 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - 
`fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - 
`optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `eval_use_gather_object`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | all-nli-dev_max_accuracy | all-nli-test_max_accuracy | |:-----:|:----:|:------------------------:|:-------------------------:| | 0 | 0 | 0.6832 | - | | 1.0 | 7 | - | 0.7281 | ### Framework Versions - Python: 3.10.12 - Sentence Transformers: 3.2.1 - Transformers: 4.44.2 - PyTorch: 2.5.0+cu121 - Accelerate: 0.34.2 - Datasets: 3.1.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
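The triplet accuracies in the tables above all measure the same thing — how often the anchor's embedding lands closer to the positive than to the negative — under different similarity or distance functions. A minimal, dependency-free sketch of that computation (illustrative only; the reported numbers come from the library's `TripletEvaluator`):

```python
import math

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def euclidean(a, b):
    # L2 distance.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    # L1 distance.
    return sum(abs(x - y) for x, y in zip(a, b))

def triplet_accuracy(triplets, sim=None, dist=None):
    """Fraction of (anchor, positive, negative) triplets where the
    positive is closer to the anchor than the negative is."""
    correct = 0
    for anchor, pos, neg in triplets:
        if sim is not None:
            ok = sim(anchor, pos) > sim(anchor, neg)
        else:
            ok = dist(anchor, pos) < dist(anchor, neg)
        correct += ok
    return correct / len(triplets)

# Toy 3-d "embeddings": each positive sits near its anchor, each negative does not.
triplets = [
    ([1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.0, 1.0, 0.0]),
    ([0.0, 1.0, 0.0], [0.1, 0.9, 0.0], [1.0, 0.0, 0.0]),
]
print(triplet_accuracy(triplets, sim=cosine))      # 1.0
print(triplet_accuracy(triplets, dist=euclidean))  # 1.0
print(triplet_accuracy(triplets, dist=manhattan))  # 1.0
```

`max_accuracy` is then simply the best of the per-metric accuracies.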
{"base_model": "microsoft/mpnet-base", "datasets": ["sentence-transformers/all-nli"], "language": ["en"], "library_name": "sentence-transformers", "license": "apache-2.0", "metrics": ["cosine_accuracy", "dot_accuracy", "manhattan_accuracy", "euclidean_accuracy", "max_accuracy"], "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:557850", "loss:MultipleNegativesRankingLoss"], "widget": [{"source_sentence": "A man is jumping unto his filthy bed.", "sentences": ["A young male is looking at a newspaper while 2 females walks past him.", "The bed is dirty.", "The man is on the moon."]}, {"source_sentence": "A carefully balanced male stands on one foot near a clean ocean beach area.", "sentences": ["A man is ouside near the beach.", "Three policemen patrol the streets on bikes", "A man is sitting on his couch."]}, {"source_sentence": "The man is wearing a blue shirt.", "sentences": ["Near the trashcan the man stood and smoked", "A man in a blue shirt leans on a wall beside a road with a blue van and red car with water in the background.", "A man in a black shirt is playing a guitar."]}, {"source_sentence": "The girls are outdoors.", "sentences": ["Two girls riding on an amusement part ride.", "a guy laughs while doing laundry", "Three girls are standing together in a room, one is listening, one is writing on a wall and the third is talking to them."]}, {"source_sentence": "A construction worker peeking out of a manhole while his coworker sits on the sidewalk smiling.", "sentences": ["A worker is looking out of a manhole.", "A man is giving a presentation.", "The workers are both inside the manhole."]}], "model-index": [{"name": "MPNet base trained on AllNLI triplets", "results": [{"task": {"type": "triplet", "name": "Triplet"}, "dataset": {"name": "all nli dev", "type": "all-nli-dev"}, "metrics": [{"type": "cosine_accuracy", "value": 0.6210510328068044, "name": 
"Cosine Accuracy"}, {"type": "dot_accuracy", "value": 0.45337181044957475, "name": "Dot Accuracy"}, {"type": "manhattan_accuracy", "value": 0.6831713244228432, "name": "Manhattan Accuracy"}, {"type": "euclidean_accuracy", "value": 0.62226609963548, "name": "Euclidean Accuracy"}, {"type": "max_accuracy", "value": 0.6831713244228432, "name": "Max Accuracy"}]}, {"task": {"type": "triplet", "name": "Triplet"}, "dataset": {"name": "all nli test", "type": "all-nli-test"}, "metrics": [{"type": "cosine_accuracy", "value": 0.6665153578453624, "name": "Cosine Accuracy"}, {"type": "dot_accuracy", "value": 0.4428809199576335, "name": "Dot Accuracy"}, {"type": "manhattan_accuracy", "value": 0.7280980481162052, "name": "Manhattan Accuracy"}, {"type": "euclidean_accuracy", "value": 0.6639431078831896, "name": "Euclidean Accuracy"}, {"type": "max_accuracy", "value": 0.7280980481162052, "name": "Max Accuracy"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,911
fine-tuned/NFCorpus-256-24-gpt-4o-2024-05-13-687872
fine-tuned
feature-extraction
[ "sentence-transformers", "safetensors", "xlm-roberta", "feature-extraction", "sentence-similarity", "mteb", "en", "dataset:fine-tuned/NFCorpus-256-24-gpt-4o-2024-05-13-687872", "dataset:allenai/c4", "license:apache-2.0", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
2024-05-23T13:29:54Z
2024-05-23T13:30:51+00:00
10
0
--- datasets: - fine-tuned/NFCorpus-256-24-gpt-4o-2024-05-13-687872 - allenai/c4 language: - en license: apache-2.0 pipeline_tag: feature-extraction tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb --- This model is a fine-tuned version of [**BAAI/bge-m3**](https://huggingface.co/BAAI/bge-m3) designed for the following use case: custom ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/NFCorpus-256-24-gpt-4o-2024-05-13-687872', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
null
Non_BioNLP
This model is a fine-tuned version of [**BAAI/bge-m3**](https://huggingface.co/BAAI/bge-m3) designed for the following use case: custom ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/NFCorpus-256-24-gpt-4o-2024-05-13-687872', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
{"datasets": ["fine-tuned/NFCorpus-256-24-gpt-4o-2024-05-13-687872", "allenai/c4"], "language": ["en"], "license": "apache-2.0", "pipeline_tag": "feature-extraction", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "mteb"]}
task
[ "TEXT_CLASSIFICATION" ]
42,912
Baldezo313/distilbert-base-uncased-finetuned-emotion
Baldezo313
text-classification
[ "transformers", "pytorch", "tensorboard", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-11-16T01:35:38Z
2023-11-20T16:50:40+00:00
113
0
--- datasets: - emotion license: apache-2.0 metrics: - accuracy - f1 tags: - generated_from_trainer model-index: - name: distilbert-base-uncased-finetuned-emotion results: - task: type: text-classification name: Text Classification dataset: name: emotion type: emotion args: split metrics: - type: accuracy value: 0.9245 name: Accuracy - type: f1 value: 0.9245052592082995 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2204 - Accuracy: 0.9245 - F1: 0.9245 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8204 | 1.0 | 250 | 0.3100 | 0.911 | 0.9093 | | 0.2495 | 2.0 | 500 | 0.2204 | 0.9245 | 0.9245 | ### Framework versions - Transformers 4.16.2 - Pytorch 2.1.0+cu118 - Datasets 1.16.1 - Tokenizers 0.15.0
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2204 - Accuracy: 0.9245 - F1: 0.9245 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8204 | 1.0 | 250 | 0.3100 | 0.911 | 0.9093 | | 0.2495 | 2.0 | 500 | 0.2204 | 0.9245 | 0.9245 | ### Framework versions - Transformers 4.16.2 - Pytorch 2.1.0+cu118 - Datasets 1.16.1 - Tokenizers 0.15.0
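The card reports Accuracy and F1 on the six-class emotion dataset. As a reminder of what those two numbers mean, here is a small dependency-free sketch of the underlying arithmetic — the card's values are presumably computed with `sklearn`/`evaluate` in the training script, and the toy labels below are made up for illustration:

```python
from collections import Counter

def accuracy(y_true, y_pred):
    # Fraction of exact label matches.
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_f1(y_true, y_pred):
    """Per-class F1 averaged with each class weighted by its support,
    matching f1_score(..., average="weighted")."""
    support = Counter(y_true)
    total = 0.0
    for cls, n in support.items():
        tp = sum(t == cls and p == cls for t, p in zip(y_true, y_pred))
        fp = sum(t != cls and p == cls for t, p in zip(y_true, y_pred))
        fn = sum(t == cls and p != cls for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        total += n * f1
    return total / len(y_true)

# Toy labels over emotion class ids (the dataset uses 0..5).
y_true = [0, 0, 1, 1, 2, 3]
y_pred = [0, 1, 1, 1, 2, 3]
print(accuracy(y_true, y_pred))      # 5/6 ≈ 0.8333
print(weighted_f1(y_true, y_pred))   # ≈ 0.8222
```

When accuracy and weighted F1 are nearly identical, as in the table above (0.9245 for both), the per-class error rates are roughly balanced across classes.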
{"datasets": ["emotion"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "args": "split"}, "metrics": [{"type": "accuracy", "value": 0.9245, "name": "Accuracy"}, {"type": "f1", "value": 0.9245052592082995, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,913
Isotonic/plan_t5
Isotonic
text2text-generation
[ "transformers", "safetensors", "t5", "text2text-generation", "generated_from_trainer", "dataset:Isotonic/planner_dataset", "base_model:google/t5-small-lm-adapt", "base_model:finetune:google/t5-small-lm-adapt", "license:apache-2.0", "model-index", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
2024-04-07T00:32:15Z
2024-04-07T00:58:25+00:00
176
0
--- base_model: google/t5-small-lm-adapt datasets: - Isotonic/planner_dataset license: apache-2.0 metrics: - rouge tags: - generated_from_trainer model-index: - name: plan_t5 results: - task: type: summarization name: Summarization dataset: name: Isotonic/planner_dataset type: Isotonic/planner_dataset metrics: - type: rouge value: 58.1228 name: Rouge1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # plan_t5 This model is a fine-tuned version of [google/t5-small-lm-adapt](https://huggingface.co/google/t5-small-lm-adapt) on the Isotonic/planner_dataset dataset. It achieves the following results on the evaluation set: - Loss: 1.4366 - Rouge1: 58.1228 - Rouge2: 24.3461 - Rougel: 58.1313 - Rougelsum: 58.1335 - Gen Len: 7.9747 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 37 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.2 - num_epochs: 5.0 ### Training results ### Framework versions - Transformers 4.39.3 - Pytorch 2.1.0+cu118 - Datasets 2.18.0 - Tokenizers 0.15.2
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # plan_t5 This model is a fine-tuned version of [google/t5-small-lm-adapt](https://huggingface.co/google/t5-small-lm-adapt) on the Isotonic/planner_dataset dataset. It achieves the following results on the evaluation set: - Loss: 1.4366 - Rouge1: 58.1228 - Rouge2: 24.3461 - Rougel: 58.1313 - Rougelsum: 58.1335 - Gen Len: 7.9747 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 37 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.2 - num_epochs: 5.0 ### Training results ### Framework versions - Transformers 4.39.3 - Pytorch 2.1.0+cu118 - Datasets 2.18.0 - Tokenizers 0.15.2
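For context on the Rouge1 value above: ROUGE-1 is an F-measure over unigram overlap between the generated text and the reference. A simplified, dependency-free sketch of the computation (the reported score comes from the `rouge` metric, which also applies its own tokenization and aggregates over the whole eval set; note the card reports the score scaled by 100):

```python
from collections import Counter

def rouge1_f(reference: str, candidate: str) -> float:
    """Unigram-overlap ROUGE-1 F-measure (simplified sketch)."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f("book a table for two", "book a table for two people"))  # 10/11 ≈ 0.909
```

A Rouge1 of 58.12 therefore means that, on average, a bit over half of the unigrams line up between the model's output and the reference plan.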
{"base_model": "google/t5-small-lm-adapt", "datasets": ["Isotonic/planner_dataset"], "license": "apache-2.0", "metrics": ["rouge"], "tags": ["generated_from_trainer"], "model-index": [{"name": "plan_t5", "results": [{"task": {"type": "summarization", "name": "Summarization"}, "dataset": {"name": "Isotonic/planner_dataset", "type": "Isotonic/planner_dataset"}, "metrics": [{"type": "rouge", "value": 58.1228, "name": "Rouge1"}]}]}]}
task
[ "SUMMARIZATION" ]
42,914
YakovElm/IntelDAOS5SetFitModel_Train_balance_ratio_2
YakovElm
text-classification
[ "sentence-transformers", "pytorch", "mpnet", "setfit", "text-classification", "arxiv:2209.11055", "license:apache-2.0", "region:us" ]
2023-06-10T04:53:40Z
2023-06-10T04:54:14+00:00
8
0
--- license: apache-2.0 pipeline_tag: text-classification tags: - setfit - sentence-transformers - text-classification --- # YakovElm/IntelDAOS5SetFitModel_Train_balance_ratio_2 This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Usage To use this model for inference, first install the SetFit library: ```bash python -m pip install setfit ``` You can then run inference as follows: ```python from setfit import SetFitModel # Download from Hub and run inference model = SetFitModel.from_pretrained("YakovElm/IntelDAOS5SetFitModel_Train_balance_ratio_2") # Run inference preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"]) ``` ## BibTeX entry and citation info ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
null
Non_BioNLP
# YakovElm/IntelDAOS5SetFitModel_Train_balance_ratio_2 This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Usage To use this model for inference, first install the SetFit library: ```bash python -m pip install setfit ``` You can then run inference as follows: ```python from setfit import SetFitModel # Download from Hub and run inference model = SetFitModel.from_pretrained("YakovElm/IntelDAOS5SetFitModel_Train_balance_ratio_2") # Run inference preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"]) ``` ## BibTeX entry and citation info ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
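The two-step recipe described above can be made concrete. Step 1 fine-tunes the sentence transformer on pairs built from the few labeled examples; a simplified sketch of that pair construction (SetFit's actual sampling strategy is more involved — this only shows the idea, and the example texts are hypothetical):

```python
from itertools import combinations

def contrastive_pairs(examples):
    """Build (text_a, text_b, label) pairs for the contrastive
    fine-tuning step: label 1 if the two texts share a class, else 0.
    A simplified sketch of SetFit-style pair generation."""
    pairs = []
    for (text_a, cls_a), (text_b, cls_b) in combinations(examples, 2):
        pairs.append((text_a, text_b, int(cls_a == cls_b)))
    return pairs

few_shot = [
    ("i loved the spiderman movie!", "positive"),
    ("what a fantastic soundtrack", "positive"),
    ("pineapple on pizza is the worst", "negative"),
]
for pair in contrastive_pairs(few_shot):
    print(pair)
```

Step 2 then fits a lightweight classification head (e.g. logistic regression) on embeddings from the fine-tuned encoder.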
{"license": "apache-2.0", "pipeline_tag": "text-classification", "tags": ["setfit", "sentence-transformers", "text-classification"]}
task
[ "TEXT_CLASSIFICATION" ]
42,915
LarkAI/bart_large_nl2sql
LarkAI
text2text-generation
[ "transformers", "pytorch", "bart", "feature-extraction", "nl2sql", "text2text-generation", "en", "dataset:wikisql", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2023-05-19T09:45:29Z
2023-05-23T09:42:59+00:00
110
3
--- datasets: - wikisql language: - en license: apache-2.0 pipeline_tag: text2text-generation tags: - nl2sql widget: - text: 'question: get people name with age less 25 table: id, name, age' example_title: less than - text: 'question: get people name with age equal 25 table: id, name, age' example_title: equal --- new version: [LarkAI/codet5p-770m_nl2sql_oig](https://huggingface.co/LarkAI/codet5p-770m_nl2sql_oig), which uses the OIG-SQL dataset and supports more complex SQL parsing # How to Use ```python import torch from transformers import AutoTokenizer, BartForConditionalGeneration device = torch.device('cuda:0') tokenizer = AutoTokenizer.from_pretrained("LarkAI/bart_large_nl2sql") model = BartForConditionalGeneration.from_pretrained("LarkAI/bart_large_nl2sql").to(device) text = "question: get people name with age less 25 table: id, name, age" inputs = tokenizer([text], max_length=1024, return_tensors="pt") output_ids = model.generate(inputs["input_ids"].to(device), num_beams=4, max_length=128, min_length=8) response_text = tokenizer.batch_decode(output_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0] # SELECT name FROM table WHERE age < 25 ``` reference: [juierror/flan-t5-text2sql-with-schema](https://huggingface.co/juierror/flan-t5-text2sql-with-schema) - fix this [discussion](https://huggingface.co/juierror/flan-t5-text2sql-with-schema/discussions/5) # How to Train Quick start: https://github.com/huggingface/transformers/blob/main/examples/pytorch/summarization/README.md
null
Non_BioNLP
new version: [LarkAI/codet5p-770m_nl2sql_oig](https://huggingface.co/LarkAI/codet5p-770m_nl2sql_oig), which uses the OIG-SQL dataset and supports more complex SQL parsing # How to Use ```python import torch from transformers import AutoTokenizer, BartForConditionalGeneration device = torch.device('cuda:0') tokenizer = AutoTokenizer.from_pretrained("LarkAI/bart_large_nl2sql") model = BartForConditionalGeneration.from_pretrained("LarkAI/bart_large_nl2sql").to(device) text = "question: get people name with age less 25 table: id, name, age" inputs = tokenizer([text], max_length=1024, return_tensors="pt") output_ids = model.generate(inputs["input_ids"].to(device), num_beams=4, max_length=128, min_length=8) response_text = tokenizer.batch_decode(output_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0] # SELECT name FROM table WHERE age < 25 ``` reference: [juierror/flan-t5-text2sql-with-schema](https://huggingface.co/juierror/flan-t5-text2sql-with-schema) - fix this [discussion](https://huggingface.co/juierror/flan-t5-text2sql-with-schema/discussions/5) # How to Train Quick start: https://github.com/huggingface/transformers/blob/main/examples/pytorch/summarization/README.md
{"datasets": ["wikisql"], "language": ["en"], "license": "apache-2.0", "pipeline_tag": "text2text-generation", "tags": ["nl2sql"], "widget": [{"text": "question: get people name with age less 25 table: id, name, age", "example_title": "less than"}, {"text": "question: get people name with age equal 25 table: id, name, age", "example_title": "equal"}]}
task
[ "SUMMARIZATION" ]
42,916
pinglarin/summarization_papers
pinglarin
text2text-generation
[ "transformers", "pytorch", "bart", "text2text-generation", "generated_from_trainer", "dataset:ccdv/arxiv-summarization", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-04-23T17:10:57Z
2023-04-24T05:57:43+00:00
9
3
--- datasets: - ccdv/arxiv-summarization license: apache-2.0 metrics: - rouge tags: - generated_from_trainer model-index: - name: results results: - task: type: summarization name: Summarization dataset: name: ccdv/arxiv-summarization type: ccdv/arxiv-summarization config: section split: validation args: section metrics: - type: rouge value: 35.6639 name: Rouge1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # results This model is a fine-tuned version of [sshleifer/distilbart-xsum-12-1](https://huggingface.co/sshleifer/distilbart-xsum-12-1) on the ccdv/arxiv-summarization dataset. It achieves the following results on the evaluation set: - Loss: 4.3066 - Rouge1: 35.6639 - Rouge2: 10.5717 - Rougel: 21.095 - Rougelsum: 31.2685 - Gen Len: 81.44 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - Transformers 4.29.0.dev0 - Pytorch 2.0.0 - Datasets 2.10.1 - Tokenizers 0.13.2
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # results This model is a fine-tuned version of [sshleifer/distilbart-xsum-12-1](https://huggingface.co/sshleifer/distilbart-xsum-12-1) on the ccdv/arxiv-summarization dataset. It achieves the following results on the evaluation set: - Loss: 4.3066 - Rouge1: 35.6639 - Rouge2: 10.5717 - Rougel: 21.095 - Rougelsum: 31.2685 - Gen Len: 81.44 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - Transformers 4.29.0.dev0 - Pytorch 2.0.0 - Datasets 2.10.1 - Tokenizers 0.13.2
{"datasets": ["ccdv/arxiv-summarization"], "license": "apache-2.0", "metrics": ["rouge"], "tags": ["generated_from_trainer"], "model-index": [{"name": "results", "results": [{"task": {"type": "summarization", "name": "Summarization"}, "dataset": {"name": "ccdv/arxiv-summarization", "type": "ccdv/arxiv-summarization", "config": "section", "split": "validation", "args": "section"}, "metrics": [{"type": "rouge", "value": 35.6639, "name": "Rouge1"}]}]}]}
task
[ "SUMMARIZATION" ]
42,917
satheeshTM/distilbert-base-uncased-finetuned-emotion
satheeshTM
text-classification
[ "transformers", "tensorboard", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-01-10T04:50:32Z
2024-02-16T07:04:13+00:00
3
0
--- base_model: distilbert-base-uncased datasets: - emotion license: apache-2.0 metrics: - accuracy - f1 tags: - generated_from_trainer model-index: - name: distilbert-base-uncased-finetuned-emotion results: - task: type: text-classification name: Text Classification dataset: name: emotion type: emotion config: split split: validation args: split metrics: - type: accuracy value: 0.9265 name: Accuracy - type: f1 value: 0.9263544647982521 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2262 - Accuracy: 0.9265 - F1: 0.9264 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8603 | 1.0 | 250 | 0.3328 | 0.902 | 0.9014 | | 0.2575 | 2.0 | 500 | 0.2262 | 0.9265 | 0.9264 | ### Framework versions - Transformers 4.36.2 - Pytorch 2.1.0+cu121 - Datasets 2.16.1 - Tokenizers 0.15.0
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2262 - Accuracy: 0.9265 - F1: 0.9264 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8603 | 1.0 | 250 | 0.3328 | 0.902 | 0.9014 | | 0.2575 | 2.0 | 500 | 0.2262 | 0.9265 | 0.9264 | ### Framework versions - Transformers 4.36.2 - Pytorch 2.1.0+cu121 - Datasets 2.16.1 - Tokenizers 0.15.0
{"base_model": "distilbert-base-uncased", "datasets": ["emotion"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "config": "split", "split": "validation", "args": "split"}, "metrics": [{"type": "accuracy", "value": 0.9265, "name": "Accuracy"}, {"type": "f1", "value": 0.9263544647982521, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,918
BSC-LT/roberta-base-bne-capitel-ner
BSC-LT
token-classification
[ "transformers", "pytorch", "roberta", "token-classification", "national library of spain", "spanish", "bne", "capitel", "ner", "es", "dataset:bne", "dataset:capitel", "arxiv:1907.11692", "arxiv:2107.07253", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04Z
2021-10-21T10:29:35+00:00
190
1
--- datasets: - bne - capitel language: - es license: apache-2.0 metrics: - f1 tags: - national library of spain - spanish - bne - capitel - ner --- **⚠️NOTICE⚠️: THIS MODEL HAS BEEN MOVED TO THE FOLLOWING URL AND WILL SOON BE REMOVED:** https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne-capitel-ner # Spanish RoBERTa-base trained on BNE, fine-tuned on the CAPITEL Named Entity Recognition (NER) dataset. RoBERTa-base-bne is a transformer-based masked language model for the Spanish language. It is based on the [RoBERTa](https://arxiv.org/abs/1907.11692) base model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019. The original pre-trained model can be found here: https://huggingface.co/BSC-TeMU/roberta-base-bne ## Dataset The dataset used is the one from the [CAPITEL competition at IberLEF 2020](https://sites.google.com/view/capitel2020) (sub-task 1). ## Evaluation and results F1 Score: 0.8960 For evaluation details visit our [GitHub repository](https://github.com/PlanTL-SANIDAD/lm-spanish). ## Citing Check out our paper for all the details: https://arxiv.org/abs/2107.07253 ``` @misc{gutierrezfandino2021spanish, title={Spanish Language Models}, author={Asier Gutiérrez-Fandiño and Jordi Armengol-Estapé and Marc Pàmies and Joan Llop-Palao and Joaquín Silveira-Ocampo and Casimiro Pio Carrino and Aitor Gonzalez-Agirre and Carme Armentano-Oller and Carlos Rodriguez-Penagos and Marta Villegas}, year={2021}, eprint={2107.07253}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
null
Non_BioNLP
**⚠️NOTICE⚠️: THIS MODEL HAS BEEN MOVED TO THE FOLLOWING URL AND WILL SOON BE REMOVED:** https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne-capitel-ner # Spanish RoBERTa-base trained on BNE, fine-tuned on the CAPITEL Named Entity Recognition (NER) dataset. RoBERTa-base-bne is a transformer-based masked language model for the Spanish language. It is based on the [RoBERTa](https://arxiv.org/abs/1907.11692) base model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019. The original pre-trained model can be found here: https://huggingface.co/BSC-TeMU/roberta-base-bne ## Dataset The dataset used is the one from the [CAPITEL competition at IberLEF 2020](https://sites.google.com/view/capitel2020) (sub-task 1). ## Evaluation and results F1 Score: 0.8960 For evaluation details visit our [GitHub repository](https://github.com/PlanTL-SANIDAD/lm-spanish). ## Citing Check out our paper for all the details: https://arxiv.org/abs/2107.07253 ``` @misc{gutierrezfandino2021spanish, title={Spanish Language Models}, author={Asier Gutiérrez-Fandiño and Jordi Armengol-Estapé and Marc Pàmies and Joan Llop-Palao and Joaquín Silveira-Ocampo and Casimiro Pio Carrino and Aitor Gonzalez-Agirre and Carme Armentano-Oller and Carlos Rodriguez-Penagos and Marta Villegas}, year={2021}, eprint={2107.07253}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
{"datasets": ["bne", "capitel"], "language": ["es"], "license": "apache-2.0", "metrics": ["f1"], "tags": ["national library of spain", "spanish", "bne", "capitel", "ner"]}
task
[ "NAMED_ENTITY_RECOGNITION" ]
42,919
akashjoy/distilbert-base-uncased-finetuned-emotion
akashjoy
text-classification
[ "transformers", "tensorboard", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-03-22T04:37:15Z
2024-03-26T02:10:45+00:00
4
0
--- base_model: distilbert-base-uncased datasets: - emotion license: apache-2.0 metrics: - f1 - accuracy tags: - generated_from_trainer model-index: - name: distilbert-base-uncased-finetuned-emotion results: - task: type: text-classification name: Text Classification dataset: name: emotion type: emotion config: split split: validation args: split metrics: - type: f1 value: 0.9333997935723345 name: F1 - type: accuracy value: 0.9335 name: Accuracy --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.1499 - F1: 0.9334 - Accuracy: 0.9335 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------:|:--------:| | 0.7725 | 1.0 | 250 | 0.2686 | 0.9184 | 0.918 | | 0.2092 | 2.0 | 500 | 0.1734 | 0.9330 | 0.933 | | 0.1394 | 3.0 | 750 | 0.1623 | 0.9356 | 0.935 | | 0.1095 | 4.0 | 1000 | 0.1449 | 0.9368 | 0.937 | | 0.0914 | 5.0 | 1250 | 0.1499 | 0.9334 | 0.9335 | ### Framework versions - Transformers 4.38.2 - Pytorch 2.2.1+cu118 - Datasets 2.18.0 - Tokenizers 0.15.2
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.1499 - F1: 0.9334 - Accuracy: 0.9335 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------:|:--------:| | 0.7725 | 1.0 | 250 | 0.2686 | 0.9184 | 0.918 | | 0.2092 | 2.0 | 500 | 0.1734 | 0.9330 | 0.933 | | 0.1394 | 3.0 | 750 | 0.1623 | 0.9356 | 0.935 | | 0.1095 | 4.0 | 1000 | 0.1449 | 0.9368 | 0.937 | | 0.0914 | 5.0 | 1250 | 0.1499 | 0.9334 | 0.9335 | ### Framework versions - Transformers 4.38.2 - Pytorch 2.2.1+cu118 - Datasets 2.18.0 - Tokenizers 0.15.2
{"base_model": "distilbert-base-uncased", "datasets": ["emotion"], "license": "apache-2.0", "metrics": ["f1", "accuracy"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "config": "split", "split": "validation", "args": "split"}, "metrics": [{"type": "f1", "value": 0.9333997935723345, "name": "F1"}, {"type": "accuracy", "value": 0.9335, "name": "Accuracy"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,920
fine-tuned/jina-embeddings-v2-base-en-13052024-ch9n-webapp
fine-tuned
feature-extraction
[ "sentence-transformers", "safetensors", "bert", "feature-extraction", "sentence-similarity", "mteb", "Legal", "Data", "Privacy", "EU", "Regulation", "custom_code", "en", "dataset:fine-tuned/jina-embeddings-v2-base-en-13052024-ch9n-webapp", "dataset:allenai/c4", "license:apache-2.0", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
2024-05-13T12:42:18Z
2024-05-13T12:42:34+00:00
6
0
--- datasets: - fine-tuned/jina-embeddings-v2-base-en-13052024-ch9n-webapp - allenai/c4 language: - en license: apache-2.0 pipeline_tag: feature-extraction tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb - Legal - Data - Privacy - EU - Regulation --- This model is a fine-tuned version of [**jinaai/jina-embeddings-v2-base-en**](https://huggingface.co/jinaai/jina-embeddings-v2-base-en) designed for the following use case: legal content search for data protection regulations ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/jina-embeddings-v2-base-en-13052024-ch9n-webapp', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
null
Non_BioNLP
This model is a fine-tuned version of [**jinaai/jina-embeddings-v2-base-en**](https://huggingface.co/jinaai/jina-embeddings-v2-base-en) designed for the following use case: legal content search for data protection regulations ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/jina-embeddings-v2-base-en-13052024-ch9n-webapp', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
{"datasets": ["fine-tuned/jina-embeddings-v2-base-en-13052024-ch9n-webapp", "allenai/c4"], "language": ["en"], "license": "apache-2.0", "pipeline_tag": "feature-extraction", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "mteb", "Legal", "Data", "Privacy", "EU", "Regulation"]}
task
[ "TEXT_CLASSIFICATION" ]
42,921
YakovElm/Jira20SetFitModel_balance_ratio_3
YakovElm
text-classification
[ "sentence-transformers", "pytorch", "mpnet", "setfit", "text-classification", "arxiv:2209.11055", "license:apache-2.0", "region:us" ]
2023-06-02T15:00:03Z
2023-06-02T15:00:37+00:00
10
0
--- license: apache-2.0 pipeline_tag: text-classification tags: - setfit - sentence-transformers - text-classification --- # YakovElm/Jira20SetFitModel_balance_ratio_3 This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Usage To use this model for inference, first install the SetFit library: ```bash python -m pip install setfit ``` You can then run inference as follows: ```python from setfit import SetFitModel # Download from Hub and run inference model = SetFitModel.from_pretrained("YakovElm/Jira20SetFitModel_balance_ratio_3") # Run inference preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"]) ``` ## BibTeX entry and citation info ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
null
Non_BioNLP
# YakovElm/Jira20SetFitModel_balance_ratio_3 This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Usage To use this model for inference, first install the SetFit library: ```bash python -m pip install setfit ``` You can then run inference as follows: ```python from setfit import SetFitModel # Download from Hub and run inference model = SetFitModel.from_pretrained("YakovElm/Jira20SetFitModel_balance_ratio_3") # Run inference preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"]) ``` ## BibTeX entry and citation info ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
{"license": "apache-2.0", "pipeline_tag": "text-classification", "tags": ["setfit", "sentence-transformers", "text-classification"]}
task
[ "TEXT_CLASSIFICATION" ]
42,922
MaLA-LM/lucky52-bloom-7b1-no-3
MaLA-LM
text-generation
[ "transformers", "pytorch", "bloom", "text-generation", "generation", "question answering", "instruction tuning", "multilingual", "dataset:MBZUAI/Bactrian-X", "arxiv:2404.04850", "license:cc-by-nc-4.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
2024-04-04T07:46:09Z
2025-04-08T17:06:52+00:00
10
0
--- datasets: - MBZUAI/Bactrian-X language: - multilingual library_name: transformers license: cc-by-nc-4.0 pipeline_tag: text-generation tags: - generation - question answering - instruction tuning --- ### Model Description This HF repository hosts an instruction fine-tuned multilingual BLOOM model trained on the parallel instruction dataset Bactrian-X, which covers 52 languages. We progressively add one language at a time during instruction fine-tuning, training 52 models in total. We then evaluate those models on three multilingual benchmarks. Please refer to [our paper](https://arxiv.org/abs/2404.04850) for more details. * Base model: [BLOOM 7B1](https://huggingface.co/bigscience/bloom-7b1) * Instruction languages: English, Chinese, Afrikaans * Instruction language codes: en, zh, af * Training method: full-parameter fine-tuning. ### Usage The model checkpoint should be loaded using the `transformers` library. ```python from transformers import AutoTokenizer, AutoModelForCausalLM tokenizer = AutoTokenizer.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-3") model = AutoModelForCausalLM.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-3") ``` ### Citation ``` @inproceedings{ji2025lucky52, title={How Many Languages Make Good Multilingual Instruction Tuning? A Case Study on BLOOM}, author={Shaoxiong Ji and Pinzhen Chen}, year={2025}, booktitle={Proceedings of COLING}, url={https://arxiv.org/abs/2404.04850}, } ```
null
Non_BioNLP
### Model Description This HF repository hosts an instruction fine-tuned multilingual BLOOM model trained on the parallel instruction dataset Bactrian-X, which covers 52 languages. We progressively add one language at a time during instruction fine-tuning, training 52 models in total. We then evaluate those models on three multilingual benchmarks. Please refer to [our paper](https://arxiv.org/abs/2404.04850) for more details. * Base model: [BLOOM 7B1](https://huggingface.co/bigscience/bloom-7b1) * Instruction languages: English, Chinese, Afrikaans * Instruction language codes: en, zh, af * Training method: full-parameter fine-tuning. ### Usage The model checkpoint should be loaded using the `transformers` library. ```python from transformers import AutoTokenizer, AutoModelForCausalLM tokenizer = AutoTokenizer.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-3") model = AutoModelForCausalLM.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-3") ``` ### Citation ``` @inproceedings{ji2025lucky52, title={How Many Languages Make Good Multilingual Instruction Tuning? A Case Study on BLOOM}, author={Shaoxiong Ji and Pinzhen Chen}, year={2025}, booktitle={Proceedings of COLING}, url={https://arxiv.org/abs/2404.04850}, } ```
{"datasets": ["MBZUAI/Bactrian-X"], "language": ["multilingual"], "library_name": "transformers", "license": "cc-by-nc-4.0", "pipeline_tag": "text-generation", "tags": ["generation", "question answering", "instruction tuning"]}
task
[ "QUESTION_ANSWERING" ]
42,923
Salesforce/blip-vqa-capfilt-large
Salesforce
visual-question-answering
[ "transformers", "pytorch", "tf", "blip", "visual-question-answering", "arxiv:2201.12086", "license:bsd-3-clause", "region:us" ]
2022-12-13T11:37:19Z
2025-02-03T06:42:26+00:00
117,534
49
--- license: bsd-3-clause pipeline_tag: visual-question-answering tags: - visual-question-answering inference: false languages: - en --- # BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation Model card for BLIP trained on visual question answering - large architecture (with ViT large backbone). | ![BLIP.gif](https://cdn-uploads.huggingface.co/production/uploads/1670928184033-62441d1d9fdefb55a0b7d12c.gif) | |:--:| | <b> Pull figure from BLIP official repo | Image source: https://github.com/salesforce/BLIP </b>| ## TL;DR Authors from the [paper](https://arxiv.org/abs/2201.12086) write in the abstract: *Vision-Language Pre-training (VLP) has advanced the performance for many vision-language tasks. However, most existing pre-trained models only excel in either understanding-based tasks or generation-based tasks. Furthermore, performance improvement has been largely achieved by scaling up the dataset with noisy image-text pairs collected from the web, which is a suboptimal source of supervision. In this paper, we propose BLIP, a new VLP framework which transfers flexibly to both vision-language understanding and generation tasks. BLIP effectively utilizes the noisy web data by bootstrapping the captions, where a captioner generates synthetic captions and a filter removes the noisy ones. We achieve state-of-the-art results on a wide range of vision-language tasks, such as image-text retrieval (+2.7% in average recall@1), image captioning (+2.8% in CIDEr), and VQA (+1.6% in VQA score). BLIP also demonstrates strong generalization ability when directly transferred to videolanguage tasks in a zero-shot manner. 
Code, models, and datasets are released.* ## Usage You can use this model for visual question answering. ### Using the Pytorch model #### Running the model on CPU <details> <summary> Click to expand </summary> ```python import requests from PIL import Image from transformers import BlipProcessor, BlipForQuestionAnswering processor = BlipProcessor.from_pretrained("Salesforce/blip-vqa-capfilt-large") model = BlipForQuestionAnswering.from_pretrained("Salesforce/blip-vqa-capfilt-large") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" inputs = processor(raw_image, question, return_tensors="pt") out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) >>> 1 ``` </details> #### Running the model on GPU ##### In full precision <details> <summary> Click to expand </summary> ```python import requests from PIL import Image from transformers import BlipProcessor, BlipForQuestionAnswering processor = BlipProcessor.from_pretrained("Salesforce/blip-vqa-capfilt-large") model = BlipForQuestionAnswering.from_pretrained("Salesforce/blip-vqa-capfilt-large").to("cuda") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" 
inputs = processor(raw_image, question, return_tensors="pt").to("cuda") out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) >>> 1 ``` </details> ##### In half precision (`float16`) <details> <summary> Click to expand </summary> ```python import torch import requests from PIL import Image from transformers import BlipProcessor, BlipForQuestionAnswering processor = BlipProcessor.from_pretrained("ybelkada/blip-vqa-capfilt-large") model = BlipForQuestionAnswering.from_pretrained("ybelkada/blip-vqa-capfilt-large", torch_dtype=torch.float16).to("cuda") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16) out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) >>> 1 ``` </details> ## Ethical Considerations This release is for research purposes only in support of an academic paper. Our models, datasets, and code are not specifically designed or evaluated for all downstream purposes. We strongly recommend users evaluate and address potential concerns related to accuracy, safety, and fairness before deploying this model. We encourage users to consider the common limitations of AI, comply with applicable laws, and leverage best practices when selecting use cases, particularly for high-risk scenarios where errors or misuse could significantly impact people’s lives, rights, or safety. For further guidance on use cases, refer to our AUP and AI AUP. 
## BibTex and citation info ``` @misc{https://doi.org/10.48550/arxiv.2201.12086, doi = {10.48550/ARXIV.2201.12086}, url = {https://arxiv.org/abs/2201.12086}, author = {Li, Junnan and Li, Dongxu and Xiong, Caiming and Hoi, Steven}, keywords = {Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
null
Non_BioNLP
# BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation Model card for BLIP trained on visual question answering - large architecture (with ViT large backbone). | ![BLIP.gif](https://cdn-uploads.huggingface.co/production/uploads/1670928184033-62441d1d9fdefb55a0b7d12c.gif) | |:--:| | <b> Pull figure from BLIP official repo | Image source: https://github.com/salesforce/BLIP </b>| ## TL;DR Authors from the [paper](https://arxiv.org/abs/2201.12086) write in the abstract: *Vision-Language Pre-training (VLP) has advanced the performance for many vision-language tasks. However, most existing pre-trained models only excel in either understanding-based tasks or generation-based tasks. Furthermore, performance improvement has been largely achieved by scaling up the dataset with noisy image-text pairs collected from the web, which is a suboptimal source of supervision. In this paper, we propose BLIP, a new VLP framework which transfers flexibly to both vision-language understanding and generation tasks. BLIP effectively utilizes the noisy web data by bootstrapping the captions, where a captioner generates synthetic captions and a filter removes the noisy ones. We achieve state-of-the-art results on a wide range of vision-language tasks, such as image-text retrieval (+2.7% in average recall@1), image captioning (+2.8% in CIDEr), and VQA (+1.6% in VQA score). BLIP also demonstrates strong generalization ability when directly transferred to videolanguage tasks in a zero-shot manner. 
Code, models, and datasets are released.* ## Usage You can use this model for visual question answering. ### Using the Pytorch model #### Running the model on CPU <details> <summary> Click to expand </summary> ```python import requests from PIL import Image from transformers import BlipProcessor, BlipForQuestionAnswering processor = BlipProcessor.from_pretrained("Salesforce/blip-vqa-capfilt-large") model = BlipForQuestionAnswering.from_pretrained("Salesforce/blip-vqa-capfilt-large") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" inputs = processor(raw_image, question, return_tensors="pt") out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) >>> 1 ``` </details> #### Running the model on GPU ##### In full precision <details> <summary> Click to expand </summary> ```python import requests from PIL import Image from transformers import BlipProcessor, BlipForQuestionAnswering processor = BlipProcessor.from_pretrained("Salesforce/blip-vqa-capfilt-large") model = BlipForQuestionAnswering.from_pretrained("Salesforce/blip-vqa-capfilt-large").to("cuda") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" 
inputs = processor(raw_image, question, return_tensors="pt").to("cuda") out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) >>> 1 ``` </details> ##### In half precision (`float16`) <details> <summary> Click to expand </summary> ```python import torch import requests from PIL import Image from transformers import BlipProcessor, BlipForQuestionAnswering processor = BlipProcessor.from_pretrained("ybelkada/blip-vqa-capfilt-large") model = BlipForQuestionAnswering.from_pretrained("ybelkada/blip-vqa-capfilt-large", torch_dtype=torch.float16).to("cuda") img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg' raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB') question = "how many dogs are in the picture?" inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16) out = model.generate(**inputs) print(processor.decode(out[0], skip_special_tokens=True)) >>> 1 ``` </details> ## Ethical Considerations This release is for research purposes only in support of an academic paper. Our models, datasets, and code are not specifically designed or evaluated for all downstream purposes. We strongly recommend users evaluate and address potential concerns related to accuracy, safety, and fairness before deploying this model. We encourage users to consider the common limitations of AI, comply with applicable laws, and leverage best practices when selecting use cases, particularly for high-risk scenarios where errors or misuse could significantly impact people’s lives, rights, or safety. For further guidance on use cases, refer to our AUP and AI AUP. 
## BibTex and citation info ``` @misc{https://doi.org/10.48550/arxiv.2201.12086, doi = {10.48550/ARXIV.2201.12086}, url = {https://arxiv.org/abs/2201.12086}, author = {Li, Junnan and Li, Dongxu and Xiong, Caiming and Hoi, Steven}, keywords = {Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
{"license": "bsd-3-clause", "pipeline_tag": "visual-question-answering", "tags": ["visual-question-answering"], "inference": false, "languages": ["en"]}
task
[ "QUESTION_ANSWERING" ]
42,924
hopkins/mbart-finetuned-eng-kor-61905009452
hopkins
translation
[ "transformers", "pytorch", "tensorboard", "mbart", "text2text-generation", "translation", "generated_from_trainer", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-07-02T17:44:19Z
2023-07-02T18:00:28+00:00
8
0
--- metrics: - bleu tags: - translation - generated_from_trainer model-index: - name: mbart-finetuned-eng-kor-61905009452 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # mbart-finetuned-eng-kor-61905009452 This model is a fine-tuned version of [facebook/mbart-large-50-many-to-many-mmt](https://huggingface.co/facebook/mbart-large-50-many-to-many-mmt) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.9907 - Bleu: 7.0332 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.26.1 - Pytorch 2.0.1+cu117 - Datasets 2.12.0 - Tokenizers 0.13.3
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # mbart-finetuned-eng-kor-61905009452 This model is a fine-tuned version of [facebook/mbart-large-50-many-to-many-mmt](https://huggingface.co/facebook/mbart-large-50-many-to-many-mmt) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.9907 - Bleu: 7.0332 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.26.1 - Pytorch 2.0.1+cu117 - Datasets 2.12.0 - Tokenizers 0.13.3
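The card above reports a BLEU score but ships no inference code. Below is a minimal, hypothetical sketch of how an MBart-50 fine-tune like this one is typically queried; the helper name, the lazy import, and the `en_XX`/`ko_KR` language codes are assumptions (they are the standard mBART-50 codes for English and Korean), not something this card documents.

```python
def translate_en_to_ko(text: str,
                       model_name: str = "hopkins/mbart-finetuned-eng-kor-61905009452") -> str:
    """Sketch: translate English to Korean with a fine-tuned MBart-50 checkpoint.

    transformers is imported lazily so the helper can be defined without it.
    """
    from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

    tokenizer = MBart50TokenizerFast.from_pretrained(model_name, src_lang="en_XX")
    model = MBartForConditionalGeneration.from_pretrained(model_name)

    inputs = tokenizer(text, return_tensors="pt")
    # mBART-50 expects the target-language token to be forced at the start of decoding.
    generated = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.lang_code_to_id["ko_KR"],
        max_new_tokens=128,
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
```

Calling `translate_en_to_ko("The weather is nice today.")` downloads the checkpoint on first use; output quality should be judged against the 7.03 BLEU reported above.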
{"metrics": ["bleu"], "tags": ["translation", "generated_from_trainer"], "model-index": [{"name": "mbart-finetuned-eng-kor-61905009452", "results": []}]}
task
[ "TRANSLATION" ]
42,925
izhl/t5-small-finetuned-news-commentary-en-to-zh
izhl
translation
[ "transformers", "tensorboard", "safetensors", "t5", "text2text-generation", "translation", "generated_from_trainer", "base_model:google-t5/t5-small", "base_model:finetune:google-t5/t5-small", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
2024-03-20T10:30:04Z
2024-03-25T10:43:38+00:00
14
0
--- base_model: google-t5/t5-small license: apache-2.0 tags: - translation - generated_from_trainer model-index: - name: t5-small-finetuned-news-commentary-en-to-zh results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # t5-small-finetuned-news-commentary-en-to-zh This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.39.0 - Pytorch 2.2.1+cu121 - Datasets 2.18.0 - Tokenizers 0.15.2
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # t5-small-finetuned-news-commentary-en-to-zh This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.39.0 - Pytorch 2.2.1+cu121 - Datasets 2.18.0 - Tokenizers 0.15.2
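Since the card above lists only training hyperparameters, here is a minimal, hypothetical inference sketch. Note that the task prefix `"translate English to Chinese: "` is an assumption: T5 checkpoints are usually prompted with such a prefix, but the preprocessing used for this fine-tune is not documented.

```python
def translate_en_to_zh(text: str,
                       model_name: str = "izhl/t5-small-finetuned-news-commentary-en-to-zh") -> str:
    """Sketch: run the fine-tuned T5 checkpoint above on one English sentence."""
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    # The task prefix is an assumption; the card does not document the
    # preprocessing used during fine-tuning.
    inputs = tokenizer("translate English to Chinese: " + text, return_tensors="pt")
    generated = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(generated[0], skip_special_tokens=True)
```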
{"base_model": "google-t5/t5-small", "license": "apache-2.0", "tags": ["translation", "generated_from_trainer"], "model-index": [{"name": "t5-small-finetuned-news-commentary-en-to-zh", "results": []}]}
task
[ "TRANSLATION" ]
42,926
YakovElm/Qt10SetFitModel_Train_balance_ratio_4
YakovElm
text-classification
[ "sentence-transformers", "pytorch", "mpnet", "setfit", "text-classification", "arxiv:2209.11055", "license:apache-2.0", "region:us" ]
2023-06-11T17:21:06Z
2023-06-11T17:21:47+00:00
8
0
--- license: apache-2.0 pipeline_tag: text-classification tags: - setfit - sentence-transformers - text-classification --- # YakovElm/Qt10SetFitModel_Train_balance_ratio_4 This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Usage To use this model for inference, first install the SetFit library: ```bash python -m pip install setfit ``` You can then run inference as follows: ```python from setfit import SetFitModel # Download from Hub and run inference model = SetFitModel.from_pretrained("YakovElm/Qt10SetFitModel_Train_balance_ratio_4") # Run inference preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"]) ``` ## BibTeX entry and citation info ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
null
Non_BioNLP
# YakovElm/Qt10SetFitModel_Train_balance_ratio_4 This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Usage To use this model for inference, first install the SetFit library: ```bash python -m pip install setfit ``` You can then run inference as follows: ```python from setfit import SetFitModel # Download from Hub and run inference model = SetFitModel.from_pretrained("YakovElm/Qt10SetFitModel_Train_balance_ratio_4") # Run inference preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"]) ``` ## BibTeX entry and citation info ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
{"license": "apache-2.0", "pipeline_tag": "text-classification", "tags": ["setfit", "sentence-transformers", "text-classification"]}
task
[ "TEXT_CLASSIFICATION" ]
42,927
klumdedum/finetuning-sentiment-model-3000-samples
klumdedum
text-classification
[ "transformers", "tensorboard", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:imdb", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-11-30T00:26:07Z
2023-11-30T00:36:57+00:00
104
0
--- base_model: distilbert-base-uncased datasets: - imdb license: apache-2.0 metrics: - accuracy - f1 tags: - generated_from_trainer model-index: - name: finetuning-sentiment-model-3000-samples results: - task: type: text-classification name: Text Classification dataset: name: imdb type: imdb config: plain_text split: test args: plain_text metrics: - type: accuracy value: 0.8666666666666667 name: Accuracy - type: f1 value: 0.8692810457516339 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # finetuning-sentiment-model-3000-samples This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset. It achieves the following results on the evaluation set: - Loss: 0.3433 - Accuracy: 0.8667 - F1: 0.8693 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu118 - Datasets 2.15.0 - Tokenizers 0.15.0
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # finetuning-sentiment-model-3000-samples This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset. It achieves the following results on the evaluation set: - Loss: 0.3433 - Accuracy: 0.8667 - F1: 0.8693 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu118 - Datasets 2.15.0 - Tokenizers 0.15.0
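The card reports accuracy and F1 on IMDB but includes no usage code. A minimal sketch of inference via the `pipeline` API follows; the helper name is illustrative, and transformers is imported lazily inside the function.

```python
def classify_sentiment(texts,
                       model_name: str = "klumdedum/finetuning-sentiment-model-3000-samples"):
    """Sketch: score a list of texts with the fine-tuned DistilBERT classifier.

    Returns a list of {"label": ..., "score": ...} dicts, one per input text.
    """
    from transformers import pipeline

    classifier = pipeline("text-classification", model=model_name)
    return classifier(list(texts))
```

For example, `classify_sentiment(["A wonderful film.", "A complete waste of time."])` returns one label/score pair per review.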
{"base_model": "distilbert-base-uncased", "datasets": ["imdb"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "finetuning-sentiment-model-3000-samples", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "imdb", "type": "imdb", "config": "plain_text", "split": "test", "args": "plain_text"}, "metrics": [{"type": "accuracy", "value": 0.8666666666666667, "name": "Accuracy"}, {"type": "f1", "value": 0.8692810457516339, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,928
HusseinEid/marian-finetuned-kde4-en-to-fr
HusseinEid
translation
[ "transformers", "tensorboard", "safetensors", "marian", "text2text-generation", "translation", "generated_from_trainer", "en", "dataset:kde4", "base_model:Helsinki-NLP/opus-mt-en-fr", "base_model:finetune:Helsinki-NLP/opus-mt-en-fr", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-05-22T09:55:06Z
2024-05-22T23:14:54+00:00
7
0
--- base_model: Helsinki-NLP/opus-mt-en-fr datasets: - kde4 language: - en library_name: transformers license: apache-2.0 metrics: - bleu tags: - translation - generated_from_trainer model-index: - name: marian-finetuned-kde4-en-to-fr results: - task: type: text2text-generation name: Sequence-to-sequence Language Modeling dataset: name: kde4 type: kde4 config: en-fr split: train args: en-fr metrics: - type: bleu value: 52.91210143343284 name: Bleu --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # marian-finetuned-kde4-en-to-fr This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset. It achieves the following results on the evaluation set: - Loss: 0.8554 - Bleu: 52.9121 ## Model description This is a model for English to French translation. ## Intended uses & limitations Open source ## Training and evaluation data KDE4 ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.41.0 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.19.1
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # marian-finetuned-kde4-en-to-fr This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset. It achieves the following results on the evaluation set: - Loss: 0.8554 - Bleu: 52.9121 ## Model description This is a model for English to French translation. ## Intended uses & limitations Open source ## Training and evaluation data KDE4 ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.41.0 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.19.1
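A minimal, hypothetical sketch of inference with this Marian fine-tune via the `pipeline` API; the helper name is illustrative and the card itself ships no usage code.

```python
def translate_en_to_fr(text: str,
                       model_name: str = "HusseinEid/marian-finetuned-kde4-en-to-fr") -> str:
    """Sketch: translate one English string to French with the Marian fine-tune."""
    from transformers import pipeline

    translator = pipeline("translation", model=model_name)
    return translator(text)[0]["translation_text"]
```

Since the model was tuned on KDE4, it should do best on software-UI strings, e.g. `translate_en_to_fr("Default to expanded threads")`.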
{"base_model": "Helsinki-NLP/opus-mt-en-fr", "datasets": ["kde4"], "language": ["en"], "library_name": "transformers", "license": "apache-2.0", "metrics": ["bleu"], "tags": ["translation", "generated_from_trainer"], "model-index": [{"name": "marian-finetuned-kde4-en-to-fr", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "kde4", "type": "kde4", "config": "en-fr", "split": "train", "args": "en-fr"}, "metrics": [{"type": "bleu", "value": 52.91210143343284, "name": "Bleu"}]}]}]}
task
[ "TRANSLATION" ]
42,929
HachiML/Swallow-MS-7b-v0.1-ChatSkill-LAB
HachiML
text-generation
[ "transformers", "safetensors", "mistral", "text-generation", "SkillEnhanced", "conversational", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
2024-04-15T09:40:37Z
2024-04-15T14:53:52+00:00
8
1
--- library_name: transformers license: apache-2.0 tags: - SkillEnhanced - mistral --- # Model Card for SkillTree Enhanced Model <!-- Provide a quick summary of what the model is/does. --> ## Model Details This model has been enhanced using the SkillTree approach, which applies specific skills extracted from advanced training or fine-tuning processes to improve the model's capabilities in targeted areas. - **Base Model:** [tokyotech-llm/Swallow-MS-7b-v0.1](https://huggingface.co/tokyotech-llm/Swallow-MS-7b-v0.1) - **Skill Tree:** [HachiML/SkillTree-Chat-LAB-Mistral-7B-v0.1](https://huggingface.co/HachiML/SkillTree-Chat-LAB-Mistral-7B-v0.1) - **Language(s) (NLP):** Japanese - **Functionality Status:** **Functional** / Non-Functional / Not Verified ## Benchmark Score ### JMT-Bench ![image/png](https://cdn-uploads.huggingface.co/production/uploads/63a02fecf3334a6553d2ad17/_pxtywte-wt6fJjiUMM8-.png) ``` ########## First turn ########## model turn score Swallow-MS-7b-v0.1-ChatSkill-LAB 1 6.275000 Swallow-MS-7b-v0.1 1 5.262500 ########## Second turn ########## model turn score Swallow-MS-7b-v0.1-ChatSkill-LAB 2 5.6875 Swallow-MS-7b-v0.1-ChatSkill 2 3.9250 ########## Average ########## model score Swallow-MS-7b-v0.1-ChatSkill-LAB 5.981250 Swallow-MS-7b-v0.1 4.562500 ``` result files: [result_jmt_bench](https://huggingface.co/HachiML/Swallow-MS-7b-v0.1-ChatSkill-LAB/tree/main/result_jmt_bench) ## Uses This section should describe the intended use cases for the enhanced model. It might include scenarios such as code generation, conversational AI, text summarization, or any other specific tasks the model has been enhanced to perform better. Be sure to include any recommendations or limitations on the model's use. 
```Python # Import library import torch from transformers import AutoTokenizer, AutoModelForCausalLM # Load model model_name = "HachiML/Swallow-MS-7b-v0.1-ChatSkill-LAB" tokenizer = AutoTokenizer.from_pretrained(model_name) model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map="auto") # Inference 1 (Instruction) prompt = "<s>[INST] お気に入りの調味料は? [/INST]" input_ids = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt") tokens = model.generate(input_ids.to(device=model.device), max_new_tokens=128, temperature=0.99, top_p=0.95, do_sample=True) out = tokenizer.decode(tokens[0], skip_special_tokens=True) print(out) # Inference 2 (Chat) messages = [ {"role": "system", "content": "あなたは慎重なアシスタントです。慎重に指示に従います。あなたは親切で無害で、倫理的なガイドラインに従い、積極的な行動を促します。"}, {"role": "user", "content": "お気に入りの調味料は?"}, {"role": "assistant", "content": "私は新鮮なレモン汁を絞るのが大好きなんだ。キッチンで何を料理するにしても、ちょうどいい量のピリッとした風味を加えてくれる!"}, {"role": "user", "content": "マヨネーズのレシピはありますか?"} ] encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt") model_inputs = encodeds.to(device=model.device) generated_ids = model.generate(model_inputs, max_new_tokens=512, do_sample=True) out = tokenizer.batch_decode(generated_ids) print(out[0]) ``` ## Response Examples Below are examples of responses from the model, demonstrating how the SkillTree enhancement has improved its performance. ### Example 1: - **Input:** ``` <s>[INST] お気に入りの調味料は? [/INST] ``` - **Output:** ``` 味噌煮込みうどんの時はやっぱり七味ですかね。 私の母方の実家が香川県出身なもので、小さい頃からお風呂上がりにはお父さんの淹れてくれたお茶に、母が作ってくれた手打ちうどん、というのが定番でした。 その際、薬味として七味を使い、うどんの出汁に入れて飲む事がありました。 その味が大人になっても忘れられず、今でも時折うどん屋さんでお茶漬け風にして食べる事があります。 お ``` ### Example 2: - **Input:** ``` <s> [INST] お気に入りの調味料は? [/INST] 私は新鮮なレモン汁を絞るのが大好きなんだ。キッチンで何を料理するにしても、ちょうどいい量のピリッとした風味を加えてくれる! </s> <s> [INST] マヨネーズのレシピはありますか? 
[/INST] ``` - **Output:** ``` マヨネーズの作り方ですが、下記の材料を混ぜるだけです。 分量は10人分です。 ●卵黄:卵では卵黄、1~2個を入れる ●酢:好みの量を入れていいが、一般的には酢200ccは欲しい ●塩:酢の10%ほどを入れる ●砂糖:酢の20%程度を入れる ●オリーブオイル:500cc オリーブオイル以外の材料を先に混ぜ合わせておき、そこへオリーブオイルを、最初は小さじ4杯から、油の流入が少ないように注意しながら10回ほどかき混ぜる その後は少しずつ量を増やしながらどんどんかき混ぜる、全体が乳白色に混ざるまでよくかき混ぜる。 マヨネーズの保存は、冷蔵庫で2週間を目処に使い切ることを心がけましょう。 マヨネーズの基本の材料 卵黄:卵では卵黄、1~2個を入れる 酢:好みの量を入れていいが、一般的には酢200ccは欲しい 塩:酢の10%ほどを入れる 砂糖:酢の20%程度を入れる 作り方は以下の通りです。 1.酢200cc、塩大さじ1、砂糖大さじ2を混ぜ、酢の味のベースを作る。 2.冷蔵庫で酢のベースを冷やす。 3.卵黄2個を別の容器に割りいれ、フォークを使ってよくかき混ぜる(マヨネーズを作る前に、卵白から黄身をよけておくとよい)。 4.1で作った酢20ccを、卵黄が入ったボウルに少しずつ加え、その都度かき混ぜて乳化させる(酢を加えたら必ずよくかき混ぜる、この工程で味が決まる)。 5.塩、砂糖を加え、再度よくかき混ぜる。 6.4で作った、調味料がよく混ざる程度の酢を少しずつ加え、毎回かき混ぜる(これで味が決まる)。 7.塩を小さじ2杯分、オリーブオイルを50cc加え、10回ほどよくかき混ぜる(この場合、オリーブオイルを加えた後、酢少々を加えて味を薄く ```
null
Non_BioNLP
# Model Card for SkillTree Enhanced Model <!-- Provide a quick summary of what the model is/does. --> ## Model Details This model has been enhanced using the SkillTree approach, which applies specific skills extracted from advanced training or fine-tuning processes to improve the model's capabilities in targeted areas. - **Base Model:** [tokyotech-llm/Swallow-MS-7b-v0.1](https://huggingface.co/tokyotech-llm/Swallow-MS-7b-v0.1) - **Skill Tree:** [HachiML/SkillTree-Chat-LAB-Mistral-7B-v0.1](https://huggingface.co/HachiML/SkillTree-Chat-LAB-Mistral-7B-v0.1) - **Language(s) (NLP):** Japanese - **Functionality Status:** **Functional** / Non-Functional / Not Verified ## Benchmark Score ### JMT-Bench ![image/png](https://cdn-uploads.huggingface.co/production/uploads/63a02fecf3334a6553d2ad17/_pxtywte-wt6fJjiUMM8-.png) ``` ########## First turn ########## model turn score Swallow-MS-7b-v0.1-ChatSkill-LAB 1 6.275000 Swallow-MS-7b-v0.1 1 5.262500 ########## Second turn ########## model turn score Swallow-MS-7b-v0.1-ChatSkill-LAB 2 5.6875 Swallow-MS-7b-v0.1-ChatSkill 2 3.9250 ########## Average ########## model score Swallow-MS-7b-v0.1-ChatSkill-LAB 5.981250 Swallow-MS-7b-v0.1 4.562500 ``` result files: [result_jmt_bench](https://huggingface.co/HachiML/Swallow-MS-7b-v0.1-ChatSkill-LAB/tree/main/result_jmt_bench) ## Uses This section should describe the intended use cases for the enhanced model. It might include scenarios such as code generation, conversational AI, text summarization, or any other specific tasks the model has been enhanced to perform better. Be sure to include any recommendations or limitations on the model's use. 
```Python # Import library import torch from transformers import AutoTokenizer, AutoModelForCausalLM # Load model model_name = "HachiML/Swallow-MS-7b-v0.1-ChatSkill-LAB" tokenizer = AutoTokenizer.from_pretrained(model_name) model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map="auto") # Inference 1 (Instruction) prompt = "<s>[INST] お気に入りの調味料は? [/INST]" input_ids = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt") tokens = model.generate(input_ids.to(device=model.device), max_new_tokens=128, temperature=0.99, top_p=0.95, do_sample=True) out = tokenizer.decode(tokens[0], skip_special_tokens=True) print(out) # Inference 2 (Chat) messages = [ {"role": "system", "content": "あなたは慎重なアシスタントです。慎重に指示に従います。あなたは親切で無害で、倫理的なガイドラインに従い、積極的な行動を促します。"}, {"role": "user", "content": "お気に入りの調味料は?"}, {"role": "assistant", "content": "私は新鮮なレモン汁を絞るのが大好きなんだ。キッチンで何を料理するにしても、ちょうどいい量のピリッとした風味を加えてくれる!"}, {"role": "user", "content": "マヨネーズのレシピはありますか?"} ] encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt") model_inputs = encodeds.to(device=model.device) generated_ids = model.generate(model_inputs, max_new_tokens=512, do_sample=True) out = tokenizer.batch_decode(generated_ids) print(out[0]) ``` ## Response Examples Below are examples of responses from the model, demonstrating how the SkillTree enhancement has improved its performance. ### Example 1: - **Input:** ``` <s>[INST] お気に入りの調味料は? [/INST] ``` - **Output:** ``` 味噌煮込みうどんの時はやっぱり七味ですかね。 私の母方の実家が香川県出身なもので、小さい頃からお風呂上がりにはお父さんの淹れてくれたお茶に、母が作ってくれた手打ちうどん、というのが定番でした。 その際、薬味として七味を使い、うどんの出汁に入れて飲む事がありました。 その味が大人になっても忘れられず、今でも時折うどん屋さんでお茶漬け風にして食べる事があります。 お ``` ### Example 2: - **Input:** ``` <s> [INST] お気に入りの調味料は? [/INST] 私は新鮮なレモン汁を絞るのが大好きなんだ。キッチンで何を料理するにしても、ちょうどいい量のピリッとした風味を加えてくれる! </s> <s> [INST] マヨネーズのレシピはありますか? 
[/INST] ``` - **Output:** ``` マヨネーズの作り方ですが、下記の材料を混ぜるだけです。 分量は10人分です。 ●卵黄:卵では卵黄、1~2個を入れる ●酢:好みの量を入れていいが、一般的には酢200ccは欲しい ●塩:酢の10%ほどを入れる ●砂糖:酢の20%程度を入れる ●オリーブオイル:500cc オリーブオイル以外の材料を先に混ぜ合わせておき、そこへオリーブオイルを、最初は小さじ4杯から、油の流入が少ないように注意しながら10回ほどかき混ぜる その後は少しずつ量を増やしながらどんどんかき混ぜる、全体が乳白色に混ざるまでよくかき混ぜる。 マヨネーズの保存は、冷蔵庫で2週間を目処に使い切ることを心がけましょう。 マヨネーズの基本の材料 卵黄:卵では卵黄、1~2個を入れる 酢:好みの量を入れていいが、一般的には酢200ccは欲しい 塩:酢の10%ほどを入れる 砂糖:酢の20%程度を入れる 作り方は以下の通りです。 1.酢200cc、塩大さじ1、砂糖大さじ2を混ぜ、酢の味のベースを作る。 2.冷蔵庫で酢のベースを冷やす。 3.卵黄2個を別の容器に割りいれ、フォークを使ってよくかき混ぜる(マヨネーズを作る前に、卵白から黄身をよけておくとよい)。 4.1で作った酢20ccを、卵黄が入ったボウルに少しずつ加え、その都度かき混ぜて乳化させる(酢を加えたら必ずよくかき混ぜる、この工程で味が決まる)。 5.塩、砂糖を加え、再度よくかき混ぜる。 6.4で作った、調味料がよく混ざる程度の酢を少しずつ加え、毎回かき混ぜる(これで味が決まる)。 7.塩を小さじ2杯分、オリーブオイルを50cc加え、10回ほどよくかき混ぜる(この場合、オリーブオイルを加えた後、酢少々を加えて味を薄く ```
{"library_name": "transformers", "license": "apache-2.0", "tags": ["SkillEnhanced", "mistral"]}
task
[ "SUMMARIZATION" ]
42,930
facebook/s2t-small-mustc-en-nl-st
facebook
automatic-speech-recognition
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "nl", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05Z
2023-01-24T16:32:16+00:00
170
0
--- datasets: - mustc language: - en - nl license: mit pipeline_tag: automatic-speech-recognition tags: - audio - speech-translation - automatic-speech-recognition widget: - example_title: Librispeech sample 1 src: https://cdn-media.huggingface.co/speech_samples/sample1.flac - example_title: Librispeech sample 2 src: https://cdn-media.huggingface.co/speech_samples/sample2.flac --- # S2T-SMALL-MUSTC-EN-NL-ST `s2t-small-mustc-en-nl-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text). ## Model description S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively. ## Intended uses & limitations This model can be used for end-to-end English speech to Dutch text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints. ### How to use As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model. *Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.* You can either install the extra speech dependencies with `pip install "transformers[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`. 
```python import torch from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration from datasets import load_dataset import soundfile as sf model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-nl-st") processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-nl-st") def map_to_array(batch): speech, _ = sf.read(batch["file"]) batch["speech"] = speech return batch ds = load_dataset( "patrickvonplaten/librispeech_asr_dummy", "clean", split="validation" ) ds = ds.map(map_to_array) inputs = processor( ds["speech"][0], sampling_rate=16_000, return_tensors="pt" ) generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"]) translation = processor.batch_decode(generated_ids, skip_special_tokens=True) ``` ## Training data The s2t-small-mustc-en-nl-st is trained on the English-Dutch subset of [MuST-C](https://ict.fbk.eu/must-c/). MuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems for speech translation from English into several languages. For each target language, MuST-C comprises several hundred hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual transcriptions and translations. ## Training procedure ### Preprocessing The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example. The texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000. ### Training The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively. 
To accelerate model training and for better performance, the encoder is pre-trained for English ASR. ## Evaluation results MuST-C test results for en-nl (BLEU score): 27.3 ### BibTeX entry and citation info ```bibtex @inproceedings{wang2020fairseqs2t, title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq}, author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino}, booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations}, year = {2020}, } ```
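The utterance-level CMVN step mentioned under Preprocessing can be illustrated with a short NumPy sketch. This is an illustration only, not the exact PyKaldi/fairseq implementation: each utterance's feature matrix has its per-channel mean removed and per-channel variance scaled to one.

```python
import numpy as np

def utterance_cmvn(features: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Apply utterance-level cepstral mean and variance normalization.

    features: (num_frames, num_mel_bins) log mel-filter bank matrix,
    e.g. (T, 80) for the 80-channel features used by S2T.
    """
    mean = features.mean(axis=0, keepdims=True)
    std = features.std(axis=0, keepdims=True)
    return (features - mean) / (std + eps)

# Toy example: 100 frames of 80-dim features with nonzero mean and scale
feats = np.random.randn(100, 80) * 3.0 + 5.0
normed = utterance_cmvn(feats)
# Per-channel mean is now ~0 and per-channel std ~1
print(normed.mean(axis=0).round(6).max(), normed.std(axis=0).round(3).max())
```

In practice this normalization is computed per utterance (as here) rather than from global corpus statistics.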
null
Non_BioNLP
# S2T-SMALL-MUSTC-EN-NL-ST `s2t-small-mustc-en-nl-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text). ## Model description S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively. ## Intended uses & limitations This model can be used for end-to-end English speech to Dutch text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints. ### How to use As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model. *Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.* You could either install those as extra speech dependencies with `pip install "transformers[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`. 
```python import torch from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration from datasets import load_dataset import soundfile as sf model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-nl-st") processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-nl-st") def map_to_array(batch): speech, _ = sf.read(batch["file"]) batch["speech"] = speech return batch ds = load_dataset( "patrickvonplaten/librispeech_asr_dummy", "clean", split="validation" ) ds = ds.map(map_to_array) inputs = processor( ds["speech"][0], sampling_rate=16_000, return_tensors="pt" ) generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"]) translation = processor.batch_decode(generated_ids, skip_special_tokens=True) ``` ## Training data The s2t-small-mustc-en-nl-st is trained on the English-Dutch subset of [MuST-C](https://ict.fbk.eu/must-c/). MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems for speech translation from English into several languages. For each target language, MuST-C comprises several hundred hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual transcriptions and translations. ## Training procedure ### Preprocessing The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example. The texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000. ### Training The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively. 
To accelerate model training and for better performance, the encoder is pre-trained for English ASR. ## Evaluation results MuST-C test results for en-nl (BLEU score): 27.3 ### BibTeX entry and citation info ```bibtex @inproceedings{wang2020fairseqs2t, title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq}, author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino}, booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations}, year = {2020}, } ```
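The BLEU score reported above is normally computed with a standard toolkit such as sacreBLEU; the following is only a simplified, single-reference sketch of the underlying idea — clipped n-gram precisions combined via a geometric mean and a brevity penalty (no smoothing, hypothetical tokenization by whitespace):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate: str, reference: str, max_n: int = 4) -> float:
    """Simplified single-reference BLEU: geometric mean of clipped
    n-gram precisions times a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        cand_ngrams, ref_ngrams = ngrams(cand, n), ngrams(ref, n)
        overlap = sum((cand_ngrams & ref_ngrams).values())  # clipped counts
        total = max(sum(cand_ngrams.values()), 1)
        if overlap == 0:
            return 0.0  # real implementations smooth instead
        log_prec += math.log(overlap / total) / max_n
    # Brevity penalty: punish candidates shorter than the reference
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(log_prec)

print(bleu("de kat zit op de mat", "de kat zit op de mat"))  # 1.0 for an exact match
```

Corpus-level BLEU (as in the 27.3 figure) aggregates n-gram counts over the whole test set before taking precisions, which this per-sentence sketch does not do.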
{"datasets": ["mustc"], "language": ["en", "nl"], "license": "mit", "pipeline_tag": "automatic-speech-recognition", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
task
[ "TRANSLATION" ]
42,931
fine-tuned/SCIDOCS-32000-384-gpt-4o-2024-05-13-38097330
fine-tuned
feature-extraction
[ "sentence-transformers", "safetensors", "bert", "feature-extraction", "sentence-similarity", "mteb", "en", "dataset:fine-tuned/SCIDOCS-32000-384-gpt-4o-2024-05-13-38097330", "dataset:allenai/c4", "license:apache-2.0", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
2024-05-31T21:21:44Z
2024-05-31T21:22:12+00:00
6
0
--- datasets: - fine-tuned/SCIDOCS-32000-384-gpt-4o-2024-05-13-38097330 - allenai/c4 language: - en - en license: apache-2.0 pipeline_tag: feature-extraction tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb --- This model is a fine-tuned version of [**BAAI/bge-large-en-v1.5**](https://huggingface.co/BAAI/bge-large-en-v1.5) designed for the following use case: None ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/SCIDOCS-32000-384-gpt-4o-2024-05-13-38097330', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
null
Non_BioNLP
This model is a fine-tuned version of [**BAAI/bge-large-en-v1.5**](https://huggingface.co/BAAI/bge-large-en-v1.5) designed for the following use case: None ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/SCIDOCS-32000-384-gpt-4o-2024-05-13-38097330', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
{"datasets": ["fine-tuned/SCIDOCS-32000-384-gpt-4o-2024-05-13-38097330", "allenai/c4"], "language": ["en", "en"], "license": "apache-2.0", "pipeline_tag": "feature-extraction", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "mteb"]}
task
[ "TEXT_CLASSIFICATION" ]
42,932
marcolatella/hate_trained
marcolatella
text-classification
[ "transformers", "pytorch", "distilbert", "text-classification", "generated_from_trainer", "dataset:tweet_eval", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05Z
2021-12-11T00:02:24+00:00
16
0
--- datasets: - tweet_eval license: apache-2.0 metrics: - f1 tags: - generated_from_trainer model-index: - name: hate_trained results: - task: type: text-classification name: Text Classification dataset: name: tweet_eval type: tweet_eval args: hate metrics: - type: f1 value: 0.7875737774565976 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # hate_trained This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the tweet_eval dataset. It achieves the following results on the evaluation set: - Loss: 0.8182 - F1: 0.7876 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2.7272339744854407e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 0 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | |:-------------:|:-----:|:----:|:---------------:|:------:| | 0.4635 | 1.0 | 563 | 0.4997 | 0.7530 | | 0.3287 | 2.0 | 1126 | 0.5138 | 0.7880 | | 0.216 | 3.0 | 1689 | 0.6598 | 0.7821 | | 0.1309 | 4.0 | 2252 | 0.8182 | 0.7876 | ### Framework versions - Transformers 4.13.0 - Pytorch 1.10.0+cu111 - Datasets 1.16.1 - Tokenizers 0.10.3
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # hate_trained This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the tweet_eval dataset. It achieves the following results on the evaluation set: - Loss: 0.8182 - F1: 0.7876 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2.7272339744854407e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 0 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | |:-------------:|:-----:|:----:|:---------------:|:------:| | 0.4635 | 1.0 | 563 | 0.4997 | 0.7530 | | 0.3287 | 2.0 | 1126 | 0.5138 | 0.7880 | | 0.216 | 3.0 | 1689 | 0.6598 | 0.7821 | | 0.1309 | 4.0 | 2252 | 0.8182 | 0.7876 | ### Framework versions - Transformers 4.13.0 - Pytorch 1.10.0+cu111 - Datasets 1.16.1 - Tokenizers 0.10.3
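For reference, the F1 values in the table above combine precision and recall. A minimal sketch of binary F1 from label lists follows (the exact `tweet_eval` evaluation may use a different averaging, e.g. macro or weighted):

```python
def binary_f1(y_true, y_pred) -> float:
    """F1 = 2PR / (P + R), computed from binary gold and predicted labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# 2 true positives, 1 false positive, 1 false negative -> P = R = 2/3
print(binary_f1([1, 0, 1, 1, 0], [1, 0, 0, 1, 1]))  # 0.666...
```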
{"datasets": ["tweet_eval"], "license": "apache-2.0", "metrics": ["f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "hate_trained", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "tweet_eval", "type": "tweet_eval", "args": "hate"}, "metrics": [{"type": "f1", "value": 0.7875737774565976, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,933
Blaine-Mason/hackMIT-finetuned-sst2
Blaine-Mason
text-classification
[ "transformers", "pytorch", "tensorboard", "bert", "text-classification", "generated_from_trainer", "dataset:glue", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04Z
2021-08-25T00:31:45+00:00
98
0
--- datasets: - glue metrics: - accuracy tags: - generated_from_trainer model_index: - name: hackMIT-finetuned-sst2 results: - task: name: Text Classification type: text-classification dataset: name: glue type: glue args: sst2 metric: name: Accuracy type: accuracy value: 0.8027522935779816 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # hackMIT-finetuned-sst2 This model is a fine-tuned version of [Blaine-Mason/hackMIT-finetuned-sst2](https://huggingface.co/Blaine-Mason/hackMIT-finetuned-sst2) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 1.1086 - Accuracy: 0.8028 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2.033238621168611e-06 - train_batch_size: 16 - eval_batch_size: 8 - seed: 30 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0674 | 1.0 | 4210 | 1.1086 | 0.8028 | ### Framework versions - Transformers 4.9.2 - Pytorch 1.9.0+cu102 - Datasets 1.11.0 - Tokenizers 0.10.3
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # hackMIT-finetuned-sst2 This model is a fine-tuned version of [Blaine-Mason/hackMIT-finetuned-sst2](https://huggingface.co/Blaine-Mason/hackMIT-finetuned-sst2) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 1.1086 - Accuracy: 0.8028 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2.033238621168611e-06 - train_batch_size: 16 - eval_batch_size: 8 - seed: 30 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0674 | 1.0 | 4210 | 1.1086 | 0.8028 | ### Framework versions - Transformers 4.9.2 - Pytorch 1.9.0+cu102 - Datasets 1.11.0 - Tokenizers 0.10.3
{"datasets": ["glue"], "metrics": ["accuracy"], "tags": ["generated_from_trainer"], "model_index": [{"name": "hackMIT-finetuned-sst2", "results": [{"task": {"name": "Text Classification", "type": "text-classification"}, "dataset": {"name": "glue", "type": "glue", "args": "sst2"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.8027522935779816}}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,934
muhtasham/finetuned-base_base
muhtasham
text-classification
[ "transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "dataset:imdb", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-12-03T00:26:55Z
2022-12-03T01:16:32+00:00
11
0
--- datasets: - imdb license: apache-2.0 metrics: - accuracy - f1 tags: - generated_from_trainer model-index: - name: finetuned-base_base results: - task: type: text-classification name: Text Classification dataset: name: imdb type: imdb config: plain_text split: train args: plain_text metrics: - type: accuracy value: 0.90936 name: Accuracy - type: f1 value: 0.95252859596933 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # finetuned-base_base This model is a fine-tuned version of [google/bert_uncased_L-12_H-768_A-12](https://huggingface.co/google/bert_uncased_L-12_H-768_A-12) on the imdb dataset. It achieves the following results on the evaluation set: - Loss: 0.3594 - Accuracy: 0.9094 - F1: 0.9525 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 50 - eval_batch_size: 50 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: constant - num_epochs: 200 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.2414 | 1.0 | 500 | 0.1796 | 0.9343 | 0.9660 | | 0.1235 | 2.0 | 1000 | 0.2042 | 0.9311 | 0.9643 | | 0.0633 | 3.0 | 1500 | 0.3590 | 0.8997 | 0.9472 | | 0.0398 | 4.0 | 2000 | 0.3594 | 0.9094 | 0.9525 | ### Framework versions - Transformers 4.25.1 - Pytorch 1.12.1+cu113 - Datasets 2.7.1 - Tokenizers 0.13.2
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # finetuned-base_base This model is a fine-tuned version of [google/bert_uncased_L-12_H-768_A-12](https://huggingface.co/google/bert_uncased_L-12_H-768_A-12) on the imdb dataset. It achieves the following results on the evaluation set: - Loss: 0.3594 - Accuracy: 0.9094 - F1: 0.9525 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 50 - eval_batch_size: 50 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: constant - num_epochs: 200 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.2414 | 1.0 | 500 | 0.1796 | 0.9343 | 0.9660 | | 0.1235 | 2.0 | 1000 | 0.2042 | 0.9311 | 0.9643 | | 0.0633 | 3.0 | 1500 | 0.3590 | 0.8997 | 0.9472 | | 0.0398 | 4.0 | 2000 | 0.3594 | 0.9094 | 0.9525 | ### Framework versions - Transformers 4.25.1 - Pytorch 1.12.1+cu113 - Datasets 2.7.1 - Tokenizers 0.13.2
{"datasets": ["imdb"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "finetuned-base_base", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "imdb", "type": "imdb", "config": "plain_text", "split": "train", "args": "plain_text"}, "metrics": [{"type": "accuracy", "value": 0.90936, "name": "Accuracy"}, {"type": "f1", "value": 0.95252859596933, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,935
juampahc/gliner_multi-v2.1-openvino
juampahc
token-classification
[ "gliner", "OpenVINO", "GLiNER", "token-classification", "multilingual", "arxiv:2311.08526", "base_model:urchade/gliner_multi-v2.1", "base_model:finetune:urchade/gliner_multi-v2.1", "license:apache-2.0", "region:us" ]
2024-11-15T10:32:42Z
2025-02-12T15:51:45+00:00
28
0
--- base_model: - urchade/gliner_multi-v2.1 language: - multilingual library_name: gliner license: apache-2.0 pipeline_tag: token-classification tags: - OpenVINO - GLiNER --- # About GLiNER is a Named Entity Recognition (NER) model capable of identifying any entity type using a bidirectional transformer encoder (BERT-like). It provides a practical alternative to traditional NER models, which are limited to predefined entities, and to Large Language Models (LLMs), which, despite their flexibility, are costly and too large for resource-constrained scenarios. This is the OpenVINO Intermediate Representation (IR) version, provided both with and without fp16 compression. ## Links * Paper: https://arxiv.org/abs/2311.08526 * Repository: https://github.com/urchade/GLiNER ## Installation WIP ## Usage Do you need to deploy? Please check my GitHub: https://github.com/juampahc/skyresh I will update the model card asap. WIP
null
Non_BioNLP
# About GLiNER is a Named Entity Recognition (NER) model capable of identifying any entity type using a bidirectional transformer encoder (BERT-like). It provides a practical alternative to traditional NER models, which are limited to predefined entities, and to Large Language Models (LLMs), which, despite their flexibility, are costly and too large for resource-constrained scenarios. This is the OpenVINO Intermediate Representation (IR) version, provided both with and without fp16 compression. ## Links * Paper: https://arxiv.org/abs/2311.08526 * Repository: https://github.com/urchade/GLiNER ## Installation WIP ## Usage Do you need to deploy? Please check my GitHub: https://github.com/juampahc/skyresh I will update the model card asap. WIP
{"base_model": ["urchade/gliner_multi-v2.1"], "language": ["multilingual"], "library_name": "gliner", "license": "apache-2.0", "pipeline_tag": "token-classification", "tags": ["OpenVINO", "GLiNER"]}
task
[ "NAMED_ENTITY_RECOGNITION" ]
42,936
JoshELambert/maritimecrime
JoshELambert
text-classification
[ "sentence-transformers", "pytorch", "mpnet", "setfit", "text-classification", "arxiv:2209.11055", "license:apache-2.0", "region:us" ]
2023-06-06T15:52:40Z
2023-06-06T16:40:13+00:00
8
0
--- license: apache-2.0 pipeline_tag: text-classification tags: - setfit - sentence-transformers - text-classification --- # JoshELambert/maritimecrime This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Usage To use this model for inference, first install the SetFit library: ```bash python -m pip install setfit ``` You can then run inference as follows: ```python from setfit import SetFitModel # Download from Hub and run inference model = SetFitModel.from_pretrained("JoshELambert/maritimecrime") # Run inference preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"]) ``` ## BibTeX entry and citation info ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
null
Non_BioNLP
# JoshELambert/maritimecrime This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Usage To use this model for inference, first install the SetFit library: ```bash python -m pip install setfit ``` You can then run inference as follows: ```python from setfit import SetFitModel # Download from Hub and run inference model = SetFitModel.from_pretrained("JoshELambert/maritimecrime") # Run inference preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"]) ``` ## BibTeX entry and citation info ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
{"license": "apache-2.0", "pipeline_tag": "text-classification", "tags": ["setfit", "sentence-transformers", "text-classification"]}
task
[ "TEXT_CLASSIFICATION" ]
42,937
Mollel/swahili-all-MiniLM-L6-v2-nli-matryoshka
Mollel
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:557850", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:2205.13147", "arxiv:1705.00652", "base_model:sentence-transformers/all-MiniLM-L6-v2", "base_model:finetune:sentence-transformers/all-MiniLM-L6-v2", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
2024-07-04T18:57:15Z
2024-07-04T18:57:36+00:00
8
0
--- base_model: sentence-transformers/all-MiniLM-L6-v2 datasets: [] language: [] library_name: sentence-transformers metrics: - pearson_cosine - spearman_cosine - pearson_manhattan - spearman_manhattan - pearson_euclidean - spearman_euclidean - pearson_dot - spearman_dot - pearson_max - spearman_max pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:557850 - loss:MatryoshkaLoss - loss:MultipleNegativesRankingLoss widget: - source_sentence: Mwanamume aliyepangwa vizuri anasimama kwa mguu mmoja karibu na pwani safi ya bahari. sentences: - mtu anacheka wakati wa kufua nguo - Mwanamume fulani yuko nje karibu na ufuo wa bahari. - Mwanamume fulani ameketi kwenye sofa yake. - source_sentence: Mwanamume mwenye ngozi nyeusi akivuta sigareti karibu na chombo cha taka cha kijani. sentences: - Karibu na chombo cha taka mwanamume huyo alisimama na kuvuta sigareti - Kitanda ni chafu. - Alipokuwa kwenye dimbwi la kuogelea mvulana huyo mwenye ugonjwa wa albino alijihadhari na jua kupita kiasi - source_sentence: Mwanamume kijana mwenye nywele nyekundu anaketi ukutani akisoma gazeti huku mwanamke na msichana mchanga wakipita. sentences: - Mwanamume aliyevalia shati la bluu amegonga ukuta kando ya barabara na gari la bluu na gari nyekundu lenye maji nyuma. - Mwanamume mchanga anatazama gazeti huku wanawake wawili wakipita karibu naye. - Mwanamume huyo mchanga analala huku Mama akimwongoza binti yake kwenye bustani. - source_sentence: Wasichana wako nje. sentences: - Wasichana wawili wakisafiri kwenye sehemu ya kusisimua. - Kuna watu watatu wakiongoza gari linaloweza kugeuzwa-geuzwa wakipita watu wengine. - Wasichana watatu wamesimama pamoja katika chumba, mmoja anasikiliza, mwingine anaandika ukutani na wa tatu anaongea nao. 
- source_sentence: Mwanamume aliyevalia koti la bluu la kuzuia upepo, amelala uso chini kwenye benchi ya bustani, akiwa na chupa ya pombe iliyofungwa kwenye mojawapo ya miguu ya benchi. sentences: - Mwanamume amelala uso chini kwenye benchi ya bustani. - Mwanamke anaunganisha uzi katika mipira kando ya rundo la mipira - Mwanamume fulani anacheza dansi kwenye klabu hiyo akifungua chupa. model-index: - name: SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2 results: - task: type: semantic-similarity name: Semantic Similarity dataset: name: sts test 256 type: sts-test-256 metrics: - type: pearson_cosine value: 0.6942864389866223 name: Pearson Cosine - type: spearman_cosine value: 0.6856061049537777 name: Spearman Cosine - type: pearson_manhattan value: 0.6885375818451587 name: Pearson Manhattan - type: spearman_manhattan value: 0.6872214410233022 name: Spearman Manhattan - type: pearson_euclidean value: 0.6914785578290242 name: Pearson Euclidean - type: spearman_euclidean value: 0.6905722127311041 name: Spearman Euclidean - type: pearson_dot value: 0.6799233396985102 name: Pearson Dot - type: spearman_dot value: 0.667743621858275 name: Spearman Dot - type: pearson_max value: 0.6942864389866223 name: Pearson Max - type: spearman_max value: 0.6905722127311041 name: Spearman Max - task: type: semantic-similarity name: Semantic Similarity dataset: name: sts test 128 type: sts-test-128 metrics: - type: pearson_cosine value: 0.6891584502617563 name: Pearson Cosine - type: spearman_cosine value: 0.6814103986417178 name: Spearman Cosine - type: pearson_manhattan value: 0.6968187377070036 name: Pearson Manhattan - type: spearman_manhattan value: 0.6920002958564649 name: Spearman Manhattan - type: pearson_euclidean value: 0.7000628001426884 name: Pearson Euclidean - type: spearman_euclidean value: 0.6960243670969477 name: Spearman Euclidean - type: pearson_dot value: 0.6364862920838279 name: Pearson Dot - type: spearman_dot value: 0.6189765115954626 name: 
Spearman Dot - type: pearson_max value: 0.7000628001426884 name: Pearson Max - type: spearman_max value: 0.6960243670969477 name: Spearman Max - task: type: semantic-similarity name: Semantic Similarity dataset: name: sts test 64 type: sts-test-64 metrics: - type: pearson_cosine value: 0.6782226699898293 name: Pearson Cosine - type: spearman_cosine value: 0.6755345411699644 name: Spearman Cosine - type: pearson_manhattan value: 0.6962074727926596 name: Pearson Manhattan - type: spearman_manhattan value: 0.689094339218281 name: Spearman Manhattan - type: pearson_euclidean value: 0.6996133052307816 name: Pearson Euclidean - type: spearman_euclidean value: 0.6937517032138506 name: Spearman Euclidean - type: pearson_dot value: 0.58122590177631 name: Pearson Dot - type: spearman_dot value: 0.5606971476688047 name: Spearman Dot - type: pearson_max value: 0.6996133052307816 name: Pearson Max - type: spearman_max value: 0.6937517032138506 name: Spearman Max --- # SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
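Since this model was trained with MatryoshkaLoss (reflected in the sts-test-256/128/64 metrics), embeddings can be truncated to a leading slice of dimensions and re-normalized before computing cosine similarity. A NumPy sketch with hypothetical random vectors standing in for real sentence embeddings:

```python
import numpy as np

def truncate_and_normalize(emb: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` Matryoshka dimensions and L2-normalize rows."""
    cut = emb[..., :dim]
    return cut / np.linalg.norm(cut, axis=-1, keepdims=True)

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    # For unit vectors the dot product is the cosine similarity
    return float(a @ b)

rng = np.random.default_rng(0)
full = rng.normal(size=(2, 384))  # two hypothetical 384-dim embeddings
for dim in (384, 256, 128, 64):
    a, b = truncate_and_normalize(full, dim)
    print(dim, round(cosine_sim(a, b), 4))
```

Smaller truncation dimensions trade a little similarity quality (as the Spearman scores show) for faster search and smaller indexes.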
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) <!-- at revision 8b3219a92973c328a8e22fadcfa821b5dc75636a --> - **Maximum Sequence Length:** 256 tokens - **Output Dimensionality:** 384 tokens - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. 
```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("Mollel/swahili-all-MiniLM-L6-v2-nli-matryoshka") # Run inference sentences = [ 'Mwanamume aliyevalia koti la bluu la kuzuia upepo, amelala uso chini kwenye benchi ya bustani, akiwa na chupa ya pombe iliyofungwa kwenye mojawapo ya miguu ya benchi.', 'Mwanamume amelala uso chini kwenye benchi ya bustani.', 'Mwanamume fulani anacheza dansi kwenye klabu hiyo akifungua chupa.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Semantic Similarity * Dataset: `sts-test-256` * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator) | Metric | Value | |:--------------------|:-----------| | pearson_cosine | 0.6943 | | **spearman_cosine** | **0.6856** | | pearson_manhattan | 0.6885 | | spearman_manhattan | 0.6872 | | pearson_euclidean | 0.6915 | | spearman_euclidean | 0.6906 | | pearson_dot | 0.6799 | | spearman_dot | 0.6677 | | pearson_max | 0.6943 | | spearman_max | 0.6906 | #### Semantic Similarity * Dataset: `sts-test-128` * Evaluated with 
[<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator) | Metric | Value | |:--------------------|:-----------| | pearson_cosine | 0.6892 | | **spearman_cosine** | **0.6814** | | pearson_manhattan | 0.6968 | | spearman_manhattan | 0.692 | | pearson_euclidean | 0.7001 | | spearman_euclidean | 0.696 | | pearson_dot | 0.6365 | | spearman_dot | 0.619 | | pearson_max | 0.7001 | | spearman_max | 0.696 | #### Semantic Similarity * Dataset: `sts-test-64` * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator) | Metric | Value | |:--------------------|:-----------| | pearson_cosine | 0.6782 | | **spearman_cosine** | **0.6755** | | pearson_manhattan | 0.6962 | | spearman_manhattan | 0.6891 | | pearson_euclidean | 0.6996 | | spearman_euclidean | 0.6938 | | pearson_dot | 0.5812 | | spearman_dot | 0.5607 | | pearson_max | 0.6996 | | spearman_max | 0.6938 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Hyperparameters #### Non-Default Hyperparameters - `per_device_train_batch_size`: 64 - `per_device_eval_batch_size`: 64 - `num_train_epochs`: 1 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `prediction_loss_only`: True - `per_device_train_batch_size`: 64 - `per_device_eval_batch_size`: 64 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `learning_rate`: 5e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 1 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - 
`accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | sts-test-128_spearman_cosine | sts-test-256_spearman_cosine | sts-test-64_spearman_cosine | |:------:|:----:|:-------------:|:----------------------------:|:----------------------------:|:---------------------------:| | 0.0229 | 100 | 12.9498 | - | - | - | | 0.0459 | 200 | 9.9003 | - | - | - | | 0.0688 | 300 | 8.6333 | - | - | - | | 0.0918 | 400 | 8.0124 | - | - | - | | 0.1147 | 500 | 7.2322 | - | - | - | | 0.1376 | 600 | 6.936 | - | - | - | | 0.1606 | 700 | 
7.2855 | - | - | - | | 0.1835 | 800 | 6.5985 | - | - | - | | 0.2065 | 900 | 6.4369 | - | - | - | | 0.2294 | 1000 | 6.2767 | - | - | - | | 0.2524 | 1100 | 6.4011 | - | - | - | | 0.2753 | 1200 | 6.1288 | - | - | - | | 0.2982 | 1300 | 6.1466 | - | - | - | | 0.3212 | 1400 | 5.9279 | - | - | - | | 0.3441 | 1500 | 5.8959 | - | - | - | | 0.3671 | 1600 | 5.5911 | - | - | - | | 0.3900 | 1700 | 5.5258 | - | - | - | | 0.4129 | 1800 | 5.5835 | - | - | - | | 0.4359 | 1900 | 5.4701 | - | - | - | | 0.4588 | 2000 | 5.3888 | - | - | - | | 0.4818 | 2100 | 5.4474 | - | - | - | | 0.5047 | 2200 | 5.1465 | - | - | - | | 0.5276 | 2300 | 5.28 | - | - | - | | 0.5506 | 2400 | 5.4184 | - | - | - | | 0.5735 | 2500 | 5.3811 | - | - | - | | 0.5965 | 2600 | 5.2171 | - | - | - | | 0.6194 | 2700 | 5.3212 | - | - | - | | 0.6423 | 2800 | 5.2493 | - | - | - | | 0.6653 | 2900 | 5.459 | - | - | - | | 0.6882 | 3000 | 5.068 | - | - | - | | 0.7112 | 3100 | 5.1415 | - | - | - | | 0.7341 | 3200 | 5.0764 | - | - | - | | 0.7571 | 3300 | 6.1606 | - | - | - | | 0.7800 | 3400 | 6.1028 | - | - | - | | 0.8029 | 3500 | 5.7441 | - | - | - | | 0.8259 | 3600 | 5.7148 | - | - | - | | 0.8488 | 3700 | 5.4799 | - | - | - | | 0.8718 | 3800 | 5.4396 | - | - | - | | 0.8947 | 3900 | 5.3519 | - | - | - | | 0.9176 | 4000 | 5.2394 | - | - | - | | 0.9406 | 4100 | 5.2311 | - | - | - | | 0.9635 | 4200 | 5.3486 | - | - | - | | 0.9865 | 4300 | 5.215 | - | - | - | | 1.0 | 4359 | - | 0.6814 | 0.6856 | 0.6755 | ### Framework Versions - Python: 3.11.9 - Sentence Transformers: 3.0.1 - Transformers: 4.40.1 - PyTorch: 2.3.0+cu121 - Accelerate: 0.29.3 - Datasets: 2.19.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", 
publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MatryoshkaLoss ```bibtex @misc{kusupati2024matryoshka, title={Matryoshka Representation Learning}, author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi}, year={2024}, eprint={2205.13147}, archivePrefix={arXiv}, primaryClass={cs.LG} } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
null
Non_BioNLP
# SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
{"base_model": "sentence-transformers/all-MiniLM-L6-v2", "datasets": [], "language": [], "library_name": "sentence-transformers", "metrics": ["pearson_cosine", "spearman_cosine", "pearson_manhattan", "spearman_manhattan", "pearson_euclidean", "spearman_euclidean", "pearson_dot", "spearman_dot", "pearson_max", "spearman_max"], "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:557850", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss"], "widget": [{"source_sentence": "Mwanamume aliyepangwa vizuri anasimama kwa mguu mmoja karibu na pwani safi ya bahari.", "sentences": ["mtu anacheka wakati wa kufua nguo", "Mwanamume fulani yuko nje karibu na ufuo wa bahari.", "Mwanamume fulani ameketi kwenye sofa yake."]}, {"source_sentence": "Mwanamume mwenye ngozi nyeusi akivuta sigareti karibu na chombo cha taka cha kijani.", "sentences": ["Karibu na chombo cha taka mwanamume huyo alisimama na kuvuta sigareti", "Kitanda ni chafu.", "Alipokuwa kwenye dimbwi la kuogelea mvulana huyo mwenye ugonjwa wa albino alijihadhari na jua kupita kiasi"]}, {"source_sentence": "Mwanamume kijana mwenye nywele nyekundu anaketi ukutani akisoma gazeti huku mwanamke na msichana mchanga wakipita.", "sentences": ["Mwanamume aliyevalia shati la bluu amegonga ukuta kando ya barabara na gari la bluu na gari nyekundu lenye maji nyuma.", "Mwanamume mchanga anatazama gazeti huku wanawake wawili wakipita karibu naye.", "Mwanamume huyo mchanga analala huku Mama akimwongoza binti yake kwenye bustani."]}, {"source_sentence": "Wasichana wako nje.", "sentences": ["Wasichana wawili wakisafiri kwenye sehemu ya kusisimua.", "Kuna watu watatu wakiongoza gari linaloweza kugeuzwa-geuzwa wakipita watu wengine.", "Wasichana watatu wamesimama pamoja katika chumba, mmoja anasikiliza, mwingine anaandika ukutani na wa tatu anaongea nao."]}, {"source_sentence": "Mwanamume aliyevalia koti la bluu la kuzuia upepo, 
amelala uso chini kwenye benchi ya bustani, akiwa na chupa ya pombe iliyofungwa kwenye mojawapo ya miguu ya benchi.", "sentences": ["Mwanamume amelala uso chini kwenye benchi ya bustani.", "Mwanamke anaunganisha uzi katika mipira kando ya rundo la mipira", "Mwanamume fulani anacheza dansi kwenye klabu hiyo akifungua chupa."]}], "model-index": [{"name": "SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2", "results": [{"task": {"type": "semantic-similarity", "name": "Semantic Similarity"}, "dataset": {"name": "sts test 256", "type": "sts-test-256"}, "metrics": [{"type": "pearson_cosine", "value": 0.6942864389866223, "name": "Pearson Cosine"}, {"type": "spearman_cosine", "value": 0.6856061049537777, "name": "Spearman Cosine"}, {"type": "pearson_manhattan", "value": 0.6885375818451587, "name": "Pearson Manhattan"}, {"type": "spearman_manhattan", "value": 0.6872214410233022, "name": "Spearman Manhattan"}, {"type": "pearson_euclidean", "value": 0.6914785578290242, "name": "Pearson Euclidean"}, {"type": "spearman_euclidean", "value": 0.6905722127311041, "name": "Spearman Euclidean"}, {"type": "pearson_dot", "value": 0.6799233396985102, "name": "Pearson Dot"}, {"type": "spearman_dot", "value": 0.667743621858275, "name": "Spearman Dot"}, {"type": "pearson_max", "value": 0.6942864389866223, "name": "Pearson Max"}, {"type": "spearman_max", "value": 0.6905722127311041, "name": "Spearman Max"}]}, {"task": {"type": "semantic-similarity", "name": "Semantic Similarity"}, "dataset": {"name": "sts test 128", "type": "sts-test-128"}, "metrics": [{"type": "pearson_cosine", "value": 0.6891584502617563, "name": "Pearson Cosine"}, {"type": "spearman_cosine", "value": 0.6814103986417178, "name": "Spearman Cosine"}, {"type": "pearson_manhattan", "value": 0.6968187377070036, "name": "Pearson Manhattan"}, {"type": "spearman_manhattan", "value": 0.6920002958564649, "name": "Spearman Manhattan"}, {"type": "pearson_euclidean", "value": 0.7000628001426884, "name": "Pearson 
Euclidean"}, {"type": "spearman_euclidean", "value": 0.6960243670969477, "name": "Spearman Euclidean"}, {"type": "pearson_dot", "value": 0.6364862920838279, "name": "Pearson Dot"}, {"type": "spearman_dot", "value": 0.6189765115954626, "name": "Spearman Dot"}, {"type": "pearson_max", "value": 0.7000628001426884, "name": "Pearson Max"}, {"type": "spearman_max", "value": 0.6960243670969477, "name": "Spearman Max"}]}, {"task": {"type": "semantic-similarity", "name": "Semantic Similarity"}, "dataset": {"name": "sts test 64", "type": "sts-test-64"}, "metrics": [{"type": "pearson_cosine", "value": 0.6782226699898293, "name": "Pearson Cosine"}, {"type": "spearman_cosine", "value": 0.6755345411699644, "name": "Spearman Cosine"}, {"type": "pearson_manhattan", "value": 0.6962074727926596, "name": "Pearson Manhattan"}, {"type": "spearman_manhattan", "value": 0.689094339218281, "name": "Spearman Manhattan"}, {"type": "pearson_euclidean", "value": 0.6996133052307816, "name": "Pearson Euclidean"}, {"type": "spearman_euclidean", "value": 0.6937517032138506, "name": "Spearman Euclidean"}, {"type": "pearson_dot", "value": 0.58122590177631, "name": "Pearson Dot"}, {"type": "spearman_dot", "value": 0.5606971476688047, "name": "Spearman Dot"}, {"type": "pearson_max", "value": 0.6996133052307816, "name": "Pearson Max"}, {"type": "spearman_max", "value": 0.6937517032138506, "name": "Spearman Max"}]}]}]}
task
[ "TEXT_CLASSIFICATION", "SEMANTIC_SIMILARITY" ]
42,938
sacreemure/med_t5_summ_ru
sacreemure
summarization
[ "transformers", "safetensors", "t5", "text2text-generation", "summarization", "medicine", "ru", "dataset:sacreemure/rmj_covid", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
2024-04-02T04:57:17Z
2024-06-02T17:04:53+00:00
28
0
--- datasets: sacreemure/rmj_covid language: - ru library_name: transformers metrics: - bertscore - bleu - rougel - rouge1 - rouge2 pipeline_tag: summarization tags: - summarization - t5 - medicine finetuned from: IlyaGusev/rut5_base_sum_gazeta widget: - text: Сахарный диабет (СД) – острейшая медико-социальная проблема современности. В большинстве стран распространенность СД превысила эпидемический уровень 10% и продолжает увеличиваться. При этом более 90% диабетической популяции составляют больные СД 2–го типа. Крайне негативное влияние диабета на здоровье населения и экономику в значительной мере связано с его осложнениями (имеются в виду функциональная дезорганизация и разрушительное влияние на ткани, органы и системы). С диабетом ассоциируются артериальная гипертония (АГ), ишемическая болезнь сердца (ИБС), цереброваскулярная болезнь (ЦВБ), ожирение, ретинопатия, нефропатия, гангрена нижних конечностей и другая патология. Традиционно считается, что в основе осложнений СД лежит повреждение сосудов. Однако есть мнение, что наиболее частым и самым ранним последствием диабета, по крайней мере СД 2-го типа, является поражение нервной системы в виде диабетической энцефаломиелопатии (ДЭМ), диабетической автономной невропатии (ДАН) и диабетической периферической сенсорно-моторной полиневропатии (ДПСМП), которые, будучи самостоятельными клиническими синдромами, одновременно служат предикторами и факторами риска развития другой, ассоциированной с СД 2-го типа, патологии (схема 1). Приведены данные о том, что субклиническая невропатия появляется уже через 1 год после манифестации СД 2-го типа, а у некоторых больных обнаруживается до манифестации [1, 5, 8]. example_title: Медицинская статья --- # MedT5SummRu <!-- Provide a quick summary of what the model is/does. --> ## Model Details Seq2seq model for abstractive summarization of Russian biomedical texts. ### Model Description <!-- Provide a longer summary of what this model is. 
--> - **Model type:** seq2seq - **Language:** Russian - **Finetuned from model:** IlyaGusev/rut5_base_sum_gazeta
null
BioNLP
{"datasets": "sacreemure/rmj_covid", "language": ["ru"], "library_name": "transformers", "metrics": ["bertscore", "bleu", "rougel", "rouge1", "rouge2"], "pipeline_tag": "summarization", "tags": ["summarization", "t5", "medicine"], "finetuned from": "IlyaGusev/rut5_base_sum_gazeta", "widget": [{"text": "Сахарный диабет (СД) – острейшая медико-социальная проблема современности. В большинстве стран распространенность СД превысила эпидемический уровень 10% и продолжает увеличиваться. При этом более 90% диабетической популяции составляют больные СД 2–го типа. Крайне негативное влияние диабета на здоровье населения и экономику в значительной мере связано с его осложнениями (имеются в виду функциональная дезорганизация и разрушительное влияние на ткани, органы и системы). С диабетом ассоциируются артериальная гипертония (АГ), ишемическая болезнь сердца (ИБС), цереброваскулярная болезнь (ЦВБ), ожирение, ретинопатия, нефропатия, гангрена нижних конечностей и другая патология. Традиционно считается, что в основе осложнений СД лежит повреждение сосудов. Однако есть мнение, что наиболее частым и самым ранним последствием диабета, по крайней мере СД 2-го типа, является поражение нервной системы в виде диабетической энцефаломиелопатии (ДЭМ), диабетической автономной невропатии (ДАН) и диабетической периферической сенсорно-моторной полиневропатии (ДПСМП), которые, будучи самостоятельными клиническими синдромами, одновременно служат предикторами и факторами риска развития другой, ассоциированной с СД 2-го типа, патологии (схема 1). Приведены данные о том, что субклиническая невропатия появляется уже через 1 год после манифестации СД 2-го типа, а у некоторых больных обнаруживается до манифестации [1, 5, 8].", "example_title": "Медицинская статья"}]}
task
[ "SUMMARIZATION" ]
42,939
magnustragardh/marian-finetuned-kde4-en-to-fr
magnustragardh
translation
[ "transformers", "pytorch", "tensorboard", "marian", "text2text-generation", "translation", "generated_from_trainer", "dataset:kde4", "base_model:Helsinki-NLP/opus-mt-en-fr", "base_model:finetune:Helsinki-NLP/opus-mt-en-fr", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-08-16T19:09:14Z
2023-08-17T18:33:40+00:00
16
0
--- base_model: Helsinki-NLP/opus-mt-en-fr datasets: - kde4 license: apache-2.0 metrics: - bleu tags: - translation - generated_from_trainer model-index: - name: marian-finetuned-kde4-en-to-fr results: - task: type: text2text-generation name: Sequence-to-sequence Language Modeling dataset: name: kde4 type: kde4 config: en-fr split: train args: en-fr metrics: - type: bleu value: 52.87878984885333 name: Bleu --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # marian-finetuned-kde4-en-to-fr This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset. It achieves the following results on the evaluation set: - Loss: 0.8556 - Bleu: 52.8788 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # marian-finetuned-kde4-en-to-fr This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset. It achieves the following results on the evaluation set: - Loss: 0.8556 - Bleu: 52.8788 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
{"base_model": "Helsinki-NLP/opus-mt-en-fr", "datasets": ["kde4"], "license": "apache-2.0", "metrics": ["bleu"], "tags": ["translation", "generated_from_trainer"], "model-index": [{"name": "marian-finetuned-kde4-en-to-fr", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "kde4", "type": "kde4", "config": "en-fr", "split": "train", "args": "en-fr"}, "metrics": [{"type": "bleu", "value": 52.87878984885333, "name": "Bleu"}]}]}]}
task
[ "TRANSLATION" ]
42,940
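The marian-finetuned-kde4-en-to-fr card above specifies `lr_scheduler_type: linear` with `learning_rate: 2e-05` and no warmup. A dependency-free sketch of what that schedule does over training (the step count per epoch is an assumption for illustration; the actual count depends on the dataset size and batch size):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    # get_linear_schedule_with_warmup-style decay: rise to base_lr over
    # warmup_steps, then decay linearly to 0 at total_steps.
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

total = 3 * 1000  # 3 epochs x an assumed 1000 optimizer steps per epoch
print(linear_lr(0, total))           # → 2e-05: full rate at the start (no warmup)
print(linear_lr(total // 2, total))  # → 1e-05: half the base rate at the midpoint
print(linear_lr(total, total))       # → 0.0: fully decayed at the end
```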
fine-tuned/NFCorpus-512-192-gpt-4o-2024-05-13-76823162
fine-tuned
feature-extraction
[ "sentence-transformers", "safetensors", "bert", "feature-extraction", "sentence-similarity", "mteb", "en", "dataset:fine-tuned/NFCorpus-512-192-gpt-4o-2024-05-13-76823162", "dataset:allenai/c4", "license:apache-2.0", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
2024-05-28T18:53:49Z
2024-05-28T18:54:20+00:00
9
0
--- datasets: - fine-tuned/NFCorpus-512-192-gpt-4o-2024-05-13-76823162 - allenai/c4 language: - en - en license: apache-2.0 pipeline_tag: feature-extraction tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb --- This model is a fine-tuned version of [**BAAI/bge-large-en-v1.5**](https://huggingface.co/BAAI/bge-large-en-v1.5) designed for the following use case: None ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/NFCorpus-512-192-gpt-4o-2024-05-13-76823162', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
null
Non_BioNLP
This model is a fine-tuned version of [**BAAI/bge-large-en-v1.5**](https://huggingface.co/BAAI/bge-large-en-v1.5) designed for the following use case: None ## How to Use This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started: ```python from sentence_transformers import SentenceTransformer from sentence_transformers.util import cos_sim model = SentenceTransformer( 'fine-tuned/NFCorpus-512-192-gpt-4o-2024-05-13-76823162', trust_remote_code=True ) embeddings = model.encode([ 'first text to embed', 'second text to embed' ]) print(cos_sim(embeddings[0], embeddings[1])) ```
{"datasets": ["fine-tuned/NFCorpus-512-192-gpt-4o-2024-05-13-76823162", "allenai/c4"], "language": ["en", "en"], "license": "apache-2.0", "pipeline_tag": "feature-extraction", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "mteb"]}
task
[ "TEXT_CLASSIFICATION" ]
42,941
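The `cos_sim` call in the sentence-similarity snippet above reduces to the standard cosine formula, dot(u, v) / (|u| · |v|); a dependency-free sketch of the same quantity on plain Python lists:

```python
import math

def cos_sim(u, v):
    # Cosine similarity: dot product over the product of vector norms —
    # the same quantity sentence_transformers.util.cos_sim computes on tensors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

print(cos_sim([1.0, 0.0], [1.0, 0.0]))            # → 1.0: identical direction
print(round(cos_sim([1.0, 0.0], [0.0, 1.0]), 6))  # → 0.0: orthogonal vectors
```

Because embeddings that point the same way score 1.0 regardless of magnitude, cosine similarity is the usual choice for comparing sentence embeddings.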
RichardErkhov/AI-Sweden-Models_-_gpt-sw3-356m-8bits
RichardErkhov
text-generation
[ "transformers", "safetensors", "gpt2", "text-generation", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "8-bit", "bitsandbytes", "region:us" ]
2024-07-20T11:12:15Z
2024-07-20T11:12:39+00:00
78
0
--- {} --- Quantization made by Richard Erkhov. [Github](https://github.com/RichardErkhov) [Discord](https://discord.gg/pvy7H8DZMG) [Request more models](https://github.com/RichardErkhov/quant_request) gpt-sw3-356m - bnb 8bits - Model creator: https://huggingface.co/AI-Sweden-Models/ - Original model: https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m/ Original model description: --- license: other language: - da - sv - 'no' - en - is --- # Model description [AI Sweden](https://huggingface.co/AI-Sweden-Models/) **Base models** [GPT-Sw3 126M](https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m/) | [GPT-Sw3 356M](https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m/) | [GPT-Sw3 1.3B](https://huggingface.co/AI-Sweden-Models/gpt-sw3-1.3b/) [GPT-Sw3 6.7B](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b/) | [GPT-Sw3 6.7B v2](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2/) | [GPT-Sw3 20B](https://huggingface.co/AI-Sweden-Models/gpt-sw3-20b/) [GPT-Sw3 40B](https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b/) **Instruct models** [GPT-Sw3 126M Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m-instruct/) | [GPT-Sw3 356M Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m-instruct/) | [GPT-Sw3 1.3B Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-1.3b-instruct/) [GPT-Sw3 6.7B v2 Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct/) | [GPT-Sw3 20B Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-20b-instruct/) **Quantized models** [GPT-Sw3 6.7B v2 Instruct 4-bit gptq](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct-4bit-gptq) | [GPT-Sw3 20B Instruct 4-bit gptq](https://huggingface.co/AI-Sweden-Models/gpt-sw3-20b-instruct-4bit-gptq) GPT-SW3 is a collection of large decoder-only pretrained transformer language models that were developed by AI Sweden in collaboration with RISE and the WASP WARA for Media and Language. 
GPT-SW3 has been trained on a dataset containing 320B tokens in Swedish, Norwegian, Danish, Icelandic, English, and programming code. The model was pretrained using a causal language modeling (CLM) objective utilizing the NeMo Megatron GPT implementation. # Intended use GPT-SW3 is an autoregressive large language model that is capable of generating coherent text in 5 different languages, and 4 programming languages. GPT-SW3 can also be instructed to perform text tasks that it has not been explicitly trained for, by casting them as text generation tasks. AI Sweden shares GPT-SW3 in a controlled pre-release with organizations and individuals in the Nordic NLP ecosystem who can contribute to the validation and testing of the models and provide feedback to the community. This is an important step in the process of validating the model and collecting feedback on both what works well and what does not. # Limitations Like other large language models for which the diversity (or lack thereof) of training data induces downstream impact on the quality of our model, GPT-SW3 has limitations in terms of for example bias and safety. GPT-SW3 can also have quality issues in terms of generation diversity and hallucination. By releasing with the modified RAIL license, we also hope to increase communication, transparency, and the study of large language models. The model may: overrepresent some viewpoints and underrepresent others, contain stereotypes, generate hateful, abusive, violent, discriminatory or prejudicial language. The model may make errors, including producing incorrect information as if it were factual, it may generate irrelevant or repetitive outputs, and content that may not be appropriate for all settings, including sexual content. # How to use To be able to access the model from Python, since this is a private repository, you have to log in with your access token. 
This can be done with `huggingface-cli login`, see [HuggingFace Quick Start Guide](https://huggingface.co/docs/huggingface_hub/quick-start#login) for more information. The following code snippet loads our tokenizer & model, and uses the GPU if available. ```python import torch from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM # Initialize Variables model_name = "AI-Sweden-Models/gpt-sw3-356m" device = "cuda:0" if torch.cuda.is_available() else "cpu" prompt = "Träd är fina för att" # Initialize Tokenizer & Model tokenizer = AutoTokenizer.from_pretrained(model_name) model = AutoModelForCausalLM.from_pretrained(model_name) model.eval() model.to(device) ``` Generating text using the `generate` method is done as follows: ```python input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"].to(device) generated_token_ids = model.generate( inputs=input_ids, max_new_tokens=100, do_sample=True, temperature=0.6, top_p=1, )[0] generated_text = tokenizer.decode(generated_token_ids) ``` A convenient alternative to the `generate` method is the HuggingFace pipeline, which handles most of the work for you: ```python generator = pipeline('text-generation', tokenizer=tokenizer, model=model, device=device) generated = generator(prompt, max_new_tokens=100, do_sample=True, temperature=0.6, top_p=1)[0]["generated_text"] ``` # Compliance The release of GPT-SW3 consists of model weights, a configuration file, a tokenizer file and a vocabulary file. None of these files contain any personally identifiable information (PII) or any copyrighted material. # GPT-SW3 Model Card Following Mitchell et al. (2018), we provide a model card for GPT-SW3. # Model Details - Person or organization developing model: GPT-SW3 was developed by AI Sweden in collaboration with RISE and the WASP WARA for Media and Language. - Model date: GPT-SW3 date of release 2022-12-20 - Model version: This is the second generation of GPT-SW3. 
- Model type: GPT-SW3 is a large decoder-only transformer language model. - Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: GPT-SW3 was trained with the NeMo Megatron GPT implementation. - Paper or other resource for more information: N/A. - License: [LICENSE](https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m/blob/main/LICENSE). - Where to send questions or comments about the model: [email protected] # Intended Use - Primary intended uses: We pre-release GPT-SW3 for research and evaluation of the capabilities of Large Language Models for the Nordic languages. This is an important step in the process of knowledge building for LLMs, validating the model and collecting feedback on both what works well and what does not. - Primary intended users: Organizations and individuals in the Nordic NLP ecosystem who can contribute to the validation and testing of the models and provide feedback to the community. - Out-of-scope use cases: See the modified RAIL license. # Data, Limitations, and Recommendations - Data selection for training: Training data for GPT-SW3 was selected based on a combination of breadth and availability. See our Datasheet for more detailed information on the data used to train our model. - Data selection for evaluation: N/A - Limitations: Like other large language models for which the diversity (or lack thereof) of training data induces downstream impact on the quality of our model, GPT-SW3 has limitations in terms of bias and safety. GPT-SW3 can also have quality issues in terms of generation diversity and hallucination. In general, GPT-SW3 is not immune from the plethora of issues that plague modern large language models. By releasing with the modified RAIL license, we also hope to increase communication, transparency, and the study of large language models. The model may: Overrepresent some viewpoints and underrepresent others. Contain stereotypes. 
Generate: Hateful, abusive, or violent language. Discriminatory or prejudicial language. Content that may not be appropriate for all settings, including sexual content. Make errors, including producing incorrect information as if it were factual. Generate irrelevant or repetitive outputs. - Recommendations for future work: Indirect users should be made aware when the content they're working with is created by the LLM. Users should be aware of Risks and Limitations, and include an appropriate age disclaimer or blocking interface as necessary. Models pretrained with the LLM should include an updated Model Card. Users of the model should provide mechanisms for those affected to provide feedback, such as an email address for comments. - We hope that the release of GPT-SW3, as well as information around our model training process, will increase open science around both large language models in specific and natural language processing and deep learning in general. # GPT-SW3 Datasheet - We follow the recommendations of Gebru et al. (2021) and provide a datasheet for the dataset used to train GPT-SW3. # Motivation - For what purpose was the dataset created? Was there a specific task in mind? Was there a specific gap that needed to be filled? Please provide a description. Pre-training of Large Language Models (LLM), such as GPT-3 (T. B. Brown et al., 2020), Gopher (J. W. Rae et al., 2022), BLOOM (T. L. Scao et al., 2022), etc. require 100s or even 1000s GBs of text data, with recent studies (Chinchilla: J. Hoffmann et al., 2022) suggesting that the scale of the training data is even more important than previously imagined. Therefore, in order to train Swedish LLMs, we needed a large scale Swedish dataset of high quality. Since no such datasets existed before this initiative, we collected data in the Nordic and English languages. - Who created the dataset (e.g., which team, research group) and on behalf of which entity (e.g., company, institution, organization)? 
The Strategic Initiative Natural Language Understanding at AI Sweden has established a new research environment in which collaboration is key. The core team working on the creation of the dataset is the NLU research group at AI Sweden. This group consists of researchers and developers from AI Sweden (Lindholmen Science Park AB) and RISE. - Who funded the creation of the dataset? If there is an associated grant, please provide the name of the grantor and the grant name and number. The Swedish Innovation Agency (Vinnova) has funded this work across several different grants, including 2019-02996 and 2022-00949. - Any other comments? No. # Composition - What do the instances that comprise the dataset represent (e.g., documents, photos, people, countries)? Are there multiple types of instances (e.g., movies, users, and ratings; people and interactions between them; nodes and edges)? Please provide a description. The instances are textual documents categorized by language and document type. The dataset is a filtered and deduplicated collection that includes the following sources: - Books - Litteraturbanken (https://litteraturbanken.se/) - The Pile - Articles - Diva (https://www.diva-portal.org/) - The Pile: PubMed - The Pile: ArXiv - Code - Code Parrot: Github code (https://huggingface.co/datasets/codeparrot/github-code) - Conversational - Familjeliv (https://www.familjeliv.se/) - Flashback (https://flashback.se/) - Datasets collected through Parlai (see Appendix in data paper for complete list) (https://github.com/facebookresearch/ParlAI) - Pushshift.io Reddit dataset, developed in Baumgartner et al. (2020) and processed in Roller et al. (2021) - Math - English Math dataset generated with code from DeepMind (D. 
Saxton et al., 2019) - Swedish Math dataset, generated as above with manually translated templates - Miscellaneous - Summarization data (https://www.ida.liu.se/~arnjo82/papers/clarin-21-julius.pdf) - OPUS, the open parallel corpus (https://opus.nlpl.eu/) - Movie scripts (https://github.com/Aveek-Saha/Movie-Script-Database) - Natural Instructions (https://github.com/allenai/natural-instructions) - P3 (Public Pool of Prompts), (https://huggingface.co/datasets/bigscience/P3) - The Norwegian Colossal Corpus (https://huggingface.co/datasets/NbAiLab/NCC) - Danish Gigaword (https://gigaword.dk/) - Icelandic Gigaword (https://clarin.is/en/resources/gigaword/) - The Pile: Stack Exchange - Web Common Crawl - Web data from the project LES (Linguistic Explorations of Societies, https://les.gu.se). - Multilingual C4 (MC4), prepared by AllenAI from C4 (C. Raffel et al., 2019) - Open Super-large Crawled Aggregated coRpus (OSCAR) (P. O. Suarez, 2019) - The Pile: Open Web Text - Web Sources - Various public Swedish website scrapes (see Appendix in data paper) - Familjeliv Articles - Public Swedish Job Ads from JobTech/Arbetsförmedlingen - Wikipedia - Official Wikipedia dumps - How many instances are there in total (of each type, if appropriate)? The training data consists of 1.1TB UTF-8 encoded text, containing 660M documents with a total of 320B tokens. - Does the dataset contain all possible instances or is it a sample (not necessarily random) of instances from a larger set? If the dataset is a sample, then what is the larger set? Is the sample representative of the larger set (e.g., geographic coverage)? If so, please describe how this representativeness was validated/verified. If it is not representative of the larger set, please describe why not (e.g., to cover a more diverse range of instances, because instances were withheld or unavailable). 
The subset of our dataset that comes from multilingual Common Crawl datasets (MC4, Oscar), are filtered by language to only include Swedish, Norwegian, Danish, and Icelandic. From The Pile, we included only the parts that typically are of highest textual quality or complemented the rest of our dataset with sources we otherwise lacked (e.g. books). The remainder of the dataset was collected from the above sources. - What data does each instance consist of? “Raw” data (e.g., unprocessed text or images) or features? In either case, please provide a description. Each instance consists of raw text data. - Is there a label or target associated with each instance? If so, please provide a description. No. - Is any information missing from individual instances? If so, please provide a description, explaining why this information is missing (e.g., because it was unavailable). This does not include intentionally removed information, but might include, e.g., redacted text. No. - Are relationships between individual instances made explicit (e.g., users’ movie ratings, social network links)? If so, please describe how these relationships are made explicit. There are no explicit relationships between individual instances. - Are there recommended data splits (e.g., training, development/validation, testing)? If so, please provide a description of these splits, explaining the rationale behind them. There are no explicit splits recommended for this dataset. When pre-training the model, a random split for train, dev, test is set to 99.99%, 0.08%, 0.02% respectively, and is sampled proportionally to each subset’s weight and size. The weight of each subset was manually decided beforehand. These decisions were made considering the data’s value, source, and language, to form a representative and balanced pre-training corpus. - Are there any errors, sources of noise, or redundancies in the dataset? If so, please provide a description. 
The dataset is a collection of many sources, some of which naturally contain some overlap. Although we have performed deduplication, some overlap may still remain. Furthermore, there may be some noise remaining from artifacts originating in Common Crawl datasets, that have been missed by our data filtering process. Except for these, we are not aware of any errors, sources of noise, or redundancies. - Is the dataset self-contained, or does it link to or otherwise rely on external resources (e.g., websites, tweets, other datasets)? The dataset is self-contained. - Does the dataset contain data that, if viewed directly, might be offensive, insulting, threatening, or might otherwise cause anxiety? If so, please describe why. The dataset contains subsets of public Common Crawl, Reddit, Familjeliv and Flashback. These could contain sentences that, if viewed directly, might be offensive, insulting, threatening, or might otherwise cause anxiety. - Does the dataset relate to people? If not, you may skip the remaining questions in this section. Some documents of this data relate to people, such as news articles, Wikipedia descriptions, etc. - Does the dataset identify any subpopulations (e.g., by age, gender)? If so, please describe how these subpopulations are identified and provide a description of their respective distributions within the dataset. No, the dataset does not explicitly include subpopulation identification. - Any other comments? No. # Collection Process - How was the data associated with each instance acquired? Was the data directly observable (e.g., raw text, movie ratings), reported by subjects (e.g., survey responses), or indirectly inferred/derived from other data (e.g., part-of-speech tags, model-based guesses for age or language)? If data was reported by subjects or indirectly inferred/derived from other data, was the data validated/verified? If so, please describe how. N/A. The dataset is a union of publicly available datasets and sources. 
- What mechanisms or procedures were used to collect the data (e.g., hardware apparatus or sensor, manual human curation, software program, software API)? How were these mechanisms or procedures validated? The data was downloaded from the internet. - If the dataset is a sample from a larger set, what was the sampling strategy (e.g., deterministic, probabilistic with specific sampling probabilities)? Please see previous answers for how parts of the dataset were selected. - Who was involved in the data collection process (e.g., students, crowdworkers, contractors) and how were they compensated (e.g., how much were crowdworkers paid)? This data is mined, filtered and sampled by machines. - Over what timeframe was the data collected? Does this timeframe match the creation timeframe of the data associated with the instances (e.g., recent crawl of old news articles)? If not, please describe the timeframe in which the data associated with the instances was created. The dataset was collected during the period June 2021 to June 2022. The creation of the collected sources varies, with e.g. Common Crawl data that have been continuously collected over 12 years. - Does the dataset relate to people? If not, you may skip the remainder of the questions in this section. Yes. The texts have been produced by people. Any personal information potentially present in publicly available data sources and thus in the created dataset is of no interest to the collection and use of the dataset. - Has an analysis of the potential impact of the dataset and its use on data subjects (e.g., a data protection impact analysis) been conducted? If so, please provide a description of this analysis, including the outcomes, as well as a link or other access point to any supporting documentation. Yes. - Any other comments? No. 
- Preprocessing/cleaning/labeling - Was any preprocessing/cleaning/labeling of the data done (e.g., discretization or bucketing, tokenization, part-of-speech tagging, SIFT feature extraction, removal of instances, processing of missing values)? If so, please provide a description. If not, you may skip the remainder of the questions in this section. The dataset was filtered and re-formatted on a document-level using standard procedures, inspired by the work in The BigScience ROOTS Corpus (H. Laurençon et al., 2022) and Gopher (J. W. Rae et al., 2022). This was done with the goal of achieving a consistent text format throughout the dataset, and to remove documents that did not meet our textual quality requirements (e.g. repetitiveness). Furthermore, the dataset was deduplicated to remedy the overlap between collected subsets using the MinHash algorithm, similar to the method used in GPT-3 and The Pile, and described in greater detail in “Deduplicating Training Data Makes Language Models Better” (K. Lee et al., 2021). - Was the “raw” data saved in addition to the preprocessed/cleaned/labeled data (e.g., to support unanticipated future uses)? If so, please provide a link or other access point to the “raw” data. The “raw” component datasets are publicly available in their respective locations. - Any other comments? No. # Uses - Has the dataset been used for any tasks already? If so, please provide a description. The dataset was used to pre-train the GPT-SW3 models. - Is there a repository that links to any or all papers or systems that use the dataset? If so, please provide a link or other access point. N/A. - What (other) tasks could the dataset be used for? The data can be used to pre-train language models, which are foundations for many current and future language tasks. - Is there anything about the composition of the dataset or the way it was collected and preprocessed/cleaned/labeled that might impact future uses? 
For example, is there anything that a future user might need to know to avoid uses that could result in unfair treatment of individuals or groups (e.g., stereotyping, quality of service issues) or other undesirable harms (e.g., financial harms, legal risks)? If so, please provide a description. Is there anything a future user could do to mitigate these undesirable harms? The dataset is probably quite representative of Swedish internet discourse in general, and of the Swedish public sector, but we know that this data does not necessarily reflect the entire Swedish population. - Are there tasks for which the dataset should not be used? If so, please provide a description. None that we are currently aware of. - Any other comments? No. # Distribution - Will the dataset be distributed to third parties outside of the entity (e.g., company, institution, organization) on behalf of which the dataset was created? If so, please provide a description. No. - How will the dataset be distributed (e.g., tarball on website, API, GitHub)? Does the dataset have a digital object identifier (DOI)? N/A. - When will the dataset be distributed? N/A. - Will the dataset be distributed under a copyright or other intellectual property (IP) license, and/or under applicable terms of use (ToU)? If so, please describe this license and/or ToU, and provide a link or other access point to, or otherwise reproduce, any relevant licensing terms or ToU, as well as any fees associated with these restrictions. N/A. - Do any export controls or other regulatory restrictions apply to the dataset or to individual instances? If so, please describe these restrictions, and provide a link or other access point to, or otherwise reproduce, any supporting documentation. N/A. - Any other comments? No. # Maintenance - Who is supporting/hosting/maintaining the dataset? AI Sweden at Lindholmen Science Park AB. - How can the owner/curator/manager of the dataset be contacted (e.g., email address)? 
[email protected] - Is there an erratum? If so, please provide a link or other access point. N/A. - Will the dataset be updated (e.g., to correct labeling errors, add new instances, delete instances)? If so, please describe how often, by whom, and how updates will be communicated to users (e.g., mailing list, GitHub)? Currently, there are no plans for updating the dataset. - If the dataset relates to people, are there applicable limits on the retention of the data associated with the instances (e.g., were individuals in question told that their data would be retained for a fixed period of time and then deleted)? If so, please describe these limits and explain how they will be enforced. Read the privacy policy for the NLU initiative at AI Sweden [here](https://www.ai.se/en/privacy-policy-nlu). - Will older versions of the dataset continue to be supported/hosted/maintained? If so, please describe how. If not, please describe how its obsolescence will be communicated to users. N/A. - If others want to extend/augment/build on/contribute to the dataset, is there a mechanism for them to do so? If so, please provide a description. Will these contributions be validated/ verified? If so, please describe how. If not, why not? Is there a process for communicating/ distributing these contributions to other users? If so, please provide a description. Not at this time. - Any other comments? No.
null
Non_BioNLP
Quantization made by Richard Erkhov. [Github](https://github.com/RichardErkhov) [Discord](https://discord.gg/pvy7H8DZMG) [Request more models](https://github.com/RichardErkhov/quant_request) gpt-sw3-356m - bnb 8bits - Model creator: https://huggingface.co/AI-Sweden-Models/ - Original model: https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m/ Original model description: --- license: other language: - da - sv - 'no' - en - is --- # Model description [AI Sweden](https://huggingface.co/AI-Sweden-Models/) **Base models** [GPT-Sw3 126M](https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m/) | [GPT-Sw3 356M](https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m/) | [GPT-Sw3 1.3B](https://huggingface.co/AI-Sweden-Models/gpt-sw3-1.3b/) [GPT-Sw3 6.7B](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b/) | [GPT-Sw3 6.7B v2](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2/) | [GPT-Sw3 20B](https://huggingface.co/AI-Sweden-Models/gpt-sw3-20b/) [GPT-Sw3 40B](https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b/) **Instruct models** [GPT-Sw3 126M Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m-instruct/) | [GPT-Sw3 356M Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m-instruct/) | [GPT-Sw3 1.3B Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-1.3b-instruct/) [GPT-Sw3 6.7B v2 Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct/) | [GPT-Sw3 20B Instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-20b-instruct/) **Quantized models** [GPT-Sw3 6.7B v2 Instruct 4-bit gptq](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct-4bit-gptq) | [GPT-Sw3 20B Instruct 4-bit gptq](https://huggingface.co/AI-Sweden-Models/gpt-sw3-20b-instruct-4bit-gptq) GPT-SW3 is a collection of large decoder-only pretrained transformer language models that were developed by AI Sweden in collaboration with RISE and the WASP WARA for Media and Language. 
GPT-SW3 has been trained on a dataset containing 320B tokens in Swedish, Norwegian, Danish, Icelandic, English, and programming code. The model was pretrained using a causal language modeling (CLM) objective utilizing the NeMo Megatron GPT implementation. # Intended use GPT-SW3 is an autoregressive large language model that is capable of generating coherent text in 5 different languages, and 4 programming languages. GPT-SW3 can also be instructed to perform text tasks that it has not been explicitly trained for, by casting them as text generation tasks. AI Sweden shares GPT-SW3 in a controlled pre-release with organizations and individuals in the Nordic NLP ecosystem who can contribute to the validation and testing of the models and provide feedback to the community. This is an important step in the process of validating the model and collecting feedback on both what works well and what does not. # Limitations Like other large language models for which the diversity (or lack thereof) of training data induces downstream impact on the quality of our model, GPT-SW3 has limitations in terms of for example bias and safety. GPT-SW3 can also have quality issues in terms of generation diversity and hallucination. By releasing with the modified RAIL license, we also hope to increase communication, transparency, and the study of large language models. The model may: overrepresent some viewpoints and underrepresent others, contain stereotypes, generate hateful, abusive, violent, discriminatory or prejudicial language. The model may make errors, including producing incorrect information as if it were factual, it may generate irrelevant or repetitive outputs, and content that may not be appropriate for all settings, including sexual content. # How to use To be able to access the model from Python, since this is a private repository, you have to log in with your access token. 
This can be done with `huggingface-cli login`, see [HuggingFace Quick Start Guide](https://huggingface.co/docs/huggingface_hub/quick-start#login) for more information.

The following code snippet loads our tokenizer & model, and uses the GPU if available.

```python
import torch
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Initialize Variables
model_name = "AI-Sweden-Models/gpt-sw3-356m"
device = "cuda:0" if torch.cuda.is_available() else "cpu"
prompt = "Träd är fina för att"

# Initialize Tokenizer & Model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()
model.to(device)
```

Generating text using the `generate` method is done as follows:

```python
input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"].to(device)

generated_token_ids = model.generate(
    inputs=input_ids,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.6,
    top_p=1,
)[0]

generated_text = tokenizer.decode(generated_token_ids)
```

A convenient alternative to the `generate` method is the HuggingFace pipeline, which handles most of the work for you:

```python
generator = pipeline('text-generation', tokenizer=tokenizer, model=model, device=device)
generated = generator(prompt, max_new_tokens=100, do_sample=True, temperature=0.6, top_p=1)[0]["generated_text"]
```

# Compliance

The release of GPT-SW3 consists of model weights, a configuration file, a tokenizer file and a vocabulary file. None of these files contain any personally identifiable information (PII) or any copyrighted material.

# GPT-SW3 Model Card

Following Mitchell et al. (2018), we provide a model card for GPT-SW3.

# Model Details

- Person or organization developing model: GPT-SW3 was developed by AI Sweden in collaboration with RISE and the WASP WARA for Media and Language.
- Model date: GPT-SW3 date of release 2022-12-20
- Model version: This is the second generation of GPT-SW3.
- Model type: GPT-SW3 is a large decoder-only transformer language model. - Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: GPT-SW3 was trained with the NeMo Megatron GPT implementation. - Paper or other resource for more information: N/A. - License: [LICENSE](https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m/blob/main/LICENSE). - Where to send questions or comments about the model: [email protected] # Intended Use - Primary intended uses: We pre-release GPT-SW3 for research and evaluation of the capabilities of Large Language Models for the Nordic languages. This is an important step in the process of knowledge building for LLMs, validating the model and collecting feedback on both what works well and what does not. - Primary intended users: Organizations and individuals in the Nordic NLP ecosystem who can contribute to the validation and testing of the models and provide feedback to the community. - Out-of-scope use cases: See the modified RAIL license. # Data, Limitations, and Recommendations - Data selection for training: Training data for GPT-SW3 was selected based on a combination of breadth and availability. See our Datasheet for more detailed information on the data used to train our model. - Data selection for evaluation: N/A - Limitations: Like other large language models for which the diversity (or lack thereof) of training data induces downstream impact on the quality of our model, GPT-SW3 has limitations in terms of bias and safety. GPT-SW3 can also have quality issues in terms of generation diversity and hallucination. In general, GPT-SW3 is not immune from the plethora of issues that plague modern large language models. By releasing with the modified RAIL license, we also hope to increase communication, transparency, and the study of large language models. The model may: Overrepresent some viewpoints and underrepresent others. Contain stereotypes. 
Generate: Hateful, abusive, or violent language. Discriminatory or prejudicial language. Content that may not be appropriate for all settings, including sexual content. Make errors, including producing incorrect information as if it were factual. Generate irrelevant or repetitive outputs. - Recommendations for future work: Indirect users should be made aware when the content they're working with is created by the LLM. Users should be aware of Risks and Limitations, and include an appropriate age disclaimer or blocking interface as necessary. Models pretrained with the LLM should include an updated Model Card. Users of the model should provide mechanisms for those affected to provide feedback, such as an email address for comments. - We hope that the release of GPT-SW3, as well as information around our model training process, will increase open science around both large language models in specific and natural language processing and deep learning in general. # GPT-SW3 Datasheet - We follow the recommendations of Gebru et al. (2021) and provide a datasheet for the dataset used to train GPT-SW3. # Motivation - For what purpose was the dataset created? Was there a specific task in mind? Was there a specific gap that needed to be filled? Please provide a description. Pre-training of Large Language Models (LLM), such as GPT-3 (T. B. Brown et al., 2020), Gopher (J. W. Rae et al., 2022), BLOOM (T. L. Scao et al., 2022), etc. require 100s or even 1000s GBs of text data, with recent studies (Chinchilla: J. Hoffmann et al., 2022) suggesting that the scale of the training data is even more important than previously imagined. Therefore, in order to train Swedish LLMs, we needed a large scale Swedish dataset of high quality. Since no such datasets existed before this initiative, we collected data in the Nordic and English languages. - Who created the dataset (e.g., which team, research group) and on behalf of which entity (e.g., company, institution, organization)? 
The Strategic Initiative Natural Language Understanding at AI Sweden has established a new research environment in which collaboration is key. The core team working on the creation of the dataset is the NLU research group at AI Sweden. This group consists of researchers and developers from AI Sweden (Lindholmen Science Park AB) and RISE. - Who funded the creation of the dataset? If there is an associated grant, please provide the name of the grantor and the grant name and number. The Swedish Innovation Agency (Vinnova) has funded this work across several different grants, including 2019-02996 and 2022-00949. - Any other comments? No. # Composition - What do the instances that comprise the dataset represent (e.g., documents, photos, people, countries)? Are there multiple types of instances (e.g., movies, users, and ratings; people and interactions between them; nodes and edges)? Please provide a description. The instances are textual documents categorized by language and document type. The dataset is a filtered and deduplicated collection that includes the following sources: - Books - Litteraturbanken (https://litteraturbanken.se/) - The Pile - Articles - Diva (https://www.diva-portal.org/) - The Pile: PubMed - The Pile: ArXiv - Code - Code Parrot: Github code (https://huggingface.co/datasets/codeparrot/github-code) - Conversational - Familjeliv (https://www.familjeliv.se/) - Flashback (https://flashback.se/) - Datasets collected through Parlai (see Appendix in data paper for complete list) (https://github.com/facebookresearch/ParlAI) - Pushshift.io Reddit dataset, developed in Baumgartner et al. (2020) and processed in Roller et al. (2021) - Math - English Math dataset generated with code from DeepMind (D. 
Saxton et al., 2019) - Swedish Math dataset, generated as above with manually translated templates - Miscellaneous - Summarization data (https://www.ida.liu.se/~arnjo82/papers/clarin-21-julius.pdf) - OPUS, the open parallel corpus (https://opus.nlpl.eu/) - Movie scripts (https://github.com/Aveek-Saha/Movie-Script-Database) - Natural Instructions (https://github.com/allenai/natural-instructions) - P3 (Public Pool of Prompts), (https://huggingface.co/datasets/bigscience/P3) - The Norwegian Colossal Corpus (https://huggingface.co/datasets/NbAiLab/NCC) - Danish Gigaword (https://gigaword.dk/) - Icelandic Gigaword (https://clarin.is/en/resources/gigaword/) - The Pile: Stack Exchange - Web Common Crawl - Web data from the project LES (Linguistic Explorations of Societies, https://les.gu.se). - Multilingual C4 (MC4), prepared by AllenAI from C4 (C. Raffel et al., 2019) - Open Super-large Crawled Aggregated coRpus (OSCAR) (P. O. Suarez, 2019) - The Pile: Open Web Text - Web Sources - Various public Swedish website scrapes (see Appendix in data paper) - Familjeliv Articles - Public Swedish Job Ads from JobTech/Arbetsförmedlingen - Wikipedia - Official Wikipedia dumps - How many instances are there in total (of each type, if appropriate)? The training data consists of 1.1TB UTF-8 encoded text, containing 660M documents with a total of 320B tokens. - Does the dataset contain all possible instances or is it a sample (not necessarily random) of instances from a larger set? If the dataset is a sample, then what is the larger set? Is the sample representative of the larger set (e.g., geographic coverage)? If so, please describe how this representativeness was validated/verified. If it is not representative of the larger set, please describe why not (e.g., to cover a more diverse range of instances, because instances were withheld or unavailable). 
The subset of our dataset that comes from multilingual Common Crawl datasets (MC4, Oscar), are filtered by language to only include Swedish, Norwegian, Danish, and Icelandic. From The Pile, we included only the parts that typically are of highest textual quality or complemented the rest of our dataset with sources we otherwise lacked (e.g. books). The remainder of the dataset was collected from the above sources. - What data does each instance consist of? “Raw” data (e.g., unprocessed text or images) or features? In either case, please provide a description. Each instance consists of raw text data. - Is there a label or target associated with each instance? If so, please provide a description. No. - Is any information missing from individual instances? If so, please provide a description, explaining why this information is missing (e.g., because it was unavailable). This does not include intentionally removed information, but might include, e.g., redacted text. No. - Are relationships between individual instances made explicit (e.g., users’ movie ratings, social network links)? If so, please describe how these relationships are made explicit. There are no explicit relationships between individual instances. - Are there recommended data splits (e.g., training, development/validation, testing)? If so, please provide a description of these splits, explaining the rationale behind them. There are no explicit splits recommended for this dataset. When pre-training the model, a random split for train, dev, test is set to 99.99%, 0.08%, 0.02% respectively, and is sampled proportionally to each subset’s weight and size. The weight of each subset was manually decided beforehand. These decisions were made considering the data’s value, source, and language, to form a representative and balanced pre-training corpus. - Are there any errors, sources of noise, or redundancies in the dataset? If so, please provide a description. 
The dataset is a collection of many sources, some of which naturally contain some overlap. Although we have performed deduplication, some overlap may still remain. Furthermore, there may be some noise remaining from artifacts originating in Common Crawl datasets, that have been missed by our data filtering process. Except for these, we are not aware of any errors, sources of noise, or redundancies. - Is the dataset self-contained, or does it link to or otherwise rely on external resources (e.g., websites, tweets, other datasets)? The dataset is self-contained. - Does the dataset contain data that, if viewed directly, might be offensive, insulting, threatening, or might otherwise cause anxiety? If so, please describe why. The dataset contains subsets of public Common Crawl, Reddit, Familjeliv and Flashback. These could contain sentences that, if viewed directly, might be offensive, insulting, threatening, or might otherwise cause anxiety. - Does the dataset relate to people? If not, you may skip the remaining questions in this section. Some documents of this data relate to people, such as news articles, Wikipedia descriptions, etc. - Does the dataset identify any subpopulations (e.g., by age, gender)? If so, please describe how these subpopulations are identified and provide a description of their respective distributions within the dataset. No, the dataset does not explicitly include subpopulation identification. - Any other comments? No. # Collection Process - How was the data associated with each instance acquired? Was the data directly observable (e.g., raw text, movie ratings), reported by subjects (e.g., survey responses), or indirectly inferred/derived from other data (e.g., part-of-speech tags, model-based guesses for age or language)? If data was reported by subjects or indirectly inferred/derived from other data, was the data validated/verified? If so, please describe how. N/A. The dataset is a union of publicly available datasets and sources. 
- What mechanisms or procedures were used to collect the data (e.g., hardware apparatus or sensor, manual human curation, software program, software API)? How were these mechanisms or procedures validated? The data was downloaded from the internet. - If the dataset is a sample from a larger set, what was the sampling strategy (e.g., deterministic, probabilistic with specific sampling probabilities)? Please see previous answers for how parts of the dataset were selected. - Who was involved in the data collection process (e.g., students, crowdworkers, contractors) and how were they compensated (e.g., how much were crowdworkers paid)? This data is mined, filtered and sampled by machines. - Over what timeframe was the data collected? Does this timeframe match the creation timeframe of the data associated with the instances (e.g., recent crawl of old news articles)? If not, please describe the timeframe in which the data associated with the instances was created. The dataset was collected during the period June 2021 to June 2022. The creation of the collected sources varies, with e.g. Common Crawl data that have been continuously collected over 12 years. - Does the dataset relate to people? If not, you may skip the remainder of the questions in this section. Yes. The texts have been produced by people. Any personal information potentially present in publicly available data sources and thus in the created dataset is of no interest to the collection and use of the dataset. - Has an analysis of the potential impact of the dataset and its use on data subjects (e.g., a data protection impact analysis) been conducted? If so, please provide a description of this analysis, including the outcomes, as well as a link or other access point to any supporting documentation. Yes. - Any other comments? No. 
- Preprocessing/cleaning/labeling - Was any preprocessing/cleaning/labeling of the data done (e.g., discretization or bucketing, tokenization, part-of-speech tagging, SIFT feature extraction, removal of instances, processing of missing values)? If so, please provide a description. If not, you may skip the remainder of the questions in this section. The dataset was filtered and re-formatted on a document-level using standard procedures, inspired by the work in The BigScience ROOTS Corpus (H. Laurençon et al., 2022) and Gopher (J. W. Rae et al., 2022). This was done with the goal of achieving a consistent text format throughout the dataset, and to remove documents that did not meet our textual quality requirements (e.g. repetitiveness). Furthermore, the dataset was deduplicated to remedy the overlap between collected subsets using the MinHash algorithm, similar to the method used in GPT-3 and The Pile, and described in greater detail in “Deduplicating Training Data Makes Language Models Better” (K. Lee et al., 2021). - Was the “raw” data saved in addition to the preprocessed/cleaned/labeled data (e.g., to support unanticipated future uses)? If so, please provide a link or other access point to the “raw” data. The “raw” component datasets are publicly available in their respective locations. - Any other comments? No. # Uses - Has the dataset been used for any tasks already? If so, please provide a description. The dataset was used to pre-train the GPT-SW3 models. - Is there a repository that links to any or all papers or systems that use the dataset? If so, please provide a link or other access point. N/A. - What (other) tasks could the dataset be used for? The data can be used to pre-train language models, which are foundations for many current and future language tasks. - Is there anything about the composition of the dataset or the way it was collected and preprocessed/cleaned/labeled that might impact future uses? 
For example, is there anything that a future user might need to know to avoid uses that could result in unfair treatment of individuals or groups (e.g., stereotyping, quality of service issues) or other undesirable harms (e.g., financial harms, legal risks) If so, please provide a description. Is there anything a future user could do to mitigate these undesirable harms? The dataset is probably quite representative of Swedish internet discourse in general, and of the Swedish public sector, but we know that this data does not necessarily reflect the entire Swedish population. - Are there tasks for which the dataset should not be used? If so, please provide a description. None that we are currently aware of. - Any other comments? No. # Distribution - Will the dataset be distributed to third parties outside of the entity (e.g., company, institution, organization) on behalf of which the dataset was created? If so, please provide a description. No. - How will the dataset distributed (e.g., tarball on website, API, GitHub)? Does the dataset have a digital object identifier (DOI)? N/A. - When will the dataset be distributed? N/A. - Will the dataset be distributed under a copyright or other intellectual property (IP) license, and/or under applicable terms of use (ToU)? If so, please describe this license and/or ToU, and provide a link or other access point to, or otherwise reproduce, any relevant licensing terms or ToU, as well as any fees associated with these restrictions. N/A. - Do any export controls or other regulatory restrictions apply to the dataset or to individual instances? If so, please describe these restrictions, and provide a link or other access point to, or otherwise reproduce, any supporting documentation. N/A. - Any other comments? No. # Maintenance - Who is supporting/hosting/maintaining the dataset? AI Sweden at Lindholmen Science Park AB. - How can the owner/curator/manager of the dataset be contacted (e.g., email address)? 
[email protected] - Is there an erratum? If so, please provide a link or other access point. N/A. - Will the dataset be updated (e.g., to correct labeling errors, add new instances, delete instances)? If so, please describe how often, by whom, and how updates will be communicated to users (e.g., mailing list, GitHub)? Currently, there are no plans for updating the dataset. - If the dataset relates to people, are there applicable limits on the retention of the data associated with the instances (e.g., were individuals in question told that their data would be retained for a fixed period of time and then deleted)? If so, please describe these limits and explain how they will be enforced. Read the privacy policy for the NLU initiative at AI Sweden [here](https://www.ai.se/en/privacy-policy-nlu). - Will older versions of the dataset continue to be supported/hosted/maintained? If so, please describe how. If not, please describe how its obsolescence will be communicated to users. N/A. - If others want to extend/augment/build on/contribute to the dataset, is there a mechanism for them to do so? If so, please provide a description. Will these contributions be validated/ verified? If so, please describe how. If not, why not? Is there a process for communicating/ distributing these contributions to other users? If so, please provide a description. Not at this time. - Any other comments? No.
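The MinHash deduplication step described in the preprocessing section above can be illustrated with a minimal pure-Python sketch. This is illustrative only — the actual GPT-SW3 pipeline is not published here, and a production setup would add locality-sensitive hashing over these signatures instead of comparing every pair of documents:

```python
import hashlib

def shingles(text, n=5):
    # Character n-gram shingles of a document.
    return {text[i:i + n] for i in range(max(len(text) - n + 1, 1))}

def minhash_signature(doc, num_perm=64, n=5):
    # One "permutation" per seed, simulated by salting a stable hash;
    # the signature keeps the minimum hash value per permutation.
    return [
        min(
            int.from_bytes(
                hashlib.blake2b(f"{seed}:{s}".encode(), digest_size=8).digest(),
                "big",
            )
            for s in shingles(doc, n)
        )
        for seed in range(num_perm)
    ]

def estimated_jaccard(sig_a, sig_b):
    # The fraction of matching minima estimates the Jaccard similarity
    # of the underlying shingle sets.
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

a = "the quick brown fox jumps over the lazy dog"
b = "the quick brown fox jumped over the lazy dog"
c = "completely unrelated text about language models"
sim_ab = estimated_jaccard(minhash_signature(a), minhash_signature(b))
sim_ac = estimated_jaccard(minhash_signature(a), minhash_signature(c))
print(round(sim_ab, 2), round(sim_ac, 2))
```

Near-duplicate pairs like `a`/`b` score far higher than unrelated pairs, which is the signal a deduplication pass thresholds on.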
{}
task
[ "SUMMARIZATION" ]
42,942
levanhien/finetuning-sen-ana
levanhien
text-classification
[ "transformers", "pytorch", "distilbert", "text-classification", "generated_from_trainer", "dataset:imdb", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-09-08T07:51:39Z
2023-09-08T08:55:43+00:00
8
0
---
base_model: distilbert-base-uncased
datasets:
- imdb
license: apache-2.0
metrics:
- accuracy
- f1
tags:
- generated_from_trainer
model-index:
- name: finetuning-sen-ana
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: imdb
      type: imdb
      config: plain_text
      split: test
      args: plain_text
    metrics:
    - type: accuracy
      value: 0.87
      name: Accuracy
    - type: f1
      value: 0.8729641693811074
      name: F1
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# finetuning-sen-ana

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3184
- Accuracy: 0.87
- F1: 0.8730

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2

### Training results

### Framework versions

- Transformers 4.33.1
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
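The `linear` learning-rate schedule listed in the hyperparameters above can be sketched in plain Python. The step count below is an illustrative estimate, assuming the standard 25,000-example IMDB train split:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    # The "linear" scheduler: optional linear warmup to base_lr,
    # then linear decay from base_lr down to 0 at the final step.
    if step < warmup_steps:
        return base_lr * step / max(warmup_steps, 1)
    return base_lr * max(
        0.0, (total_steps - step) / max(total_steps - warmup_steps, 1)
    )

# IMDB's train split is 25,000 examples; with batch size 16 and 2 epochs
# that is roughly ceil(25000 / 16) * 2 = 3126 optimizer steps.
total_steps = 3126
print(linear_lr(0, total_steps))            # full 2e-05 at the first step
print(linear_lr(total_steps, total_steps))  # 0.0 at the last step
```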
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# finetuning-sen-ana

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3184
- Accuracy: 0.87
- F1: 0.8730

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2

### Training results

### Framework versions

- Transformers 4.33.1
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
{"base_model": "distilbert-base-uncased", "datasets": ["imdb"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "finetuning-sen-ana", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "imdb", "type": "imdb", "config": "plain_text", "split": "test", "args": "plain_text"}, "metrics": [{"type": "accuracy", "value": 0.87, "name": "Accuracy"}, {"type": "f1", "value": 0.8729641693811074, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,943
Xenova/mbart-large-50-many-to-one-mmt
Xenova
translation
[ "transformers.js", "onnx", "mbart", "text2text-generation", "translation", "base_model:facebook/mbart-large-50-many-to-one-mmt", "base_model:quantized:facebook/mbart-large-50-many-to-one-mmt", "region:us" ]
2023-09-06T02:44:24Z
2024-10-08T13:42:20+00:00
46
0
---
base_model: facebook/mbart-large-50-many-to-one-mmt
library_name: transformers.js
pipeline_tag: translation
---

# mBART-50 many to one multilingual machine translation

[facebook/mbart-large-50-many-to-one-mmt](https://huggingface.co/facebook/mbart-large-50-many-to-one-mmt) with ONNX weights to be compatible with [Transformers.js](https://huggingface.co/docs/transformers.js).

## Usage

```js
// npm i @xenova/transformers
import { pipeline } from '@xenova/transformers';

// Create translation pipeline
let translator = await pipeline('translation', 'Xenova/mbart-large-50-many-to-one-mmt');

// Translate text from Hindi to English (this many-to-one model always targets English)
let output = await translator('संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है', {
    src_lang: 'hi_IN', // Hindi
});
```

## Languages covered

Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)

---

Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction.
If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
null
Non_BioNLP
# mBART-50 many to one multilingual machine translation

[facebook/mbart-large-50-many-to-one-mmt](https://huggingface.co/facebook/mbart-large-50-many-to-one-mmt) with ONNX weights to be compatible with [Transformers.js](https://huggingface.co/docs/transformers.js).

## Usage

```js
// npm i @xenova/transformers
import { pipeline } from '@xenova/transformers';

// Create translation pipeline
let translator = await pipeline('translation', 'Xenova/mbart-large-50-many-to-one-mmt');

// Translate text from Hindi to English (this many-to-one model always targets English)
let output = await translator('संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है', {
    src_lang: 'hi_IN', // Hindi
});
```

## Languages covered

Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)

---

Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction.
If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
{"base_model": "facebook/mbart-large-50-many-to-one-mmt", "library_name": "transformers.js", "pipeline_tag": "translation"}
task
[ "TRANSLATION" ]
42,944
PlanTL-GOB-ES/mt-plantl-es-gl
PlanTL-GOB-ES
null
[ "license:apache-2.0", "region:us" ]
2022-11-28T08:33:35Z
2022-12-05T10:09:38+00:00
0
1
---
license: apache-2.0
---

## PlanTL Project's Spanish-Galician machine translation model

## Table of Contents
- [Model Description](#model-description)
- [Intended Uses and Limitations](#intended-use)
- [How to Use](#how-to-use)
- [Training](#training)
- [Training data](#training-data)
- [Training procedure](#training-procedure)
- [Data Preparation](#data-preparation)
- [Tokenization](#tokenization)
- [Hyperparameters](#hyperparameters)
- [Evaluation](#evaluation)
- [Variable and Metrics](#variable-and-metrics)
- [Evaluation Results](#evaluation-results)
- [Additional Information](#additional-information)
- [Author](#author)
- [Contact Information](#contact-information)
- [Copyright](#copyright)
- [Licensing Information](#licensing-information)
- [Funding](#funding)
- [Disclaimer](#disclaimer)

## Model description

This model was trained from scratch using the [Fairseq toolkit](https://fairseq.readthedocs.io/en/latest/) on a combination of Spanish-Galician datasets, up to 31 million sentences. Additionally, the model is evaluated on several public datasets: Flores 101, the Spanish Constitution (TaCon) and Tatoeba.

## Intended uses and limitations

You can use this model for machine translation from Spanish to Galician.
## How to use

### Usage

Required libraries:

```bash
pip install ctranslate2 pyonmttok
```

Translate a sentence using python

```python
import ctranslate2
import pyonmttok
from huggingface_hub import snapshot_download

model_dir = snapshot_download(repo_id="PlanTL-GOB-ES/mt-plantl-es-gl", revision="main")

tokenizer = pyonmttok.Tokenizer(mode="none", sp_model_path=model_dir + "/spm.model")
tokenized = tokenizer.tokenize("Bienvenido al Proyecto PlanTL!")

translator = ctranslate2.Translator(model_dir)
translated = translator.translate_batch([tokenized[0]])
print(tokenizer.detokenize(translated[0][0]['tokens']))
```

## Training

### Training data

The model was trained on a combination of the following datasets:

| Dataset       | Sentences      |
|---------------|----------------|
| CLUVI         | 318.612        |
| WikiMatrix    | 438.181        |
| WikiMedia     | 83.511         |
| QED           | 30.211         |
| TED 2020 v1   | 33.324         |
| CCMatrix v1   | 24.165.978     |
| ParaCrawl     | 6.537.374      |
| OpenSubtitles | 197.519        |
| **Total**     | **31.804.710** |

### Training procedure

### Data preparation

All datasets are concatenated and filtered using the [mBERT Gencata parallel filter](https://huggingface.co/projecte-aina/mbert-base-gencata) and cleaned using the clean-corpus-n.pl script from [moses](https://github.com/moses-smt/mosesdecoder), allowing sentences between 5 and 150 words.

Before training, the punctuation is normalized using a modified version of the join-single-file.py script from [SoftCatalà](https://github.com/Softcatala/nmt-models/blob/master/data-processing-tools/join-single-file.py)

#### Tokenization

All data is tokenized using sentencepiece, with a 50 thousand token sentencepiece model learned from the combination of all filtered training data. This model is included.
#### Hyperparameters

The model is based on the Transformer-XLarge proposed by [Subramanian et al.](https://aclanthology.org/2021.wmt-1.18.pdf)
The following hyperparameters were set on the Fairseq toolkit:

| Hyperparameter                     | Value                             |
|------------------------------------|-----------------------------------|
| Architecture                       | transformer_vaswani_wmt_en_de_big |
| Embedding size                     | 1024                              |
| Feedforward size                   | 4096                              |
| Number of heads                    | 16                                |
| Encoder layers                     | 24                                |
| Decoder layers                     | 6                                 |
| Normalize before attention         | True                              |
| --share-decoder-input-output-embed | True                              |
| --share-all-embeddings             | True                              |
| Effective batch size               | 96.000                            |
| Optimizer                          | adam                              |
| Adam betas                         | (0.9, 0.980)                      |
| Clip norm                          | 0.0                               |
| Learning rate                      | 1e-3                              |
| LR scheduler                       | inverse sqrt                      |
| Warmup updates                     | 4000                              |
| Dropout                            | 0.1                               |
| Label smoothing                    | 0.1                               |

The model was trained using shards of 10 million sentences, for a total of 8.000 updates. Weights were saved every 1000 updates and reported results are the average of the last 6 checkpoints. After this, the model was trained an extra epoch on the CLUVI dataset.
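The inverse-sqrt scheduler named in the table above can be sketched in plain Python (assuming warmup starts from a zero learning rate, which is a simplification of Fairseq's configurable `--warmup-init-lr`):

```python
def inverse_sqrt_lr(step, base_lr=1e-3, warmup_updates=4000):
    # Fairseq-style "inverse sqrt" schedule: linear warmup to the peak
    # rate, then decay proportional to 1/sqrt(step) thereafter.
    if step < warmup_updates:
        return base_lr * step / warmup_updates
    return base_lr * (warmup_updates / step) ** 0.5

print(inverse_sqrt_lr(2000))   # halfway through warmup: 0.0005
print(inverse_sqrt_lr(4000))   # peak learning rate: 0.001
print(inverse_sqrt_lr(16000))  # 4x past warmup, decayed by sqrt(1/4): 0.0005
```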
## Evaluation

### Variable and metrics

We use the BLEU score for evaluation on test sets: [Flores-101](https://github.com/facebookresearch/flores), [TaCon](https://elrc-share.eu/repository/browse/tacon-spanish-constitution-mt-test-set/84a96138b98611ec9c1a00155d02670628f3e6857b0f422abd82abc3795ec8c2/), [Tatoeba](https://opus.nlpl.eu/Tatoeba.php)

### Evaluation results

Below are the evaluation results on the machine translation from Spanish to Galician compared to [Apertium](https://apertium.org/), [Google Translate](https://translate.google.es/?hl=es) and [M2M 100 418M](https://huggingface.co/facebook/m2m100_418M):

| Test set             | Apertium | Google Translate | M2M-100 418M | mt-plantl-es-gl |
|----------------------|----------|------------------|--------------|-----------------|
| Spanish Constitution | 74,5     | 60,4             | 70,7         | **84,3**        |
| Flores 101 devtest   | 21,4     | **25,6**         | 21,6         | 21,8            |
| Tatoeba              | **67,9** | 52,8             | 53,9         | 66,6            |
| Average              | 54,3     | 46,3             | 48,7         | **57,6**        |

## Additional information

### Author

Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])

### Contact information

For further information, send an email to <[email protected]>

### Copyright

Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)

### Licensing information

This work is licensed under an [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)

### Funding

This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)

### Disclaimer

<details>
<summary>Click to expand</summary>

The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.

In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.

Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.

Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.

En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.

</details>
null
Non_BioNLP
## PlanTL Project's Spanish-Galician machine translation model

## Table of Contents
- [Model Description](#model-description)
- [Intended Uses and Limitations](#intended-use)
- [How to Use](#how-to-use)
- [Training](#training)
  - [Training data](#training-data)
  - [Training procedure](#training-procedure)
    - [Data Preparation](#data-preparation)
    - [Tokenization](#tokenization)
    - [Hyperparameters](#hyperparameters)
- [Evaluation](#evaluation)
  - [Variable and Metrics](#variable-and-metrics)
  - [Evaluation Results](#evaluation-results)
- [Additional Information](#additional-information)
  - [Author](#author)
  - [Contact Information](#contact-information)
  - [Copyright](#copyright)
  - [Licensing Information](#licensing-information)
  - [Funding](#funding)
  - [Disclaimer](#disclaimer)

## Model description

This model was trained from scratch using the [Fairseq toolkit](https://fairseq.readthedocs.io/en/latest/) on a combination of Spanish-Galician datasets, up to 31 million sentences. Additionally, the model is evaluated on several public datasets: Flores 101, Spanish Constitution (TaCon) and Tatoeba.

## Intended uses and limitations

You can use this model for machine translation from Spanish to Galician.
{"license": "apache-2.0"}
task
[ "TRANSLATION" ]
42,945
zap8600/autotrain-t5-billsum-47010115876
zap8600
summarization
[ "transformers", "pytorch", "t5", "text2text-generation", "autotrain", "summarization", "en", "dataset:zap8600/autotrain-data-t5-billsum", "co2_eq_emissions", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
2023-04-05T13:40:16Z
2023-04-05T13:46:49+00:00
17
0
---
datasets:
- zap8600/autotrain-data-t5-billsum
language:
- en
tags:
- autotrain
- summarization
widget:
- text: I love AutoTrain 🤗
co2_eq_emissions:
  emissions: 0.011131664546159842
---

# Model Trained Using AutoTrain

- Problem type: Summarization
- Model ID: 47010115876
- CO2 Emissions (in grams): 0.0111

## Validation Metrics

- Loss: 2.472
- Rouge1: 20.002
- Rouge2: 10.000
- RougeL: 17.035
- RougeLsum: 18.427
- Gen Len: 19.000

## Usage

You can use cURL to access this model:

```
$ curl -X POST -H "Authorization: Bearer YOUR_HUGGINGFACE_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/zap8600/autotrain-t5-billsum-47010115876
```
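ROUGE-1, reported above, is essentially unigram overlap between the generated summary and the reference. A minimal F1 sketch for intuition (the reported scores come from a standard ROUGE implementation, which additionally applies stemming and aggregation across the validation set):

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap F1 between a candidate summary and a reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the bill funds schools", "the bill provides funding for schools"))  # → 0.6
```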
null
Non_BioNLP
{"datasets": ["zap8600/autotrain-data-t5-billsum"], "language": ["en"], "tags": ["autotrain", "summarization"], "widget": [{"text": "I love AutoTrain 🤗"}], "co2_eq_emissions": {"emissions": 0.011131664546159842}}
task
[ "SUMMARIZATION" ]
42,946
utter-project/hutter-12-3rd-base
utter-project
null
[ "fairseq", "pytorch", "safetensors", "hubert", "fr", "es", "pt", "da", "de", "nl", "fy", "zh", "ja", "ar", "sw", "gn", "dataset:mozilla-foundation/common_voice_11_0", "license:cc-by-nc-4.0", "region:us" ]
2023-11-30T12:37:21Z
2024-11-08T10:29:01+00:00
17
2
---
datasets:
- mozilla-foundation/common_voice_11_0
language:
- fr
- es
- pt
- da
- de
- nl
- fy
- zh
- ja
- ar
- sw
- gn
library_name: fairseq
license: cc-by-nc-4.0
---

**HUTTER-12: H(uBERT) UTTER model covering 12 languages.**

* Total training hours: 1,622 from Romance (French: 300h, Spanish: 300h, Portuguese: 102.3h), West-Germanic (Danish: 3.5h, German: 300h, Dutch: 72.1h, Frisian: 41.2h) and other languages (Chinese (zh-CN): 104.6h, Japanese: 37h, Arabic: 61h, Swahili: 300h, Guaraní: 0.4h)
* Number of updates: 400K
* Number of iterations: 3
* Clustering approach: mini-batch K-means (100% of the data)
* Dataset: CommonVoice v13

# Funding

<img src="https://hf.fast360.xyz/production/uploads/62262e19d36494a6f743a28d/HbzC1C-uHe25ewTy2wyoK.png" width=7% height=7%>

This is an output of the European Project UTTER (Unified Transcription and Translation for Extended Reality) under grant number 101070631. For more information go to https://he-utter.eu/
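The mini-batch K-means step used to cluster features into discrete units for HuBERT-style training can be sketched as below. This is a toy illustration of the per-centroid update rule (Sculley, 2010), not the actual clustering code used for this model:

```python
def minibatch_kmeans_step(centroids, batch, counts):
    """One mini-batch K-means pass: assign each point to its nearest centroid,
    then move that centroid toward the point with per-centroid rate 1/count."""
    for x in batch:
        # nearest centroid by squared Euclidean distance
        j = min(range(len(centroids)),
                key=lambda c: sum((xi - ci) ** 2 for xi, ci in zip(x, centroids[c])))
        counts[j] += 1
        eta = 1.0 / counts[j]
        centroids[j] = [ci + eta * (xi - ci) for ci, xi in zip(centroids[j], x)]
    return centroids

centroids = [[0.0, 0.0], [10.0, 10.0]]
counts = [0, 0]
batch = [[0.5, 0.2], [9.8, 10.1], [0.1, -0.3]]
minibatch_kmeans_step(centroids, batch, counts)
print(centroids, counts)
```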
null
Non_BioNLP
{"datasets": ["mozilla-foundation/common_voice_11_0"], "language": ["fr", "es", "pt", "da", "de", "nl", "fy", "zh", "ja", "ar", "sw", "gn"], "library_name": "fairseq", "license": "cc-by-nc-4.0"}
task
[ "TRANSLATION" ]
42,947
ZenXir/marian-finetuned-kde4-en-to-fr
ZenXir
translation
[ "transformers", "safetensors", "marian", "text2text-generation", "translation", "generated_from_trainer", "dataset:kde4", "base_model:Helsinki-NLP/opus-mt-en-fr", "base_model:finetune:Helsinki-NLP/opus-mt-en-fr", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-03-01T03:52:10Z
2024-03-05T07:30:00+00:00
28
0
---
base_model: Helsinki-NLP/opus-mt-en-fr
datasets:
- kde4
license: apache-2.0
metrics:
- bleu
tags:
- translation
- generated_from_trainer
model-index:
- name: marian-finetuned-kde4-en-to-fr
  results:
  - task:
      type: text2text-generation
      name: Sequence-to-sequence Language Modeling
    dataset:
      name: kde4
      type: kde4
      config: en-fr
      split: train
      args: en-fr
    metrics:
    - type: bleu
      value: 52.86546516633332
      name: Bleu
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# marian-finetuned-kde4-en-to-fr

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8556
- Bleu: 52.8655

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP

### Training results

### Framework versions

- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2
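The BLEU score above is produced by the evaluation loop (typically via sacrebleu). Its core idea, modified n-gram precision combined with a brevity penalty, can be sketched as follows; this is a simplified sentence-level illustration with add-one smoothing, not the corpus-level metric used for the reported number:

```python
import math
from collections import Counter

def bleu(candidate: str, reference: str, max_n: int = 4) -> float:
    """Sentence-level BLEU sketch: geometric mean of smoothed modified
    n-gram precisions, times a brevity penalty for short candidates."""
    cand, ref = candidate.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        cand_ngrams = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
        ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
        overlap = sum((cand_ngrams & ref_ngrams).values())  # clipped matches
        total = max(sum(cand_ngrams.values()), 1)
        # add-one smoothing so one empty n-gram order does not zero the score
        log_prec += math.log((overlap + 1) / (total + 1)) / max_n
    bp = min(1.0, math.exp(1 - len(ref) / len(cand)))
    return bp * math.exp(log_prec)

print(bleu("le chat est sur le tapis", "le chat est sur le tapis"))  # identical → 1.0
```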
null
Non_BioNLP
{"base_model": "Helsinki-NLP/opus-mt-en-fr", "datasets": ["kde4"], "license": "apache-2.0", "metrics": ["bleu"], "tags": ["translation", "generated_from_trainer"], "model-index": [{"name": "marian-finetuned-kde4-en-to-fr", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "kde4", "type": "kde4", "config": "en-fr", "split": "train", "args": "en-fr"}, "metrics": [{"type": "bleu", "value": 52.86546516633332, "name": "Bleu"}]}]}]}
task
[ "TRANSLATION" ]
42,948
CausalLM/35b-beta-long
CausalLM
text-generation
[ "transformers", "safetensors", "cohere", "text-generation", "conversational", "en", "zh", "ja", "de", "dataset:JosephusCheung/GuanacoDataset", "dataset:meta-math/MetaMathQA", "dataset:jondurbin/airoboros-3.1", "dataset:WizardLM/WizardLM_evol_instruct_V2_196k", "dataset:RyokoAI/ShareGPT52K", "dataset:RyokoAI/Fandom23K", "dataset:milashkaarshif/MoeGirlPedia_wikitext_raw_archive", "dataset:wikipedia", "dataset:wiki_lingua", "dataset:garage-bAInd/Open-Platypus", "dataset:LDJnr/Puffin", "dataset:BAAI/COIG", "dataset:TigerResearch/tigerbot-zhihu-zh-10k", "dataset:liwu/MNBVC", "dataset:teknium/openhermes", "dataset:CausalLM/Refined-Anime-Text", "dataset:microsoft/orca-math-word-problems-200k", "dataset:m-a-p/CodeFeedback-Filtered-Instruction", "license:wtfpl", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
2024-04-13T18:47:13Z
2025-02-11T14:15:03+00:00
167
65
---
datasets:
- JosephusCheung/GuanacoDataset
- meta-math/MetaMathQA
- jondurbin/airoboros-3.1
- WizardLM/WizardLM_evol_instruct_V2_196k
- RyokoAI/ShareGPT52K
- RyokoAI/Fandom23K
- milashkaarshif/MoeGirlPedia_wikitext_raw_archive
- wikipedia
- wiki_lingua
- garage-bAInd/Open-Platypus
- LDJnr/Puffin
- BAAI/COIG
- TigerResearch/tigerbot-zhihu-zh-10k
- liwu/MNBVC
- teknium/openhermes
- CausalLM/Refined-Anime-Text
- microsoft/orca-math-word-problems-200k
- m-a-p/CodeFeedback-Filtered-Instruction
language:
- en
- zh
- ja
- de
license: wtfpl
---

# 35b-beta-long

This release, CausalLM/35b-beta-long, represents the culmination of our experience and accumulated training data in fine-tuning large language models. We are open-sourcing these weights to foster development within the open-source community.

We chose Cohere's multilingual, 35B-parameter with long context [CohereForAI/c4ai-command-r-v01] MHA model as our base. In our evaluation, it proved to be the most responsive to the quality of training data throughout the Supervised Fine-Tuning process, outperforming other open-source LLMs. Although its initial SFT/RL focuses on specific tasks and comes with a non-commercial license, we believe it's currently the best foundation for personal and internal use cases.

Utilizing extensive factual content from web crawls, we synthesized over 30 million multi-turn dialogue data entries, grounded in multiple web-pages or documents. This process involved substantial human oversight and a data pipeline designed to ensure high quality. The model was then trained on this data in full 128K context using BF16 precision. We also incorporated widely-used open-source dialogue datasets to enhance general conversational fluency.

Our data synthesis approach addressed crucial limitations in typical LLM training corpora. LLMs often struggle to extract thematic summaries, key information, or perform comparisons at the paragraph or document level.
Therefore, we focused on generating fact-based data using multiple documents within a long context setting. This involved leveraging existing SOTA LLMs with human guidance to synthesize information through thematic summarization, information extraction, and comparison of source materials.

This approach yielded significant improvements in model performance during fine-tuning. We observed reductions in hallucinations, enhanced long-context capabilities, and improvements in general abilities such as math, coding, and knowledge recall.

The training process incorporated both the original source material and the synthesized outputs, further reinforcing the model's ability to recall and utilize abstract concepts embedded within the pre-training data. Our analysis revealed that this combination of original and synthesized data was crucial for achieving a more balanced performance profile.

Intermediate checkpoints and models trained solely on synthesized data are also released for research purposes.

Compared to the original task-specific model, our further fine-tuned model demonstrates more robust recall in long-context scenarios without requiring specific document formatting or prompt engineering. This fine-tuned model also exhibits performance comparable to models twice its size in quantifiable benchmarks.

As this model has only undergone SFT, it may still exhibit biases or generate undesirable content. We implemented basic safety measures using open-source refusal datasets to mitigate outputs related to illegal activities, NSFW content, and violence. However, further Reinforcement Learning is necessary for robust alignment with human values.

## Please note

The tokenizer is different from Cohere's, and the chat template is **ChatML**.

Pressure Testing from: https://github.com/LeonEricsson/llmcontext

![image/png](https://cdn-uploads.huggingface.co/production/uploads/63468a143ea42ee2cb49ddd1/2XbONpyTeMH1qWCtE9ziH.png)
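Since the card notes that the chat template is ChatML rather than Cohere's native format, a minimal prompt builder illustrating that layout may be useful. The tag strings below follow the ChatML convention; check the tokenizer config bundled with the model for the exact template:

```python
def to_chatml(messages):
    """Render a list of {role, content} dicts in ChatML layout,
    ending with an open assistant turn as the generation prompt."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = to_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize these two documents."},
])
print(prompt)
```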
null
Non_BioNLP
{"datasets": ["JosephusCheung/GuanacoDataset", "meta-math/MetaMathQA", "jondurbin/airoboros-3.1", "WizardLM/WizardLM_evol_instruct_V2_196k", "RyokoAI/ShareGPT52K", "RyokoAI/Fandom23K", "milashkaarshif/MoeGirlPedia_wikitext_raw_archive", "wikipedia", "wiki_lingua", "garage-bAInd/Open-Platypus", "LDJnr/Puffin", "BAAI/COIG", "TigerResearch/tigerbot-zhihu-zh-10k", "liwu/MNBVC", "teknium/openhermes", "CausalLM/Refined-Anime-Text", "microsoft/orca-math-word-problems-200k", "m-a-p/CodeFeedback-Filtered-Instruction"], "language": ["en", "zh", "ja", "de"], "license": "wtfpl"}
task
[ "SUMMARIZATION" ]
42,950
cloudqi/cqi_question_solver_translator_v0
cloudqi
text2text-generation
[ "transformers", "pytorch", "tf", "jax", "safetensors", "t5", "text2text-generation", "en", "fr", "ro", "de", "multilingual", "dataset:svakulenk0/qrecc", "dataset:taskmaster2", "dataset:djaym7/wiki_dialog", "dataset:deepmind/code_contests", "dataset:lambada", "dataset:gsm8k", "dataset:aqua_rat", "dataset:esnli", "dataset:quasc", "dataset:qed", "arxiv:2210.11416", "arxiv:1910.09700", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
2023-03-31T17:44:11Z
2023-03-31T19:01:41+00:00
14
1
---
datasets:
- svakulenk0/qrecc
- taskmaster2
- djaym7/wiki_dialog
- deepmind/code_contests
- lambada
- gsm8k
- aqua_rat
- esnli
- quasc
- qed
language:
- en
- fr
- ro
- de
- multilingual
license: apache-2.0
tags:
- text2text-generation
widget:
- text: 'Translate to English: Meu nome é Bruno.'
  example_title: Tradução
- text: Please answer to the following question. Who is going to be the next Ballon d'or?
  example_title: Question Answering
- text: 'Q: Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering.'
  example_title: Logical reasoning
- text: Please answer the following question. What is the boiling point of Nitrogen?
  example_title: Scientific knowledge
- text: Answer the following yes/no question. Can you write a whole Haiku in a single tweet?
  example_title: Yes/no question
- text: Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet?
  example_title: Reasoning task
- text: 'Q: ( False or not False or False ) is? A: Let''s think step by step'
  example_title: Boolean Expressions
- text: The square root of x is the cube root of y. What is y to the power of 2, if x = 4?
  example_title: Math reasoning
- text: 'Premise: At my age you will probably have learnt one lesson. Hypothesis: It''s not certain how many lessons you''ll learn by your thirties. Does the premise entail the hypothesis?'
  example_title: Premise and hypothesis
---

# Model Card for CQI-Multitool-Model (From Flan T5)

# Table of Contents

0. [TL;DR](#TL;DR)
1. [Model Details](#model-details)
2. [Usage](#usage)
3. [Uses](#uses)
4. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
5. [Training Details](#training-details)
6. [Evaluation](#evaluation)
7. [Environmental Impact](#environmental-impact)
8. [Citation](#citation)
9. [Model Card Authors](#model-card-authors)

# TL;DR

If you already know T5, FLAN-T5 is just better at everything.
For the same number of parameters, these models have been fine-tuned on more than 1000 additional tasks covering also more languages. As mentioned in the first few lines of the abstract : > Flan-PaLM 540B achieves state-of-the-art performance on several benchmarks, such as 75.2% on five-shot MMLU. We also publicly release Flan-T5 checkpoints,1 which achieve strong few-shot performance even compared to much larger models, such as PaLM 62B. Overall, instruction finetuning is a general method for improving the performance and usability of pretrained language models. **Disclaimer**: Content from **this** model card has been written by the Hugging Face team, and parts of it were copy pasted from the [T5 model card](https://huggingface.co/t5-large). # Model Details ## Model Description - **Model type:** Language model - **Language(s) (NLP):** English, Spanish, Japanese, Persian, Hindi, French, Chinese, Bengali, Gujarati, German, Telugu, Italian, Arabic, Polish, Tamil, Marathi, Malayalam, Oriya, Panjabi, Portuguese, Urdu, Galician, Hebrew, Korean, Catalan, Thai, Dutch, Indonesian, Vietnamese, Bulgarian, Filipino, Central Khmer, Lao, Turkish, Russian, Croatian, Swedish, Yoruba, Kurdish, Burmese, Malay, Czech, Finnish, Somali, Tagalog, Swahili, Sinhala, Kannada, Zhuang, Igbo, Xhosa, Romanian, Haitian, Estonian, Slovak, Lithuanian, Greek, Nepali, Assamese, Norwegian - **License:** Apache 2.0 - **Related Models:** [All FLAN-T5 Checkpoints](https://huggingface.co/models?search=flan-t5) - **Original Checkpoints:** [All Original FLAN-T5 Checkpoints](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints) - **Resources for more information:** - [Research paper](https://arxiv.org/pdf/2210.11416.pdf) - [GitHub Repo](https://github.com/google-research/t5x) - [Hugging Face FLAN-T5 Docs (Similar to T5) ](https://huggingface.co/docs/transformers/model_doc/t5) # Usage Find below some example scripts on how to use the model in `transformers`: ## Using the 
Pytorch model ### Running the model on a CPU <details> <summary> Click to expand </summary> ```python from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base") input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> ### Running the model on a GPU <details> <summary> Click to expand </summary> ```python # pip install accelerate from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base", device_map="auto") input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> ### Running the model on a GPU using different precisions #### FP16 <details> <summary> Click to expand </summary> ```python # pip install accelerate import torch from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base", device_map="auto", torch_dtype=torch.float16) input_text = "translate English to German: How old are you?" 
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> #### INT8 <details> <summary> Click to expand </summary> ```python # pip install bitsandbytes accelerate from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base", device_map="auto", load_in_8bit=True) input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> # Uses ## Direct Use and Downstream Use The authors write in [the original paper's model card](https://arxiv.org/pdf/2210.11416.pdf) that: > The primary use is research on language models, including: research on zero-shot NLP tasks and in-context few-shot learning NLP tasks, such as reasoning, and question answering; advancing fairness and safety research, and understanding limitations of current large language models See the [research paper](https://arxiv.org/pdf/2210.11416.pdf) for further details. ## Out-of-Scope Use More information needed. # Bias, Risks, and Limitations The information below in this section is copied from the model's [official model card](https://arxiv.org/pdf/2210.11416.pdf): > Language models, including Flan-T5, can potentially be used for language generation in a harmful way, according to Rae et al. (2021). Flan-T5 should not be used directly in any application, without a prior assessment of safety and fairness concerns specific to the application. ## Ethical considerations and risks > Flan-T5 is fine-tuned on a large corpus of text data that was not filtered for explicit content or assessed for existing biases. 
As a result the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data. ## Known Limitations > Flan-T5 has not been tested in real world applications. ## Sensitive Use: > Flan-T5 should not be applied for any unacceptable use cases, e.g., generation of abusive speech. # Training Details ## Training Data The model was trained on a mixture of tasks, that includes the tasks described in the table below (from the original paper, figure 2): ![table.png](https://s3.amazonaws.com/moonup/production/uploads/1666363265279-62441d1d9fdefb55a0b7d12c.png) ## Training Procedure According to the model card from the [original paper](https://arxiv.org/pdf/2210.11416.pdf): > These models are based on pretrained T5 (Raffel et al., 2020) and fine-tuned with instructions for better zero-shot and few-shot performance. There is one fine-tuned Flan model per T5 model size. The model has been trained on TPU v3 or TPU v4 pods, using [`t5x`](https://github.com/google-research/t5x) codebase together with [`jax`](https://github.com/google/jax). # Evaluation ## Testing Data, Factors & Metrics The authors evaluated the model on various tasks covering several languages (1836 in total). See the table below for some quantitative evaluation: ![image.png](https://s3.amazonaws.com/moonup/production/uploads/1668072995230-62441d1d9fdefb55a0b7d12c.png) For full details, please check the [research paper](https://arxiv.org/pdf/2210.11416.pdf). ## Results For full results for FLAN-T5-Base, see the [research paper](https://arxiv.org/pdf/2210.11416.pdf), Table 3. # Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** Google Cloud TPU Pods - TPU v3 or TPU v4 | Number of chips ≥ 4. 
- **Hours used:** More information needed - **Cloud Provider:** GCP - **Compute Region:** More information needed - **Carbon Emitted:** More information needed # Citation **BibTeX:** ```bibtex @misc{https://doi.org/10.48550/arxiv.2210.11416, doi = {10.48550/ARXIV.2210.11416}, url = {https://arxiv.org/abs/2210.11416}, author = {Chung, Hyung Won and Hou, Le and Longpre, Shayne and Zoph, Barret and Tay, Yi and Fedus, William and Li, Eric and Wang, Xuezhi and Dehghani, Mostafa and Brahma, Siddhartha and Webson, Albert and Gu, Shixiang Shane and Dai, Zhuyun and Suzgun, Mirac and Chen, Xinyun and Chowdhery, Aakanksha and Narang, Sharan and Mishra, Gaurav and Yu, Adams and Zhao, Vincent and Huang, Yanping and Dai, Andrew and Yu, Hongkun and Petrov, Slav and Chi, Ed H. and Dean, Jeff and Devlin, Jacob and Roberts, Adam and Zhou, Denny and Le, Quoc V. and Wei, Jason}, keywords = {Machine Learning (cs.LG), Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Scaling Instruction-Finetuned Language Models}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ``` ## Model Recycling [Evaluation on 36 datasets](https://ibm.github.io/model-recycling/model_gain_chart?avg=9.16&mnli_lp=nan&20_newsgroup=3.34&ag_news=1.49&amazon_reviews_multi=0.21&anli=13.91&boolq=16.75&cb=23.12&cola=9.97&copa=34.50&dbpedia=6.90&esnli=5.37&financial_phrasebank=18.66&imdb=0.33&isear=1.37&mnli=11.74&mrpc=16.63&multirc=6.24&poem_sentiment=14.62&qnli=3.41&qqp=6.18&rotten_tomatoes=2.98&rte=24.26&sst2=0.67&sst_5bins=5.44&stsb=20.68&trec_coarse=3.95&trec_fine=10.73&tweet_ev_emoji=13.39&tweet_ev_emotion=4.62&tweet_ev_hate=3.46&tweet_ev_irony=9.04&tweet_ev_offensive=1.69&tweet_ev_sentiment=0.75&wic=14.22&wnli=9.44&wsc=5.53&yahoo_answers=4.14&model_name=google%2Fflan-t5-base&base_name=google%2Ft5-v1_1-base) using google/flan-t5-base as a base model yields average score of 77.98 in comparison to 
68.82 by google/t5-v1_1-base. The model is ranked 1st among all tested models for the google/t5-v1_1-base architecture as of 06/02/2023 Results: | 20_newsgroup | ag_news | amazon_reviews_multi | anli | boolq | cb | cola | copa | dbpedia | esnli | financial_phrasebank | imdb | isear | mnli | mrpc | multirc | poem_sentiment | qnli | qqp | rotten_tomatoes | rte | sst2 | sst_5bins | stsb | trec_coarse | trec_fine | tweet_ev_emoji | tweet_ev_emotion | tweet_ev_hate | tweet_ev_irony | tweet_ev_offensive | tweet_ev_sentiment | wic | wnli | wsc | yahoo_answers | |---------------:|----------:|-----------------------:|--------:|--------:|--------:|--------:|-------:|----------:|--------:|-----------------------:|-------:|--------:|--------:|--------:|----------:|-----------------:|--------:|--------:|------------------:|--------:|--------:|------------:|--------:|--------------:|------------:|-----------------:|-------------------:|----------------:|-----------------:|---------------------:|---------------------:|--------:|-------:|--------:|----------------:| | 86.2188 | 89.6667 | 67.12 | 51.9688 | 82.3242 | 78.5714 | 80.1534 | 75 | 77.6667 | 90.9507 | 85.4 | 93.324 | 72.425 | 87.2457 | 89.4608 | 62.3762 | 82.6923 | 92.7878 | 89.7724 | 89.0244 | 84.8375 | 94.3807 | 57.2851 | 89.4759 | 97.2 | 92.8 | 46.848 | 80.2252 | 54.9832 | 76.6582 | 84.3023 | 70.6366 | 70.0627 | 56.338 | 53.8462 | 73.4 | For more information, see: [Model Recycling](https://ibm.github.io/model-recycling/)
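The Model Recycling scores above come from fine-tuning google/flan-t5-base on classification datasets. Since FLAN-T5 is a text-to-text model, such datasets first have to be cast into input/target string pairs. A minimal illustrative sketch of that preprocessing step (the prompt wording and label verbalizations below are our own assumptions, not the ones used in the evaluation above):

```python
# Illustrative sketch: casting a classification example into the
# input/target text pair that T5-style (text-to-text) fine-tuning expects.
# The prompt template and label words here are assumptions for the example.

def to_text2text(text: str, label: int, label_words: list[str]) -> dict:
    """Turn a (text, label) pair into a text-to-text training example."""
    prompt = f"classify the sentiment of the following review: {text}"
    return {"input_text": prompt, "target_text": label_words[label]}

example = to_text2text("A wonderful, heartfelt film.", 1, ["negative", "positive"])
print(example["target_text"])  # -> positive
```

The resulting `input_text`/`target_text` pairs can then be tokenized and fed to a seq2seq trainer in the usual way.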
null
Non_BioNLP
# Model Card for CQI-Multitool-Model (From Flan T5) # Table of Contents 0. [TL;DR](#TL;DR) 1. [Model Details](#model-details) 2. [Usage](#usage) 3. [Uses](#uses) 4. [Bias, Risks, and Limitations](#bias-risks-and-limitations) 5. [Training Details](#training-details) 6. [Evaluation](#evaluation) 7. [Environmental Impact](#environmental-impact) 8. [Citation](#citation) 9. [Model Card Authors](#model-card-authors) # TL;DR If you already know T5, FLAN-T5 is just better at everything. For the same number of parameters, these models have been fine-tuned on more than 1000 additional tasks covering also more languages. As mentioned in the first few lines of the abstract : > Flan-PaLM 540B achieves state-of-the-art performance on several benchmarks, such as 75.2% on five-shot MMLU. We also publicly release Flan-T5 checkpoints,1 which achieve strong few-shot performance even compared to much larger models, such as PaLM 62B. Overall, instruction finetuning is a general method for improving the performance and usability of pretrained language models. **Disclaimer**: Content from **this** model card has been written by the Hugging Face team, and parts of it were copy pasted from the [T5 model card](https://huggingface.co/t5-large). 
# Model Details ## Model Description - **Model type:** Language model - **Language(s) (NLP):** English, Spanish, Japanese, Persian, Hindi, French, Chinese, Bengali, Gujarati, German, Telugu, Italian, Arabic, Polish, Tamil, Marathi, Malayalam, Oriya, Panjabi, Portuguese, Urdu, Galician, Hebrew, Korean, Catalan, Thai, Dutch, Indonesian, Vietnamese, Bulgarian, Filipino, Central Khmer, Lao, Turkish, Russian, Croatian, Swedish, Yoruba, Kurdish, Burmese, Malay, Czech, Finnish, Somali, Tagalog, Swahili, Sinhala, Kannada, Zhuang, Igbo, Xhosa, Romanian, Haitian, Estonian, Slovak, Lithuanian, Greek, Nepali, Assamese, Norwegian - **License:** Apache 2.0 - **Related Models:** [All FLAN-T5 Checkpoints](https://huggingface.co/models?search=flan-t5) - **Original Checkpoints:** [All Original FLAN-T5 Checkpoints](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints) - **Resources for more information:** - [Research paper](https://arxiv.org/pdf/2210.11416.pdf) - [GitHub Repo](https://github.com/google-research/t5x) - [Hugging Face FLAN-T5 Docs (Similar to T5) ](https://huggingface.co/docs/transformers/model_doc/t5) # Usage Find below some example scripts on how to use the model in `transformers`: ## Using the Pytorch model ### Running the model on a CPU <details> <summary> Click to expand </summary> ```python from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base") input_text = "translate English to German: How old are you?" 
input_ids = tokenizer(input_text, return_tensors="pt").input_ids outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> ### Running the model on a GPU <details> <summary> Click to expand </summary> ```python # pip install accelerate from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base", device_map="auto") input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> ### Running the model on a GPU using different precisions #### FP16 <details> <summary> Click to expand </summary> ```python # pip install accelerate import torch from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base", device_map="auto", torch_dtype=torch.float16) input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> #### INT8 <details> <summary> Click to expand </summary> ```python # pip install bitsandbytes accelerate from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base", device_map="auto", load_in_8bit=True) input_text = "translate English to German: How old are you?" 
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> # Uses ## Direct Use and Downstream Use The authors write in [the original paper's model card](https://arxiv.org/pdf/2210.11416.pdf) that: > The primary use is research on language models, including: research on zero-shot NLP tasks and in-context few-shot learning NLP tasks, such as reasoning, and question answering; advancing fairness and safety research, and understanding limitations of current large language models See the [research paper](https://arxiv.org/pdf/2210.11416.pdf) for further details. ## Out-of-Scope Use More information needed. # Bias, Risks, and Limitations The information below in this section is copied from the model's [official model card](https://arxiv.org/pdf/2210.11416.pdf): > Language models, including Flan-T5, can potentially be used for language generation in a harmful way, according to Rae et al. (2021). Flan-T5 should not be used directly in any application, without a prior assessment of safety and fairness concerns specific to the application. ## Ethical considerations and risks > Flan-T5 is fine-tuned on a large corpus of text data that was not filtered for explicit content or assessed for existing biases. As a result the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data. ## Known Limitations > Flan-T5 has not been tested in real world applications. ## Sensitive Use: > Flan-T5 should not be applied for any unacceptable use cases, e.g., generation of abusive speech. 
# Training Details ## Training Data The model was trained on a mixture of tasks, that includes the tasks described in the table below (from the original paper, figure 2): ![table.png](https://s3.amazonaws.com/moonup/production/uploads/1666363265279-62441d1d9fdefb55a0b7d12c.png) ## Training Procedure According to the model card from the [original paper](https://arxiv.org/pdf/2210.11416.pdf): > These models are based on pretrained T5 (Raffel et al., 2020) and fine-tuned with instructions for better zero-shot and few-shot performance. There is one fine-tuned Flan model per T5 model size. The model has been trained on TPU v3 or TPU v4 pods, using [`t5x`](https://github.com/google-research/t5x) codebase together with [`jax`](https://github.com/google/jax). # Evaluation ## Testing Data, Factors & Metrics The authors evaluated the model on various tasks covering several languages (1836 in total). See the table below for some quantitative evaluation: ![image.png](https://s3.amazonaws.com/moonup/production/uploads/1668072995230-62441d1d9fdefb55a0b7d12c.png) For full details, please check the [research paper](https://arxiv.org/pdf/2210.11416.pdf). ## Results For full results for FLAN-T5-Base, see the [research paper](https://arxiv.org/pdf/2210.11416.pdf), Table 3. # Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** Google Cloud TPU Pods - TPU v3 or TPU v4 | Number of chips ≥ 4. 
- **Hours used:** More information needed - **Cloud Provider:** GCP - **Compute Region:** More information needed - **Carbon Emitted:** More information needed # Citation **BibTeX:** ```bibtex @misc{https://doi.org/10.48550/arxiv.2210.11416, doi = {10.48550/ARXIV.2210.11416}, url = {https://arxiv.org/abs/2210.11416}, author = {Chung, Hyung Won and Hou, Le and Longpre, Shayne and Zoph, Barret and Tay, Yi and Fedus, William and Li, Eric and Wang, Xuezhi and Dehghani, Mostafa and Brahma, Siddhartha and Webson, Albert and Gu, Shixiang Shane and Dai, Zhuyun and Suzgun, Mirac and Chen, Xinyun and Chowdhery, Aakanksha and Narang, Sharan and Mishra, Gaurav and Yu, Adams and Zhao, Vincent and Huang, Yanping and Dai, Andrew and Yu, Hongkun and Petrov, Slav and Chi, Ed H. and Dean, Jeff and Devlin, Jacob and Roberts, Adam and Zhou, Denny and Le, Quoc V. and Wei, Jason}, keywords = {Machine Learning (cs.LG), Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Scaling Instruction-Finetuned Language Models}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ``` ## Model Recycling [Evaluation on 36 datasets](https://ibm.github.io/model-recycling/model_gain_chart?avg=9.16&mnli_lp=nan&20_newsgroup=3.34&ag_news=1.49&amazon_reviews_multi=0.21&anli=13.91&boolq=16.75&cb=23.12&cola=9.97&copa=34.50&dbpedia=6.90&esnli=5.37&financial_phrasebank=18.66&imdb=0.33&isear=1.37&mnli=11.74&mrpc=16.63&multirc=6.24&poem_sentiment=14.62&qnli=3.41&qqp=6.18&rotten_tomatoes=2.98&rte=24.26&sst2=0.67&sst_5bins=5.44&stsb=20.68&trec_coarse=3.95&trec_fine=10.73&tweet_ev_emoji=13.39&tweet_ev_emotion=4.62&tweet_ev_hate=3.46&tweet_ev_irony=9.04&tweet_ev_offensive=1.69&tweet_ev_sentiment=0.75&wic=14.22&wnli=9.44&wsc=5.53&yahoo_answers=4.14&model_name=google%2Fflan-t5-base&base_name=google%2Ft5-v1_1-base) using google/flan-t5-base as a base model yields average score of 77.98 in comparison to 
68.82 by google/t5-v1_1-base. The model is ranked 1st among all tested models for the google/t5-v1_1-base architecture as of 06/02/2023 Results: | 20_newsgroup | ag_news | amazon_reviews_multi | anli | boolq | cb | cola | copa | dbpedia | esnli | financial_phrasebank | imdb | isear | mnli | mrpc | multirc | poem_sentiment | qnli | qqp | rotten_tomatoes | rte | sst2 | sst_5bins | stsb | trec_coarse | trec_fine | tweet_ev_emoji | tweet_ev_emotion | tweet_ev_hate | tweet_ev_irony | tweet_ev_offensive | tweet_ev_sentiment | wic | wnli | wsc | yahoo_answers | |---------------:|----------:|-----------------------:|--------:|--------:|--------:|--------:|-------:|----------:|--------:|-----------------------:|-------:|--------:|--------:|--------:|----------:|-----------------:|--------:|--------:|------------------:|--------:|--------:|------------:|--------:|--------------:|------------:|-----------------:|-------------------:|----------------:|-----------------:|---------------------:|---------------------:|--------:|-------:|--------:|----------------:| | 86.2188 | 89.6667 | 67.12 | 51.9688 | 82.3242 | 78.5714 | 80.1534 | 75 | 77.6667 | 90.9507 | 85.4 | 93.324 | 72.425 | 87.2457 | 89.4608 | 62.3762 | 82.6923 | 92.7878 | 89.7724 | 89.0244 | 84.8375 | 94.3807 | 57.2851 | 89.4759 | 97.2 | 92.8 | 46.848 | 80.2252 | 54.9832 | 76.6582 | 84.3023 | 70.6366 | 70.0627 | 56.338 | 53.8462 | 73.4 | For more information, see: [Model Recycling](https://ibm.github.io/model-recycling/)
{"datasets": ["svakulenk0/qrecc", "taskmaster2", "djaym7/wiki_dialog", "deepmind/code_contests", "lambada", "gsm8k", "aqua_rat", "esnli", "quasc", "qed"], "language": ["en", "fr", "ro", "de", "multilingual"], "license": "apache-2.0", "tags": ["text2text-generation"], "widget": [{"text": "Translate to English: Meu nome é Bruno.", "example_title": "Tradução"}, {"text": "Please answer to the following question. Who is going to be the next Ballon d'or?", "example_title": "Question Answering"}, {"text": "Q: Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering.", "example_title": "Logical reasoning"}, {"text": "Please answer the following question. What is the boiling point of Nitrogen?", "example_title": "Scientific knowledge"}, {"text": "Answer the following yes/no question. Can you write a whole Haiku in a single tweet?", "example_title": "Yes/no question"}, {"text": "Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet?", "example_title": "Reasoning task"}, {"text": "Q: ( False or not False or False ) is? A: Let's think step by step", "example_title": "Boolean Expressions"}, {"text": "The square root of x is the cube root of y. What is y to the power of 2, if x = 4?", "example_title": "Math reasoning"}, {"text": "Premise: At my age you will probably have learnt one lesson. Hypothesis: It's not certain how many lessons you'll learn by your thirties. Does the premise entail the hypothesis?", "example_title": "Premise and hypothesis"}]}
task
[ "QUESTION_ANSWERING" ]
42951
RichardErkhov/HuggingFaceTB_-_SmolLM2-1.7B-Instruct-awq
RichardErkhov
null
[ "safetensors", "llama", "4-bit", "awq", "region:us" ]
2024-12-01T17:26:10Z
2024-12-01T17:26:48+00:00
11
0
--- {} --- Quantization made by Richard Erkhov. [Github](https://github.com/RichardErkhov) [Discord](https://discord.gg/pvy7H8DZMG) [Request more models](https://github.com/RichardErkhov/quant_request) SmolLM2-1.7B-Instruct - AWQ - Model creator: https://huggingface.co/HuggingFaceTB/ - Original model: https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct/ Original model description: --- library_name: transformers license: apache-2.0 language: - en pipeline_tag: text-generation tags: - safetensors - onnx - transformers.js base_model: - HuggingFaceTB/SmolLM2-1.7B --- # SmolLM2 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/61c141342aac764ce1654e43/y45hIMNREW7w_XpHYB_0q.png) ## Table of Contents 1. [Model Summary](#model-summary) 2. [Evaluation](#evaluation) 3. [Examples](#examples) 4. [Limitations](#limitations) 5. [Training](#training) 6. [License](#license) 7. [Citation](#citation) ## Model Summary SmolLM2 is a family of compact language models available in three sizes: 135M, 360M, and 1.7B parameters. They are capable of solving a wide range of tasks while being lightweight enough to run on-device. The 1.7B variant demonstrates significant advances over its predecessor SmolLM1-1.7B, particularly in instruction following, knowledge, reasoning, and mathematics. It was trained on 11 trillion tokens using a diverse dataset combination: FineWeb-Edu, DCLM, The Stack, along with new mathematics and coding datasets that we curated and will release soon. We developed the instruct version through supervised fine-tuning (SFT) using a combination of public datasets and our own curated datasets. We then applied Direct Preference Optimization (DPO) using [UltraFeedback](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized). 
The instruct model additionally supports tasks such as text rewriting, summarization and function calling thanks to datasets developed by [Argilla](https://huggingface.co/argilla) such as [Synth-APIGen-v0.1](https://huggingface.co/datasets/argilla/Synth-APIGen-v0.1). You can find the SFT dataset here: https://huggingface.co/datasets/HuggingFaceTB/smoltalk. For more details refer to: https://github.com/huggingface/smollm. You will find pre-training, post-training, evaluation and local inference code. ### How to use ### Transformers ```bash pip install transformers ``` ```python from transformers import AutoModelForCausalLM, AutoTokenizer checkpoint = "HuggingFaceTB/SmolLM2-1.7B-Instruct" device = "cuda" # for GPU usage or "cpu" for CPU usage tokenizer = AutoTokenizer.from_pretrained(checkpoint) # for multiple GPUs install accelerate and do `model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")` model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device) messages = [{"role": "user", "content": "What is the capital of France."}] input_text=tokenizer.apply_chat_template(messages, tokenize=False) inputs = tokenizer.encode(input_text, return_tensors="pt").to(device) outputs = model.generate(inputs, max_new_tokens=50, temperature=0.2, top_p=0.9, do_sample=True) print(tokenizer.decode(outputs[0])) ``` ### Chat in TRL You can also use the TRL CLI to chat with the model from the terminal: ```bash pip install trl trl chat --model_name_or_path HuggingFaceTB/SmolLM2-1.7B-Instruct --device cpu ``` ## Evaluation In this section, we report the evaluation results of SmolLM2. All evaluations are zero-shot unless stated otherwise, and we use [lighteval](https://github.com/huggingface/lighteval) to run them. 
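Most of the scores below are zero-shot, while entries such as "GSM8K (5-shot)" prepend solved demonstrations to the prompt. A small illustrative sketch of the difference (the template below is a simplification we chose for clarity, not the exact lighteval prompt format):

```python
# Illustrative sketch of zero-shot vs. few-shot prompting: a k-shot setting
# simply prepends k solved examples to the test question. The "Question:/
# Answer:" template is an assumption, not the actual evaluation harness format.

def build_prompt(question: str, shots: tuple[tuple[str, str], ...] = ()) -> str:
    parts = [f"Question: {q}\nAnswer: {a}" for q, a in shots]
    parts.append(f"Question: {question}\nAnswer:")
    return "\n\n".join(parts)

zero_shot = build_prompt("What is 2 + 2?")  # no demonstrations
few_shot = build_prompt("What is 2 + 2?",
                        shots=(("What is 1 + 1?", "2"),))  # one demonstration
print(few_shot.count("Question:"))  # -> 2
```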
## Base Pre-Trained Model | Metric | SmolLM2-1.7B | Llama-1B | Qwen2.5-1.5B | SmolLM1-1.7B | |------------------|--------------|-------------|---------------|--------------| | HellaSwag | **68.7** | 61.2 | 66.4 | 62.9 | | ARC (Average) | **60.5** | 49.2 | 58.5 | 59.9 | | PIQA | **77.6** | 74.8 | 76.1 | 76.0 | | MMLU-Pro (MCF) | **19.4** | 11.7 | 13.7 | 10.8 | | CommonsenseQA | **43.6** | 41.2 | 34.1 | 38.0 | | TriviaQA | **36.7** | 28.1 | 20.9 | 22.5 | | Winogrande | **59.4** | 57.8 | 59.3 | 54.7 | | OpenBookQA | 42.2 | 38.4 | 40.0 | **42.4** | | GSM8K (5-shot) | 31.0 | 7.2 | **61.3** | 5.5 | ## Instruction Model | Metric | SmolLM2-1.7B-Instruct | Llama-1B-Instruct | Qwen2.5-1.5B-Instruct | SmolLM1-1.7B-Instruct | |:-----------------------------|:---------------------:|:-----------------:|:----------------------:|:----------------------:| | IFEval (Average prompt/inst) | **56.7** | 53.5 | 47.4 | 23.1 | | MT-Bench | 6.13 | 5.48 | **6.52** | 4.33 | | OpenRewrite-Eval (micro_avg RougeL) | 44.9 | 39.2 | **46.9** | NaN | | HellaSwag | **66.1** | 56.1 | 60.9 | 55.5 | | ARC (Average) | **51.7** | 41.6 | 46.2 | 43.7 | | PIQA | **74.4** | 72.3 | 73.2 | 71.6 | | MMLU-Pro (MCF) | 19.3 | 12.7 | **24.2** | 11.7 | | BBH (3-shot) | 32.2 | 27.6 | **35.3** | 25.7 | | GSM8K (5-shot) | **48.2** | 26.8 | 42.8 | 4.62 | ## Examples Below are some system and instruct prompts that work well for special tasks ### Text rewriting ```python system_prompt_rewrite = "You are an AI writing assistant. Your task is to rewrite the user's email to make it more professional and approachable while maintaining its main points and key message. Do not return any text other than the rewritten message." user_prompt_rewrite = "Rewrite the message below to make it more friendly and approachable while maintaining its main points and key message. 
Do not add any new information or return any text other than the rewritten message\nThe message:" messages = [{"role": "system", "content": system_prompt_rewrite}, {"role": "user", "content":f"{user_prompt_rewrite} The CI is failing after your last commit!"}] input_text=tokenizer.apply_chat_template(messages, tokenize=False) inputs = tokenizer.encode(input_text, return_tensors="pt").to(device) outputs = model.generate(inputs, max_new_tokens=50, temperature=0.2, top_p=0.9, do_sample=True) print(tokenizer.decode(outputs[0])) ``` ``` Hey there! I noticed that the CI isn't passing after your latest commit. Could you take a look and let me know what's going on? Thanks so much for your help! ``` ### Summarization ```python system_prompt_summarize = "Provide a concise, objective summary of the input text in up to three sentences, focusing on key actions and intentions without using second or third person pronouns." messages = [{"role": "system", "content": system_prompt_summarize}, {"role": "user", "content": INSERT_LONG_EMAIL}] input_text=tokenizer.apply_chat_template(messages, tokenize=False) inputs = tokenizer.encode(input_text, return_tensors="pt").to(device) outputs = model.generate(inputs, max_new_tokens=50, temperature=0.2, top_p=0.9, do_sample=True) print(tokenizer.decode(outputs[0])) ``` ### Function calling SmolLM2-1.7B-Instruct can handle function calling, it scores 27% on the [BFCL Leaderboard](https://gorilla.cs.berkeley.edu/blogs/8_berkeley_function_calling_leaderboard.html). Here's how you can leverage it: ```python import json import re from typing import Optional from jinja2 import Template import torch from transformers import AutoModelForCausalLM, AutoTokenizer from transformers.utils import get_json_schema system_prompt = Template("""You are an expert in composing functions. You are given a question and a set of possible functions. Based on the question, you will need to make one or more function/tool calls to achieve the purpose. 
If none of the functions can be used, point it out and refuse to answer. If the given question lacks the parameters required by the function, also point it out. You have access to the following tools: <tools>{{ tools }}</tools> The output MUST strictly adhere to the following format, and NO other text MUST be included. The example format is as follows. Please make sure the parameter type is correct. If no function call is needed, please make the tool calls an empty list '[]'. <tool_call>[ {"name": "func_name1", "arguments": {"argument1": "value1", "argument2": "value2"}}, ... (more tool calls as required) ]</tool_call>""") def prepare_messages( query: str, tools: Optional[dict[str, any]] = None, history: Optional[list[dict[str, str]]] = None ) -> list[dict[str, str]]: """Prepare the system and user messages for the given query and tools. Args: query: The query to be answered. tools: The tools available to the user. Defaults to None, in which case if a list without content will be passed to the model. history: Exchange of messages, including the system_prompt from the first query. Defaults to None, the first message in a conversation. """ if tools is None: tools = [] if history: messages = history.copy() messages.append({"role": "user", "content": query}) else: messages = [ {"role": "system", "content": system_prompt.render(tools=json.dumps(tools))}, {"role": "user", "content": query} ] return messages def parse_response(text: str) -> str | dict[str, any]: """Parses a response from the model, returning either the parsed list with the tool calls parsed, or the model thought or response if couldn't generate one. Args: text: Response from the model. 
""" pattern = r"<tool_call>(.*?)</tool_call>" matches = re.findall(pattern, text, re.DOTALL) if matches: return json.loads(matches[0]) return text model_name_smollm = "HuggingFaceTB/SmolLM2-1.7B-Instruct" model = AutoModelForCausalLM.from_pretrained(model_name_smollm, device_map="auto", torch_dtype="auto", trust_remote_code=True) tokenizer = AutoTokenizer.from_pretrained(model_name_smollm) from datetime import datetime import random def get_current_time() -> str: """Returns the current time in 24-hour format. Returns: str: Current time in HH:MM:SS format. """ return datetime.now().strftime("%H:%M:%S") def get_random_number_between(min: int, max: int) -> int: """ Gets a random number between min and max. Args: min: The minimum number. max: The maximum number. Returns: A random number between min and max. """ return random.randint(min, max) tools = [get_json_schema(get_random_number_between), get_json_schema(get_current_time)] toolbox = {"get_random_number_between": get_random_number_between, "get_current_time": get_current_time} query = "Give me a number between 1 and 300" messages = prepare_messages(query, tools=tools) inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device) outputs = model.generate(inputs, max_new_tokens=512, do_sample=False, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id) result = tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True) tool_calls = parse_response(result) # [{'name': 'get_random_number_between', 'arguments': {'min': 1, 'max': 300}} # Get tool responses tool_responses = [toolbox.get(tc["name"])(*tc["arguments"].values()) for tc in tool_calls] # [63] # For the second turn, rebuild the history of messages: history = messages.copy() # Add the "parsed response" history.append({"role": "assistant", "content": result}) query = "Can you give me the hour?" 
history.append({"role": "user", "content": query}) inputs = tokenizer.apply_chat_template(history, add_generation_prompt=True, return_tensors="pt").to(model.device) outputs = model.generate(inputs, max_new_tokens=512, do_sample=False, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id) result = tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True) tool_calls = parse_response(result) tool_responses = [toolbox.get(tc["name"])(*tc["arguments"].values()) for tc in tool_calls] # ['07:57:25'] ``` More details, such as parallel function calls and handling tools that are not available, can be found [here](https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct/blob/main/instructions_function_calling.md) ## Limitations SmolLM2 models primarily understand and generate content in English. They can produce text on a variety of topics, but the generated content may not always be factually accurate, logically consistent, or free from biases present in the training data. These models should be used as assistive tools rather than definitive sources of information. Users should always verify important information and critically evaluate any generated content. ## Training ### Model - **Architecture:** Transformer decoder - **Pretraining tokens:** 11T - **Precision:** bfloat16 ### Hardware - **GPUs:** 256 H100 ### Software - **Training Framework:** [nanotron](https://github.com/huggingface/nanotron/tree/main) - **Alignment Handbook:** [alignment-handbook](https://github.com/huggingface/alignment-handbook/) ## License [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) ## Citation ```bibtex @misc{allal2024SmolLM2, title={SmolLM2 - with great data, comes great performance}, author={Loubna Ben Allal and Anton Lozhkov and Elie Bakouch and Gabriel Martín Blázquez and Lewis Tunstall and Agustín Piqueres and Andres Marafioti and Cyril Zakka and Leandro von Werra and Thomas Wolf}, year={2024}, } ```
null
Non_BioNLP
Quantization made by Richard Erkhov. [Github](https://github.com/RichardErkhov) [Discord](https://discord.gg/pvy7H8DZMG) [Request more models](https://github.com/RichardErkhov/quant_request) SmolLM2-1.7B-Instruct - AWQ - Model creator: https://huggingface.co/HuggingFaceTB/ - Original model: https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct/ Original model description: --- library_name: transformers license: apache-2.0 language: - en pipeline_tag: text-generation tags: - safetensors - onnx - transformers.js base_model: - HuggingFaceTB/SmolLM2-1.7B --- # SmolLM2 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/61c141342aac764ce1654e43/y45hIMNREW7w_XpHYB_0q.png) ## Table of Contents 1. [Model Summary](#model-summary) 2. [Evaluation](#evaluation) 3. [Examples](#examples) 4. [Limitations](#limitations) 5. [Training](#training) 6. [License](#license) 7. [Citation](#citation) ## Model Summary SmolLM2 is a family of compact language models available in three sizes: 135M, 360M, and 1.7B parameters. They are capable of solving a wide range of tasks while being lightweight enough to run on-device. The 1.7B variant demonstrates significant advances over its predecessor SmolLM1-1.7B, particularly in instruction following, knowledge, reasoning, and mathematics. It was trained on 11 trillion tokens using a diverse dataset combination: FineWeb-Edu, DCLM, The Stack, along with new mathematics and coding datasets that we curated and will release soon. We developed the instruct version through supervised fine-tuning (SFT) using a combination of public datasets and our own curated datasets. We then applied Direct Preference Optimization (DPO) using [UltraFeedback](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized).
The instruct model additionally supports tasks such as text rewriting, summarization and function calling thanks to datasets developed by [Argilla](https://huggingface.co/argilla) such as [Synth-APIGen-v0.1](https://huggingface.co/datasets/argilla/Synth-APIGen-v0.1). You can find the SFT dataset here: https://huggingface.co/datasets/HuggingFaceTB/smoltalk. For more details refer to: https://github.com/huggingface/smollm. You will find pre-training, post-training, evaluation and local inference code. ### How to use ### Transformers ```bash pip install transformers ``` ```python from transformers import AutoModelForCausalLM, AutoTokenizer checkpoint = "HuggingFaceTB/SmolLM2-1.7B-Instruct" device = "cuda" # for GPU usage or "cpu" for CPU usage tokenizer = AutoTokenizer.from_pretrained(checkpoint) # for multiple GPUs install accelerate and do `model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")` model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device) messages = [{"role": "user", "content": "What is the capital of France."}] input_text=tokenizer.apply_chat_template(messages, tokenize=False) inputs = tokenizer.encode(input_text, return_tensors="pt").to(device) outputs = model.generate(inputs, max_new_tokens=50, temperature=0.2, top_p=0.9, do_sample=True) print(tokenizer.decode(outputs[0])) ``` ### Chat in TRL You can also use the TRL CLI to chat with the model from the terminal: ```bash pip install trl trl chat --model_name_or_path HuggingFaceTB/SmolLM2-1.7B-Instruct --device cpu ``` ## Evaluation In this section, we report the evaluation results of SmolLM2. All evaluations are zero-shot unless stated otherwise, and we use [lighteval](https://github.com/huggingface/lighteval) to run them. 
## Base Pre-Trained Model | Metric | SmolLM2-1.7B | Llama-1B | Qwen2.5-1.5B | SmolLM1-1.7B | |------------------|--------------|-------------|---------------|--------------| | HellaSwag | **68.7** | 61.2 | 66.4 | 62.9 | | ARC (Average) | **60.5** | 49.2 | 58.5 | 59.9 | | PIQA | **77.6** | 74.8 | 76.1 | 76.0 | | MMLU-Pro (MCF) | **19.4** | 11.7 | 13.7 | 10.8 | | CommonsenseQA | **43.6** | 41.2 | 34.1 | 38.0 | | TriviaQA | **36.7** | 28.1 | 20.9 | 22.5 | | Winogrande | **59.4** | 57.8 | 59.3 | 54.7 | | OpenBookQA | 42.2 | 38.4 | 40.0 | **42.4** | | GSM8K (5-shot) | 31.0 | 7.2 | **61.3** | 5.5 | ## Instruction Model | Metric | SmolLM2-1.7B-Instruct | Llama-1B-Instruct | Qwen2.5-1.5B-Instruct | SmolLM1-1.7B-Instruct | |:-----------------------------|:---------------------:|:-----------------:|:----------------------:|:----------------------:| | IFEval (Average prompt/inst) | **56.7** | 53.5 | 47.4 | 23.1 | | MT-Bench | 6.13 | 5.48 | **6.52** | 4.33 | | OpenRewrite-Eval (micro_avg RougeL) | 44.9 | 39.2 | **46.9** | NaN | | HellaSwag | **66.1** | 56.1 | 60.9 | 55.5 | | ARC (Average) | **51.7** | 41.6 | 46.2 | 43.7 | | PIQA | **74.4** | 72.3 | 73.2 | 71.6 | | MMLU-Pro (MCF) | 19.3 | 12.7 | **24.2** | 11.7 | | BBH (3-shot) | 32.2 | 27.6 | **35.3** | 25.7 | | GSM8K (5-shot) | **48.2** | 26.8 | 42.8 | 4.62 | ## Examples Below are some system and instruct prompts that work well for special tasks ### Text rewriting ```python system_prompt_rewrite = "You are an AI writing assistant. Your task is to rewrite the user's email to make it more professional and approachable while maintaining its main points and key message. Do not return any text other than the rewritten message." user_prompt_rewrite = "Rewrite the message below to make it more friendly and approachable while maintaining its main points and key message. 
Do not add any new information or return any text other than the rewritten message\nThe message:" messages = [{"role": "system", "content": system_prompt_rewrite}, {"role": "user", "content":f"{user_prompt_rewrite} The CI is failing after your last commit!"}] input_text=tokenizer.apply_chat_template(messages, tokenize=False) inputs = tokenizer.encode(input_text, return_tensors="pt").to(device) outputs = model.generate(inputs, max_new_tokens=50, temperature=0.2, top_p=0.9, do_sample=True) print(tokenizer.decode(outputs[0])) ``` ``` Hey there! I noticed that the CI isn't passing after your latest commit. Could you take a look and let me know what's going on? Thanks so much for your help! ``` ### Summarization ```python system_prompt_summarize = "Provide a concise, objective summary of the input text in up to three sentences, focusing on key actions and intentions without using second or third person pronouns." messages = [{"role": "system", "content": system_prompt_summarize}, {"role": "user", "content": INSERT_LONG_EMAIL}] input_text=tokenizer.apply_chat_template(messages, tokenize=False) inputs = tokenizer.encode(input_text, return_tensors="pt").to(device) outputs = model.generate(inputs, max_new_tokens=50, temperature=0.2, top_p=0.9, do_sample=True) print(tokenizer.decode(outputs[0])) ``` ### Function calling SmolLM2-1.7B-Instruct can handle function calling; it scores 27% on the [BFCL Leaderboard](https://gorilla.cs.berkeley.edu/blogs/8_berkeley_function_calling_leaderboard.html). Here's how you can leverage it: ```python import json import re from typing import Optional from jinja2 import Template import torch from transformers import AutoModelForCausalLM, AutoTokenizer from transformers.utils import get_json_schema system_prompt = Template("""You are an expert in composing functions. You are given a question and a set of possible functions. Based on the question, you will need to make one or more function/tool calls to achieve the purpose.
If none of the functions can be used, point it out and refuse to answer. If the given question lacks the parameters required by the function, also point it out. You have access to the following tools: <tools>{{ tools }}</tools> The output MUST strictly adhere to the following format, and NO other text MUST be included. The example format is as follows. Please make sure the parameter type is correct. If no function call is needed, please make the tool calls an empty list '[]'. <tool_call>[ {"name": "func_name1", "arguments": {"argument1": "value1", "argument2": "value2"}}, ... (more tool calls as required) ]</tool_call>""") def prepare_messages( query: str, tools: Optional[dict[str, any]] = None, history: Optional[list[dict[str, str]]] = None ) -> list[dict[str, str]]: """Prepare the system and user messages for the given query and tools. Args: query: The query to be answered. tools: The tools available to the user. Defaults to None, in which case an empty list will be passed to the model. history: Exchange of messages, including the system_prompt from the first query. Defaults to None, the first message in a conversation. """ if tools is None: tools = [] if history: messages = history.copy() messages.append({"role": "user", "content": query}) else: messages = [ {"role": "system", "content": system_prompt.render(tools=json.dumps(tools))}, {"role": "user", "content": query} ] return messages def parse_response(text: str) -> str | dict[str, any]: """Parses a response from the model, returning either the parsed list of tool calls, or the raw model response if no tool call could be parsed. Args: text: Response from the model.
""" pattern = r"<tool_call>(.*?)</tool_call>" matches = re.findall(pattern, text, re.DOTALL) if matches: return json.loads(matches[0]) return text model_name_smollm = "HuggingFaceTB/SmolLM2-1.7B-Instruct" model = AutoModelForCausalLM.from_pretrained(model_name_smollm, device_map="auto", torch_dtype="auto", trust_remote_code=True) tokenizer = AutoTokenizer.from_pretrained(model_name_smollm) from datetime import datetime import random def get_current_time() -> str: """Returns the current time in 24-hour format. Returns: str: Current time in HH:MM:SS format. """ return datetime.now().strftime("%H:%M:%S") def get_random_number_between(min: int, max: int) -> int: """ Gets a random number between min and max. Args: min: The minimum number. max: The maximum number. Returns: A random number between min and max. """ return random.randint(min, max) tools = [get_json_schema(get_random_number_between), get_json_schema(get_current_time)] toolbox = {"get_random_number_between": get_random_number_between, "get_current_time": get_current_time} query = "Give me a number between 1 and 300" messages = prepare_messages(query, tools=tools) inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device) outputs = model.generate(inputs, max_new_tokens=512, do_sample=False, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id) result = tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True) tool_calls = parse_response(result) # [{'name': 'get_random_number_between', 'arguments': {'min': 1, 'max': 300}} # Get tool responses tool_responses = [toolbox.get(tc["name"])(*tc["arguments"].values()) for tc in tool_calls] # [63] # For the second turn, rebuild the history of messages: history = messages.copy() # Add the "parsed response" history.append({"role": "assistant", "content": result}) query = "Can you give me the hour?" 
history.append({"role": "user", "content": query}) inputs = tokenizer.apply_chat_template(history, add_generation_prompt=True, return_tensors="pt").to(model.device) outputs = model.generate(inputs, max_new_tokens=512, do_sample=False, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id) result = tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True) tool_calls = parse_response(result) tool_responses = [toolbox.get(tc["name"])(*tc["arguments"].values()) for tc in tool_calls] # ['07:57:25'] ``` More details, such as parallel function calls and handling tools that are not available, can be found [here](https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct/blob/main/instructions_function_calling.md) ## Limitations SmolLM2 models primarily understand and generate content in English. They can produce text on a variety of topics, but the generated content may not always be factually accurate, logically consistent, or free from biases present in the training data. These models should be used as assistive tools rather than definitive sources of information. Users should always verify important information and critically evaluate any generated content. ## Training ### Model - **Architecture:** Transformer decoder - **Pretraining tokens:** 11T - **Precision:** bfloat16 ### Hardware - **GPUs:** 256 H100 ### Software - **Training Framework:** [nanotron](https://github.com/huggingface/nanotron/tree/main) - **Alignment Handbook:** [alignment-handbook](https://github.com/huggingface/alignment-handbook/) ## License [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) ## Citation ```bash @misc{allal2024SmolLM2, title={SmolLM2 - with great data, comes great performance}, author={Loubna Ben Allal and Anton Lozhkov and Elie Bakouch and Gabriel Martín Blázquez and Lewis Tunstall and Agustín Piqueres and Andres Marafioti and Cyril Zakka and Leandro von Werra and Thomas Wolf}, year={2024}, } ```
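As a quick offline sanity check, the `parse_response` helper from the card's function-calling snippet can be exercised on a synthetic response string; the sample below is illustrative, not actual model output:

```python
import json
import re

def parse_response(text):
    """Extract the JSON tool-call list from a <tool_call>...</tool_call> block."""
    matches = re.findall(r"<tool_call>(.*?)</tool_call>", text, re.DOTALL)
    if matches:
        return json.loads(matches[0])
    return text

# Illustrative sample mimicking the format the system prompt requests.
sample = '<tool_call>[{"name": "get_random_number_between", "arguments": {"min": 1, "max": 300}}]</tool_call>'
calls = parse_response(sample)
# calls -> [{'name': 'get_random_number_between', 'arguments': {'min': 1, 'max': 300}}]
```

When no `<tool_call>` block is present, the helper returns the raw text unchanged, which is how the card's example distinguishes tool calls from plain answers.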
{}
task
[ "SUMMARIZATION" ]
42,952
entaroid/distilbert-base-uncased-finetuned-emotion
entaroid
text-classification
[ "transformers", "pytorch", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-08-29T13:44:53Z
2023-11-14T17:08:46+00:00
10
0
--- base_model: distilbert-base-uncased datasets: - emotion license: apache-2.0 metrics: - accuracy - f1 tags: - generated_from_trainer model-index: - name: distilbert-base-uncased-finetuned-emotion results: - task: type: text-classification name: Text Classification dataset: name: emotion type: emotion config: split split: validation args: split metrics: - type: accuracy value: 0.9265 name: Accuracy - type: f1 value: 0.9261920632620516 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2144 - Accuracy: 0.9265 - F1: 0.9262 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | No log | 1.0 | 250 | 0.3298 | 0.9085 | 0.9077 | | No log | 2.0 | 500 | 0.2144 | 0.9265 | 0.9262 | ### Framework versions - Transformers 4.34.1 - Pytorch 2.1.0 - Datasets 2.14.6 - Tokenizers 0.14.1
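The linear `lr_scheduler_type` listed above decays the learning rate from 2e-05 toward 0 over the 500 training steps (2 epochs × 250 steps per epoch). A minimal sketch of that schedule, assuming zero warmup steps (the Trainer default when none is specified):

```python
def linear_schedule_lr(step, total_steps=500, base_lr=2e-05):
    """Learning rate after `step` optimizer steps under a linear decay schedule."""
    if step >= total_steps:
        return 0.0
    return base_lr * (1.0 - step / total_steps)

linear_schedule_lr(0)    # 2e-05 at the start of training
linear_schedule_lr(250)  # 1e-05 halfway through (end of epoch 1)
```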
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2144 - Accuracy: 0.9265 - F1: 0.9262 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | No log | 1.0 | 250 | 0.3298 | 0.9085 | 0.9077 | | No log | 2.0 | 500 | 0.2144 | 0.9265 | 0.9262 | ### Framework versions - Transformers 4.34.1 - Pytorch 2.1.0 - Datasets 2.14.6 - Tokenizers 0.14.1
{"base_model": "distilbert-base-uncased", "datasets": ["emotion"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "config": "split", "split": "validation", "args": "split"}, "metrics": [{"type": "accuracy", "value": 0.9265, "name": "Accuracy"}, {"type": "f1", "value": 0.9261920632620516, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,953
gokuls/distilbert_add_GLUE_Experiment_logit_kd_mrpc_384
gokuls
text-classification
[ "transformers", "pytorch", "tensorboard", "distilbert", "text-classification", "generated_from_trainer", "en", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2023-01-28T23:22:41Z
2023-01-28T23:24:30+00:00
134
0
--- datasets: - glue language: - en license: apache-2.0 metrics: - accuracy - f1 tags: - generated_from_trainer model-index: - name: distilbert_add_GLUE_Experiment_logit_kd_mrpc_384 results: - task: type: text-classification name: Text Classification dataset: name: GLUE MRPC type: glue config: mrpc split: validation args: mrpc metrics: - type: accuracy value: 0.3161764705882353 name: Accuracy - type: f1 value: 0.0 name: F1 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert_add_GLUE_Experiment_logit_kd_mrpc_384 This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the GLUE MRPC dataset. It achieves the following results on the evaluation set: - Loss: 0.5290 - Accuracy: 0.3162 - F1: 0.0 - Combined Score: 0.1581 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 256 - eval_batch_size: 256 - seed: 10 - distributed_type: multi-GPU - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score | |:-------------:|:-----:|:----:|:---------------:|:--------:|:---:|:--------------:| | 0.5334 | 1.0 | 15 | 0.5306 | 0.3162 | 0.0 | 0.1581 | | 0.5311 | 2.0 | 30 | 0.5290 | 0.3162 | 0.0 | 0.1581 | | 0.5295 | 3.0 | 45 | 0.5295 | 0.3162 | 0.0 | 0.1581 | | 0.5295 | 4.0 | 60 | 0.5292 | 0.3162 | 0.0 | 0.1581 | | 0.5283 | 5.0 | 75 | 0.5292 | 0.3162 | 0.0 | 0.1581 | | 0.529 | 6.0 | 90 | 0.5293 | 0.3162 | 0.0 | 0.1581 | | 0.528 | 7.0 | 105 | 0.5292 
| 0.3162 | 0.0 | 0.1581 | ### Framework versions - Transformers 4.26.0 - Pytorch 1.14.0a0+410ce96 - Datasets 2.9.0 - Tokenizers 0.13.2
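The `Combined Score` column above appears to be the arithmetic mean of accuracy and F1 (an assumption inferred from the numbers; with F1 = 0.0 it reduces to half the accuracy, consistent with a model that collapsed to predicting a single class):

```python
def combined_score(accuracy, f1):
    """Arithmetic mean of accuracy and F1, matching the reported Combined Score."""
    return (accuracy + f1) / 2

round(combined_score(0.3162, 0.0), 4)  # 0.1581, as in the table above
```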
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert_add_GLUE_Experiment_logit_kd_mrpc_384 This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the GLUE MRPC dataset. It achieves the following results on the evaluation set: - Loss: 0.5290 - Accuracy: 0.3162 - F1: 0.0 - Combined Score: 0.1581 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 256 - eval_batch_size: 256 - seed: 10 - distributed_type: multi-GPU - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score | |:-------------:|:-----:|:----:|:---------------:|:--------:|:---:|:--------------:| | 0.5334 | 1.0 | 15 | 0.5306 | 0.3162 | 0.0 | 0.1581 | | 0.5311 | 2.0 | 30 | 0.5290 | 0.3162 | 0.0 | 0.1581 | | 0.5295 | 3.0 | 45 | 0.5295 | 0.3162 | 0.0 | 0.1581 | | 0.5295 | 4.0 | 60 | 0.5292 | 0.3162 | 0.0 | 0.1581 | | 0.5283 | 5.0 | 75 | 0.5292 | 0.3162 | 0.0 | 0.1581 | | 0.529 | 6.0 | 90 | 0.5293 | 0.3162 | 0.0 | 0.1581 | | 0.528 | 7.0 | 105 | 0.5292 | 0.3162 | 0.0 | 0.1581 | ### Framework versions - Transformers 4.26.0 - Pytorch 1.14.0a0+410ce96 - Datasets 2.9.0 - Tokenizers 0.13.2
{"datasets": ["glue"], "language": ["en"], "license": "apache-2.0", "metrics": ["accuracy", "f1"], "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert_add_GLUE_Experiment_logit_kd_mrpc_384", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE MRPC", "type": "glue", "config": "mrpc", "split": "validation", "args": "mrpc"}, "metrics": [{"type": "accuracy", "value": 0.3161764705882353, "name": "Accuracy"}, {"type": "f1", "value": 0.0, "name": "F1"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,954
MohamedZaitoon/bart-fine-tune
MohamedZaitoon
summarization
[ "transformers", "pytorch", "jax", "bart", "text2text-generation", "summarization", "dataset:CNN/Daily-mail", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04Z
2021-06-13T17:27:59+00:00
124
1
--- datasets: - CNN/Daily-mail metrics: - ROUGE tags: - summarization ---
null
Non_BioNLP
{"datasets": ["CNN/Daily-mail"], "metrics": ["ROUGE"], "tags": ["summarization"]}
task
[ "SUMMARIZATION" ]
42,955
juanpablomesa/bge-small-bioasq-1epoch-batch32-100steps
juanpablomesa
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:4012", "loss:MultipleNegativesRankingLoss", "en", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:BAAI/bge-small-en-v1.5", "base_model:finetune:BAAI/bge-small-en-v1.5", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
2024-07-02T18:56:05Z
2024-07-02T18:56:09+00:00
59
0
--- base_model: BAAI/bge-small-en-v1.5 datasets: [] language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:4012 - loss:MultipleNegativesRankingLoss widget: - source_sentence: 'Extensive messenger RNA editing generates transcript and protein diversity in genes involved in neural excitability, as previously described, as well as in genes participating in a broad range of other cellular functions. ' sentences: - Do cephalopods use RNA editing less frequently than other species? - GV1001 vaccine targets which enzyme? - Which event results in the acetylation of S6K1? - source_sentence: Yes, exposure to household furry pets influences the gut microbiota of infants. sentences: - Can pets affect infant microbiomed? - What is the mode of action of Thiazovivin? - What are the effects of CAMK4 inhibition? - source_sentence: "In children with heart failure evidence of the effect of enalapril\ \ is empirical. Enalapril was clinically safe and effective in 50% to 80% of for\ \ children with cardiac failure secondary to congenital heart malformations before\ \ and after cardiac surgery, impaired ventricular function , valvar regurgitation,\ \ congestive cardiomyopathy, , arterial hypertension, life-threatening arrhythmias\ \ coexisting with circulatory insufficiency. \nACE inhibitors have shown a transient\ \ beneficial effect on heart failure due to anticancer drugs and possibly a beneficial\ \ effect in muscular dystrophy-associated cardiomyopathy, which deserves further\ \ studies." 
sentences: - Which receptors can be evaluated with the [18F]altanserin? - In what proportion of children with heart failure has Enalapril been shown to be safe and effective? - Which major signaling pathways are regulated by RIP1? - source_sentence: Cellular senescence-associated heterochromatic foci (SAHFS) are a novel type of chromatin condensation involving alterations of linker histone H1 and linker DNA-binding proteins. SAHFS can be formed by a variety of cell types, but their mechanism of action remains unclear. sentences: - What is the relationship between the X chromosome and a neutrophil drumstick? - Which microRNAs are involved in exercise adaptation? - How are SAHFS created? - source_sentence: Multicluster Pcdh diversity is required for mouse olfactory neural circuit assembly. The vertebrate clustered protocadherin (Pcdh) cell surface proteins are encoded by three closely linked gene clusters (Pcdhα, Pcdhβ, and Pcdhγ). Although deletion of individual Pcdh clusters had subtle phenotypic consequences, the loss of all three clusters (tricluster deletion) led to a severe axonal arborization defect and loss of self-avoidance. sentences: - What are the effects of the deletion of all three Pcdh clusters (tricluster deletion) in mice? - what is the role of MEF-2 in cardiomyocyte differentiation? - How many periods of regulatory innovation led to the evolution of vertebrates? 
model-index: - name: BGE small finetuned BIOASQ results: - task: type: information-retrieval name: Information Retrieval dataset: name: BAAI/bge small en v1.5 type: BAAI/bge-small-en-v1.5 metrics: - type: cosine_accuracy@1 value: 0.8345120226308345 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.9207920792079208 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.942008486562942 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9547383309759547 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.8345120226308345 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.3069306930693069 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.18840169731258838 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09547383309759547 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.8345120226308345 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.9207920792079208 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.942008486562942 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9547383309759547 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.9001912196285257 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.8821973013627894 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.8832658504735496 name: Cosine Map@100 --- # BGE small finetuned BIOASQ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) <!-- at revision 5c38ec7c405ec4b44b94cc5a9bb96e735b38267a --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 384 tokens - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("juanpablomesa/bge-small-bioasq-1epoch-batch32-100steps") # Run inference sentences = [ 'Multicluster Pcdh diversity is required for mouse olfactory neural circuit assembly. The vertebrate clustered protocadherin (Pcdh) cell surface proteins are encoded by three closely linked gene clusters (Pcdhα, Pcdhβ, and Pcdhγ). 
Although deletion of individual Pcdh clusters had subtle phenotypic consequences, the loss of all three clusters (tricluster deletion) led to a severe axonal arborization defect and loss of self-avoidance.', 'What are the effects of the deletion of all three Pcdh clusters (tricluster deletion) in mice?', 'How many periods of regulatory innovation led to the evolution of vertebrates?', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Dataset: `BAAI/bge-small-en-v1.5` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.8345 | | cosine_accuracy@3 | 0.9208 | | cosine_accuracy@5 | 0.942 | | cosine_accuracy@10 | 0.9547 | | cosine_precision@1 | 0.8345 | | cosine_precision@3 | 0.3069 | | cosine_precision@5 | 0.1884 | | cosine_precision@10 | 0.0955 | | cosine_recall@1 | 0.8345 | | cosine_recall@3 | 0.9208 | | cosine_recall@5 | 0.942 | | cosine_recall@10 | 0.9547 | | cosine_ndcg@10 | 0.9002 | | cosine_mrr@10 | 0.8822 | | **cosine_map@100** | **0.8833** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? 
You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 4,012 training samples * Columns: <code>positive</code> and <code>anchor</code> * Approximate statistics based on the first 1000 samples: | | positive | anchor | |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 3 tokens</li><li>mean: 63.38 tokens</li><li>max: 485 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 16.13 tokens</li><li>max: 49 tokens</li></ul> | * Samples: | positive | anchor | |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------| | <code>Aberrant patterns of H3K4, H3K9, and H3K27 histone lysine methylation were shown to result in histone code alterations, which induce changes in gene expression, and affect the proliferation rate of cells in medulloblastoma.</code> | <code>What is the implication of histone lysine methylation in medulloblastoma?</code> | | <code>STAG1/STAG2 proteins are tumour suppressor proteins that suppress cell proliferation and are essential for differentiation.</code> | <code>What is the role of STAG1/STAG2 proteins in differentiation?</code> | | <code>The association between cell phone use and incident glioblastoma remains unclear. 
Some studies have reported that cell phone use was associated with incident glioblastoma, and with reduced survival of patients diagnosed with glioblastoma. However, other studies have repeatedly failed to replicate an association between cell phone use and glioblastoma.</code> | <code>What is the association between cell phone use and glioblastoma?</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 1 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 1 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - 
`half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - 
`include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | BAAI/bge-small-en-v1.5_cosine_map@100 | |:------:|:----:|:-------------:|:-------------------------------------:| | 0.7937 | 100 | 0.2124 | 0.8833 | ### Framework Versions - Python: 3.11.5 - Sentence Transformers: 3.0.1 - Transformers: 4.41.2 - PyTorch: 2.1.2+cu121 - Accelerate: 0.31.0 - Datasets: 2.19.1 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
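The card above trains with MultipleNegativesRankingLoss using `scale: 20.0` and cosine similarity. As a rough illustration of what that objective computes — a from-scratch numpy sketch, not the sentence-transformers implementation — each anchor is scored against every positive in the batch, the other rows act as in-batch negatives, and the loss is cross-entropy over the scaled cosine-similarity matrix:

```python
import numpy as np

def multiple_negatives_ranking_loss(anchors, positives, scale=20.0):
    """Toy re-implementation of the in-batch-negatives objective.

    anchors, positives: (batch, dim) arrays; row i of `positives` is the
    correct match for row i of `anchors`, every other row is a negative.
    """
    # Cosine similarity = dot product of L2-normalized rows.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)  # (batch, batch) similarity matrix
    # Cross-entropy where the "label" for row i is column i.
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Perfectly matched, mutually orthogonal pairs -> loss close to zero.
emb = np.eye(4)
print(multiple_negatives_ranking_loss(emb, emb))  # very close to 0
```

The `scale` factor (20 here) acts as an inverse softmax temperature: it sharpens the distribution so small cosine gaps between the positive and the negatives still yield a strong gradient.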
null
BioNLP
# BGE small finetuned BIOASQ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) <!-- at revision 5c38ec7c405ec4b44b94cc5a9bb96e735b38267a --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 384 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. 
```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("juanpablomesa/bge-small-bioasq-1epoch-batch32-100steps") # Run inference sentences = [ 'Multicluster Pcdh diversity is required for mouse olfactory neural circuit assembly. The vertebrate clustered protocadherin (Pcdh) cell surface proteins are encoded by three closely linked gene clusters (Pcdhα, Pcdhβ, and Pcdhγ). Although deletion of individual Pcdh clusters had subtle phenotypic consequences, the loss of all three clusters (tricluster deletion) led to a severe axonal arborization defect and loss of self-avoidance.', 'What are the effects of the deletion of all three Pcdh clusters (tricluster deletion) in mice?', 'How many periods of regulatory innovation led to the evolution of vertebrates?', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. 
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Dataset: `BAAI/bge-small-en-v1.5` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.8345 | | cosine_accuracy@3 | 0.9208 | | cosine_accuracy@5 | 0.942 | | cosine_accuracy@10 | 0.9547 | | cosine_precision@1 | 0.8345 | | cosine_precision@3 | 0.3069 | | cosine_precision@5 | 0.1884 | | cosine_precision@10 | 0.0955 | | cosine_recall@1 | 0.8345 | | cosine_recall@3 | 0.9208 | | cosine_recall@5 | 0.942 | | cosine_recall@10 | 0.9547 | | cosine_ndcg@10 | 0.9002 | | cosine_mrr@10 | 0.8822 | | **cosine_map@100** | **0.8833** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 4,012 training samples * Columns: <code>positive</code> and <code>anchor</code> * Approximate statistics based on the first 1000 samples: | | positive | anchor | |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 3 tokens</li><li>mean: 63.38 tokens</li><li>max: 485 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 16.13 tokens</li><li>max: 49 tokens</li></ul> | * Samples: | positive | anchor | |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------| | <code>Aberrant patterns of H3K4, H3K9, and H3K27 histone lysine methylation were shown to result in histone code alterations, which induce changes in gene expression, and affect the proliferation rate of cells in medulloblastoma.</code> | <code>What is the implication of histone lysine methylation in medulloblastoma?</code> | | <code>STAG1/STAG2 proteins are tumour suppressor proteins that suppress cell proliferation and are essential for differentiation.</code> | <code>What is the role of STAG1/STAG2 proteins in differentiation?</code> | | <code>The association between cell phone use and incident glioblastoma remains unclear. Some studies have reported that cell phone use was associated with incident glioblastoma, and with reduced survival of patients diagnosed with glioblastoma. 
However, other studies have repeatedly failed to replicate an association between cell phone use and glioblastoma.</code> | <code>What is the association between cell phone use and glioblastoma?</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 1 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 1 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - 
`tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `batch_sampler`: no_duplicates - 
`multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | BAAI/bge-small-en-v1.5_cosine_map@100 | |:------:|:----:|:-------------:|:-------------------------------------:| | 0.7937 | 100 | 0.2124 | 0.8833 | ### Framework Versions - Python: 3.11.5 - Sentence Transformers: 3.0.1 - Transformers: 4.41.2 - PyTorch: 2.1.2+cu121 - Accelerate: 0.31.0 - Datasets: 2.19.1 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
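The Accuracy@k and MRR@10 figures in the evaluation table above come from sentence-transformers' `InformationRetrievalEvaluator` run on the held-out BioASQ questions. For readers unfamiliar with those metrics, here is a minimal from-scratch sketch over toy rankings (illustrative only — the document IDs and scores are invented, not the actual evaluation data):

```python
def accuracy_at_k(ranked_ids, relevant_id, k):
    """1.0 if the relevant document appears in the top-k results, else 0.0."""
    return 1.0 if relevant_id in ranked_ids[:k] else 0.0

def reciprocal_rank(ranked_ids, relevant_id, k=10):
    """1/rank of the first relevant hit within the top-k, else 0.0."""
    for rank, doc_id in enumerate(ranked_ids[:k], start=1):
        if doc_id == relevant_id:
            return 1.0 / rank
    return 0.0

# Toy corpus: each query has one relevant document; lists are ordered by
# descending cosine similarity (precomputed here for illustration).
rankings = {
    "q1": (["d3", "d7", "d1"], "d3"),  # relevant doc ranked 1st
    "q2": (["d2", "d5", "d9"], "d5"),  # relevant doc ranked 2nd
    "q3": (["d4", "d6", "d8"], "d8"),  # relevant doc ranked 3rd
}
acc1 = sum(accuracy_at_k(r, rel, 1) for r, rel in rankings.values()) / len(rankings)
mrr = sum(reciprocal_rank(r, rel) for r, rel in rankings.values()) / len(rankings)
print(acc1, mrr)  # accuracy@1 = 1/3, MRR@10 = (1 + 1/2 + 1/3) / 3
```

Because each BioASQ query has exactly one gold passage, recall@k equals accuracy@k in the table above, which is why the two rows match.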
{"base_model": "BAAI/bge-small-en-v1.5", "datasets": [], "language": ["en"], "library_name": "sentence-transformers", "license": "apache-2.0", "metrics": ["cosine_accuracy@1", "cosine_accuracy@3", "cosine_accuracy@5", "cosine_accuracy@10", "cosine_precision@1", "cosine_precision@3", "cosine_precision@5", "cosine_precision@10", "cosine_recall@1", "cosine_recall@3", "cosine_recall@5", "cosine_recall@10", "cosine_ndcg@10", "cosine_mrr@10", "cosine_map@100"], "pipeline_tag": "sentence-similarity", "tags": ["sentence-transformers", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:4012", "loss:MultipleNegativesRankingLoss"], "widget": [{"source_sentence": "Extensive messenger RNA editing generates transcript and protein diversity in genes involved in neural excitability, as previously described, as well as in genes participating in a broad range of other cellular functions. ", "sentences": ["Do cephalopods use RNA editing less frequently than other species?", "GV1001 vaccine targets which enzyme?", "Which event results in the acetylation of S6K1?"]}, {"source_sentence": "Yes, exposure to household furry pets influences the gut microbiota of infants.", "sentences": ["Can pets affect infant microbiomed?", "What is the mode of action of Thiazovivin?", "What are the effects of CAMK4 inhibition?"]}, {"source_sentence": "In children with heart failure evidence of the effect of enalapril is empirical. Enalapril was clinically safe and effective in 50% to 80% of for children with cardiac failure secondary to congenital heart malformations before and after cardiac surgery, impaired ventricular function , valvar regurgitation, congestive cardiomyopathy, , arterial hypertension, life-threatening arrhythmias coexisting with circulatory insufficiency. 
\nACE inhibitors have shown a transient beneficial effect on heart failure due to anticancer drugs and possibly a beneficial effect in muscular dystrophy-associated cardiomyopathy, which deserves further studies.", "sentences": ["Which receptors can be evaluated with the [18F]altanserin?", "In what proportion of children with heart failure has Enalapril been shown to be safe and effective?", "Which major signaling pathways are regulated by RIP1?"]}, {"source_sentence": "Cellular senescence-associated heterochromatic foci (SAHFS) are a novel type of chromatin condensation involving alterations of linker histone H1 and linker DNA-binding proteins. SAHFS can be formed by a variety of cell types, but their mechanism of action remains unclear.", "sentences": ["What is the relationship between the X chromosome and a neutrophil drumstick?", "Which microRNAs are involved in exercise adaptation?", "How are SAHFS created?"]}, {"source_sentence": "Multicluster Pcdh diversity is required for mouse olfactory neural circuit assembly. The vertebrate clustered protocadherin (Pcdh) cell surface proteins are encoded by three closely linked gene clusters (Pcdhα, Pcdhβ, and Pcdhγ). 
Although deletion of individual Pcdh clusters had subtle phenotypic consequences, the loss of all three clusters (tricluster deletion) led to a severe axonal arborization defect and loss of self-avoidance.", "sentences": ["What are the effects of the deletion of all three Pcdh clusters (tricluster deletion) in mice?", "what is the role of MEF-2 in cardiomyocyte differentiation?", "How many periods of regulatory innovation led to the evolution of vertebrates?"]}], "model-index": [{"name": "BGE small finetuned BIOASQ", "results": [{"task": {"type": "information-retrieval", "name": "Information Retrieval"}, "dataset": {"name": "BAAI/bge small en v1.5", "type": "BAAI/bge-small-en-v1.5"}, "metrics": [{"type": "cosine_accuracy@1", "value": 0.8345120226308345, "name": "Cosine Accuracy@1"}, {"type": "cosine_accuracy@3", "value": 0.9207920792079208, "name": "Cosine Accuracy@3"}, {"type": "cosine_accuracy@5", "value": 0.942008486562942, "name": "Cosine Accuracy@5"}, {"type": "cosine_accuracy@10", "value": 0.9547383309759547, "name": "Cosine Accuracy@10"}, {"type": "cosine_precision@1", "value": 0.8345120226308345, "name": "Cosine Precision@1"}, {"type": "cosine_precision@3", "value": 0.3069306930693069, "name": "Cosine Precision@3"}, {"type": "cosine_precision@5", "value": 0.18840169731258838, "name": "Cosine Precision@5"}, {"type": "cosine_precision@10", "value": 0.09547383309759547, "name": "Cosine Precision@10"}, {"type": "cosine_recall@1", "value": 0.8345120226308345, "name": "Cosine Recall@1"}, {"type": "cosine_recall@3", "value": 0.9207920792079208, "name": "Cosine Recall@3"}, {"type": "cosine_recall@5", "value": 0.942008486562942, "name": "Cosine Recall@5"}, {"type": "cosine_recall@10", "value": 0.9547383309759547, "name": "Cosine Recall@10"}, {"type": "cosine_ndcg@10", "value": 0.9001912196285257, "name": "Cosine Ndcg@10"}, {"type": "cosine_mrr@10", "value": 0.8821973013627894, "name": "Cosine Mrr@10"}, {"type": "cosine_map@100", "value": 0.8832658504735496, "name": 
"Cosine Map@100"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,956
TransferGraph/connectivity_bert_ft_qqp-96-finetuned-lora-ag_news
TransferGraph
text-classification
[ "peft", "safetensors", "parquet", "text-classification", "dataset:ag_news", "base_model:connectivity/bert_ft_qqp-96", "base_model:adapter:connectivity/bert_ft_qqp-96", "model-index", "region:us" ]
2024-02-28T00:08:16Z
2024-02-28T00:08:18+00:00
1
0
--- base_model: connectivity/bert_ft_qqp-96 datasets: - ag_news library_name: peft metrics: - accuracy tags: - parquet - text-classification model-index: - name: connectivity_bert_ft_qqp-96-finetuned-lora-ag_news results: - task: type: text-classification name: Text Classification dataset: name: ag_news type: ag_news config: default split: test args: default metrics: - type: accuracy value: 0.9265789473684211 name: accuracy --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # connectivity_bert_ft_qqp-96-finetuned-lora-ag_news This model is a fine-tuned version of [connectivity/bert_ft_qqp-96](https://huggingface.co/connectivity/bert_ft_qqp-96) on the ag_news dataset. It achieves the following results on the evaluation set: - accuracy: 0.9266 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 24 - eval_batch_size: 24 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | accuracy | train_loss | epoch | |:--------:|:----------:|:-----:| | 0.25 | None | 0 | | 0.9178 | 0.3200 | 0 | | 0.9211 | 0.2290 | 1 | | 0.9274 | 0.2085 | 2 | | 0.9266 | 0.1977 | 3 | ### Framework versions - PEFT 0.8.2 - Transformers 4.37.2 - Pytorch 2.2.0 - Datasets 2.16.1 - Tokenizers 0.15.2
null
Non_BioNLP
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # connectivity_bert_ft_qqp-96-finetuned-lora-ag_news This model is a fine-tuned version of [connectivity/bert_ft_qqp-96](https://huggingface.co/connectivity/bert_ft_qqp-96) on the ag_news dataset. It achieves the following results on the evaluation set: - accuracy: 0.9266 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 24 - eval_batch_size: 24 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | accuracy | train_loss | epoch | |:--------:|:----------:|:-----:| | 0.25 | None | 0 | | 0.9178 | 0.3200 | 0 | | 0.9211 | 0.2290 | 1 | | 0.9274 | 0.2085 | 2 | | 0.9266 | 0.1977 | 3 | ### Framework versions - PEFT 0.8.2 - Transformers 4.37.2 - Pytorch 2.2.0 - Datasets 2.16.1 - Tokenizers 0.15.2
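The adapter above was trained with PEFT using LoRA. As background — a numerical sketch of the general LoRA idea, not of PEFT's internals, with dimensions and `alpha` chosen purely for illustration — instead of updating the full weight matrix `W`, LoRA learns a low-rank update `B @ A`, so the effective weight is `W + (alpha / r) * B @ A` and only `r * (d_in + d_out)` extra parameters are stored per adapted matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 768, 768, 8       # r is the LoRA rank (r << d)

W = rng.normal(size=(d_out, d_in)) # frozen pretrained weight
B = np.zeros((d_out, r))           # LoRA factors; B starts at zero so the
A = rng.normal(size=(r, d_in))     # update is a no-op before training

def lora_forward(x, alpha=16.0):
    """y = x @ (W + (alpha/r) * B @ A).T, computed without materializing B @ A."""
    return x @ W.T + (alpha / r) * (x @ A.T) @ B.T

full_params = W.size
lora_params = A.size + B.size
print(lora_params / full_params)   # ~2% of the full matrix's parameters
```

This is why the adapter checkpoint is tiny compared with the base BERT model: only `A` and `B` are trained and saved, while `W` stays frozen in the base checkpoint.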
{"base_model": "connectivity/bert_ft_qqp-96", "datasets": ["ag_news"], "library_name": "peft", "metrics": ["accuracy"], "tags": ["parquet", "text-classification"], "model-index": [{"name": "connectivity_bert_ft_qqp-96-finetuned-lora-ag_news", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "ag_news", "type": "ag_news", "config": "default", "split": "test", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.9265789473684211, "name": "accuracy"}]}]}]}
task
[ "TEXT_CLASSIFICATION" ]
42,957