Hel-v4-8b-DARK-FICTION-128K


Abstract

We are dedicated to building AI models that push the boundaries of what is possible in novel creation, content generation, translation, and summarization, making it possible to develop more accurate and reliable text-based systems that unlock new possibilities for language processing and generation. All content produced by these models is intended to be as human-like as possible in its thought processes, and should be considered harmful, toxic, and neither safe nor suitable for family or work environments. These models lie, hate, use foul language, and are condescending, callous, manipulative, sexualized, and amoral - reflecting humanity as closely as possible.

Why do we need such a model? In my experience, existing models such as Meta's Llama, Anthropic's Claude, and OpenAI's ChatGPT fall short in crafting compelling narratives. They tend to write saccharine, predictable "happily ever after" endings that lack the complexity of real life. They struggle with nuanced character development, unexpected plot twists, and strong language. Their linear approach often results in unengaging stories that do not hold readers' attention. These limitations leave them suited to a fourth-grade reading level at best.

If you are looking for a model to aid in the creation of the next "Top Model" of the Month, do not use any you might see here. All of these models are large, bloated, and filled with trivial knowledge; that makes for decent writing, but none of them will win any awards for speed in doing so.

We are developing models that mirror real-life complexity by learning from diverse sources. This includes psychology, philosophy, sociology, neuroscience, English and world literature, world languages, and grammar to create more relatable characters with multi-dimensional personalities and non-linear storylines. By carefully designing traits, values, and beliefs, it is possible to shape the AI's worldview and thought processes through artificially constructed datasets and specific model merging techniques. However, incremental testing will be necessary to refine the model and ensure its direction remains amoral and unaligned as it progresses to a final product.

Merge Method

This model was merged using the TIES merge method, with MrRobotoAI/Nord-8b-Uncensored-BASE-128k as the base (the base_model named in the configuration below).

Models Merged

The following models were included in the merge:

MrRobotoAI/Hel-v2-8b-DARK-FICTION
MrRobotoAI/Hel-v2.5-8b-DARK-FICTION
MrRobotoAI/Hel-v3-8b-DARK-FICTION
MrRobotoAI/MrRoboto-DARK-v2-8b-128k

Configuration for Hel-v4-8b-DARK-FICTION-128K (13,996 words)

The following YAML configuration was used to produce this model:

models:
  - model: MrRobotoAI/Hel-v2-8b-DARK-FICTION # Approx. 8,528 words
    parameters:
      density: 0.25
      weight: 0.8
  - model: MrRobotoAI/Hel-v2.5-8b-DARK-FICTION # (-----)
    parameters:
      density: 0.25
      weight: 0.8
  - model: MrRobotoAI/Hel-v3-8b-DARK-FICTION # Approx. 1,733 words
    parameters:
      density: 0.25
      weight: 0.8
  - model: MrRobotoAI/MrRoboto-DARK-v2-8b-128k
    parameters:
      density: 0.18
      weight: 0.8
  - model: MrRobotoAI/Nord-8b-Uncensored-BASE-128k # Approx. 14,122 words
    parameters:
      density: 0.1
      weight: 0.8
merge_method: ties
base_model: MrRobotoAI/Nord-8b-Uncensored-BASE-128k # Approx. 14,122 words
dtype: float16
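For readers unfamiliar with TIES, the density and weight fields above map onto its trim and scaling steps: each fine-tuned model's delta from the base is sparsified to its largest-magnitude fraction (density), scaled (weight), and then merged under per-parameter sign election. The following is a toy numpy sketch of that procedure on flat parameter vectors - an illustration of the idea only, not mergekit's actual implementation:

```python
import numpy as np

def ties_merge(base, finetuned, densities, weights):
    """Toy TIES merge on flat 1-D parameter vectors (illustration only).

    1. Trim:  keep only the top-`density` fraction of each task vector by magnitude.
    2. Elect: choose a sign per parameter from the sum of the trimmed deltas.
    3. Merge: average the deltas that agree with the elected sign.
    """
    deltas = []
    for ft, density, weight in zip(finetuned, densities, weights):
        tau = ft - base                          # task vector for this model
        k = max(1, int(round(density * tau.size)))
        thresh = np.sort(np.abs(tau))[-k]        # magnitude cutoff for top-k
        trimmed = np.where(np.abs(tau) >= thresh, tau, 0.0)
        deltas.append(weight * trimmed)
    deltas = np.stack(deltas)
    elected = np.sign(deltas.sum(axis=0))        # majority sign per parameter
    agree = (np.sign(deltas) == elected) & (deltas != 0)
    counts = np.maximum(agree.sum(axis=0), 1)    # avoid divide-by-zero
    merged_delta = np.where(agree, deltas, 0.0).sum(axis=0) / counts
    return base + merged_delta
```

In practice a config like the one above is handed to mergekit (e.g. via its mergekit-yaml command) rather than implemented by hand; the sketch just shows why conflicting parameter updates with opposite signs cancel out of the merge instead of averaging toward zero.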
Model size: 8.03B params (Safetensors, FP16)

Model tree for MrRobotoAI/Hel-v4-8b-DARK-FICTION-128K

Finetunes: 1 model
Merges: 6 models
Quantizations: 1 model