
azale-ai/GotongRoyong-MixtralMoE-7Bx4-v1.0 (Text Generation)
GotongRoyong is a series of language models focused on Mixture of Experts (MoE), built by merging the following models using LazyMergekit and cg123/mergekit.
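
For context, mergekit's MoE mode is driven by a YAML config that names a shared base model and the expert models to combine, plus positive prompts used to initialize the router gates. The sketch below only illustrates that format under stated assumptions: the base model, expert names, and prompts are placeholders, not the actual recipe behind GotongRoyong.

```yaml
# Hypothetical mergekit-moe config (placeholder models, not the GotongRoyong recipe)
base_model: mistralai/Mistral-7B-Instruct-v0.2   # shared backbone (placeholder)
gate_mode: hidden      # route tokens using hidden-state representations of the prompts
dtype: bfloat16
experts:
  - source_model: placeholder/chat-expert-7b     # placeholder expert
    positive_prompts:
      - "write a friendly conversational reply"
  - source_model: placeholder/code-expert-7b     # placeholder expert
    positive_prompts:
      - "write a Python function"
```

With mergekit installed, a config like this is typically run with `mergekit-moe config.yaml ./merged-model`, which assembles a Mixtral-style MoE whose gates are seeded from the positive prompts.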