sometimesanotion
AI & ML interests

Agentic LLM services, model merging, finetunes, distillation

Recent Activity

liked a model 23 minutes ago
MaziyarPanahi/Lamarck-14B-v0.7-Fusion-GGUF

Organizations

Hugging Face Discord Community

sometimesanotion's activity

replied to jjokah's post 6 minutes ago

Right-sizing language models is something I'm really here for. I find that a 1.5B-parameter model fronting simple questions from a backing RAG source, which a larger model gradually works on, is more scalable. Classic information sources and stores can be QA'd, and they don't have such huge energy footprints.

AI will work out better if we give humans, classic code, SLMs, and frontier LLMs the roles they're right-sized for, and ensure data privacy and individual dignity at every stage of the contract.
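The tiered setup described above can be sketched roughly as follows. This is a minimal illustration, not anything from the linked article: all names (`TieredRouter`, `rag_lookup`, `slm_answer`) are hypothetical, and the complexity check is a placeholder heuristic where a real system might use a trained classifier.

```python
# Hypothetical sketch: a small model fronts simple questions against a
# classic, auditable RAG store, while harder queries are queued for a
# larger model to work on gradually.

from dataclasses import dataclass, field
from typing import Callable, Optional


@dataclass
class TieredRouter:
    rag_lookup: Callable[[str], str]       # classic information store (can be QA'd)
    slm_answer: Callable[[str, str], str]  # e.g. a 1.5B-parameter model
    deep_queue: list = field(default_factory=list)  # deferred work for a frontier LLM

    def is_simple(self, question: str) -> bool:
        # Placeholder heuristic; swap in a real complexity classifier.
        return len(question.split()) < 12 and question.endswith("?")

    def handle(self, question: str) -> Optional[str]:
        if self.is_simple(question):
            context = self.rag_lookup(question)
            return self.slm_answer(question, context)
        # Too complex for the SLM: defer to the larger model's queue.
        self.deep_queue.append(question)
        return None
```

The point of the split is that the cheap path stays on classic code plus an SLM, and only queries that genuinely need it consume frontier-model compute.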

reacted to jjokah's post with 👍 11 minutes ago
The past few years have been a blast for artificial intelligence, with large language models (LLMs) stunning everyone with their capabilities and powering everything from chatbots to code assistants. However, not all applications demand the massive size and complexity of LLMs; the computational power they require makes them impractical for many use cases. This is why Small Language Models (SLMs) entered the scene, making powerful AI models more accessible by shrinking them in size.

In this article, we go through what SLMs are, how they are made small, their benefits and limitations, real-world use cases, and how they can be used on mobile and desktop devices.
https://huggingface.co/blog/jjokah/small-language-model
New activity in sometimesanotion/Lamarck-14B-v0.7 2 days ago

Excellent model!

#3 opened 18 days ago by nixudos