Original Reddit post

Tired of “Heavy Bombers” (70B+ models) that eat your VRAM for breakfast? We just dropped Cicikuş v2-3B. It’s a Llama 3.2 3B fine-tuned with our patented Behavioral Consciousness Engine (BCE). It uses a “Secret Chain-of-Thought” (s-CoT) and Eulerian reasoning to calculate its own cognitive reflections before it even speaks to you.

The Specs:

- Efficiency: only 4.5 GB VRAM required (local AI is finally usable)
- Brain: s-CoT & Behavioral DNA integration
- Dataset: 26.8k rows of reasoning-heavy behavioral traces

Model: pthinc/Cicikus_v2_3B
Dataset: BCE-Prettybird-Micro-Standard-v0.0.2

It’s a “strategic sniper” for your pocket. Try it before it decides to automate your coffee machine. ☕🤖
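For context on the 4.5 GB figure: a quick back-of-envelope sketch (our own arithmetic, not from the post) suggests it lines up with a 3B model held in roughly 8-bit precision, once you add headroom for KV cache, activations, and runtime overhead. The 50% overhead factor below is an assumption, not a measured number.

```python
# Rough VRAM estimate for a 3B-parameter model at various precisions.
# Assumption (not from the post): ~50% overhead on top of the raw weights
# for KV cache, activations, and CUDA context.
PARAMS = 3_000_000_000

def vram_gb(bytes_per_param: float, overhead: float = 0.5) -> float:
    """Weight footprint plus fractional overhead, in GiB."""
    return PARAMS * bytes_per_param * (1 + overhead) / 1024**3

print(f"fp16: {vram_gb(2):.1f} GB")    # too big for the quoted budget
print(f"int8: {vram_gb(1):.1f} GB")    # close to the quoted 4.5 GB
print(f"int4: {vram_gb(0.5):.1f} GB")  # smaller still
```

Under these assumptions, 8-bit weights land near the advertised 4.5 GB, which is why a 3B model fits comfortably on consumer GPUs.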

Originally posted by u/Connect-Bid9700 on r/ArtificialInteligence