DeepSeek R1 MFANN TIES Unretrained 7B by netcat420


DeepSeek R1 MFANN TIES Unretrained 7B is an open-source language model by netcat420. Key details: 7B parameters, 15.2 GB VRAM required, 4K context, code generation, merged model, LLM Explorer Score 0.19.
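The 15.2 GB VRAM figure is consistent with storing the weights in float16. A minimal back-of-the-envelope check, assuming roughly 7.6B parameters (the approximate size of the Qwen2.5-7B family, not a number stated on this page):

```python
# Rough VRAM estimate for a ~7B model stored in float16 (2 bytes per parameter).
# The 7.6e9 parameter count is an assumption, not a value from this page.
params = 7.6e9          # approximate parameter count
bytes_per_param = 2     # float16
weights_gb = params * bytes_per_param / 1e9
print(f"~{weights_gb:.1f} GB for the weights alone")  # ~15.2 GB, before KV cache and overhead
```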

Merged Model · arXiv:2306.01708 · AutoTrain compatible · Base model: deepseek-ai/DeepSeek-R1-Distill-Qwen-7B · Base model: netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained · Base model: Qwen/Qwen2.5-Math-7B · Codegen · Conversational · Endpoints compatible · Qwen2 · Region: us · Safetensors · Sharded · Tensorflow
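The arXiv reference above (2306.01708) is the TIES-merging paper, and the tags mark this as a merge of the three listed base models. A minimal sketch of how such a merge is typically configured with the mergekit toolkit; the YAML schema is assumed, and the density/weight values and file names are illustrative guesses, not the maintainer's actual recipe:

```python
# Hypothetical mergekit-style TIES configuration for the base models listed on this page.
# Values are placeholders for illustration only.
import yaml

config = {
    "merge_method": "ties",
    "base_model": "Qwen/Qwen2.5-Math-7B",
    "models": [
        {"model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B",
         "parameters": {"density": 0.5, "weight": 0.5}},
        {"model": "netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained",
         "parameters": {"density": 0.5, "weight": 0.5}},
    ],
    "dtype": "float16",
}

with open("ties-merge.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

# The merge itself would then be produced with mergekit's CLI, e.g.:
#   mergekit-yaml ties-merge.yaml ./merged-model
```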

DeepSeek R1 MFANN TIES Unretrained 7B Benchmarks

(Benchmark chart comparing netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b against the reference models Anthropic Sonnet 3.5, GPT-4o, and GPT-4 is not reproduced here.)

DeepSeek R1 MFANN TIES Unretrained 7B Parameters and Internals

LLM Name: DeepSeek R1 MFANN TIES Unretrained 7B
Repository: https://huggingface.co/netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b
Base Model(s): Qwen/Qwen2.5-Math-7B, deepseek-ai/DeepSeek-R1-Distill-Qwen-7B, netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained
Merged Model: Yes
Model Size: 7B
Required VRAM: 15.2 GB
Updated: 2025-01-23
Maintainer: netcat420
Model Type: qwen2
Model Files: 5.0 GB (1-of-4), 4.9 GB (2-of-4), 5.0 GB (3-of-4), 0.3 GB (4-of-4)
Generates Code: Yes
Model Architecture: Qwen2ForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.46.2
Tokenizer Class: Qwen2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 152064
Torch Data Type: float16
Errors: replace
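Given the architecture, tokenizer class, dtype, and context length above, here is a minimal sketch of loading and prompting the repository with Hugging Face transformers; the prompt and generation settings are placeholders:

```python
# Minimal sketch of loading this repository with Hugging Face transformers,
# matching the card above (Qwen2ForCausalLM, Qwen2Tokenizer, float16, 4096 context).
# Roughly 15.2 GB of VRAM is needed for the float16 weights alone.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "Write a Python function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```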

Best Alternatives to DeepSeek R1 MFANN TIES Unretrained 7B

Best Alternatives                    | Context / RAM  | Downloads | Likes
SakalFusion 7B Beta                  | 986K / 15.2 GB | 6         | 0
Qwen2.5 7B Rebase                    | 986K / 15.2 GB | 73        | 2
Qwen2.5 7B Rebase                    | 986K / 15.2 GB | 7         | 2
...R1 Distill Qwen MFANN Slerp 7B    | 128K / 15.2 GB | 11        | 0
Qwen2.5 7B Coder Codeio Pp           | 128K / 15.2 GB | 5         | 5
Qwen2.5 7B CySecButler V0.1          | 128K / 15.2 GB | 3         | 3
CoT 2.5                              | 128K / 15.2 GB | 39        | 0
Mergekit Ties Uqhfast                | 128K / 15.2 GB | 26        | 0
CoT 2.5                              | 128K / 15.2 GB | 26        | 0
Mergekit Ties Uqhfast                | 128K / 15.2 GB | 13        | 0

Rank the DeepSeek R1 MFANN TIES Unretrained 7B Capabilities

Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.