Ice0.88 07.02 RP DPO Merged 16bit by icefog72


Tags: Merged Model, 16bit, AutoTrain compatible, Base model (finetune): icefog72/Ice0.88-07.02-RP, Conversational, English, Endpoints compatible, Mistral, PyTorch, Quantized, Region: US, Sharded, TRL, Unsloth

Ice0.88 07.02 RP DPO Merged 16bit Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Ice0.88 07.02 RP DPO Merged 16bit (icefog72/Ice0.88-07.02-RP-dpo-merged_16bit)

Ice0.88 07.02 RP DPO Merged 16bit Parameters and Internals

LLM Name: Ice0.88 07.02 RP DPO Merged 16bit
Repository: https://huggingface.co/icefog72/Ice0.88-07.02-RP-dpo-merged_16bit
Base Model(s): icefog72/Ice0.88-07.02-RP
Merged Model: Yes
Required VRAM: 14.4 GB
Updated: 2025-07-10
Maintainer: icefog72
Model Type: mistral
Model Files: 3 shards (1-of-3: 4.9 GB, 2-of-3: 5.0 GB, 3-of-3: 4.5 GB)
Supported Languages: en
Quantization Type: 16bit
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.48.2
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: float16
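As a rough sketch of how these settings translate into code, the snippet below loads the repository with the Hugging Face transformers library in float16, matching the architecture (MistralForCausalLM), dtype, and 32768-token context listed above. It is an illustrative example rather than part of the model card, and it assumes transformers, torch, and accelerate are installed and that roughly 14.4 GB of GPU VRAM is free; the prompt text is only a placeholder.

```python
# Minimal loading sketch for icefog72/Ice0.88-07.02-RP-dpo-merged_16bit (illustrative, not official).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "icefog72/Ice0.88-07.02-RP-dpo-merged_16bit"

# The tokenizer (LlamaTokenizer) is resolved automatically from the repo's tokenizer config.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# Load the merged 16-bit weights as listed above (MistralForCausalLM, torch.float16).
# device_map="auto" requires the accelerate package and places layers on available GPUs.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "Write a short in-character greeting for a tavern keeper."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# The model supports up to 32768 tokens of context; keep prompt plus generation below that.
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```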

Quantized Models of the Ice0.88 07.02 RP DPO Merged 16bit

Model | Likes | Downloads | VRAM
Ice0.88 07.02 RP DPO Merged 16bit | 30 | 5 | 14 GB

Best Alternatives to Ice0.88 07.02 RP DPO Merged 16bit

Best Alternatives | Context / RAM | Downloads | Likes
NemoMix Unleashed EXL2 4bpw | 1000K / 7.3 GB | 7 | 9
Devstral Small 2505 8bit | 128K / 25 GB | 346 | 2
...eZephir Sft Instruct Ead 16bit | 32K / 14.4 GB | 56 | 0
...cr To Json V1 HQQ 1bit Smashed | 32K / 1.6 GB | 6 | 0
...cr To Json V1 HQQ 4bit Smashed | 32K / 4.2 GB | 6 | 0
ScikitLLM Model EXL2 | 32K / 3 GB | 3 | 1
Chargen V2 8bpw EXL2 | 32K / 7.4 GB | 5 | 1
...tral PairRM DPO 8.0bpw H8 EXL2 | 32K / 7.4 GB | 4 | 2
HamSter 0.2 8.0bpw H8 EXL2 | 32K / 7.4 GB | 5 | 1
...N L1 Chat RL V1.6.0bpw H6 EXL2 | 32K / 5.6 GB | 13 | 1
Note: a green score (e.g. "73.2") indicates that the model performs better than icefog72/Ice0.88-07.02-RP-dpo-merged_16bit.

Rank the Ice0.88 07.02 RP DPO Merged 16bit Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124