Merak 7B V3 Mini Orca Indo by asyafiqe


Tags: Arxiv:2307.09288 · Autotrain compatible · Dataset: asyafiqe/orca mini v1 ... · En · Id · Llama · Pytorch · Region: us · Sharded

Merak 7B V3 Mini Orca Indo Benchmarks

Scores show how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Merak 7B V3 Mini Orca Indo (asyafiqe/Merak-7B-v3-Mini-Orca-Indo)

Merak 7B V3 Mini Orca Indo Parameters and Internals

Model Type: fine-tuned
Use Cases:
Areas: chat applications, assistive AI
Applications: power conservation suggestions, interactive question answering
Limitations: may produce inaccurate, biased, or otherwise objectionable responses; the risks of use are not entirely predictable
Supported Languages: en (English), id (Bahasa Indonesia)
Training Details:
Data Sources: psmathur/orca_mini_v1_dataset (translated to Bahasa Indonesia)
Methodology: instruction fine-tuning with LoRA, DeepSpeed ZeRO-2, and FlashAttention (see the sketch below)
Context Length: 4096
Training Time: 6 hours
Hardware Used: 2 x RTX 3090 (24 GB) GPUs
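
As a rough illustration of the methodology above, here is a minimal LoRA setup in the Hugging Face PEFT style. The base checkpoint, rank, alpha, and target modules are assumptions made for the sketch, not values reported for this model; DeepSpeed ZeRO-2 and FlashAttention would be enabled through the training configuration rather than in this snippet.

```python
# Minimal LoRA fine-tuning setup (sketch). Hyperparameters and the base
# checkpoint are illustrative assumptions, not this model's actual recipe.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = "meta-llama/Llama-2-7b-hf"  # assumed Llama-2 base for Merak v3
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.float16)

lora = LoraConfig(
    r=16,                                 # assumed adapter rank
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # assumed target projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # only the adapter weights train

# DeepSpeed ZeRO-2 is typically wired in via the Trainer, e.g.:
# TrainingArguments(..., deepspeed="ds_config_zero2.json")
```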
Responsible AI Considerations:
Mitigation Strategies: developers should perform safety testing and tuning tailored to their specific applications.
Input Output:
Input Format: [Vicuna 1.1](https://github.com/oobabooga/text-generation-webui/blob/main/instruction-templates/Vicuna-v1.1.yaml) format (example below)
Accepted Modalities: text
Output Format: text responses
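
For concreteness, a prompt in the Vicuna 1.1 format linked above can be built like this; the system message is the stock Vicuna one, an assumption, since the card does not reproduce the exact string.

```python
# Vicuna 1.1-style prompt builder (sketch). The system message is the
# standard Vicuna one, which is an assumption here.
def build_prompt(user_message: str) -> str:
    system = (
        "A chat between a curious user and an artificial intelligence "
        "assistant. The assistant gives helpful, detailed, and polite "
        "answers to the user's questions."
    )
    return f"{system} USER: {user_message} ASSISTANT:"

# Example in Bahasa Indonesia: "How can I save electricity at home?"
print(build_prompt("Bagaimana cara menghemat listrik di rumah?"))
```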
LLM Name: Merak 7B V3 Mini Orca Indo
Repository 🤗: https://huggingface.co/asyafiqe/Merak-7B-v3-Mini-Orca-Indo
Model Size: 7b
Required VRAM: 13.6 GB
Updated: 2025-08-18
Maintainer: asyafiqe
Model Type: llama
Model Files: 4.0 GB (1-of-4), 4.0 GB (2-of-4), 4.0 GB (3-of-4), 1.6 GB (4-of-4)
Supported Languages: en id
Model Architecture: LlamaForCausalLM
License: cc-by-nc-sa-4.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.32.0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
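
Given the metadata above (LlamaForCausalLM, float16 weights totaling 13.6 GB), loading with transformers might look like the following sketch; the generation settings are illustrative, not recommendations from the card.

```python
# Loading sketch based on the metadata above. Expect roughly 13.6 GB of
# GPU memory for the float16 weights; `device_map` requires accelerate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "asyafiqe/Merak-7B-v3-Mini-Orca-Indo"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo, torch_dtype=torch.float16, device_map="auto"
)

# Vicuna 1.1-style prompt (see the Input Output section above).
prompt = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's "
    "questions. USER: Apa ibu kota Indonesia? ASSISTANT:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)  # illustrative settings
print(tokenizer.decode(output[0], skip_special_tokens=True))
```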

Quantized Models of the Merak 7B V3 Mini Orca Indo

Model | Likes | Downloads | VRAM
...ak 7B V3 Mini Orca Indo Gptq | 20 | 7 | 3 GB
...erak 7B V3 Mini Orca Indo GPTQ | 1 | 6 | 3 GB
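
To run one of the quantized variants above instead of the fp16 weights, something like the following should work; the full repo id is an assumption expanded from the truncated table entry, and optimum plus auto-gptq must be installed alongside transformers.

```python
# Sketch: loading a GPTQ quantization. The repo id is an assumption
# expanded from the truncated table entry above; needs `optimum` and
# `auto-gptq` installed for transformers to handle the quantized weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "TheBloke/Merak-7B-v3-Mini-Orca-Indo-GPTQ"  # assumed full name
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")
```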

Best Alternatives to Merak 7B V3 Mini Orca Indo

Best Alternatives | Context / RAM | Downloads | Likes
A6 L | 1024K / 16.1 GB | 201 | 0
A3.4 | 1024K / 16.1 GB | 13 | 0
A5.4 | 1024K / 16.1 GB | 12 | 0
M | 1024K / 16.1 GB | 127 | 0
A2.4 | 1024K / 16.1 GB | 12 | 0
157 | 1024K / 16.1 GB | 101 | 0
124 | 1024K / 16.1 GB | 93 | 0
162 | 1024K / 16.1 GB | 60 | 0
2 Very Sci Fi | 1024K / 16.1 GB | 317 | 0
118 | 1024K / 16.1 GB | 15 | 0
Note: green Score (e.g. "73.2") means that the model is better than asyafiqe/Merak-7B-v3-Mini-Orca-Indo.

Rank the Merak 7B V3 Mini Orca Indo Capabilities

🆘 Have you tried this model? Rate its performance. This feedback will help the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124