
Phi-4 Mini

Text Generation

Phi-4 Mini 3.8B - Microsoft's compact reasoning LLM with a 128K-token context window

Integration

main.rs
use xybrid_sdk::{Xybrid, Envelope};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load the LLM by its model ID
    let model = Xybrid::model("phi4-mini").load()?;

    // Run text generation on a plain-text prompt
    let result = model.run(&Envelope::text("Explain quantum computing in simple terms."))?;

    // `text` is an Option; handle the empty case instead of unwrapping
    match result.text {
        Some(text) => println!("{}", text),
        None => eprintln!("model returned no text"),
    }
    Ok(())
}

Details

Task: Text Generation
Family: Microsoft
Parameters: 3.8B
Format: GGUF
Quantization: q4_k_m
Size: 8.1 GB
Model ID: phi4-mini