Arkor is in alpha, so this page is intentionally sparse. Items are grouped by what state they’re in: actively being built, scoped and waiting their turn, or under consideration. We don’t commit to dates yet.
In progress
Nothing actively in development on the public surface this cycle. Current effort is on stabilizing what’s already shipped (CLI, Studio, scaffolder).
Up next
Auth0 token auto-refresh
Silent refresh on expiry, so long-running sessions stop getting interrupted by re-login.
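The roadmap item doesn’t specify the mechanics, but silent refresh typically means exchanging a stored refresh token for a new access token shortly before expiry, using Auth0’s standard refresh-token grant (`POST /oauth/token`). A minimal sketch, assuming a hypothetical tenant domain and client ID (Arkor’s actual values and token storage are not described here):

```python
import json
import time
import urllib.request

AUTH0_DOMAIN = "example.us.auth0.com"  # hypothetical tenant, not Arkor's

def is_expired(expires_at: float, leeway: float = 60.0) -> bool:
    """True once the access token is within `leeway` seconds of expiry,
    so we refresh slightly early instead of failing mid-request."""
    return time.time() >= expires_at - leeway

def refresh_access_token(client_id: str, refresh_token: str) -> dict:
    """Exchange a refresh token for a new access token using Auth0's
    documented refresh-token grant. Returns the parsed token response."""
    payload = json.dumps({
        "grant_type": "refresh_token",
        "client_id": client_id,
        "refresh_token": refresh_token,
    }).encode()
    req = urllib.request.Request(
        f"https://{AUTH0_DOMAIN}/oauth/token",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A long-running session would call `is_expired` before each API request and trigger `refresh_access_token` transparently, which is what removes the re-login interruption.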
Bring your own dataset (JSONL)
Upload a local JSONL file as the training dataset, alongside the existing Hugging Face dataset-name and blob-URL paths.
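JSONL here means one JSON object per line. A hedged sketch of a loader that validates such a file before upload (the `prompt`/`completion` field names are illustrative, not Arkor’s required schema):

```python
import json

def load_jsonl(path: str) -> list[dict]:
    """Parse a JSONL dataset: one JSON object per line, blank lines
    skipped, with a line number reported on the first malformed row."""
    rows = []
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            line = line.strip()
            if not line:
                continue
            try:
                rows.append(json.loads(line))
            except json.JSONDecodeError as exc:
                raise ValueError(f"line {lineno}: not valid JSON ({exc})")
    return rows
```

Validating locally like this catches truncated or hand-edited lines before a training job is queued.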
Train on a local GPU
Run training on your own GPU instead of routing every job through Arkor’s managed GPUs.
Dry-run from Studio
Surface the existing dry-run option in the Studio UI for fast smoke tests before kicking off a full training run.
Backlog
Self-hosted training backend
Run the training backend on your own infrastructure, with a documented ARKOR_CLOUD_API_URL knob and versioned API guarantees.
Deploy and eval slots
Grow createArkor into an umbrella for shipping and evaluating models, not only training.
More base models
Expand support beyond Gemma to additional open-weight model families.
Download trained models
Export a trained model as a file you can run on your own machine or deployment target, instead of staying on Arkor’s managed inference.
Synthetic data from a seed set
Generate training data from a small seed set, for cases where you don’t already have a labeled dataset.
Distillation templates
Templates that pair compatible teacher and student models so distillation runs work out of the box.
On-device model templates
Templates aimed at small models suitable for WebGPU and mobile targets.