AI designed for you, only for you.

Run powerful language models locally. Keep your conversations private. Switch between models without reinstalling.

See Arbiter in Action

Screenshots: a poetry conversation in the chat interface, model selection in Settings, a technical conversation with the keyboard visible, and code generation (a merge sort implementation).

Creative AI Conversations

Engage in creative writing, poetry, and artistic projects with AI models running locally on your device.

Key Features

Flexible Model Management

Download and switch between on-device AI models like LLaMA and Whisper. Add new models without reinstalling the app.
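
For the technically curious, here is a rough Swift sketch of the idea behind runtime model management; ModelDescriptor and ModelManager are illustrative names invented for this example, not Arbiter's actual code.

```swift
import Foundation

// Hypothetical sketch of runtime model management; not Arbiter's real API.
struct ModelDescriptor {
    let id: String          // e.g. "llama-3-8b-q4" or "whisper-small"
    let displayName: String
}

final class ModelManager {
    private(set) var installed: [ModelDescriptor] = []
    private(set) var active: ModelDescriptor?

    private var modelsDirectory: URL {
        FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    }

    // Writing downloaded weights into the app's own container means new
    // models can be added at any time, without shipping a new build.
    func install(_ model: ModelDescriptor, weights: Data) throws {
        let destination = modelsDirectory.appendingPathComponent("\(model.id).bin")
        try weights.write(to: destination)
        installed.append(model)
    }

    // Switching models is just a matter of changing which file gets loaded.
    func activate(_ model: ModelDescriptor) {
        active = model
    }
}
```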

Private, Persistent Conversations

All chats are stored locally for full privacy and ownership. Conversations are searchable and exportable.

Hybrid Inference Engine

Run AI entirely on-device by default for speed and privacy, with the option to switch to cloud APIs for more power.
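
To make the on-device-first routing concrete, here is a minimal Swift sketch; the names (InferenceBackend, LocalModelBackend, CloudBackend, HybridEngine) are placeholders invented for illustration, not Arbiter's actual implementation.

```swift
import Foundation

// Hypothetical sketch of on-device-first routing; names are illustrative only.
protocol InferenceBackend {
    func generate(prompt: String) async throws -> String
}

// Stand-in for the local model runtime that ships with the app.
struct LocalModelBackend: InferenceBackend {
    func generate(prompt: String) async throws -> String {
        "on-device response to: \(prompt)"
    }
}

// Only reached when the user explicitly opts in to a cloud API.
struct CloudBackend: InferenceBackend {
    func generate(prompt: String) async throws -> String {
        "cloud response to: \(prompt)"
    }
}

struct HybridEngine {
    var useCloud = false    // off by default: prompts never leave the device
    let local: any InferenceBackend = LocalModelBackend()
    let cloud: any InferenceBackend = CloudBackend()

    func respond(to prompt: String) async throws -> String {
        try await (useCloud ? cloud : local).generate(prompt: prompt)
    }
}
```

Because the local backend is the default, a prompt only touches the network when the cloud option is switched on deliberately.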

Real-Time Search Augmentation

Integrate fresh web results directly into your conversations for up-to-the-minute insights and updates.
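
Under the hood, retrieval augmentation generally means folding fresh results into the prompt before the model answers. The Swift sketch below is illustrative only; SearchResult and augmentedPrompt are made-up names, not Arbiter's actual code.

```swift
// Illustrative only: how fresh web results could be folded into a prompt.
struct SearchResult {
    let title: String
    let snippet: String
}

func augmentedPrompt(question: String, results: [SearchResult]) -> String {
    let context = results
        .map { "- \($0.title): \($0.snippet)" }
        .joined(separator: "\n")
    return """
    Use the web results below when answering.

    Web results:
    \(context)

    Question: \(question)
    """
}
```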

Multimodal Input & Accessibility

Speak to Arbiter with voice-to-text, and soon listen with text-to-speech—ideal for accessibility and multitasking.

Open Source Foundation

Built on open standards and protocols. Contribute to the future of private AI or build your own extensions.

Our Mission

Arbiter is redefining the future of AI chat. We believe in empowering individuals and organizations with tools that offer choice, control, and clarity. By running AI locally and giving users ownership of their data, Arbiter puts privacy and power back into your hands.

Team

Jordan Stone

Founder (CEO/CTO)

Yousef Abu Salah

Chief Design Officer (CDO)

Harpreet Singh

Chief Financial Officer (CFO)

Smith Patel

Chief Operating Officer (COO)

FAQ

Is my data private?

Yes. Everything is processed and stored on your device. Your chats and files never leave your phone unless you explicitly choose to share them. We do not store, log, or sell your data. Your information stays 100% under your control.

Do I need an internet connection to use the app?

No. Once installed, the app works entirely offline. You can use it on airplanes, in remote areas, or without Wi-Fi, while still getting fast and responsive answers.

How does offline AI work?

The app uses optimized large language models (LLMs) that run locally on your device's processor. These models are downloaded ahead of time and need no server access, so your requests are processed entirely on your device rather than being sent to the cloud.
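
As a rough Swift sketch of what "downloaded ahead of time" means in practice (LocalLLM is a hypothetical stand-in for the on-device runtime; the real implementation may differ):

```swift
import Foundation

// LocalLLM is a placeholder for whatever on-device inference runtime is used.
struct LocalLLM {
    init(modelURL: URL) {
        // In a real runtime, the model weights would be loaded or memory-mapped here.
    }
    func complete(_ prompt: String) -> String {
        "on-device completion for: \(prompt)"
    }
}

// Loads a previously downloaded model from the app's local storage.
// No network request is made; if the file is missing, the user is asked
// to download the model first.
func loadModel(named name: String) throws -> LocalLLM {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let modelURL = documents.appendingPathComponent("\(name).bin")
    guard FileManager.default.fileExists(atPath: modelURL.path) else {
        throw CocoaError(.fileNoSuchFile)
    }
    return LocalLLM(modelURL: modelURL)
}
```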

What devices are supported?

Currently, the app supports modern iPhones and iPads running iOS 16 or later, with the best performance on devices with an A14 Bionic chip or newer. Most models require a device with more than 4 GB of RAM.

What AI models are available?

You can choose from a range of open-source LLMs. The app supports switching models at any time so you can balance speed, accuracy, and device performance.