
Use OpenAI & Compatible APIs

Connect Maid to OpenAI's cloud models — including GPT-4o and o3 — using your own API key. The same provider also works with any OpenAI-compatible endpoint, including LM Studio, OpenRouter, and vLLM.

Overview

Maid's OpenAI provider sends requests to the OpenAI REST API using the standard chat completions endpoint. Because the base URL is configurable, it also works with any third-party service that implements the same API spec — which many do, including local inference servers like LM Studio and cloud aggregators like OpenRouter.

Unlike the local llama.cpp provider, this option requires an internet connection, and you are billed for token usage at OpenAI's current rates. Your API key is stored in encrypted local storage on your device and is only ever sent to the endpoint you configure.
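Under the hood, the provider issues standard chat completions requests. The sketch below illustrates the shape of such a request using only the Python standard library; the endpoint and payload fields come from the OpenAI API spec, while the helper function itself is an illustration, not Maid's actual code:

```python
import json
import urllib.request

def build_chat_request(api_key: str,
                       model: str,
                       messages: list,
                       base_url: str = "https://api.openai.com/v1") -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completions request."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    api_key="sk-...",  # placeholder key
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(req.full_url)  # https://api.openai.com/v1/chat/completions
```

Swapping `base_url` for a third-party endpoint is all it takes to target a compatible service, which is exactly what Maid's Custom Base URL field does.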

Get an OpenAI API key

1. Create an account at platform.openai.com.
2. Navigate to API Keys in your account dashboard.
3. Click Create new secret key, give it a name, and copy the key. Store it somewhere safe, because OpenAI only shows it once.

Configure Maid

1. Open Settings in Maid and select OpenAI from the API dropdown.
2. Paste your API key into the API Key field.
3. Maid automatically fetches the list of models available on your account and populates the Model dropdown. Select a model: gpt-4o-mini is a cost-effective choice for most use cases, while gpt-4o offers the best quality.
4. Optionally, set a Custom Base URL if you are using a compatible third-party endpoint (see below).
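Step 3 works because the OpenAI API exposes a model-listing endpoint (`GET /v1/models`) that returns the models your key can access. A minimal sketch of the request and response handling (the request is built but not sent here, and the sample body is illustrative):

```python
import json
import urllib.request

def build_models_request(api_key: str,
                         base_url: str = "https://api.openai.com/v1") -> urllib.request.Request:
    """Build a GET /v1/models request listing the models available to a key."""
    return urllib.request.Request(
        url=f"{base_url}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# The response is JSON with a "data" list; each entry's "id" is a model
# name that can populate a dropdown. Illustrative sample body:
sample_body = '{"object": "list", "data": [{"id": "gpt-4o"}, {"id": "gpt-4o-mini"}]}'
model_ids = [m["id"] for m in json.loads(sample_body)["data"]]
print(model_ids)  # ['gpt-4o', 'gpt-4o-mini']
```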

Configuration reference

| Field | Required | Default |
| --- | --- | --- |
| API Key | Yes | (none) |
| Model | Yes | Auto-populated from your account |
| Base URL | No | https://api.openai.com/v1 |
| Custom Headers | No | (none) |
| Parameters | No | Provider defaults |

Using compatible APIs

Because the base URL is configurable, the OpenAI provider in Maid works with any service that implements the OpenAI chat completions specification. This includes local inference servers — meaning you can run a model on your desktop and connect to it from your phone without Ollama.

| Service | Base URL | API Key |
| --- | --- | --- |
| LM Studio | http://<host>:1234/v1 | Not required |
| OpenRouter | https://openrouter.ai/api/v1 | Your OpenRouter key |
| vLLM | http://<host>:8000/v1 | Optional |
When using OpenRouter, you may also need to add an HTTP-Referer custom header with your site URL. This can be added via the Headers panel in Maid's Settings.
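As an illustration, a chat completions request aimed at OpenRouter differs from the OpenAI default only in the base URL, the key, and the extra HTTP-Referer header. The sketch below mirrors what filling in Maid's Settings fields accomplishes (the model ID and referer URL are placeholder assumptions):

```python
import json
import urllib.request

OPENROUTER_BASE = "https://openrouter.ai/api/v1"

def build_openrouter_request(api_key: str, model: str, messages: list,
                             referer: str) -> urllib.request.Request:
    """Same chat completions shape as OpenAI, pointed at OpenRouter."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{OPENROUTER_BASE}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
            # Extra header some OpenRouter setups expect; in Maid this
            # corresponds to an entry in the Headers panel.
            "HTTP-Referer": referer,
        },
        method="POST",
    )

req = build_openrouter_request(
    api_key="sk-or-...",  # placeholder OpenRouter key
    model="openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hi"}],
    referer="https://example.com",
)
```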