Lovable Prompt: Behest Chat App
Copy-paste this into a new Lovable project. It will produce a working chat app backed by Behest, with Supabase auth, streaming responses, per-user rate limits, and thread history.
Before you start
- Create a Behest project at https://behest.ai, copy the slug and API key.
- Connect Supabase to the Lovable project (Lovable Cloud).
- In Supabase → Settings → Edge Functions → Secrets: set
  BEHEST_KEY = behest_sk_live_... and BEHEST_BASE_URL = https://<slug>.behest.app
- In Behest dashboard → Project → Settings → Allowed Origins: add your Lovable app origin.
(No frontend env vars are needed. Lovable's project settings don't expose a generic env-var UI, and they don't have to: the token-mint Edge Function already holds BEHEST_BASE_URL as a secret and returns it in the token response.)
Paste this into Lovable
SECURITY NON-NEGOTIABLES (do not deviate)
- BEHEST_KEY is server-only. If code puts it under VITE_/NEXT_PUBLIC_/EXPO_PUBLIC_, the answer is wrong.
- user_id MUST come from the verified Supabase session inside the Edge Function (supabase.auth.getUser()). Never read it from the request body, query, or client headers.
- tier MUST be looked up from your billing system (profiles table, Stripe webhook, etc.). Do NOT hardcode it and do NOT accept it from the client.
- role is derived from the API key on the server. Never pass role: "admin" in the mint body; any admin behaviour requires an admin-roled API key.
- The token-mint Edge Function must set Access-Control-Allow-Origin to a specific origin from a CORS_ORIGIN secret. No wildcards on auth endpoints.
- The browser MUST NOT import @behest/client-ts. It fetches a minted JWT from the Edge Function and uses the OpenAI SDK directly.
- If local signing is used later: the private key (behest_pk_*) is server-only. The SDK refuses to load it in a browser — do not try to work around that.
Build a chat app with the following behavior:
AUTH
- Use Supabase auth (email + password).
- Signed-out users see a landing page with sign up / sign in.
- Signed-in users see the chat UI at /chat.
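The signed-in/signed-out routing above reduces to a single decision on the session. A minimal sketch, assuming a hypothetical `routeFor` helper you wire to your router's redirect logic:

```typescript
// Hypothetical helper: where should a visitor land, given their Supabase session?
type Session = { user: { id: string } } | null;

export function routeFor(session: Session, requestedPath: string): string {
  if (!session) return "/";                  // signed out → landing page with sign up / sign in
  if (requestedPath === "/") return "/chat"; // signed in → skip the landing page
  return requestedPath;                      // signed in → allow /chat and friends
}
```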
BEHEST INTEGRATION
- Create a Supabase Edge Function called `behest-token`:
- Verifies the caller's Supabase session via supabase.auth.getUser().
- Reads BEHEST_KEY and BEHEST_BASE_URL from function secrets.
- Looks up tier from the user's profile row (e.g. profiles.plan), defaulting to 1.
- Imports the v1.5 Behest SDK and mints:
```typescript
import { Behest } from "https://esm.sh/@behest/client-ts@1.5";

const behest = new Behest({
  key: Deno.env.get("BEHEST_KEY")!,
  baseUrl: Deno.env.get("BEHEST_BASE_URL")!,
});
const { token, ttl, sessionId, expiresAt } = await behest.auth.mint({
  user_id: user.id,
  tier,
  ttl: 900,
});
```
- Sets Access-Control-Allow-Origin from a CORS_ORIGIN function secret (no wildcard) and Access-Control-Allow-Credentials: true.
- Returns JSON: { token, ttl, sessionId, expiresAt, baseUrl }, where baseUrl = Deno.env.get("BEHEST_BASE_URL"). This lets the browser call Behest without its own copy of the URL — the Supabase secret is the single source of truth.
- Deploy the Edge Function.
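Putting the mint rules together, here is a sketch of the handler logic with its dependencies injected, so the security rules stay explicit and testable. `Deps`, `getTier`, and `mintHandler` are hypothetical names: in the deployed function they wrap `supabase.auth.getUser()`, the `profiles.plan` lookup, and `behest.auth.mint()` respectively.

```typescript
// Sketch of the behest-token handler with its dependencies injected.
type MintResult = { token: string; ttl: number; sessionId: string; expiresAt: number };

interface Deps {
  getUser(jwt: string): Promise<{ id: string } | null>;  // verified Supabase session
  getTier(userId: string): Promise<number | null>;       // billing lookup (profiles.plan)
  mint(args: { user_id: string; tier: number; ttl: number }): Promise<MintResult>;
  corsOrigin: string;                                    // CORS_ORIGIN secret
  baseUrl: string;                                       // BEHEST_BASE_URL secret
}

export async function mintHandler(authHeader: string | null, deps: Deps) {
  const headers = {
    "Access-Control-Allow-Origin": deps.corsOrigin,      // specific origin, never "*"
    "Access-Control-Allow-Credentials": "true",
    "Content-Type": "application/json",
  };
  // user_id comes only from the verified session, never the request body.
  const jwt = authHeader?.replace(/^Bearer\s+/i, "");
  const user = jwt ? await deps.getUser(jwt) : null;
  if (!user) return { status: 401, headers, body: { error: "unauthorized" } };

  // tier comes from billing data, never from the client; default to 1.
  const tier = (await deps.getTier(user.id)) ?? 1;
  const minted = await deps.mint({ user_id: user.id, tier, ttl: 900 });
  return { status: 200, headers, body: { ...minted, baseUrl: deps.baseUrl } };
}
```

The handler never reads `user_id`, `tier`, or `role` from the request, which is exactly what the non-negotiables above require.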
- In the browser, install `openai` (NOT @behest/client-ts) and create a helper:
```typescript
import OpenAI from "openai";
import { supabase } from "@/integrations/supabase/client"; // Lovable's generated Supabase client

type TokenBundle = { token: string; ttl: number; sessionId: string; expiresAt: number; baseUrl: string };

let cached: TokenBundle | null = null;

async function getBehestToken(): Promise<TokenBundle> {
  const now = Math.floor(Date.now() / 1000);
  // Reuse the cached token until ~60s before it expires.
  if (cached && cached.expiresAt - now > 60) return cached;
  const { data: { session } } = await supabase.auth.getSession();
  const r = await fetch(`${import.meta.env.VITE_SUPABASE_URL}/functions/v1/behest-token`, {
    method: "POST",
    headers: { Authorization: `Bearer ${session?.access_token}` },
  });
  if (!r.ok) throw new Error(`behest-token failed: ${r.status}`);
  cached = await r.json();
  return cached!;
}

export async function getOpenAI() {
  const { token, sessionId, baseUrl } = await getBehestToken();
  return new OpenAI({
    apiKey: token,              // the short-lived Behest JWT, not BEHEST_KEY
    baseURL: `${baseUrl}/v1`,
    dangerouslyAllowBrowser: true,
    defaultHeaders: { "X-Session-Id": sessionId },
  });
}
```
CHAT UI (/chat)
- Sidebar on the left lists the user's threads. Fetch them from a new Edge Function `behest-threads` that proxies GET /v1/threads on the server (or re-mint a token and fetch from the browser with Authorization: Bearer <token>).
- "New chat" button creates a new threadId (crypto.randomUUID) and clears the message list.
- Main pane shows messages for the active thread.
- Input at the bottom. On submit:
- Append user message locally.
- const openai = await getOpenAI();
- Call openai.chat.completions.create({ model, messages, stream: true }, { headers: { "X-Thread-Id": activeThreadId } }), where model is whatever your Behest project expects.
- Stream the assistant reply chunk-by-chunk into the UI.
- Support aborting with an AbortController wired to a Stop button.
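The submit steps above can be sketched as one handler. The streaming loop and AbortController wiring follow the OpenAI SDK's standard shape; `sendMessage`, `appendToUi`, and the model name are assumptions, and the client is typed structurally so the sketch stands alone:

```typescript
// Minimal structural type for the slice of the OpenAI client this sketch uses.
type ChatClient = {
  chat: { completions: { create: (body: any, opts: any) => Promise<AsyncIterable<any>> } };
};

// Pure helper: fold a streamed delta into the assistant message so far.
export function appendDelta(soFar: string, delta: string | undefined | null): string {
  return delta ? soFar + delta : soFar;
}

// Hypothetical submit handler; pass the client from getOpenAI().
export async function sendMessage(
  openai: ChatClient,
  messages: { role: "user" | "assistant"; content: string }[],
  activeThreadId: string,
  appendToUi: (partial: string) => void,
  controller: AbortController,        // wire controller.abort() to the Stop button
): Promise<string> {
  const stream = await openai.chat.completions.create(
    { model: "gpt-4o-mini", messages, stream: true }, // model is an assumption; your Behest project may route it
    { headers: { "X-Thread-Id": activeThreadId }, signal: controller.signal },
  );
  let assistant = "";
  for await (const chunk of stream) {
    assistant = appendDelta(assistant, chunk.choices[0]?.delta?.content);
    appendToUi(assistant);            // re-render with the partial reply (typewriter feel)
  }
  return assistant;
}
```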
ERROR HANDLING (browser — responses come back as OpenAI APIError or non-2xx fetch)
- On 402: show "You've hit your daily limit. Upgrade to Pro for more." (parse `error.details` from the response body for current tier and limits).
- On 429: read the Retry-After response header (seconds) and show "Slow down — try again in N seconds".
- On 401: clear the cached token, call supabase.auth.signOut(), route to sign-in.
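The three branches above can be centralized in one small mapper. `messageForError` is a hypothetical helper; the statuses and Retry-After handling follow the rules just listed:

```typescript
// Hypothetical mapper from an HTTP failure to what the chat UI should do.
type ErrorAction =
  | { kind: "toast"; message: string }
  | { kind: "signOut" };

export function messageForError(status: number, retryAfterHeader?: string | null): ErrorAction {
  if (status === 402) {
    return { kind: "toast", message: "You've hit your daily limit. Upgrade to Pro for more." };
  }
  if (status === 429) {
    const seconds = Number(retryAfterHeader) || 30; // fall back if the header is missing
    return { kind: "toast", message: `Slow down — try again in ${seconds} seconds` };
  }
  if (status === 401) return { kind: "signOut" }; // caller clears the cached token and calls supabase.auth.signOut()
  return { kind: "toast", message: `Request failed (${status})` };
}
```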
STYLING
- Use Tailwind + shadcn/ui.
- Clean minimal design. Sidebar 280px wide. Message bubbles distinct for user/assistant.
- Smooth streaming (typewriter feel). Show a "..." loader while waiting for the first token.
DO NOT
- Never put BEHEST_KEY in frontend code or env vars prefixed with VITE_.
- Never import @behest/client-ts in the browser.
- Never call Behest without a token from the Edge Function.
- Never send more than one user message per assistant response.
After Lovable finishes
Test the flow:
- Sign up → check Supabase Auth dashboard for the new user.
- Send a message → check Behest dashboard → Usage: should show one request under the signed-in user's id.
- Open a second browser with a different user → verify messages are isolated.
- Refresh → previous threads appear in the sidebar.
If a step fails, paste the browser console error into Lovable — it's trained on this pattern and will fix it.
See also
- Lovable + Supabase quickstart — full manual version of this
- Multi-conversation chat
- Error handling