AI Engineering · React · Gemini 2.5 Pro
The LLM Prompt Optimizer is a React + Vite SPA that ingests draft prompts, expands implicit requirements, and ships structured instructions tailored to Gemini, Claude, ChatGPT, or Llama. It prioritises determinism, anti-hallucination guardrails, and local-first history so portfolios stay trustworthy.
const result = await optimizePrompt({
  prompt: "Outline an AI compliance briefing",
  provider: targetLLM.CLAUDE,
  variables: {
    audience: "Risk Committee",
    region: "EU",
    length: "400 words",
  },
});
// ✅ Returns
// · XML sections (<context>, <task>, <constraints>)
// · Negative instructions
// · Success metrics + QA checklist
Overview
The application runs entirely in the browser, stores settings locally, and keeps every optimisation reproducible.
Content, code, SQL and marketing templates surface variables so teams can fill context without editing raw text.
Searchable history entries can be favorited and capture the timestamp, target LLM, and optimized output.
Switch between Gemini SDK and any OpenAI-compatible endpoint through a modal—no rebuilds required.
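The provider switch described above can be sketched as a small config type read at request time; the names below (`ProviderConfig`, `resolveTransport`) are illustrative assumptions, not the app's actual API.

```typescript
// Hypothetical sketch: a locally stored config decides whether a request
// goes through the Gemini SDK or a generic OpenAI-compatible fetch call.
type ProviderKind = "gemini" | "openai-compatible";

interface ProviderConfig {
  kind: ProviderKind;
  apiKey: string;
  baseUrl?: string; // only needed for OpenAI-compatible endpoints
}

// Because the config is read per request (from localStorage in the app),
// switching providers needs no rebuild.
function resolveTransport(config: ProviderConfig): string {
  return config.kind === "gemini"
    ? "gemini-sdk"
    : `fetch:${config.baseUrl ?? "https://api.openai.com/v1"}`;
}
```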
Capabilities
Each model family receives a tailored system prompt emphasising structure, tone, and guardrails.
Copy buttons cover both the draft and optimized prompt fields, with visible copied-state feedback for accessibility.
Loading states, progress bars, and error surfaces keep the single-page flow predictable.
API keys and base URLs stay in localStorage and never leave the browser, so static GitHub Pages deployments expose no server-side secrets.
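The per-model tailoring works by pairing each target family with its own system prompt. A minimal sketch, assuming a plain lookup table (the prompt strings and the `systemPromptFor` name are placeholders, not the app's real values):

```typescript
// Illustrative mapping from model family to a tailored system prompt.
// The strings below are invented examples of structure/tone/guardrail hints.
const SYSTEM_PROMPTS: Record<string, string> = {
  gemini: "Use XML sections. Prefer explicit constraints over free prose.",
  claude: "Wrap background in <context> tags; state negative instructions.",
  chatgpt: "Use markdown headings; end with a QA checklist.",
  llama: "Keep instructions short and imperative; avoid deep nesting.",
};

function systemPromptFor(target: string): string {
  const prompt = SYSTEM_PROMPTS[target.toLowerCase()];
  if (!prompt) throw new Error(`Unknown target LLM: ${target}`);
  return prompt;
}
```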
Architecture
App.tsx orchestrates state, variable parsing, template selection, and history tabs.
services/geminiService.ts detects provider, composes system prompts, and handles fetch or SDK calls.
useSettings.ts persists provider configs and temperature while guarding against invalid payloads.
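The guard against invalid payloads in `useSettings.ts` could look like the following sketch: parse the persisted JSON, validate the fields, and fall back to defaults on anything malformed. Field names and defaults here are assumptions.

```typescript
// Hypothetical shape of the persisted settings payload.
interface Settings {
  provider: string;
  temperature: number;
}

const DEFAULTS: Settings = { provider: "gemini", temperature: 0.2 };

// Validate whatever came out of localStorage; reject malformed or
// corrupted payloads instead of crashing the app on startup.
function parseSettings(raw: string | null): Settings {
  if (!raw) return DEFAULTS;
  try {
    const parsed = JSON.parse(raw);
    const temperature = Number(parsed.temperature);
    if (typeof parsed.provider !== "string" || !Number.isFinite(temperature)) {
      return DEFAULTS;
    }
    return { provider: parsed.provider, temperature };
  } catch {
    return DEFAULTS; // corrupted JSON falls back to defaults
  }
}
```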
Workflow
Choose a template or paste a custom brief; placeholders instantly appear as inputs.
Set provider, API key, base URL, and temperature via the modal; values persist locally.
Anti-hallucination instructions and formatting constraints accompany every request.
Copy, favorite, or rehydrate any history item for the next iteration.
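The first step of the workflow, turning placeholders into input fields, amounts to scanning the template for variable markers. A sketch under the assumption that placeholders use `{{name}}` syntax (the delimiter choice is a guess, not confirmed by the source):

```typescript
// Extract unique placeholder names from a template so each can be
// rendered as a dedicated input field. Assumes {{name}} delimiters.
function extractPlaceholders(template: string): string[] {
  const seen = new Set<string>();
  for (const match of template.matchAll(/\{\{\s*([\w.-]+)\s*\}\}/g)) {
    seen.add(match[1]);
  }
  return [...seen]; // duplicates collapse to a single input
}
```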
Contact
Lisbon · Remote-friendly