SummarizeArticle generates an LLM summary with provider selection and fallback support.
SummarizeArticleRequest specifies parameters for LLM article summarization.
LLM provider: "ollama", "groq", or "openrouter".
Headlines to summarize (max 8 used).
Summarization mode: "brief", "analysis", "translate", "" (default).
Geographic signal context to include in the prompt.
Variant: "full", "tech", or target language for translate mode.
Output language code, default "en".
Successful response
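The request fields above can be sketched as a small builder. This is a hypothetical illustration only; the field names (provider, headlines, mode, geo, variant, lang) are assumptions based on the descriptions, not the actual proto schema.

```python
# Hypothetical sketch of assembling a SummarizeArticleRequest-like payload.
# Field names are assumptions; only the documented constraints (max 8
# headlines, default language "en") come from the source.
MAX_HEADLINES = 8  # only the first 8 headlines are used


def build_request(headlines, provider="ollama", mode="brief",
                  geo="", variant="full", lang="en"):
    """Assemble a request payload, truncating headlines to the documented cap."""
    return {
        "provider": provider,          # "ollama", "groq", or "openrouter"
        "headlines": headlines[:MAX_HEADLINES],
        "mode": mode,                  # "brief", "analysis", "translate", or ""
        "geo": geo,                    # geographic signal context for the prompt
        "variant": variant,            # "full", "tech", or target language
        "lang": lang,                  # output language code, default "en"
    }
```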
SummarizeArticleResponse contains the LLM summarization result.
The generated summary text.
Model identifier used for generation.
Provider that produced the result (or "cache").
Token count from the LLM response.
Whether the client should try the next provider in the fallback chain.
Error message if the request failed.
Error type/name (e.g. "TypeError").
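The fallback flag in the response drives a client-side retry loop: the client walks a provider chain and stops at the first response that does not ask for fallback. A minimal sketch, assuming the response is a dict and the field is named `should_fallback` (both assumptions; the real message types are not shown in the source):

```python
# Hypothetical fallback loop; the response shape and the "should_fallback"
# key are assumptions based on the field descriptions above.
FALLBACK_CHAIN = ["ollama", "groq", "openrouter"]


def summarize_with_fallback(call, headlines):
    """Try each provider in order; return the first non-fallback response.

    `call(provider, headlines)` is a hypothetical transport function that
    issues the SummarizeArticle request and returns the response as a dict.
    """
    last = None
    for provider in FALLBACK_CHAIN:
        last = call(provider, headlines)
        if not last.get("should_fallback"):
            return last  # success, cache hit, or a terminal error
    return last  # every provider asked for fallback; surface the final result
```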
SummarizeStatus indicates the outcome of a summarization request.
SUMMARIZE_STATUS_UNSPECIFIED
SUMMARIZE_STATUS_SUCCESS
SUMMARIZE_STATUS_CACHED
SUMMARIZE_STATUS_SKIPPED
SUMMARIZE_STATUS_ERROR
Human-readable detail for non-success statuses (skip reason, etc.).
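A client typically treats SUCCESS and CACHED as usable results and reads the detail field for the rest. A small sketch, assuming the enum values arrive as strings (the wire representation is an assumption):

```python
# Hypothetical status check; the enum value names mirror SummarizeStatus
# above, but treating them as plain strings is an assumption.
SUCCESS_STATUSES = {"SUMMARIZE_STATUS_SUCCESS", "SUMMARIZE_STATUS_CACHED"}


def is_usable(status):
    """A summary is usable on SUCCESS or CACHED; SKIPPED and ERROR carry a
    human-readable detail string instead of a summary."""
    return status in SUCCESS_STATUSES
```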