API Reference
Annotations
@AIService
Marks a Java interface as an AI Service managed by Contextica.
Key attributes: name, provider, model.
```java
@AIService(name = "DocumentService", provider = LLMProvider.OPENAI, model = LLMModel.GPT_4O_MINI)
public interface DocumentService { ... }
```
| Property | Description |
|---|---|
| name | The name of the AIService |
| provider | The name of the LLM provider. Currently only OpenAI is supported. |
| model | The name of the LLM Model to be used |
@AIFunction
Marks an interface method as an AI function.
```java
@AIFunction(description = "Translates a piece of text into a different language")
String translate(@ContextParameter(name = "input") String text,
                 @ContextParameter(name = "targetLanguage") String language);
```
| Property | Description |
|---|---|
| description | Describes what the function does. This description is never included in the prompt. |
| @ContextParameter | Binds a method parameter to a named context parameter. Only used by certain task types. |
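Putting the two annotations together, a complete service interface might look like the following sketch. The summarize method and its description are illustrative additions, not part of the documented API:

```java
// Hypothetical service interface combining @AIService and @AIFunction.
@AIService(name = "DocumentService", provider = LLMProvider.OPENAI, model = LLMModel.GPT_4O_MINI)
public interface DocumentService {

    // Declared in the reference above.
    @AIFunction(description = "Translates a piece of text into a different language")
    String translate(@ContextParameter(name = "input") String text,
                     @ContextParameter(name = "targetLanguage") String language);

    // Illustrative second function (assumed, not from the reference).
    @AIFunction(description = "Summarizes a document into a short abstract")
    String summarize(@ContextParameter(name = "input") String document);
}
```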
@AIServiceConfiguration
Associates a configuration class with an @AIService interface.
```java
@AIServiceConfiguration(client = DocumentService.class)
class DocumentServiceConfiguration { ... }
```
@AIFunctionConfiguration
Declares a method returning FunctionConfiguration for a specific @AIFunction.
```java
@AIFunctionConfiguration
public FunctionConfiguration summarize() { ... }
```
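A configuration class can tie the two annotations together. This sketch assumes a summarize AI function exists on the service interface and that the configuration method is matched to it by name; both are assumptions for illustration, not documented behavior:

```java
// Hypothetical configuration class for DocumentService.
@AIServiceConfiguration(client = DocumentService.class)
class DocumentServiceConfiguration {

    // Assumed: configures an AI function of the same name on the service.
    @AIFunctionConfiguration
    public FunctionConfiguration summarize() {
        return FunctionConfiguration.builder()
                .description("Summarize the input document into a short abstract")
                .taskType(TaskType.SUMMARIZE)
                .build();
    }
}
```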
Core Classes
Contextica
The entry point for instantiating an @AIService interface at runtime.
```java
DocumentService documentService = Contextica.createService(DocumentService.class);
```
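Once created, the service is invoked like any plain Java object. The translate call below reuses the signature shown earlier; the argument values are illustrative:

```java
// Contextica supplies the implementation of the annotated interface at runtime.
DocumentService documentService = Contextica.createService(DocumentService.class);

// Invoke an AI function as an ordinary method call (illustrative values).
String german = documentService.translate("Hello, world!", "German");
```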
FunctionConfiguration
This provides a builder-driven configuration object per AI function.
Key builder methods:
- description(String): Description of the instruction. This is passed to the LLM for prompt generation and optimization.
- taskType(TaskType): The type of task this @AIFunction performs. Determines which prompt generation strategy is used.
- examples(...): Training-set examples to support context optimization. Crucial for achieving optimal performance in real-world use cases.
- instructions(...): Additional instructions passed to the LLM for generating prompts.
```java
return FunctionConfiguration.builder()
        .description("Generate a basic financial research report on the company with the specified ticker symbol")
        .taskType(TaskType.GENERATE)
        .instructions(Instructions.builder()
                .rules(List.of(
                        "Be precise and concise.",
                        "Cite sources inline and include a References section."))
                .restrictions(List.of(
                        "Do NOT fabricate data.",
                        "Do NOT provide personal financial advice."))
                .structure(OutputStructure.builder()
                        .addSection("Executive Summary", "One paragraph thesis...")
                        .addSection("Company Overview", "Business model, revenue streams...")
                        .addSection("Valuation", "DCF, comps, sensitivities...")
                        .build())
                .format(OutputFormat.MARKDOWN)
                .readingLevel(ReadingLevel.PROFESSIONAL)
                .roleContext("Act as a senior equity research analyst.")
                .targetAudience("Audience is institutional investors.")
                .build())
        .build();
```
Enums
TaskType
The list of possible task types that Contextica currently supports.
- GENERATE
- SUMMARIZE
- EXTRACT
- CLASSIFY
- TRANSFORM
- TRANSLATE
- CHAT
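Since the task type selects the prompt generation strategy, the value passed to taskType(...) should match the function's intent. Illustrative pairings, assuming the builder API shown above:

```java
// Illustrative: choose the TaskType that matches each function's intent.
FunctionConfiguration translateConfig = FunctionConfiguration.builder()
        .description("Translate text into the target language")
        .taskType(TaskType.TRANSLATE)   // translation strategy
        .build();

FunctionConfiguration classifyConfig = FunctionConfiguration.builder()
        .description("Classify a support ticket by urgency")
        .taskType(TaskType.CLASSIFY)    // classification strategy
        .build();
```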
Providers and Models
- LLMProvider: Allows a user to specify which LLM provider to use. Currently only OpenAI is supported.
- LLMModel: Allows a user to specify the model that will be used at runtime and during optimization.