Getting Started

This section explains how to install Contextica, configure an LLM provider, and run your first optimization. By following these steps, you will have a working AI function with automatically generated and optimized prompts.


Prerequisites

Make sure you have the following before continuing:

  • Java 11 or higher
  • Maven or Gradle build system
  • LLM provider credentials (for example, an OpenAI API key)

Installation

Because Contextica is not an open-source project, you must first add the agensys Nexus repository URL to your build:

<repositories>
    <repository>
        <id>agensys-releases</id>
        <url>https://nexus.agensys.ai/repository/agensys-releases/</url>
        <releases>
            <enabled>true</enabled>
        </releases>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
    </repository>
</repositories>

Add the appropriate dependency to your project:

Community Edition

<dependencies>
    <dependency>
        <groupId>ai.agensys</groupId>
        <artifactId>contextica-community</artifactId>
        <version>1.0.0</version>
    </dependency>
</dependencies>

Enterprise Edition

<dependencies>
    <dependency>
        <groupId>ai.agensys</groupId>
        <artifactId>contextica-enterprise</artifactId>
        <version>1.0.0</version>
    </dependency>
</dependencies>
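If you build with Gradle instead of Maven, a roughly equivalent declaration is sketched below (Gradle Kotlin DSL, shown for the Community Edition; swap the artifact for contextica-enterprise if needed). Note that the rest of this guide uses the Maven tooling for context generation.

repositories {
    // agensys Nexus repository, same URL as the Maven setup above
    maven {
        url = uri("https://nexus.agensys.ai/repository/agensys-releases/")
    }
}

dependencies {
    implementation("ai.agensys:contextica-community:1.0.0")
}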

The Enterprise Edition requires a license. Every developer who runs the context generation process needs a valid license. To activate it:

First, configure your license key. You can set this either as an environment variable or in application.properties:

contextica.enterprise.license.key=YOUR-LICENSE-KEY

Then, activate the license by running the Maven goal:

mvn contextica:activate-license

If the validation succeeds, you will receive an offline token that is valid for 30 days. As a result, the license check against the Contextica licensing server happens only once every 30 days.

Maven Plugin

You can also enable the Contextica Maven plugin to run context generation automatically during your build:

<build>
    <plugins>
        <plugin>
            <groupId>ai.agensys</groupId>
            <artifactId>contextica-maven-plugin</artifactId>
            <version>1.0.0</version>
            <executions>
                <execution>
                    <goals>
                        <goal>generate-contexts</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
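Whether the plugin binds the generate-contexts goal to a lifecycle phase by default is not documented here. If you want to pin the goal to a specific phase yourself, you can add an explicit <phase> element to the execution; the phase shown below is only an illustration (context generation presumably needs compiled classes, so a phase after compilation is a reasonable guess):

<executions>
    <execution>
        <!-- assumed phase, not confirmed by the Contextica docs; adjust to your build -->
        <phase>process-classes</phase>
        <goals>
            <goal>generate-contexts</goal>
        </goals>
    </execution>
</executions>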

Configure LLM Provider

Contextica requires an API key for your LLM provider. The key is needed both for the context generation (optimization) process and for runtime execution.

Add the following to application.properties:

# Required API key
contextica.llm.openai.api.key=sk-...

At the moment only OpenAI is supported. Additional providers will be added in future releases.


Hello World

A simple “Hello World” example demonstrates the typical development flow:

  1. Declare an AI service
  2. Configure its functions
  3. Run the context generation process
  4. Invoke the service at runtime

1. Declare an AIService

@AIService(name = "DocumentService", provider = LLMProvider.OPENAI, model = LLMModel.GPT_4O_MINI)
public interface DocumentService {

    @AIFunction(description = "Summarize a piece of text")
    String summarize(String input);
}

2. Declare configuration for the AIFunction

@AIServiceConfiguration(name = DocumentService.class)
class DocumentServiceConfig {

    @AIFunctionConfiguration
    public FunctionConfiguration summarize() {
        return FunctionConfiguration.builder()
                .description("Summarize the input text clearly and concisely")
                .taskType(TaskType.SUMMARIZE)
                .build();
    }
}

3. Run the context generation process

When you define an @AIService and @AIFunction, they do not yet have an associated context (optimized prompt). To generate one, you must run the context generation process.

This process generates and evaluates candidate prompts, selects the best performing one, and stores it in a contexts.json file in your project. At runtime, the stored context is automatically used, so you do not need to write or maintain prompts manually.

Run the following commands:

mvn clean install
mvn contextica:generate-contexts

Once complete, your service is ready to use with optimized prompts.

4. Use the AI function

class App {

    public static void main(String[] args) {
        // Create a Contextica-backed implementation of the service interface
        DocumentService documentService = Contextica.createService(DocumentService.class);
        String result = documentService.summarize("Long text...");
        System.out.println(result);
    }
}
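When this program runs, Contextica loads the stored context for summarize from contexts.json and sends the request to the configured OpenAI model, so no hand-written prompt appears in your application code.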