This repository was archived by the owner on Jul 18, 2024. It is now read-only.

Releases: rcaneppele/simple-openai-client

1.5.2 - Bugfixes

25 Jan 22:39


We're pleased to announce the release of version 1.5.2, dedicated to resolving bugs and introducing changes to endpoints with path variables.

Bugfixes

In this release, we've addressed a few bugs to enhance the overall reliability of the library.

Endpoint Changes: Path Variables

The primary modifications center around endpoints that require path variables. Starting with this release, all path variables must be passed as parameters of the OpenAIClient methods rather than being set on the request objects. This change makes working with path variables clearer and more consistent across the library.
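
For illustration, a message-modification call that previously carried the thread and message ids inside the request builder now passes them directly to the client method. The post-1.5.2 signature shown below is an assumption based on the change described above, not a verbatim copy of the library's API:

// Before 1.5.2: path variables were set on the request object
var request = new ModifyMessageRequestBuilder()
    .threadId("thread-id")
    .messageId("message-id")
    .metadata(Map.of("key", "value"))
    .build();
var response = client.modifyMessage(request);

// From 1.5.2 onward (assumed signature): path variables are method parameters
var newRequest = new ModifyMessageRequestBuilder()
    .metadata(Map.of("key", "value"))
    .build();
var newResponse = client.modifyMessage("thread-id", "message-id", newRequest);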

Upgrade to version 1.5.2

We strongly recommend upgrading your projects to version 1.5.2 to benefit from the bug fixes and endpoint changes introduced in this release, along with improved stability and performance.

We look forward to providing you with an even more robust and reliable library experience with version 1.5.2!

Full Changelog: 1.5.0...1.5.2

1.4.0 - Messages Unleashed!

11 Jan 21:38


We're excited to introduce version 1.4.0 of our library, placing a spotlight on the powerful Messages endpoints. This release empowers you to seamlessly create, list, retrieve, and modify Thread Messages through the OpenAI API.

Create Message

Easily create a message within a thread using the following snippet:

var request = new CreateMessageRequestBuilder()
    .threadId("thread-id")
    .content("content")
    .fileIds("fileId-1", "fileId-2")
    .metadata(Map.of("key", "value"))
    .build();

var response = client.createMessage(request);
System.out.println(response);

List Messages

Effortlessly retrieve a list of messages associated with a specific thread:

var response = client.listMessages("thread-id");
System.out.println(response);

// Apply filters/limits by building a QueryParameters object:
var queryParameters = new QueryParametersBuilder()
    .limit(5)
    .build();
response = client.listMessages("thread-id", queryParameters);

Retrieve Message

Retrieve detailed information about a specific message within a thread:

var message = client.retrieveMessage("thread-id", "message-id");
System.out.println(message);

// Access Text Message content:
var firstMessage = message.firstMessageContentText();
var lastMessage = message.lastMessageContentText();

// Access Image Message content:
var firstImageFileId = message.firstMessageContentImage();
var lastImageFileId = message.lastMessageContentImage();

Modify Message

Update the metadata of a specific message in a thread with ease:

var request = new ModifyMessageRequestBuilder()
    .threadId("thread-id")
    .messageId("message-id")
    .metadata(Map.of("key", "value"))
    .build();

var response = client.modifyMessage(request);
System.out.println(response);

Upgrade your messaging capabilities with version 1.4.0, offering enhanced control and functionality through the OpenAI API.

1.3.0 - Simplified Method Names

09 Jan 13:11


We're thrilled to announce the swift release of version 1.3.0 of our library! This release was expedited to address the recent breaking changes in the main class of the project.

Breaking Changes: Important Updates

ATTENTION: This version introduces significant breaking changes to enhance functionality and simplify your experience.

All methods responsible for sending requests to the OpenAI API have undergone name simplifications. We have removed the send prefix and the Request suffix, aiming for increased simplicity.

Example of old method name: client.sendListAssistantsRequest()
New method name: client.listAssistants()
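
In code, the rename looks roughly like the snippet below; whether the simplified methods accept the same QueryParameters argument is an assumption based on the pre-rename example further down in these notes:

// Before 1.3.0:
var assistants = client.sendListAssistantsRequest(parameters);

// From 1.3.0 onward (same arguments, simplified name):
var assistantsRenamed = client.listAssistants(parameters);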

Stay up-to-date by adapting to these changes, and enjoy a more straightforward and efficient experience with version 1.3.0 of our library!

Threads Support

09 Jan 02:27


Introducing Threads Endpoints

We're excited to announce the addition of Threads endpoints in this latest release! Now, you have the power to create, retrieve, modify, and delete Threads using the OpenAI API.
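
A minimal sketch of how these endpoints might be used is shown below. The builder and method names are assumptions patterned after the other examples in these notes, not the library's confirmed API:

// Hypothetical usage; class and method names are assumptions.
var request = new CreateThreadRequestBuilder()
    .metadata(Map.of("key", "value"))
    .build();

var thread = client.sendCreateThreadRequest(request);          // create
var retrieved = client.sendRetrieveThreadRequest("thread-id"); // retrieve
client.sendDeleteThreadRequest("thread-id");                   // delete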

Breaking Changes: Important Updates

ATTENTION: This version comes with significant breaking changes to enhance functionality and streamline your experience.

Deprecated Classes

The following classes have been deprecated in this release:

  • ListAssistantRequest
  • ListAssistantFilesRequest
  • ListAssistantFilesRequestBuilder

New Pagination and Filtering Approach

To control pagination and filters on endpoints supporting these features, adopt the new QueryParameters class:

var parameters = new QueryParametersBuilder()
    .limit(5)
    .after("after-id")
    .before("before-id")
    .ascOrder()
    .build();

var assistants = client.sendListAssistantsRequest(parameters);

Assistant Support

06 Jan 11:17


Assistant Support with OpenAI Assistants API

We're thrilled to introduce an upgraded experience in this latest release, now featuring the powerful Assistants API from OpenAI.

In this initial phase, we've rolled out support for assistant endpoints. Stay tuned, as we're actively working on integrating Threads, Messages, and Run support, which will be available in the upcoming releases.
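
As a minimal example, listing assistants with this release looks roughly like the snippet below; the no-argument call is an assumption, while the method name matches the pre-rename form referenced in the later 1.3.0 notes:

// List assistants via the Assistants API (pre-1.3.0 method naming, no-arg call assumed).
var assistants = client.sendListAssistantsRequest();
System.out.println(assistants);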

Multiple Messages

29 Dec 14:44


Version 1.0.4

You can now send multiple user- and system-role messages in a Chat Completion request. Below is an example illustrating the usage:

var request = new ChatCompletionRequestBuilder()
    .model(OpenAIModel.GPT_4_1106_PREVIEW)
    .systemMessage("First system message")
    .systemMessage("Second system message")
    .userMessage("First user message")
    .userMessage("Second user message")
    .build();
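
To send the request, a call along the following lines would be used; the non-streaming method name is an assumption inferred from the streaming variant introduced in the release below:

// Assumed non-streaming counterpart of sendStreamChatCompletionRequest.
var response = client.sendChatCompletionRequest(request);
System.out.println(response.firstChoiceMessageContent());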

Streaming Support for Chat Completion

28 Dec 18:26


We are excited to introduce streaming support for the chat completion endpoint, allowing for a more dynamic and responsive interaction with the OpenAI API. This feature leverages RxJava for reactive programming, enabling event-based handling of the response stream.

You can now use the sendStreamChatCompletionRequest method to integrate streaming into your Java applications. Below is an example illustrating the usage:

client.sendStreamChatCompletionRequest(request).subscribe(response -> {
    var message = response.firstChoiceMessageContent();
    if (message != null) {
        System.out.println(message);
    }
}, error -> {
    System.out.println("Error during streaming: " +error.getMessage());
}, () -> {
    System.out.println("Streaming completed");
});

This example demonstrates how to handle responses, errors, and the completion of the streaming request. Feel free to incorporate this feature to enhance the real-time capabilities of your OpenAI-powered applications.

For more details, refer to the streaming documentation.