
Releases: jamesrochabrun/SwiftOpenAI

Function Calling Required

04 May 05:01

Adds support for required function calling. Also includes bug fixes.
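For context, below is a minimal sketch of the underlying OpenAI request fields this refers to. It is not SwiftOpenAI's exact API, and the "get_weather" tool is purely illustrative.

// Sketch only: the raw request fields behind "required" function calling.
// SwiftOpenAI's own parameter names may differ; "get_weather" is a hypothetical tool.
let requestFields: [String: Any] = [
    "model": "gpt-4-turbo",
    "messages": [["role": "user", "content": "What's the weather in Lima?"]],
    "tools": [[
        "type": "function",
        "function": [
            "name": "get_weather",
            "parameters": ["type": "object", "properties": ["city": ["type": "string"]]]
        ]
    ]],
    "tool_choice": "required"   // forces the model to call at least one of the provided tools
]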

Assistants API V2

30 Apr 05:14
97a0ec7

https://platform.openai.com/docs/assistants/whats-new


SwiftOpenAI has been migrated to Assistants API v2. If you need v1 support, use version 2.3 of this package.

Check OpenAI's migration guide:

We have changed the way that tools and files work in the Assistants API between the v1 and v2 versions of the beta. Both versions of the beta continue to be accessible via the API today, but we recommend migrating to the newest version of our APIs as soon as feasible. We will deprecate v1 of the beta by the end of 2024.

If you do not use tools or files with the Assistants API today, there should be no changes required for you to migrate from the v1 version to the v2 version of the beta. Simply pass the v2 beta version header and/or move to the latest version of our Node and Python SDKs!
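For reference, the v2 beta header mentioned above looks like this on a raw request; SwiftOpenAI is expected to set it for you after this migration, so the snippet is illustrative only.

import Foundation

// Illustrative only: how the Assistants v2 beta header is passed on a raw request.
let apiKey = "your_openai_api_key_here"
var request = URLRequest(url: URL(string: "https://api.openai.com/v1/assistants")!)
request.setValue("assistants=v2", forHTTPHeaderField: "OpenAI-Beta")
request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")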
What has changed
The v2 version of the Assistants API contains the following changes:

  • Tool rename: The retrieval tool has been renamed to the file_search tool.
  • Files belong to tools: Files are now associated with tools instead of Assistants and Messages. This means that:
    • AssistantFile and MessageFile objects no longer exist.
    • Instead of AssistantFile and MessageFile, files are attached to Assistants and Threads using the new tool_resources object.
    • The tool_resources for the code_interpreter tool are a list of file_ids.
    • The tool_resources for the file_search tool are a new object called a vector_store.
  • Messages now have an attachments parameter rather than a file_ids parameter. Message attachments are helpers that add files to a Thread's tool_resources.


Assistants have tools and tool_resources instead of file_ids. The retrieval tool is now the file_search tool. The tool_resource for the file_search tool is a vector_store.
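A minimal sketch of the v2 assistant payload shape described above (raw request fields, not SwiftOpenAI's exact API; "vs_123" is a hypothetical vector store ID):

// Sketch only: an assistant now declares tools plus tool_resources instead of file_ids.
let assistantPayload: [String: Any] = [
    "model": "gpt-4-turbo",
    "tools": [["type": "file_search"]],
    "tool_resources": [
        "file_search": [
            "vector_store_ids": ["vs_123"]   // hypothetical vector store ID
        ]
    ]
]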


Threads can bring their own tool_resources into a conversation.
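A matching sketch of the v2 thread payload shape (raw request fields, not SwiftOpenAI's exact API):

// Sketch only: a thread can carry its own tool_resources into the conversation.
let threadPayload: [String: Any] = [
    "tool_resources": [
        "file_search": [
            "vector_store_ids": ["vs_123"]   // hypothetical vector store ID
        ]
    ]
]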


Messages have attachments instead of file_ids. attachments are helpers that add files to the Thread’s tool_resources.
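And a sketch of the v2 message payload shape (raw request fields, not SwiftOpenAI's exact API; "file_abc123" is a hypothetical file ID):

// Sketch only: attachments replace file_ids and route each file to a tool's tool_resources.
let messagePayload: [String: Any] = [
    "role": "user",
    "content": "Summarize the attached report.",
    "attachments": [[
        "file_id": "file_abc123",
        "tools": [["type": "file_search"]]
    ]]
]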

All v1 endpoints and objects for the Assistants API can be found under the Legacy section of the API reference.

  • Support for the Batch API.
  • Support for vector stores.
  • Support for vector store files.
  • Support for vector store file batches.

New interfaces:

 // MARK: Batch

   /// Creates and executes a batch from an uploaded file of requests
   ///
   /// - Parameter parameters: The parameters needed to create a batch.
   /// - Returns: A [batch](https://platform.openai.com/docs/api-reference/batch/object) object.
   /// - Throws: An error if the request fails
   ///
   /// For more information, refer to [OpenAI's Batch API documentation](https://platform.openai.com/docs/api-reference/batch/create).
   func createBatch(
      parameters: BatchParameter)
      async throws -> BatchObject

   /// Retrieves a batch.
   ///
   /// - Parameter id: The identifier of the batch to retrieve.
   /// - Returns: A [BatchObject](https://platform.openai.com/docs/api-reference/batch/object) matching the specified ID.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Batch documentation](https://platform.openai.com/docs/api-reference/batch/retrieve).
   func retrieveBatch(
      id: String)
      async throws -> BatchObject
   
   /// Cancels an in-progress batch.
   ///
   /// - Parameter id: The identifier of the batch to cancel.
   /// - Returns: A [BatchObject](https://platform.openai.com/docs/api-reference/batch/object) matching the specified ID.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Batch documentation](https://platform.openai.com/docs/api-reference/batch/cancel)
   func cancelBatch(
      id: String)
      async throws -> BatchObject

   /// List your organization's batches.
   ///
   /// - Parameters:
   ///   - after: A cursor for use in pagination. after is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list.
   ///   - limit: A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20.
   /// - Returns: An `OpenAIResponse<BatchObject>` containing a list of paginated [Batch](https://platform.openai.com/docs/api-reference/batch/object) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Batch API documentation](https://platform.openai.com/docs/api-reference/batch/list).
   func listBatch(
      after: String?,
      limit: Int?)
      async throws -> OpenAIResponse<BatchObject>
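
   // Usage sketch (illustrative only): `service` is assumed to be an instance exposing
   // these interfaces, and "batch_abc123" is a hypothetical batch ID.
   //
   //    let batch = try await service.retrieveBatch(id: "batch_abc123")
   //    let firstPage = try await service.listBatch(after: nil, limit: 20)
   //    let canceled = try await service.cancelBatch(id: "batch_abc123")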
   
   // MARK: Vector Store
   
   /// Create a vector store.
   ///
   /// - Parameter parameters: The parameters needed to create a vector store.
   /// - Returns: A [Vector store](https://platform.openai.com/docs/api-reference/vector-stores) object.
   /// - Throws: An error if the request fails
   ///
   /// For more information, refer to [OpenAI's Vector store API documentation](https://platform.openai.com/docs/api-reference/vector-stores/create).
   func createVectorStore(
      parameters: VectorStoreParameter)
      async throws -> VectorStoreObject
   
   /// Returns a list of vector stores.
   ///
   /// - Parameter limit: A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20.
   /// - Parameter order: Sort order by the created_at timestamp of the objects. asc for ascending order and desc for descending order.
   /// - Parameter after: A cursor for use in pagination. after is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list.
   /// - Parameter before: A cursor for use in pagination. before is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include before=obj_foo in order to fetch the previous page of the list.
   /// - Returns: A list of [VectorStoreObject](https://platform.openai.com/docs/api-reference/vector-stores) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector stores API documentation](https://platform.openai.com/docs/api-reference/vector-stores/list).
   func listVectorStores(
      limit: Int?,
      order: String?,
      after: String?,
      before: String?)
      async throws -> OpenAIResponse<VectorStoreObject>
   
   /// Retrieves a vector store.
   ///
   /// - Parameter id: The ID of the vector store to retrieve.
   /// - Returns: A [Vector Store](https://platform.openai.com/docs/api-reference/vector-stores) matching the specified ID.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector store documentation](https://platform.openai.com/docs/api-reference/vector-stores/retrieve).
   func retrieveVectorStore(
      id: String)
      async throws -> VectorStoreObject
   
   /// Modifies a vector store.
   ///
   /// - Parameter id: The ID of the vector store to modify.
   /// - Returns: A [Vector Store](https://platform.openai.com/docs/api-reference/vector-stores) matching the specified ID.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector store documentation](https://platform.openai.com/docs/api-reference/vector-stores/modify).
   func modifyVectorStore(
      id: String)
      async throws -> VectorStoreObject

   /// Delete a vector store.
   ///
   /// - Parameter id: The ID of the vector store to delete.
   /// - Returns: A Deletion status.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector store documentation](https://platform.openai.com/docs/api-reference/vector-stores/delete).
   func deleteVectorStore(
      id: String)
      async throws -> DeletionStatus
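
   // Usage sketch (illustrative only): retrieving, listing, and deleting vector stores
   // with the interfaces above; "vs_123" is a hypothetical vector store ID.
   //
   //    let store = try await service.retrieveVectorStore(id: "vs_123")
   //    let stores = try await service.listVectorStores(limit: 20, order: "desc", after: nil, before: nil)
   //    let deletion = try await service.deleteVectorStore(id: "vs_123")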
   
   // MARK: Vector Store Files
   
   /// Create a vector store file by attaching a [File](https://platform.openai.com/docs/api-reference/files) to a vector store.
   ///
   /// - Parameter vectorStoreID: The ID of the vector store for which to create a File.
   /// - Parameter parameters: The parameters needed to create a vector store File.
   /// - Returns: A [VectorStoreFileObject](https://platform.openai.com/docs/api-reference/vector-stores-files/file-object)
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Vector store file documentation](https://platform.openai.com/docs/api-reference/vector-stores-files/createFile).
   func createVectorStoreFile(
      vectorStoreID: String,
      parameters: VectorStoreFileParameter)
      async throws -> VectorStoreFileObject
   
   /// Returns a list of vector ...

Bug Fixes

25 Apr 04:45
3847516
  • Bug Fixes
  • Assistants API Response format fix.

Updates the Swift package with the latest OpenAI API changes.

Third-Party Library: AIProxy

29 Mar 16:57
7278ddb

AIProxy

The AI Proxy team made this contribution independently. SwiftOpenAI's owner is not involved in its development but accepts it in the spirit of open-source collaboration. It is added for convenience, and its use is at the discretion of the developer.


Protect your OpenAI key without a backend.

What is it?

AIProxy is a backend for AI apps that proxies requests from your app to OpenAI. You can use this service to avoid exposing your OpenAI key in your app. We offer AIProxy support so that developers can build and distribute apps using SwiftOpenAI.

How does my SwiftOpenAI code change?
SwiftOpenAI supports proxying requests through AIProxy with a small change to your integration code.

Instead of initializing service with:

let apiKey = "your_openai_api_key_here"
let service = OpenAIServiceFactory.service(apiKey: apiKey)

Use:

#if DEBUG && targetEnvironment(simulator)
let service = OpenAIServiceFactory.service(
    aiproxyPartialKey: "hardcode_partial_key_here",
    aiproxyDeviceCheckBypass: "hardcode_device_check_bypass_here"
)
#else
let service = OpenAIServiceFactory.service(
    aiproxyPartialKey: "hardcode_partial_key_here"
)
#endif

The aiproxyPartialKey and aiproxyDeviceCheckBypass values are provided to you on the AIProxy developer dashboard.

⚠️ It is important that you do not let the aiproxyDeviceCheckBypass token leak into a distribution build of your app (including TestFlight distributions). Please retain the conditional compilation checks that are present in the sample code above.

What is the aiproxyDeviceCheckBypass constant?
AIProxy uses Apple's DeviceCheck to ensure that requests received by the backend originated from your app on a legitimate Apple device. However, the iOS simulator cannot produce DeviceCheck tokens. Rather than requiring you to constantly build and run on device during development, AIProxy provides a way to skip the DeviceCheck integrity check. The token is intended for use by developers only. If an attacker gets the token, they can make requests to your AIProxy project without including a DeviceCheck token, and thus remove one level of protection.

What is the aiproxyPartialKey constant?
This constant is intended to be included in the distributed version of your app. As the name implies, it is a partial representation of your OpenAI key. Specifically, it is one half of an encrypted version of your key. The other half resides on AIProxy's backend. As your app makes requests to AIProxy, the two encrypted parts are paired, decrypted, and used to fulfill the request to OpenAI.

How do I set up my project on AIProxy?
Please see the AIProxy integration guide.

⚠️ Disclaimer
Contributors of SwiftOpenAI shall not be liable for any damages or losses caused by third parties. Contributors of this library provide third-party integrations as a convenience, and any use of a third party's services is at your own risk.

Assistants API Stream

22 Mar 22:55

Assistants API stream support.

You can stream events from the Create Thread and Run, Create Run, and Submit Tool Outputs endpoints by passing "stream": true. The response will be a Server-Sent Events (SSE) stream.

In Swift:

   /// Creates a thread and run with stream enabled.
   ///
   /// - Parameter parameters: The parameters needed to create a thread and run.
   /// - Returns: An AsyncThrowingStream of [AssistantStreamEvent](https://platform.openai.com/docs/api-reference/assistants-streaming/events) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Run API documentation](https://platform.openai.com/docs/api-reference/runs/createThreadAndRun).
   func createThreadAndRunStream(
      parameters: CreateThreadAndRunParameter)
   async throws -> AsyncThrowingStream<AssistantStreamEvent, Error>
   
   /// Create a run with stream enabled.
   ///
   /// - Parameter threadID: The ID of the thread to run.
   /// - Parameter parameters: The parameters needed to build a Run.
   /// - Returns: An AsyncThrowingStream of [AssistantStreamEvent](https://platform.openai.com/docs/api-reference/assistants-streaming/events) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Run API documentation](https://platform.openai.com/docs/api-reference/runs/createRun).
   func createRunStream(
      threadID: String,
      parameters: RunParameter)
   async throws -> AsyncThrowingStream<AssistantStreamEvent, Error>
   
   
   /// When a run has the status: "requires_action" and required_action.type is submit_tool_outputs, this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request. Stream enabled
   ///
   /// - Parameter threadID: The ID of the [thread](https://platform.openai.com/docs/api-reference/threads) to which this run belongs.
   /// - Parameter runID: The ID of the run that requires the tool output submission.
   /// - Parameter parameters: The parameters needed for the run tools output.
   /// - Returns: An AsyncThrowingStream of [AssistantStreamEvent](https://platform.openai.com/docs/api-reference/assistants-streaming/events) objects.
   /// - Throws: An error if the request fails.
   ///
   /// For more information, refer to [OpenAI's Run API documentation](https://platform.openai.com/docs/api-reference/runs/submitToolOutputs).
   func submitToolOutputsToRunStream(
      threadID: String,
      runID: String,
      parameters: RunToolsOutputParameter)
   async throws -> AsyncThrowingStream<AssistantStreamEvent, Error>
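
A hedged sketch of consuming one of these streams; `service` and `parameters` stand for an already-configured service instance and a built CreateThreadAndRunParameter, and the event handling is only illustrative:

   let stream = try await service.createThreadAndRunStream(parameters: parameters)
   for try await event in stream {
      // Each AssistantStreamEvent carries a server-sent event (message deltas, run status changes, etc.).
      print(event)
   }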

Added a demo project and tutorial based on the equivalent Python tutorial.


Adds the latest changes from the OpenAI API changelog: https://platform.openai.com/docs/changelog

20 Feb 06:29

Azure OpenAI

24 Jan 06:57
1153951

This release adds Azure OpenAI support. Currently, DefaultOpenAIAzureService supports chat completions, both streamed and non-streamed.
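For reference, a sketch of the Azure OpenAI endpoint shape the service targets; the resource name, deployment, API version, and key below are placeholders, and this is not DefaultOpenAIAzureService's API:

import Foundation

// Illustrative only: Azure OpenAI routes chat completions through a deployment-specific URL
// and authenticates with an "api-key" header instead of a Bearer token.
let azureURL = URL(string: "https://YOUR_RESOURCE.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT/chat/completions?api-version=2024-02-01")!
var azureRequest = URLRequest(url: azureURL)
azureRequest.httpMethod = "POST"
azureRequest.setValue("YOUR_AZURE_OPENAI_KEY", forHTTPHeaderField: "api-key")
azureRequest.setValue("application/json", forHTTPHeaderField: "Content-Type")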


v1.5

03 Jan 19:17
61d7539
  • Streams can now be canceled (see the sketch below).
  • Updated the demo to show stream cancellation.
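
A hedged sketch of one way a stream can be canceled with Swift concurrency; `makeChatStream()` is a hypothetical stand-in for any of the library's streaming calls:

// Illustrative only: cancel the Task that consumes the stream to stop it.
let streamTask = Task {
    do {
        for try await chunk in try await makeChatStream() {
            print(chunk)   // e.g. append streamed text to the UI
        }
    } catch {
        print("Stream ended: \(error)")   // cancellation or a transport error surfaces here
    }
}

// Later, e.g. when the user taps "Stop":
streamTask.cancel()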


Log probs are now available with chat completions.

19 Dec 07:12
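For reference, a minimal sketch of the underlying request fields for log probabilities (raw OpenAI fields, not SwiftOpenAI's exact parameter names):

// Sketch only: request log probabilities on a chat completion.
let chatRequestFields: [String: Any] = [
    "model": "gpt-4",
    "messages": [["role": "user", "content": "Hello"]],
    "logprobs": true,      // include log probabilities for each output token
    "top_logprobs": 2      // also return the 2 most likely alternatives at each position
]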