docs/source/genai.rst (9 additions, 10 deletions)
@@ -3,7 +3,7 @@ GenAI
The GenAI module provides a unified interface for working with LLMs during conversational analysis in ConvoKit. The current implementation supports multiple providers, including OpenAI GPT and Google Gemini, and is designed to be extensible to LLMs from other model providers as well as local models. This module makes it easy to integrate AI-powered text generation into your ConvoKit workflows for diverse tasks. It handles API key management and response formatting, and provides consistent interfaces across different LLM providers.
-The module includes a ConvoKit transformer that allow you to apply LLM processing directly to corpus objects at different levels (utterances, conversations, speakers, or entire corpus), making it seamless to integrate AI analysis into your conversational data processing pipelines.
+The module includes ConvoKit transformers that allow you to apply LLM processing directly to corpus objects at different levels (utterances, conversations, speakers, or entire corpus), making it seamless to integrate AI analysis into your conversational data processing pipelines.
Example usage: `GenAI module demo <https://github.com/CornellNLP/ConvoKit/blob/master/convokit/genai/example/example.ipynb>`_.
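For orientation, here is a minimal sketch of the direct-client workflow the paragraph above describes. The factory function name ``get_client``, its parameters, and the client's ``generate`` method are illustrative assumptions, not the module's exact API; the factory and client references below document the real names.

.. code-block:: python

    # Illustrative sketch only: ``get_client`` and ``generate`` are assumed
    # names standing in for the factory function and the provider client's
    # generation method.
    from convokit.genai import get_client  # import path assumed

    client = get_client(provider="gpt")  # or "gemini", or a local model
    reply = client.generate("Summarize the main disagreement in this thread: ...")
    print(reply)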
@@ -17,7 +17,7 @@ The GenAI module consists of several key components:
* **Factory Pattern**: Simple factory function to create appropriate client instances
* **Configuration Management**: Centralized API key and configuration management
* **Provider Clients**: Concrete implementations for different LLM providers (GPT, Gemini, Local)
-* **GenAI Transformers**: ConvoKit transformers that apply LLM processing to corpus objects
+* **LLMPromptTransformer**: Flexible ConvoKit transformer that applies custom LLM prompts to corpus objects at any level
Basic Interface and Configuration
---------------------------------
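As a rough sketch of how configuration might look in practice: ``GenAIConfigManager`` is named in this module's reference, but the ``set_api_key`` method and import path shown here are assumed names for illustration, not the documented interface.

.. code-block:: python

    # Illustrative sketch: register a provider key once so that clients and
    # transformers created later can retrieve it automatically.
    from convokit.genai import GenAIConfigManager  # import path assumed

    config = GenAIConfigManager()
    config.set_api_key("gpt", "sk-...")      # method name assumed; key is a placeholder
    config.set_api_key("gemini", "AIza...")  # placeholder key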
@@ -31,14 +31,6 @@ Basic Interface and Configuration
.. automodule:: convokit.genai.factory
   :members:
-LLMPromptTransformer
-^^^^^^^^^^^^^^^^^^^^
-
-The LLMPromptTransformer is a flexible transformer that allows you to apply custom prompts and formatters to any level of corpus objects (utterances, conversations, speakers, or the entire corpus). It provides fine-grained control over how objects are formatted for LLM processing and where the results are stored.
@@ -107,3 +99,10 @@ The GenAIConfigManager handles API key storage and retrieval for different LLM providers
# Configuration is automatically saved and can be reused
+LLMPromptTransformer
+--------------------
+
+The LLMPromptTransformer is a flexible ConvoKit transformer that allows you to apply custom LLM prompts to corpus objects at different levels (utterances, conversations, speakers, or the entire corpus). It provides fine-grained control over how objects are formatted for LLM processing and where the results are stored as metadata.
+
+This transformer is part of the GenAI module (see :doc:`GenAI <genai>`) and integrates seamlessly with the GenAI client infrastructure to support multiple LLM providers (OpenAI GPT, Google Gemini, and local models).
+
+Example usage: `GenAI module demo <https://github.com/CornellNLP/ConvoKit/blob/master/convokit/genai/example/example.ipynb>`_.
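To make the transformer's role concrete, here is a hedged sketch of how it might be wired together. The import path, constructor argument names, and prompt template syntax below are illustrative assumptions rather than the exact signature; the demo notebook linked above shows the real interface.

.. code-block:: python

    # Illustrative sketch: annotate every utterance with an LLM-generated label
    # and store the result as utterance metadata. Argument names are assumed.
    from convokit import Corpus, download
    from convokit.genai import LLMPromptTransformer  # import path assumed

    corpus = Corpus(filename=download("conversations-gone-awry-corpus"))

    politeness_labeler = LLMPromptTransformer(
        provider="gpt",                                   # which LLM backend to use
        obj_type="utterance",                             # corpus level to operate on (assumed name)
        prompt="Rate the politeness of this message from 1 to 5: {text}",  # template format assumed
        metadata_name="llm_politeness",                   # where to store the output (assumed name)
    )
    corpus = politeness_labeler.transform(corpus)
    print(corpus.random_utterance().meta["llm_politeness"])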