
Commit 9c2770a

fix genai documentation (#326)
1 parent 3a60019

File tree: 4 files changed, +22 −13 lines


convokit/genai/__init__.py
Lines changed: 0 additions & 2 deletions

@@ -21,7 +21,6 @@
     pass
 
 from .factory import get_llm_client
-from .llm_transformer import LLM
 from .llmprompttransformer import LLMPromptTransformer
 
 __all__ = [
@@ -32,6 +31,5 @@
     "LocalClient",
     "get_llm_client",
     "GenAIConfigManager",
-    "LLM",
     "LLMPromptTransformer",
 ]
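The module documentation below describes a factory function, `get_llm_client`, that returns a provider-specific client instance. The dispatch pattern can be sketched in a few lines; everything here other than the `get_llm_client` name (the client classes, their `generate` method, and the provider strings) is illustrative, not ConvoKit's actual API.

```python
# Minimal sketch of the factory pattern described in the docs: map a provider
# name to a concrete client class. Class and method names are hypothetical.

class GPTClient:
    def generate(self, prompt: str) -> str:
        return f"[gpt] {prompt}"

class GeminiClient:
    def generate(self, prompt: str) -> str:
        return f"[gemini] {prompt}"

class LocalClient:
    def generate(self, prompt: str) -> str:
        return f"[local] {prompt}"

_PROVIDERS = {"gpt": GPTClient, "gemini": GeminiClient, "local": LocalClient}

def get_llm_client(provider: str):
    """Return a client instance for the given provider name."""
    try:
        return _PROVIDERS[provider]()
    except KeyError:
        raise ValueError(f"Unknown provider: {provider!r}") from None
```

Callers then work against the shared `generate` interface regardless of which provider was chosen, which is what makes the clients interchangeable.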

docs/source/featureExtraction.rst
Lines changed: 2 additions & 1 deletion

@@ -15,4 +15,5 @@ These are the transformers related to extracting features from the corpus and it
    ExpectedContextModel <expected_context_model.rst>
    Redirection <redirectionAndUtteranceLikelihood.rst>
    UtteranceLikelihood <redirectionAndUtteranceLikelihood.rst>
-   PivotalMomentMeasure <pivotal.rst>
+   PivotalMomentMeasure <pivotal.rst>
+   LLMPromptTransformer <llmprompttransformer.rst>

docs/source/genai.rst
Lines changed: 9 additions & 10 deletions

@@ -3,7 +3,7 @@ GenAI
 
 The GenAI module provides a unified interface for working with LLMs while doing conversational analysis in ConvoKit. The current implementation supports multiple providers including OpenAI GPT and Google Gemini, but is designed to be extensible to LLMs from other model providers and local models. This module makes it easy to integrate AI-powered text generation into your ConvoKit workflows for diverse tasks. The module handles API key management, response formatting, and provides consistent interfaces across different LLM providers.
 
-The module includes a ConvoKit transformer that allow you to apply LLM processing directly to corpus objects at different levels (utterances, conversations, speakers, or entire corpus), making it seamless to integrate AI analysis into your conversational data processing pipelines.
+The module includes ConvoKit transformers that allow you to apply LLM processing directly to corpus objects at different levels (utterances, conversations, speakers, or entire corpus), making it seamless to integrate AI analysis into your conversational data processing pipelines.
 
 Example usage: `GenAI module demo <https://github.com/CornellNLP/ConvoKit/blob/master/convokit/genai/example/example.ipynb>`_.
 
@@ -17,7 +17,7 @@ The GenAI module consists of several key components:
 * **Factory Pattern**: Simple factory function to create appropriate client instances
 * **Configuration Management**: Centralized API key and configuration management
 * **Provider Clients**: Concrete implementations for different LLM providers (GPT, Gemini, Local)
-* **GenAI Transformers**: ConvoKit transformers that apply LLM processing to corpus objects
+* **LLMPromptTransformer**: Flexible ConvoKit transformer that applies custom LLM prompts to corpus objects at any level
 
 Basic Interface and Configuration
 ---------------------------------
@@ -31,14 +31,6 @@ Basic Interface and Configuration
 .. automodule:: convokit.genai.factory
    :members:
 
-LLMPromptTransformer
-^^^^^^^^^^^^^^^^^^^^
-
-The LLMPromptTransformer is a flexible transformer that allows you to apply custom prompts and formatters to any level of corpus objects (utterances, conversations, speakers, or the entire corpus). It provides fine-grained control over how objects are formatted for LLM processing and where the results are stored.
-
-.. automodule:: convokit.genai.llmprompttransformer
-   :members:
-
 Provider Clients
 ----------------
 
@@ -107,3 +99,10 @@ The GenAIConfigManager handles API key storage and retrieval for different LLM p
 
     # Configuration is automatically saved and can be reused
+
+LLMPromptTransformer
+--------------------
+
+The LLMPromptTransformer is a flexible and powerful ConvoKit transformer that allows you to apply custom LLM prompts to corpus objects at different levels (utterances, conversations, speakers, or the entire corpus). It provides fine-grained control over how objects are formatted for LLM processing and where the results are stored.
+
+.. toctree::
+   LLMPromptTransformer <llmprompttransformer.rst>
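The genai.rst text above says the GenAIConfigManager centralizes API-key storage so that "configuration is automatically saved and can be reused". A minimal self-contained sketch of that idea follows; the class name `ConfigManager`, its methods, and the JSON-file storage are all assumptions for illustration, not ConvoKit's real interface.

```python
import json
import os

class ConfigManager:
    """Toy key store: persists provider API keys to a JSON file so a later
    session can reuse them without re-entering the key. Hypothetical API."""

    def __init__(self, path: str):
        self.path = path
        self._keys = {}
        if os.path.exists(path):
            with open(path) as f:
                self._keys = json.load(f)  # reload previously saved keys

    def set_api_key(self, provider: str, key: str) -> None:
        self._keys[provider] = key
        with open(self.path, "w") as f:
            json.dump(self._keys, f)  # save immediately so it can be reused

    def get_api_key(self, provider: str):
        return self._keys.get(provider)  # None if never configured
```

A second `ConfigManager` pointed at the same path would see the keys saved by the first, which is the "saved and reused" behavior the docs describe.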
docs/source/llmprompttransformer.rst
Lines changed: 11 additions & 0 deletions

@@ -0,0 +1,11 @@
+LLMPromptTransformer
+====================
+
+The LLMPromptTransformer is a flexible ConvoKit transformer that allows you to apply custom LLM prompts to corpus objects at different levels (utterances, conversations, speakers, or the entire corpus). It provides fine-grained control over how objects are formatted for LLM processing and where the results are stored as metadata.
+
+This transformer is part of the GenAI module (see :doc:`GenAI <genai>`) and integrates seamlessly with the GenAI client infrastructure to support multiple LLM providers (OpenAI GPT, Google Gemini, and local models).
+
+Example usage: `GenAI module demo <https://github.com/CornellNLP/ConvoKit/blob/master/convokit/genai/example/example.ipynb>`_.
+
+.. automodule:: convokit.genai.llmprompttransformer
+   :members:
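The new doc page describes the transformer's core loop: format each corpus object with a prompt, send it to an LLM client, and store the response as metadata. That pattern can be sketched in plain Python; every name below (`Utterance`, `EchoClient`, `PromptTransformer`, the constructor parameters) is hypothetical and only illustrates the shape of the design, not ConvoKit's actual classes.

```python
class Utterance:
    """Stand-in for a corpus object with a metadata dict."""
    def __init__(self, text: str):
        self.text = text
        self.meta = {}

class EchoClient:
    """Stand-in for a real LLM client; just uppercases the prompt."""
    def generate(self, prompt: str) -> str:
        return prompt.upper()

class PromptTransformer:
    """Toy version of the prompt-transformer pattern: a template controls
    how each object is formatted, output_field controls where the LLM
    response is stored in the object's metadata."""

    def __init__(self, client, prompt_template: str, output_field: str):
        self.client = client
        self.prompt_template = prompt_template
        self.output_field = output_field

    def transform(self, utterances):
        for utt in utterances:
            prompt = self.prompt_template.format(text=utt.text)
            utt.meta[self.output_field] = self.client.generate(prompt)
        return utterances
```

Separating the template, the client, and the output field is what gives the "fine-grained control" the docs mention: each can be swapped independently without touching the loop.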
