Is your feature request related to a problem? Please describe.
Currently, when an exception is raised while running a flow as a function (for example, a content management policy exception), the outputs of all nodes are lost, including nodes that had already succeeded:
```python
from promptflow import load_flow
# NOTE: the import path of WrappedOpenAIError may differ between promptflow versions
from promptflow.tools.exception import WrappedOpenAIError

f = load_flow(source="../../examples/flows/chat/chat-basic/")
f.context.streaming = True
try:
    result = f(
        chat_history=[
            {
                "inputs": {"chat_input": "Hi"},
                "outputs": {"chat_output": "Hello! How can I assist you today?"},
            }
        ],
        question="How are you?",
    )
except WrappedOpenAIError as exc:
    # If we land here, the outputs of nodes that already succeeded are lost as well.
    pass

answer = ""
# With streaming enabled, result["answer"] is a generator; iterate it to collect the text.
for r in result["answer"]:
    answer += r
```
Describe the solution you'd like
When an exception occurs, could the outputs of nodes that have already completed be saved somewhere? For example, in a flow node1 -> node2 -> node3, if node2 raises an exception, we would still like access to node1's output.
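One possible shape for this, sketched below with purely hypothetical names (`PartialFlowError`, `run_flow`, and the node functions do not exist in promptflow today): the runner keeps a dict of completed node outputs and attaches it to the exception it raises, so a caller can recover node1's output even when node2 fails.

```python
# Hypothetical sketch of the requested behavior; none of these names are
# part of the promptflow API -- they only illustrate the desired semantics.

class PartialFlowError(Exception):
    """Raised when a node fails; carries outputs of nodes that succeeded."""

    def __init__(self, failed_node, partial_outputs, cause):
        super().__init__(f"node {failed_node!r} failed: {cause}")
        self.failed_node = failed_node
        self.partial_outputs = partial_outputs  # dict: node name -> output


def run_flow(nodes, inputs):
    """Run (name, fn) pairs in order, preserving completed outputs on failure."""
    outputs = {}
    current = inputs
    for name, fn in nodes:
        try:
            current = fn(current)
        except Exception as exc:
            # Surface what already succeeded instead of discarding it.
            raise PartialFlowError(name, outputs, exc) from exc
        outputs[name] = current
    return outputs


# node1 succeeds, node2 simulates a content management policy failure.
def node1(x):
    return x + " -> n1"

def node2(x):
    raise RuntimeError("content management policy violation")

try:
    run_flow([("node1", node1), ("node2", node2)], "input")
except PartialFlowError as exc:
    # node1's output survives the failure of node2.
    print(exc.partial_outputs)
```

With this shape, the `except WrappedOpenAIError` branch in the report above could read the succeeded outputs off the exception instead of losing them.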
Describe alternatives you've considered
Not sure
Additional context
Not sure