Description
Is there any way to return perplexity as a function of the number of iterations? This would make it possible to optimise the number of iterations and avoid burn-in periods in future executions.
One way I found to do this is to add a list attribute that is filled, at each iteration, with the perplexity for that iteration.
Perplexity is computed as exp(-(modelLogLikelihood() / totalTokens)).
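As a minimal sketch of the idea, the per-iteration bookkeeping could look like the class below. `PerplexityTracker`, the method names, and the sample log-likelihood values are all hypothetical illustrations, not part of any existing API; only the formula exp(-(logLikelihood / totalTokens)) comes from the proposal above, with the log-likelihood assumed to come from a call such as `modelLogLikelihood()` at each iteration.

```java
import java.util.ArrayList;
import java.util.List;

public class PerplexityTracker {
    // Perplexity from the formula proposed above: exp(-(logLikelihood / totalTokens)).
    static double perplexity(double logLikelihood, long totalTokens) {
        return Math.exp(-(logLikelihood / totalTokens));
    }

    // List attribute filled iteratively, one perplexity value per iteration.
    private final List<Double> history = new ArrayList<>();

    // Call once per training iteration with the current model log-likelihood.
    void record(double logLikelihood, long totalTokens) {
        history.add(perplexity(logLikelihood, totalTokens));
    }

    List<Double> history() {
        return history;
    }

    public static void main(String[] args) {
        PerplexityTracker tracker = new PerplexityTracker();
        // Hypothetical per-iteration log-likelihoods for a 10,000-token corpus.
        double[] logLikelihoods = {-95000.0, -90000.0, -88000.0};
        for (double ll : logLikelihoods) {
            tracker.record(ll, 10_000);
        }
        // The recorded curve can then be inspected to pick an iteration count
        // past the burn-in period.
        System.out.println(tracker.history());
    }
}
```

Plotting or inspecting `history()` after training would show where the perplexity curve flattens, which is the point the question is trying to identify.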
Any chance I can submit a pull request?