The function takes several parameters: the command to evaluate, the data to process, the batch size, whether to shuffle the data, the multiplicity (how many times each data entry is run through ChatGPT), the maximum number of retry attempts, a validation callback, caps on API requests and tokens, and a log file. It also accepts parameters that control how the command and data are formatted, along with parameters forwarded to the GPT API, such as the model and temperature.
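To make the batching-related parameters concrete, here is a minimal sketch of how data might be grouped into batches. This is an illustrative stand-in, not the library's actual implementation: the helper name `make_batches` and its exact behavior are assumptions; only the parameter ideas (batch size, shuffling, multiplicity) come from the description above.

```python
import random

def make_batches(data, batch_size=10, shuffle=False, multiplicity=1, seed=None):
    """Hypothetical sketch: repeat each entry `multiplicity` times,
    optionally shuffle, then chunk into batches of `batch_size`."""
    entries = [item for item in data for _ in range(multiplicity)]
    if shuffle:
        random.Random(seed).shuffle(entries)
    return [entries[i:i + batch_size] for i in range(0, len(entries), batch_size)]
```

Grouping entries this way is what lets one API call cover many data items, which is the point of batching.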
Key takeaways:
- The 'gpt_batch_eval' function allows you to run an "English language function" over a set of data using ChatGPT, with data grouped into batches to reduce API calls.
- The function has several parameters that can be adjusted, including batch size, whether to shuffle data, the number of times each data entry is run through ChatGPT (multiplicity), and the maximum number of retries for failed data.
- You can preview what the ChatGPT prompt will look like for a single batch using "batch.preview()", or run exactly one batch through ChatGPT to see the results using "batch.test()".
- Additional parameters let you cap the number of calls to the OpenAI API, limit the number of tokens used, specify a log file, and adjust how the command and data are formatted before they are sent to ChatGPT.
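The preview/test workflow above can be sketched as follows. The method names `preview()` and `test()` come from the summary; the `Batch` class, its constructor, and the stubbed model callable are assumptions made for illustration.

```python
class Batch:
    """Hypothetical stand-in for one batch of data entries."""

    def __init__(self, command, entries, model=lambda prompt: "<response>"):
        self.command = command
        self.entries = entries
        self.model = model  # stub standing in for the real ChatGPT call

    def preview(self):
        # Show the prompt that would be sent, without making any API call.
        numbered = "\n".join(f"{i + 1}. {e}" for i, e in enumerate(self.entries))
        return f"{self.command}\n{numbered}"

    def test(self):
        # Run exactly this one batch through the (stubbed) model.
        return self.model(self.preview())
```

Separating `preview()` (no API call) from `test()` (one real call) lets you sanity-check prompt formatting cheaply before committing to a full run.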