Documentation Index
Fetch the complete documentation index at: https://docs.uselayerup.com/llms.txt
Use this file to discover all available pages before exploring further.
When to use
Use this method when catching errors from failed LLM calls, passing in your error object along with any additional metadata.
Usage
layerup.logError(error, messages, metadata);
Function Parameters
Node.js (logError):
- error: Error response from the LLM, such as an object or a string. Must be JSON serializable.
- messages: List of objects, each representing a message in the LLM conversation chain.
- metadata: Metadata object, as specified here.

Python (log_error):
- error: Error response from the LLM, such as a dictionary or a string. Must be JSON serializable.
- messages: List of dictionaries, each representing a message in the LLM conversation chain.
- metadata: Metadata dictionary, as specified here.
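The Python parameter descriptions above can be sketched as a call shape. The real client class is not shown on this page, so LoggerStub below is a hypothetical stand-in (as is the "success" response field name); it only enforces the documented requirements, namely that the error, messages, and metadata are JSON serializable.

```python
# Minimal sketch of the Python log_error call shape described above.
# LoggerStub is a hypothetical stand-in for the Layerup client, not the real SDK.
import json


class LoggerStub:
    """Hypothetical stand-in for a Layerup client exposing log_error."""

    def log_error(self, error, messages, metadata=None):
        json.dumps(error)     # error must be JSON serializable, per the docs
        json.dumps(messages)  # messages is a list of dictionaries, per the docs
        if metadata is not None:
            json.dumps(metadata)
        return {"success": True}  # the response field name is an assumption


messages = [
    {"role": "system", "content": "You are Jedi master Yoda."},
    {"role": "user", "content": "What is the favorite fruit of Luke Skywalker?"},
]
error = {"type": "RateLimitError", "message": "Too many requests"}

result = LoggerStub().log_error(error, messages, metadata={"customer": "example"})
print(result["success"])  # True
```

Passing a non-serializable value (for example, a raw exception object that has not been converted to a string or dictionary) would fail the `json.dumps` check, mirroring the documented JSON-serializability requirement.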
Response
Node.js: The logError method will return a Promise that resolves to an object with the following field:
- Whether or not the log was successful.

Python: The log_error method will return a dictionary with the following field:
- Whether or not the log was successful.
// Keep track of your messages
const messages = [
  { role: 'system', content: 'You are Jedi master Yoda.' },
  { role: 'user', content: 'What is the favorite fruit of Luke Skywalker?' },
];

try {
  // Send your request
  await openai.chat.completions.create({
    messages,
    model: 'gpt-3.5-turbo',
  });
} catch (error) {
  // Log error using Layerup error logging
  layerup.logError(error, messages);
}
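The same catch-and-log pattern would look roughly like this with the Python log_error method described above. Both the client and the failing LLM call are hypothetical stand-ins so the sketch is runnable on its own; in real code they would be an initialized Layerup client and an actual model request.

```python
# Hypothetical Python mirror of the Node.js example above, using the
# log_error method described on this page. The client and the LLM call
# are stand-ins, not the real SDK or a real model request.

messages = [
    {"role": "system", "content": "You are Jedi master Yoda."},
    {"role": "user", "content": "What is the favorite fruit of Luke Skywalker?"},
]

logged = []


class LayerupStub:
    """Stand-in for an initialized Layerup client."""

    def log_error(self, error, msgs, metadata=None):
        logged.append((str(error), msgs))
        return {"success": True}


layerup = LayerupStub()


def create_completion(msgs):
    """Hypothetical LLM call that fails, standing in for a real request."""
    raise RuntimeError("rate limit exceeded")


try:
    # Send your request
    create_completion(messages)
except Exception as error:
    # Log the error using Layerup error logging
    layerup.log_error(error, messages)

print(len(logged))  # 1
```

As in the Node.js example, the error is caught at the call site and forwarded to the logger together with the full message chain, so the failed conversation context is preserved alongside the error.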