When to use

Use this method to log errors from failed LLM calls, passing in your error object along with any additional metadata.

Usage

layerup.logError(error, messages, metadata);

Function Parameters

error
any
required
Error response from the LLM, such as an object or a string. Must be JSON serializable.

messages
array
required
List of objects, each representing a message in the LLM conversation chain.

metadata
object
optional
Metadata object, as specified in the metadata reference.
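
As a sketch, the three arguments might look like the following. The metadata field name is hypothetical and not taken from these docs; consult the metadata reference for the supported fields.

```javascript
// Sketch of the three logError arguments, assuming the shapes
// described above. The metadata field name below is hypothetical.
const error = { status: 429, message: 'Rate limit exceeded' };

// The error must be JSON serializable; this throws if it is not.
JSON.stringify(error);

const messages = [
	{ role: 'system', content: 'You are a helpful assistant.' },
	{ role: 'user', content: 'Hello!' },
];

const metadata = { requestId: 'req-123' }; // hypothetical field

// layerup.logError(error, messages, metadata);
```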

Response

The logError method returns a Promise that resolves to an object with the following fields:

success
boolean
Whether or not the log was recorded successfully.
// Keep track of your messages
const messages = [
	{ role: 'system', content: 'You are Jedi master Yoda.' },
	{ role: 'user', content: 'What is the favorite fruit of Luke Skywalker?' },
];

try {
	// Send your request
	await openai.chat.completions.create({
		messages,
		model: 'gpt-3.5-turbo',
	});
} catch (error) {
	// Log the error using Layerup error logging
	await layerup.logError(error, messages);
}
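
Because logError resolves to a `{ success }` object, you can await it and react when logging fails. A minimal sketch, using a stubbed client in place of an initialized Layerup SDK instance (the stub is an assumption; the real client exposes the same call shape):

```javascript
// Stand-in for an initialized Layerup client; the real SDK call has
// the same (error, messages, metadata) => Promise<{ success }> shape.
const layerupStub = {
	logError: async (error, messages, metadata) => ({ success: true }),
};

async function reportFailure(error, messages) {
	const result = await layerupStub.logError(error, messages);
	if (!result.success) {
		// Surface logging failures without interrupting the caller.
		console.warn('Layerup error log was not recorded');
	}
	return result.success;
}
```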