// Keep track of your messages
const sensitiveMessages = [
	{ role: 'system', content: 'Summarize the following email for me.' },
	{ role: 'user', content: 'Dear Mr. Smith, hope you are doing well. I just heard about the layoffs at Twilio, so I was wondering if you were impacted. Can you please call me back at your earliest convenience? My number is (123) 456-7890. Best Regards, Bob Dylan' },
];

// Make the call to Layerup
let [messages, unmaskResponse] = await layerup.maskPrompt(sensitiveMessages);

// Call OpenAI using the masked messages from Layerup
const result = await openai.chat.completions.create({
	messages,
	model: 'gpt-3.5-turbo',
});

// Unmask the messages using the provided unmask function
const unmaskedResult = unmaskResponse(result);

When to use

Use this method when your prompt may contain PII or other sensitive data. The SDK masks any sensitive information and returns an updated prompt. After you receive the response from your LLM, use the provided unmasking function to restore the original data.

You can read about how it works here.
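Conceptually, masking swaps each piece of sensitive data for a placeholder and remembers the mapping so the substitution can be reversed later. The sketch below illustrates that idea for phone numbers only; it is a simplified stand-in, not the actual Layerup implementation, and the `[PHONE_NUMBER_n]` placeholder format is an assumption:

```javascript
// Simplified illustration of prompt masking -- NOT the actual Layerup implementation.
// Replaces phone numbers with placeholders and returns an unmask function
// that restores them in the LLM response.
function maskPromptSketch(messages) {
	const mapping = {}; // placeholder -> original value
	let counter = 0;
	const phoneRegex = /\(\d{3}\) \d{3}-\d{4}/g;

	const masked = messages.map((msg) => ({
		...msg,
		content: msg.content.replace(phoneRegex, (match) => {
			const placeholder = `[PHONE_NUMBER_${++counter}]`;
			mapping[placeholder] = match;
			return placeholder;
		}),
	}));

	// The unmask function closes over the mapping created above, which is
	// why it only works for this specific masked prompt.
	const unmask = (response) =>
		Object.entries(mapping).reduce(
			(text, [placeholder, original]) => text.split(placeholder).join(original),
			response
		);

	return [masked, unmask];
}
```

For example, masking `'Call me at (123) 456-7890.'` would yield `'Call me at [PHONE_NUMBER_1].'`, and passing the LLM's reply through `unmask` would restore the original number.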

Usage

const [maskedMessages, unmaskResponse] = await layerup.maskPrompt(messages, metadata);

Function Parameters

messages
array
required

Array of objects, each representing a message in the LLM conversation chain.

metadata
object

Metadata object, as specified here.

Response

The maskPrompt method returns a Promise that resolves to an array with exactly two values:

messages
array

A copy of the input messages array in which all PII and sensitive data has been replaced with templated variable names.
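For illustration, the masked copy of the email example above might look like the following. The exact placeholder names are an assumption, shown only to convey the shape of the result:

```javascript
// Hypothetical masked output -- placeholder names are illustrative only.
const maskedMessages = [
	{ role: 'system', content: 'Summarize the following email for me.' },
	{
		role: 'user',
		content:
			'Dear [NAME_1], hope you are doing well. I just heard about the layoffs at ' +
			'[COMPANY_1], so I was wondering if you were impacted. Can you please call me ' +
			'back at your earliest convenience? My number is [PHONE_NUMBER_1]. ' +
			'Best Regards, [NAME_2]',
	},
];
```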

unmaskResponse
function

A function that can be applied to the LLM response to unmask the data. It takes the LLM response and returns the same response with the original PII and sensitive data restored.

The LLM response can be formatted as either:

  1. A raw OpenAI chat completion object, or
  2. A string with the raw response content

Passing any other format of data to the unmaskResponse function will throw an error.

Note: The unmask function is only valid for the specific masked prompt that was provided. It cannot be used to unmask other prompts.
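To make the two accepted input shapes concrete, here is a hypothetical stand-in for the unmask function. How the real Layerup function normalizes its input is not documented here, so the shape handling below is an assumption:

```javascript
// Illustrative helper showing the two input shapes unmaskResponse accepts:
// a raw OpenAI chat completion object, or a plain response string.
// This is a hypothetical stand-in, not the real Layerup function.
function unmaskSketch(response, mapping) {
	let text;
	if (typeof response === 'string') {
		// Shape 2: a string with the raw response content.
		text = response;
	} else if (response?.choices?.[0]?.message?.content) {
		// Shape 1: a raw OpenAI chat completion object.
		text = response.choices[0].message.content;
	} else {
		// Any other format results in an error being thrown.
		throw new Error('Unsupported response format');
	}
	// Restore each placeholder to its original value.
	return Object.entries(mapping).reduce(
		(acc, [placeholder, original]) => acc.split(placeholder).join(original),
		text
	);
}
```

Both `unmaskSketch('Hi [NAME_1]', mapping)` and `unmaskSketch(completionObject, mapping)` would then produce the same unmasked string, while any other input type throws.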
