Quickstart
Learn how to enhance your LLM queries with Pine in under 5 minutes
Get your API Key
To use Pine, you’ll need an API key. Generate one in your dashboard.
It will look something like this: sk-cj9IuZc4S2Vk19Bisu5WTegfa5eMXNcJGoXlHJb5nJkf. Make sure to save it in a safe place, as you will not be able to view it again on the dashboard.
Make Your First Request
To get context for your LLM query, call the API Context Endpoint.
In this example, we will add context to the query "What's new in the new MacBook Pro".
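The snippet below is a minimal sketch of the request using Python's requests library. The endpoint URL, request shape, and response field name are illustrative assumptions, so check the API reference for the exact values.

```python
import os
import requests

# Read the API key you generated in the dashboard from the environment.
api_key = os.environ["PINE_API_KEY"]

# Hypothetical endpoint URL and request body, shown for illustration only.
resp = requests.post(
    "https://api.pine.dev/v1/context",
    headers={"Authorization": f"Bearer {api_key}"},
    json={"query": "What's new in the new MacBook Pro"},
)
resp.raise_for_status()

# The markdown context to prepend to your LLM prompt
# (the "context" field name is an assumption).
context = resp.json()["context"]
print(context)
```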
The API returns context in markdown format that you can add to your LLM query.
Using the Response
Pine delivers context in a natural-language format tailored for LLMs. Although the markdown may look unstructured to a human reader, it mirrors how LLMs process information: it conversationally blends specs, quotes, and recent updates so the model can produce accurate, up-to-date responses.
Using Pine Context in an LLM
Once you have the context from Pine, simply include it before your query in your LLM prompt. Here’s a simple example using OpenAI:
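The example below is a minimal sketch using the openai Python SDK (v1+); the model name is an illustrative choice, and the `context` variable is the markdown returned by the Context Endpoint in the previous step.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# `context` is the markdown returned by the Pine Context Endpoint above.
prompt = f"{context}\n\nWhat's new in the new MacBook Pro"

completion = client.chat.completions.create(
    model="gpt-4o",  # any chat-capable model works here
    messages=[{"role": "user", "content": prompt}],
)
print(completion.choices[0].message.content)
```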
This prints an up-to-date and accurate response grounded in the Pine context.