📄️ Getting started (tutorial)
LLMAsAService.io is made up of two main components: an administrative control panel and an LLM service proxy. This knowledge base documents the administrative control panel.
📄️ Supported LLM Providers
The current status of different LLM Providers and limitations
📄️ Managing API Keys
Each vendor requires you to identify yourself when making a call to their service. This is done with a secret key, called an API Key, that is passed with every request to their APIs. If you call their APIs from your software, you need to manage that key as securely as a password or credit-card number; if it were made public, anyone could call the LLM vendor impersonating you and run up your bill.
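As a sketch of what "manage that key securely" means in practice, the snippet below reads a key from an environment variable (the variable name and endpoint URL are hypothetical, not part of LLMAsAService.io) rather than hard-coding it in source control:

```python
import json
import os
import urllib.request

def build_llm_request(prompt, url="https://api.example-llm.com/v1/complete"):
    """Build an authenticated request to a hypothetical LLM endpoint.

    The API key comes from the environment at runtime, so it never
    appears in source code or version control.
    """
    api_key = os.environ["LLM_API_KEY"]  # hypothetical variable name
    return urllib.request.Request(
        url,
        data=json.dumps({"prompt": prompt}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
```

A service proxy takes this one step further: the key lives only on the proxy, so it never reaches client code at all.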
📄️ Handling/Redacting PII
Personally Identifiable Information (PII) is anything that can be used to identify an individual. We can intercept this information in a prompt and "tokenize" it: replace it with a random tag on the way to the LLM, so the provider never sees the true information. When a response comes back, we swap the random token back for the original data. This lets you use LLM responses without the provider ever receiving the personal information. Note that if the LLM needed the original data as context to answer well, response quality will suffer.
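The tokenize/detokenize round trip described above can be sketched in a few lines. This is an illustrative toy, not the service's actual implementation: the token format and the idea of matching PII with regular expressions are assumptions for the example.

```python
import re
import uuid

def tokenize_pii(prompt, pii_patterns):
    """Replace regex-matched PII with random tokens.

    Returns the redacted prompt (safe to send to the LLM) and a
    mapping from each token back to the original value.
    """
    mapping = {}

    def repl(match):
        token = f"<PII_{uuid.uuid4().hex[:8]}>"
        mapping[token] = match.group(0)
        return token

    redacted = prompt
    for pattern in pii_patterns:
        redacted = re.sub(pattern, repl, redacted)
    return redacted, mapping

def detokenize(response, mapping):
    """Restore the original PII values in the LLM's response."""
    for token, original in mapping.items():
        response = response.replace(token, original)
    return response
```

For example, redacting an email address before the call and restoring it in the answer afterward means the provider only ever sees a meaningless tag like `<PII_3f9a1c2e>`.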
📄️ Zapier Integration
LLMAsAService.io tracks customer usage and lets you manage and monitor it. Many companies also use other tools for managing customers, such as CRM systems or credit-card processors like Stripe. Our Zapier integration lets you automate the synchronization and management of contacts across these tools.
📄️ Agent Embedding: Actions
Actions allow certain parts of a response to be converted into interactive elements, such as hyperlinks or buttons.