How to Set Up an LLM Connection in Acumatica Using OpenAI
As AI capabilities continue to expand within business applications, many Acumatica users are beginning to experiment with Large Language Model (LLM) integrations. However, documentation around initial setup can still be fairly limited. Recently, we worked through the process of creating an LLM connection in Acumatica using OpenAI. Along the way, we discovered a few details that aren’t immediately obvious, especially when it comes to API keys and selecting the correct model. This guide walks through the steps we followed and highlights a few things that may save you time.
Step 1: Create an LLM Connection in Acumatica
Everything starts inside Acumatica with the LLM Connections screen. To create a new connection:
• Create a Connection ID
• Give the connection a name (for example: OpenAI ChatGPT)
• Select your LLM Provider
Currently, Acumatica supports a small set of providers, though more may be added over time. For this example, we selected OpenAI. Once the provider is selected, you’ll need to configure the connection parameters, which requires two key pieces of information:
• An API Key
• A Model Name
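To keep the pieces straight, the fields above can be summarized as a simple sketch. The key names here are ours, chosen to mirror the labels on the LLM Connections screen; they are not an official Acumatica API:

```python
# Illustrative only: field names mirror the LLM Connections screen labels,
# not any documented Acumatica schema. Values are examples.
connection = {
    "connection_id": "OPENAI",
    "name": "OpenAI ChatGPT",
    "provider": "OpenAI",          # one of the currently supported providers
    "api_key": "<secret key from Step 2>",
    "model_name": "gpt-5.1",       # exact model identifier from Step 3
}
```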
Step 2: Generate an OpenAI API Key
The API key allows Acumatica to securely connect to OpenAI’s services. Important note: this key is not generated through the normal ChatGPT interface. Instead, you must create it through OpenAI’s developer platform.
Steps:
• Go to the OpenAI developer platform
• Navigate to API Keys
• Select Create New Secret Key
• Give the key a name (for example: Acumatica Integration)
You can also configure permissions when generating the key. For testing purposes, we allowed full permissions to ensure the connection would work. However, in a production environment you may want to restrict permissions or create a dedicated service account instead of using a personal account.
Once the key is created, copy the secret key and paste it into the API Key field in Acumatica.
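Before pasting the key into Acumatica, it can help to confirm it works at all. OpenAI's REST API expects the key as a bearer token, which is presumably what Acumatica sends under the hood. Only the endpoint URL and header format below come from OpenAI's published API; the helper name is ours:

```python
# Quick sanity check for a new secret key, done outside Acumatica.
# The endpoint and Authorization header format are from OpenAI's REST API;
# the rest is an illustrative sketch.
import urllib.request

def build_models_request(api_key: str) -> urllib.request.Request:
    """Build an authenticated GET request against OpenAI's /v1/models endpoint."""
    return urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_models_request("sk-...paste-your-secret-key...")
# urllib.request.urlopen(req) returns a JSON list of models when the key
# is valid, and raises an HTTP 401 error when it is not.
```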
Step 3: Select the Correct Model
This step turned out to be the most confusing part of the process. Acumatica requires you to enter a model name, but it does not provide a dropdown or list of available models. That means you must manually enter the correct model identifier.
After some trial and error, we successfully connected using:
gpt-5.1
One thing we discovered quickly: syntax matters. Even a small formatting mistake, such as an extra space, will cause the connection to fail, so the model identifier must be entered exactly as OpenAI publishes it. For example, we initially attempted a smaller model variant, but it did not work; whether that was due to compatibility limitations or a configuration issue remains unclear, since there is currently limited documentation on which models Acumatica supports.
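The strict matching described above can be sketched as follows. Trimming the pasted value before entering it avoids the extra-space failure mode; the "known good" set here reflects only our own testing, since Acumatica publishes no official list:

```python
# Sketch of the strict matching the model field appears to enforce: a stray
# space makes an otherwise-correct identifier fail. The known_good set is
# illustrative, based only on our testing.
def normalize_model_name(raw: str) -> str:
    """Strip surrounding whitespace; the identifier is otherwise used verbatim."""
    return raw.strip()

known_good = {"gpt-5.1"}  # the identifier that worked for us

for candidate in ["gpt-5.1", " gpt-5.1", "gpt-5.1 "]:
    exact = candidate in known_good
    trimmed = normalize_model_name(candidate) in known_good
    print(f"{candidate!r}: exact match={exact}, after trimming={trimmed}")
```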
Step 4: Test the Connection
Once both the API key and model are entered, you can test the connection. If everything is configured correctly, Acumatica will return a confirmation message indicating that the operation completed successfully. At this point, your LLM connection is ready to use.
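If you want to verify the key and model together outside Acumatica, a minimal chat-completion request does the job. The payload shape below is from OpenAI's /v1/chat/completions API; the helper name is ours, and sending the body (with the key from Step 2) is left as a manual step:

```python
# Minimal chat-completion request body, assembled the way OpenAI's
# /v1/chat/completions endpoint expects it. Sending this with a valid key
# confirms key + model together, independently of Acumatica's test button.
import json

def build_test_payload(model: str) -> str:
    """JSON body for a one-message chat completion."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "Reply with the word OK."}],
    })

print(build_test_payload("gpt-5.1"))
```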
Step 5: Use the Connection in AI Features
After creating the connection, it can be referenced within features that leverage AI. For example, when configuring tools like AI-generated Acumatica case closure notes, you simply select the LLM connection you created. From there, you can begin testing prompts and experimenting with AI-assisted workflows.
A Note on Usage Tracking
OpenAI usage is measured in tokens, and the developer dashboard shows usage metrics tied to your project. During early testing, we noticed that usage reporting can appear confusing at first, especially when filters or project scopes are applied in the dashboard. Adjusting those filters typically reveals the actual usage totals. For most initial testing scenarios, usage will be minimal.
Setting up an LLM connection in Acumatica is actually fairly straightforward once you know the required components:
• Create an LLM connection
• Generate an API key
• Enter a valid model name
• Test the connection
The biggest challenge currently is simply the lack of documentation around supported models and exact syntax requirements. Hopefully this walkthrough saves you some time if you’re exploring AI integrations within Acumatica. As Acumatica continues expanding its AI capabilities, we expect more providers, models, and documentation to become available.


