Using OLLAMA in Oracle APEX
What if you could chat with an AI model — completely offline, with no API keys, no usage costs, and no data leaving your server — right inside your Oracle APEX application? That's exactly what OLLAMA makes possible.
In this guide, we walk through the full process of setting up OLLAMA locally, exposing it as an HTTP endpoint, and calling it from Oracle APEX using APEX_WEB_SERVICE.
What is OLLAMA?
OLLAMA is an open-source tool that lets you run large language models (LLMs) like Llama 3, Mistral, Gemma, and many others directly on your local machine — no cloud required. It exposes a simple REST API on localhost:11434 that you can call just like any other web service.
- Runs 100% offline — your data never leaves your machine
- Supports dozens of open-source models
- REST API compatible with the OpenAI format
- Free and open-source (MIT license)
Architecture Overview
Oracle APEX Page
│
▼
PL/SQL Process (APEX_WEB_SERVICE.MAKE_REST_REQUEST)
│
▼
OLLAMA REST API ← http://localhost:11434/api/chat
│
▼
Local LLM Model (llama3.2, mistral, etc.)
│
▼
JSON Response → Parsed & Displayed in APEX
Step 1 — Install OLLAMA on Your Server
Visit ollama.com and download the installer for your OS. On Linux, a one-line install is available:
curl -fsSL https://ollama.com/install.sh | sh
Start the OLLAMA service:
ollama serve
Then pull a model (this guide uses llama3.2 throughout, a good general-purpose choice):
ollama pull llama3.2
Other popular options: ollama pull mistral | ollama pull gemma:2b | ollama pull phi3
Note: If OLLAMA is on the same machine as the Oracle database, use localhost. If it's on a separate server, use its IP and open port 11434 in the firewall.
Step 2 — Expose OLLAMA on a Network-Accessible Address
By default, OLLAMA only listens on 127.0.0.1. Bind it to all interfaces:
OLLAMA_HOST=0.0.0.0 ollama serve
Or set it permanently in the systemd service file:
[Service]
Environment="OLLAMA_HOST=0.0.0.0"

sudo systemctl daemon-reload
sudo systemctl restart ollama
Step 3 — Grant Oracle DB Access via ACL
Oracle uses Access Control Lists to restrict outbound HTTP connections. Run this as SYS or a DBA account:
BEGIN
DBMS_NETWORK_ACL_ADMIN.APPEND_HOST_ACE(
host => '192.168.1.100', -- IP of your OLLAMA server
lower_port => 11434,
upper_port => 11434,
ace => xs$ace_type(
privilege_list => xs$name_list('connect'),
principal_name => 'APEX_PUBLIC_USER',
principal_type => xs_acl.ptype_db
)
);
COMMIT;
END;
/
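To confirm the ACE was registered, you can query the data dictionary. This is a quick sanity check, assuming a 12c-or-later database where DBA_HOST_ACES exists (run it as a DBA):

```sql
-- Verify the new host ACE
SELECT host, lower_port, upper_port, principal, privilege
FROM   dba_host_aces
WHERE  host = '192.168.1.100';
```

If no row comes back, the APPEND_HOST_ACE call above didn't take effect in this database.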
Important: Replace 192.168.1.100 with your actual OLLAMA server IP, and APEX_PUBLIC_USER with your workspace schema if different.
Step 4 — Test OLLAMA from the Command Line
Before touching APEX, verify OLLAMA works with a direct curl call:
curl http://localhost:11434/api/chat \
-d '{
"model": "llama3.2",
"messages": [
{ "role": "user", "content": "What is Oracle APEX?" }
],
"stream": false
}'
Expected response (note that /api/chat nests the answer under message.content — the flat "response" field belongs to the /api/generate endpoint):
{
"model": "llama3.2",
"message": {
"role": "assistant",
"content": "Oracle APEX (Application Express) is a low-code development platform..."
},
"done": true
}
Step 5 — Call OLLAMA from Oracle APEX (PL/SQL Process)
In your APEX page, go to Page Designer → Processing and add a new PL/SQL Process:
DECLARE
l_url VARCHAR2(500) := 'http://192.168.1.100:11434/api/chat';
l_body CLOB;
l_response CLOB;
l_json APEX_JSON.t_values;
l_answer CLOB;
BEGIN
-- Build the request body: { "model": ..., "stream": false, "messages": [...] }
APEX_JSON.INITIALIZE_CLOB_OUTPUT;
APEX_JSON.OPEN_OBJECT;
APEX_JSON.WRITE('model', 'llama3.2');
APEX_JSON.WRITE('stream', FALSE);
APEX_JSON.OPEN_ARRAY('messages');
APEX_JSON.OPEN_OBJECT;
APEX_JSON.WRITE('role', 'user');
APEX_JSON.WRITE('content', :P1_USER_QUESTION);
APEX_JSON.CLOSE_OBJECT;
APEX_JSON.CLOSE_ARRAY;
APEX_JSON.CLOSE_OBJECT;
l_body := APEX_JSON.GET_CLOB_OUTPUT;
APEX_JSON.FREE_OUTPUT;
-- Make the REST call to OLLAMA with an explicit JSON content type and a
-- generous timeout (local models can take a while, especially on first load)
APEX_WEB_SERVICE.G_REQUEST_HEADERS(1).NAME := 'Content-Type';
APEX_WEB_SERVICE.G_REQUEST_HEADERS(1).VALUE := 'application/json';
l_response := APEX_WEB_SERVICE.MAKE_REST_REQUEST(
p_url => l_url,
p_http_method => 'POST',
p_body => l_body,
p_transfer_timeout => 300
);
-- Parse the JSON response; /api/chat returns the text at message.content
APEX_JSON.PARSE(l_json, l_response);
l_answer := APEX_JSON.GET_CLOB_VALUE(
p_path => 'message.content',
p_values => l_json
);
-- Push result to page item
:P1_AI_RESPONSE := l_answer;
EXCEPTION
WHEN OTHERS THEN
:P1_AI_RESPONSE := 'Error: ' || SQLERRM;
END;
Step 6 — Build the APEX Page UI
Your page needs three components:
- Text Area item P1_USER_QUESTION — the user types their prompt here
- Button labeled "Ask AI", action = Submit Page
- Text Area item P1_AI_RESPONSE — set to Read Only to display the AI's answer
For a non-blocking experience, use a Dynamic Action with AJAX Callback instead of full page submit:
// JavaScript on button click (Dynamic Action)
apex.server.process('CALL_OLLAMA', {
pageItems: '#P1_USER_QUESTION'
}, {
success: function(data) {
apex.item('P1_AI_RESPONSE').setValue(data.answer);
}
});
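The JavaScript above expects an AJAX Callback process named CALL_OLLAMA that returns a JSON object with an answer key; the post doesn't show that process, so here is a minimal sketch. The process name, URL, model, and the message.content parse path are carried over from the earlier steps as assumptions:

```sql
-- AJAX Callback process "CALL_OLLAMA" (Page Designer → Processing, point: Ajax Callback)
DECLARE
  l_body     CLOB;
  l_response CLOB;
  l_json     APEX_JSON.t_values;
BEGIN
  -- Same request body as the page process in Step 5
  APEX_JSON.INITIALIZE_CLOB_OUTPUT;
  APEX_JSON.OPEN_OBJECT;
  APEX_JSON.WRITE('model', 'llama3.2');
  APEX_JSON.WRITE('stream', FALSE);
  APEX_JSON.OPEN_ARRAY('messages');
  APEX_JSON.OPEN_OBJECT;
  APEX_JSON.WRITE('role', 'user');
  APEX_JSON.WRITE('content', :P1_USER_QUESTION);
  APEX_JSON.CLOSE_OBJECT;
  APEX_JSON.CLOSE_ARRAY;
  APEX_JSON.CLOSE_OBJECT;
  l_body := APEX_JSON.GET_CLOB_OUTPUT;
  APEX_JSON.FREE_OUTPUT;

  l_response := APEX_WEB_SERVICE.MAKE_REST_REQUEST(
    p_url         => 'http://192.168.1.100:11434/api/chat',
    p_http_method => 'POST',
    p_body        => l_body
  );
  APEX_JSON.PARSE(l_json, l_response);

  -- Emit { "answer": "..." } to the HTP buffer for the success callback
  APEX_JSON.INITIALIZE_OUTPUT;
  APEX_JSON.OPEN_OBJECT;
  APEX_JSON.WRITE('answer',
      APEX_JSON.GET_CLOB_VALUE(p_path => 'message.content', p_values => l_json));
  APEX_JSON.CLOSE_OBJECT;
END;
```

Remember to list P1_USER_QUESTION in the Dynamic Action's "Page Items to Submit" so the bind variable has a value on the server.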
Step 7 — OpenAI-Compatible API (Optional)
OLLAMA also supports the OpenAI /v1/chat/completions endpoint format — useful if you're already familiar with that structure:
curl http://localhost:11434/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "llama3.2",
"messages": [
{"role": "user", "content": "Explain PL/SQL cursors in simple terms"}
]
}'
Parse the response at choices[0].message.content using APEX_JSON.
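One detail when translating that path into PL/SQL: JSON arrays are 0-based, but APEX_JSON array indexes are 1-based, so choices[0] becomes choices[1]. A sketch, assuming l_response already holds the /v1/chat/completions reply:

```sql
DECLARE
  l_response CLOB; -- assumed to hold the reply from MAKE_REST_REQUEST
  l_json     APEX_JSON.t_values;
  l_answer   CLOB;
BEGIN
  APEX_JSON.PARSE(l_json, l_response);
  -- JSON path choices[0].message.content; APEX_JSON counts arrays from 1
  l_answer := APEX_JSON.GET_CLOB_VALUE(
    p_path   => 'choices[1].message.content',
    p_values => l_json
  );
  :P1_AI_RESPONSE := l_answer;
END;
```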
Practical Use Cases
- Auto-generate shipment remarks or GL narrations from structured data
- Summarize long customer email threads inside an APEX page
- Let users query data in plain English and convert it to SQL
- Internal FAQ bot powered by your own documents (RAG)
- Draft journal entry descriptions from accounting transactions
Troubleshooting
- ORA-29273 (HTTP request failed): ACL not configured — revisit Step 3
- Connection refused: OLLAMA not running or bound to localhost only — revisit Step 2
- Slow first response: Normal — the model loads into RAM on first call. Subsequent calls are much faster.
- Empty response: Ensure "stream": false is in your request body
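For the first two errors, a quick way to isolate the problem is to call OLLAMA's model-list endpoint (/api/tags) straight from SQL Developer or SQL*Plus — if this prints JSON, the ACL and network path are fine and the issue is in your request body. A sketch, assuming the same server IP as above and APEX installed in the database:

```sql
-- Run with server output enabled (SET SERVEROUTPUT ON)
DECLARE
  l_resp CLOB;
BEGIN
  l_resp := APEX_WEB_SERVICE.MAKE_REST_REQUEST(
    p_url         => 'http://192.168.1.100:11434/api/tags',
    p_http_method => 'GET'
  );
  DBMS_OUTPUT.PUT_LINE(SUBSTR(l_resp, 1, 500));
END;
/
```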
Tags: Oracle APEX, OLLAMA, Local LLM, PL/SQL, AI Integration, APEX_WEB_SERVICE, Llama3
Feel free to comment on this post. If you have any doubts or hit a blocker while setting this up, drop a message in the comments and I'll help you work through it.