Building a Fully Local AI-Powered Assistant with Oracle APEX and Vector Search #APEXCONN2025 #orclAPEX

👩: In case you're not tired  :-|
👨: I'm always tired, but that's okay :-)

I recently attended the APEX Connect 2025 conference. One of the presentations, delivered by the Syntegris company, strongly caught my attention: it focused on the development of a personal assistant powered by Artificial Intelligence (AI) that runs completely offline, with no reliance on cloud services.

This assistant was implemented using technologies such as Oracle Database 23ai, Vector Search, and the RAG (Retrieval-Augmented Generation) approach. It was designed to manage personal information, emails, calendar entries, and project data.

👀Watch this short Video about #ApexConn2025   https://youtube.com/shorts/UQmwwTEsJuk?si=WJUXdJ4jvh1Nmsap





The presenter highlighted the challenges of his daily life, such as:
  • High volumes of daily emails
  • Multiple overlapping meetings
  • Fragmented and scattered information

He proposed a personal AI assistant that could:
  • Filter emails and extract relevant content
  • Quickly retrieve previously stored information
  • Operate completely offline to ensure data privacy

He used a wide range of tools and showcased a fully functional project 👌, but in this article, I will focus only on the core technical components of his approach:
✔Implementing AI entirely locally
✔Bringing data into the Oracle Database
✔Converting text to vectors for semantic retrieval
✔Building an Oracle APEX application on top of it


😎Let's get started:

First, you need the following software:
  • Oracle Database 23ai

  • Oracle APEX 24.2

  • ORDS (Oracle REST Data Services)

  • Local LLMs (Language Models): Ollama

👉 First Step
✔Implementing AI entirely locally
To run AI models without an internet connection, install Ollama on your local machine. You can then run a lightweight model such as mistral or llama3.2, or whatever model you prefer.
In your terminal, run the following command to pull the embedding model:

ollama pull chroma/all-minilm-l6-v2-f32
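
The model above only produces embeddings. If you also want a local chat model for generating answers (my addition, not used in the steps shown here), you can pull one the same way, for example:

ollama pull llama3.2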

 

Let's test it with the following command:

curl -X POST http://localhost:11434/api/embeddings -H "Content-Type: application/json" -d "{\"model\":\"chroma/all-minilm-l6-v2-f32\",\"prompt\":\" this is test for LIDA \"}"

You can see that the embedding vector is generated in the response.
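
The response is a JSON object whose "embedding" field holds the vector values (384 numbers for this model); trimmed, it looks roughly like this:

{"embedding": [0.0123, -0.0456, 0.0789, ...]}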

👉 Second Step
✔Bringing data into the Oracle Database

Now we should bring data, for example emails, files, and notes, into an Oracle table so that it can later be processed using Vector Search.

CREATE TABLE my_notes (
  id   NUMBER PRIMARY KEY,
  text CLOB
);


Write a script in Python, use an APEX REST Data Source, SQL*Loader, or whatever method you prefer to import data from your sources into the MY_NOTES table.

I inserted some test data, as you can see below
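
For illustration, the rows could look something like this (hypothetical sample data, not the presenter's actual content):

INSERT INTO my_notes (id, text) VALUES (1, 'Meeting with the project team on Monday at 10:00 about the APEX migration.');
INSERT INTO my_notes (id, text) VALUES (2, 'Email from HR: please submit your vacation plan for Q3.');
INSERT INTO my_notes (id, text) VALUES (3, 'Note: renew the ORDS certificate before the end of the month.');
COMMIT;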



👉 Third Step
✔Converting text to vectors for semantic retrieval
To implement this, first create a table with a column of the VECTOR datatype. The dimension 384 matches the embedding size of the all-minilm-l6-v2 model used above:

DROP TABLE my_vector_notes;
 
CREATE TABLE my_vector_notes (
  id        NUMBER PRIMARY KEY,
  content   CLOB,
  embedding VECTOR(384)   
);

python -m pip install --upgrade pip
python -m pip install oracledb ollama pandas tqdm
 

With this Python code, you connect to the Oracle database and use Ollama to convert the texts to vectors:

import oracledb
import ollama
import json
from tqdm import tqdm

# Connect to the local Oracle Database 23ai instance.
# (SYS/SYSDBA works for a quick local test; a dedicated application
#  schema is the better choice in practice.)
connection = oracledb.connect(
    user="sys",
    password="password",
    dsn="localhost:1521/FREEPDB1",
    mode=oracledb.SYSDBA
)
cursor = connection.cursor()
cursor.execute("""
    DECLARE
        v_exists NUMBER;
    BEGIN
        SELECT COUNT(*) INTO v_exists FROM user_tables 
        WHERE table_name = 'MY_VECTOR_NOTES';
        
        IF v_exists = 0 THEN
            EXECUTE IMMEDIATE '
                CREATE TABLE my_vector_notes (
                    id NUMBER PRIMARY KEY,
                    content CLOB,
                    embedding CLOB 
                )';
        END IF;
    END;
""")

cursor.execute("SELECT id, text FROM my_notes")
rows = cursor.fetchall()

success_count = 0
for id, text in tqdm(rows):
    try:
        # Keep the prompt within a reasonable size for the embedding model
        truncated_text = text[:4000] if text else ""

        # Ask the local Ollama embedding model for a vector
        response = ollama.embeddings(
            model='chroma/all-minilm-l6-v2-f32',
            prompt=truncated_text
        )

        # Store the vector as a JSON array string
        vector = json.dumps(response['embedding'])
        
        cursor.execute("""
            INSERT INTO my_vector_notes (id, content, embedding)
            VALUES (:id, :content, :vector)
            """,
            {'id': id, 'content': text, 'vector': vector}
        )
        success_count += 1
        
    except Exception as e:
        print(f"\n error for {id}: {str(e)}")
        connection.rollback()

connection.commit()
print(f"\n complete ! {success_count}/{len(rows)} done.")

cursor.close()
connection.close()

SELECT * FROM my_vector_notes;

The embedding vectors are created and stored. Done 😁


Now we can run a vector similarity search query in Oracle Database 23ai, which supports vector data types and AI-native features. This query retrieves the rows from my_vector_notes that are most similar to the input vector, sorted by how semantically similar they are.
  • Rows with smaller distance values come first → more relevant results.

  • This is commonly used in AI search, semantic retrieval, RAG pipelines, and chatbot memory systems.


SELECT id, content
FROM my_vector_notes
ORDER BY VECTOR_DISTANCE(embedding, :input_vector);
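
As a concrete sketch (my variant, with the metric spelled out and the result limited to the top 5; COSINE is also the default when no metric is given):

SELECT id,
       content,
       VECTOR_DISTANCE(embedding, TO_VECTOR(:input_vector, 384, FLOAT32), COSINE) AS dist
FROM my_vector_notes
ORDER BY dist
FETCH FIRST 5 ROWS ONLY;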


👉 Fourth Step
✔Building an Oracle APEX application on top of it

In this step I am going to build a user-friendly Oracle APEX application. The goal is to enable users to interact with the AI system through a web interface, submit queries in natural language, and receive meaningful, intelligent responses based on both LLM inference and semantic vector search.

1. Create a page in APEX with a Text Area item for user input (e.g. P1_USER_INPUT, as referenced in the code below) and a hidden item to hold the generated vector (e.g. P1_EMBEDDING).

Add a Submit button to process the input.


2. Integration with our local LLM (Ollama)

Define a REST Data Source in APEX pointing to your local model endpoint. When a user submits a question, APEX sends it to the model in JSON format, and the model returns a natural language response. Since we already tested the endpoint with curl in the previous step, we can create a REST Data Source for http://localhost:11434/api/embeddings.

Shared Components → REST Data Sources → Create


Before APEX can call the local Ollama endpoint from inside the database, the parsing schema needs a network ACL for 127.0.0.1:11434. Check the existing ACLs first:

SELECT host, lower_port, upper_port, privilege, principal
FROM dba_network_acls
JOIN dba_network_acl_privileges
ON (dba_network_acls.acl = dba_network_acl_privileges.acl)
WHERE (host LIKE '%localhost%' OR host = '127.0.0.1') AND lower_port = 11434;

**TIP: If the query returns no rows, run the blocks below (replace the principal LIDA with your own application schema if it differs):

BEGIN
  -- Drop a stale ACL if one exists; ignore the error if it does not
  DBMS_NETWORK_ACL_ADMIN.DROP_ACL('ollama_acl.xml');
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN NULL;
END;
/


BEGIN
  DBMS_NETWORK_ACL_ADMIN.CREATE_ACL(
    acl         => 'ollama_acl.xml',
    description => 'ACL for accessing Ollama API on 127.0.0.1:11434 for LIDA',
    principal   => 'LIDA',
    is_grant    => TRUE,
    privilege   => 'connect'
  );
  COMMIT;
END;
/

BEGIN
  DBMS_NETWORK_ACL_ADMIN.ASSIGN_ACL(
    acl  => 'ollama_acl.xml',
    host => '127.0.0.1',
    lower_port => 11434,
    upper_port => 11434
  );
  COMMIT;
END;
/

BEGIN
  DBMS_NETWORK_ACL_ADMIN.ADD_PRIVILEGE(
    acl       => 'ollama_acl.xml',
    principal => 'ORDS_PUBLIC_USER',
    is_grant  => TRUE,
    privilege => 'connect'
  );
  COMMIT;
END;
/

Run the check again to confirm the ACL is in place:

SELECT host, lower_port, upper_port, privilege, principal
FROM dba_network_acls
JOIN dba_network_acl_privileges
ON (dba_network_acls.acl = dba_network_acl_privileges.acl)
WHERE (host LIKE '%localhost%' OR host = '127.0.0.1') AND lower_port = 11434;

Now create the REST Data Source with:

Name: OLLAMA_EMBEDDINGS
URL Endpoint: http://localhost:11434/api/embeddings





Method: POST
Body:
{
  "model": "chroma/all-minilm-l6-v2-f32",
  "prompt": "&P1_USER_INPUT."
}
For the request header, set:
Content-Type: application/json
 
3. Now we need code on our APEX page that converts the user's text input into a vector embedding using Ollama's API, so that the original text and its vector representation can be used for the similarity search (or stored in a table). Create a page process and write the code below:

DECLARE
  l_body     JSON_OBJECT_T := JSON_OBJECT_T();
  l_response CLOB;
  l_json     JSON_OBJECT_T;
BEGIN
  -- Build the JSON request body for the Ollama embeddings endpoint
  l_body.put('model', 'chroma/all-minilm-l6-v2-f32');
  l_body.put('prompt', :P1_USER_INPUT);

  apex_web_service.g_request_headers(1).name  := 'Content-Type';
  apex_web_service.g_request_headers(1).value := 'application/json';

  l_response := apex_web_service.make_rest_request(
    p_url         => 'http://localhost:11434/api/embeddings',
    p_http_method => 'POST',
    p_body        => l_body.to_clob()
  );

  -- "embedding" is a JSON array; serialize it back to text for the page item
  l_json := JSON_OBJECT_T.parse(l_response);
  :P1_EMBEDDING := l_json.get('embedding').to_string();
END;
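
If you also want to persist the question together with its embedding in my_vector_notes, a minimal sketch could be added to the same process (my_vector_notes_seq is a hypothetical sequence, not part of the original setup):

BEGIN
  -- Store the question and its vector; the JSON array string is converted via TO_VECTOR
  INSERT INTO my_vector_notes (id, content, embedding)
  VALUES (my_vector_notes_seq.NEXTVAL,
          :P1_USER_INPUT,
          TO_VECTOR(:P1_EMBEDDING, 384, FLOAT32));
END;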

4. Displaying the results: use an Interactive Report with the query below.

SELECT
    id,
    content,
    ROUND(1 - VECTOR_DISTANCE(
        embedding,
        TO_VECTOR(:P1_EMBEDDING, 384, FLOAT32)
    ), 3) AS similarity_score
FROM my_vector_notes
ORDER BY VECTOR_DISTANCE(
    embedding,
    TO_VECTOR(:P1_EMBEDDING, 384, FLOAT32)
)
FETCH FIRST 5 ROWS ONLY
I hope this article was helpful. Thanks for reading!


#APEXCONN2025
 #orclAPEX 
