Building a Fully Local AI-Powered Assistant with Oracle APEX and Vector Search #APEXCONN2025 #orclAPEX
👩: If you're not tired :-|
👨: I'm always tired, but that's okay :-)
I recently attended the APEX Connect 2025 conference. One presentation, from the Syntegris company, particularly caught my attention: it covered the development of a personal assistant powered by Artificial Intelligence (AI) that runs completely offline, with no reliance on cloud services.
This assistant was implemented using technologies such as Oracle Database 23ai, Vector Search, and the RAG (Retrieval-Augmented Generation) approach. It was designed to manage personal information, emails, calendar entries, and project data.
👀Watch this short Video about #ApexConn2025 https://youtube.com/shorts/UQmwwTEsJuk?si=WJUXdJ4jvh1Nmsap
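The RAG idea itself is simple: retrieve the stored notes most relevant to a question and prepend them to the prompt sent to the local model. A minimal sketch of the prompt-assembly step (names like build_prompt and the sample notes are my own illustration, not from the talk):

```python
def build_prompt(question: str, context_notes: list[str]) -> str:
    """Augment the user's question with retrieved context (the 'G' step input in RAG)."""
    context = "\n".join(f"- {note}" for note in context_notes)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

# Hypothetical retrieved notes; in the real pipeline they come from Vector Search
prompt = build_prompt(
    "When is the Syntegris meeting?",
    ["Meeting with Syntegris on Monday at 10:00", "APEX Connect slides uploaded"],
)
print(prompt)
```

The local LLM then answers from this augmented prompt instead of from its training data alone, which is what keeps personal notes usable without any cloud service.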
- Oracle Database 23ai
- Oracle APEX 24.2
- ORDS (Oracle REST Data Services)
- Local LLMs (Language Models): Ollama
To verify that Ollama is serving embeddings locally, call its REST endpoint:
curl -X POST http://localhost:11434/api/embeddings -H "Content-Type: application/json" -d "{\"model\":\"chroma/all-minilm-l6-v2-f32\",\"prompt\":\" this is test for LIDA \"}"
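The response is a JSON object with an embedding array of floats. A small helper to parse and sanity-check it (a sketch; the sample payload below is fabricated, a real response contains actual model output):

```python
import json

def extract_embedding(body: str, expected_dim: int = 384) -> list:
    """Parse an Ollama /api/embeddings response body and check the vector size."""
    data = json.loads(body)
    vec = data["embedding"]
    if len(vec) != expected_dim:
        raise ValueError(f"expected {expected_dim} dimensions, got {len(vec)}")
    return vec

# Fabricated example: all-MiniLM-L6-v2 produces 384-dimensional vectors
sample = json.dumps({"embedding": [0.01] * 384})
vec = extract_embedding(sample)
print(len(vec))  # 384
```

Checking the dimension up front avoids silent mismatches with the VECTOR(384) column used later.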
Now we should bring data (for example emails, files, and notes) into an Oracle table to be processed later using Vector Search.
The source notes live in a table (here my_notes, with at least an id and a text column). For the embeddings, drop and recreate the vector table. The embedding column is VECTOR(384) because all-MiniLM-L6-v2 produces 384-dimensional vectors:
DROP TABLE my_vector_notes;
CREATE TABLE my_vector_notes (
    id NUMBER PRIMARY KEY,
    content CLOB,
    embedding VECTOR(384)
);
import json

import oracledb
import ollama
from tqdm import tqdm

connection = oracledb.connect(
    user="sys",
    password="password",
    dsn="localhost:1521/FREEPDB1",
    mode=oracledb.SYSDBA,
)
cursor = connection.cursor()

# Create the target table if it does not exist yet
cursor.execute("""
DECLARE
    v_exists NUMBER;
BEGIN
    SELECT COUNT(*) INTO v_exists
      FROM user_tables
     WHERE table_name = 'MY_VECTOR_NOTES';
    IF v_exists = 0 THEN
        EXECUTE IMMEDIATE '
            CREATE TABLE my_vector_notes (
                id        NUMBER PRIMARY KEY,
                content   CLOB,
                embedding VECTOR(384)
            )';
    END IF;
END;
""")

cursor.execute("SELECT id, text FROM my_notes")
rows = cursor.fetchall()

success_count = 0
for id, text in tqdm(rows):
    try:
        # The embedding model has a limited context, so truncate long notes
        truncated_text = text[:4000] if text else ""
        response = ollama.embeddings(
            model="chroma/all-minilm-l6-v2-f32",
            prompt=truncated_text,
        )
        # Store the vector as its JSON array text form, e.g. [0.01, 0.02, ...]
        vector = json.dumps(response["embedding"])
        cursor.execute(
            """
            INSERT INTO my_vector_notes (id, content, embedding)
            VALUES (:id, :content, :vector)
            """,
            {"id": id, "content": text, "vector": vector},
        )
        success_count += 1
    except Exception as e:
        print(f"\nerror for {id}: {e}")
        connection.rollback()

connection.commit()
print(f"\ncomplete! {success_count}/{len(rows)} done.")
cursor.close()
connection.close()
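Note that the script simply truncates each note at 4000 characters, so anything beyond that is never embedded. A chunking helper (my own sketch, not from the presentation) could embed long notes piece by piece instead, with a small overlap so sentences are not cut in half at chunk boundaries:

```python
def chunk_text(text: str, size: int = 4000, overlap: int = 200) -> list[str]:
    """Split text into overlapping chunks so no content is lost to truncation."""
    if not text:
        return []
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        if start + size >= len(text):
            break
        start += size - overlap
    return chunks

# A 9000-character note becomes three chunks instead of losing 5000 characters
parts = chunk_text("x" * 9000)
print(len(parts), [len(p) for p in parts])
```

Each chunk would then get its own row and embedding in my_vector_notes, so retrieval can match content anywhere in a long note.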
select * from my_vector_notes;
Vector embeddings created and done 😁
Now we can query my_vector_notes for the rows whose embeddings are most similar to an input vector, sorted by how semantically similar they are. Rows with smaller distance values come first, so the most relevant results appear at the top.
This is commonly used in AI search, semantic retrieval, RAG pipelines, and chatbot memory systems.
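The distance metric behind that ordering can be sketched in plain Python. With cosine distance, 0 means two vectors point in the same direction and values near 1 mean they are unrelated; the toy 3-dimensional vectors below stand in for the real 384-dimensional embeddings:

```python
import math

def cosine_distance(a: list[float], b: list[float]) -> float:
    """1 - cosine similarity: 0 = same direction, ~1 = unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / norm

# Hypothetical stored rows: (content, embedding)
rows = [
    ("note about APEX",    [0.9, 0.1, 0.0]),
    ("grocery list",       [0.0, 1.0, 0.0]),
    ("APEX Connect recap", [1.0, 0.0, 0.0]),
]
query = [1.0, 0.0, 0.0]

# Smaller distance first: the same ordering a vector search returns in SQL
ranked = sorted(rows, key=lambda r: cosine_distance(query, r[1]))
print([content for content, _ in ranked])
```

In the database, the same ranking is what an ORDER BY on the vector distance between the embedding column and the query vector produces.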