In the rapidly evolving world of AI, agentic systems have emerged as a powerful architectural pattern that brings together intelligent decision-making and real-world utility. This post dives into what agentic systems are, introduces the concept of agents, and walks you through setting up a complete working prototype: an Agentic Ticket System powered by FastAPI, Streamlit, and locally run LLMs via Ollama.
What Are Agentic Systems?
An agentic system refers to an architecture where one or more agents—autonomous AI components—observe, reason, and act to achieve goals. These systems are built to operate independently or semi-independently, adapting to their environment and making context-aware decisions.
Agent Definition according to IBM
An artificial intelligence (AI) agent refers to a system or program that is capable of autonomously performing tasks on behalf of a user or another system by designing its workflow and utilizing available tools.
Source: https://www.ibm.com/think/topics/ai-agents
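To make the observe-reason-act loop concrete, here is a toy, framework-free sketch. Everything in it is a placeholder: in a real agentic system the `reason` step would be LLM-driven rather than a hard-coded rule.

```python
def observe(state: dict) -> int:
    # Observe: read the relevant part of the environment.
    return state["tickets_open"]

def reason(open_tickets: int) -> str:
    # Reason: decide the next action. A real agent would ask an LLM here.
    return "triage" if open_tickets > 0 else "idle"

def act(state: dict, action: str) -> None:
    # Act: change the environment.
    if action == "triage":
        state["tickets_open"] -= 1

state = {"tickets_open": 2}
while (action := reason(observe(state))) != "idle":
    act(state, action)
    print(f"action={action}, remaining={state['tickets_open']}")
```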
Project Overview: Agentic Ticket System
This post's running example is a practical agentic system: an Agentic Ticket System. The prototype lets you interact with LLM agents to create support tickets.
👉 GitHub Repository: Agentic Ticket System
The project uses:
- FastAPI for the backend agent logic and API endpoints.
- Streamlit for a reactive frontend UI.
- Ollama to run powerful LLMs like `granite3.1-dense` directly on your machine.
- LangChain, a framework for building applications powered by language models through modular components like prompt templates, memory, and tools.
- LangGraph, a library built on top of LangChain that enables building stateful, multi-agent applications using graph-based execution flows.
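To give a feel for how these pieces fit together, here is a minimal, hypothetical sketch (not the repository's actual code) that wires ChatOllama into a one-node LangGraph graph. It assumes the `langchain-ollama` and `langgraph` packages are installed and that Ollama is serving the model locally (see Step 1 below):

```python
from typing import TypedDict

from langchain_ollama import ChatOllama
from langgraph.graph import StateGraph, START, END

class TicketState(TypedDict):
    request: str  # raw user message
    summary: str  # agent-generated ticket title

llm = ChatOllama(model="granite3.1-dense:latest")

def summarize(state: TicketState) -> dict:
    # LLM node: turn the user's request into a short ticket title.
    reply = llm.invoke(
        f"Summarize this support request as a ticket title: {state['request']}"
    )
    return {"summary": reply.content}

graph = StateGraph(TicketState)
graph.add_node("summarize", summarize)
graph.add_edge(START, "summarize")
graph.add_edge("summarize", END)
agent = graph.compile()

result = agent.invoke({"request": "My login keeps failing after the update."})
print(result["summary"])
```

In a multi-agent setup you would add more nodes and conditional edges; the compiled graph carries the shared state between them.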
System Architecture
Here’s a high-level overview of the stack:
```
Interface (Streamlit) 🧑‍💻
        ↓
FastAPI Backend 🔁
        ↓
Local LLMs via Ollama 🧠
        ↓
(Optional) SQLite via TablePlus 🗃️
```
How to Set It Up
Step 1: Install Ollama and pull models
To run models locally using Ollama:
- Download and install Ollama for macOS.
- Pull a model, e.g.:

```bash
ollama pull granite3.1-dense:latest
```

- Check if the model is installed:

```bash
ollama list
```

- Optionally test it:

```bash
ollama run granite3.1-dense:latest "Hello"
```
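If you prefer to verify the model from Python, Ollama also exposes a local REST API (port 11434 by default). This small check assumes the `requests` package is installed:

```python
import requests

# Non-streaming one-shot generation against Ollama's local HTTP API.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "granite3.1-dense:latest", "prompt": "Hello", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```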
Step 2: Set Up the Backend
- Navigate to the `backend` directory.
- Create a virtual environment:

```bash
uv venv .venv --python=/opt/homebrew/bin/python3.12
```

- Activate it:

```bash
source .venv/bin/activate
```

- Install dependencies:

```bash
uv pip install -r requirements.txt
```

- Go back to the root directory:

```bash
cd ../
```

- Copy `.env.sample` to `.env` and configure any models or credentials.
- Run the FastAPI app:

```bash
uvicorn backend.main:app --reload
```
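As a rough mental model for what the backend exposes, a minimal ticket endpoint could look like the following sketch. The route and field names are illustrative assumptions, not the project's actual API:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class TicketRequest(BaseModel):
    description: str  # free-text issue description from the user

@app.post("/tickets")
def create_ticket(req: TicketRequest) -> dict:
    # In the real backend, an LLM agent would classify and summarize the
    # request before the ticket is persisted (e.g., to ticket.db).
    return {"status": "created", "description": req.description}
```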
Step 3: Set Up the Frontend
- Navigate to the `frontend` directory.
- Create and activate a virtual environment:

```bash
uv venv
source .venv/bin/activate
```

- Install dependencies:

```bash
uv pip install -r requirements.txt
```

- Start the frontend app:

```bash
streamlit run streamlit_app.py
```

- To run the ticket dashboard app:

```bash
streamlit run streamlit_ticket_dashboard.py
```
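For intuition, a stripped-down version of such a frontend could look like this. It assumes the backend from Step 2 is running on localhost:8000 and exposes the hypothetical POST /tickets endpoint sketched above:

```python
import requests
import streamlit as st

st.title("Agentic Ticket System")
description = st.text_area("Describe your issue")

if st.button("Create ticket") and description:
    # Forward the request to the FastAPI backend and show its JSON reply.
    resp = requests.post(
        "http://localhost:8000/tickets",
        json={"description": description},
        timeout=30,
    )
    st.json(resp.json())
```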
Step 4 (Optional): Connect to SQLite via TablePlus
- Download and install TablePlus.
- Create a new SQLite connection and name it `ticketapi`.
- Select the `ticket.db` file in your `backend` folder.
- Test the connection and connect.
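If you just want a quick look at the database without a GUI, Python's standard library is enough. The path below assumes you run it from the project root; the query only lists table names, so it works regardless of the actual schema:

```python
import sqlite3

# Open the SQLite file created by the backend and list its tables.
conn = sqlite3.connect("backend/ticket.db")
for (name,) in conn.execute("SELECT name FROM sqlite_master WHERE type='table'"):
    print(name)
conn.close()
```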
Conclusion
Agentic systems are not just theoretical—they are rapidly becoming the core of next-gen applications. Whether it’s customer support, data analysis, or workflow automation, autonomous agents powered by LLMs will be integral to future software.
To try it yourself, find the repository here: