1 File = 1 Feature — Atomic Microkernel Architecture

Atomic Microkernel Architecture.
Python-based. AI-Native. One file per feature.

MicroCoreOS is a Python framework built on the Atomic Microkernel Architecture: an implementation where every feature lives in a single file, allowing AI agents to master any plugin in seconds, not minutes.

$ git clone https://github.com/theanibalos/MicroCoreOS.git
$ uv run main.py
domains/users/plugins/list_users_plugin.py
python
from core.base_plugin import BasePlugin
from domains.users.models.user_model import UserModel, UserListResponse

class ListUsersPlugin(BasePlugin):
    def __init__(self, http, db, logger):
        self.http = http
        self.db = db
        self.logger = logger

    def on_boot(self):
        self.http.add_endpoint(
            path="/users", 
            method="GET", 
            handler=self.execute, 
            tags=["Users"],
            response_model=UserListResponse
        )
        self.logger.info("ListUsersPlugin: Endpoint /users registered with Schema.")

    def execute(self, data: dict, context=None):
        rows = self.db.query("SELECT id, name, email FROM users")  # never expose password_hash
        users = [UserModel.from_row(row).to_dict() for row in rows]
        return {"success": True, "users": users}

The Architecture Overhead

In traditional layered architectures, adding a simple CRUD endpoint means explaining entities, repositories, factories, controllers, and DTOs to your AI. That's 6-8 files and 200+ lines of code for one endpoint.

Context window saturation
Leaky abstractions across layers
Repetitive boilerplate for AI to generate
Diff-noise in every pull request
AI AGENT LOG

"I've identified 7 files to modify...
Creating repository...
Injecting into factory..."

MicroCoreOS Solution

Atomic Ownership

All logic, routing, and operations in one atomic plugin file.

AI Context Manifest

Auto-generated `AI_CONTEXT.md` describes everything available.

Pure Kernel

The core knows zero business rules. It only handles DI and lifecycle.

Extensible Tools

Add your own infrastructure (Redis, Kafka, S3) in minutes. Pure plug-and-play.
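As a sketch of what a custom tool might look like: the real `ToolPlugin` base lives in `core/tool_plugin.py`, so it is stubbed here to keep the example self-contained, and `KeyValueTool` with its methods is hypothetical, not part of the shipped API.

```python
import os

# Stub of the real base class from core/tool_plugin.py, for illustration only.
class ToolPlugin:
    def on_boot(self) -> None:
        pass

class KeyValueTool(ToolPlugin):
    """A stateless infrastructure driver: configuration is read from the
    environment once at boot; the tool holds no business state."""

    def on_boot(self) -> None:
        # Config from env vars / configmaps at boot -- no mutable config later.
        self.url = os.environ.get("KV_URL", "memory://local")
        self._client = {}  # stand-in for a real Redis/S3 client

    def set(self, key: str, value: str) -> None:
        self._client[key] = value

    def get(self, key: str, default=None):
        return self._client.get(key, default)
```

Drop a folder like this into `tools/` and, per the auto-discovery model, plugins can request it by constructor parameter name.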

Atomic System Architecture

Three layers. Each one maps directly to a folder in your project — nothing hidden, nothing implicit.

Stateless Tools
/http_server
/sqlite_db
/event_bus
/logger
MicroKernel
container.py kernel.py registry.py
Domain Plugins
users
notifications
observability
ui
Auto-Generated Context AI_CONTEXT.md

Event-Driven Communication Between Plugins

No direct imports. No coupling. Plugins talk through the bus — never to each other. The EventBus is itself a Tool — just another stateless driver living in tools/event_bus/.

Fire & Forget — publish & move on
create_order_plugin
📢 order.created
send_email
notify_supplier
update_inventory

Each in its own file. None imports another.
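The fire-and-forget flow above can be sketched with a minimal in-memory bus; `subscribe` and `publish` are illustrative names and may not match the shipped `event_bus` tool exactly.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-memory pub/sub: publishers never import subscribers."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, payload):
        # Fire & forget: deliver to every subscriber, then return immediately.
        for handler in self._handlers[topic]:
            handler(payload)

bus = EventBus()
received = []
# Each subscriber would live in its own plugin file; none imports another.
bus.subscribe("order.created", lambda e: received.append(("email", e["id"])))
bus.subscribe("order.created", lambda e: received.append(("inventory", e["id"])))
bus.publish("order.created", {"id": 7})
# received == [("email", 7), ("inventory", 7)]
```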

Request / Response — await an answer
create_order_plugin
📨 inventory.check
→ request
check_inventory_plugin
← response (ok / abort)
✅ {"stock": 42}
create_order_plugin → proceeds

Still no direct import. Still self-contained.
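The request/response pattern can be sketched the same way; `respond` and `request` are assumed names for illustration, not the actual API.

```python
class EventBus:
    """Request/response on top of the bus: one responder per topic,
    and the caller waits for the answer before proceeding."""

    def __init__(self):
        self._responders = {}

    def respond(self, topic, handler):
        self._responders[topic] = handler

    def request(self, topic, payload):
        return self._responders[topic](payload)

bus = EventBus()
# check_inventory_plugin registers a responder -- no import of the caller.
bus.respond("inventory.check",
            lambda req: {"stock": 42 if req["sku"] == "A1" else 0})
# create_order_plugin awaits the answer, then proceeds or aborts.
answer = bus.request("inventory.check", {"sku": "A1"})
# answer == {"stock": 42}
```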

Predictable File Structure

Everything has its place. The kernel auto-discovers plugins in the /domains folder and wires them to the /tools they need — zero manual wiring, zero config files.

Your AI assistant knows exactly where to put new code. No more explaining layers, factories, or DI containers. The structure is the documentation.
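The auto-discovery described above can be sketched in a few lines; `discover_plugins` is a hypothetical helper, and the real `registry.py` may differ.

```python
import importlib.util
from pathlib import Path

def discover_plugins(domains_dir):
    """Import every *_plugin.py beneath the domains folder and return
    the loaded modules -- drop a file in, it gets picked up."""
    modules = []
    for path in sorted(Path(domains_dir).rglob("*_plugin.py")):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        modules.append(module)
    return modules
```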

core/— Microkernel & orchestrator

~240 lines. Read it once, understand everything.

tools/— Infrastructure drivers + your custom tools

Stateless. Config reads from env/configmaps at boot — no mutable state, ever.

domains/— Business logic (1 file per feature)

1 file = 1 feature. Self-contained — delete it and nothing breaks.

Project Explorer
core/ # ~240 lines total
kernel.py
container.py
registry.py
base_plugin.py
tool_plugin.py
tools/
http_server/
sqlite/
event_bus/
your_custom_tool/
domains/
users/
models/
user_model.py
plugins/ # 1 file = 1 feature
create_user_plugin.py
delete_user_plugin.py
AI_CONTEXT.md # Manifest for LLMs

Declare Once. Use Anywhere.

No service locators. No factory calls. No container.get(). Just declare what you need in your constructor — the kernel delivers it.

1. Declare your dependencies

List what you need in __init__. The kernel reads your signature and knows exactly what to inject.

2. Kernel wires everything

Boot time. Zero config files. The orchestrator resolves the dependency graph and delivers each tool instance automatically.

3. Call self.tool — that's it

Use any tool anywhere in your plugin. Your constructor is the contract — the AI reads it and instantly knows what the plugin needs.

The constructor is not boilerplate — it's the dependency specification. One glance tells you everything a plugin needs to function.
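The wiring these steps describe can be sketched with `inspect.signature`; this `Container` is a hypothetical illustration of the idea, not the real `container.py`.

```python
import inspect

class Container:
    """Resolve constructor parameter names against registered tools --
    no service locator, no config file."""

    def __init__(self, tools):
        self._tools = tools  # e.g. {"db": ..., "logger": ...}

    def build(self, plugin_cls):
        # Read the constructor signature and inject only what it declares.
        params = inspect.signature(plugin_cls.__init__).parameters
        kwargs = {name: self._tools[name]
                  for name in params if name != "self"}
        return plugin_cls(**kwargs)

class GreetPlugin:
    def __init__(self, logger):
        self.logger = logger

container = Container({"db": object(), "logger": print})
plugin = container.build(GreetPlugin)  # only `logger` is injected
```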

Plugin
class CreateUserPlugin(BasePlugin):
    def __init__(self, http, db, logger):
        # ↑ just declare what you need
        self.http = http
        self.db = db
        self.logger = logger

    def execute(self, data):
        # ↓ use any tool directly
        self.db.execute("INSERT ...")
        self.logger.info("User created")
        return {"success": True}
test_plugin.py
python
import unittest
from unittest.mock import MagicMock
from domains.users.plugins.create_user_plugin import CreateUserPlugin

class TestCreateUser(unittest.TestCase):
    def setUp(self):
        # 1. Trivial mocking of dependencies
        self.mock_db = MagicMock()
        self.mock_bus = MagicMock()
        self.plugin = CreateUserPlugin(db=self.mock_db, event_bus=self.mock_bus)  # ...plus the plugin's other tools

    def test_execute_success(self):
        self.mock_db.execute.return_value = 42
        result = self.plugin.execute({"name": "Test", "email": "test@test.com"})
        
        # ✅ Verify logic AND Side-Effects (Events, DB)
        assert result["success"] is True
        self.mock_db.execute.assert_called_once()
        self.mock_bus.publish.assert_called_with("users.created", result["user"])

Testable by Design

Stop fighting with DI containers in your tests. Because every plugin is a simple class with an explicit constructor, you can test it in perfect isolation.

Mocking is trivial: No complex setup. Just pass your mocks directly.

Side-effect verification: Ensure DB calls and EventBus pings happen exactly as expected without a real environment.

Develop as a Product: Program a feature in the morning, test it locally in the afternoon, drop it into the domains/ folder at night.

Implicit Integration: If it passes its unit tests, the kernel handles the wiring. It will work.

Measured Token Usage

How much context your AI needs to understand a single feature.

MicroCoreOS: ~1,000 estimated tokens per feature.
Clean Architecture: ~4,000 estimated tokens, roughly 4x more.
Development cost: 1 file per feature.

Estimated tokens per feature, by architecture:

MicroCoreOS: ~1,000
Vertical Slice: ~1,500
N-Layer: ~2,500
Hexagonal: ~3,500
Clean Architecture: ~4,000

Core Principles

Designed to be simple, auditable, and rigid where it matters.

Tool = Stateless · Plugin = Stateful

Pure Kernel

Zero business logic in the core. It is a neutral orchestrator that boots what you drop in /domains.

Stateless Tools

Infrastructure drivers (DB, HTTP, Bus). Reusable, stateless, and neutral capabilities.

Atomic Plugins

1 file = 1 feature. Pure business logic implementation. AI agents understand the whole module in one read.

Event-Driven

Plugins communicate via EventBus only. Decoupled by default, scalable by design.

Declarative DI

Dependencies are declared in the constructor. The kernel delivers what's requested.

Auto-Discovery

Just drop a file in a folder. The kernel finds it, boots it, and wires it.

The Roadmap

Where MicroCoreOS is heading — from observability to a full tool marketplace.

Phase 1 In Progress

Tracer Tool

Integrated mapping of which plugins react to which events. Full observability from day one — no external tools required.

Phase 2 Planned

Hot Reload

Drop a new plugin file and the kernel picks it up instantly — no restart required. Zero-downtime development loop, potentially in production too.

Phase 3 Upcoming

Observability Dashboard

A visual dashboard built on top of the Tracer Tool — see event flows, plugin timings, and system health in real-time. Already in early form, being improved.

Phase 4 Upcoming

Atomic Tool Marketplace

Drop-in tool ecosystem — Redis, PostgreSQL, LLMs — as self-contained folders with their own manifests and AI instructions. Install via uv or copy-paste.

Phase 5 Vision

Polyglot Kernels

Sidecar plugins via gRPC or WASM — write performance-critical modules in Go or Rust, orchestrated by the same MicroCoreOS kernel.

Build Faster with AI

Stop wasting tokens and time. Join the 1-file revolution and build features that AI understands instantly.

MIT License Python 3.10+ AI Optimized