
apflow

Task orchestration framework for AI applications

About

More than a traditional task list, apflow is an orchestration framework that builds complex task trees with dependencies, priorities, and unified execution. It acts as a modular AI orchestration layer that can run standalone or embedded, with optional AI agent support. The core is pure orchestration with no LLM dependencies; CrewAI integration is optional. Built-in task scheduling supports cron expressions, intervals, daily/weekly/monthly triggers, and integration with external schedulers. The framework also ships an extensive executor ecosystem covering HTTP/REST APIs, SSH remote execution, Docker containers, gRPC services, WebSocket communication, MCP integration, and LLM-based task tree generation.

Features

Pure orchestration framework (no LLM dependencies)
A2A Protocol support for agent communication
Optional CrewAI integration
Extended executor framework with multiple execution methods
HTTP/REST API executor with authentication support (see the sketch after this list)
SSH executor for remote command execution
Docker executor for containerized execution
gRPC executor for microservice communication
WebSocket executor for bidirectional communication
MCP (Model Context Protocol) executor and server
Task tree generation from natural language (LLM)
Built-in task scheduling (cron, interval, daily, weekly, monthly)
Flexible storage (DuckDB default, PostgreSQL optional)
Real-time streaming support (SSE, WebSocket)
Enhanced CLI with full API synchronization
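
As a concrete illustration of the executor-based model, here is a minimal sketch of driving the HTTP/REST executor through the same task API shown in the Quick Start below. The executor ID and input field names (http_request_executor, url, method, headers) follow the workflow example on this page; the endpoint URL and token are placeholders, and the exact input schema should be treated as an assumption rather than a confirmed API.

import asyncio

from apflow import TaskManager, TaskTreeNode, create_session


async def fetch_via_http_executor():
    # Create database session and task manager (same pattern as the Quick Start)
    db = create_session()
    task_manager = TaskManager(db)

    # Input field names follow the workflow example below; values are placeholders
    task = await task_manager.task_repository.create_task(
        name="http_request_executor",
        user_id="user_123",
        inputs={
            "url": "https://api.example.com/data",
            "method": "GET",
            "headers": {"Authorization": "Bearer <token>"},
        },
    )

    # Wrap the task in a tree node and execute it
    await task_manager.distribute_task_tree(TaskTreeNode(task))

    # Read back the completed task
    result = await task_manager.task_repository.get_task_by_id(task.id)
    print(result.status, result.result)


asyncio.run(fetch_via_http_executor())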

Workflow Examples

Visualize how tasks are organized in trees and how dependencies control execution order

Sequential Pipeline with Task Tree

Demonstrates both task tree organization (parent-child) and execution dependencies. The tree organizes tasks hierarchically, while dependencies control when tasks execute.

Dependency - Controls execution order
Tree - Parent-child organization
Each node shows its Schema (inputs)

Pipeline Root (workflow root node)
T1: Fetch Data (http_request_executor) - Fetch data from API - Schema: url, method, headers
T2: Process Data (command_executor) - Process fetched data - Schema: operation, data
T3: Save Results (command_executor) - Save processed results - Schema: destination, data
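
A rough sketch of how this pipeline might be assembled with the task API from the Quick Start below. The executor IDs and input names come from the diagram above; everything else is an assumption for illustration only: the add_child helper used to nest tasks under a parent is hypothetical, the input values are placeholders, and the way execution dependencies are declared is not shown on this page, so it is omitted.

import asyncio

from apflow import TaskManager, TaskTreeNode, create_session


async def build_pipeline():
    db = create_session()
    task_manager = TaskManager(db)
    repo = task_manager.task_repository

    # Executor IDs and input schemas follow the diagram above; values are placeholders
    fetch = await repo.create_task(
        name="http_request_executor", user_id="user_123",
        inputs={"url": "https://api.example.com/data", "method": "GET", "headers": {}},
    )
    process = await repo.create_task(
        name="command_executor", user_id="user_123",
        inputs={"operation": "transform", "data": None},
    )
    save = await repo.create_task(
        name="command_executor", user_id="user_123",
        inputs={"destination": "results.json", "data": None},
    )

    # Tree: parent-child organization. add_child is an assumed helper;
    # the real nesting API may differ.
    root = TaskTreeNode(fetch)
    root.add_child(TaskTreeNode(process))
    root.add_child(TaskTreeNode(save))

    # Dependencies (Fetch -> Process -> Save) control execution order;
    # their declaration is framework-specific and not sketched here.
    await task_manager.distribute_task_tree(root)


asyncio.run(build_pipeline())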

Install

pip install apflow

Quick Start

import asyncio

from apflow import TaskManager, TaskTreeNode, create_session


async def main():
    # Create database session and task manager
    db = create_session()
    task_manager = TaskManager(db)

    # Create a task using a built-in executor
    task = await task_manager.task_repository.create_task(
        name="system_info_executor",  # Built-in executor ID
        user_id="user_123",
        priority=2,
        inputs={"resource": "cpu"},   # Get CPU information
    )

    # Build the task tree and execute it
    task_tree = TaskTreeNode(task)
    await task_manager.distribute_task_tree(task_tree)

    # Fetch the task again to read its status and result
    result = await task_manager.task_repository.get_task_by_id(task.id)
    print(f"Status: {result.status}")
    print(f"Result: {result.result}")


# The task APIs are coroutines, so run them inside an event loop
asyncio.run(main())

Sub-projects

apflow-webapp

A modern web application for managing and executing tasks with apflow. Built with Next.js and Mantine.

Related Products

apcore

The schema-driven module development framework that apflow is built on.