taskai-core

Core functionality for TaskAI - AI-powered task generation for agents and automation workflows

1 unstable release: 0.1.0 (Apr 22, 2025)


MIT license


TaskAI

Generate structured task backlogs for AI agents and automation workflows

📖 Overview

TaskAI bridges natural language and structured task definitions, generating well-organized backlogs that both humans and AI agents can understand and execute. Convert informal descriptions into actionable tasks with dependencies, deliverables, and completion criteria - perfect for driving autonomous agents or organizing human-AI collaboration.

✨ Features

  • 🤖 AI-Agent Ready: Generate structured YAML task definitions optimized for AI agent consumption
  • 🗣️ Natural Language Input: Convert simple text descriptions into comprehensive task breakdowns
  • 📋 State Tracking: Monitor task progress with Todo/Done states
  • 🔄 Dependency Resolution: Automatically identify tasks ready for execution based on dependencies
  • ✅ Progress Tracking: Mark tasks as complete and manage the workflow lifecycle
  • 🌐 Multilingual: Support for inputs in both English and French
  • 🐳 Containerized: Run in Docker for CI/CD integration and consistent environments

🚀 Installation

# Install directly from crates.io
cargo install taskai

# Or build from source
git clone https://github.com/graniet/taskai.git
cd taskai
cargo install --path crates/cli

🎮 Usage

1. Create a task backlog from natural language

Create a simple text file with your requirements:

# simple_request.txt
Create a Python script that fetches weather data from an API.

Generate a structured task backlog:

taskai gen simple_request.txt > weather_tasks.yml

The output will be a structured YAML backlog:

project: weather-api-client
tasks:
  - id: W-1
    title: "Setup project structure"
    depends: []
    state: Todo
    deliverable: ["README.md", "requirements.txt"]
    done_when: ["Project directory structure is created"]
  
  - id: W-2
    title: "Create API client for weather data"
    depends: ["W-1"]
    state: Todo
    deliverable: ["weather_client.py"]
    done_when: ["Client can connect to weather API", "Basic error handling implemented"]
  
  - id: W-3
    title: "Implement data parsing and formatting"
    depends: ["W-2"]
    state: Todo
    deliverable: ["weather_client.py"]
    done_when: ["Weather data is properly parsed and formatted"]

2. Query tasks ready for execution

Identify tasks that are ready to be worked on (all dependencies satisfied):

taskai next weather_tasks.yml

Output:

Tasks ready to work on:
W-1: Setup project structure
  Deliverables:
    - README.md
    - requirements.txt
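
The actual resolver lives in the core crate; the Python sketch below is only an illustrative model (not TaskAI's code) of the rule `taskai next` applies: a task is ready when its state is still Todo and every task it depends on is Done.

```python
# Illustrative model of the "next" rule: a task is ready when it is Todo
# and all of its dependencies are Done. Not the crate's actual implementation.

def ready_tasks(tasks):
    """Return the tasks whose dependencies are all satisfied."""
    done = {t["id"] for t in tasks if t["state"] == "Done"}
    return [
        t for t in tasks
        if t["state"] == "Todo" and all(dep in done for dep in t["depends"])
    ]

# Tasks mirroring the weather_tasks.yml backlog above.
backlog = [
    {"id": "W-1", "depends": [],      "state": "Todo"},
    {"id": "W-2", "depends": ["W-1"], "state": "Todo"},
    {"id": "W-3", "depends": ["W-2"], "state": "Todo"},
]

print([t["id"] for t in ready_tasks(backlog)])  # ['W-1']
```

Once W-1 is marked Done, the same rule would surface W-2 as the next ready task.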

3. Use Claude with TaskAI - a simple workflow

With TaskAI, you can supercharge Claude's coding capabilities by giving it structured tasks to work on:

# Generate tasks from a simple description
echo "Build a weather dashboard with React" > idea.txt
taskai gen idea.txt > tasks.yml

Now, simply tell Claude to read and implement the task:

Read tasks.yml and implement the next ready task. 
Use 'taskai next tasks.yml' to see what task to work on 
and 'taskai mark-done tasks.yml --task TASK-ID' to mark it complete when done.

That's it! With just this simple prompt, Claude will:

  • Read the tasks.yml file
  • Find the next ready task
  • Implement it according to the specifications
  • Provide the command to mark the task as complete

When Claude finishes, simply mark the task as done:

taskai mark-done tasks.yml --task TASK-ID

Then you can ask Claude to work on the next task with the same basic prompt. This creates a continuous loop where Claude methodically works through the entire project, one task at a time, with minimal input from you.
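
The loop described above can be modeled end to end: repeatedly take the next ready task, implement it, and mark it Done until the backlog is exhausted. The Python sketch below (again a model, not TaskAI's code) shows why this loop always processes tasks in dependency order.

```python
# Model of the next/mark-done loop: work through tasks until none remain.
backlog = [
    {"id": "W-1", "depends": [],      "state": "Todo"},
    {"id": "W-2", "depends": ["W-1"], "state": "Todo"},
    {"id": "W-3", "depends": ["W-2"], "state": "Todo"},
]

def next_ready(tasks):
    """Return the first Todo task whose dependencies are all Done, or None."""
    done = {t["id"] for t in tasks if t["state"] == "Done"}
    for t in tasks:
        if t["state"] == "Todo" and all(d in done for d in t["depends"]):
            return t
    return None

order = []
while (task := next_ready(backlog)) is not None:
    # ... the agent implements the task here ...
    task["state"] = "Done"   # the equivalent of `taskai mark-done`
    order.append(task["id"])

print(order)  # ['W-1', 'W-2', 'W-3']
```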

📊 Architecture

TaskAI is organized into three Rust crates:

  • schema: Defines the data structures for tasks, dependencies and completion criteria
  • core: Implements the LLM communication and YAML generation/validation
  • cli: Provides the command-line interface
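
The YAML fields shown earlier map onto the types defined in the schema crate. As a rough, hypothetical analogue (the real definitions are Rust structs, and the exact field types are an assumption based on the YAML output above):

```python
# Hypothetical Python analogue of the schema crate's types,
# inferred from the YAML output shown earlier in this README.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    id: str
    title: str
    depends: List[str] = field(default_factory=list)
    state: str = "Todo"  # "Todo" or "Done"
    deliverable: List[str] = field(default_factory=list)
    done_when: List[str] = field(default_factory=list)

@dataclass
class Backlog:
    project: str
    tasks: List[Task] = field(default_factory=list)

backlog = Backlog(
    project="weather-api-client",
    tasks=[Task(id="W-1", title="Setup project structure",
                deliverable=["README.md", "requirements.txt"],
                done_when=["Project directory structure is created"])],
)
```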

🧪 Environment Variables

  • OPENAI_API_KEY: Required for LLM functionality

๐Ÿค Contributing

Contributions welcome! Please feel free to submit a Pull Request.

📜 License

This project is licensed under the MIT License - see the LICENSE file for details.
