Here's a runnable n8n workflow skeleton for your Curator concept. It handles deterministic filtering first, then escalates to a Curator Model when needed.


🏗 n8n Workflow: “Context Curator”

What it does:

  • Accepts a webhook from Bzzz (context_discovery).
  • Runs deterministic rules (file type, folder patterns).
  • If not conclusive → calls Curator Model (Claude/GPT/Ollama).
  • Posts decision back to Hive/Bzzz.
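
Either path ends with the same shape of message being posted back to Hive: a context_rule.add action wrapping a rule. A minimal sketch of that payload, using the field names from the Build Decision / Parse Model Output nodes in the export below (the example path is made up, and the exact schema Hive expects is an assumption to confirm):

{
  "action": "context_rule.add",
  "rule": {
    "target": "/db/migrations/001_init.sql",
    "condition": { "role": ["backend"] },
    "action": { "relevance": "include", "scope": "local" },
    "metadata": { "reason": "SQL → backend" }
  }
}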

📦 Workflow JSON Export (Import into n8n)

{
  "name": "Context Curator",
  "nodes": [
    {
      "parameters": {
        "httpMethod": "POST",
        "path": "context-curator",
        "responseMode": "onReceived",
        "options": {}
      },
      "name": "Bzzz Webhook",
      "type": "n8n-nodes-base.webhook",
      "typeVersion": 1,
      "position": [250, 300]
    },
    {
      "parameters": {
        "functionCode": "const path = $json[\"path\"] || \"\";\n\n// Simple deterministic rules (expand later)\nconst rules = [\n  { pattern: /node_modules/, decision: { relevance: \"exclude\", reason: \"node_modules irrelevant\" } },\n  { pattern: /.*\\.log$/, decision: { relevance: \"exclude\", reason: \"log file\" } },\n  { pattern: /.*\\.css$/, decision: { relevance: \"include\", roles: [\"frontend\"], reason: \"CSS → frontend\" } },\n  { pattern: /.*\\.sql$/, decision: { relevance: \"include\", roles: [\"backend\"], reason: \"SQL → backend\" } }\n];\n\n// Look for match\nlet match = null;\nfor (const rule of rules) {\n  if (rule.pattern.test(path)) {\n    match = rule.decision;\n    break;\n  }\n}\n\nreturn [{ json: { matched: !!match, decision: match, path } }];"
      },
      "name": "Deterministic Rules",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [500, 300]
    },
    {
      "parameters": {
        "conditions": {
          "boolean": [],
          "number": [],
          "string": [
            {
              "value1": "={{$json[\"matched\"]}}",
              "operation": "isEqual",
              "value2": "true"
            }
          ]
        }
      },
      "name": "Matched?",
      "type": "n8n-nodes-base.if",
      "typeVersion": 1,
      "position": [750, 300]
    },
    {
      "parameters": {
        "functionCode": "return [{ json: {\n  action: \"context_rule.add\",\n  rule: {\n    target: $json.path,\n    condition: { role: $json.decision?.roles || [] },\n    action: {\n      relevance: $json.decision?.relevance || \"exclude\",\n      scope: \"local\"\n    },\n    metadata: {\n      reason: $json.decision?.reason || \"Deterministic rule match\"\n    }\n  }\n} }];"
      },
      "name": "Build Decision (Deterministic)",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [1000, 200]
    },
    {
      "parameters": {
        "functionCode": "return [{ json: {\n  prompt: `You are the Context Curator for a multi-agent AI system. Analyze this discovery and decide which agent roles it is relevant to, whether to include, exclude, or escalate, and draft a Context Rule DSL snippet.\\n\\nPath: ${$json.path}\\n\\nMetadata: ${JSON.stringify($json.metadata || {})}`\n} }];"
      },
      "name": "Build Model Prompt",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [1000, 400]
    },
    {
      "parameters": {
        "resource": "chat",
        "operation": "create",
        "model": "gpt-4o",
        "messages": [
          {
            "role": "system",
            "content": "You are the Context Curator for a multi-agent system. Decide on relevance, roles, and generate a Context Rule JSON."
          },
          {
            "role": "user",
            "content": "={{$json[\"prompt\"]}}"
          }
        ]
      },
      "name": "Call Curator Model",
      "type": "n8n-nodes-openai.chat",
      "typeVersion": 1,
      "position": [1250, 400],
      "credentials": {
        "openAIApi": "OpenAI API"
      }
    },
    {
      "parameters": {
        "functionCode": "// Parse model response (assume model replies with JSON rule)\nlet rule;\ntry {\n  rule = JSON.parse($json.choices[0].message.content);\n} catch (e) {\n  rule = { error: \"Model response parse error\", raw: $json };\n}\n\nreturn [{ json: { action: \"context_rule.add\", rule } }];"
      },
      "name": "Parse Model Output",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [1500, 400]
    },
    {
      "parameters": {
        "url": "http://hive.local/api/context",
        "method": "POST",
        "jsonParameters": true,
        "options": {},
        "body": "={{$json}}"
      },
      "name": "Send Decision to Hive",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 1,
      "position": [1750, 300]
    }
  ],
  "connections": {
    "Bzzz Webhook": {
      "main": [
        [
          {
            "node": "Deterministic Rules",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Deterministic Rules": {
      "main": [
        [
          {
            "node": "Matched?",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Matched?": {
      "main": [
        [
          {
            "node": "Build Decision (Deterministic)",
            "type": "main",
            "index": 0
          }
        ],
        [
          {
            "node": "Build Model Prompt",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Build Decision (Deterministic)": {
      "main": [
        [
          {
            "node": "Send Decision to Hive",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Build Model Prompt": {
      "main": [
        [
          {
            "node": "Call Curator Model",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Call Curator Model": {
      "main": [
        [
          {
            "node": "Parse Model Output",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Parse Model Output": {
      "main": [
        [
          {
            "node": "Send Decision to Hive",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}
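
One caveat on the Parse Model Output node above: models often wrap their JSON reply in markdown code fences, which makes the bare JSON.parse call fail. A slightly hardened drop-in body for that Function node (same input and output shape as the original; the fence-stripping is an assumption about model behaviour, not something the export already does):

// Parse model response; tolerate markdown-fenced JSON (added hardening)
const raw = $json.choices[0].message.content || "";
// Strip a leading ``` or ```json fence and a trailing ``` fence if present
const cleaned = raw.replace(/^\s*```(?:json)?\s*/i, "").replace(/\s*```\s*$/, "").trim();

let rule;
try {
  rule = JSON.parse(cleaned);
} catch (e) {
  rule = { error: "Model response parse error", raw: $json };
}

return [{ json: { action: "context_rule.add", rule } }];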

🔧 How It Works

  • Bzzz agents send discoveries to (a manual test call is sketched after this list):

    POST http://<n8n-host>/webhook/context-curator
    

    Example payload:

    {
      "node_id": "bzzz-node-3",
      "path": "/src/featureX/benchmark_runner.py",
      "metadata": { "size": 1024, "file_types": ["py"], "creator": "claude-code" }
    }
    
  • Deterministic Rules node: matches quick, obvious patterns (logs, node_modules, CSS/SQL → roles); its function body is shown unescaped after this list.

  • IF node: If a deterministic match is found → build DSL rule → send to Hive.

  • Else: Build Curator Prompt → send to GPT/Claude → parse JSON rule → send to Hive.
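
You can exercise the webhook by hand before wiring up Bzzz. A quick test script for Node 18+ (built-in fetch; save as an ES module such as test-curator.mjs and run with node); the host and port are assumptions, 5678 being n8n's default:

// Manual smoke test for the Context Curator webhook (Node 18+, ES module)
const N8N_HOST = process.env.N8N_HOST || "http://localhost:5678"; // adjust to your n8n host

const payload = {
  node_id: "bzzz-node-3",
  path: "/src/featureX/benchmark_runner.py",
  metadata: { size: 1024, file_types: ["py"], creator: "claude-code" }
};

const res = await fetch(`${N8N_HOST}/webhook/context-curator`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload)
});

console.log(res.status, await res.text());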

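For readability, here is the Deterministic Rules Function node body unescaped (the same logic embedded in the JSON export above, including the metadata pass-through):

// Function node: deterministic, path-based filtering
const path = $json["path"] || "";

// Simple deterministic rules (expand later)
const rules = [
  { pattern: /node_modules/, decision: { relevance: "exclude", reason: "node_modules irrelevant" } },
  { pattern: /.*\.log$/, decision: { relevance: "exclude", reason: "log file" } },
  { pattern: /.*\.css$/, decision: { relevance: "include", roles: ["frontend"], reason: "CSS → frontend" } },
  { pattern: /.*\.sql$/, decision: { relevance: "include", roles: ["backend"], reason: "SQL → backend" } }
];

// Return the first matching rule's decision; matched: false triggers the model path
let match = null;
for (const rule of rules) {
  if (rule.pattern.test(path)) {
    match = rule.decision;
    break;
  }
}

// Pass metadata through so the Build Model Prompt node can use it
return [{ json: { matched: !!match, decision: match, path, metadata: $json["metadata"] || {} } }];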

🛠 How You Can Use It

  • Import into n8n → set your OpenAI API credentials in the Call Curator Model node.
  • Expand the Deterministic Rules function node with your own patterns.
  • Swap OpenAI for Ollama, Claude, or another local model if you want (an Ollama example follows this list).
  • Point the URL in the “Send Decision to Hive” node at your Hive context endpoint.
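
For example, to go local with Ollama, replace the Call Curator Model node with an HTTP Request node (same shape as the Send Decision to Hive node) pointing at Ollama's /api/chat endpoint. The request body it expects looks roughly like this; the host, port (11434 is Ollama's default), and model name are assumptions:

{
  "model": "llama3",
  "stream": false,
  "messages": [
    { "role": "system", "content": "You are the Context Curator for a multi-agent system. Decide on relevance, roles, and generate a Context Rule JSON." },
    { "role": "user", "content": "<prompt from the Build Model Prompt node>" }
  ]
}

Note that Ollama returns the reply under message.content rather than choices[0].message.content, so the Parse Model Output node needs that one-line change.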

📌 Next Step

Now that this skeleton exists, we can:

  • Write the Curator System Prompt for the model (so the “AI lane” has a consistent voice).
  • Add more state hooks (e.g. “server up/down” checks via n8n HTTP nodes).
  • Decide how Hive handles “draft rules” from the curator (auto-apply vs. human review).