WIP: Save current work before CHORUS rebrand

- Agent roles integration progress
- Various backend and frontend updates
- Storybook cache cleanup

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
This commit is contained in:
anthonyrawlins
2025-08-01 02:20:56 +10:00
parent 1e81daaf18
commit b6bff318d9
740 changed files with 90022 additions and 279523 deletions

63
AUTH_CREDENTIALS.md Normal file
View File

@@ -0,0 +1,63 @@
# Hive Authentication System Credentials
## Default Administrator Account
**CRITICAL: These are the OFFICIAL Hive admin credentials. Do not change without updating all references.**
```
Username: admin
Password: hiveadmin123
```
## Authentication System Architecture
- **Backend**: FastAPI with OAuth2 + JWT tokens
- **Frontend**: React with AuthContext using FormData for login
- **Database**: PostgreSQL users table with bcrypt password hashing
- **API Endpoint**: `POST /api/auth/login` (expects FormData, not JSON)
## Database Schema
The default admin user should be created in the database with:
- username: `admin`
- email: `admin@hive.local`
- password: `hiveadmin123` (bcrypt hashed)
- is_superuser: `true`
- is_active: `true`
- is_verified: `true`
## Frontend Integration
Login form sends FormData:
```javascript
const formData = new FormData();
formData.append('username', 'admin');
formData.append('password', 'hiveadmin123');
```
## Backend Response Format
Successful login returns:
```json
{
"access_token": "jwt_token_here",
"refresh_token": "refresh_token_here",
"token_type": "bearer",
"expires_in": 3600,
"user": {
"id": "uuid",
"username": "admin",
"email": "admin@hive.local",
"is_superuser": true,
"is_active": true,
"is_verified": true
}
}
```
## Notes
- Password was previously `hiveadmin` but is now officially `hiveadmin123`
- All development and production environments must use these credentials
- Update database seed scripts to ensure admin user exists with correct password
- Frontend demo credentials display should show `hiveadmin123`

View File

@@ -5,6 +5,50 @@
---
## 🎯 **CRITICAL PRIORITY: RL Context Curator Integration**
### **0. Context Feedback and Learning System**
**Priority: Critical - Integration with HCFS RL Context Curator**
- [ ] **Task Outcome Tracking**
- [ ] Extend `backend/app/models/task.py` with completion metrics
- [ ] Add fields: completion_time, errors_encountered, follow_up_questions, success_rate
- [ ] Implement task outcome classification (completed, failed, abandoned)
- [ ] Add confidence scoring for task completions
- [ ] **Agent Role Management System**
- [ ] Modify `backend/app/services/agent_manager.py` for role-based capabilities
- [ ] Implement role definitions: backend, frontend, devops, qa, testing
- [ ] Add directory scope patterns for each agent role
- [ ] Create agent permission management with role inheritance
- [ ] Support dynamic role assignment based on task requirements
- [ ] **Context Feedback Collection API**
- [ ] Create `backend/app/api/feedback.py` with context feedback endpoints
- [ ] Implement POST /api/feedback/context/{context_id} (upvote, downvote, forgetfulness)
- [ ] Add POST /api/feedback/task-outcome/{task_id} for task completion feedback
- [ ] Create feedback confidence and usage context tracking
- [ ] Add feedback aggregation and analytics endpoints
- [ ] **Real-time Task Event System**
- [ ] Extend `backend/app/services/websocket_manager.py` for task events
- [ ] Add WebSocket events for task completion/failure triggers
- [ ] Implement real-time feedback collection notifications
- [ ] Create task-to-context relevance tracking events
- [ ] Add agent role change notifications
- [ ] **Database Schema Extensions for Context Learning**
- [ ] Create migration for context_feedback table
- [ ] Create migration for agent_permissions table
- [ ] Add context relevance tracking to tasks table
- [ ] Extend agent model with role and directory scope fields
- [ ] Implement feedback aggregation views for RL training
- [ ] **Integration with Bzzz Context Events**
- [ ] Add endpoints to receive context feedback from Bzzz P2P network
- [ ] Implement feedback event routing to HCFS RL Context Curator
- [ ] Create feedback event validation and deduplication
- [ ] Add task-context relevance correlation tracking
## 🎯 **HIGH PRIORITY: Project Registration & Activation System**
### **1. Database-Driven Project Management**

View File

@@ -0,0 +1,221 @@
# 🏗️ Hive-Bzzz Registration Architecture Design Plan
## 🔍 Current Architecture Problems
1. **Static Configuration**: Hardcoded node IPs in `cluster_service.py`
2. **SSH Dependencies**: Requires SSH keys, network access, security risks
3. **Docker Isolation**: Can't SSH from container to host network
4. **No Dynamic Discovery**: Nodes can't join/leave dynamically
5. **Stale Data**: No real-time hardware/status updates
## 🎯 Proposed Architecture: Registration-Based Cluster
Similar to Docker Swarm's `docker swarm join` with tokens:
```bash
# Bzzz clients register with Hive coordinator
HIVE_CLUSTER_TOKEN=abc123... HIVE_COORDINATOR_URL=https://hive.example.com bzzz-client
```
## 📋 Implementation Plan
### Phase 1: Hive Coordinator Registration System
#### 1.1 Database Schema Changes
```sql
-- Cluster registration tokens
CREATE TABLE cluster_tokens (
id SERIAL PRIMARY KEY,
token VARCHAR(64) UNIQUE NOT NULL,
description TEXT,
created_at TIMESTAMP DEFAULT NOW(),
expires_at TIMESTAMP,
is_active BOOLEAN DEFAULT true
);
-- Registered cluster nodes
CREATE TABLE cluster_nodes (
id SERIAL PRIMARY KEY,
node_id VARCHAR(64) UNIQUE NOT NULL,
hostname VARCHAR(255) NOT NULL,
ip_address INET NOT NULL,
registration_token VARCHAR(64) REFERENCES cluster_tokens(token),
-- Hardware info (reported by client)
cpu_info JSONB,
memory_info JSONB,
gpu_info JSONB,
disk_info JSONB,
-- Status tracking
status VARCHAR(20) DEFAULT 'online',
last_heartbeat TIMESTAMP DEFAULT NOW(),
first_registered TIMESTAMP DEFAULT NOW(),
-- Capabilities
services JSONB, -- ollama, docker, etc.
capabilities JSONB -- models, tools, etc.
);
```
#### 1.2 Registration API Endpoints
```python
# /api/cluster/register (POST)
# - Validates token
# - Records node hardware info
# - Returns node_id and heartbeat interval
# /api/cluster/heartbeat (POST)
# - Updates last_heartbeat
# - Updates current status/metrics
# - Returns cluster commands/tasks
# /api/cluster/tokens (GET/POST)
# - Generate/list/revoke cluster tokens
# - Admin endpoint for token management
```
### Phase 2: Bzzz Client Registration Capability
#### 2.1 Environment Variables
```bash
HIVE_CLUSTER_TOKEN=token_here # Required for registration
HIVE_COORDINATOR_URL=https://hive.local:8000 # Hive API endpoint
HIVE_NODE_ID=walnut-$(hostname) # Optional: custom node ID
HIVE_HEARTBEAT_INTERVAL=30 # Seconds between heartbeats
```
#### 2.2 Hardware Detection Module
```python
# bzzz/system_info.py
def get_system_info():
return {
"cpu": detect_cpu(), # lscpu parsing
"memory": detect_memory(), # /proc/meminfo
"gpu": detect_gpu(), # nvidia-smi, lspci
"disk": detect_storage(), # df, lsblk
"services": detect_services(), # docker, ollama, etc.
"capabilities": detect_models() # available models
}
```
#### 2.3 Registration Logic
```python
# bzzz/cluster_client.py
class HiveClusterClient:
def __init__(self):
self.token = os.getenv('HIVE_CLUSTER_TOKEN')
self.coordinator_url = os.getenv('HIVE_COORDINATOR_URL')
self.node_id = os.getenv('HIVE_NODE_ID', f"{socket.gethostname()}-{uuid4()}")
async def register(self):
"""Register with Hive coordinator"""
system_info = get_system_info()
payload = {
"token": self.token,
"node_id": self.node_id,
"hostname": socket.gethostname(),
"ip_address": get_local_ip(),
"system_info": system_info
}
# POST to /api/cluster/register
async def heartbeat_loop(self):
"""Send periodic heartbeats with current status"""
while True:
current_status = get_current_status()
# POST to /api/cluster/heartbeat
await asyncio.sleep(self.heartbeat_interval)
```
### Phase 3: Integration & Migration
#### 3.1 Remove Hardcoded Nodes
- Delete static `cluster_nodes` dict from `cluster_service.py`
- Replace with dynamic database queries
- Update all cluster APIs to use registered nodes
#### 3.2 Frontend Updates
- **Node Management UI**: View/approve/remove registered nodes
- **Token Management**: Generate/revoke cluster tokens
- **Real-time Status**: Live hardware metrics from heartbeats
- **Registration Instructions**: Show token and join commands
#### 3.3 Bzzz Client Integration
- Add cluster client to Bzzz startup sequence
- Environment variable configuration
- Graceful handling of registration failures
## 🔄 Registration Flow
```mermaid
sequenceDiagram
participant B as Bzzz Client
participant H as Hive Coordinator
participant DB as Database
Note over H: Admin generates token
H->>DB: INSERT cluster_token
Note over B: Start with env vars
B->>B: Detect system info
B->>H: POST /api/cluster/register
H->>DB: Validate token
H->>DB: INSERT cluster_node
H->>B: Return node_id, heartbeat_interval
loop Every 30 seconds
B->>B: Get current status
B->>H: POST /api/cluster/heartbeat
H->>DB: UPDATE last_heartbeat
end
```
## 🔐 Security Considerations
1. **Token-based Auth**: No SSH keys or passwords needed
2. **Token Expiration**: Configurable token lifetimes
3. **IP Validation**: Optional IP whitelist for token usage
4. **TLS Required**: All communication over HTTPS
5. **Token Rotation**: Ability to revoke/regenerate tokens
## ✅ Benefits of New Architecture
1. **Dynamic Discovery**: Nodes self-register, no pre-configuration
2. **Real-time Data**: Live hardware metrics from heartbeats
3. **Security**: No SSH, credential management, or open ports
4. **Scalability**: Works with any number of nodes
5. **Fault Tolerance**: Nodes can rejoin after network issues
6. **Docker Friendly**: No host network access required
7. **Cloud Ready**: Works across NAT, VPCs, different networks
## 🚀 Implementation Priority
1. **High Priority**: Database schema, registration endpoints, basic heartbeat
2. **Medium Priority**: Bzzz client integration, hardware detection
3. **Low Priority**: Advanced UI features, token management UI
## 📝 Implementation Status
- [ ] Phase 1.1: Database schema migration
- [ ] Phase 1.2: Registration API endpoints
- [ ] Phase 2.1: Bzzz environment variable support
- [ ] Phase 2.2: System hardware detection module
- [ ] Phase 2.3: Registration client logic
- [ ] Phase 3.1: Remove hardcoded cluster nodes
- [ ] Phase 3.2: Frontend cluster management UI
- [ ] Phase 3.3: Full Bzzz integration
## 🔗 Related Files
- `/backend/app/services/cluster_service.py` - Current hardcoded implementation
- `/backend/app/api/cluster.py` - Cluster API endpoints
- `/backend/migrations/` - Database schema changes
- `/frontend/src/components/cluster/` - Cluster UI components
---
**Created**: 2025-07-31
**Status**: Planning Phase
**Priority**: High
**Impact**: Solves fundamental hardware detection and cluster management issues

34
backend/Dockerfile.dev Normal file
View File

@@ -0,0 +1,34 @@
FROM python:3.11-slim
WORKDIR /app
# Install system dependencies
RUN apt-get update && apt-get install -y \
curl \
git \
build-essential \
&& rm -rf /var/lib/apt/lists/*
# Copy requirements
COPY requirements.txt .
# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt
RUN pip install --no-cache-dir watchdog # For hot reload
# Copy source code
COPY . .
# Create non-root user
RUN useradd -m -u 1001 appuser && chown -R appuser:appuser /app
USER appuser
# Expose port
EXPOSE 8000
# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=10s --retries=3 \
CMD curl -f http://localhost:8000/api/health || exit 1
# Start development server with hot reload
CMD ["python", "-m", "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]

View File

@@ -15,6 +15,8 @@ Key Features:
from fastapi import APIRouter, HTTPException, Request, Depends, status
from typing import List, Dict, Any
import time
import logging
from ..models.agent import Agent
from ..models.responses import (
AgentListResponse,
@@ -29,6 +31,9 @@ router = APIRouter()
from app.core.database import SessionLocal
from app.models.agent import Agent as ORMAgent
from ..services.agent_service import AgentType
logger = logging.getLogger(__name__)
@router.get(
@@ -384,4 +389,244 @@ async def unregister_agent(
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=f"Failed to unregister agent: {str(e)}"
)
@router.post(
"/agents/heartbeat",
status_code=status.HTTP_200_OK,
summary="Agent heartbeat update",
description="""
Update agent status and maintain registration through periodic heartbeat.
This endpoint allows agents to:
- Confirm they are still online and responsive
- Update their current status and metrics
- Report any capability or configuration changes
- Maintain their registration in the cluster
Agents should call this endpoint every 30-60 seconds to maintain
their active status in the Hive cluster.
""",
responses={
200: {"description": "Heartbeat received successfully"},
404: {"model": ErrorResponse, "description": "Agent not registered"},
400: {"model": ErrorResponse, "description": "Invalid heartbeat data"}
}
)
async def agent_heartbeat(
heartbeat_data: Dict[str, Any],
request: Request
):
"""
Process agent heartbeat to maintain registration.
Args:
heartbeat_data: Agent status and metrics data
request: FastAPI request object
Returns:
Success confirmation and any coordinator updates
"""
agent_id = heartbeat_data.get("agent_id")
if not agent_id:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Missing agent_id in heartbeat data"
)
# Access coordinator
hive_coordinator = getattr(request.app.state, 'hive_coordinator', None)
if not hive_coordinator:
from ..main import unified_coordinator
hive_coordinator = unified_coordinator
if not hive_coordinator:
raise HTTPException(
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
detail="Coordinator service unavailable"
)
try:
# Update agent heartbeat timestamp
agent_service = hive_coordinator.agent_service
if agent_service:
agent_service.update_agent_heartbeat(agent_id)
# Update current tasks if provided - use raw SQL to avoid role column
if "current_tasks" in heartbeat_data:
current_tasks = heartbeat_data["current_tasks"]
try:
with SessionLocal() as db:
from sqlalchemy import text
db.execute(text(
"UPDATE agents SET current_tasks = :current_tasks, last_seen = NOW() WHERE id = :agent_id"
), {
"current_tasks": current_tasks,
"agent_id": agent_id
})
db.commit()
except Exception as e:
logger.warning(f"Could not update agent tasks: {e}")
return {
"status": "success",
"message": f"Heartbeat received from agent '{agent_id}'",
"timestamp": time.time()
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=f"Failed to process heartbeat: {str(e)}"
)
@router.post(
"/agents/auto-register",
response_model=AgentRegistrationResponse,
status_code=status.HTTP_201_CREATED,
summary="Automatic agent registration",
description="""
Register an agent automatically with capability detection.
This endpoint is designed for Bzzz agents running as systemd services
to automatically register themselves with the Hive coordinator.
Features:
- Automatic capability detection based on available models
- Network discovery support
- Retry-friendly for service startup scenarios
- Health validation before registration
""",
responses={
201: {"description": "Agent auto-registered successfully"},
400: {"model": ErrorResponse, "description": "Invalid agent configuration"},
409: {"model": ErrorResponse, "description": "Agent already registered"},
503: {"model": ErrorResponse, "description": "Agent endpoint unreachable"}
}
)
async def auto_register_agent(
agent_data: Dict[str, Any],
request: Request
) -> AgentRegistrationResponse:
"""
Automatically register a Bzzz agent with the Hive coordinator.
Args:
agent_data: Agent configuration including endpoint, models, etc.
request: FastAPI request object
Returns:
AgentRegistrationResponse: Registration confirmation
"""
# Extract required fields
agent_id = agent_data.get("agent_id")
endpoint = agent_data.get("endpoint")
hostname = agent_data.get("hostname")
if not agent_id or not endpoint:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Missing required fields: agent_id, endpoint"
)
# Access coordinator
hive_coordinator = getattr(request.app.state, 'hive_coordinator', None)
if not hive_coordinator:
from ..main import unified_coordinator
hive_coordinator = unified_coordinator
if not hive_coordinator:
raise HTTPException(
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
detail="Coordinator service unavailable"
)
try:
# Check if agent already exists - use basic query to avoid role column
try:
with SessionLocal() as db:
from sqlalchemy import text
existing_agent = db.execute(text(
"SELECT id, endpoint FROM agents WHERE id = :agent_id LIMIT 1"
), {"agent_id": agent_id}).fetchone()
if existing_agent:
# Update existing agent
db.execute(text(
"UPDATE agents SET endpoint = :endpoint, last_seen = NOW() WHERE id = :agent_id"
), {"endpoint": endpoint, "agent_id": agent_id})
db.commit()
return AgentRegistrationResponse(
agent_id=agent_id,
endpoint=endpoint,
message=f"Agent '{agent_id}' registration updated successfully"
)
except Exception as e:
logger.warning(f"Could not check existing agent: {e}")
# Detect capabilities and models
models = agent_data.get("models", [])
if not models:
# Try to detect models from endpoint
try:
import aiohttp
async with aiohttp.ClientSession() as session:
async with session.get(f"{endpoint}/api/tags", timeout=aiohttp.ClientTimeout(total=5)) as response:
if response.status == 200:
tags_data = await response.json()
models = [model["name"] for model in tags_data.get("models", [])]
except Exception as e:
logger.warning(f"Could not detect models for {agent_id}: {e}")
# Determine specialty based on models or hostname
specialty = AgentType.GENERAL_AI # Default
if "codellama" in str(models).lower() or "code" in hostname.lower():
specialty = AgentType.KERNEL_DEV
elif "gemma" in str(models).lower():
specialty = AgentType.PYTORCH_DEV
elif any(model for model in models if "llama" in model.lower()):
specialty = AgentType.GENERAL_AI
# Insert agent directly into database
try:
with SessionLocal() as db:
from sqlalchemy import text
# Insert new agent using raw SQL to avoid role column issues
db.execute(text("""
INSERT INTO agents (id, name, endpoint, model, specialty, max_concurrent, current_tasks, status, created_at, last_seen)
VALUES (:agent_id, :name, :endpoint, :model, :specialty, :max_concurrent, 0, 'active', NOW(), NOW())
ON CONFLICT (id) DO UPDATE SET
endpoint = EXCLUDED.endpoint,
model = EXCLUDED.model,
specialty = EXCLUDED.specialty,
max_concurrent = EXCLUDED.max_concurrent,
last_seen = NOW()
"""), {
"agent_id": agent_id,
"name": agent_id, # Use agent_id as name
"endpoint": endpoint,
"model": models[0] if models else "unknown",
"specialty": specialty.value,
"max_concurrent": agent_data.get("max_concurrent", 2)
})
db.commit()
return AgentRegistrationResponse(
agent_id=agent_id,
endpoint=endpoint,
message=f"Agent '{agent_id}' auto-registered successfully with specialty '{specialty.value}'"
)
except Exception as e:
logger.error(f"Database insert failed: {e}")
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=f"Failed to register agent in database: {str(e)}"
)
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=f"Failed to auto-register agent: {str(e)}"
)

View File

@@ -0,0 +1,287 @@
"""
Bzzz hypercore/hyperswarm log streaming API endpoints.
Provides real-time access to agent communication logs from the Bzzz network.
"""
from fastapi import APIRouter, WebSocket, WebSocketDisconnect, HTTPException, Query
from fastapi.responses import StreamingResponse
from typing import List, Optional, Dict, Any
import asyncio
import json
import logging
import httpx
import time
from datetime import datetime, timedelta
router = APIRouter()
logger = logging.getLogger(__name__)
# Keep track of active WebSocket connections
active_connections: List[WebSocket] = []
class BzzzLogEntry:
"""Represents a Bzzz hypercore log entry"""
def __init__(self, data: Dict[str, Any]):
self.index = data.get("index", 0)
self.timestamp = data.get("timestamp", "")
self.author = data.get("author", "")
self.log_type = data.get("type", "")
self.message_data = data.get("data", {})
self.hash_value = data.get("hash", "")
self.prev_hash = data.get("prev_hash", "")
def to_chat_message(self) -> Dict[str, Any]:
"""Convert hypercore log entry to chat message format"""
# Extract message details from the log data
msg_data = self.message_data
return {
"id": f"log-{self.index}",
"senderId": msg_data.get("from_short", self.author),
"senderName": msg_data.get("from_short", self.author),
"content": self._format_message_content(),
"timestamp": self.timestamp,
"messageType": self._determine_message_type(),
"channel": msg_data.get("topic", "unknown"),
"swarmId": f"swarm-{msg_data.get('topic', 'unknown')}",
"isDelivered": True,
"isRead": True,
"logType": self.log_type,
"hash": self.hash_value
}
def _format_message_content(self) -> str:
"""Format the log entry into a readable message"""
msg_data = self.message_data
message_type = msg_data.get("message_type", self.log_type)
if message_type == "availability_broadcast":
status = msg_data.get("data", {}).get("status", "unknown")
current_tasks = msg_data.get("data", {}).get("current_tasks", 0)
max_tasks = msg_data.get("data", {}).get("max_tasks", 0)
return f"Status: {status} ({current_tasks}/{max_tasks} tasks)"
elif message_type == "capability_broadcast":
capabilities = msg_data.get("data", {}).get("capabilities", [])
models = msg_data.get("data", {}).get("models", [])
return f"Updated capabilities: {', '.join(capabilities[:3])}{'...' if len(capabilities) > 3 else ''}"
elif message_type == "task_announced":
task_data = msg_data.get("data", {})
return f"Task announced: {task_data.get('title', 'Unknown task')}"
elif message_type == "task_claimed":
task_data = msg_data.get("data", {})
return f"Task claimed: {task_data.get('title', 'Unknown task')}"
elif message_type == "role_announcement":
role = msg_data.get("data", {}).get("role", "unknown")
return f"Role announcement: {role}"
elif message_type == "collaboration":
return f"Collaboration: {msg_data.get('data', {}).get('content', 'Agent discussion')}"
elif self.log_type == "peer_joined":
return "Agent joined the network"
elif self.log_type == "peer_left":
return "Agent left the network"
else:
# Generic fallback
return f"{message_type}: {json.dumps(msg_data.get('data', {}))[:100]}{'...' if len(str(msg_data.get('data', {}))) > 100 else ''}"
def _determine_message_type(self) -> str:
"""Determine if this is a sent, received, or system message"""
msg_data = self.message_data
# System messages
if self.log_type in ["peer_joined", "peer_left", "network_event"]:
return "system"
# For now, treat all as received since we're monitoring
# In a real implementation, you'd check if the author is the current node
return "received"
class BzzzLogStreamer:
"""Manages streaming of Bzzz hypercore logs"""
def __init__(self):
self.agent_endpoints = {}
self.last_indices = {} # Track last seen index per agent
async def discover_bzzz_agents(self) -> List[Dict[str, str]]:
"""Discover active Bzzz agents from the Hive agents API"""
try:
# This would typically query the actual agents database
# For now, return known endpoints based on cluster nodes
return [
{"agent_id": "acacia-bzzz", "endpoint": "http://acacia.local:8080"},
{"agent_id": "walnut-bzzz", "endpoint": "http://walnut.local:8080"},
{"agent_id": "ironwood-bzzz", "endpoint": "http://ironwood.local:8080"},
{"agent_id": "rosewood-bzzz", "endpoint": "http://rosewood.local:8080"},
]
except Exception as e:
logger.error(f"Failed to discover Bzzz agents: {e}")
return []
async def fetch_agent_logs(self, agent_endpoint: str, since_index: int = 0) -> List[BzzzLogEntry]:
"""Fetch hypercore logs from a specific Bzzz agent"""
try:
# This would call the actual Bzzz agent's HTTP API
# For now, return mock data structure that matches hypercore format
async with httpx.AsyncClient() as client:
response = await client.get(
f"{agent_endpoint}/api/hypercore/logs",
params={"since": since_index},
timeout=5.0
)
if response.status_code == 200:
logs_data = response.json()
return [BzzzLogEntry(log) for log in logs_data.get("entries", [])]
else:
logger.warning(f"Failed to fetch logs from {agent_endpoint}: {response.status_code}")
return []
except httpx.ConnectError:
logger.debug(f"Agent at {agent_endpoint} is not reachable")
return []
except Exception as e:
logger.error(f"Error fetching logs from {agent_endpoint}: {e}")
return []
async def get_recent_logs(self, limit: int = 100) -> List[Dict[str, Any]]:
"""Get recent logs from all agents"""
agents = await self.discover_bzzz_agents()
all_messages = []
for agent in agents:
logs = await self.fetch_agent_logs(agent["endpoint"])
for log in logs[-limit:]: # Get recent entries
message = log.to_chat_message()
message["agent_id"] = agent["agent_id"]
all_messages.append(message)
# Sort by timestamp
all_messages.sort(key=lambda x: x["timestamp"])
return all_messages[-limit:]
async def stream_new_logs(self):
"""Continuously stream new logs from all agents"""
while True:
try:
agents = await self.discover_bzzz_agents()
new_messages = []
for agent in agents:
agent_id = agent["agent_id"]
last_index = self.last_indices.get(agent_id, 0)
logs = await self.fetch_agent_logs(agent["endpoint"], last_index)
for log in logs:
if log.index > last_index:
message = log.to_chat_message()
message["agent_id"] = agent_id
new_messages.append(message)
self.last_indices[agent_id] = log.index
# Send new messages to all connected WebSocket clients
if new_messages and active_connections:
message_data = {
"type": "new_messages",
"messages": new_messages
}
# Remove disconnected clients
disconnected = []
for connection in active_connections:
try:
await connection.send_text(json.dumps(message_data))
except:
disconnected.append(connection)
for conn in disconnected:
active_connections.remove(conn)
await asyncio.sleep(2) # Poll every 2 seconds
except Exception as e:
logger.error(f"Error in log streaming: {e}")
await asyncio.sleep(5)
# Global log streamer instance
log_streamer = BzzzLogStreamer()
@router.get("/bzzz/logs")
async def get_bzzz_logs(
limit: int = Query(default=100, le=1000),
agent_id: Optional[str] = None
):
"""Get recent Bzzz hypercore logs"""
try:
logs = await log_streamer.get_recent_logs(limit)
if agent_id:
logs = [log for log in logs if log.get("agent_id") == agent_id]
return {
"logs": logs,
"count": len(logs),
"timestamp": datetime.utcnow().isoformat()
}
except Exception as e:
logger.error(f"Error fetching Bzzz logs: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/bzzz/agents")
async def get_bzzz_agents():
"""Get list of discovered Bzzz agents"""
try:
agents = await log_streamer.discover_bzzz_agents()
return {"agents": agents}
except Exception as e:
logger.error(f"Error discovering Bzzz agents: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.websocket("/bzzz/logs/stream")
async def websocket_bzzz_logs(websocket: WebSocket):
"""WebSocket endpoint for real-time Bzzz log streaming"""
await websocket.accept()
active_connections.append(websocket)
try:
# Send initial recent logs
recent_logs = await log_streamer.get_recent_logs(50)
await websocket.send_text(json.dumps({
"type": "initial_logs",
"messages": recent_logs
}))
# Keep connection alive and handle client messages
while True:
try:
# Wait for client messages (ping, filters, etc.)
message = await asyncio.wait_for(websocket.receive_text(), timeout=30)
client_data = json.loads(message)
if client_data.get("type") == "ping":
await websocket.send_text(json.dumps({"type": "pong"}))
except asyncio.TimeoutError:
# Send periodic heartbeat
await websocket.send_text(json.dumps({"type": "heartbeat"}))
except WebSocketDisconnect:
active_connections.remove(websocket)
except Exception as e:
logger.error(f"WebSocket error: {e}")
if websocket in active_connections:
active_connections.remove(websocket)
# Start the log streaming background task
@router.on_event("startup")
async def start_log_streaming():
"""Start the background log streaming task"""
asyncio.create_task(log_streamer.stream_new_logs())

View File

@@ -0,0 +1,434 @@
"""
Cluster Registration API endpoints
Handles registration-based cluster management for Hive-Bzzz integration.
"""
from fastapi import APIRouter, HTTPException, Request, Depends
from pydantic import BaseModel, Field
from typing import Dict, Any, List, Optional
import logging
import os
from ..services.cluster_registration_service import (
ClusterRegistrationService,
RegistrationRequest,
HeartbeatRequest
)
logger = logging.getLogger(__name__)
router = APIRouter()
# Initialize service
DATABASE_URL = os.getenv("DATABASE_URL", "postgresql://hive:hivepass@localhost:5432/hive")
cluster_registration_service = ClusterRegistrationService(DATABASE_URL)
# Pydantic models for API
class NodeRegistrationRequest(BaseModel):
token: str = Field(..., description="Cluster registration token")
node_id: str = Field(..., description="Unique node identifier")
hostname: str = Field(..., description="Node hostname")
system_info: Dict[str, Any] = Field(..., description="System hardware and OS information")
client_version: Optional[str] = Field(None, description="Bzzz client version")
services: Optional[Dict[str, Any]] = Field(None, description="Available services")
capabilities: Optional[Dict[str, Any]] = Field(None, description="Node capabilities")
ports: Optional[Dict[str, Any]] = Field(None, description="Service ports")
metadata: Optional[Dict[str, Any]] = Field(None, description="Additional metadata")
class NodeHeartbeatRequest(BaseModel):
node_id: str = Field(..., description="Node identifier")
status: str = Field("online", description="Node status")
cpu_usage: Optional[float] = Field(None, ge=0, le=100, description="CPU usage percentage")
memory_usage: Optional[float] = Field(None, ge=0, le=100, description="Memory usage percentage")
disk_usage: Optional[float] = Field(None, ge=0, le=100, description="Disk usage percentage")
gpu_usage: Optional[float] = Field(None, ge=0, le=100, description="GPU usage percentage")
services_status: Optional[Dict[str, Any]] = Field(None, description="Service status information")
network_metrics: Optional[Dict[str, Any]] = Field(None, description="Network metrics")
custom_metrics: Optional[Dict[str, Any]] = Field(None, description="Custom node metrics")
class TokenCreateRequest(BaseModel):
description: str = Field(..., description="Token description")
expires_in_days: Optional[int] = Field(None, gt=0, description="Token expiration in days")
max_registrations: Optional[int] = Field(None, gt=0, description="Maximum number of registrations")
allowed_ip_ranges: Optional[List[str]] = Field(None, description="Allowed IP CIDR ranges")
# Helper function to get client IP
def get_client_ip(request: Request) -> str:
"""Extract client IP address from request."""
# Check for X-Forwarded-For header (proxy/load balancer)
forwarded_for = request.headers.get("X-Forwarded-For")
if forwarded_for:
# Take the first IP in the chain (original client)
return forwarded_for.split(",")[0].strip()
# Check for X-Real-IP header (nginx)
real_ip = request.headers.get("X-Real-IP")
if real_ip:
return real_ip.strip()
# Fall back to direct connection IP
return request.client.host if request.client else "unknown"
# Registration endpoints
@router.post("/cluster/register")
async def register_node(
registration: NodeRegistrationRequest,
request: Request
) -> Dict[str, Any]:
"""
Register a new node in the cluster.
This endpoint allows Bzzz clients to register themselves with the Hive coordinator
using a valid cluster token. Similar to `docker swarm join`.
"""
try:
client_ip = get_client_ip(request)
logger.info(f"Node registration attempt: {registration.node_id} from {client_ip}")
# Convert to service request
reg_request = RegistrationRequest(
token=registration.token,
node_id=registration.node_id,
hostname=registration.hostname,
ip_address=client_ip,
system_info=registration.system_info,
client_version=registration.client_version,
services=registration.services,
capabilities=registration.capabilities,
ports=registration.ports,
metadata=registration.metadata
)
result = await cluster_registration_service.register_node(reg_request, client_ip)
logger.info(f"Node {registration.node_id} registered successfully")
return result
except ValueError as e:
logger.warning(f"Registration failed for {registration.node_id}: {e}")
raise HTTPException(status_code=400, detail=str(e))
except Exception as e:
logger.error(f"Registration error for {registration.node_id}: {e}")
raise HTTPException(status_code=500, detail="Registration failed")
@router.post("/cluster/heartbeat")
async def node_heartbeat(heartbeat: NodeHeartbeatRequest) -> Dict[str, Any]:
"""
Update node heartbeat and status.
Registered nodes should call this endpoint periodically (every 30 seconds)
to maintain their registration and report current status/metrics.
"""
try:
heartbeat_request = HeartbeatRequest(
node_id=heartbeat.node_id,
status=heartbeat.status,
cpu_usage=heartbeat.cpu_usage,
memory_usage=heartbeat.memory_usage,
disk_usage=heartbeat.disk_usage,
gpu_usage=heartbeat.gpu_usage,
services_status=heartbeat.services_status,
network_metrics=heartbeat.network_metrics,
custom_metrics=heartbeat.custom_metrics
)
result = await cluster_registration_service.update_heartbeat(heartbeat_request)
return result
except ValueError as e:
logger.warning(f"Heartbeat failed for {heartbeat.node_id}: {e}")
raise HTTPException(status_code=404, detail=str(e))
except Exception as e:
logger.error(f"Heartbeat error for {heartbeat.node_id}: {e}")
raise HTTPException(status_code=500, detail="Heartbeat update failed")
# Node management endpoints
@router.get("/cluster/nodes/registered")
async def get_registered_nodes(include_offline: bool = True) -> Dict[str, Any]:
"""
Get all registered cluster nodes.
Returns detailed information about all nodes that have registered
with the cluster, including their hardware specs and current status.
"""
try:
nodes = await cluster_registration_service.get_registered_nodes(include_offline)
# Convert to API response format
nodes_data = []
for node in nodes:
# Convert dataclass to dict and handle datetime serialization
node_dict = {
"id": node.id,
"node_id": node.node_id,
"hostname": node.hostname,
"ip_address": node.ip_address,
"status": node.status,
"hardware": {
"cpu": node.cpu_info or {},
"memory": node.memory_info or {},
"gpu": node.gpu_info or {},
"disk": node.disk_info or {},
"os": node.os_info or {},
"platform": node.platform_info or {}
},
"services": node.services or {},
"capabilities": node.capabilities or {},
"ports": node.ports or {},
"client_version": node.client_version,
"first_registered": node.first_registered.isoformat(),
"last_heartbeat": node.last_heartbeat.isoformat(),
"registration_metadata": node.registration_metadata or {}
}
nodes_data.append(node_dict)
return {
"nodes": nodes_data,
"total_count": len(nodes_data),
"online_count": len([n for n in nodes if n.status == "online"]),
"offline_count": len([n for n in nodes if n.status == "offline"])
}
except Exception as e:
logger.error(f"Failed to get registered nodes: {e}")
raise HTTPException(status_code=500, detail="Failed to retrieve registered nodes")
@router.get("/cluster/nodes/{node_id}")
async def get_node_details(node_id: str) -> Dict[str, Any]:
"""Get detailed information about a specific registered node."""
try:
node = await cluster_registration_service.get_node_details(node_id)
if not node:
raise HTTPException(status_code=404, detail="Node not found")
return {
"id": node.id,
"node_id": node.node_id,
"hostname": node.hostname,
"ip_address": node.ip_address,
"status": node.status,
"hardware": {
"cpu": node.cpu_info or {},
"memory": node.memory_info or {},
"gpu": node.gpu_info or {},
"disk": node.disk_info or {},
"os": node.os_info or {},
"platform": node.platform_info or {}
},
"services": node.services or {},
"capabilities": node.capabilities or {},
"ports": node.ports or {},
"client_version": node.client_version,
"first_registered": node.first_registered.isoformat(),
"last_heartbeat": node.last_heartbeat.isoformat(),
"registration_metadata": node.registration_metadata or {}
}
except HTTPException:
raise
except Exception as e:
logger.error(f"Failed to get node details for {node_id}: {e}")
raise HTTPException(status_code=500, detail="Failed to retrieve node details")
@router.delete("/cluster/nodes/{node_id}")
async def remove_node(node_id: str) -> Dict[str, Any]:
"""
Remove a node from the cluster.
This will unregister the node and stop accepting its heartbeats.
The node will need to re-register to rejoin the cluster.
"""
try:
success = await cluster_registration_service.remove_node(node_id)
if not success:
raise HTTPException(status_code=404, detail="Node not found")
return {
"node_id": node_id,
"status": "removed",
"message": "Node successfully removed from cluster"
}
except HTTPException:
raise
except Exception as e:
logger.error(f"Failed to remove node {node_id}: {e}")
raise HTTPException(status_code=500, detail="Failed to remove node")
# Token management endpoints
@router.post("/cluster/tokens")
async def create_cluster_token(token_request: TokenCreateRequest) -> Dict[str, Any]:
"""
Create a new cluster registration token.
Tokens are used by Bzzz clients to authenticate and register with the cluster.
Only administrators should have access to this endpoint.
"""
try:
# For now, use a default admin user ID
# TODO: Extract from JWT token or session
admin_user_id = "admin" # This should come from authentication
token = await cluster_registration_service.generate_cluster_token(
description=token_request.description,
created_by_user_id=admin_user_id,
expires_in_days=token_request.expires_in_days,
max_registrations=token_request.max_registrations,
allowed_ip_ranges=token_request.allowed_ip_ranges
)
return {
"id": token.id,
"token": token.token,
"description": token.description,
"created_at": token.created_at.isoformat(),
"expires_at": token.expires_at.isoformat() if token.expires_at else None,
"is_active": token.is_active,
"max_registrations": token.max_registrations,
"current_registrations": token.current_registrations,
"allowed_ip_ranges": token.allowed_ip_ranges
}
except Exception as e:
logger.error(f"Failed to create cluster token: {e}")
raise HTTPException(status_code=500, detail="Failed to create token")
@router.get("/cluster/tokens")
async def list_cluster_tokens() -> Dict[str, Any]:
"""
List all cluster registration tokens.
Returns information about all tokens including their usage statistics.
Only administrators should have access to this endpoint.
"""
try:
tokens = await cluster_registration_service.list_tokens()
tokens_data = []
for token in tokens:
tokens_data.append({
"id": token.id,
"token": token.token[:20] + "..." if len(token.token) > 20 else token.token, # Partial token for security
"description": token.description,
"created_at": token.created_at.isoformat(),
"expires_at": token.expires_at.isoformat() if token.expires_at else None,
"is_active": token.is_active,
"max_registrations": token.max_registrations,
"current_registrations": token.current_registrations,
"allowed_ip_ranges": token.allowed_ip_ranges
})
return {
"tokens": tokens_data,
"total_count": len(tokens_data)
}
except Exception as e:
logger.error(f"Failed to list cluster tokens: {e}")
raise HTTPException(status_code=500, detail="Failed to list tokens")
@router.delete("/cluster/tokens/{token}")
async def revoke_cluster_token(token: str) -> Dict[str, Any]:
"""
Revoke a cluster registration token.
This will prevent new registrations using this token, but won't affect
nodes that are already registered.
"""
try:
success = await cluster_registration_service.revoke_token(token)
if not success:
raise HTTPException(status_code=404, detail="Token not found")
return {
"token": token[:20] + "..." if len(token) > 20 else token,
"status": "revoked",
"message": "Token successfully revoked"
}
except HTTPException:
raise
except Exception as e:
logger.error(f"Failed to revoke token {token}: {e}")
raise HTTPException(status_code=500, detail="Failed to revoke token")
# Cluster statistics and monitoring
@router.get("/cluster/statistics")
async def get_cluster_statistics() -> Dict[str, Any]:
"""
Get cluster health and usage statistics.
Returns information about node counts, token usage, and overall cluster health.
"""
try:
stats = await cluster_registration_service.get_cluster_statistics()
return stats
except Exception as e:
logger.error(f"Failed to get cluster statistics: {e}")
raise HTTPException(status_code=500, detail="Failed to retrieve cluster statistics")
# Maintenance endpoints
@router.post("/cluster/maintenance/cleanup-offline")
async def cleanup_offline_nodes(offline_threshold_minutes: int = 10) -> Dict[str, Any]:
"""
Mark nodes as offline if they haven't sent heartbeats recently.
This maintenance endpoint should be called periodically to keep
the cluster status accurate.
"""
try:
count = await cluster_registration_service.cleanup_offline_nodes(offline_threshold_minutes)
return {
"nodes_marked_offline": count,
"threshold_minutes": offline_threshold_minutes,
"message": f"Marked {count} nodes as offline"
}
except Exception as e:
logger.error(f"Failed to cleanup offline nodes: {e}")
raise HTTPException(status_code=500, detail="Failed to cleanup offline nodes")
@router.post("/cluster/maintenance/cleanup-heartbeats")
async def cleanup_old_heartbeats(retention_days: int = 30) -> Dict[str, Any]:
"""
Remove old heartbeat data to manage database size.
This maintenance endpoint should be called periodically to prevent
the heartbeat table from growing too large.
"""
try:
count = await cluster_registration_service.cleanup_old_heartbeats(retention_days)
return {
"heartbeats_deleted": count,
"retention_days": retention_days,
"message": f"Deleted {count} old heartbeat records"
}
except Exception as e:
logger.error(f"Failed to cleanup old heartbeats: {e}")
raise HTTPException(status_code=500, detail="Failed to cleanup old heartbeats")
# Health check endpoint
@router.get("/cluster/health")
async def cluster_registration_health() -> Dict[str, Any]:
"""
Health check for the cluster registration system.
"""
try:
# Test database connection
stats = await cluster_registration_service.get_cluster_statistics()
return {
"status": "healthy",
"database_connected": True,
"cluster_health": stats.get("cluster_health", {}),
"timestamp": stats.get("last_updated")
}
except Exception as e:
logger.error(f"Cluster registration health check failed: {e}")
return {
"status": "unhealthy",
"database_connected": False,
"error": str(e),
"timestamp": None
}

474
backend/app/api/feedback.py Normal file
View File

@@ -0,0 +1,474 @@
"""
Context Feedback API endpoints for RL Context Curator integration
"""
from fastapi import APIRouter, Depends, HTTPException, BackgroundTasks
from sqlalchemy.orm import Session
from typing import List, Optional, Dict, Any
from datetime import datetime, timedelta
from pydantic import BaseModel, Field
from ..core.database import get_db
from ..models.context_feedback import ContextFeedback, AgentPermissions, PromotionRuleHistory
from ..models.task import Task
from ..models.agent import Agent
from ..services.auth import get_current_user
from ..models.responses import StatusResponse
router = APIRouter(prefix="/api/feedback", tags=["Context Feedback"])
# Pydantic models for API
class ContextFeedbackRequest(BaseModel):
"""Request model for context feedback"""
context_id: str = Field(..., description="HCFS context ID")
feedback_type: str = Field(..., description="Type of feedback: upvote, downvote, forgetfulness, task_success, task_failure")
confidence: float = Field(..., ge=0.0, le=1.0, description="Confidence in feedback")
reason: Optional[str] = Field(None, description="Optional reason for feedback")
usage_context: Optional[str] = Field(None, description="Context of usage")
directory_scope: Optional[str] = Field(None, description="Directory where context was used")
task_type: Optional[str] = Field(None, description="Type of task being performed")
class TaskOutcomeFeedbackRequest(BaseModel):
"""Request model for task outcome feedback"""
task_id: str = Field(..., description="Task ID")
outcome: str = Field(..., description="Task outcome: completed, failed, abandoned")
completion_time: Optional[int] = Field(None, description="Time to complete in seconds")
errors_encountered: int = Field(0, description="Number of errors during execution")
follow_up_questions: int = Field(0, description="Number of follow-up questions")
context_used: Optional[List[str]] = Field(None, description="Context IDs used in task")
context_relevance_score: Optional[float] = Field(None, ge=0.0, le=1.0, description="Average relevance of used context")
outcome_confidence: Optional[float] = Field(None, ge=0.0, le=1.0, description="Confidence in outcome classification")
class AgentPermissionsRequest(BaseModel):
"""Request model for agent permissions"""
agent_id: str = Field(..., description="Agent ID")
role: str = Field(..., description="Agent role")
directory_patterns: List[str] = Field(..., description="Directory patterns for this role")
task_types: List[str] = Field(..., description="Task types this agent can handle")
context_weight: float = Field(1.0, ge=0.1, le=2.0, description="Weight for context relevance")
class ContextFeedbackResponse(BaseModel):
"""Response model for context feedback"""
id: int
context_id: str
agent_id: str
task_id: Optional[str]
feedback_type: str
role: str
confidence: float
reason: Optional[str]
usage_context: Optional[str]
directory_scope: Optional[str]
task_type: Optional[str]
timestamp: datetime
class FeedbackStatsResponse(BaseModel):
"""Response model for feedback statistics"""
total_feedback: int
feedback_by_type: Dict[str, int]
feedback_by_role: Dict[str, int]
average_confidence: float
recent_feedback_count: int
top_contexts: List[Dict[str, Any]]
@router.post("/context/{context_id}", response_model=StatusResponse)
async def submit_context_feedback(
context_id: str,
request: ContextFeedbackRequest,
background_tasks: BackgroundTasks,
db: Session = Depends(get_db),
current_user: dict = Depends(get_current_user)
):
"""
Submit feedback for a specific context
"""
try:
# Get agent information
agent = db.query(Agent).filter(Agent.id == current_user.get("agent_id", "unknown")).first()
if not agent:
raise HTTPException(status_code=404, detail="Agent not found")
# Validate feedback type
valid_types = ["upvote", "downvote", "forgetfulness", "task_success", "task_failure"]
if request.feedback_type not in valid_types:
raise HTTPException(status_code=400, detail=f"Invalid feedback type. Must be one of: {valid_types}")
# Create feedback record
feedback = ContextFeedback(
context_id=request.context_id,
agent_id=agent.id,
feedback_type=request.feedback_type,
role=agent.role if agent.role else "general",
confidence=request.confidence,
reason=request.reason,
usage_context=request.usage_context,
directory_scope=request.directory_scope,
task_type=request.task_type
)
db.add(feedback)
db.commit()
db.refresh(feedback)
# Send feedback to RL Context Curator in background
background_tasks.add_task(
send_feedback_to_rl_curator,
feedback.id,
request.context_id,
request.feedback_type,
agent.id,
agent.role if agent.role else "general",
request.confidence
)
return StatusResponse(
status="success",
message="Context feedback submitted successfully",
data={"feedback_id": feedback.id, "context_id": request.context_id}
)
except Exception as e:
db.rollback()
raise HTTPException(status_code=500, detail=f"Failed to submit feedback: {str(e)}")
@router.post("/task-outcome/{task_id}", response_model=StatusResponse)
async def submit_task_outcome_feedback(
task_id: str,
request: TaskOutcomeFeedbackRequest,
background_tasks: BackgroundTasks,
db: Session = Depends(get_db),
current_user: dict = Depends(get_current_user)
):
"""
Submit task outcome feedback for RL learning
"""
try:
# Get task
task = db.query(Task).filter(Task.id == task_id).first()
if not task:
raise HTTPException(status_code=404, detail="Task not found")
# Update task with outcome metrics
task.task_outcome = request.outcome
task.completion_time = request.completion_time
task.errors_encountered = request.errors_encountered
task.follow_up_questions = request.follow_up_questions
task.context_relevance_score = request.context_relevance_score
task.outcome_confidence = request.outcome_confidence
task.feedback_collected = True
if request.context_used:
task.context_used = request.context_used
if request.outcome in ["completed", "failed", "abandoned"] and not task.completed_at:
task.completed_at = datetime.utcnow()
# Calculate success rate
if request.outcome == "completed":
task.success_rate = 1.0 - (request.errors_encountered * 0.1) # Simple calculation
task.success_rate = max(0.0, min(1.0, task.success_rate))
else:
task.success_rate = 0.0
db.commit()
# Create feedback events for used contexts
if request.context_used and task.assigned_agent_id:
agent = db.query(Agent).filter(Agent.id == task.assigned_agent_id).first()
if agent:
feedback_type = "task_success" if request.outcome == "completed" else "task_failure"
for context_id in request.context_used:
feedback = ContextFeedback(
context_id=context_id,
agent_id=agent.id,
task_id=task.id,
feedback_type=feedback_type,
role=agent.role if agent.role else "general",
confidence=request.outcome_confidence or 0.8,
reason=f"Task {request.outcome}",
usage_context=f"task_execution_{request.outcome}",
task_type=request.task_type
)
db.add(feedback)
db.commit()
return StatusResponse(
status="success",
message="Task outcome feedback submitted successfully",
data={"task_id": task_id, "outcome": request.outcome}
)
except Exception as e:
db.rollback()
raise HTTPException(status_code=500, detail=f"Failed to submit task outcome: {str(e)}")
@router.get("/stats", response_model=FeedbackStatsResponse)
async def get_feedback_stats(
days: int = 7,
role: Optional[str] = None,
db: Session = Depends(get_db),
current_user: dict = Depends(get_current_user)
):
"""
Get feedback statistics for analysis
"""
try:
# Base query
query = db.query(ContextFeedback)
# Filter by date range
if days > 0:
since_date = datetime.utcnow() - timedelta(days=days)
query = query.filter(ContextFeedback.timestamp >= since_date)
# Filter by role if specified
if role:
query = query.filter(ContextFeedback.role == role)
feedback_records = query.all()
# Calculate statistics
total_feedback = len(feedback_records)
feedback_by_type = {}
feedback_by_role = {}
confidence_values = []
context_usage = {}
for feedback in feedback_records:
# Count by type
feedback_by_type[feedback.feedback_type] = feedback_by_type.get(feedback.feedback_type, 0) + 1
# Count by role
feedback_by_role[feedback.role] = feedback_by_role.get(feedback.role, 0) + 1
# Collect confidence values
confidence_values.append(feedback.confidence)
# Count context usage
context_usage[feedback.context_id] = context_usage.get(feedback.context_id, 0) + 1
# Calculate average confidence
average_confidence = sum(confidence_values) / len(confidence_values) if confidence_values else 0.0
# Get recent feedback count (last 24 hours)
recent_since = datetime.utcnow() - timedelta(days=1)
recent_count = db.query(ContextFeedback).filter(
ContextFeedback.timestamp >= recent_since
).count()
# Get top contexts by usage
top_contexts = [
{"context_id": ctx_id, "usage_count": count}
for ctx_id, count in sorted(context_usage.items(), key=lambda x: x[1], reverse=True)[:10]
]
return FeedbackStatsResponse(
total_feedback=total_feedback,
feedback_by_type=feedback_by_type,
feedback_by_role=feedback_by_role,
average_confidence=average_confidence,
recent_feedback_count=recent_count,
top_contexts=top_contexts
)
except Exception as e:
raise HTTPException(status_code=500, detail=f"Failed to get feedback stats: {str(e)}")
@router.get("/recent", response_model=List[ContextFeedbackResponse])
async def get_recent_feedback(
limit: int = 50,
feedback_type: Optional[str] = None,
role: Optional[str] = None,
db: Session = Depends(get_db),
current_user: dict = Depends(get_current_user)
):
"""
Get recent feedback events
"""
try:
query = db.query(ContextFeedback).order_by(ContextFeedback.timestamp.desc())
if feedback_type:
query = query.filter(ContextFeedback.feedback_type == feedback_type)
if role:
query = query.filter(ContextFeedback.role == role)
feedback_records = query.limit(limit).all()
return [
ContextFeedbackResponse(
id=fb.id,
context_id=fb.context_id,
agent_id=fb.agent_id,
task_id=str(fb.task_id) if fb.task_id else None,
feedback_type=fb.feedback_type,
role=fb.role,
confidence=fb.confidence,
reason=fb.reason,
usage_context=fb.usage_context,
directory_scope=fb.directory_scope,
task_type=fb.task_type,
timestamp=fb.timestamp
)
for fb in feedback_records
]
except Exception as e:
raise HTTPException(status_code=500, detail=f"Failed to get recent feedback: {str(e)}")
@router.post("/agent-permissions", response_model=StatusResponse)
async def set_agent_permissions(
request: AgentPermissionsRequest,
db: Session = Depends(get_db),
current_user: dict = Depends(get_current_user)
):
"""
Set or update agent permissions for context filtering
"""
try:
# Check if permissions already exist
existing = db.query(AgentPermissions).filter(
AgentPermissions.agent_id == request.agent_id,
AgentPermissions.role == request.role
).first()
if existing:
# Update existing permissions
existing.directory_patterns = ",".join(request.directory_patterns)
existing.task_types = ",".join(request.task_types)
existing.context_weight = request.context_weight
existing.updated_at = datetime.utcnow()
else:
# Create new permissions
permissions = AgentPermissions(
agent_id=request.agent_id,
role=request.role,
directory_patterns=",".join(request.directory_patterns),
task_types=",".join(request.task_types),
context_weight=request.context_weight
)
db.add(permissions)
db.commit()
return StatusResponse(
status="success",
message="Agent permissions updated successfully",
data={"agent_id": request.agent_id, "role": request.role}
)
except Exception as e:
db.rollback()
raise HTTPException(status_code=500, detail=f"Failed to set agent permissions: {str(e)}")
@router.get("/agent-permissions/{agent_id}")
async def get_agent_permissions(
agent_id: str,
db: Session = Depends(get_db),
current_user: dict = Depends(get_current_user)
):
"""
Get agent permissions for context filtering
"""
try:
permissions = db.query(AgentPermissions).filter(
AgentPermissions.agent_id == agent_id,
AgentPermissions.active == "true"
).all()
return [
{
"id": perm.id,
"agent_id": perm.agent_id,
"role": perm.role,
"directory_patterns": perm.directory_patterns.split(",") if perm.directory_patterns else [],
"task_types": perm.task_types.split(",") if perm.task_types else [],
"context_weight": perm.context_weight,
"created_at": perm.created_at,
"updated_at": perm.updated_at
}
for perm in permissions
]
except Exception as e:
raise HTTPException(status_code=500, detail=f"Failed to get agent permissions: {str(e)}")
async def send_feedback_to_rl_curator(
feedback_id: int,
context_id: str,
feedback_type: str,
agent_id: str,
role: str,
confidence: float
):
"""
Background task to send feedback to RL Context Curator
"""
try:
import httpx
import json
from datetime import datetime
# Prepare feedback event in Bzzz format
feedback_event = {
"bzzz_type": "feedback_event",
"timestamp": datetime.utcnow().isoformat(),
"origin": {
"node_id": "hive",
"agent_id": agent_id,
"task_id": f"hive-feedback-{feedback_id}",
"workspace": "hive://context-feedback",
"directory": "/feedback/"
},
"feedback": {
"type": feedback_type,
"category": "general", # Could be enhanced with category detection
"role": role,
"context_id": context_id,
"reason": f"Feedback from Hive agent {agent_id}",
"confidence": confidence,
"usage_context": "hive_platform"
},
"task_outcome": {
"completed": feedback_type in ["upvote", "task_success"],
"completion_time": 0,
"errors_encountered": 0,
"follow_up_questions": 0
}
}
# Send to HCFS RL Tuner Service
async with httpx.AsyncClient() as client:
try:
response = await client.post(
"http://localhost:8001/api/feedback",
json=feedback_event,
timeout=10.0
)
if response.status_code == 200:
print(f"✅ Feedback sent to RL Curator: {feedback_id}")
else:
print(f"⚠️ RL Curator responded with status {response.status_code}")
except httpx.ConnectError:
print(f"⚠️ Could not connect to RL Curator service (feedback {feedback_id})")
except Exception as e:
print(f"❌ Error sending feedback to RL Curator: {e}")
except Exception as e:
print(f"❌ Background feedback task failed: {e}")

View File

@@ -47,6 +47,37 @@ async def get_project_tasks(project_id: str, current_user: Dict[str, Any] = Depe
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@router.put("/projects/{project_id}")
async def update_project(project_id: str, project_data: Dict[str, Any], current_user: Dict[str, Any] = Depends(get_current_user_context)) -> Dict[str, Any]:
"""Update a project configuration."""
try:
updated_project = project_service.update_project(project_id, project_data)
if not updated_project:
raise HTTPException(status_code=404, detail="Project not found")
return updated_project
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@router.post("/projects")
async def create_project(project_data: Dict[str, Any], current_user: Dict[str, Any] = Depends(get_current_user_context)) -> Dict[str, Any]:
"""Create a new project."""
try:
new_project = project_service.create_project(project_data)
return new_project
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@router.delete("/projects/{project_id}")
async def delete_project(project_id: str, current_user: Dict[str, Any] = Depends(get_current_user_context)) -> Dict[str, Any]:
"""Delete a project."""
try:
result = project_service.delete_project(project_id)
if not result:
raise HTTPException(status_code=404, detail="Project not found")
return {"success": True, "message": "Project deleted successfully"}
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
# === Bzzz Integration Endpoints ===
@bzzz_router.get("/active-repos")

View File

@@ -11,7 +11,7 @@ from typing import Dict, Any, Optional
from dataclasses import asdict
# Add CCLI source to path
ccli_path = os.path.join(os.path.dirname(__file__), '../../../ccli_src')
ccli_path = os.path.join(os.path.dirname(__file__), '../../ccli_src')
sys.path.insert(0, ccli_path)
from agents.gemini_cli_agent import GeminiCliAgent, GeminiCliConfig, TaskRequest as CliTaskRequest, TaskResult as CliTaskResult

View File

@@ -273,7 +273,6 @@ def create_token_response(user_id: int, user_data: Dict[str, Any]) -> Dict[str,
"refresh_token": refresh_token,
"token_type": "bearer",
"expires_in": ACCESS_TOKEN_EXPIRE_MINUTES * 60, # seconds
"user": user_data,
}

View File

@@ -174,6 +174,10 @@ app = FastAPI(
"name": "cluster",
"description": "Cluster-wide operations and coordination"
},
{
"name": "cluster-registration",
"description": "Dynamic cluster node registration and management"
},
{
"name": "distributed-workflows",
"description": "Advanced distributed workflow management"
@@ -206,7 +210,7 @@ def get_coordinator() -> UnifiedCoordinator:
return unified_coordinator
# Import API routers
from .api import agents, workflows, executions, monitoring, projects, tasks, cluster, distributed_workflows, cli_agents, auth
from .api import agents, workflows, executions, monitoring, projects, tasks, cluster, distributed_workflows, cli_agents, auth, bzzz_logs, cluster_registration
# Import error handlers and response models
from .core.error_handlers import (
@@ -239,8 +243,10 @@ app.include_router(projects.router, prefix="/api", tags=["projects"])
app.include_router(projects.bzzz_router, prefix="/api", tags=["bzzz-integration"])
app.include_router(tasks.router, prefix="/api", tags=["tasks"])
app.include_router(cluster.router, prefix="/api", tags=["cluster"])
app.include_router(cluster_registration.router, prefix="/api", tags=["cluster-registration"])
app.include_router(distributed_workflows.router, tags=["distributed-workflows"])
app.include_router(cli_agents.router, tags=["cli-agents"])
app.include_router(bzzz_logs.router, prefix="/api", tags=["bzzz-logs"])
# Override dependency functions in API modules with our coordinator instance
agents.get_coordinator = get_coordinator
@@ -528,16 +534,6 @@ async def root():
# Removed duplicate /health endpoint - using the enhanced one above
@app.get("/api/health", response_model=None)
async def health_check():
"""Simple health check endpoint"""
return {
"status": "healthy",
"timestamp": datetime.now().isoformat(),
"version": "1.0.0",
"message": "Hive API is operational"
}
@app.get("/api/status")
async def get_system_status():
"""Get comprehensive system status"""

View File

@@ -2,4 +2,5 @@ from . import agent
from . import agent_role
from . import project
from . import task
from . import context_feedback
from . import sqlalchemy_models

View File

@@ -34,6 +34,8 @@ class Agent(Base):
# Relationships
tasks = relationship("Task", back_populates="assigned_agent")
context_feedback = relationship("ContextFeedback", back_populates="agent")
permissions = relationship("AgentPermissions", back_populates="agent")
def to_dict(self):
return {

View File

@@ -0,0 +1,85 @@
"""
Context Feedback model for RL Context Curator integration
"""
from sqlalchemy import Column, String, Text, Integer, DateTime, ForeignKey, UUID as SqlUUID, Float
from sqlalchemy.sql import func
from sqlalchemy.orm import relationship
from ..core.database import Base
import uuid
class ContextFeedback(Base):
__tablename__ = "context_feedback"
# Primary identification
id = Column(Integer, primary_key=True, index=True)
# Context and agent information
context_id = Column(String(255), nullable=False, index=True) # HCFS context ID
agent_id = Column(String(255), ForeignKey("agents.id"), nullable=False)
task_id = Column(SqlUUID(as_uuid=True), ForeignKey("tasks.id"), nullable=True)
# Feedback details
feedback_type = Column(String(50), nullable=False) # upvote, downvote, forgetfulness, task_success, task_failure
role = Column(String(100), nullable=False) # Agent role when feedback was given
confidence = Column(Float, nullable=False) # Confidence in feedback (0.0 to 1.0)
reason = Column(Text, nullable=True) # Optional reason for feedback
usage_context = Column(String(255), nullable=True) # Context of usage (debugging, coding, etc.)
# Additional metadata
directory_scope = Column(String(500), nullable=True) # Directory where context was used
task_type = Column(String(100), nullable=True) # Type of task being performed
# Timestamps
timestamp = Column(DateTime(timezone=True), server_default=func.now())
# Relationships
agent = relationship("Agent", back_populates="context_feedback")
task = relationship("Task", backref="context_feedback")
class AgentPermissions(Base):
__tablename__ = "agent_permissions"
# Primary identification
id = Column(Integer, primary_key=True, index=True)
# Agent and role information
agent_id = Column(String(255), ForeignKey("agents.id"), nullable=False, index=True)
role = Column(String(100), nullable=False)
# Permission details
directory_patterns = Column(Text, nullable=True) # JSON array of path patterns
task_types = Column(Text, nullable=True) # JSON array of allowed task types
context_weight = Column(Float, default=1.0) # Weight for context relevance
# Status
active = Column(String(10), default='true') # String to match existing boolean patterns
# Timestamps
created_at = Column(DateTime(timezone=True), server_default=func.now())
updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now())
# Relationships
agent = relationship("Agent", back_populates="permissions")
class PromotionRuleHistory(Base):
__tablename__ = "promotion_rule_history"
# Primary identification
id = Column(Integer, primary_key=True, index=True)
# Rule information
rule_version = Column(String(50), nullable=False)
category = Column(String(100), nullable=False)
role = Column(String(100), nullable=False)
weight_value = Column(Float, nullable=False)
# Change information
change_reason = Column(Text, nullable=True)
previous_value = Column(Float, nullable=True)
# Timestamps
timestamp = Column(DateTime(timezone=True), server_default=func.now())

View File

@@ -2,7 +2,7 @@
Task model for SQLAlchemy ORM
"""
from sqlalchemy import Column, String, Text, Integer, DateTime, ForeignKey, UUID as SqlUUID
from sqlalchemy import Column, String, Text, Integer, DateTime, ForeignKey, UUID as SqlUUID, Float, Boolean
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.sql import func
from sqlalchemy.orm import relationship
@@ -30,6 +30,17 @@ class Task(Base):
# Task metadata (includes context and payload)
task_metadata = Column("metadata", JSONB, nullable=True)
# RL Context Curator outcome tracking fields
completion_time = Column(Integer, nullable=True) # Time to complete in seconds
errors_encountered = Column(Integer, default=0) # Number of errors during execution
follow_up_questions = Column(Integer, default=0) # Number of follow-up questions
success_rate = Column(Float, nullable=True) # Success rate (0.0 to 1.0)
context_used = Column(JSONB, nullable=True) # Context IDs used in this task
context_relevance_score = Column(Float, nullable=True) # Average relevance of used context
feedback_collected = Column(Boolean, default=False) # Whether feedback was collected
task_outcome = Column(String(50), nullable=True) # completed, failed, abandoned
outcome_confidence = Column(Float, nullable=True) # Confidence in outcome classification
# Timestamps
created_at = Column(DateTime(timezone=True), server_default=func.now())
started_at = Column(DateTime(timezone=True), nullable=True)

View File

@@ -0,0 +1,522 @@
"""
Cluster Registration Service
Handles registration-based cluster management for Hive-Bzzz integration.
"""
import asyncpg
import secrets
import json
import socket
from datetime import datetime, timedelta
from typing import Dict, List, Optional, Any
from dataclasses import dataclass
from ipaddress import IPv4Network, IPv6Network, ip_address
import logging
logger = logging.getLogger(__name__)
@dataclass
class ClusterToken:
id: int
token: str
description: str
created_at: datetime
expires_at: Optional[datetime]
is_active: bool
max_registrations: Optional[int]
current_registrations: int
allowed_ip_ranges: Optional[List[str]]
@dataclass
class ClusterNode:
id: int
node_id: str
hostname: str
ip_address: str
registration_token: str
cpu_info: Optional[Dict[str, Any]]
memory_info: Optional[Dict[str, Any]]
gpu_info: Optional[Dict[str, Any]]
disk_info: Optional[Dict[str, Any]]
os_info: Optional[Dict[str, Any]]
platform_info: Optional[Dict[str, Any]]
status: str
last_heartbeat: datetime
first_registered: datetime
services: Optional[Dict[str, Any]]
capabilities: Optional[Dict[str, Any]]
ports: Optional[Dict[str, Any]]
client_version: Optional[str]
registration_metadata: Optional[Dict[str, Any]]
@dataclass
class RegistrationRequest:
token: str
node_id: str
hostname: str
ip_address: str
system_info: Dict[str, Any]
client_version: Optional[str] = None
services: Optional[Dict[str, Any]] = None
capabilities: Optional[Dict[str, Any]] = None
ports: Optional[Dict[str, Any]] = None
metadata: Optional[Dict[str, Any]] = None
@dataclass
class HeartbeatRequest:
node_id: str
status: str = "online"
cpu_usage: Optional[float] = None
memory_usage: Optional[float] = None
disk_usage: Optional[float] = None
gpu_usage: Optional[float] = None
services_status: Optional[Dict[str, Any]] = None
network_metrics: Optional[Dict[str, Any]] = None
custom_metrics: Optional[Dict[str, Any]] = None
class ClusterRegistrationService:
def __init__(self, database_url: str):
self.database_url = database_url
self._conn_cache = None
async def get_connection(self) -> asyncpg.Connection:
"""Get database connection with caching."""
if not self._conn_cache or self._conn_cache.is_closed():
try:
self._conn_cache = await asyncpg.connect(self.database_url)
except Exception as e:
logger.error(f"Failed to connect to database: {e}")
raise
return self._conn_cache
async def close_connection(self):
"""Close database connection."""
if self._conn_cache and not self._conn_cache.is_closed():
await self._conn_cache.close()
# Token Management
async def generate_cluster_token(
self,
description: str,
created_by_user_id: str,
expires_in_days: Optional[int] = None,
max_registrations: Optional[int] = None,
allowed_ip_ranges: Optional[List[str]] = None
) -> ClusterToken:
"""Generate a new cluster registration token."""
conn = await self.get_connection()
# Generate secure token
token = f"hive_cluster_{secrets.token_urlsafe(32)}"
expires_at = datetime.now() + timedelta(days=expires_in_days) if expires_in_days else None
try:
result = await conn.fetchrow("""
INSERT INTO cluster_tokens (
token, description, created_by, expires_at,
max_registrations, allowed_ip_ranges
) VALUES ($1, $2, $3, $4, $5, $6)
RETURNING id, token, description, created_at, expires_at,
is_active, max_registrations, current_registrations, allowed_ip_ranges
""", token, description, created_by_user_id, expires_at, max_registrations, allowed_ip_ranges)
return ClusterToken(**dict(result))
except Exception as e:
logger.error(f"Failed to generate cluster token: {e}")
raise
async def validate_token(self, token: str, client_ip: str) -> Optional[ClusterToken]:
"""Validate a cluster registration token."""
conn = await self.get_connection()
try:
result = await conn.fetchrow("""
SELECT id, token, description, created_at, expires_at,
is_active, max_registrations, current_registrations, allowed_ip_ranges
FROM cluster_tokens
WHERE token = $1 AND is_active = true
""", token)
if not result:
return None
cluster_token = ClusterToken(**dict(result))
# Check expiration
if cluster_token.expires_at and datetime.now() > cluster_token.expires_at:
logger.warning(f"Token {token[:20]}... has expired")
return None
# Check registration limit
if (cluster_token.max_registrations and
cluster_token.current_registrations >= cluster_token.max_registrations):
logger.warning(f"Token {token[:20]}... has reached registration limit")
return None
# Check IP restrictions
if cluster_token.allowed_ip_ranges:
client_ip_obj = ip_address(client_ip)
allowed = False
for ip_range in cluster_token.allowed_ip_ranges:
try:
network = IPv4Network(ip_range, strict=False) if ':' not in ip_range else IPv6Network(ip_range, strict=False)
if client_ip_obj in network:
allowed = True
break
except Exception as e:
logger.warning(f"Invalid IP range {ip_range}: {e}")
if not allowed:
logger.warning(f"IP {client_ip} not allowed for token {token[:20]}...")
return None
return cluster_token
except Exception as e:
logger.error(f"Failed to validate token: {e}")
return None
async def list_tokens(self) -> List[ClusterToken]:
"""List all cluster tokens."""
conn = await self.get_connection()
try:
results = await conn.fetch("""
SELECT id, token, description, created_at, expires_at,
is_active, max_registrations, current_registrations, allowed_ip_ranges
FROM cluster_tokens
ORDER BY created_at DESC
""")
return [ClusterToken(**dict(result)) for result in results]
except Exception as e:
logger.error(f"Failed to list tokens: {e}")
raise
async def revoke_token(self, token: str) -> bool:
"""Revoke a cluster token."""
conn = await self.get_connection()
try:
result = await conn.execute("""
UPDATE cluster_tokens
SET is_active = false
WHERE token = $1
""", token)
return result != "UPDATE 0"
except Exception as e:
logger.error(f"Failed to revoke token: {e}")
return False
# Node Registration
async def register_node(self, request: RegistrationRequest, client_ip: str) -> Dict[str, Any]:
"""Register a new cluster node."""
conn = await self.get_connection()
# Log registration attempt
await self._log_registration_attempt(
client_ip, request.token, request.node_id,
request.hostname, True, None, request.metadata
)
try:
# Validate token
token_info = await self.validate_token(request.token, client_ip)
if not token_info:
await self._log_registration_attempt(
client_ip, request.token, request.node_id,
request.hostname, False, "Invalid or expired token", request.metadata
)
raise ValueError("Invalid or expired registration token")
# Extract system info components
system_info = request.system_info or {}
cpu_info = system_info.get('cpu', {})
memory_info = system_info.get('memory', {})
gpu_info = system_info.get('gpu', {})
disk_info = system_info.get('disk', {})
os_info = system_info.get('os', {})
platform_info = system_info.get('platform', {})
# Register or update node
result = await conn.fetchrow("""
INSERT INTO cluster_nodes (
node_id, hostname, ip_address, registration_token,
cpu_info, memory_info, gpu_info, disk_info, os_info, platform_info,
services, capabilities, ports, client_version, registration_metadata
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, $15)
ON CONFLICT (node_id) DO UPDATE SET
hostname = EXCLUDED.hostname,
ip_address = EXCLUDED.ip_address,
cpu_info = EXCLUDED.cpu_info,
memory_info = EXCLUDED.memory_info,
gpu_info = EXCLUDED.gpu_info,
disk_info = EXCLUDED.disk_info,
os_info = EXCLUDED.os_info,
platform_info = EXCLUDED.platform_info,
services = EXCLUDED.services,
capabilities = EXCLUDED.capabilities,
ports = EXCLUDED.ports,
client_version = EXCLUDED.client_version,
registration_metadata = EXCLUDED.registration_metadata,
status = 'online',
last_heartbeat = NOW()
RETURNING id, node_id, hostname, ip_address, first_registered
""",
request.node_id, request.hostname, request.ip_address, request.token,
json.dumps(cpu_info) if cpu_info else None,
json.dumps(memory_info) if memory_info else None,
json.dumps(gpu_info) if gpu_info else None,
json.dumps(disk_info) if disk_info else None,
json.dumps(os_info) if os_info else None,
json.dumps(platform_info) if platform_info else None,
json.dumps(request.services) if request.services else None,
json.dumps(request.capabilities) if request.capabilities else None,
json.dumps(request.ports) if request.ports else None,
request.client_version,
json.dumps(request.metadata) if request.metadata else None
)
logger.info(f"Node {request.node_id} registered successfully from {client_ip}")
return {
"node_id": result["node_id"],
"registration_status": "success",
"heartbeat_interval": 30, # seconds
"registered_at": result["first_registered"].isoformat(),
"cluster_info": {
"coordinator_version": "1.0.0",
"features": ["heartbeat", "dynamic_scaling", "service_discovery"]
}
}
except Exception as e:
logger.error(f"Failed to register node {request.node_id}: {e}")
await self._log_registration_attempt(
client_ip, request.token, request.node_id,
request.hostname, False, str(e), request.metadata
)
raise
async def update_heartbeat(self, request: HeartbeatRequest) -> Dict[str, Any]:
"""Update node heartbeat and metrics."""
conn = await self.get_connection()
try:
# Update node status and heartbeat
result = await conn.fetchrow("""
UPDATE cluster_nodes
SET status = $2, last_heartbeat = NOW()
WHERE node_id = $1
RETURNING node_id, status, last_heartbeat
""", request.node_id, request.status)
if not result:
raise ValueError(f"Node {request.node_id} not found")
# Record heartbeat metrics
await conn.execute("""
INSERT INTO node_heartbeats (
node_id, cpu_usage, memory_usage, disk_usage, gpu_usage,
services_status, network_metrics, custom_metrics
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
""",
request.node_id, request.cpu_usage, request.memory_usage,
request.disk_usage, request.gpu_usage,
json.dumps(request.services_status) if request.services_status else None,
json.dumps(request.network_metrics) if request.network_metrics else None,
json.dumps(request.custom_metrics) if request.custom_metrics else None
)
return {
"node_id": result["node_id"],
"status": result["status"],
"heartbeat_received": result["last_heartbeat"].isoformat(),
"next_heartbeat_in": 30, # seconds
"commands": [] # Future: cluster management commands
}
except Exception as e:
logger.error(f"Failed to update heartbeat for {request.node_id}: {e}")
raise
async def get_registered_nodes(self, include_offline: bool = True) -> List[ClusterNode]:
"""Get all registered cluster nodes."""
conn = await self.get_connection()
try:
query = """
SELECT id, node_id, hostname, ip_address, registration_token,
cpu_info, memory_info, gpu_info, disk_info, os_info, platform_info,
status, last_heartbeat, first_registered,
services, capabilities, ports, client_version, registration_metadata
FROM cluster_nodes
"""
if not include_offline:
query += " WHERE status != 'offline'"
query += " ORDER BY first_registered DESC"
results = await conn.fetch(query)
nodes = []
for result in results:
node_dict = dict(result)
# Parse JSON fields
for json_field in ['cpu_info', 'memory_info', 'gpu_info', 'disk_info',
'os_info', 'platform_info', 'services', 'capabilities',
'ports', 'registration_metadata']:
if node_dict[json_field]:
try:
node_dict[json_field] = json.loads(node_dict[json_field])
except json.JSONDecodeError:
node_dict[json_field] = None
nodes.append(ClusterNode(**node_dict))
return nodes
except Exception as e:
logger.error(f"Failed to get registered nodes: {e}")
raise
async def get_node_details(self, node_id: str) -> Optional[ClusterNode]:
"""Get detailed information about a specific node."""
nodes = await self.get_registered_nodes()
return next((node for node in nodes if node.node_id == node_id), None)
async def remove_node(self, node_id: str) -> bool:
"""Remove a node from the cluster."""
conn = await self.get_connection()
try:
result = await conn.execute("""
DELETE FROM cluster_nodes WHERE node_id = $1
""", node_id)
if result != "DELETE 0":
logger.info(f"Node {node_id} removed from cluster")
return True
return False
except Exception as e:
logger.error(f"Failed to remove node {node_id}: {e}")
return False
# Maintenance and Monitoring
async def cleanup_offline_nodes(self, offline_threshold_minutes: int = 10) -> int:
"""Mark nodes as offline if they haven't sent heartbeats."""
conn = await self.get_connection()
try:
result = await conn.execute("""
UPDATE cluster_nodes
SET status = 'offline'
WHERE status = 'online'
AND last_heartbeat < NOW() - INTERVAL '%s minutes'
""" % offline_threshold_minutes)
# Extract number from result like "UPDATE 3"
count = int(result.split()[-1]) if result.split()[-1].isdigit() else 0
if count > 0:
logger.info(f"Marked {count} nodes as offline due to missing heartbeats")
return count
except Exception as e:
logger.error(f"Failed to cleanup offline nodes: {e}")
return 0
async def cleanup_old_heartbeats(self, retention_days: int = 30) -> int:
"""Remove old heartbeat data for storage management."""
conn = await self.get_connection()
try:
result = await conn.execute("""
DELETE FROM node_heartbeats
WHERE heartbeat_time < NOW() - INTERVAL '%s days'
""" % retention_days)
count = int(result.split()[-1]) if result.split()[-1].isdigit() else 0
if count > 0:
logger.info(f"Cleaned up {count} old heartbeat records")
return count
except Exception as e:
logger.error(f"Failed to cleanup old heartbeats: {e}")
return 0
async def _log_registration_attempt(
self,
ip_address: str,
token: str,
node_id: str,
hostname: str,
success: bool,
failure_reason: Optional[str] = None,
metadata: Optional[Dict[str, Any]] = None
):
"""Log registration attempts for security monitoring."""
conn = await self.get_connection()
try:
await conn.execute("""
INSERT INTO node_registration_attempts (
ip_address, token_used, node_id, hostname,
success, failure_reason, request_metadata
) VALUES ($1, $2, $3, $4, $5, $6, $7)
""", ip_address, token, node_id, hostname, success, failure_reason,
json.dumps(metadata) if metadata else None)
except Exception as e:
logger.error(f"Failed to log registration attempt: {e}")
async def get_cluster_statistics(self) -> Dict[str, Any]:
"""Get cluster statistics and health metrics."""
conn = await self.get_connection()
try:
# Node statistics
node_stats = await conn.fetchrow("""
SELECT
COUNT(*) as total_nodes,
COUNT(*) FILTER (WHERE status = 'online') as online_nodes,
COUNT(*) FILTER (WHERE status = 'offline') as offline_nodes,
COUNT(*) FILTER (WHERE status = 'maintenance') as maintenance_nodes
FROM cluster_nodes
""")
# Token statistics
token_stats = await conn.fetchrow("""
SELECT
COUNT(*) as total_tokens,
COUNT(*) FILTER (WHERE is_active = true) as active_tokens,
COUNT(*) FILTER (WHERE expires_at IS NOT NULL AND expires_at < NOW()) as expired_tokens
FROM cluster_tokens
""")
return {
"cluster_health": {
"total_nodes": node_stats["total_nodes"],
"online_nodes": node_stats["online_nodes"],
"offline_nodes": node_stats["offline_nodes"],
"maintenance_nodes": node_stats["maintenance_nodes"],
"health_percentage": (node_stats["online_nodes"] / max(node_stats["total_nodes"], 1)) * 100
},
"token_management": {
"total_tokens": token_stats["total_tokens"],
"active_tokens": token_stats["active_tokens"],
"expired_tokens": token_stats["expired_tokens"]
},
"last_updated": datetime.now().isoformat()
}
except Exception as e:
logger.error(f"Failed to get cluster statistics: {e}")
return {
"error": str(e),
"last_updated": datetime.now().isoformat()
}

View File

@@ -26,7 +26,7 @@ class ClusterService:
"ip": "192.168.1.113",
"hostname": "ironwood",
"role": "worker",
"gpu": "NVIDIA RTX 3070",
"gpu": "NVIDIA RTX 2080S",
"memory": "128GB",
"cpu": "AMD Threadripper 2920X",
"ollama_port": 11434,
@@ -57,6 +57,66 @@ class ClusterService:
self.n8n_api_base = "https://n8n.home.deepblack.cloud/api/v1"
self.n8n_api_key = self._get_n8n_api_key()
def _get_live_hardware_info(self, hostname: str, ip: str) -> Dict[str, str]:
"""Get live hardware information from a remote node via SSH."""
hardware = {
"cpu": "Unknown",
"memory": "Unknown",
"gpu": "Unknown"
}
try:
# Try to get GPU info via SSH
print(f"🔍 SSH GPU command for {hostname}: ssh tony@{ip} 'nvidia-smi || lspci | grep -i vga'")
gpu_result = subprocess.run([
"ssh", "-o", "StrictHostKeyChecking=no", "-o", "ConnectTimeout=5",
f"tony@{ip}", "nvidia-smi --query-gpu=name --format=csv,noheader,nounits || lspci | grep -i 'vga\\|3d\\|display'"
], capture_output=True, text=True, timeout=10)
print(f"📊 GPU command result for {hostname}: returncode={gpu_result.returncode}, stdout='{gpu_result.stdout.strip()}', stderr='{gpu_result.stderr.strip()}'")
if gpu_result.returncode == 0 and gpu_result.stdout.strip():
gpu_info = gpu_result.stdout.strip().split('\n')[0]
if "NVIDIA" in gpu_info or "RTX" in gpu_info or "GTX" in gpu_info:
hardware["gpu"] = gpu_info.strip()
elif "VGA" in gpu_info or "Display" in gpu_info:
# Parse lspci output for GPU info
if "NVIDIA" in gpu_info:
parts = gpu_info.split("NVIDIA")
if len(parts) > 1:
gpu_name = "NVIDIA" + parts[1].split('[')[0].strip()
hardware["gpu"] = gpu_name
elif "AMD" in gpu_info or "Radeon" in gpu_info:
parts = gpu_info.split(":")
if len(parts) > 2:
gpu_name = parts[2].strip()
hardware["gpu"] = gpu_name
# Try to get memory info via SSH
mem_result = subprocess.run([
"ssh", "-o", "StrictHostKeyChecking=no", "-o", "ConnectTimeout=5",
f"tony@{ip}", "free -h | grep '^Mem:' | awk '{print $2}'"
], capture_output=True, text=True, timeout=10)
if mem_result.returncode == 0 and mem_result.stdout.strip():
memory_info = mem_result.stdout.strip()
hardware["memory"] = memory_info
# Try to get CPU info via SSH
cpu_result = subprocess.run([
"ssh", "-o", "StrictHostKeyChecking=no", "-o", "ConnectTimeout=5",
f"tony@{ip}", "lscpu | grep 'Model name:' | cut -d':' -f2- | xargs"
], capture_output=True, text=True, timeout=10)
if cpu_result.returncode == 0 and cpu_result.stdout.strip():
cpu_info = cpu_result.stdout.strip()
hardware["cpu"] = cpu_info
except Exception as e:
print(f"Error getting live hardware info for {hostname}: {e}")
return hardware
def _get_n8n_api_key(self) -> Optional[str]:
"""Get n8n API key from secrets."""
try:
@@ -136,17 +196,35 @@ class ClusterService:
except Exception:
pass
# Try to get live hardware info if node is online
hardware_info = {
"cpu": node_info["cpu"],
"memory": node_info["memory"],
"gpu": node_info["gpu"]
}
if status == "online":
try:
print(f"🔍 Getting live hardware info for {node_id} ({node_info['ip']})")
live_hardware = self._get_live_hardware_info(node_info["hostname"], node_info["ip"])
print(f"📊 Live hardware detected for {node_id}: {live_hardware}")
# Use live data if available, fallback to hardcoded values
for key in ["cpu", "memory", "gpu"]:
if live_hardware[key] != "Unknown":
print(f"✅ Using live {key} for {node_id}: {live_hardware[key]}")
hardware_info[key] = live_hardware[key]
else:
print(f"⚠️ Using fallback {key} for {node_id}: {hardware_info[key]}")
except Exception as e:
print(f"❌ Failed to get live hardware info for {node_id}: {e}")
return {
"id": node_id,
"hostname": node_info["hostname"],
"ip": node_info["ip"],
"status": status,
"role": node_info["role"],
"hardware": {
"cpu": node_info["cpu"],
"memory": node_info["memory"],
"gpu": node_info["gpu"]
},
"hardware": hardware_info,
"model_count": model_count,
"models": [{"name": m["name"], "size": m.get("size", 0)} for m in models],
"metrics": {

View File

@@ -689,4 +689,58 @@ class ProjectService:
# Handle escalation status
if status == "escalated":
print(f"Task escalated for human review: {metadata}")
# TODO: Trigger N8N webhook for human escalation
# TODO: Trigger N8N webhook for human escalation
def update_project(self, project_id: str, project_data: Dict[str, Any]) -> Optional[Dict[str, Any]]:
"""Update a project configuration."""
try:
# For now, projects are read-only from the filesystem
# This could be extended to update project metadata files
project = self.get_project_by_id(project_id)
if not project:
return None
# Update project metadata in a local JSON file if needed
# For now, just return the existing project as projects are filesystem-based
print(f"Project update request for {project_id}: {project_data}")
return project
except Exception as e:
print(f"Error updating project {project_id}: {e}")
return None
def create_project(self, project_data: Dict[str, Any]) -> Dict[str, Any]:
"""Create a new project."""
try:
# For now, projects are filesystem-based and read-only
# This could be extended to create new project directories
print(f"Project creation request: {project_data}")
# Return a mock project for now
project_id = project_data.get("name", "new-project").lower().replace(" ", "-")
return {
"id": project_id,
"name": project_data.get("name", "New Project"),
"description": project_data.get("description", ""),
"status": "created",
"created_at": datetime.now().isoformat(),
"updated_at": datetime.now().isoformat()
}
except Exception as e:
print(f"Error creating project: {e}")
raise
def delete_project(self, project_id: str) -> bool:
"""Delete a project."""
try:
# For now, projects are filesystem-based and read-only
# This could be extended to archive or remove project directories
project = self.get_project_by_id(project_id)
if not project:
return False
print(f"Project deletion request for {project_id}")
# Return success for now (projects are read-only)
return True
except Exception as e:
print(f"Error deleting project {project_id}: {e}")
return False

View File

@@ -0,0 +1,117 @@
-- Migration 004: Add Context Feedback Tables for RL Context Curator Integration
-- Created: 2025-01-30
-- Description: Adds tables for context feedback, agent permissions, and task outcome tracking
-- Add RL Context Curator fields to tasks table
ALTER TABLE tasks ADD COLUMN IF NOT EXISTS completion_time INTEGER;
ALTER TABLE tasks ADD COLUMN IF NOT EXISTS errors_encountered INTEGER DEFAULT 0;
ALTER TABLE tasks ADD COLUMN IF NOT EXISTS follow_up_questions INTEGER DEFAULT 0;
ALTER TABLE tasks ADD COLUMN IF NOT EXISTS success_rate REAL;
ALTER TABLE tasks ADD COLUMN IF NOT EXISTS context_used JSONB;
ALTER TABLE tasks ADD COLUMN IF NOT EXISTS context_relevance_score REAL;
ALTER TABLE tasks ADD COLUMN IF NOT EXISTS feedback_collected BOOLEAN DEFAULT false;
ALTER TABLE tasks ADD COLUMN IF NOT EXISTS task_outcome VARCHAR(50);
ALTER TABLE tasks ADD COLUMN IF NOT EXISTS outcome_confidence REAL;
-- Create context_feedback table
CREATE TABLE IF NOT EXISTS context_feedback (
id SERIAL PRIMARY KEY,
context_id VARCHAR(255) NOT NULL,
agent_id VARCHAR(255) NOT NULL REFERENCES agents(id),
task_id UUID REFERENCES tasks(id),
feedback_type VARCHAR(50) NOT NULL CHECK (feedback_type IN ('upvote', 'downvote', 'forgetfulness', 'task_success', 'task_failure')),
role VARCHAR(100) NOT NULL,
confidence REAL NOT NULL CHECK (confidence >= 0.0 AND confidence <= 1.0),
reason TEXT,
usage_context VARCHAR(255),
directory_scope VARCHAR(500),
task_type VARCHAR(100),
timestamp TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);
-- Create indexes for context_feedback table
CREATE INDEX IF NOT EXISTS idx_context_feedback_context_id ON context_feedback(context_id);
CREATE INDEX IF NOT EXISTS idx_context_feedback_agent_id ON context_feedback(agent_id);
CREATE INDEX IF NOT EXISTS idx_context_feedback_timestamp ON context_feedback(timestamp);
CREATE INDEX IF NOT EXISTS idx_context_feedback_feedback_type ON context_feedback(feedback_type);
CREATE INDEX IF NOT EXISTS idx_context_feedback_role ON context_feedback(role);
-- Create agent_permissions table
CREATE TABLE IF NOT EXISTS agent_permissions (
id SERIAL PRIMARY KEY,
agent_id VARCHAR(255) NOT NULL REFERENCES agents(id),
role VARCHAR(100) NOT NULL,
directory_patterns TEXT,
task_types TEXT,
context_weight REAL DEFAULT 1.0 CHECK (context_weight >= 0.1 AND context_weight <= 2.0),
active VARCHAR(10) DEFAULT 'true',
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);
-- Create indexes for agent_permissions table
CREATE INDEX IF NOT EXISTS idx_agent_permissions_agent_id ON agent_permissions(agent_id);
CREATE INDEX IF NOT EXISTS idx_agent_permissions_role ON agent_permissions(role);
CREATE INDEX IF NOT EXISTS idx_agent_permissions_active ON agent_permissions(active);
-- Create promotion_rule_history table
CREATE TABLE IF NOT EXISTS promotion_rule_history (
id SERIAL PRIMARY KEY,
rule_version VARCHAR(50) NOT NULL,
category VARCHAR(100) NOT NULL,
role VARCHAR(100) NOT NULL,
weight_value REAL NOT NULL,
change_reason TEXT,
previous_value REAL,
timestamp TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);
-- Create indexes for promotion_rule_history table
CREATE INDEX IF NOT EXISTS idx_promotion_rule_history_timestamp ON promotion_rule_history(timestamp);
CREATE INDEX IF NOT EXISTS idx_promotion_rule_history_category ON promotion_rule_history(category);
CREATE INDEX IF NOT EXISTS idx_promotion_rule_history_role ON promotion_rule_history(role);
-- Create trigger to update updated_at timestamp for agent_permissions
CREATE OR REPLACE FUNCTION update_agent_permissions_updated_at()
RETURNS TRIGGER AS $$
BEGIN
NEW.updated_at = CURRENT_TIMESTAMP;
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER IF NOT EXISTS trigger_agent_permissions_updated_at
BEFORE UPDATE ON agent_permissions
FOR EACH ROW
EXECUTE FUNCTION update_agent_permissions_updated_at();
-- Insert default agent permissions for existing agents
INSERT INTO agent_permissions (agent_id, role, directory_patterns, task_types, context_weight)
SELECT
id as agent_id,
COALESCE(role, 'general') as role,
'*' as directory_patterns,
'general_development,coding,documentation' as task_types,
1.0 as context_weight
FROM agents
WHERE id NOT IN (SELECT agent_id FROM agent_permissions)
ON CONFLICT DO NOTHING;
-- Add comments to tables
COMMENT ON TABLE context_feedback IS 'Stores context feedback from agents for RL Context Curator learning';
COMMENT ON TABLE agent_permissions IS 'Stores agent role-based permissions for context filtering';
COMMENT ON TABLE promotion_rule_history IS 'Tracks changes to promotion rule weights over time';
-- Add comments to key columns
COMMENT ON COLUMN context_feedback.context_id IS 'Reference to HCFS context ID';
COMMENT ON COLUMN context_feedback.feedback_type IS 'Type of feedback: upvote, downvote, forgetfulness, task_success, task_failure';
COMMENT ON COLUMN context_feedback.confidence IS 'Confidence level in the feedback (0.0 to 1.0)';
COMMENT ON COLUMN agent_permissions.directory_patterns IS 'Comma-separated list of directory patterns this agent can access';
COMMENT ON COLUMN agent_permissions.context_weight IS 'Weight for context relevance calculation (0.1 to 2.0)';
-- Grant permissions (adjust as needed for your setup)
-- GRANT SELECT, INSERT, UPDATE ON context_feedback TO hive_user;
-- GRANT SELECT, INSERT, UPDATE ON agent_permissions TO hive_user;
-- GRANT SELECT, INSERT ON promotion_rule_history TO hive_user;
COMMIT;

View File

@@ -0,0 +1,241 @@
-- Cluster Registration Migration
-- Implements the registration-based cluster architecture for Hive-Bzzz integration
-- Version: 1.0
-- Date: 2025-07-31
-- =============================================================================
-- CLUSTER REGISTRATION SYSTEM
-- =============================================================================
-- Cluster registration tokens (similar to Docker Swarm tokens)
CREATE TABLE cluster_tokens (
id SERIAL PRIMARY KEY,
token VARCHAR(64) UNIQUE NOT NULL,
description TEXT,
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
expires_at TIMESTAMP WITH TIME ZONE,
is_active BOOLEAN DEFAULT true,
created_by UUID REFERENCES users(id) ON DELETE SET NULL,
-- Token metadata
max_registrations INTEGER DEFAULT NULL, -- NULL = unlimited
current_registrations INTEGER DEFAULT 0,
-- IP restrictions (optional)
allowed_ip_ranges TEXT[], -- CIDR ranges like ['192.168.1.0/24']
CONSTRAINT valid_token_format CHECK (token ~ '^[a-zA-Z0-9_-]{32,64}$')
);
-- Registered cluster nodes (dynamic discovery)
CREATE TABLE cluster_nodes (
id SERIAL PRIMARY KEY,
node_id VARCHAR(64) UNIQUE NOT NULL,
hostname VARCHAR(255) NOT NULL,
ip_address INET NOT NULL,
registration_token VARCHAR(64) REFERENCES cluster_tokens(token) ON DELETE CASCADE,
-- Hardware information (reported by client)
cpu_info JSONB,
memory_info JSONB,
gpu_info JSONB,
disk_info JSONB,
-- System information
os_info JSONB,
platform_info JSONB,
-- Status tracking
status VARCHAR(20) DEFAULT 'online' CHECK (status IN ('online', 'offline', 'maintenance', 'error')),
last_heartbeat TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
first_registered TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
-- Services and capabilities
services JSONB, -- Available services like ollama, docker, etc.
capabilities JSONB, -- Available models, tools, etc.
-- Network information
ports JSONB, -- Service ports like {"ollama": 11434, "cockpit": 9090}
-- Registration metadata
client_version VARCHAR(50),
registration_metadata JSONB,
CONSTRAINT valid_node_id_format CHECK (node_id ~ '^[a-zA-Z0-9_-]+$'),
CONSTRAINT valid_status CHECK (status IN ('online', 'offline', 'maintenance', 'error'))
);
-- Node heartbeat history (for performance tracking)
CREATE TABLE node_heartbeats (
id SERIAL PRIMARY KEY,
node_id VARCHAR(64) REFERENCES cluster_nodes(node_id) ON DELETE CASCADE,
heartbeat_time TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
-- Runtime metrics
cpu_usage FLOAT,
memory_usage FLOAT,
disk_usage FLOAT,
gpu_usage FLOAT,
-- Service status
services_status JSONB,
-- Network metrics
network_metrics JSONB,
-- Custom metrics from client
custom_metrics JSONB
);
-- Node registration attempts (for security monitoring)
CREATE TABLE node_registration_attempts (
id SERIAL PRIMARY KEY,
attempted_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
-- Request information
ip_address INET NOT NULL,
user_agent TEXT,
token_used VARCHAR(64),
node_id VARCHAR(64),
hostname VARCHAR(255),
-- Result
success BOOLEAN NOT NULL,
failure_reason TEXT,
-- Security metadata
request_metadata JSONB
);
-- =============================================================================
-- INDEXES FOR PERFORMANCE
-- =============================================================================
-- Token lookup and validation
CREATE INDEX idx_cluster_tokens_token ON cluster_tokens(token) WHERE is_active = true;
CREATE INDEX idx_cluster_tokens_active ON cluster_tokens(is_active, expires_at);
-- Node lookups and status queries
CREATE INDEX idx_cluster_nodes_node_id ON cluster_nodes(node_id);
CREATE INDEX idx_cluster_nodes_status ON cluster_nodes(status);
CREATE INDEX idx_cluster_nodes_last_heartbeat ON cluster_nodes(last_heartbeat);
CREATE INDEX idx_cluster_nodes_token ON cluster_nodes(registration_token);
-- Heartbeat queries (time-series data)
CREATE INDEX idx_node_heartbeats_node_time ON node_heartbeats(node_id, heartbeat_time DESC);
CREATE INDEX idx_node_heartbeats_time ON node_heartbeats(heartbeat_time DESC);
-- Security monitoring
CREATE INDEX idx_registration_attempts_ip_time ON node_registration_attempts(ip_address, attempted_at DESC);
CREATE INDEX idx_registration_attempts_success ON node_registration_attempts(success, attempted_at DESC);
-- =============================================================================
-- FUNCTIONS AND TRIGGERS
-- =============================================================================
-- Function to update token registration count
CREATE OR REPLACE FUNCTION update_token_registration_count()
RETURNS TRIGGER AS $$
BEGIN
IF TG_OP = 'INSERT' THEN
UPDATE cluster_tokens
SET current_registrations = current_registrations + 1
WHERE token = NEW.registration_token;
RETURN NEW;
ELSIF TG_OP = 'DELETE' THEN
UPDATE cluster_tokens
SET current_registrations = current_registrations - 1
WHERE token = OLD.registration_token;
RETURN OLD;
END IF;
RETURN NULL;
END;
$$ LANGUAGE plpgsql;
-- Trigger to maintain registration counts
CREATE TRIGGER trigger_update_token_count
AFTER INSERT OR DELETE ON cluster_nodes
FOR EACH ROW
EXECUTE FUNCTION update_token_registration_count();
-- Function to clean up old heartbeats (data retention)
CREATE OR REPLACE FUNCTION cleanup_old_heartbeats()
RETURNS INTEGER AS $$
DECLARE
deleted_count INTEGER;
BEGIN
DELETE FROM node_heartbeats
WHERE heartbeat_time < NOW() - INTERVAL '30 days';
GET DIAGNOSTICS deleted_count = ROW_COUNT;
RETURN deleted_count;
END;
$$ LANGUAGE plpgsql;
-- Function to update node status based on heartbeat
CREATE OR REPLACE FUNCTION update_node_status()
RETURNS INTEGER AS $$
DECLARE
updated_count INTEGER;
BEGIN
-- Mark nodes as offline if no heartbeat in 5 minutes
UPDATE cluster_nodes
SET status = 'offline'
WHERE status = 'online'
AND last_heartbeat < NOW() - INTERVAL '5 minutes';
GET DIAGNOSTICS updated_count = ROW_COUNT;
RETURN updated_count;
END;
$$ LANGUAGE plpgsql;
-- =============================================================================
-- INITIAL DATA (for development/testing)
-- =============================================================================
-- Insert development cluster token
INSERT INTO cluster_tokens (token, description, created_by)
VALUES (
'hive_dev_cluster_token_12345678901234567890123456789012',
'Development cluster token for testing',
(SELECT id FROM users WHERE username = 'admin' LIMIT 1)
) ON CONFLICT (token) DO NOTHING;
-- Insert production cluster token (should be changed in production)
INSERT INTO cluster_tokens (token, description, created_by, expires_at)
VALUES (
'hive_prod_cluster_token_98765432109876543210987654321098',
'Production cluster token - CHANGE THIS IN PRODUCTION',
(SELECT id FROM users WHERE username = 'admin' LIMIT 1),
NOW() + INTERVAL '1 year'
) ON CONFLICT (token) DO NOTHING;
-- =============================================================================
-- COMMENTS AND DOCUMENTATION
-- =============================================================================
COMMENT ON TABLE cluster_tokens IS 'Registration tokens for cluster nodes to join the Hive cluster';
COMMENT ON TABLE cluster_nodes IS 'Dynamically registered cluster nodes with hardware and capability information';
COMMENT ON TABLE node_heartbeats IS 'Heartbeat history for performance monitoring and status tracking';
COMMENT ON TABLE node_registration_attempts IS 'Security log of all node registration attempts';
COMMENT ON COLUMN cluster_tokens.token IS 'Unique token for node registration, format: hive_[env]_cluster_token_[random]';
COMMENT ON COLUMN cluster_tokens.max_registrations IS 'Maximum number of nodes that can use this token (NULL = unlimited)';
COMMENT ON COLUMN cluster_tokens.allowed_ip_ranges IS 'CIDR ranges that can use this token (NULL = any IP)';
COMMENT ON COLUMN cluster_nodes.node_id IS 'Unique identifier for the node (hostname-uuid format recommended)';
COMMENT ON COLUMN cluster_nodes.cpu_info IS 'CPU information: {"cores": 8, "model": "AMD Ryzen 7", "architecture": "x86_64"}';
COMMENT ON COLUMN cluster_nodes.memory_info IS 'Memory information: {"total_gb": 64, "available_gb": 32, "type": "DDR4"}';
COMMENT ON COLUMN cluster_nodes.gpu_info IS 'GPU information: {"model": "NVIDIA RTX 2080S", "memory_gb": 8, "driver": "535.86.05"}';
COMMENT ON COLUMN cluster_nodes.services IS 'Available services: {"ollama": {"version": "0.1.7", "port": 11434}, "docker": {"version": "24.0.6"}}';
COMMENT ON COLUMN cluster_nodes.capabilities IS 'Node capabilities: {"models": ["llama2", "codellama"], "max_concurrent": 4}';
-- Migration completion notice
DO $$
BEGIN
RAISE NOTICE 'Cluster registration migration completed successfully!';
RAISE NOTICE 'Development token: hive_dev_cluster_token_12345678901234567890123456789012';
RAISE NOTICE 'Production token: hive_prod_cluster_token_98765432109876543210987654321098';
RAISE NOTICE 'SECURITY WARNING: Change production tokens before deployment!';
END
$$;

View File

@@ -0,0 +1,106 @@
#!/bin/bash
# Apply cluster registration migration to Hive database
# This script applies the 007_add_cluster_registration.sql migration
set -e
echo "🚀 Applying Cluster Registration Migration..."
# Configuration
DB_NAME="hive"
DB_USER="postgres"
DB_PASSWORD="hive123"
MIGRATION_FILE="./migrations/007_add_cluster_registration.sql"
# Check if migration file exists
if [[ ! -f "$MIGRATION_FILE" ]]; then
echo "❌ Migration file not found: $MIGRATION_FILE"
exit 1
fi
echo "📁 Migration file: $MIGRATION_FILE"
# Function to run SQL via Docker
run_sql_docker() {
local sql_file="$1"
echo "🐳 Executing migration via Docker..."
# Check if PostgreSQL service is running in Docker swarm
if docker service ls | grep -q "hive_postgres"; then
echo "✅ PostgreSQL service found in Docker swarm"
# Get a running PostgreSQL container
CONTAINER_ID=$(docker ps --filter "label=com.docker.swarm.service.name=hive_postgres" --format "{{.ID}}" | head -n1)
if [[ -z "$CONTAINER_ID" ]]; then
echo "❌ No running PostgreSQL container found"
exit 1
fi
echo "📦 Using PostgreSQL container: $CONTAINER_ID"
# Copy migration file to container and execute
docker cp "$sql_file" "$CONTAINER_ID:/tmp/migration.sql"
docker exec "$CONTAINER_ID" psql -U "$DB_USER" -d "$DB_NAME" -f /tmp/migration.sql
docker exec "$CONTAINER_ID" rm -f /tmp/migration.sql
else
echo "❌ PostgreSQL service not found in Docker swarm"
exit 1
fi
}
# Function to run SQL locally
run_sql_local() {
local sql_file="$1"
echo "🏠 Executing migration locally..."
# Check if psql is available
if ! command -v psql &> /dev/null; then
echo "❌ psql command not found"
exit 1
fi
# Try to connect locally
PGPASSWORD="$DB_PASSWORD" psql -h localhost -U "$DB_USER" -d "$DB_NAME" -f "$sql_file"
}
# Try Docker first, then local
echo "🔍 Attempting migration..."
if run_sql_docker "$MIGRATION_FILE" 2>/dev/null; then
echo "✅ Migration applied successfully via Docker!"
elif run_sql_local "$MIGRATION_FILE" 2>/dev/null; then
echo "✅ Migration applied successfully locally!"
else
echo "❌ Migration failed via both Docker and local methods"
echo "📝 Manual steps:"
echo "1. Ensure PostgreSQL is running"
echo "2. Check database credentials"
echo "3. Run manually: psql -h localhost -U postgres -d hive -f $MIGRATION_FILE"
exit 1
fi
echo ""
echo "🎉 Cluster Registration Migration Complete!"
echo ""
echo "📋 Summary:"
echo " ✅ cluster_tokens table created"
echo " ✅ cluster_nodes table created"
echo " ✅ node_heartbeats table created"
echo " ✅ node_registration_attempts table created"
echo " ✅ Indexes and triggers created"
echo " ✅ Development tokens inserted"
echo ""
echo "🔐 Development Tokens:"
echo " Dev Token: hive_dev_cluster_token_12345678901234567890123456789012"
echo " Prod Token: hive_prod_cluster_token_98765432109876543210987654321098"
echo ""
echo "⚠️ SECURITY WARNING: Change production tokens before deployment!"
echo ""
echo "🚀 Next steps:"
echo " 1. Implement registration API endpoints (/api/cluster/register)"
echo " 2. Add heartbeat API endpoint (/api/cluster/heartbeat)"
echo " 3. Update Bzzz clients to use registration system"
echo ""

56
check_browser_logs.py Normal file
View File

@@ -0,0 +1,56 @@
#!/usr/bin/env python3
"""
Check browser console logs to see the debug output
"""
import time
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By
def check_browser_logs():
chrome_options = Options()
chrome_options.add_argument("--headless")
chrome_options.add_argument("--no-sandbox")
chrome_options.add_argument("--disable-dev-shm-usage")
chrome_options.add_argument("--enable-logging")
chrome_options.add_argument("--log-level=0")
driver = webdriver.Chrome(options=chrome_options)
try:
print("🌐 Loading frontend to check debug logs...")
driver.get("http://localhost:3000")
time.sleep(3)
# Try to get browser console logs (may not work in headless mode)
try:
logs = driver.get_log('browser')
print("📋 Browser Console Logs:")
for log in logs:
print(f" [{log['level']}] {log['message']}")
except:
print("📋 Console logs not available in this mode")
# Check for specific debug information in page source
page_source = driver.page_source
if "Auth API_BASE_URL" in page_source:
lines = page_source.split('\n')
for line in lines:
if "Auth API_BASE_URL" in line or "apiConfig" in line:
print(f"🔍 Found debug info: {line.strip()}")
else:
print("🔍 No debug info found in page source")
# Also check page text for any visible errors
body_text = driver.find_element(By.TAG_NAME, "body").text
if "api_base_url" in body_text.lower() or "auth" in body_text.lower():
print(f"📄 Page content related to API: {body_text[:500]}")
except Exception as e:
print(f"❌ Error: {e}")
finally:
driver.quit()
if __name__ == "__main__":
check_browser_logs()

81
debug_frontend.py Normal file
View File

@@ -0,0 +1,81 @@
#!/usr/bin/env python3
"""
Debug Frontend Issues - Capture JavaScript errors and console logs
"""
import os
import time
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.common.by import By
def debug_frontend():
print("🔍 Debugging Frontend JavaScript Issues...")
chrome_options = Options()
chrome_options.add_argument("--headless")
chrome_options.add_argument("--no-sandbox")
chrome_options.add_argument("--disable-dev-shm-usage")
chrome_options.add_argument("--disable-gpu")
chrome_options.add_argument("--window-size=1920,1080")
chrome_options.add_argument("--enable-logging")
chrome_options.add_argument("--log-level=0")
driver = webdriver.Chrome(options=chrome_options)
try:
print("🌐 Loading http://localhost:3000...")
driver.get("http://localhost:3000")
# Wait for page to fully load
time.sleep(5)
print("📋 Page Title:", driver.title)
print("🔗 Current URL:", driver.current_url)
# Get page source and check for root element
page_source = driver.page_source
print("📄 Page source length:", len(page_source))
# Check if root div has content
try:
root_element = driver.find_element(By.ID, "root")
print("🏠 Root element innerHTML length:", len(root_element.get_attribute("innerHTML")))
print("🏠 Root element text content:", root_element.text[:200] + "..." if len(root_element.text) > 200 else root_element.text)
except Exception as e:
print(f"❌ Error finding root element: {e}")
# Get console logs (newer Selenium API)
print("\n🐛 JavaScript Console Logs:")
try:
logs = driver.get_log('browser')
if logs:
for log in logs:
level = log['level']
message = log['message']
print(f" [{level}] {message}")
else:
print(" No console logs found")
except Exception as e:
print(f" Could not get console logs: {e}")
# Check for specific error patterns
if "Failed to import" in page_source or "import" in str(logs):
print("\n❌ Import Error Detected!")
if len(root_element.get_attribute("innerHTML")) < 100:
print("\n❌ Root element is nearly empty - React app not loading!")
# Take screenshot for debugging
screenshot_path = "/tmp/debug_frontend.png"
driver.save_screenshot(screenshot_path)
print(f"\n📸 Debug screenshot saved: {screenshot_path}")
except Exception as e:
print(f"❌ Error during debugging: {e}")
finally:
driver.quit()
if __name__ == "__main__":
debug_frontend()

212
dev-start.sh Executable file
View File

@@ -0,0 +1,212 @@
#!/bin/bash
# Hive Development Environment Startup Script
# This script provides a fast development cycle with hot reload
set -e
echo "🚀 Starting Hive Development Environment"
echo "========================================="
# Colors for output
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
# Check if we're in the right directory
if [ ! -f "docker-compose.dev.yml" ]; then
echo -e "${RED}❌ Error: Please run this script from the hive project root directory${NC}"
exit 1
fi
# Function to check if backend is running
check_backend() {
local backend_url="https://hive.home.deepblack.cloud/api/health"
local dev_url="http://localhost:8089/api/health"
echo -e "${YELLOW}⏳ Checking backend availability...${NC}"
if curl -s -f "$backend_url" > /dev/null 2>&1; then
echo -e "${GREEN}✅ Production backend is available at $backend_url${NC}"
return 0
elif curl -s -f "$dev_url" > /dev/null 2>&1; then
echo -e "${GREEN}✅ Development backend is available at $dev_url${NC}"
return 0
else
echo -e "${YELLOW}⚠️ No backend detected. Will start development backend.${NC}"
return 1
fi
}
# Function to start frontend only (with production backend)
start_frontend_only() {
echo -e "${GREEN}🎯 Starting frontend with hot reload (using production backend)${NC}"
cd frontend
# Install dependencies if needed
if [ ! -d "node_modules" ]; then
echo -e "${YELLOW}📦 Installing frontend dependencies...${NC}"
npm install
fi
# Create development .env
cat > .env.development.local << EOF
VITE_API_BASE_URL=https://hive.home.deepblack.cloud
VITE_WS_BASE_URL=https://hive.home.deepblack.cloud
VITE_ENABLE_DEBUG_MODE=true
VITE_LOG_LEVEL=debug
VITE_DEV_MODE=true
EOF
echo -e "${GREEN}🔥 Starting frontend development server with hot reload...${NC}"
echo -e "${YELLOW}💡 Frontend will be available at: http://localhost:3000${NC}"
echo -e "${YELLOW}💡 Backend API: https://hive.home.deepblack.cloud${NC}"
echo -e "${YELLOW}💡 Press Ctrl+C to stop${NC}"
echo ""
npm run dev
}
# Function to start full development stack
start_full_stack() {
echo -e "${GREEN}🔧 Starting full development stack with Docker Compose...${NC}"
# Build and start development containers
echo -e "${YELLOW}🐳 Building development containers...${NC}"
docker-compose -f docker-compose.dev.yml build
echo -e "${YELLOW}🚀 Starting development containers...${NC}"
docker-compose -f docker-compose.dev.yml up --remove-orphans
}
# Main menu
show_menu() {
echo ""
echo "Choose development mode:"
echo "1) Frontend only (hot reload, uses production backend) - FAST ⚡"
echo "2) Full stack (frontend + backend in Docker) - COMPLETE 🔧"
echo "3) Check system status 📊"
echo "4) Clean development environment 🧹"
echo "5) Exit"
echo ""
}
# Clean development environment
clean_dev_env() {
echo -e "${YELLOW}🧹 Cleaning development environment...${NC}"
# Stop and remove dev containers
docker-compose -f docker-compose.dev.yml down --remove-orphans || true
# Remove dev images
docker images --format "table {{.Repository}}\t{{.Tag}}" | grep "hive.*dev" | awk '{print $1":"$2}' | xargs -r docker rmi || true
# Clean frontend
if [ -d "frontend/node_modules" ]; then
echo -e "${YELLOW}📦 Cleaning frontend node_modules...${NC}"
rm -rf frontend/node_modules
fi
# Clean local env files
rm -f frontend/.env.development.local
echo -e "${GREEN}✅ Development environment cleaned${NC}"
}
# Check system status
check_status() {
echo -e "${YELLOW}📊 System Status Check${NC}"
echo "===================="
# Check production backend
if curl -s -f "https://hive.home.deepblack.cloud/api/health" > /dev/null 2>&1; then
echo -e "${GREEN}✅ Production backend: Online${NC}"
else
echo -e "${RED}❌ Production backend: Offline${NC}"
fi
# Check development backend
if curl -s -f "http://localhost:8089/api/health" > /dev/null 2>&1; then
echo -e "${GREEN}✅ Development backend: Online${NC}"
else
echo -e "${YELLOW}⚠️ Development backend: Offline${NC}"
fi
# Check frontend
if curl -s -f "http://localhost:3000" > /dev/null 2>&1; then
echo -e "${GREEN}✅ Development frontend: Online${NC}"
else
echo -e "${YELLOW}⚠️ Development frontend: Offline${NC}"
fi
# Check Docker
if docker ps > /dev/null 2>&1; then
echo -e "${GREEN}✅ Docker: Available${NC}"
# Check dev containers
DEV_CONTAINERS=$(docker-compose -f docker-compose.dev.yml ps --services --filter status=running 2>/dev/null | wc -l)
if [ "$DEV_CONTAINERS" -gt 0 ]; then
echo -e "${GREEN}✅ Development containers: $DEV_CONTAINERS running${NC}"
else
echo -e "${YELLOW}⚠️ Development containers: None running${NC}"
fi
else
echo -e "${RED}❌ Docker: Not available${NC}"
fi
# Check Node.js
if command -v node > /dev/null 2>&1; then
NODE_VERSION=$(node --version)
echo -e "${GREEN}✅ Node.js: $NODE_VERSION${NC}"
else
echo -e "${RED}❌ Node.js: Not installed${NC}"
fi
}
# Trap to clean up on exit
cleanup() {
echo -e "\n${YELLOW}🛑 Shutting down development environment...${NC}"
if [ -f docker-compose.dev.yml ]; then
docker-compose -f docker-compose.dev.yml down > /dev/null 2>&1 || true
fi
}
trap cleanup EXIT
# Main script loop
while true; do
show_menu
read -p "Enter your choice (1-5): " choice
case $choice in
1)
if check_backend; then
start_frontend_only
else
echo -e "${RED}❌ No backend available. Please choose option 2 for full stack or start the production backend.${NC}"
sleep 2
fi
;;
2)
start_full_stack
;;
3)
check_status
read -p "Press Enter to return to menu..."
;;
4)
clean_dev_env
read -p "Press Enter to return to menu..."
;;
5)
echo -e "${GREEN}👋 Goodbye!${NC}"
exit 0
;;
*)
echo -e "${RED}❌ Invalid choice. Please enter 1-5.${NC}"
sleep 1
;;
esac
done

61
docker-compose.dev.yml Normal file
View File

@@ -0,0 +1,61 @@
version: '3.8'
services:
# Development Frontend with Hot Reload
hive-frontend-dev:
build:
context: ./frontend
dockerfile: Dockerfile.dev
ports:
- "3000:3000" # Direct access for development
volumes:
# Mount source code for hot reload
- ./frontend/src:/app/src:ro
- ./frontend/public:/app/public:ro
- ./frontend/package.json:/app/package.json:ro
- ./frontend/package-lock.json:/app/package-lock.json:ro
- ./frontend/vite.config.ts:/app/vite.config.ts:ro
- ./frontend/tsconfig.json:/app/tsconfig.json:ro
- ./frontend/tailwind.config.js:/app/tailwind.config.js:ro
- ./frontend/postcss.config.js:/app/postcss.config.js:ro
- ./frontend/.env.development:/app/.env:ro
environment:
- NODE_ENV=development
- VITE_API_BASE_URL=https://hive.home.deepblack.cloud
- VITE_WS_BASE_URL=https://hive.home.deepblack.cloud
- VITE_ENABLE_DEBUG_MODE=true
- VITE_LOG_LEVEL=debug
- VITE_DEV_MODE=true
networks:
- hive-dev-network
depends_on:
- hive-backend-dev
command: npm run dev
# Development Backend (optional - can use production backend)
hive-backend-dev:
build:
context: ./backend
dockerfile: Dockerfile.dev
ports:
- "8089:8000" # Different port to avoid conflicts with filebrowser
volumes:
- ./backend:/app:ro
environment:
- DATABASE_URL=postgresql://hive:hivepass@host.docker.internal:5433/hive # Connect to production DB
- REDIS_URL=redis://:hivepass@host.docker.internal:6380
- ENVIRONMENT=development
- LOG_LEVEL=debug
- CORS_ORIGINS=http://localhost:3000,https://hive.home.deepblack.cloud
- HOT_RELOAD=true
networks:
- hive-dev-network
extra_hosts:
- "host.docker.internal:host-gateway" # Access host services
networks:
hive-dev-network:
driver: bridge
# Note: This setup uses production database/redis but with dev frontend/backend
# This gives us live data while maintaining fast development cycle

View File

@@ -1,7 +1,7 @@
services:
# Hive Backend API
hive-backend:
image: registry.home.deepblack.cloud/tony/hive-backend:latest
image: registry.home.deepblack.cloud/tony/hive-backend:v4
build:
context: ./backend
dockerfile: Dockerfile
@@ -57,7 +57,7 @@ services:
# Hive Frontend
hive-frontend:
image: registry.home.deepblack.cloud/tony/hive-frontend:latest
image: registry.home.deepblack.cloud/tony/hive-frontend:v6
build:
context: ./frontend
dockerfile: Dockerfile

View File

@@ -0,0 +1,5 @@
VITE_API_BASE_URL=https://hive.home.deepblack.cloud
VITE_WS_BASE_URL=https://hive.home.deepblack.cloud
VITE_ENABLE_DEBUG_MODE=true
VITE_LOG_LEVEL=debug
VITE_DEV_MODE=true

View File

@@ -11,9 +11,6 @@ RUN npm install && npm install -g serve
# Copy source code
COPY . .
# Build the application
RUN npm run build
# Install curl for health checks (as root)
RUN apk add --no-cache curl
@@ -21,6 +18,10 @@ RUN apk add --no-cache curl
RUN addgroup -g 1001 -S nodejs
RUN adduser -S nextjs -u 1001
# Build the application (with cache busting)
ARG BUILD_DATE
RUN echo "Building at: $BUILD_DATE" && npm run build
# Change ownership
RUN chown -R nextjs:nodejs /app
USER nextjs

33
frontend/Dockerfile.dev Normal file
View File

@@ -0,0 +1,33 @@
FROM node:18-alpine
WORKDIR /app
# Install curl for health checks and other utilities
RUN apk add --no-cache curl git
# Copy package files
COPY package*.json ./
# Install dependencies (dev mode)
RUN npm install
# Copy source code
COPY . .
# Create non-root user
RUN addgroup -g 1001 -S nodejs
RUN adduser -S nextjs -u 1001
# Change ownership
RUN chown -R nextjs:nodejs /app
USER nextjs
# Expose port
EXPOSE 3000
# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=10s --retries=3 \
CMD curl -f http://localhost:3000 || exit 1
# Start development server with hot reload
CMD ["npm", "run", "dev", "--", "--host", "0.0.0.0"]

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1,77 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/hive-icon.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="Hive - Unified Distributed AI Orchestration Platform for coordinating AI agents, managing workflows, and monitoring performance" />
<meta name="keywords" content="AI, orchestration, workflows, agents, distributed computing, automation" />
<meta name="author" content="Hive Platform" />
<meta name="theme-color" content="#3b82f6" />
<!-- Open Graph / Facebook -->
<meta property="og:type" content="website" />
<meta property="og:title" content="Hive - Distributed AI Orchestration" />
<meta property="og:description" content="Unified platform for coordinating AI agents, managing workflows, and monitoring performance" />
<meta property="og:url" content="https://hive.home.deepblack.cloud" />
<!-- Twitter -->
<meta property="twitter:card" content="summary_large_image" />
<meta property="twitter:title" content="Hive - Distributed AI Orchestration" />
<meta property="twitter:description" content="Unified platform for coordinating AI agents, managing workflows, and monitoring performance" />
<!-- Accessibility -->
<meta name="format-detection" content="telephone=no" />
<title>🐝 Hive - Distributed AI Orchestration</title>
<style>
/* Loading styles for better UX */
#root {
min-height: 100vh;
}
/* Screen reader only class */
.sr-only {
position: absolute;
width: 1px;
height: 1px;
padding: 0;
margin: -1px;
overflow: hidden;
clip: rect(0, 0, 0, 0);
white-space: nowrap;
border: 0;
}
/* Focus visible styles */
.focus-visible:focus {
outline: 2px solid #3b82f6;
outline-offset: 2px;
}
/* Reduce motion for users who prefer it */
@media (prefers-reduced-motion: reduce) {
*,
*::before,
*::after {
animation-duration: 0.01ms !important;
animation-iteration-count: 1 !important;
transition-duration: 0.01ms !important;
}
}
</style>
<script type="module" crossorigin src="/assets/index-f7xYn9lw.js"></script>
<link rel="stylesheet" crossorigin href="/assets/index-CYSOVan7.css">
</head>
<body>
<noscript>
<div style="text-align: center; padding: 2rem; font-family: sans-serif;">
<h1>JavaScript Required</h1>
<p>This application requires JavaScript to be enabled in your browser.</p>
<p>Please enable JavaScript and refresh the page to continue.</p>
</div>
</noscript>
<div id="root" role="main"></div>
</body>
</html>

View File

@@ -23,6 +23,11 @@
<!-- Accessibility -->
<meta name="format-detection" content="telephone=no" />
<!-- Google Fonts - Fira Sans -->
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Fira+Sans:ital,wght@0,100;0,200;0,300;0,400;0,500;0,600;0,700;0,800;0,900;1,100;1,200;1,300;1,400;1,500;1,600;1,700;1,800;1,900&display=swap" rel="stylesheet">
<title>🐝 Hive - Distributed AI Orchestration</title>
<style>

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_GLOBAL__;

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_ACTIONS__;

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_CHANNELS__;

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_CLIENT_LOGGER__;

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_CORE_EVENTS__;

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_PREVIEW_API__;

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_CORE_EVENTS_PREVIEW_ERRORS__;

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_TYPES__;

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_PREVIEW_API__;

View File

@@ -1 +0,0 @@
module.exports = __STORYBOOK_MODULE_TEST__;

View File

@@ -1,480 +0,0 @@
try{
(() => {
// global-externals:react
var react_default = __REACT__, { Children, Component, Fragment, Profiler, PureComponent, StrictMode, Suspense, __SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED, act, cloneElement, createContext, createElement, createFactory, createRef, forwardRef, isValidElement, lazy, memo, startTransition, unstable_act, useCallback, useContext, useDebugValue, useDeferredValue, useEffect, useId, useImperativeHandle, useInsertionEffect, useLayoutEffect, useMemo, useReducer, useRef, useState, useSyncExternalStore, useTransition, version } = __REACT__;
// global-externals:storybook/internal/components
var components_default = __STORYBOOK_COMPONENTS__, { A, ActionBar, AddonPanel, Badge, Bar, Blockquote, Button, Checkbox, ClipboardCode, Code, DL, Div, DocumentWrapper, EmptyTabContent, ErrorFormatter, FlexBar, Form, H1, H2, H3, H4, H5, H6, HR, IconButton, Img, LI, Link, ListItem, Loader, Modal, OL, P, Placeholder, Pre, ProgressSpinner, ResetWrapper, ScrollArea, Separator, Spaced, Span, StorybookIcon, StorybookLogo, SyntaxHighlighter, TT, TabBar, TabButton, TabWrapper, Table, Tabs, TabsState, TooltipLinkList, TooltipMessage, TooltipNote, UL, WithTooltip, WithTooltipPure, Zoom, codeCommon, components, createCopyToClipboardFunction, getStoryHref, interleaveSeparators, nameSpaceClassNames, resetComponents, withReset } = __STORYBOOK_COMPONENTS__;
// global-externals:storybook/manager-api
var manager_api_default = __STORYBOOK_API__, { ActiveTabs, Consumer, ManagerContext, Provider, RequestResponseError, addons, combineParameters, controlOrMetaKey, controlOrMetaSymbol, eventMatchesShortcut, eventToShortcut, experimental_MockUniversalStore, experimental_UniversalStore, experimental_getStatusStore, experimental_getTestProviderStore, experimental_requestResponse, experimental_useStatusStore, experimental_useTestProviderStore, experimental_useUniversalStore, internal_fullStatusStore, internal_fullTestProviderStore, internal_universalStatusStore, internal_universalTestProviderStore, isMacLike, isShortcutTaken, keyToSymbol, merge, mockChannel, optionOrAltSymbol, shortcutMatchesShortcut, shortcutToHumanString, types, useAddonState, useArgTypes, useArgs, useChannel, useGlobalTypes, useGlobals, useParameter, useSharedState, useStoryPrepared, useStorybookApi, useStorybookState } = __STORYBOOK_API__;
// global-externals:storybook/theming
var theming_default = __STORYBOOK_THEMING__, { CacheProvider, ClassNames, Global, ThemeProvider, background, color, convert, create, createCache, createGlobal, createReset, css, darken, ensure, ignoreSsrWarning, isPropValid, jsx, keyframes, lighten, styled, themes, typography, useTheme, withTheme } = __STORYBOOK_THEMING__;
// node_modules/@storybook/addon-docs/dist/manager.js
var ADDON_ID = "storybook/docs", PANEL_ID = `${ADDON_ID}/panel`, PARAM_KEY = "docs", SNIPPET_RENDERED = `${ADDON_ID}/snippet-rendered`;
function _extends() {
return _extends = Object.assign ? Object.assign.bind() : function(n) {
for (var e = 1; e < arguments.length; e++) {
var t = arguments[e];
for (var r in t) ({}).hasOwnProperty.call(t, r) && (n[r] = t[r]);
}
return n;
}, _extends.apply(null, arguments);
}
function _assertThisInitialized(e) {
if (e === void 0) throw new ReferenceError("this hasn't been initialised - super() hasn't been called");
return e;
}
function _setPrototypeOf(t, e) {
return _setPrototypeOf = Object.setPrototypeOf ? Object.setPrototypeOf.bind() : function(t2, e2) {
return t2.__proto__ = e2, t2;
}, _setPrototypeOf(t, e);
}
function _inheritsLoose(t, o) {
t.prototype = Object.create(o.prototype), t.prototype.constructor = t, _setPrototypeOf(t, o);
}
function _getPrototypeOf(t) {
return _getPrototypeOf = Object.setPrototypeOf ? Object.getPrototypeOf.bind() : function(t2) {
return t2.__proto__ || Object.getPrototypeOf(t2);
}, _getPrototypeOf(t);
}
function _isNativeFunction(t) {
try {
return Function.toString.call(t).indexOf("[native code]") !== -1;
} catch {
return typeof t == "function";
}
}
function _isNativeReflectConstruct() {
try {
var t = !Boolean.prototype.valueOf.call(Reflect.construct(Boolean, [], function() {
}));
} catch {
}
return (_isNativeReflectConstruct = function() {
return !!t;
})();
}
function _construct(t, e, r) {
if (_isNativeReflectConstruct()) return Reflect.construct.apply(null, arguments);
var o = [null];
o.push.apply(o, e);
var p = new (t.bind.apply(t, o))();
return r && _setPrototypeOf(p, r.prototype), p;
}
function _wrapNativeSuper(t) {
var r = typeof Map == "function" ? /* @__PURE__ */ new Map() : void 0;
return _wrapNativeSuper = function(t2) {
if (t2 === null || !_isNativeFunction(t2)) return t2;
if (typeof t2 != "function") throw new TypeError("Super expression must either be null or a function");
if (r !== void 0) {
if (r.has(t2)) return r.get(t2);
r.set(t2, Wrapper2);
}
function Wrapper2() {
return _construct(t2, arguments, _getPrototypeOf(this).constructor);
}
return Wrapper2.prototype = Object.create(t2.prototype, { constructor: { value: Wrapper2, enumerable: !1, writable: !0, configurable: !0 } }), _setPrototypeOf(Wrapper2, t2);
}, _wrapNativeSuper(t);
}
var ERRORS = { 1: `Passed invalid arguments to hsl, please pass multiple numbers e.g. hsl(360, 0.75, 0.4) or an object e.g. rgb({ hue: 255, saturation: 0.4, lightness: 0.75 }).
`, 2: `Passed invalid arguments to hsla, please pass multiple numbers e.g. hsla(360, 0.75, 0.4, 0.7) or an object e.g. rgb({ hue: 255, saturation: 0.4, lightness: 0.75, alpha: 0.7 }).
`, 3: `Passed an incorrect argument to a color function, please pass a string representation of a color.
`, 4: `Couldn't generate valid rgb string from %s, it returned %s.
`, 5: `Couldn't parse the color string. Please provide the color as a string in hex, rgb, rgba, hsl or hsla notation.
`, 6: `Passed invalid arguments to rgb, please pass multiple numbers e.g. rgb(255, 205, 100) or an object e.g. rgb({ red: 255, green: 205, blue: 100 }).
`, 7: `Passed invalid arguments to rgba, please pass multiple numbers e.g. rgb(255, 205, 100, 0.75) or an object e.g. rgb({ red: 255, green: 205, blue: 100, alpha: 0.75 }).
`, 8: `Passed invalid argument to toColorString, please pass a RgbColor, RgbaColor, HslColor or HslaColor object.
`, 9: `Please provide a number of steps to the modularScale helper.
`, 10: `Please pass a number or one of the predefined scales to the modularScale helper as the ratio.
`, 11: `Invalid value passed as base to modularScale, expected number or em string but got "%s"
`, 12: `Expected a string ending in "px" or a number passed as the first argument to %s(), got "%s" instead.
`, 13: `Expected a string ending in "px" or a number passed as the second argument to %s(), got "%s" instead.
`, 14: `Passed invalid pixel value ("%s") to %s(), please pass a value like "12px" or 12.
`, 15: `Passed invalid base value ("%s") to %s(), please pass a value like "12px" or 12.
`, 16: `You must provide a template to this method.
`, 17: `You passed an unsupported selector state to this method.
`, 18: `minScreen and maxScreen must be provided as stringified numbers with the same units.
`, 19: `fromSize and toSize must be provided as stringified numbers with the same units.
`, 20: `expects either an array of objects or a single object with the properties prop, fromSize, and toSize.
`, 21: "expects the objects in the first argument array to have the properties `prop`, `fromSize`, and `toSize`.\n\n", 22: "expects the first argument object to have the properties `prop`, `fromSize`, and `toSize`.\n\n", 23: `fontFace expects a name of a font-family.
`, 24: `fontFace expects either the path to the font file(s) or a name of a local copy.
`, 25: `fontFace expects localFonts to be an array.
`, 26: `fontFace expects fileFormats to be an array.
`, 27: `radialGradient requries at least 2 color-stops to properly render.
`, 28: `Please supply a filename to retinaImage() as the first argument.
`, 29: `Passed invalid argument to triangle, please pass correct pointingDirection e.g. 'right'.
`, 30: "Passed an invalid value to `height` or `width`. Please provide a pixel based unit.\n\n", 31: `The animation shorthand only takes 8 arguments. See the specification for more information: http://mdn.io/animation
`, 32: `To pass multiple animations please supply them in arrays, e.g. animation(['rotate', '2s'], ['move', '1s'])
To pass a single animation please supply them in simple values, e.g. animation('rotate', '2s')
`, 33: `The animation shorthand arrays can only have 8 elements. See the specification for more information: http://mdn.io/animation
`, 34: `borderRadius expects a radius value as a string or number as the second argument.
`, 35: `borderRadius expects one of "top", "bottom", "left" or "right" as the first argument.
`, 36: `Property must be a string value.
`, 37: `Syntax Error at %s.
`, 38: `Formula contains a function that needs parentheses at %s.
`, 39: `Formula is missing closing parenthesis at %s.
`, 40: `Formula has too many closing parentheses at %s.
`, 41: `All values in a formula must have the same unit or be unitless.
`, 42: `Please provide a number of steps to the modularScale helper.
`, 43: `Please pass a number or one of the predefined scales to the modularScale helper as the ratio.
`, 44: `Invalid value passed as base to modularScale, expected number or em/rem string but got %s.
`, 45: `Passed invalid argument to hslToColorString, please pass a HslColor or HslaColor object.
`, 46: `Passed invalid argument to rgbToColorString, please pass a RgbColor or RgbaColor object.
`, 47: `minScreen and maxScreen must be provided as stringified numbers with the same units.
`, 48: `fromSize and toSize must be provided as stringified numbers with the same units.
`, 49: `Expects either an array of objects or a single object with the properties prop, fromSize, and toSize.
`, 50: `Expects the objects in the first argument array to have the properties prop, fromSize, and toSize.
`, 51: `Expects the first argument object to have the properties prop, fromSize, and toSize.
`, 52: `fontFace expects either the path to the font file(s) or a name of a local copy.
`, 53: `fontFace expects localFonts to be an array.
`, 54: `fontFace expects fileFormats to be an array.
`, 55: `fontFace expects a name of a font-family.
`, 56: `linearGradient requries at least 2 color-stops to properly render.
`, 57: `radialGradient requries at least 2 color-stops to properly render.
`, 58: `Please supply a filename to retinaImage() as the first argument.
`, 59: `Passed invalid argument to triangle, please pass correct pointingDirection e.g. 'right'.
`, 60: "Passed an invalid value to `height` or `width`. Please provide a pixel based unit.\n\n", 61: `Property must be a string value.
`, 62: `borderRadius expects a radius value as a string or number as the second argument.
`, 63: `borderRadius expects one of "top", "bottom", "left" or "right" as the first argument.
`, 64: `The animation shorthand only takes 8 arguments. See the specification for more information: http://mdn.io/animation.
`, 65: `To pass multiple animations please supply them in arrays, e.g. animation(['rotate', '2s'], ['move', '1s'])\\nTo pass a single animation please supply them in simple values, e.g. animation('rotate', '2s').
`, 66: `The animation shorthand arrays can only have 8 elements. See the specification for more information: http://mdn.io/animation.
`, 67: `You must provide a template to this method.
`, 68: `You passed an unsupported selector state to this method.
`, 69: `Expected a string ending in "px" or a number passed as the first argument to %s(), got %s instead.
`, 70: `Expected a string ending in "px" or a number passed as the second argument to %s(), got %s instead.
`, 71: `Passed invalid pixel value %s to %s(), please pass a value like "12px" or 12.
`, 72: `Passed invalid base value %s to %s(), please pass a value like "12px" or 12.
`, 73: `Please provide a valid CSS variable.
`, 74: `CSS variable not found and no default was provided.
`, 75: `important requires a valid style object, got a %s instead.
`, 76: `fromSize and toSize must be provided as stringified numbers with the same units as minScreen and maxScreen.
`, 77: `remToPx expects a value in "rem" but you provided it in "%s".
`, 78: `base must be set in "px" or "%" but you set it in "%s".
` };
function format() {
for (var _len = arguments.length, args = new Array(_len), _key = 0; _key < _len; _key++) args[_key] = arguments[_key];
var a = args[0], b = [], c;
for (c = 1; c < args.length; c += 1) b.push(args[c]);
return b.forEach(function(d) {
a = a.replace(/%[a-z]/, d);
}), a;
}
var PolishedError = function(_Error) {
_inheritsLoose(PolishedError2, _Error);
function PolishedError2(code) {
for (var _this, _len2 = arguments.length, args = new Array(_len2 > 1 ? _len2 - 1 : 0), _key2 = 1; _key2 < _len2; _key2++) args[_key2 - 1] = arguments[_key2];
return _this = _Error.call(this, format.apply(void 0, [ERRORS[code]].concat(args))) || this, _assertThisInitialized(_this);
}
return PolishedError2;
}(_wrapNativeSuper(Error));
function colorToInt(color2) {
return Math.round(color2 * 255);
}
function convertToInt(red, green, blue) {
return colorToInt(red) + "," + colorToInt(green) + "," + colorToInt(blue);
}
function hslToRgb(hue, saturation, lightness, convert2) {
if (convert2 === void 0 && (convert2 = convertToInt), saturation === 0) return convert2(lightness, lightness, lightness);
var huePrime = (hue % 360 + 360) % 360 / 60, chroma = (1 - Math.abs(2 * lightness - 1)) * saturation, secondComponent = chroma * (1 - Math.abs(huePrime % 2 - 1)), red = 0, green = 0, blue = 0;
huePrime >= 0 && huePrime < 1 ? (red = chroma, green = secondComponent) : huePrime >= 1 && huePrime < 2 ? (red = secondComponent, green = chroma) : huePrime >= 2 && huePrime < 3 ? (green = chroma, blue = secondComponent) : huePrime >= 3 && huePrime < 4 ? (green = secondComponent, blue = chroma) : huePrime >= 4 && huePrime < 5 ? (red = secondComponent, blue = chroma) : huePrime >= 5 && huePrime < 6 && (red = chroma, blue = secondComponent);
var lightnessModification = lightness - chroma / 2, finalRed = red + lightnessModification, finalGreen = green + lightnessModification, finalBlue = blue + lightnessModification;
return convert2(finalRed, finalGreen, finalBlue);
}
var namedColorMap = { aliceblue: "f0f8ff", antiquewhite: "faebd7", aqua: "00ffff", aquamarine: "7fffd4", azure: "f0ffff", beige: "f5f5dc", bisque: "ffe4c4", black: "000", blanchedalmond: "ffebcd", blue: "0000ff", blueviolet: "8a2be2", brown: "a52a2a", burlywood: "deb887", cadetblue: "5f9ea0", chartreuse: "7fff00", chocolate: "d2691e", coral: "ff7f50", cornflowerblue: "6495ed", cornsilk: "fff8dc", crimson: "dc143c", cyan: "00ffff", darkblue: "00008b", darkcyan: "008b8b", darkgoldenrod: "b8860b", darkgray: "a9a9a9", darkgreen: "006400", darkgrey: "a9a9a9", darkkhaki: "bdb76b", darkmagenta: "8b008b", darkolivegreen: "556b2f", darkorange: "ff8c00", darkorchid: "9932cc", darkred: "8b0000", darksalmon: "e9967a", darkseagreen: "8fbc8f", darkslateblue: "483d8b", darkslategray: "2f4f4f", darkslategrey: "2f4f4f", darkturquoise: "00ced1", darkviolet: "9400d3", deeppink: "ff1493", deepskyblue: "00bfff", dimgray: "696969", dimgrey: "696969", dodgerblue: "1e90ff", firebrick: "b22222", floralwhite: "fffaf0", forestgreen: "228b22", fuchsia: "ff00ff", gainsboro: "dcdcdc", ghostwhite: "f8f8ff", gold: "ffd700", goldenrod: "daa520", gray: "808080", green: "008000", greenyellow: "adff2f", grey: "808080", honeydew: "f0fff0", hotpink: "ff69b4", indianred: "cd5c5c", indigo: "4b0082", ivory: "fffff0", khaki: "f0e68c", lavender: "e6e6fa", lavenderblush: "fff0f5", lawngreen: "7cfc00", lemonchiffon: "fffacd", lightblue: "add8e6", lightcoral: "f08080", lightcyan: "e0ffff", lightgoldenrodyellow: "fafad2", lightgray: "d3d3d3", lightgreen: "90ee90", lightgrey: "d3d3d3", lightpink: "ffb6c1", lightsalmon: "ffa07a", lightseagreen: "20b2aa", lightskyblue: "87cefa", lightslategray: "789", lightslategrey: "789", lightsteelblue: "b0c4de", lightyellow: "ffffe0", lime: "0f0", limegreen: "32cd32", linen: "faf0e6", magenta: "f0f", maroon: "800000", mediumaquamarine: "66cdaa", mediumblue: "0000cd", mediumorchid: "ba55d3", mediumpurple: "9370db", mediumseagreen: "3cb371", mediumslateblue: "7b68ee", mediumspringgreen: "00fa9a", mediumturquoise: "48d1cc", mediumvioletred: "c71585", midnightblue: "191970", mintcream: "f5fffa", mistyrose: "ffe4e1", moccasin: "ffe4b5", navajowhite: "ffdead", navy: "000080", oldlace: "fdf5e6", olive: "808000", olivedrab: "6b8e23", orange: "ffa500", orangered: "ff4500", orchid: "da70d6", palegoldenrod: "eee8aa", palegreen: "98fb98", paleturquoise: "afeeee", palevioletred: "db7093", papayawhip: "ffefd5", peachpuff: "ffdab9", peru: "cd853f", pink: "ffc0cb", plum: "dda0dd", powderblue: "b0e0e6", purple: "800080", rebeccapurple: "639", red: "f00", rosybrown: "bc8f8f", royalblue: "4169e1", saddlebrown: "8b4513", salmon: "fa8072", sandybrown: "f4a460", seagreen: "2e8b57", seashell: "fff5ee", sienna: "a0522d", silver: "c0c0c0", skyblue: "87ceeb", slateblue: "6a5acd", slategray: "708090", slategrey: "708090", snow: "fffafa", springgreen: "00ff7f", steelblue: "4682b4", tan: "d2b48c", teal: "008080", thistle: "d8bfd8", tomato: "ff6347", turquoise: "40e0d0", violet: "ee82ee", wheat: "f5deb3", white: "fff", whitesmoke: "f5f5f5", yellow: "ff0", yellowgreen: "9acd32" };
function nameToHex(color2) {
if (typeof color2 != "string") return color2;
var normalizedColorName = color2.toLowerCase();
return namedColorMap[normalizedColorName] ? "#" + namedColorMap[normalizedColorName] : color2;
}
var hexRegex = /^#[a-fA-F0-9]{6}$/, hexRgbaRegex = /^#[a-fA-F0-9]{8}$/, reducedHexRegex = /^#[a-fA-F0-9]{3}$/, reducedRgbaHexRegex = /^#[a-fA-F0-9]{4}$/, rgbRegex = /^rgb\(\s*(\d{1,3})\s*(?:,)?\s*(\d{1,3})\s*(?:,)?\s*(\d{1,3})\s*\)$/i, rgbaRegex = /^rgb(?:a)?\(\s*(\d{1,3})\s*(?:,)?\s*(\d{1,3})\s*(?:,)?\s*(\d{1,3})\s*(?:,|\/)\s*([-+]?\d*[.]?\d+[%]?)\s*\)$/i, hslRegex = /^hsl\(\s*(\d{0,3}[.]?[0-9]+(?:deg)?)\s*(?:,)?\s*(\d{1,3}[.]?[0-9]?)%\s*(?:,)?\s*(\d{1,3}[.]?[0-9]?)%\s*\)$/i, hslaRegex = /^hsl(?:a)?\(\s*(\d{0,3}[.]?[0-9]+(?:deg)?)\s*(?:,)?\s*(\d{1,3}[.]?[0-9]?)%\s*(?:,)?\s*(\d{1,3}[.]?[0-9]?)%\s*(?:,|\/)\s*([-+]?\d*[.]?\d+[%]?)\s*\)$/i;
function parseToRgb(color2) {
if (typeof color2 != "string") throw new PolishedError(3);
var normalizedColor = nameToHex(color2);
if (normalizedColor.match(hexRegex)) return { red: parseInt("" + normalizedColor[1] + normalizedColor[2], 16), green: parseInt("" + normalizedColor[3] + normalizedColor[4], 16), blue: parseInt("" + normalizedColor[5] + normalizedColor[6], 16) };
if (normalizedColor.match(hexRgbaRegex)) {
var alpha = parseFloat((parseInt("" + normalizedColor[7] + normalizedColor[8], 16) / 255).toFixed(2));
return { red: parseInt("" + normalizedColor[1] + normalizedColor[2], 16), green: parseInt("" + normalizedColor[3] + normalizedColor[4], 16), blue: parseInt("" + normalizedColor[5] + normalizedColor[6], 16), alpha };
}
if (normalizedColor.match(reducedHexRegex)) return { red: parseInt("" + normalizedColor[1] + normalizedColor[1], 16), green: parseInt("" + normalizedColor[2] + normalizedColor[2], 16), blue: parseInt("" + normalizedColor[3] + normalizedColor[3], 16) };
if (normalizedColor.match(reducedRgbaHexRegex)) {
var _alpha = parseFloat((parseInt("" + normalizedColor[4] + normalizedColor[4], 16) / 255).toFixed(2));
return { red: parseInt("" + normalizedColor[1] + normalizedColor[1], 16), green: parseInt("" + normalizedColor[2] + normalizedColor[2], 16), blue: parseInt("" + normalizedColor[3] + normalizedColor[3], 16), alpha: _alpha };
}
var rgbMatched = rgbRegex.exec(normalizedColor);
if (rgbMatched) return { red: parseInt("" + rgbMatched[1], 10), green: parseInt("" + rgbMatched[2], 10), blue: parseInt("" + rgbMatched[3], 10) };
var rgbaMatched = rgbaRegex.exec(normalizedColor.substring(0, 50));
if (rgbaMatched) return { red: parseInt("" + rgbaMatched[1], 10), green: parseInt("" + rgbaMatched[2], 10), blue: parseInt("" + rgbaMatched[3], 10), alpha: parseFloat("" + rgbaMatched[4]) > 1 ? parseFloat("" + rgbaMatched[4]) / 100 : parseFloat("" + rgbaMatched[4]) };
var hslMatched = hslRegex.exec(normalizedColor);
if (hslMatched) {
var hue = parseInt("" + hslMatched[1], 10), saturation = parseInt("" + hslMatched[2], 10) / 100, lightness = parseInt("" + hslMatched[3], 10) / 100, rgbColorString = "rgb(" + hslToRgb(hue, saturation, lightness) + ")", hslRgbMatched = rgbRegex.exec(rgbColorString);
if (!hslRgbMatched) throw new PolishedError(4, normalizedColor, rgbColorString);
return { red: parseInt("" + hslRgbMatched[1], 10), green: parseInt("" + hslRgbMatched[2], 10), blue: parseInt("" + hslRgbMatched[3], 10) };
}
var hslaMatched = hslaRegex.exec(normalizedColor.substring(0, 50));
if (hslaMatched) {
var _hue = parseInt("" + hslaMatched[1], 10), _saturation = parseInt("" + hslaMatched[2], 10) / 100, _lightness = parseInt("" + hslaMatched[3], 10) / 100, _rgbColorString = "rgb(" + hslToRgb(_hue, _saturation, _lightness) + ")", _hslRgbMatched = rgbRegex.exec(_rgbColorString);
if (!_hslRgbMatched) throw new PolishedError(4, normalizedColor, _rgbColorString);
return { red: parseInt("" + _hslRgbMatched[1], 10), green: parseInt("" + _hslRgbMatched[2], 10), blue: parseInt("" + _hslRgbMatched[3], 10), alpha: parseFloat("" + hslaMatched[4]) > 1 ? parseFloat("" + hslaMatched[4]) / 100 : parseFloat("" + hslaMatched[4]) };
}
throw new PolishedError(5);
}
function rgbToHsl(color2) {
var red = color2.red / 255, green = color2.green / 255, blue = color2.blue / 255, max = Math.max(red, green, blue), min = Math.min(red, green, blue), lightness = (max + min) / 2;
if (max === min) return color2.alpha !== void 0 ? { hue: 0, saturation: 0, lightness, alpha: color2.alpha } : { hue: 0, saturation: 0, lightness };
var hue, delta = max - min, saturation = lightness > 0.5 ? delta / (2 - max - min) : delta / (max + min);
switch (max) {
case red:
hue = (green - blue) / delta + (green < blue ? 6 : 0);
break;
case green:
hue = (blue - red) / delta + 2;
break;
default:
hue = (red - green) / delta + 4;
break;
}
return hue *= 60, color2.alpha !== void 0 ? { hue, saturation, lightness, alpha: color2.alpha } : { hue, saturation, lightness };
}
function parseToHsl(color2) {
return rgbToHsl(parseToRgb(color2));
}
var reduceHexValue = function(value) {
return value.length === 7 && value[1] === value[2] && value[3] === value[4] && value[5] === value[6] ? "#" + value[1] + value[3] + value[5] : value;
}, reduceHexValue$1 = reduceHexValue;
function numberToHex(value) {
var hex = value.toString(16);
return hex.length === 1 ? "0" + hex : hex;
}
function colorToHex(color2) {
return numberToHex(Math.round(color2 * 255));
}
function convertToHex(red, green, blue) {
return reduceHexValue$1("#" + colorToHex(red) + colorToHex(green) + colorToHex(blue));
}
function hslToHex(hue, saturation, lightness) {
return hslToRgb(hue, saturation, lightness, convertToHex);
}
function hsl(value, saturation, lightness) {
if (typeof value == "number" && typeof saturation == "number" && typeof lightness == "number") return hslToHex(value, saturation, lightness);
if (typeof value == "object" && saturation === void 0 && lightness === void 0) return hslToHex(value.hue, value.saturation, value.lightness);
throw new PolishedError(1);
}
function hsla(value, saturation, lightness, alpha) {
if (typeof value == "number" && typeof saturation == "number" && typeof lightness == "number" && typeof alpha == "number") return alpha >= 1 ? hslToHex(value, saturation, lightness) : "rgba(" + hslToRgb(value, saturation, lightness) + "," + alpha + ")";
if (typeof value == "object" && saturation === void 0 && lightness === void 0 && alpha === void 0) return value.alpha >= 1 ? hslToHex(value.hue, value.saturation, value.lightness) : "rgba(" + hslToRgb(value.hue, value.saturation, value.lightness) + "," + value.alpha + ")";
throw new PolishedError(2);
}
function rgb(value, green, blue) {
if (typeof value == "number" && typeof green == "number" && typeof blue == "number") return reduceHexValue$1("#" + numberToHex(value) + numberToHex(green) + numberToHex(blue));
if (typeof value == "object" && green === void 0 && blue === void 0) return reduceHexValue$1("#" + numberToHex(value.red) + numberToHex(value.green) + numberToHex(value.blue));
throw new PolishedError(6);
}
function rgba(firstValue, secondValue, thirdValue, fourthValue) {
if (typeof firstValue == "string" && typeof secondValue == "number") {
var rgbValue = parseToRgb(firstValue);
return "rgba(" + rgbValue.red + "," + rgbValue.green + "," + rgbValue.blue + "," + secondValue + ")";
} else {
if (typeof firstValue == "number" && typeof secondValue == "number" && typeof thirdValue == "number" && typeof fourthValue == "number") return fourthValue >= 1 ? rgb(firstValue, secondValue, thirdValue) : "rgba(" + firstValue + "," + secondValue + "," + thirdValue + "," + fourthValue + ")";
if (typeof firstValue == "object" && secondValue === void 0 && thirdValue === void 0 && fourthValue === void 0) return firstValue.alpha >= 1 ? rgb(firstValue.red, firstValue.green, firstValue.blue) : "rgba(" + firstValue.red + "," + firstValue.green + "," + firstValue.blue + "," + firstValue.alpha + ")";
}
throw new PolishedError(7);
}
var isRgb = function(color2) {
return typeof color2.red == "number" && typeof color2.green == "number" && typeof color2.blue == "number" && (typeof color2.alpha != "number" || typeof color2.alpha > "u");
}, isRgba = function(color2) {
return typeof color2.red == "number" && typeof color2.green == "number" && typeof color2.blue == "number" && typeof color2.alpha == "number";
}, isHsl = function(color2) {
return typeof color2.hue == "number" && typeof color2.saturation == "number" && typeof color2.lightness == "number" && (typeof color2.alpha != "number" || typeof color2.alpha > "u");
}, isHsla = function(color2) {
return typeof color2.hue == "number" && typeof color2.saturation == "number" && typeof color2.lightness == "number" && typeof color2.alpha == "number";
};
function toColorString(color2) {
if (typeof color2 != "object") throw new PolishedError(8);
if (isRgba(color2)) return rgba(color2);
if (isRgb(color2)) return rgb(color2);
if (isHsla(color2)) return hsla(color2);
if (isHsl(color2)) return hsl(color2);
throw new PolishedError(8);
}
function curried(f, length, acc) {
return function() {
var combined = acc.concat(Array.prototype.slice.call(arguments));
return combined.length >= length ? f.apply(this, combined) : curried(f, length, combined);
};
}
function curry(f) {
return curried(f, f.length, []);
}
function adjustHue(degree, color2) {
if (color2 === "transparent") return color2;
var hslColor = parseToHsl(color2);
return toColorString(_extends({}, hslColor, { hue: hslColor.hue + parseFloat(degree) }));
}
curry(adjustHue);
function guard(lowerBoundary, upperBoundary, value) {
return Math.max(lowerBoundary, Math.min(upperBoundary, value));
}
function darken2(amount, color2) {
if (color2 === "transparent") return color2;
var hslColor = parseToHsl(color2);
return toColorString(_extends({}, hslColor, { lightness: guard(0, 1, hslColor.lightness - parseFloat(amount)) }));
}
curry(darken2);
function desaturate(amount, color2) {
if (color2 === "transparent") return color2;
var hslColor = parseToHsl(color2);
return toColorString(_extends({}, hslColor, { saturation: guard(0, 1, hslColor.saturation - parseFloat(amount)) }));
}
curry(desaturate);
function lighten2(amount, color2) {
if (color2 === "transparent") return color2;
var hslColor = parseToHsl(color2);
return toColorString(_extends({}, hslColor, { lightness: guard(0, 1, hslColor.lightness + parseFloat(amount)) }));
}
curry(lighten2);
function mix(weight, color2, otherColor) {
if (color2 === "transparent") return otherColor;
if (otherColor === "transparent") return color2;
if (weight === 0) return otherColor;
var parsedColor1 = parseToRgb(color2), color1 = _extends({}, parsedColor1, { alpha: typeof parsedColor1.alpha == "number" ? parsedColor1.alpha : 1 }), parsedColor2 = parseToRgb(otherColor), color22 = _extends({}, parsedColor2, { alpha: typeof parsedColor2.alpha == "number" ? parsedColor2.alpha : 1 }), alphaDelta = color1.alpha - color22.alpha, x = parseFloat(weight) * 2 - 1, y = x * alphaDelta === -1 ? x : x + alphaDelta, z = 1 + x * alphaDelta, weight1 = (y / z + 1) / 2, weight2 = 1 - weight1, mixedColor = { red: Math.floor(color1.red * weight1 + color22.red * weight2), green: Math.floor(color1.green * weight1 + color22.green * weight2), blue: Math.floor(color1.blue * weight1 + color22.blue * weight2), alpha: color1.alpha * parseFloat(weight) + color22.alpha * (1 - parseFloat(weight)) };
return rgba(mixedColor);
}
var curriedMix = curry(mix), mix$1 = curriedMix;
function opacify(amount, color2) {
if (color2 === "transparent") return color2;
var parsedColor = parseToRgb(color2), alpha = typeof parsedColor.alpha == "number" ? parsedColor.alpha : 1, colorWithAlpha = _extends({}, parsedColor, { alpha: guard(0, 1, (alpha * 100 + parseFloat(amount) * 100) / 100) });
return rgba(colorWithAlpha);
}
curry(opacify);
function saturate(amount, color2) {
if (color2 === "transparent") return color2;
var hslColor = parseToHsl(color2);
return toColorString(_extends({}, hslColor, { saturation: guard(0, 1, hslColor.saturation + parseFloat(amount)) }));
}
curry(saturate);
function setHue(hue, color2) {
return color2 === "transparent" ? color2 : toColorString(_extends({}, parseToHsl(color2), { hue: parseFloat(hue) }));
}
curry(setHue);
function setLightness(lightness, color2) {
return color2 === "transparent" ? color2 : toColorString(_extends({}, parseToHsl(color2), { lightness: parseFloat(lightness) }));
}
curry(setLightness);
function setSaturation(saturation, color2) {
return color2 === "transparent" ? color2 : toColorString(_extends({}, parseToHsl(color2), { saturation: parseFloat(saturation) }));
}
curry(setSaturation);
function shade(percentage, color2) {
return color2 === "transparent" ? color2 : mix$1(parseFloat(percentage), "rgb(0, 0, 0)", color2);
}
curry(shade);
function tint(percentage, color2) {
return color2 === "transparent" ? color2 : mix$1(parseFloat(percentage), "rgb(255, 255, 255)", color2);
}
curry(tint);
function transparentize(amount, color2) {
if (color2 === "transparent") return color2;
var parsedColor = parseToRgb(color2), alpha = typeof parsedColor.alpha == "number" ? parsedColor.alpha : 1, colorWithAlpha = _extends({}, parsedColor, { alpha: guard(0, 1, +(alpha * 100 - parseFloat(amount) * 100).toFixed(2) / 100) });
return rgba(colorWithAlpha);
}
var curriedTransparentize = curry(transparentize), curriedTransparentize$1 = curriedTransparentize, Wrapper = styled.div(withReset, ({ theme }) => ({ backgroundColor: theme.base === "light" ? "rgba(0,0,0,.01)" : "rgba(255,255,255,.01)", borderRadius: theme.appBorderRadius, border: `1px dashed ${theme.appBorderColor}`, display: "flex", alignItems: "center", justifyContent: "center", padding: 20, margin: "25px 0 40px", color: curriedTransparentize$1(0.3, theme.color.defaultText), fontSize: theme.typography.size.s2 })), EmptyBlock = (props) => react_default.createElement(Wrapper, { ...props, className: "docblock-emptyblock sb-unstyled" }), StyledSyntaxHighlighter = styled(SyntaxHighlighter)(({ theme }) => ({ fontSize: `${theme.typography.size.s2 - 1}px`, lineHeight: "19px", margin: "25px 0 40px", borderRadius: theme.appBorderRadius, boxShadow: theme.base === "light" ? "rgba(0, 0, 0, 0.10) 0 1px 3px 0" : "rgba(0, 0, 0, 0.20) 0 2px 5px 0", "pre.prismjs": { padding: 20, background: "inherit" } })), SourceSkeletonWrapper = styled.div(({ theme }) => ({ background: theme.background.content, borderRadius: theme.appBorderRadius, border: `1px solid ${theme.appBorderColor}`, boxShadow: theme.base === "light" ? "rgba(0, 0, 0, 0.10) 0 1px 3px 0" : "rgba(0, 0, 0, 0.20) 0 2px 5px 0", margin: "25px 0 40px", padding: "20px 20px 20px 22px" })), SourceSkeletonPlaceholder = styled.div(({ theme }) => ({ animation: `${theme.animation.glow} 1.5s ease-in-out infinite`, background: theme.appBorderColor, height: 17, marginTop: 1, width: "60%", [`&:first-child${ignoreSsrWarning}`]: { margin: 0 } })), SourceSkeleton = () => react_default.createElement(SourceSkeletonWrapper, null, react_default.createElement(SourceSkeletonPlaceholder, null), react_default.createElement(SourceSkeletonPlaceholder, { style: { width: "80%" } }), react_default.createElement(SourceSkeletonPlaceholder, { style: { width: "30%" } }), react_default.createElement(SourceSkeletonPlaceholder, { style: { width: "80%" } })), Source = ({ isLoading, error, language, code, dark, format: format2 = !0, ...rest }) => {
let { typography: typography2 } = useTheme();
if (isLoading) return react_default.createElement(SourceSkeleton, null);
if (error) return react_default.createElement(EmptyBlock, null, error);
let syntaxHighlighter = react_default.createElement(StyledSyntaxHighlighter, { bordered: !0, copyable: !0, format: format2, language: language ?? "jsx", className: "docblock-source sb-unstyled", ...rest }, code);
if (typeof dark > "u") return syntaxHighlighter;
let overrideTheme = dark ? themes.dark : themes.light;
return react_default.createElement(ThemeProvider, { theme: convert({ ...overrideTheme, fontCode: typography2.fonts.mono, fontBase: typography2.fonts.base }) }, syntaxHighlighter);
};
addons.register(ADDON_ID, (api) => {
addons.add(PANEL_ID, { title: "Code", type: types.PANEL, paramKey: PARAM_KEY, disabled: (parameters) => !parameters?.docs?.codePanel, match: ({ viewMode }) => viewMode === "story", render: ({ active }) => {
let channel = api.getChannel(), currentStory = api.getCurrentStoryData(), lastEvent = channel?.last(SNIPPET_RENDERED)?.[0], [codeSnippet, setSourceCode] = useState({ source: lastEvent?.source, format: lastEvent?.format ?? void 0 }), parameter = useParameter(PARAM_KEY, { source: { code: "" }, theme: "dark" });
useEffect(() => {
setSourceCode({ source: void 0, format: void 0 });
}, [currentStory?.id]), useChannel({ [SNIPPET_RENDERED]: ({ source, format: format2 }) => {
setSourceCode({ source, format: format2 });
} });
let isDark = useTheme().base !== "light";
return react_default.createElement(AddonPanel, { active: !!active }, react_default.createElement(SourceStyles, null, react_default.createElement(Source, { ...parameter.source, code: parameter.source?.code || codeSnippet.source || parameter.source?.originalSource, format: codeSnippet.format, dark: isDark })));
} });
});
var SourceStyles = styled.div(() => ({ height: "100%", [`> :first-child${ignoreSsrWarning}`]: { margin: 0, height: "100%", boxShadow: "none" } }));
})();
}catch(e){ console.error("[Storybook] One of your manager-entries failed: " + import.meta.url, e); }

View File

@@ -1 +0,0 @@
import '/home/tony/AI/projects/hive/frontend/node_modules/@storybook/addon-docs/dist/manager.js';

View File

@@ -1 +0,0 @@
import '/home/tony/AI/projects/hive/frontend/node_modules/@storybook/addon-onboarding/dist/manager.js';

View File

@@ -1 +0,0 @@
import '/home/tony/AI/projects/hive/frontend/node_modules/storybook/dist/core-server/presets/common-manager.js';

View File

@@ -1,25 +0,0 @@
import "./chunk-KEXKKQVW.js";
// node_modules/@emotion/memoize/dist/memoize.browser.esm.js
function memoize(fn) {
var cache = {};
return function(arg) {
if (cache[arg] === void 0) cache[arg] = fn(arg);
return cache[arg];
};
}
var memoize_browser_esm_default = memoize;
// node_modules/@emotion/is-prop-valid/dist/is-prop-valid.browser.esm.js
var reactPropsRegex = /^((children|dangerouslySetInnerHTML|key|ref|autoFocus|defaultValue|defaultChecked|innerHTML|suppressContentEditableWarning|suppressHydrationWarning|valueLink|accept|acceptCharset|accessKey|action|allow|allowUserMedia|allowPaymentRequest|allowFullScreen|allowTransparency|alt|async|autoComplete|autoPlay|capture|cellPadding|cellSpacing|challenge|charSet|checked|cite|classID|className|cols|colSpan|content|contentEditable|contextMenu|controls|controlsList|coords|crossOrigin|data|dateTime|decoding|default|defer|dir|disabled|disablePictureInPicture|download|draggable|encType|form|formAction|formEncType|formMethod|formNoValidate|formTarget|frameBorder|headers|height|hidden|high|href|hrefLang|htmlFor|httpEquiv|id|inputMode|integrity|is|keyParams|keyType|kind|label|lang|list|loading|loop|low|marginHeight|marginWidth|max|maxLength|media|mediaGroup|method|min|minLength|multiple|muted|name|nonce|noValidate|open|optimum|pattern|placeholder|playsInline|poster|preload|profile|radioGroup|readOnly|referrerPolicy|rel|required|reversed|role|rows|rowSpan|sandbox|scope|scoped|scrolling|seamless|selected|shape|size|sizes|slot|span|spellCheck|src|srcDoc|srcLang|srcSet|start|step|style|summary|tabIndex|target|title|type|useMap|value|width|wmode|wrap|about|datatype|inlist|prefix|property|resource|typeof|vocab|autoCapitalize|autoCorrect|autoSave|color|inert|itemProp|itemScope|itemType|itemID|itemRef|on|results|security|unselectable|accentHeight|accumulate|additive|alignmentBaseline|allowReorder|alphabetic|amplitude|arabicForm|ascent|attributeName|attributeType|autoReverse|azimuth|baseFrequency|baselineShift|baseProfile|bbox|begin|bias|by|calcMode|capHeight|clip|clipPathUnits|clipPath|clipRule|colorInterpolation|colorInterpolationFilters|colorProfile|colorRendering|contentScriptType|contentStyleType|cursor|cx|cy|d|decelerate|descent|diffuseConstant|direction|display|divisor|dominantBaseline|dur|dx|dy|edgeMode|elevation|enableBackground|end|exponent|externalResourcesRequired|fill|fillOpacity|fillRule|filter|filterRes|filterUnits|floodColor|floodOpacity|focusable|fontFamily|fontSize|fontSizeAdjust|fontStretch|fontStyle|fontVariant|fontWeight|format|from|fr|fx|fy|g1|g2|glyphName|glyphOrientationHorizontal|glyphOrientationVertical|glyphRef|gradientTransform|gradientUnits|hanging|horizAdvX|horizOriginX|ideographic|imageRendering|in|in2|intercept|k|k1|k2|k3|k4|kernelMatrix|kernelUnitLength|kerning|keyPoints|keySplines|keyTimes|lengthAdjust|letterSpacing|lightingColor|limitingConeAngle|local|markerEnd|markerMid|markerStart|markerHeight|markerUnits|markerWidth|mask|maskContentUnits|maskUnits|mathematical|mode|numOctaves|offset|opacity|operator|order|orient|orientation|origin|overflow|overlinePosition|overlineThickness|panose1|paintOrder|pathLength|patternContentUnits|patternTransform|patternUnits|pointerEvents|points|pointsAtX|pointsAtY|pointsAtZ|preserveAlpha|preserveAspectRatio|primitiveUnits|r|radius|refX|refY|renderingIntent|repeatCount|repeatDur|requiredExtensions|requiredFeatures|restart|result|rotate|rx|ry|scale|seed|shapeRendering|slope|spacing|specularConstant|specularExponent|speed|spreadMethod|startOffset|stdDeviation|stemh|stemv|stitchTiles|stopColor|stopOpacity|strikethroughPosition|strikethroughThickness|string|stroke|strokeDasharray|strokeDashoffset|strokeLinecap|strokeLinejoin|strokeMiterlimit|strokeOpacity|strokeWidth|surfaceScale|systemLanguage|tableValues|targetX|targetY|textAnchor|textDecoration|textRendering|textLength|to|transform|u1|u2|underlinePosition|underlineThickness|unicode|unicodeBidi|unicodeRange|unitsPerEm|vAlphabetic|vHanging|vIdeographic|vMathematical|values|vectorEffect|version|vertAdvY|vertOriginX|vertOriginY|viewBox|viewTarget|visibility|widths|wordSpacing|writingMode|x|xHeight|x1|x2|xChannelSelector|xlinkActuate|xlinkArcrole|xlinkHref|xlinkRole|xlinkShow|xlinkTitle|xlinkType|xmlBase|xmlns|xmlnsXlink|xmlLang|xmlSpace|y|y1|y2|yChannelSelector|z|zoomAndPan|for|class|autofocus)|(([Dd][Aa][Tt][Aa]|[Aa][Rr][Ii][Aa]|x)-.*))$/;
var index = memoize_browser_esm_default(
function(prop) {
return reactPropsRegex.test(prop) || prop.charCodeAt(0) === 111 && prop.charCodeAt(1) === 110 && prop.charCodeAt(2) < 91;
}
/* Z+1 */
);
var is_prop_valid_browser_esm_default = index;
export {
is_prop_valid_browser_esm_default as default
};
//# sourceMappingURL=@emotion_is-prop-valid.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../@emotion/memoize/dist/memoize.browser.esm.js", "../../../../../@emotion/is-prop-valid/dist/is-prop-valid.browser.esm.js"],
"sourcesContent": ["function memoize(fn) {\n var cache = {};\n return function (arg) {\n if (cache[arg] === undefined) cache[arg] = fn(arg);\n return cache[arg];\n };\n}\n\nexport default memoize;\n", "import memoize from '@emotion/memoize';\n\nvar reactPropsRegex = /^((children|dangerouslySetInnerHTML|key|ref|autoFocus|defaultValue|defaultChecked|innerHTML|suppressContentEditableWarning|suppressHydrationWarning|valueLink|accept|acceptCharset|accessKey|action|allow|allowUserMedia|allowPaymentRequest|allowFullScreen|allowTransparency|alt|async|autoComplete|autoPlay|capture|cellPadding|cellSpacing|challenge|charSet|checked|cite|classID|className|cols|colSpan|content|contentEditable|contextMenu|controls|controlsList|coords|crossOrigin|data|dateTime|decoding|default|defer|dir|disabled|disablePictureInPicture|download|draggable|encType|form|formAction|formEncType|formMethod|formNoValidate|formTarget|frameBorder|headers|height|hidden|high|href|hrefLang|htmlFor|httpEquiv|id|inputMode|integrity|is|keyParams|keyType|kind|label|lang|list|loading|loop|low|marginHeight|marginWidth|max|maxLength|media|mediaGroup|method|min|minLength|multiple|muted|name|nonce|noValidate|open|optimum|pattern|placeholder|playsInline|poster|preload|profile|radioGroup|readOnly|referrerPolicy|rel|required|reversed|role|rows|rowSpan|sandbox|scope|scoped|scrolling|seamless|selected|shape|size|sizes|slot|span|spellCheck|src|srcDoc|srcLang|srcSet|start|step|style|summary|tabIndex|target|title|type|useMap|value|width|wmode|wrap|about|datatype|inlist|prefix|property|resource|typeof|vocab|autoCapitalize|autoCorrect|autoSave|color|inert|itemProp|itemScope|itemType|itemID|itemRef|on|results|security|unselectable|accentHeight|accumulate|additive|alignmentBaseline|allowReorder|alphabetic|amplitude|arabicForm|ascent|attributeName|attributeType|autoReverse|azimuth|baseFrequency|baselineShift|baseProfile|bbox|begin|bias|by|calcMode|capHeight|clip|clipPathUnits|clipPath|clipRule|colorInterpolation|colorInterpolationFilters|colorProfile|colorRendering|contentScriptType|contentStyleType|cursor|cx|cy|d|decelerate|descent|diffuseConstant|direction|display|divisor|dominantBaseline|dur|dx|dy|edgeMode|elevation|enableBackground|end|exponent|externalResourcesRequired|fill|fillOpacity|fillRule|filter|filterRes|filterUnits|floodColor|floodOpacity|focusable|fontFamily|fontSize|fontSizeAdjust|fontStretch|fontStyle|fontVariant|fontWeight|format|from|fr|fx|fy|g1|g2|glyphName|glyphOrientationHorizontal|glyphOrientationVertical|glyphRef|gradientTransform|gradientUnits|hanging|horizAdvX|horizOriginX|ideographic|imageRendering|in|in2|intercept|k|k1|k2|k3|k4|kernelMatrix|kernelUnitLength|kerning|keyPoints|keySplines|keyTimes|lengthAdjust|letterSpacing|lightingColor|limitingConeAngle|local|markerEnd|markerMid|markerStart|markerHeight|markerUnits|markerWidth|mask|maskContentUnits|maskUnits|mathematical|mode|numOctaves|offset|opacity|operator|order|orient|orientation|origin|overflow|overlinePosition|overlineThickness|panose1|paintOrder|pathLength|patternContentUnits|patternTransform|patternUnits|pointerEvents|points|pointsAtX|pointsAtY|pointsAtZ|preserveAlpha|preserveAspectRatio|primitiveUnits|r|radius|refX|refY|renderingIntent|repeatCount|repeatDur|requiredExtensions|requiredFeatures|restart|result|rotate|rx|ry|scale|seed|shapeRendering|slope|spacing|specularConstant|specularExponent|speed|spreadMethod|startOffset|stdDeviation|stemh|stemv|stitchTiles|stopColor|stopOpacity|strikethroughPosition|strikethroughThickness|string|stroke|strokeDasharray|strokeDashoffset|strokeLinecap|strokeLinejoin|strokeMiterlimit|strokeOpacity|strokeWidth|surfaceScale|systemLanguage|tableValues|targetX|targetY|textAnchor|textDecoration|textRendering|textLength|to|transform|u1|u2|underlinePosition|underlineThickness|unicode|unicodeBidi|unicodeRange|unitsPerEm|vAlphabetic|vHanging|vIdeographic|vMathematical|values|vectorEffect|version|vertAdvY|vertOriginX|vertOriginY|viewBox|viewTarget|visibility|widths|wordSpacing|writingMode|x|xHeight|x1|x2|xChannelSelector|xlinkActuate|xlinkArcrole|xlinkHref|xlinkRole|xlinkShow|xlinkTitle|xlinkType|xmlBase|xmlns|xmlnsXlink|xmlLang|xmlSpace|y|y1|y2|yChannelSelector|z|zoomAndPan|for|class|autofocus)|(([Dd][Aa][Tt][Aa]|[Aa][Rr][Ii][Aa]|x)-.*))$/; // https://esbench.com/bench/5bfee68a4cd7e6009ef61d23\n\nvar index = memoize(function (prop) {\n return reactPropsRegex.test(prop) || prop.charCodeAt(0) === 111\n /* o */\n && prop.charCodeAt(1) === 110\n /* n */\n && prop.charCodeAt(2) < 91;\n}\n/* Z+1 */\n);\n\nexport default index;\n"],
"mappings": ";;;AAAA,SAAS,QAAQ,IAAI;AACnB,MAAI,QAAQ,CAAC;AACb,SAAO,SAAU,KAAK;AACpB,QAAI,MAAM,GAAG,MAAM,OAAW,OAAM,GAAG,IAAI,GAAG,GAAG;AACjD,WAAO,MAAM,GAAG;AAAA,EAClB;AACF;AAEA,IAAO,8BAAQ;;;ACNf,IAAI,kBAAkB;AAEtB,IAAI,QAAQ;AAAA,EAAQ,SAAU,MAAM;AAClC,WAAO,gBAAgB,KAAK,IAAI,KAAK,KAAK,WAAW,CAAC,MAAM,OAEzD,KAAK,WAAW,CAAC,MAAM,OAEvB,KAAK,WAAW,CAAC,IAAI;AAAA,EAC1B;AAAA;AAEA;AAEA,IAAO,oCAAQ;",
"names": []
}

View File

@@ -1,419 +0,0 @@
import "./chunk-KEXKKQVW.js";
// node_modules/@jridgewell/sourcemap-codec/dist/sourcemap-codec.mjs
var comma = ",".charCodeAt(0);
var semicolon = ";".charCodeAt(0);
var chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
var intToChar = new Uint8Array(64);
var charToInt = new Uint8Array(128);
for (let i = 0; i < chars.length; i++) {
const c = chars.charCodeAt(i);
intToChar[i] = c;
charToInt[c] = i;
}
function decodeInteger(reader, relative) {
let value = 0;
let shift = 0;
let integer = 0;
do {
const c = reader.next();
integer = charToInt[c];
value |= (integer & 31) << shift;
shift += 5;
} while (integer & 32);
const shouldNegate = value & 1;
value >>>= 1;
if (shouldNegate) {
value = -2147483648 | -value;
}
return relative + value;
}
function encodeInteger(builder, num, relative) {
let delta = num - relative;
delta = delta < 0 ? -delta << 1 | 1 : delta << 1;
do {
let clamped = delta & 31;
delta >>>= 5;
if (delta > 0) clamped |= 32;
builder.write(intToChar[clamped]);
} while (delta > 0);
return num;
}
function hasMoreVlq(reader, max) {
if (reader.pos >= max) return false;
return reader.peek() !== comma;
}
var bufLength = 1024 * 16;
var td = typeof TextDecoder !== "undefined" ? new TextDecoder() : typeof Buffer !== "undefined" ? {
decode(buf) {
const out = Buffer.from(buf.buffer, buf.byteOffset, buf.byteLength);
return out.toString();
}
} : {
decode(buf) {
let out = "";
for (let i = 0; i < buf.length; i++) {
out += String.fromCharCode(buf[i]);
}
return out;
}
};
var StringWriter = class {
constructor() {
this.pos = 0;
this.out = "";
this.buffer = new Uint8Array(bufLength);
}
write(v) {
const { buffer } = this;
buffer[this.pos++] = v;
if (this.pos === bufLength) {
this.out += td.decode(buffer);
this.pos = 0;
}
}
flush() {
const { buffer, out, pos } = this;
return pos > 0 ? out + td.decode(buffer.subarray(0, pos)) : out;
}
};
var StringReader = class {
constructor(buffer) {
this.pos = 0;
this.buffer = buffer;
}
next() {
return this.buffer.charCodeAt(this.pos++);
}
peek() {
return this.buffer.charCodeAt(this.pos);
}
indexOf(char) {
const { buffer, pos } = this;
const idx = buffer.indexOf(char, pos);
return idx === -1 ? buffer.length : idx;
}
};
var EMPTY = [];
function decodeOriginalScopes(input) {
const { length } = input;
const reader = new StringReader(input);
const scopes = [];
const stack = [];
let line = 0;
for (; reader.pos < length; reader.pos++) {
line = decodeInteger(reader, line);
const column = decodeInteger(reader, 0);
if (!hasMoreVlq(reader, length)) {
const last = stack.pop();
last[2] = line;
last[3] = column;
continue;
}
const kind = decodeInteger(reader, 0);
const fields = decodeInteger(reader, 0);
const hasName = fields & 1;
const scope = hasName ? [line, column, 0, 0, kind, decodeInteger(reader, 0)] : [line, column, 0, 0, kind];
let vars = EMPTY;
if (hasMoreVlq(reader, length)) {
vars = [];
do {
const varsIndex = decodeInteger(reader, 0);
vars.push(varsIndex);
} while (hasMoreVlq(reader, length));
}
scope.vars = vars;
scopes.push(scope);
stack.push(scope);
}
return scopes;
}
function encodeOriginalScopes(scopes) {
const writer = new StringWriter();
for (let i = 0; i < scopes.length; ) {
i = _encodeOriginalScopes(scopes, i, writer, [0]);
}
return writer.flush();
}
function _encodeOriginalScopes(scopes, index, writer, state) {
const scope = scopes[index];
const { 0: startLine, 1: startColumn, 2: endLine, 3: endColumn, 4: kind, vars } = scope;
if (index > 0) writer.write(comma);
state[0] = encodeInteger(writer, startLine, state[0]);
encodeInteger(writer, startColumn, 0);
encodeInteger(writer, kind, 0);
const fields = scope.length === 6 ? 1 : 0;
encodeInteger(writer, fields, 0);
if (scope.length === 6) encodeInteger(writer, scope[5], 0);
for (const v of vars) {
encodeInteger(writer, v, 0);
}
for (index++; index < scopes.length; ) {
const next = scopes[index];
const { 0: l, 1: c } = next;
if (l > endLine || l === endLine && c >= endColumn) {
break;
}
index = _encodeOriginalScopes(scopes, index, writer, state);
}
writer.write(comma);
state[0] = encodeInteger(writer, endLine, state[0]);
encodeInteger(writer, endColumn, 0);
return index;
}
function decodeGeneratedRanges(input) {
const { length } = input;
const reader = new StringReader(input);
const ranges = [];
const stack = [];
let genLine = 0;
let definitionSourcesIndex = 0;
let definitionScopeIndex = 0;
let callsiteSourcesIndex = 0;
let callsiteLine = 0;
let callsiteColumn = 0;
let bindingLine = 0;
let bindingColumn = 0;
do {
const semi = reader.indexOf(";");
let genColumn = 0;
for (; reader.pos < semi; reader.pos++) {
genColumn = decodeInteger(reader, genColumn);
if (!hasMoreVlq(reader, semi)) {
const last = stack.pop();
last[2] = genLine;
last[3] = genColumn;
continue;
}
const fields = decodeInteger(reader, 0);
const hasDefinition = fields & 1;
const hasCallsite = fields & 2;
const hasScope = fields & 4;
let callsite = null;
let bindings = EMPTY;
let range;
if (hasDefinition) {
const defSourcesIndex = decodeInteger(reader, definitionSourcesIndex);
definitionScopeIndex = decodeInteger(
reader,
definitionSourcesIndex === defSourcesIndex ? definitionScopeIndex : 0
);
definitionSourcesIndex = defSourcesIndex;
range = [genLine, genColumn, 0, 0, defSourcesIndex, definitionScopeIndex];
} else {
range = [genLine, genColumn, 0, 0];
}
range.isScope = !!hasScope;
if (hasCallsite) {
const prevCsi = callsiteSourcesIndex;
const prevLine = callsiteLine;
callsiteSourcesIndex = decodeInteger(reader, callsiteSourcesIndex);
const sameSource = prevCsi === callsiteSourcesIndex;
callsiteLine = decodeInteger(reader, sameSource ? callsiteLine : 0);
callsiteColumn = decodeInteger(
reader,
sameSource && prevLine === callsiteLine ? callsiteColumn : 0
);
callsite = [callsiteSourcesIndex, callsiteLine, callsiteColumn];
}
range.callsite = callsite;
if (hasMoreVlq(reader, semi)) {
bindings = [];
do {
bindingLine = genLine;
bindingColumn = genColumn;
const expressionsCount = decodeInteger(reader, 0);
let expressionRanges;
if (expressionsCount < -1) {
expressionRanges = [[decodeInteger(reader, 0)]];
for (let i = -1; i > expressionsCount; i--) {
const prevBl = bindingLine;
bindingLine = decodeInteger(reader, bindingLine);
bindingColumn = decodeInteger(reader, bindingLine === prevBl ? bindingColumn : 0);
const expression = decodeInteger(reader, 0);
expressionRanges.push([expression, bindingLine, bindingColumn]);
}
} else {
expressionRanges = [[expressionsCount]];
}
bindings.push(expressionRanges);
} while (hasMoreVlq(reader, semi));
}
range.bindings = bindings;
ranges.push(range);
stack.push(range);
}
genLine++;
reader.pos = semi + 1;
} while (reader.pos < length);
return ranges;
}
function encodeGeneratedRanges(ranges) {
if (ranges.length === 0) return "";
const writer = new StringWriter();
for (let i = 0; i < ranges.length; ) {
i = _encodeGeneratedRanges(ranges, i, writer, [0, 0, 0, 0, 0, 0, 0]);
}
return writer.flush();
}
function _encodeGeneratedRanges(ranges, index, writer, state) {
const range = ranges[index];
const {
0: startLine,
1: startColumn,
2: endLine,
3: endColumn,
isScope,
callsite,
bindings
} = range;
if (state[0] < startLine) {
catchupLine(writer, state[0], startLine);
state[0] = startLine;
state[1] = 0;
} else if (index > 0) {
writer.write(comma);
}
state[1] = encodeInteger(writer, range[1], state[1]);
const fields = (range.length === 6 ? 1 : 0) | (callsite ? 2 : 0) | (isScope ? 4 : 0);
encodeInteger(writer, fields, 0);
if (range.length === 6) {
const { 4: sourcesIndex, 5: scopesIndex } = range;
if (sourcesIndex !== state[2]) {
state[3] = 0;
}
state[2] = encodeInteger(writer, sourcesIndex, state[2]);
state[3] = encodeInteger(writer, scopesIndex, state[3]);
}
if (callsite) {
const { 0: sourcesIndex, 1: callLine, 2: callColumn } = range.callsite;
if (sourcesIndex !== state[4]) {
state[5] = 0;
state[6] = 0;
} else if (callLine !== state[5]) {
state[6] = 0;
}
state[4] = encodeInteger(writer, sourcesIndex, state[4]);
state[5] = encodeInteger(writer, callLine, state[5]);
state[6] = encodeInteger(writer, callColumn, state[6]);
}
if (bindings) {
for (const binding of bindings) {
if (binding.length > 1) encodeInteger(writer, -binding.length, 0);
const expression = binding[0][0];
encodeInteger(writer, expression, 0);
let bindingStartLine = startLine;
let bindingStartColumn = startColumn;
for (let i = 1; i < binding.length; i++) {
const expRange = binding[i];
bindingStartLine = encodeInteger(writer, expRange[1], bindingStartLine);
bindingStartColumn = encodeInteger(writer, expRange[2], bindingStartColumn);
encodeInteger(writer, expRange[0], 0);
}
}
}
for (index++; index < ranges.length; ) {
const next = ranges[index];
const { 0: l, 1: c } = next;
if (l > endLine || l === endLine && c >= endColumn) {
break;
}
index = _encodeGeneratedRanges(ranges, index, writer, state);
}
if (state[0] < endLine) {
catchupLine(writer, state[0], endLine);
state[0] = endLine;
state[1] = 0;
} else {
writer.write(comma);
}
state[1] = encodeInteger(writer, endColumn, state[1]);
return index;
}
function catchupLine(writer, lastLine, line) {
do {
writer.write(semicolon);
} while (++lastLine < line);
}
function decode(mappings) {
const { length } = mappings;
const reader = new StringReader(mappings);
const decoded = [];
let genColumn = 0;
let sourcesIndex = 0;
let sourceLine = 0;
let sourceColumn = 0;
let namesIndex = 0;
do {
const semi = reader.indexOf(";");
const line = [];
let sorted = true;
let lastCol = 0;
genColumn = 0;
while (reader.pos < semi) {
let seg;
genColumn = decodeInteger(reader, genColumn);
if (genColumn < lastCol) sorted = false;
lastCol = genColumn;
if (hasMoreVlq(reader, semi)) {
sourcesIndex = decodeInteger(reader, sourcesIndex);
sourceLine = decodeInteger(reader, sourceLine);
sourceColumn = decodeInteger(reader, sourceColumn);
if (hasMoreVlq(reader, semi)) {
namesIndex = decodeInteger(reader, namesIndex);
seg = [genColumn, sourcesIndex, sourceLine, sourceColumn, namesIndex];
} else {
seg = [genColumn, sourcesIndex, sourceLine, sourceColumn];
}
} else {
seg = [genColumn];
}
line.push(seg);
reader.pos++;
}
if (!sorted) sort(line);
decoded.push(line);
reader.pos = semi + 1;
} while (reader.pos <= length);
return decoded;
}
function sort(line) {
line.sort(sortComparator);
}
function sortComparator(a, b) {
return a[0] - b[0];
}
function encode(decoded) {
const writer = new StringWriter();
let sourcesIndex = 0;
let sourceLine = 0;
let sourceColumn = 0;
let namesIndex = 0;
for (let i = 0; i < decoded.length; i++) {
const line = decoded[i];
if (i > 0) writer.write(semicolon);
if (line.length === 0) continue;
let genColumn = 0;
for (let j = 0; j < line.length; j++) {
const segment = line[j];
if (j > 0) writer.write(comma);
genColumn = encodeInteger(writer, segment[0], genColumn);
if (segment.length === 1) continue;
sourcesIndex = encodeInteger(writer, segment[1], sourcesIndex);
sourceLine = encodeInteger(writer, segment[2], sourceLine);
sourceColumn = encodeInteger(writer, segment[3], sourceColumn);
if (segment.length === 4) continue;
namesIndex = encodeInteger(writer, segment[4], namesIndex);
}
}
return writer.flush();
}
export {
decode,
decodeGeneratedRanges,
decodeOriginalScopes,
encode,
encodeGeneratedRanges,
encodeOriginalScopes
};
//# sourceMappingURL=@jridgewell_sourcemap-codec.js.map

View File

@@ -1,41 +0,0 @@
import {
require_react
} from "./chunk-WIJRE3H4.js";
import {
__toESM
} from "./chunk-KEXKKQVW.js";
// node_modules/@mdx-js/react/lib/index.js
var import_react = __toESM(require_react(), 1);
var emptyComponents = {};
var MDXContext = import_react.default.createContext(emptyComponents);
function useMDXComponents(components) {
const contextComponents = import_react.default.useContext(MDXContext);
return import_react.default.useMemo(
function() {
if (typeof components === "function") {
return components(contextComponents);
}
return { ...contextComponents, ...components };
},
[contextComponents, components]
);
}
function MDXProvider(properties) {
let allComponents;
if (properties.disableParentContext) {
allComponents = typeof properties.components === "function" ? properties.components(emptyComponents) : properties.components || emptyComponents;
} else {
allComponents = useMDXComponents(properties.components);
}
return import_react.default.createElement(
MDXContext.Provider,
{ value: allComponents },
properties.children
);
}
export {
MDXProvider,
useMDXComponents
};
//# sourceMappingURL=@mdx-js_react.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../@mdx-js/react/lib/index.js"],
"sourcesContent": ["/**\n * @import {MDXComponents} from 'mdx/types.js'\n * @import {Component, ReactElement, ReactNode} from 'react'\n */\n\n/**\n * @callback MergeComponents\n * Custom merge function.\n * @param {Readonly<MDXComponents>} currentComponents\n * Current components from the context.\n * @returns {MDXComponents}\n * Additional components.\n *\n * @typedef Props\n * Configuration for `MDXProvider`.\n * @property {ReactNode | null | undefined} [children]\n * Children (optional).\n * @property {Readonly<MDXComponents> | MergeComponents | null | undefined} [components]\n * Additional components to use or a function that creates them (optional).\n * @property {boolean | null | undefined} [disableParentContext=false]\n * Turn off outer component context (default: `false`).\n */\n\nimport React from 'react'\n\n/** @type {Readonly<MDXComponents>} */\nconst emptyComponents = {}\n\nconst MDXContext = React.createContext(emptyComponents)\n\n/**\n * Get current components from the MDX Context.\n *\n * @param {Readonly<MDXComponents> | MergeComponents | null | undefined} [components]\n * Additional components to use or a function that creates them (optional).\n * @returns {MDXComponents}\n * Current components.\n */\nexport function useMDXComponents(components) {\n const contextComponents = React.useContext(MDXContext)\n\n // Memoize to avoid unnecessary top-level context changes\n return React.useMemo(\n function () {\n // Custom merge via a function prop\n if (typeof components === 'function') {\n return components(contextComponents)\n }\n\n return {...contextComponents, ...components}\n },\n [contextComponents, components]\n )\n}\n\n/**\n * Provider for MDX context.\n *\n * @param {Readonly<Props>} properties\n * Properties.\n * @returns {ReactElement}\n * Element.\n * @satisfies {Component}\n */\nexport function MDXProvider(properties) {\n /** @type {Readonly<MDXComponents>} */\n let allComponents\n\n if (properties.disableParentContext) {\n allComponents =\n typeof properties.components === 'function'\n ? properties.components(emptyComponents)\n : properties.components || emptyComponents\n } else {\n allComponents = useMDXComponents(properties.components)\n }\n\n return React.createElement(\n MDXContext.Provider,\n {value: allComponents},\n properties.children\n )\n}\n"],
"mappings": ";;;;;;;;AAuBA,mBAAkB;AAGlB,IAAM,kBAAkB,CAAC;AAEzB,IAAM,aAAa,aAAAA,QAAM,cAAc,eAAe;AAU/C,SAAS,iBAAiB,YAAY;AAC3C,QAAM,oBAAoB,aAAAA,QAAM,WAAW,UAAU;AAGrD,SAAO,aAAAA,QAAM;AAAA,IACX,WAAY;AAEV,UAAI,OAAO,eAAe,YAAY;AACpC,eAAO,WAAW,iBAAiB;AAAA,MACrC;AAEA,aAAO,EAAC,GAAG,mBAAmB,GAAG,WAAU;AAAA,IAC7C;AAAA,IACA,CAAC,mBAAmB,UAAU;AAAA,EAChC;AACF;AAWO,SAAS,YAAY,YAAY;AAEtC,MAAI;AAEJ,MAAI,WAAW,sBAAsB;AACnC,oBACE,OAAO,WAAW,eAAe,aAC7B,WAAW,WAAW,eAAe,IACrC,WAAW,cAAc;AAAA,EACjC,OAAO;AACL,oBAAgB,iBAAiB,WAAW,UAAU;AAAA,EACxD;AAEA,SAAO,aAAAA,QAAM;AAAA,IACX,WAAW;AAAA,IACX,EAAC,OAAO,cAAa;AAAA,IACrB,WAAW;AAAA,EACb;AACF;",
"names": ["React"]
}

View File

@@ -1,48 +0,0 @@
import {
DocsRenderer
} from "./chunk-VG4OXZTU.js";
import "./chunk-57ZXLNKK.js";
import "./chunk-TYV5OM3H.js";
import "./chunk-FNTD6K4X.js";
import "./chunk-JLBFQ2EK.js";
import {
__export
} from "./chunk-RM5O7ZR7.js";
import "./chunk-RTHSENM2.js";
import "./chunk-K46MDWSL.js";
import "./chunk-H4EEZRGF.js";
import "./chunk-FTMWZLOQ.js";
import "./chunk-YO32UEEW.js";
import "./chunk-E4Q3YXXP.js";
import "./chunk-YYB2ULC3.js";
import "./chunk-GF7VUYY4.js";
import "./chunk-ZHATCZIL.js";
import {
require_preview_api
} from "./chunk-NDPLLWBS.js";
import "./chunk-WIJRE3H4.js";
import {
__toESM
} from "./chunk-KEXKKQVW.js";
// node_modules/@storybook/addon-docs/dist/index.mjs
var import_preview_api = __toESM(require_preview_api(), 1);
var preview_exports = {};
__export(preview_exports, { parameters: () => parameters });
var excludeTags = Object.entries(globalThis.TAGS_OPTIONS ?? {}).reduce((acc, entry) => {
let [tag, option] = entry;
return option.excludeFromDocsStories && (acc[tag] = true), acc;
}, {});
var parameters = { docs: { renderer: async () => {
let { DocsRenderer: DocsRenderer2 } = await import("./DocsRenderer-3PZUHFFL-FOAYSAPL.js");
return new DocsRenderer2();
}, stories: { filter: (story) => {
var _a;
return (story.tags || []).filter((tag) => excludeTags[tag]).length === 0 && !((_a = story.parameters.docs) == null ? void 0 : _a.disable);
} } } };
var index_default = () => (0, import_preview_api.definePreview)(preview_exports);
export {
DocsRenderer,
index_default as default
};
//# sourceMappingURL=@storybook_addon-docs.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../@storybook/addon-docs/dist/index.mjs"],
"sourcesContent": ["export { DocsRenderer } from './chunk-GWJYCGSQ.mjs';\nimport { __export } from './chunk-QUZPS4B6.mjs';\nimport { definePreview } from 'storybook/preview-api';\n\nvar preview_exports={};__export(preview_exports,{parameters:()=>parameters});var excludeTags=Object.entries(globalThis.TAGS_OPTIONS??{}).reduce((acc,entry)=>{let[tag,option]=entry;return option.excludeFromDocsStories&&(acc[tag]=!0),acc},{}),parameters={docs:{renderer:async()=>{let{DocsRenderer:DocsRenderer2}=await import('./DocsRenderer-3PZUHFFL.mjs');return new DocsRenderer2},stories:{filter:story=>(story.tags||[]).filter(tag=>excludeTags[tag]).length===0&&!story.parameters.docs?.disable}}};var index_default=()=>definePreview(preview_exports);\n\nexport { index_default as default };\n"],
"mappings": ";;;;;;;;;;;;;;;;;;;;;;;;;;;;AAEA,yBAA8B;AAE9B,IAAI,kBAAgB,CAAC;AAAE,SAAS,iBAAgB,EAAC,YAAW,MAAI,WAAU,CAAC;AAAE,IAAI,cAAY,OAAO,QAAQ,WAAW,gBAAc,CAAC,CAAC,EAAE,OAAO,CAAC,KAAI,UAAQ;AAAC,MAAG,CAAC,KAAI,MAAM,IAAE;AAAM,SAAO,OAAO,2BAAyB,IAAI,GAAG,IAAE,OAAI;AAAG,GAAE,CAAC,CAAC;AAAlK,IAAoK,aAAW,EAAC,MAAK,EAAC,UAAS,YAAS;AAAC,MAAG,EAAC,cAAa,cAAa,IAAE,MAAM,OAAO,qCAA6B;AAAE,SAAO,IAAI;AAAa,GAAE,SAAQ,EAAC,QAAO,WAAK;AAJjZ;AAIoZ,gBAAM,QAAM,CAAC,GAAG,OAAO,SAAK,YAAY,GAAG,CAAC,EAAE,WAAS,KAAG,GAAC,WAAM,WAAW,SAAjB,mBAAuB;AAAA,EAAO,EAAC,EAAC;AAAE,IAAI,gBAAc,UAAI,kCAAc,eAAe;",
"names": []
}

View File

@@ -1,149 +0,0 @@
import {
AddContext,
Anchor,
AnchorMdx,
ArgTypes,
ArgsTable,
BooleanControl,
Canvas,
CodeOrSourceMdx,
ColorControl,
ColorItem,
ColorPalette,
Controls3,
DateControl,
DescriptionContainer,
DescriptionType,
Docs,
DocsContainer,
DocsContext,
DocsPage,
DocsStory,
ExternalDocs,
ExternalDocsContainer,
FilesControl,
HeaderMdx,
HeadersMdx,
Heading2,
IconGallery,
IconItem,
Markdown,
Meta,
NumberControl,
ObjectControl,
OptionsControl,
PRIMARY_STORY,
Primary,
RangeControl,
Source2,
SourceContainer,
SourceContext,
Stories,
Story2,
Subheading,
Subtitle2,
TableOfContents,
TextControl,
Title3,
Typeset,
UNKNOWN_ARGS_HASH,
Unstyled,
Wrapper10,
anchorBlockIdFromId,
argsHash,
assertIsFn,
extractTitle,
format2,
formatDate,
formatTime,
getStoryId2,
getStoryProps,
parse2,
parseDate,
parseTime,
slugs,
useOf,
useSourceProps
} from "./chunk-FNTD6K4X.js";
import "./chunk-JLBFQ2EK.js";
import "./chunk-RM5O7ZR7.js";
import "./chunk-RTHSENM2.js";
import "./chunk-K46MDWSL.js";
import "./chunk-H4EEZRGF.js";
import "./chunk-FTMWZLOQ.js";
import "./chunk-YO32UEEW.js";
import "./chunk-E4Q3YXXP.js";
import "./chunk-YYB2ULC3.js";
import "./chunk-GF7VUYY4.js";
import "./chunk-ZHATCZIL.js";
import "./chunk-NDPLLWBS.js";
import "./chunk-WIJRE3H4.js";
import "./chunk-KEXKKQVW.js";
export {
AddContext,
Anchor,
AnchorMdx,
ArgTypes,
BooleanControl,
Canvas,
CodeOrSourceMdx,
ColorControl,
ColorItem,
ColorPalette,
Controls3 as Controls,
DateControl,
DescriptionContainer as Description,
DescriptionType,
Docs,
DocsContainer,
DocsContext,
DocsPage,
DocsStory,
ExternalDocs,
ExternalDocsContainer,
FilesControl,
HeaderMdx,
HeadersMdx,
Heading2 as Heading,
IconGallery,
IconItem,
Markdown,
Meta,
NumberControl,
ObjectControl,
OptionsControl,
PRIMARY_STORY,
Primary,
ArgsTable as PureArgsTable,
RangeControl,
Source2 as Source,
SourceContainer,
SourceContext,
Stories,
Story2 as Story,
Subheading,
Subtitle2 as Subtitle,
TableOfContents,
TextControl,
Title3 as Title,
Typeset,
UNKNOWN_ARGS_HASH,
Unstyled,
Wrapper10 as Wrapper,
anchorBlockIdFromId,
argsHash,
assertIsFn,
extractTitle,
format2 as format,
formatDate,
formatTime,
getStoryId2 as getStoryId,
getStoryProps,
parse2 as parse,
parseDate,
parseTime,
slugs,
useOf,
useSourceProps
};
//# sourceMappingURL=@storybook_addon-docs_blocks.js.map

View File

@@ -1,18 +0,0 @@
import "./chunk-KEXKKQVW.js";
// node_modules/@storybook/addon-docs/dist/preview.mjs
var excludeTags = Object.entries(globalThis.TAGS_OPTIONS ?? {}).reduce((acc, entry) => {
let [tag, option] = entry;
return option.excludeFromDocsStories && (acc[tag] = true), acc;
}, {});
var parameters = { docs: { renderer: async () => {
let { DocsRenderer } = await import("./DocsRenderer-PQXLIZUC-RVPN436C.js");
return new DocsRenderer();
}, stories: { filter: (story) => {
var _a;
return (story.tags || []).filter((tag) => excludeTags[tag]).length === 0 && !((_a = story.parameters.docs) == null ? void 0 : _a.disable);
} } } };
export {
parameters
};
//# sourceMappingURL=@storybook_addon-docs_preview.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../@storybook/addon-docs/dist/preview.mjs"],
"sourcesContent": ["var excludeTags=Object.entries(globalThis.TAGS_OPTIONS??{}).reduce((acc,entry)=>{let[tag,option]=entry;return option.excludeFromDocsStories&&(acc[tag]=!0),acc},{}),parameters={docs:{renderer:async()=>{let{DocsRenderer}=await import('./DocsRenderer-PQXLIZUC.mjs');return new DocsRenderer},stories:{filter:story=>(story.tags||[]).filter(tag=>excludeTags[tag]).length===0&&!story.parameters.docs?.disable}}};\n\nexport { parameters };\n"],
"mappings": ";;;AAAA,IAAI,cAAY,OAAO,QAAQ,WAAW,gBAAc,CAAC,CAAC,EAAE,OAAO,CAAC,KAAI,UAAQ;AAAC,MAAG,CAAC,KAAI,MAAM,IAAE;AAAM,SAAO,OAAO,2BAAyB,IAAI,GAAG,IAAE,OAAI;AAAG,GAAE,CAAC,CAAC;AAAlK,IAAoK,aAAW,EAAC,MAAK,EAAC,UAAS,YAAS;AAAC,MAAG,EAAC,aAAY,IAAE,MAAM,OAAO,qCAA6B;AAAE,SAAO,IAAI;AAAY,GAAE,SAAQ,EAAC,QAAO,WAAK;AAArT;AAAwT,gBAAM,QAAM,CAAC,GAAG,OAAO,SAAK,YAAY,GAAG,CAAC,EAAE,WAAS,KAAG,GAAC,WAAM,WAAW,SAAjB,mBAAuB;AAAA,EAAO,EAAC,EAAC;",
"names": []
}

View File

@@ -1,19 +0,0 @@
import {
applyDecorators2,
decorators,
parameters
} from "./chunk-AHTFTWU7.js";
import "./chunk-OAOLO3MQ.js";
import "./chunk-YYB2ULC3.js";
import "./chunk-GF7VUYY4.js";
import "./chunk-ZHATCZIL.js";
import "./chunk-NDPLLWBS.js";
import "./chunk-WIJRE3H4.js";
import "./chunk-DF7VAP3D.js";
import "./chunk-KEXKKQVW.js";
export {
applyDecorators2 as applyDecorators,
decorators,
parameters
};
//# sourceMappingURL=@storybook_react_dist_entry-preview-docs__mjs.js.map

View File

@@ -1,9 +0,0 @@
import "./chunk-DF7VAP3D.js";
import "./chunk-KEXKKQVW.js";
// node_modules/@storybook/react/dist/entry-preview-rsc.mjs
var parameters = { react: { rsc: true } };
export {
parameters
};
//# sourceMappingURL=@storybook_react_dist_entry-preview-rsc__mjs.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../@storybook/react/dist/entry-preview-rsc.mjs"],
"sourcesContent": ["import './chunk-XP5HYGXS.mjs';\n\nvar parameters={react:{rsc:!0}};\n\nexport { parameters };\n"],
"mappings": ";;;;AAEA,IAAI,aAAW,EAAC,OAAM,EAAC,KAAI,KAAE,EAAC;",
"names": []
}

View File

@@ -1,26 +0,0 @@
import {
beforeAll,
decorators,
mount,
parameters,
render,
renderToCanvas
} from "./chunk-D63W3CRC.js";
import "./chunk-E4Q3YXXP.js";
import {
applyDecorators
} from "./chunk-OAOLO3MQ.js";
import "./chunk-NDPLLWBS.js";
import "./chunk-WIJRE3H4.js";
import "./chunk-DF7VAP3D.js";
import "./chunk-KEXKKQVW.js";
export {
applyDecorators,
beforeAll,
decorators,
mount,
parameters,
render,
renderToCanvas
};
//# sourceMappingURL=@storybook_react_dist_entry-preview__mjs.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": [],
"sourcesContent": [],
"mappings": "",
"names": []
}

View File

@@ -1,718 +0,0 @@
import {
MarkupIcon,
__commonJS,
__toESM as __toESM2,
debounce2,
getControlId
} from "./chunk-RM5O7ZR7.js";
import {
G3,
N7,
O3,
xr
} from "./chunk-RTHSENM2.js";
import "./chunk-H4EEZRGF.js";
import "./chunk-FTMWZLOQ.js";
import "./chunk-YO32UEEW.js";
import "./chunk-E4Q3YXXP.js";
import "./chunk-GF7VUYY4.js";
import {
require_react
} from "./chunk-WIJRE3H4.js";
import {
__toESM
} from "./chunk-KEXKKQVW.js";
// node_modules/@storybook/addon-docs/dist/Color-AVL7NMMY.mjs
var import_react = __toESM(require_react(), 1);
var require_color_name = __commonJS({ "../../node_modules/color-name/index.js"(exports, module) {
module.exports = { aliceblue: [240, 248, 255], antiquewhite: [250, 235, 215], aqua: [0, 255, 255], aquamarine: [127, 255, 212], azure: [240, 255, 255], beige: [245, 245, 220], bisque: [255, 228, 196], black: [0, 0, 0], blanchedalmond: [255, 235, 205], blue: [0, 0, 255], blueviolet: [138, 43, 226], brown: [165, 42, 42], burlywood: [222, 184, 135], cadetblue: [95, 158, 160], chartreuse: [127, 255, 0], chocolate: [210, 105, 30], coral: [255, 127, 80], cornflowerblue: [100, 149, 237], cornsilk: [255, 248, 220], crimson: [220, 20, 60], cyan: [0, 255, 255], darkblue: [0, 0, 139], darkcyan: [0, 139, 139], darkgoldenrod: [184, 134, 11], darkgray: [169, 169, 169], darkgreen: [0, 100, 0], darkgrey: [169, 169, 169], darkkhaki: [189, 183, 107], darkmagenta: [139, 0, 139], darkolivegreen: [85, 107, 47], darkorange: [255, 140, 0], darkorchid: [153, 50, 204], darkred: [139, 0, 0], darksalmon: [233, 150, 122], darkseagreen: [143, 188, 143], darkslateblue: [72, 61, 139], darkslategray: [47, 79, 79], darkslategrey: [47, 79, 79], darkturquoise: [0, 206, 209], darkviolet: [148, 0, 211], deeppink: [255, 20, 147], deepskyblue: [0, 191, 255], dimgray: [105, 105, 105], dimgrey: [105, 105, 105], dodgerblue: [30, 144, 255], firebrick: [178, 34, 34], floralwhite: [255, 250, 240], forestgreen: [34, 139, 34], fuchsia: [255, 0, 255], gainsboro: [220, 220, 220], ghostwhite: [248, 248, 255], gold: [255, 215, 0], goldenrod: [218, 165, 32], gray: [128, 128, 128], green: [0, 128, 0], greenyellow: [173, 255, 47], grey: [128, 128, 128], honeydew: [240, 255, 240], hotpink: [255, 105, 180], indianred: [205, 92, 92], indigo: [75, 0, 130], ivory: [255, 255, 240], khaki: [240, 230, 140], lavender: [230, 230, 250], lavenderblush: [255, 240, 245], lawngreen: [124, 252, 0], lemonchiffon: [255, 250, 205], lightblue: [173, 216, 230], lightcoral: [240, 128, 128], lightcyan: [224, 255, 255], lightgoldenrodyellow: [250, 250, 210], lightgray: [211, 211, 211], lightgreen: [144, 238, 144], lightgrey: [211, 211, 211], lightpink: [255, 182, 193], lightsalmon: [255, 160, 122], lightseagreen: [32, 178, 170], lightskyblue: [135, 206, 250], lightslategray: [119, 136, 153], lightslategrey: [119, 136, 153], lightsteelblue: [176, 196, 222], lightyellow: [255, 255, 224], lime: [0, 255, 0], limegreen: [50, 205, 50], linen: [250, 240, 230], magenta: [255, 0, 255], maroon: [128, 0, 0], mediumaquamarine: [102, 205, 170], mediumblue: [0, 0, 205], mediumorchid: [186, 85, 211], mediumpurple: [147, 112, 219], mediumseagreen: [60, 179, 113], mediumslateblue: [123, 104, 238], mediumspringgreen: [0, 250, 154], mediumturquoise: [72, 209, 204], mediumvioletred: [199, 21, 133], midnightblue: [25, 25, 112], mintcream: [245, 255, 250], mistyrose: [255, 228, 225], moccasin: [255, 228, 181], navajowhite: [255, 222, 173], navy: [0, 0, 128], oldlace: [253, 245, 230], olive: [128, 128, 0], olivedrab: [107, 142, 35], orange: [255, 165, 0], orangered: [255, 69, 0], orchid: [218, 112, 214], palegoldenrod: [238, 232, 170], palegreen: [152, 251, 152], paleturquoise: [175, 238, 238], palevioletred: [219, 112, 147], papayawhip: [255, 239, 213], peachpuff: [255, 218, 185], peru: [205, 133, 63], pink: [255, 192, 203], plum: [221, 160, 221], powderblue: [176, 224, 230], purple: [128, 0, 128], rebeccapurple: [102, 51, 153], red: [255, 0, 0], rosybrown: [188, 143, 143], royalblue: [65, 105, 225], saddlebrown: [139, 69, 19], salmon: [250, 128, 114], sandybrown: [244, 164, 96], seagreen: [46, 139, 87], seashell: [255, 245, 238], sienna: [160, 82, 45], silver: [192, 192, 192], skyblue: [135, 206, 235], slateblue: [106, 90, 205], slategray: [112, 128, 144], slategrey: [112, 128, 144], snow: [255, 250, 250], springgreen: [0, 255, 127], steelblue: [70, 130, 180], tan: [210, 180, 140], teal: [0, 128, 128], thistle: [216, 191, 216], tomato: [255, 99, 71], turquoise: [64, 224, 208], violet: [238, 130, 238], wheat: [245, 222, 179], white: [255, 255, 255], whitesmoke: [245, 245, 245], yellow: [255, 255, 0], yellowgreen: [154, 205, 50] };
} });
var require_conversions = __commonJS({ "../../node_modules/color-convert/conversions.js"(exports, module) {
var cssKeywords = require_color_name(), reverseKeywords = {};
for (let key of Object.keys(cssKeywords)) reverseKeywords[cssKeywords[key]] = key;
var convert2 = { rgb: { channels: 3, labels: "rgb" }, hsl: { channels: 3, labels: "hsl" }, hsv: { channels: 3, labels: "hsv" }, hwb: { channels: 3, labels: "hwb" }, cmyk: { channels: 4, labels: "cmyk" }, xyz: { channels: 3, labels: "xyz" }, lab: { channels: 3, labels: "lab" }, lch: { channels: 3, labels: "lch" }, hex: { channels: 1, labels: ["hex"] }, keyword: { channels: 1, labels: ["keyword"] }, ansi16: { channels: 1, labels: ["ansi16"] }, ansi256: { channels: 1, labels: ["ansi256"] }, hcg: { channels: 3, labels: ["h", "c", "g"] }, apple: { channels: 3, labels: ["r16", "g16", "b16"] }, gray: { channels: 1, labels: ["gray"] } };
module.exports = convert2;
for (let model of Object.keys(convert2)) {
if (!("channels" in convert2[model])) throw new Error("missing channels property: " + model);
if (!("labels" in convert2[model])) throw new Error("missing channel labels property: " + model);
if (convert2[model].labels.length !== convert2[model].channels) throw new Error("channel and label counts mismatch: " + model);
let { channels, labels } = convert2[model];
delete convert2[model].channels, delete convert2[model].labels, Object.defineProperty(convert2[model], "channels", { value: channels }), Object.defineProperty(convert2[model], "labels", { value: labels });
}
convert2.rgb.hsl = function(rgb) {
let r2 = rgb[0] / 255, g2 = rgb[1] / 255, b2 = rgb[2] / 255, min = Math.min(r2, g2, b2), max = Math.max(r2, g2, b2), delta = max - min, h2, s2;
max === min ? h2 = 0 : r2 === max ? h2 = (g2 - b2) / delta : g2 === max ? h2 = 2 + (b2 - r2) / delta : b2 === max && (h2 = 4 + (r2 - g2) / delta), h2 = Math.min(h2 * 60, 360), h2 < 0 && (h2 += 360);
let l2 = (min + max) / 2;
return max === min ? s2 = 0 : l2 <= 0.5 ? s2 = delta / (max + min) : s2 = delta / (2 - max - min), [h2, s2 * 100, l2 * 100];
};
convert2.rgb.hsv = function(rgb) {
let rdif, gdif, bdif, h2, s2, r2 = rgb[0] / 255, g2 = rgb[1] / 255, b2 = rgb[2] / 255, v2 = Math.max(r2, g2, b2), diff = v2 - Math.min(r2, g2, b2), diffc = function(c2) {
return (v2 - c2) / 6 / diff + 1 / 2;
};
return diff === 0 ? (h2 = 0, s2 = 0) : (s2 = diff / v2, rdif = diffc(r2), gdif = diffc(g2), bdif = diffc(b2), r2 === v2 ? h2 = bdif - gdif : g2 === v2 ? h2 = 1 / 3 + rdif - bdif : b2 === v2 && (h2 = 2 / 3 + gdif - rdif), h2 < 0 ? h2 += 1 : h2 > 1 && (h2 -= 1)), [h2 * 360, s2 * 100, v2 * 100];
};
convert2.rgb.hwb = function(rgb) {
let r2 = rgb[0], g2 = rgb[1], b2 = rgb[2], h2 = convert2.rgb.hsl(rgb)[0], w2 = 1 / 255 * Math.min(r2, Math.min(g2, b2));
return b2 = 1 - 1 / 255 * Math.max(r2, Math.max(g2, b2)), [h2, w2 * 100, b2 * 100];
};
convert2.rgb.cmyk = function(rgb) {
let r2 = rgb[0] / 255, g2 = rgb[1] / 255, b2 = rgb[2] / 255, k2 = Math.min(1 - r2, 1 - g2, 1 - b2), c2 = (1 - r2 - k2) / (1 - k2) || 0, m2 = (1 - g2 - k2) / (1 - k2) || 0, y2 = (1 - b2 - k2) / (1 - k2) || 0;
return [c2 * 100, m2 * 100, y2 * 100, k2 * 100];
};
function comparativeDistance(x2, y2) {
return (x2[0] - y2[0]) ** 2 + (x2[1] - y2[1]) ** 2 + (x2[2] - y2[2]) ** 2;
}
convert2.rgb.keyword = function(rgb) {
let reversed = reverseKeywords[rgb];
if (reversed) return reversed;
let currentClosestDistance = 1 / 0, currentClosestKeyword;
for (let keyword of Object.keys(cssKeywords)) {
let value = cssKeywords[keyword], distance = comparativeDistance(rgb, value);
distance < currentClosestDistance && (currentClosestDistance = distance, currentClosestKeyword = keyword);
}
return currentClosestKeyword;
};
convert2.keyword.rgb = function(keyword) {
return cssKeywords[keyword];
};
convert2.rgb.xyz = function(rgb) {
let r2 = rgb[0] / 255, g2 = rgb[1] / 255, b2 = rgb[2] / 255;
r2 = r2 > 0.04045 ? ((r2 + 0.055) / 1.055) ** 2.4 : r2 / 12.92, g2 = g2 > 0.04045 ? ((g2 + 0.055) / 1.055) ** 2.4 : g2 / 12.92, b2 = b2 > 0.04045 ? ((b2 + 0.055) / 1.055) ** 2.4 : b2 / 12.92;
let x2 = r2 * 0.4124 + g2 * 0.3576 + b2 * 0.1805, y2 = r2 * 0.2126 + g2 * 0.7152 + b2 * 0.0722, z2 = r2 * 0.0193 + g2 * 0.1192 + b2 * 0.9505;
return [x2 * 100, y2 * 100, z2 * 100];
};
convert2.rgb.lab = function(rgb) {
let xyz = convert2.rgb.xyz(rgb), x2 = xyz[0], y2 = xyz[1], z2 = xyz[2];
x2 /= 95.047, y2 /= 100, z2 /= 108.883, x2 = x2 > 8856e-6 ? x2 ** (1 / 3) : 7.787 * x2 + 16 / 116, y2 = y2 > 8856e-6 ? y2 ** (1 / 3) : 7.787 * y2 + 16 / 116, z2 = z2 > 8856e-6 ? z2 ** (1 / 3) : 7.787 * z2 + 16 / 116;
let l2 = 116 * y2 - 16, a2 = 500 * (x2 - y2), b2 = 200 * (y2 - z2);
return [l2, a2, b2];
};
convert2.hsl.rgb = function(hsl) {
let h2 = hsl[0] / 360, s2 = hsl[1] / 100, l2 = hsl[2] / 100, t2, t3, val;
if (s2 === 0) return val = l2 * 255, [val, val, val];
l2 < 0.5 ? t2 = l2 * (1 + s2) : t2 = l2 + s2 - l2 * s2;
let t1 = 2 * l2 - t2, rgb = [0, 0, 0];
for (let i2 = 0; i2 < 3; i2++) t3 = h2 + 1 / 3 * -(i2 - 1), t3 < 0 && t3++, t3 > 1 && t3--, 6 * t3 < 1 ? val = t1 + (t2 - t1) * 6 * t3 : 2 * t3 < 1 ? val = t2 : 3 * t3 < 2 ? val = t1 + (t2 - t1) * (2 / 3 - t3) * 6 : val = t1, rgb[i2] = val * 255;
return rgb;
};
convert2.hsl.hsv = function(hsl) {
let h2 = hsl[0], s2 = hsl[1] / 100, l2 = hsl[2] / 100, smin = s2, lmin = Math.max(l2, 0.01);
l2 *= 2, s2 *= l2 <= 1 ? l2 : 2 - l2, smin *= lmin <= 1 ? lmin : 2 - lmin;
let v2 = (l2 + s2) / 2, sv = l2 === 0 ? 2 * smin / (lmin + smin) : 2 * s2 / (l2 + s2);
return [h2, sv * 100, v2 * 100];
};
convert2.hsv.rgb = function(hsv) {
let h2 = hsv[0] / 60, s2 = hsv[1] / 100, v2 = hsv[2] / 100, hi = Math.floor(h2) % 6, f2 = h2 - Math.floor(h2), p2 = 255 * v2 * (1 - s2), q2 = 255 * v2 * (1 - s2 * f2), t2 = 255 * v2 * (1 - s2 * (1 - f2));
switch (v2 *= 255, hi) {
case 0:
return [v2, t2, p2];
case 1:
return [q2, v2, p2];
case 2:
return [p2, v2, t2];
case 3:
return [p2, q2, v2];
case 4:
return [t2, p2, v2];
case 5:
return [v2, p2, q2];
}
};
convert2.hsv.hsl = function(hsv) {
let h2 = hsv[0], s2 = hsv[1] / 100, v2 = hsv[2] / 100, vmin = Math.max(v2, 0.01), sl, l2;
l2 = (2 - s2) * v2;
let lmin = (2 - s2) * vmin;
return sl = s2 * vmin, sl /= lmin <= 1 ? lmin : 2 - lmin, sl = sl || 0, l2 /= 2, [h2, sl * 100, l2 * 100];
};
convert2.hwb.rgb = function(hwb) {
let h2 = hwb[0] / 360, wh = hwb[1] / 100, bl = hwb[2] / 100, ratio = wh + bl, f2;
ratio > 1 && (wh /= ratio, bl /= ratio);
let i2 = Math.floor(6 * h2), v2 = 1 - bl;
f2 = 6 * h2 - i2, (i2 & 1) !== 0 && (f2 = 1 - f2);
let n2 = wh + f2 * (v2 - wh), r2, g2, b2;
switch (i2) {
default:
case 6:
case 0:
r2 = v2, g2 = n2, b2 = wh;
break;
case 1:
r2 = n2, g2 = v2, b2 = wh;
break;
case 2:
r2 = wh, g2 = v2, b2 = n2;
break;
case 3:
r2 = wh, g2 = n2, b2 = v2;
break;
case 4:
r2 = n2, g2 = wh, b2 = v2;
break;
case 5:
r2 = v2, g2 = wh, b2 = n2;
break;
}
return [r2 * 255, g2 * 255, b2 * 255];
};
convert2.cmyk.rgb = function(cmyk) {
let c2 = cmyk[0] / 100, m2 = cmyk[1] / 100, y2 = cmyk[2] / 100, k2 = cmyk[3] / 100, r2 = 1 - Math.min(1, c2 * (1 - k2) + k2), g2 = 1 - Math.min(1, m2 * (1 - k2) + k2), b2 = 1 - Math.min(1, y2 * (1 - k2) + k2);
return [r2 * 255, g2 * 255, b2 * 255];
};
convert2.xyz.rgb = function(xyz) {
let x2 = xyz[0] / 100, y2 = xyz[1] / 100, z2 = xyz[2] / 100, r2, g2, b2;
return r2 = x2 * 3.2406 + y2 * -1.5372 + z2 * -0.4986, g2 = x2 * -0.9689 + y2 * 1.8758 + z2 * 0.0415, b2 = x2 * 0.0557 + y2 * -0.204 + z2 * 1.057, r2 = r2 > 31308e-7 ? 1.055 * r2 ** (1 / 2.4) - 0.055 : r2 * 12.92, g2 = g2 > 31308e-7 ? 1.055 * g2 ** (1 / 2.4) - 0.055 : g2 * 12.92, b2 = b2 > 31308e-7 ? 1.055 * b2 ** (1 / 2.4) - 0.055 : b2 * 12.92, r2 = Math.min(Math.max(0, r2), 1), g2 = Math.min(Math.max(0, g2), 1), b2 = Math.min(Math.max(0, b2), 1), [r2 * 255, g2 * 255, b2 * 255];
};
convert2.xyz.lab = function(xyz) {
let x2 = xyz[0], y2 = xyz[1], z2 = xyz[2];
x2 /= 95.047, y2 /= 100, z2 /= 108.883, x2 = x2 > 8856e-6 ? x2 ** (1 / 3) : 7.787 * x2 + 16 / 116, y2 = y2 > 8856e-6 ? y2 ** (1 / 3) : 7.787 * y2 + 16 / 116, z2 = z2 > 8856e-6 ? z2 ** (1 / 3) : 7.787 * z2 + 16 / 116;
let l2 = 116 * y2 - 16, a2 = 500 * (x2 - y2), b2 = 200 * (y2 - z2);
return [l2, a2, b2];
};
convert2.lab.xyz = function(lab) {
let l2 = lab[0], a2 = lab[1], b2 = lab[2], x2, y2, z2;
y2 = (l2 + 16) / 116, x2 = a2 / 500 + y2, z2 = y2 - b2 / 200;
let y22 = y2 ** 3, x22 = x2 ** 3, z22 = z2 ** 3;
return y2 = y22 > 8856e-6 ? y22 : (y2 - 16 / 116) / 7.787, x2 = x22 > 8856e-6 ? x22 : (x2 - 16 / 116) / 7.787, z2 = z22 > 8856e-6 ? z22 : (z2 - 16 / 116) / 7.787, x2 *= 95.047, y2 *= 100, z2 *= 108.883, [x2, y2, z2];
};
convert2.lab.lch = function(lab) {
let l2 = lab[0], a2 = lab[1], b2 = lab[2], h2;
h2 = Math.atan2(b2, a2) * 360 / 2 / Math.PI, h2 < 0 && (h2 += 360);
let c2 = Math.sqrt(a2 * a2 + b2 * b2);
return [l2, c2, h2];
};
convert2.lch.lab = function(lch) {
let l2 = lch[0], c2 = lch[1], hr = lch[2] / 360 * 2 * Math.PI, a2 = c2 * Math.cos(hr), b2 = c2 * Math.sin(hr);
return [l2, a2, b2];
};
convert2.rgb.ansi16 = function(args, saturation = null) {
let [r2, g2, b2] = args, value = saturation === null ? convert2.rgb.hsv(args)[2] : saturation;
if (value = Math.round(value / 50), value === 0) return 30;
let ansi = 30 + (Math.round(b2 / 255) << 2 | Math.round(g2 / 255) << 1 | Math.round(r2 / 255));
return value === 2 && (ansi += 60), ansi;
};
convert2.hsv.ansi16 = function(args) {
return convert2.rgb.ansi16(convert2.hsv.rgb(args), args[2]);
};
convert2.rgb.ansi256 = function(args) {
let r2 = args[0], g2 = args[1], b2 = args[2];
return r2 === g2 && g2 === b2 ? r2 < 8 ? 16 : r2 > 248 ? 231 : Math.round((r2 - 8) / 247 * 24) + 232 : 16 + 36 * Math.round(r2 / 255 * 5) + 6 * Math.round(g2 / 255 * 5) + Math.round(b2 / 255 * 5);
};
convert2.ansi16.rgb = function(args) {
let color = args % 10;
if (color === 0 || color === 7) return args > 50 && (color += 3.5), color = color / 10.5 * 255, [color, color, color];
let mult = (~~(args > 50) + 1) * 0.5, r2 = (color & 1) * mult * 255, g2 = (color >> 1 & 1) * mult * 255, b2 = (color >> 2 & 1) * mult * 255;
return [r2, g2, b2];
};
convert2.ansi256.rgb = function(args) {
if (args >= 232) {
let c2 = (args - 232) * 10 + 8;
return [c2, c2, c2];
}
args -= 16;
let rem, r2 = Math.floor(args / 36) / 5 * 255, g2 = Math.floor((rem = args % 36) / 6) / 5 * 255, b2 = rem % 6 / 5 * 255;
return [r2, g2, b2];
};
convert2.rgb.hex = function(args) {
let string = (((Math.round(args[0]) & 255) << 16) + ((Math.round(args[1]) & 255) << 8) + (Math.round(args[2]) & 255)).toString(16).toUpperCase();
return "000000".substring(string.length) + string;
};
convert2.hex.rgb = function(args) {
let match = args.toString(16).match(/[a-f0-9]{6}|[a-f0-9]{3}/i);
if (!match) return [0, 0, 0];
let colorString = match[0];
match[0].length === 3 && (colorString = colorString.split("").map((char) => char + char).join(""));
let integer = parseInt(colorString, 16), r2 = integer >> 16 & 255, g2 = integer >> 8 & 255, b2 = integer & 255;
return [r2, g2, b2];
};
convert2.rgb.hcg = function(rgb) {
let r2 = rgb[0] / 255, g2 = rgb[1] / 255, b2 = rgb[2] / 255, max = Math.max(Math.max(r2, g2), b2), min = Math.min(Math.min(r2, g2), b2), chroma = max - min, grayscale, hue;
return chroma < 1 ? grayscale = min / (1 - chroma) : grayscale = 0, chroma <= 0 ? hue = 0 : max === r2 ? hue = (g2 - b2) / chroma % 6 : max === g2 ? hue = 2 + (b2 - r2) / chroma : hue = 4 + (r2 - g2) / chroma, hue /= 6, hue %= 1, [hue * 360, chroma * 100, grayscale * 100];
};
convert2.hsl.hcg = function(hsl) {
let s2 = hsl[1] / 100, l2 = hsl[2] / 100, c2 = l2 < 0.5 ? 2 * s2 * l2 : 2 * s2 * (1 - l2), f2 = 0;
return c2 < 1 && (f2 = (l2 - 0.5 * c2) / (1 - c2)), [hsl[0], c2 * 100, f2 * 100];
};
convert2.hsv.hcg = function(hsv) {
let s2 = hsv[1] / 100, v2 = hsv[2] / 100, c2 = s2 * v2, f2 = 0;
return c2 < 1 && (f2 = (v2 - c2) / (1 - c2)), [hsv[0], c2 * 100, f2 * 100];
};
convert2.hcg.rgb = function(hcg) {
let h2 = hcg[0] / 360, c2 = hcg[1] / 100, g2 = hcg[2] / 100;
if (c2 === 0) return [g2 * 255, g2 * 255, g2 * 255];
let pure = [0, 0, 0], hi = h2 % 1 * 6, v2 = hi % 1, w2 = 1 - v2, mg = 0;
switch (Math.floor(hi)) {
case 0:
pure[0] = 1, pure[1] = v2, pure[2] = 0;
break;
case 1:
pure[0] = w2, pure[1] = 1, pure[2] = 0;
break;
case 2:
pure[0] = 0, pure[1] = 1, pure[2] = v2;
break;
case 3:
pure[0] = 0, pure[1] = w2, pure[2] = 1;
break;
case 4:
pure[0] = v2, pure[1] = 0, pure[2] = 1;
break;
default:
pure[0] = 1, pure[1] = 0, pure[2] = w2;
}
return mg = (1 - c2) * g2, [(c2 * pure[0] + mg) * 255, (c2 * pure[1] + mg) * 255, (c2 * pure[2] + mg) * 255];
};
convert2.hcg.hsv = function(hcg) {
let c2 = hcg[1] / 100, g2 = hcg[2] / 100, v2 = c2 + g2 * (1 - c2), f2 = 0;
return v2 > 0 && (f2 = c2 / v2), [hcg[0], f2 * 100, v2 * 100];
};
convert2.hcg.hsl = function(hcg) {
let c2 = hcg[1] / 100, l2 = hcg[2] / 100 * (1 - c2) + 0.5 * c2, s2 = 0;
return l2 > 0 && l2 < 0.5 ? s2 = c2 / (2 * l2) : l2 >= 0.5 && l2 < 1 && (s2 = c2 / (2 * (1 - l2))), [hcg[0], s2 * 100, l2 * 100];
};
convert2.hcg.hwb = function(hcg) {
let c2 = hcg[1] / 100, g2 = hcg[2] / 100, v2 = c2 + g2 * (1 - c2);
return [hcg[0], (v2 - c2) * 100, (1 - v2) * 100];
};
convert2.hwb.hcg = function(hwb) {
let w2 = hwb[1] / 100, v2 = 1 - hwb[2] / 100, c2 = v2 - w2, g2 = 0;
return c2 < 1 && (g2 = (v2 - c2) / (1 - c2)), [hwb[0], c2 * 100, g2 * 100];
};
convert2.apple.rgb = function(apple) {
return [apple[0] / 65535 * 255, apple[1] / 65535 * 255, apple[2] / 65535 * 255];
};
convert2.rgb.apple = function(rgb) {
return [rgb[0] / 255 * 65535, rgb[1] / 255 * 65535, rgb[2] / 255 * 65535];
};
convert2.gray.rgb = function(args) {
return [args[0] / 100 * 255, args[0] / 100 * 255, args[0] / 100 * 255];
};
convert2.gray.hsl = function(args) {
return [0, 0, args[0]];
};
convert2.gray.hsv = convert2.gray.hsl;
convert2.gray.hwb = function(gray) {
return [0, 100, gray[0]];
};
convert2.gray.cmyk = function(gray) {
return [0, 0, 0, gray[0]];
};
convert2.gray.lab = function(gray) {
return [gray[0], 0, 0];
};
convert2.gray.hex = function(gray) {
let val = Math.round(gray[0] / 100 * 255) & 255, string = ((val << 16) + (val << 8) + val).toString(16).toUpperCase();
return "000000".substring(string.length) + string;
};
convert2.rgb.gray = function(rgb) {
return [(rgb[0] + rgb[1] + rgb[2]) / 3 / 255 * 100];
};
} });
var require_route = __commonJS({ "../../node_modules/color-convert/route.js"(exports, module) {
var conversions = require_conversions();
function buildGraph() {
let graph = {}, models = Object.keys(conversions);
for (let len = models.length, i2 = 0; i2 < len; i2++) graph[models[i2]] = { distance: -1, parent: null };
return graph;
}
function deriveBFS(fromModel) {
let graph = buildGraph(), queue = [fromModel];
for (graph[fromModel].distance = 0; queue.length; ) {
let current = queue.pop(), adjacents = Object.keys(conversions[current]);
for (let len = adjacents.length, i2 = 0; i2 < len; i2++) {
let adjacent = adjacents[i2], node = graph[adjacent];
node.distance === -1 && (node.distance = graph[current].distance + 1, node.parent = current, queue.unshift(adjacent));
}
}
return graph;
}
function link(from, to) {
return function(args) {
return to(from(args));
};
}
function wrapConversion(toModel, graph) {
let path = [graph[toModel].parent, toModel], fn = conversions[graph[toModel].parent][toModel], cur = graph[toModel].parent;
for (; graph[cur].parent; ) path.unshift(graph[cur].parent), fn = link(conversions[graph[cur].parent][cur], fn), cur = graph[cur].parent;
return fn.conversion = path, fn;
}
module.exports = function(fromModel) {
let graph = deriveBFS(fromModel), conversion = {}, models = Object.keys(graph);
for (let len = models.length, i2 = 0; i2 < len; i2++) {
let toModel = models[i2];
graph[toModel].parent !== null && (conversion[toModel] = wrapConversion(toModel, graph));
}
return conversion;
};
} });
var require_color_convert = __commonJS({ "../../node_modules/color-convert/index.js"(exports, module) {
var conversions = require_conversions(), route = require_route(), convert2 = {}, models = Object.keys(conversions);
function wrapRaw(fn) {
let wrappedFn = function(...args) {
let arg0 = args[0];
return arg0 == null ? arg0 : (arg0.length > 1 && (args = arg0), fn(args));
};
return "conversion" in fn && (wrappedFn.conversion = fn.conversion), wrappedFn;
}
function wrapRounded(fn) {
let wrappedFn = function(...args) {
let arg0 = args[0];
if (arg0 == null) return arg0;
arg0.length > 1 && (args = arg0);
let result = fn(args);
if (typeof result == "object") for (let len = result.length, i2 = 0; i2 < len; i2++) result[i2] = Math.round(result[i2]);
return result;
};
return "conversion" in fn && (wrappedFn.conversion = fn.conversion), wrappedFn;
}
models.forEach((fromModel) => {
convert2[fromModel] = {}, Object.defineProperty(convert2[fromModel], "channels", { value: conversions[fromModel].channels }), Object.defineProperty(convert2[fromModel], "labels", { value: conversions[fromModel].labels });
let routes = route(fromModel);
Object.keys(routes).forEach((toModel) => {
let fn = routes[toModel];
convert2[fromModel][toModel] = wrapRounded(fn), convert2[fromModel][toModel].raw = wrapRaw(fn);
});
});
module.exports = convert2;
} });
var import_color_convert = __toESM2(require_color_convert());
function u() {
return (u = Object.assign || function(e2) {
for (var r2 = 1; r2 < arguments.length; r2++) {
var t2 = arguments[r2];
for (var n2 in t2) Object.prototype.hasOwnProperty.call(t2, n2) && (e2[n2] = t2[n2]);
}
return e2;
}).apply(this, arguments);
}
function c(e2, r2) {
if (e2 == null) return {};
var t2, n2, o2 = {}, a2 = Object.keys(e2);
for (n2 = 0; n2 < a2.length; n2++) r2.indexOf(t2 = a2[n2]) >= 0 || (o2[t2] = e2[t2]);
return o2;
}
function i(e2) {
var t2 = (0, import_react.useRef)(e2), n2 = (0, import_react.useRef)(function(e3) {
t2.current && t2.current(e3);
});
return t2.current = e2, n2.current;
}
var s = function(e2, r2, t2) {
return r2 === void 0 && (r2 = 0), t2 === void 0 && (t2 = 1), e2 > t2 ? t2 : e2 < r2 ? r2 : e2;
};
var f = function(e2) {
return "touches" in e2;
};
var v = function(e2) {
return e2 && e2.ownerDocument.defaultView || self;
};
var d = function(e2, r2, t2) {
var n2 = e2.getBoundingClientRect(), o2 = f(r2) ? function(e3, r3) {
for (var t3 = 0; t3 < e3.length; t3++) if (e3[t3].identifier === r3) return e3[t3];
return e3[0];
}(r2.touches, t2) : r2;
return { left: s((o2.pageX - (n2.left + v(e2).pageXOffset)) / n2.width), top: s((o2.pageY - (n2.top + v(e2).pageYOffset)) / n2.height) };
};
var h = function(e2) {
!f(e2) && e2.preventDefault();
};
var m = import_react.default.memo(function(o2) {
var a2 = o2.onMove, l2 = o2.onKey, s2 = c(o2, ["onMove", "onKey"]), m2 = (0, import_react.useRef)(null), g2 = i(a2), p2 = i(l2), b2 = (0, import_react.useRef)(null), _2 = (0, import_react.useRef)(false), x2 = (0, import_react.useMemo)(function() {
var e2 = function(e3) {
h(e3), (f(e3) ? e3.touches.length > 0 : e3.buttons > 0) && m2.current ? g2(d(m2.current, e3, b2.current)) : t2(false);
}, r2 = function() {
return t2(false);
};
function t2(t3) {
var n2 = _2.current, o3 = v(m2.current), a3 = t3 ? o3.addEventListener : o3.removeEventListener;
a3(n2 ? "touchmove" : "mousemove", e2), a3(n2 ? "touchend" : "mouseup", r2);
}
return [function(e3) {
var r3 = e3.nativeEvent, n2 = m2.current;
if (n2 && (h(r3), !function(e4, r4) {
return r4 && !f(e4);
}(r3, _2.current) && n2)) {
if (f(r3)) {
_2.current = true;
var o3 = r3.changedTouches || [];
o3.length && (b2.current = o3[0].identifier);
}
n2.focus(), g2(d(n2, r3, b2.current)), t2(true);
}
}, function(e3) {
var r3 = e3.which || e3.keyCode;
r3 < 37 || r3 > 40 || (e3.preventDefault(), p2({ left: r3 === 39 ? 0.05 : r3 === 37 ? -0.05 : 0, top: r3 === 40 ? 0.05 : r3 === 38 ? -0.05 : 0 }));
}, t2];
}, [p2, g2]), C2 = x2[0], E2 = x2[1], H2 = x2[2];
return (0, import_react.useEffect)(function() {
return H2;
}, [H2]), import_react.default.createElement("div", u({}, s2, { onTouchStart: C2, onMouseDown: C2, className: "react-colorful__interactive", ref: m2, onKeyDown: E2, tabIndex: 0, role: "slider" }));
});
var g = function(e2) {
return e2.filter(Boolean).join(" ");
};
var p = function(r2) {
var t2 = r2.color, n2 = r2.left, o2 = r2.top, a2 = o2 === void 0 ? 0.5 : o2, l2 = g(["react-colorful__pointer", r2.className]);
return import_react.default.createElement("div", { className: l2, style: { top: 100 * a2 + "%", left: 100 * n2 + "%" } }, import_react.default.createElement("div", { className: "react-colorful__pointer-fill", style: { backgroundColor: t2 } }));
};
var b = function(e2, r2, t2) {
return r2 === void 0 && (r2 = 0), t2 === void 0 && (t2 = Math.pow(10, r2)), Math.round(t2 * e2) / t2;
};
var _ = { grad: 0.9, turn: 360, rad: 360 / (2 * Math.PI) };
var x = function(e2) {
return L(C(e2));
};
var C = function(e2) {
return e2[0] === "#" && (e2 = e2.substring(1)), e2.length < 6 ? { r: parseInt(e2[0] + e2[0], 16), g: parseInt(e2[1] + e2[1], 16), b: parseInt(e2[2] + e2[2], 16), a: e2.length === 4 ? b(parseInt(e2[3] + e2[3], 16) / 255, 2) : 1 } : { r: parseInt(e2.substring(0, 2), 16), g: parseInt(e2.substring(2, 4), 16), b: parseInt(e2.substring(4, 6), 16), a: e2.length === 8 ? b(parseInt(e2.substring(6, 8), 16) / 255, 2) : 1 };
};
var E = function(e2, r2) {
return r2 === void 0 && (r2 = "deg"), Number(e2) * (_[r2] || 1);
};
var H = function(e2) {
var r2 = /hsla?\(?\s*(-?\d*\.?\d+)(deg|rad|grad|turn)?[,\s]+(-?\d*\.?\d+)%?[,\s]+(-?\d*\.?\d+)%?,?\s*[/\s]*(-?\d*\.?\d+)?(%)?\s*\)?/i.exec(e2);
return r2 ? N({ h: E(r2[1], r2[2]), s: Number(r2[3]), l: Number(r2[4]), a: r2[5] === void 0 ? 1 : Number(r2[5]) / (r2[6] ? 100 : 1) }) : { h: 0, s: 0, v: 0, a: 1 };
};
var N = function(e2) {
var r2 = e2.s, t2 = e2.l;
return { h: e2.h, s: (r2 *= (t2 < 50 ? t2 : 100 - t2) / 100) > 0 ? 2 * r2 / (t2 + r2) * 100 : 0, v: t2 + r2, a: e2.a };
};
var w = function(e2) {
return K(I(e2));
};
var y = function(e2) {
var r2 = e2.s, t2 = e2.v, n2 = e2.a, o2 = (200 - r2) * t2 / 100;
return { h: b(e2.h), s: b(o2 > 0 && o2 < 200 ? r2 * t2 / 100 / (o2 <= 100 ? o2 : 200 - o2) * 100 : 0), l: b(o2 / 2), a: b(n2, 2) };
};
var q = function(e2) {
var r2 = y(e2);
return "hsl(" + r2.h + ", " + r2.s + "%, " + r2.l + "%)";
};
var k = function(e2) {
var r2 = y(e2);
return "hsla(" + r2.h + ", " + r2.s + "%, " + r2.l + "%, " + r2.a + ")";
};
var I = function(e2) {
var r2 = e2.h, t2 = e2.s, n2 = e2.v, o2 = e2.a;
r2 = r2 / 360 * 6, t2 /= 100, n2 /= 100;
var a2 = Math.floor(r2), l2 = n2 * (1 - t2), u2 = n2 * (1 - (r2 - a2) * t2), c2 = n2 * (1 - (1 - r2 + a2) * t2), i2 = a2 % 6;
return { r: b(255 * [n2, u2, l2, l2, c2, n2][i2]), g: b(255 * [c2, n2, n2, u2, l2, l2][i2]), b: b(255 * [l2, l2, c2, n2, n2, u2][i2]), a: b(o2, 2) };
};
var z = function(e2) {
var r2 = /rgba?\(?\s*(-?\d*\.?\d+)(%)?[,\s]+(-?\d*\.?\d+)(%)?[,\s]+(-?\d*\.?\d+)(%)?,?\s*[/\s]*(-?\d*\.?\d+)?(%)?\s*\)?/i.exec(e2);
return r2 ? L({ r: Number(r2[1]) / (r2[2] ? 100 / 255 : 1), g: Number(r2[3]) / (r2[4] ? 100 / 255 : 1), b: Number(r2[5]) / (r2[6] ? 100 / 255 : 1), a: r2[7] === void 0 ? 1 : Number(r2[7]) / (r2[8] ? 100 : 1) }) : { h: 0, s: 0, v: 0, a: 1 };
};
var D = function(e2) {
var r2 = e2.toString(16);
return r2.length < 2 ? "0" + r2 : r2;
};
var K = function(e2) {
var r2 = e2.r, t2 = e2.g, n2 = e2.b, o2 = e2.a, a2 = o2 < 1 ? D(b(255 * o2)) : "";
return "#" + D(r2) + D(t2) + D(n2) + a2;
};
var L = function(e2) {
var r2 = e2.r, t2 = e2.g, n2 = e2.b, o2 = e2.a, a2 = Math.max(r2, t2, n2), l2 = a2 - Math.min(r2, t2, n2), u2 = l2 ? a2 === r2 ? (t2 - n2) / l2 : a2 === t2 ? 2 + (n2 - r2) / l2 : 4 + (r2 - t2) / l2 : 0;
return { h: b(60 * (u2 < 0 ? u2 + 6 : u2)), s: b(a2 ? l2 / a2 * 100 : 0), v: b(a2 / 255 * 100), a: o2 };
};
var S = import_react.default.memo(function(r2) {
var t2 = r2.hue, n2 = r2.onChange, o2 = g(["react-colorful__hue", r2.className]);
return import_react.default.createElement("div", { className: o2 }, import_react.default.createElement(m, { onMove: function(e2) {
n2({ h: 360 * e2.left });
}, onKey: function(e2) {
n2({ h: s(t2 + 360 * e2.left, 0, 360) });
}, "aria-label": "Hue", "aria-valuenow": b(t2), "aria-valuemax": "360", "aria-valuemin": "0" }, import_react.default.createElement(p, { className: "react-colorful__hue-pointer", left: t2 / 360, color: q({ h: t2, s: 100, v: 100, a: 1 }) })));
});
var T = import_react.default.memo(function(r2) {
var t2 = r2.hsva, n2 = r2.onChange, o2 = { backgroundColor: q({ h: t2.h, s: 100, v: 100, a: 1 }) };
return import_react.default.createElement("div", { className: "react-colorful__saturation", style: o2 }, import_react.default.createElement(m, { onMove: function(e2) {
n2({ s: 100 * e2.left, v: 100 - 100 * e2.top });
}, onKey: function(e2) {
n2({ s: s(t2.s + 100 * e2.left, 0, 100), v: s(t2.v - 100 * e2.top, 0, 100) });
}, "aria-label": "Color", "aria-valuetext": "Saturation " + b(t2.s) + "%, Brightness " + b(t2.v) + "%" }, import_react.default.createElement(p, { className: "react-colorful__saturation-pointer", top: 1 - t2.v / 100, left: t2.s / 100, color: q(t2) })));
});
var F = function(e2, r2) {
if (e2 === r2) return true;
for (var t2 in e2) if (e2[t2] !== r2[t2]) return false;
return true;
};
var P = function(e2, r2) {
return e2.replace(/\s/g, "") === r2.replace(/\s/g, "");
};
var X = function(e2, r2) {
return e2.toLowerCase() === r2.toLowerCase() || F(C(e2), C(r2));
};
function Y(e2, t2, l2) {
var u2 = i(l2), c2 = (0, import_react.useState)(function() {
return e2.toHsva(t2);
}), s2 = c2[0], f2 = c2[1], v2 = (0, import_react.useRef)({ color: t2, hsva: s2 });
(0, import_react.useEffect)(function() {
if (!e2.equal(t2, v2.current.color)) {
var r2 = e2.toHsva(t2);
v2.current = { hsva: r2, color: t2 }, f2(r2);
}
}, [t2, e2]), (0, import_react.useEffect)(function() {
var r2;
F(s2, v2.current.hsva) || e2.equal(r2 = e2.fromHsva(s2), v2.current.color) || (v2.current = { hsva: s2, color: r2 }, u2(r2));
}, [s2, e2, u2]);
var d2 = (0, import_react.useCallback)(function(e3) {
f2(function(r2) {
return Object.assign({}, r2, e3);
});
}, []);
return [s2, d2];
}
var V = typeof window < "u" ? import_react.useLayoutEffect : import_react.useEffect;
var $ = function() {
return typeof __webpack_nonce__ < "u" ? __webpack_nonce__ : void 0;
};
var J = /* @__PURE__ */ new Map();
var Q = function(e2) {
V(function() {
var r2 = e2.current ? e2.current.ownerDocument : document;
if (r2 !== void 0 && !J.has(r2)) {
var t2 = r2.createElement("style");
t2.innerHTML = `.react-colorful{position:relative;display:flex;flex-direction:column;width:200px;height:200px;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none;cursor:default}.react-colorful__saturation{position:relative;flex-grow:1;border-color:transparent;border-bottom:12px solid #000;border-radius:8px 8px 0 0;background-image:linear-gradient(0deg,#000,transparent),linear-gradient(90deg,#fff,hsla(0,0%,100%,0))}.react-colorful__alpha-gradient,.react-colorful__pointer-fill{content:"";position:absolute;left:0;top:0;right:0;bottom:0;pointer-events:none;border-radius:inherit}.react-colorful__alpha-gradient,.react-colorful__saturation{box-shadow:inset 0 0 0 1px rgba(0,0,0,.05)}.react-colorful__alpha,.react-colorful__hue{position:relative;height:24px}.react-colorful__hue{background:linear-gradient(90deg,red 0,#ff0 17%,#0f0 33%,#0ff 50%,#00f 67%,#f0f 83%,red)}.react-colorful__last-control{border-radius:0 0 8px 8px}.react-colorful__interactive{position:absolute;left:0;top:0;right:0;bottom:0;border-radius:inherit;outline:none;touch-action:none}.react-colorful__pointer{position:absolute;z-index:1;box-sizing:border-box;width:28px;height:28px;transform:translate(-50%,-50%);background-color:#fff;border:2px solid #fff;border-radius:50%;box-shadow:0 2px 4px rgba(0,0,0,.2)}.react-colorful__interactive:focus .react-colorful__pointer{transform:translate(-50%,-50%) scale(1.1)}.react-colorful__alpha,.react-colorful__alpha-pointer{background-color:#fff;background-image:url('data:image/svg+xml;charset=utf-8,<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill-opacity=".05"><path d="M8 0h8v8H8zM0 8h8v8H0z"/></svg>')}.react-colorful__saturation-pointer{z-index:3}.react-colorful__hue-pointer{z-index:2}`, J.set(r2, t2);
var n2 = $();
n2 && t2.setAttribute("nonce", n2), r2.head.appendChild(t2);
}
}, []);
};
var U = function(t2) {
var n2 = t2.className, o2 = t2.colorModel, a2 = t2.color, l2 = a2 === void 0 ? o2.defaultColor : a2, i2 = t2.onChange, s2 = c(t2, ["className", "colorModel", "color", "onChange"]), f2 = (0, import_react.useRef)(null);
Q(f2);
var v2 = Y(o2, l2, i2), d2 = v2[0], h2 = v2[1], m2 = g(["react-colorful", n2]);
return import_react.default.createElement("div", u({}, s2, { ref: f2, className: m2 }), import_react.default.createElement(T, { hsva: d2, onChange: h2 }), import_react.default.createElement(S, { hue: d2.h, onChange: h2, className: "react-colorful__last-control" }));
};
var W = { defaultColor: "000", toHsva: x, fromHsva: function(e2) {
return w({ h: e2.h, s: e2.s, v: e2.v, a: 1 });
}, equal: X };
var Z = function(r2) {
return import_react.default.createElement(U, u({}, r2, { colorModel: W }));
};
var ee = function(r2) {
var t2 = r2.className, n2 = r2.hsva, o2 = r2.onChange, a2 = { backgroundImage: "linear-gradient(90deg, " + k(Object.assign({}, n2, { a: 0 })) + ", " + k(Object.assign({}, n2, { a: 1 })) + ")" }, l2 = g(["react-colorful__alpha", t2]), u2 = b(100 * n2.a);
return import_react.default.createElement("div", { className: l2 }, import_react.default.createElement("div", { className: "react-colorful__alpha-gradient", style: a2 }), import_react.default.createElement(m, { onMove: function(e2) {
o2({ a: e2.left });
}, onKey: function(e2) {
o2({ a: s(n2.a + e2.left) });
}, "aria-label": "Alpha", "aria-valuetext": u2 + "%", "aria-valuenow": u2, "aria-valuemin": "0", "aria-valuemax": "100" }, import_react.default.createElement(p, { className: "react-colorful__alpha-pointer", left: n2.a, color: k(n2) })));
};
var re = function(t2) {
var n2 = t2.className, o2 = t2.colorModel, a2 = t2.color, l2 = a2 === void 0 ? o2.defaultColor : a2, i2 = t2.onChange, s2 = c(t2, ["className", "colorModel", "color", "onChange"]), f2 = (0, import_react.useRef)(null);
Q(f2);
var v2 = Y(o2, l2, i2), d2 = v2[0], h2 = v2[1], m2 = g(["react-colorful", n2]);
return import_react.default.createElement("div", u({}, s2, { ref: f2, className: m2 }), import_react.default.createElement(T, { hsva: d2, onChange: h2 }), import_react.default.createElement(S, { hue: d2.h, onChange: h2 }), import_react.default.createElement(ee, { hsva: d2, onChange: h2, className: "react-colorful__last-control" }));
};
var le = { defaultColor: "hsla(0, 0%, 0%, 1)", toHsva: H, fromHsva: k, equal: P };
var ue = function(r2) {
return import_react.default.createElement(re, u({}, r2, { colorModel: le }));
};
var Ee = { defaultColor: "rgba(0, 0, 0, 1)", toHsva: z, fromHsva: function(e2) {
var r2 = I(e2);
return "rgba(" + r2.r + ", " + r2.g + ", " + r2.b + ", " + r2.a + ")";
}, equal: P };
var He = function(r2) {
return import_react.default.createElement(re, u({}, r2, { colorModel: Ee }));
};
var Wrapper = xr.div({ position: "relative", maxWidth: 250, '&[aria-readonly="true"]': { opacity: 0.5 } });
var PickerTooltip = xr(O3)({ position: "absolute", zIndex: 1, top: 4, left: 4, "[aria-readonly=true] &": { cursor: "not-allowed" } });
var TooltipContent = xr.div({ width: 200, margin: 5, ".react-colorful__saturation": { borderRadius: "4px 4px 0 0" }, ".react-colorful__hue": { boxShadow: "inset 0 0 0 1px rgb(0 0 0 / 5%)" }, ".react-colorful__last-control": { borderRadius: "0 0 4px 4px" } });
var Note = xr(G3)(({ theme }) => ({ fontFamily: theme.typography.fonts.base }));
var Swatches = xr.div({ display: "grid", gridTemplateColumns: "repeat(9, 16px)", gap: 6, padding: 3, marginTop: 5, width: 200 });
var SwatchColor = xr.div(({ theme, active }) => ({ width: 16, height: 16, boxShadow: active ? `${theme.appBorderColor} 0 0 0 1px inset, ${theme.textMutedColor}50 0 0 0 4px` : `${theme.appBorderColor} 0 0 0 1px inset`, borderRadius: theme.appBorderRadius }));
var swatchBackground = `url('data:image/svg+xml;charset=utf-8,<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill-opacity=".05"><path d="M8 0h8v8H8zM0 8h8v8H0z"/></svg>')`;
var Swatch = ({ value, style, ...props }) => {
let backgroundImage = `linear-gradient(${value}, ${value}), ${swatchBackground}, linear-gradient(#fff, #fff)`;
return import_react.default.createElement(SwatchColor, { ...props, style: { ...style, backgroundImage } });
};
var Input = xr(N7.Input)(({ theme, readOnly }) => ({ width: "100%", paddingLeft: 30, paddingRight: 30, boxSizing: "border-box", fontFamily: theme.typography.fonts.base }));
var ToggleIcon = xr(MarkupIcon)(({ theme }) => ({ position: "absolute", zIndex: 1, top: 6, right: 7, width: 20, height: 20, padding: 4, boxSizing: "border-box", cursor: "pointer", color: theme.input.color }));
var ColorSpace = ((ColorSpace2) => (ColorSpace2.RGB = "rgb", ColorSpace2.HSL = "hsl", ColorSpace2.HEX = "hex", ColorSpace2))(ColorSpace || {});
var COLOR_SPACES = Object.values(ColorSpace);
var COLOR_REGEXP = /\(([0-9]+),\s*([0-9]+)%?,\s*([0-9]+)%?,?\s*([0-9.]+)?\)/;
var RGB_REGEXP = /^\s*rgba?\(([0-9]+),\s*([0-9]+),\s*([0-9]+),?\s*([0-9.]+)?\)\s*$/i;
var HSL_REGEXP = /^\s*hsla?\(([0-9]+),\s*([0-9]+)%,\s*([0-9]+)%,?\s*([0-9.]+)?\)\s*$/i;
var HEX_REGEXP = /^\s*#?([0-9a-f]{3}|[0-9a-f]{6})\s*$/i;
var SHORTHEX_REGEXP = /^\s*#?([0-9a-f]{3})\s*$/i;
var ColorPicker = { hex: Z, rgb: He, hsl: ue };
var fallbackColor = { hex: "transparent", rgb: "rgba(0, 0, 0, 0)", hsl: "hsla(0, 0%, 0%, 0)" };
var stringToArgs = (value) => {
let match = value == null ? void 0 : value.match(COLOR_REGEXP);
if (!match) return [0, 0, 0, 1];
let [, x2, y2, z2, a2 = 1] = match;
return [x2, y2, z2, a2].map(Number);
};
var parseRgb = (value) => {
let [r2, g2, b2, a2] = stringToArgs(value), [h2, s2, l2] = import_color_convert.default.rgb.hsl([r2, g2, b2]) || [0, 0, 0];
return { valid: true, value, keyword: import_color_convert.default.rgb.keyword([r2, g2, b2]), colorSpace: "rgb", rgb: value, hsl: `hsla(${h2}, ${s2}%, ${l2}%, ${a2})`, hex: `#${import_color_convert.default.rgb.hex([r2, g2, b2]).toLowerCase()}` };
};
var parseHsl = (value) => {
let [h2, s2, l2, a2] = stringToArgs(value), [r2, g2, b2] = import_color_convert.default.hsl.rgb([h2, s2, l2]) || [0, 0, 0];
return { valid: true, value, keyword: import_color_convert.default.hsl.keyword([h2, s2, l2]), colorSpace: "hsl", rgb: `rgba(${r2}, ${g2}, ${b2}, ${a2})`, hsl: value, hex: `#${import_color_convert.default.hsl.hex([h2, s2, l2]).toLowerCase()}` };
};
var parseHexOrKeyword = (value) => {
let plain = value.replace("#", ""), rgb = import_color_convert.default.keyword.rgb(plain) || import_color_convert.default.hex.rgb(plain), hsl = import_color_convert.default.rgb.hsl(rgb), mapped = value;
/[^#a-f0-9]/i.test(value) ? mapped = plain : HEX_REGEXP.test(value) && (mapped = `#${plain}`);
let valid = true;
if (mapped.startsWith("#")) valid = HEX_REGEXP.test(mapped);
else try {
import_color_convert.default.keyword.hex(mapped);
} catch {
valid = false;
}
return { valid, value: mapped, keyword: import_color_convert.default.rgb.keyword(rgb), colorSpace: "hex", rgb: `rgba(${rgb[0]}, ${rgb[1]}, ${rgb[2]}, 1)`, hsl: `hsla(${hsl[0]}, ${hsl[1]}%, ${hsl[2]}%, 1)`, hex: mapped };
};
var parseValue = (value) => {
if (value) return RGB_REGEXP.test(value) ? parseRgb(value) : HSL_REGEXP.test(value) ? parseHsl(value) : parseHexOrKeyword(value);
};
var getRealValue = (value, color, colorSpace) => {
if (!value || !(color == null ? void 0 : color.valid)) return fallbackColor[colorSpace];
if (colorSpace !== "hex") return (color == null ? void 0 : color[colorSpace]) || fallbackColor[colorSpace];
if (!color.hex.startsWith("#")) try {
return `#${import_color_convert.default.keyword.hex(color.hex)}`;
} catch {
return fallbackColor.hex;
}
let short = color.hex.match(SHORTHEX_REGEXP);
if (!short) return HEX_REGEXP.test(color.hex) ? color.hex : fallbackColor.hex;
let [r2, g2, b2] = short[1].split("");
return `#${r2}${r2}${g2}${g2}${b2}${b2}`;
};
var useColorInput = (initialValue, onChange) => {
let [value, setValue] = (0, import_react.useState)(initialValue || ""), [color, setColor] = (0, import_react.useState)(() => parseValue(value)), [colorSpace, setColorSpace] = (0, import_react.useState)((color == null ? void 0 : color.colorSpace) || "hex");
(0, import_react.useEffect)(() => {
let nextValue = initialValue || "", nextColor = parseValue(nextValue);
setValue(nextValue), setColor(nextColor), setColorSpace((nextColor == null ? void 0 : nextColor.colorSpace) || "hex");
}, [initialValue]);
let realValue = (0, import_react.useMemo)(() => getRealValue(value, color, colorSpace).toLowerCase(), [value, color, colorSpace]), updateValue = (0, import_react.useCallback)((update) => {
let parsed = parseValue(update), v2 = (parsed == null ? void 0 : parsed.value) || update || "";
setValue(v2), v2 === "" && (setColor(void 0), onChange(void 0)), parsed && (setColor(parsed), setColorSpace(parsed.colorSpace), onChange(parsed.value));
}, [onChange]), cycleColorSpace = (0, import_react.useCallback)(() => {
let nextIndex = (COLOR_SPACES.indexOf(colorSpace) + 1) % COLOR_SPACES.length, nextSpace = COLOR_SPACES[nextIndex];
setColorSpace(nextSpace);
let updatedValue = (color == null ? void 0 : color[nextSpace]) || "";
setValue(updatedValue), onChange(updatedValue);
}, [color, colorSpace, onChange]);
return { value, realValue, updateValue, color, colorSpace, cycleColorSpace };
};
var id = (value) => value.replace(/\s*/, "").toLowerCase();
var usePresets = (presetColors, currentColor, colorSpace) => {
let [selectedColors, setSelectedColors] = (0, import_react.useState)((currentColor == null ? void 0 : currentColor.valid) ? [currentColor] : []);
(0, import_react.useEffect)(() => {
currentColor === void 0 && setSelectedColors([]);
}, [currentColor]);
let presets = (0, import_react.useMemo)(() => (presetColors || []).map((preset) => typeof preset == "string" ? parseValue(preset) : preset.title ? { ...parseValue(preset.color), keyword: preset.title } : parseValue(preset.color)).concat(selectedColors).filter(Boolean).slice(-27), [presetColors, selectedColors]), addPreset = (0, import_react.useCallback)((color) => {
(color == null ? void 0 : color.valid) && (presets.some((preset) => preset && preset[colorSpace] && id(preset[colorSpace] || "") === id(color[colorSpace] || "")) || setSelectedColors((arr) => arr.concat(color)));
}, [colorSpace, presets]);
return { presets, addPreset };
};
var ColorControl = ({ name, value: initialValue, onChange, onFocus, onBlur, presetColors, startOpen = false, argType }) => {
var _a;
let debouncedOnChange = (0, import_react.useCallback)(debounce2(onChange, 200), [onChange]), { value, realValue, updateValue, color, colorSpace, cycleColorSpace } = useColorInput(initialValue, debouncedOnChange), { presets, addPreset } = usePresets(presetColors ?? [], color, colorSpace), Picker = ColorPicker[colorSpace], readonly = !!((_a = argType == null ? void 0 : argType.table) == null ? void 0 : _a.readonly);
return import_react.default.createElement(Wrapper, { "aria-readonly": readonly }, import_react.default.createElement(PickerTooltip, { startOpen, trigger: readonly ? null : void 0, closeOnOutsideClick: true, onVisibleChange: () => color && addPreset(color), tooltip: import_react.default.createElement(TooltipContent, null, import_react.default.createElement(Picker, { color: realValue === "transparent" ? "#000000" : realValue, onChange: updateValue, onFocus, onBlur }), presets.length > 0 && import_react.default.createElement(Swatches, null, presets.map((preset, index) => import_react.default.createElement(O3, { key: `${(preset == null ? void 0 : preset.value) || index}-${index}`, hasChrome: false, tooltip: import_react.default.createElement(Note, { note: (preset == null ? void 0 : preset.keyword) || (preset == null ? void 0 : preset.value) || "" }) }, import_react.default.createElement(Swatch, { value: (preset == null ? void 0 : preset[colorSpace]) || "", active: !!(color && preset && preset[colorSpace] && id(preset[colorSpace] || "") === id(color[colorSpace])), onClick: () => preset && updateValue(preset.value || "") }))))) }, import_react.default.createElement(Swatch, { value: realValue, style: { margin: 4 } })), import_react.default.createElement(Input, { id: getControlId(name), value, onChange: (e2) => updateValue(e2.target.value), onFocus: (e2) => e2.target.select(), readOnly: readonly, placeholder: "Choose color..." }), value ? import_react.default.createElement(ToggleIcon, { onClick: cycleColorSpace }) : null);
};
var Color_default = ColorControl;
export {
ColorControl,
Color_default as default
};
//# sourceMappingURL=Color-AVL7NMMY-4DCQC45D.js.map

View File

@@ -1,26 +0,0 @@
import {
DocsRenderer,
defaultComponents
} from "./chunk-VG4OXZTU.js";
import "./chunk-57ZXLNKK.js";
import "./chunk-TYV5OM3H.js";
import "./chunk-FNTD6K4X.js";
import "./chunk-JLBFQ2EK.js";
import "./chunk-RM5O7ZR7.js";
import "./chunk-RTHSENM2.js";
import "./chunk-K46MDWSL.js";
import "./chunk-H4EEZRGF.js";
import "./chunk-FTMWZLOQ.js";
import "./chunk-YO32UEEW.js";
import "./chunk-E4Q3YXXP.js";
import "./chunk-YYB2ULC3.js";
import "./chunk-GF7VUYY4.js";
import "./chunk-ZHATCZIL.js";
import "./chunk-NDPLLWBS.js";
import "./chunk-WIJRE3H4.js";
import "./chunk-KEXKKQVW.js";
export {
DocsRenderer,
defaultComponents
};
//# sourceMappingURL=DocsRenderer-3PZUHFFL-FOAYSAPL.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": [],
"sourcesContent": [],
"mappings": "",
"names": []
}

View File

@@ -1,67 +0,0 @@
import {
renderElement,
unmountElement
} from "./chunk-57ZXLNKK.js";
import "./chunk-TYV5OM3H.js";
import {
AnchorMdx,
CodeOrSourceMdx,
Docs,
HeadersMdx
} from "./chunk-FNTD6K4X.js";
import "./chunk-JLBFQ2EK.js";
import "./chunk-RM5O7ZR7.js";
import "./chunk-RTHSENM2.js";
import "./chunk-K46MDWSL.js";
import "./chunk-H4EEZRGF.js";
import "./chunk-FTMWZLOQ.js";
import "./chunk-YO32UEEW.js";
import "./chunk-E4Q3YXXP.js";
import "./chunk-YYB2ULC3.js";
import "./chunk-GF7VUYY4.js";
import "./chunk-ZHATCZIL.js";
import "./chunk-NDPLLWBS.js";
import {
require_react
} from "./chunk-WIJRE3H4.js";
import {
__toESM
} from "./chunk-KEXKKQVW.js";
// node_modules/@storybook/addon-docs/dist/DocsRenderer-PQXLIZUC.mjs
var import_react = __toESM(require_react(), 1);
var defaultComponents = { code: CodeOrSourceMdx, a: AnchorMdx, ...HeadersMdx };
var ErrorBoundary = class extends import_react.Component {
constructor() {
super(...arguments);
this.state = { hasError: false };
}
static getDerivedStateFromError() {
return { hasError: true };
}
componentDidCatch(err) {
let { showException } = this.props;
showException(err);
}
render() {
let { hasError } = this.state, { children } = this.props;
return hasError ? null : import_react.default.createElement(import_react.default.Fragment, null, children);
}
};
var DocsRenderer = class {
constructor() {
this.render = async (context, docsParameter, element) => {
let components = { ...defaultComponents, ...docsParameter == null ? void 0 : docsParameter.components }, TDocs = Docs;
return new Promise((resolve, reject) => {
import("./@mdx-js_react.js").then(({ MDXProvider }) => renderElement(import_react.default.createElement(ErrorBoundary, { showException: reject, key: Math.random() }, import_react.default.createElement(MDXProvider, { components }, import_react.default.createElement(TDocs, { context, docsParameter }))), element)).then(() => resolve());
});
}, this.unmount = (element) => {
unmountElement(element);
};
}
};
export {
DocsRenderer,
defaultComponents
};
//# sourceMappingURL=DocsRenderer-PQXLIZUC-RVPN436C.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../@storybook/addon-docs/dist/DocsRenderer-PQXLIZUC.mjs"],
"sourcesContent": ["import React, { Component } from 'react';\nimport { renderElement, unmountElement } from '@storybook/react-dom-shim';\nimport { CodeOrSourceMdx, AnchorMdx, HeadersMdx, Docs } from '@storybook/addon-docs/blocks';\n\nvar defaultComponents={code:CodeOrSourceMdx,a:AnchorMdx,...HeadersMdx},ErrorBoundary=class extends Component{constructor(){super(...arguments);this.state={hasError:!1};}static getDerivedStateFromError(){return {hasError:!0}}componentDidCatch(err){let{showException}=this.props;showException(err);}render(){let{hasError}=this.state,{children}=this.props;return hasError?null:React.createElement(React.Fragment,null,children)}},DocsRenderer=class{constructor(){this.render=async(context,docsParameter,element)=>{let components={...defaultComponents,...docsParameter?.components},TDocs=Docs;return new Promise((resolve,reject)=>{import('@mdx-js/react').then(({MDXProvider})=>renderElement(React.createElement(ErrorBoundary,{showException:reject,key:Math.random()},React.createElement(MDXProvider,{components},React.createElement(TDocs,{context,docsParameter}))),element)).then(()=>resolve());})},this.unmount=element=>{unmountElement(element);};}};\n\nexport { DocsRenderer, defaultComponents };\n"],
"mappings": ";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,mBAAiC;AAIjC,IAAI,oBAAkB,EAAC,MAAK,iBAAgB,GAAE,WAAU,GAAG,WAAU;AAArE,IAAuE,gBAAc,cAAc,uBAAS;AAAA,EAAC,cAAa;AAAC,UAAM,GAAG,SAAS;AAAE,SAAK,QAAM,EAAC,UAAS,MAAE;AAAA,EAAE;AAAA,EAAC,OAAO,2BAA0B;AAAC,WAAO,EAAC,UAAS,KAAE;AAAA,EAAC;AAAA,EAAC,kBAAkB,KAAI;AAAC,QAAG,EAAC,cAAa,IAAE,KAAK;AAAM,kBAAc,GAAG;AAAA,EAAE;AAAA,EAAC,SAAQ;AAAC,QAAG,EAAC,SAAQ,IAAE,KAAK,OAAM,EAAC,SAAQ,IAAE,KAAK;AAAM,WAAO,WAAS,OAAK,aAAAA,QAAM,cAAc,aAAAA,QAAM,UAAS,MAAK,QAAQ;AAAA,EAAC;AAAC;AAAxa,IAA0a,eAAa,MAAK;AAAA,EAAC,cAAa;AAAC,SAAK,SAAO,OAAM,SAAQ,eAAc,YAAU;AAAC,UAAI,aAAW,EAAC,GAAG,mBAAkB,GAAG,+CAAe,WAAU,GAAE,QAAM;AAAK,aAAO,IAAI,QAAQ,CAAC,SAAQ,WAAS;AAAC,eAAO,oBAAe,EAAE,KAAK,CAAC,EAAC,YAAW,MAAI,cAAc,aAAAA,QAAM,cAAc,eAAc,EAAC,eAAc,QAAO,KAAI,KAAK,OAAO,EAAC,GAAE,aAAAA,QAAM,cAAc,aAAY,EAAC,WAAU,GAAE,aAAAA,QAAM,cAAc,OAAM,EAAC,SAAQ,cAAa,CAAC,CAAC,CAAC,GAAE,OAAO,CAAC,EAAE,KAAK,MAAI,QAAQ,CAAC;AAAA,MAAE,CAAC;AAAA,IAAC,GAAE,KAAK,UAAQ,aAAS;AAAC,qBAAe,OAAO;AAAA,IAAE;AAAA,EAAE;AAAC;",
"names": ["React"]
}

View File

@@ -1,39 +0,0 @@
import {
require_debounce
} from "./chunk-RHWKDMUE.js";
import {
require_isObject
} from "./chunk-LJMOOM7L.js";
import {
__commonJS
} from "./chunk-KEXKKQVW.js";
// node_modules/lodash/throttle.js
var require_throttle = __commonJS({
"node_modules/lodash/throttle.js"(exports, module) {
var debounce = require_debounce();
var isObject = require_isObject();
var FUNC_ERROR_TEXT = "Expected a function";
function throttle(func, wait, options) {
var leading = true, trailing = true;
if (typeof func != "function") {
throw new TypeError(FUNC_ERROR_TEXT);
}
if (isObject(options)) {
leading = "leading" in options ? !!options.leading : leading;
trailing = "trailing" in options ? !!options.trailing : trailing;
}
return debounce(func, wait, {
"leading": leading,
"maxWait": wait,
"trailing": trailing
});
}
module.exports = throttle;
}
});
export {
require_throttle
};
//# sourceMappingURL=chunk-2A7TYURX.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../lodash/throttle.js"],
"sourcesContent": ["var debounce = require('./debounce'),\n isObject = require('./isObject');\n\n/** Error message constants. */\nvar FUNC_ERROR_TEXT = 'Expected a function';\n\n/**\n * Creates a throttled function that only invokes `func` at most once per\n * every `wait` milliseconds. The throttled function comes with a `cancel`\n * method to cancel delayed `func` invocations and a `flush` method to\n * immediately invoke them. Provide `options` to indicate whether `func`\n * should be invoked on the leading and/or trailing edge of the `wait`\n * timeout. The `func` is invoked with the last arguments provided to the\n * throttled function. Subsequent calls to the throttled function return the\n * result of the last `func` invocation.\n *\n * **Note:** If `leading` and `trailing` options are `true`, `func` is\n * invoked on the trailing edge of the timeout only if the throttled function\n * is invoked more than once during the `wait` timeout.\n *\n * If `wait` is `0` and `leading` is `false`, `func` invocation is deferred\n * until to the next tick, similar to `setTimeout` with a timeout of `0`.\n *\n * See [David Corbacho's article](https://css-tricks.com/debouncing-throttling-explained-examples/)\n * for details over the differences between `_.throttle` and `_.debounce`.\n *\n * @static\n * @memberOf _\n * @since 0.1.0\n * @category Function\n * @param {Function} func The function to throttle.\n * @param {number} [wait=0] The number of milliseconds to throttle invocations to.\n * @param {Object} [options={}] The options object.\n * @param {boolean} [options.leading=true]\n * Specify invoking on the leading edge of the timeout.\n * @param {boolean} [options.trailing=true]\n * Specify invoking on the trailing edge of the timeout.\n * @returns {Function} Returns the new throttled function.\n * @example\n *\n * // Avoid excessively updating the position while scrolling.\n * jQuery(window).on('scroll', _.throttle(updatePosition, 100));\n *\n * // Invoke `renewToken` when the click event is fired, but not more than once every 5 minutes.\n * var throttled = _.throttle(renewToken, 300000, { 'trailing': false });\n * jQuery(element).on('click', throttled);\n *\n * // Cancel the trailing throttled invocation.\n * jQuery(window).on('popstate', throttled.cancel);\n */\nfunction throttle(func, wait, options) {\n var leading = true,\n trailing = true;\n\n if (typeof func != 'function') {\n throw new TypeError(FUNC_ERROR_TEXT);\n }\n if (isObject(options)) {\n leading = 'leading' in options ? !!options.leading : leading;\n trailing = 'trailing' in options ? !!options.trailing : trailing;\n }\n return debounce(func, wait, {\n 'leading': leading,\n 'maxWait': wait,\n 'trailing': trailing\n });\n}\n\nmodule.exports = throttle;\n"],
"mappings": ";;;;;;;;;;;AAAA;AAAA;AAAA,QAAI,WAAW;AAAf,QACI,WAAW;AAGf,QAAI,kBAAkB;AA8CtB,aAAS,SAAS,MAAM,MAAM,SAAS;AACrC,UAAI,UAAU,MACV,WAAW;AAEf,UAAI,OAAO,QAAQ,YAAY;AAC7B,cAAM,IAAI,UAAU,eAAe;AAAA,MACrC;AACA,UAAI,SAAS,OAAO,GAAG;AACrB,kBAAU,aAAa,UAAU,CAAC,CAAC,QAAQ,UAAU;AACrD,mBAAW,cAAc,UAAU,CAAC,CAAC,QAAQ,WAAW;AAAA,MAC1D;AACA,aAAO,SAAS,MAAM,MAAM;AAAA,QAC1B,WAAW;AAAA,QACX,WAAW;AAAA,QACX,YAAY;AAAA,MACd,CAAC;AAAA,IACH;AAEA,WAAO,UAAU;AAAA;AAAA;",
"names": []
}

View File

@@ -1,22 +0,0 @@
import {
require_baseIsEqual
} from "./chunk-6Q6IFNG3.js";
import {
__commonJS
} from "./chunk-KEXKKQVW.js";
// node_modules/lodash/isEqual.js
var require_isEqual = __commonJS({
"node_modules/lodash/isEqual.js"(exports, module) {
var baseIsEqual = require_baseIsEqual();
function isEqual(value, other) {
return baseIsEqual(value, other);
}
module.exports = isEqual;
}
});
export {
require_isEqual
};
//# sourceMappingURL=chunk-2HKPRQOD.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../lodash/isEqual.js"],
"sourcesContent": ["var baseIsEqual = require('./_baseIsEqual');\n\n/**\n * Performs a deep comparison between two values to determine if they are\n * equivalent.\n *\n * **Note:** This method supports comparing arrays, array buffers, booleans,\n * date objects, error objects, maps, numbers, `Object` objects, regexes,\n * sets, strings, symbols, and typed arrays. `Object` objects are compared\n * by their own, not inherited, enumerable properties. Functions and DOM\n * nodes are compared by strict equality, i.e. `===`.\n *\n * @static\n * @memberOf _\n * @since 0.1.0\n * @category Lang\n * @param {*} value The value to compare.\n * @param {*} other The other value to compare.\n * @returns {boolean} Returns `true` if the values are equivalent, else `false`.\n * @example\n *\n * var object = { 'a': 1 };\n * var other = { 'a': 1 };\n *\n * _.isEqual(object, other);\n * // => true\n *\n * object === other;\n * // => false\n */\nfunction isEqual(value, other) {\n return baseIsEqual(value, other);\n}\n\nmodule.exports = isEqual;\n"],
"mappings": ";;;;;;;;AAAA;AAAA;AAAA,QAAI,cAAc;AA8BlB,aAAS,QAAQ,OAAO,OAAO;AAC7B,aAAO,YAAY,OAAO,KAAK;AAAA,IACjC;AAEA,WAAO,UAAU;AAAA;AAAA;",
"names": []
}

View File

@@ -1,148 +0,0 @@
import {
require_toString
} from "./chunk-3NBNF4EG.js";
import {
__commonJS
} from "./chunk-KEXKKQVW.js";
// node_modules/lodash/_baseSlice.js
var require_baseSlice = __commonJS({
"node_modules/lodash/_baseSlice.js"(exports, module) {
function baseSlice(array, start, end) {
var index = -1, length = array.length;
if (start < 0) {
start = -start > length ? 0 : length + start;
}
end = end > length ? length : end;
if (end < 0) {
end += length;
}
length = start > end ? 0 : end - start >>> 0;
start >>>= 0;
var result = Array(length);
while (++index < length) {
result[index] = array[index + start];
}
return result;
}
module.exports = baseSlice;
}
});
// node_modules/lodash/_castSlice.js
var require_castSlice = __commonJS({
"node_modules/lodash/_castSlice.js"(exports, module) {
var baseSlice = require_baseSlice();
function castSlice(array, start, end) {
var length = array.length;
end = end === void 0 ? length : end;
return !start && end >= length ? array : baseSlice(array, start, end);
}
module.exports = castSlice;
}
});
// node_modules/lodash/_hasUnicode.js
var require_hasUnicode = __commonJS({
"node_modules/lodash/_hasUnicode.js"(exports, module) {
var rsAstralRange = "\\ud800-\\udfff";
var rsComboMarksRange = "\\u0300-\\u036f";
var reComboHalfMarksRange = "\\ufe20-\\ufe2f";
var rsComboSymbolsRange = "\\u20d0-\\u20ff";
var rsComboRange = rsComboMarksRange + reComboHalfMarksRange + rsComboSymbolsRange;
var rsVarRange = "\\ufe0e\\ufe0f";
var rsZWJ = "\\u200d";
var reHasUnicode = RegExp("[" + rsZWJ + rsAstralRange + rsComboRange + rsVarRange + "]");
function hasUnicode(string) {
return reHasUnicode.test(string);
}
module.exports = hasUnicode;
}
});
// node_modules/lodash/_asciiToArray.js
var require_asciiToArray = __commonJS({
"node_modules/lodash/_asciiToArray.js"(exports, module) {
function asciiToArray(string) {
return string.split("");
}
module.exports = asciiToArray;
}
});
// node_modules/lodash/_unicodeToArray.js
var require_unicodeToArray = __commonJS({
"node_modules/lodash/_unicodeToArray.js"(exports, module) {
var rsAstralRange = "\\ud800-\\udfff";
var rsComboMarksRange = "\\u0300-\\u036f";
var reComboHalfMarksRange = "\\ufe20-\\ufe2f";
var rsComboSymbolsRange = "\\u20d0-\\u20ff";
var rsComboRange = rsComboMarksRange + reComboHalfMarksRange + rsComboSymbolsRange;
var rsVarRange = "\\ufe0e\\ufe0f";
var rsAstral = "[" + rsAstralRange + "]";
var rsCombo = "[" + rsComboRange + "]";
var rsFitz = "\\ud83c[\\udffb-\\udfff]";
var rsModifier = "(?:" + rsCombo + "|" + rsFitz + ")";
var rsNonAstral = "[^" + rsAstralRange + "]";
var rsRegional = "(?:\\ud83c[\\udde6-\\uddff]){2}";
var rsSurrPair = "[\\ud800-\\udbff][\\udc00-\\udfff]";
var rsZWJ = "\\u200d";
var reOptMod = rsModifier + "?";
var rsOptVar = "[" + rsVarRange + "]?";
var rsOptJoin = "(?:" + rsZWJ + "(?:" + [rsNonAstral, rsRegional, rsSurrPair].join("|") + ")" + rsOptVar + reOptMod + ")*";
var rsSeq = rsOptVar + reOptMod + rsOptJoin;
var rsSymbol = "(?:" + [rsNonAstral + rsCombo + "?", rsCombo, rsRegional, rsSurrPair, rsAstral].join("|") + ")";
var reUnicode = RegExp(rsFitz + "(?=" + rsFitz + ")|" + rsSymbol + rsSeq, "g");
function unicodeToArray(string) {
return string.match(reUnicode) || [];
}
module.exports = unicodeToArray;
}
});
// node_modules/lodash/_stringToArray.js
var require_stringToArray = __commonJS({
"node_modules/lodash/_stringToArray.js"(exports, module) {
var asciiToArray = require_asciiToArray();
var hasUnicode = require_hasUnicode();
var unicodeToArray = require_unicodeToArray();
function stringToArray(string) {
return hasUnicode(string) ? unicodeToArray(string) : asciiToArray(string);
}
module.exports = stringToArray;
}
});
// node_modules/lodash/_createCaseFirst.js
var require_createCaseFirst = __commonJS({
"node_modules/lodash/_createCaseFirst.js"(exports, module) {
var castSlice = require_castSlice();
var hasUnicode = require_hasUnicode();
var stringToArray = require_stringToArray();
var toString = require_toString();
function createCaseFirst(methodName) {
return function(string) {
string = toString(string);
var strSymbols = hasUnicode(string) ? stringToArray(string) : void 0;
var chr = strSymbols ? strSymbols[0] : string.charAt(0);
var trailing = strSymbols ? castSlice(strSymbols, 1).join("") : string.slice(1);
return chr[methodName]() + trailing;
};
}
module.exports = createCaseFirst;
}
});
// node_modules/lodash/upperFirst.js
var require_upperFirst = __commonJS({
"node_modules/lodash/upperFirst.js"(exports, module) {
var createCaseFirst = require_createCaseFirst();
var upperFirst = createCaseFirst("toUpperCase");
module.exports = upperFirst;
}
});
export {
require_upperFirst
};
//# sourceMappingURL=chunk-3BMTL3O2.js.map

View File

@@ -1,70 +0,0 @@
import {
require_isArray
} from "./chunk-V2ZIA3AG.js";
import {
require_isSymbol
} from "./chunk-L7AFSFRA.js";
import {
require_Symbol
} from "./chunk-FMVO6WZI.js";
import {
__commonJS
} from "./chunk-KEXKKQVW.js";
// node_modules/lodash/_arrayMap.js
var require_arrayMap = __commonJS({
"node_modules/lodash/_arrayMap.js"(exports, module) {
function arrayMap(array, iteratee) {
var index = -1, length = array == null ? 0 : array.length, result = Array(length);
while (++index < length) {
result[index] = iteratee(array[index], index, array);
}
return result;
}
module.exports = arrayMap;
}
});
// node_modules/lodash/_baseToString.js
var require_baseToString = __commonJS({
"node_modules/lodash/_baseToString.js"(exports, module) {
var Symbol = require_Symbol();
var arrayMap = require_arrayMap();
var isArray = require_isArray();
var isSymbol = require_isSymbol();
var INFINITY = 1 / 0;
var symbolProto = Symbol ? Symbol.prototype : void 0;
var symbolToString = symbolProto ? symbolProto.toString : void 0;
function baseToString(value) {
if (typeof value == "string") {
return value;
}
if (isArray(value)) {
return arrayMap(value, baseToString) + "";
}
if (isSymbol(value)) {
return symbolToString ? symbolToString.call(value) : "";
}
var result = value + "";
return result == "0" && 1 / value == -INFINITY ? "-0" : result;
}
module.exports = baseToString;
}
});
// node_modules/lodash/toString.js
var require_toString = __commonJS({
"node_modules/lodash/toString.js"(exports, module) {
var baseToString = require_baseToString();
function toString(value) {
return value == null ? "" : baseToString(value);
}
module.exports = toString;
}
});
export {
require_arrayMap,
require_toString
};
//# sourceMappingURL=chunk-3NBNF4EG.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../lodash/_arrayMap.js", "../../../../../lodash/_baseToString.js", "../../../../../lodash/toString.js"],
"sourcesContent": ["/**\n * A specialized version of `_.map` for arrays without support for iteratee\n * shorthands.\n *\n * @private\n * @param {Array} [array] The array to iterate over.\n * @param {Function} iteratee The function invoked per iteration.\n * @returns {Array} Returns the new mapped array.\n */\nfunction arrayMap(array, iteratee) {\n var index = -1,\n length = array == null ? 0 : array.length,\n result = Array(length);\n\n while (++index < length) {\n result[index] = iteratee(array[index], index, array);\n }\n return result;\n}\n\nmodule.exports = arrayMap;\n", "var Symbol = require('./_Symbol'),\n arrayMap = require('./_arrayMap'),\n isArray = require('./isArray'),\n isSymbol = require('./isSymbol');\n\n/** Used as references for various `Number` constants. */\nvar INFINITY = 1 / 0;\n\n/** Used to convert symbols to primitives and strings. */\nvar symbolProto = Symbol ? Symbol.prototype : undefined,\n symbolToString = symbolProto ? symbolProto.toString : undefined;\n\n/**\n * The base implementation of `_.toString` which doesn't convert nullish\n * values to empty strings.\n *\n * @private\n * @param {*} value The value to process.\n * @returns {string} Returns the string.\n */\nfunction baseToString(value) {\n // Exit early for strings to avoid a performance hit in some environments.\n if (typeof value == 'string') {\n return value;\n }\n if (isArray(value)) {\n // Recursively convert values (susceptible to call stack limits).\n return arrayMap(value, baseToString) + '';\n }\n if (isSymbol(value)) {\n return symbolToString ? symbolToString.call(value) : '';\n }\n var result = (value + '');\n return (result == '0' && (1 / value) == -INFINITY) ? '-0' : result;\n}\n\nmodule.exports = baseToString;\n", "var baseToString = require('./_baseToString');\n\n/**\n * Converts `value` to a string. An empty string is returned for `null`\n * and `undefined` values. The sign of `-0` is preserved.\n *\n * @static\n * @memberOf _\n * @since 4.0.0\n * @category Lang\n * @param {*} value The value to convert.\n * @returns {string} Returns the converted string.\n * @example\n *\n * _.toString(null);\n * // => ''\n *\n * _.toString(-0);\n * // => '-0'\n *\n * _.toString([1, 2, 3]);\n * // => '1,2,3'\n */\nfunction toString(value) {\n return value == null ? '' : baseToString(value);\n}\n\nmodule.exports = toString;\n"],
"mappings": ";;;;;;;;;;;;;;AAAA;AAAA;AASA,aAAS,SAAS,OAAO,UAAU;AACjC,UAAI,QAAQ,IACR,SAAS,SAAS,OAAO,IAAI,MAAM,QACnC,SAAS,MAAM,MAAM;AAEzB,aAAO,EAAE,QAAQ,QAAQ;AACvB,eAAO,KAAK,IAAI,SAAS,MAAM,KAAK,GAAG,OAAO,KAAK;AAAA,MACrD;AACA,aAAO;AAAA,IACT;AAEA,WAAO,UAAU;AAAA;AAAA;;;ACpBjB;AAAA;AAAA,QAAI,SAAS;AAAb,QACI,WAAW;AADf,QAEI,UAAU;AAFd,QAGI,WAAW;AAGf,QAAI,WAAW,IAAI;AAGnB,QAAI,cAAc,SAAS,OAAO,YAAY;AAA9C,QACI,iBAAiB,cAAc,YAAY,WAAW;AAU1D,aAAS,aAAa,OAAO;AAE3B,UAAI,OAAO,SAAS,UAAU;AAC5B,eAAO;AAAA,MACT;AACA,UAAI,QAAQ,KAAK,GAAG;AAElB,eAAO,SAAS,OAAO,YAAY,IAAI;AAAA,MACzC;AACA,UAAI,SAAS,KAAK,GAAG;AACnB,eAAO,iBAAiB,eAAe,KAAK,KAAK,IAAI;AAAA,MACvD;AACA,UAAI,SAAU,QAAQ;AACtB,aAAQ,UAAU,OAAQ,IAAI,SAAU,CAAC,WAAY,OAAO;AAAA,IAC9D;AAEA,WAAO,UAAU;AAAA;AAAA;;;ACpCjB;AAAA;AAAA,QAAI,eAAe;AAuBnB,aAAS,SAAS,OAAO;AACvB,aAAO,SAAS,OAAO,KAAK,aAAa,KAAK;AAAA,IAChD;AAEA,WAAO,UAAU;AAAA;AAAA;",
"names": []
}

View File

@@ -1,271 +0,0 @@
import {
require_overRest,
require_setToString
} from "./chunk-DJFXXVH6.js";
import {
require_isPlainObject
} from "./chunk-Y6FQDQQ2.js";
import {
require_cloneBuffer,
require_cloneTypedArray,
require_copyArray,
require_copyObject,
require_initCloneObject
} from "./chunk-KNVPX2EF.js";
import {
require_keysIn
} from "./chunk-5TYTZAF2.js";
import {
require_baseFor
} from "./chunk-4O4FP57Y.js";
import {
require_identity
} from "./chunk-P43QBNJ2.js";
import {
require_baseAssignValue
} from "./chunk-JLPG4W6G.js";
import {
require_Stack,
require_isArrayLike,
require_isBuffer,
require_isTypedArray
} from "./chunk-QAGLP4WU.js";
import {
require_isArguments,
require_isIndex
} from "./chunk-KAVMBZUT.js";
import {
require_eq
} from "./chunk-4YXVX72G.js";
import {
require_isFunction
} from "./chunk-BP6K5E6K.js";
import {
require_isArray
} from "./chunk-V2ZIA3AG.js";
import {
require_isObject
} from "./chunk-LJMOOM7L.js";
import {
require_isObjectLike
} from "./chunk-LSIB72U6.js";
import {
__commonJS
} from "./chunk-KEXKKQVW.js";
// node_modules/lodash/_assignMergeValue.js
var require_assignMergeValue = __commonJS({
"node_modules/lodash/_assignMergeValue.js"(exports, module) {
var baseAssignValue = require_baseAssignValue();
var eq = require_eq();
function assignMergeValue(object, key, value) {
if (value !== void 0 && !eq(object[key], value) || value === void 0 && !(key in object)) {
baseAssignValue(object, key, value);
}
}
module.exports = assignMergeValue;
}
});
// node_modules/lodash/isArrayLikeObject.js
var require_isArrayLikeObject = __commonJS({
"node_modules/lodash/isArrayLikeObject.js"(exports, module) {
var isArrayLike = require_isArrayLike();
var isObjectLike = require_isObjectLike();
function isArrayLikeObject(value) {
return isObjectLike(value) && isArrayLike(value);
}
module.exports = isArrayLikeObject;
}
});
// node_modules/lodash/_safeGet.js
var require_safeGet = __commonJS({
"node_modules/lodash/_safeGet.js"(exports, module) {
function safeGet(object, key) {
if (key === "constructor" && typeof object[key] === "function") {
return;
}
if (key == "__proto__") {
return;
}
return object[key];
}
module.exports = safeGet;
}
});
// node_modules/lodash/toPlainObject.js
var require_toPlainObject = __commonJS({
"node_modules/lodash/toPlainObject.js"(exports, module) {
var copyObject = require_copyObject();
var keysIn = require_keysIn();
function toPlainObject(value) {
return copyObject(value, keysIn(value));
}
module.exports = toPlainObject;
}
});
// node_modules/lodash/_baseMergeDeep.js
var require_baseMergeDeep = __commonJS({
"node_modules/lodash/_baseMergeDeep.js"(exports, module) {
var assignMergeValue = require_assignMergeValue();
var cloneBuffer = require_cloneBuffer();
var cloneTypedArray = require_cloneTypedArray();
var copyArray = require_copyArray();
var initCloneObject = require_initCloneObject();
var isArguments = require_isArguments();
var isArray = require_isArray();
var isArrayLikeObject = require_isArrayLikeObject();
var isBuffer = require_isBuffer();
var isFunction = require_isFunction();
var isObject = require_isObject();
var isPlainObject = require_isPlainObject();
var isTypedArray = require_isTypedArray();
var safeGet = require_safeGet();
var toPlainObject = require_toPlainObject();
function baseMergeDeep(object, source, key, srcIndex, mergeFunc, customizer, stack) {
var objValue = safeGet(object, key), srcValue = safeGet(source, key), stacked = stack.get(srcValue);
if (stacked) {
assignMergeValue(object, key, stacked);
return;
}
var newValue = customizer ? customizer(objValue, srcValue, key + "", object, source, stack) : void 0;
var isCommon = newValue === void 0;
if (isCommon) {
var isArr = isArray(srcValue), isBuff = !isArr && isBuffer(srcValue), isTyped = !isArr && !isBuff && isTypedArray(srcValue);
newValue = srcValue;
if (isArr || isBuff || isTyped) {
if (isArray(objValue)) {
newValue = objValue;
} else if (isArrayLikeObject(objValue)) {
newValue = copyArray(objValue);
} else if (isBuff) {
isCommon = false;
newValue = cloneBuffer(srcValue, true);
} else if (isTyped) {
isCommon = false;
newValue = cloneTypedArray(srcValue, true);
} else {
newValue = [];
}
} else if (isPlainObject(srcValue) || isArguments(srcValue)) {
newValue = objValue;
if (isArguments(objValue)) {
newValue = toPlainObject(objValue);
} else if (!isObject(objValue) || isFunction(objValue)) {
newValue = initCloneObject(srcValue);
}
} else {
isCommon = false;
}
}
if (isCommon) {
stack.set(srcValue, newValue);
mergeFunc(newValue, srcValue, srcIndex, customizer, stack);
stack["delete"](srcValue);
}
assignMergeValue(object, key, newValue);
}
module.exports = baseMergeDeep;
}
});
// node_modules/lodash/_baseMerge.js
var require_baseMerge = __commonJS({
"node_modules/lodash/_baseMerge.js"(exports, module) {
var Stack = require_Stack();
var assignMergeValue = require_assignMergeValue();
var baseFor = require_baseFor();
var baseMergeDeep = require_baseMergeDeep();
var isObject = require_isObject();
var keysIn = require_keysIn();
var safeGet = require_safeGet();
function baseMerge(object, source, srcIndex, customizer, stack) {
if (object === source) {
return;
}
baseFor(source, function(srcValue, key) {
stack || (stack = new Stack());
if (isObject(srcValue)) {
baseMergeDeep(object, source, key, srcIndex, baseMerge, customizer, stack);
} else {
var newValue = customizer ? customizer(safeGet(object, key), srcValue, key + "", object, source, stack) : void 0;
if (newValue === void 0) {
newValue = srcValue;
}
assignMergeValue(object, key, newValue);
}
}, keysIn);
}
module.exports = baseMerge;
}
});
// node_modules/lodash/_baseRest.js
var require_baseRest = __commonJS({
"node_modules/lodash/_baseRest.js"(exports, module) {
var identity = require_identity();
var overRest = require_overRest();
var setToString = require_setToString();
function baseRest(func, start) {
return setToString(overRest(func, start, identity), func + "");
}
module.exports = baseRest;
}
});
// node_modules/lodash/_isIterateeCall.js
var require_isIterateeCall = __commonJS({
"node_modules/lodash/_isIterateeCall.js"(exports, module) {
var eq = require_eq();
var isArrayLike = require_isArrayLike();
var isIndex = require_isIndex();
var isObject = require_isObject();
function isIterateeCall(value, index, object) {
if (!isObject(object)) {
return false;
}
var type = typeof index;
if (type == "number" ? isArrayLike(object) && isIndex(index, object.length) : type == "string" && index in object) {
return eq(object[index], value);
}
return false;
}
module.exports = isIterateeCall;
}
});
// node_modules/lodash/_createAssigner.js
var require_createAssigner = __commonJS({
"node_modules/lodash/_createAssigner.js"(exports, module) {
var baseRest = require_baseRest();
var isIterateeCall = require_isIterateeCall();
function createAssigner(assigner) {
return baseRest(function(object, sources) {
var index = -1, length = sources.length, customizer = length > 1 ? sources[length - 1] : void 0, guard = length > 2 ? sources[2] : void 0;
customizer = assigner.length > 3 && typeof customizer == "function" ? (length--, customizer) : void 0;
if (guard && isIterateeCall(sources[0], sources[1], guard)) {
customizer = length < 3 ? void 0 : customizer;
length = 1;
}
object = Object(object);
while (++index < length) {
var source = sources[index];
if (source) {
assigner(object, source, index, customizer);
}
}
return object;
});
}
module.exports = createAssigner;
}
});
export {
require_baseMerge,
require_createAssigner
};
//# sourceMappingURL=chunk-3V7DD6PY.js.map

View File

@@ -1,380 +0,0 @@
import {
require_toString
} from "./chunk-3NBNF4EG.js";
import {
__commonJS
} from "./chunk-KEXKKQVW.js";
// node_modules/lodash/_arrayReduce.js
var require_arrayReduce = __commonJS({
"node_modules/lodash/_arrayReduce.js"(exports, module) {
function arrayReduce(array, iteratee, accumulator, initAccum) {
var index = -1, length = array == null ? 0 : array.length;
if (initAccum && length) {
accumulator = array[++index];
}
while (++index < length) {
accumulator = iteratee(accumulator, array[index], index, array);
}
return accumulator;
}
module.exports = arrayReduce;
}
});
// node_modules/lodash/_basePropertyOf.js
var require_basePropertyOf = __commonJS({
"node_modules/lodash/_basePropertyOf.js"(exports, module) {
function basePropertyOf(object) {
return function(key) {
return object == null ? void 0 : object[key];
};
}
module.exports = basePropertyOf;
}
});
// node_modules/lodash/_deburrLetter.js
var require_deburrLetter = __commonJS({
"node_modules/lodash/_deburrLetter.js"(exports, module) {
var basePropertyOf = require_basePropertyOf();
var deburredLetters = {
// Latin-1 Supplement block.
"À": "A",
"Á": "A",
"Â": "A",
"Ã": "A",
"Ä": "A",
"Å": "A",
"à": "a",
"á": "a",
"â": "a",
"ã": "a",
"ä": "a",
"å": "a",
"Ç": "C",
"ç": "c",
"Ð": "D",
"ð": "d",
"È": "E",
"É": "E",
"Ê": "E",
"Ë": "E",
"è": "e",
"é": "e",
"ê": "e",
"ë": "e",
"Ì": "I",
"Í": "I",
"Î": "I",
"Ï": "I",
"ì": "i",
"í": "i",
"î": "i",
"ï": "i",
"Ñ": "N",
"ñ": "n",
"Ò": "O",
"Ó": "O",
"Ô": "O",
"Õ": "O",
"Ö": "O",
"Ø": "O",
"ò": "o",
"ó": "o",
"ô": "o",
"õ": "o",
"ö": "o",
"ø": "o",
"Ù": "U",
"Ú": "U",
"Û": "U",
"Ü": "U",
"ù": "u",
"ú": "u",
"û": "u",
"ü": "u",
"Ý": "Y",
"ý": "y",
"ÿ": "y",
"Æ": "Ae",
"æ": "ae",
"Þ": "Th",
"þ": "th",
"ß": "ss",
// Latin Extended-A block.
"Ā": "A",
"Ă": "A",
"Ą": "A",
"ā": "a",
"ă": "a",
"ą": "a",
"Ć": "C",
"Ĉ": "C",
"Ċ": "C",
"Č": "C",
"ć": "c",
"ĉ": "c",
"ċ": "c",
"č": "c",
"Ď": "D",
"Đ": "D",
"ď": "d",
"đ": "d",
"Ē": "E",
"Ĕ": "E",
"Ė": "E",
"Ę": "E",
"Ě": "E",
"ē": "e",
"ĕ": "e",
"ė": "e",
"ę": "e",
"ě": "e",
"Ĝ": "G",
"Ğ": "G",
"Ġ": "G",
"Ģ": "G",
"ĝ": "g",
"ğ": "g",
"ġ": "g",
"ģ": "g",
"Ĥ": "H",
"Ħ": "H",
"ĥ": "h",
"ħ": "h",
"Ĩ": "I",
"Ī": "I",
"Ĭ": "I",
"Į": "I",
"İ": "I",
"ĩ": "i",
"ī": "i",
"ĭ": "i",
"į": "i",
"ı": "i",
"Ĵ": "J",
"ĵ": "j",
"Ķ": "K",
"ķ": "k",
"ĸ": "k",
"Ĺ": "L",
"Ļ": "L",
"Ľ": "L",
"Ŀ": "L",
"Ł": "L",
"ĺ": "l",
"ļ": "l",
"ľ": "l",
"ŀ": "l",
"ł": "l",
"Ń": "N",
"Ņ": "N",
"Ň": "N",
"Ŋ": "N",
"ń": "n",
"ņ": "n",
"ň": "n",
"ŋ": "n",
"Ō": "O",
"Ŏ": "O",
"Ő": "O",
"ō": "o",
"ŏ": "o",
"ő": "o",
"Ŕ": "R",
"Ŗ": "R",
"Ř": "R",
"ŕ": "r",
"ŗ": "r",
"ř": "r",
"Ś": "S",
"Ŝ": "S",
"Ş": "S",
"Š": "S",
"ś": "s",
"ŝ": "s",
"ş": "s",
"š": "s",
"Ţ": "T",
"Ť": "T",
"Ŧ": "T",
"ţ": "t",
"ť": "t",
"ŧ": "t",
"Ũ": "U",
"Ū": "U",
"Ŭ": "U",
"Ů": "U",
"Ű": "U",
"Ų": "U",
"ũ": "u",
"ū": "u",
"ŭ": "u",
"ů": "u",
"ű": "u",
"ų": "u",
"Ŵ": "W",
"ŵ": "w",
"Ŷ": "Y",
"ŷ": "y",
"Ÿ": "Y",
"Ź": "Z",
"Ż": "Z",
"Ž": "Z",
"ź": "z",
"ż": "z",
"ž": "z",
"IJ": "IJ",
"ij": "ij",
"Œ": "Oe",
"œ": "oe",
"ʼn": "'n",
"ſ": "s"
};
var deburrLetter = basePropertyOf(deburredLetters);
module.exports = deburrLetter;
}
});
// node_modules/lodash/deburr.js
var require_deburr = __commonJS({
"node_modules/lodash/deburr.js"(exports, module) {
var deburrLetter = require_deburrLetter();
var toString = require_toString();
var reLatin = /[\xc0-\xd6\xd8-\xf6\xf8-\xff\u0100-\u017f]/g;
var rsComboMarksRange = "\\u0300-\\u036f";
var reComboHalfMarksRange = "\\ufe20-\\ufe2f";
var rsComboSymbolsRange = "\\u20d0-\\u20ff";
var rsComboRange = rsComboMarksRange + reComboHalfMarksRange + rsComboSymbolsRange;
var rsCombo = "[" + rsComboRange + "]";
var reComboMark = RegExp(rsCombo, "g");
function deburr(string) {
string = toString(string);
return string && string.replace(reLatin, deburrLetter).replace(reComboMark, "");
}
module.exports = deburr;
}
});
// node_modules/lodash/_asciiWords.js
var require_asciiWords = __commonJS({
"node_modules/lodash/_asciiWords.js"(exports, module) {
var reAsciiWord = /[^\x00-\x2f\x3a-\x40\x5b-\x60\x7b-\x7f]+/g;
function asciiWords(string) {
return string.match(reAsciiWord) || [];
}
module.exports = asciiWords;
}
});
// node_modules/lodash/_hasUnicodeWord.js
var require_hasUnicodeWord = __commonJS({
"node_modules/lodash/_hasUnicodeWord.js"(exports, module) {
var reHasUnicodeWord = /[a-z][A-Z]|[A-Z]{2}[a-z]|[0-9][a-zA-Z]|[a-zA-Z][0-9]|[^a-zA-Z0-9 ]/;
function hasUnicodeWord(string) {
return reHasUnicodeWord.test(string);
}
module.exports = hasUnicodeWord;
}
});
// node_modules/lodash/_unicodeWords.js
var require_unicodeWords = __commonJS({
"node_modules/lodash/_unicodeWords.js"(exports, module) {
var rsAstralRange = "\\ud800-\\udfff";
var rsComboMarksRange = "\\u0300-\\u036f";
var reComboHalfMarksRange = "\\ufe20-\\ufe2f";
var rsComboSymbolsRange = "\\u20d0-\\u20ff";
var rsComboRange = rsComboMarksRange + reComboHalfMarksRange + rsComboSymbolsRange;
var rsDingbatRange = "\\u2700-\\u27bf";
var rsLowerRange = "a-z\\xdf-\\xf6\\xf8-\\xff";
var rsMathOpRange = "\\xac\\xb1\\xd7\\xf7";
var rsNonCharRange = "\\x00-\\x2f\\x3a-\\x40\\x5b-\\x60\\x7b-\\xbf";
var rsPunctuationRange = "\\u2000-\\u206f";
var rsSpaceRange = " \\t\\x0b\\f\\xa0\\ufeff\\n\\r\\u2028\\u2029\\u1680\\u180e\\u2000\\u2001\\u2002\\u2003\\u2004\\u2005\\u2006\\u2007\\u2008\\u2009\\u200a\\u202f\\u205f\\u3000";
var rsUpperRange = "A-Z\\xc0-\\xd6\\xd8-\\xde";
var rsVarRange = "\\ufe0e\\ufe0f";
var rsBreakRange = rsMathOpRange + rsNonCharRange + rsPunctuationRange + rsSpaceRange;
var rsApos = "[']";
var rsBreak = "[" + rsBreakRange + "]";
var rsCombo = "[" + rsComboRange + "]";
var rsDigits = "\\d+";
var rsDingbat = "[" + rsDingbatRange + "]";
var rsLower = "[" + rsLowerRange + "]";
var rsMisc = "[^" + rsAstralRange + rsBreakRange + rsDigits + rsDingbatRange + rsLowerRange + rsUpperRange + "]";
var rsFitz = "\\ud83c[\\udffb-\\udfff]";
var rsModifier = "(?:" + rsCombo + "|" + rsFitz + ")";
var rsNonAstral = "[^" + rsAstralRange + "]";
var rsRegional = "(?:\\ud83c[\\udde6-\\uddff]){2}";
var rsSurrPair = "[\\ud800-\\udbff][\\udc00-\\udfff]";
var rsUpper = "[" + rsUpperRange + "]";
var rsZWJ = "\\u200d";
var rsMiscLower = "(?:" + rsLower + "|" + rsMisc + ")";
var rsMiscUpper = "(?:" + rsUpper + "|" + rsMisc + ")";
var rsOptContrLower = "(?:" + rsApos + "(?:d|ll|m|re|s|t|ve))?";
var rsOptContrUpper = "(?:" + rsApos + "(?:D|LL|M|RE|S|T|VE))?";
var reOptMod = rsModifier + "?";
var rsOptVar = "[" + rsVarRange + "]?";
var rsOptJoin = "(?:" + rsZWJ + "(?:" + [rsNonAstral, rsRegional, rsSurrPair].join("|") + ")" + rsOptVar + reOptMod + ")*";
var rsOrdLower = "\\d*(?:1st|2nd|3rd|(?![123])\\dth)(?=\\b|[A-Z_])";
var rsOrdUpper = "\\d*(?:1ST|2ND|3RD|(?![123])\\dTH)(?=\\b|[a-z_])";
var rsSeq = rsOptVar + reOptMod + rsOptJoin;
var rsEmoji = "(?:" + [rsDingbat, rsRegional, rsSurrPair].join("|") + ")" + rsSeq;
var reUnicodeWord = RegExp([
rsUpper + "?" + rsLower + "+" + rsOptContrLower + "(?=" + [rsBreak, rsUpper, "$"].join("|") + ")",
rsMiscUpper + "+" + rsOptContrUpper + "(?=" + [rsBreak, rsUpper + rsMiscLower, "$"].join("|") + ")",
rsUpper + "?" + rsMiscLower + "+" + rsOptContrLower,
rsUpper + "+" + rsOptContrUpper,
rsOrdUpper,
rsOrdLower,
rsDigits,
rsEmoji
].join("|"), "g");
function unicodeWords(string) {
return string.match(reUnicodeWord) || [];
}
module.exports = unicodeWords;
}
});
// node_modules/lodash/words.js
var require_words = __commonJS({
"node_modules/lodash/words.js"(exports, module) {
var asciiWords = require_asciiWords();
var hasUnicodeWord = require_hasUnicodeWord();
var toString = require_toString();
var unicodeWords = require_unicodeWords();
function words(string, pattern, guard) {
string = toString(string);
pattern = guard ? void 0 : pattern;
if (pattern === void 0) {
return hasUnicodeWord(string) ? unicodeWords(string) : asciiWords(string);
}
return string.match(pattern) || [];
}
module.exports = words;
}
});
// node_modules/lodash/_createCompounder.js
var require_createCompounder = __commonJS({
"node_modules/lodash/_createCompounder.js"(exports, module) {
var arrayReduce = require_arrayReduce();
var deburr = require_deburr();
var words = require_words();
var rsApos = "[']";
var reApos = RegExp(rsApos, "g");
function createCompounder(callback) {
return function(string) {
return arrayReduce(words(deburr(string).replace(reApos, "")), callback, "");
};
}
module.exports = createCompounder;
}
});
export {
require_createCompounder
};
//# sourceMappingURL=chunk-4CUKBX3S.js.map

View File

@@ -1,122 +0,0 @@
import {
require_basePickBy
} from "./chunk-67PDOJAZ.js";
import {
require_overRest,
require_setToString
} from "./chunk-DJFXXVH6.js";
import {
require_hasIn
} from "./chunk-TPPVBE2B.js";
import {
require_arrayPush
} from "./chunk-WCMV6LZ5.js";
import {
require_isArguments
} from "./chunk-KAVMBZUT.js";
import {
require_isArray
} from "./chunk-V2ZIA3AG.js";
import {
require_Symbol
} from "./chunk-FMVO6WZI.js";
import {
__commonJS
} from "./chunk-KEXKKQVW.js";
// node_modules/lodash/_basePick.js
var require_basePick = __commonJS({
"node_modules/lodash/_basePick.js"(exports, module) {
var basePickBy = require_basePickBy();
var hasIn = require_hasIn();
function basePick(object, paths) {
return basePickBy(object, paths, function(value, path) {
return hasIn(object, path);
});
}
module.exports = basePick;
}
});
// node_modules/lodash/_isFlattenable.js
var require_isFlattenable = __commonJS({
"node_modules/lodash/_isFlattenable.js"(exports, module) {
var Symbol = require_Symbol();
var isArguments = require_isArguments();
var isArray = require_isArray();
var spreadableSymbol = Symbol ? Symbol.isConcatSpreadable : void 0;
function isFlattenable(value) {
return isArray(value) || isArguments(value) || !!(spreadableSymbol && value && value[spreadableSymbol]);
}
module.exports = isFlattenable;
}
});
// node_modules/lodash/_baseFlatten.js
var require_baseFlatten = __commonJS({
"node_modules/lodash/_baseFlatten.js"(exports, module) {
var arrayPush = require_arrayPush();
var isFlattenable = require_isFlattenable();
function baseFlatten(array, depth, predicate, isStrict, result) {
var index = -1, length = array.length;
predicate || (predicate = isFlattenable);
result || (result = []);
while (++index < length) {
var value = array[index];
if (depth > 0 && predicate(value)) {
if (depth > 1) {
baseFlatten(value, depth - 1, predicate, isStrict, result);
} else {
arrayPush(result, value);
}
} else if (!isStrict) {
result[result.length] = value;
}
}
return result;
}
module.exports = baseFlatten;
}
});
// node_modules/lodash/flatten.js
var require_flatten = __commonJS({
"node_modules/lodash/flatten.js"(exports, module) {
var baseFlatten = require_baseFlatten();
function flatten(array) {
var length = array == null ? 0 : array.length;
return length ? baseFlatten(array, 1) : [];
}
module.exports = flatten;
}
});
// node_modules/lodash/_flatRest.js
var require_flatRest = __commonJS({
"node_modules/lodash/_flatRest.js"(exports, module) {
var flatten = require_flatten();
var overRest = require_overRest();
var setToString = require_setToString();
function flatRest(func) {
return setToString(overRest(func, void 0, flatten), func + "");
}
module.exports = flatRest;
}
});
// node_modules/lodash/pick.js
var require_pick = __commonJS({
"node_modules/lodash/pick.js"(exports, module) {
var basePick = require_basePick();
var flatRest = require_flatRest();
var pick = flatRest(function(object, paths) {
return object == null ? {} : basePick(object, paths);
});
module.exports = pick;
}
});
export {
require_pick
};
//# sourceMappingURL=chunk-4H3Q2QKZ.js.map

View File

@@ -1,7 +0,0 @@
{
"version": 3,
"sources": ["../../../../../lodash/_basePick.js", "../../../../../lodash/_isFlattenable.js", "../../../../../lodash/_baseFlatten.js", "../../../../../lodash/flatten.js", "../../../../../lodash/_flatRest.js", "../../../../../lodash/pick.js"],
"sourcesContent": ["var basePickBy = require('./_basePickBy'),\n hasIn = require('./hasIn');\n\n/**\n * The base implementation of `_.pick` without support for individual\n * property identifiers.\n *\n * @private\n * @param {Object} object The source object.\n * @param {string[]} paths The property paths to pick.\n * @returns {Object} Returns the new object.\n */\nfunction basePick(object, paths) {\n return basePickBy(object, paths, function(value, path) {\n return hasIn(object, path);\n });\n}\n\nmodule.exports = basePick;\n", "var Symbol = require('./_Symbol'),\n isArguments = require('./isArguments'),\n isArray = require('./isArray');\n\n/** Built-in value references. */\nvar spreadableSymbol = Symbol ? Symbol.isConcatSpreadable : undefined;\n\n/**\n * Checks if `value` is a flattenable `arguments` object or array.\n *\n * @private\n * @param {*} value The value to check.\n * @returns {boolean} Returns `true` if `value` is flattenable, else `false`.\n */\nfunction isFlattenable(value) {\n return isArray(value) || isArguments(value) ||\n !!(spreadableSymbol && value && value[spreadableSymbol]);\n}\n\nmodule.exports = isFlattenable;\n", "var arrayPush = require('./_arrayPush'),\n isFlattenable = require('./_isFlattenable');\n\n/**\n * The base implementation of `_.flatten` with support for restricting flattening.\n *\n * @private\n * @param {Array} array The array to flatten.\n * @param {number} depth The maximum recursion depth.\n * @param {boolean} [predicate=isFlattenable] The function invoked per iteration.\n * @param {boolean} [isStrict] Restrict to values that pass `predicate` checks.\n * @param {Array} [result=[]] The initial result value.\n * @returns {Array} Returns the new flattened array.\n */\nfunction baseFlatten(array, depth, predicate, isStrict, result) {\n var index = -1,\n length = array.length;\n\n predicate || (predicate = isFlattenable);\n result || (result = []);\n\n while (++index < length) {\n var value = array[index];\n if (depth > 0 && predicate(value)) {\n if (depth > 1) {\n // Recursively flatten arrays (susceptible to call stack limits).\n baseFlatten(value, depth - 1, predicate, isStrict, result);\n } else {\n arrayPush(result, value);\n }\n } else if (!isStrict) {\n result[result.length] = value;\n }\n }\n return result;\n}\n\nmodule.exports = baseFlatten;\n", "var baseFlatten = require('./_baseFlatten');\n\n/**\n * Flattens `array` a single level deep.\n *\n * @static\n * @memberOf _\n * @since 0.1.0\n * @category Array\n * @param {Array} array The array to flatten.\n * @returns {Array} Returns the new flattened array.\n * @example\n *\n * _.flatten([1, [2, [3, [4]], 5]]);\n * // => [1, 2, [3, [4]], 5]\n */\nfunction flatten(array) {\n var length = array == null ? 0 : array.length;\n return length ? baseFlatten(array, 1) : [];\n}\n\nmodule.exports = flatten;\n", "var flatten = require('./flatten'),\n overRest = require('./_overRest'),\n setToString = require('./_setToString');\n\n/**\n * A specialized version of `baseRest` which flattens the rest array.\n *\n * @private\n * @param {Function} func The function to apply a rest parameter to.\n * @returns {Function} Returns the new function.\n */\nfunction flatRest(func) {\n return setToString(overRest(func, undefined, flatten), func + '');\n}\n\nmodule.exports = flatRest;\n", "var basePick = require('./_basePick'),\n flatRest = require('./_flatRest');\n\n/**\n * Creates an object composed of the picked `object` properties.\n *\n * @static\n * @since 0.1.0\n * @memberOf _\n * @category Object\n * @param {Object} object The source object.\n * @param {...(string|string[])} [paths] The property paths to pick.\n * @returns {Object} Returns the new object.\n * @example\n *\n * var object = { 'a': 1, 'b': '2', 'c': 3 };\n *\n * _.pick(object, ['a', 'c']);\n * // => { 'a': 1, 'c': 3 }\n */\nvar pick = flatRest(function(object, paths) {\n return object == null ? {} : basePick(object, paths);\n});\n\nmodule.exports = pick;\n"],
"mappings": ";;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA;AAAA;AAAA,QAAI,aAAa;AAAjB,QACI,QAAQ;AAWZ,aAAS,SAAS,QAAQ,OAAO;AAC/B,aAAO,WAAW,QAAQ,OAAO,SAAS,OAAO,MAAM;AACrD,eAAO,MAAM,QAAQ,IAAI;AAAA,MAC3B,CAAC;AAAA,IACH;AAEA,WAAO,UAAU;AAAA;AAAA;;;AClBjB;AAAA;AAAA,QAAI,SAAS;AAAb,QACI,cAAc;AADlB,QAEI,UAAU;AAGd,QAAI,mBAAmB,SAAS,OAAO,qBAAqB;AAS5D,aAAS,cAAc,OAAO;AAC5B,aAAO,QAAQ,KAAK,KAAK,YAAY,KAAK,KACxC,CAAC,EAAE,oBAAoB,SAAS,MAAM,gBAAgB;AAAA,IAC1D;AAEA,WAAO,UAAU;AAAA;AAAA;;;ACnBjB;AAAA;AAAA,QAAI,YAAY;AAAhB,QACI,gBAAgB;AAapB,aAAS,YAAY,OAAO,OAAO,WAAW,UAAU,QAAQ;AAC9D,UAAI,QAAQ,IACR,SAAS,MAAM;AAEnB,oBAAc,YAAY;AAC1B,iBAAW,SAAS,CAAC;AAErB,aAAO,EAAE,QAAQ,QAAQ;AACvB,YAAI,QAAQ,MAAM,KAAK;AACvB,YAAI,QAAQ,KAAK,UAAU,KAAK,GAAG;AACjC,cAAI,QAAQ,GAAG;AAEb,wBAAY,OAAO,QAAQ,GAAG,WAAW,UAAU,MAAM;AAAA,UAC3D,OAAO;AACL,sBAAU,QAAQ,KAAK;AAAA,UACzB;AAAA,QACF,WAAW,CAAC,UAAU;AACpB,iBAAO,OAAO,MAAM,IAAI;AAAA,QAC1B;AAAA,MACF;AACA,aAAO;AAAA,IACT;AAEA,WAAO,UAAU;AAAA;AAAA;;;ACrCjB;AAAA;AAAA,QAAI,cAAc;AAgBlB,aAAS,QAAQ,OAAO;AACtB,UAAI,SAAS,SAAS,OAAO,IAAI,MAAM;AACvC,aAAO,SAAS,YAAY,OAAO,CAAC,IAAI,CAAC;AAAA,IAC3C;AAEA,WAAO,UAAU;AAAA;AAAA;;;ACrBjB;AAAA;AAAA,QAAI,UAAU;AAAd,QACI,WAAW;AADf,QAEI,cAAc;AASlB,aAAS,SAAS,MAAM;AACtB,aAAO,YAAY,SAAS,MAAM,QAAW,OAAO,GAAG,OAAO,EAAE;AAAA,IAClE;AAEA,WAAO,UAAU;AAAA;AAAA;;;ACfjB;AAAA;AAAA,QAAI,WAAW;AAAf,QACI,WAAW;AAmBf,QAAI,OAAO,SAAS,SAAS,QAAQ,OAAO;AAC1C,aAAO,UAAU,OAAO,CAAC,IAAI,SAAS,QAAQ,KAAK;AAAA,IACrD,CAAC;AAED,WAAO,UAAU;AAAA;AAAA;",
"names": []
}

View File

@@ -1,36 +0,0 @@
import {
__commonJS
} from "./chunk-KEXKKQVW.js";
// node_modules/lodash/_createBaseFor.js
var require_createBaseFor = __commonJS({
"node_modules/lodash/_createBaseFor.js"(exports, module) {
function createBaseFor(fromRight) {
return function(object, iteratee, keysFunc) {
var index = -1, iterable = Object(object), props = keysFunc(object), length = props.length;
while (length--) {
var key = props[fromRight ? length : ++index];
if (iteratee(iterable[key], key, iterable) === false) {
break;
}
}
return object;
};
}
module.exports = createBaseFor;
}
});
// node_modules/lodash/_baseFor.js
var require_baseFor = __commonJS({
"node_modules/lodash/_baseFor.js"(exports, module) {
var createBaseFor = require_createBaseFor();
var baseFor = createBaseFor();
module.exports = baseFor;
}
});
export {
require_baseFor
};
//# sourceMappingURL=chunk-4O4FP57Y.js.map

Some files were not shown because too many files have changed in this diff Show More