# p-limit

> Run multiple promise-returning & async functions with limited concurrency

## Install

```sh
$ npm install p-limit
```

## Usage
```js
const pLimit = require('p-limit');

const limit = pLimit(1);

const input = [
  limit(() => fetchSomething('foo')),
  limit(() => fetchSomething('bar')),
  limit(() => doSomething())
];

(async () => {
  // Only one promise is run at once
  const result = await Promise.all(input);
  console.log(result);
})();
```
## API

### pLimit(concurrency)

Returns a `limit` function.

#### concurrency

Type: `number`\
Minimum: `1`\
Default: `Infinity`

Concurrency limit.
### limit(fn, ...args)

Returns the promise returned by calling `fn(...args)`.

#### fn

Type: `Function`

Promise-returning/async function.

#### args

Any arguments to pass through to `fn`.

Arguments are passed through to `fn` so you can avoid creating an extra closure for each call. You probably don't need this optimization unless you're queuing a lot of functions.
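A minimal sketch of the two equivalent call styles; `fetchSomething` here is a hypothetical stand-in for any promise-returning function:

```js
const pLimit = require('p-limit');

const limit = pLimit(2);

// Hypothetical stand-in for any promise-returning function.
const fetchSomething = id => Promise.resolve(`result for ${id}`);

// These two calls are equivalent; the second avoids allocating a closure per task.
const viaClosure = limit(() => fetchSomething('foo'));
const viaArgs = limit(fetchSomething, 'foo');

Promise.all([viaClosure, viaArgs]).then(console.log);
//=> ['result for foo', 'result for foo']
```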
### limit.activeCount

The number of promises that are currently running.

### limit.pendingCount

The number of promises that are waiting to run (i.e. their internal `fn` was not called yet).
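A rough sketch of how the two counters behave; `delay` is just an illustrative stand-in for real async work:

```js
const pLimit = require('p-limit');

const limit = pLimit(2);

// Illustrative stand-in for real async work.
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

const tasks = [
  limit(() => delay(100)),
  limit(() => delay(100)),
  limit(() => delay(100))
];

// Give the limiter a moment to start work, then inspect the counters.
setTimeout(() => {
  console.log(limit.activeCount);  //=> 2 (two tasks running)
  console.log(limit.pendingCount); //=> 1 (one task still waiting)
}, 10);

Promise.all(tasks).then(() => console.log('all done'));
```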
### limit.clearQueue()

Discard pending promises that are waiting to run.

This might be useful if you want to tear down the queue at the end of your program's lifecycle or discard any function calls referencing an intermediary state of your app.

Note: This does not cancel promises that are already running.
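For example, a rough teardown sketch, again using a hypothetical `delay` helper:

```js
const pLimit = require('p-limit');

const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

const limit = pLimit(1);

limit(() => delay(100)).then(() => console.log('first task finished'));
limit(() => console.log('this queued task is discarded'));
limit(() => console.log('so is this one'));

// Later (e.g. on shutdown): drop everything still waiting.
// The task that is already running keeps running to completion.
setTimeout(() => {
  limit.clearQueue();
  console.log(limit.pendingCount); //=> 0
}, 10);
```

In this sketch the promises returned by the discarded `limit()` calls never settle, so avoid awaiting them after clearing the queue.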
## FAQ

### How is this different from the `p-queue` package?

This package is only about limiting the number of concurrent executions, while `p-queue` is a fully featured queue implementation with lots of different options, introspection, and ability to pause the queue.
## Related

- p-queue - Promise queue with concurrency control
- p-throttle - Throttle promise-returning & async functions
- p-debounce - Debounce promise-returning & async functions
- p-all - Run promise-returning & async functions concurrently with optional limited concurrency
- More…
---

Tidelift helps make open source sustainable for maintainers while giving companies assurances about security, maintenance, and licensing for their dependencies.