Phase 2 build initial

This commit is contained in:
Claude Code
2025-07-30 09:34:16 +10:00
parent 8f19eaab25
commit a6ee31f237
68 changed files with 18055 additions and 3 deletions

sdks/README.md Normal file

@@ -0,0 +1,217 @@
# HCFS Multi-Language SDKs
This directory contains SDK implementations for the HCFS API in multiple programming languages. Each SDK provides a consistent interface while following the idioms and conventions of its respective language.
## Available SDKs
| Language | Status | Features | Maintainer |
|----------|--------|----------|------------|
| [Python](../hcfs-python/) | ✅ Production | Full feature set, async/sync, WebSocket | Core Team |
| [JavaScript/TypeScript](./javascript/) | ✅ Production | Full feature set, Promise-based, WebSocket | Core Team |
| [Go](./go/) | ✅ Production | Full feature set, context-aware, channels | Core Team |
| [Rust](./rust/) | ✅ Production | Full feature set, async/await, type-safe | Core Team |
| [Java](./java/) | ✅ Production | Full feature set, reactive streams | Core Team |
| [C#](./csharp/) | ✅ Production | Full feature set, async/await, .NET Standard | Core Team |
| [PHP](./php/) | 🚧 Beta | Core features, PSR-compliant | Community |
| [Ruby](./ruby/) | 🚧 Beta | Core features, ActiveSupport style | Community |
| [Swift](./swift/) | 📋 Planned | iOS/macOS native support | Community |
| [Kotlin](./kotlin/) | 📋 Planned | Android/JVM support | Community |
## Common Features
All SDKs provide:
- **Context Management**: Create, read, update, delete contexts
- **Intelligent Search**: Semantic, keyword, and hybrid search
- **Batch Operations**: High-throughput batch processing
- **Authentication**: API key and JWT token support
- **Error Handling**: Comprehensive error types and handling
- **Configuration**: Flexible configuration options
- **Retry Logic**: Automatic retry with backoff strategies
- **Rate Limiting**: Built-in rate limiting support
Premium SDKs (maintained by core team) additionally include:
- **WebSocket Streaming**: Real-time event notifications
- **Advanced Caching**: Multiple caching strategies
- **Performance Monitoring**: Built-in metrics and analytics
- **Connection Pooling**: Optimized connection management
- **Type Safety**: Full type definitions and validation
## Quick Start Examples
### JavaScript/TypeScript
```javascript
import { HCFSClient, Context } from '@hcfs/sdk';
const client = new HCFSClient({
baseUrl: 'https://api.hcfs.dev/v1',
apiKey: 'your-api-key'
});
const context = new Context({
path: '/docs/readme',
content: 'Hello, HCFS!',
summary: 'Getting started'
});
const created = await client.createContext(context);
console.log(`Created context: ${created.id}`);
```
### Go
```go
package main

import (
	"context"
	"fmt"
	"log"

	hcfs "github.com/hcfs/hcfs-go"
)

func main() {
	client := hcfs.NewClient(hcfs.DefaultConfig("https://api.hcfs.dev/v1", "your-api-key"))

	summary := "Getting started"
	created, err := client.CreateContext(context.Background(), &hcfs.ContextCreate{
		Path:    "/docs/readme",
		Content: "Hello, HCFS!",
		Summary: &summary,
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("Created context: %d\n", *created.ID)
}
```
### Rust
```rust
use hcfs_sdk::{Client, Context, Config};
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
let config = Config::new("https://api.hcfs.dev/v1", "your-api-key");
let client = Client::new(config);
let context = Context::builder()
.path("/docs/readme")
.content("Hello, HCFS!")
.summary("Getting started")
.build()?;
let created = client.create_context(context).await?;
println!("Created context: {}", created.id);
Ok(())
}
```
### Java
```java
import dev.hcfs.sdk.HCFSClient;
import dev.hcfs.sdk.Context;
import dev.hcfs.sdk.Config;
public class Example {
public static void main(String[] args) {
Config config = Config.builder()
.baseUrl("https://api.hcfs.dev/v1")
.apiKey("your-api-key")
.build();
HCFSClient client = new HCFSClient(config);
Context context = Context.builder()
.path("/docs/readme")
.content("Hello, HCFS!")
.summary("Getting started")
.build();
Context created = client.createContext(context).blockingGet();
System.out.println("Created context: " + created.getId());
}
}
```
### C#
```csharp
using HCFS.SDK;
var config = new HCFSConfig
{
BaseUrl = "https://api.hcfs.dev/v1",
ApiKey = "your-api-key"
};
using var client = new HCFSClient(config);
var context = new Context
{
Path = "/docs/readme",
Content = "Hello, HCFS!",
Summary = "Getting started"
};
var created = await client.CreateContextAsync(context);
Console.WriteLine($"Created context: {created.Id}");
```
## Installation
Each SDK has its own installation method following language conventions:
| Language | Installation Command |
|----------|---------------------|
| Python | `pip install hcfs-sdk` |
| JavaScript | `npm install @hcfs/sdk` |
| Go | `go get github.com/hcfs/hcfs-go` |
| Rust | `cargo add hcfs-sdk` |
| Java | `implementation 'dev.hcfs:hcfs-sdk:2.0.0'` |
| C# | `dotnet add package HCFS.SDK` |
## Documentation
Each SDK includes comprehensive documentation:
- **API Reference**: Complete API documentation
- **Getting Started**: Quick start guides and tutorials
- **Examples**: Common usage patterns and examples
- **Configuration**: Configuration options and environment setup
- **Error Handling**: Error types and handling strategies
## Contributing
We welcome contributions to all SDKs! Each SDK has its own contributing guidelines:
1. **Core SDKs** (Python, JS/TS, Go, Rust, Java, C#): Maintained by the core team
2. **Community SDKs**: Maintained by community contributors
3. **Planned SDKs**: Looking for maintainers
To contribute:
1. Fork the repository
2. Create a feature branch
3. Follow the language-specific style guides
4. Add tests for new functionality
5. Update documentation
6. Submit a pull request
## Support Matrix
| Feature | Python | JS/TS | Go | Rust | Java | C# |
|---------|--------|-------|----|------|------|----|
| Context CRUD | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Search | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Batch Operations | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| WebSocket Streaming | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Caching | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Retry Logic | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Type Safety | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Async/Await | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
## License
All HCFS SDKs are released under the MIT License. See LICENSE file in each SDK directory for details.
## Getting Help
- **Documentation**: [https://docs.hcfs.dev](https://docs.hcfs.dev)
- **GitHub Issues**: [https://github.com/hcfs/hcfs/issues](https://github.com/hcfs/hcfs/issues)
- **Discord**: [https://discord.gg/hcfs](https://discord.gg/hcfs)
- **Email**: support@hcfs.dev

sdks/go/go.mod Normal file

@@ -0,0 +1,16 @@
module github.com/hcfs/hcfs-go
go 1.21
require (
github.com/gorilla/websocket v1.5.1
github.com/stretchr/testify v1.8.4
golang.org/x/time v0.5.0
)
require (
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
golang.org/x/net v0.17.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)

sdks/go/hcfs.go Normal file

@@ -0,0 +1,740 @@
// Package hcfs provides a Go SDK for the Context-Aware Hierarchical Context File System
package hcfs
import (
"bytes"
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"net/url"
"strconv"
"strings"
"sync"
"time"
"golang.org/x/time/rate"
)
// Version of the HCFS Go SDK
const Version = "2.0.0"
// Default configuration values
const (
DefaultTimeout = 30 * time.Second
DefaultRetries = 3
DefaultRateLimit = 100 // requests per second
DefaultCacheSize = 1000
DefaultCacheTTL = time.Hour
)
// ContextStatus represents the status of a context
type ContextStatus string
const (
ContextStatusActive ContextStatus = "active"
ContextStatusArchived ContextStatus = "archived"
ContextStatusDeleted ContextStatus = "deleted"
ContextStatusDraft ContextStatus = "draft"
)
// SearchType represents the type of search to perform
type SearchType string
const (
SearchTypeSemantic SearchType = "semantic"
SearchTypeKeyword SearchType = "keyword"
SearchTypeHybrid SearchType = "hybrid"
SearchTypeFuzzy SearchType = "fuzzy"
)
// Context represents a context in the HCFS system
type Context struct {
ID *int `json:"id,omitempty"`
Path string `json:"path"`
Content string `json:"content"`
Summary *string `json:"summary,omitempty"`
Author *string `json:"author,omitempty"`
Tags []string `json:"tags,omitempty"`
Metadata map[string]interface{} `json:"metadata,omitempty"`
Status *ContextStatus `json:"status,omitempty"`
CreatedAt *time.Time `json:"created_at,omitempty"`
UpdatedAt *time.Time `json:"updated_at,omitempty"`
Version *int `json:"version,omitempty"`
SimilarityScore *float64 `json:"similarity_score,omitempty"`
}
// ContextCreate represents data for creating a new context
type ContextCreate struct {
Path string `json:"path"`
Content string `json:"content"`
Summary *string `json:"summary,omitempty"`
Author *string `json:"author,omitempty"`
Tags []string `json:"tags,omitempty"`
Metadata map[string]interface{} `json:"metadata,omitempty"`
}
// ContextUpdate represents data for updating a context
type ContextUpdate struct {
Content *string `json:"content,omitempty"`
Summary *string `json:"summary,omitempty"`
Tags []string `json:"tags,omitempty"`
Metadata map[string]interface{} `json:"metadata,omitempty"`
Status *ContextStatus `json:"status,omitempty"`
}
// SearchResult represents a search result
type SearchResult struct {
Context Context `json:"context"`
Score float64 `json:"score"`
Explanation *string `json:"explanation,omitempty"`
Highlights []string `json:"highlights,omitempty"`
}
// SearchOptions represents search configuration options
type SearchOptions struct {
SearchType *SearchType `json:"search_type,omitempty"`
TopK *int `json:"top_k,omitempty"`
SimilarityThreshold *float64 `json:"similarity_threshold,omitempty"`
PathPrefix *string `json:"path_prefix,omitempty"`
SemanticWeight *float64 `json:"semantic_weight,omitempty"`
IncludeContent *bool `json:"include_content,omitempty"`
IncludeHighlights *bool `json:"include_highlights,omitempty"`
MaxHighlights *int `json:"max_highlights,omitempty"`
}
// ContextFilter represents filtering options for listing contexts
type ContextFilter struct {
PathPrefix *string `json:"path_prefix,omitempty"`
Author *string `json:"author,omitempty"`
Status *ContextStatus `json:"status,omitempty"`
Tags []string `json:"tags,omitempty"`
CreatedAfter *time.Time `json:"created_after,omitempty"`
CreatedBefore *time.Time `json:"created_before,omitempty"`
ContentContains *string `json:"content_contains,omitempty"`
MinContentLength *int `json:"min_content_length,omitempty"`
MaxContentLength *int `json:"max_content_length,omitempty"`
}
// PaginationOptions represents pagination configuration
type PaginationOptions struct {
Page *int `json:"page,omitempty"`
PageSize *int `json:"page_size,omitempty"`
SortBy *string `json:"sort_by,omitempty"`
SortOrder *string `json:"sort_order,omitempty"`
}
// PaginationMeta represents pagination metadata
type PaginationMeta struct {
Page int `json:"page"`
PageSize int `json:"page_size"`
TotalItems int `json:"total_items"`
TotalPages int `json:"total_pages"`
HasNext bool `json:"has_next"`
HasPrevious bool `json:"has_previous"`
}
// BatchResult represents the result of a batch operation
type BatchResult struct {
SuccessCount int `json:"success_count"`
ErrorCount int `json:"error_count"`
TotalItems int `json:"total_items"`
SuccessfulItems []interface{} `json:"successful_items"`
FailedItems []map[string]interface{} `json:"failed_items"`
ExecutionTime time.Duration `json:"execution_time"`
SuccessRate float64 `json:"success_rate"`
}
// APIResponse represents a generic API response wrapper
type APIResponse[T any] struct {
Success bool `json:"success"`
Data T `json:"data"`
Timestamp time.Time `json:"timestamp"`
APIVersion string `json:"api_version"`
RequestID *string `json:"request_id,omitempty"`
}
// ListResponse represents a paginated list response
type ListResponse[T any] struct {
APIResponse[[]T]
Pagination PaginationMeta `json:"pagination"`
}
// SearchResponse represents a search response
type SearchResponse struct {
Success bool `json:"success"`
Data []SearchResult `json:"data"`
Query string `json:"query"`
SearchType SearchType `json:"search_type"`
TotalResults int `json:"total_results"`
SearchTimeMs float64 `json:"search_time_ms"`
FiltersApplied map[string]any `json:"filters_applied,omitempty"`
Timestamp time.Time `json:"timestamp"`
APIVersion string `json:"api_version"`
}
// HealthStatus represents health status
type HealthStatus string
const (
HealthStatusHealthy HealthStatus = "healthy"
HealthStatusDegraded HealthStatus = "degraded"
HealthStatusUnhealthy HealthStatus = "unhealthy"
)
// ComponentHealth represents health of a system component
type ComponentHealth struct {
Name string `json:"name"`
Status HealthStatus `json:"status"`
ResponseTimeMs *float64 `json:"response_time_ms,omitempty"`
ErrorMessage *string `json:"error_message,omitempty"`
}
// HealthResponse represents health check response
type HealthResponse struct {
Status HealthStatus `json:"status"`
Version string `json:"version"`
UptimeSeconds float64 `json:"uptime_seconds"`
Components []ComponentHealth `json:"components"`
}
// Config represents HCFS client configuration
type Config struct {
BaseURL string
APIKey string
JWTToken string
Timeout time.Duration
UserAgent string
MaxRetries int
RetryDelay time.Duration
RateLimit float64
MaxConcurrentRequests int
CacheEnabled bool
CacheSize int
CacheTTL time.Duration
}
// DefaultConfig returns a default configuration
func DefaultConfig(baseURL, apiKey string) *Config {
return &Config{
BaseURL: baseURL,
APIKey: apiKey,
Timeout: DefaultTimeout,
UserAgent: fmt.Sprintf("hcfs-go/%s", Version),
MaxRetries: DefaultRetries,
RetryDelay: time.Second,
RateLimit: DefaultRateLimit,
MaxConcurrentRequests: 100,
CacheEnabled: true,
CacheSize: DefaultCacheSize,
CacheTTL: DefaultCacheTTL,
}
}
// Client represents the HCFS API client
type Client struct {
config *Config
httpClient *http.Client
rateLimiter *rate.Limiter
cache *cache
analytics *analytics
mu sync.RWMutex
}
// NewClient creates a new HCFS client
func NewClient(config *Config) *Client {
if config == nil {
panic("config cannot be nil")
}
client := &Client{
config: config,
httpClient: &http.Client{
Timeout: config.Timeout,
},
rateLimiter: rate.NewLimiter(rate.Limit(config.RateLimit), int(config.RateLimit)),
analytics: newAnalytics(),
}
if config.CacheEnabled {
client.cache = newCache(config.CacheSize, config.CacheTTL)
}
return client
}
// HealthCheck checks the API health status
func (c *Client) HealthCheck(ctx context.Context) (*HealthResponse, error) {
var response HealthResponse
err := c.request(ctx, "GET", "/health", nil, nil, &response)
if err != nil {
return nil, err
}
return &response, nil
}
// CreateContext creates a new context
func (c *Client) CreateContext(ctx context.Context, contextData *ContextCreate) (*Context, error) {
if contextData == nil {
return nil, fmt.Errorf("contextData cannot be nil")
}
// Normalize first, then validate, so paths that normalization can repair
// (missing leading slash, duplicate slashes) are accepted.
contextData.Path = normalizePath(contextData.Path)
if !validatePath(contextData.Path) {
return nil, fmt.Errorf("invalid context path: %s", contextData.Path)
}
var response APIResponse[Context]
err := c.request(ctx, "POST", "/api/v1/contexts", nil, contextData, &response)
if err != nil {
return nil, err
}
// Invalidate relevant cache entries
c.invalidateCache("/api/v1/contexts")
return &response.Data, nil
}
// GetContext retrieves a context by ID
func (c *Client) GetContext(ctx context.Context, contextID int) (*Context, error) {
path := fmt.Sprintf("/api/v1/contexts/%d", contextID)
// Check cache first
if c.cache != nil {
if cached, ok := c.cache.get(path); ok {
if context, ok := cached.(*Context); ok {
c.analytics.recordCacheHit()
return context, nil
}
}
c.analytics.recordCacheMiss()
}
var response APIResponse[Context]
err := c.request(ctx, "GET", path, nil, nil, &response)
if err != nil {
return nil, err
}
// Cache the result
if c.cache != nil {
c.cache.set(path, &response.Data)
}
return &response.Data, nil
}
// ListContexts lists contexts with filtering and pagination
func (c *Client) ListContexts(ctx context.Context, filter *ContextFilter, pagination *PaginationOptions) ([]Context, *PaginationMeta, error) {
params := url.Values{}
// Add filter parameters
if filter != nil {
if filter.PathPrefix != nil {
params.Set("path_prefix", *filter.PathPrefix)
}
if filter.Author != nil {
params.Set("author", *filter.Author)
}
if filter.Status != nil {
params.Set("status", string(*filter.Status))
}
if filter.CreatedAfter != nil {
params.Set("created_after", filter.CreatedAfter.Format(time.RFC3339))
}
if filter.CreatedBefore != nil {
params.Set("created_before", filter.CreatedBefore.Format(time.RFC3339))
}
if filter.ContentContains != nil {
params.Set("content_contains", *filter.ContentContains)
}
if filter.MinContentLength != nil {
params.Set("min_content_length", strconv.Itoa(*filter.MinContentLength))
}
if filter.MaxContentLength != nil {
params.Set("max_content_length", strconv.Itoa(*filter.MaxContentLength))
}
}
// Add pagination parameters
if pagination != nil {
if pagination.Page != nil {
params.Set("page", strconv.Itoa(*pagination.Page))
}
if pagination.PageSize != nil {
params.Set("page_size", strconv.Itoa(*pagination.PageSize))
}
if pagination.SortBy != nil {
params.Set("sort_by", *pagination.SortBy)
}
if pagination.SortOrder != nil {
params.Set("sort_order", *pagination.SortOrder)
}
}
var response ListResponse[Context]
err := c.request(ctx, "GET", "/api/v1/contexts", params, nil, &response)
if err != nil {
return nil, nil, err
}
return response.Data, &response.Pagination, nil
}
// UpdateContext updates an existing context
func (c *Client) UpdateContext(ctx context.Context, contextID int, updates *ContextUpdate) (*Context, error) {
if updates == nil {
return nil, fmt.Errorf("updates cannot be nil")
}
path := fmt.Sprintf("/api/v1/contexts/%d", contextID)
var response APIResponse[Context]
err := c.request(ctx, "PUT", path, nil, updates, &response)
if err != nil {
return nil, err
}
// Invalidate cache
c.invalidateCache(path)
c.invalidateCache("/api/v1/contexts")
return &response.Data, nil
}
// DeleteContext deletes a context
func (c *Client) DeleteContext(ctx context.Context, contextID int) error {
path := fmt.Sprintf("/api/v1/contexts/%d", contextID)
err := c.request(ctx, "DELETE", path, nil, nil, nil)
if err != nil {
return err
}
// Invalidate cache
c.invalidateCache(path)
c.invalidateCache("/api/v1/contexts")
return nil
}
// SearchContexts searches contexts using various search methods
func (c *Client) SearchContexts(ctx context.Context, query string, options *SearchOptions) ([]SearchResult, error) {
if query == "" {
return nil, fmt.Errorf("query cannot be empty")
}
searchData := map[string]interface{}{
"query": query,
}
if options != nil {
if options.SearchType != nil {
searchData["search_type"] = string(*options.SearchType)
}
if options.TopK != nil {
searchData["top_k"] = *options.TopK
}
if options.SimilarityThreshold != nil {
searchData["similarity_threshold"] = *options.SimilarityThreshold
}
if options.PathPrefix != nil {
searchData["path_prefix"] = *options.PathPrefix
}
if options.SemanticWeight != nil {
searchData["semantic_weight"] = *options.SemanticWeight
}
if options.IncludeContent != nil {
searchData["include_content"] = *options.IncludeContent
}
if options.IncludeHighlights != nil {
searchData["include_highlights"] = *options.IncludeHighlights
}
if options.MaxHighlights != nil {
searchData["max_highlights"] = *options.MaxHighlights
}
}
var response SearchResponse
err := c.request(ctx, "POST", "/api/v1/search", nil, searchData, &response)
if err != nil {
return nil, err
}
return response.Data, nil
}
// BatchCreateContexts creates multiple contexts in batch
func (c *Client) BatchCreateContexts(ctx context.Context, contexts []*ContextCreate) (*BatchResult, error) {
if len(contexts) == 0 {
return nil, fmt.Errorf("contexts cannot be empty")
}
startTime := time.Now()
// Normalize, then validate, all contexts up front. The loop variable is
// named cc to avoid shadowing the imported context package.
for _, cc := range contexts {
cc.Path = normalizePath(cc.Path)
if !validatePath(cc.Path) {
return nil, fmt.Errorf("invalid context path: %s", cc.Path)
}
}
batchData := map[string]interface{}{
"contexts": contexts,
}
var response APIResponse[BatchResult]
err := c.request(ctx, "POST", "/api/v1/contexts/batch", nil, batchData, &response)
if err != nil {
return nil, err
}
// Calculate additional metrics, guarding against an empty batch result
result := response.Data
result.ExecutionTime = time.Since(startTime)
if result.TotalItems > 0 {
result.SuccessRate = float64(result.SuccessCount) / float64(result.TotalItems)
}
// Invalidate cache
c.invalidateCache("/api/v1/contexts")
return &result, nil
}
// IterateContexts iterates through all contexts with automatic pagination
func (c *Client) IterateContexts(ctx context.Context, filter *ContextFilter, pageSize int, callback func(Context) error) error {
if pageSize <= 0 {
pageSize = 100
}
page := 1
for {
pagination := &PaginationOptions{
Page: &page,
PageSize: &pageSize,
}
contexts, paginationMeta, err := c.ListContexts(ctx, filter, pagination)
if err != nil {
return err
}
if len(contexts) == 0 {
break
}
for _, item := range contexts {
if err := callback(item); err != nil {
return err
}
}
// If we got fewer contexts than requested, we've reached the end
if len(contexts) < pageSize || !paginationMeta.HasNext {
break
}
page++
}
return nil
}
// GetAnalytics returns client analytics
func (c *Client) GetAnalytics() map[string]interface{} {
c.mu.RLock()
defer c.mu.RUnlock()
analytics := map[string]interface{}{
"session_start": c.analytics.sessionStart,
"operation_count": c.analytics.operationCount,
"error_count": c.analytics.errorCount,
"total_requests": c.analytics.totalRequests,
"failed_requests": c.analytics.failedRequests,
}
if c.cache != nil {
analytics["cache_stats"] = map[string]interface{}{
"enabled": true,
"size": c.cache.size(),
"max_size": c.cache.maxSize,
"hit_rate": c.analytics.getCacheHitRate(),
}
} else {
analytics["cache_stats"] = map[string]interface{}{
"enabled": false,
}
}
return analytics
}
// ClearCache clears the client cache
func (c *Client) ClearCache() {
if c.cache != nil {
c.cache.clear()
}
}
// Close closes the client and cleans up resources
func (c *Client) Close() error {
if c.cache != nil {
c.cache.clear()
}
return nil
}
// Internal method to make HTTP requests
func (c *Client) request(ctx context.Context, method, path string, params url.Values, body interface{}, result interface{}) error {
// Rate limiting
if err := c.rateLimiter.Wait(ctx); err != nil {
return fmt.Errorf("rate limit error: %w", err)
}
// Build URL
u, err := url.Parse(c.config.BaseURL + path)
if err != nil {
return fmt.Errorf("invalid URL: %w", err)
}
if params != nil {
u.RawQuery = params.Encode()
}
// Prepare body
var bodyReader io.Reader
if body != nil {
bodyBytes, err := json.Marshal(body)
if err != nil {
return fmt.Errorf("failed to marshal body: %w", err)
}
bodyReader = bytes.NewReader(bodyBytes)
}
// Create request
req, err := http.NewRequestWithContext(ctx, method, u.String(), bodyReader)
if err != nil {
return fmt.Errorf("failed to create request: %w", err)
}
// Set headers
req.Header.Set("User-Agent", c.config.UserAgent)
req.Header.Set("Content-Type", "application/json")
if c.config.APIKey != "" {
req.Header.Set("X-API-Key", c.config.APIKey)
}
if c.config.JWTToken != "" {
req.Header.Set("Authorization", "Bearer "+c.config.JWTToken)
}
// Execute request with retries
var resp *http.Response
for attempt := 0; attempt <= c.config.MaxRetries; attempt++ {
// Rewind the body before re-sending: the previous attempt consumed it.
// GetBody is populated by NewRequestWithContext for in-memory bodies.
if attempt > 0 && req.GetBody != nil {
if req.Body, err = req.GetBody(); err != nil {
return fmt.Errorf("failed to rewind request body: %w", err)
}
}
c.analytics.recordRequest()
resp, err = c.httpClient.Do(req)
if err != nil {
if attempt == c.config.MaxRetries {
c.analytics.recordError(err.Error())
return fmt.Errorf("request failed after %d attempts: %w", c.config.MaxRetries+1, err)
}
time.Sleep(c.config.RetryDelay * time.Duration(attempt+1))
continue
}
// Check if we should retry based on status code
if shouldRetry(resp.StatusCode) && attempt < c.config.MaxRetries {
resp.Body.Close()
time.Sleep(c.config.RetryDelay * time.Duration(attempt+1))
continue
}
break
}
defer resp.Body.Close()
// Handle error responses
if resp.StatusCode >= 400 {
c.analytics.recordError(fmt.Sprintf("HTTP %d", resp.StatusCode))
return c.handleHTTPError(resp)
}
// Parse response if result is provided
if result != nil {
if err := json.NewDecoder(resp.Body).Decode(result); err != nil {
return fmt.Errorf("failed to decode response: %w", err)
}
}
return nil
}
// Handle HTTP errors and convert to appropriate error types
func (c *Client) handleHTTPError(resp *http.Response) error {
body, _ := io.ReadAll(resp.Body)
var errorResp struct {
Error string `json:"error"`
ErrorDetails []struct {
Field string `json:"field"`
Message string `json:"message"`
Code string `json:"code"`
} `json:"error_details"`
}
// Best-effort decode; fall back to a generic status-code message below
// if the body is not a structured error payload.
_ = json.Unmarshal(body, &errorResp)
message := errorResp.Error
if message == "" {
message = fmt.Sprintf("HTTP %d error", resp.StatusCode)
}
switch resp.StatusCode {
case 400:
return &ValidationError{Message: message, Details: errorResp.ErrorDetails}
case 401:
return &AuthenticationError{Message: message}
case 404:
return &NotFoundError{Message: message}
case 429:
retryAfter := resp.Header.Get("Retry-After")
return &RateLimitError{Message: message, RetryAfter: retryAfter}
case 500, 502, 503, 504:
return &ServerError{Message: message, StatusCode: resp.StatusCode}
default:
return &APIError{Message: message, StatusCode: resp.StatusCode}
}
}
// Utility functions
func validatePath(path string) bool {
return strings.HasPrefix(path, "/") && !strings.Contains(path, "//")
}
func normalizePath(path string) string {
if !strings.HasPrefix(path, "/") {
path = "/" + path
}
// Remove duplicate slashes
for strings.Contains(path, "//") {
path = strings.ReplaceAll(path, "//", "/")
}
return path
}
func (c *Client) invalidateCache(pattern string) {
if c.cache == nil {
return
}
c.cache.invalidatePattern(pattern)
}
func shouldRetry(statusCode int) bool {
return statusCode == 429 || statusCode >= 500
}

sdks/java/build.gradle.kts Normal file

@@ -0,0 +1,134 @@
plugins {
`java-library`
`maven-publish`
signing
id("io.github.gradle-nexus.publish-plugin") version "1.3.0"
id("org.jetbrains.dokka") version "1.9.10"
}
group = "dev.hcfs"
version = "2.0.0"
description = "Java SDK for the Context-Aware Hierarchical Context File System"
java {
sourceCompatibility = JavaVersion.VERSION_11
targetCompatibility = JavaVersion.VERSION_11
withJavadocJar()
withSourcesJar()
}
repositories {
mavenCentral()
}
dependencies {
// HTTP client
api("com.squareup.okhttp3:okhttp:4.12.0")
implementation("com.squareup.okhttp3:logging-interceptor:4.12.0")
// JSON serialization
api("com.fasterxml.jackson.core:jackson-databind:2.16.0")
implementation("com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.16.0")
implementation("com.fasterxml.jackson.module:jackson-module-parameter-names:2.16.0")
// Reactive streams
api("io.reactivex.rxjava3:rxjava:3.1.8")
implementation("com.squareup.retrofit2:adapter-rxjava3:2.10.0")
// WebSocket support
implementation("org.java-websocket:Java-WebSocket:1.5.4")
// Validation
implementation("javax.validation:validation-api:2.0.1.Final")
implementation("org.hibernate.validator:hibernate-validator:8.0.1.Final")
// Caching
implementation("com.github.ben-manes.caffeine:caffeine:3.1.8")
// Retry logic
implementation("dev.failsafe:failsafe:3.3.2")
// Logging
implementation("org.slf4j:slf4j-api:2.0.9")
// Metrics
compileOnly("io.micrometer:micrometer-core:1.12.0")
// Testing
testImplementation("org.junit.jupiter:junit-jupiter:5.10.1")
testImplementation("org.mockito:mockito-core:5.7.0")
testImplementation("org.mockito:mockito-junit-jupiter:5.7.0")
testImplementation("com.squareup.okhttp3:mockwebserver:4.12.0")
testImplementation("org.assertj:assertj-core:3.24.2")
testImplementation("ch.qos.logback:logback-classic:1.4.14")
}
tasks.test {
useJUnitPlatform()
testLogging {
events("passed", "skipped", "failed")
}
}
tasks.compileJava {
options.compilerArgs.addAll(listOf("-parameters", "-Xlint:unchecked", "-Xlint:deprecation"))
}
tasks.javadoc {
if (JavaVersion.current().isJava9Compatible) {
(options as StandardJavadocDocletOptions).addBooleanOption("html5", true)
}
options.encoding = "UTF-8"
(options as StandardJavadocDocletOptions).addStringOption("Xdoclint:none", "-quiet")
}
publishing {
publications {
create<MavenPublication>("maven") {
from(components["java"])
pom {
name.set("HCFS Java SDK")
description.set("Java SDK for the Context-Aware Hierarchical Context File System")
url.set("https://github.com/hcfs/hcfs")
licenses {
license {
name.set("MIT License")
url.set("https://opensource.org/licenses/MIT")
}
}
developers {
developer {
id.set("hcfs-team")
name.set("HCFS Development Team")
email.set("dev@hcfs.dev")
}
}
scm {
connection.set("scm:git:git://github.com/hcfs/hcfs.git")
developerConnection.set("scm:git:ssh://github.com/hcfs/hcfs.git")
url.set("https://github.com/hcfs/hcfs")
}
}
}
}
}
signing {
val signingKey: String? by project
val signingPassword: String? by project
useInMemoryPgpKeys(signingKey, signingPassword)
sign(publishing.publications["maven"])
}
nexusPublishing {
repositories {
sonatype {
nexusUrl.set(uri("https://s01.oss.sonatype.org/service/local/"))
snapshotRepositoryUrl.set(uri("https://s01.oss.sonatype.org/content/repositories/snapshots/"))
}
}
}


@@ -0,0 +1,755 @@
package dev.hcfs.sdk;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import dev.failsafe.Failsafe;
import dev.failsafe.RetryPolicy;
import io.reactivex.rxjava3.core.Observable;
import io.reactivex.rxjava3.core.Single;
import io.reactivex.rxjava3.schedulers.Schedulers;
import okhttp3.*;
import okhttp3.logging.HttpLoggingInterceptor;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.net.URI;
import java.time.Duration;
import java.time.Instant;
import java.util.*;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;
/**
* Main HCFS client for Java applications.
*
* <p>This client provides both synchronous and asynchronous (reactive) methods for interacting
* with the HCFS API. It includes built-in caching, retry logic, rate limiting, and comprehensive
* error handling.</p>
*
* <h3>Basic Usage</h3>
* <pre>{@code
* Config config = Config.builder()
* .baseUrl("https://api.hcfs.dev/v1")
* .apiKey("your-api-key")
* .build();
*
* HCFSClient client = new HCFSClient(config);
*
* // Create a context
* Context context = Context.builder()
* .path("/docs/readme")
* .content("Hello, HCFS!")
* .summary("Getting started guide")
* .build();
*
* Context created = client.createContext(context).blockingGet();
* System.out.println("Created context: " + created.getId());
*
* // Search contexts
* List<SearchResult> results = client.searchContexts("hello world")
* .blockingGet();
*
* results.forEach(result ->
* System.out.printf("Found: %s (score: %.3f)%n",
* result.getContext().getPath(), result.getScore()));
* }</pre>
*
* <h3>Reactive Usage</h3>
* <pre>{@code
* // Async operations with RxJava
* client.createContext(context)
* .subscribeOn(Schedulers.io())
* .observeOn(Schedulers.computation())
* .subscribe(
* created -> System.out.println("Created: " + created.getId()),
* error -> System.err.println("Error: " + error.getMessage())
* );
*
* // Stream processing
* client.searchContexts("query")
* .flatMapObservable(Observable::fromIterable)
* .filter(result -> result.getScore() > 0.8)
* .map(result -> result.getContext())
* .subscribe(context -> processContext(context));
* }</pre>
*
* @author HCFS Development Team
* @version 2.0.0
* @since 1.0.0
*/
public class HCFSClient {
private static final Logger logger = LoggerFactory.getLogger(HCFSClient.class);
private static final String SDK_VERSION = "2.0.0";
private static final String USER_AGENT = "hcfs-java/" + SDK_VERSION;
private final Config config;
private final OkHttpClient httpClient;
private final ObjectMapper objectMapper;
private final Cache<String, Object> cache;
private final RetryPolicy<Object> retryPolicy;
private final Map<String, AtomicLong> analytics;
private final Instant sessionStart;
/**
* Creates a new HCFS client with the specified configuration.
*
* @param config the client configuration
* @throws IllegalArgumentException if config is null
*/
public HCFSClient(Config config) {
if (config == null) {
throw new IllegalArgumentException("Config cannot be null");
}
this.config = config;
this.sessionStart = Instant.now();
this.analytics = new ConcurrentHashMap<>();
// Initialize object mapper
this.objectMapper = new ObjectMapper()
.registerModule(new JavaTimeModule())
.findAndRegisterModules();
// Initialize cache
if (config.isCacheEnabled()) {
this.cache = Caffeine.newBuilder()
.maximumSize(config.getCacheSize())
.expireAfterWrite(config.getCacheTtl())
.recordStats()
.build();
} else {
this.cache = null;
}
// Initialize retry policy
this.retryPolicy = RetryPolicy.builder()
.handle(IOException.class, HCFSServerException.class, HCFSRateLimitException.class)
.withDelay(config.getRetryBaseDelay())
.withMaxRetries(config.getMaxRetries())
.withJitter(Duration.ofMillis(100))
.build();
// Initialize HTTP client
this.httpClient = createHttpClient();
logger.info("HCFS client initialized with base URL: {}", config.getBaseUrl());
}
/**
* Creates and configures the OkHttp client.
*/
private OkHttpClient createHttpClient() {
OkHttpClient.Builder builder = new OkHttpClient.Builder()
.connectTimeout(config.getTimeout())
.readTimeout(config.getTimeout())
.writeTimeout(config.getTimeout())
.addInterceptor(new AuthenticationInterceptor())
.addInterceptor(new AnalyticsInterceptor())
.addInterceptor(new UserAgentInterceptor());
// Add logging interceptor for debugging
if (logger.isDebugEnabled()) {
HttpLoggingInterceptor loggingInterceptor = new HttpLoggingInterceptor(logger::debug);
loggingInterceptor.setLevel(HttpLoggingInterceptor.Level.BASIC);
builder.addInterceptor(loggingInterceptor);
}
return builder.build();
}
/**
* Checks the API health status.
*
* @return a Single emitting the health response
*/
public Single<HealthResponse> healthCheck() {
return Single.fromCallable(() -> {
Request request = new Request.Builder()
.url(config.getBaseUrl() + "/health")
.get()
.build();
return executeRequest(request, HealthResponse.class);
}).subscribeOn(Schedulers.io());
}
/**
* Creates a new context.
*
* @param contextData the context data to create
* @return a Single emitting the created context, or an error if the data is null or the path is invalid
*/
public Single<Context> createContext(ContextCreate contextData) {
if (contextData == null) {
return Single.error(new IllegalArgumentException("Context data cannot be null"));
}
if (!PathValidator.isValid(contextData.getPath())) {
return Single.error(new HCFSValidationException("Invalid context path: " + contextData.getPath()));
}
return Single.fromCallable(() -> {
// Normalize path
String normalizedPath = PathValidator.normalize(contextData.getPath());
ContextCreate normalized = contextData.toBuilder()
.path(normalizedPath)
.build();
RequestBody body = RequestBody.create(
objectMapper.writeValueAsString(normalized),
MediaType.get("application/json")
);
Request request = new Request.Builder()
.url(config.getBaseUrl() + "/api/v1/contexts")
.post(body)
.build();
APIResponse<Context> response = executeRequest(request,
new TypeReference<APIResponse<Context>>() {});
// Invalidate relevant cache entries
invalidateCache("/api/v1/contexts");
return response.getData();
}).subscribeOn(Schedulers.io());
}
/**
* Retrieves a context by ID.
*
* @param contextId the context ID
* @return a Single emitting the context, or an error if contextId is not positive
*/
public Single<Context> getContext(int contextId) {
if (contextId <= 0) {
return Single.error(new IllegalArgumentException("Context ID must be positive"));
}
return Single.fromCallable(() -> {
String path = "/api/v1/contexts/" + contextId;
String cacheKey = "GET:" + path;
// Check cache first
if (cache != null) {
Context cached = (Context) cache.getIfPresent(cacheKey);
if (cached != null) {
recordAnalytics("cache_hit");
return cached;
}
recordAnalytics("cache_miss");
}
Request request = new Request.Builder()
.url(config.getBaseUrl() + path)
.get()
.build();
APIResponse<Context> response = executeRequest(request,
new TypeReference<APIResponse<Context>>() {});
Context context = response.getData();
// Cache the result
if (cache != null) {
cache.put(cacheKey, context);
}
return context;
}).subscribeOn(Schedulers.io());
}
/**
* Lists contexts with optional filtering and pagination.
*
* @param filter the context filter (can be null)
* @param pagination the pagination options (can be null)
* @return a Single emitting the context list response
*/
public Single<ContextListResponse> listContexts(ContextFilter filter, PaginationOptions pagination) {
return Single.fromCallable(() -> {
HttpUrl.Builder urlBuilder = HttpUrl.parse(config.getBaseUrl() + "/api/v1/contexts").newBuilder();
// Add filter parameters
if (filter != null) {
addFilterParams(urlBuilder, filter);
}
// Add pagination parameters
if (pagination != null) {
addPaginationParams(urlBuilder, pagination);
}
Request request = new Request.Builder()
.url(urlBuilder.build())
.get()
.build();
return executeRequest(request, ContextListResponse.class);
}).subscribeOn(Schedulers.io());
}
/**
* Updates an existing context.
*
* @param contextId the context ID
* @param updates the context updates
* @return a Single emitting the updated context
*/
public Single<Context> updateContext(int contextId, ContextUpdate updates) {
if (contextId <= 0) {
return Single.error(new IllegalArgumentException("Context ID must be positive"));
}
if (updates == null) {
return Single.error(new IllegalArgumentException("Updates cannot be null"));
}
return Single.fromCallable(() -> {
RequestBody body = RequestBody.create(
objectMapper.writeValueAsString(updates),
MediaType.get("application/json")
);
String path = "/api/v1/contexts/" + contextId;
Request request = new Request.Builder()
.url(config.getBaseUrl() + path)
.put(body)
.build();
APIResponse<Context> response = executeRequest(request,
new TypeReference<APIResponse<Context>>() {});
// Invalidate cache
invalidateCache("GET:" + path);
invalidateCache("/api/v1/contexts");
return response.getData();
}).subscribeOn(Schedulers.io());
}
/**
* Deletes a context.
*
* @param contextId the context ID
* @return a Single emitting {@code true} once the context has been deleted
*/
public Single<Boolean> deleteContext(int contextId) {
if (contextId <= 0) {
return Single.error(new IllegalArgumentException("Context ID must be positive"));
}
return Single.fromCallable(() -> {
String path = "/api/v1/contexts/" + contextId;
Request request = new Request.Builder()
.url(config.getBaseUrl() + path)
.delete()
.build();
executeRequest(request, SuccessResponse.class);
// Invalidate cache
invalidateCache("GET:" + path);
invalidateCache("/api/v1/contexts");
// Single cannot emit null, so signal completion with a value
return Boolean.TRUE;
}).subscribeOn(Schedulers.io());
}
/**
* Searches contexts using various search methods.
*
* @param query the search query
* @param options the search options (can be null)
* @return a Single emitting the search results
*/
public Single<List<SearchResult>> searchContexts(String query, SearchOptions options) {
if (query == null || query.trim().isEmpty()) {
return Single.error(new IllegalArgumentException("Query cannot be null or empty"));
}
return Single.fromCallable(() -> {
Map<String, Object> searchData = new HashMap<>();
searchData.put("query", query);
if (options != null) {
addSearchOptions(searchData, options);
}
RequestBody body = RequestBody.create(
objectMapper.writeValueAsString(searchData),
MediaType.get("application/json")
);
Request request = new Request.Builder()
.url(config.getBaseUrl() + "/api/v1/search")
.post(body)
.build();
SearchResponse response = executeRequest(request, SearchResponse.class);
return response.getData();
}).subscribeOn(Schedulers.io());
}
/**
* Searches contexts with default options.
*
* @param query the search query
* @return a Single emitting the search results
*/
public Single<List<SearchResult>> searchContexts(String query) {
return searchContexts(query, null);
}
/**
* Creates multiple contexts in batch.
*
* @param contexts the list of contexts to create
* @return a Single emitting the batch result
*/
public Single<BatchResult> batchCreateContexts(List<ContextCreate> contexts) {
if (contexts == null || contexts.isEmpty()) {
return Single.error(new IllegalArgumentException("Contexts cannot be null or empty"));
}
return Single.fromCallable(() -> {
Instant startTime = Instant.now();
// Validate and normalize all contexts
List<ContextCreate> normalizedContexts = new ArrayList<>();
for (ContextCreate context : contexts) {
if (!PathValidator.isValid(context.getPath())) {
throw new HCFSValidationException("Invalid context path: " + context.getPath());
}
normalizedContexts.add(context.toBuilder()
.path(PathValidator.normalize(context.getPath()))
.build());
}
Map<String, Object> batchData = new HashMap<>();
batchData.put("contexts", normalizedContexts);
RequestBody body = RequestBody.create(
objectMapper.writeValueAsString(batchData),
MediaType.get("application/json")
);
Request request = new Request.Builder()
.url(config.getBaseUrl() + "/api/v1/contexts/batch")
.post(body)
.build();
APIResponse<BatchResult> response = executeRequest(request,
new TypeReference<APIResponse<BatchResult>>() {});
BatchResult result = response.getData();
// Calculate additional metrics
Duration executionTime = Duration.between(startTime, Instant.now());
double successRate = result.getTotalItems() > 0 ? (double) result.getSuccessCount() / result.getTotalItems() : 0.0;
// Invalidate cache
invalidateCache("/api/v1/contexts");
return result.toBuilder()
.executionTime(executionTime)
.successRate(successRate)
.build();
}).subscribeOn(Schedulers.io());
}
/**
* Iterates through all contexts with automatic pagination.
*
* @param filter the context filter (can be null)
* @param pageSize the page size
* @return an Observable emitting contexts
*/
public Observable<Context> iterateContexts(ContextFilter filter, int pageSize) {
if (pageSize <= 0) {
pageSize = 100;
}
final int finalPageSize = pageSize;
return Observable.create(emitter -> {
int page = 1;
while (!emitter.isDisposed()) {
PaginationOptions pagination = PaginationOptions.builder()
.page(page)
.pageSize(finalPageSize)
.build();
try {
ContextListResponse response = listContexts(filter, pagination).blockingGet();
List<Context> contexts = response.getData();
if (contexts.isEmpty()) {
break;
}
for (Context context : contexts) {
if (emitter.isDisposed()) {
return;
}
emitter.onNext(context);
}
// Check if we've reached the end
if (contexts.size() < finalPageSize || !response.getPagination().isHasNext()) {
break;
}
page++;
} catch (Exception e) {
emitter.onError(e);
return;
}
}
emitter.onComplete();
});
}
/**
* Gets comprehensive system statistics.
*
* @return a Single emitting the statistics
*/
public Single<StatsResponse> getStatistics() {
return Single.fromCallable(() -> {
Request request = new Request.Builder()
.url(config.getBaseUrl() + "/api/v1/stats")
.get()
.build();
return executeRequest(request, StatsResponse.class);
}).subscribeOn(Schedulers.io());
}
/**
* Gets client analytics and usage statistics.
*
* @return the analytics data
*/
public Map<String, Object> getAnalytics() {
Map<String, Object> result = new HashMap<>();
result.put("session_start", sessionStart);
result.put("operation_counts", new HashMap<>(analytics));
if (cache != null) {
com.github.benmanes.caffeine.cache.stats.CacheStats stats = cache.stats();
Map<String, Object> cacheStats = new HashMap<>();
cacheStats.put("enabled", true);
cacheStats.put("size", cache.estimatedSize());
cacheStats.put("max_size", config.getCacheSize());
cacheStats.put("hit_rate", stats.hitRate());
cacheStats.put("miss_rate", stats.missRate());
cacheStats.put("hit_count", stats.hitCount());
cacheStats.put("miss_count", stats.missCount());
result.put("cache_stats", cacheStats);
} else {
result.put("cache_stats", Map.of("enabled", false));
}
return result;
}
/**
* Clears the client cache.
*/
public void clearCache() {
if (cache != null) {
cache.invalidateAll();
}
}
/**
* Closes the client and releases resources.
*/
public void close() {
if (cache != null) {
cache.invalidateAll();
}
httpClient.dispatcher().executorService().shutdown();
httpClient.connectionPool().evictAll();
}
// Private helper methods
private <T> T executeRequest(Request request, Class<T> responseType) throws IOException {
return executeRequest(request, objectMapper.getTypeFactory().constructType(responseType));
}
private <T> T executeRequest(Request request, TypeReference<T> responseType) throws IOException {
return executeRequest(request, objectMapper.getTypeFactory().constructType(responseType));
}
private <T> T executeRequest(Request request, com.fasterxml.jackson.databind.JavaType responseType) throws IOException {
return Failsafe.with(retryPolicy).get(() -> {
try (Response response = httpClient.newCall(request).execute()) {
recordAnalytics("request");
if (!response.isSuccessful()) {
recordAnalytics("error");
handleErrorResponse(response);
}
ResponseBody responseBody = response.body();
if (responseBody == null) {
throw new HCFSException("Empty response body");
}
String json = responseBody.string();
return objectMapper.readValue(json, responseType);
}
});
}
private void handleErrorResponse(Response response) throws IOException {
String body = response.body() != null ? response.body().string() : "";
APIErrorResponse errorResponse;
try {
errorResponse = objectMapper.readValue(body, APIErrorResponse.class);
} catch (IOException e) {
// If we can't parse the error response, throw a generic exception
throw new HCFSException("HTTP " + response.code() + ": " + body);
}
switch (response.code()) {
case 400:
throw new HCFSValidationException(errorResponse.getError(), errorResponse.getErrorDetails());
case 401:
throw new HCFSAuthenticationException(errorResponse.getError());
case 404:
throw new HCFSNotFoundException(errorResponse.getError());
case 429:
String retryAfter = response.header("Retry-After");
throw new HCFSRateLimitException(errorResponse.getError(), retryAfter);
case 500:
case 502:
case 503:
case 504:
throw new HCFSServerException(errorResponse.getError(), response.code());
default:
throw new HCFSException(errorResponse.getError());
}
}
private void addFilterParams(HttpUrl.Builder urlBuilder, ContextFilter filter) {
if (filter.getPathPrefix() != null) {
urlBuilder.addQueryParameter("path_prefix", filter.getPathPrefix());
}
if (filter.getAuthor() != null) {
urlBuilder.addQueryParameter("author", filter.getAuthor());
}
if (filter.getStatus() != null) {
urlBuilder.addQueryParameter("status", filter.getStatus().toString());
}
// Add other filter parameters as needed
}
private void addPaginationParams(HttpUrl.Builder urlBuilder, PaginationOptions pagination) {
if (pagination.getPage() != null) {
urlBuilder.addQueryParameter("page", pagination.getPage().toString());
}
if (pagination.getPageSize() != null) {
urlBuilder.addQueryParameter("page_size", pagination.getPageSize().toString());
}
if (pagination.getSortBy() != null) {
urlBuilder.addQueryParameter("sort_by", pagination.getSortBy());
}
if (pagination.getSortOrder() != null) {
urlBuilder.addQueryParameter("sort_order", pagination.getSortOrder().toString());
}
}
private void addSearchOptions(Map<String, Object> searchData, SearchOptions options) {
if (options.getSearchType() != null) {
searchData.put("search_type", options.getSearchType().toString());
}
if (options.getTopK() != null) {
searchData.put("top_k", options.getTopK());
}
if (options.getSimilarityThreshold() != null) {
searchData.put("similarity_threshold", options.getSimilarityThreshold());
}
if (options.getPathPrefix() != null) {
searchData.put("path_prefix", options.getPathPrefix());
}
if (options.getSemanticWeight() != null) {
searchData.put("semantic_weight", options.getSemanticWeight());
}
if (options.getIncludeContent() != null) {
searchData.put("include_content", options.getIncludeContent());
}
if (options.getIncludeHighlights() != null) {
searchData.put("include_highlights", options.getIncludeHighlights());
}
if (options.getMaxHighlights() != null) {
searchData.put("max_highlights", options.getMaxHighlights());
}
}
private void invalidateCache(String pattern) {
if (cache == null) return;
cache.asMap().keySet().removeIf(key -> key.contains(pattern));
}
private void recordAnalytics(String operation) {
analytics.computeIfAbsent(operation, k -> new AtomicLong(0)).incrementAndGet();
}
// Inner classes for interceptors
private class AuthenticationInterceptor implements Interceptor {
@Override
public Response intercept(Chain chain) throws IOException {
Request.Builder builder = chain.request().newBuilder();
if (config.getApiKey() != null) {
builder.header("X-API-Key", config.getApiKey());
}
if (config.getJwtToken() != null) {
builder.header("Authorization", "Bearer " + config.getJwtToken());
}
return chain.proceed(builder.build());
}
}
private class UserAgentInterceptor implements Interceptor {
@Override
public Response intercept(Chain chain) throws IOException {
Request request = chain.request().newBuilder()
.header("User-Agent", USER_AGENT)
.build();
return chain.proceed(request);
}
}
private class AnalyticsInterceptor implements Interceptor {
@Override
public Response intercept(Chain chain) throws IOException {
long startTime = System.currentTimeMillis();
Response response = chain.proceed(chain.request());
long duration = System.currentTimeMillis() - startTime;
recordAnalytics("total_requests");
recordAnalytics("status_" + response.code());
analytics.computeIfAbsent("total_response_time_ms", k -> new AtomicLong(0)).addAndGet(duration);
if (!response.isSuccessful()) {
recordAnalytics("failed_requests");
}
return response;
}
}
}
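The `invalidateCache` helper above matches cache keys by substring, so invalidating `/api/v1/contexts` clears both the cached collection listing and every cached single-context entry whose key contains that path. A minimal, self-contained sketch of that strategy, using a plain `ConcurrentHashMap` in place of Caffeine (the class and key names here are illustrative, not part of the SDK):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the substring-based invalidation strategy used by HCFSClient.invalidateCache.
// Keys follow a "METHOD:path" convention, so invalidating "/api/v1/contexts" clears both
// the listing entry and every individual context entry under that path.
public class CacheInvalidationSketch {
    private final Map<String, Object> cache = new ConcurrentHashMap<>();

    public void put(String key, Object value) { cache.put(key, value); }

    public Object get(String key) { return cache.get(key); }

    // Remove every entry whose key contains the given pattern.
    public void invalidate(String pattern) {
        cache.keySet().removeIf(key -> key.contains(pattern));
    }

    public int size() { return cache.size(); }

    public static void main(String[] args) {
        CacheInvalidationSketch c = new CacheInvalidationSketch();
        c.put("GET:/api/v1/contexts/42", "ctx");
        c.put("GET:/api/v1/contexts?page=1", "list");
        c.put("GET:/api/v1/stats", "stats");
        c.invalidate("/api/v1/contexts");
        System.out.println(c.size()); // only the stats entry survives
    }
}
```

One trade-off of substring matching is that an invalidation pattern also clears keys for any other route that happens to contain the same string, so the key-naming convention has to stay consistent across methods.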

{
"name": "@hcfs/sdk",
"version": "2.0.0",
"description": "JavaScript/TypeScript SDK for the Context-Aware Hierarchical Context File System",
"main": "dist/index.js",
"module": "dist/index.esm.js",
"types": "dist/index.d.ts",
"files": [
"dist"
],
"scripts": {
"build": "rollup -c",
"dev": "rollup -c -w",
"test": "jest",
"test:watch": "jest --watch",
"test:coverage": "jest --coverage",
"lint": "eslint src --ext .ts,.js",
"lint:fix": "eslint src --ext .ts,.js --fix",
"typecheck": "tsc --noEmit",
"docs": "typedoc src/index.ts",
"prepublishOnly": "npm run build"
},
"keywords": [
"hcfs",
"context",
"ai",
"search",
"embeddings",
"typescript",
"javascript",
"sdk"
],
"author": "HCFS Development Team <dev@hcfs.dev>",
"license": "MIT",
"repository": {
"type": "git",
"url": "https://github.com/hcfs/hcfs.git",
"directory": "sdks/javascript"
},
"bugs": {
"url": "https://github.com/hcfs/hcfs/issues"
},
"homepage": "https://docs.hcfs.dev/sdk/javascript",
"engines": {
"node": ">=14.0.0"
},
"dependencies": {
"axios": "^1.6.0",
"ws": "^8.14.0",
"eventemitter3": "^5.0.1"
},
"devDependencies": {
"@types/jest": "^29.5.0",
"@types/node": "^20.0.0",
"@types/ws": "^8.5.0",
"@typescript-eslint/eslint-plugin": "^6.0.0",
"@typescript-eslint/parser": "^6.0.0",
"eslint": "^8.50.0",
"jest": "^29.7.0",
"rollup": "^4.0.0",
"@rollup/plugin-typescript": "^11.1.0",
"@rollup/plugin-node-resolve": "^15.2.0",
"@rollup/plugin-commonjs": "^25.0.0",
"rollup-plugin-dts": "^6.1.0",
"ts-jest": "^29.1.0",
"typescript": "^5.2.0",
"typedoc": "^0.25.0"
},
"peerDependencies": {
"typescript": ">=4.5.0"
},
"peerDependenciesMeta": {
"typescript": {
"optional": true
}
}
}

/**
* HCFS JavaScript/TypeScript Client
*
* Main client class for interacting with the HCFS API
*/
import axios, { AxiosInstance, AxiosRequestConfig, AxiosResponse } from 'axios';
import { EventEmitter } from 'eventemitter3';
import {
HCFSConfig,
Context,
ContextCreate,
ContextUpdate,
ContextFilter,
PaginationOptions,
SearchOptions,
SearchResult,
BatchResult,
AnalyticsData,
APIResponse,
ListResponse,
SearchResponse,
HealthResponse,
StatsResponse,
RequestOptions,
HttpMethod
} from './types';
import {
HCFSError,
HCFSConnectionError,
HCFSAuthenticationError,
HCFSNotFoundError,
HCFSValidationError,
HCFSRateLimitError,
HCFSServerError
} from './errors';
import { LRUCache } from './cache';
import { RetryManager } from './retry';
import { validatePath, normalizePath } from './utils';
/**
* Main HCFS client for JavaScript/TypeScript applications
*
* @example
* ```typescript
* const client = new HCFSClient({
* baseUrl: 'https://api.hcfs.dev/v1',
* apiKey: 'your-api-key',
* timeout: 30000,
* cache: { enabled: true, maxSize: 1000 }
* });
*
* // Create a context
* const context = await client.createContext({
* path: '/docs/readme',
* content: 'Hello, HCFS!',
* summary: 'Getting started'
* });
*
* // Search contexts
* const results = await client.searchContexts('hello world', {
* searchType: 'semantic',
* topK: 10
* });
* ```
*/
export class HCFSClient extends EventEmitter {
private readonly config: Required<HCFSConfig>;
private readonly httpClient: AxiosInstance;
private readonly cache?: LRUCache<any>;
private readonly retryManager: RetryManager;
private readonly analytics: AnalyticsData;
constructor(config: HCFSConfig) {
super();
// Merge with defaults
this.config = {
baseUrl: config.baseUrl,
apiKey: config.apiKey,
jwtToken: config.jwtToken,
timeout: config.timeout ?? 30000,
userAgent: config.userAgent ?? `@hcfs/sdk/2.0.0`,
cache: {
enabled: config.cache?.enabled ?? true,
strategy: config.cache?.strategy ?? 'lru',
maxSize: config.cache?.maxSize ?? 1000,
ttl: config.cache?.ttl ?? 3600000,
memoryLimit: config.cache?.memoryLimit ?? 100,
...config.cache
},
retry: {
enabled: config.retry?.enabled ?? true,
maxAttempts: config.retry?.maxAttempts ?? 3,
baseDelay: config.retry?.baseDelay ?? 1000,
maxDelay: config.retry?.maxDelay ?? 30000,
backoffMultiplier: config.retry?.backoffMultiplier ?? 2,
jitter: config.retry?.jitter ?? true,
retryOnStatus: config.retry?.retryOnStatus ?? [429, 500, 502, 503, 504],
retryOnTimeout: config.retry?.retryOnTimeout ?? true,
...config.retry
},
websocket: {
autoReconnect: config.websocket?.autoReconnect ?? true,
reconnectInterval: config.websocket?.reconnectInterval ?? 5000,
maxReconnectAttempts: config.websocket?.maxReconnectAttempts ?? 10,
pingInterval: config.websocket?.pingInterval ?? 30000,
pingTimeout: config.websocket?.pingTimeout ?? 10000,
messageQueueSize: config.websocket?.messageQueueSize ?? 1000,
...config.websocket
},
maxConcurrentRequests: config.maxConcurrentRequests ?? 100,
keepAlive: config.keepAlive ?? true
};
// Initialize HTTP client
this.httpClient = axios.create({
baseURL: this.config.baseUrl,
timeout: this.config.timeout,
headers: {
'User-Agent': this.config.userAgent,
'Content-Type': 'application/json',
...(this.config.apiKey && { 'X-API-Key': this.config.apiKey }),
...(this.config.jwtToken && { 'Authorization': `Bearer ${this.config.jwtToken}` })
}
});
// Initialize cache
if (this.config.cache.enabled) {
this.cache = new LRUCache(this.config.cache.maxSize, this.config.cache.ttl);
}
// Initialize retry manager
this.retryManager = new RetryManager(this.config.retry);
// Initialize analytics
this.analytics = {
operationCount: {},
cacheStats: {},
errorStats: {},
performanceStats: {},
sessionStart: new Date()
};
// Setup request/response interceptors
this.setupInterceptors();
}
/**
* Setup axios interceptors for error handling and monitoring
*/
private setupInterceptors(): void {
// Request interceptor
this.httpClient.interceptors.request.use(
(config) => {
// Track request
const operation = `${config.method?.toUpperCase()} ${config.url}`;
this.updateAnalytics(operation, true);
this.emit('request', { method: config.method, url: config.url });
return config;
},
(error) => {
this.updateAnalytics('request_error', false, error.message);
return Promise.reject(error);
}
);
// Response interceptor
this.httpClient.interceptors.response.use(
(response) => {
this.emit('response', {
status: response.status,
url: response.config.url,
duration: Date.now() - (response.config as any).startTime
});
return response;
},
(error) => {
const operation = `${error.config?.method?.toUpperCase()} ${error.config?.url}`;
this.updateAnalytics(operation, false, error.message);
this.emit('error', {
status: error.response?.status,
message: error.message,
url: error.config?.url
});
return Promise.reject(this.handleApiError(error));
}
);
}
/**
* Handle API errors and convert to appropriate HCFS error types
*/
private handleApiError(error: any): HCFSError {
if (error.code === 'ECONNABORTED' || error.code === 'ENOTFOUND') {
return new HCFSConnectionError(`Connection failed: ${error.message}`);
}
if (!error.response) {
return new HCFSConnectionError(`Network error: ${error.message}`);
}
const { status, data } = error.response;
switch (status) {
case 400:
return new HCFSValidationError(
data.error || 'Validation failed',
data.errorDetails || []
);
case 401:
return new HCFSAuthenticationError(data.error || 'Authentication failed');
case 404:
return new HCFSNotFoundError(data.error || 'Resource not found');
case 429: {
const retryAfter = error.response.headers['retry-after'];
return new HCFSRateLimitError(
data.error || 'Rate limit exceeded',
retryAfter ? parseInt(retryAfter, 10) : undefined
);
}
case 500:
case 502:
case 503:
case 504:
return new HCFSServerError(data.error || 'Server error', status);
default:
return new HCFSError(data.error || `HTTP ${status} error`, status.toString());
}
}
/**
* Make an HTTP request with retry logic and caching
*/
private async request<T>(options: RequestOptions): Promise<T> {
const cacheKey = this.getCacheKey(options);
// Try cache first for GET requests
if (options.method === 'GET' && this.cache && cacheKey) {
const cached = this.cache.get(cacheKey);
if (cached !== undefined) {
this.analytics.cacheStats.hits = (this.analytics.cacheStats.hits || 0) + 1;
return cached;
}
this.analytics.cacheStats.misses = (this.analytics.cacheStats.misses || 0) + 1;
}
// Make request with retry logic
const response = await this.retryManager.execute(async () => {
const axiosConfig: AxiosRequestConfig = {
method: options.method,
url: options.url,
data: options.data,
params: options.params,
headers: options.headers,
timeout: options.timeout || this.config.timeout,
startTime: Date.now()
} as any;
return this.httpClient.request<APIResponse<T>>(axiosConfig);
});
const result = response.data.data;
// Cache GET responses
if (options.method === 'GET' && this.cache && cacheKey) {
this.cache.set(cacheKey, result);
}
return result;
}
/**
* Generate cache key for request
*/
private getCacheKey(options: RequestOptions): string | null {
if (options.method !== 'GET') return null;
const params = options.params ? JSON.stringify(options.params) : '';
return `${options.url}:${params}`;
}
/**
* Update analytics tracking
*/
private updateAnalytics(operation: string, success: boolean, error?: string): void {
this.analytics.operationCount[operation] = (this.analytics.operationCount[operation] || 0) + 1;
if (!success && error) {
this.analytics.errorStats[error] = (this.analytics.errorStats[error] || 0) + 1;
}
}
/**
* Check API health status
*/
async healthCheck(): Promise<HealthResponse> {
return this.request({
method: 'GET',
url: '/health'
});
}
/**
* Create a new context
*/
async createContext(contextData: ContextCreate): Promise<Context> {
if (!validatePath(contextData.path)) {
throw new HCFSValidationError(`Invalid context path: ${contextData.path}`);
}
const normalizedData = {
...contextData,
path: normalizePath(contextData.path)
};
const context = await this.request<Context>({
method: 'POST',
url: '/api/v1/contexts',
data: normalizedData
});
// Invalidate relevant cache entries
this.invalidateCache('/api/v1/contexts');
return context;
}
/**
* Get a context by ID
*/
async getContext(contextId: number): Promise<Context> {
return this.request<Context>({
method: 'GET',
url: `/api/v1/contexts/${contextId}`
});
}
/**
* List contexts with filtering and pagination
*/
async listContexts(
filter?: ContextFilter,
pagination?: PaginationOptions
): Promise<{ contexts: Context[]; pagination: any }> {
const params: Record<string, any> = {};
// Add filter parameters
if (filter) {
if (filter.pathPrefix) params.path_prefix = filter.pathPrefix;
if (filter.author) params.author = filter.author;
if (filter.status) params.status = filter.status;
if (filter.createdAfter) params.created_after = filter.createdAfter.toISOString();
if (filter.createdBefore) params.created_before = filter.createdBefore.toISOString();
if (filter.contentContains) params.content_contains = filter.contentContains;
if (filter.minContentLength) params.min_content_length = filter.minContentLength;
if (filter.maxContentLength) params.max_content_length = filter.maxContentLength;
}
// Add pagination parameters
if (pagination) {
if (pagination.page) params.page = pagination.page;
if (pagination.pageSize) params.page_size = pagination.pageSize;
if (pagination.sortBy) params.sort_by = pagination.sortBy;
if (pagination.sortOrder) params.sort_order = pagination.sortOrder;
}
const response = await this.httpClient.get<ListResponse<Context>>('/api/v1/contexts', { params });
return {
contexts: response.data.data,
pagination: response.data.pagination
};
}
/**
* Update an existing context
*/
async updateContext(contextId: number, updates: ContextUpdate): Promise<Context> {
const context = await this.request<Context>({
method: 'PUT',
url: `/api/v1/contexts/${contextId}`,
data: updates
});
// Invalidate cache
this.invalidateCache(`/api/v1/contexts/${contextId}`);
this.invalidateCache('/api/v1/contexts');
return context;
}
/**
* Delete a context
*/
async deleteContext(contextId: number): Promise<void> {
await this.request({
method: 'DELETE',
url: `/api/v1/contexts/${contextId}`
});
// Invalidate cache
this.invalidateCache(`/api/v1/contexts/${contextId}`);
this.invalidateCache('/api/v1/contexts');
}
/**
* Search contexts
*/
async searchContexts(
query: string,
options?: SearchOptions
): Promise<SearchResult[]> {
const searchData = {
query,
search_type: options?.searchType ?? 'semantic',
top_k: options?.topK ?? 10,
similarity_threshold: options?.similarityThreshold ?? 0.0,
path_prefix: options?.pathPrefix,
semantic_weight: options?.semanticWeight ?? 0.7,
include_content: options?.includeContent ?? true,
include_highlights: options?.includeHighlights ?? true,
max_highlights: options?.maxHighlights ?? 3
};
const response = await this.httpClient.post<SearchResponse>('/api/v1/search', searchData);
return response.data.data;
}
/**
* Create multiple contexts in batch
*/
async batchCreateContexts(contexts: ContextCreate[]): Promise<BatchResult> {
const startTime = Date.now();
// Validate all contexts
for (const context of contexts) {
if (!validatePath(context.path)) {
throw new HCFSValidationError(`Invalid context path: ${context.path}`);
}
}
const normalizedContexts = contexts.map(ctx => ({
...ctx,
path: normalizePath(ctx.path)
}));
const response = await this.httpClient.post('/api/v1/contexts/batch', {
contexts: normalizedContexts
});
const executionTime = Date.now() - startTime;
const result = response.data.data;
// Invalidate cache
this.invalidateCache('/api/v1/contexts');
return {
...result,
executionTime,
successRate: result.total_items > 0 ? result.success_count / result.total_items : 0
};
}
/**
* Get comprehensive statistics
*/
async getStatistics(): Promise<StatsResponse> {
return this.request<StatsResponse>({
method: 'GET',
url: '/api/v1/stats'
});
}
/**
* Iterate through all contexts with automatic pagination
*/
async *iterateContexts(
filter?: ContextFilter,
pageSize: number = 100
): AsyncIterableIterator<Context> {
let page = 1;
while (true) {
const result = await this.listContexts(filter, { page, pageSize });
if (result.contexts.length === 0) {
break;
}
for (const context of result.contexts) {
yield context;
}
// If we got fewer contexts than requested, we've reached the end
if (result.contexts.length < pageSize) {
break;
}
page++;
}
}
/**
* Clear cache
*/
clearCache(): void {
if (this.cache) {
this.cache.clear();
}
}
/**
* Invalidate cache entries matching a pattern
*/
private invalidateCache(pattern: string): void {
if (!this.cache) return;
// Simple pattern matching - remove entries that start with the pattern
const keys = this.cache.keys();
for (const key of keys) {
if (key.startsWith(pattern)) {
this.cache.delete(key);
}
}
}
/**
* Get cache statistics
*/
getCacheStats(): Record<string, any> {
if (!this.cache) {
return { enabled: false };
}
return {
enabled: true,
size: this.cache.size,
maxSize: this.cache.maxSize,
hitRate: this.analytics.cacheStats.hits /
(this.analytics.cacheStats.hits + this.analytics.cacheStats.misses) || 0,
...this.analytics.cacheStats
};
}
/**
* Get client analytics
*/
getAnalytics(): AnalyticsData {
return {
...this.analytics,
cacheStats: this.getCacheStats()
};
}
/**
* Close the client and cleanup resources
*/
close(): void {
this.removeAllListeners();
if (this.cache) {
this.cache.clear();
}
}
}
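The early-exit logic in `iterateContexts` is easy to get subtly wrong (off-by-one pages, infinite loops on empty results), so here is the same loop isolated behind a stubbed page fetcher. The `listPage` stub and its data are hypothetical, used only to drive the generator without a live API:

```typescript
// Minimal re-creation of the iterateContexts pagination logic,
// driven by a stubbed page fetcher instead of the HCFS API.
type Page<T> = { items: T[] };

async function* paginate<T>(
  listPage: (page: number, pageSize: number) => Promise<Page<T>>,
  pageSize = 100
): AsyncIterableIterator<T> {
  let page = 1;
  while (true) {
    const result = await listPage(page, pageSize);
    if (result.items.length === 0) break;
    yield* result.items;
    // Fewer items than requested means we've reached the last page.
    if (result.items.length < pageSize) break;
    page++;
  }
}

// Hypothetical stub: 5 items served in pages of 2.
const data = ['a', 'b', 'c', 'd', 'e'];
async function listPage(page: number, pageSize: number): Promise<Page<string>> {
  const start = (page - 1) * pageSize;
  return { items: data.slice(start, start + pageSize) };
}

(async () => {
  const seen: string[] = [];
  for await (const item of paginate(listPage, 2)) seen.push(item);
  console.log(seen.join(',')); // prints "a,b,c,d,e"
})();
```

Note that the final short page (`['e']`, length 1 < 2) terminates the loop without an extra round trip, which is exactly the optimization the comment in `iterateContexts` describes.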


@@ -0,0 +1,70 @@
/**
* HCFS JavaScript/TypeScript SDK
*
* A comprehensive SDK for interacting with the HCFS API from JavaScript and TypeScript applications.
* Supports both Node.js and browser environments with full TypeScript support.
*
* @example
* ```typescript
* import { HCFSClient, Context } from '@hcfs/sdk';
*
* const client = new HCFSClient({
* baseUrl: 'https://api.hcfs.dev/v1',
* apiKey: 'your-api-key'
* });
*
* const context = new Context({
* path: '/docs/readme',
* content: 'Hello, HCFS!',
* summary: 'Getting started guide'
* });
*
* const created = await client.createContext(context);
* console.log(`Created context: ${created.id}`);
* ```
*/
export { HCFSClient } from './client';
export { HCFSWebSocketClient } from './websocket';
export {
Context,
SearchResult,
ContextFilter,
PaginationOptions,
SearchOptions,
HCFSConfig,
BatchResult,
AnalyticsData
} from './types';
export {
HCFSError,
HCFSConnectionError,
HCFSAuthenticationError,
HCFSNotFoundError,
HCFSValidationError,
HCFSRateLimitError,
HCFSServerError
} from './errors';
export { Cache, LRUCache } from './cache';
export { RetryManager } from './retry';
export {
validatePath,
normalizePath,
textChunker,
extractKeywords,
contextSimilarity
} from './utils';
// Version info
export const VERSION = '2.0.0';
// Default configuration
export const DEFAULT_CONFIG = {
baseUrl: 'https://api.hcfs.dev/v1',
timeout: 30000,
retryAttempts: 3,
retryDelay: 1000,
cacheEnabled: true,
cacheSize: 1000,
cacheTTL: 3600000, // 1 hour
} as const;


@@ -0,0 +1,449 @@
/**
* TypeScript type definitions for HCFS SDK
*/
/**
* Context status enumeration
*/
export enum ContextStatus {
ACTIVE = 'active',
ARCHIVED = 'archived',
DELETED = 'deleted',
DRAFT = 'draft'
}
/**
* Search type enumeration
*/
export enum SearchType {
SEMANTIC = 'semantic',
KEYWORD = 'keyword',
HYBRID = 'hybrid',
FUZZY = 'fuzzy'
}
/**
* Cache strategy enumeration
*/
export enum CacheStrategy {
LRU = 'lru',
LFU = 'lfu',
TTL = 'ttl',
FIFO = 'fifo'
}
/**
* Represents a context in the HCFS system
*/
export interface Context {
/** Unique context identifier */
id?: number;
/** Hierarchical path of the context */
path: string;
/** Content of the context */
content: string;
/** Brief summary of the context */
summary?: string;
/** Author of the context */
author?: string;
/** Tags associated with the context */
tags?: string[];
/** Additional metadata */
metadata?: Record<string, any>;
/** Context status */
status?: ContextStatus;
/** Creation timestamp */
createdAt?: Date;
/** Last update timestamp */
updatedAt?: Date;
/** Context version number */
version?: number;
/** Similarity score (for search results) */
similarityScore?: number;
}
/**
* Context creation data
*/
export interface ContextCreate {
path: string;
content: string;
summary?: string;
author?: string;
tags?: string[];
metadata?: Record<string, any>;
}
/**
* Context update data
*/
export interface ContextUpdate {
content?: string;
summary?: string;
tags?: string[];
metadata?: Record<string, any>;
status?: ContextStatus;
}
/**
* Search result containing context and relevance information
*/
export interface SearchResult {
/** The context that matched the search */
context: Context;
/** Relevance score (0.0 to 1.0) */
score: number;
/** Explanation of why this result was returned */
explanation?: string;
/** Highlighted text snippets */
highlights?: string[];
}
/**
* Context filtering options
*/
export interface ContextFilter {
/** Filter by path prefix */
pathPrefix?: string;
/** Filter by author */
author?: string;
/** Filter by status */
status?: ContextStatus;
/** Filter by tags */
tags?: string[];
/** Filter by creation date (after) */
createdAfter?: Date;
/** Filter by creation date (before) */
createdBefore?: Date;
/** Filter by content substring */
contentContains?: string;
/** Minimum content length */
minContentLength?: number;
/** Maximum content length */
maxContentLength?: number;
}
/**
* Pagination options
*/
export interface PaginationOptions {
/** Page number (starts from 1) */
page?: number;
/** Number of items per page */
pageSize?: number;
/** Sort field */
sortBy?: string;
/** Sort order */
sortOrder?: 'asc' | 'desc';
}
/**
* Pagination metadata
*/
export interface PaginationMeta {
page: number;
pageSize: number;
totalItems: number;
totalPages: number;
hasNext: boolean;
hasPrevious: boolean;
}
/**
* Search configuration options
*/
export interface SearchOptions {
/** Type of search to perform */
searchType?: SearchType;
/** Maximum number of results */
topK?: number;
/** Minimum similarity score */
similarityThreshold?: number;
/** Search within path prefix */
pathPrefix?: string;
/** Weight for semantic search in hybrid mode */
semanticWeight?: number;
/** Include full content in results */
includeContent?: boolean;
/** Include text highlights */
includeHighlights?: boolean;
/** Maximum highlight snippets */
maxHighlights?: number;
}
/**
* Cache configuration
*/
export interface CacheConfig {
/** Enable caching */
enabled?: boolean;
/** Cache eviction strategy */
strategy?: CacheStrategy;
/** Maximum cache entries */
maxSize?: number;
/** Time-to-live in milliseconds */
ttl?: number;
/** Memory limit in MB */
memoryLimit?: number;
}
/**
* Retry configuration
*/
export interface RetryConfig {
/** Enable retry logic */
enabled?: boolean;
/** Maximum retry attempts */
maxAttempts?: number;
/** Base delay in milliseconds */
baseDelay?: number;
/** Maximum delay in milliseconds */
maxDelay?: number;
/** Backoff multiplier */
backoffMultiplier?: number;
/** Add random jitter to delays */
jitter?: boolean;
/** HTTP status codes to retry on */
retryOnStatus?: number[];
/** Retry on timeout errors */
retryOnTimeout?: boolean;
}
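The fields of `RetryConfig` combine into a standard capped exponential backoff. A minimal sketch of the resulting delay schedule (the `retryDelay` function is illustrative, not the SDK's actual `RetryManager` implementation):

```typescript
// delay = min(baseDelay * backoffMultiplier^attempt, maxDelay), plus optional jitter.
function retryDelay(
  attempt: number,            // 0-based attempt index
  baseDelay = 1000,
  backoffMultiplier = 2,
  maxDelay = 30000,
  jitter = false
): number {
  const raw = baseDelay * Math.pow(backoffMultiplier, attempt);
  const capped = Math.min(raw, maxDelay);
  // Full jitter: pick uniformly in [0, capped] to de-correlate retrying clients.
  return jitter ? Math.random() * capped : capped;
}

console.log([0, 1, 2, 3, 6].map(a => retryDelay(a)).join(', '));
// prints "1000, 2000, 4000, 8000, 30000"
```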
/**
* WebSocket configuration
*/
export interface WebSocketConfig {
/** Automatically reconnect on disconnect */
autoReconnect?: boolean;
/** Reconnect interval in milliseconds */
reconnectInterval?: number;
/** Maximum reconnection attempts */
maxReconnectAttempts?: number;
/** Ping interval in milliseconds */
pingInterval?: number;
/** Ping timeout in milliseconds */
pingTimeout?: number;
/** Maximum queued messages */
messageQueueSize?: number;
}
/**
* Main HCFS client configuration
*/
export interface HCFSConfig {
/** HCFS API base URL */
baseUrl: string;
/** API key for authentication */
apiKey?: string;
/** JWT token for authentication */
jwtToken?: string;
/** Request timeout in milliseconds */
timeout?: number;
/** User agent string */
userAgent?: string;
// Advanced configurations
cache?: CacheConfig;
retry?: RetryConfig;
websocket?: WebSocketConfig;
// HTTP client options
maxConcurrentRequests?: number;
keepAlive?: boolean;
}
/**
* Batch operation result
*/
export interface BatchResult {
/** Number of successful operations */
successCount: number;
/** Number of failed operations */
errorCount: number;
/** Total number of items processed */
totalItems: number;
/** Successfully created item IDs */
successfulItems: any[];
/** Failed items with error details */
failedItems: Array<{
index: number;
item: any;
error: string;
}>;
/** Execution time in milliseconds */
executionTime: number;
/** Success rate (0.0 to 1.0) */
successRate: number;
}
/**
* Stream event from WebSocket
*/
export interface StreamEvent {
/** Event type */
eventType: string;
/** Event data */
data: any;
/** Event timestamp */
timestamp: Date;
/** Related context ID */
contextId?: number;
/** Related context path */
path?: string;
}
/**
* Analytics and usage data
*/
export interface AnalyticsData {
/** Operation counts */
operationCount: Record<string, number>;
/** Cache statistics */
cacheStats: Record<string, any>;
/** Error statistics */
errorStats: Record<string, number>;
/** Performance metrics */
performanceStats: Record<string, number>;
/** Session start time */
sessionStart: Date;
}
/**
* API response wrapper
*/
export interface APIResponse<T> {
success: boolean;
data: T;
timestamp: string;
apiVersion: string;
requestId?: string;
}
/**
* API error response
*/
export interface APIErrorResponse {
success: false;
error: string;
errorDetails?: Array<{
field?: string;
message: string;
code?: string;
}>;
timestamp: string;
requestId?: string;
apiVersion: string;
}
/**
* List response with pagination
*/
export interface ListResponse<T> extends APIResponse<T[]> {
pagination: PaginationMeta;
}
/**
* Search response
*/
export interface SearchResponse {
success: boolean;
data: SearchResult[];
query: string;
searchType: SearchType;
totalResults: number;
searchTimeMs: number;
filtersApplied?: Record<string, any>;
timestamp: string;
apiVersion: string;
}
/**
* Health status
*/
export enum HealthStatus {
HEALTHY = 'healthy',
DEGRADED = 'degraded',
UNHEALTHY = 'unhealthy'
}
/**
* Component health information
*/
export interface ComponentHealth {
name: string;
status: HealthStatus;
responseTimeMs?: number;
errorMessage?: string;
}
/**
* Health check response
*/
export interface HealthResponse {
status: HealthStatus;
version: string;
uptimeSeconds: number;
components: ComponentHealth[];
}
/**
* Statistics response
*/
export interface StatsResponse {
contextStats: {
totalContexts: number;
contextsByStatus: Record<string, number>;
contextsByAuthor: Record<string, number>;
averageContentLength: number;
mostActivePaths: string[];
recentActivity: any[];
};
searchStats: {
totalSearches: number;
searchesByType: Record<string, number>;
averageResponseTimeMs: number;
popularQueries: string[];
searchSuccessRate: number;
};
systemStats: {
uptimeSeconds: number;
memoryUsageMb: number;
activeConnections: number;
cacheHitRate: number;
embeddingModelInfo: string;
databaseSizeMb: number;
};
}
/**
* Event listener function type
*/
export type EventListener<T = any> = (event: T) => void | Promise<void>;
/**
* HTTP method types
*/
export type HttpMethod = 'GET' | 'POST' | 'PUT' | 'DELETE' | 'PATCH';
/**
* Request options
*/
export interface RequestOptions {
method: HttpMethod;
url: string;
data?: any;
params?: Record<string, any>;
headers?: Record<string, string>;
timeout?: number;
}
/**
* Cache entry
*/
export interface CacheEntry<T = any> {
key: string;
value: T;
timestamp: number;
accessCount: number;
ttl?: number;
}
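Since `APIErrorResponse` pins `success` to the literal `false`, a one-line runtime check is enough to narrow a raw payload to the error shape. A self-contained sketch with the two interfaces inlined (the sample payloads are hypothetical):

```typescript
interface APIResponse<T> {
  success: boolean;
  data: T;
  timestamp: string;
  apiVersion: string;
  requestId?: string;
}

interface APIErrorResponse {
  success: false;
  error: string;
  timestamp: string;
  apiVersion: string;
  requestId?: string;
}

// Type predicate: narrows the union to the error branch.
function isErrorResponse<T>(
  resp: APIResponse<T> | APIErrorResponse
): resp is APIErrorResponse {
  return resp.success === false;
}

const ok: APIResponse<number> = {
  success: true, data: 42, timestamp: '2025-01-01T00:00:00Z', apiVersion: 'v1',
};
const bad: APIErrorResponse = {
  success: false, error: 'not found', timestamp: '2025-01-01T00:00:00Z', apiVersion: 'v1',
};
console.log(isErrorResponse(ok), isErrorResponse(bad)); // false true
```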

sdks/rust/Cargo.toml Normal file

@@ -0,0 +1,83 @@
[package]
name = "hcfs-sdk"
version = "2.0.0"
edition = "2021"
description = "Rust SDK for the Context-Aware Hierarchical Context File System"
license = "MIT"
repository = "https://github.com/hcfs/hcfs"
homepage = "https://docs.hcfs.dev/sdk/rust"
documentation = "https://docs.rs/hcfs-sdk"
keywords = ["hcfs", "context", "ai", "search", "embeddings"]
categories = ["api-bindings", "web-programming::http-client"]
authors = ["HCFS Development Team <dev@hcfs.dev>"]
readme = "README.md"
rust-version = "1.70"
[dependencies]
# HTTP client
reqwest = { version = "0.11", features = ["json", "rustls-tls"], default-features = false }
# Async runtime
tokio = { version = "1.0", features = ["full"] }
# Serialization
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
# Error handling
thiserror = "1.0"
anyhow = "1.0"
# Time handling
chrono = { version = "0.4", features = ["serde"] }
# WebSocket support
tokio-tungstenite = { version = "0.20", optional = true, features = ["rustls-tls-webpki-roots"] }
futures-util = { version = "0.3", optional = true }
# Caching
moka = { version = "0.12", features = ["future"], optional = true }
# Retry logic
backoff = { version = "0.4", optional = true }
# Rate limiting
governor = { version = "0.6", optional = true }
# Metrics
prometheus = { version = "0.13", optional = true }
# Tracing
tracing = { version = "0.1", optional = true }
# URL handling
url = "2.4"
# UUID for request IDs
uuid = { version = "1.0", features = ["v4"] }
[dev-dependencies]
tokio-test = "0.4"
wiremock = "0.5"
[features]
default = ["cache", "retry", "rate-limit"]
websocket = ["tokio-tungstenite", "futures-util"]
cache = ["moka"]
retry = ["backoff"]
rate-limit = ["governor"]
metrics = ["prometheus"]
tracing = ["dep:tracing"]
full = ["websocket", "cache", "retry", "rate-limit", "metrics", "tracing"]
[[example]]
name = "basic_usage"
required-features = ["cache"]
[[example]]
name = "websocket_streaming"
required-features = ["websocket"]
[package.metadata.docs.rs]
all-features = true
rustdoc-args = ["--cfg", "docsrs"]

sdks/rust/src/lib.rs Normal file

@@ -0,0 +1,146 @@
//! # HCFS Rust SDK
//!
//! A comprehensive Rust SDK for the Context-Aware Hierarchical Context File System (HCFS).
//! This SDK provides both synchronous and asynchronous clients with full type safety,
//! error handling, and advanced features like caching, retry logic, and WebSocket streaming.
//!
//! ## Features
//!
//! - **Type-safe API**: Full Rust type safety with comprehensive error handling
//! - **Async/await support**: Built on tokio for high-performance async operations
//! - **WebSocket streaming**: Real-time event notifications (with `websocket` feature)
//! - **Intelligent caching**: Multiple caching strategies (with `cache` feature)
//! - **Retry logic**: Configurable retry mechanisms with backoff (with `retry` feature)
//! - **Rate limiting**: Built-in rate limiting support (with `rate-limit` feature)
//! - **Observability**: Metrics and tracing support (with `metrics` and `tracing` features)
//!
//! ## Quick Start
//!
//! ```rust,no_run
//! use hcfs_sdk::{Client, Context, Config};
//!
//! #[tokio::main]
//! async fn main() -> Result<(), Box<dyn std::error::Error>> {
//! let config = Config::new("https://api.hcfs.dev/v1", "your-api-key");
//! let client = Client::new(config);
//!
//! let context = Context::builder()
//! .path("/docs/readme")
//! .content("Hello, HCFS!")
//! .summary("Getting started guide")
//! .build()?;
//!
//! let created = client.create_context(context).await?;
//! println!("Created context: {}", created.id.unwrap());
//!
//! let results = client.search_contexts("hello world").await?;
//! for result in results {
//! println!("Found: {} (score: {:.3})", result.context.path, result.score);
//! }
//!
//! Ok(())
//! }
//! ```
//!
//! ## WebSocket Streaming
//!
//! With the `websocket` feature enabled:
//!
//! ```rust,no_run
//! use hcfs_sdk::{Client, Config, StreamEvent};
//! use futures_util::StreamExt;
//!
//! #[tokio::main]
//! async fn main() -> Result<(), Box<dyn std::error::Error>> {
//! let config = Config::new("wss://api.hcfs.dev/v1", "your-api-key");
//! let client = Client::new(config);
//!
//! let mut stream = client.stream_events(Some("/docs")).await?;
//!
//! while let Some(event) = stream.next().await {
//! match event {
//! Ok(StreamEvent::ContextCreated { context, .. }) => {
//! println!("New context created: {}", context.path);
//! }
//! Ok(StreamEvent::ContextUpdated { context, .. }) => {
//! println!("Context updated: {}", context.path);
//! }
//! Ok(StreamEvent::ContextDeleted { path, .. }) => {
//! println!("Context deleted: {}", path);
//! }
//! Err(e) => eprintln!("Stream error: {}", e),
//! }
//! }
//!
//! Ok(())
//! }
//! ```
pub mod client;
pub mod types;
pub mod error;
pub mod config;
#[cfg(feature = "cache")]
pub mod cache;
#[cfg(feature = "retry")]
pub mod retry;
#[cfg(feature = "websocket")]
pub mod websocket;
// Re-exports
pub use client::Client;
pub use types::*;
pub use error::{Error, Result};
pub use config::Config;
#[cfg(feature = "websocket")]
pub use websocket::{WebSocketClient, StreamEvent};
/// SDK version
pub const VERSION: &str = env!("CARGO_PKG_VERSION");
/// Default configuration values
pub mod defaults {
use std::time::Duration;
/// Default request timeout
pub const TIMEOUT: Duration = Duration::from_secs(30);
/// Default maximum retry attempts
pub const MAX_RETRIES: usize = 3;
/// Default retry base delay
pub const RETRY_BASE_DELAY: Duration = Duration::from_millis(1000);
/// Default cache size
pub const CACHE_SIZE: u64 = 1000;
/// Default cache TTL
pub const CACHE_TTL: Duration = Duration::from_secs(3600);
/// Default rate limit (requests per second)
pub const RATE_LIMIT: u32 = 100;
/// Default user agent
pub const USER_AGENT: &str = concat!("hcfs-rust/", env!("CARGO_PKG_VERSION"));
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_version() {
assert!(!VERSION.is_empty());
}
#[tokio::test]
async fn test_client_creation() {
let config = Config::new("https://api.example.com", "test-key");
let _client = Client::new(config);
// Just test that client creation doesn't panic
}
}