amcs/configs/docker.yaml
Hein (Warky) 72b4f7ce3d

feat: implement file upload handler and related functionality

- Added file upload handler to process both multipart and raw file uploads.
- Implemented parsing logic for upload requests, including handling file metadata.
- Introduced SaveFileDecodedInput structure for handling decoded file uploads.
- Created unit tests for file upload parsing and validation.
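The multipart-versus-raw split described above can be sketched as a Content-Type dispatch. This is a minimal illustration, not the actual handler: `DecodedFile` and `parseUpload` are hypothetical stand-ins for the commit's `SaveFileDecodedInput` and parsing logic.

```go
package main

import (
	"fmt"
	"strings"
)

// DecodedFile is a hypothetical stand-in for the SaveFileDecodedInput
// structure mentioned in the commit: already-decoded bytes plus metadata.
type DecodedFile struct {
	Name        string
	ContentType string
	Data        []byte
}

// parseUpload routes an upload to multipart or raw handling based on the
// Content-Type header, mirroring the split described in the commit message.
// Names and signatures are illustrative, not the real API.
func parseUpload(contentType string, body []byte, filename string) (DecodedFile, error) {
	if strings.HasPrefix(contentType, "multipart/form-data") {
		// Real code would parse the body with mime/multipart using the
		// boundary parameter; only the branch is shown in this sketch.
		return DecodedFile{}, fmt.Errorf("multipart parsing not shown in this sketch")
	}
	if filename == "" {
		return DecodedFile{}, fmt.Errorf("raw upload requires a filename")
	}
	return DecodedFile{Name: filename, ContentType: contentType, Data: body}, nil
}

func main() {
	f, err := parseUpload("application/octet-stream", []byte("hello"), "notes.txt")
	fmt.Println(f.Name, len(f.Data), err)
}
```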

feat: add metadata retry configuration and functionality

- Introduced MetadataRetryConfig to the application configuration.
- Implemented MetadataRetryer to handle retrying metadata extraction for thoughts.
- Added new tool for retrying failed metadata extractions.
- Updated thought metadata structure to include status and timestamps for metadata processing.
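The retry flow above can be sketched as a config struct mirroring the `metadata_retry` keys in docker.yaml plus a bounded retry pass. The field names, the `pending`/`extract` callbacks, and `runOnce` are assumptions for illustration, not the repository's actual `MetadataRetryer` API.

```go
package main

import (
	"fmt"
	"time"
)

// MetadataRetryConfig mirrors the metadata_retry section of docker.yaml
// (enabled, run_on_startup, interval, max_per_run, include_archived).
// Field names are assumptions derived from the config keys.
type MetadataRetryConfig struct {
	Enabled         bool
	RunOnStartup    bool
	Interval        time.Duration
	MaxPerRun       int
	IncludeArchived bool
}

// MetadataRetryer re-runs metadata extraction for thoughts whose previous
// extraction failed. The callbacks are hypothetical seams for storage and AI.
type MetadataRetryer struct {
	cfg     MetadataRetryConfig
	pending func(limit int, includeArchived bool) []string // failed thought IDs
	extract func(id string) error                          // one extraction attempt
}

// runOnce processes at most MaxPerRun failed thoughts and reports how many
// extractions succeeded this pass.
func (r *MetadataRetryer) runOnce() int {
	ok := 0
	for _, id := range r.pending(r.cfg.MaxPerRun, r.cfg.IncludeArchived) {
		if err := r.extract(id); err == nil {
			ok++
		}
	}
	return ok
}

func main() {
	r := &MetadataRetryer{
		cfg:     MetadataRetryConfig{Enabled: true, Interval: 24 * time.Hour, MaxPerRun: 100},
		pending: func(limit int, _ bool) []string { return []string{"t1", "t2"} },
		extract: func(id string) error {
			if id == "t2" {
				return fmt.Errorf("model unavailable")
			}
			return nil
		},
	}
	fmt.Println(r.runOnce())
}
```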

fix: enhance metadata normalization and error handling

- Updated metadata normalization functions to track status and errors.
- Improved handling of metadata extraction failures during thought updates and captures.
- Ensured that metadata status is correctly set during various operations.

refactor: streamline file saving logic in FilesTool

- Refactored Save method in FilesTool to utilize new SaveDecoded method.
- Simplified project and thought ID resolution logic during file saving.
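The delegation described above can be sketched as follows. The commit only says `Save` now calls a new `SaveDecoded`; the base64 encoding, signatures, and return values here are assumptions chosen to show the separation of decoding from persistence.

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// FilesTool is a minimal stand-in for the tool the commit refactors.
type FilesTool struct{}

// SaveDecoded persists already-decoded bytes; illustrative signature that
// returns the number of bytes written.
func (t *FilesTool) SaveDecoded(projectID, name string, data []byte) (int, error) {
	return len(data), nil
}

// Save decodes the incoming payload (assumed base64 here) and delegates to
// SaveDecoded, keeping decoding and persistence concerns separate.
func (t *FilesTool) Save(projectID, name, b64 string) (int, error) {
	data, err := base64.StdEncoding.DecodeString(b64)
	if err != nil {
		return 0, fmt.Errorf("decode upload: %w", err)
	}
	return t.SaveDecoded(projectID, name, data)
}

func main() {
	var t FilesTool
	n, err := t.Save("p1", "a.txt", base64.StdEncoding.EncodeToString([]byte("hi")))
	fmt.Println(n, err)
}
```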
2026-03-30 22:57:21 +02:00


server:
  host: "0.0.0.0"
  port: 8080
  read_timeout: "15s"
  write_timeout: "30s"
  idle_timeout: "60s"
  allowed_origins:
    - "*"
mcp:
  path: "/mcp"
  server_name: "amcs"
  version: "0.1.0"
  transport: "streamable_http"
auth:
  header_name: "x-brain-key"
  query_param: "key"
  allow_query_param: false
  keys:
    - id: "local-client"
      value: "replace-me"
      description: "main local client key"
  oauth:
    clients:
      - id: "oauth-client"
        client_id: ""
        client_secret: ""
        description: "used when auth.mode=oauth_client_credentials"
database:
  url: "postgres://postgres:postgres@db:5432/amcs?sslmode=disable"
  max_conns: 10
  min_conns: 2
  max_conn_lifetime: "30m"
  max_conn_idle_time: "10m"
ai:
  provider: "litellm"
  embeddings:
    model: "openai/text-embedding-3-small"
    dimensions: 1536
  metadata:
    model: "gpt-4o-mini"
    temperature: 0.1
  litellm:
    base_url: "http://host.containers.internal:4000/v1"
    api_key: "replace-me"
    use_responses_api: false
    request_headers: {}
    embedding_model: "openrouter/openai/text-embedding-3-small"
    metadata_model: "gpt-4o-mini"
  ollama:
    base_url: "http://host.containers.internal:11434/v1"
    api_key: "ollama"
    request_headers: {}
  openrouter:
    base_url: "https://openrouter.ai/api/v1"
    api_key: ""
    app_name: "amcs"
    site_url: ""
    extra_headers: {}
capture:
  source: "mcp"
  metadata_defaults:
    type: "observation"
    topic_fallback: "uncategorized"
search:
  default_limit: 10
  default_threshold: 0.5
  max_limit: 50
logging:
  level: "info"
  format: "json"
observability:
  metrics_enabled: true
  pprof_enabled: false
metadata_retry:
  enabled: false
  run_on_startup: false
  interval: "24h"
  max_per_run: 100
  include_archived: false