Functions
Functions are serverless-style applications on Quave Cloud that automatically scale based on incoming traffic, including the ability to scale to zero when idle. They are powered by Knative and are ideal for event-driven workloads, APIs with variable traffic, webhooks, and background processors that don't need to run continuously.
How Functions Work
Unlike regular apps that maintain a fixed number of containers, functions dynamically adjust:
- Scale to zero: When no requests are received for a configurable idle period, the function scales down to zero containers, saving costs.
- Scale up on demand: When a request arrives, a container is started automatically (cold start). Subsequent requests are handled by active containers.
- Concurrency-based scaling: New containers are added when the number of concurrent requests per container exceeds the configured threshold.
Prerequisites
- Quave ONE Connect plan (or higher)
- Functions must be enabled on your account by the Quave Cloud team
- The deployment region must support functions
If functions are not available in your preferred region, or you are not yet on the Quave ONE Connect plan, please contact us.
Creating a Function App
Via the Web UI
- Navigate to Apps in your account
- Click Add function
- Choose the Use image deployment method
- Provide the Docker image URL for your function
- Select a region that supports functions
- Click Create
Via the API
curl -X POST \
-H 'Authorization: YOUR_TOKEN' \
-H 'Content-Type: application/json' \
-d '{
"name": "my-function",
"accountId": "YOUR_ACCOUNT_ID",
"port": 8080,
"dockerPreset": "FUNCTION",
"useImage": true,
"image": "docker.io/myorg/my-function:latest"
}' \
https://api.quave.cloud/api/public/v1/app
Via CLI (build from source)
Create the app in the web UI or API, then deploy code directly:
quaveone deploy --user-token <token> --env <env name> \
--dir ./my-function
Quave Cloud builds the Docker image from your Dockerfile and deploys it as a Knative function.
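Your project must include a Dockerfile at its root, since auto-generated Dockerfiles are not supported for functions. Below is a minimal placeholder, assuming a Python base image; swap the CMD for your real server, which must listen on the port configured for the app (8080 in this example):

```dockerfile
# Minimal placeholder Dockerfile for a function app.
# Replace the CMD with your real HTTP server; it must listen
# on the port configured for the app (8080 here).
FROM python:3.12-slim
WORKDIR /app
COPY . .
EXPOSE 8080
CMD ["python", "-m", "http.server", "8080"]
```

Keeping the image small and the startup fast also reduces cold start latency (see Scaling Behavior below).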
Via MCP
Ask your AI assistant:
"Create a function app called my-function on port 3000 with a Dockerfile for CLI deployment"
Or with a pre-built image:
"Create a function app called my-function on port 8080 using image docker.io/myorg/my-function:latest"
The create-app MCP tool supports dockerPreset: "FUNCTION" for creating function apps, with either isCliDeployment: true (build from source) or useImage: true (pre-built image).
Function Configuration
Each function environment has a functionConfig that controls scaling and timeout behavior. You can configure these settings via the web UI, API, CLI, or MCP.
Configuration Fields
| Field | Type | Description | Default |
|---|---|---|---|
| containerConcurrency | Integer | Maximum number of concurrent requests per container instance. When exceeded, new containers are started. | Knative default |
| timeoutSeconds | Integer | Maximum time in seconds a request can take before being terminated. | Knative default |
| idleTimeoutSeconds | Integer | Seconds of inactivity before the function scales to zero. Minimum: 900 seconds. Subject to account-level minimum. | Account default |
| responseStartTimeoutSeconds | Integer | Maximum time in seconds to wait for the first byte of a response. | Knative default |
| minScale | Integer | Minimum number of container instances to keep running. Set to 0 to allow scale-to-zero. | 0 |
| maxScale | Integer | Maximum number of container instances. Subject to account-level maximum. | Account default |
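Taken together, containerConcurrency and maxScale bound how many requests a function can serve at once. As a quick illustration (both values are examples, not defaults):

```shell
# Upper bound on in-flight requests for one function:
# containerConcurrency * maxScale.
container_concurrency=80
max_scale=10
echo $(( container_concurrency * max_scale ))   # prints 800
```

Requests beyond this bound queue until a container frees up or time out per timeoutSeconds.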
Updating via the API
Use the dedicated function config endpoint:
curl -X PATCH \
-H 'Authorization: YOUR_TOKEN' \
-H 'Content-Type: application/json' \
-d '{
"appEnvId": "YOUR_APP_ENV_ID",
"containerConcurrency": 80,
"timeoutSeconds": 300,
"idleTimeoutSeconds": 600,
"minScale": 0,
"maxScale": 10,
"applyImmediately": true
}' \
https://api.quave.cloud/api/public/v1/app-env/function-config
You can also update function config through the general update endpoint (PUT /api/public/v1/app-env) by including a functionConfig object in the request body.
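A sketch of that alternative, with functionConfig nested in the request body; QUAVE_TOKEN and APP_ENV_ID are placeholder variables, and the guard keeps the snippet inert when no token is set:

```shell
# Build the body with a nested functionConfig object, then
# send it via the general update endpoint if a token is set.
PAYLOAD='{
  "appEnvId": "'"${APP_ENV_ID:-YOUR_APP_ENV_ID}"'",
  "functionConfig": {
    "containerConcurrency": 80,
    "timeoutSeconds": 300,
    "minScale": 0,
    "maxScale": 10
  }
}'
echo "$PAYLOAD"
if [ -n "${QUAVE_TOKEN:-}" ]; then
  curl -X PUT \
    -H "Authorization: $QUAVE_TOKEN" \
    -H 'Content-Type: application/json' \
    -d "$PAYLOAD" \
    https://api.quave.cloud/api/public/v1/app-env
fi
```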
Updating via CLI
Pass function flags during deployment:
quaveone deploy --user-token <token> --env <env name> \
--fn-container-concurrency 80 \
--fn-timeout 300 \
--fn-idle-timeout 600 \
--fn-min-scale 0 \
--fn-max-scale 10
Updating via MCP
Ask your AI assistant:
"Set the max scale to 10 and idle timeout to 600 seconds for my production function"
The update-function-config MCP tool handles all function configuration changes.
Deploying Functions
Functions support two deployment methods: building from source (recommended) or deploying a pre-built Docker image.
Build from Source (CLI)
Deploy your code directly and let Quave Cloud build the Docker image for you:
quaveone deploy --user-token <token> --env <env name> \
--dir ./my-function
You can combine code deployment with function config updates:
quaveone deploy --user-token <token> --env <env name> \
--dir ./my-function \
--fn-timeout 300 --fn-min-scale 0 --fn-max-scale 5
Build from Source (MCP)
Use the code upload flow: request-deploy-storage-key to get an upload URL, upload your code archive, then notify-code-upload to trigger the build. Quave Cloud builds the image and deploys it to Knative.
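Between those two steps you package and upload the code yourself. A minimal sketch, assuming the archive is a tar.gz (verify the expected format) and using UPLOAD_URL as a placeholder for the signed URL returned by request-deploy-storage-key:

```shell
# Demo source tree (replace with your real project directory).
mkdir -p my-function
printf 'FROM python:3.12-slim\n' > my-function/Dockerfile

# Package the source, then PUT it to the signed upload URL.
tar -czf my-function.tar.gz -C my-function .
if [ -n "${UPLOAD_URL:-}" ]; then
  curl -X PUT --upload-file my-function.tar.gz "$UPLOAD_URL"
fi
```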
Deploy a Pre-Built Image (CLI)
If you already have a Docker image, deploy it directly:
quaveone deploy --user-token <token> --env <env name> \
--image docker.io/myorg/my-function:v1.2.3
You can combine image deployment with function config updates:
quaveone deploy --user-token <token> --env <env name> \
--image docker.io/myorg/my-function:v1.2.3 \
--fn-timeout 300 --fn-min-scale 0 --fn-max-scale 5
Deploy a Pre-Built Image (API)
Use the deploy-image endpoint:
curl -X POST \
-H 'Authorization: YOUR_TOKEN' \
-H 'Content-Type: application/json' \
-d '{
"appEnvId": "YOUR_APP_ENV_ID",
"image": "docker.io/myorg/my-function:v1.2.3"
}' \
https://api.quave.cloud/api/public/v1/app-env/deploy-image
Function apps can use the deploy-image endpoint without needing useImage=true on the app.
Deploy a Pre-Built Image (MCP)
Ask your AI assistant:
"Deploy image docker.io/myorg/my-function:v1.2.3 to my production function"
Scaling Behavior
Cold Starts
When a function is scaled to zero and a request arrives, a new container must start before it can handle the request. This introduces a cold start delay. To minimize cold starts:
- Set minScale to 1 or higher to keep at least one container warm
- Optimize your container startup time (keep the image small, minimize initialization work)
- Use responseStartTimeoutSeconds to configure how long to wait for the first response byte
Scale-to-Zero
When no requests are received for the duration specified by idleTimeoutSeconds, the function scales down to zero. This is the primary cost-saving feature of functions.
- The minimum idleTimeoutSeconds is 300 (5 minutes), further limited by your account's functionMinIdleTimeoutSeconds
- Scale-to-zero can be disabled at the account level by setting functionScaleToZero to false
Concurrency-Based Scaling
Knative monitors the number of concurrent requests per container. When the containerConcurrency limit is approached, new containers are started to handle the load, up to the maxScale limit.
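As a rough estimate (Knative's autoscaler also averages concurrency over stable and panic windows, so actual pod counts can differ), the target container count is the ceiling of total concurrent requests divided by containerConcurrency, capped at maxScale:

```shell
# ceil(concurrent / containerConcurrency), capped at maxScale.
# All values are illustrative.
concurrent=200
per_container=80
max_scale=10
needed=$(( (concurrent + per_container - 1) / per_container ))
if [ "$needed" -gt "$max_scale" ]; then
  needed=$max_scale
fi
echo "$needed"   # prints 3
```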
Differences from Regular Apps
| Aspect | Regular Apps | Functions |
|---|---|---|
| Scaling | Fixed containers or HPA autoscaling | Knative concurrency-based, scale to zero |
| Domain | Standard app domain | Dedicated function domain |
| Billing | Standard per-container pricing | 3x multiplier on function pod time |
| Pods | Recent pods shown | Last 20 pods shown regardless of status |
| Regions | All enabled regions | Only regions with function domain support |
| Persistent Storage | Supported (volumes) | Not supported |
Billing
Function pods are billed at 3x the standard per-container rate. This multiplier accounts for the Knative infrastructure overhead and the scale-to-zero capability. When your function is scaled to zero, you are not billed for container time.
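For example, with an assumed base rate of $0.05 per container-hour (a made-up figure; check your plan for actual pricing), 10 hours of function pod time would be billed as:

```shell
# Function pod time is billed at 3x the standard rate.
# The $0.05/container-hour base rate is a made-up example.
awk 'BEGIN { printf "%.2f\n", 10 * 3 * 0.05 }'   # prints 1.50
```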
Observability
Functions support the same observability tools as regular apps:
- Logs: View function logs via the web UI, API (GET /logs), CLI, or MCP (get-logs)
- Pods: View function pods via the web UI, API (GET /app-env/pods), or MCP (get-app-env-pods). Functions show the last 20 pods regardless of termination status
- History: View deployment history via the web UI, API (GET /app-env/history), or MCP (get-app-env-history)
- Status: Check function status via the web UI, API (GET /app-env/status), or MCP (get-app-env-status)
Note: Traditional CPU and memory metrics are not collected for functions since containers are ephemeral.
Managing Functions via MCP
The following MCP tools are useful for managing functions:
| Tool | Description |
|---|---|
| create-app | Create a function app with dockerPreset: "FUNCTION" (supports both isCliDeployment and useImage) |
| update-function-config | Update function scaling and timeout settings |
| request-deploy-storage-key | Get upload URL for code deployment (Step 1 of build from source) |
| notify-code-upload | Trigger build after code upload (Step 3 of build from source) |
| deploy-app-env-image | Deploy a pre-built Docker image to a function environment |
| get-app-env | View function configuration (includes functionConfig and isFunction flag) |
| get-app-env-status | Check function runtime status |
| get-app-env-pods | View function pod details |
| get-logs | View function logs |
| stop-app-env | Stop a function environment |
| start-app-env | Start a stopped function environment |
Limitations
- Dockerfile required: Functions must have a Dockerfile for building from source. Docker presets that auto-generate Dockerfiles are not supported for functions.
- No persistent volumes: Functions cannot use persistent storage (useVolume is not supported).
- Region restrictions: Functions are only available in regions that have a function domain configured.
- No traditional autoscaling: Functions use Knative's concurrency-based scaling instead of the HPA autoscaling available for regular apps. The update-app-env-autoscaling and scale-app-env-containers tools do not apply to functions.
- Plan requirement: Functions require the Quave ONE Connect plan or higher.