Functions

Functions are serverless-style applications on Quave Cloud that automatically scale based on incoming traffic, including the ability to scale to zero when idle. They are powered by Knative and are ideal for event-driven workloads, APIs with variable traffic, webhooks, and background processors that don't need to run continuously.

How Functions Work

Unlike regular apps that maintain a fixed number of containers, functions dynamically adjust:

  • Scale to zero: When no requests are received for a configurable idle period, the function scales down to zero containers, saving costs.
  • Scale up on demand: When a request arrives, a container is started automatically (cold start). Subsequent requests are handled by active containers.
  • Concurrency-based scaling: New containers are added when the number of concurrent requests per container exceeds the configured threshold.

Prerequisites

  • Quave ONE Connect plan (or higher)
  • Functions must be enabled on your account by the Quave Cloud team
  • The deployment region must support functions

If you want to run functions in a region where they are not yet available, please contact us, even if you are not on the Quave ONE Connect plan.

Creating a Function App

Via the Web UI

  1. Navigate to Apps in your account
  2. Click Add function
  3. Choose the Use image deployment method
  4. Provide the Docker image URL for your function
  5. Select a region that supports functions
  6. Click Create

Via the API

curl -X POST \
-H 'Authorization: YOUR_TOKEN' \
-H 'Content-Type: application/json' \
-d '{
"name": "my-function",
"accountId": "YOUR_ACCOUNT_ID",
"port": 8080,
"dockerPreset": "FUNCTION",
"useImage": true,
"image": "docker.io/myorg/my-function:latest"
}' \
https://api.quave.cloud/api/public/v1/app

Via CLI (build from source)

Create the app in the web UI or API, then deploy code directly:

quaveone deploy --user-token <token> --env <env name> \
--dir ./my-function

Quave Cloud builds the Docker image from your Dockerfile and deploys it as a Knative function.

Via MCP

Ask your AI assistant:

"Create a function app called my-function on port 3000 with a Dockerfile for CLI deployment"

Or with a pre-built image:

"Create a function app called my-function on port 8080 using image docker.io/myorg/my-function:latest"

The create-app MCP tool supports dockerPreset: "FUNCTION" for creating function apps, with either isCliDeployment: true (build from source) or useImage: true (pre-built image).

Function Configuration

Each function environment has a functionConfig that controls scaling and timeout behavior. You can configure these settings via the web UI, API, CLI, or MCP.

Configuration Fields

  • containerConcurrency (Integer): Maximum number of concurrent requests per container instance; when exceeded, new containers are started. Default: Knative default.
  • timeoutSeconds (Integer): Maximum time in seconds a request can take before being terminated. Default: Knative default.
  • idleTimeoutSeconds (Integer): Seconds of inactivity before the function scales to zero. Minimum: 300 seconds, subject to your account-level minimum. Default: account default.
  • responseStartTimeoutSeconds (Integer): Maximum time in seconds to wait for the first byte of a response. Default: Knative default.
  • minScale (Integer): Minimum number of container instances to keep running; set to 0 to allow scale-to-zero. Default: 0.
  • maxScale (Integer): Maximum number of container instances, subject to your account-level maximum. Default: account default.

Updating via the API

Use the dedicated function config endpoint:

curl -X PATCH \
-H 'Authorization: YOUR_TOKEN' \
-H 'Content-Type: application/json' \
-d '{
"appEnvId": "YOUR_APP_ENV_ID",
"containerConcurrency": 80,
"timeoutSeconds": 300,
"idleTimeoutSeconds": 600,
"minScale": 0,
"maxScale": 10,
"applyImmediately": true
}' \
https://api.quave.cloud/api/public/v1/app-env/function-config

You can also update function config through the general update endpoint (PUT /api/public/v1/app-env) by including a functionConfig object in the request body.
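As a sketch of that general-endpoint variant, the request body below nests the settings under a functionConfig object (field names are the ones from the configuration table; the curl call itself is commented out so the snippet can be run without credentials):

```shell
# Build the PUT body with function settings nested under "functionConfig".
payload='{
  "appEnvId": "YOUR_APP_ENV_ID",
  "functionConfig": {
    "containerConcurrency": 80,
    "timeoutSeconds": 300,
    "idleTimeoutSeconds": 600,
    "minScale": 0,
    "maxScale": 10
  }
}'
echo "$payload"
# curl -X PUT \
#   -H 'Authorization: YOUR_TOKEN' \
#   -H 'Content-Type: application/json' \
#   -d "$payload" \
#   https://api.quave.cloud/api/public/v1/app-env
```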

Updating via CLI

Pass function flags during deployment:

quaveone deploy --user-token <token> --env <env name> \
--fn-container-concurrency 80 \
--fn-timeout 300 \
--fn-idle-timeout 600 \
--fn-min-scale 0 \
--fn-max-scale 10

Updating via MCP

Ask your AI assistant:

"Set the max scale to 10 and idle timeout to 600 seconds for my production function"

The update-function-config MCP tool handles all function configuration changes.

Deploying Functions

Functions support two deployment methods: building from source (recommended) or deploying a pre-built Docker image.

Build from Source (CLI)

Deploy your code directly and let Quave Cloud build the Docker image for you:

quaveone deploy --user-token <token> --env <env name> \
--dir ./my-function

You can combine code deployment with function config updates:

quaveone deploy --user-token <token> --env <env name> \
--dir ./my-function \
--fn-timeout 300 --fn-min-scale 0 --fn-max-scale 5

Build from Source (MCP)

Use the code upload flow: request-deploy-storage-key to get an upload URL, upload your code archive, then notify-code-upload to trigger the build. Quave Cloud builds the image and deploys it to Knative.
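A hypothetical sketch of those three steps follows. The signed upload URL comes from the request-deploy-storage-key tool, and the archive format shown (a gzipped tar of the app directory) is an assumption; the network calls are commented out:

```shell
# Prepare a throwaway app directory with a Dockerfile (functions require one).
workdir="$(mktemp -d)"
mkdir "$workdir/my-function"
printf 'FROM node:20-alpine\n' > "$workdir/my-function/Dockerfile"

# Step 1: request-deploy-storage-key -> returns a signed UPLOAD_URL (not shown)
# Step 2: archive the code and upload it to that URL
tar -czf "$workdir/my-function.tar.gz" -C "$workdir/my-function" .
# curl -X PUT --upload-file "$workdir/my-function.tar.gz" "$UPLOAD_URL"
# Step 3: notify-code-upload -> Quave Cloud builds the image and deploys it
ls "$workdir/my-function.tar.gz"
```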

Deploy a Pre-Built Image (CLI)

If you already have a Docker image, deploy it directly:

quaveone deploy --user-token <token> --env <env name> \
--image docker.io/myorg/my-function:v1.2.3

You can combine image deployment with function config updates:

quaveone deploy --user-token <token> --env <env name> \
--image docker.io/myorg/my-function:v1.2.3 \
--fn-timeout 300 --fn-min-scale 0 --fn-max-scale 5

Deploy a Pre-Built Image (API)

Use the deploy-image endpoint:

curl -X POST \
-H 'Authorization: YOUR_TOKEN' \
-H 'Content-Type: application/json' \
-d '{
"appEnvId": "YOUR_APP_ENV_ID",
"image": "docker.io/myorg/my-function:v1.2.3"
}' \
https://api.quave.cloud/api/public/v1/app-env/deploy-image

Function apps can use the deploy-image endpoint without needing useImage=true on the app.

Deploy a Pre-Built Image (MCP)

Ask your AI assistant:

"Deploy image docker.io/myorg/my-function:v1.2.3 to my production function"

Scaling Behavior

Cold Starts

When a function is scaled to zero and a request arrives, a new container must start before it can handle the request. This introduces a cold start delay. To minimize cold starts:

  • Set minScale to 1 or higher to keep at least one container warm
  • Optimize your container startup time (keep the image small, minimize initialization work)
  • Use responseStartTimeoutSeconds to configure how long to wait for the first response byte
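To illustrate the startup-time advice, here is a hypothetical minimal Dockerfile for a Node.js function: a small base image, production-only dependencies, and a direct process start. The file name server.js and the Node runtime are assumptions; adapt it to your stack.

```dockerfile
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev        # install only runtime dependencies
COPY . .
ENV PORT=8080
EXPOSE 8080
CMD ["node", "server.js"]    # start directly; avoid npm wrapper overhead
```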

Scale-to-Zero

When no requests are received for the duration specified by idleTimeoutSeconds, the function scales down to zero. This is the primary cost-saving feature of functions.

  • The minimum idleTimeoutSeconds is 300 (5 minutes), further limited by your account's functionMinIdleTimeoutSeconds
  • Scale-to-zero can be disabled at the account level by setting functionScaleToZero to false

Concurrency-Based Scaling

Knative monitors the number of concurrent requests per container. When the containerConcurrency limit is approached, new containers are started to handle the load, up to the maxScale limit.
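As a rough mental model (the real Knative autoscaler also uses averaging windows and panic-mode behavior, so this is an approximation, not the algorithm), the container count tends toward the concurrent request count divided by containerConcurrency, rounded up and capped at maxScale:

```shell
concurrent_requests=200
container_concurrency=80   # from functionConfig
max_scale=10               # from functionConfig

# ceil(concurrent_requests / container_concurrency), capped at max_scale
needed=$(( (concurrent_requests + container_concurrency - 1) / container_concurrency ))
containers=$(( needed < max_scale ? needed : max_scale ))
echo "$containers"  # 3 containers for 200 concurrent requests at 80 each
```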

Differences from Regular Apps

  • Scaling: regular apps use a fixed number of containers or HPA autoscaling; functions use Knative concurrency-based scaling and can scale to zero.
  • Domain: regular apps use the standard app domain; functions use a dedicated function domain.
  • Billing: regular apps use standard per-container pricing; functions have a 3x multiplier on function pod time.
  • Pods: regular apps show recent pods; functions show the last 20 pods regardless of status.
  • Regions: regular apps run in all enabled regions; functions run only in regions with function domain support.
  • Persistent storage: regular apps support volumes; functions do not.

Billing

Function pods are billed at 3x the standard per-container rate. This multiplier accounts for the Knative infrastructure overhead and the scale-to-zero capability. When your function is scaled to zero, you are not billed for container time.
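The multiplier math is straightforward; the sketch below uses an illustrative pod-time figure (actual per-container rates depend on your plan and are not stated in this doc):

```shell
pod_minutes=120      # minutes your function pods actually ran this period
multiplier=3         # function pods bill at 3x the standard rate
billed_minutes=$(( pod_minutes * multiplier ))
echo "$billed_minutes"   # 360 billed container-minutes; time at scale zero bills nothing
```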

Observability

Functions support the same observability tools as regular apps:

  • Logs: View function logs via the web UI, API (GET /logs), CLI, or MCP (get-logs)
  • Pods: View function pods via the web UI, API (GET /app-env/pods), or MCP (get-app-env-pods). Functions show the last 20 pods regardless of termination status
  • History: View deployment history via the web UI, API (GET /app-env/history), or MCP (get-app-env-history)
  • Status: Check function status via the web UI, API (GET /app-env/status), or MCP (get-app-env-status)

Note: Traditional CPU and memory metrics are not collected for functions since containers are ephemeral.

Managing Functions via MCP

The following MCP tools are useful for managing functions:

  • create-app: Create a function app with dockerPreset: "FUNCTION" (supports both isCliDeployment and useImage)
  • update-function-config: Update function scaling and timeout settings
  • request-deploy-storage-key: Get an upload URL for code deployment (step 1 of build from source)
  • notify-code-upload: Trigger the build after code upload (step 3 of build from source)
  • deploy-app-env-image: Deploy a pre-built Docker image to a function environment
  • get-app-env: View function configuration (includes functionConfig and the isFunction flag)
  • get-app-env-status: Check function runtime status
  • get-app-env-pods: View function pod details
  • get-logs: View function logs
  • stop-app-env: Stop a function environment
  • start-app-env: Start a stopped function environment

Limitations

  • Dockerfile required: Functions must have a Dockerfile for building from source. Docker presets that auto-generate Dockerfiles are not supported for functions.
  • No persistent volumes: Functions cannot use persistent storage (useVolume is not supported).
  • Region restrictions: Functions are only available in regions that have a function domain configured.
  • No traditional autoscaling: Functions use Knative's concurrency-based scaling instead of the HPA autoscaling available for regular apps. The update-app-env-autoscaling and scale-app-env-containers tools do not apply to functions.
  • Plan requirement: Functions require the Quave ONE Connect plan or higher.