Job Queues provide a way to process discrete jobs without worrying about node capacity, load balancing, or retries. With Job Queues, you configure a port on your container, deploy your container group, and then send requests to a static endpoint. Those requests are automatically queued and distributed to healthy nodes, with automatic retries to handle transient failures.
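In practice, the application inside your container is simply a service listening on the port you configure; the Job Queue Worker delivers queued jobs to it over that port and passes its responses back as results. The minimal sketch below illustrates the idea, assuming jobs arrive as JSON HTTP POST bodies and the JSON response is treated as the job result; the exact request/response contract is defined by the Job Queue Worker, so treat the handler details here as placeholders.

```python
# Minimal sketch of the application running inside your container.
# Assumption (not from this page): the worker forwards each job as an HTTP
# POST with a JSON body to the port you configure, and treats the JSON
# response body as the job result. Verify the exact contract in the
# Job Queue Worker documentation.
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

PORT = int(os.environ.get("PORT", "8080"))  # the container port you configure

class JobHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        job = json.loads(self.rfile.read(length) or b"{}")

        # Replace this with your real workload, e.g. running an image
        # generation pipeline on the job's input.
        result = {"echo": job, "status": "done"}

        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", PORT), JobHandler).serve_forever()
```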

Terminology

  • Job: An individual request to be processed. An example of a job might be a prompt sent to an image generation service.
  • Job Queue Worker: A small, multi-architecture Go binary that you install in your container. It receives jobs from the Queue, forwards them to your application running in the container, and returns the results to the Queue (see the sketch after this list).
  • Job Queue: A Salad-managed queue service that receives requests from your systems, queues them, and distributes them to nodes in FIFO order. A Job Queue distributes jobs to one or more container groups; a container group is connected to a Job Queue when the container group is first created.
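The Job Queue Worker itself is a prebuilt binary supplied by Salad, so you never implement this loop yourself. The sketch below only illustrates the data flow described in the terminology above (receive a job, forward it to your application, return the result); every URL and field name in it is a placeholder, not the real worker API.

```python
# Conceptual illustration only: the real Job Queue Worker is a prebuilt Go
# binary supplied by Salad. This loop mirrors the data flow described above.
# All URLs and field names here are placeholders, not the real API.
import time
import requests

QUEUE_URL = "https://example.invalid/queue"  # placeholder for the Salad-managed queue
APP_URL = "http://localhost:8080/"           # your application inside the container

def run_worker():
    while True:
        job = requests.get(f"{QUEUE_URL}/next", timeout=30)       # 1. receive a job from the Queue
        if job.status_code == 204:
            time.sleep(1)                                          # nothing queued yet
            continue
        result = requests.post(APP_URL, json=job.json())           # 2. forward it to your application
        requests.post(f"{QUEUE_URL}/result", json=result.json())   # 3. return the result to the Queue

if __name__ == "__main__":
    run_worker()
```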

Architecture

Getting Started with Job Queues

  1. Create a Job Queue
  2. Configure the Job Queue Worker
  3. Push your container to a registry
  4. Configure a Container Group with a Job Queue
  5. Add Jobs to the Queue
  6. Retrieve Results (steps 5 and 6 are sketched below)
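As a concrete illustration of steps 5 and 6, the sketch below adds a job through the SaladCloud REST API and then polls for its result. The organization, project, and queue names and the API key are hypothetical, and the endpoint paths, header, and response fields follow the general shape of the API but should be confirmed against the SaladCloud API reference.

```python
# Sketch of steps 5 and 6 against the SaladCloud REST API. The endpoint
# paths, header name, and payload/response fields are assumptions based on
# the general pattern of the API -- confirm them in the API reference.
import requests

API_BASE = "https://api.salad.com/api/public"              # assumed base URL
ORG, PROJECT, QUEUE = "my-org", "my-project", "my-queue"   # hypothetical names
HEADERS = {"Salad-Api-Key": "YOUR_API_KEY"}                # assumed auth header

# Step 5: add a job to the queue. The "input" payload is whatever your
# container application expects (e.g. an image-generation prompt).
create = requests.post(
    f"{API_BASE}/organizations/{ORG}/projects/{PROJECT}/queues/{QUEUE}/jobs",
    headers=HEADERS,
    json={"input": {"prompt": "a bowl of salad, studio lighting"}},
)
job_id = create.json()["id"]

# Step 6: poll the job until it completes and read its output.
job = requests.get(
    f"{API_BASE}/organizations/{ORG}/projects/{PROJECT}/queues/{QUEUE}/jobs/{job_id}",
    headers=HEADERS,
).json()
print(job.get("status"), job.get("output"))
```

Because jobs are processed asynchronously across nodes, results are retrieved by reading the job back after it completes rather than from the create response.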