# Using Spider with CLP
Spider is a fast, scalable distributed task execution engine. This guide describes how to set up and use Spider with CLP.
> **Note:** Spider is under active development, and its integration with CLP may change in the future. Right now, Spider only supports executing CLP compression tasks; support for search tasks will be added later.
## Requirements
* CLP v0.8.0 or higher
* Docker v28 or higher
* Docker Compose v2.20.2 or higher
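To confirm the Docker requirements above, you can compare the versions reported by `docker --version` and `docker compose version` against the minimums. A minimal comparison sketch (the helper name and the installed-version strings are illustrative assumptions, not part of CLP):

```python
def meets_minimum(installed: str, minimum: str) -> bool:
    """Numerically compare dotted version strings, e.g. "28.1.3" >= "28"."""
    parse = lambda v: [int(part) for part in v.split(".")]
    a, b = parse(installed), parse(minimum)
    # Pad the shorter list with zeros so "28" compares like "28.0.0".
    width = max(len(a), len(b))
    a += [0] * (width - len(a))
    b += [0] * (width - len(b))
    return a >= b

# Minimums from the requirements list above; installed versions are examples.
print(meets_minimum("28.1.3", "28"))      # Docker → True
print(meets_minimum("2.20.2", "2.20.2"))  # Docker Compose → True
```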
## Set up
To use Spider for CLP compression tasks, you need to configure CLP to use Spider as its compression scheduler.
### Setting up CLP with Spider
Follow the quick-start guide to download and extract the CLP package, but don’t start the package just yet.
Before starting the package, update the package's config file (`etc/clp-config.yaml`) as follows:

1. Change the `compression_scheduler.type` field to `"spider"`:

   ```yaml
   compression_scheduler:
     type: "spider"
   ```

2. (Optional) Override `database.names.spider` to avoid name conflicts when using self-provisioned database instances:

   ```yaml
   database:
     names:
       spider: "spider-db"
   ```

3. (Optional) Override the `spider_scheduler` defaults to change the listening host or avoid port conflicts:

   ```yaml
   spider_scheduler:
     host: "localhost"
     port: 6000
   ```

4. (Optional) If you don't intend to use generated credentials, set your own Spider credentials in `etc/credentials.yaml` before starting the package:

   ```yaml
   database:
     spider_username: "spider-user"
     spider_password: "spider-pass"
   ```
Continue following the quick-start guide to start CLP.
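If you overrode the `spider_scheduler` listening host or port, you can verify that the chosen port is actually free before starting the package. A minimal sketch using only Python's standard library (the helper name is an assumption, and `localhost:6000` mirrors the example override above; adjust to your config):

```python
import socket

def port_is_free(host: str, port: int) -> bool:
    """Return True if nothing is currently accepting connections on (host, port)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        # connect_ex returns 0 only when something is listening at the address.
        return sock.connect_ex((host, port)) != 0

# 6000 matches the example spider_scheduler override; change as needed.
if not port_is_free("localhost", 6000):
    print("Port 6000 is already in use; pick another spider_scheduler.port")
```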