In this quickstart, you’ll create a real‑time data pipeline that streams changes from a Postgres database into an Elasticsearch index. You’ll:

  • Boot Sequin
  • Connect to a sample playground database
  • Start a local Elasticsearch + Kibana stack
  • Create an Elasticsearch index
  • Create a Sequin sink from Postgres to Elasticsearch
  • Watch your data flow in real‑time

By the end you’ll have hands‑on experience setting up Postgres change data capture (CDC) with Sequin and Elasticsearch. The same pattern applies to your own database.

Run Sequin

The easiest way to get started with Sequin is with our Docker Compose file. This file starts a Postgres database, a Redis instance, and the Sequin server, along with Prometheus and Grafana for monitoring.

Step 1: Create directory and start services

  1. Download sequin-docker-compose.zip.
  2. Unzip the file.
  3. Navigate to the unzipped directory and start the services:

cd sequin-docker-compose && docker compose up -d
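If a service fails to start, you can tail its logs with Docker Compose (the service name sequin is assumed from the Compose file, which names containers like sequin-sequin-1):

docker compose logs -f sequin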
Step 2: Verify services are running

Check that Sequin is running using docker ps:

docker ps

You should see output like the following:

CONTAINER ID   IMAGE                           COMMAND                  CREATED          STATUS                    PORTS                              NAMES
bd5c458cabde   sequin/sequin:latest            "/scripts/start_comm…"   11 seconds ago   Up 9 seconds              4000/tcp, 0.0.0.0:7376->7376/tcp   sequin-sequin-1
3bacd89765e7   grafana/grafana                 "/run.sh"                11 seconds ago   Up 11 seconds             0.0.0.0:3000->3000/tcp             sequin-sequin_grafana-1
3ad41319a66c   postgres:16                     "docker-entrypoint.s…"   11 seconds ago   Up 11 seconds (healthy)   0.0.0.0:7377->5432/tcp             sequin-sequin_postgres-1
6139a5fc4e80   redis:7                         "docker-entrypoint.s…"   11 seconds ago   Up 11 seconds             0.0.0.0:7378->6379/tcp             sequin-sequin_redis-1
7e07a5b052de   prom/prometheus                 "/bin/prometheus --c…"   11 seconds ago   Up 11 seconds             0.0.0.0:9090->9090/tcp             sequin-sequin_prometheus-1

Sequin, Postgres, Redis, Prometheus, and Grafana should be up and running (status: Up).
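As an optional, scriptable sanity check, you can confirm the web console answers on its default port (this only checks for an HTTP response on 7376):

curl -s -o /dev/null -w "%{http_code}\n" http://localhost:7376

Any 2xx or 3xx status code means the console is up.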

Login

The Docker Compose file automatically configures Sequin with an admin user and a playground database.

Let’s log in to the Sequin web console:

Step 1: Open the web console

After starting the Docker Compose services, open the Sequin web console at http://localhost:7376.

Step 2: Log in with default credentials

Use the following default credentials to log in:

  • Email: admin@sequinstream.com
  • Password: sequinpassword!

View the playground database

To get you started quickly, Sequin’s Docker Compose file creates a logical database called sequin_playground with a sample dataset in the public.products table.

Let’s take a look:

Step 1: Navigate to Databases

In the Sequin web console, click Databases in the sidebar.

Step 2: Select the playground database

Click on the pre-configured sequin-playground database.

The database “Health” should be green.

Step 3: View the contents of the products table

Let’s get a sense of what’s in the products table. Run the following command:

docker exec -i sequin-sequin_postgres-1 \
  psql -U postgres -d sequin_playground -c \
  "select id, name, price from products;"

This command connects to the running Postgres container and runs a psql command.

You should see a list of the rows in the products table:

  id |         name          | price 
----+-----------------------+-------
  1 | Avocados (3 pack)     |  5.99
  2 | Flank Steak (1 lb)    |  8.99
  3 | Salmon Fillet (12 oz) | 14.99
  4 | Baby Spinach (16 oz)  |  4.99
  5 | Sourdough Bread       |  6.99
  6 | Blueberries (6 oz)    |  3.99
(6 rows)

We’ll make modifications to this table in a bit.
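If you'd like to see the table's full definition first (it may contain more columns than the three selected above), you can describe it with psql:

docker exec -i sequin-sequin_postgres-1 \
  psql -U postgres -d sequin_playground -c "\d products"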

Start Elasticsearch & Kibana

We’ll run Elasticsearch locally with Docker using Elastic’s start‑local helper script.

# Download and run the helper script (≈ 1–2 minutes)
curl -fsSL https://elastic.co/start-local | sh

The script:

  • Downloads the Elasticsearch & Kibana images
  • Generates credentials
  • Starts both services with Docker Compose

When the script finishes you’ll see output like:

🎉 Congrats, Elasticsearch and Kibana are installed and running in Docker!

🌐 Open your browser at http://localhost:5601

Username: elastic
Password: <elastic-password>

🔌 Elasticsearch API endpoint: http://localhost:9200
🔑 API key: <api-key>

Copy the API key and the API endpoint URL; you'll need both when configuring the sink.
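Before moving on, you can verify the endpoint and key work by hitting the cluster root (replace <api-key> with the key you copied):

curl -H "Authorization: ApiKey <api-key>" http://localhost:9200

A JSON response with the cluster name and version confirms Elasticsearch is reachable.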

Create an index

Next, create the products index that will receive documents.

curl -X PUT "http://localhost:9200/products" \
  -H "Authorization: ApiKey <api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "settings": {
      "number_of_shards": 1,
      "number_of_replicas": 0
    }
  }'

Make sure to replace <api-key> with the API key you copied earlier.

You should receive:

{"acknowledged": true, "shards_acknowledged": true, "index": "products"}

Create an Elasticsearch sink

With the playground database connected and the index created, you’re ready to add a sink that pushes changes to Elasticsearch.

Step 1: Head back to the Sequin console and navigate to the Sinks tab

Click Sinks in the sidebar, then Create Sink.

Step 2: Select sink type

Choose Elasticsearch and click Continue.

Step 3: Verify source configuration

In the Source card you’ll see the sequin_playground database and the products table pre‑selected. Leave the defaults.

Step 4: Add a transform

Open the Transform card, click + Create new transform, and use the following Elixir function as a Function transform:

def transform(action, record, changes, metadata) do
  # Index only the fields we want in Elasticsearch; drop the rest of the row
  Map.take(record, ["id", "name", "price"])
end

Name the transform products-elasticsearch and click Create transform.

Step 5: Select the transform

Back in the sink configuration, select your new transform in the Transform card.

If you don’t see the transform you just created, click the refresh button.

Step 6: Configure a backfill

Open Initial backfill and choose Backfill all rows so the existing data is loaded into Elasticsearch as soon as the sink is created.

Step 7: Configure Elasticsearch

In the Elasticsearch card enter:

  • Endpoint URL: http://host.docker.internal:9200
  • Index name: products
  • Authentication type: api_key
  • Authentication value: <api-key> (copied earlier)

Leave the other defaults. The host.docker.internal hostname lets the Sequin container reach Elasticsearch on your host machine; if it doesn't resolve in your setup (common on Linux without Docker Desktop), use your host's IP address instead.

Step 8: Create the sink

Give it a name, e.g. products-elasticsearch, and click Create Sink.

Sequin will first backfill all rows from the products table, then stream every change in real‑time.

Query your data in Elasticsearch

The backfill loads the existing rows from the products table into Elasticsearch. When it completes, the sink's health should be green and the backfill card should show that all 6 records were processed and ingested.

You can now query your data in Elasticsearch:

curl -X GET "http://localhost:9200/products/_search?pretty" \
  -H "Authorization: ApiKey <api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "query": {
      "match_all": {}
    }
  }'

You should see the documents from your Postgres table.
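To trim the response down to just the documents, you can use Elasticsearch's filter_path parameter:

curl -X GET "http://localhost:9200/products/_search?pretty&filter_path=hits.hits._source" \
  -H "Authorization: ApiKey <api-key>" \
  -H "Content-Type: application/json" \
  -d '{"query": {"match_all": {}}}'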

See changes flow to Elasticsearch

Let’s test live updates:

Step 1: Insert a product

docker exec -i sequin-sequin_postgres-1 \
  psql -U postgres -d sequin_playground -c \
  "insert into products (name, price) values ('Organic Honey (16 oz)', 12.99);"

Search for the new product:

curl -X GET "http://localhost:9200/products/_search?pretty" \
  -H "Authorization: ApiKey <api-key>" \
  -H "Content-Type: application/json" \
  -d '{"query": {"match": {"name": "honey"}}}'
Step 2: Update a product

Change the price of the product you just inserted (the new price is arbitrary):

docker exec -i sequin-sequin_postgres-1 \
  psql -U postgres -d sequin_playground -c \
  "update products set price = 15.99 where name = 'Organic Honey (16 oz)';"

Re-run the search and the document's price should reflect the update.

Step 3: Delete a product

Finally, delete the product:

docker exec -i sequin-sequin_postgres-1 \
  psql -U postgres -d sequin_playground -c \
  "delete from products where name = 'Organic Honey (16 oz)';"

Search again and the document should be gone.
Each change appears (or disappears) in Elasticsearch within a few seconds.
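As a final check, you can compare the table's row count with the index's document count using Elasticsearch's _count API:

curl -s "http://localhost:9200/products/_count" \
  -H "Authorization: ApiKey <api-key>"

The count field in the response should match select count(*) from products in Postgres.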

Great work!

You’ve successfully:

  • Started Elasticsearch + Kibana locally
  • Created an index
  • Loaded existing data via backfill
  • Streamed live changes
  • Queried Elasticsearch

You're now ready to stream your own data.