Get started with Elasticsearch
Learn how to create real-time Elasticsearch indexes from Postgres changes in minutes using Sequin.
In this quickstart, you’ll create a real‑time data pipeline that streams changes from a Postgres database into an Elasticsearch index. You’ll:
- Boot Sequin
- Connect to a sample playground database
- Start a local Elasticsearch + Kibana stack
- Create an Elasticsearch index
- Create a Sequin sink from Postgres to Elasticsearch
- Watch your data flow in real‑time
By the end you’ll have hands‑on experience setting up Postgres change data capture (CDC) with Sequin and Elasticsearch. The same pattern applies to your own database.
Run Sequin
The easiest way to get started with Sequin is with our Docker Compose file. This file starts a Postgres database, Redis instance, and Sequin server.
Create directory and start services
- Download sequin-docker-compose.zip.
- Unzip the file.
- Navigate to the unzipped directory and start the services:
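The steps above can be sketched as follows; the unpacked directory name is an assumption and may differ for your download:

```shell
# Unpack the Docker Compose bundle and start all services in the background.
unzip sequin-docker-compose.zip
cd sequin-docker-compose   # directory name may differ
docker compose up -d
```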
Verify services are running
Check that Sequin is running using `docker ps`:
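For example, you can list the running containers with their statuses:

```shell
# List running containers; each service should report a status of "Up".
docker ps --format "table {{.Names}}\t{{.Status}}"
```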
You should see output like the following:
Sequin, Postgres, Redis, Prometheus, and Grafana should be up and running (status: `Up`).
Login
The Docker Compose file automatically configures Sequin with an admin user and a playground database.
Let’s log in to the Sequin web console:
Open the web console
After starting the Docker Compose services, open the Sequin web console at http://localhost:7376:
Login with default credentials
Use the following default credentials to log in:
- Email:
- Password:
View the playground database
To get you started quickly, Sequin’s Docker Compose file creates a logical database called `sequin_playground` with a sample dataset in the `public.products` table.
Let’s take a look:
Navigate to Databases
In the Sequin web console, click Databases in the sidebar.
Select playground database
Click on the pre-configured `sequin-playground` database:
The database “Health” should be green.
View contents of the products table
Let’s get a sense of what’s in the `products` table. Run the following command:
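A sketch of the command, assuming the Postgres container is named `sequin-postgres` (check `docker ps` for the actual name on your machine):

```shell
# Connect to the Postgres container and list the sample products.
docker exec -it sequin-postgres \
  psql -U postgres -d sequin_playground \
  -c "SELECT * FROM products;"
```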
This command connects to the running Postgres container and runs a `psql` command.
You should see a list of the rows in the `products` table:
We’ll make modifications to this table in a bit.
Start Elasticsearch & Kibana
We’ll run Elasticsearch locally with Docker using Elastic’s start‑local helper script.
The script:
- Downloads the Elasticsearch & Kibana images
- Generates credentials
- Starts both services via docker‑compose
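Elastic publishes the helper at elastic.co/start-local; it is typically invoked as a one-liner:

```shell
# Download and run Elastic's start-local script.
curl -fsSL https://elastic.co/start-local | sh
```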
When the script finishes you’ll see output like:
Copy the API key and API endpoint URL – you’ll need them when configuring the sink.
Create an index
Next, create the `products` index that will receive documents.
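A minimal sketch using the Create Index API; the mappings are assumptions based on the sample `products` table, so adjust field names and types to match your data:

```shell
# Create the products index with explicit mappings (field names assumed).
curl -X PUT "http://localhost:9200/products" \
  -H "Authorization: ApiKey <api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "mappings": {
      "properties": {
        "id":    { "type": "integer" },
        "name":  { "type": "text" },
        "price": { "type": "float" }
      }
    }
  }'
```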
Make sure to replace `<api-key>` with the API key you copied earlier.
You should receive:
Create an Elasticsearch sink
With the playground database connected and the index created, you’re ready to add a sink that pushes changes to Elasticsearch.
Head back to the Sequin console and navigate to the Sinks tab
Click Sinks in the sidebar, then Create Sink.
Select sink type
Choose Elasticsearch and click Continue.
Verify source configuration
In the Source card you’ll see the `sequin_playground` database and the `products` table pre‑selected. Leave the defaults.
Add a transform
Open the Transform card, click + Create new transform and use the following Elixir function in a Function transform:
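A minimal sketch of such a transform: it passes each row through as the Elasticsearch document. The exact function shape Sequin expects may differ, so treat this as an assumption:

```elixir
def transform(_action, record, _changes, _metadata) do
  # Index the row itself as the Elasticsearch document.
  record
end
```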
Name the transform `products-elasticsearch` and click Create transform.
Select the transform
Navigate back to the Sinks tab and select the transform you just created.
If you don’t see the transform you just created, click the refresh button.
Configure a backfill
Open Initial backfill and choose Backfill all rows so the existing data is loaded into Elasticsearch as soon as the sink is created.
Configure Elasticsearch
In the Elasticsearch card enter:
- Endpoint URL: `http://host.docker.internal:9200`
- Index name: `products`
- Authentication type: `api_key`
- Authentication value: `<api-key>` (copied earlier)
Leave the other defaults.
Create the sink
Give it a name, e.g. `products-elasticsearch`, and click Create Sink.
Sequin will first backfill all rows from the `products` table, then stream every change in real‑time.
Query your data in Elasticsearch
Your backfill should load all rows from the `products` table into Elasticsearch. When it completes, the sink health should be green and the backfill card should display `Processed 6 and ingested 6 records in 1s`.
You can now query your data in Elasticsearch:
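For example, a `match_all` search against the index (replace `<api-key>` with the key you copied earlier):

```shell
# Return every document currently in the products index.
curl -X GET "http://localhost:9200/products/_search" \
  -H "Authorization: ApiKey <api-key>" \
  -H "Content-Type: application/json" \
  -d '{ "query": { "match_all": {} } }'
```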
You should see the documents from your Postgres table.
See changes flow to Elasticsearch
Let’s test live updates:
Insert a product
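A sketch, again assuming the Postgres container is named `sequin-postgres`; the column names and values are hypothetical, so match them to the sample table:

```shell
# Insert a new row; Sequin streams the change to Elasticsearch.
docker exec -it sequin-postgres \
  psql -U postgres -d sequin_playground \
  -c "INSERT INTO products (name, price) VALUES ('Embroidered hoodie', 79.00);"
```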
Search for the new product:
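A match query on the assumed `name` field finds the new document (replace `<api-key>` with your key):

```shell
# Search the index for the product inserted above.
curl -X GET "http://localhost:9200/products/_search" \
  -H "Authorization: ApiKey <api-key>" \
  -H "Content-Type: application/json" \
  -d '{ "query": { "match": { "name": "hoodie" } } }'
```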
Great work!
You’ve successfully:
- Started Elasticsearch + Kibana locally
- Created an index
- Loaded existing data via backfill
- Streamed live changes
- Queried Elasticsearch