Show HN: Pontoon – open-source customer data syncs

Build production-ready data syncs that integrate with your customers' data warehouses.

About

Pontoon is an open-source, self-hosted data export platform. We built Pontoon from the ground up for one use case: shipping data products to your enterprise customers.

Pontoon is engineered to make it easy to sync data directly to your customers' data warehouses (e.g., Snowflake, BigQuery, and Redshift). Your customers get their data without building ETLs or paying for ETL tools, empowering them to make data-driven decisions with data from your product.

Welcome to the future of customer data syncs 🚀

Want to get Pontoon up and running in minutes? Try our Quick Start Guide.

Key Features

  • 🚀 Easy Deployment: Get started in minutes with Docker or deploy to AWS Fargate
  • 🎯 Major Data Warehouse Integrations: Supports Snowflake, Google BigQuery, Amazon Redshift, and Postgres as sources and destinations
  • ☁️ Multi-cloud: Send data from any cloud to any cloud. Amazon Redshift ➡️ Google BigQuery? No problem!
  • ⚡ Automated Syncs: Schedule data transfers with automatic backfills. Incremental loads automatically keep destination datasets in sync
  • 🏗️ Built for Scale: Sync over 1 million records per minute
  • ✨ Web Interface: User-friendly dashboard for managing syncs, built with React/Next.js
  • 🔌 REST API: Programmatic access to all Pontoon features, built with FastAPI (see the sketch after this list)
  • ✅ Open Source: Complete control over your data and infrastructure with zero lock-in
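
Once Pontoon is running (see the Quick Start below), the REST API exposes the same capabilities as the dashboard. As a rough sketch only — the endpoint path and JSON fields here are illustrative assumptions, not Pontoon's documented API; the OpenAPI docs served by your deployment at localhost:8000/docs are the source of truth — registering a source could look something like this:

# Hypothetical sketch: register a data source via the REST API.
# The /sources path and payload fields are assumptions for illustration;
# check the OpenAPI docs for the actual endpoints and schema.
curl -X POST http://localhost:8000/sources \
    -H "Content-Type: application/json" \
    -d '{"name": "prod-snowflake", "type": "snowflake"}'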

The Problem with APIs & Data

Data Export with APIs

We built Pontoon because traditional APIs are becoming increasingly problematic at modern data scale:

  • Performance Issues: APIs struggle with large datasets and complex queries
  • Poor Customer Experience: Customers have to spend weeks building ETLs or pay for managed ETL tools ($$$)
  • Rate Limits: Data workloads tend to be bursty, often triggering rate limits, resulting in a frustrating experience for everyone involved
  • Backfills: Backfilling historical data through APIs is inherently slow, as most APIs are optimized for real-time, not bulk ingestion

Data Export with Pontoon

Pontoon solves these problems with:

  • Direct Warehouse Integration: Send data directly to your customers' data warehouses. No more ETLs needed!
  • Scalable Architecture: Handle millions of rows efficiently. Say goodbye to rate limits!
  • Automated Scheduled Syncs: Automatically keep destinations up to date, syncing only new or changed data. Backfills complete on the first sync.
  • Self-Hosted: Full control over your data and infrastructure

Is Pontoon Just Another ETL Platform?

Short Answer: No.

ETL platforms are used by data teams to pull data out of vendors (e.g., Salesforce); the data team maintains and pays for the ETL platform.

Pontoon is used by vendors (e.g., Salesforce) to offer data syncs as a product feature for their customers' data teams; the vendor deploys Pontoon.

Pontoon vs. Other ETL / Reverse-ETL Platforms

| | Pontoon | Airbyte | Singer/Stitch | Fivetran | Hightouch | Prequel | Bobsled |
|---|---|---|---|---|---|---|---|
| Open Source | ✅ Yes | ✅ Yes | ✅ Singer only | ❌ No | ⚠️ Some SDKs | ❌ No | ❌ No |
| Self-Hosted Option | ✅ Yes | ✅ Yes | ✅ Yes | ❌ No | ✅ Yes | ❌ No | ❌ No |
| First-class Data Products | ✅ Yes | ❌ No | ❌ No | ❌ No | ⚠️ Possible (with effort) | ✅ Yes | ✅ Yes |
| Multi-Tenant Data Export | ✅ Yes | ❌ No | ❌ No | ❌ No | ⚠️ Custom | ✅ Yes | ✅ Yes |
| Direct Data Warehouse Integrations | ✅ Yes | ✅ Yes | ⚠️ Destinations only | ⚠️ Destinations only | ⚠️ Sources only | ✅ Yes | ✅ Yes |
| DBT Integration | 🚧 Coming soon | ❌ No | ❌ No | ⚠️ Limited | ✅ Yes | ❌ No | ❌ No |
| Bulk Transfers (millions/billions of rows) | ✅ Yes | ✅ Yes | ⚠️ Possible | ⚠️ Possible | ❌ No | ✅ Yes | ✅ Yes |
| Full Database/Table Replication | ❌ No | ✅ Yes | ❌ No | ❌ No | ❌ No | ❌ No | ❌ No |
| Free to Use | ✅ Yes | ✅ Yes | ✅ Yes (Singer CLI) | ❌ No | ⚠️ Limited | ❌ No | ❌ No |

Quick Start

Get Pontoon running in seconds with our official Docker image. Visit our docs for more information.

docker run \
    --name pontoon \
    -p 3000:3000 \
    -p 8000:8000 \
    --rm \
    --pull=always \
    -v pontoon-internal-postgres:/var/lib/postgresql/data \
    -v pontoon-internal-redis:/data \
    ghcr.io/pontoon-data/pontoon/pontoon-unified:latest

Note: If you're using CMD or PowerShell, run the command on a single line, without the trailing backslashes (\).
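
For reference, the same command collapsed onto one line:

docker run --name pontoon -p 3000:3000 -p 8000:8000 --rm --pull=always -v pontoon-internal-postgres:/var/lib/postgresql/data -v pontoon-internal-redis:/data ghcr.io/pontoon-data/pontoon/pontoon-unified:latest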

Once the container is running, the web UI is available at http://localhost:3000, and the OpenAPI docs (to browse and test the API) are at http://localhost:8000/docs.
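
As a quick sanity check that the API is up, a plain curl against the docs URL above should report HTTP 200:

# Prints the HTTP status code returned by the OpenAPI docs page.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000/docs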

Check out the transfer quick start guide to add your first source and destination.

Running Pontoon with Docker Compose

Requirements: Docker Compose v2

To build Pontoon from source, use Docker Compose.

docker compose up --build

Creating Your First Data Export

To quickly set up your first transfer in Pontoon, follow the steps in the transfer quick start guide!
