Clera - Your AI talent agent

Crustdata

Senior Data Platform Engineer

On-site • San Francisco • $140k - $200k + 0.10% - 0.30% equity

Summary

  • Location: San Francisco
  • Salary: $140k - $200k
  • Equity: 0.10% - 0.30%
  • Workplace: On-site
  • Experience: 3+ years
  • Visa: US citizen/visa only

Company links

Website • LinkedIn

This position is no longer available

This job listing has been removed by the employer and is no longer accepting applications.

About this role

About Crustdata

The way information on the internet is consumed is changing. It is shifting from humans searching pre-crawled information on Google via point-and-click to AI agents doing real-time, targeted crawling from sources of truth. The interface, the workflow, and the density and fidelity of retrieved data that worked for humans don't work for AI agents. At Crustdata, we are building the gateway to the internet for AI agents. In simple terms, we're building the APIs that let AI agents access real-time data from sources of truth. We already serve dozens of enterprise customers, are profitable, and are growing fast. We're backed by some of the best investors in Silicon Valley, including Y Combinator, General Catalyst, SV Angel, A Capital, and Liquid 2 Ventures, among others.

About the role

Skills: Go, Python, Kafka, Spark, Data Warehousing

We are looking for a foundational member of our engineering team: a highly motivated Software Engineer to own the design, creation, and evolution of our data platform. You will be part of the team that owns the data ingestion and management infrastructure that powers Crustdata’s capabilities. If you are passionate about building robust, scalable data systems and want to see your work directly influence customers, this is the role for you.

What You'll Do

  • Architect & Build: Design, build, and maintain our core data infrastructure, including our data warehouse and data lake, using modern cloud technologies (AWS, GCP, or Azure).
  • Pipeline Development: Develop and scale robust, fault-tolerant data pipelines (ETL/ELT) to ingest and process massive volumes of structured and unstructured data from diverse sources.
  • Enable Data Science & ML: Create the foundational platform to support our data scientists and ML engineers. This includes building systems for feature engineering, model training, and deploying ML models into production.
  • Orchestration at Scale: Implement and manage workflow orchestration for hundreds of daily data jobs, ensuring reliability, monitorability, and efficiency using tools like Airflow, Dagster, or Prefect.
  • Real-time Infrastructure: Build and manage real-time data streaming pipelines using technologies like Kafka or Flink to power live dashboards and time-sensitive product features.
  • Data Quality & Governance: Champion data quality and reliability. Implement frameworks for data validation, testing, and monitoring to ensure our data is accurate and trustworthy.
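To make the data-quality responsibility concrete, here is a minimal, illustrative validation check in Python. The record fields and rules are assumptions for the sketch, not Crustdata's actual schema; real frameworks layer scheduling, reporting, and quarantine on top of checks like these.

```python
from dataclasses import dataclass


@dataclass
class CompanyRecord:
    """A toy record shape, assumed for illustration only."""
    name: str
    headcount: int
    website: str


def validate(record: CompanyRecord) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    if not record.name.strip():
        errors.append("name is empty")
    if record.headcount < 0:
        errors.append("headcount is negative")
    if not record.website.startswith(("http://", "https://")):
        errors.append("website is not a valid URL")
    return errors


# Records that fail validation would typically be routed to a
# dead-letter queue for review rather than loaded into the warehouse.
bad = CompanyRecord(name="", headcount=-5, website="example.com")
print(validate(bad))  # all three rules fire for this record
```

In a production pipeline, checks like this run as a gate between ingestion and the warehouse, and the error counts feed monitoring dashboards.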

Who You Are

  • Experience: You have 3+ years of professional software engineering experience, with a significant focus on data engineering or building backend systems at scale.
  • Strong Coder: You possess strong programming skills in Python or another modern language (e.g., Java, Go).
  • Big Data Expertise: You have hands-on experience with modern big data technologies such as Spark, Flink, or Dask.
  • Pipeline Orchestration: You have practical experience with workflow management tools like Temporal, Airflow, Dagster, or Prefect.
  • Problem Solver: You are a pragmatic problem-solver who can navigate ambiguity, manage complexity, and take ownership of projects from inception to completion.
  • Startup Mentality: You are excited to work in a fast-paced, collaborative environment and wear multiple hats.

Nice to Haves

  • Experience with real-time streaming technologies (Kafka, Pulsar, Kinesis).
  • Familiarity with containerization and orchestration (Docker, Kubernetes).
  • Knowledge of modern data warehousing and lakehouse architectures (e.g., Delta Lake, Iceberg).

About Crustdata

Realtime company and people data enrichment in the format you want it. We solved the problem of getting accurate, fresh data on companies and people by building a robust & easy-to-use set of APIs that pull live data from 15 different sources. Datasets include:

  • Headcount (live and historical)
  • Headcount growth
  • People data
  • Jobs listing
  • Web traffic
  • Product reviews
  • Company and CEO approval rating
  • Funding

Get the complete picture on the companies and people that matter to your business with the knowledge that the data is reliable and fresh.

Frequently Asked Questions

What does Crustdata pay for a Senior Data Platform Engineer?

Crustdata offers a competitive compensation package for the Senior Data Platform Engineer role. The salary range is USD 140k - 200k per year, plus 0.10% - 0.30% equity. Apply through Clera to learn more about the full compensation details.

What does a Senior Data Platform Engineer do at Crustdata?

As a Senior Data Platform Engineer at Crustdata, you will: design, build, and maintain the core data infrastructure, including the data warehouse and data lake, using modern cloud technologies (AWS, GCP, or Azure); develop and scale robust, fault-tolerant data pipelines (ETL/ELT) to ingest and process massive volumes of structured and unstructured data from diverse sources; and create the foundational platform that supports data scientists and ML engineers, including systems for feature engineering, model training, and deploying ML models into production; among other responsibilities.

Is the Senior Data Platform Engineer position at Crustdata remote?

The Senior Data Platform Engineer position at Crustdata is based in San Francisco, United States and is on-site. Contact the company through Clera for specific work arrangement details.

How do I apply for the Senior Data Platform Engineer position at Crustdata?

You can apply for the Senior Data Platform Engineer position at Crustdata directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process.
© 2026 Clera Labs, Inc.