At Solace, we are building the data foundation that will power patient outcomes. We are a small but growing team where everyone wears many hats and moves fast. We are looking for a Data Engineer who loves solving hard problems with clean, maintainable code and thrives in this high-growth startup environment. In this role, you will architect the infrastructure that allows us to scale. You will be a core builder of our data platform, establishing the frameworks, standards, and best practices that will define our engineering culture for years to come. From raw data ingestion to complex modeling, your work will ensure that our data is not just available, but trusted, reliable, and ready for action.
Job Responsibilities:
Architect Robust Pipelines: design, build, and optimize scalable data pipelines using Airflow, Python, dbt, and Snowflake; replace brittle manual processes with resilient, automated workflows
Build Infrastructure as Code: manage and evolve cloud infrastructure (AWS/GCP) using Terraform, ensuring the platform is reproducible, secure, and scalable
Elevate Code Quality: write clean, production-grade code for complex data processing; champion engineering best practices, including code reviews, testing, and CI/CD
Optimize Data Models: collaborate with analysts to design performant SQL transformations and data models in Snowflake
Ensure Data Reliability: implement observability and monitoring to catch issues before they impact stakeholders; serve as the first line of defense for data quality
Partner Cross-Functionally: work closely with Data Analysts and Product Managers to understand their data needs and deliver high-quality data products that empower decision-making
Requirements:
Strong Python Proficiency: comfortable writing modular, testable, and efficient Python code for data processing and automation
Advanced SQL & Snowflake: deep expertise in SQL and cloud data warehousing (Snowflake preferred), with an understanding of how to optimize queries for performance and cost
Orchestration Mastery: proven experience building and maintaining complex workflows using Airflow (or similar tools)
Infrastructure Mindset: familiarity with Terraform and cloud services (AWS or GCP), with an understanding of how to provision and manage resources
Security & Stewardship: experience properly handling PHI and PII, implementing secure access controls (RBAC), and adhering to strict governance standards
Startup DNA: a self-starter who is comfortable with ambiguity, takes ownership of problems, and is willing to wear many hats
Communication Skills: ability to translate complex technical challenges into clear options for non-technical stakeholders
Applicants must be based in the United States
Nice to have:
dbt Expertise: experience using dbt to manage transformations and implement testing/documentation standards
Healthcare Background: experience working with healthcare data standards or strictly regulated (HIPAA) environments
Containerization: experience with Docker and Kubernetes for deploying data applications