We are looking for a senior-level Data Engineer to shape and deliver a scalable data platform in Kalamazoo, Michigan. This role combines strategic architecture with hands-on engineering, creating reliable data products that support reporting, advanced analytics, and AI-driven solutions. The ideal candidate will build secure, multi-tenant data capabilities with strong attention to privacy, governance, and long-term platform quality.
Job Responsibilities:
Lead the design of a modern data platform that supports ingestion, transformation, storage, and consumption across analytical and operational use cases
Build and maintain robust batch and streaming pipelines that move data from relational systems, object storage, document databases, and event sources into centralized platforms
Define data architecture standards, modeling approaches, and engineering practices that improve consistency, reliability, and scalability across the organization
Create multi-tenant data solutions with strong isolation controls, secure access patterns, and governance measures built into the platform design
Develop data models and serving layers that enable enterprise reporting, self-service analytics, and AI or machine learning workloads
Evaluate cloud-based data services, processing frameworks, and warehouse technologies to ensure the platform meets performance, cost, and security expectations
Partner with product, engineering, and leadership teams to explain technical decisions, highlight risks, and align platform investments with business priorities
Oversee external vendors and implementation partners by reviewing recommendations, challenging misaligned approaches, and enforcing internal data standards
Requirements:
10+ years of experience in data engineering, including ownership of large-scale architecture decisions in production environments
Strong expertise with Python and modern data processing technologies such as Apache Spark, Kafka, Hadoop, or similar platforms
Proven experience designing both batch and real-time data pipelines for cloud-based ecosystems, including AWS services and enterprise data warehouses
Deep understanding of data modeling for transactional and analytical workloads, including multi-tenant design principles and access control strategies
Hands-on experience with platforms such as Snowflake and Redshift, as well as graph databases and document-oriented data stores
Solid knowledge of data governance, privacy requirements, and secure handling of sensitive information in shared environments
Experience with ETL and transformation frameworks, including testing, documentation, and deployment practices within CI/CD workflows
Demonstrated ability to communicate architecture strategy and technical tradeoffs clearly to senior leaders, business stakeholders, and external partners
What we offer:
Medical, vision, and dental insurance, plus life and disability coverage