As a Data Engineer II on our Core Engineering Data Team, you will focus on building, maintaining, and optimizing the critical data pipelines and infrastructure that power our analytical and application platforms. You will play a key role in ensuring our data is accurate, reliable, and accessible for downstream ML, Engineering, and Product teams.
Job Responsibilities:
Develop and maintain efficient and reliable ETL/ELT data pipelines from product and service data sources to our centralized analytical platforms
Implement and monitor data pipeline orchestration using tools like Airflow or Dagster to ensure timely and accurate data delivery
Collaborate with senior engineers and analysts to understand requirements and implement technical solutions for data workflows
Perform routine monitoring of pipeline performance and reliability, assisting with troubleshooting and optimizing for efficiency
Contribute to the optimization of table structures and data storage to support various query and usage patterns
Support and maintain data discovery, catalog, and analytics tooling for internal teams
Assist in implementing data security measures and ensuring compliance with data governance policies
Write and maintain data quality validation checks and unit tests to ensure the integrity and reliability of data pipelines
Participate in team-wide discussions to help develop and refine team norms and engineering best practices
Requirements:
Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent practical experience
2+ years of experience in software engineering, data engineering, or a related field
Foundational knowledge of AWS or other public cloud platforms (e.g., Azure, GCP)
Strong SQL skills and experience optimizing queries for data warehousing technologies like AWS Athena
Strong Python skills for data transformation
Experience with ETL/ELT, schema design, and data lake technologies
Familiarity with various data and table formats (JSON, Avro, Parquet, Iceberg)
Love of data and a passion for building reliable data products
Nice to have:
Experience with CI/CD pipelines, Docker, Kubernetes, and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
Familiarity with data orchestration tools like Dagster, Airflow, or Prefect
Experience in home security technology and/or consumer IoT products
Experience using and managing data analytics, data quality, and data catalog tools
Familiarity with data streaming platforms like Kafka, Kinesis, or Spark
What we offer:
A mission- and values-driven culture and a safe, inclusive environment where you can build, grow and thrive
A comprehensive total rewards package that supports your wellness and provides security for SimpliSafers and their families
Free SimpliSafe system and professional monitoring for your home
Employee Resource Groups (ERGs) that bring people together, give opportunities to network, mentor and develop, and advocate for change
Participation in our annual bonus program, equity, and other forms of compensation, in addition to a full range of medical, retirement, and lifestyle benefits