As a Senior Data Engineer, you will play a pivotal role in transforming data into actionable insights. You will collaborate with our dynamic team of technologists to develop cutting-edge data solutions that drive innovation and fuel business growth. Your responsibilities will include managing complex data structures and delivering scalable, efficient data solutions, and your expertise in data engineering will be crucial in optimizing our data-driven decision-making processes. If you're passionate about leveraging data to make a tangible impact, we welcome you to join us in shaping the future of our organization.
Job Responsibility:
contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
build tools and services to support data discovery, lineage, governance, and privacy
collaborate with other software/data engineers and cross-functional teams
collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
ensure high operational efficiency and quality of the Core Data platform datasets so that our solutions meet SLAs and provide reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
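The responsibilities above revolve around maintaining pipelines and enforcing dataset quality. As a rough, hypothetical sketch (the record shape, field names, and quality threshold are all invented for illustration, not taken from this posting), a minimal transform-and-validate step in Python might look like:

```python
from dataclasses import dataclass


@dataclass
class Record:
    user_id: int
    amount: float


def transform(rows):
    """Drop rows missing a user_id and normalize amounts (hypothetical rule)."""
    return [
        Record(r["user_id"], round(float(r["amount"]), 2))
        for r in rows
        if r.get("user_id") is not None
    ]


def validate(records, min_rows=1):
    """Simple quality gate: fail fast on an empty batch (an SLA-style check)."""
    if len(records) < min_rows:
        raise ValueError("batch failed quality check")
    return records


raw = [{"user_id": 1, "amount": "10.2"}, {"user_id": None, "amount": "3"}]
clean = validate(transform(raw))
```

In a real pipeline the transform and the quality gate would typically run as separate, observable stages so that a failed check surfaces in monitoring before downstream consumers see the data.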
Requirements:
7+ years of data engineering experience developing large data pipelines
proficiency in at least one major programming language (e.g., Python, Java, Scala)
strong SQL skills and ability to create queries to analyze complex datasets
hands-on production environment experience with distributed processing systems such as Spark
hands-on production experience creating and maintaining pipelines with orchestration systems such as Airflow
experience with Databricks
deep understanding of AWS or other cloud providers as well as infrastructure as code
familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices
strong algorithmic problem-solving expertise
excellent written and verbal communication
advanced understanding of OLTP vs OLAP environments
willingness and ability to learn and pick up new skill sets
self-starting problem solver with an eye for detail and excellent analytical skills
strong background in at least one of the following: distributed data processing, software engineering of data services, or data modeling
familiar with Scrum and Agile methodologies
bachelor’s degree in Computer Science or Information Systems, or equivalent industry experience
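Several of the requirements above touch on SQL skills and the OLTP-vs-OLAP distinction. As an illustrative sketch only (the table and column names are invented, and SQLite stands in for a warehouse purely for portability), an OLAP-style aggregation over transactional rows might look like:

```python
import sqlite3

# In-memory SQLite database standing in for a warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("EU", 10.0), ("EU", 5.0), ("US", 7.5)],
)

# OLTP workloads read and write individual rows; OLAP-style queries scan and
# aggregate across many rows, as this GROUP BY does.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders "
    "GROUP BY region ORDER BY region"
).fetchall()
```

The same query shape carries over to warehouse engines such as Snowflake or Databricks SQL, which are designed to run these scan-heavy aggregations efficiently at scale.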
Nice to have:
experience with Snowflake
What we offer:
medical, vision, dental, and life and disability insurance