We are on the lookout for a Principal Data Engineer to help define and lead the next generation of our data platform and data capabilities. You’ll play a key role in building scalable, resilient and intelligent data systems that power real-time services, insights, products and decisions across Dotdigital. As a Principal Data Engineer, you will be instrumental in driving the architecture, development and delivery of our data platform. You will lead key initiatives, provide technical direction and collaborate with product, analytics and data science teams to ensure data value is realised across the entire ecosystem. Working across the full data lifecycle, you will help shape how data is collected, processed and consumed throughout Dotdigital.
Job Responsibilities:
Lead the design and implementation of scalable, secure and resilient data systems across streaming, batch and real-time use cases
Architect data pipelines, models and storage solutions that power analytical and product use cases, using primarily Python and SQL via orchestration tooling that runs workloads in the cloud
Leverage AI to automate both data processing and engineering workflows
Ensure and drive best practices relating to data infrastructure, governance, security and observability
Work with technologists across multiple teams to deliver coherent features and data outcomes
Support the wider data team in adopting data engineering principles
Identify, validate and promote new tools and technologies that improve the performance and stability of data services
Requirements:
Extensive experience delivering Python-based projects in the data engineering space
Extensive experience working with SQL and NoSQL database technologies (e.g. SQL Server, MongoDB & Cassandra)
Proven experience with modern data warehousing and large-scale data processing tools (e.g. Snowflake, dbt, BigQuery, ClickHouse)
Hands-on experience with data orchestration tools like Airflow, Dagster or Prefect
Experience using cloud environments (e.g. Azure, AWS, GCP) to process, store and surface large scale data
Experience using Kafka or similar event-based architectures (e.g. pub/sub via AWS SQS, Azure Event Hubs or AWS Kinesis)
Strong grasp of data architecture and data modelling principles for both OLAP and OLTP workloads
Comfortable across the wider software development lifecycle, including agile ways of working and continuous integration/deployment of data solutions
Experience as a lead or Principal Engineer on large-scale data initiatives or product builds
Demonstrated ability to architect data systems and data structures for high volume, high throughput systems
Proven experience leading data platform modernisation or cloud migration projects
Comfortable taking ownership of difficult data problems and driving them to resolution
Nice to have:
Experience using ClickHouse as part of a data pipeline and analytics solution
Experience using Databricks or similar data platforms