Lead Data Quality Engineer

European Bank for Reconstruction and Development

Location: Bulgaria, Sofia
Category: IT - Software Development
Contract Type: Not provided
Salary: Not provided

Job Description:

Are you ready to lead the quality frontier in data and analytics? EBRD is seeking a strategic, hands-off QE leader to set vision, drive standards, and empower a high-performing team. Based in Sofia, you’ll shape the future of quality engineering across next-generation data platforms, collaborating with international experts and influencing business-critical outcomes.

Job Responsibility:

  • Set and Evolve QE Strategy: Define and continuously improve the quality engineering strategy, frameworks, and standards for data and analytics platforms, ensuring alignment with business goals and regulatory frameworks (NIST CSF, DORA, ITIL v4)
  • Leadership & Mentorship: Mentor and develop QE leads and engineers, fostering a culture of quality, innovation, and continuous learning. Act as a role model for hands-off, empowering leadership
  • Stakeholder Engagement: Represent QE in leadership forums at practice and capability level, influencing technology and business decisions, and ensuring quality is embedded at every stage of the technology lifecycle
  • Quality Governance: Oversee quality metrics, reporting, and continuous improvement initiatives across the data and analytics capability. Champion high-automation, AI/ML-driven assurance, and best practices in shift-left/shift-right quality
  • Operational Excellence: Integrate Agile and DevOps principles across data integration pipelines, aligning with ITIL v4 and DataOps for quality-centric operational delivery
  • Resilience & Compliance: Ensure resilience and failover validation scenarios are in place for critical dataflows, and that all processes meet compliance and security standards

Requirements:

  • Proven experience in quality engineering management and operations within an agile, product-focused IT department, ideally in a financial institution
  • Demonstrable leadership in advanced automation, performance, and modern testing practices (shift-left/right, AI/ML-driven assurance)
  • Understanding of data and analytics platforms (Microsoft Fabric, Azure Data Factory, Informatica, SAP HANA, Azure Databricks, Power BI)
  • Strong knowledge of CI/CD, DevOps, and DataOps principles
  • Familiarity with regulatory frameworks (NIST CSF, DORA, ITIL v4)
  • ISTQB Advanced Test Manager or equivalent recognised certification in test management
  • Qualification in IT Service Management (ITIL v3/v4 Foundation or equivalent)
  • Excellent communication and stakeholder management skills
  • Fluency in English required

Nice to have:

Bulgarian language skills are a plus but not mandatory

What we offer:
  • Competitive salary and annual bonus
  • Private health and dental insurance
  • Flexible/hybrid working (minimum 3 days in the office)
  • Generous annual leave and wellbeing support
  • Professional development budget
  • International conferences
  • Clear progression opportunities
  • Inclusive workplace culture

Additional Information:

Job Posted: December 25, 2025
Expiration: December 31, 2025
Employment Type: Fulltime
Work Type: Hybrid work

Similar Jobs for Lead Data Quality Engineer

Senior AWS Data Engineer / Data Platform Engineer

We are seeking a highly experienced Senior AWS Data Engineer to design, build, a...

Location: United Arab Emirates, Dubai
Salary: Not provided
Company: NorthBay
Expiration Date: Until further notice

Requirements:
  • 8+ years of experience in data engineering and data platform development
  • Strong hands-on experience with: AWS Glue, Amazon EMR (Spark), AWS Lambda, Apache Airflow (MWAA), Amazon EC2, Amazon CloudWatch, Amazon Redshift, Amazon DynamoDB, AWS DataZone

Job Responsibility:
  • Design, develop, and optimize scalable data pipelines using AWS native services
  • Lead the implementation of batch and near-real-time data processing solutions
  • Architect and manage data ingestion, transformation, and storage layers
  • Build and maintain ETL/ELT workflows using AWS Glue and Apache Spark on EMR
  • Orchestrate complex data workflows using Apache Airflow (MWAA)
  • Develop and manage serverless data processing using AWS Lambda
  • Design and optimize data warehouses using Amazon Redshift
  • Implement and manage NoSQL data models using Amazon DynamoDB
  • Utilize AWS DataZone for data governance, cataloging, and access management
  • Monitor, log, and troubleshoot data pipelines using Amazon CloudWatch

Employment Type: Fulltime

Healthcare Data Quality Lead

Seeking a seasoned data quality lead to drive healthcare data governance and qua...

Location: United States, Philadelphia
Salary: 145000.00 - 155000.00 USD / Year
Company: Beacon Hill
Expiration Date: Until further notice

Requirements:
  • 10+ years of data quality experience in healthcare
  • Expertise in Collibra Data Intelligence Platform and/or Informatica DQ
  • Strong technofunctional and analytical skills
  • SQL knowledge and prior ETL experience preferred
  • Ability to manage stakeholders and deliver scalable solutions

Job Responsibility:
  • Drive healthcare data governance and quality initiatives
  • Bridge business, engineering, and project teams to ensure accurate data quality implementation using Collibra
  • Review functional requirements
  • Design and deploy data quality rules
  • Eliminate redundancies
  • Prepare dashboards for leadership

Employment Type: Fulltime

Lead Data Engineer

We're looking for a Lead Data Engineer to build the data infrastructure that pow...

Location: United States
Salary: 185000.00 - 225000.00 USD / Year
Company: Zora
Expiration Date: Until further notice

Requirements:
  • 7+ years of experience in data engineering, with at least 2 years in a technical leadership role
  • Strong proficiency in Python and SQL for building production data pipelines, complex data transformations, evolving data platforms, shared infrastructure, and internal tooling with engineering best practices
  • Strong experience designing, building, and maintaining cloud-based data pipelines using orchestration tools such as Airflow, Dagster, Prefect, Temporal, or similar
  • Hands-on experience with cloud data platforms (AWS, GCP, or Azure) and modern data stack tools
  • Deep understanding of data warehousing concepts and experience with platforms like Snowflake, BigQuery, Redshift, or similar
  • Strong software engineering fundamentals including testing, CI/CD, version control, and writing maintainable, documented code
  • Track record of optimizing data systems for performance, reliability, and cost efficiency at scale
  • Excellent communication skills and ability to collaborate with cross-functional teams including product, engineering, and design

Job Responsibility:
  • Design and build scalable data pipelines to ingest, process, and transform blockchain data, trading events, user activity, and market signals at high volume and low latency
  • Architect and maintain data infrastructure that powers real-time trading analytics, P&L calculations, leaderboards, market cap tracking, and liquidity monitoring across the platform
  • Own ETL/ELT processes that transform raw onchain data from multiple blockchains into clean, reliable, and performant datasets used by product, engineering, analytics, and ML teams
  • Build and optimize data models and schemas that support both operational systems (serving live trading data) and analytical use cases (understanding market dynamics and user behavior)
  • Establish data quality frameworks including monitoring, alerting, testing, and validation to ensure pipeline reliability and data accuracy at scale
  • Collaborate with backend engineers to design event schemas, data contracts, and APIs that enable real-time data flow between systems
  • Partner with product and analytics teams to understand data needs and translate them into robust engineering solutions
  • Provide technical leadership by mentoring engineers, conducting code reviews, establishing best practices, and driving architectural decisions for the data platform
  • Optimize performance and costs of data infrastructure as we scale to handle exponentially growing trading volumes

What we offer:
  • Remote-First Culture: Work from anywhere in the world!
  • Competitive Compensation: Including salary, pre-IPO stock options, token compensation, and additional financial incentives
  • Comprehensive Benefits: Robust healthcare options, including fully covered medical, dental, and vision for employees
  • Retirement Contributions: Up to 4% employer match on your 401(k) contributions
  • Health & Wellness: Free memberships to One Medical, Teladoc, and Health Advocate
  • Unlimited Time Off: Flexible vacation policies, company holidays, and recharge weeks to prioritize wellness
  • Home Office Reimbursement: To cover home office items, monthly home internet, and monthly cell phone (if applicable)
  • Ease of Life Reimbursement: To cover everything from an Uber home in the rain, childcare, or meal delivery
  • Career Development: Access to mentorship, training, and opportunities to grow your career
  • Inclusive Environment: A culture dedicated to diversity, equity, inclusion, and belonging

Employment Type: Fulltime

Lead Data Engineer

As a Lead Data Engineer at Rearc, you'll play a pivotal role in establishing and...

Location: United States
Salary: Not provided
Company: Rearc
Expiration Date: Until further notice

Requirements:
  • 10+ years of experience in data engineering, data architecture, or related technical fields
  • Proven ability to design, build, and optimize large-scale data ecosystems
  • Strong track record of leading complex data engineering initiatives
  • Deep hands-on expertise in ETL/ELT design, data warehousing, and data modeling
  • Extensive experience with data integration frameworks and best practices
  • Advanced knowledge of cloud-based data services and architectures (AWS Redshift, Azure Synapse Analytics, Google BigQuery, or equivalent)
  • Strong strategic and analytical thinking
  • Proficiency with modern data engineering frameworks (Databricks, Spark, lakehouse technologies like Delta Lake)
  • Exceptional communication and interpersonal skills

Job Responsibility:
  • Engage deeply with stakeholders to understand data needs, business challenges, and technical constraints
  • Translate stakeholder needs into scalable, high-quality data solutions
  • Implement with a DataOps mindset using tools like Apache Airflow, Databricks/Spark, Kafka
  • Build reliable, automated, and efficient data pipelines and architectures
  • Lead and execute complex projects
  • Provide technical direction and set engineering standards
  • Ensure alignment with customer goals and company principles
  • Mentor and develop data engineers
  • Promote knowledge sharing and thought leadership
  • Contribute to internal and external content

What we offer:
  • Comprehensive health benefits
  • Generous time away and flexible PTO
  • Maternity and paternity leave
  • Access to educational resources with reimbursement for continued learning
  • 401(k) plan with company contribution

Collibra Data Quality Lead

Client is looking for someone with very good technical hands-on experience...

Location: United States, Pittsburgh
Salary: 60.00 USD / Hour
Company: Realign
Expiration Date: Until further notice

Requirements:
  • Strong understanding of Data Quality and Data Governance
  • Proficiency in data profiling, data cleansing, and data validation techniques
  • Strong Collibra development experience
  • Must be able to set up rules using Collibra DQ
  • Must be able to set up workflows in Collibra DIP
  • Strong experience in SQL
  • Strong experience with ETL processes and good knowledge of any ETL tool
  • Good experience with Unix scripting
  • Experience working with databases such as Oracle, Vertica, Snowflake, etc.
  • Good experience leading a team

Data Engineering & Analytics Lead

Premium Health is seeking a highly skilled, hands-on Data Engineering & Analytic...

Location: United States, Brooklyn
Salary: Not provided
Company: Premium Health
Expiration Date: Until further notice

Requirements:
  • Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred
  • Proven track record and progressively responsible experience in data engineering, data architecture, or related technical roles; healthcare experience preferred
  • Strong knowledge of data engineering principles, data integration, ETL processes, and semantic mapping techniques and best practices
  • Experience implementing data quality management processes, data governance frameworks, cataloging, and master data management concepts
  • Familiarity with healthcare data standards (e.g., HL7, FHIR), health information management principles, and regulatory requirements (e.g., HIPAA)
  • Understanding of healthcare data, including clinical, operational, and financial data models, preferred
  • Advanced proficiency in SQL, data modeling, database design, optimization, and performance tuning
  • Experience designing and integrating data from disparate systems into harmonized data models or semantic layers
  • Hands-on experience with modern cloud-based data platforms (e.g., Azure, AWS, GCP)

Job Responsibility:
  • Collaborate with the CDIO and Director of Technology to define a clear data vision aligned with the organization's goals and execute the enterprise data roadmap
  • Serve as a thought leader for data engineering and analytics, guiding the evolution of our data ecosystem and championing data-driven decision-making across the organization
  • Build and mentor a small data team, providing technical direction and performance feedback, fostering best practices and continuous learning, while remaining a hands-on implementor
  • Define and implement best practices, standards, and processes for data engineering, analytics, and data management across the organization
  • Design, implement, and maintain a scalable, reliable, and high-performing modern data infrastructure, aligned with the organizational needs and industry best practices
  • Architect and maintain data lake/lakehouse, warehouse, and related platform components to support analytics, reporting, and operational use cases
  • Establish and enforce data architecture standards, governance models, naming conventions, and documentation
  • Develop, optimize, and maintain scalable ETL/ELT pipelines and data workflows to collect, transform, normalize, and integrate data from diverse systems
  • Implement robust data quality processes, validation, monitoring, and error-handling frameworks
  • Ensure data is accurate, timely, secure, and ready for self-service analytics and downstream applications

What we offer:
  • Paid Time Off, Medical, Dental and Vision plans, Retirement plans
  • Public Service Loan Forgiveness (PSLF)

Employment Type: Fulltime

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...

Location: India (Chennai, Madurai, Coimbatore)
Salary: Not provided
Company: OptiSol Business Solutions
Expiration Date: Until further notice

Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams

Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations

What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility

Employment Type: Fulltime

Quality Engineer - AI and Data Platforms

This is a pioneering Quality Engineer role at the intersection of data engineeri...

Location: United Kingdom, Manchester
Salary: 44000.00 - 66000.00 GBP / Year
Company: Matillion
Expiration Date: Until further notice

Requirements:
  • Solid foundation in data engineering, including SQL, ETL/ELT design, and specific experience building data pipelines and managing data movement
  • Strong practical AI experience: you have used, experimented with, and are an advocate for an AI-first approach to quality engineering
  • Proficiency in coding in Java or JavaScript to navigate the codebase and implement quality frameworks
  • Demonstrated autonomy, curiosity, and problem-solving skills, with a willingness to look at challenges in a different way and ask for assistance as needed
  • Experience in managing end-to-end testing of SaaS applications, including developing and maintaining efficient test automation tooling

Job Responsibility:
  • Leveraging AI and agentic solutions, including our agentic AI product Maia, to accelerate investigation, generate test cases, and increase quality assurance across the Data Productivity Cloud
  • Performing root cause analysis on pipeline stability issues, particularly identifying why DPC pipelines run out of memory (OOM) within the agents
  • Building pipelines to automate every process, solutionizing problems to increase overall team and product productivity
  • Acting as a crucial bridge by collaborating extensively with various teams, raising problems, and ensuring that fixes are implemented effectively
  • Adopting, implementing, and championing shift-left testing practices across the team, leading an automation-first approach

What we offer:
  • Company Equity
  • 30 days holiday + bank holidays
  • 5 days paid volunteering leave
  • Health insurance
  • Life Insurance
  • Pension
  • Access to mental health support

Employment Type: Fulltime