Enterprise Data Warehouse (EDW) Engineer

Yotta Tech Ports

Location:
Glen Allen, United States

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a Mid-Level Enterprise Data Warehouse (EDW) Engineer with a passion for building scalable, cloud-native data solutions. This position is ideal for individuals who excel in collaborative environments, communicate effectively, and approach challenges with a problem-solving mindset, not just those who follow instructions.

Job Responsibility:

  • Design, develop, and maintain high-performance data pipelines and ETL/ELT workflows for the enterprise data warehouse
  • Work with cloud-based data warehouse platforms like Snowflake, BigQuery, or Redshift to optimize data storage and retrieval
  • Write clean, efficient, and maintainable SQL and Python code for data transformation and automation tasks
  • Implement and manage CI/CD pipelines for data workflows using tools like Git, Jenkins, or GitHub Actions
  • Leverage orchestration tools (e.g., Apache Airflow, dbt Cloud, Prefect) to schedule and monitor data workflows
  • Conduct detailed data analysis between current and target systems, and prepare mapping documentation
  • Collaborate with data analysts, scientists, and business teams to generate actionable insights
  • Proactively identify and address data quality issues and performance bottlenecks
  • Contribute to data architecture decisions and establish best practices

Requirements:

  • 4-15 years of experience in Data Engineering or EDW Development
  • Strong hands-on experience with Snowflake, BigQuery, or Redshift
  • Expertise in SQL and Python
  • Working knowledge of CI/CD tools such as Git, Jenkins, and GitHub Actions
  • Experience with workflow orchestration tools like Airflow and Prefect
  • Ability to analyze large datasets and present findings in a business context
  • Excellent communication and teamwork skills
  • A proactive, solution-oriented mindset with strong ownership and accountability

Nice to have:

  • Experience in data modeling and architecture
  • Understanding of data governance, security, and compliance best practices
  • Familiarity with modern data stack tools such as dbt, Fivetran, or Looker
  • Experience with large-scale enterprise data warehouse implementations

Additional Information:

Job Posted:
February 13, 2026

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Enterprise Data Warehouse (EDW) Engineer

Qlik Data Engineer

This position is NOT eligible for visa sponsorship. This role will specialize in...
Location:
Easton, United States
Salary:
Not provided
Company:
Victaulic
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Information Systems, or related technical field
  • 4+ years of experience in enterprise data integration with at least 2 years of hands-on Qlik or Talend experience
  • Strong understanding of change data capture (CDC) technologies and real-time data streaming concepts
  • Strong understanding of data lake and data warehouse strategies, and data modelling
  • Advanced SQL skills with expertise in database replication, synchronization, and performance tuning
  • Experience with enterprise ETL/ELT tools and data integration patterns
  • Proficiency in at least one programming language (Java, Python, or SQL scripting)
Job Responsibility:
  • Develop and maintain ETL/ELT data pipelines leveraging Qlik Data Integration for data warehouse generation in bronze, silver, gold layers
  • Build consumer facing datamarts, views, and push-down calculations to enable improved analytics by BI team and Citizen Developers
  • Implement enterprise data integration patterns supporting batch, real-time, and hybrid processing requirements
  • Coordinate execution of and monitor pipelines to ensure timely reload of EDW
  • Configure and manage Qlik Data Integration components including pipeline projects, lineage, data catalog, data quality, and data marketplace
  • Implement data quality rules and monitoring using Qlik and Talend tools
  • Manage the Qlik tenant, security, and access, and administer the Data Movement Gateway
  • Monitor and optimize data replication performance, latency, and throughput across all integration points
  • Implement comprehensive logging, alerting, and performance monitoring
  • Conduct regular performance audits and capacity planning for integration infrastructure

Specialist Data and Artificial Intelligence

Reporting into the Applications Manager, the Data & AI Specialist plays a critic...
Location:
Perth, Australia
Salary:
Not provided
Company:
PLS
Expiration Date:
Until further notice
Requirements:
  • A bachelor’s or master’s degree in a relevant field such as data science, statistics, or business analytics
  • Bachelor’s Degree Required
  • Microsoft Power BI Certification: DA-100 | PL-300
  • Snowflake certification “SnowPro Core COF-C02”
  • Power BI Administration & Governance – Experience managing an enterprise Power BI tenant, including security, licensing, and workspace management
  • Data Modeling & DAX Expertise – Strong proficiency in DAX, Power Query (M), and SQL for creating efficient and scalable data models
  • Enterprise Data Warehouse (EDW) Integration – Experience working with enterprise data warehouses and optimizing Power BI connections for performance
  • Data Governance & Security – Knowledge of row-level security (RLS), object-level security (OLS), and role-based access control (RBAC) in Power BI and data platforms
  • Performance Optimization – Ability to diagnose and optimize Power BI reports, data models, and refresh schedules to ensure efficiency
  • Stakeholder Engagement – Experience working with business users, data engineers, and IT teams to define reporting requirements and ensure data accessibility
Job Responsibility:
  • Power BI Administration & Governance: Manage and optimize the enterprise Power BI tenant, including licensing, security, and performance tuning
  • Oversee workspace and dataset management, ensuring efficient and scalable data models
  • Implement and enforce Power BI governance policies, including role-based access control (RBAC) and row-level security (RLS)
  • Monitor and troubleshoot Power BI service performance, refresh failures, and user access issues
  • AI & Advanced Analytics: Identify and implement opportunities to leverage artificial intelligence and machine learning for business insights and predictive analytics
  • Collaborate with data scientists and engineers to deploy AI/ML models into production environments
  • Explore and evaluate emerging AI technologies and their applicability to business challenges
  • Support the development of AI-driven dashboards and automated insights within Power BI
  • Audit the effectiveness of AI technologies and optimise AI/ML models
  • Review and apply PLS AI Governance and Standards
What we offer:
  • 18 weeks parental leave for primary carers and 4 weeks for secondary carers
  • Health and wellbeing allowance
  • Annual short-term incentive bonus that recognises individual and business performance
  • PLS employee share scheme
  • Novated leasing through salary sacrifice
  • Newly refurbished facilities at Pilgangoora including gym, tennis, pickleball and volleyball courts, sports oval, and scenic walking tracks
  • Paid community leave
  • Monthly employee recognition awards
  • Access to PLS’ KidsCo School Holiday Program
  • Full-time

Data Architect

We are seeking a highly skilled Data Architect to join our Enterprise Transforma...
Location:
Sofia, Bulgaria
Salary:
Not provided
Company:
HotSchedules Corporate
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
  • 5+ years in data architecture, database administration, or data engineering, with a focus on enterprise and business systems
  • Strong knowledge of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra)
  • Experience with cloud platforms (e.g., AWS, Azure, GCP – BigQuery, Redshift, Snowflake)
  • Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, dbt, Informatica, Talend, Azure Data Factory, Synapse)
  • Strong knowledge of APIs and automation tools (e.g., Make, Zapier, MS Power Automate)
  • Familiarity with ERP, CRM, and HRIS integrations
  • Programming skills in Python, Java, or Scala
  • Deep understanding of data governance, master data management, and security/compliance (especially GDPR)
  • Excellent analytical, problem-solving, and communication skills
Job Responsibility:
  • Design, develop, and maintain the organization’s overall data architecture to support enterprise‑wide business applications, internal reporting, and analytics
  • Create and manage conceptual, logical, and physical data models for organizational data domains (HR, Finance, Sales, Operations, etc.)
  • Define and implement data governance policies, standards, and best practices across the enterprise
  • Oversee ETL/ELT processes and pipelines for integrating data from diverse business systems (ERP, CRM, HRIS, etc.)
  • Collaborate with internal stakeholders (business teams, IT, data engineers) to align data initiatives with organizational objectives
  • Optimize performance, cost, and scalability of data warehouses and internal reporting systems
  • Evaluate and recommend tools and platforms to enhance internal data and business application efficiency
  • Ensure compliance with GDPR and other relevant data security/privacy regulations
  • Responsible for the successful design and execution of data-related programs and projects
What we offer:
  • 25+ days off, as well as birthday day off and 4 charity days off per year
  • Flexible start and end of the working day, and a hybrid working mode combining remote and in-office work
  • Team-centric atmosphere
  • Encouraging healthy lifestyle and work-life balance including supplemental health insurance
  • New parents bonus scheme
  • Full-time

Data Architect

We are seeking a highly skilled and experienced Data Architect to join our team....
Location:
Warsaw, Poland
Salary:
Not provided
Company:
Inetum
Expiration Date:
Until further notice
Requirements:
  • 6+ years of professional experience in data-centric roles
  • Proven experience in a technical leadership position such as Solution Architect, Technical Lead, or Principal Engineer
  • Expert-level knowledge of Snowflake and Azure Data Factory (ADF)
  • Deep experience with SQL Server
  • Familiarity with Informatica and IBM DB2
  • Hands-on experience with Control-M
  • Experience with SAP Business Objects and Power BI
  • Expert-level SQL
  • Proficiency in Python
  • Experience with XML and JSON data formats
Job Responsibility:
  • Design and document end-to-end solutions for new data integrations, major enhancements, and complex data products
  • Contribute to the technical roadmap for the Enterprise Data Warehouse (EDW)
  • Define and enforce data engineering and development standards across the program
  • Serve as the final technical approver for all significant code changes, pull requests, and architectural modifications
  • Act as the highest point of technical escalation for critical P1/P2 incidents and complex problems
  • Lead technical investigation during major outages
  • Drive technical resolution for systemic issues
  • Mentor Data Engineers and Data Visualization Engineers
  • Champion creation and maintenance of technical documentation
  • Work closely with client architects, source system owners, and other vendor teams
What we offer:
  • Flexible working hours
  • Hybrid work model
  • Cafeteria system
  • Referral bonuses up to PLN6,000
  • Additional revenue sharing opportunities
  • Ongoing guidance from dedicated Team Manager
  • Tailored technical mentoring
  • Dedicated team-building budget
  • Opportunities to participate in charitable initiatives and local sports programs
  • Supportive and inclusive work culture
  • Full-time

Software Engineer, Backend & Data

The Software Engineer, Backend & Data will play a critical role in building and ...
Location:
United States
Salary:
160,000 - 200,000 USD / year
Company:
Epic Kids
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree or higher in Computer Science, Software Engineering, or a related field
  • Strong experience working with databases and advanced SQL skills
  • Experience with HiveQL and Spark SQL
  • Proficiency in at least one programming language: Python, Scala, or Java
  • Working knowledge of big data technologies such as Hadoop, HDFS, Hive, Spark, Flink, HBase, or similar systems
  • Solid understanding of enterprise data warehouse (EDW) design principles
  • Experience with Kimball dimensional modeling, including fact tables, dimension tables, and star schemas
  • Strong analytical thinking and attention to detail
  • High sense of ownership, accountability, and responsibility
  • Excellent problem-solving skills and the ability to work effectively in evolving systems
Job Responsibility:
  • Design, develop, and maintain Epic’s core backend systems and services
  • Own key components end-to-end, including requirements analysis, system design, implementation, testing, and performance optimization
  • Collaborate with product managers, frontend engineers, and business stakeholders to deliver scalable and reliable solutions
  • Contribute to the design and development of Epic’s enterprise data warehouse (EDW)
  • Build, optimize, and maintain data pipelines to ensure high data quality, reliability, and performance
  • Support reporting, analytics, and research use cases by ensuring data is well-modeled and accessible
  • Partner with analytics and business teams to translate data needs into technical solutions
  • Assist with data monitoring, governance, and best practices
  • Create and maintain clear technical documentation for backend systems, data models, and pipelines
  • Collaborate effectively across time zones in a remote, global engineering environment
  • Full-time

Software Developer

Location:
United States, multiple locations
Salary:
Not provided
Company:
DataEdge
Expiration Date:
August 18, 2026
Requirements:
  • Bachelor's degree in Computer Science, Software Engineering, Information Technology, or an equivalent field
Job Responsibility:
  • Develop and test software applications using data warehousing technologies such as SAP Business Objects XI, Tableau, Web Intelligence, Universe Designer, Crystal Reports, SQL, PL/SQL, UNIX, and Tidal Enterprise Scheduler
  • Create and refresh quarterly dashboards using SAP Business Objects Xcelsius and Live Office
  • Use Crystal Reports XI Enterprise to design front-end reports
  • Create classes, subclasses, and objects, including complex measure objects, according to business requirements from the Universe schemas and database tables
  • Modify existing Universes by merging classes and resolving loops and chasm traps using contexts
  • Create and review the conceptual model for the EDW (Enterprise Data Warehouse) with business users