ETL Engineer

Daxko

Location:
United States, Birmingham

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We’re looking for a curious, detail-oriented ETL Engineer who enjoys turning complex data into reliable, actionable insights. In this role, you’ll design, maintain, and improve data integration workflows that power our internal tools and client solutions. You’ll collaborate closely with fellow ETL engineers and cross-functional teams, playing a key role in ensuring data accuracy, reliability, and performance. If you enjoy problem-solving, automation, and continuously learning new technologies, this role offers an excellent opportunity to grow your ETL and engineering skill set.

Job Responsibility:

  • Maintain and enhance custom internal automation tools and ETL processes using PHP, C#, and SSIS
  • Build, test, and troubleshoot custom scripts to manipulate and load customer data with minimal supervision
  • Perform root cause analysis on data issues and implement long-term, scalable solutions
  • Monitor ETL workflows, troubleshoot failures, and ensure high data quality standards
  • Communicate data risks and issues clearly and effectively to internal teams and external customers
  • Consistently meet or exceed KPIs and performance expectations
  • Conduct quality assurance reviews and provide constructive peer feedback on code and scripts
  • Gain a strong understanding of the full implementation lifecycle to support automation and data issue resolution
  • Support billable development and consultation projects across teams as needed
  • Manage service records for small to mid-sized customers

Requirements:

  • Functional knowledge of MySQL, SQL Server, and custom script development
  • Ability to analyze table structures and design custom solutions for data challenges
  • Strong analytical and problem-solving skills
  • Clear communicator who can explain complex technical concepts to audiences of varying experience
  • Self-motivated, organized, and comfortable working independently or collaboratively
  • Strong attention to detail with the ability to prioritize effectively
  • Professional, team-oriented mindset with a positive attitude
  • Associate’s degree or equivalent professional experience in ETL or software development
  • 1–2 years of experience with SQL and ETL scripting, higher-level programming languages (PHP or C# preferred), and building stored procedures or routines for repeatable processes
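The stored-procedure requirement above is about repeatable, safely re-runnable load routines. As a rough sketch only — the table names and columns are invented, and Python's built-in sqlite3 stands in for MySQL/SQL Server so the example is self-contained; none of this reflects Daxko's actual stack — an idempotent load routine might look like:

```python
import sqlite3

# Hypothetical example of a repeatable (idempotent) load routine, the kind
# of pattern a stored procedure would encapsulate. sqlite3 substitutes for
# MySQL/SQL Server purely to keep the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, email TEXT)")
conn.execute("CREATE TABLE members (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany("INSERT INTO staging VALUES (?, ?)",
                 [(1, "A@Example.com"), (2, "b@example.com")])

def load_members(conn):
    """Repeatable routine: safe to re-run without duplicating rows."""
    conn.execute("""
        INSERT INTO members (id, email)
        SELECT id, LOWER(TRIM(email)) FROM staging
        WHERE true
        ON CONFLICT(id) DO UPDATE SET email = excluded.email
    """)
    conn.commit()

load_members(conn)
load_members(conn)  # second run updates in place instead of duplicating
count = conn.execute("SELECT COUNT(*) FROM members").fetchone()[0]
print(count)  # 2 rows no matter how many times the routine runs
```

The upsert (`ON CONFLICT ... DO UPDATE`) is what makes the routine safe to schedule repeatedly — a re-run after a partial failure converges to the same final state.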

Nice to have:

  • Bachelor’s degree in a related field
  • 1+ year of professional experience with MySQL or SQL Server
  • PHP and/or C#

What we offer:
  • Flexible paid time off
  • Affordable health, dental, and vision insurance options
  • Monthly fitness reimbursement
  • 401(k) matching
  • New-Parent Paid Leave
  • Casual work environments
  • Remote work

Additional Information:

Job Posted:
January 09, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for ETL Engineer

Data Engineer – Snowflake & ETL

We are seeking a Data Engineer in Hyderabad (WFO) with expertise in data enginee...
Location:
India, Hyderabad
Salary:
Not provided
Right Angle Solutions
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in data engineering, ETL, and Snowflake development
  • Proven expertise in Snowflake including SQL scripting, performance tuning, and data warehousing concepts
  • Hands-on experience with Matillion ETL for building and maintaining ETL jobs
  • Strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures
  • Proficiency in SQL, Python, or other scripting languages for automation and data transformation
  • Experience with API integrations and data ingestion frameworks
  • Knowledge of data governance, security policies, and access control within Snowflake environments
  • Excellent communication skills with the ability to engage both business and technical stakeholders
  • Self-motivated professional capable of working independently and delivering projects on time
  • Qualification: BE/BS/MTech/MS or equivalent work experience
Employment Type: Full-time

Senior Data Engineer ETL Lead

The Sr Data Engineer ETL Lead is a senior level position responsible for establi...
Location:
United States, Irving
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • 6+ years of relevant experience in Apps Development or systems analysis role
  • Extensive experience in system analysis and programming of software applications
  • Experience in managing and implementing successful projects
  • Subject Matter Expert (SME) in at least one area of Applications Development
  • Ability to adjust priorities quickly as circumstances dictate
  • Demonstrated leadership and project management skills
  • Consistently demonstrates clear and concise written and verbal communication
  • Bachelor’s/University degree in Computer Science or equivalent experience
  • Data Warehouse/ETL design and development methodologies knowledge and experience required
  • ETL expertise with the Ab Initio tool suite (EME, GDE, Co-op), bringing together components such as Unix, Oracle, and storage
Job Responsibility:
  • Partner with multiple management teams to ensure appropriate integration of functions to meet goals as well as identify and define necessary system enhancements to deploy new products and process improvements
  • Resolve variety of high impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
  • Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
  • Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals
  • Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
  • Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
  • Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency
What we offer:
  • Medical, dental & vision coverage
  • 401(k)
  • Life, accident, and disability insurance
  • Wellness programs
  • Paid time off packages including vacation, sick leave, and paid holidays
Employment Type: Full-time

Senior Data Engineer

We’re growing our team at ELEKS in partnership with a company, the UK’s largest ...
Location:
Not provided
Salary:
Not provided
ELEKS
Expiration Date:
Until further notice
Requirements:
  • 4+ years of experience in data engineering, SQL, and ETL (data validation, data mapping, exception handling)
  • 2+ years of hands-on experience with Databricks
  • Experience with Python
  • Experience with AWS (e.g. S3, Redshift, Athena, Glue, Lambda, etc.)
  • At least an Upper-Intermediate level of English
Job Responsibility:
  • Building Databases and Pipelines: Developing databases, data lakes, and data ingestion pipelines to deliver datasets for various projects
  • End-to-End Solutions: Designing, developing, and deploying comprehensive solutions for data and data science models, ensuring usability for both data scientists and non-technical users. This includes following best engineering and data science practices
  • Scalable Solutions: Developing and maintaining scalable data and machine learning solutions throughout the data lifecycle, supporting the code and infrastructure for databases, data pipelines, metadata, and code management
  • Stakeholder Engagement: Collaborating with stakeholders across various departments, including data platforms, architecture, development, and operational teams, as well as addressing data security, privacy, and third-party coordination
What we offer:
  • Close cooperation with a customer
  • Challenging tasks
  • Competence development
  • Ability to influence project technologies
  • Team of professionals
  • Dynamic environment with low level of bureaucracy

Data Engineer

We are seeking our first Data Engineer, someone who can refine our data infrastr...
Location:
United States, New York City; San Francisco
Salary:
190000.00 - 250000.00 USD / Year
Hebbia
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field
  • 5+ years of software development experience at a venture-backed startup or top technology firm, with a focus on data engineering
  • Significant hands-on experience in data engineering (ETL development, data warehousing, data lake management, etc.)
  • Adept at identifying and owning data projects end to end, with the ability to work independently and exercise sound judgment
  • Proficient in Python and SQL; comfortable working with cloud-based data stack tools
  • Familiar with big data processing frameworks (e.g., Spark, Hadoop) and data integration technologies (e.g., Airflow, DBT, or similar)
  • Experience implementing data governance, security, and compliance measures
  • Strong collaboration and communication skills, with the ability to translate business requirements into technical solutions
  • You are comfortable working in-person 5 days a week
Job Responsibility:
  • Architect, build, and maintain ETL pipelines and workflows that ensure high data quality and reliability
  • Design and manage a central data lake to consolidate data from various sources, enabling advanced analytics and reporting
  • Collaborate with cross-functional stakeholders (product, engineering, and business) to identify data gaps and develop effective solutions
  • Implement best practices in data security and governance to ensure compliance and trustworthiness
  • Evaluate and integrate new technologies, tools, and approaches to optimize data processes and architectures
  • Continuously monitor, troubleshoot, and improve data pipelines and infrastructure for performance, scalability, and cost-efficiency
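The emphasis on data quality in the responsibilities above can be sketched as a validation gate inside a pipeline step: records that fail rules are quarantined rather than silently loaded. This is an illustrative pattern only — the field names and rules are invented for the sketch, not taken from Hebbia's actual pipelines:

```python
import re

# Hypothetical data-quality gate for an ETL step. Rows failing validation
# are quarantined instead of loaded, so bad data never reaches the lake.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate(row):
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    if not isinstance(row.get("id"), int):
        errors.append("id must be an integer")
    if not EMAIL_RE.match(row.get("email", "")):
        errors.append("email is malformed")
    return errors

def run_quality_gate(rows):
    """Split incoming rows into a loadable set and a quarantined set."""
    clean, quarantined = [], []
    for row in rows:
        (clean if not validate(row) else quarantined).append(row)
    return clean, quarantined

clean, bad = run_quality_gate([
    {"id": 1, "email": "a@example.com"},
    {"id": "2", "email": "not-an-email"},  # fails both rules
])
print(len(clean), len(bad))  # 1 1
```

Quarantining (rather than dropping) failed rows keeps the pipeline reliable while preserving the evidence needed for root-cause analysis.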
What we offer:
  • PTO: Unlimited
  • Insurance: Medical + Dental + Vision + 401K + Wellness Benefits
  • Eats: Catered lunch daily + doordash dinner credit if you ever need to stay late
  • Parental leave policy: 3 months non-birthing parent, 4 months for birthing parent
  • Fertility benefits: $15k lifetime benefit
  • New hire equity grant: competitive equity package with unmatched upside potential
Employment Type: Full-time

Senior Data Engineer

Senior Data Engineer to design, develop, and optimize data platforms, pipelines,...
Location:
United States, Chicago
Salary:
160555.00 - 176610.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Master's degree in Engineering Management, Software Engineering, Computer Science, or a related technical field
  • 3 years of experience in data engineering
  • Experience building data platforms and pipelines
  • Experience with AWS, GCP or Azure
  • Experience with SQL and Python for data manipulation, transformation, and automation
  • Experience with Apache Airflow for workflow orchestration
  • Experience with data governance, data quality, data lineage and metadata management
  • Experience with real-time data ingestion tools including Pub/Sub, Kafka, or Spark
  • Experience with CI/CD pipelines for continuous deployment and delivery of data products
  • Experience maintaining technical records and system designs
Job Responsibility:
  • Design, develop, and optimize data platforms, pipelines, and governance frameworks
  • Enhance business intelligence, analytics, and AI capabilities
  • Ensure accurate data flows and push data-driven decision-making across teams
  • Write product-grade performant code for data extraction, transformations, and loading (ETL) using SQL/Python
  • Manage workflows and scheduling using Apache Airflow and build custom operators for data ETL
  • Build, deploy and maintain both inbound and outbound data pipelines to integrate diverse data sources
  • Develop and manage CI/CD pipelines to support continuous deployment of data products
  • Utilize Google Cloud Platform (GCP) tools, including BigQuery, Composer, GCS, DataStream, and Dataflow, for building scalable data systems
  • Implement real-time data ingestion solutions using GCP Pub/Sub, Kafka, or Spark
  • Develop and expose REST APIs for sharing data across teams
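The last bullet above, exposing REST APIs for sharing data across teams, can be sketched minimally with only the standard library. The endpoint path and dataset below are invented for illustration, and `http.server` stands in for whatever production framework the team actually uses:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical read-only JSON endpoint over an in-memory dataset. A real
# service would sit behind a framework, auth, and a proper data store.
DATASET = [{"course": "NR-101", "enrollments": 42}]

class DataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/v1/enrollments":
            body = json.dumps(DATASET).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), DataHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/v1/enrollments"
with urllib.request.urlopen(url) as resp:
    payload = json.loads(resp.read())
server.shutdown()
print(payload)  # [{'course': 'NR-101', 'enrollments': 42}]
```

Versioning the path (`/v1/...`) is a common convention so downstream teams can depend on a stable contract while the schema evolves.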
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Annual incentive program
Employment Type: Full-time

Tableau Data Engineer

A RDS Tableau data engineer is responsible for designing, building and maintaini...
Location:
Portugal, Porto
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • Relevant professional experience as a data engineer is mandatory (3-5 years)
  • Knowledge of ETL (Extract, Transform and Load) data integration process is mandatory
  • Proficiency in SQL is mandatory
  • Proficiency in Python is mandatory
  • Proficiency in Tableau is mandatory
  • Fluent in English
Job Responsibility:
  • Designing, building and maintaining the data infrastructure that supports Tableau-based analytics and reporting
  • Integrating data from various sources, transforming it into usable formats and creating efficient data models for analysis
  • Developing and maintaining Tableau dashboards and reports, ensuring data accuracy and collaborating with stakeholders to deliver data-driven insights
  • Work closely with business stakeholders to understand their data needs and reporting requirements
  • Support the project manager on all Tableau Server and Tableau Desktop related subjects
  • Follow-up of both design and production teams on the tool's implementation
  • Provide technical support to the development team
  • Configuration of the solution on non-production environments
  • Regular upgrade of Tableau Server version
  • Support the business lines on the handling of Tableau solutions and implementation of best practices
Employment Type: Full-time

Data Engineer

As a Data Engineer at Rearc, you'll contribute to the technical excellence of ou...
Location:
India, Bengaluru
Salary:
Not provided
Rearc
Expiration Date:
Until further notice
Requirements:
  • 2+ years of experience in data engineering, data architecture, or related fields
  • Solid track record of contributing to complex data engineering projects
  • Hands-on experience with ETL processes, data warehousing, and data modelling tools
  • Good understanding of data integration tools and best practices
  • Familiarity with cloud-based data services and technologies (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery)
  • Strong analytical skills
  • Proficiency in implementing and optimizing data pipelines using modern tools and frameworks
  • Strong communication and interpersonal skills
Job Responsibility:
  • Collaborate with Colleagues to understand customers' data requirements and challenges
  • Apply DataOps Principles to create scalable and efficient data pipelines and architectures
  • Support Data Engineering Projects
  • Promote Knowledge Sharing through technical blogs and articles

Data Engineer

We are looking for a Data Engineer with a collaborative, “can-do” attitude who i...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Engineering, Computer Science or related discipline, Master’s Degree preferred
  • 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 3+ years of experience with setting up and operating data pipelines using Python or SQL
  • 3+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 3+ years of strong and extensive hands-on experience in Azure, preferably data heavy/analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 3+ years of experience in defining and enabling data quality standards for auditing, and monitoring
  • Strong analytical abilities and a strong intellectual curiosity
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals
  • Demonstrate technical and domain knowledge of relational and non-relational databases, Data Warehouses, and Data Lakes, among other structured and unstructured storage options
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development
  • Develop ELT/ETL efficiently using Azure cloud services and Snowflake, including testing and operational support (RCA, monitoring, maintenance)
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery
  • Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders
  • Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability)
  • Stay current with and adopt new tools and applications to ensure high quality and efficient solutions
  • Build cross-platform data strategy to aggregate multiple sources and process development datasets
Employment Type: Full-time