
Matillion ETL Engineer


Robert Half

Location:
United States, Austin

Contract Type:
Not provided

Salary:
Not provided

Job Responsibility:

  • Build and maintain end‑to‑end ETL pipelines using Matillion to extract data from source systems, transform data (clean, join, aggregate), and load into Snowflake
  • Refactor existing pipelines to improve performance, reliability, and readability
  • Clean and standardize inconsistent or raw data from multiple sources
  • Redesign and improve data models to support Tableau reporting and analytics consumption (not a BI development role)
  • Partner with analytics and reporting teams to ensure data models are performant, intuitive, and accurate
  • Support ongoing data quality initiatives and pipeline monitoring
  • Assist with cloud‑based data initiatives, including exposure to data and platform migrations
  • Document ETL logic, data models, and pipeline processes as needed
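The extract, transform (clean, join, aggregate), and load flow described above can be sketched in plain Python. This is only an illustrative sketch: the field names and the in-memory "warehouse" dict are invented stand-ins, not the actual source systems, Matillion jobs, or Snowflake tables behind this role.

```python
# Minimal sketch of an extract -> transform -> load pipeline.
# All names (customer_id, amount, fact_sales) are illustrative.

def extract(source_rows):
    """Extract: pull raw records from a source system (here, a list of dicts)."""
    return list(source_rows)

def transform(rows):
    """Transform: clean (drop rows missing the key), cast types, and aggregate."""
    cleaned = [r for r in rows if r.get("customer_id") is not None]
    totals = {}
    for r in cleaned:
        cid = r["customer_id"]
        totals[cid] = totals.get(cid, 0.0) + float(r["amount"])
    return [{"customer_id": cid, "total_amount": amt}
            for cid, amt in sorted(totals.items())]

def load(rows, warehouse):
    """Load: append transformed rows into the target table."""
    warehouse.setdefault("fact_sales", []).extend(rows)
    return warehouse

raw = [
    {"customer_id": 1, "amount": "10.50"},
    {"customer_id": 1, "amount": "4.50"},
    {"customer_id": None, "amount": "99.00"},  # dirty row, dropped in transform
    {"customer_id": 2, "amount": "7.00"},
]
wh = load(transform(extract(raw)), {})
print(wh["fact_sales"])
# -> [{'customer_id': 1, 'total_amount': 15.0}, {'customer_id': 2, 'total_amount': 7.0}]
```

In a real Matillion-to-Snowflake pipeline each stage would be an orchestration or transformation component rather than a Python function, but the shape of the work is the same.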

Requirements:

  • 3+ years of professional experience in Data Engineering, ETL development, or a related role
  • Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field
  • Hands‑on experience with: Matillion ETL, Snowflake, AWS, Tableau (from a data modeling / data engineering perspective)
  • Experience building and maintaining ETL pipelines in a cloud data warehouse environment
  • Comfortable working with raw data, evolving schemas, and imperfect source systems
  • Basic understanding of data modeling concepts for analytics use cases
  • Exposure to cloud or data platform migrations

Nice to have:

  • Experience refactoring legacy or inherited ETL pipelines
  • Familiarity with SQL performance tuning in Snowflake

What we offer:
  • medical, vision, dental, and life and disability insurance
  • enrollment in company 401(k) plan

Additional Information:

Job Posted:
May 15, 2026

Employment Type:
Full-time

Work Type:
On-site work

Similar Jobs for Matillion ETL Engineer

Data Engineer – Snowflake & ETL

We are seeking a Data Engineer in Hyderabad (WFO) with expertise in data enginee...
Location:
India, Hyderabad
Salary:
Not provided
Right Angle Solutions
Expiration Date:
Until further notice
Requirements:
  • Minimum 5+ years of experience in data engineering, ETL, and Snowflake development
  • Proven expertise in Snowflake including SQL scripting, performance tuning, and data warehousing concepts
  • Hands-on experience with Matillion ETL for building and maintaining ETL jobs
  • Strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures
  • Proficiency in SQL, Python, or other scripting languages for automation and data transformation
  • Experience with API integrations and data ingestion frameworks
  • Knowledge of data governance, security policies, and access control within Snowflake environments
  • Excellent communication skills with the ability to engage both business and technical stakeholders
  • Self-motivated professional capable of working independently and delivering projects on time
  • Qualification: BE/BS/MTech/MS or equivalent work experience
  • Full-time

Sr. Data Engineer

We are looking for a skilled Sr. Data Engineer to join our team in Oklahoma City...
Location:
United States, Oklahoma City
Salary:
Not provided
Robert Half
Expiration Date:
Until further notice
Requirements:
  • Proven experience with Snowflake data warehousing and schema design
  • Proficiency in ETL tools such as Matillion or similar platforms
  • Strong knowledge of Python and PowerShell for data automation
  • Experience working with Microsoft SQL Server and related technologies
  • Familiarity with cloud technologies, particularly AWS
  • Understanding of data visualization and analytics tools
  • Background in working with big data technologies such as Apache Kafka, Hadoop, Spark, or Pig
  • Ability to design and implement APIs for data integration and management
Job Responsibility:
  • Design, implement, and maintain Snowflake data warehousing solutions to support business needs
  • Assist in the migration of in-house data to Snowflake, ensuring a seamless transition
  • Develop data pipelines and workflows using tools such as Matillion or equivalent ETL solutions
  • Collaborate with teams to optimize and manage the existing data warehouse built on Microsoft SQL Server
  • Utilize Python and PowerShell to automate data processes and enhance system efficiency
  • Partner with the implementation team to shadow and learn best practices for Snowflake deployment
  • Ensure data integrity, scalability, and security across all data engineering processes
  • Provide insights into data visualization and analytics to support decision-making
  • Work with cloud technologies, including AWS, to enhance data storage and accessibility
  • Implement and manage APIs to enable seamless data integration and sharing
What we offer:
  • Medical, vision, dental, and life and disability insurance
  • Eligibility to enroll in 401(k) plan
  • Access to competitive compensation and free online training
  • Full-time

Lead Data Engineer

Snowflake Lead Data Engineer for an initial 4-month project with potential to ex...
Location:
United Kingdom, Nottingham; Coventry
Salary:
Not provided
Broster Buchanan
Expiration Date:
Until further notice
Requirements:
  • Developing complex ETL processes ideally in Matillion or a similar ETL product
  • Designing and Developing Snowflake Staging, Preparation, and Warehouse areas
  • Data Modeling
  • Managing Development/QA and Live environment in Snowflake and Matillion
  • Experience orchestrating the setup and architecture of Snowflake, establishing workspace structures and defining Medallion (Bronze/Silver/Gold) layers
  • Strong ownership of design and data modelling
  • Strong accountability, including involvement in sprint management, running daily stand-ups, and managing backlogs
  • Ability to travel to Nottingham and Coventry weekly
Job Responsibility:
  • Lead and finalise the data model design and be hands-on in building the data model
  • Guide a team of 2 data engineers to help build the data model
  • Build a new data model in Snowflake following the Medallion architecture – Bronze, Silver, Gold
  • Full-time
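The Medallion (Bronze/Silver/Gold) layering named above can be sketched in plain Python, with dicts standing in for Snowflake tables. The table names, columns, and cleaning rules here are invented for illustration; a real build would express each layer as Snowflake schemas and transformations.

```python
# Hedged sketch of Medallion layering: Bronze (raw) -> Silver (validated)
# -> Gold (business aggregates). All names are illustrative.

bronze = [  # Bronze: records as landed, warts and all
    {"order_id": "A1", "qty": "2", "price": "5.00", "region": "EMEA "},
    {"order_id": "A2", "qty": "1", "price": "3.50", "region": "emea"},
    {"order_id": "A3", "qty": None, "price": "9.99", "region": "AMER"},
]

def to_silver(rows):
    """Silver: validated, standardized records (types cast, text trimmed)."""
    out = []
    for r in rows:
        if r["qty"] is None:
            continue  # a real pipeline would quarantine invalid rows
        out.append({
            "order_id": r["order_id"],
            "qty": int(r["qty"]),
            "price": float(r["price"]),
            "region": r["region"].strip().upper(),
        })
    return out

def to_gold(rows):
    """Gold: reporting-ready aggregates (revenue by region)."""
    revenue = {}
    for r in rows:
        revenue[r["region"]] = revenue.get(r["region"], 0.0) + r["qty"] * r["price"]
    return revenue

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # -> {'EMEA': 13.5}
```

The design point of the pattern is that each layer is rebuildable from the one below it, so cleaning rules can change without re-ingesting sources.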

Data Engineer

The project focuses on building and evolving a modern data platform to support a...
Location:
Not provided
Salary:
Not provided
Coherent Solutions
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience with ETL tools (Matillion, SSIS, or similar)
  • 3+ years of experience in data warehousing
  • Strong SQL skills (Snowflake, MS SQL, or similar)
  • Good experience with Python programming
  • Experience with API integrations
  • Hands-on experience with AWS (S3, Redshift, or similar)
  • Solid understanding of data modeling and analytics concepts
  • English level: B1 (Intermediate) or higher
Job Responsibility:
  • Identify and understand source data systems and data flows
  • Develop mappings and data movement across the data ecosystem
  • Integrate data from databases, flat files, and APIs
  • Build and maintain robust data pipelines, data lakes, and data warehouses
  • Design, optimize, and support ETL/ELT processes
  • Perform data discovery and ad hoc analysis
  • Improve data standards, best practices, and documentation
What we offer:
  • Technical and non-technical training for professional and personal growth
  • Internal conferences and meetups to learn from industry experts
  • Support and mentorship from an experienced employee to help you grow professionally
  • Internal startup incubator
  • Health insurance
  • English courses
  • Sports activities to promote a healthy lifestyle
  • Flexible work options, including remote and hybrid opportunities
  • Referral program for bringing in new talent
  • Work anniversary program and additional vacation days

Data Engineer

Robert Half is partnering with a financial services organization to hire a Data ...
Location:
United States, Austin
Salary:
Not provided
Robert Half
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience working in data engineering or a closely related technical role
  • Hands-on experience building data workflows with Matillion
  • Strong working knowledge of Snowflake for data storage, transformation, and performance tuning
  • Experience preparing and delivering data for Tableau reporting and dashboard development
  • Familiarity with AWS services used in cloud-based data environments
  • Understanding of data pipeline design, ETL/ELT concepts, and data quality best practices
  • Ability to communicate effectively with both technical teams and business stakeholders
Job Responsibility:
  • Build and maintain data pipelines that move information from data lake environments into structured warehouse and reporting platforms
  • Develop, schedule, and optimize ETL and ELT workflows using Matillion to support dependable data delivery
  • Design and manage Snowflake data models that improve accessibility, performance, and scalability for business users
  • Partner with analytics and reporting stakeholders to prepare datasets that support Tableau dashboards and visual insights
  • Monitor data processing jobs, troubleshoot failures, and resolve quality issues to maintain trusted data assets
  • Work within AWS-based environments to support secure, efficient, and scalable data integration processes
  • Collaborate with cross-functional teams to understand data needs and translate them into practical engineering solutions
What we offer:
  • medical
  • vision
  • dental
  • life and disability insurance
  • 401(k) plan
  • Full-time

Data Engineer

We are looking for a talented Data Engineer to join our team in Grand Rapids, Mi...
Location:
United States, Grand Rapids
Salary:
Not provided
Robert Half
Expiration Date:
Until further notice
Requirements:
  • At least 3 years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Warehouses
  • Strong SQL skills with expertise in performance tuning and complex query development
  • Familiarity with cloud platforms such as AWS, Azure, or Google Cloud
  • Proficiency in ETL development using tools like dbt, Matillion, or similar technologies
  • Solid understanding of data modeling techniques, including star schema and dimensional modeling
  • Experience with scripting languages, particularly Python
  • Knowledge of data warehousing concepts, data lakes, and modern data architecture patterns
  • Preferred experience with CI/CD tools like GitHub Actions, Azure DevOps, or Jenkins
Job Responsibility:
  • Design and implement scalable data models, schemas, and tables within Snowflake, including staging, integration, and presentation layers
  • Develop and optimize data pipelines using Snowflake tools such as Snowpipe, Streams, Tasks, and stored procedures
  • Ensure data security and access through role-based controls and best practices for data sharing
  • Build and maintain ETL pipelines leveraging tools like dbt, Matillion, Fivetran, Informatica, or Azure-native solutions
  • Integrate data from diverse sources such as APIs, IoT devices, and NoSQL databases to create unified datasets
  • Enhance performance by utilizing clustering, partitioning, caching, and efficient warehouse sizing strategies
  • Collaborate with cloud technologies such as AWS, Azure, or Google Cloud to support Snowflake infrastructure and operations
  • Implement automated workflows and CI/CD processes for seamless deployment of data solutions
  • Maintain high standards for data accuracy, completeness, and reliability while supporting governance and documentation
  • Work closely with analytics, reporting, and business teams to troubleshoot issues and deliver scalable solutions
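The Streams-and-Tasks pattern referenced in the responsibilities above (consume only rows changed since the last run, then apply them on a schedule) can be sketched without Snowflake at all. This is a simplified analogy, not Snowflake's API: the `AppendStream` class and `run_task` function are invented names, and real Streams also track updates and deletes, not just appends.

```python
# Snowflake-free analogy to Streams/Tasks incremental processing:
# a "stream" exposes only rows appended since the last consumed offset,
# and a "task" applies those pending rows to the target table.

class AppendStream:
    def __init__(self, source):
        self.source = source  # list acting as the source table
        self.offset = 0       # position up to which changes were consumed

    def pending(self):
        """Rows added to the source since the stream was last consumed."""
        return self.source[self.offset:]

    def advance(self):
        self.offset = len(self.source)

def run_task(stream, target):
    """Apply pending changes to the target, then advance the stream."""
    target.extend(stream.pending())
    stream.advance()

source, target = [], []
stream = AppendStream(source)

source.append({"id": 1})
run_task(stream, target)  # picks up row 1
source.append({"id": 2})
run_task(stream, target)  # picks up only row 2 (incremental, not a full reload)
print(target)             # -> [{'id': 1}, {'id': 2}]
```

The point of the pattern is that each task run touches only the delta, which is what keeps pipelines cheap as source tables grow.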
What we offer:
  • medical
  • vision
  • dental
  • life and disability insurance
  • company 401(k) plan

Data Engineer

Senior data engineering role delivering large-scale data integration solutions a...
Location:
United States, Jersey City
Salary:
142320.00 - 213480.00 USD / Year
Citi
Expiration Date:
June 15, 2026
Requirements:
  • 6-10 years of experience in data engineering, data integration, or platform engineering with a focus on large enterprise programs
  • Proven track record delivering data integration for ERP and/or CRM platforms in complex environments
  • Strong hands-on experience with one or more enterprise ETL/ELT tools (e.g., Informatica, DataStage, SSIS, Talend, Azure Data Factory, Matillion, dbt or similar)
  • Advanced Python skills for data processing, automation, and scripting (packaging, logging, error handling, and testing discipline)
  • Expert-level SQL skills, including performance tuning and data warehousing concepts (dimensional modeling awareness a plus)
  • Working knowledge of NoSQL concepts and at least one NoSQL technology (implementation experience preferred)
  • Strong understanding of data pipeline fundamentals: incremental loading, late-arriving data, schema evolution, and end-to-end observability
  • Ability to communicate clearly with both technical and non-technical stakeholders; strong documentation and design skills
Job Responsibility:
  • Data integration delivery: Design, build, and operate robust batch and near-real-time integration pipelines for CRM data domains (e.g., customers, products, orders, invoices, service interactions)
  • ETL/ELT engineering: Develop and optimize workflows using enterprise ETL tools; establish reusable patterns for ingestion, transformation, and publish layers
  • Python engineering: Build Python-based utilities and frameworks for data loads, automation, validation, reconciliation, and operational tooling
  • SQL excellence: Write and tune complex SQL (query optimization, indexing strategy awareness, incremental loads, CDC patterns, and performance troubleshooting)
  • NoSQL competence: Apply non-relational data modeling and query patterns where appropriate (document/columnar/key-value/graph), including performance and consistency considerations
  • Data quality and controls: Implement validation rules, error handling, auditability, and reconciliation controls; create monitoring/alerting to meet SLAs/SLOs
  • Architecture and design: Contribute to system architecture and integration design decisions (data contracts, schemas, idempotency, versioning, resiliency)
  • Security and compliance: Ensure data pipelines follow security best practices (encryption, access control, secrets management) and align with retention and privacy requirements
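The CDC/merge pattern mentioned in the SQL responsibilities above (apply a batch of inserts, updates, and deletes to a target keyed by primary key) can be sketched in plain Python. This is a hedged illustration only: the `op` codes, field names, and dict-as-table representation are invented, and in a warehouse this would typically be a SQL MERGE statement.

```python
# Sketch of applying a CDC change batch (upserts + deletes) to a target,
# using a dict keyed by primary key in place of a warehouse table.

def merge(target, changes):
    """Apply inserts/updates ('I'/'U') and deletes ('D') from a change batch."""
    for c in changes:
        if c["op"] == "D":
            target.pop(c["id"], None)  # delete if present
        else:  # insert or update: last write wins
            target[c["id"]] = c["row"]
    return target

target = {1: {"name": "old"}}
batch = [
    {"op": "U", "id": 1, "row": {"name": "new"}},
    {"op": "I", "id": 2, "row": {"name": "added"}},
    {"op": "D", "id": 3},  # deleting an absent key is a no-op
]
merge(target, batch)
print(target)  # -> {1: {'name': 'new'}, 2: {'name': 'added'}}
```

Note that applying the same batch twice leaves the target unchanged, which is the idempotency property the architecture bullet above calls for.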
What we offer:
  • medical
  • dental & vision coverage
  • 401(k)
  • life, accident, and disability insurance
  • wellness programs
  • paid time off packages, including planned time off (vacation), unplanned time off (sick leave), and paid holidays
  • Full-time

Contract Data Engineer

Robert Half is seeking a Contract Data Engineer to support our client’s data and...
Location:
United States, Nashville
Salary:
Not provided
Robert Half
Expiration Date:
Until further notice
Requirements:
  • Proven experience as a Data Engineer or in a similar role
  • Strong proficiency in SQL and Python (or similar languages)
  • Experience with cloud platforms (AWS, Azure, GCP)
  • Hands-on experience with ETL / ELT tools (Airflow, dbt, Fivetran, Matillion, Glue, ADF, etc.)
  • Experience with data warehousing platforms such as Snowflake, Redshift, BigQuery, or Azure Synapse
  • Familiarity with streaming and big data tools (Kafka, Spark, Databricks, Hadoop) is a plus
  • Strong understanding of data modeling, performance tuning, and pipeline optimization
  • Experience with version control systems (Git) and agile development practices
  • Excellent problem-solving, analytical, and communication skills
Job Responsibility:
  • Design, build, and maintain scalable ETL / ELT pipelines to ingest, transform, and deliver data from multiple sources
  • Develop and optimize data models, schemas, and warehouse structures to support analytics, reporting, and business intelligence needs
  • Work within cloud environments such as AWS, Azure, or GCP to deploy and manage data solutions
  • Design and support enterprise data warehouses using platforms such as Snowflake, Redshift, BigQuery, or Azure Synapse
  • Develop solutions using big data technologies such as Spark, Databricks, Kafka, and Hadoop when required
  • Tune queries, pipelines, and storage solutions for performance, scalability, and cost efficiency
  • Implement monitoring, validation, and alerting processes to ensure data accuracy, integrity, and availability
  • Work closely with Data Analysts, Data Scientists, Software Engineers, and business stakeholders to understand requirements and deliver data solutions
  • Maintain detailed documentation for pipelines, data flows, and system architecture
What we offer:
  • medical, vision, dental, and life and disability insurance
  • eligibility to enroll in our company 401(k) plan