
ETL Data Engineer


Revel IT


Location:
United States, Columbus


Contract Type:
Not provided


Salary:
Not provided

Job Description:

Columbus, Ohio client is seeking an ETL Data Engineer contractor to focus on data migration deliverables, ensuring that high-quality data is migrated from legacy systems to the new systems. This may involve data mapping, SQL development, and other technical activities in support of data migration objectives. The team is responsible for implementing the new Contract Management System (FIS Asset Finance), integrating it into the overall environment, and migrating data from the legacy contract management system to the new system.

Job Responsibility:

  • Deliver data migration work to ensure high-quality data is migrated from legacy systems to new systems
  • Perform data mapping, SQL development, and other technical activities to support data migration objectives
  • Support the implementation of the new Contract Management System (FIS Asset Finance) and its integration into the overall environment
  • Support migration of data from the legacy contract management system to the new system

Requirements:

  • Candidates must be local to Columbus, Ohio
  • Candidates must be willing and able to work a hybrid schedule (3 days in office & 2 days WFH)
  • Strong C# and SQL Server design and development skills
  • Strong technical analysis skills
  • Strong collaboration skills to work effectively with cross-functional teams
  • Exceptional ability to structure, illustrate, and communicate complex concepts clearly and effectively to diverse audiences
  • Demonstrated adaptability and problem-solving skills to navigate challenges in a fast-paced environment
  • Strong prioritization and time management skills to balance multiple projects and deadlines
  • In-depth knowledge of Agile methodologies and practices

Nice to have:

  • ETL design and development
  • Data mapping skills and experience
  • Experience executing/driving technical design and implementation topics

Additional Information:

Job Posted:
February 13, 2026

Work Type:
Hybrid work



Similar Jobs for ETL Data Engineer

Data Engineer – Snowflake & ETL

We are seeking a Data Engineer in Hyderabad (WFO) with expertise in data enginee...
Location:
India, Hyderabad
Salary:
Not provided
Right Angle Solutions
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in data engineering, ETL, and Snowflake development
  • Proven expertise in Snowflake including SQL scripting, performance tuning, and data warehousing concepts
  • Hands-on experience with Matillion ETL for building and maintaining ETL jobs
  • Strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures
  • Proficiency in SQL, Python, or other scripting languages for automation and data transformation
  • Experience with API integrations and data ingestion frameworks
  • Knowledge of data governance, security policies, and access control within Snowflake environments
  • Excellent communication skills with the ability to engage both business and technical stakeholders
  • Self-motivated professional capable of working independently and delivering projects on time
  • Qualification: BE/BS/MTech/MS or equivalent work experience
  • Fulltime

Senior AWS Data Engineer / Data Platform Engineer

We are seeking a highly experienced Senior AWS Data Engineer to design, build, a...
Location:
United Arab Emirates, Dubai
Salary:
Not provided
NorthBay
Expiration Date:
Until further notice
Requirements:
  • 8+ years of experience in data engineering and data platform development
  • Strong hands-on experience with: AWS Glue, Amazon EMR (Spark), AWS Lambda, Apache Airflow (MWAA), Amazon EC2, Amazon CloudWatch, Amazon Redshift, Amazon DynamoDB, and AWS DataZone
Job Responsibility:
  • Design, develop, and optimize scalable data pipelines using AWS native services
  • Lead the implementation of batch and near-real-time data processing solutions
  • Architect and manage data ingestion, transformation, and storage layers
  • Build and maintain ETL/ELT workflows using AWS Glue and Apache Spark on EMR
  • Orchestrate complex data workflows using Apache Airflow (MWAA)
  • Develop and manage serverless data processing using AWS Lambda
  • Design and optimize data warehouses using Amazon Redshift
  • Implement and manage NoSQL data models using Amazon DynamoDB
  • Utilize AWS DataZone for data governance, cataloging, and access management
  • Monitor, log, and troubleshoot data pipelines using Amazon CloudWatch
  • Fulltime

Data Engineer

As a Data Engineer at Rearc, you'll contribute to the technical excellence of ou...
Location:
India, Bengaluru
Salary:
Not provided
Rearc
Expiration Date:
Until further notice
Requirements:
  • 2+ years of experience in data engineering, data architecture, or related fields
  • Solid track record of contributing to complex data engineering projects
  • Hands-on experience with ETL processes, data warehousing, and data modelling tools
  • Good understanding of data integration tools and best practices
  • Familiarity with cloud-based data services and technologies (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery)
  • Strong analytical skills
  • Proficiency in implementing and optimizing data pipelines using modern tools and frameworks
  • Strong communication and interpersonal skills
Job Responsibility:
  • Collaborate with colleagues to understand customers' data requirements and challenges
  • Apply DataOps principles to create scalable and efficient data pipelines and architectures
  • Support data engineering projects
  • Promote knowledge sharing through technical blogs and articles

Senior Data Engineer ETL Lead

The Sr Data Engineer ETL Lead is a senior level position responsible for establi...
Location:
United States, Irving
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • 6+ years of relevant experience in Apps Development or systems analysis role
  • Extensive experience in systems analysis and in programming of software applications
  • Experience in managing and implementing successful projects
  • Subject Matter Expert (SME) in at least one area of Applications Development
  • Ability to adjust priorities quickly as circumstances dictate
  • Demonstrated leadership and project management skills
  • Consistently demonstrates clear and concise written and verbal communication
  • Bachelor’s degree in Computer Science, a university degree, or equivalent experience
  • Data Warehouse/ETL design and development methodologies knowledge and experience required
  • ETL expertise with the Ab Initio tool (EME, GDE, Co-op), bringing together components such as Unix, Oracle, and storage
Job Responsibility:
  • Partner with multiple management teams to ensure appropriate integration of functions to meet goals as well as identify and define necessary system enhancements to deploy new products and process improvements
  • Resolve variety of high impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
  • Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
  • Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals
  • Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
  • Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
  • Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency
What we offer:
  • Medical, dental & vision coverage
  • 401(k)
  • Life, accident, and disability insurance
  • Wellness programs
  • Paid time off packages, including vacation, sick leave, and paid holidays
  • Fulltime

Data Engineer, Enterprise Data, Analytics and Innovation

Are you passionate about building robust data infrastructure and enabling innova...
Location:
United States
Salary:
110000.00 - 125000.00 USD / Year
Vaniam Group
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional experience in data engineering, ETL, or related roles
  • Strong proficiency in Python and SQL for data engineering
  • Hands-on experience building and maintaining pipelines in a lakehouse or modern data platform
  • Practical understanding of Medallion architectures and layered data design
  • Familiarity with modern data stack tools, including: Spark or PySpark; workflow orchestration (Airflow, dbt, or similar); testing and observability frameworks; containers (Docker) and Git-based version control
  • Excellent communication skills, problem-solving mindset, and a collaborative approach
Job Responsibility:
  • Design, build, and operate reliable ETL and ELT pipelines in Python and SQL
  • Manage ingestion into Bronze, standardization and quality in Silver, and curated serving in Gold layers of our Medallion architecture
  • Maintain ingestion from transactional MySQL systems into Vaniam Core to keep production data flows seamless
  • Implement observability, data quality checks, and lineage tracking to ensure trust in all downstream datasets
  • Develop schemas, tables, and views optimized for analytics, APIs, and product use cases
  • Apply and enforce best practices for security, privacy, compliance, and access control, ensuring data integrity across sensitive healthcare domains
  • Maintain clear and consistent documentation for datasets, pipelines, and operating procedures
  • Lead the integration of third-party datasets, client-provided sources, and new product-generated data into Vaniam Core
  • Partner with product and innovation teams to build repeatable processes for onboarding new data streams
  • Ensure harmonization, normalization, and governance across varied data types (scientific, engagement, operational)
What we offer:
  • 100% remote environment with opportunities for local meet-ups
  • Positive, diverse, and supportive culture
  • Passionate about serving clients focused on Cancer and Blood diseases
  • Investment in you with opportunities for professional growth and personal development through Vaniam Group University
  • Health benefits – medical, dental, vision
  • Generous parental leave benefit
  • Focused on your financial future with a 401(k) Plan and company match
  • Work-Life Balance and Flexibility
  • Flexible Time Off policy for rest and relaxation
  • Volunteer Time Off for community involvement
  • Fulltime

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location:
Netherlands, Leiden
Salary:
Not provided
IKEA
Expiration Date:
Until further notice
Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience in driving new data engineering developments (e.g., applying new cutting-edge data engineering methods to improve the performance of data integration, using new tools to improve data quality, etc.)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience to deploy code artifacts into production
Job Responsibility:
  • Contribute to the development of D&A platform and analytical tools, ensuring easy and standardized access and sharing of data
  • Subject matter expert for Azure Databricks, Azure Data Factory, and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how & standard available
  • Ensure pipelines and consistency with relevant digital frameworks, principles, guidelines and standards
  • Support in understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility into data quality and data assets on the D&A platform
  • Identify opportunities for data assets and D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team
What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning
  • Fulltime

Data Engineer

We are seeking a skilled and innovative Data Engineer to join our team in Nieuwe...
Location:
Netherlands, Nieuwegein
Salary:
3000.00 - 6000.00 EUR / Month
Sopra Steria
Expiration Date:
Until further notice
Requirements:
  • BSc or MSc degree in IT or a related field
  • Minimum of 2 years of relevant work experience in data engineering
  • Proficiency in building data pipelines using tools such as Azure Data Factory, Informatica Cloud, Synapse Pro, Spark, Python, R, Kubernetes, Snowflake, Databricks, or AWS
  • Advanced SQL knowledge and experience with relational databases
  • Hands-on experience in data modelling and data integration (both on-premise and cloud-based)
  • Strong problem-solving skills and analytical mindset
  • Knowledge of data warehousing concepts and big data technologies
  • Experience with version control systems, preferably Git
  • Excellent communication skills and ability to work collaboratively in a team environment
  • Fluency in Dutch language (required)
Job Responsibility:
  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes
  • Collaborate with Information Analysts to provide technical frameworks for business requirements of medium complexity
  • Contribute to architecture discussions and identify potential technical and process bottlenecks
  • Implement data quality checks and ensure data integrity throughout the data lifecycle
  • Optimise data storage and retrieval systems for improved performance
  • Work closely with cross-functional teams to understand data needs and deliver efficient solutions
  • Stay up-to-date with emerging technologies and best practices in data engineering
  • Troubleshoot and resolve data-related issues in a timely manner
  • Document data processes, architectures, and workflows for knowledge sharing and future reference
What we offer:
  • A permanent contract and a gross monthly salary between €3,000 and €6,000 (based on 40 hours per week)
  • 8% holiday allowance
  • A generous mobility budget, including options such as an electric lease car with an NS Business Card, a lease bike, or alternative transportation that best suits your travel needs
  • 8% profit sharing on target (or a fixed OTB amount, depending on the role)
  • 27 paid vacation days
  • A flex benefits budget of €1,800 per year, plus an additional percentage of your salary. This can be used for things like purchasing extra vacation days or contributing more to your pension
  • A home office setup with a laptop, phone, and a monthly internet allowance
  • Hybrid working: from home or at the office, depending on what works best for you
  • Development opportunities through training, knowledge-sharing sessions, and inspiring (networking) events
  • Social activities with colleagues — from casual drinks to sports and content-driven outings
  • Fulltime

Data Engineer

We are seeking our first Data Engineer, someone who can refine our data infrastr...
Location:
United States, New York City; San Francisco
Salary:
190000.00 - 250000.00 USD / Year
Hebbia
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field
  • 5+ years software development experience at a venture-backed startup or top technology firm, with a focus on data engineering
  • Significant hands-on experience in data engineering (ETL development, data warehousing, data lake management, etc.)
  • Adept at identifying and owning data projects end to end, with the ability to work independently and exercise sound judgment
  • Proficient in Python and SQL; comfortable working with cloud-based data stack tools
  • Familiar with big data processing frameworks (e.g., Spark, Hadoop) and data integration technologies (e.g., Airflow, DBT, or similar)
  • Experience implementing data governance, security, and compliance measures
  • Strong collaboration and communication skills, with the ability to translate business requirements into technical solutions
  • You are comfortable working in-person 5 days a week
Job Responsibility:
  • Architect, build, and maintain ETL pipelines and workflows that ensure high data quality and reliability
  • Design and manage a central data lake to consolidate data from various sources, enabling advanced analytics and reporting
  • Collaborate with cross-functional stakeholders (product, engineering, and business) to identify data gaps and develop effective solutions
  • Implement best practices in data security and governance to ensure compliance and trustworthiness
  • Evaluate and integrate new technologies, tools, and approaches to optimize data processes and architectures
  • Continuously monitor, troubleshoot, and improve data pipelines and infrastructure for performance, scalability, and cost-efficiency
What we offer:
  • PTO: Unlimited
  • Insurance: Medical + Dental + Vision + 401K + Wellness Benefits
  • Eats: Catered lunch daily + DoorDash dinner credit if you ever need to stay late
  • Parental leave policy: 3 months non-birthing parent, 4 months for birthing parent
  • Fertility benefits: $15k lifetime benefit
  • New hire equity grant: competitive equity package with unmatched upside potential
  • Fulltime