Data Integration Developer

Old Dominion University

Location:
United States, Norfolk

Contract Type:
Not provided

Salary:
124,259.00 USD / Year

Job Description:

Design, develop, and deploy integration workflows across cloud and on-premise environments. Design and implement data integration solutions using IICS. Develop ETL pipelines for data ingestion, transformation, and loading from diverse data sources. Work with stakeholders to gather requirements and translate them into technical specifications. Monitor and optimize workflows for performance and reliability. Ensure data accuracy, consistency, and security across systems. Work from home is permitted.

Job Responsibility:

  • Design, develop, and deploy integration workflows across cloud and on-premise environments
  • Design and implement data integration solutions using IICS
  • Develop ETL pipelines for data ingestion, transformation, and loading from diverse data sources
  • Work with stakeholders to gather requirements and translate them into technical specifications
  • Monitor and optimize workflows for performance and reliability
  • Ensure data accuracy, consistency, and security across systems

Requirements:

  • Bachelor’s Degree in Computer Science, Information Technology, Data Science, or a related field
  • 5 years of experience as a Data Integration Developer or in a related occupation

Additional Information:

Job Posted:
March 05, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for Data Integration Developer

Integration Developer / Data Engineer

We are looking for an experienced Integration Developer / Data Engineer to desig...
Location
Poland, Wroclaw
Salary:
Not provided
Eviden
Expiration Date
Until further notice
Requirements
  • Proven experience with MS SQL (T-SQL, query optimization)
  • Knowledge of Microsoft Dataverse and Dynamics 365 Sales CRM
  • Hands-on experience with Azure Data Factory
  • Strong understanding of ETL process design
  • English – professional working proficiency
Job Responsibility
  • Design and implement integration processes (MS SQL → Dataverse / D365 Sales CRM)
  • Develop and optimize pipelines in Azure Data Factory
  • Analyze business and technical requirements
  • Monitor and maintain ETL processes
  • Collaborate with development and analytics teams
Employment Type:
Part-time

Data and Integration Architect

The Data & Integration Architect at Cayuse is a key technical leader responsible...
Location
United States
Salary:
Not provided
Cayuse
Expiration Date
Until further notice
Requirements
  • Demonstrated success as a Data Architect, Integration Architect, or in a similar role, within a multi-tenant SaaS environment
  • Expertise in data modeling at all levels (conceptual, logical, and physical), including experience with dynamic and extensible models
  • Proficiency in database and data management technologies, including relational (e.g., Postgres) and cloud-native solutions (e.g., Snowflake, AWS RDS)
  • Deep understanding of API design principles, including REST, bulk file-based, asynchronous, and event-driven architectures (e.g., Kafka, AWS EventBridge)
  • Deep understanding of data warehouses, including design, optimization, and best practices for analytics
  • Experience designing and implementing scalable, tenant-aware data architectures
  • Strong grasp of modern integration patterns, including API gateways, data streaming, and hybrid batch-stream processing
  • Knowledge of modern data governance practices including data security, lineage, observability, and compliance requirements
  • Excellent collaboration, communication, and influence skills, with experience working across product, architecture, engineering, and operational teams
Job Responsibility
  • Define and drive a comprehensive data and integration strategy aligned with SaaS multi-tenancy, security, and scalability requirements
  • Collaborate with product management, engineering leadership, fellow architects, and business stakeholders to design interoperable and future-proof data and integration solutions
  • Establish and evolve the suite’s conceptual, logical, and physical data models, ensuring consistency, flexibility, and efficiency across products
  • Develop a unified data model for the suite, defining common entities and relationships across multiple product domains
  • Define extensible data architectures, supporting multi-tenant and customer-specific configurations without compromising performance
  • Lead the development of governance frameworks, ensuring data quality, security, and compliance
  • Guide decisions on database and data management solutions tailored to specific use cases
  • Design dynamic data extension mechanisms to support customer-specific tenant requirements
  • Design and standardize suite-wide API and data exchange patterns
  • Define and advocate for event-driven architectures
What we offer
  • Competitive Medical Benefits (PPO + HSA available)
  • Vision, Dental, Short-Term Disability fully covered by Cayuse
  • Unlimited PTO + Holidays + Flexible Work Schedule
  • Remote Work Stipend
  • Equal Paid Parental Leave
  • 401k with Employer Matching
  • Quarterly Wellness Reimbursement
  • Remote Work Environment, supporting the Ultimate Employee Experience
Employment Type:
Full-time

Application Developer - Data Governance

This role is an intermediate level position responsible for participation in the...
Location
Canada, Mississauga
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • Strong understanding of data lineage, metadata management, reference data development, and data analytics
  • Good knowledge of relational databases such as Oracle, including SQL/PL-SQL
  • Strong knowledge in one or more of: data lineage, application development, Python or Java coding
  • Hands-on experience with at least one coding language and prior experience with tool-based configuration
  • Full Software Development Kit (SDK) development-cycle experience
  • Pragmatic problem-solving and the ability to work independently or as part of a team
  • Proficiency in Ab Initio mHub or Python
  • Proficiency with one or more of the following programming languages: Java, Python
  • 2+ years of non-internship professional software development experience
  • A passion for development, a strong work ethic, and continuous learning
Job Responsibility
  • Develop and maintain applications for complicated enterprise data lineage
  • Optimize industry-based tools to simplify enterprise-level data complexity via data lineage
  • Debug and resolve graph-related issues
  • Collaborate on designing and implementing new features to simplify complex problems
  • Conduct code reviews for quality assurance
  • Write and maintain documentation for functionalities and APIs
  • Integrate and validate third-party libraries and tools
  • Manage source code using version control systems
  • Implement algorithms for code generation and optimization
  • Perform code refactoring for better maintainability and efficiency
What we offer
  • Global benefits
  • Equal opportunity employer
  • Reasonable accommodations provided for individuals with disabilities.
Employment Type:
Full-time

Senior Software Engineer - Data Integration & JVM Ecosystem

The Connectors team is the bridge between ClickHouse and the entire data ecosyst...
Location
Germany
Salary:
Not provided
ClickHouse
Expiration Date
Until further notice
Requirements
  • 6+ years of software development experience focusing on building and delivering high-quality, data-intensive solutions
  • Proven experience with the internals of at least one of the following technologies: Apache Spark, Apache Flink, Kafka Connect, or Apache Beam
  • Experience developing or extending connectors, sinks, or sources for at least one big data processing framework such as Apache Spark, Flink, Beam, or Kafka Connect
  • Strong understanding of database fundamentals: SQL, data modeling, query optimization, and familiarity with OLAP/analytical databases
  • A track record of building scalable data integration systems (beyond simple ETL jobs)
  • Strong proficiency in Java and the JVM ecosystem, including deep knowledge of memory management, garbage collection tuning, and performance profiling
  • Solid experience with concurrent programming in Java, including threads, executors, and reactive or asynchronous patterns
  • Outstanding written and verbal communication skills to collaborate effectively within the team and across engineering functions
  • Understanding of JDBC, network protocols (TCP/IP, HTTP), and techniques for optimizing data throughput over the wire
  • Passion for open-source development
Job Responsibility
  • Own and maintain critical parts of ClickHouse's Data engineering ecosystem
  • Own the full lifecycle of data framework integrations - from the core database driver to SDKs and connectors
  • Build the foundation that thousands of Data engineers rely on for their most critical data workloads
  • Collaborate closely with the open-source community, internal teams, and enterprise users to ensure our JVM integrations set the standard for performance, reliability, and developer experience
What we offer
  • Flexible work environment - ClickHouse is a globally distributed company and remote-friendly
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 Home office setup if you’re a remote employee
  • Global Gatherings – opportunities to engage with colleagues at company-wide offsites

Senior Software Engineer - Data Integration & JVM Ecosystem

The Connectors team is the bridge between ClickHouse and the entire data ecosyst...
Location
United Kingdom
Salary:
Not provided
ClickHouse
Expiration Date
Until further notice
Requirements
  • 6+ years of software development experience focusing on building and delivering high-quality, data-intensive solutions
  • Proven experience with the internals of at least one of the following technologies: Apache Spark, Apache Flink, Kafka Connect, or Apache Beam
  • Experience developing or extending connectors, sinks, or sources for at least one big data processing framework such as Apache Spark, Flink, Beam, or Kafka Connect
  • Strong understanding of database fundamentals: SQL, data modeling, query optimization, and familiarity with OLAP/analytical databases
  • A track record of building scalable data integration systems (beyond simple ETL jobs)
  • Strong proficiency in Java and the JVM ecosystem, including deep knowledge of memory management, garbage collection tuning, and performance profiling
  • Solid experience with concurrent programming in Java, including threads, executors, and reactive or asynchronous patterns
  • Outstanding written and verbal communication skills to collaborate effectively within the team and across engineering functions
  • Understanding of JDBC, network protocols (TCP/IP, HTTP), and techniques for optimizing data throughput over the wire
  • Passion for open-source development
Job Responsibility
  • Serve as a core contributor, owning and maintaining critical parts of ClickHouse's Data engineering ecosystem
  • Own the full lifecycle of data framework integrations - from the core database driver that handles billions of records per second, to SDKs and connectors that make ClickHouse feel native in JVM-based applications
  • Build the foundation that thousands of Data engineers rely on for their most critical data workloads
  • Collaborate closely with the open-source community, internal teams, and enterprise users to ensure our JVM integrations set the standard for performance, reliability, and developer experience
What we offer
  • Flexible work environment - ClickHouse is a globally distributed company and remote-friendly. We currently operate in 20 countries
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 Home office setup if you’re a remote employee
  • Global Gatherings – We believe in the power of in-person connection and offer opportunities to engage with colleagues at company-wide offsites
Employment Type:
Full-time

Principal SAP Data Integration Architect

The Senior SAP Data Architect bridges the gap between business users and the SAP...
Location
United States, Remote
Salary:
113,930.00 - 170,900.00 USD / Year
Lamb Weston
Expiration Date
Until further notice
Requirements
  • Bachelor’s degree in Computer Science, Information Systems, or related field
  • 8+ years of experience as a Business Architect in SAP environments
  • Experience in ABAP, BW and SAC development
  • Experience in SAP data migration to Snowflake
  • Strong understanding of SAP systems architecture and functional modules
  • Experience with reporting tools and cloud data platforms such as Power BI and Snowflake
  • Excellent communication, facilitation, and problem-solving skills
Job Responsibility
  • Lead BW design and technical solutions with SAP SAC and Analysis for Office
  • Work with offshore teams to review the development, testing, and production support
  • Collaborate with business partners to define reporting and analytics requirements within SAP and Snowflake
  • Analyze and document data requirements across SAP functional areas including Procurement, Supply Chain, Finance and Trade Promotion Management
  • Support data validation and reconciliation processes during SAP transformation and enhancement projects
  • Develop data mappings, user stories, and acceptance criteria for analytics initiatives
  • Assist in data governance efforts, ensuring business definitions and metadata are accurately captured
  • Work closely with developers to ensure solutions meet business objectives and follow best development practices
What we offer
  • Health Insurance Benefits - Medical, Dental, Vision
  • Flexible Spending Accounts for Health and Dependent Care, and Health Reimbursement Accounts
  • Well-being programs including companywide events and a wellness incentive program
  • Paid Time Off
  • Financial Wellness – Industry leading 401(k) plan with generous company contributions, Financial Planning Services, Employee Stock purchase program, and Health Savings Accounts, Life and Accident insurance
  • Family-Friendly Employee events
  • Employee Assistance Program services – mental health and other concierge type services
Employment Type:
Full-time

Application Developer - Data Governance

This role is an intermediate Application Developer - Data Governance level posit...
Location
Canada, Mississauga
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • Strong understanding of data lineage, metadata management, reference data development, and data analytics
  • Good knowledge of relational databases such as Oracle, including SQL/PL-SQL
  • Strong knowledge in one or more of: data lineage, application development, Python or Java coding
  • Hands-on experience with at least one coding language and prior experience with tool-based configuration
  • Full Software Development Kit (SDK) development-cycle experience
  • Pragmatic problem-solving and the ability to work independently or as part of a team
  • Proficiency in Ab Initio mHub or Python
  • Proficiency with one or more of the following programming languages: Java, Python
  • 2+ years of non-internship professional software development experience
  • A passion for development, a strong work ethic, and continuous learning
Job Responsibility
  • Develop and maintain applications for complicated enterprise data lineage
  • Optimize industry-based tools to simplify enterprise-level data complexity via data lineage
  • Debug and resolve graph-related issues
  • Collaborate on designing and implementing new features to simplify complex problems
  • Conduct code reviews for quality assurance
  • Write and maintain documentation for functionalities and APIs
  • Integrate and validate third-party libraries and tools
  • Manage source code using version control systems
  • Implement algorithms for code generation and optimization
  • Perform code refactoring for better maintainability and efficiency
What we offer
  • Equal opportunity employer
  • Accessibility accommodations for persons with disabilities
  • Global benefits
Employment Type:
Full-time

IT Development Manager for Data Intelligence Platform

You will lead technical developments of a Data Intelligence Platform and partner...
Location
Poland, Warsaw
Salary:
Not provided
Robert Bosch Sp. z o.o.
Expiration Date
Until further notice
Requirements
  • 5+ years of experience in developing Data Management and Analytics applications
  • Extensive Knowledge of (Meta) Data Management capabilities: Data Governance, Data Lineage, Data Assets, Data Products, Data Catalog, Data Marketplace, Data Policy, Ontologies
  • Considerable experience in designing, developing & integrating Data and Analytics applications using modern architectures and frameworks, structured and unstructured data
  • Broad and up-to-date technical knowledge: databases (e.g. Oracle, Databricks), middleware/integration (e.g. Solace, Kafka), API management, cloud providers (Azure, AWS, Google), AI technologies (LLMs, agents)
  • Experimental mindset, self-motivation to search for solutions, and an appreciation for learning new things
  • Strong communication skills, proactive in contacting people
  • English (fluent in spoken and written)
Job Responsibility
  • Lead technical developments of a Data Intelligence Platform
  • Partner with Business, Solution and IT Architects on the strategy and delivery of the Platform functionalities
  • Define consistent system specific guidelines for the software development and configuration environment in alignment with central guidelines
  • Assess customer requirements from a technical perspective with respective effort estimations and assist in the design and development of proof of concept and prototypes
  • Document specifications and support the creation of operational support manuals during the technical implementation
  • Take over responsibility for interface implementation and documentation
  • Steer external and internal developers
  • Support DevOps by sizing and scalability concepts (for specific use cases)
What we offer
What we offer
  • Competitive salary + annual bonus
  • Hybrid work with flexible working hours
  • Referral Bonus Program
  • Copyright costs for IT employees
  • Private medical care and life insurance
  • Cafeteria System with multiple benefits (incl. MultiSport, shopping vouchers, cinema tickets, etc.)
  • Prepaid Lunch Card
  • Non-working day on the 31st of December
Employment Type:
Full-time