Senior Systems Engineer, Data and Integrations

Fivetran

Location:
Serbia, Novi Sad

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Fivetran is building data pipelines to power the modern data stack for thousands of companies. We're looking for a Systems Engineer, Data and Integrations to build and maintain system-to-system and data integrations on our Systems Engineering team. In this role, you will help develop our integration architecture, build and maintain reliable integrations that support business processes running across multiple systems, and proactively ensure the timeliness and accuracy of the data flow. This position will sit at the crossroads of software engineering, systems integration, and quality assurance, ensuring our internal systems are properly integrated and primed for scalability. You will partner closely with our other Systems and Engineering teams, as well as business partner teams across the company. By stepping into this role, you'll be at the forefront of driving quality, reliability, and observability of the integrations that power Fivetran's key business processes! This is a full-time position based out of our Novi Sad office.

Job Responsibility:

  • Partner with others on the Systems Engineering Architecture team to define our preferred integration patterns to satisfy our integration needs
  • Enforce standards, guidelines, and governance for data and process integration across various internal systems
  • Handle inquiries from other technical teams, often requiring investigation and code changes
  • Work with business partner teams to understand business processes and build appropriate integrations to support them
  • Plan and execute integration engineering end-to-end, from design to code
  • Ensure that our integrations are proactively monitored
  • Create dashboards and alerts that provide end-to-end visibility into system performance and stability, minimizing downtime and improving incident response times

Requirements:

  • 5+ years of experience in Systems or Software Engineering roles, with a focus on integrating data and applications
  • Fluency in SQL and strong proficiency in a general-purpose programming language such as Python or TypeScript is required
  • Working knowledge of one or more business applications such as Salesforce, NetSuite, Coupa, Workday, and Zendesk, with a desire to keep learning
  • A software engineering background is strongly preferred, with proficiency in API integrations, integration testing, and monitoring/alerting solutions
  • Ability to balance short-term tactical needs with long-term strategic vision, driving organizational best practices around quality
  • Strong communication and collaboration skills with both technical and non-technical colleagues

Nice to have:

  • Familiarity with iPaaS platforms and reverse ETL tools, their capabilities, and tradeoffs
  • Experience working with data pipelines (ETL/ELT) and SQL for data validation or monitoring

What we offer:
  • 100% employer-paid medical insurance
  • Generous paid time-off policy (PTO), plus paid sick time, inclusive parental leave policy, holidays, and volunteer days off
  • RSU stock grants
  • Professional development and training opportunities
  • Company virtual happy hours, free food, and fun team-building activities
  • Monthly cell phone stipend
  • Access to an innovative mental health support platform that offers personalized care and resources in areas such as therapy, coaching, and self-guided mindfulness exercises for all covered employees and their covered dependents

Additional Information:

Job Posted:
May 16, 2026

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for Senior Systems Engineer, Data and Integrations

Senior Data Engineer

We are looking for a Senior Data Engineer (SDE 3) to build scalable, high-perfor...
Location:
India, Mumbai
Salary:
Not provided
Cogoport
Expiration Date:
Until further notice
Requirements:
  • 6+ years of experience in data engineering, working with large-scale distributed systems
  • Strong proficiency in Python, Java, or Scala for data processing
  • Expertise in SQL and NoSQL databases (PostgreSQL, Cassandra, Snowflake, Apache Hive, Redshift)
  • Experience with big data processing frameworks (Apache Spark, Flink, Hadoop)
  • Hands-on experience with real-time data streaming (Kafka, Kinesis, Pulsar) for logistics use cases
  • Deep knowledge of AWS/GCP/Azure cloud data services like S3, Glue, EMR, Databricks, or equivalent
  • Familiarity with Airflow, Prefect, or Dagster for workflow orchestration
  • Strong understanding of logistics and supply chain data structures, including freight pricing models, carrier APIs, and shipment tracking systems
Job Responsibility:
  • Design and develop real-time and batch ETL/ELT pipelines for structured and unstructured logistics data (freight rates, shipping schedules, tracking events, etc.)
  • Optimize data ingestion, transformation, and storage for high availability and cost efficiency
  • Ensure seamless integration of data from global trade platforms, carrier APIs, and operational databases
  • Architect scalable, cloud-native data platforms using AWS (S3, Glue, EMR, Redshift), GCP (BigQuery, Dataflow), or Azure
  • Build and manage data lakes, warehouses, and real-time processing frameworks to support analytics, machine learning, and reporting needs
  • Optimize distributed databases (Snowflake, Redshift, BigQuery, Apache Hive) for logistics analytics
  • Develop streaming data solutions using Apache Kafka, Pulsar, or Kinesis to power real-time shipment tracking, anomaly detection, and dynamic pricing
  • Enable AI-driven freight rate predictions, demand forecasting, and shipment delay analytics
  • Improve customer experience by providing real-time visibility into supply chain disruptions and delivery timelines
  • Ensure high availability, fault tolerance, and data security compliance (GDPR, CCPA) across the platform
What we offer:
  • Work with some of the brightest minds in the industry
  • Entrepreneurial culture fostering innovation, impact, and career growth
  • Opportunity to work on real-world logistics challenges
  • Collaborate with cross-functional teams across data science, engineering, and product
  • Be part of a fast-growing company scaling next-gen logistics platforms using advanced data engineering and AI

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location:
Netherlands, Leiden
Salary:
Not provided
IKEA
Expiration Date:
Until further notice
Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience in driving new data engineering developments (e.g. applying new cutting-edge data engineering methods to improve the performance of data integration, using new tools to improve data quality, etc.)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience to deploy code artifacts into production
Job Responsibility:
  • Contribute to the development of D&A platform and analytical tools, ensuring easy and standardized access and sharing of data
  • Subject matter expert for Azure Databricks, Azure Data Factory and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how & standards available
  • Ensure pipelines are consistent with relevant digital frameworks, principles, guidelines and standards
  • Support in understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility into data quality and data assets on the D&A platform
  • Identify opportunities for data assets and D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team
What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning

Private Cloud/AI System Integration Engineer

In the HPE Chief Technology Office, we lead the innovation agenda and technology...
Location:
India, Bangalore
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, or a related technical field (or currently pursuing and near completion)
  • Some experience through coursework, internships, or personal projects in system integration, cloud, or software development
  • Programming skills in Python, Golang or Java
  • Strong experience in DevOps, CI/CD
  • Understanding of basic testing, coding, and debugging procedures
  • Ability to quickly learn new skills and technologies and work well with other team members
  • Good written and verbal communication skills
  • Familiarity with virtualization concepts and/or basic cloud technologies
  • Strong problem-solving mindset and eagerness to learn new tools and technologies
  • Effective communication and teamwork skills
Job Responsibility:
  • Collaborate with senior engineers and cross-functional teams to help design, test, and deliver integrated systems that support innovative solutions
  • Bring technical thinking to break down complex data and to engineer new ideas and methods for solving, prototyping, designing, and implementing HPE Private Cloud solutions
  • Assist in the design, testing, and deployment of system integration solutions for private cloud environments
  • Work with software and hardware teams to ensure components integrate smoothly
  • Contribute to software development of automation and deployment workflows for HPE Private Cloud programs
  • Participate in troubleshooting and resolving integration issues under guidance
  • Help document best practices, workflows, and use tools such as Jira and Confluence for project tracking and collaboration
  • Learn and apply concepts related to system security, scalability, and performance
  • Stay informed on industry trends in private cloud, virtualization, and AI technologies
What we offer:
  • Health & Wellbeing
  • Personal & Professional Development
  • Unconditional Inclusion

Big Data Platform Senior Engineer

Lead Java Data Engineer to guide and mentor a talented team of engineers in buil...
Location:
Bahrain, Seef, Manama
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • Significant hands-on experience developing high-performance Java applications (Java 11+ preferred) with strong foundation in core Java concepts, OOP, and OOAD
  • Proven experience building and maintaining data pipelines using technologies like Kafka, Apache Spark, or Apache Flink
  • Familiarity with event-driven architectures and experience in developing real-time, low-latency applications
  • Deep understanding of distributed systems concepts and experience with MPP platforms such as Trino (Presto) or Snowflake
  • Experience deploying and managing applications on container orchestration platforms like Kubernetes, OpenShift, or ECS
  • Demonstrated ability to lead and mentor engineering teams, communicate complex technical concepts effectively, and collaborate across diverse teams
  • Excellent problem-solving skills and data-driven approach to decision-making
Job Responsibility:
  • Provide technical leadership and mentorship to a team of data engineers
  • Lead the design and development of highly scalable, low-latency, fault-tolerant data pipelines and platform components
  • Stay abreast of emerging open-source data technologies and evaluate their suitability for integration
  • Continuously identify and implement performance optimizations across the data platform
  • Partner closely with stakeholders across engineering, data science, and business teams to understand requirements
  • Drive the timely and high-quality delivery of data platform projects

Senior Data Engineer

Adswerve is looking for a Senior Data Engineer to join our Adobe Services team. ...
Location:
United States
Salary:
130000.00 - 155000.00 USD / Year
Adswerve, Inc.
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related field (or equivalent experience)
  • 5+ years of experience in a data engineering, analytics, or marketing technology role
  • Hands-on expertise in Adobe Experience Platform (AEP), Real-Time CDP, Journey Optimizer, or similar tools is a big plus
  • Strong proficiency in SQL and hands-on experience with data transformation and modeling
  • Understanding of ETL/ELT workflows (e.g., dbt, Fivetran, Airflow, etc.) and cloud data platforms (e.g., GCP, Snowflake, AWS, Azure)
  • Experience with ingress/egress patterns and interacting with APIs to move data
  • Experience with Python, or JavaScript in a data or scripting context
  • Experience with customer data platforms (CDPs), event-based tracking, or customer identity management
  • Understanding of Adobe Experience Cloud integrations (e.g., Adobe Analytics, Target, Campaign) is a plus
  • Strong communication skills with the ability to lead technical conversations and present to both technical and non-technical audiences
Job Responsibility:
  • Lead the end-to-end architecture of data ingestion and transformation in Adobe Experience Platform (AEP) using Adobe Data Collection (Tags), Experience Data Model (XDM), and source connectors
  • Design and optimize data models, identity graphs, and segmentation strategies within Real-Time CDP to enable personalized customer experiences
  • Implement schema mapping, identity resolution, and data governance strategies
  • Collaborate with Data Architects to build scalable, reliable data pipelines across multiple systems
  • Conduct data quality assessments and support QA for new source integrations and activations
  • Write and maintain internal documentation and knowledge bases on AEP best practices and data workflows
  • Simplify complex technical concepts and educate team members and clients in a clear, approachable way
  • Contribute to internal knowledge sharing and mentor junior engineers in best practices around data modeling, pipeline development, and Adobe platform capabilities
  • Stay current on the latest Adobe Experience Platform features and data engineering trends to inform client strategies
What we offer:
  • Medical, dental and vision available for employees
  • Paid time off including vacation, sick leave & company holidays
  • Paid volunteer time
  • Flexible working hours
  • Summer Fridays
  • “Work From Home Light” days between Christmas and New Year’s Day
  • 401(k) Plan with 5% company match and no vesting period
  • Employer Paid Parental Leave
  • Health-care Spending Accounts
  • Dependent-care Spending Accounts

Senior Data Engineer

Senior Data Engineer to design, develop, and optimize data platforms, pipelines,...
Location:
United States, Chicago
Salary:
160555.00 - 176610.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Master's degree in Engineering Management, Software Engineering, Computer Science, or a related technical field
  • 3 years of experience in data engineering
  • Experience building data platforms and pipelines
  • Experience with AWS, GCP or Azure
  • Experience with SQL and Python for data manipulation, transformation, and automation
  • Experience with Apache Airflow for workflow orchestration
  • Experience with data governance, data quality, data lineage and metadata management
  • Experience with real-time data ingestion tools including Pub/Sub, Kafka, or Spark
  • Experience with CI/CD pipelines for continuous deployment and delivery of data products
  • Experience maintaining technical records and system designs
Job Responsibility:
  • Design, develop, and optimize data platforms, pipelines, and governance frameworks
  • Enhance business intelligence, analytics, and AI capabilities
  • Ensure accurate data flows and push data-driven decision-making across teams
  • Write product-grade performant code for data extraction, transformations, and loading (ETL) using SQL/Python
  • Manage workflows and scheduling using Apache Airflow and build custom operators for data ETL
  • Build, deploy and maintain both inbound and outbound data pipelines to integrate diverse data sources
  • Develop and manage CI/CD pipelines to support continuous deployment of data products
  • Utilize Google Cloud Platform (GCP) tools, including BigQuery, Composer, GCS, DataStream, and Dataflow, for building scalable data systems
  • Implement real-time data ingestion solutions using GCP Pub/Sub, Kafka, or Spark
  • Develop and expose REST APIs for sharing data across teams
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Annual incentive program

Senior Integrations Engineer, Enterprise Integrations & Agentic AI

We are seeking a Senior Engineer, Enterprise Integrations & Agentic AI to design...
Location:
United States, San Francisco; New York City; Austin
Salary:
227000.00 - 294600.00 USD / Year
Airtable
Expiration Date:
Until further notice
Requirements:
  • 10+ years of experience in integrations, software development, or systems engineering
  • Strong experience with integration/iPaaS platforms such as Workato, Zapier, MuleSoft, Hightouch, AWS, etc.
  • Proficient in Python, cloud development (AWS or GCP), and handling large-scale data processing workloads
  • Deep hands-on expertise in integration architecture, API management, and data synchronization patterns (event-based, webhook, polling, etc.)
  • Experience with AI agent development, prompt engineering, and integration of LLMs into workflows
  • Experience with vibe coding—leveraging AI tools (e.g., Cursor, Bolt, Claude Code) to accelerate development velocity
  • Proficiency in working with Salesforce and Workday, including custom reports, calculated fields, and Workday Studio integrations
  • Solid understanding of integration security, data governance, and compliance practices
  • Strong understanding of authentication/authorization standards like OAuth 2.0 and SAML
  • Experience with data formats like JSON and XML and data transformation techniques
Job Responsibility:
  • Design, build, and optimize scalable enterprise workflows using Workato, Hightouch and Workday Studio
  • Expand our integration stack by building hybrid solutions using Workato for workflow orchestration, and cloud-native services built using Python to handle complex logic and high-volume data processing
  • Own and evolve integration architecture that supports event-driven workflows, data syncs, and process orchestration across internal tools
  • Partner with Data and Infrastructure teams to manage real-time and batch data flows between systems, ensuring accuracy, resilience, and auditability
  • Apply agentic platforms (e.g. AgentForce, Workato Genie) to orchestrate multi-step automations using LLMs and other intelligent agents
  • Champion Airtable AI internally by building production-ready AI automations
  • Partner with business units, subject matter experts, and engineering teams to understand integration requirements, define architecture, and deliver resilient workflows
  • Document technical designs, maintain reusable components, and share best practices for future expansion
  • Bring strong technical judgment in selecting the right tools and patterns, ensure observability and monitoring are in place for integration health
  • Champion best practices that keep our integration layer reliable, maintainable, and future-ready
What we offer:
  • Benefits
  • Restricted stock units
  • Incentive compensation

Senior Systems Engineer

AnaVation is seeking a highly skilled Senior Systems Engineer to join our Cross ...
Location:
United States, Vienna
Salary:
Not provided
AnaVation
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Engineering, Computer Science, or related technical discipline
  • 7–9 years of documented experience in Information Systems Engineering
  • Hardware and network designs for large-scale enterprise applications
  • Implementing and maintaining security best practices, creating and maintaining documentation for architecture, configuration and processes
  • Experience establishing and maintaining monitoring and alerting systems for cloud and on-premises resources
  • Optimizing on-premises and cloud infrastructure for cost efficiency and performance
  • Troubleshooting and resolving issues related to performance and availability
  • Documented and demonstrated experience with troubleshooting and problem solving
  • Experience with software development
  • Experience scripting and programming for automation
Job Responsibility:
  • Architect, develop and support a highly available resource for mission-critical programs, composed of numerous AWS services and on-premises servers across multiple locations
  • Automation and Cloud Integration: Automate the creation and management of AWS resources using AWS CloudFormation, AWS Lambda, GitLab, BASH, and Python scripting
  • Infrastructure Lifecycle Automation: Design and implement an automated, hands-free monthly server rebuild and switchover process leveraging CloudFormation, Lambda, and EventBridge
  • Linux Automation and Monitoring: Develop and maintain a comprehensive system of scripts and processes to automate configuration, maintenance, and monitoring of UNIX systems
  • Maintain network hardware and server infrastructure, including analysis, configuration, installation, and testing of new hardware and software
  • Support daily network operations, evaluating utilization, monitoring response times, and detecting and resolving operational problems
  • Troubleshoot issues at both the physical and logical levels of the network, using diagnostic tools and communication protocol analysis
  • Participate in planning, design, technical reviews, and implementation of network and infrastructure projects supporting voice and data communications
  • Maintain and enhance network infrastructure standards, including TCP/IP communication protocols, and ensure adherence to industry and security best practices
  • Exhibit proficiency with virtualization technologies (VMware, AWS, etc.) and network administration, ensuring high system availability and scalability
What we offer:
  • Generous cost sharing for medical insurance for the employee and dependents
  • 100% company paid dental insurance for employees and dependents
  • 100% company paid long-term and short-term disability insurance
  • 100% company paid vision insurance for employees and dependents
  • 401k plan with generous match and 100% immediate vesting
  • Competitive Pay
  • Generous paid leave and holiday package
  • Tuition and training reimbursement
  • Life and AD&D Insurance