Senior Software Engineer (Cloud ETL & Data)

Bentley Systems

Location:
Lithuania, Vilnius

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a data-centric Senior Software Engineer to design, build, and evolve the core of our iModel Cloud Sync Services. This role is part of our Infrastructure Cloud team and will be focused on developing reliable, large-scale cloud services that ingest, transform, and synchronize complex engineering data from a multitude of sources. As a Senior Software Engineer, you will be pivotal in building the data backbone of our platform. You will work hands-on with distributed systems and cloud-native technologies to solve complex data-synchronization problems, ensuring our users have a seamless and reliable experience with their digital twins.

Job Responsibility:

  • Design and build robust, scalable ETL pipelines for parsing, validating, and transforming diverse engineering data formats
  • Develop and implement strategies for schema management and versioning within data synchronization workflows
  • Architect solutions that guarantee deterministic execution, fault tolerance, and transactional consistency for all data operations
  • Build distributed, event-driven, and task-oriented systems using microservices, messaging, and containerized workloads on Microsoft Azure
  • Implement resilience patterns such as retries, circuit breakers, and rate limiting to ensure high availability
  • Design and implement concurrency control, idempotency, and conflict-resolution patterns in distributed data workflows
  • Build and maintain comprehensive observability, including structured logging, metrics, and distributed tracing
  • Collaborate with architects on high-level design and implementation decisions
  • Mentor junior engineers through code reviews and technical guidance
  • Contribute to shared engineering standards and documentation

Requirements:

  • Graduate or postgraduate degree in Computer Science, Software Engineering, or equivalent experience
  • 7+ years of professional experience in software engineering with exposure to distributed or cloud-based systems
  • Strong experience with Azure, microservices, containers, and Kubernetes
  • Hands-on experience building ETL pipelines, workflow-based systems, or event-driven architectures
  • Solid proficiency in an object-oriented language, preferably C#/.NET
  • Solid understanding of observability, CI/CD, reliability, and cloud operations
  • Strong problem-solving skills and the ability to deliver production-quality software

Nice to have:

  • Experience with workflow orchestration engines (e.g., Azure Durable Functions, Temporal, Airflow)
  • Knowledge of advanced data consistency patterns (e.g., Change Data Capture, event sourcing, saga patterns)
  • Familiarity with infrastructure-as-code and modern CI/CD practices (e.g., Terraform, Azure DevOps)
  • Prior experience in the AEC (Architecture, Engineering, Construction) domain or with complex 3D/geometric data

What we offer:
  • A great team and culture
  • An exciting career as an integral part of a world-leading software company providing solutions for architecture, engineering, and construction
  • An attractive salary and benefits package
  • A commitment to inclusion, belonging, and colleague wellbeing through global initiatives and resource groups
  • A company committed to making a real difference by advancing the world’s infrastructure for better quality of life, where your contributions help build a more sustainable, connected, and resilient world
  • Training and professional development opportunities (certification programs, conferences, etc.)
  • Additional annual leave days and extra paid days for different occasions (marriage, moving day, bereavement leave, etc.)
  • Health insurance package and 24/7 accident insurance
  • Referral program with bonuses
  • Extra paid day for volunteering in the organization of your choice
  • Ability to work from office or fully remote from home

Additional Information:

Job Posted:
May 15, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Senior Software Engineer (Cloud ETL & Data)

Senior Software Engineer

In your role as a Software Engineer with expertise in backend, you will work wit...
Location:
Sweden, Malmö
Salary:
Not provided
IKEA
Expiration Date:
Until further notice
Requirements:
  • At least 5 years of experience working with development & API design
  • Develop, test, and deploy RESTful and GraphQL APIs using a variety of frameworks
  • Implement API authentication and authorization mechanisms (OAuth 2.0, JWT, API keys)
  • Optimize backend performance through caching, indexing, and query optimization
  • Hands-on experience building modern, responsive web applications using React
  • Good understanding of component-based architecture, design systems, and styling approaches
  • Good experience with front-end and UI automation testing
  • Experience integrating front-end applications with RESTful APIs
  • Design and develop serverless and containerized applications
  • Leverage event-driven architectures
Job Responsibility:
  • Design, develop, and maintain scalable and secure digital and data products in a cloud environment
  • Set up development practices working in product teams
  • Ensure good code quality
  • Build CI/CD pipelines
  • Collaborate with the technology team of software and data engineers
  • Assist the product manager with technical inputs
  • Collaborate with business and IT stakeholders to improve solution architectures
Employment Type: Full-time

Senior Data Engineer

Senior Data Engineer to design, develop, and optimize data platforms, pipelines,...
Location:
United States, Chicago
Salary:
160555.00 - 176610.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Master's degree in Engineering Management, Software Engineering, Computer Science, or a related technical field
  • 3 years of experience in data engineering
  • Experience building data platforms and pipelines
  • Experience with AWS, GCP or Azure
  • Experience with SQL and Python for data manipulation, transformation, and automation
  • Experience with Apache Airflow for workflow orchestration
  • Experience with data governance, data quality, data lineage and metadata management
  • Experience with real-time data ingestion tools including Pub/Sub, Kafka, or Spark
  • Experience with CI/CD pipelines for continuous deployment and delivery of data products
  • Experience maintaining technical records and system designs
Job Responsibility:
  • Design, develop, and optimize data platforms, pipelines, and governance frameworks
  • Enhance business intelligence, analytics, and AI capabilities
  • Ensure accurate data flows and push data-driven decision-making across teams
  • Write product-grade performant code for data extraction, transformations, and loading (ETL) using SQL/Python
  • Manage workflows and scheduling using Apache Airflow and build custom operators for data ETL
  • Build, deploy and maintain both inbound and outbound data pipelines to integrate diverse data sources
  • Develop and manage CI/CD pipelines to support continuous deployment of data products
  • Utilize Google Cloud Platform (GCP) tools, including BigQuery, Composer, GCS, DataStream, and Dataflow, for building scalable data systems
  • Implement real-time data ingestion solutions using GCP Pub/Sub, Kafka, or Spark
  • Develop and expose REST APIs for sharing data across teams
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Annual incentive program
Employment Type: Full-time

Senior Data Engineer

We’re hiring a Senior Data Engineer with strong experience in AWS and Databricks...
Location:
India, Hyderabad
Salary:
Not provided
Appen
Expiration Date:
Until further notice
Requirements:
  • 5-7 years of hands-on experience with AWS data engineering technologies, such as Amazon Redshift, AWS Glue, AWS Data Pipeline, Amazon Kinesis, Amazon RDS, and Apache Airflow
  • Hands-on experience working with Databricks, including Delta Lake, Apache Spark (Python or Scala), and Unity Catalog
  • Demonstrated proficiency in SQL and NoSQL databases, ETL tools, and data pipeline workflows
  • Experience with Python, and/or Java
  • Deep understanding of data structures, data modeling, and software architecture
  • Strong problem-solving skills and attention to detail
  • Self-motivated and able to work independently, with excellent organizational and multitasking skills
  • Exceptional communication skills, with the ability to explain complex data concepts to non-technical stakeholders
  • Bachelor's Degree in Computer Science, Information Systems, or a related field. A Master's Degree is preferred.
Job Responsibility:
  • Design, build, and manage large-scale data infrastructures using a variety of AWS technologies such as Amazon Redshift, AWS Glue, Amazon Athena, AWS Data Pipeline, Amazon Kinesis, Amazon EMR, and Amazon RDS
  • Design, develop, and maintain scalable data pipelines and architectures on Databricks using tools such as Delta Lake, Unity Catalog, and Apache Spark (Python or Scala), or similar technologies
  • Integrate Databricks with cloud platforms like AWS to ensure smooth and secure data flow across systems
  • Build and automate CI/CD pipelines for deploying, testing, and monitoring Databricks workflows and data jobs
  • Continuously optimize data workflows for performance, reliability, and security, applying Databricks best practices around data governance and quality
  • Ensure the performance, availability, and security of datasets across the organization, utilizing AWS’s robust suite of tools for data management
  • Collaborate with data scientists, software engineers, product managers, and other key stakeholders to develop data-driven solutions and models
  • Translate complex functional and technical requirements into detailed design proposals and implement them
  • Mentor junior and mid-level data engineers, fostering a culture of continuous learning and improvement within the team
  • Identify, troubleshoot, and resolve complex data-related issues
Employment Type: Full-time

Senior Data Engineer

Darwin Recruitment are hiring for a Senior Data Engineer for a business in Luxem...
Location:
Luxembourg
Salary:
120000.00 EUR / Year
Darwin Recruitment GmbH
Expiration Date:
Until further notice
Requirements:
  • 5+ years of combined experience in Data Engineering, Cloud Engineering or similar roles
  • Proficiency in designing scalable, efficient, and maintainable data pipelines and architecture
  • Proficiency in deploying workloads on Kubernetes clusters
  • Strong experience with Apache Airflow
  • Experience in building and managing ETL workflows for data extraction, transformation, and loading
  • Experience with software development life cycle: design, development, test, deployment, operations
  • Proficiency in Python and Python data stack
  • Working knowledge of some infrastructure-as-code framework, preferably Terraform
  • Highly self-motivated, keen learner able to solve challenging problems with creative solutions
  • Strong team player with demonstrated ability to take ownership and drive execution
Job Responsibility:
  • Design and implement cloud-native data pipelines
  • Optimize performance and scalability of existing data pipelines
  • Deploy and maintain cloud infrastructure (AWS)
  • Take ownership of system components from concept to delivery
  • Mentor team members
  • Collaborate with Data Scientists to bring scientific models to production

Senior Data Engineer

The Data Engineer is responsible for designing, building, and maintaining robust...
Location:
Germany, Berlin
Salary:
Not provided
ib vogt GmbH
Expiration Date:
Until further notice
Requirements:
  • Degree in Computer Science, Data Engineering, or related field
  • 5+ years of experience in data engineering or similar roles
  • Experience in renewable energy, engineering, or asset-heavy industries is a plus
  • Strong experience with modern data stack (e.g., PowerPlatform, Azure Data Factory, Databricks, Airflow, dbt, Synapse, Snowflake, BigQuery, etc.)
  • Proficiency in Python and SQL for data transformation and automation
  • Experience with APIs, message queues (Kafka, Event Hub), data streaming and knowledge of data lakehouse and data warehouse architectures
  • Familiarity with CI/CD pipelines, DevOps practices, and containerization (Docker, Kubernetes)
  • Understanding of cloud environments (preferably Microsoft Azure, PowerPlatform)
  • Strong analytical mindset and problem-solving attitude paired with a structured, detail-oriented, and documentation-driven work style
  • Team-oriented approach and excellent communication skills in English
Job Responsibility:
  • Design, implement, and maintain efficient ETL/ELT data pipelines connecting internal systems (M365, Sharepoint, ERP, CRM, SCADA, O&M, etc.) and external data sources
  • Integrate structured and unstructured data from multiple sources into the central data lake / warehouse / Dataverse
  • Build data models and transformation workflows to support analytics, reporting, and AI/ML use cases
  • Implement data quality checks, validation rules, and metadata management according to the company’s data governance framework
  • Automate workflows, optimize performance, and ensure scalability of data pipelines and processing infrastructure
  • Work closely with Data Scientists, Software Engineers, and Domain Experts to deliver reliable datasets for Digital Twin and AI applications
  • Maintain clear documentation of data flows, schemas, and operational processes
What we offer:
  • Competitive remuneration and motivating benefits
  • Opportunity to shape the data foundation of ib vogt’s digital transformation journey
  • Work on cutting-edge data platforms supporting real-world renewable energy assets
  • A truly international working environment with colleagues from all over the world
  • An open-minded, collaborative, dynamic, and highly motivated team
Employment Type: Full-time

Senior Software Engineer

We're hiring a Senior Software Engineer to join Data Platform squad. We are look...
Location:
France, Paris
Salary:
55000.00 - 60000.00 EUR / Year
Implicity
Expiration Date:
Until further notice
Requirements:
  • Master's degree in Computer Science, Data Engineering, or equivalent
  • 5+ years of experience in backend development
  • Strong experience in backend development (Java, Node.js)
  • Hands-on experience with data engineering tools (Airflow, dbt, Kafka, Spark, Flink)
  • Solid understanding of SQL and NoSQL databases, including query optimization
  • Familiarity with cloud data services (AWS Redshift, Google BigQuery, Snowflake)
  • Knowledge of containerization and orchestration (Docker, Kubernetes)
  • Experience working with APIs and integrating data from multiple sources
  • Highly aware of secure development practices (OWASP Top 10)
  • At least one prior experience working with a microservices architecture
Job Responsibility:
  • Develop and maintain scalable backend applications with a focus on data-intensive workflows
  • Design, implement, and optimize ETL pipelines for efficient data processing
  • Work with streaming and batch data processing frameworks (Apache Kafka, Spark, Flink)
  • Optimize relational (PostgreSQL, MySQL) databases for performance
  • Ensure data integrity, reliability, and scalability across distributed systems
  • Collaborate with data scientists, analysts, and DevOps teams to integrate data-driven solutions
  • Deploy and manage applications in cloud environments (AWS) using CI/CD pipelines
What we offer:
  • Health care plan: Alan (50% employer)
  • Luncheon voucher: 9€ (50% employer)
  • Transport: 50% of your pass OR sustainable mobility pass
  • Eligible for stock option (BSPCEs) according to the company's existing rules
  • Regular team events, especially on Thursday evenings
  • Shareholder: you will be incentivized with company equity
Employment Type: Full-time

Senior Data Engineer

Location:
United States, Flowood
Salary:
Not provided
PhasorSoft Group
Expiration Date:
Until further notice
Requirements:
  • Experience with Snowflake or Azure Cloud Data Engineering, including setting up and managing data pipelines
  • Proficiency in designing and implementing ETL processes for data integration
  • Knowledge of data warehousing concepts and best practices
  • Strong SQL skills for querying and manipulating data in Snowflake or Azure databases
  • Experience with data modeling techniques and tools to design efficient data structures
  • Understanding of data governance principles and experience implementing them in cloud environments
  • Proficiency in Tableau or Power BI for creating visualizations and interactive dashboards
  • Ability to write scripts (e.g., Python, PowerShell) for automation and orchestration of data pipelines
  • Skills to monitor and optimize data pipelines for performance and cost efficiency
  • Knowledge of cloud data security practices and tools to ensure data protection
Job Responsibility:
  • Design, implement, and maintain data pipelines and architectures on Snowflake or Azure Cloud platforms
  • Develop ETL processes to extract, transform, and load data from various sources into data warehouses
  • Optimize data storage, retrieval, and processing for performance and cost-efficiency in cloud environments
  • Collaborate with stakeholders to understand data requirements and translate them into technical solutions
  • Implement data security and governance best practices to ensure data integrity and compliance
  • Work with reporting tools such as Tableau or Power BI to create interactive dashboards and visualizations
  • Monitor and troubleshoot data pipelines, ensuring reliability and scalability
  • Automate data workflows and processes using cloud-native services and scripting languages
  • Provide technical expertise and support to data analysts, scientists, and business users
Employment Type: Full-time

Senior Data Engineer

Radix is building the most trusted data and analytics platform in multifamily. J...
Location:
United States, Scottsdale
Salary:
Not provided
Radix (AZ)
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in data engineering or backend systems, with 2+ years collaborating closely with analytical or scientific practitioners
  • Experience designing at least one end-to-end data application (visualization, model pipeline, etc.) for production use
  • Strong understanding of data modeling, batch processing, and streaming systems
  • Hands-on experience with SQL/NoSQL databases, cloud storage, file-based datasets, and Infrastructure-as-code
  • Experience building or operating data pipelines on AWS cloud services (Lambda, S3, RDS, ECS)
  • Understanding of AI/LLM integration and prompt engineering fundamentals
  • Proficiency with Git/GitHub, and familiarity with Spark and Kubernetes
  • Strong problem-solving skills, ownership, and ability to identify root causes of technical debt and recurring problems
  • Demonstrated ability to translate Product/Science requirements into technical plans and hold teams accountable to delivery timelines
  • Undergraduate degree in computer science, computer engineering, software engineering, or equivalent
Job Responsibility:
  • Build scalable ETL/ELT pipelines for ingesting structured and unstructured data (Excel, JSON, PDFs, APIs)
  • Design and maintain data pipelines using SQL, Python, Node.js, or TypeScript
  • Work with distributed compute systems (Spark, Kubernetes), message queues, and streaming data
  • Manage and optimize data storage in MongoDB, PostgreSQL, Redis, Snowflake, and S3
  • Develop clean, standardized data schemas and event-driven transformations
  • Integrate AI-assisted parsers, mappers, and LLM-supported transformations
  • Collaborate with backend, analytics, product, and AI teams to break down requirements into well-defined engineering problems
  • Implement monitoring, data validation, and reliability checks (DQ rules, freshness, duplication)
  • Own production readiness, including on-call responsibilities and incident follow-ups
  • Mentor engineers, conduct thorough code reviews, and introduce patterns that raise the team's technical bar
What we offer:
  • Medical, dental and vision coverage designed to support your wellbeing
  • Pre-IPO Equity
  • Performance Bonus
  • Learn From the Best
  • Build Category-Defining Products