
Experienced Data Platform Engineer

Boeing

Location:
India, Bengaluru

Contract Type:
Not provided

Salary:
Not provided

Job Description:

At Boeing, we innovate and collaborate to make the world a better place. We’re committed to fostering an environment for every teammate that’s welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us. The Boeing Test and Evaluation team is currently looking for an Experienced Data Platform Engineer to join the team in Bengaluru, Karnataka.

Job Responsibilities:

  • Design, deploy, and operate systems across Azure and AWS, including hybrid and multi‑cloud environments
  • Evaluate and select cloud services based on cost, usability, scalability, and long‑term maintainability
  • Implement infrastructure‑as‑code using Terraform, CloudFormation, ARM, or Bicep to enable repeatable, secure deployments
  • Support containerized and cloud‑native architectures (e.g., AKS, EKS, ECS)
  • Design and optimize relational database schemas and data models supporting both transactional and analytical workloads
  • Build and integrate data services and pipelines that enable engineers to discover, explore, and reuse test data efficiently
  • Collaborate with data scientists and analysts to support analytics, visualization, and ML workflows without exposing unnecessary infrastructure complexity
  • Build CI/CD pipelines and DevSecOps automation to enable rapid, reliable, and secure delivery
  • Apply Site Reliability Engineering (SRE) practices to ensure system availability, performance, and resilience
  • Build and maintain observability capabilities—including logging, metrics, and distributed tracing—to enable rapid diagnosis, performance optimization, and operational insight
  • Contribute to runbooks, incident response, postmortems, and continuous improvement activities
  • Partner with security and compliance teams to ensure solutions meet Boeing security, data governance, and regulatory requirements (e.g., ITAR, EAR, DFARS)
  • Produce clear technical documentation and operational artifacts
  • Present technical concepts, findings, and recommendations to engineers, stakeholders, and leadership as needed
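One responsibility above calls for schemas that serve both transactional and analytical workloads. As a minimal sketch of that split (SQLite stands in for PostgreSQL here, and the `test_run` table with its columns is hypothetical, not Boeing's data model), a normalized table takes the transactional writes while a derived view serves the aggregate reads:

```python
import sqlite3

# Hypothetical example: one normalized table for transactional writes,
# plus an aggregate view for analytical reads.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE test_run (
        run_id     INTEGER PRIMARY KEY,
        article    TEXT NOT NULL,        -- unit under test
        started_at TEXT NOT NULL,        -- ISO-8601 timestamp
        passed     INTEGER NOT NULL      -- 1 = pass, 0 = fail
    );
    -- Analytical view: pass rate per article, computed from the
    -- transactional table rather than duplicated into it.
    CREATE VIEW pass_rate AS
        SELECT article, AVG(passed) AS rate, COUNT(*) AS n
        FROM test_run
        GROUP BY article;
""")
conn.executemany(
    "INSERT INTO test_run (article, started_at, passed) VALUES (?, ?, ?)",
    [("wing-A", "2026-03-01T10:00:00", 1),
     ("wing-A", "2026-03-02T10:00:00", 0),
     ("tail-B", "2026-03-01T11:00:00", 1)],
)
print(dict(conn.execute("SELECT article, rate FROM pass_rate")))
```

Keeping the analytical shape in a view (or, at scale, a materialized view or a separate analytical store) avoids duplicating data while letting each workload query its natural shape.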

Requirements:

  • Bachelor's degree in Engineering, Engineering Technology (including Manufacturing Technology), Computer Science, Data Science, Mathematics, Physics, Chemistry, or non-US equivalent qualifications directly related to the work statement
  • 3+ years of experience in full-stack application development or equivalent systems-software integration roles
  • Strong systems thinking skills with experience designing end-to-end software and system solutions
  • Proficiency in one or more programming languages (JavaScript/TypeScript, Python, C#, or Go)
  • Experience deploying and operating applications in Azure and/or AWS environments
  • 2+ years of hands-on experience with infrastructure-as-code (Terraform, CloudFormation, ARM/Bicep)
  • Experience with relational databases (e.g., PostgreSQL), including schema design and performance considerations
  • Working knowledge of CI/CD pipelines, Git, Docker, and Linux
  • Strong communication skills and ability to work across technical and non‑technical stakeholders
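The infrastructure-as-code requirement above rests on one idea: declare the desired state and let tooling compute the delta. A toy Python model of that plan step (the resource names are invented, and real Terraform/CloudFormation planning also tracks dependencies, providers, and state; this only sketches the diff):

```python
def plan(desired: dict, actual: dict) -> dict:
    """Toy IaC planner: diff desired vs. actual resource maps,
    the way an IaC tool's plan step classifies changes."""
    return {
        "create": sorted(desired.keys() - actual.keys()),
        "destroy": sorted(actual.keys() - desired.keys()),
        "update": sorted(k for k in desired.keys() & actual.keys()
                         if desired[k] != actual[k]),
    }

# Invented resources: one VM needs resizing, one DB is new, one VM is gone.
desired = {"vm-web": {"size": "B2s"}, "sql-main": {"tier": "GP"}}
actual  = {"vm-web": {"size": "B1s"}, "vm-old":   {"tier": "A0"}}
print(plan(desired, actual))
# → {'create': ['sql-main'], 'destroy': ['vm-old'], 'update': ['vm-web']}
```

Because the plan is computed from declared state rather than hand-written steps, re-running it is repeatable: an unchanged environment yields an empty plan.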

Nice to have:

  • Experience designing developer platforms or internal engineering tools
  • Experience with Kubernetes, serverless, or event-driven architectures
  • Familiarity with observability tooling (Prometheus, Grafana, Datadog, Azure Monitor)
  • Experience with SRE concepts (SLAs, SLOs, error budgets, incident postmortems)
  • Cloud certifications (AWS and/or Azure)
  • Experience working in regulated, safety-critical, or aerospace environments
  • Familiarity with system architecture frameworks (SysML, DoDAF, UAF)
  • Experience building operator or human-machine interfaces
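For readers new to the SRE vocabulary above: an availability SLO implies an error budget, the amount of unreliability a service is allowed to spend over a window. A quick worked example (the 99.9% target is illustrative, not a figure from this posting):

```python
def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Minutes of allowed downtime in a window for a given availability SLO."""
    return (1.0 - slo) * window_days * 24 * 60

# A 99.9% ("three nines") SLO over a 30-day window (43,200 minutes)
# leaves 0.1% of it, about 43 minutes, as error budget:
print(round(error_budget_minutes(0.999), 1))  # → 43.2
```

When incidents consume the budget faster than the window refills it, common SRE practice is to slow feature rollout until reliability recovers.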

What we offer:

  • Competitive base pay and incentive programs
  • Industry-leading tuition assistance program pays your institution directly
  • Resources and opportunities to grow your career
  • Up to $10,000 match when you support your favorite nonprofit organizations
  • Relocation based on candidate eligibility

Additional Information:

Job Posted:
March 04, 2026

Expiration:
March 07, 2026

Employment Type:
Full-time

Work Type:
Hybrid

Similar Jobs for Experienced Data Platform Engineer

Platform Engineer – Storage Product Platform Development

Senior level network and system expert to define and lead Enterprise storage pro...
Location:
India, Bangalore
Salary:
Not provided
Hewlett Packard Enterprise
Expiration:
Until further notice
Requirements:
  • Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent
  • Typically 8+ years of total experience
  • Prior experience of bringing up a Hardware platform
  • Prior experience of performance tuning disk drives, device drivers & memory management for scale
  • Designing software systems running on multiple platform types and protocols like SNMP & iSCSI
  • Must have very strong system programming background with C/C++/Golang for large enterprise class software
  • Must have proficiency with data structures, algorithms and multi-threaded programming
  • Must have in-depth knowledge of OS internals, networking, and storage concepts
  • Strong analytical and problem-solving skills
Job Responsibilities:
  • Design and develop products that require in-depth knowledge of Device-driver development and Linux internals
  • Design, specify, and lead the implementation of the platform features of the storage array
  • Work with cross organizational interactions: Hardware, Firmware, System management, Network teams, Architects
  • Design enhancements, updates, and programming changes for portions and subsystems of systems software, including IO path, storage management, databases and cloud-related application
  • Write and execute complete testing plans, protocols, and documentation
  • Identify, debug and create solutions for issues with code and integration into system architecture
  • Collaborate and communicate with management, internal, and external partners regarding software systems design status, project progress, and issue resolution
  • Provide guidance and mentoring to less-experienced staff members
What we offer:
  • Health & Wellbeing benefits
  • Personal & Professional Development programs
  • Unconditional Inclusion environment
  • Comprehensive suite of benefits supporting physical, financial and emotional wellbeing
  • Full-time

Traineeship Data Engineering

This is a traineeship program in data engineering offered by Sopra Steria in Bru...
Location:
Belgium, Brussels
Salary:
Not provided
Sopra Steria
Expiration:
Until further notice
Requirements:
  • You like to solve complex (data) problems and theorems and are solution and result oriented
  • Fascinated by new trends and best practices, you make a point of working carefully and independently
  • You have recently graduated or are about to graduate with a bachelor's or master's degree in a data-related field
  • You are eager to learn and excited to expand your knowledge across various data domains
  • You have strong communication skills and enjoy working in a consultancy environment
  • You are analytical, critical, and enjoy collaborating with others to find solutions
  • You have an interest in data cloud platforms
  • You are enthusiastic about technologies like Azure, SQL, Python, and AWS
  • You are fluent in Dutch and English
  • Knowledge of French is a plus
Job Responsibilities:
  • You will become Cloud Certified (Azure (Fabric), AWS, Databricks, Google,…)
  • You will become proficient in querying databases through SQL
  • You will start building, implementing and maintaining scalable data pipelines
  • You will learn how to program in Python, aligned with best practices
  • Data modelling will hold no more secrets for you
  • You will learn software development and participate in an Agile environment, using the Scrum principles
  • You will be learning how to gather requirements from various stakeholders
  • In time, you will become an experienced designer of data solutions
  • You will follow a dedicated training program designed and given by our own colleagues
  • Throughout and after the program, you can rely on the support of our senior colleagues
What we offer:
  • Access to our Sopra Steria training and personal development academy
  • A company car lease or mobility budget
  • A company laptop and smartphone
  • Monthly office drinks
  • Insurance coverage
  • Meal vouchers
  • A competitive salary with an indefinite contract
  • Full-time

Data Engineer

We are looking for a seasoned Data Engineer to join our MarTech and Data Strateg...
Location:
United States
Salary:
Not provided
Zion & Zion
Expiration:
Until further notice
Requirements:
  • 4-6 years of experience in a data engineering role
  • Significant experience and (preferably) certified in public cloud products: GCP Cloud Architect / Data Engineer or equivalent on AWS
  • Experience with tools like dbt
  • Familiar with ETL/ELT and (nice to have) reverse ETL platforms
  • Experience on a cloud-based or Business Intelligence project (as a technical project manager, developer or architect)
  • DevOps capabilities (CI/CD, Infrastructure as code, Docker, etc.)
  • Experience leveraging digital data in the cloud for marketing activations
  • Excellent verbal and written communication skills and comfortable working with both marketing and technical teams
  • Client-facing experience for detailed technical specifications discussions
  • Fluent in SQL, Python or R
Job Responsibilities:
  • Work with internal and external teams to design and implement technical architecture in order to facilitate the advanced activation of data
  • Interact with 3rd party MarTech solutions (Google Analytics, Google Marketing Platform, Data Management Platforms, Tag Management Solutions, Adobe Analytics, Clouds, Customer Data Platforms, etc.)
  • Come up with creative solutions to integrate data from a variety of sources into platforms and data warehouses
  • Work with internal teams to specify data processing pipelines (database schemas, integrity constraints, delivery throughput) for use case activations and implement them in the cloud
  • Work with internal data science team to scope and check typical machine learning and AI project requirements
  • Work with data visualization teams to design and implement tables to help power complex dashboards

Data Engineer

This is a data engineer position - a programmer responsible for the design, deve...
Location:
India, Chennai
Salary:
Not provided
Citi
Expiration:
Until further notice
Requirements:
  • 5-8 years of experience working in data ecosystems
  • 4-5 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 3+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java with knowledge of core spark concepts (RDDs, Dataframes, Spark Streaming, etc) and Scala and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing 'big data' data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
Job Responsibilities:
  • Ensuring high quality software development, with complete documentation and traceability
  • Develop and optimize scalable Spark Java-based data pipelines for processing and analyzing large scale financial data
  • Design and implement distributed computing solutions for risk modeling, pricing and regulatory compliance
  • Ensure efficient data storage and retrieval using Big Data
  • Implement best practices for spark performance tuning including partition, caching and memory management
  • Maintain high code quality through testing, CI/CD pipelines and version control (Git, Jenkins)
  • Work on batch processing frameworks for Market risk analytics
  • Promoting unit/functional testing and code inspection processes
  • Work with business stakeholders and Business Analysts to understand the requirements
  • Work with other data scientists to understand and interpret complex datasets
  • Full-time

Data Engineer

Hands on development and support of new or existing data applications. Work clos...
Location:
United States, Branchville, NJ
Salary:
Not provided
Robert Half
Expiration:
Until further notice
Requirements:
  • Five to seven years of experience in Data Warehousing, Data integration or Data Engineering projects
  • Ability to effectively work well with people in other departments and/or outside of the enterprise
  • Proficient in SQL
  • Experience working within Azure ecosystem
  • Experience in Informatica Powercenter, IICS, Cognos, Netezza Performance servers
  • Experienced in any of these analytical platforms - PowerBI, AzureML, Databricks or Synapse
  • Experience using Python or Scala
  • Experience in Azure DevOps and Github is preferred
  • P&C Insurance experience is preferred
  • Possesses excellent communication skills
Job Responsibility
Job Responsibility
  • Hands on development and support of new or existing data applications
  • Work closely with business teams and analysts to understand data and business processes, and make recommendations to clients as requested on best practices or long-term solutions, both to resolve current issues and to inform future system design
  • Work closely with Application and Enterprise Architects to create/review low-level implementation designs and understand high-level data flow designs developed by data architects
  • Provide technical guidance to the team for implementing complex data solutions
  • Provide support in the design, development, code reviews, test deploy and documentation of data engineering and data integration Applications
  • Maintain detailed documentation to support downstream integrations
  • Provide support for production issues
  • Perform the activities of a scrum master
  • Identify technology trends and explore opportunities for use within the organization
What we offer:
  • Medical, vision, dental, and life and disability insurance
  • Eligibility to enroll in company 401(k) plan
  • Full-time

Data Engineering Lead

The Data Engineering Lead is a strategic professional who stays abreast of developments...
Location:
India, Pune
Salary:
Not provided
Citi
Expiration:
Until further notice
Requirements:
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java with knowledge of core spark concepts (RDDs, Dataframes, Spark Streaming, etc) and Scala and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibilities:
  • Strategic Leadership: Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Team Management: Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Architecture and Design: Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Technology Selection and Implementation: Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Performance Optimization: Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness, ensuring optimal access to global wealth data
  • Collaboration: Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions that support investment strategies and client reporting
  • Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations, particularly around sensitive financial data
  • Full-time

Data Engineering Lead

The Engineering Lead Analyst is a senior level position responsible for leading ...
Location:
Singapore
Salary:
Not provided
Citi
Expiration:
Until further notice
Requirements:
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java with knowledge of core spark concepts (RDDs, Dataframes, Spark Streaming, etc) and Scala and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibilities:
  • Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness
  • Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions
  • Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations
What we offer:
  • Equal opportunity employer commitment
  • Accessibility and accommodation support
  • Global workforce benefits
  • Full-time