At Boeing, we innovate and collaborate to make the world a better place. We’re committed to fostering an environment for every teammate that’s welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.

The Boeing Company is currently seeking a Senior Data Programmer Analyst to join the team in Seattle, WA; Kent, WA; Renton, WA; or Everett, WA. We are seeking a highly experienced, detail-oriented individual to lead the design and implementation of our enterprise data infrastructure within a high-precision airplane engineering, manufacturing, and supply chain environment. This hands-on role requires proven experience with cloud platforms (Amazon Web Services, or AWS) and modern ingestion and curation tooling, as well as a strong background in data architecture, integration, and platform modernization. You will ensure the integrity, scalability, and security of data systems that enable engineering, production, supply chain, and compliance workflows.
Job Responsibilities:
Serve as the Lead Data Programmer Analyst and technical owner for data platform initiatives
Build and maintain strong partnerships with business stakeholders to deepen domain knowledge and align data solutions with strategic needs
Design and implement data ingestion patterns and pipelines to migrate and integrate on-premises sources (Oracle, Teradata) to cloud-based platforms (AWS)
Build and operate modern cloud-based ingestion tools and frameworks (Databricks, Snowflake, or equivalents)
Define and maintain data architecture, lineage, and platform documentation
Curate and structure data for business usage and self-service analytics
Leverage cloud Artificial Intelligence/Machine Learning (AI/ML) marketplaces and services to enable advanced analytics and model deployment
Map, document, and analyze current application architectures, ingestion patterns, data flows, and platform constraints to drive a clear modernization roadmap
Design conceptual, logical, and physical data models tailored to manufacturing, engineering, and Product Lifecycle Management (PLM) systems
Lead integration of diverse data sources including IoT/factory equipment telemetry, Enterprise Resource Planning (ERP) systems (e.g., SAP), and Computer-Aided Design/Product Lifecycle Management (CAD/PLM) tools (e.g., Siemens Teamcenter, CATIA)
Oversee end-to-end data integration and Extract, Transform, Load / Extract, Load, Transform (ETL/ELT) processes across heterogeneous sources
Promote and operationalize modern platform practices including infrastructure-as-code, pipeline observability, metadata/cataloging, data contracts and versioning, and policy-as-code
Automate access provisioning, ingestion pipelines, testing, Continuous Integration/Continuous Delivery (CI/CD), and deployment to reduce manual work and accelerate safe delivery
Collaborate closely with business stakeholders to understand data needs and deliver actionable insights
Mentor engineers and guide best practices for secure, scalable, maintainable data engineering
Requirements:
5+ years of experience as a Developer
3+ years of experience with cloud (AWS) and modern ingestion tools (Databricks, Snowflake, or similar)
Experience with data warehousing and cloud/on-prem platforms such as Oracle, Teradata, Redshift, and AWS services
Experience in data processing with Python and PySpark
Experience with ETL/ELT patterns, data integration techniques, and data modeling
Experience with Linux operating systems and shell scripting
Experience with infrastructure-as-code (e.g., Terraform, CloudFormation), CI/CD pipelines, and automated testing for data pipelines
Experience communicating with stakeholders and cross-functional teams
Nice to have:
Bachelor’s degree or higher
Experience in manufacturing or aerospace
Experience with Oracle, Teradata, SQL Server
Experience with AWS Redshift
Experience with data governance, cataloging tools, and data contract/versioning frameworks
Experience implementing observability for data pipelines and datasets
What we offer:
Competitive base pay and variable compensation opportunities