We are seeking an experienced Data Engineer to join the Governance, Risk, and Compliance (GRC) team at Raytheon Technologies. The candidate will work closely with our GRC DevOps team and various IT and Cybersecurity stakeholders to design, implement, and maintain data warehousing solutions. This role focuses on building scalable data pipelines and models, transforming raw data (from structured, semi-structured, and unstructured sources) into curated datasets, and ensuring data is accessible for BI reporting and AI/ML use cases.
Job Responsibilities:
Collaborate with Business and Data Analysts, as well as Front-End and Full-Stack AI Developers, to understand data requirements and deliver scalable solutions that support large-scale automation initiatives incorporating AI/ML
Design, develop, and optimize ETL/ELT pipelines to process, model and transform data from raw to curated layers, enabling seamless integration into published layers for BI and advanced analytics
Implement and manage data warehousing solutions using Object storage, Snowflake, Databricks, Matillion, and Informatica
Develop and maintain APIs to facilitate secure and efficient data integration between various IT, Cyber and GRC systems, applications, and data pipelines
Ensure the accuracy, reliability, and scalability of data pipelines and data models
Support the ingestion, integration, and transformation of large datasets to meet IT, Cybersecurity and GRC operational and reporting needs
Partner with stakeholders to understand their data and reporting requirements and provide tailored solutions
Stay informed on the latest advancements in data engineering, warehousing, and integration tools and methodologies
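To give a flavor of the raw-to-curated transformation work described above, here is a minimal sketch in Python. The record schema, field names, and cleaning rules are hypothetical illustrations only; in practice this logic would run on the object storage / Snowflake / Databricks / Matillion stack named in this posting.

```python
from datetime import datetime, timezone

def curate(raw_records):
    """Transform raw event records into a curated, deduplicated dataset.

    Hypothetical raw -> curated layer step: drop malformed rows,
    normalize types and casing, and deduplicate on record id.
    """
    curated = {}
    for rec in raw_records:
        rec_id = rec.get("id")
        if rec_id is None or "timestamp" not in rec:
            continue  # a real pipeline would quarantine malformed rows
        # Later records with the same id overwrite earlier ones (dedupe).
        curated[rec_id] = {
            "id": rec_id,
            "source": str(rec.get("source", "unknown")).lower(),
            "ingested_at": datetime.fromtimestamp(
                rec["timestamp"], tz=timezone.utc
            ).isoformat(),
        }
    return list(curated.values())
```

The same pattern — validate, normalize, deduplicate, then publish — applies whether the transformation is expressed in Python, SQL, or an ETL tool's components.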
Requirements:
Proven experience as a Data Engineer with a focus on data warehousing, ETL/ELT development, and pipeline design
Strong proficiency in SQL, experience with relational and non-relational databases (e.g., MySQL, PostgreSQL, SQL Server, Snowflake, Databricks)
Experience building APIs, integrating data pipelines with RESTful or GraphQL APIs, and implementing CI/CD for pipelines and SQL transformations (Git workflows, automated testing, release/version control)
Hands-on experience with ETL/ELT tools and platforms such as Matillion, Informatica, or equivalent
Proficiency in programming languages such as Python or Java for building and optimizing data pipelines
Expertise in cloud platforms (AWS, Google Cloud, Azure) and their data services
Familiarity with BI tools like Power BI and an understanding of how to prepare data for reporting needs
Strong analytical and problem-solving skills with a focus on delivering high-quality, scalable solutions
Excellent communication and collaboration skills for cross-functional teamwork
US citizenship required
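One of the requirements above is CI/CD with automated testing for SQL transformations. A minimal sketch of what such a test might look like, using an in-memory SQLite database as a stand-in (the table and column names are hypothetical; a real suite would run against the warehouse through the team's test harness and CI pipeline):

```python
import sqlite3

# Hypothetical SQL transformation under test: count high-severity
# findings per source. Severity threshold and schema are assumptions.
TRANSFORM_SQL = """
    SELECT source, COUNT(*) AS finding_count
    FROM raw_findings
    WHERE severity >= 7
    GROUP BY source
    ORDER BY source
"""

def run_transform_test():
    """Load fixture rows, run the transformation, return the result."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_findings (source TEXT, severity INTEGER)")
    conn.executemany(
        "INSERT INTO raw_findings VALUES (?, ?)",
        [("scanner_a", 9), ("scanner_a", 3), ("scanner_b", 8)],
    )
    rows = conn.execute(TRANSFORM_SQL).fetchall()
    conn.close()
    return rows
```

Wiring a test like this into a Git workflow means every change to a transformation is validated against known fixtures before it reaches the published layer.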
Nice to have:
Experience working on Cybersecurity or GRC-related projects or industries
Working knowledge of machine learning and AI concepts (model registry/model hub workflows or equivalent)
Familiarity with data governance, security, and compliance principles
Understanding of regulatory compliance standards and frameworks