As part of the cybersecurity organization, the Data Engineer is responsible for designing, building, and maintaining data infrastructure to support data-driven decision-making. This role involves working with large datasets, developing reports, executing data governance initiatives, and ensuring data is accessible, reliable, and efficiently managed. The role sits at the intersection of data infrastructure and business insight delivery, requiring the Data Engineer to design and build robust data pipelines while also translating data into meaningful visualizations for stakeholders across the organization. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture, ETL processes, and cybersecurity data frameworks.
Job Responsibility:
Design, develop, and maintain data solutions for data generation, collection, and processing
Serve as a key team member assisting in the design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Develop and maintain interactive dashboards and reports using tools like Tableau, ensuring data accuracy and usability
Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures
Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with data scientists to develop pipelines that meet dynamic business needs
Share and discuss findings with team members, following the SAFe Agile delivery model
Requirements:
Master’s degree, or Bachelor’s degree plus 5 to 9 years of experience, in Computer Science, IT, or a related field
Hands-on experience with data practices, technologies, and platforms, such as Databricks, Python, GitLab, LucidChart, etc.
Proficiency in data analysis tools (e.g., SQL) and experience with data sourcing tools
Excellent problem-solving skills and the ability to work with large, complex datasets
Understanding of data governance frameworks, tools, and best practices
Knowledge of and experience with data standards (FAIR) and protection regulations and compliance requirements (e.g., GDPR, CCPA)
Experience working in a product team environment
Experience working in an Agile environment
AWS Certified Data Engineer preferred
Databricks certification preferred
Initiative to explore alternate technology and approaches to solving problems
Skilled in breaking down problems, documenting problem statements, and estimating efforts
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Nice to have:
Experience with ETL tools and various Python packages related to data processing, machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Experience with data visualization and dashboarding tools such as Tableau, Power BI, or similar
Knowledge of Python/R, Databricks, cloud data platforms