Meta Platforms, Inc. (Meta), formerly known as Facebook Inc., builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps and services like Messenger, Instagram, and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology.
Job Responsibilities:
Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision-making
Design, build and launch collections of sophisticated data models and visualizations that support multiple use cases across different products or domains
Collaborate with engineers, product managers and data scientists to understand data needs, representing key data insights visually in a meaningful way
Define and manage SLAs for all data sets in allocated areas of ownership
Create and contribute to frameworks that improve the efficacy of logging data, while working with data infrastructure to triage and resolve issues
Determine and implement the security model based on privacy requirements, confirm safeguards are followed, address data quality issues, and evolve governance processes within allocated areas of ownership
Solve challenging data integration problems utilizing optimal ETL patterns, frameworks, query techniques, and sourcing from structured and unstructured data sources
Optimize pipelines, dashboards, frameworks, and systems to facilitate easier development of data artifacts
Influence product and cross-functional teams to identify data opportunities to drive impact
Work on problems of diverse scope where analysis of data requires evaluation of identifiable factors
Demonstrate good judgment in selecting methods and techniques for obtaining solutions
Requirements:
Requires a Master’s degree (or foreign equivalent) in Computer Science, Engineering, Information Systems, Mathematics, Statistics, Data Analytics, Applied Sciences, or a related field
Requires completion of one graduate-level course, one research project, or one internship involving each of the following: Features, design, and use-case scenarios across a big data ecosystem
Custom ETL design, implementation, and maintenance
Object-oriented programming languages
Schema design and dimensional data modeling
Writing SQL statements
Analyzing data to identify deliverables, gaps, and inconsistencies
Managing and communicating data warehouse plans to internal clients
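Several of the skill areas above (schema design, dimensional data modeling, writing SQL statements) come together in the star-schema pattern common in data warehousing. The sketch below is a hypothetical, minimal illustration of that pattern, not anything from this posting: all table and column names (`dim_region`, `fact_sales`, etc.) are invented for the example, and SQLite stands in for a production warehouse.

```python
import sqlite3

# Minimal star schema: one dimension table and one fact table.
# Names are illustrative only; an in-memory SQLite database stands
# in for a real data warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes keyed by a surrogate key.
cur.execute("""
    CREATE TABLE dim_region (
        region_key  INTEGER PRIMARY KEY,
        region_name TEXT NOT NULL
    )
""")

# Fact table: numeric measures plus foreign keys into the dimensions.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        region_key INTEGER REFERENCES dim_region(region_key),
        amount     REAL NOT NULL
    )
""")

cur.executemany("INSERT INTO dim_region VALUES (?, ?)",
                [(1, "EMEA"), (2, "AMER")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0)])

# A typical analytical query: aggregate the fact table joined to a
# dimension, grouping by a dimension attribute.
totals = dict(cur.execute("""
    SELECT r.region_name, SUM(f.amount)
    FROM fact_sales AS f
    JOIN dim_region AS r USING (region_key)
    GROUP BY r.region_name
""").fetchall())
print(totals)
conn.close()
```

Separating descriptive attributes (dimensions) from measures (facts) is what makes such models easy to query and extend, which is the gist of the modeling and SQL skills listed above.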