Meta Platforms, Inc. (Meta), formerly known as Facebook Inc., builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps and services like Messenger, Instagram, and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology.
Job Responsibilities:
Design and develop creative and innovative solutions to manipulate and transform data coming from diverse domains and components, such as advertisements and pages, as well as other custom software systems and databases
Design, architect, and develop data solutions that help product and business teams make data-driven decisions
Lead data solution design efforts end to end, including processes for developing and implementing data integration for external insights, and work on the Data Warehouse and Operational Data Store
Provide insights that influence the overall strategy and plan for the framework and construction of a scalable data solution and data warehouse environment
Leverage a homegrown Extract, Transform, Load (ETL) framework as well as off-the-shelf ETL tools, as appropriate
Collaborate with data infrastructure and engineering teams to build and extend cross-platform ETL and reports generation frameworks
Provide consultation to business partners including analysts, managers, end users, and developers to clarify work objectives, determine project scope, obtain consensus, identify problems, and recommend solutions
Support end users on ad hoc data usage and serve as a subject matter expert on functional matters
Develop ETL using Python, PHP, or similar technologies
Work with ETL techniques and best practices to manage extremely large volumes of data
Work with Data warehousing architecture and data modeling best practices
Work with Hadoop, HBase, and Hive
Work with File Systems, server architectures, and distributed systems
Work with MicroStrategy or other similar Business Intelligence (BI) reporting solutions
Work with database programming and performance tuning techniques
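The ETL responsibilities above can be sketched as a minimal extract–transform–load pipeline. This is an illustrative sketch only: the source feed, the `ad_clicks` table, and the field names are hypothetical, not Meta's actual systems.

```python
import sqlite3

def extract(rows):
    """Extract step: yield raw records from a source (here, an in-memory list)."""
    for row in rows:
        yield row

def transform(records):
    """Transform step: normalize country codes and drop rows with no clicks."""
    for rec in records:
        if rec["clicks"] <= 0:
            continue
        yield {"country": rec["country"].strip().upper(), "clicks": rec["clicks"]}

def load(records, conn):
    """Load step: write cleaned records into a warehouse-style table."""
    conn.execute("CREATE TABLE IF NOT EXISTS ad_clicks (country TEXT, clicks INTEGER)")
    conn.executemany(
        "INSERT INTO ad_clicks (country, clicks) VALUES (:country, :clicks)",
        records,
    )
    conn.commit()

def run_etl(source_rows, conn):
    """Chain the three stages; generators keep memory use flat for large feeds."""
    load(transform(extract(source_rows)), conn)

# Usage with a hypothetical advertisements feed
raw = [
    {"country": " us ", "clicks": 10},
    {"country": "fr", "clicks": 0},  # filtered out by the transform step
    {"country": "de", "clicks": 5},
]
conn = sqlite3.connect(":memory:")
run_etl(raw, conn)
total = conn.execute("SELECT SUM(clicks) FROM ad_clicks").fetchone()[0]
```

Because each stage is a generator, rows stream through one at a time, which is the same pattern a production ETL framework would use to keep memory bounded over very large data volumes.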
Requirements:
Master's degree (or foreign degree equivalent) in Computer Science, Computer Engineering, Information Systems, Mathematics, or a related field
Two years of work experience in the job offered or in an operations automation or computer-related occupation
Two years of experience in the data warehouse space, custom ETL design, implementation, and maintenance
Two years of experience in SQL and development experience in at least one language: Java, C++, Perl, PHP, or Python
Two years of experience in data architecture, data modeling, schema design, and software development
Two years of experience in leading data-driven projects from definition through interpretation and execution
Two years of experience working with large data sets, Hadoop, and data visualization tools
Two years of experience in initiating and driving projects, and communicating data warehouse plans to internal clients/stakeholders