Meta Platforms, Inc. (Meta), formerly known as Facebook Inc., builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps and services like Messenger, Instagram, and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. To apply, click 'Apply to Job' on this web page.
Job Responsibilities:
Design, build, and launch data pipelines to move data across systems and build the next generation of data tools that generate business insights for a product
Analyze user needs and software requirements to determine workability and to offer support for end users on data usage
Design, architect, and develop software and data solutions that help product and business teams make data-driven decisions
Rethink and influence strategy and roadmap for building efficient data solutions and scalable data warehouse plans
Design, develop, test, and launch new data models and processes into production, and provide support
Leverage homegrown extract, transform, and load (ETL) framework as well as off-the-shelf ETL tools, as appropriate
Interface closely with data infrastructure, product, and engineering teams to build and extend a cross-platform ETL and report-generation framework
Identify data infrastructure issues and drive them to resolution
Requirements:
Master’s degree (or foreign equivalent) in Data Science, Data Analytics, Computer Engineering, Computer Science, or a related field
Completion of one graduate-level course, one research project, or one internship involving: Data ETL (Extract, Transform, Load) design, implementation, and maintenance on a large scale
Data visualization via Tableau, R, or Python
Programming in Hack, C/C++, Python, Perl, Java, or PHP
Internet technologies: HTTP, HTML, CSS, or JavaScript
Writing and optimizing SQL statements
Analyzing large volumes of data to provide data-driven insights, gaps, and inconsistencies
Data governance standards and data privacy compliance
Data processing automation
Data warehousing architecture and plans
Informatica, Talend, Pentaho, dimensional data modeling, or schema design
MapReduce or MPP systems
Machine Learning and Artificial Intelligence fundamentals
Statistical methods: descriptive statistics, hypothesis testing, and regression analysis
Distributed processing technologies and frameworks, such as Hadoop, and distributed storage systems (e.g., HDFS, S3)
Spark programming: code writing, debugging, and optimization