Uber's mission is to reimagine the way the world moves for the better. Here, bold ideas create real-world impact, challenges drive growth, and speed fuels progress. What moves us, moves the world; let's move it forward, together.
Job Responsibilities:
Own the Technical Vision: You will own and drive the technical roadmap for the Payments data ecosystem, balancing long-term architectural scalability with short-term, business-critical deliveries
Navigate Ambiguity: Actively identify strategically important problems and inefficiencies without waiting for instruction
Drive Alignment: See the big picture and drive consensus on complex technical decisions across the organization
Architect at Scale: Design and implement resilient, cost-effective, and high-scale batch and streaming pipelines that power critical support operations and financial analytics
Elevate Data Standards: Define and enforce robust data modeling standards, data contracts, and governance frameworks
Optimize & Automate: Identify opportunities to automate manual workflows (such as SLA tracking and issue detection) and optimize infrastructure efficiency to lower total cost of ownership (TCO)
Raise the Bar: Champion sustainable engineering practices
Be a Trusted Mentor: Serve as a humble mentor and technical advisor to both junior engineers and peer leaders
Be a Force Multiplier: Act as a role model for judgment and responsibility
Requirements:
Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field
Experience Level: 10+ years of hands-on experience in Data Engineering, with a proven track record of delivering results at a Staff Engineer level (or equivalent scope) at a premier technology company
Expert SQL Competency: 10+ years of hands-on, expert-level SQL experience
Data Modeling & Warehousing: Extensive experience designing dimensional data models (Star/Snowflake schemas) and data warehouses
Software Engineering Fundamentals: Proficiency in at least one high-level programming language (Java, Scala, Python, or Go)
Big Data Ecosystem: 10+ years of experience working with distributed data systems (Hadoop, Hive, Spark) and MPP databases (Vertica, Redshift, etc.)
End-to-End Architecture: Experience designing full-lifecycle data systems, including logging, ingestion (Batch/Stream), quality frameworks, and monitoring
Technical Leadership: Excellent written and verbal communication skills
Mentorship & Growth: A strong passion for driving engineering excellence and mentoring engineers