Senior Data Engineer P3 Job Description

About JLL Technologies (JLLT):
JLL Technologies is a specialized group within JLL that delivers unparalleled digital advisory, implementation, and services solutions to organizations globally. We provide best-in-class technologies to bring digital ambitions to life by aligning technology, people, and processes. Our goal is to leverage technology to increase the value and liquidity of the world's buildings while enhancing the productivity and happiness of those who occupy them.

What the Job Involves:
We are seeking a self-starting Senior Data Engineer to work in a diverse, fast-paced environment as part of our Capital Markets Data Engineering team. This individual contributor role is responsible for designing and developing data solutions that are strategic to the business and built on the latest technologies and patterns. It is a global role that requires partnering with the broader JLLT team at the country, regional, and global levels, drawing on in-depth knowledge of data, infrastructure, technologies, and hands-on data engineering experience.
Job Responsibilities:
Design and implement robust, scalable data pipelines using Databricks, Apache Spark, and Delta Lake, as well as BigQuery (see the illustrative PySpark sketch after this list)
Design and implement efficient data pipeline frameworks, ensuring the smooth flow of data from various sources to data lakes, data warehouses, and analytical platforms
Troubleshoot and resolve issues related to data processing, data quality, and data pipeline performance
Document data infrastructure, data pipelines, and ETL processes, ensuring knowledge transfer and smooth handovers
Create automated tests and integrate them into testing frameworks (a test sketch follows this list)
Configure and optimize Databricks workspaces, clusters, and job scheduling
Work in a multi-cloud environment spanning Azure, GCP, and AWS
Implement security best practices including access controls, encryption, and audit logging
Build integrations with market data vendors, trading systems, and risk management platforms
Establish monitoring and performance tuning for data pipeline health and efficiency
Collaborate with cross-functional teams, including data scientists, analysts, and other stakeholders, to understand data requirements, identify potential data sources, define data ingestion strategies, and deliver data solutions that meet their needs
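
To ground the pipeline responsibilities above, here is a minimal, illustrative PySpark sketch of the kind of Databricks/Delta Lake ingestion job this role would own. It is not JLL's actual code: the source path, target table name (capital_markets.trades_bronze), and columns (trade_id, trade_date) are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal PySpark job that ingests a raw CSV feed
# and lands it in a Delta Lake table. All names/paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("capital-markets-ingest").getOrCreate()

# Read a raw source feed (placeholder path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/raw/market_data/")
)

# Light standardization: deduplicate on a hypothetical business key and
# stamp each row with its ingestion time.
cleaned = (
    raw.dropDuplicates(["trade_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Append to a Delta table, partitioned for downstream query performance.
(
    cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("trade_date")
    .saveAsTable("capital_markets.trades_bronze")
)
```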
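Likewise, "create automated tests" in this context usually means data-quality checks wired into a test framework. A hedged pytest-style sketch, again against the hypothetical table above:

```python
# Illustrative sketch of automated data-quality tests that could run in CI
# (e.g., via pytest). The table name and rules are hypothetical.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # Local session for test runs; on Databricks the cluster session is used.
    return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()

def test_no_null_trade_ids(spark):
    df = spark.table("capital_markets.trades_bronze")  # hypothetical table
    assert df.filter(df.trade_id.isNull()).count() == 0

def test_table_is_not_empty(spark):
    df = spark.table("capital_markets.trades_bronze")
    assert df.count() > 0
```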
Requirements:
Bachelor's degree in Computer Science, Data Engineering, or a related field (Master's degree preferred)
5+ years of experience in data engineering or full-stack development, with a focus on cloud-based environments
Advanced expertise in big data technologies (Python, SQL, Spark/PySpark) with a proven track record on large-scale data projects
Strong Databricks experience
Advanced database/backend testing skills, including the ability to write complex SQL queries for data validation and integrity (see the SQL example after this list)
Strong experience in designing and implementing data pipelines, ETL processes, and workflow automation
Experience with data warehousing concepts, dimensional modeling, data governance best practices, and cloud-based data warehousing platforms (e.g., Google BigQuery, Snowflake)
Experience with cloud platforms such as Microsoft Azure or Google Cloud Platform (GCP)
Experience working in a DevOps model
Experience with Unit, Functional, Integration, User Acceptance, System, and Security testing of data pipelines
Proficiency in object-oriented programming and software design patterns
Familiarity with cutting-edge AI technologies and demonstrated ability to rapidly learn and adapt to emerging concepts and frameworks
Strong problem-solving skills and ability to analyze complex data processing issues
Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams
Attention to detail and commitment to delivering high-quality, reliable data solutions
Ability to adapt to evolving technologies and work effectively in a fast-paced, dynamic environment
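
As a hedged illustration of the SQL-validation requirement above, a source-to-target reconciliation check might be written as SQL and executed through PySpark, as below. Both table names are hypothetical placeholders, not actual JLL systems.

```python
# Illustrative sketch: a reconciliation query executed via PySpark that
# flags source rows missing from the target. Table names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

mismatches = spark.sql("""
    SELECT s.trade_id
    FROM capital_markets.trades_source AS s
    LEFT JOIN capital_markets.trades_bronze AS t
      ON s.trade_id = t.trade_id
    WHERE t.trade_id IS NULL  -- rows that never reached the target table
""")

assert mismatches.count() == 0, "Source rows missing from target table"
```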