Building and maintenance of data architectures and pipelines that enable the transfer and processing of durable, complete, and consistent data
Design and implementation of data warehouses and data lakes that handle the appropriate data volumes and velocity and adhere to the required security measures
Development of processing and analysis algorithms fit for the intended data complexity and volumes
Collaboration with data scientists to build and deploy machine learning models
Requirements:
Extensive hands-on experience designing and implementing data entitlements systems at enterprise scale, including policy creation, access controls, and entitlement workflows
Deep technical expertise with entitlements engines and access control frameworks (RBAC/ABAC), including implementation and troubleshooting across complex environments
Proven experience implementing entitlements across modern data platforms, including Snowflake, Databricks, and KDB, with an understanding of their native security models
Strong AWS cloud platform experience, particularly with IAM, Lake Formation, S3 bucket policies, and cloud-native security services
Nice to have:
Experience with Immuta or similar data access governance platforms for policy-based access control and automated compliance
Knowledge of market data workflows and vendor licensing models (Bloomberg, Refinitiv, ICE, etc.) and their entitlement requirements
Strong ability and willingness to collaborate with cross-functional teams across technology, business, compliance, and vendor management
Active interest in AI and machine learning applications for metadata management, policy recommendation, and entitlement optimization