We are currently seeking a Data Architect (Databricks) to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties: We are looking for a senior, client-facing Data Architect to support the delivery of complex data products for a large enterprise client. The role requires a "heavy-hitter" architect who can operate confidently in an Agile delivery environment, collaborate closely with client stakeholders, and translate business requirements into scalable, robust data architectures. The engagement is expected to start around late January / early February and to run for approximately 6 months, with iterative delivery cycles and phased releases.
Job Responsibilities:
Act as the lead Data Architect for Databricks-based data platforms and products
Design end-to-end data architectures covering ingestion, processing, storage, and consumption
Define Lakehouse architecture using Databricks (Delta Lake, Unity Catalog, etc.)
Work closely with client stakeholders, product owners, delivery leads, data engineers, and analytics teams
Translate business and analytical requirements into scalable technical designs
Ensure architecture aligns with performance, security, governance, and cost-optimization best practices
Support Agile delivery, including sprint planning, design workshops, incremental releases
Provide architectural guidance during implementation and unblock delivery issues
Review and approve technical designs, data models, and implementation patterns
Requirements:
10–15+ years in data & analytics roles
7–10+ years in data architecture / solution design
3–5+ years of hands-on Databricks experience (architecture and delivery)
3+ years working directly with demanding enterprise clients (client-facing / consulting experience)
Strong expertise in data architecture and data platform design
Deep understanding of data warehousing, Lakehouse patterns, and batch and streaming data processing
Hands-on experience with Databricks Lakehouse architecture, Delta Lake, Databricks Workflows, Unity Catalog (governance & security)
Experience with Azure (preferred) or equivalent cloud platforms
Strong understanding of data ingestion frameworks, ETL / ELT patterns, data modeling (analytical & dimensional)
Ability to guide data engineers on best practices
Proven experience working in Agile / Scrum delivery models
Comfortable delivering in fast-paced, iterative programs
Experience working with offshore teams while engaging closely with onshore clients
Strong verbal and written communication skills
Ability to lead architecture discussions with senior client stakeholders, challenge requirements constructively, and explain complex technical concepts in simple business language
Confidence to work in a demanding client environment
Nice to have:
Consulting or systems-integrator background
Experience delivering energy, utilities, or large enterprise data platforms
Exposure to data governance, data quality frameworks, and regulatory environments