SDG Group is seeking a Solutions Architect to lead technical delivery and innovation for our most ambitious client engagements. In this role, you will be the technical backbone of our Databricks offering. You will design sophisticated, large-scale Big Data architectures and actively engage with the Databricks organization in close collaboration with our Alliances Lead to showcase SDG’s premier technical capabilities. You are not just building pipelines; you are architecting the future of data intelligence for the world’s leading brands.
Job Responsibilities:
Design and define end-to-end, high-performance architectures for processing massive, complex volumes of data in distributed environments
Establish standards for Data Lakehouse and Modern Data Warehouse patterns to ensure scalability and efficiency
Implement advanced storage strategies using Delta Lake, Apache Iceberg, and optimized formats like Parquet
Lead the technical design of solutions on the Databricks Data Intelligence Platform, utilizing Unity Catalog for unified governance
Demonstrate deep experience with Delta Live Tables (DLT) and maintain awareness of the upcoming Lakeflow ecosystem (Spark Declarative Pipelines)
Provide expert guidance on performance tuning for large Spark clusters and optimizing SQL workloads within Databricks
Act as the "Technical North Star" in client workshops, translating strategic business needs into viable, high-impact technical blueprints
Collaborate with the Alliances Lead to engage with Databricks’ technical teams, demonstrating SDG’s excellence through deep-dive architectural reviews and Proofs of Concept (PoCs)
Support project execution from a technical standpoint, ensuring the team adheres to DataOps best practices, CI/CD, and robust data quality frameworks
Requirements:
6+ years of professional experience in data engineering, with a focus on building solutions on distributed platforms and managing complex ETL/ELT pipelines
Proven experience in architecting solutions that manage the full data lifecycle—ingestion, cleaning, and governance—at enterprise scale
3+ years of hands-on experience designing and deploying solutions within the Databricks ecosystem
Deep experience with at least one major cloud provider (Azure, AWS, or GCP), particularly their data-related services
Proficiency in Python and Apache Spark, including PySpark and Spark SQL
Nice to have:
Hands-on knowledge of orchestrators such as Apache Airflow or native tools like Databricks Workflows
Advanced experience with frameworks such as dbt
Exceptional interpersonal and communication skills, with a proven ability to lead technical discussions with both developers and executive stakeholders
Willingness to collaborate with our global innovation labs to implement proprietary accelerators and artifacts that reduce time-to-market for our clients
A proactive growth mindset and a passion for staying at the forefront of emerging data technologies
What we offer:
Competitive Compensation: Full-time employment with a competitive salary in an exciting, growing high-tech industry
Career Growth: A professional career with opportunities for personal/professional growth and rapid advancement based on meritocracy
Feedback Loop: Regular meetings with your manager to share feedback, discuss new challenges, and review your career trajectory
Flexibility: A creative and innovative work environment with flexible working options (flexible hours and a good mix of on-site and remote work)
Health & Wellness: Robust Healthcare and Benefits package including Medical, Dental, Vision, Disability coverage, and various other benefit options
Retirement: 401(k) retirement plan with employer match
Time Off: 3+ weeks paid vacation and flexible holiday schedule