Solvd Inc. is a rapidly growing AI-native consulting and technology services firm delivering enterprise transformation across cloud, data, software engineering, and artificial intelligence. We work with industry-leading organizations to design, build, and operationalize technology solutions that drive measurable business outcomes. Following the acquisition of Tooploox, a premier AI and product development company, Solvd now offers true end-to-end delivery—from strategic advisory and solution design to custom AI development and enterprise-scale implementation. Our capability centers combine deep technical expertise, proven delivery methodologies, and sector-specific knowledge to address complex business challenges quickly and effectively.
Job Responsibilities:
Provide technical leadership and guide customers to successful implementations of big data projects, spanning architecture and design, data engineering, and model deployment
Lead clients through migrations and architect solutions from concept to production
Provide structured mentorship for other team members and technical support for pre-sales efforts
Requirements:
Experience leading design and end-to-end development of distributed Databricks-based solutions, including Lakehouse architectures, ETL pipelines, streaming and batch processing, and integration with clients' cloud and on-premises enterprise systems
Hands-on experience in building big data pipelines
Hands-on programming experience in Python
Expert knowledge of SQL and Spark (or PySpark), including data ingestion from a variety of sources such as APIs and relational databases
Hands-on experience using dbt within the Databricks ecosystem
Solid understanding of data governance, security, and compliance
Hands-on experience with platform optimization: cluster configuration, partitioning strategies, caching, autoscaling, and job orchestration
Ability to clearly articulate core Databricks concepts and explain their practical applications
Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent work experience
Databricks Certification (highly desirable)
Nice to have:
Experience with DevOps using Terraform
Experience with design and deployment of scalable machine learning pipelines on Databricks using MLflow and Delta Lake
Experience with Azure, AWS, or GCP Databricks deployments
Experience with Databricks deployments for government clients
Experience with development of interactive dashboards and reports using BI tools, such as Power BI
Programming experience with Scala
What we offer:
Shape real-world AI-driven projects across key industries, working with clients from startup innovation to enterprise transformation
Be part of a global team with equal opportunities for collaboration across continents and cultures
Thrive in an inclusive environment that prioritizes continuous learning, innovation, and ethical AI standards