
Databricks Specialist


Knowit Sweden


Location:
Sweden, Malmö


Contract Type:
Not provided


Salary:
Not provided

Job Description:

Do you want to work as a Data Engineer with Databricks (one of the most exciting data platforms on the market) and take your expertise to the next level? At Knowit Core in Malmö, we are looking for Data Engineers who are passionate about building scalable data solutions with Databricks at the center of it all.

Job Responsibility:

  • Help our clients design, build, and orchestrate modern data platforms in whichever cloud they prefer (Azure, AWS, or GCP)
  • Explore new features and capabilities like Databricks Genie
  • Dive into UC metrics and experiment with the possibilities of Databricks One
  • Build and orchestrate data pipelines with PySpark
  • Manage data governance through Unity Catalog
  • Work with AI and BI integrations to unlock business value
  • Help clients establish scalable and secure Lakehouse architectures
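The pipeline-building bullets above can be sketched as a minimal bronze-to-silver cleaning step. In Databricks this logic would be PySpark DataFrame transformations writing to Unity Catalog tables (three-level `catalog.schema.table` names); the sketch below uses plain Python so the shape is visible without a Spark cluster, and every record shape and field name is illustrative.

```python
# Minimal bronze-to-silver cleaning step, sketched in plain Python.
# In Databricks this would be PySpark DataFrame transformations writing
# to a Unity Catalog table; field names here are illustrative.

def to_silver(bronze_rows):
    """Drop records missing required fields, then keep the latest version per order_id."""
    required = {"order_id", "amount", "ts"}
    latest = {}
    for row in bronze_rows:
        if not required.issubset(row) or row["amount"] is None:
            continue  # skip malformed records (in practice: route to a quarantine table)
        key = row["order_id"]
        if key not in latest or row["ts"] > latest[key]["ts"]:
            latest[key] = row
    return sorted(latest.values(), key=lambda r: r["order_id"])

bronze = [
    {"order_id": 1, "amount": 10.0, "ts": "2026-01-01"},
    {"order_id": 1, "amount": 12.0, "ts": "2026-01-02"},  # later version wins
    {"order_id": 2, "amount": None, "ts": "2026-01-01"},  # dropped: null amount
]
silver = to_silver(bronze)
```

The same dedupe-and-filter shape maps directly onto `dropDuplicates` and `filter` calls in a PySpark job.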

Requirements:

  • Proven experience with Databricks
  • Strong skills in PySpark
  • Experience with data orchestration and management in Unity Catalog
  • At least a few years of hands-on experience as a Data Engineer
  • Based in Sweden and available to work on-site at our Malmö office several days a week

Nice to have:

  • Previous experience in consulting
  • An interest in or experience with AI or ML

What we offer:
  • Cutting-edge projects across industries such as retail, finance, automotive, and manufacturing
  • The freedom to work with any major cloud provider
  • Continuous competence development through certifications, training, and conferences
  • A modern workplace in central Malmö, complete with a rooftop terrace

Additional Information:

Job Posted:
January 20, 2026

Work Type:
On-site work

Similar Jobs for Databricks Specialist

AI Product Specialist – Databricks Deployment

We are looking for an interim AI Product Specialist to support clients with the ...
Location:
Netherlands, Rotterdam
Salary:
Not provided
Riverflex
Expiration Date:
Until further notice
Requirements:
  • 3–7 years in data product delivery, analytics engineering, applied AI, or similar roles (consulting experience is a plus)
  • Proven experience with Databricks in real delivery contexts (platform capabilities, deployment patterns, production considerations)
  • Strong understanding of data lifecycle and architecture (pipelines, lakehouse concepts, quality, governance)
  • Strong stakeholder management: able to align technical and business teams and keep delivery outcome-focused
  • Comfortable working in fast-moving environments, creating clarity and structure where it’s missing
  • Excellent communication skills—able to explain trade-offs, risks, and value in plain language
Job Responsibility:
  • Deploy and operationalize AI products on Databricks, working closely with data engineers, data scientists, and platform teams
  • Translate business goals into deployable product requirements, including data needs, success metrics, and rollout approach
  • Design and optimize pipelines and environments (lakehouse patterns, compute, orchestration) to support analytics and ML workflows
  • Drive the path from PoC to production, including implementation planning, technical decision-making, and delivery governance
  • Implement data governance and quality practices (data lineage, access, quality checks, compliance considerations) to meet enterprise standards
  • Act as a Databricks SME: advise on architecture choices, integration patterns, performance considerations, and operational readiness
  • Support value realization: define KPI/ROI frameworks with stakeholders and help communicate impact and adoption progress
Contract Type: Fulltime
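The "data governance and quality practices" bullet above can be illustrated with a minimal batch check: compute per-field null rates and flag fields above a threshold. The field names and the 5% threshold are illustrative assumptions, not from the posting.

```python
# Minimal data-quality check: per-field null rates over a batch.
# Field names and the 5% threshold are illustrative.

def null_rates(rows, fields):
    total = len(rows)
    return {f: sum(1 for r in rows if r.get(f) is None) / total for f in fields}

def failing_fields(rows, fields, max_null_rate=0.05):
    """Return fields whose null rate exceeds the allowed maximum."""
    rates = null_rates(rows, fields)
    return sorted(f for f, rate in rates.items() if rate > max_null_rate)

batch = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": None, "email": "c@example.com"},
    {"customer_id": 4, "email": "d@example.com"},
]
flagged = failing_fields(batch, ["customer_id", "email"])
```

In a real deployment these checks would typically run inside a validation framework and feed the logging, metrics, and alerting the role describes.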

Data Engineer Specialist

We are seeking an experienced Data Engineering Specialist with strong hands-on e...
Location:
India, Pune
Salary:
Not provided
Vodafone
Expiration Date:
Until further notice
Requirements:
  • Strong expertise in Databricks on GCP including Delta Lake, notebooks/jobs, Unity Catalog, and cluster policies
  • Experienced in Cloud Data Fusion design, including pipeline management, error handling, and orchestration
  • Skilled in Dataproc Spark with experience building PySpark jobs, configuring ephemeral clusters, and handling initialisation actions
  • Proficient in Python for data engineering including packaging, unit testing, type hints, and linting
  • Strong SQL skills, specifically with BigQuery including performance tuning, partitioning, and clustering
  • Familiar with GCP services such as Cloud Storage, Pub/Sub, and Cloud Composer/Airflow
  • Holds a qualification such as B.E., B.Tech, BCA, MCA, BSc, or MSc in Computer Science or a related field
Job Responsibility:
  • Design and build data pipelines on GCP using Databricks (Delta Lake and Unity Catalog) for orchestration, Dataproc for Spark execution, supporting both ETL/ELT and feature engineering workloads
  • Engineer declarative, modular, and reusable pipelines in Python, following configuration-as-code principles and CI/CD practices including Git-based promotion, testing, and deployment
  • Implement and maintain data quality and observability practices using validation frameworks, logging, metrics, and alerts
  • Optimise pipeline performance, reliability, and cost through techniques such as cluster sizing, auto-termination, Z-ordering, caching, and partitioning strategies
  • Apply robust error handling, parameterisation, and triggers within Cloud Data Fusion pipelines
  • Ensure operational excellence by maintaining monitoring, performance tuning, and continuous improvements across data products and workloads
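The cost-optimisation bullet above mentions handling small files alongside Z-ordering and partitioning. A common heuristic is to compact a partition when its mean file size falls well below the target size (Delta Lake's `OPTIMIZE` targets files of roughly 1 GB). The sizes, threshold ratio, and partition names below are illustrative.

```python
# Heuristic for picking partitions that need small-file compaction.
# Delta Lake's OPTIMIZE targets roughly 1 GB files; ratio is illustrative.

TARGET_FILE_BYTES = 1024 ** 3  # ~1 GiB

def partitions_to_compact(partition_files, min_ratio=0.25):
    """Return partitions whose mean file size is below min_ratio * target."""
    out = []
    for partition, sizes in partition_files.items():
        if sizes and sum(sizes) / len(sizes) < min_ratio * TARGET_FILE_BYTES:
            out.append(partition)
    return sorted(out)

files = {
    "date=2026-01-01": [5 * 1024 ** 2] * 200,               # many 5 MiB files
    "date=2026-01-02": [900 * 1024 ** 2, 800 * 1024 ** 2],  # near target size
}
to_compact = partitions_to_compact(files)
```

In practice the flagged partitions would then be passed to `OPTIMIZE ... WHERE` (optionally with `ZORDER BY`) rather than compacted by hand.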
What we offer:
  • The opportunity to build and scale data solutions using leading GCP and Databricks technologies
  • Exposure to enterprise-level CI/CD, observability, and configuration-as-code practices
  • A collaborative environment where innovation, continuous learning, and technical excellence are encouraged
  • The chance to contribute to high-impact global data platforms


Revenue Enablement Onboarding Program Manager

We're hiring an Onboarding Program Lead to own the strategy, delivery, and conti...
Location:
United States, Denver
Salary:
125000.00 - 150000.00 USD / Year
Hightouch
Expiration Date:
Until further notice
Requirements:
  • 4-6+ years in sales enablement, sales training, or frontline sales management with proven ability to develop talent
  • Direct experience designing and facilitating onboarding programs for B2B SaaS sales teams
  • Track record of coaching individual contributors with measurable improvements in performance and ramp velocity
  • Experience working with sales methodologies (MEDDPICC, Command of the Message, Challenger, SPIN, etc.) and translating them into practical application
  • Strong facilitation and presentation skills with ability to engage diverse audiences and make complex topics accessible
  • Coaching mindset with ability to diagnose capability gaps and deliver targeted, actionable feedback
  • Data-driven approach to program optimization with comfort analyzing performance metrics and deriving insights
  • Proven ability to build trust and credibility with sales leaders and individual contributors alike
  • Bias toward action and iteration; comfortable shipping programs quickly and refining based on feedback
Job Responsibility:
  • Own end-to-end design and facilitation of sales onboarding bootcamps, including our multi-day in-person cohort weeks
  • Deliver core training sessions on product, sales methodology, competitive positioning, and industry/persona-specific messaging
  • Build and iterate onboarding curriculum that balances foundational knowledge with practical, hands-on application
  • Create certification standards and competency assessments that validate readiness for various sales motions
  • Conduct 1:1 coaching sessions with new hires to accelerate ramp and address individual development needs
  • Shadow new sellers on calls and provide actionable feedback on discovery, demos, and deal progression
  • Partner with frontline managers to ensure consistent coaching standards and seamless handoff post-onboarding
  • Identify common struggling points and build targeted interventions to address capability gaps
  • Run regular onboarding check-ins with new hires
  • Define and track key onboarding metrics including time-to-first-meeting, time-to-close, ramp attainment, and certification completion
What we offer:
  • Meaningful equity compensation in the form of ISO options
  • Early exercise and a 10-year post-termination exercise window
Contract Type: Fulltime

Data Migration Consultant

As a Data Migration Engineer at Sopra Steria, you will play a key role in helpin...
Location:
Belgium, Brussels/Flanders
Salary:
Not provided
Sopra Steria
Expiration Date:
Until further notice
Requirements:
  • At least 3 years of experience as a Data Engineer, Database Administrator, or Data Migration Specialist
  • Hands-on experience with data migration, ETL workflows, and cloud-based data platforms (e.g., Azure Databricks, Data Factory, Synapse, Data Lake)
  • Strong proficiency in SQL and relational databases
  • Experience with data modeling concepts (Kimball, Snowflake, SCD, Data Vault)
  • Programming experience in Python, PySpark, or similar languages for data transformation and automation
  • Analytical thinking, attention to detail, and a structured, solution-oriented approach
  • Fluency in English
Job Responsibility:
  • Design, implement, and validate data migration processes that ensure data from source systems is accurately converted, cleansed, and loaded into target systems
  • Translate business requirements into technical data migration solutions, define transformation rules, and ensure data quality, completeness, and compliance
  • Work in collaborative, agile teams to help optimize migration workflows, streamline data pipelines, and implement automation and governance standards
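The validation step described above can be sketched as a source-to-target reconciliation: after loading, compare row counts and per-row fingerprints so silent drops or corruption are caught. The hash choice and record fields are illustrative.

```python
# Minimal migration reconciliation: row counts plus per-row fingerprints.
# SHA-256 and the record fields are illustrative choices.

import hashlib

def fingerprint(row):
    """Stable hash of a record, independent of key order."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode()).hexdigest()

def reconcile(source_rows, target_rows):
    src = {fingerprint(r) for r in source_rows}
    tgt = {fingerprint(r) for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": len(src - tgt),
        "unexpected_in_target": len(tgt - src),
    }

source = [{"id": 1, "name": "Anna"}, {"id": 2, "name": "Björn"}]
target = [{"id": 1, "name": "Anna"}, {"id": 2, "name": "Björn"}]
report = reconcile(source, target)
```

A fingerprint-set comparison like this scales to distributed checks too, since the same hashing can be pushed down into the migration pipeline on either side.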
What we offer:
  • A competitive salary and an indefinite contract
  • A company car or mobility budget
  • Comprehensive insurance coverage (hospitalisation, outpatient care and group insurance)
  • 32 days paid time off (20 days + 12 bonus days)
  • Flexible, location-independent work
  • Laptop, phone and mobile subscription
  • Continuous learning opportunities through the Sopra Steria Academy to support your career development
Contract Type: Fulltime

Analyst III - Data

Our team members are at the heart of everything we do. At Cencora, we are united...
Location:
India, Pune
Salary:
Not provided
Cencora
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience leading design, development, and support of complex Data Analytics & Data Visualization projects
  • Hands-on expertise across SAP BW/HANA, Azure Synapse, Azure Databricks, Power BI, and Alteryx
  • Strong BI knowledge across domains: Finance, Distribution, Master Data Management
  • Proficient in Financial Reporting, Profitability Analysis, SQL, and SAP FICO (COPA Reporting)
  • Skilled in building data pipelines and integrations with PaPM, Vendavo, and Hyperion EPM
  • Bachelor’s Degree in Statistics, Computer Science, Information Technology, or a related discipline (or equivalent relevant experience)
  • Behavioral Skills: Critical Thinking, Detail Oriented, Impact and Influencing, Interpersonal Communication, Multitasking, Problem Solving, Time Management
  • Technical Skills: Advanced Data Visualization Techniques, Business Intelligence (BI) Reporting Tools, Data Analysis and Synthesis, Data Management, Programming languages like SQL
  • Tools Knowledge: Business Intelligence Software like Tableau, Power BI, Alteryx, QlikSense, Microsoft Office Suite, Relational Database Management Systems (RDBMS) Software, Data Visualization Tools
Job Responsibility:
  • Responsible for design, development, and support of complex and high-visibility data analytics and visualization projects
  • Acts as the lead specialist for end-to-end BI and reporting solutions
  • Works across multiple technologies: SAP BW/HANA, Azure Synapse, Azure Databricks, Power BI, Alteryx
  • Builds and manages data pipelines and integrations with systems like PaPM (Process Modeling), Vendavo (Price Modeling), and Hyperion EPM (Forecasting & Planning)
  • Supports Tableau-based projects including PoC creation, prototyping, analytics demonstrations, and internal reporting
  • Builds dashboards that highlight key value drivers and business insights
  • Develops reporting tools for operational teams to support daily business activities
  • Ensures consistency, accuracy, and integrity of data provided to stakeholders
  • Translates business requirements into technical solutions and data warehouse designs
  • Provides recommendations for improving reporting solutions using business and technical knowledge
Contract Type: Fulltime


Outside Sales Product Specialist - Data Management

At Dell Technologies, we create the extraordinary. Our Outside Sales Product Spe...
Location:
United Kingdom, London
Salary:
Not provided
Dell
Expiration Date:
Until further notice
Requirements:
  • 5+ years of enterprise sales experience in data management and analytics, ideally with vendors such as Databricks, Snowflake, Denodo, Informatica, Teradata, ScaleAI or similar
  • Strong understanding of modern data architectures (data warehouses, data lakes, lakehouse models, and vector databases for AI/RAG) at a business and value level
  • Familiarity with data engineering, analytics, BI, and data science workflows sufficient to sell AI-based outcomes and guide customer conversations
Job Responsibility:
  • Proactively identify and solve customer business needs by providing subject matter expertise and creating solutions using Dell’s products and services
  • Manage relationships with senior level technical personnel and decision makers
  • Demonstrate the value of a product and/or service technology to advance customer business objectives
  • Provide insight and thought leadership to customers concerning applicability of highly complex products and services
  • Act as a technical resource for the sales organization to help meet and exceed their objectives
  • Develop and execute account strategies aligned to customer data, analytics, and AI transformation initiatives that are focused on Dell’s AI Data Platform Data Engines
  • Engage senior customer stakeholders, including CIOs, CDOs, CTOs, and Heads of Data, Analytics, and AI
  • Lead value-based sales conversations that connect measurable, ROI-based business outcomes to the AI Data Platform
  • Position competitively and win against incumbent and next-generation data platform providers
  • Articulate how Dell products enable AI applications through retrieval-augmented generation (RAG) and unstructured data processing for use cases like intelligent search, document analysis, and conversational AI
What we offer:
  • Comprehensive Healthcare Programs
  • Award Winning Financial Wellness Tools and Resources
  • Generous Leave of Absence for New Parents and Caregivers
  • Industry Leading Wellness Platform
  • Employee Assistance Program