Databricks Developer (Digital Engineering)

Coherent Solutions

Location:

Contract Type:
Not provided

Salary:
Not provided

Job Description:

The project focuses on a production system managing clients and life insurance programs. The role involves both support and new feature development, with a strong emphasis on building and optimizing data pipelines and working within a modern Azure-based data platform.

Job Responsibility:

  • Support and new feature development
  • Building and optimizing data pipelines (a minimal sketch follows this list)
  • Working within a modern Azure-based data platform
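
The bullet on building and optimizing data pipelines is the core of the role, so here is a minimal sketch of what a batch pipeline step on an Azure Databricks platform can look like. It assumes a Databricks notebook where a predefined spark session and Delta Lake are already available; the storage path, table name, and columns are hypothetical placeholders, not details from the posting.

```python
# Minimal batch-pipeline sketch for an Azure Databricks notebook.
# Paths, table names, and columns are hypothetical placeholders.
from pyspark.sql import functions as F

# Read raw policy events from ADLS (hypothetical container and path).
raw = (
    spark.read
    .format("json")
    .load("abfss://raw@examplelake.dfs.core.windows.net/insurance/policy_events/")
)

# Basic cleanup: deduplicate, normalize types, keep valid rows.
clean = (
    raw.dropDuplicates(["policy_id", "event_ts"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("policy_id").isNotNull())
)

# Persist as a Delta table registered in the catalog (hypothetical name).
(
    clean.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("insurance.silver.policy_events")
)
```

In practice, the load strategy (full overwrite vs. incremental merge), partitioning, and naming would follow the conventions of the existing platform.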

Requirements:

  • 3+ years of experience as a Data Engineer
  • Strong experience with the Databricks ecosystem (architecture, Unity Catalog, Delta tables, lineage, permissions); a permissions sketch follows this list
  • Solid knowledge of SQL
  • Strong experience with Python
  • Experience working in Azure cloud environment
  • Good understanding of data pipelines and data platform architecture
  • Strong analytical and problem-solving skills
  • Excellent communication skills
  • English level: B2 (Upper-Intermediate) or higher
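
Because the requirements call out Unity Catalog, Delta tables, lineage, and permissions specifically, the snippet below sketches how table-level access is typically granted and inspected in Databricks SQL. Catalog, schema, table, and group names are hypothetical and only illustrate the three-level Unity Catalog naming.

```python
# Unity Catalog permissions sketch (Databricks notebook; names are hypothetical).
# Three-level naming: <catalog>.<schema>.<table>.
spark.sql("CREATE CATALOG IF NOT EXISTS insurance")
spark.sql("CREATE SCHEMA IF NOT EXISTS insurance.silver")

# Grant read access on a Delta table to an account group.
spark.sql("GRANT SELECT ON TABLE insurance.silver.policy_events TO `data_analysts`")

# Inspect the grants currently defined on the table.
spark.sql("SHOW GRANTS ON TABLE insurance.silver.policy_events").show(truncate=False)
```
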
What we offer:
  • Technical and non-technical training for professional and personal growth
  • Internal conferences and meetups to learn from industry experts
  • Support and mentorship from an experienced employee
  • Health insurance
  • English courses
  • Sports activities
  • Flexible work options
  • Referral program
  • Work anniversary program and additional vacation days

Additional Information:

Job Posted:
April 24, 2026

Work Type:
Remote work

Looking for more opportunities? Search for other job offers that match your skills and interests.

Similar Jobs for Databricks Developer (Digital Engineering)

Data Engineer - Platform

The Amgen India Digital Technology & Innovation (DTI) Sr. Associate IS Data Engi...
Location: India, Hyderabad
Salary: Not provided
Company: Amgen
Expiration Date: Until further notice
Requirements:
  • Master’s degree and 2 years of Information Systems experience Or Bachelor’s degree and 4 years of Information Systems experience
  • 3+ years of experience in developing, implementing and supporting Data and Analytics Data Lake systems, analytics/reporting and metrics/visualization tools
  • Experience using Databricks platform developing integration pipelines, SFDC APIs, REST based web services, SQL, XML, and JavaScript
  • Extensive hands-on technical and engineering experience building React-based UI solutions
  • Experience working with BI tools such as Tableau and Power BI
  • Experience developing in an enterprise environment with numerous applications and development teams, within an Agile framework
  • Understanding of AWS services, security frameworks, and AI/LLM base models
  • Ability to handle multiple projects simultaneously and prioritize tasks effectively
  • Excellent problem-solving skills and a passion for tackling complex challenges
  • Collaborative spirit and effective communication skills to work seamlessly in a cross-functional team
Job Responsibility:
  • Function as a Digital Technology Data and Analytics Data Engineer within a Scaled Agile Framework (SAFe) product team
  • As a member of the Global Medical Data and Analytics Product team, ensure progress against business needs in a cost-effective, compliant, and reliable way
  • Use data, AI and metrics to understand trends in user behavior and product usage to influence strategy
  • Integrate data and develop integration queries and ETL pipelines in a Databricks data lake environment (a generic ingestion sketch follows this list)
  • Lead data analysis and insights user stories and use the existing CI/CD delivery pipeline and processes, streamlining them with a focus on automation
  • Communicate software and system designs using Miro, Lucid Chart, or other modeling tools and software
  • Develop test classes and automation to validate business logic and avoid production issues
  • Work closely with other developers in the product Scrum team, including business analysts, software engineers, validation analysts, and the Product Owner
  • Research AI/LLM-based solutions and drive proofs of concept to suit business requirements; stay abreast of the latest trends in DevOps methodologies and emerging industry solutions
  • Participate in and develop business-prioritized Data and Analytics metrics and user stories using the Databricks environment
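
For the integration-pipeline responsibility above, the sketch below shows one generic way a REST-to-Delta ingestion step could look in a Databricks notebook. The endpoint URL, schema, and table name are hypothetical placeholders, not details from the posting, and a predefined spark session is assumed.

```python
# Generic REST-to-Delta ingestion sketch (all names are hypothetical).
import requests
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Pull records from a hypothetical REST endpoint.
response = requests.get("https://api.example.com/v1/engagements", timeout=30)
response.raise_for_status()
records = response.json()  # assumed to be a list of flat JSON objects

schema = StructType([
    StructField("engagement_id", StringType(), False),
    StructField("hcp_id", StringType(), True),
    StructField("score", DoubleType(), True),
])

df = spark.createDataFrame(
    [(r.get("engagement_id"), r.get("hcp_id"), r.get("score")) for r in records],
    schema=schema,
)

# Append into a Delta table in the data lake (hypothetical name).
df.write.format("delta").mode("append").saveAsTable("medical.bronze.engagements")
```
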
What we offer:
  • Reasonable accommodation for individuals with disabilities

Data Engineer

FinXL is seeking a Data Engineer to join our consulting team and be deployed wit...
Location: Australia, Macquarie Park
Salary: 160,000.00 - 170,000.00 AUD / Year
Company: FinXL
Expiration Date: Until further notice
Requirements:
  • Deep expertise in Azure and Databricks (Spark and Delta Lake)
  • Advanced proficiency in Python and SQL for complex data transformation
  • Proven experience building Lakehouse architectures and managing data partitions
  • Strong background in data engineering for read-heavy API workloads
  • Familiarity with DevOps practices including CI/CD pipelines and Terraform
  • Ability to work autonomously in ambiguous, first-of-its-kind project settings
  • Strong stakeholder management skills to navigate complex corporate environments
Job Responsibility:
  • Design and build a secure data partition and sandpit using Databricks and Azure
  • Engineer data pipelines from internal cloud warehouses and raw network telemetry
  • Develop a "Digital Twin" replica to facilitate a Data-as-a-Service model
  • Implement rigorous security measures, including PII masking and row-level security (a minimal sketch follows this list)
  • Structure data for secure API exposure to enable client-facing AI tools
  • Collaborate with internal teams to navigate data access and governance processes

Work Type: Fulltime
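
As a rough illustration of the PII masking and row-level security item above, here is a minimal PySpark sketch that hashes a direct identifier and filters rows to an allowed set of regions before publishing into the secure partition. Column names, the region list, and table names are hypothetical; a production build would more likely use the platform's native governance features (such as dynamic views or column masks) than ad-hoc transformations.

```python
# PII-masking and row-filtering sketch (hypothetical columns and tables).
from pyspark.sql import functions as F

source = spark.read.table("telemetry.silver.customer_usage")  # hypothetical source

ALLOWED_REGIONS = ["NSW", "VIC"]  # hypothetical row-level restriction

masked = (
    source
    # Row-level restriction: expose only rows the consumer is entitled to see.
    .filter(F.col("region").isin(ALLOWED_REGIONS))
    # Column masking: replace the direct identifier with a one-way hash.
    .withColumn("customer_id", F.sha2(F.col("customer_id").cast("string"), 256))
    .drop("email", "phone_number")
)

# Publish the de-identified dataset into the secure partition (hypothetical table).
masked.write.format("delta").mode("overwrite").saveAsTable("sandpit.gold.customer_usage_masked")
```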

Data Designer/Engineer

Join us as a Data Designer/Engineer in Barclays, responsible for supporting the ...
Location: India, Pune
Salary: Not provided
Company: Barclays
Expiration Date: Until further notice
Requirements:
  • Strong experience in AWS data engineering (S3, Glue, EMR, Athena, IAM)
  • Advanced Python development skills for data engineering
  • Strong hands-on experience with Apache Spark / PySpark
  • Proven experience with Databricks (development, optimisation, notebooks, jobs)
  • Hands-on experience with Apache Airflow and Astronomer (a minimal DAG sketch follows this list)
  • Strong SQL skills and experience working with large datasets
  • Experience with CI/CD, Git-based version control, and DevOps practices for data platforms
  • Solid understanding of data modelling, lakehouse architecture, and analytics-ready data design
  • Experience implementing data quality, validation, and reconciliation frameworks
  • Familiarity with metadata, lineage, and data governance concepts
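
For the Airflow item above, the snippet sketches a minimal daily DAG that wraps a data-load step in a Python callable. The DAG id, task id, and callable are hypothetical, and it assumes Airflow 2.4 or later, where the schedule argument is available.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+; ids and callables are hypothetical).
import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator


def run_daily_load(**context):
    # Placeholder for the real work, e.g. triggering a Databricks job
    # or running a PySpark transformation for the given logical date.
    print(f"Loading data for {context['ds']}")


with DAG(
    dag_id="daily_lakehouse_load",
    schedule="@daily",
    start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
) as dag:
    PythonOperator(
        task_id="run_daily_load",
        python_callable=run_daily_load,
    )
```
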
Job Responsibility:
  • Support the successful delivery of Location Strategy projects to plan, budget, agreed quality and governance standards
  • Spearhead the evolution of our digital landscape, driving innovation and excellence
  • Harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences
  • Build and maintain the systems that collect, store, process, and analyse data
  • Build and maintain data architecture pipelines
  • Design and implementation of data warehouses and data lakes
  • Development of processing and analysis algorithms
  • Collaboration with data scientists to build and deploy machine learning models
What we offer:
  • Competitive holiday allowance
  • Life assurance
  • Private medical care
  • Pension contribution

Work Type: Fulltime

Data Engineering Lead

Lead your team in creating the pipeline for Data management, data visualization,...
Location: Portugal, Porto
Salary: Not provided
Company: Metyis
Expiration Date: Until further notice
Requirements:
  • 5 – 8 years of experience in a similar role with experience in developing and deploying ETL solutions on Azure, GCP or AWS
  • Solid knowledge of data warehousing principles, concepts, and best practices (e.g. ODS, data marts, data lakes, data vault, 3NF)
  • Good understanding of common platforms and practices for digital development: cloud-based setups, advanced analytics and computing environments, GitHub (or similar), and CI/CD workflows
  • Experience in setting up automated testing frameworks including unit tests, integration tests
  • Understanding of modern cloud-based architecture (Lambda & Kappa architectures)
  • Advanced SQL, data transformation, and data profiling skills
  • Experience in building production ETL/ELT pipelines at scale
  • Data governance experience
  • 3 – 5 years of hands-on experience with Azure: Data Factory, Databricks, Synapse (DWH), Azure Functions, App logic, and other data analytics services, including streaming
Job Responsibility:
  • Lead, influence & implement the technical roadmap of our clients, in light of overall technology and architecture roadmaps
  • Be responsible for the Architecture, making sure that technical and non-technical components work together in the product to deliver the customer needs
  • Lead and Develop the digital development and operations using Python, Spark, RESTful API, and Microsoft Azure Cloud, ideally with some Data Insights experience
  • Steer the technical excellence of the application and consistency with relevant digital frameworks, best practices, and standards
  • Coach and enable other product team members to deliver best-in-class products
  • Deliver data platforms and data insights and, in collaboration with the data science team, data products such as recommendation systems, data lakes/hubs, and dashboards
  • Lead project activities and be responsible for evaluating business needs and objectives in collaboration with the project team and other stakeholders
What we offer:
  • Become part of a fast-growing international and diverse team

Work Type: Fulltime

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location: Netherlands, Leiden
Salary: Not provided
Company: IKEA
Expiration Date: Until further notice
Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience in driving new data engineering developments (e.g., applying new cutting-edge data engineering methods to improve data integration performance, using new tools to improve data quality, etc.)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience deploying code artifacts into production
Job Responsibility:
  • Contribute to the development of D&A platform and analytical tools, ensuring easy and standardized access and sharing of data
  • Act as subject matter expert for Azure Databricks, Azure Data Factory, and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how & standard available
  • Ensure pipelines and consistency with relevant digital frameworks, principles, guidelines and standards
  • Support understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility into data quality and data assets on the D&A platform
  • Identify opportunities for data assets and D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team
What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning

Work Type: Fulltime

D&A (Digitalization & Automation) Software Developer

The Digitalization & Automation (D&A) team within the Operational Excellence org...
Location: Republic of Korea, Hwasung
Salary: Not provided
Company: ASML
Expiration Date: Until further notice
Requirements:
  • Associate, bachelor's, or master's degree in Computer Science, Engineering, Industrial Engineering, or IT‑related fields
  • 2–3 years of experience in software programming
  • Fluent in English and Korean
  • Individuals interested in identifying root causes of problems and solving them through digital approaches
  • Individuals who question existing methods and seek better ways to work
  • Curiosity and fast learning ability regarding new technologies (AI, automation platforms, etc.)
  • Value operational excellence more than simple feature development
  • Ability to understand internal customer needs and communicate solutions effectively
  • Python skills (for Web Automation, MS Office handling, Data handling)
  • Knowledge of RDBMS such as PostgreSQL or Databricks, or file‑based databases like SQLite
Job Responsibility:
  • Develop automation software to improve internal customer productivity and efficiency
  • Lead end-to-end development from requirements analysis, design, development, testing, deployment, to maintenance
  • Identify repetitive and manual processes and convert them into automation solutions
  • Support programming and automation technology training for new or junior developers
  • Collaborate with team members to generate improvement ideas and conduct PoC activities
  • Create technical documents and user manuals that are easy for end users to understand
  • Participate in client development considering UI/UX elements, including web/app development (Flutter/React, etc.) when needed

Work Type: Fulltime

QA Automation Engineer (Digital Engineering)

The project focuses on ensuring quality and reliability of systems within the fi...
Location: Bulgaria; Georgia; Lithuania; Mexico; Moldova; Poland; Romania; Ukraine
Salary: Not provided
Company: Coherent Solutions
Expiration Date: Until further notice
Requirements:
  • 3+ years of experience in QA
  • Experience with Databricks ecosystem
  • Experience working in Azure cloud environment
  • Good understanding of data pipelines and data platform architecture
  • Experience with Python for test automation
  • Understanding of automation frameworks
  • Experience working with SQL
  • Strong knowledge of manual QA practices
  • Attention to detail and analytical mindset
  • Ability to work in a team environment
Job Responsibility:
  • Perform manual testing of application features and workflows
  • Develop and maintain automation tests using Python (a minimal sketch follows this list)
  • Work with existing automation frameworks and improve test coverage
  • Validate data and perform database checks using SQL
  • Collaborate with development and QA teams to ensure product quality
  • Identify, document, and track defects
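
To show how the Python test automation and SQL data checks above might fit together, here is a small pytest sketch. It runs the checks against an in-memory SQLite database as a self-contained stand-in; in the real project the same assertions would target the Databricks/Azure data platform, and the table and column names are hypothetical.

```python
# Data-validation sketch with pytest; SQLite stands in for the real warehouse.
# Table and column names are hypothetical.
import sqlite3

import pytest


@pytest.fixture()
def conn():
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE policies (policy_id TEXT NOT NULL, premium REAL)")
    con.executemany(
        "INSERT INTO policies VALUES (?, ?)",
        [("P-001", 120.0), ("P-002", 250.5), ("P-003", 99.9)],
    )
    yield con
    con.close()


def test_policies_table_not_empty(conn):
    (count,) = conn.execute("SELECT COUNT(*) FROM policies").fetchone()
    assert count > 0


def test_policy_id_has_no_nulls(conn):
    (nulls,) = conn.execute(
        "SELECT COUNT(*) FROM policies WHERE policy_id IS NULL"
    ).fetchone()
    assert nulls == 0


def test_premiums_are_positive(conn):
    (bad,) = conn.execute("SELECT COUNT(*) FROM policies WHERE premium <= 0").fetchone()
    assert bad == 0
```
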
What we offer:
  • Technical and non-technical training for professional and personal growth
  • Internal conferences and meetups to learn from industry experts
  • Support and mentorship from an experienced employee to help you grow professionally
  • Health insurance
  • English courses
  • Sports activities to promote a healthy lifestyle
  • Flexible work options, including remote and hybrid opportunities
  • Referral program for bringing in new talent
  • Work anniversary program and additional vacation days

Work Type: Fulltime