
Databricks DevOps


Realign


Location:
United States, Dallas


Contract Type:
Not provided


Salary:
135000.00 USD / Year

Requirements:

  • 7+ years of experience developing APIs and integrating with 3rd party APIs
  • Strong SQL skills (SQL Server or Oracle)
  • Familiarity with scripting languages (Python, Bash, PowerShell)
  • Experience with version control systems (Git, GitHub, GitLab).
  • Knowledge of CI/CD pipelines and DevOps best practices.
  • Azure AD, OAuth, JWT, Microsoft Teams SDK, building apps for Teams; hands-on Azure experience with AD B2C, Bot, SignalR, AKS, App Registrations, App Insights
  • Azure Databricks
  • CI/CD: Jenkins, Terraform, GitLab CI/CD
  • Terraform Module Development
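The OAuth/JWT requirement above involves working with bearer tokens issued by Azure AD. A minimal stdlib sketch of reading a JWT's claims (the token and claim values here are hypothetical, and no signature verification is done — a real service must validate the signature against the issuer's published keys before trusting any claim):

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT.

    A JWT is three base64url segments: header.payload.signature.
    Production code must verify the signature before trusting claims.
    """
    payload_b64 = token.split(".")[1]
    # base64url decoding requires padding to a multiple of 4
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def _encode(segment: dict) -> str:
    # Helper to build a hypothetical token for illustration
    raw = json.dumps(segment).encode()
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")

token = ".".join([
    _encode({"alg": "RS256", "typ": "JWT"}),
    _encode({"aud": "api://my-app",
             "iss": "https://login.microsoftonline.com/tenant/v2.0"}),
    "fake-signature",  # placeholder; a real token carries a signed digest
])

claims = decode_jwt_payload(token)
```

The same split-and-decode shape applies to tokens from any OIDC issuer; only the signature-verification step is issuer-specific.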

Nice to have:

  • Data engineering experience
  • Strong understanding of ETL/ELT processes, data modeling, data warehousing, and big data concepts.
  • Understanding of workflow automations is a plus.

Additional Information:

Job Posted:
May 04, 2026

Employment Type:
Full-time
Work Type:
On-site work



Similar Jobs for Databricks DevOps

Data (DevOps) Engineer

Ivy Partners is a Swiss consulting firm dedicated to helping businesses navigate...
Location:
Switzerland, Genève
Salary:
Not provided
IVY Partners
Expiration Date:
Until further notice
Requirements:
  • Substantial experience with Apache Airflow in complex orchestration and production settings
  • Advanced skills in AWS, Databricks, and Python for data pipelines, MLOps tooling, and automation
  • Proven experience deploying volumetric and sensitive pipelines
  • Confirmed (mid-level) to senior profile
  • Highly autonomous, capable of working in a critical and structuring environment
  • Not just a team player but someone who challenges the status quo, proposes solutions, and elevates the team
  • Communicate clearly and possess a strong sense of business urgency
Job Responsibilities:
  • Design and maintain high-performance data pipelines
  • Migrate large volumes of historical and operational data to AWS
  • Optimize data flows used by machine learning models for feature creation, time series, and trade signals
  • Ensure the quality, availability, and traceability of critical datasets
  • Collaborate directly with data scientists to integrate, monitor, and industrialize models: price prediction models, optimization algorithms, and automated trading systems
  • Support model execution and stability in production environments utilizing Airflow and Databricks
  • Build, optimize, and monitor Airflow DAGs
  • Automate Databricks jobs and integrate CI/CD pipelines (GitLab/Jenkins)
  • Monitor the performance of pipelines and models, and address incidents
  • Deploy robust, secure, and scalable AWS data architectures
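The Airflow DAG work listed above amounts to declaring task dependencies and letting the scheduler derive an execution order. A minimal stdlib sketch of that ordering (task names are hypothetical, standing in for the `upstream >> downstream` edges of a real Airflow DAG):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Each key runs only after its listed predecessors have completed,
# mirroring how Airflow resolves DAG dependencies before scheduling.
deps = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "train_features": {"transform"},
    "load": {"transform"},
    "publish_signals": {"train_features", "load"},
}

# static_order() yields tasks with every predecessor before its successors
order = list(TopologicalSorter(deps).static_order())
```

Airflow adds retries, schedules, and backfills on top, but the dependency-resolution core is exactly this topological sort.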
What we offer:
  • Supportive environment where everyone is valued, with training and career advancement opportunities
  • Building a relationship based on transparency, professionalism, and commitment
  • Encouraging innovation
  • Taking responsibility

Azure Data Architect

We are offering an exciting opportunity for an Azure Data Architect within the O...
Location:
United States, Houston
Salary:
Not provided
Robert Half
Expiration Date:
Until further notice
Requirements:
  • Comprehensive understanding of Azure products and Data platforms such as Databricks
  • Strong collaboration skills with Enterprise Architects, Cloud Engineering, Business Architects, and Product teams
  • Proficient in Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Data Lake
  • Experienced in scripting languages like Python, Bash, or PowerShell
  • Knowledge of cloud security principles and best practices
  • Familiarity with Azure Resource Manager, Virtual Networks, Azure Blob Storage, Azure Automation, Azure Active Directory, and Azure Site Recovery
Job Responsibilities:
  • Collaborate with multiple teams to design and implement solutions
  • Ensure solutions are optimized for performance, cost, and compliance
  • Operate hands-on with Azure products and Data platforms
  • Develop and deploy Cloud Native Applications using Azure PaaS Capabilities
  • Manage cloud deployment, technical and security architecture, database architecture, virtualization, software design, networking, DevOps, and DevSecOps
  • Employ Azure data services
  • Utilize scripting languages to automate routine tasks
  • Implement IAM, Authentication and Authorization of applications
  • Utilize knowledge of cloud security principles and best practices
  • Handle Azure Resource Manager, Virtual Networks, Azure Blob Storage, Azure Automation, Azure Active Directory, and Azure Site Recovery
What we offer:
  • Medical, vision, dental, life and disability insurance
  • Eligibility to enroll in company 401(k) plan
  • Full-time position

Data Engineer

We are looking for a motivated Data Engineer to join our team on a dynamic data ...
Location:
Portugal, Lisbon
Salary:
Not provided
Extia
Expiration Date:
Until further notice
Requirements:
  • Strong experience with Databricks, Apache Spark, and SQL
  • Familiarity with Agile methodologies and Scrum processes
  • Experience with CI/CD pipelines and Azure DevOps
  • Ability to understand business requirements and translate them into technical solutions
  • Strong communication and teamwork skills
Job Responsibilities:
  • Analyze and translate business requirements in collaboration with the functional team
  • Design, develop, test, and deploy scalable data pipelines using Databricks, Spark, and SQL
  • Collaborate with the development team to plan and align tasks using Agile methodologies (Scrum)
  • Participate in CI/CD processes using Azure DevOps to ensure efficient and reliable deployment of data solutions

Salesforce DevOps Lead

Salesforce DevOps Lead is responsible for continuous delivery, monitoring, optim...
Location:
Mexico, Guadalajara
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 8-10+ years of experience in DevOps ecosystem
  • Strong experience with Salesforce Administration and DevOps practices
  • Experience in Databricks
  • Proficiency in GitHub, branching strategies, and version control
  • Hands-on expertise with VS Code, Shell scripting, and DevOps pipelines
  • Deep understanding of Salesforce environments, sandbox management, and deployment processes
  • Experience in release planning and coordinating with multiple teams
Job Responsibilities:
  • Designs and maintains CI/CD pipelines, manages complex metadata deployments, and optimizes multi-org strategies for seamless integration
  • Implements monitoring solutions, optimizes Mean Time To Recovery (MTTR), and configures proactive alerting to prevent incidents
  • Manages Databricks implementations, monitors API health and performance, and establishes observability standards across integration layers
  • Oversees platforms, defines Git branching strategies, and establishes release governance frameworks to ensure compliance and quality
  • Develops sandbox refresh strategies, automates environment provisioning, and implements data masking protocols to protect sensitive information
  • Manage GitHub repositories, branching strategies, and code merges
  • Ensure proper governance and code quality standards across teams
  • Work with VS Code and Shell scripting for automation tasks
  • Design and maintain DevOps pipelines for CI/CD processes
  • Automate repetitive tasks to improve efficiency and reduce errors

Senior Software Developer - ETL

Do you have extensive experience in gathering requirements and business process ...
Location:
Canada, North York
Salary:
543.70 - 579.46 CAD / Hour
Randstad
Expiration Date:
May 11, 2026
Requirements:
  • Extensive experience in gathering requirements and business process knowledge in order to design correct and high-quality data transformation
  • Extensive experience in design, development, and implementation with Azure Data Factory, Databricks
  • Experience with CI/CD (DevOps) pipelines and concepts
  • Experience with Azure DevOps
  • Experience in programming and analysis; specialized software package support at the specified experience level
  • Ability to collaborate with IT Professionals throughout the Software Development Life Cycle
  • Experience in structured methodologies for the development, design, implementation and maintenance of applications
  • Experience designing, coding, testing, debugging, and documenting applications
  • Experience in the use of object and/or third generation language development tools
Job Responsibilities:
  • Design, implement, and continuously expand data pipelines by performing extraction, transformation, and loading activities.
  • Required to translate technical systems specifications into working, tested applications.
  • This includes developing detailed programming specifications, writing and/or generating code, compiling data-driven programs, maintaining, and conducting unit tests.
  • Resolves and troubleshoots technical problems which arise during the use and operation of software packages, including technical assistance in implementation, conversion and migrations.
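The extraction, transformation, and loading steps above can be sketched with a minimal stdlib pipeline (the in-memory CSV sample, column names, and list-based "warehouse" are hypothetical stand-ins; the real work would run in Azure Data Factory and Databricks):

```python
import csv
import io

# Extract: parse a CSV source (an in-memory sample standing in for a feed)
source = io.StringIO("id,amount\n1,10.5\n2,abc\n3,4.0\n")
rows = list(csv.DictReader(source))

# Transform: coerce types, rejecting rows that fail validation
def transform(row):
    try:
        return {"id": int(row["id"]), "amount": float(row["amount"])}
    except ValueError:
        return None  # a real pipeline would quarantine bad records

clean = [r for r in (transform(row) for row in rows) if r is not None]

# Load: append to the target (a list standing in for a warehouse table)
warehouse = []
warehouse.extend(clean)
```

The shape is the same at scale: only the extract source, the transform rules, and the load target change.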
What we offer:
  • Earn a competitive rate within the industry.
  • Potential for extension.
  • Full-time position

Senior Data Engineer

We are looking for a Senior DevOps Engineer with strong experience in Azure Clou...
Location:
Netherlands, Utrecht
Salary:
Not provided
Levy Professionals
Expiration Date:
Until further notice
Requirements:
  • Minimum 5 years of experience as a Platform / DevOps Engineer
  • Strong hands-on experience with Azure Cloud and Azure DevOps
  • Experience with Databricks and Apache Airflow
  • Programming experience with Python and PySpark
  • Proven experience in automating deployment and operational workflows
  • Strong communication skills and ability to coach and support fellow DevOps engineers
  • Comfortable working in a complex, high-impact and fast-paced environment
  • Write Python logic to produce data to the central data lake
  • Implement data transfer controls to ensure high data quality
  • Provide automated insights in the health of daily batch system
Job Responsibilities:
  • Streamlining and optimizing Azure DevOps pipelines
  • Developing and maintaining Infrastructure as Code (IaC) on Azure
  • Collaborating with engineers and squads to define clear technical requirements
  • Setting up and promoting DevOps best practices
  • Providing test automation support
  • Improving deployment, monitoring and operational processes
  • Full-time position

Data Engineer

Location:
Not provided
Salary:
Not provided
United ITs
Expiration Date:
Until further notice
Requirements:
  • Microsoft Azure platform expertise
  • Microsoft Azure platform Data services expert knowledge
  • Proven experience working with SQL Server, Data Factory, Logic Apps, Azure Functions, Databricks, Synapse, and Azure DevOps
  • Python, PySpark, Spark SQL, T-SQL expert knowledge and experience
  • Proven experience working with data lakes and Delta Lake
  • Proven experience writing complex transformations and ingestion pipelines using Data Factory, Databricks and PySpark
  • Proven experience working with relational database engines such as SQL Server
  • Proven experience in Microsoft Business Intelligence modules covering Integration Services (SSIS), Reporting Services (SSRS), Analysis Services (SSAS)
  • Excellent scripting skills with PowerShell, Bash, and Python
  • Proven experience building CI/CD pipelines in Azure using, at least, scripting and ARM templates
  • Full-time position

Databricks Engineer

Job Title: Databricks Engineer; Location – Remote; Full-time; Skill: Databricks E...
Location:
United States
Salary:
140000.00 USD / Year
Realign
Expiration Date:
Until further notice
Requirements:
  • Minimum 6-8 years of relevant experience
  • Develop and maintain CI/CD pipelines for Azure Databricks deployments (Azure DevOps/YAML and related tools)
  • Automate deployment and configuration of Databricks clusters, jobs, libraries, notebooks, and environment promotions
  • Implement and manage the Databricks environment for performance, cost efficiency, and scalability; optimize cluster sizing and autoscaling
  • Collaborate with Data Engineers/Scientists/Software Engineers to design, deploy, and scale data pipelines and models on Databricks
  • Monitor and troubleshoot clusters, pipelines, jobs, and associated workflows; integrate Azure Monitor/Log Analytics for visibility and metrics
  • Implement Infrastructure as Code (IaC) using Terraform, ARM templates, or Bicep to manage Azure resources and Databricks artifacts
  • Design and maintain backup, recovery, and DR strategies for Databricks environments
Job Responsibilities:
  • Workspace & environment engineering: Standardize Dev/UAT/Prod workspaces (network/Private Link, VNets, secure egress), service principals, secret scopes, and Key Vault integrations
  • Unity Catalog & governance: Configure catalogs/schemas, RBAC, lineage, and data access patterns aligned to our guardrails
  • CI/CD for Databricks: Implement YAML-based Azure DevOps pipelines to automate notebook/job deployments, dependencies, environment promotions, and approvals/compliance checks
  • IaC for Databricks & Azure: Author reusable Bicep/Terraform modules for workspaces, clusters/pools, UC objects, and supporting Azure resources
  • Observability & reliability: Establish monitoring/alerting for jobs, clusters, SLAs, autoscaling, and cost controls, and automation for disaster recovery scenarios
  • Documentation & handover: Patterns, pipeline templates, IaC modules, and operational runbooks for BAU, plus KT during the first two releases
  • Full-time position