Staff Engineer I – Azure Databricks Engineer

Software Resources

Location:
United States, Irving, Dallas

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Software Resources has an immediate, direct-hire job opportunity for a Staff Engineer I – Azure Databricks Engineer with a major corporation in Dallas, TX. The role is on-site four days per week, with Fridays remote.

Job Responsibility:

  • Responsible for delivery and operations of the technologies and platforms required to model, transform, analyze, report on, and visualize data
  • Work in a mid-level role with proficiency in building, optimizing, streamlining, and automating the Azure Databricks platform to enable analytical workloads such as machine learning (ML), model development, data insights, and data science
  • Partner with ML engineers, data scientists, data analysts, and enterprise architects to enforce best practices and to train and enable users of the Azure Databricks platform
  • Take assignments that can be worked on individually without supervision and manage work effort from concept to completion
  • Build, optimize and maintain the Azure Databricks platform, ensuring scalability, security, governance and performance
  • Implement and manage Azure Databricks workspaces, clusters, jobs, and access management
  • Implement and manage policies, monitoring and observability for the Azure Databricks platform
  • Ensure compliance with IT policies, procedures, and industry standards, including reviewing and refining IT control enhancements
  • Work closely with business and analytics teams to ensure reliable, governed access to the data needed for analytics
  • Troubleshoot platform issues, optimize performance, and ensure uptime for critical Databricks services
  • Stay current on emerging data analytics technologies, recommending enhancements to improve efficiency and governance
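The policy and governance duties above (cluster policies, workspace and job administration) are typically expressed as Databricks cluster-policy JSON, which constrains what users may set when creating clusters. A minimal sketch in Python; the runtime version, node types, and limits below are illustrative assumptions, not values from the posting:

```python
import json


def build_cluster_policy() -> str:
    """Sketch of a Databricks cluster-policy definition (JSON).

    Each key is a cluster attribute path; each value is a rule of type
    "fixed", "range", or "allowlist". All limits here are illustrative.
    """
    policy = {
        # Pin the Databricks runtime so workloads stay reproducible.
        "spark_version": {"type": "fixed", "value": "14.3.x-scala2.12"},
        # Cap autoscaling to control cost.
        "autoscale.max_workers": {"type": "range", "maxValue": 8, "defaultValue": 4},
        # Force idle clusters to shut down.
        "autotermination_minutes": {
            "type": "range", "minValue": 10, "maxValue": 60, "defaultValue": 30,
        },
        # Restrict node types to an approved allowlist.
        "node_type_id": {
            "type": "allowlist",
            "values": ["Standard_DS3_v2", "Standard_DS4_v2"],
            "defaultValue": "Standard_DS3_v2",
        },
    }
    return json.dumps(policy, indent=2)


print(build_cluster_policy())
```

In practice such a definition would be registered via the Cluster Policies API or Terraform and then assigned to user groups, so that ad-hoc clusters stay within governed bounds.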

Requirements:

  • 5+ years of related experience in data analytics administration and development
  • 2+ years of Databricks related experience
  • Bachelor’s degree in related field required
  • Hands-on experience in Azure Databricks (workspace management, clusters, jobs, Unity Catalog, Delta Lake, user access management, REST APIs and SDKs)
  • Understanding of Azure infrastructure and data services, including Azure Data Lake, Azure Data Factory, Azure SQL, Azure Synapse Analytics, Azure Key Vault, Azure Monitor, networking
  • Experience with CI/CD pipelines (Azure DevOps preferred)
  • Strong programming skills in SQL, Python, and/or PySpark
  • Proven experience in leading cross-functional teams and managing multiple projects simultaneously
  • Intermediate to advanced ability to see the big picture and align projects with organizational goals; capable of leading and motivating cross-functional teams; expert at resolving conflicts and addressing challenges; skilled at identifying and mitigating risks at the project level
  • Good communication skills, excellent teamwork experience
  • Occasional travel might be required
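The hands-on Databricks skills listed above (jobs, REST APIs, SDKs) usually involve describing scheduled work declaratively. A minimal sketch of a Jobs API 2.1 create-job payload; the job name, notebook path, schedule, and cluster sizing are hypothetical placeholders, not details from the posting:

```python
def build_job_spec(notebook_path: str) -> dict:
    """Sketch of a Databricks Jobs API 2.1 job-creation payload.

    A real deployment would POST this to /api/2.1/jobs/create or pass
    the equivalent arguments to the databricks-sdk. All names and
    sizing below are illustrative.
    """
    return {
        "name": "nightly-etl",  # hypothetical job name
        "max_concurrent_runs": 1,
        "tasks": [
            {
                "task_key": "run_notebook",
                "notebook_task": {"notebook_path": notebook_path},
                # Ephemeral job cluster rather than an all-purpose cluster,
                # so compute exists only for the duration of the run.
                "new_cluster": {
                    "spark_version": "14.3.x-scala2.12",
                    "node_type_id": "Standard_DS3_v2",
                    "num_workers": 2,
                },
            }
        ],
        "schedule": {
            "quartz_cron_expression": "0 0 2 * * ?",  # 02:00 daily
            "timezone_id": "America/Chicago",
        },
    }


spec = build_job_spec("/Repos/analytics/etl/nightly")
print(spec["tasks"][0]["task_key"])
```

Keeping job definitions as data like this is also what makes the CI/CD requirement above practical: the payload can live in version control and be applied from an Azure DevOps pipeline.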

Nice to have:

Intermediate to advanced knowledge of general Financial Services or Banking preferred

What we offer:
  • Competitive salaries
  • An ownership stake in the company
  • Medical and dental insurance
  • Time off
  • A great 401k matching program
  • Tuition assistance program
  • An employee volunteer program
  • A wellness program
  • Opportunity to bolster your business knowledge, learning the ins and outs of how successful companies operate and manage their finances, giving you invaluable hands-on experience to help grow your career

Additional Information:

Job Posted:
April 01, 2026

Employment Type:
Full-time
Work Type:
On-site, 4 days per week (Friday remote)

Similar Jobs for Staff Engineer I – Azure Databricks Engineer

Staff Data Engineer

Data Idols are working with one of the best-known retail brands in the UK that a...
Location:
United Kingdom, London
Salary:
85000.00 - 95000.00 GBP / Year
Data Idols
Expiration Date:
Until further notice
Requirements:
  • Strong hands-on experience with Azure data platforms
  • Advanced SQL skills
  • Commercial experience using Databricks and PySpark
  • Proven background building and maintaining scalable data pipelines
Job Responsibility:
  • Play a key role in scaling production data systems and raising engineering standards across the wider data function
  • Take ownership of complex, production-grade data pipelines and act as a technical leader within the data engineering team
  • Work on cloud-native solutions built on Azure and Databricks, making key decisions around data processing, modelling, and performance
  • Help set best practices, support other engineers, and influence how data engineering is done across the organisation

Employment Type: Full-time

Staff Software Engineer (Infra)

As a Staff Software Engineer (Infra) at Amigo, you'll own the technical directio...
Location:
United States, New York City
Salary:
220000.00 - 260000.00 USD / Year
Amigo
Expiration Date:
Until further notice
Requirements:
  • 7+ years of production infrastructure experience, with significant time at elite engineering organizations
  • Expert-level experience with Kubernetes and container orchestration at scale
  • Proven track record designing infrastructure that scales across multiple regions
  • Deep experience with cloud platforms (AWS, GCP, or Azure)
  • Strong understanding of infrastructure-level networking and security configurations
  • History of establishing engineering standards and mentoring engineers
  • Extremely high standards for reliability, security, and operational excellence
  • Both execution-oriented and defensive-minded: you ship infrastructure while anticipating failure modes
  • Deep knowledge of infrastructure as code tools (Terraform, Pulumi, or similar)
  • Experience with compliance requirements and data residency controls in regulated industries
Job Responsibility:
  • Own technical architecture for infrastructure across cloud platforms, Kubernetes, Databricks, and supporting systems
  • Drive engineering standards for reliability, security, observability, and incident response
  • Architect multi-region deployment strategies with zero-downtime updates for critical systems
  • Design the compliance & security infrastructure for healthcare (HIPAA, SOC 2) and support future regulatory requirements
  • Own disaster recovery architecture and backup systems meeting healthcare compliance requirements
  • Make build vs. buy decisions for infrastructure tooling and evaluate technical tradeoffs
  • Design auto-scaling systems that handle traffic spikes while maintaining cost efficiency
  • Own our infrastructure as code, ensuring clearly documented and identical deployments across regions
  • Mentor engineers and establish patterns that raise the bar for the infrastructure team
  • Collaborate with backend, platform, and security teams to ensure system-wide coherence
What we offer:
  • Comprehensive health, dental, and vision insurance
  • Mental health support and wellness coaching
  • Flexible wellness stipend for fitness, therapy, or personal growth
  • Daily catered lunch and dinner
  • Annual learning budget for courses, books, or conferences
  • Conference attendance budget for professional development
  • Development setup of your choice
  • Academic collaboration opportunities

Employment Type: Full-time

Head of Data, Automation & AI

Knovia Group, the UK’s leading apprenticeship provider, is on a bold mission to ...
Location:
United Kingdom
Salary:
90000.00 GBP / Year
Paragon Skills
Expiration Date:
Until further notice
Requirements:
  • Degree in computer science, data science, engineering, a related field or equivalent professional training/qualifications
  • Strong CPD record keeping abreast of latest in data architecture, governance, cloud data platforms, advanced analytics, connected systems, and development of AI agents
  • Proven experience in data platform strategy, AI/ML enablement, or data transformation at scale
  • 5+ years' senior experience in a data scientist, data engineer, or developer role
  • 3+ years leading a function
  • Experience in a senior data, AI, or digital transformation leadership role
  • Track record of delivering enterprise-scale data infrastructure and AI/automation initiatives
  • Strong understanding of data architecture, governance, and cloud data platforms (e.g., Snowflake, Databricks, AWS/GCP, MS Fabric)
  • Deep expertise in cloud-based data architectures (e.g., AWS, Azure, or GCP), data engineering, and MLOps
  • Familiarity with tools like Databricks, Snowflake, MLflow, Airflow, dbt, and LLM technologies would be advantageous
Job Responsibility:
  • Lead the development of a modern data platform (data lake, warehouse, pipelines, BI suite)
  • Automate and integrate end-to-end business processes using tools like Workato
  • Develop and deploy AI agents to enhance operational efficiency and learner/employer experiences
  • Enhance our analytics capabilities to better understand and serve our customers
  • Shape our internal AI capability, from staff skills to leadership development
What we offer:
  • Generous Annual Leave: 21 days, increasing with length of service, plus a holiday purchase scheme
  • Holiday Benefits: 3 Knovia Days for our operational December closure and 8 Public Bank Holidays
  • Extra Day Off: Enjoy an additional day off to celebrate your birthday
  • Paid Volunteering Leave: Up to 3 days of paid leave for volunteering opportunities and corporate conscience initiatives
  • Perkbox: Access to a wide range of lifestyle benefits and wellness tools
  • Recognition and Long Service Awards: Celebrating the milestones and contributions of our colleagues

Employment Type: Full-time

Data Platform Architect

We are seeking a hands-on Data Platform Architect to lead the design, implementa...
Location:
India, Bangalore
Salary:
Not provided
Sandisk
Expiration Date:
Until further notice
Requirements:
  • 10+ years of hands-on experience in data architecture, engineering, and analytics delivery
  • Proven success in building modern data platforms on cloud (AWS, Azure, GCP)
  • Deep knowledge of data lakehouse architectures (e.g., Databricks, Fabric)
  • Proficiency with Python, SQL, Spark, and orchestration frameworks
  • Experience with ETL/ELT tools (e.g., Informatica, Talend, Fivetran) and containerization (Docker, Kubernetes)
  • Strong background in Data Modeling (ERD, star/snowflake, canonical models)
  • Familiarity with REST APIs, GraphQL, and event-driven design
  • Demonstrated experience integrating AI/ML and GenAI components into data platforms
Job Responsibility:
  • Define and continuously evolve the target data architecture across the stack: governance, engineering, modeling, lakehouse, AI/ML
  • Translate business and technical goals into scalable and resilient platform designs
  • Own and maintain architectural roadmaps, standards, and decision frameworks
  • Act as the bridge between architects, Business SME/Analysts, data engineers, and analytics teams to ensure alignment and compliance with platform standards
  • Design and implement modern ELT/ETL pipelines using tools like Spark, Python, SQL, Scala, and cloud-native components (e.g., Fivetran, Databricks, Snowflake, BigQuery)
  • Build and maintain Lakehouse platforms using Delta Lake, Iceberg, or equivalent technologies
  • Manage data ingestion from heterogeneous sources including ERP, CRM, IoT, and third-party APIs
  • Guide hands-on development of robust, reusable, and automated data flows
  • Implement and enforce data governance frameworks including data lineage, metadata management, and access controls
  • Partner with Data Stewards and Governance Analysts to catalog data domains, define entities, and ensure SOX compliance

Employment Type: Full-time

Staff Software Engineer, Backend

Tonal is looking for a passionate Staff Backend Engineer to work cross-functiona...
Location:
United States, San Francisco; Toronto
Salary:
200000.00 - 220000.00 USD / Year
Tonal
Expiration Date:
Until further notice
Requirements:
  • 5+ years of software development experience
  • Track record of solving problems end-to-end
  • Experience with distributed systems, microservices architecture, cloud platforms (AWS, Azure or GCP) and RESTful APIs
  • Expertise in server-side software development in Golang (or other languages, e.g. Rust, C++, C#, Java, Python)
  • Strong understanding of database design and modeling (e.g., PostgreSQL)
  • Excellent communicator with the ability to work collaboratively and cohesively in a cross-functional team
  • Experience with data-intensive applications, big data pipelines and analytics, having used tools such as Snowflake, DataBricks, Amplitude and Looker
Job Responsibility:
  • Impact and contribute to the development of major software projects for our backend services
  • Collaborate with our tightly integrated software, hardware and content teams to continually evolve a unique, consumer-oriented fitness product
  • Work closely with product team to clarify requirements and develop designs for new features
  • Implement, test, deploy and monitor software for our cloud-based backend microservices
  • Write and review clean, secure and testable code with a focus on maintainability, scalability and performance
  • Develop and support cloud-based big data pipelines
  • Use analytics to understand product behavior and motivate data-driven decisions
  • Mentor and share your broad knowledge with more junior engineers
What we offer:
  • Offers Equity

Employment Type: Full-time

Staff Data Scientist

Join Blue Yonder Data Science and Machine Learning team as a Staff Data Scientis...
Location:
United States, Coppell, Texas
Salary:
130023.09 - 163976.90 USD / Year
Blue Yonder
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s Degree in Computer Science or any other related field is required
  • Minimum 8 to 10 years of experience with a strong foundation in data science and deep learning principles
  • Proficient in Python programming with a solid understanding of data structures
  • Experience with frameworks and libraries like Pandas, NumPy, Keras, TensorFlow, Jupyter, Matplotlib, etc.
  • Expertise in a database query language, preferably SQL
  • Familiarity with deep learning, time series, NLP, reinforcement learning, and combinatorial optimization
  • Familiarity with Big Data technologies like Snowflake, Apache Beam/Spark/Flink, and Databricks
  • Solid experience with major cloud platforms, preferably Azure and/or GCP
  • Knowledge of modern software development tools and best practices, including Git, Github Actions, Jenkins, Docker, Jira, etc.
  • Proven experience in team leadership, mentoring junior data scientists in an official or unofficial capacity
Job Responsibility:
  • Responsible for designing, developing, and testing new algorithms, models, or solution approaches based on machine learning, operations research, or other techniques to solve a Blue Yonder business problem with little or no supervision
  • Develops prototypes and proofs of principles for innovative features
  • Integrates new models and algorithms into a product solution
  • Mentors team with scientific libraries and development tools
  • Develops quality software (including test code) according to clean code principles and Blue Yonder standards
  • Considers operational aspects of machine learning services in early design stages and strives for more automation and self-service to reduce operational efforts for themselves and the team
  • Empowers team members by sharing knowledge and providing hints so they come up with their own solutions
  • Leads by example
  • As a Staff Data Scientist, you will guide the team in the implementation of machine learning models and deep learning tools
  • Collaborate with cross-functional teams to execute data enrichment, cleansing routines, and feature selection
What we offer:
  • Comprehensive Medical, Dental and Vision
  • 401K with Matching
  • Flexible Time Off
  • Corporate Fitness Program
  • A variety of voluntary benefits, such as legal plans, accident and hospital indemnity, pet insurance, and much more

Employment Type: Full-time