
Software Engineer 2 / Senior Software Engineer - Azure Data


Microsoft Corporation

Location:
Bangalore, India

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Our team is part of Azure Data, building reliable, large-scale distributed systems for data engineering. We develop Fabric Materialized Lake Views, which automatically refresh analytical data in Microsoft Fabric so customers can query up-to-date results without manual pipelines. We also maintain Azure HDInsight in production, the PaaS platform running open-source Hadoop, Spark, HBase, and Kafka.
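The Materialized Lake Views described above follow a classic pattern: precompute a query result and refresh it when the underlying data changes, so readers always see an up-to-date snapshot without running a pipeline themselves. A minimal sketch of that pattern follows; every name in it is a hypothetical illustration, not a Fabric API.

```python
# Hypothetical sketch of the materialized-view idea, not Fabric's actual
# implementation: cache a query result over source data and recompute it
# only when the source has advanced, so readers query fresh results
# without orchestrating a refresh pipeline themselves.

class MaterializedView:
    """Caches the result of a query over a source and refreshes on demand."""

    def __init__(self, source, query):
        self.source = source      # mutable list of rows (dicts)
        self.query = query        # function: rows -> result
        self._result = None
        self._version = -1        # source version last materialized

    def refresh(self, version):
        # Recompute only when the source has moved past the cached version;
        # otherwise serve the cached (still current) result.
        if version > self._version:
            self._result = self.query(self.source)
            self._version = version
        return self._result


def total_by_region(data):
    # Toy aggregation: sum "amount" per "region".
    out = {}
    for row in data:
        out[row["region"]] = out.get(row["region"], 0) + row["amount"]
    return out


rows = [{"region": "east", "amount": 10}, {"region": "west", "amount": 5}]
mv = MaterializedView(rows, total_by_region)

print(mv.refresh(version=1))   # {'east': 10, 'west': 5}
rows.append({"region": "east", "amount": 7})
print(mv.refresh(version=2))   # {'east': 17, 'west': 5}
```

In a production system the refresh would be incremental and scheduled rather than a full recompute on demand, but the contract is the same: the viewer queries a result, never the pipeline.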

Job Responsibility:

  • Write extensible, maintainable code in C#, Java, Scala, or Python for Fabric Materialized Lake View services and HDInsight components
  • Use AI tools and coding best practices across the development lifecycle
  • Design data refresh, scheduling, and query optimisation features with minimal supervision
  • Review code from teammates for correctness, test coverage, security risks, and adherence to team standards
  • Coach junior engineers through code reviews
  • Debug complex issues in distributed systems running on Azure, Linux, and Windows
  • Run live site operations on a rotational, on-call basis
  • Integrate logging and instrumentation to gather telemetry on system health, performance, reliability, and security
  • Work with product managers, technical leads, and partners across geographies to define customer requirements for Materialized Lake View features

Requirements:

  • Bachelor's Degree in Computer Science or related technical field AND 3+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python
  • OR equivalent experience
  • Experience with the Azure stack including Storage, Compute, Networking, Fabric, Purview, Synapse, AKS, DevOps, Data Factory, or Power BI
  • Experience with big data technologies such as Spark, Kafka, Hadoop, or HBase
  • Experience building data lake or data engineering products, tools, or pipelines
  • Familiarity with container-based architectures (Docker, Kubernetes)
  • Ability to debug complex distributed systems on Linux and/or Windows platforms

Nice to have:

  • Master's + 4 years technical engineering experience
  • OR Bachelor's + 5 years technical engineering experience
  • OR equivalent experience

Additional Information:

Job Posted:
April 16, 2026

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Software Engineer 2 / Senior Software Engineer - Azure Data

Software Engineer 2 / Senior Software Engineer

We are looking for experienced Software Engineers for our Bangalore location ...
Location:
Bengaluru, India
Salary:
Not provided
Komprise, Inc.
Expiration Date: Until further notice
Requirements:
  • Solid grasp of computer science fundamentals and especially data structures, algorithms, multi-threading
  • Ability to solve difficult problems with a simple elegant solution
  • Should have solid object-oriented programming background with impeccable design skills
  • Experience in developing management applications and performance management applications is ideal
  • Experience with object-based file systems and REST interfaces is a plus (e.g. Amazon S3, Azure, Google Cloud Service)
  • Should have a BE or higher in CS, EE, Math or related engineering or science field
  • 5+ years of experience in software deployment
  • Tech stack: Java, Maven, Virtualisation, SaaS, GitHub, Jira, Slack, cloud solutions, and hypervisors
Job Responsibility:
  • Responsible for designing and developing features that power the Komprise data management platform to manage billions of files and petabytes of data
  • Responsible for designing major components and systems of our product architecture, ensuring the Komprise data management platform is highly available and scalable
  • Responsible for writing performant code, evaluating feasibility, developing for quality, and optimizing for maintainability
  • Work in an agile, customer-focused, fast-paced team with direct interaction with customers
  • Responsible for analysing customer-escalated issues and providing resolutions in a timely manner
  • Design and implement highly performant, scalable distributed systems

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
Lisle, United States
Salary:
84835.61 - 149076.17 USD / Year
Adtalem Global Education
Expiration Date: Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • 2+ years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, and Vertex AI
  • 6+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data.
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar.
  • Expert knowledge on Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program

Senior Data Engineer

As a Senior Data Engineer on our Core Engineering Data Team, you will design and...
Location:
Boston, United States
Salary:
111800.00 - 164000.00 USD / Year
SimpliSafe
Expiration Date: Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience
  • 4+ years of experience in software engineering, data engineering, or a related field, with at least 2 years focused on data operations or data infrastructure
  • Strong knowledge of AWS or other public cloud platforms (e.g., Azure, GCP)
  • Strong SQL knowledge and experience optimizing for data warehousing technologies like AWS Athena
  • Strong knowledge of Python for use in data transformation
  • Hands-on experience with ETL/ELT, schema design, and datalake technologies
  • Hands-on experience with data orchestration tools like Dagster, Airflow, or Prefect
  • Experience with CI/CD pipelines, Docker, Kubernetes, and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
  • Familiarity with various data and table formats (JSON, Avro, Parquet, Iceberg)
  • Love of data and a passion for building reliable data products
Job Responsibility:
  • Collaborate with analysts, engineers, product managers, and stakeholders to design and implement solutions for Product and Engineering data workflows
  • Identify areas for improvement and contribute to centralized data platform
  • Manage data pipeline, orchestration, storage, and analytics infrastructure for Product and Engineering
  • Monitor performance and reliability of data pipelines, implementing solutions for scalability and efficiency
  • Optimize table structures to support query and usage patterns
  • Partner with producers of data across SimpliSafe to develop an understanding of data creation and meaning
  • Support data discovery, catalog, and analytics tooling
  • Implement and maintain data security measures and ensure compliance with data governance policies
  • Design and implement testing strategies and data quality validation to ensure the accuracy, reliability, and integrity of data pipelines
  • Contribute to the formation of our new team, assisting with the development of team norms, practices, and charter
What we offer:
  • A mission- and values-driven culture and a safe, inclusive environment where you can build, grow and thrive
  • A comprehensive total rewards package that supports your wellness and provides security for SimpliSafers and their families
  • Free SimpliSafe system and professional monitoring for your home
  • Employee Resource Groups (ERGs) that bring people together, give opportunities to network, mentor and develop, and advocate for change
  • Participation in our annual bonus program, equity, and other forms of compensation, in addition to a full range of medical, retirement, and lifestyle benefits

Data Engineer II

As a Data Engineer II on our Core Engineering Data Team, you will focus on build...
Location:
Boston, United States
Salary:
93200.00 - 136600.00 USD / Year
SimpliSafe
Expiration Date: Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience
  • 2+ years of experience in software engineering, data engineering, or a related field
  • Foundational knowledge of AWS or other public cloud platforms (e.g., Azure, GCP)
  • Strong SQL knowledge and experience optimizing for data warehousing technologies like AWS Athena
  • Strong knowledge of Python for use in data transformation
  • Experience with ETL/ELT, schema design, and datalake technologies
  • Familiarity with various data and table formats (JSON, Avro, Parquet, Iceberg)
  • Love of data and a passion for building reliable data products
Job Responsibility:
  • Develop and maintain efficient and reliable ETL/ELT data pipelines from product and service data sources to our centralized analytical platforms
  • Implement and monitor data pipeline orchestration using tools like Airflow or Dagster to ensure timely and accurate data delivery
  • Collaborate with senior engineers and analysts to understand requirements and implement technical solutions for data workflows
  • Perform routine monitoring of pipeline performance and reliability, assisting with troubleshooting and optimizing for efficiency
  • Contribute to the optimization of table structures and data storage to support various query and usage patterns
  • Support and maintain data discovery, catalog, and analytics tooling for internal teams
  • Assist in implementing data security measures and ensuring compliance with data governance policies
  • Write and maintain data quality validation checks and unit tests to ensure the integrity and reliability of data pipelines
  • Participate in team-wide discussions to help develop and refine team norms and engineering best practices
What we offer:
  • A mission- and values-driven culture and a safe, inclusive environment where you can build, grow and thrive
  • A comprehensive total rewards package that supports your wellness and provides security for SimpliSafers and their families
  • Free SimpliSafe system and professional monitoring for your home
  • Employee Resource Groups (ERGs) that bring people together, give opportunities to network, mentor and develop, and advocate for change
  • Participation in our annual bonus program, equity, and other forms of compensation, in addition to a full range of medical, retirement, and lifestyle benefits

Senior Cloud Engineer

The Senior Cloud Engineer is responsible for the development, delivery and perfo...
Location:
Wilton, United States
Salary:
Not provided
ASML
Expiration Date: Until further notice
Requirements:
  • Bachelor's degree in Information Technology, Computer Science, or Engineering; Master's degree a plus
  • 8+ years of proven experience in large technical software development environments
  • Linux (RHEL, CentOS)
  • Networking architecture (TCP/IP, UDP)
  • Ticketing system (ServiceNow)
  • Monitoring systems (Splunk)
  • Application management experience (Atlassian stack: Jira, Confluence, Wiki, Bitbucket)
  • Storage knowledge (NFS, SMB, CIFS, ZFS, NAS)
  • Scripting languages (Python, Bash, Perl)
  • Software developer tools (IDEs, code quality checkers, review tools, etc.)
Job Responsibility:
  • Collaborates with teams across various projects and customers on feasibility and application customizations
  • Responsible for software build and development support activities for local or remote systems, including installation, software patching, system security, data backup, storage management, VMs, and problem analysis
  • Performs remote troubleshooting through diagnostic techniques and pertinent questions
  • Supports the support process by serving as the Tier 1 / Tier 2 point of contact and owner of problems/incidents
  • Logs all customer requests and updates calls using the designated call handling and tracking system
  • Defines improvement proposals that focus on system performance and process efficiency for end users
  • Investigates and implements non-standard changes with a limited scope
  • Supports and takes ownership of requests from engineering and helps to solve them
  • Ensures all IT standard processes, procedures, and Service Level Agreements are met
  • Other duties as assigned

Senior Cloud Engineer - Product Metrics

The Product Metrics team owns the collection, storage, and serving of metrics co...
Location:
United States
Salary:
141000.00 - 208000.00 USD / Year
ClickHouse
Expiration Date: Until further notice
Requirements:
  • 5+ years of relevant software development industry experience building and operating scalable, fault-tolerant, distributed systems
  • 2+ years of software application development experience using Golang
  • Experience with at least one of the major Cloud Service Providers such as AWS, GCP or Azure
  • Experience with storing, shipping, and retrieving large volumes of data efficiently using technologies such as ClickHouse
  • Experience with technologies such as Kubernetes, Helm, ArgoCD, Temporal as well as infrastructure-as-code tools such as Terraform
Job Responsibility:
  • Take an active part in determining the roadmap for the Product Metrics team
  • Work closely within the team to deliver new features, iterate and improve them
  • Design, build, operate, and maintain business-critical petabyte-scale systems
  • Be responsible for the performance, reliability, availability and cost-efficiency of the Product Metrics systems
  • Mentor and support other team members, participate in design discussions and collaborate with the team
  • Be a part of on-call rotation and take ownership of the services you're running
What we offer:
  • Flexible work environment - ClickHouse is a globally distributed company and remote-friendly. We currently operate in 20 countries
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 Home office setup if you’re a remote employee
  • Global Gatherings – We believe in the power of in-person connection and offer opportunities to engage with colleagues at company-wide offsites

Senior Cloud Engineer - Product Metrics

The Product Metrics team owns the collection, storage, and serving of metrics co...
Location:
Canada
Salary:
Not provided
ClickHouse
Expiration Date: Until further notice
Requirements:
  • 5+ years of relevant software development industry experience building and operating scalable, fault-tolerant, distributed systems
  • 2+ years of software application development experience using Golang
  • Experience with at least one of the major Cloud Service Providers such as AWS, GCP or Azure
  • Experience with storing, shipping, and retrieving large volumes of data efficiently using technologies such as ClickHouse
  • Experience with technologies such as Kubernetes, Helm, ArgoCD, Temporal as well as infrastructure-as-code tools such as Terraform
Job Responsibility:
  • Take an active part in determining the roadmap for the Product Metrics team
  • Work closely within the team to deliver new features, iterate and improve them
  • Design, build, operate, and maintain business-critical petabyte-scale systems
  • Be responsible for the performance, reliability, availability and cost-efficiency of the Product Metrics systems
  • Mentor and support other team members, participate in design discussions and collaborate with the team
  • Be a part of on-call rotation and take ownership of the services you're running
What we offer:
  • Flexible work environment - ClickHouse is a globally distributed company and remote-friendly. We currently operate in 20 countries
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 Home office setup if you’re a remote employee
  • Global Gatherings – We believe in the power of in-person connection and offer opportunities to engage with colleagues at company-wide offsites

Senior Data Engineer - Digital Marketing & Analytics

The Senior Data Engineer – Digital Marketing & Analytics is responsible for impl...
Location:
Redmond, United States
Salary:
119800.00 - 234700.00 USD / Year
Microsoft Corporation
Expiration Date: Until further notice
Requirements:
  • Master's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 3+ years experience in business analytics, data science, software development, data modeling, or data engineering
  • OR Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 4+ years experience in business analytics, data science, software development, data modeling, or data engineering
  • OR equivalent experience
  • 2+ years experience with data governance, data compliance and/or data security
  • Experience supporting digital marketing, personalization, experimentation, or customer analytics use cases
  • Experience working with or alongside a Customer Data Platform (CDP), Azure and Fabric Data Factories
  • Familiarity with identity data concepts and activation workflows
  • Experience operating within architect-led or strategist-led delivery models
Job Responsibility:
  • Build data pipelines and curated datasets per CDP architecture and Technical Architect guidelines
  • Define implementation specs covering mapping, transformation, validation, and dependencies
  • Maintain analytics- and activation-ready data layers for reporting, segmentation, personalization, and AI
  • Collaborate with Data Strategist and analytics teams to clarify measurement and AI requirements
  • Assess feasibility, risks, and methods using available data sources
  • Flag and recommend solutions for data gaps, quality issues, or dependencies
  • Integrate necessary first- and third-party data for CDP use cases
  • Apply standard ingestion and transformation patterns according to governance standards
  • Structure and document data for analytics and activation
  • Embed data quality checks, monitoring, and lineage in workflows