
Senior Software Engineer | Azure Data Analytics


Microsoft Corporation


Location:
Canada, Vancouver


Contract Type:
Not provided

Salary:
114400.00 - 203900.00 CAD / Year

Job Description:

Microsoft’s Azure Data Engineering team is seeking a Senior Software Engineer to join a team that is leading the transformation of analytics with products spanning databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the big data analytics team provides a range of products that enable data engineers and data scientists to extract intelligence from all data – structured, semi-structured, and unstructured. We build the Data Engineering, Data Science, and Data Integration pillars of Microsoft Fabric. The team is hiring a Senior Software Engineer for Microsoft's internal big data analytics platform, which processes exabytes of data each day on behalf of internal Microsoft customers.

Job Responsibility:

  • Improve performance of data processing in a large distributed system
  • Assist customers in complicated debugging scenarios
  • Build the Data Engineering, Data Science, and Data Integration pillars of Microsoft Fabric
  • Handle distributed live site issues
  • Embody our culture and values

Requirements:

  • Bachelor's Degree in Computer Science or related technical field AND 4+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python
  • OR equivalent experience
  • 4+ years professional experience with enterprise-level distributed systems and databases such as T-SQL, Oracle SQL, or NoSQL
  • Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check. This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Nice to have:

  • Master's Degree in Computer Science or related technical field AND 6+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python
  • OR Bachelor's Degree in Computer Science or related technical field AND 8+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python
  • OR equivalent experience.
  • Experience with distributed systems
  • Experience with database internals and database optimization

Additional Information:

Job Posted:
March 20, 2026

Employment Type:
Fulltime
Work Type:
Hybrid work


Similar Jobs for Senior Software Engineer | Azure Data Analytics

Senior Software Engineer

Axis Security - Acquired by HPE Aruba is seeking a highly skilled and motivated ...
Location:
Israel, Tel Aviv
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional software development experience
  • Proficiency in one or more languages such as C#, JavaScript/TypeScript, or Go
  • Experience with frameworks such as .NET Core & React
  • Strong understanding of relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Redis) databases
  • Strong experience in building RESTful APIs and microservices architectures
  • Experience working with one of the leading vendors for big data processing, analytics, and storage (Advantage)
  • Experience with AWS, Azure, or Google Cloud Platform (GCP) (Advantage)
  • Understanding of secure coding practices and data protection regulations (Advantage)
  • Experience with unit testing, integration testing, and automated testing frameworks (Advantage)
  • Experience with Docker, Kubernetes, Gitlab, or other CI/CD tools (Advantage)
Job Responsibility:
  • Design, develop, test, and maintain robust, scalable, and high-quality software applications
  • Contribute to architectural decisions, ensuring efficient system design and implementation
  • Design and optimize data pipelines, integrating structured and unstructured data sources into data lakes
  • Write clean, maintainable, and well-documented code while enforcing coding standards and best practices (SOLID principles, TDD, CI/CD)
  • Identify bottlenecks and optimize application performance, scalability, and security
  • Mentor junior developers, conduct code reviews, and promote knowledge sharing within the team
  • Work closely with product managers, designers, DevOps, and QA teams to deliver high-quality software solutions
  • Troubleshoot and resolve complex technical issues across different components of the software stack
  • Participate in Agile methodologies, including sprint planning, daily stand-ups, and retrospectives
What we offer:
  • Health & Wellbeing
  • Personal & Professional Development
  • Diversity, Inclusion & Belonging
  • Fulltime

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location:
Netherlands, Leiden
Salary:
Not provided
IKEA
Expiration Date:
Until further notice
Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience in driving new data engineering developments (e.g. applying new cutting-edge data engineering methods to improve the performance of data integration, using new tools to improve data quality, etc.)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience to deploy code artifacts into production
Job Responsibility:
  • Contribute to the development of D&A platform and analytical tools, ensuring easy and standardized access and sharing of data
  • Subject matter expert for Azure Databricks, Azure Data Factory and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how & standard available
  • Ensure pipelines are consistent with relevant digital frameworks, principles, guidelines and standards
  • Support in understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility on data quality and data assets on the D&A platform
  • Identify opportunities for data assets and D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team
What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning
  • Fulltime

Senior Data Engineer

Senior Data Engineer to design, develop, and optimize data platforms, pipelines,...
Location:
United States, Chicago
Salary:
160555.00 - 176610.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Master's degree in Engineering Management, Software Engineering, Computer Science, or a related technical field
  • 3 years of experience in data engineering
  • Experience building data platforms and pipelines
  • Experience with AWS, GCP or Azure
  • Experience with SQL and Python for data manipulation, transformation, and automation
  • Experience with Apache Airflow for workflow orchestration
  • Experience with data governance, data quality, data lineage and metadata management
  • Experience with real-time data ingestion tools including Pub/Sub, Kafka, or Spark
  • Experience with CI/CD pipelines for continuous deployment and delivery of data products
  • Experience maintaining technical records and system designs
Job Responsibility:
  • Design, develop, and optimize data platforms, pipelines, and governance frameworks
  • Enhance business intelligence, analytics, and AI capabilities
  • Ensure accurate data flows and push data-driven decision-making across teams
  • Write product-grade performant code for data extraction, transformations, and loading (ETL) using SQL/Python
  • Manage workflows and scheduling using Apache Airflow and build custom operators for data ETL
  • Build, deploy and maintain both inbound and outbound data pipelines to integrate diverse data sources
  • Develop and manage CI/CD pipelines to support continuous deployment of data products
  • Utilize Google Cloud Platform (GCP) tools, including BigQuery, Composer, GCS, DataStream, and Dataflow, for building scalable data systems
  • Implement real-time data ingestion solutions using GCP Pub/Sub, Kafka, or Spark
  • Develop and expose REST APIs for sharing data across teams
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Annual incentive program
  • Fulltime

Senior Software Engineer

About the Role: As a Senior Software Engineer at Dotdigital, your role will invo...
Location:
Poland; South Africa
Salary:
Not provided
Dotdigital
Expiration Date:
Until further notice
Requirements:
  • Strong expertise in .NET (console and web apps)
  • JavaScript (jQuery, VueJS)
  • TypeScript
  • HTML+CSS (Sass, SCSS, etc.)
  • MSSQL with experience in other languages beneficial
  • Experience with cloud computing platforms, particularly Azure
  • Production experience in self-contained integration development
  • Experience with modern real-time analytical data platforms
  • Strong problem-solving skills and the ability to champion any given problem
  • Leadership, proactive, and communication talents, working well independently and within a remote team environment
Job Responsibility:
  • Utilizing .NET and JavaScript programming languages to develop and maintain applications for our customers
  • Supporting multiple existing customer integrations hosted in our Azure web and task server environment
  • Working with our Solution Architects to bring their solution designs to life by collaborating with them through the entire project life cycle
  • Collaborating closely with Core Development, DataOps, and ServiceOps teams
  • Involvement with database design and maintenance for our integrations
  • Ownership of maintaining and improving our framework libraries
  • Continuously improving processes and finding opportunities for innovation
What we offer:
  • Parental leave
  • Medical benefits
  • Paid sick leave
  • Dotdigital day
  • Share reward
  • Wellbeing reward
  • Wellbeing Days
  • Loyalty reward
  • Fulltime

Senior Data Engineer

The Data Engineer is responsible for designing, building, and maintaining robust...
Location:
Germany, Berlin
Salary:
Not provided
ib vogt GmbH
Expiration Date:
Until further notice
Requirements:
  • Degree in Computer Science, Data Engineering, or related field
  • 5+ years of experience in data engineering or similar roles
  • Experience in renewable energy, engineering, or asset-heavy industries is a plus
  • Strong experience with modern data stack (e.g., PowerPlatform, Azure Data Factory, Databricks, Airflow, dbt, Synapse, Snowflake, BigQuery, etc.)
  • Proficiency in Python and SQL for data transformation and automation
  • Experience with APIs, message queues (Kafka, Event Hub), data streaming and knowledge of data lakehouse and data warehouse architectures
  • Familiarity with CI/CD pipelines, DevOps practices, and containerization (Docker, Kubernetes)
  • Understanding of cloud environments (preferably Microsoft Azure, PowerPlatform)
  • Strong analytical mindset and problem-solving attitude paired with a structured, detail-oriented, and documentation-driven work style
  • Team-oriented approach and excellent communication skills in English
Job Responsibility:
  • Design, implement, and maintain efficient ETL/ELT data pipelines connecting internal systems (M365, Sharepoint, ERP, CRM, SCADA, O&M, etc.) and external data sources
  • Integrate structured and unstructured data from multiple sources into the central data lake / warehouse / Dataverse
  • Build data models and transformation workflows to support analytics, reporting, and AI/ML use cases
  • Implement data quality checks, validation rules, and metadata management according to the company’s data governance framework
  • Automate workflows, optimize performance, and ensure scalability of data pipelines and processing infrastructure
  • Work closely with Data Scientists, Software Engineers, and Domain Experts to deliver reliable datasets for Digital Twin and AI applications
  • Maintain clear documentation of data flows, schemas, and operational processes
What we offer:
  • Competitive remuneration and motivating benefits
  • Opportunity to shape the data foundation of ib vogt’s digital transformation journey
  • Work on cutting-edge data platforms supporting real-world renewable energy assets
  • A truly international working environment with colleagues from all over the world
  • An open-minded, collaborative, dynamic, and highly motivated team
  • Fulltime

Senior Data Engineer

Location:
United States, Flowood
Salary:
Not provided
PhasorSoft Group
Expiration Date:
Until further notice
Requirements:
  • Experience with Snowflake or Azure Cloud Data Engineering, including setting up and managing data pipelines
  • Proficiency in designing and implementing ETL processes for data integration
  • Knowledge of data warehousing concepts and best practices
  • Strong SQL skills for querying and manipulating data in Snowflake or Azure databases
  • Experience with data modeling techniques and tools to design efficient data structures
  • Understanding of data governance principles and experience implementing them in cloud environments
  • Proficiency in Tableau or Power BI for creating visualizations and interactive dashboards
  • Ability to write scripts (e.g., Python, PowerShell) for automation and orchestration of data pipelines
  • Skills to monitor and optimize data pipelines for performance and cost efficiency
  • Knowledge of cloud data security practices and tools to ensure data protection
Job Responsibility:
  • Design, implement, and maintain data pipelines and architectures on Snowflake or Azure Cloud platforms
  • Develop ETL processes to extract, transform, and load data from various sources into data warehouses
  • Optimize data storage, retrieval, and processing for performance and cost-efficiency in cloud environments
  • Collaborate with stakeholders to understand data requirements and translate them into technical solutions
  • Implement data security and governance best practices to ensure data integrity and compliance
  • Work with reporting tools such as Tableau or Power BI to create interactive dashboards and visualizations
  • Monitor and troubleshoot data pipelines, ensuring reliability and scalability
  • Automate data workflows and processes using cloud-native services and scripting languages
  • Provide technical expertise and support to data analysts, scientists, and business users
  • Fulltime

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84835.61 - 149076.17 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field
  • Two (2) plus years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, Vertex AI
  • Six (6) plus years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data.
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar.
  • Expert knowledge on Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Fulltime

Senior Software Engineer

A Senior Software Engineer will work closely with Product Managers, Design, and ...
Location:
United States, Chicago
Salary:
145000.00 - 170000.00 USD / Year
Arrive Logistics
Expiration Date:
Until further notice
Requirements:
  • 4+ years of software engineering or other closely related experience
  • Experience building enterprise software in .NET
  • Experience with React, Redux, and GraphQL, preferred
  • Experience working in a collaborative environment working hand-in-hand with Engineering, Product, QA, and DevOps teams preferred
  • Experience developing on large-scale projects, involving multiple teams and modern development frameworks
  • Strong knowledge of core Computer Science fundamentals, engineering best practices, and industry trends
  • Proficiency in system design, and a passion for solving architectural problems
  • Capable of communicating technical decisions and design to non-technical stakeholders
  • Ability to problem-solve unique & complex issues, both independently & collaboratively
  • Strong analytical, problem-solving, decision-making, and interpersonal skills
Job Responsibility:
  • Work in partnership with Product and their Engineering team to develop impactful software solutions that drive Arrive to be a top freight brokerage
  • Take ownership in designing and executing medium to large-scale technical solutions with relative independence to produce high-quality software
  • Oversee all builds from developing, testing, deploying, and continuing to monitor after implementation
  • Develop and maintain relationships across departments such as Data, Product, and other Engineering teams to increase collaboration and identify issues proactively, and provide solutions larger than the team’s purview
  • In partnership with other leaders, establish best practices across the organization and drive the organization’s standards within the team, leading by example
  • Share technical expertise and communicate the why behind all projects to increase team effectiveness
  • Be a leader, mentor, and subject matter expert for the team, stakeholders, and peers. Foster a collaborative environment that drives solutions forward at a larger scope
  • Continue to increase knowledge and understanding of the business and industry at a larger scale to be able to strategically contribute to the team’s roadmap in partnership with the Product Manager
  • Ensure the team is producing a quality product by completing code reviews, test coverage, and providing effective feedback to encourage improvement
  • Practice quality documentation and ensure codebases are left in a comprehensive manner for other team members to use
What we offer:
  • Take advantage of excellent benefits, including medical, dental, vision, life, and disability coverage
  • Invest in your future with our matching 401(k) program
  • Build relationships and take part in learning opportunities through our Employee Resource Groups
  • Enjoy office wide engagement activities, team events, happy hours and more
  • Leave the suit and tie at home; our dress code is casual
  • Work in the heart of downtown Chicago, IL
  • There are CTA and L train stops walking distance from the office and you can store your bike safely inside of the building
  • Sweat it out at the LifeStart gym in our office building that includes brand new Peloton bikes, top-of-the-line equipment and personal training options
  • Maximize your wellness with free counseling sessions through our Employee Assistance Program
  • Fulltime