Principal Data Platform Engineer

Microsoft Corporation

Location:
United States, Redmond

Contract Type:
Not provided

Salary:
139900.00 - 274800.00 USD / Year

Job Description:

Imagine being at the forefront of a revolution powered by Data and AI (Artificial Intelligence), transforming Marketing at Microsoft into a true “Frontier Organization” that drives impact at an unprecedented scale. As a Principal Data Platform Engineer on the Global Marketing Data Engineering team, you will have the opportunity to build cutting-edge data platforms that process and synthesize billions of demand signals, enabling personalized customer journeys, media innovation, and high-impact campaigns that redefine customer engagement. By crafting data products that integrate, enrich, and serve deep insights across an infinite number of scenarios, you are not only shaping the future but also experiencing it firsthand as we directly contribute to the evolution of our most advanced technologies like Purview and Fabric. Join us to unleash your potential and play a pivotal role as a data innovator with world-changing impact - your journey starts here!

Job Responsibility:

  • Develop and implement a federated infrastructure strategy for marketing data focused on cost optimization and scalability
  • Develop monitoring systems and processes for Data Platform infrastructure, pipelines, usage, and access patterns
  • Build capabilities for data discovery, access management, policy enforcement, and lineage tracking, aligned with the DE data platform vision
  • Create and deploy frameworks and tools for data quality measurement and monitoring to deliver secure, sustainable, high-performing, and reliable marketing data for consumer needs
  • Design and roll out a data storage framework and solution in line with the medallion architecture and automated role-based access controls across all access layers
  • Establish engineering excellence through best practices, coding standards, and robust code management tooling
  • Collaborate with Data Product Engineering, Data Ops, and Partner Engineering teams to deliver end-to-end platform solutions
  • Partner with the Fabric, Purview, and Azure ML teams to influence product roadmaps and address feature gaps
  • Implement shared tools for cross-tenant Azure data intake and publishing
  • Deploy capabilities for data classification, tagging, retention, archival, and deletion to meet privacy policy requirements
  • Identify and prioritize platform improvements using incident and feedback data
  • Embody our culture and values

Requirements:

  • Bachelor's Degree in Computer Science or related technical field AND 6+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience
  • Master's Degree in Computer Science or related technical field AND 8+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR Bachelor's Degree in Computer Science or related technical field AND 12+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience
  • 6+ years of software development lifecycle experience, including 2+ years in a lead role managing big data platforms
  • 2+ years of Infrastructure-as-Code (Terraform, Bicep, ARM)
  • 2+ years of CI/CD pipelines and DevOps automation
  • Identity & access management (RBAC, PIM, conditional access)
  • 2+ years of governance frameworks (data classification, policy enforcement, compliance controls)
  • Observability (OpenTelemetry, monitoring, logging, SLOs)
  • Incident management and reliability engineering practices

Additional Information:

Job Posted:
February 13, 2026

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for Principal Data Platform Engineer

Data Engineering & Analytics Lead

Premium Health is seeking a highly skilled, hands-on Data Engineering & Analytic...
Location:
United States, Brooklyn
Salary:
Not provided
Premium Health
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred
  • Proven track record and progressively responsible experience in data engineering, data architecture, or related technical roles
  • Healthcare experience preferred
  • Strong knowledge of data engineering principles, data integration, ETL processes, and semantic mapping techniques and best practices
  • Experience implementing data quality management processes, data governance frameworks, cataloging, and master data management concepts
  • Familiarity with healthcare data standards (e.g., HL7, FHIR), health information management principles, and regulatory requirements (e.g., HIPAA)
  • Understanding of healthcare data, including clinical, operational, and financial data models, preferred
  • Advanced proficiency in SQL, data modeling, database design, optimization, and performance tuning
  • Experience designing and integrating data from disparate systems into harmonized data models or semantic layers
  • Hands-on experience with modern cloud-based data platforms (e.g., Azure, AWS, GCP)
Job Responsibility:
  • Collaborate with the CDIO and Director of Technology to define a clear data vision aligned with the organization's goals and execute the enterprise data roadmap
  • Serve as a thought leader for data engineering and analytics, guiding the evolution of our data ecosystem and championing data-driven decision-making across the organization
  • Build and mentor a small data team, providing technical direction and performance feedback, fostering best practices and continuous learning, while remaining a hands-on implementor
  • Define and implement best practices, standards, and processes for data engineering, analytics, and data management across the organization
  • Design, implement, and maintain a scalable, reliable, and high-performing modern data infrastructure, aligned with the organizational needs and industry best practices
  • Architect and maintain data lake/lakehouse, warehouse, and related platform components to support analytics, reporting, and operational use cases
  • Establish and enforce data architecture standards, governance models, naming conventions, and documentation
  • Develop, optimize, and maintain scalable ETL/ELT pipelines and data workflows to collect, transform, normalize, and integrate data from diverse systems
  • Implement robust data quality processes, validation, monitoring, and error-handling frameworks
  • Ensure data is accurate, timely, secure, and ready for self-service analytics and downstream applications
What we offer:
  • Paid Time Off, Medical, Dental and Vision plans, Retirement plans
  • Public Service Loan Forgiveness (PSLF)
  • Fulltime

Senior Data Engineering Architect

Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date:
Until further notice
Requirements:
  • Proven work experience as a Data Engineering Architect or a similar role and strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Proficiency in Python, PySpark, SQL
  • Familiarity with cloud platforms and services, such as AWS, GCP, or Azure, and experience in designing and implementing data solutions in a cloud environment
  • Knowledge of data governance principles and best practices, including data privacy and security regulations
Job Responsibility:
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables. Support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions

Senior Back End Engineer for Streaming Data Platform

Do you want to build a high-quality data platform that will innovate financial m...
Location:
Salary:
Not provided
KOR Financial
Expiration Date:
Until further notice
Requirements:
  • 8+ years of experience as a Back End Engineer
  • Experience with Java and Spring Boot Framework
  • Experience with building and running applications on public cloud vendors like AWS
  • Working experience with Kafka, Databricks, and streaming data solutions
  • Experience profiling, debugging, and performance tuning complex distributed systems
  • A firm reliance on unit testing and mocking frameworks with a TDD (Test Driven Development) mindset
  • Knowledge of OOP principles and modern development practices
Job Responsibility:
  • Designing and implementing the streaming data platform engine and SDK
  • Implementing new features for our range of web and streaming applications and data reporting capabilities
  • Be an active voice in the platform's build-out with regard to technical choices and implementations
  • Working closely with the broader team to embrace new challenges and adapt requirements as we continue to grow and adjust priorities
  • Paired programming with a growing team of Back-end, Data, and Front-end Engineers
What we offer:
  • Culture of trust, empowerment, and constructive feedback
  • Competitive salary, great IT equipment, and expense allowance
  • Flexible working times
  • A span of control that matches your ambitions and skills
  • Commitment to a genuine, balanced relationship

Senior Principal Engineer Core Data Platform

As an engineer well into your career, we know you're an expert at what you do an...
Location:
United States, Seattle; San Francisco; Mountain View
Salary:
198300.00 - 318600.00 USD / Year
Atlassian
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related technical field
  • 12+ years of experience in backend software development, with a focus on distributed systems and large-scale storage solutions
  • 8+ years of experience designing and managing highly available, large-scale storage architectures in cloud environments
  • 5+ years of hands-on experience working with AWS storage services (S3, EBS, EFS, FSx, Glacier, DynamoDB)
  • Proficiency in system design, performance optimization, and cost-efficient architecture for exabyte-scale storage
  • Expertise in at least one major backend programming language (Kotlin, Java, Go, Rust, or Python)
  • Experience leading technical strategy and architectural decisions in large, multi-team engineering organizations
  • Strong understanding of distributed systems principles, including consistency models, replication, sharding, and consensus algorithms (Raft, Paxos)
  • Deep knowledge of security best practices, including encryption, access control (IAM), and compliance standards (SOC2, GDPR, HIPAA)
  • Experience mentoring senior engineers and driving high-impact engineering initiatives
Job Responsibility:
  • Collaborate with partner teams and internal customers to help define technical direction and OKRs for the Core Data platform organization
  • Regularly tackle the largest and most complex problems on the team, from technical design to implementation and launch
  • Partner across engineering teams to take on company-wide initiatives spanning multiple projects
  • Routinely tackle complex architecture challenges, applying architectural standards and adopting them on new projects
  • Work with senior engineering and product leaders to build strategy and design solutions that earn customers' trust and business
  • Own key OKRs and end-to-end outcomes of critical projects in a microservices environment
  • Champion best practices and innovative techniques for scalability, reliability, and performance optimizations
  • Own engineering and operational excellence for the health of our systems and processes
  • Proactively drive opportunities for continuous improvements and own key operational metrics
  • Continually drive developer productivity initiatives to ensure that we unleash the potential of our own teams
What we offer:
  • Health coverage
  • Paid volunteer days
  • Wellness resources
  • Fulltime

Senior Data Engineer

At ANS, the Senior Data Engineer plays a key role in delivering robust, scalable...
Location:
United Kingdom, Manchester
Salary:
Not provided
ANS Group
Expiration Date:
Until further notice
Requirements:
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Strong knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience in best practice data engineering principles including CI/CD via Azure DevOps or GitHub
  • Understanding of Azure networking and security in relation to the data platform
  • Experience of data governance and regulation, including GDPR, principle of least privilege, classification etc.
  • Experience of lakehouse architecture, data warehousing principles, and data modelling
  • Familiarity with Microsoft Purview in a data platform context
  • Base knowledge of Azure AI Foundry
Job Responsibility:
  • Build and optimise data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud based data sources
  • Support Data Architects and Cloud Engineers by implementing solutions based on provided designs and offering feedback where needed
  • Collaborate across disciplines to ensure high-quality delivery of data solutions, including working with presales, managed services, and customer teams
  • Mentor Data engineers and support their development through guidance and task distribution
  • Ensure best practice adherence in engineering processes, including CI/CD via Azure DevOps and secure data handling (e.g. Key Vault, private endpoints)
  • Contribute to Agile delivery by participating in standups, user story creation, and sprint planning
  • Document implemented solutions clearly and accurately for internal and customer use
  • Troubleshoot and resolve issues across subscriptions and environments
  • Work closely with the Project Manager (where applicable) to align on delivery timelines, report progress, and manage risks, while also acting as a key point of contact for customer SMEs and engineers to support collaboration and clarify technical requirements
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
What we offer:
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • An extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events
  • Fulltime

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location:
Netherlands, Leiden
Salary:
Not provided
IKEA
Expiration Date:
Until further notice
Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience in driving new data engineering developments (e.g. applying new cutting-edge data engineering methods to improve the performance of data integration, using new tools to improve data quality, etc.)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience to deploy code artifacts into production
Job Responsibility:
  • Contribute to the development of D&A platform and analytical tools, ensuring easy and standardized access and sharing of data
  • Subject matter expert for Azure Databricks, Azure Data Factory, and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how & standard available
  • Ensure pipelines are consistent with relevant digital frameworks, principles, guidelines, and standards
  • Support in understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility into data quality and data assets on the D&A platform
  • Identify opportunities for data assets and D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team
What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning
  • Fulltime

Data Engineer

At ANS, the Data Engineer plays a vital role in enabling data-driven decision-ma...
Location:
United Kingdom, Manchester
Salary:
Not provided
ANS Group
Expiration Date:
Until further notice
Requirements:
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience in best practice data engineering principles including CI/CD via Azure DevOps or GitHub
  • Base understanding of Azure networking and security in relation to the data platform
  • Experience of data governance and regulation, including GDPR, principle of least privilege, classification etc.
  • Experience of Lakehouse architecture, data warehousing principles, and data modelling
Job Responsibility:
  • Deliver high-quality data solutions by building and optimising data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud-based data sources
  • Work closely with Data Architects and Senior Data Engineers to implement technical designs and contribute to solution development
  • Collaborate with customer-side data engineers to ensure smooth integration and alignment with business requirements
  • Focus on task execution and delivery, ensuring timelines and quality standards are met
  • Follow engineering best practices including CI/CD via Azure DevOps, secure data handling using Key Vault and private endpoints, and maintain code quality
  • Participate in Agile ceremonies such as standups, sprint planning, and user story refinement
  • Document solutions clearly for internal use and knowledge sharing
  • Troubleshoot and resolve technical issues across environments and subscriptions
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
  • Contribute to the Data Engineer Guild by sharing knowledge, participating in discussions, and helping shape engineering standards and practices
What we offer:
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • Extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events
  • Fulltime

Principal Data Engineer

We are on the lookout for a Principal Data Engineer to help define and lead the ...
Location:
United Kingdom
Salary:
Not provided
Dotdigital
Expiration Date:
Until further notice
Requirements:
  • Extensive experience delivering Python-based projects in the data engineering space
  • Extensive experience working with SQL and NoSQL database technologies (e.g. SQL Server, MongoDB & Cassandra)
  • Proven experience with modern data warehousing and large-scale data processing tools (e.g. Snowflake, DBT, BigQuery, ClickHouse)
  • Hands-on experience with data orchestration tools like Airflow, Dagster, or Prefect
  • Experience using cloud environments (e.g. Azure, AWS, GCP) to process, store and surface large scale data
  • Experience using Kafka or similar event-based architectures (e.g. Pub/Sub via AWS SQS, Azure Event Hubs, AWS Kinesis)
  • Strong grasp of data architecture and data modelling principles for both OLAP and OLTP workloads
  • Capable across the wider software development lifecycle, including agile ways of working and continuous integration/deployment of data solutions
  • Experience as a lead or Principal Engineer on large-scale data initiatives or product builds
  • Demonstrated ability to architect data systems and data structures for high volume, high throughput systems
Job Responsibility:
  • Lead the design and implementation of scalable, secure and resilient data systems across streaming, batch and real-time use cases
  • Architect data pipelines, models, and storage solutions that power analytical and product use cases, primarily using Python and SQL via orchestration tooling that runs workloads in the cloud
  • Leverage AI to automate both data processing and engineering processes
  • Assure and drive best practices relating to data infrastructure, governance, security and observability
  • Work with technologists across multiple teams to deliver coherent features and data outcomes
  • Support the data team to help adopt data engineering principles
  • Identify, validate and promote new tools and technologies that improve the performance and stability of data services
What we offer:
  • Parental leave
  • Medical benefits
  • Paid sick leave
  • Dotdigital day
  • Share reward
  • Wellbeing reward
  • Wellbeing Days
  • Loyalty reward
  • Fulltime