Senior Data Warehouse Developer

University of Rochester

Location:
United States of America, Brighton

Contract Type:
Employment contract

Salary:

86482.00 - 129723.00 USD / Year

Job Description:

The Senior Data Warehouse Developer is responsible for architecting, designing, developing, implementing, and supporting components of the enterprise data warehouse and associated data sources. Works independently and provides support for reporting projects, as well as process and architectural recommendations. Responsible for mentoring staff, demonstrating best practices, planning for business continuity and sustainability, and working on assigned projects that support Business Intelligence and Data Warehousing functions.

Job Responsibility:

  • Works with the user community and reporting team to understand business data and data transformation requirements. Converts those requirements to functional/technical specifications for database and integration design. Manages requests and/or components of larger projects with appropriate documentation and status reports
  • Architects data integration and transformation processes across on-premises and cloud data sources. Provides leadership and serves as a subject matter expert depending on area of focus.
  • Develops data integration and transformation processes across on-premises and cloud data sources using tools such as Oracle Data Integrator, Boomi AtomSphere, and Informatica, as determined by the environment.
  • Develops data models and database designs that leverage their knowledge of data warehouse modeling best practices.
  • Collaborates with other team members using scrum processes or other project management disciplines depending on area of focus. Follows best practices for change and code management. Provides supporting technical documentation that is well written, organized, and maintained.
  • Participates in ongoing stabilization, support and maintenance. Provides on call support as needed for business continuity and operations. Attends webinars, reads case studies and white papers, and participates in opportunities to further refine skills and stay apprised on functional area topics. Maintains current knowledge of technology, equipment, and/or systems and advises on newer technologies.
  • Other duties as assigned

Requirements:

  • Bachelor's degree in a related discipline such as Computer Science, Business, Mathematics, Statistics, Science or Engineering required
  • 5 years of related experience required
  • Multiple project experience with Data Warehousing ETL technologies and overall knowledge of Data Warehousing/Business Intelligence lifecycles required
  • Experience with dimensional data models required
  • Strong analytical and problem-solving abilities required
  • Ability to actively participate and collaborate on a team required
  • Excellent written communication skills and the ability to translate functional business requirements into documented system requirements and designs required
  • Proficiency in developing data transformation packages/stored procedures required
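The transformation-package skill listed above can be sketched with a minimal, self-contained example. This is purely illustrative (the posting names Python only as a nice-to-have, and every table and column name here is invented, not from the job description): rows in a staging table are cleaned and conformed into a dimension table.

```python
import sqlite3

# Hypothetical staging table feeding a patient dimension; names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE stg_patients (patient_id INTEGER, name TEXT, dob TEXT);
CREATE TABLE dim_patient (
    patient_key INTEGER PRIMARY KEY AUTOINCREMENT,
    patient_id  INTEGER UNIQUE,
    name        TEXT,
    dob         TEXT
);
INSERT INTO stg_patients VALUES (1, '  alice  ', '1980-01-02'), (2, 'BOB', NULL);
""")

# Transform step: trim and normalize names, default missing dates, then load.
staged = cur.execute("SELECT patient_id, name, dob FROM stg_patients").fetchall()
for pid, name, dob in staged:
    cur.execute(
        "INSERT OR REPLACE INTO dim_patient (patient_id, name, dob) VALUES (?, ?, ?)",
        (pid, name.strip().title(), dob or "1900-01-01"),
    )
conn.commit()

rows = cur.execute(
    "SELECT patient_id, name, dob FROM dim_patient ORDER BY patient_id"
).fetchall()
print(rows)  # cleaned, conformed dimension rows
```

In a production warehouse the same logic would typically live in a stored procedure or an ODI/Informatica mapping rather than ad-hoc Python; the point is the extract-transform-load shape, not the tooling.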

Nice to have:

  • Master’s degree preferred
  • Understanding of Python and SQL, and familiarity with cloud data platforms (e.g., AWS, GCP, Azure) preferred
  • Proven ability to work professionally and progressively in a matrix organization preferred
  • Strong T-SQL experience (Microsoft SQL Server) preferred
  • Proven development skills (SDLC) preferred
  • Report development experience (one or more of SQL, SSRS, Tableau, Spotfire) preferred
  • Healthcare information systems experience preferred
  • Experience with Epic Systems preferred
  • Informatica IDMC experience preferred
  • Denodo Virtual Data Platform experience preferred
  • Understanding of data warehouse and agile software development processes preferred
  • Ability and initiative to learn new technologies preferred
  • Proven Ability to work under pressure in a fast-paced environment preferred
  • Epic Systems Cogito or Clarity and Caboodle experience preferred
  • Cogito Fundamentals and a Cogito Data Model certification obtained within the first year of employment

Additional Information:

Job Posted:
February 20, 2026

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Senior Data Warehouse Developer

Senior Software Developer - ETL

We are seeking a highly experienced Senior Software Developer - ETL to design, d...
Location:
Canada, Toronto
Salary:
Not provided
Randstad
Expiration Date
February 22, 2026
Requirements:
  • Experience with the Microsoft suite of technology, including: Azure Data Factory, Azure SQL Database, Azure Data Lake, and Power BI
  • Experience with developing data extraction, transformation, and load programs (daily and initial load) functionality on a wide range of data repositories (structured and semi-structured files, relational and multi-dimensional data stores)
  • Experience with developing, implementing, and maintaining schedule/dependency logic for Extract Transform Load (ETL) scripts
  • Experience with data integration, data cleansing, and data analytics
  • Experience with data modeling and design principles for data marts and data warehouses
  • Experience in database management and administration
  • Experience in constructing complex SQL queries and performance tuning
  • Experience with Git and knowledge of source control strategies
  • Experience creating technical documentation including ETL source-to-target mappings, data model diagrams, architecture artifacts, detailed design documents, etc.
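An ETL source-to-target mapping of the kind described above can be expressed as data and applied mechanically. A minimal sketch, with entirely hypothetical field names:

```python
# Hypothetical source-to-target mapping document expressed as data.
# Field names and transforms are invented for illustration only.
MAPPING = [
    {"source": "cust_nm", "target": "customer_name", "transform": str.strip},
    {"source": "ord_amt", "target": "order_amount",  "transform": float},
    {"source": "ord_dt",  "target": "order_date",    "transform": lambda v: v.replace("/", "-")},
]

def apply_mapping(row: dict) -> dict:
    # Each mapping entry renames a source field and applies its transform.
    return {m["target"]: m["transform"](row[m["source"]]) for m in MAPPING}

record = apply_mapping({"cust_nm": " Acme ", "ord_amt": "19.99", "ord_dt": "2026/02/20"})
print(record)  # {'customer_name': 'Acme', 'order_amount': 19.99, 'order_date': '2026-02-20'}
```

Keeping the mapping as data rather than inline code mirrors the source-to-target mapping documents the posting asks for: the same artifact drives both the documentation and the load.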
Job Responsibility:
  • ETL Design and Implementation: Designing, implementing, and continuously expanding data pipelines by performing extraction, transformation, and loading activities, with a focus on daily and initial load programs
  • Technology Stack: Utilizing the Microsoft suite of technology, including Azure Data Factory, Azure SQL Database, Azure Data Lake, and Power BI
  • Coding & Quality: Translating technical systems specifications into working, tested applications by writing high-quality code. This includes writing and/or generating code, compiling data-driven programs, and conducting unit tests
  • Data Expertise: Applying expertise in data integration, data cleansing, and data analytics. Designing and implementing data modeling and design principles for data marts and data warehouses
  • Database Optimization: Constructing complex SQL queries and performing performance tuning. Possessing experience in database management and administration
  • SDLC & Documentation: Collaborating with IT Professionals throughout the SDLC, ensuring applications remain scalable while complying with standards. Creating comprehensive technical documentation including ETL source-to-target mappings, data model diagrams, and detailed design documents
  • Troubleshooting & Support: Resolving and troubleshooting technical problems within ETL pipelines, notifying end-users of issues, and proposing adequate solutions.
What we offer:
  • Long-Term Engagement: Secure a 12-month contract with the potential for extension
  • Onsite Collaboration: Work fully onsite in Toronto, fostering strong team dynamics and collaboration.

Senior Data Engineer

Intratek Computer, Inc. is seeking a highly skilled and experienced Sr Data Engi...
Location:
United States, Los Angeles
Salary:
Not provided
Intratek Computer, Inc.
Expiration Date
Until further notice
Requirements:
  • Proficiency in WhereScape RED for data warehouse automation, including designing, building, and managing data warehouses
  • Expertise in Snowflake’s cloud data platform, including data loading, transformation, and querying using Snowflake SQL
  • Experience with SQL-based development, optimization, and tuning for large-scale data processing
  • Strong understanding of dimensional modeling concepts and experience in designing and implementing data models for analytics and reporting purposes
  • Ability to optimize data pipelines and queries for performance and scalability
  • Familiarity with Snowflake’s features such as virtual warehouses, data sharing, and data governance capabilities
  • Knowledge of WhereScape scripting language (WSL) for customizing and extending automation processes
  • Experience with data integration tools and techniques to ingest data from various sources into Snowflake
  • Understanding of data governance principles and experience implementing data governance frameworks within Snowflake
  • Ability to implement data quality checks and ensure data integrity within the data warehouse environment
What we offer:
  • Medical benefits
  • Paid vacation
  • Paid holidays
  • Full-time

Senior Data Engineer

We are looking for a highly skilled Senior Data Engineer to lead the design and ...
Location:
United Kingdom
Salary:
45000.00 - 60000.00 GBP / Year
Activate Group Limited
Expiration Date
Until further notice
Requirements:
  • Proven experience as a Senior Data Engineer, BI/Data Warehouse Engineer, or similar
  • Strong hands-on expertise with Microsoft Fabric and related services
  • End-to-end DWH development experience, from ingestion to modelling and consumption
  • Strong background in data modelling, including star schema, dimensional modelling and semantic modelling
  • Experience with orchestration, monitoring and optimisation of data pipelines
  • Proficiency in SQL and strong understanding of database principles
  • Ability to design scalable data architectures aligned to business needs
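The star-schema modelling listed above can be illustrated with a tiny sketch: one fact table joined to two dimensions. Table and column names are hypothetical, and SQLite stands in for whichever warehouse engine is actually used:

```python
import sqlite3

# Minimal star schema: a sales fact keyed to date and product dimensions.
# All names are invented for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date VALUES (20240101, 2024), (20250101, 2025);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales VALUES (20240101, 1, 10.0), (20250101, 1, 5.0), (20250101, 2, 7.5);
""")

# Typical consumption query: aggregate the fact across dimension attributes.
totals = cur.execute("""
    SELECT d.year, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.name
    ORDER BY d.year, p.name
""").fetchall()
print(totals)
```

The same shape (narrow fact, wide descriptive dimensions) is what a Power BI semantic model or Fabric Lakehouse would sit on top of.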
Job Responsibility:
  • Lead the design, architecture and build of a new enterprise data warehouse on Microsoft Fabric
  • Develop robust data pipelines, orchestration processes and monitoring frameworks using Fabric components (Data Factory, Data Engineering, Lakehouse)
  • Create scalable and high-quality data models to support analytics, Power BI reporting and self-service data consumption
  • Establish and enforce data governance, documentation and best practices across the data ecosystem
  • Collaborate with cross-functional teams to understand data needs and translate them into technical solutions
  • Provide technical leadership, mentoring and guidance to junior team members where required
What we offer:
  • 33 days holiday (including bank holidays)
  • Personal health cash plan – claim back the cost of things like dentist and optical check ups
  • Enhanced maternity / paternity / adoption / shared parental pay
  • Life assurance: three times basic salary
  • Free breakfasts and fruit
  • Birthday surprise for everybody
  • Full-time

Senior Data Scientist – Audience Data Product

We are looking for a Senior Data Scientist – Audience Data Product to join our G...
Location:
United States, Bethesda, MD
Salary:
52.06 - 82.45 USD / Hour
Marriott Bonvoy
Expiration Date
Until further notice
Requirements:
  • 4-year degree from an accredited university in Data Analytics, Computer Science, Engineering, Information Systems, or a similar quantitative discipline, with 4+ years of experience demonstrating progressive career growth and a history of exceptional performance in data science or analytics OR 2+ years of experience in data science or analytics with a Master's degree
  • Proven experience in building and optimizing audience segmentation and lookalike models for targeted advertising using data science techniques, including clustering, classification, and predictive modeling
  • Strong experience with writing AI/ML models in Python, Spark or SQL, and familiarity with cloud data warehouses (e.g., Snowflake, AWS) for data processing and model deployment
  • Demonstrated ability to use data science methodologies (e.g., machine learning, statistical modeling) to analyze and optimize large-scale datasets for audience targeting
  • Ability to collaborate cross-functionally with marketing, digital, and technology teams to align on broader data strategies and contribute insights to media optimization and performance measurement
Job Responsibility:
  • Design and develop audience segmentation and various lookalike models to optimize targeting across owned & paid channels
  • Work with cross-functional teams to collect, analyze, and optimize media exposure data for continuous media optimization and audience measurement
  • Collaborate closely with teams from marketing, digital, and technology to evolve the media data ecosystem, leveraging advanced tools such as cloud data warehouses (Snowflake), data cleanrooms, and customer data platforms (CDPs)
  • Develop and implement segmentation and modeling strategies
  • Collaborate with marketing and analytics teams to design and build segmentation models and lookalike models for optimized targeting across paid media channels
  • Work with the AdTech and IT teams to ensure effective integration and deployment of data infrastructure that supports segmentation and audience activation efforts
  • Align segmentation models and tactics with the broader organizational data strategy to ensure consistency with long-term business goals and objectives
  • Collaborate closely with marketing and analytics teams to define key performance indicators (KPIs) and data requirements for optimizing segmentation strategies, improving campaign targeting, and driving ROI
  • Collaborate with technical teams and vendors (e.g., AWS, Snowflake) to build and enhance infrastructure that supports data-driven segmentation, model development, and performance measurement
  • Identify and work on improving processes that enhance data quality and reduce manual interventions in segmentation model development and activation workflows
What we offer:
  • Coverage for medical, dental, vision, health care flexible spending account, dependent care flexible spending account, life insurance, disability insurance, accident insurance, adoption expense reimbursements, paid parental leave, 401(k) plan, stock purchase plan, discounts at Marriott properties, commuter benefits, employee assistance plan, and childcare discounts
  • Full-time

Senior Software Engineer - Data Infrastructure

We build the data and machine learning infrastructure to enable Plaid engineers ...
Location:
United States, San Francisco
Salary:
180000.00 - 270000.00 USD / Year
Plaid
Expiration Date
Until further notice
Requirements:
  • 5+ years of software engineering experience
  • Extensive hands-on software engineering experience, with a strong track record of delivering successful projects within the Data Infrastructure or Platform domain at similar or larger companies
  • Deep understanding of one of: ML Infrastructure systems, including Feature Stores, Training Infrastructure, Serving Infrastructure, and Model Monitoring OR Data Infrastructure systems, including Data Warehouses, Data Lakehouses, Apache Spark, Streaming Infrastructure, Workflow Orchestration
  • Strong cross-functional collaboration, communication, and project management skills, with proven ability to coordinate effectively
  • Proficiency in coding, testing, and system design, ensuring reliable and scalable solutions
  • Demonstrated leadership abilities, including experience mentoring and guiding junior engineers
Job Responsibility:
  • Contribute towards the long-term technical roadmap for data-driven and machine learning iteration at Plaid
  • Leading key data infrastructure projects such as improving ML development golden paths, implementing offline streaming solutions for data freshness, building net new ETL pipeline infrastructure, and evolving data warehouse or data lakehouse capabilities
  • Working with stakeholders in other teams and functions to define technical roadmaps for key backend systems and abstractions across Plaid
  • Debugging, troubleshooting, and reducing operational burden for our Data Platform
  • Growing the team via mentorship and leadership, reviewing technical documents and code changes
What we offer:
  • Medical, dental, vision, and 401(k)
  • Equity and/or commission
  • Full-time

Senior AWS Data Engineer / Data Platform Engineer

We are seeking a highly experienced Senior AWS Data Engineer to design, build, a...
Location:
United Arab Emirates, Dubai
Salary:
Not provided
NorthBay
Expiration Date
Until further notice
Requirements:
  • 8+ years of experience in data engineering and data platform development
  • Strong hands-on experience with: AWS Glue, Amazon EMR (Spark), AWS Lambda, Apache Airflow (MWAA), Amazon EC2, Amazon CloudWatch, Amazon Redshift, Amazon DynamoDB, and AWS DataZone
Job Responsibility:
  • Design, develop, and optimize scalable data pipelines using AWS native services
  • Lead the implementation of batch and near-real-time data processing solutions
  • Architect and manage data ingestion, transformation, and storage layers
  • Build and maintain ETL/ELT workflows using AWS Glue and Apache Spark on EMR
  • Orchestrate complex data workflows using Apache Airflow (MWAA)
  • Develop and manage serverless data processing using AWS Lambda
  • Design and optimize data warehouses using Amazon Redshift
  • Implement and manage NoSQL data models using Amazon DynamoDB
  • Utilize AWS DataZone for data governance, cataloging, and access management
  • Monitor, log, and troubleshoot data pipelines using Amazon CloudWatch
  • Full-time

Senior ETL Developer

Embark on a transformative journey as a Senior ETL Developer at Barclays, where ...
Location:
United States, Whippany
Salary:
120000.00 - 175000.00 USD / Year
Barclays
Expiration Date
Until further notice
Requirements:
  • Designing and building large-scale, highly optimized data warehouses and data marts
  • Python, SQL, Informatica, and shell scripting
  • Writing and optimizing multi-layered SQL queries for performance and scalability
  • Overseeing large-scale batch processing and job scheduling using tools like Autosys
  • Supporting production environments and navigating the change management lifecycle
Job Responsibility:
  • Development and delivery of high-quality software solutions by using industry aligned programming languages, frameworks, and tools. Ensuring that code is scalable, maintainable, and optimized for performance
  • Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives
  • Collaboration with peers, participate in code reviews, and promote a culture of code quality and knowledge sharing
  • Stay informed of industry technology trends and innovations and actively contribute to the organization’s technology communities to foster a culture of technical excellence and growth
  • Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions
  • Implementation of effective unit testing practices to ensure proper code design, readability, and reliability
What we offer:
  • Competitive holiday allowance
  • Life assurance
  • Private medical care
  • Pension contribution
  • Medical, dental and vision coverage
  • 401(k)
  • Life insurance
  • Other paid leave for qualifying circumstances
  • Incentive award eligibility
  • Full-time

Senior Data Engineer

At Relatient, we’re on a mission to simplify access to care – intelligently. As ...
Location:
India, Pune
Salary:
Not provided
Relatient
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree (B.E./B.Tech) in computer engineering, or equivalent work experience in lieu of a degree, required; Master's degree preferred
  • 7+ years of experience in database engineering, data warehousing, or data architecture
  • Proven expertise with at least one major data warehouse platform (e.g. Postgres, Snowflake, Redshift, BigQuery)
  • Strong SQL and ETL/ELT development skills
  • Deep understanding of data modeling
  • Experience with cloud data ecosystems (AWS)
  • Hands-on experience with orchestration tools and version control (Git)
  • Experience in data governance, security, and compliance best practices
  • Experience building/generating analytical reports using Power BI
Job Responsibility:
  • Architect, design, and implement robust end-to-end data warehouse (DW) solutions using modern technologies (e.g. Postgres or on-prem solutions)
  • Define data modeling standards (dimensional and normalized) and build ETL/ELT pipelines for efficient data flow and transformation
  • Integrate data from multiple sources (ERP, CRM, APIs, flat files, real-time streams)
  • Develop and maintain scalable and reliable data ingestion, transformation, and storage pipelines
  • Ensure data quality, consistency, and lineage across all data systems
  • Analyze and tune SQL queries, schemas, indexes, and ETL processes to maximize database and warehouse performance
  • Monitor data systems and optimize storage costs and query response times
  • Implement high availability, backup, disaster recovery, and data security strategies
  • Collaborate with DevOps and Infrastructure teams to ensure optimal deployment, scaling, and performance of DW environments
  • Work closely with Data Scientists, Analysts, and Business Teams to translate business needs into technical data solutions
What we offer:
  • INR 5,00,000/- of life insurance coverage for all full-time employees and their immediate family
  • INR 15,00,000/- of group accident insurance
  • Education reimbursement
  • 10 national and state holidays, plus 1 floating holiday
  • Flexible working hours and a hybrid policy
  • Full-time