
Senior Engineer, Data Warehouse


Carnival Cruises

Location:
United States, Miami

Contract Type:
Not provided

Salary:
Not provided

Job Description:

The Senior Engineer, Data Warehouse is responsible for designing, developing, testing, and maintaining software applications and systems that meet user needs and business objectives. The role collaborates with cross-functional teams, including product managers, designers, and other engineers, to create high-quality, scalable, and efficient software solutions, with a focus on building applications, writing code, and implementing functionality. Drawing on both business and technical experience, the Senior Engineer delivers high-quality products and provides technical leadership and guidance to engineering teams. The position covers the design, development, deployment, maintenance, and in some cases support of multi-tiered applications, and collaborates with architects while managing the technical implementation, integration, and optimization of the product. It partners with IT and business leaders to deliver the technical implementation of business requirements while adhering to architectural guidelines, best practices, and compliance requirements, and is responsible for a complex application portfolio.

Job Responsibility:

  • Software Development: Collaborate with team members and stakeholders to understand project goals, provide technical input, and ensure successful delivery
  • Participate in architectural design decisions to ensure scalability, performance, and reliability
  • Collaborate with Business Analysts and Software Architects to analyze user requirements and application design, ensuring that separate elements of the application work well within the larger program
  • Write, debug, and optimize code to implement software solutions based on technical and business requirements
  • Leverage best practices and industry standards to ensure quality code is developed according to specs and user requirements
  • Ensure solutions can be easily expanded with new feature sets and functionality on frameworks that are easily extendible
  • Create frameworks which promote ease of adoption, sharing, re-use, and interfacing with other application platforms
  • Develop unit tests, perform code reviews, and ensure the software meets quality standards
  • Provide ROM (rough order of magnitude) estimates to development leads for planning purposes
  • Collaborate with Quality Engineers during all phases of testing to ensure defects/deficiencies are addressed
  • Collaborate with the Project Manager/Scrum Master on issues, risks, and status
  • Documentation: Create and maintain technical documentation for software functionality, processes, and best practices
  • Document each aspect of a system or application as a reference for future upgrades and maintenance
  • Quality: Identify and resolve issues that arise during the testing and maintenance processes
  • Ensure applications are stable and meet established KPIs/SLAs
  • Ensure that software complies with security requirements (e.g., PCI, SOX, PII)
  • Operations/Support: Respond to production issues
  • Quickly analyze and resolve bugs or errors in production without affecting system stability
  • Deploy updates, features, or patches to production environment
  • Validate post-deployment functionality and ensure no regressions occur
  • Address performance bottlenecks (e.g., slow database queries, unoptimized code)
  • Optimize system resources like CPU, memory, or network usage
  • Leverage tools to watch for anomalies
  • Investigate issues reported by monitoring tools, users, or customer support teams
  • Set up alerts and dashboards to proactively address issues
  • Identify the underlying causes of incidents and prevent recurrence
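Several of the operations bullets above (watching for anomalies, setting up alerts, investigating issues reported by monitoring tools) boil down to comparing live metrics against recent history. A minimal, illustrative Python sketch of that idea — the function name, window size, and z-score threshold are hypothetical choices, not anything specified by the posting:

```python
from statistics import mean, stdev

def detect_anomalies(samples, window=5, z_threshold=3.0):
    """Flag samples deviating more than z_threshold standard deviations
    from the mean of the trailing window (a simple z-score check)."""
    alerts = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) / sigma > z_threshold:
            alerts.append((i, samples[i]))
    return alerts

# Steady query latency in milliseconds, with one sudden spike:
latencies = [102, 99, 101, 100, 98, 103, 480, 101]
print(detect_anomalies(latencies))  # → [(6, 480)]
```

Production monitoring would of course use the dedicated tooling the requirements call for; the point is only that proactive alerting reduces to a comparison like this one.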

Requirements:

  • Bachelor’s degree (Master’s preferred) in Information Technology, Computer Science, or a related field, or equivalent work experience
  • 5+ years’ experience as a Software Engineer (Developer) required
  • 5+ years’ experience serving as a technical resource throughout the full software development lifecycle, from conception, architecture definition, detailed design, scoping, and planning through implementation, testing, documentation, delivery, and maintenance required
  • 5+ years’ experience in production support, troubleshooting complex issues, developing RCAs, and meeting SLAs required
  • 5+ years’ experience with compliance and security principles and developing secure applications that meet legal and organizational requirements (e.g., PCI, SOX, PII)
  • 5+ years’ experience with monitoring and logging tools to report on application and infrastructure health required
  • 5+ years’ experience with DevOps and CI/CD required
  • 5+ years’ experience developing with role-based access control (RBAC) and IAM (Identity and Access Management)
  • 5+ years’ experience developing and implementing third-party API and library integrations
  • 5+ years’ experience with cloud platforms (AWS, Azure, or Google Cloud) and containerization
  • 5+ years’ experience with PL/SQL development: functions, procedures, packages, and triggers
  • 5+ years’ experience with SQL Navigator or Toad
  • 5+ years’ experience developing on Oracle and SQL databases
  • 5+ years’ experience with the concepts of code re-use and business rules standardization
  • 5+ years’ experience in Informatica, including Workflow Manager, Workflow Monitor, Repository Manager, Designer, Server Manager, PowerExchange, and ETL
  • 5+ years’ experience performing project activities, including design and development of various data warehouse, casino, and customer relationship management projects
  • 5+ years’ experience in Informatica Data Management Cloud (IDMC)
  • 5+ years’ experience in analysis, design, construction, documentation, and support of very high-volume/performance data warehouse integrations using Oracle, Informatica, and UNIX
  • 5+ years’ experience handling/analyzing customer-related (CRM) data
  • Ability to create PowerPoint presentations that are informative and engaging and deliver them to various audiences including management
  • Very good communication, team building, conflict management, and organizational skills
  • Proven track record of working collaboratively with cross functional teams to achieve common goals and drive results
  • Proficiency in MS Office
  • Ability to quickly learn new technologies and concepts
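Two of the requirements above — ETL development and "code re-use and business rules standardization" — fit together: shared transformation rules are written once and applied in every pipeline. A small, purely illustrative Python sketch (the rule, field names, and data are hypothetical; the listing's actual stack is Informatica and PL/SQL):

```python
# A shared business rule, written once and re-used by every transform step.
def standardize_country(value):
    """Map free-form country strings to a canonical code (hypothetical rule)."""
    aliases = {"usa": "US", "united states": "US", "u.s.": "US"}
    return aliases.get(value.strip().lower(), value.strip().upper())

def transform(rows):
    """The T in ETL: apply the shared rules to every extracted row."""
    return [
        {"customer_id": r["customer_id"], "country": standardize_country(r["country"])}
        for r in rows
    ]

extracted = [
    {"customer_id": 1, "country": " USA "},
    {"customer_id": 2, "country": "United States"},
    {"customer_id": 3, "country": "de"},
]
print(transform(extracted))
```

Centralizing rules this way is what makes separately built pipelines produce consistent, comparable warehouse data.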

Nice to have:

Understanding of casino player related data is a plus

What we offer:
  • Health Benefits: Cost-effective medical, dental and vision plans
  • Employee Assistance Program and other mental health resources
  • Additional programs include company paid term life insurance and disability coverage
  • Financial Benefits: 401(k) plan that includes a company match
  • Employee Stock Purchase plan
  • Paid Time Off
  • Holidays – All full-time and part-time with benefits employees receive days off for 8 company-wide holidays, plus 2 additional floating holidays to be taken at the employee’s discretion
  • Vacation Time – All full-time employees at the manager level and below start with 14 days/year; director level and above start with 19 days/year
  • Part-time with benefits employees receive time off based on the number of hours they work, with a minimum of 84 hours/year
  • All employees gain additional vacation time with further tenure
  • Sick Time – All full-time employees receive 80 hours of sick time each year
  • Part-time with benefits employees receive time off based on the number of hours they work, with a minimum of 60 hours each year
  • Other Benefits: Complimentary stand-by cruises, employee discounts on confirmed cruises, plus special rates for family and friends
  • Personal and professional learning and development resources including tuition reimbursement

Additional Information:

Job Posted:
January 30, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for Senior Engineer, Data Warehouse

Senior Data Engineer

Join a leading global live-entertainment discovery tech platform. As a Senior Da...
Location: Spain, Madrid
Salary: Not provided
Fever
Expiration Date: Until further notice

Requirements:
  • You have a strong background in at least two of: data engineering, business intelligence, software engineering
  • You are an expert in Python3 and its data ecosystem
  • You have proven experience working with SQL languages
  • You have worked with complex data pipelines
  • You are a collaborative team player with strong communication skills
  • You are proactive, driven, and bring positive energy
  • You possess strong analytical and problem-solving abilities backed by solid software engineering skills
  • You are proficient in business English.
Job Responsibility:
  • Own critical data pipelines of our data warehouse
  • Ideate and implement tools and processes to exploit data
  • Work closely with other business units to create structured and scalable solutions
  • Contribute to the development of a complex data and software ecosystem
  • Build trusted data assets
  • Build automatizations to create business opportunities
  • Design, build and support modern data infrastructure.
What we offer:
  • Attractive compensation package with potential bonus
  • Stock options
  • 40% discount on all Fever events and experiences
  • Home office friendly
  • Responsibility from day one
  • Great work environment with a young international team
  • Health insurance
  • Flexible remuneration with 100% tax exemption through Cobee
  • English lessons
  • Gympass membership

Senior Microsoft Stack Data Engineer

Hands-On Technical SENIOR Microsoft Stack Data Engineer / On Prem to Cloud Senio...
Location: United States, West Des Moines
Salary: 155000.00 USD / Year
Robert Half
Expiration Date: Until further notice

Requirements:
  • 5+ years of data warehouse / data lake experience
  • Advanced SQL Server
  • Strong SQL experience, working with structured and unstructured data
  • Strong in SSIS ETL
  • Proficiency in SQL and SQL queries
  • Experience with SQL Server
  • Knowledge of data warehousing
  • Data warehouse experience: star schema and fact & dimension data warehouse structure
  • Experience with Azure Data Lake and data lakes
  • Proficiency in ETL / SSIS and SSAS
Job Responsibility:
  • Modernize and build out a data warehouse, and lead the build-out of a data lake in the cloud
  • Rebuild an on-prem data warehouse, working with disparate data to structure it for consumable reporting
  • All aspects of data engineering
  • Technical leader of the team
What we offer:
  • Bonus
  • 2 1/2 day weekends
  • Medical, vision, dental, and life and disability insurance
  • 401(k) plan

Senior Data Engineer

We are looking for a Senior Data Engineer (SDE 3) to build scalable, high-perfor...
Location: India, Mumbai
Salary: Not provided
Cogoport
Expiration Date: Until further notice

Requirements:
  • 6+ years of experience in data engineering, working with large-scale distributed systems
  • Strong proficiency in Python, Java, or Scala for data processing
  • Expertise in SQL and NoSQL databases (PostgreSQL, Cassandra, Snowflake, Apache Hive, Redshift)
  • Experience with big data processing frameworks (Apache Spark, Flink, Hadoop)
  • Hands-on experience with real-time data streaming (Kafka, Kinesis, Pulsar) for logistics use cases
  • Deep knowledge of AWS/GCP/Azure cloud data services like S3, Glue, EMR, Databricks, or equivalent
  • Familiarity with Airflow, Prefect, or Dagster for workflow orchestration
  • Strong understanding of logistics and supply chain data structures, including freight pricing models, carrier APIs, and shipment tracking systems
Job Responsibility:
  • Design and develop real-time and batch ETL/ELT pipelines for structured and unstructured logistics data (freight rates, shipping schedules, tracking events, etc.)
  • Optimize data ingestion, transformation, and storage for high availability and cost efficiency
  • Ensure seamless integration of data from global trade platforms, carrier APIs, and operational databases
  • Architect scalable, cloud-native data platforms using AWS (S3, Glue, EMR, Redshift), GCP (BigQuery, Dataflow), or Azure
  • Build and manage data lakes, warehouses, and real-time processing frameworks to support analytics, machine learning, and reporting needs
  • Optimize distributed databases (Snowflake, Redshift, BigQuery, Apache Hive) for logistics analytics
  • Develop streaming data solutions using Apache Kafka, Pulsar, or Kinesis to power real-time shipment tracking, anomaly detection, and dynamic pricing
  • Enable AI-driven freight rate predictions, demand forecasting, and shipment delay analytics
  • Improve customer experience by providing real-time visibility into supply chain disruptions and delivery timelines
  • Ensure high availability, fault tolerance, and data security compliance (GDPR, CCPA) across the platform
What we offer:
  • Work with some of the brightest minds in the industry
  • Entrepreneurial culture fostering innovation, impact, and career growth
  • Opportunity to work on real-world logistics challenges
  • Collaborate with cross-functional teams across data science, engineering, and product
  • Be part of a fast-growing company scaling next-gen logistics platforms using advanced data engineering and AI

Senior Data Engineer

Senior Data Engineer role driving Circle K's cloud-first strategy to unlock the ...
Location: India, Gurugram
Salary: Not provided
Circle K
Expiration Date: Until further notice

Requirements:
  • Bachelor's Degree in Computer Engineering, Computer Science or related discipline
  • Master's Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing and monitoring
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources
  • Develop efficient ETL/ELT processes using Azure cloud services and Snowflake
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines
  • Provide clear documentation for delivered solutions and processes
  • Identify and implement internal process improvements for data management
  • Stay current with and adopt new tools and applications
  • Build cross-platform data strategy to aggregate multiple sources
  • Proactive in stakeholder communication, mentor/guide junior resources

Senior Data Engineer

We're seeking an experienced Senior Data Engineer to help shape the future of he...
Location: Germany, Berlin
Salary: Not provided
Audibene GmbH
Expiration Date: Until further notice

Requirements:
  • 5+ years of hands-on experience with complex ETL processes, data modeling, and large-scale data systems
  • Production experience with modern cloud data warehouses (Snowflake, BigQuery, Redshift) on AWS, GCP, or Azure
  • Proficiency in building and optimizing data transformations and pipelines in Python
  • Experience with columnar storage, MPP databases, and distributed data processing architectures
  • Ability to translate complex technical concepts for diverse audiences, from engineers to business stakeholders
  • Experience with semantic layers, data catalogs, or metadata management systems
  • Familiarity with modern analytical databases like Snowflake, BigQuery, ClickHouse, DuckDB, or similar systems
  • Experience with streaming technologies like Kafka, Pulsar, Redpanda, or Kinesis
Job Responsibility:
  • Design and build robust, high performance data pipelines using our modern stack (Airflow, Snowflake, Pulsar, Kubernetes) that feed directly into our semantic layer and data catalog
  • Create data products optimized for consumption by AI agents and LLMs where data quality, context, and semantic richness are crucial
  • Structure and transform data to be inherently machine readable, with rich metadata and clear lineage that powers intelligent applications
  • Take responsibility from raw data ingestion through to semantic modeling, ensuring data is not just accurate but contextually rich and agent ready
  • Champion best practices in building LLM consumable data products, optimize for both human and machine consumers, and help evolve our dbt transformation layer
  • Build data products for AI/LLM consumption, not just analytics dashboards
What we offer:
  • Work 4 days a week from our office (Berlin/Mainz) with a passionate team, and 1 day a week from home
  • Regularly join on- and offline team events, company off-sites, and the annual audibene Wandertag
  • Cost of the Deutschland-Ticket covered
  • Access to over 50,000 gyms and wellness facilities through Urban Sports Club
  • Support for personal development with a wide range of programs, trainings, and coaching opportunities
  • Dog-friendly office

Senior Data Engineer

As a Senior Data Engineer at Corporate Tools, you will work closely with our Sof...
Location: United States
Salary: 150000.00 USD / Year
Corporate Tools
Expiration Date: Until further notice

Requirements:
  • Bachelor’s (BA or BS) in computer science, or related field
  • 2+ years in a full stack development role
  • 4+ years of experience working in a data engineer role, or related position
  • 2+ years of experience standing up and maintaining a Redshift warehouse
  • 4+ years of experience with Postgres, specifically with RDS
  • 4+ years of AWS experience, specifically S3, Glue, IAM, EC2, DDB, and other related data solutions
  • Experience working with Redshift, DBT, Snowflake, Apache Airflow, Azure Data Warehouse, or other industry standard big data or ETL related technologies
  • Experience working with both analytical and transactional databases
  • Advanced working SQL (Preferably PostgreSQL) knowledge and experience working with relational databases
  • Experience with Grafana or other monitoring/charting systems
Job Responsibility:
  • Focus on data infrastructure. Lead and build out data services/platforms from scratch (using OpenSource tech)
  • Creating and maintaining transparent, bulletproof ETL (extract, transform, and load) pipelines that clean, transform, and aggregate unorganized and messy data into databases or data sources
  • Consume data from roughly 40 different sources
  • Collaborate closely with our Data Analysts to get them the data they need
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc
  • Improve existing data models while implementing new business capabilities and integration points
  • Creating proactive monitoring so we learn about data breakages or inconsistencies right away
  • Maintaining internal documentation of how the data is housed and transformed
  • Improve existing data models, and design new ones to meet the needs of data consumers across Corporate Tools
  • Stay current with latest cloud technologies, patterns, and methodologies
What we offer:
  • 100% employer-paid medical, dental and vision for employees
  • Annual review with raise option
  • 22 days Paid Time Off accrued annually, and 4 holidays
  • After 3 years, PTO increases to 29 days. Employees transition to flexible time off after 5 years with the company—not accrued, not capped, take time off when you want
  • The 4 holidays are: New Year’s Day, Fourth of July, Thanksgiving, and Christmas Day
  • Paid Parental Leave
  • Up to 6% company matching 401(k) with no vesting period
  • Quarterly allowance
  • Use to make your remote work set up more comfortable, for continuing education classes, a plant for your desk, coffee for your coworker, a massage for yourself... really, whatever
  • Open concept office with friendly coworkers

Senior Data Warehouse Administrator

We are looking for a Senior Data Warehouse Administrator to bolster our expandin...
Location: India, Pune
Salary: Not provided
FloQast
Expiration Date: Until further notice

Requirements:
  • 5–8 years of experience as a Data Warehouse Administrator or Data Platform Engineer
  • Expert-level knowledge of data warehouse administration
  • Proven experience with CDC tools and Fivetran, CData, or similar ELT tools (e.g., Stitch, Airbyte)
  • Strong understanding of SQL tuning, partitioning, and query optimization
  • Deep knowledge of data warehousing concepts and modern data platforms
  • Experience with CI/CD, infrastructure-as-code (e.g., Terraform), and monitoring tools
  • Familiarity with data modeling (star/snowflake schemas) and governance practices
  • Strong scripting skills in Python or Bash for automation
Job Responsibility:
  • Design, implement, and maintain scalable and secure data warehouse environments
  • Optimize data warehouse performance, including fine-tuning complex SQL queries, managing indexing, and monitoring workloads to ensure peak efficiency
  • Lead all administration tasks, encompassing user access control, Role-Based Access Control (RBAC), schema design, partitioning strategies, and ongoing cost optimization
  • Manage and monitor data ingestion pipelines, ensuring reliable ETL/ELT processes and demonstrating awareness of Change Data Capture (CDC) tools for efficient data flow
  • Collaborate closely with data engineers and data analysts to design and implement efficient data models and robust data transformations
  • Contribute significantly to our modern data lake architecture, specifically leveraging Apache Iceberg for data organization and schema evolution
  • Implement and enforce data governance and compliance policies across the data warehouse and data lake environments
  • Tooling and Automation: Building and maintaining tools to automate common administrative tasks, such as table compaction, data expiration policies, and health checks
  • Lead troubleshooting and Root Cause Analysis (RCA) efforts for critical data issues, ensuring rapid resolution and preventing recurrence
  • Mentor junior data warehouse administrators and actively share best practices across the broader data and engineering teams

Senior Data Engineer

The Data Engineer is responsible for designing, building, and maintaining robust...
Location: Germany, Berlin
Salary: Not provided
ib vogt GmbH
Expiration Date: Until further notice

Requirements:
  • Degree in Computer Science, Data Engineering, or related field
  • 5+ years of experience in data engineering or similar roles
  • Experience in renewable energy, engineering, or asset-heavy industries is a plus
  • Strong experience with modern data stack (e.g., PowerPlatform, Azure Data Factory, Databricks, Airflow, dbt, Synapse, Snowflake, BigQuery, etc.)
  • Proficiency in Python and SQL for data transformation and automation
  • Experience with APIs, message queues (Kafka, Event Hub), data streaming and knowledge of data lakehouse and data warehouse architectures
  • Familiarity with CI/CD pipelines, DevOps practices, and containerization (Docker, Kubernetes)
  • Understanding of cloud environments (preferably Microsoft Azure, PowerPlatform)
  • Strong analytical mindset and problem-solving attitude paired with a structured, detail-oriented, and documentation-driven work style
  • Team-oriented approach and excellent communication skills in English
Job Responsibility:
  • Design, implement, and maintain efficient ETL/ELT data pipelines connecting internal systems (M365, Sharepoint, ERP, CRM, SCADA, O&M, etc.) and external data sources
  • Integrate structured and unstructured data from multiple sources into the central data lake / warehouse / Dataverse
  • Build data models and transformation workflows to support analytics, reporting, and AI/ML use cases
  • Implement data quality checks, validation rules, and metadata management according to the company’s data governance framework
  • Automate workflows, optimize performance, and ensure scalability of data pipelines and processing infrastructure
  • Work closely with Data Scientists, Software Engineers, and Domain Experts to deliver reliable datasets for Digital Twin and AI applications
  • Maintain clear documentation of data flows, schemas, and operational processes
What we offer:
  • Competitive remuneration and motivating benefits
  • Opportunity to shape the data foundation of ib vogt’s digital transformation journey
  • Work on cutting-edge data platforms supporting real-world renewable energy assets
  • A truly international working environment with colleagues from all over the world
  • An open-minded, collaborative, dynamic, and highly motivated team