Lakehouse Host

Four Seasons

Location:
Orlando, United States of America

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are looking for an individual who can provide dining support at our relaxing, outdoor poolside restaurant, The Lakehouse. You will act in a public relations capacity when promoting Four Seasons and will be responsible for guests' first and last impressions when they visit our restaurant outlets.

Job Responsibility:

  • Maintain the reception area and entrance in all aspects of stocking and cleanliness
  • Assist the department managers with training of new employees as requested
  • Retain all assigned wardrobe in accordance with established standards and uniform guidelines
  • Work in an efficient and professional manner while maintaining a positive attitude and delivering superior guest service at all times
  • Perform all assigned duties as detailed in your opening, afternoon and closing checklists
  • Distribute the server’s slips (guest notes) and ensure any updates are given to servers and management promptly
  • Check the blocking sheet to identify all regulars and VIP guests
  • Use OpenTable when taking reservations and when checking on current or future reservations
  • Ensure accuracy when entering data into OpenTable
  • Answer calls promptly to maximize the restaurant’s capacity
  • Accommodate regulars and concierges' reservation requests whenever possible
  • Utilize all tables to achieve higher turnaround of guests
  • Work closely with concierges and other agents to increase guest satisfaction
  • Assume all other responsibilities relating to operational items at the reception desk
  • Complete all administrative duties competently, efficiently, thoroughly, and in a timely manner
  • Update all operational telephone numbers as needed
  • Move furniture as directed by management, including but not limited to tables, chairs, bar stools, banquettes, bar tables, boxes, equipment, etc.
  • Attend training sessions and read training notes to improve your knowledge
  • Give constant feedback to managers regarding our guests’ satisfaction
  • Be fully conversant with every aspect of Four Seasons emergency procedures
  • Be fully conversant with all the Four Seasons policies as detailed within the employee handbook
  • Be fully conversant with the geographical layout of the operation and to know the exact whereabouts of all operational equipment
  • Clear and reset tables as needed
  • Complete any other task as assigned by management

Requirements:

  • Prior experience working in Food & Beverage
  • Must be able to thrive in a fast-paced environment with a large team
  • Ability to perform basic mathematical equations
  • Ability to demonstrate a positive attitude at all times
  • Ability to keep an open and objective view
  • Ability to listen effectively, empathetically and be respectful at all times
  • Ability to communicate assertively
  • Ability to maintain confidentiality
  • Ability to maintain composure and stay focused
  • Ability to maintain personal integrity
  • Ability to work as a team, stay organized, multi-task, and prioritize
  • Ability to use strong judgment
  • Must portray an outgoing personality that is warm and friendly while demonstrating enthusiasm and professionalism
  • A sincere willingness to provide service to residents, guests, and peers
  • Good organizational skills, with the ability to work independently
  • Ability to function well under pressure, set priorities and adjust to changing conditions
  • High work ethic, with a sense of responsibility for the role filled within our team
  • A successful candidate will have a flexible schedule and the ability to work weekends and holidays
  • Must be fluent in English and possess legal work authorization in the United States

What we offer:

  • Energizing Employee Culture where you are encouraged to be your true self
  • Comprehensive learning and development programs to help you master your craft
  • Inclusive and diverse employee engagement events all year-round
  • Exclusive discount and travel programs with Four Seasons
  • Competitive wages and benefits
  • 401(k) and Retirement Plan Matching
  • Employee Assistance Program

Additional Information:

Job Posted:
March 23, 2026

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Lakehouse Host

Data Engineer

Goldman Sachs is seeking a Data Engineer to join their datastore-migration Facto...
Location:
Bangalore, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master's degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
  • Minimum of 3-5 years of professional hands-on-keyboard coding experience in a collaborative, team-based environment
  • Ability to troubleshoot SQL, with basic scripting experience
  • Professional proficiency in Python or Java
  • Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices & K8s deployment experience
  • Sophisticated understanding of Temporal Data Modeling (SCD Type 2)
  • Expertise in Schema Evolution (Apache Iceberg) and enforcement strategies
  • Advanced knowledge of data partitioning and clustering
  • Balancing Normalization vs. Denormalization and strategic use of Natural vs. Surrogate Keys
  • Candidates must demonstrate strong stakeholder engagement
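The "Temporal Data Modeling (SCD Type 2)" requirement above refers to keeping history by versioning rows instead of overwriting them. A minimal sketch of the idea in plain Python, with an illustrative row layout (the field names are hypothetical, not from the posting):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # sentinel meaning "row is currently valid"

def apply_scd2_update(rows, key, new_value, as_of):
    """SCD Type 2: expire the open version of `key`, then append a new one."""
    for row in rows:
        if row["key"] == key and row["valid_to"] == HIGH_DATE:
            row["valid_to"] = as_of  # close the old version instead of overwriting
    rows.append({"key": key, "value": new_value,
                 "valid_from": as_of, "valid_to": HIGH_DATE})
    return rows

dim = [{"key": "cust-1", "value": "Orlando",
        "valid_from": date(2024, 1, 1), "valid_to": HIGH_DATE}]
apply_scd2_update(dim, "cust-1", "Dallas", date(2025, 6, 1))
# dim now holds both versions: the expired row and the current one
```

In a real warehouse this is usually expressed as a MERGE statement, but the versioning logic is the same.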
Job Responsibility:
  • Perform end-to-end datastore migration from on-prem DataLake to AWS hosted LakeHouse
  • Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment
  • Executing the physical migration of underlying datasets while ensuring data integrity
  • Acting as a technical liaison to internal clients, facilitating handoff and sign-off conversations with data owners to ensure migrated assets meet business requirements
  • Translating and optimizing legacy SQL and Spark-based consumption patterns (raw and modeled) for compatibility with Snowflake and Iceberg
  • Understanding usage patterns to deliver the required data products
  • Working with reconciliation frameworks to build confidence that migrated data is functionally equivalent to that already used within production flows
Employment Type: Full-time
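The reconciliation-framework responsibility above comes down to proving that migrated data matches the source. A minimal, order-insensitive fingerprint check, sketched with hypothetical in-memory rows rather than real tables:

```python
import hashlib

def table_fingerprint(rows):
    """Row count plus an order-insensitive content hash of a table's rows."""
    digests = sorted(
        hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()
        for row in rows
    )
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return len(rows), combined

source = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 7.5}]
migrated = [{"amt": 7.5, "id": 2}, {"id": 1, "amt": 10.0}]  # same data, reordered

# Equal fingerprints build confidence the migrated copy is functionally equivalent
assert table_fingerprint(source) == table_fingerprint(migrated)
```

Production reconciliation frameworks add per-column aggregates and sampling, but count-plus-checksum is the core comparison.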
Data Engineer

The Data Engineer position in Dallas, Texas, requires a Bachelor’s or Master’s d...
Location:
Dallas, United States
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Education: Bachelor's or Master's degree in Computer Science, Applied Mathematics, Engineering, or related field
  • Experience: Minimum 3–5 years of hands-on coding experience
  • Ability to troubleshoot SQL and basic scripting
  • 3+ years of professional proficiency in Python or Java
  • 3+ years of experience with SDLC, CI/CD best practices, and Kubernetes (K8s) deployment
Job Responsibility:
  • Engineer will be part of the datastore-migration Factory team responsible for end-to-end datastore migration from on-prem DataLake to AWS hosted LakeHouse
  • Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment
  • Executing the physical migration of underlying datasets while ensuring data integrity
  • Acting as a technical liaison to internal clients, facilitating handoff and sign-off conversations with data owners
  • Translating and optimizing legacy SQL and Spark-based consumption patterns for compatibility with Snowflake and Iceberg
  • Understanding usage patterns to deliver the required data products
  • Working on data reconciliation frameworks to ensure migrated data is functionally equivalent to production data
  • Working with internal data management platforms and learning new workflows and language constructs as necessary
Employment Type: Full-time
Data Engineer

Goldman Sachs is seeking a Data Engineer to join their datastore-migration Facto...
Location:
Bangalore, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
  • Minimum of 3-5 years of professional hands-on-keyboard coding experience in a collaborative, team-based environment
  • Ability to troubleshoot SQL, with basic scripting experience
  • Professional proficiency in Python or Java
  • Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices & K8s deployment experience
  • Demonstrated understanding of Temporal Data Modeling (e.g., SCD Type 2)
  • Expertise in Schema Evolution (e.g., Apache Iceberg) and enforcement strategies
  • Advanced knowledge of data partitioning and clustering
  • Balancing Normalization vs. Denormalization and the strategic use of Natural vs. Surrogate Keys
  • Good conduct and ethical decision-making
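The "Natural vs. Surrogate Keys" point above distinguishes business identifiers from warehouse-assigned keys. A minimal sketch of surrogate-key assignment, with made-up names, showing why repeated loads of the same natural key must map to the same surrogate:

```python
def make_key_generator():
    """Assign a stable integer surrogate key per natural (business) key."""
    mapping, counter = {}, 0

    def surrogate_key(natural_key):
        nonlocal counter
        if natural_key not in mapping:  # first sighting: mint a new key
            counter += 1
            mapping[natural_key] = counter
        return mapping[natural_key]    # repeat sighting: reuse the same key

    return surrogate_key

sk = make_key_generator()
first = sk("customer:ACME")    # new natural key gets the next surrogate
again = sk("customer:ACME")    # same natural key, same surrogate
other = sk("customer:GLOBEX")  # different natural key, different surrogate
```

Joins on compact surrogate keys stay valid even if the business identifier itself is later renamed or reused.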
Job Responsibility:
  • Engineer will be part of the datastore-migration Factory team responsible for the end-to-end datastore migration from on-prem DataLake to AWS hosted LakeHouse
  • Pipeline Migration - Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment
  • Data Transfer - Executing the physical migration of underlying datasets while ensuring data integrity
  • Stakeholder Engagement - Acting as a technical liaison to internal clients, facilitating handoff and sign-off conversations with data owners to ensure migrated assets meet business requirements
  • Consumption Pattern Migration - Code Conversion: Translating and optimizing legacy SQL and Spark-based consumption patterns (raw and modeled) for compatibility with Snowflake and Iceberg
  • Usage analysis: Understand usage patterns to deliver the required data products
  • Data Reconciliation & Quality - A rigorous approach to data validation is required
  • Candidates must work with reconciliation frameworks to build confidence that migrated data is functionally equivalent to that already used within production flows
  • Engineer will also need to work with our other internal data management platform, and must have an aptitude for learning new workflows and language constructs as necessary
Employment Type: Full-time

Data Engineer - Python AND Kafka AND (Hadoop OR HDFS OR Hive) AND Snowflake AND Apache AND (Iceberg)

The Data Engineer will play a crucial role in migrating data from on-prem DataLa...
Location:
Bangalore, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master's degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
  • Minimum of 3-5 years of professional hands-on-keyboard coding experience in a collaborative, team-based environment
  • Ability to troubleshoot SQL and basic scripting experience
  • Professional proficiency in Python or Java
  • Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices
  • K8s deployment experience
  • Sophisticated understanding of Temporal Data Modeling, Schema Management, Performance Optimization, and Architectural Theory
  • Experience with Kafka, ANSI SQL, FTP, Apache Spark
  • Experience with JSON, Avro, Parquet
  • Experience with Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ
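Data partitioning, central to query performance on the platforms listed above (Hive, Snowflake, Iceberg), lets a query skip data outside its filter range. A minimal sketch of date partitioning and pruning, using made-up in-memory rows:

```python
from collections import defaultdict
from datetime import date

def partition_by_day(rows):
    """Group rows by their partition key (here, the event date)."""
    parts = defaultdict(list)
    for row in rows:
        parts[row["event_date"]].append(row)
    return parts

rows = [
    {"event_date": date(2025, 1, 1), "v": 1},
    {"event_date": date(2025, 1, 1), "v": 2},
    {"event_date": date(2025, 1, 2), "v": 3},
]
parts = partition_by_day(rows)

# A filter on event_date = 2025-01-01 now reads one partition, not the table.
scanned = parts[date(2025, 1, 1)]
```

Real engines apply the same idea at the file level: partition metadata lets the planner prune whole files before any data is read.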
Job Responsibility:
  • Perform end-to-end datastore migration from on-prem DataLake to AWS hosted LakeHouse
  • Pipeline Migration - Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment
  • Data Transfer - Executing the physical migration of underlying datasets while ensuring data integrity
  • Stakeholder Engagement - Acting as a technical liaison to internal clients, facilitating handoff and sign-off conversations with data owners to ensure migrated assets meet business requirements
  • Consumption Pattern Migration - Translating and optimizing legacy SQL and Spark-based consumption patterns for compatibility with Snowflake and Iceberg
  • Usage analysis to understand usage patterns and deliver required data products
  • Data Reconciliation and Quality - Work with reconciliation frameworks to build confidence that migrated data is functionally equivalent to that already used within production flows
Employment Type: Full-time

Data Engineer

The Data Engineer will play a crucial role in migrating data from on-prem DataLa...
Location:
Bangalore, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
  • Minimum of 3-5 years of professional hands-on-keyboard coding experience in a collaborative, team-based environment
  • Ability to troubleshoot SQL, with basic scripting experience
  • Professional proficiency in Python or Java
  • Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices & K8s deployment experience
  • Understanding of Temporal Data Modeling, Schema Management, Performance Optimization, and Architectural Theory
  • Experience with Kafka, ANSI SQL, FTP, and Apache Spark
Job Responsibility:
  • Engineer will be part of the datastore-migration Factory team responsible for the end-to-end datastore migration from on-prem DataLake to AWS hosted LakeHouse
  • Pipeline Migration: Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment
  • Data Transfer: Executing the physical migration of underlying datasets while ensuring data integrity
  • Stakeholder Engagement: Acting as a technical liaison to internal clients
  • Code Conversion: Translating and optimizing legacy SQL and Spark-based consumption patterns for compatibility with Snowflake and Iceberg
  • Usage analysis: Understand usage patterns to deliver the required data products
  • Data Reconciliation & Quality: Work with reconciliation frameworks to build confidence that migrated data is functionally equivalent

Data Engineer

The Data Engineer will play a crucial role in migrating data from on-prem DataLa...
Location:
Bangalore, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
  • Minimum of 3-5 years of professional "hands-on-keyboard" coding experience in a collaborative, team-based environment
  • Ability to troubleshoot SQL, with basic scripting experience
  • Professional proficiency in Python or Java
  • Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices & K8s deployment experience
  • Sophisticated understanding of Temporal Data Modeling, Schema Management, Performance Optimization, and Architectural Theory
  • Experience with technologies: Kafka, ANSI SQL, FTP, Apache Spark, JSON, Avro, Parquet, Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ
Job Responsibility:
  • Engineer will be part of the datastore-migration Factory team responsible for the end-to-end datastore migration from on-prem DataLake to AWS hosted LakeHouse
  • Pipeline Migration: Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment
  • Data Transfer: Executing the physical migration of underlying datasets while ensuring data integrity
  • Stakeholder Engagement: Acting as a technical liaison to internal clients, facilitating "handoff and sign-off" conversations with data owners to ensure migrated assets meet business requirements
  • Consumption Pattern Migration: Translating and optimizing legacy SQL and Spark-based consumption patterns (raw and modeled) for compatibility with Snowflake and Iceberg
  • Usage analysis: Understand usage patterns to deliver the required data products
  • Data Reconciliation & Quality: Work with reconciliation frameworks to build confidence that migrated data is functionally equivalent to that already used within production flows
  • Work with our other internal data management platform, with an aptitude for learning new workflows and language constructs as necessary

Data Engineer

The Data Engineer role involves migrating data from on-prem DataLake to AWS Lake...
Location:
Bangalore, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
  • Minimum of 3-5 years of professional 'hands-on-keyboard' coding experience in a collaborative, team-based environment
  • Ability to troubleshoot SQL, with basic scripting experience
  • Professional proficiency in Python or Java
  • Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices & K8s deployment experience
  • Candidates must demonstrate a sophisticated understanding of the following modeling concepts: Temporal Data Modeling, Schema Management, Performance Optimization, Architectural Theory
  • Technical stack requirements (extraction & logic): Kafka, ANSI SQL, FTP, Apache Spark
  • Data Formats: JSON, Avro, Parquet
  • Platforms: Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ
Job Responsibility:
  • Engineer will be part of the datastore-migration Factory team responsible for the end-to-end datastore migration from on-prem DataLake to AWS hosted LakeHouse
  • Pipeline Migration (Logic & Scheduling): Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment
  • Data Transfer: Executing the physical migration of underlying datasets while ensuring data integrity
  • Stakeholder Engagement: Acting as a technical liaison to internal clients, facilitating 'handoff and sign-off' conversations with data owners to ensure migrated assets meet business requirements
  • Consumption Pattern Migration (Code Conversion): Translating and optimizing legacy SQL and Spark-based consumption patterns (raw and modeled) for compatibility with Snowflake and Iceberg
  • Usage analysis: Understand usage patterns to deliver the required data products
  • Data Reconciliation & Quality: A rigorous approach to data validation is required
  • Candidates must work with reconciliation frameworks to build confidence that migrated data is functionally equivalent to that already used within production flows
  • Engineer will also need to work with our other internal data management platform, and must have an aptitude for learning new workflows and language constructs as necessary
Employment Type: Full-time

Data Engineer

The Data Engineer will be responsible for end-to-end datastore migration from on...
Location:
Bangalore, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master's degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
  • Minimum of 3-5 years of professional hands-on-keyboard coding experience in a collaborative, team-based environment
  • Ability to troubleshoot SQL, with basic scripting experience
  • Professional proficiency in Python or Java
  • Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices and K8s deployment experience
  • Demonstrated understanding of Temporal Data Modeling, Schema Management, Performance Optimization, Architectural Theory
  • Experience with Kafka, ANSI SQL, FTP, Apache Spark
  • Experience with JSON, Avro, Parquet
  • Experience with Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ
Job Responsibility:
  • Perform end-to-end datastore migration from on-prem DataLake to AWS hosted LakeHouse
  • Pipeline Migration: refactoring and migrating extraction logic and job scheduling
  • Data Transfer: executing the physical migration of underlying datasets while ensuring data integrity
  • Stakeholder Engagement: acting as a technical liaison to internal clients
  • Code Conversion: translating and optimizing legacy SQL and Spark-based consumption patterns for compatibility with Snowflake and Iceberg
  • Usage analysis: understanding usage patterns to deliver required data products
  • Data Reconciliation and Quality: a rigorous approach to data validation
  • Work with other internal data management platforms