
Data Scientist, Business Analytics and Industrial Engineering


Software Resources

Location:
United States, Orlando

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Software Resources has an immediate contract job opportunity for a Data Scientist, Business Analytics and Industrial Engineering with a major corporation in Orlando, FL. The role formulates and applies advanced analytical approaches, such as statistical modeling, forecasting, machine learning, mathematical optimization, and decision analysis, to produce analytical models, tools, and processes that improve the effectiveness and relevance of efforts to drive incremental business value for the company.

Job Responsibility:

  • Formulate and apply advanced analytical approaches, such as causal and time-series forecasting, econometric methods, machine learning, and mathematical optimization, to complex analyses, predictions, and problem-solving
  • Perform data and error analysis to improve models
  • Assist in collecting, cleaning, and organizing data
  • Analyze large structured and unstructured datasets to identify trends, patterns, and insights
  • Identify causes of variability, test new features, and integrate into code
  • Use business intelligence tools to create and maintain clear and compelling reporting and visualizations to communicate complex findings to technical and non-technical stakeholders
  • Explore new, state-of-the-art analytical tools, methodologies, and technology that can be leveraged by our team
  • Lead and manage interns, co-ops, Analysts, Engineers and/or Associate Data Scientists
  • Develop analytical and predictive models and solutions for all operating divisions within Universal Destinations & Experiences (UDX) to support key decisions

Requirements:

  • Must have proficiency with data mining, mathematics, machine learning, and statistical analysis
  • Must have experience and proficiency using machine learning platforms (e.g., AzureML), process automation platforms (e.g., MS Power Automate), application development platforms (e.g., Power Apps), data querying and scripting/programming tools (e.g., SQL, Python), business intelligence/data visualization tools (e.g., Power BI, Tableau), and statistical modeling software (e.g., R)
  • Proven strong PC skills including spreadsheets, databases, and presentation graphics software
  • Demonstrated leadership ability and/or previous managerial experience
  • Strong coaching/mentoring skills and Leadership experience in continuous improvement
  • Strong critical thinking and creative problem solving skills
  • Demonstrated superior conceptual thinking skills and the ability to work through complex problems
  • Capable of communicating and expressing ideas clearly and concisely, in both written and oral formats
  • Strong communication skills with the ability to convey complex findings to both technical and non-technical stakeholders
  • Ability to work collaboratively in a team environment, work in a fast-paced environment and under changing conditions
  • Demonstrated understanding of the Universal Destinations & Experiences (UDX) heritage and a commitment to change and excellence
  • Willingness to support a 24/7/365 Universal Destinations & Experiences (UDX) operation, which includes periods of high demand (weekends, holidays, etc.)
  • Senior-level proficiency in operational forecasting
  • Production-grade Python fluency
  • High-judgment, AI-enabled problem solving

Nice to have:

  • Doctoral degree in industrial engineering, operations research, mathematics, statistics, computer science, decision science, management science, data science, data analytics, or a closely related field of study
  • 5+ years of experience formulating and applying advanced analytical approaches in theme parks, resorts, retail/dining/entertainment venues, the hospitality industry, etc.

What we offer:
  • Medical, dental, and vision coverage
  • 401(k) with company match
  • Short-term disability
  • Life insurance with AD&D

Additional Information:

Job Posted:
May 05, 2026

Employment Type:
Full-time
Work Type:
On-site work


Similar Jobs for Data Scientist, Business Analytics and Industrial Engineering

Senior Data Engineering Architect

Location:
Poland

Salary:
Not provided

Lingaro

Expiration Date:
Until further notice

Requirements:
  • Proven work experience as a Data Engineering Architect or a similar role and strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Proficiency in Python, PySpark, SQL
  • Familiarity with cloud platforms and services, such as AWS, GCP, or Azure, and experience in designing and implementing data solutions in a cloud environment
  • Knowledge of data governance principles and best practices, including data privacy and security regulations
Job Responsibility:
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables. Support to team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions

Data Engineering Architect

Data engineering involves the development of solutions for the collection, trans...
Location:
India

Salary:
Not provided

Lingaro

Expiration Date:
Until further notice

Requirements:
  • 10+ years’ experience in the Data & Analytics area
  • 4+ years’ experience in Data Engineering Architecture
  • Proficiency in Python, PySpark, SQL
  • Strong expertise in Azure cloud services such as ADF, Databricks, PySpark, and Logic Apps
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
Job Responsibility:
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables
  • Support to team throughout the implementation process, answering questions and addressing issues as they arise
What we offer:
  • Stable employment
  • “Office as an option” model
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support
  • Internal Gallup Certified Strengths Coach to support your growth
  • Grow as we grow as a company

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle

Salary:
84835.61 - 149076.17 USD / Year

Adtalem Global Education

Expiration Date:
Until further notice

Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field.
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field.
  • Two (2) plus years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflows, BQML, Vertex AI.
  • Six (6) plus years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
  • Hands-on experience working with real-time, unstructured, and synthetic data.
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar.
  • Expert knowledge on Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed.
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans and collaborate with business and system owners to fix the quality issues at its root.
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Full-time

Data Engineering Lead

Data Engineering Lead a strategic professional who stays abreast of developments...
Location:
India, Pune

Salary:
Not provided

Citi

Expiration Date:
Until further notice

Requirements:
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java with knowledge of core spark concepts (RDDs, Dataframes, Spark Streaming, etc) and Scala and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibility:
  • Strategic Leadership: Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Team Management: Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Architecture and Design: Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Technology Selection and Implementation: Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Performance Optimization: Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness, ensuring optimal access to global wealth data
  • Collaboration: Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions that support investment strategies and client reporting
  • Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations, particularly around sensitive financial data
  • Full-time

Data Engineering Lead

The Engineering Lead Analyst is a senior level position responsible for leading ...
Location:
Singapore, Singapore

Salary:
Not provided

Citi

Expiration Date:
Until further notice

Requirements:
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java with knowledge of core spark concepts (RDDs, Dataframes, Spark Streaming, etc) and Scala and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibility:
  • Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness
  • Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions
  • Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations
What we offer:
  • Equal opportunity employer commitment
  • Accessibility and accommodation support
  • Global workforce benefits
  • Full-time

AWS Data Engineer

We are seeking a skilled AWS Data Engineer to join our team and help drive data ...
Location:
United States, Charlotte

Salary:
60.00 USD / Hour

Realign

Expiration Date:
Until further notice

Requirements:
  • Bachelor's degree in Computer Science, Information Systems, or related field
  • 3+ years of experience in data engineering, with a focus on AWS cloud services
  • Proficiency in SQL, Python, and AWS data services such as S3, Glue, EMR, and Redshift
  • Experience with ETL processes, data modeling, and data visualization tools
  • Strong analytical and problem-solving skills
  • Excellent communication and teamwork abilities
Job Responsibility:
  • Design and implement scalable and efficient data pipelines using AWS services such as S3, Glue, EMR, and Redshift
  • Develop and maintain data lakes and data warehouses to store and process large volumes of structured and unstructured data
  • Collaborate with data scientists and business analysts to deliver actionable insights and analytics solutions
  • Optimize data infrastructure for performance, reliability, and cost efficiency
  • Troubleshoot and resolve data integration and data quality issues
  • Stay current with industry trends and best practices in cloud data engineering
  • Provide technical guidance and mentorship to junior team members

Data Engineer

We are looking for a skilled and enthusiastic Data Engineer to help design and o...
Location:
United States, East Windsor

Salary:
Not provided

Beaconfire

Expiration Date:
Until further notice

Requirements:
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • A successful history of manipulating, processing and extracting value from large disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Strong project management and organizational skills
  • Experience supporting and working with cross-functional teams in a dynamic environment
Job Responsibility:
  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
  • Work with data and analytics experts to strive for greater functionality in our data systems
  • Full-time

Lead Data Scientist

Prism is seeking a Lead Data Scientist who will make high-impact, hands-on contr...
Location:
United States, NYC or San Diego (La Jolla/UTC)

Salary:
140000.00 - 160000.00 USD / Year

Prism Data

Expiration Date:
Until further notice

Requirements:
  • Bachelor’s degree in Statistics, Data Science, Math, Industrial Engineering, or similar quantitative discipline required
  • 4+ years of experience as a Data Scientist or Statistician, preferably in credit risk or financial services
  • Research & development of new attributes, models, decision strategies, and/or analytic capabilities
  • Customer-facing testing of data or analytic solutions (e.g., credit risk scores, alternative data, fraud prevention, etc.)
  • Collaboration with business teams to understand and address business problems
  • Modeling experience in the latest AI and machine learning approaches
  • Mastery of rudimentary modeling approaches (e.g., regression, decision trees)
  • Knowledge of data labeling/categorization, feature development and selection, model design and training, testing, coding, implementation, monitoring, and governance
  • Financial services industry data expertise, including types of data sources applicable to different types of business decisions based on regulatory factors
  • Experience evaluating alternative statistical approaches, including review of academic papers on new techniques and ideas, and justifying which approach is best suited for which problems and why
Job Responsibility:
  • Maintain and advance existing analytical product suite
  • Enhance core cash flow underwriting offerings as new data elements become available
  • Develop next-generation versions of products to boost stability, predictive accuracy, and relevancy to emerging use cases
  • Develop new products for new market opportunities using novel data assets
  • Demonstrate best practices in transparent modeling approaches to enable easy client adoption
  • Support sales, revenue attainment, and thought leadership
  • Conduct proof of concept analyses of the efficacy of Prism products in the context of clients’ business problems, and present results and insights
  • Support new and existing clients by guiding them on score cutoffs and the use of powerful custom attribute sets for their business
  • Educate clients on technical product knowledge
  • Contribute analysis to thought leadership content delivered by Prism executives in white papers and industry conferences
What we offer:
  • medical
  • dental
  • vision
  • 401(k)
  • equity-based compensation
  • Full-time