
Starburst Data Engineer


Realign

Location:
United States, Charlotte, NC

Contract Type:
Employment contract

Salary:

130000.00 USD / Year

Job Description:

Job Title: Starburst Data Engineer. Location: Charlotte, NC & Plano, TX. FTE only.

Job Responsibility:

  • Demonstrated expertise in Starburst Data Virtualization, including designing and implementing data virtualization solutions, optimizing query performance across distributed data sources, and integrating Starburst with enterprise data architectures
  • Analyze data mapping documents and business requirements to design comprehensive test plans and cases
  • Perform source-to-target data reconciliation, check data loading, and ensure transformation rules are applied correctly
  • Write complex SQL scripts for validation (count, data completeness, data consistency, data truncation)
  • Identify, log, and track data defects using tools like JIRA or HP ALM or Octane
  • Automate test scripts and validate data volume, performance, and scalability
  • Validate HiveQL, HDFS file structures, and data processing within the Hadoop cluster
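
The validation tasks above (count checks, completeness, source-to-target reconciliation) are typically scripted in SQL against the warehouse. A minimal, self-contained sketch of the idea, using Python's stdlib `sqlite3` in place of a Starburst/Hive connection — table names, columns, and data are illustrative assumptions, not anything from this posting:

```python
import sqlite3

# Stand-in for a Starburst/Trino or Hive connection; the reconciliation
# queries themselves are what the role's validation work looks like.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL, currency TEXT);
CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL, currency TEXT);
INSERT INTO src VALUES (1, 10.0, 'USD'), (2, 20.5, 'USD'), (3, 7.25, 'EUR');
INSERT INTO tgt VALUES (1, 10.0, 'USD'), (2, 20.5, 'USD');
""")

# 1. Row-count reconciliation (data completeness).
src_count = cur.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]

# 2. Rows present in source but missing from target (data consistency).
missing = cur.execute("""
    SELECT s.id FROM src s LEFT JOIN tgt t ON s.id = t.id
    WHERE t.id IS NULL
""").fetchall()

print(src_count, tgt_count, [r[0] for r in missing])  # 3 2 [3]
```

In practice the same count and anti-join patterns run against the real source and target schemas, with mismatches logged to a defect tracker such as JIRA.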

Requirements:

  • Minimum 10 years' experience
  • Primary Skill: Starburst
  • Secondary Skills: Data Virtualization, Dremio, Presto, SQL Performance Tuning, Shell Scripting, Autosys
  • Expert-level knowledge of SQL for data analysis
  • Experience with tools such as Informatica and IDMC
  • Understanding of data warehouse concepts and architectures (e.g., star/snowflake schema)
  • Familiarity with Hadoop or Spark is often preferred
  • Strong analytical and troubleshooting skills
  • Excellent communication for collaborating with developers and stakeholders
  • Domain: Banking knowledge, Payments knowledge preferred
  • Concept: Data Virtualization, Data Warehousing, Data Transformation, ETL/ELT, Data Quality

Additional Information:

Job Posted:
March 21, 2026

Employment Type:
Full-time

Similar Jobs for Starburst Data Engineer

Big Data / PySpark Engineering Lead - Vice President

The Applications Development Technology Lead Analyst is a senior level position ...
Location: India, Pune
Salary: Not provided
Citi
Expiration Date: Until further notice
Requirements:
  • Highly experienced and skilled technical lead with 12+ years of experience in software building and platform engineering
  • Experience in Data Engineering, focused on Big Data ecosystems
  • Knowledge in Hadoop, YARN, Hive, Impala, Spark, and Spark SQL with extensive high volume of data processing pipeline development
  • Expert-level programming skills and hands-on experience in Python
  • Familiarity with data formats like Avro, Parquet, CSV, JSON
  • Hands-on experience in writing SQL queries
  • Highly experienced with Unix based operating systems and shell scripting
  • Experience with source code management tools such as Bitbucket, Git etc
  • Big Data Tech Proficiency and hands-on in Hadoop, Spark, Hive, Kafka, and NoSQL databases (MongoDB, HBase)
  • Experience working with query engines like Trino, Presto, Starburst
Job Responsibility:
  • Design and implement scalable, fault-tolerant batch and real-time data processing pipelines
  • Develop robust data models and schema designs optimized for both performance and storage efficiency
  • Evaluate and integrate emerging tools and frameworks (e.g., Spark, Flink, Kafka) into the existing stack
  • Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
  • Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals
  • Legacy Systems Decommissioning: Lead the strategic migration of data and logic from legacy platforms (e.g. on-premises SQL Servers) to a modern Data Lakehouse environment
  • ETL/ELT Transformation: Re-engineer existing stored procedures and complex legacy ETL jobs into scalable, distributed processing frameworks using Spark (Python) and Starburst/Trino
  • Validation & Parity Testing: Design and implement automated frameworks for Data Parity Testing to ensure 100% accuracy and consistency between legacy outputs and new big data results
  • Schema Evolution: Map and transform rigid, legacy relational schemas into flexible, high-performance formats optimized for the cloud (e.g., Parquet, Avro, or Iceberg)
  • Phased Cutover Management: Orchestrate a phased migration strategy (Parallel Run, Shadow Execution) to ensure zero downtime for downstream business applications and reporting tools
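
The Data Parity Testing described above comes down to comparing keyed outputs from the legacy and migrated pipelines. A hedged, stdlib-only sketch — the record shapes, keys, and tolerance are assumptions for illustration, not this team's actual framework:

```python
# Compare legacy vs. migrated pipeline outputs keyed by record id.
# Numeric values are compared within a small tolerance to absorb
# float formatting differences between engines (an assumption here).
def parity_report(legacy: dict, new: dict, tol: float = 1e-9) -> dict:
    missing = sorted(set(legacy) - set(new))        # lost in migration
    extra = sorted(set(new) - set(legacy))          # unexpected records
    mismatched = sorted(
        k for k in set(legacy) & set(new)
        if abs(legacy[k] - new[k]) > tol            # value drift
    )
    return {"missing": missing, "extra": extra, "mismatched": mismatched}

legacy_out = {1: 100.0, 2: 250.5, 3: 7.25}
new_out = {1: 100.0, 2: 250.5, 4: 9.0}
report = parity_report(legacy_out, new_out)
print(report)  # {'missing': [3], 'extra': [4], 'mismatched': []}
```

During a parallel run or shadow execution, a report of all-empty lists is the signal that a cutover phase can proceed.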
Employment Type: Full-time

Business Intelligence Developer

The CTI Enterprise Analytical Services (EAS) organization is actively recruiting...
Location: India, Pune
Salary: Not provided
Citi
Expiration Date: Until further notice
Requirements:
  • 5+ years of experience as a Starburst developer or administrator, or with similar Business Intelligence / data federation tools
  • 3+ years of Linux, Shell scripting, Ansible experience
  • 8+ years overall IT experience
  • Knowledge of the Hadoop ecosystem with experience in Hive, Spark, etc. is a plus
  • Knowledge of Java or any programming language is a plus
  • Good interpersonal skills with excellent communication skills - written and spoken English
  • Able to interact with client projects in cross-functional teams
  • Good team player interested in sharing knowledge and cross-training other team members and shows interest in learning new technologies and products
  • 5+ years of hands-on experience in setting up security (authentication and authorization) for Business Intelligence or data federation products
  • Experience with container technologies, Kubernetes, and cloud architectures, including some exposure to public cloud platforms such as AWS, GCP
Job Responsibility:
  • Deliver the tooling and capabilities needed to enable data & analytics services such as Starburst, Tableau on massive, distributed data sets
  • Understand Engineering needs including those required to build, maintain, and operate the system through all phases of its life
  • Create and maintain continuous integration and deployment processes including testing and monitoring to ensure the solution is reliable and measurable
  • Take full ownership of designing solutions, and building blueprints, prototypes, and frameworks to drive enablement of new capabilities
  • Collaborate with cross-functional teams to build a portfolio of capabilities for recommendation and use in new product developments
  • Publish best practices, configuration recommendations, design patterns, tool/technology selection methodologies, and playbooks for Engineering and user communities
  • Collaborate with cross-functional Engineering teams to build a portfolio of capabilities to recommend and use in analytical product development across Citi lines of Businesses
  • Enable Hybrid cloud implementation along with security for Business Intelligence products
  • Enable Business Intelligence products on external Cloud platforms as SaaS or PaaS solutions and integrate with various Cloud and on-prem data sources
  • Build reusable security and deployment framework for Business Intelligence services enabled on Cloud and on-prem
Employment Type: Full-time

Equities Quant Platform Engineering Lead – Python

Citi's Equities Technology team is undergoing significant growth and investment,...
Location: United Kingdom, London
Salary: Not provided
Citi
Expiration Date: Until further notice
Requirements:
  • Extensive background in delivering production-grade, data-centric applications for quantitative trading and analytics
  • Demonstrated expertise in the python data engineering stack (Polars, Parquet, FastAPI, Jupyter, Airflow, Streamlit, Ray)
  • Demonstrated expertise in high-performance data stores and query engines (Starburst, Snowflake)
  • Demonstrated expertise in real-time streaming analytics technologies (Kafka, Flink)
  • Demonstrated expertise in cloud container technologies (AWS, Azure, GCP, Docker, Kubernetes)
  • Proven success in enhancing developer experience that reduces friction in coding, building and deploying APIs and client libraries
  • Real-world application of generative AI prompt engineering and RAG pipelines
  • Full-stack HTML5 web development skills
Job Responsibility:
  • Guide the technical direction and implementation of the platform
  • Championing engineering excellence through hands-on feature creation, rigorous code quality via pull request reviews, and by mentoring junior engineers to establish robust coding standards and guardrails
  • Architecting scalable, secure re-usable components
  • Drive the design process, aligning with or challenging existing blueprints, seeking consensus from senior leads and stakeholders
  • Staying up to date with open-source solutions and latest trends to accelerate business outcomes
What we offer:
  • 27 days annual leave (plus bank holidays)
  • A discretionary annual performance-related bonus
  • Private Medical Care & Life Insurance
  • Employee Assistance Program
  • Pension Plan
  • Paid Parental Leave
  • Special discounts for employees, family, and friends
  • Access to an array of learning and development resources
Employment Type: Full-time

Staff Data Engineer

The Staff Data Engineer role is part of the Bamboo Health Engineering Team. You ...
Location: United States
Salary: Not provided
Bamboo Health
Expiration Date: Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Analytics, a related field, or equivalent experience
  • 8+ years total software and relational database development experience
  • 3+ years with a strong demonstrated ability to develop and maintain ETL solutions, ideally using Python and various Application Programming Interfaces (API)
  • 3+ years’ experience with AWS Cloud Solutions/Services
  • Experience working in an Agile environment, including using ticketing software such as JIRA
  • Strong technical problem-solving abilities
  • Hands on experience maintaining databases on Redshift, PostgreSQL, MySQL or Oracle relational database systems.
  • Experience with software development using Python, Ruby, or other modern scripting languages, ideally in container solutions such as Docker or Kubernetes.
  • Proficiency with modern data stack tools such as dbt, Starburst, AWS Glue.
  • The ability to travel periodically for work.
Job Responsibility:
  • Develop, debug and support ETL processes utilizing AWS services
  • Lead ideation and development of data models used for data science and analytics
  • Meet the delivery expectations of the Agile Project Management methodology
  • Maintain and optimize reports and extracts that serve as lifesaving information sources to customers
  • Create clear and concise documentation regarding technical solutions, while sharing knowledge and documentation with teammates via “Lunch and Learns”
  • Collaborate with internal and external customers to deliver modern data products
  • Explore opportunities to enhance workflows through AI or automation tools (e.g., document summarization, task routing, or data parsing).
  • Identify repetitive tasks and partner with team leads to implement scalable automation solutions.
What we offer:
  • Receive competitive compensation, including health, dental, vision and other benefits

Database Development Engineer

The Database Development Engineer is an intermediate level position responsible ...
Location: Canada, Mississauga
Salary: Not provided
Citi
Expiration Date: Until further notice
Requirements:
  • 5-8 years of relevant experience
  • Experience in systems analysis and programming of software applications
  • Experience in managing and implementing successful projects
  • Oracle SQL & PL/SQL Expertise – Strong knowledge of writing queries, stored procedures, triggers, and performance tuning
  • Database Migration & ETL – Experience in moving data between Oracle and other databases (e.g., PostgreSQL, SQL Server)
  • Python for Data Migration – Proficiency in using Python libraries like cx_Oracle, SQLAlchemy, and pandas for data extraction, transformation, and loading (ETL)
  • Data Transformation & Cleansing – Hands-on experience with data validation, transformation, and error handling
  • Shell Scripting & Automation – Writing scripts to automate database tasks and migrations
  • Performance Optimization – Indexing, query tuning, and bulk data loading techniques (e.g., SQL*Loader, DBMS_DATAPUMP)
  • Experience with StarBurst Data is an added advantage
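
The data transformation and cleansing work listed above usually means validating raw extracted rows, normalizing values, and routing failures to error handling. A minimal stdlib sketch of that pattern — the column names, rules, and sample rows are hypothetical, not from this posting:

```python
# Transform-and-cleanse pass over extracted rows: trim whitespace,
# normalize currency codes, and reject rows that fail type validation.
rows = [
    {"id": "1", "amount": " 10.50", "currency": "usd"},
    {"id": "2", "amount": "abc",    "currency": "USD"},   # bad amount
    {"id": "3", "amount": "7.25",   "currency": " eur "},
]

clean, rejects = [], []
for row in rows:
    try:
        amount = float(row["amount"].strip())
    except ValueError:
        rejects.append(row["id"])  # would be routed to an error table
        continue
    clean.append({"id": int(row["id"]),
                  "amount": amount,
                  "currency": row["currency"].strip().upper()})

print(len(clean), rejects)  # 2 ['2']
```

In an Oracle-centric stack the same pass would typically run through pandas DataFrames fed by cx_Oracle or SQLAlchemy, with bulk loading handled by SQL*Loader.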
Job Responsibility:
  • Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas
  • Monitor and control all phases of development process and analysis, design, construction, testing, and implementation as well as provide user and operational support on applications to business users
  • Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business process, system process, and industry standards, and make evaluative judgement
  • Recommend and develop security measures in post implementation analysis of business usage to ensure successful system design and functionality
  • Consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems
  • Ensure essential procedures are followed and help define operating standards and processes
  • Serve as advisor or coach to new or lower level analysts
  • Has the ability to operate with a limited level of direct supervision
  • Can exercise independence of judgement and autonomy
  • Acts as SME to senior stakeholders and/or other team members
Employment Type: Full-time

Senior Software Engineer

Immuta is the Data Provisioning Company, helping organizations provision secure,...
Location: United States, College Park
Salary: 155000.00 - 170000.00 USD / Year
Immuta
Expiration Date: Until further notice
Requirements:
  • 5–8 years of software engineering experience in SaaS, cloud, or data-intensive environments
  • Bachelor’s or Master’s degree in Computer Science or a related field is preferred
  • Strong proficiency in TypeScript and Node.js, with experience building backend services and data-driven applications
  • Hands-on experience designing and operating microservice and distributed systems, including asynchronous or long-running workflows (e.g., Temporal or similar systems) and API design
  • Experience working with Postgres, including writing and tuning SQL for performance, and deploying services using Docker and Kubernetes in cloud environments (AWS, Azure, or GCP)
  • Excellent communicator who is curious, self-directed, and passionate about building high-quality software that drives measurable customer value
Job Responsibility:
  • Design and develop new backend pipelines and workflows that deliver high reliability and performance
  • Identify bottlenecks, tune Postgres queries, and optimize system performance as data volumes grow
  • Provide technical leadership, mentoring junior engineers and fostering a culture of learning and excellence
  • Improve engineering processes through automation, testing, and continuous delivery
  • Design, build, and deliver backend services and distributed workflows that power Immuta’s core platform
  • Build and operate services that integrate with modern data platforms such as Snowflake, Databricks, Starburst, and Redshift
  • Implement and maintain TypeScript-based microservices, RESTful APIs, and Temporal workflows
  • Own Postgres performance and reliability, including query authoring, tuning (configuration of memory and buffers, WAL tuning, and table design), benchmarking, and schema design
  • Deploy and operate microservices in Kubernetes-based environments, using tools like Skaffold and Flux to support modern CI/CD workflows, with a focus on scalability and reliability
  • Participate in code reviews, design discussions, and system architecture planning
What we offer:
  • 100% employer paid Healthcare (Medical, Dental, Vision) premiums for you and your dependents (including Domestic Partners)
  • Stock Options
  • Paid parental leave (Both Maternity and Paternity)
  • Unlimited Paid time off (U.S. based positions)
  • Learning and Development Resources
  • 401(k) plan
Employment Type: Full-time

Data Engineer

Role: Data Engineer. Experience: 10 to 15 years. Full-time permanent FTE.
Location: United States, New York
Salary: 159000.00 USD / Year
Realign
Expiration Date: Until further notice
Requirements:
  • Hands-on experience in building ETL using Databricks SaaS infrastructure
  • Experience in developing data pipeline solutions to ingest and exploit new and existing data sources
  • Expertise in leveraging SQL, programming language like Python and ETL tools like Databricks
  • Perform code reviews to ensure requirements, optimal execution patterns and adherence to established standards
  • Expertise in AWS Compute (EC2, EMR), AWS Storage (S3, EBS), AWS Databases (RDS, DynamoDB), AWS Data Integration (Glue)
  • Advanced understanding of Container Orchestration services including Docker and Kubernetes, and a variety of AWS tools and services
  • Good understanding of AWS Identity and Access Management (IAM), AWS Networking, and AWS Monitoring tools
  • Proficiency in CI/CD and deployment automation using GITLAB pipeline
  • Proficiency in Cloud infrastructure provisioning tools e.g., Terraform
  • Proficiency in one or more programming languages e.g., Python, Scala
Job Responsibility:
  • Work on migrating applications from an on-premises location to the cloud service providers
  • Develop products and services on the latest technologies through contributions in Development, enhancements, testing and implementation
  • Develop, modify, extend code for building cloud infrastructure, and automate using CI/CD pipeline
  • Partner with business and peers in the pursuit of solutions that achieve business goals through an agile software development methodology
  • Perform problem analysis, data analysis, reporting, and communication
  • Work with peers across the system to define and implement best practices and standards
  • Assess applications and help determine the appropriate application infrastructure patterns
  • Use the best practices and knowledge of internal or external drivers to improve products or services
Employment Type: Full-time

Senior Solution Architect

This is where we value your strategic mindset, technical expertise and passion f...
Location: India, Gurgaon
Salary: Not provided
Baxter
Expiration Date: Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Finance, Math, Statistics, or related discipline
  • 15+ years of experience in data and analytics roles
  • 10+ years of hands-on experience with Power BI/Tableau/Alteryx and related BI tools
  • Expertise in enterprise architecture frameworks and cloud-native technologies (AWS, Azure, GCP)
  • Deep knowledge of ETL tools (e.g., Informatica), data modeling (e.g., Erwin), and data pipeline orchestration
  • Advanced proficiency in Snowflake, Synapse, Alteryx, Power BI, Tableau, and related BI tools
  • Experience with data virtualization platforms such as Denodo and Starburst
  • Familiarity with process mining tools like Celonis
  • Strong knowledge of Finance, including financial reporting, analysis, and performance tracking
  • Proficiency in cloud data architecture and operationalization on platforms like Snowflake, AWS S3, AWS Glue, Athena, Redshift
Job Responsibility:
  • Define and own the end-to-end solution architecture for Finance Analytics, covering areas such as Global Business Solution (GBS), Treasury, Tax, Audit, FP&A and other functions within Baxter Finance
  • Partner with Finance leadership, enterprise architects, and delivery teams to translate business strategies into robust, future-ready technology solutions
  • Provide architectural leadership across programs and projects, ensuring consistency, integration, and adherence to best practices
  • Evaluate and recommend emerging technologies and platforms to enhance financial capabilities and operational efficiency
  • Ensure seamless integration across ERP systems, financial applications, data platforms, and reporting tools
  • Act as a strategic advisor, influencing technology decisions and roadmaps within the Finance function
What we offer:
  • Paid Time Off
  • Employee Health & Well-Being Benefits
  • Continuing Education / Professional Development
  • Support for Parents
  • Employee Assistance Program
Employment Type: Full-time