We are looking for an experienced Data Engineer to join our team in Greenville, South Carolina. This role offers an exciting opportunity to work with modern data technologies, ensuring the efficient operation and optimization of data pipelines and systems. The ideal candidate will bring a strong technical background, leadership skills, and a proactive approach to maintaining and improving data infrastructure.
Job Responsibilities:
Oversee daily data loads and ensure the smooth operation of data pipelines and related systems
Troubleshoot and resolve issues such as pipeline failures, performance bottlenecks, schema mismatches, and cloud resource disruptions
Conduct root-cause analyses and implement permanent solutions to prevent recurring issues
Maintain and optimize existing data processes, refactoring or retiring outdated workflows as necessary
Design and build scalable data ingestion pipelines using technologies such as Azure Data Factory, Databricks, and Synapse Pipelines
Collaborate with teams to create and improve operational runbooks, monitoring dashboards, and incident response workflows
Develop reusable ingestion patterns for platforms like Guidewire DataHub, InfoCenter, and other business data sources
Lead the implementation of real-time and event-driven data engineering solutions to enable operational insights and automation
Partner with architects to modernize data workloads using advanced frameworks like Delta Lake and Medallion Architecture
Mentor entry-level engineers, enforce coding best practices, and review code to ensure quality and compliance
Requirements:
Expertise in ETL processes and tools for efficient data extraction, transformation, and loading
Proficiency in Microsoft SQL Server for database programming and management
Hands-on experience with Azure technologies such as Data Factory, Data Lake, and Databricks
Strong scripting skills in T-SQL and PL/SQL for database optimization
Familiarity with real-time data processing and event-driven architectures
Knowledge of modern data governance practices and tools
Ability to create analytics-ready datasets that support advanced reporting and insights
Strong leadership and communication skills to guide teams and collaborate across departments