We are looking for a talented Data Engineer to join our team in Somerset, New Jersey. This is a Contract-to-Permanent opportunity where you will play a key role in designing, developing, and optimizing scalable data solutions. The ideal candidate will collaborate with cross-functional teams to modernize our data infrastructure and support the migration to an Azure Lakehouse architecture.
Job Responsibilities:
Lead the migration of existing data assets to an Azure Lakehouse architecture, ensuring smooth transitions and optimized performance
Design, develop, and maintain scalable data pipelines using Azure Databricks, incorporating Delta Lake and Parquet file formats
Collaborate with data architects to implement best practices in data modeling, partitioning, and storage optimization
Enhance query performance and streamline cost management within Azure Data Lake Storage and Databricks environments
Develop and implement data quality frameworks to ensure accuracy, reliability, and consistency of data
Integrate analytics and reporting solutions on top of the data lake to support business intelligence needs
Contribute to the roadmap for transitioning to Microsoft Fabric, leveraging Delta Lake and Parquet assets
Document workflows, architecture designs, and operational processes to ensure seamless knowledge sharing
Mentor less-experienced team members and lead technical training sessions to foster skill development
Requirements:
Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent relevant experience
At least 5 years of experience in data engineering with a strong focus on Azure data platform technologies
Advanced knowledge of Azure Databricks, Delta Lake, and Parquet file formats
Hands-on experience in building and maintaining data pipelines and implementing data modeling strategies
Proficiency in programming languages such as Python or Scala
Solid understanding of data governance, security, and compliance standards
Familiarity with Microsoft Fabric architecture and migration strategies
Excellent communication and collaboration skills in team-oriented environments