The mission of the business intelligence team is to create a data-driven culture that empowers leaders to integrate data into daily decisions and strategic planning. We aim to provide visibility, transparency, and guidance on the quantity and quality of results, activities, financial KPIs, and leading indicators, identifying trends that support data-driven decision-making. As a Senior Data Engineer, you will be responsible for designing, architecting, and implementing robust data solutions in a cloud-based environment (GCP). You will partner with other data engineers and technical teams to ensure the availability, reliability, and performance of our data systems.
Job Responsibility:
Architect and build complex data pipelines using advanced cloud data technologies
Lead efforts to optimize data pipelines for performance, scalability, and cost-efficiency
Define industry best practices for building data pipelines
Ensure data security, compliance, and governance standards are met
Partner with leadership team to define and implement agile and DevOps methodologies
Serve as subject matter expert and define data architecture and infrastructure requirements
Partner with business analysts to plan project execution including appropriate product and technical specifications, direction, resources, and establishing realistic completion times
Understand data technology trends and identify opportunities to implement new technologies and provide forward-thinking recommendations
Proactively partner with internal stakeholders to bridge gaps, provide historical references, and design the appropriate processes
Design and implement a robust data observability process
Resolve escalated reporting requests and communicate proactively and in a timely manner
Troubleshoot and provide technical guidance to resolve issues related to misaligned or inaccurate data, data fields, or new customer requirements
Maintain new release, migration, and sprint schedules for software upgrades, enhancements, and fixes to aid with product evolution
Write QA/QC scripts to conduct the first round of testing, and partner with the BA team on test validation for new developments prior to moving to production
Use industry knowledge and feedback to aid in the development of the technology roadmap and future product vision
Document standard ways of working via QRGs, intranet pages, and video series
Drive day-to-day development activities of development team in close collaboration with on-site and off-shore resources, scrum masters and product owners
Bootstrap a data engineering team at an early stage of the team's evolution
Provide technical leadership in difficult situations, facilitate contentious discussions, and report up when necessary
Guide, mentor and coach offshore resources
Provide input in forming a long-term data strategy
Requirements:
Master’s degree in Computer Science / Information Technology or related field, highly preferred
Extensive knowledge of BI concepts and related technologies that help drive sustainable technical solutions
Extensive experience with data lakes, ETL, and data warehouses
Advanced experience building data pipelines
Passion for building quality BI software
Project Management and/or process improvement experience highly preferred
Polyglot coder with expert-level proficiency in multiple languages, including Python, R, Java, and SQL, as well as relational databases, ERP systems, and data visualization tools such as DOMO or Tableau
Advanced, proven experience with Google Cloud Platform (GCP) is preferred, but experience with Microsoft Azure or AWS will be considered
Strong understanding of OOP concepts and methodologies
Expert level understanding of data engineering
Intrinsic motivation and problem-solving
Proactive leadership, project management, time management, and problem-solving skills
Demonstrated continuous improvement, process documentation, and workflow skills
Extensive experience with data analysis, modeling, and data pipelining, including data cleaning, standardizing, scaling, tuning, scheduling, and deployment
Experience composing detailed technical documentation and procedures for data models
Ability to prioritize and manage multiple projects and tasks and meet deadlines while maintaining quality
Strong drive and commitment for delivering outstanding results
Strong follow-up and service orientation
Nice to have:
Any exposure to Kafka, Spark, and Scala will be an added advantage