Senior Data Engineer

Lightpoint Global

Location:
Not provided

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

This project serves consulting companies that provide analytics and predictions to subscription-based business customers, based on their website users' behavior and other financial and transactional data (conversions, registrations, payments, etc.). It applies world-class data science to improve customer engagement by analyzing the audience, optimizing pricing strategy, and making recommendations on paywall strategy and content. Based on the resulting analytics, customers can make decisions about the development of their own products, including real-time decisions about a particular user, thanks to automated access to the analytics. The product also works for a wide variety of industries: telco, eCommerce, finance, healthcare, media & publishing, retail, sports, etc.

Job Responsibility:

  • work in an agile team to design, develop, and implement data integration services that connect diverse data sources including event tracking platforms (GA4, Segment), databases, APIs, and third-party systems
  • build and maintain robust data pipelines using Apache Airflow, dbt, and Spark to orchestrate complex workflows and transform raw data into analytics-ready datasets in Snowflake (see the sketch after this list)
  • develop Python-based integration services and APIs that enable seamless data flow between various data technologies and downstream applications
  • collaborate actively with data analysts, analytics engineers, and platform teams to understand requirements, troubleshoot data issues, and optimize pipeline performance
  • participate in code reviews, sprint planning, and retrospectives to ensure high-quality, production-ready code by the end of each sprint
  • contribute to the continuous improvement of data platform infrastructure, development practices, and deployment processes in accordance with CI/CD best practices
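
For illustration only, not part of the vacancy text: a minimal sketch of the kind of Airflow DAG such a pipeline might use, assuming a hypothetical extract_events task and a dbt project living at /opt/dbt; every name in it is invented for the example.

```python
# Hypothetical sketch only: a daily Airflow DAG that lands raw events and
# then runs dbt to build analytics-ready models. The task names, the
# /opt/dbt path, and the "events" selector are all invented for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_events() -> None:
    """Placeholder for pulling raw events from a tracking platform
    (e.g. GA4 or Segment) and landing them in the warehouse."""
    print("extracting raw events...")


with DAG(
    dag_id="events_to_snowflake",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_events",
        python_callable=extract_events,
    )

    # Transform the landed data into analytics-ready models with dbt.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt && dbt run --select events",
    )

    extract >> transform
```

A real pipeline would land the extracted data in Snowflake and might push heavier transformations to Spark; the sketch only shows the orchestration shape.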

Requirements:

  • successfully implemented and released data integration services or APIs using modern Python frameworks in the past 4 years
  • successfully designed data models and schemas for analytics or data warehousing solutions
  • strong analytical and problem-solving skills
  • strong knowledge of the Python programming language and data engineering
  • deep understanding of good programming practices, design patterns, and software architecture principles
  • ability to work as part of a team, contributing to product backlog reviews, solution design, and implementation
  • discipline to deliver software in a timely manner without compromising product quality
  • formal training in software engineering, computer science, computer engineering, or data engineering
  • working knowledge of Apache Airflow or a similar workflow orchestration technology
  • working knowledge of dbt (data build tool) for analytics transformation workflows
  • experience with cloud data warehouses (Snowflake preferred) and distributed computing frameworks (Apache Spark)
  • experience integrating with event tracking platforms (e.g., Google Analytics 4, Segment, Amplitude) and third-party data sources
  • understanding of data modeling concepts, including dimensional modeling, slowly changing dimensions, and incremental loading patterns (see the sketch after this list)
  • working knowledge of containerization (Docker/Kubernetes) and CI/CD pipelines
  • successfully implemented data APIs and integration services that seamlessly connect various data technologies
  • English level: B2 (Upper-intermediate)
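
For illustration only: a short Python sketch of two of the data modeling patterns named above, incremental loading and a type 2 slowly changing dimension, with the warehouse SQL held in plain strings. Every table and column name (fct_events, dim_customer, plan_tier, etc.) is hypothetical.

```python
# Hypothetical sketch only: incremental loading and SCD Type 2 expressed as
# warehouse SQL held in Python strings. All table and column names here
# (fct_events, dim_customer, plan_tier, ...) are invented for illustration.

# Incremental loading: append only the source rows newer than the
# high-water mark already present in the target table.
INCREMENTAL_LOAD = """
INSERT INTO analytics.fct_events
SELECT *
FROM raw.events
WHERE event_ts > (
    SELECT COALESCE(MAX(event_ts), '1970-01-01'::TIMESTAMP)
    FROM analytics.fct_events
)
"""

# SCD Type 2, step 1: close out the current version of any row whose
# tracked attribute has changed in the latest staging snapshot.
SCD2_CLOSE_OLD = """
UPDATE analytics.dim_customer d
SET valid_to = CURRENT_TIMESTAMP, is_current = FALSE
FROM staging.stg_customer s
WHERE d.customer_id = s.customer_id
  AND d.is_current
  AND d.plan_tier <> s.plan_tier
"""

# SCD Type 2, step 2: insert a fresh, open-ended version for new customers
# and for customers whose tracked attribute changed.
SCD2_INSERT_NEW = """
INSERT INTO analytics.dim_customer
    (customer_id, plan_tier, valid_from, valid_to, is_current)
SELECT s.customer_id, s.plan_tier, CURRENT_TIMESTAMP, NULL, TRUE
FROM staging.stg_customer s
LEFT JOIN analytics.dim_customer d
    ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL          -- brand-new customer
   OR d.plan_tier <> s.plan_tier     -- changed attribute -> new version
"""

if __name__ == "__main__":
    # Print the statements; in practice they would run through a warehouse
    # client (e.g. the Snowflake connector) or live in dbt models instead.
    for stmt in (INCREMENTAL_LOAD, SCD2_CLOSE_OLD, SCD2_INSERT_NEW):
        print(stmt.strip(), end="\n\n")
```

In production these patterns would more likely live in dbt incremental models than in hand-run SQL; the sketch just spells out the logic the requirement refers to.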

Nice to have:

  • experience with AI tools

Additional Information:

Job Posted:
December 09, 2025

Employment Type:
Full-time

Work Type:
Remote work