Socure is building the identity trust infrastructure for the digital economy, verifying 100% of good identities in real time and stopping fraud before it starts. The mission is big, the problems are complex, and the impact is felt by businesses, governments, and millions of people every day.

We are looking for an experienced Analytics Engineer to own and evolve the BI team’s technical infrastructure (Snowflake, dbt, GitLab CI/CD, scheduling frameworks, and ingestion tooling) while ensuring all BI systems and workflows remain fully aligned with the broader Data Engineering architecture and design principles. This role is responsible for keeping the BI environment scalable, maintainable, and consistent with the company’s overall data platform strategy.

You will collaborate closely with Data Engineering, who manage the source-to-mesh pipelines, and you will build everything needed to deliver clean, reliable, analytics-ready data into the BI workspace. This includes developing curated data layers, ensuring pipeline reliability, maintaining governance standards, and enabling efficient downstream analytics across dashboards, reporting, and domain models.
Job Responsibilities:
Own and enhance BI infrastructure
Administer and optimize our Snowflake data warehouse (roles, performance, cost control, governance)
Maintain and scale dbt projects including core models, tests, documentation, semantic modeling, and deployments
Manage GitLab pipelines/runners to support robust CI/CD for BI assets
Oversee job scheduling and orchestration for BI transformations and data flows
Own ingestion pipelines relevant to BI data needs
Bridge the gap between Data Engineering and BI
Collaborate with Data Engineering to understand upstream mesh data products, and with BI analysts to understand business logic, metrics definitions, and performance targets
Extend mesh data into curated BI data layers optimized for analytics
Design data structures that support accurate, scalable analytics (fact tables, dimensions, semantic layers)
Participate in architectural decisions to align upstream pipelines with downstream analytical requirements
Deliver data experiences to end users
Build custom solutions (APIs, extracts, materialized datasets, governed marts) to deliver data in the right format for each use case
Implement robust testing, monitoring, and reliability processes for BI pipelines
Ensure fast, reliable data availability for business stakeholders
Support and guide BI initiatives
Partner with BI Analysts to maintain a reliable modeling environment and help unblock analytical workflows
Recommend the most effective data modeling approaches and development processes, considering business priorities and resource limits
Participate in tooling evaluations and decisions, ensuring solutions fit BI use cases and organizational architecture
Provide clarity in ambiguous situations and advise leadership on risks, dependencies, and sequencing of work
Requirements:
3-5+ years as an Analytics Engineer, Data Engineer, or similar role with a strong analytics orientation
Strong proficiency with Snowflake and familiarity with AWS analytics services (Redshift, Athena, S3, SageMaker, etc.)
Expertise in SQL, Python, and Spark for data processing, automation, and custom integrations
Experience with dbt and modern data modeling best practices
Hands-on experience with Git-based CI/CD workflows (GitLab preferred)
Familiarity with ingestion tools such as Fivetran
Proven ability to translate business requirements and metric definitions into robust, scalable data models
Strong communication and stakeholder management skills
Nice to have:
Experience supporting BI or analytics teams directly
Knowledge of semantic layers, metrics stores, or analytics engineering frameworks
Deeper Python experience for automation, orchestration, or custom integrations
Familiarity with data mesh principles and domain-oriented data products
Experience optimizing cross-cloud data architecture or hybrid environments