Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech firms. Recently, our team has spearheaded the conversion and go-live of apps that support the backbone of the financial trading industry. Are you a data enthusiast with a natural ability for analytics? We're looking for skilled Data/Analytics Engineers to fill multiple roles for our exciting new client. This is your chance to shine, demonstrating your dedication and commitment in a role that promises both challenge and reward.
Job Responsibilities:
Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake
Implement ETL (Extract, Transform, Load) processes using Snowflake features such as Snowpipe, Streams, and Tasks (a minimal sketch follows this list)
Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs
Optimize data warehouse performance and scalability using Snowflake features like clustering, partitioning, and materialized views
Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs
Implement data synchronization processes to ensure consistency and accuracy of data across different systems
Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features
Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency
Work on Snowflake modeling – roles, databases, and schemas – and on cloud-based ETL tools
Measure SQL performance and carry out query and database tuning
Work with the SQL language and cloud-based technologies
Set up the RBAC model at the infrastructure and data levels (see the role-grant sketch after this list)
Work on data masking, encryption, and tokenization, as well as data wrangling and data pipeline orchestration (e.g., Snowflake Tasks)
Set up AWS S3/EC2 and configure external stages and SQS/SNS notifications (see the Boto3 sketch after this list)
Perform data integration, e.g., with MSK Kafka Connect and other partners such as Delta Lake (Databricks)
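By way of illustration, here is a minimal sketch of the Snowpipe / Streams / Tasks pattern referenced above, using the snowflake-connector-python library. All account, bucket, table, and column names are hypothetical, and a real deployment would manage credentials through a secrets manager:

```python
# Minimal sketch: auto-ingest files from S3 with Snowpipe, then merge the
# captured changes into a curated table with a Stream and a scheduled Task.
# Account, warehouse, bucket, table, and column names are all hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",          # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="RAW",
    schema="INGEST",
)
cur = conn.cursor()

# Landing table for raw ingested rows (columns are hypothetical).
cur.execute("""
    CREATE TABLE IF NOT EXISTS trades (
      trade_id NUMBER, symbol STRING, price NUMBER(18,4), traded_at TIMESTAMP_NTZ
    )
""")

# External stage over the S3 landing bucket (storage integration pre-created).
cur.execute("""
    CREATE STAGE IF NOT EXISTS trades_stage
      URL = 's3://example-landing-bucket/trades/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Snowpipe with AUTO_INGEST: Snowflake exposes an SQS queue ARN
# (SHOW PIPES -> notification_channel) that S3 event notifications target.
cur.execute("""
    CREATE PIPE IF NOT EXISTS trades_pipe AUTO_INGEST = TRUE AS
      COPY INTO trades FROM @trades_stage
""")

# Stream + Task: capture new rows and merge them on a schedule.
# The curated target table is assumed to exist.
cur.execute("CREATE STREAM IF NOT EXISTS trades_stream ON TABLE trades")
cur.execute("""
    CREATE TASK IF NOT EXISTS merge_trades
      WAREHOUSE = LOAD_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('TRADES_STREAM')
    AS
      INSERT INTO trades_curated
      SELECT trade_id, symbol, price, traded_at
      FROM trades_stream
      WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK merge_trades RESUME")
```

A stream only advances its offset when a DML statement reads from it, so the task consumes each batch of changes exactly once.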
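And a sketch of the kind of RBAC setup the role involves: a read-only functional role granted usage and select privileges at the warehouse, database, and schema levels (all role and object names are hypothetical):

```python
# Minimal RBAC sketch: create a read-only functional role and grant it the
# usual usage/select privileges. Role and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="admin_user",
    password="...",
    role="SECURITYADMIN",    # role management is typically done from here
)
cur = conn.cursor()

for stmt in [
    "CREATE ROLE IF NOT EXISTS analyst_ro",
    "GRANT USAGE ON WAREHOUSE query_wh TO ROLE analyst_ro",
    "GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro",
    "GRANT USAGE ON SCHEMA analytics.public TO ROLE analyst_ro",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE analyst_ro",
    "GRANT ROLE analyst_ro TO ROLE SYSADMIN",  # keep the hierarchy rooted
]:
    cur.execute(stmt)
```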
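Finally, the S3/SQS wiring mentioned above can be scripted with Boto3. This sketch assumes the pipe's notification_channel ARN has already been read from SHOW PIPES; the bucket name and queue ARN are hypothetical:

```python
# Minimal sketch: point S3 event notifications at the Snowpipe SQS queue so
# newly landed files trigger auto-ingest. Bucket and ARN are hypothetical.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_notification_configuration(
    Bucket="example-landing-bucket",
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": "arn:aws:sqs:us-east-1:123456789012:sf-snowpipe-example",
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```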
Requirements:
ETL – Experience with ETL processes for data integration
SQL – Strong SQL skills for querying and data manipulation
Python – Strong command of Python, especially in AWS Boto3, JSON handling, and dictionary operations
Unix – Competent in Unix for file operations, searches, and regular expressions
AWS – Proficient with AWS services including EC2, Glue, S3, Step Functions, and Lambda for scalable cloud solutions
Database Modeling – Solid grasp of database design principles, including logical and physical data models, and change data capture (CDC) mechanisms
Snowflake – Experienced in Snowflake for efficient data integration, utilizing features like Snowpipe, Streams, Tasks, and Stored Procedures
Airflow – Fundamental knowledge of Airflow for orchestrating complex data workflows and setting up automated pipelines (a minimal DAG sketch follows this list)
Bachelor's degree in Computer Science or a related field is preferred. Relevant work experience may be considered in lieu of a degree
Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders
Proven leadership abilities, with experience mentoring junior developers and driving technical excellence within the team
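For the Airflow requirement, a minimal DAG sketch of the kind of orchestration involved (the DAG id and callable are hypothetical; Airflow 2.x API):

```python
# Minimal Airflow 2.x sketch: a daily DAG with a single Python task that
# stands in for the actual COPY INTO / MERGE logic. Names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_snowflake():
    # Placeholder: a real pipeline would call a Snowflake hook or operator.
    pass

with DAG(
    dag_id="daily_trades_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",       # 'schedule_interval' on Airflow < 2.4
    catchup=False,
) as dag:
    PythonOperator(task_id="load", python_callable=load_to_snowflake)
```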