Token Metrics is seeking a multi-talented Back End Engineer to support our Data Scientists and Engineering team. The Back End Engineer will be responsible for employing various tools and techniques to construct frameworks that prepare information using SQL, Python, R, Java, and C++, and for applying machine learning techniques to create and maintain structures that allow data to be analyzed, while staying current with the dominant programming and deployment strategies in the field. Throughout this process, you will collaborate with coworkers to ensure that your approach meets the needs of each project.
Job Responsibilities:
Liaising with coworkers and clients to elucidate the requirements for each task
Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
Reformulating existing frameworks to optimize their functioning
Testing such structures to ensure that they are fit for use
Building data pipelines that ingest data from different sources and formats, such as APIs, CSV, and JSON
Preparing raw data for manipulation by Data Scientists
Implementing proper data validation and data reconciliation methodologies
Ensuring that your work remains backed up and readily accessible to relevant coworkers
Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs
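As a rough illustration of the pipeline and validation duties listed above, the sketch below normalizes records from CSV and JSON sources (two of the formats named in the responsibilities) into one structure and drops malformed rows. All field names and sample data here are hypothetical, not part of the role's actual systems.

```python
import csv
import io
import json

# Hypothetical inline sources standing in for real feeds; note the
# CSV row with a missing price, which validation should reject.
CSV_SOURCE = "id,price\n1,9.50\n2,\n"
JSON_SOURCE = '[{"id": 3, "price": 4.25}]'

def load_csv(text):
    """Parse CSV text into a list of dicts keyed by the header row."""
    return list(csv.DictReader(io.StringIO(text)))

def load_json(text):
    """Parse a JSON array of objects into a list of dicts."""
    return json.loads(text)

def validate(records):
    """Keep only records with an integer id and a parseable price."""
    clean = []
    for rec in records:
        try:
            clean.append({"id": int(rec["id"]), "price": float(rec["price"])})
        except (KeyError, TypeError, ValueError):
            continue  # drop malformed rows instead of failing the whole pipeline
    return clean

records = validate(load_csv(CSV_SOURCE) + load_json(JSON_SOURCE))
print(records)  # only the two well-formed records (ids 1 and 3) survive
```

In a real pipeline the loaders would read from files or HTTP endpoints rather than inline strings, and the validation rules would come from a schema, but the shape of the work (ingest heterogeneous formats, normalize, validate, reconcile) is the same.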
Requirements:
Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
A Master's degree in a relevant field is an added advantage
3+ years of development experience in Python, Java, or another programming language
3+ years of SQL and NoSQL experience (Snowflake Cloud DW and MongoDB experience is a plus)
3+ years of experience with schema design and dimensional data modeling
Expert proficiency in SQL, NoSQL, Python, C++, Java, R
Expert at building Data Lakes, Data Warehouses, or suitable equivalents
Expert in AWS Cloud
Excellent analytical and problem-solving skills
A knack for independence and group work
Capacity to successfully manage a pipeline of duties with minimal supervision