Spoak is looking for a hybrid data engineer / data scientist to join us on our mission to build the world's most loved interior design platform. As a company committed to using data to drive our business and roadmap, we need someone who can develop and maintain our data infrastructure and apply advanced analytics techniques to uncover insights that help us grow the business and achieve our mission.
Job Responsibilities:
Design, build and maintain our data infrastructure, including ETL pipelines and databases
Develop and implement advanced analytics models and algorithms to uncover insights that can be used to optimize our products and customer experience
Work closely with product managers, designers, and engineers to identify data needs and build out new data-driven features
Develop and maintain data documentation, ensuring that our data is accurate, consistent, and well-documented
Participate in cross-functional projects and collaborate with other teams to share insights and knowledge
Requirements:
Bachelor's degree in computer science, statistics, mathematics or a related field
Strong knowledge of data engineering and data science concepts and techniques, including ETL, data warehousing, statistical modeling, machine learning, and data visualization
Proficiency in programming languages such as Python, R, or SQL
Experience with cloud platforms such as AWS or GCP
Ability to work collaboratively in a fast-paced startup environment
Excellent communication skills and ability to explain technical concepts and insights to non-technical stakeholders
Nice to have:
Experience with data visualization tools such as Tableau or Power BI
Experience with distributed computing systems such as Hadoop or Spark
Experience with big data technologies such as Apache Kafka, BigQuery or Cassandra
Experience with containerization technologies such as Docker or Kubernetes
Experience with machine learning platforms like TensorFlow or PyTorch