The Intermediate BI Developer will be responsible for developing and managing data solutions, particularly on the Snowflake platform. As part of a growing team, we need experienced Intermediate BI Developers to join us in understanding, designing, and building the full development lifecycle (FDLC) of the client's BI solutions. This role will enable you not only to collaborate with the best developers in the region, but also to contribute to key technical decisions, from the frameworks and tools in use to the wider technical strategy and architecture, under the guidance of Senior BI Developers and BI Architects.
Job Responsibilities:
Develop, manage, and optimize data solutions using the Snowflake platform
Write applications that extend Snowflake, act as a client, or act as an integrating component
Use the Snowpark API to run Python, Java, and Scala code directly in Snowflake
Create data pipelines in Python, Java, or Scala
Build Machine Learning workflows with fast data access and data processing
Create and manage Snowflake resources across data engineering, Snowpark, Snowpark ML, and application workloads using a unified, first-class Python API
Explore various options such as Native Apps Framework, Streamlit, functions, procedures, and more
Lead BI software development, deployment, and maintenance
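As a rough illustration of the kind of pipeline and dimensional-modelling work described above, the sketch below loads toy source records into a tiny star schema in plain Python. It is a minimal sketch only: sqlite3 stands in for Snowflake, and all table names, column names, and data are invented for the example.

```python
import sqlite3

# Toy extract step: raw sales records as they might arrive from a source
# system. All names and values here are invented for illustration.
raw_sales = [
    {"order_id": 1, "customer": "Acme", "product": "Widget", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "product": "Gadget", "amount": 80.0},
    {"order_id": 3, "customer": "Globex", "product": "Widget", "amount": 60.0},
]

conn = sqlite3.connect(":memory:")  # stand-in for a warehouse connection
cur = conn.cursor()

# A minimal star schema: two dimension tables and one fact table.
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE dim_product  (product_key  INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE fact_sales (
    order_id INTEGER,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
""")

def dim_key(table, name):
    """Look up or insert a dimension row and return its surrogate key."""
    cur.execute(f"INSERT OR IGNORE INTO {table} (name) VALUES (?)", (name,))
    cur.execute(f"SELECT rowid FROM {table} WHERE name = ?", (name,))
    return cur.fetchone()[0]

# Transform/load step: conform source fields to the dimensional model.
for row in raw_sales:
    cur.execute(
        "INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
        (row["order_id"],
         dim_key("dim_customer", row["customer"]),
         dim_key("dim_product", row["product"]),
         row["amount"]),
    )

# A typical BI-style aggregation over the star schema.
cur.execute("""
    SELECT c.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c USING (customer_key)
    GROUP BY c.name ORDER BY c.name
""")
totals = cur.fetchall()
```

In production this shape of pipeline would target Snowflake (for example via the Snowpark API) rather than an in-memory database, but the extract, conform-to-dimensions, and load steps follow the same pattern.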
Requirements:
At least 5 years' production experience analysing and wrangling data
At least 5 years' production experience building data warehouses and dimensional data models from relational models
Bachelor’s degree in Computer Science, Information Systems, or equivalent experience
Proficiency in SQL, Azure tools, and Power BI
Proven experience as a Snowflake Developer or similar role
Proficiency in Python, Java, Scala, or other relevant languages
Experience with data engineering, AI & ML, Data Lake, and DevOps
Nice to have:
Production experience analysing and wrangling relational data sources (e.g. CSV, Excel, SQL, Oracle, PostgreSQL, MySQL, Databricks)
Production understanding and mastery of the differences between Relational, Dimensional, and Unstructured data, and the modelling thereof
Production experience transforming and conforming diverse data sources into business insights through collaborative Business Metrics and visible Data Lineage
Production experience working with at least 3 modern modelling tools, on-premises and/or in the cloud (e.g. MS SQL, Analysis Services, Power BI, Azure Synapse Analytics, Microsoft Fabric)
Production experience building star-schema dimensional data warehouses (e.g. on-premises SQL Server, Azure SQL variants, Azure Synapse Analytics, Microsoft Fabric)
Production experience building user-friendly self-service Data Models using modern visualisation tools (e.g. Power BI, Analysis Services)
Production experience building developer-friendly technical Data Models using modern visualisation tools (e.g. SQL Views, Analysis Services, Azure Synapse Analytics, Power BI Desktop, Power BI Service, Power BI Apps)
Production experience modelling DAX measures (e.g. Power BI, Analysis Services, DAX Studio, Tabular Editor, or other external tools used with Power BI Desktop)
Production experience building Dashboards, particularly for dev-testing self-service or technical dimensional models and for engaging peer and subordinate Report Writers and Dashboard Builders (e.g. Power BI Desktop, Excel)
Working knowledge of data from a variety of mainstream source systems (e.g. SAP, Oracle EBS, Salesforce, SharePoint, Dynamics 365, Office 365, SQL-based systems)
Working knowledge of Master Data Management, Data Governance, and Data Lineage
Working knowledge of public, non-relational, unstructured, and streaming data for enriching dimensional models
Working knowledge of Hybrid Cloud and non-Microsoft cloud platforms (e.g. Google Cloud, AWS)
Working knowledge of related technologies (e.g. IoT, Edge, Big Data, Data Lakes, Data Streaming, Advanced Analytics, Machine Learning, Artificial Intelligence, Robotics)
Experience building effective CI/CD tools and processes (e.g. DevOps, GitHub)
Working knowledge of Databricks
If you have any Microsoft certifications, you’re one step ahead