Intermediate Data Modeler

Randstad

Location:
Canada, Calgary

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Are you a forward-thinking data professional looking to build innovative solutions in a fast-paced, product-driven environment? Our Calgary client is seeking a skilled Data Modeler II for an initial 12-month contract to drive the end-to-end development of high-impact Supply Chain tools. If you bring a start-up mindset and want to apply advanced data science techniques to real-world energy-sector challenges, this is your opportunity to make a tangible impact on enterprise-scale operations.

Job Responsibility:

  • Design and optimize ETL pipelines for scalable data processing in Azure Databricks and Dataverse
  • Develop data science solutions focused on predictive analysis and text-based sentiment insights
  • Create master datasets to support spend analytics and procurement decision-making
  • Oversee project timelines and coordinate cross-functional teams to ensure goal alignment
  • Facilitate project meetings and provide regular progress updates to business stakeholders
  • Maintain comprehensive documentation including technical designs and idea assessments
  • Ensure all data products comply with corporate governance and security policies

Requirements:

  • 5 to 7 years of experience in SCM or product development roles
  • Advanced proficiency in Python and PySpark for large-scale data processing within Databricks
  • Hands-on experience building low-code solutions via Microsoft Power Platform and Copilot Studio
  • Functional knowledge of SAP or Oracle Cloud modules related to SCM, Finance, or Projects
  • Strong background in the Oil and Gas sector with a focus on Procure-to-Pay processes

What we offer:
  • Work in a dynamic, "start-up" style environment within a major industry leader
  • Gain exposure to a cutting-edge tech stack including Azure Databricks and Microsoft Copilot Studio
  • Collaborate on high-visibility projects that directly influence Supply Chain value drivers
  • Engage in a role that blends technical data science with strategic program management
  • Build your portfolio by creating master datasets for procurement and market intelligence

Additional Information:

Job Posted:
February 22, 2026

Expiration:
April 02, 2026

Work Type:
Hybrid work

Similar Jobs for Intermediate Data Modeler

Data Science Intermediate Analyst

The Data Science Intermediate Analyst is a developing professional role. Deals w...
Location:
Mexico, Ciudad De Mexico
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • 2-5 years of experience using tools for statistical modeling of large data sets
  • Ability to effectively use complex analytical, interpretive and problem-solving techniques
  • Analytical, flexible, team-oriented, with good interpersonal/communication skills
  • Demonstrated influencing, facilitation and partnering skills
  • Proficient in Microsoft Office
  • Essential: professional experience with statistical packages (advanced SAS, SQL, intermediate R and intermediate Python; Spark desirable)
Job Responsibility
  • Conducts strategic data analysis, identifies insights and implications, makes strategic recommendations, and develops data displays that clearly communicate complex analysis
  • Mines and analyzes data from various banking platforms to drive optimization and improve data quality
  • Delivers analytics initiatives to address business problems with the ability to determine data required, assess time & effort required and establish a project plan
  • Consults with business clients to determine system functional specifications
  • Consults with users and clients to solve complex system issues/problems through in-depth evaluation of business processes, systems and industry standards; recommends solutions
  • Leads system change process from requirements through implementation; provides user and operational support of application to business users
  • Formulates and defines systems scope and objectives for complex projects through research and fact-finding combined with an understanding of applicable business systems and industry standards
  • Impacts the business directly by ensuring the quality of work provided by self and others
What we offer
  • Equal opportunity employer
  • Global workforce benefits supporting well-being, growth, and work-life balance
  • Full-time

Liquidity Requirements and Testing Intermediate Analyst

The role involves understanding and executing liquidity testing, contributing to...
Location:
India, Mumbai
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • 3+ years of related experience
  • Financial Services Industry and Project Management experience preferred
  • Proficient in Microsoft Office with an emphasis on MS Excel
  • Working knowledge of related industry practices and standards
  • Ability to work with multiple functions
  • Proven problem-solving skills
  • Consistently demonstrates clear and concise written and verbal communication skills
Job Responsibility
  • Has knowledge of FR2052a and other liquidity metrics such as LCR/NSFR, as well as other liquidity submissions
  • Understand and execute testing following strict standards and ensuring impact is fully understood for complex products/reports assigned
  • Be a key participant in working groups, present items under testing and articulate expected impact and testing approach
  • Identify, communicate, and escalate (as applicable) risks to testing where timeliness or quality of testing could be compromised
  • Ensure exception processes, when triggered, are enforced and documented as applicable
  • Be closely familiar with, and follow, documented Standard Operating Procedures across the testing lifecycle
  • Identify, advocate for, and execute improvements or automation related to the process
  • Acts as an SME to senior stakeholders and/or other team members
  • Appropriately assess risk when business decisions are made, demonstrating consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency
What we offer
  • Resources to meet your unique needs
  • Empowerment to make healthy decisions
  • Financial well-being support
  • Full-time

Senior Data Engineer

This project is designed for consulting companies that provide analytics and pre...
Location:
Not provided
Salary:
Not provided
Lightpoint Global
Expiration Date
Until further notice
Requirements
  • Successfully implemented and released data integration services or APIs using modern Python frameworks within the past 4 years
  • Successfully designed data models and schemas for analytics or data warehousing solutions
  • Strong analysis and problem-solving skills
  • Strong knowledge of the Python programming language and data engineering
  • Deep understanding of good programming practices, design patterns, and software architecture principles
  • Ability to work as part of a team by contributing to product backlog reviews and solution design and implementation
  • Discipline in implementing software in a timely manner while ensuring product quality isn't compromised
  • Formal training in software engineering, computer science, computer engineering, or data engineering
  • Working knowledge of Apache Airflow or a similar workflow orchestration technology
  • Working knowledge of dbt (data build tool) for analytics transformation workflows
Job Responsibility
  • Work in an agile team to design, develop, and implement data integration services that connect diverse data sources, including event tracking platforms (GA4, Segment), databases, APIs, and third-party systems
  • Build and maintain robust data pipelines using Apache Airflow, dbt, and Spark to orchestrate complex workflows and transform raw data into analytics-ready datasets in Snowflake
  • Develop Python-based integration services and APIs that enable seamless data flow between various data technologies and downstream applications
  • Collaborate actively with data analysts, analytics engineers, and platform teams to understand requirements, troubleshoot data issues, and optimize pipeline performance
  • Participate in code reviews, sprint planning, and retrospectives to ensure high-quality, production-ready code by the end of each sprint
  • Contribute to the continuous improvement of data platform infrastructure, development practices, and deployment processes in accordance with CI/CD best practices
  • Full-time

Business Analyst DWH

Within the Performance & Compliance tribe, we have an open position for a DWH Bu...
Location:
Romania, Bucharest
Salary:
Not provided
Inetum
Expiration Date
Until further notice
Requirements
  • A good level of knowledge of banking products and dependencies between different banking areas
  • Ability to analyze and understand a data set and the correlations within the data
  • Ability to analyze business requirements and write related functional specifications
  • Ability to write and apply test scenarios (functional tests)
  • Ability to identify the impact of the projects in which they work
  • Ability to provide user support in the use of new developments
  • Good knowledge of project methodology
  • Very good knowledge in the use of Microsoft Office products
  • Good knowledge of SQL, DWH concepts, data modeling
  • Intermediate-advanced English language skills
Job Responsibility
  • Performs business analysis of user requirements
  • Provides support to business departments in the detailed definition of requirements
  • Proposes definitions of business terms, creates the necessary referentials and conceptual modeling of data in the DWH
  • Creates specific documentation, aiming to create quality deliverables: functional specification, inventory and dictionary of business terms, specification for referentials, technical specification, test scenarios, user guides related to new developments
  • Performs or provides support for functional approval tests of new referentials/data/reports/functionalities, respectively non-regression tests of existing referentials/data/reports/functionalities
  • Ensures compliance with delivery deadlines established within projects
  • Provides support to users from the business area in the acceptance tests carried out by them
  • Change Management: ensures communication to users regarding new developments made on the DWH platform and the transmission of user guides for new applications; participates in user training
  • Actively participates in project review/retrospective meetings, sends constructive feedback and proposals for optimizing the way of working
What we offer
  • Full access to a foreign language learning platform
  • Personalized access to tech learning platforms
  • Tailored workshops and trainings to sustain your growth
  • Medical insurance
  • Meal tickets
  • Monthly budget to allocate on a flexible benefits platform
  • Access to 7 Card services
  • Wellbeing activities and gatherings
  • Full-time

Senior Data Engineer

Provectus helps companies adopt ML/AI to transform the ways they operate, compet...
Location:
Not provided
Salary:
Not provided
Provectus
Expiration Date
Until further notice
Requirements
  • 5+ years of experience in data engineering
  • Experience in AWS
  • Experience handling real-time and batch data flow and data warehousing with tools and technologies like Airflow, Dagster, Kafka, Apache Druid, Spark, dbt, etc.
  • Proficiency in programming languages relevant to data engineering, such as Python and SQL
  • Proficiency with Infrastructure as Code (IaC) technologies like Terraform or AWS CloudFormation
  • Experience in building scalable APIs
  • Familiarity with Data Governance aspects like Quality, Discovery, Lineage, Security, Business Glossary, Modeling, Master Data, and Cost Optimization
  • Upper-Intermediate or higher English skills
  • Ability to take ownership, solve problems proactively, and collaborate effectively in dynamic settings
Job Responsibility
  • Collaborate closely with clients to deeply understand their existing IT environments, applications, business requirements, and digital transformation goals
  • Collect and manage large volumes of varied data sets
  • Work directly with ML Engineers to create robust and resilient data pipelines that feed Data Products
  • Define data models that integrate disparate data across the organization
  • Design, implement, and maintain ETL/ELT data pipelines
  • Perform data transformations using tools such as Spark, Trino, and AWS Athena to handle large volumes of data efficiently
  • Develop, continuously test, and deploy Data API Products with Python and frameworks like Flask or FastAPI
What we offer
  • Long-term B2B collaboration
  • Paid vacation and sick leave
  • Public holidays
  • Compensation for medical insurance or sports coverage
  • External and internal educational opportunities and AWS certifications
  • A collaborative local team and international project exposure
Read More
Arrow Right

Senior Data Engineer

Provectus, a leading AI consultancy and solutions provider specializing in Data ...
Location:
Not provided
Salary:
Not provided
Provectus
Expiration Date
Until further notice
Requirements
  • Experience handling real-time and batch data flow and data warehousing with tools and technologies like Airflow, Dagster, Kafka, Apache Druid, Spark, dbt, etc.
  • Experience in AWS
  • Proficiency in programming languages relevant to data engineering, such as Python and SQL
  • Proficiency with Infrastructure as Code (IaC) technologies like Terraform or AWS CloudFormation
  • Experience in building scalable APIs
  • Familiarity with Data Governance aspects like Quality, Discovery, Lineage, Security, Business Glossary, Modeling, Master Data, and Cost Optimization
  • Upper-Intermediate or higher English skills
  • Ability to take ownership, solve problems proactively, and collaborate effectively in dynamic settings
Job Responsibility
  • Collaborate closely with clients to deeply understand their existing IT environments, applications, business requirements, and digital transformation goals
  • Collect and manage large volumes of varied data sets
  • Work directly with ML Engineers to create robust and resilient data pipelines that feed Data Products
  • Define data models that integrate disparate data across the organization
  • Design, implement, and maintain ETL/ELT data pipelines
  • Perform data transformations using tools such as Spark, Trino, and AWS Athena to handle large volumes of data efficiently
  • Develop, continuously test, and deploy Data API Products with Python and frameworks like Flask or FastAPI
What we offer
  • Participate in internal training programs (Leadership, Public Speaking, etc.) with full support for AWS and other professional certifications
  • Work with the latest AI tools, premium subscriptions, and the freedom to use them in your daily work
  • Long-term B2B collaboration
  • 100% remote, with flexible hours
  • Collaboration with an international, cross-functional team
  • Comprehensive private medical insurance or budget for your medical needs
  • Paid sick leave, vacation, and public holidays
  • Equipment and all the tech you need for comfortable, productive work
  • Special gifts for weddings, childbirth, and other personal milestones
