Stakeholder Intelligence & Data Automation Intern

Veolia

Location:
Hackensack, United States

Contract Type:
Not provided

Salary:
20.00 - 24.00 USD / Hour
Job Description:

We are seeking a highly motivated intern to support the development and ongoing enhancement of a Stakeholder Intelligence Platform designed to aggregate, structure, and maintain publicly available information from regulatory agencies, utilities, municipalities, and related organizations. This internship centers on a real, production-oriented project used to support strategic planning, regulatory awareness, and external affairs; it is not a theoretical exercise. The intern will work with live data, evolving requirements, and real-world constraints typical of regulated infrastructure environments.

Job Responsibility:

  • Support the continued development and refinement of a web-based data aggregation and web-scraping application
  • Perform quality assurance, validation, and cleanup of structured stakeholder data
  • Research and populate missing or incomplete public information
  • Assist in designing and testing automated or semi-automated update requests to source agencies
  • Document data sources, assumptions, workflows, and known gaps
  • Help improve usability and organization of the platform for both read-only and editor users
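As an illustrative sketch only: the posting does not specify the platform's stack or schema, so the field names (`agency_name`, `state`, `website`) and validation rules below are hypothetical. The quality-assurance and cleanup work described above might, in practice, look like normalizing and validating scraped stakeholder records:

```python
# Hypothetical sketch of stakeholder-data QA; the schema and rules are
# assumptions, not taken from the posting.
import re

REQUIRED_FIELDS = ("agency_name", "state", "website")

def clean_record(raw: dict) -> dict:
    """Normalize whitespace and casing on a scraped record."""
    cleaned = {k: " ".join(str(v).split()) for k, v in raw.items() if v is not None}
    if "state" in cleaned:
        cleaned["state"] = cleaned["state"].upper()
    return cleaned

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality issues (empty means the record passes)."""
    issues = [f"missing field: {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    website = record.get("website", "")
    if website and not re.match(r"https?://", website):
        issues.append("website is not a valid http(s) URL")
    return issues

raw = {"agency_name": "  Hackensack  Water Dept ", "state": "nj", "website": "example.gov"}
record = clean_record(raw)
print(record["agency_name"])    # whitespace collapsed
print(validate_record(record))  # flags the malformed website URL
```

Real work on the platform would follow whatever schema and source list the team already maintains; the point of the sketch is the pattern of separating cleanup from validation so that known gaps can be documented rather than silently dropped.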

Requirements:

  • High School Diploma/GED required
  • Working toward a four-year degree with senior status in Computer Science, Data Science, or a related major
  • Cumulative GPA of 3.2 or higher required
  • Strong attention to detail and data accuracy
  • Comfortable working with incomplete or inconsistent real-world data
  • Clear written communication skills
  • Able to work independently and document assumptions
  • Curious about how public institutions, regulators, and infrastructure organizations operate

Additional Information:

Job Posted:
March 04, 2026


Similar Jobs for Stakeholder Intelligence & Data Automation Intern

Data Analyst Intern

IKEA

IKEA's data center serves as the enterprise-level data hub, responsible for the ...

Location:
Shanghai, China

Salary:
Not provided

Expiration Date:
Until further notice

Requirements:
  • Background related to product management or data analysis
  • Hard skills: data analysis abilities (SQL/Python data modeling, Tableau visualization), product thinking (requirement analysis, prototype design)
  • Soft skills: logical framework thinking (closed-loop deduction from data insights to product solutions), cross-departmental collaboration abilities (defining metric standards with IT and business stakeholders)

Job Responsibility:
  • Promoting the upgrade of data products from 'analyst-specific tools' to a 'self-service analysis platform for all' based on AI+BI integration technologies (such as natural language queries, intelligent alerts, and automated modeling)

Data Sciences Assistant Manager

Sopra Steria

The role involves designing and managing BI solutions, integrating data from var...

Location:
Noida, India

Salary:
Not provided

Expiration Date:
Until further notice

Requirements:
  • Bachelor’s/Master’s degree in Computer Science, Data Science, Business Analytics, or a related technical field
  • 6+ years of experience in Business Intelligence, Data Engineering, or Cloud Data Analytics
  • Proficiency in SQL, Python, or data wrangling languages
  • Deep knowledge of BI tools like Power BI, Tableau, or QlikView
  • Strong data modeling, ETL, and data governance capabilities

Job Responsibility:
  • Design, build, and manage end-to-end Business Intelligence solutions, integrating structured and unstructured data from internal and external sources
  • Architect and maintain scalable data pipelines using cloud-native services (e.g., AWS, Azure, GCP)
  • Implement ETL/ELT processes to ensure data quality, transformation, and availability for analytics and reporting
  • Support the Market Intelligence team by building dashboards, visualizations, and data models that reflect competitive, market, and customer insights
  • Work with research analysts to convert qualitative insights into measurable datasets
  • Drive the automation of insight delivery, enabling real-time or near real-time updates
  • Design interactive dashboards and executive-level visual reports using tools such as Power BI or Tableau
  • Maintain data storytelling standards to deliver clear, compelling narratives aligned with strategic objectives
  • Act as a key liaison between business users, strategy teams, research analysts, and IT/cloud engineering
  • Translate analytical and research needs into scalable, sustainable BI solutions

What we offer:
  • Commitment to fighting against all forms of discrimination
  • Inclusive and respectful work environment
  • Positions open to people with disabilities
  • Fulltime

Data Warehousing Specialist

Priamba

Priambasoft LLC seeks a Data Warehousing Specialist who will take up the role “T...

Location:
Iselin, United States

Salary:
Not provided

Expiration Date:
Until further notice

Requirements:
  • Bachelor’s degree in Computer Science or a related discipline, or equivalent work experience required
  • 7-12 years of experience in data modeling, data warehousing, data entity analysis, logical and relational database design, and Business Intelligence tools, or an equivalent combination of education and work experience required

Job Responsibility:
  • Build the infrastructure required for optimal extraction, transformation, and loading from various data sources in the reporting application
  • Build an analytics layer that utilizes the data pipeline to provide valuable insights into company and project performance
  • Work with internal stakeholders to assist with data-related technical and infrastructure support; take ownership of subject datamarts to build the organization's data warehouse
  • Design and create more complex logical/physical data models and data dictionaries that cater to the specific business and functional requirements of applications
  • Provide support for data architecture efforts as needed
  • Develop logical data designs of increasing complexity to deliver stable, flexible, high-performance data ingestion/extraction
  • Develop advanced SQL queries to process the source data
  • Perform data extracts and create complex data reports to analyze business metrics
  • Provide knowledge of enterprise data to assist the business in the creation and definition of internal and external metrics
  • Provide knowledge of existing database environments and make recommendations for opportunities to reduce data redundancy

Data Architect

HotSchedules Corporate

We are seeking a highly skilled Data Architect to join our Enterprise Transforma...

Location:
Sofia, Bulgaria

Salary:
Not provided

Expiration Date:
Until further notice

Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
  • 5+ years in data architecture, database administration, or data engineering, with a focus on enterprise and business systems
  • Strong knowledge of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra)
  • Experience with cloud platforms (e.g., AWS, Azure, GCP – BigQuery, Redshift, Snowflake)
  • Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, dbt, Informatica, Talend, Azure Data Factory, Synapse)
  • Strong knowledge of APIs and automation tools (e.g., Make, Zapier, MS Power Automate)
  • Familiarity with ERP, CRM, and HRIS integrations
  • Programming skills in Python, Java, or Scala
  • Deep understanding of data governance, master data management, and security/compliance (especially GDPR)
  • Excellent analytical, problem-solving, and communication skills

Job Responsibility:
  • Design, develop, and maintain the organization’s overall data architecture to support enterprise-wide business applications, internal reporting, and analytics
  • Create and manage conceptual, logical, and physical data models for organizational data domains (HR, Finance, Sales, Operations, etc.)
  • Define and implement data governance policies, standards, and best practices across the enterprise
  • Oversee ETL/ELT processes and pipelines for integrating data from diverse business systems (ERP, CRM, HRIS, etc.)
  • Collaborate with internal stakeholders (business teams, IT, data engineers) to align data initiatives with organizational objectives
  • Optimize performance, cost, and scalability of data warehouses and internal reporting systems
  • Evaluate and recommend tools and platforms to enhance internal data and business application efficiency
  • Ensure compliance with GDPR and other relevant data security/privacy regulations
  • Take responsibility for the successful design and execution of data-related programs and projects

What we offer:
  • 25+ days off, as well as a birthday day off and 4 charity days off per year
  • Flexible start and end of the working day and hybrid working mode, combining remote and in-office work
  • Team-centric atmosphere
  • Encouragement of a healthy lifestyle and work-life balance, including supplemental health insurance
  • New parents bonus scheme
  • Fulltime

Lead Analytics Engineer

Adevinta

As a Lead Analytics Engineer, you'll join a large impactful team of data & analy...

Location:
Amsterdam, Netherlands

Salary:
Not provided

Expiration Date:
Until further notice

Requirements:
  • 12+ years of experience in analytics engineering, business intelligence, or data engineering, with a strong focus on data modeling and transformation
  • Solid hands-on experience using SQL and Python, with a track record of building and maintaining production-level data models
  • Strong experience with a modern cloud data warehouse like Databricks, Google BigQuery, Snowflake, or Amazon Redshift
  • Ability to lead architectural and technical decisions for secure, performant, and scalable data products
  • Deep expertise with a modern data transformation tool like dbt (data build tool)
  • Proven experience in data warehouse design and development, using dimensional modeling and other techniques for business intelligence
  • Experience in AWS Cloud usage and data management (automation, data governance, cost optimisation, delivering reliable and scalable data solutions)
  • Ability to ensure data quality, schema governance, and monitoring across pipelines
  • Experience with orchestrators such as Airflow, Kubeflow, and Databricks Workflows
  • Ability to drive the creation of reusable data products across multiple parts of the organisation

Job Responsibility:
  • Lead our analytics engineering function, acting as the bridge between our core data pipelines and the business intelligence needs of our stakeholders
  • Build and scale our data models
  • Mentor the team, set best practices, and drive the technical roadmap
  • Deliver high-quality, well-documented solutions that directly serve our internal customers
  • Lead by example, elevate engineering practices, and drive technical excellence across domains

What we offer:
  • An attractive base salary
  • Participation in our Short Term Incentive plan (annual bonus)
  • Work From Anywhere: enjoy up to 20 days a year of working from anywhere
  • A 24/7 Employee Assistance Program for you and your family
  • A collaborative environment with an opportunity to explore your potential and grow
  • Fulltime

Director of Data Engineering and Agentic AI Automation

OpenAI

We are looking for a Director of Data Engineering and Agentic AI Automation to l...

Location:
San Francisco, United States

Salary:
347000.00 - 490000.00 USD / Year

Expiration Date:
Until further notice

Requirements:
  • 12+ years in data engineering, with proven experience building and managing enterprise-scale, auditable ETL pipelines and complex datasets
  • Proficiency in SQL and Python, with demonstrated experience in schema design, data modeling, and orchestration frameworks
  • Expertise in distributed data processing technologies such as Apache Spark, Kafka, and cloud-native storage (e.g., S3, ADLS)
  • Deep knowledge of enterprise data architecture, especially within Finance and Supply Chain
  • Familiarity with financial processes (close, allocations, revenue recognition) and supply chain data models (supply and demand planning, procurement, vendor master), along with experience ingesting large volumes of B2C data from internal engineering systems
  • Strong track record of partnering with senior business stakeholders and translating complex requirements into scalable technical solutions

Job Responsibility:
  • Build and maintain scalable, auditable data infrastructure that powers accurate financial information, with a focus on revenue recognition, compute attribution, and close automation
  • Lead and grow teams of analytics engineers, data engineers, and AI engineers to deliver high-impact, intelligent data systems
  • Guide work across financial close and allocations automation, B2C revenue automation from engineering systems to ERP (including reconciliation with cash and source systems), and other mission-critical financial processes
  • Design and implement data pipelines connecting ERP, planning, and operational systems, including Oracle Fusion, Anaplan, and Workday
  • Build and support scalable, audit-proof architecture that enables reliable financial reporting and compliance
  • Develop data- and AI-powered workflows that enhance forecasting accuracy, compliance automation, and operational efficiency
  • Create and maintain data marts and products that support stakeholders across Revenue, FP&A, Tax, Procurement, Hardware Accounting, and Controller teams
  • Define and enforce best practices for data modeling, lineage, observability, and reconciliation across finance data domains
  • Set the technical direction and manage team structure, mentoring engineers and overseeing contractors or system integrators to ensure delivery of high-quality outcomes
  • Partner with senior leaders across Finance, Engineering, and Infrastructure to align on priorities and integrate new automation capabilities

What we offer:
  • Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts
  • Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)
  • 401(k) retirement plan with employer match
  • Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)
  • Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees
  • 13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)
  • Mental health and wellness support
  • Employer-paid basic life and disability coverage
  • Annual learning and development stipend to fuel your professional growth
  • Daily meals in our offices, and meal delivery credits as eligible
  • Fulltime

Lead Data Engineer

Cigres

Location:
Bengaluru, India

Salary:
Not provided

Expiration Date:
Until further notice

Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or another quantitative field
  • At least 8 years working as a data engineer supporting large data transformation initiatives related to machine learning, with experience building and optimizing pipelines and data sets
  • Strong analytic skills related to working with unstructured datasets
  • Experience with Azure cloud services: ADF, ADLS, HDInsight, Databricks, App Insights, etc.
  • Experience handling ETL using Spark
  • Experience with object-oriented/object function scripting languages: Python, PySpark, etc.
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • A good team player, committed to the success of the team and the overall project

Job Responsibility:
  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional and non-functional requirements
  • Design the right schema to support the functional requirements and consumption pattern
  • Design and build production data pipelines from ingestion to consumption
  • Create the necessary preprocessing and postprocessing for various forms of data for training/retraining and inference ingestion as required
  • Create data visualization and business intelligence tools that give stakeholders and data scientists the necessary business and solution insights
  • Identify, design, and implement internal process improvements: automating manual data processes, optimizing data delivery, etc.
  • Ensure our data is separated and secure across national boundaries through multiple data centers
  • Fulltime