CrawlJobs

Engineering Data Controller

Crusoe (crusoe.ai)

Location:
United States, Tulsa

Contract Type:
Not provided

Salary:

50,000 - 60,000 USD / Year

Job Description:

Join Crusoe as an Engineering Data Controller, an early-career engineering role that calls for a proactive, detail-oriented mindset. You will provide essential support to the Lead Process Engineer within a growing Engineering Services department, serving as the primary custodian of our manufacturing documentation, data systems, and project coordination, and as the bridge of communication and information between design engineering and every other department at Crusoe. Your work directly contributes to the operational efficiency and integrity of our entire manufacturing process. The ideal candidate is eager to learn, highly organized, and committed to maintaining high standards of data accuracy and process adherence. You will gain broad exposure to project creation and management at a variety of scales.

Job Responsibility:

  • Item Master Support & BOM Management: Serve as the primary custodian of the company's Item Master database, managing the creation, revision, and deprecation of all part numbers and raw materials
  • Maintain the Bill of Materials (BOM) for all engineered designs, working directly with the engineering team to translate designs into a structured BOM format for the MRP system
  • Engineering Change Control (ECOs): Process and implement Engineering Change Orders (ECOs) in a timely and controlled manner, ensuring all related documents and systems are updated accurately to reflect the changes
  • Technical Document Control: Manage a centralized system for all technical documents, including engineering drawings, diagrams, O&M manuals, and SOPs
  • Ensure all documents are correctly versioned, indexed, and easily accessible to authorized personnel
  • Document Distribution & Approval: Issue and distribute all official engineering documents to internal departments and external partners
  • Coordinate the review and approval process for all new or revised technical documents, ensuring compliance with company standards and industry regulations
  • Cross-Functional Collaboration: Act as a critical liaison between the Engineering, Production, Supply Chain, and Quality departments to ensure seamless communication and data flow
  • Relay feedback from external departments back to Engineering to facilitate design and MRP improvements
  • Auditing and Reporting: Regularly audit the Item Master and BOM databases to identify and correct discrepancies proactively
  • Generate reports on data accuracy, ECO status, and document revision histories to support management decision-making and develop Key Performance Indicators (KPIs) related to data integrity
  • Process Improvement & Training: Identify opportunities for process improvement within data management and document control systems
  • Assist in the selection, implementation, and maintenance of new Product Lifecycle Management (PLM) or Enterprise Resource Planning (ERP) modules and develop training for team members on proper data entry and document control procedures

Requirements:

  • An Associate's or Bachelor's degree in Engineering, Information Systems, or a related technical field
  • OR a minimum of 2 years of experience in a manufacturing or engineering environment, with a focus on data management or document control
  • Strong proficiency with MRP/ERP systems (e.g., Acumatica, SAP, NetSuite) and familiarity with Product Lifecycle Management (PLM) software
  • Excellent attention to detail, a high degree of accuracy, and strong analytical and problem-solving skills
  • Excellent written and verbal communication skills for effective cross-functional liaison and documentation
  • The ability to work independently, manage multiple priorities in a fast-paced environment, and demonstrate an eagerness to learn and take initiative

Nice to have:

  • Direct experience managing a database for an Item Master or Bill of Materials (BOM)
  • Experience in a role supporting a major ERP or PLM system implementation
  • Specific experience processing Engineering Change Orders (ECOs) in a controlled manufacturing environment

What we offer:
  • Restricted Stock Units in a fast growing, well-funded technology company
  • Health insurance package options that include HDHP and PPO, vision, and dental for you and your dependents
  • Employer contributions to HSA accounts
  • Paid Parental Leave
  • Paid life insurance, short-term and long-term disability
  • Teladoc
  • 401(k) with a 100% match up to 4% of salary
  • Generous paid time off and holiday schedule
  • Cell phone reimbursement
  • Tuition reimbursement
  • Subscription to the Calm app
  • MetLife Legal
  • Company-paid commuter benefit ($300 per month)

Additional Information:

Job Posted:
January 19, 2026

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Engineering Data Controller

Software Engineer (Data Engineering)

We are seeking a Software Engineer (Data Engineering) who can seamlessly integra...
Location: India, Hyderabad
Salary: Not provided
Company: NStarX (nstarxinc.com)
Expiration Date: Until further notice

Requirements:
  • 4+ years in Data Engineering and AI/ML roles
  • Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field
  • Python, SQL, Bash, PySpark, Spark SQL, boto3, pandas
  • Apache Spark on EMR (driver/executor model, sizing, dynamic allocation)
  • Amazon S3 (Parquet) with lifecycle management to Glacier
  • AWS Glue Catalog and Crawlers
  • AWS Step Functions, AWS Lambda, Amazon EventBridge
  • CloudWatch Logs and Metrics, Kinesis Data Firehose (or Kafka/MSK)
  • Amazon Redshift and Redshift Spectrum
  • IAM (least privilege), Secrets Manager, SSM

Job Responsibility:
  • Design, build, and maintain scalable ETL and ELT pipelines for large-scale data processing
  • Develop and optimize data architectures supporting analytics and ML workflows
  • Ensure data integrity, security, and compliance with organizational and industry standards
  • Collaborate with DevOps teams to deploy and monitor data pipelines in production environments
  • Build predictive and prescriptive models leveraging AI and ML techniques
  • Develop and deploy machine learning and deep learning models using TensorFlow, PyTorch, or Scikit-learn
  • Perform feature engineering, statistical analysis, and data preprocessing
  • Continuously monitor and optimize models for accuracy and scalability
  • Integrate AI-driven insights into business processes and strategies
  • Serve as the technical liaison between NStarX and client teams

What we offer:
  • Competitive salary and performance-based incentives
  • Opportunity to work on cutting-edge AI and ML projects
  • Exposure to global clients and international project delivery
  • Continuous learning and professional development opportunities
  • Competitive base + commission
  • Fast growth into leadership roles

Employment Type: Full-time

Data Engineer, Enterprise Data, Analytics and Innovation

Are you passionate about building robust data infrastructure and enabling innova...
Location: United States
Salary: 110,000 - 125,000 USD / Year
Company: Vaniam Group (vaniamgroup.com)
Expiration Date: Until further notice

Requirements:
  • 5+ years of professional experience in data engineering, ETL, or related roles
  • Strong proficiency in Python and SQL for data engineering
  • Hands-on experience building and maintaining pipelines in a lakehouse or modern data platform
  • Practical understanding of Medallion architectures and layered data design
  • Familiarity with modern data stack tools, including: Spark or PySpark
  • Workflow orchestration (Airflow, dbt, or similar)
  • Testing and observability frameworks
  • Containers (Docker) and Git-based version control
  • Excellent communication skills, problem-solving mindset, and a collaborative approach

Job Responsibility:
  • Design, build, and operate reliable ETL and ELT pipelines in Python and SQL
  • Manage ingestion into Bronze, standardization and quality in Silver, and curated serving in Gold layers of our Medallion architecture
  • Maintain ingestion from transactional MySQL systems into Vaniam Core to keep production data flows seamless
  • Implement observability, data quality checks, and lineage tracking to ensure trust in all downstream datasets
  • Develop schemas, tables, and views optimized for analytics, APIs, and product use cases
  • Apply and enforce best practices for security, privacy, compliance, and access control, ensuring data integrity across sensitive healthcare domains
  • Maintain clear and consistent documentation for datasets, pipelines, and operating procedures
  • Lead the integration of third-party datasets, client-provided sources, and new product-generated data into Vaniam Core
  • Partner with product and innovation teams to build repeatable processes for onboarding new data streams
  • Ensure harmonization, normalization, and governance across varied data types (scientific, engagement, operational)

What we offer:
  • 100% remote environment with opportunities for local meet-ups
  • Positive, diverse, and supportive culture
  • Passionate about serving clients focused on Cancer and Blood diseases
  • Investment in you with opportunities for professional growth and personal development through Vaniam Group University
  • Health benefits – medical, dental, vision
  • Generous parental leave benefit
  • Focused on your financial future with a 401(k) Plan and company match
  • Work-Life Balance and Flexibility
  • Flexible Time Off policy for rest and relaxation
  • Volunteer Time Off for community involvement

Employment Type: Full-time

Data Engineering & Analytics Lead

Premium Health is seeking a highly skilled, hands-on Data Engineering & Analytic...
Location: United States, Brooklyn
Salary: Not provided
Company: Premium Health (premiumhealth.org)
Expiration Date: Until further notice

Requirements:
  • Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred
  • Proven track record and progressively responsible experience in data engineering, data architecture, or related technical roles; healthcare experience preferred
  • Strong knowledge of data engineering principles, data integration, ETL processes, and semantic mapping techniques and best practices
  • Experience implementing data quality management processes, data governance frameworks, cataloging, and master data management concepts
  • Familiarity with healthcare data standards (e.g., HL7, FHIR, etc), health information management principles, and regulatory requirements (e.g., HIPAA)
  • Understanding of healthcare data, including clinical, operational, and financial data models, preferred
  • Advanced proficiency in SQL, data modeling, database design, optimization, and performance tuning
  • Experience designing and integrating data from disparate systems into harmonized data models or semantic layers
  • Hands-on experience with modern cloud-based data platforms (e.g., Azure, AWS, GCP)

Job Responsibility:
  • Collaborate with the CDIO and Director of Technology to define a clear data vision aligned with the organization's goals and execute the enterprise data roadmap
  • Serve as a thought leader for data engineering and analytics, guiding the evolution of our data ecosystem and championing data-driven decision-making across the organization
  • Build and mentor a small data team, providing technical direction and performance feedback, fostering best practices and continuous learning, while remaining a hands-on implementor
  • Define and implement best practices, standards, and processes for data engineering, analytics, and data management across the organization
  • Design, implement, and maintain a scalable, reliable, and high-performing modern data infrastructure, aligned with the organizational needs and industry best practices
  • Architect and maintain data lake/lakehouse, warehouse, and related platform components to support analytics, reporting, and operational use cases
  • Establish and enforce data architecture standards, governance models, naming conventions, and documentation
  • Develop, optimize, and maintain scalable ETL/ELT pipelines and data workflows to collect, transform, normalize, and integrate data from diverse systems
  • Implement robust data quality processes, validation, monitoring, and error-handling frameworks
  • Ensure data is accurate, timely, secure, and ready for self-service analytics and downstream applications

What we offer:
  • Paid Time Off, Medical, Dental and Vision plans, Retirement plans
  • Public Service Loan Forgiveness (PSLF)

Employment Type: Full-time

Data Engineer

We are seeking a skilled and innovative Data Engineer to join our team in Nieuwe...
Location: Netherlands, Nieuwegein
Salary: 3,000 - 6,000 EUR / Month
Company: Sopra Steria (soprasteria.com)
Expiration Date: Until further notice

Requirements:
  • BSc or MSc degree in IT or a related field
  • Minimum of 2 years of relevant work experience in data engineering
  • Proficiency in building data pipelines using tools such as Azure Data Factory, Informatica Cloud, Synapse Pro, Spark, Python, R, Kubernetes, Snowflake, Databricks, or AWS
  • Advanced SQL knowledge and experience with relational databases
  • Hands-on experience in data modelling and data integration (both on-premise and cloud-based)
  • Strong problem-solving skills and analytical mindset
  • Knowledge of data warehousing concepts and big data technologies
  • Experience with version control systems, preferably Git
  • Excellent communication skills and ability to work collaboratively in a team environment
  • Fluency in Dutch language (required)

Job Responsibility:
  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes
  • Collaborate with Information Analysts to provide technical frameworks for business requirements of medium complexity
  • Contribute to architecture discussions and identify potential technical and process bottlenecks
  • Implement data quality checks and ensure data integrity throughout the data lifecycle
  • Optimise data storage and retrieval systems for improved performance
  • Work closely with cross-functional teams to understand data needs and deliver efficient solutions
  • Stay up-to-date with emerging technologies and best practices in data engineering
  • Troubleshoot and resolve data-related issues in a timely manner
  • Document data processes, architectures, and workflows for knowledge sharing and future reference

What we offer:
  • A permanent contract and a gross monthly salary between €3,000 and €6,000 (based on 40 hours per week)
  • 8% holiday allowance
  • A generous mobility budget, including options such as an electric lease car with an NS Business Card, a lease bike, or alternative transportation that best suits your travel needs
  • 8% profit sharing on target (or a fixed OTB amount, depending on the role)
  • 27 paid vacation days
  • A flex benefits budget of €1,800 per year, plus an additional percentage of your salary. This can be used for things like purchasing extra vacation days or contributing more to your pension
  • A home office setup with a laptop, phone, and a monthly internet allowance
  • Hybrid working: from home or at the office, depending on what works best for you
  • Development opportunities through training, knowledge-sharing sessions, and inspiring (networking) events
  • Social activities with colleagues — from casual drinks to sports and content-driven outings

Employment Type: Full-time

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location: Netherlands, Leiden
Salary: Not provided
Company: IKEA (ikea.com)
Expiration Date: Until further notice

Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience driving new data engineering developments (e.g., applying cutting-edge data engineering methods to improve data integration performance, or adopting new tools to improve data quality)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience to deploy code artifacts into production

Job Responsibility:
  • Contribute to the development of D&A platform and analytical tools, ensuring easy and standardized access and sharing of data
  • Act as subject matter expert for Azure Databricks, Azure Data Factory, and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how & standard available
  • Ensure pipelines are consistent with relevant digital frameworks, principles, guidelines, and standards
  • Support in understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility on data quality and data assets on the D&A platform
  • Identify opportunities for data assets and D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team

What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning

Employment Type: Full-time

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location: India, Pune
Salary: Not provided
Company: Citi (citi.com)
Expiration Date: Until further notice

Requirements:
  • First Class Degree in Engineering/Technology/MCA
  • 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend, or Informatica
  • Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing

Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models

Employment Type: Full-time

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location: India, Pune
Salary: Not provided
Company: Citi (citi.com)
Expiration Date: Until further notice

Requirements:
  • First Class Degree in Engineering/Technology/MCA
  • 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, or Informatica
  • Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing

Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models

Employment Type: Full-time

Staff Data Engineer

We’re looking for a Staff Data Engineer to own the design, scalability, and reli...
Location: United States, San Jose
Salary: 150,000 - 250,000 USD / Year
Company: Figure (figure.ai)
Expiration Date: Until further notice

Requirements:
  • Experience owning or architecting large-scale data platforms — ideally in EV, autonomous driving, or robotics fleet environments, where telemetry, sensor data, and system metrics are core to product decisions
  • Deep expertise in data engineering and architecture (data modeling, ETL orchestration, schema design, transformation frameworks)
  • Strong foundation in Python, SQL, and modern data stacks (dbt, Airflow, Kafka, Spark, BigQuery, ClickHouse, or Snowflake)
  • Experience building data quality, validation, and observability systems to detect regressions, schema drift, and missing data
  • Excellent communication skills — able to understand technical needs from domain experts (controls, perception, operations) and translate complex data patterns into clear, actionable insights for engineers and leadership
  • First-principles understanding of electrical and mechanical systems, including motors, actuators, encoders, and control loops

Job Responsibility:
  • Architect and evolve Figure’s end-to-end platform data pipeline — from robot telemetry ingestion to warehouse transformation and visualization
  • Improve and maintain existing ETL/ELT pipelines for scalability, reliability, and observability
  • Detect and mitigate data regressions, schema drift, and missing data via validation and anomaly-detection frameworks
  • Identify and close gaps in data coverage, ensuring high-fidelity metrics coverage across releases and subsystems
  • Define the tech stack and architecture for the next generation of our data warehouse, transformation framework, and monitoring layer
  • Collaborate with robotics domain experts (controls, perception, Guardian, fall-prevention) to turn raw telemetry into structured metrics that drive engineering/business decisions
  • Partner with fleet management, operators, and leadership to design and communicate fleet-level KPIs, trends, and regressions in clear, actionable ways
  • Enable self-service access to clean, documented datasets for engineers
  • Develop tools and interfaces that make fleet data accessible and explorable for engineers without deep data backgrounds

Employment Type: Full-time