CrawlJobs

Command and Data Handling Engineer

NewSpace Technical

Location:
United Kingdom, Reading

Contract Type:
Employment contract

Salary:

Not provided

Job Description:

A high-impact opportunity to join a fast-scaling space start-up building ultra-low Earth orbit satellites, enabled by an innovative propulsion system designed to overcome atmospheric drag. You’ll take ownership of Command & Data Handling (C&DH) – the spacecraft’s central nervous system – bridging on-board computing, avionics, comms, and flight software to deliver reliable telemetry, commanding, and autonomy in a harsh on-orbit environment.

Job Responsibility:

  • Own the end-to-end C&DH architecture (commanding, telemetry, data routing, on-board compute)
  • Define and manage spacecraft interfaces (ICDs) across avionics, payload, comms, and power subsystems
  • Design fault-tolerant command sequencing, mode management, and safe state behaviours
  • Lead integration of on-board data buses and protocols (e.g. CAN, I2C, SPI, UART, SpaceWire)
  • Support test & verification through SIL/HIL, functional testing, and operational readiness
  • Work closely with flight software, AIT, and systems engineering to deliver flight-ready capability
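The fault-tolerant command sequencing and safe-state behaviours listed above can be illustrated with a small, table-driven mode manager. This is a hypothetical sketch, not the company's actual flight software: the mode names, transition table, and fault hook are all assumptions made for illustration.

```python
from enum import Enum, auto

class Mode(Enum):
    """Hypothetical spacecraft operating modes (illustrative only)."""
    LAUNCH = auto()
    SAFE = auto()
    NOMINAL = auto()
    PAYLOAD_OPS = auto()

# Allowed commanded transitions; a fault forces SAFE regardless of this table.
ALLOWED = {
    Mode.LAUNCH: {Mode.SAFE},
    Mode.SAFE: {Mode.NOMINAL},
    Mode.NOMINAL: {Mode.PAYLOAD_OPS, Mode.SAFE},
    Mode.PAYLOAD_OPS: {Mode.NOMINAL, Mode.SAFE},
}

class ModeManager:
    def __init__(self) -> None:
        self.mode = Mode.LAUNCH

    def request(self, target: Mode) -> bool:
        """Commanded transition: honoured only if the table allows it."""
        if target in ALLOWED[self.mode]:
            self.mode = target
            return True
        return False  # reject the command and hold the current mode

    def on_fault(self, source: str) -> None:
        """FDIR hook: any detected fault drops the spacecraft to safe state."""
        self.mode = Mode.SAFE
```

A real C&DH implementation would add persistent fault counters, hysteresis, and telemetry of every rejected transition; the point here is only that transitions are table-driven and the safe state is reachable from every mode.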

Requirements:

  • Strong experience in spacecraft avionics / C&DH / embedded flight systems (or equivalent safety-critical systems)
  • Confidence owning system interfaces and managing integration across multiple subsystems
  • Solid understanding of spacecraft telemetry/telecommand concepts and operational workflows
  • Experience with embedded protocols/buses (CAN, I2C, SPI, UART, etc.)
  • Engineering discipline around verification, documentation, and configuration control
  • Right to work in the UK

Nice to have:

  • Experience with spacecraft OBCs, radios, EPS, payload interfaces
  • Knowledge of autonomy/FDIR concepts and designing for failure
  • Familiarity with AIT workflows, EGSE, acceptance testing and commissioning
  • Prior work in NewSpace or fast-iteration engineering environments

What we offer:
  • Equity
  • Benefits

Additional Information:

Job Posted:
January 22, 2026

Employment Type:
Fulltime
Work Type:
On-site work

Similar Jobs for Command and Data Handling Engineer

Senior Data Engineer

We are looking for a Data Engineer to join our team and support with designing, ...
Salary:
Not provided
Foundever
Expiration Date:
Until further notice
Requirements:
  • Minimum of 7 years' experience in data engineering
  • Track record of deploying and maintaining complex data systems at an enterprise level within regulated environments
  • Expertise in implementing robust data security measures, access controls, and monitoring systems
  • Proficiency in data modeling and database management
  • Strong programming skills in Python and SQL
  • Knowledge of big data technologies like Hadoop, Spark, and NoSQL databases
  • Deep experience with ETL processes and data pipeline development
  • Strong understanding of data warehousing concepts and best practices
  • Experience with cloud platforms such as AWS and Azure
  • Excellent problem-solving skills and attention to detail
Job Responsibility:
  • Design and optimize complex data storage solutions, including data warehouses and data lakes
  • Develop, automate, and maintain data pipelines for efficient and scalable ETL processes
  • Ensure data quality and integrity through data validation, cleansing, and error handling
  • Collaborate with data analysts, machine learning engineers, and software engineers to deliver relevant datasets or data APIs for downstream applications
  • Implement data security measures and access controls to protect sensitive information
  • Monitor data infrastructure for performance and reliability, addressing issues promptly
  • Stay abreast of industry trends and emerging technologies in data engineering
  • Document data pipelines, processes, and best practices for knowledge sharing
  • Lead data governance and compliance efforts to meet regulatory requirements
  • Collaborate with cross-functional teams to drive data-driven decision-making within the organization
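The data quality and integrity responsibilities above (validation, cleansing, error handling) typically come down to a validate-and-quarantine step inside the pipeline. A minimal sketch, assuming made-up record fields (`id`, `amount`) rather than any real Foundever schema:

```python
def validate(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("invalid amount")
    return errors

def run_step(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split the input into clean rows (loaded downstream) and quarantined rows."""
    clean, quarantined = [], []
    for rec in records:
        errs = validate(rec)
        if errs:
            # keep the bad row together with the reasons it failed
            quarantined.append({"record": rec, "errors": errs})
        else:
            clean.append(rec)
    return clean, quarantined
```

In a production pipeline the quarantine table would feed alerting and reprocessing; the shape of the step, validate every row and never silently drop one, is what matters.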
What we offer:
  • Impactful work
  • Professional growth
  • Competitive compensation
  • Collaborative environment
  • Attractive salary and benefits package
  • Continuous learning and development opportunities
  • A supportive team culture with opportunities for occasional travel for training and industry events

Backend Software Engineer - Reference Data Services

The role is for an experienced Software Engineer on the FACT Team at Clear Stree...
Location:
United States, New York
Salary:
200000.00 - 250000.00 USD / Year
Clear Street
Expiration Date:
Until further notice
Requirements:
  • At least eight (8) years of professional experience implementing highly scalable services (we implement our code in Golang)
  • Confidence in designing and building flexible APIs which enable a microservice architecture to reliably deliver consistent data
  • Contributed to systems that deliver solutions to complex business problems that handle massive amounts of data
  • Drawn towards scale, distributed systems, and associated technologies
  • Strong command over object-oriented design patterns, data structures, and algorithms
  • Communicate technical ideas with ease and always look to collaborate to deliver high-quality products
  • Your experience will help you mentor team members, define our engineering standards, and drive a system design approach to building new services
Job Responsibility:
  • Work with a team of passionate and highly collaborative engineers to build out our core Platform
  • Own the design and implementation of new features and services
  • Turn the complexity of processing financial transactions across various asset classes into highly scalable services
  • Tackle non-trivial problems that will challenge you to flex your system design muscles, balance trade-offs, and implement clean, efficient code
  • As a voice of experience on the team, you will help mentor teammates, evolve our technical standards and best practices, and further our culture of system design
What we offer:
  • Competitive compensation packages
  • Company equity
  • 401k matching
  • Gender neutral parental leave
  • Full medical, dental and vision insurance
  • Lunch stipends
  • Fully stocked kitchens
  • Happy hours
  • A great location
  • Amazing views
Employment Type:
Fulltime

Data Operations Engineer

We're seeking an early-career data professional who can support development and ...
Location:
United States
Salary:
65000.00 - 90000.00 USD / Year
Personify Health
Expiration Date:
Until further notice
Requirements:
  • 2-3 years' experience in data engineering, analytics engineering, or a related technical role
  • AWS Certification (or willingness to obtain within 6-12 months), such as AWS Cloud Practitioner or AWS Developer – Associate
  • Experience handling support tickets or operational data issues strongly preferred (~50% of role)
  • Proficiency in Python (required) and SQL, including writing queries, joins, basic transformations, and troubleshooting
  • Hands-on experience with relational databases (PostgreSQL, Oracle, AWS RDS) and familiarity with basic data warehouse concepts
  • Understanding of ETL/ELT pipelines, data validation, and data quality monitoring
  • Basic knowledge of Linux command line for navigating servers and running scripts
  • Some exposure to cloud environments (AWS, Azure) preferred but not required
  • Familiarity with JIRA and Git/Bitbucket for version control and task management
  • Effective written and verbal communication skills with ability to document findings and processes
Job Responsibility:
  • Support data pipelines: Assist in maintaining and troubleshooting ETL/ELT data pipelines used for healthcare and TPA claims processing across on-prem and cloud environments
  • Handle operational support: Manage support tickets (~50% of time), responding to user requests, researching data questions, and helping resolve operational data problems efficiently
  • Work with core technologies: Use Python and SQL to support data extraction, transformation, validation, and loading while monitoring pipeline performance and resolving data issues
  • Monitor and troubleshoot: Review logs, investigate failed jobs, and correct data discrepancies while supporting daily process monitoring including production processes and application performance
  • Maintain data quality: Execute routine data quality checks, maintain documentation, and follow up on accuracy concerns to ensure reliable data across systems
  • Support database operations: Work with data management tasks in systems such as PostgreSQL, Oracle, and cloud-based databases while learning healthcare data formats
  • Collaborate cross-functionally: Partner with Data Analysts, Developers, and business users to understand data needs and support ongoing reporting and data operations
  • Continue learning: Participate in team meetings, sprint activities, and knowledge-sharing sessions while working with senior team members to develop data engineering skills
What we offer:
  • Comprehensive medical and dental coverage through our own health solutions
  • Mental health support and wellness programs designed by experts who get it
  • Flexible work arrangements that fit your life
  • Retirement planning support to help you build real wealth for the future
  • Basic Life and AD&D Insurance plus Short-Term and Long-Term Disability protection
  • Employee savings programs and voluntary benefits like Critical Illness and Hospital Indemnity coverage
  • Professional development opportunities and clear career progression paths
  • Mentorship from industry leaders who want to see you succeed
  • Learning budget to invest in skills that matter to your future
  • Unlimited PTO policy
Employment Type:
Fulltime

Celonis Technical Lead

Sopra Steria, a leading tech company in Europe, is hiring a Celonis Technical Le...
Location:
India, Noida
Salary:
Not provided
Sopra Steria
Expiration Date:
Until further notice
Requirements:
  • Sound knowledge and experience of Process Mining using Celonis
  • Strong experience in programming, preferably Vertica SQL/PQL and Python
  • Experience handling large datasets
  • Working knowledge of data models and data structures
  • Technical expertise with data mining
  • Experience with Time Series Data
  • Ability to codify processes into step-by-step linear commands
  • Experience with data visualization tools such as Power BI and Tableau
  • Professional experience writing performant SQL queries and improving existing code
  • Experience working with relational and non-relational databases
Job Responsibility:
  • Implementation projects for clients from various industries that process data at various levels of complexity
  • Translate complex functional and technical requirements into data models
  • Setup of process data extractions including table and field mappings
  • Estimating and modeling memory requirements for data processing
  • Prepare and connect to On-premise/Cloud source system, extract and transform customer data, and develop process- and customer-specific studies
  • Solicit requirements for Business Process Mining models, including what data they will utilise and how the organisation will use them when they are built
  • Build accurate, reliable, and informative business process mining models to enable the company to expand even more quickly
  • Build the infrastructure required for optimal extraction, transformation and loading of data from disparate data sources
  • Apply analytics and modelling to own and actively drive process improvement projects and initiatives within the relevant function
  • Maintain familiarity with the Celonis platform and write documentation on its technical procedures and processes
What we offer:
  • Inclusive and respectful work environment
  • Open to people with disabilities
Employment Type:
Fulltime

Staff DevOps - Data Platform

We are looking for a Staff DevOps - Data Platform to join the Data and ML Platfo...
Location:
France, Paris
Salary:
Not provided
Doctolib
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience after graduation as a Staff Data Platform Engineer, Staff Data Ops, Staff Site Reliability Engineer, or in a similar role, with a history of architecting and scaling robust data platforms
  • Extensive experience with Google Cloud Platform and a command of Kubernetes & Terraform for automated deployments
  • Authority on implementing network and IAM security best practices
  • Deep technical proficiency in orchestrating data pipelines using Airflow or Dagster, deploying applications to the cloud, and leveraging modern data warehouses such as BigQuery
  • Highly skilled in programming with Python, and have a solid understanding of software development principles
  • Excellent troubleshooter who excels at diagnosing and fixing data infrastructure issues and identifying performance bottlenecks
  • Strong communicator who can articulate complex technical concepts to both technical and non-technical audiences
Job Responsibility:
  • Design and implement enterprise-scale data infrastructure strategies, conducting thorough impact and cost analysis for major technical decisions, and establishing architectural standards across the organization
  • Build and optimize complex, multi-region data pipelines handling petabyte-scale datasets, ensuring 99.9% reliability and implementing advanced monitoring and alerting systems
  • Lead cost analysis initiatives, identify optimization opportunities across our data stack, and implement solutions that reduce infrastructure spend while improving performance and reliability
  • Provide technical guidance to data engineers and cross-functional teams, conduct architecture reviews, and drive adoption of best practices in DataOps, security, and governance
  • Evaluate emerging technologies, conduct proof-of-concepts for new data tools and platforms, and lead the technical roadmap for data infrastructure modernization
What we offer:
  • Free comprehensive health insurance for you and your children
  • Parent Care Program: receive one additional month of leave on top of the legal parental leave
  • Free mental health and coaching services through our partner Moka.care
  • For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
  • Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
  • Work Council subsidy to refund part of sport club membership or creative class
  • Up to 14 days of RTT
  • Lunch voucher with Swile card
Employment Type:
Fulltime

Principal Data Engineer

Our Platform Engineering Team is working to solve the Multiplicity Problem. We a...
Location:
United States, Reston
Salary:
Not provided
Intellibus
Expiration Date:
Until further notice
Requirements:
  • ETL – Experience with ETL processes for data integration
  • SQL – Strong SQL skills for querying and data manipulation
  • Python – Strong command of Python, especially in AWS Boto3, JSON handling, and dictionary operations
  • Unix – Competent in Unix for file operations, searches, and regular expressions
  • AWS – Proficient with AWS services including EC2, Glue, S3, Step Functions, and Lambda for scalable cloud solutions
  • Database Modeling – Solid grasp of database design principles, including logical and physical data models, and change data capture (CDC) mechanisms
  • Snowflake – Experienced in Snowflake for efficient data integration, utilizing features like Snowpipe, Streams, Tasks, and Stored Procedures
  • Airflow – Fundamental knowledge of Airflow for orchestrating complex data workflows and setting up automated pipelines
  • Bachelor's degree in Computer Science, or a related field is preferred. Relevant work experience may be considered in lieu of a degree
  • Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders
Job Responsibility:
  • Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake
  • Implement ETL (Extract, Transform, Load) processes using Snowflake's features such as Snowpipe, Streams, and Tasks
  • Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs
  • Optimize data warehouse performance and scalability using Snowflake features like clustering, partitioning, and materialized views
  • Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs
  • Implement data synchronization processes to ensure consistency and accuracy of data across different systems
  • Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features
  • Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency
  • Work on Snowflake modeling – roles, databases, schemas, ETL tools with cloud-driven skills
  • Work on SQL performance measuring, query tuning, and database tuning
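The change data capture (CDC) mechanism named in the requirements can be pictured as a merge step: an ordered stream of insert/update/delete events is applied to a keyed target table, which is what Snowflake Streams plus a MERGE-running Task automate at warehouse scale. This toy version makes simplifying assumptions (the event shape and keys are invented for illustration):

```python
def apply_cdc(target: dict, events: list[dict]) -> dict:
    """Apply ordered CDC events of the form {'op', 'key', 'row'} to a keyed table."""
    for ev in events:
        if ev["op"] in ("insert", "update"):
            target[ev["key"]] = ev["row"]   # upsert semantics: last write wins
        elif ev["op"] == "delete":
            target.pop(ev["key"], None)     # delete is idempotent if already gone
    return target
```

The two properties worth noting carry over to the real thing: events must be applied in commit order, and making each operation idempotent lets the pipeline safely replay a batch after a failure.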

Senior Software Engineer - Trade Processing Middle Office Platform

As an experienced Staff / Senior Software Engineer, you’ll shape our flagship Mi...
Location:
United States, New York
Salary:
170000.00 - 240000.00 USD / Year
Clear Street
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science or Engineering
  • 10+ years of experience and strong proficiency in Java/Spring Boot, Spring, RDBMS, service-oriented architecture (SOA), and microservice-based server-side application development
  • Strong experience with distributed systems, event-driven architecture, and tools like Kafka
  • Practical knowledge of relational databases (e.g., Postgres) and schema design
  • You have contributed to systems that deliver solutions to complex business problems that handle massive amounts of data
  • You prioritize end user experience and it shows in your API designs, functionality, and performance
  • You have a strong command over design patterns, data structures, and algorithms
  • You have strong problem-solving skills with a keen eye for performance optimization
  • You can clearly explain the nuances of system design and paradigms to engineers and stakeholders
  • Strong understanding of multi-threading, concurrency, and performance tuning
Job Responsibility:
  • Architect and build highly available, horizontally scalable mission critical applications in a modern technology stack
  • Design, build, and optimize core components responsible for processing a high volume of trade data in a low latency environment
  • Solve complex performance and scalability challenges, ensuring our systems handle large-scale financial data efficiently
  • Collaborate with product managers and other engineers to translate financial methodologies into robust software solutions
  • Lead by example in system design discussions, architectural trade-offs, and best practices
  • Mentor team members, contributing to a strong culture of engineering excellence
What we offer:
  • Competitive compensation, benefits, and perks
  • Company equity
  • 401k matching
  • Gender neutral parental leave
  • Full medical, dental and vision insurance
  • Lunch stipends
  • Fully stocked kitchens
  • Happy hours
Employment Type:
Fulltime

Staff DataOps Engineer

We are looking for a Staff DataOps / Platform Engineer to join the Data and ML p...
Location:
France, Paris
Salary:
Not provided
Doctolib
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience after graduation as a Senior Data Platform Engineer, Senior Data Engineer, or in a similar role, with a history of architecting and scaling robust data platforms
  • Extensive experience with Google Cloud Platform and a command of Kubernetes & Terraform for automated deployments
  • Authority on implementing network and IAM security best practices
  • Deep technical proficiency in orchestrating data pipelines using Airflow or Dagster, deploying applications to the cloud, and leveraging modern data warehouses such as BigQuery
  • Highly skilled in programming with Python, and have a solid understanding of software development principles
  • Excellent troubleshooter who excels at diagnosing and fixing data infrastructure issues and identifying performance bottlenecks
  • Strong communicator who can articulate complex technical concepts to both technical and non-technical audiences
Job Responsibility:
  • Design and implement enterprise-scale data infrastructure strategies, conducting thorough impact and cost analysis for major technical decisions, and establishing architectural standards across the organization
  • Build and optimize complex, multi-region data pipelines handling petabyte-scale datasets, ensuring 99.9% reliability and implementing advanced monitoring and alerting systems
  • Lead cost analysis initiatives, identify optimization opportunities across our data stack, and implement solutions that reduce infrastructure spend while improving performance and reliability
  • Provide technical guidance to data engineers and cross-functional teams, conduct architecture reviews, and drive adoption of best practices in DataOps, security, and governance
  • Evaluate emerging technologies, conduct proof-of-concepts for new data tools and platforms, and lead the technical roadmap for data infrastructure modernization
What we offer:
  • Free comprehensive health insurance for you and your children
  • Parent Care Program: receive one additional month of leave on top of the legal parental leave
  • Free mental health and coaching services through our partner Moka.care
  • For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
  • Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
  • Work Council subsidy to refund part of sport club membership or creative class
  • Up to 14 days of RTT
  • Lunch voucher with Swile card
Employment Type:
Fulltime