
Senior Software Engineer - Cloud Data Storage


Temporal

Location:
United States

Contract Type:
Not provided

Salary:
180000.00 - 225000.00 USD / Year

Job Description:

Cloud Data Store (CDS) owns the storage, retrieval, and lifecycle of all workflow data at planet scale. We design the persistence APIs, build storage abstractions that run across cloud vendors, and deliver the observability that lets customers trust their state machines for years. As a Senior Software Engineer, you will design, build, and maintain significant portions of our backend functionality for highly scalable, multi-tenant services. You’ll own the custom persistence stack for Temporal Cloud, which includes a write-ahead log, metadata stores (Cassandra, etcd), multi-level caches, and tiered storage.
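The persistence stack described above is anchored by a write-ahead log. As a rough sketch of the WAL idea only — not Temporal's actual implementation; `MiniWAL` and its record layout are invented for illustration — here is a minimal append-only log with length-prefixed, CRC-checked records and a replay path that discards a torn tail:

```python
import json
import os
import struct
import zlib


class MiniWAL:
    """Toy append-only write-ahead log (illustrative only).

    Record layout: [4-byte big-endian length][4-byte CRC32][JSON payload].
    Appends are fsynced so acknowledged records survive a crash; replay
    stops at the first torn or corrupt record, discarding the damaged tail.
    """

    def __init__(self, path: str):
        self.path = path
        self.f = open(path, "ab")  # append-only; creates the file if absent

    def append(self, record: dict) -> None:
        payload = json.dumps(record).encode("utf-8")
        header = struct.pack(">II", len(payload), zlib.crc32(payload))
        self.f.write(header + payload)
        self.f.flush()
        os.fsync(self.f.fileno())  # durability point: record is now on disk

    def close(self) -> None:
        self.f.close()

    @staticmethod
    def replay(path: str) -> list:
        """Return all intact records in append order."""
        records = []
        with open(path, "rb") as f:
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break  # clean EOF or torn header
                length, crc = struct.unpack(">II", header)
                payload = f.read(length)
                if len(payload) < length or zlib.crc32(payload) != crc:
                    break  # torn or corrupt tail record: drop it
                records.append(json.loads(payload))
        return records
```

Real systems layer group commit, segment rotation, and checkpointing on top of this basic append/replay contract; the per-record CRC is what lets recovery distinguish a half-written tail from valid data.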

Job Responsibility:

  • Design & build distributed data systems – craft APIs, schemas, and replication paths that keep petabytes of workflow history durable and queryable
  • Drive reliability & performance – own SLOs, create chaos-test plans, profile hot paths, and lead incident reviews
  • Technical leadership – break down roadmap epics, mentor mid-level engineers, steward design docs through RFC
  • Cross-team collaboration – partner with the Server, Cloud, and DX teams to land features end-to-end

Requirements:

  • 5+ years of experience designing and/or building and enhancing highly scalable distributed systems
  • Solid computer science fundamentals in distributed systems concepts including multi-threading and concurrency
  • Experience writing concurrent production code at an advanced or expert level in Go, Java, or other applicable languages
  • Experience building and running services on AWS

Nice to have:

  • Prior contributions to Temporal, Cadence, or other workflow engines
  • Deep expertise in a storage domain (LSM trees, columnar stores, transactional logs, etc.)
  • Operated multi-region, ≥99.99% uptime services
  • Experience working with Open Source Systems
  • Experience in building K8s controllers and/or CRDs is a plus

What we offer:

  • Unlimited PTO, 12 Holidays + 2 Floating Holidays
  • 100% Premiums Coverage for Medical, Dental, and Vision
  • AD&D, LT & ST Disability, and Life Insurance (Standard & Supplemental Available)
  • Empower 401K Plan
  • $3,600 / Year Work from Home Meals
  • $1,500 / Year Career Development & Learning
  • $1,200 / Year Lifestyle Spending Account
  • $1,000 / Year In-Home Office Setup
  • $500 / Year Professional Memberships
  • $74 / Month Reimbursement for Internet
  • Calm App Subscription for Mental Health & Wellness

Additional Information:

Job Posted:
January 04, 2026

Employment Type:
Fulltime

Work Type:
Remote work

Similar Jobs for Senior Software Engineer - Cloud Data Storage

Senior Software Engineer

Axis Security - Acquired by HPE Aruba is seeking a highly skilled and motivated ...
Location:
Israel, Tel Aviv
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional software development experience
  • Proficiency in one or more languages such as C#, JavaScript/TypeScript, or Go
  • Experience with frameworks such as .NET Core & React
  • Strong understanding of relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Redis) databases
  • Strong experience in building RESTful APIs and microservices architectures
  • Experience working with one of the leading vendors for big data processing, analytics, and storage (Advantage)
  • Experience with AWS, Azure, or Google Cloud Platform (GCP) (Advantage)
  • Understanding of secure coding practices and data protection regulations (Advantage)
  • Experience with unit testing, integration testing, and automated testing frameworks (Advantage)
  • Experience with Docker, Kubernetes, Gitlab, or other CI/CD tools (Advantage)
Job Responsibility:
  • Design, develop, test, and maintain robust, scalable, and high-quality software applications
  • Contribute to architectural decisions, ensuring efficient system design and implementation
  • Design and optimize data pipelines, integrating structured and unstructured data sources into data lakes
  • Write clean, maintainable, and well-documented code while enforcing coding standards and best practices (SOLID principles, TDD, CI/CD)
  • Identify bottlenecks and optimize application performance, scalability, and security
  • Mentor junior developers, conduct code reviews, and promote knowledge sharing within the team
  • Work closely with product managers, designers, DevOps, and QA teams to deliver high-quality software solutions
  • Troubleshoot and resolve complex technical issues across different components of the software stack
  • Participate in Agile methodologies, including sprint planning, daily stand-ups, and retrospectives
What we offer:
  • Health & Wellbeing
  • Personal & Professional Development
  • Diversity, Inclusion & Belonging
  • Fulltime

Senior Software Engineer - Transactional Data Platform

As a Senior Software Engineer, you will play a critical role in designing, build...
Location:
Australia, Sydney
Salary:
Not provided
Atlassian
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related technical field
  • 5+ years of experience in backend software development
  • 3+ years of hands-on experience working with AWS cloud services, particularly AWS storage technologies (S3, DynamoDB, EBS, EFS, FSx, or Glacier)
  • 3+ years of experience in designing and developing distributed systems or high-scale backend services
  • Strong programming skills in Kotlin
  • Experience working in agile environments following DevOps and CI/CD best practices
  • Strong Backend Development Skills
  • Proficiency in Kotlin, Java for backend development
  • Experience building high-performance, scalable microservices and APIs
  • Strong understanding of RESTful APIs, gRPC, and event-driven architectures
Job Responsibility:
  • Designing, building, and optimizing high-performance, scalable, and resilient backend storage solutions on AWS cloud infrastructure
  • Developing distributed storage systems, APIs, and backend services that power mission-critical applications, ensuring low-latency, high-throughput, and fault-tolerant data storage
  • Collaborating closely with principal engineers, architects, SREs, and product teams to define technical roadmaps, improve storage efficiency, and optimize access patterns
  • Driving performance tuning, data modeling, caching strategies, and cost optimization across AWS storage services like S3, DynamoDB, EBS, EFS, FSx, and Glacier
  • Contributing to infrastructure automation, security best practices, and monitoring strategies using tools like Terraform, CloudWatch, Prometheus, and OpenTelemetry
  • Troubleshooting and resolving production incidents related to data integrity, latency spikes, and storage failures, ensuring high availability and disaster recovery preparedness
  • Mentoring junior engineers, participating in design reviews and architectural discussions, and advocating for engineering best practices such as CI/CD automation, infrastructure as code, and observability-driven development
What we offer:
  • Atlassians can choose where they work – whether in an office, from home, or a combination of the two
  • Flexibility for eligible candidates to work remotely across the West US
  • Fulltime

Senior Software Engineer

At Atlassian, we're motivated by a common goal: to unleash the potential of ever...
Location:
India
Salary:
Not provided
Atlassian
Expiration Date:
Until further notice
Requirements:
  • Solid understanding and experience in building RESTful APIs and microservices, e.g. with Flask or Spring Boot
  • Experience with Big Data processing and storage technologies such as Spark, DBT
  • Built solutions using public cloud offerings such as Amazon Web Services
  • SQL knowledge
  • Experience with test automation and ensuring data quality across multiple datasets used for analytical purposes
  • Experience with continuous delivery, continuous integration, and source control system such as Git
  • Expert level programming skills in OO Programming language like Java, Kotlin, Scala or Python
  • Deep understanding of big data challenges
  • Degree in Computer Science, EE, or related STEM discipline
What we offer:
  • health coverage
  • paid volunteer days
  • wellness resources
  • Fulltime

Senior Software Engineer (Data Platform)

Atlassians can choose where they work – whether in an office, from home, or a co...
Location:
India, Remote
Salary:
Not provided
Atlassian
Expiration Date:
Until further notice
Requirements:
  • Solid understanding and experience in building RESTful APIs and microservices, e.g. with Flask, Spring Boot, etc.
  • Experience with Big Data processing and storage technologies such as Spark, DBT
  • Built solutions using public cloud offerings such as Amazon Web Services
  • SQL knowledge
  • Experience with test automation and ensuring data quality across multiple datasets used for analytical purposes
  • Experience with continuous delivery, continuous integration, and source control system such as Git
  • Expert level programming skills in OO Programming language like Java, Kotlin, Scala or Python
  • Deep understanding of big data challenges
  • Degree in Computer Science, EE, or related STEM discipline
What we offer:
  • health coverage
  • paid volunteer days
  • wellness resources
  • Fulltime

Senior Data Engineer

Within a dynamic, high-level team, you will contribute to both R&D and client pr...
Location:
France, Paris
Salary:
Not provided
Artelys
Expiration Date:
Until further notice
Requirements:
  • Degree from a top engineering school or a high-level university program
  • At least 3 years of experience in designing and developing data-driven solutions with high business impact, particularly in industrial or large-scale environments
  • Excellent command of Python for both application development and data processing, with strong expertise in libraries such as Pandas, Polars, NumPy, and the broader Python Data ecosystem
  • Experience implementing data processing pipelines using tools like Apache Airflow, Databricks, Dask, or flow orchestrators integrated into production environments
  • Contributed to large-scale projects combining data analysis, workflow orchestration, back-end development (REST APIs and/or Messaging), and industrialisation, within a DevOps/DevSecOps-oriented framework
  • Proficient in using Docker for processing encapsulation and deployment
  • Experience with Kubernetes for orchestrating workloads in cloud-native architectures
  • Motivated by practical applications of data in socially valuable sectors such as energy, mobility, or health, and thrives in environments where autonomy, rigour, curiosity, and teamwork are valued
  • Fluency in English and French is required
Job Responsibility:
  • Design and develop innovative and high-performance software solutions addressing industrial challenges, primarily using the Python language and a microservices architecture
  • Gather user and business needs to design data collection and storage solutions best suited to the presented use cases
  • Develop technical solutions for data collection, cleaning, and processing, then industrialise and automate them
  • Contribute to setting up technical architectures based on Data or even Big Data environments
  • Carry out development work aimed at industrialising and orchestrating computations (statistical and optimisation models) and participate in software testing and qualification
What we offer:
  • Up to 2 days of remote work per week possible
  • Flexible working hours
  • Offices located in the city center of each city where we are located
  • Fulltime

Senior Data Engineer

Figure is an AI Robotics company developing a general-purpose humanoid. Our huma...
Location:
United States, San Jose
Salary:
140000.00 - 350000.00 USD / Year
Figure
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master’s degree in Computer Science, Data Engineering, or a related field
  • 3+ years of experience in data engineering, preferably with time-series or log data processing
  • Proficiency in Python with experience in Pandas, Polars, or PySpark for large-scale data processing
  • Strong understanding of database design, indexing, and query optimization (SQL and NoSQL)
  • Experience handling complex data formats such as Parquet, MCAP, or protobuf
  • Experience building custom web based data visualization tools (JavaScript, React…)
  • Familiarity with data visualization tools like Grafana for real-time analysis and monitoring
  • Experience with distributed computing frameworks and cloud-based data storage solutions
  • Strong debugging skills and ability to work with lab teams to interpret robotic system logs
Job Responsibility:
  • Develop and maintain pipelines and tools that transform robot logs, making them easier to access and visualize and enabling automatic detection of events of interest
  • Optimize data processing to reduce the time needed between data offload and the availability of the data to our engineering teams
  • Design and optimize data storage solutions for handling complex, high-volume time-series and structured data
  • Build and maintain database schemas and queries to support analytics and visualization of extracted patterns
  • Support mechanical, electrical, software, integration and test engineers with their needs to extract and visualize data
  • Develop dashboards and custom data visualization tools to enable engineers to quickly extract information from the data and track robot performance
  • Integrate your solutions with existing data pipelines and our robot testing framework
  • Fulltime

Senior Data Engineer

Kiddom is redefining how technology powers learning. We combine world-class curr...
Location:
United States, San Francisco
Salary:
150000.00 - 220000.00 USD / Year
Kiddom
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience as a data engineer
  • 8+ years of software engineering experience (including data engineering)
  • Proven experience as a Data Engineer or in a similar role with strong data modeling, architecture, and design skills
  • Strong understanding of data engineering principles including infrastructure deployment, governance and security
  • Experience with MySQL, Snowflake, Cassandra, and familiarity with graph databases (Neptune or Neo4J)
  • Proficiency in SQL, Python, (Golang)
  • Proficient with AWS offerings such as AWS Glue, EKS, ECS and Lambda
  • Excellent communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders
  • Strong understanding of PII compliance and best practices in data handling and storage
  • Strong problem-solving skills, with a knack for optimizing performance and ensuring data integrity and accuracy
Job Responsibility:
  • Design, implement, and maintain the organization’s data infrastructure, ensuring it meets business requirements and technical standards
  • Deploy data pipelines to AWS infrastructure such as EKS, ECS, Lambdas and AWS Glue
  • Develop and deploy data pipelines to clean and transform data to support other engineering teams, analytics and AI applications
  • Extract and deploy reusable features to Feature stores such as Feast or equivalent
  • Evaluate and select appropriate database technologies, tools, and platforms, both on-premises and in the cloud
  • Monitor data systems and troubleshoot issues related to data quality, performance, and integrity
  • Work closely with other departments, including Product, Engineering, and Analytics, to understand and cater to their data needs
  • Define and document data workflows, pipelines, and transformation processes for clear understanding and knowledge sharing
What we offer:
  • Meaningful equity
  • Health insurance benefits: medical (various PPO/HMO/HSA plans), dental, vision, disability and life insurance
  • One Medical membership (in participating locations)
  • Flexible vacation time policy (subject to internal approval). Average use 4 weeks off per year
  • 10 paid sick days per year (pro rated depending on start date)
  • Paid holidays
  • Paid bereavement leave
  • Paid family leave after birth/adoption. Minimum of 16 paid weeks for birthing parents, 10 weeks for caretaker parents. Meant to supplement benefits offered by State
  • Commuter and FSA plans
  • Fulltime

Senior Data Engineer

Location:
United States, Flowood
Salary:
Not provided
PhasorSoft Group
Expiration Date:
Until further notice
Requirements:
  • Experience with Snowflake or Azure Cloud Data Engineering, including setting up and managing data pipelines
  • Proficiency in designing and implementing ETL processes for data integration
  • Knowledge of data warehousing concepts and best practices
  • Strong SQL skills for querying and manipulating data in Snowflake or Azure databases
  • Experience with data modeling techniques and tools to design efficient data structures
  • Understanding of data governance principles and experience implementing them in cloud environments
  • Proficiency in Tableau or Power BI for creating visualizations and interactive dashboards
  • Ability to write scripts (e.g., Python, PowerShell) for automation and orchestration of data pipelines
  • Skills to monitor and optimize data pipelines for performance and cost efficiency
  • Knowledge of cloud data security practices and tools to ensure data protection
Job Responsibility:
  • Design, implement, and maintain data pipelines and architectures on Snowflake or Azure Cloud platforms
  • Develop ETL processes to extract, transform, and load data from various sources into data warehouses
  • Optimize data storage, retrieval, and processing for performance and cost-efficiency in cloud environments
  • Collaborate with stakeholders to understand data requirements and translate them into technical solutions
  • Implement data security and governance best practices to ensure data integrity and compliance
  • Work with reporting tools such as Tableau or Power BI to create interactive dashboards and visualizations
  • Monitor and troubleshoot data pipelines, ensuring reliability and scalability
  • Automate data workflows and processes using cloud-native services and scripting languages
  • Provide technical expertise and support to data analysts, scientists, and business users
  • Fulltime