As a foundational software engineer on our data platform engineering team, you will lead the charge on a variety of projects, writing software to aggregate, store, and make sense of data. Examples of possible work range from building data warehousing for ERP and machine data, to building software agents that pull data off of machines, to building certified data sets that enable operational and business intelligence use of data. You will be challenged to think creatively and solve complex data integration problems. You will work cross-functionally with production experts, software engineers, and machining specialists to develop novel solutions, working toward fully automated factories.
Job Responsibilities:
Scope, architect, implement, and deploy critical applications that will drive revenue and make a positive impact in the world
Build and manage a robust data warehouse and write software to coordinate and deploy data pipelines
Conceptualize and own the data architecture for multiple large-scale projects
Create and contribute to data frameworks that span on-premises and cloud infrastructure and improve the efficacy of logging machine data, while working with data infrastructure to triage and resolve issues
Solve our most challenging machine data integration problems, applying optimal ETL patterns, frameworks, and query techniques, and sourcing from both structured and unstructured data sources
Collaborate with machine engineers, product managers, and data scientists to understand data needs, representing key data insights visually in a meaningful way
Build alongside an incredible team of software engineers, mechanical engineers, operators, and the best machinists/CAM programmers in the world
Requirements:
Extensive experience shipping modern, data-centric applications (our data systems use Argo-Workflows, Dagster, Superset, Aurora, RDS, and S3; back ends are Go and Python, with gRPC/Avro and Kafka as the messaging platform)
Experience with IaC and GitOps tooling (we use Terraform extensively, centralized on Kubernetes/Argo/Helm)
Extremely well versed in data querying techniques across NoSQL and SQL platforms
Bachelor's degree in Computer Science and/or equivalent experience
Solid understanding of how to build data architectures and pipelines
Are self-motivated and eager to get hands-on and tackle challenges independently while working collaboratively toward identified objectives
Work with a platform mentality: driven to find the right architecture and plan up front, and to solve problems with the long term in mind
Take responsibility and ownership, finding solutions no matter what
Deploy broad experience and a big-picture view to fix undreamed-of problems with innovative solutions
Feel passionate about making things move in the real world with software
Are excited to work in a fast-paced environment with high stakes and quick iteration cycles
Are a highly effective communicator when speaking or writing, especially when presenting technical information
To conform to U.S. Government space technology export regulations, including the International Traffic in Arms Regulations (ITAR), you must be a U.S. citizen, lawful permanent resident of the U.S., protected individual as defined by 8 U.S.C. 1324b(a)(3), or eligible to obtain the required authorizations from the U.S. Department of State.
What we offer:
Medical, dental, vision, and life insurance plans for employees
401(k)
Relocation support may be provided for certain situations, based on business need