Profound is on a mission to help companies understand and control their AI presence. We're building the foundational agentic layer for modern companies. Our Workflow Runner is the execution backbone that turns complex AI work into reliable, composable workflows. You will shape the core primitives that power intelligent systems at scale: execution, scheduling, state, and streaming.
Job Responsibilities:
Build core workflow engine primitives used to orchestrate agents, tools, and dataflows
Own the real-time control plane, including streaming events, reliable job orchestration, idempotency, and replay
Ship high-leverage systems that turn prototypes into production-grade, scalable workflows
Design and ship Rust-first backend services and clean APIs for creating, executing, and supervising graph or DAG workflows
Build reliable job orchestration with durable state and multiple response modes: blocking, streaming, and fire-and-forget (see the sketch after this list)
Strengthen tenant isolation, security, and access patterns across the platform
Improve runtime behavior including scheduling, backpressure, timeouts, retries, and idempotency
Evolve schemas and repositories; own migrations, indexing, and query performance
Instrument with meaningful telemetry and raise the bar on testing and operational excellence
Partner closely with product, frontend, and data teams to deliver high-impact features
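To make the workflow responsibilities above more concrete, here is a minimal, illustrative Rust sketch of a DAG-style workflow runner with per-job response modes and bounded retries. It is only a sketch under assumed semantics, not Profound's actual engine; the names (Task, ResponseMode, run_workflow) are hypothetical.

```rust
use std::collections::HashMap;

/// How a caller wants results delivered for a job (hypothetical illustration).
#[derive(Debug, Clone, Copy)]
enum ResponseMode {
    Blocking,      // wait for the whole workflow to finish
    Streaming,     // receive per-task events as they complete
    FireAndForget, // enqueue and return immediately
}

/// One node in the workflow graph.
struct Task {
    name: &'static str,
    deps: Vec<&'static str>,
    run: Box<dyn Fn() -> Result<String, String>>, // the work; Err triggers a retry
}

/// Execute tasks in dependency order, retrying each up to `max_retries` times.
fn run_workflow(
    tasks: Vec<Task>,
    max_retries: u32,
) -> Result<HashMap<&'static str, String>, String> {
    let mut done: HashMap<&'static str, String> = HashMap::new();
    let mut remaining = tasks;

    while !remaining.is_empty() {
        // Find tasks whose dependencies have all completed.
        let ready: Vec<usize> = remaining
            .iter()
            .enumerate()
            .filter(|(_, t)| t.deps.iter().all(|d| done.contains_key(d)))
            .map(|(i, _)| i)
            .collect();
        if ready.is_empty() {
            return Err("cycle or missing dependency in workflow graph".to_string());
        }
        // Remove ready tasks from the back so earlier indices stay valid.
        for i in ready.into_iter().rev() {
            let task = remaining.swap_remove(i);
            let mut attempt = 0;
            loop {
                match (task.run)() {
                    Ok(out) => {
                        done.insert(task.name, out);
                        break;
                    }
                    Err(_) if attempt < max_retries => attempt += 1,
                    Err(e) => return Err(format!("task {} failed: {}", task.name, e)),
                }
            }
        }
    }
    Ok(done)
}

fn main() {
    let tasks = vec![
        Task {
            name: "fetch",
            deps: vec![],
            run: Box::new(|| -> Result<String, String> { Ok("raw data".to_string()) }),
        },
        Task {
            name: "transform",
            deps: vec!["fetch"],
            run: Box::new(|| -> Result<String, String> { Ok("clean data".to_string()) }),
        },
    ];
    let mode = ResponseMode::Blocking;
    let results = run_workflow(tasks, 3).expect("workflow failed");
    println!("mode = {:?}, results = {:?}", mode, results);
}
```

In a production runner, the same shape would typically gain durable state, persisted job records, and event streaming to clients, which is what the responsibilities above describe.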
Requirements:
Strong portfolio or GitHub showing backend or system design depth
Proficient in a modern systems language (Rust preferred) and comfortable owning services end to end in production
Solid experience with SQL and relational data modeling, with hands-on PostgreSQL experience
Experience with distributed systems patterns, including messaging or streaming with Kafka or NATS, retries, idempotency, and backpressure (see the sketch after this list)
Familiarity with caching and state systems such as Redis and real-time delivery using SSE or WebSockets
Strong grasp of API authentication and authorization, multi-tenancy, and security best practices
Pragmatic, collaborative, and product-minded, thriving in a fast-paced, in-person environment
Experience building or operating workflow engines, schedulers, or agent runtimes
Experience with Postgres RLS and data governance at scale
Practical exposure to containerization, CI/CD, and cloud environments
Background integrating external AI or model providers and securing provider configurations
Experience working with Rust (2024 edition), PostgreSQL, Redis, Kafka or NATS, containers, modern CI/CD, and real-time streaming to clients
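As a concrete illustration of the retry, idempotency, and backoff patterns named in the requirements, here is a small, self-contained Rust sketch: a retry helper with exponential backoff plus an in-memory idempotency-key store. The names (IdempotencyStore, retry_with_backoff) are hypothetical, and this is a sketch rather than a reference to any specific production system.

```rust
use std::collections::HashMap;
use std::thread::sleep;
use std::time::Duration;

/// Caches the result of each idempotency key so a replayed request
/// returns the stored outcome instead of being applied twice.
struct IdempotencyStore {
    seen: HashMap<String, String>,
}

impl IdempotencyStore {
    fn new() -> Self {
        Self { seen: HashMap::new() }
    }

    /// Apply `op` at most once per key; later calls return the cached result.
    fn apply_once<F>(&mut self, key: &str, op: F) -> String
    where
        F: FnOnce() -> String,
    {
        if let Some(cached) = self.seen.get(key) {
            return cached.clone();
        }
        let result = op();
        self.seen.insert(key.to_string(), result.clone());
        result
    }
}

/// Retry a fallible operation with exponential backoff, up to `max_attempts` tries.
fn retry_with_backoff<T, E, F>(max_attempts: u32, base_delay: Duration, mut op: F) -> Result<T, E>
where
    F: FnMut() -> Result<T, E>,
{
    let mut attempt = 0;
    loop {
        match op() {
            Ok(v) => return Ok(v),
            Err(e) if attempt + 1 >= max_attempts => return Err(e),
            Err(_) => {
                sleep(base_delay * 2u32.pow(attempt)); // back off: base, 2x, 4x, ...
                attempt += 1;
            }
        }
    }
}

fn main() {
    let mut store = IdempotencyStore::new();
    let mut calls = 0;

    // Simulate a flaky downstream call that succeeds on the second attempt.
    let outcome: Result<String, String> =
        retry_with_backoff(3, Duration::from_millis(10), || {
            calls += 1;
            if calls < 2 {
                Err("transient failure".to_string())
            } else {
                // The idempotency key makes a duplicate submission a no-op.
                Ok(store.apply_once("job-123", || "applied once".to_string()))
            }
        });

    println!("outcome after {} call(s): {:?}", calls, outcome);
}
```

A production system would typically persist idempotency keys in a durable store such as PostgreSQL or Redis (both named above) rather than holding them in memory.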