We are looking for a leader who is passionate about collaborating with high-performance teams to solve unique national security challenges with novel technologies. As a Principal Software Architect, you will leverage your skills in systems and software development to operationalize advanced statistical machine learning algorithms and support production-level systems. You will work closely with other software engineers and researchers, teammates from both STR and subcontractors, to design cutting-edge solutions to challenging problems impacting national security, including classification, anomaly detection, forecasting, and more. You will also interact with customers to understand requirements and use cases, applying their feedback to respond to their most critical problems.
Job Responsibilities:
Lead an active, distributed team of STR and subcontractor engineers in creating and maintaining system infrastructure, including data pipeline architecture, algorithm execution, storage, distribution, and access
Serve as a senior STR technical leader onsite, engaging on new requirements, providing ad hoc status updates, coordinating with subcontractors, and ensuring that a portfolio of programs collaborates on data, technical lessons learned, and technical integration activities
Understand stakeholders, end users, and their constraints to create customer solutions that address mission requirements
Architect and implement capabilities for integrating component technologies and orchestrating overall workflows
Empower engineers and scientists to act with a meaningful degree of autonomy by developing and clearly communicating a unifying product vision and strategy
Identify and implement opportunities to automate manual processes and to optimize data delivery, system performance, and scalability
Monitor for changes to data or the environment that could significantly impact system performance, in order to mitigate issues and maximize system uptime
Perform testing to ensure system function and performance
Requirements:
Ability to obtain a Top Secret security clearance, which requires U.S. citizenship per U.S. Government requirements
BS, MS, or PhD in a related field, or equivalent experience
10+ years of experience in software development
Motivated collaborator who is looking for the opportunity to work with a team of high-end researchers and engineers to develop and deploy novel machine learning solutions for real mission impact
Aptitude for collaborating with stakeholders across a wide range of technical comfort levels
Organized, detail-oriented, and able to work both independently and collaboratively
Experience across a range of software maturity levels, from functional prototypes to production-level systems
Experience troubleshooting issues and identifying opportunities for improvement
Experience supporting and working within a large, cross-functional, distributed team
Experience with translating mission needs into production software, decomposing a problem into addressable component parts while identifying and tracking product risk
Proficiency with one or more high-level programming languages, such as Python
Experience with cloud computing platforms, preferably AWS
Experience with software delivery and containerization: e.g. Docker, Kubernetes
Working knowledge of Linux Operating System and shell scripting
Experience with relational SQL and NoSQL databases: e.g. Postgres, Cassandra
Nice to have:
Active TS/SCI security clearance with CI poly
Prior military experience in delivering intelligence analytic products
Strong presentation and organizational skills
Demonstrated experience with DevSecOps and SAFe Agile
Possession of SAFe Agile, Security+, other certifications, or willingness to get them
History of manipulating, processing and extracting value from large, disconnected datasets
Experience with development of APIs (e.g., REST), access control, and auditing
Experience with message queuing, stream processing, and optimizing "big data" stores
Experience developing build processes supporting data transformation, data structures, metadata, dependency management, and workload management
Expert SQL knowledge and experience working with a variety of databases
Experience using the following software/tools:
Big data tools: e.g. Hadoop, Spark, Kafka, Elasticsearch
AWS services: e.g. Athena, RDS; AWS certifications from Cloud Practitioner to Solutions Architect
Data Lakes: e.g. Delta Lake, Apache Hudi, Apache Iceberg
Distributed SQL interfaces: e.g. Apache Hive, Presto/Trino, Spark
Data pipeline and workflow management tools: e.g. Luigi, Airflow
Dashboard frontends: e.g. Grafana, Kibana
Stream-processing systems: e.g. Storm, Spark Streaming