As a Senior Software Engineer, you will leverage your skills in systems and software development to operationalize advanced statistical machine learning algorithms and support production-level systems. You will work closely with other software engineers and researchers to design cutting-edge solutions to challenging problems impacting national security, including classification, anomaly detection, forecasting, and more. You will also interact with customers to understand their requirements and use cases, applying feedback to address their most critical problems.
Job Responsibilities:
Collaborate to create and maintain software for data pipelines, algorithms, storage, and access
Build infrastructure for extraction, transformation, and loading using SQL and AWS technologies
Architect and implement capabilities for integrating technologies and orchestrating workflows
Build analytic tools that utilize data pipelines to provide actionable insights into customer requests
Identify, design, and implement internal process improvements
Monitor data for changes that could significantly impact system performance
Develop and execute plans to mitigate issues, maximize uptime, and ensure system performance
Requirements:
Active Top Secret security clearance (U.S. citizenship required by the U.S. Government)
Motivated collaborator eager to work with a large, distributed team
Ability to lead and direct development initiatives from inception to prototype and production
BS, MS, or PhD in a related field or equivalent experience
5+ years of experience in one or more high-level programming languages, such as Python
Experience with AWS cloud services, SQL, and a variety of databases
Experience working with large datasets, building and optimizing pipelines and architectures
Experience in navigating and contributing to complex, large code bases
Nice to have:
Possession of Security+ or other certifications
Experience developing processes for data transformation and workload management
Experience with development of REST APIs, access control, and auditing
Experience with DevOps pipelines
Experience using the following software/tools:
Big Data tools: e.g., Hadoop, Spark, Kafka, ElasticSearch
Data lakes: e.g., Delta Lake, Apache Hudi, Apache Iceberg
Distributed data warehouse frontends: e.g., Apache Hive, Presto
Data pipeline and workflow management tools: e.g., Luigi, Airflow
Dashboard frontends: e.g., Grafana, Kibana
Stream-processing systems: e.g., Storm, Spark Streaming