The Data Engineer will build scalable pipelines and data models, implement ETL workflows, and help ensure enterprise data is reliable, accessible, and secure. You will work closely with data scientists, analysts, engineers, and stakeholders to translate mission and business needs into high-quality data solutions and actionable insights.
Job Responsibilities:
Build and maintain scalable, reliable data pipelines to collect, process, and store data from multiple sources
Design and implement ETL processes to support analytics, reporting, and operational needs (a brief illustrative sketch follows this list)
Develop and maintain data models, schemas, and standards to support enterprise data usage
Collaborate with data scientists, analysts, and stakeholders to understand requirements and deliver solutions
Analyze large datasets to identify trends, patterns, and actionable insights
Present findings and recommendations through dashboards, reports, and visualizations
Optimize database and pipeline performance for scalability and reliability across large datasets
Monitor and troubleshoot pipeline issues to minimize downtime and improve system resilience
Implement data quality checks, validation routines, and integrity controls
Implement security measures to protect data and systems from unauthorized access
Ensure compliance with data governance policies, security standards, and applicable regulatory requirements (e.g., GDPR, HIPAA)
Establish best practices for data management, governance, and secure handling of sensitive information
Stay current on relevant tools and emerging technologies to strengthen engineering and analytical capabilities
Identify opportunities to improve workflows for data ingestion, processing, and analysis
Evaluate and recommend tools, platforms, and data management solutions aligned to organizational goals
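For illustration only, the sketch below outlines what a small ETL step with basic data quality checks might look like in Python, using only the standard library. The file name, table name, and column names are hypothetical and are not taken from this posting.

```python
# Minimal sketch of an extract-validate-load step with simple data quality
# checks. File name, table name, and column names are hypothetical.
import csv
import sqlite3

REQUIRED_COLUMNS = {"id", "event_time", "amount"}  # assumed schema

def extract(path):
    """Read raw rows from a CSV source."""
    with open(path, newline="") as fh:
        return list(csv.DictReader(fh))

def validate(rows):
    """Quality checks: required columns present, no null or duplicate keys."""
    if rows and not REQUIRED_COLUMNS.issubset(rows[0]):
        raise ValueError("missing required columns")
    seen, clean = set(), []
    for row in rows:
        if not row["id"] or row["id"] in seen:
            continue  # drop rows with null or duplicate keys
        seen.add(row["id"])
        clean.append(row)
    return clean

def load(rows, db_path="warehouse.db"):
    """Write validated rows into a simple reporting table."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS events (id TEXT PRIMARY KEY, event_time TEXT, amount REAL)"
    )
    con.executemany(
        "INSERT OR REPLACE INTO events VALUES (:id, :event_time, :amount)", rows
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(validate(extract("raw_events.csv")))
```

In practice, steps like these would typically run inside an orchestration tool such as Airflow rather than as a standalone script, with quality checks enforced before data reaches downstream consumers.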
Requirements:
Active DoD TS/SCI clearance (required or pending verification)
Bachelor’s degree in Computer Science, Data Science, Engineering, or related field (or equivalent experience) OR CSSLP / CISSP-ISSAP
Strong programming skills in Python, Java, or Scala
Strong SQL skills; familiarity with analytics languages/tools such as R
Experience with data processing frameworks (e.g., Apache Spark, Hadoop) and orchestration tools (e.g., Airflow); a short illustrative example follows this list
Familiarity with cloud-based data services (e.g., AWS Redshift, Google BigQuery, Azure Data Factory)
Experience with data modeling, database design, and data architecture concepts
Strong analytical and problem-solving skills with attention to detail
Strong written and verbal communication skills; ability to collaborate across technical and non-technical teams
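As a rough illustration of the Spark experience mentioned above, the snippet below sketches a simple batch aggregation with PySpark. The input path, output path, and column names are assumptions, not details from this posting.

```python
# Illustrative PySpark batch aggregation; paths and column names are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-aggregation").getOrCreate()

# Read a hypothetical dataset and produce a per-day summary.
events = spark.read.parquet("s3://example-bucket/events/")  # assumed location
daily = (
    events
    .withColumn("event_date", F.to_date("event_time"))
    .groupBy("event_date")
    .agg(F.count("*").alias("event_count"), F.sum("amount").alias("total_amount"))
)
daily.write.mode("overwrite").parquet("s3://example-bucket/reports/daily/")

spark.stop()
```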