Data Engineer

Techstar Consulting

Location:
United States, Woodcliff Lake, New Jersey

Category:
IT - Software Development

Contract Type:
Employment contract

Salary:
Not provided

Job Description:

As a Data Engineer, you will work with product owners, data scientists, business analysts, and software engineers to design and build solutions that ingest, transform, store, and export data in a cloud environment while maintaining security, scalability, and personal data protection.
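
To make the description above concrete, here is a purely illustrative PySpark sketch (not part of the posting) of an ingest-transform-export job of the kind this role describes; the storage paths, column names, and hashing-based masking rule are hypothetical.

# Illustrative only: a minimal PySpark job that ingests raw CSV events from cloud
# storage, cleans and transforms them, masks a personal-data column, and exports
# the result as partitioned Parquet. All paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

def run_pipeline(input_path: str, output_path: str) -> None:
    spark = SparkSession.builder.appName("ingest-transform-export").getOrCreate()

    raw = spark.read.option("header", True).csv(input_path)          # ingest
    cleaned = (
        raw.dropDuplicates(["event_id"])                             # basic cleaning
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("email", F.sha2(F.col("email"), 256))         # mask personal data
           .filter(F.col("event_ts").isNotNull())
    )
    (cleaned.withColumn("event_date", F.to_date("event_ts"))
            .write.mode("overwrite")
            .partitionBy("event_date")
            .parquet(output_path))                                   # export

if __name__ == "__main__":
    # S3/ADLS URIs in a real deployment; local paths work for a quick test
    run_pipeline("s3://raw-bucket/events/", "s3://curated-bucket/events/")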

Job Responsibility:

  • Implements and enhances complex data processing pipelines with a focus on collecting, parsing, cleaning, managing, and analyzing large data sets that produce valuable business insights and discoveries
  • Determines the infrastructure, services, and software required to build advanced data ingestion and transformation pipelines and solutions in the cloud
  • Assists data scientists and data analysts with data preparation, exploration, and analysis activities
  • Applies problem solving experience and knowledge of advanced algorithms to build high performance, parallel, and distributed solutions
  • Performs code and solution review activities and recommends enhancements that improve efficiency, performance, and stability and reduce support costs
  • Applies the latest DevOps and Agile methodologies to improve delivery time
  • Works with Scrum teams in daily stand-ups, providing frequent progress updates
  • Supports applications, including incident and problem management
  • Performs debugging and triage of incidents and problems and deploys fixes to restore services
  • Documents requirements and configurations and clarifies ambiguous specs
  • Performs other duties as assigned by management

Requirements:

  • Bachelor’s degree in Computer Science, Mathematics, or Engineering, or the equivalent of 4 years of related professional IT experience
  • 3+ years of enterprise software engineering experience with object-oriented design, coding and testing patterns, as well as experience in engineering (commercial or open source) software platforms and large-scale data infrastructure solutions
  • 3+ years of software engineering and architecture experience within a cloud environment (Azure, AWS)
  • 3+ years of enterprise data engineering experience within any “Big Data” environment (preferred)
  • 3+ years of software development experience using Python
  • 2+ years of experience working in an Agile environment (Scrum, Lean or Kanban)
  • 3+ years of experience working on large-scale data integration and analytics projects, including using cloud (e.g., AWS Redshift, S3, EC2, Glue, Kinesis, EMR) and data-orchestration (e.g., Oozie, Apache Airflow) technologies (a minimal orchestration sketch follows this list)
  • 3+ years of experience in implementing distributed data processing pipelines using Apache Spark
  • 3+ years of experience in designing relational/NoSQL databases and data warehouse solutions
  • 2+ years of experience in writing and optimizing SQL queries in a business environment with large-scale, complex datasets
  • 2+ years of Unix/Linux operating system knowledge (including shell programming)
  • 1+ years of experience in automation/configuration management tools such as Terraform, Puppet or Chef
  • 1+ years of experience in container development and management using Docker
  • Core technologies: SQL, Python, Spark
  • Strong experience with data handling using Python is required
  • Basic knowledge of continuous integration tools (e.g., Jenkins)
  • Basic knowledge of machine learning algorithms and data visualization tools such as Microsoft Power BI and Tableau
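
Purely for illustration (not part of the posting), the sketch below shows the kind of Airflow orchestration referenced in the requirements: a three-step daily DAG wiring extract, transform, and load tasks together. The DAG id, task names, and callables are hypothetical placeholders, and the schedule parameter assumes Airflow 2.4 or newer.

# Illustrative only: a minimal Apache Airflow DAG sketching a daily
# extract -> transform -> load orchestration. Names are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw files from object storage")    # placeholder extract step

def transform():
    print("run the Spark transformation job")      # placeholder transform step

def load():
    print("load curated data into the warehouse")  # placeholder load step

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # assumes Airflow 2.4+
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load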

Additional Information:

Job Posted:
December 12, 2025

Work Type:
Hybrid work

Similar Jobs for Data Engineer

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date:
Until further notice
Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning (see the short example after this list)
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
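
As a purely illustrative aside (not part of the posting), the snippet below shows the CTE and window-function style of SQL called out above, run through PySpark; the table and column names are hypothetical.

# Illustrative only: ranking each customer's orders by amount with a CTE and a
# window function, executed via Spark SQL. Data, table, and columns are made up.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("window-function-demo").getOrCreate()

orders = spark.createDataFrame(
    [(1, "a", 120.0), (2, "a", 75.5), (3, "b", 240.0), (4, "b", 99.0)],
    ["order_id", "customer_id", "amount"],
)
orders.createOrReplaceTempView("orders")

ranked = spark.sql("""
    WITH customer_orders AS (               -- CTE: scope the rows we care about
        SELECT customer_id, order_id, amount
        FROM orders
    )
    SELECT customer_id,
           order_id,
           amount,
           ROW_NUMBER() OVER (              -- window function: rank per customer
               PARTITION BY customer_id ORDER BY amount DESC
           ) AS amount_rank
    FROM customer_orders
""")
ranked.show()
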
Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
  • Fulltime

Software Engineer, Data Engineering

Join us in building the future of finance. Our mission is to democratize finance...
Location:
Canada, Toronto
Salary:
124000.00 - 145000.00 CAD / Year
Robinhood
Expiration Date:
Until further notice
Requirements:
  • 3+ years of professional experience building end-to-end data pipelines
  • Hands-on software engineering experience, with the ability to write production-level code in Python for user-facing applications, services, or systems (not just data scripting or automation)
  • Expert at building and maintaining large-scale data pipelines using open source frameworks (Spark, Flink, etc)
  • Strong SQL (Presto, Spark SQL, etc) skills
  • Experience solving problems across the data stack (Data Infrastructure, Analytics and Visualization platforms)
  • Expert collaborator with the ability to democratize data through actionable insights and solutions
Job Responsibility:
  • Help define and build key datasets across all Robinhood product areas. Lead the evolution of these datasets as use cases grow
  • Build scalable data pipelines using Python, Spark and Airflow to move data from different applications into our data lake
  • Partner with upstream engineering teams to enhance data generation patterns
  • Partner with data consumers across Robinhood to understand consumption patterns and design intuitive data models
  • Ideate and contribute to shared data engineering tooling and standards
  • Define and promote data engineering best practices across the company
What we offer:
  • bonus opportunities
  • equity
  • benefits
  • Fulltime

Senior Software Engineer, Data Engineering

Join us in building the future of finance. Our mission is to democratize finance...
Location:
United States, Menlo Park
Salary:
146000.00 - 198000.00 USD / Year
Robinhood
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional experience building end-to-end data pipelines
  • Hands-on software engineering experience, with the ability to write production-level code in Python for user-facing applications, services, or systems (not just data scripting or automation)
  • Expert at building and maintaining large-scale data pipelines using open source frameworks (Spark, Flink, etc)
  • Strong SQL (Presto, Spark SQL, etc) skills
  • Experience solving problems across the data stack (Data Infrastructure, Analytics and Visualization platforms)
  • Expert collaborator with the ability to democratize data through actionable insights and solutions
Job Responsibility:
  • Help define and build key datasets across all Robinhood product areas. Lead the evolution of these datasets as use cases grow
  • Build scalable data pipelines using Python, Spark and Airflow to move data from different applications into our data lake
  • Partner with upstream engineering teams to enhance data generation patterns
  • Partner with data consumers across Robinhood to understand consumption patterns and design intuitive data models
  • Ideate and contribute to shared data engineering tooling and standards
  • Define and promote data engineering best practices across the company
What we offer:
  • Market competitive and pay equity-focused compensation structure
  • 100% paid health insurance for employees with 90% coverage for dependents
  • Annual lifestyle wallet for personal wellness, learning and development, and more
  • Lifetime maximum benefit for family forming and fertility benefits
  • Dedicated mental health support for employees and eligible dependents
  • Generous time away including company holidays, paid time off, sick time, parental leave, and more
  • Lively office environment with catered meals, fully stocked kitchens, and geo-specific commuter benefits
  • Bonus opportunities
  • Equity
  • Fulltime

Senior Data Engineer

We are looking for a Senior Data Engineer to join our product DE team and report...
Location:
United States, Seattle; San Francisco; Mountain View
Salary:
135600.00 - 217800.00 USD / Year
Atlassian
Expiration Date:
Until further notice
Requirements:
  • Partner across engineering teams to tackle company-wide initiatives
  • Mentor junior members of the team
  • Partner with leadership, engineers, program managers and data scientists to understand data needs
  • Apply expertise and build scalable data solutions
  • Develop and launch efficient & reliable data pipelines to move and transform data (both large and small amounts)
  • Intelligently design data models for storage and retrieval
  • Deploy data quality checks to ensure high quality of data
  • Ownership of the end-to-end data engineering component of the solution
  • Participate in on-call shifts as needed to support the team
  • Design and develop new systems in partnership with software engineers to enable quick and easy consumption of data
Job Responsibility:
  • Build top-notch data solutions and data architecture to inform our most critical strategic and real-time decisions
  • Help translate business needs into data requirements and identify efficiency opportunities
What we offer:
  • Health coverage
  • Paid volunteer days
  • Wellness resources
  • Fulltime

Senior Data Engineer

Atlassian is looking for a Senior Data Engineer to join their product DE team. T...
Location:
United States, Seattle; San Francisco; Mountain View
Salary:
135600.00 - 217800.00 USD / Year
Atlassian
Expiration Date:
Until further notice
Requirements:
  • Partner across engineering teams to tackle company-wide initiatives
  • Mentor junior members of the team
  • Partner with leadership, engineers, program managers and data scientists to understand data needs
  • Apply expertise and build scalable data solutions
  • Develop and launch efficient & reliable data pipelines to move and transform data (both large and small amounts)
  • Intelligently design data models for storage and retrieval
  • Deploy data quality checks to ensure high quality of data
  • Ownership of the end-to-end data engineering component of the solution
  • Participate in on-call shifts to support the team
  • Design and develop new systems in partnership with software engineers to enable quick and easy consumption of data
Job Responsibility:
  • Build top-notch data solutions and data architecture to inform our most critical strategic and real-time decisions
  • Help translate business needs into data requirements and identify efficiency opportunities
What we offer:
  • Health coverage
  • Paid volunteer days
  • Wellness resources
  • Fulltime

Software Engineer - Data Engineering

Akuna Capital is a leading proprietary trading firm specializing in options mark...
Location:
United States, Chicago
Salary:
130000.00 USD / Year
AKUNA CAPITAL
Expiration Date:
Until further notice
Requirements:
  • BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
  • 5+ years of professional experience developing software applications
  • Java/Scala experience required
  • Highly motivated and willing to take ownership of high-impact projects upon arrival
  • Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
  • Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
  • Must possess excellent communication, analytical, and problem-solving skills
  • Recent hands-on experience with AWS Cloud development, deployment and monitoring necessary
  • Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD, to deliver complex software projects
  • The ability to react quickly and accurately to rapidly changing market conditions, including the ability to quickly and accurately respond to and/or solve math and coding problems, is an essential function of the role
Job Responsibility:
  • Work within a growing Data Engineering division supporting the strategic role of data at Akuna
  • Drive the ongoing design and expansion of our data platform across a wide variety of data sources, supporting an array of streaming, operational and research workflows
  • Work closely with Trading, Quant, Technology & Business Operations teams throughout the firm to identify how data is produced and consumed, helping to define and deliver high impact projects
  • Build and deploy batch and streaming pipelines to collect and transform our rapidly growing Big Data set within our hybrid cloud architecture utilizing Kubernetes/EKS, Kafka/MSK and Databricks/Spark
  • Mentor junior engineers in software and data engineering best practices
  • Produce clean, well-tested, and documented code with a clear design to support mission critical applications
  • Build automated data validation test suites that ensure data is processed and published in accordance with well-defined Service Level Agreements (SLAs) pertaining to data quality, data availability, and data correctness
  • Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
What we offer:
  • Discretionary performance bonus
  • Comprehensive benefits package that may encompass employer-paid medical, dental, vision, retirement contributions, paid time off, and other benefits
  • Fulltime

Sr. Data Engineer I, Data Infrastructure

Khan Academy is partnering with SafeInsights to provide researchers access to st...
Location:
United States, Mountain View
Salary:
137871.00 - 172339.00 USD / Year
Khan Academy
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in data engineering within a production environment
  • Experience with Google infrastructure and data services, particularly GCS and BigQuery
  • Advanced knowledge of Python
  • Strong SQL skills
  • Demonstrated proficiency with Docker and Kubernetes
  • Experience with Airflow
  • Background in applied math, statistics, economics, or a related technical field
  • Motivated by the Khan Academy mission “to provide a free, world-class education for anyone, anywhere”
  • Proven cross-cultural competency skills demonstrating self-awareness, awareness of others, and the ability to adopt inclusive perspectives, attitudes, and behaviors to drive inclusion and belonging throughout the organization
Job Responsibility:
  • Collaborate with our Data Insights Group to understand the desired end data product and the overall project goals
  • Collaborate with our data architect and Data Infrastructure team to understand available data sources and tooling
  • Propose tooling and processes that will work with extant Khan Academy systems to meet the project goals
  • Build, test, and refine these tools and processes
  • Deliver the SafeInsights container and associated project deliverables
  • Provide documentation and training required to maintain and enhance the above
What we offer:
  • Competitive salaries
  • Ample paid time off as needed
  • Remote-first culture
  • Generous parental leave
  • An exceptional team
  • The chance to put your talents towards a deeply meaningful mission
  • Opportunities to connect through affinity, ally, and social groups
  • 401(k) + 4% matching
  • Comprehensive insurance, including medical, dental, vision, and life
  • Fulltime

Senior Data Engineering Architect

Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date:
Until further notice
Requirements:
  • Proven work experience as a Data Engineering Architect or a similar role and strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Proficiency in Python, PySpark, SQL
  • Familiarity with cloud platforms and services, such as AWS, GCP, or Azure, and experience in designing and implementing data solutions in a cloud environment
  • Knowledge of data governance principles and best practices, including data privacy and security regulations
Job Responsibility:
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables. Support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions