Software Engineer – Data

Solas IT Recruitment

Location:
Ireland, Dublin 1

Category:
IT - Software Development

Contract Type:
Employment contract

Salary:

65000.00 EUR / Year

Job Description:

We are seeking a skilled Software Engineer with expertise in data engineering and analytics to join our team. The role involves designing and optimizing data pipelines, managing data lake architectures on Azure or AWS, and leveraging tools such as Databricks, Data Factory, Delta Lake, NoSQL databases, and Spark.
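
As a rough illustration of the kind of pipeline work this role describes, the sketch below shows a minimal PySpark job that reads raw JSON from a data lake landing zone, cleans it, and publishes a curated Delta Lake table. The storage paths, column names, and job name are hypothetical placeholders, and a Spark environment with Delta Lake support (for example Databricks) is assumed; this is an illustrative sketch, not a prescribed implementation for the position.

from pyspark.sql import SparkSession, functions as F

# Hypothetical batch ETL: land raw order events, clean them, publish a Delta table.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw JSON dropped into the landing zone (placeholder ADLS path).
raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/orders/")

# Transform: de-duplicate, type the timestamp, derive a partition column.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Load: write a curated Delta table for analytics consumers.
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))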

Job Responsibility:

  • Develop and maintain scalable ETL/ELT pipelines for data transformation
  • Manage Data Lakes and big data frameworks, ensuring performance and security
  • Utilize Azure tools such as Data Factory, Databricks, and Delta Lake
  • Collaborate with analytics teams to deliver curated datasets
  • Optimize NoSQL databases and leverage Apache Spark for big data processing

Requirements:

  • Strong experience in data engineering with Azure or AWS
  • Hands-on expertise with Databricks, Delta Lake, Data Factory, and NoSQL databases
  • Proficiency in Python, Scala, or SQL for data workflows
  • Solid understanding of ETL/ELT patterns and big data processing

What we offer:
  • Pension
  • Healthcare
  • Dental
  • 25 days annual leave
  • Hybrid working model for flexibility
  • Opportunities for professional development and cutting-edge projects

Additional Information:

Job Posted:
December 24, 2025

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for Software Engineer – Data

Senior Rust Software Engineer - Data Classification

As a Senior Engineer on the Data Classification team, you’ll design and develop...
Location
Israel, Tel Aviv
Salary:
Not provided
Palo Alto Networks
Expiration Date
Until further notice
Requirements
  • BSc in Computer Science with 5+ years, or MSc with 3+ years, or equivalent military experience
  • Experience with systems-level languages (C++, C, Go, etc.)
  • Deep understanding of memory management
  • Strong understanding of application and OS interaction
  • Experience with multi-threaded and multi-process development with a performance focus
  • Familiarity with CI/CD pipelines and cloud infra
  • Ability to “make stuff work” on top of writing good code
Job Responsibility
  • Develop solutions for data security and classification using Rust, Python & Golang
  • Contribute to feature development (design, implementation, testing, deployment)
  • Collaborate with cross-functional teams for product and infrastructure integration
  • Innovate solutions for high-scale data operations
  • Serve as a leader, improving the work of others
  • Generate ideas and participate in brainstorming
  • Identify and push for team improvements
Employment Type: Fulltime

Senior Software Engineer, Data Platform

We are looking for a foundational member of the Data Team to enable Skydio to ma...
Location
United States, San Mateo
Salary:
180000.00 - 240000.00 USD / Year
Skydio
Expiration Date
Until further notice
Requirements
  • 5+ years of professional experience
  • 2+ years in software engineering
  • 2+ years in data engineering with a bias towards getting your hands dirty
  • Deep experience with Databricks building pipelines, managing datasets, and developing dashboards or analytical applications
  • Proven track record of operating scalable data platforms, defining company-wide patterns that ensure reliability, performance, and cost effectiveness
  • Proficiency in SQL and at least one modern programming language (we use Python)
  • Comfort working across the full data stack — from ingestion and transformation to orchestration and visualization
  • Strong communication skills, with the ability to collaborate effectively across all levels and functions
  • Demonstrated ability to lead technical direction, mentor teammates, and promote engineering excellence and best practices across the organization
  • Familiarity with AI-assisted data workflows, including tools that accelerate data transformations or enable natural-language interfaces for analytics
Job Responsibility
  • Design and scale the data infrastructure that ingests live telemetry from tens of thousands of autonomous drones (a short illustrative sketch follows this list)
  • Build and evolve our Databricks and Palantir Foundry environments to empower every Skydian to query data, define jobs, and build dashboards
  • Develop data systems that make our products truly data-driven — from predictive analytics that anticipate hardware failures, to 3D connectivity mapping, to in-depth flight telemetry analysis
  • Create and integrate AI-powered tools for data analysis, transformation, and pipeline generation
  • Champion a data-driven culture by defining and enforcing best practices for data quality, lineage, and governance
  • Collaborate with autonomy, manufacturing, and operations teams to unify how data flows across the company
  • Lead and mentor data engineers, analysts, and stakeholders across Skydio
  • Ensure platform reliability by implementing robust monitoring, observability, and contributing to the on-call rotation for critical data systems
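
For illustration only, here is a minimal PySpark Structured Streaming sketch of the kind of telemetry ingestion the bullets above describe: it tails a Kafka topic and continuously appends raw payloads to a Delta table that Databricks users could query. The broker address, topic name, storage paths, and the use of Kafka itself are assumptions rather than Skydio's actual architecture, and a Spark cluster with the Kafka and Delta Lake connectors is assumed.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("telemetry-ingest").getOrCreate()

# Read a live telemetry stream from a hypothetical Kafka topic.
telemetry = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
         .option("subscribe", "drone-telemetry")            # placeholder topic
         .load()
         .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)

# Continuously append raw payloads to a bronze Delta table for downstream analysis.
query = (
    telemetry.writeStream.format("delta")
             .option("checkpointLocation", "/mnt/checkpoints/drone_telemetry")
             .outputMode("append")
             .start("/mnt/bronze/drone_telemetry")
)
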
What we offer
  • Equity in the form of stock options
  • Comprehensive benefits packages
  • Relocation assistance may also be provided for eligible roles
  • Paid vacation time
  • Sick leave
  • Holiday pay
  • 401K savings plan
Employment Type: Fulltime

Software Engineer, Data Engineering

Join us in building the future of finance. Our mission is to democratize finance...
Location
Canada, Toronto
Salary:
124000.00 - 145000.00 CAD / Year
Robinhood
Expiration Date
Until further notice
Requirements
  • 3+ years of professional experience building end-to-end data pipelines
  • Hands-on software engineering experience, with the ability to write production-level code in Python for user-facing applications, services, or systems (not just data scripting or automation)
  • Expert at building and maintaining large-scale data pipelines using open source frameworks (Spark, Flink, etc)
  • Strong SQL (Presto, Spark SQL, etc) skills
  • Experience solving problems across the data stack (Data Infrastructure, Analytics and Visualization platforms)
  • Expert collaborator with the ability to democratize data through actionable insights and solutions
Job Responsibility
  • Help define and build key datasets across all Robinhood product areas. Lead the evolution of these datasets as use cases grow
  • Build scalable data pipelines using Python, Spark and Airflow to move data from different applications into our data lake (a short illustrative sketch follows this list)
  • Partner with upstream engineering teams to enhance data generation patterns
  • Partner with data consumers across Robinhood to understand consumption patterns and design intuitive data models
  • Ideate and contribute to shared data engineering tooling and standards
  • Define and promote data engineering best practices across the company
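
As a rough, hypothetical sketch of the Python/Spark/Airflow pattern named above (not Robinhood's actual setup), the snippet below defines a daily Airflow DAG that submits a PySpark ingestion job. The DAG id, task id, and script path are placeholders, and a recent Airflow 2.x installation is assumed.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="app_events_to_data_lake",  # placeholder name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit a PySpark job that lands application events in the data lake.
    ingest = BashOperator(
        task_id="spark_ingest",
        bash_command="spark-submit /opt/jobs/ingest_app_events.py --date {{ ds }}",
    )
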
What we offer
  • Bonus opportunities
  • Equity
  • Benefits
Employment Type: Fulltime

Senior Software Engineer, Data Engineering

Join us in building the future of finance. Our mission is to democratize finance...
Location
United States, Menlo Park
Salary:
146000.00 - 198000.00 USD / Year
Robinhood
Expiration Date
Until further notice
Requirements
  • 5+ years of professional experience building end-to-end data pipelines
  • Hands-on software engineering experience, with the ability to write production-level code in Python for user-facing applications, services, or systems (not just data scripting or automation)
  • Expert at building and maintaining large-scale data pipelines using open source frameworks (Spark, Flink, etc)
  • Strong SQL (Presto, Spark SQL, etc) skills
  • Experience solving problems across the data stack (Data Infrastructure, Analytics and Visualization platforms)
  • Expert collaborator with the ability to democratize data through actionable insights and solutions
Job Responsibility
  • Help define and build key datasets across all Robinhood product areas. Lead the evolution of these datasets as use cases grow
  • Build scalable data pipelines using Python, Spark and Airflow to move data from different applications into our data lake
  • Partner with upstream engineering teams to enhance data generation patterns
  • Partner with data consumers across Robinhood to understand consumption patterns and design intuitive data models
  • Ideate and contribute to shared data engineering tooling and standards
  • Define and promote data engineering best practices across the company
What we offer
  • Market competitive and pay equity-focused compensation structure
  • 100% paid health insurance for employees with 90% coverage for dependents
  • Annual lifestyle wallet for personal wellness, learning and development, and more
  • Lifetime maximum benefit for family forming and fertility benefits
  • Dedicated mental health support for employees and eligible dependents
  • Generous time away including company holidays, paid time off, sick time, parental leave, and more
  • Lively office environment with catered meals, fully stocked kitchens, and geo-specific commuter benefits
  • Bonus opportunities
  • Equity
Employment Type: Fulltime

Senior Software Engineer, Core Data

As a Senior Software Engineer on our Core Data team, you will take a leading rol...
Location
United States
Salary:
190000.00 - 220000.00 USD / Year
Pomelo Care
Expiration Date
Until further notice
Requirements
  • 5+ years of experience building high-quality, scalable data systems and pipelines
  • Expert-level proficiency in SQL and Python, with a deep understanding of data modeling and transformation best practices
  • Hands-on experience with dbt for data transformation and Dagster, Beam, Dataflow or similar tools for pipeline orchestration
  • Experience with modern data stack tools and cloud platforms, with a strong understanding of data warehouse design principles
  • A track record of delivering elegant and maintainable solutions to complex data problems that drive real business impact
Job Responsibility
  • Build and maintain elegant data pipelines that orchestrate ingestion from diverse sources and normalize data for company-wide consumption
  • Lead the design and development of robust, scalable data infrastructure that enables our clinical and product teams to make data-driven decisions, using dbt, Dagster, Beam and Dataflow (a short illustrative sketch follows this list)
  • Write clean, performant SQL and Python to transform raw data into actionable insights that power our platform
  • Architect data models and transformations that support both operational analytics and new data-driven product features
  • Mentor other engineers, providing technical guidance on data engineering best practices and thoughtful code reviews, fostering a culture of data excellence
  • Collaborate with product, clinical and analytics teams to understand data needs and ensure we are building infrastructure that unlocks the most impactful insights
  • Optimize data processing workflows for performance, reliability and cost-effectiveness
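
To give a flavour of the dbt and Dagster orchestration mentioned above, here is a minimal, hypothetical Dagster sketch with two assets: a placeholder raw extract and a normalized table intended for downstream dbt models. All asset, column, and table names are illustrative assumptions, not Pomelo Care's actual pipeline.

import pandas as pd
from dagster import Definitions, asset

@asset
def raw_visits() -> pd.DataFrame:
    # Placeholder for ingestion from a real source system.
    return pd.DataFrame(
        {
            "visit_id": [1, 1, 2],
            "patient_id": ["a", "a", None],
            "visit_date": ["2025-01-02", "2025-01-02", "2025-01-03"],
        }
    )

@asset
def normalized_visits(raw_visits: pd.DataFrame) -> pd.DataFrame:
    # Normalize the raw extract before handing off to dbt models downstream.
    df = raw_visits.dropna(subset=["patient_id"])
    df["visit_date"] = pd.to_datetime(df["visit_date"]).dt.date
    return df.drop_duplicates(subset=["visit_id"])

defs = Definitions(assets=[raw_visits, normalized_visits])
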
What we offer
  • Competitive healthcare benefits
  • Generous equity compensation
  • Unlimited vacation
  • Membership in the First Round Network (a curated and confidential community with events, guides, thousands of Q&A questions, and opportunities for 1-1 mentorship)
Employment Type: Fulltime

Senior Software Engineer - Data Infrastructure

We build the data and machine learning infrastructure to enable Plaid engineers ...
Location
United States, San Francisco
Salary:
180000.00 - 270000.00 USD / Year
Plaid
Expiration Date
Until further notice
Requirements
  • 5+ years of software engineering experience
  • Extensive hands-on software engineering experience, with a strong track record of delivering successful projects within the Data Infrastructure or Platform domain at similar or larger companies
  • Deep understanding of one of: ML Infrastructure systems, including Feature Stores, Training Infrastructure, Serving Infrastructure, and Model Monitoring OR Data Infrastructure systems, including Data Warehouses, Data Lakehouses, Apache Spark, Streaming Infrastructure, Workflow Orchestration
  • Strong cross-functional collaboration, communication, and project management skills, with proven ability to coordinate effectively
  • Proficiency in coding, testing, and system design, ensuring reliable and scalable solutions
  • Demonstrated leadership abilities, including experience mentoring and guiding junior engineers
Job Responsibility
  • Contribute to the long-term technical roadmap for data-driven and machine learning iteration at Plaid
  • Lead key data infrastructure projects such as improving ML development golden paths, implementing offline streaming solutions for data freshness, building net new ETL pipeline infrastructure, and evolving data warehouse or data lakehouse capabilities
  • Work with stakeholders in other teams and functions to define technical roadmaps for key backend systems and abstractions across Plaid
  • Debug, troubleshoot, and reduce operational burden for our Data Platform
  • Grow the team via mentorship and leadership, reviewing technical documents and code changes
What we offer
  • Medical, dental, vision, and 401(k)
  • Equity and/or commission
Employment Type: Fulltime

Senior Software Engineer - Data Protection

LufCo is seeking a Senior Software Engineer with a focus on Data Protection. Thi...
Location
United States, Annapolis Junction
Salary:
170000.00 - 245000.00 USD / Year
LufCo
Expiration Date
Until further notice
Requirements
  • Bachelor of Science degree in Software Engineering, Computer Science, Information Systems, or other related field
  • 4 years of relevant work experience may be substituted for a B.S. degree
  • Fourteen (14) or more years of experience as a Software Engineer on programs and contracts of similar scope
  • Languages: Java (both front-end (Swing) and back-end (servlets)), JavaScript (vanilla/jQuery), shell scripting (Bash), PL/SQL (Oracle)
  • Frameworks: React and Spring/Spring Boot
  • OS: Linux and Windows
  • COTS: AEM (Adobe)
  • Servers: JBoss 7.x and Tomcat
  • Active TS/SCI with Polygraph clearance
Job Responsibility
  • Drive next generation Data Protection forward utilizing commercial and government best practices for ensuring secure encryption solutions
  • Planning, implementation, and evolution of Data Protection sets for evaluation and analysis as part of existing system modernization efforts
  • See the impact of system changes at scale, minimize technical debt, and think critically about strategic moves regarding Identity, Credentialing, and Access Management solutions
  • Provide fundamental knowledge on applying technologies such as containerization to legacy physical workloads, identify automation improvements, and communicate pros and cons as part of the technical decision-making process
  • Demonstrate a high level of familiarity with software patterns and modern design methodology
  • Software development on Linux based platforms
  • Software planning to include development planning, build planning, and sprint planning
  • Develop software to meet cybersecurity related software requirements and constraints
  • Advocate for automation in all aspects of the system (build, deployment, test, updating, and monitoring)
  • Perform requirements analysis, refinement, testing, troubleshooting, deployment, and push secure access solutions forward to support the customer
What we offer
  • Competitive salary
  • Generous PTO
  • Medical, dental, and vision coverage
  • 401K contribution with employer match
  • Tuition reimbursement
  • Impactful work
  • Cutting-edge technology
Employment Type: Fulltime

Software Engineer 2 / Senior Software Engineer

We are looking for experienced Software Engineers for our Bangalore location ...
Location
India, Bengaluru
Salary:
Not provided
Komprise, Inc.
Expiration Date
Until further notice
Requirements
  • Solid grasp of computer science fundamentals, especially data structures, algorithms, and multi-threading
  • Ability to solve difficult problems with a simple, elegant solution
  • Should have a solid object-oriented programming background with impeccable design skills
  • Experience in developing management applications and performance management applications is ideal
  • Experience with object-based file systems and REST interfaces is a plus (e.g. Amazon S3, Azure, Google Cloud Service)
  • Should have a BE or higher in CS, EE, Math or related engineering or science field
  • 5+ years of experience in software deployment
  • Tech stack: Java, Maven, virtualisation, SaaS, GitHub, Jira, Slack, cloud solutions, and hypervisors
Job Responsibility
  • Responsible for designing and developing features that power the Komprise data management platform to manage billions of files and petabytes of data
  • Responsible for designing major components and systems of our product architecture, ensuring that the Komprise data management platform is highly available and scalable
  • Responsible for writing performant code, evaluating feasibility, developing for quality, and optimizing for maintainability
  • Work in an agile, customer-focused, and fast-paced team with direct interaction with customers
  • Responsible for analysing customer-escalated issues and providing resolutions in a timely manner
  • Design and implement highly performant, scalable distributed systems