Middle Data Engineer

N-iX

Location:
Contract Type: Not provided
Salary: Not provided

Job Description:

We are seeking a motivated Data Engineer to join our team. In this role, you will be responsible for developing and maintaining robust data pipelines that drive our business intelligence and analytics.

Job Responsibility:

Developing and maintaining robust data pipelines that drive our business intelligence and analytics

Requirements:

  • 2+ years of experience in batch and streaming ETL using Spark, Python, Scala, Snowflake, or Databricks for Data Engineering or Machine Learning workloads
  • 2+ years of experience orchestrating and implementing pipelines with workflow tools like Databricks Workflows, Apache Airflow, or Luigi (a minimal Airflow sketch follows this list)
  • 2+ years of experience preparing structured and unstructured data for data science models
  • 2+ years of experience with containerization and orchestration technologies (Docker; Kubernetes negotiable); experience with shell scripting in Bash/Unix shell is preferred
  • Proficiency in Oracle, SQL, and data manipulation techniques
  • Experience using machine learning in data pipelines to discover, classify, and clean data
  • Experience implementing CI/CD with automated testing in Jenkins, GitHub Actions, or GitLab CI/CD
  • Familiarity with AWS services, including but not limited to Lambda, S3, and DynamoDB
  • Demonstrated experience implementing the data management life cycle, using data quality functions such as standardization, transformation, rationalization, linking, and matching
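To make the orchestration expectation concrete, here is a minimal sketch of a daily batch ETL pipeline, assuming Apache Airflow 2.4+ with the TaskFlow API; the DAG name, the sample fields, and the stubbed extract/load logic are illustrative assumptions, not part of the posting.

    # Minimal daily batch-ETL DAG sketch (Airflow 2.4+ TaskFlow API); illustrative only.
    from datetime import datetime
    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2026, 1, 1), catchup=False)
    def orders_etl():
        @task
        def extract() -> list[dict]:
            # Stub for pulling raw rows from a source system.
            return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": -5.0}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Drop rows that fail a simple validity rule.
            return [r for r in rows if r["amount"] > 0]

        @task
        def load(rows: list[dict]) -> None:
            # In a real pipeline this would write to Snowflake/Databricks via a hook.
            print(f"loading {len(rows)} rows")

        load(transform(extract()))

    orders_etl()
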
What we offer:
  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits

Additional Information:

Job Posted:
January 22, 2026

Work Type:
On-site work

Similar Jobs for Middle Data Engineer

Middle Data Engineer

At LeverX, we have had the privilege of delivering over 950 projects. With 20+ y...
Location:
Salary: Not provided
LeverX
Expiration Date: Until further notice
Requirements
  • 3–5 years of experience in data engineering
  • Strong SQL and solid Python for data processing
  • Hands-on experience with at least one cloud and a modern warehouse/lakehouse: Snowflake, Redshift, Databricks, or Apache Spark/Iceberg/Delta
  • Experience delivering on Data Warehouse or Lakehouse projects: star/snowflake modeling, ELT/ETL concepts
  • Familiarity with orchestration (Airflow, Prefect, or similar) and containerization fundamentals (Docker)
  • Understanding of data modeling, performance tuning, cost-aware architecture, and security/RBAC
  • English B1+
Job Responsibility
  • Design, build, and maintain batch/streaming pipelines (ELT/ETL) from diverse sources into DWH/Lakehouse
  • Model data for analytics (star/snowflake, slowly changing dimensions, semantic/metrics layers)
  • Write production-grade SQL and Python; optimize queries, file layouts, and partitioning (a minimal partitioning sketch follows this list)
  • Implement orchestration, monitoring, testing, and CI/CD for data workflows
  • Ensure data quality (validation, reconciliation, observability) and document lineage
  • Collaborate with BI/analytics to deliver trusted, performant datasets and dashboards
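As a rough illustration of the file-layout and partitioning work mentioned above, here is a minimal PySpark sketch, assuming Spark with Parquet storage; the paths and column names are placeholders rather than anything from this posting.

    # Partition a fact table by date so typical BI filters prune files
    # instead of scanning everything. Paths and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("fact_sales_layout").getOrCreate()

    fact = (
        spark.read.parquet("/raw/sales")                 # placeholder source path
        .withColumn("sale_date", F.to_date("sold_at"))
    )

    (
        fact.repartition("sale_date")
        .write.mode("overwrite")
        .partitionBy("sale_date")
        .parquet("/warehouse/fact_sales")                # placeholder target path
    )
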
What we offer
  • Projects in different domains: Healthcare, manufacturing, e-commerce, fintech, etc.
  • Projects for every taste: Startup products, enterprise solutions, research & development projects, and projects at the crossroads of SAP and the latest web technologies
  • Global clients based in Europe and the US, including Fortune 500 companies
  • Employment security: We hire for our team, not just a specific project. If your project ends, we will find you a new one
  • Healthy work atmosphere: On average, our employees stay in the company for 4+ years
  • Market-based compensation and regular performance reviews
  • Internal expert communities and courses
  • Perks to support your growth and well-being

Middle QA Engineer

We are looking for a Middle QA Engineer to join our team and help improve a comp...
Location: Ukraine
Salary: Not provided
Honeycomb Software
Expiration Date: Until further notice
Requirements
  • 2.5+ years of experience in QA for web applications
  • Strong understanding of testing methodologies, SDLC, and QA processes
  • Hands-on experience testing systems with real-time data updates (SignalR/WebSockets/SSE or similar)
  • Proficiency with REST API testing tools (Postman, Swagger, etc.)
  • Solid SQL knowledge for validating data integrity and performing complex queries (a minimal API-versus-SQL check follows this list)
  • Experience working with distributed or multi-module systems
  • Ability to read logs, investigate issues, and collaborate with developers on root-cause analysis
  • Understanding of client–server architecture and asynchronous processing
  • Experience with Jira, TestRail, or similar tools
  • Upper Intermediate+ English level
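Below is a loose sketch of the API-versus-database cross-check implied by the bullets above, written in Python with pytest conventions; the endpoint, table, and sqlite3 connection are stand-ins for whatever stack the project actually uses.

    # Hypothetical check: the total returned by the API must match the database.
    import sqlite3
    import requests

    BASE_URL = "https://example.internal/api"   # placeholder endpoint

    def test_order_total_matches_database():
        api_total = requests.get(f"{BASE_URL}/orders/42", timeout=10).json()["total"]

        # SQL cross-check; sqlite3 stands in for the real database driver.
        with sqlite3.connect("orders.db") as conn:
            db_total = conn.execute(
                "SELECT total FROM orders WHERE id = ?", (42,)
            ).fetchone()[0]

        assert api_total == db_total
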
Job Responsibility
  • Test a multi-module web application that enables users to view, manage, and analyze large sets of operational and analytical data
  • Validate data accuracy, risk calculations, and business logic across several interconnected modules
  • Test real-time updates delivered via SignalR, ensuring correct synchronization across users and sessions
  • Collaborate closely with developers, Team Lead, and PM to clarify requirements and influence product quality
  • Create and maintain test documentation (test cases, checklists, test suites)
  • Perform functional, regression, integration, UI/UX, and exploratory testing
  • Test communication flows between distributed services, background jobs, and APIs
  • Participate in testing the Excel Add-In and ensure its correct integration with the main application
  • Provide detailed bug reports and assist the team during investigation
  • Contribute to continuous improvement of QA workflows and overall product reliability
What we offer
  • Opportunity to work on a long-term, business-critical product with complex logic and real-time processing
  • Collaboration with a strong engineering team and supportive leadership
  • Room for professional growth, knowledge sharing, and exposure to advanced technologies
  • Flexible schedule and remote-friendly environment
  • Competitive compensation and transparent performance culture
  • Full-time

Middle Process Engineer

We are excited to announce an opportunity for a Middle Process Engineer to join ...
Location: Georgia
Salary: Not provided
OMNIC
Expiration Date: Until further notice
Requirements
  • Proven experience as a Process Engineer or similar role, with a focus on manufacturing processes
  • Strong understanding of lean manufacturing principles and experience with 5S implementation
  • Experience in launching products from scratch, with a results-oriented mindset
  • Proficiency in SolidWorks and MS Excel for process optimization and data analysis
  • Fluency in English or Russian for effective communication and collaboration
Job Responsibility
  • Develop and support technological processes for laser cutting, welding, and painting, ensuring optimal efficiency and quality
  • Monitor cycle times and material consumption rates to identify areas for continuous improvement
  • Implement and maintain lean manufacturing principles, including the effective execution of 5S methodologies
  • Collaborate with cross-functional teams to launch products from concept to execution, ensuring alignment with company goals
  • Utilize SolidWorks and MS Excel for process design, analysis, and reporting to enhance operational efficiencies
What we offer
  • Competitive salary and comprehensive benefits package
  • Professional growth opportunities within a forward-thinking company
  • A supportive work environment that fosters innovation and creativity
  • The chance to work on groundbreaking projects that promote sustainability and positive change
  • Join a team committed to shaping a better future for our planet and leaving a lasting legacy
  • Full-time

Middle .Net Engineer

Vention is a global engineering partner to tech leaders and fast-growing startup...
Location: Georgia, Tbilisi
Salary: Not provided
Vention
Expiration Date: Until further notice
Requirements
  • C# / .NET (ASP.NET Core) - strong, production-level experience
  • Deep SQL and database performance knowledge: execution plans, indexing, SARGability, data types, query profiling; confident in interpreting execution plans
  • EF Core and Dapper: understand when and why to choose one over the other (heavy read paths, SQL shape, ORM overhead)
  • Production debugging & performance engineering: real-world troubleshooting cases, step-by-step reasoning, use of profilers/logs/metrics/traces
  • Microservices and monolith decomposition: hands-on migration experience, risk assessment, testing, parallel runs, cutover strategies
  • Async/await and concurrency: async/await behavior, deadlock analysis, batching, parallelization, distinguishing I/O vs CPU tasks
  • Testing: unit, integration, boundary tests, proper mocking; understanding when architectural/contract tests are needed
  • High-load systems: experience stabilizing/scaling systems handling tens–hundreds of RPS and 100k+ daily events, using queues/buffering
Job Responsibility
  • Deliver features end-to-end: requirements clarification, design, implementation, testing, deployment, observability, and optimization
  • Production debugging & performance engineering: analyze slow SQL (execution plans, DMVs), optimize EF Core (AsNoTracking, Includes, SQL shape), fix N+1 issues, design caching strategies (read replicas, Redis), debug deadlocks and async/await issues, resolve front-end freezes caused by large payloads
  • Optimize SQL and database performance: execution plan analysis, SARGability, indexing, avoiding full scans, proper joins and type matching, making informed choices between Dapper and EF Core
  • Participate in monolith decomposition: apply strangler pattern, minimize downtime, define domains properly, implement async/event-driven integrations, ensure correct transaction boundaries
  • Maintain strong observability: structured logging, tracing, dashboards, and alerts (Serilog, DataDog, OpenTelemetry)
  • Collaborate across teams: negotiate contracts, mock dependencies, and unblock yourself without constant supervision
What we offer
  • EDU corporate community (300+ members): tech communities, interest clubs, events, a small R&D lab, a knowledge base, and a dedicated AI track
  • Licenses for AI tools: GitHub Copilot, Cursor, and others
  • 24 working days of vacation per year
  • Expanded medical insurance
  • Corporate getaways & team building activities
  • Fitpass sport program
  • Support for significant life events
  • Access to discounts across a variety of stores, restaurants & cafes through a corporate discount program
  • Referral bonuses for bringing in new talent
  • Full-time

Senior Software Engineer - Trade Processing Middle Office Platform

As an experienced Staff / Senior Software Engineer, you’ll shape our flagship Mi...
Location: United States, New York
Salary: 170000.00 - 240000.00 USD / Year
Clear Street
Expiration Date: Until further notice
Requirements
  • Bachelor's Degree in Computer Science or Engineering
  • 10+ years of strong proficiency in Java / Spring Boot, Spring, RDBMS, Service-Oriented Architecture (SOA), and microservice-based server-side application development
  • Strong experience with distributed systems, event-driven architecture, and tools like Kafka
  • Practical knowledge of relational databases (e.g., Postgres) and schema design
  • You have contributed to systems that deliver solutions to complex business problems that handle massive amounts of data
  • You prioritize end user experience and it shows in your API designs, functionality, and performance
  • You have a strong command over design patterns, data structures, and algorithms
  • You have strong problem-solving skills with a keen eye for performance optimization
  • You can clearly explain the nuances of system design and paradigms to engineers and stakeholders
  • Strong understanding of multi-threading, concurrency, and performance tuning
Job Responsibility
  • Architect and build highly available, horizontally scalable mission critical applications in a modern technology stack
  • Design, build, and optimize core components responsible for processing a high volume of trade data in a low latency environment
  • Solve complex performance and scalability challenges, ensuring our systems handle large-scale financial data efficiently
  • Collaborate with product managers, and other engineers to translate financial methodologies into robust software solutions
  • Lead by example in system design discussions, architectural trade-offs, and best practices
  • Mentor team members, contributing to a strong culture of engineering excellence
What we offer
  • Competitive compensation, benefits, and perks
  • Company equity
  • 401k matching
  • Gender neutral parental leave
  • Full medical, dental and vision insurance
  • Lunch stipends
  • Fully stocked kitchens
  • Happy hours
  • Full-time

Middle Palantir Foundry Developer

At LeverX, we have had the privilege of delivering 1,500+ projects. With 20+ yea...
Location: Uzbekistan, Georgia
Salary: Not provided
LeverX
Expiration Date: Until further notice
Requirements
  • 3+ years in data/analytics engineering or software development
  • Hands-on experience with Palantir Foundry (pipelines and/or applications)
  • Proficiency in Python and SQL
  • Confidence with Git
  • Ability to translate business requirements into working solutions
  • English B1+
Job Responsibility
  • Build and maintain data pipelines and transformations in Foundry (a minimal transform sketch follows this list)
  • Implement application logic, views, and access controls
  • Validate data and ensure basic documentation and support
  • Work with stakeholders to clarify requirements and iterate on features
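For flavour, here is a minimal Foundry pipeline transform, assuming the transforms-python API used in Foundry code repositories; the dataset paths and the filter rule are hypothetical.

    # Hypothetical Foundry transform: read a raw dataset, keep valid rows,
    # and write the result to a curated output dataset.
    from transforms.api import transform_df, Input, Output

    @transform_df(
        Output("/Company/project/datasets/clean_orders"),
        source=Input("/Company/project/datasets/raw_orders"),
    )
    def clean_orders(source):
        return source.filter(source["amount"] > 0)
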
What we offer
  • Impactful use-case delivery on real data
  • Possibility to progress into a Team Lead role: mentoring, design facilitation, and coordination
  • Projects in different domains: Healthcare, manufacturing, e-commerce, fintech, etc.
  • Projects for every taste: Startup products, enterprise solutions, research & development projects, and projects at the crossroads of SAP and the latest web technologies
  • Global clients based in Europe and the US, including Fortune 500 companies
  • Employment security: We hire for our team, not just for a specific project. If your project ends, we will find you a new one
  • Healthy work atmosphere: On average, our employees stay in the company for 4+ years

Middle QA Big Data

The objective of this project is to enhance the QA processes through the impleme...
Location: Ukraine
Salary: Not provided
N-iX
Expiration Date: Until further notice
Requirements
  • 3+ years of QA experience with a strong focus on Big Data testing, particularly with hands-on experience in Data Lake environments on any cloud platform (preferably Azure)
  • Experience with Azure
  • Hands-on experience in Azure Data Factory, Azure Synapse Analytics, or similar services
  • Proficiency in SQL, capable of writing and optimizing both simple and complex queries for data validation and testing purposes
  • Proficient in PySpark, with experience in data manipulation and transformation, and a demonstrated ability to write and execute test scripts for data processing and validation (able to understand the code and convert the logic to SQL)
  • Hands-on experience with Functional & System Integration Testing in big data environments, ensuring seamless data flow and accuracy across multiple systems
  • Knowledge and ability to design and execute test cases in a behavior-driven development environment
  • Fluency in Agile methodologies, with active participation in Scrum ceremonies and a strong understanding of Agile principles
  • Familiarity with tools like Jira, including experience with X-Ray for defect management and test case management
  • Proven experience working on high-traffic and large-scale software products, ensuring data quality, reliability, and performance under demanding conditions
Job Responsibility
  • Design and execute data validation tests to ensure completeness in Azure Data Lake Storage (ADLS), Azure Synapse, and Databricks (a minimal PySpark check follows this list)
  • Verify data ingestion, transformation, and loading (ETL/ELT) processes in Azure Data Factory (ADF)
  • Validate data schema, constraints, and format consistency across different storage layers
  • Conduct performance testing on data pipelines
  • Optimize query performance by working with data engineers
  • Identify, log, and track defects in JIRA
  • Collaborate with Data Engineers and Business Analysts to resolve data inconsistencies
  • Generate detailed test reports, dashboards, and documentation for stakeholders
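As a small illustration of the completeness checks described above, here is a PySpark sketch, assuming data landed as Parquet in ADLS; the storage paths and the order_id key column are placeholders rather than details from this posting.

    # Completeness check: row counts and a mandatory key must survive the
    # raw-to-curated hop. Paths and column names are illustrative.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq_completeness").getOrCreate()

    raw = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/orders")
    curated = spark.read.parquet("abfss://curated@account.dfs.core.windows.net/orders")

    assert raw.count() == curated.count(), "row count mismatch between layers"

    null_keys = curated.filter(F.col("order_id").isNull()).count()
    assert null_keys == 0, f"{null_keys} curated rows are missing order_id"
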
What we offer
  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits

Rust Engineer - Platform

As a Platform Backend Engineer (Rust) at Keyrock, you will drive the development...
Location:
Salary: Not provided
Keyrock
Expiration Date: Until further notice
Requirements
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or equivalent experience
  • Proven experience in building and maintaining data-intensive, large-scale, high-performance trading data platforms
  • Strong expertise in Rust (or C++), Python, and TypeScript for system development and automation in the financial services industry
  • Good understanding of data engineering principles, including data modeling, ETL pipelines, and stream processing
  • Experience with financial services data workflows, including trading, middle office, and back office operations
  • Extensive experience in cloud-native architectures, with proficiency in AWS
  • Proficient in GitOps tools and methodologies for infrastructure automation and deployment
  • Strong background in DevSecFinOps, ensuring compliance, security, and cost efficiency across the development lifecycle
  • Hands-on experience with CI/CD pipelines, infrastructure as code (IaC), and monitoring tools
Job Responsibility
  • Rust Development: Design, build, and maintain high-performance backend services and APIs using Rust, ensuring low latency and high availability for critical trading data platforms
  • Strong systems engineering fundamentals: concurrency, memory management, networking, serialization, and observability; solid understanding of performance tuning and profiling in real-world systems
  • System Integration: Create seamless integrations between live trading operations (exchanges/DeFi) and backoffice systems, automating workflows to improve operational efficiency
  • Cloud-Native Deployment: Deploy and manage services in a cloud-native environment, leveraging AWS, Kubernetes, and Terraform to scale infrastructure as code
  • DevOps & Observability: Maintain GitOps-driven workflows, ensuring robust CI/CD pipelines and implementing deep system observability (logging, metrics, tracing) for rapid incident response
  • Database Optimization: Optimize data storage and retrieval strategies (SQL/NoSQL), balancing query performance, cost efficiency, and data integrity in a high-volume financial environment
  • Security & Compliance: Engineer solutions with a "Security-First" mindset, ensuring strict adherence to compliance standards and secure handling of sensitive financial data
  • Cross-Functional Collaboration: Partner with Product Managers, Risk teams, and other engineers to translate complex business requirements into reliable technical specifications and features
  • Technical Excellence: Actively participate in code reviews, contribute to architectural discussions, and mentor fellow engineers to foster a culture of high code quality and innovation
  • Continuous Improvement: Stay updated on emerging trends in the Rust ecosystem, cloud infrastructure, and blockchain technologies to continuously refine the platform’s capabilities
What we offer
  • A competitive salary package
  • Autonomy in your time management thanks to flexible working hours and the opportunity to work remotely
  • The freedom to create your own entrepreneurial experience by being part of a team of people in search of excellence
  • Full-time