
Data Engineering Professional


Plusnet


Location:
United Kingdom, Exeter


Contract Type:
Not provided


Salary:
Not provided

Job Description:

This role is central to delivering the Managed Services Contract (MSC), ensuring critical blue light network services are reliable and fully supported. As part of the Data Networks team, you will provide technical support across LAN and WAN infrastructure in Devon, Dorset and Cornwall, resolving incidents within SLA and acting as an escalation point while meeting contractual requirements for Data and Voice Network Services.

Job Responsibility:

  • Resolve complex technical issues escalated from second line
  • Investigate and troubleshoot unfamiliar issues
  • Manage and prioritise workload to meet key performance indicators
  • Support lifecycle management and security vulnerability remediation
  • Contribute to continuous improvement initiatives
  • Maintain up-to-date knowledge of BT products, solutions and industry developments
  • Deliver an effective, innovative and ISO20K-compliant service
  • Participate in an out-of-hours call rota, 1 week in 3

Requirements:

  • Must hold or be eligible for SC (Security Clearance) & NPPV 3 Clearance
  • Minimum of 5 years working history in the UK
  • Full UK driving licence
  • Customer-focused approach with commitment to high-quality service
  • Experience working collaboratively across cross-functional or virtual teams
  • Experience in at least two of the following: Cisco networking (including CCNA or working towards), network security, wireless networking, Cisco Prime/ISE/DNAC, or SolarWinds
  • Ability to produce clear technical documentation
  • Knowledge of EMS products, customer IT systems and applications
  • Experience working with ITIL processes

What we offer:
  • 10% on target annual bonus
  • BT Pension scheme, minimum 5% employee contribution, BT contribution 10%
  • 4x Salary Life Assurance
  • Huge range of flexible benefits including Cycle to Work, Healthcare, Season Ticket Loan, Electric Vehicle Salary Sacrifice
  • 25 days annual leave (not including bank holidays), increasing with service
  • From January 2025, equal family leave: receive 18 weeks at full pay, 8 weeks at half pay and 26 weeks at the statutory rate
  • Enhanced women’s health support
  • 24/7 private virtual GP appointments for UK colleagues
  • 2 weeks paid carer’s leave
  • World-class training and development opportunities
  • Option to join BT Shares Saving schemes
  • Discounted broadband, mobile and TV package, including 50% off EE mobile pay monthly or SIM only plans
  • Access to 100s of retail discounts, including the BT shop

Additional Information:

Job Posted:
April 01, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Data Engineering Professional

Senior Software Engineer, Data Engineering

Join us in building the future of finance. Our mission is to democratize finance...
Location:
United States, Menlo Park
Salary:
146000.00 - 198000.00 USD / Year
Robinhood
Expiration Date
Until further notice
Requirements:
  • 5+ years of professional experience building end-to-end data pipelines
  • Hands-on software engineering experience, with the ability to write production-level code in Python for user-facing applications, services, or systems (not just data scripting or automation)
  • Expert at building and maintaining large-scale data pipelines using open source frameworks (Spark, Flink, etc)
  • Strong SQL (Presto, Spark SQL, etc) skills
  • Experience solving problems across the data stack (Data Infrastructure, Analytics and Visualization platforms)
  • Expert collaborator with the ability to democratize data through actionable insights and solutions
Job Responsibility:
  • Help define and build key datasets across all Robinhood product areas. Lead the evolution of these datasets as use cases grow
  • Build scalable data pipelines using Python, Spark and Airflow to move data from different applications into our data lake
  • Partner with upstream engineering teams to enhance data generation patterns
  • Partner with data consumers across Robinhood to understand consumption patterns and design intuitive data models
  • Ideate and contribute to shared data engineering tooling and standards
  • Define and promote data engineering best practices across the company
What we offer:
  • Market competitive and pay equity-focused compensation structure
  • 100% paid health insurance for employees with 90% coverage for dependents
  • Annual lifestyle wallet for personal wellness, learning and development, and more
  • Lifetime maximum benefit for family forming and fertility benefits
  • Dedicated mental health support for employees and eligible dependents
  • Generous time away including company holidays, paid time off, sick time, parental leave, and more
  • Lively office environment with catered meals, fully stocked kitchens, and geo-specific commuter benefits
  • Bonus opportunities
  • Equity
Employment Type: Full-time

Software Engineer, Data Engineering

Join us in building the future of finance. Our mission is to democratize finance...
Location:
Canada, Toronto
Salary:
124000.00 - 145000.00 CAD / Year
Robinhood
Expiration Date
Until further notice
Requirements:
  • 3+ years of professional experience building end-to-end data pipelines
  • Hands-on software engineering experience, with the ability to write production-level code in Python for user-facing applications, services, or systems (not just data scripting or automation)
  • Expert at building and maintaining large-scale data pipelines using open source frameworks (Spark, Flink, etc)
  • Strong SQL (Presto, Spark SQL, etc) skills
  • Experience solving problems across the data stack (Data Infrastructure, Analytics and Visualization platforms)
  • Expert collaborator with the ability to democratize data through actionable insights and solutions
Job Responsibility:
  • Help define and build key datasets across all Robinhood product areas. Lead the evolution of these datasets as use cases grow
  • Build scalable data pipelines using Python, Spark and Airflow to move data from different applications into our data lake
  • Partner with upstream engineering teams to enhance data generation patterns
  • Partner with data consumers across Robinhood to understand consumption patterns and design intuitive data models
  • Ideate and contribute to shared data engineering tooling and standards
  • Define and promote data engineering best practices across the company
What we offer:
  • Bonus opportunities
  • Equity
  • Benefits
Employment Type: Full-time

Software Engineer - Data Engineering

Akuna Capital is a leading proprietary trading firm specializing in options mark...
Location:
United States, Chicago
Salary:
130000.00 USD / Year
AKUNA CAPITAL
Expiration Date
Until further notice
Requirements:
  • BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
  • 5+ years of professional experience developing software applications
  • Java/Scala experience required
  • Highly motivated and willing to take ownership of high-impact projects upon arrival
  • Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
  • Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
  • Must possess excellent communication, analytical, and problem-solving skills
  • Recent hands-on experience with AWS Cloud development, deployment and monitoring necessary
  • Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD, to deliver complex software projects
  • The ability to react quickly and accurately to rapidly changing market conditions, including the ability to quickly and accurately respond and/or solve math and coding problems are essential functions of the role
Job Responsibility:
  • Work within a growing Data Engineering division supporting the strategic role of data at Akuna
  • Drive the ongoing design and expansion of our data platform across a wide variety of data sources, supporting an array of streaming, operational and research workflows
  • Work closely with Trading, Quant, Technology & Business Operations teams throughout the firm to identify how data is produced and consumed, helping to define and deliver high impact projects
  • Build and deploy batch and streaming pipelines to collect and transform our rapidly growing Big Data set within our hybrid cloud architecture utilizing Kubernetes/EKS, Kafka/MSK and Databricks/Spark
  • Mentor junior engineers in software and data engineering best practices
  • Produce clean, well-tested, and documented code with a clear design to support mission critical applications
  • Build automated data validation test suites ensuring that data is processed and published in accordance with well-defined Service Level Agreements (SLAs) pertaining to data quality, data availability and data correctness
  • Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
What we offer:
  • Discretionary performance bonus
  • Comprehensive benefits package that may encompass employer-paid medical, dental, vision, retirement contributions, paid time off, and other benefits
Employment Type: Full-time

Data Engineer, Enterprise Data, Analytics and Innovation

Are you passionate about building robust data infrastructure and enabling innova...
Location:
United States
Salary:
110000.00 - 125000.00 USD / Year
Vaniam Group
Expiration Date
Until further notice
Requirements:
  • 5+ years of professional experience in data engineering, ETL, or related roles
  • Strong proficiency in Python and SQL for data engineering
  • Hands-on experience building and maintaining pipelines in a lakehouse or modern data platform
  • Practical understanding of Medallion architectures and layered data design
  • Familiarity with modern data stack tools, including Spark or PySpark; workflow orchestration (Airflow, dbt, or similar); testing and observability frameworks; containers (Docker) and Git-based version control
  • Excellent communication skills, problem-solving mindset, and a collaborative approach
Job Responsibility:
  • Design, build, and operate reliable ETL and ELT pipelines in Python and SQL
  • Manage ingestion into Bronze, standardization and quality in Silver, and curated serving in Gold layers of our Medallion architecture
  • Maintain ingestion from transactional MySQL systems into Vaniam Core to keep production data flows seamless
  • Implement observability, data quality checks, and lineage tracking to ensure trust in all downstream datasets
  • Develop schemas, tables, and views optimized for analytics, APIs, and product use cases
  • Apply and enforce best practices for security, privacy, compliance, and access control, ensuring data integrity across sensitive healthcare domains
  • Maintain clear and consistent documentation for datasets, pipelines, and operating procedures
  • Lead the integration of third-party datasets, client-provided sources, and new product-generated data into Vaniam Core
  • Partner with product and innovation teams to build repeatable processes for onboarding new data streams
  • Ensure harmonization, normalization, and governance across varied data types (scientific, engagement, operational)
What we offer:
  • 100% remote environment with opportunities for local meet-ups
  • Positive, diverse, and supportive culture
  • Passionate about serving clients focused on Cancer and Blood diseases
  • Investment in you with opportunities for professional growth and personal development through Vaniam Group University
  • Health benefits – medical, dental, vision
  • Generous parental leave benefit
  • Focused on your financial future with a 401(k) Plan and company match
  • Work-Life Balance and Flexibility
  • Flexible Time Off policy for rest and relaxation
  • Volunteer Time Off for community involvement
Employment Type: Full-time

Principal Data Engineer

PointClickCare is searching for a Principal Data Engineer who will contribute to...
Location:
United States
Salary:
183200.00 - 203500.00 USD / Year
PointClickCare
Expiration Date
Until further notice
Requirements:
  • Principal Data Engineer with at least 10 years of professional experience in software or data engineering, including a minimum of 4 years focused on streaming and real-time data systems
  • Proven experience driving technical direction and mentoring engineers while delivering complex, high-scale solutions as a hands-on contributor
  • Deep expertise in streaming and real-time data technologies, including frameworks such as Apache Kafka, Flink, and Spark Streaming
  • Strong understanding of event-driven architectures and distributed systems, with hands-on experience implementing resilient, low-latency pipelines
  • Practical experience with cloud platforms (AWS, Azure, or GCP) and containerized deployments for data workloads
  • Fluency in data quality practices and CI/CD integration, including schema management, automated testing, and validation frameworks (e.g., dbt, Great Expectations)
  • Operational excellence in observability, with experience implementing metrics, logging, tracing, and alerting for data pipelines using modern tools
  • Solid foundation in data governance and performance optimization, ensuring reliability and scalability across batch and streaming environments
  • Experience with Lakehouse architectures and related technologies, including Databricks, Azure ADLS Gen2, and Apache Hudi
  • Strong collaboration and communication skills, with the ability to influence stakeholders and evangelize modern data practices within your team and across the organization
Job Responsibility:
  • Lead and guide the design and implementation of scalable streaming data pipelines
  • Engineer and optimize real-time data solutions using frameworks like Apache Kafka, Flink, Spark Streaming
  • Collaborate cross-functionally with product, analytics, and AI teams to ensure data is a strategic asset
  • Advance ongoing modernization efforts, deepening adoption of event-driven architectures and cloud-native technologies
  • Drive adoption of best practices in data governance, observability, and performance tuning for streaming workloads
  • Embed data quality in processing pipelines by defining schema contracts, implementing transformation tests and data assertions, enforcing backward-compatible schema evolution, and automating checks for freshness, completeness, and accuracy across batch and streaming paths before production deployment
  • Establish robust observability for data pipelines by implementing metrics, logging, and distributed tracing for streaming jobs, defining SLAs and SLOs for latency and throughput, and integrating alerting and dashboards to enable proactive monitoring and rapid incident response
  • Foster a culture of quality through peer reviews, providing constructive feedback and seeking input on your own work
What we offer:
  • Benefits starting from Day 1!
  • Retirement Plan Matching
  • Flexible Paid Time Off
  • Wellness Support Programs and Resources
  • Parental & Caregiver Leaves
  • Fertility & Adoption Support
  • Continuous Development Support Program
  • Employee Assistance Program
  • Allyship and Inclusion Communities
  • Employee Recognition … and more!
Employment Type: Full-time

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location:
Netherlands, Leiden
Salary:
Not provided
IKEA
Expiration Date
Until further notice
Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience in driving new data engineering developments (e.g. applying cutting-edge data engineering methods to improve the performance of data integration, using new tools to improve data quality, etc.)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience to deploy code artifacts into production
Job Responsibility:
  • Contribute to the development of D&A platform and analytical tools, ensuring easy and standardized access and sharing of data
  • Subject matter expert for Azure Databricks, Azure Data Factory and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how & standard available
  • Ensure pipelines and consistency with relevant digital frameworks, principles, guidelines and standards
  • Support in understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility of data quality and data assets on the D&A platform
  • Identify opportunities for data assets and D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team
What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning
Employment Type: Full-time

Electrical Professional Engineer

Location:
United States, Baton Rouge
Salary:
Not provided
G.E.C., Inc. (GEC)
Expiration Date
Until further notice
Requirements:
  • Graduate Engineer – Bachelor of Science in Electrical Engineering (from an ABET-accredited program)
  • Registered as a Professional Engineer in the State of Louisiana or ability to gain through reciprocity within 6 months
  • Strong background in electrical power systems design, construction, and construction drawing bid package preparation for distribution systems, generators, and AC drives
  • Experience of 8 years minimum in power system engineering
  • Modeling utilizing ETAP or similar software
  • Equipment and electrical distribution modeling
  • Arc flash analysis and incident energy calculations
  • Protective device coordination and voltage drop calculations
  • Duct bank analysis
  • Electrical construction related experience (480V to ~35kV)
Job Responsibility:
  • Providing engineering and technical support within the Electrical Engineering Department through services such as project development, detailed project planning, front-end engineering and detailed design, opinion of construction cost, and technical construction engineering and inspection for a variety of electrical projects
  • Use of an engineer's seal is required for projects for which the applicant is in responsible charge
  • Responsible for providing engineering and technical support within the Engineering Division
  • Prepares or directs preparation and modification of reports, specifications, plans, construction schedules, and project designs
  • Analyzes reports, maps, drawings, blueprints, tests, and aerial photographs for planning and design
  • Uses computer assisted engineering and design software and equipment to prepare engineering and design documents
  • Directs CAD personnel to convert designs to working drawings
  • Performs quality control/quality assurance review of engineering documents
  • Visits construction site to monitor progress and other duties per the client contract documents
  • Assists in pricing, estimating, scoping, marketing strategies, and business development for proposed projects
What we offer:
  • 401(k)
  • Health Insurance
  • Dental Insurance
  • Vision Insurance
  • Health Savings Accounts
  • Life and AD&D Insurance
  • Disability Insurance
  • Voluntary Benefits
  • Vacation Time Off
  • Sick Time
Employment Type: Full-time

Data Architect / Engineer

The Data Architect – Engineer will shape the future of the telecommunications in...
Location:
France
Salary:
Not provided
NETWORK SOLUTIONS FACTORY
Expiration Date
Until further notice
Requirements:
  • 3+ years of experience as a Data Architect, Data Engineer, or similar role in large-scale data environments
  • Expertise in data modeling, metadata management, and data governance
  • Experience with ETL/ELT frameworks and data pipeline orchestration
  • Solid SQL skills and experience with databases and distributed architectures
  • Professional working English
Job Responsibility:
  • Define data standards, governance policies, naming conventions, and documentation practices
  • Build scalable data pipelines and lakehouse solutions
  • Develop and optimize data models for analytics, reporting, and artificial intelligence / machine learning
  • Optimize data architectures for performance, cost-efficiency, and reliability
  • Evaluate and recommend data platforms, storage technologies, and integration patterns
  • Collaborate with analytics and business teams to understand data needs and translate them into architectural designs
  • Provide technical leadership on data projects and mentor junior team members
What we offer:
  • Remote work (with a compensatory allowance) up to 3 days a week
  • Profit-sharing and employee savings plan
  • Annual vacation bonus
  • Meal vouchers subsidized at 59.89% by NetSF, up to €11
  • Family health insurance, subsidized at 75% by NetSF
  • Provision of additional health insurance
  • A friendly work atmosphere: monthly breakfast, internal newsletter, fruit basket, free coffee and tea, etc.
  • Transport allowance and sustainable mobility package
  • International work environment
Employment Type: Full-time