
GCP Data Engineer - Lead Consultant


Lingaro


Location:
Not provided

Contract Type:
Not provided

Salary:

Not provided

Job Responsibilities:

  • Part of the team accountable for the design, modelling, and development of the entire GCP data ecosystem for one of our clients (Cloud Storage, Cloud Functions, BigQuery)
  • Involvement throughout the whole process, starting with gathering, analyzing, modelling, and documenting business/technical requirements
  • The role will include direct contact with clients
  • Modelling the data from various sources and technologies
  • Troubleshooting and supporting the most complex and high-impact problems to deliver new features and functionalities
  • Designing and optimizing data storage architectures, including data lakes, data warehouses, or distributed file systems
  • Implementing techniques like partitioning, compression, or indexing to optimize data storage and retrieval
  • Identifying and resolving bottlenecks, tuning queries, and implementing caching strategies to enhance data retrieval speed and overall system efficiency
  • Identifying and resolving issues related to data processing, storage, or infrastructure
  • Monitoring system performance, identifying anomalies, and conducting root cause analysis to ensure smooth and uninterrupted data operations
  • Training and mentoring less experienced data engineers, providing guidance and knowledge transfer
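The partitioning technique named in the responsibilities above can be sketched in plain Python. This is a minimal, stdlib-only illustration of the underlying principle (partition pruning) under assumed names, not Lingaro's actual stack:

```python
from collections import defaultdict
from datetime import date

# Minimal sketch of date partitioning: rows are bucketed by event date,
# so a date-bounded query only scans the matching buckets (partition
# pruning) instead of the whole table. All names here are hypothetical.
class PartitionedStore:
    def __init__(self):
        self.partitions = defaultdict(list)  # partition key -> rows

    def insert(self, row):
        self.partitions[row["event_date"]].append(row)

    def query(self, start, end):
        # Prune: touch only partitions whose key falls inside [start, end].
        keys = [k for k in self.partitions if start <= k <= end]
        return [row for k in sorted(keys) for row in self.partitions[k]]

store = PartitionedStore()
store.insert({"event_date": date(2026, 1, 1), "amount": 10})
store.insert({"event_date": date(2026, 1, 2), "amount": 20})
store.insert({"event_date": date(2026, 2, 1), "amount": 30})
jan = store.query(date(2026, 1, 1), date(2026, 1, 31))  # skips February
```

BigQuery applies the same idea declaratively: a table partitioned on a DATE column lets the engine skip partitions that fall outside a query's date filter.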

Requirements:

  • 7+ years of experience as a Data Engineer working with GCP cloud-based infrastructure & systems
  • Deep knowledge of Google Cloud Platform and cloud computing services
  • Extensive experience designing, building, and deploying data pipelines in the cloud to ingest data from various sources such as databases, APIs, or streaming platforms
  • Proficiency with SQL and NoSQL database management systems (BigQuery is a must)
  • Programming skills (SQL, Python, other scripting languages)
  • Proficient in data modeling techniques and database optimization
  • Knowledge of at least one orchestration and scheduling tool (Airflow is a must)
  • Experience with data integration tools and techniques, such as ETL and ELT
  • Knowledge of modern data transformation tools (such as dbt, Dataform)
  • Excellent communication skills
  • Ability to actively participate/lead discussions with clients
  • Tools knowledge: Git, Jira, Confluence, etc.
  • Openness to learning new technologies and solutions
  • Experience in a multinational environment with distributed teams
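The ETL technique listed in the requirements can be reduced to a stdlib-only sketch: extract raw records, transform them, load them into a target. All names below are illustrative stand-ins; a real pipeline would read from an API or database and load into a warehouse such as BigQuery:

```python
import json

# Stdlib-only sketch of the ETL pattern: extract -> transform -> load.
def extract():
    # Stand-in for an API response body or database read.
    raw = '[{"id": "1", "price": "9.50"}, {"id": "2", "price": "3.25"}]'
    return json.loads(raw)

def transform(records):
    # Clean types and derive fields before loading.
    return [{"id": int(r["id"]), "price_cents": round(float(r["price"]) * 100)}
            for r in records]

def load(rows, target):
    for row in rows:
        target[row["id"]] = row  # stand-in for a warehouse insert

warehouse = {}
load(transform(extract()), warehouse)
```

In the ELT variant also mentioned, the raw records would be loaded first and transformed inside the warehouse, for example with dbt or Dataform.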

Nice to have:

  • Certifications in big data technologies and/or cloud platforms
  • Experience with BI solutions (e.g. Looker, Power BI, Tableau)
  • Experience with ETL tools: e.g. Talend, Alteryx
  • Experience with Apache Spark, especially in GCP environment
  • Experience with Databricks
  • Experience with Azure cloud-based infrastructure & systems

What we offer:
  • Stable employment
  • “Office as an option” model
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support
  • Internal Gallup Certified Strengths Coach to support your growth
  • Grow as we grow as a company
  • A diverse, inclusive, and values-driven community
  • Autonomy to choose the way you work
  • Create our community together
  • Activities to support your well-being and health
  • Plenty of opportunities to donate to charities and support the environment
  • Modern office equipment

Additional Information:

Job Posted:
February 18, 2026

Employment Type:
Fulltime
Work Type:
Remote work

Similar Jobs for GCP Data Engineer - Lead Consultant

Principal Consulting AI / Data Engineer

As a Principal Consulting AI / Data Engineer, you will design, build, and optimi...
Location:
Australia, Sydney
Salary:
Not provided
DyFlex Solutions
Expiration Date
Until further notice
Requirements:
  • Proven expertise in delivering enterprise-grade data engineering and AI solutions in production environments
  • Strong proficiency in Python and SQL, plus experience with Spark, Airflow, dbt, Kafka, or Flink
  • Experience with cloud platforms (AWS, Azure, or GCP) and Databricks
  • Ability to confidently communicate and present at C-suite level, simplifying technical concepts into business impact
  • Track record of engaging senior executives and influencing strategic decisions
  • Strong consulting and stakeholder management skills with client-facing experience
  • Background in MLOps, ML pipelines, or AI solution delivery highly regarded
  • Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Job Responsibilities:
  • Design, build, and maintain scalable data and AI solutions using Databricks, cloud platforms, and modern frameworks
  • Lead solution architecture discussions with clients, ensuring alignment of technical delivery with business strategy
  • Present to and influence executive-level stakeholders, including boards, C-suite, and senior directors
  • Translate highly technical solutions into clear business value propositions for non-technical audiences
  • Mentor and guide teams of engineers and consultants to deliver high-quality solutions
  • Champion best practices across data engineering, MLOps, and cloud delivery
  • Build DyFlex’s reputation as a trusted partner in Data & AI through thought leadership and client advocacy
What we offer:
  • Work with SAP’s latest cloud technologies such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
  • A flexible and supportive work environment including work from home
  • Competitive remuneration and benefits including novated lease, birthday leave, salary packaging, wellbeing programme, additional purchased leave, and company-provided laptop
  • Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
  • Structured career advancement pathways with opportunities to lead large-scale client programs
  • Exposure to diverse industries and client environments, including executive-level engagement
  • Fulltime

Consulting AI / Data Engineer

As a Consulting AI / Data Engineer, you will design, build, and optimise enterpr...
Location:
Australia, Sydney
Salary:
Not provided
DyFlex Solutions
Expiration Date
Until further notice
Requirements:
  • Hands-on data engineering experience in production environments
  • Strong proficiency in Python and SQL
  • Experience with at least one additional language (e.g. Java, TypeScript/JavaScript)
  • Experience with modern frameworks such as Apache Spark, Airflow, dbt, Kafka, or Flink
  • Background in building ML pipelines, MLOps practices, or feature stores is highly valued
  • Proven expertise in relational databases, data modelling, and query optimisation
  • Demonstrated ability to solve complex technical problems independently
  • Excellent communication skills with ability to engage clients and stakeholders
  • Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Job Responsibilities:
  • Build and maintain scalable data pipelines for ingesting, transforming, and delivering data
  • Manage and optimise databases, warehouses, and cloud storage solutions
  • Implement data quality frameworks and testing processes to ensure reliable systems
  • Design and deliver cloud-based solutions (AWS, Azure, or GCP)
  • Take technical ownership of project components and lead small development teams
  • Engage directly with clients, translating business requirements into technical solutions
  • Champion best practices including version control, CI/CD, and infrastructure as code
What we offer:
  • Work with SAP’s latest cloud technologies such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
  • A flexible and supportive work environment including work from home
  • Competitive remuneration and benefits including novated lease, birthday leave, remote working, salary packaging, wellbeing programme, additional purchased leave, and company-provided laptop
  • Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
  • Structured career advancement pathways with mentoring from senior engineers
  • Exposure to diverse industries and client environments
  • Fulltime

Lead Consultant - Software Engineering

We are looking for a Lead Consultant in Software Engineering who will be primari...
Location:
Philippines
Salary:
Not provided
3Cloud
Expiration Date
Until further notice
Requirements:
  • Bachelor’s degree in computer science, engineering, data science or a related field
  • 3-4 years of experience managing a team / or equivalent
  • 5-7 years of experience in application development specializing in C# .NET Framework, Angular, Microservices, React, Node
  • Some relevant working experience in Azure, Azure App Services, Microsoft, and Agile
  • A passion for building front-end applications and cloud-native applications using PaaS, containers, serverless and modern front-end JS/TS
  • Desire for continuous learning and new technology to be used in the cloud space
  • Desire to be hands-on code but able to develop complex architectures in the cloud
  • Experience working in a consulting environment
  • Experience working with a major cloud platform (Azure, AWS, or GCP); Azure preferred
  • Very knowledgeable in OOA/OOD and Design Patterns
Job Responsibilities:
  • Own technical envisioning, design, project scope and working with the project team to deploy Azure-based solutions that meet our client’s needs
  • Translate requirements into a technical design leveraging existing tools, services and frameworks
  • Drive design and deployment of the client’s workloads into Azure by providing architectural guidance (including, design, implementation and deployment), supporting development of the client’s cloud adoption model, and providing appropriate recommendations to overcome blockers
  • Keep abreast of emerging technology trends and their impact on cloud solutions
  • Identify, validate, and grow opportunities to accelerate consumption in high potential customer accounts in partnership with the sales team, by driving solution architecture for both Microsoft and third-party solutions
  • As a senior role, you are required to train, shadow, and provide knowledge transfer to junior developers and other technology teams
What we offer:
  • Competitive compensation package, salary, allowance, standard benefits including quarterly and annual performance-based cash bonus and other remuneration
  • Great working environment and company culture with flexible work location
  • Fulltime

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location:
Not provided
Salary:
Not provided
Data Ideology
Expiration Date
Until further notice
Requirements:
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire)
  • Advanced Snowflake certifications preferred
Job Responsibilities:
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
  • Fulltime

Sales Engineering Lead

EvoluteIQ is seeking a Sales Engineering Lead to drive pre-sales and sales engin...
Location:
India, Bengaluru
Salary:
Not provided
EvoluteIQ
Expiration Date
Until further notice
Requirements:
  • 8–10 years of experience in pre-sales, solution engineering, or consulting for enterprise software, automation, or AI-driven platforms
  • Hands-on knowledge of process automation, AI/ML, data integration, API orchestration, or low-code/no-code environments
  • Experience collaborating with global channel partners, system integrators, or technology alliances
  • Familiarity with one or more BPM (Appian, Pega, Camunda), LCAP (OutSystems, Mendix), and automation stacks (e.g., ServiceNow, UiPath, Power Automate, Blue Prism, MuleSoft) and cloud platforms (AWS, Azure, or GCP), as well as Agentic AI and Generative AI technologies
  • Proven ability to connect technical capabilities with business outcomes and present to C-suite stakeholders
  • Strong communication, executive presentation, and solution storytelling abilities
  • Strategic thinker with the ability to influence joint go-to-market initiatives and co-create customer success outcomes
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • MBA preferred but not mandatory
Job Responsibilities:
  • Lead all technical pre-sales engagements for new opportunities, from discovery, requirement scoping, and solution design to technical validation, proposal creation and proof-of-concept delivery
  • Collaborate with business, sales, and technical teams to understand customer objectives and expand solution footprints
  • Translate business objectives and challenges into robust technical architectures and solution proposals that clearly articulate value, ROI, and platform differentiation
  • Design and deliver customized product demonstrations, solution architectures, and joint proof-of-concepts highlighting the EIQ platform’s unified automation capabilities
  • Coordinate for partner enablement workshops and provide ongoing knowledge transfer to enhance Eguardian’s sales and delivery maturity
  • Serve as a solution advisor by identifying and shaping new Agentic Automation use cases leveraging AI, ML, RPA, and orchestration
  • Act as a technical liaison between EvoluteIQ’s product, engineering, and partner management teams to ensure solution scalability and roadmap alignment
  • Design and present compelling solution demos, proofs of concept (POCs), and architecture blueprints tailored to client industries (banking, healthcare, insurance, telecom)
  • Build reusable demo assets, templates, and solution accelerators to support repeatable GTM success
  • Partner with Delivery teams to ensure a seamless handover from pre-sales to implementation
What we offer:
  • Opportunity to shape the strategy of a next-gen hyper-automation platform
  • Work with a cross-disciplinary team in a fast-growing, innovation-driven environment
  • Competitive compensation and growth opportunities
  • A culture of innovation, ownership, and continuous learning
  • Fulltime

Data Architect/Databricks Consultant

We are seeking a specialized Databricks Architect with deep expertise in cost op...
Location:
India, Hyderabad
Salary:
Not provided
Genzeon
Expiration Date
Until further notice
Requirements:
  • 8+ years of experience in big data architecture with focus on cost optimization
  • 5+ years of hands-on Databricks experience with proven cost reduction achievements
  • Demonstrated experience architecting and executing complete platform migrations from Databricks to alternative solutions with successful outcomes
  • 6+ years of advanced Apache Spark development and cluster management experience
  • Track record of achieving significant cost savings (minimum 40%+) in cloud data platforms
  • Expert knowledge of Databricks pricing models, compute types, and cost drivers
  • Experience with FinOps practices and cloud cost management tools
  • Proven ability to implement automated cost controls and budget management systems
  • Knowledge of alternative platforms and their cost structures (EMR, HDInsight, GCP Dataproc, etc.)
  • Deep expertise in migrating complex data workloads between different Spark platforms
Job Responsibilities:
  • Conduct comprehensive cost analysis and auditing of existing Databricks deployments across multiple workspaces
  • Develop and implement aggressive cost reduction strategies targeting 30-50% savings through cluster optimization
  • Design and deploy automated cost monitoring solutions with real-time alerts and budget controls
  • Optimize cluster configurations, auto-scaling policies, and job scheduling to minimize compute costs
  • Implement spot instance strategies and preemptible VM usage for non-critical workloads
  • Establish cost allocation frameworks and implement chargeback mechanisms for business unit accountability
  • Create cost governance policies and developer guidelines to prevent cost overruns
  • Analyze and optimize storage costs including Delta Lake table optimization and data lifecycle management
  • Lead strategic initiatives to migrate workloads away from Databricks to cost-effective alternatives
  • Assess existing Databricks implementations and create detailed migration roadmaps to target platforms

Presales AI & Data

As a Data & AI Presales Expert, you will play a critical role in driving busines...
Location:
Poland, Warsaw
Salary:
Not provided
Inetum
Expiration Date
Until further notice
Requirements:
  • Degree in Computer Science, Data Science, Engineering, or related field
  • 3 years of presales/technical consulting experience in AI/Data
  • Ability to lead workshops, Proof of Concepts (POCs), and technical demonstrations
  • Strong understanding of AI/ML concepts and data technologies
  • Experience with cloud AI/data services (AWS, Azure, GCP)
  • Fluency in Polish and professional English
Job Responsibilities:
  • Understand customer needs and translate them into AI/data solutions
  • Develop and present technical proposals and demonstrations
  • Design solution architectures and collaborate with internal teams
  • Act as a technical expert and advocate for our AI/data offerings
  • Support sales efforts, including RFPs/RFIs
  • Manage technical aspects of PoCs and pilot projects
  • Stay updated on AI/data trends and competitor landscape
What we offer:
  • Flexible working hours
  • Hybrid work model
  • Cafeteria system
  • Generous referral bonuses
  • Ongoing guidance from a dedicated Team Manager
  • Tailored technical mentoring
  • Dedicated team-building budget
  • Opportunities to participate in charitable initiatives and local sports programs
  • Supportive and inclusive work culture
  • Fulltime

Lead Data Engineer

The Lead Data Engineer is responsible for building and leading our data engineer...
Location:
Not provided
Salary:
Not provided
BASIC/DEPT®
Expiration Date
Until further notice
Requirements:
  • Extensive experience in data engineering with a proven track record of leading complex data initiatives
  • Strong technical leadership experience, with the ability to guide and develop engineering teams
  • Deep expertise in designing and implementing data architectures, data pipelines, and ETL processes
  • Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services
  • Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt)
  • Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.)
  • Strong understanding of data modeling, data governance, and data quality principles
  • Excellent communication skills with the ability to translate complex technical concepts for business stakeholders
  • Strategic thinking with the ability to develop and execute technical roadmaps
  • Experience working in an agency or consulting environment is highly advantageous
Job Responsibilities:
  • Lead the strategic development of our data engineering proposition and capabilities
  • Set technical standards and best practices for data engineering across EMEA
  • Guide and mentor data engineers, fostering a culture of technical excellence and innovation
  • Design and implement scalable, reliable data architectures and pipelines for enterprise clients
  • Serve as the technical authority on client engagements, providing expert guidance
  • Build and grow a high-performing data engineering team
  • Work closely with the broader Data & AI practice to deliver integrated solutions
  • Support business development efforts through technical expertise in client pitches
  • Stay ahead of emerging technologies and methodologies in the data engineering space
What we offer:
  • A reputation for doing good
  • Awesome clients
  • The opportunity for possibility with training, development and certifications
  • Global annual DEPT® Cares Month in which employees come together and donate their skills to support local charities
  • Additionally, each office has its own long list of local benefits