
Database Developer (Python + DBT)


Wissen

Location: India, Bangalore South

Category: IT - Software Development

Contract Type: Not provided

Salary: Not provided

Job Description:

Wissen Technology is hiring a Database Developer (Python + DBT) who will design, develop, and maintain scalable ETL/ELT pipelines, build automation workflows, and collaborate with cross-functional teams to deliver high-quality data solutions.

Job Responsibilities:

  • Design, develop, and maintain scalable ETL/ELT pipelines using DBT
  • Write efficient, high-quality SQL for data extraction, transformation, and modeling
  • Build automation scripts and data processing workflows using Python
  • Collaborate with data analysts, BI developers, and business stakeholders to understand data needs
  • Implement best practices for data quality, testing, documentation, and version control
  • Optimize data pipelines for performance, scalability, and reliability
  • Monitor data workflows and troubleshoot issues proactively
  • Integrate data from multiple sources into centralized data platforms

Requirements:

  • Strong hands-on experience in SQL (complex queries, performance tuning, data modeling)
  • Proficiency in Python for data processing, automation, and scripting
  • Solid practical experience with DBT (models, tests, macros, documentation)
  • Experience working with cloud data warehouses (e.g., Snowflake, BigQuery, Redshift)
  • Good understanding of data warehousing concepts and ETL/ELT methodologies
  • Familiarity with version control (Git) and CI/CD workflows
  • Strong problem-solving skills and attention to detail
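To illustrate the Python, SQL, and DBT-style skills listed above, here is a minimal, hypothetical ELT sketch. All table and column names (`raw_orders`, `fct_orders`) are invented for the example, and an in-memory SQLite database stands in for a cloud warehouse; in practice DBT would manage the transformation and testing steps.

```python
import sqlite3

def run_elt(raw_rows):
    """Load raw rows into a scratch database, then transform with SQL."""
    conn = sqlite3.connect(":memory:")  # stand-in for a cloud warehouse
    cur = conn.cursor()
    cur.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
    cur.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_rows)
    # Model step: keep only completed orders, as a DBT model's SELECT would
    cur.execute(
        "CREATE TABLE fct_orders AS "
        "SELECT id, amount FROM raw_orders WHERE status = 'completed'"
    )
    # Quality check: row count and total, the kind of assertion a DBT test encodes
    cur.execute("SELECT COUNT(*), SUM(amount) FROM fct_orders")
    return cur.fetchone()

rows = [(1, 10.0, "completed"), (2, 5.0, "cancelled"), (3, 7.5, "completed")]
print(run_elt(rows))  # → (2, 17.5)
```

The same extract-load-transform shape scales up when the SQL models live in a DBT project and the Python layer handles orchestration and automation around them.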

Nice to have:

  • Exposure to data pipeline orchestration tools
  • Knowledge of advanced DBT features and best practices
  • Experience with containerization (Docker) and cloud deployment

Additional Information:

Job Posted:
December 13, 2025

Employment Type:
Full-time



Similar Jobs for Database Developer (Python + DBT)

Data Engineer

At Allianz Technology, we power the digital transformation of the Allianz Group....
Location: Spain, Barcelona
Salary: Not provided
Allianz
Expiration Date: Until further notice

Requirements:
  • Proficient in Python for development, automation, and data processing
  • Strong experience using DBT for data transformation and management in cloud environments
  • Solid experience with Azure Data Factory (ADF) for orchestrating ETL workflows
  • Familiarity with Jenkins for CI/CD workflows
  • Expertise in SQL for querying databases and building data models
  • Ability to design and implement effective data models, create databases, and optimize performance
  • Knowledge of Agile methodology and familiarity with data governance and security best practices
  • Strong problem-solving, troubleshooting, and collaboration skills, with the ability to thrive in a dynamic environment
  • Experience with data warehousing and cloud-based platforms (Azure)
  • Familiarity with APIs for integrating third-party systems into data workflows
  • Experience in data modeling and data analytics platforms
Job Responsibilities:
  • Design and implement scalable ETL pipelines using DBT and Azure Data Factory (ADF) to process and transform data
  • Integrate DBT with ADF runtime environments and leverage APIs for seamless execution
  • Write high-quality, well-documented Python code for data transformation, extraction, and automation processes
  • Use Visual Studio Code for efficient development and manage project versioning with GitHub
  • Collaborate closely with the team to design and maintain SQL-based data models and data warehouse solutions
  • Collaborate with cross-functional teams to understand data requirements and ensure data accessibility and usability
  • Implement Jenkins for continuous integration and delivery (CI/CD) to automate data pipeline workflows
  • Create and maintain automated workflows to enhance business intelligence, reporting, and data insights
  • Troubleshoot, resolve, and optimize data pipeline issues to support large-scale data processing and ensure consistent data quality
  • Proactively monitor data pipelines, ensuring data accuracy, consistency, and reliability
What we offer:
  • Hybrid work model that balances in-person collaboration and remote working, including up to 25 days per year working from abroad
  • Company bonus scheme, pension, employee shares program, and multiple employee discounts (details vary by location)
  • Career development and digital learning programs, international career mobility, and lifelong learning in an environment where innovation, delivery, and empowerment are fostered
  • Flexible working and health and wellbeing offers (including healthcare and parental leave benefits) to support balancing family and career and to help people return from career breaks
Employment Type: Full-time

Senior Data Engineer

As a Senior Data Engineer, you will be pivotal in designing, building, and optim...
Location: United States
Salary: 102000.00 - 125000.00 USD / Year
Wpromote
Expiration Date: Until further notice

Requirements:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent practical experience
  • 4+ years of experience in data engineering or a related field
  • Intermediate to advanced programming skills in Python
  • Proficiency in SQL and experience with relational databases
  • Strong knowledge of database and data warehousing design and management
  • Strong experience with DBT (data build tool) and test-driven development practices
  • Proficiency with at least one cloud database (e.g. BigQuery, Snowflake, Redshift)
  • Excellent problem-solving skills, project management habits, and attention to detail
  • Advanced level Excel and Google Sheets experience
  • Familiarity with data orchestration tools (e.g. Airflow, Dagster, AWS Glue, Azure Data Factory)
Job Responsibilities:
  • Developing data pipelines leveraging a variety of technologies including dbt and BigQuery
  • Gathering requirements from non-technical stakeholders and building effective solutions
  • Identifying areas of innovation that align with existing company and team objectives
  • Managing multiple pipelines across Wpromote’s client portfolio
What we offer:
  • Half-day Fridays year round
  • Unlimited PTO
  • Extended Holiday break (Winter)
  • Flexible schedules
  • Work from anywhere options*
  • 100% paid parental leave
  • 401(k) matching
  • Medical, Dental, Vision, Life, Pet Insurance
  • Sponsored life insurance
  • Short Term Disability insurance and additional voluntary insurance
Employment Type: Full-time

Data Engineer

You will be part of a world-class Data Engineering team, where you will influenc...
Location: India, Bengaluru
Salary: Not provided
Atlassian
Expiration Date: Until further notice

Requirements:
  • Bachelor’s degree (or equivalent work experience) in a STEM field
  • Proficiency in Python or another modern programming language
  • Proficiency in SQL and experience with relational databases
Job Responsibilities:
  • Influence product teams
  • Inform Data Science and Analytics Platform teams
  • Partner closely with data consumers and producers to ensure quality and usefulness of data assets
  • Defining metrics
  • Instrumenting logging
  • Acquiring/ingesting data
  • Architecting & modeling data
  • Transforming data
  • Ensuring data quality, governance, and enablement
  • Alerting, visualization, and reporting
What we offer:
  • Health coverage
  • Paid volunteer days
  • Wellness resources
Employment Type: Full-time

Staff Software Engineer

At dbt Labs, our mission is to empower data practitioners to create and dissemin...
Location: India
Salary: Not provided
dbt Labs
Expiration Date: Until further notice

Requirements:
  • 6+ years of professional software development experience, with a focus on building and operating backend systems in a production environment
  • Deep backend engineering experience and proficiency in Python
  • A systematic, first-principles approach to problem-solving, with comfort navigating and debugging in Linux environments (e.g., understanding process scheduling, memory management, file systems)
  • Solid grasp of database internals, such as query execution engines, storage layers, indexing strategies, and transaction management
  • Excellent communication skills and a sense of ownership, with the ability to balance technical depth with fast, iterative delivery
Job Responsibilities:
  • Dive deep into the dbt-core execution engine, using your systems expertise to identify, debug, and eliminate performance bottlenecks in our Python codebase
  • Architect and implement improvements to our adapter interface, the crucial layer that connects dbt to a growing ecosystem of databases and data platforms, enabling the community to build more powerful and efficient integrations
  • Debug complex, system-level issues that span from Linux process management and concurrency models to database query planning and network protocols
  • Lead technical design discussions and contribute to the long-term architectural roadmap for dbt Core, making key decisions about what, how, and when we build to ensure its scalability and reliability for years to come
  • Mentor other engineers and review contributions from our vibrant open-source community, upholding our high standards for code quality, testing, and design while fostering a collaborative and inclusive environment
  • Take strong ownership of our distributed systems, troubleshoot complex production issues, and participate in an on-call rotation to maintain high availability and deliver a resilient platform experience
What we offer:
  • Equity Stake
  • Unlimited Vacation (and we encourage you to use it!)
  • Excellent Healthcare Insurance
  • Paid Parental Leave
  • Wellness Stipend
Employment Type: Full-time

Data Engineer

Location: Not provided
Salary: Not provided
Ryz Labs
Expiration Date: Until further notice

Requirements:
  • Minimum 5 years of experience designing, building, and maintaining data platforms
  • Proficiency building software systems, testing, and the fundamental principles of software design (e.g. SOLID)
  • Proficiency in Python and experience with data engineering libraries and frameworks
  • Experience optimizing query performance on cloud data warehouse platforms and relational databases (e.g. Snowflake)
  • Experience with DBT for data transformation
  • Experience with data pipeline orchestration tools (e.g., Airflow)
  • Familiarity with DevOps tooling (e.g. Docker, K8s, Helm, GitHub Actions)
  • Familiarity with various components of cloud computing platforms (e.g. AWS)
  • Excellent team player with strong communication skills
  • Comfortable collaborating with stakeholders and navigating ambiguity to understand product requirements
Job Responsibilities:
  • Design, build, and maintain scalable and reliable data pipelines using Python, Airflow, and other tools
  • Collaborate closely with data scientists and analysts to understand their requirements and translate them into efficient data processing workflows
  • Manage and optimize data storage and querying in Snowflake to ensure high performance and reliability
  • Implement best practices for data governance, security, and privacy
  • Coordinate and participate in on-call rotations
  • Develop a culture of high ownership, innovation, empathy, and collaboration
  • Make investments in engineering reliability, productivity, and excellence
  • Collaborate with stakeholders across the world to drive teams towards high impact

Data Engineer, Solutions Architecture

We are seeking a talented Data Engineer to design, build, and maintain our data ...
Location: United States, Scottsdale
Salary: 90000.00 - 120000.00 USD / Year
Clearway Energy
Expiration Date: Until further notice

Requirements:
  • 2-4 years of hands-on data engineering experience in production environments
  • Bachelor's degree in Computer Science, Engineering, or a related field
  • Proficiency in Dagster or Airflow for pipeline scheduling, dependency management, and workflow automation
  • Advanced-level Snowflake administration, including virtual warehouses, clustering, security, and cost optimization
  • Proficiency in dbt for data modeling, testing, documentation, and version control of analytical transformations
  • Strong Python and SQL skills for data processing and automation
  • 1-2+ years of experience with continuous integration and continuous deployment practices and tools (Git, GitHub Actions, GitLab CI, or similar)
  • Advanced SQL skills, database design principles, and experience with multiple database platforms
  • Proficiency in AWS/Azure/GCP data services, storage solutions (S3, Azure Blob, GCS), and infrastructure as code
  • Experience with APIs, streaming platforms (Kafka, Kinesis), and various data connectors and formats
Job Responsibilities:
  • Design, deploy, and maintain scalable data infrastructure to support enterprise analytics and reporting needs
  • Manage Snowflake instances, including performance tuning, security configuration, and capacity planning for growing data volumes
  • Optimize query performance and resource utilization to control costs and improve processing speed
  • Build and orchestrate complex ETL/ELT workflows using Dagster to ensure reliable, automated data processing for asset management and energy trading
  • Develop robust data pipelines that handle high-volume, time-sensitive energy market data and asset generation and performance metrics
  • Implement workflow automation and dependency management for critical business operations
  • Develop and maintain dbt models to transform raw data into business-ready analytical datasets and dimensional models
  • Create efficient SQL-based transformations for complex energy market calculations and asset performance metrics
  • Support advanced analytics initiatives through proper data preparation and feature engineering
  • Implement comprehensive data validation, testing, and monitoring frameworks to ensure accuracy and consistency across all energy and financial data assets
What we offer:
  • Generous PTO
  • Medical, dental & vision care
  • HSAs with company contributions
  • Health FSAs
  • Dependent daycare FSAs
  • Commuter benefits
  • Relocation
  • A 401(k) plan with employer match
  • A variety of life & accident insurances
  • Fertility programs
Employment Type: Full-time

Senior Data Engineer

This project is designed for consulting companies that provide analytics and pre...
Location: Not provided
Salary: Not provided
Lightpoint Global
Expiration Date: Until further notice

Requirements:
  • Successfully implemented and released data integration services or APIs using modern Python frameworks in the past 4 years
  • Successfully designed data models and schemas for analytics or data warehousing solutions
  • Strong analysis and problem-solving skills
  • Strong knowledge of the Python programming language and data engineering
  • Deep understanding of good programming practices, design patterns, and software architecture principles
  • Ability to work as part of a team by contributing to product backlog reviews and solution design and implementation
  • Discipline in implementing software in a timely manner without compromising product quality
  • Formal training in software engineering, computer science, computer engineering, or data engineering
  • Working knowledge of Apache Airflow or a similar technology for workflow orchestration
  • Working knowledge of dbt (data build tool) for analytics transformation workflows
Job Responsibilities:
  • Work in an agile team to design, develop, and implement data integration services that connect diverse data sources, including event tracking platforms (GA4, Segment), databases, APIs, and third-party systems
  • Build and maintain robust data pipelines using Apache Airflow, dbt, and Spark to orchestrate complex workflows and transform raw data into analytics-ready datasets in Snowflake
  • Develop Python-based integration services and APIs that enable seamless data flow between various data technologies and downstream applications
  • Collaborate actively with data analysts, analytics engineers, and platform teams to understand requirements, troubleshoot data issues, and optimize pipeline performance
  • Participate in code reviews, sprint planning, and retrospectives to ensure high-quality, production-ready code by the end of each sprint
  • Contribute to the continuous improvement of data platform infrastructure, development practices, and deployment processes in accordance with CI/CD best practices
Employment Type: Full-time

Senior Analytics Engineer

We are seeking a highly skilled and motivated Senior Analytics Engineer to play ...
Location: United States, New York
Salary: 200000.00 - 225000.00 USD / Year
EvolutionIQ
Expiration Date: Until further notice

Requirements:
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • 5+ years of experience in data engineering, with a focus on building analytical data platforms
  • Demonstrated experience with cloud data warehousing solutions (e.g. BigQuery, Snowflake, Redshift)
  • Strong proficiency in SQL and experience with data modeling techniques (e.g., star schema, dimensional modeling)
  • Experience building and maintaining ETL/ELT pipelines using tools like Dagster, Apache Airflow, dbt, or similar
  • Experience with programming languages such as Python, Java, or Scala
  • Excellent problem-solving and analytical skills, along with passion for data and a commitment to data quality
Job Responsibilities:
  • Lead the design, development, and implementation of a scalable, reliable, and efficient analytical data platform
  • Influence decisions on technology and tool selection for data ingestion, storage, transformation, and analysis for the analytics platform
  • Contribute to the overall data architecture and strategy, ensuring alignment with business needs and best practices
  • Build and maintain robust ETL/ELT pipelines to ingest, transform, and load data from various sources into the data warehouse (e.g., cloud storage, databases, APIs)
  • Develop and optimize data models for analytical use cases, ensuring data quality, consistency, and accessibility
  • Establish and enforce data quality standards and processes to ensure data accuracy and integrity
  • Implement data governance policies and procedures to manage data access, security, and compliance for analytics use cases
  • Proactively identify and address data quality issues, working with stakeholders to resolve root causes
  • Partner with data scientists, analysts, and other engineers to understand their data needs and provide solutions
  • Effectively communicate technical concepts and designs to both technical and non-technical audiences
What we offer:
  • Medical, dental, vision, short & long-term disability, life insurance and AD&D, and 401k matching
  • Additional family, wellness, and pet benefits
  • Paid time off and sick leave, 100% paid parental leave (16 weeks for primary caregivers and 12 weeks for secondary caregivers)
  • Flexible schedule for new parents returning to work
  • Catered lunches, happy hours, pet-friendly spaces, and monthly technology stipend
  • $1,000/year for each employee for professional development, as well as opportunities for tuition reimbursement
  • Annual bonus plan and company equity plan (RSUs)
Employment Type: Full-time