CrawlJobs

Python Airflow Developer


Realign


Location:
Toronto, Canada


Contract Type:
Not provided


Salary:

140,000 USD / year

Job Description:

Python Airflow Developer, Toronto, ON (Onsite)

Requirements:

  • Design and maintenance of services in Python
  • Provide guidance and expertise for Python solutions
  • Responsible for the end-to-end technical solution as defined by the solution architecture and resolving design ambiguity
  • Provide technical leadership, expert counsel and guidance to the development team, adhering to solution architecture and best practices
  • Leads complex group meetings (including business partners) for technical design, decision making, problem solving, implementation and strategic planning
  • Provides direction, expertise, feedback, coaching and development to build the capability of junior technical development staff
  • Conduct analysis in written and/or diagram form to provide feedback
  • Create, design, analyze, develop, and debug various tasks such as updating table entries and creating program variants
  • Develop interfaces to transfer data between two systems
  • Support the integration projects through various phases from gathering business requirements to go-live and post-implementation support
  • Analyze business requirements and provide guidance and clarity
  • Support the assessment of change requests (corrections, enhancements), proposing and developing solutions
  • Support the team during requirements and testing
  • Perform unit testing for the developed objects
  • Prepare technical design document of the development
  • Prepare PRP document and provide walkthrough
  • Implement the changes using the ServiceNow ticketing process
  • Create implementation plan and preparations for Go-Live
  • Develop accurate estimates for completion of technical tasks
  • Manage risks, assumptions and constraints and communicate them to appropriate parties
  • For production problem tickets, perform root-cause analysis and provide options for solution
  • Provide first-class support to the immediate team and all partners
  • Develop API web services
  • Provide guidance to adhere to compliance items like server
  • Must-have: undergraduate Computer Science or Engineering degree, or equivalent experience
  • Proven experience leading a development team on large projects or programs interfacing with multiple applications and/or third parties
  • Proven experience as a senior Python developer
  • Proven experience with Oracle, MS-SQL, Postgres and Airflow
  • Experience in front-end and back-end development
  • Knowledge of the flow of data in inbound and outbound files
  • Proven experience with agile delivery methodology and governance
  • Ability to successfully multi-task
  • Solid grasp of OO principles
  • 5 years’ experience with Java technologies such as Spring, Spring Boot, JAX-WS and JAX-RS
  • Experience with IntelliJ and Eclipse IDEs
  • Experience with a source code management system such as Git
  • Experience with DevOps tools such as Jenkins
  • Experience with PCF, OCP, Azure, AWS, web services (SOAP and REST), and middleware technologies (e.g., MQSeries, Kafka, Redis)
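Several of the bullets above come down to moving data between two relational systems (the posting names Oracle, MS-SQL, and Postgres). A minimal sketch of such a transfer interface, using Python's built-in sqlite3 module as a stand-in for the actual databases; the `transfer_rows` function and the `(id, payload)` table shape are hypothetical illustrations, not part of the posting:

```python
import sqlite3


def transfer_rows(src_path: str, dst_path: str, table: str) -> int:
    """Copy every row of `table` from one SQLite database to another.

    A stand-in for the "transfer data between two systems" interface the
    role describes; a real implementation would use the appropriate
    Oracle/MS-SQL/Postgres drivers instead of sqlite3.
    """
    src = sqlite3.connect(src_path)
    dst = sqlite3.connect(dst_path)
    try:
        # Extract all rows from the source system.
        rows = src.execute(f"SELECT id, payload FROM {table}").fetchall()
        # Ensure the target table exists, then load idempotently.
        dst.execute(
            f"CREATE TABLE IF NOT EXISTS {table} "
            "(id INTEGER PRIMARY KEY, payload TEXT)"
        )
        dst.executemany(
            f"INSERT OR REPLACE INTO {table} (id, payload) VALUES (?, ?)", rows
        )
        dst.commit()
        return len(rows)
    finally:
        src.close()
        dst.close()
```

The `INSERT OR REPLACE` keeps reruns idempotent, which matters for the post-implementation support duties listed above.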

Nice to have:

  • JCL knowledge/experience, or willingness to learn
  • Basic knowledge of API web services
  • Experience creating flows and diagrams in Visio

Additional Information:

Job Posted:
March 21, 2026

Employment Type:
Full-time
Work Type:
On-site work



Similar Jobs for Python Airflow Developer

Python Data Developer

We’re looking for a proactive and experienced developer to join our dynamic team...
Location:
Lisbon, Portugal
Salary:
Not provided
Egor
Expiration Date
Until further notice
Requirements
  • Strong proficiency in Python
  • Hands-on experience with Google Cloud Platform (GCP)
  • Solid understanding of Airflow / Cloud Composer
  • Expertise in SQL and relational databases
Job Responsibility
  • Design, develop, and maintain robust data pipelines using Python and Apache Airflow (Cloud Composer)
  • Leverage the power of Google Cloud Platform (GCP) to create scalable and reliable data solutions
  • Write clean, efficient, and optimized SQL queries for data transformation and extraction
  • Collaborate with cross-functional teams to improve data quality, consistency, and accessibility
  • Participate in code reviews and contribute to a high standard of technical excellence
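The pipeline responsibilities above (Python with Apache Airflow / Cloud Composer, SQL-style transformations) can be sketched as a minimal DAG definition. This assumes Airflow 2.x with the TaskFlow API; the `sales_pipeline` name, the schedule, and the stubbed task bodies are hypothetical stand-ins, and the file only takes effect when parsed by an Airflow scheduler:

```python
# Minimal daily ETL pipeline as an Airflow DAG. Cloud Composer runs
# standard Airflow, so the same definition applies there.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def sales_pipeline():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a source system (stubbed here).
        return [{"sku": "A1", "qty": "3"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Clean and coerce fields before loading.
        return [{**r, "qty": int(r["qty"])} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Write to the warehouse (stubbed here).
        print(f"loaded {len(rows)} rows")

    # Task dependencies fall out of the data flow: extract -> transform -> load.
    load(transform(extract()))


sales_pipeline()
```

Chaining the decorated tasks through their return values is what gives Airflow the dependency graph to schedule, retry, and monitor.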
What we offer
  • Work on impactful projects with cutting-edge technologies
  • Be part of a collaborative, innovative team within a leading multinational environment
  • Full-time

Python Developer

The IT company Andersen invites a Python Developer in Abu Dhabi to join its team...
Location:
Abu Dhabi, United Arab Emirates
Salary:
Not provided
Andersen
Expiration Date
Until further notice
Requirements
  • Experience in backend development using Python for 5+ years
  • Strong experience with FastAPI for building RESTful APIs and microservices
  • Hands-on experience with API gateways (e.g., Kong, WSO2) and authentication/authorization solutions (e.g., Keycloak, Active Directory/ADFS)
  • Familiarity with CDN configuration (CloudFront, Cloudflare, Fastly, Akamai) for static assets and API edge caching
  • Experience integrating messaging systems (e.g., Kafka, RabbitMQ) and event-driven architectures
  • Knowledge of CI/CD pipelines and version control (Git, Azure DevOps)
  • Backend Development: Python, FastAPI, REST API design and implementation
  • Microservices Architecture: Design, deployment, and maintenance
  • Data Management: PostgreSQL, Oracle PL/SQL
  • Caching with Redis
Job Responsibility
  • Developing, maintaining, and enhancing web applications
  • Writing clean, efficient, and well-documented code
  • Collaborating with the team to understand project requirements and specifications
  • Integrating third-party APIs and services
  • Troubleshooting, debugging, and resolving technical issues
  • Implementing security and data protection measures
  • Optimizing application performance and scalability
  • Conducting code reviews and providing constructive feedback to team members
  • Ensuring code quality through different types of testing and continuous integration
What we offer
  • Experience in teamwork with leaders in FinTech, Healthcare, Retail, Telecom, and others
  • The opportunity to change the project and/or develop expertise in an interesting business domain
  • Guarantee of professional, financial, and career growth
  • The company has introduced systems of mentoring and adaptation for each new employee
  • The opportunity to earn up to an additional 1,000 USD per month, depending on the level of expertise, which will be included in the annual bonus, by participating in the company's activities
  • Access to the corporate training portal, where the entire knowledge base of the company is collected and which is constantly updated
  • Bright corporate life (parties / pizza days / PlayStation / fruits / coffee / snacks / movies)
  • Certification compensation (AWS, PMP, etc)
  • Referral program
  • English courses

Software Developer

Sopra Steria, a major Tech player in Europe, is hiring a Software Developer skil...
Location:
Bengaluru, India
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements
  • Python
  • PySpark
  • AWS (Lambda, RDS)
  • Spark SQL
  • Experience with Foundry (Slate, Workbook, etc.)
  • Schedulers (like Airflow)
  • Proficiency in Python, SQL, and/or Spark with minimum of 3 years working experience
  • Solid understanding of data modeling and data warehousing concepts
  • Experience working with data integration, transformation, and visualization tools
  • Full-stack knowledge/experience in Python & PySpark
Job Responsibility
  • To deliver all activities assigned within agreed KPIs
  • Develop and maintain adequate competences needed to meet expected deliveries
  • Build and maintain healthy relationship with all stakeholders including customers
  • Continuously look for opportunities to increase/optimize efficiency of operations
  • Capture learnings and share knowledge within the team
  • Launch initiatives and ensure meaningful closure
  • Regular reporting on the status to required Stakeholders
  • Adequate and timely communication to stakeholders
  • Exchange feedback with team members
  • Abide by company policies
What we offer
  • Inclusive and respectful work environment
  • Open to people with disabilities
  • Full-time

Managed Airflow Platform (MAP) Support Engineer

Location:
Salary:
Not provided
Kloud9
Expiration Date
Until further notice
Requirements
  • Bachelor’s or Master’s degree in Computer Science or a related field
  • 3+ years of experience in large-scale production-grade platform support, including participation in on-call rotations
  • 3+ years of hands-on experience with cloud platforms like AWS, Azure, or GCP
  • 2+ years of experience developing and supporting data pipelines using Apache Airflow including DAG lifecycle management and scheduling best practices
  • Troubleshooting task failures, scheduler issues, performance bottlenecks, and error handling
  • Strong programming proficiency in Python, especially for developing and troubleshooting RESTful APIs
  • 1+ years of experience in observability using the ELK stack (Elasticsearch, Logstash, Kibana) or Grafana Stack
  • 2+ years of experience with DevOps and Infrastructure-as-Code tools such as GitHub, Jenkins, Docker, and Terraform
  • 2+ years of hands-on experience with Kubernetes, including managing and debugging cluster resources and workloads within Amazon EKS
  • Exposure to Agile and test-driven development a plus
Job Responsibility
  • Evangelize and cultivate adoption of Global Platforms, open-source software and agile principles within the organization
  • Ensure solutions are designed and developed using a scalable, highly resilient cloud native architecture
  • Ensure the operational stability, performance, and scalability of cloud-native platforms through proactive monitoring and timely issue resolution
  • Diagnose infrastructure and system issues across cloud environments and Kubernetes clusters, and lead efforts in troubleshooting and remediation
  • Collaborate with engineering and infrastructure teams to manage configurations, resource tuning, and platform upgrades without disrupting business operations
  • Maintain clear, accurate runbooks, support documentation, and platform knowledge bases to enable faster onboarding and incident response
  • Support observability initiatives by improving logging, metrics, dashboards, and alerting frameworks
  • Advocate for operational excellence and drive continuous improvement in system reliability, cost-efficiency, and maintainability
  • Work with product management to support product / service scoping activities
  • Work with leadership to define delivery schedules of key features through an agile framework
What we offer
  • Kloud9 provides a robust compensation package and a forward-looking opportunity for growth in emerging fields

BI Manager

Groupon is a marketplace where customers discover new experiences and services e...
Location:
Salary:
Not provided
Groupon
Expiration Date
Until further notice
Requirements
  • Leadership: 2+ years managing or mentoring a data team (sprint planning, code reviews, career development)
  • Technical Fluency: Advanced SQL is mandatory
  • You can read/debug Python or Airflow DAGs
  • You understand data modeling (Star Schema, Data Marts) and cloud warehouses (BigQuery)
  • Business Acumen: You can explain 'Revenue Recognition' or 'Conversion Funnels' to an engineer, and 'ETL Latency' to a Sales Director
  • Data analytics: Expert-level proficiency in analytics and data visualisation
  • You know how to design for usability, not just aesthetics
  • AI first mindset: be a pioneer and lead by example on AI usage for both personal effectiveness (MCPs, N8N, cursor AI or Claude code) as well as AI first solution (text mining, optimisation, innovative solution)
Job Responsibility
  • Squad Leadership: Manage the backlog, capacity, and development of ~5 data professionals
  • Translate vague business problems into technical specs
  • Data Product Ownership: Own the end-to-end lifecycle of your domain’s data—from ingestion design (reviewing Engineering plans) to final visualization (Tableau)
  • Stakeholder Management: Act as the single point of truth for your domain
  • Negotiate priorities with Business Leaders to ensure the team works on high-ROI tasks
  • Modernization: Be part of data platform unification project to decommission legacy systems running on Teradata and Hive and go fully to GCP native
  • Quality & Governance: Enforce best practices on data governance and documentation, ensure metric consistency
  • If the data breaks, you lead the fix

Regular Data Engineer

Inetum Polska is part of the global Inetum Group and plays a key role in driving...
Location:
Warsaw, Poland
Salary:
Not provided
Inetum
Expiration Date
Until further notice
Requirements
  • Minimum 2 years of professional experience as a programmer working with large datasets
  • Experience in at least 1 project involving the processing of large datasets
  • Experience in at least 1 project programming with Python
  • Experience in at least 1 project within an on-premise computing environment
  • Proven experience programming with Apache Spark
  • Proven experience programming with Python
  • Proven experience programming with Apache Airflow
  • Proven experience programming with SQL
  • Familiarity with Hadoop concepts
  • Proven experience in programming ELT/ETL processes
Job Responsibility
  • Design, develop, and implement efficient ELT/ETL processes for large datasets
  • Build and optimize data processing workflows using Apache Spark
  • Utilize Python for data manipulation, transformation, and analysis
  • Develop and manage data pipelines using Apache Airflow
  • Write and optimize SQL queries for data extraction, transformation, and loading
  • Collaborate with data scientists, analysts, and other engineers to understand data requirements and deliver effective solutions
  • Work within an on-premise computing environment for data processing and storage
  • Ensure data quality, integrity, and performance throughout the data lifecycle
  • Participate in the implementation and maintenance of CI/CD pipelines for data processes
  • Utilize Git for version control and collaborative development
What we offer
  • Flexible working hours
  • Hybrid work model
  • Cafeteria system
  • Generous referral bonuses
  • Additional revenue sharing opportunities
  • Ongoing guidance from a dedicated Team Manager
  • Tailored technical mentoring
  • Dedicated team-building budget
  • Opportunities to participate in charitable initiatives and local sports programs
  • Supportive and inclusive work culture
  • Full-time

Software Engineer-Snowflake

Join our Snowflake Managed Services team as a Software Engineer to work on data ...
Location:
Hyderabad, India
Salary:
Not provided
Genzeon
Expiration Date
Until further notice
Requirements
  • 4+ years of hands-on experience in Snowflake development and support
  • Strong SQL, data modeling, and performance tuning experience
  • Exposure to CI/CD pipelines and scripting languages (e.g., Python, Shell)
  • Understanding of Snowflake security (RBAC), warehouse sizing, cost controls
  • Experience with data pipelines and orchestration tools (Airflow, dbt, ADF)
Job Responsibility
  • Design and develop Snowflake pipelines, data models, and transformations
  • Provide L2/L3 production support for Snowflake jobs, queries, and integrations
  • Troubleshoot failed jobs, resolve incidents, and conduct RCA
  • Tune queries, monitor warehouses, and help optimize Snowflake usage and cost
  • Handle service requests like user provisioning, access changes, and role management
  • Participate in code reviews, deployment pipelines, and continuous improvement
  • Document issues, enhancements, and standard procedures (runbooks)

Senior Data Engineer

Come work on fantastically high-scale systems with us! Blis is an award-winning,...
Location:
Edinburgh, United Kingdom
Salary:
Not provided
Blis
Expiration Date
Until further notice
Requirements
  • 5+ years’ direct experience delivering robust, performant data pipelines within the constraints of strict SLAs and commercial financial footprints
  • Proven experience in architecting, developing, and maintaining Apache Druid and Imply platforms, with a focus on DevOps practices and large-scale system re-architecture
  • Mastery of building Pipelines in GCP maximising the use of native and native supporting technologies e.g. Apache Airflow
  • Mastery of Python for data and computational tasks with fluency in data cleansing, validation and composition techniques
  • Hands-on implementation and architectural familiarity with all forms of data sourcing, i.e. streaming data, relational and non-relational databases, and distributed processing technologies (e.g. Spark)
  • Fluency with all appropriate python libraries typical of data science e.g. pandas, scikit-learn, scipy, numpy, MLlib and/or other machine learning and statistical libraries
  • Advanced knowledge of cloud based services specifically GCP
  • Excellent working understanding of server-side Linux
  • Professional in managing and updating on tasks ensuring appropriate levels of documentation, testing and assurance around their solutions
Job Responsibility
  • Design, build, monitor, and support large scale data processing pipelines
  • Support, mentor, and pair with other members of the team to advance our team’s capabilities and capacity
  • Help Blis explore and exploit new data streams to innovate and support commercial and technical growth
  • Work closely with Product and be comfortable with taking, making and delivering against fast paced decisions to delight our customers