
Senior Data Engineer

Blue Margin


Location:
Fort Collins, United States

Category:
IT - Software Development


Contract Type:
Not provided


Salary:

110,000 - 140,000 USD / year

Job Description:

At Blue Margin, we are on a mission to build the go-to data platform for PE-backed mid-market companies. We are a dynamic, customer-focused company providing hosted data platforms for companies across many industries, helping mid-market clients turn their data into a strategic asset. Our clients rely on us to design and deliver reporting platforms that fuel better, faster decision-making and increase company value, and we are looking for a Senior Data Engineer to strengthen our team.

As a Senior Data Engineer, you will lead the design, optimization, and scalability of data platforms that power analytics for our clients. You will be hands-on with data pipelines, large-scale data processing, and modern cloud data stacks while mentoring team members and helping shape best practices. This role requires strong expertise in Python (PySpark/Apache Spark), deep knowledge of working with high-volume data, and experience optimizing Delta Lake–based architectures. Exposure to Snowflake or Microsoft Fabric, and to tools such as Fivetran, Azure Data Factory, and Synapse Pipelines, is highly valued.

If you are motivated by solving complex data challenges, thrive in a collaborative environment, and enjoy applying AI to increase engineering productivity, this role offers the opportunity to make a significant technical and strategic impact.

Job Responsibility:

  • Architect, design, and optimize large-scale data pipelines using tools like PySpark, SparkSQL, Delta Lake, and cloud-native tools
  • Drive efficiency in incremental/delta data loading, partitioning, and performance tuning
  • Lead implementations across Azure Synapse, Microsoft Fabric, and/or Snowflake environments
  • Collaborate with stakeholders and analysts to translate business needs into scalable data solutions
  • Evaluate and incorporate AI/automation to improve development speed, testing, and data quality
  • Oversee and mentor junior data engineers, establishing coding standards and best practices
  • Ensure high standards for data quality, security, and governance
  • Participate in solution design for client engagements, balancing technical depth with practical outcomes

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
  • 5+ years of professional experience in data engineering, with emphasis on Python & PySpark/Apache Spark
  • Proven ability to manage large datasets and optimize for speed, scalability, and reliability
  • Strong SQL skills and understanding of relational and distributed data systems
  • Experience with Azure Data Factory, Synapse Pipelines, Fivetran, Delta Lake, Microsoft Fabric, or Snowflake
  • Knowledge of data modeling, orchestration, and Delta/Parquet file management best practices
  • Familiarity with CI/CD, version control, and DevOps practices for data pipelines
  • Experience leveraging AI-assisted tools to accelerate engineering workflows
  • Strong communication skills: ability to convey complex technical details to both engineers and business stakeholders

Nice to have:

  • Relevant certifications (Azure, Snowflake, or Fabric)

What we offer:
  • Competitive pay
  • Strong benefits
  • Flexible hybrid work setup

Additional Information:

Job Posted:
December 06, 2025

Employment Type:
Full-time
Work Type:
Hybrid work