Senior Data Engineer, Information Technology

CREA (IN)

Location:
United States, Indianapolis

Contract Type:
Employment contract

Salary:

97000.00 - 143000.00 USD / Year

Job Description:

CREA, LLC is a full-service low-income housing tax credit (LIHTC) syndicator forming long-term relationships with investors and developers that cultivate success and improve lives. With 25 years in affordable housing, CREA has raised $13.5 billion, resulting in the formation of over 97,500 homes within 1,059 communities across the country. Thanks to the contributions of over 135 employees, CREA continues to seek talented, passionate individuals eager to grow with us. We are seeking a Senior Data Engineer to lead the design, development, and modernization of enterprise data platforms supporting LIHTC and real estate finance operations. This role is ideal for a seasoned data professional who thrives on building scalable data solutions, driving cloud transformation, and enabling advanced analytics. You will play a key role in CREA's transition to a modern Azure Fabric data ecosystem, unlocking data-driven insights across the organization.

Job Responsibility:

  • Lead the design, development, and modernization of enterprise data platforms supporting LIHTC and real estate finance operations
  • Design, build, test, deploy, and maintain scalable, secure, and reliable data pipelines and integrations
  • Develop and maintain ETL/ELT processes for structured and semi-structured data, including orchestration and documentation
  • Ingest, transform, and integrate structured and unstructured data into centralized data platforms (warehouse/lakehouse)
  • Build reusable code and automation to streamline data processing and management
  • Lead and support migration of on-prem SQL Server workloads to Azure/Microsoft Fabric
  • Participate in data architecture decisions, standards, and platform/tool selection
  • Maintain and optimize performance, reliability, and monitoring across cloud and hybrid environments
  • Develop and support data solutions that enable analytics, reporting, and business operations
  • Produce dashboards and reports, and communicate insights to stakeholders
  • Serve as a subject matter expert, partnering with IT and business teams to solve key data challenges
  • Implement data quality, validation, and monitoring controls, and perform root cause analysis
  • Establish and support data governance practices to ensure integrity and business value
  • Champion data security, compliance, and adherence to company guidelines
  • Drive Power BI/Fabric governance, including semantic models and workspace management

Requirements:

  • Bachelor's degree in Engineering, Computer Science, or a related field required; advanced degree preferred
  • 7+ years of experience in data engineering or development with increasing responsibility
  • Proven experience building and evolving data platforms and supporting business decision-making
  • Industry experience in Real Estate, Financial Services, Banking, or FinTech preferred
  • Strong experience designing, integrating, securing, monitoring, and optimizing data platforms
  • Experience with APIs for system integration and data exchange
  • Experience leveraging AI-enabled tools for analytics, automation, or solution development
  • Azure Data Platform: SQL Server Managed Instance, Azure SQL Database, Azure Synapse, Dataverse, Microsoft Fabric (OneLake, Fabric Warehouse, Data Factory)
  • Data Integration: Azure Data Factory, Microsoft Fabric
  • Analytics & Reporting: Power BI, Power Query, DAX
  • Business Applications: Microsoft 365 (O365, SharePoint, OneDrive), VSTO/C# .NET Add-ins
  • Development & DevOps: GitHub, CI/CD pipelines, Visual Studio, Power BI Deployment Pipelines, Monday.com
  • Ability to travel up to 5% as needed
  • No visa sponsorship available

Additional Information:

Job Posted:
May 16, 2026

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Senior Data Engineer, Information Technology

Senior Data Engineer

As a senior data engineer, you will help our clients with building a variety of ...
Location:
Belgium, Brussels
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements
  • At least 5 years of experience as a Data Engineer or in software engineering in a data context
  • Programming experience with one or more languages: Python, Scala, Java, C/C++
  • Knowledge of relational database technologies/concepts and SQL is required
  • Experience building, scheduling and maintaining data pipelines (Spark, Airflow, Data Factory)
  • Practical experience with at least one cloud provider (GCP, AWS, or Azure); certifications from any of these are considered a plus
  • Knowledge of Git and CI/CD
  • Able to work independently, prioritize multiple stakeholders and tasks, and manage work time effectively
  • You have a degree in Computer Engineering, Information Technology or related field
  • You are proficient in English; knowledge of Dutch and/or French is a plus.
Job Responsibility
  • Gather business requirements and translate them to technical specifications
  • Design, implement and orchestrate scalable and efficient data pipelines to collect, process, and serve large datasets
  • Apply DataOps best practices to automate testing, deployment and monitoring
  • Continuously follow & learn the latest trends in the data world.
What we offer
  • A variety of perks, such as mobility options (including a company car), insurance coverage, meal vouchers, eco-cheques, and more
  • Continuous learning opportunities through the Sopra Steria Academy to support your career development
  • The opportunity to connect with fellow Sopra Steria colleagues at various team events.

Senior Data Engineer

Senior Data Engineer role at UpGuard supporting analytics teams to extract insig...
Location:
Australia: Sydney, Melbourne, Brisbane, Hobart
Salary:
Not provided
UpGuard
Expiration Date
Until further notice
Requirements
  • 5+ years of experience with data sourcing, storage, and modelling to effectively deliver business value right through to the BI platform
  • AI first mindset and experience scaling an Analytics and BI function at another SaaS business
  • Experience with Looker (Explores, Looks, Dashboards, Developer interface, dimensions and measures, models, raw SQL queries)
  • Experience with CloudSQL (PostgreSQL) and BigQuery (complex queries, indices, materialised views, clustering, partitioning)
  • Experience with Containers, Docker and Kubernetes (GKE)
  • Familiarity with n8n for automation
  • Experience with programming languages (Go for ETL workers)
  • Comfortable interfacing with various APIs (REST+JSON or MCP Server)
  • Experience with version control via GitHub and GitHub Flow
  • Security-first mindset
Job Responsibility
  • Design, build, and maintain reliable data pipelines to consolidate information from various internal systems and third-party sources
  • Develop and manage a comprehensive semantic layer using technologies like LookML, dbt, or SQLMesh
  • Implement and enforce data quality checks, validation rules, and governance processes
  • Ensure AI agents have access to necessary structured and unstructured data
  • Create clear, self-maintaining documentation for data models, pipelines, and semantic layer
What we offer
  • Great Place to Work certified company
  • Equal Employment Opportunity and Affirmative Action employer
  • Full-time

Senior Data Engineer

Adswerve is looking for a Senior Data Engineer to join our Adobe Services team. ...
Location:
United States
Salary:
130000.00 - 155000.00 USD / Year
Adswerve, Inc.
Expiration Date
Until further notice
Requirements
  • Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related field (or equivalent experience)
  • 5+ years of experience in a data engineering, analytics, or marketing technology role
  • Hands-on expertise in Adobe Experience Platform (AEP), Real-Time CDP, Journey Optimizer, or similar tools is a big plus
  • Strong proficiency in SQL and hands-on experience with data transformation and modeling
  • Understanding of ETL/ELT workflows (e.g., dbt, Fivetran, Airflow, etc.) and cloud data platforms (e.g., GCP, Snowflake, AWS, Azure)
  • Experience with ingress/egress patterns and interacting with APIs to move data
  • Experience with Python or JavaScript in a data or scripting context
  • Experience with customer data platforms (CDPs), event-based tracking, or customer identity management
  • Understanding of Adobe Experience Cloud integrations (e.g., Adobe Analytics, Target, Campaign) is a plus
  • Strong communication skills with the ability to lead technical conversations and present to both technical and non-technical audiences
Job Responsibility
  • Lead the end-to-end architecture of data ingestion and transformation in Adobe Experience Platform (AEP) using Adobe Data Collection (Tags), Experience Data Model (XDM), and source connectors
  • Design and optimize data models, identity graphs, and segmentation strategies within Real-Time CDP to enable personalized customer experiences
  • Implement schema mapping, identity resolution, and data governance strategies
  • Collaborate with Data Architects to build scalable, reliable data pipelines across multiple systems
  • Conduct data quality assessments and support QA for new source integrations and activations
  • Write and maintain internal documentation and knowledge bases on AEP best practices and data workflows
  • Simplify complex technical concepts and educate team members and clients in a clear, approachable way
  • Contribute to internal knowledge sharing and mentor junior engineers in best practices around data modeling, pipeline development, and Adobe platform capabilities
  • Stay current on the latest Adobe Experience Platform features and data engineering trends to inform client strategies
What we offer
  • Medical, dental and vision available for employees
  • Paid time off including vacation, sick leave & company holidays
  • Paid volunteer time
  • Flexible working hours
  • Summer Fridays
  • “Work From Home Light” days between Christmas and New Year’s Day
  • 401(k) Plan with 5% company match and no vesting period
  • Employer Paid Parental Leave
  • Health-care Spending Accounts
  • Dependent-care Spending Accounts
  • Full-time

Senior Data Engineer

We’re hiring a Senior Data Engineer with strong experience in AWS and Databricks...
Location:
India, Hyderabad
Salary:
Not provided
Appen
Expiration Date
Until further notice
Requirements
  • 5-7 years of hands-on experience with AWS data engineering technologies, such as Amazon Redshift, AWS Glue, AWS Data Pipeline, Amazon Kinesis, Amazon RDS, and Apache Airflow
  • Hands-on experience working with Databricks, including Delta Lake, Apache Spark (Python or Scala), and Unity Catalog
  • Demonstrated proficiency in SQL and NoSQL databases, ETL tools, and data pipeline workflows
  • Experience with Python and/or Java
  • Deep understanding of data structures, data modeling, and software architecture
  • Strong problem-solving skills and attention to detail
  • Self-motivated and able to work independently, with excellent organizational and multitasking skills
  • Exceptional communication skills, with the ability to explain complex data concepts to non-technical stakeholders
  • Bachelor's Degree in Computer Science, Information Systems, or a related field. A Master's Degree is preferred.
Job Responsibility
  • Design, build, and manage large-scale data infrastructures using a variety of AWS technologies such as Amazon Redshift, AWS Glue, Amazon Athena, AWS Data Pipeline, Amazon Kinesis, Amazon EMR, and Amazon RDS
  • Design, develop, and maintain scalable data pipelines and architectures on Databricks using tools such as Delta Lake, Unity Catalog, and Apache Spark (Python or Scala), or similar technologies
  • Integrate Databricks with cloud platforms like AWS to ensure smooth and secure data flow across systems
  • Build and automate CI/CD pipelines for deploying, testing, and monitoring Databricks workflows and data jobs
  • Continuously optimize data workflows for performance, reliability, and security, applying Databricks best practices around data governance and quality
  • Ensure the performance, availability, and security of datasets across the organization, utilizing AWS’s robust suite of tools for data management
  • Collaborate with data scientists, software engineers, product managers, and other key stakeholders to develop data-driven solutions and models
  • Translate complex functional and technical requirements into detailed design proposals and implement them
  • Mentor junior and mid-level data engineers, fostering a culture of continuous learning and improvement within the team
  • Identify, troubleshoot, and resolve complex data-related issues
  • Full-time

Senior Azure Data Engineer

Seeking a Lead AI DevOps Engineer to oversee design and delivery of advanced AI/...
Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date
Until further notice
Requirements
  • At least 6 years of professional experience in the Data & Analytics area
  • 1+ years of experience in (or acting in) a Senior Consultant or above role, with a strong focus on data solutions built in Azure and Databricks/Synapse (MS Fabric is nice to have)
  • Proven experience with Azure cloud-based infrastructure, Databricks, and at least one SQL implementation (e.g., Oracle, T-SQL, MySQL)
  • Proficiency in programming languages such as SQL, Python, and PySpark is essential (R or Scala is nice to have)
  • Very good communication skills, including the ability to convey information clearly and specifically to co-workers and business stakeholders
  • Working experience with agile methodologies and their supporting tools (JIRA, Azure DevOps)
  • Experience in leading and managing a team of data engineers, providing guidance, mentorship, and technical support
  • Knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Good project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
Job Responsibility
  • Act as a senior member of the Data Science & AI Competency Center, AI Engineering team, guiding delivery and coordinating workstreams
  • Develop and execute a cloud data strategy aligned with organizational goals
  • Lead data integration efforts, including ETL processes, to ensure seamless data flow
  • Implement security measures and compliance standards in cloud environments
  • Continuously monitor and optimize data solutions for cost-efficiency
  • Establish and enforce data governance and quality standards
  • Leverage Azure services, as well as tools like dbt and Databricks, for efficient data pipelines and analytics solutions
  • Work with cross-functional teams to understand requirements and provide data solutions
  • Maintain comprehensive documentation for data architecture and solutions
  • Mentor junior team members in cloud data architecture best practices
What we offer
  • Stable employment
  • “Office as an option” model
  • Workation
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support

Senior Data Engineering Architect

Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date
Until further notice
Requirements
  • Proven work experience as a Data Engineering Architect or in a similar role, and strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Proficiency in Python, PySpark, SQL
  • Familiarity with cloud platforms and services, such as AWS, GCP, or Azure, and experience in designing and implementing data solutions in a cloud environment
  • Knowledge of data governance principles and best practices, including data privacy and security regulations
Job Responsibility
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables; support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
What we offer
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions

Senior Information Technology Engineer

The IT Systems Engineer is responsible for architecting, securing, and scaling L...
Location:
India, Pune
Salary:
Not provided
LogicMonitor
Expiration Date
Until further notice
Requirements
  • 5+ years of IT experience in a global high-tech environment
  • 5+ years of hands-on networking experience (enterprise/global scale)
  • Strong experience managing Cisco switches, Fortinet firewalls, VPNs, and wireless infrastructure
  • Demonstrated experience with Zero-Trust Network Architecture (ZTNA), Secure Web Gateways, and CASB (preferred: Cloudflare)
  • Proficiency with Terraform for Infrastructure-as-Code and familiarity with GitOps practices
  • Strong understanding of networking in cloud environments (AWS, GCP, Azure)
  • Familiarity with FedRAMP/GovCloud requirements preferred
  • Experience using AI tools to enhance productivity, innovation, or problem-solving
  • Solid Linux systems experience
  • macOS networking and certificate compatibility knowledge required
Job Responsibility
  • Own Cloudflare ZTNA and Secure Web Gateway end-to-end: design, policy enforcement, monitoring, troubleshooting, and Terraform-based configuration
  • Handle multiple instances of Cloudflare ZTNA, covering commercial and government infrastructure
  • Ensure compatibility and reliability of certificates and macOS networking with SWG/Zero-Trust controls
  • Architect and administer global networking across offices, data centers, and multi-cloud (AWS, GCP, Azure) environments
  • Manage Cisco switches, Fortinet firewalls, VPNs, Wi-Fi, and global remote access infrastructure
  • Implement Infrastructure-as-Code practices with Terraform and support GitOps workflows
  • Deliver and maintain network observability dashboards, SLAs, and uptime reporting using LogicMonitor
  • Partner with Security and Technical Operations to maintain compliance in both commercial and FedRAMP environments
  • Ability to work within an on-call rotation schedule and be available after hours for specialized support
  • Proactively identify opportunities for AI-driven automation within IT operations and quietly deliver solutions that reduce manual workloads

Senior Data Engineer

We build simple yet innovative consumer products and developer APIs that shape h...
Location:
United States, San Francisco
Salary:
180000.00 - 270000.00 USD / Year
Plaid
Expiration Date
Until further notice
Requirements
  • 4+ years of dedicated data engineering experience, solving complex data pipelines issues at scale
  • Experience building data models and data pipelines on top of large datasets (in the order of 500TB to petabytes)
  • Value SQL as a flexible and extensible tool, and are comfortable with modern SQL data orchestration tools like dbt, Mode, and Airflow
  • Experience working with different performant warehouses and data lakes (Redshift, Snowflake, Databricks)
  • Experience building and maintaining batch and real-time pipelines using technologies like Spark and Kafka
  • Appreciate the importance of schema design, and can evolve an analytics schema on top of unstructured data
  • Excited to try out new technologies and like to produce proof-of-concepts that balance technical advancement and user experience and adoption
  • Like to get deep in the weeds to manage, deploy, and improve low level data infrastructure
  • Empathetic working with stakeholders
Job Responsibility
  • Understanding different aspects of the Plaid product and strategy to inform golden dataset choices, design and data usage principles
  • Have data quality and performance top of mind while designing datasets
  • Leading key data engineering projects that drive collaboration across the company
  • Advocating for adopting industry tools and practices at the right time
  • Owning core SQL and Python data pipelines that power our data lake and data warehouse
  • Maintaining well-documented data with defined dataset quality, uptime, and usefulness
What we offer
  • medical
  • dental
  • vision
  • 401(k)
  • equity
  • commission
  • Full-time