Data Warehouse Developer II

Beacon Technologies

Location:
United States

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Beacon Technologies is seeking a Data Warehouse Developer II for our client partner. This role is responsible for designing, developing, testing, and implementing solutions using Informatica Data Management Cloud (IDMC), and requires strong hands-on experience with Data Integration (DI), taskflows, mappings, parameter files, and reusable transformations to build scalable, secure, and high-performing data pipelines. The candidate will manage metadata, ensure data accuracy and integrity, and drive continuous optimization within the IDMC and data warehouse environment. The candidate MUST be a Wisconsin resident or willing to relocate to WI prior to starting the role at their own expense. The position is 90-100% remote, but staff may be required to come onsite as necessary with sufficient notice; onsite work is not likely, but the candidate should be prepared to come onsite if required.

Job Responsibility:

  • Designing, developing, testing, and implementing solutions using Informatica Data Management Cloud (IDMC)
  • Building scalable, secure, and high-performing data pipelines
  • Managing metadata
  • Ensuring data accuracy and integrity
  • Driving continuous optimization within the IDMC and data warehouse environment

Requirements:

  • Moderate experience in data warehousing process design, development, and performance tuning using the Informatica data warehousing tool suite (4+ years)
  • Developing skills in data warehouse architecture and data integration strategies and solutions
  • Knowledge of relational database technologies, e.g. Oracle, DB2, SQL Server (3+ years)
  • Strong data management and data analysis skills using SQL (4+ years)
  • Strong hands-on experience with Informatica IDMC (3+ years)
  • Candidate MUST be physically located in the United States
  • Candidate MUST be a Wisconsin resident or willing to relocate to WI prior to starting the role at their own expense
  • Candidate must be available to perform all work during Central Standard Time (CST) business hours, 9:00 am – 3:00 pm (or CST hours as defined by the hiring manager)
  • Candidate will be required to provide their own equipment for this position

What we offer:
  • Career advancement opportunities
  • Extensive training
  • Excellent benefits, including paid health and dental premiums for salaried employees

Additional Information:

Job Posted:
March 04, 2026

Employment Type:
Fulltime
Work Type:
Remote work

Similar Jobs for Data Warehouse Developer II

Data Engineer - II

The Data Engineer will design, develop, and maintain scalable data pipelines and...
Location:
India, Pune
Salary:
Not provided
Atica Global
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in computer science, Engineering, Mathematics, a related field, or equivalent practical experience
  • 3-5 years of experience in data engineering or a similar mid-level role
  • Proficiency in Python and SQL
  • Experience with Java is a plus
  • Hands-on experience with AWS, Airbyte, DBT, PostgreSQL, MongoDB, Airflow, and Spark
  • Familiarity with data storage solutions such as PostgreSQL, MongoDB
  • Experience with BigQuery (setup, management and scaling)
  • Strong understanding of data modeling, ETL/ELT processes, and database systems
  • Experience with data extraction, batch processing and data warehousing
  • Excellent problem-solving skills and a keen attention to detail
Job Responsibility:
  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes using tools like Airflow, Airbyte and PySpark
  • Collaborate with software engineers and analysts to ensure data availability and integrity for various applications
  • Design and implement robust data pipelines to extract, transform, and load (ETL) data from various sources
  • Utilize Airflow for orchestrating complex workflows and managing data pipelines
  • Implement batch processing techniques using Airflow/PySpark to handle large volumes of data efficiently
  • Develop ELT processes to optimize data extraction and transformation within the target data warehouse
  • Leverage AWS services (e.g., S3, RDS, Lambda) for data storage, processing, and orchestration
  • Ensure data security, reliability, and performance when utilizing AWS resources
  • Work closely with developers, analysts, and other stakeholders to understand data requirements and provide the necessary data infrastructure
  • Assist in troubleshooting and optimizing existing data workflows and queries
What we offer:
  • Competitive salary and benefits package
  • Comprehensive Health Care benefits (best in the country, includes IPD+OPD, covers Employee, Spouse and two children)
  • Growth and advancement opportunities within a rapidly expanding company
  • Fulltime

Senior Strategist II

This position will act as the primary statistician for the Data & Analytics Depa...
Location:
United States, Washington, D.C.
Salary:
95038.00 - 142567.00 USD / Year
American Federation of State, County and Municipal Employees
Expiration Date:
Until further notice
Requirements:
  • Graduation from an accredited four-year college or university with course work in data science, IT management, or a related field
  • Minimum of five (5) years of related work experience
  • Commitment to the union and its mission
  • Thorough understanding of Structured Query Language (SQL) and constructing queries
  • Familiarity with online electoral tools and datasets such as Catalist, VAN, Action Builder, and digital CRM tools such as Action Network
  • Extensive experience building creative reports using data visualization tools such as PowerBI
  • Proven experience in building and validating statistical models
  • Ability to handle multiple projects simultaneously and meet tight deadlines
  • Strong interpersonal and communication skills, including the ability to communicate technical subjects to non-technical individuals
  • Able to learn new material and skills quickly and engage on a variety of projects
Job Responsibility:
  • Act as the primary statistician for the Data & Analytics Department
  • Lead and advise on a variety of additional analytics projects
  • Responsible for testing, research, implementation and management of advanced database systems, tools, targeting, testing and research to support organizing, digital and political action campaigns
  • Establishment of training modules for IU staff, affiliate leaders/staff and members in how to utilize these resources
  • Creation of systems for the oversight of projects assigned to the Data & Analytics Department
  • Expected to interact directly with international and affiliate staff and leadership, members, and partner organizations
  • Utilize advanced data management tools and techniques to conduct voter targeting and analysis for digital, organizing, issue and electoral campaigns
  • Work with IU departments and affiliates to drive adoption of tools and processes across the union
  • May coordinate and assign work for members of the Data & Analytics Department as appropriate
  • May coordinate large-scale voter, general public and member contact programs utilizing in-house and vendor-provided tools
What we offer:
  • List of benefits

Data Warehouse Developer II

We are seeking a Data Warehouse Developer II to support the design, development,...
Location:
United States
Salary:
40.00 - 45.00 USD / Hour
Beacon Hill
Expiration Date:
Until further notice
Requirements:
  • Data warehousing process design, development, and performance tuning using Informatica tools (4+ years)
  • Strong hands-on experience with Informatica IDMC (3+ years)
  • Relational database experience (Oracle, DB2, SQL Server) (3+ years)
  • Strong SQL for data management and analysis (4+ years)
  • Experience developing ETL solutions in enterprise environments
  • Ability to work independently with moderate supervision
  • Must be physically located within the United States
  • Work must be performed during Central Standard Time business hours
  • Candidate must provide their own equipment that meets technical requirements
Job Responsibility:
  • Design, develop, test, and implement ETL solutions using Informatica IDMC
  • Build and maintain data integrations using mappings, taskflows, parameter files, and reusable transformations
  • Support data warehouse development, enhancement, and ongoing maintenance
  • Perform ETL and query performance tuning and optimization
  • Ensure data accuracy, integrity, and security compliance
  • Collaborate with business stakeholders and IT teams to translate requirements into technical solutions
  • Support migration and modernization efforts within the data warehouse environment
  • Participate in on-call support as required during business hours (CST)

Data Engineer II

MU-Data Engineers II located in Costa Mesa, CA will provide direct data engineer...
Location:
United States, Costa Mesa
Salary:
128676.72 - 130000.00 USD / Year
T-Mobile
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree (or foreign equivalent) in computer science, any engineering, data science or closely related field
  • 2 years of relevant experience
  • Advanced level in SQL and Python for data engineering and backend development
  • Understanding of popular code development approaches: Test-driven development & Continuous Integration/Continuous Deployment (CI/CD)
  • Experience working with AWS services (Lambda, S3, Glue, Redshift, etc.) and Cloud Data Warehouse
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Define digital data collection methods and implementations to DWH and customer management platforms
  • Provide guidance to website tagging, digital data integration and collections
  • Experience with QA testing processes to verify data is collected and rendered as expected
  • Prior working knowledge of Braze or other Customer Management Platforms
Job Responsibility:
  • Work collaboratively with cross-functional teams to determine data transformation needs, establish data collection requirements, and deploy & maintain accurate data to ensure metrics & dimensions are addressed for analytics/reporting requirements
  • Participate in daily/weekly SCRUM ceremonies
  • Transform required data conditions into performance code logic
  • Develop high-performance code in SQL and Python
  • Develop, automate, and enhance ELT processes for continuous data flow
  • Onboard domain data as custom attributes and events to external systems such as Braze, Customer Journey
  • Maintain daily data updates and core logic integrity
  • Provide support to troubleshoot data anomalies and apply resolutions
  • Create, modify, and deploy code to production via AWS resources including Lambda, S3, Glue, Redshift, etc.
  • Partner with stakeholders to support the implementation and development of data requirements. Troubleshoot data anomalies and provide applicable solutions
What we offer:
  • Competitive base salary and compensation package
  • Annual stock grant
  • Employee stock purchase plan
  • 401(k)
  • Access to free, year-round money coaches
  • Medical, dental and vision insurance
  • Flexible spending account
  • Paid time off
  • Up to 12 paid holidays
  • Paid parental and family leave
  • Fulltime

Data Engineer

Do you love solving real-world data problems with the latest and best techniques...
Location:
India, Pune
Salary:
Not provided
Jash Data Sciences
Expiration Date:
Until further notice
Requirements:
  • Strong Python coding skills with basic knowledge of algorithms/data structures and their application
  • Strong understanding of Data Engineering concepts including ETL, ELT, Data Lake, Data Warehousing, and Data Pipelines
  • Experience designing and implementing Data Lakes, Data Warehouses, and Data Marts that support terabytes of scale data
  • A track record of implementing Data Pipelines on public cloud environments (AWS/GCP/Azure) is highly desirable
  • A clear understanding of Database concepts like indexing, query performance optimization, views, and various types of schemas
  • Hands-on SQL programming experience with knowledge of windowing functions, subqueries, and various types of joins
  • Experience working with Big Data technologies like PySpark/Hadoop
  • A good team player with the ability to communicate with clarity
  • Show us your git repo/blog
  • 1-2 years of experience working on Data Engineering projects for Data Engineer I
Job Responsibility:
  • Discovering trends in the data sets and developing algorithms to transform raw data for further analytics
  • Create Data Pipelines to bring in data from various sources, with different formats, transform it, and finally load it to the target database
  • Implement ETL/ELT processes in the cloud using tools like Airflow, Glue, Stitch, Cloud Data Fusion, and DataFlow
  • Design and implement Data Lake, Data Warehouse, and Data Marts in AWS, GCP, or Azure using Redshift, BigQuery, PostgreSQL, etc
  • Creating efficient SQL queries and understanding query execution plans for tuning queries on engines like PostgreSQL
  • Performance tuning of OLAP/ OLTP databases by creating indices, tables, and views
  • Write Python scripts for the orchestration of data pipelines
  • Have thoughtful discussions with customers to understand their data engineering requirements
  • Break complex requirements into smaller tasks for execution

Senior Data Engineer II

We are seeking a skilled Data Engineer to design, build, and maintain robust dat...
Location:
India, Hyderabad
Salary:
Not provided
Alter Domus
Expiration Date:
Until further notice
Requirements:
  • Experience designing, building, and maintaining ETL/ELT data pipelines
  • Hands-on experience with DBT for implementing modular transformations and testing
  • Practical knowledge of Apache Airflow for workflow orchestration and scheduling
  • Strong SQL skills including complex queries, data modeling, and optimization
  • Experience with Amazon Athena or similar query services for analytical workloads
  • Proficiency in Python for data processing and automation
  • Understanding of data quality practices, validation, and governance principles
  • Experience with performance tuning of data pipelines and queries
  • Strong problem-solving skills and attention to detail
  • Excellent communication and collaboration abilities
Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines to support analytics and reporting needs
  • Develop automated data ingestion processes from various sources into the data warehouse
  • Implement data transformation workflows that ensure data consistency and reliability
  • Monitor and troubleshoot pipeline issues to maintain data availability and accuracy
  • Build reusable pipeline components and frameworks for efficient development
  • Implement modular transformations and testing frameworks using DBT (Data Build Tool)
  • Design efficient database schemas optimized for analytical workloads
  • Create and maintain data models that support business requirements
  • Write complex SQL queries and optimize them for performance
  • Work with Amazon Athena or similar query services for large-scale data analysis
What we offer:
  • Support for professional accreditations
  • Flexible arrangements, generous holidays, plus an additional day off for your birthday
  • Continuous mentoring along your career progression
  • Active sports, events and social committees across our offices
  • 24/7 support available from our Employee Assistance Program
  • The opportunity to invest in our growth and success through our Employee Share Plan
  • Additional local benefits depending on your location

Data Analytics Engineer II

A Data and Analytics Engineer II is responsible for development, expansion, and ...
Location:
United States, Irving
Salary:
Not provided
CHRISTUS Health
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in computer science, Engineering, Math, or related field is required
  • Minimum of three (3) years of experience in MapReduce, Spark programming
  • Minimum of three (3) years of experience developing analytics solutions with large data sets within an OLAP and MPP architecture
  • Minimum of five (5) years of experience with design, architecture and development of Enterprise scale platforms built on open-source frameworks
  • Three (3) years of experience with working in a Microsoft SQL Server environment is preferred
  • One (1) year of Healthcare IT experience is preferred
  • Two (2) years of experience working with Microsoft SQL Server Integration Services (SSIS) is preferred
Job Responsibility:
  • Meets expectations of the applicable OneCHRISTUS Competencies: Leader of Self, Leader of Others, or Leader of Leaders
  • Responsible for analyzing and understanding data sources, participating in requirement gathering, and providing insights and guidance on data technology and data modeling best practices
  • Analyzes ideas and business and functional requirements to formulate a design strategy
  • Acts as a tenant to draw out a workable application design and coding parameters with essential functionalities
  • Works in collaboration with the team members to identify and address the issues by implementing a viable technical solution that is time and cost-effective and ensuring that it does not affect performance quality
  • Develops code following the industry's best practices and adhere to the organizational development rules and standards
  • Involved in the evaluation of proposed system acquisitions or solutions development and provides input to the decision-making process relative to compatibility, cost, resource requirements, operations, and maintenance
  • Integrates software components, subsystems, facilities, and services into the existing technical systems environment
  • Assesses the impact on other systems and works with cross-functional teams within Information Services to ensure positive project impact; installs, configures, and verifies the operation of software components
  • Participates in the development of standards, design, and implementation of proactive processes to collect and report data and statistics on assigned systems
  • Fulltime

Senior Product Owner II

The Collections Technology Product Owner is responsible for leading the developm...
Location:
United States, Cincinnati
Salary:
73.00 - 79.00 USD / Hour
Apex Systems
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in business, finance, computer science, data analytics, or a related field
  • Master’s degree preferred
  • Experience as a Product Owner or similar role within Collections, Contact Center technology, or financial services
  • Strong knowledge of AWS Connect (contact flows, Lambda, Lex, Contact Lens, omni-channel capabilities)
  • Experience with Collections platforms such as CACS X or comparable systems (e.g., FICO Debt Manager, Latitude, Experian PowerCurve)
  • Demonstrated experience integrating data between operational systems and domain services/data warehouses
  • Proven ability to build data products supporting complex reporting and analytics, particularly in Collections or Credit Risk
  • Strong understanding of Collections processes (delinquency management, treatment strategies, payment arrangements, skip tracing, legal processes, charge-off procedures)
  • Experience with Agile methodologies, especially Scrum
  • Strong analytical, problem-solving, and data literacy skills
Job Responsibility:
  • Develop and maintain a strategic roadmap for AWS Connect and CACS X, aligned with Collections objectives, enterprise data strategy, and customer experience goals
  • Identify modernization opportunities and ensure platforms evolve to meet business and regulatory needs
  • Collaborate with Collections Operations, Technology, Data & Analytics, Risk, Compliance, and Domain teams to gather requirements, understand challenges, and identify opportunities for improvement
  • Facilitate communication across teams to ensure alignment and transparency
  • Define product vision, strategy, and roadmap for AWS Connect and CACS X
  • Prioritize and manage the product backlog, ensuring timely delivery of features, integrations, and data capabilities
  • Provide ongoing guidance to development teams throughout the delivery lifecycle
  • Lead initiatives to integrate critical Collections data into enterprise domain services (e.g., Account, Card, Loan domains)
  • Design and support real-time and batch data integration patterns between Collections platforms and enterprise systems
  • Develop and manage data products supporting advanced Collections reporting, including delinquency analytics, roll rates, cure rates, contact effectiveness, payment arrangements, and liquidation performance
What we offer:
  • Medical
  • Dental
  • Vision
  • Life
  • Disability
  • Other insurance plans
  • ESPP (employee stock purchase program)
  • 401K program with company match after 12 months
  • HSA (Health Savings Account on the HDHP plan)
  • SupportLinc Employee Assistance Program (EAP) with up to 8 free counseling sessions
  • Fulltime