Data Warehouse Developer II

Beacon Technologies

Location:
United States

Contract Type:
Not provided

Salary:

Not provided

Job Description:

Beacon Technologies is seeking a Data Warehouse Developer II for our client partner. This role is responsible for designing, developing, testing, and implementing solutions using Informatica Data Management Cloud (IDMC). It requires strong hands-on experience with Data Integration (DI), task flows, mappings, parameter files, and reusable transformations to build scalable, secure, and high-performing data pipelines. The candidate will manage metadata, ensure data accuracy and integrity, and drive continuous optimization within the IDMC and data warehouse environment. The candidate MUST be a Wisconsin resident or willing to relocate to WI at their own expense prior to starting the role. The position is 90-100% remote; onsite work is unlikely, but staff may be required to come onsite as necessary with sufficient notice.

Job Responsibility:

  • Designing, developing, testing, and implementing solutions using Informatica Data Management Cloud (IDMC)
  • Building scalable, secure, and high-performing data pipelines
  • Managing metadata
  • Ensuring data accuracy and integrity
  • Driving continuous optimization within the IDMC and data warehouse environment

Requirements:

  • Moderate experience in data warehousing process design, development, and performance tuning using the Informatica data warehousing tool suite (4+ years)
  • Developing skills in the architecture of data warehouse and data integration strategies and solutions
  • Knowledge of relational database technologies, e.g. Oracle, DB2, SQL Server (3+ years)
  • Strong data management and data analysis skills using SQL (4+ years)
  • Strong hands-on experience with Informatica IDMC (3+ years)
  • Candidate MUST be physically located in the United States
  • Candidate MUST be a Wisconsin resident or willing to relocate to WI prior to starting the role at their own expense
  • Candidate must be available to perform all work during Central Standard Time (CST) business hours, 9:00 am – 3:00 pm (or CST hours as defined by the hiring manager)
  • Candidate will be required to provide their own equipment for this position

What we offer:
  • Career advancement opportunities
  • Extensive training
  • Excellent benefits including paying for health and dental premiums for salaried employees

Additional Information:

Job Posted:
March 04, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for Data Warehouse Developer II

Data Engineer - II

The Data Engineer will design, develop, and maintain scalable data pipelines and...
Location: India, Pune
Salary: Not provided
Atica Global
Expiration Date: Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, a related field, or equivalent practical experience
  • 3-5 years of experience in data engineering or a similar mid-level role
  • Proficiency in Python and SQL; experience with Java is a plus
  • Hands-on experience with AWS, Airbyte, DBT, PostgreSQL, MongoDB, Airflow, and Spark
  • Familiarity with data storage solutions such as PostgreSQL, MongoDB
  • Experience with BigQuery (setup, management and scaling)
  • Strong understanding of data modeling, ETL/ELT processes, and database systems
  • Experience with data extraction, batch processing and data warehousing
  • Excellent problem-solving skills and a keen attention to detail
Job Responsibility:
  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes using tools like Airflow, Airbyte and PySpark
  • Collaborate with software engineers and analysts to ensure data availability and integrity for various applications
  • Design and implement robust data pipelines to extract, transform, and load (ETL) data from various sources
  • Utilize Airflow for orchestrating complex workflows and managing data pipelines
  • Implement batch processing techniques using Airflow/PySpark to handle large volumes of data efficiently
  • Develop ELT processes to optimize data extraction and transformation within the target data warehouse
  • Leverage AWS services (e.g., S3, RDS, Lambda) for data storage, processing, and orchestration
  • Ensure data security, reliability, and performance when utilizing AWS resources
  • Work closely with developers, analysts, and other stakeholders to understand data requirements and provide the necessary data infrastructure
  • Assist in troubleshooting and optimizing existing data workflows and queries
What we offer:
  • Competitive salary and benefits package
  • Comprehensive Health Care benefits (best in the country, includes IPD+OPD, covers Employee, Spouse and two children)
  • Growth and advancement opportunities within a rapidly expanding company

Senior Strategist II

This position will act as the primary statistician for the Data & Analytics Depa...
Location: United States, Washington, D.C.
Salary: 95038.00 - 142567.00 USD / Year
American Federation of State, County and Municipal Employees
Expiration Date: Until further notice
Requirements:
  • Graduation from an accredited four-year college or university with course work in data science, IT management, or a related field
  • Minimum of five (5) years of related work experience
  • Commitment to the union and its mission
  • Thorough understanding of Structured Query Language (SQL) and constructing queries
  • Familiarity with online electoral tools and datasets such as Catalist, VAN, Action Builder, and digital CRM tools such as Action Network
  • Extensive experience building creative reports using data visualization tools such as PowerBI
  • Proven experience in building and validating statistical models
  • Ability to handle multiple projects simultaneously and meet tight deadlines
  • Strong interpersonal and communication skills including the ability to communicate technical subjects to non–technical individuals
  • Able to learn new material and skills quickly and engage on a variety of projects
Job Responsibility:
  • Act as the primary statistician for the Data & Analytics Department
  • Lead and advise on a variety of additional analytics projects
  • Responsible for testing, research, implementation and management of advanced database systems, tools, targeting, testing and research to support organizing, digital and political action campaigns
  • Establishment of training modules for IU staff, affiliate leaders/staff and members in how to utilize these resources
  • Creation of systems for the oversight of projects assigned to the Data & Analytics Department
  • Expected to interact directly with international and affiliate staff and leadership, members, and partner organizations
  • Utilize advanced data management tools and techniques to conduct voter targeting and analysis for digital, organizing, issue and electoral campaigns
  • Work with IU departments and affiliates to drive adoption of tools and processes across the union
  • May coordinate and assign work for members of the Data & Analytics Department as appropriate
  • May coordinate large-scale voter, general public and member contact programs utilizing in-house and vendor-provided tools
What we offer:
  • List of benefits

Data Engineer

Do you love solving real-world data problems with the latest and best techniques...
Location: India, Pune
Salary: Not provided
Jash Data Sciences
Expiration Date: Until further notice
Requirements:
  • Strong Python coding skills with basic knowledge of algorithms/data structures and their application
  • Strong understanding of Data Engineering concepts including ETL, ELT, Data Lake, Data Warehousing, and Data Pipelines
  • Experience designing and implementing Data Lakes, Data Warehouses, and Data Marts that support terabytes of scale data
  • A track record of implementing Data Pipelines on public cloud environments (AWS/GCP/Azure) is highly desirable
  • A clear understanding of Database concepts like indexing, query performance optimization, views, and various types of schemas
  • Hands-on SQL programming experience with knowledge of windowing functions, subqueries, and various types of joins
  • Experience working with Big Data technologies like PySpark/Hadoop
  • A good team player with the ability to communicate with clarity
  • Show us your git repo/blog
  • 1-2 years of experience working on Data Engineering projects for Data Engineer I
Job Responsibility:
  • Discovering trends in the data sets and developing algorithms to transform raw data for further analytics
  • Create Data Pipelines to bring in data from various sources, with different formats, transform it, and finally load it to the target database
  • Implement ETL/ELT processes in the cloud using tools like Airflow, Glue, Stitch, Cloud Data Fusion, and DataFlow
  • Design and implement Data Lake, Data Warehouse, and Data Marts in AWS, GCP, or Azure using Redshift, BigQuery, PostgreSQL, etc
  • Creating efficient SQL queries and understanding query execution plans for tuning queries on engines like PostgreSQL
  • Performance tuning of OLAP/OLTP databases by creating indices, tables, and views
  • Write Python scripts for the orchestration of data pipelines
  • Have thoughtful discussions with customers to understand their data engineering requirements
  • Break complex requirements into smaller tasks for execution

Data Analytics Engineer II

A Data and Analytics Engineer II is responsible for development, expansion, and ...
Location: United States, Irving
Salary: Not provided
CHRISTUS Health
Expiration Date: Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, Math, or a related field is required
  • Minimum of three (3) years of experience in MapReduce, Spark programming
  • Minimum of three (3) years of experience developing analytics solutions with large data sets within an OLAP and MPP architecture
  • Minimum of five (5) years of experience with design, architecture and development of Enterprise scale platforms built on open-source frameworks
  • Three (3) years of experience with working in a Microsoft SQL Server environment is preferred
  • One (1) year of Healthcare IT experience is preferred
  • Two (2) years of experience working with Microsoft SQL Server Integration Services (SSIS) is preferred
Job Responsibility:
  • Meets expectations of the applicable OneCHRISTUS Competencies: Leader of Self, Leader of Others, or Leader of Leaders
  • Responsible for analyzing and understanding data sources, participating in requirement gathering, and providing insights and guidance on data technology and data modeling best practices
  • Analyzes ideas and business and functional requirements to formulate a design strategy
  • Acts as a tenant to draw out a workable application design and coding parameters with essential functionalities
  • Works in collaboration with the team members to identify and address the issues by implementing a viable technical solution that is time and cost-effective and ensuring that it does not affect performance quality
  • Develops code following the industry's best practices and adhere to the organizational development rules and standards
  • Involved in the evaluation of proposed system acquisitions or solutions development and provides input to the decision-making process relative to compatibility, cost, resource requirements, operations, and maintenance
  • Integrates software components, subsystems, facilities, and services into the existing technical systems environment; assesses the impact on other systems and works with cross-functional teams within Information Services to ensure positive project impact; and installs, configures, and verifies the operation of software components
  • Participates in the development of standards, design, and implementation of proactive processes to collect and report data and statistics on assigned systems

Senior Product Owner II

The Collections Technology Product Owner is responsible for leading the developm...
Location: United States, Cincinnati
Salary: 73.00 - 79.00 USD / Hour
Apex Systems
Expiration Date: Until further notice
Requirements:
  • Bachelor’s degree in business, finance, computer science, data analytics, or a related field
  • Master’s degree preferred
  • Experience as a Product Owner or similar role within Collections, Contact Center technology, or financial services
  • Strong knowledge of AWS Connect (contact flows, Lambda, Lex, Contact Lens, omni-channel capabilities)
  • Experience with Collections platforms such as CACS X or comparable systems (e.g., FICO Debt Manager, Latitude, Experian PowerCurve)
  • Demonstrated experience integrating data between operational systems and domain services/data warehouses
  • Proven ability to build data products supporting complex reporting and analytics, particularly in Collections or Credit Risk
  • Strong understanding of Collections processes (delinquency management, treatment strategies, payment arrangements, skip tracing, legal processes, charge-off procedures)
  • Experience with Agile methodologies, especially Scrum
  • Strong analytical, problem-solving, and data literacy skills
Job Responsibility:
  • Develop and maintain a strategic roadmap for AWS Connect and CACS X, aligned with Collections objectives, enterprise data strategy, and customer experience goals
  • Identify modernization opportunities and ensure platforms evolve to meet business and regulatory needs
  • Collaborate with Collections Operations, Technology, Data & Analytics, Risk, Compliance, and Domain teams to gather requirements, understand challenges, and identify opportunities for improvement
  • Facilitate communication across teams to ensure alignment and transparency
  • Define product vision, strategy, and roadmap for AWS Connect and CACS X
  • Prioritize and manage the product backlog, ensuring timely delivery of features, integrations, and data capabilities
  • Provide ongoing guidance to development teams throughout the delivery lifecycle
  • Lead initiatives to integrate critical Collections data into enterprise domain services (e.g., Account, Card, Loan domains)
  • Design and support real-time and batch data integration patterns between Collections platforms and enterprise systems
  • Develop and manage data products supporting advanced Collections reporting, including delinquency analytics, roll rates, cure rates, contact effectiveness, payment arrangements, and liquidation performance
What we offer:
  • Medical, dental, vision, life, disability, and other insurance plans
  • ESPP (employee stock purchase program)
  • 401K program with company match after 12 months
  • HSA (Health Savings Account on the HDHP plan)
  • SupportLinc Employee Assistance Program (EAP) with up to 8 free counseling sessions

Data Analytics Engineer II

The Data Analytics Engineer II is responsible for development, expansion, and ma...
Location: United States, Irving
Salary: Not provided
CHRISTUS Health
Expiration Date: Until further notice
Requirements:
  • Bachelor’s degree in computer science, engineering, math, or related field, or foreign equivalent
  • Advanced knowledge of designing and developing data pipelines and delivering advanced analytics, with open-source Big Data processing frameworks such as Hadoop technologies
  • Proven competency in programming utilizing distributed computing principles
  • Demonstrative knowledge of data mining techniques, relational, and non-relational databases, Lambda Architecture, and BI and analytics landscape, preferable in large-scale development environments
  • Demonstrated proficiency in open-source technology, for example Python, Spark, Hive, HDFS, Nifi, etc.
  • Experience in data integration with ETL techniques and frameworks; Big Data querying tools such as Hive, Impala, and Spark SQL; and large-scale data lake and data warehouse implementations
  • Minimum five (5) years’ experience with design, architecture and development of Enterprise scale platforms built on open-source frameworks
Job Responsibility:
  • Meets expectations of the applicable OneCHRISTUS Competencies: Leader of Self, Leader of Others, or Leader of Leaders
  • Responsible for analyzing and understanding data sources, participating in requirement gathering, and providing insights and guidance on data technology and data modeling best practices
  • Analyzes ideas and business and functional requirements to formulate a design strategy
  • Acts as a tenant to draw out a workable application design and coding parameters with essential functionalities
  • Works in collaboration with the team members to identify and address the issues by implementing a viable technical solution that is time- and cost-effective and ensuring that it does not affect performance quality
  • Develops code following the industry's best practices and adheres to the organizational development rules and standards
  • Involved in the evaluation of proposed system acquisitions or solutions development and provides input to the decision-making process relative to compatibility, cost, resource requirements, operations, and maintenance
  • Integrates software components, subsystems, facilities, and services into the existing technical systems environment; assesses the impact on other systems and works with cross-functional teams within Information Services to ensure positive project impact; and installs, configures, and verifies the operation of software components

Business Intelligence Analyst II

At Aristocrat, we believe in pushing boundaries and redefining the gaming experi...
Location: United States, Las Vegas
Salary: 64214.00 - 119255.00 USD / Year
Aristocrat Gaming
Expiration Date: Until further notice
Requirements:
  • BA / BS degree or equivalent experience
  • At least 2 years of experience using data visualization tools (Tableau, Power BI, Qlik), specializing in building dashboards
  • At least 2 years of experience handling data warehouse tools (SQL, Snowflake), with responsibilities that include querying and building tables/views
  • Demonstrated practical experience in data presentation and analysis
  • Experience developing, testing, implementing, and operating data solutions in an enterprise environment
  • Strong interpersonal skills, with a focus on establishing relationships with mutual trust and respect
  • Ability to manage individual workload, meet deadlines, and communicate progress to leaders and collaborators
  • Strong communication skills, both oral and written, with the ability to present to peers and leaders
  • Ability to learn new concepts quickly and apply them to work products
Job Responsibility:
  • Collaborate with internal partners to translate business requirements into technical solutions and remediate data quality issues
  • Iteratively develop and test data assets, ensuring scalability, maintainability, usability, and data quality
  • Build tables and views in Microsoft SQL Server, dashboards in Tableau, and PowerPoint presentations
  • Develop intuitive data visualizations to make large and sophisticated data more accessible and understandable
  • Establish relationships with business partners, including commercial strategy, enterprise data, sales, and finance
  • Estimate and communicate the complexity and duration of work on projects
  • Lead multiple projects and proactively communicate status to collaborators
  • Use the project management tool, Wrike, to track progress on work
What we offer:
  • Health, dental, and vision insurance
  • Paid time off
  • 401(k) plan with employer matching
  • Robust benefits package
  • Global career opportunities

Senior Salesforce Developer

We are seeking a highly skilled and motivated Senior Salesforce Developer to joi...
Location: United States
Salary: 165000.00 - 200000.00 USD / Year
Human Interest
Expiration Date: Until further notice
Requirements:
  • 5+ years of hands-on experience as a Salesforce Developer in a complex enterprise environment
  • Advanced proficiency in programmatic development on the Salesforce platform, including Apex, Triggers, Batch Apex, Queueable Apex, and REST/SOAP APIs
  • Expert knowledge of the Lightning Experience, including extensive experience developing with Lightning Web Components (LWC)
  • Strong experience with large-scale data migrations, integration patterns, and data management on the Salesforce platform
  • Active Salesforce Certified Platform Developer I required
  • Salesforce Certified Platform Developer II highly preferred
Job Responsibility:
  • Lead the design, development, and implementation of complex, custom solutions on the Salesforce platform using Apex, Lightning Web Components (LWC), Visualforce, SOQL/SOSL, and platform events
  • Design and build robust integrations between Salesforce and internal/external systems (e.g., core financial platform, marketing tools, and data warehouses) using REST/SOAP APIs and platform events
  • Serve as a technical leader on projects, providing guidance, conducting code reviews, and mentoring junior developers to ensure code quality, adherence to best practices, and security standards
  • Collaborate with Solution Architects and Business Analysts to translate complex business requirements into scalable and well-architected technical designs, considering data volume, security, and governor limits
  • Drive the adoption of Salesforce development best practices, including CI/CD, automated testing (unit and integration), source-driven development (using Git/GitHub/Salesforce DX), and change management
  • Own the resolution of complex production issues, identifying root causes and implementing sustainable fixes to ensure high system uptime and performance
  • Utilize Salesforce's declarative tools (Flow, Process Builder, Validation Rules) effectively, knowing when to choose declarative vs. programmatic solutions
  • Ensure all Salesforce solutions adhere to strict financial services regulations and internal security policies
What we offer:
  • A great 401(k) plan: Our own! Our 401(k) includes a dollar-for-dollar employer match up to 4% of compensation (immediately vested) and $0 plan fees
  • Top-of-the-line health plans, as well as dental and vision insurance
  • Competitive time off and parental leave
  • Addition Wealth: Unlimited access to digital tools, financial professionals, and a knowledge center to help you understand your equity and support your financial wellness
  • Lyra: Enhanced Mental Health Support for Employees and dependents
  • Carrot: Fertility healthcare and family forming benefits
  • Candidly: Student loan resource to help you and your family plan, borrow, and repay student debt
  • Monthly work-from-home stipend
  • Quarterly lifestyle stipend
  • Engaging team-building experiences, ranging from virtual social events to team offsites, promoting collaboration and camaraderie