IICS Developer

LumenData

Location:
India, Bangalore South

Contract Type:
Not provided

Salary:

Not provided

Job Description:

LumenData is looking for an experienced IICS Developer to join our team in Bangalore, India. The ideal candidate should have 3+ years of experience in Informatica Intelligent Cloud Services (IICS), with expertise in Cloud Data Integration (CDI) and Cloud Data Quality (CDQ). This role will involve designing, developing, and optimizing data integration workflows to support enterprise-wide data initiatives.

Job Responsibility:

  • Develop, test, and deploy IICS CDI and CDQ solutions for data integration and transformation
  • Collaborate with business and technical teams to understand data requirements and implement scalable solutions
  • Design and optimize ETL processes for performance, scalability, and reliability
  • Implement data validation, cleansing, and transformation rules to improve data quality
  • Work with SQL, PL/SQL, and relational databases (Oracle, SQL Server, PostgreSQL, etc.)
  • Develop APIs and integrations with external applications using REST/SOAP web services
  • Troubleshoot and resolve data integration issues in a timely manner
  • Maintain technical documentation, including data flow diagrams and process documentation
  • Stay updated with Informatica Cloud advancements and recommend best practices

Requirements:

  • 3+ years of hands-on experience in Informatica IICS CDI/CDQ
  • Strong knowledge of ETL, data integration, and data quality principles
  • Experience with SQL, PL/SQL, and database optimization techniques
  • Familiarity with API-based integrations and working knowledge of REST/SOAP web services
  • Strong problem-solving and debugging skills
  • Ability to work in an agile development environment
  • Excellent communication and team collaboration skills

Nice to have:

  • Informatica Cloud Certification
  • Experience in CAI (Cloud Application Integration)
  • Experience with Cloud Platforms (AWS, Azure, GCP)
  • Knowledge of Python or Shell Scripting for automation
  • Understanding of Big Data technologies and data governance frameworks

Additional Information:

Job Posted:
December 31, 2025

Employment Type:
Fulltime

Similar Jobs for IICS Developer

Informatica Developer

We are seeking an experienced Informatica Developer to join our dynamic IT team....
Location:
India, Hyderabad
Salary:
Not provided
Right Angle Solutions
Expiration Date:
Until further notice
Requirements:
  • 7+ years of hands-on experience with Informatica PowerCenter
  • Minimum 3 years’ experience using Informatica Intelligent Cloud Services (IICS)
  • Experience with other Informatica products such as Informatica Cloud, Informatica Data Quality, or Informatica MDM
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud
  • Experience with Agile development methodologies
  • Expertise in SQL is a must
  • BE/BS/MTech/MS or equivalent work experience
Job Responsibility:
  • Designing, developing, and maintaining ETL processes using Informatica PowerCenter
  • Identify and understand requirements and design
  • Profile and analyze source data (mainframe extracts, flat files, SQL Server, web services, etc.) using Informatica ETL
  • Create Mapplets
  • Analyzing and evaluating data sources, data volume, and business rules
  • Develop ETL components using Informatica Intelligent Cloud Services; extract data and create data models
  • Automation, job scheduling, dependencies, monitoring
  • Extraction, transformation, and load of data using the ETL tools
  • Configure/Script Business rules and transformation rules in Informatica
  • Understand complex stored procedures and enhance them based on requirements to optimize the code
Employment Type:
Fulltime

Sr. Data Engineer

We are looking for a Sr. Data Engineer to join our growing Quality Engineering t...
Location:
Not provided
Salary:
Not provided
Data Ideology
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience)
  • 5+ years of experience in data engineering, data warehousing, or data architecture
  • Expert-level experience with Snowflake, including data modeling, performance tuning, security, and migration from legacy platforms
  • Hands-on experience with Azure Data Factory (ADF) for building, orchestrating, and optimizing data pipelines
  • Strong experience with Informatica (PowerCenter and/or IICS) for ETL/ELT development, workflow management, and performance optimization
  • Deep knowledge of data modeling techniques (dimensional, tabular, and modern cloud-native patterns)
  • Proven ability to translate business requirements into scalable, high-performance data solutions
  • Experience designing and supporting end-to-end data pipelines across cloud and hybrid architectures
  • Strong proficiency in SQL and experience optimizing large-scale analytic workloads
  • Experience working within SDLC frameworks, CI/CD practices, and version control
Job Responsibility:
  • Collect and understand business requirements and translate those requirements into data models, integration strategies, and implementation plans
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake, ensuring functionality, performance and data integrity
  • Work within the SDLC framework in multiple environments and understand the complexities and dependencies of the data warehouse
  • Optimize and troubleshoot ETL/ELT workflows, applying best practices for scheduling, orchestration, and performance tuning
  • Maintain documentation, architecture diagrams, and migration plans to support knowledge transfer and project tracking
What we offer:
  • PTO Policy
  • Eligibility for Health Benefits
  • Retirement Plan
  • Work from Home
Employment Type:
Fulltime

Informatica IICS Developer

Job Description: Informatica IICS Developer
Location:
India, Hyderabad; Bangalore; Chennai; Kolkata; Noida; Gurgaon; Pune; Indore; Mumbai
Salary:
Not provided
DXC Technology
Expiration Date:
Until further notice
Requirements:
  • 8+ years of experience with ETL tools such as Informatica Cloud, IICS, and Informatica PowerCenter
  • Proficient with tools like Excel, Confluence, Jira, GIT, SharePoint
  • Proficient in SQL for data manipulation and validation
  • Experience with performance tuning / optimization of ETL processes
  • Experience with UNIX environments, SSH file transfer, basic scripting and navigation commands
  • Experience in large data migration programs
  • Experience using, configuring, and scheduling Control-M jobs
  • Willing to work a minimum of 2 days a week in the office

Sr Consultant MDM/IICS

Excellent knowledge on Informatica IICS platform as a whole and the integrations...
Location:
India, Bangalore South
Salary:
Not provided
LumenData
Expiration Date:
Until further notice
Requirements:
  • Excellent knowledge of the Informatica IICS platform as a whole and the integrations among its different cloud components and services
  • Experience in Informatica MDM / Informatica MDM SaaS
  • Advanced knowledge of ETL, including the ability to read and write efficient, robust code and to follow, implement, or create best practices and coding standards
  • Ability to program complex ETL mappings using CDI
  • IICS Data Integration development experience with multiple data-loading technologies, fulfilling mapping specifications
  • Experienced in implementing API calls in CDI by building Swagger files and business service components
  • Proficient in Cloud Application Integration (CAI) development, with an understanding of SOAP/RESTful API interface design principles
  • Confident in taking up ad-hoc API design and development with limited technical resources, sometimes integrating open-source components
  • Strong in troubleshooting both ETL and API connectivity and transactions, proactively reporting and resolving issues
  • Strong in SQL; able to write complex SQL queries and tune their performance
Employment Type:
Fulltime

Senior Data Engineer

We are looking for a Senior Data Engineer to join our growing Quality Engineerin...
Location:
United States, Pittsburgh
Salary:
Not provided
Data Ideology
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience)
  • 5+ years of experience in data engineering, data warehousing, or data architecture
  • Expert-level experience with Snowflake, including data modeling, performance tuning, security, and migration from legacy platforms
  • Hands-on experience with Azure Data Factory (ADF) for building, orchestrating, and optimizing data pipelines
  • Strong experience with Informatica (PowerCenter and/or IICS) for ETL/ELT development, workflow management, and performance optimization
  • Deep knowledge of data modeling techniques (dimensional, tabular, and modern cloud-native patterns)
  • Proven ability to translate business requirements into scalable, high-performance data solutions
  • Experience designing and supporting end-to-end data pipelines across cloud and hybrid architectures
  • Strong proficiency in SQL and experience optimizing large-scale analytic workloads
  • Experience working within SDLC frameworks, CI/CD practices, and version control
Job Responsibility:
  • Collect and understand business requirements and translate those requirements into data models, integration strategies, and implementation plans
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake, ensuring functionality, performance and data integrity
  • Work within the SDLC framework in multiple environments and understand the complexities and dependencies of the data warehouse
  • Optimize and troubleshoot ETL/ELT workflows, applying best practices for scheduling, orchestration, and performance tuning
  • Maintain documentation, architecture diagrams, and migration plans to support knowledge transfer and project tracking
What we offer:
  • PTO Policy
  • Eligibility for Health Benefits
  • Retirement Plan
  • Work from Home
Employment Type:
Fulltime

Informatica MDM Technical Analyst

We are seeking an experienced MDM-Hub Technical Analyst to join our team and sup...
Location:
Romania, Bucharest
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in master data integration with Informatica tools (B360 and IICS CAI and CDI)
  • Expertise in MDM-Hub implementation, data modeling, and master data governance
  • Knowledge of SQL (e.g., MSSQL) and Python for data processing and automation
  • Experience with API-based integrations (REST and SOAP) and Event-Driven architectures
  • Understanding of ETL pipelines, data warehousing concepts, and cloud-based data services
  • Familiarity with data quality, data lineage, metadata management, and compliance standards
  • Experience with workflow automation and business process management (BPM) tools
  • Excellent problem-solving skills, attention to detail, and ability to work independently
Job Responsibility:
  • Design and develop master data integration flows using Informatica B360, IICS CAI and CDI, and related technologies
  • Collaborate with business and IT teams to understand data models, data flow, and integration requirements
  • Optimize and troubleshoot MDM-Hub, ETL and data pipeline processes, ensuring high performance and reliability
  • Define and enforce data governance best practices, including data quality, lineage, and compliance
  • Work with cloud platforms (e.g., Azure)
  • Develop and maintain technical documentation, including data mappings, integration workflows, and design specifications
  • Support data migration activities and ensure the successful onboarding of legacy and new data sources
  • Provide guidance and best practices for data security, access control, and compliance
What we offer:
  • Full access to foreign language learning platform
  • Personalized access to tech learning platforms
  • Tailored workshops and trainings to sustain your growth
  • Medical subscription
  • Meal tickets
  • Monthly budget to allocate on flexible benefit platform
  • Access to 7 Card services
  • Wellbeing activities and gatherings
Employment Type:
Fulltime

Systems Analyst 3

Understands business objectives and problems, identifies alternative solutions, ...
Location:
United States, Austin
Salary:
Not provided
HonorVet Technologies
Expiration Date:
Until further notice
Requirements:
  • Experience developing mappings and workflows to automate ETL processes using Informatica PowerCenter or IICS.
  • Experience acquiring and integrating data from multiple data sources/technologies using Informatica PowerCenter or IICS for use by a Tableau data visualization object. Data source technologies should include Oracle, SQL Server, Excel, Access, and Adobe PDF.
  • Experience designing and developing complex Oracle and/or Snowflake SQL scripts that are fast and efficient
  • Strong analytical and problem-solving skills with experience as a system analyst for a data analytics, performance management system, or data warehousing project.
  • Technical writing and diagramming skills, including proficiency with modeling and mapping tools (e.g., Visio, Erwin), the Microsoft Office Suite (Word, Excel, and PowerPoint), and MS Project.
  • Experience in planning and delivering software platforms used across multiple products and organizational units.
  • Proven ability to write well designed, testable, efficient code by using best software development practices
Job Responsibility:
  • Filling the role of a technical leader, leading an agile development team through a project.
  • Data acquisition from a variety of data sources for multiple uses.
  • Developing complex SQL scripts to transform the source data to fit into a dimensional model, then to create views and materialized views in Oracle.
  • Developing automation with Informatica PowerCenter/IICS to pull data from external data sources and transform it to fit into a dimensional model.
  • Collaborating with other members of the Data Engineering Team on the design and implementation of an optimal data design.
  • Verification and validation of SQL scripts, Informatica automation and database views.
  • Developing automated means of performing verification and validation.
  • Participating in all sprint ceremonies
  • Work closely with the Architects and Data Engineering Team on implementation designs and data acquisition strategies.
  • Develop mockups and work with customers for validation

Data Engineer

We are seeking a hands-on Data Engineer to join our Quality Engineering team. Th...
Location:
United States, Pittsburgh
Salary:
Not provided
Data Ideology
Expiration Date:
Until further notice
Requirements:
  • 5+ years of hands-on experience building and supporting ETL pipelines in production
  • Strong SQL skills, including complex joins, subqueries, aggregations, and performance tuning
  • Hands-on experience with Informatica PowerCenter and/or Informatica IICS
  • Experience working with cloud data warehouses, including Snowflake
  • Ability to independently troubleshoot ETL failures, data issues, and query performance problems
  • Experience working within SDLC frameworks and version control systems
  • Strong communication skills and ability to explain technical work clearly
  • Must be able to contribute quickly without extensive training
Job Responsibility:
  • Develop, maintain, and enhance production ETL pipelines using Informatica (PowerCenter and/or IICS)
  • Write, optimize, and troubleshoot complex SQL used for transformations, validations, and analytics
  • Support and enhance data pipelines loading into Snowflake
  • Debug failed ETL jobs, data discrepancies, and performance issues across ETL and warehouse layers
  • Collaborate with business and technical stakeholders to understand data requirements and translate them into working solutions
  • Support data warehouse environments across cloud and hybrid architectures
  • Maintain technical documentation, mappings, and workflow logic
  • Participate in SDLC processes including deployments, testing, and version control
What we offer:
  • PTO Policy
  • Eligibility for Health Benefits
  • Retirement Plan
  • Work from Home
Employment Type:
Fulltime