Sr Data Engineer

Amgen

Location:
Hyderabad, India

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will design, develop, and optimize data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. The role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
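
The requirements below name Databricks, PySpark, SQL, and AWS; purely for illustration, the sketch that follows shows the kind of minimal PySpark batch ETL/ELT step this description refers to. The file paths, column names, and table layout are hypothetical, not taken from the posting.

    # Minimal PySpark batch ETL sketch (hypothetical paths and columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Extract: read a semi-structured raw source.
    raw = spark.read.json("s3://example-bucket/raw/events/")

    # Transform: basic cleansing and typing.
    curated = (
        raw.filter(F.col("event_id").isNotNull())
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("ingest_date", F.current_date())
    )

    # Load: write a partitioned, query-friendly dataset.
    (curated.write
        .mode("overwrite")
        .partitionBy("ingest_date")
        .parquet("s3://example-bucket/curated/events/"))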

Job Responsibility:

  • Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric
  • Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture
  • Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency
  • Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance
  • Ensure data security, compliance, and role-based access control (RBAC) across data environments
  • Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets
  • Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring
  • Implement data virtualization techniques to provide seamless access to data across multiple storage systems
  • Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals
  • Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures

Requirements:

  • Hands-on experience in data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies
  • Proficiency in workflow orchestration and performance tuning for big data processing
  • Strong understanding of AWS services
  • Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures
  • Ability to quickly learn, adapt and apply new technologies
  • Strong problem-solving and analytical skills
  • Excellent communication and teamwork skills
  • Experience with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices
  • 9 to 12 years of experience in Computer Science, IT, or a related field

Nice to have:

  • Deep expertise in the Biotech & Pharma industries
  • Experience writing APIs to make data available to consumers
  • Experience with SQL/NoSQL databases and vector databases for large language models
  • Experience with data modeling and performance tuning for both OLAP and OLTP databases
  • Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
  • AWS Certified Data Engineer preferred
  • Databricks Certificate preferred
  • Scaled Agile SAFe certification preferred

Additional Information:

Job Posted:
January 31, 2026

Work Type:
On-site work

Similar Jobs for Sr Data Engineer

Sr Engineer, Data

The Sr Data Engineer designs and develops data architectures in on-premise, clou...
Location:
Overland Park, United States
Salary:
105,100.00 - 189,600.00 USD / Year
T-Mobile
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Engineering, Computer Science, a related subject area, or equivalent experience
  • 5+ years developing cloud solutions using data series
  • Experience with cloud platforms (Amazon Web Services, Azure, or Google Cloud)
  • Hands-on development using and migrating data to cloud platforms
  • Experience in SQL, NoSQL, and/or relational database design and development
  • Advanced knowledge and experience in building complex data pipelines with Python; experience in languages such as SQL, DAX, Python, Java, Scala, and/or Go
Job Responsibility:
  • Develop data engineering solutions, including data pipelines, visualization and analytical tools
  • Design and develop data architectures in on-premise, cloud and hybrid platforms
  • Data wrangling of heterogeneous data, exploration and discovery in pursuit of new business insights
  • Actively contribute to the team’s knowledge and drive new capabilities forward
  • Mentor other team members in their efforts to build data engineering skillsets
  • Assist team management in defining projects, including helping estimate, plan and scope work
  • Prepare and contribute to presentations required by management
What we offer:
  • Competitive base salary and compensation package
  • Annual stock grant
  • Employee stock purchase plan
  • 401(k)
  • Access to free, year-round money coaches
  • Medical, dental and vision insurance
  • Flexible spending account
  • Paid time off
  • Up to 12 paid holidays
  • Paid parental and family leave
  • Fulltime

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location:
Not provided
Salary:
Not provided
Data Ideology
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire)
  • Advanced Snowflake certifications preferred
Job Responsibility:
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
  • Fulltime

Sr. Data Engineer

We are looking for a Sr. Data Engineer to join our growing Quality Engineering t...
Location:
Not provided
Salary:
Not provided
Data Ideology
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience)
  • 5+ years of experience in data engineering, data warehousing, or data architecture
  • Expert-level experience with Snowflake, including data modeling, performance tuning, security, and migration from legacy platforms
  • Hands-on experience with Azure Data Factory (ADF) for building, orchestrating, and optimizing data pipelines
  • Strong experience with Informatica (PowerCenter and/or IICS) for ETL/ELT development, workflow management, and performance optimization
  • Deep knowledge of data modeling techniques (dimensional, tabular, and modern cloud-native patterns)
  • Proven ability to translate business requirements into scalable, high-performance data solutions
  • Experience designing and supporting end-to-end data pipelines across cloud and hybrid architectures
  • Strong proficiency in SQL and experience optimizing large-scale analytic workloads
  • Experience working within SDLC frameworks, CI/CD practices, and version control
Job Responsibility:
  • Ability to collect and understand business requirements and translate those requirements into data models, integration strategies, and implementation plans
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake, ensuring functionality, performance and data integrity
  • Ability to work within the SDLC framework in multiple environments and understand the complexities and dependencies of the data warehouse
  • Optimize and troubleshoot ETL/ELT workflows, applying best practices for scheduling, orchestration, and performance tuning
  • Maintain documentation, architecture diagrams, and migration plans to support knowledge transfer and project tracking
What we offer:
  • PTO Policy
  • Eligibility for Health Benefits
  • Retirement Plan
  • Work from Home
  • Fulltime

Sr Data Engineer

(Locals or Nearby resources only). You will work with technologies that include ...
Location:
Glendale, United States
Salary:
Not provided
Enormous Enterprise
Expiration Date:
Until further notice
Requirements:
  • 7+ years of data engineering experience developing large data pipelines
  • Proficiency in at least one major programming language (e.g. Python, Java, Scala)
  • Hands-on production environment experience with distributed processing systems such as Spark
  • Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
  • Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, Big Query)
  • Experience in developing APIs with GraphQL
  • Advanced understanding of OLTP vs. OLAP environments
  • Candidates must work on W2; no Corp-to-Corp
  • US Citizen, Green Card Holder, H4-EAD, TN-Visa
  • Airflow
Job Responsibility:
  • Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
  • Build and maintain APIs to expose data to downstream applications
  • Develop real-time streaming data pipelines
  • Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
  • Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
  • Ensure high operational efficiency and quality of the Core Data platform datasets to ensure our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
What we offer:
  • 3 levels of medical insurance for you and your family
  • Dental insurance for you and your family
  • 401k
  • Overtime
  • Sick leave policy: accrue 1 hour for every 30 hours worked up to 48 hours

Sr Data Engineer

Resource Informatics Group, Inc. is actively seeking a skilled Senior Data Engin...
Location:
Irving, United States
Salary:
Not provided
Resource Informatics Group
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related fields
  • Strong expertise in data engineering and cloud-based solutions
  • 6+ years of experience in data engineering, architecture, and implementation of large-scale data solutions
  • Proficiency in designing and implementing data models, data structures, and algorithms
  • Advanced knowledge of SQL and NoSQL databases
  • Demonstrated expertise in optimizing data pipelines and improving data reliability, efficiency, and quality
  • Excellent problem-solving capabilities with a keen attention to detail
  • Strong communication and collaboration skills, with the ability to work effectively across diverse teams
  • Relevant certifications in cloud technologies (Azure, AWS, or GCP) advantageous
  • Master’s in Data Science or Computer Science or foreign equivalent, plus 6+ years of experience, OR Bachelor’s in Computer Science, Information Technology, or Electronics and Communication Engineering or foreign equivalent
Job Responsibility:
  • Develop and execute ETL processes for data extraction, transformation, and loading into warehouses and data lakes
  • Architect data warehousing solutions using Azure Synapse Analytics for efficient querying and reporting
  • Optimize query performance, data processing speed, and resource utilization within Azure environments
  • Construct seamless data pipelines across Azure services utilizing Azure Data Factory, Databricks, and SQL Server Integration Services
  • Collaborate with stakeholders, including data scientists and analysts, to understand data requirements and deliver effective solutions
  • Manage large data volumes leveraging the Hadoop ecosystem for diverse source collection and loading
  • Design, maintain, and optimize data processing jobs using Hadoop MapReduce, Spark, and Hive, with coding in Java or Python for custom applications
  • Monitor job and cluster performance using tools like Ambari and custom monitoring scripts, scaling and maintaining Hadoop clusters and Azure data services
  • Ensure adherence to data security measures and governance standards
  • Integrate cross-cloud data with AWS and GCP services
  • Fulltime

Sr. Data Engineer

We are looking for a Sr. Data Engineer to join our team.
Location:
Not provided
Salary:
Not provided
Boston Data Pro
Expiration Date:
Until further notice
Requirements:
  • Data Engineering: 8 years (Preferred)
  • Data Programming languages: 5 years (Preferred)
  • Data Developers: 5 years (Preferred)
Job Responsibility:
  • Designs and implements standardized data management procedures around data staging, data ingestion, data preparation, data provisioning, and data destruction
  • Ensures quality of technical solutions as data moves across multiple zones and environments
  • Provides insight into the changing data environment, data processing, data storage, and utilization requirements for the company, and offers suggestions for solutions
  • Ensures managed analytic assets to support the company’s strategic goals by creating and verifying data acquisition requirements and strategy
  • Develops, constructs, tests, and maintains architectures
  • Aligns architecture with business requirements and uses programming languages and tools
  • Identifies ways to improve data reliability, efficiency, and quality
  • Conducts research for industry and business questions
  • Deploys sophisticated analytics programs, machine learning, and statistical methods to efficiently implement solutions
  • Prepares data for predictive and prescriptive modeling and finds hidden patterns in data

Sr. Network Data Center Engineer

If you live and breathe networking, virtualization, and high-availability system...
Location:
United States
Salary:
150,000.00 USD / Year
Corporate Tools
Expiration Date:
Until further notice
Requirements:
  • Experience with Proxmox or other hypervisors (VMware, KVM, Xen, Hyper-V)
  • 5+ years of network engineering, data center operations, or cloud infrastructure
  • Experience with Ceph or SAN-based storage solutions (iSCSI, NFS, ZFS)
  • Experience with containers and networking
  • Excellent problem-solving skills and a keen eye for detail
  • Ability to work on projects solo or with a team
  • Love for learning and improving code
  • Strong communication and collaboration skills
  • Understanding of Ceph storage architecture (OSDs, MONs, MDS, RADOS, etc.)
  • Experience in iSCSI/NFS/ZFS SAN setups and performance tuning
Job Responsibility:
  • Develop and design robust and scalable software solutions
  • Take ownership of projects from conception to deployment, ensuring timely delivery and meeting the specified requirements
  • Work closely with cross-functional teams, including IT, product management, and other software teams, to ensure seamless integration and alignment with business objectives
  • Stay updated with the latest industry trends, technologies, and best practices to bring innovative solutions to the table
  • Design, implement, and maintain a robust network architecture that supports Proxmox virtualization, Ceph/SAN storage, and container networking
  • Manage firewalls (iptables, pfSense, UFW, etc.) to secure access to virtualized environments and hosting services
  • Configure and optimize VLANs, subnets, and routing to ensure isolated and secure network segments for virtual machines, storage, and frontend applications
  • Configure and maintain VPNs, BGP, OSPF, or other routing protocols to ensure proper network redundancy and failover
  • Set up and maintain bridged, NAT, and VXLAN networking in Proxmox for efficient VM communication
  • Implement high-availability (HA) networking for Hypervisor networks and Ceph/SAN clusters
What we offer:
  • 100% employer-paid medical, dental and vision for employees
  • Annual review with raise option
  • 22 days Paid Time Off accrued annually, and 4 holidays
  • After 3 years, PTO increases to 29 days. Employees transition to flexible time off after 5 years with the company—not accrued, not capped, take time off when you want
  • The 4 holidays are: New Year’s Day, Fourth of July, Thanksgiving, and Christmas Day
  • Paid Parental Leave
  • Up to 6% company matching 401(k) with no vesting period
  • Quarterly allowance
  • Use it to make your remote work setup more comfortable, for continuing education classes, a plant for your desk, coffee for your coworker, a massage for yourself... really, whatever
  • Open concept office with friendly coworkers
  • Fulltime