
Hadoop Administrator


OPULENTSOFT

Location: United States, Tampa

Contract Type: Not provided

Salary: Not provided

Requirements:

  • Strong Big Data and analytical skills – minimum 3 years' experience
  • Experience in Hadoop cluster administration and configuration
  • Experience with Java and Unix-based systems
  • Ability to coordinate with multiple technical teams, business users, and customers
  • Strong communication skills
  • Strong troubleshooting skills

Additional Information:

Job Posted:
January 02, 2026


Similar Jobs for Hadoop Administrator

Hadoop Administrator

Location: United States, Atlanta
Salary: Not provided
Logic Loops
Expiration Date: Until further notice

Requirements:
  • Hands-on experience with the Hadoop stack (HDFS, MapReduce, HBase, Pig, Hive, Oozie)
  • Extensive experience with Oracle 10g/11g databases and PL/SQL
  • Monitor and review Oracle database instances to identify potential maintenance and tuning issues
  • Expertise in systems administration, Linux tools, configuration management in a large-scale environment
  • Troubleshoot and debug Hadoop ecosystem runtime issues
  • Recover from node failures and troubleshoot common Hadoop cluster issues
  • Document all production scenarios, issues, and resolutions
  • Collaborate in providing hardware architectural guidance, planning and estimating cluster capacity, and creating roadmaps for Hadoop cluster deployment
  • Evaluate Hadoop infrastructure requirements and design/deploy solutions (e.g., high-availability big data clusters)
  • Expertise in performance tuning, system dump analysis, and storage capacity management
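To illustrate the capacity-monitoring and troubleshooting duties listed above, here is a minimal Python sketch that parses the summary section of an `hdfs dfsadmin -report` run. The report text, field values, and 75% threshold below are invented for illustration, not taken from any employer's environment:

```python
# Hypothetical sketch: parse the summary of `hdfs dfsadmin -report`
# to flag a cluster approaching capacity. The sample report text is
# invented; a real report uses the same field names.

def parse_dfs_report(report: str) -> dict:
    """Extract byte counts from the report's 'Key: value (...)' lines."""
    stats = {}
    for line in report.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            token = value.strip().split(" ")[0]
            if token.isdigit():
                stats[key.strip()] = int(token)
    return stats

sample_report = """\
Configured Capacity: 1000000000000 (931.32 GB)
Present Capacity: 950000000000 (884.76 GB)
DFS Remaining: 200000000000 (186.26 GB)
DFS Used: 750000000000 (698.49 GB)
"""

stats = parse_dfs_report(sample_report)
used_pct = 100 * stats["DFS Used"] / stats["Present Capacity"]
print(f"DFS used: {used_pct:.1f}%")  # DFS used: 78.9%
if used_pct > 75:
    print("WARNING: consider adding nodes or cleaning up HDFS")
```

In a live cluster the report string would come from `subprocess.run(["hdfs", "dfsadmin", "-report"], ...)` rather than a literal.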
Job Responsibility:
  • Deploy new Hadoop infrastructure, Hadoop cluster upgrades, cluster maintenance, troubleshooting, capacity planning and resource optimization
  • Review, develop, and implement strategies that preserve the availability, stability, security and scalability of large Hadoop clusters
  • Interact with developers, architects and other operation team members to resolve job performance issues
  • Preparation of architecture, design and operational documentation
  • Participation in weekly on call rotation to provide operational support

Database Administrator II

We are seeking a highly skilled Cloudera Hadoop Administrator (DBA) with hands-o...
Location: Not provided
Salary: 100900.00 - 126100.00 USD / Year
ACE Hardware
Expiration Date: Until further notice

Requirements:
  • 4+ years of hands-on experience administering Cloudera Hadoop clusters
  • 2–3+ years of Databricks experience in production environments
  • 2+ years of Databricks administration experience on Azure (preferred)
  • Strong knowledge of Spark and Delta Lake architecture
  • Experience with IAM, Active Directory, and SSO integration
  • Familiarity with DevOps and CI/CD for data platforms
  • Deep understanding of Hadoop ecosystem: Hive, Impala, Spark, HDFS, YARN
  • Experience integrating data from DB2 to Hadoop/Databricks using tools like Sqoop or custom connectors
  • Scripting skills in Shell and/or Python for automation and system administration
  • Solid foundation in Linux/Unix system administration
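As a small illustration of the Shell/Python automation skills this posting asks for, here is a hypothetical sketch that flags filesystems over a usage threshold. The mount points, percentages, and 80% limit are invented; a real script would parse `df -P` output via `subprocess` instead of a hard-coded dictionary:

```python
# Hypothetical sysadmin automation sketch: flag filesystems over a
# usage threshold. Sample data stands in for parsed `df -P` output.

def over_threshold(mounts, limit=80.0):
    """Return mount points whose used percentage exceeds `limit`."""
    return sorted(m for m, pct in mounts.items() if pct > limit)

sample = {"/": 42.0, "/data1": 91.5, "/data2": 67.3, "/data3": 88.0}
print(over_threshold(sample))  # ['/data1', '/data3']
```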
Job Responsibility:
  • Manage and support Cloudera Hadoop clusters and services (HDFS, YARN, Hive, Impala, Spark, Oozie, etc.)
  • Perform cluster upgrades, patching, performance tuning, capacity planning, and health monitoring
  • Secure the Hadoop platform using Kerberos, Ranger, or Sentry
  • Develop and maintain automation and monitoring scripts
  • Ingest data using tools such as Sqoop, NiFi, DEI Informatica, Qlik
  • Support release and deployment activities, including deployment of new releases across Dev/Test and Production environments
  • Integrate CI/CD pipelines (Git or custom tooling) for automated code deployment
  • Ensure minimal downtime, rollback capability, and alignment with change management policies
  • Maintain detailed release documentation, track changes in version control systems, and collaborate with development and operations teams to streamline deployment workflows
  • Administer and maintain Databricks workspaces in cloud environments (Azure or GCP)
What we offer:
  • Incentive opportunities
  • Generous 401(k) retirement savings plan with matching and discretionary contributions
  • Comprehensive health coverage (medical, dental, vision and disability) & life insurance benefits
  • 21 days of vacation
  • Up to 6 paid holidays
  • Annual Ace Cares Week
  • 20 hours off work per year to volunteer
  • Opportunities to help Children’s Miracle Network Hospitals and the Ace Helpful Fund
  • On-site classes, facilitator-led courses, and a generous tuition assistance program
  • Frequent campus events (Employee Appreciation Week, vendor demos, cookouts, merchandise sales)
Contract Type: Fulltime

Hadoop and Bigdata Administrator

You will work in a multi-functional role with a combination of expertise in Syst...
Location: India, Indore, NOIDA
Salary: Not provided
ClearTrail
Expiration Date: Until further notice

Requirements:
  • Linux Administration
  • Experience in Python and Shell Scripting
  • Deploying and administering Hortonworks, Cloudera, Apache Hadoop/Spark ecosystem
  • Knowledge of Hadoop core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, etc.
  • Knowledge of HBase clusters
Job Responsibility:
  • Deploying and administering Hortonworks, Cloudera, Apache Hadoop/Spark ecosystem
  • Installing Linux Operating System and Networking
  • Writing Unix SHELL/Ansible Scripting for automation
  • Maintaining core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, HBase, etc.
  • Taking care of the day-to-day running of Hadoop clusters using Ambari, Cloudera Manager, or other monitoring tools, ensuring that the cluster is up and running at all times
  • Maintaining HBase clusters and capacity planning
  • Maintaining Solr clusters and capacity planning
  • Work closely with the database team, network team and application teams to make sure that all the big data applications are highly available and performing as expected
  • Manage KVM Virtualization environment
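The capacity-planning duties above often come down to simple arithmetic. A hypothetical sketch, assuming HDFS's default 3x replication; the ingest rate, retention period, and 25% headroom figure are invented examples:

```python
# Hypothetical capacity-planning sketch. Assumes HDFS default 3x
# replication; all input figures are invented examples.

def raw_storage_tb(daily_ingest_tb, retention_days,
                   replication=3, headroom=0.25):
    """Raw disk (TB) needed, keeping `headroom` fraction of capacity free."""
    logical = daily_ingest_tb * retention_days
    return logical * replication / (1 - headroom)

# e.g. 0.5 TB/day kept for 90 days with 3x replication and 25% headroom:
print(round(raw_storage_tb(0.5, 90), 1))  # 180.0  (TB of raw disk)
```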

Hadoop Admin

Role: Hadoop Admin. Location: Austin, TX (Onsite). FTE ONLY.
Location: United States, Austin
Salary: 147500.00 USD / Year
Realign
Expiration Date: Until further notice

Requirements:
  • Performance tuning of Hadoop clusters and Hadoop workloads
  • Screen Hadoop cluster job performances and capacity planning at application/queue level
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files
  • File system management and monitoring
  • Implementation and ongoing administration of Hadoop infrastructure
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
  • Working with data delivery teams to setup new Hadoop users/applications
  • Cluster maintenance as well as creation and removal of nodes using tools like Ambari and other home-grown tools
  • HDFS support and maintenance
Job Responsibility:
  • Responsible for implementation and ongoing administration of Hadoop infrastructure
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
  • Working with data delivery teams to setup new Hadoop users/applications
  • Cluster maintenance as well as creation and removal of nodes using tools like Ambari and other home-grown tools
  • Performance tuning of Hadoop clusters and Hadoop workloads
  • Screen Hadoop cluster job performances and capacity planning at application/queue level
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files
  • File system management and monitoring
  • HDFS support and maintenance
Contract Type: Fulltime
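The "manage and review Hadoop log files" duty above typically means scanning log4j-style logs for problems. A minimal hypothetical sketch; the sample log lines are invented, though they follow the standard Hadoop log layout:

```python
# Hypothetical log-review sketch: tally Hadoop (log4j-style) log lines
# by severity. The sample lines are invented for illustration.
from collections import Counter

def severity_counts(lines):
    """Tally the severity token (3rd whitespace field) of each line."""
    counts = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) >= 3 and fields[2] in {"INFO", "WARN", "ERROR", "FATAL"}:
            counts[fields[2]] += 1
    return counts

sample_log = [
    "2026-01-02 10:00:01,123 INFO namenode.FSNamesystem: Roll Edit Log",
    "2026-01-02 10:00:02,456 WARN hdfs.DFSClient: Slow ReadProcessor",
    "2026-01-02 10:00:03,789 ERROR datanode.DataNode: IOException in offerService",
]

print(severity_counts(sample_log))
```

A real review script would iterate over files under the Hadoop log directory rather than an in-memory list.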

Senior Systems Administrator

The role of the System Administrator includes supporting the implementation, tro...
Location: United States, Laurel
Salary: Not provided
Wrench Technology
Expiration Date: Until further notice

Requirements:
  • Fourteen (10) years of professional experience as an SA
  • Bachelor’s degree in Computer Science or related discipline from an accredited college or university is required
  • Five (5) years of additional SA experience may be substituted for a bachelor’s degree
  • Provide expertise in troubleshooting IT systems
  • Provide thorough analysis and feedback to management and internal customers regarding escalated tickets
  • Extend support for dispatch system and hardware issues, remaining actively engaged in the resolution process
  • Handle configuration and management of UNIX and Windows (or other relevant) operating systems, including installation/loading of software, troubleshooting, maintaining integrity, configuring network components, and implementing enhancements to improve reliability and performance
  • NetApp experience required
  • Able to write in the following scripting languages: Python, Ruby, and Perl
Job Responsibility:
  • Supporting the implementation, troubleshooting, and upkeep of Information Technology (IT) systems
  • Overseeing the IT system infrastructure and associated processes
  • Providing assistance for day-to-day operations, monitoring, and resolving issues related to client/server/storage/network devices, as well as mobile devices
  • Diagnosing and resolving problems
  • Configuring and managing UNIX and Windows operating systems
  • Installing and maintaining operating system software
  • Ensuring integrity and configuring network components
  • Implementing enhancements to operating systems to enhance reliability and performance
  • Providing assistance with the installation, configuration, optimization, and administration of extensive Hadoop (Apache Accumulo) clusters dedicated to data-intensive computing tasks

Hadoop Engineer

Role - Hadoop Engineer; Experience - 6+
Location: United States, Bellevue
Salary: 180000.00 USD / Year
Realign
Expiration Date: Until further notice

Requirements:
  • Must-have technical/functional skills: Big Data and Hadoop ecosystems, Kafka, HBase
  • Required skills: DevOps Engineer, Senior Email Security Engineer
Job Responsibility:
  • Hadoop administration: Cloudera, AWS, Hortonworks
  • Supporting HDInsight product team in Hadoop Admin support
  • Helping customers in providing resolutions to end customers by resolving the ICM
  • Run prototypes to migrate big data workloads; performance tuning for Spark, Hive, and Hadoop jobs
  • Troubleshoot production issues, identify the root cause, and provide mitigation
  • Build tools and services to improve debuggability/supportability
  • Monitoring cluster health for top customers and providing support
Contract Type: Fulltime

Senior Solutions Architect

At Cloudera, we empower people to transform complex data into clear and actionab...
Location: India, Mumbai
Salary: Not provided
Cloudera
Expiration Date: Until further notice

Requirements:
  • Overall 12+ years of IT experience
  • At least 4 years of production experience working with Hadoop and/or NiFi and data engineering
  • Hands-on experience with all aspects of developing, testing and implementing low-latency big data pipelines
  • Demonstrated production experience in data engineering, data management, cluster management and/or analytics domains
  • Experience designing data queries against data in the HDFS environment using tools such as Apache Hive
  • Experience implementing MapReduce, Spark jobs
  • Experience setting up multi-node Hadoop clusters
  • Experience in systems administration or DevOps experience with one or more open-source operating systems
  • Experience with Data Warehouse design, ETL (Extraction, Transformation & Load), architecting efficient software designs for DW platform
  • Experience implementing operational best practices such as alerting, monitoring, and metadata management
Job Responsibility:
  • Work directly with customer business and technical teams to understand requirements and develop high quality solutions
  • Design highly scalable and reliable data pipelines to consume, integrate, and analyze large amounts of data from various sources
  • Able to understand big data use-cases and recommend standard design, implementation patterns used in Hadoop-based deployments
  • Able to document and present complex architectures for the customer’s technical teams
  • Work closely with Cloudera teams at all levels to ensure project and customer success
  • Design effective data models for optimal storage and retrieval, deploy inclusive data quality checks to ensure high quality of data
  • Design, build, tune and maintain data pipelines using Hadoop, NiFi or related data integration technologies
  • Install, deploy, augment, upgrade, manage and operate large Hadoop clusters
  • Write and produce technical documentation, customer status reports and knowledge-base articles
  • Keep up with current Hadoop, NiFi, Big Data ecosystem / technologies
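The "deploy inclusive data quality checks" responsibility above can be pictured as rule-driven row validation before load. A hypothetical sketch; the column names, rules, and sample rows are invented for illustration and are not Cloudera's method:

```python
# Hypothetical data-quality sketch: apply not-null and range rules to
# rows before load. Schema, rules, and rows are invented examples.

def validate(rows, not_null, ranges):
    """Return (good_rows, errors); errors pair row index with problems."""
    good, errors = [], []
    for i, row in enumerate(rows):
        problems = [f"{c} is null" for c in not_null if row.get(c) is None]
        for col, (lo, hi) in ranges.items():
            v = row.get(col)
            if v is not None and not lo <= v <= hi:
                problems.append(f"{col}={v} out of range")
        if problems:
            errors.append((i, problems))
        else:
            good.append(row)
    return good, errors

rows = [{"id": 1, "amount": 250.0},
        {"id": None, "amount": 99.0},
        {"id": 3, "amount": -5.0}]
good, errors = validate(rows, not_null=("id",), ranges={"amount": (0, 1e6)})
print(len(good), len(errors))  # 1 2
```

In a pipeline, the `errors` list would typically be routed to a quarantine table rather than printed.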
What we offer:
  • Generous PTO Policy
  • Support work life balance with Unplugged Days
  • Flexible WFH Policy
  • Mental & Physical Wellness programs
  • Phone and Internet Reimbursement program
  • Access to Continued Career Development
  • Comprehensive Benefits and Competitive Packages
  • Paid Volunteer Time
  • Employee Resource Groups
Contract Type: Fulltime
