
Data Lake SME Jobs (On-site work)

1 Job Offer

Data Lake SME
Location
India, Bangalore
Salary
Not provided
Hewlett Packard Enterprise (https://www.hpe.com/)
Expiration Date
Until further notice
Explore the world of Data Lake Subject Matter Expert (SME) jobs and discover a pivotal career at the intersection of data architecture, engineering, and governance. A Data Lake SME is a senior-level professional responsible for the strategic design, implementation, and ongoing management of an organization's centralized data repository: the data lake. This role goes beyond simple data storage; it focuses on creating a scalable, secure, and high-performance platform that serves as the single source of truth for the entire enterprise, enabling advanced analytics, business intelligence, and data-driven decision-making.

Professionals in these jobs are the master architects of the data ecosystem, and their typical responsibilities span the entire data lifecycle. They design and build robust data ingestion pipelines that pull in vast volumes of structured, semi-structured, and unstructured data from diverse sources such as databases, application APIs, and real-time streaming platforms. A core function involves developing, optimizing, and managing complex ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to cleanse, transform, and prepare data for consumption, working with both batch and real-time processing frameworks to meet varying business needs. Performance tuning is a constant focus, requiring expertise in strategies like data partitioning, indexing, and caching to ensure efficient storage and rapid retrieval of petabytes of data.

Furthermore, Data Lake SMEs are the guardians of data integrity and security. They establish and enforce data governance policies, ensuring compliance with regulations like GDPR and HIPAA. This involves implementing critical security measures such as encryption, data masking, and role-based access control (RBAC) to protect sensitive information.
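The governance duties above can be illustrated with a small sketch. The following is a minimal, hypothetical Python example of role-based access control combined with column-level data masking; the role names, permitted columns, and masking rule are illustrative assumptions, not any specific platform's API.

```python
# Hypothetical RBAC + column-masking sketch. Role names, column sets,
# and the masking rule below are illustrative assumptions only.

# Map each role to the columns it may read in clear text.
ROLE_PERMISSIONS = {
    "analyst": {"order_id", "amount"},
    "admin": {"order_id", "amount", "email"},
}

def mask(value: str) -> str:
    """Redact all but the last four characters of a sensitive value."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def read_record(record: dict, role: str) -> dict:
    """Return the record with columns the role may not see masked out."""
    allowed = ROLE_PERMISSIONS.get(role, set())
    return {
        col: (val if col in allowed else mask(str(val)))
        for col, val in record.items()
    }

row = {"order_id": "A-1001", "amount": 49.99, "email": "jane@example.com"}
print(read_record(row, "analyst"))  # email column is masked
print(read_record(row, "admin"))    # full access, nothing masked
```

In a production data lake this policy check would typically be enforced by the platform (for example, via views or a governance layer) rather than in application code, but the principle is the same: the role, not the query, decides which columns come back in clear text.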
They also champion data discoverability by overseeing metadata management, data lineage tracking, and data cataloging initiatives, making it easy for users to find and trust the data they need. Collaboration is key: these experts regularly partner with data scientists, business analysts, and other engineering teams to understand their requirements and enable them to derive powerful insights from the data lake.

The typical skill set for these high-impact jobs is comprehensive. Employers generally seek candidates with extensive experience in data engineering and a deep understanding of big data ecosystems. Proficiency in programming languages like Python, Scala, or Java is essential, coupled with expert-level SQL skills. Hands-on experience with core technologies such as Apache Spark, Hadoop, Kafka, and modern table formats like Delta Lake or Apache Iceberg is highly valued. As most modern data lakes are cloud-native, demonstrated expertise with cloud platforms (AWS, Azure, or GCP) and their data services (e.g., AWS Glue, Azure Data Factory, Databricks) is a standard requirement. Familiarity with workflow orchestration tools like Apache Airflow is also common.

For those with a passion for building the foundational systems that power modern analytics, Data Lake SME jobs offer a challenging and highly rewarding career path with significant demand across industries.
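To illustrate the metadata-management and lineage-tracking duties mentioned above, here is a minimal, hypothetical Python sketch of an in-memory data catalog. The dataset names and schemas are invented for illustration; a real data lake would use a dedicated catalog service (such as a Hive metastore or a managed cloud catalog) rather than a dictionary.

```python
# Hypothetical in-memory data catalog with lineage tracking.
# Dataset names and schemas below are illustrative assumptions.

class DataCatalog:
    def __init__(self):
        self._entries = {}

    def register(self, name, schema, upstream=()):
        """Record a dataset's schema and the datasets it was derived from."""
        self._entries[name] = {"schema": schema, "upstream": list(upstream)}

    def lineage(self, name):
        """Walk upstream dependencies to collect every ancestor dataset."""
        ancestors, stack = set(), [name]
        while stack:
            entry = self._entries.get(stack.pop(), {})
            for parent in entry.get("upstream", []):
                if parent not in ancestors:
                    ancestors.add(parent)
                    stack.append(parent)
        return ancestors

catalog = DataCatalog()
catalog.register("raw_orders", {"order_id": "string", "amount": "double"})
catalog.register("clean_orders", {"order_id": "string", "amount": "double"},
                 upstream=["raw_orders"])
catalog.register("daily_revenue", {"day": "date", "revenue": "double"},
                 upstream=["clean_orders"])

# daily_revenue descends from clean_orders, which descends from raw_orders.
print(catalog.lineage("daily_revenue"))
```

The point of the sketch is the traversal: given any derived dataset, the catalog can answer "where did this come from?", which is exactly what lineage tracking provides to users deciding whether to trust a table.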
