
Big Data System Specialist


Vodafone

Location:
Romania, Bucharest


Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a System Engineering Specialist to manage and optimise Big Data platforms, ensuring operational excellence and supporting transformation initiatives. This role involves designing, configuring, and implementing software solutions aligned with global standards, as well as developing scripts and programs to automate processes and enhance system functionality. You will play a key role in IT projects from requirements gathering to implementation and testing, while ensuring high availability and performance of critical platforms.

Job Responsibility:

  • Participate in all phases of IT projects, including defining technical requirements, evaluating solutions, developing scripts, and implementing platforms
  • Develop automation scripts and programs to extend functionality and streamline recurring activities
  • Design, configure, and implement solutions involving operating systems and Big Data technologies
  • Create and test disaster recovery solutions and maintain system resilience
  • Optimise existing architectures and perform upgrades to meet evolving business needs
  • Provide support for transformation projects and technology refresh programmes to enhance service availability and performance
  • Investigate and resolve performance incidents, ensuring minimal impact on client operations
  • Install and maintain operating systems and perform proactive maintenance using automated tools
  • Evaluate emerging technologies and contribute to proof-of-concept initiatives
  • Be available for exceptional situations requiring extended hours
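As a purely hypothetical illustration of the recurring-activity automation this role describes (the script, report format, node names, and threshold are assumptions, not part of the posting), a short Python script might flag data nodes running low on capacity:

```python
# Illustrative sketch of a routine-maintenance automation script:
# parse a simplified capacity report and flag nodes above a usage
# threshold. The report format and field names are assumptions.

def flag_full_nodes(report_lines, threshold_pct=85.0):
    """Return names of nodes whose used/capacity ratio exceeds threshold_pct.

    Each line is expected as: "<node> <used_bytes> <capacity_bytes>".
    Malformed lines are skipped rather than aborting the whole run.
    """
    flagged = []
    for line in report_lines:
        parts = line.split()
        if len(parts) != 3:
            continue  # tolerate blank or malformed lines
        node, used, capacity = parts
        try:
            pct = 100.0 * int(used) / int(capacity)
        except (ValueError, ZeroDivisionError):
            continue
        if pct > threshold_pct:
            flagged.append(node)
    return flagged


if __name__ == "__main__":
    sample = ["dn01 90 100", "dn02 40 100", "not a report line"]
    print(flag_full_nodes(sample))  # ['dn01']
```

In practice such a check would read from a real monitoring source and feed an alerting tool; the point is only the shape of the automation.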

Requirements:

  • University degree with 3–5 years of experience in IT systems administration
  • Proven expertise in Big Data solutions: Hadoop ecosystem (HDFS, YARN, MapReduce), Teradata, and machine learning
  • Strong knowledge of Red Hat Enterprise Linux or Microsoft Windows operating systems
  • Proficiency in procedural programming and shell scripting (Python, KSH, Bash, Perl)
  • Familiarity with ITIL practices and advanced cyber security principles
  • Understanding of SOX compliance
  • Excellent problem-solving skills, structured delivery approach, and ability to manage multiple priorities under pressure
  • Strong documentation skills and customer-oriented mindset
  • Fluent in English (written and spoken)

What we offer:
  • Hybrid way of working
  • Medical and dental services
  • Life and hospitalization insurance
  • Dedicated employee phone subscription
  • Take control of your benefits and choose any of the following options within the budget: meal tickets, private pension, vacation vouchers, or cultural vouchers
  • Special discounts for gyms and retailers
  • Annual Company Bonus
  • Loyalty Programme
  • Ongoing Education – we continuously invest in you to ensure you have everything needed to excel on the job and enhance your skills
  • You get to work with tried and trusted web-technology
  • Getting in on the ground floor of a technology changing company
  • We let you write your own story by planning vacations: go for a trip, experience new things, have fun and enjoy your 23 days off
  • Special Paternal Program - 4 months of paid paternity leave

Additional Information:

Job Posted:
January 22, 2026

Employment Type:
Full-time

Work Type:
Hybrid work

Similar Jobs for Big Data System Specialist

Big Data System Specialist

We are seeking a System Engineering Specialist to manage and support Big Data pl...
Location:
Romania, Bucharest
Salary:
Not provided
Vodafone
Expiration Date
Until further notice
Requirements
  • University graduate with 3–5 years’ experience in IT infrastructure administration
  • Proven expertise in Big Data technologies including Hadoop (HDFS, YARN, MapReduce), Teradata, and Machine Learning
  • Proficient in Red Hat Enterprise Linux or Microsoft Windows operating systems
  • Skilled in procedural programming and shell scripting
  • Knowledgeable in ITIL, cyber security, and SOX principles
  • Strong documentation and problem-solving skills
  • Able to manage multiple priorities under pressure and meet tight deadlines
  • Fluent in English (written and spoken)
  • Customer-focused with the ability to work effectively in distributed teams
Job Responsibility
  • Manage the operational aspects of Big Data platforms
  • Participate in the full lifecycle of IT projects, including planning, analysis, and implementation
  • Define technical requirements and develop scripts/programmes using Bash, KSH, Python, Perl, and SQL
  • Design and implement disaster recovery solutions and conduct periodic testing
  • Optimise existing systems through architectural redesigns and software upgrades
  • Support transformation and technology refresh initiatives to enhance service availability and performance
  • Investigate and resolve performance incidents and client-reported software issues
  • Install and maintain operating systems and perform proactive maintenance using automated tools
  • Evaluate new technologies and conduct proof-of-concept exercises
What we offer
  • Hybrid way of working: 2 days per week/ 8 per month
  • Medical and dental services
  • Life and hospitalization insurance
  • Dedicated employee phone subscription
  • Take control of your benefits and choose any of the following options within the budget: meal tickets, private pension, vacation vouchers, or cultural vouchers
  • Special discounts for gyms and retailers
  • Annual Company Bonus
  • Ongoing Education – we continuously invest in you to ensure you have everything needed to excel on the job and enhance your skills
  • You get to work with tried and trusted web-technology
  • We let you write your own story by planning vacations: go for a trip, experience new things, have fun and enjoy your 23 days off
Employment Type: Full-time

Systems Integration Specialist Advisor

We are currently seeking a Systems Integration Specialist Advisor to join our te...
Location:
United States, Plano
Salary:
Not provided
NTT DATA
Expiration Date
Until further notice
Requirements
  • 10+ years of experience in Big Data / Data Engineering / DBA / Data Operations roles
  • Minimum 2+ years of hands-on experience with Apache Iceberg in production environments
  • 6+ years of experience working with Cloudera ecosystem (CDP Ecosystem)
  • Strong expertise in Iceberg table optimization (compaction, metadata management, partition evolution), multi-engine performance tuning (Spark, Hive, Impala), and troubleshooting complex data and query performance issues
  • Proven experience handling P1/P2 production incidents, large-scale environments (TB/PB scale), and data migration initiatives (Hive/Teradata → Iceberg)
Job Responsibility
  • Lead enforcement of data modeling and Lakehouse standards across applications
  • Guide teams on Medallion architecture implementation and balancing normalization vs. performance
  • Review and resolve complex data modeling and performance trade-offs
  • Ensure consistency of data structures across domains and workloads
  • Mentor and guide L2 resources in operational best practices and troubleshooting
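The Iceberg maintenance skills listed above centre on small-file compaction. As a conceptual sketch only (this is not Iceberg's actual API; Iceberg exposes compaction through maintenance procedures such as `rewrite_data_files`), the core idea of grouping small files into target-sized rewrite groups looks roughly like:

```python
# Conceptual sketch of compaction planning: group many small data
# files into rewrite groups of roughly a target size, which is the
# essence of table-format compaction. Sizes are in megabytes.

def plan_compaction(file_sizes_mb, target_mb=512):
    """Greedily pack file sizes into rewrite groups near target_mb each."""
    groups, current, current_size = [], [], 0
    for size in sorted(file_sizes_mb, reverse=True):
        if current and current_size + size > target_mb:
            groups.append(current)  # close the group before it overflows
            current, current_size = [], 0
        current.append(size)
        current_size += size
    if current:
        groups.append(current)
    return groups
```

A real engine also weighs partition boundaries, delete files, and metadata cost; this toy planner only shows why fewer, larger files reduce query overhead.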

Systems Integration Specialist Advisor

We are currently seeking a Systems Integration Specialist Advisor to join our te...
Location:
United States, Plano
Salary:
Not provided
NTT DATA
Expiration Date
Until further notice
Requirements
  • 4–6 years of experience in Big Data / Data Operations / DBA roles
  • Minimum 1+ year of experience with Apache Iceberg or similar table formats (Hive/Delta/Hudi)
  • 4+ years of experience with Cloudera ecosystem (CDP)
  • Hands-on experience with Iceberg table operations and maintenance, and with Spark SQL, Hive, or Impala
  • Experience in production support, incident handling, monitoring, troubleshooting, and operational support
  • Apply established data modeling and Lakehouse standards in day-to-day operations
  • Support table structuring and partition alignment with ingestion patterns
  • Assist in maintaining consistency of datasets across Bronze/Silver/Gold layers
Job Responsibility
  • Ensures data correctness and performance for downstream analytics and business-critical reporting
  • Enables successful modernization from legacy platforms to Iceberg
  • Maintains high availability and reliability of the enterprise data layer
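The Bronze/Silver/Gold layering mentioned in this listing follows the Medallion pattern: raw records land in Bronze, are cleaned into Silver, then aggregated into Gold for reporting. A minimal illustrative sketch (field names and rules are assumptions, not this employer's pipeline):

```python
# Toy Medallion-style flow: Bronze (raw) -> Silver (cleaned) -> Gold
# (aggregated). Real implementations run on Spark over table formats;
# plain dicts are used here purely to show the layering idea.

def to_silver(bronze_rows):
    """Clean raw rows: drop records missing an id, normalise the amount."""
    silver = []
    for row in bronze_rows:
        if not row.get("id"):
            continue  # quarantine/drop records that fail basic validation
        silver.append({"id": row["id"], "amount": float(row.get("amount", 0))})
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned rows into a per-id total for reporting."""
    totals = {}
    for row in silver_rows:
        totals[row["id"]] = totals.get(row["id"], 0.0) + row["amount"]
    return totals
```

Keeping each layer's contract explicit is what makes "consistency of datasets across layers" checkable in the first place.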
Employment Type: Full-time

Senior Data Scientist

Senior Data Scientist (AI/ML). This role has been designed as ‘Hybrid’ with an e...
Location:
India, Bangalore
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date
Until further notice
Requirements
  • 9+ years of experience in a Data science role
  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field
  • 5+ years experience building data pipelines for data science-driven solutions and deployed in Production environment
  • Experience working in technical support environment, working with dataset from CRM, H/W and S/W bugs data, machine logs
  • Experience supporting and working with multi-functional teams in a multidimensional fast paced environment
  • Good team worker with excellent interpersonal, written, verbal and presentation skills
  • Experience building and optimizing data pipelines, architectures and data sets
  • Experience performing root cause analysis on internal and external data and processes
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
Job Responsibility
  • Leads as well as develops scalable AI solutions using relevant AI (ML/DL/Gen AI) techniques
  • Architects large scale AI solutions that seamlessly merge AI model and techniques in SDLC
  • Organizes and leads comprehensive code and design review sessions, driving discussions to align with project requirements and best practices. Mentor and provide feedback to junior and mid-level team members
  • Conducts research and stays up to date with the latest advancements in AI and machine learning technologies, frameworks, and algorithms. Explore and experiment with cutting-edge techniques to solve complex problems and improve existing models
  • Collaborates with cross-functional teams to understand business requirements and design AI and machine learning solutions. Determine the appropriate algorithms, models, and frameworks to use and architect the overall system to ensure scalability, efficiency, and robustness
  • Develops, implements, and optimizes machine learning models and algorithms. This includes data pre-processing, feature engineering, model selection, hyperparameter tuning, and training on large datasets. Continuously monitor and improve model performance and accuracy
  • Leverage or build analytics tools that utilize the data pipeline to provide significant insights into customer case data, bug data, operational and other key business performance metrics
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
  • Work with data and analytics specialists to strive for greater functionality in our data systems
  • Identify trends, patterns from dataset to scope opportunities for automation
What we offer
  • Health & Wellbeing
  • Personal & Professional Development
  • Unconditional Inclusion


Infrastructure Specialist

Helping our customers and partners harness the power of the data, the Infrastruc...
Location:
Romania, Bucharest
Salary:
Not provided
btProvider
Expiration Date
Until further notice
Requirements
  • Hands-on experience, industry-agnostic, tool-agnostic
  • System architecture knowledge
  • OS and application installation and management knowledge
  • Willingness to learn
  • Customer-facing skills
  • Fluent English
Job Responsibility
  • Define the integration architecture on a project-by-project basis, observing common practices in the field
  • Define requirements and prerequisites for the architecture related to software components
  • Install server-side software components (Tableau Server, Vertica, Talend, Dremio, Mulesoft) and integrate them into the existing infrastructure
  • Document the system architecture
  • Define and implement proactive maintenance activities
  • Plan, perform, and validate software component upgrade activities
  • Perform troubleshooting and root cause identification at the software component and OS levels
  • Conduct performance reviews and tuning
  • Actively collaborate with concerned teams on architecture design and deployment
What we offer
  • Flexible working hours
  • Healthcare package
  • Knowledge sharing culture
  • Competitive salary
  • Hybrid working model (work from anywhere)

Enterprise Data Science Senior Specialist

Act as a senior individual contributor responsible for implementing enterprise c...
Location:
Egypt, Giza
Salary:
Not provided
Vodafone
Expiration Date
Until further notice
Requirements
  • Excellent hands‑on experience with machine learning and deep learning algorithms
  • Strong proficiency in advanced analytics, ML, deep learning, and Generative AI (GenAI)
  • Minimum 4–6 years of experience in data science, AI, and analytics roles
  • Proven experience in building and deploying AI/ML models in production environments
  • Good proficiency in data visualization tools (Power BI, Tableau) to communicate insights effectively
  • Solid knowledge of relational databases (e.g., MySQL) for efficient data storage and querying
  • Experience with big data analytics frameworks, including PySpark and Spark clusters
  • Strong skills in data mining, feature engineering, and time‑series forecasting
  • Good working knowledge of Linux operating systems
  • Bachelor’s or Master’s degree in Computer Science, Data Science, Artificial Intelligence, or a related field
Job Responsibility
  • Translate customer and product requirements into clear data science and AI implementation tasks
  • Develop, train, test, and deploy ML/DL/GenAI and data pipelines aligned with customer‑specific use cases
  • Ensure AI components are efficient, maintainable, and production‑ready, supporting smooth integration with enterprise systems and platforms
  • Contribute to documentation, reproducibility, and knowledge transfer for implemented solutions
  • Act as a subject‑matter contributor in machine learning, AI and GenAI engineering within delivery teams
  • Apply best practices in feature engineering, model selection, evaluation, and optimization
  • Support MLOps activities such as model versioning, monitoring, retraining, and performance tracking
  • Participate in technical reviews, code reviews, and solution walkthroughs to maintain delivery quality
  • Work closely with Enterprise teams to ensure AI solutions address real customer problems and deliver measurable outcomes
  • Explain data science methods, assumptions, and results in a clear and structured way to technical and non‑technical stakeholders
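As an illustration of the time-series forecasting skill this listing names (a naive baseline sketch, not the employer's models), the simplest forecaster predicts the next value as the mean of the most recent observations:

```python
# Moving-average baseline: forecast the next point as the mean of the
# last `window` observations. Useful as a sanity check that any real
# ML/DL forecaster should beat.

def moving_average_forecast(series, window=3):
    """Return the mean of the last `window` values as the next-step forecast."""
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window
```

In evaluation, such baselines anchor the "measurable outcomes" bullet: a model's error is only meaningful relative to them.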

OpenText Exstream Developer

This role is responsible for the development, installation, and maintenance of o...
Location:
India, Pune
Salary:
Not provided
Cencora
Expiration Date
Until further notice
Requirements
  • 4+ years of hands‑on experience in OpenText Exstream 23.x / 24.x / 25.x (CloudNative)
  • Strong experience with XML, Print Miner, columnar and delimited data inputs
  • Expertise in batch and real‑time application design
  • Skilled in creating automated and complex table structures
  • Experience working with barcodes and inserter configurations
  • Proficient in generating outputs: PS, PDF, AFP, Empower, multi‑channel delivery
  • Deep knowledge of rules, formula variables, control files, document/pages setup, design layers, language layers
  • Hands‑on experience with two‑pass application design
  • Knowledge of orchestration workflows
  • Proficient in sorting, bundling, and post‑processing (AFP/PDF)
Job Responsibility
  • Develop OpenText Exstream applications
  • Design, implement, unit test, document, and deploy applications/APIs
  • Develop database solutions using SSIS, T‑SQL, and stored procedures
  • Collaborate with business teams to define logical designs aligned with data architecture
  • Perform code reviews, analyze execution plans, and optimize/refactor code
  • Provide technical guidance to junior software engineers
  • Follow data standards, resolve data issues, perform unit testing, and document ETL processes
  • Assist managers with project documentation, progress tracking, and test plan creation
  • Work with business analysts and source system experts on data extraction & transformation requirements
  • Coordinate with IT operations and testing teams for timely, sustainable releases
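The sorting/bundling requirement in this listing refers to grouping composed documents for inserter processing in print-stream post-processing. A toy sketch of the idea (field names and bundle size are assumptions, not the Exstream workflow):

```python
# Illustrative post-processing step: sort composed documents by postal
# code, then split them into fixed-size bundles for the inserter.

def bundle_documents(docs, bundle_size=2):
    """Sort documents by postal code and chunk them into bundles."""
    ordered = sorted(docs, key=lambda d: d["postal_code"])
    return [ordered[i:i + bundle_size] for i in range(0, len(ordered), bundle_size)]
```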
Employment Type: Full-time