Software Engineer

Hewlett Packard Enterprise (https://www.hpe.com/)

Location:
Bangalore, India

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

As a member of the Decision Support Analytics (DSA) team, you will collaborate with cross-functional teams to design, build, and manage scalable data pipelines, data warehouses, and machine learning (ML) models. Your work will involve analyzing and visualizing data, publishing dashboards or data models, and contributing to the development of web services for Engineering Technologies portals and applications. This role requires strong coding abilities, presentation skills, and expertise in big data infrastructure.

Job Responsibility:

  • Collaborate with cross-functional teams to design, build, and manage scalable data pipelines, data warehouses, and machine learning (ML) models
  • Analyze and visualize data, publish dashboards or data models, and contribute to the development of web services for Engineering Technologies portals and applications
  • Collaborate with internal stakeholders to gather requirements and understand business workflows
  • Develop scalable data pipelines and ensure high-quality data flow and integrity
  • Use advanced coding skills in languages such as SQL, Python, Java, or Scala to address business needs
  • Leverage statistical methods to analyze data, generate actionable insights, and produce business reports
  • Design meaningful visualizations using tools like Tableau, Power BI, or similar platforms for effective communication with stakeholders
  • Implement or upgrade data analysis tools and assist in strategic decisions regarding new systems
  • Build frameworks and automation tools to streamline data consumption and understanding
  • Train end-users on new dashboards, reports, or tools
  • Provide hands-on support for internal customers across various teams
  • Ensure compliance with data governance policies and security standards

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • Proven track record of working with large datasets in fast-paced environments
  • Strong problem-solving skills with the ability to adapt to evolving technologies
  • Typically 8+ years of experience
  • Data Engineering Tools & Frameworks: ETL tools such as WhereScape, Apache Airflow, or Azure Data Factory
  • Big Data technologies like Hadoop, Apache Spark, or Kafka
  • Cloud Platforms: Proficiency in cloud services such as AWS, Azure, or Google Cloud Platform for storage, computing, and analytics
  • Databases: Experience with both relational (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra)
  • Data Modeling & Architecture: Expertise in designing schemas for analytical use cases and optimizing storage mechanisms
  • Machine Learning & Automation: Familiarity with ML frameworks (e.g., TensorFlow, PyTorch) for building predictive models
  • Scripting & Automation: Advanced scripting in Python, Scala, or Java to automate workflows
  • APIs & Web Services: Building RESTful APIs for seamless integration with internal/external systems

Nice to have:

  • Cloud Architectures
  • Cross Domain Knowledge
  • Design Thinking
  • Development Fundamentals
  • DevOps
  • Distributed Computing
  • Microservices Fluency
  • Full Stack Development
  • Security-First Mindset
  • Solutions Design
  • Testing & Automation
  • User Experience (UX)

What we offer:

  • Health & Wellbeing
  • Personal & Professional Development
  • Unconditional Inclusion

Additional Information:

Job Posted:
July 29, 2025

Employment Type:
Full-time

Work Type:
Hybrid work