Hewlett Packard Enterprise

Data Engineer

Hybrid
Tel Aviv
Mid level
The Data Engineer will build and optimize data pipelines, develop reliable data models, integrate new data sources, and ensure data quality while collaborating with analysts and data scientists.

This role has been designed as ‘Hybrid’ with an expectation that you will work on average 2 days per week from an HPE office.

Who We Are:

Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

Job Description: Data Engineer (Mid Level)

Employment Type: Full-time


About the Role
We are looking for a talented Data Engineer to help build and enhance the data platform that supports analytics, operations, and data-driven decision-making across the organization. You will work hands-on to develop scalable data pipelines, improve data models, ensure data quality, and contribute to the continuous evolution of our modern data ecosystem.
 

You’ll collaborate closely with Senior Engineers, Analysts, Data Scientists, and stakeholders across the business to deliver reliable, well-structured, and well-governed data solutions.
 

What You’ll Do

Engineering & Delivery

  • Build, maintain, and optimize data pipelines for batch and streaming workloads.

  • Develop reliable data models and transformations to support analytics, reporting, and operational use cases.

  • Integrate new data sources, APIs, and event streams into the platform.

  • Implement data quality checks, testing, documentation, and monitoring.

  • Write clean, performant SQL and Python code.

  • Contribute to improving performance, scalability, and cost-efficiency across the data platform.
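
As a purely illustrative example of the pipeline and data-quality work described in the bullets above, the sketch below shows a small batch validation step in Python. It is not HPE's code: the pandas-based approach, the hypothetical orders table, and its column names are all assumptions.

```python
# A minimal sketch of a batch data quality check, assuming a pandas-based
# pipeline step. The "orders" table and its columns are hypothetical
# placeholders, not details from this posting.
import pandas as pd

def check_orders_batch(df: pd.DataFrame) -> pd.DataFrame:
    """Validate a hypothetical orders batch before loading it downstream."""
    problems = []
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id values")
    if df["amount"].lt(0).any():
        problems.append("negative amounts")
    if df["order_ts"].isna().any():
        problems.append("missing order timestamps")
    if problems:
        # Fail fast so the orchestrator can alert and retry.
        raise ValueError(f"Data quality check failed: {', '.join(problems)}")
    return df

if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "order_id": [1, 2],
            "amount": [10.5, 3.2],
            "order_ts": pd.to_datetime(["2024-01-01", "2024-01-02"]),
        }
    )
    check_orders_batch(sample)  # passes silently on clean data
```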

Collaboration & Teamwork

  • Work closely with senior engineers to implement architectural patterns and best practices.

  • Collaborate with analysts and data scientists to translate requirements into technical solutions.

  • Participate in code reviews, design discussions, and continuous improvement initiatives.

  • Help maintain clear documentation of data flows, models, and processes.

Platform & Process

  • Support the adoption and roll-out of new data tools, standards, and workflows.

  • Contribute to DataOps processes such as CI/CD, testing, and automation.

  • Assist in monitoring pipeline health and resolving data-related issues.

What We’re Looking For
  • 2–5+ years of experience as a Data Engineer or in a similar role.

  • Hands-on experience with Snowflake (mandatory), including SQL, data modeling, and basic optimization.

  • Experience with dbt (or similar): model development, tests, documentation, and version control workflows.

  • Strong SQL skills for data modeling and analysis.

  • Proficiency with Python for pipeline development and automation.

  • Experience working with orchestration tools (Airflow, Dagster, Prefect, or equivalent).

  • Understanding of ETL/ELT design patterns, data lifecycle, and data modeling best practices.

  • Familiarity with cloud environments (AWS, GCP, or Azure).

  • Knowledge of data quality, observability, or monitoring concepts.

  • Good communication skills and the ability to collaborate with cross-functional teams.
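
To give a hedged sense of the orchestration, dbt, and Snowflake experience listed above, here is a minimal sketch of a daily ELT DAG. It assumes Apache Airflow 2.4+ (for the schedule argument) and the TaskFlow API; the DAG id, file path, staging table, and the dbt trigger are hypothetical placeholders, not details from this posting.

```python
# A minimal sketch of a daily extract-load-transform DAG, assuming
# Apache Airflow 2.4+ with the TaskFlow API. All names are made up.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def daily_elt_example():
    @task
    def extract() -> str:
        # In a real pipeline this would pull from an API or event stream.
        return "raw/orders/2024-01-01.parquet"

    @task
    def load(path: str) -> str:
        # e.g. COPY the extracted file into a Snowflake staging table.
        print(f"loading {path}")
        return "staging.orders"

    @task
    def transform(table: str) -> None:
        # e.g. trigger a dbt run to build downstream models from staging.
        print(f"transforming {table}")

    # Chaining the calls defines the task dependencies: extract -> load -> transform.
    transform(load(extract()))

daily_elt_example()
```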
     

Nice to Have
  • Exposure to streaming/event technologies (Kafka, Kinesis, Pub/Sub).

  • Experience with data governance or cataloging tools.

  • Basic understanding of ML workflows or MLOps concepts.

  • Experience with infrastructure-as-code tools (Terraform, CloudFormation).

  • Familiarity with testing frameworks or data validation tools.

Additional Skills:

Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Security-First Mindset, User Experience (UX)

What We Can Offer You:

Health & Wellbeing

We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.

Personal & Professional Development

We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division.

Unconditional Inclusion

We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected:

Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE.

Job:

Engineering

Job Level:

TCP_03

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity.

Hewlett Packard Enterprise is an EEO / Protected Veteran / Individual with Disabilities employer.

HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.

   

No Fees Notice & Recruitment Fraud Disclaimer

 

It has come to HPE’s attention that there has been an increase in recruitment fraud whereby scammers impersonate HPE or HPE-authorized recruiting agencies and offer fake employment opportunities to candidates. These scammers often seek to obtain personal information or money from candidates.

 

Please note that Hewlett Packard Enterprise (HPE), its direct and indirect subsidiaries and affiliated companies, and its authorized recruitment agencies/vendors will never charge any candidate a registration fee, hiring fee, or any other fee in connection with its recruitment and hiring process.  The credentials of any hiring agency that claims to be working with HPE for recruitment of talent should be verified by candidates and candidates shall be solely responsible to conduct such verification. Any candidate/individual who relies on the erroneous representations made by fraudulent employment agencies does so at their own risk, and HPE disclaims liability for any damages or claims that may result from any such communication.

Top Skills

Airflow
AWS
Azure
Dagster
dbt
GCP
Prefect
Python
Snowflake
SQL
