MOHELA

Data Engineer

Reposted 9 Days Ago
Remote
Hiring Remotely in USA
Mid level
Job Summary & Responsibilities

POSITION OVERVIEW:

We are seeking a technically skilled and experienced Data Engineer to support and enhance our Enterprise Data Warehouse. The role focuses on modernizing ETL processes within an on-premises Cloudera Data Platform (CDP) environment, leveraging technologies such as Apache Spark, Apache Iceberg, and Apache Airflow for scalable, efficient, and reliable data transformation and management. The ideal candidate will have strong ETL development and troubleshooting skills, along with experience working in production support environments.

 


ESSENTIAL JOB FUNCTIONS:

  1. Development
    • Contribute to the development of ETL pipelines in the Enterprise Data Warehouse (EDW).
    • Rebuild legacy ETL jobs, which currently lack ACID transaction support, with modern solutions built on Apache Spark and Apache Iceberg.
    • Transform and integrate EBCDIC mainframe data into Hive and Impala tables using Precisely Connect for Big Data.
    • Optimize data transformation processes for performance, scalability, and reliability.
    • Ensure data consistency, accuracy, and quality across ETL pipelines.
    • Utilize best practices for ETL code development, version control, and deployment using Azure DevOps.
  2. Production Support
    • Share weekly 24/7 production support with a managed service vendor on a four-week rotation.
    • Monitor ETL workflows and troubleshoot issues to ensure smooth production operations.
    • Research and resolve user requests and issues.
  3. Collaboration and Stakeholder Engagement
    • Collaborate with cross-functional teams, including data engineers, business analysts, administrators, and quality assurance engineers, to ensure alignment on requirements and deliverables.
    • Engage with business stakeholders to understand data requirements and translate them into scalable technical solutions.
  4. Technical Governance
    • Contribute to process documentation and follow best practices within the Enterprise Data Warehouse.
    • Follow proper SDLC protocols within the Azure DevOps code repository.
    • Stay current on emerging technologies and trends to continuously improve data platform capabilities.
    • Perform other tasks as assigned by management.
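The mainframe-integration duty above (moving EBCDIC data into Hive and Impala via Precisely Connect) rests on two classic conversions: EBCDIC-to-Unicode character decoding and COBOL packed-decimal (COMP-3) numeric unpacking. As a minimal, hypothetical Python sketch of what such tooling automates, using only the standard library's built-in `cp037` EBCDIC codec (the byte values and record layout here are invented for illustration, not MOHELA's actual record formats):

```python
def decode_ebcdic_text(raw: bytes) -> str:
    """Decode an EBCDIC (code page 037) byte string to Unicode text.
    Python ships the cp037 codec in the standard library."""
    return raw.decode("cp037")


def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode a COBOL packed-decimal (COMP-3) field.

    Each byte holds two decimal digits, one per nibble; the final
    nibble is the sign (0xD = negative, 0xC or 0xF = positive).
    `scale` is the number of implied decimal places.
    """
    nibbles = "".join(f"{b >> 4:X}{b & 0x0F:X}" for b in raw)
    value = int(nibbles[:-1])      # all digits except the trailing sign nibble
    if nibbles[-1] == "D":         # 0xD marks a negative value
        value = -value
    return value / (10 ** scale)


# "HELLO" encoded in EBCDIC code page 037
print(decode_ebcdic_text(b"\xc8\xc5\xd3\xd3\xd6"))    # HELLO
# COMP-3 bytes 0x12 0x34 0x5C with two implied decimal places
print(unpack_comp3(b"\x12\x34\x5c", scale=2))         # 123.45
```

In practice, Precisely Connect reads the COBOL copybook and performs these conversions at scale before the data lands in Hive or Impala; this sketch is only meant to make the bullet concrete.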

 

MINIMUM REQUIREMENTS:

  • Bachelor’s degree in IT or similar field. (Additional equivalent experience above the required minimum may be substituted for the degree requirement.)
  • 3+ years of experience in ETL development and data engineering roles.
  • 3+ years of advanced SQL experience.
  • 3+ years of experience with Python and Linux for Spark-based development.
  • Proven experience using Apache Spark, Apache Iceberg, or Apache Airflow for ETL pipelines.
  • Strong familiarity with version control systems, especially Azure DevOps.
  • Knowledge of data governance and security best practices in a distributed data environment.
  • Familiarity with data modeling, schema design, and building data models for reporting needs.
  • In-depth understanding of ETL frameworks, ACID transactions, change data capture, and distributed computing.
  • Experience designing and managing large-scale data pipelines and workflows.
  • Excellent problem-solving and troubleshooting skills.
  • Effective communication and collaboration skills for working with diverse teams and stakeholders.
  • Timeline-centric mindset.
  • Awareness of enterprise applications and technical alignment standards.
  • This position requires (6C) personnel security screening in accordance with the U.S. Department of Education’s (ED) policy regarding the personnel security screening requirements for all contractor and subcontractor employees. A qualified applicant must successfully submit for personnel security screening within 14 calendar days from employment offer. Some travel may be required for PIV support.

 

PREFERRED QUALIFICATIONS:

  • Experience with Cloudera Data Platform (CDP), including Hive and Impala
  • Knowledge of Precisely Connect for Big Data or similar tools for mainframe data transformation

Top Skills

Apache Airflow
Apache Iceberg
Spark
Azure DevOps
Linux
Python
SQL


