
MeridianLink

Sr. Data Engineer

Reposted 7 Days Ago
Remote
Hiring Remotely in US
117K-150K Annually
Senior level

The Senior Data Engineer will design, build, implement, and maintain data processing pipelines for the extraction, transformation, and loading (ETL) of data from a variety of data sources. The role will develop robust and scalable solutions that transform data into a useful format for analysis, enhance data flow, and enable end-users to consume and analyze data faster and easier. Additionally, the Senior Data Engineer will serve as the principal person responsible for Informatica data ingress (import) and egress (export) and monitoring to ensure data is accurately and securely transferred in and out of systems.

Expected Duties:

  • Senior Data Engineers will design, build, implement, and maintain data processing pipelines for the extraction, transformation, and loading (ETL) of data from a variety of data sources

  • Create and monitor Informatica data ingress (import) and egress (export) processes, acting as the principal point of contact

  • Expected to lead the writing of complex SQL queries to support analytics needs

  • Responsible for developing technical tools and programs that leverage artificial intelligence, machine learning, and big-data techniques to cleanse, organize, and transform data, and to maintain, defend, and update data structures and integrity on an automated basis

  • Senior Data Engineers will evaluate and recommend tools and technologies for data infrastructure and processing. Collaborate with engineers, data scientists, data analysts, product teams, and other stakeholders to translate business requirements to technical specifications and coded data pipelines

  • The role will work with tools, languages, data processing frameworks, and databases such as R, Python, SQL, Databricks, Informatica, Sigma, Spark, Delta, and APIs. Work with structured and unstructured data from a variety of data stores, such as data lakes, relational database management systems, and/or data warehouses
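The duties above center on the extract-transform-load pattern. The sketch below is a minimal, illustrative version of that pattern using only Python's standard library, with SQLite standing in for the target warehouse; the feed, table, and column names are invented for this example and are not from the posting.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed standing in for an external data source.
RAW_CSV = """loan_id,amount,status
1001,25000,approved
1002,not_a_number,pending
1003,14000,approved
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse rows from the raw CSV feed."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: coerce types and drop rows that fail validation."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["loan_id"]), float(row["amount"]), row["status"]))
        except ValueError:
            continue  # a production pipeline would quarantine bad records instead
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write validated rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS loans (loan_id INTEGER, amount REAL, status TEXT)"
    )
    conn.executemany("INSERT INTO loans VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM loans").fetchone()
```

In practice the same three stages would run on Spark or Databricks against a data lake, but the structure, and the need to validate before loading, is the same.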

Qualifications: Knowledge, Skills, and Abilities:

The role will include work on problems of diverse scope where analysis of information requires evaluation of identifiable factors. Work is expected to be performed independently, using sound judgment.

  • Ability to assess unusual circumstances and use sophisticated analytical and problem-solving techniques to identify the cause

  • Experience working with enterprise-facing people and processes is a plus; using data to improve processes and reduce costs across the enterprise is preferred

  • Ability to build relationships and networks with senior internal partners who are not familiar with the subject matter, which often requires persuasion

  • Architect and scale our modern data platform to support real-time and batch processing for financial forecasting, risk analytics, and customer insights

  • Enforce high standards for data governance, quality, lineage, and compliance

  • Partner with stakeholders across engineering, finance, sales, and compliance to translate business requirements into reliable data models and workflows

  • Evaluate emerging technologies and lead POCs that shape the future of our data stack

  • Champion a culture of security, automation, and continuous delivery in all data workflows
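One of the responsibilities listed above is enforcing standards for data governance and quality. A common building block for that is a schema contract checked against incoming records; the sketch below is a minimal illustration in plain Python, and the dataset, field names, and thresholds are hypothetical, not from the posting.

```python
# Hypothetical schema contract for a "customers" dataset; names are illustrative.
SCHEMA = {"customer_id": int, "email": str, "balance": float}

def quality_report(records: list[dict], schema: dict) -> dict:
    """Count records that violate the contract: missing fields or wrong types."""
    missing = type_errors = 0
    for rec in records:
        for field, expected in schema.items():
            if field not in rec:
                missing += 1
            elif not isinstance(rec[field], expected):
                type_errors += 1
    return {"checked": len(records), "missing_fields": missing, "type_errors": type_errors}

records = [
    {"customer_id": 1, "email": "a@example.com", "balance": 10.0},
    {"customer_id": "2", "email": "b@example.com", "balance": 5.5},  # wrong type
    {"customer_id": 3, "balance": 0.0},                              # missing email
]
report = quality_report(records, SCHEMA)
```

Dedicated tools (e.g., dbt tests or Databricks expectations) express the same idea declaratively; the value is failing fast before bad data reaches downstream consumers.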

Technical Qualifications:

  • Deep expertise in Python, SQL, Informatica, and Sigma, plus distributed processing frameworks and cloud data platforms such as Apache Spark, Databricks, Snowflake, Redshift, and BigQuery.

  • Proven experience with cloud-based data platforms (preferably AWS or Azure).

  • Hands-on experience with data orchestration tools (e.g., Airflow, dbt) and data warehouses (e.g., Databricks, Snowflake, Redshift, BigQuery).

  • Strong understanding of data security, privacy, and compliance within a financial services context.

  • Experience working with structured and semi-structured data (e.g., Delta, JSON, Parquet, Avro) at scale.

  • Familiarity with modeling datasets in Salesforce, NetSuite, and Anaplan to solve business use cases is required.

  • Previous experience democratizing data at scale for the enterprise is a strong plus.
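The qualifications above call for working with semi-structured data (JSON, Parquet, Avro) at scale. A recurring task there is flattening nested records into relational columns; the sketch below shows the idea with Python's standard-library `json` module, and the record shape and field names are invented for illustration.

```python
import json

# Hypothetical nested JSON record, e.g. one event pulled from a data lake.
RAW = '{"user": {"id": 7, "profile": {"tier": "gold"}}, "events": 3}'

def flatten(obj: dict, prefix: str = "") -> dict:
    """Flatten nested dicts into dotted column names suitable for a warehouse table."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

row = flatten(json.loads(RAW))
# row maps dotted paths (e.g. "user.profile.tier") to scalar values,
# ready to insert into a relational table
```

Columnar formats like Parquet and Delta handle nesting natively, but the flattening step is still typical when landing semi-structured feeds into SQL-facing models.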

Educational Qualifications and Work Experience:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

  • 6-8 years of experience in data engineering, with a strong focus on financial systems on SaaS platforms.

Top Skills

Airflow
Spark
Avro
AWS
Azure
BigQuery
Databricks
dbt
Delta
JSON
Parquet
Python
Redshift
Snowflake
SQL


