
Tidal Financial Group

Data Engineer

Reposted 18 Hours Ago
Remote
Hiring Remotely in United States
Senior level
Data Engineer responsible for building, scaling, and optimizing data pipelines that power Tidal's ETF platform, ensuring high-performance data delivery and close collaboration with cross-functional teams.

The Tidal Financial Group is a leading ETF investment technology platform dedicated to creating, operating, and growing ETFs. We combine expertise and innovative partnership approaches to offer comprehensive, value-generating ETF solutions.

Our platform offers best-in-class strategic guidance, product planning, trust and fund services, legal support, operations support, marketing and research, and sales and distribution services.



About the Role:

We are seeking a Data Engineer to build, maintain, and optimize the data ecosystem that powers Tidal’s ETF platform. This role sits at the intersection of data engineering, analytics, and infrastructure—responsible for developing robust, cloud-native data pipelines and ensuring reliable data delivery across business and operational systems.

You will contribute to technical design, implement data engineering best practices, and collaborate with cross-functional partners to ensure that our data infrastructure supports accurate, timely, and compliant decision-making across the organization.


What you'll do

1. Data Pipeline Development & Maintenance

  • Build, maintain, and optimize scalable data pipelines in AWS (or equivalent), leveraging technologies such as Snowflake, dbt, and modern orchestration frameworks (Airflow, Prefect, etc.).
  • Integrate complex financial data sources—Bloomberg APIs, custodial feeds, and fund administration data—into reliable, auditable data flows.
  • Contribute to the development of a modular, well-documented data ecosystem supporting analytics, reporting, and operations.
  • Participate in code reviews and ETL design discussions to ensure code quality, performance, and maintainability.
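A pipeline of the kind described above typically breaks down into extract, transform, and load steps. The following is a minimal illustrative sketch in plain Python, with an in-memory SQLite table standing in for the warehouse; the feed format, table name, and column names are hypothetical, and a production version would run inside an orchestrator such as Airflow or Prefect and load into Snowflake.

```python
import sqlite3

def extract(raw_feed):
    # Parse a raw custodial-feed payload (hypothetical CSV-like format) into rows.
    return [line.split(",") for line in raw_feed.strip().splitlines()]

def transform(rows):
    # Normalize tickers and cast share counts to integers.
    return [(ticker.strip().upper(), int(shares)) for ticker, shares in rows]

def load(conn, records):
    # Idempotent load: upsert the latest snapshot instead of blindly appending,
    # so a re-run of the pipeline does not duplicate rows.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS holdings (ticker TEXT PRIMARY KEY, shares INTEGER)"
    )
    conn.executemany(
        "INSERT INTO holdings (ticker, shares) VALUES (?, ?) "
        "ON CONFLICT(ticker) DO UPDATE SET shares = excluded.shares",
        records,
    )
    conn.commit()

raw = "spy, 1200\nqqq, 800"
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
print(conn.execute("SELECT ticker, shares FROM holdings ORDER BY ticker").fetchall())
```

The upsert-based load is what makes the step safe to retry, a common requirement when an orchestrator re-runs a failed task.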

2. Data Quality & Reliability

  • Implement and maintain automated data validation and testing (e.g., Great Expectations or similar).
  • Monitor data pipeline performance and troubleshoot issues to ensure reliability and accuracy.
  • Adhere to established standards for data lineage, versioning, and documentation within the engineering environment.
  • Work with the team to ensure that all financial data meets Tidal’s quality and regulatory expectations.
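The validation checks mentioned above amount to declaring rules that every batch of records must satisfy. Here is a hand-rolled sketch of such checks in plain Python; the record shape, thresholds, and check names are illustrative rather than Tidal's actual rules, and a framework like Great Expectations would express these declaratively as an expectation suite.

```python
def validate_nav_rows(rows):
    """Run basic quality checks on fund NAV records and return the failures.

    Each row is (fund_ticker, nav, as_of_date); all rules are illustrative.
    """
    failures = []
    seen = set()
    for ticker, nav, as_of in rows:
        if not ticker:
            failures.append(("missing_ticker", (ticker, nav, as_of)))
        if nav is None or nav <= 0:
            failures.append(("non_positive_nav", (ticker, nav, as_of)))
        if (ticker, as_of) in seen:
            failures.append(("duplicate_key", (ticker, nav, as_of)))
        seen.add((ticker, as_of))
    return failures

rows = [
    ("TIDL", 25.31, "2024-06-03"),
    ("TIDL", 25.31, "2024-06-03"),  # duplicate ticker/date key
    ("XYZF", -1.00, "2024-06-03"),  # non-positive NAV
]
failures = validate_nav_rows(rows)
print([kind for kind, _ in failures])  # ['duplicate_key', 'non_positive_nav']
```

In a monitored pipeline, a non-empty failure list would typically fail the run or page the on-call engineer rather than letting bad records flow downstream.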

3. Collaboration & Business Support

  • Work with sales, operations, and trading teams to understand data requirements and implement infrastructure that delivers actionable insights.
  • Collaborate with analytics teams to support model optimization, dataset creation, and query performance in BI tools such as Sigma.
  • Assist in automating workflows to support data-driven decision-making across the ETF product lifecycle.

4. Engineering Best Practices

  • Write clean, efficient, and well-documented code, following best practices in data modeling and system design.
  • Contribute to the team’s knowledge base by maintaining documentation on pipelines, design standards, and deployment processes.
  • Participate in sprint planning and stand-ups to ensure timely delivery of features and fixes.
  • Stay curious and keep up to date with emerging technologies and patterns in data engineering and cloud architecture.

Qualifications

Education & Experience

  • Bachelor’s degree in Computer Science, Data Engineering, or a related field, or equivalent technical experience.
  • 3–5 years of professional experience in data engineering or backend software development.
  • Experience building and maintaining cloud-based data architectures, preferably in financial or fintech environments.

Technical Skills

  • Proficient in Python and SQL for data transformation, pipeline orchestration, and automation.
  • Strong working knowledge of data warehousing (Snowflake, Redshift, or equivalent) and cloud platforms (AWS preferred).
  • Hands-on experience with modern orchestration and transformation tools (Airflow, dbt, Prefect).
  • Familiarity with APIs, event-driven pipelines (Kafka or Kinesis), and ETL frameworks.
  • Solid understanding of version control (Git/GitHub), CI/CD, and Agile methodologies.
  • Familiarity with financial datasets (ETFs, trading, market data) is a plus.

Soft Skills

  • Strong communication and collaboration skills with technical and non-technical teams.
  • Problem-solving mindset with the ability to troubleshoot complex data issues.
  • Eager to learn and contribute to a culture of engineering excellence.
  • Detail-oriented and committed to data reliability and quality.

Top Skills

Airflow
AWS
dbt
Git
Kafka
Kinesis
Prefect
Python
Redshift
Snowflake
SQL
HQ

Tidal Financial Group New York, New York, USA Office

New York, NY, United States


