
v4c.ai

Data Engineer

Posted 19 Days Ago
Remote
Hiring Remotely in United States
Junior
Build and maintain Databricks-based ETL/ELT pipelines, transform and model data, optimize workflows, monitor and troubleshoot pipelines, and collaborate with cross-functional teams.
Position Overview

V4C.ai is seeking a motivated Data Engineer to join our remote team in the United States. In this role, you will support the design, development, and maintenance of data solutions using Databricks, helping clients and internal teams process, transform, and analyze data effectively. You'll work on building reliable data pipelines and workflows in a collaborative environment, gaining hands-on experience with modern data engineering tools and cloud technologies.

Key Responsibilities
  • Collaborate with team members and stakeholders to understand data requirements and contribute to building scalable data pipelines and workflows in Databricks.
  • Develop and implement ETL/ELT processes using Databricks, Python, SQL, and related tools to ingest, transform, and prepare data.
  • Assist in optimizing data workflows for better performance, reliability, and cost-efficiency within Databricks environments.
  • Support the creation and maintenance of data models, tables, and integrations in cloud platforms (Azure, AWS, or similar).
  • Work closely with cross-functional teams (data analysts, scientists, and engineers) to deliver clean, accessible data for analytics and reporting.
  • Monitor data pipelines, troubleshoot basic issues, and contribute to documentation and best practices.
  • Stay curious about new Databricks features and data engineering trends to support ongoing improvements.
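To give candidates a concrete sense of the level of work, the responsibilities above amount to building small transform-and-aggregate pipelines like the following minimal sketch. It uses plain Python with the standard-library sqlite3 module standing in for a Databricks SQL environment; the table names, columns, and data are invented for illustration only:

```python
import sqlite3

# Toy ETL sketch: extract raw order events, transform (filter out bad rows,
# aggregate), and load a clean summary table for analysts.
# sqlite3 stands in for Databricks/Spark here; all names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "east", 10.0), (2, "east", 5.5), (3, "west", None), (4, "west", 7.0)],
)

# Transform: drop rows with missing amounts, then aggregate revenue per region.
conn.execute(
    """
    CREATE TABLE region_revenue AS
    SELECT region, SUM(amount) AS revenue, COUNT(*) AS orders
    FROM raw_orders
    WHERE amount IS NOT NULL
    GROUP BY region
    """
)

# Serve: downstream teams query the clean summary table.
rows = conn.execute(
    "SELECT region, revenue, orders FROM region_revenue ORDER BY region"
).fetchall()
print(rows)  # [('east', 15.5, 2), ('west', 7.0, 1)]
```

In a Databricks setting the same shape of job would typically be written with Spark DataFrames or Spark SQL over Delta tables, but the extract–transform–load structure is the same.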
Requirements
  • Bachelor's degree in Computer Science, Data Science, Engineering, Information Systems, or a related field (or equivalent practical experience).
  • 1-2 years of professional experience in data engineering, data processing, analytics engineering, or a closely related role (internships, co-ops, or academic projects with relevant tools count toward this).
  • Hands-on experience and comfort building basic data pipelines or transformations.
  • Proficiency in Python and SQL; experience with Scala is a plus but not required.
  • Basic understanding of cloud platforms such as Azure, AWS, or GCP (e.g., working with storage, compute, or data services).
  • Solid analytical and problem-solving skills with attention to detail and a focus on writing clean, maintainable code.
  • Strong communication skills and ability to work collaboratively in a remote team environment.
  • Eagerness to learn, take ownership of tasks, and grow within data engineering.

Top Skills

AWS
Azure
Databricks
ELT
ETL
GCP
Python
Scala
SQL
