Stash is a digital-first financial services company committed to making saving and investing accessible to everyone. By breaking down barriers and building transparent, technology-driven products, we help the 99% build smarter financial habits so they can confidently save more, grow wealth, and enjoy life.
At Stash, data is at the core of how we make decisions and build great products for millions of users. As a Data Engineer, you will join our Data Platform Team, which leads the architectural design and implementation of a modern data infrastructure at scale. You will build distributed services and large-scale processing systems that help teams across the company work faster and smarter. You will also partner with Data Science to productionize machine learning models and algorithms into data-driven products for our users.
Tools and technologies in our tech stack (evolving):
- Hadoop, YARN, Spark, MongoDB, Hive
- AWS EMR, EC2, Lambda, Kinesis, S3, Glue, DynamoDB, API Gateway, Redshift
- Elasticsearch, Airflow, Terraform
- Scala, Python
What you'll do:
- Build core components of the data platform that serve a range of consumers, including Data Science, Engineering, Product, and QA
- Build data ingestion and transformation jobs as they are needed
- Productionize our machine learning models and algorithms into data-driven feature MVPs that scale
- Leverage best practices in continuous integration and deployment to our cloud-based infrastructure
- Build scalable data services to bridge the gap between analytics and application space
- Optimize data access and consumption for our business and product colleagues
- Develop an understanding of key product, user, and business questions
What we're looking for:
- 2+ years of professional experience in data engineering
- BS / MS in Computer Science, Engineering, Mathematics, or a related field
- You have built large-scale data products and understand the tradeoffs made when building these features
- You have a deep understanding of system design, data structures, and algorithms
- Experience with (or a strong interest in) Python or Scala
- Experience working with a cluster manager (YARN, Mesos, or Kubernetes)
- Experience with distributed computing frameworks such as Spark, Hadoop, or MapReduce
- Experience working on a cloud platform such as AWS
- Experience building ETL pipelines
- Experience working with Apache Airflow
- Experience working with AWS Glue
- Experience in Machine Learning and Information Retrieval
We believe that diversity and inclusion are essential to living our values, promoting innovation, and building the best products out there. Our success is directly related to the employees we hire, grow, and retain, and we believe our team should reflect the diversity of the customers we serve.
As an Equal Opportunity Employer, Stash is committed to building an inclusive environment for people of all backgrounds. We do not discriminate on the basis of race, color, gender, sexual orientation, gender identity or expression, religion, disability, national origin, protected veteran status, age, or any other status protected by law. Everyone is encouraged to apply.
Benefits & Perks:
- Equity in Stash
- Flexible Vacation
- Family-Friendly Medical, Dental, and Vision Insurance Plans
- Learning & Development Stipend
- Commuter Benefits and Flexible Spending Account (FSA)
- Employee referral bonuses
- Stocked fridges & kitchens and catered lunch on Fridays
- Thursday happy hours
- Team outings that do not involve trust falls...
Awards & Recognition:
- Forbes Fintech 50 (2019)
- LendIt Fintech Innovator of the Year (2019)
- Built In NYC's Best Places to Work (2019)
- Built In NYC's Startups to Watch (2018)
- Wall Street Journal's "Top 25 Tech Companies To Watch" (2018)
- MarCom Awards Double Gold & Platinum Winner (2018)
- Webby Award Winner for Best Mobile Sites & Apps in the Financial Services and Banking category (2017)
- W3 Awards Winner for Best User Experience (2017)
No recruiters, please.