Senior Database Engineer

Remote

ABOUT HAPPY MONEY 


Happy Money® is building a happier and more equitable financial ecosystem that seamlessly blends psychology, technology, and a focus on happiness to help consumers go from borrower to saver, investor, and giver. The company provides a path toward improving financial well-being and securing greater happiness – beginning with paying off credit card debt – through its science-enabled, purpose-driven marketplace between mission-aligned capital and consumers.


Backed by leading investors including Anthemis Group, Tencent Holdings and CMFG Ventures, Happy Money has helped nearly 150,000 members pay off more than $2.7 billion in credit card debt since inception through its award-winning Payoff Loan™. Founded in 2009, Happy Money has a diverse employee base of over 350 people across the United States.


WHAT YOU'LL BE DOING

  • Apply industry standards and best practices to keep data integration solutions operational and running efficiently, with a focus on automation and continuous integration and continuous delivery (CI/CD).
  • Be responsible for service delivery, reliability, scalability, monitoring, and production support.
  • Use scripts and automation to manage data in Amazon Web Services (AWS) cloud storage with data technologies such as Databricks and Snowflake.
  • Automate the deployment and operation of data pipelines.
  • Administer and manage data using automation in relational databases and distributed computing systems.
  • Communicate and address build, deployment, and operational issues as they arise.
  • Optimize workflows, improve execution, and enhance existing monitoring and reliability metrics across our database and data platforms.
  • Solve day-to-day customer and production challenges.
  • Work closely with other teams to extract, transform, and load data from a wide variety of data sources, including in-house databases.
  • Participate in the design, implementation, and ongoing management of data and database infrastructure tooling and operations.
  • Design optimizations to meet the scalability, reliability, and performance needs of the organization.
  • Lead design and implementation efforts for data disaster recovery tasks for each application in collaboration with Engineering teams.
  • Serve as an on-call escalation contact for relevant database/data issues.
  • Work across multiple codebases and discuss technical requirements with engineers from the Application, Data, and DevOps teams.
  • Identify points of friction in extract, transform, load (ETL) processes and propose automation strategies.
  • Cultivate best practices in data processing across the engineering organization.
  • Design, develop, and maintain solutions to source, integrate, transform, cleanse, normalize, and present data to various consumer groups, such as other engineering teams, the Data Science team, and other end users.
  • Develop data insights as a cross-cutting concern across all products.
  • Help drive our business metrics and key performance indicators (KPIs) by stewarding insights and the use of data.
  • Become an expert on our cloud data platforms, including AWS, Salesforce, and Snowflake.

  • This position is 100% remote (US & Canada only).

ABOUT YOU

  • 5+ years of experience with relational and non-relational clustered databases (PostgreSQL, MySQL, Kafka, MongoDB).
  • 5+ years of relevant work experience with data warehouses, data lake operations, and production support.
  • 2+ years of experience building large data pipelines.
  • 2+ years of experience working with the Snowflake data platform.
  • Familiarity with Databricks.
  • Extensive experience working with data in the cloud, preferably using AWS.
  • Experience managing both SQL and NoSQL databases such as PostgreSQL, Aurora (RDS), and Amazon DocumentDB.
  • Experience with database migration and data replication technologies, particularly for PostgreSQL.
  • Hands-on experience building large, robust ETL pipelines.
  • Fluency in multiple programming languages; Python, Ruby, or Clojure are a big plus.
  • A systematic problem-solving approach, coupled with strong communication skills and a sense of ownership and drive.
  • Excellent cross-functional collaborative skills and ability to build relationships while working with multiple development teams. 
  • Ability to persuade development teams to adopt new cloud technologies, processes, and best practices.
  • A great attitude and a readiness to hustle.
  • Knowledge of data warehouse design is a plus (we use Data Vault 2.0).
  • Knowledge of Airflow is nice but not required.

Here at Happy Money, we live by our core values of Love, Trust, and Hustle and welcome all. Love is shown in how we develop meaningful relationships with everyone we interact with, whether it's a member or your manager. Trust is shown through how we empower each other to come to work as our true selves and embrace our differences. Hustle is shown through how we fail fast and learn from our mistakes. No one is perfect; we're all human. If this job description doesn't exactly match your background, we encourage you to apply anyway!
