Data Engineer at Policygenius
Policygenius continues to disrupt the insurance industry by delivering innovative, technology-driven experiences. We are advancing our tech capabilities and learning to leverage our wealth of data to build machine learning applications. We are relentless in our drive to reliably deliver outstanding products at scale. We are growing fast, but we can go further, faster, with experienced, collaborative, challenge-seeking data engineers like you.
Our data engineering team builds, tests, and deploys data/ML pipelines to drive automation and insights. We partner closely with product, design, engineering, data, and numerous other stakeholders across the company to ensure data reaches users with the utmost quality and availability.
In this role, you will…
- Design data pipelines for data ingestion and distribution via streams and/or batches.
- Be responsible for designing, documenting, and maintaining a scalable data model for your product.
- Deploy machine learning pipelines into production, leveraging modern MLOps practices.
- Work directly in a cross-functional product team with product managers, designers, software engineers, data scientists, and analysts.
- Ensure the reliability of data pipelines and enforce data governance, security, and protection of our customers' information.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
We’d love to hear from you if…
- You have 2+ years of experience as a software engineer or data engineer coding in Python and SQL.
- You have a passion for data pipelining, modeling, and architecture.
- You are familiar with machine learning concepts and/or MLOps practices.
- You communicate clearly, articulating ideas to both technical team members and non-technical stakeholders.
- You understand that life is not all ML/AI; simple pipelining often delivers significant business value.
- You are obsessed with reducing lag, building scalable systems, optimizing performance, automating things and solving complex problems!
- You have a background in computer science or a related field.
- You have a drive to learn and master new technologies and techniques.
- You have experience with relational databases like Postgres and are comfortable working with NoSQL databases and file storage systems.