Director of Engineering, Analytic Platforms
In this role you will lead DoubleVerify’s Analytics department, responsible for providing insights and data to our clients and internal users. Our next challenge is to create the next-generation analytics platform while extending, supporting, and operating the existing one. The new platform will ingest billions of records per day and make them available for consumption through an intuitive, highly performant UI. Additionally, it will allow programmatic access to data while applying a tight Data Governance strategy.
- Overall responsibility for the Analytics group and teams that build DoubleVerify’s Reporting and Analytics solutions.
- Design, development and commercialization of next generation analytics platform that serves hundreds of Fortune 500 customers as well as thousands of media partners.
- Attract, interview, hire and onboard talented engineers and team leaders.
- Mentor, coach and manage a group of engineers - help them shine and grow.
- Oversee the creation of a new big data online analytics platform, including technology selection, implementation of Authentication/Authorization, and data access layers operating against the newly established data store
- Define and advocate for technical standards of quality, monitoring, security, modifiability, extensibility and maintainability for data processing software
- Continuously explore the technical landscape of the Big Data ecosystem through analysis of various frameworks and technologies
- Continuously optimize the group’s SDLC processes, including by implementing CI/CD
- Provide technical guidance and coaching to engineers through brainstorming sessions, design reviews and pair-programming
- 10+ years of experience in building and operating mission critical data intensive and distributed systems
- 5+ years of experience in leadership roles
- Must have 5+ years of hands-on experience with Scala/Java/.NET Core/Python development
- Ability to architect and design complex software systems while adhering to fundamental principles and best practices
- In-depth understanding of and hands-on experience with distributed columnar data stores (BigQuery, Snowflake, Redshift, Vertica, etc.)
- Fluent in SQL and data analysis
- Experience with following and advocating state of the art SDLC processes
- BS/MS degree in Computer Science or other related field