DV is the leader in digital performance solutions, improving the impression quality and audience impact of digital advertising. Built on best practices, DV solutions create value for media buyers and sellers by bringing transparency and accountability to the market, ensuring ad viewability, brand safety, fraud protection, accurate impression delivery and audience quality across campaigns to drive performance. Since 2008, DV has helped hundreds of Fortune 500 companies gain the most value out of their media spend by delivering best-in-class solutions across the digital ecosystem that help build a better industry. DV is headquartered in New York City.

Role Description
In this role you will lead DoubleVerify's Analytics department, responsible for planning, building and operating the next-generation analytics platform while extending and supporting the existing one. The new platform will ingest billions of records per day and make them available for consumption through an intuitive, highly performant UI. Additionally, it will allow programmatic access to data under a tight Data Governance strategy. This role is ideal for you if you are excited about creating state-of-the-art software, enjoy leading people to accomplish ambitious visions, and possess exceptional architectural, engineering, analytical and leadership skills.

Responsibilities
- Overall responsibility for the Analytics group and teams that build DoubleVerify’s Reporting and Analytics solutions.
- Design, develop and operate DV’s next-generation analytics platform, which handles high volumes of data, provides sub-second latency for online analytics use cases and scales well. It serves hundreds of Fortune 500 customers as well as thousands of media partners.
- Attract, interview, hire and onboard talented engineers and team leaders.
- Mentor, coach and manage a group of engineers - help them shine and grow.
- Oversee the creation of a new online big data analytics platform, including technology selection and the implementation of Authentication/Authorization as well as data access layers operating against the newly established data store.
- Define and advocate for technical standards of quality, monitoring, security, modifiability, extensibility and maintainability of data processing software
- Continuously explore the technical landscape of the Big Data ecosystem through analysis of various frameworks and technologies
- Continuously optimize the group’s SDLC processes, including by implementing CI/CD.
- Provide technical guidance and coaching to engineers through brainstorming sessions, design reviews and pair-programming
Requirements

- 2+ years of in-depth understanding and hands-on experience with distributed columnar data stores (BigQuery, Snowflake, Redshift, Vertica or equivalent)
- 10+ years of experience in building and operating mission critical data intensive and distributed systems
- 5+ years of hands-on experience with Scala/Java/Python development
- 5+ years of experience in leadership roles
- Ability to architect and design complex software systems while adhering to fundamental principles and best practices
- Fluent in SQL and data analysis
- Experience following and advocating for state-of-the-art SDLC processes
- BS/MS degree in Computer Science or other related field