Data Operations Analyst
DoubleVerify is the recognized market innovator with technology that accurately authenticates the quality of digital media and drives ad performance for the world's leading brands. DV provides media transparency and accountability to deliver the highest level of impression quality for maximum advertising performance. Since 2008, DV has helped hundreds of Fortune 500 companies get the most from their media spend and has helped build a better industry. Learn more at doubleverify.com.
The Data Operations Analyst is an integral part of the Data Operations (DataOps) team, responsible for analyzing DoubleVerify’s data internally and externalizing it to clients, as well as monitoring, troubleshooting, and improving the company’s various data pipelines and technologies.
- You will gain in-depth knowledge of how data is collected, processed and externalized to clients within DoubleVerify’s architecture.
- You will script extensively in Python and SQL (a brief illustrative sketch follows this list).
- You will work with data analysis tools such as Splunk to create reports and data visualizations.
- You will be part of the on-call rotation.
- You will work with Vertica, OLTP databases, and Hive/SparkSQL.
- You are also thrilled at the prospect of building strong relationships with teams across the company, solving operational issues, and implementing quality improvements.
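To give a concrete flavor of the scripting involved, here is a minimal sketch of the kind of pipeline health check a DataOps analyst might write in Python and SQL. It assumes the open-source vertica_python client and a hypothetical impressions_hourly table; the connection details, schema, and alerting behavior are illustrative only, not DoubleVerify's actual stack.

```python
"""Minimal pipeline health-check sketch.

Assumptions (illustrative, not a real DoubleVerify schema):
  - a Vertica table `impressions_hourly` with an `event_hour` column
  - connection details supplied via environment variables
"""
import os

import vertica_python  # third-party client: pip install vertica-python

CONN_INFO = {
    "host": os.environ.get("VERTICA_HOST", "localhost"),
    "port": 5433,
    "user": os.environ.get("VERTICA_USER", "dataops"),
    "password": os.environ.get("VERTICA_PASSWORD", ""),
    "database": os.environ.get("VERTICA_DB", "analytics"),
}

# Hypothetical freshness check: flag the pipeline if no rows
# have landed within the last hour.
FRESHNESS_SQL = """
    SELECT COUNT(*) FROM impressions_hourly
    WHERE event_hour >= NOW() - INTERVAL '1 hour'
"""


def check_pipeline_freshness() -> bool:
    """Return True if recent data is present, False if the pipeline looks stale."""
    with vertica_python.connect(**CONN_INFO) as conn:
        cur = conn.cursor()
        cur.execute(FRESHNESS_SQL)
        (row_count,) = cur.fetchone()
    return row_count > 0


if __name__ == "__main__":
    if not check_pipeline_freshness():
        # In practice this would page the on-call rotation,
        # e.g. via a Splunk alert rather than a print statement.
        print("ALERT: no impressions ingested in the last hour")
```

We are looking for candidates with the following qualifications: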
- Bachelor's degree in Computer Science or another technical field, or equivalent experience
- Strong SQL querying skills
- Experience with the Vertica columnar database
- Experience with Hadoop, Hive, and MongoDB a plus
- Demonstrated ability to adapt quickly, learn new skill sets, and understand operational challenges
- Strong analytical, problem-solving, negotiation, and organizational skills, with clear focus under pressure
- Proactive, with a proven ability to execute multiple tasks simultaneously
- Resourceful and results-oriented, with the ability to get things done and overcome obstacles
- Excellent interpersonal skills, including relationship building with diverse, global, cross-functional teams
- Proven ability to troubleshoot and problem-solve in complex systems
- Linux environment experience
- Good understanding of BI and Data Warehousing concepts (ETL, OLAP vs. OLTP, Slowly Changing Dimensions; see the sketch after this list)
- Good understanding of process automation
- Python, Bash, or any other scripting language a plus
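As a quick illustration of one of the warehousing concepts above, here is a minimal sketch of a Type 2 Slowly Changing Dimension update, written as the SQL a Python maintenance script might execute. The dim_client and stg_client tables and their columns are hypothetical, chosen only to show the pattern: expire the current dimension row, then insert a new one, so that history is preserved rather than overwritten.

```python
"""Type 2 Slowly Changing Dimension sketch (hypothetical schema).

A Type 2 SCD preserves history: when a tracked attribute changes,
the current dimension row is closed out and a new row is inserted,
rather than updating the row in place.
"""

# Step 1: expire the current row for any client whose tracked
# attribute (here, `region`) differs from the incoming staged value.
EXPIRE_SQL = """
    UPDATE dim_client d
    SET valid_to = CURRENT_DATE, is_current = FALSE
    FROM stg_client s
    WHERE d.client_id = s.client_id
      AND d.is_current
      AND d.region <> s.region
"""

# Step 2: insert a fresh row for each staged client that no longer
# has a current dimension row -- either a brand-new client, or one
# whose old row was just expired in step 1.
INSERT_SQL = """
    INSERT INTO dim_client (client_id, region, valid_from, valid_to, is_current)
    SELECT s.client_id, s.region, CURRENT_DATE, DATE '9999-12-31', TRUE
    FROM stg_client s
    LEFT JOIN dim_client d
      ON d.client_id = s.client_id AND d.is_current
    WHERE d.client_id IS NULL
"""


def apply_scd2(cursor) -> None:
    """Run both steps in order, inside the caller's transaction."""
    cursor.execute(EXPIRE_SQL)
    cursor.execute(INSERT_SQL)
```

The design point a Type 2 dimension captures is that attribute history is never lost: each fact can be joined back to the dimension row that was current at the time the fact occurred.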