At White Ops, we are all about keeping it human. We are the global leader in bot mitigation. We protect more than 200 enterprises—including the largest internet platforms—from sophisticated bots by verifying the humanity of nearly one trillion online interactions every week. The most sophisticated bots look and act like humans when they click on ads, visit websites, fill out forms, take over accounts, and commit payment fraud. We stop them.
Founded in 2012 in a Brooklyn sci-fi bookstore, our Bot Mitigation Platform protects enterprises from the sophisticated bots that threaten them. It’s an ongoing war that we fight passionately every day. Join our mission to stop bots, disrupt the economics of cyber crime, and keep it human.
The Data Engineering team is responsible for the primary organs and arteries of our system. The team’s mission is to consistently deliver a world-class data platform that enables our security researchers, threat intel analysts, and data scientists to quickly and reliably detect malicious non-human activity. You will solve a wide range of problems, from finding more efficient ways to research sophisticated botnets to delivering only the most critical and impactful insights to our customers as fast as possible so they can take action.
By joining the Data Platform team at White Ops, you will gain a deep understanding of how our products and services work, all the way from signal collection to customer reports, and be able to make a major impact on cleaning up the internet. You will be responsible for the evolution of a platform that will grow in scale by 100x to 1000x (current volume is over 1 PB), ensure that our data can never be used for ill by anyone (privacy and security are at the core of the system design), and incorporate the latest ML and fraud research techniques.
What you'll do:
- Architect, design, build, test, deploy, and maintain reliable, scalable systems and applications to process our current and expected future loads
- Partner closely with security researchers, threat intel analysts, and data scientists to understand how they work and uncover new ways to enhance and amplify their analysis
- Manage the ingestion into, and usability of, a Snowflake data warehouse that processes very high-volume, near-real-time data
- Partner with our operations teams to debug and resolve critical issues and participate in an on-call support rotation for mission critical production applications
- Empower our stakeholders (finance, marketing, sales, product management, etc.) to automate and enhance their day-to-day data requirements by leveraging frameworks and libraries
- Apply DevOps practices and principles to build and maintain CI/CD pipelines
- Participate in Git-based code reviews and in design reviews ranging from individual projects to overall architecture
Who you are:
- You can develop software in Java and Python, understanding both the theoretical concepts of each language and how to write well-documented, well-tested code
- You have experience designing and architecting big data ecosystems that leverage large datasets (on the order of 100 billion interactions and multiple petabytes), and hands-on experience developing them
- You have demonstrated the ability to design processes that support data transformation, data structures, metadata, and dependency and workflow management
- You have experience efficiently analyzing and extracting key insights from large (terabytes/petabytes) datasets
- You can clearly articulate the pros and cons of various technologies and platforms, and document use cases, solutions, and recommendations
- You can own medium-to-large features all the way from technical design to deployment in production, collaborating with cross-functional teams
- You can provide updates on the scope and state of work, proactively identify issues that you are facing, and learn to anticipate roadblocks that may arise
These are additional skills that the team has developed and uses regularly, but they are not required for new hires (other than a burning desire to learn them when needed):
- Experience with Spring, the Java application framework
- Experience automating the deployment of distributed software to cloud environments like AWS, Azure, or GCP (which likely also means you know some combination of Kubernetes, Docker, and/or Terraform)
- Knowledge of the Snowflake database, primarily its use as a data warehouse
- Advanced SQL knowledge and working familiarity with a variety of SQL and NoSQL databases
- Able to do basic *nix system administration
Benefits and Perks:
- Unlimited vacation policy
- Stock options, 401(k), and commuter benefits
- Competitive salary and benefits
- Medical and dental insurance for all full-time employees
- Fully paid parental leave
- Professional development fund
- Great coaching from senior leaders and challenging development opportunities
Life at White Ops:
Our HQ office is located in the heart of New York City. We are growing the company deliberately with a keen eye towards maintaining a culture that values diversity, lifestyle, and career growth. We are doing meaningful work and we need people to join our mighty team. We are proud of our overwhelmingly positive presence on Glassdoor and Built in NYC. We have offices located in NYC, DC, Victoria, and London.