Today, more data and users live outside the enterprise than inside it, dissolving the network perimeter as we know it. We realized a new perimeter was needed, one built in the cloud that follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security.
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, San Francisco, Seattle, Bangalore, London, Melbourne, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers and follow us on Twitter @Netskope and Facebook.
- You will be part of a growing team of industry experts in the exciting space of Cloud Analytics.
- Your contributions will have a high impact on the industry and our customers through our products.
- You can shape Data Engineering at Netskope!
What we are looking for
- Experience working with SQL and NoSQL datastores such as Elasticsearch, MongoDB, Druid, Postgres, and Teradata.
- Understanding of database internals.
- Solid test automation experience in Python, Go, or another language.
- Experience testing data ingestion pipelines and data-querying services.
- Experience testing REST services.
- Understanding of how data organization and query optimization affect query performance.
- Good understanding of data structures and algorithms, and excellent programming skills.
- Expert-level understanding of big data infrastructure.
- Strong verbal and written communication skills.
You will write tests for our big data workflows, help validate the ingestion pipelines and query platforms, and work closely with the Data Engineers to optimize and test data infrastructure and platforms at massive scale.
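As a flavor of the test automation described above, here is a minimal sketch in Python (stdlib only) of an automated check against a REST query service. The `/events` endpoint and its JSON response shape are hypothetical, invented purely for illustration; real tests at this role would target actual ingestion and query APIs.

```python
# Minimal sketch of REST-service test automation, stdlib only.
# The /events endpoint and its response shape are hypothetical.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


class FakeQueryService(BaseHTTPRequestHandler):
    """Stand-in for a data-querying service under test."""

    def do_GET(self):
        body = json.dumps({"events": [{"id": 1}, {"id": 2}]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass


def smoke_test_query_service():
    """Start the fake service, query it, and assert on the response."""
    server = HTTPServer(("127.0.0.1", 0), FakeQueryService)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()
    try:
        with urllib.request.urlopen(f"http://127.0.0.1:{port}/events") as resp:
            assert resp.status == 200
            payload = json.loads(resp.read())
        # Validate the shape of the query result, not just the status code.
        assert len(payload["events"]) == 2, payload
        return payload
    finally:
        server.shutdown()


if __name__ == "__main__":
    smoke_test_query_service()
```

In practice the same pattern scales up: spin up (or point at) the service, issue a query, and assert on both the HTTP status and the structure of the returned data.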
- BS or MS in Computer Science or equivalent technical degree