Level: Mid-Senior level

Job type: Full-time

Job Description

About Netskope

Today, there’s more data and users outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one that is built in the cloud and follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security.

Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, San Francisco, Seattle, Bangalore, London, Melbourne, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events (pre and hopefully post-Covid) and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive and interactive. Visit us at Netskope Careers and follow us on Twitter @Netskope and Facebook.

Principal Software Engineer - Data

With a mission to evolve security for the way people work, Netskope, a cloud security company, was founded by early architects and distinguished engineers from security and networking leaders like Palo Alto Networks, Juniper Networks, Cisco, and VMware.

We are looking for skilled engineers with an eye for building and optimizing distributed systems. From data ingestion and processing to storage optimization, we work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems. Our customers depend on us to provide accurate, real-time, and fault-tolerant solutions to their ever-growing data needs. This is a highly technical senior-level position with responsibility for leading the development, validation, publishing, and maintenance of logical and physical data models that support various OLTP and analytics environments.

Role and Responsibilities
  • Design and implement planet-scale distributed data platforms, services, and frameworks, including solutions that address high-volume, complex data collection, processing, transformation, and analytical reporting
  • Work with the application development team to implement data strategies, build data flows and develop conceptual data models
  • Understand and translate business requirements into data models supporting long-term solutions
  • Develop best practices for standard naming conventions and coding practices to ensure consistency of data models
  • Analyze data system integration challenges and propose optimized solutions
  • Research to identify effective data designs, new tools and methodologies for data analysis
  • Provide guidance and expertise to the development community in the effective implementation of data models and the building of high-throughput data access services
  • Work through all stages of a data solution life cycle: analyze and profile data; create conceptual, logical, and physical data model designs; derive reporting and analytics solutions
  • Evaluate existing data and physical databases for variances and discrepancies
  • Participate in data strategy and road map exercises including data architecture, business intelligence / data lake tool selection, technical design and implementation
  • Provide technical leadership in all phases of a project from discovery and planning through implementation and delivery
Qualifications
  • At least 8 years of hands-on experience in architecture, design or development of enterprise data solutions, applications, and integrations
  • Ability to conceptualize and articulate ideas clearly and concisely
  • Demonstrable experience in developing, validating, publishing, maintaining logical and physical data models
  • Excellent communication, presentation and interpersonal skills
  • Hands-on experience with modern enterprise data architectures and data toolsets (e.g.: data warehouse, data marts, data lake, 3NF and dimensional models, modeling tools, profiling tools)
  • Bachelor's or Master's degree in a STEM major
  • Strong background in algorithms, data structures, and coding, with programming experience in Java, Python, or Scala
  • Exceptional proficiency in SQL
  • Experience building products using at least one of each of the following distributed technologies:
      • Relational stores (e.g., Postgres, MySQL, or Oracle)
      • Columnar or NoSQL stores (e.g., BigQuery, ClickHouse, or Redis)
      • Distributed processing engines (e.g., Apache Spark, Apache Flink, or Celery)
      • Distributed queues (e.g., Apache Kafka, AWS Kinesis, or GCP Pub/Sub)
  • Experience with standard software engineering methodologies (e.g., unit testing, code reviews, design documents)
  • Experience working with GCP, Azure, AWS, or similar cloud platform technologies is a plus
  • Ability to drive change through persuasion and consensus

Deadline: 08-12-2024

Click to apply for free

Apply
