  • ID: #46087384
  • Job type: Permanent
  • Salary: $165,000 - $180,000 per year
  • Source: Jobot
  • Date: 2022-09-28
  • Deadline: 2022-11-26

Senior Data Engineer

San Francisco, California 94102, USA
 
Permanent

Vacancy expired!

100% Remote - Leader in Cyber Security!

This Jobot Job is hosted by: Sarah Murphy

Are you a fit? Easy Apply now by clicking the "Apply Now" button and sending us your resume.

Salary: $165,000 - $180,000 per year

A bit about us:

We're on a mission to help our clients make their businesses more secure. We're one of the fastest-growing companies in a truly essential industry.

In your role, you'll be inspired by a team of the brightest business and technical minds in cybersecurity. We are passionate champions for our clients and know from experience that the best solutions for our clients' needs come from working hard together. As part of our team, your voice matters, and you will do important work that has an impact, on people, businesses, and nations. Our industry and our company move fast, and you can be sure that you will always have room to learn and grow. We're proud of our team and the important work we do to build confidence for a more connected world.

Job Duties and Responsibilities:

  • Build data pipelines to automate batch and real-time data delivery through StreamSets' streaming data platform to data lakes, warehouses, and analytical and machine learning applications
  • Develop applications from the ground up using a modern technology stack including Python, Spark, Scala, RDS, NoSQL, graph databases, Elasticsearch, and Splunk
  • Integrate and ship code into AWS and Azure cloud production environments
  • Build and utilize APIs and data delivery services that support critical operational and analytical applications for customers
  • Transform complex analytical models into scalable, production-ready solutions
  • Customer-facing role: walk stakeholders through whiteboarding of architecture and implementation planning

Qualifications:

  • 5+ years of related work experience (Bachelor's degree preferred)
  • 5+ years of experience delivering data solutions using open-source components
  • 3+ years of technical architecture experience
  • 3+ years of experience in one or more scripting/programming languages - Python, JSON, Ruby, C#, PowerShell, YAML
  • 3+ years of experience with relational SQL and NoSQL databases
  • 2+ years of experience with AWS, Azure, and/or Google Cloud Platform preferred
  • Experience taking technical ownership of projects
  • Experience with service-oriented architecture for cloud-based services
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with stream-processing systems: StreamSets, NiFi, Storm, Spark Streaming, etc.
  • Strong analytic skills related to working with unstructured datasets
  • Experience building and optimizing data pipelines, architectures, and data sets using tools like NiFi, StreamSets, etc.
  • Experience with the following technologies is highly desirable: Scala, Tableau, Salt, Elastic Stack (Logstash, Elasticsearch, Kibana)
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Experience with Agile practices like Scrum, Kanban, and CI/CD preferred
  • Experience with deployment orchestration, automation, and security configuration management (Jenkins, Puppet, Chef, CloudFormation, Terraform, Ansible) preferred
  • Experience working with cloud security and governance tools, cloud access security brokers (CASBs), and server virtualization technologies
  • Working knowledge of common and industry-standard cloud-native/cloud-friendly authentication mechanisms (OAuth, OpenID, etc.)

Why join us?

  • Medical, dental, vision
  • Flexible Spending Account (FSA), Health Savings Account (HSA) with Optiv contribution
  • 401(k) match: 50% of your employee deferral contribution, up to 6%, and an annual company match maximum
  • All employer matching contributions are immediately vested
  • 401(k) planning and advice resources with education to help employees prepare for retirement
  • Voluntary life and AD&D
  • Accident, critical illness, hospital indemnity
  • Identity theft protection
  • Pre-tax commuter benefits
  • Flexible time off policy
  • In-person and remote courses that focus on our employees' professional development
  • An on-demand, industry-leading learning management system
  • Tuition reimbursement
  • Certification and vendor training
  • And more!


Interested in hearing more? Easy Apply now by clicking the "Apply Now" button.

