  • ID
    #44813883
  • Job type
    Contract
  • Salary
    TBD
  • Source
    Tandym Tech
  • Date
    2022-08-11
  • Deadline
    2022-10-10
 
Vacancy expired!

A healthcare technology company in Silicon Valley has a great remote opportunity awaiting a new Senior Data Engineer. In this role, the Senior Data Engineer (Remote) will be responsible for helping grow the company's data lake and scale systems operations.

Responsibilities: The Senior Data Engineer (Remote) will:
  • Implement transformation pipelines within Snowflake using dbt
  • Query Snowflake using SQL
  • Build out a tagging scheme for restricted data (PHI, PII, etc.) and integrate it into the access control system
  • Develop scripts in SQL, Go, and Bash for loading, extracting, and transforming data (see the sketch after this list)
  • Write code to build infrastructure components (we use Terraform)
  • Interpret data controls and build appropriate controls to manage access
  • Tune SQL queries to improve performance
  • Assist with production issues in the data warehouse, such as reloading data, transformations, and translations
  • Navigate new data sets frequently and model data efficiently and effectively
  • Translate complex business requirements into generic, highly scalable technical solutions
  • Develop and implement platform components to support containers
  • Work with the SRE team to operationalize data infrastructure
  • Perform other duties, as needed
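
To give a flavor of the Go and SQL scripting these duties describe, the sketch below queries Snowflake from a small Go program. It is only an illustration, not part of the posting: it assumes the snowflakedb/gosnowflake driver, a connection string supplied through a SNOWFLAKE_DSN environment variable, and a hypothetical orders table.

    // Minimal sketch: extract a few rows from Snowflake using Go's
    // database/sql with the Snowflake driver. Table and DSN are assumptions.
    package main

    import (
        "database/sql"
        "fmt"
        "log"
        "os"

        _ "github.com/snowflakedb/gosnowflake" // registers the "snowflake" driver
    )

    func main() {
        // DSN format is typically user:password@account/database/schema?warehouse=wh;
        // here it is read from the environment rather than hard-coded.
        db, err := sql.Open("snowflake", os.Getenv("SNOWFLAKE_DSN"))
        if err != nil {
            log.Fatalf("open: %v", err)
        }
        defer db.Close()

        // Plain SQL against Snowflake, as in the querying/extraction duties above.
        rows, err := db.Query(`SELECT order_id, amount FROM orders LIMIT 10`)
        if err != nil {
            log.Fatalf("query: %v", err)
        }
        defer rows.Close()

        for rows.Next() {
            var id string
            var amount float64
            if err := rows.Scan(&id, &amount); err != nil {
                log.Fatalf("scan: %v", err)
            }
            fmt.Println(id, amount)
        }
        if err := rows.Err(); err != nil {
            log.Fatalf("rows: %v", err)
        }
    }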

Qualifications:
  • Proficiency with writing code in Go is a must
  • 3+ years designing, implementing, and administering a Kubernetes cluster in a production environment
  • Hands-on experience with Snowflake is a HUGE plus (but not an absolute must)
  • Experience with Argo Workflows or other workflow/CI/CD technologies
  • Experience using job schedulers such as Apache Airflow is a big plus
  • Comfortable with tools such as Git, Terraform, Docker, etc.
  • Experience implementing Data Access Controls
  • Experience with ETL/ELT technologies such as Boomi, Fivetran, and AWS Glue
  • Proficient in data cleanup/prep for analysis
  • Familiarity with the needs of a Business Intelligence / Data Analyst team
  • Thorough understanding of IT fundamentals, including Windows and Linux operating systems, networking, security, cloud, and Git usage
  • Solid problem solving and time management skills
  • Great interpersonal skills
  • Excellent communication skills (written and verbal)
  • Strong attention to detail
  • Highly organized

Desired Skills:
  • Experience with Argo Workflows
  • Experience with Data Transport technologies, such as Kafka, AWS Firehose, etc.
  • Experience working in the Healthcare, Medical Device, or other regulated industries
