  • ID
    #17812376
  • Job type
    Contract
  • Salary
    Depends on Experience
  • Source
    Atlantic Partners
  • Date
    2021-07-23
  • Deadline
    2021-09-21

Vacancy expired!

MLOps Architect - Job Description: The MLOps platform is built on AWS for one of the USA's best-known and fastest-growing pharmaceutical companies in the NY/NJ region. The role supports a global team of developers using this platform.

Skills & Qualification:
  • 12+ years of total Admin/Architect experience in the Big Data domain, with the last 12-18 months focused on MLOps; of that total, a minimum of 6+ years of experience as an enterprise-scale Architect.
  • Experience establishing and maintaining ML model capabilities such as:
    • Model hyperparameter optimization
    • Model evaluation and explainability
    • Model training
    • Model management, version tracking & storage
  • Hands-on experience hosting production ML models, considering aspects of observability, deployment, serving, validation, compliance, and audit of models
  • Experience supporting model builds by developing ML platforms offering capabilities of hosted notebooks (e.g., JupyterHub or Databricks), experiment tracking (MLflow or Comet), model management, version tracking & storage (Google AI), model training (Dataiku or Kubeflow), model hyperparameter optimization (Comet or AWS SageMaker), and model evaluation and explainability (TensorFlow or Streamlit); a brief experiment-tracking sketch follows this list
  • Experience maintaining Data Quality frameworks such as OwlDQ or Great Expectations, and Data Catalogs such as Alation, Apache Hive, AWS Glue, or Apache Atlas (any one of them is good enough); a data-quality sketch also follows this list
  • Experience with distributed systems like big data processing/streaming/storage engines (e.g., Apache Hadoop, Apache Spark, Apache Kafka, Apache Hive), different Cloud environments (e.g., AWS or Google Cloud Platform), or resource management systems (e.g., Apache Mesos or Kubernetes)
  • Extensive programming experience in any of these languages: Java, Scala, Python, or Go (any one of them is good enough)
  • Ability to independently navigate systems and tools
  • Experience administering Hadoop applications such as HDFS, MapReduce, YARN, Hive, Pig, Oozie, Slider, Sqoop, NiFi, Airflow, Ranger, Griffin, Atlas, DataIQ
  • Experience with Hadoop in a public cloud such as AWS is a must.
  • Experience with AWS Services such as Elastic Compute Cloud (EC2), Auto Scaling, Elastic Load Balancing (ELB), CloudFormation (CF), Identity and Access Management (IAM), CloudTrail, CloudWatch (CW), Simple Storage Service (S3), Elastic Block Store (EBS), Elastic File System (EFS).
  • Experience with a DevOps environment.
  • Excellent problem-solving and troubleshooting skills
  • Linux administration experience preferred.
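
To illustrate the experiment-tracking, model-management, and version-tracking capabilities listed above, here is a minimal sketch using MLflow with scikit-learn; the experiment name, parameters, and model are invented for illustration and are not part of the posting.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical experiment name; on a hosted ML platform this would point at a shared tracking server.
mlflow.set_experiment("demo-model-build")

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    # Record hyperparameters, metrics, and the trained model for version tracking & storage.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")
```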
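
A data-quality check of the kind Great Expectations provides might look like the sketch below; it assumes the legacy pandas-dataset API from pre-1.0 Great Expectations releases, and the column names and sample data are invented for illustration.

```python
import great_expectations as ge
import pandas as pd

# Invented sample data for illustration only.
df = pd.DataFrame({"batch_id": ["A1", "A2", "A3"], "dose_mg": [10.0, 20.0, None]})

# Wrap the DataFrame so expectation methods become available (legacy pre-1.0 API).
dataset = ge.from_pandas(df)

# Fails because dose_mg contains a null value.
result = dataset.expect_column_values_to_not_be_null("dose_mg")
print(result.success)
```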

Certifications (Desirable):
  • HDP Certified Administrator or any Hadoop administration certification
  • AWS Administrator Associate or higher
