  • Salary
    USD $100,000 - $105,000 per annum
  • Source
    Vaco Technology

Data Engineer

Charlotte, North Carolina 28202, USA

U.S. Citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
  • Utilize multiple development languages/tools such as Java, Python, and Scala, along with object-oriented approaches, to build prototypes and evaluate the results for efficiency and feasibility
  • Design, develop, test, and implement data-driven solutions that meet business requirements; quickly identify opportunities and recommend possible technical solutions in collaboration with third-party vendors
  • Provide business analysis and develop ETL code that meets all technical specifications and business requirements according to the established architectural designs
  • Extract business data from multiple structured and unstructured data sources using data pipelines
  • Develop custom data pipelines and migrate data from large-scale data environments (Oracle, SQL Server), including end-to-end design and build of near-real-time and batch data pipelines
  • Deploy application code and analytical models using CI/CD tools and techniques (Jenkins, GitHub), and support the deployed data applications and analytical models
  • Collaborate as part of a cross-functional Agile team to build and improve software for next-generation Big Data and Cloud applications
  • Perform unit tests and conduct reviews with other team members to ensure code achieves high code coverage, verified by code scans using SonarQube and Checkmarx
  • Take ownership of pipelines and communicate concisely and persuasively to varied audiences, including data providers, engineers, and analysts
The Skills You Bring
  • Bachelor's or Master's degree in related field or equivalent experience
  • Minimum of 3 years of related technical experience and expert-level knowledge of Java and Python
  • Expertise in data technologies and tools such as Spark, Kafka, Hive, NiFi, Sqoop, Impala, Flume, and Oozie
  • Experience in API development and handling
  • Experience with cloud technologies: AWS (Glue, Lambda, EMR) and Snowflake/Redshift databases
  • Demonstrated ability to work with team members and clients to assess needs, provide assistance, and resolve problems
  • Excellent problem-solving skills, verbal/written communication, and the ability to explain technical concepts to business partners.
  • Proven design and development background in ETL, DW, BI and data migration projects.
  • Partner with development teams to ensure coding standards align with DevOps practices with respect to tools, standards, and security
  • Ability to work in a fast-paced environment and handle multiple priorities in parallel.
  • An automation mindset: an innate aim to continuously look for ways to automate existing processes
  • Experience working on Agile Scrum teams.

