Job Role: ETL / DQ Developer
Job Location: Remote for now / Philadelphia, PA
Duration: Long Term
Mandatory Skills: AWS, Java, Spark

Responsibilities:
- 4+ years of hands-on Spark/Java development experience; Kafka and Spark Streaming are a must
- 3+ years of hands-on development experience with Hadoop ecosystem tools (Hive, Parquet, Sqoop, Presto, DistCp) is a must
- 4+ years of Big Data development experience in the cloud, specifically AWS (S3, Glue)
- AWS certification preferred: AWS Developer, Architect, DevOps, or Big Data
- 3+ years of experience with NoSQL implementations (MongoDB, Cassandra, graph databases)
- 3+ years of experience in SQL
- 2+ years of experience with UNIX/Linux, including basic commands and shell scripting
- 2+ years of experience with Agile engineering practices
- Technical expertise in data models, database design and development, data mining, and segmentation techniques
- Strong experience writing complex SQL and building ETL processes
- Excellent coding and design skills, particularly in Java, Scala, and/or Python
- Experience working with large data volumes, including processing, transforming and transporting large-scale data
- Excellent working knowledge of Apache Hadoop, Apache Spark, Kafka, Scala, Python, etc.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Good understanding and use of algorithms and data structures
- Good experience building reusable frameworks
- Experience working in an Agile team environment
- Excellent communication skills, both verbal and written