Data Platform Engineer

Posted 3 years ago

Must have skills:
• B.S., M.S., or Ph.D. degree in computer science or a related field, or
equivalent work experience.
• 3+ years of solid professional coding experience writing production-
quality code, preferably in Scala (Spark) and Python (PySpark).
• In-depth knowledge of distributed systems, MapReduce, Hive, Tez, Spark
and Kafka internals.
• 2+ years of experience working on complex distributed systems, or on
data processing and data management systems leveraging Spark, Flink
and Kafka.
• Experience working with public cloud platforms, preferably AWS.
• Bonus points for experience with EMR, Databricks and Snowflake.

Nice to have skills:
• Working knowledge of open-source ML frameworks and end-to-end model
development life cycle.
• Previous working experience running containers (Docker/LXC) in an
environment using one of the container orchestration services
(Kubernetes, Docker Swarm, AWS ECS, AWS EKS).

Location:

There are multiple positions with these options:

  1. Remote
  2. Hybrid: onsite a few days in NJ/NY (local to NY/NJ preferred)

Apply Online