Teradata and data warehouse experience is a must.
• 3+ years' experience in Big Data/Hadoop technologies and 5+ years'
• Hands-on, expert-level knowledge of Spark DataFrames and the Spark SQL API using Python or Scala
• Hands-on experience building DAGs with Apache Airflow
• Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with cloud data warehouses such as Redshift and Snowflake
• Experience with object-oriented/functional scripting languages: Python or Scala
• Strong analytical and problem-solving skills
• Able to work as an individual contributor with minimal or no assistance
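As a rough illustration of the query-authoring skills listed above (a join plus an aggregation), here is a minimal, self-contained sketch. It uses Python's built-in sqlite3 module and made-up `orders`/`customers` tables purely for demonstration; Redshift and Snowflake dialects differ in details.

```python
import sqlite3

# In-memory database with two hypothetical tables for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# A typical analytical query: total order amount per region,
# joining the fact table to a dimension and aggregating.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS total
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('EMEA', 150.0), ('APAC', 75.0)]
```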