

Position Details: Sr Data Engineer - 961386N

Location: Beaverton, OR
Openings: 2
Job Number: 961386N


Description:

Required Basic Skills

1) Minimum of 2 years of experience with Spark.

2) Performance tuning in Spark.

3) Hive basics and partitioning concepts.

4) Experience with an object-oriented programming language (Python preferred).

5) Basic shell scripting knowledge.

6) Good knowledge of ETL and SQL.

7) AWS knowledge is a plus.

8) Good communication skills.

Responsibilities:

(The primary tasks, functions and deliverables of the role)

  1. Design and build reusable components, frameworks and libraries at scale to support analytics products
  2. Design and implement product features in collaboration with business and technology stakeholders
  3. Identify and solve data management issues to improve data quality
  4. Clean, prepare and optimize data for ingestion and consumption
  5. Collaborate on the implementation of new data management projects and the restructuring of the current data architecture
  6. Implement automated workflows and routines using workflow scheduling tools
  7. Build continuous integration, test-driven development and production deployment frameworks
  8. Collaboratively review designs, code, test plans and dataset implementations produced by other data engineers to maintain data engineering standards
  9. Analyze and profile data to design scalable solutions
  10. Troubleshoot data issues and perform root cause analysis to proactively resolve product and operational issues

Skills

  1. Strong understanding of data structures and algorithms
  2. Strong understanding of solution and technical design
  3. Strong problem-solving and analytical mindset
  4. Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
  5. Able to quickly pick up new programming languages, technologies and frameworks
  6. Advanced experience building scalable, real-time, high-performance data lake solutions in the cloud
  7. Experience with relational SQL
  8. Experience with scripting languages such as Shell and Python
  9. Experience with source control tools such as GitHub and related development processes
  10. Experience with workflow scheduling tools
  11. In-depth understanding of microservice architecture
  12. Strong understanding of developing complex data solutions
  13. Experience working on end-to-end solution design
  14. Able to lead others in solving complex problems by taking a broad perspective to identify innovative solutions
  15. Willing to learn new skills and technologies
  16. Passionate about data solutions

Education

  1. Bachelor's degree in IT or a related field.
  2. One of the following alternatives may be accepted:
     • PhD or law degree + 3 years of experience
     • Master's degree + 4 years of experience
     • Associate's degree + 6 years of experience
     • High school diploma + 7 years of experience

Required

  • SQL
  • DATA ARCHITECTURE
  • PYTHON
  • BIG DATA ANALYTICS
  • AWS

Additional

  • DATA MANAGEMENT
  • DATA QUALITY
  • DATA STRUCTURES
  • DEPLOYMENT
  • INTEGRATION
  • INTEGRATOR
  • PROBLEM SOLVING
  • ROOT CAUSE ANALYSIS
  • SCHEDULING
  • SCRIPTING
  • TECHNICAL DESIGN
  • TEST PLANS
  • WORKFLOW
