
Position Details: DevOps Engineer (812953FI)

Location: Beaverton, OR
Openings: 1
Job Number: 812953FI



Required Job Qualifications

  • Linux: 5 or more years in Unix systems engineering with experience in Red Hat Linux, CentOS, or Ubuntu.

  • AWS: Working experience and good understanding of the AWS environment, including VPC, EC2, EBS, S3, RDS, SQS, CloudFormation, Lambda, and HBase.

  • Networking: Working knowledge of TCP/IP networking, SMTP, HTTP and HTTPS, load-balancers (ELB, HAProxy), NGINX and high availability architecture.

  • Programming: Experience programming with AWS Lambda, Python, Node.js, Bash, REST APIs, and JSON encoding.

  • DevOps Automation: Experience with DevOps orchestration/configuration management and CI/CD tools (Jenkins, CircleCI, Atlantis, Puppet, Troposphere, Terraform, etc.).

  • ELK stack: Experience setting up the ELK stack for analytics.

  • Version Control: Working experience with one or more version control platforms (Bitbucket, Git).

  • AWS EMR: Experience in Amazon EMR cluster configuration.
  • ETL: Experience with a job scheduler such as Airflow or AWS Data Pipeline; Airflow experience preferred.
  • Security: Experience implementing role-based security, including AD integration, security policies, and auditing in a Linux/Hadoop/AWS environment.
  • Backup/Recovery: Experience with the design and implementation of big data backup / recovery solutions.
  • Ability to keep systems running at peak performance, applying operating system upgrades, patches, and version upgrades as required.
  • Lead other admins and platform engineers through design and implementation decisions to achieve balance between strategic design and tactical needs.
  • Research and deploy new tools and frameworks to build a sustainable big data platform.
  • Agile/Scrum/Kanban experience.
  • Demonstrated communication and interpersonal skills.
  • Proven track record of success in fast-moving organizations with complex technology applications.
  • Collaborate with Project Managers, Product Managers, QA and Engineering teams to deliver results.

Nice to have

  • Hadoop: 1 year of operational experience with the Hadoop stack (MapReduce, Spark, Sqoop, Pig, Hive, Impala, Sentry, HDFS).

  • Monitoring: Hands-on experience with monitoring tools such as AWS CloudWatch, Nagios, or Splunk.



  • MS/BS in Computer Science or a related field.

Perform an action: Apply to Position

Powered by: CATS - Applicant Tracking System