On site in Beaverton (No Remote)
These Product Owners will be working in the Machine Learning group.
Pos. 1: Own the data modeling and testing
Pos. 2: Own the tools (TensorFlow, SageMaker, Redshift, Snowflake)
Pos. 3: Own the visualization (Power BI, Tableau)
As a Platform Product Owner, you will be part of fast-paced engineering teams, representing the voice of the customer.
Understands customer needs by maintaining close relationships with customers and stakeholders, communicating those needs, and participating in validation of solutions
Works closely with Platform Product Directors to understand and communicate product vision, strategy, and roadmaps that are aligned with Client’s strategies and initiatives
Represents the customer in sprints to the delivery teams and provides sprint prioritization to ensure technology alignment with business priorities & strategies
Is accountable for providing customer service to our technology and business partners for onboarding, adoption, and documentation
Partners closely with Reference Architecture and Platform Engineering to align Platform Product Roadmaps with execution Plans and to ensure requirements are understood and end-products meet acceptance criteria
Measures and reports on the value provided by Platform Product offerings and provides regular updates to stakeholders on product priority, delivery, and adoption
Manages product prioritization against available budgets and works with Finance to communicate financial costs and chargebacks
Stays current with industry trends and recommends relevant technologies & products in the areas of Cloud, Digitization, Data and Analytical tools, Visualizations, AI, and other emerging technologies
Contributes back to the overall roadmap and prioritized backlog
Must have knowledge of and/or experience with data storage solutions, with emphasis on Snowflake, Redshift, S3, Glacier, HDFS, and others. Should be able to articulate to customers the benefits and tradeoffs of each technology.
Must be able to articulate and/or have experience with data lakes, data streaming, and distributed computing patterns for access, movement, management, security, and collaboration.
Must have working knowledge of data streaming and data movement technologies and patterns such as Spark Streaming, Kinesis, Kafka, NiFi, etc.
Must have working knowledge of data engineering, compute, and enrichment technologies and patterns such as Airflow, TensorFlow, Spark, EMR, Qubole, Databricks, etc.
Must have knowledge of Data Science, Advanced Analytics/Machine Learning, and Artificial Intelligence tools, technologies, and/or techniques. Should understand the lifecycle of model development from notebooks into production and have experience with tools such as DataRobot, Algorithmia, Kubeflow, SageMaker, etc.
Must have an understanding of enterprise security requirements associated with the above technologies.
7+ years in technology & architecture roles
Extensive background in working with business partners
Proven track record of being results oriented with demonstrated ability to achieve aggressive goals
Proven presentation and facilitation skills
Experience with Agile software development methodology
Must excel in team-oriented roles that rely on the ability to collaborate with others
Self-directed and comfortable working in ambiguous environments
Experience working in a highly matrixed organization
Excellent oral and written communication skills with the ability to influence others internally and externally
Experience working with internal and external customers, especially in an enterprise setting
A Bachelor's degree in Information Technology or related field