• Works with other data engineers and architects to establish secure and performant data architectures, enhancements, updates, and programming changes for portions and subsystems of the data platform, repositories, or models for structured/unstructured data
• Analyzes design and determines coding, programming, and integration activities required based on general objectives and knowledge of overall architecture of product or solution
• Promotes and drives use of agile and DevOps methodologies and patterns including continuous integration, continuous testing, test-driven development, continuous delivery, etc.
• Follows, promotes, and adopts HP’s Release and Change Management processes
• Reviews and evaluates designs and project activities for compliance with architecture, security, and quality guidelines and standards; provides tangible feedback to improve product quality and mitigate failure risk
• Writes and executes complete testing plans, protocols, and documentation for assigned portion of data system or component; identifies defects and creates solutions for issues with code and integration into data system architecture
• Software development experience (varies by level) and mastery of software development fundamentals
• Strong analytical and problem-solving skills with ability to represent complex algorithms in software
• Hands-on experience (varies by level) developing highly available, scalable, medium- to large-scale ETL data processing pipelines in Java, Python, or Scala
• Proficient understanding of big data distributed computing principles, AWS architecture, and SQL querying tools
• Experience with Apache Spark, AWS building blocks (S3, RDS, SQS, EKS, EMR, Lambda, etc.), microservices/REST APIs, Docker and Kubernetes, Redshift and Aurora MySQL, and logging frameworks such as ELK or Splunk
• Passion for quality and attention to detail
• Ability to effectively communicate across both technical and business audiences
• Ability to work independently in a fast-paced environment and deliver results under pressure
• Nice to have: experience with Databricks or Jupyter notebooks, Airflow, and dbt
• Experience with Terraform; familiarity with visualization tools such as Looker or Power BI
Compensation: employment agreement (UoP), 22K PLN gross