• Designing and implementing data transformation, ingestion, and curation functions on GCP using GCP-native services or custom code
• Designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Python and similar languages
• Performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP
• Analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on GCP using GCP and third-party services
• Optimizing data pipelines for performance and cost in large-scale data lakes
Desired Qualifications:
• Hands-on GCP experience, with at least one solution designed and implemented at production scale
• 3+ years of experience writing complex SQL queries, stored procedures, etc.
• Hands-on experience architecting and designing data lakes on GCP that serve analytics and BI application integrations
• Experience designing and optimizing data models on GCP using GCP data stores such as BigQuery and Bigtable
• Experience integrating GCP or third-party KMS and HSM services with GCP data services to build secure data solutions
• Experience architecting and implementing metadata management on GCP
• Experience architecting and implementing data governance and security for data platforms on GCP
• Agile development skills and experience
• Experience with CI/CD pipelines such as Concourse or Jenkins
• Experience with dimensional modeling using tools such as AtScale
• Google Cloud Platform certification is a plus
Permanent
Salary:
Industry: IT / Software
Functional Area: IT Software Administration
Role Category: System Engineer
Employment Type: Full-time