P&G
Responsibilities of the Role
- Understand business requirements and convert them into technical designs for data pipelines and data models
- Write code to ingest, transform, and harmonize raw data into usable, refined models
- Analyze the data sets associated with in-scope use cases to design and develop optimal data models and transformations
- Craft integrated systems, implementing ELT/ETL jobs to fulfill business deliverables
- Perform sophisticated data operations such as orchestration, transformation, and visualization of large datasets
- Coordinate with data asset managers, architects, and the development team to ensure the solution is fit for use and meets vital architectural requirements
- Demonstrate standard coding practices to ensure delivery excellence and reusability
Job Qualifications
Role Requirements
- At least 3 years of experience in Data Engineering
- Hands-on experience building data models and data pipelines, including data ingestion and harmonization, along with data governance
- Hands-on experience in a scripting language such as Python, R, or Scala
- Backend development expertise with SQL Database, SQL Data Warehouse, or other cloud data warehousing solutions
- Hands-on experience with reporting tools such as Power BI or Tableau
- Knowledge of DevOps and CI/CD tools (e.g. Azure DevOps and GitHub)
- Knowledge of cloud technologies (Azure Cloud), with at least 2 years of experience inclusive of software engineering
- Knowledge of Agile or Scrum methodologies, with a proven track record of successful projects
- Graduate of an Engineering or IT-related course
To apply for this job, please visit www.pgcareers.com.