Data Engineer Analyst - 6 month contract

Date: 30 Aug 2024

Location: Ampol SG, Singapore, Singapore

Company: Ampol

 

About Ampol Trading & Shipping

Our purpose at Ampol is to power better journeys today and tomorrow. Our Trading and Shipping team is key to achieving this, not only supplying our customers’ needs today, but also evolving as the energy transition takes place to supply our customers well into the future.

 

We take pride in our people and value innovation, sustainability, and customer satisfaction. If you’re a professional committed to excellence, Ampol is looking for you.

 

About the role

The primary purpose of this position is to design, implement and manage Trading and Shipping data infrastructure within the Snowflake platform. The successful candidate will play a crucial role in ensuring data is ingested, processed, transformed and made available for analysis and reporting. This role will work with data analysts and business users to understand data requirements and create scalable data solutions. As Ampol has recently migrated to a data lake in Snowflake, the candidate will also be expected to monitor and maintain the existing data pipelines and the overall health of the Snowflake data infrastructure.

 

 

Key responsibilities

Data Strategy and Planning

Continuously develop and enhance Ampol’s data strategy, architecture and roadmap to ensure alignment with business requirements and the overall company vision.

 

Data Exploration and Analysis

Explore data to identify patterns, trends and insights. Perform quantitative and qualitative analysis to answer specific business questions and gain a deeper understanding of the data.

 

Data Pipeline Development

Build and maintain data pipelines to move and transform data from various sources into Snowflake, ensuring that data ingestion is reliable, scalable and optimised for performance.

 

Data Modelling and Design

Based on business requirements, design and implement data models within Snowflake, including schema design, table structures and data partitioning, to support high-performance, low-latency querying and analytics.

 

Data Quality and Integrity

Enforce data quality checks and validation during the data ingestion process to ensure the accuracy and integrity of data in Snowflake.

 

Support Continuous Improvement in Data Product

Support the Commercial Analytics team in advancing continuous improvement in the data space, including engaging the team to generate awareness of initiatives and track progress.

 

Monitoring and Maintenance

Proactively monitor data pipelines and the overall health of Snowflake data infrastructure to identify and resolve issues. 

 

Automation and Orchestration

Automate data workflows and orchestrate data processes to minimize manual intervention and improve operational efficiency.

 

Collaboration with Data Users

Work closely with data analysts, data engineers, and business users to understand their data requirements and provide necessary data support.

 

 

Qualifications & experience

  • Tertiary qualification in Data Analytics or a related discipline
  • 3 to 7 years of data engineering experience in Snowflake
  • 3 to 7 years’ experience in the downstream oil industry and/or petroleum trading companies, in areas such as supply planning, supply operations, line finance roles, corporate planning and strategy, and distribution and sales leadership positions

 

 

Knowledge & Skills

  • Proficiency in SQL for querying databases and manipulating data.
  • Solid understanding of database management systems, data storage, data indexing and query optimization for efficient data handling. 
  • Proficient in data modelling techniques and design patterns to help create efficient and scalable data architectures.
  • Effective communication skills and the ability to collaborate with cross-functional teams to understand data requirements and deliver valuable insights.
  • Understanding of data quality principles and data governance best practices.
  • Proficiency in data analytics languages like Python, Java, Scala, etc.
  • Knowledge of big data technologies like Hadoop, Spark and distributed computing frameworks to manage large-scale data processing.
  • Proficient in using version control systems like Git for managing code and configurations
  • Stays updated with the latest data trends, technologies and best practices in data engineering.

 

 

Want to take your career to the next level? Apply today.