
Deriv Hiring – Senior Data Engineer


Job Summary


Deriv has listed a job advertisement for Deriv Hiring – Senior Data Engineer. The position is full time and located in Islāmābād, PK. Read the description thoroughly before submitting your job application.

Job Title: Deriv Hiring – Senior Data Engineer
Company Name: Deriv
Job Location: Islāmābād, PK
Job Type: Full Time
Job Category: Financial Services
Job Link Expiry: 2023-01-30
Posted on: Jobpicks.online

Job Details:


About the job

*This opening is for relocation to our HQ in Cyberjaya, Malaysia*

At Deriv, we are looking for a detail-oriented Senior Data Engineer to define the data pipelines behind our future data models. You will leverage technologies such as Google Cloud Platform, Airflow, Python, Docker, and PostgreSQL to provide the company with dependable business intelligence solutions. In this role, you will develop, test, and maintain data-processing architectures and build Extract, Transform, and Load (ETL) pipelines. You will be a key contributor to making our data warehouse trustworthy by ensuring data accuracy.
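
As a hedged illustration of the stack (not Deriv's actual pipeline), the sketch below shows a minimal Airflow DAG with the extract-transform-load shape described above; the DAG id, schedule, and task bodies are invented placeholders.

    # Minimal, hypothetical ETL DAG using Airflow and Python, two of the
    # technologies named in this posting. All names and logic are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        """Pull raw rows from a source system (e.g. an in-house PostgreSQL DB)."""


    def transform():
        """Clean and reshape the raw data into analyst-friendly tables."""


    def load():
        """Write the transformed tables to the warehouse (e.g. BigQuery)."""


    with DAG(
        dag_id="example_daily_etl",            # hypothetical name
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_load                    # placeholder; see ordering below
        t_extract >> t_transform >> t_load     # run the stages in order

In practice the callables would read from and write to managed connections, and task failures would trigger alerts.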

Your challenges

  • Ensure data integrity while extracting data from complex in-house and third-party sources, and manage its systematic storage. Take responsibility for data security, accuracy, and accessibility.
  • Apply your data engineering expertise to deliver tangible business solutions and support decision-making.
  • Design and build a high-performance, secure, and scalable company data warehouse and pipelines to support data science projects, following best practices.
  • Debug and resolve complex issues, and recommend improvements to keep the ETL pipeline architecture running well.
  • Transform raw data into easy-to-use tables for the Data Analysts (see the sketch after this list).
  • Keep up to date on company products and new releases to plan changes to our data warehouse and pipelines efficiently.
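
To make the easy-to-use-tables point concrete, here is a small hedged example that flattens nested raw event records into a tidy table with pandas; the field names and values are invented for illustration.

    # Hypothetical transform step: flattening raw nested event records into a
    # flat, analyst-friendly table. Field names and values are invented.
    import pandas as pd

    raw_events = [
        {"client_id": 1, "event": "trade", "payload": {"symbol": "EURUSD", "amount": 100.0}},
        {"client_id": 2, "event": "trade", "payload": {"symbol": "GBPUSD", "amount": 250.5}},
    ]

    # json_normalize lifts the nested payload fields into top-level columns.
    tidy = pd.json_normalize(raw_events)
    tidy.columns = [c.replace("payload.", "") for c in tidy.columns]

    print(tidy)
    #    client_id  event  symbol  amount
    # 0          1  trade  EURUSD   100.0
    # 1          2  trade  GBPUSD   250.5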

Requirements

  • A minimum of 5 years of experience in the data engineering field
  • Expertise in data modelling techniques such as the Kimball star schema, Anchor modelling, and Data Vault (a minimal star-schema sketch appears after this list)
  • Competence in object-oriented or functional scripting languages such as Python
  • Proficiency in relational SQL and NoSQL databases, preferably PostgreSQL, including PITR, pg_basebackup, WAL archiving, and replication
  • Familiarity with column-oriented storage formats and data warehouses such as Parquet, Redshift, and BigQuery
  • In-depth skills in developing and maintaining ETL/ELT data pipelines and workflow management tools such as Airflow
  • Hands-on experience with Google Cloud Platform (GCP) services such as BigQuery, scheduled queries, Cloud Storage, and Cloud Functions
  • Familiarity with alerting and self-recovery methods for maintaining data accuracy
  • Analytical skills with the ability to turn data into sound business decisions
  • Expertise in peer-reviewing pipeline code and suggesting improvements where needed
  • Experience in helping teams make informed business decisions with data
  • Strong communication and presentation skills
  • Fluency in spoken and written English
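
As context for the data-modelling bullet above, the sketch below creates a tiny Kimball-style star schema (one fact table, two dimensions) in PostgreSQL via psycopg2; every table, column, and connection parameter is a hypothetical placeholder.

    # Hypothetical Kimball-style star schema: a fact table of trades that
    # references date and client dimensions. Not a real Deriv schema.
    import psycopg2

    DDL = """
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,  -- e.g. 20230129
        full_date DATE NOT NULL
    );

    CREATE TABLE dim_client (
        client_key SERIAL PRIMARY KEY,
        country    TEXT NOT NULL
    );

    CREATE TABLE fact_trades (
        trade_id   BIGSERIAL PRIMARY KEY,
        date_key   INTEGER REFERENCES dim_date (date_key),
        client_key INTEGER REFERENCES dim_client (client_key),
        amount_usd NUMERIC(18, 2) NOT NULL  -- additive measure
    );
    """

    # Connection string is a placeholder; adjust for a real environment.
    with psycopg2.connect("dbname=warehouse user=etl") as conn:
        with conn.cursor() as cur:
            cur.execute(DDL)

Analytical queries then join the fact table to its dimensions, which is what makes the star shape convenient for analysts.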

What’s Good to Have

  • A good background in cybersecurity and data protection
  • Proficiency in using data pipeline and workflow management tools such as Luigi
  • Exposure to maintaining and monitoring database health and resolving errors
  • Experience in managing stakeholders’ expectations and gathering technical requirements
  • Familiarity with container technologies such as Docker
  • A fintech background

Benefits

  • Exciting work challenges
  • Collaborative work environment
  • Career advancement opportunities
  • Market-based salary
  • Annual performance bonus
  • Health benefits
  • Casual dress code
  • Travel and internet allowances
  • Team building events


