Join The Most Talented Data Engineering Team in the Country!

Looking for a change of pace? We have a number of data engineering positions open!

We are a boutique, specialised Data and ML Engineering business, and our team’s capabilities are unmatched.

If you are serious about growing your skills, working with the best and developing at speed – our structure and culture are designed to support and develop the best global talent!

This is why we are on the lookout for A-Players…

We are currently on the lookout for A-player consultants in the Data Engineering space to support our global client base with the skills needed to make AI and digital transformation a reality.

You will be working on fixed-scope, fixed-timeline projects across AWS and GCP for large national and international customers, including banks, airlines, medical aid providers and telecommunications companies.

We also have fully remote, hybrid and office-based (Johannesburg and Cape Town) arrangements available. And we offer opportunities to upskill:

  • We pay for you to take your cloud certifications
  • We allow you to study for cloud certifications on company time
  • We run weekly internal knowledge sharing sessions

If you’re a mid- or senior-level data engineer who is looking for a change of pace, development and direction, see if any of these roles are the right fit for you:

Data Engineering Positions

Mid-Level Data Engineer

Required Experience 

  • Hands-on experience (1+ years) building ETL pipelines on big data using modern cloud data engineering tools:
    • AWS Glue (highly preferred)
    • GCP Dataflow (preferred)
    • Azure Data Factory
  • Object-oriented Python programming (2+ years)
    • Python package development
    • Python class development
    • Pythonic standards across the board
  • Working knowledge of PySpark (preferred) or Apache Beam
    • Efficient use of the PySpark DataFrame API
    • Knowledge of big data file formats (Parquet, ORC, Avro) and partitioning strategies
  • Hands-on experience writing SQL queries (2+ years)
    • ANSI SQL (preferred)
    • PostgreSQL (preferred)
    • Microsoft SQL Server
    • MySQL
    • Oracle
  • Database experience:
    • Amazon Redshift
    • Google BigQuery
    • Or equivalent
  • Infrastructure as code:
    • Terraform (preferred)
    • CloudFormation
    • AWS CDK
  • Hands-on experience with additional cloud services:
    • AWS Lambda (highly important)
    • AWS IAM (highly important)
      • Cross-account roles
      • IAM policies
    • Amazon S3 (highly important)
    • AWS Step Functions (important)
    • Amazon SQS
    • Amazon SNS
    • Amazon DynamoDB
    • Amazon Kinesis is a plus
  • Architectural thinking is a plus:
    • Scalability, security, fault tolerance

Required Qualifications

  • BSc Computer Science/Engineering (or equivalent experience)
  • AWS or GCP Certifications (Associate and above)

Senior Data Engineer

Required Experience 

  • Hands-on experience (3+ years) building ETL pipelines on big data using modern cloud data engineering tools:
    • AWS Glue (highly preferred)
    • GCP Dataflow (preferred)
    • Azure Data Factory
  • Object-oriented Python programming (5+ years)
    • Python package development
    • Python class development
    • Pythonic standards across the board
  • Working knowledge of PySpark (preferred) or Apache Beam
    • Efficient use of the PySpark DataFrame API
    • Knowledge of big data file formats (Parquet, ORC, Avro) and partitioning strategies
  • Hands-on experience writing SQL queries (5+ years)
    • ANSI SQL (preferred)
    • PostgreSQL (preferred)
    • Microsoft SQL Server
    • MySQL
    • Oracle
  • Database experience:
    • Amazon Redshift
    • Google BigQuery
    • Or equivalent
  • Infrastructure as code:
    • Terraform (preferred)
    • CloudFormation
    • AWS CDK
  • Hands-on experience with additional cloud services:
    • AWS Lambda (highly important)
    • AWS IAM (highly important)
      • Cross-account roles
      • IAM policies
    • Amazon S3 (highly important)
    • AWS Step Functions (important)
    • Amazon SQS
    • Amazon SNS
    • Amazon DynamoDB
    • Amazon Kinesis is a plus
  • Ability to scope a Data Solution end-to-end
    • Service integrations
    • Drawing architectural diagrams
  • Designing for scalability, security and fault tolerance

Required Qualifications

  • BSc Computer Science/Engineering (or equivalent experience)
  • AWS or GCP Certifications (Associate and above)

Send Us Your CV & Accelerate Your Career!

Do you feel like this position and our culture are a perfect fit for you?

Then don’t hesitate! Get in touch with us.


If we like your CV, you will be invited to a screening interview, followed by one or more technical interviews*. If you make it past the technical interviews, we follow the Topgrading interview process to assess culture fit.

* Please note that your technical interview will include a live coding component.
