AWS Data Engineer (CDE485)

Overview

Reference
CDE485

Salary
£600/day

Job Location
- United Kingdom -- England -- Greater London -- London

Job Type
Contract

Posted
07 August 2025


AWS Data Engineer

Contract

Outside IR35

6 months initially

£600 p/d

Remote Based (Adhoc London Travel Expensed)

ASAP Start


Client - My client is a major global developer and operator of luxury casino resorts, known for iconic properties in Las Vegas, Macau, and Singapore.


Overview

The Data Engineer will work with the product engineering team, focusing on enabling Business Intelligence and Data Science efforts, primarily using AWS-based data technologies. The Engineer will also work closely with the corporate technology team to build, manage, automate and support our AWS infrastructure.

The position demands someone who is highly technically competent, detail-oriented, and driven to stay current with evolving technologies.


Essential Duties & Responsibilities - 

  • Work collaboratively with data consumers, data producers, and regulatory compliance specialists to identify requirements for data solutions that will have executive-level exposure.
  • Design, build, operationalize, and support solutions that enable Business Intelligence and Data Science efforts in the AWS cloud:
    • Data ingestion pipelines
    • ETL/ELT processes (batch and streaming)
    • Curated data products
    • Integrations for third-party tools
  • Support and evolve the data lake, enterprise data catalog, cloud data warehouse, and data processing infrastructure.
  • Provision, configure and maintain AWS services and infrastructure as code with Amazon CDK.
  • Provide recommendations regarding product/vendor selection, technology evolution, and design strategies.
  • Take initiative to identify and seize opportunities to eliminate waste, redundancy, and complexity.
  • Manage a queue of work requests to meet service level objectives and KPIs.
  • Execute incident, problem and change management processes and reporting as needed.

Skills & Experience - 

  • 3+ years working with Spark to build and operate batch and stream processing jobs in a production capacity. Exposure to Ray is a plus.
  • 3+ years working with one or more cloud data warehousing tools like Snowflake, Redshift, BigQuery, or ClickHouse.
  • Mastery of SQL, with significant exposure to HiveQL.
  • Competency in programming with Java, Scala, Python, and TypeScript.
  • Strong understanding of AWS security mechanisms, particularly as they relate to AWS data services like S3, Kinesis, EMR, Glue, and Lake Formation.
  • Experience with GitHub, DataDog, and AWS.



Contact information

Alex Carter
