The original source of this document is https://www.tind.au/resume-ato/
Recent Career Highlights
- Designing and collaboratively building an Iceberg-format data lake and serverless ingestion pipeline at First Mode. Tested to receive 120M events per minute at peak, with ingestion latency of ~165 seconds
- Creating, cultivating and leading a Data Practice at Mechanical Rock that contributed significantly to annual revenue
- Leading and coaching colleagues and customers in Data Architecture, Engineering, Platform Design and Data Transformation
- Launched 4 Customer Journeys on Snowflake with TIND and Mechanical Rock
- Presented at several Snowflake and Perth Data Engineering Community events
Competencies
A sample of tools, technologies and techniques I've developed experience in during my career:
Cloud
- AWS & Azure
- Automation
- Build Pipelines
- Containers
- Data & Analytics
- Key Management
- Linux
- NoSQL & Relational Databases
- Reliability Engineering
- IAM & Security
- Serverless
- Storage
- Streaming
Data
- Airflow
- CDC
- Databricks
- dbt
- dbt Cloud
- De-duplication
- ELT+ETL
- Fivetran
- AWS Glue
- Kimball Design
- Python
- Amazon Redshift & Athena
- Snowflake
- SQL
- Spark: PySpark, Spark UI
Software
- API Development
- Build Systems
- Clean Architecture
- Java / Kotlin
- Microservice Design
- Packaging & Artifacting Dependencies
- React + CSS
- Refactor & Re-platform
- Software Frameworks
- TDD+BDD
- JavaScript, TypeScript, Python, Java, Kotlin, HTML5
Previous Roles
Principal Consultant
2024 - present
Pioneer Credit (13 weeks - subcontract) - Kickstarting the customer's migration from on-premises to cloud
Key Focus Areas:
- Understanding the customer's tooling preferences, engineer experience, and aspirational outcomes
- Designing, documenting and collaboratively implementing Data Integration and Transformation workflows with Fivetran, Azure Data Factory and dbt core
- Analysing the customer's governance requirements; designing and implementing security controls so that RBAC flows through from Azure AD, combined with tag-based column masking and row-level security to safeguard sensitive data
- Continuous collaboration and documentation to embed the solutions in the customer's organisational knowledge
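The tag-based masking described above is enforced server-side by Snowflake, but the policy logic itself is simple to illustrate. A minimal Python sketch of the decision a masking policy makes, with role and tag names invented for this example:

```python
# Simulate the CASE logic of a Snowflake tag-based masking policy:
# unmask for privileged roles, redact PII-tagged columns for everyone else.
# The roles ("PII_READER") and tag ("PII") are hypothetical.

def mask_value(value: str, column_tag: str, current_role: str) -> str:
    """Return the value a reader would see for a given column tag and role."""
    privileged = {"SECURITYADMIN", "PII_READER"}
    if column_tag == "PII" and current_role not in privileged:
        return "*****"  # redacted for non-privileged roles
    return value  # unmasked
```

In Snowflake the equivalent is a `MASKING POLICY` attached to a tag, so every column carrying that tag inherits the behaviour without per-column policy assignments.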
Data Vanguards (2 weeks - subcontract @ Resources client) - Solution Architecture and Analysis
Key Focus Areas:
- Evaluation of various Data Catalog Products for fit to customer requirements
- Solution architecture for self-service transformation; re-platforming an existing Redshift cluster to Redshift Serverless with dbt Cloud
Rent.com.au - Data Platform Maturity Assessment
Key Focus Areas:
- Interviews and solution deep dives to understand the current state and pain points
- Production of assessment report with executive summary, technical assessment, and non-functional appraisal
Principal Data Architect
2022 - 2024
Led and mentored a team of two other DevOps / data engineers to deliver a cloud data platform for a large volume of telemetry data: ingested from S3, loaded to Iceberg tables, transformed with dbt and visualised with Grafana
- Migrated from a single-instance TimescaleDB to an S3 and Iceberg data lake using Athena and dbt
- Created a solution to translate 2000+ Grafana dashboards (and SQL) from TimescaleDB to Athena
- Used an off-the-shelf data loader to format, compress and ship PLC data to the data lake
- Created a solution to generate dbt packages of models to transform 6,000+ telemetry signals into respective domain tables
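The model-generation approach in the last bullet can be sketched loosely: given a list of signal names, emit one dbt model per signal. The signal names, source name and column names below are invented for illustration; the real solution generated whole dbt packages, not just model files.

```python
# Hypothetical sketch: render one dbt model per telemetry signal, so
# thousands of signals each get their own incremental domain table.

def render_model(signal: str, source_table: str = "raw_telemetry") -> str:
    """Render a minimal dbt model that filters one signal into its own table."""
    return (
        "{{ config(materialized='incremental', unique_key='event_ts') }}\n"
        "select event_ts, value\n"
        f"from {{{{ source('lake', '{source_table}') }}}}\n"
        f"where signal_name = '{signal}'\n"
    )

def render_package(signals: list[str]) -> dict[str, str]:
    """Map model file paths to rendered SQL, one model per signal."""
    return {f"models/{s}.sql": render_model(s) for s in signals}
```

Generating the models rather than hand-writing them keeps 6,000+ near-identical transformations consistent and lets a schema change be re-rolled across the whole package.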
Our key innovation was building a serverless Iceberg ingestion pipeline that used Athena to load data with low latency, similar to the Tabular product acquired by Databricks in 2024.
The motivation for us was navigating around a few key constraints:
- The expense of Kinesis Data Firehose for landing data
- The cost and latency of running regular Glue jobs, or the cost of Kafka brokers / Kinesis streams for use with Spark Streaming
- Avoiding over-reliance on AWS products, since we expected to support multiple cloud platforms (Athena can be swapped for Trino)
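The shape of that pipeline is easy to sketch: a serverless function periodically builds an Athena statement that moves newly landed files into the Iceberg table, and submits it via the Athena API (e.g. boto3's `start_query_execution`). Table, column and batch names below are invented; this is an illustration of the pattern, not the production code.

```python
# Hypothetical sketch of the kind of statement the serverless ingestion
# pipeline issued: Athena itself does the Iceberg commit, so no cluster
# or streaming broker is needed. All identifiers are invented.

def build_ingest_query(batch_id: str) -> str:
    """Build an Athena INSERT that loads one landed batch into Iceberg."""
    return (
        "INSERT INTO lake.telemetry_iceberg\n"
        "SELECT signal_name, event_ts, value\n"
        "FROM lake.landing_raw\n"
        f"WHERE batch_id = '{batch_id}'"
    )
```

Because the heavy lifting is a SQL engine rather than AWS-specific plumbing, the same statement could be submitted to Trino on another platform.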
Principal Consultant
2018 - 2022
Completed 15 distinct engagements for 7 clients over 4 years, typically leading teams of 2-4 people.
Highlights include:
- Planned and implemented the integration and rollout of dbt Cloud at a 'Big Australian' Resources client
- Workshops and demonstrations with several clients on the benefits and experience of working with dbt (core)
- Design and implementation of a serverless data integration between on-premises Oracle and a Snowflake Data Warehouse, including schema auto-detection and migration, three years prior to Snowflake's INFER_SCHEMA feature release
- Assisting the cloud platform team at a security conscious insurance company onboard AWS Glue in a constrained environment
Tech Lead
2017 - 2018
- Led a small team to deliver a simplified home loan credit decisioning solution using a mixture of vendor (Experian) and Java products
- Automated the packaging and delivery of vendor artifacts into JAR consumables to simplify integration
- Introduced bespoke JUnit integration testing of the Experian decisioning software platform, run as regression tests in the build pipeline
Java Developer
2011 - 2017
Developed backend Java services and frontend HTML and JavaScript experiences for Business Banking at Bankwest