9 Pound Hammer
Data Engineering & Cloud Architecture
I design and build production data platforms — from orchestration and pipeline development to cloud infrastructure and analytics delivery. I work across the full stack of modern data engineering and multiply my output with agentic AI tooling.
What I Build
I architect end-to-end data systems that move data reliably from source to insight. My work spans pipeline orchestration, cloud infrastructure, data transformation, and the DevOps practices that keep it all running in production.
Pipeline Orchestration
Apache Airflow on Google Cloud Composer — 200+ production DAGs, custom operators and hooks, GCS sensors, dynamic task generation with TaskGroups, Celery distributed execution, and Cloud Run Jobs integration.
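The dynamic task generation mentioned above can be sketched as a minimal DAG definition (source names and callables are hypothetical; assumes Airflow 2.4+ as shipped on recent Composer images):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.task_group import TaskGroup

# Hypothetical source list -- in production this might come from a config
# file or a metadata table, so adding a source adds a pipeline for free.
SOURCES = ["orders", "customers", "inventory"]

def extract(source: str) -> None:
    print(f"extracting {source}")

def load(source: str) -> None:
    print(f"loading {source}")

with DAG(
    dag_id="dynamic_sources",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # One TaskGroup per source, generated at DAG parse time.
    for source in SOURCES:
        with TaskGroup(group_id=f"{source}_pipeline"):
            e = PythonOperator(
                task_id="extract", python_callable=extract, op_args=[source]
            )
            l = PythonOperator(
                task_id="load", python_callable=load, op_args=[source]
            )
            e >> l
```

Because the loop runs at parse time, the Airflow UI shows one collapsible group per source while the pipeline logic lives in a single file.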
Google Cloud Platform
BigQuery, Cloud Composer, GCS, Cloud SQL, Pub/Sub, Dataflow, Dataproc, Cloud Run, Cloud Functions, Cloud Build, GKE, IAM, KMS, DLP, Secret Manager, and Data Catalog.
Data Processing
Apache Beam pipelines on Dataflow, PySpark jobs on Dataproc, BigQuery SQL transformations, dbt Cloud models, and Python ETL with schema management and incremental loading.
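The incremental-loading pattern can be sketched framework-free (the row shape and `updated_at` field are illustrative; in production the watermark would be persisted in BigQuery or a state table):

```python
from typing import Any

def incremental_load(
    rows: list[dict[str, Any]],
    watermark: str,
    ts_field: str = "updated_at",
) -> tuple[list[dict[str, Any]], str]:
    """Return only rows newer than the watermark, plus the new watermark.

    Timestamps are ISO-8601 strings, so lexicographic comparison is safe.
    """
    new_rows = [r for r in rows if r[ts_field] > watermark]
    new_watermark = max((r[ts_field] for r in new_rows), default=watermark)
    return new_rows, new_watermark

# Usage: only the second row is newer than the stored watermark.
rows = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00"},
    {"id": 2, "updated_at": "2024-03-01T00:00:00"},
]
loaded, wm = incremental_load(rows, watermark="2024-02-01T00:00:00")
```

The same idea scales up: each run reads only rows past the stored watermark, then advances it, so reruns are cheap and idempotent.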
DevOps & CI/CD
Cloud Build pipelines, Docker containerization, Kubernetes deployments on GKE, Pytest suites, Nox automation, and environment promotion from dev through production.
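To give a flavor of the Pytest side, a minimal test module (the `parse_partition` helper is hypothetical; a real suite targets the pipeline's own modules and runs in Cloud Build):

```python
from datetime import date

def parse_partition(blob_name: str) -> date:
    """Extract the dt=YYYY-MM-DD partition from a GCS blob name.

    Hypothetical helper standing in for real pipeline code under test.
    """
    part = next(p for p in blob_name.split("/") if p.startswith("dt="))
    return date.fromisoformat(part.removeprefix("dt="))

# Pytest discovers test_* functions automatically; run with `pytest -q`.
def test_parse_partition_happy_path():
    assert parse_partition("raw/orders/dt=2024-03-01/part-0.avro") == date(2024, 3, 1)

def test_parse_partition_nested():
    assert parse_partition("a/b/dt=2023-12-31/x") == date(2023, 12, 31)
```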
Data Storage & Integration
BigQuery, Cloud SQL (PostgreSQL), Cloud Spanner, Bigtable, Firestore, and GCS data lakes. REST API integrations, Pub/Sub event streaming, Kafka consumers, and third-party platform connectors.
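For the REST API integrations, the retry policy matters as much as the call itself. A framework-free sketch of capped exponential backoff (names are hypothetical; production code would add auth and jitter):

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def call_with_retries(
    fn: Callable[[], T],
    attempts: int = 4,
    base: float = 0.5,
    cap: float = 8.0,
    sleep: Callable[[float], None] = time.sleep,
) -> T:
    """Call fn(), retrying on exception with capped exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the last error
            sleep(min(cap, base * 2**attempt))  # 0.5s, 1s, 2s, ...
    raise RuntimeError("unreachable")

# Usage: a flaky call that fails twice, then succeeds on the third try.
calls = {"n": 0}
def flaky() -> str:
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = call_with_retries(flaky, sleep=lambda s: None)
```

Injecting `sleep` keeps the policy unit-testable without real waits, which is the same trick the test suites above rely on.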
GCP AI & ML Services
Vertex AI, Cloud Vision, Speech-to-Text, Translation, Natural Language API, Video Intelligence, and AutoML. Integrating AI services into data pipelines at scale.
Web Scraping & Automation
Scrapy spiders, Selenium on Kubernetes (headless Chrome clusters on GKE), BeautifulSoup, and browser automation pipelines orchestrated through Airflow.
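The extraction step of a scraping pipeline, sketched with only the standard library (BeautifulSoup and Scrapy wrap this kind of parsing; the sample HTML is made up):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href attributes from anchor tags in an HTML document."""

    def __init__(self) -> None:
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

parser = LinkExtractor()
parser.feed('<ul><li><a href="/a">A</a></li><li><a href="/b">B</a></li></ul>')
```

In the real pipelines this sits behind Selenium-rendered pages on GKE, with Airflow handling scheduling and retries.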
Monitoring & Data Quality
Cloud Logging, Cloud Monitoring, OpenLineage for data lineage tracking, SLA enforcement in Airflow, and automated data quality validation across pipelines.
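A minimal shape for rule-based data quality validation (rule names and thresholds are illustrative; in production a nonzero failure count would alert via Cloud Monitoring):

```python
from typing import Any, Callable

Rule = Callable[[dict[str, Any]], bool]

# Illustrative rules -- real pipelines load these per-dataset.
RULES: dict[str, Rule] = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

def validate(rows: list[dict[str, Any]]) -> dict[str, int]:
    """Return a failure count per rule across all rows."""
    failures = {name: 0 for name in RULES}
    for row in rows:
        for name, rule in RULES.items():
            if not rule(row):
                failures[name] += 1
    return failures

rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": -5.0},
]
report = validate(rows)
```

Keeping rules as plain predicates makes them trivial to unit-test and to report on per rule rather than as one opaque pass/fail.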
AI-Augmented Engineering
I use agentic coding tools — Claude Code, Cursor, and similar AI-native development environments — as a core part of my engineering workflow, not as a novelty. They are a force multiplier: building these tools into the workflow cuts delivery cycle time without cutting corners on quality.
Faster Delivery
Agentic AI handles boilerplate, test generation, and repetitive refactoring so I can focus on architecture decisions and business logic. Projects that used to take weeks get shipped in days.
Higher Accuracy
AI-assisted code review catches edge cases and anti-patterns early. I pair with agents the way senior engineers pair with each other — constant feedback, fewer blind spots.
Broader Reach
When I need to work across unfamiliar frameworks or languages, agentic tools compress the ramp-up time dramatically. One engineer, full-stack capability, production-grade output.
Let's Work Together
If you need a data platform built right — or an existing one fixed — get in touch.