01.

about_me

// who I am

Data engineer with a background in BI and analytics, now focused on building the infrastructure that makes data actually useful. I like clean architecture, scalable pipelines, and systems that still make sense at 3am. Currently based in Berlin, always looking for the next interesting data problem to solve.

// what I focus on

I build data pipelines and analytics systems that are reliable, readable, and built to last. Clean transformations in dbt. Well-orchestrated workflows in Airflow. Data that makes sense by the time it reaches the people who need it. I care about the whole journey, from raw source to analytics-ready dataset. Not just making it work, but making it maintainable.
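To make that concrete, here is a minimal, self-contained sketch of the kind of defensive normalization I mean, the Python equivalent of a dbt staging model. The field names and sample rows are hypothetical, not client data:

```python
from datetime import date
from typing import TypedDict


class OrderRow(TypedDict):
    order_id: str
    amount_eur: float
    order_date: date


def clean_orders(raw_rows: list[dict]) -> list[OrderRow]:
    """Turn messy source rows into typed, analytics-ready records.

    Drops rows with a missing key or an unparseable amount/date,
    the same guardrails a dbt staging model would express in SQL.
    """
    cleaned: list[OrderRow] = []
    for row in raw_rows:
        order_id = str(row.get("order_id", "")).strip()
        if not order_id:
            continue  # no stable key, skip the row
        try:
            # tolerate European decimal commas in the raw export
            amount = float(str(row["amount"]).replace(",", "."))
            order_date = date.fromisoformat(str(row["order_date"]))
        except (KeyError, ValueError):
            continue  # unparseable amount or date
        cleaned.append(
            {"order_id": order_id, "amount_eur": amount, "order_date": order_date}
        )
    return cleaned


raw = [
    {"order_id": " A-1 ", "amount": "19,99", "order_date": "2024-03-01"},
    {"order_id": "", "amount": "5.00", "order_date": "2024-03-02"},     # no key
    {"order_id": "A-2", "amount": "oops", "order_date": "2024-03-03"},  # bad amount
]
print(clean_orders(raw))
```

The point is not the Python itself but the habit: validate at the boundary, so every layer downstream can trust its inputs.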

4+ years in data
2+ pipelines in prod
8+ dashboards created
0 weekend incidents (lately)
02.

featured_projects

Financial portfolio analysis dashboard

End-to-end financial data platform: Yahoo Finance and FRED data ingestion into GCS, BigQuery warehousing with dbt transforms, and a Streamlit analytics dashboard. Built during the final project week of a data engineering bootcamp.

BigQuery Airflow dbt Python
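As a flavor of the dashboard layer, here is a tiny, hypothetical sketch (not the project's actual code) of the kind of metric the Streamlit front end computes from the warehoused price data:

```python
def daily_returns(prices: list[float]) -> list[float]:
    """Simple percentage returns from a closing-price series."""
    if len(prices) < 2:
        return []
    return [
        (curr - prev) / prev
        for prev, curr in zip(prices, prices[1:])
    ]


# illustrative closes, not real market data
closes = [100.0, 102.0, 99.96]
print([round(r, 4) for r in daily_returns(closes)])  # [0.02, -0.02]
```

In the real project this logic lives as dbt models in BigQuery; the dashboard only reads the finished tables.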
03.

tech_stack

ingestion

Airbyte

transformation

dbt
Apache Spark
SQL / dbt macros

orchestration

Apache Airflow

storage

Snowflake
BigQuery

infrastructure

Terraform
Docker
GCP

languages

Python
SQL
Bash / Shell
YAML / Jinja
04.

work_experience

Apr 2023 – Jun 2025
Business Intelligence Analyst
@ Klickrent GmbH · Berlin, DE
  • Built and maintained ETL pipelines consolidating ~2M rows from SAP, Salesforce, SQL databases, and CSV sources into a unified analytics layer, reducing data prep time by ~30%.
  • Developed 7+ KPI dashboards in CRM Analytics and Salesforce Reporting for Management, Finance, and Sales.
  • Defined data-structuring standards and ETL norms, and trained 1–3 users on CRM Analytics, cutting ad-hoc requests by ~40%.
Nov 2020 – Mar 2023
IT Project Manager
@ nxt gen digital · Berlin, DE
  • Implemented Xentral and Haufe X360 (Acumatica) ERP systems for 5+ SME retail clients.
  • Built Make (Integromat) automation workflows integrating client ERPs with the DHL portal, saving ~8 hrs/week per client.
  • Extracted and validated data via SQL queries and REST APIs across multiple client environments.