```python
class Aira:
    def __init__(self):
        self.curiosity = float("inf")
        self.stack = [
            "Git", "Python", "SQL", "RDBMS", "Data Modelling",
            "Data Visualization", "Big Data", "Practical Cloud Development",
            "NoSQL", "Data Warehouse Lifecycle", "AI Ops",
        ]
        self.institution = "Stockholms Tekniska Institut"

    def iterate(self):
        return "Always learning. Always building. Data Engineering skills in progress."

    def study(self):
        return f"Currently studying Data Engineering at {self.institution}."

me = Aira()
```

- 🦆 Relational database and Evidence dashboard built on the Sakila database with DuckDB and Pandas
- 🧮 Object-Oriented Programming in Python
- 🏫 Data Modelling for a school system: Relational Database, 3NF, PostgreSQL, Docker
- Develop applications in Python with a focus on clear structure, modularization, and reuse.
- Work within isolated virtual environments, manage dependencies, and prioritize reproducible execution.
- Apply a testing mindset and deterministic workflows to ensure reliable outcomes; a minimal test sketch follows below.
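
A minimal sketch of that testing mindset, with hypothetical function and data names: a pure transformation paired with a deterministic pytest-style test.

```python
def normalize_names(names: list[str]) -> list[str]:
    """Trim whitespace and title-case each name; pure, so trivially testable."""
    return [n.strip().title() for n in names]


def test_normalize_names():
    # Deterministic: the same input always yields the same output.
    assert normalize_names(["  ada lovelace ", "ALAN TURING"]) == [
        "Ada Lovelace",
        "Alan Turing",
    ]
```
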
- Use SQL to transform, validate, and analyze data. Write efficient joins, aggregations, and window operations.
- Leverage queries to verify data integrity and support analytical requirements (see the DuckDB sketch below).
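
For instance, a running total per customer is a single window query. The sketch below uses DuckDB's Python API with made-up table and column names:

```python
import duckdb

con = duckdb.connect()  # in-memory database
con.execute("CREATE TABLE payments (customer_id INTEGER, paid_at DATE, amount DECIMAL(8, 2))")
con.execute("""
    INSERT INTO payments VALUES
        (1, DATE '2024-01-05', 9.99),
        (1, DATE '2024-02-05', 4.99),
        (2, DATE '2024-01-12', 2.99)
""")

# Window function: running total per customer, ordered by payment date.
rows = con.execute("""
    SELECT customer_id, paid_at,
           SUM(amount) OVER (PARTITION BY customer_id ORDER BY paid_at) AS running_total
    FROM payments
    ORDER BY customer_id, paid_at
""").fetchall()
print(rows)
```
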
- Design data structures across conceptual, logical, and physical levels.
- Apply normalization principles with emphasis on 3NF, primary and foreign keys, and constraint management.
- Distinguish between transactional and analytical workloads and model accordingly using OLTP and OLAP patterns; a 3NF sketch follows below.
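
As a concrete (hypothetical) OLTP example, here is a 3NF schema with primary and foreign keys, created through DuckDB:

```python
import duckdb

con = duckdb.connect()

# 3NF: every non-key attribute depends on the key, the whole key,
# and nothing but the key; relationships are enforced via constraints.
con.execute("""
    CREATE TABLE student (
        student_id INTEGER PRIMARY KEY,
        full_name  VARCHAR NOT NULL
    )
""")
con.execute("""
    CREATE TABLE course (
        course_id INTEGER PRIMARY KEY,
        title     VARCHAR NOT NULL UNIQUE
    )
""")
con.execute("""
    CREATE TABLE enrollment (
        student_id  INTEGER REFERENCES student (student_id),
        course_id   INTEGER REFERENCES course (course_id),
        enrolled_on DATE NOT NULL,
        PRIMARY KEY (student_id, course_id)
    )
""")
```
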
- Work primarily with relational databases such as PostgreSQL and DuckDB.
- Understand schema design, data typing, and governance fundamentals.
- Maintain awareness of document-oriented storage concepts and their use cases.
- Implement ETL and ELT workflows for batch ingestion and transformation.
- Build repeatable pipelines that enforce data quality and traceability.
- Structure processing steps to support downstream analytical consumption (see the pipeline sketch below).
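
A batch pipeline in that spirit, with hypothetical file paths and column names (pandas assumed; `to_parquet` needs pyarrow or fastparquet installed):

```python
import pandas as pd


def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    out = raw.dropna(subset=["order_id"]).copy()
    out["amount"] = out["amount"].astype(float)
    return out


def validate(df: pd.DataFrame) -> pd.DataFrame:
    # Quality gates fail the batch fast instead of polluting downstream layers.
    assert df["order_id"].is_unique, "duplicate order_id in batch"
    assert (df["amount"] >= 0).all(), "negative amount in batch"
    return df


def load(df: pd.DataFrame, target: str) -> None:
    df.to_parquet(target, index=False)


if __name__ == "__main__":
    load(validate(transform(extract("orders.csv"))), "orders.parquet")
```
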
- Understand the lifecycle of a data warehouse from raw ingestion to curated layers.
- Design analytics-ready datasets using fact and dimension thinking.
- Apply introductory dimensional modeling principles to support reporting needs, as in the star-schema sketch below.
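
A minimal star-schema sketch (invented dimension and fact names), again via DuckDB:

```python
import duckdb

con = duckdb.connect()

# One dimension, one fact; the fact references the dimension by surrogate key.
con.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name VARCHAR, country VARCHAR)")
con.execute("""
    CREATE TABLE fact_sales (
        customer_key INTEGER REFERENCES dim_customer (customer_key),
        sale_date    DATE,
        amount       DECIMAL(10, 2)
    )
""")

# Typical reporting cut: slice the fact by a dimension attribute.
report = con.execute("""
    SELECT c.country, SUM(f.amount) AS revenue
    FROM fact_sales AS f
    JOIN dim_customer AS c USING (customer_key)
    GROUP BY c.country
    ORDER BY revenue DESC
""").df()
```
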
- Develop containerized solutions and think in terms of services rather than scripts.
- Expose data through APIs and validate payloads against explicit contracts (see the contract sketch below).
- Practice ownership across development and operational concerns.
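
One way to express such a contract is a Pydantic model (Pydantic v2 API assumed; the field names are illustrative):

```python
from datetime import date

from pydantic import BaseModel, PositiveFloat


class Payment(BaseModel):
    # The model is the contract: records crossing the service boundary
    # must parse and validate, or the request is rejected.
    customer_id: int
    paid_at: date
    amount: PositiveFloat


record = Payment.model_validate({"customer_id": 1, "paid_at": "2024-01-05", "amount": 9.99})
```
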
- Perform exploratory data analysis to uncover trends and patterns.
- Translate data into business-facing insights.
- Deliver results through structured reports and interactive dashboards; the pandas sketch below shows a typical cut.
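
For example, a revenue-by-month-and-region aggregate in pandas (toy data) is the kind of view a dashboard would chart:

```python
import pandas as pd

orders = pd.DataFrame({
    "month":  ["2024-01", "2024-01", "2024-02", "2024-02", "2024-02"],
    "region": ["North", "South", "North", "North", "South"],
    "amount": [120.0, 80.0, 150.0, 90.0, 60.0],
})

# Aggregate to a business-facing view: revenue per month and region.
trend = (
    orders.groupby(["month", "region"], as_index=False)["amount"]
          .sum()
          .rename(columns={"amount": "revenue"})
)
print(trend)
```
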
- Understand distributed processing concepts and the trade-offs of platform choices.
- Build awareness of storage and compute considerations in modern data architectures.
- Use version control in collaborative environments.
- Produce structured technical documentation and convert business requirements into implementable solutions.
- Emphasize maintainability, clarity, and scalability while working within agile frameworks.
- 🗻 Enjoys hiking mountains (the higher, the better)
- 🏂🏻 Snowboarding in POWDER, preferably off-piste
- ⛺ Camping by the beach
- ☕ Well-roasted coffee
- 🍜 Cooking and sharing food
- 👩🏻‍💻 Always learning. Keep on going.
