About Me
Hi 👋🏼, my name is Felix Uellendall. I am a passionate Software Engineer 🧑🏼‍💻 building several tools for the Airflow ecosystem. I have four years of experience as a Data Engineer and have developed a lot of Software Engineering expertise along the way. I have been an Apache Airflow committer since September 2019.
Tools & Services
Sorted by Expertise
Badges created via auto-markdown-badges
Industrial Knowledge
Data Engineering
Software Engineering
Unit Testing
Orchestration
Batch Processing
Data Lake
Data Ingestion
CI/CD
Data Infrastructure
Distributed Systems
ETL/ELT
Monitoring
Alerting
Data Observability
Stream Processing
Languages
- German 🇩🇪 (Native speaker)
- English 🇬🇧 (Upper-intermediate)
Projects
"Enforce Best Practices for all your Airflow DAGs. ⭐" - Felix Uellendall
A personal project to enforce best practices for all your Airflow DAGs. Airflint lints your Airflow DAGs
and reformats them to comply with Airflow best practices. It consists of rules that, for example, replace
top-level imports with function-level imports, or detect where Jinja syntax can be used instead of calling Variable.get(),
which makes Airflow much more performant and stable.
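To illustrate the function-level-imports rule, here is a minimal sketch (not airflint's actual implementation) of how a linter can spot top-level imports in a DAG file. Airflow re-parses DAG files frequently, so imports that are only needed inside a task callable are cheaper when moved into the function body:

```python
import ast

# Example DAG source (module names are illustrative only).
DAG_SOURCE = '''
import pandas  # top-level: parsed on every DAG-file reload

def transform():
    import numpy  # function-level: only paid when the task runs
    return numpy
'''

def find_top_level_imports(source: str) -> list[str]:
    """Return the module names imported at module level."""
    tree = ast.parse(source)
    names = []
    for node in tree.body:  # only direct children => top level
        if isinstance(node, ast.Import):
            names.extend(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom):
            names.append(node.module or "")
    return names

print(find_top_level_imports(DAG_SOURCE))  # ['pandas']
```

The real tool goes further and rewrites the file; this sketch only shows the detection half of the idea.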
"Auto-generated Diagrams from Airflow DAGs. 🔮 🪄" - Felix Uellendall
A personal project to auto-generate diagrams from Apache Airflow DAGs. Airflow DAGs specify workflows consisting of tasks that can depend on each other. Based on these tasks and dependencies, a diagram is generated. The diagram visualises the providers (often cloud-based) used in the Airflow DAGs by simply displaying their logos.
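A toy sketch of the underlying idea: take tasks and their dependency edges and render them as a graph description. The real project maps Airflow operators to provider logos; this stripped-down version (task names invented) just emits Graphviz DOT text:

```python
def to_dot(dependencies: dict[str, list[str]]) -> str:
    """Render a task-dependency mapping as a Graphviz DOT digraph."""
    lines = ["digraph dag {"]
    for upstream, downstreams in dependencies.items():
        for downstream in downstreams:
            # One edge per upstream -> downstream dependency.
            lines.append(f'  "{upstream}" -> "{downstream}";')
    lines.append("}")
    return "\n".join(lines)

# Hypothetical DAG: extract from S3, transform, load into a warehouse.
deps = {"extract_s3": ["transform"], "transform": ["load_warehouse"]}
print(to_dot(deps))
```

Feeding the resulting DOT text to Graphviz's `dot` tool would produce the picture; swapping plain boxes for provider logos is the extra step the actual project takes.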
"Auto-generated markdown badges. 🧙🏼" - Felix Uellendall
A personal project to auto-generate markdown badges from words and links. Inspired by markdown-badges, I wanted to have a tool which automatically creates badges for me.
"A kind data platform on your local machine. 🤏" - Felix Uellendall
A personal project for building a data platform fully on your local machine via kind (Kubernetes in Docker), shipped with tools such as Apache Airflow, dbt, Apache Superset, and a lot more.
"Airflow is a platform to programmatically author, schedule and monitor workflows." - Apache Airflow
While searching for a solution to manage data pipelines, I stumbled upon the open-source project Apache Airflow in January 2018. I really appreciate its simplicity of use, its high coverage of use cases, and the community behind it. That's why I decided to give something back: I started contributing and am now a committer on the project.
Experience
"Modern Data Orchestration - Build, run, and observe data pipelines-as-code with Astro, the essential data orchestration platform powered by Apache Airflow™." - Astronomer
For Astronomer I am..
- iterating and improving on Airflow's developer and DAG authoring experiences
- adding new third-party integrations (aka "Providers") to Apache Airflow
- providing technical direction, mentorship, pairing opportunities, and code reviews to encourage the growth of others
- improving code quality, test coverage, and reducing test runtime
Improving Apache Airflow ❤️. Building Tools around Apache Airflow 🛠️.
"Make your money work for you." - Trade Republic Bank GmbH
For Trade Republic I am..
- building a data platform on AWS from scratch with Terraform, Kubernetes (EKS, Fargate), Snowflake, Airflow (MWAA), dbt, DMS, and more
- maintaining Airflow pipelines for data ingestion
- onboarding data scientists and analytics engineers to the data platform
- scaling services to keep up with the rapidly growing organization, from serving 7 to about 150 engineers
- planning, coordinating and implementing the migration from self-hosted Airflow to MWAA
- building an Airflow PR development workflow on a remote Airflow instance that scales to hundreds of engineers developing concurrently
Scaling Users 🧑‍🔬. Scaling Tools & Services 🛠️. Scaling Infrastructure 📡.
"The World's Largest Open Source Foundation" - Apache Software Foundation
For Apache Airflow I am..
- writing project and code documentation
- writing unit tests in python
- adding new features like Airflow hooks and operators
- reporting bugs via Jira or GitHub Issues
- communicating with the community via email and Slack
- writing Airflow Improvement Proposals in Confluence
- reviewing GitHub pull requests
- helping users to join the community
- testing and voting on releases
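A generic sketch of the unit-testing pattern mentioned above, as pytest would pick it up (the helper under test is made up for illustration; real tests live in the Airflow repository): call a small pure function with a known input and assert on the result.

```python
def render_task_key(dag_id: str, task_id: str) -> str:
    """Hypothetical helper joining a DAG id and a task id."""
    return f"{dag_id}.{task_id}"

def test_render_task_key():
    # Arrange/act/assert on a deterministic input.
    assert render_task_key("example_dag", "extract") == "example_dag.extract"

test_render_task_key()  # pytest would discover and run this automatically
```

Keeping logic in small pure functions like this is what makes such tests cheap to write and fast to run.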
Open Source is Love ❤️. Communication is Key 💬. Commitment 🚀.
Digitas Pixelpark GmbH
https://www.digitaspixelpark.com/
Junior Data Engineer
January 2018 - November 2020
"Germany's most impactful customer experience agency" - Digitas Pixelpark GmbH
- built a data management platform on AWS for our data analysts and scientists to access analytics data efficiently
- added CI/CD to our DMP (data management platform) via GitLab CI/CD which lints, tests, builds documentation and deploys our code to a development or production environment
- built (ETL/ELT) data pipelines with Apache Airflow
- connected new data sources (mostly via REST APIs) to our data platform
- transformed data via SQL or Python Pandas to be analytics-ready
- designed workflows efficiently by making use of many Apache Airflow features
- monitored our data pipelines
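The "transformed data via SQL" bullet can be illustrated with a tiny self-contained example using the stdlib sqlite3 module (the production work targeted a cloud data platform; table and column names here are invented):

```python
import sqlite3

# In-memory database standing in for a warehouse staging area.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 7.5)],
)

# Aggregate raw events into an analytics-ready per-user summary.
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM raw_events "
    "GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [('a', 15.0), ('b', 7.5)]
```

The same shape of transformation, pushed down as SQL or expressed in Pandas, is what turned raw ingested data into analytics-ready tables.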
Containers are awesome 🐳. Automation 🔁. Cloud Services ☁️. Leading 👨‍🏫.
"Our mission is to enable every SMB organization to achieve operational agility with visual scheduling." - NETRONIC Software GmbH
- built a .NET Windows Forms application in C#
- learned that code readability is important, through refactoring and mentorship
- learned the efficient usage of version control systems
- built a web application accessing Microsoft Azure Active Directory via a REST API to manage users for one of our products
- built a hybrid mobile application to let our customers easily monitor the status of their machines
Code can be elegant 💎. Teamwork 💪. Coding 💻.
Education
OSZ IMT
https://www.oszimt.de/
Shortened apprenticeship for software development
September 2016 - January 2018
Oberstufenzentrum Informations- und Medizintechnik
During my apprenticeship I worked for Publicis Pixelpark GmbH. There I built..
- a REST API with the Spring Framework in Java
- a web application with Node.js, Express.js and Handlebars
- a command line application for our data team for automated transfer of social listening data
FH Aachen
https://www.fh-aachen.de/
ICT with a specialisation in application development [Incomplete]
August 2013 - August 2016
Aachen University of Applied Sciences
I learned about..
- the basics of programming in C++
- several other programming languages like Assembly, Prolog and Matlab
- algorithms and data structures
- IT security and forensics
- databases and web technologies
- computer systems architectures, operating systems and distributed systems
Bk GuT
https://www.bkgut.de/
IT assistant and advanced technical college entrance qualification
August 2010 - August 2013
Berufskolleg für Gestaltung und Technik
I learned about..
- the basics of programming in C# and Java
- web development with PHP and MySQL
A Little More About Me
I love creating things, solving puzzles, thinking about code, and tackling challenging problems. Open source is what I enjoy most. I like working in the community and sharing my knowledge with others.
I have also fallen in love with nature and the silence of being out there. I like hiking and playing the piano, and I am fascinated by the universe.