Just a week ago I did a presentation on Docker and why it's so incredibly useful as a Data (Engineer, Scientist, etc.).
This midsummer my friends gave me the idea that I should generate Swedish Drinking Songs, or Snapsvisor, using Machine Learning, and I thought it could be a lot of fun! 🍻
To achieve the best results I'd need access to GPT-3, or an equivalent model. Alas, I don't, so I needed to do some extra work. Fun work, though! 🤓
First, some examples:
Just a week ago I did a presentation on Knowledge Distillation and how it can help improve the efficiency of models.
In this post I walk through Self-Attention Transformers from scratch, with demos at the end for Text Classification & Generation, where the PyTorch code is wrapped by fast.ai to simplify the end-to-end workflow.
I did a presentation/workshop on Object Detection using Transformers. It was a bit tricky, as the experience of the attendees varied widely, some not even knowing Python.
This is a presentation I did on Transfer Learning.
In this post I improve the previous FAQ search engine by picking some low-hanging fruit. The requirements stay the same, so SotA is not achieved; rather, it's simply generic and easy on hardware (Raspberry Pi capable).
I decided to scratch a small itch I've had for a while now - creating a search engine using an unsupervised approach. The final product, or rather the first iteration, turned out pretty well, and I wanted to share what I've done so far.
I've set a goal to create one blog post per Competence Meeting I've held at AFRY, to spread the knowledge further. This goal also covers all the older meetings; my hope is that I'll be finished before summer 2020, but we'll see.