Oxen.ai Blog

Welcome to the Oxen.ai blog 🐂

The team at Oxen.ai is dedicated to helping AI practitioners go from research to production. To help enable this, we host a research paper club on Fridays called Arxiv Dives, where we go over state-of-the-art research and how you can apply it to your own work.

Take a look at our Arxiv Dives and Practical ML Dives, as well as a treasure trove of content on how to go from raw datasets to production-ready AI/ML systems. We cover everything from prompt engineering, fine-tuning, computer vision, natural language understanding, and generative AI to data engineering and best practices for versioning your data. So dive in and explore – we're excited to share our journey and learnings with you 🚀

Arxiv Dives - How Mistral 7B works

What is Mistral 7B? Mistral 7B is an open-weights large language model by Mistral.ai that was built for performance and efficiency. It outshines models that are twice its size, i...

Greg Schoeninger
Dec 23, 2023
- Arxiv Dives
10 min read
Practical ML Dive - How to train Mamba for Question Answering

What is Mamba 🐍? There is a lot of hype about Mamba being a fast alternative to the Transformer architecture. The paper released in December of 2023 claims 5x faster throughput w...

Greg Schoeninger
Dec 21, 2023
- Practical ML
22 min read
Mamba: Linear-Time Sequence Modeling with Selective State Spaces - Arxiv Dives

What is Mamba 🐍? Mamba at its core is a recurrent neural network architecture that outperforms Transformers with faster inference and improved handling of long sequences of len...

Greg Schoeninger
Dec 15, 2023
- Arxiv Dives
15 min read
Practical ML Dive - How to customize a Vision Transformer on your own data

Welcome to Practical ML Dives, a spin-off series of Arxiv Dives. In Arxiv Dives, we cover state-of-the-art research papers and dive into the nitty-gritty details of how AI model...

Greg Schoeninger
Dec 14, 2023
- Arxiv Dives
20 min read
Arxiv Dives - Zero-shot Image Classification with CLIP

CLIP explores the efficacy of learning image representations from scratch with 400 million image-text pairs, showcasing zero-shot transfer capabilities across diverse computer visi...

Greg Schoeninger
Dec 8, 2023
- Arxiv Dives
14 min read
How NOT to store unstructured machine learning datasets

Training data is typically the most valuable part of any machine learning project. As we converge on model architectures like the transformer that perform well on many tasks, it is...

Greg Schoeninger
Dec 8, 2023
6 min read
🧼 SUDS - A Guide to Structuring Unstructured Data

At Oxen.ai we value high quality datasets. We have many years of experience training and evaluating models, and have seen many interesting data formats. Interesting is something we...

Greg Schoeninger
Dec 8, 2023
12 min read
Arxiv Dives - Vision Transformers (ViT)

With all of the hype around Transformers for natural language processing and text, the authors of this paper ask the question - can we apply self-attention and Transformers to imag...

Greg Schoeninger
Dec 1, 2023
- Arxiv Dives
13 min read
Reading List For Andrej Karpathy’s “Intro to Large Language Models” Video

Andrej Karpathy recently released an hour long talk on “The busy person’s intro to large language models” that had some great tidbits whether you are an expert in machine learning ...

Greg Schoeninger
Nov 27, 2023
11 min read
Arxiv Dives - A Mathematical Framework for Transformer Circuits - Part 2

Every Friday at Oxen.ai we host a paper club called "Arxiv Dives" to make us smarter Oxen 🐂 🧠. We believe diving into the details of research papers is the best way to build fund...

Greg Schoeninger
Nov 21, 2023
- Arxiv Dives
16 min read