Building the ML Stack of the Future

Ethan Steininger
3 min read · Feb 1, 2023

If you’ve read my prediction on which companies will vertically integrate the AI stack, you may have your own guesses.

But the real question is, what do you do with that information?

In this article, you’ll build an end-to-end application that uses the following technologies:

Quick Definitions

Model Hub: A repository of self-contained deep learning models pre-trained for a wide variety of applications. They are typically a low-friction way to integrate AI into your code with just a Python import.
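To illustrate the “just a Python import” point, here’s a minimal sketch using Hugging Face’s transformers pipeline (the sentiment-analysis task is just an illustrative example, not part of the app we’re building):

from transformers import pipeline

# Downloads a pre-trained model from the hub on first use, then runs locally
classifier = pipeline("sentiment-analysis")
print(classifier("Serverless GPUs make ML deployment painless"))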

Serverless GPU: On-demand access to powerful GPU resources. GPUs matter for ML workloads because they excel at the high-volume linear algebra that deep learning depends on.

ACID-Compliant Database: An ACID (Atomicity, Consistency, Isolation, Durability) compliant database provides strong guarantees for data integrity and consistency.
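To make “strong guarantees” concrete: in MongoDB (which we’ll use below), multi-document writes can be wrapped in a transaction so they either all commit or all roll back. A minimal sketch, assuming a replica set or Atlas cluster and two hypothetical collections:

import pymongo

mdb_url = ""
client = pymongo.MongoClient(mdb_url)

# Either both inserts commit, or neither does (atomicity)
with client.start_session() as session:
    with session.start_transaction():
        client['db']['prompts'].insert_one({"prompt": "Steve Jobs portrait"}, session=session)
        client['db']['images'].insert_one({"image_path": "s3://..."}, session=session)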

Vector Engine: A vector search engine enables efficient and accurate search and retrieval of high-dimensional data. It does this by storing vector representations (embeddings) of your data, which capture semantic relationships rather than exact keyword matches. More here: https://vectorsearch.dev/
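The core idea in a few lines: texts with similar meaning map to vectors that point in similar directions. Here’s a minimal sketch, where embed() is a hypothetical stand-in for whatever model produces the embeddings:

import numpy as np

def cosine_similarity(a, b):
    # Scores near 1.0 mean the two vectors (and the texts behind them) are semantically close
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# embed() is a placeholder for any text or image embedding model
query_vec = embed("Steve Jobs portrait")
doc_vec = embed("Surrealist painting of the Apple founder")
print(cosine_similarity(query_vec, doc_vec))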

Application Search: This is where the business value is presented. All the technology under the hood is useless if it doesn’t generate value for the end-user.

What are we building?

We’ll be building an application that generates AI portraits of deceased celebrities. It stores them in a database and makes them searchable via vector search:

https://wallprints.io

Don’t try to purchase a print; it’s not ready :)

Model Hub

https://huggingface.co/stabilityai/stable-diffusion-2-1

import torch
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler

prompt = "Steve Jobs portrait in Surrealism style"

def create_image():
    # Load the pre-trained Stable Diffusion 2.1 weights from the Hugging Face Hub
    model_id = "stabilityai/stable-diffusion-2-1"
    pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
    pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)
    pipe = pipe.to("cuda")

    # Generate an image from the text prompt and save it locally
    image = pipe(prompt).images[0]
    image.save("steve_jobs.png")
Generated using Stable Diffusion on HuggingFace

Serverless GPU

https://modal.com/docs/guide

import modal

stub = modal.Stub(name="")

@stub.function
def create_image():
    # Same generation logic as above, now running on Modal's remote infrastructure
    ...
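To actually get a GPU attached, Modal lets you declare the hardware and dependencies on the decorator. A minimal sketch, assuming Modal’s image and gpu function parameters and a hypothetical app name (check the docs above for the current API):

import modal

stub = modal.Stub(name="wallprints")  # hypothetical app name

# Container image with the libraries the generation code needs
image = modal.Image.debian_slim().pip_install("diffusers", "transformers", "accelerate", "torch")

@stub.function(image=image, gpu="A10G")
def create_image():
    # ...same Stable Diffusion pipeline as above, now on an on-demand GPU
    ...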

ACID-Compliant Database

https://pymongo.readthedocs.io/en/stable/api/pymongo/collection.html

import pymongo

...

mdb_url = ""
# Subscripting MongoClient twice returns a handle to the target collection
mdb_client = pymongo.MongoClient(mdb_url)['db']['collection']

# Store the prompt alongside a pointer to the generated image
mdb_client.insert_one({
    "prompt": prompt,
    "image_path": file_url
})

You do need to store the file itself in a cloud object store for reference; I’ll use S3:

import boto3
import requests

s3_client = boto3.client("s3")
bucket_name = ""

def save_to_s3(url, file_id):
    # Download the generated image, then upload it to S3 under a versioned key
    r = requests.get(url, stream=True)

    s3_client.put_object(
        Body=r.content,
        Bucket=bucket_name,
        Key=f"v1/{file_id}.png"
    )

https://github.com/mixpeek/pdf-search-s3/blob/master/insert.py

Vector Engine

We’ll use MongoDB change streams (the collection’s watch() method) to copy every insert from MongoDB into Weaviate, keeping the vector index up-to-date.

https://weaviate.io/blog/how-to-build-an-image-search-application-with-weaviate

import weaviate

client = weaviate.Client(weaviate_url)

# Only react to newly inserted documents
pipeline = [{'$match': {'operationType': 'insert'}}]

# Listen on changes to MDB
with db.collection.watch(pipeline) as stream:

    # Mirror each insert into Weaviate
    for insert_change in stream:
        with client.batch as batch:
            batch.add_data_object(
                data_object={
                    "prompt": insert_change["fullDocument"]["prompt"],
                    "image": encoded_file_from_hf,  # base64-encoded image
                },
                class_name="Portrait",  # whatever class your Weaviate schema defines
            )
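Once the objects are in Weaviate, searching them is a short query. A minimal sketch using the Python client’s query builder, assuming the same hypothetical Portrait class as above and a text vectorizer module configured on it:

# Semantic search over the stored prompts
results = (
    client.query
    .get("Portrait", ["prompt", "image"])
    .with_near_text({"concepts": ["Steve Jobs"]})
    .with_limit(5)
    .do()
)
print(results)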

Application Search Wrapped Up in a Single API

All of these technologies, wrapped up in a single Python library:

https://docs.mixpeek.com/

from mixpeek import Mixpeek

mix = Mixpeek(api_key="mixpeek_api_key")

mix.generate("image", "Steve Jobs in Surrealism style")

mix.search("Jobs")

Congrats! You’ve successfully leveraged the ML stack of the future! Now that you’ve built this, you can start ideating on finance, education, and healthcare use cases that sit on top of the same stack.

Questions/comments? Reach out to ethan@mixpeek.com

Want LIVE guidance?

I’m hosting a LIVE course that walks you through engineering the above in a six-part stream:

https://mixpeek.com/community#events
