
Host an API against your ML model in Jupyter!

The image the program below generated for “busy street”

I love teaching ML and all things tech so it shouldn’t be a huge surprise that I ❤ Jupyter and I ❤ Python!

Today, I would like to share a tutorial to build an ML model in a Jupyter notebook and publish it using ngrok so it can be called from a browser. And we won't build just any model; we'll build my favorite: Stable Diffusion! If you would like to read my earlier post on SD, you can find it here.

Ready?! Let’s go.

#install flask and ngrok
!pip install flask_ngrok
!pip install pyngrok

#You will need to sign up for a free account with ngrok to retrieve your auth token.
!ngrok authtoken <INSERT YOUR AUTH TOKEN HERE>

#Import pandas, flask, etc etc
import pandas as pd
from flask_ngrok import run_with_ngrok
from flask import request, jsonify, Flask, redirect, render_template, url_for, send_file
import random as rk
import torch
from torch import autocast

This next block is specific to Stable Diffusion. If you would rather use a different model, feel free to replace this code with any other model of your choice.

!pip install diffusers==0.2.4
!pip install transformers scipy ftfy
!pip install "ipywidgets>=7,<8"

# SD weights can be accessed via Hugging Face, so the following
# block is needed to get access. As I mentioned in my blog, you will
# need a free Hugging Face login as well.

from google.colab import output
from huggingface_hub import notebook_login

notebook_login() #<< This will ask you for your HuggingFace Token which you can find in the settings section of your HuggingFace account.

from diffusers import StableDiffusionPipeline

#download the weights
pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", revision="fp16", torch_dtype=torch.float16, use_auth_token=True)

#Ensure that your model is running on the GPU.
#Each generation takes about 13 secs on a GPU versus a minute+ on a CPU
pipe ="cuda")

Next, we need to set up our web server using Flask.

app = Flask(__name__) #the name of the application package
import base64
from io import BytesIO

@app.route('/')
def home():
    # Keeping the html piece really simple.
    # Textbox and a button
    return myPageHtml

@app.route('/predict')
def predict():
    # Generate an image using SD for the string passed in
    page = request.args.get('param', default = 'lake', type = str)
    global pipe
    with autocast("cuda"):
        image = pipe(page)["sample"][0]  # PIL format image'img.png')
    return send_file('img.png', mimetype='image/png')
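The handler reads its prompt from the `param` query string, so any client just needs to build that URL. A quick sketch with the standard library (the `predict_url` helper is my own naming; prepend the public URL that ngrok prints when you run the app):

```python
from urllib.parse import urlencode

def predict_url(prompt: str) -> str:
    # Same query-string shape the Flask handler reads via request.args.get('param', ...)
    return "/predict?" + urlencode({"param": prompt})

print(predict_url("busy street"))  # /predict?param=busy+street
```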

#Run the Flask app. Run this cell last:  blocks the notebook,
#and home() needs myPageHtml (defined below) before it can serve a request.
run_with_ngrok(app)

Let’s add the last bit with the Html and Javascript code.

myPageHtml = """<!DOCTYPE html>
<html lang="en" xmlns="">
<head><meta charset="utf-8" /></head>
<body>
<!-- Add a text box and a button -->
<b> Enter the string you would like to generate the image w/
</b>  <input type="text" id="textbox" name="message">
<button id="button" >Generate</button>
<!-- Add an image, but it will be hidden until a result arrives -->
<img id="img" style="display: none;" width="400" height="400"/>

var button = document.getElementById("button");
var textBox = document.getElementById("textbox");
// This event is fired when the button is clicked
button.addEventListener("click", function () {
    // Make a call to get the prediction
    var str = '/predict?param=' + encodeURIComponent(textBox.value);
    var getImage = new XMLHttpRequest();'GET', str, true);
    getImage.responseType = 'arraybuffer';
    // Render the image in the hidden image object
    getImage.onload = function(event){

function renderHTML(data){
    // Create a binary string from the returned data, then encode it as a data URL.
    var uInt8Array = new Uint8Array(data);
    var i = uInt8Array.length;
    var binaryString = new Array(i);
    while (i--)
        binaryString[i] = String.fromCharCode(uInt8Array[i]);
    var joined = binaryString.join('');
    var base64 = window.btoa(joined);
    document.getElementById("img").src = "data:image/png;base64," + base64;
    document.getElementById("img").style.display = "";
</html>"""
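The renderHTML routine base64-encodes the PNG bytes with btoa before setting them as a data URL. Here is the same transformation sketched in Python (the helper name is mine), in case you would rather have the server return a ready-made data URL instead:

```python
import base64

def png_to_data_url(png_bytes: bytes) -> str:
    # bytes -> base64 -> data URL, mirroring the btoa step in the page's JS
    return "data:image/png;base64," + base64.b64encode(png_bytes).decode("ascii")

# Every PNG file starts with the same 8-byte signature:
print(png_to_data_url(b"\x89PNG\r\n\x1a\n"))  # data:image/png;base64,iVBORw0KGgo=
```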

That’s it!! I hope this helps.

If you have any questions, please leave them in the comments section!
