A simple example of Vertex AI Pipelines — the foundation of Google Cloud’s support for MLOps

person walking along an above-ground pipeline through a marsh
Photo by Rodion Kutsaev on Unsplash

Vertex AI Tutorial Series

Background

This is the sixth and last article of this series. So far, we’ve trained, optimized, and deployed models on Google Cloud’s Vertex AI. We’ve also explored using AutoML and BigQuery ML. In this final article, we’ll…


Entertain the idea of BigQuery ML for image classification

decorative: blue disc standing edgewise on a red surface against a black background
Photo by Michael Dziedzic on Unsplash

Vertex AI Tutorial Series

Background

In previous articles, we’ve covered many Vertex AI services: notebook, custom training, hypertune, experiment, dataset, AutoML, model, endpoint, and so on. In this article, we’re going to try something new: BigQuery ML. The basic…


Alternative to custom training: AutoML — build a great model from scratch

Vertex AI logo
Vertex AI (Source: Google Cloud)

Vertex AI Tutorial Series

Background

In part 1, part 2, and part 3 of this series, we started from scratch, trained a custom model, optimized it, and deployed it on Vertex AI to serve online predictions. It was quite a journey…


Serving the model on Vertex AI to produce online prediction and explanation

Vertex AI logo
Vertex AI (Source: Google Cloud)

Vertex AI Tutorial Series

Background

This is the third episode of the Vertex AI tutorial series. In the first article, we trained our first model. In the second article, we optimized it. Now it’s time to make use of it. …


An in-depth discussion of the ROC curve’s definition, interpretation, limitations, computational complexity, and extension to the multi-class setting.

Background

The Receiver Operating Characteristic (ROC) curve is a mechanism to measure and visualize the performance of a classifier. We want the classifier to identify as many relevant examples as possible, but at the same time, we don’t want it to produce too many false positives. The ROC curve plots the hit rate against the false positive rate to reveal this tradeoff. We will examine it in detail in this blog post. Nomenclature interlude: the hit rate has other names…
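As a quick illustration of that tradeoff (not taken from the article itself; the labels and scores below are made up), the curve can be computed from a classifier's scores, for example with scikit-learn:

import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical ground-truth labels and classifier scores, for illustration only.
y_true = np.array([0, 0, 1, 1, 0, 1, 1, 0])
y_score = np.array([0.10, 0.40, 0.35, 0.80, 0.20, 0.90, 0.65, 0.50])

# fpr is the false positive rate; tpr is the hit rate (true positive rate).
# Each point on the curve corresponds to one score threshold.
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print("AUC:", roc_auc_score(y_true, y_score))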


Tuning model hyperparameters and visualizing metrics on managed TensorBoard

Vertex AI logo
Vertex AI (Source: Google Cloud)

Vertex AI Tutorial Series

Background

In the previous article (the first of this series), we walked through step-by-step instructions to get our first model trained on Vertex AI, Google Cloud’s newest integrated machine learning platform. …


Starting from scratch, arriving at the first model trained on Vertex AI

Vertex AI logo
Vertex AI (Source: Google Cloud)

Vertex AI Tutorial Series

Background and Motivation

Google recently announced the general availability of its cloud platform for machine learning — Vertex AI. I’m very excited about this. I’ve long wanted to see a coherent, end-to-end story on ML workflows on Google Cloud…


A quantitative and qualitative exploration of how well it guards against white-box generation of adversarial examples

Background

Machine learning models are prone to adversarial examples: targeted inputs that are specifically crafted to deceive the model and lead to erroneous output. Adversarial training is a technique to defend against such attacks by deliberately generating adversarial examples to augment the training dataset, in the hope of improving the robustness of the model. A natural question to ask, then, is how far this can go: if we iterate adversarial training, how does the model evolve?
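The excerpt doesn't show the exact attack used in the article; as one common example of white-box generation, here is a minimal Fast Gradient Sign Method (FGSM) sketch in TensorFlow, assuming model is a differentiable Keras classifier that outputs logits:

import tensorflow as tf

def make_adversarial(model, x, y, epsilon=0.01):
    # White-box attack (FGSM): nudge the input along the sign of the
    # loss gradient so as to increase the classification loss.
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
    x = tf.convert_to_tensor(x)
    with tf.GradientTape() as tape:
        tape.watch(x)
        loss = loss_fn(y, model(x))
    grad = tape.gradient(loss, x)
    return x + epsilon * tf.sign(grad)

# Adversarial training then mixes these perturbed examples back into the
# training set before the next round of fitting.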

Experiment Setup

To investigate this question…


How to achieve good accuracy by bootstrapping from a minimal number of manually labeled examples

Child yelling into a microphone
Photo by Jason Rosewell on Unsplash

Background

Voice commands are a big part of modern AI applications. The TensorFlow website has published a good tutorial for basic voice command recognition [link]. This blog post was inspired by that tutorial, but we set out to do something different. That tutorial builds a neural-network model to classify the commands “down,” “go,” “left,” “no,” “right,” “stop,” “up,” and “yes” from a subset of the Speech Commands dataset, which comprises 8,000 audio files across those commands (under a CC BY license).
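As a rough sketch of how that kind of data is typically loaded in TensorFlow (the directory path below is hypothetical; the tutorial's subset stores one folder per command word), the waveform and label for a single clip can be read like this:

import tensorflow as tf

# Hypothetical local path to the downloaded subset; one subfolder per command.
data_dir = "data/mini_speech_commands"
filenames = tf.io.gfile.glob(data_dir + "/*/*.wav")

# Decode one clip into a mono waveform tensor plus its sample rate.
audio_bytes = tf.io.read_file(filenames[0])
waveform, sample_rate = tf.audio.decode_wav(audio_bytes, desired_channels=1)

# The label ("down", "go", "left", ...) is the name of the enclosing folder.
label = tf.strings.split(filenames[0], "/")[-2]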

We, however, are interested in the gender perception of the voice commands. It may have applicable scenarios in, for example, medical…


And a Step-by-Step Walk-through of Its Core Architecture

Background

This is a simple experiment based on the TensorFlow Cycle-Generative Adversarial Network (CycleGAN) tutorial. That tutorial converts between images of horses and zebras. I try to reproduce it for conversion between map views and satellite views. Conveniently, there is already a prepared dataset, cycle_gan/maps, in the TensorFlow Datasets catalog. The only noteworthy code change we need to make to the TensorFlow CycleGAN tutorial is switching to load the cycle_gan/maps dataset.
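For reference, here is a sketch of that one change using TensorFlow Datasets; which of trainA/trainB holds the map views versus the satellite views should be double-checked against the dataset card:

import tensorflow_datasets as tfds

# Load the maps dataset instead of the tutorial's horse2zebra dataset.
dataset, metadata = tfds.load("cycle_gan/maps",
                              with_info=True, as_supervised=True)

# CycleGAN-style datasets expose two unpaired image domains per split.
train_a, train_b = dataset["trainA"], dataset["trainB"]
test_a, test_b = dataset["testA"], dataset["testB"]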

CycleGAN Core Architecture

It’s worth taking a deep dive into the core ideas behind CycleGAN since I think gaining a deeper…

Eileen Pangu

Eng @ FANG. Enthusiastic tech generalist. Enjoy distilling wisdom from experiences. Believe that learning is a lifelong journey.
