Difficulty: medium
Estimated Time: 1 hour per lab

This lab walks you through an end-to-end ML use case, from visualizing data to deploying the trained model to make predictions. Based on a Google GitHub repository (https://github.com/GoogleCloudPlatform/training-data-analyst.git), you'll learn:

  1. how to explore a dataset with Python libraries
  2. how to split a full dataset into training and evaluation sets (see the sketch after this list)
  3. how to use canned (prebuilt) models: DNN, Wide & Deep
  4. how to preprocess the dataset with Apache Beam
  5. how to train at large scale with Cloud ML Engine (CMLE)
  6. and how to deploy the trained model to make predictions
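
As a preview of the train/eval split mentioned above, here is a minimal sketch of one way to split a dataset repeatably by hashing a column with pandas. This is an illustration only: the file name sample.csv and the column name date are placeholders, not part of the lab's dataset.

    import hashlib

    import pandas as pd

    # Load the full dataset (sample.csv is a placeholder file name).
    df = pd.read_csv("sample.csv")

    def hash_bucket(value, buckets=10):
        # Hash the value so the same row always lands in the same bucket,
        # which keeps the split stable across runs.
        return int(hashlib.sha256(str(value).encode("utf-8")).hexdigest(), 16) % buckets

    bucket = df["date"].apply(hash_bucket)

    # Roughly 80% of buckets become training data, the rest evaluation data.
    train_df = df[bucket < 8]
    eval_df = df[bucket >= 8]

    print(len(train_df), "training rows /", len(eval_df), "evaluation rows")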

End-to-end ML with TensorFlow

Step 1 of 3

Set up the project

  • Install pip for Python 3: apt -y install python3-pip

  • Install the required packages: pip3 install jupyter pandas datalab tensorflow==1.5.0
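
    To confirm the pinned TensorFlow version was picked up, a quick sanity check (an addition for this write-up, not a step from the original lab) can be run from Python:

        import tensorflow as tf

        # The lab pins tensorflow==1.5.0, so this should print 1.5.0.
        print(tf.__version__)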

  • Configure Jupyter and create a password

  1. jupyter notebook --generate-config

  2. sed -i "s/#c.NotebookApp.password = ''/c.NotebookApp.password = u'sha1:6c2164fc2b22:ed55ecf07fc0f985ab46561483c0e888e8964ae6'/g" /root/.jupyter/jupyter_notebook_config.py to put the default password "secret" or jupyter notebook password to choose one