diff --git a/docs/tutorials/tf_quantum_starter.ipynb b/docs/tutorials/tf_quantum_starter.ipynb
new file mode 100644
index 000000000..199b73832
--- /dev/null
+++ b/docs/tutorials/tf_quantum_starter.ipynb
@@ -0,0 +1,934 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Getting started with TensorFlow Quantum"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "In this notebook you will build your first hybrid quantum-classical model with\n",
+ "[Cirq](https://cirq.readthedocs.io/en/stable/) and TensorFlow Quantum (TFQ). You will build a very simple model for\n",
+ "binary classification, wrap it with Keras, and simulate it to train and evaluate the model.\n",
+ "\n",
+ "> Note: This notebook is designed to run in Google Colab. If you run it locally or in a Jupyter notebook,\n",
+ "skip the code cells marked with the `Colab only` comment."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Setup"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Install TensorFlow 2.x (Colab only)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Colab only\n",
+ "!pip install -q tensorflow==2.1.0"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Install TensorFlow Quantum (Colab only)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Colab only\n",
+ "!pip install -q tensorflow-quantum"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Imports"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now import TensorFlow and the other dependencies:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import cirq\n",
+ "import random\n",
+ "import numpy as np\n",
+ "import sympy\n",
+ "import tensorflow as tf\n",
+ "import tensorflow_quantum as tfq\n",
+ "\n",
+ "from matplotlib import pyplot as plt\n",
+ "from cirq.contrib.svg import SVGCircuit"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Place a qubit on the grid\n",
+ "\n",
+ "First, place a single qubit at position (0, 0) of the grid:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "qubit = cirq.GridQubit(0, 0)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Prepare quantum data\n",
+ "\n",
+ "Next, set up the labels for the quantum data. For simplicity, this example uses\n",
+ "just two data points, `a` and `b`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "expected_labels = np.array([[1, 0], [0, 1]])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Randomly pick a rotation angle for preparing the two data points:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 4,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "angle = np.random.uniform(0, 2 * np.pi)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Building the quantum circuit\n",
+ "\n",
+ "Now build the quantum circuits for the two data points and convert them into a tensor:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = cirq.Circuit(cirq.ry(angle)(qubit))\n",
+ "b = cirq.Circuit(cirq.ry(angle + np.pi / 2)(qubit))\n",
+ "quantum_data = tfq.convert_to_tensor([a, b])"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 6,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "image/svg+xml": [
+ ""
+ ],
+ "text/plain": [
+ ""
+ ]
+ },
+ "execution_count": 6,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "SVGCircuit(a)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 8,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "image/svg+xml": [
+ ""
+ ],
+ "text/plain": [
+ ""
+ ]
+ },
+ "execution_count": 8,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "SVGCircuit(b)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Building the hybrid model\n",
+ "\n",
+ "This section also shows the interoperability between TensorFlow and Cirq. With the TFQ `PQC` layer you can easily\n",
+ "embed the quantum part of the model within a standard classical Keras model."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 9,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "q_data_input = tf.keras.Input(shape=(), dtype=tf.dtypes.string)\n",
+ "theta = sympy.Symbol(\"theta\")\n",
+ "q_model = cirq.Circuit(cirq.ry(theta)(qubit))"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 11,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "expectation = tfq.layers.PQC(q_model, cirq.Z(qubit))\n",
+ "expectation_output = expectation(q_data_input)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 12,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "classifier = tf.keras.layers.Dense(2, activation=tf.keras.activations.softmax)\n",
+ "classifier_output = classifier(expectation_output)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now define the optimizer and loss function for your model:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 13,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "model = tf.keras.Model(inputs=q_data_input,\n",
+ "                       outputs=classifier_output)\n",
+ "model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.1),\n",
+ "              loss=tf.keras.losses.CategoricalCrossentropy())"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Training the model\n",
+ "\n",
+ "Training the model works just like training any other Keras model."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 14,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Train on 2 samples\n",
+ "Epoch 1/250\n",
+ "2/2 [==============================] - 2s 1s/sample - loss: 0.5722\n",
+ "Epoch 2/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.5098\n",
+ "Epoch 3/250\n",
+ "2/2 [==============================] - 0s 2ms/sample - loss: 0.4543\n",
+ "Epoch 4/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.4016\n",
+ "Epoch 5/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.3534\n",
+ "Epoch 6/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.3114\n",
+ "Epoch 7/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.2756\n",
+ "Epoch 8/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.2450\n",
+ "Epoch 9/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.2185\n",
+ "Epoch 10/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.1952\n",
+ "Epoch 11/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.1745\n",
+ "Epoch 12/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.1560\n",
+ "Epoch 13/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.1395\n",
+ "Epoch 14/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.1248\n",
+ "Epoch 15/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.1118\n",
+ "Epoch 16/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.1004\n",
+ "Epoch 17/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0904\n",
+ "Epoch 18/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0817\n",
+ "Epoch 19/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0741\n",
+ "Epoch 20/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0675\n",
+ "Epoch 21/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0617\n",
+ "Epoch 22/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0566\n",
+ "Epoch 23/250\n",
+ "2/2 [==============================] - 0s 2ms/sample - loss: 0.0522\n",
+ "Epoch 24/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0483\n",
+ "Epoch 25/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0449\n",
+ "Epoch 26/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0418\n",
+ "Epoch 27/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0391\n",
+ "Epoch 28/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0367\n",
+ "Epoch 29/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0345\n",
+ "Epoch 30/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0325\n",
+ "Epoch 31/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0308\n",
+ "Epoch 32/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0292\n",
+ "Epoch 33/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0277\n",
+ "Epoch 34/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0264\n",
+ "Epoch 35/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0252\n",
+ "Epoch 36/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0241\n",
+ "Epoch 37/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0231\n",
+ "Epoch 38/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0221\n",
+ "Epoch 39/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0213\n",
+ "Epoch 40/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0205\n",
+ "Epoch 41/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0197\n",
+ "Epoch 42/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0191\n",
+ "Epoch 43/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0184\n",
+ "Epoch 44/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0178\n",
+ "Epoch 45/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0173\n",
+ "Epoch 46/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0168\n",
+ "Epoch 47/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0163\n",
+ "Epoch 48/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0158\n",
+ "Epoch 49/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0154\n",
+ "Epoch 50/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0150\n",
+ "Epoch 51/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0146\n",
+ "Epoch 52/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0143\n",
+ "Epoch 53/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0140\n",
+ "Epoch 54/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0136\n",
+ "Epoch 55/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0133\n",
+ "Epoch 56/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0130\n",
+ "Epoch 57/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0128\n",
+ "Epoch 58/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0125\n",
+ "Epoch 59/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0123\n",
+ "Epoch 60/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0120\n",
+ "Epoch 61/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0118\n",
+ "Epoch 62/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0116\n",
+ "Epoch 63/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0114\n",
+ "Epoch 64/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0112\n",
+ "Epoch 65/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0110\n",
+ "Epoch 66/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0108\n",
+ "Epoch 67/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0106\n",
+ "Epoch 68/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0104\n",
+ "Epoch 69/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0103\n",
+ "Epoch 70/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0101\n",
+ "Epoch 71/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0100\n",
+ "Epoch 72/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0098\n",
+ "Epoch 73/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0097\n",
+ "Epoch 74/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0095\n",
+ "Epoch 75/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0094\n",
+ "Epoch 76/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0092\n",
+ "Epoch 77/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0091\n",
+ "Epoch 78/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0090\n",
+ "Epoch 79/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0089\n",
+ "Epoch 80/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0087\n",
+ "Epoch 81/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0086\n",
+ "Epoch 82/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0085\n",
+ "Epoch 83/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0084\n",
+ "Epoch 84/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0083\n",
+ "Epoch 85/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0082\n",
+ "Epoch 86/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0081\n",
+ "Epoch 87/250\n",
+ "2/2 [==============================] - 0s 6ms/sample - loss: 0.0080\n",
+ "Epoch 88/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0079\n",
+ "Epoch 89/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0078\n",
+ "Epoch 90/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0077\n",
+ "Epoch 91/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0076\n",
+ "Epoch 92/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0075\n",
+ "Epoch 93/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0074\n",
+ "Epoch 94/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0073\n",
+ "Epoch 95/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0072\n",
+ "Epoch 96/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0072\n",
+ "Epoch 97/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0071\n",
+ "Epoch 98/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0070\n",
+ "Epoch 99/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0069\n",
+ "Epoch 100/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0068\n",
+ "Epoch 101/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0068\n",
+ "Epoch 102/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0067\n",
+ "Epoch 103/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0066\n",
+ "Epoch 104/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0065\n",
+ "Epoch 105/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0065\n",
+ "Epoch 106/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0064\n",
+ "Epoch 107/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0063\n",
+ "Epoch 108/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0063\n",
+ "Epoch 109/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0062\n",
+ "Epoch 110/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0061\n",
+ "Epoch 111/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0061\n",
+ "Epoch 112/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0060\n",
+ "Epoch 113/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0059\n",
+ "Epoch 114/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0059\n",
+ "Epoch 115/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0058\n",
+ "Epoch 116/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0058\n",
+ "Epoch 117/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0057\n",
+ "Epoch 118/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0057\n",
+ "Epoch 119/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0056\n",
+ "Epoch 120/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0055\n",
+ "Epoch 121/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0055\n",
+ "Epoch 122/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0054\n",
+ "Epoch 123/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0054\n",
+ "Epoch 124/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0053\n",
+ "Epoch 125/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0053\n",
+ "Epoch 126/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0052\n",
+ "Epoch 127/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0052\n",
+ "Epoch 128/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0051\n",
+ "Epoch 129/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0051\n",
+ "Epoch 130/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0050\n",
+ "Epoch 131/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0050\n",
+ "Epoch 132/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0049\n",
+ "Epoch 133/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0049\n",
+ "Epoch 134/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0049\n",
+ "Epoch 135/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0048\n",
+ "Epoch 136/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0048\n",
+ "Epoch 137/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0047\n",
+ "Epoch 138/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0047\n",
+ "Epoch 139/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0046\n",
+ "Epoch 140/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0046\n",
+ "Epoch 141/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0046\n",
+ "Epoch 142/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0045\n",
+ "Epoch 143/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0045\n",
+ "Epoch 144/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0045\n",
+ "Epoch 145/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0044\n",
+ "Epoch 146/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0044\n",
+ "Epoch 147/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0043\n",
+ "Epoch 148/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0043\n",
+ "Epoch 149/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0043\n",
+ "Epoch 150/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0042\n",
+ "Epoch 151/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0042\n",
+ "Epoch 152/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0042\n",
+ "Epoch 153/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0041\n",
+ "Epoch 154/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0041\n",
+ "Epoch 155/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0041\n",
+ "Epoch 156/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0040\n",
+ "Epoch 157/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0040\n",
+ "Epoch 158/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0040\n",
+ "Epoch 159/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0039\n",
+ "Epoch 160/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0039\n",
+ "Epoch 161/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0039\n",
+ "Epoch 162/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0038\n",
+ "Epoch 163/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0038\n",
+ "Epoch 164/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0038\n",
+ "Epoch 165/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0038\n",
+ "Epoch 166/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0037\n",
+ "Epoch 167/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0037\n",
+ "Epoch 168/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0037\n",
+ "Epoch 169/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0036\n",
+ "Epoch 170/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0036\n",
+ "Epoch 171/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0036\n",
+ "Epoch 172/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0036\n",
+ "Epoch 173/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0035\n",
+ "Epoch 174/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0035\n",
+ "Epoch 175/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0035\n",
+ "Epoch 176/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0035\n",
+ "Epoch 177/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0034\n",
+ "Epoch 178/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0034\n",
+ "Epoch 179/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0034\n",
+ "Epoch 180/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0034\n",
+ "Epoch 181/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0033\n",
+ "Epoch 182/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0033\n",
+ "Epoch 183/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0033\n",
+ "Epoch 184/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0033\n",
+ "Epoch 185/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0032\n",
+ "Epoch 186/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0032\n",
+ "Epoch 187/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0032\n",
+ "Epoch 188/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0032\n",
+ "Epoch 189/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0032\n",
+ "Epoch 190/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0031\n",
+ "Epoch 191/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0031\n",
+ "Epoch 192/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0031\n",
+ "Epoch 193/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0031\n",
+ "Epoch 194/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0030\n",
+ "Epoch 195/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0030\n",
+ "Epoch 196/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0030\n",
+ "Epoch 197/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0030\n",
+ "Epoch 198/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0030\n",
+ "Epoch 199/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0029\n",
+ "Epoch 200/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0029\n",
+ "Epoch 201/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0029\n",
+ "Epoch 202/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0029\n",
+ "Epoch 203/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0029\n",
+ "Epoch 204/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0029\n",
+ "Epoch 205/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0028\n",
+ "Epoch 206/250\n",
+ "2/2 [==============================] - 0s 5ms/sample - loss: 0.0028\n",
+ "Epoch 207/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0028\n",
+ "Epoch 208/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0028\n",
+ "Epoch 209/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0028\n",
+ "Epoch 210/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0027\n",
+ "Epoch 211/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0027\n",
+ "Epoch 212/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0027\n",
+ "Epoch 213/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0027\n",
+ "Epoch 214/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0027\n",
+ "Epoch 215/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0027\n",
+ "Epoch 216/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0026\n",
+ "Epoch 217/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0026\n",
+ "Epoch 218/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0026\n",
+ "Epoch 219/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0026\n",
+ "Epoch 220/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0026\n",
+ "Epoch 221/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0026\n",
+ "Epoch 222/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0025\n",
+ "Epoch 223/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0025\n",
+ "Epoch 224/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0025\n",
+ "Epoch 225/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0025\n",
+ "Epoch 226/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0025\n",
+ "Epoch 227/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0025\n",
+ "Epoch 228/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0025\n",
+ "Epoch 229/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0024\n",
+ "Epoch 230/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0024\n",
+ "Epoch 231/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0024\n",
+ "Epoch 232/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0024\n",
+ "Epoch 233/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0024\n",
+ "Epoch 234/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0024\n",
+ "Epoch 235/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0024\n",
+ "Epoch 236/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0023\n",
+ "Epoch 237/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0023\n",
+ "Epoch 238/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0023\n",
+ "Epoch 239/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0023\n",
+ "Epoch 240/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0023\n",
+ "Epoch 241/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0023\n",
+ "Epoch 242/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0023\n",
+ "Epoch 243/250\n",
+ "2/2 [==============================] - 0s 4ms/sample - loss: 0.0023\n",
+ "Epoch 244/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0022\n",
+ "Epoch 245/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0022\n",
+ "Epoch 246/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0022\n",
+ "Epoch 247/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0022\n",
+ "Epoch 248/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0022\n",
+ "Epoch 249/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0022\n",
+ "Epoch 250/250\n",
+ "2/2 [==============================] - 0s 3ms/sample - loss: 0.0022\n"
+ ]
+ }
+ ],
+ "source": [
+ "history = model.fit(x=quantum_data,\n",
+ "                    y=expected_labels,\n",
+ "                    epochs=250)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Evaluating the model"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 15,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAEWCAYAAABrDZDcAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nO3deZxddX3/8df73tm3TJZJCFlIgCCghQAh2oJWa20Ba8FqFau20ipixaU/24f466L92UV/j25a0YhKResPXABFG6VqC2oBTcCAkBASwpIhQBYy2Wf//P44ZyY3k5nJDZkzN3PP+/l43Mec5XvP/ZxcmPec7znnexQRmJlZfhUqXYCZmVWWg8DMLOccBGZmOecgMDPLOQeBmVnOOQjMzHLOQWBWJklflPQ3ZbZ9XNKvH+t2zCaDg8DMLOccBGZmOecgsKqSdsn8maQHJO2T9AVJcyR9V9IeST+QNL2k/W9LekhSl6Q7JJ1Rsu4cSfel7/sq0DDis35L0pr0vXdJOut51vwOSRslPSfpNkknpssl6Z8lbZW0K92nF6XrLpG0Nq3tKUl/+rz+wcxwEFh1eh3wKuA04DXAd4H/Dcwi+W/+vQCSTgNuBN4PdAArgW9LqpNUB3wT+DIwA/h6ul3S954LXA+8E5gJfBa4TVL90RQq6deAvwfeAMwFngBuSlf/BvCydD/agTcCO9J1XwDeGRGtwIuA/zqazzUr5SCwavSvEfFsRDwF/Bj4aUT8PCJ6gFuBc9J2bwT+IyK+HxF9wD8AjcCvAC8BaoF/iYi+iPgGsKrkM94BfDYifhoRAxFxA9CTvu9ovBm4PiLuS+v7EPDLkhYBfUArcDqgiFgXEU+n7+sDzpTUFhE7I+K+o/xcs2EOAqtGz5ZMHxhlviWdPpHkL3AAImIQ2AzMS9c9FYeOyvhEyfRJwAfSbqEuSV3AgvR9R2NkDXtJ/uqfFxH/BXwKuBZ4VtJ1ktrSpq8DLgGekHSnpF8+ys81G+YgsDzbQvILHUj65El+mT8FPA3MS5cNWVgyvRn424hoL3k1RcSNx1hDM0lX01MAEfHJiDgPeCFJF9GfpctXRcSlwGySLqyvHeXnmg1zEFiefQ14taRXSqoFPkDSvXMXcDfQD7xXUo2k3wGWl7z3c8BVkl6cntRtlvRqSa1HWcP/A66QtDQ9v/B3JF1Zj0s6P91+LbAP6AYG0nMYb5Y0Le3S2g0MHMO/g+Wcg8ByKyLWA28B/hXYTnJi+TUR0RsRvcDvAG8DdpKcT7il5L2rSc4TfCpdvzFte7Q1/BD4S+BmkqOQU4DL09VtJIGzk6T7aAfJeQyAtwKPS9oNXJXuh9nzIj+Yxsws33xEYGaWc5kGgaSLJK1Pb5a5Zow2L09vynlI0p1Z1mNmZofLrGtIUhF4hOTGnk6Sa7DfFBFrS9q0k5yYuyginpQ0OyK2ZlKQmZmNKssjguXAxojYlJ54uwm4dESb3wNuiYgnARwCZmaTrybDbc8judZ6SCfw4hFtTgNqJd1BcgflJyLiS+NtdNasWbFo0aIJLNPMrPrde++92yOiY7R1WQaBRlk2sh+qBjgPeCXJrf13S7onIh45ZEPSlcCVAAsXLmT16tUZlGtmVr0kPTHWuiy7hjpJ7tIcMp/kLsqRbb4XEfsiYjvwI+DskRuKiOsiYllELOvoGDXQzMzsecoyCFYBSyQtTkdyvBy4bUSbbwEvTe/cbCLpOlqXYU1mZjZCZl1DEdEv6WrgdqBIMsLiQ5KuSteviIh1kr4HPAAMAp+PiAezqsnMzA435e4sXrZsWYw8R9DX10dnZyfd3d0VqmryNDQ0MH/+fGpraytdiplNIZLujYhlo63L8mTxpOns7KS1tZVFixZx6GCR1SUi2LFjB52dnSxevLjS5ZhZlaiKISa6u7uZOXNmVYcAgCRmzpyZiyMfM5s8VREEQNWHwJC87KeZTZ6qCYIj
6e4b4Jld3fQPDFa6FDOz40pugqCnf4Cte7rpG5j4k+NdXV18+tOfPur3XXLJJXR1dU14PWZmRyM3QVBMu1QGM7hKaqwgGBgY/6FRK1eupL29fcLrMTM7GlVx1VA5CoUkCAYGJz4IrrnmGh599FGWLl1KbW0tLS0tzJ07lzVr1rB27Vouu+wyNm/eTHd3N+973/u48sorAVi0aBGrV69m7969XHzxxVx44YXcddddzJs3j29961s0NjZOeK1mZiNVXRD89bcfYu2W3YctH4zgQO8A9bVFagpHd8L1zBPb+PBrXjjm+o997GM8+OCDrFmzhjvuuINXv/rVPPjgg8OXeF5//fXMmDGDAwcOcP755/O6172OmTNnHrKNDRs2cOONN/K5z32ON7zhDdx888285S1++qCZZa/qgmAsw1fbRDD6eHgTZ/ny5Ydc5//JT36SW2+9FYDNmzezYcOGw4Jg8eLFLF26FIDzzjuPxx9/PNMazcyGVF0QjPWX++Bg8OCWXZwwrYHZrQ2Z1tDc3Dw8fccdd/CDH/yAu+++m6amJl7+8pePeh9AfX398HSxWOTAgQOZ1mhmNiQ3J4slEGIwg3MEra2t7NmzZ9R1u3btYvr06TQ1NfHwww9zzz33TPjnm5kdi6o7IhiLJAoFyODqUWbOnMkFF1zAi170IhobG5kzZ87wuosuuogVK1Zw1lln8YIXvICXvOQlE1+AmdkxqIpB59atW8cZZ5xxxPc+/MxumutqWDCjKavyJkW5+2tmNmS8Qedy0zUEUJAyuXzUzGwqy1UQFCUGptgRkJlZ1qomCMrp4ioWsjlZPJmmWleemR3/qiIIGhoa2LFjxxF/SRYKU/uIYOh5BA0N2V7+amb5UhVXDc2fP5/Ozk62bds2bruu/b0c6B0gdk7doRuGnlBmZjZRqiIIamtry3pi18e++zBf+MkmHvmbiz2uv5lZqiq6hsrV2lBD30DQ0+9nEpiZDcldEADs6e6vcCVmZsePXAVBS30SBHt7HARmZkNyFQStDbUA7PURgZnZsFwFwdARwZ7uvgpXYmZ2/MhVEAyfI3DXkJnZsHwGgbuGzMyGZRoEki6StF7SRknXjLL+5ZJ2SVqTvv4qy3ra0nMEuw+4a8jMbEhmN5RJKgLXAq8COoFVkm6LiLUjmv44In4rqzpKDR0R7HIQmJkNy/KIYDmwMSI2RUQvcBNwaYafd0Q1xQIt9TXs9sliM7NhWQbBPGBzyXxnumykX5Z0v6TvShr1gcOSrpS0WtLqI40ndCTTGmt9RGBmViLLIBhtMJ+RQ3/eB5wUEWcD/wp8c7QNRcR1EbEsIpZ1dHQcU1FtjbU+R2BmViLLIOgEFpTMzwe2lDaIiN0RsTedXgnUSpqVYU20NdSw+4CvGjIzG5JlEKwClkhaLKkOuBy4rbSBpBOUDgMqaXlaz44Ma3LXkJnZCJldNRQR/ZKuBm4HisD1EfGQpKvS9SuA1wPvktQPHAAuj4wfweUgMDM7VKbPI0i7e1aOWLaiZPpTwKeyrGGktsZaXzVkZlYiV3cWQ3JEsL93gL4BP5PAzAxyGgTgm8rMzIbkLgjaGpPeMF9CamaWyF0Q+IjAzOxQDgIzs5zLXRAMj0DqoajNzIAcBoGPCMzMDpW7IGhr9DMJzMxK5S4IGmqL1NUUHARmZqncBQF4mAkzs1K5DQIPM2FmlshlELQ11PiIwMwslcsgcNeQmdlBuQ0CP5zGzCyRyyBo8xGBmdmwXAbB0MniwcFMn4FjZjYl5DYIImBvr7uHzMxyGQRD4w3t2u/uITOzfAaBxxsyMxuWyyAYGnjON5WZmeU9CHxEYGaWzyAYelylu4bMzHIaBAePCHzVkJlZLoOgpb6GgnxEYGYGOQ0CSb672MwslcsgAA9FbWY2JNMgkHSRpPWSNkq6Zpx250sakPT6LOsp5RFIzcwSmQWBpCJwLXAxcCbwJklnjtHu48DtWdUymrYGB4GZGWR7RLAc2BgRmyKi
F7gJuHSUdu8Bbga2ZljLYXxEYGaWyDII5gGbS+Y702XDJM0DXgusGG9Dkq6UtFrS6m3btk1Ice1NtR5ryMyMbINAoywbOe7zvwAfjIiB8TYUEddFxLKIWNbR0TEhxbU31dJ1oI8ID0VtZvlWk+G2O4EFJfPzgS0j2iwDbpIEMAu4RFJ/RHwzw7oAmN5Ux8BgsLu7f/gGMzOzPMoyCFYBSyQtBp4CLgd+r7RBRCwempb0ReA7kxECAO1NdQB07e91EJhZrmXWNRQR/cDVJFcDrQO+FhEPSbpK0lVZfW65pjclv/y7fJ7AzHIuyyMCImIlsHLEslFPDEfE27KsZaT2NAh27u+dzI81Mzvu5PbO4oNdQz4iMLN8y20QTE+DwEcEZpZ3uQ2CoRPEPiIws7zLbRAUC6KtoYYuHxGYWc7lNggApjfXsdNHBGaWc7kOgvamOp8jMLPcy3cQeOA5M7N8B8H0plofEZhZ7uU6CNqb6uja5yMCM8u3XAfB9KY69vT00zcwWOlSzMwqJt9B0OxhJszMch0EM5vrAXhun4PAzPIr10EwozkZZuK5vQ4CM8uvsoJA0s2SXi2pqoJjVksSBNt9RGBmOVbuL/bPkDxUZoOkj0k6PcOaJs3BI4KeCldiZlY5ZQVBRPwgIt4MnAs8Dnxf0l2SrpA0ZR/v1d5UR0Gww0cEZpZjZXf1SJoJvA14O/Bz4BMkwfD9TCqbBMWCmN5Ux3afIzCzHCvrCWWSbgFOB74MvCYink5XfVXS6qyKmwwzW+p4bp+7hswsv8p9VOWnIuK/RlsREcsmsJ5JN6O5jh0+IjCzHCu3a+gMSe1DM5KmS/rjjGqaVDNb6n0fgZnlWrlB8I6I6BqaiYidwDuyKWlyzWyuY7uvGjKzHCs3CAqSNDQjqQjUZVPS5JrZXM/u7n56+z3ekJnlU7lBcDvwNUmvlPRrwI3A97Ira/LMaPFD7M0s38o9WfxB4J3AuwAB/wl8PquiJtOs9KayHXt7mdPWUOFqzMwmX1lBEBGDJHcXfybbcibf0N3FPk9gZnlV7lhDSyR9Q9JaSZuGXmW87yJJ6yVtlHTNKOsvlfSApDWSVku68PnsxLHoaE1GIN22x0FgZvlU7jmCfyM5GugHXgF8ieTmsjGlJ5SvBS4GzgTeJOnMEc1+CJwdEUuBP6QC3U2z0+6grQ4CM8upcoOgMSJ+CCginoiIjwC/doT3LAc2RsSmiOgFbgIuLW0QEXsjItLZZiCYZC31NTTXFdm6p3uyP9rM7LhQ7sni7nQI6g2SrgaeAmYf4T3zgM0l853Ai0c2kvRa4O/T7b26zHom1Oy2Bh8RmFlulXtE8H6gCXgvcB7wFuAPjvAejbLssL/4I+LWiDgduAz46Kgbkq5MzyGs3rZtW5kll6+jtZ5tux0EZpZPRwyCtK//DWk3TmdEXBERr4uIe47w1k5gQcn8fGDLWI0j4kfAKZJmjbLuuohYFhHLOjo6jlTyUZvT1uCuITPLrSMGQUQMAOeV3llcplXAEkmLJdUBlwO3lTaQdOrQdiWdS3K38o6j/JxjNru13l1DZpZb5Z4j+DnwLUlfB/YNLYyIW8Z6Q0T0p+cTbgeKwPUR8ZCkq9L1K4DXAb8vqQ84ALyx5OTxpJndWs/+3gH29vTTUl/uP4mZWXUo97feDJK/1EuvFApgzCAAiIiVwMoRy1aUTH8c+HiZNWRmdltyL8Gzu7tp6WipcDVmZpOr3DuLr8i6kEqa05reS7C7h1McBGaWM+U+oezfGP2Knz+c8IoqYOiIwCeMzSyPyu0a+k7JdAPwWsa5Amiq6UiPCDzMhJnlUbldQzeXzku6EfhBJhVVQFtDDY21RZ7e5SMCM8ufcm8oG2kJsHAiC6kkScxtb+DpXQcqXYqZ2aQr9xzBHg49R/AMyTMKqsa89ka2dPmIwMzyp9yuodasC6m0udMaWP/MxA9fYWZ2vCv3eQSvlTStZL5d0mXZlTX5TmxvZNveHj+72Mxy
p9xzBB+OiF1DMxHRBXw4m5Iq48RpjUQkN5WZmeVJuUEwWruqGothbntyCemWLp8wNrN8KTcIVkv6J0mnSDpZ0j8D92ZZ2GQ7sb0RgC2+csjMcqbcIHgP0At8FfgayQBx786qqEo4cVoaBL5yyMxyptyrhvYBhz18vpo01hWZ3lTrriEzy51yrxr6vqT2kvnpkm7PrqzKmDut0UFgZrlTbtfQrPRKIQAiYidHfmbxlLNgRiOdOx0EZpYv5QbBoKThISUkLWKU0UinuoUzmnjyuf1U4Nk4ZmYVU+4loH8O/ETSnen8y4ArsympchbOaKKnf5Cte3qY09ZQ6XLMzCZFWUcEEfE9YBmwnuTKoQ+QXDlUVRbMaALgyef2V7gSM7PJU+6gc28H3gfMB9YALwHu5tBHV055C4eCYMd+zl80o8LVmJlNjnLPEbwPOB94IiJeAZwDVN0IbfOmNyL5iMDM8qXcIOiOiG4ASfUR8TDwguzKqoz6miJz2xrY7CAwsxwp92RxZ3ofwTeB70vaSRU9qrLUgvTKITOzvCj3zuLXppMfkfTfwDTge5lVVUELZzRxxyNV1+tlZjamox5BNCLuPHKrqWvRrGa23dvJ3p5+WuqraoBVM7NRPd9nFletk2c1A/D49n0VrsTMbHI4CEY4uaMFgEe37a1wJWZmkyPTIJB0kaT1kjZKOmz0UklvlvRA+rpL0tlZ1lOOk2Y2IcGmbT4iMLN8yCwIJBWBa4GLgTOBN0k6c0Szx4BfjYizgI8C12VVT7kaaovMa29kk7uGzCwnsjwiWA5sjIhNEdEL3ARcWtogIu5KRzIFuIfkzuWKO7mjhce2u2vIzPIhyyCYB2wume9Ml43lj4DvjrZC0pWSVktavW1b9pd2njyrmce27fMopGaWC1kGgUZZNupvVkmvIAmCD462PiKui4hlEbGso6NjAksc3SkdzezrHeDZ3T2Zf5aZWaVlGQSdwIKS+fmMcjeypLOAzwOXRsSODOsp2ymzkyuHNm5195CZVb8sg2AVsETSYkl1wOXAbaUN0ofd3AK8NSIeybCWo3LanFYA1j+7p8KVmJllL7NbZyOiX9LVwO1AEbg+Ih6SdFW6fgXwV8BM4NOSAPojYllWNZVrVks9M5vreOQZB4GZVb9Mx1CIiJXAyhHLVpRMvx14e5Y1PF9L5rTwyFYHgZlVP99ZPIYXzGnlkWf2+MohM6t6DoIxnHZCK/t6B3iqq+qeyGlmdggHwRiGThhveNZXDplZdXMQjGEoCNY+vbvClZiZZctBMIZpjbXMn97oIDCzqucgGMcLT2xj7RYHgZlVNwfBOF544jQe276PvT39lS7FzCwzDoJxvPDENgDWuXvIzKqYg2AcLzxxGoC7h8ysqjkIxjGnrZ5ZLXX84qldlS7FzCwzDoJxSOKs+e080NlV6VLMzDLjIDiCs+e3s2HrXvZ091W6FDOzTDgIjmDpwnYi4Bed7h4ys+rkIDiCs+cnJ4zXuHvIzKqUg+AI2pvqWDyrmTVPOgjMrDo5CMpwzoJ27nuyy0NSm1lVchCUYdmiGWzf28Nj2/dVuhQzswnnICjD8sUzAFj1+HMVrsTMbOI5CMpwSkczM5vr+NljOytdipnZhHMQlEES5y+awc8e31HpUszMJpyDoEzLF89g83MH/OhKM6s6DoIyXXDqLAB+smFbhSsxM5tYDoIynTanhdmt9fx4w/ZKl2JmNqEcBGWSxIVLZvE/G7czOOj7CcysejgIjsJLl8xi5/4+HtzicYfMrHo4CI7Cy5Z0IMEP122tdClmZhMm0yCQdJGk9ZI2SrpmlPWnS7pbUo+kP82ylokws6WeZSdN5/trn610KWZmEyazIJBUBK4FLgbOBN4k6cwRzZ4D3gv8Q1Z1TLRXnTmHtU/vpnPn/kqXYmY2IbI8IlgObIyITRHRC9wEXFraICK2RsQqYMo89eVVZ54A4KMCM6saWQbBPGBzyXxnuuyoSbpS0mpJq7dtq+x1
/ItnNXP6Ca18+/4tFa3DzGyiZBkEGmXZ87ruMiKui4hlEbGso6PjGMs6dpcuncd9T3bx5A53D5nZ1JdlEHQCC0rm5wNV8Wf0a86eC8C3H6iK3TGznMsyCFYBSyQtllQHXA7cluHnTZr505tYvmgG37i30w+rMbMpL7MgiIh+4GrgdmAd8LWIeEjSVZKuApB0gqRO4H8BfyGpU1JbVjVNpDe9eAGPbd/H3Y96RFIzm9oyvY8gIlZGxGkRcUpE/G26bEVErEinn4mI+RHRFhHt6fTuLGuaKBe/aC7tTbV85adPVroUM7Nj4juLn6eG2iK/e958bn/oGQ9NbWZTmoPgGLztgsUAfP7HmypciZnZ8+cgOAbz2hv57bNP5Kafbea5fb2VLsfM7HlxEByjd738FHr6B/jMHRsrXYqZ2fPiIDhGS+a08jvnzueGu59gi88VmNkU5CCYAH/yqtMQ8Hcr11W6FDOzo+YgmADz2ht59ytO5TsPPM2PHvEzjc1sanEQTJB3/urJnNzRzIdu+QW7u6fMYKpmZg6CiVJfU+Qff/dsntndzV9+80EPPWFmU4aDYAKds3A673/lEr61Zgs33PV4pcsxMyuLg2CCvfsVp/LrZ8zmo/+xju89+EylyzEzOyIHwQQrFMS/XH4OZ82fxntuvI871vtB92Z2fHMQZKClvoYvXrGc0+a08s4v38udvpLIzI5jDoKMTGus5ct/9GIWz2rmin/7GV/4yWM+gWxmxyUHQYZmNNfxjXf9Cr9+xhw++p21fODr9/vSUjM77jgIMtZSX8OKt5zH+165hG/+/Cl+859/xH8/7PMGZnb8cBBMgkJB/MmrTuOWP76A1oYarvjiKt5+w2rWP7On0qWZmTkIJtPSBe18+z0X8me/+QJ+umkHF33iR7znxp+zZnNXpUszsxzTVDuBuWzZsli9enWlyzhmXft7+cydj/KVe55kb08/5yxs5/XnzefiF81lRnNdpcszsyoj6d6IWDbqOgdBZe3p7uMb93by7/c8waPb9lEsiAtPncUlv3QCL13SwYntjZUu0cyqgINgCogI1j29h28/sIVv37+Fzp3Jsw1OntXMBafO4vzFM1g6v50FMxqRVOFqzWyqcRBMMRHB+mf38JMN2/mfjdv56WPPsb93AIDpTbX80vx2Tj+hlVM7Wjhldgunzm5hWmNthas2s+OZg2CK6xsYZP0ze3igcxf3b+7iF0/t4tFte+npHxxu09Faz/zpjcxrb2Te9Ebmpz9PbG+ko6We6U11FAo+kjDLKwdBFRoYDDp37mfj1r1s2LqXTdv20rnzAE91HeDprm56BwYPaV8siOlNdcxqqaOjtZ6ZzXXMaqlnenMdbQ01tDXWJq+GWqY11tDWkMw31BYrtIdmNpHGC4KayS7GJkaxIE6a2cxJM5t55RlzDlk3OBhs39tDZ9cBtnQdYPueHrbv7WXHvh627ell+94eHt+xj+17ejnQNzDu59TVFGhrqKWlvkhjXQ1NdcXhV3NdDY11RZrra2isLdKctmlO1zfUJq/6mgL1NUXqawuHTdcVCz7nYVZhDoIqVCiI2W0NzG5r4NyF08dt2903wJ7ufnYd6GN3dx+7D/Sl0/3sPpDM7+7uY1/PAPt7B9jf28+e7n6e3d2dzifLuvsGx/2c8SThUKB+ODRGhEVNkbqiqC0WqCkWqC2K2kKB2hpRUyhQV1OgppCsry1pV1dU2r5keUHUpgE0NF1bKFBTFMVC8qopHJxO5gsUJYrFknWSu9qsamQaBJIuAj4BFIHPR8THRqxXuv4SYD/wtoi4L8ua7FBDf7V3tNYf03YGBoMDfUko7E9D40BfPz39g8mrb5Ce/oGS+ZLp/oF0fUmbkva79vfSNxD0Dw7SNxD0DQzSNzBI/0DQm/7sGxikf3ByuzklDgmGYiEJnoIOBkZN8eC6kfM1hQKFAknQpOsLAmkoaKAgpa8k4IemiwUdbDf0nvT9hTSkhqfTV7GQtBua
LmhoG8m2S7c32mcVBMWh95Rsn/Sn0n+Tg9M6OJ80O3RZSXsoaTfWNka2LxzhM9N15XzmyPZ5k1kQSCoC1wKvAjqBVZJui4i1Jc0uBpakrxcDn0l/2hRTLIiW+hpa6mugtTI1RMTBwOhPQyKd7hscPTySVzAwmATJQPoqnT44P8jAIMNtB0vaHf7eEW0j6B84uL50fn9/PwODwWCQ/gwiYCBKpgcPn05eMBjJdmK06fQ9dnRGDRMODw5GBJyG33/wvVAahCAODUaGPyNZNzxfsp2hz3nT8oW8/aUnT/j+ZnlEsBzYGBGbACTdBFwKlAbBpcCXIjljfY+kdklzI+LpDOuyKiWJuhpRRwF8c/awKAmMwQgGBw+fPnLoHHzPYaEDRCSfMzQ99P6hZcPz6TRx+LKx2kfEodscuY3Szxyj/SHLStpDck5t3M8csX8j2w/9G8fwvzcEpdsChuZHWRckMwffX/KZpdsOjvnIfSxZBsE8YHPJfCeH/7U/Wpt5wCFBIOlK4EqAhQsXTnihZtVsqPunSP66PKw8WQ46N9p/dSMPUstpQ0RcFxHLImJZR0fHhBRnZmaJLIOgE1hQMj8f2PI82piZWYayDIJVwBJJiyXVAZcDt41ocxvw+0q8BNjl8wNmZpMrs3MEEdEv6WrgdpLLR6+PiIckXZWuXwGsJLl0dCPJ5aNXZFWPmZmNLtP7CCJiJckv+9JlK0qmA3h3ljWYmdn4/IQyM7OccxCYmeWcg8DMLOem3DDUkrYBTzzPt88Ctk9gOVNFHvfb+5wP3ufynRQRo96INeWC4FhIWj3WeNzVLI/77X3OB+/zxHDXkJlZzjkIzMxyLm9BcF2lC6iQPO639zkfvM8TIFfnCMzM7HB5OyIwM7MRHARmZjmXmyCQdJGk9ZI2Srqm0vVkRdLjkn4haY2k1emyGZK+L2lD+nP8J9of5yRdL2mrpAdLlo25j5I+lH7v6yX9ZmWqPjZj7PNHJD2VftdrJF1Ssq4a9nmBpP+WtE7SQ5Lely6v2u96nH3O9rtOHslW3S+S0U8fBU4meYjh/cCZla4ro319HJg1Ytn/Ba5Jp68BPl7pOo9xH18GnAs8eKR9BM5Mv+96YHH630Gx0vswQfv8EeBPRzE9UUQAAAO8SURBVGlbLfs8Fzg3nW4FHkn3rWq/63H2OdPvOi9HBMPPT46IXmDo+cl5cSlwQzp9A3BZBWs5ZhHxI+C5EYvH2sdLgZsioiciHiMZ8nz5pBQ6gcbY57FUyz4/HRH3pdN7gHUkj7Kt2u96nH0ey4Tsc16CYKxnI1ejAP5T0r3ps54B5kT6wJ/05+yKVZedsfax2r/7qyU9kHYdDXWRVN0+S1oEnAP8lJx81yP2GTL8rvMSBGU9G7lKXBAR5wIXA++W9LJKF1Rh1fzdfwY4BVgKPA38Y7q8qvZZUgtwM/D+iNg9XtNRlk3J/R5lnzP9rvMSBLl5NnJEbEl/bgVuJTlMfFbSXID059bKVZiZsfaxar/7iHg2IgYiYhD4HAe7BKpmnyXVkvxC/EpE3JIururverR9zvq7zksQlPP85ClPUrOk1qFp4DeAB0n29Q/SZn8AfKsyFWZqrH28DbhcUr2kxcAS4GcVqG/CDf0yTL2W5LuGKtlnSQK+AKyLiH8qWVW13/VY+5z5d13ps+STeDb+EpIz8I8Cf17pejLax5NJriC4H3hoaD+BmcAPgQ3pzxmVrvUY9/NGksPjPpK/iP5ovH0E/jz93tcDF1e6/gnc5y8DvwAeSH8hzK2yfb6QpJvjAWBN+rqkmr/rcfY50+/aQ0yYmeVcXrqGzMxsDA4CM7OccxCYmeWcg8DMLOccBGZmOecgMJtEkl4u6TuVrsOslIPAzCznHARmo5D0Fkk/S8d+/6ykoqS9kv5R0n2SfiipI227VNI96YBgtw4NCCbpVEk/kHR/+p5T0s23SPqGpIclfSW9m9SsYhwEZiNIOgN4I8kAfkuBAeDNQDNwXySD+t0J
fDh9y5eAD0bEWSR3fw4t/wpwbUScDfwKyZ3BkIwo+X6SseRPBi7IfKfMxlFT6QLMjkOvBM4DVqV/rDeSDGw2CHw1bfPvwC2SpgHtEXFnuvwG4OvpmE/zIuJWgIjoBki397OI6Ezn1wCLgJ9kv1tmo3MQmB1OwA0R8aFDFkp/OaLdeOOzjNfd01MyPYD/P7QKc9eQ2eF+CLxe0mwYfkbuSST/v7w+bfN7wE8iYhewU9JL0+VvBe6MZAz5TkmXpduol9Q0qXthVib/JWI2QkSslfQXJE96K5CM+PluYB/wQkn3ArtIziNAMhTyivQX/SbginT5W4HPSvo/6TZ+dxJ3w6xsHn3UrEyS9kZES6XrMJto7hoyM8s5HxGYmeWcjwjMzHLOQWBmlnMOAjOznHMQmJnlnIPAzCzn/j89SO9Chk7gMAAAAABJRU5ErkJggg==\n",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {
+ "needs_background": "light"
+ },
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "plt.plot(history.history['loss'])\n",
+ "plt.title('model loss')\n",
+ "plt.ylabel('accuracy')\n",
+ "plt.xlabel('epoch')\n",
+ "plt.legend(['train'], loc='upper left')\n",
+ "plt.show()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Performing inference"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 16,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "noise = np.random.uniform(-0.25, 0.25, 2)\n",
+ "test_data = tfq.convert_to_tensor([\n",
+ " cirq.Circuit(cirq.ry(noise[0])(qubit)),\n",
+ " cirq.Circuit(cirq.ry(noise[1] + np.pi/2)(qubit)), \n",
+ "])"
+ ]
+ },
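+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As an optional sanity check, you can print the test circuits before running inference. The cell below is purely illustrative: it re-creates the same two circuits for display, assuming the `qubit` and `noise` variables defined in the cells above."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Re-create the two test circuits just to display them.\n",
+ "for angle in [noise[0], noise[1] + np.pi/2]:\n",
+ " print(cirq.Circuit(cirq.ry(angle)(qubit)))"
+ ]
+ },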
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "You can see in the below cell that our model does a good job with this data though it was very easy."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 17,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "array([[9.0111643e-01, 9.8883577e-02],\n",
+ " [1.7436201e-04, 9.9982566e-01]], dtype=float32)"
+ ]
+ },
+ "execution_count": 17,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "predictions = model.predict(test_data)\n",
+ "predictions"
+ ]
+ }
+ ],
+ "metadata": {
+ "environment": {
+ "name": "tf2-gpu.2-1.m47",
+ "type": "gcloud",
+ "uri": "gcr.io/deeplearning-platform-release/tf2-gpu.2-1:m47"
+ },
+ "kernelspec": {
+ "display_name": "Python 3",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.7.6"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
\ No newline at end of file