TP-reseaux-profond/TP4.ipynb


2023-06-22 18:35:38 +00:00
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "XMMppWbnG3dN"
},
"source": [
"\n",
"# Estimation de posture dans une image\n",
"\n",
"Pour ce TP ainsi que le suivant, nous allons traiter le problème de la détection du \"squelette\" d'un humain dans une image, tel qu'illustré dans la figure ci-dessous.\n",
"\n",
"![Texte alternatif…](https://drive.google.com/uc?id=1HpyLwzwkFdyQ6APoGZQJL7f837JCHNkh)\n",
"\n",
"Nous allons pour ce faire utiliser le [Leeds Sport Pose Dataset](https://sam.johnson.io/research/lspet.html) qui introduit 10000 images présentant des sportifs dans diverses situations, augmentées d'une annotation manuelle du squelette.\n",
"\n",
"À chaque image est associée une matrice de taille 3x14, correspondant aux coordonnées dans l'image des 14 joints du squelette de la personne décrite dans l'image. La 3e dimension désigne la visibilité du joint (1 s'il est visible, 0 s'il est occulté)\n",
"\n",
"Ces joints sont, dans l'ordre :\n",
"* Cheville droite\n",
"* Genou droit\n",
"* Hanche droite\n",
"* Hanche gauche\n",
"* Genou gauche\n",
"* Cheville gauche\n",
"* Poignet droit\n",
"* Coude droit\n",
"* Épaule droite\n",
"* Épaule gauche\n",
"* Coude gauche\n",
"* Poignet gauche\n",
"* Cou\n",
"* Sommet du crâne\n",
"\n",
"Pour un rappel des notions vues en cours sur ce sujet, vous pouvez regarder la vidéo ci-dessous :\n"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"data": {
"text/html": [
"\n",
" <iframe\n",
" width=\"640\"\n",
" height=\"360\"\n",
" src=\"https://video.polymny.studio/?v=84ace9c1-f460-4375-9b33-917c3ff82c83/\"\n",
" frameborder=\"0\"\n",
" allowfullscreen\n",
" \n",
" ></iframe>\n",
" "
],
"text/plain": [
"<IPython.lib.display.IFrame at 0x7f1618711910>"
]
},
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from IPython.display import IFrame\n",
"IFrame(\"https://video.polymny.studio/?v=84ace9c1-f460-4375-9b33-917c3ff82c83/\", width=640, height=360)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Méthodologie \n",
"\n",
"Pour résoudre ce problème, nous allons suivre une méthodologie similaire à celle présentée dans le 2e cours, et rappelée sur la figure suivante : \n",
"\n",
"![Méthodologie de développement d'un algorithme d'apprentissage profond](https://drive.google.com/uc?id=195pkcjca4r_g86KDt2LCe0QdQsMC6iba)\n",
"\n",
"Ainsi nous allons commencer par une modélisation simple du problème, construire un modèle et l'améliorer pas à pas et évaluer sa performance.\n",
"Dans un second temps, nous modifierons la modélisation du problème, et donc l'architecture utilisée, afin d'améliorer les résultats.\n",
"\n",
"Pour chacune de ces deux étapes, je vous suggère de suivre la démarche suivante : \n",
"\n",
"- Simplifier le problème en traitant 10 imagettes (par exemple de dimension $64 \\times 64$) et construire un réseau qui surapprend parfaitement (qui diminue la perte jusqu'à quasiment 0)\n",
"- Ajouter des images (~1000) et recalibrer le réseau pour à nouveau, obtenir un sur-apprentissage\n",
"- Commencer à corriger le sur-apprentissage en ajoutant de la régularisation\n",
"- Et enfin, utiliser l'ensemble de la base de données pour diminuer le sur-apprentissage au maximum"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "qjqZNAX2CVi1"
},
"source": [
"# Régression de la position des joints"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "WFXG64qtCaCb"
},
"source": [
"Dans un premier temps, et comme vu en cours, nous allons nous inspirer de l'algorithme DeepPose (**[Toshev et al.] DeepPose : Human Pose Estimation via Deep Neural Networks**) et formuler le problème comme une régression de la position (x,y) des joints dans l'espace de l'image."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "z3mdNJJXc6Wy"
},
"source": [
"Commencez par télécharger la base de données sur Github\n",
"\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"id": "3IVjmLKWRDag",
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"fatal: le chemin de destination 'lsp' existe déjà et n'est pas un répertoire vide.\n"
]
}
],
"source": [
"!git clone https://github.com/axelcarlier/lsp.git"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "_-EFIogzdCc9"
},
"source": [
"Le bloc suivant contient une fonction qui permet de charger les images de la base de données dans les variables x et y. Par défaut les images sont redimensionnées en taille 128$\\times$128 et la base de données contient 1000 images. Pour commencer et vous permettre de travailler plus efficacement, **je vous suggère très fortement de diminuer la dimension des images** (par exemple 64$\\times$64) **et de ne travailler que sur un ensemble réduit d'images** (par exemple, 10). \n",
"\n",
"\n",
"N'oubliez pas également de diviser les données en images de test et/ou de validation pour obtenir des informations sur le sur-apprentissage éventuel. \n"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"id": "quOHEF__pf36",
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"data": {
"text/plain": [
"((10, 64, 64, 3), (10, 3, 14))"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import numpy as np\n",
"import PIL\n",
"from PIL import Image\n",
"import os, sys\n",
"from scipy.io import loadmat\n",
"\n",
"# Cette fonction permettra plus tard de charger plus ou moins d'images (en modifiant le paramètre num_images)\n",
"# et de modifier la dimension d'entrée\n",
"def load_data(image_size=128, num_images=1000):\n",
"\n",
" path = \"./lsp/images/\"\n",
" dirs = sorted(os.listdir(path))\n",
"\n",
" x = np.zeros((min(num_images,len(dirs)),image_size,image_size,3))\n",
" y = np.zeros((min(num_images,len(dirs)), 3, 14))\n",
" \n",
" #Chargement des joints \n",
" mat_contents = loadmat('./lsp/joints.mat')\n",
" joints = mat_contents['joints']\n",
"\n",
" # Chargement des images, qui sont rangées dans lsp/images\n",
" for i in range(min(num_images,len(dirs))):\n",
" item = dirs[i]\n",
" if os.path.isfile(path+item):\n",
" img = Image.open(path+item)\n",
" # Redimensionnement et sauvegarde des joints\n",
" y[i, 0] = joints[:,0,i]*image_size/img.size[0]\n",
" y[i, 1] = joints[:,1,i]*image_size/img.size[1]\n",
" y[i, 2] = joints[:,2,i]\n",
" # Redimensionnement et sauvegarde des images \n",
" img = img.resize((image_size,image_size))\n",
" x[i] = np.asarray(img)\n",
"\n",
"\n",
" return x, y\n",
"\n",
"# Chargement de seulement 10 images, de taille 64x64\n",
"x, y = load_data(image_size=64, num_images=10) \n",
"x.shape, y.shape"
]
},
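The loading cell above returns raw pixel values in `x` and labels in `y`. Before training, it helps to scale the pixels and hold out a validation split, as suggested earlier. Below is a minimal sketch of that step; it runs on random stand-in arrays here so it is self-contained, whereas in the notebook you would apply it to the `x`, `y` returned by `load_data`:

```python
import numpy as np

# Stand-in for load_data's output: 10 images of 64x64x3 and their 3x14 labels
x = np.random.rand(10, 64, 64, 3) * 255
y = np.random.rand(10, 3, 14) * 64

# Scale pixel values to [0, 1] and hold out 20% of the images for validation
x = x / 255.0
n_val = max(1, int(0.2 * x.shape[0]))
x_train, x_val = x[:-n_val], x[-n_val:]
y_train, y_val = y[:-n_val], y[-n_val:]

print(x_train.shape, x_val.shape)
```

With the real dataset you may prefer a random split (e.g. shuffling indices first), since the images are sorted by filename.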
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"id": "zRc0B4oxe6h_",
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"labels = {0: 'Cheville droite',\n",
" 1: 'Genou droit',\n",
" 2: 'Hanche droite',\n",
" 3: 'Hanche gauche',\n",
" 4: 'Genou gauche',\n",
" 5: 'Cheville gauche',\n",
" 6: 'Poignet droit',\n",
" 7: 'Coude droit',\n",
" 8: 'Épaule droite',\n",
" 9: 'Épaule gauche',\n",
" 10: 'Coude gauche',\n",
" 11: 'Poignet gauche',\n",
" 12: 'Cou',\n",
" 13: 'Sommet du crâne'}"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "meezS1y4G8QO"
},
"source": [
"La fonction suivante vous permet de visualiser les données. Vous vous rendrez compte que certaines données sont manquantes ! En effet quand des joints sont occultés dans les images, des valeurs de position aberrantes (négatives) sont indiquées. Dans ce cas, nous n'afficherons pas les articulations."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"id": "JvcqdQIZdCYk",
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAATEAAAEzCAYAAABZrTRjAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAB0MUlEQVR4nO2dd3hcV5n/v+dOH42kUbO6Lbn3nuIUJ3F6DyGEEiDUwLKwsOz+lixsoSwltIVll0AgIZQUQghpkODEcZrjbstVtiXbsnrXaDR95t7z+0NK7Pc9F0tuKvb5PI8f652599xz25l7v+ctQkoJjUajmagYY90BjUajORX0IKbRaCY0ehDTaDQTGj2IaTSaCY0exDQazYRGD2IajWZCc0qDmBDiOiHEfiFEvRDi3tPVKY1Goxkp4mT9xIQQDgAHAFwNoBnAZgDvl1LuPX3d02g0muNzKk9i5wOol1IeklKmADwO4NbT0y2NRqMZGacyiJUDaDrGbh76TKPRaEYN55negBDiHgD3AIBhiGU+rwuGQTebNi1lvVTGpO0YDmJnub3EDmT5lTZiiRizE8S2MnS7wubN2sfaNbwuYjsc1LZM2m8AGBjop8uAbsjv8xE7JyugtOGAoH0VQlmG9suhfGYYxrDLHMtAqFf5zOOk63g9bmLH4vQYCyc9PgDgz8kndm8P3U5kIEpsd1a20kYqTpfxO+hx97ppP/ujcaUN06DXkHDSfckO0O3m5ajnxTJTxG5pbSV2PEmPh4R6rfu99BrzeahdWFhE+2mozx4D0Qix29tbiO1y0/Pg99J9B9Trw+Xk1zo9phInLkWZmQyxB8IDyjIZdg/NmTMbALB169ZuKWURX/5UBrEWAJXH2BVDnxGklA8AeAAAAlkeuWBuBfy+IFmmvV+9wBr76EnxuOkFtLxqLrFXXrBYaWPzvq3ErtlfT+xoJ92GN6UODPMuWkqXmV5C7GAetRPhsNLGy6+8QOw46IW9dNFCYl+z7CKljVzDQ2zDQS84p5OeSr9fHdSzs7OPazsMuv8vP/WI0sa0kjxiz6quJPbOPftoP4MVShvLrn4PsR/9ze+J/eabm4k9eenlShtN+zYSe3FOiNhzJtN+vrhxt9JGn38Wsd15dF8uv3QlsW+/+mKljVhfM7H/7Wv/Sexd9bXEtgQ99wCwZO4yYs+dRq+Hj3/s74jt8mUpbbyx4S1if+u7dJ6topJepwvn0fsHAAI+OrCVFNJ1soP0mEqp/mBLiz18SHpN9fb0EHvtmleVNnp76Y/a5i0bBtsS4oiyME5tENsMYIYQohqDg9f7AHzgeCtISyIdTWMgTkffVFI9GA725OX10wMs2Q/8zFkzlTbikg5STS30V9KXpL8KZjiptPHWG2uJndhNBweXk15Qroy6L4X5dACeVjaZ2MnuPmJ3NdF+AkDKQ5/WnB56AHzsaS6ZUH8YwJ4CTJPuv4sNjJEIfYIEgN1724mdl5ND7PpDTcSOgF60AJBbtYjYXjfdbl4O3ZdUTO1HPEavoSP9dLsySW+E7t6Q0gYE/cEpn55LbLeDPmmk2VMEAEj2VlHEzm18/35iGw71yfRwIz3f+dmltB/sB8ol1GtMpOmTaSZJr+XGhsPEnjaZDtgAkOOn13JfmB73LjYAJVNppY1AFj136SR9Ug330zZNS90Xj8ejfHY8TnoQk1JmhBCfBfBXAA4AD0kp95xsexqNRnMynJImJqX8C4C/nKa+aDQazQmjPfY1Gs2E5ozPTh6LtAAzYSGeZLNP/hxlWStF9az+/hCx69J1xH7p5ZeUNoRFRdQpBXQ7RoDORqUiqiYWP0T7OhCmtmVQncRIqbrJFSsvIfbSC+hkwb6DVDeZVUF1FQDIZdqT4aN9d7upbTd76XJRPaalhc7D9PVSzSPUr85O1u3eRWy/j/brYGMHsfceVuZ6sKuxk9jveddtxF6+cAZto0OdLMny0t/fI4epwN7TRfWsnrR6qZ
+3gM6SXryUCv2BPCpkp1KqzujNojpSWVU1sdNgM8A2mml3H72m5sydT+zCQtrPgT5VZzRSdCY+FaNtSoPqoa0tqu5aXFRM7Hic7u/69etpPwbUmcUF82jfuQaWZB4C6TTVzIZ6a/PZ30Y/iWk0mgmNHsQ0Gs2ERg9iGo1mQqMHMY1GM6EZVWE/bZpo7w0jP7uAfO5x+dRl41RUTjFBsHOACv+rV/9VaWNaORUqi4uo+B3MouJ3ydSpShvB0knEfnk79cDu6qPCpddGk5wSpMLsjELarxTz8s+x8bb38/AnH3UI9DBhn4eQAECCHcO9e2nCkQP7qbe9FVZF+TAL33ljcw2xQwNUqE3bXGL7a6k7YeSKFcSeOZV6+ccMOhEAAJEueu46mNNkJkmF7IxTDbMJBljfkiFiFuTQCZbCXPU69blpP6pL6LkuyaNREdNn0ckDAHjXDbcR+7rLriG2xa4p0+Yaa26lTsiVxdRhtj8aIvah+kNKG4agkxAmC/85eIiu43ap5zbJnGx5JEmUtWkXuiSlGpp1PPSTmEajmdDoQUyj0Uxo9CCm0WgmNKOqiRlOF7zFJUilqY6QsEuTkmZOgcxZL5WmTqV9UdVRtRksPYtFdaPSKTSrx4wqqiMAwAWrFlP7aqr5PL+GOtluW71GacProIc5xjQwi+s5NjmBBA8CZjZPk9LfrwZNb9lKs3rU1lJ9r62FOoz6WAA9AAiWNqY5RB0ePVnUQdTKUKdLALBY4HldLdXmlsyhulFxSNWztsdCxHaxfrm9VDNMZ9S0Qx4PPYaRfupEOtBHtTh3hmqZABDvo869k/103z5y05XEvvS6m5U2li65kNihXqpdHmym22hqUrXK17dQPTMvl56HyjKakWLHPupgDQDxOL22eZaK8jJ6f2RsHFW5g2x/KMS+p0656ZSa1YOn6xkO/SSm0WgmNHoQ02g0Exo9iGk0mgnN6GpibheyKspQEaS+V9UlU5RlV7/6KrG7eqkuYLGkf/GYqon1sCDY6gzd3fwcmgTP41MT1k2pon278NrlxJ61fDGxfxxX9b38Uqq9dQ2EaD9ZamErqmpRGaar+dOqL9mxHGloUD6TLMvmddffROyOdhoUvO7l55Q2whmqgfVmaGK8Ai/VHf3MBoB+lva6hQUj5+YFiV0RV7UXK8UCnFnAu3BRHc1hExDvYSmaHSwds5Wi11S4Sw2abt2/idgDbY3EzrVYOuZWmrwRAFpz6f2wYz/VJuua6fHaf0jVxA730GM0J48GphfmskD1YqqRAcC7brud2BGWeHJ/LQ3+7w/RZJ4A0NHRTey6A1R7MzNMd7NJimjj4nhc9JOYRqOZ0OhBTKPRTGj0IKbRaCY0o6qJJZNJ1B2sRzKP6hllk9QkgIXl1Cclv5qWtORaS79N6ac0Sx6XxQIb3X7qS5MwVV8iH9PNLFYxvbqaJsG76+67lTaSPV3EXr+N6iiN3VRHcB5uUNrI9tFiIzmsUpHbRY+HXVq5srIyYi9YvITYuQ30eKx5Qc08Hpf0GFkssWRWZSGx811qZZ62Q7RoTTJD9SqXn+5bwK+e22SM+hslk1Sb87C4WMNSfY8cLFYwkE3PdUcX1XwcCbUfXQdoBa2+dlqQo7axjdiHO9V+7N5HNa51Ow4SO+Om8ZjhlE2pPU+QmKVF9HopLaAa6r5DVLsDgGSCaYDMn7Gjk8Znxmy026Zmui+8DV6sxq7EodNm946HfhLTaDQTGj2IaTSaCY0exDQazYRGD2IajWZCM7rVjjIZpLp60NBFBUEzocrQST+r5lNGEynmF1LhMnuy6rxX6aLl4OezCtfZklUw9lBhGwAslrCxP0L77vLTQOO58+YpbTzx6O+I/dc33iB2SrDfErWwMtwGE9BzaZUhnsAuyyaxYm4uFa5r66iAPNDPhNxuKp4DgJFPhfrsYnoeRC51GA0PqG04WTXz1nYarP77p+mEQhVLNAgALqb+mhlW3ZwlCAjaVJUOsCBxL+
vXui01xN6VUvcl00eF7MQAPYZ1LfR62dG9Q2kj8vpuamdov6bNow7WTg895gDgZ0kx8wP0/JcV0Ws7ZnNefve7x+g
"text/plain": [
"<Figure size 360x360 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"# Fonction d'affichage d'une image et de son label associé\n",
"def print_data(x,y,i):\n",
" \n",
" if y.shape[1] < 3:\n",
" y_new = np.ones((y.shape[0], 3, y.shape[2]))\n",
" y_new[:,0:2,:] = y\n",
" y = y_new\n",
" \n",
" plt.figure(figsize=(5, 5))\n",
" plt.imshow(x[i]/255)\n",
" for j in range(0,14):\n",
" if y[i, 2, j] == 1:\n",
" plt.scatter(y[i,0,j],y[i,1,j],label=labels.get(j))\n",
"\n",
" # Jambe droite \n",
" if (y[i, 2, 0] + y[i, 2, 1] == 2):\n",
" plt.plot(y[i,0,0:2],y[i,1,0:2],'b')\n",
" # Cuisse droite \n",
" if (y[i, 2, 1] + y[i, 2, 2] == 2):\n",
" plt.plot(y[i,0,1:3],y[i,1,1:3],'b')\n",
" # Bassin \n",
" if (y[i, 2, 2] + y[i, 2, 3] == 2):\n",
" plt.plot(y[i,0,2:4],y[i,1,2:4],'b')\n",
" # Cuisse gauche \n",
" if (y[i, 2, 3] + y[i, 2, 4] == 2):\n",
" plt.plot(y[i,0,3:5],y[i,1,3:5],'b')\n",
" # Jambe gauche \n",
" if (y[i, 2, 4] + y[i, 2, 5] == 2):\n",
" plt.plot(y[i,0,4:6],y[i,1,4:6],'b')\n",
" # Avant-bras droit \n",
" if (y[i, 2, 6] + y[i, 2, 7] == 2):\n",
" plt.plot(y[i,0,6:8],y[i,1,6:8],'b')\n",
" # Bras droit \n",
" if (y[i, 2, 7] + y[i, 2, 8] == 2):\n",
" plt.plot(y[i,0,7:9],y[i,1,7:9],'b')\n",
" # Bras gauche \n",
" if (y[i, 2, 9] + y[i, 2, 10] == 2):\n",
" plt.plot(y[i,0,9:11],y[i,1,9:11],'b')\n",
" # Avant-bras gauche \n",
" if (y[i, 2, 10] + y[i, 2, 11] == 2):\n",
" plt.plot(y[i,0,10:12],y[i,1,10:12],'b') \n",
" # Buste droit\n",
" x1=[y[i,0,2],y[i,0,12]]\n",
" y1=[y[i,1,2],y[i,1,12]]\n",
" if (y[i, 2, 2] + y[i, 2, 12] == 2):\n",
" plt.plot(x1, y1,'b')\n",
" # Buste gauche\n",
" x1=[y[i,0,3],y[i,0,12]]\n",
" y1=[y[i,1,3],y[i,1,12]]\n",
" if (y[i, 2, 3] + y[i, 2, 12] == 2):\n",
" plt.plot(x1, y1,'b')\n",
" # Omoplate droite\n",
" x1=[y[i,0,8],y[i,0,12]]\n",
" y1=[y[i,1,8],y[i,1,12]]\n",
" if (y[i, 2, 8] + y[i, 2, 12] == 2):\n",
" plt.plot(x1, y1,'b')\n",
" # Omoplate gauche\n",
" x1=[y[i,0,9],y[i,0,12]]\n",
" y1=[y[i,1,9],y[i,1,12]]\n",
" if (y[i, 2, 9] + y[i, 2, 12] == 2):\n",
" plt.plot(x1, y1,'b')\n",
" # Tete \n",
" if (y[i, 2, 12] + y[i, 2, 13] == 2):\n",
" plt.plot(y[i,0,12:14],y[i,1,12:14],'b')\n",
"\n",
" plt.axis([0, x.shape[1], x.shape[2], 0])\n",
" plt.show()\n",
" #plt.legend()\n",
"\n",
"# Affichage aléatoire d'une image\n",
"print_data(x,y,np.random.randint(x.shape[0]-1))\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Si nous formulons ce problème comme une régression, nous allons utiliser pour évaluer nos réseaux de neurones l'erreur quadratique moyenne (fonction *MSE*). Cette fonction sera parfaite comme fonction de perte, mais elle ne permet pas d'appréhender les résultats de manière satisfaisante.\n",
"\n",
"Une métrique commune en estimation de posture est le **PCK0.5**, pour *Percentage of Correct Keypoints*. *0.5* correspond à un seuil en-deça duquel on considère qu'un joint est correctement estimé. Cette question du seuil est particulièrement sensible car il faut utiliser une valeur qui soit valable pour n'importe quelle image. La personne considérée peut apparaître plus ou moins largement sur l'image, de face ou de profil, ce qui fait qu'une erreur de prédiction sur un joint peut avoir une importance très grande ou très faible selon les cas.\n",
"\n",
"Pour résoudre cette ambiguïté, on considère dans la métrique du **PCK0.5** que la référence est la taille de la tête, définie par la distance entre le joint du cou et le joint de la tête sur la vérité terrain. Un joint prédit par le réseau sera considéré correct s'il est situé à une distance inférieure à la moitié (*0.5*) de la taille de la tête par rapport au joint réel. ([Andriluka et al.] 2D Human Pose Estimation: New Benchmark and State of the Art Analysis)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"import numpy.matlib \n",
"\n",
"# Calcul du \"Percentage of Correct Keypoint\" avec seuil alpha :\n",
"# On compte corrects les joints pour lesquels la distance entre valeurs réelle et prédite \n",
"# est inférieure à alpha fois la dimension de la tête (c'est un peu arbitraire...)\n",
"# On ne comptera pas les joints invisibles.\n",
"# y_true est de dimension Nx3x14 et y_pred Nx2x14 (le réseau ne prédit pas la visibilité)\n",
"def compute_PCK_alpha(y_true, y_pred, alpha=0.5):\n",
" # Calcul des seuils ; la taille de la tête est la distance entre joints 12 et 13\n",
" head_sizes = np.sqrt(np.square(y_true[:,0,13]-y_true[:,0,12])+np.square(y_true[:,1,13]-y_true[:,1,12]))\n",
" thresholds = alpha*head_sizes\n",
" thresholds = np.matlib.repmat(np.expand_dims(thresholds, 1), 1, 14)\n",
"\n",
" # Calcul des distances inter-joints\n",
" joints_distances = np.sqrt(np.square(y_true[:,0,:]-y_pred[:,0,:]) + np.square(y_true[:,1,:]-y_pred[:,1,:]))\n",
"\n",
" # Visibilité des joints de la vérité terrain\n",
" visibility = y_true[:,2,:]\n",
" \n",
" total_joints = np.count_nonzero(visibility==1)\n",
" correctly_predicted_joints = np.count_nonzero(np.logical_and(joints_distances<thresholds, visibility == 1))\n",
" \n",
" return correctly_predicted_joints/total_joints"
]
},
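To check the logic of the metric, here is a quick sanity test on synthetic data. The function is re-implemented compactly with NumPy broadcasting so the snippet stands alone (it is not the notebook's `compute_PCK_alpha`, but computes the same quantity): exact predictions should yield a PCK of 1.0, and grossly wrong ones a PCK of 0.0.

```python
import numpy as np

# Compact re-implementation of the PCK computation, for a sanity check
def pck_alpha(y_true, y_pred, alpha=0.5):
    # Per-image threshold: alpha times the head size (distance between joints 12 and 13)
    head = np.linalg.norm(y_true[:, 0:2, 13] - y_true[:, 0:2, 12], axis=1)
    thr = (alpha * head)[:, None]
    dist = np.linalg.norm(y_true[:, 0:2, :] - y_pred[:, 0:2, :], axis=1)
    vis = y_true[:, 2, :] == 1
    return np.count_nonzero((dist < thr) & vis) / np.count_nonzero(vis)

# Synthetic ground truth: all joints visible, head size of 10 pixels
y_true = np.zeros((2, 3, 14))
y_true[:, 0, :] = np.arange(14) * 10.0  # x-coordinates
y_true[:, 1, :] = 5.0                   # y-coordinates
y_true[:, 2, :] = 1                     # visibility flags

perfect = y_true[:, 0:2, :].copy()      # exact predictions
far_off = perfect + 100.0               # grossly wrong predictions
print(pck_alpha(y_true, perfect), pck_alpha(y_true, far_off))  # -> 1.0 0.0
```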
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Comme dit précédemment, on va utiliser l'erreur quadratique moyenne (*MSE*) comme fonction de coût pour entraîner notre réseau de neurones, et on peut également comme pour le TP2 utiliser l'erreur absolue moyenne (*MAE*) pour obtenir une estimation plus fine des performances de notre réseau pendant l'entraînement (on obtient une erreur moyenne en pixels, ce qui est plus simple à interpréter).\n",
"\n",
"Il y a cependant une subtilité importante évoquée un peu plus haut : certains joints sont invisibles, et ont des coordonnées négatives (pour, il faut l'avouer, une raison un peu inexplicable). Il est important de ne pas affecter l'apprentissage en faisant prédire ces valeurs négatives, insensées, au réseau. \n",
"\n",
"On doit donc implanter nous-même notre propre fonction de coût, qui ne va pas prendre en compte les joints invisibles. Pour cela, il faut savoir que la vérité-terrain contient en fait 3 valeurs pour chaque joint : les 2 premières sont ses coordonnées sur l'image, la 3e représente la visibilité du joint (1 s'il est visible, 0 sinon).\n",
"\n",
"La fonction *custom_mse*, définie juste en-dessous, réalise cette opération. Prenez le temps de comprendre ce qu'il s'y passe. **Remarque importante** : Ce code fait appel à des fonctions particulières du Backend de Keras, dont vous trouverez les détails sur [cette page](https://keras.rstudio.com/articles/backend.html). Ces fonctions doivent traiter des tenseurs, de type *Tensor* (et pas des tableaux numpy), car elles seront appelées pendant l'entraînement sur des variables internes à Tensorflow. Les fonctions utilisables sont également limitées car il faut pouvoir dériver la fonction *custom_mse* pour la rétropropagation des gradients."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"import keras.backend as K\n",
"# y_true : vérité terrain de dimension B x 3 x 14\n",
"# y_pred : une prédiction de dimension B x 2 x 14 (on ne prédit pas la visibilité)\n",
"# B est le nombre d'images considérées (par exemple, pourra être la taille d'un mini-batch)\n",
"def custom_mse(y_true, y_pred):\n",
" # Changement de dimension : Bx3x14 -> Bx14x3\n",
" y_true = K.permute_dimensions(y_true, (0, 2, 1))\n",
" # Changement de dimension : Bx14x3 -> (B*14)x3\n",
" y_true = K.reshape(y_true, shape=(-1, 3))\n",
" \n",
" # Changement de dimension : Bx2x14 -> Bx14x2\n",
" y_pred = K.permute_dimensions(y_pred, (0, 2, 1))\n",
" # Changement de dimension : Bx14x2 -> (B*14)x2\n",
" y_pred = K.reshape(y_pred, shape=(-1, 2))\n",
" \n",
" # Détermination de l'indices des joints visibles\n",
" visible = K.greater_equal(y_true[:, 2], 1) \n",
" indices = K.arange(0, K.shape(y_true)[0])\n",
" indices_visible = indices[visible]\n",
" \n",
" # Sélection des vérité-terrains et prédictions des joints visibles\n",
" y_true_visible = K.gather(y_true[:,0:2], indices_visible)\n",
" y_pred_visible = K.gather(y_pred, indices_visible)\n",
" \n",
" # Calcul de la MSE\n",
" return K.mean(K.square(y_pred_visible[:,0] - y_true_visible[:,0]) + K.square(y_pred_visible[:,1] - y_true_visible[:,1]))"
]
},
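To convince yourself that the masking really discards the invisible joints, here is a plain NumPy re-implementation of the same computation, useful as a checking aid (the Keras version above remains the one used for training, since it operates on tensors):

```python
import numpy as np

# NumPy equivalent of custom_mse: average squared error over visible joints only
def masked_mse_ref(y_true, y_pred):
    vis = y_true[:, 2, :] >= 1                 # (B, 14) visibility mask
    dx = y_pred[:, 0, :] - y_true[:, 0, :]
    dy = y_pred[:, 1, :] - y_true[:, 1, :]
    return float((dx**2 + dy**2)[vis].mean())

y_true = np.ones((1, 3, 14))
y_true[0, 2, 7:] = 0            # mark joints 7..13 as invisible
y_pred = np.zeros((1, 2, 14))
y_pred[0, :, 7:] = 100.0        # huge error, but only on the invisible joints

print(masked_mse_ref(y_true, y_pred))  # -> 2.0: the invisible joints are ignored
```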
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Si vous avez bien compris le code de *custom_mse*, vous devriez pouvoir sans trop de problèmes écrire le code pour la fonction *custom_mae* ci-dessous :"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"# y_true : vérité terrain de dimension B x 3 x 14\n",
"# y_pred : une prédiction de dimension B x 2 x 14 (on ne prédit pas la visibilité)\n",
"# B est le nombre d'images considérées (par exemple, pourra être la taille d'un mini-batch)\n",
"def custom_mae(y_true, y_pred):\n",
" # Changement de dimension : Bx3x14 -> Bx14x3\n",
" y_true = K.permute_dimensions(y_true, (0, 2, 1))\n",
" # Changement de dimension : Bx14x3 -> (B*14)x3\n",
" y_true = K.reshape(y_true, shape=(-1, 3))\n",
" \n",
" # Changement de dimension : Bx2x14 -> Bx14x2\n",
" y_pred = K.permute_dimensions(y_pred, (0, 2, 1))\n",
" # Changement de dimension : Bx14x2 -> (B*14)x2\n",
" y_pred = K.reshape(y_pred, shape=(-1, 2))\n",
" \n",
" # Détermination de l'indices des joints visibles\n",
" visible = K.greater_equal(y_true[:, 2], 1) \n",
" indices = K.arange(0, K.shape(y_true)[0])\n",
" indices_visible = indices[visible]\n",
" \n",
" # Sélection des vérité-terrains et prédictions des joints visibles\n",
" y_true_visible = K.gather(y_true[:,0:2], indices_visible)\n",
" y_pred_visible = K.gather(y_pred, indices_visible)\n",
" \n",
" # Calcul de la MAE\n",
" return K.mean(K.abs(y_pred_visible[:,0] - y_true_visible[:,0]) + K.abs(y_pred_visible[:,1] - y_true_visible[:,1]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Comme d'habitude, on peut monitorer l'entraînement grâce à la fonction suivante (adaptée à nos fonctions *custom_mse* et *custom_mae* définies juste avant) : "
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"def plot_training_analysis(history):\n",
" mae = history.history['custom_mae']\n",
" val_mae = history.history['val_custom_mae']\n",
" loss = history.history['loss']\n",
" val_loss = history.history['val_loss']\n",
"\n",
" epochs = range(len(loss))\n",
"\n",
" plt.plot(epochs, mae, 'b', linestyle=\"--\",label='Training MAE')\n",
" plt.plot(epochs, val_mae, 'g', label='Validation MAE')\n",
" plt.title('Training and validation MAE')\n",
" plt.legend()\n",
"\n",
" plt.figure()\n",
"\n",
" plt.plot(epochs, loss, 'b', linestyle=\"--\",label='Training loss')\n",
" plt.plot(epochs, val_loss,'g', label='Validation loss')\n",
" plt.title('Training and validation loss')\n",
" plt.legend()\n",
"\n",
" plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# A vous de jouer :\n",
"\n",
"Pour tenter de résoudre le problème, vous pouvez suivre les étapes suivantes : \n",
"\n",
"- Simplifier le problème en traitant 10 imagettes (par exemple de dimension $64 \\times 64$) et construire un réseau qui surapprend parfaitement (qui diminue la perte jusqu'à quasiment 0)\n",
"- Ajouter des images (~1000) et éventuellement recalibrer votre réseau pour à nouveau, obtenir un sur-apprentissage\n",
"- Commencer à corriger le sur-apprentissage en ajoutant de la régularisation (notamment sur les couches denses)\n",
"- Et enfin, utiliser l'ensemble de la base de données pour diminuer le sur-apprentissage au maximum\n"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"import tensorflow\n",
"from tensorflow.keras.models import Sequential\n",
"from tensorflow.keras.layers import InputLayer, Dense, Flatten, Conv2D, MaxPooling2D, Reshape\n",
"from tensorflow.keras import optimizers"
]
},
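As a starting point for the overfitting step, one possible architecture is a small CNN whose head outputs 2 coordinates for each of the 14 joints, matching the Bx2x14 shape expected by `custom_mse`. This is a sketch only, not the reference solution: `build_model` is a name chosen here, the layer sizes are illustrative, and the compile call uses the built-in `'mse'` loss so the snippet is self-contained; in the notebook you would pass `loss=custom_mse` and `metrics=[custom_mae]` instead.

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Reshape

def build_model(input_shape=(64, 64, 3)):
    # Small CNN regressor: a few conv/pool stages, then a dense head
    model = Sequential([
        Input(shape=input_shape),
        Conv2D(32, 3, activation='relu'),
        MaxPooling2D(),
        Conv2D(64, 3, activation='relu'),
        MaxPooling2D(),
        Conv2D(128, 3, activation='relu'),
        MaxPooling2D(),
        Flatten(),
        Dense(512, activation='relu'),
        Dense(2 * 14),          # 2 coordinates for each of the 14 joints
        Reshape((2, 14)),       # shape expected by the custom losses
    ])
    # Placeholder loss so the sketch stands alone; use custom_mse in the notebook
    model.compile(optimizer='adam', loss='mse')
    return model

model = build_model()
model.summary()
```

With only 10 training images, a few hundred epochs of `model.fit` on this kind of network should drive the training loss close to 0, which is exactly the first milestone of the methodology.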
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model: \"sequential\"\n",
"_________________________________________________________________\n",
" Layer (type) Output Shape Param # \n",
"=================================================================\n",
" conv2d (Conv2D) (None, 62, 62, 32) 896 \n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"2022-04-06 17:55:59.634360: I tensorflow/core/platform/cpu_feature_guard.cc:151] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA\n",
"To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.\n",
"2022-04-06 17:56:00.243575: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1525] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 1537 MB memory: -> device: 0, name: Quadro K620, pci bus id: 0000:03:00.0, compute capability: 5.0\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
" \n",
" max_pooling2d (MaxPooling2D (None, 31, 31, 32) 0 \n",
" ) \n",
" \n",
" conv2d_1 (Conv2D) (None, 29, 29, 64) 18496 \n",
" \n",
" max_pooling2d_1 (MaxPooling (None, 14, 14, 64) 0 \n",
" 2D) \n",
" \n",
" conv2d_2 (Conv2D) (None, 12, 12, 92) 53084 \n",
" \n",
" max_pooling2d_2 (MaxPooling (None, 6, 6, 92) 0 \n",
" 2D) \n",
" \n",
" conv2d_3 (Conv2D) (None, 4, 4, 128) 106112 \n",
" \n",
" max_pooling2d_3 (MaxPooling (None, 2, 2, 128) 0 \n",
" 2D) \n",
" \n",
" flatten (Flatten) (None, 512) 0 \n",
" \n",
" dense (Dense) (None, 512) 262656 \n",
" \n",
" dense_1 (Dense) (None, 42) 21546 \n",
" \n",
" reshape (Reshape) (None, 3, 14) 0 \n",
" \n",
"=================================================================\n",
"Total params: 462,790\n",
"Trainable params: 462,790\n",
"Non-trainable params: 0\n",
"_________________________________________________________________\n",
"Epoch 1/500\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"2022-04-06 17:56:01.836865: I tensorflow/stream_executor/cuda/cuda_dnn.cc:368] Loaded cuDNN version 8100\n",
"2022-04-06 17:56:02.100433: W tensorflow/stream_executor/gpu/asm_compiler.cc:111] *** WARNING *** You are using ptxas 10.1.243, which is older than 11.1. ptxas before 11.1 is known to miscompile XLA code, leading to incorrect results or invalid-address errors.\n",
"\n",
"You may not need to update to CUDA 11.1; cherry-picking the ptxas binary is often sufficient.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"8/8 [==============================] - 3s 51ms/step - loss: 1645.1646 - custom_mae: 52.2486 - accuracy: 0.0833 - val_loss: 1693.5813 - val_custom_mae: 49.8971 - val_accuracy: 0.0000e+00\n",
"Epoch 2/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1444.6974 - custom_mae: 48.6929 - accuracy: 0.0000e+00 - val_loss: 1508.6548 - val_custom_mae: 46.6334 - val_accuracy: 0.0000e+00\n",
"Epoch 3/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1282.0558 - custom_mae: 45.4224 - accuracy: 0.0000e+00 - val_loss: 1346.8967 - val_custom_mae: 43.7476 - val_accuracy: 0.0000e+00\n",
"Epoch 4/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1133.2850 - custom_mae: 42.3140 - accuracy: 0.0000e+00 - val_loss: 1205.5819 - val_custom_mae: 40.9955 - val_accuracy: 0.0000e+00\n",
"Epoch 5/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 998.4826 - custom_mae: 39.4157 - accuracy: 0.0417 - val_loss: 1081.7169 - val_custom_mae: 38.4698 - val_accuracy: 0.0000e+00\n",
"Epoch 6/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 888.7233 - custom_mae: 36.5170 - accuracy: 0.0417 - val_loss: 969.3460 - val_custom_mae: 36.0378 - val_accuracy: 0.0000e+00\n",
"Epoch 7/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 776.2627 - custom_mae: 33.8331 - accuracy: 0.0417 - val_loss: 871.7504 - val_custom_mae: 33.8565 - val_accuracy: 0.0000e+00\n",
"Epoch 8/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 683.6859 - custom_mae: 31.4663 - accuracy: 0.0417 - val_loss: 788.3529 - val_custom_mae: 32.0696 - val_accuracy: 0.0000e+00\n",
"Epoch 9/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 601.2419 - custom_mae: 29.1058 - accuracy: 0.0417 - val_loss: 714.2122 - val_custom_mae: 30.3781 - val_accuracy: 0.0000e+00\n",
"Epoch 10/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 530.6856 - custom_mae: 26.8891 - accuracy: 0.0417 - val_loss: 650.6997 - val_custom_mae: 28.8502 - val_accuracy: 0.0000e+00\n",
"Epoch 11/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 460.8983 - custom_mae: 24.9396 - accuracy: 0.0417 - val_loss: 595.4228 - val_custom_mae: 27.6559 - val_accuracy: 0.0000e+00\n",
"Epoch 12/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 419.7189 - custom_mae: 23.5046 - accuracy: 0.0417 - val_loss: 554.6285 - val_custom_mae: 26.7735 - val_accuracy: 0.0000e+00\n",
"Epoch 13/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 365.1169 - custom_mae: 21.7708 - accuracy: 0.0417 - val_loss: 519.9942 - val_custom_mae: 26.0165 - val_accuracy: 0.0000e+00\n",
"Epoch 14/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 334.8514 - custom_mae: 20.6905 - accuracy: 0.0417 - val_loss: 492.1808 - val_custom_mae: 25.4429 - val_accuracy: 0.0000e+00\n",
"Epoch 15/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 303.1241 - custom_mae: 19.5947 - accuracy: 0.0417 - val_loss: 471.8553 - val_custom_mae: 25.1111 - val_accuracy: 0.0000e+00\n",
"Epoch 16/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 280.7592 - custom_mae: 18.8709 - accuracy: 0.0417 - val_loss: 454.4292 - val_custom_mae: 24.8094 - val_accuracy: 0.0000e+00\n",
"Epoch 17/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 261.7864 - custom_mae: 18.2127 - accuracy: 0.0417 - val_loss: 438.0183 - val_custom_mae: 24.6108 - val_accuracy: 0.0000e+00\n",
"Epoch 18/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 247.5717 - custom_mae: 17.7303 - accuracy: 0.0417 - val_loss: 423.1320 - val_custom_mae: 24.3313 - val_accuracy: 0.0000e+00\n",
"Epoch 19/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 236.6561 - custom_mae: 17.3107 - accuracy: 0.0417 - val_loss: 418.0301 - val_custom_mae: 24.1982 - val_accuracy: 0.0000e+00\n",
"Epoch 20/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 227.7027 - custom_mae: 16.9665 - accuracy: 0.0417 - val_loss: 410.5230 - val_custom_mae: 24.0212 - val_accuracy: 0.0000e+00\n",
"...\n",
"Epoch 189/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 14.7917 - custom_mae: 4.0052 - accuracy: 0.0417 - val_loss: 471.8742 - val_custom_mae: 26.2761 - val_accuracy: 0.0000e+00\n",
"Epoch 190/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 14.5857 - custom_mae: 3.9726 - accuracy: 0.0417 - val_loss: 473.7744 - val_custom_mae: 26.3379 - val_accuracy: 0.0000e+00\n",
"Epoch 191/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 14.4158 - custom_mae: 3.9429 - accuracy: 0.0417 - val_loss: 479.8463 - val_custom_mae: 26.5359 - val_accuracy: 0.0000e+00\n",
"Epoch 192/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 13.8274 - custom_mae: 3.8969 - accuracy: 0.0417 - val_loss: 473.4620 - val_custom_mae: 26.3280 - val_accuracy: 0.0000e+00\n",
"Epoch 193/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 13.6661 - custom_mae: 3.8534 - accuracy: 0.0417 - val_loss: 472.7038 - val_custom_mae: 26.2905 - val_accuracy: 0.0000e+00\n",
"Epoch 194/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 13.3780 - custom_mae: 3.8121 - accuracy: 0.0417 - val_loss: 476.4352 - val_custom_mae: 26.4500 - val_accuracy: 0.0000e+00\n",
"Epoch 195/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 12.7909 - custom_mae: 3.7180 - accuracy: 0.0417 - val_loss: 475.7620 - val_custom_mae: 26.4157 - val_accuracy: 0.0000e+00\n",
"Epoch 196/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 12.5520 - custom_mae: 3.6657 - accuracy: 0.0417 - val_loss: 476.3515 - val_custom_mae: 26.4403 - val_accuracy: 0.0000e+00\n",
"Epoch 197/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 12.1764 - custom_mae: 3.6299 - accuracy: 0.0417 - val_loss: 475.9054 - val_custom_mae: 26.4461 - val_accuracy: 0.0000e+00\n",
"Epoch 198/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 11.9500 - custom_mae: 3.5970 - accuracy: 0.0417 - val_loss: 473.1739 - val_custom_mae: 26.3367 - val_accuracy: 0.0000e+00\n",
"Epoch 199/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 11.5614 - custom_mae: 3.5167 - accuracy: 0.0417 - val_loss: 476.7552 - val_custom_mae: 26.4499 - val_accuracy: 0.0000e+00\n",
"Epoch 200/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 11.3241 - custom_mae: 3.4732 - accuracy: 0.0417 - val_loss: 478.5265 - val_custom_mae: 26.5310 - val_accuracy: 0.0000e+00\n",
"Epoch 201/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 11.1584 - custom_mae: 3.4535 - accuracy: 0.0417 - val_loss: 476.2689 - val_custom_mae: 26.4369 - val_accuracy: 0.0000e+00\n",
"Epoch 202/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 10.6642 - custom_mae: 3.3776 - accuracy: 0.0417 - val_loss: 475.9219 - val_custom_mae: 26.4274 - val_accuracy: 0.0000e+00\n",
"Epoch 203/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 10.6251 - custom_mae: 3.3579 - accuracy: 0.0417 - val_loss: 480.9362 - val_custom_mae: 26.6021 - val_accuracy: 0.0000e+00\n",
"Epoch 204/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 10.2866 - custom_mae: 3.3197 - accuracy: 0.0417 - val_loss: 477.6129 - val_custom_mae: 26.4705 - val_accuracy: 0.0000e+00\n",
"Epoch 205/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 9.9283 - custom_mae: 3.2573 - accuracy: 0.0417 - val_loss: 476.9622 - val_custom_mae: 26.4626 - val_accuracy: 0.0000e+00\n",
"Epoch 206/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 9.8609 - custom_mae: 3.2517 - accuracy: 0.0417 - val_loss: 475.8647 - val_custom_mae: 26.4406 - val_accuracy: 0.0000e+00\n",
"Epoch 207/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 9.4229 - custom_mae: 3.1697 - accuracy: 0.0417 - val_loss: 479.7575 - val_custom_mae: 26.5694 - val_accuracy: 0.0000e+00\n",
"Epoch 208/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 9.3209 - custom_mae: 3.1502 - accuracy: 0.0417 - val_loss: 480.2411 - val_custom_mae: 26.5635 - val_accuracy: 0.0000e+00\n",
"Epoch 209/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 9.2476 - custom_mae: 3.1546 - accuracy: 0.0417 - val_loss: 483.7486 - val_custom_mae: 26.6802 - val_accuracy: 0.0000e+00\n",
"Epoch 210/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 8.7970 - custom_mae: 3.1031 - accuracy: 0.0417 - val_loss: 482.4272 - val_custom_mae: 26.6145 - val_accuracy: 0.0000e+00\n",
"Epoch 211/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 8.8304 - custom_mae: 3.0748 - accuracy: 0.0417 - val_loss: 477.6331 - val_custom_mae: 26.4479 - val_accuracy: 0.0000e+00\n",
"Epoch 212/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 8.4544 - custom_mae: 2.9726 - accuracy: 0.0417 - val_loss: 482.2137 - val_custom_mae: 26.6322 - val_accuracy: 0.0000e+00\n",
"Epoch 213/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 8.2045 - custom_mae: 2.9717 - accuracy: 0.0417 - val_loss: 483.4682 - val_custom_mae: 26.6789 - val_accuracy: 0.0000e+00\n",
"Epoch 214/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 8.0322 - custom_mae: 2.9384 - accuracy: 0.0417 - val_loss: 478.7954 - val_custom_mae: 26.5078 - val_accuracy: 0.0000e+00\n",
"Epoch 215/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 7.7288 - custom_mae: 2.8449 - accuracy: 0.0417 - val_loss: 481.1552 - val_custom_mae: 26.5881 - val_accuracy: 0.0000e+00\n",
"Epoch 216/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 7.5951 - custom_mae: 2.8360 - accuracy: 0.0417 - val_loss: 483.2097 - val_custom_mae: 26.6620 - val_accuracy: 0.0000e+00\n",
"Epoch 217/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 7.3217 - custom_mae: 2.7876 - accuracy: 0.0417 - val_loss: 482.3259 - val_custom_mae: 26.6244 - val_accuracy: 0.0000e+00\n",
"Epoch 218/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 7.1674 - custom_mae: 2.7456 - accuracy: 0.0417 - val_loss: 482.1502 - val_custom_mae: 26.6162 - val_accuracy: 0.0000e+00\n",
"Epoch 219/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 7.0554 - custom_mae: 2.7201 - accuracy: 0.0417 - val_loss: 481.5610 - val_custom_mae: 26.6031 - val_accuracy: 0.0000e+00\n",
"Epoch 220/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 6.7992 - custom_mae: 2.6690 - accuracy: 0.0417 - val_loss: 480.9422 - val_custom_mae: 26.5998 - val_accuracy: 0.0000e+00\n",
"Epoch 221/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 6.6288 - custom_mae: 2.6315 - accuracy: 0.0417 - val_loss: 482.3175 - val_custom_mae: 26.6392 - val_accuracy: 0.0000e+00\n",
"Epoch 222/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 6.4631 - custom_mae: 2.5971 - accuracy: 0.0417 - val_loss: 482.6704 - val_custom_mae: 26.6644 - val_accuracy: 0.0000e+00\n",
"Epoch 223/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 6.2518 - custom_mae: 2.5768 - accuracy: 0.0417 - val_loss: 484.4213 - val_custom_mae: 26.7052 - val_accuracy: 0.0000e+00\n",
"Epoch 224/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 6.1151 - custom_mae: 2.5461 - accuracy: 0.0417 - val_loss: 482.9436 - val_custom_mae: 26.6709 - val_accuracy: 0.0000e+00\n",
"Epoch 225/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 5.9538 - custom_mae: 2.5064 - accuracy: 0.0417 - val_loss: 481.1745 - val_custom_mae: 26.6197 - val_accuracy: 0.0000e+00\n",
"Epoch 226/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 5.7982 - custom_mae: 2.4491 - accuracy: 0.0417 - val_loss: 483.1761 - val_custom_mae: 26.6742 - val_accuracy: 0.0000e+00\n",
"Epoch 227/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 5.6811 - custom_mae: 2.4184 - accuracy: 0.0417 - val_loss: 481.5321 - val_custom_mae: 26.6290 - val_accuracy: 0.0000e+00\n",
"Epoch 228/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 5.6439 - custom_mae: 2.4236 - accuracy: 0.0417 - val_loss: 484.2849 - val_custom_mae: 26.7045 - val_accuracy: 0.0000e+00\n",
"Epoch 229/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 5.3910 - custom_mae: 2.4047 - accuracy: 0.0417 - val_loss: 484.6814 - val_custom_mae: 26.7116 - val_accuracy: 0.0000e+00\n",
"Epoch 230/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 5.2502 - custom_mae: 2.3314 - accuracy: 0.0417 - val_loss: 481.8926 - val_custom_mae: 26.6336 - val_accuracy: 0.0000e+00\n",
"Epoch 231/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 5.1015 - custom_mae: 2.2883 - accuracy: 0.0417 - val_loss: 483.5603 - val_custom_mae: 26.6887 - val_accuracy: 0.0000e+00\n",
"Epoch 232/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 4.9428 - custom_mae: 2.2504 - accuracy: 0.0417 - val_loss: 484.5164 - val_custom_mae: 26.7148 - val_accuracy: 0.0000e+00\n",
"Epoch 233/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 4.8061 - custom_mae: 2.2283 - accuracy: 0.0417 - val_loss: 483.0309 - val_custom_mae: 26.6787 - val_accuracy: 0.0000e+00\n",
"Epoch 234/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 4.7203 - custom_mae: 2.1983 - accuracy: 0.0417 - val_loss: 482.1803 - val_custom_mae: 26.6563 - val_accuracy: 0.0000e+00\n",
"Epoch 235/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 4.5844 - custom_mae: 2.1763 - accuracy: 0.0417 - val_loss: 485.1508 - val_custom_mae: 26.7345 - val_accuracy: 0.0000e+00\n",
"Epoch 236/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 4.4690 - custom_mae: 2.1567 - accuracy: 0.0417 - val_loss: 484.5008 - val_custom_mae: 26.7102 - val_accuracy: 0.0000e+00\n",
"Epoch 237/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 4.3288 - custom_mae: 2.1133 - accuracy: 0.0417 - val_loss: 484.6514 - val_custom_mae: 26.7237 - val_accuracy: 0.0000e+00\n",
"Epoch 238/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 4.1703 - custom_mae: 2.0672 - accuracy: 0.0417 - val_loss: 484.2482 - val_custom_mae: 26.7213 - val_accuracy: 0.0000e+00\n",
"Epoch 239/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 4.1035 - custom_mae: 2.0552 - accuracy: 0.0417 - val_loss: 485.8715 - val_custom_mae: 26.7689 - val_accuracy: 0.0000e+00\n",
"Epoch 240/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 4.0573 - custom_mae: 2.0286 - accuracy: 0.0417 - val_loss: 482.4370 - val_custom_mae: 26.6746 - val_accuracy: 0.0000e+00\n",
"Epoch 241/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.8923 - custom_mae: 1.9748 - accuracy: 0.0417 - val_loss: 484.6525 - val_custom_mae: 26.7375 - val_accuracy: 0.0000e+00\n",
"Epoch 242/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.8083 - custom_mae: 1.9649 - accuracy: 0.0417 - val_loss: 486.6066 - val_custom_mae: 26.7897 - val_accuracy: 0.0000e+00\n",
"Epoch 243/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.6956 - custom_mae: 1.9540 - accuracy: 0.0417 - val_loss: 486.0649 - val_custom_mae: 26.7732 - val_accuracy: 0.0000e+00\n",
"Epoch 244/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.5765 - custom_mae: 1.9265 - accuracy: 0.0417 - val_loss: 485.4619 - val_custom_mae: 26.7510 - val_accuracy: 0.0000e+00\n",
"Epoch 245/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.5586 - custom_mae: 1.8838 - accuracy: 0.0417 - val_loss: 483.5710 - val_custom_mae: 26.7111 - val_accuracy: 0.0000e+00\n",
"Epoch 246/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.4745 - custom_mae: 1.8430 - accuracy: 0.0417 - val_loss: 485.7385 - val_custom_mae: 26.7809 - val_accuracy: 0.0000e+00\n",
"Epoch 247/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.2997 - custom_mae: 1.8138 - accuracy: 0.0417 - val_loss: 484.8431 - val_custom_mae: 26.7510 - val_accuracy: 0.0000e+00\n",
"Epoch 248/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.2009 - custom_mae: 1.7841 - accuracy: 0.0417 - val_loss: 484.9460 - val_custom_mae: 26.7484 - val_accuracy: 0.0000e+00\n",
"Epoch 249/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.1651 - custom_mae: 1.7822 - accuracy: 0.0417 - val_loss: 486.1873 - val_custom_mae: 26.7779 - val_accuracy: 0.0000e+00\n",
"Epoch 250/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 3.0366 - custom_mae: 1.7465 - accuracy: 0.0417 - val_loss: 487.0725 - val_custom_mae: 26.8050 - val_accuracy: 0.0000e+00\n",
"Epoch 251/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.9916 - custom_mae: 1.7321 - accuracy: 0.0417 - val_loss: 486.6724 - val_custom_mae: 26.8004 - val_accuracy: 0.0000e+00\n",
"Epoch 252/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.9219 - custom_mae: 1.7150 - accuracy: 0.0417 - val_loss: 487.4058 - val_custom_mae: 26.8154 - val_accuracy: 0.0000e+00\n",
"Epoch 253/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.8601 - custom_mae: 1.6870 - accuracy: 0.0417 - val_loss: 485.9131 - val_custom_mae: 26.7819 - val_accuracy: 0.0000e+00\n",
"Epoch 254/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.7535 - custom_mae: 1.6466 - accuracy: 0.0417 - val_loss: 487.6457 - val_custom_mae: 26.8282 - val_accuracy: 0.0000e+00\n",
"Epoch 255/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.7029 - custom_mae: 1.6742 - accuracy: 0.0417 - val_loss: 486.9001 - val_custom_mae: 26.8037 - val_accuracy: 0.0000e+00\n",
"Epoch 256/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.5937 - custom_mae: 1.6200 - accuracy: 0.0417 - val_loss: 485.7010 - val_custom_mae: 26.7692 - val_accuracy: 0.0000e+00\n",
"Epoch 257/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.5744 - custom_mae: 1.5719 - accuracy: 0.0417 - val_loss: 485.4424 - val_custom_mae: 26.7727 - val_accuracy: 0.0000e+00\n",
"Epoch 258/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.4540 - custom_mae: 1.5382 - accuracy: 0.0417 - val_loss: 486.7944 - val_custom_mae: 26.8143 - val_accuracy: 0.0000e+00\n",
"Epoch 259/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.3810 - custom_mae: 1.5183 - accuracy: 0.0417 - val_loss: 486.7484 - val_custom_mae: 26.8047 - val_accuracy: 0.0000e+00\n",
"Epoch 260/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.3523 - custom_mae: 1.5282 - accuracy: 0.0417 - val_loss: 487.8643 - val_custom_mae: 26.8270 - val_accuracy: 0.0000e+00\n",
"Epoch 261/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.3121 - custom_mae: 1.5271 - accuracy: 0.0417 - val_loss: 487.3483 - val_custom_mae: 26.8192 - val_accuracy: 0.0000e+00\n",
"Epoch 262/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.1976 - custom_mae: 1.4701 - accuracy: 0.0417 - val_loss: 487.7708 - val_custom_mae: 26.8348 - val_accuracy: 0.0000e+00\n",
"Epoch 263/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.1569 - custom_mae: 1.4452 - accuracy: 0.0417 - val_loss: 487.0035 - val_custom_mae: 26.8101 - val_accuracy: 0.0000e+00\n",
"Epoch 264/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.1042 - custom_mae: 1.4067 - accuracy: 0.0417 - val_loss: 486.7105 - val_custom_mae: 26.8014 - val_accuracy: 0.0000e+00\n",
"Epoch 265/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 2.0109 - custom_mae: 1.3925 - accuracy: 0.0417 - val_loss: 487.7247 - val_custom_mae: 26.8299 - val_accuracy: 0.0000e+00\n",
"Epoch 266/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.9935 - custom_mae: 1.4166 - accuracy: 0.0417 - val_loss: 488.8463 - val_custom_mae: 26.8552 - val_accuracy: 0.0000e+00\n",
"Epoch 267/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.9531 - custom_mae: 1.3788 - accuracy: 0.0417 - val_loss: 487.1858 - val_custom_mae: 26.8107 - val_accuracy: 0.0000e+00\n",
"Epoch 268/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.8745 - custom_mae: 1.3282 - accuracy: 0.0417 - val_loss: 487.7438 - val_custom_mae: 26.8362 - val_accuracy: 0.0000e+00\n",
"Epoch 269/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.8010 - custom_mae: 1.3065 - accuracy: 0.0417 - val_loss: 489.0699 - val_custom_mae: 26.8655 - val_accuracy: 0.0000e+00\n",
"Epoch 270/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.7769 - custom_mae: 1.3088 - accuracy: 0.0417 - val_loss: 488.2446 - val_custom_mae: 26.8362 - val_accuracy: 0.0000e+00\n",
"Epoch 271/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.7711 - custom_mae: 1.3366 - accuracy: 0.0417 - val_loss: 488.4815 - val_custom_mae: 26.8399 - val_accuracy: 0.0000e+00\n",
"Epoch 272/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.6652 - custom_mae: 1.2751 - accuracy: 0.0417 - val_loss: 487.8883 - val_custom_mae: 26.8287 - val_accuracy: 0.0000e+00\n",
"Epoch 273/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.6143 - custom_mae: 1.2271 - accuracy: 0.0417 - val_loss: 488.2449 - val_custom_mae: 26.8406 - val_accuracy: 0.0000e+00\n",
"Epoch 274/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.6082 - custom_mae: 1.2450 - accuracy: 0.0417 - val_loss: 488.4309 - val_custom_mae: 26.8439 - val_accuracy: 0.0000e+00\n",
"Epoch 275/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.5240 - custom_mae: 1.2057 - accuracy: 0.0417 - val_loss: 488.4113 - val_custom_mae: 26.8350 - val_accuracy: 0.0000e+00\n",
"Epoch 276/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.4973 - custom_mae: 1.1910 - accuracy: 0.0417 - val_loss: 488.4000 - val_custom_mae: 26.8347 - val_accuracy: 0.0000e+00\n",
"Epoch 277/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.4520 - custom_mae: 1.1725 - accuracy: 0.0417 - val_loss: 488.4713 - val_custom_mae: 26.8412 - val_accuracy: 0.0000e+00\n",
"Epoch 278/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 1.3980 - custom_mae: 1.1481 - accuracy: 0.0417 - val_loss: 488.7491 - val_custom_mae: 26.8505 - val_accuracy: 0.0000e+00\n",
"Epoch 279/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.3548 - custom_mae: 1.1228 - accuracy: 0.0417 - val_loss: 488.9457 - val_custom_mae: 26.8549 - val_accuracy: 0.0000e+00\n",
"Epoch 280/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.3351 - custom_mae: 1.1085 - accuracy: 0.0417 - val_loss: 488.3984 - val_custom_mae: 26.8396 - val_accuracy: 0.0000e+00\n",
"Epoch 281/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 1.2813 - custom_mae: 1.0892 - accuracy: 0.0417 - val_loss: 489.3159 - val_custom_mae: 26.8581 - val_accuracy: 0.0000e+00\n",
"Epoch 282/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.2785 - custom_mae: 1.1231 - accuracy: 0.0417 - val_loss: 489.8076 - val_custom_mae: 26.8672 - val_accuracy: 0.0000e+00\n",
"Epoch 283/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.2103 - custom_mae: 1.0779 - accuracy: 0.0417 - val_loss: 489.5902 - val_custom_mae: 26.8607 - val_accuracy: 0.0000e+00\n",
"Epoch 284/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.1828 - custom_mae: 1.0525 - accuracy: 0.0417 - val_loss: 489.1288 - val_custom_mae: 26.8514 - val_accuracy: 0.0000e+00\n",
"Epoch 285/500\n",
"8/8 [==============================] - 0s 12ms/step - loss: 1.1573 - custom_mae: 1.0459 - accuracy: 0.0417 - val_loss: 489.6539 - val_custom_mae: 26.8638 - val_accuracy: 0.0000e+00\n",
"Epoch 286/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.1059 - custom_mae: 1.0117 - accuracy: 0.0417 - val_loss: 489.4764 - val_custom_mae: 26.8570 - val_accuracy: 0.0000e+00\n",
"Epoch 287/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.0807 - custom_mae: 1.0001 - accuracy: 0.0417 - val_loss: 489.7199 - val_custom_mae: 26.8640 - val_accuracy: 0.0000e+00\n",
"Epoch 288/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.0718 - custom_mae: 0.9890 - accuracy: 0.0417 - val_loss: 489.0706 - val_custom_mae: 26.8514 - val_accuracy: 0.0000e+00\n",
"Epoch 289/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 1.0287 - custom_mae: 0.9621 - accuracy: 0.0417 - val_loss: 489.6757 - val_custom_mae: 26.8680 - val_accuracy: 0.0000e+00\n",
"Epoch 290/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.9918 - custom_mae: 0.9491 - accuracy: 0.0417 - val_loss: 489.7965 - val_custom_mae: 26.8615 - val_accuracy: 0.0000e+00\n",
"Epoch 291/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.9727 - custom_mae: 0.9435 - accuracy: 0.0417 - val_loss: 489.5268 - val_custom_mae: 26.8530 - val_accuracy: 0.0000e+00\n",
"Epoch 292/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.9295 - custom_mae: 0.9131 - accuracy: 0.0417 - val_loss: 489.4735 - val_custom_mae: 26.8555 - val_accuracy: 0.0000e+00\n",
"Epoch 293/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.9144 - custom_mae: 0.9202 - accuracy: 0.0417 - val_loss: 491.0399 - val_custom_mae: 26.8941 - val_accuracy: 0.0000e+00\n",
"Epoch 294/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.9095 - custom_mae: 0.9488 - accuracy: 0.0417 - val_loss: 490.7638 - val_custom_mae: 26.8798 - val_accuracy: 0.0000e+00\n",
"Epoch 295/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.8483 - custom_mae: 0.8974 - accuracy: 0.0417 - val_loss: 490.3413 - val_custom_mae: 26.8688 - val_accuracy: 0.0000e+00\n",
"Epoch 296/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.8330 - custom_mae: 0.8698 - accuracy: 0.0417 - val_loss: 490.5738 - val_custom_mae: 26.8810 - val_accuracy: 0.0000e+00\n",
"Epoch 297/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.8146 - custom_mae: 0.8633 - accuracy: 0.0417 - val_loss: 490.6082 - val_custom_mae: 26.8784 - val_accuracy: 0.0000e+00\n",
"Epoch 298/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.7932 - custom_mae: 0.8358 - accuracy: 0.0417 - val_loss: 489.8294 - val_custom_mae: 26.8530 - val_accuracy: 0.0000e+00\n",
"Epoch 299/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.7604 - custom_mae: 0.8166 - accuracy: 0.0417 - val_loss: 490.5051 - val_custom_mae: 26.8729 - val_accuracy: 0.0000e+00\n",
"Epoch 300/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.7348 - custom_mae: 0.8074 - accuracy: 0.0417 - val_loss: 490.5215 - val_custom_mae: 26.8753 - val_accuracy: 0.0000e+00\n",
"Epoch 301/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.7105 - custom_mae: 0.7996 - accuracy: 0.0417 - val_loss: 491.4999 - val_custom_mae: 26.9012 - val_accuracy: 0.0000e+00\n",
"Epoch 302/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.7176 - custom_mae: 0.8318 - accuracy: 0.0417 - val_loss: 491.8369 - val_custom_mae: 26.9076 - val_accuracy: 0.0000e+00\n",
"Epoch 303/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.6669 - custom_mae: 0.7988 - accuracy: 0.0417 - val_loss: 491.5625 - val_custom_mae: 26.8972 - val_accuracy: 0.0000e+00\n",
"Epoch 304/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.6482 - custom_mae: 0.7726 - accuracy: 0.0417 - val_loss: 491.5914 - val_custom_mae: 26.8999 - val_accuracy: 0.0000e+00\n",
"Epoch 305/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.6304 - custom_mae: 0.7482 - accuracy: 0.0417 - val_loss: 491.4050 - val_custom_mae: 26.8977 - val_accuracy: 0.0000e+00\n",
"Epoch 306/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.6217 - custom_mae: 0.7486 - accuracy: 0.0417 - val_loss: 492.0545 - val_custom_mae: 26.9137 - val_accuracy: 0.0000e+00\n",
"Epoch 307/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 0.5897 - custom_mae: 0.7256 - accuracy: 0.0417 - val_loss: 491.6697 - val_custom_mae: 26.8996 - val_accuracy: 0.0000e+00\n",
"Epoch 308/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.5818 - custom_mae: 0.7112 - accuracy: 0.0417 - val_loss: 491.8970 - val_custom_mae: 26.9064 - val_accuracy: 0.0000e+00\n",
"Epoch 309/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.5544 - custom_mae: 0.7007 - accuracy: 0.0417 - val_loss: 492.2628 - val_custom_mae: 26.9152 - val_accuracy: 0.0000e+00\n",
"Epoch 310/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.5454 - custom_mae: 0.6890 - accuracy: 0.0417 - val_loss: 491.7086 - val_custom_mae: 26.8989 - val_accuracy: 0.0000e+00\n",
"Epoch 311/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.5413 - custom_mae: 0.6991 - accuracy: 0.0417 - val_loss: 492.8210 - val_custom_mae: 26.9295 - val_accuracy: 0.0000e+00\n",
"Epoch 312/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.5090 - custom_mae: 0.6852 - accuracy: 0.0417 - val_loss: 492.4782 - val_custom_mae: 26.9173 - val_accuracy: 0.0000e+00\n",
"Epoch 313/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.4935 - custom_mae: 0.6604 - accuracy: 0.0417 - val_loss: 492.3422 - val_custom_mae: 26.9127 - val_accuracy: 0.0000e+00\n",
"Epoch 314/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.4805 - custom_mae: 0.6533 - accuracy: 0.0417 - val_loss: 492.7933 - val_custom_mae: 26.9251 - val_accuracy: 0.0000e+00\n",
"Epoch 315/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.4609 - custom_mae: 0.6370 - accuracy: 0.0417 - val_loss: 492.6136 - val_custom_mae: 26.9206 - val_accuracy: 0.0000e+00\n",
"Epoch 316/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 0.4516 - custom_mae: 0.6220 - accuracy: 0.0417 - val_loss: 492.3545 - val_custom_mae: 26.9116 - val_accuracy: 0.0000e+00\n",
"Epoch 317/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.4295 - custom_mae: 0.6018 - accuracy: 0.0417 - val_loss: 492.4247 - val_custom_mae: 26.9142 - val_accuracy: 0.0000e+00\n",
"Epoch 318/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 0.4188 - custom_mae: 0.6046 - accuracy: 0.0417 - val_loss: 493.3983 - val_custom_mae: 26.9355 - val_accuracy: 0.0000e+00\n",
"Epoch 319/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 0.4074 - custom_mae: 0.6093 - accuracy: 0.0417 - val_loss: 492.9115 - val_custom_mae: 26.9212 - val_accuracy: 0.0000e+00\n",
"Epoch 320/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 0.3992 - custom_mae: 0.5902 - accuracy: 0.0417 - val_loss: 493.3150 - val_custom_mae: 26.9336 - val_accuracy: 0.0000e+00\n",
"Epoch 321/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.3761 - custom_mae: 0.5705 - accuracy: 0.0417 - val_loss: 493.3654 - val_custom_mae: 26.9372 - val_accuracy: 0.0000e+00\n",
"Epoch 322/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.3696 - custom_mae: 0.5747 - accuracy: 0.0417 - val_loss: 493.6761 - val_custom_mae: 26.9436 - val_accuracy: 0.0000e+00\n",
"Epoch 323/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 0.3619 - custom_mae: 0.5804 - accuracy: 0.0417 - val_loss: 493.7351 - val_custom_mae: 26.9404 - val_accuracy: 0.0000e+00\n",
"Epoch 324/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.3415 - custom_mae: 0.5621 - accuracy: 0.0417 - val_loss: 493.8978 - val_custom_mae: 26.9422 - val_accuracy: 0.0000e+00\n",
"Epoch 325/500\n",
"8/8 [==============================] - 0s 10ms/step - loss: 0.3310 - custom_mae: 0.5378 - accuracy: 0.0417 - val_loss: 493.8358 - val_custom_mae: 26.9441 - val_accuracy: 0.0000e+00\n",
"Epoch 326/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.3340 - custom_mae: 0.5253 - accuracy: 0.0417 - val_loss: 493.1354 - val_custom_mae: 26.9265 - val_accuracy: 0.0000e+00\n",
"Epoch 327/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.3111 - custom_mae: 0.5072 - accuracy: 0.0417 - val_loss: 494.1719 - val_custom_mae: 26.9546 - val_accuracy: 0.0000e+00\n",
"Epoch 328/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.3109 - custom_mae: 0.5297 - accuracy: 0.0417 - val_loss: 494.3883 - val_custom_mae: 26.9533 - val_accuracy: 0.0000e+00\n",
"Epoch 329/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.2908 - custom_mae: 0.5176 - accuracy: 0.0417 - val_loss: 494.2490 - val_custom_mae: 26.9481 - val_accuracy: 0.0000e+00\n",
"Epoch 330/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.2818 - custom_mae: 0.5050 - accuracy: 0.0417 - val_loss: 494.5416 - val_custom_mae: 26.9603 - val_accuracy: 0.0000e+00\n",
"Epoch 331/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 0.2808 - custom_mae: 0.4855 - accuracy: 0.0417 - val_loss: 493.9260 - val_custom_mae: 26.9455 - val_accuracy: 0.0000e+00\n",
"[... sorties des epochs 332 à 498 tronquées : la loss d'entraînement décroît régulièrement jusqu'à ~1e-4 tandis que val_loss reste autour de 500 (surapprentissage complet sur les 10 images d'entraînement) ...]\n",
"Epoch 499/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.6596e-04 - custom_mae: 0.0118 - accuracy: 0.0417 - val_loss: 499.9808 - val_custom_mae: 27.0343 - val_accuracy: 0.0000e+00\n",
"Epoch 500/500\n",
"8/8 [==============================] - 0s 11ms/step - loss: 1.5378e-04 - custom_mae: 0.0112 - accuracy: 0.0417 - val_loss: 500.0090 - val_custom_mae: 27.0350 - val_accuracy: 0.0000e+00\n"
]
}
],
"source": [
    "# Petit sous-ensemble de 10 images : on vérifie que le modèle peut sur-apprendre\n",
    "x, y = load_data(image_size=64, num_images=10)\n",
"\n",
"model = Sequential([\n",
" InputLayer(input_shape=x.shape[1:]),\n",
" \n",
" Conv2D(32, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
" \n",
" Conv2D(64, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
"\n",
" Conv2D(92, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
"\n",
" Conv2D(128, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
"\n",
" Flatten(),\n",
"\n",
" Dense(512, activation=\"relu\"),\n",
" Dense(y.shape[1] * y.shape[2], activation=\"linear\"),\n",
" Reshape(y.shape[1:])\n",
"])\n",
"\n",
"model.summary()\n",
"\n",
"adam = optimizers.Adam(learning_rate=1e-5)\n",
"model.compile(optimizer=adam, loss=custom_mse, metrics=[custom_mae, \"accuracy\"])\n",
    "history = model.fit(x, y, epochs=500, validation_split=0.2, batch_size=1)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXAAAAEICAYAAABGaK+TAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAAzIklEQVR4nO3dd3yV1f3A8c83e5LNkBU2iuwACqLgKk5EEaUO0Lqo1vFrnbXO+lN/tS3a1lZbxVEEUYoDQQWUouJiKlOGEQIEkkAWmTc5vz/Oc8NNSEjIunmS7/v1uq97n/Os73MJ33vuuec5R4wxKKWUcp8AfweglFKqfjSBK6WUS2kCV0opl9IErpRSLqUJXCmlXEoTuFJKuZQmcIWILBaRaY29rT+JSKqInN0ExzUi0tt5/Q8R+V1dtq3Hea4SkY/rG6dqGzSBu5SI5Ps8ykWk0Gf5quM5ljHmPGPMq429bWtnjLnFGPN4Q48jIslOsg/yOfZsY8y5DT12Neca55xrQZXywU758irlIiI7RWRTNcdaLiJFVf4W32/smFXNgmrfRLVExpgo72sRSQVuMMYsrbqdiAQZYzzNGZtq8TKAU0UkwRiT5ZRNA36oZtvTgfZAkIiMMMZ8W2X9bcaYfzVhrOoYtAbeyjg1rDQRuVdE0oFZIhInIgtFJENEDjmvu/jss1xEbnBeTxeRz0XkGWfbH0XkvHpu20NEVohInogsFZG/ici/a4i7LjE+LiJfOMf7WEQSfdZfIyI/iUiWiPz2GO/PKBFJF5FAn7JJIvKd83qkiHwpItkisk9E/ioiITUc6xUR+b3P8t3OPntF5Poq214gImtFJFdEdovIIz6rVzjP2U4t9lTve+uz/2gR+VZEcpzn0XV9b6pRArwDXOnsHwhcAcyuZttpwLvAIue1akE0gbdOHYF4oDtwE/bfeZaz3A0oBP56jP1HAVuBROD/gJdEROqx7RvAN0AC8AhwzTHOWZcYfw5ch60RhgC/ARCRk4C/O8c/wTlfF6phjPkaOAycWeW4bzivy4C7nOs5FTgL+OUx4saJYYITzzlAH6Bq+/th4FogFrgAmCEilzjrTneeY40xUcaYL6scOx74AHjOubY/AR+ISEKVazjqvTmG15x4AH4GbAD2VjlvBDAZm9hnA1fW9GGm/EMTeOtUDjxsjCk2xhQaY7KMMfONMQXGmDzgCeCMY+z/kzHmn8aYMuBVoBPQ4Xi2FZFuwAjgIWNMiTHmc+C9mk5YxxhnGWN+MMYUAvOAIU75ZGChMWaFMaYY+J3zHtRkDjAVQESigfOdMowxq40xXxljPMaYVOCFauKozhQnvg3GmMPYDyzf61tujPneGFNujPnOOV9djgs24W8zxrzuxDUH2AJc5LNNTe9NtYwxK4F4EemHTeSvVbPZpUAx8DH2AyTYicXXc863Fe+jwb8JqLrTBN46ZRhjirwLIhIhIi84TQy52K/ssb7NCFWke18YYwqcl1HHue0JwEGfMoDdNQVcxxjTfV4X+MR0gu+xnQSaRc3eAC4VkVBsklpjjPnJiaOv03yT7sTxv9jaeG0qxQD8VOX6RonIp04TUQ5wSx2P6z32T1XKfgI6+yzX9N4cy+vAbcB4YEE166cB85wPjSJgPkc3o9xujIn1edTYK0c1Pk3grVPVISZ/DfQDRhlj2nHkK3tNzSKNYR+2hhfhU9b1GNs3JMZ9vsd2zplQ08bGmE3YBHgelZtPwDbFbAH6OHE8UJ8YsM1Avt7AfgPpaoyJAf7hc9zahgTdi21a8tUN2FOHuI7ldWzz0KIqH7Q4vz+cCVztfJilY7/pnF9L+7pqRprA24ZobJtyttOe+nBTn9Cp0a4CHhGREBE5lcpf+RszxreBC0XkNKeN9jFq/9t+A7gD+0HxVpU4coF8EekPzKhjDPOA6SJykvMBUjX+aOw3kiIRGYn94PDKwDb59Kzh2IuAviLycxEJEpErgJOAhXWMrVrGmB
+xzTjV/eh7DbZXSj9sc8wQoC+QhtP8pPxPE3jbMBMIBzKBr4APm+m8V2F/CMwCfg+8iW1Trc5M6hmjMWYjcCs2Ke8DDmETzbF426A/McZk+pT/Bptc84B/OjHXJYbFzjV8Amx3nn39EnhMRPKAh7AJ37tvAbbN/wunHfmUKsfOAi7EfkvJAu4BLqwSd70YYz43xuytZtU04HljTLrvA/vNwbcZ5a9SuR/46obGpOpOdEIH1VxE5E1gizGmyb8BKNUWaA1cNRkRGSEivUQkwOlmNxHb/1gp1Qj0TkzVlDoC/8H+oJgGzDDGrPVvSEq1HtqEopRSLqVNKEop5VLN2oSSmJhokpOTm/OUSinleqtXr840xiRVLW/WBJ6cnMyqVaua85RKKeV6IlL1TlxAm1CUUsq1NIErpZRLaQJXSimX0n7gSrVCpaWlpKWlUVRUVPvGqsUICwujS5cuBAcH12l7TeBKtUJpaWlER0eTnJxMzXNxqJbEGENWVhZpaWn06NGjTvtoE4pSrVBRUREJCQmavF1EREhISDiub02awJVqpTR5u8/x/ptpAldKKZdyRQJ/8EGYpvNhK+UaWVlZDBkyhCFDhtCxY0c6d+5csVxSUnLMfVetWsXtt99e6zlGjx7dKLEuX74cEeFf//pXRdm6desQEZ555pmKMo/HQ1JSEvfdd1+l/ceNG0e/fv0qrm/y5MmNEldduOJHzB07YLUOE6+UayQkJLBu3ToAHnnkEaKiovjNb35Tsd7j8RAUVH36SUlJISUlpdZzrFy5slFiBTj55JOZN28eN9xwAwBz5sxh8ODBlbZZsmQJffv25a233uLJJ5+s1Nwxe/bsOsXc2FxRA4+Kgvx8f0ehlGqI6dOnc8sttzBq1CjuuecevvnmG0499VSGDh3K6NGj2bp1K2BrxBdeeCFgk//111/PuHHj6NmzJ88991zF8aKioiq2HzduHJMnT6Z///5cddVVeEdZXbRoEf3792f48OHcfvvtFcetqnv37hQVFbF//36MMXz44Yecd955lbaZM2cOd9xxB926dePLL79s9PenPlxRA9cErlTDjBt3dNmUKfDLX0JBAZx//tHrp0+3j8xMqNoqsHx5/eJIS0tj5cqVBAYGkpuby2effUZQUBBLly7lgQceYP78+Ufts2XLFj799FPy8vLo168fM2bMOKqf9Nq1a9m4cSMnnHACY8aM4YsvviAlJYWbb76ZFStW0KNHD6ZOPfZUnpMnT+att95i6NChDBs2jNDQ0Ip1RUVFLF26lBdeeIHs7GzmzJlTqQnnqquuIjw8HIBzzjmHP/zhD/V7g46TqxK4MaA/rCvlXpdffjmBgYEA5OTkMG3aNLZt24aIUFpaWu0+F1xwAaGhoYSGhtK+fXv2799Ply5dKm0zcuTIirIhQ4aQmppKVFQUPXv2rOhTPXXqVF588cUaY5syZQpXXHEFW7ZsYerUqZWaaBYuXMj48eMJDw/nsssu4/HHH2fmzJkV1+KvJhRXJPCePWHUKCgpAZ8PRaVUHR2rxhwRcez1iYn1r3FXFRkZWfH6d7/7HePHj2fBggWkpqYyrrqvCVCpJhwYGIjH46nXNrXp2LEjwcHBLFmyhGeffbZSAp8zZw6ff/453uGws7Ky+OSTTzjnnHOO+zyNqU4JXERSsbN0lwEeY0yKiMRjZ+xOBlKBKcaYQ00R5HXX2YdSqvXIycmhc+fOALzyyiuNfvx+/fqxc+dOUlNTSU5O5s0336x1n8cee4wDBw5U1KyBiqae3bt3V3xQzJo1izlz5vg9gR/Pj5jjjTFDjDHe7wn3AcuMMX2AZc6yUkrVyT333MP999/P0KFD61Vjrk14eDjPP/88EyZMYPjw4URHRxMTE3PMfUaPHs0ll1xSqWzBggWceeaZlWr5EydO5P3336e4uBiwbeDeboRnn312o19LTeo0J6ZTA08xxmT6lG0Fxhlj9olIJ2C5MabfsY6TkpJi6jOhw6/+/R
xvvZfLp48+yIknHvfuSrU5mzdv5kT9z0J+fj5RUVEYY7j11lvp06cPd911l7/DOqbq/u1EZLVP5blCXWvgBvhYRFa
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAX0AAAEICAYAAACzliQjAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA1YklEQVR4nO3deXxU1fn48c+Tyb4BWQhIQIKCyhogLIoiLq2IKC5UpVagWLdqXbpYl6pUa9uvtdWfbbXFvVZFWytixbqgiLuyiSCLgKBhDWFJINskeX5/nDthEpIQsg3MPO/Xa15z7zl3OXcmee6Zc889V1QVY4wxkSEq1AUwxhjTfizoG2NMBLGgb4wxEcSCvjHGRBAL+sYYE0Es6BtjTASxoG+aTUReE5Eprb1sKInIehE5vQ22qyJytDf9NxG5vSnLNmM/l4jIG80tZyPbHSMi+a29XdP+okNdANO+RGRP0GwiUA5UefNXquozTd2Wqp7ZFsuGO1W9qjW2IyI9ga+BGFWt9Lb9DNDk79BEHgv6EUZVkwPTIrIe+JGqvlV3ORGJDgQSY0z4sOYdA+z7+S4ivxSRLcATItJJRP4rIgUistObzg5aZ56I/Mibnioi74vIfd6yX4vImc1cNkdE5otIsYi8JSJ/FZF/NlDuppTxbhH5wNveGyKSEZR/qYhsEJFCEbmtkc9nhIhsERFfUNp5IrLUmx4uIh+JyC4R2SwifxGR2Aa29aSI/CZo/hfeOptEZFqdZc8SkcUiUiQi34rI9KDs+d77LhHZIyLHBz7boPVPEJHPRGS3935CUz+bxojIcd76u0RkuYicE5Q3TkS+9La5UUR+7qVneN/PLhHZISLviYjFoHZmH7gJ1gVIA44ErsD9fTzhzfcASoG/NLL+CGAVkAHcCzwmItKMZZ8FPgXSgenApY3ssyll/D7wQ6AzEAsEglBf4GFv+0d4+8umHqr6CbAXOLXOdp/1pquAG73jOR44DfhxI+XGK8NYrzzfAXoDda8n7AUmAx2Bs4CrReRcL2+0995RVZNV9aM6204DXgUe9I7tT8CrIpJe5xj2+2wOUOYY4BXgDW+9nwDPiMgx3iKP4ZoKU4D+wNte+s+AfCATyAJuBWwcmHZmQd8EqwbuVNVyVS1V1UJVfVFVS1S1GLgHOLmR9Teo6iOqWgU8BXTF/XM3eVkR6QEMA+5Q1QpVfR+Y3dAOm1jGJ1R1taqWAi8AuV76ROC/qjpfVcuB273PoCHPAZMARCQFGOeloaoLVfVjVa1U1fXA3+spR30u9Mq3TFX34k5ywcc3T1W/UNVqVV3q7a8p2wV3kvhKVZ/2yvUcsBI4O2iZhj6bxowEkoHfe9/R28B/8T4bwA/0FZFUVd2pqouC0rsCR6qqX1XfUxv8q91Z0DfBClS1LDAjIoki8nev+aMI15zQMbiJo44tgQlVLfEmkw9y2SOAHUFpAN82VOAmlnFL0HRJUJmOCN62F3QLG9oXrlZ/vojEAecDi1R1g1eOPl7TxRavHL/F1foPpFYZgA11jm+EiLzjNV/tBq5q4nYD295QJ20D0C1ovqHP5oBlVtXgE2Twdi/AnRA3iMi7InK8l/4HYA3whoisE5Gbm3YYpjVZ0DfB6ta6fgYcA4xQ1VT2NSc01GTTGjYDaSKSGJTWvZHlW1LGzcHb9vaZ3tDCqvolLridSe2mHXDNRCuB3l45bm1OGXBNVMGexf3S6a6qHYC/BW33QLXkTbhmr2A9gI1NKNeBttu9Tnt8zXZV9TNVnYBr+pmF+wWBqhar6s9UtRdwDvBTETmthWUxB8mCvmlMCq6NfJfXPnxnW+/QqzkvAKaLSKxXSzy7kVVaUsZ/A+NF5ETvoutdHPh/4lngetzJ5V91ylEE7BGRY4Grm1iGF4CpItLXO+nULX8K7pdPmYgMx51sAgpwzVG9Gtj2HKCPiHxfRKJF5CKgL64ppiU+wf0quElEYkRkDO47mul9Z5eISA
dV9eM+k2oAERkvIkd71252466DNNacZtqABX3TmAeABGA78DHwv3ba7yW4i6GFwG+A53H3E9TnAZpZRlVdDlyDC+SbgZ24C42NCbSpv62q24PSf44LyMXAI16Zm1KG17xjeBvX9PF2nUV+DNwlIsXAHXi1Zm/dEtw1jA+8HjEj62y7EBiP+zVUCNwEjK9T7oOmqhW4IH8m7nN/CJisqiu9RS4F1nvNXFfhvk9wF6rfAvYAHwEPqeo7LSmLOXhi11HMoU5EngdWqmqb/9IwJtxZTd8cckRkmIgcJSJRXpfGCbi2YWNMC9kdueZQ1AX4D+6iaj5wtaouDm2RjAkP1rxjjDERxJp3jDEmghyweUdEHsf1ANimqv29tOdxfaPB3R6+S1VzxY36twJ3ez3Ax4ERBUVkKPAkrqfFHOD6ptyNl5GRoT179mz6ERljTIRbuHDhdlXNrC+vKW36T+LGMvlHIEFVLwpMi8gfcX1uA9aqam4923kYuBzXx3cOMBZ47UA779mzJwsWLGhCMY0xxgCISN07sWscsHlHVecDOxrYsODGDnnuAAXoCqR6Y5Mo7gRy7oH2bYwxpnW1tE3/JGCrqn4VlJbjDQX7roic5KV1o/ZNL/nUHv/DGGNMO2hpl81J1K7lbwZ6qGqh14Y/S0T6HexGReQK3NC+9OhRdygSY4wxzdXsoC8i0biRBocG0rzhacu96YUishbogxuIKXic8mwaGfRJVWcAMwDy8vKsT6kx7cjv95Ofn09ZWdmBFzYhFR8fT3Z2NjExMU1epyU1/dNxt8bXNNuISCZucKgqEemFG2tjnaruEPfkn5G4C7mTgT+3YN/GmDaSn59PSkoKPXv2pOFn4JhQU1UKCwvJz88nJyenyesdsE1fRJ7DDY50jLjH6V3mZV3M/hdwRwNLRWQJbgTDq1Q1cBH4x8CjuEGl1tKEnjvGmPZXVlZGenq6BfxDnIiQnp5+0L/IDljTV9VJDaRPrSftReDFBpZfgHt0mjHmEGcB//DQnO8pbO/Ivfvdu3l9zeuhLoYxxhxSwjbo/98H/8cba98IdTGMMQepsLCQ3NxccnNz6dKlC926dauZr6ioaHTdBQsWcN111x1wHyeccEKrlHXevHmMHz++VbbVXsJ2lM346HjKqxp67oYx5lCVnp7OkiVLAJg+fTrJycn8/Oc/r8mvrKwkOrr+0JWXl0deXt4B9/Hhhx+2SlkPR2Fb04+LjqOs0rqcGRMOpk6dylVXXcWIESO46aab+PTTTzn++OMZPHgwJ5xwAqtWueG+gmve06dPZ9q0aYwZM4ZevXrx4IMP1mwvOTm5ZvkxY8YwceJEjj32WC655BICQ4LNmTOHY489lqFDh3LdddcdsEa/Y8cOzj33XAYOHMjIkSNZunQpAO+++27NL5XBgwdTXFzM5s2bGT16NLm5ufTv35/33nuv1T+zhoR1Td+CvjEtN2bM/mkXXgg//jGUlMC4cfvnT53qXtu3w8SJtfPmzWteOfLz8/nwww/x+XwUFRXx3nvvER0dzVtvvcWtt97Kiy/u34dk5cqVvPPOOxQXF3PMMcdw9dVX79enffHixSxfvpwjjjiCUaNG8cEHH5CXl8eVV17J/PnzycnJYdKkevuz1HLnnXcyePBgZs2axdtvv83kyZNZsmQJ9913H3/9618ZNWoUe/bsIT4+nhkzZnDGGWdw2223UVVVRUlJSfM+lGYI66BvzTvGhI/vfe97+Hw+AHbv3s2UKVP46quvEBH8fn+965x11lnExcURFxdH586d2bp1K9nZ2bWWGT58eE1abm4u69evJzk5mV69etX0f580aRIzZsxotHzvv/9+zYnn1FNPpbCwkKKiIkaNGsVPf/pTLrnkEs4//3yys7MZNmwY06ZNw+/3c+6555Kbm9uSj+aghG3Q92k8xSVW0zempRqrmScmNp6fkdH8mn1dSUlJNdO33347p5xyCi+99BLr169nTH0/R4C4uLiaaZ/PR2
VlZbOWaYmbb76Zs846izlz5jBq1Chef/11Ro8ezfz583n11VeZOnUqP/3pT5k8eXKr7rchYdumv3xpHCu+sqBvTDj
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.08333333333333333\n"
]
}
],
"source": [
"plot_training_analysis(history)\n",
"\n",
"y_pred = model.predict(x)\n",
"pck = compute_PCK_alpha(y, y_pred)\n",
"print(pck)"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model: \"sequential_3\"\n",
"_________________________________________________________________\n",
" Layer (type) Output Shape Param # \n",
"=================================================================\n",
" conv2d_12 (Conv2D) (None, 62, 62, 32) 896 \n",
" \n",
" max_pooling2d_12 (MaxPoolin (None, 31, 31, 32) 0 \n",
" g2D) \n",
" \n",
" conv2d_13 (Conv2D) (None, 29, 29, 64) 18496 \n",
" \n",
" max_pooling2d_13 (MaxPoolin (None, 14, 14, 64) 0 \n",
" g2D) \n",
" \n",
" conv2d_14 (Conv2D) (None, 12, 12, 92) 53084 \n",
" \n",
" max_pooling2d_14 (MaxPoolin (None, 6, 6, 92) 0 \n",
" g2D) \n",
" \n",
" conv2d_15 (Conv2D) (None, 4, 4, 128) 106112 \n",
" \n",
" max_pooling2d_15 (MaxPoolin (None, 2, 2, 128) 0 \n",
" g2D) \n",
" \n",
" flatten_3 (Flatten) (None, 512) 0 \n",
" \n",
" dense_6 (Dense) (None, 512) 262656 \n",
" \n",
" dense_7 (Dense) (None, 42) 21546 \n",
" \n",
" reshape_3 (Reshape) (None, 3, 14) 0 \n",
" \n",
"=================================================================\n",
"Total params: 462,790\n",
"Trainable params: 462,790\n",
"Non-trainable params: 0\n",
"_________________________________________________________________\n",
"Epoch 1/20\n",
"900/900 [==============================] - 7s 7ms/step - loss: 436.1250 - custom_mae: 22.9435 - accuracy: 0.1063 - val_loss: 301.4426 - val_custom_mae: 19.5207 - val_accuracy: 0.1067\n",
"Epoch 2/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 306.5551 - custom_mae: 19.7242 - accuracy: 0.0911 - val_loss: 282.6898 - val_custom_mae: 18.9662 - val_accuracy: 0.0867\n",
"Epoch 3/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 293.9691 - custom_mae: 19.2866 - accuracy: 0.0796 - val_loss: 282.8599 - val_custom_mae: 18.9778 - val_accuracy: 0.0667\n",
"Epoch 4/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 286.9001 - custom_mae: 19.0547 - accuracy: 0.0752 - val_loss: 277.5684 - val_custom_mae: 18.7558 - val_accuracy: 0.0600\n",
"Epoch 5/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 279.4478 - custom_mae: 18.8183 - accuracy: 0.0733 - val_loss: 276.7022 - val_custom_mae: 18.8043 - val_accuracy: 0.0300\n",
"Epoch 6/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 275.4402 - custom_mae: 18.6737 - accuracy: 0.0785 - val_loss: 276.4867 - val_custom_mae: 18.7703 - val_accuracy: 0.0567\n",
"Epoch 7/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 270.3714 - custom_mae: 18.5070 - accuracy: 0.0737 - val_loss: 273.4456 - val_custom_mae: 18.5880 - val_accuracy: 0.0633\n",
"Epoch 8/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 265.8038 - custom_mae: 18.3447 - accuracy: 0.0700 - val_loss: 274.5800 - val_custom_mae: 18.6840 - val_accuracy: 0.0400\n",
"Epoch 9/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 261.2702 - custom_mae: 18.1500 - accuracy: 0.0693 - val_loss: 271.1825 - val_custom_mae: 18.5663 - val_accuracy: 0.1067\n",
"Epoch 10/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 257.5489 - custom_mae: 18.0389 - accuracy: 0.0737 - val_loss: 272.6388 - val_custom_mae: 18.5681 - val_accuracy: 0.0367\n",
"Epoch 11/20\n",
"900/900 [==============================] - 6s 6ms/step - loss: 253.8484 - custom_mae: 17.8498 - accuracy: 0.0659 - val_loss: 275.5805 - val_custom_mae: 18.5730 - val_accuracy: 0.0600\n",
"Epoch 12/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 248.3715 - custom_mae: 17.6936 - accuracy: 0.0730 - val_loss: 270.3711 - val_custom_mae: 18.5194 - val_accuracy: 0.0600\n",
"Epoch 13/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 242.3419 - custom_mae: 17.4627 - accuracy: 0.0737 - val_loss: 269.5866 - val_custom_mae: 18.4398 - val_accuracy: 0.0567\n",
"Epoch 14/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 239.1807 - custom_mae: 17.3445 - accuracy: 0.0696 - val_loss: 273.4536 - val_custom_mae: 18.5913 - val_accuracy: 0.0500\n",
"Epoch 15/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 234.5071 - custom_mae: 17.1719 - accuracy: 0.0748 - val_loss: 281.4737 - val_custom_mae: 18.8920 - val_accuracy: 0.0400\n",
"Epoch 16/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 231.7038 - custom_mae: 17.0464 - accuracy: 0.0670 - val_loss: 272.9087 - val_custom_mae: 18.5188 - val_accuracy: 0.0367\n",
"Epoch 17/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 226.5649 - custom_mae: 16.8362 - accuracy: 0.0774 - val_loss: 268.8194 - val_custom_mae: 18.3130 - val_accuracy: 0.0367\n",
"Epoch 18/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 221.8425 - custom_mae: 16.6449 - accuracy: 0.0793 - val_loss: 269.8996 - val_custom_mae: 18.4402 - val_accuracy: 0.0333\n",
"Epoch 19/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 219.2012 - custom_mae: 16.5254 - accuracy: 0.0715 - val_loss: 279.0634 - val_custom_mae: 18.6853 - val_accuracy: 0.0600\n",
"Epoch 20/20\n",
"900/900 [==============================] - 6s 7ms/step - loss: 215.3356 - custom_mae: 16.4273 - accuracy: 0.0759 - val_loss: 267.8714 - val_custom_mae: 18.2929 - val_accuracy: 0.0400\n"
]
}
],
"source": [
    "# Même architecture, entraînée cette fois sur 1000 images\n",
    "x, y = load_data(image_size=64, num_images=1000)\n",
"\n",
"model = Sequential([\n",
" InputLayer(input_shape=x.shape[1:]),\n",
" \n",
" Conv2D(32, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
" \n",
" Conv2D(64, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
"\n",
" Conv2D(92, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
"\n",
" Conv2D(128, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
"\n",
" Flatten(),\n",
"\n",
" Dense(512, activation=\"relu\"),\n",
" Dense(y.shape[1] * y.shape[2], activation=\"linear\"),\n",
" Reshape(y.shape[1:])\n",
"])\n",
"\n",
"model.summary()\n",
"\n",
"adam = optimizers.Adam(learning_rate=1e-5)\n",
"model.compile(optimizer=adam, loss=custom_mse, metrics=[custom_mae, \"accuracy\"])\n",
"history = model.fit(x, y, epochs=20, validation_split=0.1, batch_size=1)"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXAAAAEICAYAAABGaK+TAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA3f0lEQVR4nO3dd3xUVfr48c9DILQgJYQemkLoCSGggFTRpQmCiLCsgmBDsbC6WNZVFsvX3gsqrBVDUUEsqBT5UZUOErqYQIBQIiWUkHZ+f5xJGEJChmRKZvK8X6/7mju3PrmZPDlz7jnnijEGpZRS/qeUrwNQSilVOJrAlVLKT2kCV0opP6UJXCml/JQmcKWU8lOawJVSyk9pAlcAiMg8ERnp7m19SUTiRaSXB45rROQKx/xkEfmPK9sW4jwjROTnwsapAp8mcD8mIiedpiwROeP0fsSlHMsY08cY84m7tw10xpi7jTFPF/U4ItLQkexLOx17mjHmuqIeO49zdXeca3au5ZGO5YtzLRcR2S0iW/I41mIRSc31WfzW3TGrvJUueBNVXBljQrLnRSQeuN0YsyD3diJS2hiT4c3YVLF3GOgoIqHGmGTHspHAjjy27QrUAEqLSHtjzOpc68cZY6Z4MFaVDy2BByBHCStRRB4RkSTgIxGpKiLfichhETnqmK/ntM9iEbndMT9KRJaJyMuObf8UkT6F3LaRiCwRkRQRWSAi74jI5/nE7UqMT4vIcsfxfhaR6k7rbxGRBBFJFpF/X+T6XCkiSSIS5LRskIhscsx3EJGVInJMRA6IyNsiEpzPsT4WkWec3v/Lsc9+ERmda9t+IrJeRE6IyF4Rmei0eonj9ZijFNsx+9o67d9JRFaLyHHHaydXr00e0oA5wDDH/kHAzcC0PLYdCXwD/OCYV8WEJvDAVQuoBjQA7sT+rj9yvK8PnAHevsj+VwLbgerAi8BUEZFCbPsFsAoIBSYCt1zknK7E+HfgNmyJMBh4GEBEWgDvOY5fx3G+euTBGPMbcAromeu4XzjmM4Hxjp+nI3ANcM9F4sYRQ29HPNcCTYDc9e+ngFuBKkA/YKyI3OBY19XxWsUYE2KMWZnr2NWA74E3HT/bq8D3IhKa62e44NpcxKeOeAD+BmwG9uc6bwVgCDaxTwOG5ffPTHmfJvDAlQU8ZYw5a4w5Y4xJNsZ8ZYw5bYxJAZ4Ful1k/wRjzIfGmEzgE6A2UPNSthWR+kB74EljTJoxZhkwN78TuhjjR8aYHcaYM8BMIMqxfAjwnTFmiTHmLPAfxzXITywwHEBEKgF9Hcswxqw1xvxqjMkwxsQD7+cRR16GOuLbbIw5hf2H5fzzLTbG/G6MyTLGbHKcz5Xjgk34O40xnzniigW2Adc7bZPftcmTMWYFUE1EIrCJ/NM8NhsMnAV+xv4DKeOIxdmbjm8r2VOR7wko12gCD1yHjTGp2W9EpIKIvO+oYjiB/cpexbkaIZek7BljzGnHbMglblsH+MtpGcDe/AJ2McYkp/nTTjHVcT62I4Emk78vgMEiUhabpNYZYxIccTR1VN8kOeJ4DlsaL8h5MQAJuX6+K0XkF0cV0XHgbhePm33shFzLEoC6Tu/zuzYX8xkwDugBzM5j/UhgpuOfRirwFRdWo9xvjKniNOXbKke5lybwwJV7mMmHgAjgSmPMZZz7yp5ftYg7HMCW8Co4LQu/yPZFifGA87Ed5wzNb2NjzBZsAuzD+dUnYKtitgFNHHE8XpgYsNVAzr7AfgMJN8ZUBiY7HbegYUH3Y6uWnNUH9rkQ18V8hq0e+iHXP1oc9x96Av9w/DNLwn7T6VtA/bryEk3gJUclbJ3yMUd96lOePqGjRLsGmCgiwSLSkfO/8rszxi+B/iJytaOOdhIFf76/AB7A/qOYlSuOE8BJEWkGjHUxhpnAKBFp4fgHkjv+SthvJKki0gH7jyPbYWyVT+N8jv0D0FRE/i4ipUXkZqAF8J2LseXJGP
Mnthonr5u+t2BbpURgq2OigKZAIo7qJ+VbmsBLjteB8sAR4FfgRy+ddwT2RmAy8AwwA1unmpfXKWSMxpg44F5sUj4AHMUmmovJroNeZIw54rT8YWxyTQE+dMTsSgzzHD/DImCX49XZPcAkEUkBnsQm/Ox9T2Pr/Jc76pGvynXsZKA/9ltKMjAB6J8r7kIxxiwzxuzPY9VI4F1jTJLzhP3m4FyN8rac3w58bVFjUq4RfaCD8iYRmQFsM8Z4/BuAUoFOS+DKo0SkvYhcLiKlHM3sBmLbHyulikh7YipPqwV8jb2hmAiMNcas921ISgUGrUJRSik/pVUoSinlp7xahVK9enXTsGFDb55SKaX83tq1a48YY8JyL/dqAm/YsCFr1qzx5imVUsrviUjuXriAVqEopZTf0gSulFJ+ShO4Ukr5qQLrwEUkHDvMZE3sgDsfGGPecAwZORA7fsMhYFQ+3XGVUl6Wnp5OYmIiqampBW+sio1y5cpRr149ypQp49L2rtzEzAAeMsasc4ybvFZE5gMvZQ8bKSL3Y8d2uLuQcSul3CgxMZFKlSrRsGFD8n8OhypOjDEkJyeTmJhIo0aNXNqnwCoUY8wBY8w6x3wKsBWoa4w54bRZRQoeDlMp5SWpqamEhoZq8vYjIkJoaOglfWu6pGaEItIQaAv85nj/LPZJHsexA8IrpYoJTd7+51J/Zy7fxBSREOzTOB7MLn0bY/5tjAnHPitvXD773Skia0RkzeHDhy8pOKWUUvlzKYGLSBls8p5mjPk6j02mATfmta8x5gNjTIwxJiYs7IKORC75/HPo0AEyMwu1u1LKy5KTk4mKiiIqKopatWpRt27dnPdpaWkX3XfNmjXcf//9BZ6jU6dObol18eLFiAhTpkzJWbZhwwZEhJdffjlnWUZGBmFhYTz66KPn7d+9e3ciIiJyfr4hQ4a4JS5XuNIKRYCpwFZjzKtOy5sYY3Y63g7EPoLKI86ehdWrISEBGuf3vBKlVLERGhrKhg0bAJg4cSIhISE8/PDDOeszMjIoXTrv9BMTE0NMTEyB51ixYoVbYgVo1aoVM2fO5PbbbwcgNjaWyMjI87aZP38+TZs2ZdasWfzf//3fedUd06ZNcylmd3OlBN4Z+2ilniKywTH1BZ4Xkc0isgm4DvtoKo9o0cK+btniqTMopTxt1KhR3H333Vx55ZVMmDCBVatW0bFjR9q2bUunTp3Yvn07YEvE/fv3B2zyHz16NN27d6dx48a8+eabOccLCQnJ2b579+4MGTKEZs2aMWLECLJHWf3hhx9o1qwZ7dq14/777885bm4NGjQgNTWVgwcPYozhxx9/pE+fPudtExsbywMPPED9+vVZuXKl269PYRRYAjfGLCPvB7r+4P5w8ta8uX3dsgXyuf5KqYvo3v3CZUOHwj33wOnT0LfvhetHjbLTkSOQu1Zg8eLCxZGYmMiKFSsICgrixIkTLF26lNKlS7NgwQIef/xxvvrqqwv22bZtG7/88gspKSlEREQwduzYC9pJr1+/nri4OOrUqUPnzp1Zvnw5MTEx3HXXXSxZsoRGjRoxfPjFH+M5ZMgQZs2aRdu2bYmOjqZs2bI561JTU1mwYAHvv/8+x44dIzY29rwqnBEjRlC+fHkArr32Wl566aXCXaBL5BcPdKhSBerU0RK4Uv7upptuIigoCIDjx48zcuRIdu7ciYiQnp6e5z79+vWjbNmylC1blho1anDw4EHq1at33jYdOnTIWRYVFUV8fDwhISE0btw4p0318OHD+eCDD/KNbejQodx8881s27aN4cOHn1dF891339GjRw/Kly/PjTfeyNNPP83rr7+e87P4qgrFLxI4wIABUMh7oEqVeBcrMVeocPH11asXvsSdW8WKFXPm//Of/9CjRw9mz55NfHw83fP6mgDnlYSDgoLIyMgo1DYFqVWrFmXKlGH+/Pm88cYb5yXw2NhYli1bRvZw2MnJySxatIhrr732ks/jTn6TwN
97z9cRKKXc6fjx49StWxeAjz/+2O3Hj4iIYPfu3cTHx9OwYUNmzJhR4D6TJk3i0KFDOSVrIKeqZ+/evTn/KD766CN
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAEICAYAAACktLTqAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA1jUlEQVR4nO3dd3xUVfr48c+TQkJIqAk1KKFLTUhCtdBW6gIqq2JBll0LdvnZ/YpY9+uuriy7qy7q17K6gqu76NKUXlSQukAAaQYNNQQSEkJIO78/zk0hppFMZpKZ5/163dfcfp+5mTxz5txzzxVjDEoppbyLn6cDUEop5Xqa3JVSygtpcldKKS+kyV0ppbyQJnellPJCmtyVUsoLaXJXFRKRxSJym6vX9SQRSRSR4TWwXyMiHZ3xN0Xk6cqsW4Xj3CwiX1U1znL2O1hEkly9X+V+AZ4OQNUMEckoNhkCnAfynOk7jTEfVXZfxphRNbGutzPG3OWK/YhIO+AHINAYk+vs+yOg0n9D5Xs0uXspY0xowbiIJAK/NcYsK7meiAQUJAyllPfQahkfU/CzW0QeE5FjwLsi0kREFohIsoicdsYji22zSkR+64xPEZF1IvKKs+4PIjKqiutGicgaEUkXkWUi8lcR+bCMuCsT4/Mi8rWzv69EJLzY8ltF5JCIpIjIU+Wcn34ickxE/IvNu0ZEtjvjfUXkWxFJFZGjIvIXEalXxr7eE5EXik0/4mxzRESmllh3jIhsFZEzIvKTiMwstniN85oqIhkiMqDg3BbbfqCIbBSRNOd1YGXPTXlE5DJn+1QRSRCRccWWjRaRXc4+D4vIw878cOfvkyoip0RkrYhornEzPeG+qSXQFLgUuAP7OXjXmb4EOAf8pZzt+wHfA+HA74F3RESqsO4/gO+AZsBM4NZyjlmZGG8Cfg00B+oBBcmmG/CGs//WzvEiKYUxZgNwFhhaYr//cMbzgIec9zMAGAbcXU7cODGMdOL5BdAJKFnffxaYDDQGxgDTRGSCs+xK57WxMSbUGPNtiX03BRYCs5339kdgoYg0K/EefnZuKog5EPgP8JWz3X3ARyLSxVnlHWwVXxjQA1jhzP9/QBIQAbQAngS0nxM30+Tum/KBZ4wx540x54wxKcaYz4wxmcaYdOBF4Kpytj9kjHnLGJMHvA+0wv4TV3pdEbkEiAdmGGOyjTHrgC/KOmAlY3zXGLPXGHMO+ASIduZPBBYYY9YYY84DTzvnoCwfA5MARCQMGO3Mwxiz2Riz3hiTa4xJBP5WShylud6Jb6cx5iz2y6z4+1tljNlhjMk3xmx3jleZ/YL9MthnjPm7E9fHwB7gl8XWKevclKc/EAr8r/M3WgEswDk3QA7QTUQaGmNOG2O2FJvfCrjUGJNjjFlrtBMrt9Pk7puSjTFZBRMiEiIif3OqLc5gqwEaF6+aKOFYwYgxJtMZDb3IdVsDp4rNA/iprIArGeOxYuOZxWJqXXzfTnJNKetY2FL6tSISBFwLbDHGHHLi6OxUORxz4ngJW4qvyAUxAIdKvL9+IrLSqXZKA+6q5H4L9n2oxLxDQJti02WdmwpjNsYU/yIsvt/rsF98h0RktYgMcOb/AdgPfCUiB0Xk8cq9DeVKmtx9U8lS1P8DugD9jDENKaoGKKuqxRWOAk1FJKTYvLblrF+dGI8W37dzzGZlrWyM2YVNYqO4sEoGbPXOHqCTE8eTVYkBW7VU3D+wv1zaGmMaAW8W229Fpd4j2Oqq4i4BDlciror227ZEfXnhfo0xG40x47FVNvOxvwgwxqQbY/6fMaY9MA6YLiLDqhmLukia3BVAGLYOO9Wpv32mpg/olIQ3ATNFpJ5T6vtlOZtUJ8ZPgbEicrlz8fM5Kv7s/wN4APsl8s8ScZwBMkSkKzCtkjF8AkwRkW7Ol0vJ+MOwv2SyRKQv9kulQDK2Gql9GfteBHQWkZtEJEBEbgC6YatQqmMDtpT/qIgEishg7N
9orvM3u1lEGhljcrDnJB9ARMaKSEfn2koa9jpFedVgqgZoclcAs4D6wElgPbDETce9GXtRMgV4AZiHbY9fmllUMUZjTAJwDzZhHwVOYy/4laegznuFMeZksfkPYxNvOvCWE3NlYljsvIcV2CqLFSVWuRt4TkTSgRk4pWBn20zsNYavnRYo/UvsOwUYi/11kwI8CowtEfdFM8ZkY5P5KOx5fx2YbIzZ46xyK5DoVE/dhf17gr1gvAzIAL4FXjfGrKxOLOriiV7nULWFiMwD9hhjavyXg1LeTkvuymNEJF5EOoiIn9NUcDy27lYpVU16h6rypJbAv7AXN5OAacaYrZ4NSSnvoNUySinlhbRaRimlvFCtqJYJDw837dq183QYSilVp2zevPmkMSaitGW1Irm3a9eOTZs2eToMpZSqU0Sk5J3JhbRaRimlvJAmd6WU8kKa3JVSygvVijp3pZT75eTkkJSURFZWVsUrK48KDg4mMjKSwMDASm+jyV0pH5WUlERYWBjt2rWj7GetKE8zxpCSkkJSUhJRUVGV3k6rZZTyUVlZWTRr1kwTey0nIjRr1uyif2FpclfKh2lirxuq8nfS5K6UUl6oTif3rVth0CDYts3TkSilLlZKSgrR0dFER0fTsmVL2rRpUzidnZ1d7rabNm3i/vvvr/AYAwcOdEmsq1atYuzYsS7Zl7vU6Quq9erBN9/A7t0QHe3paJRSF6NZs2Zsc0pmM2fOJDQ0lIcffrhweW5uLgEBpaeouLg44uLiKjzGN99845JY66I6XXJv7zx07MABz8ahlHKNKVOmcNddd9GvXz8effRRvvvuOwYMGEBMTAwDBw7k+++/By4sSc+cOZOpU6cyePBg2rdvz+zZswv3FxoaWrj+4MGDmThxIl27duXmm2+moEfcRYsW0bVrV2JjY7n//vsrLKGfOnWKCRMm0KtXL/r378/27dsBWL16deEvj5iYGNLT0zl69ChXXnkl0dHR9OjRg7Vr17r8nJWlTpfc69eH1q1h/35PR6JU3Td48M/nXX893H03ZGbC6NE/Xz5lih1OnoSJEy9ctmpV1eJISkrim2++wd/fnzNnzrB27VoCAgJYtmwZTz75JJ999tnPttmzZw8rV64kPT2dLl26MG3atJ+1Cd+6dSsJCQm0bt2aQYMG8fXXXxMXF8edd97JmjVriIqKYtKkSRXG98wzzxATE8P8+fNZsWIFkydPZtu2bbzyyiv89a9/ZdCgQWRkZBAcHMycOXMYMWIETz31FHl5eWRmZlbtpFRBnU7uAB07asldKW/yq1/9Cn9/fwDS0tK47bbb2LdvHyJCTk5OqduMGTOGoKAggoKCaN68OcePHycyMvKCdfr27Vs4Lzo6msTEREJDQ2nfvn1h+/FJkyYxZ86ccuNbt25d4RfM0KFDSUlJ4cyZMwwaNIjp06dz8803c+211xIZGUl8fDxTp04lJyeHCRMmEO3G+uM6n9yvugp++snTUShV95VX0g4JKX95eHjVS+olNWjQoHD86aefZsiQIfz73/8mMTGRwaX9vACCgoIKx/39/cnNza3SOtXx+OOPM2bMGBYtWsSgQYP48ssvufLKK1mzZg0LFy5kypQpTJ8+ncmTJ7v0uGWp03XuAM89B+++6+kolFI1IS0tjTZt2gDw3nvvuXz/Xbp04eDBgyQmJgIwb968Cre54oor+OijjwBblx8eHk7Dhg05cOAAPXv25LHHHiM+Pp49e/Zw6NAhWrRowe23385vf/tbtmzZ4vL3UJY6n9yVUt7r0Ucf5YknniAmJsblJW2A+vXr8/rrrzNy5EhiY2MJCwujUaNG5W4zc+ZMNm/eTK9evXj88cd5//33AZg1axY9evSgV69eBAYGMmrUKFatWkXv3r2JiYlh3rx5PPDAAy5/D2WpFc9QjYuLM1V9WMfevTB8OPz5zzB+vIsDU8qL7d69m8suu8zTYXhcRkYGoaGhGGO455576NSpEw899JCnw/
qZ0v5eIrLZGFNqm9A6X3KPiLB17vv2eToSpVRd9NZbbxEdHU337t1JS0vjzjvv9HRILlHnL6g2aQJNm2qLGaVU1Tz
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.11499959306584194\n"
]
}
],
"source": [
"plot_training_analysis(history)\n",
"\n",
"y_pred = model.predict(x)\n",
"pck = compute_PCK_alpha(y, y_pred)\n",
"print(pck)"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"from tensorflow.keras.regularizers import L1"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model: \"sequential_8\"\n",
"_________________________________________________________________\n",
" Layer (type) Output Shape Param # \n",
"=================================================================\n",
" conv2d_32 (Conv2D) (None, 62, 62, 32) 896 \n",
" \n",
" max_pooling2d_32 (MaxPoolin (None, 31, 31, 32) 0 \n",
" g2D) \n",
" \n",
" conv2d_33 (Conv2D) (None, 29, 29, 64) 18496 \n",
" \n",
" max_pooling2d_33 (MaxPoolin (None, 14, 14, 64) 0 \n",
" g2D) \n",
" \n",
" conv2d_34 (Conv2D) (None, 12, 12, 92) 53084 \n",
" \n",
" max_pooling2d_34 (MaxPoolin (None, 6, 6, 92) 0 \n",
" g2D) \n",
" \n",
" conv2d_35 (Conv2D) (None, 4, 4, 128) 106112 \n",
" \n",
" max_pooling2d_35 (MaxPoolin (None, 2, 2, 128) 0 \n",
" g2D) \n",
" \n",
" flatten_8 (Flatten) (None, 512) 0 \n",
" \n",
" dense_16 (Dense) (None, 512) 262656 \n",
" \n",
" dense_17 (Dense) (None, 42) 21546 \n",
" \n",
" reshape_8 (Reshape) (None, 3, 14) 0 \n",
" \n",
"=================================================================\n",
"Total params: 462,790\n",
"Trainable params: 462,790\n",
"Non-trainable params: 0\n",
"_________________________________________________________________\n",
"Epoch 1/50\n",
"900/900 [==============================] - 7s 7ms/step - loss: 1600.6188 - custom_mae: 24.7274 - accuracy: 0.0856 - val_loss: 1313.9059 - val_custom_mae: 19.2149 - val_accuracy: 0.0567\n",
"Epoch 2/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 1291.3328 - custom_mae: 19.5816 - accuracy: 0.0663 - val_loss: 1230.8826 - val_custom_mae: 18.9220 - val_accuracy: 0.0633\n",
"Epoch 3/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 1207.2291 - custom_mae: 19.2404 - accuracy: 0.0696 - val_loss: 1155.3102 - val_custom_mae: 18.8178 - val_accuracy: 0.0267\n",
"Epoch 4/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 1132.5310 - custom_mae: 19.0138 - accuracy: 0.0659 - val_loss: 1093.0542 - val_custom_mae: 18.9194 - val_accuracy: 0.0433\n",
"Epoch 5/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 1062.8149 - custom_mae: 18.7545 - accuracy: 0.0681 - val_loss: 1035.3274 - val_custom_mae: 18.8979 - val_accuracy: 0.0733\n",
"Epoch 6/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 1001.9334 - custom_mae: 18.5588 - accuracy: 0.0737 - val_loss: 978.5314 - val_custom_mae: 18.7171 - val_accuracy: 0.0367\n",
"Epoch 7/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 948.2893 - custom_mae: 18.3561 - accuracy: 0.0678 - val_loss: 935.9355 - val_custom_mae: 18.7541 - val_accuracy: 0.0533\n",
"Epoch 8/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 902.5227 - custom_mae: 18.1901 - accuracy: 0.0730 - val_loss: 912.6291 - val_custom_mae: 19.1317 - val_accuracy: 0.0700\n",
"Epoch 9/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 866.1888 - custom_mae: 17.9962 - accuracy: 0.0719 - val_loss: 868.8121 - val_custom_mae: 18.6186 - val_accuracy: 0.0567\n",
"Epoch 10/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 835.9440 - custom_mae: 17.8430 - accuracy: 0.0696 - val_loss: 843.2394 - val_custom_mae: 18.5577 - val_accuracy: 0.0700\n",
"Epoch 11/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 809.4442 - custom_mae: 17.6539 - accuracy: 0.0696 - val_loss: 847.6819 - val_custom_mae: 19.3826 - val_accuracy: 0.0400\n",
"Epoch 12/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 785.1916 - custom_mae: 17.4793 - accuracy: 0.0715 - val_loss: 803.3231 - val_custom_mae: 18.3979 - val_accuracy: 0.0400\n",
"Epoch 13/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 764.9724 - custom_mae: 17.2881 - accuracy: 0.0741 - val_loss: 790.0184 - val_custom_mae: 18.4058 - val_accuracy: 0.0800\n",
"Epoch 14/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 747.8378 - custom_mae: 17.1618 - accuracy: 0.0785 - val_loss: 778.4919 - val_custom_mae: 18.3664 - val_accuracy: 0.0400\n",
"Epoch 15/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 731.6422 - custom_mae: 16.9957 - accuracy: 0.0767 - val_loss: 774.0274 - val_custom_mae: 18.5825 - val_accuracy: 0.0267\n",
"Epoch 16/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 717.2520 - custom_mae: 16.8505 - accuracy: 0.0704 - val_loss: 753.8166 - val_custom_mae: 18.3133 - val_accuracy: 0.0567\n",
"Epoch 17/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 704.8784 - custom_mae: 16.7288 - accuracy: 0.0726 - val_loss: 747.3119 - val_custom_mae: 18.3142 - val_accuracy: 0.0333\n",
"Epoch 18/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 692.4477 - custom_mae: 16.5168 - accuracy: 0.0741 - val_loss: 736.3395 - val_custom_mae: 18.2599 - val_accuracy: 0.0400\n",
"Epoch 19/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 681.8674 - custom_mae: 16.3894 - accuracy: 0.0719 - val_loss: 732.4158 - val_custom_mae: 18.3323 - val_accuracy: 0.0567\n",
"Epoch 20/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 671.8062 - custom_mae: 16.2239 - accuracy: 0.0685 - val_loss: 723.9401 - val_custom_mae: 18.2230 - val_accuracy: 0.0700\n",
"Epoch 21/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 663.5496 - custom_mae: 16.1147 - accuracy: 0.0707 - val_loss: 717.6505 - val_custom_mae: 18.2588 - val_accuracy: 0.0567\n",
"Epoch 22/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 655.1087 - custom_mae: 16.0022 - accuracy: 0.0778 - val_loss: 711.7806 - val_custom_mae: 18.1043 - val_accuracy: 0.0467\n",
"Epoch 23/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 648.0977 - custom_mae: 15.8962 - accuracy: 0.0770 - val_loss: 724.2146 - val_custom_mae: 18.7435 - val_accuracy: 0.0867\n",
"Epoch 24/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 638.8377 - custom_mae: 15.6779 - accuracy: 0.0741 - val_loss: 702.2473 - val_custom_mae: 18.1418 - val_accuracy: 0.0833\n",
"Epoch 25/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 631.8398 - custom_mae: 15.5892 - accuracy: 0.0744 - val_loss: 699.0881 - val_custom_mae: 18.1864 - val_accuracy: 0.0800\n",
"Epoch 26/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 625.5166 - custom_mae: 15.5084 - accuracy: 0.0815 - val_loss: 700.0754 - val_custom_mae: 18.2896 - val_accuracy: 0.0867\n",
"Epoch 27/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 618.1070 - custom_mae: 15.3338 - accuracy: 0.0774 - val_loss: 696.2742 - val_custom_mae: 18.4092 - val_accuracy: 0.0633\n",
"Epoch 28/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 611.0324 - custom_mae: 15.2123 - accuracy: 0.0819 - val_loss: 693.2403 - val_custom_mae: 18.3400 - val_accuracy: 0.0433\n",
"Epoch 29/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 605.3654 - custom_mae: 15.1036 - accuracy: 0.0785 - val_loss: 689.4489 - val_custom_mae: 18.3240 - val_accuracy: 0.0867\n",
"Epoch 30/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 598.5970 - custom_mae: 14.9257 - accuracy: 0.0830 - val_loss: 697.1451 - val_custom_mae: 18.6957 - val_accuracy: 0.0700\n",
"Epoch 31/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 593.0287 - custom_mae: 14.8324 - accuracy: 0.0837 - val_loss: 682.5284 - val_custom_mae: 18.2608 - val_accuracy: 0.1067\n",
"Epoch 32/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 586.8223 - custom_mae: 14.6828 - accuracy: 0.0919 - val_loss: 676.1535 - val_custom_mae: 18.1355 - val_accuracy: 0.0600\n",
"Epoch 33/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 580.5615 - custom_mae: 14.5288 - accuracy: 0.0893 - val_loss: 679.8427 - val_custom_mae: 18.3359 - val_accuracy: 0.0700\n",
"Epoch 34/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 575.5278 - custom_mae: 14.4210 - accuracy: 0.0848 - val_loss: 673.6619 - val_custom_mae: 18.2497 - val_accuracy: 0.0467\n",
"Epoch 35/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 569.3248 - custom_mae: 14.2640 - accuracy: 0.0837 - val_loss: 674.8303 - val_custom_mae: 18.3349 - val_accuracy: 0.0567\n",
"Epoch 36/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 563.3799 - custom_mae: 14.1157 - accuracy: 0.0863 - val_loss: 678.8046 - val_custom_mae: 18.4942 - val_accuracy: 0.0667\n",
"Epoch 37/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 558.2557 - custom_mae: 13.9926 - accuracy: 0.0881 - val_loss: 669.0584 - val_custom_mae: 18.3795 - val_accuracy: 0.0700\n",
"Epoch 38/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 553.7178 - custom_mae: 13.9113 - accuracy: 0.0867 - val_loss: 664.1337 - val_custom_mae: 18.2757 - val_accuracy: 0.0667\n",
"Epoch 39/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 547.0479 - custom_mae: 13.6922 - accuracy: 0.0893 - val_loss: 665.8436 - val_custom_mae: 18.4226 - val_accuracy: 0.0700\n",
"Epoch 40/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 542.4849 - custom_mae: 13.6235 - accuracy: 0.0881 - val_loss: 667.3710 - val_custom_mae: 18.5683 - val_accuracy: 0.0700\n",
"Epoch 41/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 537.6252 - custom_mae: 13.4894 - accuracy: 0.0885 - val_loss: 662.0100 - val_custom_mae: 18.4422 - val_accuracy: 0.0633\n",
"Epoch 42/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 532.7130 - custom_mae: 13.3705 - accuracy: 0.0941 - val_loss: 658.4034 - val_custom_mae: 18.3899 - val_accuracy: 0.0467\n",
"Epoch 43/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 526.7731 - custom_mae: 13.1849 - accuracy: 0.0948 - val_loss: 663.8610 - val_custom_mae: 18.7139 - val_accuracy: 0.0933\n",
"Epoch 44/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 521.6479 - custom_mae: 13.0322 - accuracy: 0.0937 - val_loss: 659.8156 - val_custom_mae: 18.6054 - val_accuracy: 0.0500\n",
"Epoch 45/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 517.1684 - custom_mae: 12.9476 - accuracy: 0.0937 - val_loss: 655.1570 - val_custom_mae: 18.5126 - val_accuracy: 0.0633\n",
"Epoch 46/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 511.0065 - custom_mae: 12.7268 - accuracy: 0.0915 - val_loss: 654.9119 - val_custom_mae: 18.5737 - val_accuracy: 0.0500\n",
"Epoch 47/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 506.4033 - custom_mae: 12.6392 - accuracy: 0.0911 - val_loss: 653.8961 - val_custom_mae: 18.5993 - val_accuracy: 0.0600\n",
"Epoch 48/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 502.6320 - custom_mae: 12.5451 - accuracy: 0.0996 - val_loss: 658.9085 - val_custom_mae: 18.7675 - val_accuracy: 0.0433\n",
"Epoch 49/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 497.2288 - custom_mae: 12.3731 - accuracy: 0.0970 - val_loss: 652.7715 - val_custom_mae: 18.6982 - val_accuracy: 0.0667\n",
"Epoch 50/50\n",
"900/900 [==============================] - 6s 7ms/step - loss: 492.1422 - custom_mae: 12.1980 - accuracy: 0.0948 - val_loss: 650.8259 - val_custom_mae: 18.7055 - val_accuracy: 0.0400\n"
]
}
],
"source": [
"# Load 1000 images resized to 64x64 together with their skeleton annotations\n",
"x, y = load_data(image_size=64, num_images=1000)\n",
"\n",
"# Convolutional network: four Conv2D + MaxPooling2D blocks, then two dense layers\n",
"model = Sequential([\n",
"    InputLayer(input_shape=x.shape[1:]),\n",
" \n",
" Conv2D(32, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
" \n",
" Conv2D(64, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
"\n",
" Conv2D(92, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
"\n",
" Conv2D(128, 3, activation=\"relu\"),\n",
" MaxPooling2D(pool_size=(2, 2)),\n",
"\n",
" Flatten(),\n",
"\n",
"    # L1 regularization on both dense layers to limit overfitting\n",
"    Dense(512, activation=\"relu\", kernel_regularizer=L1(0.1)),\n",
"    # One linear output per joint coordinate, reshaped back to the annotation shape\n",
"    Dense(y.shape[1] * y.shape[2], activation=\"linear\", kernel_regularizer=L1(0.1)),\n",
"    Reshape(y.shape[1:])\n",
"])\n",
"\n",
"model.summary()\n",
"\n",
"# Low learning rate; custom_mse and custom_mae are defined earlier in the notebook\n",
"adam = optimizers.Adam(learning_rate=1e-5)\n",
"model.compile(optimizer=adam, loss=custom_mse, metrics=[custom_mae, \"accuracy\"])\n",
"# 50 epochs, 10% of the images held out for validation, one image per gradient step\n",
"history = model.fit(x, y, epochs=50, validation_split=0.1, batch_size=1)"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXAAAAEICAYAAABGaK+TAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA430lEQVR4nO3dd3iUVfbA8e9JgQChJ7QETEIVKQFCkSJdQRFcRRCVhbWguBawoD92FcV1G+6KCtgVZBEEBUTpdYVFuvQmJUBCSQgkBEIg5f7+uJNKGmmTSc7neeaZmbeeNwxn7tz3FjHGoJRSyvW4OTsApZRS+aMJXCmlXJQmcKWUclGawJVSykVpAldKKRelCVwppVyUJnAFgIgsFZERhb2tM4lIqIj0KYLjGhFp5Hj9sYi8npdt83GeR0RkRX7jVKWfJnAXJiKX0z2SReRquveP3MyxjDH9jTEzCnvb0s4Y87Qx5u2CHkdEAhzJ3iPdsWcZY+4s6LGzOFcPx7kWZFre2rF8XablIiLHRGR/FsdaJyLxmT6LPxZ2zCprHrlvokoqY4x3ymsRCQWeMMasyrydiHgYYxKLMzZV4kUCt4tITWNMlGPZCOBwFtveAdQCPESkvTFma6b1zxpjPi/CWFU2tAReCjlKWGEi8qqInAW+EpHqIvKTiESKyEXHa/90+6wTkSccr0eKyAYRedex7XER6Z/PbQNF5GcRiRWRVSIyVUT+k03ceYnxbRH5n+N4K0TEJ9364SJyQkSiRORPOfx9OorIWRFxT7fsdyKy2/G6g4j8IiLRInJGRKaISLlsjjVdRP6S7v0rjn1Oi8hjmba9R0R+FZFLInJKRN5Mt/pnx3O0oxR7e8rfNt3+nUVkq4jEOJ475/Vvk4XrwELgIcf+7sBQYFYW244AfgCWOF6rEkITeOlVB6gB3AKMwv5bf+V43wC4CkzJYf+OwCHAB/gn8IWISD62/QbYAtQE3gSG53DOvMT4MPAHbImwHPAygIg0Bz5yHL+e43z+ZMEYsxm4AvTKdNxvHK+TgLGO67kd6A08k0PcOGLo54inL9AYyFz/fgX4PVANuAcYLSL3Odbd4XiuZozxNsb8kunYNYDFwAeOa/s3sFhEama6hhv+Njn42hEPwF3AXuB0pvNWBAZjE/ss4KHsvsxU8dMEXnolAxOMMdeMMVeNMVHGmO+NMXHGmFjgHaB7DvufMMZ8ZoxJAmYAdYHaN7OtiDQA2gNvGGOuG2M2AIuyO2EeY/zKGHPYGHMVmAsEO5YPBn4yxvxsjLkGvO74G2RnNjAMQEQqA3c7lmGM2W6M2WSMSTTGhAKfZBFHVoY44ttrjLmC/cJKf33rjDF7jDHJxpjdjvPl5bhgE/5vxpiZjrhmAweBe9Ntk93fJkvGmI1ADRFpik3kX2ex2f3ANWAF9gvE0xFLeh84fq2kPAp8T0DljSbw0ivSGBOf8kZEKorIJ44qhkvYn+zV0lcjZHI25YUxJs7x0vsmt60HXEi3DOBUdgHnMcaz6V7HpYupXvpjOxJoFNn7BrhfRMpjk9QOY8wJRxxNHNU3Zx1x/BVbGs9NhhiAE5mur6OIrHVUEcUAT+fxuCnHPpFp2QnAL9377P42OZkJPAv0BBZksX4EMNfxpREPfM+N1SjPG2OqpXtk2ypHFS5N4KVX5mEmXwKaAh2NMVVI+8meXbVIYTiDLeFVTLesfg7bFyTGM+mP7Thnzew2NsbsxybA/mSsPgFbFXMQaOyIY3x+YsBWA6X3DfYXSH1jTFXg43THzW1Y0NPYqqX0GgDheYgrJzOx1UNLMn3R4rj/0At41PFldhb7S+fuXOrXVTHRBF52VMbWKUc76lMnFPUJHSXabcCbIlJORG4n40/+wozxO2CAiHR11NFOJPfP9zfAC9gvinmZ4rgEXBaRZsDoPMYwFxgpIs0dXyCZ46+M/UUSLyIdsF8cKSKxVT5B2Rx7CdBERB4WEQ8RGQo0B37KY2
xZMsYcx1bjZHXTdzi2VUpTbHVMMNAECMNR/aScSxN42TEZqACcBzYBy4rpvI9gbwRGAX8BvsXWqWZlMvmM0RizD/gjNimfAS5iE01OUuqg1xhjzqdb/jI2ucYCnzlizksMSx3XsAY44nhO7xlgoojEAm9gE37KvnHYOv//OeqRO2U6dhQwAPsrJQoYBwzIFHe+GGM2GGNOZ7FqBDDNGHM2/QP7yyF9NcoUydgOfHtBY1J5IzqhgypOIvItcNAYU+S/AJQq7bQEroqUiLQXkYYi4uZoZjcI2/5YKVVA2hNTFbU6wHzsDcUwYLQx5lfnhqRU6aBVKEop5aK0CkUppVxUsVah+Pj4mICAgOI8pVJKubzt27efN8b4Zl5erAk8ICCAbdu2FecplVLK5YlI5l64gFahKKWUy9IErpRSLkoTuFJKuShtB65UKZSQkEBYWBjx8fG5b6xKDC8vL/z9/fH09MzT9prAlSqFwsLCqFy5MgEBAWQ/D4cqSYwxREVFERYWRmBgYJ720SoUpUqh+Ph4atasqcnbhYgINWvWvKlfTZrAlSqlNHm7npv9N9MErpRSLsolEvgnn0Dv3s6OQimVV1FRUQQHBxMcHEydOnXw8/NLfX/9+vUc9922bRvPP/98rufo3LlzocS6bt06RITPP/88ddnOnTsREd59993UZYmJifj6+vLaa69l2L9Hjx40bdo09foGDx5cKHHlhUvcxIyIgDVrICEB8nhzVinlRDVr1mTnzp0AvPnmm3h7e/Pyyy+nrk9MTMTDI+v0ExISQkhISK7n2LhxY6HECtCiRQvmzp3LE088AcDs2bNp3bp1hm1WrlxJkyZNmDdvHn/7298yVHfMmjUrTzEXNpcogdd0zGx44YJz41BK5d/IkSN5+umn6dixI+PGjWPLli3cfvvttGnThs6dO3Po0CHAlogHDBgA2OT/2GOP0aNHD4KCgvjggw9Sj+ft7Z26fY8ePRg8eDDNmjXjkUceIWWU1SVLltCsWTPatWvH888/n3rczG655Rbi4+M5d+4cxhiWLVtG//79M2wze/ZsXnjhBRo0aMAvv/xS6H+f/HCJEriPY/rU8+ehdm3nxqKUK+rR48ZlQ4bAM89AXBzcffeN60eOtI/z5yFzrcC6dfmLIywsjI0bN+Lu7s6lS5dYv349Hh4erFq1ivHjx/P999/fsM/BgwdZu3YtsbGxNG3alNGjR9/QTvrXX39l37591KtXjy5duvC///2PkJAQnnrqKX7++WcCAwMZNiznaTwHDx7MvHnzaNOmDW3btqV8+fKp6+Lj41m1ahWffPIJ0dHRzJ49O0MVziOPPEKFChUA6Nu3L5MmTcrfH+gmuUQCTymBny/w7H9KKWd68MEHcXd3ByAmJoYRI0bw22+/ISIkJCRkuc8999xD+fLlKV++PLVq1eLcuXP4+/tn2KZDhw6py4KDgwkNDcXb25ugoKDUNtXDhg3j008/zTa2IUOGMHToUA4ePMiwYcMyVNH89NNP9OzZkwoVKvDAAw/w9ttvM3ny5NRrcVYViksk8Hr1oGNHyKbKTCmVi5xKzBUr5rzexyf/Je7MKlWqlPr69ddfp2fPnixYsIDQ0FB6ZPUzATKUhN3d3UlMTMzXNrmpU6cOnp6erFy5kvfffz9DAp89ezYbNmwgZTjsqKgo1qxZQ9++fW/6PIXJJVLirbfCpk3OjkIpVZhiYmLw8/MDYPr06YV+/KZNm3Ls2DFCQ0MJCAjg22+/zXWfiRMnEhERkVqyBlKrek6dOpX6RfHVV18xe/ZsTeBKqbJp3LhxjBgxgr/85S/cc889hX78ChUqMG3aNPr160elSpVo3759rvtk1TRxwYIF9OrVK0Mpf9CgQYwbN45r164BGevAfXx8WLVqVSFdRc6KdU7MkJAQk98JHbp0gf794c9/LuSglCqFDhw4wK233ursMJzu8uXLeHt7Y4zhj3/8I40bN2bs2LHODitHWf3bich2Y8wNle
y5NiMUkfoislZE9ovIPhF5IdP6l0TEiIhPgSPPQVgY/PZbUZ5BKVXafPbZZwQHB3PbbbcRExPDU0895eyQClVeqlA
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAX0AAAEICAYAAACzliQjAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA5O0lEQVR4nO3dd3wVVfr48c+TTgotgVASJHSpCSRUUYoFRAEVUdZVELtYVhTsgoX92UVUsIvuIuVrW1RcFBVhUYSA9F4ChJYQIIQSQpLz++NMCoGEkHaTe5/36zWvO/dMuc+kPDP3zJlzxBiDUkopz+Dl6gCUUkpVHE36SinlQTTpK6WUB9Gkr5RSHkSTvlJKeRBN+kop5UE06asSE5EfRGR4Wa/rSiKSICKXlsN+jYg0c+bfFZGni7NuCT7nJhH5saRxFrHfXiKSWNb7VRXPx9UBqIolIkfzvQ0ETgJZzvu7jDHTirsvY0z/8ljX3Rlj7i6L/YhIY2A74GuMyXT2PQ0o9u9QeR5N+h7GGBOcMy8iCcDtxph5BdcTEZ+cRKKUch9avaOAvK/vIvKoiOwDPhGRWiLynYgki8ghZz4i3zbzReR2Z36EiPxPRF511t0uIv1LuG6UiCwQkTQRmSci74jIvwuJuzgxPi8ii5z9/SgiYfmW3ywiO0QkRUSeLOLn00VE9omId76ya0RklTPfWUT+EJHDIrJXRN4WEb9C9jVVRF7I936Ms80eERlZYN0BIvKXiBwRkV0iMj7f4gXO62EROSoi3XJ+tvm27y4iS0Uk1XntXtyfTVFE5EJn+8MislZEBuZbdqWIrHP2uVtEHnHKw5zfz2EROSgiC0VEc1AF0x+4yq8eUBu4ALgT+/fxifO+EXACeLuI7bsAG4Ew4GXgIxGREqz7ObAECAXGAzcX8ZnFifFvwK1AXcAPyElCrYEpzv4bOJ8XwVkYY/4EjgF9Cuz3c2c+C3jIOZ5uQF/g3iLixomhnxPPZUBzoOD9hGPALUBNYABwj4gMdpZd7LzWNMYEG2P+KLDv2sD3wCTn2F4HvheR0ALHcMbP5hwx+wLfAj86290PTBORls4qH2GrCkOAtsAvTvnDQCJQBwgHngC0H5gKpklf5ZcNjDPGnDTGnDDGpBhjvjTGHDfGpAETgEuK2H6HMeYDY0wW8ClQH/vPXex1RaQREAc8Y4zJMMb8D5hd2AcWM8ZPjDGbjDEngFlAtFM+BPjOGLPAGHMSeNr5GRRmOjAMQERCgCudMowxy4wxi40xmcaYBOC9s8RxNkOd+NYYY45hT3L5j2++MWa1MSbbGLPK+bzi7BfsSWKzMeZfTlzTgQ3A1fnWKexnU5SuQDDwovM7+gX4DudnA5wCWotIdWPMIWPM8nzl9YELjDGnjDELjXb+VeE06av8ko0x6TlvRCRQRN5zqj+OYKsTauav4ihgX86MMea4Mxt8nus2AA7mKwPYVVjAxYxxX7754/liapB/307STSnss7BX9deKiD9wLbDcGLPDiaOFU3Wxz4njn9ir/nM5LQZgR4Hj6yIivzrVV6nA3cXcb86+dxQo2wE0zPe+sJ/NOWM2xuQ/Qebf73XYE+IOEflNRLo55a8AW4AfRWSbiDxWvMNQZUmTvsqv4FXXw0BLoIsxpjp51QmFVdmUhb1AbREJzFcWWcT6pYlxb/59O58ZWtjKxph12OTWn9OrdsBWE20AmjtxPFGSGLBVVPl9jv2mE2mMqQG8m2+/57pK3oOt9sqvEbC7GHGda7+RBerjc/drjFlqjBmErfr5BvsNAmNMmjHmYWNME2AgMFpE+pYyFnWeNOmrooRg68gPO/XD48r7A50r53hgvIj4OVeJVxexSWli/AK4SkQucm66Pse5/yc+Bx7Enlz+r0AcR4CjItIKuKeYMcwCRohIa+ekUzD+EOw3n3QR6Yw92eRIxlZHNSlk33OAFiLyNxHxEZEbgNbYqpjS+BP7rWCsiPiKSC/s72
iG8zu7SURqGGNOYX8m2QAicpWINHPu3aRi74MUVZ2myoEmfVWUiUA14ACwGPhvBX3uTdiboSnAC8BM7PMEZzOREsZojFkLjMIm8r3AIeyNxqLk1Kn/Yow5kK/8EWxCTgM+cGIuTgw/OMfwC7bq45cCq9wLPCciacAzOFfNzrbHsfcwFjktYroW2HcKcBX221AKMBa4qkDc580Yk4FN8v2xP/fJwC3GmA3OKjcDCU41193Y3yfYG9XzgKPAH8BkY8yvpYlFnT/R+yiqshORmcAGY0y5f9NQyt3plb6qdEQkTkSaioiX06RxELZuWClVSvpErqqM6gFfYW+qJgL3GGP+cm1ISrkHrd5RSikPotU7SinlQSp19U5YWJhp3Lixq8NQSqkqZdmyZQeMMXXOtqxSJ/3GjRsTHx/v6jCUUqpKEZGCT2Ln0uodpZTyIJr0lVLKg2jSV0opD1Kp6/SVUhXv1KlTJCYmkp6efu6VlUsFBAQQERGBr69vsbfRpK+UOk1iYiIhISE0btyYwsfAUa5mjCElJYXExESioqKKvd05q3dE5GMRSRKRNQXK7xeRDc5QaS/nK39cRLaIyEYRuSJfeT+nbIv2o61U5ZWenk5oaKgm/EpORAgNDT3vb2TFudKfih1+7rN8H9Yb2x9KB2PMSRGp65S3Bm4E2mAHWpgnIi2czd7BDgmXCCwVkdlO/+RKqUpGE37VUJLf0zmTvjFmgYg0LlB8D3aotJPOOklO+SBghlO+XUS2AJ2dZVuMMducQGc462rSV0qpClTS1jstgJ4i8qczHFqcU96Q04d+S3TKCis/g4jcKSLxIhKfnJxcouDS06FPH/jkkxJtrpRyoZSUFKKjo4mOjqZevXo0bNgw931GRkaR28bHx/PAAw+c8zO6d+9eJrHOnz+fq666qkz2VVFKeiPXB6iNHSA5DpglIoWN3nNejDHvA+8DxMbGlqg3OH9/WLoU2rUri4iUUhUpNDSUFStWADB+/HiCg4N55JFHcpdnZmbi43P21BUbG0tsbOw5P+P3338vk1iropJe6ScCXxlrCXbIszDsGJn5x/uMcMoKKy8XIhAZCYnnGgNJKVUljBgxgrvvvpsuXbowduxYlixZQrdu3YiJiaF79+5s3LgROP3Ke/z48YwcOZJevXrRpEkTJk2alLu/4ODg3PV79erFkCFDaNWqFTfddBM5PQ/PmTOHVq1a0alTJx544IFzXtEfPHiQwYMH0759e7p27cqqVasA+O2333K/qcTExJCWlsbevXu5+OKLiY6Opm3btixcuLDMf2aFKemV/jdAb+BX50atH3bYtNnA5yLyOvZGbnNgCXYg5+YiEoVN9jdy+lifZS4yEnbtOvd6Sqmi9ep1ZtnQoXDvvXD8OFx55ZnLR4yw04EDMGTI6cvmzy9ZHImJifz+++94e3tz5MgRFi5ciI+PD/PmzeOJJ57gyy+/PGObDRs28Ouvv5KWlkbLli255557zmjT/tdff7F27VoaNGhAjx49WLRoEbGxsdx1110sWLCAqKgohg0bds74xo0bR0xMDN988w2//PILt9xyCytWrODVV1/lnXfeoUePHhw9epSAgADef/99rrjiCp588kmysrI4fvx4yX4oJXDOpC8i04FeQJiIJGIHbv4Y+NhpxpkBDDf29LhWRGZhb9BmAqOMMVnOfu4D5gLewMfO+KTlJiICnBOtUsoNXH/99Xh7ewOQmprK8OHD2bx5MyLCqVOnzrrNgAED8Pf3x9/fn7p167J//34iIiJOW6dz5865ZdHR0SQkJBAcHEyTJk1y278PGzaM999/v8j4/ve//+WeePr06UNKSgpHjhyhR48ejB49mptuuolrr72WiIgI4uLiGDlyJKdOnWLw4MFER0eX5kdzXorTeqewU9zfC1l/Anaw5oLlc4A55xVdKXTsaK/0s7PBSzubUKrEiroyDwwsenlYWMmv7AsKCgrKnX/66afp3bs3X3/9NQkJCfQ629
cRwN/fP3fe29ubzMzMEq1TGo899hgDBgxgzpw59OjRg7lz53LxxRezYMECvv/+e0aMGMHo0aO55ZZbyvRzC+O26XD
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.11475543257101001\n"
]
}
],
"source": [
"plot_training_analysis(history)\n",
"\n",
"# PCK computed over the whole dataset (training and validation images)\n",
"y_pred = model.predict(x)\n",
"pck = compute_PCK_alpha(y, y_pred)\n",
"print(pck)"
]
}
],
"metadata": {
"accelerator": "GPU",
"colab": {
"collapsed_sections": [],
"machine_shape": "hm",
"name": "IAM2020 - TP4 - Estimation de Posture.ipynb",
"provenance": [],
"toc_visible": true
},
"kernelspec": {
"display_name": ".env",
"language": "python",
"name": ".env"
}
},
"nbformat": 4,
"nbformat_minor": 1
}