Loading

Food Recognition Challenge

πŸ• Food Recognition Challenge : Data Exploration & Baseline

Food Recognition Challenge Notebook

Shubhamai

πŸ• Food Recognition Challenge : Data Exploration & Baseline

So, in this project, we are building a Deep Learning Model which is capable to detect various food using Detectron2, coco, along with other libraries such as Weights & Biases for recording our experimentations, and a bunch of other things



Problem

Detecting & Segmenting various kinds of food from an image. For ex. Someone got into new restaurent and get a food that he has never seen, well our DL model is in rescue, so our DL model will help indentifying which food it is from the class our model is being trained on!


Data

We will be using data from Food Recognition Challenge - A benchmark for image-based food recognition challange which was launched on March 9, 2020 and ended on May 26, 2020.

https://www.aicrowd.com/challenges/food-recognition-challenge#datasets

  • We have a total of 24120 RGB images with 2053 validation, all in MS-COCO format and test set for now is same as validation ( debug mode ).


Evaluation

The evaluation metrics is IOU aka. Intersection Over Union ( more about that later ).

The actualy metric is computed by averaging over all the precision and recall values for IOU which greater than 0.5.

https://www.aicrowd.com/challenges/food-recognition-challenge#evaluation-criteria

Table of Content

  1. Setting our Workspace πŸ’Ό
    • Downloading & Unzipping our Dataset
    • Downloading & Importing Necessary Libraries
  1. Data Exploration 🧐

    • Reading our Dataset
    • Data Visualisations
  2. Image Visulisation πŸ–ΌοΈ

    • Reading Images
  3. Creating our Dataset πŸ”¨

    • Fixing the Dataset
    • Creating our dataset
  4. Creating our Model 🏭

    • Creating R-CNN Model
    • Setting up hyperparameters
  5. Training the Model πŸš‚

    • Setting up Tensorboard
    • Start Training!
  6. Evaluating the model πŸ§ͺ

    • Evaluating our Model
  7. Testing the Model πŸ’―

    • Testing the Model
  8. Submitting our predictions πŸ“

  9. Generate More Data + Some tips & tricks πŸ’‘

In [ ]:

Setting our Workspace 💼

In this section we will be downloading our dataset, unzipping it & downliading detectron2 library and importing all libraries that we will be using

Downloading & Unzipping our Dataset

In [ ]:
# Downloading Training Dataset
!wget -q https://datasets.aicrowd.com/default/aicrowd-public-datasets/food-recognition-challenge/v0.4/train-v0.4.tar.gz -O train.zip

# Downloading Validation Dataset
!wget -q https://datasets.aicrowd.com/default/aicrowd-public-datasets/food-recognition-challenge/v0.4/val-v0.4.tar.gz -O val.zip
In [ ]:
# Unzipping Training Dataset
!unzip train.zip > /dev/null
In [ ]:
# Unzipping Validation Dataset
!unzip val.zip > /dev/null

So, the data structure is something like this

content
|
└─── sample_data
|
└─── Train 
β”‚   β”‚   annotations.json
β”‚   └───images
β”‚       β”‚   012170.jpg
β”‚       β”‚   012030.jpg
β”‚       β”‚   ...
β”‚   
└─── val
β”‚   β”‚   annotations.json
β”‚   └───images
β”‚       β”‚   011397.jpg
β”‚       β”‚   012340.jpg
β”‚       β”‚   ...
|    train.zip
|    val.zip

Importing Necessary Libraries

In [1]:
# Making sure that we are using GPUs
!nvidia-smi
Fri Feb  5 05:10:23 2021       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 460.39       Driver Version: 418.67       CUDA Version: 10.1     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla P100-PCIE...  Off  | 00000000:00:04.0 Off |                    0 |
| N/A   36C    P0    26W / 250W |      0MiB / 16280MiB |      0%      Default |
|                               |                      |                 ERR! |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
In [ ]:
# install dependencies: (use cu101 because colab has CUDA 10.1)
!pip install -U torch==1.5 torchvision==0.6 -f https://download.pytorch.org/whl/cu101/torch_stable.html 
!pip install cython pyyaml==5.1
!pip install -U 'git+https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI'
import torch, torchvision
print(torch.__version__, torch.cuda.is_available())
!gcc --version
Looking in links: https://download.pytorch.org/whl/cu101/torch_stable.html
Collecting torch==1.5
  Downloading https://download.pytorch.org/whl/cu101/torch-1.5.0%2Bcu101-cp36-cp36m-linux_x86_64.whl (703.8MB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 703.8MB 24kB/s 
Collecting torchvision==0.6
  Downloading https://download.pytorch.org/whl/cu101/torchvision-0.6.0%2Bcu101-cp36-cp36m-linux_x86_64.whl (6.6MB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 6.6MB 26.7MB/s 
Requirement already satisfied, skipping upgrade: numpy in /usr/local/lib/python3.6/dist-packages (from torch==1.5) (1.19.5)
Requirement already satisfied, skipping upgrade: future in /usr/local/lib/python3.6/dist-packages (from torch==1.5) (0.16.0)
Requirement already satisfied, skipping upgrade: pillow>=4.1.1 in /usr/local/lib/python3.6/dist-packages (from torchvision==0.6) (7.0.0)
Installing collected packages: torch, torchvision
  Found existing installation: torch 1.7.0+cu101
    Uninstalling torch-1.7.0+cu101:
      Successfully uninstalled torch-1.7.0+cu101
  Found existing installation: torchvision 0.8.1+cu101
    Uninstalling torchvision-0.8.1+cu101:
      Successfully uninstalled torchvision-0.8.1+cu101
Successfully installed torch-1.5.0+cu101 torchvision-0.6.0+cu101
Requirement already satisfied: cython in /usr/local/lib/python3.6/dist-packages (0.29.21)
Collecting pyyaml==5.1
  Downloading https://files.pythonhosted.org/packages/9f/2c/9417b5c774792634834e730932745bc09a7d36754ca00acf1ccd1ac2594d/PyYAML-5.1.tar.gz (274kB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 276kB 8.6MB/s 
Building wheels for collected packages: pyyaml
  Building wheel for pyyaml (setup.py) ... done
  Created wheel for pyyaml: filename=PyYAML-5.1-cp36-cp36m-linux_x86_64.whl size=44075 sha256=d5c1880e50d90a9cdc7e61d91bf4def3e289689b69a2f756cd8fe7f977133c5d
  Stored in directory: /root/.cache/pip/wheels/ad/56/bc/1522f864feb2a358ea6f1a92b4798d69ac783a28e80567a18b
Successfully built pyyaml
Installing collected packages: pyyaml
  Found existing installation: PyYAML 3.13
    Uninstalling PyYAML-3.13:
      Successfully uninstalled PyYAML-3.13
Successfully installed pyyaml-5.1
Collecting git+https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI
  Cloning https://github.com/cocodataset/cocoapi.git to /tmp/pip-req-build-ck15ryyp
  Running command git clone -q https://github.com/cocodataset/cocoapi.git /tmp/pip-req-build-ck15ryyp
Requirement already satisfied, skipping upgrade: setuptools>=18.0 in /usr/local/lib/python3.6/dist-packages (from pycocotools==2.0) (53.0.0)
Requirement already satisfied, skipping upgrade: cython>=0.27.3 in /usr/local/lib/python3.6/dist-packages (from pycocotools==2.0) (0.29.21)
Requirement already satisfied, skipping upgrade: matplotlib>=2.1.0 in /usr/local/lib/python3.6/dist-packages (from pycocotools==2.0) (3.2.2)
Requirement already satisfied, skipping upgrade: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib>=2.1.0->pycocotools==2.0) (2.4.7)
Requirement already satisfied, skipping upgrade: numpy>=1.11 in /usr/local/lib/python3.6/dist-packages (from matplotlib>=2.1.0->pycocotools==2.0) (1.19.5)
Requirement already satisfied, skipping upgrade: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib>=2.1.0->pycocotools==2.0) (0.10.0)
Requirement already satisfied, skipping upgrade: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib>=2.1.0->pycocotools==2.0) (2.8.1)
Requirement already satisfied, skipping upgrade: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib>=2.1.0->pycocotools==2.0) (1.3.1)
Requirement already satisfied, skipping upgrade: six in /usr/local/lib/python3.6/dist-packages (from cycler>=0.10->matplotlib>=2.1.0->pycocotools==2.0) (1.15.0)
Building wheels for collected packages: pycocotools
  Building wheel for pycocotools (setup.py) ... done
  Created wheel for pycocotools: filename=pycocotools-2.0-cp36-cp36m-linux_x86_64.whl size=265567 sha256=cbe08ba616ad4b301ac531aa3a7d8feda31eb5da93c91abfb36305afa400828a
  Stored in directory: /tmp/pip-ephem-wheel-cache-g1nh7ukv/wheels/90/51/41/646daf401c3bc408ff10de34ec76587a9b3ebfac8d21ca5c3a
Successfully built pycocotools
Installing collected packages: pycocotools
  Found existing installation: pycocotools 2.0.2
    Uninstalling pycocotools-2.0.2:
      Successfully uninstalled pycocotools-2.0.2
Successfully installed pycocotools-2.0
1.5.0+cu101 True
gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
Copyright (C) 2017 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

In [ ]:
# install detectron2:
!pip install detectron2==0.1.2 -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/index.html
Looking in links: https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/index.html
Collecting detectron2==0.1.2
  Downloading https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/torch1.5/detectron2-0.1.2%2Bcu101-cp36-cp36m-linux_x86_64.whl (6.2MB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 6.2MB 4.6MB/s 
Collecting yacs>=0.1.6
  Downloading https://files.pythonhosted.org/packages/38/4f/fe9a4d472aa867878ce3bb7efb16654c5d63672b86dc0e6e953a67018433/yacs-0.1.8-py3-none-any.whl
Requirement already satisfied: Pillow in /usr/local/lib/python3.6/dist-packages (from detectron2==0.1.2) (7.0.0)
Collecting fvcore
  Downloading https://files.pythonhosted.org/packages/ed/38/e425d90ddd07e3d23a84b49d636df76a41f645e62fd6dc944b5769c8ab34/fvcore-0.1.3.post20210204.tar.gz
Requirement already satisfied: matplotlib in /usr/local/lib/python3.6/dist-packages (from detectron2==0.1.2) (3.2.2)
Requirement already satisfied: cloudpickle in /usr/local/lib/python3.6/dist-packages (from detectron2==0.1.2) (1.3.0)
Requirement already satisfied: tqdm>4.29.0 in /usr/local/lib/python3.6/dist-packages (from detectron2==0.1.2) (4.41.1)
Requirement already satisfied: tensorboard in /usr/local/lib/python3.6/dist-packages (from detectron2==0.1.2) (2.4.1)
Requirement already satisfied: pydot in /usr/local/lib/python3.6/dist-packages (from detectron2==0.1.2) (1.3.0)
Requirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from detectron2==0.1.2) (0.16.0)
Collecting mock
  Downloading https://files.pythonhosted.org/packages/5c/03/b7e605db4a57c0f6fba744b11ef3ddf4ddebcada35022927a2b5fc623fdf/mock-4.0.3-py3-none-any.whl
Requirement already satisfied: tabulate in /usr/local/lib/python3.6/dist-packages (from detectron2==0.1.2) (0.8.7)
Requirement already satisfied: termcolor>=1.1 in /usr/local/lib/python3.6/dist-packages (from detectron2==0.1.2) (1.1.0)
Requirement already satisfied: PyYAML in /usr/local/lib/python3.6/dist-packages (from yacs>=0.1.6->detectron2==0.1.2) (5.1)
Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from fvcore->detectron2==0.1.2) (1.19.5)
Collecting iopath>=0.1.2
  Downloading https://files.pythonhosted.org/packages/f2/c8/1830019bcecf26e76c3fdc36e2a6fa454388a233894dcf6f5eb00a881468/iopath-0.1.3.tar.gz
Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->detectron2==0.1.2) (2.8.1)
Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib->detectron2==0.1.2) (0.10.0)
Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->detectron2==0.1.2) (1.3.1)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->detectron2==0.1.2) (2.4.7)
Requirement already satisfied: absl-py>=0.4 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (0.10.0)
Requirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (1.0.1)
Requirement already satisfied: google-auth<2,>=1.6.3 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (1.24.0)
Requirement already satisfied: protobuf>=3.6.0 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (3.12.4)
Requirement already satisfied: setuptools>=41.0.0 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (53.0.0)
Requirement already satisfied: six>=1.10.0 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (1.15.0)
Requirement already satisfied: grpcio>=1.24.3 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (1.32.0)
Requirement already satisfied: requests<3,>=2.21.0 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (2.23.0)
Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (3.3.3)
Requirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (1.8.0)
Requirement already satisfied: wheel>=0.26; python_version >= "3" in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (0.36.2)
Requirement already satisfied: google-auth-oauthlib<0.5,>=0.4.1 in /usr/local/lib/python3.6/dist-packages (from tensorboard->detectron2==0.1.2) (0.4.2)
Collecting portalocker
  Downloading https://files.pythonhosted.org/packages/82/22/e684c9e2e59b561dbe36538852e81849122c666c423448e3a5c99362c228/portalocker-2.2.1-py2.py3-none-any.whl
Requirement already satisfied: rsa<5,>=3.1.4; python_version >= "3.6" in /usr/local/lib/python3.6/dist-packages (from google-auth<2,>=1.6.3->tensorboard->detectron2==0.1.2) (4.7)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.6/dist-packages (from google-auth<2,>=1.6.3->tensorboard->detectron2==0.1.2) (0.2.8)
Requirement already satisfied: cachetools<5.0,>=2.0.0 in /usr/local/lib/python3.6/dist-packages (from google-auth<2,>=1.6.3->tensorboard->detectron2==0.1.2) (4.2.1)
Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests<3,>=2.21.0->tensorboard->detectron2==0.1.2) (3.0.4)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests<3,>=2.21.0->tensorboard->detectron2==0.1.2) (2020.12.5)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests<3,>=2.21.0->tensorboard->detectron2==0.1.2) (2.10)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests<3,>=2.21.0->tensorboard->detectron2==0.1.2) (1.24.3)
Requirement already satisfied: importlib-metadata; python_version < "3.8" in /usr/local/lib/python3.6/dist-packages (from markdown>=2.6.8->tensorboard->detectron2==0.1.2) (3.4.0)
Requirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.6/dist-packages (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard->detectron2==0.1.2) (1.3.0)
Requirement already satisfied: pyasn1>=0.1.3 in /usr/local/lib/python3.6/dist-packages (from rsa<5,>=3.1.4; python_version >= "3.6"->google-auth<2,>=1.6.3->tensorboard->detectron2==0.1.2) (0.4.8)
Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < "3.8"->markdown>=2.6.8->tensorboard->detectron2==0.1.2) (3.4.0)
Requirement already satisfied: typing-extensions>=3.6.4; python_version < "3.8" in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < "3.8"->markdown>=2.6.8->tensorboard->detectron2==0.1.2) (3.7.4.3)
Requirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.6/dist-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard->detectron2==0.1.2) (3.1.0)
Building wheels for collected packages: fvcore, iopath
  Building wheel for fvcore (setup.py) ... done
  Created wheel for fvcore: filename=fvcore-0.1.3.post20210204-cp36-none-any.whl size=44945 sha256=27670c0261008ace4e3cb172a66fc686fd57d97d2ab64ed5db8864c4aad9c1a8
  Stored in directory: /root/.cache/pip/wheels/7a/d5/3d/1393f94e0a0c6345f674387da5bd382c9aba98a23371a3804e
  Building wheel for iopath (setup.py) ... done
  Created wheel for iopath: filename=iopath-0.1.3-cp36-none-any.whl size=11169 sha256=bc4be0e595dfbaf354f3442d36c2eee174d58159f10e0ce78a7ae4f49c5d28bf
  Stored in directory: /root/.cache/pip/wheels/a9/1d/55/94a55e032409ac7617f9cbb88a0fa2cf4e7208806c29730804
Successfully built fvcore iopath
Installing collected packages: yacs, portalocker, iopath, fvcore, mock, detectron2
Successfully installed detectron2-0.1.2+cu101 fvcore-0.1.3.post20210204 iopath-0.1.3 mock-4.0.3 portalocker-2.2.1 yacs-0.1.8
In [ ]:
# You may need to restart your runtime prior to this, to let your installation take effect
# Some basic setup:
# Setup detectron2 logger
import detectron2
from detectron2.utils.logger import setup_logger
setup_logger()

# import some common libraries
import numpy as np
import pandas as pd
import cv2
import json
from tqdm.notebook import tqdm

# import some common detectron2 utilities
from detectron2 import model_zoo
from detectron2.engine import DefaultPredictor
from detectron2.config import get_cfg
from detectron2.utils.visualizer import Visualizer
from detectron2.data import MetadataCatalog
from detectron2.utils.visualizer import ColorMode
from detectron2.data.datasets import register_coco_instances
from detectron2.engine import DefaultTrainer
from detectron2.evaluation import COCOEvaluator, inference_on_dataset
from detectron2.data import build_detection_test_loader

# For reading annotations file
from pycocotools.coco import COCO

# utilities
from pprint import pprint # For beautiful print!
import os 

# For data visualisation
import matplotlib.pyplot as plt
import plotly.graph_objects as go
import plotly.express as px
from google.colab.patches import cv2_imshow
** fvcore version of PathManager will be deprecated soon. **
** Please migrate to the version in iopath repo. **
https://github.com/facebookresearch/iopath 

In [ ]:

Data Exploration 🧐

In this section we are going to read our dataset & doing some data visualisations

Reading Data

In [ ]:
# Reading annotations.json

TRAIN_ANNOTATIONS_PATH = "/content/drive/MyDrive/aicrowd/annotations/training.json"
TRAIN_IMAGE_DIRECTIORY = "/content/train/images/"

VAL_IMAGE_DIRECTIORY = "/content/drive/MyDrive/aicrowd/annotations/validation.json"
VAL_ANNOTATIONS_PATH = "/content/val/annotations.json"

train_coco = COCO(TRAIN_ANNOTATIONS_PATH)
loading annotations into memory...
Done (t=4.92s)
creating index...
index created!
In [ ]:
# Reading the annotation files
with open(TRAIN_ANNOTATIONS_PATH) as f:
  train_annotations_data = json.load(f)

with open(VAL_ANNOTATIONS_PATH) as f:
  val_annotations_data = json.load(f)
---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
<ipython-input-6-e838d88172ff> in <module>()
      3   train_annotations_data = json.load(f)
      4 
----> 5 with open(VAL_ANNOTATIONS_PATH) as f:
      6   val_annotations_data = json.load(f)

FileNotFoundError: [Errno 2] No such file or directory: '/content/val/annotations.json'
In [ ]:
train_annotations_data['annotations'][0]

Data Format:

Our COCO data format is something like this -

"info": {...},
"categories": [...],
"images": [...],
"annotations": [...],

In which categories is like this

[
  {'id': 2578,
  'name': 'water',
  'name_readable': 'Water',
  'supercategory': 'food'},
  {'id': 1157,
  'name': 'pear',
  'name_readable': 'Pear',
  'supercategory': 'food'},
  ...
  {'id': 1190,
  'name': 'peach',
  'name_readable': 'Peach',
  'supercategory': 'food'}
]

Info is empty ( not sure why )

images is like this

[
  {'file_name': '065537.jpg', 
  'height': 464, 
  'id': 65537, 
  'width': 464},
  {'file_name': '065539.jpg', 
  'height': 464, 
  'id': 65539, 
  'width': 464},
 ...
  {'file_name': '069900.jpg', 
  'height': 391, 
  'id': 69900, 
  'width': 392},
]

Annotations is like this

{'area': 44320.0,
 'bbox': [86.5, 127.49999999999999, 286.0, 170.0],
 'category_id': 2578,
 'id': 102434,
 'image_id': 65537,
 'iscrowd': 0,
 'segmentation': [[235.99999999999997,
   372.5,
   169.0,
   372.5,
   ...
   368.5,
   264.0,
   371.5]]}
In [ ]:
# Reading all classes

category_ids = train_coco.loadCats(train_coco.getCatIds())

category_names = [_["name_readable"] for _ in category_ids]

pprint(", ".join(category_names))
('Water, Pear, Egg, Grapes, Butter, Bread, white, Jam, Bread, whole wheat, '
 'Apple, Tea, green, White coffee, with caffeine, Tea, black, Mixed salad '
 '(chopped without sauce), Cheese, Tomato sauce, Pasta, spaghetti, Carrot, '
 'Onion, Beef, cut into stripes (only meat), Rice noodles/vermicelli, Salad, '
 'leaf / salad, green, Bread, grain, Espresso, with caffeine, Banana, Mixed '
 'vegetables, Bread, wholemeal, Savoury puff pastry, Wine, white, Dried meat, '
 'Fresh cheese, Red radish, Hard cheese, Ham, raw, Bread, fruit, Oil & vinegar '
 'salad dressing, Tomato, Cauliflower, Potato-gnocchi, Wine, red, Sauce, '
 'cream, Pasta, linguini, parpadelle, Tagliatelle, French beans, Almonds, Dark '
 'chocolate, Mandarine, Semi-hard cheese, Croissant, Sushi, Berries, Biscuits, '
 'Thickened cream (> 35%), Corn, Celeriac, Alfa sprouts, Chickpeas, Leaf '
 'spinach, Rice, Chocolate cookies, Pineapple, Tart, Coffee, with caffeine, '
 'Focaccia, Pizza, with vegetables, baked, Soup, vegetable, Bread, toast, '
 'Potatoes steamed, Spaetzle, Frying sausage, Lasagne, meat, prepared, Boisson '
 'au glucose 50g, Müesli, Peanut butter, Chips, french fries, Mushroom, '
 'Ratatouille, Veggie burger, Country fries, Yaourt, yahourt, yogourt ou '
 'yoghourt, natural, Hummus, Fish, Beer, Peanut, Pizza, Margherita, baked, '
 'Pickle, Ham, cooked, Cake, chocolate, Bread, French (white flour), Sauce, '
 'mushroom, Rice, Basmati, Soup of lentils, Dahl (Dhal) , Pumpkin, Witloof '
 'chicory, Vegetable au gratin, baked, Balsamic salad dressing, Pasta, penne, '
 'Tea, peppermint, Soup, pumpkin, Quiche, with cheese, baked, with puff '
 'pastry, Mango, Green bean, steamed, without addition of salt, Cucumber, '
 'Bread, half white, Pasta, Beef, filet, Pasta, twist, Pasta, wholemeal, '
 'Walnut, Soft cheese, Salmon, smoked, Sweet pepper, Sauce, soya, Chicken, '
 'breast, Rice, whole-grain, Bread, nut, Green olives, Roll of half-white or '
 'white flour, with large void, Parmesan, Cappuccino, Flakes, oat, Mayonnaise, '
 'Chicken, Cheese for raclette, Orange, Goat cheese (soft), Tuna, Tomme, Apple '
 'pie, Rosti, Broccoli, Beans, kidney, White cabbage, Ketchup, Salt cake '
 '(vegetables, filled) , Pistachio, Feta, Salmon, Avocado, Sauce, pesto, '
 'Salad, rocket, Pizza, with ham, baked, Gruyère, Ristretto, with caffeine, '
 'Risotto, without cheese, cooked, Crunch Müesli, Braided white loaf, Peas, '
 'Chicken curry (cream/coconut milk. curry spices/paste)), Bolognaise sauce, '
 'Bacon, frying, Salami, Lentils, Mushrooms, Mashed potatoes, prepared, with '
 'full fat milk, with butter, Fennel, Chocolate mousse, Corn crisps, Sweet '
 'potato, Birchermüesli, prepared, no sugar added, Beetroot, steamed, without '
 'addition of salt, Sauce (savoury), Leek, Milk, Tea, Fruit salad , Bread, '
 "rye, Salad, lambs' ear, Potatoes au gratin, dauphinois, prepared, Red "
 'cabbage, Praline, Bread, black, Black olives, Mozzarella, Bacon, cooking, '
 'Pomegranate, Hamburger (Bread, meat, ketchup), Curry, vegetarian, Honey, '
 'Juice, orange, Cookies, Mixed nuts, Breadcrumbs (unspiced), Chicken, leg, '
 'Raspberries, Beef, sirloin steak, Salad dressing, Shrimp / prawn (large), '
 'Sour cream, Greek salad, Sauce, roast, Zucchini, Greek Yaourt, yahourt, '
 'yogourt ou yoghourt, Cashew nut, Meat terrine, paté, Chicken, cut into '
 'stripes (only meat), Couscous, Bread, wholemeal toast, CrΓƒΒͺpe, plain, Bread, '
 '5-grain, Tofu, Water, mineral, Ham croissant, Juice, apple, Falafel (balls), '
 'Egg, scrambled, prepared, Brioche, Bread, pita, Pasta, Hârnli, Blue mould '
 'cheese, Vegetable mix, peas and carrots, Quinoa, Crisps, Beef, Butter, '
 'spread, puree almond, Beef, minced (only meat), Hazelnut-chocolate '
 'spread(Nutella, Ovomaltine, Caotina), Chocolate, Nectarine, Ice tea , '
 'Applesauce, unsweetened, canned, Syrup (diluted, ready to drink), Sugar '
 'Melon , Bread, sourdough, Rusk, wholemeal, Gluten-free bread, Shrimp / prawn '
 '(small), French salad dressing, Pancakes, Milk chocolate, Pork, Dairy ice '
 'cream, Guacamole, Sausage, Herbal tea, Fruit coulis, Water with lemon juice, '
 'Brownie, Lemon, Veal sausage, Dates, Roll with pieces of chocolate, '
 'Taboulé, prepared, with couscous, Croissant with chocolate filling, '
 'Eggplant, Sesame seeds, Cottage cheese, Fruit tart, Cream cheese, Tea, '
 'verveine, Tiramisu, Grits, polenta, maize flour, Pasta, noodles, Artichoke, '
 'Blueberries, Mixed seeds, Caprese salad (Tomato Mozzarella), Omelette, '
 'plain, Hazelnut, Kiwi, Dried raisins, Kolhrabi, Plums, Beetroot, raw, Cream, '
 'Fajita (bread only), Apricots, Kefir drink, Bread, Strawberries, Wine, '
 'rosé, Watermelon, fresh, Green asparagus, White asparagus, Peach')
In [ ]:
category_names
Out[ ]:
['Water',
 'Pear',
 'Egg',
 'Grapes',
 'Butter',
 'Bread, white',
 'Jam',
 'Bread, whole wheat',
 'Apple',
 'Tea, green',
 'White coffee, with caffeine',
 'Tea, black',
 'Mixed salad (chopped without sauce)',
 'Cheese',
 'Tomato sauce',
 'Pasta, spaghetti',
 'Carrot',
 'Onion',
 'Beef, cut into stripes (only meat)',
 'Rice noodles/vermicelli',
 'Salad, leaf / salad, green',
 'Bread, grain',
 'Espresso, with caffeine',
 'Banana',
 'Mixed vegetables',
 'Bread, wholemeal',
 'Savoury puff pastry',
 'Wine, white',
 'Dried meat',
 'Fresh cheese',
 'Red radish',
 'Hard cheese',
 'Ham, raw',
 'Bread, fruit',
 'Oil & vinegar salad dressing',
 'Tomato',
 'Cauliflower',
 'Potato-gnocchi',
 'Wine, red',
 'Sauce, cream',
 'Pasta, linguini, parpadelle, Tagliatelle',
 'French beans',
 'Almonds',
 'Dark chocolate',
 'Mandarine',
 'Semi-hard cheese',
 'Croissant',
 'Sushi',
 'Berries',
 'Biscuits',
 'Thickened cream (> 35%)',
 'Corn',
 'Celeriac',
 'Alfa sprouts',
 'Chickpeas',
 'Leaf spinach',
 'Rice',
 'Chocolate cookies',
 'Pineapple',
 'Tart',
 'Coffee, with caffeine',
 'Focaccia',
 'Pizza, with vegetables, baked',
 'Soup, vegetable',
 'Bread, toast',
 'Potatoes steamed',
 'Spaetzle',
 'Frying sausage',
 'Lasagne, meat, prepared',
 'Boisson au glucose 50g',
 'Müesli',
 'Peanut butter',
 'Chips, french fries',
 'Mushroom',
 'Ratatouille',
 'Veggie burger',
 'Country fries',
 'Yaourt, yahourt, yogourt ou yoghourt, natural',
 'Hummus',
 'Fish',
 'Beer',
 'Peanut',
 'Pizza, Margherita, baked',
 'Pickle',
 'Ham, cooked',
 'Cake, chocolate',
 'Bread, French (white flour)',
 'Sauce, mushroom',
 'Rice, Basmati',
 'Soup of lentils, Dahl (Dhal) ',
 'Pumpkin',
 'Witloof chicory',
 'Vegetable au gratin, baked',
 'Balsamic salad dressing',
 'Pasta, penne',
 'Tea, peppermint',
 'Soup, pumpkin',
 'Quiche, with cheese, baked, with puff pastry',
 'Mango',
 'Green bean, steamed, without addition of salt',
 'Cucumber',
 'Bread, half white',
 'Pasta',
 'Beef, filet',
 'Pasta, twist',
 'Pasta, wholemeal',
 'Walnut',
 'Soft cheese',
 'Salmon, smoked',
 'Sweet pepper',
 'Sauce, soya',
 'Chicken, breast',
 'Rice, whole-grain',
 'Bread, nut',
 'Green olives',
 'Roll of half-white or white flour, with large void',
 'Parmesan',
 'Cappuccino',
 'Flakes, oat',
 'Mayonnaise',
 'Chicken',
 'Cheese for raclette',
 'Orange',
 'Goat cheese (soft)',
 'Tuna',
 'Tomme',
 'Apple pie',
 'Rosti',
 'Broccoli',
 'Beans, kidney',
 'White cabbage',
 'Ketchup',
 'Salt cake (vegetables, filled) ',
 'Pistachio',
 'Feta',
 'Salmon',
 'Avocado',
 'Sauce, pesto',
 'Salad, rocket',
 'Pizza, with ham, baked',
 'Gruyère',
 'Ristretto, with caffeine',
 'Risotto, without cheese, cooked',
 'Crunch Müesli',
 'Braided white loaf',
 'Peas',
 'Chicken curry (cream/coconut milk. curry spices/paste))',
 'Bolognaise sauce',
 'Bacon, frying',
 'Salami',
 'Lentils',
 'Mushrooms',
 'Mashed potatoes, prepared, with full fat milk, with butter',
 'Fennel',
 'Chocolate mousse',
 'Corn crisps',
 'Sweet potato',
 'Birchermüesli, prepared, no sugar added',
 'Beetroot, steamed, without addition of salt',
 'Sauce (savoury)',
 'Leek',
 'Milk',
 'Tea',
 'Fruit salad ',
 'Bread, rye',
 "Salad, lambs' ear",
 'Potatoes au gratin, dauphinois, prepared',
 'Red cabbage',
 'Praline',
 'Bread, black',
 'Black olives',
 'Mozzarella',
 'Bacon, cooking',
 'Pomegranate',
 'Hamburger (Bread, meat, ketchup)',
 'Curry, vegetarian',
 'Honey',
 'Juice, orange',
 'Cookies',
 'Mixed nuts',
 'Breadcrumbs (unspiced)',
 'Chicken, leg',
 'Raspberries',
 'Beef, sirloin steak',
 'Salad dressing',
 'Shrimp / prawn (large)',
 'Sour cream',
 'Greek salad',
 'Sauce, roast',
 'Zucchini',
 'Greek Yaourt, yahourt, yogourt ou yoghourt',
 'Cashew nut',
 'Meat terrine, paté',
 'Chicken, cut into stripes (only meat)',
 'Couscous',
 'Bread, wholemeal toast',
 'CrΓƒΒͺpe, plain',
 'Bread, 5-grain',
 'Tofu',
 'Water, mineral',
 'Ham croissant',
 'Juice, apple',
 'Falafel (balls)',
 'Egg, scrambled, prepared',
 'Brioche',
 'Bread, pita',
 'Pasta, Hârnli',
 'Blue mould cheese',
 'Vegetable mix, peas and carrots',
 'Quinoa',
 'Crisps',
 'Beef',
 'Butter, spread, puree almond',
 'Beef, minced (only meat)',
 'Hazelnut-chocolate spread(Nutella, Ovomaltine, Caotina)',
 'Chocolate',
 'Nectarine',
 'Ice tea ',
 'Applesauce, unsweetened, canned',
 'Syrup (diluted, ready to drink)',
 'Sugar Melon ',
 'Bread, sourdough',
 'Rusk, wholemeal',
 'Gluten-free bread',
 'Shrimp / prawn (small)',
 'French salad dressing',
 'Pancakes',
 'Milk chocolate',
 'Pork',
 'Dairy ice cream',
 'Guacamole',
 'Sausage',
 'Herbal tea',
 'Fruit coulis',
 'Water with lemon juice',
 'Brownie',
 'Lemon',
 'Veal sausage',
 'Dates',
 'Roll with pieces of chocolate',
 'Taboulé, prepared, with couscous',
 'Croissant with chocolate filling',
 'Eggplant',
 'Sesame seeds',
 'Cottage cheese',
 'Fruit tart',
 'Cream cheese',
 'Tea, verveine',
 'Tiramisu',
 'Grits, polenta, maize flour',
 'Pasta, noodles',
 'Artichoke',
 'Blueberries',
 'Mixed seeds',
 'Caprese salad (Tomato Mozzarella)',
 'Omelette, plain',
 'Hazelnut',
 'Kiwi',
 'Dried raisins',
 'Kolhrabi',
 'Plums',
 'Beetroot, raw',
 'Cream',
 'Fajita (bread only)',
 'Apricots',
 'Kefir drink',
 'Bread',
 'Strawberries',
 'Wine, rosé',
 'Watermelon, fresh',
 'Green asparagus',
 'White asparagus',
 'Peach']
In [ ]:
# Getting all categoriy with respective to their total images

no_images_per_category = {}

for n, i in enumerate(train_coco.getCatIds()):
  imgIds = train_coco.getImgIds(catIds=i)
  label = category_names[n]
  no_images_per_category[label] = len(imgIds)

img_info = pd.DataFrame(train_coco.loadImgs(train_coco.getImgIds()))

no_images_per_category
Out[ ]:
{'Alfa sprouts': 40,
 'Almonds': 159,
 'Apple': 504,
 'Apple pie': 104,
 'Applesauce, unsweetened, canned': 39,
 'Apricots': 91,
 'Artichoke': 43,
 'Avocado': 296,
 'Bacon, cooking': 47,
 'Bacon, frying': 127,
 'Balsamic salad dressing': 117,
 'Banana': 412,
 'Beans, kidney': 40,
 'Beef': 85,
 'Beef, cut into stripes (only meat)': 41,
 'Beef, filet': 51,
 'Beef, minced (only meat)': 65,
 'Beef, sirloin steak': 41,
 'Beer': 158,
 'Beetroot, raw': 45,
 'Beetroot, steamed, without addition of salt': 91,
 'Berries': 64,
 'Birchermüesli, prepared, no sugar added': 96,
 'Biscuits': 134,
 'Black olives': 132,
 'Blue mould cheese': 68,
 'Blueberries': 159,
 'Boisson au glucose 50g': 171,
 'Bolognaise sauce': 87,
 'Braided white loaf': 194,
 'Bread': 63,
 'Bread, 5-grain': 48,
 'Bread, French (white flour)': 121,
 'Bread, black': 54,
 'Bread, fruit': 48,
 'Bread, grain': 102,
 'Bread, half white': 76,
 'Bread, nut': 57,
 'Bread, pita': 38,
 'Bread, rye': 47,
 'Bread, sourdough': 124,
 'Bread, toast': 75,
 'Bread, white': 1273,
 'Bread, whole wheat': 223,
 'Bread, wholemeal': 901,
 'Bread, wholemeal toast': 65,
 'Breadcrumbs (unspiced)': 56,
 'Brioche': 45,
 'Broccoli': 261,
 'Brownie': 46,
 'Butter': 1008,
 'Butter, spread, puree almond': 45,
 'Cake, chocolate': 147,
 'Cappuccino': 139,
 'Caprese salad (Tomato Mozzarella)': 85,
 'Carrot': 893,
 'Cashew nut': 75,
 'Cauliflower': 116,
 'Celeriac': 53,
 'Cheese': 404,
 'Cheese for raclette': 77,
 'Chicken': 280,
 'Chicken curry (cream/coconut milk. curry spices/paste))': 80,
 'Chicken, breast': 91,
 'Chicken, cut into stripes (only meat)': 72,
 'Chicken, leg': 57,
 'Chickpeas': 92,
 'Chips, french fries': 238,
 'Chocolate': 82,
 'Chocolate cookies': 40,
 'Chocolate mousse': 53,
 'Coffee, with caffeine': 876,
 'Cookies': 58,
 'Corn': 130,
 'Corn crisps': 46,
 'Cottage cheese': 69,
 'Country fries': 52,
 'Couscous': 80,
 'Cream': 42,
 'Cream cheese': 52,
 'Crisps': 74,
 'Croissant': 144,
 'Croissant with chocolate filling': 66,
 'Crunch Müesli': 46,
 'CrΓƒΒͺpe, plain': 103,
 'Cucumber': 382,
 'Curry, vegetarian': 91,
 'Dairy ice cream': 152,
 'Dark chocolate': 213,
 'Dates': 40,
 'Dried meat': 140,
 'Dried raisins': 44,
 'Egg': 626,
 'Egg, scrambled, prepared': 84,
 'Eggplant': 138,
 'Espresso, with caffeine': 391,
 'Fajita (bread only)': 56,
 'Falafel (balls)': 36,
 'Fennel': 134,
 'Feta': 107,
 'Fish': 119,
 'Flakes, oat': 48,
 'Focaccia': 40,
 'French beans': 160,
 'French salad dressing': 94,
 'Fresh cheese': 52,
 'Fruit coulis': 41,
 'Fruit salad ': 97,
 'Fruit tart': 49,
 'Frying sausage': 38,
 'Gluten-free bread': 68,
 'Goat cheese (soft)': 55,
 'Grapes': 94,
 'Greek Yaourt, yahourt, yogourt ou yoghourt': 39,
 'Greek salad': 43,
 'Green asparagus': 131,
 'Green bean, steamed, without addition of salt': 40,
 'Green olives': 88,
 'Grits, polenta, maize flour': 56,
 'Gruyère': 183,
 'Guacamole': 84,
 'Ham croissant': 44,
 'Ham, cooked': 151,
 'Ham, raw': 161,
 'Hamburger (Bread, meat, ketchup)': 101,
 'Hard cheese': 315,
 'Hazelnut': 39,
 'Hazelnut-chocolate spread(Nutella, Ovomaltine, Caotina)': 49,
 'Herbal tea': 126,
 'Honey': 230,
 'Hummus': 120,
 'Ice tea ': 44,
 'Jam': 502,
 'Juice, apple': 43,
 'Juice, orange': 164,
 'Kefir drink': 44,
 'Ketchup': 87,
 'Kiwi': 120,
 'Kolhrabi': 51,
 'Lasagne, meat, prepared': 54,
 'Leaf spinach': 194,
 'Leek': 65,
 'Lemon': 74,
 'Lentils': 83,
 'Mandarine': 138,
 'Mango': 64,
 'Mashed potatoes, prepared, with full fat milk, with butter': 96,
 'Mayonnaise': 198,
 'Meat terrine, paté': 44,
 'Milk': 106,
 'Milk chocolate': 71,
 'Mixed nuts': 163,
 'Mixed salad (chopped without sauce)': 374,
 'Mixed seeds': 47,
 'Mixed vegetables': 624,
 'Mozzarella': 139,
 'Mushroom': 72,
 'Mushrooms': 118,
 'Müesli': 75,
 'Nectarine': 69,
 'Oil & vinegar salad dressing': 50,
 'Omelette, plain': 65,
 'Onion': 144,
 'Orange': 139,
 'Pancakes': 49,
 'Parmesan': 200,
 'Pasta': 136,
 'Pasta, Hârnli': 61,
 'Pasta, linguini, parpadelle, Tagliatelle': 75,
 'Pasta, noodles': 49,
 'Pasta, penne': 131,
 'Pasta, spaghetti': 256,
 'Pasta, twist': 100,
 'Pasta, wholemeal': 85,
 'Peach': 53,
 'Peanut': 68,
 'Peanut butter': 54,
 'Pear': 151,
 'Peas': 73,
 'Pickle': 138,
 'Pineapple': 54,
 'Pistachio': 65,
 'Pizza, Margherita, baked': 202,
 'Pizza, with ham, baked': 42,
 'Pizza, with vegetables, baked': 56,
 'Plums': 44,
 'Pomegranate': 72,
 'Pork': 69,
 'Potato-gnocchi': 58,
 'Potatoes au gratin, dauphinois, prepared': 49,
 'Potatoes steamed': 450,
 'Praline': 76,
 'Pumpkin': 48,
 'Quiche, with cheese, baked, with puff pastry': 51,
 'Quinoa': 120,
 'Raspberries': 116,
 'Ratatouille': 92,
 'Red cabbage': 79,
 'Red radish': 138,
 'Rice': 659,
 'Rice noodles/vermicelli': 50,
 'Rice, Basmati': 67,
 'Rice, whole-grain': 42,
 'Risotto, without cheese, cooked': 111,
 'Ristretto, with caffeine': 82,
 'Roll of half-white or white flour, with large void': 54,
 'Roll with pieces of chocolate': 53,
 'Rosti': 49,
 'Rusk, wholemeal': 38,
 'Salad dressing': 78,
 "Salad, lambs' ear": 101,
 'Salad, leaf / salad, green': 1189,
 'Salad, rocket': 130,
 'Salami': 173,
 'Salmon': 177,
 'Salmon, smoked': 162,
 'Salt cake (vegetables, filled) ': 67,
 'Sauce (savoury)': 129,
 'Sauce, cream': 131,
 'Sauce, mushroom': 39,
 'Sauce, pesto': 77,
 'Sauce, roast': 76,
 'Sauce, soya': 39,
 'Sausage': 80,
 'Savoury puff pastry': 49,
 'Semi-hard cheese': 101,
 'Sesame seeds': 43,
 'Shrimp / prawn (large)': 63,
 'Shrimp / prawn (small)': 58,
 'Soft cheese': 180,
 'Soup of lentils, Dahl (Dhal) ': 44,
 'Soup, pumpkin': 78,
 'Soup, vegetable': 96,
 'Sour cream': 55,
 'Spaetzle': 60,
 'Strawberries': 244,
 'Sugar Melon ': 102,
 'Sushi': 56,
 'Sweet pepper': 275,
 'Sweet potato': 121,
 'Syrup (diluted, ready to drink)': 51,
 'Taboulé, prepared, with couscous': 50,
 'Tart': 98,
 'Tea': 353,
 'Tea, black': 109,
 'Tea, green': 140,
 'Tea, peppermint': 39,
 'Tea, verveine': 40,
 'Thickened cream (> 35%)': 81,
 'Tiramisu': 59,
 'Tofu': 120,
 'Tomato': 1069,
 'Tomato sauce': 290,
 'Tomme': 52,
 'Tuna': 96,
 'Veal sausage': 44,
 'Vegetable au gratin, baked': 52,
 'Vegetable mix, peas and carrots': 42,
 'Veggie burger': 35,
 'Walnut': 98,
 'Water': 1835,
 'Water with lemon juice': 87,
 'Water, mineral': 185,
 'Watermelon, fresh': 51,
 'White asparagus': 56,
 'White cabbage': 51,
 'White coffee, with caffeine': 274,
 'Wine, red': 545,
 'Wine, rosé': 67,
 'Wine, white': 333,
 'Witloof chicory': 98,
 'Yaourt, yahourt, yogourt ou yoghourt, natural': 139,
 'Zucchini': 231}
In [ ]:
pd.DataFrame(no_images_per_category.items()).sort_values(1).iloc[::-1][0][:30].tolist()
Out[ ]:
['Water',
 'Bread, white',
 'Salad, leaf / salad, green',
 'Tomato',
 'Butter',
 'Bread, wholemeal',
 'Carrot',
 'Coffee, with caffeine',
 'Rice',
 'Egg',
 'Mixed vegetables',
 'Wine, red',
 'Apple',
 'Jam',
 'Potatoes steamed',
 'Banana',
 'Cheese',
 'Espresso, with caffeine',
 'Cucumber',
 'Mixed salad (chopped without sauce)',
 'Tea',
 'Wine, white',
 'Hard cheese',
 'Avocado',
 'Tomato sauce',
 'Chicken',
 'Sweet pepper',
 'White coffee, with caffeine',
 'Broccoli',
 'Pasta, spaghetti']

Data Visualisations

In [ ]:
fig = go.Figure([go.Bar(x=list(no_images_per_category.keys()), y=list(no_images_per_category.values()))])
fig.update_layout(
    title="No of Image per class",)
fig.show()
In [ ]:
pprint(f"Average number of image per class : { sum(list(no_images_per_category.values())) / len(list(no_images_per_category.values())) }")
pprint(f"Highest number of image per class is : { list(no_images_per_category.keys())[0]} of { list(no_images_per_category.values())[0] }")
pprint(f"Lowest number of image per class is : Veggie Burger of { sorted(list(no_images_per_category.values()))[0] }")
'Average number of image per class : 143.6153846153846'
'Highest number of image per class is : Water of 1835'
'Lowest number of image per class is : Veggie Burger of 35'
In [ ]:
fig = go.Figure(data=[go.Pie(labels=list(no_images_per_category.keys()), values=list(no_images_per_category.values()), 
                             hole=.3, textposition='inside', )], )
fig.update_layout(
    title="No of Image per class ( In pie )",)
fig.show()
In [ ]:
fig = go.Figure()
fig.add_trace(go.Histogram(x=img_info['height']))
fig.add_trace(go.Histogram(x=img_info['width']))

# Overlay both histograms
fig.update_layout(barmode='stack', title="Histogram of Image width & height",)


fig.show()

Image Visulisation 🖼️

In this section we are going to do imaghe visualisations!

In [ ]:
img_info
Out[ ]:
id file_name width height
0 65537 065537.jpg 464 464
1 65539 065539.jpg 464 464
2 65561 065561.jpg 464 464
3 65574 065574.jpg 831 830
4 65577 065577.jpg 480 480
... ... ... ... ...
24115 65500 065500.jpg 628 628
24116 65514 065514.jpg 1037 1036
24117 65516 065516.jpg 480 480
24118 65523 065523.jpg 480 480
24119 65524 065524.jpg 464 464

24120 rows Γ— 4 columns

In [ ]:
len(train_annotations_data['annotations'][n]['segmentation']), len(train_annotations_data['annotations'][n]['bbox'])
Out[ ]:
(2, 4)
In [ ]:
for n, i in enumerate(tqdm((train_annotations_data['annotations']))):

  # if np.array(train_annotations_data['annotations'][n]['segmentation']).shape[0] != np.array(train_annotations_data['annotations'][n]['bbox']).shape[0]:

    # print(n)
  if np.array(train_annotations_data['annotations'][n]['segmentation']).shape[0] != 1:
    print(n)

  else:
    pass
/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:6: VisibleDeprecationWarning:

Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray

20
33
49
65
66
68
77
99
108
125
135
136
184
190
192
194
195
196
210
222
230
231
248
256
264
272
284
300
312
315
316
318
323
343
352
357
360
379
382
385
405
429
432
457
462
464
480
481
491
492
493
498
502
503
505
506
516
522
523
528
538
540
541
545
567
583
585
586
594
595
596
598
615
627
632
665
669
686
687
691
712
719
722
733
738
744
745
748
753
760
784
785
787
789
795
801
804
805
806
809
829
830
833
843
846
851
852
876
885
887
909
926
932
933
950
952
954
969
970
977
979
982
989
993
998
1000
1031
1034
1035
1045
1049
1054
1055
1061
1065
1068
1070
1073
1089
1099
1108
1110
1118
1120
1133
1134
1146
1149
1154
1176
1177
1185
1187
1188
1198
1204
1207
1214
1215
1217
1218
1246
1258
1259
1276
1277
1286
1287
1289
1300
1301
1307
1309
1335
1341
1343
1345
1361
1362
1370
1377
1380
1399
1412
1441
1443
1445
1486
1501
1537
1539
1582
1584
1599
1608
1625
1628
1658
1660
1662
1663
1667
1668
1691
1699
1708
1716
1720
1721
1735
1740
1743
1760
1772
1785
1815
1820
1826
1830
1834
1838
1876
1880
1886
1895
1896
1900
1912
1927
1936
1937
1938
1940
1941
1989
1992
1993
1994
2007
2008
2009
2019
2035
2041
2060
2093
2094
2096
2099
2101
2107
2109
2111
2121
2124
2133
2154
2165
2166
2171
2172
2178
2182
2194
2227
2243
2244
2259
2260
2264
2304
2305
2306
2307
2327
2337
2338
2343
2360
2361
2364
2378
2400
2404
2410
2412
2416
2430
2437
2444
2449
2467
2469
2470
2483
2491
2495
2500
2507
2508
2510
2522
2527
2536
2562
2566
2567
2568
2569
2574
2596
2597
2629
2634
2660
2696
2734
2735
2762
2787
2794
2805
2819
2826
2827
2829
2835
2863
2867
2868
2883
2897
2904
2916
2918
2919
2926
2927
2930
2940
2945
2949
2953
2954
2956
2957
2980
3006
3010
3011
3025
3031
3032
3039
3059
3077
3078
3091
3093
3094
3100
3104
3116
3117
3118
3124
3126
3130
3142
3166
3201
3203
3218
3227
3229
3231
3253
3291
3300
3349
3350
3351
3357
3358
3363
3377
3386
3388
3394
3396
3404
3437
3447
3451
3452
3453
3495
3507
3523
3525
3527
3563
3588
3600
3619
3624
3631
3636
3637
3639
3640
3641
3642
3654
3671
3704
3709
3713
3716
3727
3735
3743
3745
3746
3758
3760
3768
3789
3790
3810
3817
3837
3847
3865
3875
3876
3878
3880
3885
3892
3905
3916
3926
3927
3939
3945
3952
3954
3972
3973
3977
4011
4014
4029
4053
4055
4056
4057
4061
4076
4092
4100
4122
4123
4124
4125
4127
4144
4147
4153
4163
4166
4168
4190
4193
4204
4214
4227
4230
4237
4238
4244
4245
4249
4258
4261
4273
4274
4279
4289
4304
4305
4325
4334
4341
4342
4345
4365
4372
4397
4402
4403
4404
4405
4455
4456
4476
4479
4481
4498
4507
4511
4512
4531
4532
4540
4544
4550
4565
4568
4569
4579
4582
4584
4585
4593
4602
4611
4612
4624
4630
4634
4645
4651
4657
4659
4660
4661
4664
4665
4706
4708
4709
4714
4715
4719
4722
4730
4739
4743
4745
4774
4775
4794
4809
4810
4832
4833
4843
4844
4851
4859
4863
4870
4871
4877
4879
4880
4891
4893
4901
4917
4930
4931
4950
4953
5001
5005
5007
5012
5040
5047
5064
5073
5089
5135
5137
5138
5146
5149
5155
5163
5173
5181
5182
5188
5189
5196
5197
5212
5214
5218
5229
5240
5277
5278
5280
5288
5296
5300
5303
5325
5326
5328
5345
5353
5356
5357
5363
5368
5370
5371
5403
5404
5414
5415
5416
5429
5438
5449
5462
5463
5496
5529
5531
5566
5577
5683
5688
5689
5704
5705
5721
5722
5723
5729
5731
5732
5736
5748
5757
5762
5766
5769
5771
5780
5812
5817
5849
5850
5858
5862
5867
5868
5874
5887
5909
5923
5925
5926
5965
5967
6005
6024
6036
6045
6049
6068
6074
6083
6112
6139
6163
6165
6167
6194
6231
6235
6263
6265
6271
6293
6294
6318
6337
6363
6414
6415
6440
6462
6483
6485
6490
6500
6501
6536
6538
6566
6592
6605
6628
6631
6682
6722
6798
6804
6806
6857
6862
6890
6898
6907
6912
6936
6938
6942
6954
7032
7053
7178
7191
7214
7222
7223
7235
7242
7258
7261
7266
7269
7369
7417
7454
7465
7473
7495
7496
7545
7567
7589
7590
7599
7602
7645
7664
7684
7685
7706
7713
7716
7717
7749
7761
7836
7881
7891
7904
7905
7908
7922
7938
7941
7948
7954
7965
7975
8001
8038
8039
8053
8054
8056
8082
8091
8093
8100
8103
8122
8125
8164
8173
8174
8205
8206
8207
8212
8215
8216
8253
8279
8286
8294
8310
8312
8329
8330
8331
8336
8339
8340
8342
8353
8354
8439
8463
8468
8469
8470
8471
8473
8474
8475
8501
8504
8516
8529
8534
8538
8548
8569
8619
8620
8621
8627
8628
8629
8632
8683
8686
8690
8697
8733
8734
8742
8752
8759
8769
8770
8773
8778
8801
8811
8815
8847
8848
8858
8859
8870
8879
8892
8893
8948
8953
8998
9015
9016
9017
9027
9034
9042
9043
9070
9076
9082
9108
9109
9127
9132
9141
9142
9154
9188
9195
9197
9208
9227
9247
9248
9250
9262
9277
9278
9282
9289
9307
9309
9326
9333
9334
9348
9370
9379
9439
9450
9453
9455
9470
9516
9531
9549
9588
9605
9618
9622
9643
9673
9682
9709
9742
9769
9770
9786
9811
9812
9858
9861
9863
9865
9897
9900
9905
9911
9912
9914
9917
9918
9966
9979
9980
9981
9994
10010
10037
10045
10046
10047
10050
10054
10109
10115
10136
10157
10173
10212
10242
10247
10248
10257
10264
10265
10268
10277
10322
10334
10346
10387
10400
10412
10419
10423
10462
10472
10477
10495
10500
10502
10516
10528
10533
10534
10550
10551
10553
10556
10564
10575
10583
10588
10589
10591
10595
10605
10610
10616
10624
10644
10648
10667
10687
10703
10714
10715
10722
10724
10730
10743
10750
10754
10766
10767
10773
10775
10777
10778
10781
10794
10796
10798
10801
10804
10824
10827
10828
10835
10880
10893
10897
10913
10916
10925
11038
11072
11084
11105
11114
11130
11137
11139
11157
11165
11199
11202
11229
11244
11245
11269
11285
11300
11321
11324
11335
11336
11341
11342
11343
11374
11406
11420
11443
11444
11456
11466
11470
11472
11473
11483
11524
11525
11526
11530
11531
11532
11553
11568
11574
11583
11595
11599
11656
11657
11679
11700
11710
11766
11791
11797
11822
11835
11890
11891
11903
11905
11920
11960
11984
11993
11994
12026
12032
12079
12081
12082
12090
12099
12108
12111
12115
12142
12153
12168
12178
12186
12253
12299
12317
12318
12324
12337
12381
12385
12386
12387
12389
12390
12399
12415
12416
12420
12422
12424
12453
12456
12488
12499
12505
12512
12533
12535
12536
12537
12546
12547
12560
12573
12591
12592
12593
12610
12639
12650
12653
12666
12668
12673
12676
12677
12721
12731
12763
12767
12784
12787
12789
12792
12816
12817
12837
12846
12861
12876
12884
12895
12913
12916
12918
12919
12922
12923
12924
12925
12930
12931
12947
12948
12958
12965
12967
12993
12995
13008
13009
13025
13026
13027
13028
13040
13041
13042
13068
13078
13094
13124
13125
13126
13127
13128
13129
13141
13170
13172
13175
13176
13177
13190
13195
13213
13217
13226
13227
13234
13245
13249
13251
13252
13261
13268
13269
13270
13274
13280
13287
13290
13306
13307
13333
13342
13344
13352
13366
13371
13381
13384
13385
13392
13408
13409
13410
13418
13429
13432
13458
13459
13460
13461
13462
13464
13474
13499
13500
13501
13502
13503
13504
13505
13512
13513
13527
13532
13534
13549
13570
13571
13577
13579
13603
13624
13633
13636
13637
13638
13660
13662
13672
13690
13695
13721
13727
13748
13749
13753
13778
13803
13804
13811
13821
13823
13826
13850
13851
13857
13860
13861
13906
13915
13925
13926
13928
13948
13955
13966
13986
13998
14006
14011
14037
14049
14068
14081
14094
14101
14114
14132
14135
14158
14160
14161
14164
14168
14184
14196
14205
14206
14273
14304
14306
14310
14323
14327
14328
14329
14344
14351
14358
14362
14369
14371
14387
14389
14401
14405
14406
14408
14432
14444
14470
14483
14495
14497
14518
14532
14543
14565
14569
14577
14582
14612
14620
14621
14632
14634
14637
14645
14650
14651
14658
14664
14666
14679
14696
14702
14710
14714
14716
14731
14738
14756
14758
14763
14764
14768
14789
14791
14820
14828
14847
14866
14880
14885
14886
14896
14897
14898
14909
14948
14961
14964
14968
14969
14981
14986
14993
15017
15020
15025
15026
15030
15051
15056
15057
15058
15060
15061
15062
15068
15069
15070
15072
15073
15074
15075
15083
15085
15097
15098
15111
15150
15151
15152
15165
15166
15171
15195
15219
15228
15229
15243
15244
15282
15328
15342
15354
15364
15373
15374
15383
15390
15402
15407
15415
15417
15449
15478
15489
15490
15491
15495
15500
15501
15512
15522
15523
15543
15574
15612
15616
15618
15620
15623
15633
15634
15636
15640
15643
15644
15645
15653
15654
15663
15697
15711
15743
15754
15755
15756
15762
15776
15781
15782
15789
15794
15802
15826
15830
15835
15836
15851
15860
15863
15868
15890
15912
15917
15921
15949
15953
15954
15967
15975
15993
15994
15995
16008
16012
16037
16041
16042
16061
16067
16072
16092
16093
16133
16137
16149
16152
16153
16161
16192
16195
16212
16231
16246
16295
16296
16297
16307
16310
16318
16327
16328
16331
16332
16333
16335
16376
16423
16431
16443
16445
16461
16474
16500
16501
16503
16511
16512
16514
16517
16525
16526
16528
16542
16544
16545
16546
16569
16572
16574
16575
16576
16577
16581
16591
16601
16603
16604
16606
16608
16622
16637
16648
16656
16657
16661
16662
16676
16688
16703
16705
16708
16728
16739
16742
16744
16777
16778
16785
16808
16809
16810
16811
16813
16814
16815
16816
16832
16833
16834
16874
16875
16904
16914
16915
16919
16920
16927
16964
16970
16973
16979
16991
16994
17006
17015
17018
17034
17042
17055
17064
17065
17066
17070
17071
17111
17144
17148
17152
17161
17162
17164
17168
17169
17170
17208
17215
17224
17225
17228
17229
17233
17244
17253
17255
17257
17258
17259
17271
17274
17281
17285
17290
17313
17339
17340
17343
17344
17373
17377
17379
17380
17381
17382
17397
17408
17409
17411
17412
17413
17425
17429
17435
17441
17445
17468
17477
17490
17491
17531
17566
17569
17570
17571
17594
17596
17599
17600
17612
17613
17614
17615
17617
17618
17619
17643
17645
17649
17650
17651
17652
17654
17661
17662
17671
17699
17702
17703
17706
17707
17721
17723
17724
17725
17726
17744
17766
17770
17781
17783
17808
17810
17835
17836
17838
17841
17872
17875
17888
17892
17893
17899
17918
17928
17957
17958
18004
18021
18022
18023
18024
18042
18044
18052
18059
18083
18089
18101
18103
18104
18120
18133
18134
18135
18136
18147
18148
18149
18156
18157
18174
18175
18190
18198
18217
18220
18229
18246
18247
18248
18250
18265
18267
18271
18272
18285
18286
18310
18315
18331
18340
18353
18354
18360
18377
18382
18388
18392
18408
18436
18459
18467
18475
18476
18477
18487
18497
18510
18512
18523
18541
18563
18564
18568
18572
18579
18586
18594
18627
18629
18633
18634
18638
18658
18659
18678
18679
18680
18682
18703
18722
18723
18741
18760
18765
18766
18776
18777
18781
18785
18787
18788
18800
18805
18814
18817
18840
18848
18864
18880
18881
18882
18894
18895
18906
18913
18936
18954
18959
18966
18972
18991
18993
19010
19026
19051
19068
19069
19076
19081
19093
19131
19161
19168
19178
19181
19182
19183
19185
19187
19191
19197
19200
19203
19204
19214
19216
19219
19237
19239
19259
19261
19262
19265
19269
19274
19289
19297
19300
19309
19310
19312
19329
19330
19349
19365
19390
19416
19448
19454
19455
19462
19478
19479
19492
19495
19497
19499
19500
19501
19503
19513
19516
19520
19526
19533
19534
19536
19537
19538
19549
19561
19583
19608
19633
19645
19650
19688
19692
19693
19715
19717
19735
19756
19759
19800
19807
19812
19814
19816
19825
19835
19836
19837
19847
19849
19853
19856
19867
19877
19878
19879
19885
19888
19896
19897
19927
19930
19940
19958
19964
19975
19983
19985
19986
20003
20010
20013
20017
20024
20028
20029
20038
20053
20077
20078
20079
20093
20100
20102
20107
20136
20138
20141
20142
20143
20169
20174
20201
20202
20205
20216
20220
20221
20226
20227
20234
20293
20298
20303
20306
20307
20313
20314
20317
20320
20321
20323
20331
20337
20357
20358
20365
20366
20368
20372
20384
20394
20402
20406
20410
20412
20424
20435
20444
20448
20450
20475
20477
20495
20504
20514
20550
20564
20586
20594
20595
20601
20659
20667
20668
20674
20675
20676
20680
20692
20703
20734
20749
20750
20751
20754
20764
20766
20770
20773
20774
20787
20801
20833
20834
20837
20846
20847
20860
20865
20866
20867
20878
20880
20893
20907
20917
20921
20939
20951
20962
20967
20982
20983
21005
21007
21016
21020
21024
21050
21055
21056
21060
21065
21086
21101
21126
21136
21137
21165
21166
21194
21198
21199
21204
21211
21213
21227
21269
21276
21279
21287
21298
21307
21320
21346
21348
21349
21369
21390
21394
21407
21408
21430
21431
21471
21472
21483
21484
21497
21517
21522
21536
21538
21539
21540
21555
21556
21557
21558
21559
21563
21569
21585
21586
21588
21590
21591
21593
21594
21596
21602
21638
21657
21658
21660
21667
21677
21715
21722
21723
21731
21739
21747
21748
21760
21761
21762
21831
21843
21844
21845
21857
21888
21901
21904
21914
21918
21929
21930
21940
21947
21968
21974
21994
21998
21999
22007
22030
22065
22078
22079
22082
22084
22095
22106
22126
22160
22162
22166
22177
22183
22185
22189
22190
22191
22194
22201
22202
22208
22210
22232
22238
22239
22254
22307
22326
22327
22332
22333
22335
22346
22365
22367
22369
22371
22390
22391
22410
22411
22415
22426
22434
22436
22438
22439
22442
22443
22459
22461
22468
22474
22475
22489
22503
22510
22511
22536
22547
22555
22587
22596
22613
22615
22622
22625
22637
22651
22652
22677
22678
22680
22681
22685
22707
22723
22728
22729
22741
22746
22747
22794
22805
22810
22815
22816
22819
22841
22842
22864
22865
22867
22868
22881
22971
22975
23001
23020
23025
23027
23062
23063
23070
23081
23085
23100
23124
23145
23157
23183
23187
23188
23204
23207
23211
23235
23241
23242
23248
23250
23256
23259
23264
23284
23303
23315
23317
23318
23324
23329
23333
23340
23341
23355
23360
23362
23371
23394
23400
23404
23420
23422
23424
23429
23430
23437
23451
23453
23456
23473
23474
23480
23481
23512
23513
23514
23520
23544
23545
23553
23561
23562
23572
23576
23586
23587
23588
23591
23610
23638
23645
23658
23661
23671
23677
23678
23683
23687
23707
23719
23720
23722
23732
23738
23740
23745
23750
23751
23765
23766
23767
23777
23778
23780
23794
23820
23851
23862
23878
23892
23902
23924
23927
23937
23939
23948
23963
23967
23968
23970
23973
23983
23990
23991
24000
24006
24007
24010
24017
24020
24039
24050
24059
24066
24071
24094
24099
24108
24109
24110
24116
24122
24140
24141
24147
24178
24193
24212
24219
24220
24221
24233
24249
24278
24280
24281
24282
24283
24287
24320
24321
24322
24329
24333
24334
24335
24349
24362
24366
24370
24379
24386
24387
24390
24392
24405
24406
24407
24422
24428
24430
24437
24442
24444
24448
24452
24464
24474
24486
24494
24495
24508
24515
24554
24555
24557
24560
24571
24572
24574
24585
24586
24587
24624
24626
24629
24630
24631
24641
24669
24672
24676
24700
24721
24726
24747
24754
24755
24756
24791
24794
24795
24797
24804
24821
24824
24830
24835
24838
24840
24847
24848
24854
24867
24896
24899
24911
24945
24946
24948
24949
24954
24958
24959
24971
24972
24977
24991
24993
24995
25016
25031
25032
25033
25059
25065
25067
25077
25078
25079
25126
25137
25138
25140
25143
25144
25145
25146
25184
25216
25231
25232
25272
25274
25276
25278
25279
25280
25300
25301
25303
25327
25328
25342
25343
25348
25349
25353
25368
25386
25388
25432
25448
25452
25468
25473
25483
25488
25504
25505
25507
25513
25514
25538
25540
25542
25547
25548
25559
25591
25592
25621
25623
25624
25634
25636
25640
25641
25661
25668
25689
25701
25716
25721
25729
25730
25731
25748
25767
25771
25797
25798
25825
25838
25850
25852
25855
25872
25892
25893
25906
25912
25913
25914
25937
25941
25942
25970
25972
25979
25980
25983
26006
26018
26020
26028
26029
26038
26039
26040
26041
26043
26079
26091
26092
26103
26113
26134
26151
26167
26168
26178
26186
26187
26193
26217
26227
26231
26245
26252
26257
26259
26260
26262
26263
26266
26271
26273
26274
26286
26314
26319
26325
26326
26354
26373
26385
26389
26390
26406
26415
26435
26437
26444
26446
26452
26453
26454
26455
26479
26491
26498
26500
26506
26524
26525
26526
26541
26546
26551
26570
26588
26608
26610
26623
26641
26644
26663
26664
26666
26667
26672
26674
26675
26695
26696
26700
26701
26712
26723
26725
26728
26731
26732
26733
26740
26788
26798
26799
26800
26828
26835
26877
26886
26897
26907
26921
26931
26939
26950
26957
26988
27006
27011
27013
27015
27016
27023
27038
27050
27076
27077
27078
27081
27085
27099
27103
27135
27136
27140
27150
27157
27168
27176
27192
27203
27209
27218
27220
27238
27245
27250
27254
27255
27261
27262
27277
27280
27281
27317
27322
27326
27332
27333
27337
27340
27350
27357
27358
27363
27376
27381
27386
27394
27398
27409
27414
27420
27422
27430
27432
27443
27444
27458
27466
27473
27475
27484
27497
27504
27505
27506
27508
27516
27517
27520
27532
27533
27559
27570
27587
27588
27613
27614
27641
27642
27666
27668
27685
27702
27724
27729
27732
27737
27738
27752
27754
27761
27762
27763
27790
27791
27821
27823
27833
27852
27853
27862
27892
27935
27948
27949
27950
27953
27977
27992
28000
28007
28015
28019
28022
28023
28029
28039
28040
28051
28052
28067
28069
28071
28080
28082
28083
28092
28095
28134
28150
28184
28186
28196
28205
28214
28224
28228
28233
28262
28272
28276
28280
28281
28307
28308
28314
28315
28317
28318
28337
28340
28342
28362
28363
28368
28379
28385
28389
28408
28419
28422
28430
28441
28462
28473
28475
28476
28479
28480
28484
28524
28528
28529
28530
28545
28567
28593
28604
28605
28608
28613
28627
28628
28636
28647
28649
28653
28669
28682
28683
28687
28691
28692
28697
28746
28748
28750
28751
28758
28769
28776
28782
28783
28791
28796
28799
28804
28815
28817
28820
28821
28822
28825
28842
28848
28852
28869
28874
28883
28884
28885
28886
28887
28894
28897
28899
28936
28946
28955
28956
28957
28959
28967
28973
28976
28987
28990
28992
28993
29001
29004
29005
29007
29008
29009
29024
29041
29042
29046
29047
29050
29051
29058
29061
29063
29064
29084
29087
29088
29089
29091
29092
29093
29096
29097
29107
29108
29140
29147
29151
29155
29166
29169
29170
29171
29186
29200
29201
29202
29219
29220
29222
29225
29226
29228
29237
29254
29255
29256
29257
29258
29277
29302
29329
29331
29351
29353
29362
29363
29368
29369
29373
29380
29383
29384
29392
29393
29428
29429
29444
29445
29464
29467
29472
29482
29504
29514
29520
29521
29552
29557
29558
29559
29568
29569
29570
29583
29590
29591
29592
29593
29596
29597
29603
29666
29667
29684
29688
29689
29696
29706
29708
29718
29719
29728
29729
29732
29745
29746
29747
29763
29764
29765
29769
29771
29783
29788
29800
29801
29802
29817
29830
29831
29851
29853
29865
29870
29875
29892
29897
29909
29913
29915
29916
29926
29969
29999
30001
30003
30006
30008
30024
30026
30032
30035
30036
30040
30041
30042
30046
30048
30055
30058
30068
30069
30072
30073
30131
30132
30133
30142
30144
30179
30183
30188
30192
30199
30229
30231
30238
30250
30251
30254
30268
30275
30276
30277
30280
30297
30299
30307
30327
30343
30344
30354
30355
30364
30375
30377
30395
30403
30404
30428
30439
30466
30487
30506
30507
30519
30520
30522
30559
30562
30565
30566
30567
30573
30576
30577
30597
30598
30599
30606
30615
30628
30649
30661
30683
30688
30697
30698
30700
30701
30703
30704
30705
30723
30731
30763
30764
30768
30777
30779
30781
30795
30835
30844
30860
30861
30864
30866
30870
30871
30877
30879
30885
30903
30929
30935
30936
30949
30964
30965
30978
30979
30989
30997
31041
31043
31044
31065
31093
31112
31178
31183
31185
31201
31209
31210
31211
31214
31217
31256
31266
31291
31292
31293
31294
31300
31304
31317
31340
31354
31388
31392
31393
31395
31397
31415
31416
31417
31418
31431
31435
31441
31442
31443
31467
31474
31476
31478
31506
31529
31543
31552
31554
31594
31595
31613
31614
31619
31620
31621
31650
31664
31682
31697
31710
31712
31747
31748
31763
31773
31784
31794
31812
31823
31830
31831
31850
31853
31867
31871
31875
31877
31880
31922
31944
31948
31950
31951
31952
31957
31970
31987
31988
32011
32020
32021
32036
32037
32049
32068
32100
32101
32102
32103
32104
32105
32110
32125
32126
32173
32182
32183
32184
32186
32187
32188
32206
32210
32211
32216
32218
32231
32234
32235
32277
32318
32324
32325
32337
32366
32367
32368
32370
32389
32391
32419
32424
32431
32433
32478
32541
32548
32556
32557
32565
32566
32569
32572
32583
32590
32591
32615
32616
32620
32637
32661
32663
32682
32687
32688
32691
32709
32713
32755
32758
32760
32761
32769
32784
32799
32802
32830
32832
32833
32835
32854
32856
32859
32864
32870
32872
32874
32885
32896
32913
32919
32921
32922
32924
32930
32943
32957
32966
32967
32968
32969
32973
32981
32986
32989
32992
32993
33059
33071
33085
33086
33095
33109
33116
33118
33122
33123
33126
33128
33148
33149
33150
33160
33161
33173
33174
33186
33187
33188
33207
33208
33213
33231
33263
33265
33272
33279
33283
33285
33301
33317
33326
33328
33330
33332
33350
33351
33352
33372
33379
33396
33401
33402
33414
33415
33419
33509
33545
33587
33640
33651
33686
33697
33699
33712
33713
33714
33715
33716
33717
33718
33731
33732
33735
33744
33777
33793
33832
33836
33852
33861
33868
33880
33884
33888
33893
33902
33903
33981
34006
34025
34028
34063
34067
34082
34084
34119
34124
34136
34152
34159
34160
34183
34226
34230
34238
34276
34277
34287
34288
34291
34293
34300
34301
34304
34312
34322
34344
34358
34359
34363
34364
34370
34383
34385
34386
34387
34388
34396
34407
34408
34410
34411
34422
34432
34436
34442
34443
34444
34445
34454
34456
34458
34467
34492
34494
34495
34496
34498
34504
34505
34507
34508
34509
34510
34515
34516
34522
34525
34531
34534
34537
34545
34547
34558
34559
34563
34564
34566
34567
34568
34570
34572
34575
34592
34599
34601
34610
34614
34627
34632
34633
34635
34643
34657
34664
34669
34695
34697
34705
34706
34713
34714
34734
34742
34746
34751
34752
34758
34760
34761
34763
34765
34782
34786
34787
34789
34791
34792
34828
34829
34835
34850
34871
34884
34887
34897
34905
34935
34937
34958
34997
35008
35009
35010
35026
35035
35036
35037
35052
35060
35066
35075
35099
35105
35143
35157
35158
35170
35177
35184
35187
35191
35207
35221
35231
35244
35249
35256
35257
35287
35298
35305
35306
35314
35315
35317
35318
35322
35323
35328
35329
35330
35331
35332
35357
35358
35367
35373
35374
35384
35391
35392
35418
35431
35437
35438
35441
35442
35472
35483
35485
35486
35499
35504
35506
35508
35509
35510
35512
35525
35540
35543
35619
35625
35633
35659
35666
35740
35743
35765
35770
35782
35790
35801
35805
35808
35810
35811
35817
35823
35836
35837
35853
35854
35857
35863
35895
35944
35954
35958
35963
35966
35972
35977
35996
36016
36024
36037
36049
36061
36064
36067
36072
36077
36079
36080
36097
36103
36123
36126
36137
36143
36165
36183
36197
36213
36217
36220
36236
36245
36255
36261
36281
36282
36291
36309
36317
36318
36328
36341
36351
36360
36370
36390
36392
36404
36441
36442
36449
36459
36483
36484
36489
36490
36491
36508
36575
36589
36610
36623
36627
36628
36638
36639
36681
36690
36691
36693
36699
36700
36703
36727
36728
36729
36757
36768
36769
36808
36812
36819
36833
36835
36872
36873
36875
36880
36887
36896
36900
36914
36915
36923
36936
36945
36953
36955
36958
36967
36977
36980
37013
37019
37020
37030
37031
37034
37042
37046
37049
37050
37056
37063
37067
37084
37092
37093
37095
37109
37115
37122
37123
37129
37139
37141
37142
37143
37145
37146
37155
37156
37167
37188
37189
37194
37196
37197
37201
37202
37207
37213
37227
37228
37229
37236
37250
37263
37271
37275
37276
37278
37285
37294
37300
37301
37305
37306
37311
37321
37326
37328
37344
37349
37354
37358
37363
37390
37401
37405
37419
37421
37422
37426
37464
37479
37483
37487
37488
37523
37542
37543
37544
37545
37546
37548
37551
37552
37560
37561
37566
37571
37572
37573
37574
37579
37593
37601
37602
37612
37632
37643
37645
37659
37671
37672
37675
37682
37683
37684
37696
37699
37700
37702
37703
37704
37711
37715
37718
37719
37723
37726
37727
37728
37733
37735
37746
37753
37754
37755
37761
37768
37770
37783
37784
37787
37788
37789
37790
37792
37799
37811
37813
37814
37815
37820
37827
37830
37833
37836
37840
37850
37857
37861
37864
37870
37875
37879
37887
37897
37903
37904
37905
37923
37924
37950
37951
37956
37957
37973
37981
37988
38005
38013
38014
38018
38019
38042
38048
38050
38051
38054
38070
38076
38079
38099
38100
38107
38108
38129
38130
38139
38152
38156
38157
38160
38165
38174
38178
38184
38185
38189
38191
38203
38213
38214
38224
38236
38240
38243
38271
38287
38288
38295
38303
38305
38309
38313
38319
38320
38321
38334
38338
38341
38360
38361
38367
38370
38378
38381
38384
38388
38391
38392
38413
38427
38434
38484
38543
38544
38545
38572
38587
38598
38599
38603
38612
38626
38628
38629
38631
38656
38711
38727
38728
38731
38734
38735
38736
38737
38740
38747
38748
38749
38751
38752
38753
38754
38787
38793
38799
38803
38810
38821
38824
38825
38826
38828
38848
38849
38850
38855
38856
38866
38867
38876
38882
38887
38888
38889
38915
38922
38938
38947
38948
38949
38959
38969
38970
38999
39007
39013
39014
39016
39071
39111
39150
39157
39158
39159
39172
39174
39175
39185
39193
39213
39228
39236
39237
39254
39255
39260
39261
39276
39283
39284
39285
39319
39321
39322

In [ ]:
img_no = 4

annIds = train_coco.getAnnIds(imgIds=train_annotations_data['annotations'][img_no]['image_id'])
anns = train_coco.loadAnns(annIds)

# load and render the image

plt.imshow(plt.imread(TRAIN_IMAGE_DIRECTIORY+train_annotations_data['images'][img_no]['file_name']))
plt.axis('off')
# Render annotations on top of the image
train_coco.showAnns(anns)
In [ ]:
w, h = 12, 12 # Setting width and height of every image
rows, cols = 5, 5 # Setting the number of image rows & cols

fig = plt.figure(figsize=(20, 14)) # Making the figure with size 

plt.title("Images") 
plt.axis('off')

# Going thought every cell in rows and cols
for i in range(1, cols * rows+1):

  annIds = train_coco.getAnnIds(imgIds=img_info['id'][i])
  anns = train_coco.loadAnns(annIds)

  fig.add_subplot(rows, cols, i)

  # Show the image

  img = plt.imread(TRAIN_IMAGE_DIRECTIORY+img_info['file_name'][i])

  for i in anns:
    [x,y,w,h] = i['bbox']
    cv2.rectangle(img, (int(x), int(y)), (int(x+h), int(y+w)), (255,0,0), 5)

  plt.imshow(img)

  # Render annotations on top of the image
  train_coco.showAnns(anns)


  # Setting the axis off
  plt.axis("off")

# Showing the figure
plt.show()

Creating our Dataset 🔨

In this section we are goind to fix out dataset first ( because there is some issues with dataset ( size mismatch ) & creating our dataset to put into the model

Fixing the Data

In [ ]:
np.array(train_annotations_data['annotations'][n]['segmentation']).shape , np.array(train_annotations_data['annotations'][n]['bbox']).shape
Out[ ]:
((1, 22), (4,))
In [ ]:
# Function for taking a annotation & directiory of images and returning new annoation json with fixed image size info
def fix_data(annotations, directiory):
  for n, i in enumerate(tqdm((annotations['images']))):
   
      img = cv2.imread(directiory+i["file_name"])
 

      if img.shape[0] != i['height']:
          annotations['images'][n]['height'] = img.shape[0]
          print(i["file_name"])
          print(annotations['images'][n], img.shape)

      if img.shape[1] != i['width']:
          annotations['images'][n]['width'] = img.shape[1]
          print(i["file_name"])
          print(annotations['images'][n], img.shape)

  return annotations

    
train_annotations_data = fix_data(train_annotations_data, TRAIN_IMAGE_DIRECTIORY)

  
with open('/content/train/new_ann.json', 'w') as f:
    json.dump(train_annotations_data, f)

In [ ]:
def fix_data_val(annotations, directiory):
  for n, i in enumerate(tqdm((annotations['images']))):
   
      img = cv2.imread(directiory+i["file_name"])
      
 

      if img.shape[0] != i['height']:
          print(n)
          annotations['images'][n]['height'] = img.shape[0]
          print(i["file_name"])
          print(annotations['images'][n], img.shape)

      if img.shape[1] != i['width']:
          annotations['images'][n]['width'] = img.shape[1]
          print(i["file_name"])
          print(annotations['images'][n], img.shape)

  return annotations

val_annotations_data = fix_data_val(val_annotations_data, VAL_IMAGE_DIRECTIORY)

with open('/content/val/new_ann.json', 'w') as f:
    json.dump(val_annotations_data, f)

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-95-2c84961ef844> in <module>()
     19   return annotations
     20 
---> 21 val_annotations_data = fix_data_val(val_annotations_data, VAL_IMAGE_DIRECTIORY)
     22 
     23 with open('/content/val/new_ann.json', 'w') as f:

<ipython-input-95-2c84961ef844> in fix_data_val(annotations, directiory)
      6 
      7 
----> 8       if img.shape[0] != i['height']:
      9           print(n)
     10           annotations['images'][n]['height'] = img.shape[0]

AttributeError: 'NoneType' object has no attribute 'shape'
In [ ]:
dict_addres = val_annotations_data['images'][748]
plt.imread(VAL_IMAGE_DIRECTIORY+dict_addres['file_name']).shape[:2], (dict_addres['height'], dict_addres['width'])
Out[ ]:
((3456, 4608), (4608, 3456))
In [ ]:
for n, i in enumerate(val_annotations_data['images']):
  if i['file_name'] == '053879.jpg':
    print(n)
    print("yes")
748
yes

Creating our Dataset

In [ ]:
train_annotations_path = '/content/train/new_ann.json'
train_images_path = '/content/train/images'

val_annotations_path = '/content/val/new_ann.json'
val_images_path = '/content/val/images'
In [ ]:
register_coco_instances("training_dataset", {},train_annotations_path, train_images_path)

register_coco_instances("validation_dataset", {},val_annotations_path, VAL_IMAGE_DIRECTIORY)

Creating our Model 🏭

We are going to make an Faster R-CNN Model using Detectron2, and setting up hyperpamaters to train our model

Creating R-CNN Model

In [ ]:
cfg = get_cfg()
# Check the model zoo and use any of the models ( from detectron2 github repo)
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.DATASETS.TRAIN = ("training_dataset",)
cfg.DATASETS.TEST = ()

cfg.DATALOADER.NUM_WORKERS = 2
# Loading pre trained weights
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")

Setting up hyperparameters

In [ ]:
# No. of Batchs
cfg.SOLVER.IMS_PER_BATCH = 2

# Learning Rate: 
cfg.SOLVER.BASE_LR = 0.00025 

# No of Interations
cfg.SOLVER.MAX_ITER = 2000

# Images per batch (Batch Size) 
cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE = 128   

# No of Categories(Classes) present
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 273

cfg.OUTPUT_DIR = "/content/logs/"
os.makedirs(cfg.OUTPUT_DIR, exist_ok=True)
In [ ]:
trainer = DefaultTrainer(cfg) 
trainer.resume_or_load(resume=False)
[11/08 07:20:20 d2.engine.defaults]: Model:
GeneralizedRCNN(
  (backbone): FPN(
    (fpn_lateral2): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (fpn_lateral3): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output3): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (fpn_lateral4): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output4): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (fpn_lateral5): Conv2d(2048, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output5): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (top_block): LastLevelMaxPool()
    (bottom_up): ResNet(
      (stem): BasicStem(
        (conv1): Conv2d(
          3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False
          (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
        )
      )
      (res2): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv1): Conv2d(
            64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv2): Conv2d(
            64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv3): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv2): Conv2d(
            64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv3): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv2): Conv2d(
            64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv3): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
        )
      )
      (res3): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv1): Conv2d(
            256, 128, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
        (3): BottleneckBlock(
          (conv1): Conv2d(
            512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
      )
      (res4): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
          (conv1): Conv2d(
            512, 256, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (3): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (4): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (5): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
      )
      (res5): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
          (conv1): Conv2d(
            1024, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv2): Conv2d(
            512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv3): Conv2d(
            512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv2): Conv2d(
            512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv3): Conv2d(
            512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv2): Conv2d(
            512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv3): Conv2d(
            512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
        )
      )
    )
  )
  (proposal_generator): RPN(
    (anchor_generator): DefaultAnchorGenerator(
      (cell_anchors): BufferList()
    )
    (rpn_head): StandardRPNHead(
      (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (objectness_logits): Conv2d(256, 3, kernel_size=(1, 1), stride=(1, 1))
      (anchor_deltas): Conv2d(256, 12, kernel_size=(1, 1), stride=(1, 1))
    )
  )
  (roi_heads): StandardROIHeads(
    (box_pooler): ROIPooler(
      (level_poolers): ModuleList(
        (0): ROIAlign(output_size=(7, 7), spatial_scale=0.25, sampling_ratio=0, aligned=True)
        (1): ROIAlign(output_size=(7, 7), spatial_scale=0.125, sampling_ratio=0, aligned=True)
        (2): ROIAlign(output_size=(7, 7), spatial_scale=0.0625, sampling_ratio=0, aligned=True)
        (3): ROIAlign(output_size=(7, 7), spatial_scale=0.03125, sampling_ratio=0, aligned=True)
      )
    )
    (box_head): FastRCNNConvFCHead(
      (fc1): Linear(in_features=12544, out_features=1024, bias=True)
      (fc2): Linear(in_features=1024, out_features=1024, bias=True)
    )
    (box_predictor): FastRCNNOutputLayers(
      (cls_score): Linear(in_features=1024, out_features=274, bias=True)
      (bbox_pred): Linear(in_features=1024, out_features=1092, bias=True)
    )
    (mask_pooler): ROIPooler(
      (level_poolers): ModuleList(
        (0): ROIAlign(output_size=(14, 14), spatial_scale=0.25, sampling_ratio=0, aligned=True)
        (1): ROIAlign(output_size=(14, 14), spatial_scale=0.125, sampling_ratio=0, aligned=True)
        (2): ROIAlign(output_size=(14, 14), spatial_scale=0.0625, sampling_ratio=0, aligned=True)
        (3): ROIAlign(output_size=(14, 14), spatial_scale=0.03125, sampling_ratio=0, aligned=True)
      )
    )
    (mask_head): MaskRCNNConvUpsampleHead(
      (mask_fcn1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (mask_fcn2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (mask_fcn3): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (mask_fcn4): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (deconv): ConvTranspose2d(256, 256, kernel_size=(2, 2), stride=(2, 2))
      (predictor): Conv2d(256, 273, kernel_size=(1, 1), stride=(1, 1))
    )
  )
)
[11/08 07:20:22 d2.data.datasets.coco]: Loading /content/train/new_ann.json takes 1.51 seconds.
WARNING [11/08 07:20:22 d2.data.datasets.coco]: 
Category ids in annotations are not in [1, #categories]! We'll apply a mapping for you.

[11/08 07:20:22 d2.data.datasets.coco]: Loaded 24120 images in COCO format from /content/train/new_ann.json
[11/08 07:20:22 d2.data.build]: Removed 0 images with no usable annotations. 24120 images left.
[11/08 07:20:23 d2.data.build]: Distribution of instances among all 273 categories:
|   category    | #instances   |   category    | #instances   |   category    | #instances   |
|:-------------:|:-------------|:-------------:|:-------------|:-------------:|:-------------|
| beetroot-st.. | 91           | green-bean-.. | 40           | watermelon-.. | 51           |
| pizza-with-.. | 42           | pizza-with-.. | 56           | applesauce-.. | 40           |
| quiche-with.. | 52           | sweet-potato  | 121          | country-fries | 52           |
| potato-gnoc.. | 58           | potatoes-st.. | 452          | chips-frenc.. | 238          |
|     rosti     | 49           | vegetable-m.. | 42           | mixed-veget.. | 627          |
|  ratatouille  | 92           | mixed-salad.. | 374          | leaf-spinach  | 194          |
| witloof-chi.. | 98           | salad-rocket  | 130          | salad-leaf-.. | 1190         |
| salad-lambs.. | 101          |   artichoke   | 43           |   eggplant    | 138          |
|    avocado    | 298          | french-beans  | 160          |    pickle     | 138          |
|   cucumber    | 383          |    pumpkin    | 48           | sweet-pepper  | 279          |
|    tomato     | 1072         |   zucchini    | 233          |  red-radish   | 138          |
| beetroot-raw  | 45           |    carrot     | 898          |   celeriac    | 53           |
|  cauliflower  | 116          |   broccoli    | 262          |   kolhrabi    | 51           |
|  red-cabbage  | 79           | white-cabbage | 51           |   mushroom    | 72           |
|   mushrooms   | 118          |     peas      | 73           |     corn      | 130          |
|     leek      | 65           |     onion     | 146          |    fennel     | 134          |
| green-aspar.. | 132          | white-aspar.. | 56           | alfa-sprouts  | 40           |
| beans-kidney  | 40           |   chickpeas   | 92           |    lentils    | 83           |
|   pineapple   | 54           |     apple     | 504          |  pomegranate  | 72           |
|   apricots    | 94           |    banana     | 412          |    berries    | 64           |
|     pear      | 151          |     dates     | 42           | strawberries  | 245          |
|  fruit-salad  | 97           |  blueberries  | 159          |  raspberries  | 116          |
|     kiwi      | 121          |   mandarine   | 138          |     mango     | 64           |
|  sugar-melon  | 102          |   nectarine   | 69           |    orange     | 139          |
|     peach     | 54           |     plums     | 45           |    grapes     | 96           |
| dried-raisins | 45           |     lemon     | 74           | peanut-butter | 54           |
|  mixed-seeds  | 47           |    almonds    | 159          |    walnut     | 98           |
|  cashew-nut   | 75           |    peanut     | 68           |   hazelnut    | 39           |
|  mixed-nuts   | 163          |   pistachio   | 65           | sesame-seeds  | 43           |
| green-olives  | 88           | black-olives  | 132          |     milk      | 106          |
|  kefir-drink  | 44           | cottage-che.. | 69           | blue-mould-.. | 69           |
|     feta      | 107          | fresh-cheese  | 52           |   gruya-re    | 183          |
| semi-hard-c.. | 101          |  hard-cheese  | 319          |    cheese     | 405          |
|  mozzarella   | 139          |   parmesan    | 201          | cheese-for-.. | 77           |
| cream-cheese  | 52           |     tomme     | 52           |  soft-cheese  | 181          |
|   tiramisu    | 59           |     cream     | 42           |  sour-cream   | 55           |
| thickened-c.. | 81           | dairy-ice-c.. | 154          |  flakes-oat   | 48           |
| rice-noodle.. | 50           |   couscous    | 80           | grits-polen.. | 57           |
|    quinoa     | 121          |     rice      | 659          | rice-basmati  | 67           |
| rice-whole-.. | 42           |   spaetzle    | 60           |     pasta     | 136          |
| pasta-haprnli | 61           | pasta-lingu.. | 75           | pasta-noodles | 49           |
|  pasta-penne  | 132          | pasta-spagh.. | 256          |  pasta-twist  | 100          |
| pasta-whole.. | 85           | bread-frenc.. | 123          |     bread     | 63           |
| bread-5-grain | 48           |  bread-fruit  | 49           | bread-half-.. | 76           |
|  bread-grain  | 104          |   bread-nut   | 57           |  bread-pita   | 39           |
|   bread-rye   | 47           | bread-whole.. | 228          | bread-sourd.. | 125          |
|  bread-black  | 56           |  bread-toast  | 75           | bread-whole.. | 66           |
| bread-whole.. | 905          |  bread-white  | 1276         |    brioche    | 46           |
| roll-of-hal.. | 54           | roll-with-p.. | 53           |   focaccia    | 41           |
|   croissant   | 144          | braided-whi.. | 194          | breadcrumbs.. | 56           |
| rusk-wholem.. | 41           | crunch-ma1-.. | 46           |   ma1-4esli   | 75           |
|     beef      | 85           | beef-sirloi.. | 41           |  beef-filet   | 51           |
| beef-minced.. | 65           | beef-cut-in.. | 41           |     pork      | 69           |
|    chicken    | 280          | chicken-bre.. | 91           | chicken-cut.. | 74           |
|  chicken-leg  | 57           | frying-saus.. | 38           |  dried-meat   | 140          |
| veal-sausage  | 44           |    salami     | 173          |  ham-cooked   | 151          |
|    ham-raw    | 161          | bacon-frying  | 127          | bacon-cooking | 48           |
| meat-terrin.. | 45           |    sausage    | 81           | veggie-burger | 36           |
|     tofu      | 121          |     fish      | 119          |    salmon     | 177          |
|     tuna      | 96           | shrimp-praw.. | 58           | shrimp-praw.. | 63           |
|      egg      | 632          |    butter     | 1010         |    praline    | 77           |
|      jam      | 505          |     honey     | 230          | dark-chocol.. | 213          |
| milk-chocol.. | 71           |   chocolate   | 82           | hazelnut-ch.. | 49           |
|   apple-pie   | 104          |    brownie    | 46           | craape-plain  | 103          |
|  fruit-tart   | 49           | cake-chocol.. | 147          | omelette-pl.. | 65           |
|     tart      | 99           | croissant-w.. | 66           |    cookies    | 58           |
|   biscuits    | 135          | chocolate-c.. | 40           |  juice-apple  | 43           |
| juice-orange  | 164          |    ice-tea    | 44           | syrup-dilut.. | 51           |
|      tea      | 353          |  cappuccino   | 139          | espresso-wi.. | 391          |
| coffee-with.. | 876          | white-coffe.. | 274          | ristretto-w.. | 82           |
|   tea-green   | 140          |   tea-black   | 109          | tea-verveine  | 40           |
|  herbal-tea   | 126          | tea-pepperm.. | 39           |     water     | 1837         |
| water-mineral | 185          |  wine-rosa-c  | 67           |   wine-red    | 546          |
|  wine-white   | 333          |     beer      | 158          | sauce-savoury | 129          |
|  sauce-roast  | 76           |  sauce-pesto  | 77           | sauce-mushr.. | 39           |
|  sauce-cream  | 132          |    ketchup    | 87           | bolognaise-.. | 87           |
| tomato-sauce  | 292          | salad-dress.. | 78           | balsamic-sa.. | 117          |
| french-sala.. | 95           | oil-vinegar.. | 50           |   guacamole   | 85           |
|  mayonnaise   | 198          |  sauce-soya   | 39           | soup-vegeta.. | 96           |
| soup-pumpkin  | 78           | falafel-balls | 37           | savoury-puf.. | 50           |
|  corn-crisps  | 46           |    crisps     | 74           | ham-croissant | 44           |
| salt-cake-v.. | 67           | hamburger-b.. | 103          | lasagne-mea.. | 54           |
| mashed-pota.. | 96           | pizza-margh.. | 206          |     sushi     | 59           |
|   pancakes    | 49           |    hummus     | 120          |  greek-salad  | 43           |
| chocolate-m.. | 53           | caprese-sal.. | 85           | taboula-c-p.. | 50           |
| risotto-wit.. | 111          | salmon-smoked | 162          | egg-scrambl.. | 84           |
| boisson-au-.. | 171          | chicken-cur.. | 80           | potatoes-au.. | 49           |
| bircherma1-.. | 96           | fajita-brea.. | 56           | butter-spre.. | 45           |
| water-with-.. | 87           | gluten-free.. | 68           | fruit-coulis  | 41           |
| greek-yaour.. | 39           | soup-of-len.. | 44           | vegetable-a.. | 52           |
| curry-veget.. | 91           | yaourt-yaho.. | 139          | goat-cheese.. | 55           |
|               |              |               |              |               |              |
|     total     | 39328        |               |              |               |              |
[11/08 07:20:23 d2.data.common]: Serializing 24120 elements to byte tensors and concatenating them all ...
[11/08 07:20:23 d2.data.common]: Serialized dataset takes 70.18 MiB
[11/08 07:20:23 d2.data.detection_utils]: TransformGens used in training: [ResizeShortestEdge(short_edge_length=(640, 672, 704, 736, 768, 800), max_size=1333, sample_style='choice'), RandomFlip()]
[11/08 07:20:23 d2.data.build]: Using training sampler TrainingSampler
model_final_f10217.pkl: 178MB [00:01, 94.1MB/s]                           
Skip loading parameter 'roi_heads.box_predictor.cls_score.weight' to the model due to incompatible shapes: (81, 1024) in the checkpoint but (274, 1024) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.box_predictor.cls_score.bias' to the model due to incompatible shapes: (81,) in the checkpoint but (274,) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.box_predictor.bbox_pred.weight' to the model due to incompatible shapes: (320, 1024) in the checkpoint but (1092, 1024) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.box_predictor.bbox_pred.bias' to the model due to incompatible shapes: (320,) in the checkpoint but (1092,) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.mask_head.predictor.weight' to the model due to incompatible shapes: (80, 256, 1, 1) in the checkpoint but (273, 256, 1, 1) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.mask_head.predictor.bias' to the model due to incompatible shapes: (80,) in the checkpoint but (273,) in the model! You might want to double check if this is expected.

Training the Model 🚂

Setting up Tensorboard & finally training our model!

Setting up Tensorboard

In [ ]:
!pip install --upgrade git+git://github.com/wandb/client.git
Collecting git+git://github.com/wandb/client.git
  Cloning git://github.com/wandb/client.git to /tmp/pip-req-build-coaxvwht
  Running command git clone -q git://github.com/wandb/client.git /tmp/pip-req-build-coaxvwht
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
    Preparing wheel metadata ... done
Collecting shortuuid>=0.5.0
  Downloading https://files.pythonhosted.org/packages/25/a6/2ecc1daa6a304e7f1b216f0896b26156b78e7c38e1211e9b798b4716c53d/shortuuid-1.0.1-py3-none-any.whl
Collecting GitPython>=1.0.0
  Downloading https://files.pythonhosted.org/packages/c0/d7/b2b0672e0331567157adf9281f41ee731c412ee518ca5e6552c27fa73c91/GitPython-3.1.9-py3-none-any.whl (159kB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 163kB 12.2MB/s 
Requirement already satisfied, skipping upgrade: protobuf>=3.12.0 in /usr/local/lib/python3.6/dist-packages (from wandb==0.10.7.dev1) (3.12.4)
Requirement already satisfied, skipping upgrade: python-dateutil>=2.6.1 in /usr/local/lib/python3.6/dist-packages (from wandb==0.10.7.dev1) (2.8.1)
Collecting docker-pycreds>=0.4.0
  Downloading https://files.pythonhosted.org/packages/f5/e8/f6bd1eee09314e7e6dee49cbe2c5e22314ccdb38db16c9fc72d2fa80d054/docker_pycreds-0.4.0-py2.py3-none-any.whl
Requirement already satisfied, skipping upgrade: promise<3,>=2.0 in /usr/local/lib/python3.6/dist-packages (from wandb==0.10.7.dev1) (2.3)
Requirement already satisfied, skipping upgrade: six>=1.13.0 in /usr/local/lib/python3.6/dist-packages (from wandb==0.10.7.dev1) (1.15.0)
Collecting sentry-sdk>=0.4.0
  Downloading https://files.pythonhosted.org/packages/1f/08/5eb320799e3085ccc66ec0fc3360421302803f3b784f74959564dbc6cdc9/sentry_sdk-0.19.0-py2.py3-none-any.whl (120kB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 122kB 30.3MB/s 
Collecting subprocess32>=3.5.3
  Downloading https://files.pythonhosted.org/packages/32/c8/564be4d12629b912ea431f1a50eb8b3b9d00f1a0b1ceff17f266be190007/subprocess32-3.5.4.tar.gz (97kB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 102kB 11.9MB/s 
Collecting configparser>=3.8.1
  Downloading https://files.pythonhosted.org/packages/08/b2/ef713e0e67f6e7ec7d59aea3ee78d05b39c15930057e724cc6d362a8c3bb/configparser-5.0.1-py3-none-any.whl
Requirement already satisfied, skipping upgrade: PyYAML in /usr/local/lib/python3.6/dist-packages (from wandb==0.10.7.dev1) (5.1)
Requirement already satisfied, skipping upgrade: requests<3,>=2.0.0 in /usr/local/lib/python3.6/dist-packages (from wandb==0.10.7.dev1) (2.23.0)
Requirement already satisfied, skipping upgrade: psutil>=5.0.0 in /usr/local/lib/python3.6/dist-packages (from wandb==0.10.7.dev1) (5.4.8)
Requirement already satisfied, skipping upgrade: Click>=7.0 in /usr/local/lib/python3.6/dist-packages (from wandb==0.10.7.dev1) (7.1.2)
Collecting watchdog>=0.8.3
  Downloading https://files.pythonhosted.org/packages/0e/06/121302598a4fc01aca942d937f4a2c33430b7181137b35758913a8db10ad/watchdog-0.10.3.tar.gz (94kB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 102kB 12.0MB/s 
Collecting gitdb<5,>=4.0.1
  Downloading https://files.pythonhosted.org/packages/48/11/d1800bca0a3bae820b84b7d813ad1eff15a48a64caea9c823fc8c1b119e8/gitdb-4.0.5-py3-none-any.whl (63kB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 71kB 7.6MB/s 
Requirement already satisfied, skipping upgrade: setuptools in /usr/local/lib/python3.6/dist-packages (from protobuf>=3.12.0->wandb==0.10.7.dev1) (50.3.0)
Requirement already satisfied, skipping upgrade: urllib3>=1.10.0 in /usr/local/lib/python3.6/dist-packages (from sentry-sdk>=0.4.0->wandb==0.10.7.dev1) (1.24.3)
Requirement already satisfied, skipping upgrade: certifi in /usr/local/lib/python3.6/dist-packages (from sentry-sdk>=0.4.0->wandb==0.10.7.dev1) (2020.6.20)
Requirement already satisfied, skipping upgrade: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests<3,>=2.0.0->wandb==0.10.7.dev1) (3.0.4)
Requirement already satisfied, skipping upgrade: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests<3,>=2.0.0->wandb==0.10.7.dev1) (2.10)
Collecting pathtools>=0.1.1
  Downloading https://files.pythonhosted.org/packages/e7/7f/470d6fcdf23f9f3518f6b0b76be9df16dcc8630ad409947f8be2eb0ed13a/pathtools-0.1.2.tar.gz
Collecting smmap<4,>=3.0.1
  Downloading https://files.pythonhosted.org/packages/b0/9a/4d409a6234eb940e6a78dfdfc66156e7522262f5f2fecca07dc55915952d/smmap-3.0.4-py2.py3-none-any.whl
Building wheels for collected packages: wandb
  Building wheel for wandb (PEP 517) ... done
  Created wheel for wandb: filename=wandb-0.10.7.dev1-cp36-none-any.whl size=1693091 sha256=3222c8a75719b9aa12ceb6adced4c065c1111c0d8098a7cc6230f65173b46b03
  Stored in directory: /tmp/pip-ephem-wheel-cache-s8m9f4gs/wheels/f5/bb/45/4b2ff6f79b35baf639a4eaf2673c5df97fa21335bf82cc7bfa
Successfully built wandb
Building wheels for collected packages: subprocess32, watchdog, pathtools
  Building wheel for subprocess32 (setup.py) ... done
  Created wheel for subprocess32: filename=subprocess32-3.5.4-cp36-none-any.whl size=6489 sha256=797ae5ded101c517d2dd89cd2e3835d82137e9153a07043ce1f862c595fbf2d4
  Stored in directory: /root/.cache/pip/wheels/68/39/1a/5e402bdfdf004af1786c8b853fd92f8c4a04f22aad179654d1
  Building wheel for watchdog (setup.py) ... done
  Created wheel for watchdog: filename=watchdog-0.10.3-cp36-none-any.whl size=73873 sha256=39f3a578f4c871a3c54ea53c2951051acfd3b91639c243847cd543ebd4d7560e
  Stored in directory: /root/.cache/pip/wheels/a8/1d/38/2c19bb311f67cc7b4d07a2ec5ea36ab1a0a0ea50db994a5bc7
  Building wheel for pathtools (setup.py) ... done
  Created wheel for pathtools: filename=pathtools-0.1.2-cp36-none-any.whl size=8785 sha256=07700793fd010b8da19b70de0c6006c38fb9e111783c482816162da157e26346
  Stored in directory: /root/.cache/pip/wheels/0b/04/79/c3b0c3a0266a3cb4376da31e5bfe8bba0c489246968a68e843
Successfully built subprocess32 watchdog pathtools
Installing collected packages: shortuuid, smmap, gitdb, GitPython, docker-pycreds, sentry-sdk, subprocess32, configparser, pathtools, watchdog, wandb
Successfully installed GitPython-3.1.9 configparser-5.0.1 docker-pycreds-0.4.0 gitdb-4.0.5 pathtools-0.1.2 sentry-sdk-0.19.0 shortuuid-1.0.1 smmap-3.0.4 subprocess32-3.5.4 wandb-0.10.7.dev1 watchdog-0.10.3
In [ ]:
import wandb
wandb.init(project='food detection', sync_tensorboard=True)
wandb: Appending key for api.wandb.ai to your netrc file: /root/.netrc
wandb: wandb version 0.10.7 is available!  To upgrade, please run:
wandb:  $ pip install wandb --upgrade
Tracking run with wandb version 0.10.7.dev1
Syncing run helpful-sun-12 to Weights & Biases (Documentation).
Project page: https://wandb.ai/shubhamai/food%20detection
Run page: https://wandb.ai/shubhamai/food%20detection/runs/1kuucz5k
Run data is saved locally in wandb/run-20201017_060820-1kuucz5k

Out[ ]:

Run(1kuucz5k)

In [ ]:
%load_ext tensorboard
In [ ]:
%tensorboard --logdir logs
Output hidden; open in https://colab.research.google.com to view.

Start Training!

In [ ]:
trainer.train()
[11/08 07:20:34 d2.engine.train_loop]: Starting training from iteration 0
[11/08 07:20:41 d2.utils.events]:  eta: 0:10:28  iter: 19  total_loss: 6.803  loss_cls: 5.567  loss_box_reg: 0.527  loss_mask: 0.692  loss_rpn_cls: 0.026  loss_rpn_loc: 0.016  time: 0.3200  data_time: 0.0174  lr: 0.000005  max_mem: 1926M
[11/08 07:20:48 d2.utils.events]:  eta: 0:10:25  iter: 39  total_loss: 6.706  loss_cls: 5.464  loss_box_reg: 0.521  loss_mask: 0.693  loss_rpn_cls: 0.008  loss_rpn_loc: 0.012  time: 0.3217  data_time: 0.0074  lr: 0.000010  max_mem: 2090M
[11/08 07:20:54 d2.utils.events]:  eta: 0:10:24  iter: 59  total_loss: 6.288  loss_cls: 4.979  loss_box_reg: 0.626  loss_mask: 0.693  loss_rpn_cls: 0.023  loss_rpn_loc: 0.021  time: 0.3261  data_time: 0.0117  lr: 0.000015  max_mem: 2090M
[11/08 07:21:01 d2.utils.events]:  eta: 0:10:20  iter: 79  total_loss: 5.606  loss_cls: 4.338  loss_box_reg: 0.560  loss_mask: 0.692  loss_rpn_cls: 0.034  loss_rpn_loc: 0.014  time: 0.3271  data_time: 0.0073  lr: 0.000020  max_mem: 2090M
[11/08 07:21:08 d2.utils.events]:  eta: 0:10:27  iter: 99  total_loss: 4.539  loss_cls: 3.214  loss_box_reg: 0.667  loss_mask: 0.692  loss_rpn_cls: 0.011  loss_rpn_loc: 0.010  time: 0.3297  data_time: 0.0080  lr: 0.000025  max_mem: 2090M
[11/08 07:21:15 d2.utils.events]:  eta: 0:10:27  iter: 119  total_loss: 3.083  loss_cls: 1.640  loss_box_reg: 0.653  loss_mask: 0.690  loss_rpn_cls: 0.004  loss_rpn_loc: 0.012  time: 0.3325  data_time: 0.0075  lr: 0.000030  max_mem: 2090M
[11/08 07:21:22 d2.utils.events]:  eta: 0:10:24  iter: 139  total_loss: 2.526  loss_cls: 1.218  loss_box_reg: 0.590  loss_mask: 0.690  loss_rpn_cls: 0.007  loss_rpn_loc: 0.017  time: 0.3344  data_time: 0.0088  lr: 0.000035  max_mem: 2191M
[11/08 07:21:28 d2.utils.events]:  eta: 0:10:17  iter: 159  total_loss: 2.498  loss_cls: 1.169  loss_box_reg: 0.650  loss_mask: 0.691  loss_rpn_cls: 0.009  loss_rpn_loc: 0.015  time: 0.3344  data_time: 0.0074  lr: 0.000040  max_mem: 2191M
[11/08 07:21:35 d2.utils.events]:  eta: 0:10:14  iter: 179  total_loss: 2.692  loss_cls: 1.268  loss_box_reg: 0.695  loss_mask: 0.688  loss_rpn_cls: 0.014  loss_rpn_loc: 0.013  time: 0.3349  data_time: 0.0062  lr: 0.000045  max_mem: 2191M
[11/08 07:21:42 d2.utils.events]:  eta: 0:10:11  iter: 199  total_loss: 2.448  loss_cls: 1.124  loss_box_reg: 0.600  loss_mask: 0.690  loss_rpn_cls: 0.029  loss_rpn_loc: 0.017  time: 0.3380  data_time: 0.0198  lr: 0.000050  max_mem: 2191M
[11/08 07:21:49 d2.utils.events]:  eta: 0:10:05  iter: 219  total_loss: 2.777  loss_cls: 1.323  loss_box_reg: 0.725  loss_mask: 0.689  loss_rpn_cls: 0.024  loss_rpn_loc: 0.017  time: 0.3389  data_time: 0.0069  lr: 0.000055  max_mem: 2191M
[11/08 07:21:56 d2.utils.events]:  eta: 0:10:01  iter: 239  total_loss: 2.624  loss_cls: 1.256  loss_box_reg: 0.707  loss_mask: 0.685  loss_rpn_cls: 0.008  loss_rpn_loc: 0.017  time: 0.3404  data_time: 0.0063  lr: 0.000060  max_mem: 2351M
[11/08 07:22:03 d2.utils.events]:  eta: 0:09:55  iter: 259  total_loss: 2.322  loss_cls: 1.014  loss_box_reg: 0.534  loss_mask: 0.685  loss_rpn_cls: 0.011  loss_rpn_loc: 0.010  time: 0.3410  data_time: 0.0176  lr: 0.000065  max_mem: 2351M
[11/08 07:22:11 d2.utils.events]:  eta: 0:09:51  iter: 279  total_loss: 2.551  loss_cls: 1.178  loss_box_reg: 0.628  loss_mask: 0.684  loss_rpn_cls: 0.016  loss_rpn_loc: 0.017  time: 0.3421  data_time: 0.0088  lr: 0.000070  max_mem: 2351M
[11/08 07:22:18 d2.utils.events]:  eta: 0:09:45  iter: 299  total_loss: 2.351  loss_cls: 1.034  loss_box_reg: 0.566  loss_mask: 0.687  loss_rpn_cls: 0.007  loss_rpn_loc: 0.009  time: 0.3426  data_time: 0.0063  lr: 0.000075  max_mem: 2351M
[11/08 07:22:24 d2.utils.events]:  eta: 0:09:36  iter: 319  total_loss: 2.496  loss_cls: 1.144  loss_box_reg: 0.617  loss_mask: 0.677  loss_rpn_cls: 0.009  loss_rpn_loc: 0.010  time: 0.3413  data_time: 0.0087  lr: 0.000080  max_mem: 2351M
[11/08 07:22:31 d2.utils.events]:  eta: 0:09:30  iter: 339  total_loss: 2.349  loss_cls: 1.029  loss_box_reg: 0.611  loss_mask: 0.678  loss_rpn_cls: 0.003  loss_rpn_loc: 0.009  time: 0.3415  data_time: 0.0078  lr: 0.000085  max_mem: 2351M
[11/08 07:22:38 d2.utils.events]:  eta: 0:09:23  iter: 359  total_loss: 2.250  loss_cls: 1.026  loss_box_reg: 0.571  loss_mask: 0.679  loss_rpn_cls: 0.009  loss_rpn_loc: 0.011  time: 0.3418  data_time: 0.0060  lr: 0.000090  max_mem: 2351M
/usr/local/lib/python3.6/dist-packages/PIL/TiffImagePlugin.py:590: UserWarning:

Metadata Warning, tag 282 had too many entries: 2, expected 1

/usr/local/lib/python3.6/dist-packages/PIL/TiffImagePlugin.py:590: UserWarning:

Metadata Warning, tag 283 had too many entries: 2, expected 1

[11/08 07:22:46 d2.utils.events]:  eta: 0:09:18  iter: 379  total_loss: 2.436  loss_cls: 1.109  loss_box_reg: 0.648  loss_mask: 0.675  loss_rpn_cls: 0.010  loss_rpn_loc: 0.014  time: 0.3448  data_time: 0.0291  lr: 0.000095  max_mem: 2723M
[11/08 07:22:53 d2.utils.events]:  eta: 0:09:12  iter: 399  total_loss: 2.420  loss_cls: 1.106  loss_box_reg: 0.596  loss_mask: 0.673  loss_rpn_cls: 0.014  loss_rpn_loc: 0.013  time: 0.3452  data_time: 0.0086  lr: 0.000100  max_mem: 2723M
[11/08 07:23:00 d2.utils.events]:  eta: 0:09:05  iter: 419  total_loss: 2.638  loss_cls: 1.189  loss_box_reg: 0.760  loss_mask: 0.674  loss_rpn_cls: 0.005  loss_rpn_loc: 0.011  time: 0.3454  data_time: 0.0075  lr: 0.000105  max_mem: 2723M
[11/08 07:23:07 d2.utils.events]:  eta: 0:09:00  iter: 439  total_loss: 2.604  loss_cls: 1.206  loss_box_reg: 0.669  loss_mask: 0.676  loss_rpn_cls: 0.007  loss_rpn_loc: 0.012  time: 0.3458  data_time: 0.0078  lr: 0.000110  max_mem: 2723M
[11/08 07:23:14 d2.utils.events]:  eta: 0:08:54  iter: 459  total_loss: 2.164  loss_cls: 0.941  loss_box_reg: 0.596  loss_mask: 0.668  loss_rpn_cls: 0.010  loss_rpn_loc: 0.012  time: 0.3468  data_time: 0.0064  lr: 0.000115  max_mem: 2723M
[11/08 07:23:22 d2.utils.events]:  eta: 0:08:48  iter: 479  total_loss: 2.608  loss_cls: 1.188  loss_box_reg: 0.663  loss_mask: 0.659  loss_rpn_cls: 0.008  loss_rpn_loc: 0.014  time: 0.3476  data_time: 0.0094  lr: 0.000120  max_mem: 2723M
[11/08 07:23:29 d2.utils.events]:  eta: 0:08:42  iter: 499  total_loss: 2.343  loss_cls: 1.026  loss_box_reg: 0.586  loss_mask: 0.659  loss_rpn_cls: 0.013  loss_rpn_loc: 0.011  time: 0.3486  data_time: 0.0193  lr: 0.000125  max_mem: 2723M
[11/08 07:23:36 d2.utils.events]:  eta: 0:08:36  iter: 519  total_loss: 2.317  loss_cls: 1.060  loss_box_reg: 0.577  loss_mask: 0.673  loss_rpn_cls: 0.008  loss_rpn_loc: 0.011  time: 0.3489  data_time: 0.0070  lr: 0.000130  max_mem: 2723M
[11/08 07:23:43 d2.utils.events]:  eta: 0:08:29  iter: 539  total_loss: 2.468  loss_cls: 1.075  loss_box_reg: 0.678  loss_mask: 0.671  loss_rpn_cls: 0.010  loss_rpn_loc: 0.010  time: 0.3489  data_time: 0.0085  lr: 0.000135  max_mem: 2723M
[11/08 07:23:50 d2.utils.events]:  eta: 0:08:22  iter: 559  total_loss: 2.124  loss_cls: 0.958  loss_box_reg: 0.532  loss_mask: 0.671  loss_rpn_cls: 0.012  loss_rpn_loc: 0.011  time: 0.3490  data_time: 0.0091  lr: 0.000140  max_mem: 2723M
[11/08 07:23:58 d2.utils.events]:  eta: 0:08:16  iter: 579  total_loss: 2.376  loss_cls: 1.076  loss_box_reg: 0.622  loss_mask: 0.669  loss_rpn_cls: 0.008  loss_rpn_loc: 0.010  time: 0.3498  data_time: 0.0147  lr: 0.000145  max_mem: 2723M
[11/08 07:24:05 d2.utils.events]:  eta: 0:08:10  iter: 599  total_loss: 2.478  loss_cls: 1.110  loss_box_reg: 0.690  loss_mask: 0.648  loss_rpn_cls: 0.008  loss_rpn_loc: 0.009  time: 0.3502  data_time: 0.0070  lr: 0.000150  max_mem: 2723M
[11/08 07:24:12 d2.utils.events]:  eta: 0:08:03  iter: 619  total_loss: 2.403  loss_cls: 1.109  loss_box_reg: 0.595  loss_mask: 0.640  loss_rpn_cls: 0.008  loss_rpn_loc: 0.010  time: 0.3504  data_time: 0.0054  lr: 0.000155  max_mem: 2723M
[11/08 07:24:20 d2.utils.events]:  eta: 0:07:56  iter: 639  total_loss: 2.153  loss_cls: 0.914  loss_box_reg: 0.554  loss_mask: 0.647  loss_rpn_cls: 0.007  loss_rpn_loc: 0.010  time: 0.3510  data_time: 0.0164  lr: 0.000160  max_mem: 2723M
[11/08 07:24:27 d2.utils.events]:  eta: 0:07:50  iter: 659  total_loss: 2.515  loss_cls: 1.182  loss_box_reg: 0.683  loss_mask: 0.601  loss_rpn_cls: 0.005  loss_rpn_loc: 0.009  time: 0.3513  data_time: 0.0081  lr: 0.000165  max_mem: 2723M
[11/08 07:24:34 d2.utils.events]:  eta: 0:07:43  iter: 679  total_loss: 1.995  loss_cls: 0.738  loss_box_reg: 0.505  loss_mask: 0.612  loss_rpn_cls: 0.008  loss_rpn_loc: 0.010  time: 0.3521  data_time: 0.0068  lr: 0.000170  max_mem: 2723M
[11/08 07:24:42 d2.utils.events]:  eta: 0:07:36  iter: 699  total_loss: 2.145  loss_cls: 0.879  loss_box_reg: 0.582  loss_mask: 0.538  loss_rpn_cls: 0.012  loss_rpn_loc: 0.011  time: 0.3522  data_time: 0.0064  lr: 0.000175  max_mem: 2723M
/usr/local/lib/python3.6/dist-packages/PIL/TiffImagePlugin.py:590: UserWarning:

Metadata Warning, tag 282 had too many entries: 2, expected 1

/usr/local/lib/python3.6/dist-packages/PIL/TiffImagePlugin.py:590: UserWarning:

Metadata Warning, tag 283 had too many entries: 2, expected 1

[11/08 07:24:49 d2.utils.events]:  eta: 0:07:29  iter: 719  total_loss: 2.147  loss_cls: 0.979  loss_box_reg: 0.628  loss_mask: 0.555  loss_rpn_cls: 0.004  loss_rpn_loc: 0.009  time: 0.3525  data_time: 0.0179  lr: 0.000180  max_mem: 2723M
[11/08 07:24:57 d2.utils.events]:  eta: 0:07:23  iter: 739  total_loss: 2.420  loss_cls: 1.120  loss_box_reg: 0.669  loss_mask: 0.596  loss_rpn_cls: 0.013  loss_rpn_loc: 0.010  time: 0.3534  data_time: 0.0262  lr: 0.000185  max_mem: 2723M
[11/08 07:25:04 d2.utils.events]:  eta: 0:07:16  iter: 759  total_loss: 2.407  loss_cls: 1.078  loss_box_reg: 0.640  loss_mask: 0.604  loss_rpn_cls: 0.009  loss_rpn_loc: 0.010  time: 0.3539  data_time: 0.0150  lr: 0.000190  max_mem: 2723M
[11/08 07:25:12 d2.utils.events]:  eta: 0:07:09  iter: 779  total_loss: 2.255  loss_cls: 0.998  loss_box_reg: 0.616  loss_mask: 0.568  loss_rpn_cls: 0.005  loss_rpn_loc: 0.009  time: 0.3545  data_time: 0.0063  lr: 0.000195  max_mem: 2723M
[11/08 07:25:19 d2.utils.events]:  eta: 0:07:03  iter: 799  total_loss: 2.313  loss_cls: 1.040  loss_box_reg: 0.663  loss_mask: 0.605  loss_rpn_cls: 0.006  loss_rpn_loc: 0.010  time: 0.3553  data_time: 0.0066  lr: 0.000200  max_mem: 2723M
[11/08 07:25:27 d2.utils.events]:  eta: 0:06:57  iter: 819  total_loss: 2.268  loss_cls: 1.023  loss_box_reg: 0.643  loss_mask: 0.541  loss_rpn_cls: 0.013  loss_rpn_loc: 0.009  time: 0.3558  data_time: 0.0084  lr: 0.000205  max_mem: 2723M
[11/08 07:25:34 d2.utils.events]:  eta: 0:06:50  iter: 839  total_loss: 2.234  loss_cls: 1.022  loss_box_reg: 0.635  loss_mask: 0.554  loss_rpn_cls: 0.005  loss_rpn_loc: 0.012  time: 0.3562  data_time: 0.0065  lr: 0.000210  max_mem: 2723M
[11/08 07:25:42 d2.utils.events]:  eta: 0:06:44  iter: 859  total_loss: 2.262  loss_cls: 1.017  loss_box_reg: 0.642  loss_mask: 0.499  loss_rpn_cls: 0.014  loss_rpn_loc: 0.009  time: 0.3564  data_time: 0.0056  lr: 0.000215  max_mem: 2723M
[11/08 07:25:49 d2.utils.events]:  eta: 0:06:37  iter: 879  total_loss: 2.353  loss_cls: 1.117  loss_box_reg: 0.695  loss_mask: 0.577  loss_rpn_cls: 0.007  loss_rpn_loc: 0.008  time: 0.3565  data_time: 0.0061  lr: 0.000220  max_mem: 2723M
[11/08 07:25:56 d2.utils.events]:  eta: 0:06:30  iter: 899  total_loss: 1.785  loss_cls: 0.722  loss_box_reg: 0.591  loss_mask: 0.430  loss_rpn_cls: 0.011  loss_rpn_loc: 0.007  time: 0.3565  data_time: 0.0063  lr: 0.000225  max_mem: 2723M
[11/08 07:26:04 d2.utils.events]:  eta: 0:06:23  iter: 919  total_loss: 2.241  loss_cls: 1.001  loss_box_reg: 0.655  loss_mask: 0.523  loss_rpn_cls: 0.008  loss_rpn_loc: 0.006  time: 0.3570  data_time: 0.0169  lr: 0.000230  max_mem: 2723M
[11/08 07:26:11 d2.utils.events]:  eta: 0:06:16  iter: 939  total_loss: 2.241  loss_cls: 1.050  loss_box_reg: 0.679  loss_mask: 0.463  loss_rpn_cls: 0.007  loss_rpn_loc: 0.008  time: 0.3568  data_time: 0.0071  lr: 0.000235  max_mem: 2723M
[11/08 07:26:18 d2.utils.events]:  eta: 0:06:09  iter: 959  total_loss: 1.777  loss_cls: 0.778  loss_box_reg: 0.575  loss_mask: 0.446  loss_rpn_cls: 0.006  loss_rpn_loc: 0.007  time: 0.3569  data_time: 0.0077  lr: 0.000240  max_mem: 2723M
[11/08 07:26:25 d2.utils.events]:  eta: 0:06:02  iter: 979  total_loss: 2.094  loss_cls: 0.937  loss_box_reg: 0.661  loss_mask: 0.476  loss_rpn_cls: 0.007  loss_rpn_loc: 0.008  time: 0.3571  data_time: 0.0056  lr: 0.000245  max_mem: 2723M
[11/08 07:26:32 d2.utils.events]:  eta: 0:05:55  iter: 999  total_loss: 1.920  loss_cls: 0.885  loss_box_reg: 0.570  loss_mask: 0.527  loss_rpn_cls: 0.004  loss_rpn_loc: 0.009  time: 0.3572  data_time: 0.0065  lr: 0.000250  max_mem: 2723M
[11/08 07:26:40 d2.utils.events]:  eta: 0:05:49  iter: 1019  total_loss: 1.690  loss_cls: 0.789  loss_box_reg: 0.531  loss_mask: 0.457  loss_rpn_cls: 0.009  loss_rpn_loc: 0.008  time: 0.3575  data_time: 0.0074  lr: 0.000250  max_mem: 2723M
[11/08 07:26:47 d2.utils.events]:  eta: 0:05:43  iter: 1039  total_loss: 2.275  loss_cls: 1.064  loss_box_reg: 0.676  loss_mask: 0.456  loss_rpn_cls: 0.007  loss_rpn_loc: 0.008  time: 0.3577  data_time: 0.0060  lr: 0.000250  max_mem: 2723M
[11/08 07:26:55 d2.utils.events]:  eta: 0:05:37  iter: 1059  total_loss: 2.176  loss_cls: 1.083  loss_box_reg: 0.677  loss_mask: 0.467  loss_rpn_cls: 0.011  loss_rpn_loc: 0.011  time: 0.3584  data_time: 0.0190  lr: 0.000250  max_mem: 2723M
[11/08 07:27:03 d2.utils.events]:  eta: 0:05:31  iter: 1079  total_loss: 2.361  loss_cls: 1.121  loss_box_reg: 0.660  loss_mask: 0.503  loss_rpn_cls: 0.010  loss_rpn_loc: 0.012  time: 0.3588  data_time: 0.0064  lr: 0.000250  max_mem: 2723M
[11/08 07:27:10 d2.utils.events]:  eta: 0:05:25  iter: 1099  total_loss: 2.100  loss_cls: 0.999  loss_box_reg: 0.579  loss_mask: 0.451  loss_rpn_cls: 0.007  loss_rpn_loc: 0.007  time: 0.3590  data_time: 0.0065  lr: 0.000250  max_mem: 2723M
[11/08 07:27:18 d2.utils.events]:  eta: 0:05:18  iter: 1119  total_loss: 1.989  loss_cls: 0.845  loss_box_reg: 0.641  loss_mask: 0.381  loss_rpn_cls: 0.006  loss_rpn_loc: 0.007  time: 0.3593  data_time: 0.0065  lr: 0.000250  max_mem: 2723M
[11/08 07:27:25 d2.utils.events]:  eta: 0:05:11  iter: 1139  total_loss: 1.992  loss_cls: 0.948  loss_box_reg: 0.597  loss_mask: 0.495  loss_rpn_cls: 0.010  loss_rpn_loc: 0.009  time: 0.3595  data_time: 0.0097  lr: 0.000250  max_mem: 2723M
[11/08 07:27:33 d2.utils.events]:  eta: 0:05:04  iter: 1159  total_loss: 2.172  loss_cls: 1.043  loss_box_reg: 0.650  loss_mask: 0.439  loss_rpn_cls: 0.006  loss_rpn_loc: 0.008  time: 0.3597  data_time: 0.0274  lr: 0.000250  max_mem: 2723M
[11/08 07:27:40 d2.utils.events]:  eta: 0:04:57  iter: 1179  total_loss: 1.856  loss_cls: 0.861  loss_box_reg: 0.611  loss_mask: 0.362  loss_rpn_cls: 0.005  loss_rpn_loc: 0.006  time: 0.3597  data_time: 0.0067  lr: 0.000250  max_mem: 2723M
[11/08 07:27:47 d2.utils.events]:  eta: 0:04:50  iter: 1199  total_loss: 2.023  loss_cls: 0.946  loss_box_reg: 0.639  loss_mask: 0.378  loss_rpn_cls: 0.008  loss_rpn_loc: 0.010  time: 0.3599  data_time: 0.0077  lr: 0.000250  max_mem: 2723M
[11/08 07:27:55 d2.utils.events]:  eta: 0:04:43  iter: 1219  total_loss: 2.055  loss_cls: 1.035  loss_box_reg: 0.679  loss_mask: 0.426  loss_rpn_cls: 0.005  loss_rpn_loc: 0.008  time: 0.3600  data_time: 0.0071  lr: 0.000250  max_mem: 2723M
[11/08 07:28:02 d2.utils.events]:  eta: 0:04:36  iter: 1239  total_loss: 2.244  loss_cls: 0.992  loss_box_reg: 0.705  loss_mask: 0.479  loss_rpn_cls: 0.004  loss_rpn_loc: 0.012  time: 0.3602  data_time: 0.0055  lr: 0.000250  max_mem: 2723M
[11/08 07:28:09 d2.utils.events]:  eta: 0:04:29  iter: 1259  total_loss: 2.120  loss_cls: 1.072  loss_box_reg: 0.645  loss_mask: 0.457  loss_rpn_cls: 0.006  loss_rpn_loc: 0.009  time: 0.3602  data_time: 0.0077  lr: 0.000250  max_mem: 2723M
[11/08 07:28:17 d2.utils.events]:  eta: 0:04:22  iter: 1279  total_loss: 2.253  loss_cls: 1.069  loss_box_reg: 0.667  loss_mask: 0.425  loss_rpn_cls: 0.009  loss_rpn_loc: 0.008  time: 0.3603  data_time: 0.0067  lr: 0.000250  max_mem: 2723M
[11/08 07:28:24 d2.utils.events]:  eta: 0:04:15  iter: 1299  total_loss: 2.022  loss_cls: 0.971  loss_box_reg: 0.623  loss_mask: 0.371  loss_rpn_cls: 0.007  loss_rpn_loc: 0.008  time: 0.3604  data_time: 0.0070  lr: 0.000250  max_mem: 2723M
[11/08 07:28:31 d2.utils.events]:  eta: 0:04:09  iter: 1319  total_loss: 1.972  loss_cls: 0.872  loss_box_reg: 0.603  loss_mask: 0.453  loss_rpn_cls: 0.007  loss_rpn_loc: 0.010  time: 0.3606  data_time: 0.0071  lr: 0.000250  max_mem: 2723M
[11/08 07:28:39 d2.utils.events]:  eta: 0:04:02  iter: 1339  total_loss: 1.894  loss_cls: 0.846  loss_box_reg: 0.661  loss_mask: 0.381  loss_rpn_cls: 0.011  loss_rpn_loc: 0.007  time: 0.3607  data_time: 0.0053  lr: 0.000250  max_mem: 2723M
[11/08 07:28:46 d2.utils.events]:  eta: 0:03:55  iter: 1359  total_loss: 2.301  loss_cls: 1.086  loss_box_reg: 0.688  loss_mask: 0.499  loss_rpn_cls: 0.010  loss_rpn_loc: 0.015  time: 0.3609  data_time: 0.0085  lr: 0.000250  max_mem: 2723M
[11/08 07:28:54 d2.utils.events]:  eta: 0:03:48  iter: 1379  total_loss: 2.246  loss_cls: 1.055  loss_box_reg: 0.637  loss_mask: 0.429  loss_rpn_cls: 0.007  loss_rpn_loc: 0.008  time: 0.3611  data_time: 0.0052  lr: 0.000250  max_mem: 2723M
[11/08 07:29:01 d2.utils.events]:  eta: 0:03:41  iter: 1399  total_loss: 2.276  loss_cls: 1.058  loss_box_reg: 0.748  loss_mask: 0.421  loss_rpn_cls: 0.008  loss_rpn_loc: 0.013  time: 0.3613  data_time: 0.0067  lr: 0.000250  max_mem: 2723M
[11/08 07:29:09 d2.utils.events]:  eta: 0:03:33  iter: 1419  total_loss: 2.130  loss_cls: 1.083  loss_box_reg: 0.696  loss_mask: 0.385  loss_rpn_cls: 0.005  loss_rpn_loc: 0.008  time: 0.3614  data_time: 0.0062  lr: 0.000250  max_mem: 2723M
[11/08 07:29:16 d2.utils.events]:  eta: 0:03:26  iter: 1439  total_loss: 2.001  loss_cls: 0.989  loss_box_reg: 0.638  loss_mask: 0.315  loss_rpn_cls: 0.004  loss_rpn_loc: 0.008  time: 0.3616  data_time: 0.0143  lr: 0.000250  max_mem: 2723M
[11/08 07:29:24 d2.utils.events]:  eta: 0:03:19  iter: 1459  total_loss: 1.886  loss_cls: 0.925  loss_box_reg: 0.689  loss_mask: 0.328  loss_rpn_cls: 0.003  loss_rpn_loc: 0.006  time: 0.3616  data_time: 0.0069  lr: 0.000250  max_mem: 2723M
[11/08 07:29:31 d2.utils.events]:  eta: 0:03:12  iter: 1479  total_loss: 2.038  loss_cls: 1.071  loss_box_reg: 0.666  loss_mask: 0.349  loss_rpn_cls: 0.011  loss_rpn_loc: 0.008  time: 0.3619  data_time: 0.0089  lr: 0.000250  max_mem: 2723M
[11/08 07:29:39 d2.utils.events]:  eta: 0:03:04  iter: 1499  total_loss: 2.131  loss_cls: 0.901  loss_box_reg: 0.613  loss_mask: 0.449  loss_rpn_cls: 0.013  loss_rpn_loc: 0.009  time: 0.3620  data_time: 0.0157  lr: 0.000250  max_mem: 2723M
[11/08 07:29:46 d2.utils.events]:  eta: 0:02:57  iter: 1519  total_loss: 2.084  loss_cls: 0.989  loss_box_reg: 0.650  loss_mask: 0.445  loss_rpn_cls: 0.009  loss_rpn_loc: 0.006  time: 0.3622  data_time: 0.0194  lr: 0.000250  max_mem: 2723M
[11/08 07:29:53 d2.utils.events]:  eta: 0:02:50  iter: 1539  total_loss: 2.102  loss_cls: 1.059  loss_box_reg: 0.667  loss_mask: 0.389  loss_rpn_cls: 0.005  loss_rpn_loc: 0.005  time: 0.3622  data_time: 0.0071  lr: 0.000250  max_mem: 2723M
[11/08 07:30:01 d2.utils.events]:  eta: 0:02:42  iter: 1559  total_loss: 2.174  loss_cls: 1.049  loss_box_reg: 0.734  loss_mask: 0.407  loss_rpn_cls: 0.009  loss_rpn_loc: 0.010  time: 0.3624  data_time: 0.0085  lr: 0.000250  max_mem: 2723M
[11/08 07:30:08 d2.utils.events]:  eta: 0:02:35  iter: 1579  total_loss: 2.138  loss_cls: 0.999  loss_box_reg: 0.666  loss_mask: 0.380  loss_rpn_cls: 0.009  loss_rpn_loc: 0.008  time: 0.3626  data_time: 0.0065  lr: 0.000250  max_mem: 2723M
[11/08 07:30:16 d2.utils.events]:  eta: 0:02:28  iter: 1599  total_loss: 2.154  loss_cls: 1.093  loss_box_reg: 0.727  loss_mask: 0.375  loss_rpn_cls: 0.004  loss_rpn_loc: 0.006  time: 0.3626  data_time: 0.0160  lr: 0.000250  max_mem: 2723M
[11/08 07:30:23 d2.utils.events]:  eta: 0:02:20  iter: 1619  total_loss: 2.177  loss_cls: 0.944  loss_box_reg: 0.664  loss_mask: 0.434  loss_rpn_cls: 0.007  loss_rpn_loc: 0.008  time: 0.3627  data_time: 0.0081  lr: 0.000250  max_mem: 2723M
[11/08 07:30:31 d2.utils.events]:  eta: 0:02:13  iter: 1639  total_loss: 1.974  loss_cls: 0.903  loss_box_reg: 0.652  loss_mask: 0.281  loss_rpn_cls: 0.010  loss_rpn_loc: 0.009  time: 0.3628  data_time: 0.0097  lr: 0.000250  max_mem: 2723M
[11/08 07:30:38 d2.utils.events]:  eta: 0:02:06  iter: 1659  total_loss: 2.110  loss_cls: 1.015  loss_box_reg: 0.719  loss_mask: 0.359  loss_rpn_cls: 0.010  loss_rpn_loc: 0.011  time: 0.3630  data_time: 0.0181  lr: 0.000250  max_mem: 2723M
[11/08 07:30:45 d2.utils.events]:  eta: 0:01:58  iter: 1679  total_loss: 2.212  loss_cls: 1.013  loss_box_reg: 0.654  loss_mask: 0.343  loss_rpn_cls: 0.010  loss_rpn_loc: 0.010  time: 0.3630  data_time: 0.0095  lr: 0.000250  max_mem: 2723M
[11/08 07:30:53 d2.utils.events]:  eta: 0:01:51  iter: 1699  total_loss: 1.787  loss_cls: 0.833  loss_box_reg: 0.561  loss_mask: 0.299  loss_rpn_cls: 0.007  loss_rpn_loc: 0.012  time: 0.3630  data_time: 0.0050  lr: 0.000250  max_mem: 2723M
[11/08 07:31:00 d2.utils.events]:  eta: 0:01:43  iter: 1719  total_loss: 2.136  loss_cls: 1.048  loss_box_reg: 0.685  loss_mask: 0.322  loss_rpn_cls: 0.008  loss_rpn_loc: 0.006  time: 0.3631  data_time: 0.0072  lr: 0.000250  max_mem: 2723M
[11/08 07:31:08 d2.utils.events]:  eta: 0:01:36  iter: 1739  total_loss: 2.008  loss_cls: 0.941  loss_box_reg: 0.610  loss_mask: 0.396  loss_rpn_cls: 0.009  loss_rpn_loc: 0.009  time: 0.3632  data_time: 0.0071  lr: 0.000250  max_mem: 2723M
[11/08 07:31:15 d2.utils.events]:  eta: 0:01:29  iter: 1759  total_loss: 1.915  loss_cls: 1.006  loss_box_reg: 0.649  loss_mask: 0.257  loss_rpn_cls: 0.008  loss_rpn_loc: 0.006  time: 0.3632  data_time: 0.0059  lr: 0.000250  max_mem: 2723M
[11/08 07:31:22 d2.utils.events]:  eta: 0:01:21  iter: 1779  total_loss: 1.681  loss_cls: 0.764  loss_box_reg: 0.559  loss_mask: 0.329  loss_rpn_cls: 0.006  loss_rpn_loc: 0.008  time: 0.3633  data_time: 0.0057  lr: 0.000250  max_mem: 2723M
[11/08 07:31:30 d2.utils.events]:  eta: 0:01:14  iter: 1799  total_loss: 1.931  loss_cls: 0.909  loss_box_reg: 0.605  loss_mask: 0.333  loss_rpn_cls: 0.006  loss_rpn_loc: 0.008  time: 0.3633  data_time: 0.0080  lr: 0.000250  max_mem: 2723M
[11/08 07:31:37 d2.utils.events]:  eta: 0:01:06  iter: 1819  total_loss: 2.299  loss_cls: 1.019  loss_box_reg: 0.654  loss_mask: 0.436  loss_rpn_cls: 0.005  loss_rpn_loc: 0.010  time: 0.3633  data_time: 0.0059  lr: 0.000250  max_mem: 2723M
[11/08 07:31:45 d2.utils.events]:  eta: 0:00:59  iter: 1839  total_loss: 2.160  loss_cls: 1.048  loss_box_reg: 0.697  loss_mask: 0.341  loss_rpn_cls: 0.008  loss_rpn_loc: 0.007  time: 0.3636  data_time: 0.0078  lr: 0.000250  max_mem: 2723M
[11/08 07:31:52 d2.utils.events]:  eta: 0:00:52  iter: 1859  total_loss: 2.017  loss_cls: 0.948  loss_box_reg: 0.682  loss_mask: 0.369  loss_rpn_cls: 0.007  loss_rpn_loc: 0.010  time: 0.3637  data_time: 0.0072  lr: 0.000250  max_mem: 2723M
[11/08 07:32:00 d2.utils.events]:  eta: 0:00:44  iter: 1879  total_loss: 2.056  loss_cls: 0.852  loss_box_reg: 0.638  loss_mask: 0.497  loss_rpn_cls: 0.006  loss_rpn_loc: 0.006  time: 0.3637  data_time: 0.0074  lr: 0.000250  max_mem: 2723M
[11/08 07:32:07 d2.utils.events]:  eta: 0:00:37  iter: 1899  total_loss: 1.862  loss_cls: 0.991  loss_box_reg: 0.677  loss_mask: 0.304  loss_rpn_cls: 0.008  loss_rpn_loc: 0.008  time: 0.3639  data_time: 0.0082  lr: 0.000250  max_mem: 2723M
[11/08 07:32:15 d2.utils.events]:  eta: 0:00:29  iter: 1919  total_loss: 2.036  loss_cls: 0.975  loss_box_reg: 0.632  loss_mask: 0.312  loss_rpn_cls: 0.007  loss_rpn_loc: 0.007  time: 0.3640  data_time: 0.0062  lr: 0.000250  max_mem: 2723M
[11/08 07:32:22 d2.utils.events]:  eta: 0:00:22  iter: 1939  total_loss: 2.010  loss_cls: 0.938  loss_box_reg: 0.592  loss_mask: 0.423  loss_rpn_cls: 0.006  loss_rpn_loc: 0.008  time: 0.3641  data_time: 0.0077  lr: 0.000250  max_mem: 2723M
[11/08 07:32:30 d2.utils.events]:  eta: 0:00:15  iter: 1959  total_loss: 2.156  loss_cls: 1.077  loss_box_reg: 0.717  loss_mask: 0.333  loss_rpn_cls: 0.012  loss_rpn_loc: 0.010  time: 0.3642  data_time: 0.0066  lr: 0.000250  max_mem: 2723M
[11/08 07:32:37 d2.utils.events]:  eta: 0:00:07  iter: 1979  total_loss: 2.120  loss_cls: 1.032  loss_box_reg: 0.720  loss_mask: 0.423  loss_rpn_cls: 0.009  loss_rpn_loc: 0.009  time: 0.3643  data_time: 0.0065  lr: 0.000250  max_mem: 2723M
[11/08 07:32:46 d2.utils.events]:  eta: 0:00:00  iter: 1999  total_loss: 2.235  loss_cls: 0.940  loss_box_reg: 0.633  loss_mask: 0.410  loss_rpn_cls: 0.010  loss_rpn_loc: 0.012  time: 0.3643  data_time: 0.0071  lr: 0.000250  max_mem: 2723M
[11/08 07:32:46 d2.engine.hooks]: Overall training speed: 1997 iterations in 0:12:07 (0.3645 s / it)
[11/08 07:32:46 d2.engine.hooks]: Total training time: 0:12:10 (0:00:03 on hooks)

Evaluating the model 🧪

After training is done, we will evaluate our model to see how it's performing in the unseen images!

In [ ]:
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.01

cfg.MODEL.WEIGHTS = os.path.join(cfg.OUTPUT_DIR, "model_final.pth")

evaluator = COCOEvaluator("validation_dataset", cfg, False, output_dir=cfg.OUTPUT_DIR)
val_loader = build_detection_test_loader(cfg, "validation_dataset")
valResults = inference_on_dataset(trainer.model, val_loader, evaluator)
WARNING [10/17 07:39:37 d2.data.datasets.coco]: 
Category ids in annotations are not in [1, #categories]! We'll apply a mapping for you.

[10/17 07:39:37 d2.data.datasets.coco]: Loaded 1269 images in COCO format from /content/val/new_ann.json
[10/17 07:39:37 d2.data.common]: Serializing 1269 elements to byte tensors and concatenating them all ...
[10/17 07:39:37 d2.data.common]: Serialized dataset takes 3.59 MiB
[10/17 07:39:37 d2.evaluation.evaluator]: Start inference on 1269 images
[10/17 07:39:38 d2.evaluation.evaluator]: Inference done 11/1269. 0.0565 s / img. ETA=0:01:16
[10/17 07:39:43 d2.evaluation.evaluator]: Inference done 90/1269. 0.0572 s / img. ETA=0:01:15
[10/17 07:39:48 d2.evaluation.evaluator]: Inference done 164/1269. 0.0579 s / img. ETA=0:01:12
[10/17 07:39:53 d2.evaluation.evaluator]: Inference done 237/1269. 0.0580 s / img. ETA=0:01:09
[10/17 07:39:58 d2.evaluation.evaluator]: Inference done 304/1269. 0.0581 s / img. ETA=0:01:06
[10/17 07:40:04 d2.evaluation.evaluator]: Inference done 377/1269. 0.0582 s / img. ETA=0:01:03
[10/17 07:40:09 d2.evaluation.evaluator]: Inference done 448/1269. 0.0581 s / img. ETA=0:00:58
[10/17 07:40:16 d2.evaluation.evaluator]: Inference done 524/1269. 0.0583 s / img. ETA=0:00:55
/usr/local/lib/python3.6/dist-packages/PIL/TiffImagePlugin.py:590: UserWarning:

Metadata Warning, tag 282 had too many entries: 2, expected 1

/usr/local/lib/python3.6/dist-packages/PIL/TiffImagePlugin.py:590: UserWarning:

Metadata Warning, tag 283 had too many entries: 2, expected 1

/usr/local/lib/python3.6/dist-packages/PIL/TiffImagePlugin.py:590: UserWarning:

Metadata Warning, tag 34853 had too many entries: 9, expected 1

[10/17 07:40:21 d2.evaluation.evaluator]: Inference done 579/1269. 0.0584 s / img. ETA=0:00:52
[10/17 07:40:26 d2.evaluation.evaluator]: Inference done 652/1269. 0.0585 s / img. ETA=0:00:46
[10/17 07:40:31 d2.evaluation.evaluator]: Inference done 727/1269. 0.0585 s / img. ETA=0:00:40
[10/17 07:40:36 d2.evaluation.evaluator]: Inference done 802/1269. 0.0585 s / img. ETA=0:00:34
[10/17 07:40:41 d2.evaluation.evaluator]: Inference done 880/1269. 0.0585 s / img. ETA=0:00:28
[10/17 07:40:46 d2.evaluation.evaluator]: Inference done 954/1269. 0.0585 s / img. ETA=0:00:22
[10/17 07:40:51 d2.evaluation.evaluator]: Inference done 1034/1269. 0.0584 s / img. ETA=0:00:16
[10/17 07:40:56 d2.evaluation.evaluator]: Inference done 1107/1269. 0.0585 s / img. ETA=0:00:11
[10/17 07:41:01 d2.evaluation.evaluator]: Inference done 1185/1269. 0.0584 s / img. ETA=0:00:05
[10/17 07:41:06 d2.evaluation.evaluator]: Inference done 1268/1269. 0.0583 s / img. ETA=0:00:00
[10/17 07:41:07 d2.evaluation.evaluator]: Total inference time: 0:01:29.038839 (0.070442 s / img per device, on 1 devices)
[10/17 07:41:07 d2.evaluation.evaluator]: Total inference pure compute time: 0:01:13 (0.058350 s / img per device, on 1 devices)
[10/17 07:41:07 d2.evaluation.coco_evaluation]: Preparing results for COCO format ...
[10/17 07:41:07 d2.evaluation.coco_evaluation]: Saving results to /content/logs/coco_instances_results.json
[10/17 07:41:07 d2.evaluation.coco_evaluation]: Evaluating predictions ...
Loading and preparing results...
DONE (t=0.01s)
creating index...
index created!
Running per image evaluation...
Evaluate annotation type *bbox*
DONE (t=4.62s).
Accumulating evaluation results...
DONE (t=1.18s).
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.002
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.001
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.001
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.002
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.001
[10/17 07:41:13 d2.evaluation.coco_evaluation]: Evaluation results for bbox: 
|  AP   |  AP50  |  AP75  |  APs  |  APm  |  APl  |
|:-----:|:------:|:------:|:-----:|:-----:|:-----:|
| 0.010 | 0.043  | 0.000  | 0.000 | 0.222 | 0.010 |
[10/17 07:41:13 d2.evaluation.coco_evaluation]: Per-category bbox AP: 
| category                                                | AP    | category                                            | AP    | category                                             | AP    |
|:--------------------------------------------------------|:------|:----------------------------------------------------|:------|:-----------------------------------------------------|:------|
| beetroot-steamed-without-addition-of-salt               | 0.000 | green-bean-steamed-without-addition-of-salt         | 0.000 | watermelon-fresh                                     | 0.000 |
| pizza-with-ham-baked                                    | 0.000 | pizza-with-vegetables-baked                         | 0.000 | applesauce-unsweetened-canned                        | 0.000 |
| quiche-with-cheese-baked-with-puff-pastry               | 0.000 | sweet-potato                                        | 0.000 | country-fries                                        | 0.000 |
| potato-gnocchi                                          | 0.000 | potatoes-steamed                                    | 0.000 | chips-french-fries                                   | 0.000 |
| rosti                                                   | 0.000 | vegetable-mix-peas-and-carrots                      | 0.000 | mixed-vegetables                                     | 0.000 |
| ratatouille                                             | 0.000 | mixed-salad-chopped-without-sauce                   | 0.000 | leaf-spinach                                         | 0.000 |
| witloof-chicory                                         | 0.000 | salad-rocket                                        | 0.000 | salad-leaf-salad-green                               | 0.000 |
| salad-lambs-ear                                         | 0.000 | artichoke                                           | 0.000 | eggplant                                             | 0.000 |
| avocado                                                 | 0.000 | french-beans                                        | 0.000 | pickle                                               | 0.000 |
| cucumber                                                | 0.000 | pumpkin                                             | 0.000 | sweet-pepper                                         | 0.000 |
| tomato                                                  | 0.000 | zucchini                                            | 0.000 | red-radish                                           | 0.000 |
| beetroot-raw                                            | 0.000 | carrot                                              | 0.000 | celeriac                                             | 0.000 |
| cauliflower                                             | 0.000 | broccoli                                            | 0.000 | kolhrabi                                             | 0.000 |
| red-cabbage                                             | 0.000 | white-cabbage                                       | 0.000 | mushroom                                             | 0.000 |
| mushrooms                                               | 0.000 | peas                                                | 0.000 | corn                                                 | 0.000 |
| leek                                                    | 0.000 | onion                                               | 0.000 | fennel                                               | 0.000 |
| green-asparagus                                         | 0.000 | white-asparagus                                     | nan   | alfa-sprouts                                         | 0.000 |
| beans-kidney                                            | 0.000 | chickpeas                                           | 0.000 | lentils                                              | 0.000 |
| pineapple                                               | 0.000 | apple                                               | 0.000 | pomegranate                                          | 0.000 |
| apricots                                                | 0.000 | banana                                              | 0.000 | berries                                              | 0.000 |
| pear                                                    | 0.000 | dates                                               | 0.000 | strawberries                                         | 0.000 |
| fruit-salad                                             | 0.000 | blueberries                                         | 0.000 | raspberries                                          | nan   |
| kiwi                                                    | 0.000 | mandarine                                           | 0.000 | mango                                                | 0.000 |
| sugar-melon                                             | 0.000 | nectarine                                           | 0.000 | orange                                               | 0.000 |
| peach                                                   | 0.000 | plums                                               | 0.000 | grapes                                               | 0.000 |
| dried-raisins                                           | 0.000 | lemon                                               | 0.000 | peanut-butter                                        | 0.000 |
| mixed-seeds                                             | 0.000 | almonds                                             | 0.000 | walnut                                               | 0.000 |
| cashew-nut                                              | 0.000 | peanut                                              | 0.000 | hazelnut                                             | 0.000 |
| mixed-nuts                                              | 0.000 | pistachio                                           | 0.000 | sesame-seeds                                         | 0.000 |
| green-olives                                            | 0.000 | black-olives                                        | 0.000 | milk                                                 | 0.000 |
| kefir-drink                                             | 0.000 | cottage-cheese                                      | 0.000 | blue-mould-cheese                                    | 0.000 |
| feta                                                    | 0.000 | fresh-cheese                                        | 0.000 | gruya-re                                             | 0.000 |
| semi-hard-cheese                                        | 0.000 | hard-cheese                                         | 0.000 | cheese                                               | 0.000 |
| mozzarella                                              | 0.000 | parmesan                                            | 0.000 | cheese-for-raclette                                  | 0.000 |
| cream-cheese                                            | 0.000 | tomme                                               | 0.000 | soft-cheese                                          | 0.000 |
| tiramisu                                                | nan   | cream                                               | nan   | sour-cream                                           | 0.000 |
| thickened-cream-35                                      | 0.000 | dairy-ice-cream                                     | 0.000 | flakes-oat                                           | 0.000 |
| rice-noodles-vermicelli                                 | 0.000 | couscous                                            | 0.000 | grits-polenta-maize-flour                            | 0.000 |
| quinoa                                                  | 0.000 | rice                                                | 0.000 | rice-basmati                                         | 0.000 |
| rice-whole-grain                                        | 0.000 | spaetzle                                            | nan   | pasta                                                | 0.000 |
| pasta-haprnli                                           | 0.000 | pasta-linguini-parpadelle-tagliatelle               | 0.000 | pasta-noodles                                        | 0.000 |
| pasta-penne                                             | 0.000 | pasta-spaghetti                                     | 0.000 | pasta-twist                                          | 0.000 |
| pasta-wholemeal                                         | 0.000 | bread-french-white-flour                            | 0.000 | bread                                                | 0.000 |
| bread-5-grain                                           | 0.000 | bread-fruit                                         | 0.000 | bread-half-white                                     | 0.000 |
| bread-grain                                             | 0.000 | bread-nut                                           | 0.000 | bread-pita                                           | 0.000 |
| bread-rye                                               | 0.000 | bread-whole-wheat                                   | 0.000 | bread-sourdough                                      | 0.000 |
| bread-black                                             | 0.000 | bread-toast                                         | 0.000 | bread-wholemeal-toast                                | 0.000 |
| bread-wholemeal                                         | 0.000 | bread-white                                         | 0.000 | brioche                                              | 0.000 |
| roll-of-half-white-or-white-flour-with-large-void       | 0.000 | roll-with-pieces-of-chocolate                       | 0.000 | focaccia                                             | 0.000 |
| croissant                                               | 0.000 | braided-white-loaf                                  | 0.000 | breadcrumbs-unspiced                                 | 0.000 |
| rusk-wholemeal                                          | 0.000 | crunch-ma1-4esli                                    | 0.000 | ma1-4esli                                            | 0.000 |
| beef                                                    | 0.000 | beef-sirloin-steak                                  | 0.000 | beef-filet                                           | 0.000 |
| beef-minced-only-meat                                   | 0.000 | beef-cut-into-stripes-only-meat                     | 0.000 | pork                                                 | 0.000 |
| chicken                                                 | 0.000 | chicken-breast                                      | 0.000 | chicken-cut-into-stripes-only-meat                   | 0.000 |
| chicken-leg                                             | 0.000 | frying-sausage                                      | 0.000 | dried-meat                                           | 0.000 |
| veal-sausage                                            | 0.000 | salami                                              | 0.000 | ham-cooked                                           | 0.000 |
| ham-raw                                                 | 0.000 | bacon-frying                                        | 0.000 | bacon-cooking                                        | 0.000 |
| meat-terrine-pata-c                                     | 0.000 | sausage                                             | 0.000 | veggie-burger                                        | 0.000 |
| tofu                                                    | 0.000 | fish                                                | 0.000 | salmon                                               | 0.000 |
| tuna                                                    | 0.000 | shrimp-prawn-small                                  | 0.000 | shrimp-prawn-large                                   | 0.000 |
| egg                                                     | 0.000 | butter                                              | 0.000 | praline                                              | 0.000 |
| jam                                                     | 0.000 | honey                                               | 0.000 | dark-chocolate                                       | 0.000 |
| milk-chocolate                                          | 0.000 | chocolate                                           | 0.000 | hazelnut-chocolate-spread-nutella-ovomaltine-caotina | nan   |
| apple-pie                                               | 0.000 | brownie                                             | 0.000 | craape-plain                                         | 0.000 |
| fruit-tart                                              | 0.000 | cake-chocolate                                      | 0.000 | omelette-plain                                       | nan   |
| tart                                                    | 0.000 | croissant-with-chocolate-filling                    | 0.000 | cookies                                              | 0.000 |
| biscuits                                                | 0.000 | chocolate-cookies                                   | 0.000 | juice-apple                                          | 0.000 |
| juice-orange                                            | 0.000 | ice-tea                                             | 0.000 | syrup-diluted-ready-to-drink                         | 0.000 |
| tea                                                     | 0.000 | cappuccino                                          | 0.000 | espresso-with-caffeine                               | 0.000 |
| coffee-with-caffeine                                    | 0.000 | white-coffee-with-caffeine                          | 0.000 | ristretto-with-caffeine                              | 0.000 |
| tea-green                                               | 0.000 | tea-black                                           | 0.000 | tea-verveine                                         | 0.000 |
| herbal-tea                                              | 0.000 | tea-peppermint                                      | 0.000 | water                                                | 2.566 |
| water-mineral                                           | 0.000 | wine-rosa-c                                         | 0.000 | wine-red                                             | 0.000 |
| wine-white                                              | 0.000 | beer                                                | 0.000 | sauce-savoury                                        | 0.000 |
| sauce-roast                                             | 0.000 | sauce-pesto                                         | 0.000 | sauce-mushroom                                       | 0.000 |
| sauce-cream                                             | 0.000 | ketchup                                             | 0.000 | bolognaise-sauce                                     | 0.000 |
| tomato-sauce                                            | 0.000 | salad-dressing                                      | 0.000 | balsamic-salad-dressing                              | 0.000 |
| french-salad-dressing                                   | 0.000 | oil-vinegar-salad-dressing                          | 0.000 | guacamole                                            | 0.000 |
| mayonnaise                                              | 0.000 | sauce-soya                                          | 0.000 | soup-vegetable                                       | 0.000 |
| soup-pumpkin                                            | 0.000 | falafel-balls                                       | 0.000 | savoury-puff-pastry                                  | 0.000 |
| corn-crisps                                             | 0.000 | crisps                                              | 0.000 | ham-croissant                                        | 0.000 |
| salt-cake-vegetables-filled                             | 0.000 | hamburger-bread-meat-ketchup                        | 0.000 | lasagne-meat-prepared                                | 0.000 |
| mashed-potatoes-prepared-with-full-fat-milk-with-butter | 0.000 | pizza-margherita-baked                              | 0.000 | sushi                                                | 0.000 |
| pancakes                                                | 0.000 | hummus                                              | 0.000 | greek-salad                                          | 0.000 |
| chocolate-mousse                                        | nan   | caprese-salad-tomato-mozzarella                     | 0.000 | taboula-c-prepared-with-couscous                     | 0.000 |
| risotto-without-cheese-cooked                           | 0.000 | salmon-smoked                                       | 0.000 | egg-scrambled-prepared                               | 0.000 |
| boisson-au-glucose-50g                                  | 0.000 | chicken-curry-cream-coconut-milk-curry-spices-paste | 0.000 | potatoes-au-gratin-dauphinois-prepared               | 0.000 |
| bircherma1-4esli-prepared-no-sugar-added                | 0.000 | fajita-bread-only                                   | 0.000 | butter-spread-puree-almond                           | nan   |
| water-with-lemon-juice                                  | 0.000 | gluten-free-bread                                   | 0.000 | fruit-coulis                                         | 0.000 |
| greek-yaourt-yahourt-yogourt-ou-yoghourt                | 0.000 | soup-of-lentils-dahl-dhal                           | 0.000 | vegetable-au-gratin-baked                            | 0.000 |
| curry-vegetarian                                        | 0.000 | yaourt-yahourt-yogourt-ou-yoghourt-natural          | 0.000 | goat-cheese-soft                                     | 0.000 |
Loading and preparing results...
DONE (t=0.13s)
creating index...
index created!
Running per image evaluation...
Evaluate annotation type *segm*
DONE (t=4.33s).
Accumulating evaluation results...
DONE (t=1.21s).
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.001
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.002
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.002
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.002
[10/17 07:41:18 d2.evaluation.coco_evaluation]: Evaluation results for segm: 
|  AP   |  AP50  |  AP75  |  APs  |  APm  |  APl  |
|:-----:|:------:|:------:|:-----:|:-----:|:-----:|
| 0.020 | 0.072  | 0.006  | 0.000 | 0.000 | 0.021 |
[10/17 07:41:18 d2.evaluation.coco_evaluation]: Per-category segm AP: 
| category                                                | AP    | category                                            | AP    | category                                             | AP    |
|:--------------------------------------------------------|:------|:----------------------------------------------------|:------|:-----------------------------------------------------|:------|
| beetroot-steamed-without-addition-of-salt               | 0.000 | green-bean-steamed-without-addition-of-salt         | 0.000 | watermelon-fresh                                     | 0.000 |
| pizza-with-ham-baked                                    | 0.000 | pizza-with-vegetables-baked                         | 0.000 | applesauce-unsweetened-canned                        | 0.000 |
| quiche-with-cheese-baked-with-puff-pastry               | 0.000 | sweet-potato                                        | 0.000 | country-fries                                        | 0.000 |
| potato-gnocchi                                          | 0.000 | potatoes-steamed                                    | 0.000 | chips-french-fries                                   | 0.000 |
| rosti                                                   | 0.000 | vegetable-mix-peas-and-carrots                      | 0.000 | mixed-vegetables                                     | 0.000 |
| ratatouille                                             | 0.000 | mixed-salad-chopped-without-sauce                   | 0.000 | leaf-spinach                                         | 0.000 |
| witloof-chicory                                         | 0.000 | salad-rocket                                        | 0.000 | salad-leaf-salad-green                               | 0.000 |
| salad-lambs-ear                                         | 0.000 | artichoke                                           | 0.000 | eggplant                                             | 0.000 |
| avocado                                                 | 0.000 | french-beans                                        | 0.000 | pickle                                               | 0.000 |
| cucumber                                                | 0.000 | pumpkin                                             | 0.000 | sweet-pepper                                         | 0.000 |
| tomato                                                  | 0.000 | zucchini                                            | 0.000 | red-radish                                           | 0.000 |
| beetroot-raw                                            | 0.000 | carrot                                              | 0.000 | celeriac                                             | 0.000 |
| cauliflower                                             | 0.000 | broccoli                                            | 0.000 | kolhrabi                                             | 0.000 |
| red-cabbage                                             | 0.000 | white-cabbage                                       | 0.000 | mushroom                                             | 0.000 |
| mushrooms                                               | 0.000 | peas                                                | 0.000 | corn                                                 | 0.000 |
| leek                                                    | 0.000 | onion                                               | 0.000 | fennel                                               | 0.000 |
| green-asparagus                                         | 0.000 | white-asparagus                                     | nan   | alfa-sprouts                                         | 0.000 |
| beans-kidney                                            | 0.000 | chickpeas                                           | 0.000 | lentils                                              | 0.000 |
| pineapple                                               | 0.000 | apple                                               | 0.000 | pomegranate                                          | 0.000 |
| apricots                                                | 0.000 | banana                                              | 0.000 | berries                                              | 0.000 |
| pear                                                    | 0.000 | dates                                               | 0.000 | strawberries                                         | 0.000 |
| fruit-salad                                             | 0.000 | blueberries                                         | 0.000 | raspberries                                          | nan   |
| kiwi                                                    | 0.000 | mandarine                                           | 0.000 | mango                                                | 0.000 |
| sugar-melon                                             | 0.000 | nectarine                                           | 0.000 | orange                                               | 0.000 |
| peach                                                   | 0.000 | plums                                               | 0.000 | grapes                                               | 0.000 |
| dried-raisins                                           | 0.000 | lemon                                               | 0.000 | peanut-butter                                        | 0.000 |
| mixed-seeds                                             | 0.000 | almonds                                             | 0.000 | walnut                                               | 0.000 |
| cashew-nut                                              | 0.000 | peanut                                              | 0.000 | hazelnut                                             | 0.000 |
| mixed-nuts                                              | 0.000 | pistachio                                           | 0.000 | sesame-seeds                                         | 0.000 |
| green-olives                                            | 0.000 | black-olives                                        | 0.000 | milk                                                 | 0.000 |
| kefir-drink                                             | 0.000 | cottage-cheese                                      | 0.000 | blue-mould-cheese                                    | 0.000 |
| feta                                                    | 0.000 | fresh-cheese                                        | 0.000 | gruya-re                                             | 0.000 |
| semi-hard-cheese                                        | 0.000 | hard-cheese                                         | 0.000 | cheese                                               | 0.000 |
| mozzarella                                              | 0.000 | parmesan                                            | 0.000 | cheese-for-raclette                                  | 0.000 |
| cream-cheese                                            | 0.000 | tomme                                               | 0.000 | soft-cheese                                          | 0.000 |
| tiramisu                                                | nan   | cream                                               | nan   | sour-cream                                           | 0.000 |
| thickened-cream-35                                      | 0.000 | dairy-ice-cream                                     | 0.000 | flakes-oat                                           | 0.000 |
| rice-noodles-vermicelli                                 | 0.000 | couscous                                            | 0.000 | grits-polenta-maize-flour                            | 0.000 |
| quinoa                                                  | 0.000 | rice                                                | 0.000 | rice-basmati                                         | 0.000 |
| rice-whole-grain                                        | 0.000 | spaetzle                                            | nan   | pasta                                                | 0.000 |
| pasta-haprnli                                           | 0.000 | pasta-linguini-parpadelle-tagliatelle               | 0.000 | pasta-noodles                                        | 0.000 |
| pasta-penne                                             | 0.000 | pasta-spaghetti                                     | 0.000 | pasta-twist                                          | 0.000 |
| pasta-wholemeal                                         | 0.000 | bread-french-white-flour                            | 0.000 | bread                                                | 0.000 |
| bread-5-grain                                           | 0.000 | bread-fruit                                         | 0.000 | bread-half-white                                     | 0.000 |
| bread-grain                                             | 0.000 | bread-nut                                           | 0.000 | bread-pita                                           | 0.000 |
| bread-rye                                               | 0.000 | bread-whole-wheat                                   | 0.000 | bread-sourdough                                      | 0.000 |
| bread-black                                             | 0.000 | bread-toast                                         | 0.000 | bread-wholemeal-toast                                | 0.000 |
| bread-wholemeal                                         | 0.000 | bread-white                                         | 0.000 | brioche                                              | 0.000 |
| roll-of-half-white-or-white-flour-with-large-void       | 0.000 | roll-with-pieces-of-chocolate                       | 0.000 | focaccia                                             | 0.000 |
| croissant                                               | 0.000 | braided-white-loaf                                  | 0.000 | breadcrumbs-unspiced                                 | 0.000 |
| rusk-wholemeal                                          | 0.000 | crunch-ma1-4esli                                    | 0.000 | ma1-4esli                                            | 0.000 |
| beef                                                    | 0.000 | beef-sirloin-steak                                  | 0.000 | beef-filet                                           | 0.000 |
| beef-minced-only-meat                                   | 0.000 | beef-cut-into-stripes-only-meat                     | 0.000 | pork                                                 | 0.000 |
| chicken                                                 | 0.000 | chicken-breast                                      | 0.000 | chicken-cut-into-stripes-only-meat                   | 0.000 |
| chicken-leg                                             | 0.000 | frying-sausage                                      | 0.000 | dried-meat                                           | 0.000 |
| veal-sausage                                            | 0.000 | salami                                              | 0.000 | ham-cooked                                           | 0.000 |
| ham-raw                                                 | 0.000 | bacon-frying                                        | 0.000 | bacon-cooking                                        | 0.000 |
| meat-terrine-pata-c                                     | 0.000 | sausage                                             | 0.000 | veggie-burger                                        | 0.000 |
| tofu                                                    | 0.000 | fish                                                | 0.000 | salmon                                               | 0.000 |
| tuna                                                    | 0.000 | shrimp-prawn-small                                  | 0.000 | shrimp-prawn-large                                   | 0.000 |
| egg                                                     | 0.000 | butter                                              | 0.000 | praline                                              | 0.000 |
| jam                                                     | 0.000 | honey                                               | 0.000 | dark-chocolate                                       | 0.000 |
| milk-chocolate                                          | 0.000 | chocolate                                           | 0.000 | hazelnut-chocolate-spread-nutella-ovomaltine-caotina | nan   |
| apple-pie                                               | 0.000 | brownie                                             | 0.000 | craape-plain                                         | 0.000 |
| fruit-tart                                              | 0.000 | cake-chocolate                                      | 0.000 | omelette-plain                                       | nan   |
| tart                                                    | 0.000 | croissant-with-chocolate-filling                    | 0.000 | cookies                                              | 0.000 |
| biscuits                                                | 0.000 | chocolate-cookies                                   | 0.000 | juice-apple                                          | 0.000 |
| juice-orange                                            | 0.000 | ice-tea                                             | 0.000 | syrup-diluted-ready-to-drink                         | 0.000 |
| tea                                                     | 0.000 | cappuccino                                          | 0.000 | espresso-with-caffeine                               | 0.000 |
| coffee-with-caffeine                                    | 0.000 | white-coffee-with-caffeine                          | 0.000 | ristretto-with-caffeine                              | 0.000 |
| tea-green                                               | 0.000 | tea-black                                           | 0.000 | tea-verveine                                         | 0.000 |
| herbal-tea                                              | 0.000 | tea-peppermint                                      | 0.000 | water                                                | 5.288 |
| water-mineral                                           | 0.000 | wine-rosa-c                                         | 0.000 | wine-red                                             | 0.000 |
| wine-white                                              | 0.000 | beer                                                | 0.000 | sauce-savoury                                        | 0.000 |
| sauce-roast                                             | 0.000 | sauce-pesto                                         | 0.000 | sauce-mushroom                                       | 0.000 |
| sauce-cream                                             | 0.000 | ketchup                                             | 0.000 | bolognaise-sauce                                     | 0.000 |
| tomato-sauce                                            | 0.000 | salad-dressing                                      | 0.000 | balsamic-salad-dressing                              | 0.000 |
| french-salad-dressing                                   | 0.000 | oil-vinegar-salad-dressing                          | 0.000 | guacamole                                            | 0.000 |
| mayonnaise                                              | 0.000 | sauce-soya                                          | 0.000 | soup-vegetable                                       | 0.000 |
| soup-pumpkin                                            | 0.000 | falafel-balls                                       | 0.000 | savoury-puff-pastry                                  | 0.000 |
| corn-crisps                                             | 0.000 | crisps                                              | 0.000 | ham-croissant                                        | 0.000 |
| salt-cake-vegetables-filled                             | 0.000 | hamburger-bread-meat-ketchup                        | 0.000 | lasagne-meat-prepared                                | 0.000 |
| mashed-potatoes-prepared-with-full-fat-milk-with-butter | 0.000 | pizza-margherita-baked                              | 0.000 | sushi                                                | 0.000 |
| pancakes                                                | 0.000 | hummus                                              | 0.000 | greek-salad                                          | 0.000 |
| chocolate-mousse                                        | nan   | caprese-salad-tomato-mozzarella                     | 0.000 | taboula-c-prepared-with-couscous                     | 0.000 |
| risotto-without-cheese-cooked                           | 0.000 | salmon-smoked                                       | 0.000 | egg-scrambled-prepared                               | 0.000 |
| boisson-au-glucose-50g                                  | 0.000 | chicken-curry-cream-coconut-milk-curry-spices-paste | 0.000 | potatoes-au-gratin-dauphinois-prepared               | 0.000 |
| bircherma1-4esli-prepared-no-sugar-added                | 0.000 | fajita-bread-only                                   | 0.000 | butter-spread-puree-almond                           | nan   |
| water-with-lemon-juice                                  | 0.000 | gluten-free-bread                                   | 0.000 | fruit-coulis                                         | 0.000 |
| greek-yaourt-yahourt-yogourt-ou-yoghourt                | 0.000 | soup-of-lentils-dahl-dhal                           | 0.000 | vegetable-au-gratin-baked                            | 0.000 |
| curry-vegetarian                                        | 0.000 | yaourt-yahourt-yogourt-ou-yoghourt-natural          | 0.000 | goat-cheese-soft                                     | 0.000 |
In [ ]:
cfg.MODEL.WEIGHTS = os.path.join(cfg.OUTPUT_DIR, "model_final.pth")
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.01
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 273

cfg.DATASETS.TEST = ("validation_dataset", )
predictor = DefaultPredictor(cfg)
In [ ]:
val_metadata = MetadataCatalog.get("val_dataset")

im = cv2.imread("val/images/084035.jpg")

outputs = predictor(im)

v = Visualizer(im[:, :, ::-1],
                   metadata=val_metadata, 
                   scale=2, 
               

                   instance_mode=ColorMode.IMAGE_BW
    )

out = v.draw_instance_predictions(outputs["instances"].to("cpu"))
cv2_imshow(out.get_image()[:, :, ::-1])
Output hidden; open in https://colab.research.google.com to view.
In [ ]:
trainer.resume_or_load(resume=True)

Making Submission on AICrowd

Step 0 : Fork the baseline to make your own changes to it. Go to settings and make the repo private.

Step 1 : Setting up SSH to login locally to Gitlab

  1. Run ssh-keygen -t ecdsa -b 521
  2. Run cat ~./ssh/id_ecdsa.pub and copy the output
  3. Go to Gitlab SSH Keys and then paste the output inside the key and use whaever title you like.

Step 2: Clone Repo & Add Models & Push Changes

  1. Run git clone git@gitlab.aicrowd.com:[Your Username]/food_recognition_detectron2_baseline.git
  2. Put your model inside the data directioary and then run git-lfs track "data/"
  3. Run git stage . then git commit -m " adding model"
  4. Run git push origin master

Step 3. Create Submission

  1. Go to the repo and then tags and then New Tag.
  2. In the tag name, use submission-v1, ( Everytime you make a new submission, just increase the no. like - submission-v2, submission-v3 )
  3. A new issue will be created with showing the process. Enjoy!

If you do not have SSH Keys, Check this link

Add your SSH Keys to your GitLab account by following the instructions here

In [ ]:
aicrowd_submission = {
    "author": "Shubham",
    "username": "Shubhamai",
    "description": "detectron2 trial 2",
    "model_path": "logs/model_final.pth",
    "model_type": "model_zoo",
    "model_config_file": "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml",
    "detectron_model_config": {
      "ROI_HEADS": {
        "SCORE_THRESH_TEST": 0.3,
        "NUM_CLASSES": 273
      }
    }
}

aicrowd_submission["description"] = aicrowd_submission["description"].replace(" ", "-")
with open("aicrowd.json", "w") as fp:
  json.dump(aicrowd_submission, fp)

Submit to AIcrowd

Note: We will create an SSH key on your google drive. This key will be used to identify you on gitlab.aicrowd.com.

In [ ]:
from pycocotools.coco import COCO
import json

coco_api = COCO(TRAIN_ANNOTATIONS_PATH)

category_ids = sorted(coco_api.getCatIds())
categories = coco_api.loadCats(category_ids)

class_to_category = { int(class_id): int(category_id) for class_id, category_id in enumerate(category_ids) }

with open("class_to_category.json", "w") as fp:
  json.dump(class_to_category, fp)
loading annotations into memory...
Done (t=1.67s)
creating index...
index created!
In [ ]:
!bash <(curl -sL https://gitlab.aicrowd.com/jyotish/food-recognition-challenge-detectron2-baseline/raw/master/utils/submit-colab.sh)
4/6AGMvrF0BpkxiyuEgRUOBTD_5Z02OyP0lpURVjWWWOEYFMIxFdKa3Gg
4/6AGMvrF0BpkxiyuEgRUOBTD_5Z02OyP0lpURVjWWWOEYFMIxFdKa3Gg
4/6AGMvrF0BpkxiyuEgRUOBTD_5Z02OyP0lpURVjWWWOEYFMIxFdKa3Gg
4/6AGMvrF0BpkxiyuEgRUOBTD_5Z02OyP0lpURVjWWWOEYFMIxFdKa3Gg
4/6AGMvrF0BpkxiyuEgRUOBTD_5Z02OyP0lpURVjWWWOEYFMIxFdKa3Gg
Preparing the environment for submission πŸ’ͺ

WARNING: apt does not have a stable CLI interface. Use with caution in scripts.

πŸ” Checking for SSH key...
Go to this URL in a browser: https://accounts.google.com/o/oauth2/auth?client_id=947318989803-6bn6qk8qdgf4n4g3pfee6491hc0brc4i.apps.googleusercontent.com&redirect_uri=urn%3aietf%3awg%3aoauth%3a2.0%3aoob&scope=email%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdocs.test%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive.photos.readonly%20https%3a%2f%2fwww.googleapis.com%2fauth%2fpeopleapi.readonly&response_type=code

Enter your authorization code:
Mounted at /content/drive
Verified! βœ…

βš™οΈ Setting up git... 
Identity added: /content/drive/My Drive/aicrowd/id_ecdsa (/content/drive/My Drive/aicrowd/id_ecdsa)

βš™οΈ Cloning the git repository... 
Updated git hooks.
Git LFS initialized.
Git LFS: (2 of 2 files) 691.75 MB / 691.75 MB

πŸ’ΎοΈ Copying your files to the git repo... 
"*.pth" already supported
[master 5849bfc] detectron2-trial-2
 2 files changed, 3 insertions(+), 3 deletions(-)
Git LFS: (1 of 1 files) 345.88 MB / 345.88 MB
Git LFS: (0 of 0 files, 1 skipped) 0 B / 0 B, 345.88 MB skippedCounting objects: 5, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (5/5), done.
Writing objects: 100% (5/5), 580 bytes | 580.00 KiB/s, done.
Total 5 (delta 3), reused 0 (delta 0)
To gitlab.aicrowd.com:Shubhamai/food-recognition-challenge-starter-kit.git
   773e57a..bdcd841  master -> master
Counting objects: 1, done.
Writing objects: 100% (1/1), 178 bytes | 178.00 KiB/s, done.
Total 1 (delta 0), reused 0 (delta 0)
To gitlab.aicrowd.com:Shubhamai/food-recognition-challenge-starter-kit.git
 * [new tag]         submission-detectron2-trial-2 -> submission-detectron2-trial-2


======================================


🎊 Solution submitted successfully! πŸ™Œ
Track the progress of your submission at https://gitlab.aicrowd.com/Shubhamai/food-recognition-challenge-starter-kit/issues

Generate More Data + Some tips & tricks 💡

CLoDSA seems a really great choice to generate more dataset. Fortunately there is also a really cool colab notebook on how to generate more dataset using CLoDSA


There is also an ongoing youtube series series from Immersive Limit on how to generate more dataset using blender ! Blender for AI Devs

Blender for AI Devs

And also this --

Automated Background Switching in Blender

Data Argumentation

In [ ]:
!pip install --upgrade fastai
Collecting fastai
  Downloading https://files.pythonhosted.org/packages/98/2e/d4dcc69f67b4557c8543a4c65d3e136b1929b01136b227ceb986e2596825/fastai-2.0.15-py3-none-any.whl (185kB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 194kB 4.5MB/s 
Requirement already satisfied, skipping upgrade: scikit-learn in /usr/local/lib/python3.6/dist-packages (from fastai) (0.22.2.post1)
Requirement already satisfied, skipping upgrade: matplotlib in /usr/local/lib/python3.6/dist-packages (from fastai) (3.2.2)
Requirement already satisfied, skipping upgrade: requests in /usr/local/lib/python3.6/dist-packages (from fastai) (2.23.0)
Requirement already satisfied, skipping upgrade: fastprogress>=0.2.4 in /usr/local/lib/python3.6/dist-packages (from fastai) (1.0.0)
Requirement already satisfied, skipping upgrade: pillow in /usr/local/lib/python3.6/dist-packages (from fastai) (7.0.0)
Collecting torch>=1.6.0
  Downloading https://files.pythonhosted.org/packages/38/53/914885a93a44b96c0dd1c36f36ff10afe341f091230aad68f7228d61db1e/torch-1.6.0-cp36-cp36m-manylinux1_x86_64.whl (748.8MB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 748.8MB 18kB/s 
Requirement already satisfied, skipping upgrade: packaging in /usr/local/lib/python3.6/dist-packages (from fastai) (20.4)
Collecting fastcore>=1.0.5
  Downloading https://files.pythonhosted.org/packages/8e/4f/dc306a98a16a2c2c83d04636387e2945e19f8693ca8bce11bc147107d3bb/fastcore-1.0.20-py3-none-any.whl (41kB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 51kB 7.7MB/s 
Requirement already satisfied, skipping upgrade: pandas in /usr/local/lib/python3.6/dist-packages (from fastai) (1.1.2)
Requirement already satisfied, skipping upgrade: pyyaml in /usr/local/lib/python3.6/dist-packages (from fastai) (5.1)
Requirement already satisfied, skipping upgrade: scipy in /usr/local/lib/python3.6/dist-packages (from fastai) (1.4.1)
Requirement already satisfied, skipping upgrade: pip in /usr/local/lib/python3.6/dist-packages (from fastai) (19.3.1)
Collecting torchvision>=0.7
  Downloading https://files.pythonhosted.org/packages/8e/dc/4a939cfbd38398f4765f712576df21425241020bfccc200af76d19088533/torchvision-0.7.0-cp36-cp36m-manylinux1_x86_64.whl (5.9MB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 5.9MB 49.7MB/s 
Requirement already satisfied, skipping upgrade: spacy in /usr/local/lib/python3.6/dist-packages (from fastai) (2.2.4)
Requirement already satisfied, skipping upgrade: joblib>=0.11 in /usr/local/lib/python3.6/dist-packages (from scikit-learn->fastai) (0.16.0)
Requirement already satisfied, skipping upgrade: numpy>=1.11.0 in /usr/local/lib/python3.6/dist-packages (from scikit-learn->fastai) (1.18.5)
Requirement already satisfied, skipping upgrade: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->fastai) (1.2.0)
Requirement already satisfied, skipping upgrade: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib->fastai) (0.10.0)
Requirement already satisfied, skipping upgrade: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->fastai) (2.8.1)
Requirement already satisfied, skipping upgrade: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->fastai) (2.4.7)
Requirement already satisfied, skipping upgrade: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->fastai) (1.24.3)
Requirement already satisfied, skipping upgrade: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->fastai) (3.0.4)
Requirement already satisfied, skipping upgrade: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->fastai) (2.10)
Requirement already satisfied, skipping upgrade: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->fastai) (2020.6.20)
Requirement already satisfied, skipping upgrade: future in /usr/local/lib/python3.6/dist-packages (from torch>=1.6.0->fastai) (0.16.0)
Requirement already satisfied, skipping upgrade: six in /usr/local/lib/python3.6/dist-packages (from packaging->fastai) (1.15.0)
Requirement already satisfied, skipping upgrade: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas->fastai) (2018.9)
Requirement already satisfied, skipping upgrade: catalogue<1.1.0,>=0.0.7 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (1.0.0)
Requirement already satisfied, skipping upgrade: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (1.0.2)
Requirement already satisfied, skipping upgrade: srsly<1.1.0,>=1.0.2 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (1.0.2)
Requirement already satisfied, skipping upgrade: thinc==7.4.0 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (7.4.0)
Requirement already satisfied, skipping upgrade: blis<0.5.0,>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (0.4.1)
Requirement already satisfied, skipping upgrade: tqdm<5.0.0,>=4.38.0 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (4.41.1)
Requirement already satisfied, skipping upgrade: wasabi<1.1.0,>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (0.8.0)
Requirement already satisfied, skipping upgrade: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (3.0.2)
Requirement already satisfied, skipping upgrade: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (2.0.3)
Requirement already satisfied, skipping upgrade: setuptools in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (50.3.0)
Requirement already satisfied, skipping upgrade: plac<1.2.0,>=0.9.6 in /usr/local/lib/python3.6/dist-packages (from spacy->fastai) (1.1.3)
Requirement already satisfied, skipping upgrade: importlib-metadata>=0.20; python_version < "3.8" in /usr/local/lib/python3.6/dist-packages (from catalogue<1.1.0,>=0.0.7->spacy->fastai) (2.0.0)
Requirement already satisfied, skipping upgrade: zipp>=0.5 in /usr/local/lib/python3.6/dist-packages (from importlib-metadata>=0.20; python_version < "3.8"->catalogue<1.1.0,>=0.0.7->spacy->fastai) (3.2.0)
Installing collected packages: torch, fastcore, torchvision, fastai
  Found existing installation: torch 1.5.0+cu101
    Uninstalling torch-1.5.0+cu101:
      Successfully uninstalled torch-1.5.0+cu101
  Found existing installation: torchvision 0.6.0+cu101
    Uninstalling torchvision-0.6.0+cu101:
      Successfully uninstalled torchvision-0.6.0+cu101
  Found existing installation: fastai 1.0.61
    Uninstalling fastai-1.0.61:
      Successfully uninstalled fastai-1.0.61
Successfully installed fastai-2.0.15 fastcore-1.0.20 torch-1.6.0 torchvision-0.7.0
In [ ]:
from fastai.vision.core import *
from fastai.vision.utils import *
from fastai.vision.augment import *
from fastai.data.core import *
from fastai.data.transforms import *
In [ ]:
images, lbl_bbox = get_annotations('/content/train/annotations.json')
# idx=2
# coco_fn,bbox = coco/'train'/images[idx],lbl_bbox[idx]

# def _coco_bb(x):  return TensorBBox.create(bbox[0])
# def _coco_lbl(x): return bbox[1]
In [ ]:
idx=4
coco_fn,bbox = '/content/train/images/'+images[idx],lbl_bbox[idx]

def _coco_bb(x):  return TensorBBox.create(bbox[0])
def _coco_lbl(x): return bbox[1]
In [ ]:
coco_dsrc = Datasets([coco_fn]*10, [PILImage.create, [_coco_bb,], [_coco_lbl, MultiCategorize(add_na=True)]], n_inp=1)
coco_tdl = TfmdDL(coco_dsrc, bs=9, after_item=[BBoxLabeler(), PointScaler(), ToTensor(), Resize(256)],
                  after_batch=[IntToFloatTensor(), *aug_transforms()])

coco_tdl.show_batch(max_n=9)

Comments

You must login before you can post a comment.

Execute