Commit aea0a3d (unverified) by bluemellophone
Parent(s): 73f6108

Add documentation
.gitignore CHANGED
@@ -11,3 +11,5 @@ coverage/
 gradio_cached_examples/
 __pycache__/
 docs/build/
+
+docs/_build/
README.rst CHANGED
@@ -14,7 +14,7 @@ How to Install
 
 You need to first install Anaconda on your machine. Below are the instructions on how to install Anaconda on an Apple macOS machine, but it is possible to install on a Windows and Linux machine as well. Consult the `official Anaconda page <https://www.anaconda.com>`_ to download and install on other systems.
 
-.. code:: bash
+.. code-block:: console
 
    # Install Homebrew
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
@@ -27,7 +27,7 @@ You need to first install Anaconda on your machine. Below are the instructions
 
 Once Anaconda is installed, you will need an environment and the following packages installed
 
-.. code:: bash
+.. code-block:: console
 
    # Create Environment
    conda create --name scoutbot
@@ -42,19 +42,24 @@ Once Anaconda is installed, you will need an environment and the following packa
 How to Run
 ----------
 
-It is recommended to use `ipython` and to copy sections of code into and inspecting the
+You can run the tile-based Gradio demo with:
 
-.. code:: bash
+.. code-block:: console
 
-   # Run the live demo
-   python app.py
+   (.venv) $ python app.py
+
+or, you can run the image-based Gradio demo with:
+
+.. code-block:: console
+
+   (.venv) $ python app2.py
 
 Docker
 ------
 
 The application can also be built into a Docker image and hosted on Docker Hub.
 
-.. code:: bash
+.. code-block:: console
 
    # linux/amd64
 
@@ -72,7 +77,7 @@ The application can also be built into a Docker image and hosted on Docker Hub.
 
 To run:
 
-.. code:: bash
+.. code-block:: console
 
    docker run \
    -it \
@@ -92,7 +97,7 @@ Building Documentation
 
 There is Sphinx documentation in the `docs/` folder, which can be built with the code below:
 
-.. code:: bash
+.. code-block:: console
 
    cd docs/
    sphinx-build -M html . build/
@@ -110,7 +115,7 @@ on any code you write. (See also `pre-commit.com <https://pre-commit.com/>`_)
 
 Reference `pre-commit's installation instructions <https://pre-commit.com/#install>`_ for software installation on your OS/platform. After you have the software installed, run ``pre-commit install`` on the command line. Now every time you commit to this project's code base the linter procedures will automatically run over the changed files. To run pre-commit on files preemptively from the command line use:
 
-.. code:: bash
+.. code-block:: console
 
    git add .
    pre-commit run
@@ -127,7 +132,7 @@ The code base has been formatted by Brunette, which is a fork and more configura
    :alt: GitHub CI
 
 .. |Codecov| image:: https://codecov.io/gh/WildMeOrg/scoutbot/branch/main/graph/badge.svg?token=FR6ITMWQNI
-   :target: https://codecov.io/gh/WildMeOrg/scoutbot
+   :target: https://app.codecov.io/gh/WildMeOrg/scoutbot
    :alt: Codecov
 
 .. |Wheel| image:: https://github.com/WildMeOrg/scoutbot/actions/workflows/python-publish.yml/badge.svg
docs/conf.py CHANGED
@@ -32,12 +32,19 @@ extensions = [
     'sphinx.ext.autodoc',
     'sphinx.ext.autosummary',
     'sphinx.ext.intersphinx',
+    'sphinx.ext.autosectionlabel',
+    'sphinx.ext.coverage',
+    'sphinx.ext.viewcode',
+    'sphinx.ext.imgmath',
+    'sphinx.ext.napoleon',
 ]
 
 intersphinx_mapping = {
     'rtd': ('https://docs.readthedocs.io/en/stable/', None),
     'python': ('https://docs.python.org/3/', None),
     'sphinx': ('https://www.sphinx-doc.org/en/master/', None),
+    'numpy': ('https://numpy.org/doc/stable/', None),
+    'cv2': ('https://docs.opencv.org/2.4.13.7/', None),
 }
 intersphinx_disabled_domains = ['std']
 
@@ -51,6 +58,8 @@ epub_show_urls = 'footnote'
 # This pattern also affects html_static_path and html_extra_path.
 exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
 
+autosectionlabel_prefix_document = True
+
 # -- Options for HTML output -------------------------------------------------
 
 # The theme to use for HTML and HTML Help pages. See the documentation for
@@ -58,6 +67,20 @@ exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
 #
 html_theme = 'sphinx_rtd_theme'
 
+html_theme_path = [
+    '_themes',
+]
+
+html_sidebars = {
+    '**': [
+        'about.html',
+        'navigation.html',
+        'relations.html',
+        'searchbox.html',
+        'donate.html',
+    ]
+}
+
 # Add any paths that contain custom static files (such as style sheets) here,
 # relative to this directory. They are copied after the builtin static files,
 # so a file named "default.css" will overwrite the builtin "default.css".
docs/index.rst CHANGED
@@ -10,6 +10,5 @@ Contents
 .. toctree::
 
    Home <self>
-   usage
    scoutbot
    cli
docs/scoutbot.rst CHANGED
@@ -6,8 +6,8 @@ ScoutBot API
    :caption: Contents:
 
 
-Tiles
------
+Tiles (TILE)
+------------
 
 .. automodule:: scoutbot.tile
    :members:
@@ -31,6 +31,14 @@ Localizer (LOC)
    :undoc-members:
    :show-inheritance:
 
+Aggregation (AGG)
+-----------------
+
+.. automodule:: scoutbot.agg
+   :members:
+   :undoc-members:
+   :show-inheritance:
+
 Utilities
 ---------
 
docs/usage.rst DELETED
@@ -1,19 +0,0 @@
-Usage
-=====
-
-.. _installation:
-
-Installation
-------------
-
-To use this code, first install its dependencies using pip:
-
-.. code-block:: console
-
-   (.venv) $ pip install -r requirements.txt
-
-then, you can run the application via:
-
-.. code-block:: console
-
-   (.venv) $ python app.py
scoutbot/__init__.py CHANGED
@@ -37,7 +37,7 @@ Notes:
 '''
 from scoutbot import agg, loc, tile, wic
 
-VERSION = '0.1.0'
+VERSION = '0.1.1'
 version = VERSION
 __version__ = VERSION
 
scoutbot/loc/__init__.py CHANGED
@@ -47,6 +47,22 @@ ONNX_MODEL_HASH = '85a9378311d42b5143f74570136f32f50bf97c548135921b178b46ba7612b
 
 
 def fetch(pull=False):
+    """
+    Fetch the Localizer ONNX model file from a CDN if it does not exist locally.
+
+    This function will throw an AssertionError if the download fails or the
+    file otherwise does not exist locally on disk.
+
+    Args:
+        pull (bool, optional): If :obj:`True`, re-download the model from the CDN
+            even if a version exists in the local system's cache.  Defaults to
+            :obj:`False`.
+
+    Returns:
+        str: local ONNX model file path.
+
+    Raises:
+        AssertionError: If the model cannot be fetched.
+    """
     if not pull and exists(ONNX_MODEL_PATH):
         onnx_model = ONNX_MODEL_PATH
     else:
@@ -61,6 +77,21 @@ def fetch(pull=False):
 
 
 def pre(inputs):
+    """
+    Load a list of filepaths and return a corresponding list of the image
+    data as a 4-D list of floats.  The image data is loaded from disk, transformed
+    as needed, and is normalized to the input ranges that the Localizer ONNX model
+    expects.
+
+    This function will throw an error if any of the filepaths do not exist.
+
+    Args:
+        inputs (list(str)): list of tile image filepaths (relative or absolute)
+
+    Returns:
+        list ( list ( list ( list ( float ) ) ) ), list ( tuple ( int ) ): list of
+            transformed image data, and a list of each tile's original size
+    """
     transform = torchvision.transforms.ToTensor()
 
     data = []
@@ -80,6 +111,17 @@ def pre(inputs):
 
 
 def predict(data, fill=True):
+    """
+    Run neural network inference using the Localizer's ONNX model on preprocessed data.
+
+    Args:
+        data (list): list of transformed image data, the first return of :meth:`scoutbot.loc.pre`
+        fill (bool, optional): If :obj:`True`, fill any partial batches to the LOC ``BATCH_SIZE``,
+            and then trim them after inference.  Defaults to :obj:`True`.
+
+    Returns:
+        list ( list ( float ) ): list of raw ONNX model outputs
+    """
     onnx_model = fetch()
 
     ort_session = ort.InferenceSession(
@@ -106,6 +148,38 @@ def predict(data, fill=True):
 
 
 def post(preds, sizes, loc_thresh=LOC_THRESH, nms_thresh=NMS_THRESH):
+    """
+    Apply a post-processing normalization of the raw ONNX network outputs.
+
+    The final output is a list of lists of dictionaries, each representing a single
+    detection.  Each dictionary has a structure with the following keys:
+
+    ::
+
+        {
+            'l': class_label (str)
+            'c': confidence (float)
+            'x': x_top_left (float)
+            'y': y_top_left (float)
+            'w': width (float)
+            'h': height (float)
+        }
+
+    The ``l`` label is the string class as used when the original
+    ONNX model was trained.
+
+    The ``c`` confidence value is a bounded float between ``0.0`` and
+    ``1.0`` (inclusive), but should not be treated as a probability.
+
+    The ``x``, ``y``, ``w``, ``h`` bounding box keys are in real pixel values.
+
+    Args:
+        preds (list): list of raw ONNX model outputs, the return of :meth:`scoutbot.loc.predict`
+        sizes (list): list of original tile sizes, the second return of :meth:`scoutbot.loc.pre`
+
+    Returns:
+        list ( list ( dict ) ): nested list of Localizer predictions
+    """
     postprocess = Compose(
         [
             GetBoundingBoxes(NUM_CLASSES, ANCHORS, loc_thresh),
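The detection dictionary documented in the new `scoutbot.loc.post` docstring can be consumed downstream like this. A minimal sketch, not part of the codebase: `filter_detections`, the sample labels, and the sample values are all illustrative.

```python
# Minimal illustrative sketch (not part of the codebase): filter
# Localizer-style detections by confidence and convert each x/y/w/h box
# to (x1, y1, x2, y2) corners.  The dicts follow the structure documented
# in scoutbot.loc.post; the sample labels and values are made up.

def filter_detections(tiles, thresh=0.5):
    """Keep detections with confidence >= thresh; return corner boxes."""
    results = []
    for detections in tiles:  # one list of detection dicts per tile
        kept = []
        for det in detections:
            if det['c'] >= thresh:
                kept.append({
                    'label': det['l'],
                    'conf': det['c'],
                    'box': (det['x'], det['y'], det['x'] + det['w'], det['y'] + det['h']),
                })
        results.append(kept)
    return results

sample = [[
    {'l': 'elephant', 'c': 0.9, 'x': 10.0, 'y': 20.0, 'w': 30.0, 'h': 40.0},
    {'l': 'giraffe', 'c': 0.2, 'x': 5.0, 'y': 5.0, 'w': 10.0, 'h': 10.0},
]]
print(filter_detections(sample))  # only the 0.9-confidence detection survives
```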
scoutbot/scoutbot.py CHANGED
@@ -1,7 +1,7 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-
 """
-The lecture materials for Lecture 1: Dataset Prototyping and Visualization
+ScoutBot CLI executable
 """
 import click
 
scoutbot/tile/__init__.py CHANGED
@@ -16,7 +16,9 @@ TILE_BORDERS = True
 
 
 def compute(img_filepath, grid1=True, grid2=True, ext=None, **kwargs):
-    """Compute the tiles for a given input image"""
+    """
+    Compute the tiles for a given input image
+    """
     assert exists(img_filepath)
     img = cv2.imread(img_filepath)
     shape = img.shape
@@ -35,6 +37,19 @@ def compute(img_filepath, grid1=True, grid2=True, ext=None, **kwargs):
 
 
 def tile_write(img, grid, filepath):
+    """
+    Write a single image's tile to disk using its grid coordinates and an output path.
+
+    Args:
+        img (numpy.ndarray): 3-dimensional NumPy array, the return from :func:`cv2.imread`
+        grid (dict): the grid coordinate dictionary, one of the returned dictionaries
+            from :meth:`scoutbot.tile.tile_grid`
+        filepath (str): the tile's full output filepath (relative or absolute)
+
+    Returns:
+        bool: returns :obj:`True` if the tile's filepath exists on disk.
+
+    """
     if exists(filepath):
         return True
 
scoutbot/wic/__init__.py CHANGED
@@ -1,6 +1,10 @@
 # -*- coding: utf-8 -*-
-'''
-2022 Wild Me
+'''The Whole Image Classifier (WIC) returns confidence scores for image tiles.
+
+This module defines how WIC models are downloaded from an external CDN,
+how to load an image and prepare it for inference, demonstrates how to run the
+WIC ONNX model on this input, and finally how to convert this raw CNN output
+into usable confidence scores.
 '''
 from os.path import exists, join
 from pathlib import Path
@@ -29,6 +33,22 @@ WIC_THRESH = 0.2
 
 
 def fetch(pull=False):
+    """
+    Fetch the WIC ONNX model file from a CDN if it does not exist locally.
+
+    This function will throw an AssertionError if the download fails or the
+    file otherwise does not exist locally on disk.
+
+    Args:
+        pull (bool, optional): If :obj:`True`, re-download the model from the CDN
+            even if a version exists in the local system's cache.  Defaults to
+            :obj:`False`.
+
+    Returns:
+        str: local ONNX model file path.
+
+    Raises:
+        AssertionError: If the model cannot be fetched.
+    """
     if not pull and exists(ONNX_MODEL_PATH):
         onnx_model = ONNX_MODEL_PATH
     else:
@@ -43,6 +63,20 @@ def fetch(pull=False):
 
 
 def pre(inputs):
+    """
+    Load a list of filepaths and return a corresponding list of the image
+    data as a 4-D list of floats.  The image data is loaded from disk, transformed
+    as needed, and is normalized to the input ranges that the WIC ONNX model
+    expects.
+
+    This function will throw an error if any of the filepaths do not exist.
+
+    Args:
+        inputs (list(str)): list of tile image filepaths (relative or absolute)
+
+    Returns:
+        list ( list ( list ( list ( float ) ) ) ): list of transformed image data
+    """
     transform = _init_transforms()
     dataset = ImageFilePathList(inputs, transform=transform)
     dataloader = torch.utils.data.DataLoader(
@@ -57,6 +91,17 @@ def pre(inputs):
 
 
 def predict(data, fill=False):
+    """
+    Run neural network inference using the WIC's ONNX model on preprocessed data.
+
+    Args:
+        data (list): list of transformed image data, the return of :meth:`scoutbot.wic.pre`
+        fill (bool, optional): If :obj:`True`, fill any partial batches to the WIC ``BATCH_SIZE``,
+            and then trim them after inference.  Defaults to :obj:`False`.
+
+    Returns:
+        list ( list ( float ) ): list of raw ONNX model outputs
+    """
     onnx_model = fetch()
 
     ort_session = ort.InferenceSession(
@@ -83,5 +128,17 @@ def predict(data, fill=False):
 
 
 def post(preds):
+    """
+    Apply a post-processing normalization of the raw ONNX network outputs.
+
+    The final output is a list of dictionaries, where the keys are the predicted
+    labels and the values are their corresponding confidence values.
+
+    Args:
+        preds (list): list of raw ONNX model outputs, the return of :meth:`scoutbot.wic.predict`
+
+    Returns:
+        list ( dict ): list of WIC predictions
+    """
     outputs = [dict(zip(ONNX_CLASSES, pred)) for pred in preds]
     return outputs
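The `{label: confidence}` structure documented in the new `scoutbot.wic.post` docstring pairs naturally with the module's `WIC_THRESH` constant. A minimal sketch under stated assumptions: `triage`, the `'positive'` label name, and the sample scores are all hypothetical; only the `WIC_THRESH` value mirrors the module constant.

```python
# Minimal illustrative sketch (not part of the codebase): threshold
# WIC-style outputs, one {label: confidence} dict per tile, as documented
# in scoutbot.wic.post.  The 'positive' label name and sample scores are
# hypothetical; WIC_THRESH mirrors the module constant's value.

WIC_THRESH = 0.2

def triage(outputs, label='positive', thresh=WIC_THRESH):
    """Return True for each tile whose confidence for `label` clears the threshold."""
    return [pred.get(label, 0.0) >= thresh for pred in outputs]

sample = [
    {'negative': 0.9, 'positive': 0.1},
    {'negative': 0.3, 'positive': 0.7},
]
print(triage(sample))  # the first tile falls below WIC_THRESH
```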
scoutbot/wic/dataloader.py CHANGED
@@ -5,7 +5,7 @@ import torch
 import torchvision
 import utool as ut
 
-BATCH_SIZE = 2048
+BATCH_SIZE = 512
 INPUT_SIZE = 224
 
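Both `predict()` docstrings in this commit describe a fill-and-trim batching scheme: a partial final batch is padded up to `BATCH_SIZE` before inference and the padding's outputs are discarded afterwards. A minimal sketch of that idea, not the actual implementation: `predict_filled` and the toy model are illustrative, and a tiny `BATCH_SIZE` stands in for the real value of 512.

```python
# Minimal illustrative sketch (not the actual implementation) of the
# fill-and-trim batching described in the predict() docstrings: pad the
# final partial batch up to BATCH_SIZE so the model always sees a
# fixed-size input, then trim the padded outputs.  run_model is a
# stand-in for the real ONNX inference call.

BATCH_SIZE = 4  # tiny stand-in; the real module uses 512

def predict_filled(data, run_model):
    trim = (-len(data)) % BATCH_SIZE    # how many filler items we add
    padded = data + [data[-1]] * trim   # repeat the last item as filler
    outputs = []
    for start in range(0, len(padded), BATCH_SIZE):
        outputs.extend(run_model(padded[start:start + BATCH_SIZE]))
    return outputs[:len(data)]          # drop the filler outputs

# toy "model" that doubles each input, standing in for ONNX inference
results = predict_filled([1, 2, 3, 4, 5, 6], lambda batch: [x * 2 for x in batch])
print(results)
```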