Compare commits
2 Commits
main ... b6a214ca07

| Author | SHA1 | Date |
| --- | --- | --- |
|  | b6a214ca07 |  |
|  | 8520cdb93f |  |
```diff
@@ -1,34 +0,0 @@
-name: Build Dev PWA
-run-name: ${{ gitea.actor }} is building new dev pwa version
-on:
-  push:
-    branches:
-      - main
-
-jobs:
-  Build-PWA:
-    runs-on: ubuntu-22.04
-    steps:
-      - name: Check out repository code
-        uses: actions/checkout@v4
-      - name: Install node modules
-        run: npm install
-      - name: Add build number
-        run: sed -i 's/####/#${{ github.run_number }}/' ./src/js/store.js
-      - name: Build pwa
-        run: npm run build
-      - name: Replace previous dev pwa
-        env:
-          DEV_HOST: ${{ secrets.DEV_HOST }}
-          DEV_KEY: ${{ secrets.DEV_KEY }}
-          DEV_FP: ${{ secrets.DEV_FINGERPRINT }}
-        run: |
-          echo "$DEV_KEY" > ~/.ssh/id_rsa
-          chmod 600 ~/.ssh/id_rsa
-          echo "$DEV_FP" > ~/.ssh/known_hosts
-          chmod 600 ~/.ssh/known_hosts
-          ssh root@$DEV_HOST "rm -R /var/www/html/alvinn-dev/*"
-          echo "Old files removed"
-          scp -r ${{ gitea.workspace }}/www/* root@$DEV_HOST:/var/www/html/alvinn-dev
-          ssh root@$DEV_HOST "chown -R www-data:www-data /var/www/html/alvinn-dev/*"
-          echo "New files copied"
```
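The deleted workflow's "Add build number" step depends on a literal `####` placeholder in `src/js/store.js` that `sed` rewrites to `#<run number>`. A minimal sketch of the same substitution in JavaScript (the file contents and run number here are illustrative placeholders, not taken from this repository):

```javascript
// Stand-in for the contents of src/js/store.js; the "####" placeholder is
// what the workflow's sed command targets.
const storeJs = 'const buildLabel = "####";';

// Stands in for ${{ github.run_number }} in the workflow.
const runNumber = 42;

// Equivalent of: sed -i 's/####/#<run number>/' ./src/js/store.js
const stamped = storeJs.replace("####", `#${runNumber}`);

console.log(stamped); // → const buildLabel = "#42";
```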
5  .gitignore  vendored
```diff
@@ -40,8 +40,7 @@ cordova/platforms/
 cordova/plugins/
 cordova/www/

-
 # Production build
 www/

 # VSCode settings
 .vscode/settings.json
```
60  README.md
```diff
@@ -1,6 +1,6 @@
 # ALVINN

-Anatomy Lab Visual Identification Neural Net (A.L.V.I.N.N.) is a f7 based app for using a computer vision neural net model to identify anatomical structures in photographic imagery.
+Anatomy Lab Visual Identification Neural Net (A.L.V.I.N.N) is a f7 based app for using a computer vision neural net model to identify anatomical structures in photographic imagery.

 ## Install
 * **Android:** Download the latest Android apk in [packages](https://gitea.azgeorgis.net/Georgi_Lab/ALVINN_f7/packages) and open the downloaded file to install.
```
````diff
@@ -9,26 +9,16 @@ Anatomy Lab Visual Identification Neural Net (A.L.V.I.N.N.) is a f7 based app fo
 * **Run from source:** Clone this repository and in the root directory run `npm install` followed by `npm start`. For more information see [f7 info](f7_info.md).

 ## Quick Start
-1. Select the region of the body you want to identify structures from. The regions are:
-    * Thorax and back
-    * Abdomen and pelvis
-    * Limbs
-    * Head and neck
-1. Load an image in one of the following ways:
-    * Click on the camera icon to take a new picture.
-        * ALVINN will highlight areas with potential structures as you aim the camera.
-        * Press Capture to use the current camera view.
-    * Click on the image file icon to load a picture from the device storage.
-    * If demo mode is turned on, you can click on the marked image icon to load an ALVINN sample image.
-1. When the picture is captured or loaded, any identifiable structures will be listed as tags below the image:
-    * Click on each tag to see the structure highlighted in the image or click on the image to see the tag for that structure (additional clicks to the same area will select overlapping structres).
-    * Tag color and proportion filled indicate ALVINN's level of confidence in the identification.
-    * An incorrect tag can be deleted by clicking on the tag's X button.
+1. From the main screen of the app, select the menu icon in the upper left corner and go to `Settings`.
+1. Make sure that `Use external server` option is selected and fill in address and port parameters to connect to a back end serving the ALVINN models (Doods2 is the default backend).
+1. Save the settings and return to the main screen.
+1. Select the region of the body you want to identify structures from.
+1. In the region page, click on the camera icon to take a new picture or load a picture from storage. When the picture load, any identifiable structures will be listed as tags below the image.
+1. Click on each tag to see the structure highlighted in the image.

 ## Advanced Features
 ### Detection Parameters
 If there are potential structures that do not satisfy the current detection settings, a badge on the detection menu icon will indicate the number of un-displayed structures.
 Clicking on the detection menu icon will open a menu of tools to adjust the detection settings.
 After an image has been loaded and structure detection has been performed, the detection parameters can be adjusted using the third detection menu button (eye).
 This button will make three tools available:
 1. Confidence slider: You can use the slider to change the confidence threshold for identifying structures.
 The default threshold is 50% confidence.
@@ -38,36 +28,4 @@ The default threshold is 50% confidence.
 ### Submitting Images
 If all of the detection tags that are currently visible have been viewed, then the final button (cloud upload) on the detection menu will be enabled.
 This button will cause the image and the verified structures to be uploaded to the ALVINN project servers where that data will be available for further training of the neural net.
-If after the image has been uploaded, the available detection tags change, then the option to re-upload the image will be available if all the new tags have been viewed and verified.
-
-## Configuration
-Configuring aspects of the hosted ALVINN PWA is done through the `conf.yaml` file in the `conf` folder.
-### Site settings
-The following site settings are avaible:
-| name | description | values | default |
-| --- | --- | --- | --- |
-| `agreeExpire` | number of months before users are shown the site agreement dialog again<br />set to 0 to display dialog on every reload | integer >= 0 | 3 |
-| `demo` | set to **true** to enable demo mode by default | boolean | false |
-| `regions` | array of regions names to enable | thorax, abdomen, limbs, head | [thorax, abdomen, limbs, head] |
-| `useExternal` | detemines the ability to use an external detection server:<br />**none** - external server cannot be configured<br />**optional** - external server can be configured in the app's settings page<br />**list** - external server can be selected in the app's settings page but only the configured server(s) may be selected<br />**required** - external server settings from conf file will be used by default and disable server options in the settings page | none, optional, list, required | **optional** |
-| `disableWorkers` | force app to use a single thread for detection computations instead of multi threading web workers | boolean | **optional** |
-| `external` | properties of the external server(s) ALVINN may connect to<br />This setting must be a single element array if **useExternal** is set to **required**.<br />This setting must be an array of one or more elements if **useExternal** is set to **list** | external server settings array | [] |
-| `infoUrl` | root url for links to information about identified structures<br />Structure labels with spaces replaced by underscores will be appended to this value for full information links (*e.g.,* Abdominal_diapragm) | string | info link not shown |
-
-### External server settings
-ALVINN can use an external object detection server instead of the built in models; settings for that external server are configured here. These settings must be configured if **site - useExternal** is set to **list** or **required**.
-| name | description | default |
-| --- | --- | --- |
-| `name` | identifier for external server | *none* |
-| `address` | ip or url of external server | *none* |
-| `port` | port to access on external server | 9001 |
-
-The external server's response must be json with a `detections` key that contains an array of the detected structure labels, bounding box data, and confidence values.
-```
-{
-    "detections": [
-        {"top": 0.1, "left": 0.1, "bottom": 0.9, "right": 0.9, "label": "Aorta", "confidence": 90.0 }
-        ...
-    ],
-}
-```
+If after the image has been uploaded, the available detection tags change, then the option to re-upload the image will be available if all the new tags have been viewed and verified.
````
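The README's external-server contract (a JSON body whose `detections` key holds label, bounding box, and confidence entries) can be checked programmatically. A minimal sketch; the `parseDetections` helper is hypothetical, and only the `detections` key and its field names come from the README:

```javascript
// Hypothetical helper: validate and normalize an external-server reply.
// Only the `detections` array and its field names (top/left/bottom/right/
// label/confidence) come from the README; the rest is illustrative.
function parseDetections(jsonText) {
  const body = JSON.parse(jsonText);
  if (!Array.isArray(body.detections)) {
    throw new Error("response is missing a 'detections' array");
  }
  return body.detections.map((d) => ({
    label: d.label,
    confidence: d.confidence,
    box: { top: d.top, left: d.left, bottom: d.bottom, right: d.right },
  }));
}

// Example reply matching the shape quoted in the README.
const reply =
  '{"detections": [{"top": 0.1, "left": 0.1, "bottom": 0.9, "right": 0.9, "label": "Aorta", "confidence": 90.0}]}';

console.log(parseDetections(reply)[0].label); // → Aorta
```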
```diff
@@ -1,5 +1,5 @@
 <?xml version='1.0' encoding='utf-8'?>
-<widget id="edu.midwestern.alvinn" version="0.5.0-alpha" xmlns="http://www.w3.org/ns/widgets" xmlns:cdv="http://cordova.apache.org/ns/1.0" xmlns:android="http://schemas.android.com/apk/res/android">
+<widget id="edu.midwestern.alvinn" version="0.5.0-rc" xmlns="http://www.w3.org/ns/widgets" xmlns:cdv="http://cordova.apache.org/ns/1.0" xmlns:android="http://schemas.android.com/apk/res/android">
     <name>ALVINN</name>
     <description>Anatomy Lab Visual Identification Neural Network.</description>
     <author email="jgeorg@midwestern.edu" href="https://midwestern.edu">
```
```diff
@@ -1,7 +1,7 @@
 {
   "name": "edu.midwestern.alvinn",
   "displayName": "ALVINN",
-  "version": "0.5.0-alpha",
+  "version": "0.5.0-rc",
   "description": "Anatomy Lab Visual Identification Neural Network.",
   "main": "index.js",
   "scripts": {
```
173  package-lock.json  generated
```diff
@@ -1,16 +1,16 @@
 {
   "name": "alvinn",
-  "version": "0.5.0-alpha",
+  "version": "0.5.0-rc",
   "lockfileVersion": 2,
   "requires": true,
   "packages": {
     "": {
       "name": "alvinn",
-      "version": "0.5.0-alpha",
+      "version": "0.5.0-rc",
       "hasInstallScript": true,
       "license": "UNLICENSED",
       "dependencies": {
-        "@tensorflow/tfjs": "^4.21.0",
+        "@tensorflow/tfjs": "^4.17.0",
         "dom7": "^4.0.6",
         "framework7": "^8.3.0",
         "framework7-icons": "^5.0.5",
@@ -3354,17 +3354,16 @@
       }
     },
     "node_modules/@tensorflow/tfjs": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs/-/tfjs-4.21.0.tgz",
-      "integrity": "sha512-7D/+H150ptvt+POMbsME3WlIvLiuBR2rCC2Z0hOKKb/5Ygkj7xsp/K2HzOvUj0g0yjk+utkU45QEYhnhjnbHRA==",
-      "license": "Apache-2.0",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs/-/tfjs-4.17.0.tgz",
+      "integrity": "sha512-yXRBhpM3frlNA/YaPp6HNk9EfIi8han5RYeQA3R8OCa0Od+AfoG1PUmlxV8fE2wCorlGVyHsgpiJ6M9YZPB56w==",
       "dependencies": {
-        "@tensorflow/tfjs-backend-cpu": "4.21.0",
-        "@tensorflow/tfjs-backend-webgl": "4.21.0",
-        "@tensorflow/tfjs-converter": "4.21.0",
-        "@tensorflow/tfjs-core": "4.21.0",
-        "@tensorflow/tfjs-data": "4.21.0",
-        "@tensorflow/tfjs-layers": "4.21.0",
+        "@tensorflow/tfjs-backend-cpu": "4.17.0",
+        "@tensorflow/tfjs-backend-webgl": "4.17.0",
+        "@tensorflow/tfjs-converter": "4.17.0",
+        "@tensorflow/tfjs-core": "4.17.0",
+        "@tensorflow/tfjs-data": "4.17.0",
+        "@tensorflow/tfjs-layers": "4.17.0",
         "argparse": "^1.0.10",
         "chalk": "^4.1.0",
         "core-js": "3.29.1",
@@ -3376,10 +3375,9 @@
       }
     },
     "node_modules/@tensorflow/tfjs-backend-cpu": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-backend-cpu/-/tfjs-backend-cpu-4.21.0.tgz",
-      "integrity": "sha512-yS9Oisg4L48N7ML6677ilv1eP5Jt59S74skSU1cCsM4yBAtH4DAn9b89/JtqBISh6JadanfX26b4HCWQvMvqFg==",
-      "license": "Apache-2.0",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-backend-cpu/-/tfjs-backend-cpu-4.17.0.tgz",
+      "integrity": "sha512-2VSCHnX9qhYTjw9HiVwTBSnRVlntKXeBlK7aSVsmZfHGwWE2faErTtO7bWmqNqw0U7gyznJbVAjlow/p+0RNGw==",
       "dependencies": {
         "@types/seedrandom": "^2.4.28",
         "seedrandom": "^3.0.5"
@@ -3388,16 +3386,15 @@
         "yarn": ">= 1.3.2"
       },
       "peerDependencies": {
-        "@tensorflow/tfjs-core": "4.21.0"
+        "@tensorflow/tfjs-core": "4.17.0"
       }
     },
     "node_modules/@tensorflow/tfjs-backend-webgl": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-backend-webgl/-/tfjs-backend-webgl-4.21.0.tgz",
-      "integrity": "sha512-7k6mb7dd0uF9jI51iunF3rhEXjvR/a613kjWZ0Rj3o1COFrneyku2C7cRMZERWPhbgXZ+dF+j9MdpGIpgtShIQ==",
-      "license": "Apache-2.0",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-backend-webgl/-/tfjs-backend-webgl-4.17.0.tgz",
+      "integrity": "sha512-CC5GsGECCd7eYAUaKq0XJ48FjEZdgXZWPxgUYx4djvfUx5fQPp35hCSP9w/k463jllBMbjl2tKRg8u7Ia/LYzg==",
       "dependencies": {
-        "@tensorflow/tfjs-backend-cpu": "4.21.0",
+        "@tensorflow/tfjs-backend-cpu": "4.17.0",
         "@types/offscreencanvas": "~2019.3.0",
         "@types/seedrandom": "^2.4.28",
         "seedrandom": "^3.0.5"
@@ -3406,23 +3403,21 @@
         "yarn": ">= 1.3.2"
       },
       "peerDependencies": {
-        "@tensorflow/tfjs-core": "4.21.0"
+        "@tensorflow/tfjs-core": "4.17.0"
       }
     },
     "node_modules/@tensorflow/tfjs-converter": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-converter/-/tfjs-converter-4.21.0.tgz",
-      "integrity": "sha512-cUhU+F1lGx2qnKk/gRy8odBh0PZlFz0Dl71TG8LVnj0/g352DqiNrKXlKO/po9aWzP8x0KUGC3gNMSMJW+T0DA==",
-      "license": "Apache-2.0",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-converter/-/tfjs-converter-4.17.0.tgz",
+      "integrity": "sha512-qFxIjPfomCuTrYxsFjtKbi3QfdmTTCWo+RvqD64oCMS0sjp7sUDNhJyKDoLx6LZhXlwXpHIVDJctLMRMwet0Zw==",
       "peerDependencies": {
-        "@tensorflow/tfjs-core": "4.21.0"
+        "@tensorflow/tfjs-core": "4.17.0"
       }
     },
     "node_modules/@tensorflow/tfjs-core": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-core/-/tfjs-core-4.21.0.tgz",
-      "integrity": "sha512-ZbECwXps5wb9XXcGq4ZXvZDVjr5okc3I0+i/vU6bpQ+nVApyIrMiyEudP8f6vracVTvNmnlN62vUXoEsQb2F8g==",
-      "license": "Apache-2.0",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-core/-/tfjs-core-4.17.0.tgz",
+      "integrity": "sha512-v9Q5430EnRpyhWNd9LVgXadciKvxLiq+sTrLKRowh26BHyAsams4tZIgX3lFKjB7b90p+FYifVMcqLTTHgjGpQ==",
       "dependencies": {
         "@types/long": "^4.0.1",
         "@types/offscreencanvas": "~2019.7.0",
@@ -3439,31 +3434,28 @@
     "node_modules/@tensorflow/tfjs-core/node_modules/@types/offscreencanvas": {
       "version": "2019.7.3",
       "resolved": "https://registry.npmjs.org/@types/offscreencanvas/-/offscreencanvas-2019.7.3.tgz",
-      "integrity": "sha512-ieXiYmgSRXUDeOntE1InxjWyvEelZGP63M+cGuquuRLuIKKT1osnkXjxev9B7d1nXSug5vpunx+gNlbVxMlC9A==",
-      "license": "MIT"
+      "integrity": "sha512-ieXiYmgSRXUDeOntE1InxjWyvEelZGP63M+cGuquuRLuIKKT1osnkXjxev9B7d1nXSug5vpunx+gNlbVxMlC9A=="
     },
     "node_modules/@tensorflow/tfjs-data": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-data/-/tfjs-data-4.21.0.tgz",
-      "integrity": "sha512-LpJ/vyQMwYHkcVCqIRg7IVVw13VBY7rNAiuhmKP9S5NP/2ye4KA8BJ4XwDIDgjCVQM7glK9L8bMav++xCDf7xA==",
-      "license": "Apache-2.0",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-data/-/tfjs-data-4.17.0.tgz",
+      "integrity": "sha512-aPKrDFip+gXicWOFALeNT7KKQjRXFkHd/hNe/zs4mCFcIN00hy1PkZ6xkYsgrsdLDQMBSGeS4B4ZM0k5Cs88QA==",
       "dependencies": {
         "@types/node-fetch": "^2.1.2",
         "node-fetch": "~2.6.1",
         "string_decoder": "^1.3.0"
       },
       "peerDependencies": {
-        "@tensorflow/tfjs-core": "4.21.0",
+        "@tensorflow/tfjs-core": "4.17.0",
         "seedrandom": "^3.0.5"
       }
     },
     "node_modules/@tensorflow/tfjs-layers": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-layers/-/tfjs-layers-4.21.0.tgz",
-      "integrity": "sha512-a8KaMYlY3+llvE9079nvASKpaaf8xpCMdOjbgn+eGhdOGOcY7QuFUkd/2odvnXDG8fK/jffE1LoNOlfYoBHC4w==",
-      "license": "Apache-2.0 AND MIT",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-layers/-/tfjs-layers-4.17.0.tgz",
+      "integrity": "sha512-DEE0zRKvf3LJ0EcvG5XouJYOgFGWYAneZ0K1d23969z7LfSyqVmBdLC6BTwdLKuJk3ouUJIKXU1TcpFmjDuh7g==",
       "peerDependencies": {
-        "@tensorflow/tfjs-core": "4.21.0"
+        "@tensorflow/tfjs-core": "4.17.0"
       }
     },
     "node_modules/@tensorflow/tfjs/node_modules/regenerator-runtime": {
@@ -3480,8 +3472,7 @@
     "node_modules/@types/long": {
       "version": "4.0.2",
       "resolved": "https://registry.npmjs.org/@types/long/-/long-4.0.2.tgz",
-      "integrity": "sha512-MqTGEo5bj5t157U6fA/BiDynNkn0YknVdh48CMPkTSpFTVmvao5UQmm7uEF6xBEo7qIMAlY/JSleYaE6VOdpaA==",
-      "license": "MIT"
+      "integrity": "sha512-MqTGEo5bj5t157U6fA/BiDynNkn0YknVdh48CMPkTSpFTVmvao5UQmm7uEF6xBEo7qIMAlY/JSleYaE6VOdpaA=="
     },
     "node_modules/@types/minimist": {
       "version": "1.2.5",
@@ -3501,7 +3492,6 @@
       "version": "2.6.11",
       "resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.11.tgz",
       "integrity": "sha512-24xFj9R5+rfQJLRyM56qh+wnVSYhyXC2tkoBndtY0U+vubqNsYXGjufB2nn8Q6gt0LrARwL6UBtMCSVCwl4B1g==",
-      "license": "MIT",
       "dependencies": {
         "@types/node": "*",
         "form-data": "^4.0.0"
@@ -3516,8 +3506,7 @@
     "node_modules/@types/offscreencanvas": {
       "version": "2019.3.0",
       "resolved": "https://registry.npmjs.org/@types/offscreencanvas/-/offscreencanvas-2019.3.0.tgz",
-      "integrity": "sha512-esIJx9bQg+QYF0ra8GnvfianIY8qWB0GBx54PK5Eps6m+xTj86KLavHv6qDhzKcu5UUOgNfJ2pWaIIV7TRUd9Q==",
-      "license": "MIT"
+      "integrity": "sha512-esIJx9bQg+QYF0ra8GnvfianIY8qWB0GBx54PK5Eps6m+xTj86KLavHv6qDhzKcu5UUOgNfJ2pWaIIV7TRUd9Q=="
     },
     "node_modules/@types/resolve": {
       "version": "1.17.1",
@@ -3531,8 +3520,7 @@
     "node_modules/@types/seedrandom": {
       "version": "2.4.34",
       "resolved": "https://registry.npmjs.org/@types/seedrandom/-/seedrandom-2.4.34.tgz",
-      "integrity": "sha512-ytDiArvrn/3Xk6/vtylys5tlY6eo7Ane0hvcx++TKo6RxQXuVfW0AF/oeWqAj9dN29SyhtawuXstgmPlwNcv/A==",
-      "license": "MIT"
+      "integrity": "sha512-ytDiArvrn/3Xk6/vtylys5tlY6eo7Ane0hvcx++TKo6RxQXuVfW0AF/oeWqAj9dN29SyhtawuXstgmPlwNcv/A=="
     },
     "node_modules/@types/trusted-types": {
       "version": "2.0.6",
@@ -3658,8 +3646,7 @@
     "node_modules/@webgpu/types": {
       "version": "0.1.38",
       "resolved": "https://registry.npmjs.org/@webgpu/types/-/types-0.1.38.tgz",
-      "integrity": "sha512-7LrhVKz2PRh+DD7+S+PVaFd5HxaWQvoMqBbsV9fNJO1pjUs1P8bM2vQVNfk+3URTqbuTI7gkXi0rfsN0IadoBA==",
-      "license": "BSD-3-Clause"
+      "integrity": "sha512-7LrhVKz2PRh+DD7+S+PVaFd5HxaWQvoMqBbsV9fNJO1pjUs1P8bM2vQVNfk+3URTqbuTI7gkXi0rfsN0IadoBA=="
     },
     "node_modules/acorn": {
       "version": "8.11.2",
@@ -3827,8 +3814,7 @@
     "node_modules/asynckit": {
       "version": "0.4.0",
       "resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
-      "integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
-      "license": "MIT"
+      "integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q=="
     },
     "node_modules/at-least-node": {
       "version": "1.0.0",
@@ -4426,7 +4412,6 @@
       "version": "1.0.8",
       "resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
       "integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
-      "license": "MIT",
       "dependencies": {
         "delayed-stream": "~1.0.0"
       },
@@ -4887,7 +4872,6 @@
       "version": "1.0.0",
       "resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
       "integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
-      "license": "MIT",
       "engines": {
         "node": ">=0.4.0"
       }
@@ -5373,7 +5357,6 @@
       "version": "4.0.0",
       "resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz",
       "integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==",
-      "license": "MIT",
       "dependencies": {
         "asynckit": "^0.4.0",
         "combined-stream": "^1.0.8",
@@ -6594,8 +6577,7 @@
     "node_modules/long": {
       "version": "4.0.0",
       "resolved": "https://registry.npmjs.org/long/-/long-4.0.0.tgz",
-      "integrity": "sha512-XsP+KhQif4bjX1kbuSiySJFNAehNxgLb6hPRGJ9QsUr8ajHkuXGdrHmFUTUUXhDwVX2R5bY4JNZEwbUiMhV+MA==",
-      "license": "Apache-2.0"
+      "integrity": "sha512-XsP+KhQif4bjX1kbuSiySJFNAehNxgLb6hPRGJ9QsUr8ajHkuXGdrHmFUTUUXhDwVX2R5bY4JNZEwbUiMhV+MA=="
     },
     "node_modules/lower-case": {
       "version": "2.0.2",
@@ -6706,7 +6688,6 @@
       "version": "1.52.0",
       "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
       "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
-      "license": "MIT",
       "engines": {
         "node": ">= 0.6"
       }
@@ -6715,7 +6696,6 @@
       "version": "2.1.35",
       "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
       "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
-      "license": "MIT",
       "dependencies": {
         "mime-db": "1.52.0"
       },
@@ -6843,7 +6823,6 @@
       "version": "2.6.13",
       "resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.13.tgz",
       "integrity": "sha512-StxNAxh15zr77QvvkmveSQ8uCQ4+v5FkvNTj0OESmiHu+VRi/gXArXtkWMElOsOUNLtUEvI4yS+rdtOHZTwlQA==",
-      "license": "MIT",
       "dependencies": {
         "whatwg-url": "^5.0.0"
       },
@@ -6862,20 +6841,17 @@
     "node_modules/node-fetch/node_modules/tr46": {
       "version": "0.0.3",
       "resolved": "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz",
-      "integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw==",
-      "license": "MIT"
+      "integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="
     },
     "node_modules/node-fetch/node_modules/webidl-conversions": {
       "version": "3.0.1",
       "resolved": "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz",
-      "integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ==",
-      "license": "BSD-2-Clause"
+      "integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ=="
     },
     "node_modules/node-fetch/node_modules/whatwg-url": {
       "version": "5.0.0",
       "resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-5.0.0.tgz",
       "integrity": "sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==",
-      "license": "MIT",
       "dependencies": {
         "tr46": "~0.0.3",
         "webidl-conversions": "^3.0.0"
@@ -8505,8 +8481,7 @@
     "node_modules/seedrandom": {
       "version": "3.0.5",
       "resolved": "https://registry.npmjs.org/seedrandom/-/seedrandom-3.0.5.tgz",
-      "integrity": "sha512-8OwmbklUNzwezjGInmZ+2clQmExQPvomqjL7LFqOYqtmuxRgQYqOD3mHaU+MvZn5FLUeVxVfQjwLZW/n/JFuqg==",
-      "license": "MIT"
+      "integrity": "sha512-8OwmbklUNzwezjGInmZ+2clQmExQPvomqjL7LFqOYqtmuxRgQYqOD3mHaU+MvZn5FLUeVxVfQjwLZW/n/JFuqg=="
     },
     "node_modules/semver": {
       "version": "6.3.1",
@@ -11883,16 +11858,16 @@
       }
     },
     "@tensorflow/tfjs": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs/-/tfjs-4.21.0.tgz",
-      "integrity": "sha512-7D/+H150ptvt+POMbsME3WlIvLiuBR2rCC2Z0hOKKb/5Ygkj7xsp/K2HzOvUj0g0yjk+utkU45QEYhnhjnbHRA==",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs/-/tfjs-4.17.0.tgz",
+      "integrity": "sha512-yXRBhpM3frlNA/YaPp6HNk9EfIi8han5RYeQA3R8OCa0Od+AfoG1PUmlxV8fE2wCorlGVyHsgpiJ6M9YZPB56w==",
       "requires": {
-        "@tensorflow/tfjs-backend-cpu": "4.21.0",
-        "@tensorflow/tfjs-backend-webgl": "4.21.0",
-        "@tensorflow/tfjs-converter": "4.21.0",
-        "@tensorflow/tfjs-core": "4.21.0",
-        "@tensorflow/tfjs-data": "4.21.0",
-        "@tensorflow/tfjs-layers": "4.21.0",
+        "@tensorflow/tfjs-backend-cpu": "4.17.0",
+        "@tensorflow/tfjs-backend-webgl": "4.17.0",
+        "@tensorflow/tfjs-converter": "4.17.0",
+        "@tensorflow/tfjs-core": "4.17.0",
+        "@tensorflow/tfjs-data": "4.17.0",
+        "@tensorflow/tfjs-layers": "4.17.0",
        "argparse": "^1.0.10",
        "chalk": "^4.1.0",
        "core-js": "3.29.1",
@@ -11908,35 +11883,35 @@
      }
    },
    "@tensorflow/tfjs-backend-cpu": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-backend-cpu/-/tfjs-backend-cpu-4.21.0.tgz",
-      "integrity": "sha512-yS9Oisg4L48N7ML6677ilv1eP5Jt59S74skSU1cCsM4yBAtH4DAn9b89/JtqBISh6JadanfX26b4HCWQvMvqFg==",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-backend-cpu/-/tfjs-backend-cpu-4.17.0.tgz",
+      "integrity": "sha512-2VSCHnX9qhYTjw9HiVwTBSnRVlntKXeBlK7aSVsmZfHGwWE2faErTtO7bWmqNqw0U7gyznJbVAjlow/p+0RNGw==",
      "requires": {
        "@types/seedrandom": "^2.4.28",
        "seedrandom": "^3.0.5"
      }
    },
    "@tensorflow/tfjs-backend-webgl": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-backend-webgl/-/tfjs-backend-webgl-4.21.0.tgz",
-      "integrity": "sha512-7k6mb7dd0uF9jI51iunF3rhEXjvR/a613kjWZ0Rj3o1COFrneyku2C7cRMZERWPhbgXZ+dF+j9MdpGIpgtShIQ==",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-backend-webgl/-/tfjs-backend-webgl-4.17.0.tgz",
+      "integrity": "sha512-CC5GsGECCd7eYAUaKq0XJ48FjEZdgXZWPxgUYx4djvfUx5fQPp35hCSP9w/k463jllBMbjl2tKRg8u7Ia/LYzg==",
      "requires": {
-        "@tensorflow/tfjs-backend-cpu": "4.21.0",
+        "@tensorflow/tfjs-backend-cpu": "4.17.0",
        "@types/offscreencanvas": "~2019.3.0",
        "@types/seedrandom": "^2.4.28",
        "seedrandom": "^3.0.5"
      }
    },
    "@tensorflow/tfjs-converter": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-converter/-/tfjs-converter-4.21.0.tgz",
-      "integrity": "sha512-cUhU+F1lGx2qnKk/gRy8odBh0PZlFz0Dl71TG8LVnj0/g352DqiNrKXlKO/po9aWzP8x0KUGC3gNMSMJW+T0DA==",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-converter/-/tfjs-converter-4.17.0.tgz",
+      "integrity": "sha512-qFxIjPfomCuTrYxsFjtKbi3QfdmTTCWo+RvqD64oCMS0sjp7sUDNhJyKDoLx6LZhXlwXpHIVDJctLMRMwet0Zw==",
      "requires": {}
    },
    "@tensorflow/tfjs-core": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-core/-/tfjs-core-4.21.0.tgz",
-      "integrity": "sha512-ZbECwXps5wb9XXcGq4ZXvZDVjr5okc3I0+i/vU6bpQ+nVApyIrMiyEudP8f6vracVTvNmnlN62vUXoEsQb2F8g==",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-core/-/tfjs-core-4.17.0.tgz",
+      "integrity": "sha512-v9Q5430EnRpyhWNd9LVgXadciKvxLiq+sTrLKRowh26BHyAsams4tZIgX3lFKjB7b90p+FYifVMcqLTTHgjGpQ==",
      "requires": {
        "@types/long": "^4.0.1",
        "@types/offscreencanvas": "~2019.7.0",
@@ -11955,9 +11930,9 @@
      }
    },
    "@tensorflow/tfjs-data": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-data/-/tfjs-data-4.21.0.tgz",
-      "integrity": "sha512-LpJ/vyQMwYHkcVCqIRg7IVVw13VBY7rNAiuhmKP9S5NP/2ye4KA8BJ4XwDIDgjCVQM7glK9L8bMav++xCDf7xA==",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-data/-/tfjs-data-4.17.0.tgz",
+      "integrity": "sha512-aPKrDFip+gXicWOFALeNT7KKQjRXFkHd/hNe/zs4mCFcIN00hy1PkZ6xkYsgrsdLDQMBSGeS4B4ZM0k5Cs88QA==",
      "requires": {
        "@types/node-fetch": "^2.1.2",
        "node-fetch": "~2.6.1",
@@ -11965,9 +11940,9 @@
      }
    },
    "@tensorflow/tfjs-layers": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-layers/-/tfjs-layers-4.21.0.tgz",
-      "integrity": "sha512-a8KaMYlY3+llvE9079nvASKpaaf8xpCMdOjbgn+eGhdOGOcY7QuFUkd/2odvnXDG8fK/jffE1LoNOlfYoBHC4w==",
+      "version": "4.17.0",
+      "resolved": "https://registry.npmjs.org/@tensorflow/tfjs-layers/-/tfjs-layers-4.17.0.tgz",
+      "integrity": "sha512-DEE0zRKvf3LJ0EcvG5XouJYOgFGWYAneZ0K1d23969z7LfSyqVmBdLC6BTwdLKuJk3ouUJIKXU1TcpFmjDuh7g==",
      "requires": {}
    },
    "@types/estree": {
```
```diff
@@ -1,7 +1,7 @@
 {
   "name": "alvinn",
   "private": true,
-  "version": "0.5.0-alpha",
+  "version": "0.5.0-rc",
   "description": "ALVINN",
   "repository": "",
   "license": "UNLICENSED",
@@ -14,8 +14,7 @@
     "cordova-ios": "cross-env TARGET=cordova cross-env NODE_ENV=production vite build && node ./build/build-cordova.js && cd cordova && cordova run ios",
     "build-cordova-android": "cross-env TARGET=cordova cross-env NODE_ENV=production vite build && node ./build/build-cordova.js && cd cordova && cordova build android",
     "cordova-android": "cross-env TARGET=cordova cross-env NODE_ENV=production vite build && node ./build/build-cordova.js && cd cordova && cordova run android",
-    "postinstall": "cpy --flat ./node_modules/framework7-icons/fonts/*.* ./src/fonts/",
-    "preview": "vite preview"
+    "postinstall": "cpy --flat ./node_modules/framework7-icons/fonts/*.* ./src/fonts/"
   },
   "browserslist": [
     "IOS >= 15",
@@ -24,7 +23,7 @@
     "last 5 Firefox versions"
   ],
   "dependencies": {
-    "@tensorflow/tfjs": "^4.21.0",
+    "@tensorflow/tfjs": "^4.17.0",
     "dom7": "^4.0.6",
     "framework7": "^8.3.0",
     "framework7-icons": "^5.0.5",
```
```diff
@@ -1,17 +1,9 @@
-demo: true
-agreeExpire: 3
-regions:
-  - thorax
-  - abdomen
-  - limbs
-  - head
-useExternal: none
-disableWorkers: false
+site:
+  demo: true
+  regions:
+    - thorax
+    - abdomen
+    - limbs
+external:
+  - name: Mserver
+    address: "192.169.1.105"
+    port: 9001
+  - name: Georgi lab server
+    address: "10.188.0.98"
+    port: 9001
+infoUrl: http://anatlabwiki.midwestern.edu/vetlab/index.php/
```
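The `infoUrl` convention described in the README (the structure label, with spaces replaced by underscores, is appended to the root URL to form the information link) can be sketched as follows; the `infoLink` helper name is illustrative, not from the repository:

```javascript
// Illustrative helper for the infoUrl link rule described in the README:
// replace spaces in the structure label with underscores and append the
// result to the configured root URL.
function infoLink(rootUrl, label) {
  return rootUrl + label.replace(/ /g, "_");
}

console.log(
  infoLink("http://anatlabwiki.midwestern.edu/vetlab/index.php/", "Abdominal diaphragm")
);
// → http://anatlabwiki.midwestern.edu/vetlab/index.php/Abdominal_diaphragm
```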
```diff
@@ -1,10 +1,11 @@
 {
-  "version": "0.1.0-n4",
-  "region": "Thorax",
+  "version": "0.0.0-n1",
+  "region": "Coco",
   "size": 640,
   "epochs": 1000,
-  "name": "nano4",
+  "name": "coco128 test",
   "yolo-version": "8.1.20 docker",
-  "date": "2024-03-08",
-  "export": "0.1.0-th"
+  "date": "2024-03-12",
+  "export": "coco128.yaml"
 }
```
BIN  public/models/abdomen-mini/group1-shard1of4.bin  (new file; binary file not shown)
BIN  public/models/abdomen-mini/group1-shard2of4.bin  (new file; binary file not shown)
BIN  public/models/abdomen-mini/group1-shard3of4.bin  (new file; binary file not shown)
BIN  public/models/abdomen-mini/group1-shard4of4.bin  (new file; binary file not shown)
@@ -1,7 +1,7 @@
description: Ultralytics best model trained on /data/ALVINN/Thorax/Thorax 0.1.0/thorax.yaml
description: Ultralytics best model trained on /usr/src/ultralytics/ultralytics/cfg/datasets/coco128.yaml
author: Ultralytics
license: AGPL-3.0 https://ultralytics.com/license
date: '2024-03-08T20:14:34.118186'
date: '2024-03-12T16:25:00.089873'
version: 8.1.20
stride: 32
task: detect
@@ -10,44 +10,83 @@ imgsz:
- 640
- 640
names:
0: Abdominal diaphragm
1: Aorta
2: Azygous vein
3: Brachiocephalic trunk
4: Caudal vena cava
5: Cranial vena cava
6: Esophagus
7: External abdominal oblique
8: Iliocostalis
9: Latissimus dorsi
10: Left atrium
11: Left auricle
12: Left lung
13: Left subclavian artery
14: Left ventricle
15: Longissimus
16: Pectoralis profundus
17: Pectoralis superficialis
18: Pericardium
19: Phrenic nerve
20: Primary bronchus
21: Pulmonary artery
22: Pulmonary trunk
23: Pulmonary vein
24: Rectus abdominis
25: Rectus thoracis
26: Recurrent laryngeal nerve
27: Rhomboideus
28: Right atrium
29: Right auricle
30: Right lung
31: Right ventricle
32: Scalenus
33: Serratus dorsalis caudalis
34: Serratus dorsalis cranialis
35: Serratus ventralis
36: Spinalis
37: Sympathetic chain
38: Trachea
39: Trapezius
40: Vagus nerve
0: person
1: bicycle
2: car
3: motorcycle
4: airplane
5: bus
6: train
7: truck
8: boat
9: traffic light
10: fire hydrant
11: stop sign
12: parking meter
13: bench
14: bird
15: cat
16: dog
17: horse
18: sheep
19: cow
20: elephant
21: bear
22: zebra
23: giraffe
24: backpack
25: umbrella
26: handbag
27: tie
28: suitcase
29: frisbee
30: skis
31: snowboard
32: sports ball
33: kite
34: baseball bat
35: baseball glove
36: skateboard
37: surfboard
38: tennis racket
39: bottle
40: wine glass
41: cup
42: fork
43: knife
44: spoon
45: bowl
46: banana
47: apple
48: sandwich
49: orange
50: broccoli
51: carrot
52: hot dog
53: pizza
54: donut
55: cake
56: chair
57: couch
58: potted plant
59: bed
60: dining table
61: toilet
62: tv
63: laptop
64: mouse
65: remote
66: keyboard
67: cell phone
68: microwave
69: oven
70: toaster
71: sink
72: refrigerator
73: book
74: clock
75: vase
76: scissors
77: teddy bear
78: hair drier
79: toothbrush
File diff suppressed because one or more lines are too long
@@ -1,43 +1,82 @@
[
"Abdominal diaphragm",
"Aorta",
"Azygous vein",
"Brachiocephalic trunk",
"Caudal vena cava",
"Cranial vena cava",
"Esophagus",
"External abdominal oblique",
"Iliocostalis",
"Latissimus dorsi",
"Left atrium",
"Left auricle",
"Left lung",
"Left subclavian artery",
"Left ventricle",
"Longissimus",
"Pectoralis profundus",
"Pectoralis superficialis",
"Pericardium",
"Phrenic nerve",
"Primary bronchus",
"Pulmonary artery",
"Pulmonary trunk",
"Pulmonary vein",
"Rectus abdominis",
"Rectus thoracis",
"Recurrent laryngeal nerve",
"Rhomboideus",
"Right atrium",
"Right auricle",
"Right lung",
"Right ventricle",
"Scalenus",
"Serratus dorsalis caudalis",
"Serratus dorsalis cranialis",
"Serratus ventralis",
"Spinalis",
"Sympathetic chain",
"Trachea",
"Trapezius",
"Vagus nerve"
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"dining table",
"toilet",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
]
@@ -1,10 +1,11 @@
{
"version": "0.1.0-n4",
"region": "Thorax",
"version": "0.0.0-n1",
"region": "Coco",
"size": 640,
"epochs": 1000,
"name": "nano4",
"name": "coco128 test",
"yolo-version": "8.1.20 docker",
"date": "2024-03-08",
"export": "0.1.0-th"
"date": "2024-03-12",
"export": "coco128.yaml"
}
Binary file not shown.
BIN public/models/abdomen/group1-shard1of4.bin (Normal file, binary file not shown)
BIN public/models/abdomen/group1-shard2of4.bin (Normal file, binary file not shown)
BIN public/models/abdomen/group1-shard3of4.bin (Normal file, binary file not shown)
BIN public/models/abdomen/group1-shard4of4.bin (Normal file, binary file not shown)
@@ -1,7 +1,7 @@
description: Ultralytics best model trained on /data/ALVINN/Thorax/Thorax 0.1.0/thorax.yaml
description: Ultralytics best model trained on /usr/src/ultralytics/ultralytics/cfg/datasets/coco128.yaml
author: Ultralytics
license: AGPL-3.0 https://ultralytics.com/license
date: '2024-03-08T20:14:34.118186'
date: '2024-03-12T16:25:00.089873'
version: 8.1.20
stride: 32
task: detect
@@ -10,44 +10,83 @@ imgsz:
- 640
- 640
names:
0: Abdominal diaphragm
1: Aorta
2: Azygous vein
3: Brachiocephalic trunk
4: Caudal vena cava
5: Cranial vena cava
6: Esophagus
7: External abdominal oblique
8: Iliocostalis
9: Latissimus dorsi
10: Left atrium
11: Left auricle
12: Left lung
13: Left subclavian artery
14: Left ventricle
15: Longissimus
16: Pectoralis profundus
17: Pectoralis superficialis
18: Pericardium
19: Phrenic nerve
20: Primary bronchus
21: Pulmonary artery
22: Pulmonary trunk
23: Pulmonary vein
24: Rectus abdominis
25: Rectus thoracis
26: Recurrent laryngeal nerve
27: Rhomboideus
28: Right atrium
29: Right auricle
30: Right lung
31: Right ventricle
32: Scalenus
33: Serratus dorsalis caudalis
34: Serratus dorsalis cranialis
35: Serratus ventralis
36: Spinalis
37: Sympathetic chain
38: Trachea
39: Trapezius
40: Vagus nerve
0: person
1: bicycle
2: car
3: motorcycle
4: airplane
5: bus
6: train
7: truck
8: boat
9: traffic light
10: fire hydrant
11: stop sign
12: parking meter
13: bench
14: bird
15: cat
16: dog
17: horse
18: sheep
19: cow
20: elephant
21: bear
22: zebra
23: giraffe
24: backpack
25: umbrella
26: handbag
27: tie
28: suitcase
29: frisbee
30: skis
31: snowboard
32: sports ball
33: kite
34: baseball bat
35: baseball glove
36: skateboard
37: surfboard
38: tennis racket
39: bottle
40: wine glass
41: cup
42: fork
43: knife
44: spoon
45: bowl
46: banana
47: apple
48: sandwich
49: orange
50: broccoli
51: carrot
52: hot dog
53: pizza
54: donut
55: cake
56: chair
57: couch
58: potted plant
59: bed
60: dining table
61: toilet
62: tv
63: laptop
64: mouse
65: remote
66: keyboard
67: cell phone
68: microwave
69: oven
70: toaster
71: sink
72: refrigerator
73: book
74: clock
75: vase
76: scissors
77: teddy bear
78: hair drier
79: toothbrush
File diff suppressed because one or more lines are too long
@@ -1,10 +0,0 @@
{
"version": "0.1.0-n4",
"region": "Thorax",
"size": 640,
"epochs": 1000,
"name": "nano4",
"yolo-version": "8.1.20 docker",
"date": "2024-03-08",
"export": "0.1.0-th"
}
Binary file not shown.
@@ -1,53 +0,0 @@
description: Ultralytics best model trained on /data/ALVINN/Thorax/Thorax 0.1.0/thorax.yaml
author: Ultralytics
license: AGPL-3.0 https://ultralytics.com/license
date: '2024-03-08T20:14:34.118186'
version: 8.1.20
stride: 32
task: detect
batch: 1
imgsz:
- 640
- 640
names:
0: Abdominal diaphragm
1: Aorta
2: Azygous vein
3: Brachiocephalic trunk
4: Caudal vena cava
5: Cranial vena cava
6: Esophagus
7: External abdominal oblique
8: Iliocostalis
9: Latissimus dorsi
10: Left atrium
11: Left auricle
12: Left lung
13: Left subclavian artery
14: Left ventricle
15: Longissimus
16: Pectoralis profundus
17: Pectoralis superficialis
18: Pericardium
19: Phrenic nerve
20: Primary bronchus
21: Pulmonary artery
22: Pulmonary trunk
23: Pulmonary vein
24: Rectus abdominis
25: Rectus thoracis
26: Recurrent laryngeal nerve
27: Rhomboideus
28: Right atrium
29: Right auricle
30: Right lung
31: Right ventricle
32: Scalenus
33: Serratus dorsalis caudalis
34: Serratus dorsalis cranialis
35: Serratus ventralis
36: Spinalis
37: Sympathetic chain
38: Trachea
39: Trapezius
40: Vagus nerve
File diff suppressed because one or more lines are too long
@@ -1,43 +0,0 @@
[
"Abdominal diaphragm",
"Aorta",
"Azygous vein",
"Brachiocephalic trunk",
"Caudal vena cava",
"Cranial vena cava",
"Esophagus",
"External abdominal oblique",
"Iliocostalis",
"Latissimus dorsi",
"Left atrium",
"Left auricle",
"Left lung",
"Left subclavian artery",
"Left ventricle",
"Longissimus",
"Pectoralis profundus",
"Pectoralis superficialis",
"Pericardium",
"Phrenic nerve",
"Primary bronchus",
"Pulmonary artery",
"Pulmonary trunk",
"Pulmonary vein",
"Rectus abdominis",
"Rectus thoracis",
"Recurrent laryngeal nerve",
"Rhomboideus",
"Right atrium",
"Right auricle",
"Right lung",
"Right ventricle",
"Scalenus",
"Serratus dorsalis caudalis",
"Serratus dorsalis cranialis",
"Serratus ventralis",
"Spinalis",
"Sympathetic chain",
"Trachea",
"Trapezius",
"Vagus nerve"
]
@@ -1,12 +0,0 @@
{
"version": "0.3.1-s1",
"region": "Thorax",
"size": 960,
"epochs": 2000,
"epochsFinal:": 1656,
"name": "small1",
"yolo-version": "8.2.16 docker",
"date": "2024-06-05",
"export": "0.3.0-th",
"grayscale": true
}
Binary file not shown.
@@ -1,53 +0,0 @@
description: Ultralytics best model trained on /data/ALVINN/Thorax/Thorax 0.3.0/thorax_g.yaml
author: Ultralytics
license: AGPL-3.0 https://ultralytics.com/license
date: '2024-06-05T22:55:38.088791'
version: 8.1.20
stride: 32
task: detect
batch: 1
imgsz:
- 960
- 960
names:
0: Abdominal diaphragm
1: Aorta
2: Azygous vein
3: Brachiocephalic trunk
4: Caudal vena cava
5: Cranial vena cava
6: Esophagus
7: External abdominal oblique
8: Iliocostalis
9: Latissimus dorsi
10: Left atrium
11: Left auricle
12: Left lung
13: Left subclavian artery
14: Left ventricle
15: Longissimus
16: Pectoralis profundus
17: Pectoralis superficialis
18: Pericardium
19: Phrenic nerve
20: Primary bronchus
21: Pulmonary artery
22: Pulmonary trunk
23: Pulmonary vein
24: Rectus abdominis
25: Rectus thoracis
26: Recurrent laryngeal nerve
27: Rhomboideus
28: Right atrium
29: Right auricle
30: Right lung
31: Right ventricle
32: Scalenus
33: Serratus dorsalis caudalis
34: Serratus dorsalis cranialis
35: Serratus ventralis
36: Spinalis
37: Sympathetic chain
38: Trachea
39: Trapezius
40: Vagus nerve
File diff suppressed because one or more lines are too long
@@ -1,12 +1,10 @@
{
"version": "0.2.1-n3",
"version": "0.1.0-n4",
"region": "Thorax",
"size": 640,
"epochs": 1500,
"name": "nano3",
"yolo-version": "8.2.16 docker",
"date": "2024-06-17",
"export": "0.2.1-th",
"grayscale": true,
"background": 35
}
"epochs": 1000,
"name": "nano4",
"yolo-version": "8.1.20 docker",
"date": "2024-03-08",
"export": "0.1.0-th"
}
Binary file not shown.
@@ -1,9 +1,8 @@
description: Ultralytics best model trained on /data/ALVINN/Thorax 0.2.1/thorax_g.yaml
description: Ultralytics best model trained on /data/ALVINN/Thorax/Thorax 0.1.0/thorax.yaml
author: Ultralytics
date: '2024-06-17T22:40:05.967309'
version: 8.2.16
license: AGPL-3.0 License (https://ultralytics.com/license)
docs: https://docs.ultralytics.com
license: AGPL-3.0 https://ultralytics.com/license
date: '2024-03-08T20:14:34.118186'
version: 8.1.20
stride: 32
task: detect
batch: 1
File diff suppressed because one or more lines are too long
@@ -1,12 +1,10 @@
{
"version": "0.2.1-s1",
"version": "0.1.0-s1",
"region": "Thorax",
"size": 1080,
"epochs": 1399,
"size": 640,
"epochs": 1000,
"name": "small1",
"yolo-version": "8.2.16 docker",
"date": "2024-06-18",
"export": "0.2.1-th",
"grayscale": true,
"background": 35
"yolo-version": "8.1.20 docker",
"date": "2024-03-07",
"export": "0.1.0-th"
}
Binary file not shown.
@@ -1,15 +1,14 @@
description: Ultralytics best model trained on /data/ALVINN/Thorax 0.2.1/thorax_g.yaml
description: Ultralytics best model trained on /data/ALVINN/Thorax/Thorax 0.1.0/thorax.yaml
author: Ultralytics
date: '2024-06-18T23:10:47.568324'
version: 8.2.16
license: AGPL-3.0 License (https://ultralytics.com/license)
docs: https://docs.ultralytics.com
license: AGPL-3.0 https://ultralytics.com/license
date: '2024-03-07T16:03:03.296997'
version: 8.1.20
stride: 32
task: detect
batch: 1
imgsz:
- 1088
- 1088
- 640
- 640
names:
0: Abdominal diaphragm
1: Aorta
File diff suppressed because one or more lines are too long
@@ -1,177 +0,0 @@
import * as tf from '@tensorflow/tfjs'

let model = null

onmessage = function (e) {
  switch (e.data.call) {
    case 'loadModel':
      loadModel(e.data.weights,e.data.preload).then(() => {
        postMessage({success: 'model'})
      }).catch((err) => {
        postMessage({error: true, message: err.message})
      })
      break
    case 'localDetect':
      localDetect(e.data.image).then((dets) => {
        postMessage({success: 'detection', detections: dets})
      }).catch((err) => {
        //throw (err)
        postMessage({error: true, message: err.message})
      })
      e.data.image.close()
      break
    case 'videoFrame':
      videoFrame(e.data.image).then((frameDet) =>{
        postMessage({succes: 'frame', coords: frameDet.cds, modelWidth: frameDet.mW, modelHeight: frameDet.mH})
      }).catch((err) => {
        postMessage({error: true, message: err.message})
      })
      e.data.image.close()
      break
    default:
      console.log('Worker message incoming:')
      console.log(e)
      postMessage({result1: 'First result', result2: 'Second result'})
      break
  }
}

async function loadModel(weights, preload) {
  if (model && model.modelURL == weights) {
    return model
  } else if (model) {
    tf.dispose(model)
  }
  model = await tf.loadGraphModel(weights)
  const [modelWidth, modelHeight] = model.inputs[0].shape.slice(1, 3)
  /*****************
   * If preloading then run model
   * once on fake data to preload
   * weights for a faster response
   *****************/
  if (preload) {
    const dummyT = tf.ones([1,modelWidth,modelHeight,3])
    model.predict(dummyT)
  }
  return model
}

async function localDetect(imageData) {
  console.time('sw: pre-process')
  const [modelWidth, modelHeight] = model.inputs[0].shape.slice(1, 3)
  let gTense = null
  const input = tf.tidy(() => {
    gTense = tf.image.rgbToGrayscale(tf.image.resizeBilinear(tf.browser.fromPixels(imageData), [modelWidth, modelHeight])).div(255.0).expandDims(0)
    return tf.concat([gTense,gTense,gTense],3)
  })
  tf.dispose(gTense)
  console.timeEnd('sw: pre-process')

  console.time('sw: run prediction')
  const res = model.predict(input)
  const tRes = tf.transpose(res,[0,2,1])
  const rawRes = tRes.arraySync()[0]
  console.timeEnd('sw: run prediction')

  console.time('sw: post-process')
  const outputSize = res.shape[1]
  const output = {
    detections: []
  }
  let rawBoxes = []
  let rawScores = []
  let getScores, getBox, boxCalc

  for (let i = 0; i < rawRes.length; i++) {
    getScores = rawRes[i].slice(4)
    if (getScores.every( s => s < .05)) { continue }
    getBox = rawRes[i].slice(0,4)
    boxCalc = [
      (getBox[0] - (getBox[2] / 2)) / modelWidth,
      (getBox[1] - (getBox[3] / 2)) / modelHeight,
      (getBox[0] + (getBox[2] / 2)) / modelWidth,
      (getBox[1] + (getBox[3] / 2)) / modelHeight,
    ]
    rawBoxes.push(boxCalc)
    rawScores.push(getScores)
  }

  if (rawBoxes.length > 0) {
    const tBoxes = tf.tensor2d(rawBoxes)
    let tScores = null
    let resBoxes = null
    let validBoxes = []
    let structureScores = null
    let boxes_data = []
    let scores_data = []
    let classes_data = []
    for (let c = 0; c < outputSize - 4; c++) {
      structureScores = rawScores.map(x => x[c])
      tScores = tf.tensor1d(structureScores)
      resBoxes = await tf.image.nonMaxSuppressionAsync(tBoxes,tScores,10,0.5,.05)
      validBoxes = resBoxes.dataSync()
      tf.dispose(resBoxes)
      if (validBoxes) {
        boxes_data.push(...rawBoxes.filter( (_, idx) => validBoxes.includes(idx)))
        let outputScores = structureScores.filter( (_, idx) => validBoxes.includes(idx))
        scores_data.push(...outputScores)
        classes_data.push(...outputScores.fill(c))
      }
    }

    validBoxes = []
    tf.dispose(tBoxes)
    tf.dispose(tScores)
    tf.dispose(tRes)
    tf.dispose(resBoxes)
    const valid_detections_data = classes_data.length
    for (let i =0; i < valid_detections_data; i++) {
      let [dLeft, dTop, dRight, dBottom] = boxes_data[i]
      output.detections.push({
        "top": dTop,
        "left": dLeft,
        "bottom": dBottom,
        "right": dRight,
        "label": classes_data[i],
        "confidence": scores_data[i] * 100
      })
    }
  }
  tf.dispose(res)
  tf.dispose(input)
  console.timeEnd('sw: post-process')

  return output || { detections: [] }
}

async function videoFrame (vidData) {
  const [modelWidth, modelHeight] = model.inputs[0].shape.slice(1, 3)
  console.time('sw: frame-process')
  let rawCoords = []
  try {
    const input = tf.tidy(() => {
      return tf.image.resizeBilinear(tf.browser.fromPixels(vidData), [modelWidth, modelHeight]).div(255.0).expandDims(0)
    })
    const res = model.predict(input)
    const rawRes = tf.transpose(res,[0,2,1]).arraySync()[0]

    if (rawRes) {
      for (let i = 0; i < rawRes.length; i++) {
        let getScores = rawRes[i].slice(4)
        if (getScores.some( s => s > .5)) {
          let foundTarget = rawRes[i].slice(0,2)
          foundTarget.push(Math.max(...getScores))
          rawCoords.push(foundTarget)
        }
      }
    }
    tf.dispose(input)
    tf.dispose(res)
    tf.dispose(rawRes)
  } catch (e) {
    console.log(e)
  }
  console.timeEnd('sw: frame-process')
  return {cds: rawCoords, mW: modelWidth, mH: modelHeight}
}
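The removed worker's box post-processing converts YOLOv8 output, where each candidate is [centerX, centerY, width, height] in model pixels, into normalized [left, top, right, bottom] corners for `nonMaxSuppressionAsync`. A standalone sketch of that conversion step (the function name `decodeBox` is illustrative; the arithmetic mirrors `boxCalc` above):

```javascript
// Sketch: convert a YOLO-style [cx, cy, w, h] box (model pixels)
// into normalized [left, top, right, bottom], as the worker's boxCalc does.
function decodeBox([cx, cy, w, h], modelWidth, modelHeight) {
  return [
    (cx - w / 2) / modelWidth,   // left
    (cy - h / 2) / modelHeight,  // top
    (cx + w / 2) / modelWidth,   // right
    (cy + h / 2) / modelHeight,  // bottom
  ];
}

console.log(decodeBox([320, 320, 640, 640], 640, 640)); // full-frame box -> [ 0, 0, 1, 1 ]
```

Normalized corners keep the NMS step independent of the input resolution, so the same threshold values work for the 640 and 960 pixel models in this repo.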
@@ -33,7 +33,7 @@
ALVINN is for educational purposes only. It may not be used for medical diagnosis, intervention, or treatment.
</h3>
<div style="display: flex; justify-content: space-around; flex-direction: row; align-items: center;">
<span v-if="!siteConf || !siteConf.agreeExpire == 0" style="height: min-content;">
<span style="height: min-content;">
<f7-checkbox v-model:checked="rememberAgreement"/> Don't show again
</span>
<f7-button text="I agree" fill @click="setAgreement" />
@@ -68,63 +68,34 @@
rememberAgreement: false,
siteAgreement: false,
dateAgreement: null,
showDisclaimer: false,
showDisclaimer: true,
alvinnVersion: store().getVersion,
siteConf: {}
}
},
async created () {
document.addEventListener('keydown', e => {
if (e.code == 'KeyR') {
console.log(f7.views.main.router.history)
}
if (e.code == 'KeyB') {
f7.views.main.router.back()
}
})
if (!window.cordova) {
const confText = await fetch('./conf/conf.yaml')
.then((mod) => { return mod.text() })
this.siteConf = YAML.parse(confText)
}
const loadSiteSettings = localStorage.getItem('siteSettings')
created () {
fetch(`${!!window.cordova ? 'https://localhost' : '.'}/conf/conf.yaml`)
.then((mod) => { return mod.text() })
.then((confText) => {
this.siteConf = YAML.parse(confText)
console.log(this.siteConf)
})
var loadSiteSettings = localStorage.getItem('siteSettings')
if (loadSiteSettings) {
let loadedSettings = JSON.parse(loadSiteSettings)
var loadedSettings = JSON.parse(loadSiteSettings)
this.siteAgreement = loadedSettings.siteAgreement
this.rememberAgreement = loadedSettings.rememberAgreement
this.dateAgreement = loadedSettings.dateAgreement && new Date(loadedSettings.dateAgreement)
}
const curDate = new Date ()
const expireMonth = (this.dateAgreement?.getMonth() || 0) + (this.siteConf?.agreeExpire || 3)
const agreeStillValid = this.dateAgreement && (curDate < this.dateAgreement.setMonth(expireMonth))
if (this.siteAgreement && this.rememberAgreement && agreeStillValid && !this.siteConf?.agreeExpire == 0) {
var curDate = new Date ()
var agreeStillValid = this.dateAgreement && (curDate < this.dateAgreement.setMonth(this.dateAgreement.getMonth() + 3))
if (this.siteAgreement && this.rememberAgreement && agreeStillValid) {
this.showDisclaimer = false
store().agree()
} else {
this.showDisclaimer = true
}
store().set('enabledRegions',this.siteConf?.regions)
store().set('siteDemo',this.siteConf?.demo)
store().set('infoUrl',this.siteConf?.infoUrl)
const loadServerSettings = localStorage.getItem('serverSettings')
if (this.siteConf.disableWorkers) {
store().disableWorkers()
}
if (this.siteConf?.useExternal) {
if (!['none','list','optional','required'].includes(this.siteConf.useExternal)) {
console.warn(`'${this.siteConf.useExternal}' is not a valid value for useExternal configuration: using 'optional'`)
} else {
store().set('useExternal',this.siteConf.useExternal)
if (this.siteConf.external) {
store().set('externalServerList',this.siteConf.external)
}
}
}
if (this.siteConf.useExternal == 'none') {
localStorage.setItem('serverSettings','{"use":false}')
} else if (!loadServerSettings && !this.siteConf.external) {
var loadServerSettings = localStorage.getItem('serverSettings')
if (!loadServerSettings) {
localStorage.setItem('serverSettings','{"use":false,"address":"10.188.0.98","port":"9001","previous":{"10.188.0.98":"9001"}}')
} else if (this.siteConf.useExternal == 'required') {
localStorage.setItem('serverSettings',`{"use":true,"address":"${this.siteConf.external[0].address}","port":${this.siteConf.external[0].port}}`)
}
},
methods: {
@@ -151,7 +122,7 @@
this.showDisclaimer = false
},
() => {
const toast = f7.toast.create({
var toast = f7.toast.create({
text: 'ERROR: No settings saved',
closeTimeout: 2000
})
@@ -163,11 +134,13 @@
setup() {
const device = getDevice();
// Framework7 Parameters
const loadThemeSettings = localStorage.getItem('themeSettings')
let themeSettings = {}
let darkTheme = 'auto'
if (loadThemeSettings) { themeSettings = JSON.parse(loadThemeSettings) }
if (themeSettings?.darkMode) darkTheme = themeSettings.darkMode
var loadThemeSettings = localStorage.getItem('themeSettings')
if (loadThemeSettings) var themeSettings = JSON.parse(loadThemeSettings)
try {
if (themeSettings.darkMode.toString()) var darkTheme = themeSettings.darkMode
} catch {
var darkTheme = 'auto'
}
const f7params = {
name: 'ALVINN', // App name
theme: 'auto', // Automatic theme detection
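The new `created()` hook above treats a stored agreement as valid for `siteConf.agreeExpire` months (defaulting to 3) after `dateAgreement`. Isolated as a pure function, that expiry check can be sketched as (the function name is illustrative, not from the repo):

```javascript
// Sketch of the disclaimer-agreement expiry check in created() above.
// `agreeExpire` is how many months an agreement stays valid (default 3).
function agreementStillValid(dateAgreement, agreeExpire = 3, now = new Date()) {
  if (!dateAgreement) return false;
  const expires = new Date(dateAgreement);          // copy: avoid mutating the stored date
  expires.setMonth(expires.getMonth() + agreeExpire);
  return now < expires;
}
```

Copying the date before `setMonth` avoids the subtle mutation in the original, where `this.dateAgreement` itself is advanced by the comparison.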
@@ -1,25 +1,14 @@
|
||||
<template>
|
||||
<svg width="100%" height="100%" version="1.1" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg">
|
||||
<g v-if="iconSet == 1" stroke="none" :fill="fillColor" >
|
||||
<path d="M22,8.25 20.75,7.5 20.25,6.5 19,6 V5 L18.25,6 16,7.75 13.5,8.75 H8.5 L7,9 6,9.75 5,11 L4.25,12.5 3.5,14 2.5,15 2,15.5 2.5,15.75 3.5,15.5 4.5,14.25 5.5,12.25 6.75,10.75 7,12 7.25,13.25 6.5,15.5 7,19 H8 V 18.5 L7.5,18.25 7.75,15.75 9.75,12.25 12,13 15.25,13.5 15.5,17.25 16,19 H17 V18.5 L16.5,18.25 V15.5 L17,13 17.75,10.75 19,8.75 H20 L21.25,9 Z" style="opacity: .4;"/>
|
||||
<path v-if="region == 0" d="M16,7.75 13.5,8.75 12,13 15.25,13.5 17,13 17.75,10.75 Z" fill-rule="evenodd" :fill="fillColor" />
|
||||
<path v-else-if="region ==1" d="M13.5,8.75 H8.5 L7,9 6,9.75 5,11 4.25,12.5 3.5,14 2.5,15 2,15.5 2.5,15.75 3.5,15.5 4.5,14.25 5.5,12.25 6.75,10.75 7,12 9.75,12.25 12,13 Z" :fill="fillColor" fill-rule="evenodd"/>
|
||||
<path v-else-if="region == 2" d="M15,8.5 C14,8.5 13.25,9.25 13.25,10.25 C13.25,10.75 13.5,11.25 13.75,11.5 L15.25,13.5 15.5,17.25 16,19 H17 V18.5 L16.5,18.25 V15.5 L17,13 17.75,10.75 16.25,9 C16,8.75 15.5,8.5 15,8.5 Z M8.5,9 C7.5,9 6.75,9.75 6.75,10.75 L7,12 7.25,13.25 6.5,15.5 7,19 H8 V18.5 L7.5,18.25 7.75,15.75 9.75,12.25 10.25,10.75 C10.25,9.75 9.5,9 8.5,9 Z" :fill="fillColor" fill-rule="evenodd"/>
|
||||
<path v-else-if="region == 3" d="M22,8.25 20.75,7.5 20.25,6.5 19,6 V5 L18.25,6 16,7.75 17.75,10.75 19,8.75 H20 L21.25,9 Z" :fill="fillColor" fill-rule="evenodd"/>
|
||||
</g>
|
||||
<g v-else-if="iconSet == 2" stroke="none" :fill="fillColor" >
<path d="M22,9.5 20.5,8.5 19.75,7.25 18.25,6.75 V 5 L 17,6.75 15,8.25 12,9.5 H 5.75 L 2.75,10 2,12 2.5,15.75 3.25,16 4,19 H 5.5 V 18.25 L 5,18 5.25,16.25 6,15.25 H 10 L 13.75,16 14.75,19 H 16.25 V 18.25 L 15.75,18 V 16 L 17,15.25 17.5,12 18.25,10.25 H 19.5 L 21,10.5 Z" style="opacity: .4;"/>
<path v-if="region == 0" d="M12,9.5 H 11 L 10,15.25 13.75,16 H 15.75 L 17,15.25 17.5,12 Z" fill-rule="evenodd" :fill="fillColor" />
<path v-else-if="region ==1" d="M 11,9.5 H 5.75 L 2.75,10 2,12 2.5,15.75 3.25,16 6,15.25 H 10 Z" :fill="fillColor" fill-rule="evenodd"/>
<path v-else-if="region == 2" d="M11.25,10.25 C 10.25,11.25 10.25,12.75 11.25,13.75 L 13.75,16 14.75,19 H 16.25 V 18.25 L 15.75,18 V 16 L 16,12 14.75,10.25 C 13.75,9.25 12.25,9.25 11.25,10.25 Z M 3,11 2,12 2.5,15.75 3.25,16 4,19 H 5.5 V 18.25 L 5,18 5.25,16.25 6,15.25 6.5,14.5 C 7.5,13.5 7.5,12 6.5,11 C 5.5,10 4,10 3,11 Z" :fill="fillColor" fill-rule="evenodd"/>
<path v-else-if="region == 3" d="M 22,9.5 20.5,8.5 19.75,7.25 18.25,6.75 V 5 L 17,6.75 15,8.25 12,9.5 17.5,12 18.25,10.25 H 19.5 L 21,10.5 Z" :fill="fillColor" fill-rule="evenodd"/>
</g>
<g v-else-if="iconSet == 3" stroke="none" :fill="fillColor" >
<path d="M22,6.25 L21,6 V5 L19.5,4.5 V3 L18.25,4.5 16,6.5 12.5,8 6,8.25 4,8.5 2.5,9.75 2,10.5 2.75,10.75 3.5,10.25 V11 L4,12.5 4.25,14 3.25,16.5 4,21 H5.25 V20.25 L4.75,20 5,16.75 7.75,13 10.5,14.5 15,15.25 15.25,18.75 16,21 H17.25 V20.25 L16.75,20 V16.5 L17.75,13.75 18.5,10.5 19.5,8.25 H20.5 L21.5,8.75 22,7.75 Z" style="opacity: .4;"/>
<path v-if="region == 0" d="M16,6.5 L12.5,8 10.5,14.5 15,15.25 17.75,13.75 18.5,10.5 Z" fill-rule="evenodd" :fill="fillColor" />
<path v-else-if="region ==1" d="M12.5,8 L6,8.25 4,8.5 2.5,9.75 2,10.5 2.75,10.75 3.5,10.25 V11 L4,12.5 7.75,13 10.5,14.5 Z" :fill="fillColor" fill-rule="evenodd"/>
<path v-else-if="region == 2" d="M12.75,8.25 C11.75,9.25 11.75,10.75 12.75,11.75 L15,15.25 15.25,18.75 16,21 H17.25 V20.25 L16.75,20 V16.5 L17.75,13.75 18.5,10.5 16.25,8.25 C15.25,7.25 13.75,7.25 12.75,8.25 Z M6,8.5 C4.75,8.5 3.5,9.75 3.5,11 L4,12.5 4.25,14 3.25,16.5 4,21 H5.25 V20.25 L4.75,20 5,16.75 7.75,13 8.5,11 C8.5,9.75 7.25,8.5 6,8.5 Z" :fill="fillColor" fill-rule="evenodd"/>
<path v-else-if="region == 3" d="M22,6.25 L21,6 V5 L19.5,4.5 V3 L18.25,4.5 L16,6.5 18.5,10.5 19.5,8.25 H20.5 L21.5,8.75 22,7.75 Z" :fill="fillColor" fill-rule="evenodd"/>
<svg width="100%" height="100%" version="1.1" viewBox="0 0 26.458333 26.458333" xmlns="http://www.w3.org/2000/svg">
<g stroke="none" :fill="fillColor" >
<path d="m25.402178 7.8631343c-0.487907-0.3670601-0.811572-0.7261214-1.573424-1.106523-0.006122-0.1598737 0.053853-0.2411643-0.072374-0.5438299-0.239221-0.3572156-1.352454-0.987126-2.19723-0.8590224-1.567124 0.9252583-1.879175 1.9380345-3.311246 2.9148849-0.987966 0.103956-2.015535 0.3206455-3.091153 0.6741123-10.556415-1.8721062-8.2481554 5.9196998-14.460584 1.7189868 0 0-0.24989902 0.06545-0.28777276 0.170279-0.0360567 0.0998 0.10708587 0.299783 0.10708587 0.299783 2.0948939 1.933517 4.742145 1.74421 6.6624536-0.07316 0.096935 0.768305 0.3887649 1.92789 0.8180324 3.363404-0.035692 1.245357-1.2923422 2.350278-1.3169003 2.801484-0.013354 0.24535 0.5120291 3.6149 0.7015429 3.650219l0.7793046 0.145235c0.8989154 0.167526 0.7195768-0.420583 0.3224789-0.780361-0.2085791-0.188976-0.3404558-0.252396-0.3637846-0.441707-0.3810495-3.092169 2.1284358-4.423261 2.4023638-6.742929 2.453391 0.120243 3.974486 1.282365 6.721539 1.403033 0.136906 1.035362-0.177341 4.099457-0.120257 4.484465 0.04824 0.325337 0.511082 0.918401 0.497537 1.876854-3e-3 0.211416 0.410117 0.159484 0.619918 0.185743 0.799059 0.09999 1.033405-0.329373 0.42557-0.75884-0.132327-0.0935-0.456134-0.264276-0.476806-0.424973-0.251045-1.951541 1.103782-4.917365 1.103782-4.917365 0.355435-0.554509 0.707693-1.135262 1.002776-2.188396 0.160636-0.543413 0.157772-1.012576 0.119972-1.465872 1.541867-1.5721797 1.123352-2.3466703 2.548492-2.7336036 0.65786 0.059985 1.147615 0.1738285 1.444935 0.3493259 0.420933-0.188852 0.760222-0.5096057 0.993749-1.001227z" style="opacity: .25;"/>
<path v-if="region == 0" d="m 18.247904,8.2686439 c -0.987966,0.103956 -3.091153,0.6741123 -3.091153,0.6741123 -1.652395,2.7995828 -2.226698,3.8098238 -2.580037,4.4476078 0,0 2.617397,0.984666 4.665796,1.066659 -0.003,0.01385 2.049744,0.445884 2.049744,0.445884 0,0 0.707693,-1.135262 1.002776,-2.188396 0.160636,-0.543413 0.157772,-1.012576 0.119972,-1.465872 -0.291029,-0.377705 -1.38593,-1.9038754 -2.167098,-2.9799951 z" fill-rule="evenodd" :fill="fillColor" />
<path v-else-if="region ==1" d="m15.156751 8.9427562c-10.556415-1.8721062-8.2481554 5.9196998-14.460584 1.7189868 0 0-0.24989902 0.06545-0.28777276 0.170279-0.0360567 0.0998 0.10708587 0.299783 0.10708587 0.299783 2.0948939 1.933517 4.742145 1.74421 6.6624536-0.07316 0.048468 0.384152 0.1456587 0.866125 0.2843915 1.431499 0.7210773 0.130029 2.5390772 0.501293 3.0586462 0.563846 0.613348 0.03006 1.528237 0.20676 2.05877 0.334503 0.563462-1.044613 0.536275-0.982536 2.57701-4.4457368z" :fill="fillColor" fill-rule="evenodd"/>
<g v-else-if="region == 2" :fill="fillColor" fill-rule="evenodd">
<path d="m17.24251 14.457023c0.136906 1.035362-0.177341 4.099457-0.120257 4.484465 0.04824 0.325337 0.511082 0.918401 0.497537 1.876854-3e-3 0.211416 0.410117 0.159484 0.619918 0.185743 0.799059 0.09999 1.033405-0.329373 0.42557-0.75884-0.132327-0.0935-0.456134-0.264276-0.476806-0.424973-0.251045-1.951541 1.103782-4.917365 1.103782-4.917365 0.355435-0.554509 0.707693-1.135262 1.002776-2.188396 0.160636-0.543413 0.157772-1.012576 0.119972-1.465872-3.100189-4.8581326-4.866767-0.394712-3.172492 3.208384z" />
<path d="m7.1779333 11.058645c0.096935 0.768305 0.3887649 1.92789 0.8180324 3.363404-0.035692 1.245357-1.2923422 2.350278-1.3169003 2.801484-0.013354 0.24535 0.5120291 3.6149 0.7015429 3.650219l0.7793046 0.145235c0.8989154 0.167526 0.7195768-0.420583 0.3224789-0.780361-0.2085791-0.188976-0.3404558-0.252396-0.3637846-0.441707-0.3810495-3.092169 2.1284358-4.423261 2.4023638-6.742929 2.1562-5.4517681-2.8350883-3.4878487-3.3430377-1.995345z" />
</g>
<path v-else-if="region == 3" d="m25.402178 7.8631343c-0.487907-0.3670601-0.811572-0.7261214-1.573424-1.106523-0.006122-0.1598737 0.053853-0.2411643-0.072374-0.5438299-0.239221-0.3572156-1.352454-0.987126-2.19723-0.8590224-1.567124 0.9252583-1.879175 1.9380345-3.311246 2.9148849 0.566485 0.8398567 1.254642 1.7575311 2.167098 2.9799951 1.541867-1.5721797 1.123352-2.3466703 2.548492-2.7336036 0.65786 0.059985 1.147615 0.1738285 1.444935 0.3493259 0.420933-0.188852 0.760222-0.5096057 0.993749-1.001227z" :fill="fillColor" fill-rule="evenodd"/>
</g>
</svg>
</template>
@@ -37,13 +26,6 @@
fillColor: {
type: String,
default: "var(--avn-theme-color)"
},
iconSet: {
type: Number,
default: 1,
validator(value) {
return value >= 1 && value <= 3
}
}
}
}
@@ -16,10 +16,6 @@
<path v-else-if="icon == 'limbs'" d="M540-440q17 0 28.5-11.5T580-480q0-7-1.5-12.5T574-503q11-4 18.5-14t7.5-23q0-17-11.5-28.5T560-580q-13 0-23 7t-14 19l-146-70q2-4 2.5-8t.5-8q0-17-11.5-28.5T340-680q-17 0-28.5 11.5T300-640q0 6 2 11.5t5 10.5q-11 4-19 14t-8 24q0 17 11.5 28.5T320-540q14 0 24-7.5t14-19.5l146 70-4 17q0 17 11.5 28.5T540-440ZM394-80q-16-47-24-92.5t-10-86q-2-40.5-.5-74.5t4.5-58q-1 0 0 0-22-5-50.5-12.5t-61-20.5Q220-437 186-455.5T119-500l50-70q39 35 81.5 55.5t78.5 32q36 11.5 60 15l24 3.5q18 1 28.5 15t7.5 32l-4.5 33.5q-4.5 33.5-5 83.5t7.5 109q8 59 33 111h-86Zm366 0h-80v-423q0-48-25.5-87T586-649L313-772l49-67 257 117q64 29 102.5 88T760-503v423Zm-280 0q-25-52-33-111t-7.5-109q.5-50 5-83.5L449-417q3-18-7.5-32T413-464l-24-3.5q-24-3.5-60-15t-78.5-32Q208-535 169-570q39 35 81.5 55.5t78.5 32q36 11.5 60 15l24 3.5q18 1 28.5 15t7.5 32l-4.5 33.5q-4.5 33.5-5 83.5t7.5 109q8 59 33 111Z"/>
<path v-else-if="icon == 'head'" d="M194-80v-395h80v315h280v-193l105-105q29-29 45-65t16-77q0-40-16.5-76T659-741l-25-26-127 127H347l-43 43-57-56 67-67h160l160-160 82 82q40 40 62 90.5T800-600q0 57-22 107.5T716-402l-82 82v240H194Zm197-187L183-475q-11-11-17-26t-6-31q0-16 6-30.5t17-25.5l84-85 124 123q28 28 43.5 64.5T450-409q0 40-15 76.5T391-267Z"/>
<path v-else-if="icon == 'photo_sample'" d="M240-80q-33 0-56.5-23.5T160-160v-640q0-33 23.5-56.5T240-880h480q33 0 56.5 23.5T800-800v640q0 33-23.5 56.5T720-80H240Zm0-80h480v-640h-80v280l-100-60-100 60v-280H240v640Zm40-80h400L545-420 440-280l-65-87-95 127Zm-40 80v-640 640Zm200-360 100-60 100 60-100-60-100 60Z"/>
<path v-else-if="icon == 'reset_slide'" d="M520-330v-60h160v60H520Zm60 210v-50h-60v-60h60v-50h60v160h-60Zm100-50v-60h160v60H680Zm40-110v-160h60v50h60v60h-60v50h-60Zm111-280h-83q-26-88-99-144t-169-56q-117 0-198.5 81.5T200-480q0 72 32.5 132t87.5 98v-110h80v240H160v-80h94q-62-50-98-122.5T120-480q0-75 28.5-140.5t77-114q48.5-48.5 114-77T480-840q129 0 226.5 79.5T831-560Z"/>
<path v-else-if="icon == 'zoom_to'" d="M440-40v-167l-44 43-56-56 140-140 140 140-56 56-44-43v167h-80ZM220-340l-56-56 43-44H40v-80h167l-43-44 56-56 140 140-140 140Zm520 0L600-480l140-140 56 56-43 44h167v80H753l43 44-56 56Zm-260-80q-25 0-42.5-17.5T420-480q0-25 17.5-42.5T480-540q25 0 42.5 17.5T540-480q0 25-17.5 42.5T480-420Zm0-180L340-740l56-56 44 43v-167h80v167l44-43 56 56-140 140Z"/>
<path v-else-if="icon == 'reset_zoom'" d="M480-320v-100q0-25 17.5-42.5T540-480h100v60H540v100h-60Zm60 240q-25 0-42.5-17.5T480-140v-100h60v100h100v60H540Zm280-240v-100H720v-60h100q25 0 42.5 17.5T880-420v100h-60ZM720-80v-60h100v-100h60v100q0 25-17.5 42.5T820-80H720Zm111-480h-83q-26-88-99-144t-169-56q-117 0-198.5 81.5T200-480q0 72 32.5 132t87.5 98v-110h80v240H160v-80h94q-62-50-98-122.5T120-480q0-75 28.5-140.5t77-114q48.5-48.5 114-77T480-840q129 0 226.5 79.5T831-560Z"/>
<path v-else-if="icon == 'clipboard'" d="M200-120q-33 0-56.5-23.5T120-200v-560q0-33 23.5-56.5T200-840h167q11-35 43-57.5t70-22.5q40 0 71.5 22.5T594-840h166q33 0 56.5 23.5T840-760v560q0 33-23.5 56.5T760-120H200Zm0-80h560v-560h-80v120H280v-120h-80v560Zm280-560q17 0 28.5-11.5T520-800q0-17-11.5-28.5T480-840q-17 0-28.5 11.5T440-800q0 17 11.5 28.5T480-760Z"/>
</svg>
</template>
@@ -46,11 +42,7 @@
'abdomen',
'limbs',
'head',
'photo_sample',
'reset_slide',
'zoom_to',
'reset_zoom',
'clipboard'
'photo_sample'
]
return iconList.includes(value)
}
@@ -89,26 +89,6 @@
display: none;
}

.level-slide-marker {
border: var(--avn-slide-marker-border);
position: absolute;
top: 0%;
height: 100%;
left: var(--avn-slide-marker-position);
}

.range-bar {
background: var(--avn-theme-color);
}

.range-bar-active {
background: rgba(255,255,255,.8);
}

.dark .range-bar-active {
background: rgba(0,0,0,.8);
}

.image-menu {
grid-area: menu-view;
margin: 5px;
@@ -147,13 +127,6 @@
align-self: center;
}

.structure-info {
position: absolute;
z-index: 3;
color: #0f206c;
border-radius: 100%;
}

/*Additional styles for small format landscape orientation*/
@media (max-height: 450px) and (orientation: landscape) {
.detect-grid {
@@ -189,12 +162,6 @@
display: block;
}

.level-slide-marker {
top: calc(100% - var(--avn-slide-marker-position));
height: auto;
width: 100%;
left: 0%;
}

.image-container {
flex-direction: column;
@@ -18,7 +18,7 @@
<meta name="msapplication-tap-highlight" content="no">
<title>ALVINN</title>
<% if (TARGET === 'web') { %>
<meta name="mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">
<link rel="apple-touch-icon" href="icons/apple-touch-icon.png">
<link rel="icon" href="icons/favicon.png">
@@ -2,65 +2,24 @@ import { reactive, computed } from 'vue';

const state = reactive({
disclaimerAgreement: false,
enabledRegions: ['thorax','abdomen','limbs','head'],
regionIconSet: Math.floor(Math.random() * 3) + 1,
version: '0.5.0-alpha',
build: '####',
fullscreen: false,
useExternal: 'optional',
workersEnabled: 'true',
siteDemo: false,
externalServerList: [],
infoUrl: false
enabledRegions: ['thorax','abdomen','limbs'],
version: '0.5.0-rc',
siteConfig: {}
})
const set = (config, confObj) => {
if (confObj === undefined) { return }
state[config] = confObj
const setConfig = (confObj) => {
state.siteConfig = confObj
}

const agree = () => {
state.disclaimerAgreement = true
}

const disableWorkers = () => {
state.workersEnabled = false
}
const getServerList = () => {
if (state.useExternal == 'required') {
return state.externalServerList[0]
} else {
return state.externalServerList
}
}

const toggleFullscreen = () => {
if (document.fullscreenElement) {
document.exitFullscreen().then( () => {
state.fullscreen = false
})
} else {
app.requestFullscreen().then( () => {
state.fullscreen = true
})
}
}
export default () => ({
isAgreed: computed(() => state.disclaimerAgreement),
isFullscreen: computed(() => state.fullscreen),
demoMode: computed(() => state.siteDemo),
externalType: computed(() => state.useExternal),
useWorkers: computed(() => state.workersEnabled),
getRegions: computed(() => state.enabledRegions),
getVersion: computed(() => state.version),
getBuild: computed(() => state.build),
getIconSet: computed(() => state.regionIconSet),
getInfoUrl: computed(() => state.infoUrl),
set,
agree,
disableWorkers,
getServerList,
toggleFullscreen
getConfig: computed(() => state.siteConfig),
setConfig,
agree
})

@@ -1,157 +0,0 @@
class Coordinate {
constructor(x, y) {
this.x = x
this.y = y
}

toRefFrame(...frameArgs) {
if (frameArgs.length == 0) {
return {x: this.x, y: this.y}
}
let outFrames = []
//Get Coordinates in Image Reference Frame
if (frameArgs[0].tagName == 'IMG' && frameArgs[0].width && frameArgs[0].height) {
outFrames.push({
x: this.x * frameArgs[0].width,
y: this.y * frameArgs[0].height
})
} else {
throw new Error('Coordinate: invalid reference frame for frameType: Image')
}
//Get Coordinates in Canvas Reference Frame
if (frameArgs[1]) {
if (frameArgs[1].tagName == 'CANVAS' && frameArgs[1].width && frameArgs[1].height) {
let imgWidth
let imgHeight
const imgAspect = frameArgs[0].width / frameArgs[0].height
const rendAspect = frameArgs[1].width / frameArgs[1].height
if (imgAspect >= rendAspect) {
imgWidth = frameArgs[1].width
imgHeight = frameArgs[1].width / imgAspect
} else {
imgWidth = frameArgs[1].height * imgAspect
imgHeight = frameArgs[1].height
}
outFrames.push({
x: (frameArgs[1].width - imgWidth) / 2 + this.x * imgWidth,
y: (frameArgs[1].height - imgHeight) / 2 + this.y * imgHeight
})
} else {
throw new Error('Coordinate: invalid reference frame for frameType: Canvas')
}
}
//Get Coordinates in Screen Reference Frame
if (frameArgs[2]) {
if (frameArgs[2].zoom && frameArgs[2].offset && frameArgs[2].offset.x !== undefined && frameArgs[2].offset.y !== undefined) {
outFrames.push({
x: outFrames[1].x * frameArgs[2].zoom + frameArgs[2].offset.x,
y: outFrames[1].y * frameArgs[2].zoom + frameArgs[2].offset.y
})
} else {
throw new Error('Coordinate: invalid reference frame for frameType: Screen')
}
}

return outFrames
}

toString() {
return `(x: ${this.x}, y: ${this.y})`
}
}
export class StructureBox {
constructor(top, left, bottom, right) {
this.topLeft = new Coordinate(left, top)
this.bottomRight = new Coordinate(right, bottom)
}

getBoxes(boxType, ...frameArgs) {
let lowerH, lowerV, calcSide
switch (boxType) {
case 'point':
lowerH = 'right'
lowerV = 'bottom'
break
case 'side':
lowerH = 'width'
lowerV = 'height'
calcSide = true
break
default:
throw new Error(`StructureBox: invalid boxType - ${boxType}`)
}
if (frameArgs.length == 0) {
return {
left: this.topLeft.x,
top: this.topLeft.y,
[lowerH]: this.bottomRight.x - ((calcSide) ? this.topLeft.x : 0),
[lowerV]: this.bottomRight.y - ((calcSide) ? this.topLeft.y : 0)
}
}
const tL = this.topLeft.toRefFrame(...frameArgs)
const bR = this.bottomRight.toRefFrame(...frameArgs)
let outBoxes = []
tL.forEach((cd, i) => {
outBoxes.push({
left: cd.x,
top: cd.y,
[lowerH]: bR[i].x - ((calcSide) ? cd.x : 0),
[lowerV]: bR[i].y - ((calcSide) ? cd.y : 0)
})
})
return outBoxes
}
}
export class Structure {
constructor(structResult) {
this.label = structResult.label
this.confidence = structResult.confidence
this.box = new StructureBox(
structResult.top,
structResult.left,
structResult.bottom,
structResult.right
)
this.deleted = false
this.index = -1
this.passThreshold = true
this.searched = false
}
get resultIndex() {
return this.index
}

set resultIndex(newIdx) {
this.index = newIdx
}

get isDeleted() {
return this.deleted
}

set isDeleted(del) {
this.deleted = !!del
}

get isSearched() {
return this.searched
}

set isSearched(ser) {
this.searched = !!ser
}

get aboveThreshold() {
return this.passThreshold
}

setThreshold(level) {
if (typeof level != 'number') {
throw new Error(`Structure: invalid threshold level ${level}`)
}
this.passThreshold = this.confidence >= level
}
}
@@ -1,13 +1,11 @@
import { f7 } from 'framework7-vue'

export default {
methods: {
async openCamera(imContain) {
let cameraLoaded = false
var cameraLoaded = false
const devicesList = await navigator.mediaDevices.enumerateDevices()
let videoDeviceAvailable = devicesList.some( d => d.kind == "videoinput")
if (videoDeviceAvailable) {
let vidConstraint = {
this.videoDeviceAvailable = devicesList.some( d => d.kind == "videoinput")
if (this.videoDeviceAvailable) {
var vidConstraint = {
video: {
width: {
ideal: imContain.offsetWidth
@@ -27,66 +25,17 @@ export default {
},
closeCamera () {
this.cameraStream.getTracks().forEach( t => t.stop())
this.cameraStream = null
this.videoAvailable = false
},
captureVidFrame() {
const vidViewer = this.$refs.vid_viewer
vidViewer.pause()
let tempCVS = document.createElement('canvas')
tempCVS.id = 'temp-video-canvas'
tempCVS.height = vidViewer.videoHeight || parseInt(vidViewer.style.height)
tempCVS.width = vidViewer.videoWidth || parseInt(vidViewer.style.width)
const tempCtx = tempCVS.getContext('2d')
tempCtx.drawImage(vidViewer, 0, 0)
this.getImage(tempCVS.toDataURL())
},
async videoFrameDetectWorker (vidData, vidWorker) {
const startDetection = () => {
createImageBitmap(vidData).then(imVideoFrame => {
vidWorker.postMessage({call: 'videoFrame', image: imVideoFrame}, [imVideoFrame])
})
}
vidData.addEventListener('resize',startDetection,{once: true})
vidWorker.onmessage = (eVid) => {
if (eVid.data.error) {
console.log(eVid.data.message)
f7.dialog.alert(`ALVINN AI model error: ${eVid.data.message}`)
} else if (this.videoAvailable) {
createImageBitmap(vidData).then(imVideoFrame => {
vidWorker.postMessage({call: 'videoFrame', image: imVideoFrame}, [imVideoFrame])
})
if (eVid.data.coords) {
imageCtx.clearRect(0,0,imCanvas.width,imCanvas.height)
for (let coord of eVid.data.coords) {
let pointX = (imCanvas.width - imgWidth) / 2 + (coord[0] / eVid.data.modelWidth) * imgWidth - 10
let pointY = (imCanvas.height - imgHeight) / 2 + (coord[1] / eVid.data.modelHeight) * imgHeight - 10
console.debug(`cx: ${pointX}, cy: ${pointY}`)
imageCtx.globalAlpha = coord[2]
imageCtx.drawImage(target, pointX, pointY, 20, 20)
}
}
}
}
const imCanvas = this.$refs.image_cvs
const imageCtx = imCanvas.getContext("2d")
const target = this.$refs.target_image
let imgWidth, imgHeight
f7.utils.nextFrame(() => {
imCanvas.width = imCanvas.clientWidth
imCanvas.height = imCanvas.clientHeight
imageCtx.clearRect(0,0,imCanvas.width,imCanvas.height)
const imgAspect = vidData.width / vidData.height
const rendAspect = imCanvas.width / imCanvas.height
if (imgAspect >= rendAspect) {
imgWidth = imCanvas.width
imgHeight = imCanvas.width / imgAspect
} else {
imgWidth = imCanvas.height * imgAspect
imgHeight = imCanvas.height
}
})
}
}
}
@@ -56,7 +56,7 @@
},
computed: {
commentText () {
let text = f7.textEditor.get('.comment-editor').getValue()
var text = f7.textEditor.get('.comment-editor').getValue()
if (this.userEmail) {
text += `\\n\\nSubmitted by: ${this.userEmail}`
}
@@ -65,9 +65,9 @@
},
methods: {
sendFeedback () {
let self = this
const issueURL = `https://gitea.azgeorgis.net/api/v1/repos/Georgi_Lab/ALVINN_f7/issues?access_token=9af8ae15b1ee5a98afcb3083bb488e4cf3c683af`
let xhr = new XMLHttpRequest()
var self = this
var issueURL = `https://gitea.azgeorgis.net/api/v1/repos/Georgi_Lab/ALVINN_f7/issues?access_token=9af8ae15b1ee5a98afcb3083bb488e4cf3c683af`
var xhr = new XMLHttpRequest()
xhr.open("POST", issueURL)
xhr.setRequestHeader('Content-Type', 'application/json')
xhr.setRequestHeader('accept', 'application/json')
@@ -1,48 +1,25 @@
<template>
<f7-page name="detect" :id="detectorName + '-detect-page'" @wheel="(e = $event) => e.preventDefault()" @touchmove="(e = $event) => e.preventDefault()">
<f7-page name="detect" :id="detectorName + '-detect-page'">
<!-- Top Navbar -->
<f7-navbar :sliding="false" :back-link="true" back-link-url="/" back-link-force>
<f7-nav-title sliding>{{ regionTitle }}</f7-nav-title>
<f7-nav-title sliding>{{ regions[activeRegion] }}</f7-nav-title>
<f7-nav-right>
<f7-link v-if="!isCordova" :icon-only="true" tooltip="Fullscreen" :icon-f7="isFullscreen ? 'viewfinder_circle_fill' : 'viewfinder'" @click="toggleFullscreen"></f7-link>
<f7-link :icon-only="true" tooltip="ALVINN help" icon-f7="question_circle_fill" href="/help/"></f7-link>
</f7-nav-right>
</f7-navbar>
<f7-block class="detect-grid">
<!--<div style="position: absolute;">{{ debugInfo ? JSON.stringify(debugInfo) : "No Info Available" }}</div>-->
<div class="image-container" ref="image_container">
<SvgIcon v-if="!imageView.src && !videoAvailable" :icon="f7route.params.region" fill-color="var(--avn-theme-color)"/>
<SvgIcon v-if="!imageView && !videoAvailable" :icon="f7route.params.region" fill-color="var(--avn-theme-color)" @click="selectImage" />
<div class="vid-container" :style="`display: ${videoAvailable ? 'block' : 'none'}; position: absolute; width: 100%; height: 100%;`">
<video id="vid-view" ref="vid_viewer" :srcObject="cameraStream" :autoPlay="true" style="width: 100%; height: 100%"></video>
<f7-button @click="captureVidFrame()" style="position: absolute; bottom: 32px; left: 50%; transform: translateX(-50%); z-index: 3;" fill large>Capture</f7-button>
</div>
<canvas
id="im-draw"
ref="image_cvs"
@wheel="spinWheel($event)"
@mousedown.middle="startMove($event)"
@mousemove="makeMove($event)"
@mouseup.middle="endMove($event)"
@touchstart="startTouch($event)"
@touchend="endTouch($event)"
@touchmove="moveTouch($event)"
@click="structureClick"
:style="`display: ${(imageLoaded || videoAvailable) ? 'block' : 'none'}; flex: 1 1 0%; max-width: 100%; max-height: 100%; min-width: 0; min-height: 0; background-size: contain; background-position: center; background-repeat: no-repeat; z-index: 2;`"
></canvas>
<f7-link v-if="getInfoUrl && (selectedChip > -1) && showResults[selectedChip]"
:style="`left: ${infoLinkPos.x}px; top: ${infoLinkPos.y}px; transform: translate(-50%,-50%); background: hsla(${showResults[selectedChip].confidence / 100 * 120}deg, 100%, 50%, .5)`"
class="structure-info"
:icon-only="true"
icon-f7="info"
target="_blank"
:external="true"
:href="infoLinkTarget"
/>
<canvas id="im-draw" ref="image_cvs" @click="structureClick" :style="`display: ${(imageLoaded || videoAvailable) ? 'block' : 'none'}; flex: 1 1 0%; max-width: 100%; max-height: 100%; min-width: 0; min-height: 0; background-size: contain; background-position: center; background-repeat: no-repeat; z-index: 2;`" />
</div>
<div class="chip-results" style="grid-area: result-view; flex: 0 0 auto; align-self: center; max-height: 450px;">
<div class="chip-results" style="grid-area: result-view; flex: 0 0 auto; align-self: center;">
<f7-chip v-for="result in showResults.filter( r => { return r.aboveThreshold && r.isSearched && !r.isDeleted })"
:class="(result.resultIndex == selectedChip) ? 'selected-chip' : ''"
:id="(result.resultIndex == selectedChip) ? 'selected_chip' : ''"
:text="result.label"
media=" "
:tooltip="result.confidence.toFixed(1)"
@@ -55,17 +32,8 @@
<f7-progressbar v-if="(detecting || modelLoading)" style="width: 100%;" :infinite="true" />
</div>
<div v-if="showDetectSettings" class="detect-inputs" style="grid-area: detect-settings;">
<f7-button @click="this.detectorLevel > 0 ? this.detectorLevel = 0 : this.detectorLevel = 50" style="flex: 0 1 20%">
<SvgIcon :icon="this.detectorLevel > 0 ? 'visibility' : 'reset_slide'"/>
</f7-button>
<div style="position: relative; flex: 1 1 100%">
<f7-range class="level-slide-horz" :min="0" :max="100" :step="1" @range:change="onLevelChange" v-model:value="detectorLevel" type="range"/>
<f7-range class="level-slide-vert" vertical :min="0" :max="100" :step="1" @range:change="onLevelChange" v-model:value="detectorLevel" type="range"/>
<div v-for="result in showResults.filter( r => { return r.isSearched && !r.isDeleted })"
class="level-slide-marker"
:style="`--avn-slide-marker-border: solid hsla(${result.confidence * 1.2}deg, 100%, 50%, .33) 1px; --avn-slide-marker-position: ${result.confidence.toFixed(1)}%`"
></div>
</div>
<f7-range class="level-slide-horz" :min="0" :max="100" :step="1" @range:change="onLevelChange" v-model:value="detectorLevel" type="range" style="flex: 1 1 100%"/>
<f7-range class="level-slide-vert" vertical :min="0" :max="100" :step="1" @range:change="onLevelChange" v-model:value="detectorLevel" type="range" style="flex: 1 1 100%"/>
<f7-button @click="() => detectPanel = !detectPanel" :panel-open="!detectPanel && `#${detectorName}-settings`" :panel-close="detectPanel && `#${detectorName}-settings`" style="flex: 0 1 20%">
<SvgIcon icon="check_list"/>
</f7-button>
@@ -74,19 +42,16 @@
</f7-button>
</div>
<f7-segmented class="image-menu" raised>
<f7-button popover-open="#region-popover">
<RegionIcon :region="activeRegion" />
</f7-button>
<f7-button v-if="!videoAvailable" :class="(!modelLoading) ? '' : 'disabled'" popover-open="#capture-popover">
<SvgIcon icon="camera_add"/>
</f7-button>
<f7-button v-if="videoAvailable" @click="closeCamera()">
<SvgIcon icon="no_photography"/>
</f7-button>
<f7-button v-if="!structureZoomed && selectedChip >= 0" style="height: auto; width: auto;" popover-close="#image-popover" @click="zoomToSelected()">
<SvgIcon icon="zoom_to" />
</f7-button>
<f7-button v-else :class="(canvasZoom != 1) ? '' : 'disabled'" style="height: auto; width: auto;" popover-close="#image-popover" @click="resetZoom()">
<SvgIcon icon="reset_zoom" />
</f7-button>
<f7-button @click="toggleSettings()" :class="(imageLoaded) ? '' : 'disabled'">
<f7-button @click="() => showDetectSettings = !showDetectSettings" :class="(imageLoaded) ? '' : 'disabled'">
<SvgIcon icon="visibility"/>
<f7-badge v-if="numResults && (showResults.length != numResults)" color="red" style="position: absolute; right: 15%; top: 15%;">{{ showResults.length - numResults }}</f7-badge>
</f7-button>
@@ -109,6 +74,23 @@
</f7-page>
</f7-panel>
<f7-popover id="region-popover" class="popover-button-menu">
<f7-segmented raised class="segment-button-menu">
<f7-button :class="(getRegions.includes('thorax')) ? '' : ' disabled'" style="height: auto; width: auto;" href="/detect/thorax/" popover-close="#region-popover">
<RegionIcon :region="0" />
</f7-button>
<f7-button :class="(getRegions.includes('abdomen')) ? '' : ' disabled'" style="height: auto; width: auto;" href="/detect/abdomen/" popover-close="#region-popover">
<RegionIcon :region="1" />
</f7-button>
<f7-button :class="(getRegions.includes('limbs')) ? '' : ' disabled'" style="height: auto; width: auto;" href="/detect/limbs/" popover-close="#region-popover">
<RegionIcon :region="2" />
</f7-button>
<f7-button :class="(getRegions.includes('head')) ? '' : ' disabled'" style="height: auto; width: auto;" href="/detect/head/" popover-close="#region-popover">
<RegionIcon :region="3" />
</f7-button>
</f7-segmented>
</f7-popover>
<f7-popover id="capture-popover" class="popover-button-menu">
|
||||
<f7-segmented raised class="segment-button-menu">
|
||||
<f7-button style="height: auto; width: auto;" popover-close="#capture-popover" @click="selectImage('camera')">
|
||||
@@ -117,10 +99,7 @@
|
||||
<f7-button style="height: auto; width: auto;" popover-close="#capture-popover" @click="selectImage('file')">
|
||||
<SvgIcon icon="photo_library" />
|
||||
</f7-button>
|
||||
<f7-button v-if="secureProtocol" style="height: auto; width: auto;" popover-close="#capture-popover" @click="selectImage('clipboard')">
|
||||
<SvgIcon icon="clipboard" />
|
||||
</f7-button>
|
||||
<f7-button v-if="demoEnabled" style="height: auto; width: auto;" popover-close="#capture-popover" @click="selectImage('sample')">
|
||||
<f7-button v-if="otherSettings.demo" style="height: auto; width: auto;" popover-close="#capture-popover" @click="selectImage('sample')">
|
||||
<SvgIcon icon="photo_sample"/>
|
||||
</f7-button>
|
||||
</f7-segmented>
|
||||
@@ -141,27 +120,9 @@
|
||||
import submitMixin from './submit-mixin'
import detectionMixin from './detection-mixin'
import cameraMixin from './camera-mixin'
import touchMixin from './touch-mixin'

import detectionWorker from '@/assets/detect-worker.js?worker&inline'
import { Structure, StructureBox } from '../js/structures'

const regions = ['Thorax','Abdomen/Pelvis','Limbs','Head and Neck']
let activeRegion = 4
let classesList = []
let imageLoadMode = "environment"
let serverSettings = {}
let otherSettings = {}
let modelLocation = ''
let miniLocation = ''
let reloadModel = false
let detectWorker = null
let vidWorker = null
let canvasMoving = false
let imageLocation = new StructureBox(0, 0, 1, 1)

export default {
mixins: [submitMixin, detectionMixin, cameraMixin, touchMixin],
mixins: [submitMixin, detectionMixin, cameraMixin],
props: {
f7route: Object,
},
@@ -171,28 +132,33 @@
},
data () {
return {
regions: ['Thorax','Abdomen/Pelvis','Limbs','Head and Neck'],
resultData: {},
selectedChip: -1,
activeRegion: 4,
classesList: [],
imageLoaded: false,
imageView: new Image(),
imageView: null,
imageLoadMode: "environment",
detecting: false,
detectPanel: false,
showDetectSettings: false,
detectorName: '',
detectorLevel: 50,
detectorLabels: [],
serverSettings: {},
otherSettings: {},
isCordova: !!window.cordova,
secureProtocol: location.protocol == 'https:',
isFullscreen: false,
uploadUid: null,
uploadDirty: false,
modelLocation: '',
miniLocation: '',
modelLoading: true,
reloadModel: false,
videoDeviceAvailable: false,
videoAvailable: false,
cameraStream: null,
infoLinkPos: {},
canvasOffset: {x: 0, y: 0},
canvasZoom: 1,
structureZoomed: false,
debugInfo: null
cameraStream: null
}
},
setup() {
@@ -200,78 +166,51 @@
},
created () {
let loadOtherSettings = localStorage.getItem('otherSettings')
if (loadOtherSettings) otherSettings = JSON.parse(loadOtherSettings)
if (loadOtherSettings) this.otherSettings = JSON.parse(loadOtherSettings)
let modelRoot = this.isCordova ? 'https://localhost' : '.'
this.detectorName = this.f7route.params.region
switch (this.detectorName) {
case 'thorax':
activeRegion = 0
this.activeRegion = 0
break;
case 'abdomen':
activeRegion = 1
this.activeRegion = 1
break;
case 'limbs':
activeRegion = 2
this.activeRegion = 2
break;
case 'head':
activeRegion = 3
this.activeRegion = 3
break;
}
let modelJ = `../models/${this.detectorName}${otherSettings.mini ? '-mini' : ''}/model.json`
let miniJ = `../models/${this.detectorName}-mini/model.json`
modelLocation = new URL(modelJ,import.meta.url).href
miniLocation = new URL(miniJ,import.meta.url).href
let classesJ = `../models/${this.detectorName}/classes.json`
fetch(new URL(classesJ,import.meta.url).href)
this.modelLocation = `${modelRoot}/models/${this.detectorName}${this.otherSettings.mini ? '-mini' : ''}/model.json`
this.miniLocation = `${modelRoot}/models/${this.detectorName}-mini/model.json`
fetch(`${this.isCordova ? 'https://localhost' : '.'}/models/${this.detectorName}/classes.json`)
.then((mod) => { return mod.json() })
.then((classes) => {
classesList = classes
this.detectorLabels = classesList.map( l => { return {'name': l, 'detect': true} } )
this.classesList = classes
this.detectorLabels = this.classesList.map( l => { return {'name': l, 'detect': true} } )
})
const loadServerSettings = localStorage.getItem('serverSettings')
if (loadServerSettings) serverSettings = JSON.parse(loadServerSettings)
var loadServerSettings = localStorage.getItem('serverSettings')
if (loadServerSettings) this.serverSettings = JSON.parse(loadServerSettings)
},
mounted () {
if (serverSettings && serverSettings.use) {
if (this.serverSettings && this.serverSettings.use) {
this.getRemoteLabels()
this.modelLoading = false
} else {
this.modelLoading = true
if (!this.useWorkers) {
this.loadModel(modelLocation, true).then(() => {
this.modelLoading = false
}).catch((e) => {
console.log(e.message)
f7.dialog.alert(`ALVINN AI model error: ${e.message}`)
this.modelLoading = false
})
} else {
detectWorker = new detectionWorker()
detectWorker.onmessage = (eMount) => {
self = this
if (eMount.data.error) {
console.log(eMount.data.message)
f7.dialog.alert(`ALVINN AI model error: ${eMount.data.message}`)
}
self.modelLoading = false
}
vidWorker = new detectionWorker()
vidWorker.onmessage = (eMount) => {
self = this
if (eMount.data.error) {
console.log(eMount.data.message)
f7.dialog.alert(`ALVINN AI nano model error: ${eMount.data.message}`)
}
}
detectWorker.postMessage({call: 'loadModel', weights: modelLocation, preload: true})
vidWorker.postMessage({call: 'loadModel', weights: miniLocation, preload: true})
}
this.loadModel(this.modelLocation, true).then(() => {
this.modelLoading = false
}).catch((e) => {
console.log(e.message)
f7.dialog.alert(`ALVINN AI model error: ${e.message}`)
this.modelLoading = false
})
}
window.onresize = (e) => { if (this.$refs.image_cvs) this.selectChip('redraw') }
window.onresize = (e) => { this.selectChip('redraw') }
},
computed: {
regionTitle () {
return regions[activeRegion]
},
message () {
if (this.modelLoading) {
return "Preparing ALVINN..."
@@ -284,17 +223,17 @@
}
},
showResults () {
let filteredResults = this.resultData.detections
var filteredResults = this.resultData.detections
if (!filteredResults) return []

const allSelect = this.detectorLabels.every( s => { return s.detect } )
const selectedLabels = this.detectorLabels
var allSelect = this.detectorLabels.every( s => { return s.detect } )
var selectedLabels = this.detectorLabels
.filter( l => { return l.detect })
.map( l => { return l.name })
filteredResults.forEach( (d, i) => {
d.resultIndex = i
d.setThreshold(this.detectorLevel)
d.isSearched = allSelect || selectedLabels.includes(d.label)
filteredResults[i].resultIndex = i
filteredResults[i].aboveThreshold = d.confidence >= this.detectorLevel
filteredResults[i].isSearched = allSelect || selectedLabels.includes(d.label)
})

if (!filteredResults.some( s => s.resultIndex == this.selectedChip && s.aboveThreshold && s.isSearched && !s.isDeleted)) {
@@ -313,15 +252,7 @@
} else {
return false
}
},
demoEnabled () {
return otherSettings.demo || this.demoMode
},
infoLinkTarget () {
if (!this.getInfoUrl) return ''
let structure = this.showResults.find( r => r.resultIndex == this.selectedChip)
return structure ? this.getInfoUrl + structure.label.replaceAll(' ','_') : ''
},
}
},
methods: {
chipGradient (confVal) {
@@ -329,61 +260,14 @@
return `--chip-media-gradient: conic-gradient(from ${270 - (confFactor * 360 / 2)}deg, hsl(${confFactor * 120}deg, 100%, 50%) ${confFactor}turn, hsl(${confFactor * 120}deg, 50%, 66%) ${confFactor}turn)`
},
async setData () {
if (detectWorker) {
detectWorker.onmessage = (eDetect) => {
self = this
if (eDetect.data.error) {
self.detecting = false
self.resultData = {}
loadFailure()
f7.dialog.alert(`ALVINN structure finding error: ${eDetect.data.message}`)
} else if (eDetect.data.success == 'detection') {
self.detecting = false
self.resultData = {detections: []}
eDetect.data.detections.detections.forEach((d) => {
d.label = self.detectorLabels[d.label].name
let detectedStructure = new Structure(d)
self.resultData.detections.push(detectedStructure)
})
self.uploadDirty = true
} else if (eDetect.data.success == 'model') {
reloadModel = false
loadSuccess()
}
f7.utils.nextFrame(() => {
this.selectChip("redraw")
})
}
if (this.reloadModel) {
await this.loadModel(this.modelLocation)
this.reloadModel = false
}

let loadSuccess = null
let loadFailure = null
let modelReloading = null
if (!this.useWorkers && reloadModel) {
await this.loadModel(modelLocation)
reloadModel = false
} else {
modelReloading = new Promise((res, rej) => {
loadSuccess = res
loadFailure = rej
if (reloadModel) {
detectWorker.postMessage({call: 'loadModel', weights: modelLocation})
} else {
loadSuccess()
}
})
}

if (serverSettings && serverSettings.use) {
if (this.serverSettings && this.serverSettings.use) {
this.remoteDetect()
} else if (this.useWorkers) {
Promise.all([modelReloading,createImageBitmap(this.imageView)]).then(res => {
detectWorker.postMessage({call: 'localDetect', image: res[1]}, [res[1]])
})
} else {
createImageBitmap(this.imageView).then(res => {
return this.localDetect(res)
}).then(dets => {
this.localDetect(this.imageView).then(dets => {
this.detecting = false
this.resultData = dets
this.uploadDirty = true
@@ -394,9 +278,6 @@
f7.dialog.alert(`ALVINN structure finding error: ${e.message}`)
})
}
f7.utils.nextFrame(() => {
this.selectChip("redraw")
})
},
selectAll (ev) {
if (ev.target.checked) {
@@ -406,27 +287,24 @@
}
},
async selectImage (mode) {
imageLoadMode = mode
this.imageLoadMode = mode
if (this.isCordova && mode == "camera") {
navigator.camera.getPicture(this.getImage, this.onFail, { quality: 50, destinationType: Camera.DestinationType.DATA_URL, correctOrientation: true });
return
}
if (mode == "camera" && !otherSettings.disableVideo) {
if (mode == "camera") {
this.videoAvailable = await this.openCamera(this.$refs.image_container)
if (this.videoAvailable) {
this.selectedChip = -1
this.imageLoaded = false
this.imageView.src = null
this.imageView = null
this.$refs.image_cvs.style['background-image'] = 'none'
this.resultData = {}
const trackDetails = this.cameraStream.getVideoTracks()[0].getSettings()
let vidElement = this.$refs.vid_viewer
var trackDetails = this.cameraStream.getVideoTracks()[0].getSettings()
var vidElement = this.$refs.vid_viewer
vidElement.width = trackDetails.width
vidElement.height = trackDetails.height
if (!this.useWorkers) {
this.videoFrameDetect(vidElement, miniLocation)
} else {
this.videoFrameDetectWorker(vidElement, vidWorker)
if (!this.otherSettings.disableVideo) {
this.videoFrameDetect(vidElement)
}
return
}
@@ -444,68 +322,33 @@
}).open()
return
}
if (mode == 'clipboard') {
navigator.clipboard.read().then(clip => {
if (!clip[0].types.includes("image/png")) {
throw new Error("Clipboard does not contain valid image data.");
}
return clip[0].getType("image/png");
}).then(blob => {
let clipImage = URL.createObjectURL(blob);
this.getImage(clipImage)
}).catch(e => {
console.log(e)
f7.dialog.alert(`Error pasting image: ${e.message}`)
})
return
}
this.$refs.image_chooser.click()
},
onFail (message) {
alert(`Camera fail: ${message}`)
},
selectChip ( iChip ) {
const [imCanvas, imageCtx] = this.resetView()

if (this.selectedChip == iChip) {
this.selectedChip = -1
this.resetView()
return
}

if (iChip == 'redraw') {
if (this.selectedChip == -1) {
this.resetView()
return
}
if (this.selectedChip == -1) return
iChip = this.selectedChip
}
const [imCanvas, imageCtx] = this.resetView(true)
let structBox, cvsBox, screenBox
[structBox, cvsBox, screenBox] = this.resultData.detections[iChip].box.getBoxes('side', this.imageView, imCanvas, {zoom: this.canvasZoom, offset: {...this.canvasOffset}})

this.infoLinkPos.x = Math.min(Math.max(screenBox.left, 0),imCanvas.width)
this.infoLinkPos.y = Math.min(Math.max(screenBox.top, 0), imCanvas.height)
const boxCoords = this.box2cvs(this.resultData.detections[iChip])[0]

const imageScale = Math.max(this.imageView.width / imCanvas.width, this.imageView.height / imCanvas.height)
imageCtx.drawImage(this.imageView, structBox.left, structBox.top, structBox.width, structBox.height, cvsBox.left, cvsBox.top, cvsBox.width, cvsBox.height)
imageCtx.save()
imageCtx.arc(cvsBox.left, cvsBox.top, 14 / this.canvasZoom, 0, 2 * Math.PI)
imageCtx.closePath()
imageCtx.clip()
imageCtx.drawImage(this.imageView,
structBox.left - (14 / this.canvasZoom * imageScale),
structBox.top - (14 / this.canvasZoom * imageScale),
(28 / this.canvasZoom * imageScale),
(28 / this.canvasZoom * imageScale),
cvsBox.left - (14 / this.canvasZoom),
cvsBox.top - (14 / this.canvasZoom),
(28 / this.canvasZoom), (28 / this.canvasZoom))
imageCtx.restore()
var boxLeft = boxCoords.cvsLeft
var boxTop = boxCoords.cvsTop
var boxWidth = boxCoords.cvsRight - boxCoords.cvsLeft
var boxHeight = boxCoords.cvsBottom - boxCoords.cvsTop
imageCtx.strokeRect(boxLeft,boxTop,boxWidth,boxHeight)
this.selectedChip = iChip
this.resultData.detections[iChip].beenViewed = true

this.$nextTick( () => {
document.getElementById('selected_chip').scrollIntoView({behavior: 'smooth', block: 'nearest'})
})
},
deleteChip ( iChip ) {
f7.dialog.confirm(`${this.resultData.detections[iChip].label} is identified with ${this.resultData.detections[iChip].confidence.toFixed(1)}% confidence. Are you sure you want to delete it?`, () => {
@@ -515,24 +358,14 @@
this.uploadDirty = true
});
},
resetView (drawChip) {
resetView () {
const imCanvas = this.$refs.image_cvs
const imageCtx = imCanvas.getContext("2d")
imCanvas.width = imCanvas.clientWidth
imCanvas.height = imCanvas.clientHeight
imageCtx.clearRect(0,0,imCanvas.width,imCanvas.height)
imageCtx.translate(this.canvasOffset.x,this.canvasOffset.y)
imageCtx.scale(this.canvasZoom,this.canvasZoom)
imageCtx.globalAlpha = 1
imageCtx.strokeStyle = 'yellow'
imageCtx.lineWidth = 3 / this.canvasZoom
if (this.imageLoaded) {
const imageLoc = imageLocation.getBoxes('side', this.imageView, imCanvas)
if (drawChip) {imageCtx.globalAlpha = .5}
imageCtx.drawImage(this.imageView, 0, 0, this.imageView.width, this.imageView.height, imageLoc[1].left, imageLoc[1].top, imageLoc[1].width, imageLoc[1].height)
if (drawChip) {imageCtx.globalAlpha = 1}
}
this.structureZoomed = false
imageCtx.lineWidth = 3
return [imCanvas, imageCtx]
},
getImage (searchImage) {
@@ -540,22 +373,18 @@
if (this.videoAvailable) {
this.closeCamera()
this.detecting = true
reloadModel = true
this.reloadModel = true
resolve(searchImage)
} else if (this.isCordova && imageLoadMode == "camera") {
} else if (this.isCordova && this.imageLoadMode == "camera") {
this.detecting = true
resolve('data:image/jpg;base64,' + searchImage)
}
if (imageLoadMode == 'clipboard') {
this.detecting = true
resolve(searchImage)
}
const reader = new FileReader()
var reader = new FileReader()
reader.addEventListener("load", () => {
this.detecting = true
resolve(reader.result)
},{once: true})
if (imageLoadMode == 'sample') {
})
if (this.imageLoadMode == 'sample') {
fetch(`${this.isCordova ? 'https://localhost' : '.'}/samples/${this.detectorName}-${searchImage}.jpeg`).then( resp => {
return resp.blob()
}).then(respBlob => {
@@ -574,27 +403,26 @@
this.imageLoaded = true
this.resultData = {}
this.selectedChip = -1
this.imageView = new Image()
this.imageView.src = imgData
return(this.imageView.decode())
}).then( () => {
this.canvasOffset = {x: 0, y: 0}
this.canvasZoom = 1
const imCanvas = this.$refs.image_cvs
imCanvas.width = imCanvas.clientWidth
imCanvas.height = imCanvas.clientHeight
const imageCtx = imCanvas.getContext("2d")
const imageLoc = imageLocation.getBoxes('side', this.imageView, imCanvas)
imageCtx.drawImage(this.imageView, 0, 0, this.imageView.width, this.imageView.height, imageLoc[1].left, imageLoc[1].top, imageLoc[1].width, imageLoc[1].height)
f7.utils.nextFrame(() => {
const [imCanvas, _] = this.resetView()
imCanvas.style['background-image'] = `url(${this.imageView.src})`
/******
 * setTimeout is not a good solution, but it's the only way
 * I can find to not cut off drawing of the canvas background
 ******/
setTimeout(() => {
this.setData()
})
}, 1)
}).catch((e) => {
console.log(e.message)
f7.dialog.alert(`Error loading image: ${e.message}`)
})
},
async submitData () {
let uploadData = this.showResults
var uploadData = this.showResults
.filter( d => { return d.aboveThreshold && d.isSearched && !d.isDeleted })
.map( r => { return {"top": r.top, "left": r.left, "bottom": r.bottom, "right": r.right, "label": r.label}})
this.uploadUid = await this.uploadData(this.imageView.src.split(',')[1],uploadData,this.uploadUid)
@@ -604,85 +432,53 @@
this.detectorLevel = value
},
structureClick(e) {
let self = this
function loopIndex(i) {
if (self.selectedChip == -1) return i
let li = i + self.selectedChip
if (li >= numBoxes) li -= numBoxes
return li
}
let boxCoords = []
this.resultData.detections.forEach(d => {
let cvsBox = d.box.getBoxes('point',this.imageView,this.$refs.image_cvs)[1]
cvsBox.clickable = d.aboveThreshold && d.isSearched && !d.isDeleted
boxCoords.push(cvsBox)
const boxCoords = this.box2cvs(this.showResults)
var findBox = boxCoords.findIndex( (r, i) => { return r.cvsLeft <= e.offsetX &&
r.cvsRight >= e.offsetX &&
r.cvsTop <= e.offsetY &&
r.cvsBottom >= e.offsetY &&
this.resultData.detections[i].resultIndex > this.selectedChip &&
this.resultData.detections[i].aboveThreshold &&
this.resultData.detections[i].isSearched &&
!this.resultData.detections[i].isDeleted
})
const numBoxes = boxCoords.length
let clickX = (e.offsetX - this.canvasOffset.x) / this.canvasZoom
let clickY = (e.offsetY - this.canvasOffset.y) / this.canvasZoom
let boxEnd = boxCoords.splice(0, this.selectedChip)
boxCoords = boxCoords.concat(boxEnd)
const findBox = boxCoords.findIndex( (r, i) => {
let di = loopIndex(i)
if (di == this.selectedChip ) return false
return r.clickable &&
r.left <= clickX &&
r.right >= clickX &&
r.top <= clickY &&
r.bottom >= clickY
})
this.selectChip(findBox >= 0 ? this.resultData.detections[loopIndex(findBox)].resultIndex : this.selectedChip)
this.selectChip(findBox >= 0 ? this.resultData.detections[findBox].resultIndex : this.selectedChip)
},
toggleSettings() {
this.showDetectSettings = !this.showDetectSettings
f7.utils.nextFrame(() => {
this.selectChip("redraw")
})
},
startMove() {
canvasMoving = true
},
endMove() {
canvasMoving = false
},
makeMove(event) {
if (canvasMoving) {
this.canvasOffset.x += event.movementX
this.canvasOffset.y += event.movementY
this.selectChip("redraw")
box2cvs(boxInput) {
if (!boxInput || boxInput.length == 0) return []
const boxList = boxInput.length ? boxInput : [boxInput]
const [imCanvas, imageCtx] = this.resetView()
var imgWidth
var imgHeight
const imgAspect = this.imageView.width / this.imageView.height
const rendAspect = imCanvas.width / imCanvas.height
if (imgAspect >= rendAspect) {
imgWidth = imCanvas.width
imgHeight = imCanvas.width / imgAspect
} else {
imgWidth = imCanvas.height * imgAspect
imgHeight = imCanvas.height
}
const cvsCoords = boxList.map( (d, i) => {
return {
"cvsLeft": (imCanvas.width - imgWidth) / 2 + d.left * imgWidth,
"cvsRight": (imCanvas.width - imgWidth) / 2 + d.right * imgWidth,
"cvsTop": (imCanvas.height - imgHeight) / 2 + d.top * imgHeight,
"cvsBottom": (imCanvas.height - imgHeight) / 2 + d.bottom * imgHeight
}
})
return cvsCoords
},
spinWheel(event) {
let zoomFactor
if (event.wheelDelta > 0) {
zoomFactor = 1.05
} else if (event.wheelDelta < 0) {
zoomFactor = 1 / 1.05
toggleFullscreen() {
if (document.fullscreenElement) {
document.exitFullscreen().then( () => {
this.isFullscreen = false
})
} else {
app.requestFullscreen().then( () => {
this.isFullscreen = true
})
}
this.canvasZoom *= zoomFactor
this.canvasOffset.x = event.offsetX * (1 - zoomFactor) + this.canvasOffset.x * zoomFactor
this.canvasOffset.y = event.offsetY * (1 - zoomFactor) + this.canvasOffset.y * zoomFactor
this.selectChip("redraw")
},
resetZoom() {
this.canvasZoom = 1
this.canvasOffset.x = 0
this.canvasOffset.y = 0
this.selectChip("redraw")
},
zoomToSelected() {
const imCanvas = this.$refs.image_cvs
const boxCoords = this.resultData.detections[this.selectedChip].box.getBoxes('point', this.imageView, imCanvas)
const boxWidth = boxCoords[1].right - boxCoords[1].left
const boxHeight = boxCoords[1].bottom - boxCoords[1].top
const boxMidX = (boxCoords[1].right + boxCoords[1].left ) / 2
const boxMidY = (boxCoords[1].bottom + boxCoords[1].top ) / 2
const zoomFactor = Math.min(imCanvas.width / boxWidth * .9, imCanvas.height / boxHeight * .9, 8)
this.canvasZoom = zoomFactor
this.canvasOffset.x = -(boxMidX * zoomFactor) + imCanvas.width / 2
this.canvasOffset.y = -(boxMidY * zoomFactor) + imCanvas.height / 2
this.selectChip("redraw")
this.structureZoomed = true
}
}
}

@@ -1,7 +1,7 @@
import * as tf from '@tensorflow/tfjs'
import { f7 } from 'framework7-vue'

let model = null
var model = null

export default {
methods: {
@@ -9,7 +9,7 @@ export default {
if (model && model.modelURL == weights) {
return model
} else if (model) {
tf.dispose(model)
model.dispose()
}
model = await tf.loadGraphModel(weights)
const [modelWidth, modelHeight] = model.inputs[0].shape.slice(1, 3)
@@ -25,32 +25,28 @@ export default {
return model
},
async localDetect(imageData) {
console.time('mx: pre-process')
console.time('pre-process')
const [modelWidth, modelHeight] = model.inputs[0].shape.slice(1, 3)
let gTense = null
const input = tf.tidy(() => {
gTense = tf.image.rgbToGrayscale(tf.image.resizeBilinear(tf.browser.fromPixels(imageData), [modelWidth, modelHeight])).div(255.0).expandDims(0)
return tf.concat([gTense,gTense,gTense],3)
return tf.image.resizeBilinear(tf.browser.fromPixels(imageData), [modelWidth, modelHeight]).div(255.0).expandDims(0)
})
tf.dispose(gTense)
console.timeEnd('mx: pre-process')
console.timeEnd('pre-process')

console.time('mx: run prediction')
console.time('run prediction')
const res = model.predict(input)
const tRes = tf.transpose(res,[0,2,1])
const rawRes = tRes.arraySync()[0]
console.timeEnd('mx: run prediction')
const rawRes = tf.transpose(res,[0,2,1]).arraySync()[0]
console.timeEnd('run prediction')

console.time('mx: post-process')
console.time('post-process')
const outputSize = res.shape[1]
let rawBoxes = []
let rawScores = []

for (let i = 0; i < rawRes.length; i++) {
const getScores = rawRes[i].slice(4)
for (var i = 0; i < rawRes.length; i++) {
var getScores = rawRes[i].slice(4)
if (getScores.every( s => s < .05)) { continue }
const getBox = rawRes[i].slice(0,4)
const boxCalc = [
var getBox = rawRes[i].slice(0,4)
var boxCalc = [
(getBox[0] - (getBox[2] / 2)) / modelWidth,
(getBox[1] - (getBox[3] / 2)) / modelHeight,
(getBox[0] + (getBox[2] / 2)) / modelWidth,
@@ -63,36 +59,31 @@ export default {
if (rawBoxes.length > 0) {
const tBoxes = tf.tensor2d(rawBoxes)
let tScores = null
let resBoxes = null
let validBoxes = []
let structureScores = null
let boxes_data = []
let scores_data = []
let classes_data = []
for (let c = 0; c < outputSize - 4; c++) {
for (var c = 0; c < outputSize - 4; c++) {
structureScores = rawScores.map(x => x[c])
tScores = tf.tensor1d(structureScores)
resBoxes = await tf.image.nonMaxSuppressionAsync(tBoxes,tScores,10,0.5,.05)
validBoxes = resBoxes.dataSync()
tf.dispose(resBoxes)
var validBoxes = await tf.image.nonMaxSuppressionAsync(tBoxes,tScores,10,0.5,.05)
validBoxes = validBoxes.dataSync()
if (validBoxes) {
boxes_data.push(...rawBoxes.filter( (_, idx) => validBoxes.includes(idx)))
let outputScores = structureScores.filter( (_, idx) => validBoxes.includes(idx))
var outputScores = structureScores.filter( (_, idx) => validBoxes.includes(idx))
scores_data.push(...outputScores)
classes_data.push(...outputScores.fill(c))
}
}

validBoxes = []
tf.dispose(tBoxes)
tf.dispose(tScores)
tf.dispose(tRes)
const valid_detections_data = classes_data.length
const output = {
var output = {
detections: []
}
for (let i =0; i < valid_detections_data; i++) {
const [dLeft, dTop, dRight, dBottom] = boxes_data[i]
for (var i =0; i < valid_detections_data; i++) {
var [dLeft, dTop, dRight, dBottom] = boxes_data[i]
output.detections.push({
"top": dTop,
"left": dLeft,
@@ -105,14 +96,14 @@ export default {
}
tf.dispose(res)
tf.dispose(input)
console.timeEnd('mx: post-process')
console.timeEnd('post-process')

return output || { detections: [] }
},
getRemoteLabels() {
let self = this
const modelURL = `http://${this.serverSettings.address}:${this.serverSettings.port}/detectors`
let xhr = new XMLHttpRequest()
var self = this
var modelURL = `http://${this.serverSettings.address}:${this.serverSettings.port}/detectors`
var xhr = new XMLHttpRequest()
xhr.open("GET", modelURL)
xhr.setRequestHeader('Content-Type', 'application/json')
xhr.timeout = 10000
@@ -124,8 +115,8 @@ export default {
f7.dialog.alert(`ALVINN has encountered an error: ${errorResponse.error}`)
return
}
const detectors = JSON.parse(xhr.response).detectors
let findLabel = detectors
var detectors = JSON.parse(xhr.response).detectors
var findLabel = detectors
.find( d => { return d.name == self.detectorName } )?.labels
.filter( l => { return l != "" } ).sort()
.map( l => { return {'name': l, 'detect': true} } )
@@ -139,9 +130,9 @@ export default {
xhr.send()
},
remoteDetect() {
let self = this
const modelURL = `http://${this.serverSettings.address}:${this.serverSettings.port}/detect`
let xhr = new XMLHttpRequest()
var self = this
var modelURL = `http://${this.serverSettings.address}:${this.serverSettings.port}/detect`
var xhr = new XMLHttpRequest()
xhr.open("POST", modelURL)
xhr.timeout = 10000
xhr.ontimeout = this.remoteTimeout
@@ -158,7 +149,7 @@ export default {
self.uploadDirty = true
}

const doodsData = {
var doodsData = {
"detector_name": this.detectorName,
"detect": {
"*": 1
@@ -172,8 +163,8 @@ export default {
this.detecting = false
f7.dialog.alert('No connection to remote ALVINN instance. Please check app settings.')
},
async videoFrameDetect (vidData, miniModel) {
await this.loadModel(miniModel)
async videoFrameDetect (vidData) {
await this.loadModel(this.miniLocation)
const [modelWidth, modelHeight] = model.inputs[0].shape.slice(1, 3)
const imCanvas = this.$refs.image_cvs
const imageCtx = imCanvas.getContext("2d")
@@ -182,7 +173,8 @@ export default {
imCanvas.width = imCanvas.clientWidth
imCanvas.height = imCanvas.clientHeight
imageCtx.clearRect(0,0,imCanvas.width,imCanvas.height)
let imgWidth, imgHeight
var imgWidth
var imgHeight
const imgAspect = vidData.width / vidData.height
const rendAspect = imCanvas.width / imCanvas.height
if (imgAspect >= rendAspect) {
@@ -193,7 +185,7 @@ export default {
imgHeight = imCanvas.height
}
while (this.videoAvailable) {
console.time('mx: frame-process')
console.time('frame-process')
try {
const input = tf.tidy(() => {
return tf.image.resizeBilinear(tf.browser.fromPixels(vidData), [modelWidth, modelHeight]).div(255.0).expandDims(0)
@@ -203,31 +195,25 @@ export default {

let rawCoords = []
if (rawRes) {
for (let i = 0; i < rawRes.length; i++) {
let getScores = rawRes[i].slice(4)
for (var i = 0; i < rawRes.length; i++) {
var getScores = rawRes[i].slice(4)
if (getScores.some( s => s > .5)) {
let foundTarget = rawRes[i].slice(0,2)
foundTarget.push(Math.max(...getScores))
rawCoords.push(foundTarget)
rawCoords.push(rawRes[i].slice(0,2))
}
}

imageCtx.clearRect(0,0,imCanvas.width,imCanvas.height)
for (let coord of rawCoords) {
for (var coord of rawCoords) {
console.log(`x: ${coord[0]}, y: ${coord[1]}`)
let pointX = (imCanvas.width - imgWidth) / 2 + (coord[0] / modelWidth) * imgWidth -5
let pointY = (imCanvas.height - imgHeight) / 2 + (coord[1] / modelHeight) * imgHeight -5
imageCtx.globalAlpha = coord[2]
imageCtx.drawImage(target, pointX, pointY, 20, 20)
}
}
tf.dispose(input)
tf.dispose(res)
tf.dispose(rawRes)
} catch (e) {
console.log(e)
}
console.timeEnd('mx: frame-process')
console.timeEnd('frame-process')
await tf.nextFrame();
}
}

@@ -4,14 +4,7 @@
    <f7-block>
      <h2>Quick Start</h2>
      <ol>
        <li>Select the region of the body you want to identify structures from. The regions are:
          <ul>
            <li><RegionIcon :region="0" class="list-region"/>Thorax and back</li>
            <li><RegionIcon :region="1" class="list-region"/>Abdomen and pelvis</li>
            <li><RegionIcon :region="2" class="list-region"/>Limbs</li>
            <li><RegionIcon :region="3" class="list-region"/>Head and neck</li>
          </ul>
        </li>
        <li>Select the region of the body you want to identify structures from.</li>
        <li>Load an image:
          <ul>
            <li>Click on the camera icon <SvgIcon icon="photo_camera" class="list-svg"/> to take a new picture.
@@ -21,40 +14,33 @@
            </ul>
          </li>
            <li>Click on the image file icon <SvgIcon icon="photo_library" class="list-svg"/> to load a picture from the device storage.</li>
            <li>If the clipboard is available on the system, then there will be a paste icon <SvgIcon icon="clipboard" class="list-svg"/> to paste image data directly into the app.</li>
            <li>If demo mode is turned on, you can click on the marked image icon <SvgIcon icon="photo_sample" class="list-svg"/> to load an ALVINN sample image.</li>
          </ul>
        </li>
        <li>When the picture is captured or loaded, any identifiable structures will be listed as tags below the image:
          <f7-chip text="Structure name" media=" " class="demo-chip" deleteable/>
          <f7-chip text="Structure name" media=" " class="demo-chip"/>
          <ul>
            <li>Click on each tag to see the structure highlighted in the image or click on the image to see the tag for that structure (additional clicks to the same area will select overlapping structures).</li>
            <li>Click on each tag to see the structure highlighted in the image.</li>
            <li>Tag color and proportion filled indicate ALVINN's level of confidence in the identification.</li>
            <li>An incorrect tag can be deleted by clicking on the tag's <f7-icon icon="chip-delete" style="margin-right: 1px;"></f7-icon> button.</li>
            <li>Click on the zoom to structure button <SvgIcon icon="zoom_to" class="list-svg"/> to magnify the view of the selected structure</li>
            <li>If there are potential structures that do not satisfy the current detection threshold, a badge on the detection menu icon <SvgIcon icon="visibility" class="list-svg"/> will indicate the number of un-displayed structures.</li>
          </ul>
        </li>
        <li>Pan (middle click or touch and drag) and zoom (mouse wheel or pinch) to manually select detailed views in the image.</li>
        <li>The reset zoom button <SvgIcon icon="reset_zoom" class="list-svg"/> will return the image to its initial position and magnification.</li>
      </ol>
      <h2>Advanced Features</h2>
      <h3>Detection Parameters</h3>
      <p>
        If there are potential structures that do not satisfy the current detection settings, a badge on the detection menu icon <SvgIcon icon="visibility" class="list-svg"/> will indicate the number of un-displayed structures.
        Clicking on the detection menu icon will open a menu of tools to adjust the detection settings.
        After an image has been loaded and structure detection has been performed, the detection parameters can be adjusted using the detection menu icon <SvgIcon icon="visibility" class="list-svg"/>.
        This button will make three tools available:
      </p>
      <ol>
        <li>Confidence filter <SvgIcon icon="visibility" class="list-svg"/> or <SvgIcon icon="reset_slide" class="list-svg"/>: You can press this button to show all structures or return the confidence slider to the default value (50%).</li>
        <li>Confidence slider: You can use the slider to change the confidence threshold for identifying structures.</li>
        <li>Refresh detections <SvgIcon icon="refresh_search" class="list-svg"/>: If there has been a permanent change to the structure detection list, such as deleting a tag, the detection list can be reset to its original state.</li>
        <li>Confidence slider: You can use the slider to change the confidence threshold for identifying structures. The default threshold is 50% confidence.</li>
        <li>Refresh detections <SvgIcon icon="refresh_search" class="list-svg"/>: If there has been a permanent change to the structures detections, such as deleting a tag, the detection list can be reset to its original state.</li>
        <li>Structure list <SvgIcon icon="check_list" class="list-svg"/>: you can view a list of all the structures available for detection in that region and select/deselect individual structures for detection.</li>
      </ol>
      <h3>Submitting Images</h3>
      <p>
        If all of the detection tags that are currently visible have been clicked on and viewed, then the cloud upload button <SvgIcon icon="cloud_upload" class="list-svg"/> on the detection menu will be enabled.
        This button will cause the image and the verified structures to be uploaded to the ALVINN project servers where that data will become available for further training of the neural net.
        If after the image has been uploaded, the available detection tags are changed via deletion or the detection settings options, then the option to re-upload the image will be available if all the new tags have been viewed and verified.
        If all of the detection tags that are currently visible have been viewed, then the cloud upload button <SvgIcon icon="cloud_upload" class="list-svg"/> on the detection menu will be enabled.
        This button will cause the image and the verified structures to be uploaded to the ALVINN project servers where that data will be available for further training of the neural net. If after the image has been uploaded, the available detection tags change, then the option to re-upload the image will be available if all the new tags have been viewed and verified.
      </p>
    </f7-block>
  </f7-page>
@@ -71,13 +57,6 @@
    top: .5em;
  }

  .list-region {
    width: 3em;
    position:relative;
    top: 1em;
    margin-right: .5em;
  }

  .cap-button {
    background-color: var(--f7-theme-color);
    color: white;
@@ -101,12 +80,10 @@

<script>
  import SvgIcon from '../components/svg-icon.vue'
  import RegionIcon from '../components/region-icon.vue'

  export default {
    components: {
      SvgIcon,
      RegionIcon
      SvgIcon
    }
  }
</script>

@@ -6,10 +6,6 @@
        <f7-link icon-ios="f7:bars" icon-md="f7:bars" panel-open="left"></f7-link>
      </f7-nav-left>
      <f7-nav-title sliding>A.L.V.I.N.N.</f7-nav-title>
      <f7-nav-right>
        <f7-link v-if="!isCordova" :icon-only="true" tooltip="Fullscreen" :icon-f7="isFullscreen ? 'viewfinder_circle_fill' : 'viewfinder'" @click="toggleFullscreen"></f7-link>
        <f7-link :icon-only="true" tooltip="ALVINN help" icon-f7="question_circle_fill" href="/help/"></f7-link>
      </f7-nav-right>
    </f7-navbar>
    <!-- Page content-->
    <div style="display: grid; grid-template-columns: 100%; grid-template-rows: min-content min-content min-content 1fr; align-content: stretch; gap: 15px; min-height: 0px; max-height: calc(100vh - (var(--f7-navbar-height) + var(--f7-safe-area-top))); height: calc(100vh - (var(--f7-navbar-height) + var(--f7-safe-area-top)));">
@@ -24,16 +20,16 @@
      <p style="text-align: center; margin: 0;">Select a region to begin.</p>
      <div class="region-grid">
        <f7-button :class="`region-button thorax${isAgreed && getRegions.includes('thorax') ? '' : ' disabled'}`" :href="isAgreed && getRegions.includes('thorax') && '/detect/thorax/'">
          <RegionIcon class="region-image" :region="0" :iconSet="getIconSet" />
          <RegionIcon class="region-image" :region="0" />
        </f7-button>
        <f7-button :class="`region-button abdomen${isAgreed && getRegions.includes('abdomen') ? '' : ' disabled'}`" :href="isAgreed && getRegions.includes('abdomen') && '/detect/abdomen/'">
          <RegionIcon class="region-image" :region="1" :iconSet="getIconSet" />
          <RegionIcon class="region-image" :region="1" />
        </f7-button>
        <f7-button :class="`region-button limbs${isAgreed && getRegions.includes('limbs') ? '' : ' disabled'}`" :href="isAgreed && getRegions.includes('limbs') && '/detect/limbs/'">
          <RegionIcon class="region-image" :region="2" :iconSet="getIconSet" />
          <RegionIcon class="region-image" :region="2" />
        </f7-button>
        <f7-button :class="`region-button headneck${isAgreed && getRegions.includes('head') ? '' : ' disabled'}`" :href="isAgreed && getRegions.includes('head') && '/detect/head/'">
          <RegionIcon class="region-image" :region="3" :iconSet="getIconSet" />
          <RegionIcon class="region-image" :region="3" />
        </f7-button>
      </div>
    </div>
@@ -107,16 +103,10 @@
    },
    data () {
      return {
        isCordova: !!window.cordova,
        alphaCheck: false
      }
    },
    setup() {
      //URL TESTING CODE
      //let testUrl = URL.parse(`../models/thorax/model.json`,import.meta.url).href
      //console.log(testUrl)
      //let testUrl2 = new URL(`../models/thorax/model.json`,import.meta.url)
      //console.log(testUrl2)
      return store()
    },
    methods: {

@@ -31,22 +31,19 @@
        <span style="margin-left: 16px;">Disable video estimates<f7-icon size="16" style="padding-left: 5px;" f7="question_diamond_fill" tooltip="faster: recommended for slower devices" /></span>
        <f7-toggle v-model:checked="otherSettings.disableVideo" style="margin-right: 16px;" />
      </div>
      <div v-if="serverToggle">
        <div style="display:flex; justify-content:space-between; width: 100%">
          <span style="margin-left: 16px;">Use external server</span>
          <f7-toggle v-model:checked="serverSettings.use" style="margin-right: 16px;" @change="setDirty()" />
        </div>
        <f7-list >
          <f7-list-input :disabled="!serverSettings.use || serverList" v-model:value="serverSettings.address" label="Server address" type="text" placeholder="127.0.0.1" />
          <f7-list-input :disabled="!serverSettings.use || serverList" v-model:value="serverSettings.port" label="Server port" type="text" placeholder="9001" />
        </f7-list>
        <span>Other servers</span>
        <f7-list :dividers="true" :outline="true" :strong="true" :inset="true" style="width: calc(100% - 32px); margin-top: 0;">
          <f7-list-item v-for="(addObj) in externalIp" :disabled="!serverSettings.use" :title="addObj.name" @click="setServerProps(addObj.address, addObj.port)"></f7-list-item>
          <f7-list-item v-if="!serverList" v-for="(port, add) in otherIp" :disabled="!serverSettings.use" :title="add" @click="setServerProps(add, port)">{{ port }}</f7-list-item>
          <f7-list-item v-if="Object.keys(otherIp).length == 0 && externalIp.length == 0" title="No previous server settings"></f7-list-item>
        </f7-list>
        <div style="display:flex; justify-content:space-between; width: 100%">
          <span style="margin-left: 16px;">Use external server</span>
          <f7-toggle v-model:checked="serverSettings.use" style="margin-right: 16px;" @change="setDirty()" />
        </div>
        <f7-list>
          <f7-list-input :disabled="!serverSettings.use" v-model:value="serverSettings.address" label="Server address" type="text" placeholder="127.0.0.1" />
          <f7-list-input :disabled="!serverSettings.use" v-model:value="serverSettings.port" label="Server port" type="text" placeholder="9001" />
        </f7-list>
        <span>Other servers</span>
        <f7-list :dividers="true" :outline="true" :strong="true" :inset="true" style="width: calc(100% - 32px); margin-top: 0;">
          <f7-list-item v-for="(port, add) in otherIp" :disabled="!serverSettings.use" :title="add" @click="setServerProps(add, port)">{{ port }}</f7-list-item>
          <f7-list-item v-if="Object.keys(otherIp).length == 0" title="No previous server settings"></f7-list-item>
        </f7-list>
      </div>
      <f7-button fill @click="saveAllSettings">SAVE</f7-button>
    </div>
@@ -64,7 +61,6 @@

<script>
  import { f7 } from 'framework7-vue'
  import store from '../js/store'

  export default {
    data () {
@@ -76,8 +72,8 @@
      },
      serverSettings: {
        use: false,
        address: '127.0.0.1',
        port: '9000',
        address: '10.170.64.22',
        port: '9001',
        previous: {}
      },
      themeSettings: {
@@ -85,36 +81,24 @@
        }
      }
    },
    setup() {
      return store()
    },
    computed: {
      otherIp () {
        let filteredIps = {}
        for (let oldIp in this.serverSettings.previous) {
        for (var oldIp in this.serverSettings.previous) {
          if (oldIp != this.serverSettings.address) {
            filteredIps[oldIp] = this.serverSettings.previous[oldIp]
          }
        }
        return filteredIps
      },
      serverToggle () {
        return ['optional','list'].includes(this.externalType)
      },
      serverList () {
        return this.externalType == 'list'
      },
      externalIp () {
        return this.getServerList()
      }
    },
    created () {
      const loadServerSettings = localStorage.getItem('serverSettings')
      var loadServerSettings = localStorage.getItem('serverSettings')
      if (loadServerSettings) this.serverSettings = JSON.parse(loadServerSettings)
      if (!this.serverSettings.previous) this.serverSettings.previous = {}
      const loadThemeSettings = localStorage.getItem('themeSettings')
      var loadThemeSettings = localStorage.getItem('themeSettings')
      if (loadThemeSettings) this.themeSettings = JSON.parse(loadThemeSettings)
      const loadOtherSettings = localStorage.getItem('otherSettings')
      var loadOtherSettings = localStorage.getItem('otherSettings')
      if (loadOtherSettings) this.otherSettings = JSON.parse(loadOtherSettings)
    },
    methods: {
@@ -122,7 +106,7 @@
        let saveSetting = new Promise(
          (saved,failed) => {
            try {
              if (this.serverSettings.use && !this.externalIp.some( (srv) => srv.address == this.serverSettings.address)) {
              if (this.serverSettings.use) {
                this.serverSettings.previous[this.serverSettings.address] = this.serverSettings.port
              }
              localStorage.setItem('serverSettings',JSON.stringify(this.serverSettings))
@@ -136,7 +120,7 @@
        )
        saveSetting.then(
          () => {
            const toast = f7.toast.create({
            var toast = f7.toast.create({
              text: 'Settings saved',
              closeTimeout: 2000
            })
@@ -144,7 +128,7 @@
            this.isDirty = false;
          },
          () => {
            const toast = f7.toast.create({
            var toast = f7.toast.create({
              text: 'ERROR: No settings saved',
              closeTimeout: 2000
            })
@@ -167,8 +151,7 @@
      },
      toggleSettingsView () {
        this.showAdvanced = !this.showAdvanced
        //this.$refs.advancedSettings.style.maxHeight = `${this.showAdvanced ? this.$refs.advancedSettings.scrollHeight : 0}px`
        this.$refs.advancedSettings.style.maxHeight = this.showAdvanced ? '100%' : '0px'
        this.$refs.advancedSettings.style.maxHeight = `${this.showAdvanced ? this.$refs.advancedSettings.scrollHeight : 0}px`
      },
      confirmBack () {
        if (this.isDirty) {

@@ -8,8 +8,6 @@
      <f7-block-title medium>Details</f7-block-title>
      <f7-list>
        <f7-list-item title="Version" :after="alvinnVersion"></f7-list-item>
        <f7-list-item title="Build" :after="alvinnBuild"></f7-list-item>
        <f7-list-item title="Workers" :after="useWorkers ? 'Enabled' : 'Disabled'"></f7-list-item>
      </f7-list>
      <f7-block-title medium>Models</f7-block-title>
      <f7-list style="width: 100%;">
@@ -17,10 +15,8 @@
        <f7-list-item title="Thorax-m" :after="miniThoraxDetails.version"></f7-list-item>
        <f7-list-item :class="otherSettings.mini ? 'unused-model' : ''" title="Abdomen/Pelvis" :after="abdomenDetails.version"></f7-list-item>
        <f7-list-item title="Abd/Pel-m" :after="miniAbdomenDetails.version"></f7-list-item>
        <f7-list-item :class="otherSettings.mini ? 'unused-model' : ''" title="Limbs" :after="limbsDetails.version"></f7-list-item>
        <f7-list-item title="Limbs-m" :after="miniLimbsDetails.version"></f7-list-item>
        <f7-list-item :class="otherSettings.mini ? 'unused-model' : ''" title="Head/Neck" :after="headneckDetails.version"></f7-list-item>
        <f7-list-item title="Head-m" :after="miniHeadneckDetails.version"></f7-list-item>
        <f7-list-item title="Limbs" :after="limbsDetails.version"></f7-list-item>
        <f7-list-item title="Head/Neck" :after="headneckDetails.version"></f7-list-item>
      </f7-list>
    </div>
  </f7-block>
@@ -46,16 +42,10 @@
        miniThoraxDetails: {},
        abdomenDetails: {},
        miniAbdomenDetails: {},
        //limbsDetails: { "version": "N/A" },
        //headneckDetails: { "version": "N/A" },
        limbsDetails: {},
        miniLimbsDetails: {},
        headneckDetails: {},
        miniHeadneckDetails: {},
        limbsDetails: { "version": "N/A" },
        headneckDetails: { "version": "N/A" },
        alvinnVersion: store().getVersion,
        alvinnBuild: store().getBuild,
        isCordova: !!window.cordova,
        useWorkers: store().useWorkers,
        otherSettings: {}
      }
    },
@@ -63,7 +53,7 @@
      return store()
    },
    created () {
      const loadOtherSettings = localStorage.getItem('otherSettings')
      var loadOtherSettings = localStorage.getItem('otherSettings')
      if (loadOtherSettings) this.otherSettings = JSON.parse(loadOtherSettings)
      fetch(`${this.isCordova ? 'https://localhost' : '.'}/models/thorax/descript.json`)
        .then((mod) => { return mod.json() })
@@ -77,18 +67,6 @@
      fetch(`${this.isCordova ? 'https://localhost' : '.'}/models/abdomen-mini/descript.json`)
        .then((mod) => { return mod.json() })
        .then((desc) => { this.miniAbdomenDetails = desc })
      fetch(`${this.isCordova ? 'https://localhost' : '.'}/models/limbs/descript.json`)
        .then((mod) => { return mod.json() })
        .then((desc) => { this.limbsDetails = desc })
      fetch(`${this.isCordova ? 'https://localhost' : '.'}/models/limbs-mini/descript.json`)
        .then((mod) => { return mod.json() })
        .then((desc) => { this.miniLimbsDetails = desc })
      fetch(`${this.isCordova ? 'https://localhost' : '.'}/models/head/descript.json`)
        .then((mod) => { return mod.json() })
        .then((desc) => { this.headneckDetails = desc })
      fetch(`${this.isCordova ? 'https://localhost' : '.'}/models/head-mini/descript.json`)
        .then((mod) => { return mod.json() })
        .then((desc) => { this.miniHeadneckDetails = desc })
    },
    methods: {
    }

@@ -5,8 +5,8 @@ export default {
  newUid (length) {
    const uidLength = length || 16
    const uidChars = 'abcdefghijklmnopqrstuvwxyz0123456789'
    let uid = []
    for (let i = 0; i < uidLength; i++) {
    var uid = []
    for (var i = 0; i < uidLength; i++) {
      uid.push(uidChars.charAt(Math.floor(Math.random() * ((i < 4) ? 26 : 36))))
    }
    return uid.join('')
@@ -14,23 +14,24 @@ export default {
  uploadData (imagePayload, classPayload, prevUid) {
    let uploadImage = new Promise (resolve => {
      const dataUid = prevUid || this.newUid(16)
      let byteChars = window.atob(imagePayload)
      let byteArrays = []
      var byteChars = window.atob(imagePayload)
      var byteArrays = []
      var len = byteChars.length

      for (let offset = 0; offset < byteChars.length; offset += 1024) {
        let slice = byteChars.slice(offset, offset + 1024)
        let byteNumbers = new Array(slice.length)
        for (let i = 0; i < slice.length; i++) {
      for (var offset = 0; offset < len; offset += 1024) {
        var slice = byteChars.slice(offset, offset + 1024)
        var byteNumbers = new Array(slice.length)
        for (var i = 0; i < slice.length; i++) {
          byteNumbers[i] = slice.charCodeAt(i)
        }

        let byteArray = new Uint8Array(byteNumbers)
        var byteArray = new Uint8Array(byteNumbers)
        byteArrays.push(byteArray)
      }
      const imageBlob = new Blob(byteArrays, {type: 'image/jpeg'})
      var imageBlob = new Blob(byteArrays, {type: 'image/jpeg'})

      let xhrJpg = new XMLHttpRequest()
      let uploadUrl = `https://nextcloud.azgeorgis.net/public.php/webdav/${dataUid}.jpeg`
      var xhrJpg = new XMLHttpRequest()
      var uploadUrl = `https://nextcloud.azgeorgis.net/public.php/webdav/${dataUid}.jpeg`
      xhrJpg.open("PUT", uploadUrl)
      xhrJpg.setRequestHeader('Content-Type', 'image/jpeg')
      xhrJpg.setRequestHeader('X-Method-Override', 'PUT')
@@ -38,8 +39,8 @@ export default {
      xhrJpg.setRequestHeader("Authorization", "Basic " + btoa("LKBm3H6JdSaywyg:"))
      xhrJpg.send(imageBlob)

      let xhrTxt = new XMLHttpRequest()
      uploadUrl = `https://nextcloud.azgeorgis.net/public.php/webdav/${dataUid}.txt`
      var xhrTxt = new XMLHttpRequest()
      var uploadUrl = `https://nextcloud.azgeorgis.net/public.php/webdav/${dataUid}.txt`
      xhrTxt.open("PUT", uploadUrl)
      xhrTxt.setRequestHeader('Content-Type', 'text/plain')
      xhrTxt.setRequestHeader('X-Method-Override', 'PUT')
@@ -50,7 +51,7 @@ export default {
      resolve(dataUid)
    })
    return uploadImage.then((newUid) => {
      const toast = f7.toast.create({
      var toast = f7.toast.create({
        text: 'Detections Uploaded: thank you.',
        closeTimeout: 2000
      })

@@ -1,51 +0,0 @@
export default {
  data () {
    return {
      touchPrevious: {}
    }
  },
  methods: {
    startTouch(event) {
      if (event.touches.length == 1) {
        this.touchPrevious = {x: event.touches[0].clientX, y: event.touches[0].clientY}
      }
      if (event.touches.length == 2) {
        let midX = (event.touches.item(0).clientX + event.touches.item(1).clientX) / 2
        let midY = (event.touches.item(0).clientY + event.touches.item(1).clientY) / 2
        this.touchPrevious = {distance: this.touchDistance(event.touches), x: midX, y: midY}
      }
    },
    endTouch(event) {
      if (event.touches.length == 1) {
        this.touchPrevious = {x: event.touches[0].clientX, y: event.touches[0].clientY}
      } else {
        //this.debugInfo = null
      }
    },
    moveTouch(event) {
      switch (event.touches.length) {
        case 1:
          this.canvasOffset.x += event.touches[0].clientX - this.touchPrevious.x
          this.canvasOffset.y += event.touches[0].clientY - this.touchPrevious.y
          this.touchPrevious = {x: event.touches[0].clientX, y: event.touches[0].clientY}
          break;
        case 2:
          let newDistance = this.touchDistance(event.touches)
          let midX = (event.touches.item(0).clientX + event.touches.item(1).clientX) / 2
          let midY = (event.touches.item(0).clientY + event.touches.item(1).clientY) / 2
          let zoomFactor = newDistance / this.touchPrevious.distance
          this.canvasZoom *= zoomFactor
          this.canvasOffset.x = (midX - 16) * (1 - zoomFactor) + this.canvasOffset.x * zoomFactor + (midX - this.touchPrevious.x)
          this.canvasOffset.y = (midY - 96) * (1 - zoomFactor) + this.canvasOffset.y * zoomFactor + (midY - this.touchPrevious.y)
          this.touchPrevious = {distance: newDistance, x: midX, y: midY}
          break;
      }
      this.selectChip("redraw")
    },
    touchDistance(touches) {
      let touch1 = touches.item(0)
      let touch2 = touches.item(1)
      return Math.sqrt((touch1.clientX - touch2.clientX) ** 2 + (touch1.clientY - touch2.clientY) ** 2)
    }
  }
}