license: cc

Install the following Python libraries:

pip3 install tensorflow
pip3 install tensorflowjs
pip3 install tf2onnx
pip3 install onnxruntime
pip3 install pillow
pip3 install optimum[exporters]

Switch to a NumPy version compatible with TensorFlow:

pip3 uninstall numpy
pip3 install numpy==1.23.5

Node Install

Download and install the project dependencies:

npm install

Summary of Commands:

  • Run the Node training script to save the layers model.
  • Convert tfjs_layers_model → tfjs_graph_model.
  • Convert the graph model to ONNX.
  • Validate the ONNX structure.
  • Test the model (Python and Node).

1. Create the TensorFlow model in Node

This script loops through the training images, using each image's base folder name as its label. Once complete, saved-model/model.json is created.

node generate.js

2. Convert Model

Convert the layers model to a graph model; this is required to generate an ONNX model with tf2onnx.

tensorflowjs_converter --input_format=tfjs_layers_model \
  --output_format=tfjs_graph_model \
  ./saved-model/layers-model/model.json \
  ./saved-model/graph-model

3. Convert to ONNX Model

This converts the graph model to an ONNX model that can be used with transformers.js on the web or in Node.js.

python3 -m tf2onnx.convert --tfjs ./saved-model/graph-model/model.json --output ./saved-model/model.onnx

I have not yet found a way to use Optimum with TensorFlow.js models.

4. Validate ONNX

Make sure the conversion worked and the model has no structural issues:

python3 validate_onnx.py
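
For reference, here is a minimal sketch of the kind of structural check a script like validate_onnx.py might perform, assuming only the onnx Python package; the model path matches the output of step 3:

```python
import onnx
from onnx import checker

# Load the converted model produced in step 3.
model = onnx.load("./saved-model/model.onnx")

# Raises an exception if the model structure is invalid.
checker.check_model(model)

# Print the graph inputs and outputs so their tensor names
# can be reused in the test scripts.
for inp in model.graph.input:
    print("input: ", inp.name)
for out in model.graph.output:
    print("output:", out.name)

print("ONNX model is structurally valid.")
```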

5. Test the ONNX Model (Python)

Update the image path in the script to point to a test image and confirm the model works as expected.

  • I tested against one of the training images, which should give a prediction of 1.

python3 test_image.py

Inference outputs: [array([[0., 1.]], dtype=float32)]
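
For reference, here is a minimal sketch of the kind of inference a script like test_image.py might run, assuming onnxruntime, numpy, and pillow. The image path, the 64×64 input size, the [0, 1] scaling, and the NHWC layout are placeholders/assumptions and should match how the model was trained:

```python
import numpy as np
import onnxruntime as ort
from PIL import Image

# Placeholder path: point this at one of the training images.
IMAGE_PATH = "./training-images/example.jpg"

# Assumed input size; adjust to match what the model was trained on.
INPUT_SIZE = (64, 64)

# Load and preprocess: RGB, resize, scale to [0, 1], add a batch dimension.
img = Image.open(IMAGE_PATH).convert("RGB").resize(INPUT_SIZE)
x = np.expand_dims(np.asarray(img, dtype=np.float32) / 255.0, axis=0)

# Run the converted model from step 3 and print the raw outputs.
session = ort.InferenceSession("./saved-model/model.onnx")
input_name = session.get_inputs()[0].name
outputs = session.run(None, {input_name: x})
print("Inference outputs:", outputs)
```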

6. Test the ONNX Model (JS, onnxruntime-node)

Update the image path in the script to point to a test image and confirm the model works as expected.

node onnxruntime-node

Inference outputs: Tensor { cpuData: Float32Array(2) [ 0, 1 ], dataLocation: 'cpu', type: 'float32', dims: [ 1, 2 ], size: 2 }