tensorflow_lite v2.2.1

TensorFlow Lite bindings for the Crystal language

tensorflow_lite

A library for running TF Lite models

  • once you've trained a model in TensorFlow you can convert it to TF Lite for production use
  • inspect the TF Lite model using netron.app
  • some good TF models for object detection (need conversion)

Also see the project documentation

Installation

  1. Add the dependency to your shard.yml:

    dependencies:
      tensorflow_lite:
        github: spider-gazelle/tensorflow_lite
    
  2. Run shards install

Usage

See the specs for basic usage or have a look at imagine

require "tensorflow_lite"

You can use the example metadata extractor to obtain the metadata for TF Lite models downloaded from tfhub.dev

With an EdgeTPU

Such as a Coral USB device

require "tensorflow_lite/edge_tpu"

To install the EdgeTPU delegate:

# Add Google Cloud public key
wget -q -O - https://packages.cloud.google.com/apt/doc/apt-key.gpg | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/coral-edgetpu.gpg > /dev/null

# Add Coral packages repository
echo "deb [signed-by=/etc/apt/trusted.gpg.d/coral-edgetpu.gpg] https://packages.cloud.google.com/apt coral-edgetpu-stable main" | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list

# install the lib
sudo apt update
sudo apt install libedgetpu-dev

To install the Coral USB drivers

sudo apt install libedgetpu1-std
# OR for max frequency
sudo apt install libedgetpu1-max

# unplug and re-plug the coral or run this
sudo systemctl restart udev

NOTE:: when using a Coral device, running lsusb should list it as either:

  • Global Unichip Corp.
  • Google Inc.

After running something on the chip its USB identity changes to Google Inc.

You need to use the Google Inc. identity when referencing the device in any Dockerfiles.

Development

To update the TensorFlow Lite bindings, run ./generate_bindings.sh

Lib installation

Dockerfile

The Dockerfile is used to build compatible TensorFlow Lite libraries for the target platforms. There is a pre-built image available: docker pull stakach/tensorflowlite:latest

To build an image run:

docker buildx build --progress=plain --platform linux/arm64,linux/amd64 -t stakach/tensorflowlite:latest --push .

To extract the libraries:

mkdir -p ./ext
docker pull stakach/tensorflowlite:latest
docker create --name tflite_tmp stakach/tensorflowlite:latest true

docker cp tflite_tmp:/usr/local/lib/libedgetpu.so ./ext/libedgetpu.so
docker cp tflite_tmp:/usr/local/lib/libtensorflowlite_c.so ./ext/libtensorflowlite_c.so
docker cp tflite_tmp:/usr/local/lib/libtensorflowlite_gpu_delegate.so ./ext/libtensorflowlite_gpu_delegate.so

docker rm tflite_tmp

This operation is performed by this library's postinstall script.

Old method

Requires the TensorFlow Lite C library to be installed; this is handled automatically by ./build_tensorflowlite.sh

  • there is a guide to building it
  • you can use ./build_tensorflowlite.sh to automate this
  • running then requires export LD_LIBRARY_PATH=/usr/local/lib
  • test that it installed successfully with crystal ./src/tensorflow_lite.cr
    • this will output Launching with tensorflow lite vx.x.x

NOTE:: the lib is installed for local use via a postinstall script. Make sure to distribute libtensorflowlite_c.so with your production app

Contributing

  1. Fork it (https://github.com/your-github-user/tensorflow_lite/fork)
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create a new Pull Request

Contributors

License

MIT License
