MXNet (AI/ML) bindings for the Crystal language.
  • v0.3.0 - March 31, 2020
  • v0.2.0 - February 6, 2020
  • v0.1.0 - April 11, 2019

Deep Learning for Crystal

This shard provides MXNet bindings for the Crystal programming language. MXNet is a framework for machine learning and deep learning written in C++, supporting distributed training across multiple machines and multiple GPUs (if available). The Crystal API follows the design of the Python bindings, albeit with Crystal syntax. The following code:

require "mxnet"
a = MXNet::NDArray.array([[1, 2], [3, 4]])
b = MXNet::NDArray.array([1, 0])
puts a * b


[[1, 0], [3, 0]]
<NDArray 2x2 int32 cpu(0)>
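The multiplication above is element-wise with broadcasting: b is stretched across the rows of a. The same semantics can be sketched with NumPy (used here only for illustration; the shard itself does not depend on NumPy):

```python
import numpy as np

# b ([1, 0]) is broadcast across each row of a,
# zeroing the second column element-wise.
a = np.array([[1, 2], [3, 4]])
b = np.array([1, 0])
print(a * b)  # [[1 0]
              #  [3 0]]
```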


If you want to see what the library can do, check out toddsundsted/deep-learning, a collection of problems and solutions from Deep Learning - The Straight Dope, a set of notebooks teaching deep learning using MXNet.

Installation

The library requires MXNet.

Build MXNet from source (including the Python language bindings) or install the library from prebuilt packages using the Python package manager pip, per the MXNet installation instructions.
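For example, installing the prebuilt CPU package with pip might look like the following (the package name is the one published on PyPI; pick the variant that matches your platform and CUDA version):

```shell
# Install the prebuilt MXNet library (CPU build) from PyPI.
# GPU variants are published under names like mxnet-cu101.
pip install mxnet
```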

And add the following to your application's shard.yml:

    dependencies:
      mxnet:
        github: toddsundsted/

Troubleshooting

The bindings rely on the Python mxnet library to find the installed MXNet shared library. You can verify MXNet is installed with the following Python code:

import mxnet as mx
a = mx.ndarray.array([[1, 2], [3, 4]])
b = mx.ndarray.array([1, 0])
print(a * b)

which outputs:

[[1. 0.]
 [3. 0.]]
<NDArray 2x2 @cpu(0)>


On OSX, you may need to give your program a hint about the location of the MXNet shared library. If you build and run your program and see an error message like the following:

dyld: Library not loaded: lib/
  Referenced from: /Users/homedirectory/.cache/crystal/crystal-run-eval.tmp
  Reason: image not found

you need to either: 1) explicitly set the DYLD_FALLBACK_LIBRARY_PATH environment variable to point to the directory containing the shared library, or 2) move or copy the library into a well-known location (such as the project's own lib directory).
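The first option might look like the following (the install path here is only an example based on the directory shown below; adjust it to your installation, and src/your_program.cr is a placeholder for your own entry point):

```shell
# Point dyld at the directory containing the MXNet shared library,
# then build and run the program as usual.
export DYLD_FALLBACK_LIBRARY_PATH=/Users/homedirectory/mxnet-1.5.1/lib/python3.6/site-packages/mxnet
crystal run src/your_program.cr
```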

Alternatively, and more permanently, you can modify the shared library so that it knows where it's located at runtime (you will modify the library's LC_ID_DYLIB information):

LIBMXNET=/Users/homedirectory/mxnet-1.5.1/lib/python3.6/site-packages/mxnet/ # the full path to the shared library
install_name_tool -id $LIBMXNET $LIBMXNET

Status

The library currently implements a subset of Gluon, and supports a rich set of operations on arrays and symbols (arithmetic, trigonometric, hyperbolic, exponential and logarithmic, power, comparison, logical, rounding, sorting, searching, reduction, and indexing) with automatic differentiation built in.

Implemented classes:

  • MXNet
    • Autograd
    • Context
    • Executor
    • Optimizer
    • NDArray
    • Symbol
    • Gluon
      • Block
      • HybridBlock
      • Sequential
      • HybridSequential
      • SymbolBlock
      • Dense
      • Pooling
      • Conv1D
      • Conv2D
      • Conv3D
      • MaxPool1D
      • MaxPool2D
      • MaxPool3D
      • Flatten
      • L1Loss
      • L2Loss
      • SoftmaxCrossEntropyLoss
      • Activation
      • Trainer
      • Parameter
      • Constant