Neural Network Structures Supported by e-AI Translator

The e-AI Translator (free version) targets microcontrollers with comparatively small ROM/RAM capacity. To keep the size of the generated library small, only functions that are commonly used in neural networks are supported.

Translatable Functions (Inference)

Neural network functions that are used only for training and not for inference (for example, tf.nn.l2_loss for evaluating loss) cannot be translated.
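
As a rough illustration of the kind of model the translator accepts, below is a minimal sketch of a tf-keras CNN built only from functions listed in the table that follows (Conv2D, MaxPool2D, Flatten, Dense, and the relu/softmax activations). The input shape, layer sizes, and output file name are arbitrary example values, not values from the e-AI Translator documentation; check the tool's manual for the exact model formats it accepts.

```python
# Minimal tf-keras model composed only of layers listed as translatable.
# Shapes and layer sizes are arbitrary example values (assumptions).
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, (3, 3), activation="relu",
                           input_shape=(28, 28, 1)),   # 1. Convolution Layers
    tf.keras.layers.MaxPool2D((2, 2)),                 # 2. Pooling layers
    tf.keras.layers.Flatten(),                         # 8. Utilities
    tf.keras.layers.Dense(32, activation="relu"),      # 6. Linear Layers
    tf.keras.layers.Dense(10, activation="softmax"),   # 4. Softmax output
])

# Save the inference graph; training-only ops such as tf.nn.l2_loss are
# never part of this model, so nothing untranslatable is included.
model.save("cnn_model.h5")
```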

| Type | PyTorch (C_FP32) | TensorFlow (tf-keras) (C_FP32) | Keras (C_FP32) | TensorFlow Lite API (New V2.1.0) | TensorFlow Lite C_INT8 |
|---|---|---|---|---|---|
| 1. Convolution Layers | nn.Conv2d | tf.keras.layers.Conv2D | keras.layers.Conv2D | tfl.conv_2d | support |
| | nn.Conv2d | tf.keras.layers.DepthwiseConv2D | keras.layers.DepthwiseConv2D | tfl.depthwise_conv_2d | support |
| | nn.ConvTranspose2d | tf.keras.layers.Conv2DTranspose | keras.layers.Conv2DTranspose | tfl.transpose_conv | support |
| | + | tf.keras.layers.add | keras.layers.add | | support |
| 2. Pooling layers | nn.MaxPool2d | tf.keras.layers.MaxPool2D | keras.layers.MaxPooling2D | tfl.max_pool_2d | support |
| | nn.MaxPool2d | tf.keras.layers.GlobalMaxPooling2D | keras.layers.GlobalMaxPooling2D | tfl.reduce_max | support |
| | nn.AvgPool2d | tf.keras.layers.AveragePooling2D | keras.layers.AveragePooling2D | tfl.average_pool_2d | support |
| | nn.AvgPool2d | tf.keras.layers.GlobalAveragePooling2D | keras.layers.GlobalAveragePooling2D | tfl.mean | support |
| 3. Non-linear Activations (weighted sum, nonlinearity) | nn.Hardsigmoid | tf.keras.activations.hard_sigmoid, tf.keras.layers.Activation("hard_sigmoid") | keras.activations.hard_sigmoid, keras.layers.Activation("hard_sigmoid") | None | - |
| | nn.LeakyReLU | tf.keras.layers.LeakyReLU | keras.layers.LeakyReLU | tfl.leaky_relu | support |
| | nn.ReLU | tf.keras.activations.relu, tf.keras.layers.Activation("relu"), tf.keras.layers.relu | keras.activations.relu, keras.layers.Activation("relu"), keras.layers.relu | tfl.relu (fused activation) | support |
| | nn.ReLU6 | tf.keras.activations.relu (max_value=6.), tf.keras.layers.relu (max_value=6.) | keras.activations.relu (max_value=6.), keras.layers.relu (max_value=6.) | tfl.relu with max_value = 6 | support |
| | nn.Sigmoid | tf.keras.activations.sigmoid, tf.keras.layers.Activation("sigmoid") | keras.activations.sigmoid, keras.layers.Activation("sigmoid") | tfl.logistic | support |
| | nn.Softplus | tf.keras.activations.softplus, tf.keras.layers.Activation("softplus") | keras.activations.softplus, keras.layers.Activation("softplus") | None | - |
| | nn.Softsign | tf.keras.activations.softsign, tf.keras.layers.Activation("softsign") | keras.activations.softsign, keras.layers.Activation("softsign") | None | - |
| | nn.Tanh | tf.keras.activations.tanh, tf.keras.layers.Activation("tanh") | keras.activations.tanh, keras.layers.Activation("tanh") | tfl.tanh | support |
| 4. Non-linear Activations (other) | nn.Softmax, nn.Softmax2d | tf.keras.activations.softmax, tf.keras.layers.Activation("softmax"), tf.keras.layers.Softmax | keras.activations.softmax, keras.layers.Activation("softmax"), keras.layers.Softmax | tfl.softmax | support |
| | torch.clip, torch.clamp | tf.keras.backend.clip (use lambda) | keras.backend.clip (use lambda) | None | - |
| 5. Normalization Layers | nn.BatchNorm2d | tf.keras.layers.BatchNormalization | keras.layers.BatchNormalization | tfl.mul, tfl.add | - (CBR only) |
| 6. Linear Layers | nn.Linear, torch.matmul | tf.keras.layers.Dense | keras.layers.Dense | tfl.fully_connected | support |
| 8. Utilities | nn.Flatten | tf.keras.layers.Flatten | keras.layers.Flatten | None | support (skip) |
| | view, torch.reshape | tf.keras.layers.Reshape | keras.layers.Reshape | None | support (skip) |
| 9. Merge layers (structure check) | + | tf.keras.layers.add | keras.layers.add | tfl.add | support |
| | - | tf.keras.layers.subtract | keras.layers.subtract | tfl.sub | support |
| | * | tf.keras.layers.multiply | keras.layers.multiply | tfl.mul | support |
| | torch.cat | tf.keras.layers.concatenate | keras.layers.concatenate | tfl.concatenation | support |
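
The merge layers in row 9 of the table (add, subtract, multiply, concatenate) are what make branch structures such as skip connections translatable, which is why ResNet-style models appear in the structures table below. The following is a minimal sketch of such a branch, again using only functions from the table; channel counts and the input shape are arbitrary example values.

```python
# Skip-connection sketch built from translatable functions only.
# Shapes and channel counts are arbitrary assumptions.
import tensorflow as tf

inputs = tf.keras.Input(shape=(32, 32, 16))
x = tf.keras.layers.Conv2D(16, (3, 3), padding="same", activation="relu")(inputs)
x = tf.keras.layers.Conv2D(16, (3, 3), padding="same")(x)
x = tf.keras.layers.add([x, inputs])          # 9. Merge layers -> tfl.add
x = tf.keras.layers.Activation("relu")(x)     # 3. Activation ("relu")
model = tf.keras.Model(inputs, x)
```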

Supported Neural Network Structures (e-AI Translator V2.1.0)

| Neural network name | Model name | Status | Note |
|---|---|---|---|
| AutoEncoder | | support | The Conv2D - Fully Connected - Deconv2D structure is also supported. |
| Convolutional Neural Network (CNN), without branch structures | LeNet | support | |
| | AlexNet | support | |
| | VGG | support | If the fully connected layers have a large number of channels, a huge memory capacity is required. |
| Convolutional Neural Network (CNN), with branch structures (skip connections, etc.) | Network in Network, GoogleNet | support | RAM usage reduction function is available (New V2.1.0). |
| | MobileNet | support | |
| | ResNet | support | |
| | SqueezeNet | support | |
| | SENet | support | |
| | ContextNet | - | The Swish activation function must be replaced with ReLU. |
| | Self Attention Net | - | Only SENet is supported. |
| Recurrent Neural Network (RNN) | RNNoise | - | Not supported. Consider using an SE module instead of an RNN. |
| Long Short Term Memory (LSTM) | | - | Not supported. Consider using an SE module instead of an LSTM. |
| Gated Recurrent Unit (GRU) | | - | Not supported. Consider using an SE module instead of a GRU. |
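
The table above repeatedly suggests an SE (Squeeze-and-Excitation) module as a substitute for recurrent layers. Below is a minimal sketch of an SE block assembled only from functions in the translatable list (GlobalAveragePooling2D, Dense, Reshape, multiply); the channel count and reduction ratio are arbitrary example values, not recommendations from the e-AI documentation.

```python
# Squeeze-and-Excitation block sketch using only translatable functions.
# Channel count (64) and reduction ratio (4) are arbitrary assumptions.
import tensorflow as tf

def se_block(feature_map, channels=64, reduction=4):
    # Squeeze: global average pooling -> one value per channel
    squeezed = tf.keras.layers.GlobalAveragePooling2D()(feature_map)
    # Excitation: two Dense layers producing per-channel weights in [0, 1]
    excited = tf.keras.layers.Dense(channels // reduction, activation="relu")(squeezed)
    excited = tf.keras.layers.Dense(channels, activation="sigmoid")(excited)
    excited = tf.keras.layers.Reshape((1, 1, channels))(excited)
    # Scale: reweight the original feature map channel by channel
    return tf.keras.layers.multiply([feature_map, excited])

inputs = tf.keras.Input(shape=(32, 32, 64))
outputs = se_block(inputs)
model = tf.keras.Model(inputs, outputs)
```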

Translatable Functions (e-AI Translator V1.6.0: inference)

| Type | TensorFlow | tf-keras | Keras | Caffe |
|---|---|---|---|---|
| 1. Convolution Layers | tf.nn.conv2d, tf.contrib.layers.conv2d, tf.layers.conv2d | tf.keras.layers.Conv2D | keras.layers.Conv2D | Convolution |
| | tf.nn.depthwise_conv2d | tf.keras.layers.DepthwiseConv2D | keras.layers.DepthwiseConv2D | None |
| | tf.nn.conv2d_transpose, tf.contrib.layers.conv2d_transpose, tf.layers.conv2d_transpose | tf.keras.layers.Conv2DTranspose | keras.layers.Conv2DTranspose | Deconvolution |
| | tf.add, tf.nn.bias_add, tf.contrib.layers.bias_add, + | tf.keras.layers.add | keras.layers.add | None |
| 2. Pooling layers | tf.nn.max_pool, tf.contrib.layers.max_pool2d, tf.layers.max_pooling2d | tf.keras.layers.MaxPool2D | keras.layers.MaxPooling2D | Pooling with pooling_param {pool: MAX} |
| | None | tf.keras.layers.GlobalMaxPooling2D | keras.layers.GlobalMaxPooling2D | None |
| | tf.nn.avg_pool, tf.contrib.layers.avg_pool2d, tf.layers.average_pooling2d | tf.keras.layers.AveragePooling2D | keras.layers.AveragePooling2D | Pooling with pooling_param {pool: AVE} |
| | None | tf.keras.layers.GlobalAveragePooling2D | keras.layers.GlobalAveragePooling2D | None |
| 3. Non-linear Activations (weighted sum, nonlinearity) | None | tf.keras.activations.hard_sigmoid, tf.keras.layers.Activation("hard_sigmoid") | keras.activations.hard_sigmoid, keras.layers.Activation("hard_sigmoid") | None |
| | tf.nn.leaky_relu | tf.keras.layers.LeakyReLU | keras.layers.LeakyReLU | None |
| | tf.nn.relu | tf.keras.activations.relu, tf.keras.layers.Activation("relu"), tf.keras.layers.relu | keras.activations.relu, keras.layers.Activation("relu"), keras.layers.relu | ReLU |
| | tf.nn.relu6 | tf.keras.activations.relu (max_value=6.), tf.keras.layers.relu (max_value=6.) | keras.activations.relu (max_value=6.), keras.layers.relu (max_value=6.) | None |
| | tf.sigmoid | tf.keras.activations.sigmoid, tf.keras.layers.Activation("sigmoid") | keras.activations.sigmoid, keras.layers.Activation("sigmoid") | Sigmoid |
| | tf.nn.softplus, tf.math.softplus | tf.keras.activations.softplus, tf.keras.layers.Activation("softplus") | keras.activations.softplus, keras.layers.Activation("softplus") | None |
| | tf.nn.softsign, tf.math.softsign | tf.keras.activations.softsign, tf.keras.layers.Activation("softsign") | keras.activations.softsign, keras.layers.Activation("softsign") | None |
| | tf.nn.tanh, tf.tanh | tf.keras.activations.tanh, tf.keras.layers.Activation("tanh") | keras.activations.tanh, keras.layers.Activation("tanh") | TanH |
| 4. Non-linear Activations (other) | tf.nn.softmax, tf.contrib.layers.softmax, tf.nn.softmax_cross_entropy_with_logits, tf.nn.softmax_cross_entropy_with_logits_v2 | tf.keras.activations.softmax, tf.keras.layers.Activation("softmax"), tf.keras.layers.Softmax | keras.activations.softmax, keras.layers.Activation("softmax"), keras.layers.Softmax | Softmax |
| | tf.clip_by_average_norm | None | None | None |
| | tf.clip_by_global_norm | None | None | None |
| | tf.clip_by_norm | None | None | None |
| | tf.clip_by_value | tf.keras.backend.clip (use lambda) | keras.backend.clip (use lambda) | None |
| 5. Normalization Layers | tf.layers.batch_normalization | tf.keras.layers.BatchNormalization | keras.layers.BatchNormalization | None |
| | tf.nn.lrn, tf.nn.local_response_normalization | None | None | LRN |
| 6. Linear Layers | tf.matmul, tf.contrib.layers.fully_connected, tf.layers.dense | tf.keras.layers.Dense | keras.layers.Dense | InnerProduct |
| 8. Utilities | tf.layers.flatten, tf.contrib.layers.flatten | tf.keras.layers.Flatten | keras.layers.Flatten | None |
| | tf.reshape | tf.keras.layers.Reshape | keras.layers.Reshape | Reshape, Split |
| 9. Merge layers (structure check) | tf.add, tf.math.add, + | tf.keras.layers.add | keras.layers.add | None |
| | tf.subtract, tf.math.subtract, - | tf.keras.layers.subtract | keras.layers.subtract | None |
| | tf.math.multiply, * | tf.keras.layers.multiply | keras.layers.multiply | None |
| | tf.concat | tf.keras.layers.concatenate | keras.layers.concatenate | None |
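
For the V1.6.0 toolchain, the TensorFlow column refers to the TensorFlow 1.x graph API. Below is a minimal sketch of an inference graph using only ops from the table (tf.nn.conv2d, tf.nn.bias_add, tf.nn.relu, tf.nn.max_pool, tf.reshape, tf.matmul, tf.nn.softmax); shapes, names, and initialization are arbitrary placeholders and assume the tf.compat.v1 compatibility module.

```python
# TensorFlow 1.x style inference graph using only ops from the V1.6.0 table.
# Shapes and variable names are arbitrary example values (assumptions).
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

x = tf.placeholder(tf.float32, shape=[1, 28, 28, 1], name="input")

w_conv = tf.Variable(tf.truncated_normal([3, 3, 1, 8], stddev=0.1))
b_conv = tf.Variable(tf.zeros([8]))
conv = tf.nn.relu(tf.nn.bias_add(
    tf.nn.conv2d(x, w_conv, strides=[1, 1, 1, 1], padding="SAME"), b_conv))
pool = tf.nn.max_pool(conv, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1],
                      padding="SAME")                # 2. Pooling layers

flat = tf.reshape(pool, [1, -1])                     # 8. Utilities: tf.reshape
w_fc = tf.Variable(tf.truncated_normal([14 * 14 * 8, 10], stddev=0.1))
b_fc = tf.Variable(tf.zeros([10]))
logits = tf.matmul(flat, w_fc) + b_fc                # 6. Linear Layers: tf.matmul
output = tf.nn.softmax(logits, name="output")        # 4. tf.nn.softmax
```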

Supported Neural Network Structures (e-AI Translator V1.6.0)

| Neural network name | Model name | Status | Note |
|---|---|---|---|
| AutoEncoder | | support | The Conv2D - Fully Connected - Deconv2D structure is also supported. |
| Convolutional Neural Network (CNN), without branch structures | LeNet | support | |
| | AlexNet | support | |
| | VGG | support | If the fully connected layers have a large number of channels, a huge memory capacity is required. |
| Convolutional Neural Network (CNN), with branch structures (skip connections, etc.) | Network in Network, GoogleNet | support | RAM usage reduction function is not available. |
| | MobileNet | support | RAM usage reduction function is not available. |
| | ResNet | support | RAM usage reduction function is not available. |
| | SENet | support | RAM usage reduction function is not available. |
| Recurrent Neural Network (RNN) | | - | Not supported. Consider using an SE module instead of an RNN. |
| Long Short Term Memory (LSTM) | | - | Not supported. Consider using an SE module instead of an LSTM. |