A TensorFlow fork with shorter and cleaner naming conventions for functions and methods. TensorShort works on top of TensorFlow:
- Any existing TensorFlow code keeps working with TensorShort
- All new functions and methods are short aliases that mirror the originals
Here's a lovely example:

```python
# Original TensorFlow
tf.enable_eager_execution()

# TensorShort
eager()
```
- Use `t.` as the prefix for the TensorFlow API and `k.` as the prefix for the Keras API when calling methods (see the sketch after this list)
- See the full changelog here
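Since TensorShort is a fork, the short names live inside the modified TensorFlow source itself; the sketch below is only a hypothetical user-level shim (the `SimpleNamespace` layout and the particular aliases chosen are assumptions, not TensorShort's actual code) showing how the plain-alias idea could be wired up:

```python
# Hypothetical shim illustrating the aliasing idea; not TensorShort's actual source.
from types import SimpleNamespace

import tensorflow as tf
from tensorflow import keras

# Bare aliases simply point at the original callables.
eager = tf.enable_eager_execution   # eager() == tf.enable_eager_execution()
relu = tf.nn.relu
softmax = tf.nn.softmax

# Short `t.` namespace for the TensorFlow API ...
t = SimpleNamespace(reshape=tf.reshape, relu=tf.nn.relu, softmax=tf.nn.softmax)

# ... and short `k.` namespace for the Keras API.
k = SimpleNamespace(
    sequential=keras.Sequential,
    flatten=keras.layers.Flatten,
    dense=keras.layers.Dense,
)
```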
Here are some code comparisons between the original and TensorShort versions:
```python
# Original Keras
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax)
])
```

```python
# TensorShort
model = k.sequential([
    k.flatten(input_shape=(28, 28)),
    k.dense(128, activation=relu),
    k.dense(10, activation=softmax)
])
```
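Because existing TensorFlow code keeps working alongside the short names, the shortened model should train with the standard Keras calls; a small usage sketch, where the placeholder arrays and the `adam`/`sparse_categorical_crossentropy` choices are assumptions rather than part of TensorShort:

```python
import numpy as np

# Placeholder MNIST-shaped data; substitute real images and labels.
train_images = np.random.rand(64, 28, 28).astype("float32")
train_labels = np.random.randint(0, 10, size=(64,))

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_images, train_labels, epochs=1)
```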
```python
# Original TensorFlow
pool2_flat = tf.reshape(pool2, [-1, 7 * 7 * 64])
dense = tf.layers.dense(inputs=pool2_flat, units=1024, activation=tf.nn.relu)
dropout = tf.layers.dropout(
    inputs=dense, rate=0.4, training=mode == tf.estimator.ModeKeys.TRAIN)
```

```python
# TensorShort
pool2_flat = t.reshape(pool2, [-1, 7 * 7 * 64])
dense = t.dense(input=pool2_flat, units=1024, activation=t.relu)
dropout = t.dropout(
    input=dense, rate=0.4, train=mode == estimator.TRAIN)
```
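Renamed keyword arguments such as `input` (instead of `inputs`) and `train` (instead of `training`) can't be plain aliases; they need the signature itself to change. A minimal sketch, assuming thin wrapper functions, while the real fork presumably renames these inside the TensorFlow source:

```python
import tensorflow as tf

# Hypothetical wrappers illustrating the renamed keyword arguments.
def dense(input, units, activation=None, **kwargs):
    # Forward the shortened `input` argument to tf.layers.dense's `inputs`.
    return tf.layers.dense(inputs=input, units=units, activation=activation, **kwargs)

def dropout(input, rate=0.5, train=False, **kwargs):
    # `train` maps onto tf.layers.dropout's `training` flag.
    return tf.layers.dropout(inputs=input, rate=rate, training=train, **kwargs)
```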
TensorShort is an experimental project for personal use, maintained by Andrew Stepin.