CppFlow

Run TensorFlow models in C++ without Bazel, without a TensorFlow installation, and without compiling TensorFlow.

    // Read the graph
    Model model("graph.pb");
    model.init();
    
    // Prepare inputs and outputs
    auto input = new Tensor(model, "input");
    auto output = new Tensor(model, "output");
    
    // Run
    model.run(input, output);

CppFlow uses the TensorFlow C API to run the models, meaning you can use it without installing TensorFlow and without compiling the whole TensorFlow repository with Bazel; you just need to download the C API. With this project you can manage and run your models in C++ without worrying about void*, malloc or free. With CppFlow you can easily:

  • Open .pb models created with Python
  • Restore checkpoints
  • Save new checkpoints (see the sketch after this list)
  • Feed new data to your inputs
  • Retrieve data from the outputs
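
A minimal checkpoint round-trip could look like this. This is only a sketch: it assumes the Model class exposes a save() counterpart to restore() for writing checkpoints, which is implied by the feature list but not shown in the examples below.

// Load the graph and initialize the variables
Model model("graph.pb");
model.init();

// ... train or run the model ...

// Write the current variable values to a checkpoint (assumes Model::save exists)
model.save("train.ckpt");

// Later: load the same graph and restore the saved values
Model restored("graph.pb");
restored.restore("train.ckpt");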

How To Run It

Since CppFlow uses the TensorFlow C API, you only have to download the C API library. After that, you can build and run the examples:

git clone git@github.com:serizba/cppflow.git
cd cppflow/examples/load_model
mkdir build
cd build
cmake ..
make
./example

Usage

Suppose we have a saved graph defined by the following TensorFlow Python code (examples/load_model/create_model.py):

import tensorflow as tf

# Two simple inputs
a = tf.placeholder(tf.float32, shape=(1, 100), name="input_a")
b = tf.placeholder(tf.float32, shape=(1, 100), name="input_b")
# Output
c = tf.add(a, b, name='result')

Create Model

You need the graph definition in a .pb file to create a model (examples/load_model/model.pb). Then you can initialize it or restore it from a checkpoint:

Model model("graph.pb");
// Initialize the variables...
model.init();
// ... or restore from checkpoint
model.restore("train.ckpt")

Define Inputs and Outputs

You can create the Tensors from the names of the graph operations (if you don't know the names, use model.get_operations()):

auto input_a = new Tensor(model, "input_a");
auto input_b = new Tensor(model, "input_b");
auto output = new Tensor(model, "result");
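
If you are not sure which operation names the graph exposes, you can list them first. This is a sketch, assuming model.get_operations() returns an iterable collection of operation names (for example std::vector<std::string>); check the Model header for the exact return type.

// Print every operation name in the loaded graph
for (const auto& op_name : model.get_operations()) {
    std::cout << op_name << std::endl;
}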

Feed new data to the inputs

The expected inputs have shape=(1, 100), so we have to supply 100 elements:

// Create a vector data = {0, 1, 2, ..., 99} (std::iota requires <numeric>)
std::vector<float> data(100);
std::iota(data.begin(), data.end(), 0);

// Feed data to the inputs
input_a->set_data(data);
input_b->set_data(data);

Run and get the outputs

// Run!
model.run({input_a, input_b}, output);

// Write the output: 0, 2, 4, 6, ..., 198
for (float f : output->get_data<float>()) {
    std::cout << f << " ";
}
std::cout << std::endl;
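
Note that these examples allocate the Tensor objects with new and never delete them. In a longer-lived program you may prefer smart pointers. The following is a sketch assuming Model::run accepts raw Tensor pointers, as in the calls above (requires <memory>):

// Own the tensors with unique_ptr and hand raw pointers to run()
auto input_a = std::make_unique<Tensor>(model, "input_a");
auto input_b = std::make_unique<Tensor>(model, "input_b");
auto output  = std::make_unique<Tensor>(model, "result");

input_a->set_data(data);
input_b->set_data(data);

model.run({input_a.get(), input_b.get()}, output.get());
// Tensors are released automatically when the unique_ptrs go out of scope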
