As I mentioned in the first post, the plan is to run inference as a standalone process: we feed it image data and get back the predicted result. That requires some inter-process communication (IPC) framework. (I later realized a DLL would probably suffice, so I may fall back to a DLL someday.)
Everything here is done on Windows, so please bear with me.
First, as suggested by the official Bazel and TensorFlow tutorials, create a folder named tools under a root directory such as C:\ and install Bazel there, along with dependencies such as swigwin.
Second, git clone the TensorFlow source code from GitHub.
Third, under the directory C:\tensorflow\tensorflow\cc, create a folder named mnist; all of our work goes into this folder.
To implement the inter-process communication, I created two files, MnistComm.h and MnistComm.cpp; I will paste their content in the next post. I also created MnistModel.cc to load the model and run inference.
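The exact contents of MnistComm.h come in the next post. As a rough, hypothetical sketch of the kind of payload a shared-memory IPC scheme passes around (all names and fields below are my own placeholders, not the real header):

```cpp
#include <cstdint>
#include <type_traits>

// Hypothetical fixed-size message layout for shared-memory IPC:
// the client writes the image, the inference process writes the label.
struct MnistRequest {
    std::uint8_t pixels[28 * 28];  // one grayscale MNIST image
    std::int32_t predicted_label;  // filled in by the inference process
    std::int32_t ready;            // simple handshake flag
};

// A shared-memory payload must be trivially copyable: no pointers,
// no virtual tables, nothing that is only valid in one address space.
static_assert(std::is_trivially_copyable<MnistRequest>::value,
              "IPC payload must be trivially copyable");
```

The trivially-copyable constraint is the key design point: whatever the real MnistComm ends up looking like, the data crossing the process boundary has to be plain bytes.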
Fourth, let's create the files related to the build process.
1. BUILD file:
# Description:
#   TensorFlow is a computational framework, primarily for use in machine
#   learning applications.

package(
    default_visibility = ["//visibility:public"],
)

licenses(["notice"])  # Apache 2.0

exports_files(["LICENSE"])

load(":mnist.bzl", "mnist_copts")

cc_library(
    name = "MnistComm",
    srcs = ["MnistComm.cpp"],
    hdrs = ["MnistComm.h"],
    copts = ["/Zc:wchar_t", "/D_UNICODE", "/DUNICODE"],
    linkopts = [],
    deps = [],
)

cc_binary(
    name = "MnistModel",
    srcs = ["MnistModel.cc"],
    copts = mnist_copts(["/Zc:wchar_t", "/D_UNICODE", "/DUNICODE"]),
    deps = [
        ":MnistComm",
        "//tensorflow/core:tensorflow",
    ],
)
2. A tiny extension file named mnist.bzl:
def mnist_copts(fs):
    cflags = fs + [
        "-DEIGEN_AVOID_STL_ARRAY",
        "-Iexternal/gemmlowp",
        "-Wno-sign-compare",
        "-fno-exceptions",
    ] + select({
        "//tensorflow:windows": [
            "/DLANG_CXX11",
            "/D__VERSION__=\\\"MSVC\\\"",
        ],
        "//conditions:default": ["-pthread"],
    })
    return cflags
Now, how do we build these targets?
1. Open the MSYS2 console and set the environment variables:
cd c:/tensorflow
export JAVA_HOME="$(ls -d C:/Program\ Files/Java/jdk* | sort | tail -n 1)"
export BAZEL_SH=c:/tools/msys64/usr/bin/bash.exe
export BAZEL_VS="C:/Program Files (x86)/Microsoft Visual Studio 14.0"
export BAZEL_PYTHON="C:/Program Files/Python35/python.exe"
export PATH=$PATH:/c/tools/swigwin-3.0.10:/c/tools/bazel:/c/Program\ Files/Python35
Adjust these paths to match your own installation.
2. Configure and build:
./configure
bazel build -c opt --cpu=x64_windows_msvc --host_cpu=x64_windows_msvc //tensorflow/cc/mnist:MnistComm --verbose_failures
bazel build -c opt --cpu=x64_windows_msvc --host_cpu=x64_windows_msvc //tensorflow/cc/mnist:MnistModel --verbose_failures
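Once both targets build, the client side still has to marshal raw image bytes into the float input a typical MNIST graph expects (pixel values scaled to [0, 1]). A minimal, self-contained sketch of that conversion step; the helper name is my own, not part of the code above:

```cpp
#include <cstdint>
#include <vector>

// Scale raw 8-bit grayscale pixels (0-255) into floats in [0, 1],
// the input range a typical MNIST graph is trained on.
std::vector<float> NormalizePixels(const std::vector<std::uint8_t>& raw) {
    std::vector<float> out;
    out.reserve(raw.size());
    for (std::uint8_t p : raw) {
        out.push_back(static_cast<float>(p) / 255.0f);
    }
    return out;
}
```

The same normalization must match whatever preprocessing the model was trained with; if the training pipeline subtracted a mean or used a different scale, this helper has to mirror it exactly.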