CNTK Evaluation Overview
This page presents an overview of model evaluation using CNTK.
Aside from training a model, CNTK provides two different ways of evaluating a trained model.
The first is to use the CNTK executable itself (similar to the training process), but with the "eval" command in the configuration file instead of the "train" command.
The second is to use the evaluation library. Currently CNTK supports both the Eval V1 and the Eval V2 (Beta) libraries:
- The Eval V1 API can be used to evaluate models saved in the CNTK V1 model format. The Eval V1 library is available in C++ (on Windows and Linux) and in C# (on Windows only).
- The Eval V2 API (Beta) can be used to evaluate models saved in either the CNTK V1 or V2 model format. The Eval V2 library is currently available in C++ and Python (both on Windows and Linux). C# support is currently under development.
The Eval V2 library provides new features, including:
- Support for both CPU and GPU devices.
- Evaluation of multiple requests in parallel using multi-threading.
- Sharing of model parameters among multiple threads when the same model is loaded, which significantly reduces memory usage when running evaluation in a service environment (see the sketch below).
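As a rough illustration of the last two points, here is a minimal Python sketch of parallel evaluation with shared parameters using the CNTK Library (Eval V2) API. The model file name, input data, and thread count are placeholders, and the exact API surface may vary slightly between CNTK versions; the general pattern is to clone the loaded function with shared parameters in each thread.

```python
import threading
import numpy as np
import cntk as C

# Load a trained model once; "model.dnn" is a placeholder file name.
model = C.load_model("model.dnn")

def evaluate(thread_id):
    # CloneMethod.share gives each thread its own Function instance while the
    # underlying parameters stay shared, so the weights exist in memory only once.
    local_model = model.clone(C.CloneMethod.share)
    input_var = local_model.arguments[0]
    # A random single-sample batch matching the model's input shape, purely for illustration.
    batch = np.random.rand(1, *input_var.shape).astype(np.float32)
    result = local_model.eval({input_var: batch})
    print("thread", thread_id, "produced output of shape", np.asarray(result).shape)

threads = [threading.Thread(target=evaluate, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```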
The following pages provide detailed information about model evaluation in different scenarios:
- Evaluating a model using cntk.exe
- Evaluating a model using Eval V1 API on Windows
- Evaluating a model using Eval V1 API on Linux
- Evaluating a model using Eval V2 API on Windows
- Evaluating a model using Eval V2 API on Linux
- Evaluating a model using Python
- Evaluating a model in Azure
The CNTK source repository has several C++/C# examples consuming both Eval libraries, as well as Python Eval examples.
CNTK allows users to save a model into a file for future use. This can be done by specifying `modelPath` in the configuration file when using BrainScript/cntk.exe, by calling `save_model` in Python, or by calling `SaveModel` in C++ when using the CNTK Library API.
CNTK can save a model in two file formats:
- The legacy V1 model format. A model is saved in this format by BrainScript with cntk.exe, or by the CNTK Library API when the parameter `use_legacy_format` of `save_model()` in Python, or `useLegacyModelFormat` of `SaveModel()` in C++, is set to `true`.
- The protobuf-based V2 model format. A model is saved in this format by the CNTK Library API when the parameter `use_legacy_format` of `save_model()` in Python, or `useLegacyModelFormat` of `SaveModel()` in C++, is set to `false` (see the Python sketch after the table below).
The following table presents an overview of which model format is created or consumed by which CNTK binary.
| Model Format | Model Creation | Model Evaluation |
|---|---|---|
| Legacy V1 model format | BrainScript/cntk.exe; or the V2 Library API when `useLegacyModelFormat=true` (the default) | C++/C# Evaluation interfaces (NuGet package); or the CNTK Library API |
| Protobuf-based V2 model format | The V2 Library API when `useLegacyModelFormat=false` | The CNTK Library API |
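To make the two formats concrete, here is a minimal Python sketch of saving a model both ways. It assumes a CNTK 2.0 beta-style Python API in which the root Function exposes `save_model()` with the `use_legacy_format` parameter described above; the exact call may differ between CNTK versions, and the tiny network and file names are placeholders.

```python
import cntk as C

# A tiny placeholder network standing in for a trained model.
features = C.input_variable(2)      # hypothetical 2-dimensional input
z = C.layers.Dense(1)(features)     # hypothetical single-output layer

# use_legacy_format selects between the two on-disk formats described above.
z.save_model("model_v2.cmf", use_legacy_format=False)  # protobuf-based V2 format
z.save_model("model_v1.dnn", use_legacy_format=True)   # legacy V1 format
```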
The CNTK binary download package also includes samples for using the eval library in C++, C# and Python. To download this package please see the CNTK Releases page.
Under the `Examples\Evaluation` folder there are some code samples demonstrating how to use the CNTK Evaluation library in C++ and C#:
- `CPPEvalClient`: the sample for using the C++ Eval V1 library.
- `CPPEvalExtendedClient`: the sample for using the C++ Eval V1 extended interface to evaluate an RNN model.
- `CSEvalClient`: the sample for using the C# Eval V1 library (Windows only). The sample uses the CNTK Evaluation NuGet package.
- `CPPEvalV2Client`: the sample showing how to use the C++ Eval V2 API to evaluate a model, including how to evaluate multiple requests in parallel and how to share model parameters among threads.
On Windows, the solution file [EvalClients.sln](https://github.com/Microsoft/CNTK/blob/master/Examples/Evaluation/EvalClients.sln) can be used to build and run the samples. Please note:
- You need Visual Studio 2013 Update 5 to use these samples.
- The samples should be built for the 64-bit target platform and with the Release configuration; otherwise issues arise when calling the library. Please also refer to the Troubleshoot CNTK page for more information.
- The required libs (`EvalDll.lib` or `CNTKLibrary-2.0.lib`) must be located in `$(SolutionDir)..\..\cntk` for building.
- After a successful build, the executable is saved under the `$(SolutionDir)..\..\$(Platform)\$(ProjectName).$(Configuration)\` folder, e.g. `..\..\X64\CPPEvalClient.Release\CppEvalClient.exe`.
- In order to run the program, the directory containing the DLLs (`EvalDll.dll` or `CNTKLibrary-2.0.dll`) and other dependent DLLs, usually `$(SolutionDir)..\..\cntk`, must be included in the DLL search path of your application, e.g. as part of the PATH environment variable.
On Linux, please refer to the Makefile for how to build the samples. The targets `EVAL_CLIENT`, `EVAL_EXTENDED_CLIENT`, and `EVALV2_SAMPLE_CLIENT` are used to build these projects.
You can also use Python to evaluate a pre-trained model as described here.
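For reference, a minimal sketch of the basic Python evaluation pattern looks like the following. The model file name and input data are placeholders, and the exact API surface may vary slightly between CNTK versions.

```python
import numpy as np
import cntk as C

# Load a previously saved model; "model.dnn" is a placeholder file name.
model = C.load_model("model.dnn")

# Build a single-sample batch matching the model's (assumed fixed) input shape.
input_var = model.arguments[0]
batch = np.random.rand(1, *input_var.shape).astype(np.float32)

# eval() runs a forward pass and returns the network output for the batch.
output = model.eval({input_var: batch})

# For a classifier, the predicted class is the argmax over the output scores.
print("predicted class:", np.argmax(np.asarray(output), axis=-1))
```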