CNTK Evaluation Overview
This page presents an overview of model evaluation using CNTK.
Aside from training a model, CNTK provides two different ways of evaluating the model.
The first method uses the CNTK executable itself (similar to the training process), but with the "eval" command placed in the configuration file instead of the "train" command.
The alternate method for evaluating a CNTK model is to use the evaluation library. CNTK currently supports both the Eval V1 and the Eval V2 (Beta) libraries.
- The Eval V1 API can be used to evaluate models saved in the CNTK V1 model format. The Eval V1 library is available in C++ (on Windows and Linux) and C# (on Windows only).
- The Eval V2 API (Beta) can be used to evaluate models saved in both the CNTK V1 and V2 model formats. The Eval V2 library is currently available in C++ and Python (both on Windows and Linux). C# support is currently under development.
The Eval V2 library provides new features including:
- Support for both CPU and GPU devices (see the Python sketch after this list).
- Evaluation of multiple requests in parallel using multi-threading.
- Sharing of model parameters among multiple threads when the same model is loaded, which can significantly reduce memory usage when running evaluation in a service environment.
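As a rough illustration of this usage from Python, the sketch below loads a saved model onto a chosen device and evaluates a single request. The file name `model.dnn` and the randomly generated input are assumptions for the example, not taken from the CNTK samples.

```python
# Rough sketch: load a saved model and evaluate one request with the CNTK v2
# (Eval V2) Python API. "model.dnn" and the random input are illustrative.
import numpy as np
import cntk as C
from cntk.device import cpu, try_set_default_device  # gpu(0) would select a GPU

try_set_default_device(cpu())           # evaluation runs on CPU or GPU devices

model = C.load_model("model.dnn")       # loads V1- or V2-format model files
x = model.arguments[0]                  # the model's input variable
data = np.random.rand(1, *x.shape).astype(np.float32)   # a batch of one request
print(model.eval({x: data}))
```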
The following pages provide detailed information about model evaluation in different scenarios:
- Evaluating a model using cntk.exe
- Eval V1 API on Windows
- Eval V1 API on Linux
- Eval V2 API on Windows
- Eval V2 API on Linux
- Using Python for evaluation
- Evaluating a model in Azure
The CNTK source repository contains several C++/C# examples that consume the Eval libraries, as well as Python Eval examples.
CNTK allows users to save a model into a file for future use. This can be done by specifying "modelPath" in the config file when using BrainScript/cntk.exe, by calling save_model() in Python, or by calling SaveModel() in C++ when using the CNTK Library API.
CNTK can save a model in two file formats.
- The legacy V1 model format. This format is used by BrainScript/cntk.exe to save a model, or by the CNTK Library API when the parameter `use_legacy_format` of save_model() in Python, or `useLegacyModelFormat` of SaveModel() in C++, is set to `true`.
- The protobuf-based V2 model format. A model is saved in this format by the CNTK Library API when the parameter `use_legacy_format` of save_model() in Python, or `useLegacyModelFormat` of SaveModel() in C++, is set to `false`.
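For illustration, here is a minimal Python sketch of saving a model. The tiny network and file names are assumptions for the example, not taken from the CNTK samples.

```python
# Minimal sketch: define a tiny model and save it to disk (CNTK v2 Python API).
# The network and file names below are illustrative only.
import cntk as C

x = C.input_variable(2)
z = C.layers.Dense(1, activation=C.sigmoid)(x)

# Function.save() writes the protobuf-based V2 model format by default.
z.save("model.v2")

# On releases that expose save_model(), the legacy V1 format is selected by
# setting the use_legacy_format parameter described above to True.
```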
The following table presents an overview of which model format is created or consumed by which CNTK binary.
| | Model Creation | Model Evaluation | Language Support |
|---|---|---|---|
| Legacy V1 model format | BrainScript/cntk.exe; or V2 Library API when `useLegacyModelFormat=true` (by default) | Eval V1 API, Eval V2 API (Beta) | C++, C#, NuGet Package |
| Protobuf-based V2 model format | V2 Library API when `useLegacyModelFormat=false` | Eval V2 API (Beta) | C++, Python |
The CNTK binary download package also includes samples for using the Eval library in C++, C# and Python. To download this package, please see the CNTK Releases page.
Under the Examples\Evaluation folder there are some code samples demonstrating how to use the CNTK Eval library in C++ and C#.
- CPPEvalClient: this sample uses the C++ Eval V1 library.
- CPPEvalExtendedClient: this sample uses the C++ Eval V1 extended interface to evaluate an RNN model.
- CSEvalClient: this sample uses the C# Eval V1 library (Windows only). It uses the CNTK Evaluation NuGet package.
- CPPEvalV2Client: this sample uses the C++ Eval V2 API to evaluate a model. The sample also shows how to evaluate multiple requests in parallel and share model parameters among threads; a rough Python analogue is sketched below.
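As a rough Python analogue of that pattern (assuming a model file named `model.dnn`; the thread count and inputs are illustrative), each worker thread evaluates its own clone of the loaded function, while CloneMethod.share keeps a single copy of the parameters in memory:

```python
# Rough sketch: evaluate several requests in parallel while sharing model
# parameters among threads. "model.dnn" and the inputs are illustrative.
import threading
import numpy as np
import cntk as C

root = C.load_model("model.dnn")

def worker(idx):
    # CloneMethod.share reuses the original parameter tensors instead of copying them.
    local = root.clone(C.CloneMethod.share)
    x = local.arguments[0]
    data = np.random.rand(1, *x.shape).astype(np.float32)
    print("request", idx, "->", local.eval({x: data}))

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```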
On Windows, the solution file EvalClients.sln is used to build and run the samples. Please note:
- You need Visual Studio 2013 Update 5 to use these samples.
- The samples should be built for the 64-bit target platform and with the Release configuration; otherwise issues arise when calling the library. Please also refer to the Troubleshoot CNTK page for more information.
- The required libs (EvalDll.lib or CNTKLibrary-2.0.lib) should be located in $(SolutionDir)..\..\cntk for building.
- After a successful build, the executable is saved under the $(SolutionDir)..\..\$(Platform)\$(ProjectName).$(Configuration)\ folder, e.g. ..\..\X64\CPPEvalClient.Release\CppEvalClient.exe.
- In order to run the program, the directory containing the DLLs (EvalDll.dll or CNTKLibrary-2.0.dll) and other dependent DLLs, usually $(SolutionDir)..\..\cntk, must be included in the DLL search path of your application, e.g. as part of the PATH environment variable.
On Linux, please refer to the Makefile for building the samples. The target names EVAL_CLIENT, EVAL_EXTENDED_CLIENT, and EVALV2_SAMPLE_CLIENT are used to build these projects.
You can also use Python to evaluate a pre-trained model as described here.