Preparing Dump Data of an Offline Model
Prerequisites
Before preparing dump data, convert the model and prepare the model file by referring to ATC Tool Instructions. If model quantization is involved, perform quantization by referring to Ascend Model Compression Toolkit Instructions (Caffe) or Ascend Model Compression Toolkit Instructions (TensorFlow) before model conversion. In addition, build and run the application project with the generated model file and ensure that the project runs properly.
- During the execution of the Ascend Model Compression Toolkit (AMCT), a quantization fusion file is also generated. This file is required later for model accuracy comparison.
- Dump is not supported inside Docker containers.
- Data dump can be implemented by calling either the aclInit() or the aclmdlSetDump() API, as illustrated in the sketch below.
Single-operator dump can be implemented only by calling the aclmdlSetDump() API.
For details about the aclInit() and aclmdlSetDump() APIs, see Application Software Development Guide.
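The following minimal C++ sketch shows how the dump configuration file might be passed to either API. The file name, error handling, and error-code macro are illustrative assumptions rather than an excerpt from this guide; see Application Software Development Guide for the exact workflow of your version.
// Sketch: enabling dump through aclInit() or aclmdlSetDump() (illustrative only).
#include "acl/acl.h"
#include "acl/acl_mdl.h"
#include <cstdio>

int main() {
    // Option 1: pass the dump configuration at initialization time.
    // "acl.json" is resolved relative to the binary file generated by the build.
    aclError ret = aclInit("acl.json");
    if (ret != ACL_SUCCESS) {
        printf("aclInit failed, error code: %d\n", ret);
        return -1;
    }

    // Option 2: if aclInit() was called without a dump configuration,
    // the same file can be supplied later through aclmdlSetDump().
    // ret = aclmdlSetDump("acl.json");

    // ... load and execute the offline model here; dump files are written
    // to the dump_path configured in acl.json ...

    aclFinalize();
    return 0;
}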
Generating Dump Data
Perform the following steps to dump data of the offline model:
- Open the project code and locate the acl.json path passed to the aclInit() or aclmdlSetDump() call.
If the path argument of aclInit() or aclmdlSetDump() is empty, pass it the path of the acl.json file created in Step 2. The acl.json path is interpreted relative to the directory of the binary file generated by the project build.
- Modify the acl.json file in that directory (if the file does not exist, create it in the out directory generated by the project build) and add the dump configuration in the following format.
{ "dump":{ "dump_list":[ { "model_name":"ResNet-101" }, { "model_name":"ResNet-50", "layer":[ "conv1conv1_relu", "res2a_branch2ares2a_branch2a_relu", "res2a_branch1", "pool1" ] } ], "dump_path":"/home/HwHiAiUser/output", "dump_mode":"output", "dump_op_switch":"off" } }
The dump configuration rules are as follows:
- In the sample, the dump, dump_list, and dump_path fields are required, while the model_name, layer, dump_mode, and dump_op_switch fields are optional.
- To dump all operators of a model, the layer field does not need to be included.
- To dump only certain operators, list each operator on its own line in the layer field and separate the entries with commas (,).
- To dump multiple models, add the dump configuration for each model and separate them with commas (,).
- To dump a single-operator model, leave dump_list empty and set dump_op_switch to on. (A sample configuration for this case is shown after the configuration item descriptions below.)
The configuration items are as follows:
- dump_list: list of network-wide models for data dump
- model_name: model name
If the model is loaded from a file, enter the model file name without the file name extension. If the model is loaded from memory, set this parameter to the value of the name field in the .json file generated after model conversion. For details about how to load a model, see AscendCL API Reference in Application Software Development Guide.
- layer: names of the operators to dump
- dump_path: dump path in the operating environment
Both absolute and relative paths (relative to the directory where the command is run) are supported.
- An absolute path starts with a slash (/), for example, /home/HwHiAiUser/output.
- A relative path starts with a directory name, for example, output.
For example, if dump_path is set to /home/HwHiAiUser/output, the dump data is generated to the /home/HwHiAiUser/output directory in the operating environment.
The directory specified by this parameter must be created in advance, and the user configured during installation must have read and write permissions on it.
- dump_mode: (optional) dump mode, either input, output (default), or all
- input: dumps the operator input only.
- output: dumps the operator output only.
- all: dumps both the operator input and output.
- dump_op_switch: (optional) dump switch of the single-operator model, either on or off (default)
- on: enables dump for the single-operator model.
- off: disables dump for the single-operator model.
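Based on the rules above, a minimal acl.json that dumps only single-operator models might look as follows. This is a sketch; the dump path is an illustrative assumption.
{
    "dump": {
        "dump_list": [],
        "dump_path": "/home/HwHiAiUser/output",
        "dump_mode": "output",
        "dump_op_switch": "on"
    }
}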
- For TBE and AI CPU operators that do not output results (for example, StreamActive, Send, Recv, and const), dump data is not generated. For operators that are not executed on the AI CPU or AI Core after build (for example, concatD), dump data cannot be generated.
- When only certain operators are to be dumped, note that Data operators are not executed on the AI CPU or AI Core; you therefore need to select all downstream operators of the Data operator instead.
- When modifying the acl.json file, ensure that each model_name is unique.
- If the model is loaded from a file, model_name can also be set to the value of the name field in the .json file generated after model conversion. Model names and operator names can be obtained, for example, from that .json file.
If the value of model_name in the acl.json file corresponds to both a model file name and a model name, the model file name takes effect.
- Run the application project to generate a dump file.
Find the generated dump file in {dump_path} in the operating environment. The path and format are described as follows:
time/deviceid/model_name/model_id/data_index/dump_file
- time: dump time, formatted as YYYYMMDDhhmmss
- deviceid: device ID
- model_name: model name
- model_id: model ID
- data_index: execution sequence number of each task, indexed starting at 0. The value increments by 1 for each dump.
- dump_file: formatted as {op_type}.{op_name}.{taskid}.{timestamp}
Periods (.), forward slashes (/), backslashes (\), and spaces in model_name, op_type or op_name are replaced with underscores (_).
Dump data of a single-operator model is generated to {dump_path}/time/deviceid/dump_file.
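For instance, assuming the sample dump_path above, the dump file of one operator might appear at a path similar to the following; every component value here (time, device ID, model name and ID, operator type and name, task ID, and timestamp) is hypothetical and shown only to illustrate the layout.
/home/HwHiAiUser/output/20210813102501/0/ResNet-50/1/0/Conv2D.conv1.12.1628821501123456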