`onnx.helper` lets you build and assemble ONNX models by hand. The main protobuf objects involved are:

| Object | Description |
| --- | --- |
| ValueInfoProto | tensor name, element data type, and tensor shape |
| NodeProto | operator node: node name (optional), operator type, and input/output lists (list elements are value names) |
| GraphProto | the computation graph, built from tensor (value-info) nodes and operator nodes |
| ModelProto | the wrapper object around a GraphProto |
The most commonly used `onnx.helper` methods:

| Method | Description |
| --- | --- |
| onnx.helper.make_tensor_value_info | build a ValueInfoProto object |
| onnx.helper.make_tensor | build a TensorProto from the given parameters (unlike a ValueInfoProto, it carries concrete values) |
| onnx.helper.make_node | build a NodeProto object (the input list holds names defined earlier) |
| onnx.helper.make_graph | build a GraphProto object (the input list holds objects defined earlier) |
| make_model(graph, **kwargs) | wrap a GraphProto into a ModelProto object |
| make_sequence | create a sequence from the given values |
| make_operatorsetid / make_opsetid | create an operator-set id |
| make_model_gen_version | a make_model variant that infers the model's IR_VERSION on a best-effort basis when it is not specified |
| set_model_props | set the model's metadata properties |
| make_map | create a map from the given key/value pairs |
| make_attribute / get_attribute_value | build / read an attribute |
| make_empty_tensor_value_info / make_sparse_tensor | build an empty ValueInfoProto / a sparse TensorProto |
Extracting a sub-model:
```python
import onnx

onnx.utils.extract_model('whole_model.onnx', 'partial_model.onnx', ['22'], ['28'])
```
Adding extra outputs during extraction:
```python
onnx.utils.extract_model('whole_model.onnx', 'submodel_1.onnx', ['22'], ['27', '31'])
# Originally only node 31's value was an output; now node 27's value is emitted as well.
```

Let's try building a model with these helpers:

```python
import onnx
from onnx import helper
from onnx import TensorProto
import numpy as np


def create_initializer_tensor(
        name: str,
        tensor_array: np.ndarray,
        data_type: onnx.TensorProto = onnx.TensorProto.FLOAT,
) -> onnx.TensorProto:
    # Build a TensorProto that carries the concrete values.
    initializer_tensor = onnx.helper.make_tensor(
        name=name,
        data_type=data_type,
        dims=tensor_array.shape,
        vals=tensor_array.flatten().tolist())
    return initializer_tensor


# input and output
a = helper.make_tensor_value_info('a', TensorProto.FLOAT, [None, 3, 10, 10])
x = helper.make_tensor_value_info('weight', TensorProto.FLOAT, [10, 10])
b = helper.make_tensor_value_info('b', TensorProto.FLOAT, [None, 3, 10, 10])
output = helper.make_tensor_value_info('output', TensorProto.FLOAT, [None, None, None, None])

# Mul
mul = helper.make_node('Mul', ['a', 'weight'], ['c'])
# Add
add = helper.make_node('Add', ['c', 'b'], ['output_of_liner'])

# Conv
conv1_W_initializer_tensor_name = 'Conv1_W'
conv1_W_initializer_tensor = create_initializer_tensor(
    name=conv1_W_initializer_tensor_name,
    tensor_array=np.ones(shape=(1, 3, *(2, 2))).astype(np.float32),
    data_type=onnx.TensorProto.FLOAT)
conv1_B_initializer_tensor_name = 'Conv1_B'
conv1_B_initializer_tensor = create_initializer_tensor(
    name=conv1_B_initializer_tensor_name,
    tensor_array=np.ones(shape=(1,)).astype(np.float32),
    data_type=onnx.TensorProto.FLOAT)

conv_node = onnx.helper.make_node(
    name='Convnodename',  # Name is optional.
    op_type='Conv',
    # Must follow the order of input and output definitions.
    # https://github.com/onnx/onnx/blob/rel-1.9.0/docs/Operators.md#inputs-2---3
    inputs=[
        'output_of_liner',
        conv1_W_initializer_tensor_name,
        conv1_B_initializer_tensor_name,
    ],
    outputs=['output'],
    kernel_shape=(2, 2),
    # pads=(1, 1, 1, 1),
)

# graph and model
graph = helper.make_graph(
    [mul, add, conv_node],
    'test',
    [a, x, b],
    [output],
    initializer=[conv1_W_initializer_tensor, conv1_B_initializer_tensor],
)
model = helper.make_model(graph)

# check and save model
onnx.checker.check_model(model)
print(model)
onnx.save(model, 'test.onnx')
```

Evaluate the saved model with ONNX Runtime:
```python
import onnxruntime
# import numpy as np

sess = onnxruntime.InferenceSession('test.onnx', providers=['CPUExecutionProvider'])

a = np.random.rand(1, 3, 10, 10).astype(np.float32)
b = np.random.rand(1, 3, 10, 10).astype(np.float32)
x = np.random.rand(10, 10).astype(np.float32)

output = sess.run(['output'], {'a': a, 'b': b, 'weight': x})[0]
print(output)
```

For heavier graph editing, see https://github.com/NVIDIA/TensorRT/tree/master/tools/onnx-graphsurgeon