
1、Motivation:

I want to modify the value of some parameter;

I want to check the value of some parameter.

The functions needed:

2、state_dict() # returns an OrderedDict

model.modules() # returns a generator

named_parameters() # returns a generator of (name, parameter) pairs
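A quick, self-contained way to confirm these return types (a minimal sketch using a throwaway model, not the one from the example below):

from torch import nn

m = nn.Sequential(nn.Linear(4, 2))
print(type(m.state_dict()))        # <class 'collections.OrderedDict'>
print(type(m.modules()))           # <class 'generator'>
print(type(m.named_parameters()))  # <class 'generator'>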

from torch import nn
import torch

# create a simple model
model = nn.Sequential(
  nn.Conv3d(1, 16, kernel_size=1),
  nn.Conv3d(16, 2, kernel_size=1))  # we want to print the weights of this layer
input = torch.randn([1, 1, 16, 256, 256])
if torch.cuda.is_available():
  print('cuda is available')
  model.cuda()
  input = input.cuda()

# print the parameter names in the model
for name in model.state_dict():
  print(name)
# Then I know that the name of the target layer's weight is '1.weight'
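# For this two-layer model, the loop above prints:
# 0.weight, 0.bias, 1.weight, 1.bias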

# scheme 1 (recommended)
print(model.state_dict()['1.weight'])

# scheme 2
params = list(model.named_parameters())  # get the index by debugging
print(params[2][0])       # name
print(params[2][1].data)  # data

# scheme 3
params = {}  # collect the (name, param) pairs from the generator into a dict
for name, param in model.named_parameters():
  params[name] = param.detach().cpu().numpy()
print(params['0.weight'])

# scheme 4
for layer in model.modules():
  if isinstance(layer, nn.Conv3d):
    print(layer.weight)

# print the parameter names and values of every layer
# scheme 1 (recommended)
for name, param in model.named_parameters():
  print(name, param)

# scheme 2
for name in model.state_dict():
  print(name)
  print(model.state_dict()[name])
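The motivation above also mentions modifying a parameter value. A minimal sketch of two common ways to do that, assuming the same model and the '1.weight' entry used above (the new values written here are just placeholders):

# modify a parameter in place, without tracking gradients
with torch.no_grad():
  model[1].weight.fill_(0.0)  # overwrite the second Conv3d layer's weights

# or edit the state_dict and load it back
sd = model.state_dict()
sd['1.weight'] = torch.ones_like(sd['1.weight'])
model.load_state_dict(sd)
print(model.state_dict()['1.weight'])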

That is all of this article on how to get the parameter names and values of a specific layer of a PyTorch model. I hope it gives you a useful reference, and thank you for your support.
