Viewing the names of a network's weights (TensorFlow) for L2 regularization


Background: we want to add an L2 regularization term to the loss.

Start by defining a simple convolutional layer:

import tensorflow as tf

# Define a convolution op (TF 1.x style)
def conv_op(x, name, n_out, training, useBN, activation=1, kh=3, kw=3, dh=1, dw=1, padding="SAME"):
    n_in = x.get_shape()[-1].value  # number of input channels
    with tf.name_scope(name) as scope:
        w = tf.Variable(tf.random_normal([kh, kw, n_in, n_out], stddev=0.01), name=scope + 'w')
        b = tf.Variable(tf.random_normal([n_out], stddev=0.01), name=scope + 'b')
        conv = tf.nn.conv2d(x, w, [1, dh, dw, 1], padding=padding)
        z = tf.nn.bias_add(conv, b)
    ..............

# From the definition above, weight names end in 'w' and bias names end in 'b'.
# If you are still unsure, just build a simple convolution op:
a = tf.constant([[[[1.0,2.0,3.0],[4.0,5.0,6.0],[7.0,8.0,9.0]]]])
b = conv_op(a, 'b', 3, training=True, useBN=True, kh=2, kw=2)

# Then list the trainable variables in the graph
tf.trainable_variables()

[out]:

[<tf.Variable 'b/b/w:0' shape=(2, 2, 3, 3) dtype=float32_ref>,
 <tf.Variable 'b/b/b:0' shape=(3,) dtype=float32_ref>,
 <tf.Variable 'b_1/b_1/w:0' shape=(2, 2, 3, 3) dtype=float32_ref>]

As shown, every weight's name contains "w".
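One caveat: filtering on a bare "w" substring will also match any variable whose *scope* happens to contain a "w". A small sketch on plain strings (the names below are illustrative, not from the graph above) shows the difference between the loose substring test and a stricter check on the final name component:

```python
# Illustrative variable names, imitating TF's 'scope/scope/name:0' format.
names = ['b/b/w:0', 'b/b/b:0', 'down/down/w:0', 'down/down/b:0']

# Loose substring match: any name containing 'w' anywhere.
loose = [n for n in names if 'w' in n]

# Stricter match: only names whose last path component is exactly 'w:0'.
strict = [n for n in names if n.split('/')[-1] == 'w:0']

print(loose)   # the bias 'down/down/b:0' sneaks in via the 'w' in 'down'
print(strict)  # only the true weight tensors
```

The same `n.split('/')[-1] == 'w:0'` test can be applied to `var.name` when filtering `tf.trainable_variables()`, so biases never receive an L2 penalty by accident.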

So, when defining the L2 regularization loss:

total_vars = tf.trainable_variables()
weights_name_list = [var for var in total_vars if "w" in var.name]
loss_holder = []
for w in weights_name_list:
    l2_loss = tf.nn.l2_loss(w)
    loss_holder.append(l2_loss)
regular_loss = tf.reduce_mean(loss_holder) * lamda  # lamda is the regularization coefficient
loss = loss_pred + regular_loss
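`tf.nn.l2_loss(t)` computes half the sum of squares of `t`, and the code above averages these per-tensor values before scaling by the coefficient. A small NumPy check of that arithmetic (the weight values and `lamda` below are made up for illustration):

```python
import numpy as np

def l2_loss(t):
    # Same semantics as tf.nn.l2_loss: half the sum of squared entries.
    return np.sum(t ** 2) / 2.0

# Made-up weight tensors and coefficient, for illustration only.
weights = [np.array([[1.0, 2.0], [3.0, 4.0]]),  # l2_loss = 30 / 2 = 15.0
           np.array([2.0, 2.0])]                # l2_loss =  8 / 2 =  4.0
lamda = 0.01

loss_holder = [l2_loss(w) for w in weights]
regular_loss = np.mean(loss_holder) * lamda     # mean(15, 4) * 0.01 = 0.095
print(regular_loss)
```

Using the mean (rather than `tf.add_n`, which sums) keeps the penalty's scale roughly independent of how many layers the network has; with a plain sum, `lamda` would need retuning whenever layers are added.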



Copyright notice: this is an original article by shiny0910, released under the CC 4.0 BY-SA license. Please include a link to the original source and this notice when reprinting.