Understanding softmax in torch_geometric


The `softmax` in torch_geometric's `utils` module may not compute what you first expect. It does not simply assign one weight to each node in the graph; instead, it computes a weight for every individual feature of every node. Since the official documentation gives no detailed explanation, I have annotated the relevant code below to make it easier to understand.

from typing import Optional

from torch import Tensor
from torch_scatter import scatter, segment_csr, gather_csr

from .num_nodes import maybe_num_nodes


def softmax(src: Tensor, index: Optional[Tensor], ptr: Optional[Tensor] = None,
            num_nodes: Optional[int] = None) -> Tensor:
    r"""Computes a sparsely evaluated softmax.
    Given a value tensor :attr:`src`, this function first groups the values
    along the first dimension based on the indices specified in :attr:`index`,
    and then proceeds to compute the softmax individually for each group.

    Args:
        src (Tensor): The source tensor.
        index (LongTensor): The indices of elements for applying the softmax.
        ptr (LongTensor, optional): If given, computes the softmax based on
            sorted inputs in CSR representation. (default: :obj:`None`)
        num_nodes (int, optional): The number of nodes, *i.e.*
            :obj:`max_val + 1` of :attr:`index`. (default: :obj:`None`)

    :rtype: :class:`Tensor`
    """
    if ptr is not None:
        src_max = gather_csr(segment_csr(src, ptr, reduce='max'), ptr)
        out = (src - src_max).exp()
        out_sum = gather_csr(segment_csr(out, ptr, reduce='sum'), ptr)
    elif index is not None:
        N = maybe_num_nodes(index, num_nodes)  # number of groups, i.e. the batch size when index is the batch vector
        """
        取每张图中节点特征的最大值,相当于对图中的所有节点进行了一次max_pooling
        """
        src_max = scatter(src, index, dim=0, dim_size=N, reduce='max')[index]
        out = (src - src_max).exp()
        """
        计算图中每个节点的注意力值,这里的注意力值并不是单纯地计算每个节点的注意力值,
        而是每个节点中特征的注意力值。这种做法更加精细。最后图表示向量中特征是图中所有
        节点相应特征的加权和。
        """
        out_sum = scatter(out, index, dim=0, dim_size=N, reduce='sum')[index]
    else:
        raise NotImplementedError

    return out / (out_sum + 1e-16)
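To make the grouping semantics concrete, here is a minimal pure-Python sketch (torch-free, for illustration only; `grouped_softmax` is my own name, not a library function) of what the `index` branch computes: a softmax over each group of entries that share the same index value, with the usual max-subtraction for numerical stability.

```python
import math

def grouped_softmax(src, index, eps=1e-16):
    """Softmax over each group of entries that share the same index value."""
    n = max(index) + 1
    # per-group maximum (the "max pooling" step), for numerical stability
    g_max = [-math.inf] * n
    for s, i in zip(src, index):
        g_max[i] = max(g_max[i], s)
    # exponentiate the shifted values
    exp = [math.exp(s - g_max[i]) for s, i in zip(src, index)]
    # per-group sum of the exponentials
    g_sum = [0.0] * n
    for e, i in zip(exp, index):
        g_sum[i] += e
    # normalize: each entry divided by its own group's sum
    return [e / (g_sum[i] + eps) for e, i in zip(exp, index)]

# Two groups: entries 0-2 belong to graph 0, entries 3-4 to graph 1.
attn = grouped_softmax([1.0, 2.0, 3.0, 1.0, 1.0], [0, 0, 0, 1, 1])
```

Each group's weights sum to 1 independently. In torch_geometric the same computation runs column-wise when `src` has a feature dimension, which is exactly why the attention is per feature rather than per node.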



Copyright notice: this is an original article by qq_41818256, licensed under CC 4.0 BY-SA. Please include a link to the original source and this notice when reposting.