deephyper.keras.layers.AttentionGAT
class deephyper.keras.layers.AttentionGAT(*args: Any, **kwargs: Any)[source]

Bases: tensorflow.keras.layers.Layer

GAT attention. See https://arxiv.org/abs/1710.10903 for details.
The attention coefficient between node \(i\) and node \(j\) is calculated as:

\[\text{LeakyReLU}(\textbf{a}(\textbf{Wh}_i||\textbf{Wh}_j))\]

where \(\textbf{a}\) is a trainable vector and \(||\) denotes concatenation.
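As an illustration of this formula (a minimal sketch, not the layer's actual implementation), the following computes the unnormalized coefficients edge-wise in TensorFlow. The names h, W, a, targets, and sources, the shapes, and the leaky slope of 0.2 (the value used in the GAT paper) are assumptions:

import tensorflow as tf

def gat_attention_coefficients(h, W, a, targets, sources):
    # Project node features: Wh has shape (N, F').
    Wh = tf.matmul(h, W)
    # Gather the projected features of both endpoints of every edge.
    Wh_i = tf.gather(Wh, targets)   # (E, F')
    Wh_j = tf.gather(Wh, sources)   # (E, F')
    # Concatenate (Wh_i || Wh_j) and contract with the trainable vector a.
    e = tf.tensordot(tf.concat([Wh_i, Wh_j], axis=-1), a, axes=1)  # (E,)
    # Slope 0.2 follows the GAT paper; the layer's actual value may differ.
    return tf.nn.leaky_relu(e, alpha=0.2)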
Methods

build
call(inputs, **kwargs) – Apply the layer on input tensors.
__call__(*args: Any, **kwargs: Any) → Any

Call self as a function.
call(inputs, **kwargs)[source]

Apply the layer on input tensors.
Parameters
inputs (list) –
X (tensor): node feature tensor
N (int): number of nodes
targets (tensor): target node index tensor
sources (tensor): source node index tensor
degree (tensor): node degree sqrt tensor (for GCN attention)

Returns
attention coefficient tensor

Return type
attn_coef (tensor)
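A hypothetical usage sketch, assuming the layer can be constructed with default arguments and that the inputs list is packed in the order documented above; neither the constructor signature nor the feature width of 8 is confirmed by this page:

import tensorflow as tf
from deephyper.keras.layers import AttentionGAT

N = 5                                # number of nodes
X = tf.random.normal((N, 8))         # node feature tensor
targets = tf.constant([0, 1, 2, 3])  # target node index per edge
sources = tf.constant([1, 2, 3, 4])  # source node index per edge
degree = tf.ones((N,))               # sqrt node-degree tensor (used by GCN attention)

layer = AttentionGAT()               # assumed default construction
attn_coef = layer([X, N, targets, sources, degree])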