Forwarded from Bioinformatics
Enhancing Molecular Network-Based Cancer Driver Gene Prediction Using Machine Learning Approaches: Current Challenges and Opportunities
Journal: Journal of Cellular and Molecular Medicine (I.F. = 4.3)
Publish year: 2025
Authors: Hao Zhang, Chaohuan Lin, Ying'ao Chen, ...
Universities: Wenzhou Medical University; University of Chinese Academy of Sciences, China
Study the paper
Channel: @Bioinformatics
#review #cancer #network #driver_gene #machine_learning
Machine Learning with Graphs: Design Space of Graph Neural Networks
Free recorded course by Prof. Jure Leskovec
This part discusses the important topic of GNN architecture design. Here, we introduce 3 key aspects of GNN design: (1) a general GNN design space, which includes intra-layer design, inter-layer design, and learning configurations; (2) a GNN task space with similarity metrics, so that we can characterize different GNN tasks and therefore transfer the best GNN models across tasks; (3) an effective GNN evaluation technique, so that we can convincingly evaluate any GNN design question, such as "Is BatchNorm generally useful for GNNs?". Overall, we provide the first systematic investigation of general guidelines for GNN design, an understanding of GNN tasks, and how to transfer the best GNN designs across tasks. We release GraphGym as an easy-to-use code platform for GNN architecture design. More information can be found in the paper: Design Space for Graph Neural Networks
Watch
Channel: @ComplexNetworkAnalysis
#video #course #Graph #GNN #Machine_Learning
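To make the "design space" idea concrete, here is a minimal sketch that enumerates candidate GNN designs as the Cartesian product of a few design dimensions. The dimensions loosely follow the paper's intra-layer / inter-layer / learning-configuration split, but the specific names and values are illustrative — they are not GraphGym's actual configuration schema:

```python
from itertools import product

# A toy GNN design space: each dimension lists its candidate choices.
design_space = {
    # intra-layer design
    "aggregation": ["mean", "max", "sum"],
    "batchnorm": [True, False],
    "activation": ["relu", "prelu"],
    # inter-layer design
    "num_layers": [2, 4, 6],
    "skip_connections": ["none", "skip-sum", "skip-cat"],
    # learning configuration
    "learning_rate": [0.01, 0.001],
}

# Every concrete GNN design is one point in the Cartesian product
# of all dimensions.
dims = list(design_space)
designs = [dict(zip(dims, combo)) for combo in product(*design_space.values())]

print(len(designs))   # 3 * 2 * 2 * 3 * 3 * 2 = 216 candidate designs
print(designs[0])
```

Even this toy space has 216 points, which is why the paper pairs the design space with a task space and an evaluation technique: exhaustively retraining every design on every task quickly becomes infeasible.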
arXiv.org
Design Space for Graph Neural Networks
The rapid evolution of Graph Neural Networks (GNNs) has led to a growing number of new architectures as well as novel applications. However, current research focuses on proposing and evaluating...
Graph Data Management and Graph Machine Learning: Synergies and Opportunities
Publish year: 2025
Authors: Arijit Khan, Xiangyu Ke, Yinghui Wu
Universities:
- Aalborg University, Denmark
- Zhejiang University, China
- Case Western Reserve University, USA
Study the paper
Channel: @ComplexNetworkAnalysis
#review #graph #machine_learning #data_management
Machine Learning with Graphs: GraphSAGE Neighbor Sampling
Free recorded course by Prof. Jure Leskovec
This part discusses Neighbor Sampling, a representative method for scaling GNNs up to large graphs. The key insight is that a K-layer GNN generates a node's embedding using only the nodes in the K-hop neighborhood around that node. Therefore, to generate embeddings for the nodes in a mini-batch, only the K-hop neighborhood nodes and their features need to be loaded onto the GPU, a tractable operation even if the original graph is large. To further reduce the computational cost, only a subset of the neighboring nodes is sampled for the GNN to aggregate.
Watch
Channel: @ComplexNetworkAnalysis
#video #course #Graph #GNN #Machine_Learning #GraphSAGE
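The sampling step described above can be sketched in a few lines of plain Python. This is a simplified illustration on an adjacency-list toy graph — the node IDs, the `fanout` parameter name, and the set-based bookkeeping are assumptions for clarity, not GraphSAGE's actual tensor-based implementation:

```python
import random

# Toy undirected graph as an adjacency list.
graph = {
    0: [1, 2, 3],
    1: [0, 2],
    2: [0, 1, 3, 4],
    3: [0, 2],
    4: [2],
}

def sample_k_hop(graph, seeds, k, fanout, rng):
    """Collect the nodes needed to embed `seeds` with a k-layer GNN,
    keeping at most `fanout` randomly sampled neighbors per node at
    each hop instead of the full neighborhood."""
    frontier = set(seeds)
    sampled = set(seeds)
    for _ in range(k):
        next_frontier = set()
        for node in frontier:
            neighbors = graph[node]
            chosen = rng.sample(neighbors, min(fanout, len(neighbors)))
            next_frontier.update(chosen)
        sampled.update(next_frontier)
        frontier = next_frontier
    # Only these nodes' features must be loaded onto the GPU.
    return sampled

rng = random.Random(0)
batch = [0]  # mini-batch of target nodes
nodes = sample_k_hop(graph, batch, k=2, fanout=2, rng=rng)
print(sorted(nodes))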
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 17.2 - GraphSAGE Neighbor Sampling
For more information about Stanford's Artificial Intelligence professional and graduate programs, visit: https://stanford.io/3Brn5kW
Lecture 17.2 - GraphSAGE Neighbor Sampling: Scaling up GNNs
Jure Leskovec
Computer Science, PhD
Neighbor Sampling is a representative…