Article
Peer-Review Record

Efficient Graph Representation Learning by Non-Local Information Exchange†

Electronics 2025, 14(5), 1047; https://doi.org/10.3390/electronics14051047
by Ziquan Wei, Tingting Dan, Jiaqi Ding and Guorong Wu *
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Submission received: 4 February 2025 / Revised: 28 February 2025 / Accepted: 2 March 2025 / Published: 6 March 2025
(This article belongs to the Special Issue Artificial Intelligence in Graphics and Images)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

The paper is very interesting and well presented. The proposed formalization is solid and well explained. The experiments are robust and the overall contribution has the necessary academic rigor to be considered for publication. In what follows, I’ll highlight some aspects that, in my opinion, could be addressed to further improve the quality of the paper:

  • The Discussion section needs to be improved. Limitations, theoretical and practical implications need to be addressed in more detail. Implications are of utmost importance in this context. For instance, the method proposed by the authors could easily be used to create embeddings of communities in social networks with the aim of incorporating as much information as possible. Social network data analysis, and especially topic-oriented analysis such as the one in [10.1016/j.ipm.2023.103516], would benefit from this. I suggest the authors consider including this reference as well as others when discussing implications.
  • The Introduction does a decent job of discussing the related work. However, it feels like more importance could be given to this matter. In particular, starting from the previous works, I suggest introducing a table to summarize the most recent works and to highlight the novelty of the proposed work. The approach is very well presented, but it needs to be positioned better w.r.t. the related context.
  • For the presentation, new ideas with unique features compared to existing/published papers should be more highlighted.
  • Explain the feasibility of the results from the implementation and computational point of view.
  • The conclusions should be extended and future lines of research should be discussed with more care.

 

Author Response

Comments 1: The Discussion section needs to be improved. Limitations, theoretical and practical implications need to be addressed in more detail. Implications are of utmost importance in this context. For instance, the method proposed by the authors could easily be used to create embeddings of communities in social networks with the aim of incorporating as much information as possible. Social network data analysis, and especially topic-oriented analysis such as the one in [10.1016/j.ipm.2023.103516], would benefit from this. I suggest the authors consider including this reference as well as others when discussing implications.

Response 1: Thank you for this suggestion. We have revised the Discussion section with more detail on the implications by inserting the following at the end of the first paragraph: `The ease of using our method to create embeddings of communities in social networks advances the aim of incorporating as much information as possible. In particular, topic-oriented analysis of social networks [10.1016/j.ipm.2023.103516] can benefit from our method.`

Comments 2: The Introduction does a decent job of discussing the related work. However, it feels like more importance could be given to this matter. In particular, starting from the previous works, I suggest introducing a table to summarize the most recent works and to highlight the novelty of the proposed work. The approach is very well presented, but it needs to be positioned better w.r.t. the related context.

Response 2: Thank you for this suggestion. We have added a table to the Introduction section summarizing the most recent works and highlighting the novelty of our method in terms of long-range interaction, graph re-wiring, and anti-oversmoothing.

Comments 3: For the presentation, new ideas with unique features compared to existing/published papers should be more highlighted.

Response 3: Thank you. We have inserted the following statement at the end of the 3rd paragraph of the Introduction section to highlight our novelty: `As summarized in Table 1, we propose a non-local exchange form of graph re-wiring that addresses the problems of over-smoothing and over-squashing by increasing node expressibility prior to model training. In contrast to previous graph re-wiring methods, NLE performs the re-wiring while incorporating the original topology, analogous to non-local means for images.`

Comments 4: Explain the feasibility of the results from the implementation and computational point of view.

Response 4: Thank you. We have added a subsection to the Method section explaining the implementation and the computational costs: `Our \textit{ExM} framework is a GNN-agnostic plug-in; its implementation is shown in Algorithms 1 and 2. In the experiments, we analyse which variant should be used according to the degree and homophily ratio of the graph. Algorithm 1 incurs an additional $O(HN)$ space cost over any GNN to store the re-wired graph topology, while its time cost remains that of the original GNN. Algorithm 2 also requires $O(HN)$ space but doubles the GNN computation, since the model must run twice.`
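Since Algorithms 1 and 2 themselves are not reproduced in this record, the following is only an illustrative sketch of the general kind of multi-hop re-wiring step the response describes (adding edges to non-local neighbors within a hop range); the function name `rewire_hops` and its hop-range parameters are hypothetical stand-ins, not the paper's actual algorithm:

```python
from collections import deque

def rewire_hops(adj, k_start, k_end):
    """Add non-local edges from each node to its neighbors at hop
    distances in [k_start, k_end]. Illustrative sketch only; the
    paper's ExM/NLE algorithm may differ.
    adj: dict {node: set(neighbors)} of an undirected graph."""
    new_edges = set()
    for src in adj:
        # BFS from src, recording the hop distance of each reached node.
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            if dist[u] >= k_end:
                continue  # no need to expand beyond the hop range
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
                    if k_start <= dist[v] <= k_end:
                        new_edges.add((min(src, v), max(src, v)))
    # Return a re-wired copy: original edges plus the non-local ones.
    rewired = {u: set(nbrs) for u, nbrs in adj.items()}
    for u, v in new_edges:
        rewired[u].add(v)
        rewired[v].add(u)
    return rewired

# Path graph 0-1-2-3: re-wiring with hop range [2, 2] adds edges
# (0, 2) and (1, 3), while hop-3 pairs such as (0, 3) stay unconnected.
rewired = rewire_hops({0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}, 2, 2)
```

Storing up to a bounded number of added neighbors per node for hop budgets up to $H$ is consistent with the $O(HN)$ additional space the response mentions.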

Comments 5: The conclusions should be extended and future lines of research should be discussed with more care.

Response 5: Thank you. We have added future lines of research at the end of the Conclusion section: `Our future work includes (1) extensive benchmarks with existing GNN models and public datasets, (2) studying a learnable graph re-wiring scheme for non-local information exchange, and (3) developing deeper theoretical support for topological re-wiring, i.e., the topology transformation.`

Reviewer 2 Report

Comments and Suggestions for Authors

The paper is well written and deals with an important problem. The authors have provided a sufficient amount of information.
There are a number of points that could be improved:
1. The caption of Fig. 1 is rather strange; it should describe what is in the figure, not be part of your research description.
2. Fig. 2 needs at least a paragraph describing it; at the moment it requires a lot of effort from the reader to fully understand.
3. What is the computational complexity of Algorithms 1 and 2, with respect to the number of nodes, and perhaps with respect to the number of hops?
4. On what basis have you selected the datasets?
5. In the Discussion I expected to get an insight into why the proposed algorithm is much better for some datasets and only slightly better for others.
6. For which datasets/types of data would your approach be most beneficial?
7. The other issue for me is the lack of information on the complexity or time it adds, i.e., if I add your ExM into my pipeline, how much time/computational cost does it add?
8. Why is it worth adding?
9. Could you provide a longer discussion on how the type or structure of graphs influences the gain from your approach?
10. How does the number of classes influence the results?
11. How would the approach deal with dynamic changes in graphs?
12. Is the implementation of your module publicly available?

Author Response

Comments 1: The caption of Fig. 1 is rather strange; it should describe what is in the figure, not be part of your research description.

Response 1: Thank you. We have revised the caption of Fig. 1 to describe the figure elements: `Relationship between the expressibility of the original and three underlying graph topologies and modern GNN performance (node classification accuracy). Different markers represent different datasets; colors denote graph re-wiring methods. Note that all re-wiring methods are applied with the same hyperparameters as the baseline.`

Comments 2: Fig. 2 needs at least a paragraph describing it; at the moment it requires a lot of effort from the reader to fully understand.

Response 2: Thank you. We have consolidated the previously separate description of Fig. 2 into a full paragraph, now the 5th paragraph of the revised Introduction section.

Comments 3: What is the computational complexity of Algorithms 1 and 2, with respect to the number of nodes, and perhaps with respect to the number of hops?

Response 3: Thank you. Yes, the complexity also depends on the number of hops. We have added a subsection to the Method section explaining the implementation and the computational costs: `Our \textit{ExM} framework is a GNN-agnostic plug-in; its implementation is shown in Algorithms 1 and 2. In the experiments, we analyse which variant should be used according to the degree and homophily ratio of the graph. Algorithm 1 incurs an additional $O(HN)$ space cost over any GNN to store the re-wired graph topology, while its time cost remains that of the original GNN. Algorithm 2 also requires $O(HN)$ space but doubles the GNN computation, since the model must run twice.`

Comments 4: On what basis have you selected the datasets?

Response 4: Thank you. We selected datasets across different scopes, including social, citation, and knowledge graphs, as well as homophily ratios from 0 to 1, to comprehensively evaluate the performance.

Comments 5: In the Discussion I expected to get an insight into why the proposed algorithm is much better for some datasets and only slightly better for others.

Response 5: Thank you. We have added this insight in the 2nd paragraph of the Discussion section: `Second, a limitation of NLE is the minor improvement on homophilous graphs, given the results in Table \ref{table:exp2}. This implies that nodes containing valuable non-local information are more distant than in heterophilous graphs. Future work on this limitation could incorporate clustering of homophilous nodes to determine better hyperparameters $K_{start}, K_{end}$.`

Comments 6: For which datasets/types of data would your approach be most beneficial?

Response 6: Thank you. As mentioned in the Ablation studies and Discussion sections, heterophilous graphs with lower degree benefit the most from NLE.

Comments 7: The other issue for me is the lack of information on the complexity or time it adds, i.e., if I add your ExM into my pipeline, how much time/computational cost does it add?

Response 7: Thank you. We have added a subsection to the Method section explaining the computational costs: `Algorithm 1 incurs an additional $O(HN)$ space cost over any GNN to store the re-wired graph topology, while its time cost remains that of the original GNN. Algorithm 2 also requires $O(HN)$ space but doubles the GNN computation, since the model must run twice.`

Comments 8: Why is it worth adding?

Response 8: Thank you. As shown in our experiments, with only $O(HN)$ additional storage, adding NLE improves performance on both homophilous and heterophilous graphs, and it is particularly promising for heterophilous graphs, where it achieves a new record accuracy.

Comments 9: Could you provide a longer discussion on how type or structure of graphs influences the gain from your approach?

Response 9: Thank you. As discussed in our revised Discussion section, the performance gain from our method is influenced by the homophily ratio, since in homophilous graphs the nodes containing valuable non-local information are more distant than in heterophilous graphs.

Comments 10: How does the number of classes influence the results?

Response 10: Thank you. The number of classes does not explicitly influence the results of NLE, since our method aims at better representation learning, while the number of classes affects the fit of the prediction head in a GNN.

Comments 11: How would the approach deal with dynamic changes in graphs?

Response 11: Thank you. Dynamic changes in graph topology would not affect our results, since NLE can also be plugged into any sequential or temporal GNN.

Comments 12: Is the implementation of your module publicly available?

Response 12: Thank you. Yes, it will be made publicly available.

Reviewer 3 Report

Comments and Suggestions for Authors

Please see the attachment.

Comments for author File: Comments.pdf

Author Response

Comments 1: The introduction is well structured, where the non-local exchange (NLE) is linked to the GNN and the re-wiring wrapper. Also, the references quoted are sufficient and up-to-date. However, in my opinion, a brief reference to homophilous and heterophilous datasets could be added.

Response 1: Thank you. We have added this at the end of the 2nd paragraph of the Introduction section: `\cite{zhu2020beyond} found that this potential for node classification in a graph is determined by the homophily ratio, a metric representing how likely an edge is to connect nodes with the same label; a graph with a high homophily ratio is considered homophilic, and the opposite heterophilic. Setting graph re-wiring aside, methods have been developed for homophilic and heterophilic graphs based on information aggregation and gating, respectively \cite{zheng2022graph}.`
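As a concrete illustration of the homophily ratio referred to above (a sketch under the common edge-homophily definition, not code from the paper), the ratio can be computed as the fraction of edges whose two endpoints share the same label:

```python
def edge_homophily(edges, labels):
    """Edge homophily ratio h: fraction of edges whose endpoints
    share the same label. h near 1 suggests a homophilic graph,
    h near 0 a heterophilic one. Illustrative sketch only.
    edges:  list of (u, v) node-index pairs
    labels: list where labels[i] is the class label of node i"""
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# Toy graph with 4 nodes and labels [0, 0, 1, 1]:
# edges (0,1) and (2,3) connect same-label nodes, (1,2) does not,
# so h = 2/3.
h = edge_homophily([(0, 1), (1, 2), (2, 3)], [0, 0, 1, 1])
```

A benchmark suite spanning h from 0 to 1, as the response describes, would thus cover both heterophilic and homophilic regimes.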

Comments 2: In section 2, in page 5, there is Table 1 on top. Then, in line 158 in the same page, it says that the content of Table 1 is related to homophilic graphs. However, in line 222 in page 7, it says that many of the graphs stated in Table 1 are heterophilic. Hence, please clarify if the graphs shown in Table 1 are either homophilic or heterophilic.

Response 2: Thank you. In line 158 on the same page, we described these datasets as the least homophilic in our work: `… the five least homophilic graphs (i.e., bottom five smallest h) …`. We intended to say that they are not homophilic but heterophilic. To avoid confusing readers, we have revised this part as `… the five heterophilic graphs …`.

Comments 3: The experimental part is well described: using the express messenger wrapper to re-wire graph topologies through non-local information exchange outperforms other graph re-wiring methods, while bringing some GNN backbones to performance comparable to other state-of-the-art methods on various homophilous and heterophilous datasets.

Response 3: Thank you for your kind words.

Comments 4: The use of English language is fit for a research journal, as just a pair of small grammar mistakes have been found.

Response 4: Thank you. We have corrected some grammar mistakes, e.g., heterophilic -> heterophilous.
