Tasks: Text Classification
Modalities: Text
Formats: text
Languages: English
Size: 10K - 100K
License:
{"forum": "S1zyEXYI8B", "submission_url": "https://openreview.net/forum?id=S1zyEXYI8B", "submission_content": {"keywords": ["spiking neural networks", "Spike-time dependent plasticity", "network simulations"], "authors": ["Anonymous"], "title": "Towards learning principles of the brain and spiking neural networks", "abstract": "The brain, the only system with general intelligence, is a network of spiking neurons (i.e., spiking neural networks, SNNs), and several neuromorphic chips have been developed to implement SNNs to build power-efficient learning systems. Naturally, both neuroscience and machine learning (ML) scientists are attracted to SNNs\u2019 operating principles. Based on biologically plausible network simulations, we propose that spatially nonspecific top-down inputs, projected into lower-order areas from high-order areas, can enhance the brain\u2019s learning process. Our study raises the possibility that training SNNs need novel mechanisms that do not exist in conventional artificial neural networks (ANNs) including deep neural networks (DNNs). ", "authorids": ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper26/Authors"], "pdf": "/pdf/503bac69556c01c6da3b715c28ff69f886cab60b.pdf", "paperhash": "anonymous|towards_learning_principles_of_the_brain_and_spiking_neural_networks"}, "submission_cdate": 1568211751405, "submission_tcdate": 1568211751405, "submission_tmdate": 1570097889303, "submission_ddate": null, "review_id": ["HyxQyqD9DS", "B1x7dIA9PB", "HyetgNJedr"], "review_url": ["https://openreview.net/forum?id=S1zyEXYI8B¬eId=HyxQyqD9DS", "https://openreview.net/forum?id=S1zyEXYI8B¬eId=B1x7dIA9PB", "https://openreview.net/forum?id=S1zyEXYI8B¬eId=HyetgNJedr"], "review_cdate": [1569515994761, 1569543786652, 1569874929431], "review_tcdate": [1569515994761, 1569543786652, 1569874929431], "review_tmdate": [1570047546422, 1570047539499, 1570047531042], "review_readers": [["everyone"], ["everyone"], ["everyone"]], "review_writers": [["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper26/AnonReviewer2"], ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper26/AnonReviewer3"], ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper26/AnonReviewer1"]], "review_reply_count": [{"replyCount": 0}, {"replyCount": 0}, {"replyCount": 0}], "review_replyto": ["S1zyEXYI8B", "S1zyEXYI8B", "S1zyEXYI8B"], "review_content": [{"title": "Nice basic idea but limited exploration of the model", "importance": "2: Marginally important", "importance_comment": "This is a clever basic idea. As far as I can tell the basic underlying mechanism is similar to the single-cell competitive STDP process proposed by Song, Miller and Abbott (2000), but at a circuit level, and loosely mapped on to cortical hierarchy. This aspect I believe is novel and interesting.", "rigor_comment": "Overall the results were pretty minimal and could have been expanded to strengthen the authors' case. The results are something like \"proof by example\". It would be more convincing to me if the authors could explore the robustness and generality of this mechanism. How does it depend on parameter choices? What regimes will it have strongest effect and when will it break? What is the role of the separate L2/3 and L4 networks? Although they mimic the cortical anatomy, what computational functions do they perform here?\n\nThe methods section could be more elaborate to aid reproducibility. For example the STDP model is not described at all, and it is known that the implementation details can affect competition (additive vs multiplicative weight changes being one example). 
Also STDP simulations are notoriously sensitive to parameter choices.", "clarity_comment": "Overall it is well written and the figures are clear.\n\nHowever neither of the two conclusions the authors make are clear to me: \"First, the divergence of synaptic connection\u2019s strengths grows bigger with nonspecific feedback inputs, which can increase SNNs\u2019 learning capability. Second, more synapses can be trained in parallel, which can shorten the training times of SNNs.\" Elaboration or", "clarity": "4: Well-written", "evaluation": "3: Good", "intersection_comment": "The study is straight-ahead computational neuroscience. Although the text mentions potential applications to machine learning, neuromorphic computing, and deep learning, ML-style models are not implemented. It may be that mapping these mechanisms onto ANNs is not straightforward. The two conclusions mentioned ", "intersection": "2: Low", "comment": "As mentioned in my response to the technical rigor section, this study could be greatly improved by further simulations exploring parameter sensitivity, STDP rule choices, and network architecture choices. The basic idea has merit but needs to be better explored.", "technical_rigor": "2: Marginally convincing", "category": "Common question to both AI & Neuro"}, {"title": "Nice preliminary idea but with limited investigation/analysis of the model and results ", "importance": "3: Important", "importance_comment": "The paper provides circuitry level simulation setup of STDP process for hierarchical network structure as observed in brain which is an interesting idea and could go long way with systematic exploration. ", "rigor_comment": "The result presented is nice but could have been expanded with more analysis driven by different variations to the model parameters/input regimes/network structure to better understand the mechanisms in actions. As of now the results presented seem incomplete to fully support the authors' conclusions. ", "clarity_comment": "Overall the paper is well written and easy to follow", "clarity": "4: Well-written", "evaluation": "3: Good", "intersection_comment": "The concepts discussed in the paper seem to have much stronger association with computational neuroscience rather than ML. Although author does mention potential applications to machine learning but it is unclear how the mechanism presented in the paper could be implemented into artificial neural nets. ", "intersection": "2: Low", "comment": "The ideas and results presented in the paper are novel but as of now there doesn't seem to be enough analysis to fully support the author's claims. As mentioned in 'technical rigor' section, the paper could be greatly improved with further exploration of the model.", "technical_rigor": "2: Marginally convincing", "category": "Common question to both AI & Neuro"}, {"category": "Not applicable", "title": "Interesting observation with limitations", "importance": "2: Marginally important", "comment": "I believe this idea needs a more rigorous evaluation and better motivation. A simple search for \"stdp with top down feedback\" or similar turns up a multitude of similar models; the authors should clarify what's their contribution.", "evaluation": "2: Poor", "intersection": "3: Medium", "rigor_comment": "The initial experiments provided hint at possible roles of top-down feedback in learning, however, more evidence is necessary to make significant conclusions. 
Neither theoretical nor intuitive justification is provided for what is observed.\nIn particular, the model considered is just one of many possible ones (there could be inhibitory feedback, too, for instance, and for a more realistic setting, the feedback connections should possibly be plastic, as well.)\nIt is not obvious to me how the results shown in fig. 3 come about. It seems like the blue curve is exactly the same in both cases, while the orange curve is just constant (zero) in the case without feedback. Since STDP normally leads to weight changes even without feedback, something must be unusual here.", "clarity": "4: Well-written", "intersection_comment": "The machine learning relevance of the proposed approach is not obvious.", "technical_rigor": "2: Marginally convincing", "clarity_comment": "The manuscript is easy to follow.", "importance_comment": "Albeit similar models have been explored previously, investigating the role of top-down feedback in learning could be important for understanding learning in biological neural circuitry."}], "comment_id": [], "comment_cdate": [], "comment_tcdate": [], "comment_tmdate": [], "comment_readers": [], "comment_writers": [], "comment_reply_content": [], "comment_content": [], "comment_replyto": [], "comment_url": [], "meta_review_cdate": null, "meta_review_tcdate": null, "meta_review_tmdate": null, "meta_review_ddate ": null, "meta_review_title": null, "meta_review_metareview": null, "meta_review_confidence": null, "meta_review_readers": null, "meta_review_writers": null, "meta_review_reply_count": null, "meta_review_url": null, "decision": "Reject"} |
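
Below is a minimal Python sketch (not part of the original record) of how one such example row could be parsed. The filename `example_record.json` is hypothetical; all field names used in the code are taken directly from the record shown above.

```python
import json

# Minimal parsing sketch. Assumes the example record above is saved as a
# standalone JSON object in "example_record.json" (hypothetical filename);
# every field name used here appears in the record itself.
with open("example_record.json", encoding="utf-8") as f:
    record = json.load(f)

print("Title:   ", record["submission_content"]["title"])
print("Decision:", record["decision"])

# Ratings are stored as strings such as "2: Marginally convincing", so the
# leading integer is split off for numeric use.
for url, review in zip(record["review_url"], record["review_content"]):
    rigor = int(review["technical_rigor"].split(":")[0])
    clarity = int(review["clarity"].split(":")[0])
    print(f'{review["title"]!r} (rigor={rigor}, clarity={clarity}) -> {url}')
```

Since the key order inside each review dict can differ (as it does for the third review above), fields are accessed by name rather than by position.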