Core Metrics
Total Citations
1,665
H-Index
15
Publications
58
i10-Index
19
2-Year Citedness
2.3
mean citations per work (2-year window)
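The h-index and i10-index above are both derived from a scholar's per-paper citation counts. A minimal sketch of those two computations, using the citation counts of the ten papers listed in this profile as illustrative input (the real indices are computed over all 58 works):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

# Citation counts of the ten papers shown in this profile (illustrative subset).
counts = [1018, 93, 88, 68, 35, 32, 32, 29, 25, 24]
print(h_index(counts))    # 10 — all ten listed papers have >= 10 citations
print(i10_index(counts))  # 10
```

Over the full 58-paper record these evaluate to h = 15 and i10 = 19, as shown above; the truncated top-10 list necessarily caps both at 10.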
Ability Dimensions
1,665 citations, h=15
2yr mean: 2.3
58 papers (3.9/year)
29 cites/paper
4 unique research topics
Top 3 papers: 72% of citations
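The citation-concentration figure above ("Top 3 papers: 72% of citations") follows directly from the per-paper counts. A hedged sketch, assuming the top-10 list in this profile and the reported total of 1,665 citations (`top_k_share` is a hypothetical helper, not part of any dashboard API):

```python
def top_k_share(citations, total, k=3):
    """Fraction of total citations contributed by the k most-cited papers."""
    top = sorted(citations, reverse=True)[:k]
    return sum(top) / total

# Citation counts of the ten papers shown in this profile.
counts = [1018, 93, 88, 68, 35, 32, 32, 29, 25, 24]
print(round(top_k_share(counts, total=1665), 2))  # 0.72
```

(1,018 + 93 + 88) / 1,665 ≈ 0.72, matching the 72% reported in the chart.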
* Percentile scores are calculated relative to all scholars in the computational neuroscience dataset; tags are assigned based on dimension combinations.
Scholar Profile Analysis
João Sacramento is an emerging scholar in computational neuroscience, currently an independent researcher.
Over a 15-year academic career, he has published 58 papers (averaging 3.9 per year) and accumulated 1,665 citations.
His academic impact accumulated gradually: the first five years account for only 1.3% of total citations, indicating that later works are more influential.
Primary research areas include Recall, Modular design, and Similarity (geometry).
Key Findings
Signature Work
"A deep learning framework for neuroscience", published in 2019, is his most influential work, with 1,018 citations.
Early Career Analysis (First 5 Years)
Career Start
2010 - 2014
Early Citations
21
Early Works
3
Early Impact %
1.3%
Top Early Career Paper
Tree-like hierarchical associative memory structures
Publication Timeline
Research Topics
Top Publications
A deep learning framework for neuroscience
1,018
Citations
Dendritic cortical microcircuits approximate the backpropagation algorithm
93
Citations
Transformers learn in-context by gradient descent
88
Citations
Dendritic cortical microcircuits approximate the backpropagation algorithm
68
Citations
Dendritic error backpropagation in deep cortical microcircuits
35
Citations
Sensory representation of an auditory cued tactile stimulus in the posterior parietal cortex of the mouse
32
Citations
A Theoretical Framework for Target Propagation
32
Citations
Energy Efficient Sparse Connectivity from Imbalanced Synaptic Plasticity Rules
29
Citations
Computational roles of plastic probabilistic synapses
25
Citations
Learning where to learn: Gradient sparsity in meta and continual learning
24
Citations