A research team at Stanford’s Wu Tsai Neurosciences Institute has developed an AI model, a topographic deep artificial neural network (TDANN), that mimics how the brain spatially organizes sensory information. The model successfully predicts both the sensory responses and the spatial organization of multiple regions of the human visual system. The team’s findings, published in the journal Neuron, offer new insight into the brain’s organization and carry implications for both neuroscience and artificial intelligence: the TDANN can serve as a model for studying how the visual cortex operates and develops, potentially leading to improved treatments for neurological disorders and to more sophisticated AI visual processing systems.