CNEP for Artificial Intelligence

Since the 1940s, AI (artificial intelligence) has employed various artificial neural network (ANN) models, all built on an over-simplified neuron computing model. Every neuron in an ANN is functionally identical, has no internal structure, and performs a single scalar operation: a weighted summation of its synaptic inputs followed by a sigmoid function.
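As a point of comparison, here is a minimal sketch of that classic scalar neuron; the function name is illustrative, and the logistic sigmoid is simply the most common choice of squashing function:

```python
import math

def classic_neuron(inputs, weights, bias=0.0):
    """Classic ANN neuron: weighted sum of synaptic inputs through a sigmoid.

    Returns a single scalar -- the entire 'computation' of the neuron.
    """
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # logistic sigmoid

# Example: one neuron with three synaptic inputs
print(classic_neuron([0.5, -1.0, 2.0], [0.1, 0.4, 0.3]))
```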

CNEP provides a new basis, inspired by the biological prototype, for developing a new generation of single-neuron computing models. In these new models, each neuron has an internal structure characterized by a unique dendrogram and its associated patch configurations. The output of neuronal computing is no longer a scalar but a vector of computation results generated at a set of arithmetic units distributed over the dendrogram. Some of the results in this vector collectively determine the neuron's output to other neurons, while many others back-propagate within the dendrogram to drive patch reconfiguration and hence the plasticity of synaptic connections.
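CNEP's actual dendrogram and patch mechanics are not specified here, so the following is only a structural sketch: it assumes a tree of arithmetic units whose internal operation (a plain sum) and names (`Unit`, `evaluate`) are hypothetical, and it shows only the shape of the idea, namely a vector of per-unit results rather than one scalar:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Unit:
    """One arithmetic unit on the dendrogram (hypothetical structure).
    A unit with no children is a synapse carrying an input value."""
    children: List["Unit"] = field(default_factory=list)
    value: float = 0.0

def evaluate(unit: Unit, out: List[float]) -> float:
    """Post-order evaluation: every unit appends its result to the
    output vector instead of collapsing the neuron to one scalar."""
    if unit.children:
        # Hypothetical unit operation: a plain sum of child results
        unit.value = sum(evaluate(c, out) for c in unit.children)
    out.append(unit.value)
    return unit.value

# Toy dendrogram: three synapses feeding two arithmetic units
tree = Unit([Unit([Unit(value=0.2), Unit(value=0.5)]), Unit(value=-0.1)])
vector_output: List[float] = []
evaluate(tree, vector_output)
print(vector_output)  # [0.2, 0.5, 0.7, -0.1, 0.6] -- one entry per unit
```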

At Abinitio Laboratories we plug such neuronal computing models, derived from CNEP technology, into a first-principles framework for AI. An energy function, the system Hamiltonian of a single neuron, is constructed from the patch Hamiltonians, which are in turn determined by all ionic channels of the patches. Summing the single-neuron Hamiltonians yields the system Hamiltonian of a generic neural network. A least-action postulate is proposed as the foundation of all theoretical constructions; it leads to a system of Euler-Lagrange equations for the neural network under consideration, and a system of canonical equations can then be obtained via a Legendre transform.
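In generic notation (the symbols H_p, L, q_i, p_i below are standard stand-ins, not CNEP's own variables), the chain of constructions reads:

```latex
% Schematic only: H_p, L, q_i, p_i are generic stand-ins for CNEP's variables.
% Patch Hamiltonians compose a neuron; neuron Hamiltonians compose the network:
\[
  H_{\text{neuron}} = \sum_{p \,\in\, \text{patches}} H_p,
  \qquad
  H_{\text{network}} = \sum_{n \,\in\, \text{neurons}} H^{(n)}_{\text{neuron}}.
\]
% Least-action postulate: stationarity of the action yields Euler-Lagrange:
\[
  \delta \int_{t_0}^{t_1} L(q, \dot{q}, t)\,dt = 0
  \;\Longrightarrow\;
  \frac{d}{dt}\frac{\partial L}{\partial \dot{q}_i} - \frac{\partial L}{\partial q_i} = 0.
\]
% Legendre transform to the canonical equations:
\[
  p_i = \frac{\partial L}{\partial \dot{q}_i},
  \quad
  H = \sum_i p_i \dot{q}_i - L,
  \quad
  \dot{q}_i = \frac{\partial H}{\partial p_i},
  \quad
  \dot{p}_i = -\frac{\partial H}{\partial q_i}.
\]
```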

This completes the basic construction of a somatic neurodynamics, which one of the founders developed in the early 1990s. Somatic neurodynamics completely determines the evolution of neural activity (measured by all membrane potentials), neural connectivity (measured by the efficacies of all synaptic connections), and neural plasticity (measured by the plasticity of all synaptic efficacies) with respect to a somatic reference system.

In the framework of somatic neurodynamics, almost all neural computing tasks (e.g., classification, recognition, association, image processing, and natural language processing) can be mapped onto, and implemented as, boundary value problems for a set of Lagrange equations (the master equations). These problems can be further classified as forward problems (network analysis) or inverse problems (network design) with respect to the master equations.
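As a sketch in the same generic notation, with \theta denoting the network's design parameters (an assumed symbol, not the laboratory's), a task supplies the boundary data and the two problem classes differ only in what is solved for:

```latex
% Sketch: q(t) is the network trajectory, \theta the design parameters.
% Master equations with task-supplied boundary values:
\[
  \frac{d}{dt}\frac{\partial L_\theta}{\partial \dot{q}}
  - \frac{\partial L_\theta}{\partial q} = 0,
  \qquad
  q(t_0) = q_{\text{in}},
  \qquad
  q(t_1) = q_{\text{out}}.
\]
% Forward problem (network analysis): \theta given, solve for q(t).
% Inverse problem (network design): boundary pairs given, solve for \theta.
```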

In the framework of somatic neurodynamics, neural plasticity, learning, and memory can be formulated and implemented within a Hamilton-Jacobi dynamics, with the system's activity (membrane potentials) and connectivity (synaptic efficacies) acting as a pair of conjugate variables coupled by the canonical equations.
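Schematically, and again as a sketch rather than the laboratory's actual formulation, the Hamilton-Jacobi picture couples the two variables through the action S:

```latex
% Hamilton-Jacobi equation for the action S(q, t); the conjugate momentum
% p = \partial S / \partial q plays the role of connectivity:
\[
  \frac{\partial S}{\partial t}
  + H\!\left(q, \frac{\partial S}{\partial q}, t\right) = 0,
  \qquad
  p = \frac{\partial S}{\partial q}.
\]
% Activity q and connectivity p then evolve as a conjugate pair under the
% canonical equations:
\[
  \dot{q} = \frac{\partial H}{\partial p},
  \qquad
  \dot{p} = -\frac{\partial H}{\partial q}.
\]
```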

This framework of analytical neurodynamics, with its solid foundation in CNEP at the molecular and channel levels, represents a disruptive reconstruction of AI theory and practice. It is expected to overcome many of the challenges inherent in existing deep-learning-based AI regimes.