This paper proposes an optimization learning algorithm for the hetero-associative memory model. First, the criterion that reflects the performance of the neural network is transformed into an easily controlled cost function, so that determining the weights naturally becomes a global optimization problem; the optimization is carried out by gradient descent. This learning algorithm guarantees that every training pattern becomes a stable attractor of the system, with the largest basin of attraction in the optimization sense. We analyze, in theory, the storage capacity of the hetero-associative memory model, the asymptotic stability of the training patterns, and the extent of the basins of attraction. Computer simulation results fully demonstrate the effectiveness of the algorithm.
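The idea above can be illustrated with a minimal sketch. The abstract does not give the exact cost function, so the code below assumes a common margin-based choice: for bipolar pattern pairs, penalize any output unit whose aligned local field falls below a margin `kappa`, and minimize that cost by gradient descent. Driving every field above the margin makes each training pair a fixed point of the one-step recall map, and a larger margin heuristically widens the basins of attraction. All names (`X`, `Y`, `kappa`, `eta`) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: P bipolar pattern pairs (x, y) to be hetero-associated.
P, n_in, n_out = 5, 32, 16
X = rng.choice([-1.0, 1.0], size=(P, n_in))
Y = rng.choice([-1.0, 1.0], size=(P, n_out))

W = np.zeros((n_out, n_in))
kappa = 1.0   # stability margin; larger margins widen basins of attraction
eta = 0.05    # gradient-descent learning rate

# Assumed cost: C = sum over patterns mu and units i of
#   max(0, kappa - y_i^mu * (W x^mu)_i / sqrt(n_in)).
# Gradient descent on C raises every aligned field above the margin,
# making each training pair a stable fixed point of sign(W x).
for epoch in range(500):
    fields = (Y * (X @ W.T)) / np.sqrt(n_in)   # aligned local fields, (P, n_out)
    violated = fields < kappa                  # (pattern, unit) entries below margin
    if not violated.any():
        break
    # dC/dW_ij = -y_i^mu x_j^mu / sqrt(n_in) for each violated entry
    grad = -(violated * Y).T @ X / np.sqrt(n_in)
    W -= eta * grad

# Check: every stored pair is recalled exactly by one-step recall.
recall = np.sign(X @ W.T)
print(np.array_equal(recall, Y))
```

With few patterns relative to the input dimension, the margin condition is satisfiable and the loop converges well before the epoch limit; the final check then confirms that every training pair is a fixed point of the recall map.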