Neural network theory

Talks:

Neural networks and radial basis functions (talk at the Bialowieza summer school)

Can neural network theory and statistical learning theory be formulated in terms of continuous complexity theory? (talk in Dagstuhl, Germany)

Papers:

Ensemble machine methods for DNA binding (with Y. Fan and C. DeLisi), Machine Learning and Applications 7, M. Wani et al., eds., IEEE, Washington (2008), 709-716. Algorithm available here.

On some integrated approaches to inference (with L. Plaskota), technical report (2005).

Complexity of predictive neural networks (with L. Plaskota), Proceedings of the International Conference on Complexity, Y. Bar-Yam, ed., Cambridge, MA (2003).

Complexity of neural network approximation with limited information: a worst-case approach (with L. Plaskota), J. Complexity 17 (2001), 345-365.

Complexity of regularization RBF networks (with L. Plaskota), in Proceedings of the International Joint Conference on Neural Networks, INNS, Washington (2001), 342-346.

Neural networks, radial basis functions, and complexity (with L. Plaskota), Proceedings of the Bialowieza Conference on Statistical Physics, 1997, 122-145.