Mark A. Kon

Mark Kon is Professor of Mathematics and Statistics at Boston University, and is affiliated with the Department of Cognitive and Neural Systems.

He received a PhD in Mathematics from MIT, and holds Bachelor's degrees in Mathematics, Physics, and Psychology from Cornell University.  He has held appointments at Columbia University as Assistant and Associate Professor (Computer Science, Mathematics), at Tufts University as Assistant Professor, and at MIT as a graduate instructor.  He has served as departmental Director of Graduate Studies at Boston University.  He has approximately 70 publications in mathematics, statistics, computer science, neural network theory, and mathematical physics, including one book.  His current research and applications interests involve learning theory, statistics and probability, neural networks, complexity theory, optimization, and mathematical physics.  His recent work in learning theory has investigated the complexities of designs for learning machines and neural networks, which improve significantly on those of standard backpropagation architectures.  He is on the editorial board of Neural Networks, and has twice been on the organizing committee of the World Congress on Neural Networks.  He has had recent research grants and contracts from the American Fulbright Commission, the National Science Foundation, and the U.S. Air Force.  He has given approximately 100 lectures in 15 countries.  Among organizational roles, he has been a co-organizer of the MIT summer analysis seminars in Vermont, and the organizer of a mini-symposium on Computational Complexity Theory in Chamonix, France.

Research:  Mark Kon works in machine learning, mathematical neural network theory, complexity theory, statistical learning theory, wavelets, and mathematical physics.  His current research focuses on learning as a statistical phenomenon, in which an intelligent system learns to combine a priori information with current data to form a model of an input-output function to be learned.  This area connects naturally to complexity theory, neural network theory, and Bayesian inference, areas in which similar issues are prominent.  He and his co-workers focus on connections between these approaches and, more generally, on the formulation of an approach which unifies them.  One major goal of this project is to provide a normative index by which learning algorithms arising from the various approaches can be compared in a single setting.
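
As a rough illustration of the prior-plus-data viewpoint described above (a generic sketch, not a description of any particular algorithm from this research program), the following Python fragment performs a standard conjugate Bayesian update: a Gaussian prior over the weights of a linear input-output function is combined with noisy observations to give a posterior model.  All names, sizes, and parameter values are hypothetical choices made for the example.

import numpy as np

rng = np.random.default_rng(0)

# "True" input-output function to be learned (unknown to the learner).
true_w = np.array([1.5, -0.7])          # intercept and slope

def f(x):
    return true_w[0] + true_w[1] * x

# Current data: noisy observations of the function.
n, noise_std = 20, 0.3
x = rng.uniform(-1.0, 1.0, size=n)
y = f(x) + noise_std * rng.normal(size=n)
X = np.column_stack([np.ones(n), x])    # design matrix with columns [1, x]

# A priori information: zero-mean Gaussian prior over the weights.
prior_mean = np.zeros(2)
prior_cov = np.eye(2)

# Conjugate Bayesian update: prior precision and data precision combine additively.
beta = 1.0 / noise_std**2               # observation precision
post_cov = np.linalg.inv(np.linalg.inv(prior_cov) + beta * X.T @ X)
post_mean = post_cov @ (np.linalg.inv(prior_cov) @ prior_mean + beta * (X.T @ y))

print("posterior mean of weights:", post_mean)  # approaches true_w as data accumulates

With little data the posterior stays close to the prior; as observations accumulate, the data term dominates and the learned model concentrates around the true input-output function.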

Online Publications

Abstracts (in progress):  


