webmaster: Sven F. Crone
Centre for Forecasting Lancaster University Management School Lancaster LA1 4YF United Kingdom Tel +44.1524.592991 Fax +44.1524.844885 eMail sven dot crone (at) neural-forecasting dot com | | Who are the researchers currently influencing research and application in neural forecasting? We are gathering information on individual researcher profiles. As we have not yet finalised a scientometric analysis of cross-citations (evaluating the impact of a researcher's publications by the number of publications citing them), we estimate a researcher's importance in, or impact on, the field of neural networks for forecasting by the number of dedicated publications in the field. Neural Forecasting Who-is-Who? | G. Peter Zhang | Publications | G. Peter Zhang is currently Associate Professor of Decision Sciences and Operations Management at Georgia State University. He received his B.S. and M.S. degrees in Mathematics and Statistics, respectively, from East China Normal University, Shanghai, China, and his Ph.D. in Operations Research/Operations Management from Kent State University, USA. His research interests include neural networks, time series forecasting, supply chain management, and statistical quality control. His work has appeared in several reputable journals. He is on the editorial review board of the Production and Operations Management journal. His profile has been published in Who's Who in the 21st Century and Who's Who in the World. http://www.gsu.edu/~dscgpz/index.html | 40 |
| Sven F. Crone | Publications | Sven Crone is currently a Research Associate in Management Science at Lancaster University, UK. Having formerly lectured and researched at the University of Hamburg (Germany), George Mason University (USA) and Stellenbosch University (South Africa), he received his B.S. and MBA (German equivalent diplomas) in Management, Business Administration & Economics from the University of Hamburg. His PhD at the University of Hamburg focuses on the use of artificial neural networks for forecasting and inventory management in retail companies. Having worked in IT and management consultancy, he has supervised various projects in demand planning for procurement and inventory management in supply chains (SAP APO-DP certified), process analysis & redesign in retail and wholesale warehouse logistics (ARIS certified), standard software selection, and predictive data mining in Germany, the Netherlands, Hungary and the UK. These areas also dominate his current research agenda, where he focuses on the development and application of novel methods to reduce supply chain inventories through increased forecasting accuracy. He is also webmaster for this portal. http://www.sven-crone.com | 10 |
| YOU! | Publications | Your information could be provided here! Please send us your details! http://www.you.com | ? |
Please send in your info in order to be evaluated & included ... | Scott Armstrong | Prof. Armstrong is the editor of the influential book Principles of Forecasting. He hosts a webpage with a wealth of information and publications for researchers and practitioners. | http://www.forecastingprinciples.com |
| Dave Reilly | Dave is president of AFS and the "brains" behind the Autobox and Freefore software, possibly the best software for autoregressive modelling on the market, including level-shift and structural-break detection. If we take NN forward in forecasting, this is the software to challenge! | http://www.autobox.com/ |
Please send in your info in order to be evaluated & included ... Neural Networks Who-is-Who? - a Hall of Fame! | Ralph H. Abraham has been Professor of Mathematics at the University of California at Santa Cruz since 1968. He received the Ph.D. in Mathematics at the University of Michigan in 1960, and taught at Berkeley, Columbia, and Princeton before moving to Santa Cruz. He has held visiting positions in Amsterdam, Paris, Warwick, Barcelona, Basel, and Florence, and is the author of more than 20 texts & books. | Contact: WebSite |
Prof. William W. Armstrong | His research concerns adaptive logic networks (ALNs) and their applications. An ALN can fit a function to given data points by a process of training. For example, it can learn to predict a future value of some variable based on past values of that variable and other variables related to it. | Contact: WebSite |
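The task described here — learning to predict a future value of a variable from its lagged past values — can be sketched in a few lines. The sketch below is only an illustration of lag-based prediction, using an ordinary least-squares autoregression as a stand-in for an ALN's piecewise-linear fit; all names and parameter values are illustrative, not from Armstrong's work:

```python
import numpy as np

def make_lagged(series, n_lags):
    # Build (X, y): each row of X holds n_lags past values, y the value that follows.
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

# Toy series: a noisy linear trend from 0 to 99.
rng = np.random.default_rng(0)
series = np.arange(100, dtype=float) + rng.normal(0.0, 0.1, 100)

X, y = make_lagged(series, n_lags=3)
A = np.hstack([X, np.ones((len(X), 1))])      # append a bias column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # fit y ~ A @ coef by least squares

# One-step-ahead forecast from the last three observed values.
forecast = np.append(series[-3:], 1.0) @ coef
```

An ALN would replace the single linear fit with a tree of linear pieces combined by min/max operations, but the lagged-window setup of the training data is the same.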
| His work has concentrated on developing new nonlinear models and algorithms in the field broadly known as data mining. The approaches he uses draw on a range of methods, such as classical linear systems theory, statistical techniques, signal processing, control theory, hybrid systems, and nonlinear models including feedforward and recurrent neural networks. | Contact: WebSite |
| Sami Bengio holds a PhD in Computer Science from the University of Montreal, Canada. He is currently working at IDIAP (Dalle Molle Institute for Perceptual Artificial Intelligence) in Martigny, Valais. His research covers various aspects of statistical machine learning (neural networks, support vector machines, hidden Markov models, mixture models, AdaBoost and bagging, etc.). He is one of the developers of Torch, a well-known machine learning library. | Contact: WebSite Email |
| Christopher M. Bishop is the author of the well-known book "Neural Networks for Pattern Recognition" (1995, currently in its 9th printing). He graduated from Oxford in 1980 and received his PhD in Theoretical Physics in 1983. He then turned his interests to pattern recognition and neural networks, fields in which he has published numerous papers and books. He is currently working at the Microsoft Research Laboratory in Cambridge, UK. | Contact: WebSite |
| Rodney A. Brooks is Director of the MIT Computer Science and Artificial Intelligence Laboratory and the Fujitsu Professor of Computer Science. He is also Chairman and Chief Technical Officer of iRobot Corp. | Contact: WebSite |
| Prof. Walter J. Freeman, currently at the University of California, Berkeley, studied physics and mathematics at M.I.T., philosophy at the University of Chicago, medicine at Yale University (M.D. cum laude 1954) and Johns Hopkins, and neurophysiology at UCLA with support from the Foundations' Fund for Research in Psychiatry. He has taught brain science at the University of California at Berkeley since 1959, where he is a Professor of the Graduate School. He received a Guggenheim Fellowship, the A.E. Bennett Award from the Society for Biological Psychiatry, the Pioneer Award from the Neural Networks Council of the IEEE, and a MERIT award from the National Institute of Mental Health. He is the author of more than 350 articles and books. | Contact: WebSite |
| Professor Gasteiger, Computer-Chemie-Centrum, University of Erlangen-Nürnberg. Research area: Artificial Intelligence in Computational Chemistry. | Contact: WebSite |
| As an Associate Professor at NSF-Idaho, Dr. Kantabutra is working on new algorithms to achieve faster training convergence. | Contact: WebSite |
| Professor Nikola K. Kasabov holds a Personal Chair in the Department of Information Science, University of Otago, Dunedin, New Zealand. He received his MSc degree in Computer Science and his PhD degree in Mathematical Sciences from the Technical University in Sofia, Bulgaria. Kasabov has published over 250 works, among them over 50 journal papers, 90 conference papers, 15 book chapters, 5 text books, 3 edited research books, 5 edited conference proceedings, 21 patents and authorship certificates in the area of intelligent systems, connectionist and hybrid connectionist systems, fuzzy systems, expert systems, speech recognition, and data analysis. | Contact: WebSite |
| Dr. Kecman is the author of a few dozen journal and conference papers and 13 monographs, books, or other bound publications. His research areas include learning from data sets (support vector machines, neural networks, and fuzzy logic systems), kernel machines, pattern recognition, multivariate function approximation, and knowledge modelling and acquisition. | Contact: WebSite |
| Teuvo Kohonen's research areas are the theory of self-organization, associative memories, neural networks, and pattern recognition, in which he has published over 300 research papers and four monographs. Since the 1960s, Professor Kohonen has introduced several new concepts to neural computing, among them fundamental theories of distributed associative memory and optimal associative mappings, the learning subspace method, self-organizing feature maps (SOMs), and learning vector quantization (LVQ). | Contact: WebSite |
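As a hedged illustration of the SOM concept mentioned above — a minimal one-dimensional map, not Kohonen's reference implementation — the core idea fits in a few lines; all parameter values here are illustrative:

```python
import numpy as np

def train_som(data, n_units=10, epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal 1-D self-organizing map: for each input, the closest unit
    (the "winner") and its grid neighbours are pulled toward the input."""
    rng = np.random.default_rng(seed)
    weights = rng.uniform(data.min(), data.max(), (n_units, data.shape[1]))
    grid = np.arange(n_units)
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = 1.0 - step / n_steps
            lr = lr0 * frac                  # decaying learning rate
            sigma = max(sigma0 * frac, 0.5)  # shrinking neighbourhood width
            winner = np.argmin(np.linalg.norm(weights - x, axis=1))
            h = np.exp(-((grid - winner) ** 2) / (2.0 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
            step += 1
    return weights

# Two tight 2-D clusters; after training, the map's units should cover both.
rng = np.random.default_rng(42)
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)), rng.normal(5.0, 0.1, (50, 2))])
som = train_som(data)
```

The neighbourhood function `h` is what distinguishes a SOM from plain competitive learning: because neighbours on the grid move together, nearby units end up representing nearby regions of the input space.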
| Dr. Kruse is the Chairman of the working group "Foundations of Fuzzy Systems and Soft Computing" of the German | Contact: website |
| Paolo is an Italian IT specialist with over 20 years of experience in the field. He currently works at a large Italian bank in Rome, responsible for J2EE architecture and system integration. His interest in neural networks began in 1989, when he started to study the possibility of using neural networks outside the academic world, making them suitable for the industrial market as well. Feeling the need for a different approach, he has focused over the last five years on developing a neural network framework based on the latest available technologies, with particular emphasis on portability, scalability, extensibility, and modularity. The result of that work is Joone, an Open Source neural network framework written in Java. | Contact: website |
| Marvin Minsky has made many contributions to AI, cognitive psychology, mathematics, computational linguistics, robotics, and optics. He received his BA and PhD in mathematics from Harvard and Princeton, respectively. In 1951 he built the SNARC, the first neural network simulator. His other inventions include mechanical hands and other robotic devices, the confocal scanning microscope, the "Muse" synthesizer for musical variations (with E. Fredkin), and the first LOGO "turtle" (with S. Papert). | Contact: WebSite |
| Nikolay Nikolaev lectures in Computer Science at the Department of Computing, Goldsmiths College, University of London. Research area: second-order backpropagation algorithms for re-training polynomial neural networks. | Contact: WebSite |
| Once in Star Trek, now Professor at School of Computer Sciences, Tel Aviv University Research area: Artificial Life and Evolutionary Computation and Dynamics Of Neural Networks | Contact: WebSite |
| Prof. Siegelmann, Associate Professor at the University of Massachusetts, works on neural computation, adaptive information systems, machine learning and knowledge discovery, the theory of analog and adaptive systems, and bioinformatics. She has published in a variety of prestigious journals, including Science, Theoretical Computer Science, the Journal of Computer and Systems Science, IEEE Transactions on Information Theory, IEEE Transactions on Systems, Man and Cybernetics, and IEEE Transactions on Neural Networks. | Contact: WebSite | Info: Analog Computer |
| He has been a senior lecturer in the Department of Computer Science at Manchester University since 1989, where he teaches and carries out research on neural networks and genetic algorithms. | Contact: WebSite |
| He was born in Shizuoka Prefecture Japan, on March 31, 1952. He graduated from the University of Electro-Communications, Tokyo. He received the Dr Eng. degree from Osaka University. He was a faculty member of the Department of Computer Science and Engineering, Toyohashi University of Technology from 1980 to 1986. Since 1986, he has been with the University of Electro-Communications, where he is currently Professor of Communications and Systems Engineering. He was previously engaged in the field of nonlinear network theory, queueing theory and performance evaluation of communication systems. | Contact: WebSite |
| Dr. Yoshiyasu Takefuji has been a tenured professor at Keio University since April 1992. His research interests focus on neural computing and hyperspectral computing. He received the National Science Foundation Research Initiation Award in 1989 and a distinguished service award from IEEE Transactions on Neural Networks in 1992, and has been an NSF advisory panelist. | Contact: WebSite |
Prof. John G Taylor | John G. Taylor has been involved in Neural Networks since 1969, when he developed analysis of synaptic noise in neural transmission, which has more recently been turned into a neural chip (the pRAM) with on-chip learning. | Contact: WebSite |
| Dr. Thaler holds several patents in the area of Creativity Machines and self-learning neural networks. He has built self-learning systems for a number of large international corporations, such as Anheuser-Busch, Boeing and Gillette, and for U.S. government agencies such as the U.S. Air Force and the State of California. | Contact: WebSite |
| Prof. Verleysen is currently a senior research associate of the F.N.R.S. (Belgian National Fund for Scientific Research) and a lecturer at the Electrical Engineering Department (Machine Learning Group) of UCL. He has been an invited professor at the E.P.F.L. (Ecole Polytechnique Fédérale de Lausanne, Switzerland) in 1992, at the Université d'Evry Val d'Essonne (France) in 2001, and at the Université Paris I Panthéon-Sorbonne in 2002 and 2003. He is chairman of the annual European Symposium on Artificial Neural Networks and editor-in-chief of the Neural Processing Letters journal. | Contact: WebSite |
| Dr. David Waltz is Vice President, Computer Science Research, at the NEC Research Institute in Princeton, NJ, and an Adjunct Professor at Brandeis University. From 1984 to 1993, he was Director of Advanced Information Systems at Thinking Machines Corporation and Professor of Computer Science at Brandeis. | Contact: WebSite |
| Dr. Paul J. Werbos holds four degrees from Harvard University and the London School of Economics, covering economics, mathematical physics, decision and control, and the backpropagation algorithm. He has served as President of the International Neural Network Society, where he is still on the Governing Board. | Contact: WebSite |
Prof. Bayya Yegnanarayana | The focus of his research is to address issues in the development of speech systems for Indian languages, with particular reference to speech-to-text and text-to-speech systems for Hindi. He develops algorithms for feature extraction and classification using signal processing methods and neural network models, and has proposed new algorithms based on processing the Fourier transform phase function (or group delay function). | Contact: WebSite |