Talk:Neural Filtering

    In general, this is a well-written review article on neural filtering. One concern from this reviewer is that very few, if any, applications of neural filtering are mentioned in the article. The article could therefore be improved by summarizing some successful or potential applications.

    Reviewer A:

    I have read Dr. Lo's article on neural filtering with interest. The article discusses how a recurrent multilayer perceptron (RMLP) can be used to perform optimal filtering without assumptions such as the Markov property, linear dynamics, Gaussian distributions, additive noise, or an explicit mathematical model, which are generally required by the Kalman filter or the extended Kalman filter (EKF). The article includes many of the author's previously published contributions, presented here in simplified form.

    This is my first review of a Scholarpedia article, so I am not entirely sure what form such an article should take. Having looked at other Scholarpedia articles, I think an article should be useful to readers and should explain the content in detail. I therefore suggest the following:

    1) The article introduces two types of recurrent multilayer perceptron (RMLP) for optimal filtering, namely MLPs with interconnected nodes (MLPWINs) and MLPs with output feedbacks (MLPWOFs). This is the author's contribution, published in earlier papers, but for a Scholarpedia article I hope the author can add figures explaining the structure of the two types and define all the symbols appearing in the formulae of the sections "Neural filtering theorems", "Dynamical range reducers and extenders", and "Synthesizing neural filters". Readers who have not read the author's publications may otherwise not follow the ideas; for example, after Theorem 2, "Consider an MLPWOF with nodes in a single hidden layer, free output feedbacks," why is the number of free output feedbacks Ny(N-1)? And in MLPWINs or MLPWOFs, how are the feedback weights (for the interconnected nodes or the output feedbacks) determined?

    2) I hope the author can show how MLPWINs and MLPWOFs differ in performance when an RMLP is used as an optimal filter, and state the conditions under which each should be used, along with their advantages and disadvantages.

    3) It is very interesting that the author solves the difficult local-minimum problem in RMLP training by the convexification method, and I think many readers will be interested in it. Could the idea be given in more detail? It may be covered by the author's patent, but I hope it can be explained. In particular, please give the reason why adding an exponential term with a parameter to the cost function avoids local minima (a hedged sketch of such a criterion follows this list).

    4) A few spelling mistakes should be corrected if the author reads the article carefully.
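    As a rough illustration of the kind of exponential training criterion asked about in point 3, here is a minimal Python sketch. It assumes a risk-averting-style cost of the form C(lam) = (1/lam) * ln( mean_k exp(lam * e_k^2) ); the functional form, the parameter name lam, and the function risk_averting_cost are illustrative assumptions, not taken from the article under review.

    import numpy as np

    def risk_averting_cost(errors, lam):
        # Hypothetical exponential criterion (illustrative, not the article's exact method):
        #   C(lam) = (1/lam) * ln( mean_k exp(lam * e_k^2) )
        # As lam -> 0 this reduces to the ordinary mean squared error;
        # larger lam weights the largest errors more and more heavily.
        x = lam * np.square(errors)
        m = x.max()  # log-sum-exp shift for numerical stability
        return (m + np.log(np.mean(np.exp(x - m)))) / lam

    # Small usage example
    e = np.array([0.1, -0.3, 2.0, 0.05])
    print(np.mean(e ** 2))               # plain MSE
    print(risk_averting_cost(e, 1e-6))   # nearly identical to MSE for tiny lam
    print(risk_averting_cost(e, 5.0))    # dominated by the largest error term

    The intuition usually given for such criteria is that the exponential weighting acts as a soft maximum over the squared errors, and that increasing the parameter convexifies the cost over a growing region of weight space; whether this matches the article's convexification argument is for the author to confirm.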
