Viewing Issue Advanced Details
ID:               0000108
Category:         [ALGLIB] Data analysis
Severity:         minor
Reproducibility:  always
Date Submitted:   2009-08-22 20:20
Last Update:      2009-08-22 20:21
Reporter:         SergeyB
View Status:      public
Assigned To:      SergeyB
Priority:         normal
Status:           resolved
Resolution:       fixed
Projection:       none
ETA:              none
Fixed in Version: 2.1.0
Summary:          0000108: FIXED: overflows in neural networks under some compilers
Description:
Neural networks in 2.0.1 use tanh as the activation function. Some compilers (Delphi 7, for example) ship buggy standard libraries that cannot compute TanH of large arguments because of an intermediate overflow (even though the final result is "1"). So if a neural network is fed inputs that are too large, it will crash during training. Strictly speaking, this is a Delphi bug, but we'll fix it anyway :) The problem is fixed by using special code for inputs that are larger than 100 (see the sketch after the issue fields below).
Steps To Reproduce:
Additional Information:
Programming language: Unspecified
Attached Files:
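The fix described above can be illustrated with a short C++ sketch. This is not the actual ALGLIB code, and safe_tanh is a hypothetical helper name; it only shows the idea of saturating tanh for very large arguments so that a buggy runtime never evaluates exp() on them:

    #include <cmath>

    /* Minimal sketch of the guard described in the issue (hypothetical helper,
       not the actual ALGLIB implementation). For |x| > 100, tanh(x) equals
       +1 or -1 to within double precision, so the saturated value is returned
       directly instead of letting a buggy standard library overflow inside exp(). */
    static double safe_tanh(double x)
    {
        if( x > 100.0 )
            return 1.0;
        if( x < -100.0 )
            return -1.0;
        return std::tanh(x); /* safe range for the ordinary library tanh */
    }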
There are no notes attached to this issue.