Rikveet/CPSO-NUMPY

Source code of Internship Project. MCPSO & DCPSO for training large scale neural networks.

Optimizing neural networks (NNs) with particle swarm optimization (PSO) or cooperative particle swarm optimization (CPSO) has been explored in many previous studies. This study introduces two new CPSO variants, merge CPSO (MCPSO) and decompose CPSO (DCPSO), which replace the base CPSO used to optimize decomposed neural networks. Applying CPSO to high-dimensional problems can introduce saturation, which directly degrades the resulting solution by moving particles arbitrarily. A previous study used random regrouping and factorization in NN decomposition to observe their effects on the performance of training neural networks with CPSO. This study observes the effects of using MCPSO and DCPSO with both factorized and non-factorized decompositions of the neural network. Experiments were performed on five data sets with dimensionality ranging from 35 to 827 to compare the performance of the optimization algorithms and decompositions. The two new algorithms, MCPSO and DCPSO, showed a slight improvement over the base CPSO.
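For context, the sketch below shows a minimal cooperative PSO in NumPy: the flattened weight vector is split into sub-vectors, each optimized by its own subswarm and evaluated through a shared context vector. This is a generic illustration of the base CPSO scheme rather than this repository's implementation; the function name `cpso_optimize`, the contiguous splitting, and the standard inertia/acceleration constants are all assumptions.

```python
import numpy as np

def cpso_optimize(loss, dim, n_subswarms=4, n_particles=10, iters=100,
                  w=0.729, c1=1.49445, c2=1.49445):
    """Minimal cooperative PSO sketch (not the repository's code):
    the search space is split into contiguous sub-vectors, each handled
    by its own subswarm and evaluated in a shared context vector."""
    groups = np.array_split(np.arange(dim), n_subswarms)
    context = np.random.uniform(-1, 1, dim)   # shared context vector
    best_loss = loss(context)

    # per-subswarm state: positions, velocities, personal bests
    pos = [np.random.uniform(-1, 1, (n_particles, len(g))) for g in groups]
    vel = [np.zeros((n_particles, len(g))) for g in groups]
    pbest = [p.copy() for p in pos]
    pbest_f = [np.full(n_particles, np.inf) for _ in groups]

    for _ in range(iters):
        for k, g in enumerate(groups):
            for i in range(n_particles):
                # evaluate particle i of subswarm k within the shared context
                trial = context.copy()
                trial[g] = pos[k][i]
                f = loss(trial)
                if f < pbest_f[k][i]:
                    pbest_f[k][i], pbest[k][i] = f, pos[k][i].copy()
                if f < best_loss:
                    best_loss, context[g] = f, pos[k][i].copy()
            # standard PSO velocity/position update for subswarm k
            r1, r2 = np.random.rand(2, n_particles, len(g))
            vel[k] = (w * vel[k]
                      + c1 * r1 * (pbest[k] - pos[k])
                      + c2 * r2 * (context[g] - pos[k]))
            pos[k] = pos[k] + vel[k]
    return context, best_loss
```

A hypothetical usage example, training a tiny 4-3-1 network by minimizing mean squared error over its 19 flattened weights:

```python
X, y = np.random.randn(50, 4), np.random.randn(50, 1)

def mse_loss(wvec):
    # unpack the flat weight vector into layer weights and biases
    W1, b1 = wvec[:12].reshape(4, 3), wvec[12:15]
    W2, b2 = wvec[15:18].reshape(3, 1), wvec[18:19]
    h = np.tanh(X @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - y) ** 2))

best_w, best_f = cpso_optimize(mse_loss, dim=19, n_subswarms=4)
```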