8 results ·
● Live web index
medium.com
article
https://medium.com/@roopal.tatiwar20/neuroevolution-evolving-neural-network-w…
# Neuroevolution: Evolving Neural Networks with Genetic Algorithms. Neuroevolution is a subfield of artificial intelligence (AI) and machine learning that combines evolutionary algorithms (such as genetic algorithms) with neural networks. The core idea is to evolve neural network architectures and/or their weights to solve problems or perform specific tasks. By marrying principles of biological evolution with computational models, neuroevolution introduces a paradigm shift in how neural networks learn, adapt, and solve complex problems. In practice, neuroevolution creates a population of neural networks, evaluates their performance on a given task, selects the best-performing networks to serve as parents, and applies genetic operations (crossover and mutation) to produce a new generation of networks. Using Genetic Algorithms to Optimize Artificial Neural Networks.
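The population → evaluate → select → crossover/mutate loop described in this snippet can be sketched in a few lines. This is a minimal toy, not code from the article: it assumes a fixed 2-4-1 network, XOR as the task, and illustrative choices for population size, elite count, and mutation scale.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])  # XOR targets

def forward(w, x):
    """Fixed 2-4-1 network; w packs W1 (8), b1 (4), W2 (4), b2 (1)."""
    W1 = w[:8].reshape(2, 4)
    b1 = w[8:12]
    W2 = w[12:16]
    b2 = w[16]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)  # negative MSE: higher is better

pop = rng.normal(size=(50, 17))                        # 1. random population
history = []
for gen in range(200):
    scores = np.array([fitness(ind) for ind in pop])   # 2. evaluate
    elite = pop[np.argsort(scores)[-10:]]              # 3. select parents
    history.append(scores.max())
    picks = rng.integers(0, 10, size=(40, 2))          # pick parent pairs
    mask = rng.random((40, 17)) < 0.5                  # 4. uniform crossover
    children = np.where(mask, elite[picks[:, 0]], elite[picks[:, 1]])
    children = children + rng.normal(scale=0.1, size=children.shape)  # mutation
    pop = np.vstack([elite, children])                 # elitism: keep the best

best = max(pop, key=fitness)
print(f"best fitness (negative MSE): {fitness(best):.4f}")
```

Because the elites are carried over unchanged, the best fitness never decreases from one generation to the next.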
ieeexplore.ieee.org
article
http://ieeexplore.ieee.org/document/1007742/
Optimization with neural networks trained by evolutionary algorithms | IEEE Conference Publication | IEEE Xplore. # Optimization with neural networks trained by evolutionary algorithms. Multilayer neural networks are trained to solve optimization problems. **Published in:** Proceedings of the 2002 International Joint Conference on Neural Networks. IJCNN'02 (Cat. No.02CH37290). **DOI:** 10.1109/IJCNN.2002.1007742. Tank and Hopfield [8] pioneered the use of neural networks to solve linear optimization problems, with Hopfield networks [3]. Kennedy and Chua [4] extended these results to non-linear problems. Romero [6] approached optimization problems with a multilayer neural network. Silva [17] coupled fuzzy logic with Hopfield networks to solve linear and non-linear optimization problems. Recently, Xia and Wang [18] proposed a globally convergent neural network for linear optimization problems.
ultralytics.com
article
https://www.ultralytics.com/glossary/evolutionary-algorithms
Evolutionary Algorithms (EAs) are a powerful family of optimization algorithms that emulate the biological principles of natural selection and genetics to solve complex computational problems. For example, the `tune()` method in the Ultralytics library uses a genetic algorithm to discover the best training hyperparameters for YOLO26 models on custom datasets: it mutates hyperparameters (learning rate, momentum, etc.) over several generations, automatically identifying the settings that yield the highest performance on your validation data.

```python
from ultralytics import YOLO

# Load the standard YOLO26 model
model = YOLO("yolo26n.pt")

# Run hyperparameter tuning using a genetic algorithm approach.
# The tuner evolves parameters (lr, momentum, etc.) over 30 generations.
model.tune(data="coco8.yaml", epochs=10, iterations=30, plots=False)
```
link.springer.com
article
https://link.springer.com/article/10.1007/s42979-024-02972-5
We propose an evolutionary framework called AutoLR, capable of evolving optimizers for specific tasks. We use the framework to evolve optimizers for a popular
spiedigitallibrary.org
article
https://www.spiedigitallibrary.org/conference-proceedings-of-spie/6228/62280Q…
ABSTRACT. This paper surveys the various approaches used to apply evolutionary algorithms to develop artificial neural networks that solve pattern
quora.com
article
https://www.quora.com/Is-there-any-overlap-between-evolutionary-algorithms-fo…
Generally, genetic algorithms tend to outperform neural networks in the optimization space and, knowing just what I know, I would say use the
nature.com
article
https://www.nature.com/articles/s41598-023-37540-z
# A novel neural network model with distributed evolutionary approach for big data classification. This study explores a distributed processing framework for a GA-evolved neural network classifier and the framework's effectiveness in big data classification problems. A distributed GA architecture is employed to train the neural networks, improving the model's effectiveness in handling large datasets. Table 2 depicts the comparison of accuracy values obtained for a GA-based ANN working in normal mode and the proposed distributed evolutionary neural network.
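The snippet does not show how the distributed GA is implemented, but a common way to distribute evolutionary training is to split the fitness evaluation across workers, each scoring a candidate on one data shard, and then aggregate the partial results. The sketch below is a hypothetical illustration of that idea (thread pool, linear model as a stand-in for the network, synthetic data; none of it is from the article):

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(10_000, 8))
y = (X.sum(axis=1) > 0).astype(int)        # synthetic labels

def shard_score(weights, shard):
    """Number of correct predictions on one data shard."""
    Xs, ys = shard
    preds = (Xs @ weights > 0).astype(int)  # linear stand-in for a network
    return int((preds == ys).sum())

def distributed_fitness(weights, shards, pool):
    """Score one candidate on every shard in parallel, then aggregate."""
    correct = pool.map(lambda s: shard_score(weights, s), shards)
    return sum(correct) / sum(len(s[1]) for s in shards)

shards = [(X[i::4], y[i::4]) for i in range(4)]   # split data across 4 workers
with ThreadPoolExecutor(max_workers=4) as pool:
    w = rng.normal(size=8)                        # one candidate "genome"
    fit = distributed_fitness(w, shards, pool)
print(f"distributed accuracy: {fit:.4f}")
```

Because the shards partition the data, the aggregated accuracy is identical to evaluating on the full dataset in one pass; the GA's selection step can then use these fitness values unchanged.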
datascience.stackexchange.com
article
https://datascience.stackexchange.com/questions/38321/why-arent-genetic-algor…
# Why aren't Genetic Algorithms used for optimizing neural networks? Training neural networks (especially deep ones) is hard and has many issues (non-convex cost functions with local minima, vanishing and exploding gradients, etc.). Training neural networks (NNs) with genetic algorithms (GAs) is not only feasible; there are some niche areas where the performance is good enough for them to be used frequently. Genetic algorithms and other global searches for optimal parameters are robust in ways that gradient-based algorithms are not. In practice, on a large multi-layer network, gradient methods are likely orders of magnitude faster than GA searches such as NEAT for finding network parameters. Still, some research uses backpropagation to train the networks while using genetic algorithms to find a good architecture; other work on network architecture search uses Bayesian optimization instead of genetic algorithms.
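The hybrid mentioned at the end of this answer, backpropagation for the weights and a GA for the architecture, can be sketched as follows. This is an illustrative toy, not any particular paper's method: the genome is just (hidden size, learning rate), the inner loop is plain gradient descent on a tiny regression task, and all sizes and ranges are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])                       # toy regression target

def train_and_score(hidden, lr, steps=300):
    """Fit a 1-hidden-layer net by plain gradient descent; return final MSE."""
    W1 = rng.normal(scale=0.5, size=(1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=hidden);      b2 = 0.0
    for _ in range(steps):
        h = np.tanh(X @ W1 + b1)
        pred = h @ W2 + b2
        err = pred - y                        # dMSE/dpred (factor 2 dropped)
        gW2 = h.T @ err / len(y); gb2 = err.mean()
        dh = np.outer(err, W2) * (1 - h ** 2)
        gW1 = X.T @ dh / len(y);  gb1 = dh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
    return float(np.mean((pred - y) ** 2))

# GA outer loop over (hidden_size, learning_rate) genomes: backprop fits
# the weights, evolution searches the architecture/hyperparameters.
pop = [(int(rng.integers(2, 16)), 10 ** rng.uniform(-2.5, -1)) for _ in range(8)]
for gen in range(3):
    survivors = sorted(pop, key=lambda g: train_and_score(*g))[:4]  # selection
    pop = survivors + [(max(2, h + int(rng.integers(-2, 3))),       # mutation
                        min(0.2, lr * 10 ** rng.uniform(-0.3, 0.3)))
                       for h, lr in survivors]
best = min(pop, key=lambda g: train_and_score(*g))
print(f"best (hidden, lr): {best}")
```

This also illustrates the speed trade-off the answer describes: every genome the GA evaluates costs a full gradient-descent run, which is why evolution is usually reserved for the small discrete search space (architecture) rather than the weights themselves.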