scitepress.org article

Evolutionary Techniques for Neural Network Optimization

https://www.scitepress.org/papers/2005/11918/11918.pdf

We use the genetic algorithm [2], [4] to search a space of possible neural network architectures. In [15], the main theoretical advantages of using L-systems to encode network topologies are set out, in contrast to "blueprint representations" in which the evolutionary algorithm specifies every single connection in the network. Figure 3 illustrates the best neural network representation found for the XOR problem with the genetic algorithm [21] (a) and its subsequent adaptation with backpropagation (b). In Proceedings of the International Joint Conference on Neural Networks and Genetic Algorithms, Innsbruck. Searching neural network structures with L-systems and genetic algorithms.
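The pipeline this snippet describes — genetic search over architectures followed by backpropagation adaptation, as in the XOR example of Figure 3 — can be sketched roughly as follows. The genome here (hidden-layer width plus learning rate), the population size, and the truncation-selection scheme are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_and_score(hidden, lr, steps=2000, seed=0):
    """Train a 2-hidden-1 sigmoid MLP on XOR with backprop; return final MSE."""
    r = np.random.default_rng(seed)
    W1 = r.normal(0.0, 1.0, (2, hidden)); b1 = np.zeros(hidden)
    W2 = r.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(steps):
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        err = out - y                        # dMSE/d(out), up to a constant
        d2 = err * out * (1.0 - out)
        d1 = (d2 @ W2.T) * h * (1.0 - h)
        W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
        W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)
    return float(np.mean((out - y) ** 2))

# Genome = (hidden units, learning rate); fitness = XOR error after training.
pop = [(int(rng.integers(1, 9)), float(rng.uniform(0.1, 2.0))) for _ in range(8)]
for gen in range(5):
    scores = [train_and_score(h, lr) for h, lr in pop]
    elite = [pop[i] for i in np.argsort(scores)[:4]]    # keep the best half
    mutants = [(max(1, h + int(rng.integers(-1, 2))),   # tweak the width
                float(np.clip(lr * rng.uniform(0.5, 1.5), 0.05, 3.0)))
               for h, lr in elite]                      # tweak the learning rate
    pop = elite + mutants

best = min(pop, key=lambda g: train_and_score(*g))
```

Evaluating a genome by actually training the candidate network is what the paper's GA-plus-backpropagation combination amounts to; the L-system encoding it advocates would replace the trivial width genome used here.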

link.springer.com article

How to Improve Neural Network Training Using Evolutionary ...

https://link.springer.com/article/10.1007/s42979-024-02972-5

by P Carvalho · 2024 · Cited by 5 — We propose an evolutionary framework called AutoLR, capable of evolving optimizers for specific tasks. We use the framework to evolve optimizers for a popular
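AutoLR's actual framework is not shown in this snippet; as a hedged illustration of the general idea of evolving an optimizer for a specific task, the toy below evolves just the two parameters of an exponential learning-rate schedule (the task, the schedule form, and all hyperparameters are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def final_loss(lr0, decay, steps=100):
    """Run gradient descent on f(w) = (w - 3)^2 with the schedule
    lr_t = lr0 * decay**t; return the final loss (the task being optimized for)."""
    w = 0.0
    for t in range(steps):
        grad = 2.0 * (w - 3.0)
        w -= lr0 * (decay ** t) * grad
    return (w - 3.0) ** 2

# Evolve the schedule parameters: keep the best half, mutate to refill.
pop = [(float(rng.uniform(0.001, 1.0)), float(rng.uniform(0.8, 1.0)))
       for _ in range(12)]
for gen in range(10):
    pop.sort(key=lambda g: final_loss(*g))
    elite = pop[:6]
    pop = elite + [(abs(lr0 + rng.normal(0.0, 0.05)),
                    float(np.clip(decay + rng.normal(0.0, 0.02), 0.5, 1.0)))
                   for lr0, decay in elite]

best = min(pop, key=lambda g: final_loss(*g))
```

The fitness of a candidate optimizer is the loss it reaches on the target task, which is the essential loop the abstract describes, even though AutoLR evolves far richer optimizer programs than two scalars.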

medium.com article

Neural Network Optimization (Part 2) — Evolutionary Strategies

https://medium.com/@mandarangchekar7/neural-network-optimization-part-2-evolu…

# Neural Network Optimization (Part 2) — Evolutionary Strategies. This post discusses Evolutionary Strategies (ES), evaluating their potential for training neural networks against the benchmark of traditional backpropagation. Central to this exploration was the adaptation of ES to fine-tune neural networks, with a particular focus on optimizing mutation rates for enhanced performance. The code block below illustrates key operations in an evolutionary strategy for neural network optimization: recombining genetic material from two parents to create offspring, mutating offspring by adding Gaussian noise, and assessing fitness by evaluating the negative loss on training data. The plot displays the trend in the average fitness of a neural network optimized using ES over a logarithmic scale of generations.
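The three operations the snippet enumerates — parent recombination, Gaussian mutation, and negative-loss fitness — can be sketched as below. The uniform-crossover rule, the noise scale `sigma`, and the linear toy model standing in for a network are assumptions for illustration, not the post's actual code:

```python
import numpy as np

rng = np.random.default_rng(42)

def recombine(p1, p2):
    """Uniform crossover: each gene is copied from one of the two parents."""
    mask = rng.random(p1.shape) < 0.5
    return np.where(mask, p1, p2)

def mutate(child, sigma=0.2):
    """Mutation: add Gaussian noise to every parameter."""
    return child + rng.normal(0.0, sigma, child.shape)

def fitness(weights, X, y):
    """Fitness as negative loss: negative MSE of a linear model y_hat = X @ w
    (a stand-in for a network's loss on training data)."""
    return -float(np.mean((X @ weights - y) ** 2))

# Toy regression data with target weights [2, -3].
X = rng.normal(size=(64, 2))
y = X @ np.array([2.0, -3.0])

# Keep the 10 fittest as parents; refill with recombined, mutated offspring.
pop = [rng.normal(size=2) for _ in range(20)]
for gen in range(100):
    pop.sort(key=lambda w: fitness(w, X, y), reverse=True)
    parents = pop[:10]
    offspring = [mutate(recombine(parents[rng.integers(10)],
                                  parents[rng.integers(10)]))
                 for _ in range(10)]
    pop = parents + offspring

best = max(pop, key=lambda w: fitness(w, X, y))
```

For a real network the flat `weights` vector would be the concatenation of all layer parameters, with `fitness` evaluating the full forward pass; the selection, recombination, and mutation loop stays the same.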

sciencedirect.com article

Evolutionary approach for composing a thoroughly optimized ensemble of regression neural networks - ScienceDirect

https://www.sciencedirect.com/science/article/pii/S1110866524001440

# Full length article: Evolutionary approach for composing a thoroughly optimized ensemble of regression neural networks. The paper presents the GeNNsem (**Ge**netic algorithm A**NN**s en**sem**ble) software framework for the simultaneous optimization of individual neural networks and the construction of their optimal ensemble. The framework employs a genetic algorithm to search for suitable architectures and hyperparameters of the individual neural networks, maximizing a weighted sum of accuracy and diversity in their predictions. The proposed approach outperformed other ensemble approaches and individual neural networks on all common regression modeling metrics. Real-world use-case experiments in the domain of hydro-informatics further demonstrated the main advantages of GeNNsem: it requires the fewest training sessions for individual models when optimizing an ensemble; networks in an ensemble are generally simple, thanks to the regularization provided by a trivial initial population and custom genetic operators; and execution times are reduced by two orders of magnitude through parallelization.
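A minimal sketch of a fitness that weights accuracy against prediction diversity, in the spirit the abstract describes. The exact formula, the weights `w_acc`/`w_div`, and the diversity measure (squared deviation from the ensemble-mean prediction) are assumptions, since the snippet does not give GeNNsem's actual definition:

```python
import numpy as np

def ensemble_fitness(preds, y, w_acc=0.7, w_div=0.3):
    """Weighted sum of accuracy and diversity for each ensemble member.
    preds: (n_members, n_samples) array of per-network predictions.
    Accuracy term: negative MSE against the targets y.
    Diversity term: mean squared deviation from the ensemble-mean prediction."""
    mean_pred = preds.mean(axis=0)
    acc = -np.mean((preds - y) ** 2, axis=1)
    div = np.mean((preds - mean_pred) ** 2, axis=1)
    return w_acc * acc + w_div * div

# Three hypothetical member networks predicting on three samples.
preds = np.array([[1.0, 2.0, 3.0],
                  [1.1, 1.9, 3.2],
                  [0.5, 2.5, 2.0]])
y = np.array([1.0, 2.0, 3.0])
scores = ensemble_fitness(preds, y)   # higher = fitter member
```

Rewarding disagreement with the ensemble mean keeps the genetic search from converging on near-identical networks, which is the stated rationale for combining the two terms.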

walshmedicalmedia.com article

Evolutionary Algorithms for Deep Neural Network Optimization

https://www.walshmedicalmedia.com/open-access/evolutionary-algorithms-for-dee…

Evolutionary Algorithms for Deep Neural Network Optimization. Berman Nadimi*, Department of Computer Science, Brock University, St. Catharines, Canada. DESCRIPTION: Evolutionary Algorithms (EAs) are a class of optimization techniques inspired by the process of natural evolution. When applied to Deep Neural Networks (DNNs), these algorithms offer a powerful way to optimize models, particularly in cases where traditional methods, such as gradient descent, fall short. One of the main benefits of using evolutionary algorithms to optimize deep neural networks is their ability to avoid the common pitfalls of traditional optimization methods. Evolutionary algorithms offer a promising alternative to traditional methods for deep neural network optimization.

reddit.com article

Evolutionary Optimization Algorithms

https://www.reddit.com/r/learnmachinelearning/comments/1caeemt/evolutionary_o…

Once you have a well-trained ML model, you can use it to evaluate an optimal configuration for specified conditions and objectives much faster

arxiv.org article

[2510.09566] Automated Evolutionary Optimization for Resource-Efficient Neural Network Training

https://arxiv.org/abs/2510.09566

# Computer Science > Machine Learning. Title: Automated Evolutionary Optimization for Resource-Efficient Neural Network Training. Subjects: Machine Learning (cs.LG). Cite as: arXiv:2510.09566 [cs.LG] (or arXiv:2510.09566v1 [cs.LG] for this version).
